Sample records for statistical surrogate models

  1. Statistical Surrogate Modeling of Atmospheric Dispersion Events Using Bayesian Adaptive Splines

    NASA Astrophysics Data System (ADS)

    Francom, D.; Sansó, B.; Bulaevskaya, V.; Lucas, D. D.

    2016-12-01

    Uncertainty in the inputs of complex computer models, including atmospheric dispersion and transport codes, is often assessed via statistical surrogate models. Surrogate models are computationally efficient statistical approximations of expensive computer models that enable uncertainty analysis. We introduce Bayesian adaptive spline methods for producing surrogate models that capture the major spatiotemporal patterns of the parent model, while satisfying all the necessities of flexibility, accuracy and computational feasibility. We present novel methodological and computational approaches motivated by a controlled atmospheric tracer release experiment conducted at the Diablo Canyon nuclear power plant in California. Traditional methods for building statistical surrogate models often do not scale well to experiments with large amounts of data. Our approach is well suited to experiments involving large numbers of model inputs, large numbers of simulations, and functional output for each simulation. Our approach allows us to perform global sensitivity analysis with ease. We also present an approach to calibration of simulators using field data.
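
    The workflow summarized in this abstract (train a fast statistical emulator on a limited number of expensive simulator runs, then push Monte Carlo samples through it) can be illustrated generically. The Python sketch below is not the BASS method of the paper; it assumes a toy dispersion-like function and substitutes a scikit-learn Gaussian-process emulator, purely for illustration.

      import numpy as np
      from sklearn.gaussian_process import GaussianProcessRegressor

      rng = np.random.default_rng(0)

      def expensive_model(x):
          # stand-in for a costly dispersion simulation: column 0 could be a
          # release rate, column 1 a wind speed; output a downwind concentration
          return x[:, 0] * np.exp(-0.5 * x[:, 1]) + 0.1 * np.sin(3.0 * x[:, 1])

      X_train = rng.uniform(0.0, 1.0, size=(40, 2))   # 40 affordable simulator runs
      y_train = expensive_model(X_train)

      surrogate = GaussianProcessRegressor(normalize_y=True).fit(X_train, y_train)

      # uncertainty propagation: Monte Carlo through the cheap surrogate
      X_mc = rng.uniform(0.0, 1.0, size=(20_000, 2))
      y_mc = surrogate.predict(X_mc)
      print("mean:", y_mc.mean(), "95th percentile:", np.percentile(y_mc, 95))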

  2. Comparing and combining biomarkers as principal surrogates for time-to-event clinical endpoints.

    PubMed

    Gabriel, Erin E; Sachs, Michael C; Gilbert, Peter B

    2015-02-10

    Principal surrogate endpoints are useful as targets for phase I and II trials. In many recent trials, multiple post-randomization biomarkers are measured. However, few statistical methods exist for comparison of or combination of biomarkers as principal surrogates, and none of these methods to our knowledge utilize time-to-event clinical endpoint information. We propose a Weibull model extension of the semi-parametric estimated maximum likelihood method that allows for the inclusion of multiple biomarkers in the same risk model as multivariate candidate principal surrogates. We propose several methods for comparing candidate principal surrogates and evaluating multivariate principal surrogates. These include the time-dependent and surrogate-dependent true and false positive fraction, the time-dependent and the integrated standardized total gain, and the cumulative distribution function of the risk difference. We illustrate the operating characteristics of our proposed methods in simulations and outline how these statistics can be used to evaluate and compare candidate principal surrogates. We use these methods to investigate candidate surrogates in the Diabetes Control and Complications Trial. Copyright © 2014 John Wiley & Sons, Ltd.

  3. Error modeling for surrogates of dynamical systems using machine learning: Machine-learning-based error model for surrogates of dynamical systems

    DOE PAGES

    Trehan, Sumeet; Carlberg, Kevin T.; Durlofsky, Louis J.

    2017-07-14

    A machine learning–based framework for modeling the error introduced by surrogate models of parameterized dynamical systems is proposed. The framework entails the use of high-dimensional regression techniques (e.g., random forests and LASSO) to map a large set of inexpensively computed “error indicators” (i.e., features) produced by the surrogate model at a given time instance to a prediction of the surrogate-model error in a quantity of interest (QoI). This eliminates the need for the user to hand-select a small number of informative features. The methodology requires a training set of parameter instances at which the time-dependent surrogate-model error is computed by simulating both the high-fidelity and surrogate models. Using these training data, the method first determines regression-model locality (via classification or clustering) and subsequently constructs a “local” regression model to predict the time-instantaneous error within each identified region of feature space. We consider two uses for the resulting error model: (1) as a correction to the surrogate-model QoI prediction at each time instance and (2) as a way to statistically model arbitrary functions of the time-dependent surrogate-model error (e.g., time-integrated errors). We then apply the proposed framework to model errors in reduced-order models of nonlinear oil-water subsurface flow simulations, with time-varying well-control (bottom-hole pressure) parameters. The reduced-order models used in this work entail application of trajectory piecewise linearization in conjunction with proper orthogonal decomposition. Moreover, when the first use of the method is considered, numerical experiments demonstrate consistent improvement in accuracy in the time-instantaneous QoI prediction relative to the original surrogate model, across a large number of test cases. When the second use is considered, results show that the proposed method provides accurate statistical predictions of the time- and well-averaged errors.
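
    A minimal sketch of the regression step described above, assuming synthetic data: cheap error indicators are mapped to the observed surrogate error on training runs, and the fitted model is then used to correct the surrogate QoI (use 1). The indicator definitions are invented for illustration, and the locality/clustering step of the paper is omitted.

      import numpy as np
      from sklearn.ensemble import RandomForestRegressor

      rng = np.random.default_rng(1)

      # synthetic training set: each row is one (parameter instance, time step) pair;
      # columns are cheap "error indicators" produced by the surrogate model
      n = 500
      indicators = rng.normal(size=(n, 6))
      # observed error = high-fidelity QoI minus surrogate QoI on the training runs
      true_error = indicators @ np.array([0.8, -0.3, 0.0, 0.5, 0.0, 0.1]) \
                   + 0.05 * rng.normal(size=n)

      error_model = RandomForestRegressor(n_estimators=200, random_state=0)
      error_model.fit(indicators[:400], true_error[:400])

      # use (1) from the abstract: correct the surrogate QoI prediction pointwise
      surrogate_qoi = rng.normal(size=100)              # placeholder surrogate output
      corrected_qoi = surrogate_qoi + error_model.predict(indicators[400:])
      print("held-out R^2:", error_model.score(indicators[400:], true_error[400:]))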

  4. Error modeling for surrogates of dynamical systems using machine learning: Machine-learning-based error model for surrogates of dynamical systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Trehan, Sumeet; Carlberg, Kevin T.; Durlofsky, Louis J.

    A machine learning–based framework for modeling the error introduced by surrogate models of parameterized dynamical systems is proposed. The framework entails the use of high-dimensional regression techniques (e.g., random forests and LASSO) to map a large set of inexpensively computed “error indicators” (i.e., features) produced by the surrogate model at a given time instance to a prediction of the surrogate-model error in a quantity of interest (QoI). This eliminates the need for the user to hand-select a small number of informative features. The methodology requires a training set of parameter instances at which the time-dependent surrogate-model error is computed by simulating both the high-fidelity and surrogate models. Using these training data, the method first determines regression-model locality (via classification or clustering) and subsequently constructs a “local” regression model to predict the time-instantaneous error within each identified region of feature space. We consider two uses for the resulting error model: (1) as a correction to the surrogate-model QoI prediction at each time instance and (2) as a way to statistically model arbitrary functions of the time-dependent surrogate-model error (e.g., time-integrated errors). We then apply the proposed framework to model errors in reduced-order models of nonlinear oil-water subsurface flow simulations, with time-varying well-control (bottom-hole pressure) parameters. The reduced-order models used in this work entail application of trajectory piecewise linearization in conjunction with proper orthogonal decomposition. Moreover, when the first use of the method is considered, numerical experiments demonstrate consistent improvement in accuracy in the time-instantaneous QoI prediction relative to the original surrogate model, across a large number of test cases. When the second use is considered, results show that the proposed method provides accurate statistical predictions of the time- and well-averaged errors.

  5. Identifying taxonomic and functional surrogates for spring biodiversity conservation.

    PubMed

    Jyväsjärvi, Jussi; Virtanen, Risto; Ilmonen, Jari; Paasivirta, Lauri; Muotka, Timo

    2018-02-27

    Surrogate approaches are widely used to estimate overall taxonomic diversity for conservation planning. Surrogate taxa are frequently selected based on rarity or charisma, whereas selection through statistical modeling has been applied rarely. We used boosted-regression-tree models (BRT) fitted to biological data from 165 springs to identify bryophyte and invertebrate surrogates for taxonomic and functional diversity of boreal springs. We focused on these 2 groups because they are well known and abundant in most boreal springs. The best indicators of taxonomic versus functional diversity differed. The bryophyte Bryum weigelii and the chironomid larva Paratrichocladius skirwithensis best indicated taxonomic diversity, whereas the isopod Asellus aquaticus and the chironomid Macropelopia spp. were the best surrogates of functional diversity. In a scoring algorithm for priority-site selection, taxonomic surrogates performed only slightly better than random selection for all spring-dwelling taxa, but they were very effective in representing spring specialists, providing a distinct improvement over random solutions. However, the surrogates for taxonomic diversity represented functional diversity poorly and vice versa. When combined with cross-taxon complementarity analyses, surrogate selection based on statistical modeling provides a promising approach for identifying groundwater-dependent ecosystems of special conservation value, a key requirement of the EU Water Framework Directive. © 2018 Society for Conservation Biology.

  6. On Using Surrogates with Genetic Programming.

    PubMed

    Hildebrandt, Torsten; Branke, Jürgen

    2015-01-01

    One way to accelerate evolutionary algorithms with expensive fitness evaluations is to combine them with surrogate models. Surrogate models are efficiently computable approximations of the fitness function, derived by means of statistical or machine learning techniques from samples of fully evaluated solutions. But these models usually require a numerical representation, and therefore cannot be used with the tree representation of genetic programming (GP). In this paper, we present a new way to use surrogate models with GP. Rather than using the genotype directly as input to the surrogate model, we propose using a phenotypic characterization. This phenotypic characterization can be computed efficiently and allows us to define approximate measures of equivalence and similarity. Using a stochastic, dynamic job shop scenario as an example of simulation-based GP with an expensive fitness evaluation, we show how these ideas can be used to construct surrogate models and improve the convergence speed and solution quality of GP.

  7. Testing for nonlinearity in time series: The method of surrogate data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Theiler, J.; Galdrikian, B.; Longtin, A.

    1991-01-01

    We describe a statistical approach for identifying nonlinearity in time series; in particular, we want to avoid claims of chaos when simpler models (such as linearly correlated noise) can explain the data. The method requires a careful statement of the null hypothesis which characterizes a candidate linear process, the generation of an ensemble of "surrogate" data sets which are similar to the original time series but consistent with the null hypothesis, and the computation of a discriminating statistic for the original and for each of the surrogate data sets. The idea is to test the original time series against the null hypothesis by checking whether the discriminating statistic computed for the original time series differs significantly from the statistics computed for each of the surrogate sets. We present algorithms for generating surrogate data under various null hypotheses, and we show the results of numerical experiments on artificial data using correlation dimension, Lyapunov exponent, and forecasting error as discriminating statistics. Finally, we consider a number of experimental time series -- including sunspots, electroencephalogram (EEG) signals, and fluid convection -- and evaluate the statistical significance of the evidence for nonlinear structure in each case. 56 refs., 8 figs.
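
    The testing loop can be sketched as follows, using phase-randomized Fourier surrogates for the linear-Gaussian null and a time-reversal asymmetry statistic in place of the correlation dimension or Lyapunov exponent; both choices are illustrative assumptions, not the paper's exact recipe.

      import numpy as np

      rng = np.random.default_rng(0)

      def ft_surrogate(x):
          """Surrogate with the same power spectrum, consistent with linearly correlated noise."""
          amp = np.abs(np.fft.rfft(x))
          phases = rng.uniform(0.0, 2.0 * np.pi, amp.size)
          phases[0] = 0.0
          return np.fft.irfft(amp * np.exp(1j * phases), n=x.size)

      def trev(x, lag=1):
          """Time-reversal asymmetry: near zero for linear (time-reversible) processes."""
          d = x[lag:] - x[:-lag]
          return np.mean(d**3) / np.mean(d**2) ** 1.5

      # example series: the logistic map (nonlinear and deterministic)
      x = np.empty(2000)
      x[0] = 0.4
      for i in range(1, x.size):
          x[i] = 4.0 * x[i - 1] * (1.0 - x[i - 1])

      t0 = trev(x)
      t_surr = np.array([trev(ft_surrogate(x)) for _ in range(99)])
      p_value = (1 + np.sum(np.abs(t_surr) >= np.abs(t0))) / (t_surr.size + 1)
      print(f"statistic = {t0:.3f}, surrogate p-value = {p_value:.3f}")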

  8. Surrogate model approach for improving the performance of reactive transport simulations

    NASA Astrophysics Data System (ADS)

    Jatnieks, Janis; De Lucia, Marco; Sips, Mike; Dransch, Doris

    2016-04-01

    Reactive transport models can serve a large number of important geoscientific applications involving underground resources in industry and scientific research. It is common for a reactive transport simulation to consist of at least two coupled simulation models. The first is a hydrodynamics simulator responsible for simulating groundwater flow and solute transport. Hydrodynamics simulators are a well-established technology and can be very efficient; when run without coupled geochemistry, their spatial geometries can span millions of elements even on desktop workstations. The second is a geochemical simulation model coupled to the hydrodynamics simulator. Geochemical simulation models are much more computationally costly, which makes reactive transport simulations spanning millions of spatial elements very difficult to achieve. To address this problem we propose to replace the coupled geochemical simulation model with a surrogate model. A surrogate is a statistical model created to include only the necessary subset of simulator complexity for a particular scenario. To demonstrate the viability of such an approach we tested it on a popular, published reactive transport benchmark problem involving 1D calcite transport (Kolditz, 2012). We evaluated a number of statistical models available through the caret and DiceEval packages for R as candidate surrogate models. These were trained on a randomly sampled subset of the input-output data from the geochemical simulation model used in the original reactive transport simulation. For validation we used the surrogate model to predict the simulator output for the part of the sampled input data not used for training. For this scenario we find that the multivariate adaptive regression splines (MARS) method provides the best trade-off between speed and accuracy. This proof of concept is an essential step towards an interactive visual analytics system enabling user-driven, systematic creation of geochemical surrogate models, which would make reactive transport simulations with unprecedented spatial and temporal detail possible. References: Kolditz, O., Görke, U.J., Shao, H. and Wang, W., 2012. Thermo-hydro-mechanical-chemical processes in porous media: benchmarks and examples (Vol. 86). Springer Science & Business Media.
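
    The train/validate split described above looks roughly like the following. The original study worked in R (caret, DiceEval) and selected MARS; this sketch substitutes synthetic simulator data and a scikit-learn regressor, both assumptions made for illustration.

      import numpy as np
      from sklearn.ensemble import GradientBoostingRegressor
      from sklearn.model_selection import train_test_split
      from sklearn.metrics import mean_squared_error

      rng = np.random.default_rng(2)

      # synthetic stand-in for geochemical simulator input-output pairs, e.g.
      # inputs (pCO2, temperature, calcite amount) -> equilibrium Ca concentration
      X = rng.uniform(size=(2000, 3))
      y = 0.7 * X[:, 0] - 0.2 * X[:, 1] ** 2 + 0.4 * X[:, 0] * X[:, 2] \
          + 0.01 * rng.normal(size=2000)

      X_tr, X_val, y_tr, y_val = train_test_split(X, y, test_size=0.3, random_state=0)
      surrogate = GradientBoostingRegressor().fit(X_tr, y_tr)

      # validate on the held-out part of the sampled input data
      rmse = mean_squared_error(y_val, surrogate.predict(X_val)) ** 0.5
      print("validation RMSE:", rmse)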

  9. An efficient surrogate-based simulation-optimization method for calibrating a regional MODFLOW model

    NASA Astrophysics Data System (ADS)

    Chen, Mingjie; Izady, Azizallah; Abdalla, Osman A.

    2017-01-01

    Simulation-optimization entails a large number of model simulations, which is computationally intensive or even prohibitive if each model simulation is extremely time-consuming. Statistical models have been examined as surrogates of the high-fidelity physical model during the simulation-optimization process to tackle this problem. Among them, Multivariate Adaptive Regression Splines (MARS), a non-parametric adaptive regression method, is superior in handling high dimensionality and discontinuities in the data. Furthermore, the stability and accuracy of a MARS model can be improved by bootstrap aggregating (bagging). In this paper, the Bagging MARS (BMARS) method is integrated into a surrogate-based simulation-optimization framework to calibrate a three-dimensional MODFLOW model, which is developed to simulate groundwater flow in an arid hardrock-alluvium region in northwestern Oman. The physical MODFLOW model is surrogated by a statistical model developed using the BMARS algorithm. The surrogate model, which is fitted and validated using a training dataset generated by the physical model, can approximate solutions rapidly. An efficient Sobol' method is employed to calculate global sensitivities of head outputs to input parameters, which are used to analyze their spatiotemporal importance for the model outputs. Only sensitive parameters are included in the calibration process to further improve computational efficiency. The normalized root mean square error (NRMSE) between measured and simulated heads at observation wells is used as the objective function to be minimized during optimization. The reasonable history match between the simulated and observed heads demonstrates the feasibility of this highly efficient calibration framework.
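
    A compressed sketch of the calibration loop, with a surrogate standing in for MODFLOW inside the optimizer and the NRMSE between observed and surrogate-simulated heads as the objective. The quadratic toy surrogate and the observation values are assumptions; the bagging and Sobol' screening steps are omitted.

      import numpy as np
      from scipy.optimize import minimize

      rng = np.random.default_rng(3)

      def surrogate_heads(params):
          """Toy surrogate: maps (recharge, conductivity) to heads at 5 observation wells."""
          recharge, conductivity = params
          wells = np.arange(1, 6)
          return 10.0 + 2.0 * recharge * wells - 0.5 * conductivity * np.sqrt(wells)

      observed = surrogate_heads([1.2, 0.8]) + 0.05 * rng.normal(size=5)

      def nrmse(params):
          sim = surrogate_heads(params)
          return np.sqrt(np.mean((sim - observed) ** 2)) / (observed.max() - observed.min())

      result = minimize(nrmse, x0=[0.5, 0.5], method="Nelder-Mead")
      print("calibrated parameters:", result.x, "NRMSE:", result.fun)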

  10. Statistical Validation of Surrogate Endpoints: Another Look at the Prentice Criterion and Other Criteria.

    PubMed

    Saraf, Sanatan; Mathew, Thomas; Roy, Anindya

    2015-01-01

    For the statistical validation of surrogate endpoints, an alternative formulation is proposed for testing Prentice's fourth criterion, under a bivariate normal model. In such a setup, the criterion involves inference concerning an appropriate regression parameter, and the criterion holds if the regression parameter is zero. Testing such a null hypothesis has been criticized in the literature since it can only be used to reject a poor surrogate, and not to validate a good surrogate. In order to circumvent this, an equivalence hypothesis is formulated for the regression parameter, namely the hypothesis that the parameter is equivalent to zero. Such an equivalence hypothesis is formulated as an alternative hypothesis, so that the surrogate endpoint is statistically validated when the null hypothesis is rejected. Confidence intervals for the regression parameter and tests for the equivalence hypothesis are proposed using bootstrap methods and small sample asymptotics, and their performances are numerically evaluated and recommendations are made. The choice of the equivalence margin is a regulatory issue that needs to be addressed. The proposed equivalence testing formulation is also adopted for other parameters that have been proposed in the literature on surrogate endpoint validation, namely, the relative effect and proportion explained.
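
    A bare-bones version of the bootstrap equivalence test, assuming simulated data: resample (Z, S, T) triples, refit the regression parameter from Prentice's fourth criterion (the residual treatment effect after adjusting for the surrogate), and validate the surrogate only if the bootstrap confidence interval lies entirely inside a pre-specified equivalence margin. The margin value is an arbitrary placeholder, since its choice is a regulatory issue.

      import numpy as np

      rng = np.random.default_rng(4)

      n = 5000
      z = rng.integers(0, 2, size=n)                        # treatment
      s = 0.8 * z + rng.normal(size=n)                      # surrogate endpoint
      t = 1.2 * s + 0.02 * z + rng.normal(size=n)           # true endpoint: effect goes via s

      def residual_treatment_effect(z, s, t):
          """Coefficient of treatment in a regression of T on (Z, S); zero under Prentice."""
          X = np.column_stack([np.ones_like(s), z, s])
          beta, *_ = np.linalg.lstsq(X, t, rcond=None)
          return beta[1]

      margin = 0.15                                         # equivalence margin (placeholder)
      boot = np.array([
          residual_treatment_effect(z[idx], s[idx], t[idx])
          for idx in (rng.integers(0, n, size=n) for _ in range(1000))
      ])
      lo, hi = np.percentile(boot, [2.5, 97.5])
      print(f"95% bootstrap CI: ({lo:.3f}, {hi:.3f})")
      print("surrogate validated by equivalence:", -margin < lo and hi < margin)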

  11. Surrogates for numerical simulations; optimization of eddy-promoter heat exchangers

    NASA Technical Reports Server (NTRS)

    Patera, Anthony T.

    1993-01-01

    Although the advent of fast and inexpensive parallel computers has rendered numerous previously intractable calculations feasible, many numerical simulations remain too resource-intensive to be directly inserted in engineering optimization efforts. An attractive alternative to direct insertion considers models for computational systems: the expensive simulation is invoked only to construct and validate a simplified, input-output model; this simplified input-output model then serves as a simulation surrogate in subsequent engineering optimization studies. A simple 'Bayesian-validated' statistical framework for the construction, validation, and purposive application of static computer simulation surrogates is presented. As an example, dissipation-transport optimization of laminar-flow eddy-promoter heat exchangers is considered: parallel spectral element Navier-Stokes calculations serve to construct and validate surrogates for the flowrate and Nusselt number; these surrogates then represent the originating Navier-Stokes equations in the ensuing design process.

  12. Integrating surrogate models into subsurface simulation framework allows computation of complex reactive transport scenarios

    NASA Astrophysics Data System (ADS)

    De Lucia, Marco; Kempka, Thomas; Jatnieks, Janis; Kühn, Michael

    2017-04-01

    Reactive transport simulations - where geochemical reactions are coupled with hydrodynamic transport of reactants - are extremely time-consuming and suffer from significant numerical issues. Given the high uncertainties inherently associated with the geochemical models, which also constitute the major computational bottleneck, such requirements may seem inappropriate and probably constitute the main limitation for their wide application. A promising way to ease and speed up such coupled simulations is to employ statistical surrogates instead of "full-physics" geochemical models [1]. Data-driven surrogates are reduced models obtained from a set of pre-calculated "full physics" simulations, capturing their principal features while being extremely fast to compute. Model reduction of course comes at the price of a loss of precision; however, this appears justified in the presence of large uncertainties regarding the parametrization of geochemical processes. This contribution illustrates the integration of surrogates into the flexible simulation framework currently being developed by the authors' research group [2]. The high-level language of choice for obtaining and dealing with surrogate models is R, which profits from state-of-the-art methods for statistical analysis of large simulation ensembles. A stand-alone advective mass transport module was furthermore developed in order to add such capability to any multiphase finite volume hydrodynamic simulator within the simulation framework. We present 2D and 3D case studies benchmarking the performance of surrogates and "full physics" chemistry in scenarios pertaining to the assessment of geological subsurface utilization. [1] Jatnieks, J., De Lucia, M., Dransch, D., Sips, M.: "Data-driven surrogate model approach for improving the performance of reactive transport simulations.", Energy Procedia 97, 2016, p. 447-453. [2] Kempka, T., Nakaten, B., De Lucia, M., Nakaten, N., Otto, C., Pohl, M., Chabab [Tillner], E., Kühn, M.: "Flexible Simulation Framework to Couple Processes in Complex 3D Models for Subsurface Utilization Assessment.", Energy Procedia, 97, 2016 p. 494-501.

  13. Modeling the Test-Retest Statistics of a Localization Experiment in the Full Horizontal Plane.

    PubMed

    Morsnowski, André; Maune, Steffen

    2016-10-01

    Two approaches to model the test-retest statistics of a localization experiment, based on Gaussian distributions and on surrogate data, are introduced. Their efficiency is investigated using different measures describing directional hearing ability. A localization experiment in the full horizontal plane is a challenging task for hearing-impaired patients. In clinical routine, we use this experiment to evaluate the progress of our cochlear implant (CI) recipients. Listening and time effort limit the reproducibility. The localization experiment consists of a circle of 12 loudspeakers placed in an anechoic room, a "camera silens". In darkness, HSM sentences are presented at 65 dB pseudo-erratically from all 12 directions with five repetitions. This experiment is modeled by a set of Gaussian distributions with different standard deviations added to a perfect estimator, as well as by surrogate data. Five repetitions per direction are used to produce surrogate data distributions for the sensation directions. To investigate the statistics, we retrospectively use the data of 33 CI patients with 92 pairs of test-retest measurements from the same day. The first model does not take inversions into account (i.e., permutations of the direction from back to front and vice versa), although these are common for hearing-impaired persons, particularly in the rear hemisphere. The second model considers these inversions but does not work with all measures. The introduced models successfully describe the test-retest statistics of directional hearing. However, since they perform differently on the investigated measures, no general recommendation can be provided. The presented test-retest statistics enable pair test comparisons for localization experiments.

  14. Bayesian meta-analytical methods to incorporate multiple surrogate endpoints in drug development process.

    PubMed

    Bujkiewicz, Sylwia; Thompson, John R; Riley, Richard D; Abrams, Keith R

    2016-03-30

    A number of meta-analytical methods have been proposed that aim to evaluate surrogate endpoints. Bivariate meta-analytical methods can be used to predict the treatment effect for the final outcome from the treatment effect estimate measured on the surrogate endpoint while taking into account the uncertainty around the effect estimate for the surrogate endpoint. In this paper, extensions to multivariate models are developed aiming to include multiple surrogate endpoints with the potential benefit of reducing the uncertainty when making predictions. In this Bayesian multivariate meta-analytic framework, the between-study variability is modelled in a formulation of a product of normal univariate distributions. This formulation is particularly convenient for including multiple surrogate endpoints and flexible for modelling the outcomes which can be surrogate endpoints to the final outcome and potentially to one another. Two models are proposed, first, using an unstructured between-study covariance matrix by assuming the treatment effects on all outcomes are correlated and second, using a structured between-study covariance matrix by assuming treatment effects on some of the outcomes are conditionally independent. While the two models are developed for the summary data on a study level, the individual-level association is taken into account by the use of the Prentice's criteria (obtained from individual patient data) to inform the within study correlations in the models. The modelling techniques are investigated using an example in relapsing remitting multiple sclerosis where the disability worsening is the final outcome, while relapse rate and MRI lesions are potential surrogates to the disability progression. © 2015 The Authors. Statistics in Medicine Published by John Wiley & Sons Ltd.

  15. Testing Pairwise Association between Spatially Autocorrelated Variables: A New Approach Using Surrogate Lattice Data

    PubMed Central

    Deblauwe, Vincent; Kennel, Pol; Couteron, Pierre

    2012-01-01

    Background Independence between observations is a standard prerequisite of traditional statistical tests of association. This condition is, however, violated when autocorrelation is present within the data. In the case of variables that are regularly sampled in space (i.e. lattice data or images), such as those provided by remote-sensing or geographical databases, this problem is particularly acute. Because analytic derivation of the null probability distribution of the test statistic (e.g. Pearson's r) is not always possible when autocorrelation is present, we propose instead the use of a Monte Carlo simulation with surrogate data. Methodology/Principal Findings The null hypothesis that two observed mapped variables are the result of independent pattern generating processes is tested here by generating sets of random image data while preserving the autocorrelation function of the original images. Surrogates are generated by matching the dual-tree complex wavelet spectra (and hence the autocorrelation functions) of white noise images with the spectra of the original images. The generated images can then be used to build the probability distribution function of any statistic of association under the null hypothesis. We demonstrate the validity of a statistical test of association based on these surrogates with both actual and synthetic data and compare it with a corrected parametric test and three existing methods that generate surrogates (randomization, random rotations and shifts, and iterative amplitude adjusted Fourier transform). Type I error control was excellent, even with strong and long-range autocorrelation, which is not the case for alternative methods. Conclusions/Significance The wavelet-based surrogates are particularly appropriate in cases where autocorrelation appears at all scales or is direction-dependent (anisotropy). We explore the potential of the method for association tests involving a lattice of binary data and discuss its potential for validation of species distribution models. An implementation of the method in Java for the generation of wavelet-based surrogates is available online as supporting material. PMID:23144961
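
    The Monte Carlo test can be sketched with spectrum-matched surrogates; for brevity the dual-tree complex wavelet matching of the paper is replaced here by plain FFT amplitude matching, which also preserves the autocorrelation function. The toy fields and the number of surrogates are assumptions.

      import numpy as np

      rng = np.random.default_rng(5)

      def smooth_field(shape=(64, 64)):
          """Autocorrelated toy map: heavily low-pass filtered white noise."""
          f = rng.normal(size=shape)
          k = np.fft.fftfreq(shape[0])[:, None] ** 2 + np.fft.fftfreq(shape[1])[None, :] ** 2
          return np.real(np.fft.ifft2(np.fft.fft2(f) * np.exp(-500.0 * k)))

      def spectrum_matched_surrogate(image):
          """Random field with the amplitude spectrum (hence autocorrelation) of `image`."""
          noise = rng.normal(size=image.shape)
          amp = np.abs(np.fft.fft2(image - image.mean()))
          phases = np.angle(np.fft.fft2(noise))
          return np.real(np.fft.ifft2(amp * np.exp(1j * phases)))

      a, b = smooth_field(), smooth_field()                 # independent but autocorrelated
      r_obs = np.corrcoef(a.ravel(), b.ravel())[0, 1]

      # null distribution of Pearson's r: surrogates of `a` correlated against `b`
      r_null = np.array([
          np.corrcoef(spectrum_matched_surrogate(a).ravel(), b.ravel())[0, 1]
          for _ in range(199)
      ])
      p_value = (1 + np.sum(np.abs(r_null) >= np.abs(r_obs))) / (r_null.size + 1)
      print(f"r = {r_obs:.3f}, surrogate-based p-value = {p_value:.3f}")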

  16. Real-time characterization of partially observed epidemics using surrogate models.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Safta, Cosmin; Ray, Jaideep; Lefantzi, Sophia

    We present a statistical method, predicated on the use of surrogate models, for the 'real-time' characterization of partially observed epidemics. Observations consist of counts of symptomatic patients, diagnosed with the disease, that may be available in the early epoch of an ongoing outbreak. Characterization, in this context, refers to estimation of epidemiological parameters that can be used to provide short-term forecasts of the ongoing epidemic, as well as to provide gross information on the dynamics of the etiologic agent in the affected population, e.g., the time-dependent infection rate. The characterization problem is formulated as a Bayesian inverse problem, and epidemiological parameters are estimated as distributions using a Markov chain Monte Carlo (MCMC) method, thus quantifying the uncertainty in the estimates. In some cases, the inverse problem can be computationally expensive, primarily due to the epidemic simulator used inside the inversion algorithm. We present a method, based on replacing the epidemiological model with computationally inexpensive surrogates, that can reduce the computational time to minutes, without a significant loss of accuracy. The surrogates are created by projecting the output of an epidemiological model on a set of polynomial chaos bases; thereafter, computations involving the surrogate model reduce to evaluations of a polynomial. We find that the epidemic characterizations obtained with the surrogate models are very close to those obtained with the original model. We also find that the number of projections required to construct a surrogate model is O(10)-O(10^2) less than the number of samples required by the MCMC to construct a stationary posterior distribution; thus, depending upon the epidemiological models in question, it may be possible to omit the offline creation and caching of surrogate models, prior to their use in an inverse problem. The technique is demonstrated on synthetic data as well as observations from the 1918 influenza pandemic collected at Camp Custer, Michigan.
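
    A minimal illustration of the surrogate construction: the output of a toy epidemic model with a single uncertain parameter is projected onto a Hermite polynomial chaos basis by least squares, after which 'model' evaluations reduce to polynomial evaluations. The growth model and basis degree are assumptions; the paper uses a full epidemiological simulator and an MCMC inversion on top of the surrogate.

      import numpy as np
      from numpy.polynomial import hermite_e as He

      rng = np.random.default_rng(6)

      def epidemic_model(xi):
          """Toy 'expensive' simulator: cases at day 20 when log R0 = 0.3 + 0.1*xi."""
          r0 = np.exp(0.3 + 0.1 * xi)
          cases = 1.0
          for _ in range(20):
              cases += 0.2 * (r0 - 1.0) * cases
          return cases

      # build the surrogate from a modest number of model evaluations
      degree = 6
      xi_train = rng.normal(size=40)
      y_train = np.array([epidemic_model(x) for x in xi_train])
      coeffs, *_ = np.linalg.lstsq(He.hermevander(xi_train, degree), y_train, rcond=None)

      # thereafter, 'model' evaluations reduce to evaluating a polynomial
      xi_mc = rng.normal(size=50_000)
      y_surr = He.hermevander(xi_mc, degree) @ coeffs
      print("surrogate mean:", y_surr.mean(), "std:", y_surr.std())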

  17. An entropy-based nonparametric test for the validation of surrogate endpoints.

    PubMed

    Miao, Xiaopeng; Wang, Yong-Cheng; Gangopadhyay, Ashis

    2012-06-30

    We present a nonparametric test to validate surrogate endpoints based on measure of divergence and random permutation. This test is a proposal to directly verify the Prentice statistical definition of surrogacy. The test does not impose distributional assumptions on the endpoints, and it is robust to model misspecification. Our simulation study shows that the proposed nonparametric test outperforms the practical test of the Prentice criterion in terms of both robustness of size and power. We also evaluate the performance of three leading methods that attempt to quantify the effect of surrogate endpoints. The proposed method is applied to validate magnetic resonance imaging lesions as the surrogate endpoint for clinical relapses in a multiple sclerosis trial. Copyright © 2012 John Wiley & Sons, Ltd.

  18. Robust estimation of the proportion of treatment effect explained by surrogate marker information.

    PubMed

    Parast, Layla; McDermott, Mary M; Tian, Lu

    2016-05-10

    In randomized treatment studies where the primary outcome requires long follow-up of patients and/or expensive or invasive obtainment procedures, the availability of a surrogate marker that could be used to estimate the treatment effect and could potentially be observed earlier than the primary outcome would allow researchers to make conclusions regarding the treatment effect with less required follow-up time and resources. The Prentice criterion for a valid surrogate marker requires that a test for treatment effect on the surrogate marker also be a valid test for treatment effect on the primary outcome of interest. Based on this criterion, methods have been developed to define and estimate the proportion of treatment effect on the primary outcome that is explained by the treatment effect on the surrogate marker. These methods aim to identify useful statistical surrogates that capture a large proportion of the treatment effect. However, current methods to estimate this proportion usually require restrictive model assumptions that may not hold in practice and thus may lead to biased estimates of this quantity. In this paper, we propose a nonparametric procedure to estimate the proportion of treatment effect on the primary outcome that is explained by the treatment effect on a potential surrogate marker and extend this procedure to a setting with multiple surrogate markers. We compare our approach with previously proposed model-based approaches and propose a variance estimation procedure based on a perturbation-resampling method. Simulation studies demonstrate that the procedure performs well in finite samples and outperforms model-based procedures when the specified models are not correct. We illustrate our proposed procedure using a data set from a randomized study investigating a group-mediated cognitive behavioral intervention for peripheral artery disease participants. Copyright © 2015 John Wiley & Sons, Ltd.
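
    For contrast with the nonparametric proposal, a classical model-based 'proportion of treatment effect explained' of the kind the authors compare against can be written in a few lines: compare the treatment coefficient with and without adjusting for the surrogate. The simulated data below are an assumption.

      import numpy as np

      rng = np.random.default_rng(7)

      n = 2000
      z = rng.integers(0, 2, size=n)                  # treatment
      s = 0.9 * z + rng.normal(size=n)                # surrogate marker
      y = 0.8 * s + 0.1 * z + rng.normal(size=n)      # primary outcome

      def ols_coef(design, response, index):
          beta, *_ = np.linalg.lstsq(design, response, rcond=None)
          return beta[index]

      beta_unadj = ols_coef(np.column_stack([np.ones(n), z]), y, 1)
      beta_adj = ols_coef(np.column_stack([np.ones(n), z, s]), y, 1)

      pte = 1.0 - beta_adj / beta_unadj               # model-based proportion explained
      print(f"proportion of treatment effect explained: {pte:.2f}")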

  19. Surrogacy assessment using principal stratification when surrogate and outcome measures are multivariate normal.

    PubMed

    Conlon, Anna S C; Taylor, Jeremy M G; Elliott, Michael R

    2014-04-01

    In clinical trials, a surrogate outcome variable (S) can be measured before the outcome of interest (T) and may provide early information regarding the treatment (Z) effect on T. Using the principal surrogacy framework introduced by Frangakis and Rubin (2002. Principal stratification in causal inference. Biometrics 58, 21-29), we consider an approach that has a causal interpretation and develop a Bayesian estimation strategy for surrogate validation when the joint distribution of potential surrogate and outcome measures is multivariate normal. From the joint conditional distribution of the potential outcomes of T, given the potential outcomes of S, we propose surrogacy validation measures from this model. As the model is not fully identifiable from the data, we propose some reasonable prior distributions and assumptions that can be placed on weakly identified parameters to aid in estimation. We explore the relationship between our surrogacy measures and the surrogacy measures proposed by Prentice (1989. Surrogate endpoints in clinical trials: definition and operational criteria. Statistics in Medicine 8, 431-440). The method is applied to data from a macular degeneration study and an ovarian cancer study.

  20. Surrogacy assessment using principal stratification when surrogate and outcome measures are multivariate normal

    PubMed Central

    Conlon, Anna S. C.; Taylor, Jeremy M. G.; Elliott, Michael R.

    2014-01-01

    In clinical trials, a surrogate outcome variable (S) can be measured before the outcome of interest (T) and may provide early information regarding the treatment (Z) effect on T. Using the principal surrogacy framework introduced by Frangakis and Rubin (2002. Principal stratification in causal inference. Biometrics 58, 21–29), we consider an approach that has a causal interpretation and develop a Bayesian estimation strategy for surrogate validation when the joint distribution of potential surrogate and outcome measures is multivariate normal. From the joint conditional distribution of the potential outcomes of T, given the potential outcomes of S, we propose surrogacy validation measures from this model. As the model is not fully identifiable from the data, we propose some reasonable prior distributions and assumptions that can be placed on weakly identified parameters to aid in estimation. We explore the relationship between our surrogacy measures and the surrogacy measures proposed by Prentice (1989. Surrogate endpoints in clinical trials: definition and operational criteria. Statistics in Medicine 8, 431–440). The method is applied to data from a macular degeneration study and an ovarian cancer study. PMID:24285772

  1. Statistical evaluation of surrogate endpoints with examples from cancer clinical trials.

    PubMed

    Buyse, Marc; Molenberghs, Geert; Paoletti, Xavier; Oba, Koji; Alonso, Ariel; Van der Elst, Wim; Burzykowski, Tomasz

    2016-01-01

    A surrogate endpoint is intended to replace a clinical endpoint for the evaluation of new treatments when it can be measured more cheaply, more conveniently, more frequently, or earlier than that clinical endpoint. A surrogate endpoint is expected to predict clinical benefit, harm, or lack of these. Besides the biological plausibility of a surrogate, a quantitative assessment of the strength of evidence for surrogacy requires the demonstration of the prognostic value of the surrogate for the clinical outcome, and evidence that treatment effects on the surrogate reliably predict treatment effects on the clinical outcome. We focus on these two conditions, and outline the statistical approaches that have been proposed to assess the extent to which these conditions are fulfilled. When data are available from a single trial, one can assess the "individual level association" between the surrogate and the true endpoint. When data are available from several trials, one can additionally assess the "trial level association" between the treatment effect on the surrogate and the treatment effect on the true endpoint. In the latter case, the "surrogate threshold effect" can be estimated as the minimum effect on the surrogate endpoint that predicts a statistically significant effect on the clinical endpoint. All these concepts are discussed in the context of randomized clinical trials in oncology, and illustrated with two meta-analyses in gastric cancer. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
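
    The trial-level association and the surrogate threshold effect can be made concrete with a small meta-analytic sketch: regress trial-level treatment effects on the true endpoint against those on the surrogate and find the smallest surrogate effect whose lower 95% prediction bound excludes zero. The simulated trial effects are assumptions, not the gastric cancer data.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(8)

      # simulated trial-level effects: x = effect on surrogate, y = effect on true endpoint
      n_trials = 20
      x = rng.normal(0.3, 0.2, n_trials)
      y = 1.1 * x + rng.normal(0.0, 0.08, n_trials)

      slope, intercept = np.polyfit(x, y, 1)
      resid = y - (intercept + slope * x)
      s2 = np.sum(resid ** 2) / (n_trials - 2)
      sxx = np.sum((x - x.mean()) ** 2)
      tq = stats.t.ppf(0.975, n_trials - 2)

      def lower_prediction_bound(x0):
          pred = intercept + slope * x0
          half = tq * np.sqrt(s2 * (1.0 + 1.0 / n_trials + (x0 - x.mean()) ** 2 / sxx))
          return pred - half

      # surrogate threshold effect: smallest surrogate effect predicting a positive true effect
      grid = np.linspace(0.0, 1.0, 1001)
      ste = next((g for g in grid if lower_prediction_bound(g) > 0.0), None)
      print("surrogate threshold effect (approx.):", ste)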

  2. Hard, harder, hardest: principal stratification, statistical identifiability, and the inherent difficulty of finding surrogate endpoints.

    PubMed

    Wolfson, Julian; Henn, Lisa

    2014-01-01

    In many areas of clinical investigation there is great interest in identifying and validating surrogate endpoints, biomarkers that can be measured a relatively short time after a treatment has been administered and that can reliably predict the effect of treatment on the clinical outcome of interest. However, despite dramatic advances in the ability to measure biomarkers, the recent history of clinical research is littered with failed surrogates. In this paper, we present a statistical perspective on why identifying surrogate endpoints is so difficult. We view the problem from the framework of causal inference, with a particular focus on the technique of principal stratification (PS), an approach which is appealing because the resulting estimands are not biased by unmeasured confounding. In many settings, PS estimands are not statistically identifiable and their degree of non-identifiability can be thought of as representing the statistical difficulty of assessing the surrogate value of a biomarker. In this work, we examine the identifiability issue and present key simplifying assumptions and enhanced study designs that enable the partial or full identification of PS estimands. We also present example situations where these assumptions and designs may or may not be feasible, providing insight into the problem characteristics which make the statistical evaluation of surrogate endpoints so challenging.

  3. Hard, harder, hardest: principal stratification, statistical identifiability, and the inherent difficulty of finding surrogate endpoints

    PubMed Central

    2014-01-01

    In many areas of clinical investigation there is great interest in identifying and validating surrogate endpoints, biomarkers that can be measured a relatively short time after a treatment has been administered and that can reliably predict the effect of treatment on the clinical outcome of interest. However, despite dramatic advances in the ability to measure biomarkers, the recent history of clinical research is littered with failed surrogates. In this paper, we present a statistical perspective on why identifying surrogate endpoints is so difficult. We view the problem from the framework of causal inference, with a particular focus on the technique of principal stratification (PS), an approach which is appealing because the resulting estimands are not biased by unmeasured confounding. In many settings, PS estimands are not statistically identifiable and their degree of non-identifiability can be thought of as representing the statistical difficulty of assessing the surrogate value of a biomarker. In this work, we examine the identifiability issue and present key simplifying assumptions and enhanced study designs that enable the partial or full identification of PS estimands. We also present example situations where these assumptions and designs may or may not be feasible, providing insight into the problem characteristics which make the statistical evaluation of surrogate endpoints so challenging. PMID:25342953

  4. Model-assisted probability of detection of flaws in aluminum blocks using polynomial chaos expansions

    NASA Astrophysics Data System (ADS)

    Du, Xiaosong; Leifsson, Leifur; Grandin, Robert; Meeker, William; Roberts, Ronald; Song, Jiming

    2018-04-01

    Probability of detection (POD) is widely used for measuring the reliability of nondestructive testing (NDT) systems. Typically, POD is determined experimentally, but it can be enhanced by utilizing physics-based computational models in combination with model-assisted POD (MAPOD) methods. With the development of advanced physics-based methods, such as ultrasonic NDT, the empirical information needed for POD methods can be reduced. However, performing accurate numerical simulations can be prohibitively time-consuming, especially as part of stochastic analysis. In this work, stochastic surrogate models for computational physics-based measurement simulations are developed for cost savings of MAPOD methods while simultaneously ensuring sufficient accuracy. The stochastic surrogate is used to propagate the random input variables through the physics-based simulation model to obtain the joint probability distribution of the output. The POD curves are then generated based on those results. Here, the stochastic surrogates are constructed using non-intrusive polynomial chaos (NIPC) expansions. In particular, the NIPC methods used are the quadrature, ordinary least-squares (OLS), and least-angle regression sparse (LARS) techniques. The proposed approach is demonstrated on the ultrasonic testing simulation of a flat bottom hole flaw in an aluminum block. The results show that the statistics of the stochastic surrogates converge at least two orders of magnitude faster than direct Monte Carlo sampling (MCS). Moreover, evaluation of the stochastic surrogate models is over three orders of magnitude faster than the underlying simulation model for this case, which is the UTSim2 model.

  5. Constructing Surrogate Models of Complex Systems with Enhanced Sparsity: Quantifying the Influence of Conformational Uncertainty in Biomolecular Solvation

    DOE PAGES

    Lei, Huan; Yang, Xiu; Zheng, Bin; ...

    2015-11-05

    Biomolecules exhibit conformational fluctuations near equilibrium states, inducing uncertainty in various biological properties in a dynamic way. We have developed a general method to quantify the uncertainty of target properties induced by conformational fluctuations. Using a generalized polynomial chaos (gPC) expansion, we construct a surrogate model of the target property with respect to varying conformational states. We also propose a method to increase the sparsity of the gPC expansion by defining a set of conformational “active space” random variables. With the increased sparsity, we employ the compressive sensing method to accurately construct the surrogate model. We demonstrate the performance of the surrogate model by evaluating fluctuation-induced uncertainty in solvent-accessible surface area for the bovine trypsin inhibitor protein system and show that the new approach offers more accurate statistical information than standard Monte Carlo approaches. Furthermore, the constructed surrogate model also enables us to directly evaluate the target property under various conformational states, yielding a more accurate response surface than standard sparse grid collocation methods. In particular, the new method provides higher accuracy in high-dimensional systems, such as biomolecules, where sparse grid performance is limited by the accuracy of the computed quantity of interest. Finally, our new framework is generalizable and can be used to investigate the uncertainty of a wide variety of target properties in biomolecular systems.
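
    The sparsity-promoting step can be sketched with a generic l1-penalized regression standing in for the paper's compressive-sensing solver: build a polynomial basis in the conformational random variables and fit its coefficients from relatively few samples. The monomial basis, dimensions, and data are assumptions.

      import numpy as np
      from sklearn.preprocessing import PolynomialFeatures
      from sklearn.linear_model import Lasso

      rng = np.random.default_rng(9)

      d, n_train = 10, 60                       # 10 conformational variables, few samples
      xi = rng.normal(size=(n_train, d))
      # synthetic target property that truly depends on only a few terms (sparse)
      y = 1.0 + 0.8 * xi[:, 0] - 0.5 * xi[:, 2] * xi[:, 3] + 0.05 * rng.normal(size=n_train)

      # monomial basis as a stand-in for the orthogonal gPC basis
      basis = PolynomialFeatures(degree=2, include_bias=True)
      Phi = basis.fit_transform(xi)
      model = Lasso(alpha=0.01).fit(Phi, y)     # l1 penalty promotes a sparse expansion

      xi_test = rng.normal(size=(10_000, d))
      y_surr = model.predict(basis.transform(xi_test))
      print("nonzero coefficients:", int(np.sum(model.coef_ != 0)), "surrogate std:", y_surr.std())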

  6. Global Sensitivity Analysis of Environmental Systems via Multiple Indices based on Statistical Moments of Model Outputs

    NASA Astrophysics Data System (ADS)

    Guadagnini, A.; Riva, M.; Dell'Oca, A.

    2017-12-01

    We propose to ground sensitivity of uncertain parameters of environmental models on a set of indices based on the main (statistical) moments, i.e., mean, variance, skewness and kurtosis, of the probability density function (pdf) of a target model output. This enables us to perform Global Sensitivity Analysis (GSA) of a model in terms of multiple statistical moments and yields a quantification of the impact of model parameters on features driving the shape of the pdf of model output. Our GSA approach includes the possibility of being coupled with the construction of a reduced complexity model that allows approximating the full model response at a reduced computational cost. We demonstrate our approach through a variety of test cases. These include a commonly used analytical benchmark, a simplified model representing pumping in a coastal aquifer, a laboratory-scale tracer experiment, and the migration of fracturing fluid through a naturally fractured reservoir (source) to reach an overlying formation (target). Our strategy allows discriminating the relative importance of model parameters to the four statistical moments considered. We also provide an appraisal of the error associated with the evaluation of our sensitivity metrics by replacing the original system model through the selected surrogate model. Our results suggest that one might need to construct a surrogate model with increasing level of accuracy depending on the statistical moment considered in the GSA. The methodological framework we propose can assist the development of analysis techniques targeted to model calibration, design of experiment, uncertainty quantification and risk assessment.
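
    A rough Monte Carlo rendering of moment-based sensitivity: for each parameter, condition on bins of its value and measure how far the conditional mean, variance, skewness, and kurtosis of the output move from their unconditional values. The toy model and the normalization are assumptions and do not reproduce the authors' exact index definitions.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(10)

      def model(x):                             # toy environmental model
          return x[:, 0] ** 2 + 0.5 * x[:, 1] + 0.1 * x[:, 0] * x[:, 2]

      n, d, n_bins = 100_000, 3, 20
      X = rng.uniform(-1.0, 1.0, size=(n, d))
      y = model(X)

      moments = {"mean": np.mean, "var": np.var, "skew": stats.skew, "kurt": stats.kurtosis}

      for name, moment in moments.items():
          unconditional = moment(y)
          indices = []
          for i in range(d):
              edges = np.quantile(X[:, i], np.linspace(0.0, 1.0, n_bins + 1))
              which = np.clip(np.digitize(X[:, i], edges) - 1, 0, n_bins - 1)
              conditional = np.array([moment(y[which == b]) for b in range(n_bins)])
              # average absolute shift of the conditional moment from the unconditional one
              indices.append(np.mean(np.abs(conditional - unconditional)))
          print(name, np.round(indices, 3))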

  7. Statistical surrogate models for prediction of high-consequence climate change.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Constantine, Paul; Field, Richard V., Jr.; Boslough, Mark Bruce Elrick

    2011-09-01

    In safety engineering, performance metrics are defined using probabilistic risk assessments focused on the low-probability, high-consequence tail of the distribution of possible events, as opposed to best estimates based on central tendencies. We frame the climate change problem and its associated risks in a similar manner. To properly explore the tails of the distribution requires extensive sampling, which is not possible with existing coupled atmospheric models due to the high computational cost of each simulation. We therefore propose the use of specialized statistical surrogate models (SSMs) for the purpose of exploring the probability law of various climate variables of interest. An SSM is different from a deterministic surrogate model in that it represents each climate variable of interest as a space/time random field. The SSM can be calibrated to available spatial and temporal data from existing climate databases, e.g., the Program for Climate Model Diagnosis and Intercomparison (PCMDI), or to a collection of outputs from a General Circulation Model (GCM), e.g., the Community Earth System Model (CESM) and its predecessors. Because of its reduced size and complexity, the realization of a large number of independent model outputs from an SSM becomes computationally straightforward, so that quantifying the risk associated with low-probability, high-consequence climate events becomes feasible. A Bayesian framework is developed to provide quantitative measures of confidence, via Bayesian credible intervals, in the use of the proposed approach to assess these risks.

  8. Beyond multi-fractals: surrogate time series and fields

    NASA Astrophysics Data System (ADS)

    Venema, V.; Simmer, C.

    2007-12-01

    Most natural complex systems are characterised by variability on a large range of temporal and spatial scales. The two main methodologies to generate such structures are Fourier/FARIMA-based algorithms and multifractal methods. The former is restricted to Gaussian data, whereas the latter requires the structure to be self-similar. This work will present so-called surrogate data as an alternative that works with any (empirical) distribution and power spectrum. The best-known surrogate algorithm is the iterative amplitude adjusted Fourier transform (IAAFT) algorithm. We have studied six different geophysical time series (two clouds, runoff of a small and a large river, temperature and rain) and their surrogates. The power spectra and consequently the second-order structure functions were replicated accurately. Even the fourth-order structure function was reproduced more accurately by the surrogates than would be possible with a fractal method, because the measured structure deviated too strongly from fractal scaling. Only in the case of the daily rain sums could a fractal method have been more accurate. Like Fourier and multifractal methods, the current surrogates are not able to model the asymmetric increment distributions observed for runoff, i.e., they cannot reproduce nonlinear dynamical processes that are asymmetric in time. Furthermore, we have found differences for the structure functions on small scales. Surrogate methods are especially valuable for empirical studies, because the time series and fields that are generated are able to mimic measured variables accurately. Our main application is radiative transfer through structured clouds. Like many geophysical fields, clouds can only be sampled sparsely, e.g. with in-situ airborne instruments. However, for radiative transfer calculations we need full 3-dimensional cloud fields. A first study relating the measured properties of the cloud droplets and the radiative properties of the cloud field by generating surrogate cloud fields yielded good results within the measurement error. A further test of the suitability of the surrogate clouds for radiative transfer is performed by comparing the radiative properties of model cloud fields of sparse cumulus and stratocumulus with their surrogate fields. The bias and root mean square error in various radiative properties are small and the deviations in the radiances and irradiances are not statistically significant, i.e. these deviations can be attributed to the Monte Carlo noise of the radiative transfer calculations. We compared these results with optical properties of synthetic clouds that have either the correct distribution (but no spatial correlations) or the correct power spectrum (but a Gaussian distribution). These clouds did show statistically significant deviations. For more information see: http://www.meteo.uni-bonn.de/venema/themes/surrogates/
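
    The IAAFT algorithm mentioned above alternates between imposing the target power spectrum and the target amplitude distribution; a compact sketch follows (the example series is an assumption).

      import numpy as np

      def iaaft(x, n_iter=200, seed=0):
          """IAAFT surrogate of a 1-D series: keeps the empirical distribution exactly
          and the power spectrum approximately."""
          rng = np.random.default_rng(seed)
          target_amp = np.abs(np.fft.rfft(x))       # target Fourier amplitudes
          target_sorted = np.sort(x)                # target value distribution
          s = rng.permutation(x)                    # start from a random shuffle
          for _ in range(n_iter):
              # step 1: impose the target spectrum, keeping the current phases
              phases = np.angle(np.fft.rfft(s))
              s = np.fft.irfft(target_amp * np.exp(1j * phases), n=x.size)
              # step 2: impose the target distribution by rank ordering
              s = target_sorted[np.argsort(np.argsort(s))]
          return s

      x = np.cumsum(np.random.default_rng(1).normal(size=1024))   # example: a random walk
      surrogate = iaaft(x)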

  9. Using missing ordinal patterns to detect nonlinearity in time series data.

    PubMed

    Kulp, Christopher W; Zunino, Luciano; Osborne, Thomas; Zawadzki, Brianna

    2017-08-01

    The number of missing ordinal patterns (NMP) is the number of ordinal patterns that do not appear in a series after it has been symbolized using the Bandt and Pompe methodology. In this paper, the NMP is demonstrated as a test for nonlinearity using a surrogate framework in order to see if the NMP for a series is statistically different from the NMP of iterative amplitude adjusted Fourier transform (IAAFT) surrogates. It is found that the NMP works well as a test statistic for nonlinearity, even in the cases of very short time series. Both model and experimental time series are used to demonstrate the efficacy of the NMP as a test for nonlinearity.
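
    Counting missing ordinal patterns is straightforward: symbolize the series with the Bandt-Pompe scheme and count the patterns that never occur. The embedding dimension below is an assumption; in the paper the count is then compared against the same quantity computed on IAAFT surrogates (see the IAAFT sketch above).

      import numpy as np
      from itertools import permutations

      def missing_pattern_count(x, d=5):
          """Number of ordinal patterns of order d that never appear in the series
          (Bandt-Pompe symbolization with embedding delay 1)."""
          counts = {p: 0 for p in permutations(range(d))}
          for i in range(len(x) - d + 1):
              counts[tuple(np.argsort(x[i:i + d]))] += 1
          return sum(1 for c in counts.values() if c == 0)

      # nonlinear (logistic map) versus linear (white noise) example
      rng = np.random.default_rng(2)
      x = np.empty(1000)
      x[0] = 0.4
      for i in range(1, x.size):
          x[i] = 4.0 * x[i - 1] * (1.0 - x[i - 1])
      print("NMP, logistic map:", missing_pattern_count(x))
      print("NMP, white noise :", missing_pattern_count(rng.normal(size=1000)))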

  10. Statistical Surrogate Models for Estimating Probability of High-Consequence Climate Change

    NASA Astrophysics Data System (ADS)

    Field, R.; Constantine, P.; Boslough, M.

    2011-12-01

    We have posed the climate change problem in a framework similar to that used in safety engineering, by acknowledging that probabilistic risk assessments focused on low-probability, high-consequence climate events are perhaps more appropriate than studies focused simply on best estimates. To properly explore the tails of the distribution requires extensive sampling, which is not possible with existing coupled atmospheric models due to the high computational cost of each simulation. We have developed specialized statistical surrogate models (SSMs) that can be used to make predictions about the tails of the associated probability distributions. A SSM is different than a deterministic surrogate model in that it represents each climate variable of interest as a space/time random field, that is, a random variable for every fixed location in the atmosphere at all times. The SSM can be calibrated to available spatial and temporal data from existing climate databases, or to a collection of outputs from general circulation models. Because of its reduced size and complexity, the realization of a large number of independent model outputs from a SSM becomes computationally straightforward, so that quantifying the risk associated with low-probability, high-consequence climate events becomes feasible. A Bayesian framework was also developed to provide quantitative measures of confidence, via Bayesian credible intervals, to assess these risks. To illustrate the use of the SSM, we considered two collections of NCAR CCSM 3.0 output data. The first collection corresponds to average December surface temperature for years 1990-1999 based on a collection of 8 different model runs obtained from the Program for Climate Model Diagnosis and Intercomparison (PCMDI). We calibrated the surrogate model to the available model data and make various point predictions. We also analyzed average precipitation rate in June, July, and August over a 54-year period assuming a cyclic Y2K ocean model. We applied the calibrated surrogate model to study the probability that the precipitation rate falls below certain thresholds and utilized the Bayesian approach to quantify our confidence in these predictions. Sandia is a multiprogram laboratory operated by Sandia Corporation, a Lockheed Martin Company, for the United States Department of Energy's National Nuclear Security Administration under Contract DE-AC04-94AL85000.

  11. Definitions and validation criteria for biomarkers and surrogate endpoints: development and testing of a quantitative hierarchical levels of evidence schema.

    PubMed

    Lassere, Marissa N; Johnson, Kent R; Boers, Maarten; Tugwell, Peter; Brooks, Peter; Simon, Lee; Strand, Vibeke; Conaghan, Philip G; Ostergaard, Mikkel; Maksymowych, Walter P; Landewe, Robert; Bresnihan, Barry; Tak, Paul-Peter; Wakefield, Richard; Mease, Philip; Bingham, Clifton O; Hughes, Michael; Altman, Doug; Buyse, Marc; Galbraith, Sally; Wells, George

    2007-03-01

    There are clear advantages to using biomarkers and surrogate endpoints, but concerns about clinical and statistical validity and a lack of systematic methods to evaluate these aspects hinder their efficient application. Our objective was to review the literature on biomarkers and surrogates to develop a hierarchical schema that systematically evaluates and ranks the surrogacy status of biomarkers and surrogates, and to obtain feedback from stakeholders. After a systematic search of Medline and Embase on biomarkers, surrogate (outcomes, endpoints, markers, indicators), intermediate endpoints, and leading indicators, a quantitative surrogate validation schema was developed and subsequently evaluated at a stakeholder workshop. The search identified several classification schemata and definitions. Components of these were incorporated into a new quantitative surrogate validation level-of-evidence schema that evaluates biomarkers along four domains: Target, Study Design, Statistical Strength, and Penalties. Scores derived from the first three domains (the Target that the marker is being substituted for, the Design of the (best) evidence, and the Statistical strength) are additive; penalties are then applied if there is serious counterevidence. A total score (0 to 15) determines the level of evidence, with Level 1 the strongest and Level 5 the weakest. It was proposed that the term "surrogate" be restricted to markers attaining Level 1 or 2 only. Most stakeholders agreed that this operationalization of the National Institutes of Health definitions of biomarker, surrogate endpoint, and clinical endpoint was useful. Further development and application of this schema provides incentives and guidance for effective biomarker and surrogate endpoint research, and more efficient drug discovery, development, and approval.

  12. Bayesian calibration of the Community Land Model using surrogates

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ray, Jaideep; Hou, Zhangshuan; Huang, Maoyi

    2014-02-01

    We present results from the Bayesian calibration of hydrological parameters of the Community Land Model (CLM), which is often used in climate simulations and Earth system models. A statistical inverse problem is formulated for three hydrological parameters, conditional on observations of latent heat surface fluxes over 48 months. Our calibration method uses polynomial and Gaussian process surrogates of the CLM, and solves the parameter estimation problem using a Markov chain Monte Carlo sampler. Posterior probability densities for the parameters are developed for two sites with different soil and vegetation covers. Our method also allows us to examine the structural error in CLM under two error models. We find that surrogate models can be created for CLM in most cases. The posterior distributions are more predictive than the default parameter values in CLM. Climatologically averaging the observations does not modify the parameters' distributions significantly. The structural error model reveals a correlation time-scale which can be used to identify the physical process that could be contributing to it. While the calibrated CLM has a higher predictive skill, the calibration is under-dispersive.
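
    As a rough illustration of the surrogate-accelerated calibration idea (not the authors' CLM setup), the sketch below fits a Gaussian process emulator to a handful of "expensive" model runs and then uses it inside a random-walk Metropolis sampler. The placeholder simulator, the prior range, and the observation noise are all invented for the example.

```python
# Sketch: calibrate one parameter of an "expensive" model by MCMC, replacing
# the model with a Gaussian process surrogate. All settings are illustrative.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

rng = np.random.default_rng(1)

def expensive_model(theta):                     # placeholder for the simulator
    return np.sin(3.0 * theta) + 0.5 * theta

# Design runs of the simulator (the only expensive evaluations).
theta_train = np.linspace(0.0, 2.0, 12).reshape(-1, 1)
y_train = expensive_model(theta_train).ravel()

gp = GaussianProcessRegressor(ConstantKernel(1.0) * RBF(0.5), normalize_y=True)
gp.fit(theta_train, y_train)

# Synthetic observation with known noise level.
theta_true, sigma_obs = 1.3, 0.05
y_obs = expensive_model(theta_true) + rng.normal(0.0, sigma_obs)

def log_post(theta):
    if not 0.0 <= theta <= 2.0:                 # uniform prior on [0, 2]
        return -np.inf
    pred = gp.predict(np.array([[theta]]))[0]   # cheap surrogate evaluation
    return -0.5 * ((y_obs - pred) / sigma_obs) ** 2

# Random-walk Metropolis using only the surrogate in the likelihood.
chain, theta = [], 1.0
lp = log_post(theta)
for _ in range(5000):
    prop = theta + rng.normal(0.0, 0.1)
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        theta, lp = prop, lp_prop
    chain.append(theta)
print("posterior mean ~", np.mean(chain[1000:]))
```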

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lei, Huan; Yang, Xiu; Zheng, Bin

    Biomolecules exhibit conformational fluctuations near equilibrium states, inducing uncertainty in various biological properties in a dynamic way. We have developed a general method to quantify the uncertainty of target properties induced by conformational fluctuations. Using a generalized polynomial chaos (gPC) expansion, we construct a surrogate model of the target property with respect to varying conformational states. We also propose a method to increase the sparsity of the gPC expansion by defining a set of conformational “active space” random variables. With the increased sparsity, we employ the compressive sensing method to accurately construct the surrogate model. We demonstrate the performance of the surrogate model by evaluating fluctuation-induced uncertainty in solvent-accessible surface area for the bovine trypsin inhibitor protein system and show that the new approach offers more accurate statistical information than standard Monte Carlo approaches. Furthermore, the constructed surrogate model also enables us to directly evaluate the target property under various conformational states, yielding a more accurate response surface than standard sparse grid collocation methods. In particular, the new method provides higher accuracy in high-dimensional systems, such as biomolecules, where sparse grid performance is limited by the accuracy of the computed quantity of interest. Finally, our new framework is generalizable and can be used to investigate the uncertainty of a wide variety of target properties in biomolecular systems.
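
    The compressed-sensing flavour of this idea can be mimicked with off-the-shelf tools: build a polynomial (gPC-like) basis in the input variables and recover a sparse coefficient vector with an L1 penalty. The sketch below uses a generic monomial basis and LASSO as a stand-in for the authors' gPC and compressive-sensing machinery; the target function and dimensions are invented.

```python
# Sketch: sparse polynomial-chaos-style surrogate via L1-penalized regression.
# The target function and the basis are illustrative stand-ins.
import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LassoCV

rng = np.random.default_rng(2)

d, n = 6, 200                                   # 6 "conformational" variables, 200 samples
X = rng.uniform(-1.0, 1.0, size=(n, d))

def target_property(x):                         # depends sparsely on the inputs
    return 2.0 * x[:, 0] - 1.5 * x[:, 2] * x[:, 3] + 0.3 * x[:, 1] ** 2

y = target_property(X) + rng.normal(0.0, 0.01, n)

basis = PolynomialFeatures(degree=3, include_bias=True)
Phi = basis.fit_transform(X)                    # polynomial design matrix

model = LassoCV(cv=5).fit(Phi, y)               # sparse recovery of coefficients
active = np.flatnonzero(np.abs(model.coef_) > 1e-3)
print(f"{active.size} active terms out of {Phi.shape[1]}")
print("surrogate R^2 on training data:", model.score(Phi, y))
```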

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lei, Huan; Yang, Xiu; Zheng, Bin

    Biomolecules exhibit conformational fluctuations near equilibrium states, inducing uncertainty in various biological properties in a dynamic way. We have developed a general method to quantify the uncertainty of target properties induced by conformational fluctuations. Using a generalized polynomial chaos (gPC) expansion, we construct a surrogate model of the target property with respect to varying conformational states. We also propose a method to increase the sparsity of the gPC expansion by defining a set of conformational “active space” random variables. With the increased sparsity, we employ the compressive sensing method to accurately construct the surrogate model. We demonstrate the performance of the surrogate model by evaluating fluctuation-induced uncertainty in solvent-accessible surface area for the bovine trypsin inhibitor protein system and show that the new approach offers more accurate statistical information than standard Monte Carlo approaches. Furthermore, the constructed surrogate model also enables us to directly evaluate the target property under various conformational states, yielding a more accurate response surface than standard sparse grid collocation methods. In particular, the new method provides higher accuracy in high-dimensional systems, such as biomolecules, where sparse grid performance is limited by the accuracy of the computed quantity of interest. Our new framework is generalizable and can be used to investigate the uncertainty of a wide variety of target properties in biomolecular systems.

  15. Sequential experimental design based generalised ANOVA

    NASA Astrophysics Data System (ADS)

    Chakraborty, Souvik; Chowdhury, Rajib

    2016-07-01

    Over the last decade, surrogate modelling techniques have gained wide popularity in the field of uncertainty quantification, optimization, model exploration and sensitivity analysis. This approach relies on experimental design to generate training points and on regression/interpolation for generating the surrogate. In this work, it is argued that conventional experimental design may render a surrogate model inefficient. In order to address this issue, this paper presents a novel distribution adaptive sequential experimental design (DA-SED). The proposed DA-SED has been coupled with a variant of generalised analysis of variance (G-ANOVA), developed by representing the component function using the generalised polynomial chaos expansion. Moreover, generalised analytical expressions for calculating the first two statistical moments of the response, which are utilized in predicting the probability of failure, have also been developed. The proposed approach has been utilized in predicting the probability of failure of three structural mechanics problems. It is observed that the proposed approach yields accurate and computationally efficient estimates of the failure probability.
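
    When only the first two moments of a response are available, one common shortcut (shown below purely as an illustration, not as the paper's analytical G-ANOVA expressions) is a first-order reliability approximation that treats the limit-state margin as Gaussian. The capacity value and moment estimates are hypothetical.

```python
# Sketch: failure probability from the first two statistical moments of a
# response, assuming a Gaussian margin g = capacity - response (illustrative).
from math import sqrt
from scipy.stats import norm

mu_response, var_response = 42.0, 16.0   # hypothetical surrogate-based moments
capacity = 60.0                          # hypothetical limit value

mu_g = capacity - mu_response
sigma_g = sqrt(var_response)             # capacity treated as deterministic here
beta_index = mu_g / sigma_g              # reliability index
p_fail = norm.cdf(-beta_index)
print(f"reliability index = {beta_index:.2f}, P(failure) ~ {p_fail:.2e}")
```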

  16. Sequential experimental design based generalised ANOVA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chakraborty, Souvik, E-mail: csouvik41@gmail.com; Chowdhury, Rajib, E-mail: rajibfce@iitr.ac.in

    Over the last decade, surrogate modelling techniques have gained wide popularity in the field of uncertainty quantification, optimization, model exploration and sensitivity analysis. This approach relies on experimental design to generate training points and on regression/interpolation for generating the surrogate. In this work, it is argued that conventional experimental design may render a surrogate model inefficient. In order to address this issue, this paper presents a novel distribution adaptive sequential experimental design (DA-SED). The proposed DA-SED has been coupled with a variant of generalised analysis of variance (G-ANOVA), developed by representing the component function using the generalised polynomial chaos expansion. Moreover, generalised analytical expressions for calculating the first two statistical moments of the response, which are utilized in predicting the probability of failure, have also been developed. The proposed approach has been utilized in predicting the probability of failure of three structural mechanics problems. It is observed that the proposed approach yields accurate and computationally efficient estimates of the failure probability.

  17. The Biomarker-Surrogacy Evaluation Schema: a review of the biomarker-surrogate literature and a proposal for a criterion-based, quantitative, multidimensional hierarchical levels of evidence schema for evaluating the status of biomarkers as surrogate endpoints.

    PubMed

    Lassere, Marissa N

    2008-06-01

    There are clear advantages to using biomarkers and surrogate endpoints, but concerns about clinical and statistical validity and systematic methods to evaluate these aspects hinder their efficient application. Section 2 is a systematic, historical review of the biomarker-surrogate endpoint literature with special reference to the nomenclature, the systems of classification and statistical methods developed for their evaluation. In Section 3 an explicit, criterion-based, quantitative, multidimensional hierarchical levels of evidence schema - Biomarker-Surrogacy Evaluation Schema - is proposed to evaluate and co-ordinate the multiple dimensions (biological, epidemiological, statistical, clinical trial and risk-benefit evidence) of the biomarker-clinical endpoint relationships. The schema systematically evaluates and ranks the surrogacy status of biomarkers and surrogate endpoints using defined levels of evidence. The schema incorporates the three independent domains: Study Design, Target Outcome and Statistical Evaluation. Each domain has items ranked from zero to five. An additional category called Penalties incorporates additional considerations of biological plausibility, risk-benefit and generalizability. The total score (0-15) determines the level of evidence, with Level 1 the strongest and Level 5 the weakest. The term 'surrogate' is restricted to markers attaining Levels 1 or 2 only. Surrogacy status of markers can then be directly compared within and across different areas of medicine to guide individual, trial-based or drug-development decisions. This schema would facilitate communication between clinical, researcher, regulatory, industry and consumer participants necessary for evaluation of the biomarker-surrogate-clinical endpoint relationship in their different settings.

  18. Conservative strategy-based ensemble surrogate model for optimal groundwater remediation design at DNAPLs-contaminated sites

    NASA Astrophysics Data System (ADS)

    Ouyang, Qi; Lu, Wenxi; Lin, Jin; Deng, Wenbing; Cheng, Weiguo

    2017-08-01

    Surrogate-based simulation-optimization techniques are frequently used for optimal groundwater remediation design. When this technique is used, surrogate errors caused by surrogate-modeling uncertainty may lead to the generation of infeasible designs. In this paper, a conservative strategy that pushes the optimal design into the feasible region was used to address surrogate-modeling uncertainty. In addition, chance-constrained programming (CCP) was adopted to compare with the conservative strategy in addressing this uncertainty. Three methods, multi-gene genetic programming (MGGP), Kriging (KRG) and support vector regression (SVR), were used to construct surrogate models for a time-consuming multi-phase flow model. To improve the performance of the surrogate model, ensemble surrogates were constructed based on combinations of different stand-alone surrogate models. The results show that: (1) the surrogate-modeling uncertainty was successfully addressed by the conservative strategy, which means that this method is promising for addressing surrogate-modeling uncertainty. (2) The ensemble surrogate model that combines MGGP with KRG showed the most favorable performance, which indicates that this ensemble surrogate can utilize both stand-alone surrogate models to improve the performance of the surrogate model.

  19. Surrogate Endpoints in Suicide Research

    ERIC Educational Resources Information Center

    Wortzel, Hal S.; Gutierrez, Peter M.; Homaifar, Beeta Y.; Breshears, Ryan E.; Harwood, Jeri E.

    2010-01-01

    Surrogate endpoints frequently substitute for rare outcomes in research. The ability to learn about completed suicides by investigating more readily available and proximate outcomes, such as suicide attempts, has obvious appeal. However, concerns with surrogates from the statistical science perspective exist, and mounting evidence from…

  20. Conjoint Analysis: A Study of the Effects of Using Person Variables.

    ERIC Educational Resources Information Center

    Fraas, John W.; Newman, Isadore

    Three statistical techniques--conjoint analysis, a multiple linear regression model, and a multiple linear regression model with a surrogate person variable--were used to estimate the relative importance of five university attributes for students in the process of selecting a college. The five attributes include: availability and variety of…

  1. Challenge of surrogate endpoints.

    PubMed

    Furgerson, James L; Hannah, William N; Thompson, Jennifer C

    2012-03-01

    Surrogate endpoints are biomarkers that are intended to substitute for clinical endpoints. They have been used to find novel therapeutic targets, improve the statistical power and shorten the duration of clinical trials, and control the cost of conducting research studies. The more generalized use of surrogate endpoints in clinical decision making can be hazardous and should be undertaken with great caution. This article reviews prior work with surrogate endpoints and highlights caveats and lessons learned from studies using surrogate endpoints.

  2. A new algorithm combining geostatistics with the surrogate data approach to increase the accuracy of comparisons of point radiation measurements with cloud measurements

    NASA Astrophysics Data System (ADS)

    Venema, V. K. C.; Lindau, R.; Varnai, T.; Simmer, C.

    2009-04-01

    Two main groups of statistical methods used in the Earth sciences are geostatistics and stochastic modelling. Geostatistical methods, such as various kriging algorithms, aim at estimating the mean value for every point as well as possible. In case of sparse measurements, such fields have less variability at small scales and a narrower distribution than the true field. This can lead to biases if a nonlinear process is simulated on such a kriged field. Stochastic modelling aims at reproducing the structure of the data. One of the stochastic modelling methods, the so-called surrogate data approach, replicates the value distribution and power spectrum of a certain data set. However, while stochastic methods reproduce the statistical properties of the data, the location of the measurement is not considered. Because radiative transfer through clouds is a highly nonlinear process, it is essential to model the distribution (e.g. of optical depth, extinction, liquid water content or liquid water path) accurately as well as the correlations in the cloud field because of horizontal photon transport. This explains the success of surrogate cloud fields for use in 3D radiative transfer studies. However, up to now we could only achieve good results for the radiative properties averaged over the field, but not for a radiation measurement located at a certain position. Therefore we have developed a new algorithm that combines the accuracy of stochastic (surrogate) modelling with the positioning capabilities of kriging. In this way, we can automatically profit from the large geostatistical literature and software. The algorithm is tested on cloud fields from large eddy simulations (LES). On these clouds a measurement is simulated. From the pseudo-measurement we estimated the distribution and power spectrum. Furthermore, the pseudo-measurement is kriged to a field the size of the final surrogate cloud. The distribution, spectrum and the kriged field are the inputs to the algorithm. This algorithm is similar to the standard iterative amplitude adjusted Fourier transform (IAAFT) algorithm, but has an additional iterative step in which the surrogate field is nudged towards the kriged field. The nudging strength is gradually reduced to zero. We work with four types of pseudo-measurements: one zenith pointing measurement (which together with the wind produces a line measurement), five zenith pointing measurements, a slow and a fast azimuth scan (which together with the wind produce spirals). Because we work with LES clouds and the truth is known, we can validate the algorithm by performing 3D radiative transfer calculations on the original LES clouds and on the new surrogate clouds. For comparison, the radiative properties of the kriged fields and standard surrogate fields are also computed. Preliminary results already show that these new surrogate clouds reproduce the structure of the original clouds very well and the minima and maxima are located where the pseudo-measurements see them. The main limitation seems to be the amount of data, which is especially limited in the case of just one zenith-pointing measurement.
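
    For readers unfamiliar with the IAAFT core that this algorithm extends, the sketch below iterates between amplitude adjustment in Fourier space and rank-order remapping, with an extra nudging step toward a reference field whose weight decays to zero. The "kriged" field here is faked with a moving average of a toy series; a real application would use an actual kriging estimate and measured data.

```python
# Sketch of an IAAFT-style surrogate with nudging toward a reference field.
import numpy as np

rng = np.random.default_rng(3)

def iaaft_nudged(x, kriged, n_iter=100):
    target_amp = np.abs(np.fft.rfft(x))        # amplitude spectrum to preserve
    sorted_vals = np.sort(x)                   # value distribution to preserve
    s = rng.permutation(x)                     # random initial surrogate
    for it in range(n_iter):
        # 1) impose the target amplitude spectrum, keep the current phases
        phases = np.angle(np.fft.rfft(s))
        s = np.fft.irfft(target_amp * np.exp(1j * phases), n=x.size)
        # 2) nudge toward the reference field, with weight decaying to zero
        w = max(0.0, 1.0 - it / (0.5 * n_iter))
        s = (1.0 - w) * s + w * kriged
        # 3) restore the exact value distribution by rank-order remapping
        ranks = np.argsort(np.argsort(s))
        s = sorted_vals[ranks]
    return s

x = np.cumsum(rng.normal(size=512))                       # toy "measured" field
kriged = np.convolve(x, np.ones(21) / 21, mode="same")    # smoothed stand-in for kriging
surrogate = iaaft_nudged(x, kriged)
print("same sorted values:", np.allclose(np.sort(surrogate), np.sort(x)))
```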

  3. Validation of surrogate endpoints in advanced solid tumors: systematic review of statistical methods, results, and implications for policy makers.

    PubMed

    Ciani, Oriana; Davis, Sarah; Tappenden, Paul; Garside, Ruth; Stein, Ken; Cantrell, Anna; Saad, Everardo D; Buyse, Marc; Taylor, Rod S

    2014-07-01

    Licensing of, and coverage decisions on, new therapies should rely on evidence from patient-relevant endpoints such as overall survival (OS). Nevertheless, evidence from surrogate endpoints may also be useful, as it may not only expedite the regulatory approval of new therapies but also inform coverage decisions. It is, therefore, essential that candidate surrogate endpoints be properly validated. However, there is no consensus on statistical methods for such validation and on how the evidence thus derived should be applied by policy makers. We review current statistical approaches to surrogate-endpoint validation based on meta-analysis in various advanced-tumor settings. We assessed the suitability of two surrogates (progression-free survival [PFS] and time-to-progression [TTP]) using three current validation frameworks: Elston and Taylor's framework, the German Institute of Quality and Efficiency in Health Care's (IQWiG) framework and the Biomarker-Surrogacy Evaluation Schema (BSES3). A wide variety of statistical methods have been used to assess surrogacy. The strength of the association between the two surrogates and OS was generally low. The level of evidence (observation-level versus treatment-level) available varied considerably by cancer type and by evaluation tool, and was not always consistent even within one specific cancer type. The treatment-level association between PFS or TTP and OS has not been investigated in all solid tumors. According to IQWiG's framework, only PFS achieved acceptable evidence of surrogacy in metastatic colorectal and ovarian cancer treated with cytotoxic agents. Our study emphasizes the challenges of surrogate-endpoint validation and the importance of building consensus on the development of evaluation frameworks.

  4. Surrogate modeling of joint flood risk across coastal watersheds

    NASA Astrophysics Data System (ADS)

    Bass, Benjamin; Bedient, Philip

    2018-03-01

    This study discusses the development and performance of a rapid prediction system capable of representing the joint rainfall-runoff and storm surge flood response of tropical cyclones (TCs) for probabilistic risk analysis. Due to the computational demand required for accurately representing storm surge with the high-fidelity ADvanced CIRCulation (ADCIRC) hydrodynamic model and its coupling with additional numerical models to represent rainfall-runoff, a surrogate or statistical model was trained to represent the relationship between hurricane wind- and pressure-field characteristics and their peak joint flood response typically determined from physics based numerical models. This builds upon past studies that have only evaluated surrogate models for predicting peak surge, and provides the first system capable of probabilistically representing joint flood levels from TCs. The utility of this joint flood prediction system is then demonstrated by improving upon probabilistic TC flood risk products, which currently account for storm surge but do not take into account TC associated rainfall-runoff. Results demonstrate the source apportionment of rainfall-runoff versus storm surge and highlight that slight increases in flood risk levels may occur due to the interaction between rainfall-runoff and storm surge as compared to the Federal Emergency Management Agency's (FEMA's) current practices.

  5. Inverse Modeling Using Markov Chain Monte Carlo Aided by Adaptive Stochastic Collocation Method with Transformation

    NASA Astrophysics Data System (ADS)

    Zhang, D.; Liao, Q.

    2016-12-01

    The Bayesian inference provides a convenient framework to solve statistical inverse problems. In this method, the parameters to be identified are treated as random variables. The prior knowledge, the system nonlinearity, and the measurement errors can be directly incorporated in the posterior probability density function (PDF) of the parameters. The Markov chain Monte Carlo (MCMC) method is a powerful tool to generate samples from the posterior PDF. However, since the MCMC usually requires thousands or even millions of forward simulations, it can be a computationally intensive endeavor, particularly when faced with large-scale flow and transport models. To address this issue, we construct a surrogate system for the model responses in the form of polynomials by the stochastic collocation method. In addition, we employ interpolation based on nested sparse grids that takes into account the differing importance of the parameters when the stochastic space is high dimensional. Furthermore, in cases of low regularity, such as a discontinuous or unsmooth relation between the input parameters and the output responses, we introduce an additional transform process to improve the accuracy of the surrogate model. Once we build the surrogate system, we may evaluate the likelihood with very little computational cost. We analyzed the convergence rate of the forward solution and the surrogate posterior by Kullback-Leibler divergence, which quantifies the difference between probability distributions. The fast convergence of the forward solution implies fast convergence of the surrogate posterior to the true posterior. We also tested the proposed algorithm on water-flooding two-phase flow reservoir examples. The posterior PDF calculated from a very long chain with direct forward simulation is assumed to be accurate. The posterior PDF calculated using the surrogate model is in reasonable agreement with the reference, revealing a great improvement in terms of computational efficiency.
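
    The Kullback-Leibler check mentioned above can be approximated directly from the two sample sets by binning them on a common grid; the snippet below does this for two batches of hypothetical posterior samples standing in for the reference chain and the surrogate-based chain.

```python
# Sketch: histogram-based KL divergence between a reference posterior sample
# (long chain with the full model) and a surrogate-based posterior sample.
import numpy as np
from scipy.stats import entropy

rng = np.random.default_rng(4)
reference = rng.normal(1.00, 0.20, 50_000)     # stand-in for the "true" posterior
surrogate = rng.normal(1.02, 0.22, 50_000)     # stand-in for the surrogate posterior

edges = np.histogram_bin_edges(np.concatenate([reference, surrogate]), bins=60)
p, _ = np.histogram(reference, bins=edges, density=True)
q, _ = np.histogram(surrogate, bins=edges, density=True)
eps = 1e-12                                    # avoid log(0) in empty bins
kl = entropy(p + eps, q + eps)                 # KL(reference || surrogate)
print(f"KL(reference || surrogate) ~ {kl:.4f} nats")
```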

  6. An architecture for efficient gravitational wave parameter estimation with multimodal linear surrogate models

    NASA Astrophysics Data System (ADS)

    O'Shaughnessy, Richard; Blackman, Jonathan; Field, Scott E.

    2017-07-01

    The recent direct observation of gravitational waves has further emphasized the desire for fast, low-cost, and accurate methods to infer the parameters of gravitational wave sources. Due to expense in waveform generation and data handling, the cost of evaluating the likelihood function limits the computational performance of these calculations. Building on recently developed surrogate models and a novel parameter estimation pipeline, we show how to quickly generate the likelihood function as an analytic, closed-form expression. Using a straightforward variant of a production-scale parameter estimation code, we demonstrate our method using surrogate models of effective-one-body and numerical relativity waveforms. Our study is the first time these models have been used for parameter estimation and one of the first ever parameter estimation calculations with multi-modal numerical relativity waveforms, which include all ℓ ≤ 4 modes. Our grid-free method enables rapid parameter estimation for any waveform with a suitable reduced-order model. The methods described in this paper may also find use in other data analysis studies, such as vetting coincident events or the computation of the coalescing-compact-binary detection statistic.

  7. A Bayesian approach to modelling the impact of hydrodynamic shear stress on biofilm deformation

    PubMed Central

    Wilkinson, Darren J.; Jayathilake, Pahala Gedara; Rushton, Steve P.; Bridgens, Ben; Li, Bowen; Zuliani, Paolo

    2018-01-01

    We investigate the feasibility of using a surrogate-based method to emulate the deformation and detachment behaviour of a biofilm in response to hydrodynamic shear stress. The influence of shear force, growth rate and viscoelastic parameters on the patterns of growth, structure and resulting shape of microbial biofilms was examined. We develop a statistical modelling approach to this problem, using a combination of Bayesian Poisson regression and dynamic linear models for the emulation. We observe that the hydrodynamic shear force affects biofilm deformation in line with some of the literature. Sensitivity results also showed that the expected number of shear events, shear flow, yield coefficient for heterotrophic bacteria and extracellular polymeric substance (EPS) stiffness per unit EPS mass are the four principal mechanisms governing bacterial detachment in this study. The sensitivity of the model parameters is temporally dynamic, emphasising the significance of conducting the sensitivity analysis across multiple time points. The surrogate models are shown to perform well, and produced an approximately 480-fold increase in computational efficiency. We conclude that a surrogate-based approach is effective, and the resulting biofilm structure is determined primarily by a balance between bacteria growth, viscoelastic parameters and applied shear stress. PMID:29649240

  8. Statistical characteristics of surrogate data based on geophysical measurements

    NASA Astrophysics Data System (ADS)

    Venema, V.; Bachner, S.; Rust, H. W.; Simmer, C.

    2006-09-01

    In this study, the statistical properties of a range of measurements are compared with those of their surrogate time series. Seven different records are studied, amongst others, historical time series of mean daily temperature, daily rain sums and runoff from two rivers, and cloud measurements. Seven different algorithms are used to generate the surrogate time series. The best-known method is the iterative amplitude adjusted Fourier transform (IAAFT) algorithm, which is able to reproduce the measured distribution as well as the power spectrum. Using this setup, the measurements and their surrogates are compared with respect to their power spectrum, increment distribution, structure functions, annual percentiles and return values. It is found that the surrogates that reproduce the power spectrum and the distribution of the measurements are able to closely match the increment distributions and the structure functions of the measurements, but this often does not hold for surrogates that only mimic the power spectrum of the measurement. However, even the best performing surrogates do not have asymmetric increment distributions, i.e., they cannot reproduce nonlinear dynamical processes that are asymmetric in time. Furthermore, we have found deviations of the structure functions on small scales.
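
    The increment distributions and structure functions used for these comparisons are straightforward to compute; a minimal version, run on a synthetic series rather than the geophysical records of the study, is sketched below.

```python
# Sketch: increment distribution and structure functions S_q(tau) of a series,
# the diagnostics used to compare measurements with their surrogates.
import numpy as np

rng = np.random.default_rng(5)
x = np.cumsum(rng.normal(size=4096))           # toy record (random-walk-like)

def increments(x, tau):
    return x[tau:] - x[:-tau]

def structure_function(x, tau, q):
    return np.mean(np.abs(increments(x, tau)) ** q)

taus = [1, 2, 4, 8, 16, 32]
for q in (1, 2, 3):
    sq = [structure_function(x, t, q) for t in taus]
    print(f"q={q}:", np.round(sq, 2))

# Asymmetry of the increment distribution (zero for time-reversible surrogates)
d = increments(x, 1)
skewness = np.mean((d - d.mean()) ** 3) / d.std() ** 3
print("increment skewness at lag 1:", round(skewness, 3))
```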

  9. Spatial network surrogates for disentangling complex system structure from spatial embedding of nodes

    NASA Astrophysics Data System (ADS)

    Wiedermann, Marc; Donges, Jonathan F.; Kurths, Jürgen; Donner, Reik V.

    2016-04-01

    Networks with nodes embedded in a metric space have gained increasing interest in recent years. The effects of spatial embedding on the networks' structural characteristics, however, are rarely taken into account when studying their macroscopic properties. Here, we propose a hierarchy of null models to generate random surrogates from a given spatially embedded network that can preserve certain global and local statistics associated with the nodes' embedding in a metric space. Comparing the original network's and the resulting surrogates' global characteristics allows one to quantify to what extent these characteristics are already predetermined by the spatial embedding of the nodes and links. We apply our framework to various real-world spatial networks and show that the proposed models capture macroscopic properties of the networks under study much better than standard random network models that do not account for the nodes' spatial embedding. Depending on the actual performance of the proposed null models, the networks are categorized into different classes. Since many real-world complex networks are in fact spatial networks, the proposed approach is relevant for disentangling the underlying complex system structure from spatial embedding of nodes in many fields, ranging from social systems over infrastructure and neurophysiology to climatology.

  10. Nonspinning numerical relativity waveform surrogates: assessing the model

    NASA Astrophysics Data System (ADS)

    Field, Scott; Blackman, Jonathan; Galley, Chad; Scheel, Mark; Szilagyi, Bela; Tiglio, Manuel

    2015-04-01

    Recently, multi-modal gravitational waveform surrogate models have been built directly from data numerically generated by the Spectral Einstein Code (SpEC). I will describe ways in which the surrogate model error can be quantified. This task, in turn, requires (i) characterizing differences between waveforms computed by SpEC and those predicted by the surrogate model and (ii) estimating errors associated with the SpEC waveforms from which the surrogate is built. Both pieces can have numerous sources of numerical and systematic errors. We make an attempt to study the most dominant error sources and, ultimately, the surrogate model's fidelity. These investigations yield information about the surrogate model's uncertainty as a function of time (or frequency) and parameter, and could be useful in parameter estimation studies which seek to incorporate model error. Finally, I will conclude by comparing the numerical relativity surrogate model to other inspiral-merger-ringdown models. A companion talk will cover the building of multi-modal surrogate models.

  11. Two new algorithms to combine kriging with stochastic modelling

    NASA Astrophysics Data System (ADS)

    Venema, Victor; Lindau, Ralf; Varnai, Tamas; Simmer, Clemens

    2010-05-01

    Two main groups of statistical methods used in the Earth sciences are geostatistics and stochastic modelling. Geostatistical methods, such as various kriging algorithms, aim at estimating the mean value for every point as well as possible. In case of sparse measurements, such fields have less variability at small scales and a narrower distribution than the true field. This can lead to biases if a nonlinear process driven by such a kriged field is simulated. Stochastic modelling aims at reproducing the statistical structure of the data in space and time. One of the stochastic modelling methods, the so-called surrogate data approach, replicates the value distribution and power spectrum of a certain data set. While stochastic methods reproduce the statistical properties of the data, the location of the measurement is not considered. This requires the use of so-called constrained stochastic models. Because radiative transfer through clouds is a highly nonlinear process, it is essential to model the distribution (e.g. of optical depth, extinction, liquid water content or liquid water path) accurately. In addition, the correlations within the cloud field are important, especially because of horizontal photon transport. This explains the success of surrogate cloud fields for use in 3D radiative transfer studies. Up to now, however, we could only achieve good results for the radiative properties averaged over the field, but not for a radiation measurement located at a certain position. Therefore we have developed a new algorithm that combines the accuracy of stochastic (surrogate) modelling with the positioning capabilities of kriging. In this way, we can automatically profit from the large geostatistical literature and software. This algorithm is similar to the standard iterative amplitude adjusted Fourier transform (IAAFT) algorithm, but has an additional iterative step in which the surrogate field is nudged towards the kriged field. The nudging strength is gradually reduced to zero during successive iterations. A second algorithm, which we call step-wise kriging, pursues the same aim. Each time the kriging algorithm estimates a value, noise is added to it, after which this new point is accounted for in the estimation of all the later points. In this way, the autocorrelation of the step-kriged field is close to that found in the pseudo-measurements. The amount of noise is determined by the kriging uncertainty. The algorithms are tested on cloud fields from large eddy simulations (LES). On these clouds, a measurement is simulated. From these pseudo-measurements, we estimated the power spectrum for the surrogates, the semi-variogram for the (stepwise) kriging and the distribution. Furthermore, the pseudo-measurement is kriged. Because we work with LES clouds and the truth is known, we can validate the algorithm by performing 3D radiative transfer calculations on the original LES clouds and on the two new types of stochastic clouds. For comparison, the radiative properties of the kriged fields and standard surrogate fields are also computed. Preliminary results show that both algorithms reproduce the structure of the original clouds well, and the minima and maxima are located where the pseudo-measurements see them. The main problem for the quality of the structure and the root mean square error is the amount of data, which is especially limited in the case of just one zenith-pointing measurement.

  12. Wavelet-based surrogate time series for multiscale simulation of heterogeneous catalysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Savara, Aditya Ashi; Daw, C. Stuart; Xiong, Qingang

    We propose a wavelet-based scheme that encodes the essential dynamics of discrete microscale surface reactions in a form that can be coupled with continuum macroscale flow simulations with high computational efficiency. This makes it possible to simulate the dynamic behavior of reactor-scale heterogeneous catalysis without requiring detailed concurrent simulations at both the surface and continuum scales using different models. Our scheme is based on the application of wavelet-based surrogate time series that encodes the essential temporal and/or spatial fine-scale dynamics at the catalyst surface. The encoded dynamics are then used to generate statistically equivalent, randomized surrogate time series, which can be linked to the continuum scale simulation. We illustrate an application of this approach using two different kinetic Monte Carlo simulations with different characteristic behaviors typical for heterogeneous chemical reactions.

  13. Improving medical decisions for incapacitated persons: does focusing on "accurate predictions" lead to an inaccurate picture?

    PubMed

    Kim, Scott Y H

    2014-04-01

    The Patient Preference Predictor (PPP) proposal places a high priority on the accuracy of predicting patients' preferences and finds the performance of surrogates inadequate. However, the quest to develop a highly accurate, individualized statistical model has significant obstacles. First, it will be impossible to validate the PPP beyond the limit imposed by the 60%-80% reliability of people's preferences for future medical decisions--a figure no better than the known average accuracy of surrogates. Second, evidence supports the view that a sizable minority of persons may not even have preferences to predict. Third, many, perhaps most, people express their autonomy just as much by entrusting their loved ones to exercise their judgment as by desiring to specifically control future decisions. Surrogate decision making faces none of these issues and, in fact, it may be more efficient, accurate, and authoritative than is commonly assumed.

  14. Evaluating principal surrogate endpoints with time-to-event data accounting for time-varying treatment efficacy.

    PubMed

    Gabriel, Erin E; Gilbert, Peter B

    2014-04-01

    Principal surrogate (PS) endpoints are relatively inexpensive and easy to measure study outcomes that can be used to reliably predict treatment effects on clinical endpoints of interest. Few statistical methods for assessing the validity of potential PSs utilize time-to-event clinical endpoint information and to our knowledge none allow for the characterization of time-varying treatment effects. We introduce the time-dependent and surrogate-dependent treatment efficacy curve, $\mathrm{TE}(t|s)$, and a new augmented trial design for assessing the quality of a biomarker as a PS. We propose a novel Weibull model and an estimated maximum likelihood method for estimation of the $\mathrm{TE}(t|s)$ curve. We describe the operating characteristics of our methods via simulations. We analyze data from the Diabetes Control and Complications Trial, in which we find evidence of a biomarker with value as a PS.
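
    To fix ideas, the sketch below evaluates a toy time- and surrogate-dependent efficacy curve of the form TE(t|s) = 1 - risk_treated(t|s)/risk_control(t|s), with Weibull event-time risks whose scale depends on the biomarker. The parameter values are invented, and this is only a plausible illustration, not the authors' estimated maximum likelihood procedure.

```python
# Sketch: a toy TE(t|s) curve with Weibull risks whose scale depends on the
# biomarker s. All parameter values are purely illustrative.
import numpy as np

def weibull_risk(t, shape, scale):
    """Cumulative event risk F(t) = 1 - exp(-(t/scale)^shape)."""
    return 1.0 - np.exp(-(t / scale) ** shape)

def te_curve(t, s, shape=1.5, scale0=10.0, gamma=0.4):
    """TE(t|s) = 1 - risk_treated / risk_control; treatment scale grows with s."""
    risk_control = weibull_risk(t, shape, scale0)
    risk_treated = weibull_risk(t, shape, scale0 * np.exp(gamma * s))
    return 1.0 - risk_treated / risk_control

t = np.linspace(0.5, 5.0, 5)          # follow-up times (arbitrary units)
for s in (0.0, 1.0, 2.0):             # hypothetical biomarker values
    print(f"s={s}:", np.round(te_curve(t, s), 3))
```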

  15. Wavelet-based surrogate time series for multiscale simulation of heterogeneous catalysis

    DOE PAGES

    Savara, Aditya Ashi; Daw, C. Stuart; Xiong, Qingang; ...

    2016-01-28

    We propose a wavelet-based scheme that encodes the essential dynamics of discrete microscale surface reactions in a form that can be coupled with continuum macroscale flow simulations with high computational efficiency. This makes it possible to simulate the dynamic behavior of reactor-scale heterogeneous catalysis without requiring detailed concurrent simulations at both the surface and continuum scales using different models. Our scheme is based on the application of wavelet-based surrogate time series that encodes the essential temporal and/or spatial fine-scale dynamics at the catalyst surface. The encoded dynamics are then used to generate statistically equivalent, randomized surrogate time series, which can be linked to the continuum scale simulation. We illustrate an application of this approach using two different kinetic Monte Carlo simulations with different characteristic behaviors typical for heterogeneous chemical reactions.

  16. SU-E-I-71: Quality Assessment of Surrogate Metrics in Multi-Atlas-Based Image Segmentation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhao, T; Ruan, D

    Purpose: With the ever-growing data of heterogeneous quality, relevance assessment of atlases becomes increasingly critical for multi-atlas-based image segmentation. However, there is no universally recognized best relevance metric and even a standard to compare amongst candidates remains elusive. This study, for the first time, designs a quantification to assess relevance metrics’ quality, based on a novel perspective of the metric as surrogate for inferring the inaccessible oracle geometric agreement. Methods: We first develop an inference model to relate surrogate metrics in image space to the underlying oracle relevance metric in segmentation label space, with a monotonically non-decreasing function subject to random perturbations. Subsequently, we investigate model parameters to reveal key contributing factors to surrogates’ ability in prognosticating the oracle relevance value, for the specific task of atlas selection. Finally, we design an effective contrast-to-noise ratio (eCNR) to quantify surrogates’ quality based on insights from these analyses and empirical observations. Results: The inference model was specialized to a linear function with normally distributed perturbations, with surrogate metric exemplified by several widely-used image similarity metrics, i.e., MSD/NCC/(N)MI. Surrogates’ behaviors in selecting the most relevant atlases were assessed under varying eCNR, showing that surrogates with high eCNR dominated those with low eCNR in retaining the most relevant atlases. In an end-to-end validation, NCC/(N)MI with eCNR of 0.12 resulted in statistically better segmentation (mean DSC of about 0.85, first and third quartiles of 0.83 and 0.89) than MSD with eCNR of 0.10 (mean DSC of 0.84, first and third quartiles of 0.81 and 0.89). Conclusion: The designed eCNR is capable of characterizing surrogate metrics’ quality in prognosticating the oracle relevance value. It has been demonstrated to be correlated with the performance of relevant atlas selection and ultimate label fusion.

  17. Unified functional network and nonlinear time series analysis for complex systems science: The pyunicorn package

    NASA Astrophysics Data System (ADS)

    Donges, Jonathan F.; Heitzig, Jobst; Beronov, Boyan; Wiedermann, Marc; Runge, Jakob; Feng, Qing Yi; Tupikina, Liubov; Stolbova, Veronika; Donner, Reik V.; Marwan, Norbert; Dijkstra, Henk A.; Kurths, Jürgen

    2015-11-01

    We introduce the pyunicorn (Pythonic unified complex network and recurrence analysis toolbox) open source software package for applying and combining modern methods of data analysis and modeling from complex network theory and nonlinear time series analysis. pyunicorn is a fully object-oriented and easily parallelizable package written in the language Python. It allows for the construction of functional networks such as climate networks in climatology or functional brain networks in neuroscience representing the structure of statistical interrelationships in large data sets of time series and, subsequently, investigating this structure using advanced methods of complex network theory such as measures and models for spatial networks, networks of interacting networks, node-weighted statistics, or network surrogates. Additionally, pyunicorn provides insights into the nonlinear dynamics of complex systems as recorded in uni- and multivariate time series from a non-traditional perspective by means of recurrence quantification analysis, recurrence networks, visibility graphs, and construction of surrogate time series. The range of possible applications of the library is outlined, drawing on several examples mainly from the field of climatology.

  18. Endpoints and surrogate endpoints in colorectal cancer: a review of recent developments.

    PubMed

    Piedbois, Pascal; Buyse, Marc

    2008-07-01

    The purpose of this review is to discuss recently published work on endpoints for early and advanced colorectal cancer, as well as the statistical approaches used to validate surrogate endpoints. Most attempts to validate surrogate endpoints have estimated the correlation between the surrogate and the true endpoint, and between the treatment effects on these endpoints. The correlation approach has made it possible to validate disease-free survival and progression-free survival as acceptable surrogates for overall survival in early and advanced disease, respectively. The search for surrogate endpoints will intensify over the coming years. In parallel, efforts to either standardize or extend the endpoints or both will improve the reliability and relevance of clinical trial results.

  19. Validation of the Family Inpatient Communication Survey.

    PubMed

    Torke, Alexia M; Monahan, Patrick; Callahan, Christopher M; Helft, Paul R; Sachs, Greg A; Wocial, Lucia D; Slaven, James E; Montz, Kianna; Inger, Lev; Burke, Emily S

    2017-01-01

    Although many family members who make surrogate decisions report problems with communication, there is no validated instrument to accurately measure surrogate/clinician communication for older adults in the acute hospital setting. The objective of this study was to validate a survey of surrogate-rated communication quality in the hospital that would be useful to clinicians, researchers, and health systems. After expert review and cognitive interviewing (n = 10 surrogates), we enrolled 350 surrogates (250 development sample and 100 validation sample) of hospitalized adults aged 65 years and older from three hospitals in one metropolitan area. The communication survey and a measure of decision quality were administered within hospital days 3 and 10. Mental health and satisfaction measures were administered six to eight weeks later. Factor analysis showed support for both one-factor (Total Communication) and two-factor models (Information and Emotional Support). Item reduction led to a final 30-item scale. For the validation sample, internal reliability (Cronbach's alpha) was 0.96 (total), 0.94 (Information), and 0.90 (Emotional Support). Confirmatory factor analysis fit statistics were adequate (one-factor model, comparative fit index = 0.981, root mean square error of approximation = 0.62, weighted root mean square residual = 1.011; two-factor model comparative fit index = 0.984, root mean square error of approximation = 0.055, weighted root mean square residual = 0.930). Total score and subscales showed significant associations with the Decision Conflict Scale (Pearson correlation -0.43, P < 0.001 for total score). Emotional Support was associated with improved mental health outcomes at six to eight weeks, such as anxiety (-0.19 P < 0.001), and Information was associated with satisfaction with the hospital stay (0.49, P < 0.001). The survey shows high reliability and validity in measuring communication experiences for hospital surrogates. The scale has promise for measurement of communication quality and is predictive of important outcomes, such as surrogate satisfaction and well-being. Copyright © 2016 American Academy of Hospice and Palliative Medicine. Published by Elsevier Inc. All rights reserved.
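
    The internal-reliability figures quoted above are Cronbach's alpha values, which are easy to compute from item-level responses; a short sketch on simulated Likert-style data follows (the data are synthetic, not the survey's).

```python
# Sketch: Cronbach's alpha for a block of survey items (simulated data).
import numpy as np

rng = np.random.default_rng(6)
n_respondents, n_items = 300, 30
latent = rng.normal(size=(n_respondents, 1))
items = latent + 0.8 * rng.normal(size=(n_respondents, n_items))  # correlated items

k = n_items
item_vars = items.var(axis=0, ddof=1)            # variance of each item
total_var = items.sum(axis=1).var(ddof=1)        # variance of the summed scale
alpha = (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)
print(f"Cronbach's alpha ~ {alpha:.2f}")
```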

  20. Does the decision in a validation process of a surrogate endpoint change with level of significance of treatment effect? A proposal on validation of surrogate endpoints.

    PubMed

    Sertdemir, Y; Burgut, R

    2009-01-01

    In recent years the use of surrogate end points (S) has become an interesting issue. In clinical trials, it is important to get treatment outcomes as early as possible. For this reason there is a need for surrogate endpoints (S) which are measured earlier than the true endpoint (T). However, before a surrogate endpoint can be used it must be validated. For a candidate surrogate endpoint, for example time to recurrence, the validation result may change dramatically between clinical trials. The aim of this study is to show how the validation criterion (R(2)(trial)) proposed by Buyse et al. is influenced by the magnitude of treatment effect, with an application using real data. The criterion R(2)(trial) proposed by Buyse et al. (2000) is applied to the four data sets from colon cancer clinical trials (C-01, C-02, C-03 and C-04). Each clinical trial is analyzed separately for treatment effect on survival (true endpoint) and recurrence-free survival (surrogate endpoint), and this analysis is also done for each center in each trial. Results are used for standard validation analysis. The centers were grouped by the Wald statistic into 3 equal groups. Validation criteria R(2)(trial) were 0.641 (95% CI 0.432-0.782), 0.223 (95% CI 0.008-0.503), 0.761 (95% CI 0.550-0.872) and 0.560 (95% CI 0.404-0.687) for C-01, C-02, C-03 and C-04, respectively. The R(2)(trial) criteria changed with the Wald statistics observed for the centers used in the validation process. The higher the Wald statistic group, the higher the observed R(2)(trial) values. Recurrence-free survival is not a good surrogate for overall survival in clinical trials with non-significant treatment effects and is only a moderate surrogate when treatment effects are significant. This shows that the level of significance of treatment effect should be taken into account in the validation process of surrogate endpoints.
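
    A simplified version of the trial-level criterion can be obtained by estimating a treatment effect on the surrogate and on the true endpoint within each center and then regressing one set of effects on the other, weighted by center size. The sketch below uses synthetic effects and is far cruder than the full bivariate meta-analytic model of Buyse et al.; it is meant only to show where an R(2)(trial)-type number comes from.

```python
# Sketch: crude trial-level R^2 between center-specific treatment effects on a
# surrogate endpoint and on the true endpoint (synthetic effects, weighted fit).
import numpy as np

rng = np.random.default_rng(7)
n_centers = 40
sizes = rng.integers(30, 200, n_centers)                 # patients per center
effect_surrogate = rng.normal(0.3, 0.15, n_centers)      # e.g. log-HR on surrogate
noise = rng.normal(0.0, 0.08, n_centers) / np.sqrt(sizes / 100.0)
effect_true = 0.1 + 0.9 * effect_surrogate + noise       # e.g. log-HR on true endpoint

w = sizes / sizes.sum()                                   # center weights
X = np.column_stack([np.ones(n_centers), effect_surrogate])
W = np.diag(w)
coef = np.linalg.solve(X.T @ W @ X, X.T @ W @ effect_true)   # weighted least squares
fitted = X @ coef

ybar = np.average(effect_true, weights=w)
resid = np.average((effect_true - fitted) ** 2, weights=w)
total = np.average((effect_true - ybar) ** 2, weights=w)
r2_trial = 1.0 - resid / total
print(f"weighted trial-level R^2 ~ {r2_trial:.2f}")
```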

  1. Meta-analyses evaluating surrogate endpoints for overall survival in cancer randomized trials: A critical review.

    PubMed

    Savina, Marion; Gourgou, Sophie; Italiano, Antoine; Dinart, Derek; Rondeau, Virginie; Penel, Nicolas; Mathoulin-Pelissier, Simone; Bellera, Carine

    2018-03-01

    In cancer randomized controlled trials (RCT), alternative endpoints are increasingly being used in place of overall survival (OS) to reduce sample size, duration and cost of trials. It is necessary to ensure that these endpoints are valid surrogates for OS. Our aim was to identify meta-analyses that evaluated surrogate endpoints for OS and assess the strength of evidence for each meta-analysis (MA). We performed a systematic review to identify MA of cancer RCTs assessing surrogate endpoints for OS. We evaluated the strength of the association between the endpoints based on (i) the German Institute of Quality and Efficiency in Health Care guidelines and (ii) the Biomarker-Surrogate Evaluation Schema. Fifty-three publications reported on 164 MA, with heterogeneous statistical methods. Disease-free survival (DFS) and progression-free survival (PFS) showed good surrogacy properties for OS in colorectal, lung and head and neck cancers. DFS was highly correlated to OS in gastric cancer. The statistical methodology used to evaluate surrogate endpoints requires consistency in order to facilitate the accurate interpretation of the results. Despite the limited number of clinical settings with validated surrogate endpoints for OS, there is evidence of good surrogacy for DFS and PFS in tumor types that account for a large proportion of cancer cases. Copyright © 2017 Elsevier B.V. All rights reserved.

  2. Gaussian process-based surrogate modeling framework for process planning in laser powder-bed fusion additive manufacturing of 316L stainless steel

    DOE PAGES

    Tapia, Gustavo; Khairallah, Saad A.; Matthews, Manyalibo J.; ...

    2017-09-22

    Here, Laser Powder-Bed Fusion (L-PBF) metal-based additive manufacturing (AM) is complex and not fully understood. Successful processing for one material might not necessarily apply to a different material. This paper describes a workflow process that aims at creating a material data sheet standard that describes regimes where the process can be expected to be robust. The procedure consists of building a Gaussian process-based surrogate model of the L-PBF process that predicts melt pool depth in single-track experiments given a laser power, scan speed, and laser beam size combination. The predictions are then mapped onto a power versus scan speed diagram delimiting the conduction from the keyhole melting controlled regimes. This statistical framework is shown to be robust even for cases where experimental training data might be suboptimal in quality, if appropriate physics-based filters are applied. Additionally, it is demonstrated that a high-fidelity simulation model of L-PBF can equally be successfully used for building a surrogate model, which is beneficial since simulations are getting more efficient and are more practical to study the response of different materials, than to re-tool an AM machine for new material powder.
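
    The emulation step itself can be reproduced in outline with any Gaussian process library; below, a GP is fit to a few synthetic (power, scan speed, beam size) to melt-pool-depth triples and then queried on a power-speed slice with its predictive uncertainty. The data-generating formula is invented and has no physical standing.

```python
# Sketch: Gaussian process surrogate mapping (laser power, scan speed, beam
# size) to melt pool depth. Training data are synthetic, not L-PBF measurements.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(8)
n = 60
power = rng.uniform(100, 400, n)          # W
speed = rng.uniform(0.2, 2.0, n)          # m/s
beam = rng.uniform(50, 100, n)            # microns
X = np.column_stack([power, speed, beam])

# Invented trend: depth grows with an energy-density-like ratio power/(speed*beam).
depth = 40.0 * power / (speed * beam) + rng.normal(0.0, 2.0, n)

kernel = RBF(length_scale=[100.0, 0.5, 20.0]) + WhiteKernel(1.0)
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X, depth)

# Query a power-speed slice at fixed beam size, with predictive uncertainty.
grid = np.array([[p, v, 80.0] for p in (150, 250, 350) for v in (0.5, 1.0, 1.5)])
mean, std = gp.predict(grid, return_std=True)
for row, m, s in zip(grid, mean, std):
    print(f"P={row[0]:.0f} W, v={row[1]:.1f} m/s -> depth ~ {m:.1f} +/- {2*s:.1f}")
```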

  3. Gaussian process-based surrogate modeling framework for process planning in laser powder-bed fusion additive manufacturing of 316L stainless steel

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tapia, Gustavo; Khairallah, Saad A.; Matthews, Manyalibo J.

    Here, Laser Powder-Bed Fusion (L-PBF) metal-based additive manufacturing (AM) is complex and not fully understood. Successful processing for one material might not necessarily apply to a different material. This paper describes a workflow process that aims at creating a material data sheet standard that describes regimes where the process can be expected to be robust. The procedure consists of building a Gaussian process-based surrogate model of the L-PBF process that predicts melt pool depth in single-track experiments given a laser power, scan speed, and laser beam size combination. The predictions are then mapped onto a power versus scan speed diagram delimiting the conduction from the keyhole melting controlled regimes. This statistical framework is shown to be robust even for cases where experimental training data might be suboptimal in quality, if appropriate physics-based filters are applied. Additionally, it is demonstrated that a high-fidelity simulation model of L-PBF can equally be successfully used for building a surrogate model, which is beneficial since simulations are getting more efficient and are more practical to study the response of different materials, than to re-tool an AM machine for new material powder.

  4. Fecal indicator organism modeling and microbial source tracking in environmental waters: Chapter 3.4.6

    USGS Publications Warehouse

    Nevers, Meredith; Byappanahalli, Muruleedhara; Phanikumar, Mantha S.; Whitman, Richard L.

    2016-01-01

    Mathematical models have been widely applied to surface waters to estimate rates of settling, resuspension, flow, dispersion, and advection in order to calculate movement of particles that influence water quality. Of particular interest are the movement, survival, and persistence of microbial pathogens or their surrogates, which may contaminate recreational water, drinking water, or shellfish. Most models devoted to microbial water quality have been focused on fecal indicator organisms (FIO), which act as a surrogate for pathogens and viruses. Process-based modeling and statistical modeling have been used to track contamination events to source and to predict future events. The use of these two types of models require different levels of expertise and input; process-based models rely on theoretical physical constructs to explain present conditions and biological distribution while data-based, statistical models use extant paired data to do the same. The selection of the appropriate model and interpretation of results is critical to proper use of these tools in microbial source tracking. Integration of the modeling approaches could provide insight for tracking and predicting contamination events in real time. A review of modeling efforts reveals that process-based modeling has great promise for microbial source tracking efforts; further, combining the understanding of physical processes influencing FIO contamination developed with process-based models and molecular characterization of the population by gene-based (i.e., biological) or chemical markers may be an effective approach for locating sources and remediating contamination in order to protect human health better.

  5. A surrogate model for thermal characteristics of stratospheric airship

    NASA Astrophysics Data System (ADS)

    Zhao, Da; Liu, Dongxu; Zhu, Ming

    2018-06-01

    A simple and accurate surrogate model is greatly needed to reduce the analysis complexity of thermal characteristics for a stratospheric airship. In this paper, a surrogate model based on the Least Squares Support Vector Regression (LSSVR) is proposed. The Gravitational Search Algorithm (GSA) is used to optimize hyperparameters. A novel framework consisting of a preprocessing classifier and two regression models is designed to train the surrogate model. Various temperature datasets of the airship envelope and the internal gas are obtained by a three-dimensional transient model for thermal characteristics. Using these thermal datasets, two-factor and multi-factor surrogate models are trained and several comparison simulations are conducted. Results illustrate that the surrogate models based on LSSVR-GSA have good fitting and generalization abilities. The preprocessing classification strategy proposed in this paper plays a significant role in improving the accuracy of the surrogate model.

  6. An Open-Source Toolbox for Surrogate Modeling of Joint Contact Mechanics

    PubMed Central

    Eskinazi, Ilan

    2016-01-01

    Goal: Incorporation of elastic joint contact models into simulations of human movement could facilitate studying the interactions between muscles, ligaments, and bones. Unfortunately, elastic joint contact models are often too expensive computationally to be used within iterative simulation frameworks. This limitation can be overcome by using fast and accurate surrogate contact models that fit or interpolate input-output data sampled from existing elastic contact models. However, construction of surrogate contact models remains an arduous task. The aim of this paper is to introduce an open-source program called Surrogate Contact Modeling Toolbox (SCMT) that facilitates surrogate contact model creation, evaluation, and use. Methods: SCMT interacts with the third party software FEBio to perform elastic contact analyses of finite element models and uses Matlab to train neural networks that fit the input-output contact data. SCMT features sample point generation for multiple domains, automated sampling, sample point filtering, and surrogate model training and testing. Results: An overview of the software is presented along with two example applications. The first example demonstrates creation of surrogate contact models of artificial tibiofemoral and patellofemoral joints and evaluates their computational speed and accuracy, while the second demonstrates the use of surrogate contact models in a forward dynamic simulation of an open-chain leg extension-flexion motion. Conclusion: SCMT facilitates the creation of computationally fast and accurate surrogate contact models. Additionally, it serves as a bridge between FEBio and OpenSim musculoskeletal modeling software. Significance: Researchers may now create and deploy surrogate models of elastic joint contact with minimal effort. PMID:26186761

  7. Modeling of the radiation belt magnetosphere in decisional timeframes

    DOEpatents

    Koller, Josef; Reeves, Geoffrey D; Friedel, Reiner H.W.

    2013-04-23

    Systems and methods for calculating L* in the magnetosphere with essentially the same accuracy as a physics-based model, at many times the speed, by developing a model trained as a surrogate for the physics-based model. The trained model can then beneficially process input data falling within the training range of the surrogate model. The surrogate model can be a feedforward neural network and the physics-based model can be the TSK03 model. Operatively, the surrogate model can use parameters on which the physics-based model was based, and/or spatial data for the location where L* is to be calculated. Surrogate models should be provided for each of a plurality of pitch angles. Accordingly, a surrogate model having a closed drift shell can be used from the plurality of models. The feedforward neural network can have a plurality of input-layer units, there being at least one input-layer unit for each physics-based model parameter, a plurality of hidden-layer units, and at least one output unit for the value of L*.
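
    The structure described above can be sketched as follows. The weights are random placeholders rather than networks trained against the physics-based field model, and the input names are illustrative assumptions only.

      # Hedged sketch of the described architecture: one single-hidden-layer
      # feedforward network per pitch angle, mapping geomagnetic inputs plus
      # spatial coordinates to L*. Weights here are untrained placeholders.
      import numpy as np

      def feedforward(x, W1, b1, W2, b2):
          """Inputs -> tanh hidden layer -> single output unit for L*."""
          return np.tanh(x @ W1 + b1) @ W2 + b2

      rng = np.random.default_rng(2)
      n_inputs, n_hidden = 6, 20          # e.g. activity indices + (r, MLT, mlat)
      pitch_angles = [30.0, 60.0, 90.0]   # one surrogate per pitch angle

      surrogates = {
          a: (rng.normal(size=(n_inputs, n_hidden)), rng.normal(size=n_hidden),
              rng.normal(size=n_hidden), 0.0)
          for a in pitch_angles
      }

      x = rng.normal(size=n_inputs)       # one input vector (placeholder values)
      lstar_90 = feedforward(x, *surrogates[90.0])
      print("placeholder L* prediction at 90 deg pitch angle:", lstar_90)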

  8. Surrogate outcomes: experiences at the Common Drug Review

    PubMed Central

    2013-01-01

    Background Surrogate outcomes are a significant challenge in drug evaluation for health technology assessment (HTA) agencies. The research objectives were to: identify factors associated with surrogate use and acceptability in Canada’s Common Drug Review (CDR) recommendations, and compare the CDR with other HTA or regulatory agencies regarding surrogate concerns. Methods Final recommendations were identified from CDR inception (September 2003) to December 31, 2010. Recommendations were classified by type of outcome (surrogate, final, other) and acceptability of surrogates (determined by the presence/absence of statements of concern regarding surrogates). Descriptive and statistical analyses examined factors related to surrogate use and acceptability. For thirteen surrogate-based submissions, recommendations from international HTA and regulatory agencies were reviewed for statements about surrogate acceptability. Results Of 156 final recommendations, 68 (44%) involved surrogates. The overall ‘do not list’ (DNL) rate was 48%; the DNL rate for surrogates was 41% (p = 0.175). The DNL rate was 64% for non-accepted surrogates (n = 28) versus 25% for accepted surrogates (odds ratio 5.4, p = 0.002). Clinical uncertainty, use of economic evidence over price alone, and a premium price were significantly associated with non-accepted surrogates. Surrogates were used most commonly for HIV, diabetes, rare diseases, cardiovascular disease and cancer. For the subset of drugs studied, other HTA agencies did not express concerns for most recommendations, while regulatory agencies frequently stated surrogate acceptance. Conclusions The majority of surrogates were accepted at the CDR. Non-accepted surrogates were significantly associated with clinical uncertainty and a DNL recommendation. There was inconsistency of surrogate acceptability across several international agencies. Stakeholders should consider collaboratively establishing guidelines on the use, validation, and acceptability of surrogates. PMID:24341379

  9. Moment-based metrics for global sensitivity analysis of hydrological systems

    NASA Astrophysics Data System (ADS)

    Dell'Oca, Aronne; Riva, Monica; Guadagnini, Alberto

    2017-12-01

    We propose new metrics to assist global sensitivity analysis, GSA, of hydrological and Earth systems. Our approach allows assessing the impact of uncertain parameters on main features of the probability density function, pdf, of a target model output, y. These include the expected value of y, the spread around the mean and the degree of symmetry and tailedness of the pdf of y. Since reliable assessment of higher-order statistical moments can be computationally demanding, we couple our GSA approach with a surrogate model, approximating the full model response at a reduced computational cost. Here, we consider the generalized polynomial chaos expansion (gPCE), other model reduction techniques being fully compatible with our theoretical framework. We demonstrate our approach through three test cases, including an analytical benchmark, a simplified scenario mimicking pumping in a coastal aquifer and a laboratory-scale conservative transport experiment. Our results allow ascertaining which parameters can impact some moments of the model output pdf while being uninfluential to others. We also investigate the error associated with the evaluation of our sensitivity metrics by replacing the original system model through a gPCE. Our results indicate that the construction of a surrogate model with increasing level of accuracy might be required depending on the statistical moment considered in the GSA. The approach is fully compatible with (and can assist the development of) analysis techniques employed in the context of reduction of model complexity, model calibration, design of experiment, uncertainty quantification and risk assessment.
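
    The moment-based idea can be illustrated with a toy model: fit a cheap polynomial surrogate (standing in for the gPCE), then compare moments of the output pdf with and without one parameter fixed. The model, sample sizes, and quadratic basis below are arbitrary choices, not those of the paper.

      # Hedged sketch: least-squares polynomial surrogate (gPCE stand-in) used
      # to probe how fixing one uncertain parameter shifts the output moments.
      import numpy as np
      from scipy.stats import skew, kurtosis

      def model(x):                       # toy "full model", x in [0, 1]^2
          return np.sin(3.0 * x[:, 0]) + 0.4 * x[:, 1] ** 2

      rng = np.random.default_rng(3)
      X = rng.uniform(size=(300, 2))
      y = model(X)

      # Quadratic surrogate: y ~ c0 + c1*x1 + c2*x2 + c3*x1^2 + c4*x2^2 + c5*x1*x2
      def basis(X):
          x1, x2 = X[:, 0], X[:, 1]
          return np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])

      coef, *_ = np.linalg.lstsq(basis(X), y, rcond=None)
      surrogate = lambda X: basis(X) @ coef

      # Unconditional moments from cheap surrogate samples.
      Xs = rng.uniform(size=(200_000, 2))
      ys = surrogate(Xs)
      print("mean, var, skew, kurt:", ys.mean(), ys.var(), skew(ys), kurtosis(ys))

      # Conditional moments with x1 fixed at its midpoint: a large change in a
      # moment flags x1 as influential for that particular moment.
      Xc = Xs.copy(); Xc[:, 0] = 0.5
      yc = surrogate(Xc)
      print("variance with x1 fixed / unconditional variance:", yc.var() / ys.var())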

  10. Evaluation of Escherichia coli biotype 1 as a surrogate for Escherichia coli O157:H7 for cooking, fermentation, freezing, and refrigerated storage in meat processes.

    PubMed

    Keeling, Carisa; Niebuhr, Steven E; Acuff, Gary R; Dickson, James S

    2009-04-01

    Five Escherichia coli biotype 1 isolates were compared with E. coli O157:H7 under four common meat processing conditions. The processes that were evaluated were freezing, refrigerating, fermentation, and thermal inactivation. For each study, at least one surrogate organism was not statistically different when compared with E. coli O157:H7. However, the four studies did not consistently show the same isolate as having this agreement. The three studies that involved temperature as a method of controlling or reducing the E. coli population all had at least one possible surrogate in common. In the fermentation study, only one isolate (BAA-1429) showed no statistical difference when compared with E. coli O157:H7. However, the population reductions that were observed indicated the isolates BAA-1427 and BAA-1431 would overestimate the surviving E. coli O157:H7 population in a fermented summer sausage. When all of the data from all of the surrogates were examined, it was found that isolates BAA-1427, BAA-1429, and BAA-1430 would be good surrogates for all four of the processes that were examined in this study. There was no statistical difference noted between these three isolates and E. coli O157:H7 in the refrigeration study. These isolates resulted in smaller population reductions than did E. coli O157:H7 in the frozen, fermentation, and thermal inactivation studies. This would indicate that these isolates would overpredict the E. coli O157:H7 population in these three instances. This overprediction results in an additional margin of safety when using E. coli biotype 1 as a surrogate.

  11. Evaluation of bone surrogates for indirect and direct ballistic fractures.

    PubMed

    Bir, Cynthia; Andrecovich, Chris; DeMaio, Marlene; Dougherty, Paul J

    2016-04-01

    The mechanism of injury for fractures of long bones has been studied for both direct and indirect ballistic loading. However, the majority of these studies have been conducted on post-mortem human subjects (PMHS) and animal surrogates, which have constraints in terms of storage, preparation, and testing. The identification of a validated bone surrogate for use in forensic, medical, and engineering testing would provide the ability to investigate ballistic loading without these constraints. Two specific bone surrogates, Sawbones and Synbone, were evaluated in comparison to PMHS for both direct and indirect ballistic loading. For direct loading, the mean velocity to produce fracture was 121 ± 19 m/s for the PMHS, which was statistically different from the Sawbones (140 ± 7 m/s) and Synbone (146 ± 3 m/s). The average distance to fracture under indirect loading was 0.70 cm for the PMHS. The Synbone had a statistically similar average distance to fracture (0.61 cm, p = 0.54); however, the Sawbones average distance to fracture was statistically different (0.41 cm, p < 0.05). Fracture patterns were found to be comparable to the PMHS for tests conducted with Synbone, although the input parameters were slightly varied to produce similar results. The fracture patterns with the Sawbones were not found to be as comparable to the PMHS. An ideal bone surrogate for ballistic testing was not identified, and future work is warranted. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  12. Incorporating approximation error in surrogate based Bayesian inversion

    NASA Astrophysics Data System (ADS)

    Zhang, J.; Zeng, L.; Li, W.; Wu, L.

    2015-12-01

    There is increasing interest in applying surrogates in inverse Bayesian modeling to reduce repeated evaluations of the original model and thereby save computational cost. However, the approximation error of the surrogate model is usually overlooked, partly because it is difficult to evaluate for many surrogates. Previous studies have shown that the direct combination of surrogates and Bayesian methods (e.g., Markov chain Monte Carlo, MCMC) may lead to biased estimates when the surrogate cannot emulate a highly nonlinear original system. This problem can be alleviated by implementing MCMC in a two-stage manner, but the computational cost remains high because a relatively large number of original model simulations are still required. In this study, we illustrate the importance of incorporating approximation error in inverse Bayesian modeling. A Gaussian process (GP) is chosen to construct the surrogate because its approximation error is convenient to evaluate. Numerical cases of Bayesian experimental design and parameter estimation for contaminant source identification are used to illustrate this idea. It is shown that, once the surrogate approximation error is properly incorporated into the Bayesian framework, promising results can be obtained even when the surrogate is used directly, and no further original model simulations are required.
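
    A minimal sketch of the core idea, under invented toy settings: a GP surrogate replaces the forward model inside a Metropolis sampler, and the GP's predictive variance is added to the observation-error variance so that the surrogate's approximation error enters the likelihood.

      # Hedged sketch: GP surrogate inside random-walk Metropolis, with the GP
      # predictive variance added to the noise variance (toy forward model/data).
      import numpy as np
      from sklearn.gaussian_process import GaussianProcessRegressor
      from sklearn.gaussian_process.kernels import RBF

      def forward(theta):                  # "expensive" scalar forward model (toy)
          return np.sin(2.0 * theta) + 0.3 * theta

      rng = np.random.default_rng(4)
      theta_true, sigma_obs = 1.2, 0.05
      y_obs = forward(theta_true) + rng.normal(scale=sigma_obs)

      # Train the GP surrogate on a handful of forward-model runs.
      T = np.linspace(0.0, 3.0, 12)[:, None]
      gp = GaussianProcessRegressor(kernel=RBF(length_scale=1.0), alpha=1e-6)
      gp.fit(T, forward(T.ravel()))

      def log_post(theta):
          if not (0.0 <= theta <= 3.0):    # uniform prior on [0, 3]
              return -np.inf
          mu, std = gp.predict(np.array([[theta]]), return_std=True)
          var = sigma_obs**2 + std[0]**2   # include surrogate approximation error
          return -0.5 * (y_obs - mu[0])**2 / var - 0.5 * np.log(var)

      theta, chain = 1.5, []
      lp = log_post(theta)
      for _ in range(5000):
          prop = theta + rng.normal(scale=0.2)
          lp_prop = log_post(prop)
          if np.log(rng.uniform()) < lp_prop - lp:
              theta, lp = prop, lp_prop
          chain.append(theta)
      print("posterior mean of theta:", np.mean(chain[1000:]))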

  13. Adaptive surrogate model based multiobjective optimization for coastal aquifer management

    NASA Astrophysics Data System (ADS)

    Song, Jian; Yang, Yun; Wu, Jianfeng; Wu, Jichun; Sun, Xiaomin; Lin, Jin

    2018-06-01

    In this study, a novel surrogate-model-assisted multiobjective memetic algorithm (SMOMA) is developed to find optimal pumping strategies for large-scale coastal groundwater problems. The proposed SMOMA integrates an efficient data-driven surrogate model with an improved non-dominated sorting genetic algorithm-II (NSGAII) that employs a local search operator to accelerate its convergence. The surrogate model, based on the Kernel Extreme Learning Machine (KELM), is developed and evaluated as an approximate simulator that generates the patterns of regional groundwater flow and salinity in coastal aquifers at a greatly reduced computational burden. The KELM model is adaptively trained during the evolutionary search to satisfy a desired surrogate fidelity, which inhibits the accumulation of forecasting error and ensures correct convergence to the true Pareto-optimal front. The methodology is then applied to large-scale coastal aquifer management in Baldwin County, Alabama, with the objectives of minimizing the saltwater mass increase and maximizing the total pumping rate. The optimal solutions achieved with the adaptive surrogate model are compared against those obtained from a one-shot surrogate model and from the original simulation model. The adaptive surrogate model not only improves the prediction accuracy of the Pareto-optimal solutions relative to the one-shot surrogate model, but also matches the quality of the solutions obtained by NSGAII coupled with the original simulation model, while retaining the advantage of surrogate models in reducing the computational burden (up to 94% time savings). This study shows that the proposed methodology is a computationally efficient and promising tool for multiobjective optimization of coastal aquifer management.
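
    The sketch below shows the flavour of an extreme learning machine surrogate: random hidden-layer weights and a ridge-regularised least-squares solve for the output weights. It implements the basic ELM rather than the kernel variant used in the paper, omits the adaptive retraining and the NSGAII coupling, and uses a synthetic stand-in for the groundwater simulator.

      # Hedged sketch of a basic extreme learning machine surrogate (simplified
      # stand-in for the paper's KELM): random hidden weights, ridge solve.
      import numpy as np

      class ELMSurrogate:
          def __init__(self, n_hidden=100, ridge=1e-3, seed=0):
              self.n_hidden, self.ridge = n_hidden, ridge
              self.rng = np.random.default_rng(seed)

          def fit(self, X, y):
              d = X.shape[1]
              self.W = self.rng.normal(size=(d, self.n_hidden))   # random, untrained
              self.b = self.rng.normal(size=self.n_hidden)
              H = np.tanh(X @ self.W + self.b)
              A = H.T @ H + self.ridge * np.eye(self.n_hidden)
              self.beta = np.linalg.solve(A, H.T @ y)              # output weights
              return self

          def predict(self, X):
              return np.tanh(X @ self.W + self.b) @ self.beta

      # Toy stand-in for the simulator: salinity response to two pumping rates.
      rng = np.random.default_rng(5)
      X = rng.uniform(0.0, 1.0, size=(500, 2))
      y = np.exp(-2.0 * X[:, 0]) + 0.5 * X[:, 1] * X[:, 0]

      elm = ELMSurrogate().fit(X[:400], y[:400])
      err = np.abs(elm.predict(X[400:]) - y[400:]).mean()
      print("mean absolute error on held-out samples:", err)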

  14. Three Dimensional CFD Analysis of the GTX Combustor

    NASA Technical Reports Server (NTRS)

    Steffen, C. J., Jr.; Bond, R. B.; Edwards, J. R.

    2002-01-01

    The annular combustor geometry of a combined-cycle engine has been analyzed with three-dimensional computational fluid dynamics. Both subsonic combustion and supersonic combustion flowfields have been simulated. The subsonic combustion analysis was executed in conjunction with a direct-connect test rig. Results for two cold-flow cases and one hot-flow case are presented. The simulations compare favorably with the test data for the two cold-flow calculations; the hot-flow data were not yet available. The hot-flow simulation indicates that the conventional ejector-ramjet cycle would not provide adequate mixing at the conditions tested. The supersonic combustion ramjet flowfield was simulated with a frozen chemistry model. A five-parameter test matrix was specified, according to statistical design-of-experiments theory. Twenty-seven separate simulations were used to assemble surrogate models for combustor mixing efficiency and total pressure recovery. Scramjet injector design parameters (injector angle, location, and fuel split) as well as mission variables (total fuel massflow and freestream Mach number) were included in the analysis. A promising injector design has been identified that provides good mixing characteristics with low total pressure losses. The surrogate models can be used to develop performance maps of different injector designs. Several complex three-way variable interactions appear within the dataset that are not adequately resolved with the current statistical analysis.
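
    In the same spirit, but with invented numbers, a quadratic response-surface surrogate can be fitted to a small designed set of runs in five factors by least squares; the fitted polynomial can then be swept cheaply to build performance maps. The design and response below are synthetic placeholders, not the combustor data.

      # Hedged sketch: quadratic response surface fitted to a 27-run, 5-factor
      # design; the "mixing efficiency" response here is a synthetic toy.
      import itertools
      import numpy as np

      rng = np.random.default_rng(6)
      n_runs, n_factors = 27, 5
      X = rng.uniform(-1.0, 1.0, size=(n_runs, n_factors))   # coded factor levels
      eta = 0.8 + 0.05 * X[:, 0] - 0.03 * X[:, 1] * X[:, 2] - 0.02 * X[:, 3] ** 2
      eta += rng.normal(scale=0.005, size=n_runs)             # toy response

      def quad_design(X):
          """Columns: intercept, linear terms, pure quadratics, two-way interactions."""
          cols = [np.ones(len(X))] + [X[:, i] for i in range(X.shape[1])]
          cols += [X[:, i] ** 2 for i in range(X.shape[1])]
          cols += [X[:, i] * X[:, j]
                   for i, j in itertools.combinations(range(X.shape[1]), 2)]
          return np.column_stack(cols)

      A = quad_design(X)                                      # 27 runs x 21 terms
      coef, *_ = np.linalg.lstsq(A, eta, rcond=None)
      print("fitted surrogate coefficients:", coef.round(3))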

  15. An Evaluation of Two Internal Surrogates for Determining the Three-Dimensional Position of Peripheral Lung Tumors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Spoelstra, Femke; Soernsen de Koste, John R. van; Vincent, Andrew

    2009-06-01

    Purpose: Both carina and diaphragm positions have been used as surrogates during respiratory-gated radiotherapy. We studied the correlation of both surrogates with three-dimensional (3D) tumor position. Methods and Materials: A total of 59 repeat artifact-free four-dimensional (4D) computed tomography (CT) scans, acquired during uncoached breathing, were identified in 23 patients with Stage I lung cancer. Repeat scans were co-registered to the initial 4D CT scan, and tumor, carina, and ipsilateral diaphragm were manually contoured in all phases of each 4D CT data set. Correlation between positions of carina and diaphragm with 3D tumor position was studied by use of log-likelihood ratio statistics. Models to predict 3D tumor position from internal surrogates at end inspiration (EI) and end expiration (EE) were developed, and model accuracy was tested by calculating SDs of differences between predicted and actual tumor positions. Results: Motion of both the carina and diaphragm significantly correlated with tumor motion, but log-likelihood ratios indicated that the carina was more predictive for tumor position. When craniocaudal tumor position was predicted by use of craniocaudal carina positions, the SDs of the differences between the predicted and observed positions were 2.2 mm and 2.4 mm at EI and EE, respectively. The corresponding SDs derived with the diaphragm positions were 3.7 mm and 3.9 mm at EI and EE, respectively. Prediction errors in the other directions were comparable. Prediction accuracy was similar at EI and EE. Conclusions: The carina is a better surrogate of 3D tumor position than diaphragm position. Because residual prediction errors were observed in this analysis, additional studies will be performed using audio-coached scans.

  16. 40 CFR Appendix A to Part 63 - Test Methods

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... components by a different analyst). 3.3Surrogate Reference Materials. The analyst may use surrogate compounds... the variance of the proposed method is significantly different from that of the validated method by... variables can be determined in eight experiments rather than 128 (W.J. Youden, Statistical Manual of the...

  17. A comparative research of different ensemble surrogate models based on set pair analysis for the DNAPL-contaminated aquifer remediation strategy optimization.

    PubMed

    Hou, Zeyu; Lu, Wenxi; Xue, Haibo; Lin, Jin

    2017-08-01

    The surrogate-based simulation-optimization technique is an effective approach for optimizing surfactant-enhanced aquifer remediation (SEAR) strategies for clearing DNAPLs. The performance of the surrogate model, which replaces the simulation model in order to reduce the computational burden, is key to such studies. However, previous studies have generally relied on a stand-alone surrogate model and have rarely combined multiple methods to sufficiently improve the surrogate's approximation accuracy with respect to the simulation model. In this regard, we present set pair analysis (SPA) as a new method for building an ensemble surrogate (ES) model and conduct a comparative study to select a better ES modeling pattern for SEAR strategy optimization problems. Surrogate models were developed using a radial basis function artificial neural network (RBFANN), support vector regression (SVR), and Kriging. One ES model assembles the RBFANN, SVR, and Kriging models using set pair weights according to their performance; the other assembles several Kriging models (Kriging being the most accurate of the three methods) built with different training sample datasets. Finally, an optimization model, in which the ES model was embedded, was established to obtain the optimal remediation strategy. The results showed that the residuals of the outputs between the best ES model and the simulation model for 100 testing samples were lower than 1.5%. Using an ES model instead of the simulation model was critical for considerably reducing the computation time of the simulation-optimization process while maintaining high computational accuracy. Copyright © 2017 Elsevier B.V. All rights reserved.
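
    A minimal sketch of the ensemble-surrogate idea, with simplifications: three scikit-learn regressors stand in for RBFANN, SVR, and Kriging, and simple inverse-validation-error weights replace the paper's set pair analysis weights. The "simulation model" is a synthetic toy.

      # Hedged sketch of a weighted ensemble surrogate; individual surrogates
      # and the weighting rule are simplified stand-ins, data are synthetic.
      import numpy as np
      from sklearn.svm import SVR
      from sklearn.neural_network import MLPRegressor
      from sklearn.gaussian_process import GaussianProcessRegressor
      from sklearn.gaussian_process.kernels import RBF

      rng = np.random.default_rng(7)
      X = rng.uniform(size=(300, 3))                       # e.g. surfactant doses
      y = np.sin(4 * X[:, 0]) + X[:, 1] * X[:, 2]          # toy removal-rate response

      X_tr, y_tr = X[:200], y[:200]
      X_val, y_val = X[200:250], y[200:250]
      X_te, y_te = X[250:], y[250:]

      models = [
          MLPRegressor(hidden_layer_sizes=(32,), max_iter=5000, random_state=0),
          SVR(kernel="rbf", C=10.0),
          GaussianProcessRegressor(kernel=RBF(length_scale=0.5), alpha=1e-6),
      ]
      for m in models:
          m.fit(X_tr, y_tr)

      # Weight each surrogate by the inverse of its validation RMSE.
      rmse = np.array([np.sqrt(np.mean((m.predict(X_val) - y_val) ** 2)) for m in models])
      w = (1.0 / rmse) / np.sum(1.0 / rmse)

      ensemble = sum(wi * m.predict(X_te) for wi, m in zip(w, models))
      print("weights:", w.round(3))
      print("ensemble RMSE:", np.sqrt(np.mean((ensemble - y_te) ** 2)))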

  18. Maximum entropy models as a tool for building precise neural controls.

    PubMed

    Savin, Cristina; Tkačik, Gašper

    2017-10-01

    Neural responses are highly structured, with population activity restricted to a small subset of the astronomical range of possible activity patterns. Characterizing these statistical regularities is important for understanding circuit computation, but challenging in practice. Here we review recent approaches based on the maximum entropy principle used for quantifying collective behavior in neural activity. We highlight recent models that capture population-level statistics of neural data, yielding insights into the organization of the neural code and its biological substrate. Furthermore, the MaxEnt framework provides a general recipe for constructing surrogate ensembles that preserve aspects of the data, but are otherwise maximally unstructured. This idea can be used to generate a hierarchy of controls against which rigorous statistical tests are possible. Copyright © 2017 Elsevier Ltd. All rights reserved.
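
    The simplest such control is the independent maximum-entropy model, which preserves each neuron's firing rate and nothing else. The sketch below builds that surrogate ensemble for synthetic spike data and compares a pairwise-correlation statistic against it; the data and the test statistic are illustrative choices, not taken from the review.

      # Hedged sketch: independent MaxEnt-style surrogate rasters preserving
      # per-neuron firing rates, used as a control for pairwise correlations.
      import numpy as np

      rng = np.random.default_rng(8)
      n_neurons, n_bins = 20, 5000

      # Synthetic "data": correlated binary spikes driven by a shared input.
      shared = rng.normal(size=n_bins)
      rates = rng.uniform(0.02, 0.2, size=n_neurons)
      data = (rng.uniform(size=(n_neurons, n_bins)) <
              rates[:, None] * (1.0 + 0.8 * np.tanh(shared))).astype(int)

      def mean_abs_corr(spikes):
          c = np.corrcoef(spikes)
          return np.abs(c[np.triu_indices_from(c, k=1)]).mean()

      observed = mean_abs_corr(data)

      # Surrogate ensemble: independent Bernoulli spikes with matched rates.
      p = data.mean(axis=1, keepdims=True)
      null = [mean_abs_corr((rng.uniform(size=data.shape) < p).astype(int))
              for _ in range(200)]
      print("observed mean |corr|:", round(observed, 4))
      print("surrogate 95th percentile:", round(np.percentile(null, 95), 4))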

  19. A Taylor Expansion-Based Adaptive Design Strategy for Global Surrogate Modeling With Applications in Groundwater Modeling

    NASA Astrophysics Data System (ADS)

    Mo, Shaoxing; Lu, Dan; Shi, Xiaoqing; Zhang, Guannan; Ye, Ming; Wu, Jianfeng; Wu, Jichun

    2017-12-01

    Global sensitivity analysis (GSA) and uncertainty quantification (UQ) for groundwater modeling are challenging because of the model complexity and significant computational requirements. To reduce the massive computational cost, a cheap-to-evaluate surrogate model is usually constructed to approximate and replace the expensive groundwater models in the GSA and UQ. Constructing an accurate surrogate requires actual model simulations on a number of parameter samples. Thus, a robust experimental design strategy is desired to locate informative samples so as to reduce the computational cost in surrogate construction and consequently to improve the efficiency in the GSA and UQ. In this study, we develop a Taylor expansion-based adaptive design (TEAD) that aims to build an accurate global surrogate model with a small training sample size. TEAD defines a novel hybrid score function to search informative samples, and a robust stopping criterion to terminate the sample search that guarantees the resulting approximation errors satisfy the desired accuracy. The good performance of TEAD in building global surrogate models is demonstrated on seven analytical functions with different dimensionality and complexity in comparison to two widely used experimental design methods. The application of the TEAD-based surrogate method in two groundwater models shows that the TEAD design can effectively improve the computational efficiency of GSA and UQ for groundwater modeling.
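
    A schematic sketch of the adaptive-design idea is given below: candidate points are scored by a hybrid of distance to existing samples (exploration) and the discrepancy between the surrogate and its local first-order Taylor expansion (exploitation). This is a loose, one-dimensional illustration with a polynomial surrogate, not the authors' TEAD score function or stopping criterion.

      # Hedged, schematic adaptive-design loop inspired by (but not reproducing)
      # TEAD: hybrid exploration/exploitation score over candidate points.
      import numpy as np

      f = lambda x: np.sin(6 * x) + 0.3 * x            # "expensive" 1-D model (toy)

      X = np.array([0.0, 0.5, 1.0])                    # initial design
      y = f(X)
      candidates = np.linspace(0.0, 1.0, 201)

      for _ in range(10):
          coef = np.polyfit(X, y, deg=min(len(X) - 1, 5))      # polynomial surrogate
          surr, dsurr = np.poly1d(coef), np.poly1d(coef).deriv()

          d = np.array([np.min(np.abs(c - X)) for c in candidates])     # exploration
          nearest = X[np.argmin(np.abs(candidates[:, None] - X), axis=1)]
          taylor = surr(nearest) + dsurr(nearest) * (candidates - nearest)
          t = np.abs(surr(candidates) - taylor)                          # exploitation
          score = d / d.max() + t / (t.max() + 1e-12)

          x_new = candidates[np.argmax(score)]
          X, y = np.append(X, x_new), np.append(y, f(x_new))

      print("final design points:", np.sort(X).round(3))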

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rabiti, Cristian; Alfonsi, Andrea; Huang, Dongli

    This report collects the work performed to improve the reliability analysis capabilities of the RAVEN code and to explore new opportunities in the use of surrogate models, by extending the current RAVEN capabilities to multiphysics surrogate models and to the construction of surrogate models for high-dimensionality fields.

  1. A review of surrogate models and their application to groundwater modeling

    NASA Astrophysics Data System (ADS)

    Asher, M. J.; Croke, B. F. W.; Jakeman, A. J.; Peeters, L. J. M.

    2015-08-01

    The spatially and temporally variable parameters and inputs to complex groundwater models typically result in long runtimes, which hinder comprehensive calibration, sensitivity, and uncertainty analysis. Surrogate modeling aims to provide a simpler, and hence faster, model which emulates the specified output of a more complex model as a function of its inputs and parameters. In this review paper, we summarize surrogate modeling techniques in three categories: data-driven, projection, and hierarchical-based approaches. Data-driven surrogates approximate a groundwater model through an empirical model that captures the input-output mapping of the original model. Projection-based models reduce the dimensionality of the parameter space by projecting the governing equations onto a basis of orthonormal vectors. In hierarchical or multifidelity methods the surrogate is created by simplifying the representation of the physical system, such as by ignoring certain processes, or reducing the numerical resolution. In discussing the application to groundwater modeling of these methods, we note several imbalances in the existing literature: a large body of work on data-driven approaches seemingly ignores major drawbacks to the methods; only a fraction of the literature focuses on creating surrogates to reproduce outputs of fully distributed groundwater models, despite these being ubiquitous in practice; and a number of the more advanced surrogate modeling methods are yet to be fully applied in a groundwater modeling context.

  2. [Immunological surrogate endpoints to evaluate vaccine efficacy].

    PubMed

    Jin, Pengfei; Li, Jingxin; Zhou, Yang; Zhu, Fengcai

    2015-12-01

    An immunological surrogate endpoint is a vaccine-induced immune response (either humoral or cellular) that predicts protection against clinical endpoints (infection or disease) and can be used to evaluate vaccine efficacy in clinical vaccine trials. Compared with field efficacy trials observing clinical endpoints, immunological vaccine trials can reduce the sample size or shorten the duration of a trial, which promotes the licensure and development of new candidate vaccines. For these reasons, establishing immunological surrogate endpoints is one of the 14 Grand Challenges in Global Health of the National Institutes of Health (NIH) and the Bill and Melinda Gates Foundation. This review provides a comprehensive description in two parts: the definition of surrogate endpoints and the statistical methods for their evaluation.

  3. Is prostate-specific antigen a valid surrogate end point for survival in hormonally treated patients with metastatic prostate cancer? Joint research of the European Organisation for Research and Treatment of Cancer, the Limburgs Universitair Centrum, and AstraZeneca Pharmaceuticals.

    PubMed

    Collette, Laurence; Burzykowski, Tomasz; Carroll, Kevin J; Newling, Don; Morris, Tom; Schröder, Fritz H

    2005-09-01

    The long duration of phase III clinical trials of overall survival (OS) slows down the treatment-development process. It could be shortened by using surrogate end points. Prostate-specific antigen (PSA) is the most studied biomarker in prostate cancer (PCa). This study attempts to validate PSA end points as surrogates for OS in advanced PCa. Individual data from 2,161 advanced PCa patients treated in studies comparing bicalutamide to castration were used in a meta-analytic approach to surrogate end-point validation. PSA response, PSA normalization, time to PSA progression, and longitudinal PSA measurements were considered. The known association between PSA and OS at the individual patient level was confirmed. The association between the effect of intervention on any PSA end point and on OS was generally low (determination coefficient, < 0.69). It is a common misconception that a high correlation between a biomarker and the true end point justifies the use of the former as a surrogate. To statistically validate surrogate end points, a high correlation between the treatment effects on the surrogate and true end point needs to be established across groups of patients treated with two alternative interventions. The levels of association observed in this study indicate that the effect of hormonal treatment on OS cannot be predicted with a high degree of precision from observed treatment effects on PSA end points, and thus statistical validity is unproven. In practice, non-null treatment effects on OS can be predicted only from precisely estimated large effects on time to PSA progression (TTPP; hazard ratio, < 0.50).

  4. A rank test for bivariate time-to-event outcomes when one event is a surrogate

    PubMed Central

    Shaw, Pamela A.; Fay, Michael P.

    2016-01-01

    In many clinical settings, improving patient survival is of interest but a practical surrogate, such as time to disease progression, is instead used as a clinical trial’s primary endpoint. A time-to-first endpoint (e.g. death or disease progression) is commonly analyzed but may not be adequate to summarize patient outcomes if a subsequent event contains important additional information. We consider a surrogate outcome very generally, as one correlated with the true endpoint of interest. Settings of interest include those where the surrogate indicates a beneficial outcome so that the usual time-to-first endpoint of death or surrogate event is nonsensical. We present a new two-sample test for bivariate, interval-censored time-to-event data, where one endpoint is a surrogate for the second, less frequently observed endpoint of true interest. This test examines whether patient groups have equal clinical severity. If the true endpoint rarely occurs, the proposed test acts like a weighted logrank test on the surrogate; if it occurs for most individuals, then our test acts like a weighted logrank test on the true endpoint. If the surrogate is a useful statistical surrogate, our test can have better power than tests based on the surrogate that naively handle the true endpoint. In settings where the surrogate is not valid (treatment affects the surrogate but not the true endpoint), our test incorporates the information regarding the lack of treatment effect from the observed true endpoints and hence is expected to have a dampened treatment effect compared to tests based on the surrogate alone. PMID:27059817

  5. Surrogate outcomes in health technology assessment: an international comparison.

    PubMed

    Velasco Garrido, Marcial; Mangiapane, Sandra

    2009-07-01

    Our aim was to review the recommendations given by health technology assessment (HTA) institutions in their methodological guidelines concerning the use of surrogate outcomes in their assessments. In a second step, we aimed at quantifying the role surrogate parameters take in assessment reports. We analyzed methodological papers and guidelines from HTA agencies with International Network of Agencies for Health Technology Assessment membership as well as from institutions related to pharmaceutical regulation (i.e., reimbursement, pricing). We analyzed the use of surrogate outcomes in a sample of HTA reports randomly drawn from the HTA database. We checked methods, results (including evidence tables), and conclusions sections and extracted the outcomes reported. We report descriptive statistics on the presence of surrogate outcomes in the reports. We identified thirty-four methodological guidelines, twenty of them addressing the issue of outcome parameter choice and the problems associated with surrogate outcomes. Overall, HTA agencies urge caution regarding reliance on surrogate outcomes. None of the agencies has provided a list or catalog of acceptable and validated surrogate outcomes. We extracted the outcome parameters of 140 HTA reports. Only around half of the reports determined the outcomes for the assessment prospectively. Surrogate outcomes had been used in 62 percent of the reports. However, only 3.6 percent were based upon surrogate outcomes exclusively. All of them assessed diagnostic or screening technologies and the surrogate outcomes were predominantly test characteristics. HTA institutions seem to agree on a cautious approach to the use of surrogate outcomes in technology assessment. Thorough assessment of health technologies should not rely exclusively on surrogate outcomes.

  6. An investigation into the two-stage meta-analytic copula modelling approach for evaluating time-to-event surrogate endpoints which comprise of one or more events of interest.

    PubMed

    Dimier, Natalie; Todd, Susan

    2017-09-01

    Clinical trials of experimental treatments must be designed with primary endpoints that directly measure clinical benefit for patients. In many disease areas, the recognised gold standard primary endpoint can take many years to mature, leading to challenges in the conduct and quality of clinical studies. There is increasing interest in using shorter-term surrogate endpoints as substitutes for costly long-term clinical trial endpoints; such surrogates need to be selected according to biological plausibility, as well as the ability to reliably predict the unobserved treatment effect on the long-term endpoint. A number of statistical methods to evaluate this prediction have been proposed; this paper uses a simulation study to explore one such method in the context of time-to-event surrogates for a time-to-event true endpoint. This two-stage meta-analytic copula method has been extensively studied for time-to-event surrogate endpoints with one event of interest, but thus far has not been explored for the assessment of surrogates which have multiple events of interest, such as those incorporating information directly from the true clinical endpoint. We assess the sensitivity of the method to various factors including strength of association between endpoints, the quantity of data available, and the effect of censoring. In particular, we consider scenarios where there exist very little data on which to assess surrogacy. Results show that the two-stage meta-analytic copula method performs well under certain circumstances and could be considered useful in practice, but demonstrates limitations that may prevent universal use. Copyright © 2017 John Wiley & Sons, Ltd.

  7. Surrogate modeling of deformable joint contact using artificial neural networks.

    PubMed

    Eskinazi, Ilan; Fregly, Benjamin J

    2015-09-01

    Deformable joint contact models can be used to estimate loading conditions for cartilage-cartilage, implant-implant, human-orthotic, and foot-ground interactions. However, contact evaluations are often so expensive computationally that they can be prohibitive for simulations or optimizations requiring thousands or even millions of contact evaluations. To overcome this limitation, we developed a novel surrogate contact modeling method based on artificial neural networks (ANNs). The method uses special sampling techniques to gather input-output data points from an original (slow) contact model in multiple domains of input space, where each domain represents a different physical situation likely to be encountered. For each contact force and torque output by the original contact model, a multi-layer feed-forward ANN is defined, trained, and incorporated into a surrogate contact model. As an evaluation problem, we created an ANN-based surrogate contact model of an artificial tibiofemoral joint using over 75,000 evaluations of a fine-grid elastic foundation (EF) contact model. The surrogate contact model computed contact forces and torques about 1000 times faster than a less accurate coarse grid EF contact model. Furthermore, the surrogate contact model was seven times more accurate than the coarse grid EF contact model within the input domain of a walking motion. For larger input domains, the surrogate contact model showed the expected trend of increasing error with increasing domain size. In addition, the surrogate contact model was able to identify out-of-contact situations with high accuracy. Computational contact models created using our proposed ANN approach may remove an important computational bottleneck from musculoskeletal simulations or optimizations incorporating deformable joint contact models. Copyright © 2015 IPEM. Published by Elsevier Ltd. All rights reserved.

  8. Surrogate Modeling of Deformable Joint Contact using Artificial Neural Networks

    PubMed Central

    Eskinazi, Ilan; Fregly, Benjamin J.

    2016-01-01

    Deformable joint contact models can be used to estimate loading conditions for cartilage-cartilage, implant-implant, human-orthotic, and foot-ground interactions. However, contact evaluations are often so expensive computationally that they can be prohibitive for simulations or optimizations requiring thousands or even millions of contact evaluations. To overcome this limitation, we developed a novel surrogate contact modeling method based on artificial neural networks (ANNs). The method uses special sampling techniques to gather input-output data points from an original (slow) contact model in multiple domains of input space, where each domain represents a different physical situation likely to be encountered. For each contact force and torque output by the original contact model, a multi-layer feed-forward ANN is defined, trained, and incorporated into a surrogate contact model. As an evaluation problem, we created an ANN-based surrogate contact model of an artificial tibiofemoral joint using over 75,000 evaluations of a fine-grid elastic foundation (EF) contact model. The surrogate contact model computed contact forces and torques about 1000 times faster than a less accurate coarse grid EF contact model. Furthermore, the surrogate contact model was seven times more accurate than the coarse grid EF contact model within the input domain of a walking motion. For larger input domains, the surrogate contact model showed the expected trend of increasing error with increasing domain size. In addition, the surrogate contact model was able to identify out-of-contact situations with high accuracy. Computational contact models created using our proposed ANN approach may remove an important computational bottleneck from musculoskeletal simulations or optimizations incorporating deformable joint contact models. PMID:26220591

  9. Selection of Surrogate Bacteria for Use in Food Safety Challenge Studies: A Review.

    PubMed

    Hu, Mengyi; Gurtler, Joshua B

    2017-09-01

    Nonpathogenic surrogate bacteria are prevalently used in a variety of food challenge studies in place of foodborne pathogens such as Listeria monocytogenes, Salmonella, Escherichia coli O157:H7, and Clostridium botulinum because of safety and sanitary concerns. Surrogate bacteria should have growth characteristics and/or inactivation kinetics similar to those of target pathogens under given conditions in challenge studies. It is of great importance to carefully select and validate potential surrogate bacteria when verifying microbial inactivation processes. A validated surrogate responds similar to the targeted pathogen when tested for inactivation kinetics, growth parameters, or survivability under given conditions in agreement with appropriate statistical analyses. However, a considerable number of food studies involving putative surrogate bacteria lack convincing validation sources or adequate validation processes. Most of the validation information for surrogates in these studies is anecdotal and has been collected from previous publications but may not be sufficient for given conditions in the study at hand. This review is limited to an overview of select studies and discussion of the general criteria and approaches for selecting potential surrogate bacteria under given conditions. The review also includes a list of documented bacterial pathogen surrogates and their corresponding food products and treatments to provide guidance for future studies.

  10. Ensemble of surrogates-based optimization for identifying an optimal surfactant-enhanced aquifer remediation strategy at heterogeneous DNAPL-contaminated sites

    NASA Astrophysics Data System (ADS)

    Jiang, Xue; Lu, Wenxi; Hou, Zeyu; Zhao, Haiqing; Na, Jin

    2015-11-01

    The purpose of this study was to identify an optimal surfactant-enhanced aquifer remediation (SEAR) strategy for aquifers contaminated by dense non-aqueous phase liquid (DNAPL) using an ensemble-of-surrogates-based optimization technique. A saturated heterogeneous medium contaminated by nitrobenzene was selected as the case study. A new kind of surrogate-based SEAR optimization employing an ensemble surrogate (ES) model together with a genetic algorithm (GA) is presented. Four methods, namely radial basis function artificial neural network (RBFANN), kriging (KRG), support vector regression (SVR), and kernel extreme learning machines (KELM), were used to create four individual surrogate models, which were then compared. The comparison enabled us to select the two most accurate models (KELM and KRG) to establish an ES model of the SEAR simulation model, and the developed ES model was then compared against the four stand-alone surrogate models. The results showed that the average relative error of the average nitrobenzene removal rates between the ES model and the simulation model for 20 test samples was 0.8%, a high approximation accuracy, indicating that the ES model provides more accurate predictions than the stand-alone surrogate models. Then, a nonlinear optimization model was formulated for the minimum cost, and the developed ES model was embedded into this optimization model as a constraint condition. GA was then used to solve the optimization model and provide the optimal SEAR strategy. The developed ensemble surrogate-optimization approach was effective in seeking a cost-effective SEAR strategy for heterogeneous DNAPL-contaminated sites. This research is expected to enrich and develop the theoretical and technical implications for the analysis of remediation strategy optimization of DNAPL-contaminated aquifers.

  11. Ensemble of Surrogates-based Optimization for Identifying an Optimal Surfactant-enhanced Aquifer Remediation Strategy at Heterogeneous DNAPL-contaminated Sites

    NASA Astrophysics Data System (ADS)

    Lu, W., Sr.; Xin, X.; Luo, J.; Jiang, X.; Zhang, Y.; Zhao, Y.; Chen, M.; Hou, Z.; Ouyang, Q.

    2015-12-01

    The purpose of this study was to identify an optimal surfactant-enhanced aquifer remediation (SEAR) strategy for aquifers contaminated by dense non-aqueous phase liquid (DNAPL) using an ensemble-of-surrogates-based optimization technique. A saturated heterogeneous medium contaminated by nitrobenzene was selected as the case study. A new kind of surrogate-based SEAR optimization employing an ensemble surrogate (ES) model together with a genetic algorithm (GA) is presented. Four methods, namely radial basis function artificial neural network (RBFANN), kriging (KRG), support vector regression (SVR), and kernel extreme learning machines (KELM), were used to create four individual surrogate models, which were then compared. The comparison enabled us to select the two most accurate models (KELM and KRG) to establish an ES model of the SEAR simulation model, and the developed ES model was then compared against the four stand-alone surrogate models. The results showed that the average relative error of the average nitrobenzene removal rates between the ES model and the simulation model for 20 test samples was 0.8%, a high approximation accuracy, indicating that the ES model provides more accurate predictions than the stand-alone surrogate models. Then, a nonlinear optimization model was formulated for the minimum cost, and the developed ES model was embedded into this optimization model as a constraint condition. GA was then used to solve the optimization model and provide the optimal SEAR strategy. The developed ensemble surrogate-optimization approach was effective in seeking a cost-effective SEAR strategy for heterogeneous DNAPL-contaminated sites. This research is expected to enrich and develop the theoretical and technical implications for the analysis of remediation strategy optimization of DNAPL-contaminated aquifers.

  12. Weighted Iterative Bayesian Compressive Sensing (WIBCS) for High Dimensional Polynomial Surrogate Construction

    NASA Astrophysics Data System (ADS)

    Sargsyan, K.; Ricciuto, D. M.; Safta, C.; Debusschere, B.; Najm, H. N.; Thornton, P. E.

    2016-12-01

    Surrogate construction has become a routine procedure when facing computationally intensive studies requiring multiple evaluations of complex models. In particular, surrogate models, otherwise called emulators or response surfaces, replace complex models in uncertainty quantification (UQ) studies, including uncertainty propagation (forward UQ) and parameter estimation (inverse UQ). Further, surrogates based on Polynomial Chaos (PC) expansions are especially convenient for forward UQ and global sensitivity analysis, also known as variance-based decomposition. However, PC surrogate construction strongly suffers from the curse of dimensionality. With a large number of input parameters, the number of model simulations required for accurate surrogate construction is prohibitively large. Relatedly, non-adaptive PC expansions typically include an infeasibly large number of basis terms, far exceeding the number of available model evaluations. We develop the Weighted Iterative Bayesian Compressive Sensing (WIBCS) algorithm for adaptive basis growth and PC surrogate construction, leading to a sparse, high-dimensional PC surrogate with very few model evaluations. The surrogate is then readily employed for global sensitivity analysis leading to further dimensionality reduction. Besides numerical tests, we demonstrate the construction on the example of the Accelerated Climate Model for Energy (ACME) Land Model for several output QoIs at nearly 100 FLUXNET sites covering multiple plant functional types and climates, varying 65 input parameters over broad ranges of possible values. This work is supported by the U.S. Department of Energy, Office of Science, Biological and Environmental Research, Accelerated Climate Modeling for Energy (ACME) project. Sandia National Laboratories is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000.
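
    The sparse-regression idea underlying compressive-sensing PC construction can be illustrated with ordinary Lasso over a Legendre polynomial chaos basis, as below. This is not the WIBCS algorithm (no iterative Bayesian weighting or adaptive basis growth), and the 10-input model is a synthetic toy.

      # Hedged sketch: sparse PC surrogate via Lasso over a Legendre basis of
      # total degree <= 2 in 10 inputs; data and true model are synthetic.
      import itertools
      import numpy as np
      from numpy.polynomial.legendre import legval
      from sklearn.linear_model import Lasso

      rng = np.random.default_rng(9)
      dim, degree, n_train = 10, 2, 120
      X = rng.uniform(-1.0, 1.0, size=(n_train, dim))
      y = 1.0 + 2.0 * X[:, 0] + 0.7 * X[:, 3] * X[:, 7]      # sparse toy model

      # Multi-indices of total degree <= 2 over 10 inputs (66 basis terms).
      multi = [m for m in itertools.product(range(degree + 1), repeat=dim)
               if sum(m) <= degree]

      def pc_basis(X):
          cols = []
          for m in multi:
              col = np.ones(len(X))
              for j, order in enumerate(m):
                  if order:
                      col *= legval(X[:, j], [0.0] * order + [1.0])
              cols.append(col)
          return np.column_stack(cols)

      Phi = pc_basis(X)
      lasso = Lasso(alpha=1e-3, max_iter=50_000).fit(Phi, y)
      active = np.flatnonzero(np.abs(lasso.coef_) > 1e-3)
      print("surviving basis terms:", len(active), "of", len(multi))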

  13. A Taylor Expansion-Based Adaptive Design Strategy for Global Surrogate Modeling With Applications in Groundwater Modeling

    DOE PAGES

    Mo, Shaoxing; Lu, Dan; Shi, Xiaoqing; ...

    2017-12-27

    Global sensitivity analysis (GSA) and uncertainty quantification (UQ) for groundwater modeling are challenging because of the model complexity and significant computational requirements. To reduce the massive computational cost, a cheap-to-evaluate surrogate model is usually constructed to approximate and replace the expensive groundwater models in the GSA and UQ. Constructing an accurate surrogate requires actual model simulations on a number of parameter samples. Thus, a robust experimental design strategy is desired to locate informative samples so as to reduce the computational cost in surrogate construction and consequently to improve the efficiency in the GSA and UQ. In this study, we develop a Taylor expansion-based adaptive design (TEAD) that aims to build an accurate global surrogate model with a small training sample size. TEAD defines a novel hybrid score function to search informative samples, and a robust stopping criterion to terminate the sample search that guarantees the resulting approximation errors satisfy the desired accuracy. The good performance of TEAD in building global surrogate models is demonstrated on seven analytical functions with different dimensionality and complexity in comparison to two widely used experimental design methods. The application of the TEAD-based surrogate method in two groundwater models shows that the TEAD design can effectively improve the computational efficiency of GSA and UQ for groundwater modeling.

  14. A Taylor Expansion-Based Adaptive Design Strategy for Global Surrogate Modeling With Applications in Groundwater Modeling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mo, Shaoxing; Lu, Dan; Shi, Xiaoqing

    Global sensitivity analysis (GSA) and uncertainty quantification (UQ) for groundwater modeling are challenging because of the model complexity and significant computational requirements. To reduce the massive computational cost, a cheap-to-evaluate surrogate model is usually constructed to approximate and replace the expensive groundwater models in the GSA and UQ. Constructing an accurate surrogate requires actual model simulations on a number of parameter samples. Thus, a robust experimental design strategy is desired to locate informative samples so as to reduce the computational cost in surrogate construction and consequently to improve the efficiency in the GSA and UQ. In this study, we develop a Taylor expansion-based adaptive design (TEAD) that aims to build an accurate global surrogate model with a small training sample size. TEAD defines a novel hybrid score function to search informative samples, and a robust stopping criterion to terminate the sample search that guarantees the resulting approximation errors satisfy the desired accuracy. The good performance of TEAD in building global surrogate models is demonstrated on seven analytical functions with different dimensionality and complexity in comparison to two widely used experimental design methods. The application of the TEAD-based surrogate method in two groundwater models shows that the TEAD design can effectively improve the computational efficiency of GSA and UQ for groundwater modeling.

  15. Surrogate waveform models

    NASA Astrophysics Data System (ADS)

    Blackman, Jonathan; Field, Scott; Galley, Chad; Scheel, Mark; Szilagyi, Bela; Tiglio, Manuel

    2015-04-01

    With the advanced detector era just around the corner, there is a strong need for fast and accurate models of gravitational waveforms from compact binary coalescence. Fast surrogate models can be built out of an accurate but slow waveform model with minimal to no loss in accuracy, but may require a large number of evaluations of the underlying model. This may be prohibitively expensive if the underlying model is extremely slow, for example if we wish to build a surrogate for numerical relativity. We examine alternative approaches to building surrogate models that allow for a sparser set of input waveforms. Research supported in part by NSERC.

  16. Statistical Tests of System Linearity Based on the Method of Surrogate Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hunter, N.; Paez, T.; Red-Horse, J.

    When dealing with measured data from dynamic systems we often make the tacit assumption that the data are generated by linear dynamics. While some systematic tests for linearity and determinism are available - for example the coherence function, the probability density function, and the bispectrum - further tests that quantify the existence and the degree of nonlinearity are clearly needed. In this paper we demonstrate a statistical test for the nonlinearity exhibited by a dynamic system excited by Gaussian random noise. We perform the usual division of the input and response time series data into blocks as required by the Welch method of spectrum estimation and search for significant relationships between a given input frequency and response at harmonics of the selected input frequency. We argue that systematic tests based on the recently developed statistical method of surrogate data readily detect significant nonlinear relationships. The paper elucidates the method of surrogate data. Typical results are illustrated for a linear single degree-of-freedom system and for a system with polynomial stiffness nonlinearity.
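
    The surrogate-data test can be sketched as follows: phase-randomised surrogates preserve the signal's power spectrum (its linear structure), so a nonlinear statistic that falls well outside the surrogate distribution points to nonlinearity. The signal and test statistic below are illustrative choices, not the paper's input/response harmonic analysis.

      # Hedged sketch of a surrogate-data nonlinearity test: phase-randomised
      # surrogates vs. a simple time-reversal-asymmetry statistic (toy signal).
      import numpy as np

      rng = np.random.default_rng(10)
      n = 4096
      x = rng.normal(size=n)
      for _ in range(3):                       # crude low-pass filtering
          x = np.convolve(x, np.ones(5) / 5, mode="same")
      x = x + 0.8 * x**2                       # quadratic (nonlinear) distortion

      def phase_randomised(x, rng):
          """Surrogate with the same amplitude spectrum but random phases."""
          X = np.fft.rfft(x)
          phases = rng.uniform(0.0, 2.0 * np.pi, size=len(X))
          phases[0] = 0.0                      # keep the mean component real
          return np.fft.irfft(np.abs(X) * np.exp(1j * phases), n=len(x))

      def trev(x, lag=1):
          """Time-reversal asymmetry, a statistic sensitive to nonlinearity."""
          d = x[lag:] - x[:-lag]
          return np.mean(d**3) / np.mean(d**2) ** 1.5

      stat = trev(x)
      null = [trev(phase_randomised(x, rng)) for _ in range(200)]
      print("observed statistic:", round(stat, 3))
      print("surrogate 2.5th-97.5th percentiles:",
            np.round(np.percentile(null, [2.5, 97.5]), 3))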

  17. Early Fungicidal Activity as a Candidate Surrogate Endpoint for All-Cause Mortality in Cryptococcal Meningitis: A Systematic Review of the Evidence.

    PubMed

    Montezuma-Rusca, Jairo M; Powers, John H; Follmann, Dean; Wang, Jing; Sullivan, Brigit; Williamson, Peter R

    2016-01-01

    Cryptococcal meningitis (CM) is a leading cause of HIV-associated mortality. In clinical trials evaluating treatments for CM, biomarkers of early fungicidal activity (EFA) in cerebrospinal fluid (CSF) have been proposed as candidate surrogate endpoints for all-cause mortality (ACM). However, there has been no systematic evaluation of the group-level or trial-level evidence for EFA as a candidate surrogate endpoint for ACM. We conducted a systematic review of randomized trials in treatment of CM to evaluate available evidence for EFA measured as culture negativity at 2 weeks/10 weeks and slope of EFA as candidate surrogate endpoints for ACM. We performed sensitivity analysis on superiority trials and high-quality trials as determined by Cochrane measures of trial bias. Twenty-seven trials including 2854 patients met inclusion criteria. Mean ACM was 15.8% at 2 weeks and 27.0% at 10 weeks with no overall significant difference between test and control groups. There was a statistically significant group-level correlation between average EFA and ACM at 10 weeks but not at 2 weeks. There was no statistically significant group-level correlation for CFU culture negativity at 2 weeks/10 weeks or average EFA slope at 10 weeks. A statistically significant trial-level correlation was identified between EFA slope and ACM at 2 weeks, but this is likely misleading, as there was no treatment effect on ACM. Mortality remains high over short time periods in CM clinical trials. Using published data and Institute of Medicine criteria, evidence for use of EFA as a surrogate endpoint for ACM is insufficient and could provide misleading results from clinical trials. ACM should be used as a primary endpoint in trials evaluating treatments for cryptococcal meningitis.

  18. Inference of reaction rate parameters based on summary statistics from experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Khalil, Mohammad; Chowdhary, Kamaljit Singh; Safta, Cosmin

    Here, we present the results of an application of Bayesian inference and maximum entropy methods for the estimation of the joint probability density for the Arrhenius rate parameters of the rate coefficient of the H2/O2-mechanism chain branching reaction H + O2 → OH + O. Available published data is in the form of summary statistics in terms of nominal values and error bars of the rate coefficient of this reaction at a number of temperature values obtained from shock-tube experiments. Our approach relies on generating data, in this case OH concentration profiles, consistent with the given summary statistics, using Approximate Bayesian Computation methods and a Markov Chain Monte Carlo procedure. The approach permits the forward propagation of parametric uncertainty through the computational model in a manner that is consistent with the published statistics. A consensus joint posterior on the parameters is obtained by pooling the posterior parameter densities given each consistent data set. To expedite this process, we construct efficient surrogates for the OH concentration using a combination of Padé and polynomial approximants. These surrogate models adequately represent forward model observables and their dependence on input parameters and are computationally efficient to allow their use in the Bayesian inference procedure. We also utilize Gauss-Hermite quadrature with Gaussian proposal probability density functions for moment computation resulting in orders of magnitude speedup in data likelihood evaluation. Despite the strong non-linearity in the model, the consistent data sets all result in nearly Gaussian conditional parameter probability density functions. The technique also accounts for nuisance parameters in the form of Arrhenius parameters of other rate coefficients with prescribed uncertainty. The resulting pooled parameter probability density function is propagated through stoichiometric hydrogen-air auto-ignition computations to illustrate the need to account for correlation among the Arrhenius rate parameters of one reaction and across rate parameters of different reactions.

  19. Inference of reaction rate parameters based on summary statistics from experiments

    DOE PAGES

    Khalil, Mohammad; Chowdhary, Kamaljit Singh; Safta, Cosmin; ...

    2016-10-15

    Here, we present the results of an application of Bayesian inference and maximum entropy methods for the estimation of the joint probability density for the Arrhenius rate parameters of the rate coefficient of the H2/O2-mechanism chain branching reaction H + O2 → OH + O. Available published data is in the form of summary statistics in terms of nominal values and error bars of the rate coefficient of this reaction at a number of temperature values obtained from shock-tube experiments. Our approach relies on generating data, in this case OH concentration profiles, consistent with the given summary statistics, using Approximate Bayesian Computation methods and a Markov Chain Monte Carlo procedure. The approach permits the forward propagation of parametric uncertainty through the computational model in a manner that is consistent with the published statistics. A consensus joint posterior on the parameters is obtained by pooling the posterior parameter densities given each consistent data set. To expedite this process, we construct efficient surrogates for the OH concentration using a combination of Padé and polynomial approximants. These surrogate models adequately represent forward model observables and their dependence on input parameters and are computationally efficient to allow their use in the Bayesian inference procedure. We also utilize Gauss-Hermite quadrature with Gaussian proposal probability density functions for moment computation resulting in orders of magnitude speedup in data likelihood evaluation. Despite the strong non-linearity in the model, the consistent data sets all result in nearly Gaussian conditional parameter probability density functions. The technique also accounts for nuisance parameters in the form of Arrhenius parameters of other rate coefficients with prescribed uncertainty. The resulting pooled parameter probability density function is propagated through stoichiometric hydrogen-air auto-ignition computations to illustrate the need to account for correlation among the Arrhenius rate parameters of one reaction and across rate parameters of different reactions.

  20. A cross-cultural study on surrogate mother's empathy and maternal-foetal attachment.

    PubMed

    Lorenceau, Ellen Schenkel; Mazzucca, Luis; Tisseron, Serge; Pizitz, Todd D

    2015-06-01

    Traditional and gestational surrogate mothers assist infertile couples by carrying their children. In 2005, a meta-analysis on surrogacy was conducted, but no study had examined empathy and maternal-foetal attachment of surrogate mothers. Assessments of surrogate mothers show no sign of psychopathology, but one study showed differences on several MMPI-2 scales compared to a normative sample: surrogate mothers identified with stereotypically masculine traits such as assertiveness and competition. They had higher self-esteem and lower levels of anxiety and depression. To determine if there is a difference in empathy and maternal-foetal attachment of surrogate mothers compared to a comparison group of mothers. Three groups of European traditional and gestational surrogate mothers (n=10), Anglo-Saxon traditional and gestational surrogate mothers (n=34) and a European normative sample of mothers (n=32) completed four published psychometric instruments: the Interpersonal Reactivity Index (empathy index), the Hospital Anxiety and Depression Scale and the MC20, a social desirability scale. Pregnant surrogate mothers completed the Maternal Antenatal Attachment Scale (n=11). Statistical non-parametric analyses of variance were conducted. Depending on cultural background, surrogate mothers present differences in terms of empathy, anxiety and depression, social desirability and quality of attachment to the foetus compared to a normative sample. Environment plays a role for traditional and gestational surrogacy. Surrogate mothers of both groups are less anxious and depressed than normative samples. Maternal-foetal attachment is strong, with a slightly lower quality of attachment. Surrogate mothers' empathy indices are similar to those of normative samples, and sometimes higher. Copyright © 2014 Australian College of Midwives. Published by Elsevier Ltd. All rights reserved.

  1. Surrogate-Based Optimization of Biogeochemical Transport Models

    NASA Astrophysics Data System (ADS)

    Prieß, Malte; Slawig, Thomas

    2010-09-01

    First approaches towards a surrogate-based optimization method for a one-dimensional marine biogeochemical model of NPZD type are presented. The model, developed by Oschlies and Garcon [1], simulates the distribution of nitrogen, phytoplankton, zooplankton and detritus in a water column and is driven by ocean circulation data. A key issue is to minimize the misfit between the model output and given observational data. Our aim is to reduce the overall optimization cost, avoiding expensive function and derivative evaluations, by using a surrogate model replacing the high-fidelity model in focus. This becomes particularly important for more complex three-dimensional models. We analyse a coarsening in the discretization of the model equations as one way to create such a surrogate. Here the numerical stability crucially depends upon the discrete step size in time and space and the biochemical terms. We show that for given model parameters the level of grid coarsening can be chosen accordingly, yielding a stable and satisfactory surrogate. As one example of a surrogate-based optimization method we present results of the Aggressive Space Mapping technique (developed by John W. Bandler [2, 3]) applied to the optimization of this one-dimensional biogeochemical transport model.

  2. Statistical controversies in clinical research: an initial evaluation of a surrogate end point using a single randomized clinical trial and the Prentice criteria

    PubMed Central

    Heller, G.

    2015-01-01

    Surrogate end point research has grown in recent years with the increasing development and usage of biomarkers in clinical research. Surrogacy analysis is derived through randomized clinical trial data and it is carried out at the individual level and at the trial level. A common surrogate analysis at the individual level is the application of the Prentice criteria. An approach for the evaluation of the Prentice criteria is discussed, with a focus on its most difficult component, the determination of whether the treatment effect is captured by the surrogate. An interpretation of this criterion is illustrated using data from a randomized clinical trial in prostate cancer. PMID:26254442

  3. surrosurv: An R package for the evaluation of failure time surrogate endpoints in individual patient data meta-analyses of randomized clinical trials.

    PubMed

    Rotolo, Federico; Paoletti, Xavier; Michiels, Stefan

    2018-03-01

    Surrogate endpoints are attractive for use in clinical trials instead of well-established endpoints because of practical convenience. To validate a surrogate endpoint, two important measures can be estimated in a meta-analytic context when individual patient data are available: the R²indiv or the Kendall's τ at the individual level, and the R²trial at the trial level. We aimed at providing an R implementation of classical and well-established as well as more recent statistical methods for surrogacy assessment with failure time endpoints. We also intended to incorporate utilities for model checking and visualization and data generating methods described in the literature to date. In the case of failure time endpoints, the classical approach is based on two steps. First, a Kendall's τ is estimated as a measure of individual-level surrogacy using a copula model. Then, the R²trial is computed via a linear regression of the estimated treatment effects; at this second step, the estimation uncertainty can be accounted for via a measurement-error model or via weights. In addition to the classical approach, we recently developed an approach based on bivariate auxiliary Poisson models with individual random effects to measure the Kendall's τ and treatment-by-trial interactions to measure the R²trial. The most common data simulation models described in the literature are based on: copula models, mixed proportional hazard models, and mixtures of half-normal and exponential random variables. The R package surrosurv implements the classical two-step method with Clayton, Plackett, and Hougaard copulas. It also allows the second-step linear regression to be optionally adjusted for measurement error. The mixed Poisson approach is implemented with different reduced models in addition to the full model. We present the package functions for estimating the surrogacy models, for checking their convergence, for performing leave-one-trial-out cross-validation, and for plotting the results. We illustrate their use in practice on individual patient data from a meta-analysis of 4069 patients with advanced gastric cancer from 20 trials of chemotherapy. The surrosurv package provides an R implementation of classical and recent statistical methods for surrogacy assessment of failure time endpoints. Flexible simulation functions are available to generate data according to the methods described in the literature. Copyright © 2017 Elsevier B.V. All rights reserved.

  4. Reduced order surrogate modelling (ROSM) of high dimensional deterministic simulations

    NASA Astrophysics Data System (ADS)

    Mitry, Mina

    Often, computationally expensive engineering simulations can hinder the engineering design process. As a result, designers may turn to a less computationally demanding approximate, or surrogate, model to facilitate their design process. However, owing to the curse of dimensionality, classical surrogate models become too computationally expensive for high dimensional data. To address this limitation of classical methods, we develop linear and non-linear Reduced Order Surrogate Modelling (ROSM) techniques. Two algorithms are presented, which are based on a combination of linear/kernel principal component analysis and radial basis functions. These algorithms are applied to subsonic and transonic aerodynamic data, as well as a model for a chemical spill in a channel. The results of this thesis show that ROSM can provide a significant computational benefit over classical surrogate modelling, sometimes at the expense of a minor loss in accuracy.
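    The sketch below illustrates the general ROSM idea under simple assumptions: linear PCA compresses a high-dimensional output field and a radial-basis-function interpolant maps inputs to the reduced coordinates. The training data, dimensions, and the rosm_predict helper are hypothetical, and linear PCA stands in for the kernel PCA variant discussed in the thesis.

```python
import numpy as np
from sklearn.decomposition import PCA
from scipy.interpolate import RBFInterpolator

rng = np.random.default_rng(0)

# Hypothetical training data: n_train design points (dim d) and, for each,
# a high-dimensional field output of the expensive simulator (dim m).
n_train, d, m = 80, 3, 5000
X = rng.uniform(-1.0, 1.0, size=(n_train, d))
Y = np.stack([np.sin(2 * np.pi * x @ np.arange(1, d + 1) / d) *
              np.linspace(0.0, 1.0, m) for x in X])   # stand-in for simulator output

# Step 1: reduce the output dimension with (linear) PCA.
pca = PCA(n_components=5)
Z = pca.fit_transform(Y)               # (n_train, 5) reduced coordinates

# Step 2: radial-basis-function map from inputs to reduced coordinates.
rbf = RBFInterpolator(X, Z, kernel="thin_plate_spline")

def rosm_predict(x_new):
    """Predict the full field at new inputs via the reduced-order surrogate."""
    z_new = rbf(np.atleast_2d(x_new))
    return pca.inverse_transform(z_new)

y_hat = rosm_predict(rng.uniform(-1.0, 1.0, size=(1, d)))
print(y_hat.shape)                     # (1, 5000): reconstructed field
```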

  5. Comparative study of surrogate models for groundwater contamination source identification at DNAPL-contaminated sites

    NASA Astrophysics Data System (ADS)

    Hou, Zeyu; Lu, Wenxi

    2018-05-01

    Knowledge of groundwater contamination sources is critical for effectively protecting groundwater resources, estimating risks, mitigating disaster, and designing remediation strategies. Many methods for groundwater contamination source identification (GCSI) have been developed in recent years, including the simulation-optimization technique. This study proposes utilizing a support vector regression (SVR) model and a kernel extreme learning machine (KELM) model to enrich the content of the surrogate model. The surrogate model was itself key in replacing the simulation model, reducing the huge computational burden of iterations in the simulation-optimization technique to solve GCSI problems, especially in GCSI problems of aquifers contaminated by dense nonaqueous phase liquids (DNAPLs). A comparative study between the Kriging, SVR, and KELM models is reported. Additionally, there is analysis of the influence of parameter optimization and the structure of the training sample dataset on the approximation accuracy of the surrogate model. It was found that the KELM model was the most accurate surrogate model, and its performance was significantly improved after parameter optimization. The approximation accuracy of the surrogate model to the simulation model did not always improve with increasing numbers of training samples. Using the appropriate number of training samples was critical for improving the performance of the surrogate model and avoiding unnecessary computational workload. It was concluded that the KELM model developed in this work could reasonably predict system responses in given operation conditions. Replacing the simulation model with a KELM model considerably reduced the computational burden of the simulation-optimization process and also maintained high computation accuracy.
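    A minimal sketch of the surrogate comparison idea is given below, assuming synthetic training data. SVR with cross-validated hyperparameter tuning is shown directly; since KELM is not available in scikit-learn, kernel ridge regression with an RBF kernel is used here as a rough stand-in, and the Kriging alternative is omitted.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.kernel_ridge import KernelRidge
from sklearn.model_selection import GridSearchCV, cross_val_score

rng = np.random.default_rng(1)

# Hypothetical training set: source parameters -> observed concentrations,
# produced by running the (expensive) transport simulator at sampled inputs.
X = rng.uniform(0.0, 1.0, size=(200, 4))
y = np.sin(3 * X[:, 0]) + X[:, 1] ** 2 + 0.1 * rng.standard_normal(200)

# Surrogate 1: support vector regression with tuned hyperparameters.
svr = GridSearchCV(SVR(kernel="rbf"),
                   {"C": [1, 10, 100], "gamma": [0.1, 1, 10], "epsilon": [0.01, 0.1]},
                   cv=5)
svr.fit(X, y)

# Surrogate 2: kernel ridge regression with an RBF kernel, used here as a
# simple stand-in for a kernel extreme learning machine.
krr = GridSearchCV(KernelRidge(kernel="rbf"),
                   {"alpha": [1e-3, 1e-2, 1e-1], "gamma": [0.1, 1, 10]},
                   cv=5)
krr.fit(X, y)

for name, model in [("SVR", svr), ("KRR (KELM stand-in)", krr)]:
    score = cross_val_score(model.best_estimator_, X, y, cv=5, scoring="r2").mean()
    print(f"{name}: CV R^2 = {score:.3f}")
```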

  6. Coastal aquifer management under parameter uncertainty: Ensemble surrogate modeling based simulation-optimization

    NASA Astrophysics Data System (ADS)

    Janardhanan, S.; Datta, B.

    2011-12-01

    Surrogate models are widely used to develop computationally efficient simulation-optimization models to solve complex groundwater management problems. Artificial intelligence based models are most often used for this purpose, where they are trained using predictor-predictand data obtained from a numerical simulation model. Most often this is implemented with the assumption that the parameters and boundary conditions used in the numerical simulation model are perfectly known. However, in most practical situations these values are uncertain. Under these circumstances the application of such approximation surrogates becomes limited. In our study we develop a surrogate-model-based coupled simulation-optimization methodology for determining optimal pumping strategies for coastal aquifers considering parameter uncertainty. An ensemble surrogate modeling approach is used along with multiple-realization optimization. The methodology is used to solve a multi-objective coastal aquifer management problem considering two conflicting objectives. Hydraulic conductivity and the aquifer recharge are considered as uncertain values. The three-dimensional coupled flow and transport simulation model FEMWATER is used to simulate the aquifer responses for a number of scenarios corresponding to Latin hypercube samples of pumping and uncertain parameters to generate input-output patterns for training the surrogate models. Non-parametric bootstrap sampling of this original data set is used to generate multiple data sets which belong to different regions in the multi-dimensional decision and parameter space. These data sets are used to train and test multiple surrogate models based on genetic programming. The ensemble of surrogate models is then linked to a multi-objective genetic algorithm to solve the pumping optimization problem. Two conflicting objectives, viz., maximizing total pumping from beneficial wells and minimizing the total pumping from barrier wells for hydraulic control of saltwater intrusion, are considered. The salinity levels resulting at strategic locations due to this pumping are predicted using the ensemble surrogates and are constrained to be within pre-specified levels. Different realizations of the concentration values are obtained from the ensemble predictions corresponding to each candidate solution of pumping. The reliability concept is incorporated as the percentage of surrogate models which satisfy the imposed constraints. The methodology was applied to a realistic coastal aquifer system in the Burdekin delta area in Australia. It was found that all optimal solutions corresponding to a reliability level of 0.99 satisfy all the constraints, and that the constraint violation increases as the reliability level is reduced. Thus ensemble surrogate model based simulation-optimization was found to be useful in deriving multi-objective optimal pumping strategies for coastal aquifers under parameter uncertainty.
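    The sketch below, under hypothetical data and placeholder bounds, illustrates the ensemble-surrogate idea: Latin hypercube sampling of inputs, bootstrap resampling to train multiple surrogates, and a reliability measure defined as the fraction of ensemble members that satisfy a salinity constraint. Gradient boosting stands in for the genetic-programming surrogates, and the FEMWATER response is replaced by a toy function.

```python
import numpy as np
from scipy.stats import qmc
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(2)

# Latin hypercube samples over pumping rates and uncertain parameters
# (hypothetical 5-dimensional input; bounds are placeholders).
sampler = qmc.LatinHypercube(d=5, seed=2)
X = qmc.scale(sampler.random(n=300), l_bounds=[0] * 5, u_bounds=[1] * 5)
salinity = X[:, 0] * 2.0 - X[:, 3] + 0.1 * rng.standard_normal(len(X))  # stand-in output

# Bootstrap an ensemble of surrogates (gradient boosting here stands in for
# the genetic-programming surrogates of the paper).
ensemble = []
for _ in range(20):
    idx = rng.integers(0, len(X), size=len(X))
    model = GradientBoostingRegressor().fit(X[idx], salinity[idx])
    ensemble.append(model)

def reliability(x_candidate, limit=1.0):
    """Fraction of ensemble members predicting salinity within the limit."""
    preds = np.array([m.predict(np.atleast_2d(x_candidate))[0] for m in ensemble])
    return np.mean(preds <= limit)

print(reliability(np.full(5, 0.4)))
```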

  7. Surrogate outcomes are associated with low methodological quality of studies of rheumatoid arthritis treated with antitumour necrosis factor agents: a systematic review.

    PubMed

    Nobre, Moacyr Roberto Cuce; da Costa, Fernanda Marques

    2012-02-01

    Surrogate endpoints may be used as substitutes for clinically relevant events, but often do not predict them. Objective: To assess the methodological quality of articles that present their conclusions based on clinically relevant or surrogate outcomes in a systematic review of randomised trials and cohort studies of patients with rheumatoid arthritis treated with antitumour necrosis factor (TNF) agents. PubMed, Embase and Cochrane databases were searched. The Jadad score, the percentage of Consolidated Standards Of Reporting Trials (CONSORT) statement items adequately reported and levels of evidence (Center for Evidence-based Medicine, Oxford) were used in a descriptive synthesis. Among 88 articles appraised, 27 had surrogate endpoints, mainly radiographic, and 44 were duplicate publications; 74% of articles with surrogate and 39% of articles with clinical endpoints (p=0.006). Fewer articles with surrogate endpoints represented a high level of evidence (Level 1b, 33% vs 62%, p=0.037) and the mean percentage of CONSORT statement items met was also lower for articles with surrogate endpoints (62.5 vs 70.7, p=0.026). Although fewer articles with surrogate endpoints were randomised trials (63% vs 74%, p=0.307) and articles with surrogate endpoints had lower Jadad scores (3.0 vs 3.2, p=0.538), these differences were not statistically significant. Studies of anti-TNF agents that report surrogate outcomes are of lesser methodological quality. As such, inclusion of such studies in evidence syntheses may bias results.

  8. A Conceptual Model of the Role of Communication in Surrogate Decision Making for Hospitalized Adults

    PubMed Central

    Torke, Alexia M.; Petronio, Sandra; Sachs, Greg A.; Helft, Paul R.; Purnell, Christianna

    2011-01-01

    Objective To build a conceptual model of the role of communication in decision making, based on literature from medicine, communication studies and medical ethics. Methods We propose a model and describe each construct in detail. We review what is known about interpersonal and patient-physician communication, describe literature about surrogate-clinician communication, and discuss implications for our developing model. Results The communication literature proposes two major elements of interpersonal communication: information processing and relationship building. These elements are composed of constructs such as information disclosure and emotional support that are likely to be relevant to decision making. We propose these elements of communication impact decision making, which in turn affects outcomes for both patients and surrogates. Decision making quality may also mediate the relationship between communication and outcomes. Conclusion Although many elements of the model have been studied in relation to patient-clinician communication, there is limited data about surrogate decision making. There is evidence of high surrogate distress associated with decision making that may be alleviated by communication–focused interventions. More research is needed to test the relationships proposed in the model. Practice Implications Good communication with surrogates may improve both the quality of medical decisions and outcomes for the patient and surrogate. PMID:21889865

  9. A conceptual model of the role of communication in surrogate decision making for hospitalized adults.

    PubMed

    Torke, Alexia M; Petronio, Sandra; Sachs, Greg A; Helft, Paul R; Purnell, Christianna

    2012-04-01

    To build a conceptual model of the role of communication in decision making, based on literature from medicine, communication studies and medical ethics. We propose a model and describe each construct in detail. We review what is known about interpersonal and patient-physician communication, describe the literature about surrogate-clinician communication, and discuss implications for our developing model. The communication literature proposes two major elements of interpersonal communication: information processing and relationship building. These elements are composed of constructs such as information disclosure and emotional support that are likely to be relevant to decision making. We propose these elements of communication impact decision making, which in turn affects outcomes for both patients and surrogates. Decision making quality may also mediate the relationship between communication and outcomes. Although many elements of the model have been studied in relation to patient-clinician communication, there is limited data about surrogate decision making. There is evidence of high surrogate distress associated with decision making that may be alleviated by communication-focused interventions. More research is needed to test the relationships proposed in the model. Good communication with surrogates may improve both the quality of medical decisions and outcomes for the patient and surrogate. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.

  10. Parameter inference in small world network disease models with approximate Bayesian Computational methods

    NASA Astrophysics Data System (ADS)

    Walker, David M.; Allingham, David; Lee, Heung Wing Joseph; Small, Michael

    2010-02-01

    Small world network models have been effective in capturing the variable behaviour of reported case data of the SARS coronavirus outbreak in Hong Kong during 2003. Simulations of these models have previously been realized using informed “guesses” of the proposed model parameters and tested for consistency with the reported data by surrogate analysis. In this paper we attempt to provide statistically rigorous parameter distributions using Approximate Bayesian Computation sampling methods. We find that such sampling schemes are a useful framework for fitting parameters of stochastic small world network models where simulation of the system is straightforward but expressing a likelihood is cumbersome.
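    A minimal Approximate Bayesian Computation rejection sketch is shown below. The toy outbreak simulator, the chosen summary statistics, the prior bounds, and the tolerance are all hypothetical stand-ins for the small world network model; the point is only the accept/reject logic on simulated summary statistics.

```python
import numpy as np

rng = np.random.default_rng(3)

def simulate_outbreak(p_link, p_infect, n_days=60):
    """Toy stochastic stand-in for the epidemic model: daily case counts."""
    cases, infected = [], 5.0
    for _ in range(n_days):
        new = rng.poisson(infected * p_infect * (1.0 + p_link))
        infected = 0.7 * infected + new
        cases.append(new)
    return np.asarray(cases, dtype=float)

def summary(cases):
    # Summary statistics: total cases, day of peak, peak size.
    return np.array([cases.sum(), cases.argmax(), cases.max()])

observed = simulate_outbreak(0.1, 0.25)          # pretend this is the reported data
obs_stats = summary(observed)

# ABC rejection sampling: keep parameter draws whose simulated summary
# statistics fall within a tolerance of the observed ones.
accepted = []
for _ in range(5000):
    theta = rng.uniform([0.0, 0.05], [0.5, 0.5])  # prior on (p_link, p_infect)
    sim_stats = summary(simulate_outbreak(*theta))
    if np.linalg.norm((sim_stats - obs_stats) / (obs_stats + 1e-9)) < 0.2:
        accepted.append(theta)

accepted = np.array(accepted)
print("acceptance rate:", len(accepted) / 5000)
print("posterior mean:", accepted.mean(axis=0) if len(accepted) else None)
```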

  11. Surrogate endpoints in oncology: when are they acceptable for regulatory and clinical decisions, and are they currently overused?

    PubMed

    Kemp, Robert; Prasad, Vinay

    2017-07-21

    Surrogate outcomes are not intrinsically beneficial to patients, but are designed to be easier and faster to measure than clinically meaningful outcomes. The use of surrogates as an endpoint in clinical trials and basis for regulatory approval is common, and frequently exceeds the guidance given by regulatory bodies. In this article, we demonstrate that the use of surrogates in oncology is widespread and increasing. At the same time, the strength of association between the surrogates used and clinically meaningful outcomes is often unknown or weak. Attempts to validate surrogates are rarely undertaken. When this is done, validation relies on only a fraction of available data, and often concludes that the surrogate is poor. Post-marketing studies, designed to ensure drugs have meaningful benefits, are often not performed. Alternatively, if a drug fails to improve quality of life or overall survival, market authorization is rarely revoked. We suggest this reliance on surrogates, and the imprecision surrounding their acceptable use, means that numerous drugs are now approved based on small yet statistically significant increases in surrogates of questionable reliability. In turn, this means the benefits of many approved drugs are uncertain. This is an unacceptable situation for patients and professionals, as prior experience has shown that such uncertainty can be associated with significant harm. The use of surrogate outcomes should be limited to situations where a surrogate has demonstrated robust ability to predict meaningful benefits, or where cases are dire, rare or with few treatment options. In both cases, surrogates must be used only when continuing studies examining hard endpoints have been fully recruited.

  12. Surrogacy, Compensation, and Legal Parentage: Against the Adoption Model.

    PubMed

    van Zyl, Liezl; Walker, Ruth

    2015-09-01

    Surrogate motherhood is treated as a form of adoption in many countries: the birth mother and her partner are presumed to be the parents of the child, while the intended parents have to adopt the baby once it is born. Other than compensation for expenses related to the pregnancy, payment to surrogates is not permitted. We believe that the failure to compensate surrogate mothers for their labour as well as the significant risks they undertake is both unfair and exploitative. We accept that introducing payment for surrogates would create a significant tension in the adoption model. However, we recommend rejecting the adoption model altogether rather than continuing to prohibit compensation to surrogates.

  13. Predicting recreational water quality advisories: A comparison of statistical methods

    USGS Publications Warehouse

    Brooks, Wesley R.; Corsi, Steven R.; Fienen, Michael N.; Carvin, Rebecca B.

    2016-01-01

    Epidemiological studies indicate that fecal indicator bacteria (FIB) in beach water are associated with illnesses among people having contact with the water. In order to mitigate public health impacts, many beaches are posted with an advisory when the concentration of FIB exceeds a beach action value. The most commonly used method of measuring FIB concentration takes 18–24 h before returning a result. In order to avoid the 24 h lag, it has become common to ”nowcast” the FIB concentration using statistical regressions on environmental surrogate variables. Most commonly, nowcast models are estimated using ordinary least squares regression, but other regression methods from the statistical and machine learning literature are sometimes used. This study compares 14 regression methods across 7 Wisconsin beaches to identify which consistently produces the most accurate predictions. A random forest model is identified as the most accurate, followed by multiple regression fit using the adaptive LASSO.
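    The sketch below illustrates the nowcasting comparison on synthetic data: surrogate environmental variables are regressed on log FIB concentrations with ordinary least squares and a random forest, and each model's advisory decisions (exceedance of a beach action value) are scored by cross-validation. The variables, the beach action value, and the data are hypothetical.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(4)

# Hypothetical environmental surrogate variables (e.g. turbidity, rainfall,
# wave height, water temperature) and log10 FIB concentrations.
X = rng.uniform(0.0, 1.0, size=(500, 4))
log_fib = 1.0 + 2.0 * X[:, 1] + np.sin(4 * X[:, 0]) + 0.3 * rng.standard_normal(500)
beach_action_value = 2.37                      # hypothetical log10 threshold

models = {"OLS": LinearRegression(),
          "Random forest": RandomForestRegressor(n_estimators=300, random_state=0)}

for name, model in models.items():
    pred = cross_val_predict(model, X, log_fib, cv=5)
    exceed_true = log_fib > beach_action_value
    exceed_pred = pred > beach_action_value
    accuracy = np.mean(exceed_true == exceed_pred)
    print(f"{name}: advisory decision accuracy = {accuracy:.3f}")
```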

  14. Fast neural network surrogates for very high dimensional physics-based models in computational oceanography.

    PubMed

    van der Merwe, Rudolph; Leen, Todd K; Lu, Zhengdong; Frolov, Sergey; Baptista, Antonio M

    2007-05-01

    We present neural network surrogates that provide extremely fast and accurate emulation of a large-scale circulation model for the coupled Columbia River, its estuary and near-ocean regions. The circulation model has O(10^7) degrees of freedom, is highly nonlinear and is driven by ocean, atmospheric and river influences at its boundaries. The surrogates provide accurate emulation of the full circulation code and run over 1000 times faster. Such fast dynamic surrogates will enable significant advances in ensemble forecasts in oceanography and weather.
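    A minimal sketch of a neural-network emulator is given below, assuming synthetic forcing-to-output training pairs extracted from a circulation code. The small multilayer perceptron, its architecture, and the data are illustrative only and are far simpler than the surrogates described in the paper.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

rng = np.random.default_rng(5)

# Hypothetical training data: boundary forcings (river discharge, tide,
# wind) -> a few salinity/elevation outputs extracted from the full model.
X = rng.uniform(-1.0, 1.0, size=(2000, 3))
Y = np.column_stack([np.tanh(X @ [1.5, -0.7, 0.3]),
                     np.sin(2.0 * X[:, 0]) * X[:, 2]])

X_tr, X_te, Y_tr, Y_te = train_test_split(X, Y, test_size=0.2, random_state=0)

# A small multilayer perceptron as the fast emulator of the circulation code.
surrogate = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000, random_state=0)
surrogate.fit(X_tr, Y_tr)

print("held-out R^2:", r2_score(Y_te, surrogate.predict(X_te)))
```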

  15. Potential surrogate endpoints for prostate cancer survival: analysis of a phase III randomized trial.

    PubMed

    Ray, Michael E; Bae, Kyounghwa; Hussain, Maha H A; Hanks, Gerald E; Shipley, William U; Sandler, Howard M

    2009-02-18

    The identification of surrogate endpoints for prostate cancer-specific survival may shorten the length of clinical trials for prostate cancer. We evaluated distant metastasis and general clinical treatment failure as potential surrogates for prostate cancer-specific survival by use of data from the Radiation Therapy and Oncology Group 92-02 randomized trial. Patients (n = 1554 randomly assigned and 1521 evaluable for this analysis) with locally advanced prostate cancer had been treated with 4 months of neoadjuvant and concurrent androgen deprivation therapy with external beam radiation therapy and then randomly assigned to no additional therapy (control arm) or 24 additional months of androgen deprivation therapy (experimental arm). Data from landmark analyses at 3 and 5 years for general clinical treatment failure (defined as documented local disease progression, regional or distant metastasis, initiation of androgen deprivation therapy, or a prostate-specific antigen level of 25 ng/mL or higher after radiation therapy) and/or distant metastasis were tested as surrogate endpoints for prostate cancer-specific survival at 10 years by use of Prentice's four criteria. All statistical tests were two-sided. At 3 years, 1364 patients were alive and contributed data for analysis. Both distant metastasis and general clinical treatment failure at 3 years were consistent with all four of Prentice's criteria for being surrogate endpoints for prostate cancer-specific survival at 10 years. At 5 years, 1178 patients were alive and contributed data for analysis. Although prostate cancer-specific survival was not statistically significantly different between treatment arms at 5 years (P = .08), both endpoints were consistent with Prentice's remaining criteria. Distant metastasis and general clinical treatment failure at 3 years may be candidate surrogate endpoints for prostate cancer-specific survival at 10 years. These endpoints, however, must be validated in other datasets.

  16. Surrogate obesity negatively impacts pregnancy rates in third-party reproduction.

    PubMed

    DeUgarte, Daniel A; DeUgarte, Catherine M; Sahakian, Vicken

    2010-02-01

    In a retrospective cohort review of third-party reproduction, we observed that surrogate body mass index (BMI) negatively impacts implantation rates in oocyte-donor in vitro fertilization cycles. A BMI ≥35 kg/m² cutoff is associated with a statistically significant decrease in pregnancy rates but not miscarriage rates. Copyright 2010 American Society for Reproductive Medicine. Published by Elsevier Inc. All rights reserved.

  17. Fast and Accurate Prediction of Numerical Relativity Waveforms from Binary Black Hole Coalescences Using Surrogate Models

    NASA Astrophysics Data System (ADS)

    Blackman, Jonathan; Field, Scott E.; Galley, Chad R.; Szilágyi, Béla; Scheel, Mark A.; Tiglio, Manuel; Hemberger, Daniel A.

    2015-09-01

    Simulating a binary black hole coalescence by solving Einstein's equations is computationally expensive, requiring days to months of supercomputing time. Using reduced order modeling techniques, we construct an accurate surrogate model, which is evaluated in a millisecond to a second, for numerical relativity (NR) waveforms from nonspinning binary black hole coalescences with mass ratios in [1, 10] and durations corresponding to about 15 orbits before merger. We assess the model's uncertainty and show that our modeling strategy predicts NR waveforms not used for the surrogate's training with errors nearly as small as the numerical error of the NR code. Our model includes all spherical-harmonic -2Yℓm waveform modes resolved by the NR code up to ℓ=8 . We compare our surrogate model to effective one body waveforms from 50 M⊙ to 300 M⊙ for advanced LIGO detectors and find that the surrogate is always more faithful (by at least an order of magnitude in most cases).

  18. Fast and Accurate Prediction of Numerical Relativity Waveforms from Binary Black Hole Coalescences Using Surrogate Models.

    PubMed

    Blackman, Jonathan; Field, Scott E; Galley, Chad R; Szilágyi, Béla; Scheel, Mark A; Tiglio, Manuel; Hemberger, Daniel A

    2015-09-18

    Simulating a binary black hole coalescence by solving Einstein's equations is computationally expensive, requiring days to months of supercomputing time. Using reduced order modeling techniques, we construct an accurate surrogate model, which is evaluated in a millisecond to a second, for numerical relativity (NR) waveforms from nonspinning binary black hole coalescences with mass ratios in [1, 10] and durations corresponding to about 15 orbits before merger. We assess the model's uncertainty and show that our modeling strategy predicts NR waveforms not used for the surrogate's training with errors nearly as small as the numerical error of the NR code. Our model includes all spherical-harmonic _{-2}Y_{ℓm} waveform modes resolved by the NR code up to ℓ=8. We compare our surrogate model to effective one body waveforms from 50M_{⊙} to 300M_{⊙} for advanced LIGO detectors and find that the surrogate is always more faithful (by at least an order of magnitude in most cases).

  19. Learning Scene Categories from High Resolution Satellite Image for Aerial Video Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cheriyadat, Anil M

    2011-01-01

    Automatic scene categorization can benefit various aerial video processing applications. This paper addresses the problem of predicting the scene category from aerial video frames using a prior model learned from satellite imagery. We show that local and global features in the form of line statistics and 2-D power spectrum parameters respectively can characterize the aerial scene well. The line feature statistics and spatial frequency parameters are useful cues to distinguish between different urban scene categories. We learn the scene prediction model from high-resolution satellite imagery to test the model on the Columbus Surrogate Unmanned Aerial Vehicle (CSUAV) dataset collected by a high-altitude wide area UAV sensor platform. We compare the proposed features with the popular Scale Invariant Feature Transform (SIFT) features. Our experimental results show that the proposed approach outperforms the SIFT model when the training and testing are conducted on disparate data sources.

  20. Surrogate Model Application to the Identification of Optimal Groundwater Exploitation Scheme Based on Regression Kriging Method—A Case Study of Western Jilin Province

    PubMed Central

    An, Yongkai; Lu, Wenxi; Cheng, Weiguo

    2015-01-01

    This paper introduces a surrogate model to identify an optimal exploitation scheme, with the western Jilin province selected as the study area. A numerical simulation model of groundwater flow was established first, and four exploitation wells were set in Tongyu county and Qian Gorlos county, respectively, so as to supply water to Daan county. Second, the Latin Hypercube Sampling (LHS) method was used to collect data in the feasible region for input variables. A surrogate model of the numerical simulation model of groundwater flow was developed using the regression kriging method. An optimization model was established to search for an optimal groundwater exploitation scheme using the minimum average drawdown of the groundwater table and the minimum cost of groundwater exploitation as multi-objective functions. Finally, the surrogate model was invoked by the optimization model in the process of solving the optimization problem. Results show that the relative error and root mean square error of the groundwater table drawdown between the simulation model and the surrogate model for 10 validation samples are both lower than 5%, indicating high approximation accuracy. A comparison between the surrogate-based simulation optimization model and the conventional simulation optimization model for solving the same optimization problem shows that the former needs only 5.5 hours, whereas the latter needs 25 days. The above results indicate that the surrogate model developed in this study could not only considerably reduce the computational burden of the simulation optimization process, but also maintain high computational accuracy. This can thus provide an effective method for identifying an optimal groundwater exploitation scheme quickly and accurately. PMID:26264008
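    The sketch below shows the surrogate-construction step under simple assumptions: Latin hypercube sampling of pumping rates, a toy drawdown response standing in for the groundwater-flow simulator, and a Gaussian-process (kriging) surrogate in place of the regression kriging model. All bounds, units, and values are placeholders.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel
from scipy.stats import qmc

rng = np.random.default_rng(6)

# Latin hypercube sample of exploitation rates for four hypothetical wells
# (bounds are placeholders for the feasible pumping region).
sampler = qmc.LatinHypercube(d=4, seed=6)
Q = qmc.scale(sampler.random(40), l_bounds=[0] * 4, u_bounds=[5000] * 4)

# Stand-in for the groundwater-flow simulator: average drawdown response.
drawdown = (1e-3 * Q.sum(axis=1) + 1e-7 * Q[:, 0] * Q[:, 1]
            + 0.05 * rng.standard_normal(len(Q)))

# Kriging-style surrogate: a Gaussian process plays the role of the
# regression-kriging model used in the paper.
kernel = ConstantKernel(1.0) * RBF(length_scale=[1000.0] * 4)
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(Q, drawdown)

q_new = np.array([[1200.0, 800.0, 2500.0, 1500.0]])
mean, std = gp.predict(q_new, return_std=True)
print(f"predicted drawdown {mean[0]:.2f} m +/- {std[0]:.2f}")
```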

  1. Pull out strength calculator for pedicle screws using a surrogate ensemble approach.

    PubMed

    Varghese, Vicky; Ramu, Palaniappan; Krishnan, Venkatesh; Saravana Kumar, Gurunathan

    2016-12-01

    Pedicle screw instrumentation is widely used in the treatment of spinal disorders and deformities. Currently, the surgeon decides the holding power of instrumentation based on perioperative feel, which is subjective in nature. The objective of the paper is to develop a surrogate model which will predict the pullout strength of a pedicle screw based on density, insertion angle, insertion depth and reinsertion. A Taguchi orthogonal array was used to design an experiment to find the factors affecting the pullout strength of pedicle screws. The pullout studies were carried out using a polyaxial pedicle screw on rigid polyurethane foam blocks according to the American Society for Testing and Materials standard ASTM F543. Analysis of variance (ANOVA) and Tukey's honestly significant difference multiple comparison tests were done to find factor effects. Based on the experimental results, surrogate models based on kriging, polynomial response surfaces and radial basis functions were developed for predicting the pullout strength for different combinations of factors. An ensemble of these surrogates based on a weighted average surrogate model was also evaluated for prediction. Density, insertion depth, insertion angle and reinsertion have a significant effect (p < 0.05) on the pullout strength of pedicle screws. The weighted average surrogate performed best in predicting the pullout strength amongst the surrogate models considered in this study and acted as insurance against bad predictions. A predictive model for the pullout strength of pedicle screws was developed using experimental values and surrogate models. This can be used in pre-surgical planning and decision support systems for spine surgeons. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
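    The weighted-average ensemble idea can be sketched as below: several candidate surrogates are scored by cross-validation, weights are set inversely proportional to their errors, and predictions are combined. The pullout-test data, factor ranges, and the predict_pullout helper are hypothetical, and kernel ridge regression stands in for a radial basis function surrogate.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import PolynomialFeatures
from sklearn.pipeline import make_pipeline
from sklearn.kernel_ridge import KernelRidge

rng = np.random.default_rng(7)

# Hypothetical pullout-test data: density, insertion depth, insertion angle,
# reinsertion flag -> pullout strength (units and values are placeholders).
X = np.column_stack([rng.uniform(0.08, 0.32, 60),    # foam density, g/cc
                     rng.uniform(20, 40, 60),        # insertion depth, mm
                     rng.uniform(0, 30, 60),         # insertion angle, deg
                     rng.integers(0, 2, 60)])        # reinsertion (0/1)
y = 4000 * X[:, 0] + 15 * X[:, 1] - 5 * X[:, 2] - 80 * X[:, 3] + 20 * rng.standard_normal(60)

surrogates = {
    "kriging": GaussianProcessRegressor(normalize_y=True),
    "poly-RSM": make_pipeline(PolynomialFeatures(2), LinearRegression()),
    "RBF-like": KernelRidge(kernel="rbf", gamma=0.5),
}

# Weighted-average surrogate: weights inversely proportional to CV error.
errors = {name: -cross_val_score(m, X, y, cv=5,
                                 scoring="neg_root_mean_squared_error").mean()
          for name, m in surrogates.items()}
weights = {name: (1.0 / e) / sum(1.0 / v for v in errors.values())
           for name, e in errors.items()}

for m in surrogates.values():
    m.fit(X, y)

def predict_pullout(x_new):
    x_new = np.atleast_2d(x_new)
    return sum(w * surrogates[name].predict(x_new) for name, w in weights.items())

print(predict_pullout([0.16, 30.0, 10.0, 0]))
```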

  2. Evaluating principal surrogate endpoints with time-to-event data accounting for time-varying treatment efficacy

    PubMed Central

    Gabriel, Erin E.; Gilbert, Peter B.

    2014-01-01

    Principal surrogate (PS) endpoints are relatively inexpensive and easy to measure study outcomes that can be used to reliably predict treatment effects on clinical endpoints of interest. Few statistical methods for assessing the validity of potential PSs utilize time-to-event clinical endpoint information and to our knowledge none allow for the characterization of time-varying treatment effects. We introduce the time-dependent and surrogate-dependent treatment efficacy curve, TE(t|s), and a new augmented trial design for assessing the quality of a biomarker as a PS. We propose a novel Weibull model and an estimated maximum likelihood method for estimation of the TE(t|s) curve. We describe the operating characteristics of our methods via simulations. We analyze data from the Diabetes Control and Complications Trial, in which we find evidence of a biomarker with value as a PS. PMID:24337534

  3. Hypothesis test for synchronization: twin surrogates revisited.

    PubMed

    Romano, M Carmen; Thiel, Marco; Kurths, Jürgen; Mergenthaler, Konstantin; Engbert, Ralf

    2009-03-01

    The method of twin surrogates has been introduced to test for phase synchronization of complex systems in the case of passive experiments. In this paper we derive new analytical expressions for the number of twins depending on the size of the neighborhood, as well as on the length of the trajectory. This allows us to determine the optimal parameters for the generation of twin surrogates. Furthermore, we determine the quality of the twin surrogates with respect to several linear and nonlinear statistics depending on the parameters of the method. In the second part of the paper we perform a hypothesis test for phase synchronization in the case of experimental data from fixational eye movements. These miniature eye movements have been shown to play a central role in neural information processing underlying the perception of static visual scenes. The high number of data sets (21 subjects and 30 trials per person) allows us to compare the generated twin surrogates with the "natural" surrogates that correspond to the different trials. We show that the generated twin surrogates reproduce very well all linear and nonlinear characteristics of the underlying experimental system. The synchronization analysis of fixational eye movements by means of twin surrogates reveals that the synchronization between the left and right eye is significant, indicating that either the centers in the brain stem generating fixational eye movements are closely linked, or, alternatively that there is only one center controlling both eyes.

  4. Make the Most of the Data You've Got: Bayesian Models and a Surrogate Species Approach to Assessing Benefits of Upstream Migration Flows for the Endangered Australian Grayling

    NASA Astrophysics Data System (ADS)

    Webb, J. Angus; Koster, Wayne M.; Stuart, Ivor G.; Reich, Paul; Stewardson, Michael J.

    2018-03-01

    Environmental water managers must make best use of allocations, and adaptive management is one means of improving effectiveness of environmental water delivery. Adaptive management relies on generation of new knowledge from monitoring and evaluation, but it is often difficult to make clear inferences from available monitoring data. Alternative approaches to assessment of flow benefits may offer an improved pathway to adaptive management. We developed Bayesian statistical models to inform adaptive management of the threatened Australian grayling ( Prototroctes maraena) in the coastal Thomson River, South-East Victoria Australia. The models assessed the importance of flows in spring and early summer (migration flows) for upstream dispersal and colonization of juveniles of this diadromous species. However, Australian grayling young-of-year were recorded in low numbers, and models provided no indication of the benefit of migration flows. To overcome this limitation, we applied the same models to young-of-year of a surrogate species (tupong— Pseudaphritis urvilli)—a more common diadromous species expected to respond to flow similarly to Australian grayling—and found strong positive responses to migration flows. Our results suggest two complementary approaches to supporting adaptive management of Australian grayling. First, refine monitoring approaches to allow direct measurement of effects of migration flows, a process currently under way. Second, while waiting for improved data, further investigate the use of tupong as a surrogate species. More generally, alternative approaches to assessment can improve knowledge to inform adaptive management, and this can occur while monitoring is being revised to directly target environmental responses of interest.

  5. Make the Most of the Data You've Got: Bayesian Models and a Surrogate Species Approach to Assessing Benefits of Upstream Migration Flows for the Endangered Australian Grayling.

    PubMed

    Webb, J Angus; Koster, Wayne M; Stuart, Ivor G; Reich, Paul; Stewardson, Michael J

    2018-03-01

    Environmental water managers must make best use of allocations, and adaptive management is one means of improving effectiveness of environmental water delivery. Adaptive management relies on generation of new knowledge from monitoring and evaluation, but it is often difficult to make clear inferences from available monitoring data. Alternative approaches to assessment of flow benefits may offer an improved pathway to adaptive management. We developed Bayesian statistical models to inform adaptive management of the threatened Australian grayling (Prototroctes maraena) in the coastal Thomson River, South-East Victoria Australia. The models assessed the importance of flows in spring and early summer (migration flows) for upstream dispersal and colonization of juveniles of this diadromous species. However, Australian grayling young-of-year were recorded in low numbers, and models provided no indication of the benefit of migration flows. To overcome this limitation, we applied the same models to young-of-year of a surrogate species (tupong-Pseudaphritis urvilli)-a more common diadromous species expected to respond to flow similarly to Australian grayling-and found strong positive responses to migration flows. Our results suggest two complementary approaches to supporting adaptive management of Australian grayling. First, refine monitoring approaches to allow direct measurement of effects of migration flows, a process currently under way. Second, while waiting for improved data, further investigate the use of tupong as a surrogate species. More generally, alternative approaches to assessment can improve knowledge to inform adaptive management, and this can occur while monitoring is being revised to directly target environmental responses of interest.

  6. A surrogate-based sensitivity quantification and Bayesian inversion of a regional groundwater flow model

    NASA Astrophysics Data System (ADS)

    Chen, Mingjie; Izady, Azizallah; Abdalla, Osman A.; Amerjeed, Mansoor

    2018-02-01

    Bayesian inference using Markov Chain Monte Carlo (MCMC) provides an explicit framework for stochastic calibration of hydrogeologic models accounting for uncertainties; however, the MCMC sampling entails a large number of model calls, and could easily become computationally unwieldy if the high-fidelity hydrogeologic model simulation is time consuming. This study proposes a surrogate-based Bayesian framework to address this notorious issue, and illustrates the methodology by inverse modeling a regional MODFLOW model. The high-fidelity groundwater model is approximated by a fast statistical model using Bagging Multivariate Adaptive Regression Spline (BMARS) algorithm, and hence the MCMC sampling can be efficiently performed. In this study, the MODFLOW model is developed to simulate the groundwater flow in an arid region of Oman consisting of mountain-coast aquifers, and used to run representative simulations to generate training dataset for BMARS model construction. A BMARS-based Sobol' method is also employed to efficiently calculate input parameter sensitivities, which are used to evaluate and rank their importance for the groundwater flow model system. According to sensitivity analysis, insensitive parameters are screened out of Bayesian inversion of the MODFLOW model, further saving computing efforts. The posterior probability distribution of input parameters is efficiently inferred from the prescribed prior distribution using observed head data, demonstrating that the presented BMARS-based Bayesian framework is an efficient tool to reduce parameter uncertainties of a groundwater system.

  7. Calibration of an agricultural-hydrological model (RZWQM2) using surrogate global optimization

    DOE PAGES

    Xi, Maolong; Lu, Dan; Gui, Dongwei; ...

    2016-11-27

    Robust calibration of an agricultural-hydrological model is critical for simulating crop yield and water quality and making reasonable agricultural management decisions. However, calibration of agricultural-hydrological system models is challenging because of model complexity, the existence of strong parameter correlation, and significant computational requirements. Therefore, only a limited number of simulations can be allowed in any attempt to find a near-optimal solution within an affordable time, which greatly restricts the successful application of the model. The goal of this study is to locate the optimal solution of the Root Zone Water Quality Model (RZWQM2) given a limited simulation time, so as to improve the model simulation and help make rational and effective agricultural-hydrological decisions. To this end, we propose a computationally efficient global optimization procedure using sparse-grid based surrogates. We first used advanced sparse grid (SG) interpolation to construct a surrogate system of the actual RZWQM2, and then we calibrate the surrogate model using the global optimization algorithm, Quantum-behaved Particle Swarm Optimization (QPSO). As the surrogate model is a polynomial with fast evaluation, it can be efficiently evaluated a sufficiently large number of times during the optimization, which facilitates the global search. We calibrate seven model parameters against five years of yield, drain flow, and NO3-N loss data from a subsurface-drained corn-soybean field in Iowa. Results indicate that an accurate surrogate model can be created for the RZWQM2 with a relatively small number of SG points (i.e., RZWQM2 runs). Compared to the conventional QPSO algorithm, our surrogate-based optimization method can achieve a smaller objective function value and better calibration performance using fewer expensive RZWQM2 executions, which greatly improves computational efficiency.

  8. Calibration of an agricultural-hydrological model (RZWQM2) using surrogate global optimization

    NASA Astrophysics Data System (ADS)

    Xi, Maolong; Lu, Dan; Gui, Dongwei; Qi, Zhiming; Zhang, Guannan

    2017-01-01

    Robust calibration of an agricultural-hydrological model is critical for simulating crop yield and water quality and making reasonable agricultural management. However, calibration of the agricultural-hydrological system models is challenging because of model complexity, the existence of strong parameter correlation, and significant computational requirements. Therefore, only a limited number of simulations can be allowed in any attempt to find a near-optimal solution within an affordable time, which greatly restricts the successful application of the model. The goal of this study is to locate the optimal solution of the Root Zone Water Quality Model (RZWQM2) given a limited simulation time, so as to improve the model simulation and help make rational and effective agricultural-hydrological decisions. To this end, we propose a computationally efficient global optimization procedure using sparse-grid based surrogates. We first used advanced sparse grid (SG) interpolation to construct a surrogate system of the actual RZWQM2, and then we calibrate the surrogate model using the global optimization algorithm, Quantum-behaved Particle Swarm Optimization (QPSO). As the surrogate model is a polynomial with fast evaluation, it can be efficiently evaluated with a sufficiently large number of times during the optimization, which facilitates the global search. We calibrate seven model parameters against five years of yield, drain flow, and NO3-N loss data from a subsurface-drained corn-soybean field in Iowa. Results indicate that an accurate surrogate model can be created for the RZWQM2 with a relatively small number of SG points (i.e., RZWQM2 runs). Compared to the conventional QPSO algorithm, our surrogate-based optimization method can achieve a smaller objective function value and better calibration performance using a fewer number of expensive RZWQM2 executions, which greatly improves computational efficiency.
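    A rough sketch of the surrogate-based calibration loop is given below. For simplicity it substitutes a polynomial response surface for the sparse-grid interpolant and scipy's differential evolution for QPSO, so it is only an illustration of the workflow (design runs, surrogate fit, global search on the surrogate), not the authors' procedure; the toy model and target value are hypothetical.

```python
import numpy as np
from scipy.optimize import differential_evolution
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(8)

def expensive_model(theta):
    """Stand-in for one expensive simulator run (e.g., an RZWQM2 execution)."""
    return np.sin(theta[0]) * theta[1] + 0.5 * theta[2] ** 2

bounds = [(-2, 2), (-2, 2), (-2, 2)]
target = expensive_model(np.array([0.6, -1.1, 0.3]))   # synthetic "observed" value

# 1) Run the expensive model at a modest number of design points.
design = rng.uniform([b[0] for b in bounds], [b[1] for b in bounds], size=(60, 3))
responses = np.array([expensive_model(t) for t in design])

# 2) Fit a cheap polynomial surrogate of the response surface.
surrogate = make_pipeline(PolynomialFeatures(3), LinearRegression())
surrogate.fit(design, responses)

# 3) Calibrate on the surrogate with a global optimizer; the surrogate is so
#    cheap that tens of thousands of evaluations are affordable.
def objective(theta):
    pred = surrogate.predict(theta.reshape(1, -1))[0]
    return (pred - target) ** 2

result = differential_evolution(objective, bounds, seed=8)
print("surrogate-calibrated parameters:", result.x)
```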

  9. Calibration of an agricultural-hydrological model (RZWQM2) using surrogate global optimization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xi, Maolong; Lu, Dan; Gui, Dongwei

    Robust calibration of an agricultural-hydrological model is critical for simulating crop yield and water quality and making reasonable agricultural management decisions. However, calibration of agricultural-hydrological system models is challenging because of model complexity, the existence of strong parameter correlation, and significant computational requirements. Therefore, only a limited number of simulations can be allowed in any attempt to find a near-optimal solution within an affordable time, which greatly restricts the successful application of the model. The goal of this study is to locate the optimal solution of the Root Zone Water Quality Model (RZWQM2) given a limited simulation time, so as to improve the model simulation and help make rational and effective agricultural-hydrological decisions. To this end, we propose a computationally efficient global optimization procedure using sparse-grid based surrogates. We first used advanced sparse grid (SG) interpolation to construct a surrogate system of the actual RZWQM2, and then we calibrate the surrogate model using the global optimization algorithm, Quantum-behaved Particle Swarm Optimization (QPSO). As the surrogate model is a polynomial with fast evaluation, it can be efficiently evaluated a sufficiently large number of times during the optimization, which facilitates the global search. We calibrate seven model parameters against five years of yield, drain flow, and NO3-N loss data from a subsurface-drained corn-soybean field in Iowa. Results indicate that an accurate surrogate model can be created for the RZWQM2 with a relatively small number of SG points (i.e., RZWQM2 runs). Compared to the conventional QPSO algorithm, our surrogate-based optimization method can achieve a smaller objective function value and better calibration performance using fewer expensive RZWQM2 executions, which greatly improves computational efficiency.

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dasari, Paul K. R.; Shazeeb, Mohammed Salman; Könik, Arda

    Purpose: Binning list-mode acquisitions as a function of a surrogate signal related to respiration has been employed to reduce the impact of respiratory motion on image quality in cardiac emission tomography (SPECT and PET). Inherent in amplitude binning is the assumption that there is a monotonic relationship between the amplitude of the surrogate signal and respiratory motion of the heart. This assumption is not valid in the presence of hysteresis when heart motion exhibits a different relationship with the surrogate during inspiration and expiration. The purpose of this study was to investigate the novel approach of using the Bouc–Wen (BW) model to provide a signal accounting for hysteresis when binning list-mode data with the goal of thereby improving motion correction. The study is based on the authors’ previous observations that hysteresis between chest and abdomen markers was indicative of hysteresis between abdomen markers and the internal motion of the heart. Methods: In 19 healthy volunteers, they determined the internal motion of the heart and diaphragm in the superior–inferior direction during free breathing using MRI navigators. A visual tracking system (VTS) synchronized with MRI acquisition tracked the anterior–posterior motions of external markers placed on the chest and abdomen. These data were employed to develop and test the Bouc–Wen model by inputting the VTS derived chest and abdomen motions into it and using the resulting output signals as surrogates for cardiac motion. The data of the volunteers were divided into training and testing sets. The training set was used to obtain initial values for the model parameters for all of the volunteers in the set, and for set members based on whether they were or were not classified as exhibiting hysteresis using a metric derived from the markers. These initial parameters were then employed with the testing set to estimate output signals. Pearson’s linear correlation coefficient between the abdomen, chest, average of chest and abdomen markers, and Bouc–Wen derived signals versus the true internal motion of the heart from MRI was used to judge the signals match to the heart motion. Results: The results show that the Bouc–Wen model generated signals demonstrated strong correlation with the heart motion. This correlation was slightly larger on average than that of the external surrogate signals derived from the abdomen marker, and average of the abdomen and chest markers, but was not statistically significantly different from them. Conclusions: The results suggest that the proposed model has the potential to be a unified framework for modeling hysteresis in respiratory motion in cardiac perfusion studies and beyond.
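    For reference, a generic Bouc–Wen hysteresis model driven by a periodic surrogate signal can be integrated as below. The parameter values, the driving signal, and the time span are hypothetical and are not the fitted values from the study; the sketch only shows how the hysteretic state z(t) is obtained from a surrogate trace x(t).

```python
import numpy as np
from scipy.integrate import solve_ivp

# Bouc-Wen hysteresis: the internal state z responds differently while the
# driving signal x(t) rises (inspiration) and falls (expiration).
A_bw, beta, gamma, n = 1.0, 0.5, 0.5, 1.0

def x(t):                      # surrogate signal, e.g. abdomen-marker amplitude
    return np.sin(2 * np.pi * t / 4.0)

def dxdt(t):
    return (np.pi / 2.0) * np.cos(2 * np.pi * t / 4.0)

def bouc_wen(t, z):
    v = dxdt(t)
    return A_bw * v - beta * np.abs(v) * np.abs(z) ** (n - 1) * z - gamma * v * np.abs(z) ** n

sol = solve_ivp(bouc_wen, (0.0, 40.0), [0.0], dense_output=True, max_step=0.05)

t_eval = np.linspace(0.0, 40.0, 801)
z = sol.sol(t_eval)[0]          # hysteretic output used as the binning signal
print(z.min(), z.max())
```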

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lu, Bo, E-mail: luboufl@gmail.com; Park, Justin C.; Fan, Qiyong

    Purpose: Accurately localizing lung tumors is essential for high-precision radiation therapy techniques such as stereotactic body radiation therapy (SBRT). Since direct monitoring of tumor motion is not always achievable due to the limitation of imaging modalities for treatment guidance, placement of fiducial markers on the patient’s body surface to act as a surrogate for tumor position prediction is a practical alternative for tracking lung tumor motion during SBRT treatments. In this work, the authors propose an innovative and robust model to solve the multimarker position optimization problem. The model is able to overcome the major drawbacks of the sparse optimization approach (SOA) model. Methods: The principal-component-analysis (PCA) method was employed as the framework to build the authors’ statistical prediction model. The method can be divided into two stages. The first stage is to build the surrogate tumor matrix and calculate its eigenvalues and associated eigenvectors. The second stage is to determine the “best represented” columns of the eigenvector matrix obtained from stage one and subsequently acquire the optimal marker positions as well as numbers. Using 4-dimensional CT (4DCT) and breath-hold CT imaging data, the PCA method was compared to the SOA method with respect to calculation time, average prediction accuracy, prediction stability, noise resistance, marker position consistency, and marker distribution. Results: The PCA and SOA methods were both tested on all 11 patients for a total of 130 cases including 4DCT and breath-hold CT scenarios. The maximum calculation time for the PCA method was less than 1 s with 64 752 surface points, whereas the average calculation time for the SOA method was over 12 min with 400 surface points. Overall, the tumor center position prediction errors were comparable between the two methods, and all were less than 1.5 mm. However, for the extreme scenarios (breath hold), the prediction errors for the PCA method were not only smaller, but were also more stable than for the SOA method. Results obtained by imposing a series of random noises on the surrogates indicated that the PCA method was much more noise resistant than the SOA method. The marker position consistency tests using various combinations of 4DCT phases to construct the surrogates suggested that the marker position predictions of the PCA method were more consistent than those of the SOA method, in spite of surrogate construction. Marker distribution tests indicated that greater than 80% of the calculated marker positions fell into the high cross correlation and high motion magnitude regions for both of the algorithms. Conclusions: The PCA model is an accurate, efficient, robust, and practical model for solving the multimarker position optimization problem to predict lung tumor motion during SBRT treatments. Due to its generality, the PCA model can also be applied to other image guidance systems that use surface motion as the surrogate.

  12. Comparison of organs' shapes with geometric and Zernike 3D moments.

    PubMed

    Broggio, D; Moignier, A; Ben Brahim, K; Gardumi, A; Grandgirard, N; Pierrat, N; Chea, M; Derreumaux, S; Desbrée, A; Boisserie, G; Aubert, B; Mazeron, J-J; Franck, D

    2013-09-01

    The morphological similarity of organs is studied with feature vectors based on geometric and Zernike 3D moments. It is particularly investigated whether outliers and average models can be identified. For this purpose, the relative proximity to the mean feature vector is defined; principal coordinate and clustering analyses are also performed. To study the consistency and usefulness of this approach, 17 liver and 76 heart voxel models from several sources are considered. In the liver case, models with similar morphological features are identified. For the limited number of studied cases, the liver of the ICRP male voxel model is identified as a better surrogate than the female one. For hearts, the clustering analysis shows that three heart shapes represent about 80% of the morphological variations. The relative proximity and clustering analyses rather consistently identify outliers and average models. For the two cases, identification of outliers and of surrogates for average models is rather robust. However, deeper classification of morphological features is subject to caution and can only be performed after cross analysis of at least two kinds of feature vectors. Finally, the Zernike moments contain all the information needed to reconstruct the studied objects and thus appear as a promising tool to derive statistical organ shapes. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Jiangjiang; Li, Weixuan; Lin, Guang

    In decision-making for groundwater management and contamination remediation, it is important to accurately evaluate the probability of the occurrence of a failure event. For small failure probability analysis, a large number of model evaluations are needed in the Monte Carlo (MC) simulation, which is impractical for CPU-demanding models. One approach to alleviate the computational cost caused by the model evaluations is to construct a computationally inexpensive surrogate model instead. However, using a surrogate approximation can cause an extra error in the failure probability analysis. Moreover, constructing accurate surrogates is challenging for high-dimensional models, i.e., models containing many uncertain input parameters. To address these issues, we propose an efficient two-stage MC approach for small failure probability analysis in high-dimensional groundwater contaminant transport modeling. In the first stage, a low-dimensional representation of the original high-dimensional model is sought with Karhunen–Loève expansion and sliced inverse regression jointly, which allows for the easy construction of a surrogate with polynomial chaos expansion. Then a surrogate-based MC simulation is implemented. In the second stage, the small number of samples that are close to the failure boundary are re-evaluated with the original model, which corrects the bias introduced by the surrogate approximation. The proposed approach is tested with a numerical case study and is shown to be 100 times faster than the traditional MC approach in achieving the same level of estimation accuracy.
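
    A minimal sketch of the two-stage idea, with the dimension-reduction machinery (Karhunen–Loève expansion, sliced inverse regression, polynomial chaos) hidden behind a generic `surrogate` callable: run a surrogate-based Monte Carlo pass, then re-evaluate only the samples near the failure boundary with the full model. The `band` criterion for "close to the boundary" is an assumption for illustration, not the paper's specific rule.

    ```python
    import numpy as np

    def two_stage_failure_probability(model, surrogate, samples, threshold, band=0.1):
        """Two-stage MC estimate of a small failure probability (sketch).

        `model` is the expensive simulator, `surrogate` its cheap
        approximation; both map a parameter vector to the scalar QoI.
        """
        g_surr = np.array([surrogate(x) for x in samples])
        failed = g_surr > threshold

        # Stage 2: re-evaluate only samples near the failure boundary with
        # the original model, correcting the surrogate-induced bias there.
        near = np.abs(g_surr - threshold) < band * abs(threshold)
        for i in np.flatnonzero(near):
            failed[i] = model(samples[i]) > threshold

        return failed.mean(), int(near.sum())   # estimate, # full-model runs
    ```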

  14. An Asynchronous Many-Task Implementation of In-Situ Statistical Analysis using Legion.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pebay, Philippe Pierre; Bennett, Janine Camille

    2015-11-01

    In this report, we propose a framework for the design and implementation of in-situ analyses using an asynchronous many-task (AMT) model, using the Legion programming model together with the MiniAero mini-application as a surrogate for full-scale parallel scientific computing applications. The bulk of this work consists of converting the Learn/Derive/Assess model, which we had initially developed for parallel statistical analysis using MPI [PTBM11], from an SPMD to an AMT model. To this end, we propose an original use of the concept of Legion logical regions as a replacement for the parallel communication schemes used for the only operations of the statistics engines that require explicit communication. We then evaluate this proposed scheme in a shared memory environment, using the Legion port of MiniAero as a proxy for a full-scale scientific application, as a means to provide input data sets of variable size for the in-situ statistical analyses in an AMT context. We demonstrate in particular that the approach has merit, and warrants further investigation, in collaboration with ongoing efforts to improve the overall parallel performance of the Legion system.

  15. Design Mining Interacting Wind Turbines.

    PubMed

    Preen, Richard J; Bull, Larry

    2016-01-01

    An initial study of surrogate-assisted evolutionary algorithms used to design vertical-axis wind turbines, wherein candidate prototypes are evaluated under fan-generated wind conditions after being physically instantiated by a 3D printer, has recently been presented. Unlike other approaches, such as computational fluid dynamics simulations, no mathematical formulations were used and no model assumptions were made. This paper extends that work by exploring alternative surrogate modelling and evolutionary techniques. The accuracy of various modelling algorithms used to estimate the fitness of evaluated individuals from the initial experiments is compared. The effect of temporally windowing surrogate model training samples is explored. A surrogate-assisted approach based on an enhanced local search is introduced, and alternative coevolution collaboration schemes are examined.

  16. Adaptive surrogate model based multi-objective transfer trajectory optimization between different libration points

    NASA Astrophysics Data System (ADS)

    Peng, Haijun; Wang, Wei

    2016-10-01

    This paper proposes an adaptive surrogate-model-based multi-objective optimization strategy that combines the benefits of invariant manifolds and low-thrust control to develop a low-computational-cost transfer trajectory between libration orbits around the L1 and L2 libration points in the Sun-Earth system. A new structure for a multi-objective transfer trajectory optimization model that divides the transfer trajectory into several segments and assigns dominant roles to invariant manifolds and low-thrust control in different segments has been established. To reduce the computational cost of multi-objective transfer trajectory optimization, a mixed sampling strategy-based adaptive surrogate model has been proposed. Numerical simulations show that the results obtained from the adaptive surrogate-based multi-objective optimization are in agreement with the results obtained using direct multi-objective optimization methods, and the computational workload of the adaptive surrogate-based multi-objective optimization is only approximately 10% of that of direct multi-objective optimization. Furthermore, the efficiency of generating Pareto points with the adaptive surrogate-based multi-objective optimization is approximately 8 times that of the direct multi-objective optimization. Therefore, the proposed adaptive surrogate-based multi-objective optimization provides obvious advantages over direct multi-objective optimization methods.

  17. Eutrophication risk assessment in coastal embayments using simple statistical models.

    PubMed

    Arhonditsis, G; Eleftheriadou, M; Karydis, M; Tsirtsis, G

    2003-09-01

    A statistical methodology is proposed for assessing the risk of eutrophication in marine coastal embayments. The procedure followed was the development of regression models relating the levels of chlorophyll a (Chl) with the concentration of the limiting nutrient--usually nitrogen--and the renewal rate of the systems. The method was applied in the Gulf of Gera, Island of Lesvos, Aegean Sea and a surrogate for renewal rate was created using the Canberra metric as a measure of the resemblance between the Gulf and the oligotrophic waters of the open sea in terms of their physical, chemical and biological properties. The Chl-total dissolved nitrogen-renewal rate regression model was the most significant, accounting for 60% of the variation observed in Chl. Predicted distributions of Chl for various combinations of the independent variables, based on Bayesian analysis of the models, enabled comparison of the outcomes of specific scenarios of interest as well as further analysis of the system dynamics. The present statistical approach can be used as a methodological tool for testing the resilience of coastal ecosystems under alternative managerial schemes and levels of exogenous nutrient loading.
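
    A minimal sketch of the kind of empirical relationship described above, regressing chlorophyll a on the limiting nutrient and a renewal-rate surrogate. The log-linear form, variable names, and ordinary least-squares fit are illustrative assumptions; the published model and its Bayesian treatment differ in detail.

    ```python
    import numpy as np

    def fit_chl_model(chl, tdn, renewal):
        """Fit log(Chl) ~ total dissolved nitrogen + renewal-rate surrogate
        by ordinary least squares (illustrative sketch only)."""
        X = np.column_stack([np.ones_like(tdn), tdn, renewal])
        beta, *_ = np.linalg.lstsq(X, np.log(chl), rcond=None)
        fitted = X @ beta
        ss_res = np.sum((np.log(chl) - fitted) ** 2)
        ss_tot = np.sum((np.log(chl) - np.log(chl).mean()) ** 2)
        return beta, 1.0 - ss_res / ss_tot      # coefficients and R^2
    ```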

  18. Modeling methods for merging computational and experimental aerodynamic pressure data

    NASA Astrophysics Data System (ADS)

    Haderlie, Jacob C.

    This research describes a process to model surface pressure data sets as a function of wing geometry from computational and wind tunnel sources and then merge them into a single predicted value. The described merging process will enable engineers to integrate these data sets with the goal of utilizing the advantages of each data source while overcoming the limitations of both; this provides a single, combined data set to support analysis and design. The main challenge with this process is accurately representing each data source everywhere on the wing. Additionally, this effort demonstrates methods to model wind tunnel pressure data as a function of angle of attack as an initial step towards a merging process that uses both location on the wing and flow conditions (e.g., angle of attack, flow velocity or Reynolds number) as independent variables. This surrogate model of pressure as a function of angle of attack can be useful for engineers who need to predict the location of zero-order discontinuities, e.g., flow separation or normal shocks. Because, to the author's best knowledge, there is no published, well-established merging method for aerodynamic pressure data (here, the coefficient of pressure Cp), this work identifies promising modeling and merging methods, and then makes a critical comparison of these methods. Surrogate models represent the pressure data for both data sets. Cubic B-spline surrogate models represent the computational simulation results. Machine learning and multi-fidelity surrogate models represent the experimental data. This research compares three surrogates for the experimental data (sequential--a.k.a. online--Gaussian processes, batch Gaussian processes, and multi-fidelity additive corrector) on the merits of accuracy and computational cost. The Gaussian process (GP) methods employ cubic B-spline CFD surrogates as a model basis function to build a surrogate model of the wind tunnel (WT) data, and this usage of the CFD surrogate in building the WT surrogate could serve as a "merging" because the resulting WT pressure prediction uses information from both sources. In the GP approach, this model basis function concept seems to place more "weight" on the Cp values from the wind tunnel because the GP surrogate uses the CFD to approximate the WT data values. Conversely, the computationally inexpensive additive corrector method uses the CFD B-spline surrogate to define the shape of the spanwise distribution of the Cp while minimizing prediction error at all spanwise locations for a given arc length position; this, too, combines information from both sources to make a prediction of the 2-D WT-based Cp distribution, but the additive corrector approach gives more weight to the CFD prediction than to the WT data. Three surrogate models of the experimental data as a function of angle of attack are also compared for accuracy and computational cost. These surrogates are a single Gaussian process model (a single "expert"), product of experts, and generalized product of experts. The merging approach provides a single pressure distribution that combines experimental and computational data. The batch Gaussian process method provides a relatively accurate surrogate that is computationally acceptable, and can receive wind tunnel data from port locations that are not necessarily parallel to a variable direction. 
On the other hand, the sequential Gaussian process and additive corrector methods must receive a sufficient number of data points aligned with one direction, e.g., from pressure port bands (tap rows) aligned with the freestream. The generalized product of experts best represents wind tunnel pressure as a function of angle of attack, but at higher computational cost than the single expert approach. The format of the application data from computational and experimental sources in this work precluded the merging process from including flow condition variables (e.g., angle of attack) in the independent variables, so the merging process is only conducted in the wing geometry variables of arc length and span. The merging process of Cp data allows a more "hands-off" approach to aircraft design and analysis (i.e., not as many engineers needed to debate the Cp distribution shape) and generates Cp predictions at any location on the wing. However, the costs of these benefits are engineer time (learning how to build surrogates), computational time in constructing the surrogates, and surrogate accuracy (surrogates introduce error into data predictions). This dissertation effort used the Trap Wing / First AIAA CFD High-Lift Prediction Workshop as a relevant wing with a multi-element high-lift system, and this work identified that the batch GP model for the WT data and the B-spline surrogate for the CFD might best be combined using expert belief weights to describe Cp as a function of location on the wing element surface. (Abstract shortened by ProQuest.)
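
    The additive, multi-fidelity flavor of this merging idea can be sketched compactly: fit a Gaussian process to the wind-tunnel-minus-CFD residual so that predictions draw on both sources. This is a generic sketch under assumed interfaces (a callable `cfd_surrogate`, scikit-learn's GP regressor), not the dissertation's batch-GP or expert-weighting formulation.

    ```python
    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF, WhiteKernel

    def merge_cfd_and_wt(cfd_surrogate, s_wt, cp_wt):
        """Additive merge of CFD and wind-tunnel Cp data (illustrative).

        cfd_surrogate(s): cheap model of Cp versus arc length s (e.g. a
        cubic B-spline fit to CFD); (s_wt, cp_wt): tap locations/pressures.
        """
        residual = cp_wt - cfd_surrogate(s_wt)
        gp = GaussianProcessRegressor(kernel=RBF() + WhiteKernel(),
                                      normalize_y=True)
        gp.fit(np.asarray(s_wt, dtype=float).reshape(-1, 1), residual)

        def merged(s):
            s = np.asarray(s, dtype=float)
            corr, std = gp.predict(s.reshape(-1, 1), return_std=True)
            return cfd_surrogate(s) + corr, std   # merged Cp and uncertainty
        return merged
    ```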

  19. Utilizing Adjoint-Based Error Estimates for Surrogate Models to Accurately Predict Probabilities of Events

    DOE PAGES

    Butler, Troy; Wildey, Timothy

    2018-01-01

    In this study, we develop a procedure to utilize error estimates for samples of a surrogate model to compute robust upper and lower bounds on estimates of probabilities of events. We show that these error estimates can also be used in an adaptive algorithm to simultaneously reduce the computational cost and increase the accuracy in estimating probabilities of events using computationally expensive high-fidelity models. Specifically, we introduce the notion of reliability of a sample of a surrogate model, and we prove that utilizing the surrogate model for the reliable samples and the high-fidelity model for the unreliable samples gives precisely the same estimate of the probability of the output event as would be obtained by evaluation of the original model for each sample. The adaptive algorithm uses the additional evaluations of the high-fidelity model for the unreliable samples to locally improve the surrogate model near the limit state, which significantly reduces the number of high-fidelity model evaluations as the limit state is resolved. Numerical results based on a recently developed adjoint-based approach for estimating the error in samples of a surrogate are provided to demonstrate (1) the robustness of the bounds on the probability of an event, and (2) that the adaptive enhancement algorithm provides a more accurate estimate of the probability of the QoI event than standard response surface approximation methods at a lower computational cost.
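
    The reliable/unreliable sample splitting described above can be sketched as follows. The adjoint-based error estimate itself is treated as a given callable (`error_estimate`), which is an assumption; the point of the sketch is only the decision logic of falling back to the high-fidelity model near the limit state.

    ```python
    import numpy as np

    def event_probability_with_error_estimates(surrogate, error_estimate,
                                               model, samples, threshold):
        """Use per-sample error bounds to decide which surrogate samples are
        reliable; re-evaluate the rest with the high-fidelity model."""
        q = np.array([surrogate(x) for x in samples])
        e = np.array([error_estimate(x) for x in samples])

        # A sample is reliable if the error band cannot change its side of
        # the threshold (limit state).
        reliable = (q + e < threshold) | (q - e > threshold)
        in_event = q > threshold
        for i in np.flatnonzero(~reliable):
            in_event[i] = model(samples[i]) > threshold   # high-fidelity call

        return in_event.mean(), int((~reliable).sum())
    ```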

  20. Utilizing Adjoint-Based Error Estimates for Surrogate Models to Accurately Predict Probabilities of Events

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Butler, Troy; Wildey, Timothy

    In this study, we develop a procedure to utilize error estimates for samples of a surrogate model to compute robust upper and lower bounds on estimates of probabilities of events. We show that these error estimates can also be used in an adaptive algorithm to simultaneously reduce the computational cost and increase the accuracy in estimating probabilities of events using computationally expensive high-fidelity models. Specifically, we introduce the notion of reliability of a sample of a surrogate model, and we prove that utilizing the surrogate model for the reliable samples and the high-fidelity model for the unreliable samples gives precisely the same estimate of the probability of the output event as would be obtained by evaluation of the original model for each sample. The adaptive algorithm uses the additional evaluations of the high-fidelity model for the unreliable samples to locally improve the surrogate model near the limit state, which significantly reduces the number of high-fidelity model evaluations as the limit state is resolved. Numerical results based on a recently developed adjoint-based approach for estimating the error in samples of a surrogate are provided to demonstrate (1) the robustness of the bounds on the probability of an event, and (2) that the adaptive enhancement algorithm provides a more accurate estimate of the probability of the QoI event than standard response surface approximation methods at a lower computational cost.

  1. Model-independent test for scale-dependent non-Gaussianities in the cosmic microwave background.

    PubMed

    Räth, C; Morfill, G E; Rossmanith, G; Banday, A J; Górski, K M

    2009-04-03

    We present a model-independent method to test for scale-dependent non-Gaussianities, using scaling indices as test statistics. To this end, surrogate data sets are generated in which the power spectrum of the original data is preserved, while the higher-order correlations are partly randomized by applying a scale-dependent shuffling procedure to the Fourier phases. We apply this method to the Wilkinson Microwave Anisotropy Probe data of the cosmic microwave background and find signatures of non-Gaussianities on large scales. Further tests are required to elucidate the origin of the detected anomalies.
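
    The surrogate construction described here, preserving the power spectrum while randomizing Fourier phases, can be sketched in a few lines. The sketch below is the simplest scale-independent variant; the paper's scale-dependent shuffling restricts the randomization to selected ranges of scales.

    ```python
    import numpy as np

    def phase_randomized_surrogate(data, rng=None):
        """Surrogate realization that preserves the power spectrum of `data`
        while randomizing Fourier phases."""
        rng = np.random.default_rng(rng)
        data = np.asarray(data, dtype=float)
        spectrum = np.fft.rfft(data)
        phases = rng.uniform(0.0, 2.0 * np.pi, size=spectrum.shape)
        phases[0] = 0.0                       # keep the mean term real
        if len(data) % 2 == 0:
            phases[-1] = 0.0                  # Nyquist term must stay real
        surrogate = np.abs(spectrum) * np.exp(1j * phases)
        return np.fft.irfft(surrogate, n=len(data))
    ```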

  2. Comparing biomarkers as principal surrogate endpoints.

    PubMed

    Huang, Ying; Gilbert, Peter B

    2011-12-01

    Recently a new definition of surrogate endpoint, the "principal surrogate," was proposed based on causal associations between treatment effects on the biomarker and on the clinical endpoint. Despite its appealing interpretation, limited research has been conducted to evaluate principal surrogates, and existing methods focus on risk models that consider a single biomarker. How to compare principal surrogate value of biomarkers or general risk models that consider multiple biomarkers remains an open research question. We propose to characterize a marker or risk model's principal surrogate value based on the distribution of risk difference between interventions. In addition, we propose a novel summary measure (the standardized total gain) that can be used to compare markers and to assess the incremental value of a new marker. We develop a semiparametric estimated-likelihood method to estimate the joint surrogate value of multiple biomarkers. This method accommodates two-phase sampling of biomarkers and is more widely applicable than existing nonparametric methods by incorporating continuous baseline covariates to predict the biomarker(s), and is more robust than existing parametric methods by leaving the error distribution of markers unspecified. The methodology is illustrated using a simulated example set and a real data set in the context of HIV vaccine trials. © 2011, The International Biometric Society.

  3. Global Parameter Optimization of CLM4.5 Using Sparse-Grid Based Surrogates

    NASA Astrophysics Data System (ADS)

    Lu, D.; Ricciuto, D. M.; Gu, L.

    2016-12-01

    Calibration of the Community Land Model (CLM) is challenging because of its model complexity, large parameter sets, and significant computational requirements. Therefore, only a limited number of simulations can be allowed in any attempt to find a near-optimal solution within an affordable time. The goal of this study is to calibrate some of the CLM parameters in order to improve model projection of carbon fluxes. To this end, we propose a computationally efficient global optimization procedure using sparse-grid based surrogates. We first use advanced sparse grid (SG) interpolation to construct a surrogate system of the actual CLM model, and then we calibrate the surrogate model in the optimization process. As the surrogate model is a polynomial whose evaluation is fast, it can be efficiently evaluated a sufficiently large number of times in the optimization, which facilitates the global search. We calibrate five parameters against 12 months of GPP, NEP, and TLAI data from the U.S. Missouri Ozark (US-MOz) tower. The results indicate that an accurate surrogate model can be created for the CLM4.5 with a relatively small number of SG points (i.e., CLM4.5 simulations), and the application of the optimized parameters leads to a higher predictive capacity than the default parameter values in the CLM4.5 for the US-MOz site.

  4. Identifying experimental surrogates for Bacillus anthracis spores: a review

    PubMed Central

    2010-01-01

    Bacillus anthracis, the causative agent of anthrax, is a proven biological weapon. In order to study this threat, a number of experimental surrogates have been used over the past 70 years. However, not all surrogates are appropriate for B. anthracis, especially when investigating transport, fate and survival. Although B. atrophaeus has been widely used as a B. anthracis surrogate, the two species do not always behave identically in transport and survival models. Therefore, we devised a scheme to identify a more appropriate surrogate for B. anthracis. Our selection criteria included risk of use (pathogenicity), phylogenetic relationship, morphology and comparative survivability when challenged with biocides. Although our knowledge of certain parameters remains incomplete, especially with regards to comparisons of spore longevity under natural conditions, we found that B. thuringiensis provided the best overall fit as a non-pathogenic surrogate for B. anthracis. Thus, we suggest focusing on this surrogate in future experiments of spore fate and transport modelling. PMID:21092338

  5. Dependence of subject-specific parameters for a fast helical CT respiratory motion model on breathing rate: an animal study

    NASA Astrophysics Data System (ADS)

    O'Connell, Dylan; Thomas, David H.; Lamb, James M.; Lewis, John H.; Dou, Tai; Sieren, Jered P.; Saylor, Melissa; Hofmann, Christian; Hoffman, Eric A.; Lee, Percy P.; Low, Daniel A.

    2018-02-01

    To determine if the parameters relating lung tissue displacement to a breathing surrogate signal in a previously published respiratory motion model vary with the rate of breathing during image acquisition. An anesthetized pig was imaged using multiple fast helical scans to sample the breathing cycle with simultaneous surrogate monitoring. Three datasets were collected while the animal was mechanically ventilated with different respiratory rates: 12 bpm (breaths per minute), 17 bpm, and 24 bpm. Three sets of motion model parameters describing the correspondences between surrogate signals and tissue displacements were determined. The model error was calculated individually for each dataset, as well as for pairs of parameters and surrogate signals from different experiments. The values of one model parameter, a vector field denoted α relating tissue displacement to surrogate amplitude, were compared across the experiments. The mean model error of the three datasets was 1.00  ±  0.36 mm with a 95th percentile value of 1.69 mm. The mean error computed from all combinations of parameters and surrogate signals from different datasets was 1.14  ±  0.42 mm with a 95th percentile of 1.95 mm. The mean difference in α over all pairs of experiments was 4.7%  ±  5.4%, and the 95th percentile was 16.8%. The mean angle between pairs of α was 5.0  ±  4.0 degrees, with a 95th percentile of 13.2 degrees. The motion model parameters were largely unaffected by changes in the breathing rate during image acquisition. The mean error associated with mismatched sets of parameters and surrogate signals was 0.14 mm greater than the error achieved when using parameters and surrogate signals acquired with the same breathing rate, while maximum respiratory motion was 23.23 mm on average.

  6. Probability distributions of molecular observables computed from Markov models. II. Uncertainties in observables and their time-evolution

    NASA Astrophysics Data System (ADS)

    Chodera, John D.; Noé, Frank

    2010-09-01

    Discrete-state Markov (or master equation) models provide a useful simplified representation for characterizing the long-time statistical evolution of biomolecules in a manner that allows direct comparison with experiments as well as the elucidation of mechanistic pathways for an inherently stochastic process. A vital part of meaningful comparison with experiment is the characterization of the statistical uncertainty in the predicted experimental measurement, which may take the form of an equilibrium measurement of some spectroscopic signal, the time-evolution of this signal following a perturbation, or the observation of some statistic (such as the correlation function) of the equilibrium dynamics of a single molecule. Without meaningful error bars (which arise from both approximation and statistical error), there is no way to determine whether the deviations between model and experiment are statistically meaningful. Previous work has demonstrated that a Bayesian method that enforces microscopic reversibility can be used to characterize the statistical component of correlated uncertainties in state-to-state transition probabilities (and functions thereof) for a model inferred from molecular simulation data. Here, we extend this approach to include the uncertainty in observables that are functions of molecular conformation (such as surrogate spectroscopic signals) characterizing each state, permitting the full statistical uncertainty in computed spectroscopic experiments to be assessed. We test the approach in a simple model system to demonstrate that the computed uncertainties provide a useful indicator of statistical variation, and then apply it to the computation of the fluorescence autocorrelation function measured for a dye-labeled peptide previously studied by both experiment and simulation.

  7. EHR-based phenotyping: Bulk learning and evaluation.

    PubMed

    Chiu, Po-Hsiang; Hripcsak, George

    2017-06-01

    In data-driven phenotyping, a core computational task is to identify medical concepts and their variations from sources of electronic health records (EHR) to stratify phenotypic cohorts. A conventional analytic framework for phenotyping largely uses a manual knowledge engineering approach or a supervised learning approach where clinical cases are represented by variables encompassing diagnoses, medicinal treatments and laboratory tests, among others. In such a framework, tasks associated with feature engineering and data annotation remain a tedious and expensive exercise, resulting in poor scalability. In addition, certain clinical conditions, such as those that are rare and acute in nature, may never accumulate sufficient data over time, which poses a challenge to establishing accurate and informative statistical models. In this paper, we use infectious diseases as the domain of study to demonstrate a hierarchical learning method based on ensemble learning that attempts to address these issues through feature abstraction. We use a sparse annotation set to train and evaluate many phenotypes at once, which we call bulk learning. In this batch-phenotyping framework, disease cohort definitions can be learned from within the abstract feature space established by using multiple diseases as a substrate and diagnostic codes as surrogates. In particular, using surrogate labels for model training renders possible its subsequent evaluation using only a sparse annotated sample. Moreover, statistical models can be trained and evaluated, using the same sparse annotation, from within the abstract feature space of low dimensionality that encapsulates the shared clinical traits of these target diseases, collectively referred to as the bulk learning set. Copyright © 2017 Elsevier Inc. All rights reserved.

  8. Evaluation of countermeasures for red light running by traffic simulator-based surrogate safety measures.

    PubMed

    Lee, Changju; So, Jaehyun Jason; Ma, Jiaqi

    2018-01-02

    The conflicts among motorists entering a signalized intersection with the red light indication have become a national safety issue. Because of its sensitivity, efforts have been made to investigate the possible causes and effectiveness of countermeasures using comparison sites and/or before-and-after studies. Nevertheless, these approaches are ineffective when comparison sites cannot be found, or crash data sets are not readily available or not reliable for statistical analysis. Considering the random nature of red light running (RLR) crashes, an inventive approach regardless of data availability is necessary to evaluate the effectiveness of each countermeasure head to head. The aims of this research are to (1) review prior literature related to red light running and traffic safety models; (2) propose a practical methodology for evaluation of RLR countermeasures with a microscopic traffic simulation model and surrogate safety assessment model (SSAM); (3) apply the proposed methodology to an actual signalized intersection in Virginia, with the most prevalent scenarios--increasing the yellow signal interval duration, installing an advance warning sign, and installing an RLR camera; and (4) analyze the relative effectiveness by RLR frequency and the number of conflicts (rear-end and crossing). All scenarios show a reduction in RLR frequency (-7.8, -45.5, and -52.4%, respectively), but only increasing the yellow signal interval duration results in a reduced total number of conflicts (-11.3%; a surrogate safety measure of possible RLR-related crashes). An RLR camera makes the greatest reduction (-60.9%) in crossing conflicts (a surrogate safety measure of possible angle crashes), whereas increasing the yellow signal interval duration results in only a 12.8% reduction of rear-end conflicts (a surrogate safety measure of possible rear-end crashes). Although increasing the yellow signal interval duration is advantageous because this reduces the total conflicts (a possibility of total RLR-related crashes), each countermeasure shows different effects by RLR-related conflict types that can be referred to when making a decision. Given that each intersection has different RLR crash issues, evaluated countermeasures are directly applicable to enhance the cost and time effectiveness, according to the situation of the target intersection. In addition, the proposed methodology is replicable at any site that has a dearth of crash data and/or comparison sites in order to test any other countermeasures (both engineering and enforcement countermeasures) for RLR crashes.

  9. Numerical modelling of a peripheral arterial stenosis using dimensionally reduced models and kernel methods.

    PubMed

    Köppl, Tobias; Santin, Gabriele; Haasdonk, Bernard; Helmig, Rainer

    2018-05-06

    In this work, we consider two kinds of model reduction techniques to simulate blood flow through the largest systemic arteries, where a stenosis is located in a peripheral artery, i.e. in an artery that is located far away from the heart. For our simulations we place the stenosis in one of the tibial arteries belonging to the right lower leg (right posterior tibial artery). The model reduction techniques that are used are, on the one hand, dimensionally reduced models (1-D and 0-D models, the so-called mixed-dimension model) and, on the other hand, surrogate models produced by kernel methods. Both methods are combined in such a way that the mixed-dimension models yield training data for the surrogate model, where the surrogate model is parametrised by the degree of narrowing of the peripheral stenosis. By means of a well-trained surrogate model, we show that simulation data can be reproduced with a satisfactory accuracy and that parameter optimisation or state estimation problems can be solved in a very efficient way. Furthermore, it is demonstrated that a surrogate model enables us to present, after a very short simulation time, the impact of a varying degree of stenosis on blood flow, obtaining a speedup of several orders over the full model. This article is protected by copyright. All rights reserved.
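
    A minimal sketch of a kernel surrogate of the kind described, trained on (degree-of-stenosis, quantity-of-interest) pairs that would come from runs of the mixed-dimension model. The Gaussian kernel, its width, and the regularisation are illustrative assumptions, not the authors' specific kernel method.

    ```python
    import numpy as np

    def train_kernel_surrogate(degrees, outputs, epsilon=1.0, reg=1e-8):
        """Gaussian-kernel interpolant of a scalar QoI (e.g. a flow or
        pressure index) parametrised by the degree of stenosis."""
        x = np.asarray(degrees, dtype=float).reshape(-1, 1)
        y = np.asarray(outputs, dtype=float)
        K = np.exp(-epsilon * (x - x.T) ** 2)        # kernel matrix
        coeffs = np.linalg.solve(K + reg * np.eye(len(x)), y)

        def surrogate(query):
            q = np.asarray(query, dtype=float).reshape(-1, 1)
            return np.exp(-epsilon * (q - x.T) ** 2) @ coeffs
        return surrogate
    ```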

  10. Active Subspaces for Wind Plant Surrogate Modeling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    King, Ryan N; Quick, Julian; Dykes, Katherine L

    Understanding the uncertainty in wind plant performance is crucial to their cost-effective design and operation. However, conventional approaches to uncertainty quantification (UQ), such as Monte Carlo techniques or surrogate modeling, are often computationally intractable for utility-scale wind plants because of poor convergence rates or the curse of dimensionality. In this paper we demonstrate that wind plant power uncertainty can be well represented with a low-dimensional active subspace, thereby achieving a significant reduction in the dimension of the surrogate modeling problem. We apply the active subspaces technique to UQ of plant power output with respect to uncertainty in turbine axial induction factors, and find a single active subspace direction dominates the sensitivity in power output. When this single active subspace direction is used to construct a quadratic surrogate model, the number of model unknowns can be reduced by up to 3 orders of magnitude without compromising performance on unseen test data. We conclude that the dimension reduction achieved with active subspaces makes surrogate-based UQ approaches tractable for utility-scale wind plants.
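
    The active-subspace construction and the subsequent quadratic surrogate can be sketched as follows; the gradient samples are assumed to be available (e.g. from adjoint or finite-difference evaluations of plant power with respect to the axial induction factors), and the quadratic fit in a single active variable mirrors the paper's reported one-dimensional dominant direction.

    ```python
    import numpy as np

    def active_subspace(gradients, n_keep=1):
        """Estimate an active subspace from sampled gradients of the QoI."""
        G = np.asarray(gradients)                 # (n_samples, n_inputs)
        C = G.T @ G / G.shape[0]                  # empirical E[grad grad^T]
        eigvals, eigvecs = np.linalg.eigh(C)
        W = eigvecs[:, np.argsort(eigvals)[::-1][:n_keep]]
        return W                                  # x -> active vars W.T @ x

    def fit_quadratic_surrogate(X, y, W):
        """Quadratic polynomial surrogate in the (here 1-D) active variable."""
        t = (X @ W).ravel()
        coeffs = np.polyfit(t, y, deg=2)
        return lambda x_new: np.polyval(coeffs, (np.atleast_2d(x_new) @ W).ravel())
    ```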

  11. Evaluating surrogate endpoints, prognostic markers, and predictive markers: Some simple themes.

    PubMed

    Baker, Stuart G; Kramer, Barnett S

    2015-08-01

    A surrogate endpoint is an endpoint observed earlier than the true endpoint (a health outcome) that is used to draw conclusions about the effect of treatment on the unobserved true endpoint. A prognostic marker is a marker for predicting the risk of an event given a control treatment; it informs treatment decisions when there is information on anticipated benefits and harms of a new treatment applied to persons at high risk. A predictive marker is a marker for predicting the effect of treatment on outcome in a subgroup of patients or study participants; it provides more rigorous information for treatment selection than a prognostic marker when it is based on estimated treatment effects in a randomized trial. We organized our discussion around a different theme for each topic. "Fundamentally an extrapolation" refers to the non-statistical considerations and assumptions needed when using surrogate endpoints to evaluate a new treatment. "Decision analysis to the rescue" refers to the use of decision analysis to evaluate an additional prognostic marker because it is not possible to choose between purely statistical measures of marker performance. "The appeal of simplicity" refers to a straightforward and efficient use of a single randomized trial to evaluate overall treatment effect and treatment effect within subgroups using predictive markers. The simple themes provide a general guideline for evaluation of surrogate endpoints, prognostic markers, and predictive markers. © The Author(s) 2014.

  12. Surrogate based wind farm layout optimization using manifold mapping

    NASA Astrophysics Data System (ADS)

    Kaja Kamaludeen, Shaafi M.; van Zuijle, Alexander; Bijl, Hester

    2016-09-01

    The high computational cost associated with high-fidelity wake models such as RANS or LES is the primary bottleneck to performing direct high-fidelity wind farm layout optimization (WFLO) using accurate CFD-based wake models. Therefore, a surrogate-based multi-fidelity WFLO methodology (SWFLO) is proposed. The surrogate model is built using an SBO method referred to as manifold mapping (MM). As a verification, optimization of the spacing between two staggered wind turbines was performed using the proposed surrogate-based methodology, and the performance was compared with that of direct optimization using the high-fidelity model. A significant reduction in computational cost was achieved using MM: a maximum computational cost reduction of 65%, while arriving at the same optima as direct high-fidelity optimization. The similarity between the responses of the models and the number and position of the mapping points strongly influence the computational efficiency of the proposed method. As a proof of concept, realistic WFLO of a small 7-turbine wind farm is performed using the proposed surrogate-based methodology. Two variants of the Jensen wake model with different decay coefficients were used as the fine and coarse models. The proposed SWFLO method arrived at the same optima as the fine model with far fewer fine-model simulations.

  13. Fast Prediction and Evaluation of Gravitational Waveforms Using Surrogate Models

    NASA Astrophysics Data System (ADS)

    Field, Scott E.; Galley, Chad R.; Hesthaven, Jan S.; Kaye, Jason; Tiglio, Manuel

    2014-07-01

    We propose a solution to the problem of quickly and accurately predicting gravitational waveforms within any given physical model. The method is relevant for both real-time applications and more traditional scenarios where the generation of waveforms using standard methods can be prohibitively expensive. Our approach is based on three offline steps resulting in an accurate reduced order model in both parameter and physical dimensions that can be used as a surrogate for the true or fiducial waveform family. First, a set of m parameter values is determined using a greedy algorithm from which a reduced basis representation is constructed. Second, these m parameters induce the selection of m time values for interpolating a waveform time series using an empirical interpolant that is built for the fiducial waveform family. Third, a fit in the parameter dimension is performed for the waveform's value at each of these m times. The cost of predicting L waveform time samples for a generic parameter choice is of order O(mL + m c_fit) online operations, where c_fit denotes the fitting function operation count and, typically, m ≪ L. The result is a compact, computationally efficient, and accurate surrogate model that retains the original physics of the fiducial waveform family while also being fast to evaluate. We generate accurate surrogate models for effective-one-body waveforms of nonspinning binary black hole coalescences with durations as long as 10^5 M, mass ratios from 1 to 10, and for multiple spherical harmonic modes. We find that these surrogates are more than 3 orders of magnitude faster to evaluate as compared to the cost of generating effective-one-body waveforms in standard ways. Surrogate model building for other waveform families and models follows the same steps and has the same low computational online scaling cost. For expensive numerical simulations of binary black hole coalescences, we thus anticipate extremely large speedups in generating new waveforms with a surrogate. As waveform generation is one of the dominant costs in parameter estimation algorithms and parameter space exploration, surrogate models offer a new and practical way to dramatically accelerate such studies without impacting accuracy. Surrogates built in this paper, as well as others, are available from GWSurrogate, a publicly available Python package.
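
    Of the three offline steps, the first (greedy selection of a reduced basis from a set of candidate waveforms) is easy to sketch; the empirical interpolation and parameter-space fitting steps are omitted here. The projection-error greedy loop below is a generic reduced-basis sketch, not the specific implementation behind these surrogates, and a discrete Euclidean inner product stands in for the physical one.

    ```python
    import numpy as np

    def greedy_reduced_basis(training_waveforms, tol=1e-6):
        """Greedily build an orthonormal reduced basis from training
        waveforms (rows), driven by the maximum projection error."""
        W = np.asarray(training_waveforms)
        norms = np.linalg.norm(W, axis=1)
        basis = [W[np.argmax(norms)] / norms.max()]
        while True:
            B = np.array(basis)                     # orthonormal rows
            residual = W - (W @ B.conj().T) @ B     # remove span(B) part
            errors = np.linalg.norm(residual, axis=1)
            if errors.max() < tol:
                return B
            worst = np.argmax(errors)
            basis.append(residual[worst] / errors[worst])
    ```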

  14. Sobol' sensitivity analysis of NAPL-contaminated aquifer remediation process based on multiple surrogates

    NASA Astrophysics Data System (ADS)

    Luo, Jiannan; Lu, Wenxi

    2014-06-01

    Sobol' sensitivity analyses based on different surrogates were performed on a trichloroethylene (TCE)-contaminated aquifer to assess the sensitivity of the design variables of remediation duration, surfactant concentration, and injection rates at four wells to remediation efficiency. First, the surrogate models of a multi-phase flow simulation model were constructed by applying radial basis function artificial neural network (RBFANN) and Kriging methods, and the two models were then compared. Based on the developed surrogate models, the Sobol' method was used to calculate the sensitivity indices of the design variables which affect the remediation efficiency. The coefficient of determination (R^2) and the mean square error (MSE) of these two surrogate models demonstrated that both models had acceptable approximation accuracy; furthermore, the approximation accuracy of the Kriging model was slightly better than that of the RBFANN model. Sobol' sensitivity analysis results demonstrated that the remediation duration was the most important variable influencing remediation efficiency, followed by rates of injection at wells 1 and 3, while rates of injection at wells 2 and 4 and the surfactant concentration had negligible influence on remediation efficiency. In addition, high-order sensitivity indices were all smaller than 0.01, which indicates that interaction effects of these six factors were practically insignificant. The proposed surrogate-based Sobol' sensitivity analysis is an effective tool for calculating sensitivity indices, because it shows the relative contribution of the design variables (individuals and interactions) to the output performance variability with a limited number of runs of a computationally expensive simulation model. The sensitivity analysis results lay a foundation for optimizing the groundwater remediation process.
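
    Once a cheap surrogate is available, first-order Sobol' indices can be estimated by plain Monte Carlo at negligible cost. The pick-and-freeze estimator below is a generic Saltelli-type sketch; the paper's RBFANN and Kriging surrogates are hidden behind the `surrogate` callable, and the sampler for the six design variables is assumed.

    ```python
    import numpy as np

    def first_order_sobol(surrogate, sampler, n=10000):
        """First-order Sobol' indices via a pick-and-freeze MC estimator.

        surrogate: maps an (n, d) input array to n outputs.
        sampler(n): draws (n, d) independent input samples.
        """
        A, B = sampler(n), sampler(n)
        fA, fB = surrogate(A), surrogate(B)
        var = np.var(np.concatenate([fA, fB]), ddof=1)

        indices = []
        for i in range(A.shape[1]):
            ABi = A.copy()
            ABi[:, i] = B[:, i]                 # swap in column i from B
            fABi = surrogate(ABi)
            indices.append(np.mean(fB * (fABi - fA)) / var)
        return np.array(indices)
    ```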

  15. Unified functional network and nonlinear time series analysis for complex systems science: The pyunicorn package

    NASA Astrophysics Data System (ADS)

    Donges, Jonathan; Heitzig, Jobst; Beronov, Boyan; Wiedermann, Marc; Runge, Jakob; Feng, Qing Yi; Tupikina, Liubov; Stolbova, Veronika; Donner, Reik; Marwan, Norbert; Dijkstra, Henk; Kurths, Jürgen

    2016-04-01

    We introduce the pyunicorn (Pythonic unified complex network and recurrence analysis toolbox) open source software package for applying and combining modern methods of data analysis and modeling from complex network theory and nonlinear time series analysis. pyunicorn is a fully object-oriented and easily parallelizable package written in the language Python. It allows for the construction of functional networks such as climate networks in climatology or functional brain networks in neuroscience representing the structure of statistical interrelationships in large data sets of time series and, subsequently, investigating this structure using advanced methods of complex network theory such as measures and models for spatial networks, networks of interacting networks, node-weighted statistics, or network surrogates. Additionally, pyunicorn provides insights into the nonlinear dynamics of complex systems as recorded in uni- and multivariate time series from a non-traditional perspective by means of recurrence quantification analysis, recurrence networks, visibility graphs, and construction of surrogate time series. The range of possible applications of the library is outlined, drawing on several examples mainly from the field of climatology. pyunicorn is available online at https://github.com/pik-copan/pyunicorn. Reference: J.F. Donges, J. Heitzig, B. Beronov, M. Wiedermann, J. Runge, Q.-Y. Feng, L. Tupikina, V. Stolbova, R.V. Donner, N. Marwan, H.A. Dijkstra, and J. Kurths, Unified functional network and nonlinear time series analysis for complex systems science: The pyunicorn package, Chaos 25, 113101 (2015), DOI: 10.1063/1.4934554, Preprint: arxiv.org:1507.01571 [physics.data-an].

  16. Reduced cost mission design using surrogate models

    NASA Astrophysics Data System (ADS)

    Feldhacker, Juliana D.; Jones, Brandon A.; Doostan, Alireza; Hampton, Jerrad

    2016-01-01

    This paper uses surrogate models to reduce the computational cost associated with spacecraft mission design in three-body dynamical systems. Sampling-based least squares regression is used to project the system response onto a set of orthogonal bases, providing a representation of the ΔV required for rendezvous as a reduced-order surrogate model. Models are presented for mid-field rendezvous of spacecraft in orbits in the Earth-Moon circular restricted three-body problem, including a halo orbit about the Earth-Moon L2 libration point (EML-2) and a distant retrograde orbit (DRO) about the Moon. In each case, the initial position of the spacecraft, the time of flight, and the separation between the chaser and the target vehicles are all considered as design inputs. The results show that sample sizes on the order of 10^2 are sufficient to produce accurate surrogates, with RMS errors reaching 0.2 m/s for the halo orbit and falling below 0.01 m/s for the DRO. A single function call to the resulting surrogate is up to two orders of magnitude faster than computing the same solution using full fidelity propagators. The expansion coefficients solved for in the surrogates are then used to conduct a global sensitivity analysis of the ΔV on each of the input parameters, which identifies the separation between the spacecraft as the primary contributor to the ΔV cost. Finally, the models are demonstrated to be useful for cheap evaluation of the cost function in constrained optimization problems seeking to minimize the ΔV required for rendezvous. These surrogate models show significant advantages for mission design in three-body systems, in terms of both computational cost and capabilities, over traditional Monte Carlo methods.
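
    The sampling-based least-squares projection onto orthogonal bases can be sketched with a total-degree Legendre basis; the basis family, truncation degree, and the rescaling of inputs to [-1, 1] are illustrative assumptions rather than the paper's exact expansion.

    ```python
    import itertools
    import numpy as np
    from numpy.polynomial.legendre import legval

    def fit_surrogate(X, y, degree=3):
        """Least-squares projection of a scalar response (e.g. rendezvous
        Delta-V) onto a total-degree multivariate Legendre basis.
        X must be scaled to [-1, 1] per input."""
        d = X.shape[1]
        multis = [m for m in itertools.product(range(degree + 1), repeat=d)
                  if sum(m) <= degree]

        def design(Xs):
            cols = []
            for m in multis:
                col = np.ones(Xs.shape[0])
                for j, order in enumerate(m):
                    c = np.zeros(order + 1)
                    c[order] = 1.0
                    col *= legval(Xs[:, j], c)   # univariate Legendre P_order
                cols.append(col)
            return np.column_stack(cols)

        coeffs, *_ = np.linalg.lstsq(design(X), y, rcond=None)
        return lambda Xnew: design(np.atleast_2d(Xnew)) @ coeffs
    ```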

  17. Dynamic role and importance of surrogate species for assessing potential adverse environmental impacts of genetically engineered insect-resistant plants on non-target organisms.

    PubMed

    Wach, Michael; Hellmich, Richard L; Layton, Raymond; Romeis, Jörg; Gadaleta, Patricia G

    2016-08-01

    Surrogate species have a long history of use in research and regulatory settings to understand the potentially harmful effects of toxic substances including pesticides. More recently, surrogate species have been used to evaluate the potential effects of proteins contained in genetically engineered insect resistant (GEIR) crops. Species commonly used in GEIR crop testing include beneficial organisms such as honeybees, arthropod predators, and parasitoids. The choice of appropriate surrogates is influenced by scientific factors such as the knowledge of the mode of action and the spectrum of activity as well as societal factors such as protection goals that assign value to certain ecosystem services such as pollination or pest control. The primary reasons for using surrogates include the inability to test all possible organisms, the restrictions on using certain organisms in testing (e.g., rare, threatened, or endangered species), and the ability to achieve greater sensitivity and statistical power by using laboratory testing of certain species. The acceptance of surrogate species data can allow results from one region to be applied or "transported" for use in another region. On the basis of over a decade of using surrogate species to evaluate potential effects of GEIR crops, it appears that the current surrogates have worked well to predict effects of GEIR crops that have been developed (Carstens et al. GM Crops Food 5:1-5, 2014), and it is expected that they should work well to predict effects of future GEIR crops based on similar technologies.

  18. Surrogate Based Uni/Multi-Objective Optimization and Distribution Estimation Methods

    NASA Astrophysics Data System (ADS)

    Gong, W.; Duan, Q.; Huo, X.

    2017-12-01

    Parameter calibration has been demonstrated as an effective way to improve the performance of dynamic models, such as hydrological models, land surface models, and weather and climate models. Traditional optimization algorithms usually require a huge number of model evaluations, making dynamic model calibration very difficult, or even computationally prohibitive. With the help of a series of recently developed adaptive surrogate-modelling-based optimization methods--the uni-objective optimization method ASMO, the multi-objective optimization method MO-ASMO, and the probability distribution estimation method ASMO-PODE--the number of model evaluations can be reduced to several hundred, making it possible to calibrate very expensive dynamic models, such as regional high-resolution land surface models, weather forecast models such as WRF, and intermediate-complexity earth system models such as LOVECLIM. This presentation provides a brief introduction to the common framework of the adaptive surrogate-based optimization algorithms ASMO, MO-ASMO and ASMO-PODE, a case study of Common Land Model (CoLM) calibration in the Heihe river basin in Northwest China, and an outlook on potential applications of the surrogate-based optimization methods.

  19. Recovery Efficiency, False Negative Rate, and Limit of Detection Performance of a Validated Macrofoam-Swab Sampling Method with Low Surface Concentrations of Two Bacillus anthracis Surrogates

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Piepel, Gregory F.; Hutchison, Janine R.; Kaiser, Brooke L. D.

    The performance of a macrofoam-swab sampling method was evaluated using Bacillus anthracis Sterne (BAS) and Bacillus atrophaeus Nakamura (BG) spores applied at nine low target amounts (2-500 spores) to positive-control plates and test coupons (2 in × 2 in) of four surface materials (glass, stainless steel, vinyl tile, and plastic). Test results from cultured samples were used to evaluate the effects of surrogate, surface concentration, and surface material on recovery efficiency (RE), false negative rate (FNR), and limit of detection. For RE, surrogate and surface material had statistically significant effects, but concentration did not. Mean REs were the lowest for vinyl tile (50.8% with BAS, 40.2% with BG) and the highest for glass (92.8% with BAS, 71.4% with BG). FNR values ranged from 0 to 0.833 for BAS and 0 to 0.806 for BG, with values increasing as concentration decreased in the range tested (0.078 to 19.375 CFU/cm2, where CFU denotes ‘colony forming units’). Surface material also had a statistically significant effect. An FNR-concentration curve was fit for each combination of surrogate and surface material. For both surrogates, the FNR curves tended to be the lowest for glass and highest for vinyl tile. The FNR curves for BG tended to be higher than for BAS at lower concentrations, especially for glass. Results using a modified Rapid Viability-Polymerase Chain Reaction (mRV-PCR) analysis method were also obtained. The mRV-PCR results and comparisons to the culture results are discussed in a separate report.

  20. Recovery Efficiency, False Negative Rate, and Limit of Detection Performance of a Validated Macrofoam-Swab Sampling Method with Low Surface Concentrations of Two Bacillus anthracis Surrogates

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Piepel, Gregory F.; Hutchison, Janine R.; Deatherage Kaiser, Brooke L

    The performance of a macrofoam-swab sampling method was evaluated using Bacillus anthracis Sterne (BAS) and Bacillus atrophaeus Nakamura (BG) spores applied at nine low target amounts (2-500 spores) to positive-control plates and test coupons (2 in. × 2 in.) of four surface materials (glass, stainless steel, vinyl tile, and plastic). Test results from cultured samples were used to evaluate the effects of surrogate, surface concentration, and surface material on recovery efficiency (RE), false negative rate (FNR), and limit of detection. For RE, surrogate and surface material had statistically significant effects, but concentration did not. Mean REs were the lowest for vinyl tile (50.8% with BAS, 40.2% with BG) and the highest for glass (92.8% with BAS, 71.4% with BG). FNR values ranged from 0 to 0.833 for BAS and 0 to 0.806 for BG, with values increasing as concentration decreased in the range tested (0.078 to 19.375 CFU/cm2, where CFU denotes ‘colony forming units’). Surface material also had a statistically significant effect. An FNR-concentration curve was fit for each combination of surrogate and surface material. For both surrogates, the FNR curves tended to be the lowest for glass and highest for vinyl tile. The FNR curves for BG tended to be higher than for BAS at lower concentrations, especially for glass. Results using a modified Rapid Viability-Polymerase Chain Reaction (mRV-PCR) analysis method were also obtained. The mRV-PCR results and comparisons to the culture results will be discussed in a subsequent report.

  1. Surrogate endpoints for overall survival in advanced colorectal cancer: a clinician's perspective.

    PubMed

    Piedbois, Pascal; Miller Croswell, Jennifer

    2008-10-01

    Surrogate endpoints in oncology research and practice have garnered increasing attention over the past two decades. This activity has largely been driven by the promise surrogate endpoints appear to hold: the potential to get new therapies to seriously ill patients more rapidly. However, uncertainties abound. Even agreeing upon a definition of a "valid" surrogate endpoint has not been a straightforward exercise; this article begins by highlighting differences in how this term has been previously captured and applied, as well as laying out the basic criteria essential for its application in advanced colorectal cancer. Ideally, these elements include (but are not limited to) ease of measurement, rapid indication of treatment effect, and, most importantly, reliable and consistent prediction of the true impact of a treatment on the ultimate outcome of interest: overall survival. The strengths and weaknesses of current potential surrogate endpoints in advanced colorectal cancer, including performance status, carcinoembryonic antigen plasma level, overall response rate, time to progression, and disease-free survival, are each considered in turn. Finally, limitations of surrogate endpoints in the clinical setting, including challenges in extrapolation to new therapies, and the incomplete provision of information about potential adverse effects, are discussed. Work remains to be done between physicians and statisticians to bridge the gap between that which is statistically demonstrable and that which will be clinically useful. The term 'surrogate endpoint' was virtually unknown by most oncologists 15 years ago. A search in PubMed [http://www.ncbi.nlm.nih.gov] based on the words 'surrogate and cancer' shows that more than 2000 papers were published in medical journals in the last 20 years, with a dramatic increase of interest in the last five years. Interestingly, the same trend is observed when the words 'surrogate and heart' are entered into PubMed, suggesting that the issue of surrogate endpoints goes beyond the field of oncology, although the frequency of discussion varies (Figure 1; note different y-axis scales for oncology and cardiology). The goal of the present paper is to discuss the main issues surrounding surrogate endpoints from a clinician's point of view, using as an example surrogate endpoints of overall survival (OS) in advanced colorectal cancer (ACC).

  2. Surrogacy assessment using principal stratification with multivariate normal and Gaussian copula models.

    PubMed

    Taylor, Jeremy M G; Conlon, Anna S C; Elliott, Michael R

    2015-08-01

    The validation of intermediate markers as surrogate markers (S) for the true outcome of interest (T) in clinical trials offers the possibility for trials to be run more quickly and cheaply by using the surrogate endpoint in place of the true endpoint. Working within a principal stratification framework, we propose causal quantities to evaluate surrogacy using a Gaussian copula model for an ordinal surrogate and time-to-event final outcome. The methods are applied to data from four colorectal cancer clinical trials, where S is tumor response and T is overall survival. For the Gaussian copula model, a Bayesian estimation strategy is used and, as some parameters are not identifiable from the data, we explore the use of informative priors that are consistent with reasonable assumptions in the surrogate marker setting to aid in estimation. While there is some bias in the estimation of the surrogacy quantities of interest, the estimation procedure does reasonably well at distinguishing between poor and good surrogate markers. Some of the parameters of the proposed model are not identifiable from the data, and therefore, assumptions must be made in order to aid in their estimation. The proposed quantities can be used in combination to provide evidence about the validity of S as a surrogate marker for T. © The Author(s) 2014.

  3. Surrogate safety measures from traffic simulation models

    DOT National Transportation Integrated Search

    2003-01-01

    This project investigates the potential for deriving surrogate measures of safety from existing microscopic traffic simulation models for intersections. The process of computing the measures in the simulation, extracting the required data, and summar...

  4. Uncertainty in the Bayesian meta-analysis of normally distributed surrogate endpoints

    PubMed Central

    Thompson, John R; Spata, Enti; Abrams, Keith R

    2015-01-01

    We investigate the effect of the choice of parameterisation of meta-analytic models and related uncertainty on the validation of surrogate endpoints. Different meta-analytical approaches take into account different levels of uncertainty which may impact on the accuracy of the predictions of treatment effect on the target outcome from the treatment effect on a surrogate endpoint obtained from these models. A range of Bayesian as well as frequentist meta-analytical methods are implemented using illustrative examples in relapsing–remitting multiple sclerosis, where the treatment effect on disability worsening is the primary outcome of interest in healthcare evaluation, while the effect on relapse rate is considered as a potential surrogate to the effect on disability progression, and in gastric cancer, where the disease-free survival has been shown to be a good surrogate endpoint to the overall survival. Sensitivity analysis was carried out to assess the impact of distributional assumptions on the predictions. Also, sensitivity to modelling assumptions and performance of the models were investigated by simulation. Although different methods can predict mean true outcome almost equally well, inclusion of uncertainty around all relevant parameters of the model may lead to less certain and hence more conservative predictions. When investigating endpoints as candidate surrogate outcomes, a careful choice of the meta-analytical approach has to be made. Models underestimating the uncertainty of available evidence may lead to overoptimistic predictions which can then have an effect on decisions made based on such predictions. PMID:26271918

  5. Uncertainty in the Bayesian meta-analysis of normally distributed surrogate endpoints.

    PubMed

    Bujkiewicz, Sylwia; Thompson, John R; Spata, Enti; Abrams, Keith R

    2017-10-01

    We investigate the effect of the choice of parameterisation of meta-analytic models and related uncertainty on the validation of surrogate endpoints. Different meta-analytical approaches take into account different levels of uncertainty which may impact on the accuracy of the predictions of treatment effect on the target outcome from the treatment effect on a surrogate endpoint obtained from these models. A range of Bayesian as well as frequentist meta-analytical methods are implemented using illustrative examples in relapsing-remitting multiple sclerosis, where the treatment effect on disability worsening is the primary outcome of interest in healthcare evaluation, while the effect on relapse rate is considered as a potential surrogate to the effect on disability progression, and in gastric cancer, where the disease-free survival has been shown to be a good surrogate endpoint to the overall survival. Sensitivity analysis was carried out to assess the impact of distributional assumptions on the predictions. Also, sensitivity to modelling assumptions and performance of the models were investigated by simulation. Although different methods can predict mean true outcome almost equally well, inclusion of uncertainty around all relevant parameters of the model may lead to less certain and hence more conservative predictions. When investigating endpoints as candidate surrogate outcomes, a careful choice of the meta-analytical approach has to be made. Models underestimating the uncertainty of available evidence may lead to overoptimistic predictions which can then have an effect on decisions made based on such predictions.

  6. Vitamin D Supplementation Does Not Impact Insulin Resistance in Black and White Children.

    PubMed

    Ferira, Ashley J; Laing, Emma M; Hausman, Dorothy B; Hall, Daniel B; McCabe, George P; Martin, Berdine R; Hill Gallant, Kathleen M; Warden, Stuart J; Weaver, Connie M; Peacock, Munro; Lewis, Richard D

    2016-04-01

    Vitamin D supplementation trials with diabetes-related outcomes have been conducted almost exclusively in adults and provide equivocal findings. The objective of this study was to determine the dose-response of vitamin D supplementation on fasting glucose, insulin, and a surrogate measure of insulin resistance in white and black children aged 9–13 years, who participated in the Georgia, Purdue, and Indiana University (or GAPI) trial: a 12-week multisite, randomized, triple-masked, dose-response, placebo-controlled vitamin D trial. Black and white children in the early stages of puberty (N = 323, 50% male, 51% black) were equally randomized to receive vitamin D3 (0, 400, 1000, 2000, or 4000 IU/day) for 12 weeks. Fasting serum 25-hydroxyvitamin D (25(OH)D), glucose and insulin were assessed at baseline and weeks 6 and 12. Homeostasis model assessment of insulin resistance was used as a surrogate measure of insulin resistance. Statistical analyses were conducted as intent-to-treat using a mixed effects model. Baseline serum 25(OH)D was inversely associated with insulin (r = −0.140, P = 0.017) and homeostasis model assessment of insulin resistance (r = −0.146, P = 0.012) after adjusting for race, sex, age, pubertal maturation, fat mass, and body mass index. Glucose, insulin, and insulin resistance increased (F > 5.79, P < .003) over the 12 weeks, despite vitamin D dose-dependent increases in serum 25(OH)D. Despite significant baseline inverse relationships between serum 25(OH)D and measures of insulin resistance, vitamin D supplementation had no impact on fasting glucose, insulin, or a surrogate measure of insulin resistance over 12 weeks in apparently healthy children.
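
    For readers who want the surrogate measure used here made concrete, the short sketch below computes HOMA-IR from fasting glucose and insulin. The formula (glucose in mmol/L times insulin in uU/mL, divided by 22.5) is standard; the example values and the assumption that glucose is reported in mg/dL are hypothetical.

        # Minimal sketch: homeostasis model assessment of insulin resistance (HOMA-IR),
        # the surrogate measure of insulin resistance referenced above.
        def homa_ir(glucose_mg_dl: float, insulin_uU_ml: float) -> float:
            """HOMA-IR = fasting glucose (mmol/L) x fasting insulin (uU/mL) / 22.5."""
            glucose_mmol_l = glucose_mg_dl / 18.0  # convert mg/dL to mmol/L
            return glucose_mmol_l * insulin_uU_ml / 22.5

        print(round(homa_ir(90.0, 10.0), 2))  # hypothetical child: ~2.22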

  7. Towards Principled Experimental Study of Autonomous Mobile Robots

    NASA Technical Reports Server (NTRS)

    Gat, Erann

    1995-01-01

    We review the current state of research in autonomous mobile robots and conclude that there is an inadequate basis for predicting the reliability and behavior of robots operating in unengineered environments. We present a new approach to the study of autonomous mobile robot performance based on formal statistical analysis of independently reproducible experiments conducted on real robots. Simulators serve as models rather than experimental surrogates. We demonstrate three new results: 1) two commonly used performance metrics (time and distance) are not as well correlated as is often tacitly assumed; 2) the probability distributions of these performance metrics are exponential rather than normal; and 3) a modular, object-oriented simulation accurately predicts the behavior of the real robot in a statistically significant manner.

  8. Statistical Emulator for Expensive Classification Simulators

    NASA Technical Reports Server (NTRS)

    Ross, Jerret; Samareh, Jamshid A.

    2016-01-01

    Expensive simulators prevent meaningful analysis from being performed on the phenomena they model. To get around this problem, the concept of using a statistical emulator as a surrogate representation of the simulator was introduced in the 1980s. Simulators have since become more and more complex, and as a result running a single case can be very expensive, taking days, weeks, or even months. Many new techniques, termed criteria, have been introduced that sequentially select the next best (most informative to the emulator) point to run on the simulator. These criteria methods allow an emulator to be built from only a small number of simulator runs. We follow and extend this framework to expensive classification simulators.

  9. A review on design of experiments and surrogate models in aircraft real-time and many-query aerodynamic analyses

    NASA Astrophysics Data System (ADS)

    Yondo, Raul; Andrés, Esther; Valero, Eusebio

    2018-01-01

    Full-scale aerodynamic wind tunnel testing, numerical simulation of high-dimensional (full-order) aerodynamic models, and flight testing are some of the fundamental but complex steps in the various design phases of recent civil transport aircraft. Current aircraft aerodynamic designs have increased in complexity (multidisciplinary, multi-objective or multi-fidelity) and need to address the challenges posed by the nonlinearity of the objective functions and constraints, uncertainty quantification in aerodynamic problems, and restrained computational budgets. With the aim of reducing the computational burden and generating low-cost but accurate models that mimic the full-order models at different values of the design variables, recent work has introduced surrogate-based approaches into real-time and many-query analyses as rapid and cheaper-to-simulate models. In this paper, a comprehensive and state-of-the-art survey of common surrogate modeling techniques and surrogate-based optimization methods is given, with an emphasis on model selection and validation, dimensionality reduction, sensitivity analyses, constraint handling, and infill and stopping criteria. Benefits, drawbacks and comparative discussions in applying those methods are described. Furthermore, the paper familiarizes readers with surrogate models that have been successfully applied to the general field of fluid dynamics, but not yet in the aerospace industry. Additionally, the review revisits the most popular sampling strategies used in conducting physical and simulation-based experiments in aircraft aerodynamic design. Attractive or smart designs infrequently used in the field and discussions of advanced sampling methodologies are presented, to give a glance at the various efficient possibilities for a priori sampling of the parameter space. Closing remarks focus on future perspectives, challenges and shortcomings associated with the use of surrogate models by industrial aircraft aerodynamicists, despite their increased interest among the research communities.

  10. Mitigating Errors in External Respiratory Surrogate-Based Models of Tumor Position

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Malinowski, Kathleen T.; Fischell Department of Bioengineering, University of Maryland, College Park, MD; McAvoy, Thomas J.

    2012-04-01

    Purpose: To investigate the effect of tumor site, measurement precision, tumor-surrogate correlation, training data selection, model design, and interpatient and interfraction variations on the accuracy of external marker-based models of tumor position. Methods and Materials: Cyberknife Synchrony system log files comprising synchronously acquired positions of external markers and the tumor from 167 treatment fractions were analyzed. The accuracy of Synchrony, ordinary-least-squares regression, and partial-least-squares regression models for predicting the tumor position from the external markers was evaluated. The quantity and timing of the data used to build the predictive model were varied. The effects of tumor-surrogate correlation and the precision in both the tumor and the external surrogate position measurements were explored by adding noise to the data. Results: The tumor position prediction errors increased during the duration of a fraction. Increasing the training data quantities did not always lead to more accurate models. Adding uncorrelated noise to the external marker-based inputs degraded the tumor-surrogate correlation models by 16% for partial-least-squares and 57% for ordinary-least-squares. External marker and tumor position measurement errors led to tumor position prediction changes 0.3-3.6 times the magnitude of the measurement errors, varying widely with model algorithm. The tumor position prediction errors were significantly associated with the patient index but not with the fraction index or tumor site. Partial-least-squares was as accurate as Synchrony and more accurate than ordinary-least-squares. Conclusions: The accuracy of surrogate-based inferential models of tumor position was affected by all the investigated factors, except for the tumor site and fraction index.
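
    As an illustration of the modeling approach compared above, the sketch below fits a partial-least-squares regression from external marker positions to tumor position. It uses synthetic sinusoidal breathing traces rather than Cyberknife Synchrony log files, and the marker phases and noise level are invented for the example.

        # Minimal sketch (not the clinical algorithm): predict 3-D tumor position from
        # three external-marker signals with partial least squares on synthetic data.
        import numpy as np
        from sklearn.cross_decomposition import PLSRegression

        rng = np.random.default_rng(0)
        t = np.linspace(0, 60, 600)                       # one minute sampled at 10 Hz
        markers = np.column_stack([np.sin(2 * np.pi * 0.25 * t + p) for p in (0.0, 0.3, 0.6)])
        markers += 0.05 * rng.standard_normal(markers.shape)   # surrogate measurement noise
        tumor = np.column_stack([1.5 * np.sin(2 * np.pi * 0.25 * t + 0.2),
                                 0.8 * np.sin(2 * np.pi * 0.25 * t + 0.4),
                                 0.3 * np.sin(2 * np.pi * 0.25 * t + 0.1)])

        model = PLSRegression(n_components=2).fit(markers[:100], tumor[:100])  # train on early data
        pred = model.predict(markers[100:])
        rmse = np.sqrt(np.mean((pred - tumor[100:]) ** 2))
        print(f"held-out RMSE: {rmse:.3f} (arbitrary units)")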

  11. Analysis of a Multi-Fidelity Surrogate for Handling Real Gas Equations of State

    NASA Astrophysics Data System (ADS)

    Ouellet, Frederick; Park, Chanyoung; Rollin, Bertrand; Balachandar, S.

    2017-06-01

    The explosive dispersal of particles is a complex multiphase and multi-species fluid flow problem. In these flows, the detonation products of the explosive must be treated as real gas while the ideal gas equation of state is used for the surrounding air. As the products expand outward from the detonation point, they mix with ambient air and create a mixing region where both state equations must be satisfied. One of the most accurate, yet computationally expensive, methods to handle this problem is an algorithm that iterates between both equations of state until pressure and thermal equilibrium are achieved inside of each computational cell. This work aims to use a multi-fidelity surrogate model to replace this process. A Kriging model is used to produce a curve fit which interpolates selected data from the iterative algorithm using Bayesian statistics. We study the model performance with respect to the iterative method in simulations using a finite volume code. The model's (i) computational speed, (ii) memory requirements and (iii) computational accuracy are analyzed to show the benefits of this novel approach. Also, optimizing the combination of model accuracy and computational speed through the choice of sampling points is explained. This work was supported by the U.S. Department of Energy, National Nuclear Security Administration, Advanced Simulation and Computing Program as a Cooperative Agreement under the Predictive Science Academic Alliance Program under Contract No. DE-NA0002378.
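
    The idea of replacing the iterative equilibrium solve with a Kriging fit can be sketched in a few lines. The Gaussian-process model below is trained on a cheap analytic stand-in for the expensive pressure/temperature iteration; the function, kernel length scale, and sampling range are all placeholder choices, not values from the paper.

        # Minimal sketch: Kriging (Gaussian process) surrogate for an "expensive" solve.
        import numpy as np
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import RBF

        def expensive_equilibrium(x):
            """Placeholder for the iterative two-equation-of-state equilibrium."""
            return np.sin(3 * x) + 0.5 * x ** 2

        X_train = np.linspace(0.0, 2.0, 12).reshape(-1, 1)      # sampled states
        y_train = expensive_equilibrium(X_train).ravel()

        gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.5), normalize_y=True)
        gp.fit(X_train, y_train)

        X_test = np.linspace(0.0, 2.0, 200).reshape(-1, 1)
        y_pred, y_std = gp.predict(X_test, return_std=True)     # prediction + uncertainty
        print("max abs error:", float(np.max(np.abs(y_pred - expensive_equilibrium(X_test).ravel()))))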

  12. Biomarkers and surrogate endpoints in kidney disease.

    PubMed

    Hartung, Erum A

    2016-03-01

    Kidney disease and its related comorbidities impose a large public health burden. Despite this, the number of clinical trials in nephrology lags behind many other fields. An important factor contributing to the relatively slow pace of nephrology trials is that existing clinical endpoints have significant limitations. "Hard" endpoints for chronic kidney disease, such as progression to end-stage renal disease, may not be reached for decades. Traditional biomarkers, such as serum creatinine in acute kidney injury, may lack sensitivity and predictive value. Finding new biomarkers to serve as surrogate endpoints is therefore an important priority in kidney disease research and may help to accelerate nephrology clinical trials. In this paper, I first review key concepts related to the selection of clinical trial endpoints and discuss statistical and regulatory considerations related to the evaluation of biomarkers as surrogate endpoints. This is followed by a discussion of the challenges and opportunities in developing novel biomarkers and surrogate endpoints in three major areas of nephrology research: acute kidney injury, chronic kidney disease, and autosomal dominant polycystic kidney disease.

  13. A multi-damages identification method for cantilever beam based on mode shape curvatures and Kriging surrogate model

    NASA Astrophysics Data System (ADS)

    Xie, Fengle; Jiang, Zhansi; Jiang, Hui

    2018-05-01

    This paper presents a multi-damage identification method for cantilever beams. First, the damage locations are identified using the mode shape curvatures. Second, samples of varying damage severities at the identified locations and their corresponding natural frequencies are used to construct the initial Kriging surrogate model. Then a particle swarm optimization (PSO) algorithm is employed to identify the damage severities based on the Kriging surrogate model. A simulation study of a double-damaged cantilever beam demonstrated that the proposed method is effective.

  14. Regression modeling of particle size distributions in urban storm water: advancements through improved sample collection methods

    USGS Publications Warehouse

    Fienen, Michael N.; Selbig, William R.

    2012-01-01

    A new sample collection system was developed to improve the representation of sediment entrained in urban storm water by integrating water quality samples from the entire water column. The depth-integrated sampler arm (DISA) was able to mitigate sediment stratification bias in storm water, thereby improving the characterization of suspended-sediment concentration and particle size distribution at three independent study locations. Use of the DISA decreased variability, which improved statistical regression to predict particle size distribution using surrogate environmental parameters, such as precipitation depth and intensity. The performance of this statistical modeling technique was compared to results using traditional fixed-point sampling methods and was found to perform better. When environmental parameters can be used to predict particle size distributions, environmental managers have more options when characterizing concentrations, loads, and particle size distributions in urban runoff.

  15. Real-time, continuous water-quality monitoring in Indiana and Kentucky

    USGS Publications Warehouse

    Shoda, Megan E.; Lathrop, Timothy R.; Risch, Martin R.

    2015-01-01

    Water-quality “super” gages (also known as “sentry” gages) provide real-time, continuous measurements of the physical and chemical characteristics of stream water at or near selected U.S. Geological Survey (USGS) streamgages in Indiana and Kentucky. A super gage includes streamflow and water-quality instrumentation and representative stream sample collection for laboratory analysis. USGS scientists can use statistical surrogate models to relate instrument values to analyzed chemical concentrations at a super gage. Real-time, continuous and laboratory-analyzed concentration and load data are publicly accessible on USGS Web pages.

  16. Toward a Psychology of Surrogate Decision Making.

    PubMed

    Tunney, Richard J; Ziegler, Fenja V

    2015-11-01

    In everyday life, many of the decisions that we make are made on behalf of other people. A growing body of research suggests that we often, but not always, make different decisions on behalf of other people than the other person would choose. This is problematic in the practical case of legally designated surrogate decision makers, who may not meet the substituted judgment standard. Here, we review evidence from studies of surrogate decision making and examine the extent to which surrogate decision making accurately predicts the recipient's wishes, or if it is an incomplete or distorted application of the surrogate's own decision-making processes. We find no existing domain-general model of surrogate decision making. We propose a framework by which surrogate decision making can be assessed and a novel domain-general theory as a unifying explanatory concept for surrogate decisions. © The Author(s) 2015.

  17. On the relationship between the causal-inference and meta-analytic paradigms for the validation of surrogate endpoints.

    PubMed

    Alonso, Ariel; Van der Elst, Wim; Molenberghs, Geert; Buyse, Marc; Burzykowski, Tomasz

    2015-03-01

    The increasing cost of drug development has raised the demand for surrogate endpoints when evaluating new drugs in clinical trials. However, over the years, it has become clear that surrogate endpoints need to be statistically evaluated and deemed valid before they can be used as substitutes for "true" endpoints in clinical studies. Nowadays, two paradigms, based on causal inference and meta-analysis, dominate the scene. Nonetheless, although the literature emanating from these paradigms is wide, the relationship between them has until now largely been left unexplored. In the present work, we discuss the conceptual framework underlying both approaches and study the relationship between them using theoretical elements and the analysis of a real case study. Furthermore, we show, on the one hand, that the meta-analytic approach can be embedded within a causal-inference framework and, on the other, that surrogate endpoints successfully evaluated using the meta-analytic approach can often be heuristically justified as appealing from a causal-inference perspective as well. A newly developed and user-friendly R package, Surrogate, is provided to carry out the evaluation exercise. © 2014, The International Biometric Society.

  18. EVALUATING HABITAT AS A SURROGATE FOR POPULATION VIABILITY USING A SPATIALLY EXPLICIT POPULATION MODEL

    EPA Science Inventory

    Because data for conservation planning are always limited, surrogates are often substituted for intractable measurements such as species richness or population viability. We examined the ability of habitat quality to act as a surrogate for population performance for both Red-sho...

  19. Surrogate Safety Assessment Model (SSAM)--software user manual

    DOT National Transportation Integrated Search

    2008-05-01

    This document presents guidelines for the installation and use of the Surrogate Safety Assessment Model (SSAM) software. For more information regarding the SSAM application, including discussion of theoretical background and the results of a series o...

  20. Evaluating surrogate endpoints, prognostic markers, and predictive markers — some simple themes

    PubMed Central

    Baker, Stuart G.; Kramer, Barnett S.

    2014-01-01

    Background A surrogate endpoint is an endpoint observed earlier than the true endpoint (a health outcome) that is used to draw conclusions about the effect of treatment on the unobserved true endpoint. A prognostic marker is a marker for predicting the risk of an event given a control treatment; it informs treatment decisions when there is information on anticipated benefits and harms of a new treatment applied to persons at high risk. A predictive marker is a marker for predicting the effect of treatment on outcome in a subgroup of patients or study participants; it provides more rigorous information for treatment selection than a prognostic marker when it is based on estimated treatment effects in a randomized trial. Methods We organized our discussion around a different theme for each topic. Results “Fundamentally an extrapolation” refers to the non-statistical considerations and assumptions needed when using surrogate endpoints to evaluate a new treatment. “Decision analysis to the rescue” refers to the use of decision analysis to evaluate an additional prognostic marker because it is not possible to choose between purely statistical measures of marker performance. “The appeal of simplicity” refers to a straightforward and efficient use of a single randomized trial to evaluate overall treatment effect and treatment effect within subgroups using predictive markers. Conclusion The simple themes provide a general guideline for evaluation of surrogate endpoints, prognostic markers, and predictive markers. PMID:25385934

  1. Surrogate Modeling of High-Fidelity Fracture Simulations for Real-Time Residual Strength Predictions

    NASA Technical Reports Server (NTRS)

    Spear, Ashley D.; Priest, Amanda R.; Veilleux, Michael G.; Ingraffea, Anthony R.; Hochhalter, Jacob D.

    2011-01-01

    A surrogate model methodology is described for predicting in real time the residual strength of flight structures with discrete-source damage. Starting with design of experiment, an artificial neural network is developed that takes as input discrete-source damage parameters and outputs a prediction of the structural residual strength. Target residual strength values used to train the artificial neural network are derived from 3D finite element-based fracture simulations. A residual strength test of a metallic, integrally-stiffened panel is simulated to show that crack growth and residual strength are determined more accurately in discrete-source damage cases by using an elastic-plastic fracture framework rather than a linear-elastic fracture mechanics-based method. Improving accuracy of the residual strength training data would, in turn, improve accuracy of the surrogate model. When combined, the surrogate model methodology and high-fidelity fracture simulation framework provide useful tools for adaptive flight technology.
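
    A stripped-down version of the workflow described above can be written with an off-the-shelf neural network. The sketch below trains a small multilayer perceptron to map two hypothetical damage parameters to residual strength; the synthetic targets stand in for the finite element fracture simulations, and the parameter names and ranges are invented.

        # Minimal sketch: neural-network surrogate for residual strength on synthetic data.
        import numpy as np
        from sklearn.model_selection import train_test_split
        from sklearn.neural_network import MLPRegressor
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler

        rng = np.random.default_rng(4)
        n = 300
        crack_len = rng.uniform(5, 50, n)        # mm (hypothetical damage parameter)
        crack_angle = rng.uniform(0, 90, n)      # degrees (hypothetical damage parameter)
        strength = 400 - 3.0 * crack_len - 0.5 * crack_angle + rng.normal(0, 5, n)  # MPa

        X = np.column_stack([crack_len, crack_angle])
        X_tr, X_te, y_tr, y_te = train_test_split(X, strength, random_state=0)
        net = make_pipeline(StandardScaler(),
                            MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=5000, random_state=0))
        net.fit(X_tr, y_tr)
        print(f"held-out R^2: {net.score(X_te, y_te):.2f}")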

  2. Surrogate Modeling of High-Fidelity Fracture Simulations for Real-Time Residual Strength Predictions

    NASA Technical Reports Server (NTRS)

    Spear, Ashley D.; Priest, Amanda R.; Veilleux, Michael G.; Ingraffea, Anthony R.; Hochhalter, Jacob D.

    2011-01-01

    A surrogate model methodology is described for predicting, during flight, the residual strength of aircraft structures that sustain discrete-source damage. Starting with design of experiment, an artificial neural network is developed that takes as input discrete-source damage parameters and outputs a prediction of the structural residual strength. Target residual strength values used to train the artificial neural network are derived from 3D finite element-based fracture simulations. Two ductile fracture simulations are presented to show that crack growth and residual strength are determined more accurately in discrete-source damage cases by using an elastic-plastic fracture framework rather than a linear-elastic fracture mechanics-based method. Improving accuracy of the residual strength training data does, in turn, improve accuracy of the surrogate model. When combined, the surrogate model methodology and high fidelity fracture simulation framework provide useful tools for adaptive flight technology.

  3. Surrogate screening models for the low physical activity criterion of frailty.

    PubMed

    Eckel, Sandrah P; Bandeen-Roche, Karen; Chaves, Paulo H M; Fried, Linda P; Louis, Thomas A

    2011-06-01

    Low physical activity, one of five criteria in a validated clinical phenotype of frailty, is assessed by a standardized, semiquantitative questionnaire on up to 20 leisure time activities. Because of the time demanded to collect the interview data, it has been challenging to translate to studies other than the Cardiovascular Health Study (CHS), for which it was developed. Considering subsets of activities, we identified and evaluated streamlined surrogate assessment methods and compared them to one implemented in the Women's Health and Aging Study (WHAS). Using data on men and women ages 65 and older from the CHS, we applied logistic regression models to rank activities by "relative influence" in predicting low physical activity. We considered subsets of the most influential activities as inputs to potential surrogate models (logistic regressions). We evaluated predictive accuracy and predictive validity using the area under receiver operating characteristic curves and assessed criterion validity using proportional hazards models relating frailty status (defined using the surrogate) to mortality. Walking for exercise and moderately strenuous household chores were highly influential for both genders. Women required fewer activities than men for accurate classification. The WHAS model (8 CHS activities) was an effective surrogate, but a surrogate using 6 activities (walking, chores, gardening, general exercise, mowing and golfing) was also highly predictive. We recommend a 6-activity questionnaire to assess physical activity for men and women. If efficiency is essential and the study involves only women, fewer activities can be included.
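
    The screening logic described above can be prototyped quickly: fit a logistic regression of the low-physical-activity criterion on a reduced set of activity scores and judge the candidate surrogate by the area under the ROC curve. The data below are simulated, not CHS or WHAS data, and the activity count and threshold are arbitrary.

        # Minimal sketch: surrogate screening model for a binary criterion, simulated data.
        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.metrics import roc_auc_score
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(1)
        n = 500
        X = rng.exponential(scale=2.0, size=(n, 6))        # scores for 6 candidate activities
        low_activity = (X.sum(axis=1) < 6.0).astype(int)   # simulated criterion

        X_tr, X_te, y_tr, y_te = train_test_split(X, low_activity, random_state=1)
        clf = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
        auc = roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1])
        print(f"surrogate AUC on held-out data: {auc:.2f}")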

  4. Measurement errors in the assessment of exposure to solar ultraviolet radiation and its impact on risk estimates in epidemiological studies.

    PubMed

    Dadvand, Payam; Basagaña, Xavier; Barrera-Gómez, Jose; Diffey, Brian; Nieuwenhuijsen, Mark

    2011-07-01

    To date, many studies addressing long-term effects of ultraviolet radiation (UVR) exposure on human health have relied on a range of surrogates such as the latitude of the city of residence, ambient UVR levels, or time spent outdoors to estimate personal UVR exposure. This study aimed to differentiate the contributions of personal behaviour and ambient UVR levels on facial UVR exposure and to evaluate the impact of using UVR exposure surrogates on detecting exposure-outcome associations. Data on time-activity, holiday behaviour, and ambient UVR levels were obtained for adult (aged 25-55 years old) indoor workers in six European cities: Athens (37°N), Grenoble (45°N), Milan (45°N), Prague (50°N), Oxford (52°N), and Helsinki (60°N). Annual UVR facial exposure levels were simulated for 10,000 subjects for each city, using a behavioural UVR exposure model. Within-city variations of facial UVR exposure were three times larger than the variation between cities, mainly because of time-activity patterns. In univariate models, ambient UVR levels, latitude and time spent outdoors, each accounted for less than one fourth of the variation in facial exposure levels. Use of these surrogates to assess long-term exposure to UVR resulted in requiring more than four times more participants to achieve similar statistical power to the study that applied simulated facial exposure. Our results emphasise the importance of integrating both personal behaviour and ambient UVR levels/latitude in exposure assessment methodologies.

  5. Experimental Validation and Combustion Modeling of a JP-8 Surrogate in a Single Cylinder Diesel Engine

    DTIC Science & Technology

    2014-04-15

    Report documentation fragments only (extraction residue): "Experimental Validation and Combustion Modeling of a JP-8 Surrogate in a Single Cylinder Diesel Engine," Amit Shrestha, Umashankar Joshi, Ziliang Zheng, Tamer Badawy, Naeim A. Henein, Wayne State University, Detroit, MI, USA; report dated 13-03-2014. Recoverable objective: validate a two-component JP-8 surrogate in a single-cylinder diesel engine, with validation parameters including ignition delay.

  6. Maintaining tumor targeting accuracy in real-time motion compensation systems for respiration-induced tumor motion.

    PubMed

    Malinowski, Kathleen; McAvoy, Thomas J; George, Rohini; Dieterich, Sonja; D'Souza, Warren D

    2013-07-01

    To determine how best to time respiratory surrogate-based tumor motion model updates by comparing a novel technique based on external measurements alone to three direct measurement methods. Concurrently measured tumor and respiratory surrogate positions from 166 treatment fractions for lung or pancreas lesions were analyzed. Partial-least-squares regression models of tumor position from marker motion were created from the first six measurements in each dataset. Successive tumor localizations were obtained at a rate of once per minute on average. Model updates were timed according to four methods: never, respiratory surrogate-based (when metrics based on respiratory surrogate measurements exceeded confidence limits), error-based (when localization error ≥ 3 mm), and always (approximately once per minute). Radial tumor displacement prediction errors (mean ± standard deviation) for the four schema described above were 2.4 ± 1.2, 1.9 ± 0.9, 1.9 ± 0.8, and 1.7 ± 0.8 mm, respectively. The never-update error was significantly larger than errors of the other methods. Mean update counts over 20 min were 0, 4, 9, and 24, respectively. The same improvement in tumor localization accuracy could be achieved through any of the three update methods, but significantly fewer updates were required when the respiratory surrogate method was utilized. This study establishes the feasibility of timing image acquisitions for updating respiratory surrogate models without direct tumor localization.

  7. A comparison of the thermal inactivation kinetics of human norovirus surrogates and hepatitis A virus in buffered cell culture medium.

    PubMed

    Bozkurt, Hayriye; D'Souza, Doris H; Davidson, P Michael

    2014-09-01

    Human noroviruses and hepatitis A virus (HAV) are considered epidemiologically significant causes of foodborne disease. Therefore, studies are needed to bridge existing data gaps and determine appropriate parameters for thermal inactivation of human noroviruses and HAV. The objectives of this research were to compare the thermal inactivation kinetics of human norovirus surrogates (murine norovirus (MNV-1) and feline calicivirus (FCV-F9)) and HAV in buffered medium (2-ml vials), compare first-order and Weibull models to describe the data, calculate Arrhenius activation energy for each model, and evaluate model efficiency using selected statistical criteria. The D-values calculated from the first-order model (50-72 °C) ranged from 0.21-19.75 min for FCV-F9, 0.25-36.28 min for MNV-1, and 0.88-56.22 min for HAV. Using the Weibull model, the tD = 1 (time to destroy 1 log) for FCV-F9, MNV-1 and HAV at the same temperatures ranged from 0.10-13.27, 0.09-26.78, and 1.03-39.91 min, respectively. The z-values for FCV-F9, MNV-1, and HAV were 9.66 °C, 9.16 °C, and 14.50 °C, respectively, using the Weibull model. For the first-order model, z-values were 9.36 °C, 9.32 °C, and 12.49 °C for FCV-F9, MNV-1, and HAV, respectively. For the Weibull model, estimated activation energies for FCV-F9, MNV-1, and HAV were 225, 278, and 182 kJ/mol, respectively, while the calculated activation energies for the first-order model were 195, 202, and 171 kJ/mol, respectively. Knowledge of the thermal inactivation kinetics of norovirus surrogates and HAV will allow the development of processes that produce safer food products and improve consumer safety. Copyright © 2014. Published by Elsevier Ltd.
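
    The two models compared above are easy to fit to a survival curve at a single temperature. The sketch below fits both the first-order (log-linear) and Weibull forms to hypothetical log10 reductions with scipy; in the Weibull parameterization used here, the time to a 1-log reduction is simply the scale parameter delta.

        # Minimal sketch: fit first-order and Weibull inactivation models to one survival curve.
        import numpy as np
        from scipy.optimize import curve_fit

        t = np.array([0.0, 0.5, 1.0, 2.0, 4.0, 6.0])              # minutes (hypothetical)
        log_red = np.array([0.0, -0.6, -1.1, -1.9, -3.2, -4.1])   # log10(N/N0) (hypothetical)

        def first_order(t, D):
            return -t / D                       # D-value: time for a 1-log reduction

        def weibull(t, delta, p):
            return -(t / delta) ** p            # delta: time for a 1-log reduction; p: shape

        (D,), _ = curve_fit(first_order, t, log_red, p0=[1.0])
        (delta, p), _ = curve_fit(weibull, t, log_red, p0=[1.0, 1.0], bounds=(1e-6, np.inf))
        print(f"first-order D-value: {D:.2f} min")
        print(f"Weibull time to 1-log reduction: {delta:.2f} min (shape p = {p:.2f})")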

  8. Evaluation of an exposure assessment used in epidemiological studies of diesel exhaust and lung cancer in underground mines

    PubMed Central

    Crump, Kenny; Van Landingham, Cynthia

    2012-01-01

    NIOSH/NCI (National Institute of Occupational Safety and Health and National Cancer Institute) developed exposure estimates for respirable elemental carbon (REC) as a surrogate for exposure to diesel exhaust (DE) for different jobs in eight underground mines by year, beginning in the 1940s-1960s when diesel equipment was first introduced into these mines. These estimates played a key role in subsequent epidemiological analyses of the potential relationship between exposure to DE and lung cancer conducted in these mines. We report here on a reanalysis of some of the data from this exposure assessment. Because samples of REC were limited primarily to 1998-2001, NIOSH/NCI used carbon monoxide (CO) as a surrogate for REC. In addition, because CO samples were limited, particularly in the earlier years, they used the ratio of diesel horsepower (HP) to the mine air exhaust rate as a surrogate for CO. There are considerable uncertainties connected with each of these surrogate-based steps. The estimates of HP appear to involve considerable uncertainty, although we had no data upon which to evaluate the magnitude of this uncertainty. A sizable percentage (45%) of the CO samples used in the HP to CO model was below the detection limit, which required NIOSH/NCI to assign CO values to these samples. In their preferred REC estimates, NIOSH/NCI assumed a linear relation between CO and REC, although they provided no credible support for that assumption. Their assumption of a stable relationship between HP and CO also is questionable, and our reanalysis found a statistically significant relationship in only one-half of the mines. We re-estimated yearly REC exposures mainly using NIOSH/NCI methods but with some important differences: (i) rather than simply assuming a linear relationship, we used data from the mines to estimate the CO-REC relationship; (ii) we used a different method for assigning values to nondetect CO measurements; and (iii) we took account of statistical uncertainty to estimate bounds for REC exposures. This exercise yielded significantly different exposure estimates from those of NIOSH/NCI. However, this analysis did not incorporate the full range of uncertainty in REC exposures because of additional uncertainties in the assumptions underlying the modeling and in the underlying data (e.g. HP and mine exhaust rates). Estimating historical exposures in a cohort is generally a very difficult undertaking. However, this should not prevent one from recognizing the uncertainty in the resulting estimates in any use made of them. PMID:22594934

  9. Observations on Three Endpoint Properties and Their Relationship to Regulatory Outcomes of European Oncology Marketing Applications

    PubMed Central

    Stolk, Pieter; McAuslane, James Neil; Schellens, Jan; Breckenridge, Alasdair M.; Leufkens, Hubert

    2015-01-01

    Background. Guidance and exploratory evidence indicate that the type of endpoints and the magnitude of their outcome can define a therapy’s clinical activity; however, little empirical evidence relates specific endpoint properties with regulatory outcomes. Materials and Methods. We explored the relationship of 3 endpoint properties to regulatory outcomes by assessing 50 oncology marketing authorization applications (MAAs; reviewed from 2009 to 2013). Results. Overall, 16 (32%) had a negative outcome. The most commonly used hard endpoints were overall survival (OS) and the duration of response or stable disease. OS was a component of 91% approved and 63% failed MAAs. The most commonly used surrogate endpoints were progression-free survival (PFS), response rate, and health-related quality of life assessments. There was no difference (p = .3801) between the approved and failed MAA cohorts in the proportion of hard endpoints used. A mean of slightly more than four surrogate endpoints were used per approved MAA compared with slightly more than two for failed MAAs. Longer OS and PFS duration outcomes were generally associated with approvals, often when not statistically significant. The approved cohort was associated with a preponderance of statistically significant (p < .05) improvements in primary endpoints (p < .0001 difference between the approved and failed groups). Conclusion. Three key endpoint properties (type of endpoint [hard/surrogate], magnitude of an endpoint outcome, and its statistical significance) are consistent with the European Medicines Agency guidance and, notwithstanding the contribution of unique disease-specific circumstances, are associated with a predictable positive outcome for oncology MAAs. Implications for Practice: Regulatory decisions made by the European Medicines Agency determine which new medicines will be available to European prescribers and for which therapeutic indications. Regulatory success or failure can be influenced by many factors. This study assessed three key properties of endpoints used in preauthorization trials (type of endpoint [hard/surrogate], magnitude of endpoint outcome, and its statistical significance) and whether they are associated with a positive regulatory outcome. Clinicians can use these properties, which are described in the publicly available European public assessment reports, to help guide their understanding of the clinical effect of new oncologic therapies. PMID:25948678

  10. Observations on Three Endpoint Properties and Their Relationship to Regulatory Outcomes of European Oncology Marketing Applications.

    PubMed

    Liberti, Lawrence; Stolk, Pieter; McAuslane, James Neil; Schellens, Jan; Breckenridge, Alasdair M; Leufkens, Hubert

    2015-06-01

    Guidance and exploratory evidence indicate that the type of endpoints and the magnitude of their outcome can define a therapy's clinical activity; however, little empirical evidence relates specific endpoint properties with regulatory outcomes. We explored the relationship of 3 endpoint properties to regulatory outcomes by assessing 50 oncology marketing authorization applications (MAAs; reviewed from 2009 to 2013). Overall, 16 (32%) had a negative outcome. The most commonly used hard endpoints were overall survival (OS) and the duration of response or stable disease. OS was a component of 91% approved and 63% failed MAAs. The most commonly used surrogate endpoints were progression-free survival (PFS), response rate, and health-related quality of life assessments. There was no difference (p = .3801) between the approved and failed MAA cohorts in the proportion of hard endpoints used. A mean of slightly more than four surrogate endpoints were used per approved MAA compared with slightly more than two for failed MAAs. Longer OS and PFS duration outcomes were generally associated with approvals, often when not statistically significant. The approved cohort was associated with a preponderance of statistically significant (p < .05) improvements in primary endpoints (p < .0001 difference between the approved and failed groups). Three key endpoint properties (type of endpoint [hard/surrogate], magnitude of an endpoint outcome, and its statistical significance) are consistent with the European Medicines Agency guidance and, notwithstanding the contribution of unique disease-specific circumstances, are associated with a predictable positive outcome for oncology MAAs. Regulatory decisions made by the European Medicines Agency determine which new medicines will be available to European prescribers and for which therapeutic indications. Regulatory success or failure can be influenced by many factors. This study assessed three key properties of endpoints used in preauthorization trials (type of endpoint [hard/surrogate], magnitude of endpoint outcome, and its statistical significance) and whether they are associated with a positive regulatory outcome. Clinicians can use these properties, which are described in the publicly available European public assessment reports, to help guide their understanding of the clinical effect of new oncologic therapies. ©AlphaMed Press.

  11. Greedy Sampling and Incremental Surrogate Model-Based Tailoring of Aeroservoelastic Model Database for Flexible Aircraft

    NASA Technical Reports Server (NTRS)

    Wang, Yi; Pant, Kapil; Brenner, Martin J.; Ouellette, Jeffrey A.

    2018-01-01

    This paper presents a data analysis and modeling framework to tailor and develop a linear parameter-varying (LPV) aeroservoelastic (ASE) model database for flexible aircraft in a broad 2D flight parameter space. The Kriging surrogate model is constructed using ASE models at a fraction of grid points within the original model database, and then the ASE model at any flight condition can be obtained simply through surrogate model interpolation. The greedy sampling algorithm is developed to select the next sample point that carries the worst relative error between the surrogate model prediction and the benchmark model in the frequency domain among all input-output channels. The process is iterated to incrementally improve surrogate model accuracy until a pre-determined tolerance or iteration budget is met. The methodology is applied to the ASE model database of a flexible aircraft currently being tested at NASA/AFRC for flutter suppression and gust load alleviation. Our studies indicate that the proposed method can reduce the number of models in the original database by 67%. Even so, the ASE models obtained through Kriging interpolation match the models in the original database constructed directly from the physics-based tool, with the worst relative error far below 1%. The interpolated ASE model exhibits continuously-varying gains along a set of prescribed flight conditions. More importantly, the selected grid points are distributed non-uniformly in the parameter space, a) capturing the distinctly different dynamic behavior and its dependence on flight parameters, and b) reiterating the need and utility for adaptive space sampling techniques for ASE model database compaction. The present framework is directly extendible to high-dimensional flight parameter spaces, and can be used to guide ASE model development, model order reduction, robust control synthesis and novel vehicle design of flexible aircraft.
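
    A toy version of the greedy loop described above is sketched below: starting from a few grid points, the Kriging surrogate is refit, the grid point with the largest error against the benchmark is added, and the loop stops at a tolerance. A scalar analytic function replaces the frequency-domain ASE models, and the grid, kernel, and tolerance are placeholder choices.

        # Minimal sketch: greedy sampling to tailor a Kriging surrogate of a benchmark model.
        import numpy as np
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import RBF

        def benchmark(x):                        # stand-in for the physics-based model
            return np.sin(4 * x) * np.exp(-x)

        grid = np.linspace(0.0, 3.0, 121)
        sampled = [0, 60, 120]                   # start from the end points and midpoint
        tol = 1e-2

        for _ in range(30):
            X = grid[sampled].reshape(-1, 1)
            gp = GaussianProcessRegressor(kernel=RBF(0.5), normalize_y=True).fit(X, benchmark(grid[sampled]))
            pred = gp.predict(grid.reshape(-1, 1))
            err = np.abs(pred - benchmark(grid)) / (np.max(np.abs(benchmark(grid))) + 1e-12)
            worst = int(np.argmax(err))
            if err[worst] < tol:
                break
            sampled.append(worst)                # add the worst-error grid point and refit

        print(f"{len(sampled)} of {grid.size} grid points retained for tolerance {tol}")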

  12. Runoff load estimation of particulate and dissolved nitrogen in Lake Inba watershed using continuous monitoring data on turbidity and electric conductivity.

    PubMed

    Kim, J; Nagano, Y; Furumai, H

    2012-01-01

    Easy-to-measure surrogate parameters for water quality indicators are needed for real time monitoring as well as for generating data for model calibration and validation. In this study, a novel linear regression model for estimating total nitrogen (TN) based on two surrogate parameters is proposed based on evaluation of pollutant loads flowing into a eutrophic lake. Based on their runoff characteristics during wet weather, electric conductivity (EC) and turbidity were selected as surrogates for particulate nitrogen (PN) and dissolved nitrogen (DN), respectively. Strong linear relationships were established between PN and turbidity and DN and EC, and both models subsequently combined for estimation of TN. This model was evaluated by comparison of estimated and observed TN runoff loads during rainfall events. This analysis showed that turbidity and EC are viable surrogates for PN and DN, respectively, and that the linear regression model for TN concentration was successful in estimating TN runoff loads during rainfall events and also under dry weather conditions.
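
    The combined regression can be written as two simple linear fits whose predictions are summed. The sketch below uses a handful of hypothetical storm-event samples; the units and values are illustrative only.

        # Minimal sketch: estimate total nitrogen as PN(turbidity) + DN(electric conductivity).
        import numpy as np

        turbidity = np.array([12.0, 45.0, 88.0, 150.0, 220.0])    # NTU (hypothetical)
        ec        = np.array([310.0, 260.0, 210.0, 180.0, 150.0]) # uS/cm (hypothetical)
        pn        = np.array([0.30, 0.95, 1.80, 3.10, 4.50])      # particulate N, mg/L
        dn        = np.array([2.10, 1.80, 1.50, 1.30, 1.10])      # dissolved N, mg/L

        a_pn, b_pn = np.polyfit(turbidity, pn, 1)   # PN ~ turbidity
        a_dn, b_dn = np.polyfit(ec, dn, 1)          # DN ~ EC

        def estimate_tn(turb, cond):
            return (a_pn * turb + b_pn) + (a_dn * cond + b_dn)

        print(f"estimated TN at turbidity 100 NTU, EC 200 uS/cm: {estimate_tn(100.0, 200.0):.2f} mg/L")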

  13. The Limits of Surrogates' Moral Authority and Physician Professionalism: Can the Paradigm of Palliative Sedation Be Instructive?

    PubMed

    Berger, Jeffrey T

    2017-01-01

    With narrow exception, physicians' treatment of incapacitated patients requires the consent of health surrogates. Although the decision-making authority of surrogates is appropriately broad, their moral authority is not without limits. Discerning these bounds is particularly germane to ethically complex treatments and has important implications for the welfare of patients, for the professional integrity of clinicians, and, in fact, for the welfare of surrogates. Palliative sedation is one such complex treatment; as such, it provides a valuable model for analyzing the scope of surrogates' moral authority. Guidelines for palliative sedation that present it as a "last-resort" treatment for severe and intractable suffering yet require surrogate consent in order to offer it are ethically untenable, precisely because the moral limits of surrogate authority have not been considered. © 2017 The Hastings Center.

  14. Role of Volatility in the Development of JP-8 Surrogates for Diesel Engine Application

    DTIC Science & Technology

    2014-01-01

    Only fragments of this record are recoverable: the distillation curves of the surrogate fuels were calculated using the Aspen HYSYS [41] software package, and the Peng-Robinson model was chosen [text truncated]; the accuracy of Aspen HYSYS predictions was compared with [text truncated] and SF3; the distillation curves calculated using Aspen HYSYS for the five surrogate fuels of Table 1 are shown in Figure 7 [text truncated].

  15. Reduced-order surrogate models for Green's functions in black hole spacetimes

    NASA Astrophysics Data System (ADS)

    Galley, Chad; Wardell, Barry

    2016-03-01

    The fundamental nature of linear wave propagation in curved spacetime is encoded in the retarded Green's function (or propagator). Green's functions are useful tools because almost any field quantity of interest can be computed via convolution integrals with a source. In addition, perturbation theories involving nonlinear wave propagation can be expressed in terms of multiple convolutions of the Green's function. Recently, numerical solutions for propagators in black hole spacetimes have been found that are globally valid and accurate for computing physical quantities. However, the data generated is too large for practical use because the propagator depends on two spacetime points that must be sampled finely to yield accurate convolutions. I describe how to build a reduced-order model that can be evaluated as a substitute, or surrogate, for solutions of the curved spacetime Green's function equation. The resulting surrogate accurately and quickly models the original and out-of-sample data. I discuss applications of the surrogate, including self-consistent evolutions and waveforms of extreme mass ratio binaries. Green's function surrogate models provide a new and practical way to handle many old problems involving wave propagation and motion in curved spacetimes.

  16. Data mining and statistical inference in selective laser melting

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kamath, Chandrika

    Selective laser melting (SLM) is an additive manufacturing process that builds a complex three-dimensional part, layer-by-layer, using a laser beam to fuse fine metal powder together. The design freedom afforded by SLM comes associated with complexity. As the physical phenomena occur over a broad range of length and time scales, the computational cost of modeling the process is high. At the same time, the large number of parameters that control the quality of a part make experiments expensive. In this paper, we describe ways in which we can use data mining and statistical inference techniques to intelligently combine simulations and experiments to build parts with desired properties. We start with a brief summary of prior work in finding process parameters for high-density parts. We then expand on this work to show how we can improve the approach by using feature selection techniques to identify important variables, data-driven surrogate models to reduce computational costs, improved sampling techniques to cover the design space adequately, and uncertainty analysis for statistical inference. Here, our results indicate that techniques from data mining and statistics can complement those from physical modeling to provide greater insight into complex processes such as selective laser melting.
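
    One of the steps listed above, identifying the important process variables, can be prototyped with a tree ensemble. The sketch below ranks four invented SLM parameters by their importance for a synthetic density response; the parameter names, the response function, and the sample size are all placeholders.

        # Minimal sketch: rank process parameters by random-forest feature importance.
        import numpy as np
        from sklearn.ensemble import RandomForestRegressor

        rng = np.random.default_rng(2)
        n = 400
        power, speed, spacing, thickness = (rng.uniform(size=n) for _ in range(4))
        density = 0.6 * power - 0.3 * speed + 0.05 * spacing + 0.02 * rng.standard_normal(n)

        X = np.column_stack([power, speed, spacing, thickness])
        rf = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, density)
        for name, imp in zip(["power", "speed", "spacing", "thickness"], rf.feature_importances_):
            print(f"{name:10s} importance: {imp:.2f}")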

  17. Data mining and statistical inference in selective laser melting

    DOE PAGES

    Kamath, Chandrika

    2016-01-11

    Selective laser melting (SLM) is an additive manufacturing process that builds a complex three-dimensional part, layer-by-layer, using a laser beam to fuse fine metal powder together. The design freedom afforded by SLM comes associated with complexity. As the physical phenomena occur over a broad range of length and time scales, the computational cost of modeling the process is high. At the same time, the large number of parameters that control the quality of a part make experiments expensive. In this paper, we describe ways in which we can use data mining and statistical inference techniques to intelligently combine simulations and experiments to build parts with desired properties. We start with a brief summary of prior work in finding process parameters for high-density parts. We then expand on this work to show how we can improve the approach by using feature selection techniques to identify important variables, data-driven surrogate models to reduce computational costs, improved sampling techniques to cover the design space adequately, and uncertainty analysis for statistical inference. Here, our results indicate that techniques from data mining and statistics can complement those from physical modeling to provide greater insight into complex processes such as selective laser melting.

  18. Formulation of an RP-1 Pyrolysis Surrogate from Shock Tube Measurements of Fuel and Ethylene Time Histories

    DTIC Science & Technology

    2012-04-01

    Only fragments of this record are recoverable: the measurements also provide modelers (in both kinetics and computational fluid dynamics) with a method of representing, during simulation, a fuel that may have [text truncated]; a partial composition table (mole fractions) for the RP-1 and RP-2 surrogates from [Huber 2009a] lists a methyldecalin (prefix lost in extraction) at 0.354 in both surrogates and 5-methylnonane at 0.150 [remainder truncated]; the remaining fragments are truncated reference citations (Experimental Thermal and Fluid Science, 28(7):701-708, 2004; L. F. Albright, B. L. Crynes, and W. H. Corcoran).

  19. Study Designs and Statistical Analyses for Biomarker Research

    PubMed Central

    Gosho, Masahiko; Nagashima, Kengo; Sato, Yasunori

    2012-01-01

    Biomarkers are becoming increasingly important for streamlining drug discovery and development. In addition, biomarkers are widely expected to be used as a tool for disease diagnosis, personalized medication, and surrogate endpoints in clinical research. In this paper, we highlight several important aspects related to study design and statistical analysis for clinical research incorporating biomarkers. We describe the typical and current study designs for exploring, detecting, and utilizing biomarkers. Furthermore, we introduce statistical issues such as confounding and multiplicity for statistical tests in biomarker research. PMID:23012528

  20. Inactivation of Tulane virus, a novel surrogate for human norovirus

    USDA-ARS?s Scientific Manuscript database

    Human noroviruses (HuNoVs) are the major cause of non-bacterial epidemics of gastroenteritis. Due to the inability to cultivate HuNoVs and the lack of an efficient small animal model, surrogates are used to study HuNoV biology. Two such surrogates, the feline calicivirus (FCV) and the murine norovir...

  1. Surrogate-based Analysis and Optimization

    NASA Technical Reports Server (NTRS)

    Queipo, Nestor V.; Haftka, Raphael T.; Shyy, Wei; Goel, Tushar; Vaidyanathan, Raj; Tucker, P. Kevin

    2005-01-01

    A major challenge to the successful full-scale development of modern aerospace systems is to address competing objectives such as improved performance, reduced costs, and enhanced safety. Accurate, high-fidelity models are typically time consuming and computationally expensive. Furthermore, informed decisions should be made with an understanding of the impact (global sensitivity) of the design variables on the different objectives. In this context, the so-called surrogate-based approach for analysis and optimization can play a very valuable role. The surrogates are constructed using data drawn from high-fidelity models, and provide fast approximations of the objectives and constraints at new design points, thereby making sensitivity and optimization studies feasible. This paper provides a comprehensive discussion of the fundamental issues that arise in surrogate-based analysis and optimization (SBAO), highlighting concepts, methods, techniques, as well as practical implications. The issues addressed include the selection of the loss function and regularization criteria for constructing the surrogates, design of experiments, surrogate selection and construction, sensitivity analysis, convergence, and optimization. The multi-objective optimal design of a liquid rocket injector is presented to highlight the state of the art and to help guide future efforts.
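
    The core loop of surrogate-based analysis and optimization can be illustrated with a one-dimensional toy problem: evaluate an "expensive" objective at a small design of experiments, fit a Gaussian-process surrogate, and optimize the cheap surrogate. The objective below is an analytic stand-in, not an injector model, and the kernel and bounds are arbitrary.

        # Minimal sketch: fit a surrogate to a few samples and optimize the surrogate.
        import numpy as np
        from scipy.optimize import minimize
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import RBF

        def expensive_objective(x):
            return (x - 0.3) ** 2 + 0.1 * np.sin(15 * x)

        X_doe = np.linspace(0.0, 1.0, 8).reshape(-1, 1)    # small design of experiments
        y_doe = expensive_objective(X_doe).ravel()

        gp = GaussianProcessRegressor(kernel=RBF(0.2), normalize_y=True).fit(X_doe, y_doe)

        def surrogate(x):                                   # cheap approximation to optimize
            return float(gp.predict(np.atleast_2d(x))[0])

        res = minimize(surrogate, x0=np.array([0.5]), bounds=[(0.0, 1.0)])
        print(f"surrogate optimum near x = {res.x[0]:.3f}; true objective there = "
              f"{float(expensive_objective(res.x[0])):.4f}")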

  2. Vehicle manoeuvers as surrogate safety measures: Extracting data from the gps-enabled smartphones of regular drivers.

    PubMed

    Stipancic, Joshua; Miranda-Moreno, Luis; Saunier, Nicolas

    2018-06-01

    Network screening is a key element in identifying and prioritizing hazardous sites for engineering treatment. Traditional screening methods have used observed crash frequency or severity ranking criteria and statistical modelling approaches, despite the fact that crash-based methods are reactive. Alternatively, surrogate safety measures (SSMs) have become popular, making use of new data sources including video and, more rarely, GPS data. The purpose of this study is to examine vehicle manoeuvres of braking and accelerating extracted from a large quantity of GPS data collected using the smartphones of regular drivers, and to explore their potential as SSMs through correlation with historical collision frequency and severity across different facility types. GPS travel data was collected in Quebec City, Canada in 2014. The sample for this study contained over 4000 drivers and 21,000 trips. Hard braking (HBEs) and accelerating events (HAEs) were extracted and compared to historical crash data using Spearman's correlation coefficient and pairwise Kolmogorov-Smirnov tests. Both manoeuvres were shown to be positively correlated with crash frequency at the link and intersection levels, though correlations were much stronger when considering intersections. Locations with more braking and accelerating also tend to have more collisions. Concerning severity, higher numbers of vehicle manoeuvres were also related to increased collision severity, though this relationship was not always statistically significant. The inclusion of severity testing, which is an independent dimension of safety, represents a substantial contribution to the existing literature. Future work will focus on developing a network screening model that incorporates these SSMs. Copyright © 2018 Elsevier Ltd. All rights reserved.
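
    The two steps described above, extracting manoeuvre events from GPS traces and correlating event counts with crash history, can be sketched on synthetic data. The deceleration threshold, trip trace, and site counts below are invented for illustration.

        # Minimal sketch: hard-braking extraction from a 1 Hz speed trace, then Spearman
        # correlation of event counts with crash counts across hypothetical sites.
        import numpy as np
        from scipy.stats import spearmanr

        rng = np.random.default_rng(3)

        def hard_braking_count(speeds_mps, threshold=-3.0):
            """Count decelerations stronger than the threshold (m/s^2)."""
            accel = np.diff(speeds_mps)          # change in speed per second
            return int(np.sum(accel < threshold))

        trip = np.clip(15 + np.cumsum(rng.normal(0, 1.5, 120)), 0, None)   # one synthetic trip
        print("hard-braking events in trip:", hard_braking_count(trip))

        events = rng.poisson(8, 25)                      # braking events per site (hypothetical)
        crashes = rng.poisson(1 + 0.4 * events)          # crash counts loosely tied to events
        rho, pval = spearmanr(events, crashes)
        print(f"Spearman rho = {rho:.2f}, p = {pval:.3f}")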

  3. Using a gel/plastic surrogate to study the biomechanical response of the head under air shock loading: a combined experimental and numerical investigation.

    PubMed

    Zhu, Feng; Wagner, Christina; Dal Cengio Leonardi, Alessandra; Jin, Xin; Vandevord, Pamela; Chou, Clifford; Yang, King H; King, Albert I

    2012-03-01

    A combined experimental and numerical study was conducted to determine a method to elucidate the biomechanical response of a head surrogate physical model under air shock loading. In the physical experiments, a gel-filled egg-shaped skull/brain surrogate was exposed to blast overpressure in a shock tube environment, and static pressures within the shock tube and the surrogate were recorded throughout the event. A numerical model of the shock tube was developed using the Eulerian approach and validated against experimental data. An arbitrary Lagrangian-Eulerian (ALE) fluid-structure coupling algorithm was then utilized to simulate the interaction of the shock wave and the head surrogate. After model validation, a comprehensive series of parametric studies was carried out on the egg-shaped surrogate FE model to assess the effect of several key factors, such as the elastic modulus of the shell, bulk modulus of the core, head orientation, and internal sensor location, on pressure and strain responses. Results indicate that increasing the elastic modulus of the shell within the range simulated in this study led to a considerable rise in the overpressures. Varying the bulk modulus of the core from 0.5 to 2.0 GPa increased the overpressure by 7.2%. The curvature of the surface facing the shock wave significantly affected both the peak positive and negative pressures. Simulations of the head surrogate with the blunt end facing the advancing shock front had a higher pressure compared to the simulations with the pointed end facing the shock front. The influence of an opening (possibly mimicking anatomical apertures) on the peak pressures was evaluated using a surrogate head with a hole on the shell of the blunt end. It was revealed that the presence of the opening had little influence on the positive pressures but could noticeably affect the negative pressure.

  4. An Empirical Assessment and Comparison of Species-Based and Habitat-Based Surrogates: A Case Study of Forest Vertebrates and Large Old Trees

    PubMed Central

    Lindenmayer, David B.; Barton, Philip S.; Lane, Peter W.; Westgate, Martin J.; McBurney, Lachlan; Blair, David; Gibbons, Philip; Likens, Gene E.

    2014-01-01

    A holy grail of conservation is to find simple but reliable measures of environmental change to guide management. For example, particular species or particular habitat attributes are often used as proxies for the abundance or diversity of a subset of other taxa. However, the efficacy of such kinds of species-based surrogates and habitat-based surrogates is rarely assessed, nor are different kinds of surrogates compared in terms of their relative effectiveness. We use 30-year datasets on arboreal marsupials and vegetation structure to quantify the effectiveness of: (1) the abundance of a particular species of arboreal marsupial as a species-based surrogate for other arboreal marsupial taxa, (2) hollow-bearing tree abundance as a habitat-based surrogate for arboreal marsupial abundance, and (3) a combination of species- and habitat-based surrogates. We also quantify the robustness of species-based and habitat-based surrogates over time. We then use the same approach to model overall species richness of arboreal marsupials. We show that a species-based surrogate can appear to be a valid surrogate until a habitat-based surrogate is co-examined, after which the effectiveness of the former is lost. The addition of a species-based surrogate to a habitat-based surrogate made little difference in explaining arboreal marsupial abundance, but altered the co-occurrence relationship between species. Hence, there was limited value in simultaneously using a combination of kinds of surrogates. The habitat-based surrogate also generally performed significantly better and was easier and less costly to gather than the species-based surrogate. We found that over 30 years of study, the relationships which underpinned the habitat-based surrogate generally remained positive but variable over time. Our work highlights why it is important to compare the effectiveness of different broad classes of surrogates and identify situations when either species- or habitat-based surrogates are likely to be superior. PMID:24587050

  5. An empirical assessment and comparison of species-based and habitat-based surrogates: a case study of forest vertebrates and large old trees.

    PubMed

    Lindenmayer, David B; Barton, Philip S; Lane, Peter W; Westgate, Martin J; McBurney, Lachlan; Blair, David; Gibbons, Philip; Likens, Gene E

    2014-01-01

    A holy grail of conservation is to find simple but reliable measures of environmental change to guide management. For example, particular species or particular habitat attributes are often used as proxies for the abundance or diversity of a subset of other taxa. However, the efficacy of such kinds of species-based surrogates and habitat-based surrogates is rarely assessed, nor are different kinds of surrogates compared in terms of their relative effectiveness. We use 30-year datasets on arboreal marsupials and vegetation structure to quantify the effectiveness of: (1) the abundance of a particular species of arboreal marsupial as a species-based surrogate for other arboreal marsupial taxa, (2) hollow-bearing tree abundance as a habitat-based surrogate for arboreal marsupial abundance, and (3) a combination of species- and habitat-based surrogates. We also quantify the robustness of species-based and habitat-based surrogates over time. We then use the same approach to model overall species richness of arboreal marsupials. We show that a species-based surrogate can appear to be a valid surrogate until a habitat-based surrogate is co-examined, after which the effectiveness of the former is lost. The addition of a species-based surrogate to a habitat-based surrogate made little difference in explaining arboreal marsupial abundance, but altered the co-occurrence relationship between species. Hence, there was limited value in simultaneously using a combination of kinds of surrogates. The habitat-based surrogate also generally performed significantly better and was easier and less costly to gather than the species-based surrogate. We found that over 30 years of study, the relationships which underpinned the habitat-based surrogate generally remained positive but variable over time. Our work highlights why it is important to compare the effectiveness of different broad classes of surrogates and identify situations when either species- or habitat-based surrogates are likely to be superior.

  6. A general framework to learn surrogate relevance criterion for atlas based image segmentation

    NASA Astrophysics Data System (ADS)

    Zhao, Tingting; Ruan, Dan

    2016-09-01

    Multi-atlas based image segmentation sees great opportunities in the big data era but also faces unprecedented challenges in identifying positive contributors from extensive heterogeneous data. To assess data relevance, image similarity criteria based on various image features widely serve as surrogates for the inaccessible geometric agreement criteria. This paper proposes a general framework to learn image-based surrogate relevance criteria to better mimic the behaviors of segmentation-based oracle geometric relevance. The validity of its general rationale is verified in the specific context of fusion set selection for image segmentation. More specifically, we first present a unified formulation for surrogate relevance criteria and model the neighborhood relationship among atlases based on the oracle relevance knowledge. Surrogates are then trained to be small for geometrically relevant neighbors of the given targets and large for irrelevant, remote atlases. The proposed surrogate learning framework is verified in corpus callosum segmentation. The learned surrogates demonstrate superiority in inferring the underlying oracle value and selecting a relevant fusion set, compared to benchmark surrogates.

  7. Uncertainty quantification of resonant ultrasound spectroscopy for material property and single crystal orientation estimation on a complex part

    NASA Astrophysics Data System (ADS)

    Aldrin, John C.; Mayes, Alexander; Jauriqui, Leanne; Biedermann, Eric; Heffernan, Julieanne; Livings, Richard; Goodlet, Brent; Mazdiyasni, Siamack

    2018-04-01

    A case study is presented evaluating uncertainty in Resonance Ultrasound Spectroscopy (RUS) inversion for single crystal (SX) Ni-based superalloy Mar-M247 cylindrical dog-bone specimens. A number of surrogate models were developed with FEM model solutions, using different sampling schemes (regular grid, Monte Carlo sampling, Latin Hyper-cube sampling) and two model approaches, N-dimensional cubic spline interpolation and Kriging. Repeated studies were used to quantify the well-posedness of the inversion problem, and the uncertainty was assessed in material property and crystallographic orientation estimates given typical geometric dimension variability in aerospace components. Surrogate model quality was found to be an important factor in inversion results when the model more closely represents the test data. One important finding was that, when the model matches the test data well, a Kriging surrogate model using un-sorted Latin Hypercube sampled data performed as well as the best results from an N-dimensional interpolation model using sorted data. However, both surrogate model quality and mode sorting were found to be less critical when inverting properties from either experimental data or simulated test cases with uncontrolled geometric variation.
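
    To make the surrogate comparison concrete, the sketch below fits two surrogate families to the same unsorted Latin hypercube samples of a placeholder response: a Kriging (Gaussian process) model and a scattered-data linear interpolant standing in for the paper's N-dimensional spline. The fem_response function is synthetic, not an RUS/FEM solution.

```python
# Hedged sketch under stated assumptions; not the paper's surrogate pipeline.
import numpy as np
from scipy.stats import qmc
from scipy.interpolate import LinearNDInterpolator
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

def fem_response(x):                      # placeholder for an FEM modal solution
    return 100.0 + 5.0 * x[:, 0] - 2.0 * x[:, 1] + 3.0 * x[:, 0] * x[:, 2]

lhs = qmc.LatinHypercube(d=3, seed=1)
X = lhs.random(300)                       # unsorted Latin hypercube samples
y = fem_response(X)

kriging = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True).fit(X, y)
interp = LinearNDInterpolator(X, y)       # stand-in for spline interpolation

X_test = qmc.LatinHypercube(d=3, seed=2).random(50)
err_gp = np.abs(kriging.predict(X_test) - fem_response(X_test)).mean()
err_in = np.nanmean(np.abs(interp(X_test) - fem_response(X_test)))  # NaN outside hull
print(f"mean |error|  Kriging: {err_gp:.3f}   interpolant: {err_in:.3f}")
```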

  8. GENERATING SOPHISTICATED SPATIAL SURROGATES USING THE MIMS SPATIAL ALLOCATOR

    EPA Science Inventory

    The Multimedia Integrated Modeling System (MIMS) Spatial Allocator is open-source software for generating spatial surrogates for emissions modeling, changing the map projection of Shapefiles, and performing other types of spatial allocation that does not require the use of a comm...

  9. Love as a regulative ideal in surrogate decision making.

    PubMed

    Stonestreet, Erica Lucast

    2014-10-01

    This discussion aims to give a normative theoretical basis for a "best judgment" model of surrogate decision making rooted in a regulative ideal of love. Currently, there are two basic models of surrogate decision making for incompetent patients: the "substituted judgment" model and the "best interests" model. The former draws on the value of autonomy and responds with respect; the latter draws on the value of welfare and responds with beneficence. It can be difficult to determine which of these two models is more appropriate for a given patient, and both approaches may seem inadequate for a surrogate who loves the patient. The proposed "best judgment" model effectively draws on the values incorporated in each of the traditional standards, but does so because these values are important to someone who loves a patient, since love responds to the patient as the specific person she is. © The Author 2014. Published by Oxford University Press, on behalf of the Journal of Medicine and Philosophy Inc. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  10. Statistical iterative material image reconstruction for spectral CT using a semi-empirical forward model

    NASA Astrophysics Data System (ADS)

    Mechlem, Korbinian; Ehn, Sebastian; Sellerer, Thorsten; Pfeiffer, Franz; Noël, Peter B.

    2017-03-01

    In spectral computed tomography (spectral CT), the additional information about the energy dependence of attenuation coefficients can be exploited to generate material selective images. These images have found applications in various areas such as artifact reduction, quantitative imaging or clinical diagnosis. However, significant noise amplification on material decomposed images remains a fundamental problem of spectral CT. Most spectral CT algorithms separate the process of material decomposition and image reconstruction. Separating these steps is suboptimal because the full statistical information contained in the spectral tomographic measurements cannot be exploited. Statistical iterative reconstruction (SIR) techniques provide an alternative, mathematically elegant approach to obtaining material selective images with improved tradeoffs between noise and resolution. Furthermore, image reconstruction and material decomposition can be performed jointly. This is accomplished by a forward model which directly connects the (expected) spectral projection measurements and the material selective images. To obtain this forward model, detailed knowledge of the different photon energy spectra and the detector response was assumed in previous work. However, accurately determining the spectrum is often difficult in practice. In this work, a new algorithm for statistical iterative material decomposition is presented. It uses a semi-empirical forward model which relies on simple calibration measurements. Furthermore, an efficient optimization algorithm based on separable surrogate functions is employed. This partially negates one of the major shortcomings of SIR, namely high computational cost and long reconstruction times. Numerical simulations and real experiments show strongly improved image quality and reduced statistical bias compared to projection-based material decomposition.

  11. Diesel surrogate fuels for engine testing and chemical-kinetic modeling: Compositions and properties

    DOE PAGES

    Mueller, Charles J.; Cannella, William J.; Bays, J. Timothy; ...

    2016-01-07

    The primary objectives of this work were to formulate, blend, and characterize a set of four ultralow-sulfur diesel surrogate fuels in quantities sufficient to enable their study in single-cylinder-engine and combustion-vessel experiments. The surrogate fuels feature increasing levels of compositional accuracy (i.e., increasing exactness in matching hydrocarbon structural characteristics) relative to the single target diesel fuel upon which the surrogate fuels are based. This approach was taken to assist in determining the minimum level of surrogate-fuel compositional accuracy that is required to adequately emulate the performance characteristics of the target fuel under different combustion modes. For each of the four surrogate fuels, an approximately 30 L batch was blended, and a number of the physical and chemical properties were measured. In conclusion, this work documents the surrogate-fuel creation process and the results of the property measurements.

  12. Diesel Surrogate Fuels for Engine Testing and Chemical-Kinetic Modeling: Compositions and Properties

    PubMed Central

    Mueller, Charles J.; Cannella, William J.; Bays, J. Timothy; Bruno, Thomas J.; DeFabio, Kathy; Dettman, Heather D.; Gieleciak, Rafal M.; Huber, Marcia L.; Kweon, Chol-Bum; McConnell, Steven S.; Pitz, William J.; Ratcliff, Matthew A.

    2016-01-01

    The primary objectives of this work were to formulate, blend, and characterize a set of four ultralow-sulfur diesel surrogate fuels in quantities sufficient to enable their study in single-cylinder-engine and combustion-vessel experiments. The surrogate fuels feature increasing levels of compositional accuracy (i.e., increasing exactness in matching hydrocarbon structural characteristics) relative to the single target diesel fuel upon which the surrogate fuels are based. This approach was taken to assist in determining the minimum level of surrogate-fuel compositional accuracy that is required to adequately emulate the performance characteristics of the target fuel under different combustion modes. For each of the four surrogate fuels, an approximately 30 L batch was blended, and a number of the physical and chemical properties were measured. This work documents the surrogate-fuel creation process and the results of the property measurements. PMID:27330248

  13. A Bayesian Approach to Surrogacy Assessment Using Principal Stratification in Clinical Trials

    PubMed Central

    Li, Yun; Taylor, Jeremy M.G.; Elliott, Michael R.

    2011-01-01

    A surrogate marker (S) is a variable that can be measured earlier and often easier than the true endpoint (T) in a clinical trial. Most previous research has been devoted to developing surrogacy measures to quantify how well S can replace T or examining the use of S in predicting the effect of a treatment (Z). However, the research often requires one to fit models for the distribution of T given S and Z. It is well known that such models do not have causal interpretations because the models condition on a post-randomization variable S. In this paper, we directly model the relationship among T, S and Z using a potential outcomes framework introduced by Frangakis and Rubin (2002). We propose a Bayesian estimation method to evaluate the causal probabilities associated with the cross-classification of the potential outcomes of S and T when S and T are both binary. We use a log-linear model to directly model the association between the potential outcomes of S and T through the odds ratios. The quantities derived from this approach always have causal interpretations. However, this causal model is not identifiable from the data without additional assumptions. To reduce the non-identifiability problem and increase the precision of statistical inferences, we assume monotonicity and incorporate prior belief that is plausible in the surrogate context by using prior distributions. We also explore the relationship among the surrogacy measures based on traditional models and this counterfactual model. The method is applied to the data from a glaucoma treatment study. PMID:19673864

  14. Maintaining tumor targeting accuracy in real-time motion compensation systems for respiration-induced tumor motion

    PubMed Central

    Malinowski, Kathleen; McAvoy, Thomas J.; George, Rohini; Dieterich, Sonja; D’Souza, Warren D.

    2013-01-01

    Purpose: To determine how best to time respiratory surrogate-based tumor motion model updates by comparing a novel technique based on external measurements alone to three direct measurement methods. Methods: Concurrently measured tumor and respiratory surrogate positions from 166 treatment fractions for lung or pancreas lesions were analyzed. Partial-least-squares regression models of tumor position from marker motion were created from the first six measurements in each dataset. Successive tumor localizations were obtained at a rate of once per minute on average. Model updates were timed according to four methods: never, respiratory surrogate-based (when metrics based on respiratory surrogate measurements exceeded confidence limits), error-based (when localization error ≥3 mm), and always (approximately once per minute). Results: Radial tumor displacement prediction errors (mean ± standard deviation) for the four schema described above were 2.4 ± 1.2, 1.9 ± 0.9, 1.9 ± 0.8, and 1.7 ± 0.8 mm, respectively. The never-update error was significantly larger than errors of the other methods. Mean update counts over 20 min were 0, 4, 9, and 24, respectively. Conclusions: The same improvement in tumor localization accuracy could be achieved through any of the three update methods, but significantly fewer updates were required when the respiratory surrogate method was utilized. This study establishes the feasibility of timing image acquisitions for updating respiratory surrogate models without direct tumor localization. PMID:23822413
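
    A minimal sketch of the modeling idea only, with synthetic data and hypothetical variable names: fit a partial-least-squares map from external respiratory-surrogate marker positions to tumor position, then trigger a refit whenever the 3-D localization error reaches the 3 mm threshold mentioned above (the study built its models from the first six measurements; a longer initial window is used here purely for numerical convenience).

```python
# Illustrative sketch; not the study's data or exact protocol.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)
markers = rng.normal(size=(200, 6))                        # 2 markers x 3 axes
true_map = rng.normal(size=(6, 3))
tumor = markers @ true_map + 0.5 * rng.normal(size=(200, 3))   # tumor position, mm

n0 = 20                                                    # initial training window
model = PLSRegression(n_components=3).fit(markers[:n0], tumor[:n0])
updates = 0
for t in range(n0, 200):
    pred = model.predict(markers[t:t + 1])[0]
    error_mm = np.linalg.norm(pred - tumor[t])
    if error_mm >= 3.0:                                    # error-based update rule
        model = PLSRegression(n_components=3).fit(markers[:t + 1], tumor[:t + 1])
        updates += 1
print("model updates triggered:", updates)
```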

  15. Program to Optimize Simulated Trajectories II (POST2) Surrogate Models for Mars Ascent Vehicle (MAV) Performance Assessment

    NASA Technical Reports Server (NTRS)

    Zwack, M. R.; Dees, P. D.; Thomas, H. D.; Polsgrove, T. P.; Holt, J. B.

    2017-01-01

    The primary purpose of the multiPOST tool is to enable the execution of much larger sets of vehicle cases to allow for broader trade space exploration. However, this exploration is not achieved solely with the increased case throughput. The multiPOST tool is applied to carry out a Design of Experiments (DOE), which is a set of cases that have been structured to capture a maximum amount of information about the design space with minimal computational effort. The results of the DOE are then used to fit a surrogate model, ultimately enabling parametric design space exploration. The approach used for the MAV study includes both DOE and surrogate modeling. First, the primary design considerations for the vehicle were used to develop the variables and ranges for the multiPOST DOE. The final set of DOE variables were carefully selected in order to capture the desired vehicle trades and take into account any special considerations for surrogate modeling. Next, the DOE sets were executed through multiPOST. Following successful completion of the DOE cases, a manual verification trial was performed. The trial involved randomly selecting cases from the DOE set and running them by hand. The results from the human analyst's run and multiPOST were then compared to ensure that the automated runs were being executed properly. Completion of the verification trials was then followed by surrogate model fitting. After fits to the multiPOST data were successfully created, the surrogate models were used as a stand-in for POST2 to carry out the desired MAV trades. Using the surrogate models in lieu of POST2 allowed for visualization of vehicle sensitivities to the input variables as well as rapid evaluation of vehicle performance. Although the models introduce some error into the output of the trade study, they were very effective at identifying areas of interest within the trade space for further refinement by human analysts. The next section will cover all of the ground rules and assumptions associated with DOE setup and multiPOST execution. Section 3.1 gives the final DOE variables and ranges, while section 3.2 addresses the POST2 specific assumptions. The results of the verification trials are given in section 4. Section 5 gives the surrogate model fitting results, including the goodness-of-fit metrics for each fit. Finally, the MAV specific results are discussed in section 6.

  16. The effect of prognostic data presentation format on perceived risk among surrogate decision makers of critically ill patients: a randomized comparative trial.

    PubMed

    Chapman, Andy R; Litton, Edward; Chamberlain, Jenny; Ho, Kwok M

    2015-04-01

    The purpose of this study is to determine whether varying the format used to present prognostic data alters the perception of risk among surrogate decision makers in the intensive care unit (ICU). This was a prospective randomized comparative trial conducted in a 23-bed adult tertiary ICU. Enrolled surrogate decision makers were randomized to 1 of 2 questionnaires, which presented hypothetical ICU scenarios, identical other than the format in which prognostic data were presented (eg, frequencies vs percentages). Participants were asked to rate the risk associated with each prognostic statement. We enrolled 141 surrogate decision makers. The perception of risk varied significantly dependent on the presentation format. For "quantitative data," risks were consistently perceived as higher, when presented as frequencies (eg, 1 in 50) compared with equivalent percentages (eg, 2%). Framing "qualitative data" in terms of chance of "death" rather than "survival" led to a statistically significant increase in perceived risks. Framing "quantitative" data in this way did not significantly affect risk perception. Data format had a significant effect on how surrogate decision makers interpreted risk. Qualitative statements are interpreted widely and affected by framing. Where possible, multiple quantitative formats should be used for presenting prognostic information. Crown Copyright © 2014. Published by Elsevier Inc. All rights reserved.

  17. Comparison of surrogate indices for insulin sensitivity with parameters of the intravenous glucose tolerance test in early lactation dairy cattle.

    PubMed

    Alves-Nores, V; Castillo, C; Hernandez, J; Abuelo, A

    2017-10-01

    The aim of this study was to investigate the correlation between different surrogate indices and parameters of the intravenous glucose tolerance test (IVGTT) in dairy cows at the start of their lactation. Ten dairy cows underwent IVGTT on Days 3 to 7 after calving. Areas under the curve during the 90 min after infusion, peak and nadir concentrations, elimination rates, and times to reach half-maximal and basal concentrations for glucose, insulin, nonesterified fatty acids, and β-hydroxybutyrate were calculated. Surrogate indices were computed using the average of the IVGTT basal samples, and their correlation with the IVGTT parameters studied through the Spearman's rank test. No statistically significant or strong correlation coefficients (P > 0.05; |ρ| < 0.50) were observed between the insulin sensitivity measures derived from the IVGTT and any of the surrogate indices. Therefore, these results support that the assessment of insulin sensitivity in early lactation cattle cannot rely on the calculation of surrogate indices in just a blood sample, and the more laborious tests (ie, hyperinsulinemic euglycemic clamp test or IVGTT) should be employed to predict the sensitivity of the peripheral tissues to insulin accurately. Copyright © 2017 Elsevier Inc. All rights reserved.

  18. A virtual climate library of surface temperature over North America for 1979-2015

    NASA Astrophysics Data System (ADS)

    Kravtsov, Sergey; Roebber, Paul; Brazauskas, Vytaras

    2017-10-01

    The most comprehensive continuous-coverage modern climatic data sets, known as reanalyses, come from combining state-of-the-art numerical weather prediction (NWP) models with diverse available observations. These reanalysis products estimate the path of climate evolution that actually happened, and their use in a probabilistic context—for example, to document trends in extreme events in response to climate change—is, therefore, limited. Free runs of NWP models without data assimilation can in principle be used for the latter purpose, but such simulations are computationally expensive and are prone to systematic biases. Here we produce a high-resolution, 100-member ensemble simulation of surface atmospheric temperature over North America for the 1979-2015 period using a comprehensive spatially extended non-stationary statistical model derived from the data based on the North American Regional Reanalysis. The surrogate climate realizations generated by this model are independent from, yet nearly statistically congruent with reality. This data set provides unique opportunities for the analysis of weather-related risk, with applications in agriculture, energy development, and protection of human life.

  19. A virtual climate library of surface temperature over North America for 1979–2015

    PubMed Central

    Kravtsov, Sergey; Roebber, Paul; Brazauskas, Vytaras

    2017-01-01

    The most comprehensive continuous-coverage modern climatic data sets, known as reanalyses, come from combining state-of-the-art numerical weather prediction (NWP) models with diverse available observations. These reanalysis products estimate the path of climate evolution that actually happened, and their use in a probabilistic context—for example, to document trends in extreme events in response to climate change—is, therefore, limited. Free runs of NWP models without data assimilation can in principle be used for the latter purpose, but such simulations are computationally expensive and are prone to systematic biases. Here we produce a high-resolution, 100-member ensemble simulation of surface atmospheric temperature over North America for the 1979–2015 period using a comprehensive spatially extended non-stationary statistical model derived from the data based on the North American Regional Reanalysis. The surrogate climate realizations generated by this model are independent from, yet nearly statistically congruent with reality. This data set provides unique opportunities for the analysis of weather-related risk, with applications in agriculture, energy development, and protection of human life. PMID:29039842

  20. A virtual climate library of surface temperature over North America for 1979-2015.

    PubMed

    Kravtsov, Sergey; Roebber, Paul; Brazauskas, Vytaras

    2017-10-17

    The most comprehensive continuous-coverage modern climatic data sets, known as reanalyses, come from combining state-of-the-art numerical weather prediction (NWP) models with diverse available observations. These reanalysis products estimate the path of climate evolution that actually happened, and their use in a probabilistic context-for example, to document trends in extreme events in response to climate change-is, therefore, limited. Free runs of NWP models without data assimilation can in principle be used for the latter purpose, but such simulations are computationally expensive and are prone to systematic biases. Here we produce a high-resolution, 100-member ensemble simulation of surface atmospheric temperature over North America for the 1979-2015 period using a comprehensive spatially extended non-stationary statistical model derived from the data based on the North American Regional Reanalysis. The surrogate climate realizations generated by this model are independent from, yet nearly statistically congruent with reality. This data set provides unique opportunities for the analysis of weather-related risk, with applications in agriculture, energy development, and protection of human life.

  1. Biomarkers and surrogate endpoints in kidney disease

    PubMed Central

    2015-01-01

    Kidney disease and its related comorbidities impose a large public health burden. Despite this, the number of clinical trials in nephrology lags behind many other fields. An important factor contributing to the relatively slow pace of nephrology trials is that existing clinical endpoints have significant limitations. “Hard” endpoints for chronic kidney disease, such as progression to end-stage renal disease, may not be reached for decades. Traditional biomarkers, such as serum creatinine in acute kidney injury, may lack sensitivity and predictive value. Finding new biomarkers to serve as surrogate endpoints is therefore an important priority in kidney disease research and may help to accelerate nephrology clinical trials. In this paper, I first review key concepts related to the selection of clinical trial endpoints and discuss statistical and regulatory considerations related to the evaluation of biomarkers as surrogate endpoints. This is followed by a discussion of the challenges and opportunities in developing novel biomarkers and surrogate endpoints in three major areas of nephrology research: acute kidney injury, chronic kidney disease, and autosomal dominant polycystic kidney disease. PMID:25980469

  2. Developing a particle tracking surrogate model to improve inversion of ground water - Surface water models

    NASA Astrophysics Data System (ADS)

    Cousquer, Yohann; Pryet, Alexandre; Atteia, Olivier; Ferré, Ty P. A.; Delbart, Célestine; Valois, Rémi; Dupuy, Alain

    2018-03-01

    The inverse problem of groundwater models is often ill-posed and model parameters are likely to be poorly constrained. Identifiability is improved if diverse data types are used for parameter estimation. However, some models, including detailed solute transport models, are further limited by prohibitive computation times. This often precludes the use of concentration data for parameter estimation, even if those data are available. In the case of surface water-groundwater (SW-GW) models, concentration data can provide SW-GW mixing ratios, which efficiently constrain the estimate of exchange flow, but are rarely used. We propose to reduce computational limits by simulating SW-GW exchange at a sink (well or drain) based on particle tracking under steady state flow conditions. Particle tracking is used to simulate advective transport. A comparison between the particle tracking surrogate model and an advective-dispersive model shows that dispersion can often be neglected when the mixing ratio is computed for a sink, allowing for use of the particle tracking surrogate model. The surrogate model was implemented to solve the inverse problem for a real SW-GW transport problem with heads and concentrations combined in a weighted hybrid objective function. The resulting inversion showed markedly reduced uncertainty in the transmissivity field compared to calibration on head data alone.
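
    A hedged illustration of the weighted hybrid objective idea: residuals of simulated heads and of the SW-GW mixing ratio at a sink are combined into one scalar misfit. The simulate() function, the toy transmissivity dependence, and the weights are hypothetical placeholders, not the paper's surrogate model.

```python
import numpy as np

def hybrid_objective(params, obs_heads, obs_mixing_ratio,
                     simulate, w_head=1.0, w_mix=50.0):
    """Weighted least-squares misfit over two data types (heads + mixing ratio)."""
    sim_heads, sim_mixing_ratio = simulate(params)     # surrogate model call
    res_h = sim_heads - obs_heads
    res_m = sim_mixing_ratio - obs_mixing_ratio
    return w_head * np.sum(res_h ** 2) + w_mix * res_m ** 2

# Toy stand-in for the particle-tracking surrogate: heads and mixing ratio as
# simple functions of a transmissivity-like parameter (illustrative only).
def simulate(params):
    T = params[0]
    heads = np.array([10.0, 9.5, 9.0]) - 0.2 * np.log(T)
    mixing_ratio = 1.0 / (1.0 + T)       # fraction of surface water at the well
    return heads, mixing_ratio

obs_heads = np.array([9.6, 9.1, 8.6])
print(hybrid_objective(np.array([5.0]), obs_heads, 0.2, simulate))
```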

  3. Committee-Based Active Learning for Surrogate-Assisted Particle Swarm Optimization of Expensive Problems.

    PubMed

    Wang, Handing; Jin, Yaochu; Doherty, John

    2017-09-01

    Function evaluations (FEs) of many real-world optimization problems are time or resource consuming, posing a serious challenge to the application of evolutionary algorithms (EAs) to solve these problems. To address this challenge, the research on surrogate-assisted EAs has attracted increasing attention from both academia and industry over the past decades. However, most existing surrogate-assisted EAs (SAEAs) either still require thousands of expensive FEs to obtain acceptable solutions, or are only applied to very low-dimensional problems. In this paper, a novel surrogate-assisted particle swarm optimization (PSO) inspired from committee-based active learning (CAL) is proposed. In the proposed algorithm, a global model management strategy inspired from CAL is developed, which searches for the best and most uncertain solutions according to a surrogate ensemble using a PSO algorithm and evaluates these solutions using the expensive objective function. In addition, a local surrogate model is built around the best solution obtained so far. Then, a PSO algorithm searches on the local surrogate to find its optimum and evaluates it. The evolutionary search using the global model management strategy switches to the local search once no further improvement can be observed, and vice versa. This iterative search process continues until the computational budget is exhausted. Experimental results comparing the proposed algorithm with a few state-of-the-art SAEAs on both benchmark problems up to 30 decision variables as well as an airfoil design problem demonstrate that the proposed algorithm is able to achieve better or competitive solutions with a limited budget of hundreds of exact FEs.
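
    A simplified sketch of the committee-based acquisition idea: a bootstrap ensemble of surrogates scores candidate solutions, and both the predicted-best and the most-uncertain candidates are sent to the expensive objective. A random candidate pool replaces the PSO search used in the paper, and the expensive() function is a placeholder; this is a rough sketch of the selection rule, not the full algorithm.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor

def expensive(x):                       # placeholder expensive objective
    return np.sum((x - 0.3) ** 2, axis=-1)

rng = np.random.default_rng(0)
X = rng.random((20, 5))                 # archive of already-evaluated solutions
y = expensive(X)

# Committee: surrogates trained on bootstrap resamples of the archive
committee = []
for _ in range(5):
    idx = rng.integers(0, len(X), len(X))
    committee.append(
        GaussianProcessRegressor(alpha=1e-6, normalize_y=True).fit(X[idx], y[idx]))

candidates = rng.random((500, 5))       # stand-in for a PSO candidate search
preds = np.stack([m.predict(candidates) for m in committee])   # (5, 500)
best = candidates[np.argmin(preds.mean(axis=0))]       # exploit: committee mean
uncertain = candidates[np.argmax(preds.std(axis=0))]   # explore: disagreement
print("evaluate expensively:", expensive(best), expensive(uncertain))
```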

  4. A Method of Surrogate Model Construction which Leverages Lower-Fidelity Information using Space Mapping Techniques

    DTIC Science & Technology

    2014-03-27

    fidelity. This pairing is accomplished through the use of a space mapping technique, which is a process where the design space of a lower fidelity model...is aligned with a higher fidelity model. The intent of applying space mapping techniques to the field of surrogate construction is to leverage the

  5. An adaptive sparse-grid high-order stochastic collocation method for Bayesian inference in groundwater reactive transport modeling

    NASA Astrophysics Data System (ADS)

    Zhang, Guannan; Lu, Dan; Ye, Ming; Gunzburger, Max; Webster, Clayton

    2013-10-01

    Bayesian analysis has become vital to uncertainty quantification in groundwater modeling, but its application has been hindered by the computational cost associated with the numerous model executions required to explore the posterior probability density function (PPDF) of model parameters. This is particularly the case when the PPDF is estimated using Markov Chain Monte Carlo (MCMC) sampling. In this study, a new approach is developed to improve the computational efficiency of Bayesian inference by constructing a surrogate of the PPDF, using an adaptive sparse-grid high-order stochastic collocation (aSG-hSC) method. Unlike previous works using a first-order hierarchical basis, this paper utilizes a compactly supported higher-order hierarchical basis to construct the surrogate system, resulting in a significant reduction in the number of required model executions. In addition, using the hierarchical surplus as an error indicator allows locally adaptive refinement of sparse grids in the parameter space, which further improves computational efficiency. To efficiently build the surrogate system for the PPDF with multiple significant modes, optimization techniques are used to identify the modes, for which high-probability regions are defined and components of the aSG-hSC approximation are constructed. After the surrogate is determined, the PPDF can be evaluated by sampling the surrogate system directly without model execution, resulting in improved efficiency of the surrogate-based MCMC compared with conventional MCMC. The developed method is evaluated using two synthetic groundwater reactive transport models. The first example involves coupled linear reactions and demonstrates the accuracy of our high-order hierarchical basis approach in approximating a high-dimensional posterior distribution. The second example is highly nonlinear because of the reactions of uranium surface complexation, and demonstrates how the iterative aSG-hSC method is able to capture multimodal and non-Gaussian features of the PPDF caused by model nonlinearity. Both experiments show that aSG-hSC is an effective and efficient tool for Bayesian inference.
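
    The core idea of "MCMC on a surrogate of the posterior" can be sketched compactly. In the sketch below a radial basis function interpolant of the log-posterior stands in for the adaptive sparse-grid collocation surrogate used in the paper, and a basic Metropolis sampler draws from the surrogate so that no original-model evaluations are needed online; the log_posterior function and all settings are assumptions.

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

def log_posterior(theta):               # placeholder for expensive model + prior
    return -0.5 * np.sum((theta - 0.2) ** 2 / 0.05 ** 2, axis=-1)

# Offline phase: evaluate the expensive log-posterior on a design of nodes
nodes = np.random.default_rng(0).uniform(-0.5, 0.9, size=(400, 2))
surrogate = RBFInterpolator(nodes, log_posterior(nodes))

# Online phase: Metropolis sampling using only the cheap surrogate
rng = np.random.default_rng(1)
theta = np.zeros(2)
log_p = surrogate([theta])[0]
chain = []
for _ in range(5000):
    prop = theta + 0.05 * rng.normal(size=2)
    log_p_prop = surrogate([prop])[0]
    if np.log(rng.random()) < log_p_prop - log_p:
        theta, log_p = prop, log_p_prop
    chain.append(theta)
print("posterior mean estimate:", np.mean(chain, axis=0))
```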

  6. An adaptive Gaussian process-based method for efficient Bayesian experimental design in groundwater contaminant source identification problems: ADAPTIVE GAUSSIAN PROCESS-BASED INVERSION

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Jiangjiang; Li, Weixuan; Zeng, Lingzao

    Surrogate models are commonly used in Bayesian approaches such as Markov Chain Monte Carlo (MCMC) to avoid repetitive CPU-demanding model evaluations. However, the approximation error of a surrogate may lead to biased estimations of the posterior distribution. This bias can be corrected by constructing a very accurate surrogate or implementing MCMC in a two-stage manner. Since the two-stage MCMC requires extra original model evaluations, the computational cost is still high. If the measurement information is incorporated, a locally accurate approximation of the original model can be adaptively constructed with low computational cost. Based on this idea, we propose a Gaussian process (GP) surrogate-based Bayesian experimental design and parameter estimation approach for groundwater contaminant source identification problems. A major advantage of the GP surrogate is that it provides a convenient estimation of the approximation error, which can be incorporated in the Bayesian formula to avoid over-confident estimation of the posterior distribution. The proposed approach is tested with a numerical case study. Without sacrificing the estimation accuracy, the new approach achieves a speed-up of about 200 times compared to our previous work using two-stage MCMC.
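
    A hedged sketch of the key idea: let the GP surrogate's predictive variance inflate the observation-error variance in a Gaussian likelihood, so the posterior is not over-confident where the surrogate is inaccurate. The toy forward model, the single observation, and all names are assumptions, not the paper's setup.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor

def forward(theta):                      # placeholder expensive forward model
    return np.sin(3.0 * theta)

train_theta = np.linspace(0.0, 1.0, 8).reshape(-1, 1)
gp = GaussianProcessRegressor(normalize_y=True).fit(
    train_theta, forward(train_theta).ravel())

obs, sigma_obs = 0.6, 0.05               # single observation and its noise std

def log_likelihood(theta):
    mu, std = gp.predict(np.atleast_2d(theta), return_std=True)
    total_var = sigma_obs ** 2 + std[0] ** 2     # data noise + surrogate error
    return -0.5 * ((obs - mu[0]) ** 2 / total_var + np.log(2 * np.pi * total_var))

print(log_likelihood(0.2), log_likelihood(0.95))
```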

  7. Efficient SRAM yield optimization with mixture surrogate modeling

    NASA Astrophysics Data System (ADS)

    Zhongjian, Jiang; Zuochang, Ye; Yan, Wang

    2016-12-01

    Largely repeated cells such as SRAM cells usually require extremely low failure rates to ensure a moderate chip yield. Though fast Monte Carlo methods such as importance sampling and its variants can be used for yield estimation, they are still very expensive if one needs to perform optimization based on such estimations. Typically, yield calculation requires a large number of SPICE simulations, and these circuit simulations account for the largest share of the computation time. In this paper, a new method is proposed to address this issue. The key idea is to establish an efficient mixture surrogate model based on the design variables and process variables. The model is constructed from a set of SPICE-simulated sample points, which are used to train the mixture surrogate model with the lasso algorithm. Experimental results show that the proposed model calculates yield accurately and brings significant speed-ups to the failure-rate calculation. Based on the model, a further accelerated algorithm was developed to enhance the speed of the yield calculation. The approach is suitable for high-dimensional process variables and multi-performance applications.
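
    A rough sketch under stated assumptions: a sparse polynomial surrogate (lasso on polynomial features) replaces SPICE for a scalar performance metric, and the failure rate is then estimated by cheap Monte Carlo on the surrogate. The spice_metric() function, the process-variation model, and the failure threshold are hypothetical, and this is not the paper's mixture-model formulation.

```python
import numpy as np
from sklearn.linear_model import Lasso
from sklearn.preprocessing import PolynomialFeatures
from sklearn.pipeline import make_pipeline

def spice_metric(v):                     # placeholder for a SPICE-measured metric
    return 0.30 + 0.05 * v[:, 0] - 0.04 * v[:, 1] + 0.02 * v[:, 0] * v[:, 2]

rng = np.random.default_rng(0)
V_train = rng.normal(size=(500, 6))      # process-variation samples (sigma units)
y_train = spice_metric(V_train)

surrogate = make_pipeline(PolynomialFeatures(degree=2),
                          Lasso(alpha=1e-4, max_iter=10000))
surrogate.fit(V_train, y_train)          # sparse polynomial surrogate

V_mc = rng.normal(size=(200_000, 6))     # cheap Monte Carlo on the surrogate
fail = surrogate.predict(V_mc) < 0.15    # failure: metric below spec (assumed)
print("estimated failure rate:", fail.mean())
```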

  8. Rapid Optimization of External Quantum Efficiency of Thin Film Solar Cells Using Surrogate Modeling of Absorptivity.

    PubMed

    Kaya, Mine; Hajimirza, Shima

    2018-05-25

    This paper uses surrogate modeling for very fast design of thin film solar cells with improved solar-to-electricity conversion efficiency. We demonstrate that the wavelength-specific optical absorptivity of a thin film multi-layered amorphous-silicon-based solar cell can be modeled accurately with Neural Networks and can be efficiently approximated as a function of cell geometry and wavelength. Consequently, the external quantum efficiency can be computed by averaging surrogate absorption and carrier recombination contributions over the entire irradiance spectrum in an efficient way. Using this framework, we optimize a multi-layer structure consisting of ITO front coating, metallic back-reflector and oxide layers for achieving maximum efficiency. Our required computation time for an entire model fitting and optimization is 5 to 20 times less than the best previous optimization results based on direct Finite Difference Time Domain (FDTD) simulations, therefore proving the value of surrogate modeling. The resulting optimization solution suggests at least 50% improvement in the external quantum efficiency compared to bare silicon, and 25% improvement compared to a random design.
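
    A hedged sketch of the surrogate idea: a small neural network learns absorptivity as a function of (layer thicknesses, wavelength), and the external quantum efficiency is then approximated as a spectrum-weighted average of surrogate predictions. All data, layer counts, and the flat spectrum weights are synthetic placeholders, not the paper's FDTD results or the AM1.5 spectrum.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
thick = rng.uniform(50, 500, size=(2000, 3))          # nm, three layers (assumed)
wl = rng.uniform(300, 1100, size=(2000, 1))           # nm
X = np.hstack([thick, wl])
absorp = 0.5 + 0.3 * np.sin(thick[:, 0] / 80.0) * np.exp(-wl[:, 0] / 900.0)  # toy

nn = make_pipeline(StandardScaler(),
                   MLPRegressor(hidden_layer_sizes=(64, 64),
                                max_iter=2000, random_state=0))
nn.fit(X, absorp)

def eqe(design, wavelengths, spectrum_weights):
    """Spectrum-weighted average of surrogate absorptivity for one geometry."""
    Xq = np.column_stack([np.tile(design, (len(wavelengths), 1)), wavelengths])
    a = np.clip(nn.predict(Xq), 0.0, 1.0)
    return np.average(a, weights=spectrum_weights)

wls = np.linspace(300, 1100, 81)
weights = np.ones_like(wls)                            # flat stand-in spectrum
print("approx. EQE:", eqe(np.array([200.0, 120.0, 300.0]), wls, weights))
```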

  9. A data-driven dynamics simulation framework for railway vehicles

    NASA Astrophysics Data System (ADS)

    Nie, Yinyu; Tang, Zhao; Liu, Fengjia; Chang, Jian; Zhang, Jianjun

    2018-03-01

    The finite element (FE) method is essential for simulating vehicle dynamics with fine details, especially for train crash simulations. However, factors such as the complexity of meshes and the distortion involved in a large deformation would undermine its calculation efficiency. An alternative method, multi-body (MB) dynamics simulation, provides satisfactory time efficiency but limited accuracy when a highly nonlinear dynamic process is involved. To maintain the advantages of both methods, this paper proposes a data-driven simulation framework for dynamics simulation of railway vehicles. This framework uses machine learning techniques to extract nonlinear features from training data generated by FE simulations so that specific mesh structures can be formulated by a surrogate element (or surrogate elements) to replace the original mechanical elements, and the dynamics simulation can be implemented by co-simulation with the surrogate element(s) embedded into an MB model. This framework consists of a series of techniques including data collection, feature extraction, training data sampling, surrogate element building, and model evaluation and selection. To verify the feasibility of this framework, we present two case studies, a vertical dynamics simulation and a longitudinal dynamics simulation, based on co-simulation with MATLAB/Simulink and Simpack, and a further comparison with a popular data-driven model (the Kriging model) is provided. The simulation result shows that using the Legendre polynomial regression model to build surrogate elements can largely cut down the simulation time without sacrificing accuracy.
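
    A minimal sketch, under assumed data, of a Legendre polynomial regression surrogate for a 1-D force-deflection response, of the kind that could stand in for a nonlinear mechanical element during co-simulation; the toy cubic response below is not the paper's FE training data.

```python
import numpy as np
from numpy.polynomial import legendre as L

# Training data from a high-fidelity (FE-like) response, here a toy nonlinearity
deflection = np.linspace(-1.0, 1.0, 200)              # normalized input
force = (1.5 * deflection + 0.8 * deflection ** 3
         + 0.01 * np.random.default_rng(0).normal(size=200))

coeffs = L.legfit(deflection, force, deg=5)           # least-squares Legendre fit

def surrogate_element(x):
    """Cheap replacement for the detailed element inside the MB time loop."""
    return L.legval(x, coeffs)

print("force at 0.37:", surrogate_element(0.37))
```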

  10. A video to improve patient and surrogate understanding of cardiopulmonary resuscitation choices in the ICU: a randomized controlled trial.

    PubMed

    Wilson, Michael E; Krupa, Artur; Hinds, Richard F; Litell, John M; Swetz, Keith M; Akhoundi, Abbasali; Kashyap, Rahul; Gajic, Ognjen; Kashani, Kianoush

    2015-03-01

    To determine if a video depicting cardiopulmonary resuscitation and resuscitation preference options would improve knowledge and decision making among patients and surrogates in the ICU. Randomized, unblinded trial. Single medical ICU. Patients and surrogate decision makers in the ICU. The usual care group received a standard pamphlet about cardiopulmonary resuscitation and cardiopulmonary resuscitation preference options plus routine code status discussions with clinicians. The video group received usual care plus an 8-minute video that depicted cardiopulmonary resuscitation, showed a simulated hospital code, and explained resuscitation preference options. One hundred three patients and surrogates were randomized to usual care. One hundred five patients and surrogates were randomized to video plus usual care. Median total knowledge scores (0-15 points possible for correct answers) in the video group were 13 compared with 10 in the usual care group, p value of less than 0.0001. Video group participants had higher rates of understanding the purpose of cardiopulmonary resuscitation and resuscitation options and terminology and could correctly name components of cardiopulmonary resuscitation. No statistically significant differences in documented resuscitation preferences following the interventions were found between the two groups, although the trial was underpowered to detect such differences. A majority of participants felt that the video was helpful in cardiopulmonary resuscitation decision making (98%) and would recommend the video to others (99%). A video depicting cardiopulmonary resuscitation and explaining resuscitation preference options was associated with improved knowledge of in-hospital cardiopulmonary resuscitation options and cardiopulmonary resuscitation terminology among patients and surrogate decision makers in the ICU, compared with receiving a pamphlet on cardiopulmonary resuscitation. Patients and surrogates found the video helpful in decision making and would recommend the video to others.

  11. Sediment transport and evaluation of sediment surrogate ratings in the Kootenai River near Bonners Ferry, Idaho, Water Years 2011–14

    USGS Publications Warehouse

    Wood, Molly S.; Fosness, Ryan L.; Etheridge, Alexandra B.

    2015-12-14

    Acoustic surrogate ratings were developed between backscatter data collected using acoustic Doppler velocity meters (ADVMs) and results of suspended-sediment samples. Ratings were successfully fit to various sediment size classes (total, fines, and sands) using ADVMs of different frequencies (1.5 and 3 megahertz). Surrogate ratings also were developed using variations of streamflow and seasonal explanatory variables. The streamflow surrogate ratings produced average annual sediment load estimates that were 8–32 percent higher, depending on site and sediment type, than estimates produced using the acoustic surrogate ratings. The streamflow surrogate ratings tended to overestimate suspended-sediment concentrations and loads during periods of elevated releases from Libby Dam as well as on the falling limb of the streamflow hydrograph. Estimates from the acoustic surrogate ratings more closely matched suspended-sediment sample results than did estimates from the streamflow surrogate ratings during these periods as well as for rating validation samples collected in water year 2014. Acoustic surrogate technologies are an effective means to obtain continuous, accurate estimates of suspended-sediment concentrations and loads for general monitoring and sediment-transport modeling. In the Kootenai River, continued operation of the acoustic surrogate sites and use of the acoustic surrogate ratings to calculate continuous suspended-sediment concentrations and loads will allow for tracking changes in sediment transport over time.
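
    A hedged sketch of a sediment surrogate rating of this general type: a log-linear regression of suspended-sediment concentration (SSC) on acoustic backscatter is fit to paired samples and then used to convert a continuous ADVM record to SSC. The numbers are synthetic placeholders, not Kootenai River data, and the actual ratings may use additional corrections.

```python
import numpy as np

backscatter_db = np.array([62.0, 65.5, 68.0, 71.2, 74.5, 77.0])   # paired samples
ssc_mgL = np.array([8.0, 14.0, 25.0, 48.0, 95.0, 160.0])

slope, intercept = np.polyfit(backscatter_db, np.log10(ssc_mgL), 1)

def rating(backscatter):
    """Continuous SSC estimate (mg/L) from the ADVM backscatter time series."""
    return 10.0 ** (intercept + slope * np.asarray(backscatter))

continuous_record = np.array([63.0, 70.1, 76.2])
print("estimated SSC (mg/L):", rating(continuous_record))
```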

  12. Internet health information seeking is a team sport: analysis of the Pew Internet Survey.

    PubMed

    Sadasivam, Rajani S; Kinney, Rebecca L; Lemon, Stephenie C; Shimada, Stephanie L; Allison, Jeroan J; Houston, Thomas K

    2013-03-01

    Previous studies examining characteristics of Internet health information seekers do not distinguish between those who only seek for themselves, and surrogate seekers who look for health information for family or friends. Identifying the unique characteristics of surrogate seekers would help in developing Internet interventions that better support these information seekers. The objective was to assess differences between self seekers and those who also act as surrogate seekers. We analyzed data from the cross-sectional Pew Internet and American Life Project November/December 2008 health survey. Our dependent variable was self-report of type of health information seeking (surrogate versus self seeking). Independent variables included demographics, health status, and caregiving. After bivariate comparisons, we then developed multivariable models using logistic regression to assess characteristics associated with surrogate seeking. Out of 1250 respondents who reported seeking health information online, 56% (N=705) reported being surrogate seekers. In multivariable models, compared with those who sought information for themselves only, surrogate seekers were more likely to be both married and a parent (OR=1.57, CI=1.08, 2.28), to have good (OR=2.05, CI=1.34, 3.12) or excellent (OR=2.72, CI=1.70, 4.33) health status, to be the caregiver of an adult relative (OR=1.76, CI=1.34, 2.30), to have someone close with a serious medical condition (OR=1.62, CI=1.21, 2.17), and to have someone close to them facing a chronic illness (OR=1.55, CI=1.17, 2.04). Our findings provide evidence that the information needs of surrogate seekers, particularly caregivers, are not being met. Additional research is needed to develop new functions that support surrogate seekers. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
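
    An illustrative sketch only, on synthetic data: a logistic regression of surrogate seeking on respondent characteristics, with odds ratios obtained by exponentiating the coefficients, mirroring the type of multivariable model reported above. The predictor names and "true" effect sizes are invented for the example.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 1000
caregiver = rng.integers(0, 2, n)
good_health = rng.integers(0, 2, n)
logit = -0.3 + 0.57 * caregiver + 0.7 * good_health        # assumed true effects
surrogate_seeker = rng.random(n) < 1 / (1 + np.exp(-logit))

X = sm.add_constant(np.column_stack([caregiver, good_health]))
fit = sm.Logit(surrogate_seeker.astype(int), X).fit(disp=0)
odds_ratios = np.exp(fit.params)                            # OR = exp(coefficient)
print("odds ratios (const, caregiver, good health):", np.round(odds_ratios, 2))
```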

  13. Average absorption cross-section of the human body measured at 1-12 GHz in a reverberant chamber: results of a human volunteer study

    NASA Astrophysics Data System (ADS)

    Flintoft, I. D.; Robinson, M. P.; Melia, G. C. R.; Marvin, A. C.; Dawson, J. F.

    2014-07-01

    The electromagnetic absorption cross-section (ACS) averaged over polarization and angle-of-incidence of 60 ungrounded adult subjects was measured at microwave frequencies of 1-12 GHz in a reverberation chamber. Average ACS is important in non-ionizing dosimetry and exposure studies, and is closely related to the whole-body averaged specific absorption rate (WBSAR). The average ACS was measured with a statistical uncertainty of less than 3% and high frequency resolution for individuals with a range of body shapes and sizes allowing the statistical distribution of WBSAR over a real population with individual internal and external morphologies to be determined. The average ACS of all subjects was found to vary from 0.15 to 0.4 m2; for an individual subject it falls with frequency over 1-6 GHz, and then rises slowly over the 6-12 GHz range in which few other studies have been conducted. Average ACS and WBSAR are then used as a surrogate for worst-case ACS/WBSAR, in order to study their variability across a real population compared to literature results from simulations using numerical phantoms with a limited range of anatomies. Correlations with body morphological parameters such as height, mass and waist circumference have been investigated: the strongest correlation is with body surface area (BSA) at all frequencies above 1 GHz, however direct proportionality to BSA is not established until above 5 GHz. When the average ACS is normalized to the BSA, the resulting absorption efficiency shows a negative correlation with the estimated thickness of subcutaneous body fat. Surrogate models and statistical analysis of the measurement data are presented and compared to similar models from the literature. The overall dispersion of measured average WBSAR of the sample of the UK population studied is consistent with the dispersion of simulated worst-case WBSAR across multiple numerical phantom families. The statistical results obtained allow the calibration of human exposure assessments made with particular phantoms to a population with a range of individual morphologies.

  14. Average absorption cross-section of the human body measured at 1-12 GHz in a reverberant chamber: results of a human volunteer study.

    PubMed

    Flintoft, I D; Robinson, M P; Melia, G C R; Marvin, A C; Dawson, J F

    2014-07-07

    The electromagnetic absorption cross-section (ACS) averaged over polarization and angle-of-incidence of 60 ungrounded adult subjects was measured at microwave frequencies of 1-12 GHz in a reverberation chamber. Average ACS is important in non-ionizing dosimetry and exposure studies, and is closely related to the whole-body averaged specific absorption rate (WBSAR). The average ACS was measured with a statistical uncertainty of less than 3% and high frequency resolution for individuals with a range of body shapes and sizes allowing the statistical distribution of WBSAR over a real population with individual internal and external morphologies to be determined. The average ACS of all subjects was found to vary from 0.15 to 0.4 m(2); for an individual subject it falls with frequency over 1-6 GHz, and then rises slowly over the 6-12 GHz range in which few other studies have been conducted. Average ACS and WBSAR are then used as a surrogate for worst-case ACS/WBSAR, in order to study their variability across a real population compared to literature results from simulations using numerical phantoms with a limited range of anatomies. Correlations with body morphological parameters such as height, mass and waist circumference have been investigated: the strongest correlation is with body surface area (BSA) at all frequencies above 1 GHz, however direct proportionality to BSA is not established until above 5 GHz. When the average ACS is normalized to the BSA, the resulting absorption efficiency shows a negative correlation with the estimated thickness of subcutaneous body fat. Surrogate models and statistical analysis of the measurement data are presented and compared to similar models from the literature. The overall dispersion of measured average WBSAR of the sample of the UK population studied is consistent with the dispersion of simulated worst-case WBSAR across multiple numerical phantom families. The statistical results obtained allow the calibration of human exposure assessments made with particular phantoms to a population with a range of individual morphologies.

  15. Tracking contamination through ground beef production and identifying points of recontamination using a novel green fluorescent protein (GFP) expressing, E. coli O103, non-pathogenic surrogate

    USDA-ARS?s Scientific Manuscript database

    Introduction: Commonly, ground beef processors conduct studies to model contaminant flow through their production systems using surrogate organisms. Typical surrogate organisms may not behave as Escherichia coli O157:H7 during grinding and are not easy to detect at very low levels. Purpose: Develop...

  16. Estimation of Model's Marginal likelihood Using Adaptive Sparse Grid Surrogates in Bayesian Model Averaging

    NASA Astrophysics Data System (ADS)

    Zeng, X.

    2015-12-01

    A large number of model executions are required to obtain alternative conceptual models' predictions and their posterior probabilities in Bayesian model averaging (BMA). The posterior model probability is estimated from a model's marginal likelihood and prior probability. The heavy computational burden hinders the implementation of BMA prediction, especially when an elaborate marginal likelihood estimator is used. To overcome the computational burden of BMA, an adaptive sparse grid (SG) stochastic collocation method is used to build surrogates for alternative conceptual models in a numerical experiment with a synthetic groundwater model. BMA predictions depend on model posterior weights (or marginal likelihoods), and this study also evaluated four marginal likelihood estimators: the arithmetic mean estimator (AME), harmonic mean estimator (HME), stabilized harmonic mean estimator (SHME), and thermodynamic integration estimator (TIE). The results demonstrate that TIE is accurate in estimating conceptual models' marginal likelihoods, and BMA-TIE has better predictive performance than the other BMA predictions. TIE is also highly stable: marginal likelihoods repeatedly estimated by TIE show significantly less variability than those estimated by the other estimators. In addition, the SG surrogates efficiently facilitate BMA predictions, especially for BMA-TIE. The number of model executions needed to build the surrogates is 4.13%, 6.89%, 3.44%, and 0.43% of the model executions required by BMA-AME, BMA-HME, BMA-SHME, and BMA-TIE, respectively.
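
    A toy illustration of two of the marginal-likelihood estimators named above, computed for a conjugate one-dimensional Gaussian model so that the exact answer is available for comparison: the arithmetic mean estimator averages likelihoods over prior draws, while the harmonic mean estimator averages inverse likelihoods over posterior draws. The sparse-grid surrogate step is omitted here, and the model is an assumption chosen only for tractability.

```python
import numpy as np
from scipy.stats import norm, multivariate_normal

rng = np.random.default_rng(0)
data = rng.normal(loc=1.0, scale=1.0, size=20)     # y_i ~ N(theta, 1)
prior_mu, prior_sd = 0.0, 2.0                      # theta ~ N(0, 2^2)

def log_like(theta):
    return norm.logpdf(data[:, None], loc=theta, scale=1.0).sum(axis=0)

# AME: average likelihood under prior samples
theta_prior = rng.normal(prior_mu, prior_sd, 100_000)
ame = np.exp(log_like(theta_prior)).mean()

# HME: harmonic mean of likelihoods under (here exact conjugate) posterior samples
post_var = 1.0 / (1.0 / prior_sd ** 2 + len(data))
post_mu = post_var * (prior_mu / prior_sd ** 2 + data.sum())
theta_post = rng.normal(post_mu, np.sqrt(post_var), 100_000)
hme = 1.0 / np.mean(np.exp(-log_like(theta_post)))   # known to be less stable

# Exact marginal likelihood for this conjugate model (correlated Gaussian)
cov = np.eye(len(data)) + prior_sd ** 2 * np.ones((len(data), len(data)))
exact = np.exp(multivariate_normal.logpdf(data, mean=np.full(len(data), prior_mu),
                                          cov=cov))
print(f"AME {ame:.3e}   HME {hme:.3e}   exact {exact:.3e}")
```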

  17. Surrogate assisted multidisciplinary design optimization for an all-electric GEO satellite

    NASA Astrophysics Data System (ADS)

    Shi, Renhe; Liu, Li; Long, Teng; Liu, Jian; Yuan, Bin

    2017-09-01

    State-of-the-art all-electric geostationary earth orbit (GEO) satellites use electric thrusters to execute all propulsive duties, which significantly differ from traditional all-chemical satellites in orbit-raising, station-keeping, radiation damage protection, power budget, etc. The design optimization of an all-electric GEO satellite is therefore a complex multidisciplinary design optimization (MDO) problem involving unique design considerations. Solving the all-electric GEO satellite MDO problem, however, poses major challenges in disciplinary modeling and efficient optimization strategy. To address these challenges, we present a surrogate assisted MDO framework consisting of several modules, i.e., MDO problem definition, multidisciplinary modeling, multidisciplinary analysis (MDA), and a surrogate assisted optimizer. Based on the proposed framework, the all-electric GEO satellite MDO problem is formulated to minimize the total mass of the satellite system under a number of practical constraints. Considerable effort is then devoted to multidisciplinary modeling of the geosynchronous transfer, GEO station-keeping, power, thermal control, attitude control, and structure disciplines. Since the orbit dynamics models and the finite element structural model are computationally expensive, an adaptive response surface surrogate based optimizer is incorporated in the proposed framework to solve the satellite MDO problem at moderate computational cost, where a response surface surrogate is gradually refined to represent the computationally expensive MDA process. After optimization, the total mass of the studied GEO satellite is decreased by 185.3 kg (i.e., 7.3% of the total mass). Finally, the optimal design is further discussed to demonstrate the effectiveness of the proposed framework in coping with all-electric GEO satellite system design optimization problems. The proposed surrogate assisted MDO framework can also provide a valuable reference for the design of other all-electric spacecraft systems.

  18. Convergence analysis of surrogate-based methods for Bayesian inverse problems

    NASA Astrophysics Data System (ADS)

    Yan, Liang; Zhang, Yuan-Xiang

    2017-12-01

    The major challenges in Bayesian inverse problems arise from the need for repeated evaluations of the forward model, as required by Markov chain Monte Carlo (MCMC) methods for posterior sampling. Many attempts at accelerating Bayesian inference have relied on surrogates for the forward model, typically constructed through repeated forward simulations performed in an offline phase. Although such approaches can be quite effective at reducing computational cost, there has been little analysis of the effect of the approximation on posterior inference. In this work, we prove error bounds on the Kullback-Leibler (KL) divergence between the true posterior distribution and the approximation based on surrogate models. Our rigorous error analysis shows that if the forward model approximation converges at a certain rate in the prior-weighted L² norm, then the posterior distribution generated by the approximation converges to the true posterior at least two times faster in the KL sense. An error bound in the Hellinger distance is also provided. To provide concrete examples of surrogate-model-based methods, we present an efficient technique for constructing stochastic surrogate models to accelerate Bayesian inference. The Christoffel least squares algorithms, based on generalized polynomial chaos, are used to construct a polynomial approximation of the forward solution over the support of the prior distribution. The numerical strategy and the predicted convergence rates are then demonstrated on nonlinear inverse problems involving the inference of parameters appearing in partial differential equations.
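
    To make the surrogate-accelerated inference idea concrete, here is a minimal sketch in which a least-squares polynomial approximation (a simple stand-in for the Christoffel least-squares / generalized polynomial chaos construction) replaces the forward model inside a posterior evaluation; the forward model, prior, and dimensions are toy assumptions:

```python
# Hedged sketch: fit a polynomial surrogate over the prior support by least squares,
# then evaluate the (unnormalized) log-posterior with the surrogate instead of the
# expensive forward model.
import numpy as np
from numpy.polynomial import legendre

rng = np.random.default_rng(0)
forward = lambda theta: np.sin(3 * theta) + 0.5 * theta ** 2   # stand-in forward model

# training designs drawn from a uniform prior on [-1, 1]
theta_train = rng.uniform(-1, 1, 200)
coeffs = legendre.legfit(theta_train, forward(theta_train), deg=8)
surrogate = lambda theta: legendre.legval(theta, coeffs)

def log_posterior(theta, data, sigma=0.1):
    """Unnormalized log-posterior using the surrogate inside the Gaussian likelihood."""
    if not -1.0 <= theta <= 1.0:
        return -np.inf
    return -0.5 * np.sum((data - surrogate(theta)) ** 2) / sigma ** 2
```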

  19. Investigating conflict in ICUs - Is the clinicians’ perspective enough?

    PubMed Central

    Schuster, Rachel A.; Hong, Seo Yeon; Arnold, Robert M.; White, Douglas B.

    2013-01-01

    Objective Most studies have assessed conflict between clinicians and surrogate decision makers in ICUs from only clinicians’ perspectives. It is unknown if surrogates’ perceptions differ from clinicians’. We sought to determine the degree of agreement between physicians and surrogates about conflict, and to identify predictors of physician-surrogate conflict. Design Prospective cohort study. Setting Four ICUs of two hospitals in San Francisco, California. Patients 230 surrogate decision makers and 100 physicians of 175 critically ill patients. Measurements Questionnaires addressing participants’ perceptions of whether there was physician-surrogate conflict, as well as attitudes and preferences about clinician-surrogate communication; kappa scores to quantify physician-surrogate concordance about the presence of conflict; and hierarchical multivariate modeling to determine predictors of conflict. Main Results Either the physician or surrogate identified conflict in 63% of cases. Physicians were less likely to perceive conflict than surrogates (27.8% vs 42.3%; p=0.007). Agreement between physicians and surrogates about conflict was poor (kappa = 0.14). Multivariable analysis with surrogate-assessed conflict as the outcome revealed that higher levels of surrogates’ satisfaction with physicians’ bedside manner were associated with lower odds of conflict (OR: 0.75 per 1 point increase in satisfaction, 95% CI 0.59–0.96). Multivariable analysis with physician-assessed conflict as the outcome revealed that the surrogate having felt discriminated against in the healthcare setting was associated with higher odds of conflict (OR 17.5, 95% CI 1.6–190.1) while surrogates’ satisfaction with physicians’ bedside manner was associated with lower odds of conflict (0–10 scale, OR 0.76 per 1 point increase, 95% CI 0.58–0.99). Conclusions Conflict between physicians and surrogates is common in ICUs. There is little agreement between physicians and surrogates about whether physician-surrogate conflict has occurred. Further work is needed to develop reliable and valid methods to assess conflict. In the interim, future studies should assess conflict from the perspective of both clinicians and surrogates. PMID:24434440
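
    For readers unfamiliar with the agreement statistic reported above, a small sketch of Cohen's kappa for paired binary conflict ratings follows; the formula is the standard one, and the inputs would be the matched physician and surrogate responses (the example data are made up):

```python
# Hedged illustration: Cohen's kappa for paired yes/no (1/0) conflict ratings.
import numpy as np

def cohens_kappa(ratings_a, ratings_b):
    a = np.asarray(ratings_a, dtype=int)
    b = np.asarray(ratings_b, dtype=int)
    po = np.mean(a == b)                                              # observed agreement
    pe = np.mean(a) * np.mean(b) + np.mean(1 - a) * np.mean(1 - b)    # chance agreement
    return (po - pe) / (1 - pe)

physician = [0, 1, 0, 0, 1, 0, 1, 0]   # illustrative ratings only
surrogate = [1, 1, 0, 1, 0, 0, 1, 1]
print(cohens_kappa(physician, surrogate))
```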

  20. Robust optimization of supersonic ORC nozzle guide vanes

    NASA Astrophysics Data System (ADS)

    Bufi, Elio A.; Cinnella, Paola

    2017-03-01

    An efficient Robust Optimization (RO) strategy is developed for the design of 2D supersonic Organic Rankine Cycle turbine expanders. Dense-gas effects are non-negligible for this application and are taken into account by describing the thermodynamics with the Peng-Robinson-Stryjek-Vera equation of state. The design methodology combines an Uncertainty Quantification (UQ) loop based on a Bayesian kriging model of the system response to the uncertain parameters, used to approximate statistics (mean and variance) of the uncertain system output, a CFD solver, and a multi-objective non-dominated sorting genetic algorithm (NSGA), also based on a kriging surrogate of the multi-objective fitness function, along with an adaptive infill strategy for surrogate enrichment at each generation of the NSGA. The objective functions are the average and variance of the isentropic efficiency. The blade shape is parametrized by means of a Free Form Deformation (FFD) approach. The robust optimal blades are compared to the baseline design (based on the Method of Characteristics) and to a blade obtained by means of a deterministic CFD-based optimization.

  1. Correlation between external and internal respiratory motion: a validation study.

    PubMed

    Ernst, Floris; Bruder, Ralf; Schlaefer, Alexander; Schweikard, Achim

    2012-05-01

    In motion-compensated image-guided radiotherapy, accurate tracking of the target region is required. This tracking process includes building a correlation model between external surrogate motion and the motion of the target region. A novel correlation method is presented and compared with the commonly used polynomial model. The CyberKnife system (Accuray, Inc., Sunnyvale/CA) uses a polynomial correlation model to relate externally measured surrogate data (optical fibres on the patient's chest emitting red light) to infrequently acquired internal measurements (X-ray data). A new correlation algorithm based on ε-Support Vector Regression (SVR) was developed. Validation and comparison testing were done with human volunteers using live 3D ultrasound and externally measured infrared light-emitting diodes (IR LEDs). Seven data sets (5:03-6:27 min long) were recorded from six volunteers. Polynomial correlation algorithms were compared to the SVR-based algorithm demonstrating an average increase in root mean square (RMS) accuracy of 21.3% (0.4 mm). For three signals, the increase was more than 29% and for one signal as much as 45.6% (corresponding to more than 1.5 mm RMS). Further analysis showed the improvement to be statistically significant. The new SVR-based correlation method outperforms traditional polynomial correlation methods for motion tracking. This method is suitable for clinical implementation and may improve the overall accuracy of targeted radiotherapy.
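
    A minimal sketch of the correlation-model idea, using scikit-learn's ε-SVR as a stand-in for the authors' implementation; the synthetic external (chest surrogate) and internal (target) signals, the train/test split, and the hyperparameters are illustrative only:

```python
# Hedged sketch: map an external surrogate signal to internal target motion with SVR.
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(1)
t = np.linspace(0, 60, 3000)                                  # 60 s of samples
external = np.sin(2 * np.pi * 0.25 * t)                       # chest surrogate signal
internal = 12.0 * np.sin(2 * np.pi * 0.25 * t - 0.4) \
           + rng.normal(0, 0.3, t.size)                       # target motion (mm)

model = SVR(kernel="rbf", C=10.0, epsilon=0.2)
model.fit(external[:2000, None], internal[:2000])             # build the correlation model
pred = model.predict(external[2000:, None])
rmse = np.sqrt(np.mean((pred - internal[2000:]) ** 2))        # RMS tracking error (mm)
print(rmse)
```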

  2. Optimization and Control of Agent-Based Models in Biology: A Perspective.

    PubMed

    An, G; Fitzpatrick, B G; Christley, S; Federico, P; Kanarek, A; Neilan, R Miller; Oremland, M; Salinas, R; Laubenbacher, R; Lenhart, S

    2017-01-01

    Agent-based models (ABMs) have become an increasingly important mode of inquiry for the life sciences. They are particularly valuable for systems that are not understood well enough to build an equation-based model. These advantages, however, are counterbalanced by the difficulty of analyzing and using ABMs, due to the lack of the type of mathematical tools available for more traditional models, which leaves simulation as the primary approach. As models become large, simulation becomes challenging. This paper proposes a novel approach to two mathematical aspects of ABMs, optimization and control, and it presents a few first steps outlining how one might carry out this approach. Rather than viewing the ABM as a model, it is to be viewed as a surrogate for the actual system. For a given optimization or control problem (which may change over time), the surrogate system is modeled instead, using data from the ABM and a modeling framework for which ready-made mathematical tools exist, such as differential equations, or for which control strategies can be explored more easily. Once the optimization problem is solved for the model of the surrogate, it is then lifted to the surrogate and tested. The final step is to lift the optimization solution from the surrogate system to the actual system. This program is illustrated with published work, using two relatively simple ABMs as a demonstration: Sugarscape and a consumer-resource ABM. Specific techniques discussed include dimension reduction and approximation of an ABM by difference equations as well as by systems of PDEs, related to certain specific control objectives. This demonstration illustrates the very challenging mathematical problems that need to be solved before this approach can be realistically applied to complex and large ABMs, current and future. The paper outlines a research program to address them.

  3. Multi-Scale Modeling, Surrogate-Based Analysis, and Optimization of Lithium-Ion Batteries for Vehicle Applications

    NASA Astrophysics Data System (ADS)

    Du, Wenbo

    A common attribute of electric-powered aerospace vehicles and systems such as unmanned aerial vehicles, hybrid- and fully-electric aircraft, and satellites is that their performance is usually limited by the energy density of their batteries. Although lithium-ion batteries offer distinct advantages such as high voltage and low weight over other battery technologies, they are a relatively new development, and thus significant gaps in the understanding of the physical phenomena that govern battery performance remain. As a result of this limited understanding, batteries must often undergo a cumbersome design process involving many manual iterations based on rules of thumb and ad-hoc design principles. A systematic study of the relationship between operational, geometric, morphological, and material-dependent properties and performance metrics such as energy and power density is non-trivial due to the multiphysics, multiphase, and multiscale nature of the battery system. To address these challenges, two numerical frameworks are established in this dissertation: a process for analyzing and optimizing several key design variables using surrogate modeling tools and gradient-based optimizers, and a multi-scale model that incorporates more detailed microstructural information into the computationally efficient but limited macro-homogeneous model. In the surrogate modeling process, multi-dimensional maps for the cell energy density with respect to design variables such as the particle size, ion diffusivity, and electron conductivity of the porous cathode material are created. A combined surrogate- and gradient-based approach is employed to identify optimal values for cathode thickness and porosity under various operating conditions, and quantify the uncertainty in the surrogate model. The performance of multiple cathode materials is also compared by defining dimensionless transport parameters. The multi-scale model makes use of detailed 3-D FEM simulations conducted at the particle-level. A monodisperse system of ellipsoidal particles is used to simulate the effective transport coefficients and interfacial reaction current density within the porous microstructure. Microscopic simulation results are shown to match well with experimental measurements, while differing significantly from homogenization approximations used in the macroscopic model. Global sensitivity analysis and surrogate modeling tools are applied to couple the two length scales and complete the multi-scale model.

  4. A review of selected inorganic surface water quality-monitoring practices: are we really measuring what we think, and if so, are we doing it right?

    USGS Publications Warehouse

    Horowitz, Arthur J.

    2013-01-01

    Successful environmental/water quality-monitoring programs usually require a balance between analytical capabilities, the collection and preservation of representative samples, and available financial/personnel resources. Due to current economic conditions, monitoring programs are under increasing pressure to do more with less. Hence, a review of current sampling and analytical methodologies, and some of the underlying assumptions that form the bases for these programs seems appropriate, to see if they are achieving their intended objectives within acceptable error limits and/or measurement uncertainty, in a cost-effective manner. That evaluation appears to indicate that several common sampling/processing/analytical procedures (e.g., dip (point) samples/measurements, nitrogen determinations, total recoverable analytical procedures) are generating biased or nonrepresentative data, and that some of the underlying assumptions relative to current programs, such as calendar-based sampling and stationarity are no longer defensible. The extensive use of statistical models as well as surrogates (e.g., turbidity) also needs to be re-examined because the hydrologic interrelationships that support their use tend to be dynamic rather than static. As a result, a number of monitoring programs may need redesigning, some sampling and analytical procedures may need to be updated, and model/surrogate interrelationships may require recalibration.

  5. Surrogate analysis and index developer (SAID) tool and real-time data dissemination utilities

    USGS Publications Warehouse

    Domanski, Marian M.; Straub, Timothy D.; Wood, Molly S.; Landers, Mark N.; Wall, Gary R.; Brady, Steven J.

    2015-01-01

    The use of acoustic and other parameters as surrogates for suspended-sediment concentrations (SSC) in rivers has been successful in multiple applications across the Nation. Critical to advancing the operational use of surrogates are tools to process and evaluate the data along with the subsequent development of regression models from which real-time sediment concentrations can be made available to the public. Recent developments in both areas are having an immediate impact on surrogate research, and on surrogate monitoring sites currently in operation. The Surrogate Analysis and Index Developer (SAID) standalone tool, under development by the U.S. Geological Survey (USGS), assists in the creation of regression models that relate response and explanatory variables by providing visual and quantitative diagnostics to the user. SAID also processes acoustic parameters to be used as explanatory variables for suspended-sediment concentrations. The sediment acoustic method utilizes acoustic parameters from fixed-mount stationary equipment. The background theory and method used by the tool have been described in recent publications, and the tool also serves to support sediment-acoustic-index methods being drafted by the multi-agency Sediment Acoustic Leadership Team (SALT), and other surrogate guidelines like USGS Techniques and Methods 3-C4 for turbidity and SSC. The regression models in SAID can be used in utilities that have been developed to work with the USGS National Water Information System (NWIS) and for the USGS National Real-Time Water Quality (NRTWQ) Web site. The real-time dissemination of predicted SSC and prediction intervals for each time step has substantial potential to improve understanding of sediment-related water-quality and associated engineering and ecological management decisions.

  6. Advanced functional network analysis in the geosciences: The pyunicorn package

    NASA Astrophysics Data System (ADS)

    Donges, Jonathan F.; Heitzig, Jobst; Runge, Jakob; Schultz, Hanna C. H.; Wiedermann, Marc; Zech, Alraune; Feldhoff, Jan; Rheinwalt, Aljoscha; Kutza, Hannes; Radebach, Alexander; Marwan, Norbert; Kurths, Jürgen

    2013-04-01

    Functional networks are a powerful tool for analyzing large geoscientific datasets such as global fields of climate time series originating from observations or model simulations. pyunicorn (pythonic unified complex network and recurrence analysis toolbox) is an open-source, fully object-oriented and easily parallelizable package written in the language Python. It allows for constructing functional networks (aka climate networks) representing the structure of statistical interrelationships in large datasets and, subsequently, investigating this structure using advanced methods of complex network theory such as measures for networks of interacting networks, node-weighted statistics, or network surrogates. Additionally, pyunicorn allows one to study the complex dynamics of geoscientific systems, as recorded by time series, by means of recurrence networks and visibility graphs. The range of possible applications of the package is outlined, drawing on several examples from climatology.

  7. Analysis of safety impacts of access management alternatives using the surrogate safety assessment model : final report.

    DOT National Transportation Integrated Search

    2017-06-01

    The purpose of this study was to evaluate if the Surrogate Safety Assessment Model (SSAM) could be used to assess the safety of a highway segment or an intersection in terms of the number and type of conflicts and to compare the safety effects of mul...

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mueller, Juliane

    MISO is an optimization framework for solving computationally expensive mixed-integer, black-box, global optimization problems. MISO uses surrogate models to approximate the computationally expensive objective function. Hence, derivative information, which is generally unavailable for black-box simulation objective functions, is not needed. MISO allows the user to choose the initial experimental design strategy, the type of surrogate model, and the sampling strategy.

  9. Predicting trace organic compound attenuation by ozone oxidation: Development of indicator and surrogate models.

    PubMed

    Park, Minkyu; Anumol, Tarun; Daniels, Kevin D; Wu, Shimin; Ziska, Austin D; Snyder, Shane A

    2017-08-01

    Ozone oxidation has been demonstrated to be an effective treatment process for the attenuation of trace organic compounds (TOrCs); however, predicting TOrC attenuation by ozone processes is challenging in wastewaters. Since ozone is rapidly consumed, determining the exposure times of ozone and hydroxyl radical proves to be difficult. As direct potable reuse schemes continue to gain traction, there is an increasing need for the development of real-time monitoring strategies for TOrC abatement in ozone oxidation processes. Hence, this study is primarily aimed at developing indicator and surrogate models for the prediction of TOrC attenuation by ozone oxidation. To this end, the second-order kinetic equations with a second-phase R_ct value (ratio of hydroxyl radical exposure to molecular ozone exposure) were used to calculate comparative kinetics of TOrC attenuation and the reduction of indicator and spectroscopic surrogate parameters, including UV absorbance at 254 nm (UVA254) and total fluorescence (TF). The developed indicator model using meprobamate as an indicator compound and the surrogate models with UVA254 and TF exhibited good predictive power for the attenuation of 13 kinetically distinct TOrCs in five filtered and unfiltered wastewater effluents (R² values > 0.8). This study is intended to help provide a guideline for the implementation of indicator/surrogate models for real-time monitoring of TOrC abatement with ozone processes and integrate them into a regulatory framework in water reuse. Copyright © 2017 Elsevier Ltd. All rights reserved.
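
    The second-order kinetics underlying such indicator/surrogate models can be sketched as follows; the relation ln(C/C0) = -(k_O3 + k_OH·Rct)·∫[O3]dt is the standard Rct-based formulation, while the rate constants and exposure used here are placeholders rather than fitted values from the study:

```python
# Hedged sketch: predicted TOrC attenuation from ozone exposure, rate constants,
# and an assumed Rct value (all numbers illustrative).
import numpy as np

def torc_remaining_fraction(k_o3, k_oh, rct, o3_exposure):
    """C/C0 = exp(-(k_O3 + k_OH * Rct) * integral of [O3] dt)."""
    return np.exp(-(k_o3 + k_oh * rct) * o3_exposure)

# k in M^-1 s^-1, exposure in M*s; values below are placeholders
frac = torc_remaining_fraction(k_o3=1e3, k_oh=5e9, rct=1e-8, o3_exposure=1e-4)
print(1.0 - frac)   # predicted fractional attenuation of the compound
```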

  10. surrkick: Black-hole kicks from numerical-relativity surrogate models

    NASA Astrophysics Data System (ADS)

    Gerosa, Davide; Hébert, François; Stein, Leo C.

    2018-04-01

    surrkick quickly and reliably extracts recoils imparted to generic, precessing black-hole binaries. It uses a numerical-relativity surrogate model to obtain the gravitational waveform given a set of binary parameters, and from this waveform it directly integrates the gravitational-wave linear momentum flux. This entirely bypasses the need for the fitting formulae that are typically used to model black-hole recoils in astrophysical contexts.

  11. False-negative rate, limit of detection and recovery efficiency performance of a validated macrofoam-swab sampling method for low surface concentrations of Bacillus anthracis Sterne and Bacillus atrophaeus spores

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Piepel, G. F.; Deatherage Kaiser, B. L.; Amidan, B. G.

    The performance of a macrofoam-swab sampling method was evaluated using Bacillus anthracis Sterne (BAS) and Bacillus atrophaeus Nakamura (BG) spores applied at nine low target amounts (2-500 spores) to positive-control plates and test coupons (2 in × 2 in) of four surface materials (glass, stainless steel, vinyl tile, and plastic). Test results from cultured samples were used to evaluate the effects of surrogate, surface concentration, and surface material on recovery efficiency (RE), false-negative rate (FNR), and limit of detection. For RE, surrogate and surface material had statistically significant effects, but concentration did not. Mean REs were the lowest for vinyl tile (50.8% with BAS and 40.2% with BG) and the highest for glass (92.8% with BAS and 71.4% with BG). FNR values ranged from 0 to 0.833 for BAS and 0 to 0.806 for BG; values increased as concentration decreased in the range tested (0.078 to 19.375 CFU/cm²). Surface material also had a statistically significant effect. An FNR-concentration curve was fit for each combination of surrogate and surface material. For both surrogates, the FNR curves tended to be lowest for glass and highest for vinyl tile. The FNR curves for BG tended to be higher than those for BAS at lower concentrations, especially for glass. Results using a modified Rapid Viability-Polymerase Chain Reaction (mRV-PCR) analysis method were also obtained. The mRV-PCR results and comparisons to the culture results will be discussed in a subsequent article.

  12. Improving Mixed Variable Optimization of Computational and Model Parameters Using Multiple Surrogate Functions

    DTIC Science & Technology

    2008-03-01

    multiplicative corrections as well as space mapping transformations for models defined over a lower dimensional space. A corrected surrogate model for the...correction functions used in [72]. If the low fidelity model g(x̃) is defined over a lower dimensional space then a space mapping transformation is...required. As defined in [21, 72], space mapping is a method of mapping between models of different dimensionality or fidelity. Let P denote the space

  13. How Do You Determine Whether The Earth Is Warming Up?

    NASA Astrophysics Data System (ADS)

    Restrepo, J. M.; Comeau, D.; Flaschka, H.

    2012-12-01

    How does one determine whether the extreme summer temperatures in the northeastern US, or in Moscow during the summer of 2010, were an extreme weather fluctuation or the result of a systematic global climate warming trend? It is only under exceptional circumstances that one can determine whether an observational climate signal belongs to a particular statistical distribution. In fact, observed climate signals are rarely "statistical", and thus there is usually no way to rigorously obtain enough field data to produce a trend or tendency based upon data alone. Furthermore, this type of data is often multi-scale. We propose a trend or tendency methodology that does not make use of a parametric or a statistical assumption. The most important feature of this trend strategy is that it is defined in very precise mathematical terms. The tendency is easily understood and practical, and its algorithmic realization is fairly robust. In addition to proposing a trend, the methodology can be adopted to generate surrogate statistical models, useful in reduced filtering schemes of time-dependent processes.

  14. Uncertainty propagation through an aeroelastic wind turbine model using polynomial surrogates

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Murcia, Juan Pablo; Réthoré, Pierre-Elouan; Dimitrov, Nikolay

    Polynomial surrogates are used to characterize the energy production and lifetime equivalent fatigue loads for different components of the DTU 10 MW reference wind turbine under realistic atmospheric conditions. The variability caused by different turbulent inflow fields is captured by creating independent surrogates for the mean and standard deviation of each output with respect to the inflow realizations. A global sensitivity analysis shows that the turbulent inflow realization has a bigger impact on the total distribution of equivalent fatigue loads than the shear coefficient or yaw misalignment. The methodology presented extends the deterministic power and thrust coefficient curves to uncertainty models and adds new variables such as damage equivalent fatigue loads in different components of the turbine. These surrogate models can then be implemented inside other workflows, such as estimation of the uncertainty in annual energy production due to wind resource variability and/or robust wind power plant layout optimization. It can be concluded that it is possible to capture the global behavior of a modern wind turbine and its uncertainty under realistic inflow conditions using polynomial response surfaces. In conclusion, the surrogates are a way to obtain power and load estimates under site-specific characteristics without sharing the proprietary aeroelastic design.

  15. Uncertainty propagation through an aeroelastic wind turbine model using polynomial surrogates

    DOE PAGES

    Murcia, Juan Pablo; Réthoré, Pierre-Elouan; Dimitrov, Nikolay; ...

    2017-07-17

    Polynomial surrogates are used to characterize the energy production and lifetime equivalent fatigue loads for different components of the DTU 10 MW reference wind turbine under realistic atmospheric conditions. The variability caused by different turbulent inflow fields is captured by creating independent surrogates for the mean and standard deviation of each output with respect to the inflow realizations. A global sensitivity analysis shows that the turbulent inflow realization has a bigger impact on the total distribution of equivalent fatigue loads than the shear coefficient or yaw misalignment. The methodology presented extends the deterministic power and thrust coefficient curves to uncertainty models and adds new variables such as damage equivalent fatigue loads in different components of the turbine. These surrogate models can then be implemented inside other workflows, such as estimation of the uncertainty in annual energy production due to wind resource variability and/or robust wind power plant layout optimization. It can be concluded that it is possible to capture the global behavior of a modern wind turbine and its uncertainty under realistic inflow conditions using polynomial response surfaces. In conclusion, the surrogates are a way to obtain power and load estimates under site-specific characteristics without sharing the proprietary aeroelastic design.

  16. Climate Change and Runoff Statistics: a Process Study for the Rhine Basin using a coupled Climate-Runoff Model

    NASA Astrophysics Data System (ADS)

    Kleinn, J.; Frei, C.; Gurtz, J.; Vidale, P. L.; Schär, C.

    2003-04-01

    The consequences of extreme runoff and extreme water levels are among the most important weather-induced natural hazards. The question of the impact of a global climate change on the runoff regime, especially on the frequency of floods, is of utmost importance. In wintertime, two possible climate effects could influence the runoff statistics of large Central European rivers: the shift from snowfall to rain as a consequence of higher temperatures, and the increase of heavy precipitation events due to an intensification of the hydrological cycle. The combined effect on the runoff statistics is examined in this study for the river Rhine. To this end, sensitivity experiments with a model chain including a regional climate model and a distributed runoff model are presented. The experiments are based on an idealized surrogate climate change scenario which stipulates a uniform increase in temperature by 2 Kelvin and an increase in atmospheric specific humidity by 15% (resulting from unchanged relative humidity) in the forcing fields for the regional climate model. The regional climate model CHRM is based on the mesoscale weather prediction model HRM of the German Weather Service (DWD) and has been adapted for climate simulations. The model is used in a nested mode with horizontal resolutions of 56 km and 14 km. The boundary conditions are taken from the original ECMWF reanalysis and from a modified version representing the surrogate scenario. The distributed runoff model (WaSiM) is used at a horizontal resolution of 1 km for the whole Rhine basin down to Cologne. The coupling of the models is provided by a downscaling of the climate model fields (precipitation, temperature, radiation, humidity, and wind) to the resolution of the distributed runoff model. The simulations cover the period of September 1987 to January 1994, with a special emphasis on the five winter seasons 1989/90 to 1993/94, each from November until January. A detailed validation of the control simulation shows a good correspondence of the precipitation fields from the regional climate model with measured fields regarding the distribution of precipitation at the scale of the Rhine basin. Systematic errors are visible at the scale of single subcatchments, in the altitudinal distribution, and in the frequency distribution of precipitation. These errors only marginally affect the runoff simulations, which show good correspondence with runoff observations. The presentation includes results from the scenario simulations for the whole basin as well as for Alpine and lowland subcatchments. The change in the runoff statistics is analyzed with respect to the changes in snowfall and to the frequency distribution of precipitation.

  17. The value of surrogate endpoints for predicting real-world survival across five cancer types.

    PubMed

    Shafrin, Jason; Brookmeyer, Ron; Peneva, Desi; Park, Jinhee; Zhang, Jie; Figlin, Robert A; Lakdawalla, Darius N

    2016-01-01

    It is unclear how well different outcome measures in randomized controlled trials (RCTs) perform in predicting real-world cancer survival. We assess the ability of RCT overall survival (OS) and surrogate endpoints - progression-free survival (PFS) and time to progression (TTP) - to predict real-world OS across five cancers. We identified 20 treatments and 31 indications for breast, colorectal, lung, ovarian, and pancreatic cancer that had a phase III RCT reporting median OS and median PFS or TTP. Median real-world OS was determined using a Kaplan-Meier estimator applied to patients in the Surveillance, Epidemiology, and End Results (SEER)-Medicare database (1991-2010). Performance of RCT OS and PFS/TTP in predicting real-world OS was measured using t-tests, median absolute prediction error, and R² from linear regressions. Among 72,600 SEER-Medicare patients similar to RCT participants, median survival was 5.9 months for trial surrogates, 14.1 months for trial OS, and 13.4 months for real-world OS. For this sample, regression models using clinical trial OS and trial surrogates as independent variables predicted real-world OS significantly better than models using surrogates alone (P = 0.026). Among all real-world patients using sample treatments (N = 309,182), however, adding trial OS did not improve predictive power over predictions based on surrogates alone (P = 0.194). Results were qualitatively similar using median absolute prediction error and R² metrics. Among the five tumor types investigated, trial OS and surrogates were each independently valuable in predicting real-world OS outcomes for patients similar to trial participants. In broader real-world populations, however, trial OS added little incremental value over surrogates alone.
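
    A small illustration of the comparison described above, regressing real-world OS on trial surrogates alone versus surrogates plus trial OS; the data are synthetic and only the modeling pattern follows the abstract:

```python
# Hedged illustration: compare predictive fits with and without trial OS as a covariate.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(4)
trial_pfs = rng.uniform(3, 12, 31)                      # months, one row per indication
trial_os = 1.8 * trial_pfs + rng.normal(0, 1.5, 31)
real_os = 0.6 * trial_pfs + 0.4 * trial_os + rng.normal(0, 1.0, 31)

X_surr = trial_pfs[:, None]                             # surrogates only
X_both = np.column_stack([trial_pfs, trial_os])         # surrogates plus trial OS

m_surr = LinearRegression().fit(X_surr, real_os)
m_both = LinearRegression().fit(X_both, real_os)
print(m_surr.score(X_surr, real_os), m_both.score(X_both, real_os))   # in-sample R^2
```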

  18. Sparse Polynomial Chaos Surrogate for ACME Land Model via Iterative Bayesian Compressive Sensing

    NASA Astrophysics Data System (ADS)

    Sargsyan, K.; Ricciuto, D. M.; Safta, C.; Debusschere, B.; Najm, H. N.; Thornton, P. E.

    2015-12-01

    For computationally expensive climate models, Monte-Carlo approaches of exploring the input parameter space are often prohibitive due to slow convergence with respect to ensemble size. To alleviate this, we build inexpensive surrogates using uncertainty quantification (UQ) methods employing Polynomial Chaos (PC) expansions that approximate the input-output relationships using as few model evaluations as possible. However, when many uncertain input parameters are present, such UQ studies suffer from the curse of dimensionality. In particular, for 50-100 input parameters non-adaptive PC representations have infeasible numbers of basis terms. To this end, we develop and employ Weighted Iterative Bayesian Compressive Sensing to learn the most important input parameter relationships for efficient, sparse PC surrogate construction with posterior uncertainty quantified due to insufficient data. Besides drastic dimensionality reduction, the uncertain surrogate can efficiently replace the model in computationally intensive studies such as forward uncertainty propagation and variance-based sensitivity analysis, as well as design optimization and parameter estimation using observational data. We applied the surrogate construction and variance-based uncertainty decomposition to Accelerated Climate Model for Energy (ACME) Land Model for several output QoIs at nearly 100 FLUXNET sites covering multiple plant functional types and climates, varying 65 input parameters over broad ranges of possible values. This work is supported by the U.S. Department of Energy, Office of Science, Biological and Environmental Research, Accelerated Climate Modeling for Energy (ACME) project. Sandia National Laboratories is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000.
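
    The sparse-surrogate idea can be sketched with an ordinary LASSO regression over a polynomial basis, used here as a simple stand-in for the Weighted Iterative Bayesian Compressive Sensing procedure described above; the dimensions and the toy response are illustrative:

```python
# Hedged sketch: sparse selection of polynomial-chaos-like coefficients via LASSO.
import numpy as np
from itertools import combinations_with_replacement
from sklearn.linear_model import LassoCV

rng = np.random.default_rng(2)
n_samples, n_params = 300, 20
X = rng.uniform(-1, 1, (n_samples, n_params))
y = 2.0 * X[:, 0] - 1.5 * X[:, 3] * X[:, 7] + 0.1 * rng.normal(size=n_samples)

def basis(X):
    """Total-degree-2 polynomial basis: constant, linear, and pairwise terms."""
    cols = [np.ones(len(X))]
    for d in (1, 2):
        for idx in combinations_with_replacement(range(X.shape[1]), d):
            cols.append(np.prod(X[:, idx], axis=1))
    return np.column_stack(cols)

Phi = basis(X)
fit = LassoCV(cv=5).fit(Phi, y)              # most coefficients shrink to zero
print(np.count_nonzero(fit.coef_), "active terms out of", Phi.shape[1])
```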

  19. Hypothesis Testing Using Spatially Dependent Heavy Tailed Multisensor Data

    DTIC Science & Technology

    2014-12-01

    ...consistent with the null hypothesis of linearity and can be used to estimate the distribution of a test statistic that can discriminate between the null... [Figure caption: test for nonlinearity; the histogram is generated using the surrogate data, and the statistic of the original time series is represented by the solid line.]

  20. Modified Bayesian Kriging for Noisy Response Problems for Reliability Analysis

    DTIC Science & Technology

    2015-01-01


  1. An Efficient Variable Screening Method for Effective Surrogate Models for Reliability-Based Design Optimization

    DTIC Science & Technology

    2014-04-01

    surrogate model generation is difficult for high-dimensional problems, due to the curse of dimensionality. Variable screening methods have been...a variable screening model was developed for the quasi-molecular treatment of ion-atom collision [16]. In engineering, a confidence interval of...for high-level radioactive waste [18]. Moreover, the design sensitivity method can be extended to the variable screening method because vital

  2. Beauty and the beast: Some perspectives on efficient model analysis, surrogate models, and the future of modeling

    NASA Astrophysics Data System (ADS)

    Hill, M. C.; Jakeman, J.; Razavi, S.; Tolson, B.

    2015-12-01

    For many environmental systems, model runtimes have remained very long as more capable computers have been used to add more processes and more time and space discretization. Scientists have also added more parameters and kinds of observations, and many model runs are needed to explore the models. Computational demand equals run time multiplied by the number of model runs divided by parallelization opportunities. Model exploration is conducted using sensitivity analysis, optimization, and uncertainty quantification. Sensitivity analysis is used to reveal consequences of what may be very complex simulated relations, optimization is used to identify parameter values that fit the data best, or at least better, and uncertainty quantification is used to evaluate the precision of simulated results. The long execution times make such analyses a challenge. Methods for addressing these challenges include computationally frugal analysis of the demanding original model and a number of ingenious surrogate modeling methods. Both commonly use about 50-100 runs of the demanding original model. In this talk we consider the tradeoffs between (1) original model development decisions, (2) computationally frugal analysis of the original model, and (3) using many model runs of the fast surrogate model. Some questions of interest are as follows. If the added processes and discretization invested in (1) are compared with the restrictions and approximations in model analysis produced by long model execution times, is there a net benefit related to the goals of the model? Are there changes to the numerical methods that could reduce the computational demands while giving up less fidelity than is compromised by using computationally frugal methods or surrogate models for model analysis? Both the computationally frugal methods and the surrogate models require that the solution of interest be a smooth function of the parameters of interest. How does the information obtained from the local methods typical of (2) and the globally averaged methods typical of (3) compare for typical systems? The discussion will use examples of the response of the Greenland glacier to global warming and of surface water and groundwater modeling.

  3. Development and application of a green fluorescent protein (GFP) expressing E. coli O103 surrogate for tracking contamination through grinding and identifying persistent points of contamination

    USDA-ARS's Scientific Manuscript database

    Objective: To 1.) develop and validate an easily trackable E. coli O157:H7/non-O157 STEC surrogate that can be detected to the same level of sensitivity as E. coli O157:H7; and 2.) apply the trackable surrogate to model contamination passage through grinding and identify points where contamination ...

  4. Parametric geometric model and hydrodynamic shape optimization of a flying-wing structure underwater glider

    NASA Astrophysics Data System (ADS)

    Wang, Zhen-yu; Yu, Jian-cheng; Zhang, Ai-qun; Wang, Ya-xing; Zhao, Wen-tao

    2017-12-01

    Combining high-precision numerical analysis methods with optimization algorithms to make a systematic exploration of a design space has become an important topic in modern design methods. During the design process of an underwater glider's flying-wing structure, a surrogate model is introduced to decrease the computation time of a high-precision analysis. In this way, the conflict between precision and efficiency is resolved effectively. Based on parametric geometry modeling, mesh generation, and computational fluid dynamics analysis, a surrogate model is constructed by adopting design of experiments (DOE) theory to solve the multi-objective design optimization problem of the underwater glider. The procedure for surrogate model construction is presented, and the Gaussian kernel function is specifically discussed. The Particle Swarm Optimization (PSO) algorithm is applied to the hydrodynamic design optimization. The hydrodynamic performance of the optimized flying-wing structure underwater glider increases by 9.1%.

  5. Thermophysics Characterization of Kerosene Combustion

    NASA Technical Reports Server (NTRS)

    Wang, Ten-See

    2000-01-01

    A one-formula surrogate fuel formulation and its quasi-global combustion kinetics model are developed to support the design of injectors and thrust chambers of kerosene-fueled rocket engines. This surrogate fuel model depicts a fuel blend that properly represents the general physical and chemical properties of kerosene. The accompanying gaseous-phase thermodynamics of the surrogate fuel is anchored with the heat of formation of kerosene and verified by comparing a series of one-dimensional rocket thrust chamber calculations. The quasi-global combustion kinetics model consists of several global steps for parent fuel decomposition, soot formation, and soot oxidation, and a detailed wet-CO mechanism. The final thermophysics formulations are incorporated with a computational fluid dynamics model for prediction of the combustor efficiency of an uni-element, tri-propellant combustor and the radiation of a kerosene-fueled thruster plume. The model predictions agreed reasonably well with those of the tests.

  6. Thermophysics Characterization of Kerosene Combustion

    NASA Technical Reports Server (NTRS)

    Wang, Ten-See

    2001-01-01

    A one-formula surrogate fuel formulation and its quasi-global combustion kinetics model are developed to support the design of injectors and thrust chambers of kerosene-fueled rocket engines. This surrogate fuel model depicts a fuel blend that properly represents the general physical and chemical properties of kerosene. The accompanying gaseous-phase thermodynamics of the surrogate fuel is anchored with the heat of formation of kerosene and verified by comparing a series of one-dimensional rocket thrust chamber calculations. The quasi-global combustion kinetics model consists of several global steps for parent fuel decomposition, soot formation, and soot oxidation and a detailed wet-CO mechanism to complete the combustion process. The final thermophysics formulations are incorporated with a computational fluid dynamics model for prediction of the combustion efficiency of an unielement, tripropellant combustor and the radiation of a kerosene-fueled thruster plume. The model predictions agreed reasonably well with those of the tests.

  7. Wavelet analysis in ecology and epidemiology: impact of statistical tests.

    PubMed

    Cazelles, Bernard; Cazelles, Kévin; Chavez, Mario

    2014-02-06

    Wavelet analysis is now frequently used to extract information from ecological and epidemiological time series. Statistical hypothesis tests are conducted on associated wavelet quantities to assess the likelihood that they are due to a random process. Such random processes represent null models and are generally based on synthetic data that share some statistical characteristics with the original time series. This allows the comparison of null statistics with those obtained from the original time series. When creating synthetic datasets, different resampling techniques result in different characteristics shared by the synthetic time series. Therefore, it becomes crucial to consider the impact of the resampling method on the results. We have addressed this point by comparing seven different statistical testing methods applied to different real and simulated data. Our results show that statistical assessment of periodic patterns is strongly affected by the choice of the resampling method, so two different resampling techniques could lead to two different conclusions about the same time series. Moreover, our results clearly show the inadequacy of resampling series generated by white noise and red noise, which are nevertheless the methods currently used in the vast majority of wavelet applications. Our results highlight that the characteristics of a time series, namely its Fourier spectrum and autocorrelation, are important to consider when choosing the resampling technique. The results suggest that data-driven resampling methods, such as the hidden Markov model algorithm and the 'beta-surrogate' method, should be used.
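
    One widely used resampling scheme in this literature is the phase-randomized Fourier surrogate, which preserves the power spectrum (and hence the autocorrelation) of a series while destroying phase structure; the sketch below is a generic illustration of that scheme, not the authors' specific 'beta-surrogate' or hidden-Markov methods:

```python
# Hedged sketch: phase-randomized Fourier surrogate of a real-valued time series.
import numpy as np

def fourier_surrogate(x, rng=None):
    """Return a surrogate series with the same amplitude spectrum but random phases."""
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x, dtype=float)
    spec = np.fft.rfft(x)
    phases = rng.uniform(0.0, 2.0 * np.pi, spec.size)
    phases[0] = 0.0                    # keep the zero-frequency (mean) term unchanged
    if x.size % 2 == 0:
        phases[-1] = 0.0               # keep the Nyquist term real for even lengths
    return np.fft.irfft(np.abs(spec) * np.exp(1j * phases), n=x.size)

# null distribution of a statistic from many surrogates vs. the observed statistic
series = np.sin(np.linspace(0, 20 * np.pi, 512)) + np.random.default_rng(5).normal(0, 0.5, 512)
null_stats = [np.max(np.abs(fourier_surrogate(series))) for _ in range(200)]
```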

  8. Uncertainty quantification in capacitive RF MEMS switches

    NASA Astrophysics Data System (ADS)

    Pax, Benjamin J.

    Development of radio frequency micro electrical-mechanical systems (RF MEMS) has led to novel approaches to implement electrical circuitry. The introduction of capacitive MEMS switches, in particular, has shown promise in low-loss, low-power devices. However, the promise of MEMS switches has not yet been completely realized. RF-MEMS switches are known to fail after only a few months of operation, and nominally similar designs show wide variability in lifetime. Modeling switch operation using nominal or as-designed parameters cannot predict the statistical spread in the number of cycles to failure, and probabilistic methods are necessary. A Bayesian framework for calibration, validation and prediction offers an integrated approach to quantifying the uncertainty in predictions of MEMS switch performance. The objective of this thesis is to use the Bayesian framework to predict the creep-related deflection of the PRISM RF-MEMS switch over several thousand hours of operation. The PRISM switch used in this thesis is the focus of research at Purdue's PRISM center, and is a capacitive contacting RF-MEMS switch. It employs a fixed-fixed nickel membrane which is electrostatically actuated by applying voltage between the membrane and a pull-down electrode. Creep plays a central role in the reliability of this switch. The focus of this thesis is on the creep model, which is calibrated against experimental data measured for a frog-leg varactor fabricated and characterized at Purdue University. Creep plasticity is modeled using plate element theory with electrostatic forces being generated using either parallel plate approximations where appropriate, or solving for the full 3D potential field. For the latter, structure-electrostatics interaction is determined through immersed boundary method. A probabilistic framework using generalized polynomial chaos (gPC) is used to create surrogate models to mitigate the costly full physics simulations, and Bayesian calibration and forward propagation of uncertainty are performed using this surrogate model. The first step in the analysis is Bayesian calibration of the creep related parameters. A computational model of the frog-leg varactor is created, and the computed creep deflection of the device over 800 hours is used to generate a surrogate model using a polynomial chaos expansion in Hermite polynomials. Parameters related to the creep phenomenon are calibrated using Bayesian calibration with experimental deflection data from the frog-leg device. The calibrated input distributions are subsequently propagated through a surrogate gPC model for the PRISM MEMS switch to produce probability density functions of the maximum membrane deflection of the membrane over several thousand hours. The assumptions related to the Bayesian calibration and forward propagation are analyzed to determine the sensitivity to these assumptions of the calibrated input distributions and propagated output distributions of the PRISM device. The work is an early step in understanding the role of geometric variability, model uncertainty, numerical errors and experimental uncertainties in the long-term performance of RF-MEMS.

  9. Comparison between Surrogate Indexes of Insulin Sensitivity/Resistance and Hyperinsulinemic Euglycemic Glucose Clamps in Rhesus Monkeys

    PubMed Central

    Lee, Ho-Won; Muniyappa, Ranganath; Yan, Xu; Yue, Lilly Q.; Linden, Ellen H.; Chen, Hui; Hansen, Barbara C.

    2011-01-01

    The euglycemic glucose clamp is the reference method for assessing insulin sensitivity in humans and animals. However, clamps are ill-suited for large studies because of extensive requirements for cost, time, labor, and technical expertise. Simple surrogate indexes of insulin sensitivity/resistance including quantitative insulin-sensitivity check index (QUICKI) and homeostasis model assessment (HOMA) have been developed and validated in humans. However, validation studies of QUICKI and HOMA in both rats and mice suggest that differences in metabolic physiology between rodents and humans limit their value in rodents. Rhesus monkeys are a species more similar to humans than rodents. Therefore, in the present study, we evaluated data from 199 glucose clamp studies obtained from a large cohort of 86 monkeys with a broad range of insulin sensitivity. Data were used to evaluate simple surrogate indexes of insulin sensitivity/resistance (QUICKI, HOMA, Log HOMA, 1/HOMA, and 1/Fasting insulin) with respect to linear regression, predictive accuracy using a calibration model, and diagnostic performance using receiver operating characteristic. Most surrogates had modest linear correlations with SIClamp (r ≈ 0.4–0.64) with comparable correlation coefficients. Predictive accuracy determined by calibration model analysis demonstrated better predictive accuracy of QUICKI than HOMA and Log HOMA. Receiver operating characteristic analysis showed equivalent sensitivity and specificity of most surrogate indexes to detect insulin resistance. Thus, unlike in rodents but similar to humans, surrogate indexes of insulin sensitivity/resistance including QUICKI and log HOMA may be reasonable to use in large studies of rhesus monkeys where it may be impractical to conduct glucose clamp studies. PMID:21209021
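
    For reference, the two surrogate indexes evaluated above have conventional clinical definitions that are easy to compute from fasting measurements; the formulas below are the standard ones, and the example values are illustrative only:

```python
# Hedged sketch: standard HOMA-IR and QUICKI definitions from fasting values.
import math

def homa_ir(glucose_mg_dl, insulin_uU_ml):
    """HOMA-IR = fasting glucose [mg/dL] x fasting insulin [uU/mL] / 405."""
    return glucose_mg_dl * insulin_uU_ml / 405.0

def quicki(glucose_mg_dl, insulin_uU_ml):
    """QUICKI = 1 / (log10(fasting insulin) + log10(fasting glucose))."""
    return 1.0 / (math.log10(insulin_uU_ml) + math.log10(glucose_mg_dl))

print(homa_ir(90, 10), quicki(90, 10))   # e.g. ~2.2 and ~0.34 for these inputs
```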

  10. Surrogate and clinical endpoints for studies in peripheral artery occlusive disease: Are statistics the brakes?

    PubMed

    Waliszewski, Matthias W; Redlich, Ulf; Breul, Victor; Tautenhahn, Jörg

    2017-04-30

    The aim of this review is to present the available clinical and surrogate endpoints that may be used in future studies in patients with peripheral artery occlusive disease (PAOD). Importantly, we describe statistical limitations of the most commonly used endpoints and offer some guidance with respect to study design for a given sample size. The proposed endpoints may be used in studies of surgical or interventional revascularization and/or drug treatments. Considering recently published study endpoints and designs, the usefulness of these endpoints for reimbursement is evaluated. Based on these potential study endpoints and patient sample size estimates under different non-inferiority or difference-testing hypotheses, a rating relative to their corresponding reimbursement value is attempted. As regards the benefit for patients and for payers, walking distance (WD) and the ankle brachial index (ABI) are the most feasible endpoints in relatively small study samples, given that other non-vascular impact factors can be controlled. Angiographic endpoints such as minimal lumen diameter (MLD) do not seem useful from a reimbursement standpoint despite their intuitiveness. Other surrogate endpoints, such as transcutaneous oxygen tension measurements, have yet to be established as useful endpoints in reasonably sized studies of patients with critical limb ischemia (CLI). From a reimbursement standpoint, WD and ABI are effective endpoints for a moderate study sample size, given that non-vascular confounding factors can be controlled.

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhao, T; Ruan, D

    Purpose: The growing size and heterogeneity of training atlases necessitate sophisticated schemes to identify only the most relevant atlases for a specific multi-atlas-based image segmentation problem. This study aims to develop a model to infer the inaccessible oracle geometric relevance metric from surrogate image similarity metrics and, based on such a model, to provide guidance for atlas selection in multi-atlas-based image segmentation. Methods: We relate the oracle geometric relevance metric in label space to the surrogate metric in image space by a monotonically non-decreasing function with additive random perturbations. Subsequently, a surrogate's ability to prognosticate the oracle order for atlas subset selection is quantified probabilistically. Finally, important insights and guidance are provided for the design of the fusion set size, balancing the competing demands to include the most relevant atlases and to exclude the most irrelevant ones. A systematic solution is derived based on an optimization framework. Model verification and performance assessment are performed based on clinical prostate MR images. Results: The proposed surrogate model was exemplified by a linear map with normally distributed perturbation and verified with several commonly used surrogates, including MSD, NCC, and (N)MI. The derived behaviors of different surrogates in atlas selection and their corresponding performance in the ultimate label estimate were validated. The performance of NCC and (N)MI was similarly superior to MSD, with a 10% higher atlas selection probability and an increase in segmentation performance (DSC) with first and third quartiles of (0.83, 0.89), compared to (0.81, 0.89). The derived optimal fusion set size, valued at 7/8/8/7 for MSD/NCC/MI/NMI, agreed well with the appropriate range [4, 9] from empirical observation. Conclusion: This work has developed an efficacious probabilistic model to characterize image-based surrogate metrics for atlas selection. Analytical insights lead to valid guiding principles on fusion set size design.

  12. Predicting Treatment Effect from Surrogate Endpoints and Historical Trials | Division of Cancer Prevention

    Cancer.gov

    By Stuart G. Baker, 2017. This software fits a zero-intercept random-effects linear model to data on surrogate and true endpoints from previous trials. Requirement: Mathematica Version 11 or later.
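
    A minimal Python illustration of the underlying idea, using hypothetical per-trial effect estimates and ignoring the random-effects component that the Mathematica software actually fits: a zero-intercept slope is estimated from historical trials and then used to predict the true-endpoint effect of a new trial from its observed surrogate-endpoint effect.

        import numpy as np

        # Hypothetical estimated treatment effects from historical trials:
        # x = effect on the surrogate endpoint, y = effect on the true endpoint.
        x = np.array([0.10, 0.25, 0.18, 0.32, 0.05, 0.22])
        y = np.array([0.06, 0.17, 0.11, 0.21, 0.02, 0.16])

        # Zero-intercept least-squares slope: beta = sum(x*y) / sum(x*x).
        beta = np.dot(x, y) / np.dot(x, x)

        # Predict the true-endpoint effect for a new trial where only the
        # surrogate-endpoint effect (here 0.20) has been observed.
        x_new = 0.20
        print(f"slope = {beta:.3f}, predicted true-endpoint effect = {beta * x_new:.3f}")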

  13. Construction and characterization of outbreak Escherichia coli O157:H7 surrogate strains for use in field studies.

    PubMed

    Webb, Cathy C; Erickson, Marilyn C; Davey, Lindsey E; Payton, Alison S; Doyle, Michael P

    2014-11-01

    Escherichia coli O157:H7 has been the causative agent of many outbreaks associated with leafy green produce consumption. Elucidating the mechanism by which contamination occurs requires monitoring interactions between the pathogen and the plant under typical production conditions. Intentional introduction of virulent strains into fields is not an acceptable practice. As an alternative, attenuated strains of natural isolates have been used as surrogates of the virulent strains; however, the attachment properties and environmental stabilities of these attenuated isolates may differ from the unattenuated outbreak strains. In this study, the Shiga toxin (stx1, stx2, and/or stx2c) genes as well as the eae gene encoding intimin of two E. coli O157:H7 outbreak isolates, F4546 (1997 alfalfa sprout) and K4492 (2006 lettuce), were deleted. Individual gene deletions were confirmed by polymerase chain reaction (PCR) and DNA sequencing. The mutant strains did not produce Shiga toxin. The growth kinetics of these mutant strains under nutrient-rich and minimal conditions were identical to those of their wild-type strains. Attachment to the surface of lettuce leaves was comparable between wild-type/mutant pairs F4546/MD46 and K4492/MD47. Adherence to soil particles was also comparable between the virulent and surrogate pairs, although the F4546/MD46 pair exhibited statistically greater attachment than the K4492/MD47 pair (p≤0.05). Wild-type and mutant pairs F4546/MD46 and K4492/MD47 inoculated into wet or dry soils had statistically similar survival rates over the 7-day storage period at 20°C. A plasmid, pGFPuv, containing green fluorescent protein was transformed into each of the mutant strains, allowing for ease of identification and detection of surrogate strains on plant material or soil. These pGFPuv-containing surrogate strains will enable the investigation of pathogen interaction with plants and soil in the farm production environment where the virulent pathogen cannot be used.

  14. Surrogate-based optimization of hydraulic fracturing in pre-existing fracture networks

    NASA Astrophysics Data System (ADS)

    Chen, Mingjie; Sun, Yunwei; Fu, Pengcheng; Carrigan, Charles R.; Lu, Zhiming; Tong, Charles H.; Buscheck, Thomas A.

    2013-08-01

    Hydraulic fracturing has been used widely to stimulate production of oil, natural gas, and geothermal energy in formations with low natural permeability. Numerical optimization of fracture stimulation often requires a large number of evaluations of objective functions and constraints from forward hydraulic fracturing models, which are computationally expensive and even prohibitive in some situations. Moreover, there are a variety of uncertainties associated with the pre-existing fracture distributions and rock mechanical properties, which affect the optimized decisions for hydraulic fracturing. In this study, a surrogate-based approach is developed for efficient optimization of hydraulic fracturing well design in the presence of natural-system uncertainties. The fractal dimension is derived from the simulated fracturing network as the objective for maximizing energy recovery sweep efficiency. The surrogate model, which is constructed using training data from high-fidelity fracturing models for mapping the relationship between uncertain input parameters and the fractal dimension, provides fast approximation of the objective functions and constraints. A suite of surrogate models constructed using different fitting methods is evaluated and validated for fast predictions. Global sensitivity analysis is conducted to gain insights into the impact of the input variables on the output of interest, and further used for parameter screening. The high efficiency of the surrogate-based approach is demonstrated for three optimization scenarios with different and uncertain ambient conditions. Our results suggest the critical importance of considering uncertain pre-existing fracture networks in optimization studies of hydraulic fracturing.
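
    A toy sketch of the surrogate-based workflow described above, with a stand-in analytic function in place of the high-fidelity fracturing simulator and a Gaussian-process (Kriging-type) surrogate from scikit-learn; the input parameters, their bounds and the objective shape are hypothetical.

        import numpy as np
        from scipy.optimize import differential_evolution
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import Matern

        rng = np.random.default_rng(1)

        # Stand-in for the expensive fracturing simulator: maps two normalized
        # design variables (e.g., injection rate, fluid viscosity) to a fractal
        # dimension of the stimulated fracture network.
        def fractal_dimension(x):
            rate, visc = x
            return 1.2 + 0.8 * np.exp(-((rate - 0.6) ** 2 + (visc - 0.4) ** 2) / 0.1)

        # Training data from a modest number of "high-fidelity" runs.
        X_train = rng.uniform(0, 1, size=(40, 2))
        y_train = np.array([fractal_dimension(x) for x in X_train])

        # Kriging-type surrogate (Gaussian process regression).
        gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
        gp.fit(X_train, y_train)

        # Optimize the cheap surrogate instead of the simulator
        # (maximize fractal dimension = minimize its negative).
        result = differential_evolution(lambda x: -gp.predict(x.reshape(1, -1))[0],
                                        bounds=[(0, 1), (0, 1)], seed=1)
        print("surrogate-optimal design:", result.x, "predicted D:", -result.fun)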

  15. Mass diffusion coefficient measurement for vitreous humor using FEM and MRI

    NASA Astrophysics Data System (ADS)

    Rattanakijsuntorn, Komsan; Penkova, Anita; Sadha, Satwindar S.

    2018-01-01

    In early studies, the 'contour method' for determining the diffusion coefficient of the vitreous humor was developed. This technique relied on careful injection of an MRI contrast agent (surrogate drug) into the vitreous humor of fresh bovine eyes and on tracking the contours of the contrast agent over time. In addition, an analytical solution was developed for the theoretical contours, built on a point-source model for the injected surrogate drug. Matching the theoretical and experimental contours by least squares, while floating the diffusion coefficient, yielded the value of the diffusion coefficient. The method was limited in that the initial injection of the surrogate had to be spherical or ellipsoidal, because the analytical result was based on the point-source model. With the new finite element model used for the analysis in this study, the technique is much less restrictive and handles irregular shapes of the initial bolus. Fresh bovine eyes were used to study drug diffusion in the vitreous, and three contrast agents of different molecular masses were used as drug surrogates to visualize the diffusion process by MRI: gadolinium-diethylenetriaminepentaacetic acid (Gd-DTPA, 938 Da), non-ionic gadoteridol (Prohance, 559 Da), and bovine albumin conjugated with gadolinium (Galbumin, 74 kDa). A 3D finite element model was developed to determine the diffusion coefficients of these surrogates from the MRI images. This method can be used for other types of bioporous media provided the concentration profile can be visualized (by methods such as MRI or fluorescence).
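
    The sketch below illustrates the parameter-fitting idea behind the earlier contour method (an analytical point-source solution fitted by least squares), not the finite element model of this study; all units, concentrations and the noise level are synthetic.

        import numpy as np
        from scipy.optimize import curve_fit

        # 3-D point-source diffusion solution: C(r, t) = M (4*pi*D*t)^(-3/2) exp(-r^2 / (4 D t)).
        def point_source(r, D, M, t=3600.0):      # t fixed at 1 h for this illustration
            return M * (4 * np.pi * D * t) ** -1.5 * np.exp(-r ** 2 / (4 * D * t))

        # Synthetic "MRI" concentration profile at t = 1 h with 5% noise,
        # generated with a known D to verify the fit (units: cm, s, cm^2/s).
        rng = np.random.default_rng(2)
        r_obs = np.linspace(0.05, 0.6, 25)
        D_true, M_true = 2.5e-6, 1.0
        c_obs = point_source(r_obs, D_true, M_true) * (1 + 0.05 * rng.normal(size=r_obs.size))

        # Least-squares fit of D (and the injected amount M) to the observed profile.
        (D_fit, M_fit), _ = curve_fit(point_source, r_obs, c_obs, p0=[1e-6, 0.5],
                                      bounds=([1e-8, 0.01], [1e-4, 10.0]))
        print(f"fitted D = {D_fit:.2e} cm^2/s (true {D_true:.2e})")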

  16. Evaluation of safety effect of turbo-roundabout lane dividers using floating car data and video observation.

    PubMed

    Kieć, Mariusz; Ambros, Jiří; Bąk, Radosław; Gogolín, Ondřej

    2018-06-01

    Roundabouts are one of the safest types of intersections. However, the need to meet requirements of operation, capacity, traffic organization and surrounding development leads to a variety of design solutions. One such alternative is the turbo-roundabout, which simplifies drivers' decision making, limits lane changing within the roundabout, and induces low driving speeds thanks to raised lane dividers. However, in spite of their generally positive reception, the safety impact of turbo-roundabouts has not been sufficiently studied. Given the low number of existing turbo-roundabouts and the statistical rarity of accident occurrence, most previously conducted studies applied only simple before-after designs or relied on traffic conflicts in micro-simulations. Nevertheless, the presence of raised lane dividers is acknowledged as an important feature of well-performing and safe turbo-roundabouts. Following previous Polish studies, the primary objective of the present study was to assess the influence of the presence of lane dividers on road safety and to develop a reliable and valid surrogate safety measure based on field data, which circumvents the limitations of accident data and micro-simulations. The secondary objective was to use the developed surrogate safety measure to assess and compare the safety levels of Polish turbo-roundabout samples with and without raised lane dividers. The surrogate safety measure was based on speed and lane behaviour. Speed was obtained from video observations and floating car data, which enabled the construction of representative speed profiles. Lane behaviour data were gathered from video observations. Collecting these data allowed for a relative validation of the method by comparing the safety performance of turbo-roundabouts with and without raised lane dividers. Finally, the surrogate measure was applied to evaluate safety levels and to enhance the existing safety performance functions, which combine traffic volumes and speeds as a function of radii. The final models may help quantify the safety impact of different turbo-roundabout solutions. Copyright © 2018 Elsevier Ltd. All rights reserved.

  17. Evaluation of Murine Norovirus, Feline Calicivirus, Poliovirus, and MS2 as Surrogates for Human Norovirus in a Model of Viral Persistence in Surface Water and Groundwater

    EPA Science Inventory

    Human noroviruses (NoV) are a significant cause of non bacterial gastroenteritis worldwide with contaminated drinking water a potential transmission route. The absence of a cell culture infectivity model for NoV necessitates the use of molecular methods and/or viral surrogate mod...

  18. Mortality of aircraft maintenance workers exposed to trichloroethylene and other hydrocarbons and chemicals: extended follow up

    PubMed Central

    Radican, Larry; Blair, Aaron; Stewart, Patricia; Wartenberg, Daniel

    2009-01-01

    Objective: To extend follow-up of 14,455 workers from 1990 to 2000, and evaluate mortality risk from exposure to trichloroethylene (TCE) and other chemicals. Methods: Multivariable Cox models were used to estimate relative risk for exposed vs. unexposed workers based on previously developed exposure surrogates. Results: Among TCE-exposed workers, there was no statistically significant increased risk of all-cause mortality (RR=1.04) or death from all cancers (RR=1.03). Exposure-response gradients for TCE were relatively flat and did not materially change since 1990. Statistically significant excesses were found for several chemical exposure subgroups and causes, and were generally consistent with the previous follow-up. Conclusions: Patterns of mortality have not changed substantially since 1990. While positive associations with several cancers were observed, and are consistent with the published literature, interpretation is limited due to the small numbers of events for specific exposures. PMID:19001957

  19. Accurate quantification of PGE2 in the polyposis in rat colon (Pirc) model by surrogate analyte-based UPLC-MS/MS.

    PubMed

    Yun, Changhong; Dashwood, Wan-Mohaiza; Kwong, Lawrence N; Gao, Song; Yin, Taijun; Ling, Qinglan; Singh, Rashim; Dashwood, Roderick H; Hu, Ming

    2018-01-30

    An accurate and reliable UPLC-MS/MS method is reported for the quantification of endogenous prostaglandin E2 (PGE2) in rat colonic mucosa and polyps. This method adopted the "surrogate analyte plus authentic bio-matrix" approach, using two different stable isotope-labeled analogs: PGE2-d9 as the surrogate analyte and PGE2-d4 as the internal standard. A quantitative standard curve was constructed with the surrogate analyte in colonic mucosa homogenate, and the method was successfully validated with the authentic bio-matrix. Concentrations of endogenous PGE2 in both normal and inflammatory tissue homogenates were back-calculated based on the regression equation. Because there is no endogenous interference with the surrogate analyte determination, the specificity was particularly good. By using authentic bio-matrix for validation, the matrix effect and extraction recovery are identical for the quantitative standard curve and the actual samples, which notably increased the assay accuracy. The method is easy, fast, robust and reliable for colon PGE2 determination. This "surrogate analyte" approach was applied to measure PGE2 in the mucosa and polyps of the Pirc rat (an Apc-mutant kindred that models human FAP); PGE2 is one of the strong biomarkers of colorectal cancer. A similar concept could be applied to endogenous biomarkers in other tissues. Copyright © 2017 Elsevier B.V. All rights reserved.
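
    A schematic of the back-calculation step described above, assuming hypothetical peak-area ratios and concentrations: a standard curve is built from the surrogate analyte (PGE2-d9) response, and the endogenous PGE2 concentration in a study sample is read off the same regression equation.

        import numpy as np

        # Hypothetical standard curve: PGE2-d9 (surrogate analyte) spiked into
        # colonic mucosa homogenate; response = peak-area ratio to PGE2-d4 (IS).
        conc_std = np.array([0.5, 1, 2, 5, 10, 25, 50])      # ng/mL, illustrative
        ratio_std = np.array([0.021, 0.043, 0.085, 0.22, 0.43, 1.08, 2.15])

        # Linear regression of response on concentration.
        slope, intercept = np.polyfit(conc_std, ratio_std, 1)

        # Back-calculate endogenous PGE2 in a study sample from its peak-area ratio,
        # using the same regression equation (surrogate and endogenous analyte are
        # assumed to behave identically apart from the mass shift).
        ratio_sample = 0.31
        conc_sample = (ratio_sample - intercept) / slope
        print(f"endogenous PGE2 = {conc_sample:.2f} ng/mL")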

  20. A traits-based approach for prioritizing species for monitoring and surrogacy selection

    DOE PAGES

    Pracheil, Brenda M.; McManamay, Ryan A.; Bevelhimer, Mark S.; ...

    2016-11-28

    The bar for justifying the use of vertebrate animals for study is being increasingly raised, thus requiring increased rigor for species selection and study design. Although we have power analyses to provide quantitative backing for the numbers of organisms used, quantitative backing for selection of study species is not frequently employed. This can be especially important when measuring the impacts of ecosystem alteration, when study species must be chosen that are both sensitive to the alteration and of sufficient abundance for study. Just as important is providing justification for designation of surrogate species for study, especially when the species of interest is rare or of conservation concern and selection of an appropriate surrogate can have legal implications. In this study, we use a combination of GIS, a fish traits database and multivariate statistical analyses to quantitatively prioritize species for study and to determine potential study surrogate species. We provide two case studies to illustrate our quantitative, traits-based approach for designating study species and surrogate species. In the first case study, we select broadly representative fish species to understand the effects of turbine passage on adult fishes based on traits that suggest sensitivity to turbine passage. In our second case study, we present a framework for selecting a surrogate species for an endangered species. Lastly, we suggest that our traits-based framework can provide quantitative backing and added justification to selection of study species while expanding the inference space of study results.

  1. A traits-based approach for prioritizing species for monitoring and surrogacy selection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pracheil, Brenda M.; McManamay, Ryan A.; Bevelhimer, Mark S.

    The bar for justifying the use of vertebrate animals for study is being increasingly raised, thus requiring increased rigor for species selection and study design. Although we have power analyses to provide quantitative backing for the numbers of organisms used, quantitative backing for selection of study species is not frequently employed. This can be especially important when measuring the impacts of ecosystem alteration, when study species must be chosen that are both sensitive to the alteration and of sufficient abundance for study. Just as important is providing justification for designation of surrogate species for study, especially when the species of interest is rare or of conservation concern and selection of an appropriate surrogate can have legal implications. In this study, we use a combination of GIS, a fish traits database and multivariate statistical analyses to quantitatively prioritize species for study and to determine potential study surrogate species. We provide two case studies to illustrate our quantitative, traits-based approach for designating study species and surrogate species. In the first case study, we select broadly representative fish species to understand the effects of turbine passage on adult fishes based on traits that suggest sensitivity to turbine passage. In our second case study, we present a framework for selecting a surrogate species for an endangered species. Lastly, we suggest that our traits-based framework can provide quantitative backing and added justification to selection of study species while expanding the inference space of study results.

  2. WE-AB-303-11: Verification of a Deformable 4DCT Motion Model for Lung Tumor Tracking Using Different Driving Surrogates

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Woelfelschneider, J; Friedrich-Alexander-University Erlangen-Nuremberg, Erlangen, DE; Seregni, M

    2015-06-15

    Purpose: Tumor tracking is an advanced technique to treat intra-fractionally moving tumors. The aim of this study is to validate a surrogate-driven model based on four-dimensional computed tomography (4DCT) that is able to predict CT volumes corresponding to arbitrary respiratory states. Further, three different driving surrogates are compared. Methods: This study is based on multiple 4DCTs of two patients treated for bronchial carcinoma and metastasis. Analyses for 18 additional patients are currently ongoing. The motion model was estimated from the planning 4DCT through deformable image registration. To predict a certain phase of a follow-up 4DCT, the model accounts for inter-fractional variations (baseline correction) and intra-fractional respiratory parameters (amplitude and phase) derived from surrogates. In this evaluation, three different approaches were used to extract the motion surrogate: for each 4DCT phase, the 3D thoraco-abdominal surface motion, the body volume and the anterior-posterior motion of a virtual single external marker defined on the sternum were investigated. The estimated volumes resulting from the model were compared to the ground-truth clinical 4DCTs using absolute HU differences in the lung volume and landmarks localized using the Scale Invariant Feature Transform (SIFT). Results: The results show absolute HU differences between estimated and ground-truth images with median values limited to 55 HU and inter-quartile ranges (IQR) lower than 100 HU. Median 3D distances between about 1500 matching landmarks are below 2 mm for the 3D surface motion and body volume methods. The single-marker surrogate results in median distances increased by up to 0.6 mm. Analyses for the extended database including 20 patients are currently in progress. Conclusion: The results depend mainly on the image quality of the initial 4DCTs and the deformable image registration. All investigated surrogates can be used to estimate follow-up 4DCT phases; however, uncertainties decrease for three-dimensional approaches. This work was funded in part by the German Research Council (DFG) - KFO 214/2.

  3. Incidence of Changes in Respiration-Induced Tumor Motion and Its Relationship With Respiratory Surrogates During Individual Treatment Fractions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Malinowski, Kathleen; Department of Radiation Oncology, University of Maryland School of Medicine, Baltimore, MD; McAvoy, Thomas J.

    2012-04-01

    Purpose: To determine how frequently (1) tumor motion and (2) the spatial relationship between tumor and respiratory surrogate markers change during a treatment fraction in lung and pancreas cancer patients. Methods and Materials: A Cyberknife Synchrony system radiographically localized the tumor and simultaneously tracked three respiratory surrogate markers fixed to a form-fitting vest. Data in 55 lung and 29 pancreas fractions were divided into successive 10-min blocks. Mean tumor positions and tumor position distributions were compared across 10-min blocks of data. Treatment margins were calculated from both 10 and 30 min of data. Partial least squares (PLS) regression models of tumor positions as a function of external surrogate marker positions were created from the first 10 min of data in each fraction; the incidence of significant PLS model degradation was used to assess changes in the spatial relationship between tumors and surrogate markers. Results: The absolute change in mean tumor position from first to third 10-min blocks was >5 mm in 13% and 7% of lung and pancreas cases, respectively. Superior-inferior and medial-lateral differences in mean tumor position were significantly associated with the lobe of lung. In 61% and 54% of lung and pancreas fractions, respectively, margins calculated from 30 min of data were larger than margins calculated from 10 min of data. The change in treatment margin magnitude for superior-inferior motion was >1 mm in 42% of lung and 45% of pancreas fractions. Significantly increasing tumor position prediction model error (mean ± standard deviation rates of change of 1.6 ± 2.5 mm per 10 min) over 30 min indicated tumor-surrogate relationship changes in 63% of fractions. Conclusions: Both tumor motion and the relationship between tumor and respiratory surrogate displacements change in most treatment fractions for patient in-room time of 30 min.
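
    A minimal sketch of the PLS monitoring idea with simulated stand-in data: a PLS model is fit to the first 10 minutes of surrogate-marker and tumor positions, and its prediction error is tracked in later blocks as a flag for a changing tumor-surrogate relationship. The data-generating model, sampling rate and block sizes are invented.

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression

        rng = np.random.default_rng(3)

        # Hypothetical data: X = 9 surrogate-marker coordinates (3 markers x 3 axes),
        # Y = 3-D tumor position, sampled over a 30-minute fraction at 1 Hz.
        n = 1800
        X = rng.normal(size=(n, 9))
        Y = X[:, :3] * 0.8 + 0.1 * rng.normal(size=(n, 3))   # toy tumor-surrogate coupling

        # Fit the PLS model on the first 10 minutes of data only.
        n_train = 600
        pls = PLSRegression(n_components=3)
        pls.fit(X[:n_train], Y[:n_train])

        # Monitor prediction error in later 10-minute blocks; a growing error
        # suggests the tumor-surrogate spatial relationship has changed.
        for start in range(0, n, 600):
            block = slice(start, start + 600)
            err = np.linalg.norm(Y[block] - pls.predict(X[block]), axis=1).mean()
            print(f"minutes {start//60:2d}-{start//60 + 10:2d}: mean 3-D error = {err:.2f}")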

  4. A model for emergency department end-of-life communications after acute devastating events--part I: decision-making capacity, surrogates, and advance directives.

    PubMed

    Limehouse, Walter E; Feeser, V Ramana; Bookman, Kelly J; Derse, Arthur

    2012-09-01

    Making decisions for a patient affected by sudden devastating illness or injury traumatizes a patient's family and loved ones. Even in the absence of an emergency, surrogates making end-of-life treatment decisions may experience negative emotional effects. Helping surrogates with these end-of-life decisions under emergent conditions requires the emergency physician (EP) to be clear, making medical recommendations with sensitivity. This model for emergency department (ED) end-of-life communications after acute devastating events comprises the following steps: 1) determine the patient's decision-making capacity; 2) identify the legal surrogate; 3) elicit patient values as expressed in completed advance directives; 4) determine patient/surrogate understanding of the life-limiting event and expectant treatment goals; 5) convey physician understanding of the event, including prognosis, treatment options, and recommendation; 6) share decisions regarding withdrawing or withholding of resuscitative efforts, using available resources and considering options for organ donation; and 7) revise treatment goals as needed. Emergency physicians should break bad news compassionately, yet sufficiently, so that surrogate and family understand both the gravity of the situation and the lack of long-term benefit of continued life-sustaining interventions. EPs should also help the surrogate and family understand that palliative care addresses comfort needs of the patient including adequate treatment for pain, dyspnea, or anxiety. Part I of this communications model reviews determination of decision-making capacity, surrogacy laws, and advance directives, including legal definitions and application of these steps; Part II (which will appear in a future issue of AEM) covers communication moving from resuscitative to end-of-life and palliative treatment. EPs should recognize acute devastating illness or injuries, when appropriate, as opportunities to initiate end-of-life discussions and to implement shared decisions. © 2012 by the Society for Academic Emergency Medicine.

  5. Incidence of changes in respiration-induced tumor motion and its relationship with respiratory surrogates during individual treatment fractions.

    PubMed

    Malinowski, Kathleen; McAvoy, Thomas J; George, Rohini; Dietrich, Sonja; D'Souza, Warren D

    2012-04-01

    To determine how frequently (1) tumor motion and (2) the spatial relationship between tumor and respiratory surrogate markers change during a treatment fraction in lung and pancreas cancer patients. A Cyberknife Synchrony system radiographically localized the tumor and simultaneously tracked three respiratory surrogate markers fixed to a form-fitting vest. Data in 55 lung and 29 pancreas fractions were divided into successive 10-min blocks. Mean tumor positions and tumor position distributions were compared across 10-min blocks of data. Treatment margins were calculated from both 10 and 30 min of data. Partial least squares (PLS) regression models of tumor positions as a function of external surrogate marker positions were created from the first 10 min of data in each fraction; the incidence of significant PLS model degradation was used to assess changes in the spatial relationship between tumors and surrogate markers. The absolute change in mean tumor position from first to third 10-min blocks was >5 mm in 13% and 7% of lung and pancreas cases, respectively. Superior-inferior and medial-lateral differences in mean tumor position were significantly associated with the lobe of lung. In 61% and 54% of lung and pancreas fractions, respectively, margins calculated from 30 min of data were larger than margins calculated from 10 min of data. The change in treatment margin magnitude for superior-inferior motion was >1 mm in 42% of lung and 45% of pancreas fractions. Significantly increasing tumor position prediction model error (mean ± standard deviation rates of change of 1.6 ± 2.5 mm per 10 min) over 30 min indicated tumor-surrogate relationship changes in 63% of fractions. Both tumor motion and the relationship between tumor and respiratory surrogate displacements change in most treatment fractions for patient in-room time of 30 min. Copyright © 2012. Published by Elsevier Inc.

  6. Surrogate endpoints for overall survival in digestive oncology trials: which candidates? A questionnaires survey among clinicians and methodologists.

    PubMed

    Methy, Nicolas; Bedenne, Laurent; Bonnetain, Franck

    2010-06-10

    Overall survival (OS) is the gold standard for the demonstration of a clinical benefit in cancer trials. Replacement of OS by a surrogate endpoint allows trial duration to be reduced. To date, few surrogate endpoints have been validated in digestive oncology. The aim of this study was to draw up an ordered list of potential surrogate endpoints for OS in digestive cancer trials, by way of a survey among clinicians and methodologists. A secondary objective was to obtain their opinion on surrogacy and quality of life (QoL). In 2007 and 2008, self-administered sequential questionnaires were sent to a panel of French clinicians and methodologists involved in the conduct of cancer clinical trials. In the first questionnaire, panellists were asked to choose the most important characteristics defining a surrogate among six proposals, to give advantages and drawbacks of the surrogates, and to answer questions about their validation and use. Then they had to suggest potential surrogate endpoints for OS in each of the following tumour sites: oesophagus, stomach, liver, pancreas, biliary tract, lymphoma, colon, rectum, and anus. They finally gave their opinion on QoL as surrogate endpoint. In the second questionnaire, they had to classify the previously proposed candidate surrogates from the most (position #1) to the least relevant in their opinion. The frequency at which the endpoints were chosen as the first, second or third most relevant surrogate was calculated and served as the final ranking. Response rate was 30% (24/80) in the first round and 20% (16/80) in the second one. Participants highlighted key points concerning surrogacy. In particular, they noted that a surrogate endpoint is expected to predict clinical benefit in a well-defined therapeutic situation. Half of them thought it was not relevant to study QoL as surrogate for OS. DFS, in the neoadjuvant settings or early stages, and PFS, in the non operable or metastatic settings, were ranked first, with a frequency of more than 69% in 20 out of 22 settings. PFS was proposed in association with QoL in metastatic primary liver and stomach cancers (both 81%). This composite endpoint was ranked second in metastatic oesophageal (69%), colorectal (56%) and anal (56%) cancers, whereas QoL alone was also suggested in most metastatic situations. Other endpoints frequently suggested were R0 resection in the neoadjuvant settings (oesophagus (69%), stomach (56%), pancreas (75%) and biliary tract (63%)) and response. An unexpected endpoint was metastatic PFS in non operable oesophageal (31%) and pancreatic (44%) cancers. Quality and results of surgical procedures like sphincter preservation were also cited as eligible surrogate endpoints in rectal (19%) and anal (50% in case of localized disease) cancers. Except for alpha-FP kinetics in hepatocellular carcinoma (13%) and CA19-9 decline (6%) in pancreas, few endpoints based on biological or tumour markers were proposed. The overall results should help prioritise the endpoints to be statistically evaluated as surrogate for OS, so that trialists and clinicians can rely on endpoints that ensure relevant clinical benefit to the patient.

  7. Surrogate endpoints for overall survival in digestive oncology trials: which candidates? A questionnaires survey among clinicians and methodologists

    PubMed Central

    2010-01-01

    Background Overall survival (OS) is the gold standard for the demonstration of a clinical benefit in cancer trials. Replacement of OS by a surrogate endpoint allows to reduce trial duration. To date, few surrogate endpoints have been validated in digestive oncology. The aim of this study was to draw up an ordered list of potential surrogate endpoints for OS in digestive cancer trials, by way of a survey among clinicians and methodologists. Secondary objective was to obtain their opinion on surrogacy and quality of life (QoL). Methods In 2007 and 2008, self administered sequential questionnaires were sent to a panel of French clinicians and methodologists involved in the conduct of cancer clinical trials. In the first questionnaire, panellists were asked to choose the most important characteristics defining a surrogate among six proposals, to give advantages and drawbacks of the surrogates, and to answer questions about their validation and use. Then they had to suggest potential surrogate endpoints for OS in each of the following tumour sites: oesophagus, stomach, liver, pancreas, biliary tract, lymphoma, colon, rectum, and anus. They finally gave their opinion on QoL as surrogate endpoint. In the second questionnaire, they had to classify the previously proposed candidate surrogates from the most (position #1) to the least relevant in their opinion. Frequency at which the endpoints were chosen as first, second or third most relevant surrogates was calculated and served as final ranking. Results Response rate was 30% (24/80) in the first round and 20% (16/80) in the second one. Participants highlighted key points concerning surrogacy. In particular, they reminded that a surrogate endpoint is expected to predict clinical benefit in a well-defined therapeutic situation. Half of them thought it was not relevant to study QoL as surrogate for OS. DFS, in the neoadjuvant settings or early stages, and PFS, in the non operable or metastatic settings, were ranked first, with a frequency of more than 69% in 20 out of 22 settings. PFS was proposed in association with QoL in metastatic primary liver and stomach cancers (both 81%). This composite endpoint was ranked second in metastatic oesophageal (69%), colorectal (56%) and anal (56%) cancers, whereas QoL alone was also suggested in most metastatic situations. Other endpoints frequently suggested were R0 resection in the neoadjuvant settings (oesophagus (69%), stomach (56%), pancreas (75%) and biliary tract (63%)) and response. An unexpected endpoint was metastatic PFS in non operable oesophageal (31%) and pancreatic (44%) cancers. Quality and results of surgical procedures like sphincter preservation were also cited as eligible surrogate endpoints in rectal (19%) and anal (50% in case of localized disease) cancers. Except for alpha-FP kinetic in hepatocellular carcinoma (13%) and CA19-9 decline (6%) in pancreas, few endpoints based on biological or tumour markers were proposed. Conclusion The overall results should help prioritise the endpoints to be statistically evaluated as surrogate for OS, so that trialists and clinicians can rely on endpoints that ensure relevant clinical benefit to the patient. PMID:20537166

  8. Air pollution exposure modeling of individuals

    EPA Science Inventory

    Air pollution epidemiology studies of ambient fine particulate matter (PM2.5) often use outdoor concentrations as exposure surrogates. These surrogates can induce exposure error since they do not account for (1) time spent indoors with ambient PM2.5 levels attenuated from outdoor...

  9. Chance-constrained multi-objective optimization of groundwater remediation design at DNAPLs-contaminated sites using a multi-algorithm genetically adaptive method

    NASA Astrophysics Data System (ADS)

    Ouyang, Qi; Lu, Wenxi; Hou, Zeyu; Zhang, Yu; Li, Shuai; Luo, Jiannan

    2017-05-01

    In this paper, a multi-algorithm genetically adaptive multi-objective (AMALGAM) method is proposed as a multi-objective optimization solver. It was implemented in the multi-objective optimization of a groundwater remediation design at sites contaminated by dense non-aqueous phase liquids. In this study, there were two objectives: minimization of the total remediation cost, and minimization of the remediation time. A non-dominated sorting genetic algorithm II (NSGA-II) was adopted to compare with the proposed method. For efficiency, the time-consuming surfactant-enhanced aquifer remediation simulation model was replaced by a surrogate model constructed by a multi-gene genetic programming (MGGP) technique. Similarly, two other surrogate modeling methods-support vector regression (SVR) and Kriging (KRG)-were employed to make comparisons with MGGP. In addition, the surrogate-modeling uncertainty was incorporated in the optimization model by chance-constrained programming (CCP). The results showed that, for the problem considered in this study, (1) the solutions obtained by AMALGAM incurred less remediation cost and required less time than those of NSGA-II, indicating that AMALGAM outperformed NSGA-II. It was additionally shown that (2) the MGGP surrogate model was more accurate than SVR and KRG; and (3) the remediation cost and time increased with the confidence level, which can enable decision makers to make a suitable choice by considering the given budget, remediation time, and reliability.

  10. A Multi-Fidelity Surrogate Model for the Equation of State for Mixtures of Real Gases

    NASA Astrophysics Data System (ADS)

    Ouellet, Frederick; Park, Chanyoung; Koneru, Rahul; Balachandar, S.; Rollin, Bertrand

    2017-11-01

    The explosive dispersal of particles is a complex multiphase and multi-species fluid flow problem. In these flows, the products of detonated explosives must be treated as real gases while the ideal gas equation of state is used for the ambient air. As the products expand outward, they mix with the air and create a region where both state equations must be satisfied. One of the most accurate, yet expensive, methods to handle this problem is an algorithm that iterates between both state equations until both pressure and thermal equilibrium are achieved inside of each computational cell. This work creates a multi-fidelity surrogate model to replace this process. This is achieved by using a Kriging model to produce a curve fit which interpolates selected data from the iterative algorithm. The surrogate is optimized for computing speed and model accuracy by varying the number of sampling points chosen to construct the model. The performance of the surrogate with respect to the iterative method is tested in simulations using a finite volume code. The model's computational speed and accuracy are analyzed to show the benefits of this novel approach. This work was supported by the U.S. Department of Energy, National Nuclear Security Administration, Advanced Simulation and Computing Program, as a Cooperative Agreement under the Predictive Science Academic Alliance Program, under Contract No. DE-NA00023.

  11. Comparison between surrogate indexes of insulin sensitivity/resistance and hyperinsulinemic euglycemic clamp estimates in rats

    PubMed Central

    Muniyappa, Ranganath; Chen, Hui; Muzumdar, Radhika H.; Einstein, Francine H.; Yan, Xu; Yue, Lilly Q.; Barzilai, Nir

    2009-01-01

    Assessing insulin resistance in rodent models gives insight into mechanisms that cause type 2 diabetes and the metabolic syndrome. The hyperinsulinemic euglycemic glucose clamp, the reference standard for measuring insulin sensitivity in humans and animals, is labor intensive and technically demanding. A number of simple surrogate indexes of insulin sensitivity/resistance have been developed and validated primarily for use in large human studies. These same surrogates are also frequently used in rodent studies. However, in general, these indexes have not been rigorously evaluated in animals. In a recent validation study in mice, we demonstrated that surrogates have a weaker correlation with glucose clamp estimates of insulin sensitivity/resistance than in humans. This may be due to increased technical difficulties in mice and/or intrinsic differences between human and rodent physiology. To help distinguish among these possibilities, in the present study, using data from rats substantially larger than mice, we compared the clamp glucose infusion rate (GIR) with surrogate indexes, including QUICKI, HOMA, 1/HOMA, log (HOMA), and 1/fasting insulin. All surrogates were modestly correlated with GIR (r = 0.34–0.40). Calibration analyses of surrogates adjusted for body weight demonstrated similar predictive accuracy for GIR among all surrogates. We conclude that linear correlations of surrogate indexes with clamp estimates and predictive accuracy of surrogate indexes in rats are similar to those in mice (but not as substantial as in humans). This additional rat study (taken with the previous mouse study) suggests that application of surrogate insulin sensitivity indexes developed for humans may not be appropriate for determining primary outcomes in rodent studies due to intrinsic differences in metabolic physiology. However, use of surrogates may be appropriate in rodents, where feasibility of clamps is an obstacle and measurement of insulin sensitivity is a secondary outcome. PMID:19706785
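
    For reference, the commonly used fasting surrogate indices mentioned above can be computed as in the sketch below; the constants are the standard human-derived formulas (e.g., HOMA-IR = glucose × insulin / 405 with glucose in mg/dL and insulin in µU/mL, QUICKI = 1 / (log10 insulin + log10 glucose)), and the input values are purely illustrative.

        import numpy as np

        def surrogate_indices(glucose_mg_dl, insulin_uU_ml):
            """Common fasting surrogate indices of insulin sensitivity/resistance.
            Units assumed: glucose in mg/dL, insulin in microU/mL (human-derived formulas)."""
            homa_ir = glucose_mg_dl * insulin_uU_ml / 405.0
            quicki = 1.0 / (np.log10(insulin_uU_ml) + np.log10(glucose_mg_dl))
            return {
                "HOMA-IR": homa_ir,
                "1/HOMA": 1.0 / homa_ir,
                "log(HOMA)": np.log(homa_ir),
                "QUICKI": quicki,
                "1/FI": 1.0 / insulin_uU_ml,
            }

        # Example fasting values (hypothetical numbers for illustration only).
        for name, value in surrogate_indices(glucose_mg_dl=110.0, insulin_uU_ml=12.0).items():
            print(f"{name:10s} {value:7.3f}")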

  12. Effective use of integrated hydrological models in basin-scale water resources management: surrogate modeling approaches

    NASA Astrophysics Data System (ADS)

    Zheng, Y.; Wu, B.; Wu, X.

    2015-12-01

    Integrated hydrological models (IHMs) consider surface water and subsurface water as a unified system, and have been widely adopted in basin-scale water resources studies. However, due to IHMs' mathematical complexity and high computational cost, it is difficult to implement them in an iterative model evaluation process (e.g., Monte Carlo Simulation, simulation-optimization analysis, etc.), which diminishes their applicability for supporting decision-making in real-world situations. Our studies investigated how to effectively use complex IHMs to address real-world water issues via surrogate modeling. Three surrogate modeling approaches were considered, including 1) DYCORS (DYnamic COordinate search using Response Surface models), a well-established response surface-based optimization algorithm; 2) SOIM (Surrogate-based Optimization for Integrated surface water-groundwater Modeling), a response surface-based optimization algorithm that we developed specifically for IHMs; and 3) Probabilistic Collocation Method (PCM), a stochastic response surface approach. Our investigation was based on a modeling case study in the Heihe River Basin (HRB), China's second largest endorheic river basin. The GSFLOW (Coupled Ground-Water and Surface-Water Flow Model) model was employed. Two decision problems were discussed. One is to optimize, both in time and in space, the conjunctive use of surface water and groundwater for agricultural irrigation in the middle HRB region; and the other is to cost-effectively collect hydrological data based on a data-worth evaluation. Overall, our study results highlight the value of incorporating an IHM in making decisions of water resources management and hydrological data collection. An IHM like GSFLOW can provide great flexibility to formulating proper objective functions and constraints for various optimization problems. On the other hand, it has been demonstrated that surrogate modeling approaches can pave the path for such incorporation in real-world situations, since they can dramatically reduce the computational cost of using IHMs in an iterative model evaluation process. In addition, our studies generated insights into the human-nature water conflicts in the specific study area and suggested potential solutions to address them.

  13. A Fast Surrogate-facilitated Data-driven Bayesian Approach to Uncertainty Quantification of a Regional Groundwater Flow Model with Structural Error

    NASA Astrophysics Data System (ADS)

    Xu, T.; Valocchi, A. J.; Ye, M.; Liang, F.

    2016-12-01

    Due to simplification and/or misrepresentation of the real aquifer system, numerical groundwater flow and solute transport models are usually subject to model structural error. During model calibration, the hydrogeological parameters may be overly adjusted to compensate for unknown structural error. This may result in biased predictions when models are used to forecast aquifer response to new forcing. In this study, we extend a fully Bayesian method [Xu and Valocchi, 2015] to calibrate a real-world, regional groundwater flow model. The method uses a data-driven error model to describe model structural error and jointly infers model parameters and structural error. Here, Bayesian inference is facilitated using high performance computing and fast surrogate models. The surrogate models are constructed using machine learning techniques to emulate the response simulated by the computationally expensive groundwater model. We demonstrate in the real-world case study that explicitly accounting for model structural error yields parameter posterior distributions that are substantially different from those derived by the classical Bayesian calibration that does not account for model structural error. In addition, the Bayesian method with the error model gives significantly more accurate predictions along with reasonable credible intervals.

  14. Developing the role of the social worker as coordinator of services at the surrogate parenting center.

    PubMed

    Gagin, Roni; Cohen, Miri; Greenblatt, Lee; Solomon, Hanah; Itskovitz-Eldor, Joseph

    2004-01-01

    A law permitting couples to conceive biological children through surrogacy was legislated in Israel in March 1996. The Rambam Medical Center has established the only nonprofit Surrogate Parenting Center at a public hospital in Israel. The multidisciplinary teamwork at the Center is case managed by a social worker. An important role of the social work intervention is consultation and support for the couple and the surrogate at all stages of the process. The case study presented in the article illustrates the need for sensitive and professional intervention due to the complexity of the surrogacy process and the crisis it involves for both the surrogate and the couple. In light of the growing parenting surrogacy cases in the United States, Europe, and Israel, a structured social work intervention model is described, which may be implemented at public or private surrogate parenting centers.

  15. Selecting surrogate endpoints for estimating pesticide effects on avian reproductive success.

    PubMed

    Bennett, Richard S; Etterson, Matthew A

    2013-10-01

    A Markov chain nest productivity model (MCnest) has been developed for projecting the effects of a specific pesticide-use scenario on the annual reproductive success of avian species of concern. A critical element in MCnest is the use of surrogate endpoints, defined as measured endpoints from avian toxicity tests that represent specific types of effects possible in field populations at specific phases of a nesting attempt. In this article, we discuss the attributes of surrogate endpoints and provide guidance for selecting surrogates from existing avian laboratory tests as well as other possible sources. We also discuss some of the assumptions and uncertainties related to using surrogate endpoints to represent field effects. The process of explicitly considering how toxicity test results can be used to assess effects in the field helps identify uncertainties and data gaps that could be targeted in higher-tier risk assessments. © 2013 SETAC.

  16. Does synchronization reflect a true interaction in the cardiorespiratory system?

    PubMed

    Toledo, E; Akselrod, S; Pinhas, I; Aravot, D

    2002-01-01

    Cardiorespiratory synchronization, studied within the framework of phase synchronization, has recently raised interest as one of the interactions in the cardiorespiratory system. In this work, we present a quantitative approach to the analysis of this nonlinear phenomenon. Our primary aim is to determine whether synchronization between HR and respiration rate is a real phenomenon or a random one. First, we developed an algorithm that detects epochs of synchronization automatically and objectively. The algorithm was applied to recordings of respiration and HR obtained from 13 normal subjects and 13 heart transplant patients. Surrogate data sets were constructed from the original recordings, specifically lacking the coupling between HR and respiration. The statistical properties of synchronization in the two data sets and in their surrogates were compared. Synchronization was observed in all groups: in normal subjects, in the heart transplant patients and in the surrogates. Interestingly, synchronization was less abundant in normal subjects than in the transplant patients, indicating that the unique physiological condition of the latter promotes cardiorespiratory synchronization. The duration of synchronization epochs was significantly shorter in the surrogate data of both data sets, suggesting that at least some of the synchronization epochs are real. In view of these results, cardiorespiratory synchronization, although not a major feature of cardiorespiratory interaction, seems to be a real phenomenon rather than an artifact.
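
    One common way to build surrogate series that keep each signal's own structure while destroying the HR-respiration coupling is Fourier phase randomization, sketched below on toy signals; the construction used in the study itself may differ, and all signal parameters here are invented.

        import numpy as np

        rng = np.random.default_rng(4)

        def phase_randomized_surrogate(x):
            """Fourier-transform surrogate: preserves the power spectrum (hence the
            linear autocorrelation) of x but randomizes its phases, destroying any
            coupling with a second, untouched signal."""
            spectrum = np.fft.rfft(x)
            phases = rng.uniform(0, 2 * np.pi, spectrum.size)
            randomized = np.abs(spectrum) * np.exp(1j * phases)
            randomized[0] = spectrum[0]              # keep the mean unchanged
            if x.size % 2 == 0:
                randomized[-1] = spectrum[-1]        # keep the Nyquist bin unchanged
            return np.fft.irfft(randomized, n=x.size)

        # Toy example: respiration and a heart-rate series weakly driven by it.
        t = np.arange(0, 300, 0.5)
        resp = np.sin(2 * np.pi * 0.25 * t) + 0.2 * rng.normal(size=t.size)
        hr = 60 + 3 * np.sin(2 * np.pi * 0.25 * t + 0.8) + rng.normal(size=t.size)

        resp_surr = phase_randomized_surrogate(resp)
        print("corr(HR, resp)           =", round(np.corrcoef(hr, resp)[0, 1], 3))
        print("corr(HR, resp surrogate) =", round(np.corrcoef(hr, resp_surr)[0, 1], 3))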

  17. Multiple scaling behaviour and nonlinear traits in music scores

    PubMed Central

    Larralde, Hernán; Martínez-Mekler, Gustavo; Müller, Markus

    2017-01-01

    We present a statistical analysis of music scores from different composers using detrended fluctuation analysis (DFA). We find different fluctuation profiles that correspond to distinct autocorrelation structures of the musical pieces. Further, we reveal evidence for the presence of nonlinear autocorrelations by estimating the DFA of the magnitude series, a result validated by a corresponding study of appropriate surrogate data. The amount and the character of nonlinear correlations vary from one composer to another. Finally, we performed a simple experiment in order to evaluate the pleasantness of the musical surrogate pieces in comparison with the original music and find that nonlinear correlations could play an important role in the aesthetic perception of a musical piece. PMID:29308256

  18. Multiple scaling behaviour and nonlinear traits in music scores

    NASA Astrophysics Data System (ADS)

    González-Espinoza, Alfredo; Larralde, Hernán; Martínez-Mekler, Gustavo; Müller, Markus

    2017-12-01

    We present a statistical analysis of music scores from different composers using detrended fluctuation analysis (DFA). We find different fluctuation profiles that correspond to distinct autocorrelation structures of the musical pieces. Further, we reveal evidence for the presence of nonlinear autocorrelations by estimating the DFA of the magnitude series, a result validated by a corresponding study of appropriate surrogate data. The amount and the character of nonlinear correlations vary from one composer to another. Finally, we performed a simple experiment in order to evaluate the pleasantness of the musical surrogate pieces in comparison with the original music and find that nonlinear correlations could play an important role in the aesthetic perception of a musical piece.
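
    A compact sketch of detrended fluctuation analysis as used above, applied to a toy series rather than actual note sequences; the window sizes and test signal are arbitrary, and the magnitude-series step used for the nonlinearity test is omitted.

        import numpy as np

        def dfa(x, scales, order=1):
            """Detrended fluctuation analysis: returns F(s) for each window size s.
            The scaling exponent is the slope of log F(s) versus log s."""
            y = np.cumsum(x - np.mean(x))                 # integrated profile
            F = []
            for s in scales:
                n_seg = y.size // s
                segs = y[:n_seg * s].reshape(n_seg, s)
                t = np.arange(s)
                resid = []
                for seg in segs:
                    coef = np.polyfit(t, seg, order)      # local polynomial trend
                    resid.append(np.mean((seg - np.polyval(coef, t)) ** 2))
                F.append(np.sqrt(np.mean(resid)))
            return np.array(F)

        # Toy "pitch series"; real analyses would use note pitches or intervals
        # extracted from a score, plus their magnitude series for nonlinearity tests.
        rng = np.random.default_rng(5)
        x = np.cumsum(rng.normal(size=4096))              # strongly correlated series
        scales = np.unique(np.logspace(1, 3, 12).astype(int))
        F = dfa(x, scales)
        alpha = np.polyfit(np.log(scales), np.log(F), 1)[0]
        print(f"DFA exponent alpha = {alpha:.2f}")        # ~1.5 for integrated white noise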

  19. Predicting treatment effect from surrogate endpoints and historical trials: an extrapolation involving probabilities of a binary outcome or survival to a specific time

    PubMed Central

    Sargent, Daniel J.; Buyse, Marc; Burzykowski, Tomasz

    2011-01-01

    Using multiple historical trials with surrogate and true endpoints, we consider various models to predict the effect of treatment on a true endpoint in a target trial in which only a surrogate endpoint is observed. This predicted result is computed using (1) a prediction model (mixture, linear, or principal stratification) estimated from historical trials and the surrogate endpoint of the target trial and (2) a random extrapolation error estimated from successively leaving out each trial among the historical trials. The method applies to either binary outcomes or survival to a particular time that is computed from censored survival data. We compute a 95% confidence interval for the predicted result and validate its coverage using simulation. To summarize the additional uncertainty from using a predicted instead of true result for the estimated treatment effect, we compute its multiplier of standard error. Software is available for download. PMID:21838732
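
    A bare-bones illustration of the leave-one-trial-out estimate of extrapolation error, using a simple linear prediction model and invented trial-level effects in place of the mixture, linear, or principal-stratification models of the paper.

        import numpy as np

        # Hypothetical historical trials: estimated treatment effects on the
        # surrogate endpoint (x) and on the true endpoint (y).
        x = np.array([0.08, 0.15, 0.22, 0.30, 0.12, 0.26, 0.19])
        y = np.array([0.05, 0.10, 0.16, 0.20, 0.07, 0.19, 0.12])

        def fit_predict(x_tr, y_tr, x_new):
            slope, intercept = np.polyfit(x_tr, y_tr, 1)     # simple linear prediction model
            return slope * x_new + intercept

        # Leave-one-trial-out: predict each historical trial's true-endpoint effect
        # from the remaining trials, and use the spread of errors as an estimate of
        # the random extrapolation error.
        errors = []
        for i in range(x.size):
            mask = np.arange(x.size) != i
            errors.append(y[i] - fit_predict(x[mask], y[mask], x[i]))
        extrapolation_sd = np.std(errors, ddof=1)

        # Predicted true-endpoint effect for a target trial with only a surrogate result.
        x_target = 0.24
        pred = fit_predict(x, y, x_target)
        print(f"prediction = {pred:.3f} +/- {1.96 * extrapolation_sd:.3f} (rough 95% interval)")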

  20. Using abiotic variables to predict importance of sites for species representation.

    PubMed

    Albuquerque, Fabio; Beier, Paul

    2015-10-01

    In systematic conservation planning, species distribution data for all sites in a planning area are used to prioritize each site in terms of the site's importance toward meeting the goal of species representation. But comprehensive species data are not available in most planning areas and would be expensive to acquire. As a shortcut, ecologists use surrogates, such as occurrences of birds or another well-surveyed taxon, or land types defined from remotely sensed data, in the hope that sites that represent the surrogates also represent biodiversity. Unfortunately, surrogates have not performed reliably. We propose a new type of surrogate, predicted importance, that can be developed from species data for a q% subset of sites. With species data from this subset of sites, importance can be modeled as a function of abiotic variables available at no charge for all terrestrial areas on Earth. Predicted importance can then be used as a surrogate to prioritize all sites. We tested this surrogate with 8 sets of species data. For each data set, we used a q% subset of sites to model importance as a function of abiotic variables, used the resulting function to predict importance for all sites, and evaluated the number of species in the sites with highest predicted importance. Sites with the highest predicted importance represented species efficiently for all data sets when q = 25% and for 7 of 8 data sets when q = 20%. Predicted importance requires less survey effort than direct selection for species representation and meets representation goals well compared with other surrogates currently in use. This less expensive surrogate may be useful in those areas of the world that need it most, namely tropical regions with the highest biodiversity, greatest biodiversity loss, most severe lack of inventory data, and poorly developed protected area networks. © 2015 Society for Conservation Biology.
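
    A sketch of the predicted-importance idea with invented data: importance scores "known" for a q% subset of sites are regressed on abiotic variables (here with a random forest, one of several possible model choices not specified by the paper), and the fitted model then ranks all sites.

        import numpy as np
        from sklearn.ensemble import RandomForestRegressor

        rng = np.random.default_rng(6)

        # Hypothetical planning area: abiotic predictors for every site (e.g.,
        # elevation, rainfall, temperature, terrain roughness) and a site-importance
        # score that, in practice, would come from species data on a q% subset.
        n_sites = 2000
        abiotic = rng.normal(size=(n_sites, 4))
        importance = 2.0 * abiotic[:, 0] - abiotic[:, 1] + 0.3 * rng.normal(size=n_sites)

        # Species data (and hence importance) are "known" for only q% of sites.
        q = 0.25
        surveyed = rng.choice(n_sites, size=int(q * n_sites), replace=False)

        model = RandomForestRegressor(n_estimators=200, random_state=0)
        model.fit(abiotic[surveyed], importance[surveyed])

        # Predicted importance for all sites serves as the surrogate for prioritization.
        predicted = model.predict(abiotic)
        top_sites = np.argsort(predicted)[::-1][:100]
        print("top-ranked sites by predicted importance:", top_sites[:10])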

  1. Statistical emulation of landslide-induced tsunamis at the Rockall Bank, NE Atlantic

    PubMed Central

    Guillas, S.; Georgiopoulou, A.; Dias, F.

    2017-01-01

    Statistical methods constitute a useful approach to understand and quantify the uncertainty that governs complex tsunami mechanisms. Numerical experiments may often have a high computational cost. This forms a limiting factor for performing uncertainty and sensitivity analyses, where numerous simulations are required. Statistical emulators, as surrogates of these simulators, can provide predictions of the physical process in a much faster and computationally inexpensive way. They can form a prominent solution to explore thousands of scenarios that would be otherwise numerically expensive and difficult to achieve. In this work, we build a statistical emulator of the deterministic codes used to simulate submarine sliding and tsunami generation at the Rockall Bank, NE Atlantic Ocean, in two stages. First we calibrate, against observations of the landslide deposits, the parameters used in the landslide simulations. This calibration is performed under a Bayesian framework using Gaussian Process (GP) emulators to approximate the landslide model, and the discrepancy function between model and observations. Distributions of the calibrated input parameters are obtained as a result of the calibration. In a second step, a GP emulator is built to mimic the coupled landslide-tsunami numerical process. The emulator propagates the uncertainties in the distributions of the calibrated input parameters inferred from the first step to the outputs. As a result, a quantification of the uncertainty of the maximum free surface elevation at specified locations is obtained. PMID:28484339

  2. Statistical emulation of landslide-induced tsunamis at the Rockall Bank, NE Atlantic.

    PubMed

    Salmanidou, D M; Guillas, S; Georgiopoulou, A; Dias, F

    2017-04-01

    Statistical methods constitute a useful approach to understand and quantify the uncertainty that governs complex tsunami mechanisms. Numerical experiments may often have a high computational cost. This forms a limiting factor for performing uncertainty and sensitivity analyses, where numerous simulations are required. Statistical emulators, as surrogates of these simulators, can provide predictions of the physical process in a much faster and computationally inexpensive way. They can form a prominent solution to explore thousands of scenarios that would be otherwise numerically expensive and difficult to achieve. In this work, we build a statistical emulator of the deterministic codes used to simulate submarine sliding and tsunami generation at the Rockall Bank, NE Atlantic Ocean, in two stages. First we calibrate, against observations of the landslide deposits, the parameters used in the landslide simulations. This calibration is performed under a Bayesian framework using Gaussian Process (GP) emulators to approximate the landslide model, and the discrepancy function between model and observations. Distributions of the calibrated input parameters are obtained as a result of the calibration. In a second step, a GP emulator is built to mimic the coupled landslide-tsunami numerical process. The emulator propagates the uncertainties in the distributions of the calibrated input parameters inferred from the first step to the outputs. As a result, a quantification of the uncertainty of the maximum free surface elevation at specified locations is obtained.
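
    The sketch below mimics the second-stage emulation step with a cheap stand-in simulator: a Gaussian process is trained on a small design of runs and then used to propagate samples of the calibrated inputs to the maximum free-surface elevation. The parameter meanings, input distributions and the simulator itself are placeholders, not the paper's.

        import numpy as np
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import RBF, ConstantKernel

        rng = np.random.default_rng(7)

        # Stand-in for the coupled landslide-tsunami simulator: maps two calibrated
        # slide parameters (e.g., friction coefficients) to a maximum free-surface
        # elevation at one gauge. The real code is far more expensive.
        def simulator(theta):
            return 3.0 + 2.0 * theta[0] - 1.5 * theta[1] ** 2

        # Small design of "expensive" runs used to train the GP emulator.
        design = rng.uniform(0, 1, size=(30, 2))
        runs = np.array([simulator(th) for th in design])
        gp = GaussianProcessRegressor(kernel=ConstantKernel() * RBF(), normalize_y=True)
        gp.fit(design, runs)

        # Propagate uncertainty in the calibrated inputs (here a stand-in posterior)
        # through the emulator to quantify uncertainty in the output.
        posterior_samples = rng.normal([0.6, 0.4], [0.05, 0.08], size=(10000, 2))
        emulated = gp.predict(posterior_samples)
        print(f"max elevation: mean = {emulated.mean():.2f} m, 95% interval = "
              f"({np.percentile(emulated, 2.5):.2f}, {np.percentile(emulated, 97.5):.2f}) m")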

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sen, Oishik, E-mail: oishik-sen@uiowa.edu; Gaul, Nicholas J., E-mail: nicholas-gaul@ramdosolutions.com; Choi, K.K., E-mail: kyung-choi@uiowa.edu

    Macro-scale computations of shocked particulate flows require closure laws that model the exchange of momentum/energy between the fluid and particle phases. Closure laws are constructed in this work in the form of surrogate models derived from highly resolved mesoscale computations of shock-particle interactions. The mesoscale computations are performed to calculate the drag force on a cluster of particles for different values of Mach Number and particle volume fraction. Two Kriging-based methods, viz. the Dynamic Kriging Method (DKG) and the Modified Bayesian Kriging Method (MBKG) are evaluated for their ability to construct surrogate models with sparse data; i.e. using the least number of mesoscale simulations. It is shown that if the input data is noise-free, the DKG method converges monotonically; convergence is less robust in the presence of noise. The MBKG method converges monotonically even with noisy input data and is therefore more suitable for surrogate model construction from numerical experiments. This work is the first step towards a full multiscale modeling of interaction of shocked particle laden flows.

  4. Scalability of Several Asynchronous Many-Task Models for In Situ Statistical Analysis.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pebay, Philippe Pierre; Bennett, Janine Camille; Kolla, Hemanth

    This report is a sequel to [PB16], in which we provided a first progress report on research and development towards a scalable, asynchronous many-task, in situ statistical analysis engine using the Legion runtime system. This earlier work included a prototype implementation of a proposed solution, using a proxy mini-application as a surrogate for a full-scale scientific simulation code. The first scalability studies were conducted with the above on modestly-sized experimental clusters. In contrast, in the current work we have integrated our in situ analysis engines with a full-size scientific application (S3D, using the Legion-SPMD model), and have conducted numerical tests on the largest computational platform currently available for DOE science applications. We also provide details regarding the design and development of a light-weight asynchronous collectives library. We describe how this library is utilized within our SPMD-Legion S3D workflow, and compare the data aggregation technique deployed herein to the approach taken within our previous work.

  5. Model based Inverse Methods for Sizing Cracks of Varying Shape and Location in Bolt hole Eddy Current (BHEC) Inspections (Postprint)

    DTIC Science & Technology

    2016-02-10

    using bolt hole eddy current (BHEC) techniques. Data was acquired for a wide range of crack sizes and shapes, including mid-bore, corner and through...to select the most appropriate VIC-3D surrogate model for subsequent crack sizing inversion step. Inversion results for select mid-bore, through and...the flaw. 15. SUBJECT TERMS Bolt hole eddy current (BHEC); mid-bore, corner and through-thickness crack types; VIC-3D generated surrogate models

  6. Limited predictive ability of surrogate indices of insulin sensitivity/resistance in Asian-Indian men.

    PubMed

    Muniyappa, Ranganath; Irving, Brian A; Unni, Uma S; Briggs, William M; Nair, K Sreekumaran; Quon, Michael J; Kurpad, Anura V

    2010-12-01

    Insulin resistance is highly prevalent in Asian Indians and contributes to worldwide public health problems, including diabetes and related disorders. Surrogate measurements of insulin sensitivity/resistance are used frequently to study Asian Indians, but these have not been formally validated in this population. In this study, we compared the ability of simple surrogate indices to accurately predict insulin sensitivity as determined by the reference glucose clamp method. In this cross-sectional study of Asian-Indian men (n = 70), we used a calibration model to assess the ability of simple surrogate indices for insulin sensitivity [quantitative insulin sensitivity check index (QUICKI), homeostasis model assessment (HOMA2-IR), fasting insulin-to-glucose ratio (FIGR), and fasting insulin (FI)] to predict an insulin sensitivity index derived from the reference glucose clamp method (SI(Clamp)). Predictive accuracy was assessed by both the root mean squared error (RMSE) of prediction and the leave-one-out cross-validation-type RMSE of prediction (CVPE). QUICKI, FIGR, and FI, but not HOMA2-IR, had modest linear correlations with SI(Clamp) (QUICKI: r = 0.36; FIGR: r = -0.36; FI: r = -0.27; P < 0.05). No significant differences were noted in CVPE or RMSE among any of the surrogate indices when compared with QUICKI. Surrogate measurements of insulin sensitivity/resistance such as QUICKI, FIGR, and FI are easily obtainable in large clinical studies, but they may only be useful as secondary outcome measurements for assessing insulin sensitivity/resistance in clinical studies of Asian Indians.

  7. Phenotypic and genomic comparison of Mycobacterium aurum and surrogate model species to Mycobacterium tuberculosis: implications for drug discovery.

    PubMed

    Namouchi, Amine; Cimino, Mena; Favre-Rochex, Sandrine; Charles, Patricia; Gicquel, Brigitte

    2017-07-13

    Tuberculosis (TB) is caused by Mycobacterium tuberculosis and represents one of the major challenges facing drug discovery initiatives worldwide. The considerable rise in bacterial drug resistance in recent years has created a need for new drugs and drug regimens. Model systems are regularly used to speed up the drug discovery process and circumvent biosafety issues associated with manipulating M. tuberculosis. These include strains such as Mycobacterium smegmatis and Mycobacterium marinum that can be handled in biosafety level 2 facilities, making high-throughput screening feasible. However, each of these model species has its own limitations. We report and describe the first complete genome sequence of Mycobacterium aurum ATCC23366, an environmental mycobacterium that can also grow in the gut of humans and animals as part of the microbiota. This species shows a resistance profile comparable to that of M. tuberculosis for several anti-TB drugs. The aims of this study were to (i) determine the drug resistance profile of a recently proposed model species, Mycobacterium aurum, strain ATCC23366, for anti-TB drug discovery, as well as of Mycobacterium smegmatis and Mycobacterium marinum, (ii) sequence and annotate the complete genome of this species obtained using Pacific Biosciences technology, (iii) perform comparative genomics analyses of the various surrogate strains with M. tuberculosis, and (iv) discuss how the choice of the surrogate model used for drug screening can affect the drug discovery process. We describe the complete genome sequence of M. aurum, a surrogate model for anti-tuberculosis drug discovery. Most of the genes already reported to be associated with drug resistance are shared between all the surrogate strains and M. tuberculosis. We consider that M. aurum might be used in high-throughput screening for tuberculosis drug discovery. We also highly recommend the use of different model species during the drug discovery screening process.

  8. Nonlinear dynamic analysis of D α signals for type I edge localized modes characterization on JET with a carbon wall

    NASA Astrophysics Data System (ADS)

    Cannas, Barbara; Fanni, Alessandra; Murari, Andrea; Pisano, Fabio; Contributors, JET

    2018-02-01

    In this paper, the dynamic characteristics of type-I ELM time-series from the JET tokamak, the world’s largest magnetic confinement plasma physics experiment, have been investigated. The dynamic analysis has been focused on the detection of nonlinear structure in D α radiation time series. Firstly, the method of surrogate data has been applied to evaluate the statistical significance of the null hypothesis of a static nonlinear distortion of an underlying Gaussian linear process. Several nonlinear statistics have been evaluated, such as the time-delayed mutual information, the correlation dimension and the maximal Lyapunov exponent. The obtained results allow us to reject the null hypothesis, giving evidence of underlying nonlinear dynamics. Moreover, no evidence of low-dimensional chaos has been found; indeed, the analysed time series are better characterized by a power-law sensitivity to initial conditions, which suggests motion at the ‘edge of chaos’, at the border between chaotic and regular non-chaotic dynamics. This uncertainty makes it necessary to investigate the nature of the nonlinear dynamics further. For this purpose, a second surrogate test, designed to distinguish chaotic orbits from pseudo-periodic orbits, has been applied. In this case, we cannot reject the null hypothesis, which means that the ELM time series is possibly pseudo-periodic. In order to reproduce pseudo-periodic dynamical properties, a periodic state-of-the-art model proposed to reproduce the ELM cycle has been corrupted by dynamical noise, obtaining time series qualitatively in agreement with the experimental time series.
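
    The sketch below shows the basic mechanics of the first surrogate-data test: generate phase-randomized surrogates that preserve the linear spectrum and compare a nonlinear discriminating statistic against its null distribution. The toy sawtooth-like signal and the simple time-asymmetry statistic are illustrative stand-ins; the paper uses the D α data and the statistics listed above.

```python
import numpy as np

rng = np.random.default_rng(0)

def phase_randomized_surrogate(x):
    """Surrogate with the same power spectrum but randomized Fourier phases."""
    spec = np.fft.rfft(x)
    phases = rng.uniform(0.0, 2.0 * np.pi, spec.size)
    phases[0] = 0.0                      # keep the mean unchanged
    return np.fft.irfft(np.abs(spec) * np.exp(1j * phases), n=x.size)

def time_asymmetry(x, lag=1):
    """Skewness of increments; near zero for a linear Gaussian process."""
    d = x[lag:] - x[:-lag]
    return np.mean(d**3) / np.mean(d**2) ** 1.5

# Toy "measured" series: a noisy relaxation (sawtooth-like) cycle, a crude
# stand-in for the slow build-up and fast crash of an ELM-ing signal.
t = np.arange(4096)
x = (0.01 * t) % 1.0 + 0.05 * rng.standard_normal(t.size)

observed = time_asymmetry(x)
null = np.array([time_asymmetry(phase_randomized_surrogate(x)) for _ in range(99)])
n_extreme = np.sum(np.abs(null) >= np.abs(observed))
print(f"observed statistic {observed:.2f}; {n_extreme}/99 surrogates are as extreme")
# A small count rejects the null hypothesis of a linear stochastic process.
```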

  9. An adaptive least-squares global sensitivity method and application to a plasma-coupled combustion prediction with parametric correlation

    NASA Astrophysics Data System (ADS)

    Tang, Kunkun; Massa, Luca; Wang, Jonathan; Freund, Jonathan B.

    2018-05-01

    We introduce an efficient non-intrusive surrogate-based methodology for global sensitivity analysis and uncertainty quantification. Modified covariance-based sensitivity indices (mCov-SI) are defined for outputs that reflect correlated effects. The overall approach is applied to simulations of a complex plasma-coupled combustion system with disparate uncertain parameters in sub-models for chemical kinetics and a laser-induced breakdown ignition seed. The surrogate is based on an Analysis of Variance (ANOVA) expansion, as widely used in statistics, with orthogonal polynomials representing the ANOVA subspaces and a polynomial dimensional decomposition (PDD) representing its multi-dimensional components. The coefficients of the PDD expansion are obtained using a least-squares regression, which both avoids the direct computation of high-dimensional integrals and affords an attractive flexibility in choosing sampling points. This facilitates importance sampling using a Bayesian calibrated posterior distribution, which is fast and thus particularly advantageous in common practical cases, such as our large-scale demonstration, for which the asymptotic convergence properties of polynomial expansions cannot be realized due to computational expense. Effort, instead, is focused on efficient finite-resolution sampling. Standard covariance-based sensitivity indices (Cov-SI) are employed to account for correlation of the uncertain parameters. The magnitude of Cov-SI is unfortunately unbounded, which can produce extremely large indices that limit their utility. The mCov-SI are therefore proposed in order to bound this magnitude to the interval [0, 1]. The polynomial expansion is coupled with an adaptive ANOVA strategy to provide an accurate surrogate as the union of several low-dimensional spaces, avoiding the typical computational cost of a high-dimensional expansion. It is also adaptively simplified according to the relative contribution of the different polynomials to the total variance. The approach is demonstrated for a laser-induced turbulent combustion simulation model, which includes parameters with correlated effects.
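
    A much-simplified sketch of the regression-based surrogate construction is given below: a low-order orthogonal-polynomial expansion is fit by least squares to sampled model evaluations, and variance-based sensitivity indices are read from the coefficients. It assumes independent uniform inputs and Legendre polynomials, and omits the adaptive ANOVA truncation and the covariance-based indices needed for correlated parameters.

```python
import numpy as np
from numpy.polynomial import legendre

rng = np.random.default_rng(0)

# Hypothetical model with two uncertain inputs on [-1, 1].
def model(x1, x2):
    return 1.0 + 2.0 * x1 + 0.5 * x2**2 + 0.3 * x1 * x2

# Sample inputs (flexible sampling is the point of the regression approach).
n = 400
X = rng.uniform(-1.0, 1.0, size=(n, 2))
y = model(X[:, 0], X[:, 1])

# Total-degree-2 Legendre basis: 1, P1(x1), P1(x2), P2(x1), P2(x2), P1(x1)*P1(x2).
def basis(X):
    P1 = lambda x: legendre.legval(x, [0, 1])
    P2 = lambda x: legendre.legval(x, [0, 0, 1])
    return np.column_stack([np.ones(len(X)),
                            P1(X[:, 0]), P1(X[:, 1]),
                            P2(X[:, 0]), P2(X[:, 1]),
                            P1(X[:, 0]) * P1(X[:, 1])])

coef, *_ = np.linalg.lstsq(basis(X), y, rcond=None)

# Variance contributed by each non-constant term; E[P_k^2] = 1/(2k+1) under the
# uniform measure on [-1, 1], and the product term factorizes for independent inputs.
norms = np.array([1/3, 1/3, 1/5, 1/5, 1/9])
var_terms = coef[1:] ** 2 * norms
for lab, v in zip(["x1", "x2", "x1^2", "x2^2", "x1*x2"], var_terms):
    print(f"{lab:6s} sensitivity index {v / var_terms.sum():.3f}")
```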

  10. Assessing potential effects of highway runoff on receiving-water quality at selected sites in Oregon with the Stochastic Empirical Loading and Dilution Model (SELDM)

    USGS Publications Warehouse

    Risley, John C.; Granato, Gregory E.

    2014-01-01

    6. An analysis of the use of grab sampling and nonstochastic upstream modeling methods was done to evaluate the potential effects on modeling outcomes. Additional analyses using surrogate water-quality datasets for the upstream basin and highway catchment were provided for six Oregon study sites to illustrate the risk-based information that SELDM will produce. These analyses show that the potential effects of highway runoff on receiving-water quality downstream of the outfall depend on the ratio of drainage areas (dilution), the quality of the receiving water upstream of the highway, and the criterion concentration for the constituent of interest. These analyses also show that the probability of exceeding a water-quality criterion may depend on the input statistics used, so careful selection of representative values is important.

  11. Organism-level models: When mechanisms and statistics fail us

    NASA Astrophysics Data System (ADS)

    Phillips, M. H.; Meyer, J.; Smith, W. P.; Rockhill, J. K.

    2014-03-01

    Purpose: To describe the unique characteristics of models that represent the entire course of radiation therapy at the organism level and to highlight the uses to which such models can be put. Methods: At the level of an organism, traditional model-building runs into severe difficulties. We do not have sufficient knowledge to devise a complete biochemistry-based model. Statistical model-building fails due to the vast number of variables and the inability to control many of them in any meaningful way. Finally, building surrogate models, such as animal-based models, can result in excluding some of the most critical variables. Bayesian probabilistic models (Bayesian networks) provide a useful alternative that has the advantages of being mathematically rigorous, incorporating the knowledge that we do have, and being practical. Results: Bayesian networks representing radiation therapy pathways for prostate cancer and head & neck cancer were used to highlight the important aspects of such models and some techniques of model-building. A more specific model representing the treatment of occult lymph nodes in head & neck cancer was provided as an example of how such a model can inform clinical decisions. A model of the possible role of PET imaging in brain cancer was used to illustrate the means by which clinical trials can be modelled in order to arrive at a trial design that will have meaningful outcomes. Conclusions: Probabilistic models are currently the most useful approach to representing the entire therapy outcome process.

  12. Air Pollution Exposure Modeling for Epidemiology Studies and Public Health

    EPA Science Inventory

    Air pollution epidemiology studies of ambient fine particulate matter (PM2.5) often use outdoor concentrations as exposure surrogates. These surrogates can induce exposure error since they do not account for (1) time spent indoors with ambient PM2.5 levels attenuated from outdoor...

  13. Selecting surrogate endpoints for estimating pesticide effects on avian reproductive success

    EPA Science Inventory

    A Markov chain nest productivity model (MCnest) has been developed for projecting the effects of a specific pesticide-use scenario on the annual reproductive success of avian species of concern. A critical element in MCnest is the use of surrogate endpoints, defined as measured ...

  14. Use of predictive models and rapid methods to nowcast bacteria levels at coastal beaches

    USGS Publications Warehouse

    Francy, Donna S.

    2009-01-01

    The need for rapid assessments of recreational water quality to better protect public health is well accepted throughout the research and regulatory communities. Rapid analytical methods, such as quantitative polymerase chain reaction (qPCR) and immunomagnetic separation/adenosine triphosphate (ATP) analysis, are being tested but are not yet ready for widespread use. Another solution is the use of predictive models, wherein variable(s) that are easily and quickly measured are surrogates for concentrations of fecal-indicator bacteria. Rainfall-based alerts, the simplest type of model, have been used by several communities for a number of years. Deterministic models use mathematical representations of the processes that affect bacteria concentrations; this type of model is being used for beach-closure decisions at one location in the USA. Multivariable statistical models are being developed and tested in many areas of the USA; however, they are only used in three areas of the Great Lakes to aid in notifications of beach advisories or closings. These “operational” statistical models can result in more accurate assessments of recreational water quality than use of the previous day's Escherichia coli (E. coli) concentration as determined by traditional culture methods. The Ohio Nowcast, at Huntington Beach, Bay Village, Ohio, is described in this paper as an example of an operational statistical model. Because predictive modeling is a dynamic process, water-resource managers continue to collect additional data to improve the predictive ability of the nowcast and expand the nowcast to other Ohio beaches and a recreational river. Although predictive models have been shown to work well at some beaches and are becoming more widely accepted, implementation in many areas is limited by funding, lack of coordinated technical leadership, and lack of supporting epidemiological data.

  15. Probabilistic Fatigue Damage Prognosis Using a Surrogate Model Trained Via 3D Finite Element Analysis

    NASA Technical Reports Server (NTRS)

    Leser, Patrick E.; Hochhalter, Jacob D.; Newman, John A.; Leser, William P.; Warner, James E.; Wawrzynek, Paul A.; Yuan, Fuh-Gwo

    2015-01-01

    Utilizing inverse uncertainty quantification techniques, structural health monitoring can be integrated with damage progression models to form probabilistic predictions of a structure's remaining useful life. However, damage evolution in realistic structures is physically complex. Accurately representing this behavior requires high-fidelity models which are typically computationally prohibitive. In the present work, a high-fidelity finite element model is represented by a surrogate model, reducing computation times. The new approach is used with damage diagnosis data to form a probabilistic prediction of remaining useful life for a test specimen under mixed-mode conditions.

  16. Using an external surrogate for predictor model training in real-time motion management of lung tumors.

    PubMed

    Rottmann, Joerg; Berbeco, Ross

    2014-12-01

    Precise prediction of respiratory motion is a prerequisite for real-time motion compensation techniques such as beam, dynamic couch, or dynamic multileaf collimator tracking. Collection of tumor motion data to train the prediction model is required for most algorithms. To avoid exposure of patients to additional dose from imaging during this procedure, the feasibility of training a linear respiratory motion prediction model with an external surrogate signal is investigated and its performance benchmarked against training the model with tumor positions directly. The authors implement a lung tumor motion prediction algorithm based on linear ridge regression that is suitable to overcome system latencies up to about 300 ms. Its performance is investigated on a data set of 91 patient breathing trajectories recorded from fiducial marker tracking during radiotherapy delivery to the lung of ten patients. The expected 3D geometric error is quantified as a function of predictor lookahead time, signal sampling frequency and history vector length. Additionally, adaptive model retraining is evaluated, i.e., repeatedly updating the prediction model after initial training. Training length for this is gradually increased with incoming (internal) data availability. To assess practical feasibility model calculation times as well as various minimum data lengths for retraining are evaluated. Relative performance of model training with external surrogate motion data versus tumor motion data is evaluated. However, an internal-external motion correlation model is not utilized, i.e., prediction is solely driven by internal motion in both cases. Similar prediction performance was achieved for training the model with external surrogate data versus internal (tumor motion) data. Adaptive model retraining can substantially boost performance in the case of external surrogate training while it has little impact for training with internal motion data. A minimum adaptive retraining data length of 8 s and history vector length of 3 s achieve maximal performance. Sampling frequency appears to have little impact on performance confirming previously published work. By using the linear predictor, a relative geometric 3D error reduction of about 50% was achieved (using adaptive retraining, a history vector length of 3 s and with results averaged over all investigated lookahead times and signal sampling frequencies). The absolute mean error could be reduced from (2.0 ± 1.6) mm when using no prediction at all to (0.9 ± 0.8) mm and (1.0 ± 0.9) mm when using the predictor trained with internal tumor motion training data and external surrogate motion training data, respectively (for a typical lookahead time of 250 ms and sampling frequency of 15 Hz). A linear prediction model can reduce latency induced tracking errors by an average of about 50% in real-time image guided radiotherapy systems with system latencies of up to 300 ms. Training a linear model for lung tumor motion prediction with an external surrogate signal alone is feasible and results in similar performance as training with (internal) tumor motion. Particularly for scenarios where motion data are extracted from fluoroscopic imaging with ionizing radiation, this may alleviate the need for additional imaging dose during the collection of model training data.
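
    A minimal sketch of the latency-compensation idea follows, assuming a one-dimensional motion trace sampled at 15 Hz, a 3 s history vector, and a 250 ms lookahead, with scikit-learn's Ridge as the linear predictor; the retraining schedule, 3D geometry, and error metrics of the actual study are not reproduced.

```python
import numpy as np
from sklearn.linear_model import Ridge

fs = 15.0                              # sampling frequency [Hz]
lookahead = int(round(0.250 * fs))     # ~250 ms prediction horizon in samples
history = int(round(3.0 * fs))         # 3 s history vector length in samples

# Toy respiratory-like trace (would be an external marker or tumor position, in mm).
rng = np.random.default_rng(0)
t = np.arange(0, 120.0, 1.0 / fs)
signal = 10.0 * np.sin(2 * np.pi * 0.25 * t) + 0.5 * rng.standard_normal(t.size)

# Build (history vector -> future sample) training pairs.
def make_dataset(x):
    X, y = [], []
    for i in range(history, x.size - lookahead):
        X.append(x[i - history:i])
        y.append(x[i + lookahead])
    return np.array(X), np.array(y)

X, y = make_dataset(signal)
split = X.shape[0] // 2
model = Ridge(alpha=1.0).fit(X[:split], y[:split])   # training data could be surrogate motion
pred = model.predict(X[split:])

err_pred = np.mean(np.abs(pred - y[split:]))
err_none = np.mean(np.abs(X[split:, -1] - y[split:]))  # "no prediction": hold latest sample
print(f"mean error with predictor {err_pred:.2f} mm, without {err_none:.2f} mm")
```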

  17. Applications of New Surrogate Global Optimization Algorithms including Efficient Synchronous and Asynchronous Parallelism for Calibration of Expensive Nonlinear Geophysical Simulation Models.

    NASA Astrophysics Data System (ADS)

    Shoemaker, C. A.; Pang, M.; Akhtar, T.; Bindel, D.

    2016-12-01

    New parallel surrogate global optimization algorithms are developed and applied to objective functions that are expensive simulations (possibly with multiple local minima). The algorithms can be applied to most geophysical simulations, including those with nonlinear partial differential equations, and do not require that the simulations themselves be parallelized. Asynchronous (and synchronous) parallel execution is available in the optimization toolbox "pySOT". The parallel algorithms are modified from their serial versions to eliminate fine-grained parallelism. The optimization is computed with the open-source software pySOT, a Surrogate Global Optimization Toolbox that allows the user to pick the type of surrogate (or ensembles), the search procedure on the surrogate, and the type of parallelism (synchronous or asynchronous). pySOT also allows the user to develop new algorithms by modifying parts of the code. In the applications here, the objective function takes up to 30 minutes for one simulation, and serial optimization can take over 200 hours. Results from the Yellowstone (NSF) and NCSS (Singapore) supercomputers are given for groundwater contaminant hydrology simulations with applications to model parameter estimation and decontamination management. All results are compared with alternatives. The first results are for optimization of pumping at many wells to reduce the cost of decontaminating groundwater at a superfund site. The optimization runs with up to 128 processors. Superlinear speedup is obtained for up to 16 processors, and efficiency with 64 processors is over 80%. Each evaluation of the objective function requires the solution of nonlinear partial differential equations to describe the impact of spatially distributed pumping and model parameters on model predictions for the spatial and temporal distribution of groundwater contaminants. The second application uses asynchronous parallel global optimization for groundwater quality model calibration. The time for a single objective function evaluation varies unpredictably, so efficiency is improved with asynchronous parallel calculations to improve load balancing. The third application (done at NCSS) incorporates new global surrogate multi-objective parallel search algorithms into pySOT and applies them to a large watershed calibration problem.
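
    The loop below is a generic, much-simplified illustration of surrogate-assisted global optimization (an RBF surrogate screened over random candidate points, with one expensive evaluation per iteration); it does not use pySOT's API, and the objective is a cheap stand-in for a groundwater simulation.

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

rng = np.random.default_rng(0)
dim, lb, ub = 2, np.zeros(2), np.ones(2)

# Stand-in for an expensive simulation-based objective (e.g., cleanup cost).
def expensive_objective(x):
    return (x[0] - 0.3) ** 2 + (x[1] - 0.7) ** 2 + 0.1 * np.sin(10 * x[0])

# Initial space-filling evaluations.
X = rng.uniform(lb, ub, size=(8, dim))
y = np.array([expensive_objective(x) for x in X])

for it in range(30):
    surrogate = RBFInterpolator(X, y, kernel="thin_plate_spline")
    # Candidate points: local perturbations of the current best plus global samples.
    best = X[np.argmin(y)]
    cand = np.vstack([np.clip(best + 0.1 * rng.standard_normal((100, dim)), lb, ub),
                      rng.uniform(lb, ub, size=(100, dim))])
    x_next = cand[np.argmin(surrogate(cand))]      # cheap surrogate screening
    X = np.vstack([X, x_next])
    y = np.append(y, expensive_objective(x_next))  # one expensive evaluation per iteration

print("best point", X[np.argmin(y)], "objective", y.min())
```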

  18. SVAw - a web-based application tool for automated surrogate variable analysis of gene expression studies

    PubMed Central

    2013-01-01

    Background Surrogate variable analysis (SVA) is a powerful method to identify, estimate, and utilize the components of gene expression heterogeneity due to unknown and/or unmeasured technical, genetic, environmental, or demographic factors. These sources of heterogeneity are common in gene expression studies, and failing to incorporate them into the analysis can obscure results. Using SVA increases the biological accuracy and reproducibility of gene expression studies by identifying these sources of heterogeneity and correctly accounting for them in the analysis. Results Here we have developed a web application called SVAw (Surrogate Variable Analysis Web app) that provides a user-friendly interface for SVA analyses of genome-wide expression studies. The software has been developed based on the open-source Bioconductor SVA package. In our software, we have extended the SVA program functionality in three aspects: (i) SVAw performs a fully automated and user-friendly analysis workflow; (ii) it calculates probe/gene statistics for both pre- and post-SVA analysis and provides a table of results for the regression of gene expression on the primary variable of interest before and after correcting for surrogate variables; and (iii) it generates a comprehensive report file, including a graphical comparison of the outcome for the user. Conclusions SVAw is a freely accessible web-server solution for the surrogate variable analysis of high-throughput datasets and facilitates removing all unwanted and unknown sources of variation. It is freely available for use at http://psychiatry.igm.jhmi.edu/sva. The executable packages for both the web and standalone application and the instructions for installation can be downloaded from our web site. PMID:23497726

  19. Best Practicable Aggregation of Species: a step forward for species surrogacy in environmental assessment and monitoring

    PubMed Central

    Bevilacqua, Stanislao; Claudet, Joachim; Terlizzi, Antonio

    2013-01-01

    The available taxonomic expertise and knowledge of species is still inadequate to cope with the urgent need for cost-effective methods to quantify community response to natural and anthropogenic drivers of change. So far, the mainstream approach to overcoming these impediments has focused on using higher taxa as surrogates for species. However, the use of such taxonomic surrogates often limits inferences about the causality of community patterns, which in turn is essential for effective environmental management strategies. Here, we propose an alternative approach to species surrogacy, the “Best Practicable Aggregation of Species” (BestAgg), in which surrogates are not bound to fixed taxonomic schemes. The approach uses null models from random aggregations of species to minimize the number of surrogates without causing significant losses of information on community patterns. Surrogate types are then selected in order to maximize ecological information. We applied the approach to real case studies on natural and human-driven gradients from marine benthic communities. Outcomes from BestAgg were also compared with those obtained using classic taxonomic surrogates. Results showed that BestAgg surrogates are effective in detecting community changes. In contrast to classic taxonomic surrogates, BestAgg surrogates retain significantly more information on species-level community patterns than expected by chance and offer potential time savings of up to 25% during sample processing. Our findings showed that BestAgg surrogates from a pilot study could be used successfully in similar environmental investigations in the same area, or for subsequent long-term monitoring programs. BestAgg is virtually applicable to any environmental context, allowing multiple surrogacy schemes to be exploited beyond perspectives that rely strictly on taxonomic relatedness among species. This prerogative is crucial to extend the concept of species surrogacy to the ecological traits of species, thus leading to ecologically meaningful surrogates that, while cost-effective in reflecting community patterns, may also contribute to unveiling underlying processes. A specific R code for BestAgg is provided. PMID:24198939

  20. High Accuracy Transistor Compact Model Calibrations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hembree, Charles E.; Mar, Alan; Robertson, Perry J.

    2015-09-01

    Typically, transistors are modeled by the application of calibrated nominal and range models. These models consist of differing parameter values that describe the location and the upper and lower limits of a distribution of some transistor characteristic such as current capacity. Correspondingly, when using this approach, high degrees of accuracy of the transistor models are not expected, since the set of models is a surrogate for a statistical description of the devices. The use of these types of models describes expected performances considering the extremes of process or transistor deviations. In contrast, circuits that have very stringent accuracy requirements require modeling techniques with higher accuracy. Since these accurate models have low error in transistor descriptions, they can be used to describe part-to-part variations as well as to accurately describe a single circuit instance. Thus, models that meet these stipulations also enable the quantification of margins with respect to a functional threshold and of the uncertainties in these margins. Given this need, new high-accuracy model calibration techniques for bipolar junction transistors have been developed and are described in this report.

  1. Chance-constrained multi-objective optimization of groundwater remediation design at DNAPLs-contaminated sites using a multi-algorithm genetically adaptive method.

    PubMed

    Ouyang, Qi; Lu, Wenxi; Hou, Zeyu; Zhang, Yu; Li, Shuai; Luo, Jiannan

    2017-05-01

    In this paper, a multi-algorithm genetically adaptive multi-objective (AMALGAM) method is proposed as a multi-objective optimization solver. It was implemented in the multi-objective optimization of a groundwater remediation design at sites contaminated by dense non-aqueous phase liquids. In this study, there were two objectives: minimization of the total remediation cost, and minimization of the remediation time. A non-dominated sorting genetic algorithm II (NSGA-II) was adopted to compare with the proposed method. For efficiency, the time-consuming surfactant-enhanced aquifer remediation simulation model was replaced by a surrogate model constructed by a multi-gene genetic programming (MGGP) technique. Similarly, two other surrogate modeling methods-support vector regression (SVR) and Kriging (KRG)-were employed to make comparisons with MGGP. In addition, the surrogate-modeling uncertainty was incorporated in the optimization model by chance-constrained programming (CCP). The results showed that, for the problem considered in this study, (1) the solutions obtained by AMALGAM incurred less remediation cost and required less time than those of NSGA-II, indicating that AMALGAM outperformed NSGA-II. It was additionally shown that (2) the MGGP surrogate model was more accurate than SVR and KRG; and (3) the remediation cost and time increased with the confidence level, which can enable decision makers to make a suitable choice by considering the given budget, remediation time, and reliability. Copyright © 2017 Elsevier B.V. All rights reserved.

  2. Data Assimilation - Advances and Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Williams, Brian J.

    2014-07-30

    This presentation provides an overview of data assimilation (model calibration) for complex computer experiments. Calibration refers to the process of probabilistically constraining uncertain physics/engineering model inputs to be consistent with observed experimental data. An initial probability distribution for these parameters is updated using the experimental information. Utilization of surrogate models and empirical adjustment for model form error in code calibration form the basis for the statistical methodology considered. The role of probabilistic code calibration in supporting code validation is discussed. Incorporation of model form uncertainty in rigorous uncertainty quantification (UQ) analyses is also addressed. Design criteria used within a batch sequential design algorithm are introduced for efficiently achieving predictive maturity and improved code calibration. Predictive maturity refers to obtaining stable predictive inference with calibrated computer codes. These approaches allow for augmentation of initial experiment designs for collecting new physical data. A standard framework for data assimilation is presented and techniques for updating the posterior distribution of the state variables based on particle filtering and the ensemble Kalman filter are introduced.
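
    As a concrete reference for one of the update techniques mentioned, the sketch below implements a perturbed-observation ensemble Kalman filter analysis step for a toy two-variable state; the notation and numbers are illustrative and are not taken from the presentation.

```python
import numpy as np

rng = np.random.default_rng(0)

def enkf_update(ensemble, H, obs, obs_cov):
    """Perturbed-observation EnKF analysis step.

    ensemble : (n_members, n_state) prior (forecast) ensemble
    H        : (n_obs, n_state) linear observation operator
    obs      : (n_obs,) observed data
    obs_cov  : (n_obs, n_obs) observation-error covariance
    """
    n_members = ensemble.shape[0]
    X = ensemble - ensemble.mean(axis=0)                 # state anomalies
    P = X.T @ X / (n_members - 1)                        # sample covariance
    K = P @ H.T @ np.linalg.inv(H @ P @ H.T + obs_cov)   # Kalman gain
    # Each member assimilates a perturbed copy of the observations.
    perturbed = obs + rng.multivariate_normal(np.zeros(len(obs)), obs_cov, n_members)
    return ensemble + (perturbed - ensemble @ H.T) @ K.T

# Toy example: two-variable state, one observation of the first variable.
prior = rng.multivariate_normal([1.0, 0.0], [[1.0, 0.6], [0.6, 1.0]], size=200)
H = np.array([[1.0, 0.0]])
posterior = enkf_update(prior, H, obs=np.array([2.0]), obs_cov=np.array([[0.25]]))
print("prior mean", prior.mean(axis=0), "-> posterior mean", posterior.mean(axis=0))
```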

  3. Signal decomposition for surrogate modeling of a constrained ultrasonic design space

    NASA Astrophysics Data System (ADS)

    Homa, Laura; Sparkman, Daniel; Wertz, John; Welter, John; Aldrin, John C.

    2018-04-01

    The U.S. Air Force seeks to improve the methods and measures by which the lifecycle of composite structures is managed. Nondestructive evaluation of damage - particularly internal damage resulting from impact - represents a significant input to that improvement. Conventional ultrasound can detect this damage; however, full 3D characterization has not been demonstrated. A proposed approach for robust characterization uses model-based inversion through fitting of simulated results to experimental data. One challenge with this approach is the high computational expense of the forward model used to simulate the ultrasonic B-scans for each damage scenario. A potential solution is to construct a surrogate model using a subset of simulated ultrasonic scans built using a highly accurate, computationally expensive forward model. However, the dimensionality of these simulated B-scans makes interpolating between them a difficult and potentially infeasible problem. Thus, we propose using the chirplet decomposition to reduce the dimensionality of the data and allow for interpolation in the chirplet parameter space. By applying the chirplet decomposition, we are able to extract the salient features in the data and construct a surrogate forward model.
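
    The sketch below illustrates the dimensionality-reduction idea in its simplest form: a waveform is projected onto a small dictionary of Gaussian-windowed chirp atoms, so that a few coefficients (and the atom parameters) replace the raw samples. The dictionary, sampling rate, and plain least-squares projection are illustrative simplifications, not the decomposition used in the actual work.

```python
import numpy as np

def chirplet(t, t0, f0, chirp_rate, sigma):
    """Gaussian-windowed linear chirp (real part of a chirplet atom), unit norm."""
    env = np.exp(-0.5 * ((t - t0) / sigma) ** 2)
    phase = 2 * np.pi * (f0 * (t - t0) + 0.5 * chirp_rate * (t - t0) ** 2)
    atom = env * np.cos(phase)
    return atom / np.linalg.norm(atom)

fs = 10e6                                  # 10 MHz sampling (illustrative)
t = np.arange(0, 20e-6, 1 / fs)

# Toy "A-scan": two echoes, stand-ins for front-wall and damage reflections.
signal = (3.0 * chirplet(t, 5e-6, 2e6, 0.0, 0.5e-6)
          + 1.2 * chirplet(t, 12e-6, 2.2e6, 5e10, 0.5e-6))

# Small chirplet dictionary over arrival time and chirp rate.
params = [(t0, 2e6, c, 0.5e-6)
          for t0 in np.arange(2e-6, 18e-6, 1e-6)
          for c in (0.0, 5e10)]
D = np.column_stack([chirplet(t, *p) for p in params])

# Least-squares projection: a few coefficients replace hundreds of samples.
coef, *_ = np.linalg.lstsq(D, signal, rcond=None)
recon = D @ coef
print(f"{signal.size} samples -> {coef.size} coefficients, "
      f"relative error {np.linalg.norm(signal - recon) / np.linalg.norm(signal):.3f}")
```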

  4. Narrative Interest Standard: A Novel Approach to Surrogate Decision-Making for People With Dementia.

    PubMed

    Wilkins, James M

    2017-06-17

    Dementia is a common neurodegenerative process that can significantly impair decision-making capacity as the disease progresses. When a person is found to lack capacity to make a decision, a surrogate decision-maker is generally sought to aid in decision-making. Typical bases for surrogate decision-making include the substituted judgment standard and the best interest standard. Given the heterogeneous and progressive course of dementia, however, these standards for surrogate decision-making are often insufficient in providing guidance for the decision-making for a person with dementia, escalating the likelihood of conflict in these decisions. In this article, the narrative interest standard is presented as a novel and more appropriate approach to surrogate decision-making for people with dementia. Through case presentation and ethical analysis, the standard mechanisms for surrogate decision-making for people with dementia are reviewed and critiqued. The narrative interest standard is then introduced and discussed as a dementia-specific model for surrogate decision-making. Through incorporation of elements of a best interest standard in focusing on the current benefit-burden ratio and elements of narrative to provide context, history, and flexibility for values and preferences that may change over time, the narrative interest standard allows for elaboration of an enriched context for surrogate decision-making for people with dementia. More importantly, however, a narrative approach encourages the direct contribution from people with dementia in authoring the story of what matters to them in their lives.

  5. The Use of Surrogate Data in Demographic Population Viability Analysis: A Case Study of California Sea Lions

    PubMed Central

    2015-01-01

    Reliable data necessary to parameterize population models are seldom available for imperiled species. As an alternative, data from populations of the same species or from ecologically similar species have been used to construct models. In this study, we evaluated the use of demographic data collected at one California sea lion colony (Los Islotes) to predict the population dynamics of the same species from two other colonies (San Jorge and Granito) in the Gulf of California, Mexico, for which demographic data are lacking. To do so, we developed a stochastic demographic age-structured matrix model and conducted a population viability analysis for each colony. For the Los Islotes colony we used site-specific pup, juvenile, and adult survival probabilities, as well as birth rates for older females. For the other colonies, we used site-specific pup and juvenile survival probabilities, but used surrogate data from Los Islotes for adult survival probabilities and birth rates. We assessed these models by comparing simulated retrospective population trajectories to observed population trends based on count data. The projected population trajectories approximated the observed trends when surrogate data were used for one colony but failed to match for a second colony. Our results indicate that species-specific and even region-specific surrogate data may lead to erroneous conservation decisions. These results highlight the importance of using population-specific demographic data in assessing extinction risk. When vital rates are not available and immediate management actions must be taken, in particular for imperiled species, we recommend the use of surrogate data only when the populations appear to have similar population trends. PMID:26413746
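
    A minimal sketch of a stochastic age-structured projection of the kind described is shown below, with invented survival and birth rates standing in for the colony-specific and surrogate vital rates.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative (not estimated) female vital rates for four age classes:
# pup, juvenile, subadult, adult.
survival = np.array([0.70, 0.80, 0.88, 0.92])
birth_rate = np.array([0.00, 0.00, 0.20, 0.60])   # female pups per female per year

def project(n0, years=50, cv=0.10):
    """Project abundance with lognormal environmental noise on vital rates."""
    n = n0.astype(float)
    trajectory = [n.sum()]
    for _ in range(years):
        s = np.minimum(survival * rng.lognormal(0.0, cv, survival.size), 0.99)
        b = birth_rate * rng.lognormal(0.0, cv, birth_rate.size)
        A = np.zeros((4, 4))
        A[0, :] = b                                   # fecundity row
        A[1, 0], A[2, 1], A[3, 2] = s[0], s[1], s[2]  # ageing transitions
        A[3, 3] = s[3]                                # adults remain adults
        n = A @ n
        trajectory.append(n.sum())
    return np.array(trajectory)

# Population viability analysis: many stochastic trajectories from one starting vector.
n0 = np.array([120, 90, 70, 220])
runs = np.array([project(n0) for _ in range(1000)])
p_decline = np.mean(runs[:, -1] < 0.5 * n0.sum())
print(f"probability of >50% decline over 50 years: {p_decline:.2f}")
```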

  6. Across-Platform Imputation of DNA Methylation Levels Incorporating Nonlocal Information Using Penalized Functional Regression.

    PubMed

    Zhang, Guosheng; Huang, Kuan-Chieh; Xu, Zheng; Tzeng, Jung-Ying; Conneely, Karen N; Guan, Weihua; Kang, Jian; Li, Yun

    2016-05-01

    DNA methylation is a key epigenetic mark involved in both normal development and disease progression. Recent advances in high-throughput technologies have enabled genome-wide profiling of DNA methylation. However, DNA methylation profiling often employs different designs and platforms with varying resolution, which hinders joint analysis of methylation data from multiple platforms. In this study, we propose a penalized functional regression model to impute missing methylation data. By incorporating functional predictors, our model utilizes information from nonlocal probes to improve imputation quality. Here, we compared the performance of our functional model to linear regression and the best single probe surrogate in real data and via simulations. Specifically, we applied different imputation approaches to an acute myeloid leukemia dataset consisting of 194 samples and our method showed higher imputation accuracy, manifested, for example, by a 94% relative increase in information content and up to 86% more CpG sites passing post-imputation filtering. Our simulated association study further demonstrated that our method substantially improves the statistical power to identify trait-associated methylation loci. These findings indicate that the penalized functional regression model is a convenient and valuable imputation tool for methylation data, and it can boost statistical power in downstream epigenome-wide association study (EWAS). © 2016 WILEY PERIODICALS, INC.
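
    The sketch below conveys the general shape of a penalized functional regression imputation: methylation at nonlocal neighboring probes is treated as a curve, expanded in a smooth basis, and the basis scores enter a ridge-penalized regression for the target probe. The cosine basis, simulated data, and penalty choice are placeholders, not the authors' implementation.

```python
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)

n_samples, n_neighbors = 300, 40           # neighboring probes around the target CpG
positions = np.linspace(0.0, 1.0, n_neighbors)

# Simulated neighborhood methylation curves and a target probe that depends
# on a smooth functional of the neighborhood (stand-in for real array data).
curves = np.cumsum(rng.normal(0.0, 0.05, (n_samples, n_neighbors)), axis=1) + 0.5
true_weight = np.sin(np.pi * positions)    # smooth coefficient function
target = curves @ true_weight / n_neighbors + rng.normal(0.0, 0.02, n_samples)

# Expand the functional predictor in a small cosine basis:
# x_i(t) -> scores z_ik = mean_t[ x_i(t) * phi_k(t) ].
n_basis = 6
basis = np.column_stack([np.cos(np.pi * k * positions) for k in range(n_basis)])
scores = curves @ basis / n_neighbors

# Ridge penalty on the basis coefficients provides the "penalized" part.
train, test = slice(0, 200), slice(200, None)
model = Ridge(alpha=1.0).fit(scores[train], target[train])
pred = model.predict(scores[test])
ss_res = np.sum((pred - target[test]) ** 2)
ss_tot = np.sum((target[test] - target[test].mean()) ** 2)
print(f"imputation R^2 on held-out samples: {1.0 - ss_res / ss_tot:.3f}")
```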

  7. Tumor response and progression-free survival as potential surrogate endpoints for overall survival in extensive stage small-cell lung cancer: findings on the basis of North Central Cancer Treatment Group trials.

    PubMed

    Foster, Nathan R; Qi, Yingwei; Shi, Qian; Krook, James E; Kugler, John W; Jett, James R; Molina, Julian R; Schild, Steven E; Adjei, Alex A; Mandrekar, Sumithra J

    2011-03-15

    The authors investigated the putative surrogate endpoints of best response, complete response (CR), confirmed response, and progression-free survival (PFS) for associations with overall survival (OS), and as possible surrogate endpoints for OS. Individual patient data from 870 untreated extensive stage small-cell lung cancer patients participating in 6 single-arm (274 patients) and 3 randomized trials (596 patients) were pooled. Patient-level associations between putative surrogate endpoints and OS were assessed by Cox models using landmark analyses. Trial-level surrogacy of putative surrogate endpoints were assessed by the association of treatment effects on OS and individual putative surrogate endpoints. Trial-level surrogacy measures included: R(2) from weighted least squares regression model, Spearman correlation coefficient, and R(2) from bivariate survival model (Copula R(2) ). Median OS and PFS were 9.6 (95% confidence interval [CI], 9.1-10.0) and 5.5 (95% CI, 5.2-5.9) months, respectively; best response, CR, and confirmed response rates were 44%, 22%, and 34%, respectively. Patient-level associations showed that PFS status at 4 months was a strong predictor of subsequent survival (hazard ratio [HR], 0.42; 95% CI, 0.35-0.51; concordance index 0.63; P < .01), with 6-month PFS being the strongest (HR, 0.41; 95% CI, 0.35-0.49; concordance index, 0.66, P < .01). At the trial level, PFS showed the highest level of surrogacy for OS (weighted least squares R(2) = 0.79; Copula R(2) = 0.80), explaining 79% of the variance in OS. Tumor response endpoints showed lower surrogacy levels (weighted least squares R(2) ≤0.48). PFS was strongly associated with OS at both the patient and trial levels. PFS also shows promise as a potential surrogate for OS, but further validation is needed using data from a larger number of randomized phase 3 trials. Copyright © 2010 American Cancer Society.

  8. Total N-nitrosamine Precursor Adsorption with Carbon Nanotubes: Elucidating Controlling Physiochemical Properties and Developing a Size-Resolved Precursor Surrogate

    NASA Astrophysics Data System (ADS)

    Needham, Erin Michelle

    As drinking water sources become increasingly impaired with nutrients and wastewater treatment plant (WWTP) effluent, formation of disinfection byproducts (DBPs)--such as trihalomethanes (THMs), dihaloacetonitriles (DHANs), and N-nitrosamines--during water treatment may also increase. N-nitrosamines may comprise the bulk of the chronic toxicity in treated drinking waters despite forming at low ng/L levels. This research seeks to elucidate physicochemical properties of carbon nanotubes (CNTs) for removal of DBP precursors, with an emphasis on total N-nitrosamines (TONO). Batch experiments with CNTs were completed to assess adsorption of THM, DHAN, and TONO precursors; physicochemical properties of CNTs were quantified through gas adsorption isotherms and X-ray photoelectron spectroscopy. Numerical modeling was used to elucidate the characteristics of CNTs controlling DBP precursor adsorption. Multivariate models developed with unmodified CNTs revealed that surface carboxyl groups and, for TONO precursors, cumulative pore volume (CPV) controlled DBP precursor adsorption. Models developed with modified CNTs revealed that specific surface area controlled adsorption of THM and DHAN precursors, while CPV and surface oxygen content were significant for adsorption of TONO precursors. While surrogates of THM and DHAN precursors leverage metrics from UV absorbance and fluorescence spectroscopy, a TONO precursor surrogate has proved elusive. This is important because measurements of TONO formation potential (TONOFP) require large sample volumes and long processing times, which impairs development of treatment processes. TONO precursor surrogates were developed using samples that had undergone oxidative or sorption treatments. Precursors were analyzed with asymmetric flow field-flow fractionation (AF4) with inline fluorescence detection (FLD) and whole-water fluorescence excitation-emission matrices (EEMs). TONO precursor surrogates were discovered that are capable of predicting changes in TONOFP in WWTP samples that have undergone oxidation (R2 = 0.996) and sorption (R2 = 0.576). Importantly, both surrogates require just 2 mL of sample volume and only 1 hour to measure. Application of the sorption precursor surrogate revealed that DBP precursor adsorption is feasible with freeform CNT microstructures with various dimensions and surface chemistries, establishing a framework for development of this novel CNT application for drinking water treatment.

  9. An Integrated Optimization Design Method Based on Surrogate Modeling Applied to Diverging Duct Design

    NASA Astrophysics Data System (ADS)

    Hanan, Lu; Qiushi, Li; Shaobin, Li

    2016-12-01

    This paper presents an integrated optimization design method in which uniform design, response surface methodology and genetic algorithm are used in combination. In detail, uniform design is used to select the experimental sampling points in the experimental domain, and the system performance is evaluated by means of computational fluid dynamics to construct a database. After that, response surface methodology is employed to generate a surrogate mathematical model relating the optimization objective to the design variables. Subsequently, a genetic algorithm is applied to the surrogate model to acquire the optimal solution subject to some constraints. The method has been applied to the optimization design of an axisymmetric diverging duct, dealing with three design variables including one qualitative variable and two quantitative variables. The modeling and optimization method performs well in improving the duct's aerodynamic performance and can also be applied to wider fields of mechanical design, serving as a useful tool for engineering designers by reducing design time and computational cost.
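
    A compact sketch of the pipeline follows: sample the design space, fit a quadratic response surface by least squares, and let a very small genetic algorithm search the surrogate rather than the expensive CFD evaluation. The objective function, sampling plan, and GA settings are placeholders; the paper's uniform-design sampling and constraint handling are not reproduced.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for the CFD evaluation of duct performance (to be maximized).
def cfd_performance(x):
    return -(x[0] - 0.6) ** 2 - 2.0 * (x[1] - 0.3) ** 2 + 0.1 * x[0] * x[1]

# 1. Sample the design space and build the database.
X = rng.uniform(0.0, 1.0, size=(25, 2))
y = np.array([cfd_performance(x) for x in X])

# 2. Quadratic response surface fitted by least squares.
def features(X):
    x1, x2 = X[:, 0], X[:, 1]
    return np.column_stack([np.ones(len(X)), x1, x2, x1**2, x2**2, x1 * x2])

coef, *_ = np.linalg.lstsq(features(X), y, rcond=None)
surrogate = lambda X: features(np.atleast_2d(X)) @ coef

# 3. Tiny genetic algorithm searching the surrogate, not the expensive model.
pop = rng.uniform(0.0, 1.0, size=(40, 2))
for gen in range(60):
    fitness = surrogate(pop)
    parents = pop[np.argsort(fitness)[-20:]]                   # selection
    children = (parents[rng.integers(0, 20, 40)]
                + parents[rng.integers(0, 20, 40)]) / 2.0      # crossover
    pop = np.clip(children + rng.normal(0.0, 0.02, children.shape), 0.0, 1.0)  # mutation

best = pop[np.argmax(surrogate(pop))]
print("surrogate optimum", best, "verified performance", cfd_performance(best))
```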

  10. Ecological risk of anthropogenic pollutants to reptiles: Evaluating assumptions of sensitivity and exposure.

    PubMed

    Weir, Scott M; Suski, Jamie G; Salice, Christopher J

    2010-12-01

    A large data gap for reptile ecotoxicology still persists; therefore, ecological risk assessments of reptiles usually incorporate the use of surrogate species. This necessitates that (1) the surrogate is at least as sensitive as the target taxon and/or (2) exposures to the surrogate are greater than that of the target taxon. We evaluated these assumptions for the use of birds as surrogates for reptiles. Based on a survey of the literature, birds were more sensitive than reptiles in less than 1/4 of the chemicals investigated. Dietary and dermal exposure modeling indicated that exposure to reptiles was relatively high, particularly when the dermal route was considered. We conclude that caution is warranted in the use of avian receptors as surrogates for reptiles in ecological risk assessment and emphasize the need to better understand the magnitude and mechanism of contaminant exposure in reptiles to improve exposure and risk estimation. Copyright © 2010 Elsevier Ltd. All rights reserved.

  11. Use of surrogate technologies to estimate suspended sediment in the Clearwater River, Idaho, and Snake River, Washington, 2008-10

    USGS Publications Warehouse

    Wood, Molly S.; Teasdale, Gregg N.

    2013-01-01

    Elevated levels of fluvial sediment can reduce the biological productivity of aquatic systems, impair freshwater quality, decrease reservoir storage capacity, and decrease the capacity of hydraulic structures. The need to measure fluvial sediment has led to the development of sediment surrogate technologies, particularly in locations where streamflow alone is not a good estimator of sediment load because of regulated flow, load hysteresis, episodic sediment sources, and non-equilibrium sediment transport. An effective surrogate technology is low maintenance and sturdy over a range of hydrologic conditions, and measured variables can be modeled to estimate suspended-sediment concentration (SSC), load, and duration of elevated levels on a real-time basis. Among the most promising techniques is the measurement of acoustic backscatter strength using acoustic Doppler velocity meters (ADVMs) deployed in rivers. The U.S. Geological Survey, in cooperation with the U.S. Army Corps of Engineers, Walla Walla District, evaluated the use of acoustic backscatter, turbidity, laser diffraction, and streamflow as surrogates for estimating real-time SSC and loads in the Clearwater and Snake Rivers, which adjoin in Lewiston, Idaho, and flow into Lower Granite Reservoir. The study was conducted from May 2008 to September 2010 and is part of the U.S. Army Corps of Engineers Lower Snake River Programmatic Sediment Management Plan to identify and manage sediment sources in basins draining into lower Snake River reservoirs. Commercially available acoustic instruments have shown great promise in sediment surrogate studies because they require little maintenance and measure profiles of the surrogate parameter across a sampling volume rather than at a single point. The strength of acoustic backscatter theoretically increases as more particles are suspended in the water to reflect the acoustic pulse emitted by the ADVM. ADVMs of different frequencies (0.5, 1.5, and 3 Megahertz) were tested to target various sediment grain sizes. Laser diffraction and turbidity also were tested as surrogate technologies. Models between SSC and surrogate variables were developed using ordinary least-squares regression. Acoustic backscatter using the high frequency ADVM at each site was the best predictor of sediment, explaining 93 and 92 percent of the variability in SSC and matching sediment sample data within +8.6 and +10 percent, on average, at the Clearwater River and Snake River study sites, respectively. Additional surrogate models were developed to estimate sand and fines fractions of suspended sediment based on acoustic backscatter. Acoustic backscatter generally appears to be a better estimator of suspended sediment concentration and load over short (storm event and monthly) and long (annual) time scales than transport curves derived solely from the regression of conventional sediment measurements and streamflow. Changing grain sizes, the presence of organic matter, and aggregation of sediments in the river likely introduce some variability in the model between acoustic backscatter and SSC.
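
    A minimal sketch of the rating-model step is given below: ordinary least-squares regression of log-transformed SSC on acoustic backscatter, with Duan's smearing estimator applied when retransforming from log space (a standard bias correction in sediment-surrogate work). The data are synthetic, and the grain-size fraction models and backscatter corrections of the actual study are not included.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic paired data: beam-averaged acoustic backscatter (dB) and sampled SSC (mg/L).
backscatter = rng.uniform(60.0, 95.0, 80)
log10_ssc = -2.0 + 0.05 * backscatter + rng.normal(0.0, 0.08, 80)
ssc = 10.0 ** log10_ssc

# Ordinary least-squares fit in log space: log10(SSC) = b0 + b1 * backscatter.
A = np.column_stack([np.ones_like(backscatter), backscatter])
(b0, b1), *_ = np.linalg.lstsq(A, np.log10(ssc), rcond=None)
residuals = np.log10(ssc) - (b0 + b1 * backscatter)

# Duan's smearing estimator corrects retransformation bias from log space.
smearing = np.mean(10.0 ** residuals)

def estimate_ssc(db):
    return smearing * 10.0 ** (b0 + b1 * db)

r2 = 1.0 - np.sum(residuals**2) / np.sum((np.log10(ssc) - np.log10(ssc).mean())**2)
print(f"rating: log10(SSC) = {b0:.2f} + {b1:.3f} * backscatter,  R^2 = {r2:.2f}")
print("real-time SSC estimate at 85 dB:", round(estimate_ssc(85.0), 1), "mg/L")
```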

  12. A computational framework for simultaneous estimation of muscle and joint contact forces and body motion using optimization and surrogate modeling.

    PubMed

    Eskinazi, Ilan; Fregly, Benjamin J

    2018-04-01

    Concurrent estimation of muscle activations, joint contact forces, and joint kinematics by means of gradient-based optimization of musculoskeletal models is hindered by computationally expensive and non-smooth joint contact and muscle wrapping algorithms. We present a framework that simultaneously speeds up computation and removes sources of non-smoothness from muscle force optimizations using a combination of parallelization and surrogate modeling, with special emphasis on a novel method for modeling joint contact as a surrogate model of a static analysis. The approach allows one to efficiently introduce elastic joint contact models within static and dynamic optimizations of human motion. We demonstrate the approach by performing two optimizations, one static and one dynamic, using a pelvis-leg musculoskeletal model undergoing a gait cycle. We observed convergence on the order of seconds for a static optimization time frame and on the order of minutes for an entire dynamic optimization. The presented framework may facilitate model-based efforts to predict how planned surgical or rehabilitation interventions will affect post-treatment joint and muscle function. Copyright © 2018 IPEM. Published by Elsevier Ltd. All rights reserved.

  13. Impact of short-term nutritional supplementation on surrogate markers of undernutrition in hemodialysis patients - prospective real-life interventional study.

    PubMed

    Ocepek, Andreja; Bevc, Sebastjan; Ekart, Robert

    Hemodialysis (HD) patients are at increased risk for undernutrition, especially protein wasting. We present the results of a prospective study in HD patients after 4 months of intervention with oral nutritional supplements (ONS). After a 3-month wash-out period, 92 HD patients were enrolled in the study. Patients were tested for undernutrition with composite parameters, laboratory tests, bioelectrical impedance analysis (BIA), and the hand-grip strength test (HGS). All patients fulfilling criteria for, or at high risk of, undernutrition were given ONS in addition to their regular diet. The impact of short-term ONS on surrogate markers of undernutrition was statistically analyzed. Data for 84 patients, 45 (53.6%) male, average age 63.3 years, were available for analysis after 4 months. Patients were divided into three groups: group A (n = 28), patients with normal nutritional status (NUS) at baseline not necessitating ONS; group B (n = 43), patients entitled to receive ONS; group C (n = 13), patients entitled to receive ONS but who refused to take it. In group B, patients received on average 4.1 bottles of ONS (902 mL; 1,623.6 kcal; 73.06 g protein) per week. Baseline results showed statistically significant differences between groups in serum albumin levels and phase angle (PhA) but not in HGS. After 4 months of ONS, we noticed stagnation of the observed markers in group B. Interestingly, in group A, significant deterioration of serum albumin and PhA was observed, but HGS improved. There was a trend towards worsening of serum albumin levels and HGS in group C that did not reach statistical significance. In undernourished HD patients, we did not find a statistically significant improvement of NUS after ONS as evaluated by surrogate markers. Nevertheless, in undernourished patients not receiving ONS, serum albumin and HGS showed a trend towards worsening, and even in well-nourished patients, nutritional markers (serum albumin and PhA) declined. We speculate that a certain positive effect of ONS on nutritional status in undernourished HD patients could be observed already after short-term supplementation.

  14. Production of cloned NIBS (Nippon Institute for Biological Science) and α-1, 3-galactosyltransferase knockout MGH miniature pigs by somatic cell nuclear transfer using the NIBS breed as surrogates.

    PubMed

    Shimatsu, Yoshiki; Yamada, Kazuhiko; Horii, Wataru; Hirakata, Atsushi; Sakamoto, Yuji; Waki, Shiori; Sano, Junichi; Saitoh, Toshiki; Sahara, Hisashi; Shimizu, Akira; Yazawa, Hajime; Sachs, David H; Nunoya, Tetsuo

    2013-01-01

    Nuclear transfer (NT) technologies offer a means for producing the genetically modified pigs necessary to develop swine models for mechanistic studies of disease processes as well as to serve as organ donors for xenotransplantation. Most previous studies have used commercial pigs as surrogates. In this study, we established a cloning technique for miniature pigs by somatic cell nuclear transfer (SCNT) using Nippon Institute for Biological Science (NIBS) miniature pigs as surrogates. Moreover, utilizing this technique, we have successfully produced an α-1, 3-galactosyltransferase knockout (GalT-KO) miniature swine. Fibroblasts procured from a NIBS miniature pig fetus were injected into 1312 enucleated oocytes. The cloned embryos were transferred to 11 surrogates of which five successfully delivered 13 cloned offspring; the production efficiency was 1.0% (13/1312). In a second experiment, lung fibroblasts obtained from neonatal GalT-KO MGH miniature swine were used as donor cells and 1953 cloned embryos were transferred to 12 surrogates. Six cloned offspring were born from five surrogates, a production efficiency of 0.3% (6/1953). These results demonstrate successful establishment of a miniature pig cloning technique by SCNT using NIBS miniature pigs as surrogates. To our knowledge, this is the first demonstration of successful production of GalT-KO miniature swine using miniature swine surrogates. This technique could help to ensure a stable supply of the cloned pigs through the use of miniature pig surrogates and could expand production in countries with limited space or in facilities with special regulations such as specific pathogen-free or good laboratory practice. © 2013 John Wiley & Sons A/S.

  15. Production of cloned NIBS (Nippon Institute for Biological Science) and α-1, 3-galactosyltransferase knockout MGH miniature pigs by somatic cell nuclear transfer using the NIBS breed as surrogates

    PubMed Central

    Shimatsu, Yoshiki; Yamada, Kazuhiko; Horii, Wataru; Hirakata, Atsushi; Sakamoto, Yuji; Waki, Shiori; Sano, Junichi; Saitoh, Toshiki; Sahara, Hisashi; Shimizu, Akira; Yazawa, Hajime; Sachs, David H.; Nunoya, Tetsuo

    2013-01-01

    Background Nuclear transfer (NT) technologies offer a means for producing the genetically modified pigs necessary to develop swine models for mechanistic studies of disease processes as well as to serve as organ donors for xenotransplantation. Most previous studies have used commercial pigs as surrogates. Method and Results In this study, we established a cloning technique for miniature pigs by somatic cell nuclear transfer (SCNT) using Nippon Institute for Biological Science (NIBS) miniature pigs as surrogates. Moreover, utilizing this technique, we have successfully produced an α-1, 3-galactosyltransferase knockout (GalT-KO) miniature swine. Fibroblasts procured from a NIBS miniature pig fetus were injected into 1312 enucleated oocytes. The cloned embryos were transferred to 11 surrogates of which five successfully delivered 13 cloned offspring; the production efficiency was 1.0% (13/1312). In a second experiment, lung fibroblasts obtained from neonatal GalT-KO MGH miniature swine were used as donor cells and 1953 cloned embryos were transferred to 12 surrogates. Six cloned offspring were born from five surrogates, a production efficiency of 0.3% (6/1953). Conclusions These results demonstrate successful establishment of a miniature pig cloning technique by SCNT using NIBS miniature pigs as surrogates. To our knowledge, this is the first demonstration of successful production of GalT-KO miniature swine using miniature swine surrogates. This technique could help to ensure a stable supply of the cloned pigs through the use of miniature pig surrogates and could expand production in countries with limited space or in facilities with special regulations such as specific pathogen-free or good laboratory practice. PMID:23581451

  16. Perinatal outcomes after natural conception versus in vitro fertilization (IVF) in gestational surrogates: a model to evaluate IVF treatment versus maternal effects.

    PubMed

    Woo, Irene; Hindoyan, Rita; Landay, Melanie; Ho, Jacqueline; Ingles, Sue Ann; McGinnis, Lynda K; Paulson, Richard J; Chung, Karine

    2017-12-01

    To study the perinatal outcomes between singleton live births achieved with the use of commissioned versus spontaneously conceived embryos carried by the same gestational surrogate. Retrospective cohort study. Academic in vitro fertilization center. Gestational surrogate. None. Pregnancy outcome, gestational age at birth, birth weight, perinatal complications. We identified 124 gestational surrogates who achieved a total of 494 pregnancies. Pregnancy outcomes for surrogate and spontaneous pregnancies were significantly different (P<.001), with surrogate pregnancies more likely to result in twin pregnancies: 33% vs. 1%. Miscarriage and ectopic rates were similar. Of these pregnancies, there were 352 singleton live births: 103 achieved from commissioned embryos and 249 conceived spontaneously. Surrogate births had a lower mean gestational age at delivery (38.8 ± 2.1 vs. 39.7 ± 1.4 weeks), higher rates of preterm birth (10.7% vs. 3.1%), and higher rates of low birth weight (7.8% vs. 2.4%). Neonates from surrogacy had birth weights that were, on average, 105 g lower. Surrogate births had significantly higher rates of obstetrical complications, including gestational diabetes, hypertension, use of amniocentesis, placenta previa, antibiotic requirement during labor, and cesarean section. Neonates born from commissioned embryos and carried by gestational surrogates have increased adverse perinatal outcomes, including preterm birth, low birth weight, hypertension, maternal gestational diabetes, and placenta previa, compared with singletons conceived spontaneously and carried by the same woman. Our data suggest that assisted reproductive procedures may potentially affect embryo quality and that this negative impact cannot be overcome even with a proven healthy uterine environment. Copyright © 2017 American Society for Reproductive Medicine. Published by Elsevier Inc. All rights reserved.

  17. Evidence of low dimensional chaos in renal blood flow control in genetic and experimental hypertension

    NASA Astrophysics Data System (ADS)

    Yip, K.-P.; Marsh, D. J.; Holstein-Rathlou, N.-H.

    1995-01-01

    We applied a surrogate data technique to test for nonlinear structure in spontaneous fluctuations of hydrostatic pressure in renal tubules of hypertensive rats. Tubular pressure oscillates at 0.03-0.05 Hz in animals with normal blood pressure, but the fluctuations become irregular with chronic hypertension. Using time series from rats with hypertension we produced surrogate data sets to test whether they represent linearly correlated noise or ‘static’ nonlinear transforms of a linear stochastic process. The correlation dimension and the forecasting error were used as discriminating statistics to compare surrogate with experimental data. The results show that the original experimental time series can be distinguished from both linearly and static nonlinearly correlated noise, indicating that the nonlinear behavior is due to the intrinsic dynamics of the system. Together with other evidence this strongly suggests that a low dimensional chaotic attractor governs renal hemodynamics in hypertension. This appears to be the first demonstration of a transition to chaotic dynamics in an integrated physiological control system occurring in association with a pathological condition.
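
    The surrogate-data test described above can be sketched in a few lines. The following Python snippet is an illustrative sketch, not the authors' code: it generates phase-randomized surrogates that preserve the power spectrum of a time series and compares a user-supplied discriminating statistic (e.g., a correlation-dimension estimate or nonlinear forecasting error) against the surrogate distribution. The direction of the rank comparison depends on whether smaller or larger values of the statistic indicate nonlinearity.

    ```python
    import numpy as np

    def phase_randomized_surrogate(x, rng):
        """Surrogate with the same power spectrum as x but randomized Fourier phases."""
        n = len(x)
        X = np.fft.rfft(x)
        phases = rng.uniform(0, 2 * np.pi, len(X))
        phases[0] = 0.0                      # keep the mean (DC component)
        if n % 2 == 0:
            phases[-1] = 0.0                 # keep the Nyquist component real
        return np.fft.irfft(np.abs(X) * np.exp(1j * phases), n=n)

    def surrogate_test(x, statistic, n_surr=99, seed=0):
        """Rank-based test of `statistic` against phase-randomized surrogates."""
        rng = np.random.default_rng(seed)
        s0 = statistic(x)
        null = [statistic(phase_randomized_surrogate(x, rng)) for _ in range(n_surr)]
        # p-value: fraction of surrogates with a statistic at least as large as the data's
        p = (1 + sum(s >= s0 for s in null)) / (n_surr + 1)
        return s0, p
    ```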

  18. Presence of nonlinearity in intracranial EEG recordings: detected by Lyapunov exponents

    NASA Astrophysics Data System (ADS)

    Liu, Chang-Chia; Shiau, Deng-Shan; Chaovalitwongse, W. Art; Pardalos, Panos M.; Sackellares, J. C.

    2007-11-01

    In this communication, we performed a nonlinearity analysis of EEG signals recorded from patients with temporal lobe epilepsy (TLE). The largest Lyapunov exponent (Lmax) and the phase-randomization surrogate data technique were employed to form the statistical test. EEG recordings were acquired invasively from three patients in six brain regions (left and right temporal depth, sub-temporal, and orbitofrontal) using 28-32 depth and subdural electrodes. All three patients in this study had a unilateral epileptic focus in the right hippocampus (RH). Nonlinearity was detected by comparing the Lmax profiles of the EEG recordings to those of their surrogates. Nonlinearity was seen in all patient states, with the strongest evidence found in the post-ictal state. Furthermore, for all patients the differences in Lmax between the original signals and their surrogates, quantified by a paired t-test, were greatest for EEG signals recorded from the epileptic focus regions. The results of this study demonstrate that Lmax can capture spatio-temporal dynamics in intracranial EEG recordings that may not be detectable by linear measurements.

  19. 76 FR 2337 - Certain Cased Pencils From the People's Republic of China: Preliminary Results and Partial...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-01-13

    ... has autonomy from the government in making decisions regarding the selection of management; and (4... Financial Statistics. When relying on prices of imports into India as surrogate values, we have disregarded... the 2006-2007 financial statement of Triveni Pencils Ltd. (``Triveni''), an Indian producer of pencils...

  20. 76 FR 41207 - Tapered Roller Bearings and Parts Thereof, Finished or Unfinished, From the People's Republic of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-07-13

    ...) whether the respondent has autonomy from the government in making decisions regarding the selection of... respondent has autonomy from the government regarding the selection of management.\\28\\ \\28\\ See Sihe's SRA... published in the International Monetary Fund's International Financial Statistics.\\38\\ \\37\\ See Surrogate...

  1. An Analytical Evaluation of Two Common-Odds Ratios as Population Indicators of DIF.

    ERIC Educational Resources Information Center

    Pommerich, Mary; And Others

    The Mantel-Haenszel (MH) statistic for identifying differential item functioning (DIF) commonly conditions on the observed test score as a surrogate for conditioning on latent ability. When the comparison group distributions are not completely overlapping (i.e., are incongruent), the observed score represents different levels of latent ability…
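
    For readers unfamiliar with the statistic, the MH common odds ratio pools group-by-response 2x2 tables across observed-score strata. The sketch below uses hypothetical counts, not data from the study, and also reports the ETS delta scale commonly used to flag DIF.

    ```python
    import numpy as np

    def mantel_haenszel_or(tables):
        """Mantel-Haenszel common odds ratio across score strata.

        `tables` is an iterable of 2x2 arrays [[a, b], [c, d]] where rows are
        reference/focal group and columns are correct/incorrect responses.
        """
        num = den = 0.0
        for t in tables:
            (a, b), (c, d) = np.asarray(t, dtype=float)
            n = a + b + c + d
            num += a * d / n
            den += b * c / n
        return num / den

    # Illustrative strata (hypothetical counts), one table per observed score level
    strata = [[[30, 10], [25, 15]], [[40, 5], [35, 10]], [[20, 20], [15, 25]]]
    alpha_mh = mantel_haenszel_or(strata)
    delta_mh = -2.35 * np.log(alpha_mh)   # ETS delta scale often reported for DIF
    print(round(alpha_mh, 3), round(delta_mh, 3))
    ```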

  2. Partnership for Edge Physics (EPSI), University of Texas Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moser, Robert; Carey, Varis; Michoski, Craig

    Simulations of tokamak plasmas require a number of inputs whose values are uncertain. The effects of these input uncertainties on the reliability of model predictions are of great importance when validating predictions by comparison to experimental observations, and when using the predictions for design and operation of devices. However, high-fidelity simulations of tokamak plasmas, particularly those aimed at characterization of the edge plasma physics, are computationally expensive, so lower-cost surrogates are required to enable practical uncertainty estimates. Two surrogate modeling techniques have been explored in the context of tokamak plasma simulations using the XGC family of plasma simulation codes. The first is a response surface surrogate, and the second is an augmented surrogate relying on scenario extrapolation. In addition, to reduce the costs of the XGC simulations, a particle resampling algorithm was developed, which allows marker particle distributions to be adjusted to maintain optimal importance sampling. This means that the total number of particles in a simulation, and therefore its cost, can be reduced while maintaining the same accuracy.
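
    The particle-resampling idea can be illustrated with a generic systematic-resampling routine. The actual algorithm used with XGC is not reproduced here; the snippet below is only a minimal, hedged analogue showing how marker weights can be equalized while preserving the represented distribution.

    ```python
    import numpy as np

    def systematic_resample(weights, seed=0):
        """Systematic resampling: return indices of particles to keep or duplicate."""
        rng = np.random.default_rng(seed)
        w = np.asarray(weights, dtype=float)
        w = w / w.sum()
        n = len(w)
        positions = (rng.uniform() + np.arange(n)) / n   # one stratified point per particle
        return np.searchsorted(np.cumsum(w), positions)

    # Heavily weighted markers are duplicated and negligible ones dropped; afterwards
    # all particles carry equal weight, which restores importance-sampling efficiency.
    weights = [0.50, 0.05, 0.05, 0.10, 0.30]
    idx = systematic_resample(weights)
    print(idx)   # indices into the original marker-particle array
    ```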

  3. Inactivation modeling of human enteric virus surrogates, MS2, Qβ, and ΦX174, in water using UVC-LEDs, a novel disinfecting system.

    PubMed

    Kim, Do-Kyun; Kim, Soo-Ji; Kang, Dong-Hyun

    2017-01-01

    In order to assure the microbial safety of drinking water, UVC-LED treatment has emerged as a possible technology to replace the use of conventional low pressure (LP) mercury vapor UV lamps. In this investigation, inactivation of Human Enteric Virus (HuEV) surrogates with UVC-LEDs was investigated in a water disinfection system, and kinetic model equations were applied to depict the surviving infectivities of the viruses. MS2, Qβ, and ΦX174 bacteriophages were inoculated into sterile distilled water (DW) and irradiated with UVC-LED printed circuit boards (PCBs) (266 nm and 279 nm) or conventional LP lamps. Infectivities of bacteriophages were effectively reduced by up to 7 log after 9 mJ/cm² treatment for MS2 and Qβ, and 1 mJ/cm² for ΦX174. UVC-LEDs showed a superior viral inactivation effect compared to conventional LP lamps at the same dose (1 mJ/cm²). Non-log-linear plot patterns were observed, so Weibull, Biphasic, Log linear-tail, and Weibull-tail model equations were used to fit the virus survival curves. For MS2 and Qβ, the Weibull and Biphasic models fit well with R² values of approximately 0.97-0.99, and the Weibull-tail equation accurately described survival of ΦX174. The level of UV susceptibility among coliphages, measured by the inactivation rate constant k, was statistically different (ΦX174 (ssDNA) > MS2, Qβ (ssRNA)), indicating that sensitivity to UV was attributable to the viral genetic material. Copyright © 2016 Elsevier Ltd. All rights reserved.
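
    As a minimal illustration of the curve fitting involved, the sketch below fits the Weibull survival model log10(N/N0) = -(dose/delta)^p to a hypothetical UV dose-response data set (the paper's measurements are not reproduced here) using scipy.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def weibull_log_reduction(dose, delta, p):
        """Weibull survival model: log10(N/N0) = -(dose / delta) ** p."""
        return -((dose / delta) ** p)

    # Hypothetical dose-response data (mJ/cm^2, log10 survival ratio), not from the paper
    dose = np.array([0.0, 1.0, 2.0, 4.0, 6.0, 9.0])
    logS = np.array([0.0, -0.8, -1.9, -3.6, -5.4, -7.0])

    (delta, p), _ = curve_fit(weibull_log_reduction, dose, logS, p0=[1.0, 1.0])
    pred = weibull_log_reduction(dose, delta, p)
    r2 = 1 - np.sum((logS - pred) ** 2) / np.sum((logS - logS.mean()) ** 2)
    print(f"delta={delta:.2f} mJ/cm^2, p={p:.2f}, R^2={r2:.3f}")
    ```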

  4. Final Report, DOE Early Career Award: Predictive modeling of complex physical systems: new tools for statistical inference, uncertainty quantification, and experimental design

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Marzouk, Youssef

    Predictive simulation of complex physical systems increasingly rests on the interplay of experimental observations with computational models. Key inputs, parameters, or structural aspects of models may be incomplete or unknown, and must be developed from indirect and limited observations. At the same time, quantified uncertainties are needed to qualify computational predictions in the support of design and decision-making. In this context, Bayesian statistics provides a foundation for inference from noisy and limited data, but at prohibitive computational expense. This project intends to make rigorous predictive modeling *feasible* in complex physical systems, via accelerated and scalable tools for uncertainty quantification, Bayesian inference, and experimental design. Specific objectives are as follows: 1. Develop adaptive posterior approximations and dimensionality reduction approaches for Bayesian inference in high-dimensional nonlinear systems. 2. Extend accelerated Bayesian methodologies to large-scale sequential data assimilation, fully treating nonlinear models and non-Gaussian state and parameter distributions. 3. Devise efficient surrogate-based methods for Bayesian model selection and the learning of model structure. 4. Develop scalable simulation/optimization approaches to nonlinear Bayesian experimental design, for both parameter inference and model selection. 5. Demonstrate these inferential tools on chemical kinetic models in reacting flow, constructing and refining thermochemical and electrochemical models from limited data. Demonstrate Bayesian filtering on canonical stochastic PDEs and in the dynamic estimation of inhomogeneous subsurface properties and flow fields.

  5. Numerical modeling anti-personnel blast mines coupled to a deformable leg structure

    NASA Astrophysics Data System (ADS)

    Cronin, Duane; Worswick, Mike; Williams, Kevin; Bourget, Daniel; Pageau, Gilles

    2001-06-01

    The development of improved landmine protective footwear requires an understanding of the physics and damage mechanisms associated with a close-proximity blast event. Numerical models have been developed for surrogate mines buried in soil, using the Arbitrary Lagrangian Eulerian (ALE) technique to model the explosive and surrounding air while the soil is modeled as a deformable Lagrangian solid. The advantage of the ALE approach is the ability to model large deformations, such as the expanding gases of a high explosive. This model has been validated using the available experimental data [1]. The effect of varying depth of burial and soil conditions has been investigated with these numerical models and compares favorably to data in the literature. The surrogate landmine model has been coupled to a numerical model of a Simplified Lower Leg (SLL), which is designed to mimic the response and failure mechanisms of a human leg. The SLL consists of a bone and tissue simulant arranged as concentric cylinders. A new strain-rate-dependent hyperelastic material model has been developed to model the response of the tissue simulant, ballistic gelatin. The polymeric bone simulant material has been characterized and implemented as a strain-rate-dependent material in the numerical model. The numerical model results agree with the measured response of the SLL during experimental blast tests [2] and are used to explain the experimental data. These models predict that, for a surface or sub-surface buried anti-personnel mine, the coupling between the mine and SLL is an important effect. In addition, the soil properties have a significant effect on the load transmitted to the leg. [1] Bergeron, D., Walker, R. and Coffey, C., 1998, “Detonation of 100-Gram Anti-Personnel Mine Surrogate Charges in Sand”, Report number SR 668, Defence Research Establishment Suffield, Canada. [2] Bourget, D., Williams, K., Pageau, G., and Cronin, D., “AP Mine Blast Effects on Surrogate Lower Leg”, Military Aspects of Ballistics and Shock, MABS 16, 2000.

  6. An efficient adaptive sampling strategy for global surrogate modeling with applications in multiphase flow simulation

    NASA Astrophysics Data System (ADS)

    Mo, S.; Lu, D.; Shi, X.; Zhang, G.; Ye, M.; Wu, J.

    2016-12-01

    Surrogate models have shown remarkable computational efficiency in hydrological simulations involving design space exploration, sensitivity analysis, uncertainty quantification, etc. The central task of constructing a global surrogate model is to achieve a prescribed approximation accuracy with as few original model executions as possible, which requires a good design strategy to optimize the distribution of data points in the parameter domains and an effective stopping criterion to automatically terminate the design process when the desired approximation accuracy is achieved. This study proposes a novel adaptive sampling strategy, which starts from a small number of initial samples and adaptively selects additional samples by balancing collection in unexplored regions with refinement in interesting areas. We define an efficient and effective evaluation metric based on Taylor expansion to select the most promising samples from candidate points, and propose a robust stopping criterion based on the approximation accuracy at new points to guarantee that the desired accuracy is achieved. The numerical results for several benchmark analytical functions indicate that the proposed approach is more computationally efficient and robust than the widely used maximin distance design and two other well-known adaptive sampling strategies. The application to two complicated multiphase flow problems further demonstrates the efficiency and effectiveness of our method in constructing global surrogate models for high-dimensional and highly nonlinear problems. Acknowledgements: This work was financially supported by the National Natural Science Foundation of China grants No. 41030746 and 41172206.

  7. A Novel Fast Helical 4D-CT Acquisition Technique to Generate Low-Noise Sorting Artifact–Free Images at User-Selected Breathing Phases

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thomas, David, E-mail: dhthomas@mednet.ucla.edu; Lamb, James; White, Benjamin

    2014-05-01

    Purpose: To develop a novel 4-dimensional computed tomography (4D-CT) technique that exploits standard fast helical acquisition, a simultaneous breathing surrogate measurement, deformable image registration, and a breathing motion model to remove sorting artifacts. Methods and Materials: Ten patients were imaged under free-breathing conditions 25 successive times in alternating directions with a 64-slice CT scanner using a low-dose fast helical protocol. An abdominal bellows was used as a breathing surrogate. Deformable registration was used to register the first image (defined as the reference image) to the subsequent 24 segmented images. Voxel-specific motion model parameters were determined using a breathing motion model. The tissue locations predicted by the motion model in the 25 images were compared against the deformably registered tissue locations, allowing a model prediction error to be evaluated. A low-noise image was created by averaging the 25 images deformed to the first image geometry, reducing statistical image noise by a factor of 5. The motion model was used to deform the low-noise reference image to any user-selected breathing phase. A voxel-specific correction was applied to correct the Hounsfield units for lung parenchyma density as a function of lung air filling. Results: Images produced using the model at user-selected breathing phases did not suffer from sorting artifacts common to conventional 4D-CT protocols. The mean prediction error across all patients between the breathing motion model predictions and the measured lung tissue positions was determined to be 1.19 ± 0.37 mm. Conclusions: The proposed technique can be used as a clinical 4D-CT technique. It is robust in the presence of irregular breathing and allows the entire imaging dose to contribute to the resulting image quality, providing sorting artifact–free images at a patient dose similar to or less than current 4D-CT techniques.

  8. A novel fast helical 4D-CT acquisition technique to generate low-noise sorting artifact-free images at user-selected breathing phases.

    PubMed

    Thomas, David; Lamb, James; White, Benjamin; Jani, Shyam; Gaudio, Sergio; Lee, Percy; Ruan, Dan; McNitt-Gray, Michael; Low, Daniel

    2014-05-01

    To develop a novel 4-dimensional computed tomography (4D-CT) technique that exploits standard fast helical acquisition, a simultaneous breathing surrogate measurement, deformable image registration, and a breathing motion model to remove sorting artifacts. Ten patients were imaged under free-breathing conditions 25 successive times in alternating directions with a 64-slice CT scanner using a low-dose fast helical protocol. An abdominal bellows was used as a breathing surrogate. Deformable registration was used to register the first image (defined as the reference image) to the subsequent 24 segmented images. Voxel-specific motion model parameters were determined using a breathing motion model. The tissue locations predicted by the motion model in the 25 images were compared against the deformably registered tissue locations, allowing a model prediction error to be evaluated. A low-noise image was created by averaging the 25 images deformed to the first image geometry, reducing statistical image noise by a factor of 5. The motion model was used to deform the low-noise reference image to any user-selected breathing phase. A voxel-specific correction was applied to correct the Hounsfield units for lung parenchyma density as a function of lung air filling. Images produced using the model at user-selected breathing phases did not suffer from sorting artifacts common to conventional 4D-CT protocols. The mean prediction error across all patients between the breathing motion model predictions and the measured lung tissue positions was determined to be 1.19 ± 0.37 mm. The proposed technique can be used as a clinical 4D-CT technique. It is robust in the presence of irregular breathing and allows the entire imaging dose to contribute to the resulting image quality, providing sorting artifact-free images at a patient dose similar to or less than current 4D-CT techniques. Copyright © 2014 Elsevier Inc. All rights reserved.
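
    A highly simplified sketch of the per-voxel breathing motion model is given below. It assumes the common amplitude-plus-rate form x(t) = x_ref + a*s(t) + b*ds/dt fitted by least squares; the deformable registration and Hounsfield-unit correction steps of the actual workflow are omitted, and all variable names are illustrative.

    ```python
    import numpy as np

    def fit_motion_model(surrogate, flow, displacements):
        """Fit x(t) = x_ref + a * s(t) + b * ds/dt per voxel by least squares.

        surrogate, flow : (T,) breathing amplitude (e.g. bellows) and its time derivative
        displacements   : (T, V) registered voxel positions over the T images
        Returns per-voxel (x_ref, a, b) as a (V, 3) array.
        """
        T = len(surrogate)
        A = np.column_stack([np.ones(T), surrogate, flow])     # (T, 3) design matrix
        params, *_ = np.linalg.lstsq(A, displacements, rcond=None)
        return params.T                                        # (V, 3)

    def predict_positions(params, s, ds):
        """Predict voxel positions at a user-selected breathing phase (s, ds)."""
        return params @ np.array([1.0, s, ds])
    ```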

  9. Active Learning to Understand Infectious Disease Models and Improve Policy Making

    PubMed Central

    Vladislavleva, Ekaterina; Broeckhove, Jan; Beutels, Philippe; Hens, Niel

    2014-01-01

    Modeling plays a major role in policy making, especially for infectious disease interventions, but such models can be complex and computationally intensive. A more systematic exploration is needed to gain a thorough systems understanding. We present an active learning approach based on machine learning techniques, namely iterative surrogate modeling and model-guided experimentation, to systematically analyze both common and edge manifestations of complex model runs. Symbolic regression is used for nonlinear response surface modeling with automatic feature selection. First, we illustrate our approach using an individual-based model for influenza vaccination. After optimizing the parameter space, we observe an inverse relationship between vaccination coverage and cumulative attack rate reinforced by herd immunity. Second, we demonstrate the use of surrogate modeling techniques on input-response data from a deterministic dynamic model, which was designed to explore the cost-effectiveness of varicella-zoster virus vaccination. We use symbolic regression to handle high dimensionality and correlated inputs and to identify the most influential variables. The insight provided is used to focus research, reduce dimensionality, and decrease decision uncertainty. We conclude that active learning is needed to fully understand complex systems behavior. Surrogate models can be readily explored at no computational expense, and can also be used as emulators to improve rapid policy making in various settings. PMID:24743387

  10. Active learning to understand infectious disease models and improve policy making.

    PubMed

    Willem, Lander; Stijven, Sean; Vladislavleva, Ekaterina; Broeckhove, Jan; Beutels, Philippe; Hens, Niel

    2014-04-01

    Modeling plays a major role in policy making, especially for infectious disease interventions, but such models can be complex and computationally intensive. A more systematic exploration is needed to gain a thorough systems understanding. We present an active learning approach based on machine learning techniques, namely iterative surrogate modeling and model-guided experimentation, to systematically analyze both common and edge manifestations of complex model runs. Symbolic regression is used for nonlinear response surface modeling with automatic feature selection. First, we illustrate our approach using an individual-based model for influenza vaccination. After optimizing the parameter space, we observe an inverse relationship between vaccination coverage and cumulative attack rate reinforced by herd immunity. Second, we demonstrate the use of surrogate modeling techniques on input-response data from a deterministic dynamic model, which was designed to explore the cost-effectiveness of varicella-zoster virus vaccination. We use symbolic regression to handle high dimensionality and correlated inputs and to identify the most influential variables. The insight provided is used to focus research, reduce dimensionality, and decrease decision uncertainty. We conclude that active learning is needed to fully understand complex systems behavior. Surrogate models can be readily explored at no computational expense, and can also be used as emulators to improve rapid policy making in various settings.
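
    Symbolic regression engines are typically specialized tools; as a hedged stand-in that captures the same idea of automatic selection over candidate nonlinear terms, the sketch below fits a sparse polynomial response surface with LASSO-based term selection. The feature names and data are hypothetical, not from the study.

    ```python
    import numpy as np
    from sklearn.linear_model import LassoCV
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import PolynomialFeatures, StandardScaler

    # Hypothetical input-response data from batched model runs; columns might be
    # vaccination coverage, contact rate, etc., and y a summary output.
    rng = np.random.default_rng(1)
    X = rng.uniform(0, 1, size=(200, 4))
    y = 1.5 * X[:, 0] - 2.0 * X[:, 0] * X[:, 1] + 0.3 * X[:, 2] ** 2 + rng.normal(0, 0.05, 200)

    surrogate = make_pipeline(
        PolynomialFeatures(degree=2, include_bias=False),  # candidate nonlinear terms
        StandardScaler(),
        LassoCV(cv=5),                                     # sparse selection of terms
    )
    surrogate.fit(X, y)

    # Surviving terms indicate the most influential variables and interactions.
    names = surrogate[0].get_feature_names_out(["cov", "beta", "x2", "x3"])
    coefs = surrogate[-1].coef_
    print([(n, round(c, 3)) for n, c in zip(names, coefs) if abs(c) > 1e-3])
    ```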

  11. Shape Optimization by Bayesian-Validated Computer-Simulation Surrogates

    NASA Technical Reports Server (NTRS)

    Patera, Anthony T.

    1997-01-01

    A nonparametric-validated, surrogate approach to optimization has been applied to the computational optimization of eddy-promoter heat exchangers and to the experimental optimization of a multielement airfoil. In addition to the baseline surrogate framework, a surrogate-Pareto framework has been applied to the two-criteria, eddy-promoter design problem. The Pareto analysis improves the predictability of the surrogate results, preserves generality, and provides a means to rapidly determine design trade-offs. Significant contributions have been made in the geometric description used for the eddy-promoter inclusions as well as to the surrogate framework itself. A level-set based, geometric description has been developed to define the shape of the eddy-promoter inclusions. The level-set technique allows for topology changes (from single-body, eddy-promoter configurations to two-body configurations) without requiring any additional logic. The continuity of the output responses for input variations that cross the boundary between topologies has been demonstrated. Input-output continuity is required for the straightforward application of surrogate techniques in which simplified, interpolative models are fitted through a construction set of data. The surrogate framework developed previously has been extended in a number of ways. First, the formulation for a general, two-output, two-performance-metric problem is presented. Surrogates are constructed and validated for the outputs. The performance metrics can be functions of both outputs, as well as explicitly of the inputs, and serve to characterize the design preferences. By segregating the outputs and the performance metrics, an additional level of flexibility is provided to the designer. The validated outputs can be used in future design studies, and the error estimates provided by the output validation step still apply and require no additional appeals to the expensive analysis. Second, a candidate-based a posteriori error analysis capability has been developed which provides probabilistic error estimates on the true performance for a design randomly selected near the surrogate-predicted optimal design.

  12. The Impact of Parametric Uncertainties on Biogeochemistry in the E3SM Land Model

    NASA Astrophysics Data System (ADS)

    Ricciuto, Daniel; Sargsyan, Khachik; Thornton, Peter

    2018-02-01

    We conduct a global sensitivity analysis (GSA) of the Energy Exascale Earth System Model (E3SM) land model (ELM) to calculate the sensitivity of five key carbon cycle outputs to 68 model parameters. This GSA is conducted by first constructing a Polynomial Chaos (PC) surrogate via a new Weighted Iterative Bayesian Compressive Sensing (WIBCS) algorithm for adaptive basis growth, leading to a sparse, high-dimensional PC surrogate built from 3,000 model evaluations. The PC surrogate allows efficient extraction of GSA information, leading to further dimensionality reduction. The GSA is performed at 96 FLUXNET sites covering multiple plant functional types (PFTs) and climate conditions. About 20 of the model parameters are identified as sensitive, with the rest being relatively insensitive across all outputs and PFTs. These sensitivities are dependent on PFT and are relatively consistent among sites within the same PFT. The five model outputs have a majority of their highly sensitive parameters in common. A common subset of sensitive parameters is also shared among PFTs, but some parameters are specific to certain types (e.g., deciduous phenology). The relative importance of these parameters shifts significantly among PFTs and with climatic variables such as mean annual temperature.
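
    Once any inexpensive surrogate is in hand, variance-based sensitivities can be estimated directly by sampling it. The sketch below is illustrative only and does not reproduce the WIBCS/PC machinery of the paper: it uses the standard pick-and-freeze (Saltelli-style) estimator of first-order Sobol indices on a toy stand-in surrogate.

    ```python
    import numpy as np

    def first_order_sobol(f, bounds, n=4096, seed=0):
        """Estimate first-order Sobol indices of f over a hyper-rectangle.

        Uses S_i = mean(f(B) * (f(AB_i) - f(A))) / Var(f), where AB_i equals B
        except that column i is taken from A (pick-and-freeze estimator).
        """
        rng = np.random.default_rng(seed)
        lo, hi = np.asarray(bounds, dtype=float).T
        d = len(lo)
        A = rng.uniform(lo, hi, size=(n, d))
        B = rng.uniform(lo, hi, size=(n, d))
        fA, fB = f(A), f(B)
        var = np.var(np.concatenate([fA, fB]))
        S = np.empty(d)
        for i in range(d):
            ABi = B.copy()
            ABi[:, i] = A[:, i]
            S[i] = np.mean(fB * (f(ABi) - fA)) / var
        return S

    # Toy function standing in for the PC emulator (hypothetical):
    f = lambda X: X[:, 0] + 0.5 * X[:, 1] ** 2 + 0.1 * X[:, 2]
    print(first_order_sobol(f, bounds=[(0, 1)] * 3).round(3))
    ```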

  13. Real-time tumor motion estimation using respiratory surrogate via memory-based learning

    NASA Astrophysics Data System (ADS)

    Li, Ruijiang; Lewis, John H.; Berbeco, Ross I.; Xing, Lei

    2012-08-01

    Respiratory tumor motion is a major challenge in radiation therapy for thoracic and abdominal cancers. Effective motion management requires an accurate knowledge of the real-time tumor motion. External respiration monitoring devices (optical, etc.) provide a noninvasive, non-ionizing, low-cost and practical approach to obtain the respiratory signal. Due to the highly complex and nonlinear relations between tumor and surrogate motion, their ultimate success hinges on the ability to accurately infer the tumor motion from respiratory surrogates. Given their widespread use in the clinic, such a method is critically needed. We propose to use a powerful memory-based learning method to find the complex relations between tumor motion and respiratory surrogates. The method first stores the training data in memory and then finds relevant data to answer a particular query. Nearby data points are assigned high relevance (or weights) and, conversely, distant data are assigned low relevance. By fitting relatively simple models to local patches instead of fitting one single global model, it is able to capture highly nonlinear and complex relations between the internal tumor motion and external surrogates accurately. Due to the local nature of the weighting functions, the method is inherently robust to outliers in the training data. Moreover, both training and adapting to new data are performed almost instantaneously with memory-based learning, making it suitable for dynamically following variable internal/external relations. We evaluated the method using respiratory motion data from 11 patients. The data set consists of simultaneous measurements of 3D tumor motion and 1D abdominal surface displacement (used as the surrogate signal in this study). There are a total of 171 respiratory traces, with an average peak-to-peak amplitude of ∼15 mm and average duration of ∼115 s per trace. Given only 5 s (roughly one breath) of pretreatment training data, the method achieved an average 3D error of 1.5 mm and a 95th percentile error of 3.4 mm on unseen test data. The average 3D error was further reduced to 1.4 mm when the model was tuned to its optimal setting for each respiratory trace. In one trace where a few outliers are present in the training data, the proposed method achieved an error reduction of as much as ∼50% compared with the best linear model (1.0 mm versus 2.1 mm). The memory-based learning technique is able to accurately capture the highly complex and nonlinear relations between tumor and surrogate motion in an efficient manner (a few milliseconds per estimate). Furthermore, the algorithm is particularly suitable for situations where the training data are contaminated by large errors or outliers. These desirable properties make it an ideal candidate for accurate and robust tumor gating/tracking using respiratory surrogates.
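
    The core of memory-based learning is locally weighted regression: training pairs near the query receive high weight and a simple local model is fitted on the fly. The following sketch assumes a surrogate feature vector (e.g., abdominal surface position and its velocity) as input and tumor position as output; the parameter names and Gaussian kernel choice are illustrative, not taken from the paper.

    ```python
    import numpy as np

    def locally_weighted_predict(query, X_train, y_train, bandwidth=0.1, ridge=1e-6):
        """Memory-based (locally weighted) regression for a single query.

        X_train : (N, d) surrogate features, e.g. surface position and velocity
        y_train : (N,) or (N, 3) internal tumor positions
        Nearby samples get large Gaussian weights, so a simple local linear model can
        track highly nonlinear internal/external relations and ignore distant outliers.
        """
        query = np.atleast_1d(query).astype(float)
        X = np.column_stack([np.ones(len(X_train)), X_train])       # local linear design
        q = np.concatenate([[1.0], query])
        w = np.exp(-np.sum((X_train - query) ** 2, axis=1) / (2 * bandwidth ** 2))
        XtW = X.T * w                                                # weight each sample
        beta = np.linalg.solve(XtW @ X + ridge * np.eye(X.shape[1]), XtW @ y_train)
        return q @ beta
    ```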

  14. Optimizing water resources management in large river basins with integrated surface water-groundwater modeling: A surrogate-based approach

    NASA Astrophysics Data System (ADS)

    Wu, Bin; Zheng, Yi; Wu, Xin; Tian, Yong; Han, Feng; Liu, Jie; Zheng, Chunmiao

    2015-04-01

    Integrated surface water-groundwater modeling can provide a comprehensive and coherent understanding on basin-scale water cycle, but its high computational cost has impeded its application in real-world management. This study developed a new surrogate-based approach, SOIM (Surrogate-based Optimization for Integrated surface water-groundwater Modeling), to incorporate the integrated modeling into water management optimization. Its applicability and advantages were evaluated and validated through an optimization research on the conjunctive use of surface water (SW) and groundwater (GW) for irrigation in a semiarid region in northwest China. GSFLOW, an integrated SW-GW model developed by USGS, was employed. The study results show that, due to the strong and complicated SW-GW interactions, basin-scale water saving could be achieved by spatially optimizing the ratios of groundwater use in different irrigation districts. The water-saving potential essentially stems from the reduction of nonbeneficial evapotranspiration from the aqueduct system and shallow groundwater, and its magnitude largely depends on both water management schemes and hydrological conditions. Important implications for water resources management in general include: first, environmental flow regulation needs to take into account interannual variation of hydrological conditions, as well as spatial complexity of SW-GW interactions; and second, to resolve water use conflicts between upper stream and lower stream, a system approach is highly desired to reflect ecological, economic, and social concerns in water management decisions. Overall, this study highlights that surrogate-based approaches like SOIM represent a promising solution to filling the gap between complex environmental modeling and real-world management decision-making.

  15. An evaluation of single question delirium screening tools in older emergency department patients.

    PubMed

    Han, Jin H; Wilson, Amanda; Schnelle, John F; Dittus, Robert S; Wesley Ely, E

    2018-07-01

    To determine the diagnostic performances of several single question delirium screens. To the patient, we asked: "Have you had any difficulty thinking clearly lately?" To the patient's surrogate, we asked: "Is the patient at his or her baseline mental status?" and "Have you noticed the patient's mental status fluctuate throughout the course of the day?" This was a prospective observational study that enrolled English-speaking patients 65 years or older. A research assistant (RA) and an emergency physician (EP) independently asked the patient and surrogate the single question delirium screens. The reference standard for delirium was a consultation-liaison psychiatrist's assessment using Diagnostic and Statistical Manual of Mental Disorders, Fourth Edition, Text Revision (DSM-IV-TR) criteria. All assessments were performed within 3 h of one another and were blinded to each other. Of the 406 patients enrolled, 50 (12%) were delirious. Inability to answer the question "Have you had any difficulty thinking clearly lately?" was 99.7% (95% CI: 98.0%-99.9%) specific, but only 24.0% (95% CI: 14.3%-37.4%) sensitive, for delirium when the question was asked by the RA. The baseline mental status surrogate question was 77.1% (95% CI: 61.0%-87.9%) sensitive and 87.5% (95% CI: 82.8%-91.1%) specific for delirium when asked by the RA. The fluctuating course surrogate question was 77.1% (95% CI: 61.0%-87.9%) sensitive and 80.2% (95% CI: 74.8%-84.7%) specific. When asked by the EP, the single question delirium screens' diagnostic performances were similar. The patient and surrogate single question delirium assessments may be useful for delirium screening in the ED. Copyright © 2018. Published by Elsevier Inc.
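
    The reported operating characteristics can be reproduced from 2x2 counts with standard formulas. The sketch below computes sensitivity, specificity, and Wilson score confidence intervals; the counts shown are hypothetical placeholders, since the abstract reports only percentages.

    ```python
    from math import sqrt

    def wilson_ci(k, n, z=1.96):
        """Wilson score 95% confidence interval for a proportion k/n."""
        p = k / n
        denom = 1 + z ** 2 / n
        centre = (p + z ** 2 / (2 * n)) / denom
        half = z * sqrt(p * (1 - p) / n + z ** 2 / (4 * n ** 2)) / denom
        return centre - half, centre + half

    def screen_performance(tp, fn, tn, fp):
        """Return (sensitivity, CI) and (specificity, CI) for a screening question."""
        sens, spec = tp / (tp + fn), tn / (tn + fp)
        return (sens, wilson_ci(tp, tp + fn)), (spec, wilson_ci(tn, tn + fp))

    # Hypothetical counts loosely consistent with a 12% delirium prevalence in 406
    # patients (illustrative only; the paper does not report the full 2x2 tables).
    print(screen_performance(tp=39, fn=11, tn=285, fp=71))
    ```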

  16. Methodology for Formulating Diesel Surrogate Fuels with Accurate Compositional, Ignition-Quality, and Volatility Characteristics

    DOE PAGES

    Mueller, Charles J.; Cannella, William J.; Bruno, Thomas J.; ...

    2012-05-22

    In this study, a novel approach was developed to formulate surrogate fuels having characteristics that are representative of diesel fuels produced from real-world refinery streams. Because diesel fuels typically consist of hundreds of compounds, it is difficult to conclusively determine the effects of fuel composition on combustion properties. Surrogate fuels, being simpler representations of these practical fuels, are of interest because they can provide a better understanding of fundamental fuel-composition and property effects on combustion and emissions-formation processes in internal-combustion engines. In addition, the application of surrogate fuels in numerical simulations with accurate vaporization, mixing, and combustion models could revolutionize future engine designs by enabling computational optimization for evolving real fuels. Dependable computational design would not only improve engine function, it would do so at significant cost savings relative to current optimization strategies that rely on physical testing of hardware prototypes. The approach in this study utilized the state-of-the-art techniques of ¹³C and ¹H nuclear magnetic resonance spectroscopy and the advanced distillation curve to characterize fuel composition and volatility, respectively. The ignition quality was quantified by the derived cetane number. Two well-characterized, ultra-low-sulfur #2 diesel reference fuels produced from refinery streams were used as target fuels: a 2007 emissions certification fuel and a Coordinating Research Council (CRC) Fuels for Advanced Combustion Engines (FACE) diesel fuel. A surrogate was created for each target fuel by blending eight pure compounds. The known carbon bond types within the pure compounds, as well as models for the ignition qualities and volatilities of their mixtures, were used in a multiproperty regression algorithm to determine optimal surrogate formulations. The predicted and measured surrogate-fuel properties were quantitatively compared to the measured target-fuel properties, and good agreement was found.
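
    Conceptually, surrogate formulation reduces to choosing blend fractions of palette compounds so that predicted mixture properties match the target fuel. The sketch below is a much simplified, hedged version of that idea: it assumes linear property blending (the paper uses dedicated ignition-quality and volatility models) and uses entirely hypothetical compound property values.

    ```python
    import numpy as np
    from scipy.optimize import minimize

    # Candidate palette compounds (rows) and properties (columns): derived cetane
    # number, aromatic carbon fraction, mid-boiling-point proxy in deg C.
    # All values are hypothetical placeholders, not measured data.
    props = np.array([
        [100.0, 0.00, 287.0],   # compound A: long n-alkane
        [ 60.0, 0.00, 250.0],   # compound B: iso-alkane
        [  8.0, 0.90, 245.0],   # compound C: aromatic
        [ 40.0, 0.40, 208.0],   # compound D: naphthoaromatic
    ])
    target = np.array([45.0, 0.25, 260.0])          # target-fuel properties
    scale = np.abs(target)                          # put properties on comparable scales

    def mismatch(x):
        blend = x @ props                           # simple linear-blending assumption
        return np.sum(((blend - target) / scale) ** 2)

    constraints = ({"type": "eq", "fun": lambda x: x.sum() - 1.0},)
    x0 = np.full(len(props), 1.0 / len(props))
    res = minimize(mismatch, x0, method="SLSQP", bounds=[(0.0, 1.0)] * len(props),
                   constraints=constraints)
    print("fractions:", res.x.round(3), "blend properties:", (res.x @ props).round(2))
    ```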

  17. Time-variant random interval natural frequency analysis of structures

    NASA Astrophysics Data System (ADS)

    Wu, Binhua; Wu, Di; Gao, Wei; Song, Chongmin

    2018-02-01

    This paper presents a new robust method, namely the unified interval Chebyshev-based random perturbation method, to tackle the hybrid random interval structural natural frequency problem. In the proposed approach, the random perturbation method is implemented to furnish the statistical features (i.e., mean and standard deviation), and a Chebyshev surrogate model strategy is incorporated to formulate the statistical information of the natural frequency with regard to the interval inputs. The comprehensive analysis framework combines the strengths of both methods in a way that dramatically reduces computational cost. The presented method is thus capable of accurately and efficiently investigating the day-to-day time-variant natural frequency of structures under the intrinsic creep effect of concrete, with both probabilistic and interval uncertain variables. The extreme bounds of the mean and standard deviation of the natural frequency are captured through the optimization strategy embedded within the analysis procedure. Three purposely chosen numerical examples, of progressively increasing complexity in both structure type and uncertainty variables, are presented to demonstrate the computational applicability, accuracy, and efficiency of the proposed method.

  18. A mechanistic spatio-temporal framework for modelling individual-to-individual transmission—With an application to the 2014-2015 West Africa Ebola outbreak

    PubMed Central

    McClelland, Amanda; Zelner, Jon; Streftaris, George; Funk, Sebastian; Metcalf, Jessica; Dalziel, Benjamin D.; Grenfell, Bryan T.

    2017-01-01

    In recent years there has been growing availability of individual-level spatio-temporal disease data, particularly due to the use of modern communicating devices with GPS tracking functionality. These detailed data have been proven useful for inferring disease transmission to a more refined level than previously. However, there remains a lack of statistically sound frameworks to model the underlying transmission dynamic in a mechanistic manner. Such a development is particularly crucial for enabling a general epidemic predictive framework at the individual level. In this paper we propose a new statistical framework for mechanistically modelling individual-to-individual disease transmission in a landscape with heterogeneous population density. Our methodology is first tested using simulated datasets, validating our inferential machinery. The methodology is subsequently applied to data that describes a regional Ebola outbreak in Western Africa (2014-2015). Our results show that the methods are able to obtain estimates of key epidemiological parameters that are broadly consistent with the literature, while revealing a significantly shorter distance of transmission. More importantly, in contrast to existing approaches, we are able to perform a more general model prediction that takes into account the susceptible population. Finally, our results show that, given reasonable scenarios, the framework can be an effective surrogate for susceptible-explicit individual models which are often computationally challenging. PMID:29084216

  19. A mechanistic spatio-temporal framework for modelling individual-to-individual transmission-With an application to the 2014-2015 West Africa Ebola outbreak.

    PubMed

    Lau, Max S Y; Gibson, Gavin J; Adrakey, Hola; McClelland, Amanda; Riley, Steven; Zelner, Jon; Streftaris, George; Funk, Sebastian; Metcalf, Jessica; Dalziel, Benjamin D; Grenfell, Bryan T

    2017-10-01

    In recent years there has been growing availability of individual-level spatio-temporal disease data, particularly due to the use of modern communicating devices with GPS tracking functionality. These detailed data have been proven useful for inferring disease transmission to a more refined level than previously. However, there remains a lack of statistically sound frameworks to model the underlying transmission dynamic in a mechanistic manner. Such a development is particularly crucial for enabling a general epidemic predictive framework at the individual level. In this paper we propose a new statistical framework for mechanistically modelling individual-to-individual disease transmission in a landscape with heterogeneous population density. Our methodology is first tested using simulated datasets, validating our inferential machinery. The methodology is subsequently applied to data that describes a regional Ebola outbreak in Western Africa (2014-2015). Our results show that the methods are able to obtain estimates of key epidemiological parameters that are broadly consistent with the literature, while revealing a significantly shorter distance of transmission. More importantly, in contrast to existing approaches, we are able to perform a more general model prediction that takes into account the susceptible population. Finally, our results show that, given reasonable scenarios, the framework can be an effective surrogate for susceptible-explicit individual models which are often computationally challenging.

  20. Evaluation of Mass Filtered, Time Dilated, Time-of-Flight Mass Spectrometry

    DTIC Science & Technology

    2010-01-01

    Figure captions excerpted from the report: Figure 4.4, mass resolution dependence on field for selected actinides and surrogates; Figure 4.7, mass resolution dependence on field for selected actinides and actinide surrogates, modeled with no initial... A somewhat better mass resolution would need to be achieved in order to separate hydride molecules in the actinide region.

  1. School-based clinics: their role in helping students meet the 1990 objectives.

    PubMed

    Dryfoos, J G; Klerman, L V

    1988-01-01

    Service statistics and observations from site visits across the country indicate that school-based clinics (SBCs) may be having an impact on several of the problems targeted in the 1990 health objectives, including unplanned pregnancy and substance abuse. At least 120 junior and senior high schools in 61 communities are currently operating or developing clinics. Growth is attributed to increasing concern about high-risk youth, especially among educators in their roles of "surrogate parents"; to disillusion with categorical interventions and a movement toward more comprehensive services; and to student, parent, school, and community approval of the new programs. This article describes the comprehensive school-based clinic model, including its history, organizational strategies, school/community partnerships, and services.

  2. Comparing Thermal Process Validation Methods for Salmonella Inactivation on Almond Kernels.

    PubMed

    Jeong, Sanghyup; Marks, Bradley P; James, Michael K

    2017-01-01

    Ongoing regulatory changes are increasing the need for reliable process validation methods for pathogen reduction processes involving low-moisture products; however, the reliability of various validation methods has not been evaluated. Therefore, the objective was to quantify the accuracy and repeatability of four validation methods (two biologically based and two based on time-temperature models) for thermal pasteurization of almonds. Almond kernels were inoculated with Salmonella Enteritidis phage type 30 or Enterococcus faecium (NRRL B-2354) at ~10⁸ CFU/g, equilibrated to 0.24, 0.45, 0.58, or 0.78 water activity (aw), and then heated in a pilot-scale, moist-air impingement oven (dry bulb 121, 149, or 177°C; dew point <33.0, 69.4, 81.6, or 90.6°C; air velocity 2.7 m/s) to a target lethality of ~4 log. Almond surface temperatures were measured in two ways, and those temperatures were used to calculate Salmonella inactivation using a traditional (D, z) model and a modified model accounting for process humidity. Among the process validation methods, both methods based on time-temperature models had better repeatability, with replication errors approximately half those of the surrogate (E. faecium). Additionally, the modified model yielded the lowest root mean squared error in predicting Salmonella inactivation (1.1 to 1.5 log CFU/g); in contrast, E. faecium yielded a root mean squared error of 1.2 to 1.6 log CFU/g, and the traditional model yielded an unacceptably high error (3.4 to 4.4 log CFU/g). Importantly, the surrogate and the modified model both yielded lethality predictions that were statistically equivalent (α = 0.05) to actual Salmonella lethality. The results demonstrate the importance of methodology, aw, and process humidity when validating thermal pasteurization processes for low-moisture foods, which should help processors select and interpret validation methods to ensure product safety.
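
    The traditional (D, z) prediction referenced above integrates a temperature-dependent first-order inactivation rate over the measured time-temperature history. The sketch below shows that calculation with hypothetical parameter values; the paper's modified, humidity-dependent model is not reproduced here.

    ```python
    import numpy as np

    def log_reduction_dz(t, T, D_ref, z, T_ref):
        """Traditional first-order thermal inactivation model.

        Integrates dL/dt = 10 ** ((T(t) - T_ref) / z) / D_ref over a measured
        time-temperature history (t in min, T in deg C, D_ref in min).
        Note: this form ignores process humidity; the modified model in the study
        adds a humidity-dependent correction, which is omitted here.
        """
        rate = 10.0 ** ((np.asarray(T) - T_ref) / z) / D_ref
        return np.trapz(rate, t)

    # Hypothetical surface-temperature trace and Salmonella parameters (illustrative only)
    t = np.linspace(0, 10, 101)                      # min
    T = 25 + 75 * (1 - np.exp(-t / 2.5))             # deg C, approaching ~100 deg C
    print(round(log_reduction_dz(t, T, D_ref=5.0, z=15.0, T_ref=95.0), 2), "log")
    ```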

  3. Assessment of resampling methods for causality testing: A note on the US inflation behavior

    PubMed Central

    Kyrtsou, Catherine; Kugiumtzis, Dimitris; Diks, Cees

    2017-01-01

    Different resampling methods for the null hypothesis of no Granger causality are assessed in the setting of multivariate time series, taking into account that the driving-response coupling is conditioned on the other observed variables. As an appropriate test statistic for this setting, the partial transfer entropy (PTE), an information-theoretic, model-free measure, is used. Two resampling techniques, time-shifted surrogates and the stationary bootstrap, are combined with three independence settings (giving a total of six resampling methods), all approximating the null hypothesis of no Granger causality. In these three settings, the level of dependence is changed, while the conditioning variables remain intact. The empirical null distribution of the PTE, as the surrogate and bootstrapped time series become more independent, is examined along with the size and power of the respective tests. Additionally, we consider a seventh resampling method by contemporaneously resampling the driving and the response time series using the stationary bootstrap. Although this case does not comply with the no-causality hypothesis, one can obtain an accurate sampling distribution for the mean of the test statistic, since its value is zero under H0. Results indicate that as the resampling setting becomes more independent, the test becomes more conservative. Finally, we conclude with a real application. More specifically, we investigate the causal links among the growth rates of the US CPI, money supply, and crude oil. Based on the PTE and the seven resampling methods, we consistently find that changes in crude oil cause inflation, conditional on money supply, in the post-1986 period. However, this relationship cannot be explained on the basis of traditional cost-push mechanisms. PMID:28708870

  4. Assessment of resampling methods for causality testing: A note on the US inflation behavior.

    PubMed

    Papana, Angeliki; Kyrtsou, Catherine; Kugiumtzis, Dimitris; Diks, Cees

    2017-01-01

    Different resampling methods for the null hypothesis of no Granger causality are assessed in the setting of multivariate time series, taking into account that the driving-response coupling is conditioned on the other observed variables. As an appropriate test statistic for this setting, the partial transfer entropy (PTE), an information-theoretic, model-free measure, is used. Two resampling techniques, time-shifted surrogates and the stationary bootstrap, are combined with three independence settings (giving a total of six resampling methods), all approximating the null hypothesis of no Granger causality. In these three settings, the level of dependence is changed, while the conditioning variables remain intact. The empirical null distribution of the PTE, as the surrogate and bootstrapped time series become more independent, is examined along with the size and power of the respective tests. Additionally, we consider a seventh resampling method by contemporaneously resampling the driving and the response time series using the stationary bootstrap. Although this case does not comply with the no-causality hypothesis, one can obtain an accurate sampling distribution for the mean of the test statistic, since its value is zero under H0. Results indicate that as the resampling setting becomes more independent, the test becomes more conservative. Finally, we conclude with a real application. More specifically, we investigate the causal links among the growth rates of the US CPI, money supply, and crude oil. Based on the PTE and the seven resampling methods, we consistently find that changes in crude oil cause inflation, conditional on money supply, in the post-1986 period. However, this relationship cannot be explained on the basis of traditional cost-push mechanisms.
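
    A minimal sketch of the time-shifted surrogate scheme is shown below: the driving series is circularly shifted to break its alignment with the response while preserving its own dynamics, and an empirical p-value is computed for any user-supplied causality statistic (e.g., a PTE estimator with the conditioning variables bound in a closure). The function names and defaults are illustrative, not taken from the paper.

    ```python
    import numpy as np

    def time_shifted_surrogates(driver, n_surr=100, min_shift=20, seed=0):
        """Generate surrogates of the driving series by circular time shifts.

        Shifting destroys the temporal alignment with the response while keeping
        the driver's own dynamics intact, approximating the no-causality null.
        """
        rng = np.random.default_rng(seed)
        n = len(driver)
        shifts = rng.integers(min_shift, n - min_shift, size=n_surr)
        return np.array([np.roll(driver, s) for s in shifts])

    def empirical_pvalue(statistic, driver, response, n_surr=100):
        """One-sided p-value of a causality statistic against time-shifted surrogates."""
        s0 = statistic(driver, response)
        null = [statistic(s, response) for s in time_shifted_surrogates(driver, n_surr)]
        return (1 + sum(v >= s0 for v in null)) / (n_surr + 1)
    ```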

  5. Skin models for the testing of transdermal drugs

    PubMed Central

    Abd, Eman; Yousef, Shereen A; Pastore, Michael N; Telaprolu, Krishna; Mohammed, Yousuf H; Namjoshi, Sarika; Grice, Jeffrey E; Roberts, Michael S

    2016-01-01

    The assessment of percutaneous permeation of molecules is a key step in the evaluation of dermal or transdermal delivery systems. If the drugs are intended for delivery to humans, the most appropriate setting in which to do the assessment is the in vivo human. However, this may not be possible for ethical, practical, or economic reasons, particularly in the early phases of development. It is thus necessary to find alternative methods using accessible and reproducible surrogates for in vivo human skin. A range of models has been developed, including ex vivo human skin, usually obtained from cadavers or plastic surgery patients, ex vivo animal skin, and artificial or reconstructed skin models. Increasingly, largely driven by regulatory authorities and industry, there is a focus on developing standardized techniques and protocols. With this comes the need to demonstrate that the surrogate models produce results that correlate with those from in vivo human studies and that they can be used to show bioequivalence of different topical products. This review discusses the alternative skin models that have been developed as surrogates for normal and diseased skin and examines the concepts of using model systems for in vitro–in vivo correlation and the demonstration of bioequivalence. PMID:27799831

  6. Gaussian process regression of chirplet decomposed ultrasonic B-scans of a simulated design case

    NASA Astrophysics Data System (ADS)

    Wertz, John; Homa, Laura; Welter, John; Sparkman, Daniel; Aldrin, John

    2018-04-01

    The US Air Force seeks to implement damage-tolerant lifecycle management of composite structures. Nondestructive characterization of damage is a key input to this framework. One approach to characterization is model-based inversion of the ultrasonic response from damage features; however, the computational expense of modeling ultrasonic waves within composites is a major hurdle to implementation. A surrogate forward model with sufficient accuracy and greater computational efficiency is therefore critical to enabling model-based inversion and damage characterization. In this work, a surrogate model is developed from the simulated ultrasonic response of delamination-like structures placed at different locations within a representative composite layup. The resulting B-scans are decomposed via the chirplet transform, and a Gaussian process model is trained on the chirplet parameters. The quality of the surrogate is tested by comparing the B-scan for a delamination configuration not represented within the training data set. The estimated B-scan has a maximum error of ~15%, for an estimated reduction in computational runtime of ~95% for 200 function calls. This considerable reduction in computational expense makes full 3D characterization of impact damage tractable.
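
    A hedged sketch of the Gaussian process step is shown below using scikit-learn: delamination descriptors map to a small vector of chirplet parameters standing in for the decomposed B-scan. The feature ranges, targets, and kernel settings are assumptions for illustration, not values from the study.

    ```python
    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF, WhiteKernel

    # Hypothetical training set: delamination (depth, lateral position) -> chirplet
    # parameters of the simulated B-scan (e.g., arrival time, center frequency,
    # amplitude), flattened into one target vector per configuration.
    rng = np.random.default_rng(0)
    X = rng.uniform([0.5, -10.0], [4.5, 10.0], size=(40, 2))       # mm depth, mm offset
    Y = np.column_stack([2 * X[:, 0] + 0.1 * rng.normal(size=40),  # stand-in parameters
                         5.0 - 0.2 * X[:, 0],
                         0.05 * X[:, 1]])

    kernel = 1.0 * RBF(length_scale=[1.0, 5.0]) + WhiteKernel(1e-4)
    gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X, Y)

    query = np.array([[2.0, 3.0]])                                 # unseen configuration
    mean, std = gp.predict(query, return_std=True)                 # surrogate prediction
    ```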

  7. Improved design of prodromal Alzheimer's disease trials through cohort enrichment and surrogate endpoints.

    PubMed

    Macklin, Eric A; Blacker, Deborah; Hyman, Bradley T; Betensky, Rebecca A

    2013-01-01

    Alzheimer's disease (AD) trials initiated during or before the prodrome are costly and lengthy because patients are enrolled long before clinical symptoms are apparent, when disease progression is slow. We hypothesized that design of such trials could be improved by: 1) selecting individuals at moderate near-term risk of progression to AD dementia (the current clinical standard) and 2) by using short-term surrogate endpoints that predict progression to AD dementia. We used a longitudinal cohort of older, initially non-demented, community-dwelling participants (n = 358) to derive selection criteria and surrogate endpoints and tested them in an independent national data set (n = 6,243). To identify a "mid-risk" subgroup, we applied conditional tree-based survival models to Clinical Dementia Rating (CDR) scale scores and common neuropsychological tests. In the validation cohort, a time-to-AD dementia trial applying these mid-risk selection criteria to a pool of all non-demented individuals could achieve equivalent power with 47% fewer participants than enrolling at random from that pool. We evaluated surrogate endpoints measurable over two years of follow-up based on cross-validated concordance between predictions from Cox models and observed time to AD dementia. The best performing surrogate, rate of change in CDR sum-of-boxes, did not reduce the trial duration required for equivalent power using estimates from the validation cohort, but alternative surrogates with better ability to predict time to AD dementia should be able to do so. The approach tested here might improve efficiency of prodromal AD trials using other potential measures and could be generalized to other diseases with long prodromal phases.

  8. Improved design of prodromal Alzheimer’s disease trials through cohort enrichment and surrogate endpoints

    PubMed Central

    Macklin, Eric A.; Blacker, Deborah; Hyman, Bradley T.; Betensky, Rebecca A.

    2013-01-01

    Summary Alzheimer’s disease (AD) trials initiated during or before the prodrome are costly and lengthy because patients are enrolled long before clinical symptoms are apparent, when disease progression is slow. We hypothesized that design of such trials could be improved by: (1) selecting individuals at moderate near-term risk of progression to AD dementia (the current clinical standard) and (2) by using short-term surrogate endpoints that predict progression to AD dementia. We used a longitudinal cohort of older, initially non-demented, community-dwelling participants (n=358) to derive selection criteria and surrogate endpoints and tested them in an independent national data set (n=6,243). To identify a “mid-risk” subgroup, we applied conditional tree-based survival models to Clinical Dementia Rating (CDR) scale scores and common neuropsychological tests. In the validation cohort, a time-to-AD dementia trial applying these mid-risk selection criteria to a pool of all non-demented individuals could achieve equivalent power with 47% fewer participants than enrolling at random from that pool. We evaluated surrogate endpoints measurable over two years of follow-up based on cross-validated concordance between predictions from Cox models and observed time to AD dementia. The best performing surrogate, rate of change in CDR sum-of-boxes, did not reduce the trial duration required for equivalent power using estimates from the validation cohort, but alternative surrogates with better ability to predict time to AD dementia should be able to do so. The approach tested here might improve efficiency of prodromal AD trials using other potential measures and could be generalized to other diseases with long prodromal phases. PMID:23629586

  9. Identifying family members who may struggle in the role of surrogate decision maker.

    PubMed

    Majesko, Alyssa; Hong, Seo Yeon; Weissfeld, Lisa; White, Douglas B

    2012-08-01

    Although acting as a surrogate decision maker can be highly distressing for some family members of intensive care unit patients, little is known about whether there are modifiable risk factors for the occurrence of such difficulties. We aimed to identify: 1) factors associated with lower levels of confidence among family members to function as surrogates and 2) whether the quality of clinician-family communication is associated with the timing of decisions to forego life support. We conducted a prospective study of 230 surrogate decision makers for incapacitated, mechanically ventilated patients at high risk of death in four intensive care units at University of California San Francisco Medical Center from 2006 to 2007. Surrogates completed a questionnaire addressing their perceived ability to act as a surrogate and the quality of their communication with physicians. We used clustered multivariate logistic regression to identify predictors of low levels of perceived ability to act as a surrogate and a Cox proportional hazard model to determine whether quality of communication was associated with the timing of decisions to withdraw life support. There was substantial variability in family members' confidence to act as surrogate decision makers, with 27% rating their perceived ability as 7 or lower on a 10-point scale. Independent predictors of lower role confidence were the lack of prior experience as a surrogate (odds ratio 2.2, 95% confidence interval [1.04-4.46], p=.04), no prior discussions with the patient about treatment preferences (odds ratio 3.7, 95% confidence interval [1.79-7.76], p<.001), and poor quality of communication with the ICU physician (odds ratio 1.2, 95% confidence interval [1.09-1.35], p<.001). Higher-quality physician-family communication was associated with a significantly shorter duration of life-sustaining treatment among patients who died (β=0.11, p=.001). Family members without prior experience as a surrogate and those who had not engaged in advance discussions with the patient about treatment preferences were at higher risk of reporting less confidence in carrying out the surrogate role. Better-quality clinician-family communication was associated with both more confidence among family members to act as surrogates and a shorter duration of use of life support among patients who died.
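
    The analysis above is a clustered multivariate logistic regression; as a rough illustration of how such odds ratios and confidence intervals are obtained, here is a hedged sketch using statsmodels on synthetic data with made-up variable names (it omits the clustering adjustment used in the study):

        import numpy as np
        import pandas as pd
        import statsmodels.api as sm

        # Hypothetical data frame; column names are illustrative, not the study's.
        rng = np.random.default_rng(0)
        n = 230
        df = pd.DataFrame({
            "prior_surrogate_experience": rng.integers(0, 2, n),
            "prior_preference_discussion": rng.integers(0, 2, n),
            "communication_quality": rng.normal(7, 2, n),
        })
        logit_p = (-1 + 0.8*df.prior_surrogate_experience
                   + 1.3*df.prior_preference_discussion + 0.2*df.communication_quality)
        df["high_confidence"] = rng.binomial(1, 1/(1 + np.exp(-logit_p)))

        X = sm.add_constant(df.drop(columns="high_confidence"))
        fit = sm.Logit(df["high_confidence"], X).fit(disp=False)
        print(np.exp(fit.params))        # odds ratios
        print(np.exp(fit.conf_int()))    # 95% confidence intervals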

  10. SU-E-J-234: Application of a Breathing Motion Model to ViewRay Cine MR Images

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    O’Connell, D. P.; Thomas, D. H.; Dou, T. H.

    2015-06-15

    Purpose: A respiratory motion model previously used to generate breathing-gated CT images was used with cine MR images. Accuracy and predictive ability of the in-plane models were evaluated. Methods: Sagittal-plane cine MR images of a patient undergoing treatment on a ViewRay MRI/radiotherapy system were acquired before and during treatment. Images were acquired at 4 frames/second with 3.5 × 3.5 mm resolution and a slice thickness of 5 mm. The first cine frame was deformably registered to following frames. The superior/inferior component of the tumor centroid position was used as a breathing surrogate. Deformation vectors and surrogate measurements were used to determine motion model parameters. Model error was evaluated and subsequent treatment cines were predicted from breathing surrogate data. A simulated CT cine was created by generating breathing-gated volumetric images at 0.25 second intervals along the measured breathing trace, selecting a sagittal slice and downsampling to the resolution of the MR cines. A motion model was built using the first half of the simulated cine data. Model accuracy and error in predicting the remaining frames of the cine were evaluated. Results: Mean difference between model-predicted and deformably registered lung tissue positions for the 28 second preview MR cine acquired before treatment was 0.81 ± 0.30 mm. The model was used to predict two minutes of the subsequent treatment cine with a mean accuracy of 1.59 ± 0.63 mm. Conclusion: In-plane motion models were built using MR cine images and evaluated for accuracy and ability to predict future respiratory motion from breathing surrogate measurements. Examination of long-term predictive ability is ongoing. The technique was applied to simulated CT cines for further validation, and the authors are currently investigating use of in-plane models to update pre-existing volumetric motion models used for generation of breathing-gated CT planning images.
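
    Surrogate-driven motion models of this kind are often expressed as a per-voxel linear map from the breathing surrogate to the registered displacement. A minimal least-squares sketch on synthetic data (the actual ViewRay model and its parameterization may differ):

        import numpy as np

        rng = np.random.default_rng(1)
        n_frames, n_voxels = 120, 500

        # surrogate: superior/inferior tumor-centroid position per cine frame
        s = np.sin(np.linspace(0, 12*np.pi, n_frames)) + 0.05*rng.normal(size=n_frames)

        # deformation vectors from registering frame 1 to every frame (synthetic here)
        true_alpha = rng.normal(size=n_voxels)       # mm of motion per unit surrogate
        disp = np.outer(s, true_alpha) + 0.1*rng.normal(size=(n_frames, n_voxels))

        # fit per-voxel model d(t) = alpha*s(t) + beta by least squares
        A = np.column_stack([s, np.ones(n_frames)])
        coef, *_ = np.linalg.lstsq(A, disp, rcond=None)   # shape (2, n_voxels)

        # predict displacements for new surrogate measurements (e.g. during treatment)
        s_new = np.array([0.3, -0.7])
        pred = np.column_stack([s_new, np.ones(2)]) @ coef
        print(pred.shape)   # (2, n_voxels)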

  11. Final Report: Quantification of Uncertainty in Extreme Scale Computations (QUEST)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Marzouk, Youssef; Conrad, Patrick; Bigoni, Daniele

    QUEST (\\url{www.quest-scidac.org}) is a SciDAC Institute that is focused on uncertainty quantification (UQ) in large-scale scientific computations. Our goals are to (1) advance the state of the art in UQ mathematics, algorithms, and software; and (2) provide modeling, algorithmic, and general UQ expertise, together with software tools, to other SciDAC projects, thereby enabling and guiding a broad range of UQ activities in their respective contexts. QUEST is a collaboration among six institutions (Sandia National Laboratories, Los Alamos National Laboratory, the University of Southern California, Massachusetts Institute of Technology, the University of Texas at Austin, and Duke University) with a history of joint UQ research. Our vision encompasses all aspects of UQ in leadership-class computing. This includes the well-founded setup of UQ problems; characterization of the input space given available data/information; local and global sensitivity analysis; adaptive dimensionality and order reduction; forward and inverse propagation of uncertainty; handling of application code failures, missing data, and hardware/software fault tolerance; and model inadequacy, comparison, validation, selection, and averaging. The nature of the UQ problem requires the seamless combination of data, models, and information across this landscape in a manner that provides a self-consistent quantification of requisite uncertainties in predictions from computational models. Accordingly, our UQ methods and tools span an interdisciplinary space across applied math, information theory, and statistics. The MIT QUEST effort centers on statistical inference and methods for surrogate or reduced-order modeling. MIT personnel have been responsible for the development of adaptive sampling methods, methods for approximating computationally intensive models, and software for both forward uncertainty propagation and statistical inverse problems. A key software product of the MIT QUEST effort is the MIT Uncertainty Quantification library, called MUQ (\\url{muq.mit.edu}).

  12. THE EXPOSURE PARADOX IN PARTICULATE MATTER COMMUNITY TIME-SERIES EPIDEMIOLOGY: CAN AMBIENT CONCENTRATIONS OF PM BE USED AS A SURROGATE FOR PERSONAL EXPOSURE TO PM ?

    EPA Science Inventory

    Objective: Explain why epidemiologic studies find a statistically significant relationship between ambient concentrations of PM and health effects even though only a near-zero correlation is found between ambient concentrations of PM and personal exposures to PM. Method: Consider...

  13. Multi-fidelity Gaussian process regression for prediction of random fields

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Parussini, L.; Venturi, D., E-mail: venturi@ucsc.edu; Perdikaris, P.

    We propose a new multi-fidelity Gaussian process regression (GPR) approach for prediction of random fields based on observations of surrogate models or hierarchies of surrogate models. Our method builds upon recent work on recursive Bayesian techniques, in particular recursive co-kriging, and extends it to vector-valued fields and various types of covariances, including separable and non-separable ones. The framework we propose is general and can be used to perform uncertainty propagation and quantification in model-based simulations, multi-fidelity data fusion, and surrogate-based optimization. We demonstrate the effectiveness of the proposed recursive GPR techniques through various examples. Specifically, we study the stochastic Burgers equation and the stochastic Oberbeck–Boussinesq equations describing natural convection within a square enclosure. In both cases we find that the standard deviation of the Gaussian predictors as well as the absolute errors relative to benchmark stochastic solutions are very small, suggesting that the proposed multi-fidelity GPR approaches can yield highly accurate results.
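
    A heavily simplified two-level version of the recursive co-kriging idea: fit a GP to the low-fidelity model, then fit a second GP to the discrepancy of the high-fidelity data from the first predictor. This sketch uses scikit-learn and toy functions; it is not the authors' vector-valued formulation:

        import numpy as np
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import RBF

        def f_low(x):  return np.sin(8*x)                 # cheap surrogate model
        def f_high(x): return np.sin(8*x) + 0.3*x**2      # expensive model

        x_lo = np.linspace(0, 1, 40)[:, None]             # many cheap evaluations
        x_hi = np.linspace(0, 1, 6)[:, None]              # few expensive evaluations

        gp_lo = GaussianProcessRegressor(kernel=RBF(0.1)).fit(x_lo, f_low(x_lo).ravel())

        # second level: model the discrepancy between high fidelity and the level-1 GP
        delta = f_high(x_hi).ravel() - gp_lo.predict(x_hi)
        gp_delta = GaussianProcessRegressor(kernel=RBF(0.2)).fit(x_hi, delta)

        x_test = np.linspace(0, 1, 5)[:, None]
        pred = gp_lo.predict(x_test) + gp_delta.predict(x_test)
        print(np.abs(pred - f_high(x_test).ravel()).max())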

  14. To address surface reaction network complexity using scaling relations machine learning and DFT calculations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ulissi, Zachary W.; Medford, Andrew J.; Bligaard, Thomas

    Surface reaction networks involving hydrocarbons exhibit enormous complexity with thousands of species and reactions for all but the very simplest of chemistries. We present a framework for optimization under uncertainty for heterogeneous catalysis reaction networks using surrogate models that are trained on the fly. The surrogate model is constructed by teaching a Gaussian process to predict adsorption energies based on group additivity fingerprints, combined with transition-state scaling relations and a simple classifier for determining the rate-limiting step. The surrogate model is iteratively used to predict the most important reaction step to be calculated explicitly with computationally demanding electronic structure theory. Applying these methods to the reaction of syngas on rhodium(111), we identify the most likely reaction mechanism. Lastly, propagating uncertainty throughout this process yields the likelihood that the final mechanism is complete given measurements on only a subset of the entire network and uncertainty in the underlying density functional theory calculations.
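
    The "trained on the fly" loop can be illustrated generically as uncertainty-driven active learning: fit a GP to the energies computed so far and choose the next explicit calculation where the surrogate is most uncertain. The sketch below uses synthetic fingerprints and a plain uncertainty criterion rather than the authors' rate-limiting-step selection:

        import numpy as np
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import RBF, WhiteKernel

        rng = np.random.default_rng(2)
        fingerprints = rng.normal(size=(200, 8))      # group-additivity-style descriptors
        true_energy = fingerprints @ rng.normal(size=8)

        computed = list(range(10))                    # indices with explicit DFT energies
        for _ in range(5):
            gp = GaussianProcessRegressor(kernel=RBF(1.0) + WhiteKernel(1e-3))
            gp.fit(fingerprints[computed], true_energy[computed])
            mean, std = gp.predict(fingerprints, return_std=True)
            candidates = [i for i in range(len(fingerprints)) if i not in computed]
            nxt = max(candidates, key=lambda i: std[i])   # most uncertain species next
            computed.append(nxt)                          # "run DFT" for that species
        print(len(computed), "energies computed explicitly")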

  15. To address surface reaction network complexity using scaling relations machine learning and DFT calculations

    DOE PAGES

    Ulissi, Zachary W.; Medford, Andrew J.; Bligaard, Thomas; ...

    2017-03-06

    Surface reaction networks involving hydrocarbons exhibit enormous complexity with thousands of species and reactions for all but the very simplest of chemistries. We present a framework for optimization under uncertainty for heterogeneous catalysis reaction networks using surrogate models that are trained on the fly. The surrogate model is constructed by teaching a Gaussian process to predict adsorption energies based on group additivity fingerprints, combined with transition-state scaling relations and a simple classifier for determining the rate-limiting step. The surrogate model is iteratively used to predict the most important reaction step to be calculated explicitly with computationally demanding electronic structure theory. Applying these methods to the reaction of syngas on rhodium(111), we identify the most likely reaction mechanism. Lastly, propagating uncertainty throughout this process yields the likelihood that the final mechanism is complete given measurements on only a subset of the entire network and uncertainty in the underlying density functional theory calculations.

  16. Putting engineering back into protein engineering: bioinformatic approaches to catalyst design.

    PubMed

    Gustafsson, Claes; Govindarajan, Sridhar; Minshull, Jeremy

    2003-08-01

    Complex multivariate engineering problems are commonplace and not unique to protein engineering. Mathematical and data-mining tools developed in other fields of engineering have now been applied to analyze sequence-activity relationships of peptides and proteins and to assist in the design of proteins and peptides with specified properties. Decreasing costs of DNA sequencing in conjunction with methods to quickly synthesize statistically representative sets of proteins allow modern heuristic statistics to be applied to protein engineering. This provides an alternative approach to expensive assays or unreliable high-throughput surrogate screens.

  17. Unsupervised Spatial Event Detection in Targeted Domains with Applications to Civil Unrest Modeling

    PubMed Central

    Zhao, Liang; Chen, Feng; Dai, Jing; Hua, Ting; Lu, Chang-Tien; Ramakrishnan, Naren

    2014-01-01

    Twitter has become a popular data source as a surrogate for monitoring and detecting events. Targeted domains such as crime, election, and social unrest require the creation of algorithms capable of detecting events pertinent to these domains. Due to the unstructured language, short-length messages, dynamics, and heterogeneity typical of Twitter data streams, it is technically difficult and labor-intensive to develop and maintain supervised learning systems. We present a novel unsupervised approach for detecting spatial events in targeted domains and illustrate this approach using one specific domain, viz. civil unrest modeling. Given a targeted domain, we propose a dynamic query expansion algorithm to iteratively expand domain-related terms, and generate a tweet homogeneous graph. An anomaly identification method is utilized to detect spatial events over this graph by jointly maximizing local modularity and spatial scan statistics. Extensive experiments conducted in 10 Latin American countries demonstrate the effectiveness of the proposed approach. PMID:25350136

  18. An Exploration of Latent Structure in Observational Huntington’s Disease Studies

    PubMed Central

    Ghosh, Soumya; Sun, Zhaonan; Li, Ying; Cheng, Yu; Mohan, Amrita; Sampaio, Cristina; Hu, Jianying

    2017-01-01

    Huntington’s disease (HD) is a monogenic neurodegenerative disorder characterized by the progressive decay of motor and cognitive abilities accompanied by psychiatric episodes. Tracking and modeling the progression of the multi-faceted clinical symptoms of HD is a challenging problem that has important implications for staging of HD patients and the development of improved enrollment criteria for future HD studies and trials. In this paper, we describe the first steps towards this goal. We begin by curating data from four recent observational HD studies, each containing a diverse collection of clinical assessments. The resulting dataset is unprecedented in size and contains data from 19,269 study participants. By analyzing this large dataset, we are able to discover hidden low dimensional structure in the data that correlates well with surrogate measures of HD progression. The discovered structures are promising candidates for future consumption by downstream statistical HD progression models. PMID:28815114
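
    A minimal sketch of the kind of latent-structure analysis described above: standardize the clinical assessments, extract principal components, and correlate the leading component with a surrogate progression measure. Data and column meanings here are synthetic, and the paper's actual factorization method may differ:

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.preprocessing import StandardScaler
        from scipy.stats import pearsonr

        rng = np.random.default_rng(3)
        n = 1000
        progression = rng.normal(size=n)                 # surrogate of HD progression
        assessments = np.column_stack([                  # synthetic clinical scores
            progression + 0.5*rng.normal(size=n) for _ in range(12)
        ])

        Z = StandardScaler().fit_transform(assessments)
        pca = PCA(n_components=3).fit(Z)
        scores = pca.transform(Z)

        r, p = pearsonr(scores[:, 0], progression)
        print(pca.explained_variance_ratio_, r)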

  19. Towards the feasibility of using ultrasound to determine mechanical properties of tissues in a bioreactor.

    PubMed

    Mansour, Joseph M; Gu, Di-Win Marine; Chung, Chen-Yuan; Heebner, Joseph; Althans, Jake; Abdalian, Sarah; Schluchter, Mark D; Liu, Yiying; Welter, Jean F

    2014-10-01

    Our ultimate goal is to non-destructively evaluate mechanical properties of tissue-engineered (TE) cartilage using ultrasound (US). We used agarose gels as surrogates for TE cartilage. Previously, we showed that mechanical properties measured using conventional methods were related to those measured using US, which suggested a way to non-destructively predict mechanical properties of samples with known volume fractions. In this study, we sought to determine whether the mechanical properties of samples with unknown volume fractions could be predicted by US. Aggregate moduli were calculated for hydrogels as a function of speed of sound (SOS), based on concentration and density using a poroelastic model. The data were used to train a statistical model, which we then used to predict volume fractions and mechanical properties of unknown samples. Young's and storage moduli were measured mechanically. The statistical model generally predicted the Young's moduli in compression to within <10% of their mechanically measured values. We defined positive linear correlations between the aggregate modulus predicted from US and both the storage and Young's moduli determined from mechanical tests. Mechanical properties of hydrogels with unknown volume fractions can be predicted successfully from US measurements. This method has the potential to predict mechanical properties of TE cartilage non-destructively in a bioreactor.
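
    As a hedged illustration of the final step (predicting moduli of unknown samples from ultrasound measurements), a simple regression of Young's modulus on speed of sound with synthetic numbers; the study's poroelastic calculation and statistical model are more involved:

        import numpy as np
        from sklearn.linear_model import LinearRegression

        rng = np.random.default_rng(4)
        sos = rng.uniform(1500, 1600, 30)                        # speed of sound, m/s (synthetic)
        youngs = 0.004*(sos - 1500) + 0.05*rng.normal(size=30)   # MPa (synthetic)

        model = LinearRegression().fit(sos[:, None], youngs)
        unknown_sos = np.array([[1520.0], [1580.0]])
        print(model.predict(unknown_sos))                 # predicted Young's moduli
        print(model.score(sos[:, None], youngs))          # R^2 on training data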

  20. Towards the feasibility of using ultrasound to determine mechanical properties of tissues in a bioreactor

    PubMed Central

    Mansour, Joseph M.; Gu, Di-Win Marine; Chung, Chen-Yuan; Heebner, Joseph; Althans, Jake; Abdalian, Sarah; Schluchter, Mark D.; Liu, Yiying; Welter, Jean F.

    2016-01-01

    Introduction Our ultimate goal is to non-destructively evaluate mechanical properties of tissue-engineered (TE) cartilage using ultrasound (US). We used agarose gels as surrogates for TE cartilage. Previously, we showed that mechanical properties measured using conventional methods were related to those measured using US, which suggested a way to non-destructively predict mechanical properties of samples with known volume fractions. In this study, we sought to determine whether the mechanical properties of samples with unknown volume fractions could be predicted by US. Methods Aggregate moduli were calculated for hydrogels as a function of speed of sound (SOS), based on concentration and density using a poroelastic model. The data were used to train a statistical model, which we then used to predict volume fractions and mechanical properties of unknown samples. Young's and storage moduli were measured mechanically. Results The statistical model generally predicted the Young's moduli in compression to within < 10% of their mechanically measured values. We defined positive linear correlations between the aggregate modulus predicted from US and both the storage and Young's moduli determined from mechanical tests. Conclusions Mechanical properties of hydrogels with unknown volume fractions can be predicted successfully from US measurements. This method has the potential to predict mechanical properties of TE cartilage non-destructively in a bioreactor. PMID:25092421

  1. Do-not-resuscitate orders in an extended-care study group.

    PubMed

    Meyers, R M; Lurie, N; Breitenbucher, R B; Waring, C J

    1990-09-01

    We examined the charts of 911 nursing home patients in Hennepin County, Minnesota, to determine the prevalence of written do-not-resuscitate (DNR) orders. Information regarding demographic characteristics, and whether a surrogate decisionmaker was available and participated in the decision, was also collected. Twenty-seven percent of patients had DNR orders. Ninety percent of all patients had potentially available surrogate decisionmakers. However, for 31% of patients with DNR orders, there was no documentation of patient or surrogate participation in the DNR decision. Univariate analysis identified female sex, increased age, level of care (skilled versus intermediate), presence of a potential surrogate decisionmaker, and increasing length of time since nursing home admission as factors associated with presence of DNR orders. When a logistic regression model was used, increased age, increased length of time since nursing home admission, skilled versus intermediate level of care, and presence of a surrogate decisionmaker were independently associated with presence of DNR status. Several variables are independently associated with written DNR orders; their relationship to the factors physicians use in decision making requires further study.

  2. MRI-guided tumor tracking in lung cancer radiotherapy

    NASA Astrophysics Data System (ADS)

    Cerviño, Laura I.; Du, Jiang; Jiang, Steve B.

    2011-07-01

    Precise tracking of lung tumor motion during treatment delivery still represents a challenge in radiation therapy. Prototypes of MRI-linac hybrid systems are being created which have the potential of ionization-free real-time imaging of the tumor. This study evaluates the performance of lung tumor tracking algorithms in cine-MRI sagittal images from five healthy volunteers. Visible vascular structures were used as targets. Volunteers performed several series of regular and irregular breathing. Two tracking algorithms were implemented and evaluated: a template matching (TM) algorithm in combination with surrogate tracking using the diaphragm (surrogate was used when the maximum correlation between the template and the image in the search window was less than specified), and an artificial neural network (ANN) model based on the principal components of a region of interest that encompasses the target motion. The mean tracking error ē and the error at 95% confidence level e95 were evaluated for each model. The ANN model led to ē = 1.5 mm and e95 = 4.2 mm, while TM led to ē = 0.6 mm and e95 = 1.0 mm. An extra series was considered separately to evaluate the benefit of using surrogate tracking in combination with TM when target out-of-plane motion occurs. For this series, the mean error was 7.2 mm using only TM and 1.7 mm when the surrogate was used in combination with TM. Results show that, as opposed to tracking with other imaging modalities, ANN does not perform well in MR-guided tracking. TM, however, leads to highly accurate tracking. Out-of-plane motion could be addressed by surrogate tracking using the diaphragm, which can be easily identified in the images.
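
    A minimal sketch of template matching by normalized cross-correlation over a search window, with a fallback signal when the best correlation drops below a threshold (as when out-of-plane motion degrades the match). Array sizes, the threshold, and the fallback rule are illustrative assumptions, not the paper's implementation:

        import numpy as np

        def ncc(a, b):
            a = a - a.mean()
            b = b - b.mean()
            denom = np.sqrt((a*a).sum() * (b*b).sum())
            return float((a*b).sum() / denom) if denom > 0 else 0.0

        def track(frame, template, center, search=10, threshold=0.7):
            """Return the best-matching position, or None to signal surrogate fallback."""
            th, tw = template.shape
            best, best_pos = -1.0, None
            for dy in range(-search, search + 1):
                for dx in range(-search, search + 1):
                    y, x = center[0] + dy, center[1] + dx
                    if y < 0 or x < 0:
                        continue
                    patch = frame[y:y+th, x:x+tw]
                    if patch.shape != template.shape:
                        continue
                    c = ncc(patch, template)
                    if c > best:
                        best, best_pos = c, (y, x)
            return best_pos if best >= threshold else None   # None -> use diaphragm surrogate

        rng = np.random.default_rng(5)
        frame = rng.normal(size=(64, 64))
        template = frame[20:28, 30:38].copy()
        print(track(frame, template, center=(18, 28)))        # should recover (20, 30)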

  3. Primary Sclerosing Cholangitis Risk Estimate Tool (PREsTo) Predicts Outcomes in PSC: A Derivation & Validation Study Using Machine Learning.

    PubMed

    Eaton, John E; Vesterhus, Mette; McCauley, Bryan M; Atkinson, Elizabeth J; Schlicht, Erik M; Juran, Brian D; Gossard, Andrea A; LaRusso, Nicholas F; Gores, Gregory J; Karlsen, Tom H; Lazaridis, Konstantinos N

    2018-05-09

    Improved methods are needed to risk stratify and predict outcomes in patients with primary sclerosing cholangitis (PSC). Therefore, we sought to derive and validate a new prediction model and compare its performance to existing surrogate markers. The model was derived using 509 subjects from a multicenter North American cohort and validated in an international multicenter cohort (n=278). Gradient boosting, a machine-learning technique, was used to create the model. The endpoint was hepatic decompensation (ascites, variceal hemorrhage or encephalopathy). Subjects with advanced PSC or cholangiocarcinoma at baseline were excluded. The PSC risk estimate tool (PREsTo) consists of 9 variables: bilirubin, albumin, serum alkaline phosphatase (SAP) times the upper limit of normal (ULN), platelets, AST, hemoglobin, sodium, patient age and the number of years since PSC was diagnosed. Validation in an independent cohort confirms PREsTo accurately predicts decompensation (C statistic 0.90, 95% confidence interval (CI) 0.84-0.95) and performed well compared to MELD score (C statistic 0.72, 95% CI 0.57-0.84), Mayo PSC risk score (C statistic 0.85, 95% CI 0.77-0.92) and SAP < 1.5x ULN (C statistic 0.65, 95% CI 0.55-0.73). PREsTo continued to be accurate among individuals with a bilirubin < 2.0 mg/dL (C statistic 0.90, 95% CI 0.82-0.96) and when the score was re-applied at a later course in the disease (C statistic 0.82, 95% CI 0.64-0.95). PREsTo accurately predicts hepatic decompensation in PSC and exceeds the performance of other widely available, noninvasive prognostic scoring systems. This article is protected by copyright. All rights reserved. © 2018 by the American Association for the Study of Liver Diseases.
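
    PREsTo itself is a gradient-boosting model of time to hepatic decompensation; as a rough sketch of the general workflow (fit a boosted model on laboratory values, check discrimination on held-out data), here is a simplified binary-outcome version with synthetic data, using AUC as a stand-in for the C statistic:

        import numpy as np
        import pandas as pd
        from sklearn.ensemble import GradientBoostingClassifier
        from sklearn.metrics import roc_auc_score
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(6)
        n = 800
        X = pd.DataFrame({
            "bilirubin": rng.lognormal(0, 0.5, n), "albumin": rng.normal(4, 0.5, n),
            "alk_phos_xuln": rng.lognormal(0.3, 0.4, n), "platelets": rng.normal(250, 60, n),
            "age": rng.normal(45, 12, n),
        })
        risk = 0.8*np.log(X.bilirubin) - 0.6*(X.albumin - 4) + 0.5*np.log(X.alk_phos_xuln)
        y = rng.binomial(1, 1/(1 + np.exp(-(risk - 0.5))))   # decompensation within horizon

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
        clf = GradientBoostingClassifier().fit(X_tr, y_tr)
        print(roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1]))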

  4. Evaluation of lung tumor motion management in radiation therapy with dynamic MRI

    NASA Astrophysics Data System (ADS)

    Park, Seyoun; Farah, Rana; Shea, Steven M.; Tryggestad, Erik; Hales, Russell; Lee, Junghoon

    2017-03-01

    Surrogate-based tumor motion estimation and tracing methods are commonly used in radiotherapy despite the lack of continuous real time 3D tumor and surrogate data. In this study, we propose a method to simultaneously track the tumor and external surrogates with dynamic MRI, which allows us to evaluate their reproducible correlation. Four MRI-compatible fiducials are placed on the patient's chest and upper abdomen, and multi-slice 2D cine MRIs are acquired to capture the lung and whole tumor, followed by two-slice 2D cine MRIs to simultaneously track the tumor and fiducials, all in sagittal orientation. A phase-binned 4D-MRI is first reconstructed from multi-slice MR images using body area as a respiratory surrogate and group-wise registration. The 4D-MRI provides 3D template volumes for different breathing phases. 3D tumor position is calculated by 3D-2D template matching in which 3D tumor templates in 4D-MRI reconstruction and the 2D cine MRIs from the two-slice tracking dataset are registered. 3D trajectories of the external surrogates are derived via matching a 3D geometrical model to the fiducial segmentations on the 2D cine MRIs. We tested our method on five lung cancer patients. Internal target volume from 4D-CT showed average sensitivity of 86.5% compared to the actual tumor motion for 5 min. 3D tumor motion correlated with the external surrogate signal, but showed a noticeable phase mismatch. The 3D tumor trajectory showed significant cycle-to-cycle variation, while the external surrogate was not sensitive enough to capture such variations. Additionally, there was significant phase mismatch between surrogate signals obtained from fiducials at different locations.
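
    The phase mismatch noted above between internal tumor motion and external surrogates can be estimated from the lag that maximizes the cross-correlation of the two signals. A minimal sketch with synthetic sinusoidal traces (sampling rate and signals are illustrative only):

        import numpy as np

        fs = 4.0                                   # frames per second (illustrative)
        t = np.arange(0, 60, 1/fs)
        tumor = np.sin(2*np.pi*0.25*t)             # internal tumor SI position (synthetic)
        surrogate = np.sin(2*np.pi*0.25*t - 0.6)   # external fiducial signal, phase-shifted

        a = (tumor - tumor.mean()) / tumor.std()
        b = (surrogate - surrogate.mean()) / surrogate.std()
        xcorr = np.correlate(a, b, mode="full")
        lags = np.arange(-len(t) + 1, len(t))
        best_lag = lags[np.argmax(xcorr)]
        print("estimated phase mismatch: %.2f s" % (best_lag / fs))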

  5. Large Eddy Simulation of Turbulent Combustion

    DTIC Science & Technology

    2005-10-01

    A new method to automatically generate skeletal kinetic mechanisms for surrogate fuels, using the directed relation graph method with error propagation, was developed. These mechanisms are guaranteed to match results obtained using detailed chemistry within a user-defined accuracy for any specified target. They can be combined together to produce adequate chemical models for surrogate fuels. A library containing skeletal mechanisms of various

  6. Impact of competitor species composition on predicting diameter growth and survival rates of Douglas-fir trees in southwestern Oregon

    USGS Publications Warehouse

    Bravo, Felipe; Hann, D.W.; Maguire, Douglas A.

    2001-01-01

    Mixed conifer and hardwood stands in southwestern Oregon were studied to explore the hypothesis that competition effects on individual-tree growth and survival will differ according to the species comprising the competition measure. Likewise, it was hypothesized that competition measures should extrapolate best if crown-based surrogates are given preference over diameter-based (basal area based) surrogates. Diameter growth and probability of survival were modeled for individual Douglas-fir (Pseudotsuga menziesii (Mirb.) Franco) trees growing in pure stands. Alternative models expressing one-sided and two-sided competition as a function of either basal area or crown structure were then applied to other plots in which Douglas-fir was mixed with other conifers and (or) hardwood species. Crown-based variables outperformed basal area based variables as surrogates for one-sided competition in both diameter growth and survival probability, regardless of species composition. In contrast, two-sided competition was best represented by total basal area of competing trees. Surrogates reflecting differences in crown morphology among species relate more closely to the mechanics of competition for light and, hence, facilitate extrapolation to species combinations for which no observations are available.

  7. Dry deposition of reduced and reactive nitrogen: A surrogate surfaces approach

    NASA Astrophysics Data System (ADS)

    Shahin, Usama Mohammed

    Nitrogen deposition constitutes an important component of acidic deposition to terrestrial surfaces. However, deposition flux and ambient concentration measurement methods are still under development. A new sampler using water as a surrogate surface was developed in the Department of Environmental Engineering at Illinois Institute of Technology. This study investigated nitrate and ammonia dry deposition to the water surface sampler (WSS), a Nylasorb filter, a citric acid impregnated filter, and a greased strip on the dry deposition plate. The nitrogen-containing species that may be responsible for nitrate dry deposition to the WSS include nitrogen monoxide (NO), nitrogen dioxide (NO2), peroxyacetyl nitrate (PAN), nitrous acid (HNO2), nitric acid (HNO3), and particulate nitrate. The experimental measurements showed that HNO3 and particulate nitrate are the major nitrate contributors to the WSS. Ammonia sources to the water surface are ammonia gas (NH3) and ammonium (NH4+). The experimental results showed that these two species are the sole sources of ammonium deposition. Comparison between the measured deposition velocities of SO2 and HNO3 shows that their dry deposition velocities are statistically the same at the 95% confidence level, and that the NH3 deposition velocity and the water evaporation rate are also the same. It was also shown that the air-side mass transfer coefficients (MTCs) of two different compounds were correlated with the square root of the inverse of their molecular weights. The measured MTC was tested by the application of two models, the resistance model and the water evaporation model. The resistance model prediction of the MTC was very close to the measured value, but the evaporation model prediction was not. This result is compatible with the finding of Yi (1997), who used the same WSS for measurements of SO2. The experimental data collected in this research project were used to develop an empirical model for the MTC of the form k·l/D = 0.0426 (l·v·ρ/μ)^0.8 (μ/(ρ·D))^…
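
    The closing correlation is truncated in this record, so the exponent on the final dimensionless group is unknown; the sketch below therefore leaves it as a free parameter. The reading of the symbols (k = air-side MTC, l = characteristic length, v = wind speed, ρ and μ = air density and viscosity, D = diffusivity) is an assumption based on the usual Sherwood/Reynolds/Schmidt form of such correlations:

        def mtc_from_correlation(l, v, rho, mu, D, n, coeff=0.0426, m=0.8):
            """Air-side mass transfer coefficient k from a Sherwood-type correlation
            k*l/D = coeff * (l*v*rho/mu)**m * (mu/(rho*D))**n.
            The exponent n is left free because the abstract is truncated."""
            reynolds = l * v * rho / mu
            schmidt = mu / (rho * D)
            sherwood = coeff * reynolds**m * schmidt**n
            return sherwood * D / l

        # illustrative numbers only (SI units): 0.3 m plate, 2 m/s wind, air properties,
        # HNO3-like diffusivity, and a commonly used 1/3 Schmidt exponent as a placeholder
        print(mtc_from_correlation(l=0.3, v=2.0, rho=1.2, mu=1.8e-5, D=1.2e-5, n=1/3))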

  8. A Statistical Method of Identifying Interactions in Neuron–Glia Systems Based on Functional Multicell Ca2+ Imaging

    PubMed Central

    Nakae, Ken; Ikegaya, Yuji; Ishikawa, Tomoe; Oba, Shigeyuki; Urakubo, Hidetoshi; Koyama, Masanori; Ishii, Shin

    2014-01-01

    Crosstalk between neurons and glia may constitute a significant part of information processing in the brain. We present a novel method of statistically identifying interactions in a neuron–glia network. We attempted to identify neuron–glia interactions from neuronal and glial activities via maximum-a-posteriori (MAP)-based parameter estimation by developing a generalized linear model (GLM) of a neuron–glia network. The interactions in our interest included functional connectivity and response functions. We evaluated the cross-validated likelihood of GLMs that resulted from the addition or removal of connections to confirm the existence of specific neuron-to-glia or glia-to-neuron connections. We only accepted addition or removal when the modification improved the cross-validated likelihood. We applied the method to a high-throughput, multicellular in vitro Ca2+ imaging dataset obtained from the CA3 region of a rat hippocampus, and then evaluated the reliability of connectivity estimates using a statistical test based on a surrogate method. Our findings based on the estimated connectivity were in good agreement with currently available physiological knowledge, suggesting our method can elucidate undiscovered functions of neuron–glia systems. PMID:25393874
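
    The connection-selection rule described above (accept a connection only if it improves cross-validated likelihood) can be sketched with a Poisson GLM: fit the target cell's activity with and without the candidate input and compare held-out log-likelihoods. Variable names and data are synthetic; the paper's GLM includes response functions and MAP estimation not shown here:

        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(7)
        T = 2000
        glia = rng.poisson(1.0, T)                      # candidate glial input signal
        baseline = rng.normal(size=T)
        rate = np.exp(-0.5 + 0.4*glia + 0.2*baseline)   # true model includes the connection
        neuron = rng.poisson(rate)

        def heldout_loglik(X_cols):
            X = sm.add_constant(np.column_stack(X_cols))
            train, test = slice(0, T//2), slice(T//2, T)
            fit = sm.GLM(neuron[train], X[train], family=sm.families.Poisson()).fit()
            mu = fit.predict(X[test])
            y = neuron[test]
            return np.sum(y*np.log(mu) - mu)            # Poisson log-likelihood up to a constant

        without = heldout_loglik([baseline])
        with_conn = heldout_loglik([baseline, glia])
        print("accept glia->neuron connection:", with_conn > without)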

  9. The Secondary Organic Aerosol Processor (SOAP v1.0) model: a unified model with different ranges of complexity based on the molecular surrogate approach

    NASA Astrophysics Data System (ADS)

    Couvidat, F.; Sartelet, K.

    2014-01-01

    The Secondary Organic Aerosol Processor (SOAP v1.0) model is presented. This model is designed to be modular with different user options depending on the computing time and the complexity required by the user. This model is based on the molecular surrogate approach, in which each surrogate compound is associated with a molecular structure to estimate some properties and parameters (hygroscopicity, absorption on the aqueous phase of particles, activity coefficients, phase separation). Each surrogate can be hydrophilic (condenses only on the aqueous phase of particles), hydrophobic (condenses only on the organic phase of particles) or both (condenses on both the aqueous and the organic phases of particles). Activity coefficients are computed with the UNIFAC thermodynamic model for short-range interactions and with the AIOMFAC parameterization for medium and long-range interactions between electrolytes and organic compounds. Phase separation is determined by Gibbs energy minimization. The user can choose between an equilibrium and a dynamic representation of the organic aerosol. In the equilibrium representation, compounds in the particle phase are assumed to be at equilibrium with the gas phase. However, recent studies show that the organic aerosol (OA) is not at equilibrium with the gas phase because the organic phase could be semi-solid (very viscous liquid phase). The condensation or evaporation of organic compounds could then be limited by the diffusion in the organic phase due to the high viscosity. A dynamic representation of secondary organic aerosols (SOA) is used with OA divided into layers, the first layer at the center of the particle (slowly reaches equilibrium) and the final layer near the interface with the gas phase (quickly reaches equilibrium).

  10. Evaluation of surrogate markers for human immunodeficiency virus infection among blood donors at the blood bank of "Hospital Universitário Regional Norte do Paraná", Londrina, PR, Brazil.

    PubMed

    Reiche, Edna Maria Vissoci; Vogler, Ingridt Hildegard; Morimoto, Helena Kaminami; Bortoliero, André Luis; Matsuo, Tiemi; Yuahasi, Kátia Kioko; Cancian, Sanderson Júnior; Koguichi, Roberto Setsuo

    2003-01-01

    This study evaluated the usefulness of hepatitis B core antibodies (anti-HBc), hepatitis C virus antibodies (anti-HCV), human T cell lymphotropic virus I and II antibodies (anti-HTLV I/II), serologic tests for syphilis, and surface antigen of hepatitis B virus (HBsAg) as surrogate markers for the risk for HIV infection in 80,284 serum samples from blood donors from the Blood Bank of "Hospital Universitário Regional Norte do Paraná", Londrina, Paraná State, Brazil, analyzed from July 1994 to April 2001. Among 39 blood donors with positive serology for HIV, 12 (30.8%) were anti-HBc positive, 10 (25.6%) for anti-HCV, 1 (2.6%) for anti-HTLV I/II, 1 (2.6%) was positive for syphilis, and 1 (2.6%) for HBsAg. Among the donors with negative serology for HIV, these markers were detected in 8,407 (10.5%), 441 (0.5%), 189 (0.2%), 464 (0.6%), and 473 (0.6%) samples, respectively. The difference was statistically significant (p < 0.001) for anti-HBc and anti-HCV. Although the positive predictive values of these surrogate markers for HIV infection were low, the results confirmed anti-HBc and anti-HCV as useful surrogate markers for HIV infection, supporting their continued use in blood donor screening and contributing to the prevention of the small number of cases in which HIV is still transmitted by transfusion.

  11. Relational autonomy: moving beyond the limits of isolated individualism.

    PubMed

    Walter, Jennifer K; Ross, Lainie Friedman

    2014-02-01

    Although clinicians may value respecting a patient's or surrogate's autonomy in decision-making, it is not always clear how to proceed in clinical practice. The confusion results, in part, from which conception of autonomy is used to guide ethical practice. Reliance on an individualistic conception such as the "in-control agent" model prioritizes self-sufficiency in decision-making and highlights a decision-maker's capacity to have reason transcend one's emotional experience. An alternative model of autonomy, relational autonomy, highlights the social context within which all individuals exist and acknowledges the emotional and embodied aspects of decision-makers. These 2 conceptions of autonomy lead to different interpretations of several aspects of ethical decision-making. The in-control agent model believes patients or surrogates should avoid both the influence of others and emotional persuasion in decision-making. As a result, providers have a limited role to play and are expected to provide medical expertise but not interfere with the individual's decision-making process. In contrast, a relational autonomy approach acknowledges the central role of others in decision-making, including clinicians, who have a responsibility to engage patients' and surrogates' emotional experiences and offer clear guidance when patients are confronting serious illness. In the pediatric setting, in which decision-making is complicated by having a surrogate decision-maker in addition to a patient, these conceptions of autonomy also may influence expectations about the role that adolescents can play in decision-making.

  12. The Impact of Parametric Uncertainties on Biogeochemistry in the E3SM Land Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ricciuto, Daniel; Sargsyan, Khachik; Thornton, Peter

    We conduct a global sensitivity analysis (GSA) of the Energy Exascale Earth System Model (E3SM) land model (ELM) to calculate the sensitivity of five key carbon cycle outputs to 68 model parameters. This GSA is conducted by first constructing a Polynomial Chaos (PC) surrogate via a new Weighted Iterative Bayesian Compressive Sensing (WIBCS) algorithm for adaptive basis growth, leading to a sparse, high-dimensional PC surrogate with 3,000 model evaluations. The PC surrogate allows efficient extraction of GSA information, leading to further dimensionality reduction. The GSA is performed at 96 FLUXNET sites covering multiple plant functional types (PFTs) and climate conditions. About 20 of the model parameters are identified as sensitive, with the rest being relatively insensitive across all outputs and PFTs. These sensitivities are dependent on PFT, and are relatively consistent among sites within the same PFT. The five model outputs have a majority of their highly sensitive parameters in common. A common subset of sensitive parameters is also shared among PFTs, but some parameters are specific to certain types (e.g., deciduous phenology). In conclusion, the relative importance of these parameters shifts significantly among PFTs and with climatic variables such as mean annual temperature.
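
    Once a cheap surrogate is available, first-order Sobol indices can be estimated directly from Monte Carlo samples of the surrogate; the sketch below uses a simple binning estimator of Var(E[Y|x_i])/Var(Y) on a toy surrogate, not the WIBCS/PC machinery of the study:

        import numpy as np

        def surrogate(x):
            # stand-in for a trained PC surrogate of one carbon-cycle output
            return 2.0*x[:, 0] + 0.5*x[:, 1]**2 + 0.1*x[:, 2]*x[:, 0]

        rng = np.random.default_rng(8)
        n, d = 20000, 5
        X = rng.uniform(0, 1, size=(n, d))
        Y = surrogate(X)

        def first_order_sobol(xi, y, bins=50):
            """Var_xi(E[Y | xi]) / Var(Y), estimated by binning on xi."""
            edges = np.quantile(xi, np.linspace(0, 1, bins + 1))
            idx = np.clip(np.digitize(xi, edges[1:-1]), 0, bins - 1)
            cond_means = np.array([y[idx == b].mean() for b in range(bins)])
            return cond_means.var() / y.var()

        for i in range(d):
            print("S_%d = %.3f" % (i, first_order_sobol(X[:, i], Y)))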

  13. The Impact of Parametric Uncertainties on Biogeochemistry in the E3SM Land Model

    DOE PAGES

    Ricciuto, Daniel; Sargsyan, Khachik; Thornton, Peter

    2018-02-27

    We conduct a global sensitivity analysis (GSA) of the Energy Exascale Earth System Model (E3SM) land model (ELM) to calculate the sensitivity of five key carbon cycle outputs to 68 model parameters. This GSA is conducted by first constructing a Polynomial Chaos (PC) surrogate via a new Weighted Iterative Bayesian Compressive Sensing (WIBCS) algorithm for adaptive basis growth, leading to a sparse, high-dimensional PC surrogate with 3,000 model evaluations. The PC surrogate allows efficient extraction of GSA information, leading to further dimensionality reduction. The GSA is performed at 96 FLUXNET sites covering multiple plant functional types (PFTs) and climate conditions. About 20 of the model parameters are identified as sensitive, with the rest being relatively insensitive across all outputs and PFTs. These sensitivities are dependent on PFT, and are relatively consistent among sites within the same PFT. The five model outputs have a majority of their highly sensitive parameters in common. A common subset of sensitive parameters is also shared among PFTs, but some parameters are specific to certain types (e.g., deciduous phenology). In conclusion, the relative importance of these parameters shifts significantly among PFTs and with climatic variables such as mean annual temperature.

  14. Accelerated iterative beam angle selection in IMRT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bangert, Mark, E-mail: m.bangert@dkfz.de; Unkelbach, Jan

    2016-03-15

    Purpose: Iterative methods for beam angle selection (BAS) for intensity-modulated radiation therapy (IMRT) planning sequentially construct a beneficial ensemble of beam directions. In a naïve implementation, the nth beam is selected by adding beam orientations one-by-one from a discrete set of candidates to an existing ensemble of (n − 1) beams. The best beam orientation is identified in a time consuming process by solving the fluence map optimization (FMO) problem for every candidate beam and selecting the beam that yields the largest improvement to the objective function value. This paper evaluates two alternative methods to accelerate iterative BAS based on surrogates for the FMO objective function value. Methods: We suggest to select candidate beams not based on the FMO objective function value after convergence but (1) based on the objective function value after five FMO iterations of a gradient based algorithm and (2) based on a projected gradient of the FMO problem in the first iteration. The performance of the objective function surrogates is evaluated based on the resulting objective function values and dose statistics in a treatment planning study comprising three intracranial, three pancreas, and three prostate cases. Furthermore, iterative BAS is evaluated for an application in which a small number of noncoplanar beams complement a set of coplanar beam orientations. This scenario is of practical interest as noncoplanar setups may require additional attention of the treatment personnel for every couch rotation. Results: Iterative BAS relying on objective function surrogates yields similar results compared to naïve BAS with regard to the objective function values and dose statistics. At the same time, early stopping of the FMO and using the projected gradient during the first iteration enable reductions in computation time by approximately one to two orders of magnitude. With regard to the clinical delivery of noncoplanar IMRT treatments, we could show that optimized beam ensembles using only a few noncoplanar beam orientations often approach the plan quality of fully noncoplanar ensembles. Conclusions: We conclude that iterative BAS in combination with objective function surrogates can be a viable option to implement automated BAS at clinically acceptable computation times.

  15. Accelerated iterative beam angle selection in IMRT.

    PubMed

    Bangert, Mark; Unkelbach, Jan

    2016-03-01

    Iterative methods for beam angle selection (BAS) for intensity-modulated radiation therapy (IMRT) planning sequentially construct a beneficial ensemble of beam directions. In a naïve implementation, the nth beam is selected by adding beam orientations one-by-one from a discrete set of candidates to an existing ensemble of (n - 1) beams. The best beam orientation is identified in a time consuming process by solving the fluence map optimization (FMO) problem for every candidate beam and selecting the beam that yields the largest improvement to the objective function value. This paper evaluates two alternative methods to accelerate iterative BAS based on surrogates for the FMO objective function value. We suggest to select candidate beams not based on the FMO objective function value after convergence but (1) based on the objective function value after five FMO iterations of a gradient based algorithm and (2) based on a projected gradient of the FMO problem in the first iteration. The performance of the objective function surrogates is evaluated based on the resulting objective function values and dose statistics in a treatment planning study comprising three intracranial, three pancreas, and three prostate cases. Furthermore, iterative BAS is evaluated for an application in which a small number of noncoplanar beams complement a set of coplanar beam orientations. This scenario is of practical interest as noncoplanar setups may require additional attention of the treatment personnel for every couch rotation. Iterative BAS relying on objective function surrogates yields similar results compared to naïve BAS with regard to the objective function values and dose statistics. At the same time, early stopping of the FMO and using the projected gradient during the first iteration enable reductions in computation time by approximately one to two orders of magnitude. With regard to the clinical delivery of noncoplanar IMRT treatments, we could show that optimized beam ensembles using only a few noncoplanar beam orientations often approach the plan quality of fully noncoplanar ensembles. We conclude that iterative BAS in combination with objective function surrogates can be a viable option to implement automated BAS at clinically acceptable computation times.
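
    The greedy loop itself is simple once a cheap surrogate score replaces the converged FMO objective. A schematic sketch follows; cheap_score is a hypothetical placeholder for "run a few fluence-optimization iterations and return the objective value", and the toy scoring rule here has no dosimetric meaning:

        import numpy as np

        rng = np.random.default_rng(9)
        candidate_beams = list(range(0, 360, 10))        # gantry angles, degrees

        def cheap_score(ensemble):
            """Hypothetical surrogate for the FMO objective after a few iterations.
            In practice this would run ~5 gradient steps of the fluence optimization."""
            angles = np.array(ensemble, dtype=float)
            spread = np.ptp(angles) if len(angles) > 1 else 0.0
            return -spread + rng.normal(scale=1e-3)      # toy stand-in only

        ensemble = []
        for _ in range(7):                               # select a 7-beam plan greedily
            remaining = [b for b in candidate_beams if b not in ensemble]
            best = min(remaining, key=lambda b: cheap_score(ensemble + [b]))
            ensemble.append(best)
        print(sorted(ensemble))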

  16. On a sparse pressure-flow rate condensation of rigid circulation models

    PubMed Central

    Schiavazzi, D. E.; Hsia, T. Y.; Marsden, A. L.

    2015-01-01

    Cardiovascular simulation has shown potential value in clinical decision-making, providing a framework to assess changes in hemodynamics produced by physiological and surgical alterations. State-of-the-art predictions are provided by deterministic multiscale numerical approaches coupling 3D finite element Navier–Stokes simulations to lumped parameter circulation models governed by ODEs. Development of next-generation stochastic multiscale models whose parameters can be learned from available clinical data under uncertainty constitutes a research challenge made more difficult by the high computational cost typically associated with the solution of these models. We present a methodology for constructing reduced representations that condense the behavior of 3D anatomical models using outlet pressure-flow polynomial surrogates, based on multiscale model solutions spanning several heart cycles. Relevance vector machine regression is compared with maximum likelihood estimation, showing that sparse pressure/flow rate approximations offer superior performance in producing working surrogate models to be included in lumped circulation networks. Sensitivities of outlet flow rates are also quantified through a Sobol’ decomposition of their total variance encoded in the orthogonal polynomial expansion. Finally, we show that augmented lumped parameter models including the proposed surrogates accurately reproduce the response of multiscale models they were derived from. In particular, results are presented for models of the coronary circulation with closed loop boundary conditions and the abdominal aorta with open loop boundary conditions. PMID:26671219

  17. Head circumference as a useful surrogate for intracranial volume in older adults.

    PubMed

    Hshieh, Tammy T; Fox, Meaghan L; Kosar, Cyrus M; Cavallari, Michele; Guttmann, Charles R G; Alsop, David; Marcantonio, Edward R; Schmitt, Eva M; Jones, Richard N; Inouye, Sharon K

    2016-01-01

    Intracranial volume (ICV) has been proposed as a measure of maximum lifetime brain size. Accurate ICV measures require neuroimaging which is not always feasible for epidemiologic investigations. We examined head circumference as a useful surrogate for ICV in older adults. 99 older adults underwent Magnetic Resonance Imaging (MRI). ICV was measured by Statistical Parametric Mapping 8 (SPM8) software or Functional MRI of the Brain Software Library (FSL) extraction with manual editing, typically considered the gold standard. Head circumferences were determined using standardized tape measurement. We examined estimated correlation coefficients between head circumference and the two MRI-based ICV measurements. Head circumference and ICV by SPM8 were moderately correlated (overall r = 0.73, men r = 0.67, women r = 0.63). Head circumference and ICV by FSL were also moderately correlated (overall r = 0.69, men r = 0.63, women r = 0.49). Head circumference measurement was strongly correlated with MRI-derived ICV. Our study presents a simple method to approximate ICV among older patients, which may prove useful as a surrogate for cognitive reserve in large scale epidemiologic studies of cognitive outcomes. This study also suggests the stability of head circumference correlation with ICV throughout the lifespan.
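
    The reported statistics are plain Pearson correlations between head circumference and MRI-derived ICV, computed overall and by sex. A minimal sketch on synthetic data:

        import numpy as np
        from scipy.stats import pearsonr

        rng = np.random.default_rng(10)
        n = 99
        sex = rng.integers(0, 2, n)                        # 0 = women, 1 = men (synthetic)
        icv = rng.normal(1400, 120, n) + 100*sex           # mL, MRI-derived (synthetic)
        head_circ = 0.02*icv + rng.normal(0, 1.2, n) + 27  # cm (synthetic)

        r_all, _ = pearsonr(head_circ, icv)
        r_men, _ = pearsonr(head_circ[sex == 1], icv[sex == 1])
        r_women, _ = pearsonr(head_circ[sex == 0], icv[sex == 0])
        print(round(r_all, 2), round(r_men, 2), round(r_women, 2))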

  18. A Statistical Approach for Judging Stability of Whole Mixture Chemical Composition over Time for Highly Complex Disinfection By-Product Mixtures from EPA's Four Lab Study

    EPA Science Inventory

    Chemical characterization of complex mixtures and assessment of stability over time of the characterized chemicals is crucial both to characterize exposure and to use data from one mixture as a surrogate for other similar mixtures. The chemical composition of test mixtures can va...

  19. Recurrence network measures for hypothesis testing using surrogate data: Application to black hole light curves

    NASA Astrophysics Data System (ADS)

    Jacob, Rinku; Harikrishnan, K. P.; Misra, R.; Ambika, G.

    2018-01-01

    Recurrence networks and the associated statistical measures have become important tools in the analysis of time series data. In this work, we test how effective the recurrence network measures are in analyzing real world data involving two main types of noise, white noise and colored noise. We use two prominent network measures as discriminating statistic for hypothesis testing using surrogate data for a specific null hypothesis that the data is derived from a linear stochastic process. We show that the characteristic path length is especially efficient as a discriminating measure with the conclusions reasonably accurate even with limited number of data points in the time series. We also highlight an additional advantage of the network approach in identifying the dimensionality of the system underlying the time series through a convergence measure derived from the probability distribution of the local clustering coefficients. As examples of real world data, we use the light curves from a prominent black hole system and show that a combined analysis using three primary network measures can provide vital information regarding the nature of temporal variability of light curves from different spectroscopic classes.
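
    A hedged sketch of the recurrence-network pipeline: delay-embed the series, threshold pairwise distances to form the network, compute the characteristic path length, and compare against phase-randomized surrogates. Embedding parameters, the recurrence threshold, and the test signal are illustrative choices, not those used for the black hole light curves:

        import numpy as np
        import networkx as nx

        def recurrence_network_cpl(x, dim=3, tau=2, eps=None):
            emb = np.column_stack([x[i*tau:len(x)-(dim-1-i)*tau] for i in range(dim)])
            d = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=-1)
            if eps is None:
                eps = np.quantile(d[d > 0], 0.1)          # ~10% recurrence rate
            A = (d < eps) & ~np.eye(len(emb), dtype=bool)
            G = nx.from_numpy_array(A.astype(int))
            giant = G.subgraph(max(nx.connected_components(G), key=len))
            return nx.average_shortest_path_length(giant)  # characteristic path length

        def phase_randomized(x, rng):
            f = np.fft.rfft(x)
            phases = rng.uniform(0, 2*np.pi, len(f))
            phases[0] = 0.0
            return np.fft.irfft(np.abs(f) * np.exp(1j*phases), n=len(x))

        rng = np.random.default_rng(11)
        t = np.arange(600)
        x = np.sin(0.07*t) + 0.3*rng.normal(size=len(t))   # stand-in for a light curve

        cpl_data = recurrence_network_cpl(x)
        cpl_surr = [recurrence_network_cpl(phase_randomized(x, rng)) for _ in range(5)]
        print(cpl_data, np.mean(cpl_surr), np.std(cpl_surr))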

  20. On Design Mining: Coevolution and Surrogate Models.

    PubMed

    Preen, Richard J; Bull, Larry

    2017-01-01

    Design mining is the use of computational intelligence techniques to iteratively search and model the attribute space of physical objects evaluated directly through rapid prototyping to meet given objectives. It enables the exploitation of novel materials and processes without formal models or complex simulation. In this article, we focus upon the coevolutionary nature of the design process when it is decomposed into concurrent sub-design-threads due to the overall complexity of the task. Using an abstract, tunable model of coevolution, we consider strategies to sample subthread designs for whole-system testing and how best to construct and use surrogate models within the coevolutionary scenario. Drawing on our findings, we then describe the effective design of an array of six heterogeneous vertical-axis wind turbines.

  1. Munitions and Explosives of Concern Survey Methodology and In-field Testing for Wind Energy Areas on the Atlantic Outer Continental Shelf

    NASA Astrophysics Data System (ADS)

    DuVal, C.; Carton, G.; Trembanis, A. C.; Edwards, M.; Miller, J. K.

    2017-12-01

    Munitions and explosives of concern (MEC) are present in U.S. waters as a result of past and ongoing live-fire testing and training, combat operations, and sea disposal. To identify MEC that may pose a risk to human safety during development of offshore wind facilities on the Atlantic Outer Continental Shelf (OCS), the Bureau of Ocean Energy Management (BOEM) is preparing to develop guidance on risk analysis and selection processes for methods and technologies to identify MEC in Wind Energy Areas (WEA). This study developed a process for selecting appropriate technologies and methodologies for MEC detection using a synthesis of historical research, physical site characterization, remote sensing technology review, and in-field trials. Personnel were tasked with seeding a portion of the Delaware WEA with munitions surrogates, while a second group of researchers not privy to the surrogate locations tested and optimized the selected methodology to find and identify the placed targets. This in-field trial, conducted in July 2016, emphasized the use of multiple sensors for MEC detection, and led to further guidance for future MEC detection efforts on the Atlantic OCS. An April 2017 follow-on study determined the fate of the munitions surrogates after the Atlantic storm season had passed. Using regional hydrodynamic models and incorporating the recommendations from the 2016 field trial, the follow-on study examined the fate of the MEC and compared the findings to existing research on munitions mobility, as well as models developed as part of the Office of Naval Research Mine-Burial Program. Focus was given to characterizing the influence of sediment type on surrogate munitions behavior and the influence of morphodynamics and object burial on MEC detection. In support of the Mine-Burial models, ripple bedforms were observed to impede surrogate scour and burial in coarse sediments, while surrogate burial was both predicted and observed in finer sediments. Further, incorporation of recommendations from the previous trial in the 2017 study led to a fourfold improvement of MEC detection rates over the 2016 approach. The use of modeling to characterize local morphodynamics, MEC burial or mobility, and the impact of seasonal or episodic storm events are discussed in light of technology selection and timing for future MEC detection surveys.

  2. Evaluating and interpreting cross-taxon congruence: Potential pitfalls and solutions

    NASA Astrophysics Data System (ADS)

    Gioria, Margherita; Bacaro, Giovanni; Feehan, John

    2011-05-01

    Characterizing the relationship between different taxonomic groups is critical to identify potential surrogates for biodiversity. Previous studies have shown that cross-taxa relationships are generally weak and/or inconsistent. The difficulties in finding predictive patterns have often been attributed to the spatial and temporal scales of these studies and on the differences in the measure used to evaluate such relationships (species richness versus composition). However, the choice of the analytical approach used to evaluate cross-taxon congruence inevitably represents a major source of variation. Here, we described the use of a range of methods that can be used to comprehensively assess cross-taxa relationships. To do so, we used data for two taxonomic groups, wetland plants and water beetles, collected from 54 farmland ponds in Ireland. Specifically, we used the Pearson correlation and rarefaction curves to analyse patterns in species richness, while Mantel tests, Procrustes analysis, and co-correspondence analysis were used to evaluate congruence in species composition. We compared the results of these analyses and we described some of the potential pitfalls associated with the use of each of these statistical approaches. Cross-taxon congruence was moderate to strong, depending on the choice of the analytical approach, on the nature of the response variable, and on local and environmental conditions. Our findings indicate that multiple approaches and measures of community structure are required for a comprehensive assessment of cross-taxa relationships. In particular, we showed that selection of surrogate taxa in conservation planning should not be based on a single statistic expressing the degree of correlation in species richness or composition. Potential solutions to the analytical issues associated with the assessment of cross-taxon congruence are provided and the implications of our findings in the selection of surrogates for biodiversity are discussed.
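
    Of the approaches compared above, the Mantel test is the most compact to illustrate: correlate the upper triangles of two site-by-site distance matrices and assess significance by permuting one matrix's rows and columns. The sketch below uses synthetic "community" distances for the 54 ponds, not the study's data:

        import numpy as np

        def mantel(d1, d2, n_perm=999, rng=None):
            """Permutation-based Mantel correlation between two distance matrices."""
            rng = rng if rng is not None else np.random.default_rng(0)
            iu = np.triu_indices_from(d1, k=1)
            r_obs = np.corrcoef(d1[iu], d2[iu])[0, 1]
            count = 0
            for _ in range(n_perm):
                p = rng.permutation(d1.shape[0])
                r_perm = np.corrcoef(d1[np.ix_(p, p)][iu], d2[iu])[0, 1]
                count += r_perm >= r_obs
            return r_obs, (count + 1) / (n_perm + 1)

        rng = np.random.default_rng(12)
        sites = rng.normal(size=(54, 2))                       # latent environmental gradient
        plants = np.linalg.norm(sites[:, None] - sites[None, :], axis=-1)
        beetles = plants + 0.3*np.abs(rng.normal(size=plants.shape))
        beetles = (beetles + beetles.T) / 2                    # keep it symmetric
        np.fill_diagonal(beetles, 0)
        print(mantel(plants, beetles))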

  3. Dry deposition of gaseous oxidized mercury in Western Maryland.

    PubMed

    Castro, Mark S; Moore, Chris; Sherwell, John; Brooks, Steve B

    2012-02-15

    The purpose of this study was to directly measure the dry deposition of gaseous oxidized mercury (GOM) in western Maryland. Annual estimates were made using passive ion-exchange surrogate surfaces and a resistance model. Surrogate surfaces were deployed for seventeen weekly sampling periods between September 2009 and October 2010. Dry deposition rates from surrogate surfaces ranged from 80 to 1512 pg m(-2) h(-1). GOM dry deposition rates were strongly correlated (r(2)=0.75) with the weekly average atmospheric GOM concentrations, which ranged from 2.3 to 34.1 pg m(-3). Dry deposition of GOM could be predicted from the ambient air concentrations of GOM using this equation: GOM dry deposition (pg m(-2) h(-1)) = 43.2 × GOM concentration (pg m(-3)) − 80.3. Dry deposition velocities computed using GOM concentrations and surrogate surface GOM dry deposition rates ranged from 0.2 to 1.7 cm s(-1). Modeled dry deposition rates were highly correlated (r(2)=0.80) with surrogate surface dry deposition rates. Using the overall weekly average surrogate surface dry deposition rate (369 ± 340 pg m(-2) h(-1)), we estimated an annual GOM dry deposition rate of 3.2 μg m(-2) year(-1). Using the resistance model, we estimated an annual GOM dry deposition rate of 3.5 μg m(-2) year(-1). Our annual GOM dry deposition rates were similar to the annual dry deposition (3.3 μg m(-2) year(-1)) of gaseous elemental mercury (GEM) at our site. In addition, annual GOM dry deposition was approximately 1/2 of the average annual wet deposition of total mercury (7.7 ± 1.9 μg m(-2) year(-1)) at our site. Total annual mercury deposition from dry deposition of GOM and GEM and wet deposition was approximately 14.4 μg m(-2) year(-1), which was similar to the average annual litterfall deposition (15 ± 2.1 μg m(-2) year(-1)) of mercury, which was also measured at our site. Copyright © 2012 Elsevier B.V. All rights reserved.
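
    A small arithmetic check of the reported empirical relation and the implied deposition velocity (flux divided by concentration), with unit conversions; the concentration value is illustrative but within the reported range:

        # GOM dry deposition flux (pg m^-2 h^-1) from ambient concentration (pg m^-3),
        # using the empirical relation reported above, and the implied deposition velocity.
        def gom_flux(concentration_pg_m3):
            return 43.2 * concentration_pg_m3 - 80.3

        conc = 10.0                                    # pg m^-3, within the reported range
        flux = gom_flux(conc)                          # pg m^-2 h^-1
        vd_cm_s = flux / conc / 3600.0 * 100.0         # (m h^-1) -> cm s^-1
        annual_ug_m2 = flux * 8760 / 1e6               # pg m^-2 h^-1 -> ug m^-2 yr^-1
        print(round(flux, 1), round(vd_cm_s, 2), round(annual_ug_m2, 2))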

  4. A Phase 1 Randomized, Blinded Comparison of the Pharmacokinetics and Colonic Distribution of Three Candidate Rectal Microbicide Formulations of Tenofovir 1% Gel with Simulated Unprotected Sex (CHARM-02)

    PubMed Central

    Hiruy, Hiwot; Fuchs, Edward J.; Marzinke, Mark A.; Bakshi, Rahul P.; Breakey, Jennifer C.; Aung, Wutyi S.; Manohar, Madhuri; Yue, Chen; Caffo, Brian S.; Du, Yong; Abebe, Kaleab Z.; Spiegel, Hans M.L.; Rohan, Lisa C.; McGowan, Ian

    2015-01-01

    Abstract CHARM-02 is a crossover, double-blind, randomized trial to compare the safety and pharmacokinetics of three rectally applied tenofovir 1% gel candidate rectal microbicides of varying osmolalities: vaginal formulation (VF) (3111 mOsmol/kg), the reduced glycerin vaginal formulation (RGVF) (836 mOsmol/kg), and an isoosmolal rectal-specific formulation (RF) (479 mOsmol/kg). Participants (n = 9) received a single, 4 ml, radiolabeled dose of each gel twice, once with and once without simulated unprotected receptive anal intercourse (RAI). The safety, plasma tenofovir pharmacokinetics, colonic small molecule permeability, and SPECT/CT imaging of lower gastrointestinal distribution of drug and virus surrogate were assessed. There were no Grade 3 or 4 adverse events reported for any of the products. Overall, there were more Grade 2 adverse events in the VF group compared to RF (p = 0.006) and RGVF (p = 0.048). In the absence of simulated unprotected RAI, VF had up to 3.8-fold greater systemic tenofovir exposure, 26- to 234-fold higher colonic permeability of the drug surrogate, and 1.5- to 2-fold greater proximal migration in the colonic lumen, when compared to RF and RGVF. Similar trends were observed with simulated unprotected RAI, but most did not reach statistical significance. SPECT analysis showed 86% (standard deviation 19%) of the drug surrogate colocalized with the virus surrogate in the colonic lumen. There were no significant differences between the RGVF and RF formulation, with the exception of a higher plasma tenofovir concentration of RGVF in the absence of simulated unprotected RAI. VF had the most adverse events, highest plasma tenofovir concentrations, greater mucosal permeability of the drug surrogate, and most proximal colonic luminal migration compared to RF and RGVF formulations. There were no major differences between RF and RGVF formulations. Simultaneous assessment of toxicity, systemic and luminal pharmacokinetics, and colocalization of drug and viral surrogates substantially informs rectal microbicide product development. PMID:26227279

  5. Statistical Metamodeling and Sequential Design of Computer Experiments to Model Glyco-Altered Gating of Sodium Channels in Cardiac Myocytes.

    PubMed

    Du, Dongping; Yang, Hui; Ednie, Andrew R; Bennett, Eric S

    2016-09-01

    Glycan structures account for up to 35% of the mass of cardiac sodium (Nav) channels. To investigate whether and how reduced sialylation affects Nav activity and cardiac electrical signaling, we conducted a series of in vitro experiments on ventricular apex myocytes under two different glycosylation conditions, reduced protein sialylation (ST3Gal4(-/-)) and full glycosylation (control). Although aberrant electrical signaling is observed under reduced sialylation, realizing a better understanding of mechanistic details of pathological variations in INa and AP is difficult without performing in silico studies. However, computer models of Nav channels and cardiac myocytes involve considerable complexity, e.g., a high-dimensional parameter space and nonlinear, nonconvex equations. Traditional linear and nonlinear optimization methods have encountered many difficulties for model calibration. This paper presents a new statistical metamodeling approach for efficient computer experiments and optimization of Nav models. First, we utilize a fractional factorial design to identify control variables from the large set of model parameters, thereby reducing the dimensionality of the parameter space. Further, we develop a Gaussian process model as a surrogate of expensive and time-consuming computer models and then identify the next best design point that yields the maximal probability of improvement. This process iterates until convergence, and the performance is evaluated and validated with real-world experimental data. Experimental results show that the proposed algorithm achieves superior performance in modeling the kinetics of Nav channels under a variety of glycosylation conditions. As a result, in silico models provide a better understanding of glyco-altered mechanistic details in state transitions and distributions of Nav channels. Notably, ST3Gal4(-/-) myocytes are shown to have higher probabilities accumulated in intermediate inactivation during repolarization and to yield a shorter refractory period than wild-type (WT) myocytes. The proposed statistical design of computer experiments is generally extensible to many other disciplines that involve large-scale and computationally expensive models.
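
    The sequential design loop described above (fit a Gaussian process surrogate, then pick the point maximizing the probability of improvement) can be sketched as follows; scikit-learn's Gaussian process regressor is used as a generic stand-in, and the two-parameter "expensive model" is a hypothetical placeholder for the Nav-channel calibration objective.

    ```python
    # Minimal sketch of GP-surrogate sequential design with a probability-of-
    # improvement criterion; the 2-D "expensive model" below is a stand-in for a
    # Nav-channel calibration-error computation (hypothetical).
    import numpy as np
    from scipy.stats import norm
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import Matern

    def expensive_model(x):                    # placeholder objective to minimise
        return np.sum((x - 0.3) ** 2, axis=-1)

    rng = np.random.default_rng(0)
    X = rng.random((8, 2))                     # initial design (e.g., from a factorial/LHS plan)
    y = expensive_model(X)

    for it in range(20):
        gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True).fit(X, y)
        cand = rng.random((2000, 2))           # candidate pool over the reduced parameter space
        mu, sd = gp.predict(cand, return_std=True)
        best = y.min()
        pi = norm.cdf((best - mu) / np.maximum(sd, 1e-12))   # probability of improvement
        x_next = cand[np.argmax(pi)]
        X = np.vstack([X, x_next])
        y = np.append(y, expensive_model(x_next))

    print("best design point:", X[np.argmin(y)], "objective:", y.min())
    ```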

  6. Airfoil Shape Optimization based on Surrogate Model

    NASA Astrophysics Data System (ADS)

    Mukesh, R.; Lingadurai, K.; Selvakumar, U.

    2018-02-01

    Engineering design problems typically require enormous amounts of experiments and computational simulations in order to assess and satisfy the design objectives subject to various constraints. In most cases, the computational resources and time required per simulation are large. In cases such as sensitivity analysis and design optimisation, where thousands or even millions of simulations have to be carried out, this becomes a severe burden for designers. Approximation models, also known as surrogate models (SM), are therefore widely employed to reduce the computational resources and time needed to analyse various engineering systems. Approaches such as Kriging, neural networks, polynomials, and Gaussian processes are used to construct the approximation models. The primary intention of this work is to employ the k-fold cross-validation approach to study and evaluate the influence of various theoretical variogram models on the accuracy of the surrogate model construction. Ordinary Kriging and design of experiments (DOE) approaches are used to construct the SMs by approximating panel and viscous solution algorithms, which are primarily used to solve the flow around airfoils and aircraft wings. The method of coupling the SMs with a suitable optimisation scheme to carry out an aerodynamic design optimisation process for airfoil shapes is also discussed.
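
    A hedged sketch of the k-fold cross-validation idea: compare several correlation models for a Kriging-type surrogate by their out-of-fold error. Here scikit-learn Gaussian-process kernels stand in for the theoretical variogram models, and a toy function stands in for the panel/viscous flow solver.

    ```python
    # Hedged sketch: k-fold cross-validation to compare correlation models for an
    # ordinary-Kriging-style surrogate. scikit-learn GP kernels stand in for the
    # theoretical variogram models, and the "aero solver" is a toy function.
    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF, Matern, RationalQuadratic
    from sklearn.model_selection import KFold

    def aero_solver(x):                         # hypothetical stand-in for a panel/viscous code
        return np.sin(3 * x[:, 0]) + 0.5 * x[:, 1] ** 2

    rng = np.random.default_rng(2)
    X = rng.random((60, 2))                     # DOE sample of airfoil shape parameters
    y = aero_solver(X)

    kernels = {"gaussian": RBF(), "matern52": Matern(nu=2.5), "rq": RationalQuadratic()}
    for name, kern in kernels.items():
        errs = []
        for tr, te in KFold(n_splits=5, shuffle=True, random_state=0).split(X):
            gp = GaussianProcessRegressor(kernel=kern, normalize_y=True).fit(X[tr], y[tr])
            errs.append(np.sqrt(np.mean((gp.predict(X[te]) - y[te]) ** 2)))
        print(f"{name:10s} 5-fold RMSE = {np.mean(errs):.4f}")
    ```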

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rottmann, Joerg; Berbeco, Ross

    Purpose: Precise prediction of respiratory motion is a prerequisite for real-time motion compensation techniques such as beam, dynamic couch, or dynamic multileaf collimator tracking. Collection of tumor motion data to train the prediction model is required for most algorithms. To avoid exposure of patients to additional dose from imaging during this procedure, the feasibility of training a linear respiratory motion prediction model with an external surrogate signal is investigated and its performance benchmarked against training the model with tumor positions directly. Methods: The authors implement a lung tumor motion prediction algorithm based on linear ridge regression that is suitable to overcome system latencies up to about 300 ms. Its performance is investigated on a data set of 91 patient breathing trajectories recorded from fiducial marker tracking during radiotherapy delivery to the lung of ten patients. The expected 3D geometric error is quantified as a function of predictor lookahead time, signal sampling frequency and history vector length. Additionally, adaptive model retraining is evaluated, i.e., repeatedly updating the prediction model after initial training. Training length for this is gradually increased with incoming (internal) data availability. To assess practical feasibility, model calculation times as well as various minimum data lengths for retraining are evaluated. Relative performance of model training with external surrogate motion data versus tumor motion data is evaluated. However, an internal–external motion correlation model is not utilized, i.e., prediction is solely driven by internal motion in both cases. Results: Similar prediction performance was achieved for training the model with external surrogate data versus internal (tumor motion) data. Adaptive model retraining can substantially boost performance in the case of external surrogate training while it has little impact for training with internal motion data. A minimum adaptive retraining data length of 8 s and history vector length of 3 s achieve maximal performance. Sampling frequency appears to have little impact on performance, confirming previously published work. By using the linear predictor, a relative geometric 3D error reduction of about 50% was achieved (using adaptive retraining, a history vector length of 3 s and with results averaged over all investigated lookahead times and signal sampling frequencies). The absolute mean error could be reduced from (2.0 ± 1.6) mm when using no prediction at all to (0.9 ± 0.8) mm and (1.0 ± 0.9) mm when using the predictor trained with internal tumor motion training data and external surrogate motion training data, respectively (for a typical lookahead time of 250 ms and sampling frequency of 15 Hz). Conclusions: A linear prediction model can reduce latency-induced tracking errors by an average of about 50% in real-time image guided radiotherapy systems with system latencies of up to 300 ms. Training a linear model for lung tumor motion prediction with an external surrogate signal alone is feasible and results in similar performance as training with (internal) tumor motion. Particularly for scenarios where motion data are extracted from fluoroscopic imaging with ionizing radiation, this may alleviate the need for additional imaging dose during the collection of model training data.
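
    A minimal sketch of such a latency-compensating linear predictor: ridge regression from a history vector of recent samples to the position one lookahead interval ahead. The breathing trace is synthetic, the 3 s history, 250 ms lookahead and 15 Hz sampling mirror the figures quoted above, and adaptive retraining is omitted.

    ```python
    # Sketch of a linear (ridge) respiratory-motion predictor: map a history vector
    # of recent samples to the position "lookahead" samples in the future.
    # The breathing trace is synthetic; parameters are illustrative only.
    import numpy as np
    from sklearn.linear_model import Ridge

    fs = 15.0                                   # sampling frequency [Hz]
    lookahead = int(round(0.25 * fs))           # ~250 ms latency to bridge
    hist_len = int(round(3.0 * fs))             # 3 s history vector

    t = np.arange(0, 120, 1 / fs)
    signal = 10 * np.sin(2 * np.pi * 0.25 * t) + 0.5 * np.random.default_rng(3).normal(size=t.size)

    def make_dataset(sig):
        X, y = [], []
        for i in range(hist_len, sig.size - lookahead):
            X.append(sig[i - hist_len:i])       # history vector
            y.append(sig[i + lookahead])        # future position
        return np.array(X), np.array(y)

    X, y = make_dataset(signal)
    split = X.shape[0] // 2                     # train on the first half (could be surrogate data)
    model = Ridge(alpha=1.0).fit(X[:split], y[:split])
    err = np.abs(model.predict(X[split:]) - y[split:])
    no_pred = np.abs(signal[hist_len + lookahead:] - signal[hist_len:-lookahead])[split:]
    print(f"mean absolute error: {err.mean():.2f} mm (no prediction: {no_pred.mean():.2f} mm)")
    ```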

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nasehi Tehrani, J; Wang, J; McEwan, A

    Purpose: In this study, we developed and evaluated a method for predicting lung surface deformation vector fields (SDVFs) based on surrogate signals such as chest and abdomen motion at selected locations and spirometry measurements. Methods: A patient-specific 3D triangular surface mesh of the lung region at the end-expiration (EE) phase was obtained by a threshold-based segmentation method. For each patient, a spirometer recorded the flow-volume changes of the lungs, and 192 points selected on a regular 2 cm × 2 cm grid over a total area of 34 cm × 24 cm on the surface of the chest and abdomen were used to detect chest wall motion. Preprocessing techniques such as QR factorization with column pivoting (QRCP) were employed to remove redundant observations of the chest and abdominal area. To create a statistical model between the lung surface and the corresponding surrogate signals, we developed a predictive model based on canonical ridge regression (CRR). Two unique weighting vectors were selected for each vertex on the surface of the lung, and they were optimized during the training process using all phases of the 4D-CT except the end-inspiration (EI) phase. These parameters were employed to predict the vertex locations of a testing data set, which was the EI phase of the 4D-CT. Results: For ten lung cancer patients, the deformation vector field of each vertex of the lung surface mesh was estimated from the external motion at selected positions on the chest wall surface plus spirometry measurements. The average 98th percentile of the estimation error was less than 1 mm (AP = 0.85, RL = 0.61, and SI = 0.82 mm). Conclusion: The developed predictive model provides a non-invasive approach to derive the lung boundary condition. Together with personalized biomechanical respiration modelling, the proposed model can be used to accurately derive lung tumor motion during radiation therapy from non-invasive measurements.
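
    A hedged, much-simplified sketch of the two preprocessing/regression ingredients named above: QR factorization with column pivoting to drop redundant external channels, followed by an ordinary ridge regression (standing in for the canonical ridge regression) from the retained channels plus spirometry to one vertex displacement. All data and dimensions are simulated.

    ```python
    # Hedged sketch: QRCP to drop redundant external surrogate channels, then a
    # ridge regression from the retained channels (plus spirometry) to one
    # lung-surface vertex displacement. All data below are simulated.
    import numpy as np
    from scipy.linalg import qr
    from sklearn.linear_model import Ridge

    rng = np.random.default_rng(4)
    n_phases, n_channels = 9, 192               # 4D-CT phases x chest/abdomen points
    latent = rng.normal(size=(n_phases, 3))     # a few underlying motion modes
    S = latent @ rng.normal(size=(3, n_channels)) + 0.01 * rng.normal(size=(n_phases, n_channels))
    spiro = latent[:, :1] + 0.01 * rng.normal(size=(n_phases, 1))
    vertex_si = S[:, 0] * 0.8 + spiro[:, 0] * 2.0   # "true" SI displacement of one vertex

    # QRCP on the (phases x channels) matrix ranks channels by how much new
    # information each adds; keep the first k pivots.
    k = 3
    _, _, piv = qr(S, pivoting=True, mode="economic")
    keep = piv[:k]

    X = np.hstack([S[:, keep], spiro])
    model = Ridge(alpha=1e-3).fit(X, vertex_si)
    rmse = np.sqrt(np.mean((model.predict(X) - vertex_si) ** 2))
    print("selected channels:", keep, " training RMSE:", rmse)
    ```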

  9. Impact of copula directional specification on multi-trial evaluation of surrogate endpoints

    PubMed Central

    Renfro, Lindsay A.; Shang, Hongwei; Sargent, Daniel J.

    2014-01-01

    Evaluation of surrogate endpoints using patient-level data from multiple trials is the gold standard, where multi-trial copula models are used to quantify both patient-level and trial-level surrogacy. While limited consideration has been given in the literature to copula choice (e.g., Clayton), no prior consideration has been given to direction of implementation (via survival versus distribution functions). We demonstrate that even with the “correct” copula family, directional misspecification leads to biased estimates of patient-level and trial-level surrogacy. We illustrate with a simulation study and a re-analysis of disease-free survival as a surrogate for overall survival in early-stage colon cancer. PMID:24905465
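
    A small numerical illustration of why direction matters: the same Clayton copula applied to survival functions versus distribution functions implies different joint survival probabilities. The Weibull margins and copula parameter are arbitrary choices for illustration, not values from the paper.

    ```python
    # Illustration (with arbitrary Weibull margins and theta) that the same Clayton
    # copula applied to survival functions vs. distribution functions implies
    # different joint survival probabilities, i.e., direction matters.
    import numpy as np

    def clayton(u, v, theta=2.0):
        return (u ** (-theta) + v ** (-theta) - 1.0) ** (-1.0 / theta)

    def weib_cdf(t, shape, scale):
        return 1.0 - np.exp(-(t / scale) ** shape)

    s, t = 2.0, 3.0                                   # evaluation point (years)
    Fs, Ft = weib_cdf(s, 1.2, 4.0), weib_cdf(t, 0.9, 6.0)
    Ss, St = 1.0 - Fs, 1.0 - Ft

    # Survival-function direction: P(S>s, T>t) = C(Ss, St)
    p_surv_dir = clayton(Ss, St)
    # Distribution-function direction: P(S>s, T>t) = 1 - Fs - Ft + C(Fs, Ft)
    p_cdf_dir = 1.0 - Fs - Ft + clayton(Fs, Ft)

    print(f"P(S>{s}, T>{t}) survival direction:      {p_surv_dir:.4f}")
    print(f"P(S>{s}, T>{t}) distribution direction:  {p_cdf_dir:.4f}")
    ```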

  10. An integrated simulation and optimization approach for managing human health risks of atmospheric pollutants by coal-fired power plants.

    PubMed

    Dai, C; Cai, X H; Cai, Y P; Guo, H C; Sun, W; Tan, Q; Huang, G H

    2014-06-01

    This research developed a simulation-aided nonlinear programming model (SNPM). This model incorporated pollutant dispersion modeling, the management of coal blending, and the related human health risks within a general modeling framework. In SNPM, the simulation effort (i.e., California puff [CALPUFF]) was used to forecast the fate of air pollutants for quantifying the health risk under various conditions, while the optimization component was used to identify the optimal coal blending strategies from a number of alternatives. To solve the model, a surrogate-based indirect search approach was proposed, where support vector regression (SVR) was used to create a set of easy-to-use and rapid-response surrogates for identifying the functional relationships between coal-blending operating conditions and health risks. By replacing CALPUFF and the corresponding hazard quotient equation with the surrogates, the computational efficiency could be improved. The developed SNPM was applied to minimize the human health risk associated with air pollutants discharged from Gaojing and Shijingshan power plants in the west of Beijing. Solution results indicated that it could be used for reducing the health risk of the public in the vicinity of the two power plants, identifying desired coal blending strategies for decision makers, and considering a proper balance between coal purchase cost and human health risk. A simulation-aided nonlinear programming model (SNPM) is developed. It integrates the advantages of CALPUFF and a nonlinear programming model. To solve the model, a surrogate-based indirect search approach based on the combination of support vector regression and a genetic algorithm is proposed. SNPM is applied to reduce the health risk caused by air pollutants discharged from Gaojing and Shijingshan power plants in the west of Beijing. Solution results indicate that it is useful for generating coal blending schemes, reducing the health risk of the public, and reflecting the trade-off between coal purchase cost and health risk.
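
    A hedged sketch of the surrogate-based indirect search idea: fit a support vector regression surrogate to (operating condition, health risk) pairs produced by the expensive simulation, then search over the cheap surrogate. A toy function stands in for the CALPUFF/hazard-quotient step, and scipy's differential evolution stands in for the genetic algorithm.

    ```python
    # Hedged sketch of surrogate-based indirect search: fit an SVR surrogate to
    # (coal-blending condition -> health risk) pairs produced by a dispersion model,
    # then search over the cheap surrogate. The "dispersion model" is a toy stand-in
    # and scipy's differential evolution stands in for the genetic algorithm.
    import numpy as np
    from sklearn.svm import SVR
    from sklearn.preprocessing import StandardScaler
    from sklearn.pipeline import make_pipeline
    from scipy.optimize import differential_evolution

    def dispersion_model(x):        # placeholder for CALPUFF + hazard-quotient step
        return (x[..., 0] - 0.4) ** 2 + 0.5 * np.abs(x[..., 1] - 0.7) + 0.1 * x[..., 0] * x[..., 1]

    rng = np.random.default_rng(5)
    X = rng.random((200, 2))                      # sampled blending conditions
    y = dispersion_model(X)                       # "expensive" risk evaluations

    surrogate = make_pipeline(StandardScaler(), SVR(C=10.0, epsilon=0.01)).fit(X, y)

    res = differential_evolution(lambda x: float(surrogate.predict(x.reshape(1, -1))[0]),
                                 bounds=[(0, 1), (0, 1)], seed=0)
    print("surrogate-optimal blending condition:", res.x, "predicted risk:", res.fun)
    ```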

  11. Surrogate measures and consistent surrogates

    PubMed Central

    VanderWeele, Tyler J.

    2014-01-01

    Surrogates that allow one to predict the effect of the treatment on the outcome of interest from the effect of the treatment on the surrogate are of importance when it is difficult or expensive to measure the primary outcome. Unfortunately, the use of such surrogates can give rise to paradoxical situations in which the effect of the treatment on the surrogate is positive, the surrogate and outcome are strongly positively correlated, but the effect of the treatment on the outcome is negative, a phenomenon sometimes referred to as the "surrogate paradox." New results are given for consistent surrogates that extend the existing literature on sufficient conditions that ensure the surrogate paradox is not manifest. Specifically, it is shown that for the surrogate paradox to be manifest, it must be the case that either there is (i) a direct effect of treatment on the outcome not through the surrogate and in the opposite direction to that through the surrogate, or (ii) confounding for the effect of the surrogate on the outcome, or (iii) a lack of transitivity so that treatment does not positively affect the surrogate for all the same individuals for which the surrogate positively affects the outcome. The conditions for consistent surrogates and the results of the paper are important because they allow investigators to predict the direction of the effect of the treatment on the outcome simply from the direction of the effect of the treatment on the surrogate. These results on consistent surrogates are then related to the four approaches to surrogate outcomes described by Joffe and Greene (2009, Biometrics 65, 530–538) to assess whether the standard criterion used by these approaches to assess whether a surrogate is "good" suffices to avoid the surrogate paradox. PMID:24073861

  12. Imputation of a true endpoint from a surrogate: application to a cluster randomized controlled trial with partial information on the true endpoint.

    PubMed

    Nixon, Richard M; Duffy, Stephen W; Fender, Guy R K

    2003-09-24

    The Anglia Menorrhagia Education Study (AMES) is a randomized controlled trial testing the effectiveness of an education package applied to general practices. Binary data are available from two sources: general practitioner-reported referrals to hospital, and referrals to hospital determined by independent audit of the general practices. The former may be regarded as a surrogate for the latter, which is regarded as the true endpoint. Data are only available for the true endpoint on a subset of the practices, but there are surrogate data for almost all of the audited practices and for most of the remaining practices. The aim of this paper was to estimate the treatment effect using data from every practice in the study. Where the true endpoint was not available, it was estimated by three approaches: a regression method, multiple imputation, and a full likelihood model. Including the surrogate data in the analysis yielded an estimate of the treatment effect that was more precise than an estimate gained from using the true endpoint data alone. The full likelihood method provides a new imputation tool at the disposal of trials with surrogate data.

  13. Atomic Radius and Charge Parameter Uncertainty in Biomolecular Solvation Energy Calculations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yang, Xiu; Lei, Huan; Gao, Peiyuan

    Atomic radii and charges are two major parameters used in implicit solvent electrostatics and energy calculations. The optimization problem for charges and radii is under-determined, leading to uncertainty in the values of these parameters and in the results of solvation energy calculations using these parameters. This paper presents a method for quantifying this uncertainty in solvation energies using surrogate models based on generalized polynomial chaos (gPC) expansions. There are relatively few atom types used to specify radii parameters in implicit solvation calculations; therefore, surrogate models for these low-dimensional spaces could be constructed using least-squares fitting. However, there are many more types of atomic charges; therefore, construction of surrogate models for the charge parameter space required compressed sensing combined with an iterative rotation method to enhance problem sparsity. We present results for the uncertainty in small molecule solvation energies based on these approaches. Additionally, we explore the correlation between uncertainties due to radii and charges which motivates the need for future work in uncertainty quantification methods for high-dimensional parameter spaces.
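
    A minimal sketch of the low-dimensional (radii-like) case: a gPC surrogate fit by least squares using tensor-product Hermite polynomials in standard-normal variables, with a toy function standing in for the solvation-energy model. For the high-dimensional charge space, a sparse solver (e.g., Lasso as a stand-in for the compressed-sensing step) would replace the ordinary least-squares solve.

    ```python
    # Minimal sketch of a generalized polynomial chaos (gPC) surrogate fit by least
    # squares in a low-dimensional parameter space. Hermite polynomials in
    # standard-normal variables are used; the "solvation model" is a toy stand-in.
    import numpy as np
    from numpy.polynomial.hermite_e import hermeval
    from itertools import product

    def solvation_model(xi):                      # hypothetical expensive model
        return np.exp(0.3 * xi[:, 0]) + 0.5 * xi[:, 1] ** 2

    order, dim = 3, 2
    multi_idx = [m for m in product(range(order + 1), repeat=dim) if sum(m) <= order]

    def basis(xi):
        # Tensor-product probabilists' Hermite polynomials He_m(xi)
        cols = []
        for m in multi_idx:
            col = np.ones(xi.shape[0])
            for d, deg in enumerate(m):
                c = np.zeros(deg + 1); c[deg] = 1.0
                col *= hermeval(xi[:, d], c)
            cols.append(col)
        return np.column_stack(cols)

    rng = np.random.default_rng(6)
    xi_train = rng.normal(size=(200, dim))
    coeffs, *_ = np.linalg.lstsq(basis(xi_train), solvation_model(xi_train), rcond=None)

    xi_test = rng.normal(size=(1000, dim))
    err = np.abs(basis(xi_test) @ coeffs - solvation_model(xi_test))
    print(f"gPC mean estimate: {coeffs[0]:.3f}, max abs test error: {err.max():.3e}")
    ```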

  14. Scalability of surrogate-assisted multi-objective optimization of antenna structures exploiting variable-fidelity electromagnetic simulation models

    NASA Astrophysics Data System (ADS)

    Koziel, Slawomir; Bekasiewicz, Adrian

    2016-10-01

    Multi-objective optimization of antenna structures is a challenging task owing to the high computational cost of evaluating the design objectives as well as the large number of adjustable parameters. Design speed-up can be achieved by means of surrogate-based optimization techniques. In particular, a combination of variable-fidelity electromagnetic (EM) simulations, design space reduction techniques, response surface approximation models and design refinement methods permits identification of the Pareto-optimal set of designs within a reasonable timeframe. Here, a study concerning the scalability of surrogate-assisted multi-objective antenna design is carried out based on a set of benchmark problems, with the dimensionality of the design space ranging from six to 24 and a CPU cost of the EM antenna model from 10 to 20 min per simulation. Numerical results indicate that the computational overhead of the design process increases more or less quadratically with the number of adjustable geometric parameters of the antenna structure at hand, which is a promising result from the point of view of handling even more complex problems.

  15. Hamiltonian Monte Carlo acceleration using surrogate functions with random bases.

    PubMed

    Zhang, Cheng; Shahbaba, Babak; Zhao, Hongkai

    2017-11-01

    For big data analysis, the high computational cost of Bayesian methods often limits their application in practice. In recent years, there have been many attempts to improve the computational efficiency of Bayesian inference. Here we propose an efficient and scalable computational technique for a state-of-the-art Markov chain Monte Carlo method, namely Hamiltonian Monte Carlo. The key idea is to explore and exploit the structure and regularity in parameter space for the underlying probabilistic model to construct an effective approximation of its geometric properties. To this end, we build a surrogate function to approximate the target distribution using properly chosen random bases and an efficient optimization process. The resulting method provides a flexible, scalable, and efficient sampling algorithm, which converges to the correct target distribution. We show that by choosing the basis functions and optimization process differently, our method can be related to other approaches for the construction of surrogate functions such as generalized additive models or Gaussian process models. Experiments based on simulated and real data show that our approach leads to substantially more efficient sampling algorithms compared to existing state-of-the-art methods.

  16. Role of pseudo-turbulent stresses in shocked particle clouds and construction of surrogate models for closure

    NASA Astrophysics Data System (ADS)

    Sen, O.; Gaul, N. J.; Davis, S.; Choi, K. K.; Jacobs, G.; Udaykumar, H. S.

    2018-05-01

    Macroscale models of shock-particle interactions require closure terms for unresolved solid-fluid momentum and energy transfer. These comprise the effects of mean as well as fluctuating fluid-phase velocity fields in the particle cloud. Mean drag and Reynolds stress equivalent terms (also known as pseudo-turbulent terms) appear in the macroscale equations. Closure laws for the pseudo-turbulent terms are constructed in this work from ensembles of high-fidelity mesoscale simulations. The computations are performed over a wide range of Mach numbers ( M) and particle volume fractions (φ ) and are used to explicitly compute the pseudo-turbulent stresses from the Favre average of the velocity fluctuations in the flow field. The computed stresses are then used as inputs to a Modified Bayesian Kriging method to generate surrogate models. The surrogates can be used as closure models for the pseudo-turbulent terms in macroscale computations of shock-particle interactions. It is found that the kinetic energy associated with the velocity fluctuations is comparable to that of the mean flow—especially for increasing M and φ . This work is a first attempt to quantify and evaluate the effect of velocity fluctuations for problems of shock-particle interactions.
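
    A hedged sketch of the closure-surrogate idea: a Gaussian-process (Kriging-type) regression mapping Mach number and particle volume fraction to a normalized pseudo-turbulent stress. Ordinary GP regression stands in for the Modified Bayesian Kriging used in the paper, and the training data are simulated rather than mesoscale results.

    ```python
    # Hedged sketch: a GP (Kriging-type) surrogate mapping Mach number and particle
    # volume fraction to a normalized pseudo-turbulent stress; data are synthetic.
    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import Matern, WhiteKernel

    rng = np.random.default_rng(7)
    M = rng.uniform(1.2, 3.0, 40)                # shock Mach numbers
    phi = rng.uniform(0.05, 0.40, 40)            # particle volume fractions
    X = np.column_stack([M, phi])

    # Surrogate target: pseudo-turbulent stress normalized by mean-flow kinetic
    # energy (synthetic trend: grows with both M and phi)
    R_pt = 0.1 * (M - 1.0) * phi / 0.4 + 0.01 * rng.normal(size=40)

    gp = GaussianProcessRegressor(Matern(nu=2.5) + WhiteKernel(), normalize_y=True).fit(X, R_pt)
    mu, sd = gp.predict(np.array([[2.0, 0.2]]), return_std=True)
    print(f"R_pt(M=2.0, phi=0.2) ~ {mu[0]:.3f} +/- {sd[0]:.3f}")
    ```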

  17. Role of pseudo-turbulent stresses in shocked particle clouds and construction of surrogate models for closure

    NASA Astrophysics Data System (ADS)

    Sen, O.; Gaul, N. J.; Davis, S.; Choi, K. K.; Jacobs, G.; Udaykumar, H. S.

    2018-02-01

    Macroscale models of shock-particle interactions require closure terms for unresolved solid-fluid momentum and energy transfer. These comprise the effects of mean as well as fluctuating fluid-phase velocity fields in the particle cloud. Mean drag and Reynolds stress equivalent terms (also known as pseudo-turbulent terms) appear in the macroscale equations. Closure laws for the pseudo-turbulent terms are constructed in this work from ensembles of high-fidelity mesoscale simulations. The computations are performed over a wide range of Mach numbers (M) and particle volume fractions (φ ) and are used to explicitly compute the pseudo-turbulent stresses from the Favre average of the velocity fluctuations in the flow field. The computed stresses are then used as inputs to a Modified Bayesian Kriging method to generate surrogate models. The surrogates can be used as closure models for the pseudo-turbulent terms in macroscale computations of shock-particle interactions. It is found that the kinetic energy associated with the velocity fluctuations is comparable to that of the mean flow—especially for increasing M and φ . This work is a first attempt to quantify and evaluate the effect of velocity fluctuations for problems of shock-particle interactions.

  18. Local-metrics error-based Shepard interpolation as surrogate for highly non-linear material models in high dimensions

    NASA Astrophysics Data System (ADS)

    Lorenzi, Juan M.; Stecher, Thomas; Reuter, Karsten; Matera, Sebastian

    2017-10-01

    Many problems in computational materials science and chemistry require the evaluation of expensive functions with locally rapid changes, such as the turn-over frequency of first principles kinetic Monte Carlo models for heterogeneous catalysis. Because of the high computational cost, it is often desirable to replace the original with a surrogate model, e.g., for use in coupled multiscale simulations. The construction of surrogates becomes particularly challenging in high-dimensions. Here, we present a novel version of the modified Shepard interpolation method which can overcome the curse of dimensionality for such functions to give faithful reconstructions even from very modest numbers of function evaluations. The introduction of local metrics allows us to take advantage of the fact that, on a local scale, rapid variation often occurs only across a small number of directions. Furthermore, we use local error estimates to weigh different local approximations, which helps avoid artificial oscillations. Finally, we test our approach on a number of challenging analytic functions as well as a realistic kinetic Monte Carlo model. Our method not only outperforms existing isotropic metric Shepard methods but also state-of-the-art Gaussian process regression.
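
    A much-simplified sketch of the Shepard idea: blend local linear models (gradient-based expansions around each node) with inverse-distance weights. The anisotropic "local metric" below is a single hand-chosen diagonal scaling and the error-based weighting is omitted, so this only illustrates the basic construction, not the paper's method.

    ```python
    # Simplified sketch of modified-Shepard-style interpolation: blend local linear
    # models with inverse-distance weights. The per-node anisotropic "local metric"
    # is reduced to one hand-chosen diagonal scaling; error-based weighting omitted.
    import numpy as np

    def f(x):                                     # toy target with locally rapid change
        return np.tanh(5 * x[..., 0]) + 0.1 * x[..., 1]

    rng = np.random.default_rng(8)
    nodes = rng.uniform(-1, 1, (100, 2))
    vals = f(nodes)
    eps = 1e-4
    grads = np.stack([(f(nodes + eps * e) - vals) / eps        # finite-difference gradients
                      for e in np.eye(2)], axis=1)             # shape (n, 2)

    metric = np.array([5.0, 1.0])                 # stretch distances along the fast direction

    def shepard(x, p=4):
        d2 = np.sum(((x - nodes) * metric) ** 2, axis=1)
        w = 1.0 / (d2 ** (p / 2) + 1e-12)
        local = vals + np.sum(grads * (x - nodes), axis=1)     # local linear models
        return np.sum(w * local) / np.sum(w)

    x_test = np.array([0.05, 0.3])
    print(f"interpolated {shepard(x_test):.4f}  vs  exact {f(x_test):.4f}")
    ```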

  19. Local-metrics error-based Shepard interpolation as surrogate for highly non-linear material models in high dimensions.

    PubMed

    Lorenzi, Juan M; Stecher, Thomas; Reuter, Karsten; Matera, Sebastian

    2017-10-28

    Many problems in computational materials science and chemistry require the evaluation of expensive functions with locally rapid changes, such as the turn-over frequency of first principles kinetic Monte Carlo models for heterogeneous catalysis. Because of the high computational cost, it is often desirable to replace the original with a surrogate model, e.g., for use in coupled multiscale simulations. The construction of surrogates becomes particularly challenging in high-dimensions. Here, we present a novel version of the modified Shepard interpolation method which can overcome the curse of dimensionality for such functions to give faithful reconstructions even from very modest numbers of function evaluations. The introduction of local metrics allows us to take advantage of the fact that, on a local scale, rapid variation often occurs only across a small number of directions. Furthermore, we use local error estimates to weigh different local approximations, which helps avoid artificial oscillations. Finally, we test our approach on a number of challenging analytic functions as well as a realistic kinetic Monte Carlo model. Our method not only outperforms existing isotropic metric Shepard methods but also state-of-the-art Gaussian process regression.

  20. A Comparison of Turbidity-Based and Streamflow-Based Estimates of Suspended-Sediment Concentrations in Three Chesapeake Bay Tributaries

    USGS Publications Warehouse

    Jastram, John D.; Moyer, Douglas; Hyer, Kenneth

    2009-01-01

    Fluvial transport of sediment into the Chesapeake Bay estuary is a persistent water-quality issue with major implications for the overall health of the bay ecosystem. Accurately and precisely estimating the suspended-sediment concentrations (SSC) and loads that are delivered to the bay, however, remains challenging. Although manual sampling of SSC produces an accurate series of point-in-time measurements, robust extrapolation to unmeasured periods (especially highflow periods) has proven to be difficult. Sediment concentrations typically have been estimated using regression relations between individual SSC values and associated streamflow values; however, suspended-sediment transport during storm events is extremely variable, and it is often difficult to relate a unique SSC to a given streamflow. With this limitation for estimating SSC, innovative approaches for generating detailed records of suspended-sediment transport are needed. One effective method for improved suspended-sediment determination involves the continuous monitoring of turbidity as a surrogate for SSC. Turbidity measurements are theoretically well correlated to SSC because turbidity represents a measure of water clarity that is directly influenced by suspended sediments; thus, turbidity-based estimation models typically are effective tools for generating SSC data. The U.S. Geological Survey, in cooperation with the U.S. Environmental Protection Agency Chesapeake Bay Program and Virginia Department of Environmental Quality, initiated continuous turbidity monitoring on three major tributaries of the bay - the James, Rappahannock, and North Fork Shenandoah Rivers - to evaluate the use of turbidity as a sediment surrogate in rivers that deliver sediment to the bay. Results of this surrogate approach were compared to the traditionally applied streamflow-based approach for estimating SSC. Additionally, evaluation and comparison of these two approaches were conducted for nutrient estimations. Results demonstrate that the application of turbidity-based estimation models provides an improved method for generating a continuous record of SSC, relative to the classical approach that uses streamflow as a surrogate for SSC. Turbidity-based estimates of SSC were found to be more accurate and precise than SSC estimates from streamflow-based approaches. The turbidity-based SSC estimation models explained 92 to 98 percent of the variability in SSC, while streamflow-based models explained 74 to 88 percent of the variability in SSC. Furthermore, the mean absolute error of turbidity-based SSC estimates was 50 to 87 percent less than the corresponding values from the streamflow-based models. Statistically significant differences were detected between the distributions of residual errors and estimates from the two approaches, indicating that the turbidity-based approach yields estimates of SSC with greater precision than the streamflow-based approach. Similar improvements were identified for turbidity-based estimates of total phosphorus, which is strongly related to turbidity because total phosphorus occurs predominantly in particulate form. Total nitrogen estimation models based on turbidity and streamflow generated estimates of similar quality, with the turbidity-based models providing slight improvements in the quality of estimations. This result is attributed to the understanding that nitrogen transport is dominated by dissolved forms that relate less directly to streamflow and turbidity. 
Improvements in concentration estimation resulted in improved estimates of load. Turbidity-based suspended-sediment loads estimated for the James River at Cartersville, VA, monitoring station exhibited tighter confidence interval bounds and a coefficient of variation of 12 percent, compared with a coefficient of variation of 38 percent for the streamflow-based load.
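
    The core comparison can be sketched on simulated data: fit log-log regressions of SSC on turbidity and on streamflow and compare their explained variance. The coefficients and noise levels below are illustrative, not values from the monitored stations.

    ```python
    # Hedged sketch: compare a turbidity-based and a streamflow-based surrogate
    # regression for suspended-sediment concentration (SSC) on simulated data with
    # log-log linear structure (typical for these models; numbers are illustrative).
    import numpy as np
    from sklearn.linear_model import LinearRegression
    from sklearn.metrics import r2_score

    rng = np.random.default_rng(9)
    n = 300
    log_q = rng.normal(3.0, 0.8, n)                     # log streamflow
    log_turb = 0.9 * log_q + rng.normal(0, 0.45, n)     # turbidity tracks flow, loosely
    log_ssc = 1.1 * log_turb + rng.normal(0, 0.15, n)   # SSC tracks turbidity tightly

    for name, x in [("turbidity", log_turb), ("streamflow", log_q)]:
        model = LinearRegression().fit(x.reshape(-1, 1), log_ssc)
        r2 = r2_score(log_ssc, model.predict(x.reshape(-1, 1)))
        print(f"log(SSC) ~ log({name}):  R^2 = {r2:.2f}")
    ```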

  1. Uncertainty propagation of p-boxes using sparse polynomial chaos expansions

    NASA Astrophysics Data System (ADS)

    Schöbi, Roland; Sudret, Bruno

    2017-06-01

    In modern engineering, physical processes are modelled and analysed using advanced computer simulations, such as finite element models. Furthermore, concepts of reliability analysis and robust design are becoming popular, hence, making efficient quantification and propagation of uncertainties an important aspect. In this context, a typical workflow includes the characterization of the uncertainty in the input variables. In this paper, input variables are modelled by probability-boxes (p-boxes), accounting for both aleatory and epistemic uncertainty. The propagation of p-boxes leads to p-boxes of the output of the computational model. A two-level meta-modelling approach is proposed using non-intrusive sparse polynomial chaos expansions to surrogate the exact computational model and, hence, to facilitate the uncertainty quantification analysis. The capabilities of the proposed approach are illustrated through applications using a benchmark analytical function and two realistic engineering problem settings. They show that the proposed two-level approach allows for an accurate estimation of the statistics of the response quantity of interest using a small number of evaluations of the exact computational model. This is crucial in cases where the computational costs are dominated by the runs of high-fidelity computational models.

  2. Uncertainty propagation of p-boxes using sparse polynomial chaos expansions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schöbi, Roland, E-mail: schoebi@ibk.baug.ethz.ch; Sudret, Bruno, E-mail: sudret@ibk.baug.ethz.ch

    2017-06-15

    In modern engineering, physical processes are modelled and analysed using advanced computer simulations, such as finite element models. Furthermore, concepts of reliability analysis and robust design are becoming popular, hence, making efficient quantification and propagation of uncertainties an important aspect. In this context, a typical workflow includes the characterization of the uncertainty in the input variables. In this paper, input variables are modelled by probability-boxes (p-boxes), accounting for both aleatory and epistemic uncertainty. The propagation of p-boxes leads to p-boxes of the output of the computational model. A two-level meta-modelling approach is proposed using non-intrusive sparse polynomial chaos expansions to surrogate the exact computational model and, hence, to facilitate the uncertainty quantification analysis. The capabilities of the proposed approach are illustrated through applications using a benchmark analytical function and two realistic engineering problem settings. They show that the proposed two-level approach allows for an accurate estimation of the statistics of the response quantity of interest using a small number of evaluations of the exact computational model. This is crucial in cases where the computational costs are dominated by the runs of high-fidelity computational models.

  3. Processes in scientific workflows for information seeking related to physical sample materials

    NASA Astrophysics Data System (ADS)

    Ramdeen, S.

    2014-12-01

    The majority of State Geological Surveys have repositories containing cores, cuttings, fossils or other physical sample material. State surveys maintain these collections to support their own research as well as the research conducted by external users from other organizations. This includes organizations such as government agencies (state and federal), academia, industry and the public. The preliminary results presented in this paper focus on the research processes of these external users, in particular how they discover, access and use digital surrogates, which they use to evaluate and access physical items in these collections. Data such as physical samples are materials that cannot be completely replaced with digital surrogates. Digital surrogates may be represented as metadata, which enable discovery and ultimately access to these samples. These surrogates may be found in records, databases, publications, etc. However, surrogates do not eliminate the need for access to the physical item, since surrogates cannot be subjected to chemical testing or other similar analyses. The goal of this research is to document the various processes external users perform in order to access physical materials. Data for this study will be collected by conducting interviews with these external users. During the interviews, participants will be asked to describe the workflow that led them to interact with state survey repositories, and what steps they took afterward. High-level processes/categories of behavior will be identified. These processes will be used in the development of an information-seeking behavior model. This model may be used to facilitate the development of management tools and other aspects of cyberinfrastructure related to physical samples.

  4. Quantitative analysis of the growth of Salmonella stanley during alfalfa sprouting and evaluation of Enterobacter aerogenes as its surrogate.

    PubMed

    Liu, Bin; Schaffner, Donald W

    2007-02-01

    Raw seed sprouts have been implicated in several food poisoning outbreaks in the last 10 years. Few studies have included investigations of factors influencing the effectiveness of testing spent irrigation water, and in no studies to date has a nonpathogenic surrogate been identified as suitable for large-scale irrigation water testing trials. Alfalfa seeds were inoculated with Salmonella Stanley or its presumptive surrogate (nalidixic acid-resistant Enterobacter aerogenes) at three concentrations (~3, ~30, and ~300 CFU/g) and were then transferred into either flasks or a benchtop-scale sprouting chamber. Microbial concentrations were determined in seeds, sprouts, and irrigation water at various times during a 4-day sprouting process. Data were fit to logistic regression models, and growth rates and maximum concentrations were compared using the generalized linear model procedure of SAS. No significant differences in growth rates were observed among samples taken from flasks or the chamber. Microbial concentrations in irrigation water were not significantly different from concentrations in sprout samples obtained at the same time. E. aerogenes concentrations were similar to those of Salmonella Stanley at corresponding time points for all three inoculum concentrations. Growth rates were also constant regardless of inoculum concentration or strain, except that lower inoculum concentrations resulted in lower final concentrations proportional to their initial concentrations. This research demonstrated that a nonpathogenic, easy-to-isolate surrogate (nalidixic acid-resistant E. aerogenes) provides results similar to those obtained with Salmonella Stanley, supporting the use of this surrogate in future large-scale experiments.
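
    A hedged sketch of the curve-fitting step: fit a logistic growth model to log-transformed counts for the pathogen and the surrogate and compare the fitted growth rates. The counts are simulated and the parameter values are illustrative only.

    ```python
    # Hedged sketch: fit a logistic growth model to log10 CFU/g counts during
    # sprouting and compare fitted growth rates of a pathogen and its surrogate.
    # The counts below are simulated; parameters are illustrative only.
    import numpy as np
    from scipy.optimize import curve_fit

    def logistic(t, y0, ymax, mu):
        """Log-count logistic growth: y0 = initial, ymax = maximum, mu = rate (1/h)."""
        return ymax / (1.0 + ((ymax - y0) / y0) * np.exp(-mu * t))

    t = np.linspace(0, 96, 13)                                   # hours of sprouting
    rng = np.random.default_rng(10)
    salmonella = logistic(t, 1.5, 7.5, 0.20) + rng.normal(0, 0.15, t.size)
    e_aerogenes = logistic(t, 1.5, 7.6, 0.19) + rng.normal(0, 0.15, t.size)

    for name, y in [("Salmonella Stanley", salmonella), ("E. aerogenes", e_aerogenes)]:
        popt, _ = curve_fit(logistic, t, y, p0=[1.0, 8.0, 0.1])
        print(f"{name:20s} fitted rate mu = {popt[2]:.3f} 1/h, max = {popt[1]:.2f} log CFU/g")
    ```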

  5. Calibration of Complex Subsurface Reaction Models Using a Surrogate-Model Approach

    EPA Science Inventory

    Application of model assessment techniques to complex subsurface reaction models involves numerous difficulties, including non-trivial model selection, parameter non-uniqueness, and excessive computational burden. To overcome these difficulties, this study introduces SAMM (Simult...

  6. Dakota, a multilevel parallel object-oriented framework for design optimization, parameter estimation, uncertainty quantification, and sensitivity analysis version 6.0 theory manual

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Adams, Brian M.; Ebeida, Mohamed Salah; Eldred, Michael S

    The Dakota (Design Analysis Kit for Optimization and Terascale Applications) toolkit provides a flexible and extensible interface between simulation codes and iterative analysis methods. Dakota contains algorithms for optimization with gradient and nongradient-based methods; uncertainty quantification with sampling, reliability, and stochastic expansion methods; parameter estimation with nonlinear least squares methods; and sensitivity/variance analysis with design of experiments and parameter study methods. These capabilities may be used on their own or as components within advanced strategies such as surrogate-based optimization, mixed integer nonlinear programming, or optimization under uncertainty. By employing object-oriented design to implement abstractions of the key components required for iterative systems analyses, the Dakota toolkit provides a flexible and extensible problem-solving environment for design and performance analysis of computational models on high performance computers. This report serves as a theoretical manual for selected algorithms implemented within the Dakota software. It is not intended as a comprehensive theoretical treatment, since a number of existing texts cover general optimization theory, statistical analysis, and other introductory topics. Rather, this manual is intended to summarize a set of Dakota-related research publications in the areas of surrogate-based optimization, uncertainty quantification, and optimization under uncertainty that provide the foundation for many of Dakota's iterative analysis capabilities.

  7. Cutthroat trout virus as a surrogate in vitro infection model for testing inhibitors of hepatitis E virus replication

    USGS Publications Warehouse

    Debing, Yannick; Winton, James; Neyts, Johan; Dallmeier, Kai

    2013-01-01

    Hepatitis E virus (HEV) is one of the most important causes of acute hepatitis worldwide. Although most infections are self-limiting, mortality is particularly high in pregnant women. Chronic infections can occur in transplant and other immune-compromised patients. Successful treatment of chronic hepatitis E has been reported with ribavirin and pegylated interferon-alpha, however severe side effects were observed. We employed the cutthroat trout virus (CTV), a non-pathogenic fish virus with remarkable similarities to HEV, as a potential surrogate for HEV and established an antiviral assay against this virus using the Chinook salmon embryo (CHSE-214) cell line. Ribavirin and the respective trout interferon were found to efficiently inhibit CTV replication. Other known broad-spectrum inhibitors of RNA virus replication such as the nucleoside analog 2′-C-methylcytidine resulted only in a moderate antiviral activity. In its natural fish host, CTV levels largely fluctuate during the reproductive cycle with the virus detected mainly during spawning. We wondered whether this aspect of CTV infection may serve as a surrogate model for the peculiar pathogenesis of HEV in pregnant women. To that end the effect of three sex steroids on in vitro CTV replication was evaluated. Whereas progesterone resulted in marked inhibition of virus replication, testosterone and 17β-estradiol stimulated viral growth. Our data thus indicate that CTV may serve as a surrogate model for HEV, both for antiviral experiments and studies on the replication biology of the Hepeviridae.

  8. Estimation of k-ε parameters using surrogate models and jet-in-crossflow data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lefantzi, Sophia; Ray, Jaideep; Arunajatesan, Srinivasan

    2014-11-01

    We demonstrate a Bayesian method that can be used to calibrate computationally expensive 3D RANS (Reynolds-Averaged Navier-Stokes) models with complex response surfaces. Such calibrations, conditioned on experimental data, can yield turbulence model parameters as probability density functions (PDFs), concisely capturing the uncertainty in the parameter estimates. Methods such as Markov chain Monte Carlo (MCMC) estimate the PDF by sampling, with each sample requiring a run of the RANS model. Consequently, a quick-running surrogate is used in place of the RANS simulator. The surrogate can be very difficult to design if the model's response, i.e., the dependence of the calibration variable (the observable) on the parameters being estimated, is complex. We show how the training data used to construct the surrogate can be employed to isolate a promising and physically realistic part of the parameter space, within which the response is well-behaved and easily modeled. We design a classifier, based on treed linear models, to model the "well-behaved region". This classifier serves as a prior in a Bayesian calibration study aimed at estimating three k-ε parameters (Cμ, Cε2, Cε1) from experimental data of a transonic jet-in-crossflow interaction. The robustness of the calibration is investigated by checking its predictions of variables not included in the calibration data. We also check the limit of applicability of the calibration by testing at off-calibration flow regimes. We find that calibration yields turbulence model parameters that predict the flowfield far better than when the nominal values of the parameters are used. Substantial improvements are still obtained when we use the calibrated RANS model to predict jet-in-crossflow at Mach numbers and jet strengths quite different from those used to generate the experimental (calibration) data. Thus the primary reason for the poor predictive skill of RANS, when using nominal values of the turbulence model parameters, was parametric uncertainty, which was rectified by calibration. Post-calibration, the dominant contribution to model inaccuracies is due to the structural errors in RANS.
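
    A hedged, toy-scale sketch of surrogate-driven Bayesian calibration: a random-walk Metropolis sampler whose likelihood evaluates a cheap quadratic surrogate of the expensive RANS observable, with a simple box indicator standing in for the treed-linear-model classifier prior. The surrogate, datum, and parameter ranges are synthetic.

    ```python
    # Hedged sketch of surrogate-driven Bayesian calibration: a random-walk
    # Metropolis sampler whose likelihood calls a cheap quadratic surrogate of the
    # expensive RANS response. A box indicator stands in for the classifier prior.
    import numpy as np

    rng = np.random.default_rng(11)

    def surrogate(theta):                        # cheap stand-in for the RANS observable
        c_mu, c_e2 = theta
        return 1.0 + 2.0 * (c_mu - 0.09) - 1.5 * (c_e2 - 1.92) + 4.0 * (c_mu - 0.09) ** 2

    y_obs, sigma = 1.02, 0.02                    # synthetic "experimental" datum and noise

    def log_post(theta):
        c_mu, c_e2 = theta
        if not (0.06 < c_mu < 0.12 and 1.7 < c_e2 < 2.1):   # "well-behaved region" prior
            return -np.inf
        return -0.5 * ((surrogate(theta) - y_obs) / sigma) ** 2

    theta = np.array([0.09, 1.92])
    lp = log_post(theta)
    chain = []
    for _ in range(20000):                       # random-walk Metropolis
        prop = theta + rng.normal(0, [0.004, 0.02])
        lp_prop = log_post(prop)
        if np.log(rng.random()) < lp_prop - lp:
            theta, lp = prop, lp_prop
        chain.append(theta)

    chain = np.array(chain[5000:])               # discard burn-in
    print("posterior mean (C_mu, C_eps2):", chain.mean(axis=0))
    print("posterior std:", chain.std(axis=0))
    ```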

  9. Generation of fluoroscopic 3D images with a respiratory motion model based on an external surrogate signal

    NASA Astrophysics Data System (ADS)

    Hurwitz, Martina; Williams, Christopher L.; Mishra, Pankaj; Rottmann, Joerg; Dhou, Salam; Wagar, Matthew; Mannarino, Edward G.; Mak, Raymond H.; Lewis, John H.

    2015-01-01

    Respiratory motion during radiotherapy can cause uncertainties in definition of the target volume and in estimation of the dose delivered to the target and healthy tissue. In this paper, we generate volumetric images of the internal patient anatomy during treatment using only the motion of a surrogate signal. Pre-treatment four-dimensional CT imaging is used to create a patient-specific model correlating internal respiratory motion with the trajectory of an external surrogate placed on the chest. The performance of this model is assessed with digital and physical phantoms reproducing measured irregular patient breathing patterns. Ten patient breathing patterns are incorporated in a digital phantom. For each patient breathing pattern, the model is used to generate images over the course of thirty seconds. The tumor position predicted by the model is compared to ground truth information from the digital phantom. Over the ten patient breathing patterns, the average absolute error in the tumor centroid position predicted by the motion model is 1.4 mm. The corresponding error for one patient breathing pattern implemented in an anthropomorphic physical phantom was 0.6 mm. The global voxel intensity error was used to compare the full image to the ground truth and demonstrates good agreement between predicted and true images. The model also generates accurate predictions for breathing patterns with irregular phases or amplitudes.

  10. Correlations of turbidity to suspended-sediment concentration in the Toutle River Basin, near Mount St. Helens, Washington, 2010-11

    USGS Publications Warehouse

    Uhrich, Mark A.; Kolasinac, Jasna; Booth, Pamela L.; Fountain, Robert L.; Spicer, Kurt R.; Mosbrucker, Adam R.

    2014-01-01

    Researchers at the U.S. Geological Survey, Cascades Volcano Observatory, investigated alternative methods for the traditional sample-based sediment record procedure in determining suspended-sediment concentration (SSC) and discharge. One such sediment-surrogate technique was developed using turbidity and discharge to estimate SSC for two gaging stations in the Toutle River Basin near Mount St. Helens, Washington. To provide context for the study, methods for collecting sediment data and monitoring turbidity are discussed. Statistical methods used include the development of ordinary least squares regression models for each gaging station. Issues of time-related autocorrelation also are evaluated. Addition of lagged explanatory variables was used to account for autocorrelation in the turbidity, discharge, and SSC data. Final regression model equations and plots are presented for the two gaging stations. The regression models support near-real-time estimates of SSC and improved suspended-sediment discharge records by incorporating continuous instream turbidity. Future use of such models may potentially lower the costs of sediment monitoring by reducing time it takes to collect and process samples and to derive a sediment-discharge record.
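
    A hedged sketch of the lagged-variable device mentioned above: add lag-1 turbidity and discharge terms to the ordinary least squares SSC model and check the residual autocorrelation. The unit-value series below are simulated and the single lag is illustrative.

    ```python
    # Hedged sketch: add lagged turbidity and discharge terms to the SSC regression
    # to account for serial correlation. Data are simulated unit values.
    import numpy as np
    from sklearn.linear_model import LinearRegression

    rng = np.random.default_rng(12)
    n = 500
    turb = np.cumsum(rng.normal(0, 0.1, n)) + 3.0        # autocorrelated log turbidity
    q = np.cumsum(rng.normal(0, 0.1, n)) + 5.0           # autocorrelated log discharge
    ssc = 1.2 * turb + 0.3 * q + 0.4 * np.roll(turb, 1) + rng.normal(0, 0.1, n)

    # Design matrix with current and lag-1 explanatory variables
    X = np.column_stack([turb[1:], q[1:], turb[:-1], q[:-1]])
    y = ssc[1:]
    fit = LinearRegression().fit(X, y)
    resid = y - fit.predict(X)
    lag1_r = np.corrcoef(resid[1:], resid[:-1])[0, 1]
    print(f"R^2 = {fit.score(X, y):.3f}, lag-1 residual autocorrelation = {lag1_r:.2f}")
    ```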

  11. Machine Learning Techniques for Global Sensitivity Analysis in Climate Models

    NASA Astrophysics Data System (ADS)

    Safta, C.; Sargsyan, K.; Ricciuto, D. M.

    2017-12-01

    Climate model studies are challenged not only by the compute-intensive nature of these models but also by the high dimensionality of the input parameter space. In our previous work with the land model components (Sargsyan et al., 2014) we identified subsets of 10 to 20 parameters relevant for each QoI via Bayesian compressive sensing and variance-based decomposition. Nevertheless, the algorithms were challenged by the nonlinear input-output dependencies for some of the relevant QoIs. In this work we will explore a combination of techniques to extract relevant parameters for each QoI and subsequently construct surrogate models with quantified uncertainty, necessary for future developments, e.g., model calibration and prediction studies. In the first step, we will compare the skill of machine-learning models (e.g., neural networks, support vector machines) to identify the optimal number of classes in selected QoIs and construct robust multi-class classifiers that will partition the parameter space into regions with smooth input-output dependencies. These classifiers will be coupled with techniques aimed at building sparse and/or low-rank surrogate models tailored to each class. Specifically, we will explore and compare sparse learning techniques with low-rank tensor decompositions. These models will be used to identify parameters that are important for each QoI. Surrogate accuracy requirements are higher for subsequent model calibration studies, and we will ascertain the performance of this workflow for multi-site ALM simulation ensembles.

  12. Low-rank separated representation surrogates of high-dimensional stochastic functions: Application in Bayesian inference

    NASA Astrophysics Data System (ADS)

    Validi, AbdoulAhad

    2014-03-01

    This study introduces a non-intrusive approach in the context of low-rank separated representation to construct a surrogate of high-dimensional stochastic functions, e.g., PDEs/ODEs, in order to decrease the computational cost of Markov Chain Monte Carlo simulations in Bayesian inference. The surrogate model is constructed via a regularized alternating least-squares regression with Tikhonov regularization, using a roughening matrix that computes the gradient of the solution, in conjunction with a perturbation-based error indicator to detect optimal model complexities. The model approximates a vector of a continuous solution at discrete values of a physical variable. The required number of random realizations to achieve a successful approximation linearly depends on the function dimensionality. The computational cost of the model construction is quadratic in the number of random inputs, which potentially tackles the curse of dimensionality in high-dimensional stochastic functions. Furthermore, this vector-valued separated representation-based model, in comparison to the available scalar-valued case, leads to a significant reduction in the cost of approximation by an order of magnitude equal to the vector size. The performance of the method is studied through its application to three numerical examples including a 41-dimensional elliptic PDE and a 21-dimensional cavity flow.
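
    A much-simplified sketch of the core construction: a rank-R separated representation of a two-variable function on a grid, fitted by Tikhonov-regularized alternating least squares. The paper's method is vector-valued and high-dimensional; this toy version only shows the ALS update.

    ```python
    # Much-simplified sketch of a low-rank separated representation: approximate a
    # two-variable function on a grid by sum_r a_r(x) b_r(y), fitting the factors by
    # Tikhonov-regularized alternating least squares.
    import numpy as np

    x = np.linspace(0, 1, 60)
    y = np.linspace(0, 1, 50)
    F = np.exp(-3 * np.add.outer(x, y)) + 0.5 * np.outer(np.sin(2 * np.pi * x), np.cos(np.pi * y))

    rank, lam = 3, 1e-6
    rng = np.random.default_rng(13)
    A = rng.normal(size=(x.size, rank))          # a_r sampled on the x grid
    B = rng.normal(size=(y.size, rank))          # b_r sampled on the y grid

    for sweep in range(50):                      # alternating least squares
        # solve for A with B fixed:  min ||F - A B^T||^2 + lam ||A||^2
        A = F @ B @ np.linalg.inv(B.T @ B + lam * np.eye(rank))
        # solve for B with A fixed
        B = F.T @ A @ np.linalg.inv(A.T @ A + lam * np.eye(rank))

    rel_err = np.linalg.norm(F - A @ B.T) / np.linalg.norm(F)
    print(f"rank-{rank} separated representation, relative error = {rel_err:.2e}")
    ```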

  13. Surrogacy Assessment Using Principal Stratification and a Gaussian Copula Model

    PubMed Central

    Taylor, J.M.G.; Elliott, M.R.

    2014-01-01

    In clinical trials, a surrogate outcome (S) can be measured before the outcome of interest (T) and may provide early information regarding the treatment (Z) effect on T. Many methods of surrogacy validation rely on models for the conditional distribution of T given Z and S. However, S is a post-randomization variable, and unobserved, simultaneous predictors of S and T may exist, resulting in a non-causal interpretation. Frangakis and Rubin developed the concept of principal surrogacy, stratifying on the joint distribution of the surrogate marker under treatment and control to assess the association between the causal effects of treatment on the marker and the causal effects of treatment on the clinical outcome. Working within the principal surrogacy framework, we address the scenario of an ordinal categorical variable as a surrogate for a censored failure time true endpoint. A Gaussian copula model is used to model the joint distribution of the potential outcomes of T, given the potential outcomes of S. Because the proposed model cannot be fully identified from the data, we use a Bayesian estimation approach with prior distributions consistent with reasonable assumptions in the surrogacy assessment setting. The method is applied to data from a colorectal cancer clinical trial, previously analyzed by Burzykowski et al. PMID:24947559

  14. Surrogacy assessment using principal stratification and a Gaussian copula model.

    PubMed

    Conlon, Asc; Taylor, Jmg; Elliott, M R

    2017-02-01

    In clinical trials, a surrogate outcome ( S) can be measured before the outcome of interest ( T) and may provide early information regarding the treatment ( Z) effect on T. Many methods of surrogacy validation rely on models for the conditional distribution of T given Z and S. However, S is a post-randomization variable, and unobserved, simultaneous predictors of S and T may exist, resulting in a non-causal interpretation. Frangakis and Rubin developed the concept of principal surrogacy, stratifying on the joint distribution of the surrogate marker under treatment and control to assess the association between the causal effects of treatment on the marker and the causal effects of treatment on the clinical outcome. Working within the principal surrogacy framework, we address the scenario of an ordinal categorical variable as a surrogate for a censored failure time true endpoint. A Gaussian copula model is used to model the joint distribution of the potential outcomes of T, given the potential outcomes of S. Because the proposed model cannot be fully identified from the data, we use a Bayesian estimation approach with prior distributions consistent with reasonable assumptions in the surrogacy assessment setting. The method is applied to data from a colorectal cancer clinical trial, previously analyzed by Burzykowski et al.

  15. Advance directives lessen the decisional burden of surrogate decision-making for the chronically critically ill.

    PubMed

    Hickman, Ronald L; Pinto, Melissa D

    2014-03-01

    To identify the relationships between advance directive status, demographic characteristics and decisional burden (role stress and depressive symptoms) of surrogate decision-makers (SDMs) of patients with chronic critical illness. Although the prevalence of advance directives among Americans has increased, SDMs are ultimately responsible for complex medical decisions of the chronically critically ill patient. Decisional burden has lasting psychological effects on SDMs. There is insufficient evidence on the influence of advance directives on the decisional burden of surrogate decision-makers of patients with chronic critical illness. The study was a secondary data analysis of cross-sectional data. Data were obtained from 489 surrogate decision-makers of chronically critically ill patients at two academic medical centres in Northeast Ohio, United States, between September 2005 and May 2008. Data were collected using demographic forms and questionnaires. A single-item measure of role stress and the Center for Epidemiological Studies Depression (CESD) scale were used to capture the SDM's decisional burden. Descriptive statistics, t-tests, chi-square and path analyses were performed. Surrogate decision-makers who were nonwhite, with low socioeconomic status and low education level were less likely to have advance directive documentation for their chronically critically ill patient. The presence of an advance directive mitigates the decisional burden by directly reducing the SDM's role stress and indirectly lessening the severity of depressive symptoms. Most SDMs of chronically critically ill patients will not have the benefit of knowing the patient's preferences for life-sustaining therapies and will consequently be at risk of increased decisional burden. Study results are clinically useful for patient education on the influence of advance directives. Patients may be informed that SDMs without advance directives are at risk of increased decisional burden and will require decisional support to facilitate patient-centred decision-making. © 2013 John Wiley & Sons Ltd.

  16. Surrogate and clinical endpoints in interventional cardiology: are statistics the brakes?

    PubMed

    Waliszewski, Matthias; Rittger, Harald

    2016-10-01

    Randomized controlled trials are the gold standard for demonstrating safety and efficacy of coronary devices with or without accompanying drug treatments in interventional cardiology. With the advent of last-generation drug-eluting stents having enhanced technical attributes and long-term clinical benefits, the proof of incremental angiographic or long-term clinical efficacy becomes more challenging. The purpose of this review is to provide an overview of the most common and alternative study endpoints in interventional cardiology and their potential reimbursement value. Moreover, we intend to describe the statistical limitations in demonstrating differences between potential treatment groups. Furthermore, careful endpoint recommendations for a given patient number are offered for future study designs. The number of patients per treatment group was estimated for various study designs such as noninferiority test hypotheses with hard clinical endpoints and various surrogate endpoints. To test for differences in various surrogate endpoint scenarios, the corresponding patient group sizes were explored. To evaluate these endpoints in terms of their reimbursement impact, preferred endpoints for technical appraisals in interventional cardiology at the National Institute for Health and Care Excellence (NICE) were used. Even with the most stringent experimental control to reduce bias-introducing factors, studies with hard primary clinical endpoints such as the occurrence of major adverse cardiac events (MACE) or target-lesion revascularization (TLR) rates remain the gold standard, with numbers reaching into the 300-700 patient range per group. Study designs using loss in fractional-flow reserve (FFR) or stent-strut-coverage rates can be statistically formulated; however, the clinical ramifications for the patient remain to be discussed. Nonrandomized study designs with intrapatient angiographic controls in nontarget vessels may merit further thoughts and explorations. From a reimbursement perspective, the primary endpoints MACE and TLR are the best choices for a moderately sized study population of 500 patients per group. Angiographic endpoints, in particular minimal lumen diameter (MLD), are not useful in this context. The emerging endpoints such as loss in FFR or stent coverage require smaller patient populations. However, their impact on reimbursement-related decisions is limited. © The Author(s), 2016.
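
    To make the sample-size discussion concrete, the sketch below applies the standard normal-approximation formula for a one-sided noninferiority comparison of two event rates. The assumed event rate, margin, alpha and power are illustrative values chosen here, not figures taken from the review.

```python
# Back-of-the-envelope sketch of the sample-size arithmetic the review discusses:
# a one-sided noninferiority test on a binary endpoint (e.g. 12-month MACE or TLR),
# using the usual normal approximation. The assumed event rate, margin, alpha and
# power are hypothetical illustration values, not figures from the paper.
from scipy.stats import norm

def n_per_group(p_ctrl, p_test, margin, alpha=0.025, power=0.80):
    """Patients per arm for H0: p_test - p_ctrl >= margin (noninferiority)."""
    z_a, z_b = norm.ppf(1 - alpha), norm.ppf(power)
    var = p_ctrl * (1 - p_ctrl) + p_test * (1 - p_test)
    return (z_a + z_b) ** 2 * var / (margin - (p_test - p_ctrl)) ** 2

# Example: 8% event rate in both arms, 4-percentage-point noninferiority margin
print(round(n_per_group(0.08, 0.08, 0.04)))   # about 720 patients per group here
```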

  17. Surrogate and clinical endpoints in interventional cardiology: are statistics the brakes?

    PubMed Central

    Waliszewski, Matthias; Rittger, Harald

    2016-01-01

    Background: Randomized controlled trials are the gold standard for demonstrating safety and efficacy of coronary devices with or without accompanying drug treatments in interventional cardiology. With the advent of last-generation drug-eluting stents having enhanced technical attributes and long-term clinical benefits, the proof of incremental angiographic or long-term clinical efficacy becomes more challenging. The purpose of this review is to provide an overview of the most common and alternative study endpoints in interventional cardiology and their potential reimbursement value. Moreover, we intend to describe the statistical limitations in demonstrating differences between potential treatment groups. Furthermore, careful endpoint recommendations for a given patient number are offered for future study designs. Methods: The number of patients per treatment group was estimated for various study designs such as noninferiority test hypotheses with hard clinical endpoints and various surrogate endpoints. To test for differences in various surrogate endpoint scenarios, the corresponding patient group sizes were explored. To evaluate these endpoints in terms of their reimbursement impact, preferred endpoints for technical appraisals in interventional cardiology at the National Institute for Health and Care Excellence (NICE) were used. Results: Even with the most stringent experimental control to reduce bias-introducing factors, studies with hard primary clinical endpoints such as the occurrence of major adverse cardiac events (MACE) or target-lesion revascularization (TLR) rates remain the gold standard, with numbers reaching into the 300–700 patient range per group. Study designs using loss in fractional-flow reserve (FFR) or stent-strut-coverage rates can be statistically formulated; however, the clinical ramifications for the patient remain to be discussed. Nonrandomized study designs with intrapatient angiographic controls in nontarget vessels may merit further thoughts and explorations. Conclusions: From a reimbursement perspective, the primary endpoints MACE and TLR are the best choices for a moderately sized study population of 500 patients per group. Angiographic endpoints, in particular minimal lumen diameter (MLD), are not useful in this context. The emerging endpoints such as loss in FFR or stent coverage require smaller patient populations. However, their impact on reimbursement-related decisions is limited. PMID:27378486

  18. Estimating suspended sediment using acoustics in a fine-grained riverine system, Kickapoo Creek at Bloomington, Illinois

    USGS Publications Warehouse

    Manaster, Amanda D.; Domanski, Marian M.; Straub, Timothy D.; Boldt, Justin A.

    2016-08-18

    Acoustic technologies have the potential to be used as a surrogate for measuring suspended-sediment concentration (SSC). This potential was examined in a fine-grained (97-100 percent fines) riverine system in central Illinois by way of installation of an acoustic instrument. Acoustic data were collected continuously over the span of 5.5 years. Acoustic parameters were regressed against SSC data to determine the accuracy of using acoustic technology as a surrogate for measuring SSC in a fine-grained riverine system. The resulting regressions for SSC and sediment acoustic parameters had coefficients of determination ranging from 0.75 to 0.97 for various events and configurations. The overall Nash-Sutcliffe model-fit efficiency was 0.95 for the 132 observed and predicted SSC values determined using the sediment acoustic parameter regressions. The study of using acoustic technologies as a surrogate for measuring SSC in fine-grained riverine systems is ongoing. The results at this site are promising in the realm of surrogate technology.
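
    The sketch below illustrates the surrogate-regression idea on synthetic data: an ordinary least-squares fit of log10(SSC) against an acoustic backscatter parameter, scored with the Nash-Sutcliffe model-fit efficiency. The variable names and numbers are invented for illustration and do not reproduce the Kickapoo Creek regressions.

```python
# Illustrative sketch of the surrogate-regression idea: regress log10(SSC) on an
# acoustic backscatter parameter and score the fit with the Nash-Sutcliffe
# efficiency. The data below are synthetic; variable names are hypothetical.
import numpy as np

rng = np.random.default_rng(1)
backscatter = rng.uniform(60, 100, 120)                          # sediment-corrected backscatter (dB)
log_ssc = -2.0 + 0.045 * backscatter + rng.normal(0, 0.1, 120)   # "observed" log10 SSC (mg/L)

# Ordinary least squares: log10(SSC) = b0 + b1 * backscatter
b1, b0 = np.polyfit(backscatter, log_ssc, 1)
pred = b0 + b1 * backscatter

def nash_sutcliffe(obs, sim):
    """Nash-Sutcliffe model-fit efficiency (1 is a perfect fit)."""
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - np.mean(obs)) ** 2)

print(f"slope={b1:.3f}, intercept={b0:.2f}, NSE={nash_sutcliffe(log_ssc, pred):.2f}")
```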

  19. The potential investment impact of improved access to accelerated approval on the development of treatments for low prevalence rare diseases

    PubMed Central

    2011-01-01

    Background Over 95% of rare diseases lack treatments despite many successful treatment studies in animal models. To improve access to treatments, the Accelerated Approval (AA) regulations were implemented allowing the use of surrogate endpoints to achieve drug approval and accelerate development of life-saving therapies. Many rare diseases have not utilized AA due to the difficulty in gaining acceptance of novel surrogate endpoints in untreated rare diseases. Methods To assess the potential impact of improved AA accessibility, we devised clinical development programs using proposed clinical or surrogate endpoints for fifteen rare disease treatments. Results We demonstrate that better AA access could reduce development costs by approximately 60%, increase investment value, and foster development of three times as many rare disease drugs for the same investment. Conclusion Our research brings attention to the need for well-defined and practical qualification criteria for the use of surrogate endpoints to allow more access to the AA approval pathway in clinical trials for rare diseases. PMID:21733145

  20. An information-theoretic approach for the evaluation of surrogate endpoints based on causal inference.

    PubMed

    Alonso, Ariel; Van der Elst, Wim; Molenberghs, Geert; Buyse, Marc; Burzykowski, Tomasz

    2016-09-01

    In this work a new metric of surrogacy, the so-called individual causal association (ICA), is introduced using information-theoretic concepts and a causal inference model for a binary surrogate and true endpoint. The ICA has a simple and appealing interpretation in terms of uncertainty reduction and, in some scenarios, it seems to provide a more coherent assessment of the validity of a surrogate than existing measures. The identifiability issues are tackled using a two-step procedure. In the first step, the region of the parametric space of the distribution of the potential outcomes, compatible with the data at hand, is geometrically characterized. Further, in a second step, a Monte Carlo approach is proposed to study the behavior of the ICA on the previous region. The method is illustrated using data from the Collaborative Initial Glaucoma Treatment Study. A newly developed and user-friendly R package Surrogate is provided to carry out the evaluation exercise. © 2016, The International Biometric Society.
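
    The sketch below reproduces only the information-theoretic ingredient: for one hypothetical joint distribution of the individual causal effects on the surrogate and the true endpoint, it computes their mutual information and a normalized version. Normalizing by the smaller marginal entropy is a convention assumed here; the geometric characterization of the identifiable region and the Monte Carlo step are not shown.

```python
# Sketch of the information-theoretic ingredient only: given one candidate joint
# distribution of the individual causal effects (dS, dT), each in {-1, 0, 1},
# compute their mutual information and a normalized association. The joint table
# is hypothetical, and normalizing by the smaller marginal entropy is just one
# convention; the paper's Monte Carlo over the identifiable region is not shown.
import numpy as np

p = np.array([[0.02, 0.03, 0.01],
              [0.05, 0.55, 0.06],
              [0.02, 0.06, 0.20]])    # rows: dS = -1, 0, 1; cols: dT = -1, 0, 1
p = p / p.sum()

ps, pt = p.sum(axis=1), p.sum(axis=0)            # marginals of dS and dT

def entropy(q):
    q = q[q > 0]
    return -np.sum(q * np.log(q))

mi = sum(p[i, j] * np.log(p[i, j] / (ps[i] * pt[j]))
         for i in range(3) for j in range(3) if p[i, j] > 0)

ica_like = mi / min(entropy(ps), entropy(pt))    # normalized association in [0, 1]
print(round(mi, 3), round(ica_like, 3))
```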

  1. Capacity for Preferences: Respecting Patients with Compromised Decision-Making.

    PubMed

    Wasserman, Jason Adam; Navin, Mark Christopher

    2018-05-01

    When a patient lacks decision-making capacity, then according to standard clinical ethics practice in the United States, the health care team should seek guidance from a surrogate decision-maker, either previously selected by the patient or appointed by the courts. If there are no surrogates willing or able to exercise substituted judgment, then the team is to choose interventions that promote a patient's best interests. We argue that, even when there is input from a surrogate, patient preferences should be an additional source of guidance for decisions about patients who lack decision-making capacity. Our proposal builds on other efforts to help patients who lack decision-making capacity provide input into decisions about their care. For example, "supported," "assisted," or "guided" decision-making models reflect a commitment to humanistic patient engagement and create a more supportive process for patients, families, and health care teams. But often, they are supportive processes for guiding a patient toward a decision that the surrogate or team believes to be in the patient's medical best interests. Another approach holds that taking seriously the preferences of such a patient can help surrogates develop a better account of what the patient's treatment choices would have been if the patient had retained decision-making capacity; the surrogate then must try to integrate features of the patient's formerly rational self with the preferences of the patient's currently compromised self. Patients who lack decision-making capacity are well served by these efforts to solicit and use their preferences to promote best interests or to craft would-be autonomous patient images for use by surrogates. However, we go further: the moral reasons for valuing the preferences of patients without decision-making capacity are not reducible to either best-interests or (surrogate) autonomy considerations but can be grounded in the values of liberty and respect for persons. This has important consequences for treatment decisions involving these vulnerable patients. © 2018 The Hastings Center.

  2. UV controlling factors and trends derived from the ground-based measurements taken at Belsk, Poland, 1976-1994

    NASA Astrophysics Data System (ADS)

    Krzyścin, Janusz W.

    1996-07-01

    Monthly means of UV erythemal dose at ground level from the Robertson-Berger (RB) sunburn meter (1976-1992) and the UV-Biometer model 501 MED meter (1993-1994) located at Belsk (21°E, 52°N), Poland, are examined. The monthly means are calculated from all-sky daily means of UV erythemal dose. Ancillary measurements of column ozone (by Dobson spectrophotometer), sunshine duration (by Campbell-Stokes heliograph), and total (sun and sky) radiation (by a pyranometer) are considered to explain variations in the UV data. A multiple regression model is proposed to study trends in the UV data. The model accounts for the UV erythemal dose changes induced by total ozone, sunshine duration (surrogate for cloud cover variations), or total solar radiation (surrogate for combined cloud cover and atmospheric turbidity impact on the UV radiation), trends due to instrument drift, step changes in the data, and serial correlations. A strong relationship between monthly all-sky UV erythemal dose changes and total ozone (and total solar radiation) is found. Calculations show that an erythemal radiative amplification factor (RAF) due to ozone under all skies is close to its clear-sky value (about 1). However, the model gives evidence that the RAF due to ozone is smaller for cloudier (and/or more turbid) atmospheres than long-term reference. Total solar radiation change of 1% is associated with a change of 0.7% in the UV erythemal dose. Modeled trends in the Belsk's UV data, inferred from the model using ozone and total solar radiation as the UV forcing factors, are 2.3% ± 0.4% (1σ) per decade in the period 1976-1994. The large increase in the UV erythemal dose, of the order of 4% per decade due to ozone depletion (-3.2% per decade), is partially compensated by a decreasing tendency (-2.8% per decade) in total solar radiation. The model estimates the trend in the UV data of the order of 0.1% per decade (not statistically significant) due to superposition of the instrument drift and long-term effects related to other UV influencing factors (not parameterized by the model).
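
    A stripped-down version of the regression idea is sketched below: monthly log UV dose regressed on log total ozone, log total solar radiation and a linear trend term, with the ozone elasticity playing the role of a radiative amplification factor. The synthetic series and generating coefficients are invented (chosen only to loosely mimic the magnitudes quoted above), and the serial-correlation and step-change terms of the full model are omitted.

```python
# Stripped-down sketch of the regression idea (no autocorrelation or step-change
# terms): regress log monthly UV dose on log total ozone, log total solar
# radiation and a linear trend. The synthetic coefficients used to generate the
# data are hypothetical; the ozone elasticity plays the role of an RAF.
import numpy as np

rng = np.random.default_rng(2)
n = 19 * 12                                          # 19 years of monthly values
t = np.arange(n) / 12.0
ozone = 330 * (1 - 0.032 / 10 * t) * np.exp(rng.normal(0, 0.02, n))
solar = 100 * (1 - 0.028 / 10 * t) * np.exp(rng.normal(0, 0.03, n))
uv = ozone ** -1.0 * solar ** 0.7 * np.exp(rng.normal(0, 0.02, n))

X = np.column_stack([np.ones(n), np.log(ozone), np.log(solar), t])
beta, *_ = np.linalg.lstsq(X, np.log(uv), rcond=None)

print(f"RAF (minus ozone elasticity) ~ {-beta[1]:.2f}")
print(f"solar-radiation elasticity   ~ {beta[2]:.2f}")
print(f"residual trend per decade    ~ {10 * beta[3] * 100:.2f}%")
```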

  3. Shared Decision Making in Intensive Care Units: An American College of Critical Care Medicine and American Thoracic Society Policy Statement

    PubMed Central

    Kon, Alexander A.; Davidson, Judy E.; Morrison, Wynne; Danis, Marion; White, Douglas B.

    2015-01-01

    Objectives Shared decision-making (SDM) is endorsed by critical care organizations; however, there remains confusion about what SDM is, when it should be used, and approaches to promote partnerships in treatment decisions. The purpose of this statement is to define SDM, recommend when SDM should be used, identify the range of ethically acceptable decision-making models, and present important communication skills. Methods The American College of Critical Care Medicine (ACCM) and American Thoracic Society (ATS) Ethics Committees reviewed empirical research and normative analyses published in peer-reviewed journals to generate recommendations. Recommendations approved by consensus of the full Ethics Committees of ACCM and ATS were included in the statement. Main Results Six recommendations were endorsed: 1) Definition: Shared decision-making is a collaborative process that allows patients, or their surrogates, and clinicians to make health care decisions together, taking into account the best scientific evidence available, as well as the patient’s values, goals, and preferences. 2) Clinicians should engage in a SDM process to define overall goals of care (including decisions regarding limiting or withdrawing life-prolonging interventions) and when making major treatment decisions that may be affected by personal values, goals, and preferences. 3) Clinicians should use as their “default” approach a SDM process that includes three main elements: information exchange, deliberation, and making a treatment decision. 4) A wide range of decision-making approaches are ethically supportable, including patient- or surrogate-directed and clinician-directed models. Clinicians should tailor the decision-making process based on the preferences of the patient or surrogate. 5) Clinicians should be trained in communication skills. 6) Research is needed to evaluate decision-making strategies. Conclusions Patient and surrogate preferences for decision-making roles regarding value-laden choices range from preferring to exercise significant authority to ceding such authority to providers. Clinicians should adapt the decision-making model to the needs and preferences of the patient or surrogate. PMID:26509317

  4. Shared Decision Making in ICUs: An American College of Critical Care Medicine and American Thoracic Society Policy Statement.

    PubMed

    Kon, Alexander A; Davidson, Judy E; Morrison, Wynne; Danis, Marion; White, Douglas B

    2016-01-01

    Shared decision making is endorsed by critical care organizations; however, there remains confusion about what shared decision making is, when it should be used, and approaches to promote partnerships in treatment decisions. The purpose of this statement is to define shared decision making, recommend when shared decision making should be used, identify the range of ethically acceptable decision-making models, and present important communication skills. The American College of Critical Care Medicine and American Thoracic Society Ethics Committees reviewed empirical research and normative analyses published in peer-reviewed journals to generate recommendations. Recommendations approved by consensus of the full Ethics Committees of American College of Critical Care Medicine and American Thoracic Society were included in the statement. Six recommendations were endorsed: 1) DEFINITION: Shared decision making is a collaborative process that allows patients, or their surrogates, and clinicians to make healthcare decisions together, taking into account the best scientific evidence available, as well as the patient's values, goals, and preferences. 2) Clinicians should engage in a shared decision making process to define overall goals of care (including decisions regarding limiting or withdrawing life-prolonging interventions) and when making major treatment decisions that may be affected by personal values, goals, and preferences. 3) Clinicians should use as their "default" approach a shared decision making process that includes three main elements: information exchange, deliberation, and making a treatment decision. 4) A wide range of decision-making approaches are ethically supportable, including patient- or surrogate-directed and clinician-directed models. Clinicians should tailor the decision-making process based on the preferences of the patient or surrogate. 5) Clinicians should be trained in communication skills. 6) Research is needed to evaluate decision-making strategies. Patient and surrogate preferences for decision-making roles regarding value-laden choices range from preferring to exercise significant authority to ceding such authority to providers. Clinicians should adapt the decision-making model to the needs and preferences of the patient or surrogate.

  5. Predicting trace organic compound breakthrough in granular activated carbon using fluorescence and UV absorbance as surrogates.

    PubMed

    Anumol, Tarun; Sgroi, Massimiliano; Park, Minkyu; Roccaro, Paolo; Snyder, Shane A

    2015-06-01

    This study investigated the applicability of bulk organic parameters like dissolved organic carbon (DOC), UV absorbance at 254 nm (UV254), and total fluorescence (TF) to act as surrogates in predicting trace organic compound (TOrC) removal by granular activated carbon in water reuse applications. Using rapid small-scale column testing, empirical linear correlations for thirteen TOrCs were determined with DOC, UV254, and TF in four wastewater effluents. Linear correlations (R2 > 0.7) were obtained for eight TOrCs in each water quality in the UV254 model, while ten TOrCs had R2 > 0.7 in the TF model. Conversely, DOC was shown to be a poor surrogate for TOrC breakthrough prediction. When the data from all four water qualities were combined, good linear correlations were still obtained, with TF having higher R2 than UV254, especially for TOrCs with log Dow > 1. Excellent linear relationships (R2 > 0.9) between log Dow and the removal of TOrC at 0% surrogate removal (y-intercept) were obtained for the five neutral TOrCs tested in this study. Positively charged TOrCs had enhanced removals due to electrostatic interactions with negatively charged GAC that caused them to deviate from removals that would be expected with their log Dow. Application of the empirical linear correlation models to full-scale samples provided good results for six of seven TOrCs (except meprobamate) tested when comparing predicted TOrC removal by UV254 and TF with actual removals for GAC in all five samples tested. Surrogate predictions using UV254 and TF provide valuable tools for rapid or on-line monitoring of GAC performance and can result in cost savings through extended GAC run times as compared to using DOC breakthrough to trigger regeneration or replacement. Copyright © 2015 Elsevier Ltd. All rights reserved.
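
    The sketch below shows the basic surrogate-correlation step on synthetic data: TOrC removal fitted as a linear function of UV254 (or TF) removal, with the intercept giving the expected removal at 0% surrogate removal. The numbers are placeholders, not results from the column tests.

```python
# Minimal sketch of the surrogate-correlation step: fit TOrC removal as a linear
# function of UV254 (or total fluorescence) removal and report R^2. The paired
# removal data below are synthetic stand-ins for column-test measurements.
import numpy as np

rng = np.random.default_rng(3)
uv254_removal = rng.uniform(0.1, 0.9, 40)                    # fraction of UV254 removed
torc_removal = 0.15 + 0.9 * uv254_removal + rng.normal(0, 0.05, 40)
torc_removal = np.clip(torc_removal, 0, 1)

slope, intercept = np.polyfit(uv254_removal, torc_removal, 1)
pred = intercept + slope * uv254_removal
r2 = 1 - np.sum((torc_removal - pred) ** 2) / np.sum((torc_removal - torc_removal.mean()) ** 2)

# The intercept is the expected TOrC removal at 0% surrogate removal, the
# quantity the study relates to log Dow for neutral compounds.
print(f"removal = {intercept:.2f} + {slope:.2f} * UV254 removal, R^2 = {r2:.2f}")
```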

  6. Stable isotope signatures and trophic-step fractionation factors of fish tissues collected as non-lethal surrogates of dorsal muscle.

    PubMed

    Busst, Georgina M A; Bašić, Tea; Britton, J Robert

    2015-08-30

    Dorsal white muscle is the standard tissue analysed in fish trophic studies using stable isotope analyses. As muscle is usually collected destructively, fin tissues and scales are often used as non-lethal surrogates; we examined the utility of scales and fin tissue as muscle surrogates. The muscle, fin and scale δ15N and δ13C values from 10 cyprinid fish species determined with an elemental analyser coupled with an isotope ratio mass spectrometer were compared. The fish comprised (1) samples from the wild, and (2) samples from tank aquaria, using six species held for 120 days and fed a single food resource. Relationships between muscle, fin and scale isotope ratios were examined for each species and for the entire dataset, with the efficacy of four methods of predicting muscle isotope ratios from fin and scale values being tested. The fractionation factors between the three tissues of the laboratory fishes and their food resource were then calculated and applied to Bayesian mixing models to assess their effect on fish diet predictions. The isotopic data of the three tissues per species were distinct, but were significantly related, enabling estimations of muscle values from the two surrogates. Species-specific equations provided the least erroneous corrections of scale and fin isotope ratios (errors < 0.6‰). The fractionation factors for δ15N values were in the range obtained for other species, but were often higher for δ13C values. Their application to data from two fish populations in the mixing models resulted in significant alterations in diet predictions. Scales and fin tissue are strong surrogates of dorsal muscle in food web studies as they can provide estimates of muscle values within an acceptable level of error when species-specific methods are used. Their derived fractionation factors can also be applied to models predicting fish diet composition from δ15N and δ13C values. Copyright © 2015 John Wiley & Sons, Ltd.

  7. Fusion set selection with surrogate metric in multi-atlas based image segmentation

    NASA Astrophysics Data System (ADS)

    Zhao, Tingting; Ruan, Dan

    2016-02-01

    Multi-atlas based image segmentation sees unprecedented opportunities but also demanding challenges in the big data era. Relevant atlas selection before label fusion plays a crucial role in reducing potential performance loss from heterogeneous data quality and high computation cost from extensive data. This paper starts with investigating the image similarity metric (termed ‘surrogate’), an alternative to the inaccessible geometric agreement metric (termed ‘oracle’) in atlas relevance assessment, and probes into the problem of how to select the ‘most-relevant’ atlases and how many such atlases to incorporate. We propose an inference model to relate the surrogates and the oracle geometric agreement metrics. Based on this model, we quantify the behavior of the surrogates in mimicking oracle metrics for atlas relevance ordering. Finally, analytical insights on the choice of fusion set size are presented from a probabilistic perspective, with the integrated goal of including the most relevant atlases and excluding the irrelevant ones. Empirical evidence and performance assessment are provided based on prostate and corpus callosum segmentation.

  8. The effectiveness of surrogate taxa to conserve freshwater biodiversity

    USGS Publications Warehouse

    Stewart, David R.; Underwood, Zachary E.; Rahel, Frank J.; Walters, Annika W.

    2018-01-01

    Establishing protected areas has long been an effective conservation strategy, and is often based on more readily surveyed species. The potential of any freshwater taxa to be a surrogate of other aquatic groups has not been fully explored. We compiled occurrence data on 72 species of freshwater fish, amphibians, mussels, and aquatic reptiles for the Great Plains, Wyoming. We used hierarchical Bayesian multi-species mixture models and MaxEnt models to describe species distributions, and program Zonation to identify conservation priority areas for each aquatic group. The landscape-scale factors that best characterized aquatic species distributions differed among groups. There was low agreement and congruence among taxa-specific conservation priorities (<20%), meaning that no surrogate priority areas would include or protect the best habitats of other aquatic taxa. We found that common, wide-ranging aquatic species were included in taxa-specific priority areas, but rare freshwater species were not included. Thus, the development of conservation priorities based on a single freshwater aquatic group would not protect all species in the other aquatic groups.

  9. Mimicking Retention and Transport of Rotavirus and Adenovirus in Sand Media Using DNA-labeled, Protein-coated Silica Nanoparticles

    NASA Astrophysics Data System (ADS)

    Pang, Liping; Farkas, Kata; Bennett, Grant; Varsani, Arvind; Easingwood, Richard; Tilley, Richard; Nowostawska, Urszula; Lin, Susan

    2014-05-01

    Rotavirus (RoV) and adenovirus (AdV) are important viral pathogens for the risk analysis of drinking water. Despite this, little is known about their retention and transport behaviors in porous media (e.g. sand filters used for water treatment and groundwater aquifers) due to a lack of representative surrogates. In this study, we developed RoV and AdV surrogates by covalently coating 70-nm sized silica nanoparticles with specific proteins and a DNA marker for sensitive detection. Filtration experiments using beach sand columns demonstrated the similarity of the surrogates' concentrations, attachment, and filtration efficiencies to the target viruses. The surrogates showed the same magnitude of concentration reduction as the viruses. Conversely, MS2 phage (a traditional virus model) overpredicted concentrations of AdV and RoV by 1 and 2 orders of magnitude, respectively. The surrogates remained stable in size, surface charge and DNA concentration for at least one year. They can be easily and rapidly detected at concentrations down to one particle per PCR reaction and are readily detectable in natural waters and even in effluent. With up-scaling validation in pilot trials, the surrogates can be a useful cost-effective new tool for studying virus retention and transport in porous media, e.g. for assessing filter efficiency in water and wastewater treatment, tracking virus migration in groundwater after effluent land disposal, and establishing safe setback distances for groundwater protection.

  10. Uncertainty quantification for accident management using ACE surrogates

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Varuttamaseni, A.; Lee, J. C.; Youngblood, R. W.

    The alternating conditional expectation (ACE) regression method is used to generate RELAP5 surrogates which are then used to determine the distribution of the peak clad temperature (PCT) during the loss of feedwater accident coupled with a subsequent initiation of the feed and bleed (F and B) operation in the Zion-1 nuclear power plant. The construction of the surrogates assumes conditional independence relations among key reactor parameters. The choice of parameters to model is based on the macroscopic balance statements governing the behavior of the reactor. The peak clad temperature is calculated based on the independent variables that are known to be important in determining the success of the F and B operation. The relationship between these independent variables and the plant parameters such as coolant pressure and temperature is represented by surrogates that are constructed based on 45 RELAP5 cases. The time-dependent PCT for different values of F and B parameters is calculated by sampling the independent variables from their probability distributions and propagating the information through two layers of surrogates. The results of our analysis show that the ACE surrogates are able to satisfactorily reproduce the behavior of the plant parameters even though a quasi-static assumption is primarily used in their construction. The PCT is found to be lower in cases where the F and B operation is initiated, compared to the case without F and B, regardless of the F and B parameters used. (authors)
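
    The propagation step described above can be sketched generically: sample the independent variables from their distributions and push them through two layers of inexpensive surrogate functions to obtain a distribution of the peak clad temperature. The two "surrogate" functions below are arbitrary stand-ins, not the ACE-fitted RELAP5 surrogates.

```python
# Generic sketch of the uncertainty-propagation step: sample the independent
# variables and push them through two layers of cheap surrogate functions to get
# a PCT distribution. Both surrogate functions below are arbitrary stand-ins.
import numpy as np

rng = np.random.default_rng(4)
n = 20_000
bleed_delay = rng.uniform(5.0, 30.0, n)            # minutes until F&B initiation
relief_capacity = rng.normal(1.0, 0.1, n)          # normalized bleed capacity

# Layer 1: plant-state surrogates (e.g. coolant pressure and temperature)
pressure = 15.0 - 0.2 * relief_capacity * np.minimum(bleed_delay, 20.0)
coolant_t = 560.0 + 1.5 * bleed_delay

# Layer 2: peak-clad-temperature surrogate built on the layer-1 outputs
pct = 650.0 + 0.8 * coolant_t - 12.0 * pressure + rng.normal(0, 15.0, n)

print(f"PCT mean = {pct.mean():.0f} K, 95th percentile = {np.percentile(pct, 95):.0f} K")
```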

  11. The Relationship Between Surface Curvature and Abdominal Aortic Aneurysm Wall Stress.

    PubMed

    de Galarreta, Sergio Ruiz; Cazón, Aitor; Antón, Raúl; Finol, Ender A

    2017-08-01

    The maximum diameter (MD) criterion is the most important factor when predicting risk of rupture of abdominal aortic aneurysms (AAAs). An elevated wall stress has also been linked to a high risk of aneurysm rupture, yet it is uncommon in clinical practice to compute AAA wall stress. The purpose of this study is to assess whether other characteristics of the AAA geometry are statistically correlated with wall stress. Using in-house segmentation and meshing algorithms, 30 patient-specific AAA models were generated for finite element analysis (FEA). These models were subsequently used to estimate wall stress and maximum diameter and to evaluate the spatial distributions of wall thickness, cross-sectional diameter, mean curvature, and Gaussian curvature. Data analysis consisted of statistical correlations of the aforementioned geometry metrics with wall stress for the 30 AAA inner and outer wall surfaces. In addition, a linear regression analysis was performed with all the AAA wall surfaces to quantify the relationship of the geometric indices with wall stress. These analyses indicated that while all the geometry metrics have statistically significant correlations with wall stress, the local mean curvature (LMC) exhibits the highest average Pearson's correlation coefficient for both inner and outer wall surfaces. The linear regression analysis revealed coefficients of determination for the outer and inner wall surfaces of 0.712 and 0.516, respectively, with LMC having the largest effect on the linear regression equation with wall stress. This work underscores the importance of evaluating AAA mean wall curvature as a potential surrogate for wall stress.

  12. Numerical relativity waveform surrogate model for generically precessing binary black hole mergers

    NASA Astrophysics Data System (ADS)

    Blackman, Jonathan; Field, Scott E.; Scheel, Mark A.; Galley, Chad R.; Ott, Christian D.; Boyle, Michael; Kidder, Lawrence E.; Pfeiffer, Harald P.; Szilágyi, Béla

    2017-07-01

    A generic, noneccentric binary black hole (BBH) system emits gravitational waves (GWs) that are completely described by seven intrinsic parameters: the black hole spin vectors and the ratio of their masses. Simulating a BBH coalescence by solving Einstein's equations numerically is computationally expensive, requiring days to months of computing resources for a single set of parameter values. Since theoretical predictions of the GWs are often needed for many different source parameters, a fast and accurate model is essential. We present the first surrogate model for GWs from the coalescence of BBHs including all seven dimensions of the intrinsic noneccentric parameter space. The surrogate model, which we call NRSur7dq2, is built from the results of 744 numerical relativity simulations. NRSur7dq2 covers spin magnitudes up to 0.8 and mass ratios up to 2, includes all ℓ≤4 modes, begins about 20 orbits before merger, and can be evaluated in ˜50 ms . We find the largest NRSur7dq2 errors to be comparable to the largest errors in the numerical relativity simulations, and more than an order of magnitude smaller than the errors of other waveform models. Our model, and more broadly the methods developed here, will enable studies that were not previously possible when using highly accurate waveforms, such as parameter inference and tests of general relativity with GW observations.
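
    Surrogates of this type are evaluated as short sums of basis functions with parameter-dependent coefficients, which is what makes millisecond evaluation possible. The toy sketch below shows only that evaluation pattern; the basis, coefficient fits and parameters are invented and are unrelated to the actual NRSur7dq2 data.

```python
# Generic illustration of how this class of surrogate is evaluated: a waveform is
# reconstructed as a short sum of empirical basis functions whose coefficients
# are cheap fits over the parameter space. Everything here (basis, coefficient
# fits, parameters) is a toy stand-in, not NRSur7dq2.
import numpy as np

t = np.linspace(-1000.0, 0.0, 2000)              # time samples (arbitrary units)

# Toy "empirical" basis e_i(t) and toy coefficient fits c_i(q) in the mass ratio q
basis = np.array([np.cos(0.05 * t), np.sin(0.05 * t), np.cos(0.1 * t)])
coef_fits = [lambda q: 1.0 + 0.1 * q,
             lambda q: 0.5 * q,
             lambda q: 0.05 * q ** 2]

def surrogate_waveform(q):
    """Evaluate h(t; q) = sum_i c_i(q) * e_i(t) for one parameter value."""
    coeffs = np.array([c(q) for c in coef_fits])
    return coeffs @ basis                        # fast: no differential equations solved

h = surrogate_waveform(1.7)
print(h.shape, float(h[0]))
```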

  13. Fiber orientation interpolation for the multiscale analysis of short fiber reinforced composite parts

    NASA Astrophysics Data System (ADS)

    Köbler, Jonathan; Schneider, Matti; Ospald, Felix; Andrä, Heiko; Müller, Ralf

    2018-06-01

    For short fiber reinforced plastic parts, the local fiber orientation has a strong influence on the mechanical properties. To enable multiscale computations using surrogate models we advocate a two-step identification strategy. Firstly, for a number of sample orientations an effective model is derived by numerical methods available in the literature. Secondly, to cover a general orientation state, these effective models are interpolated. In this article we develop a novel and effective strategy to carry out this interpolation. Firstly, taking into account symmetry arguments, we reduce the fiber orientation phase space to a triangle in R^2. For an associated triangulation of this triangle we furnish each node with a surrogate model. Then, we use linear interpolation on the fiber orientation triangle to equip each fiber orientation state with an effective stress. The proposed approach is quite general, and works for any physically nonlinear constitutive law on the micro-scale, as long as surrogate models for single fiber orientation states can be extracted. To demonstrate the capabilities of our scheme we study the viscoelastic creep behavior of short glass fiber reinforced PA66, and use Schapery's collocation method together with FFT-based computational homogenization to derive single orientation state effective models. We discuss the efficient implementation of our method, and present results of a component scale computation on a benchmark component by using ABAQUS®.
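
    The interpolation step itself amounts to barycentric weighting on the orientation triangle, as sketched below. The node coordinates and the nodal "effective stress" functions are toy stand-ins for the homogenization-derived surrogate models.

```python
# Sketch of the interpolation step only: a fiber-orientation state mapped into
# the reduced triangle gets barycentric weights with respect to the triangle's
# nodes, and the nodal surrogate responses are blended linearly. The node
# coordinates and nodal "effective stress" functions are toy values.
import numpy as np

# Triangle nodes in the reduced orientation space (2D), with a surrogate per node
nodes = np.array([[0.0, 0.0], [1.0, 0.0], [0.5, 0.8]])
node_models = [lambda eps: 3000.0 * eps,        # e.g. nearly isotropic response
               lambda eps: 5000.0 * eps,        # strongly aligned response
               lambda eps: 4000.0 * eps]        # planar-random response

def barycentric_weights(p, tri):
    a, b, c = tri
    m = np.column_stack([b - a, c - a])
    w2, w3 = np.linalg.solve(m, p - a)
    return np.array([1.0 - w2 - w3, w2, w3])

def effective_stress(p, eps):
    """Blend the nodal surrogate predictions with barycentric weights."""
    w = barycentric_weights(p, nodes)
    return float(sum(wi * model(eps) for wi, model in zip(w, node_models)))

print(effective_stress(np.array([0.4, 0.3]), eps=0.01))
```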

  14. Fiber orientation interpolation for the multiscale analysis of short fiber reinforced composite parts

    NASA Astrophysics Data System (ADS)

    Köbler, Jonathan; Schneider, Matti; Ospald, Felix; Andrä, Heiko; Müller, Ralf

    2018-04-01

    For short fiber reinforced plastic parts, the local fiber orientation has a strong influence on the mechanical properties. To enable multiscale computations using surrogate models we advocate a two-step identification strategy. Firstly, for a number of sample orientations an effective model is derived by numerical methods available in the literature. Secondly, to cover a general orientation state, these effective models are interpolated. In this article we develop a novel and effective strategy to carry out this interpolation. Firstly, taking into account symmetry arguments, we reduce the fiber orientation phase space to a triangle in R^2. For an associated triangulation of this triangle we furnish each node with a surrogate model. Then, we use linear interpolation on the fiber orientation triangle to equip each fiber orientation state with an effective stress. The proposed approach is quite general, and works for any physically nonlinear constitutive law on the micro-scale, as long as surrogate models for single fiber orientation states can be extracted. To demonstrate the capabilities of our scheme we study the viscoelastic creep behavior of short glass fiber reinforced PA66, and use Schapery's collocation method together with FFT-based computational homogenization to derive single orientation state effective models. We discuss the efficient implementation of our method, and present results of a component scale computation on a benchmark component by using ABAQUS®.

  15. Evaluation of the validity of treatment decisions based on surrogate country models before introduction of the Polish FRAX and recommendations in comparison to current practice

    PubMed Central

    Narloch, Jerzy; Glinkowska, Bożena; Bandura, Małgorzata

    2016-01-01

    Introduction Patients diagnosed before the Polish FRAX was introduced may require re-evaluation and treatment changes if the diagnosis was established according to a surrogate country FRAX score. The aim of the study was to evaluate the validity of treatment decisions based on the surrogate country model before introduction of the Polish FRAX and to provide recommendations based on the current practice. Material and methods We evaluated a group of 142 postmenopausal women (70.7 ±8.9 years) who underwent bone mineral density measurements. We used 22 country-specific FRAX models and compared these to the Polish model. Results The mean risk values for hip and major osteoporotic fractures within 10 years were 4.575 (from 0.82 to 8.46) and 12.47% (from 2.18 to 21.65), respectively. In the case of a major fracture, 94.4% of women would receive lifestyle advice, and 5.6% would receive treatment according to the Polish FRAX using the guidelines of the National Osteoporosis Foundation (NOF). Polish treatment thresholds would implement pharmacotherapy in 32.4% of the study group. In the case of hip fractures, 45% of women according to the NOF would require pharmacotherapy but only 9.8% of women would qualify according to Polish guidelines. Nearly all surrogate FRAX calculator scores proved significantly different from Polish (p > 0.05). Conclusions More patients might have received antiresorptive medication before the Polish FRAX. This study recommends re-evaluation of patients who received medical therapy before the Polish FRAX was introduced and a review of the recommendations, considering the side effects of antiresorptive medication. PMID:29593808

  16. Evaluation of the validity of treatment decisions based on surrogate country models before introduction of the Polish FRAX and recommendations in comparison to current practice.

    PubMed

    Glinkowski, Wojciech M; Narloch, Jerzy; Glinkowska, Bożena; Bandura, Małgorzata

    2018-03-01

    Patients diagnosed before the Polish FRAX was introduced may require re-evaluation and treatment changes if the diagnosis was established according to a surrogate country FRAX score. The aim of the study was to evaluate the validity of treatment decisions based on the surrogate country model before introduction of the Polish FRAX and to provide recommendations based on the current practice. We evaluated a group of 142 postmenopausal women (70.7 ±8.9 years) who underwent bone mineral density measurements. We used 22 country-specific FRAX models and compared these to the Polish model. The mean risk values for hip and major osteoporotic fractures within 10 years were 4.575 (from 0.82 to 8.46) and 12.47% (from 2.18 to 21.65), respectively. In the case of a major fracture, 94.4% of women would receive lifestyle advice, and 5.6% would receive treatment according to the Polish FRAX using the guidelines of the National Osteoporosis Foundation (NOF). Polish treatment thresholds would implement pharmacotherapy in 32.4% of the study group. In the case of hip fractures, 45% of women according to the NOF would require pharmacotherapy but only 9.8% of women would qualify according to Polish guidelines. Nearly all surrogate FRAX calculator scores proved significantly different from Polish (p > 0.05). More patients might have received antiresorptive medication before the Polish FRAX. This study recommends re-evaluation of patients who received medical therapy before the Polish FRAX was introduced and a review of the recommendations, considering the side effects of antiresorptive medication.

  17. Projecting climate change impacts on hydrology: the potential role of daily GCM output

    NASA Astrophysics Data System (ADS)

    Maurer, E. P.; Hidalgo, H. G.; Das, T.; Dettinger, M. D.; Cayan, D.

    2008-12-01

    A primary challenge facing resource managers in accommodating climate change is determining the range and uncertainty in regional and local climate projections. This is especially important for assessing changes in extreme events, which will drive many of the more severe impacts of a changed climate. Since global climate models (GCMs) produce output at a spatial scale incompatible with local impact assessment, different techniques have evolved to downscale GCM output so locally important climate features are expressed in the projections. We compared skill and hydrologic projections using two statistical downscaling methods and a distributed hydrology model. The downscaling methods are the constructed analogues (CA) and the bias correction and spatial downscaling (BCSD). CA uses daily GCM output, and can thus capture GCM projections for changing extreme event occurrence, while BCSD uses monthly output and statistically generates historical daily sequences. We evaluate the hydrologic impacts projected using downscaled climate (from the NCEP/NCAR reanalysis as a surrogate GCM) for the late 20th century with both methods, comparing skill in projecting soil moisture, snow pack, and streamflow at key locations in the Western United States. We include an assessment of a new method for correcting for GCM biases in a hybrid method combining the most important characteristics of both methods.
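
    The bias-correction half of BCSD can be sketched as empirical quantile mapping: each model value is replaced by the observed value at the same quantile of the historical model distribution. The monthly precipitation series below are synthetic, and the real procedure includes spatial-disaggregation steps not shown here.

```python
# Simplified sketch of the bias-correction ("BC") idea behind BCSD: map each GCM
# value to the observed climatology by matching empirical quantiles. The
# "observed" and "GCM" monthly precipitation series below are synthetic.
import numpy as np

rng = np.random.default_rng(5)
obs = rng.gamma(shape=2.0, scale=40.0, size=600)          # observed monthly precip (mm)
gcm_hist = rng.gamma(shape=2.0, scale=55.0, size=600)     # biased GCM, historical period
gcm_future = rng.gamma(shape=2.0, scale=60.0, size=240)   # biased GCM, future period

def quantile_map(x, model_ref, obs_ref):
    """Replace each model value with the observed value at the same quantile."""
    q = np.searchsorted(np.sort(model_ref), x) / len(model_ref)
    q = np.clip(q, 0.0, 1.0)
    return np.quantile(obs_ref, q)

corrected = quantile_map(gcm_future, gcm_hist, obs)
print(f"raw future mean = {gcm_future.mean():.0f} mm, corrected = {corrected.mean():.0f} mm")
```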

  18. Computing mammographic density from a multiple regression model constructed with image-acquisition parameters from a full-field digital mammographic unit

    PubMed Central

    Lu, Lee-Jane W.; Nishino, Thomas K.; Khamapirad, Tuenchit; Grady, James J; Leonard, Morton H.; Brunder, Donald G.

    2009-01-01

    Breast density (the percentage of fibroglandular tissue in the breast) has been suggested to be a useful surrogate marker for breast cancer risk. It is conventionally measured using screen-film mammographic images by a labor-intensive histogram segmentation method (HSM). We have adapted and modified the HSM for measuring breast density from raw digital mammograms acquired by full-field digital mammography. Multiple regression model analyses showed that many of the instrument parameters for acquiring the screening mammograms (e.g. breast compression thickness, radiological thickness, radiation dose, compression force, etc.) and image pixel intensity statistics of the imaged breasts were strong predictors of the observed threshold values (model R2=0.93) and %-density (R2=0.84). The intra-class correlation coefficient of the %-density for duplicate images was estimated to be 0.80, using the regression model-derived threshold values, and 0.94 if estimated directly from the parameter estimates of the %-density prediction regression model. Therefore, with additional research, these mathematical models could be used to compute breast density objectively, automatically bypassing the HSM step, and could greatly facilitate breast cancer research studies. PMID:17671343

  19. Phospholipid and Respiratory Quinone Analyses From Extreme Environments

    NASA Astrophysics Data System (ADS)

    Pfiffner, S. M.

    2008-12-01

    Extreme environments on Earth have been chosen as surrogate sites to test methods and strategies for the deployment of space craft in the search for extraterrestrial life. Surrogate sites for many of the NASA astrobiology institutes include the South African gold mines, Canadian subpermafrost, Atacama Desert, and acid rock drainage. Soils, sediments, rock cores, fracture waters, biofilms, and service and drill waters represent the types of samples collected from these sites. These samples were analyzed by gas chromatography mass spectrometry for phospholipid fatty acid methyl esters and by high performance liquid chromatography atmospheric pressure chemical ionization tandem mass spectrometry for respiratory quinones. Phospholipid analyses provided estimates of biomass, community composition, and compositional changes related to nutritional limitations or exposure to toxic conditions. Similar to phospholipid analyses, respiratory quinone analyses afforded identification of certain types of microorganisms in the community based on respiration and offered clues to in situ redox conditions. Depending on the number of samples analyzed, selected multivariate statistical methods were applied to relate membrane lipid results with site biogeochemical parameters. Successful detection of life signatures and refinement of methodologies at surrogate sites on Earth will be critical for the recognition of extraterrestrial life. At this time, membrane lipid analyses provide useful information not easily obtained by other molecular techniques.

  20. Development of size-selective sampling of Bacillus anthracis surrogate spores from simulated building air intake mixtures for analysis via laser-induced breakdown spectroscopy.

    PubMed

    Gibb-Snyder, Emily; Gullett, Brian; Ryan, Shawn; Oudejans, Lukas; Touati, Abderrahmane

    2006-08-01

    Size-selective sampling of Bacillus anthracis surrogate spores from realistic, common aerosol mixtures was developed for analysis by laser-induced breakdown spectroscopy (LIBS). A two-stage impactor was found to be the preferential sampling technique for LIBS analysis because it was able to concentrate the spores in the mixtures while decreasing the collection of potentially interfering aerosols. Three common spore/aerosol scenarios were evaluated: diesel truck exhaust (to simulate a truck running outside of a building air intake), urban outdoor aerosol (to simulate common building air), and finally a protein aerosol (to simulate either an agent mixture (ricin/anthrax) or a contaminated anthrax sample). Two statistical methods, linear correlation and principal component analysis, were assessed for differentiation of surrogate spore spectra from other common aerosols. Criteria for determining percentages of false positives and false negatives via correlation analysis were evaluated. A single laser shot analysis of approximately 4 percent of the spores in a mixture of 0.75 m3 urban outdoor air doped with approximately 1.1 x 10^5 spores resulted in a 0.04 proportion of false negatives. For that same sample volume of urban air without spores, the proportion of false positives was 0.08.

  1. Head Circumference as a Useful Surrogate for Intracranial Volume in Older Adults

    PubMed Central

    Hshieh, Tammy T.; Fox, Meaghan L.; Kosar, Cyrus M.; Cavallari, Michele; Guttmann, Charles R.G.; Alsop, David; Marcantonio, Edward R.; Schmitt, Eva M.; Jones, Richard N.; Inouye, Sharon K.

    2015-01-01

    Background Intracranial volume (ICV) has been proposed as a measure of maximum lifetime brain size. Accurate ICV measures require neuroimaging which is not always feasible for epidemiologic investigations. We examined head circumference as a useful surrogate for intracranial volume in older adults. Methods 99 older adults underwent Magnetic Resonance Imaging (MRI). ICV was measured by Statistical Parametric Mapping 8 (SPM8) software or Functional MRI of the Brain Software Library (FSL) extraction with manual editing, typically considered the gold standard. Head circumferences were determined using standardized tape measurement. We examined estimated correlation coefficients between head circumference and the two MRI-based ICV measurements. Results Head circumference and ICV by SPM8 were moderately correlated (overall r=0.73, men r=0.67, women r=0.63). Head circumference and ICV by FSL were also moderately correlated (overall r=0.69, men r=0.63, women r=0.49). Conclusions Head circumference measurement was strongly correlated with MRI-derived ICV. Our study presents a simple method to approximate ICV among older patients, which may prove useful as a surrogate for cognitive reserve in large scale epidemiologic studies of cognitive outcomes. This study also suggests the stability of head circumference correlation with ICV throughout the lifespan. PMID:26631180

  2. Quantitative bone scan lesion area as an early surrogate outcome measure indicative of overall survival in metastatic prostate cancer.

    PubMed

    Brown, Matthew S; Kim, Grace Hyun J; Chu, Gregory H; Ramakrishna, Bharath; Allen-Auerbach, Martin; Fischer, Cheryce P; Levine, Benjamin; Gupta, Pawan K; Schiepers, Christiaan W; Goldin, Jonathan G

    2018-01-01

    A clinical validation of the bone scan lesion area (BSLA) as a quantitative imaging biomarker was performed in metastatic castration-resistant prostate cancer (mCRPC). BSLA was computed from whole-body bone scintigraphy at baseline and week 12 posttreatment in a cohort of 198 mCRPC subjects (127 treated and 71 placebo) from a clinical trial involving a different drug from the initial biomarker development. BSLA computation involved automated image normalization, lesion segmentation, and summation of the total area of segmented lesions on bone scan AP and PA views as a measure of tumor burden. As a predictive biomarker, treated subjects with baseline BSLA [Formula: see text] had longer survival than those with higher BSLA ([Formula: see text] and [Formula: see text]). As a surrogate outcome biomarker, subjects were categorized as progressive disease (PD) if the BSLA increased by a prespecified 30% or more from baseline to week 12 and non-PD otherwise. Overall survival rates between PD and non-PD groups were statistically different ([Formula: see text] and [Formula: see text]). Subjects without PD at week 12 had longer survival than subjects with PD: median 398 days versus 280 days. BSLA has now been demonstrated to be an early surrogate outcome for overall survival in different prostate cancer drug treatments.
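
    The surrogate-outcome dichotomization described above can be sketched as follows: flag progressive disease when week-12 BSLA rises by 30% or more from baseline, then compare survival between the PD and non-PD groups with a log-rank test. The BSLA and survival values below are synthetic, and the lifelines package is assumed to be available for the test.

```python
# Sketch of the surrogate-outcome dichotomization described above: flag
# progressive disease when week-12 BSLA rises >= 30% from baseline, then compare
# survival between PD and non-PD groups with a log-rank test. BSLA and survival
# values are synthetic; the lifelines package is assumed to be installed.
import numpy as np
from lifelines.statistics import logrank_test

rng = np.random.default_rng(6)
n = 198
bsla_base = rng.uniform(1.0, 40.0, n)                      # baseline lesion area (cm^2)
bsla_wk12 = bsla_base * rng.lognormal(mean=0.0, sigma=0.35, size=n)

pd_flag = (bsla_wk12 - bsla_base) / bsla_base >= 0.30      # >= 30% increase => PD

# Synthetic survival: PD subjects drawn with shorter times on average
surv_days = np.where(pd_flag,
                     rng.exponential(280.0, n),
                     rng.exponential(400.0, n))
event = rng.uniform(size=n) < 0.8                          # True = death observed

res = logrank_test(surv_days[pd_flag], surv_days[~pd_flag],
                   event_observed_A=event[pd_flag],
                   event_observed_B=event[~pd_flag])
print(f"PD fraction = {pd_flag.mean():.2f}, log-rank p = {res.p_value:.3g}")
```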

  3. Development of surrogate models for the prediction of the flow around an aircraft propeller

    NASA Astrophysics Data System (ADS)

    Salpigidou, Christina; Misirlis, Dimitris; Vlahostergios, Zinon; Yakinthos, Kyros

    2018-05-01

    In the present work, the derivation of two surrogate models (SMs) for modelling the flow around a propeller for small aircraft is presented. Both methodologies use derived functions based on computations with the detailed propeller geometry. The computations were performed using the k-ω shear stress transport turbulence model. In the SMs, the modelling of the propeller was performed in a computational domain of disk-like geometry, where source terms were introduced in the momentum equations. In the first SM, the source terms were polynomial functions of swirl and thrust, mainly related to the propeller radius. In the second SM, regression analysis was used to correlate the source terms with the velocity distribution through the propeller. The proposed SMs achieved faster convergence relative to the detailed model, while also providing results closer to the available operational data. The regression-based model was the most accurate and required less computational time for convergence.

  4. Replicability of time-varying connectivity patterns in large resting state fMRI samples.

    PubMed

    Abrol, Anees; Damaraju, Eswar; Miller, Robyn L; Stephen, Julia M; Claus, Eric D; Mayer, Andrew R; Calhoun, Vince D

    2017-12-01

    The past few years have seen an emergence of approaches that leverage temporal changes in whole-brain patterns of functional connectivity (the chronnectome). In this chronnectome study, we investigate the replicability of the human brain's inter-regional coupling dynamics during rest by evaluating two different dynamic functional network connectivity (dFNC) analysis frameworks using 7 500 functional magnetic resonance imaging (fMRI) datasets. To quantify the extent to which the emergent functional connectivity (FC) patterns are reproducible, we characterize the temporal dynamics by deriving several summary measures across multiple large, independent age-matched samples. Reproducibility was demonstrated through the existence of basic connectivity patterns (FC states) amidst an ensemble of inter-regional connections. Furthermore, application of the methods to conservatively configured (statistically stationary, linear and Gaussian) surrogate datasets revealed that some of the studied state summary measures were indeed statistically significant and also suggested that this class of null model did not explain the fMRI data fully. This extensive testing of reproducibility of similarity statistics also suggests that the estimated FC states are robust against variation in data quality, analysis, grouping, and decomposition methods. We conclude that future investigations probing the functional and neurophysiological relevance of time-varying connectivity assume critical importance. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
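
    One standard way to build the kind of stationary, linear, Gaussian null data mentioned above is Fourier phase randomization, which preserves each time course's power spectrum while destroying genuine time-varying coupling. The sketch below may differ in detail from the surrogate scheme used in the study, and the input series is synthetic.

```python
# Sketch of one standard way to build "stationary, linear, Gaussian" null data:
# Fourier phase randomization, which keeps a time course's power spectrum but
# destroys any genuine time-varying coupling. This may differ in detail from the
# surrogate scheme used in the study; the input time series here is synthetic.
import numpy as np

def phase_randomize(x, rng):
    """Return a surrogate with the same amplitude spectrum as x but random phases."""
    spec = np.fft.rfft(x - x.mean())
    phases = rng.uniform(0.0, 2.0 * np.pi, size=spec.shape)
    phases[0] = 0.0                                # keep the DC component real
    phases[-1] = 0.0                               # and the Nyquist bin (even-length input)
    surrogate = np.fft.irfft(np.abs(spec) * np.exp(1j * phases), n=len(x))
    return surrogate + x.mean()

rng = np.random.default_rng(7)
ts = np.cumsum(rng.normal(size=400))               # a synthetic "fMRI" time course
null_ts = phase_randomize(ts, rng)
print(np.round([ts.std(), null_ts.std()], 2))      # the standard deviation is preserved
```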

  5. Replicability of time-varying connectivity patterns in large resting state fMRI samples

    PubMed Central

    Abrol, Anees; Damaraju, Eswar; Miller, Robyn L.; Stephen, Julia M.; Claus, Eric D.; Mayer, Andrew R.; Calhoun, Vince D.

    2018-01-01

    The past few years have seen an emergence of approaches that leverage temporal changes in whole-brain patterns of functional connectivity (the chronnectome). In this chronnectome study, we investigate the replicability of the human brain’s inter-regional coupling dynamics during rest by evaluating two different dynamic functional network connectivity (dFNC) analysis frameworks using 7 500 functional magnetic resonance imaging (fMRI) datasets. To quantify the extent to which the emergent functional connectivity (FC) patterns are reproducible, we characterize the temporal dynamics by deriving several summary measures across multiple large, independent age-matched samples. Reproducibility was demonstrated through the existence of basic connectivity patterns (FC states) amidst an ensemble of inter-regional connections. Furthermore, application of the methods to conservatively configured (statistically stationary, linear and Gaussian) surrogate datasets revealed that some of the studied state summary measures were indeed statistically significant and also suggested that this class of null model did not explain the fMRI data fully. This extensive testing of reproducibility of similarity statistics also suggests that the estimated FC states are robust against variation in data quality, analysis, grouping, and decomposition methods. We conclude that future investigations probing the functional and neurophysiological relevance of time-varying connectivity assume critical importance. PMID:28916181

  6. Modeling and Deorphanization of Orphan GPCRs.

    PubMed

    Diaz, Constantino; Angelloz-Nicoud, Patricia; Pihan, Emilie

    2018-01-01

    Despite tremendous efforts, approximately 120 GPCRs remain orphan. Their physiological functions and their potential roles in diseases are poorly understood. Orphan GPCRs are extremely important because they may provide novel therapeutic targets for unmet medical needs. As a complement to experimental approaches, molecular modeling and virtual screening are efficient techniques to discover synthetic surrogate ligands which can help to elucidate the role of oGPCRs. Constitutively activated mutants and recently published active structures of GPCRs provide stimulating opportunities for building active molecular models for oGPCRs and identifying activators using virtual screening of compound libraries. We describe the molecular modeling and virtual screening process we have applied in the discovery of surrogate ligands, and provide examples for CCKA, a simulated oGPCR, and for two oGPCRs, GPR52 and GPR34.

  7. Methodology for Formulating Diesel Surrogate Fuels with Accurate Compositional, Ignition-Quality, and Volatility Characteristics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mueller, Charles J.; Cannella, William J.; Bruno, Thomas J.

    In this study, a novel approach was developed to formulate surrogate fuels having characteristics that are representative of diesel fuels produced from real-world refinery streams. Because diesel fuels typically consist of hundreds of compounds, it is difficult to conclusively determine the effects of fuel composition on combustion properties. Surrogate fuels, being simpler representations of these practical fuels, are of interest because they can provide a better understanding of fundamental fuel-composition and property effects on combustion and emissions-formation processes in internal-combustion engines. In addition, the application of surrogate fuels in numerical simulations with accurate vaporization, mixing, and combustion models could revolutionize future engine designs by enabling computational optimization for evolving real fuels. Dependable computational design would not only improve engine function, it would do so at significant cost savings relative to current optimization strategies that rely on physical testing of hardware prototypes. The approach in this study utilized the state-of-the-art techniques of 13C and 1H nuclear magnetic resonance spectroscopy and the advanced distillation curve to characterize fuel composition and volatility, respectively. The ignition quality was quantified by the derived cetane number. Two well-characterized, ultra-low-sulfur #2 diesel reference fuels produced from refinery streams were used as target fuels: a 2007 emissions certification fuel and a Coordinating Research Council (CRC) Fuels for Advanced Combustion Engines (FACE) diesel fuel. A surrogate was created for each target fuel by blending eight pure compounds. The known carbon bond types within the pure compounds, as well as models for the ignition qualities and volatilities of their mixtures, were used in a multiproperty regression algorithm to determine optimal surrogate formulations. The predicted and measured surrogate-fuel properties were quantitatively compared to the measured target-fuel properties, and good agreement was found. This paper is dedicated to the memory of our friend and colleague Jim Franz. Funding for this research was provided by the U.S. Department of Energy (U.S. DOE) Office of Vehicle Technologies, and by the Coordinating Research Council (CRC) and the companies that employ the CRC members. The study was conducted under the auspices of CRC. The authors thank U.S. DOE program manager Kevin Stork for supporting the participation of the U.S. national laboratories in this study.
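
    A heavily simplified sketch of the formulation step is given below: mole fractions of a small palette of compounds are optimized to match target properties, under the (unrealistic) assumption that every property blends linearly. The component property values and targets are invented; the actual study uses NMR-derived carbon-type targets together with nonlinear ignition-quality and volatility models.

```python
# Heavily simplified sketch of the formulation step: choose mole fractions of a
# few palette compounds to match target properties, assuming (unrealistically)
# that every property blends linearly. The component property values and the
# targets are hypothetical; the actual study uses NMR carbon-type targets and
# nonlinear ignition/volatility models inside a multiproperty regression.
import numpy as np
from scipy.optimize import minimize

# Columns: derived cetane number, aromatic carbon fraction, mid-boiling point (degC)
components = np.array([
    [74.0, 0.00, 287.0],   # n-alkane-like
    [44.0, 0.00, 247.0],   # iso-alkane-like
    [21.0, 0.66, 254.0],   # alkyl-aromatic-like
    [39.0, 0.00, 196.0],   # cycloalkane-like
])
target = np.array([45.0, 0.20, 260.0])
weights = np.array([1.0, 50.0, 0.05])            # crude scaling of the three properties

def mismatch(x):
    return float(np.sum(weights * (x @ components - target) ** 2))

x0 = np.full(4, 0.25)
res = minimize(mismatch, x0, method="SLSQP",
               bounds=[(0.0, 1.0)] * 4,
               constraints=[{"type": "eq", "fun": lambda x: x.sum() - 1.0}])
print(np.round(res.x, 3), round(mismatch(res.x), 3))
```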

  8. Prediction of NOx emissions from a simplified biodiesel surrogate by applying stochastic simulation algorithms (SSA)

    NASA Astrophysics Data System (ADS)

    Omidvarborna, Hamid; Kumar, Ashok; Kim, Dong-Shik

    2017-03-01

    A stochastic simulation algorithm (SSA) approach is implemented with the components of a simplified biodiesel surrogate to predict NOx (NO and NO2) emission concentrations from the combustion of biodiesel. The main reaction pathways were obtained by simplifying the previously derived skeletal mechanisms, including saturated methyl decanoate (MD), unsaturated methyl 5-decenoate (MD5D), and n-decane (ND). ND was added to match the energy content and the C/H/O ratio of actual biodiesel fuel. The MD/MD5D/ND surrogate model was also equipped with H2/CO/C1 formation mechanisms and a simplified NOx formation mechanism. The predicted model results are in good agreement with a limited number of experimental data at low-temperature combustion (LTC) conditions for three different biodiesel fuels consisting of various ratios of unsaturated and saturated methyl esters. The root mean square errors (RMSEs) of the predicted values are 0.0020, 0.0018, and 0.0025 for soybean methyl ester (SME), waste cooking oil (WCO), and tallow oil (TO), respectively. The SSA model showed the potential to predict NOx emission concentrations when the peak combustion temperature increased through the addition of ultra-low sulphur diesel (ULSD) to biodiesel. The SSA method used in this study demonstrates the possibility of reducing the computational complexity in biodiesel emissions modelling.
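
    For readers unfamiliar with the SSA, the sketch below implements the Gillespie direct method on a toy two-reaction, Zeldovich-like scheme. The species, stoichiometry, rate constants, and initial counts are placeholders, not the MD/MD5D/ND surrogate mechanism used in the paper.

      import numpy as np

      rng = np.random.default_rng(0)

      # Species counts [N2, O, NO, N] and two schematic, Zeldovich-like reactions:
      #   R1: N2 + O -> NO + N        R2: N + O -> NO
      x = np.array([5000, 200, 0, 50])
      nu = np.array([[-1, -1, +1, +1],
                     [ 0, -1, +1, -1]])       # stoichiometric change vectors
      k = np.array([1e-5, 5e-5])              # invented stochastic rate constants

      t, t_end = 0.0, 5.0
      while t < t_end:
          a = np.array([k[0] * x[0] * x[1], k[1] * x[3] * x[1]])  # propensities
          a0 = a.sum()
          if a0 == 0.0:
              break
          t += rng.exponential(1.0 / a0)      # time to the next reaction event
          j = rng.choice(2, p=a / a0)         # which reaction fires
          x = x + nu[j]

      print("final NO count:", x[2])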

  9. Near Real-Time Probabilistic Damage Diagnosis Using Surrogate Modeling and High Performance Computing

    NASA Technical Reports Server (NTRS)

    Warner, James E.; Zubair, Mohammad; Ranjan, Desh

    2017-01-01

    This work investigates novel approaches to probabilistic damage diagnosis that utilize surrogate modeling and high performance computing (HPC) to achieve substantial computational speedup. Motivated by Digital Twin, a structural health management (SHM) paradigm that integrates vehicle-specific characteristics with continual in-situ damage diagnosis and prognosis, the methods studied herein yield near real-time damage assessments that could enable monitoring of a vehicle's health while it is operating (i.e. online SHM). High-fidelity modeling and uncertainty quantification (UQ), both critical to Digital Twin, are incorporated using finite element method simulations and Bayesian inference, respectively. The crux of the proposed Bayesian diagnosis methods, however, is the reformulation of the numerical sampling algorithms (e.g. Markov chain Monte Carlo) used to generate the resulting probabilistic damage estimates. To this end, three distinct methods are demonstrated for rapid sampling that utilize surrogate modeling and exploit various degrees of parallelism for leveraging HPC. The accuracy and computational efficiency of the methods are compared on the problem of strain-based crack identification in thin plates. While each approach has inherent problem-specific strengths and weaknesses, all approaches are shown to provide accurate probabilistic damage diagnoses and several orders of magnitude computational speedup relative to a baseline Bayesian diagnosis implementation.
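
    The surrogate-in-the-loop idea can be sketched generically: train a cheap approximation of the forward model offline, then let the Markov chain Monte Carlo sampler call only the surrogate online. In the minimal sketch below, a quadratic fit stands in for a finite element strain model and a random-walk Metropolis sampler infers a single crack-length parameter; the model, noise level, and data are all invented rather than taken from the paper.

      import numpy as np

      rng = np.random.default_rng(1)

      def expensive_model(a):
          # Stand-in for a finite element strain prediction vs. crack length a.
          return 1e-3 * (1.0 + 4.0 * a + 2.0 * a**2)

      # Offline: fit a cheap quadratic surrogate to a few expensive runs.
      a_train = np.linspace(0.0, 1.0, 8)
      coeffs = np.polyfit(a_train, expensive_model(a_train), deg=2)
      surrogate = lambda a: np.polyval(coeffs, a)

      # Synthetic strain measurement at the (unknown) true crack length.
      a_true, sigma = 0.4, 5e-5
      eps_meas = expensive_model(a_true) + rng.normal(0.0, sigma)

      def log_post(a):                        # uniform prior on [0, 1]
          if not 0.0 <= a <= 1.0:
              return -np.inf
          return -0.5 * ((eps_meas - surrogate(a)) / sigma) ** 2

      # Online: random-walk Metropolis that only ever calls the surrogate.
      samples, a_cur, lp_cur = [], 0.5, log_post(0.5)
      for _ in range(20000):
          a_prop = a_cur + rng.normal(0.0, 0.05)
          lp_prop = log_post(a_prop)
          if np.log(rng.uniform()) < lp_prop - lp_cur:
              a_cur, lp_cur = a_prop, lp_prop
          samples.append(a_cur)

      print("posterior mean crack length:", round(np.mean(samples[5000:]), 3))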

  10. Cutthroat trout virus as a surrogate in vitro infection model for testing inhibitors of hepatitis E virus replication.

    PubMed

    Debing, Yannick; Winton, James; Neyts, Johan; Dallmeier, Kai

    2013-10-01

    Hepatitis E virus (HEV) is one of the most important causes of acute hepatitis worldwide. Although most infections are self-limiting, mortality is particularly high in pregnant women. Chronic infections can occur in transplant and other immune-compromised patients. Successful treatment of chronic hepatitis E has been reported with ribavirin and pegylated interferon-alpha; however, severe side effects were observed. We employed the cutthroat trout virus (CTV), a non-pathogenic fish virus with remarkable similarities to HEV, as a potential surrogate for HEV and established an antiviral assay against this virus using the Chinook salmon embryo (CHSE-214) cell line. Ribavirin and the respective trout interferon were found to efficiently inhibit CTV replication. Other known broad-spectrum inhibitors of RNA virus replication, such as the nucleoside analog 2'-C-methylcytidine, resulted only in moderate antiviral activity. In its natural fish host, CTV levels largely fluctuate during the reproductive cycle, with the virus detected mainly during spawning. We wondered whether this aspect of CTV infection may serve as a surrogate model for the peculiar pathogenesis of HEV in pregnant women. To that end, the effect of three sex steroids on in vitro CTV replication was evaluated. Whereas progesterone resulted in marked inhibition of virus replication, testosterone and 17β-estradiol stimulated viral growth. Our data thus indicate that CTV may serve as a surrogate model for HEV, both for antiviral experiments and studies on the replication biology of the Hepeviridae. Copyright © 2013 Elsevier B.V. All rights reserved.

  11. Fuel Effects on Nozzle Flow and Spray Using Fully Coupled Eulerian Simulations

    DTIC Science & Technology

    2015-09-01

    Density of liquid fuel, kg/m³; density of ambient gas, kg/m³; VOF = Volume of Fluid model; Volume of Fluid scalar; ROI = Rate of... have been reported arising from individual refinery processes, crude oil source, and also varying with season, year, and age of the fuel. This myriad... configurations. Under reacting conditions, Violi et al. (6) presented a surrogate mixture of six pure hydrocarbons (Utah surrogate) and found that it

  12. Identification of hydrodynamic forces around 3D surrogates using particle image velocimetry in a microfluidic channel

    NASA Astrophysics Data System (ADS)

    Afshar, Sepideh; Nath, Shubhankar; Demirci, Utkan; Hasan, Tayyaba; Scarcelli, Giuliano; Rizvi, Imran; Franco, Walfre

    2018-02-01

    Previous studies have demonstrated that flow-induced shear stress induces a motile and aggressive tumor phenotype in a microfluidic model of 3D ovarian cancer. However, the magnitude and distribution of the hydrodynamic forces that influence this biological modulation on the 3D cancer nodules are not known. We have developed a series of numerical and experimental tools to identify these forces within a 3D microchannel. In this work, we used particle image velocimetry (PIV) to find the velocity profile using fluorescent micro-spheres as surrogates and nano-particles as tracers, from which hydrodynamic forces can be derived. The fluid velocity is obtained by imaging the trajectory of a range of fluorescent nano-particles (500-800 nm) via confocal microscopy. Imaging was done at different horizontal planes and with a 50 μm bead as the surrogate. For an inlet flow rate of 2 μl/s, the maximum velocity at the center of the channel was 51 μm/s. The velocity profile around the sphere was symmetric, which is expected since the flow is dominated by viscous forces as opposed to inertial forces. The confocal PIV was successfully employed in finding the velocity profile in a microchannel with a nodule surrogate; therefore, it seems feasible to use PIV to investigate the hydrodynamic forces around 3D biological models.
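
    The core of a PIV measurement is a cross-correlation of interrogation windows between successive frames: the peak of the correlation map gives the particle-image displacement, and dividing by the frame interval gives velocity. The sketch below demonstrates this on two synthetic windows; the particle positions, shift, pixel size, and time step are fabricated, and real confocal PIV adds calibration, window deformation, and outlier rejection not shown here.

      import numpy as np
      from scipy.signal import fftconvolve

      def window(centers, size=64, sigma=1.5):
          # Synthetic interrogation window with Gaussian "particle images".
          y, x = np.mgrid[0:size, 0:size]
          img = np.zeros((size, size))
          for cy, cx in centers:
              img += np.exp(-((y - cy) ** 2 + (x - cx) ** 2) / (2 * sigma**2))
          return img

      true_shift = np.array([3, 5])                 # (dy, dx) in pixels
      pts = np.array([[12, 20], [30, 40], [50, 15], [22, 52]])
      frame_a, frame_b = window(pts), window(pts + true_shift)

      # Cross-correlate the two frames; the peak offset is the displacement.
      corr = fftconvolve(frame_b, frame_a[::-1, ::-1], mode="full")
      peak = np.unravel_index(np.argmax(corr), corr.shape)
      shift = np.array(peak) - (np.array(frame_a.shape) - 1)

      dt, pixel_size = 0.1, 2.0                     # s and um/pixel, both invented
      print("estimated shift (px):", shift)         # expected: [3 5]
      print("velocity estimate (um/s):", shift * pixel_size / dt)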

  13. Used fuel rail shock and vibration testing options analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ross, Steven B.; Best, Ralph E.; Klymyshyn, Nicholas A.

    2014-09-25

    The objective of the rail shock and vibration tests is to complete the framework needed to quantify the loads on fuel assembly components that are necessary to guide materials research and establish a technical basis for review organizations such as the U.S. Nuclear Regulatory Commission (NRC). A significant body of experimental and numerical modeling data exists to quantify loads and failure limits applicable to normal conditions of transport (NCT) rail transport, but the data are based on assumptions that can only be verified through experimental testing. The test options presented in this report represent possible paths for acquiring the data that are needed to confirm the assumptions of previous work, validate modeling methods that will be needed for evaluating transported fuel on a case-by-case basis, and inform material test campaigns on the anticipated range of fuel loading. The ultimate goal of this testing is to close all of the existing knowledge gaps related to the loading of used fuel under NCT conditions and inform the experiments and analysis program on specific endpoints for their research. The options include tests that would use an actual railcar, surrogate assemblies, and real or simulated rail transportation casks. The railcar carrying the cradle, cask, and surrogate fuel assembly payload would be moved in a train operating over rail track modified or selected to impart shock and vibration forces that occur during normal rail transportation. Computer modeling would be used to help design surrogates that may be needed for a rail cask, a cask’s internal basket, and a transport cradle. The objective of the design of surrogate components would be to provide a test platform that effectively simulates responses to rail shock and vibration loads that would be exhibited by state-of-the-art rail cask, basket, and/or cradle structures. The computer models would also be used to help determine the placement of instrumentation (accelerometers and strain gauges) on the surrogate fuel assemblies, cask and cradle structures, and the railcar so that forces and deflections that would result in the greatest potential for damage to high burnup and long-cooled UNF can be determined. For the purposes of this report, controlled-track testing refers to testing in which the track and train speed are controlled to facilitate modeling.

  14. Surrogate decision making: do we have to trade off accuracy and procedural satisfaction?

    PubMed

    Frey, Renato; Hertwig, Ralph; Herzog, Stefan M

    2014-02-01

    Making surrogate decisions on behalf of incapacitated patients can raise difficult questions for relatives, physicians, and society. Previous research has focused on the accuracy of surrogate decisions (i.e., the proportion of correctly inferred preferences). Less attention has been paid to the procedural satisfaction that patients' surrogates and patients attribute to specific approaches to making surrogate decisions. The objective was to investigate hypothetical patients' and surrogates' procedural satisfaction with specific approaches to making surrogate decisions and whether implementing these preferences would lead to tradeoffs between procedural satisfaction and accuracy. Study 1 investigated procedural satisfaction by assigning participants (618 in a mixed-age but relatively young online sample and 50 in an older offline sample) to the roles of hypothetical surrogates or patients. Study 2 (involving 64 real multigenerational families with a total of 253 participants) investigated accuracy using 24 medical scenarios. Hypothetical patients and surrogates had closely aligned preferences: Procedural satisfaction was highest with a patient-designated surrogate, followed by shared surrogate decision-making approaches and legally assigned surrogates. These approaches did not differ substantially in accuracy. Limitations are that participants' preferences regarding existing and novel approaches to making surrogate decisions can only be elicited under hypothetical conditions. Next to decision making by patient-designated surrogates, shared surrogate decision making is the preferred approach among patients and surrogates alike. This approach appears to impose no tradeoff between procedural satisfaction and accuracy. Therefore, shared decision making should be further studied in representative samples of the general population, and if people's preferences prove to be robust, they deserve to be weighted more strongly in legal frameworks in addition to patient-designated surrogates.

  15. Optimization and uncertainty assessment of strongly nonlinear groundwater models with high parameter dimensionality

    NASA Astrophysics Data System (ADS)

    Keating, Elizabeth H.; Doherty, John; Vrugt, Jasper A.; Kang, Qinjun

    2010-10-01

    Highly parameterized and CPU-intensive groundwater models are increasingly being used to understand and predict flow and transport through aquifers. Despite their frequent use, these models pose significant challenges for parameter estimation and predictive uncertainty analysis algorithms, particularly global methods, which usually require very large numbers of forward runs. Here we present a general methodology for parameter estimation and uncertainty analysis that can be utilized in these situations. Our proposed method includes extraction of a surrogate model that mimics key characteristics of a full process model, followed by testing and implementation of a pragmatic uncertainty analysis technique, called null-space Monte Carlo (NSMC), that merges the strengths of gradient-based search and parameter dimensionality reduction. As part of the surrogate model analysis, the results of NSMC are compared with a formal Bayesian approach using the DiffeRential Evolution Adaptive Metropolis (DREAM) algorithm. Such a comparison has never been accomplished before, especially in the context of high parameter dimensionality. Despite the highly nonlinear nature of the inverse problem, the existence of multiple local minima, and the relatively large parameter dimensionality, both methods performed well and results compare favorably with each other. Experiences gained from the surrogate model analysis are then transferred to calibrate the full, highly parameterized and CPU-intensive groundwater model and to explore the uncertainty of predictions made by that model. The methodology presented here is generally applicable to any highly parameterized and CPU-intensive environmental model, where efficient methods such as NSMC provide the only practical means for conducting predictive uncertainty analysis.

  16. SU-E-J-261: Statistical Analysis and Chaotic Dynamics of Respiratory Signal of Patients in BodyFix

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Michalski, D; Huq, M; Bednarz, G

    Purpose: To quantify the respiratory signal of patients in BodyFix undergoing 4DCT scans with and without the immobilization cover. Methods: 20 pairs of respiratory tracks recorded with the RPM system during 4DCT scans were analyzed. Descriptive statistics were applied to selected parameters of the exhale-inhale decomposition. Standardized signals were used with the delay method to build orbits in embedded space. Nonlinear behavior was tested with surrogate data. Sample entropy (SE), Lempel-Ziv complexity (LZC) and the largest Lyapunov exponents (LLE) were compared. Results: Statistical tests show a difference between scans for inspiration time and its variability, which is bigger for scans without the cover. The same holds for the variability of the end of exhalation and inhalation. Other parameters fail to show a difference. For both scans the respiratory signals show determinism and nonlinear stationarity. Statistical tests on surrogate data reveal their nonlinearity. The LLEs show the signals' chaotic nature and its correlation with the breathing period and its embedding delay time. SE, LZC and LLE measure respiratory signal complexity. Nonlinear characteristics do not differ between scans. Conclusion: Contrary to expectation, the cover applied to patients in BodyFix appears to have a limited effect on signal parameters. Analysis based on trajectories of delay vectors shows the respiratory system's nonlinear character and its sensitive dependence on initial conditions. Reproducibility of the respiratory signal can be evaluated with measures of signal complexity and its predictability window. A longer respiratory period is conducive to signal reproducibility, as shown by these gauges. Statistical independence of the exhale and inhale times is also supported by the magnitude of the LLE. The nonlinear parameters seem more appropriate for gauging respiratory signal complexity, given its deterministic chaotic nature. This contrasts with measures based on harmonic analysis, which are blind to nonlinear features. The dynamics of breathing, so crucial for 4D-based clinical technologies, can be better controlled if a nonlinear methodology that reflects the character of respiration is applied. Funding provided by Varian Medical Systems via an Investigator Initiated Research Project.
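
    Of the complexity measures mentioned, sample entropy is the most compact to illustrate. The sketch below is a direct O(N²) implementation with the conventional choices m = 2 and r = 0.2·SD, applied to a synthetic breathing-like signal; it is not the RPM data or the exact settings used in the study.

      import numpy as np

      def sample_entropy(x, m=2, r=None):
          x = np.asarray(x, dtype=float)
          r = 0.2 * x.std() if r is None else r     # conventional tolerance
          n = len(x)

          def matches(mm):
              # Count template pairs of length mm within tolerance r (Chebyshev).
              emb = np.array([x[i:i + mm] for i in range(n - mm + 1)])
              total = 0
              for i in range(len(emb) - 1):
                  d = np.max(np.abs(emb[i + 1:] - emb[i]), axis=1)
                  total += int(np.sum(d <= r))
              return total

          b, a = matches(m), matches(m + 1)
          return np.inf if a == 0 or b == 0 else -np.log(a / b)

      t = np.arange(0.0, 60.0, 0.1)                 # 10 Hz, 60 s synthetic record
      regular = np.sin(2 * np.pi * t / 4.0)         # ~4 s "breathing" period
      noisy = regular + 0.2 * np.random.default_rng(2).standard_normal(t.size)

      print("SampEn, regular:", round(sample_entropy(regular), 3))
      print("SampEn, noisy:  ", round(sample_entropy(noisy), 3))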

  17. Is Decedent Race an Independent Predictor of Organ Donor Consent, or Merely a Surrogate Marker of Socioeconomic Status?

    PubMed Central

    DuBay, Derek A.; Redden, David; Haque, Akhlaque; Gray, Stephen; Fouad, Mona; Siminoff, Laura A.; Holt, Cheryl; Kohler, Connie; Eckhoff, Devin

    2013-01-01

    Background Studies have demonstrated that African American race is a strong predictor of non-donation. However, it is often and correctly argued that African American race is a crude explanatory variable that is a surrogate marker of socioeconomic status (SES), education and access to health care. We hypothesized that when controlling for these factors, African American race would cease to be a predictor of organ donation. Methods A retrospective review was performed of 1292 Alabama decedents approached for organ donation between 2006 and 2009. Multivariable logistic regression models were constructed to identify the most parsimonious model that could explain the variation in the log-odds of obtaining consent. Results Consent for donation was obtained from 49% of the decedents' families. Household income was a predictor of organ donor consent only in Caucasians. Surprisingly, household income was not statistically different between consented and non-consented African American decedents ($25,147 vs. $26,137; p=0.90). On multivariable analysis, education, urban residence and shorter distance between the decedent residence and donor hospital were significantly associated with obtaining consent for organ donation. On univariate analysis, the odds ratio of donor consent in Caucasians compared to African Americans was 2.76 (95% CI 2.17-3.57). When controlling for SES and access to healthcare variables, the odds ratio of donor consent increased to 4.36 (95% CI 2.88-6.61). Conclusions We interpret this result to indicate that there remain unknown but important factor(s) associated with both race and obtaining organ donor consent. Further studies are required to isolate these factor(s) and determine whether they are modifiable. PMID:23018878

  18. A refined 2010-based VOC emission inventory and its improvement on modeling regional ozone in the Pearl River Delta Region, China.

    PubMed

    Yin, Shasha; Zheng, Junyu; Lu, Qing; Yuan, Zibing; Huang, Zhijiong; Zhong, Liuju; Lin, Hui

    2015-05-01

    Accurate and gridded VOC emission inventories are important for improving regional air quality model performance. In this study, a four-level VOC emission source categorization system was proposed. A 2010-based gridded Pearl River Delta (PRD) regional VOC emission inventory was developed with more comprehensive source coverage, the latest emission factors, and updated activity data. The total anthropogenic VOC emission was estimated to be about 117.4 × 10⁴ t, to which on-road mobile sources made the largest contribution, followed by industrial solvent use and industrial process sources. Among the industrial solvent use sources, furniture manufacturing and shoemaking were major VOC emission contributors. The spatial surrogates of VOC emission were updated for major VOC sources such as industrial sectors and gas stations. Subsector-based temporal characteristics were investigated and their temporal variations were characterized. The impacts of updated VOC emission estimates and spatial surrogates were evaluated by modeling O₃ concentration in the PRD region in July and October 2010, respectively. The results indicated that both updated emission estimates and spatial allocations can effectively reduce model bias in O₃ simulation. Further efforts should be made on the refinement of source classification, comprehensive collection of activity data, and spatial-temporal surrogates in order to reduce uncertainty in the emission inventory and improve model performance. Copyright © 2015 Elsevier B.V. All rights reserved.

  19. Computational benefits using artificial intelligent methodologies for the solution of an environmental design problem: saltwater intrusion.

    PubMed

    Papadopoulou, Maria P; Nikolos, Ioannis K; Karatzas, George P

    2010-01-01

    Artificial Neural Networks (ANNs) comprise a powerful tool to approximate the complicated behavior and response of physical systems allowing considerable reduction in computation time during time-consuming optimization runs. In this work, a Radial Basis Function Artificial Neural Network (RBFN) is combined with a Differential Evolution (DE) algorithm to solve a water resources management problem, using an optimization procedure. The objective of the optimization scheme is to cover the daily water demand on the coastal aquifer east of the city of Heraklion, Crete, without reducing the subsurface water quality due to seawater intrusion. The RBFN is utilized as an on-line surrogate model to approximate the behavior of the aquifer and to replace some of the costly evaluations of an accurate numerical simulation model which solves the subsurface water flow differential equations. The RBFN is used as a local approximation model in such a way as to maintain the robustness of the DE algorithm. The results of this procedure are compared to the corresponding results obtained by using the Simplex method and by using the DE procedure without the surrogate model. As it is demonstrated, the use of the surrogate model accelerates the convergence of the DE optimization procedure and additionally provides a better solution at the same number of exact evaluations, compared to the original DE algorithm.
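
    The surrogate-assisted loop described above can be sketched as follows: fit a radial basis function interpolant to a small set of exact simulator runs, let differential evolution search the cheap interpolant, then verify the candidate with an exact run and enrich the training set. In the sketch below, scipy's RBFInterpolator stands in for the trained RBF network and a toy function stands in for the seawater-intrusion simulator; the refresh strategy is deliberately simplistic.

      import numpy as np
      from scipy.interpolate import RBFInterpolator
      from scipy.optimize import differential_evolution

      def expensive_sim(q):
          # Toy stand-in for the seawater-intrusion simulator: pumping cost
          # plus a penalty once total pumping exceeds a notional safe yield.
          q = np.atleast_2d(q)
          cost = np.sum((q - [0.3, 0.7]) ** 2, axis=1)
          penalty = np.maximum(0.0, q.sum(axis=1) - 1.2) ** 2
          return cost + 10.0 * penalty

      bounds = [(0.0, 1.0), (0.0, 1.0)]
      rng = np.random.default_rng(3)
      X = rng.uniform(0.0, 1.0, size=(20, 2))       # initial exact simulator runs
      y = expensive_sim(X)

      for it in range(5):
          surrogate = RBFInterpolator(X, y, smoothing=1e-6)
          # DE searches the cheap surrogate, not the simulator itself.
          res = differential_evolution(
              lambda q: float(surrogate(np.atleast_2d(q))[0]),
              bounds, seed=it, maxiter=50,
          )
          y_new = expensive_sim(res.x)              # verify with one exact run
          X, y = np.vstack([X, res.x]), np.concatenate([y, y_new])

      print("best pumping rates:", np.round(res.x, 3))
      print("exact objective at optimum:", round(float(y_new[0]), 4))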

  20. A Deep Learning based Approach to Reduced Order Modeling of Fluids using LSTM Neural Networks

    NASA Astrophysics Data System (ADS)

    Mohan, Arvind; Gaitonde, Datta

    2017-11-01

    Reduced Order Models (ROMs) can be used as surrogates for prohibitively expensive simulations to model flow behavior over long time periods. ROM construction is predicated on extracting the dominant spatio-temporal features of the flow from CFD or experimental datasets. We explore ROM development with a deep learning approach, which comprises learning functional relationships between different variables in large datasets for predictive modeling. Although deep learning and related artificial intelligence based predictive modeling techniques have shown varied success in other fields, such approaches are in their initial stages of application to fluid dynamics. Here, we explore the application of the Long Short Term Memory (LSTM) neural network to sequential data, specifically to predict the time coefficients of Proper Orthogonal Decomposition (POD) modes of the flow for future timesteps, by training it on data at previous timesteps. The approach is demonstrated by constructing ROMs of several canonical flows. Additionally, we show that statistical estimates of stationarity in the training data can indicate a priori how amenable a given flow-field is to this approach. Finally, the potential and limitations of deep learning based ROM approaches will be elucidated and further developments discussed.
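
    The POD step that this kind of ROM rests on is compact enough to sketch: stack snapshots into a matrix, subtract the mean, and take the SVD; the left singular vectors are the spatial modes and the scaled right singular vectors are the time coefficients that a sequence model such as an LSTM would then be trained to advance. The snapshot data below are synthetic, and the LSTM itself is not shown.

      import numpy as np

      nx, nt = 200, 400
      x = np.linspace(0.0, 2 * np.pi, nx)
      t = np.linspace(0.0, 20.0, nt)
      rng = np.random.default_rng(4)

      # Snapshot matrix: each column is the (synthetic) flow field at one instant.
      U = (np.sin(x[:, None] - 0.5 * t[None, :])
           + 0.3 * np.sin(3 * x[:, None] + 1.2 * t[None, :])
           + 0.01 * rng.standard_normal((nx, nt)))

      U_mean = U.mean(axis=1, keepdims=True)
      Phi, s, Vt = np.linalg.svd(U - U_mean, full_matrices=False)

      energy = np.cumsum(s**2) / np.sum(s**2)
      r = int(np.searchsorted(energy, 0.99)) + 1    # modes capturing 99% energy
      modes = Phi[:, :r]                            # spatial POD modes
      coeffs = np.diag(s[:r]) @ Vt[:r, :]           # time coefficients a_k(t)

      recon = U_mean + modes @ coeffs
      print("retained modes:", r)
      print("relative reconstruction error:",
            round(np.linalg.norm(recon - U) / np.linalg.norm(U), 4))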

  1. Radiation Transport in Random Media With Large Fluctuations

    NASA Astrophysics Data System (ADS)

    Olson, Aaron; Prinja, Anil; Franke, Brian

    2017-09-01

    Neutral particle transport in media exhibiting large and complex material property spatial variation is modeled by representing cross sections as lognormal random functions of space and generated through a nonlinear memory-less transformation of a Gaussian process with covariance uniquely determined by the covariance of the cross section. A Karhunen-Loève decomposition of the Gaussian process is implemented to efficiently generate realizations of the random cross sections, and Woodcock Monte Carlo is used to transport particles on each realization and generate benchmark solutions for the mean and variance of the particle flux as well as probability densities of the particle reflectance and transmittance. A computationally efficient stochastic collocation method is implemented to directly compute the statistical moments such as the mean and variance, while a polynomial chaos expansion in conjunction with stochastic collocation provides a convenient surrogate model that also produces probability densities of output quantities of interest. Extensive numerical testing demonstrates that use of stochastic reduced-order modeling provides an accurate and cost-effective alternative to random sampling for particle transport in random media.
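
    A truncated Karhunen-Loève expansion of the underlying Gaussian process is straightforward to sketch on a 1-D grid: eigendecompose the covariance matrix, keep the dominant modes, and exponentiate to obtain lognormal cross-section realizations. The grid, exponential covariance, correlation length, and log-moments below are illustrative choices, not the paper's values.

      import numpy as np

      rng = np.random.default_rng(5)
      n, L, corr_len = 200, 10.0, 1.0          # grid points, slab length, corr. length
      x = np.linspace(0.0, L, n)
      mu_g, var_g = 0.0, 0.25                  # log-mean and log-variance (invented)

      # Exponential covariance of the underlying Gaussian log-cross-section.
      C = var_g * np.exp(-np.abs(x[:, None] - x[None, :]) / corr_len)
      eigval, eigvec = np.linalg.eigh(C)
      order = np.argsort(eigval)[::-1]
      eigval, eigvec = eigval[order], eigvec[:, order]

      k = int(np.searchsorted(np.cumsum(eigval) / eigval.sum(), 0.95)) + 1
      print("KL modes kept for 95% of the variance:", k)

      def realization():
          xi = rng.standard_normal(k)          # independent standard normal weights
          g = mu_g + eigvec[:, :k] @ (np.sqrt(eigval[:k]) * xi)
          return np.exp(g)                     # lognormal total cross section

      sigma_t = np.array([realization() for _ in range(500)])
      print("sample mean / variance at mid-slab:",
            round(sigma_t[:, n // 2].mean(), 3), round(sigma_t[:, n // 2].var(), 3))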

  2. Application of Design of Experiments and Surrogate Modeling within the NASA Advanced Concepts Office, Earth-to-Orbit Design Process

    NASA Technical Reports Server (NTRS)

    Zwack, Mathew R.; Dees, Patrick D.; Holt, James B.

    2016-01-01

    Decisions made during early conceptual design have a large impact upon the expected life-cycle cost (LCC) of a new program. It is widely accepted that up to 80% of such cost is committed during these early design phases. Therefore, to help minimize LCC, decisions made during conceptual design must be based upon as much information as possible. To aid in the decision making for new launch vehicle programs, the Advanced Concepts Office (ACO) at NASA Marshall Space Flight Center (MSFC) provides rapid turnaround pre-phase A and phase A concept definition studies. The ACO team utilizes a proven set of tools to provide customers with a full vehicle mass breakdown to tertiary subsystems, preliminary structural sizing based upon worst-case flight loads, and trajectory optimization to quantify integrated vehicle performance for a given mission. Although the team provides rapid turnaround for single vehicle concepts, the scope of the trade space can be limited due to analyst availability and the manpower requirements for manual execution of the analysis tools. In order to enable exploration of a broader design space, the ACO team has implemented an advanced design methods (ADM) based approach. This approach applies the concepts of design of experiments (DOE) and surrogate modeling to more exhaustively explore the trade space and provide the customer with additional design information to inform decision making. This paper will first discuss the automation of the ACO tool set, which represents a majority of the development effort. In order to fit a surrogate model within tolerable error bounds a number of DOE cases are needed. This number will scale with the number of variable parameters desired and the complexity of the system's response to those variables. For all but the smallest design spaces, the number of cases required cannot be produced within an acceptable timeframe using a manual process. Therefore, automation of the tools was a key enabler for the successful application of an ADM approach to an ACO design study. Following the overview of the tool set automation, an example problem will be given to illustrate the implementation of the ADM approach. The example problem will first cover the inclusion of ground rules and assumptions (GR&A) for a study. The GR&A are very important to the study as they determine the constraints within which a trade study can be conducted. These trades must ultimately reconcile with the customer's desired output and any anticipated "what if" questions. The example problem will then illustrate the setup and execution of a DOE through the automated ACO tools. This process is accomplished more efficiently in this work by splitting the tools into two separate environments. The first environment encompasses the structural optimization and mass estimation tools, while the second is focused on trajectory optimization. Surrogate models are fit to the outputs of each environment and are "integrated" via connection of the surrogate equations. Throughout this process, checks are implemented to compare the output of the surrogates to the output of manually run cases to ensure that the error of the final surrogates is at an acceptable level. The conclusion of the example problem demonstrates the utility of the ADM based approach. Using surrogate models gives the ACO team the ability to visualize vehicle sensitivities to various design parameters and identify regions of interest within the design space. 
The ADM approach can thus be used to inform concept down selection and isolate promising vehicle configurations to be explored in more detail through the manual design process. In addition it provides the customer with an almost instantaneous turnaround on any ''what if" questions that may arise within the bounds of the surrogate model. This approach ultimately expands the ability of the ACO team to provide its customer with broad and rapid turnaround trade studies for launch vehicle conceptual design. The ability to identify a selection of designs which can meet the customer requirements will help ensure lower LCC of launch vehicle designs originating from ACO.
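
    A stripped-down version of the DOE-plus-surrogate workflow described above is sketched below: a Latin hypercube design over two notional vehicle design variables, a quadratic surrogate fit by least squares, and a check of surrogate error against held-out cases. The "performance" function, variable names, and bounds are fictional stand-ins for the automated ACO tool chain.

      import numpy as np
      from scipy.stats import qmc

      def performance(X):
          # Fictional stand-in for the automated sizing/trajectory tool chain.
          thrust, prop_frac = X[:, 0], X[:, 1]
          return (10.0 + 4.0 * thrust - 3.0 * (prop_frac - 0.85) ** 2
                  + 0.5 * thrust * prop_frac)

      lo, hi = [1.0, 0.80], [3.0, 0.92]            # invented variable bounds
      sampler = qmc.LatinHypercube(d=2, seed=7)
      X = qmc.scale(sampler.random(40), lo, hi)    # DOE cases
      y = performance(X)

      def quad_features(X):
          x1, x2 = X[:, 0], X[:, 1]
          return np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])

      beta, *_ = np.linalg.lstsq(quad_features(X), y, rcond=None)

      # Check the surrogate against cases it was not fit to.
      X_chk = qmc.scale(sampler.random(10), lo, hi)
      err = quad_features(X_chk) @ beta - performance(X_chk)
      print("max |surrogate error| on check cases:", round(np.abs(err).max(), 4))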

  4. Is blood pressure reduction a valid surrogate endpoint for stroke prevention? an analysis incorporating a systematic review of randomised controlled trials, a by-trial weighted errors-in-variables regression, the surrogate threshold effect (STE) and the biomarker-surrogacy (BioSurrogate) evaluation schema (BSES)

    PubMed Central

    2012-01-01

    Background Blood pressure is considered to be a leading example of a valid surrogate endpoint. The aims of this study were to (i) formally evaluate systolic and diastolic blood pressure reduction as a surrogate endpoint for stroke prevention and (ii) determine what blood pressure reduction would predict a stroke benefit. Methods We identified randomised trials of at least six months' duration comparing any pharmacologic anti-hypertensive treatment to placebo or no treatment, and reporting baseline blood pressure, on-trial blood pressure, and fatal and non-fatal stroke. Trials with fewer than five strokes in at least one arm were excluded. Errors-in-variables weighted least squares regression modelled the reduction in stroke as a function of systolic blood pressure reduction and diastolic blood pressure reduction, respectively. The lower 95% prediction band was used to determine the minimum systolic blood pressure and diastolic blood pressure difference, the surrogate threshold effect (STE), below which there would be no predicted stroke benefit. The STE was used to generate the surrogate threshold effect proportion (STEP), a surrogacy metric, which, together with the R-squared trial-level association, was used to evaluate blood pressure as a surrogate endpoint for stroke using the Biomarker-Surrogacy Evaluation Schema (BSES3). Results In 18 qualifying trials representing all pharmacologic drug classes of antihypertensives, assuming a reliability coefficient of 0.9, the surrogate threshold effect for a stroke benefit was 7.1 mmHg for systolic blood pressure and 2.4 mmHg for diastolic blood pressure. The trial-level association was 0.41 and 0.64 and the STEP was 66% and 78% for systolic and diastolic blood pressure, respectively. The STE and STEP were more robust to measurement error in the independent variable than R-squared trial-level associations. Using the BSES3, assuming a reliability coefficient of 0.9, systolic blood pressure was a B+ grade and diastolic blood pressure was an A grade surrogate endpoint for stroke prevention. In comparison, using the same stroke data sets, no STEs could be estimated for cardiovascular (CV) mortality or all-cause mortality reduction, although the STE for CV mortality approached 25 mmHg for systolic blood pressure. Conclusions In this report we provide the first surrogate threshold effect (STE) values for systolic and diastolic blood pressure. We suggest the STEs have face and content validity, evidenced by the inclusivity of trial populations, subject populations and pharmacologic intervention populations in their calculation. We propose that the STE and STEP metrics offer another method of evaluating the evidence supporting surrogate endpoints. We demonstrate how surrogacy evaluations are strengthened if formally evaluated within specific-context evaluation frameworks using the Biomarker-Surrogacy Evaluation Schema (BSES3), and we discuss the implications of our evaluation of blood pressure on other biomarkers and patient-reported instruments in relation to surrogacy metrics and trial design. PMID:22409774
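
    The surrogate threshold effect can be illustrated numerically: regress the trial-level treatment effect on the achieved blood pressure difference, form the lower 95% prediction band, and read off the smallest blood pressure reduction at which that band still predicts benefit. The sketch below uses simulated trial-level data and ordinary least squares with a classical prediction interval, rather than the by-trial weighted errors-in-variables regression of the paper.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(8)
      n = 18
      x = rng.uniform(2.0, 20.0, n)                 # achieved SBP reduction, mmHg
      y = 0.02 * x + rng.normal(0.0, 0.08, n)       # simulated stroke risk reduction

      # Ordinary least squares fit y = b0 + b1 * x (a simplification of the
      # paper's weighted errors-in-variables regression).
      X = np.column_stack([np.ones(n), x])
      beta, *_ = np.linalg.lstsq(X, y, rcond=None)
      resid = y - X @ beta
      s2 = resid @ resid / (n - 2)
      sxx = np.sum((x - x.mean()) ** 2)
      tcrit = stats.t.ppf(0.975, n - 2)

      def lower_prediction_band(x0):
          se = np.sqrt(s2 * (1.0 + 1.0 / n + (x0 - x.mean()) ** 2 / sxx))
          return beta[0] + beta[1] * x0 - tcrit * se

      # STE: smallest BP reduction whose lower prediction band still shows benefit.
      grid = np.linspace(0.0, 25.0, 2501)
      benefit = lower_prediction_band(grid) > 0.0
      ste = grid[benefit][0] if benefit.any() else np.inf
      print("surrogate threshold effect (mmHg):", round(float(ste), 1))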

  5. Improvement in surrogate endpoints by a multidisciplinary team in a mobile clinic serving a low-income, immigrant minority population in South Florida.

    PubMed

    Singh-Franco, Devada; Perez, Alexandra; Wolowich, William R

    2013-02-01

    To determine the effect on surrogate endpoints for cardiovascular disease (CVD), we performed a retrospective chart review of 114 patients seen by a multidisciplinary team that provided primary care services in a mobile clinic over 12 months. Eligible patients had outcomes available for at least six months. Mixed effect modeling examined variation in surrogate markers for CVD: blood pressure (BP), heart rate, and body mass index. Repeated measures ANOVA compared lipids, hemoglobin A1c, and medication use from baseline and throughout the study. Most patients were female (75%), Haitian (76%), and low-income ($747/month), with an average age of 63 years. Common diagnoses were hypertension (82%) and hyperlipidemia (63%). Significant reductions in systolic BP, total and LDL cholesterol, and hemoglobin A1c were found (p<.05). Use of ACE-inhibitors, beta-blockers, diuretics, aspirin, metformin, and statins increased significantly (p<.05). A mobile clinic with a multidisciplinary team improved surrogate endpoints over 12 months in an underserved, low-income, mostly foreign-born Haitian population in the U.S.

  6. Validation of high-sensitivity performance for a United States Food and Drug Administration cleared cardiac troponin I assay.

    PubMed

    Christenson, Robert H; Mullins, Kristin; Duh, Show-Hong

    2018-06-01

    High-sensitivity (hs) cardiac troponin (cTn) assays are categorized by two criteria: (i) cTn values above the limit of detection (LoD) for >50% of male and female healthy cohorts of ≥300 individuals; (ii) imprecision ≤10% total %CV at the sex-specific 99th-percentile clinical decision values (CDVs). No documented hs-cTn assay has yet been FDA-cleared. The PATHFAST cTnI-II assay's LoD was 2.3 ng/L using CLSI EP-17. The AACC Universal Sample Bank of 847 healthy men (50.6%) and women (49.4%) was used to determine 99th-percentile CDVs with nonparametric, Harrell-Davis, and robust modeling. Health/medication questionnaires and amino-terminal proBNP, hemoglobin A1c, and estimated glomerular filtration rate surrogates were used to exclude underlying health conditions. The cTnI-II test's total CV was 6.1% at 29 ng/L and 7.1% at 22 ng/L. Of the full 847-member healthy cohort, 113 (13.3%) were excluded by abnormal surrogate biomarkers. The final 734-member healthy population had the following (% > LoD): overall, 487 (66.3%); women, 186 (52.8%); and men, 301 (78.8%). 99th-percentile CDVs by nonparametric modeling were: 28 ng/L (90% CI: 20-30) for the overall final 732-member healthy population; 20 ng/L (90% CI: 13-30) for 352 women; and 30 ng/L (90% CI: 21-37) for 382 men. Sex-specific CDVs were not significantly different (p > .05) with nonparametric or Harrell-Davis modeling; however, robust modeling did show significance (p < .05), with lower CDVs of 11 ng/L (90% CI: 9-12) and 16 ng/L (90% CI: 15-18) for the female and male cohorts, respectively. cTnI-II is the only FDA-cleared assay that has demonstrated high-sensitivity cTn performance. Use of the recommended modeling in >300 healthy subjects for determining sex-specific 99th-percentile CDVs did not show statistically significant differences except with robust modeling. Copyright © 2018. Published by Elsevier Inc.
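
    The nonparametric 99th-percentile decision value, with a bootstrap confidence interval, is easy to sketch. The cohort below is simulated from a lognormal distribution rather than taken from the AACC Universal Sample Bank, and only the simple empirical percentile is shown (not the Harrell-Davis or robust estimators).

      import numpy as np

      rng = np.random.default_rng(9)
      ctni = rng.lognormal(mean=1.3, sigma=0.8, size=734)   # ng/L, simulated cohort

      cdv = np.percentile(ctni, 99)                         # empirical 99th percentile

      # Percentile bootstrap for a 90% confidence interval on the CDV.
      boot = np.array([
          np.percentile(rng.choice(ctni, size=ctni.size, replace=True), 99)
          for _ in range(5000)
      ])
      ci_lo, ci_hi = np.percentile(boot, [5, 95])

      print(f"99th-percentile CDV: {cdv:.1f} ng/L (90% CI {ci_lo:.1f}-{ci_hi:.1f})")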

  7. Two Decades of Cardiovascular Trials With Primary Surrogate Endpoints: 1990-2011.

    PubMed

    Bikdeli, Behnood; Punnanithinont, Natdanai; Akram, Yasir; Lee, Ike; Desai, Nihar R; Ross, Joseph S; Krumholz, Harlan M

    2017-03-21

    Surrogate endpoint trials test strategies more efficiently but are accompanied by uncertainty about the relationship between changes in surrogate markers and clinical outcomes. We identified cardiovascular trials with primary surrogate endpoints published in the New England Journal of Medicine, Lancet, and JAMA: Journal of the American Medical Association from 1990 to 2011 and determined the trends in publication of surrogate endpoint trials and the success of the trials in meeting their primary endpoints. We then tracked publication of clinical outcome trials on the interventions tested in the surrogate trials. We screened 3016 articles and identified 220 surrogate endpoint trials. Of the 220 surrogate trials, 157 (71.4%) were positive for their primary endpoint. Only 59 (26.8%) surrogate trials had a subsequent clinical outcomes trial. Among these 59 trials, 24 outcomes trial results validated the positive surrogates, whereas 20 subsequent outcome trials were negative following positive results on a surrogate. We identified only 3 examples in which the surrogate trial was negative but a subsequent outcomes trial was conducted and showed benefit. Findings were consistent in a sample cohort of 383 screened articles, inclusive of 37 surrogate endpoint trials, from 6 other high-impact journals. Although cardiovascular surrogate outcomes trials frequently show superiority of the tested intervention, they are infrequently followed by a prominent outcomes trial. When there was a high-profile clinical outcomes study, nearly half of the positive surrogate trials were not validated. Cardiovascular surrogate outcome trials may be more appropriate for excluding benefit from the patient perspective than for identifying it. © 2017 The Authors. Published on behalf of the American Heart Association, Inc., by Wiley Blackwell.

  8. Assessing Uncertainty of Interspecies Correlation Estimation Models for Aromatic Compounds

    EPA Science Inventory

    We developed Interspecies Correlation Estimation (ICE) models for aromatic compounds containing 1 to 4 benzene rings to assess uncertainty in toxicity extrapolation in two data compilation approaches. ICE models are mathematical relationships between surrogate and predicted test ...

  9. Fuel assembly shaker and truck test simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Klymyshyn, Nicholas A.; Jensen, Philip J.; Sanborn, Scott E.

    2014-09-30

    This study continues the modeling support of the SNL shaker table task from 2013 and includes analysis of the SNL 2014 truck test campaign. Detailed finite element models of the fuel assembly surrogate used by SNL during testing form the basis of the modeling effort. Additional analysis was performed to characterize and filter the accelerometer data collected during the SNL testing. The detailed fuel assembly finite element model was modified to improve the performance and accuracy of the original surrogate fuel assembly model in an attempt to achieve a closer agreement with the low strains measured during testing. The revised model was used to recalculate the shaker table load response from the 2013 test campaign. As it happened, the results remained comparable to the values calculated with the original fuel assembly model. From this it is concluded that the original model was suitable for the task and the improvements to the model were not able to bring the calculated strain values down to the extremely low level recorded during testing. The model needs more precision to calculate strains that are so close to zero. The truck test load case had an even lower magnitude than the shaker table case. Strain gage data from the test were compared directly to locations on the model. Truck test strains were lower than the shaker table case, but the model achieved a better relative agreement of 100-200 microstrains (or 0.0001-0.0002 mm/mm). The truck test data included a number of accelerometers at various locations on the truck bed, surrogate basket, and surrogate fuel assembly. This set of accelerometers allowed an evaluation of the dynamics of the conveyance system used in testing. It was discovered that the dynamic load transference through the conveyance has a strong frequency-range dependency. This suggests that different conveyance configurations could behave differently and transmit different magnitudes of loads to the fuel even when traveling down the same road at the same speed. It is recommended that the SNL conveyance system used in testing be characterized through modal analysis and frequency response analysis to provide context and assist in the interpretation of the strain data that was collected during the truck test campaign.
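
    One routine step in this kind of campaign, band-limiting accelerometer records before comparing them with model output, can be sketched with a zero-phase Butterworth filter. The sampling rate, pass band, and synthetic record below are placeholders, not the SNL test parameters.

      import numpy as np
      from scipy.signal import butter, filtfilt

      fs = 1000.0                                   # Hz, assumed sampling rate
      t = np.arange(0.0, 10.0, 1.0 / fs)
      rng = np.random.default_rng(11)

      # Synthetic record: low-frequency sway + structural response + sensor noise.
      accel = (0.05 * np.sin(2 * np.pi * 1.5 * t)
               + 0.20 * np.sin(2 * np.pi * 40.0 * t)
               + 0.05 * rng.standard_normal(t.size))

      # 4th-order Butterworth band-pass, run forward and backward for zero phase.
      b, a = butter(4, [5.0, 200.0], btype="bandpass", fs=fs)
      accel_filt = filtfilt(b, a, accel)

      rms = lambda v: float(np.sqrt(np.mean(v**2)))
      print("RMS before / after filtering:", round(rms(accel), 4), round(rms(accel_filt), 4))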

  10. External Heat Transfer Coefficient Measurements on a Surrogate Indirect Inertial Confinement Fusion Target

    DOE PAGES

    Miles, Robin; Havstad, Mark; LeBlanc, Mary; ...

    2015-09-15

    External heat transfer coefficients were measured around a surrogate indirect inertial confinement fusion (ICF) target based on the Laser Inertial Fusion Energy (LIFE) design to validate thermal models of the LIFE target during flight through a fusion chamber. Results indicate that the heat transfer coefficients for this target, 25-50 W/m²·K, are consistent with theoretically derived heat transfer coefficients and are valid for use in calculations of target heating during flight through a fusion chamber.

  11. Reliability based design including future tests and multiagent approaches

    NASA Astrophysics Data System (ADS)

    Villanueva, Diane

    The initial stages of reliability-based design optimization involve the formulation of objective functions and constraints, and building a model to estimate the reliability of the design with quantified uncertainties. However, even experienced hands often overlook important objective functions and constraints that affect the design. In addition, uncertainty reduction measures, such as tests and redesign, are often not considered in reliability calculations during the initial stages. This research considers two areas that concern the design of engineering systems: 1) the trade-off of the effect of a test and post-test redesign on reliability and cost, and 2) the search for multiple candidate designs as insurance against unforeseen faults in some designs. In this research, a methodology was developed to estimate the effect of a single future test and post-test redesign on reliability and cost. The methodology uses assumed distributions of computational and experimental errors with redesign rules to simulate alternative future test and redesign outcomes to form a probabilistic estimate of the reliability and cost for a given design. Further, it was explored how modeling a future test and redesign provides a company an opportunity to balance development costs versus performance by simultaneously choosing the design and the post-test redesign rules during the initial design stage. The second area of this research considers the use of dynamic local surrogates, or surrogate-based agents, to locate multiple candidate designs. Surrogate-based global optimization algorithms often require search in multiple candidate regions of design space, expending most of the computation needed to define multiple alternate designs. Thus, focusing solely on locating the best design may be wasteful. We extended adaptive sampling surrogate techniques to locate multiple optima by building local surrogates in sub-regions of the design space. The efficiency of this method was studied, and the method was compared to other surrogate-based optimization methods that aim to locate the global optimum using two two-dimensional test functions, a six-dimensional test function, and a five-dimensional engineering example.

  12. Evaluation of Enterococcus faecium NRRL B-2354 as a Surrogate for Salmonella During Extrusion of Low-Moisture Food.

    PubMed

    Verma, Tushar; Wei, Xinyao; Lau, Soon Kiat; Bianchini, Andreia; Eskridge, Kent M; Subbiah, Jeyamkondan

    2018-04-01

    Salmonella in low-moisture foods is an emerging challenge due to numerous food product recalls and foodborne illness outbreaks. Identification of a suitable surrogate is critical for process validation at the industry level due to implementation of the new Food Safety Modernization Act of 2011. The objective of this study was to evaluate Enterococcus faecium NRRL B-2354 as a surrogate for Salmonella during the extrusion of low-moisture food. Oat flour, a low-moisture food, was adjusted to different moisture (14% to 26% wet basis) and fat (5% to 15% w/w) contents and was inoculated with E. faecium NRRL B-2354. Inoculated material was then extruded in a lab-scale single-screw extruder running at different screw speeds (75 to 225 rpm) and different temperatures (75, 85, and 95 °C). A split-plot central composite 2nd order response surface design was used, with the central point replicated six times. The data from the selective medium (m-Enterococcus agar) were used to build the response surface model for inactivation of E. faecium NRRL B-2354. Results indicated that E. faecium NRRL B-2354 always had higher heat resistance than Salmonella at all conditions evaluated in this study. However, the patterns of the contour plots showing the effect of various product and process parameters on inactivation of E. faecium NRRL B-2354 were different from those of Salmonella. Although E. faecium NRRL B-2354 may be an acceptable surrogate for extrusion of low-moisture products due to its higher resistance than Salmonella, another surrogate with similar inactivation behavior may be preferred and needs to be identified. The Food Safety Modernization Act requires the food industry to validate processing interventions. This study validated extrusion processing and demonstrated that E. faecium NRRL B-2354 is an acceptable surrogate for extrusion of low-moisture products. The developed response surface model allows the industry to identify process conditions to achieve a desired lethality for their products based on composition. © 2018 Institute of Food Technologists®.
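
    The second-order response surface model form used in such validation studies can be sketched as a full quadratic fit of log reduction against two process variables. The moisture and temperature levels, the underlying surface, and the noise below are fabricated; only the model form mirrors the study design.

      import numpy as np

      rng = np.random.default_rng(10)
      moisture = rng.uniform(14.0, 26.0, 30)        # % wet basis
      temp = rng.uniform(75.0, 95.0, 30)            # degrees C

      # Fabricated "measured" log reductions with noise.
      log_red = (0.05 * (moisture - 14.0) + 0.08 * (temp - 75.0)
                 + 0.002 * (moisture - 20.0) * (temp - 85.0)
                 + rng.normal(0.0, 0.1, 30))

      def design(m, t):
          # Full second-order (quadratic) response surface in two variables.
          return np.column_stack([np.ones_like(m), m, t, m * t, m**2, t**2])

      beta, *_ = np.linalg.lstsq(design(moisture, temp), log_red, rcond=None)

      # Predict the log reduction at one hypothetical process condition.
      pred = design(np.array([20.0]), np.array([90.0])) @ beta
      print("predicted log reduction at 20% moisture, 90 C:", round(float(pred[0]), 2))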

  13. Reduced Order Model Implementation in the Risk-Informed Safety Margin Characterization Toolkit

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mandelli, Diego; Smith, Curtis L.; Alfonsi, Andrea

    2015-09-01

    The RISMC project aims to develop new advanced simulation-based tools to perform Probabilistic Risk Analysis (PRA) for the existing fleet of U.S. nuclear power plants (NPPs). These tools numerically model not only the thermo-hydraulic behavior of the reactor primary and secondary systems but also the temporal evolution of external events and component/system ageing. Thus, this is not only a multi-physics problem but also a multi-scale problem (both spatial, µm-mm-m, and temporal, ms-s-minutes-years). As part of the RISMC PRA approach, a large number of computationally expensive simulation runs are required. An important aspect is that even though computational power is regularly growing, the overall computational cost of a RISMC analysis may not be viable for certain cases. A solution that is being evaluated is the use of reduced order modeling techniques. During FY2015, we investigated and applied reduced order modeling techniques to decrease the RISMC analysis computational cost by decreasing the number of simulation runs to perform and by employing surrogate models instead of the actual simulation codes. This report focuses on the use of reduced order modeling techniques that can be applied to any RISMC analysis to generate, analyze and visualize data. In particular, we focus on surrogate models that approximate the simulation results but in a much faster time (µs instead of hours/days). We apply reduced order and surrogate modeling techniques to several RISMC types of analyses using RAVEN and RELAP-7 and show the advantages that can be gained.

  14. Neutron Protection Factor Determination and Validation for a Vehicle Surrogate Using a Californium Fission Source

    DTIC Science & Technology

    2017-06-01

    protection factors. The success of this research is a direct result of the immense collaboration across a number of institutions that all shared a... at post-detonation neutron transport, an exact solution is not needed. Instead, the RPF research campaign uses a statistical-based method through a... factors of selected light vehicles against residual radiation," United States Army Ballistic Research Laboratory, Aberdeen Proving Ground, MD, 1988

  15. The effectiveness of surrogate taxa to conserve freshwater biodiversity.

    PubMed

    Stewart, David R; Underwood, Zachary E; Rahel, Frank J; Walters, Annika W

    2018-02-01

    Establishing protected areas has long been an effective conservation strategy and is often based on readily surveyed species. The potential of any freshwater taxon to be a surrogate for other aquatic groups has not been explored fully. We compiled occurrence data on 72 species of freshwater fishes, amphibians, mussels, and aquatic reptiles for the Great Plains, Wyoming (U.S.A.). We used hierarchical Bayesian multispecies mixture models and MaxEnt models to describe species' distributions and the program Zonation to identify areas of conservation priority for each aquatic group. The landscape-scale factors that best characterized aquatic species' distributions differed among groups. There was low agreement and congruence among taxa-specific conservation priorities (<20%), meaning no surrogate priority areas would include or protect the best habitats of other aquatic taxa. Common, wide-ranging aquatic species were included in taxa-specific priority areas, but rare freshwater species were not included. Thus, the development of conservation priorities based on a single freshwater aquatic group would not protect all species in the other aquatic groups. © 2017 Society for Conservation Biology.

  16. An information-theoretic approach to surrogate-marker evaluation with failure time endpoints.

    PubMed

    Pryseley, Assam; Tilahun, Abel; Alonso, Ariel; Molenberghs, Geert

    2011-04-01

    Over the last decades, the evaluation of potential surrogate endpoints in clinical trials has steadily been growing in importance, not only thanks to the availability of ever more potential markers and surrogate endpoints, but also because more methodological development has become available. While early work has been devoted, to a large extent, to Gaussian, binary, and longitudinal endpoints, the case of time-to-event endpoints is in need of careful scrutiny as well, owing to the strong presence of such endpoints in oncology and beyond. While work had been done in the past, it was often cumbersome to use such tools in practice, because of the need for fitting copula or frailty models that were further embedded in a hierarchical or two-stage modeling approach. In this paper, we present a methodologically elegant and easy-to-use approach based on information theory. We resolve essential issues, including the quantification of "surrogacy" based on such an approach. Our results are put to the test in a simulation study and are applied to data from clinical trials in oncology. The methodology has been implemented in R.

  17. Robust best linear estimator for Cox regression with instrumental variables in whole cohort and surrogates with additive measurement error in calibration sample

    PubMed Central

    Wang, Ching-Yun; Song, Xiao

    2017-01-01

    SUMMARY Biomedical researchers are often interested in estimating the effect of an environmental exposure in relation to a chronic disease endpoint. However, the exposure variable of interest may be measured with errors. In a subset of the whole cohort, a surrogate variable is available for the true unobserved exposure variable. The surrogate variable satisfies an additive measurement error model, but it may not have repeated measurements. The subset in which the surrogate variables are available is called a calibration sample. In addition to the surrogate variables that are available among the subjects in the calibration sample, we consider the situation when there is an instrumental variable available for all study subjects. An instrumental variable is correlated with the unobserved true exposure variable, and hence can be useful in the estimation of the regression coefficients. In this paper, we propose a nonparametric method for Cox regression using the observed data from the whole cohort. The nonparametric estimator is the best linear combination of a nonparametric correction estimator from the calibration sample and the difference of the naive estimators from the calibration sample and the whole cohort. The asymptotic distribution is derived, and the finite sample performance of the proposed estimator is examined via intensive simulation studies. The methods are applied to the Nutritional Biomarkers Study of the Women’s Health Initiative. PMID:27546625
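
    The "best linear combination" step has a simple generic form: given two (assumed unbiased) estimates of the same coefficient with known variances and covariance, the weight that minimizes the variance of the combination is w = (v2 - c) / (v1 + v2 - 2c). The numbers in the sketch below are invented and the setting is generic, not the paper's specific Cox-regression estimators.

      beta1, beta2 = 0.42, 0.55            # two (assumed unbiased) estimates
      v1, v2, cov12 = 0.010, 0.025, 0.004  # variances and covariance (invented)

      # Weight that minimizes Var[w*beta1 + (1-w)*beta2].
      w = (v2 - cov12) / (v1 + v2 - 2.0 * cov12)
      beta_blc = w * beta1 + (1.0 - w) * beta2
      var_blc = w**2 * v1 + (1.0 - w) ** 2 * v2 + 2.0 * w * (1.0 - w) * cov12

      print(f"weight on estimator 1: {w:.3f}")
      print(f"combined estimate: {beta_blc:.3f} (variance {var_blc:.4f})")
      print(f"variance of the better single estimator: {min(v1, v2):.4f}")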

  18. Simultaneous tumor and surrogate motion tracking with dynamic MRI for radiation therapy planning

    NASA Astrophysics Data System (ADS)

    Park, Seyoun; Farah, Rana; Shea, Steven M.; Tryggestad, Erik; Hales, Russell; Lee, Junghoon

    2018-01-01

    Respiration-induced tumor motion is a major obstacle to achieving high-precision radiotherapy of cancers in the thoracic and abdominal regions. Surrogate-based estimation and tracking methods are commonly used in radiotherapy, but with limited understanding of how well the surrogates correlate with tumor motion. In this study, we propose a method to simultaneously track the lung tumor and external surrogates and to quantify their spatial correlation using dynamic MRI, which allows real-time acquisition without ionizing radiation exposure. To capture motion of the lung and the whole tumor, four MRI-compatible fiducials are placed on the patient’s chest and upper abdomen. Two types of acquisitions are performed in the sagittal orientation: multi-slice 2D cine MRIs to reconstruct a 4D-MRI, and two-slice 2D cine MRIs to simultaneously track the tumor and fiducials. A phase-binned 4D-MRI is first reconstructed from the multi-slice images using body area as a respiratory surrogate and groupwise registration; it provides 3D template volumes for the different breathing phases. The 3D tumor position is then calculated by 3D-2D template matching, in which the 3D tumor templates from the 4D-MRI reconstruction are registered to the 2D cine MRIs of the two-slice tracking dataset. 3D trajectories of the external surrogates are derived by matching a 3D geometrical model of the fiducials to their segmentations on the 2D cine MRIs. We tested the method on ten lung cancer patients. Correlation analysis showed that the 3D tumor trajectory exhibits noticeable phase mismatch and significant cycle-to-cycle motion variation, whereas the external surrogates were not sensitive enough to capture these variations. There was also significant phase mismatch between surrogate signals obtained from fiducials at different locations.
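
    As an illustration of the kind of correlation analysis described, the sketch below estimates the phase lag between a tumor trajectory and a surrogate signal from the peak of their normalized cross-correlation. The signals, frame rate, and breathing period are simulated assumptions, not data or code from the study, which obtains the trajectories from 3D-2D template matching and fiducial tracking.

        # Minimal sketch (not the authors' pipeline): phase lag between a tumor
        # trajectory and an external surrogate signal via cross-correlation.
        import numpy as np

        def phase_lag(tumor, surrogate, dt):
            """Lag (seconds) at which the two standardized signals align best,
            plus an approximate peak correlation."""
            t = (tumor - tumor.mean()) / tumor.std()
            s = (surrogate - surrogate.mean()) / surrogate.std()
            xcorr = np.correlate(t, s, mode="full")
            lag_idx = int(xcorr.argmax()) - (len(s) - 1)
            return lag_idx * dt, xcorr.max() / len(t)

        dt = 0.2                                        # assumed 5 Hz cine frame rate
        time = np.arange(0, 60, dt)                     # one minute of breathing
        surrogate = np.sin(2 * np.pi * time / 4.0)      # ~4 s breathing period
        tumor = (0.9 * np.sin(2 * np.pi * time / 4.0 - 0.5)
                 + 0.1 * np.random.default_rng(2).normal(size=time.size))
        lag, corr = phase_lag(tumor, surrogate, dt)
        print(f"estimated phase lag: {lag:.2f} s (peak correlation ~{corr:.2f})")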

  19. SELECTION AND CALIBRATION OF SUBSURFACE REACTIVE TRANSPORT MODELS USING A SURROGATE-MODEL APPROACH

    EPA Science Inventory

    While standard techniques for uncertainty analysis have been successfully applied to groundwater flow models, extension to reactive transport is frustrated by numerous difficulties, including excessive computational burden and parameter non-uniqueness. This research introduces a...
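
    The surrogate-model approach named in the title can be sketched generically: fit an inexpensive statistical emulator to a small number of expensive simulator runs, then calibrate parameters against observations using the emulator in place of the simulator. The code below is an illustration under that assumption, with a stand-in "simulator" and made-up data; it is not the study's method.

        # Generic sketch (not the EPA study's code): surrogate-based calibration
        # with a Gaussian-process emulator of an expensive simulator.
        import numpy as np
        from scipy.optimize import minimize
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import RBF

        def expensive_simulator(theta):
            """Stand-in for a reactive transport code: parameters -> output."""
            return np.sin(3.0 * theta[0]) + 0.5 * theta[1] ** 2

        # Design of experiments: a small set of simulator runs
        rng = np.random.default_rng(3)
        X_train = rng.uniform(-1.0, 1.0, size=(30, 2))
        y_train = np.array([expensive_simulator(x) for x in X_train])

        # Gaussian-process surrogate of the simulator response
        gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.5), normalize_y=True)
        gp.fit(X_train, y_train)

        # Calibration: match the surrogate prediction to a (noisy) observation;
        # with one datum and two parameters the solution is non-unique, which is
        # exactly the kind of difficulty the abstract mentions.
        observed = expensive_simulator(np.array([0.3, -0.4])) + 0.01
        misfit = lambda theta: (gp.predict(theta.reshape(1, -1))[0] - observed) ** 2
        result = minimize(misfit, x0=np.zeros(2), bounds=[(-1, 1), (-1, 1)])
        print("calibrated parameters:", result.x)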

  20. Surviving Surrogate Decision-Making: What Helps and Hampers the Experience of Making Medical Decisions for Others

    PubMed Central

    Starks, Helene; Taylor, Janelle S.; Hopley, Elizabeth K.; Fryer-Edwards, Kelly

    2007-01-01

    BACKGROUND A majority of end-of-life medical decisions are made by surrogate decision-makers who have varying degrees of preparation and comfort with their role. Having a seriously ill family member is stressful for surrogates. Moreover, most clinicians have had little training in working effectively with surrogates. OBJECTIVES To better understand the challenges of decision-making from the surrogate’s perspective. DESIGN Semistructured telephone interview study of the experience of surrogate decision-making. PARTICIPANTS Fifty designated surrogates with previous decision-making experience. APPROACH We asked surrogates to describe and reflect on their experience of making medical decisions for others. After coding transcripts, we conducted a content analysis to identify and categorize factors that made decision-making more or less difficult for surrogates. RESULTS Surrogates identified four types of factors: (1) surrogate characteristics and life circumstances (such as coping strategies and competing responsibilities), (2) surrogates’ social networks (such as intrafamily discord about the “right” decision), (3) surrogate–patient relationships and communication (such as difficulties with honoring known preferences), and (4) surrogate–clinician communication and relationship (such as interacting with a single physician whom the surrogate recognizes as the clinical spokesperson vs. many clinicians). CONCLUSIONS These data provide insights into the challenges that surrogates encounter when making decisions for loved ones and indicate areas where clinicians could intervene to facilitate the process of surrogate decision-making. Clinicians may want to include surrogates in advance care planning prior to decision-making, identify and address surrogate stressors during decision-making, and designate one person to communicate information about the patient’s condition, prognosis, and treatment options. PMID:17619223
