Sample records for uq research shows

  1. Enabling Linked Science in Global Climate Uncertainty Quantification (UQ) Research

    NASA Astrophysics Data System (ADS)

    Elsethagen, T.; Stephan, E.; Lin, G.; Williams, D.; Banks, E.

    2012-12-01

    This paper shares a real-world global climate UQ science use case and illustrates how a linked science application called Provenance Environment (ProvEn), currently under development, enables scientific teams to publish, share, link, and discover new links over their UQ research results. UQ results include terascale datasets that are published to an Earth Systems Grid Federation (ESGF) repository. ProvEn demonstrates how a scientific team conducting UQ studies can discover dataset links using its domain knowledgebase, allowing them to better understand the UQ study research objectives, the experimental protocol used, the resulting dataset lineage, related analytical findings, and ancillary literature citations, along with the social network of scientists associated with the study. This research claims that scientists using this linked science approach will not only benefit from understanding a particular dataset within a knowledge context, but will also benefit from the cross-referencing of knowledge among the numerous UQ studies stored in ESGF. ProvEn collects native forms of data provenance resources as the UQ study is carried out. The native data provenance resources can be collected from a variety of sources such as scripts, workflow engine logs, simulation log files, and scientific team members. Schema alignment is used to translate the native forms of provenance into a set of W3C PROV-O semantic statements that serve as a common interchange format and contain URI references back to resources in the UQ study dataset for querying and cross-referencing. ProvEn leverages Fedora Commons' digital object model in a Resource Oriented Architecture (ROA), i.e. a RESTful framework, to logically organize and partition native and translated provenance resources by UQ study. The ROA also provides scientists the means to search both native and translated forms of provenance.
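    To make the schema-alignment step concrete, the sketch below emits a handful of W3C PROV-O statements with the rdflib Python library. It is only an illustration of the interchange format the abstract describes, not ProvEn code; the dataset, activity, and agent URIs are hypothetical placeholders.

    ```python
    # Minimal sketch (not ProvEn itself): expressing one provenance relationship as
    # W3C PROV-O statements with rdflib. All URIs below are hypothetical placeholders.
    from rdflib import Graph, Namespace
    from rdflib.namespace import RDF

    PROV = Namespace("http://www.w3.org/ns/prov#")
    EX = Namespace("http://example.org/uq-study/")  # placeholder namespace

    g = Graph()
    g.bind("prov", PROV)

    dataset = EX["dataset/ensemble-run-042"]       # hypothetical ESGF dataset URI
    activity = EX["activity/climate-uq-workflow"]  # hypothetical workflow activity
    scientist = EX["agent/jane-doe"]               # hypothetical team member

    g.add((dataset, RDF.type, PROV.Entity))
    g.add((activity, RDF.type, PROV.Activity))
    g.add((scientist, RDF.type, PROV.Agent))
    g.add((dataset, PROV.wasGeneratedBy, activity))
    g.add((activity, PROV.wasAssociatedWith, scientist))

    print(g.serialize(format="turtle"))
    ```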

  2. Advances and Challenges In Uncertainty Quantification with Application to Climate Prediction, ICF design and Science Stockpile Stewardship

    NASA Astrophysics Data System (ADS)

    Klein, R.; Woodward, C. S.; Johannesson, G.; Domyancic, D.; Covey, C. C.; Lucas, D. D.

    2012-12-01

    Uncertainty Quantification (UQ) is a critical field within 21st century simulation science that resides at the very center of the web of emerging predictive capabilities. The science of UQ holds the promise of giving much greater meaning to the results of complex large-scale simulations, allowing uncertainties to be quantified and bounded. This powerful capability will yield new insights into scientific predictions (e.g. Climate) of great impact in both national and international arenas, allow informed decisions on the design of critical experiments (e.g. ICF capsule design, MFE, NE) in many scientific fields, and assign confidence bounds to scientifically predictable outcomes (e.g. nuclear weapons design). In this talk I will discuss a major new strategic initiative (SI) we have developed at Lawrence Livermore National Laboratory to advance the science of Uncertainty Quantification at LLNL, focusing in particular on (a) the research and development of new algorithms and methodologies of UQ as applied to multi-physics multi-scale codes, (b) incorporation of these advancements into a global UQ Pipeline (i.e. a computational superstructure) that will simplify user access to sophisticated tools for UQ studies as well as act as a self-guided, self-adapting UQ engine for UQ studies on extreme computing platforms, and (c) the use of laboratory applications as a test bed for new algorithms and methodologies. The initial SI focus has been on applications for the quantification of uncertainty associated with Climate prediction, but the validated UQ methodologies we have developed are now being fed back into Science Based Stockpile Stewardship (SSS) and ICF UQ efforts. To make advancements in several of these UQ grand challenges, I will focus in this talk on the following three research areas in our Strategic Initiative: Error Estimation in multi-physics and multi-scale codes; Tackling the "Curse of High Dimensionality"; and development of an advanced UQ Computational Pipeline to enable complete UQ workflow and analysis for ensemble runs at the extreme scale (e.g. exascale) with self-guiding adaptation in the UQ Pipeline engine. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344 and was funded by the Uncertainty Quantification Strategic Initiative Laboratory Directed Research and Development Project at LLNL under project tracking code 10-SI-013 (UCRL LLNL-ABS-569112).
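    The UQ Pipeline described above orchestrates large ensembles of simulation runs over uncertain inputs. As a purely illustrative sketch of that kind of ensemble generation, the snippet below draws a Latin hypercube sample with SciPy; the parameter names and ranges are hypothetical and are not taken from the LLNL pipeline.

    ```python
    # Illustrative sketch only: generating a Latin hypercube ensemble of input parameters
    # of the kind a UQ pipeline would hand off to a simulation scheduler.
    # The parameter names and ranges below are hypothetical.
    import numpy as np
    from scipy.stats import qmc

    params = {"cloud_entrainment": (0.1, 1.0),
              "ice_fall_speed": (0.5, 2.0),
              "co2_sensitivity": (2.0, 4.5)}
    names = list(params)
    lows = np.array([params[n][0] for n in names])
    highs = np.array([params[n][1] for n in names])

    sampler = qmc.LatinHypercube(d=len(names), seed=0)
    unit_samples = sampler.random(n=64)              # 64 ensemble members in [0, 1)^d
    ensemble = qmc.scale(unit_samples, lows, highs)  # rescale to physical ranges

    for i, member in enumerate(ensemble[:3]):
        print(f"run {i:03d}: " + ", ".join(f"{n}={v:.3f}" for n, v in zip(names, member)))
    ```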

  3. Two solanesyl diphosphate synthases with different subcellular localizations and their respective physiological roles in Oryza sativa

    PubMed Central

    Ohara, Kazuaki; Sasaki, Kanako; Yazaki, Kazufumi

    2010-01-01

    Long chain prenyl diphosphates are crucial biosynthetic precursors of ubiquinone (UQ) in many organisms, ranging from bacteria to humans, as well as precursors of plastoquinone in photosynthetic organisms. The cloning and characterization of two solanesyl diphosphate synthase genes, OsSPS1 and OsSPS2, in Oryza sativa is reported here. OsSPS1 was highly expressed in root tissue whereas OsSPS2 was found to be high in both leaves and roots. Enzymatic characterization using recombinant proteins showed that both OsSPS1 and OsSPS2 could produce solanesyl diphosphates as their final product, while OsSPS1 showed stronger activity than OsSPS2. However, an important biological difference was observed between the two genes: OsSPS1 complemented the yeast coq1 disruptant, which does not form UQ, whereas OsSPS2 only very weakly complemented the growth defect of the coq1 mutant. HPLC analyses showed that both OsSPS1 and OsSPS2 yeast transformants produced UQ9 instead of UQ6, which is the native yeast UQ. According to the complementation study, the UQ9 levels in OsSPS2 transformants were much lower than that of OsSPS1. Green fluorescent protein fusion analyses showed that OsSPS1 localized to mitochondria, while OsSPS2 localized to plastids. This suggests that OsSPS1 is involved in the supply of solanesyl diphosphate for ubiquinone-9 biosynthesis in mitochondria, whereas OsSPS2 is involved in providing solanesyl diphosphate for plastoquinone-9 formation. These findings indicate that O. sativa has a different mechanism for the supply of isoprenoid precursors in UQ biosynthesis from Arabidopsis thaliana, in which SPS1 provides a prenyl moiety for UQ9 at the endoplasmic reticulum. PMID:20421194

  4. Two solanesyl diphosphate synthases with different subcellular localizations and their respective physiological roles in Oryza sativa.

    PubMed

    Ohara, Kazuaki; Sasaki, Kanako; Yazaki, Kazufumi

    2010-06-01

    Long chain prenyl diphosphates are crucial biosynthetic precursors of ubiquinone (UQ) in many organisms, ranging from bacteria to humans, as well as precursors of plastoquinone in photosynthetic organisms. The cloning and characterization of two solanesyl diphosphate synthase genes, OsSPS1 and OsSPS2, in Oryza sativa is reported here. OsSPS1 was highly expressed in root tissue whereas OsSPS2 was found to be high in both leaves and roots. Enzymatic characterization using recombinant proteins showed that both OsSPS1 and OsSPS2 could produce solanesyl diphosphates as their final product, while OsSPS1 showed stronger activity than OsSPS2. However, an important biological difference was observed between the two genes: OsSPS1 complemented the yeast coq1 disruptant, which does not form UQ, whereas OsSPS2 only very weakly complemented the growth defect of the coq1 mutant. HPLC analyses showed that both OsSPS1 and OsSPS2 yeast transformants produced UQ9 instead of UQ6, which is the native yeast UQ. According to the complementation study, the UQ9 levels in OsSPS2 transformants were much lower than that of OsSPS1. Green fluorescent protein fusion analyses showed that OsSPS1 localized to mitochondria, while OsSPS2 localized to plastids. This suggests that OsSPS1 is involved in the supply of solanesyl diphosphate for ubiquinone-9 biosynthesis in mitochondria, whereas OsSPS2 is involved in providing solanesyl diphosphate for plastoquinone-9 formation. These findings indicate that O. sativa has a different mechanism for the supply of isoprenoid precursors in UQ biosynthesis from Arabidopsis thaliana, in which SPS1 provides a prenyl moiety for UQ9 at the endoplasmic reticulum.

  5. Electrochemistry of LB films of mixed MGDG:UQ on ITO.

    PubMed

    Hoyo, Javier; Guaus, Ester; Torrent-Burgués, Juan; Sanz, Fausto

    2015-08-01

    The electrochemical behaviour of biomimetic monolayers of monogalactosyldiacylglycerol (MGDG) incorporating ubiquinone-10 (UQ) has been investigated. MGDG is the principal component of the thylakoid membrane, and UQ seems to be a good substitute for plastoquinone-9, which is involved in the photosynthetic electron transport chain. The monolayers were prepared using the Langmuir and Langmuir-Blodgett (LB) techniques, and the redox behaviour of the LB films, transferred at several surface pressures onto glass covered with indium-tin oxide (ITO), was characterized by cyclic voltammetry. The cyclic voltammograms show that UQ molecules present two redox processes (I and II) at high UQ content and high surface pressures, and only one redox process (I) at low UQ content and low surface pressures. The apparent rate constants calculated for processes I and II indicate a different kinetic control for the reduction and the oxidation of the UQ/UQH2 redox couple, being k_R,app(I) = 2.2·10⁻⁵ s⁻¹, k_R,app(II) = 5.1·10⁻¹⁴ s⁻¹, k_O,app(I) = 3.3·10⁻³ s⁻¹, and k_O,app(II) = 6.1·10⁻⁶ s⁻¹, respectively. Correlating the redox response with the physical states of the LB films allows the positions of the UQ molecules in the biomimetic monolayer to be determined; these positions, known as diving and swimming, change with the surface pressure and the UQ content. Copyright © 2015 Elsevier B.V. All rights reserved.

  6. Final Report: Quantification of Uncertainty in Extreme Scale Computations (QUEST)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Marzouk, Youssef; Conrad, Patrick; Bigoni, Daniele

    QUEST (www.quest-scidac.org) is a SciDAC Institute that is focused on uncertainty quantification (UQ) in large-scale scientific computations. Our goals are to (1) advance the state of the art in UQ mathematics, algorithms, and software; and (2) provide modeling, algorithmic, and general UQ expertise, together with software tools, to other SciDAC projects, thereby enabling and guiding a broad range of UQ activities in their respective contexts. QUEST is a collaboration among six institutions (Sandia National Laboratories, Los Alamos National Laboratory, the University of Southern California, Massachusetts Institute of Technology, the University of Texas at Austin, and Duke University) with a history of joint UQ research. Our vision encompasses all aspects of UQ in leadership-class computing. This includes the well-founded setup of UQ problems; characterization of the input space given available data/information; local and global sensitivity analysis; adaptive dimensionality and order reduction; forward and inverse propagation of uncertainty; handling of application code failures, missing data, and hardware/software fault tolerance; and model inadequacy, comparison, validation, selection, and averaging. The nature of the UQ problem requires the seamless combination of data, models, and information across this landscape in a manner that provides a self-consistent quantification of requisite uncertainties in predictions from computational models. Accordingly, our UQ methods and tools span an interdisciplinary space across applied math, information theory, and statistics. The MIT QUEST effort centers on statistical inference and methods for surrogate or reduced-order modeling. MIT personnel have been responsible for the development of adaptive sampling methods, methods for approximating computationally intensive models, and software for both forward uncertainty propagation and statistical inverse problems. A key software product of the MIT QUEST effort is the MIT Uncertainty Quantification library, called MUQ (muq.mit.edu).
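    MUQ itself is not shown here; as a generic illustration of forward uncertainty propagation of the kind the abstract mentions, the sketch below pushes samples of uncertain inputs through a toy model with plain Monte Carlo. The toy model and input distributions are hypothetical.

    ```python
    # A minimal non-intrusive forward-propagation sketch using plain Monte Carlo.
    # This does not use the MUQ API; it only illustrates the generic idea of pushing
    # input uncertainty through a model. The toy model below is hypothetical.
    import numpy as np

    rng = np.random.default_rng(0)

    def model(x):
        """Toy forward model: a scalar quantity of interest from two uncertain inputs."""
        return np.sin(x[:, 0]) + 0.5 * x[:, 1] ** 2

    n_samples = 10_000
    x = rng.normal(loc=[0.0, 1.0], scale=[0.2, 0.1], size=(n_samples, 2))  # input uncertainty
    y = model(x)                                                           # propagate

    print(f"mean = {y.mean():.4f}, std = {y.std(ddof=1):.4f}")
    print(f"95% interval = [{np.percentile(y, 2.5):.4f}, {np.percentile(y, 97.5):.4f}]")
    ```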

  7. An embedding of the universal Askey-Wilson algebra into Uq(sl2) ⊗ Uq(sl2) ⊗ Uq(sl2)

    NASA Astrophysics Data System (ADS)

    Huang, Hau-Wen

    2017-09-01

    The Askey-Wilson algebras were used to interpret the algebraic structure hidden in the Racah-Wigner coefficients of the quantum algebra Uq(sl2). In this paper, we display an injection of a universal analog Δq of the Askey-Wilson algebras into Uq(sl2) ⊗ Uq(sl2) ⊗ Uq(sl2) behind the application. Moreover we establish the decomposition rules for 3-fold tensor products of irreducible Verma Uq(sl2)-modules and of finite-dimensional irreducible Uq(sl2)-modules into direct sums of finite-dimensional irreducible Δq-modules. As an application, we derive a formula for the Racah-Wigner coefficients of Uq(sl2).
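    For readers unfamiliar with the notation, one standard presentation of Uq(sl2) is recalled below; conventions for the generators and relations vary slightly between authors.

    ```latex
    % One standard presentation of U_q(sl_2) (conventions vary between authors):
    % generators E, F, K^{\pm 1} subject to
    \begin{aligned}
    K K^{-1} &= K^{-1} K = 1, \\
    K E K^{-1} &= q^{2} E, \qquad K F K^{-1} = q^{-2} F, \\
    [E, F] &= \frac{K - K^{-1}}{q - q^{-1}}.
    \end{aligned}
    ```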

  8. Preliminary Climate Uncertainty Quantification Study on Model-Observation Test Beds at Earth Systems Grid Federation Repository

    NASA Astrophysics Data System (ADS)

    Lin, G.; Stephan, E.; Elsethagen, T.; Meng, D.; Riihimaki, L. D.; McFarlane, S. A.

    2012-12-01

    Uncertainty quantification (UQ) is the science of quantitative characterization and reduction of uncertainties in applications. It determines how likely certain outcomes are if some aspects of the system are not exactly known. UQ studies, such as those on atmospheric datasets, have greatly increased in size and complexity because they now comprise additional complex iterative steps, involve numerous simulation runs, and can include additional analytical products such as charts, reports, and visualizations to explain levels of uncertainty. These new requirements greatly expand the need for metadata support beyond the NetCDF convention and vocabulary, and as a result an additional formal data provenance ontology is required to provide a historical explanation of the origin of the dataset that includes references between the explanations and components within the dataset. This work shares a climate observation data UQ science use case and illustrates how to reduce climate observation data uncertainty and how to use a linked science application called Provenance Environment (ProvEn) to enable and facilitate scientific teams to publish, share, link, and discover knowledge about the UQ research results. UQ results include terascale datasets that are published to an Earth Systems Grid Federation (ESGF) repository. Uncertainty exists in observation datasets due to sensor data processing (such as time averaging), sensor failure in extreme weather conditions, and sensor manufacturing error. To reduce the uncertainty in the observation datasets, a method based on Principal Component Analysis (PCA) was proposed to recover the missing values in observation data. Several large principal components (PCs) of data with missing values are computed from the available values using an iterative method. The computed PCs can approximate the true PCs with high accuracy provided a condition on the missing values is met; the iterative method greatly improves the computational efficiency of computing the PCs. Moreover, noise removal is performed at the same time as the missing values are computed, by using only the several largest PCs. The uncertainty quantification is done through statistical analysis of the distribution of the different PCs. To record the above UQ process and provide an explanation of the uncertainty before and after the UQ process on the observation datasets, an additional data provenance ontology, such as ProvEn, is necessary. In this study, we demonstrate how to reduce observation data uncertainty on climate model-observation test beds and how to use ProvEn to record the UQ process on ESGF. ProvEn demonstrates how a scientific team conducting UQ studies can discover dataset links using its domain knowledgebase, allowing them to better understand and convey the UQ study research objectives, the experimental protocol used, the resulting dataset lineage, related analytical findings, and ancillary literature citations, along with the social network of scientists associated with the study. Climate scientists will not only benefit from understanding a particular dataset within a knowledge context, but will also benefit from the cross-referencing of knowledge among the numerous UQ studies being stored in ESGF.
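    A generic version of the PCA-based recovery step described above can be sketched as follows; this is an illustrative iterative truncated-SVD imputation, not the authors' exact algorithm or convergence condition.

    ```python
    # Generic sketch of iterative PCA-based recovery of missing values (not the authors'
    # exact algorithm): fill gaps, fit a few leading principal components via truncated SVD,
    # reconstruct, and iterate until the filled values stabilize. Keeping only a few leading
    # PCs also smooths noise, as the abstract notes.
    import numpy as np

    def pca_impute(data, n_components=3, n_iter=50, tol=1e-6):
        x = np.array(data, dtype=float)
        missing = np.isnan(x)
        col_means = np.nanmean(x, axis=0)
        x[missing] = np.take(col_means, np.where(missing)[1])  # initial fill with column means

        for _ in range(n_iter):
            mean = x.mean(axis=0)
            u, s, vt = np.linalg.svd(x - mean, full_matrices=False)
            recon = mean + (u[:, :n_components] * s[:n_components]) @ vt[:n_components]
            delta = np.max(np.abs(recon[missing] - x[missing])) if missing.any() else 0.0
            x[missing] = recon[missing]        # update only the missing entries
            if delta < tol:
                break
        return x
    ```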

  9. A simple method for constructing the inhomogeneous quantum group IGLq(n) and its universal enveloping algebra Uq(igl(n))

    NASA Astrophysics Data System (ADS)

    Shariati, A.; Aghamohammadi, A.

    1995-12-01

    We propose a simple and concise method to construct the inhomogeneous quantum group IGLq(n) and its universal enveloping algebra Uq(igl(n)). Our technique is based on embedding an n-dimensional quantum space in an (n+1)-dimensional one as the subset x_{n+1} = 1. This is possible only if one considers the multiparametric quantum space whose parameters are fixed in a specific way. The quantum group IGLq(n) is then the subset of GLq(n+1) which leaves the x_{n+1} = 1 subset invariant. For the deformed universal enveloping algebra Uq(igl(n)), we will show that it can also be embedded in Uq(gl(n+1)), provided one uses the multiparametric deformation of U(gl(n+1)) with a specific choice of its parameters.

  10. Validation of the urgency questionnaire in Portuguese: A new instrument to assess overactive bladder syndrome.

    PubMed

    Moraes, Rodolfo Pacheco de; Silva, Jonas Lopes da; Calado, Adriano Almeida; Cavalcanti, Geraldo de Aguiar

    2018-01-01

    Overactive Bladder (OAB) is a clinical condition characterized by symptoms reported by patients. Therefore, measurement instruments based on reported information are important for understanding its impact and treatment benefits. The aim of this study was to translate, culturally adapt and validate the Urgency Questionnaire (UQ) in Portuguese. Initially, the UQ was translated and culturally adapted to Portuguese. Sixty-three volunteers were enrolled in the study and were interviewed to respond to the Portuguese version of the UQ and to the validated Portuguese version of the Overactive Bladder Questionnaire short-form (OABq-SF), used as the gold standard measurement for the validation process. Psychometric properties such as criterion validity, stability, and reliability were tested. Forty-six subjects were included in the symptomatic group (presence of "urgency"), and seventeen were included in the asymptomatic group (control group). There was a difference between symptomatic and asymptomatic subjects on all of the subscales (p≤0.001). The UQ subscales correlated with the OABq-SF subscales (p≤0.01), except the subscale "time to control urgency" and the item "impact" from the visual analog scales (VAS). However, these scales correlated with the OABq-SF Symptom Bother Scale. The UQ subscales demonstrated stability over time (p<0.05), but the subscale "fear of incontinence" and the item "severity" of the VAS did not. All of the UQ subscales showed internal consistencies that were considered to be good or excellent. The Portuguese version of the UQ proved to be a valid tool for the evaluation of OAB in individuals whose native language is Portuguese. Copyright® by the International Brazilian Journal of Urology.

  11. Molecular Genetics of Ubiquinone Biosynthesis in Animals

    PubMed Central

    Wang, Ying; Hekimi, Siegfried

    2014-01-01

    Ubiquinone (UQ), also known as coenzyme Q (CoQ), is a redox-active lipid present in all cellular membranes where it functions in a variety of cellular processes. The best known functions of UQ are to act as a mobile electron carrier in the mitochondrial respiratory chain and to serve as a lipid soluble antioxidant in cellular membranes. All eukaryotic cells synthesize their own UQ. Most of the current knowledge on the UQ biosynthetic pathway was obtained by studying Escherichia coli and S. cerevisiae UQ-deficient mutants. The orthologues of all the genes known from yeast studies to be involved in UQ biosynthesis have subsequently been found in higher organisms. Animal mutants with different genetic defects in UQ biosynthesis display very different phenotypes, despite the fact that in all these mutants the same biosynthetic pathway is affected. This review summarizes the present knowledge of the eukaryotic biosynthesis of UQ, with focus on the biosynthetic genes identified in animals, including C. elegans, rodents and humans. Moreover, we review the phenotypes of mutants in these genes and discuss the functional consequences of UQ deficiency in general. PMID:23190198

  12. Uncertainty Quantification Analysis of Both Experimental and CFD Simulation Data of a Bench-scale Fluidized Bed Gasifier

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shahnam, Mehrdad; Gel, Aytekin; Subramaniyan, Arun K.

    Adequate assessment of the uncertainties in modeling and simulation is becoming an integral part of simulation-based engineering design. The goal of this study is to demonstrate the application of a non-intrusive Bayesian uncertainty quantification (UQ) methodology in multiphase (gas-solid) flows with experimental and simulation data, as part of our research efforts to determine the best-suited approach for UQ of a bench-scale fluidized bed gasifier. UQ analysis was first performed on the available experimental data. Global sensitivity analysis performed as part of the UQ analysis shows that among the three operating factors, steam to oxygen ratio has the most influence on syngas composition in the bench-scale gasifier experiments. An analysis for forward propagation of uncertainties was performed and results show that an increase in steam to oxygen ratio leads to an increase in H2 mole fraction and a decrease in CO mole fraction. These findings are in agreement with the ANOVA analysis performed in the reference experimental study. Another contribution, in addition to the UQ analysis, is an optimization-based approach to identify the next best set of additional experimental samples; the surrogate models constructed as part of the UQ analysis are employed to improve the information gain and make incremental recommendations, should the possibility to add more experiments arise. In the second step, a series of simulations were carried out with the open-source computational fluid dynamics software MFiX to reproduce the experimental conditions, where three operating factors, i.e., coal flow rate, coal particle diameter, and steam-to-oxygen ratio, were systematically varied to understand their effect on the syngas composition. Bayesian UQ analysis was performed on the numerical results. As part of the Bayesian UQ analysis, a global sensitivity analysis was performed based on the simulation results, which shows that the predicted syngas composition is strongly affected not only by the steam-to-oxygen ratio (which was observed in experiments as well) but also by variation in the coal flow rate and particle diameter (which was not observed in experiments). The carbon monoxide mole fraction is underpredicted at lower steam-to-oxygen ratios and overpredicted at higher steam-to-oxygen ratios. The opposite trend is observed for the carbon dioxide mole fraction. These discrepancies are attributed either to excessive segregation of the phases, which leads to fuel-rich or -lean regions, or to the selection of reaction models, where different reaction models and kinetics can lead to different syngas compositions throughout the gasifier. To improve the quality of the numerical models used, the effect that uncertainties in the reaction models for gasification, char oxidation, carbon monoxide oxidation, and water-gas shift have on the syngas composition was investigated at different grid resolutions, along with the bed temperature. The global sensitivity analysis showed that among the various reaction models employed for water-gas shift, gasification, and char oxidation, the choice of water-gas shift reaction model has the greatest influence on syngas composition, with the gasification reaction model being second. Syngas composition also shows a small sensitivity to the bed temperature. The hydrodynamic behavior of the bed did not change beyond a grid spacing of 18 times the particle diameter. However, the syngas concentration continued to be affected by the grid resolution down to 9 times the particle diameter. This is due to a better resolution of the phasic interface between the gas and solid phases, which leads to stronger heterogeneous reactions. This report is a compilation of three manuscripts published in peer-reviewed journals for the series of studies mentioned above.
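    As an illustration of the variance-based global sensitivity analysis discussed above, the sketch below uses the SALib Python package (assumed installed) on a stand-in surrogate; the factor bounds and the surrogate function are placeholders, not the MFiX gasifier model or the study's data.

    ```python
    # Illustrative variance-based (Sobol) sensitivity sketch with SALib, assuming SALib is
    # installed. Bounds and the surrogate below are placeholders, not the MFiX model.
    import numpy as np
    from SALib.sample import saltelli
    from SALib.analyze import sobol

    problem = {
        "num_vars": 3,
        "names": ["steam_to_oxygen", "coal_flow_rate", "particle_diameter"],
        "bounds": [[0.5, 2.0], [1.0, 5.0], [1e-4, 5e-4]],
    }

    def surrogate_h2_fraction(x):
        """Stand-in surrogate for the predicted H2 mole fraction."""
        s, f, d = x[:, 0], x[:, 1], x[:, 2]
        return 0.2 + 0.1 * s - 0.02 * f + 5.0 * d

    param_values = saltelli.sample(problem, 1024)   # N * (2*D + 2) samples
    y = surrogate_h2_fraction(param_values)
    si = sobol.analyze(problem, y)
    for name, s1, st in zip(problem["names"], si["S1"], si["ST"]):
        print(f"{name}: first-order = {s1:.3f}, total = {st:.3f}")
    ```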

  13. The UbiK protein is an accessory factor necessary for bacterial ubiquinone (UQ) biosynthesis and forms a complex with the UQ biogenesis factor UbiJ.

    PubMed

    Loiseau, Laurent; Fyfe, Cameron; Aussel, Laurent; Hajj Chehade, Mahmoud; Hernández, Sara B; Faivre, Bruno; Hamdane, Djemel; Mellot-Draznieks, Caroline; Rascalou, Bérengère; Pelosi, Ludovic; Velours, Christophe; Cornu, David; Lombard, Murielle; Casadesús, Josep; Pierrel, Fabien; Fontecave, Marc; Barras, Frédéric

    2017-07-14

    Ubiquinone (UQ), also referred to as coenzyme Q, is a widespread lipophilic molecule in both prokaryotes and eukaryotes in which it primarily acts as an electron carrier. Eleven proteins are known to participate in UQ biosynthesis in Escherichia coli, and we recently demonstrated that UQ biosynthesis requires additional, nonenzymatic factors, some of which are still unknown. Here, we report on the identification of a bacterial gene, yqiC, which is required for efficient UQ biosynthesis, and which we have renamed ubiK. Using several methods, we demonstrated that the UbiK protein forms a complex with the C-terminal part of UbiJ, another UQ biogenesis factor we previously identified. We found that both proteins are likely to contribute to global UQ biosynthesis rather than to a specific biosynthetic step, because both ubiK and ubiJ mutants accumulated octaprenylphenol, an early intermediate of the UQ biosynthetic pathway. Interestingly, we found that both proteins are dispensable for UQ biosynthesis under anaerobiosis, even though they were expressed in the absence of oxygen. We also provide evidence that the UbiK-UbiJ complex interacts with palmitoleic acid, a major lipid in E. coli. Last, in Salmonella enterica, ubiK was required for proliferation in macrophages and virulence in mice. We conclude that although the role of the UbiK-UbiJ complex remains unknown, our results support the hypothesis that UbiK is an accessory factor of Ubi enzymes and facilitates UQ biosynthesis by acting as an assembly factor, a targeting factor, or both. © 2017 by The American Society for Biochemistry and Molecular Biology, Inc.

  14. On Goal-Oriented, Hydrogeological Site Investigation: A Holistic Approach (Henry Darcy Medal Lecture)

    NASA Astrophysics Data System (ADS)

    Rubin, Yoram

    2016-04-01

    UQ (for Uncertainty Quantification) is a critical element of groundwater management and, by extension, of hydrological site investigation. While it is clear that UQ is an important goal, there is ambiguity as to what the target of the UQ should be, and how to make UQ relevant in the context of public policy. When planning for UQ (i.e., what measurements to take, where, how many, and at what frequency), one could consider environmental performance parameters (EPMs, such as concentrations or travel time) as the targets of site investigation. But there is a need to go beyond EPMs, and to consider the uncertainty related to impacts such as enhanced cancer-risk due to groundwater contamination or, more generally, to decisions facing regulators. In any case, UQ requires site investigation, and decision-makers, who end up paying for it, are not really interested in EPMs: they care about making operational decisions that are legally defensible and justified from the perspective of the public good. The key to UQ, whether considering EPMs or operational decisions concerning the public good, is defining a suitable strategy for site investigation. There is a body of published work relating site investigations to EPMs, but much less is known about how to support operational decisions with strategies for site characterization. In this lecture, I will address this issue and I will outline a comprehensive approach for addressing it using a statistical formalism that couples hypothesis testing with Bayesian statistics. I refer to this approach as goal-oriented site investigation. I will show how site investigation strategies, with specifics such as which measurements to take and where, could be related to goals linked with operational decisions. This includes (1) defining the relevant goals; (2) formulating hypotheses; (3) defining alternative strategies for site investigation; and (4) evaluating them in terms of probabilities for making errors in accepting or rejecting the hypotheses.

  15. Structure of electron transfer flavoprotein-ubiquinone oxidoreductase and electron transfer to the mitochondrial ubiquinone pool.

    PubMed

    Zhang, Jian; Frerman, Frank E; Kim, Jung-Ja P

    2006-10-31

    Electron transfer flavoprotein-ubiquinone oxidoreductase (ETF-QO) is a 4Fe4S flavoprotein located in the inner mitochondrial membrane. It catalyzes ubiquinone (UQ) reduction by ETF, linking oxidation of fatty acids and some amino acids to the mitochondrial respiratory chain. Deficiencies in ETF or ETF-QO result in multiple acyl-CoA dehydrogenase deficiency, a human metabolic disease. Crystal structures of ETF-QO with and without bound UQ were determined, and they are essentially identical. The molecule forms a single structural domain. Three functional regions bind FAD, the 4Fe4S cluster, and UQ and are closely packed and share structural elements, resulting in no discrete structural domains. The UQ-binding pocket consists mainly of hydrophobic residues, and UQ binding differs from that of other UQ-binding proteins. ETF-QO is a monotopic integral membrane protein. The putative membrane-binding surface contains an alpha-helix and a beta-hairpin, forming a hydrophobic plateau. The UQ-flavin distance (8.5 A) is shorter than the UQ-cluster distance (18.8 A), and the very similar redox potentials of FAD and the cluster strongly suggest that the flavin, not the cluster, transfers electrons to UQ. Two possible electron transfer paths can be envisioned. First, electrons from the ETF flavin semiquinone may enter the ETF-QO flavin one by one, followed by rapid equilibration with the cluster. Alternatively, electrons may enter via the cluster, followed by equilibration between centers. In both cases, when ETF-QO is reduced to a two-electron reduced state (one electron at each redox center), the enzyme is primed to reduce UQ to ubiquinol via FAD.

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Engel, David W.; Jarman, Kenneth D.; Xu, Zhijie

    This report describes our initial research to quantify uncertainties in the identification and characterization of possible attack states in a network. As a result, we should be able to estimate the current state in which the network is operating, based on a wide variety of network data, and attach a defensible measure of confidence to these state estimates. The output of this research will be new uncertainty quantification (UQ) methods to help develop a process for model development and apply UQ to characterize attacks/adversaries, create an understanding of the degree to which methods scale to "big" data, and offer methods for addressing model approaches with regard to validation and accuracy.

  17. multiUQ: An intrusive uncertainty quantification tool for gas-liquid multiphase flows

    NASA Astrophysics Data System (ADS)

    Turnquist, Brian; Owkes, Mark

    2017-11-01

    Uncertainty quantification (UQ) can improve our understanding of the sensitivity of gas-liquid multiphase flows to variability about inflow conditions and fluid properties, creating a valuable tool for engineers. While non-intrusive UQ methods (e.g., Monte Carlo) are simple and robust, the cost associated with these techniques can render them unrealistic. In contrast, intrusive UQ techniques modify the governing equations by replacing deterministic variables with stochastic variables, adding complexity, but making UQ cost effective. Our numerical framework, called multiUQ, introduces an intrusive UQ approach for gas-liquid flows, leveraging a polynomial chaos expansion of the stochastic variables: density, momentum, pressure, viscosity, and surface tension. The gas-liquid interface is captured using a conservative level set approach, including a modified reinitialization equation which is robust and quadrature free. A least-squares method is leveraged to compute the stochastic interface normal and curvature needed in the continuum surface force method for surface tension. The solver is tested by applying uncertainty to one or two variables and verifying results against the Monte Carlo approach. NSF Grant #1511325.
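    multiUQ is intrusive, but the underlying polynomial chaos expansion is easiest to illustrate non-intrusively. The sketch below fits a one-dimensional Hermite PCE to a toy response by least squares; the toy model and sample size are arbitrary choices for illustration and are not part of multiUQ.

    ```python
    # Minimal non-intrusive polynomial chaos sketch for one Gaussian input (contrast with
    # the intrusive approach above): expand a toy quantity of interest in probabilists'
    # Hermite polynomials and estimate the coefficients by least squares.
    import numpy as np
    from math import factorial
    from numpy.polynomial import hermite_e as He

    rng = np.random.default_rng(1)
    degree = 4

    xi = rng.standard_normal(2000)          # samples of the standard-normal germ
    y = np.exp(0.3 * xi) + 0.1 * xi ** 2    # toy stochastic response

    V = He.hermevander(xi, degree)          # Vandermonde matrix of He_0..He_4
    coeffs, *_ = np.linalg.lstsq(V, y, rcond=None)

    # With E[He_n^2] = n! for a standard-normal germ, mean and variance follow directly:
    norms = np.array([factorial(n) for n in range(degree + 1)])
    mean = coeffs[0]
    var = np.sum(coeffs[1:] ** 2 * norms[1:])
    print(f"PCE mean ≈ {mean:.4f}, PCE variance ≈ {var:.4f}")
    ```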

  18. Development and psychometric evaluation of the urgency questionnaire for evaluating severity and health-related quality of life impact of urinary urgency in overactive bladder.

    PubMed

    Coyne, Karin S; Sexton, Chris C; Thompson, Christine; Bavendam, Tamara; Brubaker, Linda

    2015-03-01

    Urinary urgency is the cardinal symptom of overactive bladder (OAB). However, there is no single instrument that assesses the context, severity, intensity, and daily life impact of urinary urgency. The purpose of this manuscript is to describe the methods and results of the qualitative and quantitative research conducted to develop a new tool for this purpose, the Urgency Questionnaire (UQ). Qualitative data from interviews with patients with urinary urgency were used to develop and refine the items and response options of the UQ. Three studies were used to evaluate psychometric properties: a clinical trial of tolterodine (Detrol; n = 974); a psychometric validation study (n = 163); and a test-retest validation study (n = 47). Item and exploratory factor analysis (EFA) were performed to assess the subscale structure, and the psychometric performance of the resulting scales was evaluated. Fifteen Likert-scale items and four VAS questions were retained. A four-factor solution was shown to best fit the data, with the subscales: Impact on Daily Activities, Time to Control Urgency, Nocturia, and Fear of Incontinence. All subscales and VAS items demonstrated good reliability (Cronbach's α 0.79-0.94), convergent and discriminant validity, and responsiveness to change. The UQ differentiated between OAB patients and controls. The results provide quantitative evidence that urinary urgency, as assessed by the UQ, is a pathological sensation distinctive from the normal urge to void and suggest that the UQ might be a reliable, valid, and responsive instrument for evaluating the severity and HRQL impact of urinary urgency in OAB.
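    The internal-consistency statistic reported above (Cronbach's α) can be computed from an item-response matrix as sketched below; the simulated responses are made up solely to exercise the formula.

    ```python
    # Sketch of Cronbach's alpha for one subscale, computed from an item-response matrix.
    # The data below are simulated, not the study's responses.
    import numpy as np

    def cronbach_alpha(items):
        """items: (n_respondents, n_items) matrix of scores for one subscale."""
        items = np.asarray(items, dtype=float)
        k = items.shape[1]
        item_vars = items.var(axis=0, ddof=1).sum()
        total_var = items.sum(axis=1).var(ddof=1)
        return k / (k - 1) * (1.0 - item_vars / total_var)

    rng = np.random.default_rng(2)
    latent = rng.normal(size=(100, 1))                    # shared trait
    responses = latent + 0.5 * rng.normal(size=(100, 4))  # four correlated items
    print(f"alpha = {cronbach_alpha(responses):.2f}")
    ```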

  19. Microbial community in biofilm on membrane surface of submerged MBR: effect of in-line cleaning chemical agent.

    PubMed

    Lim, B R; Ahn, K H; Song, K G; Cho, J W

    2005-01-01

    The objective of this study was to use quinone profiles to investigate how the microbial community pattern changes in response to the chemical agent used for membrane in-line cleaning in a submerged membrane bioreactor (SMBR). The dominant quinone types in the biofilm were ubiquinones (UQ)-8 and -10, followed by menaquinones (MK)-8(H4), -7 and UQ-9, whereas those of the suspended microorganisms were UQ-8 and UQ-10, followed by MK-8(H4), -7 and -11. Both UQ and MK contents decreased with increasing NaClO dosage, and UQ appeared to be more resistant than MK. In addition, COD and DOC concentrations increased with increasing NaClO dosage up to 0.05 g-NaClO/g-SS. The organic degradation performance of the microbial community in the presence of NaClO was impaired. The present study suggested that larger added amounts of NaClO caused an inhibition of organic degradation and cell lysis.

  20. Optimization of nisin production by Lactococcus lactis UQ2 using supplemented whey as alternative culture medium.

    PubMed

    González-Toledo, S Y; Domínguez-Domínguez, J; García-Almendárez, B E; Prado-Barragán, L A; Regalado-González, C

    2010-08-01

    Lactococcus lactis UQ2 is a nisin A-producing native strain. In the present study, the production of nisin by L. lactis UQ2 in a bioreactor using supplemented sweet whey (SW) was optimized by a statistical design of experiments and response surface methodology (RSM). In a first approach, a fractional factorial design (FFD) of order 2^(5-1) with 3 central points was used. The effect on nisin production of air flow, SW, soybean peptone (SP), MgSO4/MnSO4 mixture, and Tween 80 was evaluated. From the FFD, the most significant factors affecting nisin production were SP (P = 0.011) and SW (P = 0.037). To find optimum conditions, a central composite design (CCD) with 2 central points was used. Three factors were considered: SW (7 to 10 g/L), SP (7 to 10 g/L), and small amounts of added nisin as self-inducer (NI 34.4 to 74.4 IU/L). Nisin production was expressed as international units (IU). From RSM, an optimum nisin activity of 180 IU/mL was predicted at 74.4 IU/L NI, 13.8 g/L SP, and 14.9 or 5.11 g/L SW, while confirmatory experiments showed a maximum activity of 178 ± 5.2 IU/mL, verifying the validity of the model. The second-order model showed a coefficient of determination (R²) of 0.828. Optimized conditions were used for constant pH fermentations, where a maximum activity of 575 ± 17 IU/mL was achieved at pH 6.5 after 12 h. The adsorption-desorption technique was used to partially purify nisin, followed by drying. The resulting powder showed an activity of 102,150 IU/g. Practical Application: Nisin production was optimized using supplemented whey as an alternative culture medium, using a native L. lactis UQ2 strain. Soybean peptone, SW, and subinhibitory amounts of nisin were successfully employed to optimize nisin production by L. lactis UQ2. Dried semipurified nisin showed an activity of 102,150 IU/g.
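    The response surface methodology used above fits a second-order polynomial in the factors and locates its optimum. The sketch below shows such a quadratic fit by ordinary least squares; the coded factor settings and activities are placeholders, not the study's measurements.

    ```python
    # Sketch of fitting a second-order response-surface model,
    # y = b0 + sum b_i x_i + sum b_ii x_i^2 + sum b_ij x_i x_j, by ordinary least squares.
    # Factor settings (coded units) and activities below are made up for illustration.
    import numpy as np

    def quadratic_design_matrix(x):
        """x: (n, k) factor settings -> full second-order model matrix."""
        n, k = x.shape
        cols = [np.ones(n)]
        cols += [x[:, i] for i in range(k)]                       # linear terms
        cols += [x[:, i] ** 2 for i in range(k)]                  # pure quadratic terms
        cols += [x[:, i] * x[:, j] for i in range(k) for j in range(i + 1, k)]  # interactions
        return np.column_stack(cols)

    # Hypothetical CCD-like runs: columns = sweet whey, soybean peptone, nisin inducer
    x = np.array([
        [-1, -1, -1], [1, -1, -1], [-1, 1, -1], [1, 1, -1],
        [-1, -1, 1], [1, -1, 1], [-1, 1, 1], [1, 1, 1],
        [-1.68, 0, 0], [1.68, 0, 0], [0, -1.68, 0], [0, 1.68, 0],
        [0, 0, -1.68], [0, 0, 1.68], [0, 0, 0], [0, 0, 0],
    ], dtype=float)
    y = np.array([120, 110, 150, 140, 130, 125, 170, 160,
                  115, 118, 135, 165, 128, 162, 175, 172], dtype=float)  # IU/mL, made up

    beta, *_ = np.linalg.lstsq(quadratic_design_matrix(x), y, rcond=None)
    print("fitted coefficients:", np.round(beta, 2))
    ```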

  1. Effect of temperature on rates of alternative and cytochrome pathway respiration and their relationship with the redox poise of the quinone pool.

    PubMed

    Atkin, Owen K; Zhang, Qisen; Wiskich, Joe T

    2002-01-01

    We investigated the effect of short-term changes in temperature on alternative (Alt) and cytochrome (Cyt) pathway respiration, both in intact tissues and isolated mitochondria of 14-d-old cotyledons of soybean (Glycine max L. cv Stevens). We also established the extent to which temperature alters the interaction between the oxidizing pathways and the level of ubiquinone (UQ) reduction (UQr/UQt). No difference was found between the temperature coefficient of respiration (Q10; proportional change per 10 °C) of Alt and Cyt pathway respiration in cotyledon slices (Q10 = 1.92 and 1.86, respectively). In isolated mitochondria, the Q10 of the fully activated Alt pathway (Q10 = 2.24-2.61) was always equal to, or higher than, that of Cyt c oxidase (COX) alone (Q10 = 2.08) and the complete Cyt pathway (Q10 = 2.40-2.55). This was true regardless of substrate or whether ADP was present. There was little difference in the Q10 of the Cyt pathway with or without ADP; however, the Q10 of COX was substantially lower in the presence of an uncoupler (Q10 = 1.61) than in its absence (Q10 = 2.08). The kinetics of Alt and Cyt pathway activity in relation to UQr/UQt were not affected by temperature. For a given UQr/UQt value, the proportion of maximum flux taking place was similar at all temperatures for both pathways (±ADP). However, the Q10 of the Alt and the Cyt pathways (+ADP) increased with increasing UQr/UQt. We conclude that the Alt pathway is not less temperature sensitive than the Cyt pathway or COX per se and that changes in the degree of control exerted by individual steps in the respiratory apparatus could result in changes in the Q10 of mitochondrial O2 uptake.
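    The Q10 coefficient used throughout the abstract is the factor by which a rate increases per 10 °C rise, computed from two rate measurements as in the sketch below; the example values are hypothetical.

    ```python
    # The Q10 temperature coefficient: the factor by which a rate increases per 10 °C rise,
    # computed from rates r1, r2 measured at temperatures t1, t2 (°C).
    def q10(r1, r2, t1, t2):
        return (r2 / r1) ** (10.0 / (t2 - t1))

    # Hypothetical example: a rate rising from 1.0 to 2.1 between 15 °C and 25 °C.
    print(f"Q10 = {q10(r1=1.0, r2=2.1, t1=15.0, t2=25.0):.2f}")  # -> 2.10
    ```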

  2. Reflection matrices with Uq[osp(2)(2|2m)] symmetry

    NASA Astrophysics Data System (ADS)

    Vieira, R. S.; Lima-Santos, A.

    2017-09-01

    We propose a classification of the reflection K-matrices (solutions of the boundary Yang-Baxter equation) for the Uq[osp(2)(2|2m)] = Uq[C(2)(m+1)] vertex model. We found four families of solutions, namely the complete solutions, in which no element of the reflection K-matrix is null, the block-diagonal solutions, the X-shape solutions, and the diagonal solutions. We highlight that these diagonal K-matrices also hold for the Uq[osp(2)(2n+2|2m)] = Uq[D(2)(n+1, m)] vertex model.
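    For reference, the boundary Yang-Baxter (reflection) equation that the K-matrices solve is, in one common convention:

    ```latex
    % Boundary Yang-Baxter (reflection) equation in one common convention:
    % the K-matrices K(u) are its solutions, given an R-matrix R(u).
    R_{12}(u - v)\, K_{1}(u)\, R_{21}(u + v)\, K_{2}(v)
      = K_{2}(v)\, R_{12}(u + v)\, K_{1}(u)\, R_{21}(u - v)
    ```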

  3. Solanesyl Diphosphate Synthase, an Enzyme of the Ubiquinone Synthetic Pathway, Is Required throughout the Life Cycle of Trypanosoma brucei

    PubMed Central

    Lai, De-Hua; Poropat, Estefanía; Pravia, Carlos; Landoni, Malena; Couto, Alicia S.; Pérez Rojo, Fernando G.; Fuchs, Alicia G.; Dubin, Marta; Elingold, Igal; Rodríguez, Juan B.; Ferella, Marcela; Esteva, Mónica I.

    2014-01-01

    Ubiquinone 9 (UQ9), the expected product of the long-chain solanesyl diphosphate synthase of Trypanosoma brucei (TbSPPS), has a central role in reoxidation of reducing equivalents in the mitochondrion of T. brucei. The ablation of TbSPPS gene expression by RNA interference increased the generation of reactive oxygen species and reduced cell growth and oxygen consumption. The addition of glycerol to the culture medium exacerbated the phenotype by blocking its endogenous generation and excretion. The participation of TbSPPS in UQ synthesis was further confirmed by growth rescue using UQ with 10 isoprenyl subunits (UQ10). Furthermore, the survival of infected mice was prolonged upon the downregulation of TbSPPS and/or the addition of glycerol to drinking water. TbSPPS is inhibited by 1-[(n-oct-1-ylamino)ethyl] 1,1-bisphosphonic acid, and treatment with this compound was lethal for the cells. The findings that both UQ9 and ATP pools were severely depleted by the drug and that exogenous UQ10 was able to fully rescue growth of the inhibited parasites strongly suggest that TbSPPS and UQ synthesis are the main targets of the drug. These two strategies highlight the importance of TbSPPS for T. brucei, justifying further efforts to validate it as a new drug target. PMID:24376001

  4. All three quinone species play distinct roles in ensuring optimal growth under aerobic and fermentative conditions in E. coli K12

    PubMed Central

    Nitzschke, Annika

    2018-01-01

    The electron transport chain of E. coli contains three different quinone species, ubiquinone (UQ), menaquinone (MK) and demethylmenaquinone (DMK). The content and ratio of the different quinone species vary depending on the external conditions. To study the function of the different quinone species in more detail, strains with deletions preventing UQ synthesis, as well as MK and/or DMK synthesis, were cultured under aerobic and anaerobic conditions. The strains were characterized with respect to growth and product synthesis. As quinones are also involved in the control of ArcB/A activity, we analyzed the phosphorylation state of the response regulator as well as the expression of selected genes. The data show reduced aerobic growth coupled to lactate production in the mutants defective in ubiquinone synthesis. This confirms the current assumption that ubiquinone is the main quinone under aerobic growth conditions. In the UQ mutant strains the amount of MK and DMK is significantly elevated. The strain synthesizing only DMK is less affected in growth than the strain synthesizing MK as well as DMK. An inhibitory effect of MK on aerobic growth due to increased oxidative stress is postulated. Under fermentative growth conditions the mutant synthesizing only UQ is severely impaired in growth. Obviously, UQ is not able to replace MK and DMK during anaerobic growth. Mutations affecting quinone synthesis have an impact on ArcA phosphorylation only under anaerobic conditions. ArcA phosphorylation is reduced in strains synthesizing only MK or MK plus DMK. PMID:29614086

  5. Structure of electron transfer flavoprotein-ubiquinone oxidoreductase and electron transfer to the mitochondrial ubiquinone pool

    PubMed Central

    Zhang, Jian; Frerman, Frank E.; Kim, Jung-Ja P.

    2006-01-01

    Electron transfer flavoprotein-ubiquinone oxidoreductase (ETF-QO) is a 4Fe4S flavoprotein located in the inner mitochondrial membrane. It catalyzes ubiquinone (UQ) reduction by ETF, linking oxidation of fatty acids and some amino acids to the mitochondrial respiratory chain. Deficiencies in ETF or ETF-QO result in multiple acyl-CoA dehydrogenase deficiency, a human metabolic disease. Crystal structures of ETF-QO with and without bound UQ were determined, and they are essentially identical. The molecule forms a single structural domain. Three functional regions bind FAD, the 4Fe4S cluster, and UQ and are closely packed and share structural elements, resulting in no discrete structural domains. The UQ-binding pocket consists mainly of hydrophobic residues, and UQ binding differs from that of other UQ-binding proteins. ETF-QO is a monotopic integral membrane protein. The putative membrane-binding surface contains an α-helix and a β-hairpin, forming a hydrophobic plateau. The UQ—flavin distance (8.5 Å) is shorter than the UQ—cluster distance (18.8 Å), and the very similar redox potentials of FAD and the cluster strongly suggest that the flavin, not the cluster, transfers electrons to UQ. Two possible electron transfer paths can be envisioned. First, electrons from the ETF flavin semiquinone may enter the ETF-QO flavin one by one, followed by rapid equilibration with the cluster. Alternatively, electrons may enter via the cluster, followed by equilibration between centers. In both cases, when ETF-QO is reduced to a two-electron reduced state (one electron at each redox center), the enzyme is primed to reduce UQ to ubiquinol via FAD. PMID:17050691

  6. A Taylor Expansion-Based Adaptive Design Strategy for Global Surrogate Modeling With Applications in Groundwater Modeling

    NASA Astrophysics Data System (ADS)

    Mo, Shaoxing; Lu, Dan; Shi, Xiaoqing; Zhang, Guannan; Ye, Ming; Wu, Jianfeng; Wu, Jichun

    2017-12-01

    Global sensitivity analysis (GSA) and uncertainty quantification (UQ) for groundwater modeling are challenging because of the model complexity and significant computational requirements. To reduce the massive computational cost, a cheap-to-evaluate surrogate model is usually constructed to approximate and replace the expensive groundwater models in the GSA and UQ. Constructing an accurate surrogate requires actual model simulations on a number of parameter samples. Thus, a robust experimental design strategy is desired to locate informative samples so as to reduce the computational cost in surrogate construction and consequently to improve the efficiency in the GSA and UQ. In this study, we develop a Taylor expansion-based adaptive design (TEAD) that aims to build an accurate global surrogate model with a small training sample size. TEAD defines a novel hybrid score function to search informative samples, and a robust stopping criterion to terminate the sample search that guarantees that the resulting approximation errors satisfy the desired accuracy. The good performance of TEAD in building global surrogate models is demonstrated on seven analytical functions with different dimensionality and complexity in comparison to two widely used experimental design methods. The application of the TEAD-based surrogate method in two groundwater models shows that the TEAD design can effectively improve the computational efficiency of GSA and UQ for groundwater modeling.
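    The adaptive-design idea can be sketched generically as below: score candidate points by combining distance from the existing design with a local nonlinearity estimate and add the best-scoring point. The score used here is only a crude stand-in for TEAD's Taylor-expansion-based hybrid score function.

    ```python
    # Schematic adaptive sampling loop in the spirit of the design strategy described above.
    # The score (distance to nearest training point plus a crude local-curvature proxy)
    # is an illustration, not TEAD's actual hybrid score function.
    import numpy as np

    def model(x):                       # expensive model stand-in (1-D toy function)
        return np.sin(3 * x) + 0.3 * x ** 2

    rng = np.random.default_rng(3)
    train_x = rng.uniform(0, 3, size=5)          # initial design
    train_y = model(train_x)

    for _ in range(10):                          # add 10 points adaptively
        candidates = rng.uniform(0, 3, size=200)
        # exploration term: distance to the nearest existing sample
        dist = np.min(np.abs(candidates[:, None] - train_x[None, :]), axis=1)
        # exploitation term: disagreement between nearest-neighbor and linear interpolation,
        # a crude stand-in for a Taylor-expansion-based nonlinearity measure
        order = np.argsort(train_x)
        lin = np.interp(candidates, train_x[order], train_y[order])
        nn = train_y[np.argmin(np.abs(candidates[:, None] - train_x[None, :]), axis=1)]
        score = dist + np.abs(lin - nn)
        new_x = candidates[np.argmax(score)]
        train_x = np.append(train_x, new_x)
        train_y = np.append(train_y, model(new_x))

    print(f"final design size: {train_x.size}")
    ```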

  7. The effects of protein crowding in bacterial photosynthetic membranes on the flow of quinone redox species between the photochemical reaction center and the ubiquinol-cytochrome c2 oxidoreductase.

    PubMed

    Woronowicz, Kamil; Sha, Daniel; Frese, Raoul N; Sturgis, James N; Nanda, Vikas; Niederman, Robert A

    2011-08-01

    Atomic force microscopy (AFM) of the native architecture of the intracytoplasmic membrane (ICM) of a variety of species of purple photosynthetic bacteria, obtained at submolecular resolution, shows a tightly packed arrangement of light harvesting (LH) and reaction center (RC) complexes. Since there are no unattributed structures or gaps with space sufficient for the cytochrome bc(1) or ATPase complexes, they are localized in membrane domains distinct from the flat regions imaged by AFM. This has generated a renewed interest in possible long-range pathways for lateral diffusion of UQ redox species that functionally link the RC and the bc(1) complexes. Recent proposals to account for UQ flow in the membrane bilayer are reviewed, along with new experimental evidence provided from an analysis of intrinsic near-IR fluorescence emission that has served to test these hypotheses. The results suggest that different mechanisms of UQ flow exist between species such as Rhodobacter sphaeroides, with a highly organized arrangement of LH and RC complexes and fast RC electron transfer turnover, and Phaeospirillum molischianum, with a more random organization and slower RC turnover. It is concluded that the packing density of the peripheral LH2 antenna in the Rba. sphaeroides ICM imposes constraints that significantly slow the diffusion of UQ redox species between the RC and cytochrome bc(1) complex, while in Phs. molischianum, the crowding of the ICM with LH3 has little effect upon UQ diffusion. This supports the proposal that in this type of ICM, a network of RC-LH1 core complexes observed in AFM provides a pathway for long-range quinone diffusion that is unaffected by differences in LH complex composition or organization.

  8. Quantification of Uncertainty in Extreme Scale Computations (QUEST)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ghanem, Roger

    QUEST was a SciDAC Institute comprising Sandia National Laboratories, Los Alamos National Laboratory, the University of Southern California, the Massachusetts Institute of Technology, the University of Texas at Austin, and Duke University. The mission of QUEST is to: (1) develop a broad class of uncertainty quantification (UQ) methods/tools, and (2) provide UQ expertise and software to other SciDAC projects, thereby enabling/guiding their UQ activities. The USC effort centered on the development of reduced models and efficient algorithms for implementing various components of the UQ pipeline. USC personnel were responsible for the development of adaptive bases, adaptive quadrature, and reduced models to be used in estimation and inference.

  9. Semi-Supervised Multiple Feature Analysis for Action Recognition

    DTIC Science & Technology

    2013-11-26

    Technology and Electrical Engineering, University of Queensland, Brisbane, Australia (e-mail: sen.wang@uq.edu.au; yi.yang@uq.edu.au). Z. Ma is with the Language Technologies Institute, Carnegie Mellon University, Pittsburgh, PA 15213 USA (e-mail: kevinma@cs.cmu.edu). X. Li is with the School of ... Service Computing in Cyber Physical Society, Chongqing University, Chongqing, China (e-mail: xueli@itee.uq.edu.au). C. Pang is with the Australian e

  10. A Taylor Expansion-Based Adaptive Design Strategy for Global Surrogate Modeling With Applications in Groundwater Modeling

    DOE PAGES

    Mo, Shaoxing; Lu, Dan; Shi, Xiaoqing; ...

    2017-12-27

    Global sensitivity analysis (GSA) and uncertainty quantification (UQ) for groundwater modeling are challenging because of the model complexity and significant computational requirements. To reduce the massive computational cost, a cheap-to-evaluate surrogate model is usually constructed to approximate and replace the expensive groundwater models in the GSA and UQ. Constructing an accurate surrogate requires actual model simulations on a number of parameter samples. Thus, a robust experimental design strategy is desired to locate informative samples so as to reduce the computational cost in surrogate construction and consequently to improve the efficiency in the GSA and UQ. In this study, we develop a Taylor expansion-based adaptive design (TEAD) that aims to build an accurate global surrogate model with a small training sample size. TEAD defines a novel hybrid score function to search informative samples, and a robust stopping criterion to terminate the sample search that guarantees that the resulting approximation errors satisfy the desired accuracy. The good performance of TEAD in building global surrogate models is demonstrated on seven analytical functions with different dimensionality and complexity in comparison to two widely used experimental design methods. The application of the TEAD-based surrogate method in two groundwater models shows that the TEAD design can effectively improve the computational efficiency of GSA and UQ for groundwater modeling.

  11. A Taylor Expansion-Based Adaptive Design Strategy for Global Surrogate Modeling With Applications in Groundwater Modeling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mo, Shaoxing; Lu, Dan; Shi, Xiaoqing

    Global sensitivity analysis (GSA) and uncertainty quantification (UQ) for groundwater modeling are challenging because of the model complexity and significant computational requirements. To reduce the massive computational cost, a cheap-to-evaluate surrogate model is usually constructed to approximate and replace the expensive groundwater models in the GSA and UQ. Constructing an accurate surrogate requires actual model simulations on a number of parameter samples. Thus, a robust experimental design strategy is desired to locate informative samples so as to reduce the computational cost in surrogate construction and consequently to improve the efficiency in the GSA and UQ. In this study, we develop a Taylor expansion-based adaptive design (TEAD) that aims to build an accurate global surrogate model with a small training sample size. TEAD defines a novel hybrid score function to search informative samples, and a robust stopping criterion to terminate the sample search that guarantees that the resulting approximation errors satisfy the desired accuracy. The good performance of TEAD in building global surrogate models is demonstrated on seven analytical functions with different dimensionality and complexity in comparison to two widely used experimental design methods. The application of the TEAD-based surrogate method in two groundwater models shows that the TEAD design can effectively improve the computational efficiency of GSA and UQ for groundwater modeling.

  12. Final Technical Report: Quantification of Uncertainty in Extreme Scale Computations (QUEST)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Knio, Omar M.

    QUEST is a SciDAC Institute comprising Sandia National Laboratories, Los Alamos National Laboratory, University of Southern California, Massachusetts Institute of Technology, University of Texas at Austin, and Duke University. The mission of QUEST is to: (1) develop a broad class of uncertainty quantification (UQ) methods/tools, and (2) provide UQ expertise and software to other SciDAC projects, thereby enabling/guiding their UQ activities. The Duke effort focused on the development of algorithms and utility software for non-intrusive sparse UQ representations, and on participation in the organization of annual workshops and tutorials to disseminate UQ tools to the community, and to gather input in order to adapt approaches to the needs of SciDAC customers. In particular, fundamental developments were made in (a) multiscale stochastic preconditioners, (b) gradient-based approaches to inverse problems, (c) adaptive pseudo-spectral approximations, (d) stochastic limit cycles, and (e) sensitivity analysis tools for noisy systems. In addition, large-scale demonstrations were performed, namely in the context of ocean general circulation models.

  13. Bimodule structure of the mixed tensor product over Uq sℓ(2|1) and quantum walled Brauer algebra

    NASA Astrophysics Data System (ADS)

    Bulgakova, D. V.; Kiselev, A. M.; Tipunin, I. Yu.

    2018-03-01

    We study a mixed tensor product 3^{⊗m} ⊗ 3̄^{⊗n} of the three-dimensional fundamental representations of the Hopf algebra Uq sℓ(2|1), whenever q is not a root of unity. Formulas for the decomposition of tensor products of any simple and projective Uq sℓ(2|1)-module with the generating modules 3 and 3̄ are obtained. The centralizer of Uq sℓ(2|1) on the mixed tensor product is calculated. It is shown to be the quotient Xm,n of the quantum walled Brauer algebra qwBm,n. The structure of projective modules over Xm,n is written down explicitly. It is known that the walled Brauer algebras form an infinite tower. We have calculated the corresponding restriction functors on simple and projective modules over Xm,n. This result forms a crucial step in the decomposition of the mixed tensor product as a bimodule over Xm,n ⊠ Uq sℓ(2|1). We give an explicit bimodule structure for all m, n.

  14. Identification of the Catalytic Ubiquinone-binding Site of Vibrio cholerae Sodium-dependent NADH Dehydrogenase

    PubMed Central

    Tuz, Karina; Li, Chen; Fang, Xuan; Raba, Daniel A.; Liang, Pingdong; Minh, David D. L.; Juárez, Oscar

    2017-01-01

    The sodium-dependent NADH dehydrogenase (Na+-NQR) is a key component of the respiratory chain of diverse prokaryotic species, including pathogenic bacteria. Na+-NQR uses the energy released by electron transfer between NADH and ubiquinone (UQ) to pump sodium, producing a gradient that sustains many essential homeostatic processes as well as virulence factor secretion and the elimination of drugs. The location of the UQ binding site has been controversial, with two main hypotheses that suggest that this site could be located in the cytosolic subunit A or in the membrane-bound subunit B. In this work, we performed alanine scanning mutagenesis of aromatic residues located in transmembrane helices II, IV, and V of subunit B, near glycine residues 140 and 141. These two critical glycine residues form part of the structures that regulate the site's accessibility. Our results indicate that the elimination of phenylalanine residue 211 or 213 abolishes the UQ-dependent activity, produces a leak of electrons to oxygen, and completely blocks the binding of UQ and the inhibitor HQNO. Molecular docking calculations predict that UQ interacts with phenylalanine 211 and pinpoints the location of the binding site in the interface of subunits B and D. The mutagenesis and structural analysis allow us to propose a novel UQ-binding motif, which is completely different compared with the sites of other respiratory photosynthetic complexes. These results are essential to understanding the electron transfer pathways and mechanism of Na+-NQR catalysis. PMID:28053088

  15. Development of a novel immunoassay for herbal cannabis using a new fluorescent antibody probe, "Ultra Quenchbody".

    PubMed

    Tsujikawa, Kenji; Saiki, Fujio; Yamamuro, Tadashi; Iwata, Yuko T; Abe, Ryoji; Ohashi, Hiroyuki; Kaigome, Rena; Yamane, Kyosuke; Kuwayama, Kenji; Kanamori, Tatsuyuki; Inoue, Hiroyuki

    2016-09-01

    We developed a novel immunoassay for herbal cannabis based on a new immunoassay principle that uses Ultra Quenchbody ("UQ-body"), a recombinant antibody Fab fragment fluorolabeled at the N-terminal regions. When the antigen binds to anti-Δ(9)-tetrahydrocannabinol (THC) UQ-body, the fluorescence intensity (FI) decreases. The analytical conditions of the immunoassay were optimized based on the FI reduction rate (FIRR). The final analytical procedure comprised the following steps: (1) 10 mg of sample was extracted with 1 ml of a 60:40 mixture of methanol and phosphate-buffered saline (PBS); (2) the extract was filtered through a centrifugal 0.2-μm polytetrafluoroethylene membrane filter; (3) the filtrate was diluted 100 times with extraction solvent; (4) 6 μl of the diluted solution was mixed with 19 μl of PBS and 75 μl of UQ-body solution; and (5) FIRR was measured under 275-mV excitation light. Herbal cannabis samples containing ≥4.0 mg/g THC gave FIRRs of ≥5.2%. FIRRs of negative samples (cigarette, tea, spice, and so-called "synthetic marijuana") were ≤3.1%. When setting the FIRR threshold to 5.0%, cannabis samples containing ≥4.0 mg/g THC were correctly judged as positive without being affected by false positives caused by the negative samples. This detection limit was lower than the total THC level (10-200 mg/g) in most herbal cannabis samples seized in Japan. In seven of the 10 cannabis samples, the results of the UQ-body test were comparable with those of the Duquenois-Levine test. Thus, the UQ-body-based immunoassay is presumed to be an effective and objective drug screening method for herbal cannabis; however, demonstrating its true usefulness will require testing a large number of real case samples in field conditions. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  16. On uncertainty quantification in hydrogeology and hydrogeophysics

    NASA Astrophysics Data System (ADS)

    Linde, Niklas; Ginsbourger, David; Irving, James; Nobile, Fabio; Doucet, Arnaud

    2017-12-01

    Recent advances in sensor technologies, field methodologies, numerical modeling, and inversion approaches have contributed to unprecedented imaging of hydrogeological properties and detailed predictions at multiple temporal and spatial scales. Nevertheless, imaging results and predictions will always remain imprecise, which calls for appropriate uncertainty quantification (UQ). In this paper, we outline selected methodological developments together with pioneering UQ applications in hydrogeology and hydrogeophysics. The applied mathematics and statistics literature is not easy to penetrate and this review aims at helping hydrogeologists and hydrogeophysicists to identify suitable approaches for UQ that can be applied and further developed to their specific needs. To bypass the tremendous computational costs associated with forward UQ based on full-physics simulations, we discuss proxy-modeling strategies and multi-resolution (Multi-level Monte Carlo) methods. We consider Bayesian inversion for non-linear and non-Gaussian state-space problems and discuss how Sequential Monte Carlo may become a practical alternative. We also describe strategies to account for forward modeling errors in Bayesian inversion. Finally, we consider hydrogeophysical inversion, where petrophysical uncertainty is often ignored leading to overconfident parameter estimation. The high parameter and data dimensions encountered in hydrogeological and geophysical problems make UQ a complicated and important challenge that has only been partially addressed to date.

  17. Global benchmarking of medical student learning outcomes? Implementation and pilot results of the International Foundations of Medicine Clinical Sciences Exam at The University of Queensland, Australia.

    PubMed

    Wilkinson, David; Schafer, Jennifer; Hewett, David; Eley, Diann; Swanson, Dave

    2014-01-01

    To report pilot results for international benchmarking of learning outcomes among 426 final-year medical students at the University of Queensland (UQ), Australia. Students took the International Foundations of Medicine (IFOM) Clinical Sciences Exam (CSE) developed by the National Board of Medical Examiners, USA, as a required formative assessment. IFOM CSE comprises 160 multiple-choice questions in medicine, surgery, obstetrics, paediatrics and mental health, taken over 4.5 hours. Outcome measures were: significant implementation issues; IFOM scores benchmarked against International Comparison Group (ICG) scores and United States Medical Licensing Exam (USMLE) Step 2 Clinical Knowledge (CK) scores; and correlation with UQ medical degree cumulative grade point average (GPA). Implementation as an online exam under university-mandated conditions was successful. The mean IFOM score was 531.3 (range 200-779). The UQ cohort performed better (31% scored below 500) than the ICG (55% below 500). However, 49% of the UQ cohort did not meet the USMLE Step 2 CK minimum score. Correlation between IFOM scores and UQ cumulative GPA was reasonable at 0.552 (p < 0.001). International benchmarking is feasible and provides a variety of useful benchmarking opportunities.

  18. Tutorial examples for uncertainty quantification methods.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    De Bord, Sarah

    2015-08-01

    This report details the work accomplished during my 2015 SULI summer internship at Sandia National Laboratories in Livermore, CA. During this internship, I worked on multiple tasks with the common goal of making uncertainty quantification (UQ) methods more accessible to the general scientific community. As part of my work, I created a comprehensive numerical integration example to incorporate into the user manual of a UQ software package. Further, I developed examples involving heat transfer through a window to incorporate into tutorial lectures that serve as an introduction to UQ methods.
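
    In the spirit of the heat-transfer-through-a-window tutorial example mentioned above, the short Python sketch below propagates assumed input uncertainties through a single-pane conduction model by Monte Carlo sampling; the numbers and distributions are illustrative and are not taken from the report.

      # Monte Carlo propagation through q = k * A * dT / L (steady conduction, in W).
      import numpy as np

      rng = np.random.default_rng(42)
      n = 100_000
      k = rng.normal(1.0, 0.05, n)        # glass conductivity [W/m/K], uncertain
      L = rng.normal(0.005, 0.0002, n)    # pane thickness [m], uncertain
      dT = rng.uniform(10.0, 20.0, n)     # indoor-outdoor temperature difference [K]
      A = 1.2                             # window area [m^2], treated as known

      q = k * A * dT / L                  # push the samples through the model
      print(f"mean heat loss = {q.mean():.0f} W, std = {q.std():.0f} W")
      print(f"95% interval = [{np.percentile(q, 2.5):.0f}, {np.percentile(q, 97.5):.0f}] W")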

  19. Area-Preserving Diffeomorphisms, W∞ and Uq[sl(2)] in Chern-Simons Theory and the Quantum Hall System

    NASA Astrophysics Data System (ADS)

    Kogan, Ian I.

    We discuss a quantum Uq[sl(2)] symmetry in the Landau problem, which naturally arises due to the relation between Uq[sl(2)] and the group of magnetic translations. The latter is connected with W∞ and area-preserving (symplectic) diffeomorphisms which are the canonical transformations in the two-dimensional phase space. We shall discuss the hidden quantum symmetry in a 2 + 1 gauge theory with the Chern-Simons term and in a quantum Hall system, which are both connected with the Landau problem.

  20. A Greenhouse-Gas Information System: Monitoring and Validating Emissions Reporting and Mitigation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jonietz, Karl K.; Dimotakis, Paul E.; Rotman, Douglas A.

    2011-09-26

    This study and report focus on attributes of a greenhouse-gas information system (GHGIS) needed to support MRV&V needs. These needs set the function of such a system apart from scientific/research monitoring of GHGs and carbon-cycle systems, and include (not exclusively): the need for a GHGIS that is operational, as required for decision-support; the need for a system that meets specifications derived from imposed requirements; the need for rigorous calibration, verification, and validation (CV&V) standards, processes, and records for all measurement and modeling/data-inversion data; the need to develop and adopt an uncertainty-quantification (UQ) regimen for all measurement and modeling data; and the requirement that GHGIS products can be subjected to third-party questioning and scientific scrutiny. This report examines and assesses presently available capabilities that could contribute to a future GHGIS. These capabilities include sensors and measurement technologies; data analysis and data uncertainty quantification (UQ) practices and methods; and model-based data-inversion practices, methods, and their associated UQ. The report further examines the need for traceable calibration, verification, and validation processes and attached metadata; differences between present science-/research-oriented needs and those that would be required for an operational GHGIS; the development, operation, and maintenance of a GHGIS missions-operations center (GMOC); and the complex systems engineering and integration that would be required to develop, operate, and evolve a future GHGIS.

  1. Weighted Iterative Bayesian Compressive Sensing (WIBCS) for High Dimensional Polynomial Surrogate Construction

    NASA Astrophysics Data System (ADS)

    Sargsyan, K.; Ricciuto, D. M.; Safta, C.; Debusschere, B.; Najm, H. N.; Thornton, P. E.

    2016-12-01

    Surrogate construction has become a routine procedure when facing computationally intensive studies requiring multiple evaluations of complex models. In particular, surrogate models, otherwise called emulators or response surfaces, replace complex models in uncertainty quantification (UQ) studies, including uncertainty propagation (forward UQ) and parameter estimation (inverse UQ). Further, surrogates based on Polynomial Chaos (PC) expansions are especially convenient for forward UQ and global sensitivity analysis, also known as variance-based decomposition. However, PC surrogate construction suffers strongly from the curse of dimensionality. With a large number of input parameters, the number of model simulations required for accurate surrogate construction is prohibitively large. Relatedly, non-adaptive PC expansions typically include an infeasibly large number of basis terms, far exceeding the number of available model evaluations. We develop the Weighted Iterative Bayesian Compressive Sensing (WIBCS) algorithm for adaptive basis growth and PC surrogate construction, leading to a sparse, high-dimensional PC surrogate with very few model evaluations. The surrogate is then readily employed for global sensitivity analysis, leading to further dimensionality reduction. Besides numerical tests, we demonstrate the construction on the example of the Accelerated Climate Model for Energy (ACME) Land Model for several output QoIs at nearly 100 FLUXNET sites covering multiple plant functional types and climates, varying 65 input parameters over broad ranges of possible values. This work is supported by the U.S. Department of Energy, Office of Science, Biological and Environmental Research, Accelerated Climate Modeling for Energy (ACME) project. Sandia National Laboratories is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000.
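
    The sketch below illustrates the general idea of a sparse polynomial-chaos surrogate fit. WIBCS itself iterates a weighted Bayesian compressive-sensing solver with adaptive basis growth; here a fixed total-order Legendre basis and scikit-learn's ARD regression stand in for that step, and the toy model, sample sizes, and thresholds are assumptions made for illustration.

      # Sparse PC-style surrogate: build a Legendre basis, fit with a sparsity-promoting
      # Bayesian regression, and inspect which basis terms survive.
      import numpy as np
      from itertools import product
      from numpy.polynomial.legendre import legval
      from sklearn.linear_model import ARDRegression

      def legendre_basis(X, order):
          """Tensor-product Legendre basis truncated at total order `order`; X in [-1, 1]."""
          dim = X.shape[1]
          multis = [m for m in product(range(order + 1), repeat=dim) if sum(m) <= order]
          cols = []
          for m in multis:
              col = np.ones(len(X))
              for j, deg in enumerate(m):
                  c = np.zeros(deg + 1)
                  c[deg] = 1.0
                  col *= legval(X[:, j], c)
              cols.append(col)
          return np.column_stack(cols), multis

      # toy model with only a few active terms among many nominal inputs
      rng = np.random.default_rng(1)
      X = rng.uniform(-1, 1, size=(80, 10))
      y = 1.5 * X[:, 0] + 0.8 * X[:, 3] ** 2 + 0.1 * rng.normal(size=80)

      Phi, multis = legendre_basis(X, order=2)
      fit = ARDRegression(fit_intercept=False).fit(Phi, y)   # sparsity-promoting fit
      active = [m for m, c in zip(multis, fit.coef_) if abs(c) > 1e-2]
      print("retained basis terms:", active)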

  2. Quantum Liouville theory and BTZ black hole entropy

    NASA Astrophysics Data System (ADS)

    Chen, Yujun

    In this thesis I give an explicit conformal field theory description of (2+1)-dimensional BTZ black hole entropy. In the boundary Liouville field theory I investigate the reducible Verma modules in the elliptic sector, which correspond to certain irreducible representations of the quantum algebra Uq(sl2) ⊙ Uq̂(sl2). I show that there are states that decouple from these reducible Verma modules in a similar fashion to the decoupling of null states in minimal models. Because of the nonstandard form of the Ward identity for the two-point correlation functions in quantum Liouville field theory, these decoupling states have positive-definite norms. The unitary representations built on these decoupling states give the Bekenstein-Hawking entropy of the BTZ black hole.

  3. Double-bosonization and Majid's conjecture, (I): Rank-inductions of ABCD

    NASA Astrophysics Data System (ADS)

    Hu, Hongmei; Hu, Naihong

    2015-11-01

    Majid developed in [S. Majid, Math. Proc. Cambridge Philos. Soc. 125, 151-192 (1999)] the double-bosonization theory to construct Uq(𝔤) and expected to generate inductively not just a line but a tree of quantum groups starting from a node. In this paper, the authors confirm Majid's first expectation (see p. 178 [S. Majid, Math. Proc. Cambridge Philos. Soc. 125, 151-192 (1999)]) through giving and verifying the full details of the inductive constructions of Uq(𝔤) for the classical types, i.e., the ABCD series. Some examples in low ranks are given to elucidate that any quantum group of classical type can be constructed from the node corresponding to Uq(𝔰𝔩2).

  4. Erratum to: Reducing Preschoolers' Disruptive Behavior in Public with a Brief Parent Discussion Group.

    PubMed

    Joachim, Sabine; Sanders, Matthew R; Turner, Karen M T

    2015-10-01

    The Triple P-Positive Parenting Program is owned by the University of Queensland (UQ). The University through its main technology transfer company UniQuest Pty Limited has licensed Triple P International Pty Ltd to disseminate the program worldwide. Royalties stemming from this dissemination activity are distributed to the Parenting and Family Support Centre, School of Psychology, UQ; Faculty of Health and Behavioural Sciences at UQ; and contributory authors. No author has any share or ownership in Triple P International Pty Ltd. Matthew Sanders is the founder and an author on various Triple P programs and a consultant to Triple P International. Karen Turner is an author of various Triple P programs.

  5. Uncertainty Quantification of Evapotranspiration and Infiltration from Modeling and Historic Time Series at the Savannah River F-Area

    NASA Astrophysics Data System (ADS)

    Faybishenko, B.; Flach, G. P.

    2012-12-01

    The objectives of this presentation are: (a) to illustrate the application of Monte Carlo and fuzzy-probabilistic approaches for uncertainty quantification (UQ) in predictions of potential evapotranspiration (PET), actual evapotranspiration (ET), and infiltration (I), using uncertain hydrological or meteorological time series data, and (b) to compare the results of these calculations with those from field measurements at the U.S. Department of Energy Savannah River Site (SRS), near Aiken, South Carolina, USA. The UQ calculations include the evaluation of aleatory (parameter uncertainty) and epistemic (model) uncertainties. The effect of aleatory uncertainty is expressed by assigning the probability distributions of input parameters, using historical monthly averaged data from the meteorological station at the SRS. The combined effect of aleatory and epistemic uncertainties on the UQ of PET, ET, and I is then expressed by aggregating the results of calculations from multiple models using a p-box and fuzzy numbers. The uncertainty in PET is calculated using the Bair-Robertson, Blaney-Criddle, Caprio, Hargreaves-Samani, Hamon, Jensen-Haise, Linacre, Makkink, Priestly-Taylor, Penman, Penman-Monteith, Thornthwaite, and Turc models. Then, ET is calculated from the modified Budyko model, followed by calculations of I from the water balance equation. We show that probabilistic and fuzzy-probabilistic calculations using multiple models generate PET, ET, and I distributions that are well within the range of field measurements. We also show that a selection of a subset of models can be used to constrain the uncertainty quantification of PET, ET, and I.
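
    The following Python sketch shows the aggregation idea only: uncertain temperature is propagated through several PET models by Monte Carlo, and a p-box-style envelope is formed from the per-model empirical CDFs. The two model functions and all numbers are simplified placeholders, not the thirteen PET formulations or the site data used in the study.

      # Multi-model Monte Carlo propagation and a p-box-style CDF envelope.
      import numpy as np

      def pet_model_a(T):           # placeholder temperature-based PET model [mm/month]
          return 40.0 + 4.5 * T

      def pet_model_b(T):           # second placeholder model with a different response
          return 25.0 + 5.5 * T + 0.05 * T ** 2

      rng = np.random.default_rng(7)
      T = rng.normal(22.0, 1.5, 50_000)          # uncertain monthly mean temperature [deg C]

      grid = np.linspace(80, 220, 200)           # PET values at which CDFs are compared
      cdfs = []
      for model in (pet_model_a, pet_model_b):
          samples = np.sort(model(T))
          cdfs.append(np.searchsorted(samples, grid) / len(samples))
      cdfs = np.array(cdfs)

      lower, upper = cdfs.min(axis=0), cdfs.max(axis=0)   # p-box bounds across models
      idx = np.searchsorted(grid, 150)
      print("CDF bounds at PET = 150 mm/month:", lower[idx], upper[idx])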

  6. Difference in Functional Performance on the Upper-Quarter Y-Balance Test Between High School Baseball Players and Wrestlers.

    PubMed

    Myers, Heather; Poletti, Mary; Butler, Robert J

    2017-05-01

    The Upper Quarter Y-Balance Test (YBT-UQ) is a unique movement test where individuals perform at the limits of their stability, requiring the coordination of balance, proprioception, range of motion, and stabilization. It is not yet clear if performance on the YBT-UQ differs between sports with dissimilar emphasis on upper-extremity performance. To compare performance on the YBT-UQ between wrestlers, whose sport requires some degree of closed-chain activity, and baseball players, whose sport is primarily open kinetic chain in nature. Cross-sectional. High school preparticipation physical assessment. 24 healthy high school male wrestlers (mean age 16.12 ± 1.24 y) and 24 healthy high school male baseball players (mean age 15.79 ± 1.25 y). All subjects performed the YBT-UQ, which requires reaching in 3 directions while maintaining a push-up position. The variables of interest include the maximum reach in each direction, as well as the composite score. In addition, asymmetries between limbs for each reach direction were compared. Wrestlers performed significantly better than baseball players in the medial direction, inferolateral direction, and in composite scores. In the medial direction, wrestlers exhibited greater scores (P < .01) on both left and right limbs, 10.5 ± 10.2%LL and 9.95 ± 10.2%LL, respectively. Significant differences (P < .01) were also observed in the inferolateral direction, with a difference of 11.3 ± 12.0%LL on the left and 8.7 ± 11.0%LL on the right. Composite scores were higher (P < .01) for the wrestlers, with a difference of 7.0% on the left and 7.1% on the right. This study suggests that wrestlers perform better on the YBT-UQ than baseball players. The findings may suggest sport-specific normative data for the YBT-UQ in high school athletes.

  7. In Silico Discovery of a Substituted 6-Methoxy-quinalidine with Leishmanicidal Activity in Leishmania infantum.

    PubMed

    Stevanović, Strahinja; Perdih, Andrej; Senćanski, Milan; Glišić, Sanja; Duarte, Margarida; Tomás, Ana M; Sena, Filipa V; Sousa, Filipe M; Pereira, Manuela M; Solmajer, Tom

    2018-03-27

    There is an urgent need for the discovery of new antileishmanial drugs with a new mechanism of action. Type 2 NADH dehydrogenase from Leishmania infantum (LiNDH2) is an enzyme of the parasite's respiratory system, which catalyzes the electron transfer from NADH to ubiquinone without coupled proton pumping. In previous studies of the related NADH:ubiquinone oxidoreductase crystal structure from Saccharomyces cerevisiae, two ubiquinone-binding sites (UQI and UQII) were identified and shown to play an important role in the NDH-2-catalyzed oxidoreduction reaction. Based on the available structural data, we developed a three-dimensional structural model of LiNDH2 using homology detection methods and performed an in silico virtual screening campaign to search for potential inhibitors targeting the LiNDH2 ubiquinone-binding site 1 (UQI). Selected compounds displaying favorable properties in the computational screening experiments were assayed for inhibitory activity in the structurally similar recombinant NDH-2 from S. aureus, and leishmanicidal activity was determined in wild-type axenic amastigotes and promastigotes of L. infantum. The identified compound, a substituted 6-methoxy-quinalidine, showed promising nanomolar leishmanicidal activity on wild-type axenic promastigotes and amastigotes of L. infantum and the potential for further development.

  8. Improved uncertainty quantification in nondestructive assay for nonproliferation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Burr, Tom; Croft, Stephen; Jarman, Ken

    2016-12-01

    This paper illustrates methods to improve uncertainty quantification (UQ) for non-destructive assay (NDA) measurements used in nuclear nonproliferation. First, it is shown that current bottom-up UQ applied to calibration data is not always adequate, for three main reasons: (1) Because there are errors in both the predictors and the response, calibration involves a ratio of random quantities, and calibration data sets in NDA usually consist of only a modest number of samples (3–10); therefore, asymptotic approximations involving quantities needed for UQ such as means and variances are often not sufficiently accurate; (2) Common practice overlooks that calibration implies a partitioning of total error into random and systematic error, and (3) In many NDA applications, test items exhibit non-negligible departures in physical properties from calibration items, so model-based adjustments are used, but item-specific bias remains in some data. Therefore, improved bottom-up UQ using calibration data should predict the typical magnitude of item-specific bias, and the suggestion is to do so by including sources of item-specific bias in synthetic calibration data that is generated using a combination of modeling and real calibration data. Second, for measurements of the same nuclear material item by both the facility operator and international inspectors, current empirical (top-down) UQ is described for estimating operator and inspector systematic and random error variance components. A Bayesian alternative is introduced that easily accommodates constraints on variance components, and is more robust than current top-down methods to the underlying measurement error distributions.
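
    The toy Monte Carlo below illustrates point (1) above with assumed numbers only: with errors in both the declared standards and the measured responses, and only a handful of calibration items, the estimated calibration slope is a ratio of noisy quantities whose small-sample bias and spread are easy to underestimate with asymptotic formulas.

      # Repeatedly simulate a 5-item calibration with errors in predictor and response,
      # fit a through-the-origin least-squares slope, and look at its empirical behaviour.
      import numpy as np

      rng = np.random.default_rng(3)
      true_slope, n_items, n_rep = 2.0, 5, 20_000
      slopes = np.empty(n_rep)
      for i in range(n_rep):
          mass_true = np.linspace(1.0, 10.0, n_items)
          mass_decl = mass_true * (1 + rng.normal(0, 0.03, n_items))               # predictor error
          response = true_slope * mass_true * (1 + rng.normal(0, 0.05, n_items))   # response error
          slopes[i] = (mass_decl @ response) / (mass_decl @ mass_decl)             # ratio of random sums
      print(f"mean slope = {slopes.mean():.3f} (true {true_slope}), "
            f"relative sd = {slopes.std() / slopes.mean():.3%}")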

  9. Clean and Secure Energy from Coal

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, Philip; Davies, Lincoln; Kelly, Kerry

    2014-08-31

    The University of Utah, through its Institute for Clean and Secure Energy (ICSE), performed research to utilize the vast energy stored in our domestic coal resources and to do so in a manner that will capture CO2 from combustion in stationary power generation. The research was organized around the theme of validation and uncertainty quantification (V/UQ) through tightly coupled simulation and experimental designs and through the integration of legal, environmental, economic, and policy issues.

  10. X-MATE: a flexible system for mapping short read data

    PubMed Central

    Pearson, John V.; Cloonan, Nicole; Grimmond, Sean M.

    2011-01-01

    Summary: Accurate and complete mapping of short-read sequencing to a reference genome greatly enhances the discovery of biological results and improves statistical predictions. We recently presented RNA-MATE, a pipeline for the recursive mapping of RNA-Seq datasets. With the rapid increase in genome re-sequencing projects, progression of available mapping software and the evolution of file formats, we now present X-MATE, an updated version of RNA-MATE, capable of mapping both RNA-Seq and DNA datasets and with improved performance, output file formats, configuration files, and flexibility in core mapping software. Availability: Executables, source code, junction libraries, test data and results and the user manual are available from http://grimmond.imb.uq.edu.au/X-MATE/. Contact: n.cloonan@uq.edu.au; s.grimmond@uq.edu.au Supplementary information: Supplementary data are available at Bioinformatics Online. PMID:21216778

  11. Additive Manufacturing and High-Performance Computing: a Disruptive Latent Technology

    NASA Astrophysics Data System (ADS)

    Goodwin, Bruce

    2015-03-01

    This presentation will discuss the relationship between recent advances in Additive Manufacturing (AM) technology, High-Performance Computing (HPC) simulation and design capabilities, and related advances in Uncertainty Quantification (UQ), and then examine their impacts upon national and international security. The presentation surveys how AM accelerates the fabrication process, while HPC combined with UQ provides a fast track for the engineering design cycle. The combination of AM and HPC/UQ almost eliminates the engineering design and prototype iterative cycle, thereby dramatically reducing cost of production and time-to-market. These methods thereby present significant benefits for US national interests, both civilian and military, in an age of austerity. Finally, considering cyber security issues and the advent of the "cloud," these disruptive, currently latent technologies may well enable proliferation and so challenge both nuclear and non-nuclear aspects of international security.

  12. [Electric short-circuit incident observed with "Upsher" laryngoscopes].

    PubMed

    Tritsch, L; Vailly, B

    2006-01-01

    We observed an electrical short-circuit between a fastening screw of the printed circuit and the handle of an Upsher universal laryngoscope (serial number UQ1). The insulating silicone layer was broken above the screw. This insulation defect was found on all of our Upsher laryngoscopes of the UQ1 series. Had rechargeable accumulators been used instead of batteries, the heat emitted would undoubtedly have been greater and perhaps dangerous.

  13. Sustaining Institution-Wide Induction for Sessional Staff in a Research-Intensive University: The Strength of Shared Ownership

    ERIC Educational Resources Information Center

    Matthews, Kelly E.; Duck, Julie M.; Bartle, Emma

    2017-01-01

    The "Tutors@UQ" programme provides an example of a formalised, institution-wide, cross-discipline, academic development programme to enhance the quality of teaching that has been maintained for seven years despite a pattern of substantial organisational change. We present a case study of the programme framed around a four-phase model of…

  14. Erratum to: An Analysis of Training, Generalization, and Maintenance Effects of Primary Care Triple P for Parents of Preschool-Aged Children with Disruptive Behavior.

    PubMed

    Boyle, Cynthia L; Sanders, Matthew R; Lutzker, John R; Prinz, Ronald J; Shapiro, Cheri; Whitaker, Daniel J

    2015-10-01

    The Triple P-Positive Parenting Program is owned by the University of Queensland (UQ). The University through its main technology transfer company UniQuest Pty Limited has licensed Triple P International Pty Ltd to disseminate the program worldwide. Royalties stemming from this dissemination activity are distributed to the Parenting and Family Support Centre, School of Psychology, UQ; Faculty of Health and Behavioural Sciences at UQ; and contributory authors. No author has any share or ownership in Triple P International Pty Ltd. Matthew Sanders is the founder and an author on various Triple P programs and a consultant to Triple P International. Karen Turner is an author of various Triple P programs. Ronald Prinz is a consultant to Triple P International. Cheri Shapiro is a consultant to Triple P America.

  15. The integrable quantum group invariant A2n-1(2) and Dn+1(2) open spin chains

    NASA Astrophysics Data System (ADS)

    Nepomechie, Rafael I.; Pimenta, Rodrigo A.; Retore, Ana L.

    2017-11-01

    A family of A2n(2) integrable open spin chains with Uq(Cn) symmetry was recently identified in arXiv:1702.01482. We identify here in a similar way a family of A2n-1(2) integrable open spin chains with Uq(Dn) symmetry, and two families of Dn+1(2) integrable open spin chains with Uq(Bn) symmetry. We discuss the consequences of these symmetries for the degeneracies and multiplicities of the spectrum. We propose Bethe ansatz solutions for two of these models, whose completeness we check numerically for small values of n and chain length N. We find formulas for the Dynkin labels in terms of the numbers of Bethe roots of each type, which are useful for determining the corresponding degeneracies. In an appendix, we briefly consider Dn+1(2) chains with other integrable boundary conditions, which do not have quantum group symmetry.

  16. Data Driven Smart Proxy for CFD Application of Big Data Analytics & Machine Learning in Computational Fluid Dynamics, Report Two: Model Building at the Cell Level

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ansari, A.; Mohaghegh, S.; Shahnam, M.

    To ensure the usefulness of simulation technologies in practice, their credibility needs to be established with Uncertainty Quantification (UQ) methods. In this project, a smart proxy is introduced to significantly reduce the computational cost of conducting the large number of multiphase CFD simulations typically required for non-intrusive UQ analysis. Smart proxies for CFD models are developed using the pattern recognition capabilities of Artificial Intelligence (AI) and Data Mining (DM) technologies. Several CFD simulation runs with different inlet air velocities for a rectangular fluidized bed are used to create a smart CFD proxy that is capable of replicating the CFD results for the entire geometry and inlet velocity range. The smart CFD proxy is validated with blind CFD runs (CFD runs that have not played any role during the development of the smart CFD proxy). The developed and validated smart CFD proxy generates its results in seconds with reasonable error (less than 10%). Upon completion of this project, UQ studies that rely on hundreds or thousands of smart CFD proxy runs can be accomplished in minutes. The report includes a figure demonstrating a validation example (blind CFD run) that shows the results from the MFiX simulation and the smart CFD proxy for the pressure distribution across a fluidized bed at a given time step (the layer number corresponds to the vertical location in the bed).

  17. Incorporating Uncertainty into Spacecraft Mission and Trajectory Design

    NASA Astrophysics Data System (ADS)

    Feldhacker, Juliana D.

    The complex nature of many astrodynamic systems often leads to high computational costs or degraded accuracy in the analysis and design of spacecraft missions, and the incorporation of uncertainty into the trajectory optimization process often becomes intractable. This research applies mathematical modeling techniques to reduce computational cost and improve tractability for design, optimization, uncertainty quantification (UQ) and sensitivity analysis (SA) in astrodynamic systems and develops a method for trajectory optimization under uncertainty (OUU). This thesis demonstrates the use of surrogate regression models and polynomial chaos expansions for the purpose of design and UQ in the complex three-body system. Results are presented for the application of the models to the design of mid-field rendezvous maneuvers for spacecraft in three-body orbits. The models are shown to provide high accuracy with no a priori knowledge on the sample size required for convergence. Additionally, a method is developed for the direct incorporation of system uncertainties into the design process for the purpose of OUU and robust design; these methods are also applied to the rendezvous problem. It is shown that the models can be used for constrained optimization with orders of magnitude fewer samples than is required for a Monte Carlo approach to the same problem. Finally, this research considers an application for which regression models are not well-suited, namely UQ for the kinetic deflection of potentially hazardous asteroids under the assumptions of real asteroid shape models and uncertainties in the impact trajectory and the surface material properties of the asteroid, which produce a non-smooth system response. An alternate set of models is presented that enables analytic computation of the uncertainties in the imparted momentum from impact. Use of these models for a survey of asteroids allows conclusions to be drawn on the effects of an asteroid's shape on the ability to successfully divert the asteroid via kinetic impactor.

  18. Alternative oxidase (AOX) constitutes a small family of proteins in Citrus clementina and Citrus sinensis L. Osb.

    PubMed

    Araújo Castro, Jacqueline; Gomes Ferreira, Monique Drielle; Santana Silva, Raner José; Andrade, Bruno Silva; Micheli, Fabienne

    2017-01-01

    The alternative oxidase (AOX) protein is present in plants, fungi, protozoa and some invertebrates. It is involved in the mitochondrial respiratory chain, providing an alternative route for the transport of electrons, leading to the reduction of oxygen to form water. The present study aimed to characterize the family of AOX genes in mandarin (Citrus clementina) and sweet orange (Citrus sinensis) at the nucleotide and protein levels, including promoter analysis, phylogenetic analysis and C. sinensis gene expression. This study also aimed to perform homology modeling of one AOX isoform (CcAOXd). Moreover, the molecular docking of the CcAOXd protein with ubiquinone (UQ) was performed. Four AOX genes were identified in each citrus species. These genes have an open reading frame (ORF) ranging from 852 bp to 1150 bp and a number of exons ranging from 4 to 9. The 1500 bp-upstream region of each AOX gene contained regulatory cis-elements related to internal and external response factors. CsAOX genes showed a differential expression in citrus tissues. All AOX proteins were predicted to be located in mitochondria. They contained the conserved motifs LET, NERMHL, LEEEA and RADE-H as well as several putative post-translational modification sites. The CcAOXd protein was modeled by homology to the AOX of Trypanosoma brucei (45% identity). The 3-D structure of CcAOXd showed the presence of two hydrophobic helices that could be involved in the anchoring of the protein in the inner mitochondrial membrane. The active site of the protein is located in a hydrophobic environment deep inside the AOX structure and contains a diiron center. The molecular docking of CcAOXd with UQ showed that the binding site is a recessed pocket formed by the helices and submerged in the membrane. These data are important for future functional studies of citrus AOX genes and/or proteins, as well as for biotechnological approaches leading to AOX inhibition using UQ homologs.

  19. Alternative oxidase (AOX) constitutes a small family of proteins in Citrus clementina and Citrus sinensis L. Osb

    PubMed Central

    Araújo Castro, Jacqueline; Gomes Ferreira, Monique Drielle; Santana Silva, Raner José; Andrade, Bruno Silva

    2017-01-01

    The alternative oxidase (AOX) protein is present in plants, fungi, protozoa and some invertebrates. It is involved in the mitochondrial respiratory chain, providing an alternative route for the transport of electrons, leading to the reduction of oxygen to form water. The present study aimed to characterize the family of AOX genes in mandarin (Citrus clementina) and sweet orange (Citrus sinensis) at the nucleotide and protein levels, including promoter analysis, phylogenetic analysis and C. sinensis gene expression. This study also aimed to perform homology modeling of one AOX isoform (CcAOXd). Moreover, the molecular docking of the CcAOXd protein with ubiquinone (UQ) was performed. Four AOX genes were identified in each citrus species. These genes have an open reading frame (ORF) ranging from 852 bp to 1150 bp and a number of exons ranging from 4 to 9. The 1500 bp-upstream region of each AOX gene contained regulatory cis-elements related to internal and external response factors. CsAOX genes showed a differential expression in citrus tissues. All AOX proteins were predicted to be located in mitochondria. They contained the conserved motifs LET, NERMHL, LEEEA and RADE-H as well as several putative post-translational modification sites. The CcAOXd protein was modeled by homology to the AOX of Trypanosoma brucei (45% identity). The 3-D structure of CcAOXd showed the presence of two hydrophobic helices that could be involved in the anchoring of the protein in the inner mitochondrial membrane. The active site of the protein is located in a hydrophobic environment deep inside the AOX structure and contains a diiron center. The molecular docking of CcAOXd with UQ showed that the binding site is a recessed pocket formed by the helices and submerged in the membrane. These data are important for future functional studies of citrus AOX genes and/or proteins, as well as for biotechnological approaches leading to AOX inhibition using UQ homologs. PMID:28459876

  20. Bilateral differences in the upper quarter function of high school aged baseball and softball players.

    PubMed

    Butler, Robert J; Myers, Heather S; Black, Douglass; Kiesel, Kyle B; Plisky, Phillip J; Moorman, Claude T; Queen, Robin M

    2014-08-01

    The Upper Quarter Y Balance Test (YBT-UQ) was developed as a way to identify upper extremity and trunk mobility in the open kinetic chain in the reaching limb as well as midrange limitations and asymmetries of upper extremity and core stability in the closed kinetic chain on the stabilizing limb. Performance on the YBT-UQ is similar between genders and between limbs; however, this has not been examined in athletes who participate in sports that result in upper extremity asymmetries. The primary purpose of this study is to determine if differences exist between the throwing vs. non-throwing sides in high-school baseball and softball athletes on the YBT-UQ. To accomplish this, forty-eight male high school baseball players and seventeen female high school softball players were tested on the YBT-UQ. Reach distances were normalized to arm length (%AL). Comparisons were made between the throwing (T) and non-throwing (NT) arm for each direction as well as the composite score. No significant differences were observed between the T and NT arm for the medial (NT: 98.4 ± 8.6 %AL, T: 99.1 ± 8.6 %AL, p=0.42), inferolateral (NT: 90.8 ± 11.8 %AL, T: 90.3 ± 11.5 %AL, p=0.61), superolateral (NT: 70.6 ± 10.9 %AL, T: 70.4 ± 11.1 %AL, p=0.91) reaches, or the composite score (NT: 87.2 ± 8.9 %AL, T: 86.6 ± 8.1 %AL, p=0.72). Similarly, no differences were observed between the male baseball and female softball players (p=0.30-0.90). Based on these findings, it was concluded that there was no difference in performance on the YBT-UQ between throwing and non-throwing limbs in high school baseball and softball players. Level of evidence: 3.

  1. CCSI and the role of advanced computing in accelerating the commercial deployment of carbon capture systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miller, David; Agarwal, Deborah A.; Sun, Xin

    2011-09-01

    The Carbon Capture Simulation Initiative is developing state-of-the-art computational modeling and simulation tools to accelerate the commercialization of carbon capture technology. The CCSI Toolset consists of an integrated multi-scale modeling and simulation framework, which includes extensive use of reduced order models (ROMs) and a comprehensive uncertainty quantification (UQ) methodology. This paper focuses on the interrelation among high performance computing, detailed device simulations, ROMs for scale-bridging, UQ and the integration framework.

  2. CCSI and the role of advanced computing in accelerating the commercial deployment of carbon capture systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miller, D.; Agarwal, D.; Sun, X.

    2011-01-01

    The Carbon Capture Simulation Initiative is developing state-of-the-art computational modeling and simulation tools to accelerate the commercialization of carbon capture technology. The CCSI Toolset consists of an integrated multi-scale modeling and simulation framework, which includes extensive use of reduced order models (ROMs) and a comprehensive uncertainty quantification (UQ) methodology. This paper focuses on the interrelation among high performance computing, detailed device simulations, ROMs for scale-bridging, UQ and the integration framework.

  3. UQTk Version 3.0.3 User Manual

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sargsyan, Khachik; Safta, Cosmin; Chowdhary, Kamaljit Singh

    2017-05-01

    The UQ Toolkit (UQTk) is a collection of libraries and tools for the quantification of uncertainty in numerical model predictions. Version 3.0.3 offers intrusive and non-intrusive methods for propagating input uncertainties through computational models, tools for sensitivity analysis, methods for sparse surrogate construction, and Bayesian inference tools for inferring parameters from experimental data. This manual discusses the download and installation process for UQTk, provides pointers to the UQ methods used in the toolkit, and describes some of the examples provided with the toolkit.

  4. Performance of uncertainty quantification methodologies and linear solvers in cardiovascular simulations

    NASA Astrophysics Data System (ADS)

    Seo, Jongmin; Schiavazzi, Daniele; Marsden, Alison

    2017-11-01

    Cardiovascular simulations are increasingly used in clinical decision making, surgical planning, and disease diagnostics. Patient-specific modeling and simulation typically proceeds through a pipeline from anatomic model construction using medical image data to blood flow simulation and analysis. To provide confidence intervals on simulation predictions, we use an uncertainty quantification (UQ) framework to analyze the effects of numerous uncertainties that stem from clinical data acquisition, modeling, material properties, and boundary condition selection. However, UQ poses a computational challenge requiring multiple evaluations of the Navier-Stokes equations in complex 3-D models. To achieve efficiency in UQ problems with many function evaluations, we implement and compare a range of iterative linear solver and preconditioning techniques in our flow solver. We then discuss applications to patient-specific cardiovascular simulation and how the problem/boundary condition formulation in the solver affects the selection of the most efficient linear solver. Finally, we discuss performance improvements in the context of uncertainty propagation. Support from the National Institutes of Health (R01 EB018302) is greatly appreciated.
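
    As a small illustration of the kind of solver comparison described above, the generic SciPy example below counts GMRES iterations with and without an incomplete-LU preconditioner on a simple sparse system; it is not the group's flow solver, and the test matrix is an assumed stand-in.

      # GMRES iteration counts, unpreconditioned vs. ILU-preconditioned.
      import numpy as np
      import scipy.sparse as sp
      from scipy.sparse.linalg import gmres, spilu, LinearOperator

      n = 2000
      A = sp.diags([-1, 2.2, -1], [-1, 0, 1], shape=(n, n), format="csc")
      b = np.ones(n)

      iters = {"none": 0, "ilu": 0}
      def counter(key):
          def cb(residual):
              iters[key] += 1
          return cb

      x0, info0 = gmres(A, b, callback=counter("none"), callback_type="pr_norm")
      ilu = spilu(A)                                   # incomplete LU factorization
      M = LinearOperator((n, n), matvec=ilu.solve)     # preconditioner as a linear operator
      x1, info1 = gmres(A, b, M=M, callback=counter("ilu"), callback_type="pr_norm")
      print(iters, info0, info1)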

  5. Epigenetic priors for identifying active transcription factor binding sites.

    PubMed

    Cuellar-Partida, Gabriel; Buske, Fabian A; McLeay, Robert C; Whitington, Tom; Noble, William Stafford; Bailey, Timothy L

    2012-01-01

    Accurate knowledge of the genome-wide binding of transcription factors in a particular cell type or under a particular condition is necessary for understanding transcriptional regulation. Epigenetic data such as histone modification and DNase I accessibility have been shown to improve motif-based in silico methods for predicting such binding, but this approach has not yet been fully explored. We describe a probabilistic method for combining one or more tracks of epigenetic data with a standard DNA sequence motif model to improve our ability to identify active transcription factor binding sites (TFBSs). We convert each data type into a position-specific probabilistic prior and combine these priors with a traditional probabilistic motif model to compute a log-posterior odds score. Our experiments, using histone modifications H3K4me1, H3K4me3, H3K9ac and H3K27ac, as well as DNase I sensitivity, show conclusively that the log-posterior odds score consistently outperforms a simple binary filter based on the same data. We also show that our approach performs competitively with a more complex method, CENTIPEDE, and suggest that the relative simplicity of the log-posterior odds scoring method makes it an appealing and very general method for identifying functional TFBSs on the basis of DNA and epigenetic evidence. FIMO, part of the MEME Suite software toolkit, now supports log-posterior odds scoring using position-specific priors for motif search. A web server and source code are available at http://meme.nbcr.net. Utilities for creating priors are at http://research.imb.uq.edu.au/t.bailey/SD/Cuellar2011. Contact: t.bailey@uq.edu.au. Supplementary data are available at Bioinformatics online.
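
    A toy numeric sketch of a log-posterior odds score of the general form described above is given below: a motif log-likelihood ratio is combined with a position-specific prior derived from epigenetic data. The numbers and the exact functional form are illustrative assumptions, not the paper's or FIMO's implementation.

      # Combine a motif score with a position-specific prior into log-posterior odds.
      import math

      def log_posterior_odds(motif_llr, prior):
          """motif_llr: log2 P(site|motif)/P(site|background); prior: P(position is bound)."""
          return motif_llr + math.log2(prior / (1.0 - prior))

      # same motif match in closed chromatin (low prior) vs. an accessible,
      # H3K27ac-marked region (high prior): the prior changes the ranking
      print(log_posterior_odds(motif_llr=8.0, prior=0.001))   # ~ -1.97
      print(log_posterior_odds(motif_llr=8.0, prior=0.05))    # ~  3.75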

  6. Data-free and data-driven spectral perturbations for RANS UQ

    NASA Astrophysics Data System (ADS)

    Edeling, Wouter; Mishra, Aashwin; Iaccarino, Gianluca

    2017-11-01

    Despite recent developments in high-fidelity turbulent flow simulations, RANS modeling is still widely used in industry, due to its inherent low cost. Since accuracy is a concern in RANS modeling, model-form UQ is an essential tool for assessing the impacts of this uncertainty on quantities of interest. Applying the spectral decomposition to the modeled Reynolds-Stress Tensor (RST) allows for the introduction of decoupled perturbations into the baseline intensity (kinetic energy), shape (eigenvalues), and orientation (eigenvectors). This constitutes a natural methodology to evaluate the model-form uncertainty associated with different aspects of RST modeling. In a predictive setting, one frequently encounters an absence of any relevant reference data. To make data-free predictions with quantified uncertainty we employ physical bounds to define a priori the maximum spectral perturbations. When propagated, these perturbations yield intervals of engineering utility. High-fidelity data opens up the possibility of inferring a distribution of uncertainty by means of various data-driven machine-learning techniques. We will demonstrate our framework on a number of flow problems where RANS models are prone to failure. This research was partially supported by the Defense Advanced Research Projects Agency under the Enabling Quantification of Uncertainty in Physical Systems (EQUiPS) project (technical monitor: Dr Fariba Fahroo), and the DOE PSAAP-II program.
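
    The numpy sketch below shows only the eigenvalue (shape) perturbation step for a single Reynolds-stress tensor; the full framework also perturbs the kinetic energy and eigenvectors and applies this at every cell of a RANS solution. The example tensor and perturbation magnitude are assumptions for illustration.

      # Shift the anisotropy eigenvalues a fraction delta toward a limiting state.
      import numpy as np

      def perturb_shape(R, delta, corner):
          """R: 3x3 Reynolds-stress tensor; corner: anisotropy eigenvalues of a limiting
          state, e.g. one-component (2/3, -1/3, -1/3) or isotropic (0, 0, 0)."""
          k = 0.5 * np.trace(R)                         # turbulent kinetic energy
          a = R / (2.0 * k) - np.eye(3) / 3.0           # anisotropy tensor
          lam, V = np.linalg.eigh(a)                    # shape (eigenvalues) and orientation
          lam_new = (1 - delta) * lam + delta * np.sort(np.asarray(corner))
          a_new = V @ np.diag(lam_new) @ V.T
          return 2.0 * k * (a_new + np.eye(3) / 3.0)    # reassembled stress tensor

      R = np.array([[0.8, 0.1, 0.0],
                    [0.1, 0.5, 0.0],
                    [0.0, 0.0, 0.3]])
      print(perturb_shape(R, delta=0.3, corner=(2 / 3, -1 / 3, -1 / 3)))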

  7. Evaluating the effect of access to free medication to quit smoking: a clinical trial testing the role of motivation.

    PubMed

    Jardin, Bianca F; Cropsey, Karen L; Wahlquist, Amy E; Gray, Kevin M; Silvestri, Gerard A; Cummings, K Michael; Carpenter, Matthew J

    2014-07-01

    Although the majority of smokers are ambivalent about quitting, few treatments specifically target smokers lacking motivation to quit in the near future. Most existing interventions are instead predicated on the belief that active treatments should only be distributed to smokers interested in quitting, a largely untested assumption. In the current clinical trial (N = 157), motivated smokers wanting to quit in the next 30 days were given a 2-week nicotine replacement therapy (NRT) sample and a referral to a quitline (Group MNQ), while unmotivated smokers were randomized to receive the same treatment (Group UNQ) or a quitline referral only (Group UQ). Participants were tracked via telephone for 3 months to assess quitting behaviors and smoking reduction. Groups significantly differed across all comparisons with regard to incidence of any quit attempt (MNQ: 77%, UNQ: 40%, UQ: 18%, p < .05) and any 24-hr quit attempts (62%, 32%, 16%, p < .05). Clinically meaningful differences emerged in the rates of floating (19%, 17%, 6%) and point prevalence abstinence (17%, 15%, 5%). Compared to participants in Group UQ (11%), a greater proportion of participants in Group MNQ (48%, p = .01) and Group UNQ (31%, p = .01) reduced their daily cigarette consumption by at least half. Proxy measures of cessation readiness (e.g., motivation) favored participants receiving active forms of treatment. Providing NRT samples engaged both motivated and unmotivated smokers into the quitting process and produced positive changes in smoking outcomes. This suggests that motivation should not be considered a necessary precondition to receiving treatment. © The Author 2014. Published by Oxford University Press on behalf of the Society for Research on Nicotine and Tobacco. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  8. Risk assessment of climate systems for national security.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Backus, George A.; Boslough, Mark Bruce Elrick; Brown, Theresa Jean

    2012-10-01

    Climate change, through drought, flooding, storms, heat waves, and melting Arctic ice, affects the production and flow of resources within and among geographical regions. The interactions among governments, populations, and sectors of the economy require integrated assessment based on risk, through uncertainty quantification (UQ). This project evaluated the capabilities within Sandia National Laboratories to perform such integrated analyses as they relate to (inter)national security. Combining the UQ results from climate models with hydrological and economic/infrastructure impact modeling appears to offer the best capability for national security risk assessments.

  9. Some issues in uncertainty quantification and parameter tuning: a case study of convective parameterization scheme in the WRF regional climate model

    NASA Astrophysics Data System (ADS)

    Yang, B.; Qian, Y.; Lin, G.; Leung, R.; Zhang, Y.

    2011-12-01

    The current tuning process of parameters in global climate models is often performed subjectively or treated as an optimization procedure to minimize model biases based on observations. While the latter approach may provide more plausible values for a set of tunable parameters to approximate the observed climate, the system could be forced to an unrealistic physical state or improper balance of budgets through compensating errors over different regions of the globe. In this study, the Weather Research and Forecasting (WRF) model was used to provide a more flexible framework to investigate a number of issues related to uncertainty quantification (UQ) and parameter tuning. The WRF model was constrained by reanalysis of data over the Southern Great Plains (SGP), where abundant observational data from various sources was available for calibration of the input parameters and validation of the model results. Focusing on five key input parameters in the new Kain-Fritsch (KF) convective parameterization scheme used in WRF as an example, the purpose of this study was to explore the utility of high-resolution observations for improving simulations of regional patterns and evaluate the transferability of UQ and parameter tuning across physical processes, spatial scales, and climatic regimes, which have important implications to UQ and parameter tuning in global and regional models. A stochastic importance-sampling algorithm, Multiple Very Fast Simulated Annealing (MVFSA), was employed to efficiently sample the input parameters in the KF scheme based on a skill score so that the algorithm progressively moved toward regions of the parameter space that minimize model errors. The results based on the WRF simulations with 25-km grid spacing over the SGP showed that the precipitation bias in the model could be significantly reduced when five optimal parameters identified by the MVFSA algorithm were used. The model performance was found to be sensitive to downdraft- and entrainment-related parameters and consumption time of Convective Available Potential Energy (CAPE). Simulated convective precipitation decreased as the ratio of downdraft to updraft flux increased. Larger CAPE consumption time resulted in less convective but more stratiform precipitation. The simulation using optimal parameters obtained by constraining only precipitation generated positive impact on the other output variables, such as temperature and wind. By using the optimal parameters obtained at 25-km simulation, both the magnitude and spatial pattern of simulated precipitation were improved at 12-km spatial resolution. The optimal parameters identified from the SGP region also improved the simulation of precipitation when the model domain was moved to another region with a different climate regime (i.e., the North America monsoon region). These results suggest that benefits of optimal parameters determined through vigorous mathematical procedures such as the MVFSA process are transferable across processes, spatial scales, and climatic regimes to some extent. This motivates future studies to further assess the strategies for UQ and parameter optimization at both global and regional scales.
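
    The Python sketch below is a generic simulated-annealing calibration loop in the spirit of the sampling strategy described in this and the two following records; it is not the MVFSA implementation, and the bounds, cooling schedule, and stand-in "model" are assumptions for illustration.

      # Propose perturbed parameter sets within bounds, evaluate a skill-score error,
      # and accept improving (or occasionally worse) moves so the search concentrates
      # in low-error regions of parameter space.
      import numpy as np

      def calibrate(run_model, skill_error, bounds, n_iter=200, t0=1.0, seed=0):
          rng = np.random.default_rng(seed)
          lo, hi = np.array(bounds).T
          p = lo + (hi - lo) * rng.random(len(bounds))      # initial parameter vector
          err = skill_error(run_model(p))
          best_p, best_err = p.copy(), err
          for i in range(n_iter):
              temp = t0 / (1.0 + i)                         # cooling schedule
              prop = np.clip(p + 0.1 * (hi - lo) * rng.normal(size=len(p)), lo, hi)
              e = skill_error(run_model(prop))
              if e < err or rng.random() < np.exp((err - e) / temp):
                  p, err = prop, e
                  if e < best_err:
                      best_p, best_err = prop.copy(), e
          return best_p, best_err

      # usage with a stand-in "model" (identity map) and a quadratic skill error
      bounds = [(0.0, 1.0)] * 5
      target = np.array([0.3, 0.7, 0.5, 0.2, 0.9])
      best, err = calibrate(lambda p: p, lambda out: float(np.sum((out - target) ** 2)), bounds)
      print(best.round(2), err)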

  10. Uncertainty Quantification and Parameter Tuning: A Case Study of Convective Parameterization Scheme in the WRF Regional Climate Model

    NASA Astrophysics Data System (ADS)

    Qian, Y.; Yang, B.; Lin, G.; Leung, R.; Zhang, Y.

    2012-04-01

    The current tuning process of parameters in global climate models is often performed subjectively or treated as an optimization procedure to minimize model biases based on observations. While the latter approach may provide more plausible values for a set of tunable parameters to approximate the observed climate, the system could be forced to an unrealistic physical state or improper balance of budgets through compensating errors over different regions of the globe. In this study, the Weather Research and Forecasting (WRF) model was used to provide a more flexible framework to investigate a number of issues related to uncertainty quantification (UQ) and parameter tuning. The WRF model was constrained by reanalysis of data over the Southern Great Plains (SGP), where abundant observational data from various sources was available for calibration of the input parameters and validation of the model results. Focusing on five key input parameters in the new Kain-Fritsch (KF) convective parameterization scheme used in WRF as an example, the purpose of this study was to explore the utility of high-resolution observations for improving simulations of regional patterns and evaluate the transferability of UQ and parameter tuning across physical processes, spatial scales, and climatic regimes, which have important implications to UQ and parameter tuning in global and regional models. A stochastic importance-sampling algorithm, Multiple Very Fast Simulated Annealing (MVFSA), was employed to efficiently sample the input parameters in the KF scheme based on a skill score so that the algorithm progressively moved toward regions of the parameter space that minimize model errors. The results based on the WRF simulations with 25-km grid spacing over the SGP showed that the precipitation bias in the model could be significantly reduced when five optimal parameters identified by the MVFSA algorithm were used. The model performance was found to be sensitive to downdraft- and entrainment-related parameters and consumption time of Convective Available Potential Energy (CAPE). Simulated convective precipitation decreased as the ratio of downdraft to updraft flux increased. Larger CAPE consumption time resulted in less convective but more stratiform precipitation. The simulation using optimal parameters obtained by constraining only precipitation generated positive impact on the other output variables, such as temperature and wind. By using the optimal parameters obtained at 25-km simulation, both the magnitude and spatial pattern of simulated precipitation were improved at 12-km spatial resolution. The optimal parameters identified from the SGP region also improved the simulation of precipitation when the model domain was moved to another region with a different climate regime (i.e., the North America monsoon region). These results suggest that benefits of optimal parameters determined through vigorous mathematical procedures such as the MVFSA process are transferable across processes, spatial scales, and climatic regimes to some extent. This motivates future studies to further assess the strategies for UQ and parameter optimization at both global and regional scales.

  11. Some issues in uncertainty quantification and parameter tuning: a case study of convective parameterization scheme in the WRF regional climate model

    NASA Astrophysics Data System (ADS)

    Yang, B.; Qian, Y.; Lin, G.; Leung, R.; Zhang, Y.

    2012-03-01

    The current tuning process of parameters in global climate models is often performed subjectively or treated as an optimization procedure to minimize model biases based on observations. While the latter approach may provide more plausible values for a set of tunable parameters to approximate the observed climate, the system could be forced to an unrealistic physical state or an improper balance of budgets through compensating errors over different regions of the globe. In this study, the Weather Research and Forecasting (WRF) model was used to provide a more flexible framework to investigate a number of issues related to uncertainty quantification (UQ) and parameter tuning. The WRF model was constrained by reanalysis data over the Southern Great Plains (SGP), where abundant observational data from various sources were available for calibration of the input parameters and validation of the model results. Focusing on five key input parameters in the new Kain-Fritsch (KF) convective parameterization scheme used in WRF as an example, the purpose of this study was to explore the utility of high-resolution observations for improving simulations of regional patterns and to evaluate the transferability of UQ and parameter tuning across physical processes, spatial scales, and climatic regimes, which has important implications for UQ and parameter tuning in global and regional models. A stochastic importance-sampling algorithm, Multiple Very Fast Simulated Annealing (MVFSA), was employed to efficiently sample the input parameters in the KF scheme based on a skill score, so that the algorithm progressively moved toward regions of the parameter space that minimize model errors. The results based on the WRF simulations with 25-km grid spacing over the SGP showed that the precipitation bias in the model could be significantly reduced when the five optimal parameters identified by the MVFSA algorithm were used. The model performance was found to be sensitive to downdraft- and entrainment-related parameters and the consumption time of Convective Available Potential Energy (CAPE). Simulated convective precipitation decreased as the ratio of downdraft to updraft flux increased. Larger CAPE consumption time resulted in less convective but more stratiform precipitation. The simulation using optimal parameters obtained by constraining only precipitation generated a positive impact on the other output variables, such as temperature and wind. By using the optimal parameters obtained from the 25-km simulation, both the magnitude and spatial pattern of simulated precipitation were improved at 12-km spatial resolution. The optimal parameters identified from the SGP region also improved the simulation of precipitation when the model domain was moved to another region with a different climate regime (i.e., the North American monsoon region). These results suggest that the benefits of optimal parameters determined through rigorous mathematical procedures such as the MVFSA process are transferable across processes, spatial scales, and climatic regimes to some extent. This motivates future studies to further assess strategies for UQ and parameter optimization at both global and regional scales.
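    The MVFSA step described above is, at its core, a simulated-annealing importance sampler driven by a skill score. The following minimal single-chain sketch illustrates that idea in Python; the five-parameter bounds, the toy skill_score function, and the cooling schedule are illustrative assumptions, not the actual WRF/Kain-Fritsch setup or the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical bounds for five tunable parameters (illustrative only).
bounds = np.array([[0.0, 1.0]] * 5)

def skill_score(p):
    """Toy stand-in for the model-versus-observation error; lower is better."""
    return float(np.sum((p - 0.3) ** 2))

def anneal(n_iter=2000, t0=1.0, c=0.5):
    p = rng.uniform(bounds[:, 0], bounds[:, 1])
    e = skill_score(p)
    best_p, best_e = p.copy(), e
    for k in range(1, n_iter + 1):
        # Very-fast-annealing style cooling of the temperature
        temp = t0 * np.exp(-c * k ** (1.0 / len(p)))
        step = temp * (bounds[:, 1] - bounds[:, 0]) * rng.standard_normal(len(p))
        cand = np.clip(p + step, bounds[:, 0], bounds[:, 1])
        e_cand = skill_score(cand)
        # Metropolis rule: accept improvements, occasionally accept worse points
        if e_cand < e or rng.random() < np.exp(-(e_cand - e) / temp):
            p, e = cand, e_cand
            if e < best_e:
                best_p, best_e = p.copy(), e
    return best_p, best_e

print(anneal())
```

    In the study itself each evaluation of the skill score corresponds to a full WRF simulation, so far fewer iterations, spread over multiple parallel chains (as the name MVFSA suggests), would be affordable.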

  12. Final Technical Report: Mathematical Foundations for Uncertainty Quantification in Materials Design

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Plechac, Petr; Vlachos, Dionisios G.

    We developed path-wise, information-theory-based and goal-oriented sensitivity analysis and parameter identification methods for complex high-dimensional dynamics, and in particular for non-equilibrium extended molecular systems. The combination of these novel methodologies provided the first methods in the literature capable of handling UQ questions for stochastic complex systems with some or all of the following features: (a) multi-scale stochastic models such as (bio)chemical reaction networks with a very large number of parameters, (b) spatially distributed systems such as Kinetic Monte Carlo or Langevin Dynamics, (c) non-equilibrium processes typically associated with coupled physico-chemical mechanisms, driven boundary conditions, hybrid micro-macro systems, etc. A particular computational challenge arises in simulations of multi-scale reaction networks and molecular systems. Mathematical techniques were applied to in silico prediction of novel materials, with emphasis on the effect of microstructure on model uncertainty quantification (UQ). We outline acceleration methods to make calculations of real chemistry feasible, followed by two complementary tasks on structure optimization and microstructure-induced UQ.

  13. Active Subspaces for Wind Plant Surrogate Modeling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    King, Ryan N; Quick, Julian; Dykes, Katherine L

    Understanding the uncertainty in wind plant performance is crucial to their cost-effective design and operation. However, conventional approaches to uncertainty quantification (UQ), such as Monte Carlo techniques or surrogate modeling, are often computationally intractable for utility-scale wind plants because of poor convergence rates or the curse of dimensionality. In this paper we demonstrate that wind plant power uncertainty can be well represented with a low-dimensional active subspace, thereby achieving a significant reduction in the dimension of the surrogate modeling problem. We apply the active subspaces technique to UQ of plant power output with respect to uncertainty in turbine axial induction factors, and find that a single active subspace direction dominates the sensitivity in power output. When this single active subspace direction is used to construct a quadratic surrogate model, the number of model unknowns can be reduced by up to 3 orders of magnitude without compromising performance on unseen test data. We conclude that the dimension reduction achieved with active subspaces makes surrogate-based UQ approaches tractable for utility-scale wind plants.
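    As a concrete illustration of the dimension-reduction idea, the sketch below estimates an active subspace from Monte Carlo samples of the gradient and then fits a quadratic surrogate in the single dominant direction. The toy power function, its dimension, and the sampling ranges are assumptions for illustration; they are not the wind plant model used in the paper.

```python
import numpy as np

rng = np.random.default_rng(1)
dim, n = 20, 500                       # e.g. 20 uncertain turbine parameters (illustrative)

w_true = rng.standard_normal(dim)
def power(x):                          # toy plant-power model with one dominant direction
    return np.tanh(x @ w_true)
def grad_power(x):
    return (1.0 - np.tanh(x @ w_true) ** 2) * w_true

# Estimate C = E[grad f grad f^T] by Monte Carlo and eigendecompose it
X = rng.uniform(-1.0, 1.0, size=(n, dim))
G = np.array([grad_power(x) for x in X])
C = G.T @ G / n
eigval, eigvec = np.linalg.eigh(C)
w1 = eigvec[:, -1]                     # dominant active-subspace direction
print("fraction of gradient variance in 1-D subspace:", eigval[-1] / eigval.sum())

# Quadratic surrogate in the single active variable y = x . w1
y = X @ w1
f = np.array([power(x) for x in X])
A = np.column_stack([np.ones_like(y), y, y ** 2])
coef, *_ = np.linalg.lstsq(A, f, rcond=None)
print("surrogate coefficients:", coef)
```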

  14. Paracousti-UQ: A Stochastic 3-D Acoustic Wave Propagation Algorithm.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Preston, Leiph

    Acoustic full waveform algorithms, such as Paracousti, provide deterministic solutions in complex, 3-D variable environments. In reality, environmental and source characteristics are often only known in a statistical sense. Thus, to fully characterize the expected sound levels within an environment, this uncertainty in environmental and source factors should be incorporated into the acoustic simulations. Performing Monte Carlo (MC) simulations is one method of assessing this uncertainty, but it can quickly become computationally intractable for realistic problems. An alternative method, using the technique of stochastic partial differential equations (SPDE), allows computation of the statistical properties of output signals at a fraction of the computational cost of MC. Paracousti-UQ solves the SPDE system of 3-D acoustic wave propagation equations and provides estimates of the uncertainty of the output simulated wave field (e.g., amplitudes, waveforms) based on estimated probability distributions of the input medium and source parameters. This report describes the derivation of the stochastic partial differential equations, their implementation, and comparison of Paracousti-UQ results with MC simulations using simple models.

  15. On uncertainty quantification of lithium-ion batteries: Application to an LiC6/LiCoO2 cell

    NASA Astrophysics Data System (ADS)

    Hadigol, Mohammad; Maute, Kurt; Doostan, Alireza

    2015-12-01

    In this work, a stochastic, physics-based model for Lithium-ion batteries (LIBs) is presented in order to study the effects of parametric model uncertainties on the cell capacity, voltage, and concentrations. To this end, the proposed uncertainty quantification (UQ) approach, based on sparse polynomial chaos expansions, relies on a small number of battery simulations. Within this UQ framework, the identification of most important uncertainty sources is achieved by performing a global sensitivity analysis via computing the so-called Sobol' indices. Such information aids in designing more efficient and targeted quality control procedures, which consequently may result in reducing the LIB production cost. An LiC6/LiCoO2 cell with 19 uncertain parameters discharged at 0.25C, 1C and 4C rates is considered to study the performance and accuracy of the proposed UQ approach. The results suggest that, for the considered cell, the battery discharge rate is a key factor affecting not only the performance variability of the cell, but also the determination of most important random inputs.
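    The global sensitivity analysis mentioned above reduces, in Monte Carlo form, to the classic pick-freeze estimator of first-order Sobol' indices. A minimal sketch follows; the toy capacity function and its 19 uniform inputs are placeholders for the actual electrochemical cell model, and the paper computes the indices from sparse polynomial chaos coefficients rather than by brute-force sampling.

```python
import numpy as np

rng = np.random.default_rng(2)
dim, n = 19, 20000                      # 19 uncertain cell parameters (illustrative)

w = np.linspace(1.0, 0.1, dim)
def capacity(x):                        # toy stand-in for the cell model output
    return x @ w + 0.5 * x[:, 0] * x[:, 1]

A = rng.uniform(size=(n, dim))
B = rng.uniform(size=(n, dim))
fA, fB = capacity(A), capacity(B)
var = np.var(np.concatenate([fA, fB]))

S1 = []
for i in range(dim):
    ABi = A.copy()
    ABi[:, i] = B[:, i]                 # vary only the i-th input between the two sample sets
    S1.append(np.mean(fB * (capacity(ABi) - fA)) / var)
print(np.round(S1, 3))
```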

  16. Effects of dietary lipid, vitamins and minerals on total amounts and redox status of glutathione and ubiquinone in tissues of Atlantic salmon (Salmo salar): a multivariate approach.

    PubMed

    Hamre, Kristin; Torstensen, Bente E; Maage, Amund; Waagbø, Rune; Berge, Rolf K; Albrektsen, Sissel

    2010-10-01

    The hypothesis of the present study was that Atlantic salmon (Salmo salar) would respond to large variations in supplementation of dietary pro- and antioxidants, and marine lipid, with adjustment of the endogenously synthesised antioxidants, glutathione (GSH) and ubiquinone (UQ). An experiment with a 2^(7-3) reduced factorial design (the number of cases reduced systematically from 2^7 (full design) to 2^4 (reduced design)) was conducted, where vitamins, minerals and lipid were supplemented in the diet at high and low levels. For the vitamins and minerals the high levels were chosen to be just below anticipated toxic levels and the low levels were just above the requirement (vitamin C, 30 and 1000 mg/kg; vitamin E, 70 and 430 mg/kg; Fe, 70 and 1200 mg/kg; Cu, 8 and 110 mg/kg; Mn, 12 and 200 mg/kg). For astaxanthin, the dietary levels were 10 and 50 mg/kg and for lipid, 150 and 330 g/kg. The experiment was started with post-smolts (148 (SD 17) g) and lasted for 5 months. The only effect on GSH was a minor increase (<10%) in total concentration in the liver in response to high dietary lipid. GSH redox state was not affected. UQ responded to dietary lipid, astaxanthin and vitamin E, both with regard to total concentration and redox state. Except for an effect of Fe on plasma GSH, the trace elements and vitamin C had no effect on tissue levels and oxidation state of GSH and UQ. This shows that the endogenous redox state is quite robust with regard to variation of dietary pro- and antioxidants in Atlantic salmon.

  17. Capelli bitableaux and Z-forms of general linear Lie superalgebras.

    PubMed Central

    Brini, A; Teolis, A G

    1990-01-01

    The combinatorics of the enveloping algebra U_Q(pl(L)) of the general linear Lie superalgebra of a finite-dimensional Z_2-graded Q-vector space is studied. Three non-equivalent Z-forms of U_Q(pl(L)) are introduced: one of these Z-forms is a version of the Kostant Z-form and the others are Lie algebra analogs of Rota and Stein's straightening formulae for the supersymmetric algebra Super[L P] and for its dual Super[L* P*]. The method is based on an extension of Capelli's technique of variabili ausiliarie (auxiliary variables) to algebras containing positively and negatively signed elements. PMID:11607048

  18. A comparison of per sample global scaling and per gene normalization methods for differential expression analysis of RNA-seq data.

    PubMed

    Li, Xiaohong; Brock, Guy N; Rouchka, Eric C; Cooper, Nigel G F; Wu, Dongfeng; O'Toole, Timothy E; Gill, Ryan S; Eteleeb, Abdallah M; O'Brien, Liz; Rai, Shesh N

    2017-01-01

    Normalization is an essential step with considerable impact on high-throughput RNA sequencing (RNA-seq) data analysis. Although there are numerous methods for read count normalization, it remains a challenge to choose an optimal method due to multiple factors contributing to read count variability that affects the overall sensitivity and specificity. In order to properly determine the most appropriate normalization methods, it is critical to compare the performance and shortcomings of a representative set of normalization routines based on different dataset characteristics. Therefore, we set out to evaluate the performance of the commonly used methods (DESeq, TMM-edgeR, FPKM-CuffDiff, TC, Med, UQ and FQ) and two new methods we propose: Med-pgQ2 and UQ-pgQ2 (per-gene normalization after per-sample median or upper-quartile global scaling). Our per-gene normalization approach allows for comparisons between conditions based on similar count levels. Using the benchmark Microarray Quality Control Project (MAQC) and simulated datasets, we performed differential gene expression analysis to evaluate these methods. When evaluating MAQC2 with two replicates, we observed that Med-pgQ2 and UQ-pgQ2 achieved a slightly higher area under the Receiver Operating Characteristic curve (AUC), a specificity rate >85%, a detection power >92% and an actual false discovery rate (FDR) under 0.06 given the nominal FDR (≤0.05). Although the top commonly used methods (DESeq and TMM-edgeR) yield a higher power (>93%) for MAQC2 data, they trade off with a reduced specificity (<70%) and a slightly higher actual FDR than our proposed methods. In addition, the results from an analysis based on the qualitative characteristics of sample distribution for the MAQC2 and human breast cancer datasets show that only our gene-wise normalization methods corrected data skewed towards lower read counts. However, when we evaluated MAQC3 with less variation in five replicates, all methods performed similarly. Thus, our proposed Med-pgQ2 and UQ-pgQ2 methods perform slightly better for differential gene analysis of RNA-seq data skewed towards lowly expressed read counts with high variation by improving specificity while maintaining good detection power with control of the nominal FDR level.
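    The per-sample upper-quartile (UQ) global scaling that both proposed methods start from is simple to state in code. The sketch below implements only that first scaling step on a hypothetical genes-by-samples count matrix; the per-gene pgQ2 step and the exact scaling constant used in the paper are not reproduced here.

```python
import numpy as np

def upper_quartile_normalize(counts):
    """Per-sample upper-quartile scaling of a genes x samples count matrix (sketch)."""
    counts = np.asarray(counts, dtype=float)
    expressed = counts[counts.sum(axis=1) > 0]        # drop genes with zero counts everywhere
    uq = np.percentile(expressed, 75, axis=0)         # 75th percentile of each sample
    scale = uq / uq.mean()                            # keep values on a comparable scale
    return counts / scale

counts = np.array([[10, 20, 0],
                   [100, 250, 80],
                   [0, 0, 0],
                   [5, 9, 4]])
print(upper_quartile_normalize(counts))
```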

  19. A comparison of per sample global scaling and per gene normalization methods for differential expression analysis of RNA-seq data

    PubMed Central

    Li, Xiaohong; Brock, Guy N.; Rouchka, Eric C.; Cooper, Nigel G. F.; Wu, Dongfeng; O’Toole, Timothy E.; Gill, Ryan S.; Eteleeb, Abdallah M.; O’Brien, Liz

    2017-01-01

    Normalization is an essential step with considerable impact on high-throughput RNA sequencing (RNA-seq) data analysis. Although there are numerous methods for read count normalization, it remains a challenge to choose an optimal method due to multiple factors contributing to read count variability that affects the overall sensitivity and specificity. In order to properly determine the most appropriate normalization methods, it is critical to compare the performance and shortcomings of a representative set of normalization routines based on different dataset characteristics. Therefore, we set out to evaluate the performance of the commonly used methods (DESeq, TMM-edgeR, FPKM-CuffDiff, TC, Med, UQ and FQ) and two new methods we propose: Med-pgQ2 and UQ-pgQ2 (per-gene normalization after per-sample median or upper-quartile global scaling). Our per-gene normalization approach allows for comparisons between conditions based on similar count levels. Using the benchmark Microarray Quality Control Project (MAQC) and simulated datasets, we performed differential gene expression analysis to evaluate these methods. When evaluating MAQC2 with two replicates, we observed that Med-pgQ2 and UQ-pgQ2 achieved a slightly higher area under the Receiver Operating Characteristic curve (AUC), a specificity rate >85%, a detection power >92% and an actual false discovery rate (FDR) under 0.06 given the nominal FDR (≤0.05). Although the top commonly used methods (DESeq and TMM-edgeR) yield a higher power (>93%) for MAQC2 data, they trade off with a reduced specificity (<70%) and a slightly higher actual FDR than our proposed methods. In addition, the results from an analysis based on the qualitative characteristics of sample distribution for the MAQC2 and human breast cancer datasets show that only our gene-wise normalization methods corrected data skewed towards lower read counts. However, when we evaluated MAQC3 with less variation in five replicates, all methods performed similarly. Thus, our proposed Med-pgQ2 and UQ-pgQ2 methods perform slightly better for differential gene analysis of RNA-seq data skewed towards lowly expressed read counts with high variation by improving specificity while maintaining good detection power with control of the nominal FDR level. PMID:28459823

  20. CASL Dakota Capabilities Summary

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Adams, Brian M.; Simmons, Chris; Williams, Brian J.

    2017-10-10

    The Dakota software project serves the mission of Sandia National Laboratories and supports a worldwide user community by delivering state-of-the-art research and robust, usable software for optimization and uncertainty quantification. These capabilities enable advanced exploration and risk-informed prediction with a wide range of computational science and engineering models. Dakota is the verification and validation (V&V) / uncertainty quantification (UQ) software delivery vehicle for CASL, allowing analysts across focus areas to apply these capabilities to myriad nuclear engineering analyses.

  1. Examining Fundamental Movement Competency and Closed-Chain Upper-Extremity Dynamic Balance in Swimmers.

    PubMed

    Bullock, Garrett S; Brookreson, Nate; Knab, Amy M; Butler, Robert J

    2017-06-01

    Abnormal fundamental movement patterns and upper-quarter dynamic balance are proposed mechanisms affecting athletic performance and injury risk. There are few studies investigating functional movement and closed-chain upper-extremity dynamic stability in swimmers. The purpose of this study was to determine differences in fundamental movement competency and closed-chain upper-extremity dynamic balance, using the Functional Movement Screen (FMS) and Upper-Quarter Y Balance Test (YBT-UQ), of high school (HS; n = 70) and collegiate (COL; n = 70) swimmers. Variables included the individual movement tests on the FMS and the average normalized reach (percent limb length [%LL]) for each direction of the YBT-UQ. Statistical analysis was completed using chi-square tests for the independent test scores on the FMS, and independent-samples t-tests to examine performance on the YBT-UQ (p ≤ 0.05). HS swimmers exhibited a statistically significantly greater percentage of below-average performance (score of 0 or 1) on the following FMS tests: lunge (HS: 22.9%, COL: 4.3%), hurdle step (HS: 31.4%, COL: 7.1%), and push-up (HS: 61.4%, COL: 31.4%). Furthermore, COL males performed worse in the lunge (male: 9%, female: 0%), whereas COL females had poorer efficiency in the push-up (male: 17.6%, female: 44%). Significant effects of competition level and sex were observed in YBT-UQ medial reach (HS: female 92.06, male 101.63; COL: female 101.3, male 101.5 %LL). Individual fundamental movement patterns that involved lumbopelvic neuromuscular control differed between HS and COL swimmers. General upper-extremity dynamic balance differed between competition levels. These data may be helpful in understanding injury- and performance-based normative data for participation and return to swimming.

  2. Inverse regression-based uncertainty quantification algorithms for high-dimensional models: Theory and practice

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Weixuan; Lin, Guang; Li, Bing

    2016-09-01

    A well-known challenge in uncertainty quantification (UQ) is the "curse of dimensionality". However, many high-dimensional UQ problems are essentially low-dimensional, because the randomness of the quantity of interest (QoI) is caused only by uncertain parameters varying within a low-dimensional subspace, known as the sufficient dimension reduction (SDR) subspace. Motivated by this observation, we propose and demonstrate in this paper an inverse regression-based UQ approach (IRUQ) for high-dimensional problems. Specifically, we use an inverse regression procedure to estimate the SDR subspace and then convert the original problem to a low-dimensional one, which can be efficiently solved by building a response surface model such as a polynomial chaos expansion. The novelty and advantages of the proposed approach are seen in its computational efficiency and practicality. Compared with Monte Carlo, the traditionally preferred approach for high-dimensional UQ, IRUQ with a comparable cost generally gives much more accurate solutions even for high-dimensional problems, and even when the dimension reduction is not exactly sufficient. Theoretically, IRUQ is proved to converge twice as fast as the approach it uses to seek the SDR subspace. For example, while a sliced inverse regression method converges to the SDR subspace at the rate of O(n^{-1/2}), the corresponding IRUQ converges at O(n^{-1}). IRUQ also provides several desired conveniences in practice. It is non-intrusive, requiring only a simulator to generate realizations of the QoI, and there is no need to compute the high-dimensional gradient of the QoI. Finally, error bars can be derived for the estimation results reported by IRUQ.
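    The inverse regression step can be illustrated with a few lines of sliced inverse regression (SIR): slice on the output, average the inputs within each slice, and eigendecompose the weighted covariance of the slice means. The toy one-dimensional ridge function and standard-normal (already whitened) inputs below are assumptions for illustration, not the high-dimensional simulators treated in the paper.

```python
import numpy as np

rng = np.random.default_rng(3)
dim, n, n_slices = 50, 2000, 10

b_true = rng.standard_normal(dim)
b_true /= np.linalg.norm(b_true)
X = rng.standard_normal((n, dim))                         # whitened inputs
y = np.tanh(X @ b_true) + 0.05 * rng.standard_normal(n)   # QoI varies along a 1-D subspace

# Sliced inverse regression: slice on y and eigendecompose the slice-mean covariance
order = np.argsort(y)
slices = np.array_split(order, n_slices)
means = np.array([X[idx].mean(axis=0) for idx in slices])
weights = np.array([len(idx) / n for idx in slices])
M = (means.T * weights) @ means
eigval, eigvec = np.linalg.eigh(M)
b_hat = eigvec[:, -1]                                     # estimated SDR direction

print("alignment with true direction:", round(abs(float(b_hat @ b_true)), 3))
```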

  3. Upper quadrant port placement for robot-assisted renal surgery: implementation of the Floating Arm and the XL Protype.

    PubMed

    Totonchi, Samer; Elgin, Robert; Monahan, Michael; Johnston, William K

    2014-08-01

    Background and Purpose: Placement of the fourth arm (4th arm) in the lower quadrant (LQ) is commonly described for robot-assisted renal surgical procedures but has anatomic restrictions and limited ergonomics. An alternative, upper quadrant (UQ) location is desirable, but patient habitus and spacing may restrict robotic attachment. We investigate current trends in 4th arm port placement and propose an alternative method of attaching the robot: the "Floating Arm" (FLA). Robotic surgeons from the Endourological Society were surveyed. A 20-cm extra-long (XL Protype) da Vinci instrument was developed for the FLA technique. A dry lab allowed quantitative comparison of spacing and ranges of motion for standard da Vinci ports (dVP), bariatric dVP, telescoping dVP, and FLA. There were 108 respondents who participated. Half of the respondents avoid using the 4th arm (30% for lack of need and 20% because of interference). The majority (90%) typically position the 4th arm in the LQ, but many reported limitations in this location. Few (5%) place the 4th arm in the UQ, while most (73%) have never heard of UQ placement. Existing techniques may increase shoulder height clearance but inversely shorten the working length of the instrument intracorporeally. Alternatively, the XL Protype significantly increased the shoulder length and maintained available working distances intracorporeally. The adjacent arm interference angle was essentially identical (27 degrees) for all ports, except for a greater range of movement for the XL Protype (35 degrees). Few surgeons use UQ positioning or techniques to improve attachment of the 4th arm. The greatest freedom may be obtained by implementing the FLA, but this necessitates production of a longer instrument.

  4. Quantum group symmetries and completeness for A_{2n}^{(2)} open spin chains

    NASA Astrophysics Data System (ADS)

    Ahmed, Ibrahim; Nepomechie, Rafael I.; Wang, Chunguang

    2017-07-01

    We argue that the Hamiltonians for A_{2n}^{(2)} open quantum spin chains corresponding to two choices of integrable boundary conditions have the symmetries Uq(B_n) and Uq(C_n), respectively. We find a formula for the Dynkin labels of the Bethe states (which determine the degeneracies of the corresponding eigenvalues) in terms of the numbers of Bethe roots of each type. With the help of this formula, we verify numerically (for a generic value of the anisotropy parameter) that the degeneracies and multiplicities of the spectra implied by the quantum group symmetries are completely described by the Bethe ansatz.

  5. Forensic Uncertainty Quantification of Explosive Dispersal of Particles

    NASA Astrophysics Data System (ADS)

    Hughes, Kyle; Park, Chanyoung; Haftka, Raphael; Kim, Nam-Ho

    2017-06-01

    In addition to the numerical challenges of simulating the explosive dispersal of particles, validation of the simulation is often plagued by poor knowledge of the experimental conditions. The level of experimental detail required for validation is beyond what is usually included in the literature. This presentation proposes the use of forensic uncertainty quantification (UQ) to investigate validation-quality experiments and discover possible sources of uncertainty that may have been missed in the initial design of experiments or under-reported. The authors' experience has been that, by making an analogy to crime scene investigation when looking at validation experiments, valuable insights may be gained. One examines all the data and documentation provided by the validation experimentalists, corroborates evidence, and quantifies large sources of uncertainty a posteriori with empirical measurements. In addition, it is proposed that forensic UQ may benefit from an independent investigator, to help remove possible implicit biases and increase the likelihood of discovering unrecognized uncertainty. Forensic UQ concepts will be discussed and then applied to a set of validation experiments performed at Eglin Air Force Base. This work was supported in part by the U.S. Department of Energy, National Nuclear Security Administration, Advanced Simulation and Computing Program.

  6. Multilevel UQ strategies for large-scale multiphysics applications: PSAAP II solar receiver

    NASA Astrophysics Data System (ADS)

    Jofre, Lluis; Geraci, Gianluca; Iaccarino, Gianluca

    2017-06-01

    Uncertainty quantification (UQ) plays a fundamental part in building confidence in predictive science. Of particular interest is the case of modeling and simulating engineering applications where, due to the inherent complexity, many uncertainties naturally arise, e.g. domain geometry, operating conditions, errors induced by modeling assumptions, etc. In this regard, one of the pacing items, especially in high-fidelity computational fluid dynamics (CFD) simulations, is the large amount of computing resources typically required to propagate incertitude through the models. Upcoming exascale supercomputers will significantly increase the available computational power. However, UQ approaches cannot rely solely on brute-force Monte Carlo (MC) sampling; the large number of uncertainty sources and the presence of nonlinearities in the solution make straightforward MC analysis unaffordable. Therefore, this work explores the multilevel MC strategy, and its extension to multi-fidelity and time convergence, to accelerate the estimation of the effect of uncertainties. The approach is described in detail, and its performance demonstrated on a radiated turbulent particle-laden flow case relevant to solar energy receivers (PSAAP II: Particle-laden turbulence in a radiation environment). Investigation funded by DoE's NNSA under PSAAP II.
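    The multilevel idea can be reduced to a two-level sketch: estimate the mean with many cheap coarse-level samples and correct the bias with a few paired fine-minus-coarse samples. The toy quantity of interest below, in which "level" simply shrinks an artificial discretization error, is an assumption for illustration rather than the particle-laden solar receiver simulations of the project.

```python
import numpy as np

rng = np.random.default_rng(4)

def qoi(theta, level):
    """Toy solver output: higher level = finer mesh = smaller bias (illustrative)."""
    return np.sin(theta) + 0.5 ** level * np.cos(3.0 * theta)

def mlmc_two_level(n_coarse=100000, n_fine=500):
    # Level 0: many cheap coarse evaluations
    th0 = rng.uniform(0.0, np.pi, n_coarse)
    mean0 = qoi(th0, level=0).mean()
    # Correction: a few paired fine/coarse evaluations at common samples
    th1 = rng.uniform(0.0, np.pi, n_fine)
    correction = (qoi(th1, level=1) - qoi(th1, level=0)).mean()
    return mean0 + correction

print(mlmc_two_level())
```

    The multi-fidelity and time-convergence extensions mentioned in the abstract generalize the same telescoping-correction idea to more than two levels and to other notions of "resolution".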

  7. Three-Dimensional Stresses in a Half Space Caused by Penny-Shaped Inclusions

    DTIC Science & Technology

    1988-08-19

    eigenstrains [11], and Green's function in the half space [12]. Mura has recently reviewed these research efforts [13]. When the elastic moduli of an... external stress field uq is applied. On the other hand, a material containing inclusions is subjected to an internal stress caused by the eigenstrain... even if it is free from any external loads. The definition of eigenstrains has been given by Mura [13] and is the same as the stress-free

  8. CERT TST November 2016 Visit Summary

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Little, Robert Currier; Bailey, Teresa S.; Kahler, III, Albert Comstock

    2017-04-27

    The dozen plus presentations covered the span of the Center’s activities, including experimental progress, simulations of the experiments (both for calibration and validation), UQ analysis, nuclear data impacts, status of simulation codes, methods development, computational science progress, and plans for upcoming priorities. All three institutions comprising the Center (Texas A&M, University of Colorado Boulder, and Simon Fraser University) were represented. Center-supported students not only gave two of the oral presentations, but also highlighted their research in a number of excellent posters.

  9. Optimal Experimental Design of Borehole Locations for Bayesian Inference of Past Ice Sheet Surface Temperatures

    NASA Astrophysics Data System (ADS)

    Davis, A. D.; Huan, X.; Heimbach, P.; Marzouk, Y.

    2017-12-01

    Borehole data are essential for calibrating ice sheet models. However, field expeditions for acquiring borehole data are often time-consuming, expensive, and dangerous. It is thus essential to plan the best sampling locations that maximize the value of data while minimizing costs and risks. We present an uncertainty quantification (UQ) workflow based on a rigorous probabilistic framework to achieve these objectives. First, we employ an optimal experimental design (OED) procedure to compute borehole locations that yield the highest expected information gain. We take into account practical considerations of location accessibility (e.g., proximity to research sites, terrain, and ice velocity may affect feasibility of drilling) and robustness (e.g., real-time constraints such as weather may force researchers to drill at sub-optimal locations near those originally planned) by incorporating a penalty reflecting accessibility as well as sensitivity to deviations from the optimal locations. Next, we extract vertical temperature profiles from these boreholes and formulate a Bayesian inverse problem to reconstruct past surface temperatures. Using a model of temperature advection/diffusion, the top boundary condition (corresponding to surface temperatures) is calibrated via efficient Markov chain Monte Carlo (MCMC). The overall procedure can then be iterated to choose new optimal borehole locations for the next expeditions. Through this work, we demonstrate powerful UQ methods for designing experiments, calibrating models, making predictions, and assessing sensitivity--all performed under an uncertain environment. We develop a theoretical framework as well as practical software within an intuitive workflow, and illustrate their usefulness for combining data and models for environmental and climate research.
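    The calibration step of such a workflow can be sketched with a random-walk Metropolis sampler that infers a surface temperature from a noisy borehole profile. The linear toy forward model, the prior, and the noise level below are assumptions for illustration only; the paper uses a temperature advection/diffusion model and a more efficient MCMC scheme.

```python
import numpy as np

rng = np.random.default_rng(5)

depths = np.linspace(0.0, 1.0, 20)                     # normalized borehole depths
def forward(t_surf):
    """Toy forward model: profile relaxes linearly from the surface to a basal value."""
    return t_surf * (1.0 - depths) + 2.0 * depths

t_true, sigma = -25.0, 0.3
data = forward(t_true) + sigma * rng.standard_normal(depths.size)

def log_post(t):
    log_prior = -0.5 * ((t + 20.0) / 10.0) ** 2        # N(-20, 10^2) prior on surface T
    log_like = -0.5 * np.sum((forward(t) - data) ** 2) / sigma ** 2
    return log_prior + log_like

chain, t, lp = [], -20.0, log_post(-20.0)
for _ in range(20000):
    prop = t + 0.5 * rng.standard_normal()
    lp_prop = log_post(prop)
    if np.log(rng.random()) < lp_prop - lp:            # Metropolis accept/reject
        t, lp = prop, lp_prop
    chain.append(t)
print("posterior mean surface temperature:", np.mean(chain[5000:]))
```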

  10. Generalized Knizhnik-Zamolodchikov equation for Ding-Iohara-Miki algebra

    NASA Astrophysics Data System (ADS)

    Awata, Hidetoshi; Kanno, Hiroaki; Mironov, Andrei; Morozov, Alexei; Morozov, Andrey; Ohkubo, Yusuke; Zenkevich, Yegor

    2017-07-01

    We derive the generalization of the Knizhnik-Zamolodchikov equation (KZE) associated with the Ding-Iohara-Miki algebra Uq,t(gl^^_1). We demonstrate that certain refined topological string amplitudes satisfy these equations and find that the braiding transformations are performed by the R matrix of Uq,t(gl^^_1). The resulting system is the uplifting of the û(1) Wess-Zumino-Witten model. The solutions to the (q,t) KZE are identified with the (spectral dual of) building blocks of the Nekrasov partition function for five-dimensional linear quiver gauge theories. We also construct an elliptic version of the KZE and discuss its modular and monodromy properties, the latter being related to a dual version of the KZE.

  11. The structure of the yeast NADH dehydrogenase (Ndi1) reveals overlapping binding sites for water- and lipid-soluble substrates.

    PubMed

    Iwata, Momi; Lee, Yang; Yamashita, Tetsuo; Yagi, Takao; Iwata, So; Cameron, Alexander D; Maher, Megan J

    2012-09-18

    Bioenergy is efficiently produced in the mitochondria by the respiratory system consisting of complexes I-V. In various organisms, complex I can be replaced by the alternative NADH-quinone oxidoreductase (NDH-2), which catalyzes the transfer of an electron from NADH via FAD to quinone, without proton pumping. The Ndi1 protein from Saccharomyces cerevisiae is a monotopic membrane protein, directed to the matrix. A number of studies have investigated the potential use of Ndi1 as a therapeutic agent against complex I disorders, and the NDH-2 enzymes have emerged as potential therapeutic targets for treatments against the causative agents of malaria and tuberculosis. Here we present the crystal structures of Ndi1 in its substrate-free, NAD(+)- and ubiquinone- (UQ2) complexed states. The structures reveal that Ndi1 is a peripheral membrane protein forming an intimate dimer, in which packing of the monomeric units within the dimer creates an amphiphilic membrane-anchor domain structure. Crucially, the structures of the Ndi1-NAD(+) and Ndi1-UQ2 complexes show overlapping binding sites for the NAD(+) and quinone substrates.

  12. Impact of implicit effects on uncertainties and sensitivities of the Doppler coefficient of a LWR pin cell

    NASA Astrophysics Data System (ADS)

    Hursin, Mathieu; Leray, Olivier; Perret, Gregory; Pautz, Andreas; Bostelmann, Friederike; Aures, Alexander; Zwermann, Winfried

    2017-09-01

    In the present work, the PSI and GRS sensitivity analysis (SA) and uncertainty quantification (UQ) methods, SHARK-X and XSUSA respectively, are compared for reactivity coefficient calculations; for reference, the results of the TSUNAMI and SAMPLER modules of the SCALE code package are also provided. The main objective of the paper is to assess the impact of the implicit effect, i.e., the effect of cross section perturbations on the self-shielding calculation, on the Doppler coefficient SA and UQ. Analyses are done for a Light Water Reactor (LWR) pin cell based on Phase I of the UAM LWR benchmark. Neglecting implicit effects in XSUSA and TSUNAMI leads to deviations of a few percent in the sensitivity profiles compared to SAMPLER and TSUNAMI (incl. implicit effects), except for 238U elastic scattering. The implicit effect is much larger for the SHARK-X calculations because of its coarser energy group structure between 10 eV and 10 keV compared to the applied SCALE libraries. It is concluded that the influence of the implicit effect strongly depends on the energy mesh of the nuclear data library of the neutron transport solver involved in the UQ calculations and may be magnified by the response considered.

  13. Uncertainty Quantification of Nonlinear Electrokinetic Response in a Microchannel-Membrane Junction

    NASA Astrophysics Data System (ADS)

    Alizadeh, Shima; Iaccarino, Gianluca; Mani, Ali

    2015-11-01

    We have conducted uncertainty quantification (UQ) for electrokinetic transport of ionic species through a hybrid microfluidic system using different probabilistic techniques. The system of interest is an H-configuration consisting of two parallel microchannels that are connected via a Nafion junction. This system is commonly used for ion preconcentration and stacking by utilizing a nonlinear response at the channel-Nafion junction that leads to deionization shocks. In this work, the Nafion medium is modeled as many parallel nano-pores where the nano-pore diameter, Nafion porosity, and surface charge density are independent random variables. We evaluated the resulting uncertainty on the ion concentration fields as well as the deionization shock location. The UQ methods predicted consistent statistics for the outputs, and the results revealed that the shock location is weakly sensitive to the nano-pore surface charge and primarily driven by nano-pore diameters. The present study can inform the design of electrokinetic networks with increased robustness to natural manufacturing variability. Applications include water desalination and lab-on-a-chip systems.

  14. Clinical communication skills learning outcomes among first year medical students are consistent irrespective of participation in an interview for admission to medical school.

    PubMed

    Casey, Mavourneen; Wilkinson, David; Fitzgerald, Jennifer; Eley, Diann; Connor, Jason

    2014-07-01

    Although contentious, most medical schools interview potential students to assess personal abilities such as communication. The aim was to investigate any differences in clinical communication skills (CCS) between first-year students admitted to the UQ medical school with or without an admissions interview. A retrospective analysis was conducted of 1495 student assessment scores obtained after structured communication skills training between 2007 and 2010. The average assessment score was 3.76 ([95% CI, 3.73-3.78]) and, adjusting for student characteristics, showed no main effect for interview (p = 0.89). The strongest predictor of scores was gender, with females achieving significantly higher scores (3.91 [95% CI, 3.54-4.28] vs. 3.76 [95% CI, 3.39-4.13]; p ≤ 0.001). Data show no differences in post-training assessment measures between students who were and were not interviewed during selection. Further research on the quality and retention of communication skills after training is warranted.

  15. A hybrid anchored-ANOVA - POD/Kriging method for uncertainty quantification in unsteady high-fidelity CFD simulations

    NASA Astrophysics Data System (ADS)

    Margheri, Luca; Sagaut, Pierre

    2016-11-01

    To significantly increase the contribution of numerical computational fluid dynamics (CFD) simulation to risk assessment and decision making, it is important to quantitatively measure the impact of uncertainties to assess the reliability and robustness of the results. As unsteady high-fidelity CFD simulations are becoming the standard for industrial applications, reducing the number of samples required to perform sensitivity analysis (SA) and uncertainty quantification (UQ) is a real engineering challenge. The novel approach presented in this paper is based on an efficient hybridization of the anchored-ANOVA and POD/Kriging methods, which have already been used in realistic CFD-UQ applications, and on the definition of best practices to achieve global accuracy. The anchored-ANOVA method is used to efficiently reduce the UQ dimension space, while POD/Kriging is used to smooth and interpolate each anchored-ANOVA term. The main advantages of the proposed method are illustrated through four applications of increasing complexity, most of them based on Large-Eddy Simulation as a high-fidelity CFD tool: a turbulent channel flow, the flow around an isolated bluff body, a pedestrian wind comfort study in a full-scale urban area, and an application to toxic gas dispersion in a full-scale city area. The proposed c-APK method (anchored-ANOVA-POD/Kriging) inherits the advantages of each key element: interpolation through POD/Kriging precludes the use of quadrature schemes, thereby allowing for a more flexible sampling strategy, while the ANOVA decomposition allows for better domain exploration. A comparison of the three methods is given for each application. In addition, the importance of adding flexibility to the control parameters and the choice of the quantity of interest (QoI) are discussed. As a result, global accuracy can be achieved with a reasonable number of samples, allowing computationally expensive CFD-UQ analysis.

  16. A Two-Ended Shooting Technique for Calculating Normal Modes in Underwater Acoustic Propagation,

    DTIC Science & Technology

    1985-09-01

    tnad~ I’or Public rL.00uq cmd ina ts 85 12 _ 8 126 UNLIMITED DISTRIBUTION * I ’ National Defence Defense Nationale Research and Bureau de Recherche...d6crit un algorithme de calcul des modes acoustiques normaux en mer. L’algoritbme est applicable & un prof il arbitraire do densit6 et do vitesse du son... profondeur do roncontre au milieu, habituellement pr&s du point do vitesse du son minimum. on amfiliore par it~ration la solution dlesaai jusqu’& ce

  17. Reflection K-matrices for a nineteen vertex model with Uq[osp(2|2)^(2)] symmetry

    NASA Astrophysics Data System (ADS)

    Vieira, R. S.; Lima Santos, A.

    2017-09-01

    We derive the solutions of the boundary Yang-Baxter equation associated with a supersymmetric nineteen vertex model constructed from the three-dimensional representation of the twisted quantum affine Lie superalgebra Uq[osp(2|2)^(2)]. We found three classes of solutions. The type I solution is characterized by three boundary free parameters and all elements of the corresponding reflection K-matrix are different from zero. In the type II solution, the reflection K-matrix is even (every element of the K-matrix with an odd parity is null) and it has only one boundary free parameter. Finally, the type III solution corresponds to a diagonal reflection K-matrix with two boundary free parameters.

  18. A General Uncertainty Quantification Methodology for Cloud Microphysical Property Retrievals

    NASA Astrophysics Data System (ADS)

    Tang, Q.; Xie, S.; Chen, X.; Zhao, C.

    2014-12-01

    The US Department of Energy (DOE) Atmospheric Radiation Measurement (ARM) program provides long-term (~20 years) ground-based cloud remote sensing observations. However, there are large uncertainties in the retrieval products of cloud microphysical properties based on the active and/or passive remote-sensing measurements. To address this uncertainty issue, a DOE Atmospheric System Research scientific focus study, Quantification of Uncertainties in Cloud Retrievals (QUICR), has been formed. In addition to an overview of recent progress of QUICR, we will demonstrate the capability of an observation-based general uncertainty quantification (UQ) methodology via the ARM Climate Research Facility baseline cloud microphysical properties (MICROBASE) product. This UQ method utilizes the Karhunen-Loève expansion (KLE) and the Central Limit Theorem (CLT) to quantify the retrieval uncertainties from observations and algorithm parameters. The input perturbations are imposed on major modes to take into account the cross correlations between input data, which greatly reduces the dimension of random variables (by up to a factor of 50) and quantifies vertically resolved full probability distribution functions of retrieved quantities. Moreover, this KLE/CLT approach has the capability of attributing the uncertainties in the retrieval output to individual uncertainty sources and thus sheds light on improving the retrieval algorithm and observations. We will present the results of a case study for the ice water content at the Southern Great Plains during an intensive observing period on March 9, 2000. This work is performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.
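    The KLE step amounts to an eigendecomposition of the input covariance so that perturbations are applied only along the leading modes. A minimal sketch is given below; the exponential covariance kernel, correlation length, and grid are assumptions for illustration and are not the MICROBASE input statistics.

```python
import numpy as np

rng = np.random.default_rng(6)

z = np.linspace(0.0, 10.0, 100)                    # vertical levels (km), illustrative
ell, sigma = 2.0, 1.0                              # assumed correlation length and std dev
cov = sigma ** 2 * np.exp(-np.abs(z[:, None] - z[None, :]) / ell)

eigval, eigvec = np.linalg.eigh(cov)
eigval, eigvec = eigval[::-1], eigvec[:, ::-1]     # sort modes by decreasing variance
k = int(np.searchsorted(np.cumsum(eigval) / eigval.sum(), 0.99)) + 1
print("modes retained for 99% of the variance:", k)

# One random input perturbation built from the leading k KL modes only
xi = rng.standard_normal(k)
perturbation = eigvec[:, :k] @ (np.sqrt(eigval[:k]) * xi)
print(perturbation[:5])
```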

  19. Investigation of Biotransport in a Tumor With Uncertain Material Properties Using a Nonintrusive Spectral Uncertainty Quantification Method.

    PubMed

    Alexanderian, Alen; Zhu, Liang; Salloum, Maher; Ma, Ronghui; Yu, Meilin

    2017-09-01

    In this study, statistical models are developed for modeling uncertain heterogeneous permeability and porosity in tumors, and the resulting uncertainties in pressure and velocity fields during an intratumoral injection are quantified using a nonintrusive spectral uncertainty quantification (UQ) method. Specifically, the uncertain permeability is modeled as a log-Gaussian random field, represented using a truncated Karhunen-Loève (KL) expansion, and the uncertain porosity is modeled as a log-normal random variable. The efficacy of the developed statistical models is validated by simulating the concentration fields with permeability and porosity of different uncertainty levels. The irregularity in the concentration field bears reasonable visual agreement with that in MicroCT images from experiments. The pressure and velocity fields are represented using polynomial chaos (PC) expansions to enable efficient computation of their statistical properties. The coefficients in the PC expansion are computed using a nonintrusive spectral projection method with Smolyak sparse quadrature. The developed UQ approach is then used to quantify the uncertainties in the random pressure and velocity fields. A global sensitivity analysis is also performed to assess the contribution of individual KL modes of the log-permeability field to the total variance of the pressure field. It is demonstrated that the developed UQ approach can effectively quantify the flow uncertainties induced by uncertain material properties of the tumor.

  20. Are alexithymia, ambivalence over emotional expression, and social insecurity overlapping constructs?

    PubMed

    Müller, Jochen; Bühner, Markus; Ziegler, Matthias; Sahin, Lâle

    2008-03-01

    The aim of the present study was to analyze the relationship and differential validity of three constructs related to reduced emotional expression. One hundred six patients of a psychosomatic clinic completed questionnaires assessing alexithymia (TAS-20, BVAQ), ambivalence over emotional expression (AEQ-G18), and social insecurity (UQ). A second-order principal component analysis with the scales of all questionnaires yielded three factors and revealed that the scale Competence Ambivalence assessed by the AEQ-G18 loaded on the same factor as the TAS-20 and BVAQ scales measuring Difficulties Describing and Identifying Feelings. A high correlation was found between the factor Social Insecurity (composed of all UQ scales) and the factor Difficulty Identifying and Describing Feelings (composed of BVAQ, TAS-20, and AEQ-G18 scales). In contrast, the factor Emotionalizing and External Thinking showed only low correlations with the remaining factors. The results of the present study did not support the view that the alexithymia facets related to difficulties identifying and describing feelings and Competence Ambivalence are distinct constructs when measured by self-report. This might be explained by methodological problems with the assessment of alexithymia and ambivalence. Furthermore, the results indicate that social insecurity is strongly related to the "difficulty identifying and describing feelings" facets of alexithymia and to effect ambivalence.

  1. Comparing the normalization methods for the differential analysis of Illumina high-throughput RNA-Seq data.

    PubMed

    Li, Peipei; Piao, Yongjun; Shon, Ho Sun; Ryu, Keun Ho

    2015-10-28

    Recently, rapid improvements in technology and decreases in sequencing costs have made RNA-Seq a widely used technique to quantify gene expression levels. Various normalization approaches have been proposed, owing to the importance of normalization in the analysis of RNA-Seq data. A comparison of recently proposed normalization methods is required to generate suitable guidelines for the selection of the most appropriate approach for future experiments. In this paper, we compared eight non-abundance (RC, UQ, Med, TMM, DESeq, Q, RPKM, and ERPKM) and two abundance estimation normalization methods (RSEM and Sailfish). The experiments were based on real Illumina high-throughput RNA-Seq of 35- and 76-nucleotide sequences produced in the MAQC project and on simulated reads. Reads were mapped to the human genome obtained from the UCSC Genome Browser Database. For precise evaluation, we investigated the Spearman correlation between the normalization results from RNA-Seq and MAQC qRT-PCR values for 996 genes. Based on this work, we showed that, of the eight non-abundance estimation normalization methods, RC, UQ, Med, TMM, DESeq, and Q gave similar normalization results for all data sets. For RNA-Seq of the 35-nucleotide sequences, RPKM showed the highest correlation, but for RNA-Seq of the 76-nucleotide sequences it showed the lowest correlation of all the methods. ERPKM did not improve on the results of RPKM. Between the two abundance estimation normalization methods, for RNA-Seq of the 35-nucleotide sequences, higher correlation was obtained with Sailfish than with RSEM, and both were better than not using abundance estimation methods. However, for RNA-Seq of the 76-nucleotide sequences, the results achieved by RSEM were similar to those obtained without applying abundance estimation methods, and were much better than those with Sailfish. Furthermore, we found that adding a poly-A tail increased alignment numbers but did not improve normalization results. Spearman correlation analysis revealed that RC, UQ, Med, TMM, DESeq, and Q did not noticeably improve gene expression normalization, regardless of read length. Other normalization methods were more efficient when alignment accuracy was low; Sailfish with RPKM gave the best normalization results. When alignment accuracy was high, RC was sufficient for gene expression calculation. We suggest ignoring the poly-A tail during differential gene expression analysis.
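    The evaluation criterion used here, the Spearman rank correlation between normalized RNA-Seq values and qRT-PCR references, is straightforward to compute with scipy. The sketch below uses synthetic expression values standing in for the 996 MAQC genes; the two "methods" are hypothetical and only illustrate how the comparison is scored.

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(7)

# Hypothetical values for 996 genes: qRT-PCR reference vs. two normalizations
qrt_pcr = rng.lognormal(mean=2.0, sigma=1.0, size=996)
norm_a = qrt_pcr * rng.lognormal(sigma=0.3, size=996)   # tracks the reference closely
norm_b = qrt_pcr * rng.lognormal(sigma=1.0, size=996)   # noisier normalization

for name, values in [("method A", norm_a), ("method B", norm_b)]:
    rho, _ = spearmanr(qrt_pcr, values)
    print(name, "Spearman rho =", round(rho, 3))
```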

  2. Uncertainty Quantification of the FUN3D-Predicted NASA CRM Flutter Boundary

    NASA Technical Reports Server (NTRS)

    Stanford, Bret K.; Massey, Steven J.

    2017-01-01

    A nonintrusive point collocation method is used to propagate parametric uncertainties of the flexible Common Research Model, a generic transport configuration, through the unsteady aeroelastic CFD solver FUN3D. A range of random input variables are considered, including atmospheric flow variables, structural variables, and inertial (lumped mass) variables. UQ results are explored for a range of output metrics (with a focus on dynamic flutter stability), for both subsonic and transonic Mach numbers, for two different CFD mesh refinements. A particular focus is placed on computing failure probabilities: the probability that the wing will flutter within the flight envelope.

  3. Calibration and Forward Uncertainty Propagation for Large-eddy Simulations of Engineering Flows

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Templeton, Jeremy Alan; Blaylock, Myra L.; Domino, Stefan P.

    2015-09-01

    The objective of this work is to investigate the efficacy of using calibration strategies from Uncertainty Quantification (UQ) to determine model coefficients for LES. As the target methods are for engineering LES, uncertainty from numerical aspects of the model must also be quantified. The ultimate goal of this research thread is to generate a cost-versus-accuracy curve for LES such that the cost could be minimized given an accuracy prescribed by an engineering need. Realization of this goal would enable LES to serve as a predictive simulation tool within the engineering design process.

  4. Wearable electronics

    NASA Astrophysics Data System (ADS)

    MNice; gee bee two, em; GrahamRounce

    2014-05-01

    In reply to the physicsworld.com news story “Graphene oxide could make textiles smarter” (10 March, http://ow.ly/uqScv), which described a new type of fibre with an electrochemical capacitance of up to 410 F/g.

  5. Aeras: A next generation global atmosphere model

    DOE PAGES

    Spotz, William F.; Smith, Thomas M.; Demeshko, Irina P.; ...

    2015-06-01

    Sandia National Laboratories is developing a new global atmosphere model named Aeras that is performance portable and supports the quantification of uncertainties. These next-generation capabilities are enabled by building Aeras on top of Albany, a code base that supports the rapid development of scientific application codes while leveraging Sandia's foundational mathematics and computer science packages in Trilinos and Dakota. Embedded uncertainty quantification (UQ) is an original design capability of Albany, and performance portability is a recent upgrade. Other required features, such as shell-type elements, spectral elements, efficient explicit and semi-implicit time-stepping, transient sensitivity analysis, and concurrent ensembles, were not components of Albany as the project began, and have been (or are being) added by the Aeras team. We present early UQ and performance portability results for the shallow water equations.

  6. CASL L2 milestone report: VUQ.Y1.03, "Enable statistical sensitivity and UQ demonstrations for VERA."

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sung, Yixing; Adams, Brian M.; Witkowski, Walter R.

    2011-04-01

    The CASL Level 2 Milestone VUQ.Y1.03, 'Enable statistical sensitivity and UQ demonstrations for VERA,' was successfully completed in March 2011. The VUQ focus area led this effort, in close partnership with AMA, and with support from VRI. DAKOTA was coupled to VIPRE-W thermal-hydraulics simulations representing reactors of interest to address crud-related challenge problems, in order to understand the sensitivity and uncertainty in simulation outputs with respect to uncertain operating and model form parameters. This report summarizes work coupling the software tools, characterizing uncertainties, selecting sensitivity and uncertainty quantification algorithms, and analyzing the results of iterative studies. These demonstration studies focused on sensitivity and uncertainty of the mass evaporation rate calculated by VIPRE-W, a key predictor for crud-induced power shift (CIPS).

  7. Electrochemical measurement of lateral diffusion coefficients of ubiquinones and plastoquinones of various isoprenoid chain lengths incorporated in model bilayers.

    PubMed Central

    Marchal, D; Boireau, W; Laval, J M; Moiroux, J; Bourdillon, C

    1998-01-01

    The long-range diffusion coefficients of isoprenoid quinones in a model lipid bilayer were determined by a method avoiding fluorescent probe labeling of the molecules. The quinone electron carriers were incorporated in supported dimyristoylphosphatidylcholine layers at physiological molar fractions (<3 mol%). The elaborate bilayer template contained a built-in gold electrode at which the redox molecules solubilized in the bilayer were reduced or oxidized. The lateral diffusion coefficient of a natural quinone like UQ10 or PQ9 was 2.0 ± 0.4 × 10^-8 cm² s^-1 at 30 °C, two to three times smaller than the diffusion coefficient of a lipid analog in the same artificial bilayer. The lateral mobilities of the oxidized or reduced forms could be determined separately and were found to be identical in the 4-13 pH range. For a series of isoprenoid quinones, UQ2 or PQ2 to UQ10, the diffusion coefficient exhibited a marked dependence on the length of the isoprenoid chain. The data fit very well the quantitative behavior predicted by a continuum fluid model in which the isoprenoid chains are taken as rigid particles moving in the less viscous part of the bilayer and rubbing against the more viscous layers of lipid heads. The present study supports the concept of a homogeneous pool of quinone located in the less viscous region of the bilayer. PMID:9545054

  8. Turbulent dispersion of slightly buoyant oil droplets and turbulent breakup of crude oil droplets mixed with dispersants

    NASA Astrophysics Data System (ADS)

    Gopalan, Balaji

    In part I, high-speed in-line digital holographic cinematography is used for studying turbulent diffusion of slightly buoyant 0.5-1.2 mm diameter diesel droplets (specific gravity of 0.85) and 50 μm diameter neutral-density particles. Experiments are performed in a 50 × 50 × 70 mm³ sample volume in a controlled, nearly isotropic turbulence facility, which is characterized by 2-D PIV. An automated tracking program has been used for measuring the velocity time history of more than 17000 droplets and 15000 particles. The PDFs of droplet velocity fluctuations are close to Gaussian for all turbulent intensities (u'_i). The mean rise velocity of droplets is enhanced or suppressed, compared to the quiescent rise velocity (U_q), depending on Stokes number at lower turbulence levels, but becomes unconditionally enhanced at higher turbulence levels. The horizontal droplet velocity rms exceeds the fluid velocity rms for most of the data, while the vertical ones are higher than the fluid only at the highest turbulence level. The scaled droplet horizontal diffusion coefficient is higher than the vertical one for 1 < u'_i/U_q < 5, consistent with trends of the droplet velocity fluctuations. Conversely, the scaled droplet horizontal diffusion timescale is smaller than the vertical one due to the crossing-trajectories effect. The droplet diffusion coefficients scaled by the product of turbulence intensity and an integral length scale are a monotonically increasing function of u'_i/U_q. Part II of this work explains the formation of micron-sized droplets in turbulent flows from crude oil droplets pre-mixed with dispersants. Experimental visualization shows that this breakup starts with the formation of very long and quite stable, single or multiple micro-threads that trail behind millimeter-sized droplets. These threads form in regions with a localized increase in concentration of surfactant, which in turn depends on the flow around the droplet. The resulting reduction of local surface tension, aided by high oil viscosity and stretching by the flow, suppresses capillary breakup and explains the stability of these threads. Due to increasing surface area and diffusion of dispersants into the continuous phase, the threads eventually break up into ~3 μm droplets.

  9. Identifying key controls on the behavior of an acidic-U(VI) plume in the Savannah River Site using reactive transport modeling.

    PubMed

    Bea, Sergio A; Wainwright, Haruko; Spycher, Nicolas; Faybishenko, Boris; Hubbard, Susan S; Denham, Miles E

    2013-08-01

    Acidic low-level radioactive waste solutions were discharged to three unlined seepage basins at the F-Area of the Department of Energy (DOE) Savannah River Site (SRS), South Carolina, USA, from 1955 through 1989. Despite many years of active remediation, the groundwater remains acidic and contaminated with significant levels of U(VI) and other radionuclides. Monitored Natural Attenuation (MNA) is a desired closure strategy for the site, based on the premise that regional flow of clean background groundwater will eventually neutralize the groundwater acidity, immobilizing U(VI) through adsorption. An in situ treatment system is currently in place to accelerate this process in the downgradient portion of the plume, and similar measures could be taken upgradient if necessary. Understanding the long-term pH and U(VI) adsorption behavior at the site is critical to assessing the feasibility of MNA along with the in situ remediation treatments. This paper presents a reactive transport (RT) model and uncertainty quantification (UQ) analyses to explore key controls on the U(VI)-plume evolution and long-term mobility at this site. Two-dimensional numerical RT simulations are run that include the saturated and unsaturated (vadose) zones, U(VI) and H(+) adsorption (surface complexation) onto sediments, dissolution and precipitation of Al and Fe minerals, and key hydrodynamic processes. UQ techniques are applied using a new open-source tool that is part of the developing ASCEM reactive transport modeling and analysis framework to: (1) identify the complex physical and geochemical processes that control the U(VI) plume migration in the pH range where the plume is highly mobile, (2) evaluate which physical and geochemical parameters are most controlling, and (3) predict the future plume evolution constrained by historical chemical and hydrological data. The RT simulation results show good agreement with the observed historical pH and with the U(VI), nitrate and Al concentrations at multiple locations. Mineral dissolution and precipitation combined with adsorption reactions on goethite and kaolinite (the main minerals present with quartz) could buffer pH at the site for long periods of time. UQ analysis using the Morris one-at-a-time (OAT) method indicates that the model is most sensitive to the pH of the waste solution, the discharge rates, and the reactive surface area available for adsorption. However, as a key finding, UQ analysis also indicates that this model (and parameter) sensitivity evolves in space and time, and understanding this evolution could be crucial for assessing the temporal efficiency of a remediation strategy at contaminated sites. Results also indicate that residual U(VI) and H(+) adsorbed in the vadose zone, as well as aquifer permeability, could have a significant impact on the long-term mobility of the acidic plume. Copyright © 2013 Elsevier B.V. All rights reserved.
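
    As a rough illustration of the Morris one-at-a-time (OAT) screening cited above, the following self-contained Python/NumPy sketch computes elementary-effect statistics for a toy model; the model function, parameter bounds, and trajectory settings are placeholders and do not represent the ASCEM tool's interface or the actual plume model.

    ```python
    import numpy as np

    def morris_mu_star(model, lower, upper, n_trajectories=20, delta=0.25, seed=0):
        """Mean absolute elementary effect (mu*) per parameter via Morris OAT trajectories."""
        rng = np.random.default_rng(seed)
        k = len(lower)
        abs_effects = np.zeros((n_trajectories, k))
        for t in range(n_trajectories):
            x = rng.uniform(0.0, 1.0 - delta, size=k)      # base point in the unit cube
            y_prev = model(lower + x * (upper - lower))
            for i in rng.permutation(k):                    # perturb one factor at a time
                x[i] += delta
                y_new = model(lower + x * (upper - lower))
                abs_effects[t, i] = abs((y_new - y_prev) / delta)
                y_prev = y_new
        return abs_effects.mean(axis=0)                     # large mu* = influential parameter

    # Hypothetical usage with a toy 3-parameter response (the third input is inert by design):
    toy_model = lambda p: p[0] ** 2 + 0.1 * p[1]
    print(morris_mu_star(toy_model, lower=np.zeros(3), upper=np.ones(3)))
    ```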

  10. Good Models Gone Bad: Quantifying and Predicting Parameter-Induced Climate Model Simulation Failures

    NASA Astrophysics Data System (ADS)

    Lucas, D. D.; Klein, R.; Tannahill, J.; Brandon, S.; Covey, C. C.; Domyancic, D.; Ivanova, D. P.

    2012-12-01

    Simulations using IPCC-class climate models can fail or crash for a variety of reasons. Statistical analysis of the failures can yield useful insights to better understand and improve the models. During the course of uncertainty quantification (UQ) ensemble simulations to assess the effects of ocean model parameter uncertainties on climate simulations, we experienced a series of simulation failures of the Parallel Ocean Program (POP2). About 8.5% of our POP2 runs failed for numerical reasons at certain combinations of parameter values. We apply support vector machine (SVM) classification from the fields of pattern recognition and machine learning to quantify and predict the probability of failure as a function of the values of 18 POP2 parameters. The SVM classifiers readily predict POP2 failures in an independent validation ensemble, and are subsequently used to determine the causes of the failures via a global sensitivity analysis. Four parameters related to ocean mixing and viscosity are identified as the major sources of POP2 failures. Our method can be used to improve the robustness of complex scientific models to parameter perturbations and to better steer UQ ensembles. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344 and was funded by the Uncertainty Quantification Strategic Initiative Laboratory Directed Research and Development Project at LLNL under project tracking code 10-SI-013 (UCRL LLNL-ABS-569112).
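
    A minimal scikit-learn sketch of the classification idea described above: fit an SVM on (parameter vector, failed/completed) labels and use its predicted failure probability to screen new parameter sets. The 18-column input matrix and the failure rule are synthetic stand-ins, not the actual POP2 ensemble.

    ```python
    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    rng = np.random.default_rng(1)
    X = rng.uniform(size=(2000, 18))                       # stand-in for 18 scaled model parameters
    y = (X[:, 0] + 0.5 * X[:, 3] > 1.2).astype(int)        # synthetic failure region (~9% of runs)

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", probability=True))
    clf.fit(X_tr, y_tr)

    p_fail = clf.predict_proba(X_te)[:, 1]                 # failure probability for unseen runs
    print("held-out accuracy:", round(clf.score(X_te, y_te), 3))
    print("runs flagged as likely failures:", int((p_fail > 0.5).sum()))
    ```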

  11. Strategies for Reduced-Order Models in Uncertainty Quantification of Complex Turbulent Dynamical Systems

    NASA Astrophysics Data System (ADS)

    Qi, Di

    Turbulent dynamical systems are ubiquitous in science and engineering. Uncertainty quantification (UQ) in turbulent dynamical systems is a grand challenge where the goal is to obtain statistical estimates for key physical quantities. In the development of a proper UQ scheme for systems characterized by both a high-dimensional phase space and a large number of instabilities, significant model errors compared with the true natural signal are always unavoidable due to both the imperfect understanding of the underlying physical processes and the limited computational resources available. One central issue in contemporary research is the development of a systematic methodology for reduced order models that can recover the crucial features both with model fidelity in statistical equilibrium and with model sensitivity in response to perturbations. In the first part, we discuss a general mathematical framework to construct statistically accurate reduced-order models that have skill in capturing the statistical variability in the principal directions of a general class of complex systems with quadratic nonlinearity. A systematic hierarchy of simple statistical closure schemes, which are built through new global statistical energy conservation principles combined with statistical equilibrium fidelity, are designed and tested for UQ of these problems. Second, the capacity of imperfect low-order stochastic approximations to model extreme events in a passive scalar field advected by turbulent flows is investigated. The effects in complicated flow systems are considered including strong nonlinear and non-Gaussian interactions, and much simpler and cheaper imperfect models with model error are constructed to capture the crucial statistical features in the stationary tracer field. Several mathematical ideas are introduced to improve the prediction skill of the imperfect reduced-order models. Most importantly, empirical information theory and statistical linear response theory are applied in the training phase for calibrating model errors to achieve optimal imperfect model parameters; and total statistical energy dynamics are introduced to improve the model sensitivity in the prediction phase especially when strong external perturbations are exerted. The validity of reduced-order models for predicting statistical responses and intermittency is demonstrated on a series of instructive models with increasing complexity, including the stochastic triad model, the Lorenz '96 model, and models for barotropic and baroclinic turbulence. The skillful low-order modeling methods developed here should also be useful for other applications such as efficient algorithms for data assimilation.

  12. Uncertainty quantification of fast sodium current steady-state inactivation for multi-scale models of cardiac electrophysiology.

    PubMed

    Pathmanathan, Pras; Shotwell, Matthew S; Gavaghan, David J; Cordeiro, Jonathan M; Gray, Richard A

    2015-01-01

    Perhaps the most mature area of multi-scale systems biology is the modelling of the heart. Current models are grounded in over fifty years of research in the development of biophysically detailed models of the electrophysiology (EP) of cardiac cells, but one aspect which is inadequately addressed is the incorporation of uncertainty and physiological variability. Uncertainty quantification (UQ) is the identification and characterisation of the uncertainty in model parameters derived from experimental data, and the computation of the resultant uncertainty in model outputs. It is a necessary tool for establishing the credibility of computational models, and will likely be expected of EP models for future safety-critical clinical applications. The focus of this paper is formal UQ of one major sub-component of cardiac EP models, the steady-state inactivation of the fast sodium current, INa. To better capture average behaviour and quantify variability across cells, we have applied for the first time an 'individual-based' statistical methodology to assess voltage clamp data. Advantages of this approach over a more traditional 'population-averaged' approach are highlighted. The method was used to characterise variability amongst cells isolated from canine epi- and endocardium, and this variability was then 'propagated forward' through a canine model to determine the resultant uncertainty in model predictions at different scales, such as upstroke velocity and spiral wave dynamics. Statistically significant differences between epi- and endocardial cells (greater half-inactivation and a less steep slope of the steady-state inactivation curve for endo) were observed, and the forward propagation revealed a lack of robustness of the model to underlying variability, but also surprising robustness to variability at the tissue scale. Overall, the methodology can be used to: (i) better analyse voltage clamp data; (ii) characterise underlying population variability; (iii) investigate consequences of variability; and (iv) improve the ability to validate a model. To our knowledge this article is the first to quantify population variability in membrane dynamics in this manner, and the first to perform formal UQ for a component of a cardiac model. The approach is likely to find much wider applicability across systems biology as current application domains reach greater levels of maturity. Published by Elsevier Ltd.
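
    A hedged sketch of the "propagated forward" step: sample cell-to-cell variability in the parameters of a Boltzmann steady-state inactivation curve and summarize the spread in the resulting curves. The Boltzmann form is the standard description of INa steady-state inactivation, but the means and standard deviations below are invented for illustration, not the fitted canine values.

    ```python
    import numpy as np

    def h_inf(v, v_half, k):
        """Boltzmann steady-state inactivation: h_inf(V) = 1 / (1 + exp((V - V_half) / k))."""
        return 1.0 / (1.0 + np.exp((v - v_half) / k))

    rng = np.random.default_rng(0)
    n_cells = 500
    v_half = rng.normal(-80.0, 3.0, n_cells)   # half-inactivation voltage (mV), illustrative spread
    slope = rng.normal(6.0, 0.8, n_cells)      # slope factor (mV), illustrative spread

    v_grid = np.linspace(-120.0, -40.0, 161)
    curves = h_inf(v_grid[None, :], v_half[:, None], slope[:, None])   # one curve per sampled cell

    lo, mid, hi = np.percentile(curves, [2.5, 50.0, 97.5], axis=0)     # forward-propagated band
    idx = np.searchsorted(v_grid, -80.0)
    print(f"h_inf at -80 mV: median {mid[idx]:.2f}, 95% band [{lo[idx]:.2f}, {hi[idx]:.2f}]")
    ```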

  13. Ensemble-based uncertainty quantification for coordination and control of thermostatically controlled loads

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Weixuan; Lian, Jianming; Engel, Dave

    2017-07-27

    This paper presents a general uncertainty quantification (UQ) framework that provides a systematic analysis of the uncertainty involved in the modeling of a control system, and helps to improve the performance of a control strategy.

  14. Dawn Usage, Scheduling, and Governance Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Louis, S

    2009-11-02

    This document describes Dawn use, scheduling, and governance concerns. Users started running full-machine science runs in early April 2009 during the initial open shakedown period. Scheduling Dawn while in the Open Computing Facility (OCF) was controlled and coordinated via phone calls, emails, and a small number of controlled banks. With Dawn moving to the Secure Computing Facility (SCF) in fall of 2009, a more detailed scheduling and governance model is required. The three major objectives are: (1) Ensure Dawn resources are allocated on a program priority-driven basis; (2) Utilize Dawn resources on the job mixes for which they were intended; and (3) Minimize idle cycles through use of partitions, banks and proper job mix. The SCF workload for Dawn will be inherently different from that of Purple or BG/L, and therefore needs a different approach. Dawn's primary function is to permit adequate access for tri-lab code development in preparation for Sequoia, and in particular for weapons multi-physics codes in support of UQ. A second purpose is to provide time allocations for large-scale science runs and for UQ suite calculations to advance SSP program priorities. This proposed governance model will be the basis for initial time allocation of Dawn computing resources for the science and UQ workloads that merit priority on this class of resource, either because they cannot be reasonably attempted on any other resources due to the size of the problem, or because of the unavailability of sizable allocations on other ASC capability or capacity platforms. This proposed model intends to make the most effective use of Dawn possible, but without being overly constrained by more formal proposal processes such as those now used for Purple CCCs.

  15. Embedded Model Error Representation and Propagation in Climate Models

    NASA Astrophysics Data System (ADS)

    Sargsyan, K.; Ricciuto, D. M.; Safta, C.; Thornton, P. E.

    2017-12-01

    Over the last decade, parametric uncertainty quantification (UQ) methods have reached a level of maturity, while the same cannot be said about the representation and quantification of structural or model errors. The lack of characterization of model errors, induced by physical assumptions, phenomenological parameterizations or constitutive laws, is a major handicap in predictive science. In climate models, for example, significant computational resources are dedicated to model calibration without gaining improvement in predictive skill. Neglecting model errors during calibration/tuning will lead to overconfident and biased model parameters. At the same time, the most advanced methods accounting for model error merely correct output biases, augmenting model outputs with statistical error terms that can potentially violate physical laws or make the calibrated model ineffective for extrapolative scenarios. This work will overview a principled path for representing and quantifying model errors, as well as propagating them together with the rest of the predictive uncertainty budget, including data noise, parametric uncertainties and surrogate-related errors. Namely, the model error terms will be embedded in select model components rather than added as external corrections. Such embedding ensures consistency with physical constraints on model predictions, and renders calibrated model predictions meaningful and robust with respect to model errors. Besides, in the presence of observational data, the approach can effectively differentiate model structural deficiencies from those of data acquisition. The methodology is implemented in the UQ Toolkit (www.sandia.gov/uqtoolkit), relying on a host of available forward and inverse UQ tools. We will demonstrate the application of the technique on a few applications of interest, including ACME Land Model calibration using a wide range of measurements obtained at select sites.

  16. Effect of normalization methods on the performance of supervised learning algorithms applied to HTSeq-FPKM-UQ data sets: 7SK RNA expression as a predictor of survival in patients with colon adenocarcinoma.

    PubMed

    Shahriyari, Leili

    2017-11-03

    One of the main challenges in machine learning (ML) is choosing an appropriate normalization method. Here, we examine the effect of various normalization methods on analyzing FPKM upper quartile (FPKM-UQ) RNA sequencing data sets. We collect the HTSeq-FPKM-UQ files of patients with colon adenocarcinoma from the TCGA-COAD project. We compare the three most common normalization methods: scaling, standardizing using the z-score, and vector normalization, by visualizing the normalized data set and evaluating the performance of 12 supervised learning algorithms on the normalized data set. Additionally, for each of these normalization methods, we use two different normalization strategies: normalizing samples (files) or normalizing features (genes). Regardless of normalization method, a support vector machine (SVM) model with the radial basis function kernel had the maximum accuracy (78%) in predicting the vital status of the patients. However, the fitting time of the SVM depended on the normalization method, and it reached its minimum fitting time when files were normalized to unit length. Furthermore, among all 12 learning algorithms and 6 different normalization techniques, the Bernoulli naive Bayes model after standardizing files had the best performance in terms of maximizing the accuracy as well as minimizing the fitting time. We also investigated the effect of dimensionality reduction methods on the performance of the supervised ML algorithms. Reducing the dimension of the data set did not increase the maximum accuracy of 78%. However, it led to the discovery of 7SK RNA gene expression as a predictor of survival in patients with colon adenocarcinoma with an accuracy of 78%. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com
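
    A small scikit-learn sketch of the comparison described above: apply scaling, z-score standardization, and unit-length (vector) normalization before an RBF-kernel SVM and compare cross-validated accuracy. The expression matrix and labels are random stand-ins, not the TCGA-COAD HTSeq-FPKM-UQ files.

    ```python
    import numpy as np
    from sklearn.model_selection import cross_val_score
    from sklearn.preprocessing import MinMaxScaler, Normalizer, StandardScaler
    from sklearn.svm import SVC

    rng = np.random.default_rng(0)
    X = rng.lognormal(mean=2.0, sigma=1.0, size=(200, 500))   # stand-in FPKM-UQ expression values
    y = rng.integers(0, 2, size=200)                          # stand-in vital-status labels

    methods = {
        "scaling (per gene)":     MinMaxScaler(),    # features scaled to [0, 1]
        "z-score (per gene)":     StandardScaler(),  # features standardized
        "unit length (per file)": Normalizer(),      # each sample normalized to unit norm
    }
    for name, transform in methods.items():
        acc = cross_val_score(SVC(kernel="rbf"), transform.fit_transform(X), y, cv=5).mean()
        print(f"{name:>22s}: mean CV accuracy = {acc:.3f}")
    ```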

  17. Inverse regression-based uncertainty quantification algorithms for high-dimensional models: Theory and practice

    NASA Astrophysics Data System (ADS)

    Li, Weixuan; Lin, Guang; Li, Bing

    2016-09-01

    Many uncertainty quantification (UQ) approaches suffer from the curse of dimensionality, that is, their computational costs become intractable for problems involving a large number of uncertainty parameters. In these situations, the classic Monte Carlo (MC) method often remains the method of choice because its convergence rate, O(n^(-1/2)), where n is the required number of model simulations, does not depend on the dimension of the problem. However, many high-dimensional UQ problems are intrinsically low-dimensional, because the variation of the quantity of interest (QoI) is often caused by only a few latent parameters varying within a low-dimensional subspace, known as the sufficient dimension reduction (SDR) subspace in the statistics literature. Motivated by this observation, we propose two inverse regression-based UQ algorithms (IRUQ) for high-dimensional problems. Both algorithms use inverse regression to convert the original high-dimensional problem to a low-dimensional one, which is then efficiently solved by building a response surface for the reduced model, for example via the polynomial chaos expansion. The first algorithm, for situations where an exact SDR subspace exists, is proved to converge at rate O(n^(-1)), hence much faster than MC. The second algorithm, which does not require an exact SDR, employs the reduced model as a control variate to reduce the error of the MC estimate. The accuracy gain can still be significant, depending on how well the reduced model approximates the original high-dimensional one. IRUQ also provides several additional practical advantages: it is non-intrusive; it does not require computing the high-dimensional gradient of the QoI; and it reports an error bar so the user knows how reliable the result is.
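
    A toy sketch of the second algorithm's control-variate idea: a cheap reduced model z, correlated with the full model y and with (approximately) known mean, lowers the variance of the plain MC estimate. The 100-dimensional test functions below are invented purely for illustration and are not the IRUQ implementation.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    d, n = 100, 2000
    X = rng.standard_normal((n, d))

    full_model = lambda x: np.sin(x[:, :3].sum(axis=1)) + 0.01 * x.sum(axis=1)   # "expensive" QoI
    reduced    = lambda x: np.sin(x[:, :3].sum(axis=1))                          # low-dim surrogate

    y, z = full_model(X), reduced(X)
    mu_z = 0.0                                           # reduced-model mean (known by symmetry here)
    beta = np.cov(y, z)[0, 1] / np.var(z, ddof=1)        # near-optimal control-variate weight
    cv_estimate = y.mean() - beta * (z.mean() - mu_z)

    var_plain = np.var(y, ddof=1)
    var_cv = np.var(y - beta * (z - mu_z), ddof=1)
    print(f"plain MC: {y.mean():.4f}   control variate: {cv_estimate:.4f}   "
          f"variance reduction ~ x{var_plain / var_cv:.0f}")
    ```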

  18. Normative data and the influence of age and gender on power, balance, flexibility, and functional movement in healthy service members.

    PubMed

    Teyhen, Deydre S; Riebel, Mark A; McArthur, Derrick R; Savini, Matthew; Jones, Mackenzie J; Goffar, Stephen L; Kiesel, Kyle B; Plisky, Phillip J

    2014-04-01

    Determine the influence of age and sex and describe normative data on field-expedient tests associated with power, balance, trunk stability, mobility, and functional movement in a military population. Participants (n = 247) completed a series of clinical and functional tests, including closed-chain ankle dorsiflexion (DF), Functional Movement Screen (FMS), Y-Balance Test Lower Quarter (YBT-LQ), Y-Balance Test Upper Quarter (YBT-UQ), single leg vertical jump (SLVJ), 6-m timed hop (6-m timed), and triple hop. Descriptive statistics were calculated. Analysis of variance tests were performed to compare the results based on sex and age (<30 years, >30 years). Service members demonstrated DF of 34.2 ± 6.1°, FMS composite score of 16.2 ± 2.2, YBT-LQ normalized composite score of 96.9 ± 8.6%, YBT-UQ normalized composite score of 87.6 ± 9.6%, SLVJ of 26.9 ± 8.6 cm, 6-m hop of 2.4 ± 0.5 seconds, and a triple hop of 390.9 ± 110.8 cm. Men performed better than women (p < 0.05) on the YBT-LQ, YBT-UQ, SLVJ, 6-m timed hop, and triple hop. Those <30 years of age performed better than older participants (p < 0.05) on the DF, FMS, YBT-LQ, SLVJ, 6-m hop, and triple hop. Findings provide normative data on military members. Men performed better on power, balance, and trunk stability tests, whereas younger individuals performed better on power, balance, mobility, and functional movement. Reprint & Copyright © 2014 Association of Military Surgeons of the U.S.

  19. Superfund Record of Decision: National Starch & Chemical ...

    EPA Pesticide Factsheets

  20. Non-intrusive uncertainty quantification of computational fluid dynamics simulations: notes on the accuracy and efficiency

    NASA Astrophysics Data System (ADS)

    Zimoń, Małgorzata; Sawko, Robert; Emerson, David; Thompson, Christopher

    2017-11-01

    Uncertainty quantification (UQ) is increasingly becoming an indispensable tool for assessing the reliability of computational modelling. Efficient handling of stochastic inputs, such as boundary conditions, physical properties or geometry, increases the utility of model results significantly. We discuss the application of non-intrusive generalised polynomial chaos techniques in the context of fluid engineering simulations. Deterministic and Monte Carlo integration rules are applied to a set of problems, including ordinary differential equations and the computation of aerodynamic parameters subject to random perturbations. In particular, we analyse acoustic wave propagation in a heterogeneous medium to study the effects of mesh resolution, transients, number and variability of stochastic inputs. We consider variants of multi-level Monte Carlo and perform a novel comparison of the methods with respect to numerical and parametric errors, as well as computational cost. The results provide a comprehensive view of the necessary steps in UQ analysis and demonstrate some key features of stochastic fluid flow systems.
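
    A minimal sketch of the non-intrusive idea for a single Gaussian input: run the deterministic solver only at Gauss-Hermite quadrature nodes and recover the output mean and variance from the weighted evaluations. The decay ODE and its parameter distribution are toys standing in for a CFD solver and its uncertain boundary condition.

    ```python
    import numpy as np
    from numpy.polynomial.hermite_e import hermegauss   # probabilists' Gauss-Hermite rule

    def solver(k, t_final=1.0):
        """Toy deterministic 'solver': decay ODE y' = -k*y with y(0) = 1, reported at t_final."""
        return np.exp(-k * t_final)

    # Uncertain input k ~ N(1.0, 0.1^2): evaluate the solver only at the quadrature nodes.
    nodes, weights = hermegauss(8)
    outputs = np.array([solver(1.0 + 0.1 * xi) for xi in nodes])

    w = weights / np.sqrt(2.0 * np.pi)        # normalize weights for the standard normal density
    mean = np.sum(w * outputs)
    variance = np.sum(w * outputs**2) - mean**2
    print(f"E[y(1)] ~ {mean:.5f}, Var[y(1)] ~ {variance:.3e}")
    ```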

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gilkey, Lindsay

    This milestone presents a demonstration of the High-to-Low (Hi2Lo) process in the VVI focus area. Validation and additional calculations with the commercial computational fluid dynamics code STAR-CCM+ were performed using a 5x5 fuel assembly with non-mixing geometry and spacer grids. This geometry was based on the benchmark experiment provided by Westinghouse. Results from the simulations were compared to existing experimental data and to the subchannel thermal-hydraulics code COBRA-TF (CTF). An uncertainty quantification (UQ) process was developed for the STAR-CCM+ model, and the results of the STAR UQ were communicated to CTF. Results from STAR-CCM+ simulations were used as experimental design points in CTF to calibrate the mixing parameter β, and the results were compared to those obtained using experimental data points. This demonstrated that CTF's β parameter can be calibrated to match existing experimental data more closely. The Hi2Lo process for the STAR-CCM+/CTF code coupling was documented in this milestone and in the closely linked L3:VVI.H2LP15.01 milestone report.

  2. High Mach Number Scramjet Test Flows in the X3 Expansion Tube

    NASA Astrophysics Data System (ADS)

    Gildfind, D. E.; Sancho, J.; Morgan, R. G.

    The University of Queensland (UQ) has two free-piston driven expansion tube facilities; X2 has a total length of 23 m and was originally commissioned in 1995 [1]; X3 is much longer at 62 m, and was commissioned in 2001 [2].

  3. Uncertainty quantification in nanomechanical measurements using the atomic force microscope

    Treesearch

    Ryan Wagner; Robert Moon; Jon Pratt; Gordon Shaw; Arvind Raman

    2011-01-01

    Quantifying uncertainty in measured properties of nanomaterials is a prerequisite for the manufacture of reliable nanoengineered materials and products. Yet, rigorous uncertainty quantification (UQ) is rarely applied for material property measurements with the atomic force microscope (AFM), a widely used instrument that can measure properties at nanometer scale...

  4. Design and Commissioning of a New Lightweight Piston for the X3 Expansion Tube

    NASA Astrophysics Data System (ADS)

    Gildfind, D. E.; Morgan, R. G.; Sancho, J.

    The University of Queensland's (UQ) X3 facility (Figure 1) is the world's largest free-piston driven expansion tube. It is used to generate hypersonic test flows such as simulation of planetary entry (6-15 km/s) or scramjet flight (3-5 km/s).

  5. Investigation of advanced UQ for CRUD prediction with VIPRE.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Eldred, Michael Scott

    2011-09-01

    This document summarizes the results from a level 3 milestone study within the CASL VUQ effort. It demonstrates the application of 'advanced UQ,' in particular dimension-adaptive p-refinement for polynomial chaos and stochastic collocation. The study calculates statistics for several quantities of interest that are indicators for the formation of CRUD (Chalk River unidentified deposit), which can lead to CIPS (CRUD induced power shift). Stochastic expansion methods are attractive methods for uncertainty quantification due to their fast convergence properties. For smooth functions (i.e., analytic, infinitely-differentiable) in L² (i.e., possessing finite variance), exponential convergence rates can be obtained under order refinement for integrated statistical quantities of interest such as mean, variance, and probability. Two stochastic expansion methods are of interest: nonintrusive polynomial chaos expansion (PCE), which computes coefficients for a known basis of multivariate orthogonal polynomials, and stochastic collocation (SC), which forms multivariate interpolation polynomials for known coefficients. Within the DAKOTA project, recent research in stochastic expansion methods has focused on automated polynomial order refinement ('p-refinement') of expansions to support scalability to higher dimensional random input spaces [4, 3]. By preferentially refining only in the most important dimensions of the input space, the applicability of these methods can be extended from O(10⁰)-O(10¹) random variables to O(10²) and beyond, depending on the degree of anisotropy (i.e., the extent to which random input variables have differing degrees of influence on the statistical quantities of interest (QOIs)). Thus, the purpose of this study is to investigate the application of these adaptive stochastic expansion methods to the analysis of CRUD using the VIPRE simulation tools for two different plant models of differing random dimension, anisotropy, and smoothness.

  6. Sparse Polynomial Chaos Surrogate for ACME Land Model via Iterative Bayesian Compressive Sensing

    NASA Astrophysics Data System (ADS)

    Sargsyan, K.; Ricciuto, D. M.; Safta, C.; Debusschere, B.; Najm, H. N.; Thornton, P. E.

    2015-12-01

    For computationally expensive climate models, Monte Carlo approaches to exploring the input parameter space are often prohibitive due to slow convergence with respect to ensemble size. To alleviate this, we build inexpensive surrogates using uncertainty quantification (UQ) methods employing Polynomial Chaos (PC) expansions that approximate the input-output relationships using as few model evaluations as possible. However, when many uncertain input parameters are present, such UQ studies suffer from the curse of dimensionality. In particular, for 50-100 input parameters, non-adaptive PC representations require infeasible numbers of basis terms. To this end, we develop and employ Weighted Iterative Bayesian Compressive Sensing to learn the most important input parameter relationships for efficient, sparse PC surrogate construction, with the posterior uncertainty due to insufficient data quantified. Besides drastic dimensionality reduction, the uncertain surrogate can efficiently replace the model in computationally intensive studies such as forward uncertainty propagation and variance-based sensitivity analysis, as well as design optimization and parameter estimation using observational data. We applied the surrogate construction and variance-based uncertainty decomposition to the Accelerated Climate Model for Energy (ACME) Land Model for several output QoIs at nearly 100 FLUXNET sites covering multiple plant functional types and climates, varying 65 input parameters over broad ranges of possible values. This work is supported by the U.S. Department of Energy, Office of Science, Biological and Environmental Research, Accelerated Climate Modeling for Energy (ACME) project. Sandia National Laboratories is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000.
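
    A hedged sketch of building a sparse polynomial chaos surrogate from few model runs; here an ordinary cross-validated Lasso stands in for the Weighted Iterative Bayesian Compressive Sensing used in the study, and a 20-parameter toy model replaces the land model. The idea is the same: an underdetermined regression on a Legendre basis in which only a few terms survive.

    ```python
    import numpy as np
    from itertools import combinations_with_replacement
    from numpy.polynomial.legendre import legval
    from sklearn.linear_model import LassoCV

    rng = np.random.default_rng(0)
    d, n = 20, 150
    X = rng.uniform(-1.0, 1.0, size=(n, d))                    # inputs rescaled to [-1, 1]
    y = 2.0 * X[:, 0] + X[:, 3] * X[:, 7] + 0.5 * X[:, 5]**2   # "expensive model" outputs

    def legendre_design(X, order=2):
        """Total-order Legendre PC basis evaluated at the samples (one column per basis term)."""
        n_samp, dim = X.shape
        cols, terms = [np.ones(n_samp)], [()]
        for p in range(1, order + 1):
            for combo in combinations_with_replacement(range(dim), p):
                col = np.ones(n_samp)
                for i in set(combo):
                    coeff = np.zeros(combo.count(i) + 1)
                    coeff[-1] = 1.0                            # Legendre polynomial of that degree
                    col = col * legval(X[:, i], coeff)
                cols.append(col)
                terms.append(combo)
        return np.column_stack(cols), terms

    Phi, terms = legendre_design(X, order=2)                   # 231 basis terms, only 150 samples
    fit = LassoCV(cv=5).fit(Phi[:, 1:], y)                     # sparse solve for the coefficients
    active = [terms[i + 1] for i, c in enumerate(fit.coef_) if abs(c) > 1e-2]
    print("retained PC terms (variable-index tuples):", active)
    ```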

  7. The Dynamic Theory, A New View of Space, Time, and Matter

    DTIC Science & Technology

    1978-03-01

    UQ = 0, system the principle of increasing entropy requires that (dq0)² > 0 so that f (do)² > 0. Introducing the quantization conditions results in ... added significance in considering the differential change in entropy or dq0 = f do. Thus the only mathematically consistent definition for ...

  8. A Generalized Perturbation Theory Solver In Rattlesnake Based On PETSc With Application To TREAT Steady State Uncertainty Quantification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schunert, Sebastian; Wang, Congjian; Wang, Yaqi

    Rattlesnake and MAMMOTH are the designated TREAT analysis tools currently being developed at the Idaho National Laboratory. Concurrent with development of the multi-physics, multi-scale capabilities, sensitivity analysis and uncertainty quantification (SA/UQ) capabilities are required for predictive modeling of the TREAT reactor. For steady-state SA/UQ, which is essential for setting initial conditions for the transients, generalized perturbation theory (GPT) will be used. This work describes the implementation of a PETSc-based solver for the generalized adjoint equations, which constitute an inhomogeneous, rank-deficient problem. The standard approach is to use an outer iteration strategy with repeated removal of the fundamental mode contamination. The described GPT algorithm directly solves the GPT equations without the need for an outer iteration procedure by using Krylov subspaces that are orthogonal to the operator's nullspace. Three test problems are solved and provide sufficient verification of Rattlesnake's GPT capability. We conclude with a preliminary example evaluating the impact of the boron distribution in the TREAT reactor using perturbation theory.

  9. Uncertainty quantification of CO₂ saturation estimated from electrical resistance tomography data at the Cranfield site

    DOE PAGES

    Yang, Xianjin; Chen, Xiao; Carrigan, Charles R.; ...

    2014-06-03

    A parametric bootstrap approach is presented for uncertainty quantification (UQ) of CO₂ saturation derived from electrical resistance tomography (ERT) data collected at the Cranfield, Mississippi (USA) carbon sequestration site. There are many sources of uncertainty in ERT-derived CO₂ saturation, but we focus on how the ERT observation errors propagate to the estimated CO₂ saturation in a nonlinear inversion process. Our UQ approach consists of three steps. We first estimated the observational errors from a large number of reciprocal ERT measurements. The second step was to invert the pre-injection baseline data, and the resulting resistivity tomogram was used as the prior information for nonlinear inversion of the time-lapse data. We assigned a 3% random noise to the baseline model. Finally, we used a parametric bootstrap method to obtain bootstrap CO₂ saturation samples by deterministically solving a nonlinear inverse problem many times with resampled data and resampled baseline models. The mean and standard deviation of CO₂ saturation were then calculated from the bootstrap samples. We found that the maximum standard deviation of CO₂ saturation was around 6%, with a corresponding maximum saturation of 30%, for a data set collected 100 days after injection began. There was no apparent spatial correlation between the mean and standard deviation of CO₂ saturation, but the standard deviation values increased with time as the saturation increased. The uncertainty in CO₂ saturation also depends on the ERT reciprocal error threshold used to identify and remove noisy data and on inversion constraints such as temporal roughness. Five hundred realizations requiring 3.5 h on a single 12-core node were needed for the nonlinear Monte Carlo inversion to arrive at stationary variances, while the Markov Chain Monte Carlo (MCMC) stochastic inverse approach may take days for a global search. This indicates that UQ of 2D or 3D ERT inverse problems can be performed on a laptop or desktop PC.
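
    A generic parametric-bootstrap skeleton in the spirit of the three-step approach above; the inversion routine here is a trivial placeholder (the real step is a full nonlinear ERT inversion), and the observation and baseline noise levels are the kind of inputs estimated from reciprocal measurements, not the actual values.

    ```python
    import numpy as np

    def invert(data, baseline):
        """Placeholder for the deterministic nonlinear inversion (data -> CO2 saturation)."""
        return np.clip(0.3 * (data - baseline), 0.0, 1.0)     # toy mapping, illustration only

    rng = np.random.default_rng(0)
    observed = np.array([1.2, 1.5, 1.9, 2.4])      # stand-in time-lapse ERT observations
    baseline = np.ones(4)                          # stand-in pre-injection baseline model
    obs_sigma = 0.05 * observed                    # observational errors (from reciprocal data)
    base_sigma = 0.03 * baseline                   # 3% random noise assigned to the baseline

    n_boot = 500
    samples = np.empty((n_boot, observed.size))
    for b in range(n_boot):
        data_b = observed + rng.normal(0.0, obs_sigma)        # resampled data
        base_b = baseline + rng.normal(0.0, base_sigma)       # resampled baseline model
        samples[b] = invert(data_b, base_b)                   # one deterministic inversion

    print("saturation mean:", samples.mean(axis=0).round(3))
    print("saturation std :", samples.std(axis=0).round(3))
    ```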

  10. Using Uncertainty Quantification to Guide Development and Improvements of a Regional-Scale Model of the Coastal Lowlands Aquifer System Spanning Texas, Louisiana, Mississippi, Alabama and Florida

    NASA Astrophysics Data System (ADS)

    Foster, L. K.; Clark, B. R.; Duncan, L. L.; Tebo, D. T.; White, J.

    2017-12-01

    Several historical groundwater models exist within the Coastal Lowlands Aquifer System (CLAS), which spans the Gulf Coastal Plain in Texas, Louisiana, Mississippi, Alabama, and Florida. The largest of these models, called the Gulf Coast Regional Aquifer System Analysis (RASA) model, has been brought into a new framework using the Newton formulation of MODFLOW-2005 (MODFLOW-NWT) and serves as the starting point of a new investigation underway by the U.S. Geological Survey to improve understanding of the CLAS and provide predictions of future groundwater availability within an uncertainty quantification (UQ) framework. The use of a UQ framework will not only provide estimates of water-level observation worth, hydraulic parameter uncertainty, boundary-condition uncertainty, and uncertainty of future potential predictions, but will also guide the model development process. Traditionally, model development proceeds from dataset construction to deterministic history matching, followed by deterministic predictions using the model. This investigation will combine the use of UQ with existing historical models of the study area to quantitatively assess the effect that model package and property improvements have on the ability to represent past system states, as well as the effect on the model's ability to make certain predictions of water levels, water budgets, and base-flow estimates. Estimates of hydraulic property information and boundary conditions from the existing models and literature, forming the prior, will be used to make initial estimates of model forecasts and their corresponding uncertainty, along with an uncalibrated groundwater model run within an unconstrained Monte Carlo analysis. First-Order Second-Moment (FOSM) analysis will also be used to investigate parameter and predictive uncertainty and to guide next steps in model development prior to rigorous history matching using the PEST++ parameter estimation code.
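
    A minimal sketch of the FOSM step mentioned above: forecast uncertainty is approximated by linearly propagating the parameter covariance through a sensitivity (Jacobian) matrix. The Jacobian, parameter names, and prior standard deviations are hypothetical, not values from the CLAS model or PEST++.

    ```python
    import numpy as np

    # FOSM: Cov(forecasts) ~ J @ Cov(parameters) @ J.T, with J = d(forecast)/d(parameter).
    J = np.array([[0.8, 0.1, 0.0],        # hypothetical sensitivities of 2 forecasts
                  [0.2, 0.5, 0.3]])       # to 3 parameters (e.g., K, recharge, boundary conductance)
    prior_sd = np.array([1.0, 0.5, 0.2])  # hypothetical prior parameter standard deviations
    C_param = np.diag(prior_sd**2)

    C_forecast = J @ C_param @ J.T
    print("prior forecast standard deviations:", np.sqrt(np.diag(C_forecast)).round(3))
    ```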

  11. Ground Testing for Hypervelocity Flow, Capabilities and Limitations

    DTIC Science & Technology

    2010-03-29

    Brisbane (T4) in Australia, see http://www.uq.edu.au/~e4dmee/t4.html, and larger ones at Göttingen in Germany (HEG), see e.g. Hannemann (2002), and ... Fluids, 11:4026–4039. Hannemann, K. (2002). High-enthalpy flows in the HEG shock tunnel: Experiment and numerical rebuilding. 22nd AIAA Aerodynamic

  12. Validating an Air Traffic Management Concept of Operation Using Statistical Modeling

    NASA Technical Reports Server (NTRS)

    He, Yuning; Davies, Misty Dawn

    2013-01-01

    Validating a concept of operation for a complex, safety-critical system (like the National Airspace System) is challenging because of the high dimensionality of the controllable parameters and the infinite number of states of the system. In this paper, we use statistical modeling techniques to explore the behavior of a conflict detection and resolution algorithm designed for the terminal airspace. These techniques predict the robustness of the system simulation to both nominal and off-nominal behaviors within the overall airspace. They can also be used to evaluate the output of the simulation against recorded airspace data. Additionally, the techniques carry with them a mathematical measure of the worth of each prediction: a statistical uncertainty for any robustness estimate. Uncertainty Quantification (UQ) is the process of quantitative characterization and, ultimately, reduction of uncertainties in complex systems. UQ is important for understanding the influence of uncertainties on the behavior of a system and is therefore valuable for design, analysis, and verification and validation. In this paper, we apply advanced statistical modeling methodologies and techniques to an advanced air traffic management system, namely the Terminal Tactical Separation Assured Flight Environment (T-TSAFE). We show initial results for a parameter analysis and safety boundary (envelope) detection in the high-dimensional parameter space. For our boundary analysis, we developed a new sequential approach based upon the design of computer experiments, allowing us to incorporate knowledge from domain experts into our modeling and to determine the most likely boundary shapes and their parameters. We carried out the analysis on system parameters and describe an initial approach that will allow us to include time-series inputs, such as the radar track data, into the analysis.

  13. Quantification of Dynamic Model Validation Metrics Using Uncertainty Propagation from Requirements

    NASA Technical Reports Server (NTRS)

    Brown, Andrew M.; Peck, Jeffrey A.; Stewart, Eric C.

    2018-01-01

    The Space Launch System, NASA's new large launch vehicle for long range space exploration, is presently in the final design and construction phases, with the first launch scheduled for 2019. A dynamic model of the system has been created and is critical for calculation of interface loads and natural frequencies and mode shapes for guidance, navigation, and control (GNC). Because of the program and schedule constraints, a single modal test of the SLS will be performed while bolted down to the Mobile Launch Pad just before the first launch. A Monte Carlo and optimization scheme will be performed to create thousands of possible models based on given dispersions in model properties and to determine which model best fits the natural frequencies and mode shapes from the modal test. However, the question still remains as to whether this model is acceptable for the loads and GNC requirements. An uncertainty propagation and quantification (UP and UQ) technique to develop a quantitative set of validation metrics based on the flight requirements has therefore been developed and is discussed in this paper. There has been considerable research on UQ, UP, and validation in the literature, but very little on propagating uncertainties from requirements, so most validation metrics are "rules of thumb"; this research seeks to develop more reason-based metrics. One of the main assumptions used to achieve this task is that the uncertainty in the modeling of the fixed boundary condition is accurate, so that same uncertainty can be used in propagating the fixed-test configuration to the free-free actual configuration. The second main technique applied here is the use of the limit-state formulation to quantify the final probabilistic parameters and to compare them with the requirements. These techniques are explored with a simple lumped spring-mass system and a simplified SLS model. When completed, it is anticipated that this requirements-based validation metric will provide a quantified confidence and probability of success for the final SLS dynamics model, which will be critical for a successful launch program, and can be applied in the many other industries where an accurate dynamic model is required.
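
    A toy Monte Carlo limit-state calculation in the spirit of the simple spring-mass example: disperse the stiffness and mass, compute the natural frequency, and estimate the probability that a (made-up) frequency requirement is violated. None of the numbers comes from the SLS model.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n = 200_000
    k = rng.normal(1.0e5, 5.0e3, n)        # spring stiffness (N/m), dispersed
    m = rng.normal(10.0, 0.3, n)           # mass (kg), dispersed

    f_n = np.sqrt(k / m) / (2.0 * np.pi)   # natural frequency of the 1-DOF system (Hz)

    g = f_n - 15.0                         # limit state: requirement met when g > 0 (15 Hz is made up)
    p_violation = np.mean(g <= 0.0)
    print(f"P(frequency requirement violated) ~ {p_violation:.4f}")
    ```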

  14. Computational electromagnetic methods for transcranial magnetic stimulation

    NASA Astrophysics Data System (ADS)

    Gomez, Luis J.

    Transcranial magnetic stimulation (TMS) is a noninvasive technique used both as a research tool for cognitive neuroscience and as an FDA-approved treatment for depression. During TMS, coils positioned near the scalp generate electric fields and activate targeted brain regions. In this thesis, several computational electromagnetics methods that improve the analysis, design, and uncertainty quantification of TMS systems were developed. Analysis: A new fast direct technique for solving the large and sparse linear system of equations (LSEs) arising from the finite difference (FD) discretization of Maxwell's quasi-static equations was developed. Following a factorization step, the solver permits computation of TMS fields inside realistic brain models in seconds, allowing for patient-specific real-time usage during TMS. The solver is an alternative to iterative methods for solving FD LSEs, which often require run-times of minutes. A new integral equation (IE) method for analyzing TMS fields was also developed. The human head is highly heterogeneous and characterized by high relative permittivities (10⁷). IE techniques for analyzing electromagnetic interactions with such media suffer from high-contrast and low-frequency breakdowns. A novel high-permittivity- and low-frequency-stable, internally combined volume-surface IE method was therefore developed. The method not only applies to the analysis of high-permittivity objects, but it is also the first IE tool that is stable when analyzing highly inhomogeneous negative-permittivity plasmas. Design: TMS applications call for electric fields to be sharply focused on regions that lie deep inside the brain. Unfortunately, fields generated by present-day Figure-8 coils stimulate relatively large regions near the brain surface. An optimization method for designing single-feed TMS coil-arrays capable of producing more localized and deeper stimulation was developed. Results show that the coil-arrays stimulate 2.4 cm into the head while stimulating 3.0 times less volume than Figure-8 coils. Uncertainty quantification (UQ): The location/volume/depth of the stimulated region during TMS is often strongly affected by variability in the position and orientation of TMS coils, as well as anatomical differences between patients. A surrogate model-assisted UQ framework was developed and used to statistically characterize TMS depression therapy. The framework identifies key parameters that strongly affect TMS fields, and partially explains variations in TMS treatment responses.

  15. Draft Background Document: Summary of Data on Municipal ...

    EPA Pesticide Factsheets

  16. ARM Best Estimate Data (ARMBE) Products for Climate Science for a Sustainable Energy Future (CSSEF)

    DOE Data Explorer

    Riihimaki, Laura; Gaustad, Krista; McFarlane, Sally

    2014-06-12

    This data set was created for the Climate Science for a Sustainable Energy Future (CSSEF) model testbed project and is an extension of the hourly average ARMBE dataset to other extended facility sites and to include uncertainty estimates. Uncertainty estimates were needed in order to use uncertainty quantification (UQ) techniques with the data.

  17. Baseline Hazardous Waste Stream Characterization Survey at the 21st Tactical Fighter Wing Elmendorf AFB, Alaska

    DTIC Science & Technology

    1991-11-01

    CES/DEEV). The Defense Reutilization and Marketing Office (DRMO) is responsible for contractual removal of hazardous waste. BES supports the program ...

  18. Asymptotic representations of augmented q-Onsager algebra and boundary K-operators related to Baxter Q-operators

    NASA Astrophysics Data System (ADS)

    Baseilhac, Pascal; Tsuboi, Zengo

    2018-04-01

    We consider intertwining relations of the augmented q-Onsager algebra introduced by Ito and Terwilliger, and obtain generic (diagonal) boundary K-operators in terms of the Cartan element of Uq(sl2). These K-operators solve reflection equations. Taking appropriate limits of these K-operators in Verma modules, we derive K-operators for Baxter Q-operators and corresponding reflection equations.

  19. Bellerophon: a program to detect chimeric sequences in multiple sequence alignments.

    PubMed

    Huber, Thomas; Faulkner, Geoffrey; Hugenholtz, Philip

    2004-09-22

    Bellerophon is a program for detecting chimeric sequences in multiple sequence datasets by an adaptation of partial treeing analysis. Bellerophon was specifically developed to detect 16S rRNA gene chimeras in PCR-clone libraries of environmental samples but can be applied to other nucleotide sequence alignments. Bellerophon is available as an interactive web server at http://foo.maths.uq.edu.au/~huber/bellerophon.pl

  20. Multi-fidelity numerical simulations of shock/turbulent-boundary layer interaction with uncertainty quantification

    NASA Astrophysics Data System (ADS)

    Bermejo-Moreno, Ivan; Campo, Laura; Larsson, Johan; Emory, Mike; Bodart, Julien; Palacios, Francisco; Iaccarino, Gianluca; Eaton, John

    2013-11-01

    We study the interaction between an oblique shock wave and the turbulent boundary layers inside a nearly-square duct by combining wall-modeled LES, 2D and 3D RANS simulations, targeting the experiment of Campo, Helmer & Eaton, 2012 (nominal conditions: M = 2.05, Reθ = 6,500). A primary objective is to quantify the effect of aleatory and epistemic uncertainties on the STBLI. Aleatory uncertainties considered include the inflow conditions (Mach number of the incoming air stream and thickness of the boundary layers) and perturbations of the duct geometry upstream of the interaction. The epistemic uncertainty under consideration focuses on the RANS turbulence model form by injecting perturbations in the Reynolds stress anisotropy in regions of the flow where the model assumptions (in particular, the Boussinesq eddy-viscosity hypothesis) may be invalid. These perturbations are then propagated through the flow solver into the solution. The uncertainty quantification (UQ) analysis is done through 2D and 3D RANS simulations, assessing the importance of the three-dimensional effects imposed by the nearly-square duct geometry. Wall-modeled LES are used to verify elements of the UQ methodology and to explore the flow features and physics of the STBLI for multiple shock strengths. Financial support from the United States Department of Energy under the PSAAP program is gratefully acknowledged.

  1. An overview of methods to identify and manage uncertainty for modelling problems in the water-environment-agriculture cross-sector

    DOE PAGES

    Jakeman, Anthony J.; Jakeman, John Davis

    2018-03-14

    Uncertainty pervades the representation of systems in the water–environment–agriculture cross-sector. Successful methods to address uncertainties have largely focused on standard mathematical formulations of biophysical processes in a single sector, such as partial or ordinary differential equations. More attention to integrated models of such systems is warranted. Model components representing the different sectors of an integrated model can have less standard, and different, formulations to one another, as well as different levels of epistemic knowledge and data informativeness. Thus, uncertainty is not only pervasive but also crosses boundaries and propagates between system components. Uncertainty assessment (UA) cries out for more eclectic treatment in these circumstances, some of it being more qualitative and empirical. In this paper, we discuss the various sources of uncertainty in such a cross-sectoral setting and ways to assess and manage them. We have outlined a fast-growing set of methodologies, particularly in the computational mathematics literature on uncertainty quantification (UQ), that seem highly pertinent for uncertainty assessment. There appears to be considerable scope for advancing UA by integrating relevant UQ techniques into cross-sectoral problem applications. Of course this will entail considerable collaboration between domain specialists, who often take first ownership of the problem, and computational methods experts.

  2. Multi-fidelity uncertainty quantification in large-scale predictive simulations of turbulent flow

    NASA Astrophysics Data System (ADS)

    Geraci, Gianluca; Jofre-Cruanyes, Lluis; Iaccarino, Gianluca

    2017-11-01

    The performance characterization of complex engineering systems often relies on accurate but computationally intensive numerical simulations. It is also well recognized that in order to obtain a reliable numerical prediction the propagation of uncertainties needs to be included. Therefore, Uncertainty Quantification (UQ) plays a fundamental role in building confidence in predictive science. Despite great improvement in recent years, even the more advanced UQ algorithms are still limited to fairly simplified applications and only moderate parameter dimensionality. Moreover, in the case of extremely large dimensionality, sampling methods, i.e. Monte Carlo (MC) based approaches, appear to be the only viable alternative. In this talk we describe and compare a family of approaches which aim to accelerate the convergence of standard MC simulations. These methods are based on hierarchies of generalized numerical resolutions (multi-level) or model fidelities (multi-fidelity), and attempt to leverage the correlation between Low- and High-Fidelity (HF) models to obtain a more accurate statistical estimator without introducing additional HF realizations. The performance of these methods is assessed on an irradiated particle-laden turbulent flow (PSAAP II solar energy receiver). This investigation was funded by the United States Department of Energy's (DoE) National Nuclear Security Administration (NNSA) under the Predictive Science Academic Alliance Program (PSAAP) II at Stanford University.
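
    A toy two-level estimator in the spirit of the multi-level/multi-fidelity methods discussed above: many cheap low-fidelity samples plus a small set of paired high/low-fidelity samples for the correction term. Both "models" are analytic toys, not the solar-receiver simulations.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def high_fidelity(x):                      # expensive model (in practice)
        return np.sin(x) + 0.05 * x**2

    def low_fidelity(x):                       # cheap, strongly correlated surrogate
        return np.sin(x)

    x_cheap = rng.standard_normal(100_000)     # many low-fidelity samples
    x_paired = rng.standard_normal(500)        # few samples evaluated with both models

    correction = (high_fidelity(x_paired) - low_fidelity(x_paired)).mean()
    two_level = low_fidelity(x_cheap).mean() + correction
    reference = high_fidelity(rng.standard_normal(2_000_000)).mean()   # brute-force check
    print(f"two-level estimate: {two_level:.4f}   plain MC reference: {reference:.4f}")
    ```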

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Burr, Tom; Croft, Stephen; Jarman, Kenneth D.

    The various methods of nondestructive assay (NDA) of special nuclear material (SNM) have applications in nuclear nonproliferation, including detection and identification of illicit SNM at border crossings, and quantifying SNM at nuclear facilities for safeguards. No assay method is complete without "error bars," which provide one way of expressing confidence in the assay result. Consequently, NDA specialists typically quantify total uncertainty in terms of "random" and "systematic" components, and then specify error bars for the total mass estimate in multiple items. Uncertainty quantification (UQ) for NDA has always been important, but it is recognized that greater rigor is needed and achievable using modern statistical methods. To this end, we describe the extent to which the Guide to the Expression of Uncertainty in Measurement (GUM) can be used for NDA. We also propose improvements over the GUM for NDA by illustrating UQ challenges that it does not address, including calibration with errors in predictors, model error, and item-specific biases. A case study is presented using low-resolution NaI spectra and applying the enrichment meter principle to estimate the U-235 mass in an item. The case study illustrates how to update the current American Society for Testing and Materials guide for application of the enrichment meter principle using gamma spectra from a NaI detector.
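
    A minimal GUM-style combination of "random" and "systematic" components for a total mass over several items, as a sketch of the error-bar bookkeeping described above; the masses and relative uncertainties are made up and are not from the NaI case study.

    ```python
    import numpy as np

    masses = np.array([120.0, 95.0, 130.0])   # made-up per-item U-235 mass estimates (g)
    u_rand_rel, u_sys_rel = 0.02, 0.015       # made-up relative standard uncertainties

    total = masses.sum()
    # Independent random (e.g., counting) components add in quadrature across items;
    # a shared systematic (e.g., calibration) component is fully correlated and adds linearly.
    u_random = np.sqrt(np.sum((u_rand_rel * masses) ** 2))
    u_systematic = u_sys_rel * total
    u_combined = np.hypot(u_random, u_systematic)
    print(f"total = {total:.1f} g, expanded uncertainty (k=2) = {2 * u_combined:.1f} g")
    ```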

  5. Film thickness dependence of phase separation and dewetting behaviors in PMMA/SAN blend films.

    PubMed

    You, Jichun; Liao, Yonggui; Men, Yongfeng; Shi, Tongfei; An, Lijia

    2010-09-21

    The film-thickness dependence of the complex behaviors coupling phase separation and dewetting in blend [poly(methyl methacrylate) (PMMA) and poly(styrene-ran-acrylonitrile) (SAN)] films on a silicon oxide substrate at 175 °C was investigated by grazing incidence ultrasmall-angle X-ray scattering (GIUSAX) and in situ atomic force microscopy (AFM). It was found that the dewetting pathway was controlled by the parameter U(q0)/E, where U(q0) and E describe the initial amplitude of the surface undulation and the original thickness of the film, respectively. Furthermore, our results showed that the interplay between phase separation and dewetting depends crucially on film thickness. Three mechanisms, including dewetting-phase separation/wetting, dewetting/wetting-phase separation, and phase separation/wetting-pseudodewetting, are discussed in detail. In conclusion, it is the relative rates of phase separation and dewetting that dominate the interplay between them.

  6. Region 4 of Rhizobium etli Primary Sigma Factor (SigA) Confers Transcriptional Laxity in Escherichia coli.

    PubMed

    Santillán, Orlando; Ramírez-Romero, Miguel A; Lozano, Luis; Checa, Alberto; Encarnación, Sergio M; Dávila, Guillermo

    2016-01-01

    Sigma factors are RNA polymerase subunits engaged in promoter recognition and DNA strand separation during transcription initiation in bacteria. Primary sigma factors are responsible for the expression of housekeeping genes and are essential for survival. RpoD, the primary sigma factor of Escherichia coli, a γ-proteobacterium, recognizes consensus promoter sequences highly similar to those of some α-proteobacteria species. Despite this resemblance, RpoD is unable to sustain transcription from most of the α-proteobacterial promoters tested so far. In contrast, we have found that SigA, the primary sigma factor of Rhizobium etli, an α-proteobacterium, is able to transcribe E. coli promoters, although it exhibits only 48% identity (98% coverage) to RpoD. We have called this the transcriptional laxity phenomenon. Here, we show that SigA partially complements the thermo-sensitive deficiency of RpoD285 from E. coli strain UQ285 and that the SigA region σ4 is responsible for this phenotype. Sixteen out of 74 residues (21.6%) within region σ4 are variable between RpoD and SigA. Mutating these residues significantly improves the ability of SigA to complement E. coli UQ285. Only six of these residues fall into positions already known to interact with promoter DNA and to comprise a helix-turn-helix motif. The remaining variable positions are located on previously unexplored sites inside region σ4, specifically in the first two α-helices of the region. None of the variable positions confined to these helices seems to interact directly with the promoter sequence; instead, we adduce that these residues participate allosterically by contributing to correct region folding and/or positioning of the HTH motif. We propose that transcriptional laxity is a mechanism for ensuring transcription in spite of naturally occurring mutations in endogenous promoters and/or horizontally transferred DNA sequences, allowing survival and fast environmental adaptation of α-proteobacteria.

  7. Region 4 of Rhizobium etli Primary Sigma Factor (SigA) Confers Transcriptional Laxity in Escherichia coli

    PubMed Central

    Santillán, Orlando; Ramírez-Romero, Miguel A.; Lozano, Luis; Checa, Alberto; Encarnación, Sergio M.; Dávila, Guillermo

    2016-01-01

    Sigma factors are RNA polymerase subunits engaged in promoter recognition and DNA strand separation during transcription initiation in bacteria. Primary sigma factors are responsible for the expression of housekeeping genes and are essential for survival. RpoD, the primary sigma factor of Escherichia coli, a γ-proteobacterium, recognizes consensus promoter sequences highly similar to those of some α-proteobacteria species. Despite this resemblance, RpoD is unable to sustain transcription from most of the α-proteobacterial promoters tested so far. In contrast, we have found that SigA, the primary sigma factor of Rhizobium etli, an α-proteobacterium, is able to transcribe E. coli promoters, although it exhibits only 48% identity (98% coverage) to RpoD. We have called this the transcriptional laxity phenomenon. Here, we show that SigA partially complements the thermo-sensitive deficiency of RpoD285 from E. coli strain UQ285 and that the SigA region σ4 is responsible for this phenotype. Sixteen out of 74 residues (21.6%) within region σ4 are variable between RpoD and SigA. Mutating these residues significantly improves the ability of SigA to complement E. coli UQ285. Only six of these residues fall into positions already known to interact with promoter DNA and to comprise a helix-turn-helix motif. The remaining variable positions are located at previously unexplored sites inside region σ4, specifically within the first two α-helices of the region. None of the variable positions confined to these helices seems to interact directly with the promoter sequence; instead, we adduce that these residues participate allosterically by contributing to correct region folding and/or positioning of the HTH motif. We propose that transcriptional laxity is a mechanism for ensuring transcription in spite of naturally occurring mutations from endogenous promoters and/or horizontally transferred DNA sequences, allowing survival and fast environmental adaptation of α-proteobacteria. PMID:27468278

  8. A non-extensive thermodynamic theory of ecological systems

    NASA Astrophysics Data System (ADS)

    Van Xuan, Le; Khac Ngoc, Nguyen; Lan, Nguyen Tri; Viet, Nguyen Ai

    2017-06-01

    After almost 30 years of development, it is no longer a controversial issue that the so-called Tsallis entropy provides a useful approach to studying complexity where the non-additivity of the systems under consideration is frequently met. Also, in ecological research, Tsallis entropy, or in other words q-entropy, has established itself as a generalized approach to defining a range of diversity indices, including the Shannon-Wiener and Simpson indices. As a further stage of development in theoretical research, a thermodynamic theory based on Tsallis entropy or diversity indices in ecology has to be constructed for ecological systems to provide knowledge of ecological macroscopic behaviors. The standard methods of theoretical physics are used in the manipulation, and establishing the equivalence between phenomenological thermodynamics and ecological aspects is the purpose of the ongoing research. The present work is in line with the authors' research to implement the Tsallis non-extensivity approach to obtain the most important thermodynamic quantities of ecological systems, such as the internal energy Uq and the temperature Tq, based on a given modeled truncated Boltzmann distribution of the Whittaker plot for a dataset. These quantities have their own ecological meaning; in particular, the temperature Tq provides insight into the equilibrium condition among ecological systems, as is well known from the zeroth law of thermodynamics.
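
    For orientation, the q-entropy and the escort-averaged internal energy referred to above have the following standard Tsallis-statistics forms (textbook definitions, not equations quoted from the record); the Shannon entropy is recovered in the limit q → 1:

        S_q = k \, \frac{1 - \sum_i p_i^{\,q}}{q - 1},
        \qquad \lim_{q \to 1} S_q = -k \sum_i p_i \ln p_i,
        \qquad U_q = \frac{\sum_i p_i^{\,q} E_i}{\sum_i p_i^{\,q}},
        \qquad \frac{1}{T_q} = \frac{\partial S_q}{\partial U_q}.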

  9. Imprecise Probability Methods for Weapons UQ

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Picard, Richard Roy; Vander Wiel, Scott Alan

    Building on recent work in uncertainty quantification, we examine the use of imprecise probability methods to better characterize expert knowledge and to improve on misleading aspects of Bayesian analysis with informative prior distributions. Quantitative approaches to incorporate uncertainties in weapons certification are subject to rigorous external peer review, and in this regard, certain imprecise probability methods are well established in the literature and attractive. These methods are illustrated using experimental data from LANL detonator impact testing.

  10. Quantifying Risks and Uncertainties Associated with Induced Seismicity due to CO2 Injection into Geologic Formations with Faults

    NASA Astrophysics Data System (ADS)

    Hou, Z.; Nguyen, B. N.; Bacon, D. H.; White, M. D.; Murray, C. J.

    2016-12-01

    A multiphase flow and reactive transport simulator named STOMP-CO2-R has been developed and coupled to the ABAQUS® finite element package for geomechanical analysis, enabling comprehensive thermo-hydro-geochemical-mechanical (THMC) analyses. The coupled THMC simulator has been applied to analyze faulted CO2 reservoir responses (e.g., stress and strain distributions, pressure buildup, slip tendency factor, pressure margin to fracture) with various complexities in fault and reservoir structures and mineralogy. Depending on the geological and reaction network settings, long-term injection of CO2 can have a significant effect on the elastic stiffness and permeability of formation rocks. In parallel, an uncertainty quantification framework (UQ-CO2), which consists of entropy-based prior uncertainty representation, efficient sampling, geostatistical reservoir modeling, and effective response surface analysis, has been developed for quantifying risks and uncertainties associated with CO2 sequestration. It has been demonstrated for evaluating risks of CO2 leakage through natural pathways and wellbores, and for developing predictive reduced order models. Recently, a parallel STOMP-CO2-R has been developed, and the updated STOMP/ABAQUS model has demonstrated excellent scalability, which makes it possible to integrate the model with the UQ framework to effectively and efficiently explore the multidimensional parameter space (e.g., permeability, elastic modulus, crack orientation, fault friction coefficient) for a more systematic analysis of induced seismicity risks.

  11. Implementing digital technology to enhance student learning of pathology.

    PubMed

    Farah, C S; Maybury, T

    2009-08-01

    The introduction of digital technologies into the dental curriculum is an ongoing feature of broader changes going on in tertiary education. This report examines the introduction of digital virtual microscopy technology into the curriculum of the School of Dentistry at the University of Queensland (UQ) in Brisbane, Australia. Sixty students studying a course in pathology in 2005 were introduced to virtual microscopy technology alongside the more traditional light microscope and then asked to evaluate their own learning outcomes from this technology via a structured five-point Likert survey. A wide variety of questions dealing with the pedagogic implications of the introduction of virtual microscopy into pathology were asked of students, with the overall result that it positively enhanced their learning of pathology via digital microscopic means. The success of virtual microscopy in dentistry at UQ is then discussed in the larger context of changes going on in tertiary education, in particular the change from the print-literate tradition to the electronic one, that is, from 'literacy' to 'electracy'. Virtual microscopy is designated as a component of this transformation to electracy. Whilst traditional microscopic skills may still be valued in dental curricula, the move to virtual microscopy and computer-assisted, student-centred learning of pathology appears to enhance the learning experience in relation to its effectiveness in helping students engage and interact with the course material.

  12. Uncertainty quantification in application of the enrichment meter principle for nondestructive assay of special nuclear material

    DOE PAGES

    Burr, Tom; Croft, Stephen; Jarman, Kenneth D.

    2015-09-05

    The various methods of nondestructive assay (NDA) of special nuclear material (SNM) have applications in nuclear nonproliferation, including detection and identification of illicit SNM at border crossings, and quantifying SNM at nuclear facilities for safeguards. No assay method is complete without “error bars,” which provide one way of expressing confidence in the assay result. Consequently, NDA specialists typically quantify total uncertainty in terms of “random” and “systematic” components, and then specify error bars for the total mass estimate in multiple items. Uncertainty quantification (UQ) for NDA has always been important, but it is recognized that greater rigor is needed and achievable using modern statistical methods. To this end, we describe the extent to which the guideline for expressing uncertainty in measurements (GUM) can be used for NDA. Also, we propose improvements over GUM for NDA by illustrating UQ challenges that it does not address, including calibration with errors in predictors, model error, and item-specific biases. A case study is presented using low-resolution NaI spectra and applying the enrichment meter principle to estimate the U-235 mass in an item. The case study illustrates how to update the current American Society for Testing and Materials guide for application of the enrichment meter principle using gamma spectra from a NaI detector.
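
    As a companion to the case study described above, the following is a minimal numerical sketch of the enrichment meter principle: the net count rate in the 186 keV region of an "infinitely thick" item is taken to be proportional to the U-235 enrichment, a calibration constant is fitted from standards, and the calibration is inverted for an unknown item with first-order uncertainty propagation. All numbers, and the simple through-the-origin fit, are illustrative assumptions rather than the GUM-based treatment of the paper.

        import numpy as np

        # Calibration standards: declared U-235 enrichment (%) and measured net
        # count rate in the 186 keV region (counts/s); values are illustrative.
        enrich_std = np.array([0.7, 1.9, 3.0, 4.5])
        rate_std = np.array([70.4, 190.2, 300.9, 452.1])

        # Enrichment meter principle: rate = k * enrichment, so fit k by least
        # squares through the origin and estimate its standard uncertainty.
        k = np.sum(rate_std * enrich_std) / np.sum(enrich_std**2)
        resid = rate_std - k * enrich_std
        sigma_k = np.sqrt(np.sum(resid**2) / (len(rate_std) - 1) / np.sum(enrich_std**2))

        # Assay of an unknown item: invert the calibration and propagate (to
        # first order) the counting and calibration uncertainties.
        rate_item, sigma_rate = 215.0, 2.0
        e_item = rate_item / k
        sigma_e = e_item * np.sqrt((sigma_rate / rate_item)**2 + (sigma_k / k)**2)
        print(f"estimated enrichment: {e_item:.2f} +/- {sigma_e:.2f} % U-235")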

  13. Materials integrity in microsystems: a framework for a petascale predictive-science-based multiscale modeling and simulation system

    NASA Astrophysics Data System (ADS)

    To, Albert C.; Liu, Wing Kam; Olson, Gregory B.; Belytschko, Ted; Chen, Wei; Shephard, Mark S.; Chung, Yip-Wah; Ghanem, Roger; Voorhees, Peter W.; Seidman, David N.; Wolverton, Chris; Chen, J. S.; Moran, Brian; Freeman, Arthur J.; Tian, Rong; Luo, Xiaojuan; Lautenschlager, Eric; Challoner, A. Dorian

    2008-09-01

    Microsystems have become an integral part of our lives and can be found in homeland security, medical science, aerospace applications and beyond. Many critical microsystem applications are in harsh environments, in which long-term reliability needs to be guaranteed and repair is not feasible. For example, gyroscope microsystems on satellites need to function for over 20 years under severe radiation, thermal cycling, and shock loading. Hence, predictive-science-based, verified and validated computational models and algorithms to predict the performance and materials integrity of microsystems in these situations are needed. Confidence in these predictions is improved by quantifying uncertainties and approximation errors. With no full-system testing and limited sub-system testing, petascale computing is certainly necessary to span both time and space scales and to reduce the uncertainty in the prediction of long-term reliability. This paper presents the necessary steps to develop a predictive-science-based multiscale modeling and simulation system. The development of this system will be focused on the prediction of the long-term performance of a gyroscope microsystem. The environmental effects to be considered include radiation, thermo-mechanical cycling and shock. Since there will be many material performance issues, attention is restricted to creep resulting from thermal aging and radiation-enhanced mass diffusion, material instability due to radiation and thermo-mechanical cycling, and damage and fracture due to shock. To meet these challenges, we aim to develop an integrated multiscale software analysis system that spans the length scales from the atomistic scale to the scale of the device. The proposed software system will include molecular mechanics, phase field evolution, micromechanics and continuum mechanics software, and state-of-the-art model identification strategies in which atomistic properties are calibrated by quantum calculations. We aim to predict the long-term (in excess of 20 years) integrity of the resonator, electrode base, multilayer metallic bonding pads, and vacuum seals in a prescribed mission. Although multiscale simulations are efficient in the sense that they focus the most computationally intensive models and methods on only the portions of the space-time domain needed, the execution of the multiscale simulations associated with evaluating materials and device integrity for aerospace microsystems will require the application of petascale computing. A component-based software strategy will be used in the development of our massively parallel multiscale simulation system. This approach will allow us to take full advantage of existing single-scale modeling components. An extensive, pervasive thrust in the software system development is verification, validation, and uncertainty quantification (UQ). Each component and the integrated software system need to be carefully verified. A UQ methodology that determines the quality of predictive information available from experimental measurements and packages the information in a form suitable for UQ at various scales needs to be developed. Experiments to validate the model at the nanoscale, microscale, and macroscale are proposed.
The development of a petascale predictive-science-based multiscale modeling and simulation system will advance the field of predictive multiscale science so that it can be used to reliably analyze problems of unprecedented complexity, where limited testing resources can be adequately replaced by petascale computational power, advanced verification, validation, and UQ methodologies.

  14. Realization of Uq(sp(2n)) within the Differential Algebra on Quantum Symplectic Space

    NASA Astrophysics Data System (ADS)

    Zhang, Jiao; Hu, Naihong

    2017-10-01

    We realize the Hopf algebra U_q(sp_{2n}) as an algebra of quantum differential operators on the quantum symplectic space X(f_s; R) and prove that X(f_s; R) is a U_q(sp_{2n})-module algebra whose irreducible summands are just its homogeneous subspaces. We give a coherence realization for all the positive root vectors under the actions of Lusztig's braid automorphisms of U_q(sp_{2n}).

  15. Let's call it "aphasia": Rationales for eliminating the term "dysphasia".

    PubMed

    Worrall, Linda; Simmons-Mackie, Nina; Wallace, Sarah J; Rose, Tanya; Brady, Marian C; Kong, Anthony Pak Hin; Murray, Laura; Hallowell, Brooke

    2016-10-01

    Health professionals, researchers, and policy makers often consider the two terms aphasia and dysphasia to be synonymous. The aim of this article is to argue the merits of the exclusive use of the term aphasia and present a strategy for creating change through institutions such as the WHO-ICD. Our contention is that one term avoids confusion, speech-language pathologists prefer aphasia, scholarly publications indicate a preference for the term aphasia, stroke clinical guidelines indicate a preference for the term aphasia, consumer organizations use the title aphasia in their name and on their websites, and languages other than English use a term similar to aphasia. The use of the term dysphasia in the broader medical community may stem from the two terms being used interchangeably in the ICD-10. Aphasia United (http://www.shrs.uq.edu.au/aphasiaunited), an international movement for uniting the voice of all stakeholders in aphasia within an international context, will seek to eliminate the use of the term dysphasia.

  16. Development of Plant Gene Vectors for Tissue-Specific Expression Using GFP as a Reporter Gene

    NASA Technical Reports Server (NTRS)

    Jackson, Jacquelyn; Egnin, Marceline; Xue, Qi-Han; Prakash, C. S.

    1997-01-01

    Reporter genes are widely employed in plant molecular biology research to analyze gene expression and to identify promoters. Gus (UidA) is currently the most popular reporter gene, but its detection requires a destructive assay. The use of the jellyfish green fluorescent protein (GFP) gene from Aequorea victoria holds promise for noninvasive detection of in vivo gene expression. To study how various plant promoters are expressed in sweet potato (Ipomoea batatas), we are transcriptionally fusing the intron-modified (mGFP) or synthetic (modified for codon usage) GFP coding regions to these promoters: double cauliflower mosaic virus 35S (CaMV 35S) with AMV translational enhancer, ubiquitin7-intron-ubiquitin coding region (ubi7-intron-UQ) and sporaminA. A few of these vectors have been constructed and introduced into E. coli DH5α and Agrobacterium tumefaciens EHA105. Transient expression studies are underway using protoplast electroporation and particle bombardment of leaf tissues.

  17. Bayesian Methods for Effective Field Theories

    NASA Astrophysics Data System (ADS)

    Wesolowski, Sarah

    Microscopic predictions of the properties of atomic nuclei have reached a high level of precision in the past decade. This progress mandates improved uncertainty quantification (UQ) for a robust comparison of experiment with theory. With the uncertainty from many-body methods under control, calculations are now sensitive to the input inter-nucleon interactions. These interactions include parameters that must be fit to experiment, inducing both uncertainty from the fit and from missing physics in the operator structure of the Hamiltonian. Furthermore, the implementation of the inter-nucleon interactions is not unique, which presents the additional problem of assessing results using different interactions. Effective field theories (EFTs) take advantage of a separation of high- and low-energy scales in the problem to form a power-counting scheme that allows the organization of terms in the Hamiltonian based on their expected contribution to observable predictions. This scheme gives a natural framework for quantification of uncertainty due to missing physics. The free parameters of the EFT, called the low-energy constants (LECs), must be fit to data, but in a properly constructed EFT these constants will be natural-sized, i.e., of order unity. The constraints provided by the EFT, namely the size of the systematic uncertainty from truncation of the theory and the natural size of the LECs, are assumed information even before a calculation is performed or a fit is done. Bayesian statistical methods provide a framework for treating uncertainties that naturally incorporates prior information as well as putting stochastic and systematic uncertainties on an equal footing. For EFT UQ Bayesian methods allow the relevant EFT properties to be incorporated quantitatively as prior probability distribution functions (pdfs). Following the logic of probability theory, observable quantities and underlying physical parameters such as the EFT breakdown scale may be expressed as pdfs that incorporate the prior pdfs. Problems of model selection, such as distinguishing between competing EFT implementations, are also natural in a Bayesian framework. In this thesis we focus on two complementary topics for EFT UQ using Bayesian methods--quantifying EFT truncation uncertainty and parameter estimation for LECs. Using the order-by-order calculations and underlying EFT constraints as prior information, we show how to estimate EFT truncation uncertainties. We then apply the result to calculating truncation uncertainties on predictions of nucleon-nucleon scattering in chiral effective field theory. We apply model-checking diagnostics to our calculations to ensure that the statistical model of truncation uncertainty produces consistent results. A framework for EFT parameter estimation based on EFT convergence properties and naturalness is developed which includes a series of diagnostics to ensure the extraction of the maximum amount of available information from data to estimate LECs with minimal bias. We develop this framework using model EFTs and apply it to the problem of extrapolating lattice quantum chromodynamics results for the nucleon mass. We then apply aspects of the parameter estimation framework to perform case studies in chiral EFT parameter estimation, investigating a possible operator redundancy at fourth order in the chiral expansion and the appropriate inclusion of truncation uncertainty in estimating LECs.
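
    The order-by-order truncation estimate discussed in the abstract can be illustrated with a deliberately naive sketch: infer the dimensionless expansion coefficients from successive orders and size the omitted terms geometrically. The observable values, reference scale y_ref and expansion parameter Q below are invented for illustration and stand in for the Bayesian prior-on-coefficients treatment developed in the thesis.

        import numpy as np

        # Hypothetical order-by-order predictions of one observable.
        y_orders = np.array([25.0, 21.0, 22.1, 21.8])   # orders k = 0..3
        y_ref, Q = 25.0, 0.33                           # assumed scale and expansion parameter

        # Coefficients inferred from successive differences:
        #   y_k = y_ref * sum_{n<=k} c_n Q^n  =>  c_n = (y_n - y_{n-1}) / (y_ref * Q^n)
        n = np.arange(1, y_orders.size)
        c = np.diff(y_orders) / (y_ref * Q**n)

        # Naive truncation error: first omitted order, sized by the largest
        # coefficient seen so far, summed geometrically over all higher orders.
        c_bar = np.max(np.abs(np.concatenate(([1.0], c))))
        k_max = y_orders.size - 1
        delta_y = y_ref * c_bar * Q**(k_max + 1) / (1.0 - Q)

        print(f"prediction: {y_orders[-1]:.2f} +/- {delta_y:.2f}")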

  18. A survey of Existing V&V, UQ and M&S Data and Knowledge Bases in Support of the Nuclear Energy - Knowledge base for Advanced Modeling and Simulation (NE-KAMS)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hyung Lee; Rich Johnson, Ph.D.; Kimberlyn C. Moussesau

    2011-12-01

    The Nuclear Energy - Knowledge base for Advanced Modeling and Simulation (NE-KAMS) is being developed at the Idaho National Laboratory in conjunction with Bettis Laboratory, Sandia National Laboratories, Argonne National Laboratory, Oak Ridge National Laboratory, Utah State University and others. The objective of this consortium is to establish a comprehensive knowledge base to provide Verification and Validation (V&V) and Uncertainty Quantification (UQ) and other resources for advanced modeling and simulation (M&S) in nuclear reactor design and analysis. NE-KAMS will become a valuable resource for the nuclear industry, the national laboratories, the U.S. NRC and the public to help ensure the safe operation of existing and future nuclear reactors. A survey and evaluation of the state-of-the-art of existing V&V and M&S databases, including the Department of Energy and commercial databases, has been performed to ensure that the NE-KAMS effort will not be duplicating existing resources and capabilities and to assess the scope of the effort required to develop and implement NE-KAMS. The survey and evaluation have indeed highlighted the unique set of value-added functionality and services that NE-KAMS will provide to its users. Additionally, the survey has helped develop a better understanding of the architecture and functionality of these data and knowledge bases that can be used to leverage the development of NE-KAMS.

  19. The NETL MFiX Suite of multiphase flow models: A brief review and recent applications of MFiX-TFM to fossil energy technologies

    DOE PAGES

    Li, Tingwen; Rogers, William A.; Syamlal, Madhava; ...

    2016-07-29

    Here, the MFiX suite of multiphase computational fluid dynamics (CFD) codes is being developed at the U.S. Department of Energy's National Energy Technology Laboratory (NETL). It includes several different approaches to multiphase simulation: MFiX-TFM, a two-fluid (Eulerian–Eulerian) model; MFiX-DEM, an Eulerian fluid model with a Lagrangian Discrete Element Model for the solids phase; and MFiX-PIC, an Eulerian fluid model with Lagrangian particle ‘parcels’ representing particle groups. These models are undergoing continuous development and application, with verification, validation, and uncertainty quantification (VV&UQ) as integrated activities. After a brief summary of recent progress in VV&UQ, this article highlights two recent accomplishments in the application of MFiX-TFM to fossil energy technology development. First, recent application of MFiX to the pilot-scale KBR TRIG™ Transport Gasifier located at DOE's National Carbon Capture Center (NCCC) is described. Gasifier performance over a range of operating conditions was modeled and compared to NCCC operational data to validate the ability of the model to predict parametric behavior. Second, comparison of code predictions at a detailed fundamental scale is presented studying solid sorbents for the post-combustion capture of CO2 from flue gas. Specifically designed NETL experiments are being used to validate hydrodynamics and chemical kinetics for the sorbent-based carbon capture process.

  20. Validation of an Automated Torsional and Warping Stress Analysis Program

    DTIC Science & Technology

    1992-08-19


  1. Techniques of Flow Visualization

    DTIC Science & Technology

    1987-12-01

    be they "classical". It includes a number of examples without which it could not be complete, which is why I thank all the authors and...two fluids is the same and not zero. The velocity u at this interface, u(y = h), is given by equation (2.1), where x is the streamwise coordinate in the plane of the wall, y is the coordinate normal to the wall, h is the oil film thickness, μ the viscosity, and p the pressure. Since the ratio .../UQ

  2. Stratospheric Turbulence and Vertical Effective Diffusion Coefficients

    DTIC Science & Technology

    1975-09-29

    AFCRL-TR-75-0519, Stratospheric Turbulence and Vertical Effective Diffusion Coefficients: ...that CAT plays a prominent role in vertical transport in the stratosphere. ...phenomenon. Thorpe himself refers (1973) to underwater K-H as "underwater CAT."

  3. Uncertainty Quantification and Statistical Engineering for Hypersonic Entry Applications

    NASA Technical Reports Server (NTRS)

    Cozmuta, Ioana

    2011-01-01

    NASA has invested significant resources in developing and validating a mathematical construct for TPS margin management: a) Tailorable for low/high reliability missions; b) Tailorable for ablative/reusable TPS; c) Uncertainty Quantification and Statistical Engineering are valuable tools not exploited enough; and d) strategies combining both Theoretical Tools and Experimental Methods need to be defined. The main purpose of this lecture is to give a flavor of where UQ and SE could contribute, in the hope that the broader community will work with us to improve in these areas.

  4. A Novel Approach to Mission-Level Engineering of Complex Systems of Systems: Addressing Integration and Interoperability Shortfalls by Interrogating the Interstitials

    DTIC Science & Technology

    2013-12-17

    allows the explicit inclusion of causality into the computations of the metrics (Held, 2008). This is important as many traditional component and...interactions and develop a Physical Space SRL to grade the SoS. Utilizing Li, Di, PS and BP we can ultimately assess the probability of realization...the aleatoric realm), identify sensitivities in the SoS and provide a mechanism to reduce risk. 5.3 Importance of UQ Because of the nature of all

  5. Department of Defense Data Model, Version 1, Fy 1998, Volume 3.

    DTIC Science & Technology

    1998-05-31


  6. Short Duration Reduced Gravity Drop Tower Design and Development

    NASA Astrophysics Data System (ADS)

    Osborne, B.; Welch, C.

    The industrial and commercial development of space-related activities is intimately linked to the ability to conduct reduced gravity research. Reduced gravity experimentation is important to many diverse fields of research in the understanding of fundamental and applied aspects of physical phenomena. Both terrestrial and extra-terrestrial experimental facilities are currently available to allow researchers access to reduced gravity environments. This paper discusses two drop tower designs, a 2.0 second facility built in Australia and a proposed 2.2 second facility in the United Kingdom. Both drop towers utilise a drag shield for isolating the falling experiment from the drag forces of the air during the test. The design and development of The University of Queensland's (Australia) 2.0 second drop tower, including its specifications and operational procedures, is discussed first. Sensitive aspects of the design process are examined. Future plans are then presented for a new short duration (2.2 sec) ground-based reduced gravity drop tower. The new drop tower has been designed for Kingston University (United Kingdom) to support teaching and research in the field of reduced gravity physics. The design has been informed by the previous UQ drop tower design process and utilises a catapult mechanism to increase test time, and also incorporates features to allow participants from a variety of backgrounds (from high school students through to university researchers) to learn and experiment in reduced gravity. Operational performance expectations for this new facility are also discussed.

  7. Fast-SNP: a fast matrix pre-processing algorithm for efficient loopless flux optimization of metabolic models

    PubMed Central

    Saa, Pedro A.; Nielsen, Lars K.

    2016-01-01

    Motivation: Computation of steady-state flux solutions in large metabolic models is routinely performed using flux balance analysis based on a simple LP (Linear Programming) formulation. A minimal requirement for thermodynamic feasibility of the flux solution is the absence of internal loops, which are enforced using ‘loopless constraints’. The resulting loopless flux problem is a substantially harder MILP (Mixed Integer Linear Programming) problem, which is computationally expensive for large metabolic models. Results: We developed a pre-processing algorithm that significantly reduces the size of the original loopless problem into an easier and equivalent MILP problem. The pre-processing step employs a fast matrix sparsification algorithm, Fast-SNP (fast sparse null-space pursuit), inspired by recent results on sparse null-space pursuit. By finding a reduced feasible ‘loop-law’ matrix subject to known directionalities, Fast-SNP considerably improves the computational efficiency in several metabolic models running different loopless optimization problems. Furthermore, analysis of the topology encoded in the reduced loop matrix enabled identification of key directional constraints for the potential permanent elimination of infeasible loops in the underlying model. Overall, Fast-SNP is an effective and simple algorithm for efficient formulation of loop-law constraints, making loopless flux optimization feasible and numerically tractable at large scale. Availability and Implementation: Source code for MATLAB including examples is freely available for download at http://www.aibn.uq.edu.au/cssb-resources under Software. Optimization uses Gurobi, CPLEX or GLPK (the latter is included with the algorithm). Contact: lars.nielsen@uq.edu.au Supplementary information: Supplementary data are available at Bioinformatics online. PMID:27559155
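
    The object that Fast-SNP sparsifies, the 'loop-law' matrix, is a basis of the null space of the internal-reaction stoichiometric matrix. The sketch below computes a (dense, unreduced) basis for a tiny invented network; it illustrates the loop-law idea only and is not the Fast-SNP algorithm or its MATLAB implementation.

        import numpy as np
        from scipy.linalg import null_space

        # Toy internal stoichiometric matrix (rows = metabolites, columns = reactions);
        # reactions 1-3 form a cycle, reaction 4 is loop-free. Values are invented.
        S_int = np.array([
            [ 1, -1,  0,  0],
            [ 0,  1, -1,  0],
            [-1,  0,  1,  0],
            [ 0,  0,  0,  1],
        ], dtype=float)

        # A basis N of null(S_int) encodes every internal (thermodynamically
        # infeasible) loop; loopless constraints force the flux sign pattern to be
        # incompatible with all of them. Fast-SNP's contribution is a sparse,
        # reduced version of this matrix.
        N = null_space(S_int)
        print("loop-law matrix shape:", N.shape)
        print(np.round(N, 3))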

  8. VAVUQ, Python and Matlab freeware for Verification and Validation, Uncertainty Quantification

    NASA Astrophysics Data System (ADS)

    Courtney, J. E.; Zamani, K.; Bombardelli, F. A.; Fleenor, W. E.

    2015-12-01

    A package of scripts is presented for automated Verification and Validation (V&V) and Uncertainty Quantification (UQ) for engineering codes that approximate Partial Differential Equations (PDEs). The code post-processes model results to produce V&V and UQ information. This information can be used to assess model performance. Automated information on code performance can allow for a systematic methodology to assess the quality of model approximations. The software implements common and accepted code verification schemes. The software uses the Method of Manufactured Solutions (MMS), the Method of Exact Solution (MES), Cross-Code Verification, and Richardson Extrapolation (RE) for solution (calculation) verification. It also includes common statistical measures that can be used for model skill assessment. Complete RE can be conducted for complex geometries by implementing high-order non-oscillating numerical interpolation schemes within the software. Model approximation uncertainty is quantified by calculating lower and upper bounds of numerical error from the RE results. The software is also able to calculate the Grid Convergence Index (GCI), and to handle adaptive meshes and models that implement mixed order schemes. Four examples are provided to demonstrate the use of the software for code and solution verification, model validation and uncertainty quantification. The software is used for code verification of a mixed-order compact difference heat transport solver; the solution verification of a 2D shallow-water-wave solver for tidal flow modeling in estuaries; the model validation of a two-phase flow computation in a hydraulic jump compared to experimental data; and numerical uncertainty quantification for 3D CFD modeling of the flow patterns in a Gust erosion chamber.
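
    As a concrete illustration of the solution-verification machinery mentioned above, the sketch below applies Richardson Extrapolation and the Grid Convergence Index to three made-up grid solutions of a single scalar output; it is a textbook RE/GCI calculation, not code from VAVUQ.

        import numpy as np

        # One scalar output on fine, medium and coarse grids with a constant
        # refinement ratio r; the numbers are invented for illustration.
        f1, f2, f3 = 1.9712, 1.9680, 1.9550
        r = 2.0

        # Observed order of accuracy and Richardson-extrapolated value.
        p = np.log(abs(f3 - f2) / abs(f2 - f1)) / np.log(r)
        f_exact = f1 + (f1 - f2) / (r**p - 1.0)

        # Grid Convergence Index on the fine grid (factor of safety 1.25).
        gci_fine = 1.25 * abs((f1 - f2) / f1) / (r**p - 1.0)

        print(f"observed order p ~ {p:.2f}")
        print(f"extrapolated value ~ {f_exact:.4f}")
        print(f"GCI (fine grid) ~ {100 * gci_fine:.3f} %")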

  9. Geoscience in the Big Data Era: Are models obsolete?

    NASA Astrophysics Data System (ADS)

    Yuen, D. A.; Zheng, L.; Stark, P. B.; Morra, G.; Knepley, M.; Wang, X.

    2016-12-01

    In the last few decades, the velocity, volume, and variety of geophysical data have increased, while the development of the Internet and distributed computing has led to the emergence of "data science." Fitting and running numerical models, especially based on PDEs, is the main consumer of flops in geoscience. Can large amounts of diverse data supplant modeling? Without the ability to conduct randomized, controlled experiments, causal inference requires understanding the physics. It is sometimes possible to predict well without understanding the system—if (1) the system is predictable, (2) data on "important" variables are available, and (3) the system changes slowly enough. And sometimes even a crude model can help the data "speak for themselves" much more clearly. For example, Shearer (1991) used a 1-dimensional velocity model to stack long-period seismograms, revealing upper mantle discontinuities. This was a "big data" approach: the main use of computing was in the data processing, rather than in modeling, yet the "signal" became clear. In contrast, modelers tend to use all available computing power to fit even more complex models, resulting in a cycle where uncertainty quantification (UQ) is never possible: even if realistic UQ required only 1,000 model evaluations, it is never in reach. To make more reliable inferences requires better data analysis and statistics, not more complex models. Geoscientists need to learn new skills and tools: sound software engineering practices; open programming languages suitable for big data; parallel and distributed computing; data visualization; and basic nonparametric, computationally based statistical inference, such as permutation tests. They should work reproducibly, scripting all analyses and avoiding point-and-click tools.
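
    One of the computationally based statistical tools recommended above, the permutation test, fits in a few lines; the two samples below are invented and simply stand in for any pair of geophysical measurement groups.

        import numpy as np

        rng = np.random.default_rng(0)

        # Two small samples of some measured quantity (illustrative values).
        a = np.array([2.3, 2.9, 3.1, 2.7, 3.4])
        b = np.array([2.0, 2.4, 2.6, 2.2, 2.5])
        observed = a.mean() - b.mean()

        # Under the null hypothesis the group labels are exchangeable, so the
        # statistic is recomputed over random relabelings of the pooled data.
        pooled = np.concatenate([a, b])
        n_perm, count = 10_000, 0
        for _ in range(n_perm):
            rng.shuffle(pooled)
            count += (pooled[:a.size].mean() - pooled[a.size:].mean()) >= observed
        p_value = (count + 1) / (n_perm + 1)
        print(f"observed difference {observed:.2f}, one-sided p ~ {p_value:.3f}")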

  10. Π4U: A high performance computing framework for Bayesian uncertainty quantification of complex models

    NASA Astrophysics Data System (ADS)

    Hadjidoukas, P. E.; Angelikopoulos, P.; Papadimitriou, C.; Koumoutsakos, P.

    2015-03-01

    We present Π4U, an extensible framework for non-intrusive Bayesian Uncertainty Quantification and Propagation (UQ+P) of complex and computationally demanding physical models that can exploit massively parallel computer architectures. The framework incorporates Laplace asymptotic approximations as well as stochastic algorithms, along with distributed numerical differentiation and task-based parallelism for heterogeneous clusters. Sampling is based on the Transitional Markov Chain Monte Carlo (TMCMC) algorithm and its variants. The optimization tasks associated with the asymptotic approximations are treated via the Covariance Matrix Adaptation Evolution Strategy (CMA-ES). A modified subset simulation method is used for posterior reliability measurements of rare events. The framework accommodates scheduling of multiple physical model evaluations based on an adaptive load balancing library and shows excellent scalability. In addition to the software framework, we also provide guidelines as to the applicability and efficiency of Bayesian tools when applied to computationally demanding physical models. Theoretical and computational developments are demonstrated with applications drawn from molecular dynamics, structural dynamics and granular flow.
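
    Of the ingredients listed above, the Laplace asymptotic approximation is the simplest to sketch: approximate the posterior by a Gaussian centered at the MAP estimate, with covariance given by the inverse Hessian of the negative log-posterior (here obtained by finite differences). The linear toy model and synthetic data below are assumptions for illustration; this is not the Π4U implementation, its TMCMC sampler, or its CMA-ES optimizer.

        import numpy as np
        from scipy.optimize import minimize

        # Toy model y = theta0 + theta1 * x with Gaussian noise (synthetic data).
        rng = np.random.default_rng(1)
        x = np.linspace(0.0, 1.0, 20)
        y = 1.0 + 2.0 * x + 0.1 * rng.standard_normal(x.size)
        sigma = 0.1

        def neg_log_post(theta):
            # Flat prior, so the negative log-posterior is just the data misfit.
            resid = y - (theta[0] + theta[1] * x)
            return 0.5 * np.sum((resid / sigma) ** 2)

        theta_map = minimize(neg_log_post, x0=np.zeros(2)).x   # MAP estimate

        # Finite-difference Hessian at the MAP; its inverse is the Laplace covariance.
        eps, H = 1e-4, np.zeros((2, 2))
        for i in range(2):
            for j in range(2):
                ei, ej = np.eye(2)[i] * eps, np.eye(2)[j] * eps
                H[i, j] = (neg_log_post(theta_map + ei + ej) - neg_log_post(theta_map + ei - ej)
                           - neg_log_post(theta_map - ei + ej) + neg_log_post(theta_map - ei - ej)) / (4 * eps**2)
        cov = np.linalg.inv(H)
        print("MAP:", np.round(theta_map, 3))
        print("posterior std (Laplace):", np.round(np.sqrt(np.diag(cov)), 3))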

  11. Scalable Methods for Uncertainty Quantification, Data Assimilation and Target Accuracy Assessment for Multi-Physics Advanced Simulation of Light Water Reactors

    NASA Astrophysics Data System (ADS)

    Khuwaileh, Bassam

    High fidelity simulation of nuclear reactors entails large scale applications characterized with high dimensionality and tremendous complexity where various physics models are integrated in the form of coupled models (e.g. neutronic with thermal-hydraulic feedback). Each of the coupled modules represents a high fidelity formulation of the first principles governing the physics of interest. Therefore, new developments in high fidelity multi-physics simulation and the corresponding sensitivity/uncertainty quantification analysis are paramount to the development and competitiveness of reactors achieved through enhanced understanding of the design and safety margins. Accordingly, this dissertation introduces efficient and scalable algorithms for performing efficient Uncertainty Quantification (UQ), Data Assimilation (DA) and Target Accuracy Assessment (TAA) for large scale, multi-physics reactor design and safety problems. This dissertation builds upon previous efforts for adaptive core simulation and reduced order modeling algorithms and extends these efforts towards coupled multi-physics models with feedback. The core idea is to recast the reactor physics analysis in terms of reduced order models. This can be achieved via identifying the important/influential degrees of freedom (DoF) via the subspace analysis, such that the required analysis can be recast by considering the important DoF only. In this dissertation, efficient algorithms for lower dimensional subspace construction have been developed for single physics and multi-physics applications with feedback. Then the reduced subspace is used to solve realistic, large scale forward (UQ) and inverse problems (DA and TAA). Once the elite set of DoF is determined, the uncertainty/sensitivity/target accuracy assessment and data assimilation analysis can be performed accurately and efficiently for large scale, high dimensional multi-physics nuclear engineering applications. Hence, in this work a Karhunen-Loeve (KL) based algorithm previously developed to quantify the uncertainty for single physics models is extended for large scale multi-physics coupled problems with feedback effect. Moreover, a non-linear surrogate based UQ approach is developed, used and compared to performance of the KL approach and brute force Monte Carlo (MC) approach. On the other hand, an efficient Data Assimilation (DA) algorithm is developed to assess information about model's parameters: nuclear data cross-sections and thermal-hydraulics parameters. Two improvements are introduced in order to perform DA on the high dimensional problems. First, a goal-oriented surrogate model can be used to replace the original models in the depletion sequence (MPACT -- COBRA-TF - ORIGEN). Second, approximating the complex and high dimensional solution space with a lower dimensional subspace makes the sampling process necessary for DA possible for high dimensional problems. Moreover, safety analysis and design optimization depend on the accurate prediction of various reactor attributes. Predictions can be enhanced by reducing the uncertainty associated with the attributes of interest. Accordingly, an inverse problem can be defined and solved to assess the contributions from sources of uncertainty; and experimental effort can be subsequently directed to further improve the uncertainty associated with these sources. 
In this dissertation, a subspace-based, gradient-free, nonlinear algorithm for inverse uncertainty quantification, namely the Target Accuracy Assessment (TAA), has been developed and tested. The ideas proposed in this dissertation were first validated using lattice physics applications simulated using the SCALE6.1 package (Pressurized Water Reactor (PWR) and Boiling Water Reactor (BWR) lattice models). Ultimately, the algorithms proposed here were applied to perform UQ and DA for assembly-level (CASL Progression Problem Number 6) and core-wide problems representing Watts Bar Nuclear 1 (WBN1) for cycle 1 of depletion (CASL Progression Problem Number 9), simulated using VERA-CS, which consists of several coupled multi-physics models. The analysis and algorithms developed in this dissertation were encoded and implemented in a newly developed toolkit, the Reduced Order Modeling based Uncertainty/Sensitivity Estimator (ROMUSE).
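
    The subspace construction at the heart of the dissertation can be caricatured with a Karhunen-Loeve / POD step: collect response snapshots over sampled parameters, take the SVD of the mean-centered snapshot matrix, and keep only the leading directions. The synthetic snapshot matrix below is an assumption standing in for coupled-physics outputs; the sketch is not the ROMUSE implementation.

        import numpy as np

        rng = np.random.default_rng(2)

        # Snapshot matrix: each column is one model response vector for one
        # sampled parameter set (synthetic, low-rank plus noise).
        n_state, n_samples = 500, 40
        modes = rng.standard_normal((n_state, 3))        # three "true" directions
        weights = rng.standard_normal((3, n_samples))
        snapshots = modes @ weights + 0.01 * rng.standard_normal((n_state, n_samples))

        # Karhunen-Loeve / POD basis from the SVD of the centered snapshots.
        mean = snapshots.mean(axis=1, keepdims=True)
        U, s, _ = np.linalg.svd(snapshots - mean, full_matrices=False)

        # Retain the smallest rank capturing 99.9% of the snapshot variance.
        energy = np.cumsum(s**2) / np.sum(s**2)
        rank = int(np.searchsorted(energy, 0.999)) + 1
        basis = U[:, :rank]
        print(f"reduced subspace dimension: {rank} (from {n_samples} snapshots)")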

  12. MFIX-DEM Phi: Performance and Capability Improvements Towards Industrial Grade Open-source DEM Framework with Integrated Uncertainty Quantification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    GEL, Aytekin; Jiao, Yang; Emady, Heather

    Two major challenges hinder the effective use and adoption of multiphase computational fluid dynamics tools by the industry. The first is the need for significant computational resources, which is inversely proportional to the accuracy of solutions due to the computational intensity of the algorithms. The second barrier is assessing the prediction credibility and confidence in the simulation results. In this project, a multi-tiered approach has been proposed under four broad activities to overcome these challenges while addressing all of the objectives outlined in FOA-0001238 through Phases 1 and 2 of the project. The present report consists of the results for only Phase 1, which was the funded performance period. From the start of the project, all of the objectives outlined in the FOA were addressed through four major activity tasks in an integrated and balanced fashion to improve adoption of the MFIX suite of solvers for industrial use. The first task aimed to improve the performance of MFIX-DEM, specifically targeting peak performance on Intel Xeon and Xeon Phi based systems, which are expected to be one of the primary high-performance computing platforms both affordable and available for industrial users in the next two to five years. However, due to a number of changes in the course of the project, the scope of the performance-improvement task was significantly reduced to avoid duplicate work. Hence, more emphasis was placed on the other three tasks as discussed below. The second task aimed at physical modeling enhancements through implementation of a polydispersity capability and validation of heat transfer models in MFIX. An extended verification and validation (V&V) study was performed for the new polydispersity feature implemented in MFIX-DEM for both granular and coupled gas-solid flows. The features of the polydispersity capability and results for an industrially relevant problem were disseminated through journal papers (one published and one under review at the time of writing of the final technical report). As part of the validation efforts, another industrially relevant problem of interest based on rotary drums was studied for several modes of heat transfer, and results were presented in conferences. The third task was aimed at an important and unique contribution of the project, which was to develop a unified uncertainty quantification framework by integrating MFIX-DEM with a graphical user interface (GUI) driven uncertainty quantification (UQ) engine, i.e., MFIX-GUI and PSUADE. The goal was to enable a user with only modest knowledge of statistics to effectively utilize the UQ framework offered with MFIX-DEM Phi to perform UQ analysis routinely. For Phase 1, a proof-of-concept demonstration of the proposed framework was completed and shared. Direct industry involvement was one of the key virtues of this project, which was performed through the fourth task. For this purpose, even at the proposal stage, the project team received strong interest in the proposed capabilities from two major corporations, which was further expanded throughout Phase 1, and a new collaboration with another major corporation from the chemical industry was also initiated. The level of interest received and the continued collaboration during Phase 1 clearly show the relevance and potential impact of the project for industrial users.

  13. Toward quantum-like modeling of financial processes

    NASA Astrophysics Data System (ADS)

    Choustova, Olga

    2007-05-01

    We apply methods of quantum mechanics for mathematical modeling of price dynamics at the financial market. We propose to describe behavioral financial factors (e.g., expectations of traders) by using the pilot wave (Bohmian) model of quantum mechanics. Trajectories of prices are determined by two financial potentials: classical-like V(q) ("hard" market conditions, e.g., natural resources) and quantum-like U(q) (behavioral market conditions). On the one hand, our Bohmian model is a quantum-like model for the financial market, cf. with works of W. Segal, I. E. Segal, E. Haven, E. W. Piotrowski, J. Sladkowski. On the other hand, (since Bohmian mechanics provides the possibility to describe individual price trajectories) it belongs to the domain of extended research on deterministic dynamics for financial assets (C.W.J. Granger, W.A. Barnett, A. J. Benhabib, W.A. Brock, C. Sayers, J. Y. Campbell, A. W. Lo, A. C. MacKinlay, A. Serletis, S. Kuchta, M. Frank, R. Gencay, T. Stengos, M. J. Hinich, D. Patterson, D. A. Hsieh, D. T. Caplan, J.A. Scheinkman, B. LeBaron and many others).
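
    For readers unfamiliar with the pilot-wave construction invoked here, the standard Bohmian equations (textbook form, with generic notation q for the price variable; not transcribed from the paper) are:

        i\hbar\,\frac{\partial \psi}{\partial t}
            = -\frac{\hbar^{2}}{2m}\frac{\partial^{2}\psi}{\partial q^{2}} + V(q)\,\psi,
        \qquad \psi = R\,e^{iS/\hbar},

        m\,\frac{d^{2}q}{dt^{2}} = -\frac{\partial}{\partial q}\bigl[V(q) + U(q)\bigr],
        \qquad U(q) = -\frac{\hbar^{2}}{2m}\,\frac{1}{R}\,\frac{\partial^{2}R}{\partial q^{2}},

    so that the "hard" potential V(q) and the wave-function-derived ("behavioral") potential U(q) both drive the price trajectory q(t).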

  14. Scientific Discovery through Advanced Computing (SciDAC-3) Partnership Project Annual Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hoffman, Forest M.; Bochev, Pavel B.; Cameron-Smith, Philip J..

    The Applying Computationally Efficient Schemes for BioGeochemical Cycles (ACES4BGC) Project is advancing the predictive capabilities of Earth System Models (ESMs) by reducing two of the largest sources of uncertainty, aerosols and biospheric feedbacks, with a highly efficient computational approach. In particular, this project is implementing and optimizing new computationally efficient tracer advection algorithms for large numbers of tracer species; adding important biogeochemical interactions between the atmosphere, land, and ocean models; and applying uncertainty quantification (UQ) techniques to constrain process parameters and evaluate uncertainties in feedbacks between biogeochemical cycles and the climate system.

  15. An Examination of Hypercube Implementations of Genetic Algorithms

    DTIC Science & Technology

    1992-03-01

    ...if the string is of length n and the building-block size is r, all combinations of the n loci taken r at a time must be generated. The cardinality of the size...

  16. Colored knot polynomials for arbitrary pretzel knots and links

    DOE PAGES

    Galakhov, D.; Melnikov, D.; Mironov, A.; ...

    2015-04-01

    A very simple expression is conjectured for arbitrary colored Jones and HOMFLY polynomials of a rich (g+1)-parametric family of pretzel knots and links. The answer for the Jones and HOMFLY polynomials is fully and explicitly expressed through the Racah matrix of Uq(SU_N), and looks related to a modular transformation of the toric conformal block. Knot polynomials are among the hottest topics in modern theory. They are supposed to summarize nicely the representation theory of quantum algebras and modular properties of conformal blocks. The result reported in the present letter provides a spectacular illustration of and support for this general expectation.

  17. Numerical prediction of a draft tube flow taking into account uncertain inlet conditions

    NASA Astrophysics Data System (ADS)

    Brugiere, O.; Balarac, G.; Corre, C.; Metais, O.; Flores, E.; Pleroy

    2012-11-01

    The swirling turbulent flow in a hydroturbine draft tube is computed with a non-intrusive uncertainty quantification (UQ) method coupled to Reynolds-Averaged Navier-Stokes (RANS) modelling in order to take into account, in the numerical prediction, the physical uncertainties in the inlet flow conditions. The proposed approach yields not only mean velocity fields to be compared with measured profiles, as is customary in Computational Fluid Dynamics (CFD) practice, but also the variance of these quantities, from which error bars can be deduced on the computed profiles, thus making the comparison between experiment and computation more meaningful.
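
    The non-intrusive idea described above reduces, in its simplest Monte Carlo form, to sampling the uncertain inlet quantities, running the deterministic solver once per sample, and reporting the mean and standard deviation of each output profile. The toy profile function and the assumed inlet distributions below merely stand in for the RANS draft-tube computation.

        import numpy as np

        rng = np.random.default_rng(3)

        def toy_model(swirl, flow_rate, y):
            # Stand-in for the draft-tube RANS solver: an axial-velocity profile
            # depending on two uncertain inlet quantities (purely illustrative).
            return flow_rate * (1.0 - y**2) + 0.2 * swirl * np.sin(np.pi * y)

        y = np.linspace(-1.0, 1.0, 50)                   # radial stations
        n_samples = 200
        swirl = rng.normal(0.30, 0.05, n_samples)        # uncertain inlet swirl
        flow_rate = rng.normal(1.00, 0.03, n_samples)    # uncertain inlet flow rate

        profiles = np.array([toy_model(s, q, y) for s, q in zip(swirl, flow_rate)])
        mean_profile = profiles.mean(axis=0)
        std_profile = profiles.std(axis=0, ddof=1)       # basis for error bars
        print("max standard deviation across the profile:", round(std_profile.max(), 4))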

  18. SCRAM: a pipeline for fast index-free small RNA read alignment and visualization.

    PubMed

    Fletcher, Stephen J; Boden, Mikael; Mitter, Neena; Carroll, Bernard J

    2018-03-15

    Small RNAs play key roles in gene regulation, defense against viral pathogens and maintenance of genome stability, though many aspects of their biogenesis and function remain to be elucidated. SCRAM (Small Complementary RNA Mapper) is a novel, simple-to-use short read aligner and visualization suite that enhances exploration of small RNA datasets. The SCRAM pipeline is implemented in Go and Python, and is freely available under MIT license. Source code, multiplatform binaries and a Docker image can be accessed via https://sfletc.github.io/scram/. Contact: s.fletcher@uq.edu.au. Supplementary data are available at Bioinformatics online.

  19. Medicine Clerkship Implementation in a Hospitalist Group: Curricular Innovation and Review

    PubMed Central

    Carter, William J.

    2016-01-01

    Background: In 2008, the Department of Hospital Medicine at Ochsner Clinic Foundation in New Orleans, LA, began training its own students for the first time as a result of the partnership between our institution and the University of Queensland (UQ) in Brisbane, Australia, that established a global medical school. The Department of Hospital Medicine is responsible for the Medicine clerkship for third-year medical students. We have 5 resident teams at the main hospital in the system, but the majority of our hospitalists work alone. Because of staffing issues, we have had to change our mentality from having teaching hospitalists and nonteaching hospitalists to viewing all hospitalists as potential educators. Methods: The department has slowly increased the number of students in the Medicine clerkship each year with the goal of training 120 third-year students in the New Orleans area in 2016. The students in the Medicine clerkship will be divided into five 8-week rotations, allowing for 25 students to be trained at one time. Results: The UQ curriculum is similar to that of most 4-year American schools, but some differences in methods, such as a heavy emphasis on bedside instruction and oral summative assessments, are novel to us. These differences have provided our department with new goals for professional and instructor development. For the actual instruction, we pair students one on one with hospitalists and also assign them to resident teams. Student placement has been a challenge, but we are making improvements as we gain experience and explore opportunities for placement at our community hospitals. Conclusion: Our arrangement may be adapted to other institutions in the future as the number of students increases and the availability of resident teachers becomes more difficult nationwide. PMID:27046406

  20. Uncertainties in the Antarctic Ice Sheet Contribution to Sea Level Rise: Exploration of Model Response to Errors in Climate Forcing, Boundary Conditions, and Internal Parameters

    NASA Astrophysics Data System (ADS)

    Schlegel, N.; Seroussi, H. L.; Boening, C.; Larour, E. Y.; Limonadi, D.; Schodlok, M.; Watkins, M. M.

    2017-12-01

    The Jet Propulsion Laboratory-University of California at Irvine Ice Sheet System Model (ISSM) is a thermo-mechanical 2D/3D parallelized finite element software used to physically model the continental-scale flow of ice at high resolutions. Embedded into ISSM are uncertainty quantification (UQ) tools, based on the Design Analysis Kit for Optimization and Terascale Applications (DAKOTA) software. ISSM-DAKOTA offers various UQ methods for the investigation of how errors in model input impact uncertainty in simulation results. We utilize these tools to regionally sample model input and key parameters, based on specified bounds of uncertainty, and run a suite of continental-scale 100-year ISSM forward simulations of the Antarctic Ice Sheet. Resulting diagnostics (e.g., spread in local mass flux and regional mass balance) inform our conclusion about which parameters and/or forcing has the greatest impact on century-scale model simulations of ice sheet evolution. The results allow us to prioritize the key datasets and measurements that are critical for the minimization of ice sheet model uncertainty. Overall, we find that Antarctica's total sea level contribution is strongly affected by grounding line retreat, which is driven by the magnitude of ice shelf basal melt rates and by errors in bedrock topography. In addition, results suggest that after 100 years of simulation, Thwaites glacier is the most significant source of model uncertainty, and its drainage basin has the largest potential for future sea level contribution. This work is performed at and supported by the California Institute of Technology's Jet Propulsion Laboratory. Supercomputing time is also supported through a contract with the National Aeronautics and Space Administration's Cryosphere program.
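
    In the same spirit as the DAKOTA-based sampling described above, the sketch below draws a small Latin hypercube over assumed bounds for three illustrative inputs and summarizes the spread of a toy mass-balance diagnostic. The parameter names, bounds and response function are invented; this is not ISSM-DAKOTA.

        import numpy as np
        from scipy.stats import qmc

        # Assumed bounds: basal melt-rate factor, basal friction factor, bedrock error (m).
        l_bounds = [0.5, 0.8, -50.0]
        u_bounds = [2.0, 1.2, 50.0]

        sampler = qmc.LatinHypercube(d=3, seed=0)
        X = qmc.scale(sampler.random(n=64), l_bounds, u_bounds)

        def toy_mass_balance(melt, friction, bed_err):
            # Stand-in for a forward ice-sheet run returning regional mass balance (Gt/yr).
            return -120.0 * melt + 40.0 * (friction - 1.0) - 0.3 * bed_err

        out = np.array([toy_mass_balance(*row) for row in X])
        print(f"mass-balance spread: mean {out.mean():.1f}, std {out.std(ddof=1):.1f} Gt/yr")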

  1. Study the effect of reservoir spatial heterogeneity on CO2 sequestration under an uncertainty quantification (UQ) software framework

    NASA Astrophysics Data System (ADS)

    Fang, Y.; Hou, J.; Engel, D.; Lin, G.; Yin, J.; Han, B.; Fang, Z.; Fountoulakis, V.

    2011-12-01

    In this study, we introduce an uncertainty quantification (UQ) software framework for carbon sequestration, with the focus being the effect of spatial heterogeneity of reservoir properties on CO2 migration. We use a sequential Gaussian simulation method (SGSIM) to generate realizations of permeability fields with various spatial statistical attributes. To deal with the computational difficulties, we integrate the following ideas/approaches: 1) firstly, we use three different sampling approaches (probabilistic collocation, quasi-Monte Carlo, and adaptive sampling approaches) to reduce the required forward calculations while trying to explore the parameter space and quantify the input uncertainty; 2) secondly, we use eSTOMP as the forward modeling simulator. eSTOMP is implemented using the Global Arrays toolkit (GA) that is based on one-sided inter-processor communication and supports a shared memory programming style on distributed memory platforms. It provides highly-scalable performance. It uses a data model to partition most of the large scale data structures into a relatively small number of distinct classes. The lower level simulator infrastructure (e.g. meshing support, associated data structures, and data mapping to processors) is separated from the higher level physics and chemistry algorithmic routines using a grid component interface; and 3) besides the faster model and more efficient algorithms to speed up the forward calculation, we built an adaptive system infrastructure to select the best possible data transfer mechanisms, to optimally allocate system resources to improve performance, and to integrate software packages and data for composing carbon sequestration simulation, computation, analysis, estimation and visualization. We will demonstrate the framework with a given CO2 injection scenario in a heterogeneous sandstone reservoir.
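
    The heterogeneous permeability fields mentioned above are generated with SGSIM in the study; a much simpler covariance-based stand-in, shown below, draws one correlated log-permeability realization on a 1-D transect from an exponential covariance via a Cholesky factor. The mean, variance and correlation length are assumed values, not statistics from the study.

        import numpy as np

        rng = np.random.default_rng(4)

        # 1-D transect of the reservoir and an exponential covariance model.
        x = np.linspace(0.0, 1000.0, 200)                   # metres
        mean_logk, var_logk, corr_len = -13.0, 1.0, 150.0   # assumed log10(k) statistics

        dist = np.abs(x[:, None] - x[None, :])
        C = var_logk * np.exp(-dist / corr_len)

        # One realization: mean + L z with C = L L^T (a covariance-based analogue
        # of sequential Gaussian simulation, not SGSIM itself).
        L = np.linalg.cholesky(C + 1e-10 * np.eye(x.size))
        logk = mean_logk + L @ rng.standard_normal(x.size)
        print("realization range of log10(k):", round(logk.min(), 2), "to", round(logk.max(), 2))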

  2. Predictors of non-return to work 2 years post-injury in road traffic crash survivors: Results from the UQ SuPPORT study.

    PubMed

    Heron-Delaney, Michelle; Warren, Jacelle; Kenardy, Justin A

    2017-06-01

    Individuals who have sustained an injury from a road traffic crash (RTC) are at increased risk for long lasting health problems and non-return to work (NRTW). Determining the predictors of NRTW is necessary to develop screening tools to identify at-risk individuals and to provide early targeted intervention for successful return to work (RTW). The aim of this study was to identify factors that can predict which individuals will not RTW following minor or moderate injuries sustained from a RTC. Participants were 194 claimants (63.4% female) within a common-law "fault-based" system from the UQ SuPPORT cohort who were working prior to their RTC. Participants were assessed at 6 months on a variety of physical and mental health measures and RTW status was determined at 2 years post-RTC. RTW rate was 78.4%. Univariate predictors of NRTW included being the driver or passenger, having a prior psychiatric diagnosis, high disability level, low mental or physical quality of life, predicted non-recovery, high pain, low function, high expectations of pain persistency, low expectations about RTW, having a psychiatric diagnosis, elevated depression or anxiety. The final multivariable logistic regression model included only two variables: disability level and expectations about RTW. Seventy-five percent of individuals who will not RTW by 2 years can be identified accurately at an early stage, using only these two predictors. The results are promising, because they suggest that having information about two factors, which are easily obtainable, can predict with accuracy those who will require additional support to facilitate RTW. Copyright © 2017 Elsevier Ltd. All rights reserved.
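
    The two-variable logistic model retained in the study can be mimicked with an ordinary scikit-learn fit; the synthetic predictors, effect sizes and outcome below are invented stand-ins (not the UQ SuPPORT data), intended only to show the shape of such a screening model.

        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.metrics import accuracy_score

        rng = np.random.default_rng(5)
        n = 194

        # Synthetic stand-ins for the two retained predictors.
        disability = rng.normal(20.0, 10.0, n)      # e.g. a 0-100 disability score
        rtw_expect = rng.normal(3.0, 1.0, n)        # expectation of returning to work

        # Synthetic outcome: higher disability and lower expectations raise NRTW risk.
        logit = -3.0 + 0.08 * disability - 0.6 * (rtw_expect - 3.0)
        nrtw = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))

        X = np.column_stack([disability, rtw_expect])
        model = LogisticRegression().fit(X, nrtw)
        print("coefficients:", np.round(model.coef_[0], 3))
        print("in-sample accuracy:", round(accuracy_score(nrtw, model.predict(X)), 2))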

  3. Formation Of Amino Acids And Nucleotide Bases In A Titan Atmosphere Simulation Experiment

    NASA Astrophysics Data System (ADS)

    Horst, Sarah; Yelle, R. V.; Buch, A.; Carrasco, N.; Cernogora, G.; Dutuit, O.; Quirico, E.; Sciamma-O'Brien, E.; Smith, M. A.; Somogyi, A.; Szopa, C.; Thissen, R.; Vuitton, V.

    2010-10-01

    Titan has been a subject of astrobiological interest since the Voyager spacecraft first revealed the diversity of the organic chemistry occurring in the atmosphere. However, it was not until the arrival of Cassini-Huygens that the chemical complexity of Titan's atmosphere was fully appreciated. The Cassini Plasma Spectrometer (CAPS) observed negative ions with m/z values up to 10,000 u/q at 950 km [1] and positive ions with m/z up to 400 u/q [2]. CAPS has also observed O+ flowing into Titan's atmosphere [3]. While Titan's atmosphere is relatively oxygen poor compared to terrestrial planets, CO is the fourth most abundant molecule in the atmosphere (˜50 ppm). The fact that the observed O+ flux is deposited in the region now known to contain large organic molecules leads to the exciting possibility that oxygen can be incorporated into these molecules resulting in the production of prebiotic molecules. In this work, Titan aerosol analogues (or "tholins") produced in PAMPRE, a Titan atmosphere simulation experiment, have been analyzed in a very high resolution LTQ Orbitrap mass spectrometer. These PAMPRE tholins were produced by capacitively coupled RF discharge in a mixture of N2, CH4 and CO. The tholins were found to contain 18 molecules with molecular formulae corresponding to biological amino acids and nucleotide bases. GC-MS measurements have confirmed the structure of seven: adenine, cytosine, uracil, thymine, guanine, glycine and alanine. The production of prebiotic molecules under atmospheric conditions presents a new source of prebiotic material and may increase the range of planets where life could begin. [1] Coates AJ, et al. (2007). Geophys. Res. Lett. 34:22103- +. [2] Crary FJ, et al. (2009). Planet. Space Sci. 57:1847- 1856. [3] Hartle RE, et al. (2006). Geophys. Res. Lett. 33:8201-+.

  4. Experimental investigation of turbulent diffusion of slightly buoyant droplets in locally isotropic turbulence

    NASA Astrophysics Data System (ADS)

    Gopalan, Balaji; Malkiel, Edwin; Katz, Joseph

    2008-09-01

    High-speed inline digital holographic cinematography is used for studying turbulent diffusion of slightly buoyant 0.5-1.2 mm diameter diesel droplets and 50 μm diameter neutral density particles. Experiments are performed in a 50×50×70 mm3 sample volume in a controlled, nearly isotropic turbulence facility, which is characterized by two dimensional particle image velocimetry. An automated tracking program has been used for measuring velocity time history of more than 17 000 droplets and 15 000 particles. For most of the present conditions, rms values of horizontal droplet velocity exceed those of the fluid. The rms values of droplet vertical velocity are higher than those of the fluid only for the highest turbulence level. The turbulent diffusion coefficient is calculated by integration of the ensemble-averaged Lagrangian velocity autocovariance. Trends of the asymptotic droplet diffusion coefficient are examined by noting that it can be viewed as a product of a mean square velocity and a diffusion time scale. To compare the effects of turbulence and buoyancy, the turbulence intensity (ui') is scaled by the droplet quiescent rise velocity (Uq). The droplet diffusion coefficients in horizontal and vertical directions are lower than those of the fluid at low normalized turbulence intensity, but exceed it with increasing normalized turbulence intensity. For most of the present conditions the droplet horizontal diffusion coefficient is higher than the vertical diffusion coefficient, consistent with trends of the droplet velocity fluctuations and in contrast to the trends of the diffusion timescales. The droplet diffusion coefficients scaled by the product of turbulence intensity and an integral length scale are a monotonically increasing function of ui'/Uq.
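
    The diffusion-coefficient estimate described above follows Taylor's relation, D = ∫ R(τ) dτ, with R the ensemble-averaged Lagrangian velocity autocovariance. The sketch below applies that recipe to synthetic, stationary velocity tracks (an AR(1) stand-in for the measured droplet tracks); the time scale and amplitudes are illustrative only.

        # Taylor (1921): D = integral of the ensemble-averaged Lagrangian velocity
        # autocovariance. Velocity tracks here are synthetic AR(1) fluctuations.
        import numpy as np

        dt, tau_L, u_rms = 1e-3, 0.05, 0.02        # illustrative values (s, s, m/s)
        rng = np.random.default_rng(1)
        n_tracks, n_steps = 500, 2000
        u = np.zeros((n_tracks, n_steps))
        rho = np.exp(-dt / tau_L)
        noise = rng.standard_normal((n_tracks, n_steps)) * u_rms * np.sqrt(1 - rho**2)
        for k in range(1, n_steps):
            u[:, k] = rho * u[:, k - 1] + noise[:, k]

        n_lags = 500
        R = np.array([(u[:, :n_steps - k] * u[:, k:]).mean() for k in range(n_lags)])
        D = R.sum() * dt                           # rectangle-rule integral, ~ u_rms**2 * tau_L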

  5. Efficient uncertainty quantification in fully-integrated surface and subsurface hydrologic simulations

    NASA Astrophysics Data System (ADS)

    Miller, K. L.; Berg, S. J.; Davison, J. H.; Sudicky, E. A.; Forsyth, P. A.

    2018-01-01

    Although high performance computers and advanced numerical methods have made the application of fully-integrated surface and subsurface flow and transport models such as HydroGeoSphere commonplace, run times for large complex basin models can still be on the order of days to weeks, thus limiting the usefulness of traditional workhorse algorithms for uncertainty quantification (UQ) such as Latin Hypercube simulation (LHS) or Monte Carlo simulation (MCS), which generally require thousands of simulations to achieve an acceptable level of accuracy. In this paper we investigate non-intrusive polynomial chaos expansion (PCE) for uncertainty quantification, which in contrast to random sampling methods (e.g., LHS and MCS) represents a model response of interest as a weighted sum of polynomials over the random inputs. Once a chaos expansion has been constructed, approximating the mean, covariance, probability density function, cumulative distribution function, and other common statistics as well as local and global sensitivity measures is straightforward and computationally inexpensive, thus making PCE an attractive UQ method for hydrologic models with long run times. Our polynomial chaos implementation was validated through comparison with analytical solutions as well as solutions obtained via LHS for simple numerical problems. It was then used to quantify parametric uncertainty in a series of numerical problems with increasing complexity, including a two-dimensional fully-saturated, steady flow and transient transport problem with six uncertain parameters and one quantity of interest; a one-dimensional variably-saturated column test involving transient flow and transport, four uncertain parameters, and two quantities of interest at 101 spatial locations and five different times each (1010 total); and a three-dimensional fully-integrated surface and subsurface flow and transport problem for a small test catchment involving seven uncertain parameters and three quantities of interest at 241 different times each. Numerical experiments show that polynomial chaos is an effective and robust method for quantifying uncertainty in fully-integrated hydrologic simulations, which provides a rich set of features and is computationally efficient. Our approach has the potential for significant speedup over existing sampling-based methods when the number of uncertain model parameters is modest (≤ 20). To our knowledge, this is the first implementation of the algorithm in a comprehensive, fully-integrated, physically-based three-dimensional hydrosystem model.
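
    A minimal non-intrusive chaos expansion can be assembled with nothing more than an orthogonal polynomial basis and a least-squares fit. The sketch below does this for two uniform inputs on [-1, 1] with a total-degree-2 Legendre basis and a placeholder model; it illustrates the idea only and is not the implementation used in the paper.

        # Non-intrusive PCE by least squares: two uniform inputs, degree-2 Legendre basis.
        import numpy as np
        from numpy.polynomial import legendre

        def model(x1, x2):
            return 1.0 + 0.5 * x1 - 0.3 * x2 + 0.2 * x1 * x2   # placeholder "simulator"

        multi_indices = [(0, 0), (1, 0), (0, 1), (2, 0), (1, 1), (0, 2)]

        def basis_matrix(X):
            cols = []
            for i, j in multi_indices:
                Pi = legendre.legval(X[:, 0], np.eye(3)[i])     # Legendre P_i(x1)
                Pj = legendre.legval(X[:, 1], np.eye(3)[j])     # Legendre P_j(x2)
                cols.append(Pi * Pj)
            return np.column_stack(cols)

        rng = np.random.default_rng(0)
        X = rng.uniform(-1.0, 1.0, size=(200, 2))               # training samples
        y = model(X[:, 0], X[:, 1])
        coef, *_ = np.linalg.lstsq(basis_matrix(X), y, rcond=None)

        # Statistics follow directly from the coefficients and the basis norms.
        norms = np.array([1.0 / ((2*i + 1) * (2*j + 1)) for i, j in multi_indices])
        mean, variance = coef[0], np.sum(coef[1:] ** 2 * norms[1:])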

  6. Analysis of ISO NE Balancing Requirements: Uncertainty-based Secure Ranges for ISO New England Dynamic Interchange Adjustments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Etingov, Pavel V.; Makarov, Yuri V.; Wu, Di

    The document describes the detailed uncertainty quantification (UQ) methodology developed by PNNL to estimate secure ranges of potential dynamic intra-hour interchange adjustments in the ISO-NE system and provides a description of the dynamic interchange adjustment (DINA) tool developed under the same contract. The overall system ramping up and down capability, spinning reserve requirements, interchange schedules, load variations, and uncertainties from various sources that are relevant to the ISO-NE system are incorporated into the methodology and the tool. The DINA tool has been tested by PNNL and ISO-NE staff engineers using ISO-NE data.

  7. Hazards/Failure Modes and Effects Analysis MK 1 MOD 0 LSO-HUD Console System.

    DTIC Science & Technology

    1980-03-24

    [OCR-damaged scan; only fragments of the front matter are recoverable: a section on the scope and methodology of the analysis, Figure 1 showing the H/FMEA (SSA) worksheet format, and Appendix A containing the hazard/failure modes and effects analysis (H/FMEA) worksheets, beginning with the Heads-Up Display Console and Auxiliary units.]

  8. Extreme-Scale Bayesian Inference for Uncertainty Quantification of Complex Simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Biros, George

    Uncertainty quantification (UQ)—that is, quantifying uncertainties in complex mathematical models and their large-scale computational implementations—is widely viewed as one of the outstanding challenges facing the field of CS&E over the coming decade. The EUREKA project set out to address the most difficult class of UQ problems: those for which both the underlying PDE model as well as the uncertain parameters are of extreme scale. In the project we worked on these extreme-scale challenges in the following four areas: 1. Scalable parallel algorithms for sampling and characterizing the posterior distribution that exploit the structure of the underlying PDEs and parameter-to-observable map. These include structure-exploiting versions of the randomized maximum likelihood method, which aims to overcome the intractability of employing conventional MCMC methods for solving extreme-scale Bayesian inversion problems by appealing to and adapting ideas from large-scale PDE-constrained optimization, which have been very successful at exploring high-dimensional spaces. 2. Scalable parallel algorithms for construction of prior and likelihood functions based on learning methods and non-parametric density estimation. Constructing problem-specific priors remains a critical challenge in Bayesian inference, and more so in high dimensions. Another challenge is construction of likelihood functions that capture unmodeled couplings between observations and parameters. We will create parallel algorithms for non-parametric density estimation using high dimensional N-body methods and combine them with supervised learning techniques for the construction of priors and likelihood functions. 3. Bayesian inadequacy models, which augment physics models with stochastic models that represent their imperfections. The success of the Bayesian inference framework depends on the ability to represent the uncertainty due to imperfections of the mathematical model of the phenomena of interest. This is a central challenge in UQ, especially for large-scale models. We propose to develop the mathematical tools to address these challenges in the context of extreme-scale problems. 4. Parallel scalable algorithms for Bayesian optimal experimental design (OED). Bayesian inversion yields quantified uncertainties in the model parameters, which can be propagated forward through the model to yield uncertainty in outputs of interest. This opens the way for designing new experiments to reduce the uncertainties in the model parameters and model predictions. Such experimental design problems have been intractable for large-scale problems using conventional methods; we will create OED algorithms that exploit the structure of the PDE model and the parameter-to-output map to overcome these challenges. Parallel algorithms for these four problems were created, analyzed, prototyped, implemented, tuned, and scaled up for leading-edge supercomputers, including UT-Austin's own 10 petaflops Stampede system, ANL's Mira system, and ORNL's Titan system. While our focus is on fundamental mathematical/computational methods and algorithms, we will assess our methods on model problems derived from several DOE mission applications, including multiscale mechanics and ice sheet dynamics.

  9. A Broadband Study of the Emission from the Composite Supernova Remnant MSH 11-62

    NASA Technical Reports Server (NTRS)

    Slane, Patrick; Hughes, John P.; Temim, Tea; Rousseau, Romain; Castro, Daniel; Foight, Dillon; Gaensler, B. M.; Funk, Stefan; Lemoine-Goumard, Marianne; Gelfand, Joseph D.; hide

    2012-01-01

    MSH 11-62 (G291.0-0.1) is a composite supernova remnant for which radio and X-ray observations have identified the remnant shell as well as its central pulsar wind nebula. The observations suggest a relatively young system expanding into a low-density region. Here, we present a study of MSH 11-62 using observations with the Chandra, XMM-Newton, and Fermi observatories, along with radio observations from the Australia Telescope Compact Array. We identify a compact X-ray source that appears to be the putative pulsar that powers the nebula, and show that the X-ray spectrum of the nebula bears the signature of synchrotron losses as particles diffuse into the outer nebula. Using data from the Fermi Large Area Telescope, we identify gamma-ray emission originating from MSH 11-62. With density constraints from the new X-ray measurements of the remnant, we model the evolution of the composite system in order to constrain the properties of the underlying pulsar and the origin of the gamma-ray emission.

  10. A greenhouse-gas information system monitoring and validating emissions reporting and mitigation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jonietz, Karl K; Dimotakis, Paul E; Roman, Douglas A

    2011-09-26

    Current GHG-mitigating regimes, whether internationally agreed or self-imposed, rely on the aggregation of self-reported data, with limited checks for consistency and accuracy, for monitoring. As nations commit to more stringent GHG emissions-mitigation actions and as economic rewards or penalties are attached to emission levels, self-reported data will require independent confirmation that they are accurate and reliable, if they are to provide the basis for critical choices and actions that may be required. Supporting emissions-mitigation efforts and agreements, as well as monitoring energy- and fossil-fuel intensive national and global activities, would be best achieved by a process of: (1) monitoring of emissions and emission-mitigation actions, based, in part, on (2) (self-) reporting of pertinent bottom-up inventory data, (3) verification that reported data derive from and are consistent with agreed-upon processes and procedures, and (4) validation that reported emissions and emissions-mitigation action data are correct, based on independent measurements (top-down) derived from a suite of sensors in space, air, land, and, possibly, sea, used to deduce and attribute anthropogenic emissions. These data would be assessed and used to deduce and attribute measured GHG concentrations to anthropogenic emissions, attributed geographically and, to the extent possible, by economic sector. The validation element is needed to provide independent assurance that emissions are in accord with reported values, and should be considered as an important addition to the accepted MRV process, leading to an MRV&V process. This study and report focus on attributes of a greenhouse-gas information system (GHGIS) needed to support MRV&V needs. These needs set the function of such a system apart from scientific/research monitoring of GHGs and carbon-cycle systems, and include (not exclusively): the need for a GHGIS that is operational, as required for decision-support; the need for a system that meets specifications derived from imposed requirements; the need for rigorous calibration, verification, and validation (CV&V) standards, processes, and records for all measurement and modeling/data-inversion data; the need to develop and adopt an uncertainty-quantification (UQ) regimen for all measurement and modeling data; and the requirement that GHGIS products can be subjected to third-party questioning and scientific scrutiny. This report examines and assesses presently available capabilities that could contribute to a future GHGIS. These capabilities include sensors and measurement technologies; data analysis and data uncertainty quantification (UQ) practices and methods; and model-based data-inversion practices, methods, and their associated UQ. The report further examines the need for traceable calibration, verification, and validation processes and attached metadata; differences between present science-/research-oriented needs and those that would be required for an operational GHGIS; the development, operation, and maintenance of a GHGIS missions-operations center (GMOC); and the complex systems engineering and integration that would be required to develop, operate, and evolve a future GHGIS. Present monitoring systems would be heavily relied on in any GHGIS implementation at the outset and would likely continue to provide valuable future contributions to GHGIS. However, present monitoring systems were developed to serve science/research purposes.
This study concludes that no component or capability presently available is at the level of technological maturity and readiness required for implementation in an operational GHGIS today. However, purpose-designed and -built components could be developed and implemented in support of a future GHGIS. The study concludes that it is possible to develop and provide a capability-driven prototype GHGIS, as part of a Phase-1 effort, within three years from project-funding start, that would make use of and integrate existing sensing and system capabilities. As part of a Phase-2 effort, a requirements-driven, operational GHGIS could be developed within ten years from project-funding start. That schedule is driven by the development and long lead-times for some system components. The two efforts would be focused on different deliverables but could commence concurrently, to save time, if that was deemed desirable. We note that developing and supporting an operational GHGIS will require a new approach and management, sustained funding and other support, as well as technical advances and development of purpose-built components that meet the requisite specifications. A functioning GHGIS will provide the basis for reasoned choices on how best to respond to rising GHG levels, especially when proposed U.S. actions are compared with or conditioned on the actions of other nations.

  11. ROBUST ONLINE MONITORING FOR CALIBRATION ASSESSMENT OF TRANSMITTERS AND INSTRUMENTATION

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ramuhalli, Pradeep; Tipireddy, Ramakrishna; Lerchen, Megan E.

    Robust online monitoring (OLM) technologies are expected to enable the extension or elimination of periodic sensor calibration intervals in operating and new reactors. Specifically, the next generation of OLM technology is expected to include newly developed advanced algorithms that improve monitoring of sensor/system performance and enable the use of plant data to derive information that currently cannot be measured. These advances in OLM technologies will improve the safety and reliability of current and planned nuclear power systems through improved accuracy and increased reliability of sensors used to monitor key parameters. In this paper, we discuss an overview of research being performed within the Nuclear Energy Enabling Technologies (NEET)/Advanced Sensors and Instrumentation (ASI) program, for the development of OLM algorithms to use sensor outputs and, in combination with other available information, 1) determine whether one or more sensors are out of calibration or failing and 2) replace a failing sensor with reliable, accurate sensor outputs. Algorithm development is focused on the following OLM functions: • Signal validation – fault detection and selection of acceptance criteria • Virtual sensing – signal value prediction and acceptance criteria • Response-time assessment – fault detection and acceptance criteria selection A GP-based uncertainty quantification (UQ) method, previously developed for UQ in OLM, was adapted for use in sensor-fault detection and virtual sensing. For signal validation, the various components of the OLM residual (which is computed using an AAKR model) were explicitly defined and modeled using a GP. Evaluation was conducted using flow loop data from multiple sources. Results using experimental data from laboratory-scale flow loops indicate that the approach, while capable of detecting sensor drift, may be incapable of discriminating between sensor drift and model inadequacy. This may be due to a simplification applied in the initial modeling, where the sensor degradation is assumed to be stationary. In the case of virtual sensors, the GP model was used in a predictive mode to estimate the correct sensor reading for sensors that may have failed. Results have indicated the viability of using this approach for virtual sensing. However, the GP model has proven to be computationally expensive, and so alternative algorithms for virtual sensing are being evaluated. Finally, automated approaches to performing noise analysis for extracting sensor response time were developed. Evaluation of this technique using laboratory-scale data indicates that it compares well with manual techniques previously used for noise analysis. Moreover, the automated and manual approaches for noise analysis also compare well with the current “gold standard”, hydraulic ramp testing, for response time monitoring. Ongoing research in this project is focused on further evaluation of the algorithms, optimization for accuracy and computational efficiency, and integration into a suite of tools for robust OLM that are applicable to monitoring sensor calibration state in nuclear power plants.
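
    For readers unfamiliar with the residual referred to above, a bare-bones auto-associative kernel regression (AAKR) estimate looks roughly like the sketch below; the memory data, bandwidth, and drift magnitude are synthetic placeholders rather than values from the flow-loop experiments.

        # Minimal AAKR: predict a "corrected" observation from fault-free memory data,
        # then form the residual that downstream fault-detection models operate on.
        import numpy as np

        def aakr_predict(memory, query, bandwidth=1.0):
            # memory: (n_obs, n_sensors) historical fault-free data; query: (n_sensors,)
            d2 = np.sum((memory - query) ** 2, axis=1)
            w = np.exp(-d2 / (2.0 * bandwidth ** 2))
            return (w / w.sum()) @ memory

        rng = np.random.default_rng(0)
        memory = rng.normal(size=(500, 4))                   # four correlated sensors (placeholder)
        query = memory[0] + np.array([0.5, 0.0, 0.0, 0.0])   # sensor 0 drifted by +0.5
        residual = query - aakr_predict(memory, query)       # large entry on sensor 0 flags drift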

  12. Report on UQ and PCMM Analysis of Vacuum Drying for UFD S&T Gaps

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    M. Fluss

    2015-08-31

    This report discusses two phenomena associated with the drying of canisters for dry spent fuel storage, radial hydriding during drying and water retention after drying, that could affect the safety, licensing, transportation, storage, and disposition of spent fuel storage casks and their contents. The report discusses modeling frameworks and evaluations that are, or have been, developed as a means to better understand these phenomena. Where applicable, the report also discusses data needs and procedures for monitoring or evaluating the condition of storage containers during and after drying. A recommendation for the manufacturing of a fully passivated fuel rod, resistant to oxidation and hydriding, is outlined.

  13. VERAIn

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Simunovic, Srdjan

    2015-02-16

    CASL's modeling and simulation technology, the Virtual Environment for Reactor Applications (VERA), incorporates coupled physics and science-based models, state-of-the-art numerical methods, modern computational science, integrated uncertainty quantification (UQ) and validation against data from operating pressurized water reactors (PWRs), single-effect experiments, and integral tests. The computational simulation component of VERA is the VERA Core Simulator (VERA-CS). The core simulator is the specific collection of multi-physics computer codes used to model and deplete an LWR core over multiple cycles. The core simulator has a single common input file that drives all of the different physics codes. The parser code, VERAIn, converts VERA Input into an XML file that is used as input to different VERA codes.

  14. Enabling Predictive Simulation and UQ of Complex Multiphysics PDE Systems by the Development of Goal-Oriented Variational Sensitivity Analysis and a-Posteriori Error Estimation Methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Estep, Donald

    2015-11-30

    This project addressed the challenge of predictive computational analysis of strongly coupled, highly nonlinear multiphysics systems characterized by multiple physical phenomena that span a large range of length- and time-scales. Specifically, the project was focused on computational estimation of numerical error and sensitivity analysis of computational solutions with respect to variations in parameters and data. In addition, the project investigated the use of accurate computational estimates to guide efficient adaptive discretization. The project developed, analyzed and evaluated new variational adjoint-based techniques for integration, model, and data error estimation/control and sensitivity analysis, in evolutionary multiphysics multiscale simulations.

  15. Functional role of coenzyme Q in the energy coupling of NADH-CoQ oxidoreductase (Complex I): stabilization of the semiquinone state with the application of inside-positive membrane potential to proteoliposomes.

    PubMed

    Ohnishi, Tomoko; Ohnishi, S Tsuyoshi; Shinzawa-Ito, Kyoko; Yoshikawa, Shinya

    2008-01-01

    Coenzyme Q10 (which is also designated as CoQ10, ubiquinone-10, UQ10, CoQ, UQ or simply as Q) plays an important role in energy metabolism. For NADH-Q oxidoreductase (complex I), Ohnishi and Salerno proposed a hypothesis that the proton pump is operated by the redox-driven conformational change of a Q-binding protein, and that the bound form of semiquinone (SQ) serves as its gate [FEBS Letters 579 (2005) 45-55]. This was based on the following experimental results: (i) EPR signals of the fast-relaxing SQ anion (designated as QNf•−) are observable only in the presence of the proton electrochemical potential (ΔμH+); (ii) iron-sulfur cluster N2 and QNf•− are directly spin-coupled; and (iii) their center-to-center distance was calculated as 12 Å, but QNf•− is only 5 Å deeper than N2 perpendicularly to the membrane. After the priming reduction of Q to QNf•−, the proton pump operates only in the steps between the semiquinone anion (QNf•−) and fully reduced quinone (QH2). Thus, by cycling twice for one NADH molecule, the pump transports 4H+ per 2e−. This hypothesis predicts the following phenomena: (a) coupled with the piericidin A sensitive NADH-DBQ or Q1 reductase reaction, ΔμH+ would be established; (b) ΔμH+ would enhance the SQ EPR signals; and (c) the dissipation of ΔμH+ with the addition of an uncoupler would increase the rate of NADH oxidation and decrease the SQ signals. We reconstituted bovine heart complex I, which was prepared at Yoshikawa's laboratory, into proteoliposomes. Using this system, we succeeded in demonstrating that all of these phenomena actually took place. We believe that these results strongly support our hypothesis.

  16. Impact of Nuclear Data Uncertainties on Advanced Fuel Cycles and their Irradiated Fuel - a Comparison between Libraries

    NASA Astrophysics Data System (ADS)

    Díez, C. J.; Cabellos, O.; Martínez, J. S.

    2014-04-01

    The uncertainties in the isotopic composition throughout burnup due to nuclear data uncertainties are analysed. The different sources of uncertainty (decay data, fission yields and cross sections) are propagated individually, and their effects assessed. Two applications are studied: EFIT (an ADS-like reactor) and ESFR (Sodium Fast Reactor). The impact of the uncertainties on cross sections provided by the EAF-2010, SCALE6.1 and COMMARA-2.0 libraries is compared. These Uncertainty Quantification (UQ) studies have been carried out with a Monte Carlo sampling approach implemented in the depletion/activation code ACAB. The implementation has been improved to overcome depletion/activation problems with variations of the neutron spectrum.
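
    The sampling idea is simple to illustrate on a one-nuclide toy problem: draw cross sections from their assumed distribution, run the (here trivial) depletion calculation for each draw, and read off the spread of the resulting inventory. The numbers below are invented for illustration and are unrelated to the EFIT or ESFR analyses.

        # Toy Monte Carlo propagation of a cross-section uncertainty through a
        # one-nuclide depletion step: N(t) = N0 * exp(-sigma * phi * t).
        import numpy as np

        rng = np.random.default_rng(0)
        sigma = rng.normal(1.0e-24, 0.05e-24, 5000)   # capture cross section [cm^2], 5% unc.
        phi = 1.0e15                                   # neutron flux [n/cm^2/s], held fixed here
        t = 3.0e7                                      # irradiation time [s]
        N0 = 1.0e24                                    # initial atom density [1/cm^3]

        N_end = N0 * np.exp(-sigma * phi * t)          # end-of-irradiation inventory per sample
        print(N_end.mean(), N_end.std() / N_end.mean())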

  17. Long Term Quadrotor Stabilization

    DTIC Science & Technology

    2011-03-01

    funtion Hmotor Motor transfer function Hsampler Sampler transfer function i An integer indexing variable Ix x-axis moment of inertia Iy y-axis moment...following relationship : ⎡⎢⎢⎢⎢⎢⎢⎢⎣ up uq ur utℎrust ⎤⎥⎥⎥⎥⎥⎥⎥⎦ = ⎡⎢⎢⎢⎢⎢⎢⎢⎣ 0 −1 0 1 1 0 −1 0 1 −1 1 −1 1 1 1 1 ⎤⎥⎥⎥⎥⎥⎥⎥⎦ ⎡⎢⎢⎢⎢⎢⎢⎢⎣ u1 u2 u3 u4...method only captures the magnitude of the angular rate and not the sign. because of the square root relationship and the need to have a positive value

  18. Existence and asymptotic behavior of nontrivial solutions to the Swift-Hohenberg equation

    NASA Astrophysics Data System (ADS)

    Marino, G.; Mosconi, S.

    2017-12-01

    In this paper, we discuss several results regarding existence, non-existence and asymptotic properties of solutions to u⁗ + qu″ + f(u) = 0, under various hypotheses on the parameter q and on the potential F(t) = ∫₀ᵗ f(s) ds, generally assumed to be bounded from below. We prove a non-existence result in the case q ≤ 0 and an existence result of periodic solutions for: 1) almost every suitably small (depending on F) positive value of q; 2) all suitably large (depending on F) values of q. Finally, we describe some conditions on F which ensure that some (or all) solutions uq to the equation satisfy ‖uq‖∞ → 0 as q ↓ 0.

  19. Plasticity models of material variability based on uncertainty quantification techniques

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jones, Reese E.; Rizzi, Francesco; Boyce, Brad

    The advent of fabrication techniques like additive manufacturing has focused attention on the considerable variability of material response due to defects and other micro-structural aspects. This variability motivates the development of an enhanced design methodology that incorporates inherent material variability to provide robust predictions of performance. In this work, we develop plasticity models capable of representing the distribution of mechanical responses observed in experiments using traditional plasticity models of the mean response and recently developed uncertainty quantification (UQ) techniques. Lastly, we demonstrate that the new method provides predictive realizations that are superior to more traditional ones, and how these UQ techniques can be used in model selection and assessing the quality of calibrated physical parameters.

  20. q-Poincaré supersymmetry in AdS5/CFT4

    NASA Astrophysics Data System (ADS)

    Borsato, Riccardo; Torrielli, Alessandro

    2018-03-01

    We consider the exact S-matrix governing the planar spectral problem for strings on AdS5 × S5 and N = 4 super Yang-Mills, and we show that it is invariant under a novel "boost" symmetry, which acts as a differentiation with respect to the particle momentum. This generator leads us also to reinterpret the usual centrally extended psu(2|2) symmetry, and to conclude that the S-matrix is invariant under a q-Poincaré supersymmetry algebra, where the deformation parameter is related to the 't Hooft coupling. We determine the two-particle action (coproduct) that turns out to be non-local, and study the property of the new symmetry under crossing transformations. We look at both the strong-coupling (large tension in the string theory) and weak-coupling (spin-chain description of the gauge theory) limits; in the former regime we calculate the cobracket utilising the universal classical r-matrix of Beisert and Spill. In the eventuality that the boost has higher partners, we also construct a quantum affine version of 2D Poincaré symmetry, by contraction of the quantum affine algebra Uq(sl̂2) in Drinfeld's second realisation.

  1. Funding for the 2ND IAEA technical meeting on fusion data processing, validation and analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Greenwald, Martin

    The International Atomic Energy Agency (IAEA) will organize the second Technical Meeting on Fusion Data Processing, Validation and Analysis from 30 May to 02 June, 2017, in Cambridge, MA USA. The meeting will be hosted by the MIT Plasma Science and Fusion Center (PSFC). The objective of the meeting is to provide a platform where a set of topics relevant to fusion data processing, validation and analysis are discussed with the view of extrapolating needs to next step fusion devices such as ITER. The validation and analysis of experimental data obtained from diagnostics used to characterize fusion plasmas are crucial for a knowledge-based understanding of the physical processes governing the dynamics of these plasmas. The meeting will aim at fostering, in particular, discussions of research and development results that set out or underline trends observed in the current major fusion confinement devices. General information on the IAEA, including its mission and organization, can be found at the IAEA website. Topics include: uncertainty quantification (UQ); model selection, validation, and verification (V&V); probability theory and statistical analysis; inverse problems and equilibrium reconstruction; integrated data analysis; real-time data analysis; machine learning; signal/image processing and pattern recognition; experimental design and synthetic diagnostics; and data management.

  2. Predicting Ice Sheet and Climate Evolution at Extreme Scales

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Heimbach, Patrick

    2016-02-06

    A main research objective of PISCEES is the development of formal methods for quantifying uncertainties in ice sheet modeling. Uncertainties in simulating and projecting mass loss from the polar ice sheets arise primarily from initial conditions, surface and basal boundary conditions, and model parameters. In general terms, two main chains of uncertainty propagation may be identified: 1. inverse propagation of observation and/or prior onto posterior control variable uncertainties; 2. forward propagation of prior or posterior control variable uncertainties onto those of target output quantities of interest (e.g., climate indices or ice sheet mass loss). A related goal is the development of computationally efficient methods for producing initial conditions for an ice sheet that are close to available present-day observations and essentially free of artificial model drift, which is required in order to be useful for model projections (“initialization problem”). To be of maximum value, such optimal initial states should be accompanied by “useful” uncertainty estimates that account for the different sources of uncertainties, as well as the degree to which the optimum state is constrained by available observations. The PISCEES proposal outlined two approaches for quantifying uncertainties. The first targets the full exploration of the uncertainty in model projections with sampling-based methods and a workflow managed by DAKOTA (the main delivery vehicle for software developed under QUEST). This is feasible for low-dimensional problems, e.g., those with a handful of global parameters to be inferred. This approach can benefit from derivative/adjoint information, but it is not necessary, which is why it is often referred to as “non-intrusive”. The second approach makes heavy use of derivative information from model adjoints to address quantifying uncertainty in high dimensions (e.g., basal boundary conditions in ice sheet models). The use of local gradient, or Hessian information (i.e., second derivatives of the cost function), requires additional code development and implementation, and is thus often referred to as an “intrusive” approach. Within PISCEES, MIT has been tasked to develop methods for derivative-based UQ, the “intrusive” approach discussed above. These methods rely on the availability of first (adjoint) and second (Hessian) derivative code, developed through intrusive methods such as algorithmic differentiation (AD). While representing a significant burden in terms of code development, derivative-based UQ is able to cope with very high-dimensional uncertainty spaces. That is, unlike sampling methods (all variations of Monte Carlo), the calculational burden is independent of the dimension of the uncertainty space. This is a significant advantage for spatially distributed uncertainty fields, such as three-dimensional initial conditions, three-dimensional parameter fields, or two-dimensional surface and basal boundary conditions. Importantly, uncertainty fields for ice sheet models generally fall into this category.

  3. Global sensitivity analysis, probabilistic calibration, and predictive assessment for the data assimilation linked ecosystem carbon model

    DOE PAGES

    Safta, C.; Ricciuto, Daniel M.; Sargsyan, Khachik; ...

    2015-07-01

    In this paper we propose a probabilistic framework for an uncertainty quantification (UQ) study of a carbon cycle model and focus on the comparison between steady-state and transient simulation setups. A global sensitivity analysis (GSA) study indicates the parameters and parameter couplings that are important at different times of the year for quantities of interest (QoIs) obtained with the data assimilation linked ecosystem carbon (DALEC) model. We then employ a Bayesian approach and a statistical model error term to calibrate the parameters of DALEC using net ecosystem exchange (NEE) observations at the Harvard Forest site. The calibration results are employed in the second part of the paper to assess the predictive skill of the model via posterior predictive checks.
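
    The calibration ingredient, a likelihood built from model-data residuals plus a statistical model-error variance, sampled with MCMC, can be sketched on a toy problem as below; the two-parameter exponential "model", the synthetic observations, and the flat priors are stand-ins, not DALEC or the Harvard Forest NEE record.

        # Schematic Bayesian calibration with a model-error variance term, using a
        # simple random-walk Metropolis sampler on synthetic data.
        import numpy as np

        rng = np.random.default_rng(0)
        t = np.linspace(0.0, 1.0, 50)
        y_obs = 2.0 * np.exp(-1.5 * t) + rng.normal(0, 0.1, t.size)   # synthetic observations

        def log_post(theta):
            a, b, log_s = theta
            if a <= 0 or b <= 0:
                return -np.inf                        # flat priors restricted to positive a, b
            s2 = np.exp(2 * log_s)                    # statistical model/observation error variance
            resid = y_obs - a * np.exp(-b * t)
            return -0.5 * np.sum(resid ** 2) / s2 - 0.5 * t.size * np.log(s2)

        theta, lp = np.array([1.0, 1.0, np.log(0.2)]), None
        lp = log_post(theta)
        chain = []
        for _ in range(20000):
            prop = theta + rng.normal(0, [0.05, 0.05, 0.05])
            lp_prop = log_post(prop)
            if np.log(rng.uniform()) < lp_prop - lp:
                theta, lp = prop, lp_prop             # accept
            chain.append(theta)
        posterior = np.array(chain)[5000:]            # discard burn-in samples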

  4. OR14-V-Uncertainty-PD2La Uncertainty Quantification for Nuclear Safeguards and Nondestructive Assay Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nicholson, Andrew D.; Croft, Stephen; McElroy, Robert Dennis

    2017-08-01

    The various methods of nondestructive assay (NDA) of special nuclear material (SNM) have applications in nuclear nonproliferation, including detection and identification of illicit SNM at border crossings and quantifying SNM at nuclear facilities for safeguards. No assay method is complete without “error bars,” which provide one way of expressing confidence in the assay result. Consequently, NDA specialists typically provide error bars and also partition total uncertainty into “random” and “systematic” components so that, for example, an error bar can be developed for the total mass estimate in multiple items. Uncertainty Quantification (UQ) for NDA has always been important, but it is recognized that greater rigor is needed and achievable using modern statistical methods.
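
    The random/systematic split mentioned above matters precisely when rolling item-level assays up to a total. A hedged sketch of the usual bookkeeping is given below, with independent (random) components added in quadrature and a shared (systematic) bias added linearly across items; all numbers are invented for illustration.

        # Combining random and systematic uncertainty components for a multi-item total.
        import numpy as np

        masses = np.array([120.0, 80.0, 150.0])      # assayed SNM mass per item [g] (illustrative)
        rand_rel, sys_rel = 0.02, 0.01               # 2% random, 1% systematic (relative, assumed)

        total = masses.sum()
        u_rand = np.sqrt(np.sum((rand_rel * masses) ** 2))   # independent: quadrature sum
        u_sys = sys_rel * total                              # fully correlated: linear sum
        u_total = np.hypot(u_rand, u_sys)
        print(f"total = {total:.1f} g +/- {u_total:.1f} g (1 sigma)")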

  5. A Concept for the HIFiRE 8 Flight Test

    NASA Astrophysics Data System (ADS)

    Alesi, H.; Paull, A.; Smart, M.; Bowcutt, K. G.

    2015-09-01

    HIFiRE 8 is a hypersonic flight test experiment scheduled for launch in late 2018 from the Woomera Test Center in Australia. This project aims to develop a Flight Test Vehicle that will, for the first time, complete 30 seconds of scramjet powered hypersonic flight at a Mach Number of 7.0. The engine used for this flight will be a rectangular to elliptic shape transition scramjet. It will be fuelled with gaseous hydrogen. The flight test engine configuration will be derived using scientific and engineering evaluation in the UQ shock tunnel T4 and other potential ground-based facilities. This paper presents current plans for the HIFiRE 8 trajectory, mission events, airframe and engine designs and also includes descriptions of critical subsystems and associated modelling, simulation and analysis activities.

  6. Uncertainty Quantification in Climate Modeling and Projection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Qian, Yun; Jackson, Charles; Giorgi, Filippo

    The projection of future climate is one of the most complex problems undertaken by the scientific community. Although scientists have been striving to better understand the physical basis of the climate system and to improve climate models, the overall uncertainty in projections of future climate has not been significantly reduced (e.g., from the IPCC AR4 to AR5). With the rapid increase of complexity in Earth system models, reducing uncertainties in climate projections becomes extremely challenging. Since uncertainties always exist in climate models, interpreting the strengths and limitations of future climate projections is key to evaluating risks, and climate change information for use in Vulnerability, Impact, and Adaptation (VIA) studies should be provided with both well-characterized and well-quantified uncertainty. The workshop aimed at providing participants, many of them from developing countries, information on strategies to quantify the uncertainty in climate model projections and assess the reliability of climate change information for decision-making. The program included a mixture of lectures on fundamental concepts in Bayesian inference and sampling, applications, and hands-on computer laboratory exercises employing software packages for Bayesian inference, Markov Chain Monte Carlo methods, and global sensitivity analyses. The lectures covered a range of scientific issues underlying the evaluation of uncertainties in climate projections, such as the effects of uncertain initial and boundary conditions, uncertain physics, and limitations of observational records. Progress in quantitatively estimating uncertainties in hydrologic, land surface, and atmospheric models at both regional and global scales was also reviewed. The application of Uncertainty Quantification (UQ) concepts to coupled climate system models is still in its infancy. The Coupled Model Intercomparison Project (CMIP) multi-model ensemble currently represents the primary data for assessing reliability and uncertainties of climate change information. An alternative approach is to generate similar ensembles by perturbing parameters within a single-model framework. One of the workshop’s objectives was to give participants a deeper understanding of these approaches within a Bayesian statistical framework. However, there remain significant challenges still to be resolved before UQ can be applied in a convincing way to climate models and their projections.

  7. RaftProt: mammalian lipid raft proteome database.

    PubMed

    Shah, Anup; Chen, David; Boda, Akash R; Foster, Leonard J; Davis, Melissa J; Hill, Michelle M

    2015-01-01

    RaftProt (http://lipid-raft-database.di.uq.edu.au/) is a database of mammalian lipid raft-associated proteins as reported in high-throughput mass spectrometry studies. Lipid rafts are specialized membrane microdomains enriched in cholesterol and sphingolipids thought to act as dynamic signalling and sorting platforms. Given their fundamental roles in cellular regulation, there is a plethora of information on the size, composition and regulation of these membrane microdomains, including a large number of proteomics studies. To facilitate the mining and analysis of published lipid raft proteomics studies, we have developed a searchable database RaftProt. In addition to browsing the studies, performing basic queries by protein and gene names, searching experiments by cell, tissue and organisms; we have implemented several advanced features to facilitate data mining. To address the issue of potential bias due to biochemical preparation procedures used, we have captured the lipid raft preparation methods and implemented advanced search option for methodology and sample treatment conditions, such as cholesterol depletion. Furthermore, we have identified a list of high confidence proteins, and enabled searching only from this list of likely bona fide lipid raft proteins. Given the apparent biological importance of lipid raft and their associated proteins, this database would constitute a key resource for the scientific community. © The Author(s) 2014. Published by Oxford University Press on behalf of Nucleic Acids Research.

  8. Calypso: a user-friendly web-server for mining and visualizing microbiome-environment interactions.

    PubMed

    Zakrzewski, Martha; Proietti, Carla; Ellis, Jonathan J; Hasan, Shihab; Brion, Marie-Jo; Berger, Bernard; Krause, Lutz

    2017-03-01

    Calypso is an easy-to-use online software suite that allows non-expert users to mine, interpret and compare taxonomic information from metagenomic or 16S rDNA datasets. Calypso has a focus on multivariate statistical approaches that can identify complex environment-microbiome associations. The software enables quantitative visualizations, statistical testing, multivariate analysis, supervised learning, factor analysis, multivariable regression, network analysis and diversity estimates. Comprehensive help pages, tutorials and videos are provided via a wiki page. The web-interface is accessible via http://cgenome.net/calypso/ . The software is programmed in Java, PERL and R and the source code is available from Zenodo ( https://zenodo.org/record/50931 ). The software is freely available for non-commercial users. l.krause@uq.edu.au. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press.

  9. Development of a neutronics calculation method for designing commercial type Japanese sodium-cooled fast reactor

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Takeda, T.; Shimazu, Y.; Hibi, K.

    2012-07-01

    Under the R&D project to improve the modeling accuracy for the design of fast breeder reactors, the authors are developing a neutronics calculation method for designing a large commercial type sodium-cooled fast reactor. The calculation method is established by taking into account the special features of the reactor, such as the use of annular fuel pellets, inner duct tubes in large fuel assemblies, and a large core. The Verification and Validation, and Uncertainty Quantification (V&V and UQ) of the calculation method is being performed by using measured data from the prototype FBR Monju. The results of this project will be used in the design and analysis of the commercial type demonstration FBR, known as the Japanese Sodium fast Reactor (JSFR). (authors)

  10. Robust Online Monitoring for Calibration Assessment of Transmitters and Instrumentation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ramuhalli, Pradeep; Coble, Jamie B.; Shumaker, Brent

    Robust online monitoring (OLM) technologies are expected to enable the extension or elimination of periodic sensor calibration intervals in operating and new reactors. These advances in OLM technologies will improve the safety and reliability of current and planned nuclear power systems through improved accuracy and increased reliability of sensors used to monitor key parameters. In this article, we discuss an overview of research being performed within the Nuclear Energy Enabling Technologies (NEET)/Advanced Sensors and Instrumentation (ASI) program, for the development of OLM algorithms to use sensor outputs and, in combination with other available information, 1) determine whether one or more sensors are out of calibration or failing and 2) replace a failing sensor with reliable, accurate sensor outputs. Algorithm development is focused on the following OLM functions: • Signal validation • Virtual sensing • Sensor response-time assessment These algorithms incorporate, at their base, a Gaussian Process-based uncertainty quantification (UQ) method. Various plant models (using kernel regression, GP, or hierarchical models) may be used to predict sensor responses under various plant conditions. These predicted responses can then be applied in fault detection (sensor output and response time) and in computing the correct value (virtual sensing) of a failing physical sensor. The methods being evaluated in this work can compute confidence levels along with the predicted sensor responses, and as a result, may have the potential for compensating for sensor drift in real-time (online recalibration). Evaluation was conducted using data from multiple sources (laboratory flow loops and plant data). Ongoing research in this project is focused on further evaluation of the algorithms, optimization for accuracy and computational efficiency, and integration into a suite of tools for robust OLM that are applicable to monitoring sensor calibration state in nuclear power plants.
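
    A stripped-down version of the virtual-sensing idea, predicting one sensor with a confidence band from correlated healthy sensors via Gaussian process regression, is sketched below on synthetic data; the sensor relationships, kernel choice, and noise levels are assumptions for illustration, not the project's models.

        # GP-based virtual sensing on synthetic data: estimate a target sensor (with
        # predictive standard deviation) from two correlated reference sensors.
        import numpy as np
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import RBF, WhiteKernel

        rng = np.random.default_rng(0)
        flow = rng.uniform(10, 20, 300)                       # healthy reference sensor 1
        temp = 0.5 * flow + rng.normal(0, 0.2, flow.size)     # healthy reference sensor 2
        target = 2.0 * flow - 0.3 * temp + rng.normal(0, 0.3, flow.size)   # sensor to back up

        X = np.column_stack([flow, temp])
        gp = GaussianProcessRegressor(
            kernel=RBF(length_scale=5.0) + WhiteKernel(noise_level=0.1),
            normalize_y=True,
        ).fit(X[:250], target[:250])

        pred, std = gp.predict(X[250:], return_std=True)      # virtual-sensor estimate + uncertainty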

  11. Quantum Bohmian model for financial market

    NASA Astrophysics Data System (ADS)

    Choustova, Olga Al.

    2007-01-01

    We apply methods of quantum mechanics for mathematical modeling of price dynamics at the financial market. The Hamiltonian formalism on the price/price-change phase space describes the classical-like evolution of prices. This classical dynamics of prices is determined by “hard” conditions (natural resources, industrial production, services and so on). These conditions are mathematically described by the classical financial potential V(q), where q=(q1,…,qn) is the vector of prices of various shares. But the information exchange and market psychology play important (and sometimes determining) role in price dynamics. We propose to describe such behavioral financial factors by using the pilot wave (Bohmian) model of quantum mechanics. The theory of financial behavioral waves takes into account the market psychology. The real trajectories of prices are determined (through the financial analogue of the second Newton law) by two financial potentials: classical-like V(q) (“hard” market conditions) and quantum-like U(q) (behavioral market conditions).
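
    For readers who want the equations behind this description, the standard pilot-wave (Bohmian) form is written below; interpreting m and ℏ as a "price inertia" and a behavioral scale parameter is our illustrative reading, not notation fixed by the paper.

        % Pilot wave \psi = R e^{iS/\hbar}; U(q) is the quantum-like behavioral potential.
        \begin{aligned}
          U(q) &= -\frac{\hbar^{2}}{2m}\,\frac{\nabla^{2} R(q)}{R(q)},\\[4pt]
          m\,\ddot{q} &= -\frac{\partial}{\partial q}\bigl[V(q) + U(q)\bigr].
        \end{aligned}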

  12. Photometric observations of nine Transneptunian objects and Centaurs

    NASA Astrophysics Data System (ADS)

    Hromakina, T.; Perna, D.; Belskaya, I.; Dotto, E.; Rossi, A.; Bisi, F.

    2018-02-01

    We present the results of photometric observations of six Transneptunian objects and three Centaurs, together with estimates of their rotational periods and corresponding amplitudes. For six of them we also present lower limits on the density. All observations were made using the 3.6-m TNG telescope (La Palma, Spain). For four objects - (148975) 2001 XA255, (281371) 2008 FC76, (315898) 2008 QD4, and 2008 CT190 - the estimation of short-term variability was made for the first time. We confirm rotation period values for two objects: (55636) 2002 TX300 and (202421) 2005 UQ513, and improve the precision of previously reported rotational period values for the other three - (120178) 2003 OP32, (145452) 2005 RN43, (444030) 2004 NT33 - by using both our and literature data. We also discuss that small distant bodies, similar to asteroids in the Main belt, tend to have double-peaked rotational periods caused by an elongated shape rather than surface albedo variations.
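
    A common route to such density lower limits, and only one possible convention since the paper may use a different prescription, is to treat each body as a strengthless rotating triaxial ellipsoid whose lightcurve amplitude Δm is attributed entirely to elongation:

        % Rotational stability of a strengthless body with period P and axis ratio a/b.
        \rho \;\gtrsim\; \frac{3\pi}{G\,P^{2}}\,\frac{a}{b},
        \qquad
        \frac{a}{b} \;\approx\; 10^{\,0.4\,\Delta m}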

  13. A framework for optimization and quantification of uncertainty and sensitivity for developing carbon capture systems

    DOE PAGES

    Eslick, John C.; Ng, Brenda; Gao, Qianwen; ...

    2014-12-31

    Under the auspices of the U.S. Department of Energy’s Carbon Capture Simulation Initiative (CCSI), a Framework for Optimization and Quantification of Uncertainty and Sensitivity (FOQUS) has been developed. This tool enables carbon capture systems to be rapidly synthesized and rigorously optimized, in an environment that accounts for and propagates uncertainties in parameters and models. FOQUS currently enables (1) the development of surrogate algebraic models utilizing the ALAMO algorithm, which can be used for superstructure optimization to identify optimal process configurations, (2) simulation-based optimization utilizing derivative free optimization (DFO) algorithms with detailed black-box process models, and (3) rigorous uncertainty quantification through PSUADE. FOQUS utilizes another CCSI technology, the Turbine Science Gateway, to manage the thousands of simulated runs necessary for optimization and UQ. Thus, this computational framework has been demonstrated for the design and analysis of a solid sorbent based carbon capture system.

  14. The DEEP-South: Preliminary Photometric Results from the KMTNet-CTIO

    NASA Astrophysics Data System (ADS)

    Kim, Myung-Jin; Moon, Hong-Kyu; Choi, Young-Jun; Yim, Hong-Suh; Bae, Youngho; Roh, Dong-Goo; the DEEP-South Team

    2015-08-01

    The DEep Ecliptic Patrol of the Southern sky (DEEP-South) will not only conduct characterization of targeted asteroids and a blind survey at the sweet spots, but also utilize data mining of small Solar System bodies in the whole KMTNet archive. Round-the-clock observation with the KMTNet is optimized for spin characterization of tumbling and slow-rotating bodies, as it facilitates debiasing of previously reported lightcurve observations. It is also most suitable for detection and rapid follow-up of Atens and Atiras, the “difficult objects” that are being discovered at lower solar elongations. For the sake of efficiency, we implemented an observation scheduler, SMART (Scheduler for Measuring Asteroids RoTation), designed to conduct follow-up observations in a timely manner. It automatically updates catalogs, generates ephemerides, checks priorities, prepares target lists, and sends a suite of scripts to site operators. We also developed photometric analysis software called ASAP (Asteroid Spin Analysis Package) that helps find a set of appropriate comparison stars in an image, derive spin parameters, and reconstruct lightcurves simultaneously in a semi-automatic manner. In this presentation, we will show our preliminary results of time series analyses of a number of km-sized Potentially Hazardous Asteroids (PHAs), 5189 (1990 UQ), 12923 (1999 GK4), 53426 (1999 SL5), 136614 (1993 VA6), 385186 (1994 AW1), and 2000 OH, from test runs in February and March 2015 at the KMTNet-CTIO.

  15. Work-related musculoskeletal disorders in Australian dentists and orthodontists: Risk assessment and prevention.

    PubMed

    Sakzewski, Lisa; Naser-ud-Din, Shazia

    2015-01-01

    As professionals work longer hours and live longer, there have been concerns regarding work-related musculoskeletal disorders (WMSD) affecting both professional and personal lives. Moreover, the past decade has seen a surge of interest across allied health sciences, with self-reported cross-sectional studies. Health professionals often suffer WMSD due to occupational stress. It is important to assess the problem in order to find ways to prevent it; hence the focus of this cross-sectional survey. The aim was to investigate the prevalence and risk factors of WMSD between Australian dentists and orthodontists. A postal survey was sent to 447 Australian orthodontists and 450 Queensland dentists using the universal Nordic scale, previously piloted at UQ and refined for this cross-sectional study. Questions were directed towards individual, workplace and psychosocial variables and were designed to gather information regarding health, lifestyle, education, awareness of musculoskeletal problems and current preventative strategies. A high prevalence of musculoskeletal problems in the last 12 months was found for both dentists (88.9%) and orthodontists (83.6%). The main predictor in both groups was increased work stress. Less than a third of those professionals surveyed had received education regarding dental practice ergonomics during their tertiary education. Dentists and orthodontists experienced a high rate of musculoskeletal problems which were associated with increased levels of stress at work. Further research should be directed toward interventions aimed at reducing stress in the work environment as well as improving work posture.

  16. Functional characterization of enone oxidoreductases from strawberry and tomato fruit.

    PubMed

    Klein, Dorothée; Fink, Barbara; Arold, Beate; Eisenreich, Wolfgang; Schwab, Wilfried

    2007-08-08

    Fragaria x ananassa enone oxidoreductase (FaEO), earlier putatively assigned as quinone oxidoreductase, is a ripening-induced, negatively auxin-regulated enzyme that catalyzes the formation of 4-hydroxy-2,5-dimethyl-3(2H)-furanone (HDMF), the key flavor compound in strawberry fruit by the reduction of the alpha,beta-unsaturated bond of the highly reactive precursor 4-hydroxy-5-methyl-2-methylene-3(2H)-furanone (HMMF). Here we show that recombinant FaEO does not reduce the double bond of straight-chain 2-alkenals or 2-alkenones but rather hydrogenates previously unknown HMMF derivatives substituted at the methylene functional group. The furanones were prepared from 4-hydroxy-5-methyl-3(2H)-furanone with a number of aldehydes and a ketone. The kinetic data for the newly synthesized aroma-active substrates and products are similar to the values obtained for an enone oxidoreductase from Arabidopsis thaliana catalyzing the alpha,beta-hydrogenation of 2-alkenals. HMMF, the substrate of FaEO that is formed during strawberry fruit ripening, was also detected in tomato and pineapple fruit by HPLC-ESI-MSn and became 13C-labeled when d-[6-13C]-glucose was applied to the fruits, which suggested that a similar HDMF biosynthetic pathway occurs in the different plant species. With a database search (http://ted.bti.cornell.edu/ and http://genet.imb.uq.edu.au/Pineapple/), we identified a tomato and pineapple expressed sequence tag that shows significant homology to FaEO. Solanum lycopersicon EO (SlEO) was cloned from cDNA, and the protein was expressed in Escherichia coli and purified. Biochemical studies confirmed the involvement of SlEO in the biosynthesis of HDMF in tomato fruit.

  17. Numerical Uncertainty Quantification for Radiation Analysis Tools

    NASA Technical Reports Server (NTRS)

    Anderson, Brooke; Blattnig, Steve; Clowdsley, Martha

    2007-01-01

    Recently a new emphasis has been placed on engineering applications of space radiation analyses, and thus a systematic effort of Verification, Validation and Uncertainty Quantification (VV&UQ) of the tools commonly used for radiation analysis for vehicle design and mission planning has begun. Two sources of uncertainty in geometric discretization are addressed in this paper, both of which need to be quantified in order to understand the total uncertainty in estimating space radiation exposures. The first source of uncertainty is in ray tracing: as the number of rays increases, the associated uncertainty decreases, but the computational expense increases. A cost-benefit analysis trading computational time against uncertainty is therefore needed and is addressed in this paper. The second source of uncertainty results from the interpolation over the dose-versus-depth curves that is needed to determine the radiation exposure. The question, then, is how many thicknesses are needed to obtain an accurate result, so convergence testing is performed to quantify the uncertainty associated with interpolating over different shield-thickness spatial grids.
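
    The second study lends itself to a small numerical illustration. The sketch below is not part of the record above: it assumes a hypothetical analytic dose-vs-depth curve, dose_of_depth, standing in for the radiation transport tool, and shows how the worst-case linear-interpolation error shrinks as more shield thicknesses are tabulated.

        import numpy as np

        def dose_of_depth(t):
            # hypothetical smooth dose-vs-depth curve (stand-in for the transport code)
            return 50.0 * np.exp(-t / 4.0) + 5.0 / (1.0 + t)

        t_ref = np.linspace(0.1, 30.0, 2000)          # dense reference grid of depths
        d_ref = dose_of_depth(t_ref)

        for n_thick in (5, 10, 20, 40, 80):
            t_grid = np.linspace(0.1, 30.0, n_thick)  # tabulated shield thicknesses
            d_grid = dose_of_depth(t_grid)            # the "expensive" evaluations
            d_interp = np.interp(t_ref, t_grid, d_grid)
            err = np.max(np.abs(d_interp - d_ref))    # worst-case interpolation error
            print(f"{n_thick:3d} thicknesses -> max abs error = {err:.3e}")

    For a smooth curve the error of linear interpolation should fall roughly by a factor of four each time the number of thicknesses is doubled; that second-order behaviour is the kind of trend such a convergence test looks for.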

  18. Proof-of-Concept Study for Uncertainty Quantification and Sensitivity Analysis using the BRL Shaped-Charge Example

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hughes, Justin Matthew

    These are the slides for a graduate presentation at Mississippi State University. It covers the following: the BRL Shaped-Charge Geometry in PAGOSA, mesh refinement study, surrogate modeling using a radial basis function network (RBFN), ruling out parameters using sensitivity analysis (equation of state study), uncertainty quantification (UQ) methodology, and sensitivity analysis (SA) methodology. In summary, a mesh convergence study was used to ensure that solutions were numerically stable by comparing PDV data between simulations. A Design of Experiments (DOE) method was used to reduce the simulation space to study the effects of the Jones-Wilkins-Lee (JWL) parameters for the Composition B main charge. Uncertainty was quantified by computing the 95% data range about the median of simulation output using a brute force Monte Carlo (MC) random sampling method. Parameter sensitivities were quantified using the Fourier Amplitude Sensitivity Test (FAST) spectral analysis method, where it was determined that detonation velocity, initial density, C1, and B1 controlled jet tip velocity.
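
    A minimal sketch of the brute-force MC step described above, under stated assumptions: jet_tip_velocity is a hypothetical linear surrogate standing in for the RBFN/PAGOSA response, and the input distributions are notional, not the study's JWL values. It computes the 95% data range about the median of the sampled output.

        import numpy as np

        rng = np.random.default_rng(0)

        def jet_tip_velocity(det_vel, rho0, c1, b1):
            # hypothetical surrogate for the hydrocode output; a real study would
            # evaluate the RBFN surrogate or the simulation itself
            return 7.5 + 0.8 * (det_vel - 8.0) + 0.5 * (rho0 - 1.7) + 0.1 * c1 - 0.2 * b1

        n = 100_000
        det_vel = rng.normal(8.0, 0.05, n)    # notional uncertain EOS/JWL inputs
        rho0    = rng.normal(1.72, 0.01, n)
        c1      = rng.normal(5.2, 0.2, n)
        b1      = rng.normal(0.9, 0.05, n)

        v = jet_tip_velocity(det_vel, rho0, c1, b1)
        med = np.median(v)
        lo, hi = np.percentile(v, [2.5, 97.5])    # 95% data range about the median
        print(f"median = {med:.3f}, 95% range = [{lo:.3f}, {hi:.3f}]")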

  19. The 7th Transgenic Technology meeting: debut for "down under" (http://www.tasq.uq.edu.au/TT2007).

    PubMed

    Gertsenstein, Marina; Vintersten, Kristina

    2007-10-01

    The 7th Transgenic Technology meeting was held in Brisbane, Australia on February 12-14, 2007. This gathering marked a milestone not only because it was hosted outside the European continent for the first time, but also because it was the first meeting to be held on behalf of the new International Society for Transgenic Technologies (ISTT, http://www.transtechsociety.org/ ). As in previous years, the topics were aimed at both a scientific and a technical audience. The subjects covered a wide range of cutting-edge applications in the field of genetic modification in animal models, with the focus on (but by no means limited to) mice. True to the meeting's tradition, considerable emphasis was also placed on discussions about the management of transgenic production units. With the beautiful Australian sun shining over the venue, and a large number of exceptional speakers, this was a most pleasant and informative conference.

  20. Data Assimilation - Advances and Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Williams, Brian J.

    2014-07-30

    This presentation provides an overview of data assimilation (model calibration) for complex computer experiments. Calibration refers to the process of probabilistically constraining uncertain physics/engineering model inputs to be consistent with observed experimental data. An initial probability distribution for these parameters is updated using the experimental information. Utilization of surrogate models and empirical adjustment for model form error in code calibration form the basis for the statistical methodology considered. The role of probabilistic code calibration in supporting code validation is discussed. Incorporation of model form uncertainty in rigorous uncertainty quantification (UQ) analyses is also addressed. Design criteria used within a batch sequential design algorithm are introduced for efficiently achieving predictive maturity and improved code calibration. Predictive maturity refers to obtaining stable predictive inference with calibrated computer codes. These approaches allow for augmentation of initial experiment designs for collecting new physical data. A standard framework for data assimilation is presented and techniques for updating the posterior distribution of the state variables based on particle filtering and the ensemble Kalman filter are introduced.
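
    The ensemble Kalman filter update mentioned at the end can be summarised in a few lines. The sketch below is a generic stochastic EnKF analysis step, not the presenter's implementation; the state size, observation operator H, and error covariance R are arbitrary illustrative choices.

        import numpy as np

        rng = np.random.default_rng(1)

        n_state, n_obs, n_ens = 3, 2, 50
        H = np.array([[1.0, 0.0, 0.0],
                      [0.0, 1.0, 0.0]])                  # observation operator
        R = 0.1 * np.eye(n_obs)                          # observation error covariance

        X = rng.normal(0.0, 1.0, (n_state, n_ens))       # prior ensemble (one member per column)
        y = np.array([0.5, -0.2])                        # observed data

        A = X - X.mean(axis=1, keepdims=True)
        P = A @ A.T / (n_ens - 1)                        # ensemble covariance estimate
        K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)     # Kalman gain

        # stochastic EnKF update with perturbed observations
        Y_pert = y[:, None] + rng.multivariate_normal(np.zeros(n_obs), R, n_ens).T
        X_post = X + K @ (Y_pert - H @ X)
        print("prior mean :", X.mean(axis=1))
        print("posterior  :", X_post.mean(axis=1))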

  1. CT radiation profile width measurement using CR imaging plate raw data

    PubMed Central

    Yang, Chang‐Ying Joseph

    2015-01-01

    This technical note demonstrates computed tomography (CT) radiation profile measurement using computed radiography (CR) imaging plate raw data, showing that it is possible to perform the CT collimation width measurement using a single scan without saturating the imaging plate. Previously described methods require careful adjustments to the CR reader settings in order to avoid signal clipping in the CR processed image. CT radiation profile measurements were taken as part of routine quality control on 14 CT scanners from four vendors. CR cassettes were placed on the CT scanner bed, raised to isocenter, and leveled. Axial scans were taken at all available collimations, advancing the cassette for each scan. The CR plates were processed and raw CR data were analyzed using MATLAB scripts to measure collimation widths. The raw data approach was compared with previously established methodology. The quality control analysis scripts are released as open source under Creative Commons licensing. A log-linear relationship was found between raw pixel value and air kerma, and raw data collimation width measurements were in agreement with CR-processed, bit-reduced data using previously described methodology. The raw data approach, with intrinsically wider dynamic range, allows improved measurement flexibility and precision. As a result, we demonstrate a methodology for CT collimation width measurements using a single CT scan and without the need for CR scanning parameter adjustments, which is more convenient for routine quality control work. PACS numbers: 87.57.Q-, 87.59.bd, 87.57.uq PMID:26699559
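
    The analysis in the note was done with MATLAB scripts; the Python sketch below only illustrates the underlying idea. It converts raw pixel values to relative air kerma with an assumed log-linear calibration (the constants a and b are hypothetical) and measures the collimation width as the full width at half maximum of the profile.

        import numpy as np

        def collimation_fwhm(raw, pixel_pitch_mm, a=1.0, b=0.0):
            # raw pixel value is assumed proportional to log(air kerma),
            # so kerma ~ exp(a * raw + b); a and b are hypothetical calibration constants
            kerma = np.exp(a * raw + b)
            kerma -= kerma.min()
            above = np.where(kerma >= 0.5 * kerma.max())[0]
            return (above[-1] - above[0]) * pixel_pitch_mm   # full width at half maximum

        # synthetic profile: ~10 mm wide beam sampled at 0.05 mm pixel pitch
        x = np.arange(2000) * 0.05
        kerma_true = 1.0 / (1.0 + np.exp(-(x - 45.0) / 0.5)) / (1.0 + np.exp((x - 55.0) / 0.5))
        raw = np.log(100.0 * kerma_true + 1.0)              # pretend raw plate readout
        print(f"measured width = {collimation_fwhm(raw, 0.05):.2f} mm")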

  2. Uncertainty quantification applied to the radiological characterization of radioactive waste.

    PubMed

    Zaffora, B; Magistris, M; Saporta, G; Chevalier, J-P

    2017-09-01

    This paper describes the process adopted at the European Organization for Nuclear Research (CERN) to quantify uncertainties affecting the characterization of very-low-level radioactive waste. Radioactive waste is a by-product of the operation of high-energy particle accelerators. Radioactive waste must be characterized to ensure its safe disposal in final repositories. Characterizing radioactive waste means establishing the list of radionuclides together with their activities. The estimated activity levels are compared to the limits set by the national authority for waste disposal. The quantification of the uncertainty affecting the concentration of the radionuclides is therefore essential not only to estimate the acceptability of the waste in the final repository but also to control the sorting, volume reduction and packaging phases of the characterization process. The characterization method consists of estimating the activity of produced radionuclides either by experimental methods or statistical approaches. The uncertainties are estimated using classical statistical methods and uncertainty propagation. A mixed multivariate random vector is built to generate random input parameters for the activity calculations. The random vector is a robust tool to account for the unknown radiological history of legacy waste. This analytical technique is also particularly useful for generating random chemical compositions of materials when the trace element concentrations are not available or cannot be measured. The methodology was validated using a waste population of legacy copper activated at CERN. The methodology introduced here represents a first approach for the uncertainty quantification (UQ) of the characterization process of waste produced at particle accelerators. Copyright © 2017 Elsevier Ltd. All rights reserved.
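
    A toy Monte Carlo propagation in the spirit of the random-vector approach, with entirely notional numbers: the cobalt trace fraction and effective activation factor below are hypothetical illustration values, not CERN data or the paper's model.

        import numpy as np

        rng = np.random.default_rng(2)
        n = 50_000

        # notional inputs for one radionuclide in activated legacy copper:
        # an unknown cobalt trace fraction (sampled log-uniformly over 1-100 ppm)
        # and an effective activation factor with lognormal uncertainty
        co_fraction = 10.0 ** rng.uniform(-6.0, -4.0, n)
        act_factor = rng.lognormal(mean=np.log(2.0e3), sigma=0.4, size=n)   # Bq/kg per unit fraction

        activity = co_fraction * act_factor          # illustrative specific activity, Bq/kg
        lo, med, hi = np.percentile(activity, [2.5, 50.0, 97.5])
        print(f"median = {med:.3f} Bq/kg, 95% interval = [{lo:.3f}, {hi:.3f}] Bq/kg")
        print("fraction above a hypothetical 0.1 Bq/kg limit:", np.mean(activity > 0.1))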

  3. The Roles of Verification, Validation and Uncertainty Quantification in the NASA Standard for Models and Simulations

    NASA Technical Reports Server (NTRS)

    Zang, Thomas A.; Luckring, James M.; Morrison, Joseph H.; Blattnig, Steve R.; Green, Lawrence L.; Tripathi, Ram K.

    2007-01-01

    The National Aeronautics and Space Administration (NASA) recently issued an interim version of the Standard for Models and Simulations (M&S Standard) [1]. The action to develop the M&S Standard was identified in an internal assessment [2] of agency-wide changes needed in the wake of the Columbia Accident [3]. The primary goal of this standard is to ensure that the credibility of M&S results is properly conveyed to those making decisions affecting human safety or mission success criteria. The secondary goal is to assure that the credibility of the results from models and simulations meets the project requirements (for credibility). This presentation explains the motivation and key aspects of the M&S Standard, with a special focus on the requirements for verification, validation and uncertainty quantification. Some pilot applications of this standard to computational fluid dynamics applications will be provided as illustrations. The authors of this paper are the members of the team that developed the initial three drafts of the standard, the last of which benefited from extensive comments from most of the NASA Centers. The current version (number 4) incorporates modifications made by a team representing 9 of the 10 NASA Centers. A permanent version of the M&S Standard is expected by December 2007. The scope of the M&S Standard is confined to those uses of M&S that support program and project decisions that may affect human safety or mission success criteria. Such decisions occur, in decreasing order of importance, in the operations, the test & evaluation, and the design & analysis phases. Requirements are placed on (1) program and project management, (2) models, (3) simulations and analyses, (4) verification, validation and uncertainty quantification (VV&UQ), (5) recommended practices, (6) training, (7) credibility assessment, and (8) reporting results to decision makers. A key component of (7) and (8) is the use of a Credibility Assessment Scale, some of the details of which were developed in consultation with William Oberkampf, David Peercy and Timothy Trucano of Sandia National Laboratories. The focus of most of the requirements, including those for VV&UQ, is on the documentation of what was done and the reporting, using the Credibility Assessment Scale, of the level of rigor that was followed. The aspects of one option for the Credibility Assessment Scale are (1) code verification, (2) solution verification, (3) validation, (4) predictive capability, (5) technical review, (6) process control, and (7) operator and analyst qualification.

  4. Topics in Bethe Ansatz

    NASA Astrophysics Data System (ADS)

    Wang, Chunguang

    Integrable quantum spin chains have close connections to integrable quantum field theories, modern condensed matter physics, string and Yang-Mills theories. Bethe ansatz is one of the most important approaches for solving quantum integrable spin chains. At the heart of the algebraic structure of integrable quantum spin chains is the quantum Yang-Baxter equation and the boundary Yang-Baxter equation. This thesis focuses on four topics in Bethe ansatz. The Bethe equations for the isotropic periodic spin-1/2 Heisenberg chain with N sites have solutions containing ±i/2 that are singular: both the corresponding energy and the algebraic Bethe ansatz vector are divergent. Such solutions must be carefully regularized. We consider a regularization involving a parameter that can be determined using a generalization of the Bethe equations. These generalized Bethe equations provide a practical way of determining which singular solutions correspond to eigenvectors of the model. The Bethe equations for the periodic XXX and XXZ spin chains admit singular solutions, for which the corresponding eigenvalues and eigenvectors are ill-defined. We use a twist regularization to derive conditions for such singular solutions to be physical, in which case they correspond to genuine eigenvalues and eigenvectors of the Hamiltonian. We analyze the ground state of the open spin-1/2 isotropic quantum spin chain with a non-diagonal boundary term using a recently proposed Bethe ansatz solution. As the coefficient of the non-diagonal boundary term tends to zero, the Bethe roots split evenly into two sets: those that remain finite, and those that become infinite. We argue that the former satisfy conventional Bethe equations, while the latter satisfy a generalization of the Richardson-Gaudin equations. We derive an expression for the leading correction to the boundary energy in terms of the boundary parameters. We argue that the Hamiltonians for A^{(2)}_{2n} open quantum spin chains corresponding to two choices of integrable boundary conditions have the symmetries U_q(B_n) and U_q(C_n), respectively. The deformation of C_n is novel, with a nonstandard coproduct. We find a formula for the Dynkin labels of the Bethe states (which determine the degeneracies of the corresponding eigenvalues) in terms of the numbers of Bethe roots of each type. With the help of this formula, we verify numerically (for a generic value of the anisotropy parameter) that the degeneracies and multiplicities of the spectra implied by the quantum group symmetries are completely described by the Bethe ansatz.
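
    For orientation, the periodic XXX Bethe equations and energy referred to above take the following form in one common normalization (standard textbook statements, not quoted from the thesis); a root λ_j = ±i/2 makes both sides of the first equation and the energy ill-defined, which is the origin of the singular solutions.

        \begin{equation}
          \left( \frac{\lambda_j + i/2}{\lambda_j - i/2} \right)^{N}
          = \prod_{\substack{k=1 \\ k \neq j}}^{M}
            \frac{\lambda_j - \lambda_k + i}{\lambda_j - \lambda_k - i},
          \qquad j = 1, \ldots, M,
          \qquad
          E = -\frac{1}{2} \sum_{j=1}^{M} \frac{1}{\lambda_j^{2} + 1/4}.
        \end{equation}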

  5. An Algebraic Construction of Duality Functions for the Stochastic {U_q( A_n^{(1)})} Vertex Model and Its Degenerations

    NASA Astrophysics Data System (ADS)

    Kuan, Jeffrey

    2018-03-01

    A recent paper (Kuniba in Nucl Phys B 913:248-277, 2016) introduced the stochastic U_q(A_n^{(1)}) vertex model. The stochastic S-matrix is related to the R-matrix of the quantum group U_q(A_n^{(1)}) by a gauge transformation. We will show that a certain function D^+_m intertwines with the transfer matrix and its space reversal. When interpreting the transfer matrix as the transition matrix of a discrete-time totally asymmetric particle system on the one-dimensional lattice Z, the function D^+_m becomes a Markov duality function D_m which only depends on q and the vertical spin parameters μ_x. By considering degenerations in the spectral parameter, the duality results also hold on a finite lattice with closed boundary conditions, and for a continuous-time degeneration. This duality function had previously appeared in a multi-species ASEP(q, j) process (Kuan in A multi-species ASEP(q, j) and q-TAZRP with stochastic duality, 2017). The proof here uses that the R-matrix intertwines with the co-product, but does not explicitly use the Yang-Baxter equation. It will also be shown that the stochastic U_q(A_n^{(1)}) vertex model is a multi-species version of a stochastic vertex model studied in Borodin and Petrov (Higher spin six vertex model and symmetric rational functions, 2016) and Corwin and Petrov (Commun Math Phys 343:651-700, 2016). This will be done by generalizing the fusion process of Corwin and Petrov (2016) and showing that it matches the fusion of Kulish et al. (Lett Math Phys 5:393-403, 1981) up to the gauge transformation. We also show, by direct computation, that the multi-species q-Hahn Boson process (which arises at a special value of the spectral parameter) also satisfies duality with respect to D_∞, generalizing the single-species result of Corwin (Int Math Res Not 2015:5577-5603, 2015).
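
    As background (not quoted from the paper), a function D is a Markov duality function for two Markov processes X_t and Y_t when

        \begin{equation}
          \mathbb{E}_{x}\left[ D(X_t, y) \right] = \mathbb{E}_{y}\left[ D(x, Y_t) \right]
          \qquad \text{for all states } x, y \text{ and all } t \ge 0,
        \end{equation}

    or, in matrix form for a discrete-time chain with transition matrices P_X and P_Y, P_X D = D P_Y^T. The intertwining of D^+_m with the transfer matrix and its space reversal is a statement of exactly this type.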

  6. Development and Use of Engineering Standards for Computational Fluid Dynamics for Complex Aerospace Systems

    NASA Technical Reports Server (NTRS)

    Lee, Hyung B.; Ghia, Urmila; Bayyuk, Sami; Oberkampf, William L.; Roy, Christopher J.; Benek, John A.; Rumsey, Christopher L.; Powers, Joseph M.; Bush, Robert H.; Mani, Mortaza

    2016-01-01

    Computational fluid dynamics (CFD) and other advanced modeling and simulation (M&S) methods are increasingly relied on for predictive performance, reliability and safety of engineering systems. Analysts, designers, decision makers, and project managers, who must depend on simulation, need practical techniques and methods for assessing simulation credibility. The AIAA Guide for Verification and Validation of Computational Fluid Dynamics Simulations (AIAA G-077-1998 (2002)), originally published in 1998, was the first engineering standards document available to the engineering community for verification and validation (V&V) of simulations. Much progress has been made in these areas since 1998. The AIAA Committee on Standards for CFD is currently updating this Guide to incorporate in it the important developments that have taken place in V&V concepts, methods, and practices, particularly with regard to the broader context of predictive capability and uncertainty quantification (UQ) methods and approaches. This paper will provide an overview of the changes and extensions currently underway to update the AIAA Guide. Specifically, a framework for predictive capability will be described for incorporating a wide range of error and uncertainty sources identified during the modeling, verification, and validation processes, with the goal of estimating the total prediction uncertainty of the simulation. The Guide's goal is to provide a foundation for understanding and addressing major issues and concepts in predictive CFD. However, this Guide will not recommend specific approaches in these areas as the field is rapidly evolving. It is hoped that the guidelines provided in this paper, and explained in more detail in the Guide, will aid in the research, development, and use of CFD in engineering decision-making.

  7. Kinematic source inversions of teleseismic data based on the QUESO library for uncertainty quantification and prediction

    NASA Astrophysics Data System (ADS)

    Zielke, O.; McDougall, D.; Mai, P. M.; Babuska, I.

    2014-12-01

    One fundamental aspect of seismic hazard mitigation is gaining a better understanding of the rupture process. Because direct observation of the relevant parameters and properties is not possible, other means such as kinematic source inversions are used instead. By constraining the spatial and temporal evolution of fault slip during an earthquake, those inversion approaches may enable valuable insights into the physics of the rupture process. However, due to the underdetermined nature of this inversion problem (i.e., inverting a kinematic source model for an extended fault based on seismic data), the provided solutions are generally non-unique. Here we present a statistical (Bayesian) inversion approach based on an open-source library for uncertainty quantification (UQ) called QUESO that was developed at ICES (UT Austin). The approach has advantages with respect to deterministic inversion approaches as it provides not only a single (non-unique) solution but also uncertainty bounds with it. Those uncertainty bounds help to qualitatively and quantitatively judge how well constrained an inversion solution is and how much rupture complexity the data reliably resolve. The presented inversion scheme uses only teleseismically recorded body waves, but future developments may lead us towards joint inversion schemes. After giving an insight into the inversion scheme itself (based on Delayed Rejection Adaptive Metropolis, DRAM), we explore the method's resolution potential. For that, we synthetically generate teleseismic data, add for example different levels of noise and/or change the fault plane parameterization, and then apply our inversion scheme in the attempt to extract the (known) kinematic rupture model. We conclude by inverting, as an example, real teleseismic data of a recent large earthquake and comparing those results with deterministically derived kinematic source models provided by other research groups.
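
    The accept/reject core of a DRAM-type sampler can be illustrated with a plain random-walk Metropolis step; DRAM adds delayed rejection and adaptive proposal covariance on top of this. The forward model, noise level and two-parameter "slip" vector below are purely hypothetical stand-ins for the teleseismic problem.

        import numpy as np

        rng = np.random.default_rng(3)

        def log_posterior(m, d_obs, forward, sigma):
            # Gaussian likelihood around a (hypothetical) forward-modelled seismogram
            resid = d_obs - forward(m)
            return -0.5 * np.sum(resid ** 2) / sigma ** 2

        def metropolis(d_obs, forward, m0, step, sigma, n_iter=5000):
            # plain random-walk Metropolis accept/reject loop
            m = np.asarray(m0, dtype=float)
            lp = log_posterior(m, d_obs, forward, sigma)
            chain = []
            for _ in range(n_iter):
                m_prop = m + step * rng.standard_normal(m.size)
                lp_prop = log_posterior(m_prop, d_obs, forward, sigma)
                if np.log(rng.uniform()) < lp_prop - lp:
                    m, lp = m_prop, lp_prop
                chain.append(m.copy())
            return np.array(chain)

        # toy usage: recover two "slip" parameters from noisy synthetic data
        true_m = np.array([1.0, 0.3])
        G = rng.standard_normal((20, 2))
        forward = lambda m: G @ m
        d_obs = forward(true_m) + 0.05 * rng.standard_normal(20)
        samples = metropolis(d_obs, forward, m0=[0.0, 0.0], step=0.05, sigma=0.05)
        print(samples[2500:].mean(axis=0))   # posterior mean estimate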

  8. Intrusive Method for Uncertainty Quantification in a Multiphase Flow Solver

    NASA Astrophysics Data System (ADS)

    Turnquist, Brian; Owkes, Mark

    2016-11-01

    Uncertainty quantification (UQ) is a necessary, interesting, and often neglected aspect of fluid flow simulations. To determine the significance of uncertain initial and boundary conditions, a multiphase flow solver is being created which extends a single-phase, intrusive, polynomial chaos scheme into multiphase flows. Reliably estimating the impact of input uncertainty on design criteria can help identify and minimize unwanted variability in critical areas, and has the potential to help advance knowledge in atomizing jets, jet engines, pharmaceuticals, and food processing. Use of an intrusive polynomial chaos method has been shown to significantly reduce computational cost over non-intrusive collocation methods such as Monte Carlo. This method requires transforming the model equations into a weak form through substitution of stochastic (random) variables. Ultimately, the model deploys a stochastic Navier-Stokes equation, a stochastic conservative level set approach including reinitialization, as well as stochastic normals and curvature. By implementing these approaches together in one framework, basic problems may be investigated which shed light on model expansion, uncertainty theory, and fluid flow in general. NSF Grant Number 1511325.
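
    The intrusive (Galerkin) polynomial chaos idea mentioned above can be written compactly; this generic form is standard background, not taken from the abstract. Each uncertain field is expanded in orthogonal polynomials Ψ_k of the random inputs ξ, and projecting the governing equation onto each basis function yields coupled deterministic equations for the modal coefficients:

        \begin{equation}
          u(\mathbf{x}, t; \boldsymbol{\xi}) \approx \sum_{k=0}^{P} u_k(\mathbf{x}, t)\, \Psi_k(\boldsymbol{\xi}),
          \qquad
          \frac{\partial u_m}{\partial t}
          = \frac{\left\langle \Psi_m,\; \mathcal{N}\!\left( \sum_{k=0}^{P} u_k \Psi_k \right) \right\rangle}
                 {\left\langle \Psi_m, \Psi_m \right\rangle},
          \qquad m = 0, \ldots, P,
        \end{equation}

    where N(·) denotes the deterministic right-hand side (here the Navier-Stokes/level-set operator) and the angle brackets denote expectation over ξ.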

  9. Quantum groups, Yang-Baxter maps and quasi-determinants

    NASA Astrophysics Data System (ADS)

    Tsuboi, Zengo

    2018-01-01

    For any quasi-triangular Hopf algebra, there exists the universal R-matrix, which satisfies the Yang-Baxter equation. It is known that the adjoint action of the universal R-matrix on the elements of the tensor square of the algebra constitutes a quantum Yang-Baxter map, which satisfies the set-theoretic Yang-Baxter equation. The map has a zero curvature representation among L-operators defined as images of the universal R-matrix. We find that the zero curvature representation can be solved by the Gauss decomposition of a product of L-operators. We thereby obtain a quasi-determinant expression of the quantum Yang-Baxter map associated with the quantum algebra U_q(gl(n)). Moreover, the map is identified with products of quasi-Plücker coordinates over a matrix composed of the L-operators. We also consider the quasi-classical limit, where the underlying quantum algebra reduces to a Poisson algebra. The quasi-determinant expression of the quantum Yang-Baxter map reduces to ratios of determinants, which give a new expression of a classical Yang-Baxter map.

  10. Uncertainty quantification for complex systems with very high dimensional response using Grassmann manifold variations

    NASA Astrophysics Data System (ADS)

    Giovanis, D. G.; Shields, M. D.

    2018-07-01

    This paper addresses uncertainty quantification (UQ) for problems where scalar (or low-dimensional vector) response quantities are insufficient and, instead, full-field (very high-dimensional) responses are of interest. To do so, an adaptive stochastic simulation-based methodology is introduced that refines the probability space based on Grassmann manifold variations. The proposed method has a multi-element character discretizing the probability space into simplex elements using a Delaunay triangulation. For every simplex, the high-dimensional solutions corresponding to its vertices (sample points) are projected onto the Grassmann manifold. The pairwise distances between these points are calculated using appropriately defined metrics and the elements with large total distance are sub-sampled and refined. As a result, regions of the probability space that produce significant changes in the full-field solution are accurately resolved. An added benefit is that an approximation of the solution within each element can be obtained by interpolation on the Grassmann manifold. The method is applied to study the probability of shear band formation in a bulk metallic glass using the shear transformation zone theory.
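
    The element-refinement criterion rests on distances between subspaces on the Grassmann manifold; the sketch below computes the standard geodesic distance via principal angles. The snapshot matrices and their sizes are hypothetical, and the paper's own metrics and refinement logic are not reproduced here.

        import numpy as np

        def grassmann_distance(Y1, Y2):
            # geodesic distance between the subspaces spanned by the columns of
            # Y1 and Y2 (orthonormalised here), computed from the principal angles
            Q1, _ = np.linalg.qr(Y1)
            Q2, _ = np.linalg.qr(Y2)
            s = np.linalg.svd(Q1.T @ Q2, compute_uv=False)
            theta = np.arccos(np.clip(s, -1.0, 1.0))     # principal angles
            return np.sqrt(np.sum(theta ** 2))

        rng = np.random.default_rng(4)
        # two "full-field" solution snapshots reduced to n x p bases (hypothetical sizes)
        A = rng.standard_normal((500, 5))
        B = A + 0.1 * rng.standard_normal((500, 5))
        print(grassmann_distance(A, A))   # ~0: identical subspaces
        print(grassmann_distance(A, B))   # small but non-zero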

  11. Uncertainty analyses of CO2 plume expansion subsequent to wellbore CO2 leakage into aquifers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hou, Zhangshuan; Bacon, Diana H.; Engel, David W.

    2014-08-01

    In this study, we apply an uncertainty quantification (UQ) framework to CO2 sequestration problems. In one scenario, we look at the risk of wellbore leakage of CO2 into a shallow unconfined aquifer in an urban area; in another scenario, we study the effects of reservoir heterogeneity on CO2 migration. We combine various sampling approaches (quasi-Monte Carlo, probabilistic collocation, and adaptive sampling) in order to reduce the number of forward calculations while trying to fully explore the input parameter space and quantify the input uncertainty. The CO2 migration is simulated using the PNNL-developed simulator STOMP-CO2e (the water-salt-CO2 module). For computationally demanding simulations with 3D heterogeneity fields, we combined the framework with a scalable version module, eSTOMP, as the forward modeling simulator. We built response curves and response surfaces of model outputs with respect to input parameters, to look at the individual and combined effects, and identify and rank the significance of the input parameters.

  12. Identification of microbial carotenoids and isoprenoid quinones from Rhodococcus sp. B7740 and its stability in the presence of iron in model gastric conditions.

    PubMed

    Chen, Yashu; Xie, Bijun; Yang, Jifang; Chen, Jigang; Sun, Zhida

    2018-02-01

    Rhodococcus sp. B7740 is a newly found bacterium which was isolated from seawater at a depth of 25 m in the Arctic. In this paper, Rhodococcus sp. B7740 was discovered for the first time to produce abundant natural isoprenoids, including ubiquinone-4 (UQ-4), 13 kinds of menaquinones, three rare aromatic carotenoids and more than one common carotenoid. These compounds were identified by UV-Visible, HPLC-APCI-MS/MS and HRMS spectra. The results demonstrated that Rhodococcus sp. B7740 might be a worthy source of natural isoprenoids, especially of scarce aromatic carotenoids. Among them, isorenieratene with 528.3762 Da (calculated for 528.3756 Da, error: 1.1 ppm), a carotenoid with an aromatic ring, was purified by HSCCC. The stability of isorenieratene under simulated gastric conditions was measured and compared with that of the common dietary carotenoids β-carotene and lutein. Unlike β-carotene and lutein, isorenieratene was rather stable in the presence of free iron or heme iron. Its high retention rate in the gastrointestinal tract after ingestion indicates benefits for health. Copyright © 2017. Published by Elsevier Ltd.

  13. Fels-Rand: an Xlisp-Stat program for the comparative analysis of data under phylogenetic uncertainty.

    PubMed

    Blomberg, S

    2000-11-01

    Currently available programs for the comparative analysis of phylogenetic data do not perform optimally when the phylogeny is not completely specified (i.e. the phylogeny contains polytomies). Recent literature suggests that a better way to analyse the data would be to create random trees from the known phylogeny that are fully-resolved but consistent with the known tree. A computer program is presented, Fels-Rand, that performs such analyses. A randomisation procedure is used to generate trees that are fully resolved but whose structure is consistent with the original tree. Statistics are then calculated on a large number of these randomly-generated trees. Fels-Rand uses the object-oriented features of Xlisp-Stat to manipulate internal tree representations. Xlisp-Stat's dynamic graphing features are used to provide heuristic tools to aid in analysis, particularly outlier analysis. The usefulness of Xlisp-Stat as a system for phylogenetic computation is discussed. Available from the author or at http://www.uq.edu.au/~ansblomb/Fels-Rand.sit.hqx. Xlisp-Stat is available from http://stat.umn.edu/~luke/xls/xlsinfo/xlsinfo.html. s.blomberg@abdn.ac.uk

  14. Uncertainty Quantification in Scale-Dependent Models of Flow in Porous Media: SCALE-DEPENDENT UQ

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tartakovsky, A. M.; Panzeri, M.; Tartakovsky, G. D.

    Equations governing flow and transport in heterogeneous porous media are scale-dependent. We demonstrate that it is possible to identify a support scale $\eta^*$ such that the typically employed approximate formulations of Moment Equations (ME) yield accurate (statistical) moments of a target environmental state variable. Under these circumstances, the ME approach can be used as an alternative to the Monte Carlo (MC) method for Uncertainty Quantification in diverse fields of Earth and environmental sciences. MEs are directly satisfied by the leading moments of the quantities of interest and are defined on the same support scale as the governing stochastic partial differential equations (PDEs). Computable approximations of the otherwise exact MEs can be obtained through perturbation expansion of moments of the state variables in orders of the standard deviation of the random model parameters. As such, their convergence is guaranteed only for standard deviations smaller than one. We demonstrate our approach in the context of steady-state groundwater flow in a porous medium with a spatially random hydraulic conductivity.

  15. Robust optimization of supersonic ORC nozzle guide vanes

    NASA Astrophysics Data System (ADS)

    Bufi, Elio A.; Cinnella, Paola

    2017-03-01

    An efficient Robust Optimization (RO) strategy is developed for the design of 2D supersonic Organic Rankine Cycle turbine expanders. Dense gas effects are non-negligible for this application, and they are taken into account by describing the thermodynamics by means of the Peng-Robinson-Stryjek-Vera equation of state. The design methodology combines an Uncertainty Quantification (UQ) loop based on a Bayesian kriging model of the system response to the uncertain parameters, used to approximate statistics (mean and variance) of the uncertain system output, a CFD solver, and a multi-objective non-dominated sorting genetic algorithm (NSGA), also based on a kriging surrogate of the multi-objective fitness function, along with an adaptive infill strategy for surrogate enrichment at each generation of the NSGA. The objective functions are the average and variance of the isentropic efficiency. The blade shape is parametrized by means of a Free Form Deformation (FFD) approach. The robust optimal blades are compared to the baseline design (based on the Method of Characteristics) and to a blade obtained by means of a deterministic CFD-based optimization.

  16. Performance and Loads Data from a Wind Tunnel Test of a Full-Scale Rotor with Four Blade Tip Planforms

    DTIC Science & Technology

    1980-09-01


  17. Iterative development of Stand Up Australia: a multi-component intervention to reduce workplace sitting

    PubMed Central

    2014-01-01

    Background Sitting, particularly in prolonged, unbroken bouts, is widespread within the office workplace, yet few interventions have addressed this newly-identified health risk behaviour. This paper describes the iterative development process and resulting intervention procedures for the Stand Up Australia research program focusing on a multi-component workplace intervention to reduce sitting time. Methods The development of Stand Up Australia followed three phases. 1) Conceptualisation: Stand Up Australia was based on social cognitive theory and social ecological model components. These were operationalised via a taxonomy of intervention strategies and designed to target multiple levels of influence including: organisational structures (e.g. via management consultation), the physical work environment (via provision of height-adjustable workstations), and individual employees (e.g. via face-to-face coaching). 2) Formative research: Intervention components were separately tested for their feasibility and acceptability. 3) Pilot studies: Stand Up Comcare tested the integrated intervention elements in a controlled pilot study examining efficacy, feasibility and acceptability. Stand Up UQ examined the additional value of the organisational- and individual-level components over height-adjustable workstations only in a three-arm controlled trial. In both pilot studies, office workers’ sitting time was measured objectively using activPAL3 devices and the intervention was refined based on qualitative feedback from managers and employees. Results Results and feedback from participants and managers involved in the intervention development phases suggest high efficacy, acceptance, and feasibility of all intervention components. The final version of the Stand Up Australia intervention includes strategies at the organisational (senior management consultation, representatives consultation workshop, team champions, staff information and brainstorming session with information booklet, and supportive emails from managers to staff), environmental (height-adjustable workstations), and individual level (face-to-face coaching session and telephone support). Stand Up Australia is currently being evaluated in the context of a cluster-randomised controlled trial at the Department of Human Services (DHS) in Melbourne, Australia. Conclusions Stand Up Australia is an evidence-guided and systematically developed workplace intervention targeting reductions in office workers’ sitting time. PMID:24559162

  18. Underground Coal Thermal Treatment: Task 6 Topical Report, Utah Clean Coal Program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, P.J.; Deo, M.; Edding, E.G.

    The long-term objective of this task is to develop a transformational energy production technology by in-situ thermal treatment of a coal seam for the production of substitute natural gas and/or liquid transportation fuels while leaving much of the coal’s carbon in the ground. This process converts coal to a high-efficiency, low-greenhouse gas (GHG) emitting fuel. It holds the potential of providing environmentally acceptable access to previously unusable coal resources. This task focused on three areas: Experimental. The Underground Coal Thermal Treatment (UCTT) team focused on experiments at two scales, bench-top and slightly larger, to develop data to understand the feasibility of a UCTT process as well as to develop validation/uncertainty quantification (V/UQ) data for the simulation team. Simulation. The investigators completed development of High Performance Computing (HPC) simulations of UCTT. This built on our simulation developments over the course of the task and included the application of Computational Fluid Dynamics (CFD)-based tools to perform HPC simulations of a realistically sized domain representative of an actual coal field located in Utah. CO2 storage. In order to help determine the amount of CO2 that can be sequestered in a coal formation that has undergone UCTT, adsorption isotherms were performed on coals treated to 325, 450, and 600°C with slow heating rates. Raw material was sourced from the Sufco (Utah), Carlinville (Illinois), and North Antelope (Wyoming) mines. The study indicated that adsorptive capacity for the coals increased with treatment temperature and that coals treated to 325°C showed less or similar capacity to the untreated coals.

  19. Differences in the binding of the primary quinone receptor in Photosystem I and reaction centres of Rhodobacter sphaeroides-R26 studied with transient EPR spectroscopy

    NASA Astrophysics Data System (ADS)

    van der Est, A.; Sieckmann, I.; Lubitz, W.; Stehlik, D.

    1995-05-01

    The binding of the primary quinone acceptor, Q, in Photosystem I (PS I) and reaction centres (RC's) of Rhodobacter sphaeroides-R26 in which the non-heme iron has been replaced by zinc (Zn-bRC's) is studied using transient EPR spectroscopy. In PS I, Q is phylloquinone (vitamin K1, VK1) and is referred to as A1. In Zn-bRC's, it is ubiquinone-10 (UQ10) and called QA. Native samples of the two RC's as well as those in which A1 and QA have been replaced by perdeuterated naphthoquinone (NQ-d6) and duroquinone (DQ-d12) are compared. The spin-polarized K-band (24 GHz) spectra of the charge-separated state P+•Q-• (P = primary chlorophyll donor) in Zn-bRC's show that substitution of QA with NQ-d6 and DQ-d12 does not have a measurable effect on the quinone orientation in the QA site. In contrast, large differences in the orientation of VK1, NQ-d6 and DQ-d12 in the A1 site in PS I are found. In addition, all three quinones in PS I are oriented differently than QA in Zn-bRC's. Further, the x and y principal values of the g-tensors of VK1-•, NQ-• and DQ-• in PS I are shown to be significantly larger than in frozen alcohol and Zn-bRC's. It is suggested that the differences in the orientation and g-values of the quinones in the two RC's arise from a weaker binding to the protein in PS I.

  20. Uncertainty Quantification and Regional Sensitivity Analysis of Snow-related Parameters in the Canadian LAnd Surface Scheme (CLASS)

    NASA Astrophysics Data System (ADS)

    Badawy, B.; Fletcher, C. G.

    2017-12-01

    The parameterization of snow processes in land surface models is an important source of uncertainty in climate simulations. Quantifying the importance of snow-related parameters, and their uncertainties, may therefore lead to better understanding and quantification of uncertainty within integrated earth system models. However, quantifying the uncertainty arising from parameterized snow processes is challenging due to the high-dimensional parameter space, poor observational constraints, and parameter interaction. In this study, we investigate the sensitivity of the land simulation to uncertainty in snow microphysical parameters in the Canadian LAnd Surface Scheme (CLASS) using an uncertainty quantification (UQ) approach. A set of training cases (n=400) from CLASS is used to sample each parameter across its full range of empirical uncertainty, as determined from available observations and expert elicitation. A statistical learning model using support vector regression (SVR) is then constructed from the training data (CLASS output variables) to efficiently emulate the dynamical CLASS simulations over a much larger (n=220) set of cases. This approach is used to constrain the plausible range for each parameter using a skill score, and to identify the parameters with the largest influence on the land simulation in CLASS at global and regional scales, using a random forest (RF) permutation importance algorithm. Preliminary sensitivity tests indicate that the snow albedo refreshment threshold and the limiting snow depth, below which bare patches begin to appear, have the highest impact on snow output variables. The results also show a considerable reduction of the plausible ranges of the parameter values, and hence of their uncertainty, which can lead to a significant reduction of the model uncertainty. The implementation and results of this study will be presented and discussed in detail.
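
    An SVR emulator followed by random-forest permutation importance can be sketched with scikit-learn in a few lines. Everything below is a toy stand-in: the two scaled parameters, the synthetic skill score, and the case counts are illustrative choices, not the CLASS configuration.

        import numpy as np
        from sklearn.svm import SVR
        from sklearn.ensemble import RandomForestRegressor
        from sklearn.inspection import permutation_importance

        rng = np.random.default_rng(5)

        # 400 training cases over 2 snow parameters scaled to [0, 1]
        # (hypothetical stand-ins for albedo refreshment threshold and limiting snow depth)
        X = rng.uniform(0.0, 1.0, (400, 2))
        y = 0.7 * X[:, 0] + 0.2 * X[:, 1] ** 2 + 0.05 * rng.standard_normal(400)   # toy skill score

        emulator = SVR(kernel="rbf", C=10.0, epsilon=0.01).fit(X, y)   # cheap surrogate of the model
        X_big = rng.uniform(0.0, 1.0, (20000, 2))
        y_big = emulator.predict(X_big)                                # emulate the large case set

        rf = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_big, y_big)
        imp = permutation_importance(rf, X_big, y_big, n_repeats=10, random_state=0)
        print(dict(zip(["albedo_refresh_threshold", "limiting_snow_depth"],
                       imp.importances_mean)))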

  1. Clean and Secure Energy from Domestic Oil Shale and Oil Sands Resources

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Spinti, Jennifer; Birgenheier, Lauren; Deo, Milind

    This report summarizes the significant findings from the Clean and Secure Energy from Domestic Oil Shale and Oil Sands Resources program sponsored by the Department of Energy through the National Energy Technology Laboratory. There were four principal areas of research: Environmental, legal, and policy issues related to development of oil shale and oil sands resources; Economic and environmental assessment of a domestic unconventional fuels industry; Basin-scale assessment of conventional and unconventional fuel development impacts; and Liquid fuel production by in situ thermal processing of oil shale. Multiple research projects were conducted in each area and the results have been communicated via sponsored conferences, conference presentations, invited talks, interviews with the media, numerous topical reports, journal publications, and a book that summarizes much of the oil shale research relating to Utah’s Uinta Basin. In addition, a repository of materials related to oil shale and oil sands has been created within the University of Utah’s Institutional Repository, including the materials generated during this research program. Below is a listing of all topical and progress reports generated by this project and submitted to the Office of Science and Technical Information (OSTI). A listing of all peer-reviewed publications generated as a result of this project is included at the end of this report; Geomechanical and Fluid Transport Properties 1 (December, 2015); Validation Results for Core-Scale Oil Shale Pyrolysis (February, 2015); and Rates and Mechanisms of Oil Shale Pyrolysis: A Chemical Structure Approach (November, 2014); Policy Issues Associated With Using Simulation to Assess Environmental Impacts (November, 2014); Policy Analysis of the Canadian Oil Sands Experience (September, 2013); V-UQ of Generation 1 Simulator with AMSO Experimental Data (August, 2013); Lands with Wilderness Characteristics, Resource Management Plan Constraints, and Land Exchanges (March, 2012); Conjunctive Surface and Groundwater Management in Utah: Implications for Oil Shale and Oil Sands Development (May, 2012); Development of CFD-Based Simulation Tools for In Situ Thermal Processing of Oil Shale/Sands (February, 2012); Core-Based Integrated Sedimentologic, Stratigraphic, and Geochemical Analysis of the Oil Shale Bearing Green River Formation, Uinta Basin, Utah (April, 2011); Atomistic Modeling of Oil Shale Kerogens and Asphaltenes Along with their Interactions with the Inorganic Mineral Matrix (April, 2011); Pore Scale Analysis of Oil Shale/Sands Pyrolysis (March, 2011); Land and Resource Management Issues Relevant to Deploying In-Situ Thermal Technologies (January, 2011); Policy Analysis of Produced Water Issues Associated with In-Situ Thermal Technologies (January, 2011); and Policy Analysis of Water Availability and Use Issues for Domestic Oil Shale and Oil Sands Development (March, 2010)

  2. Yang-Baxter maps, discrete integrable equations and quantum groups

    NASA Astrophysics Data System (ADS)

    Bazhanov, Vladimir V.; Sergeev, Sergey M.

    2018-01-01

    For every quantized Lie algebra there exists a map from the tensor square of the algebra to itself, which by construction satisfies the set-theoretic Yang-Baxter equation. This map allows one to define an integrable discrete quantum evolution system on quadrilateral lattices, where local degrees of freedom (dynamical variables) take values in a tensor power of the quantized Lie algebra. The corresponding equations of motion admit the zero curvature representation. The commuting Integrals of Motion are defined in the standard way via the Quantum Inverse Problem Method, utilizing Baxter's famous commuting transfer matrix approach. All elements of the above construction have a meaningful quasi-classical limit. As a result one obtains an integrable discrete Hamiltonian evolution system, where the local equations of motion are determined by a classical Yang-Baxter map and the action functional is determined by the quasi-classical asymptotics of the universal R-matrix of the underlying quantum algebra. In this paper we present detailed considerations of the above scheme on the example of the algebra U_q(sl(2)), leading to discrete Liouville equations; however, the approach is rather general and can be applied to any quantized Lie algebra.

  3. Algebro-geometric approach for a centrally extended Uq[sl(2|2)] R-matrix

    NASA Astrophysics Data System (ADS)

    Martins, M. J.

    2017-04-01

    In this paper we investigate the algebro-geometric nature of a solution of the Yang-Baxter equation based on the quantum deformation of the centrally extended sl(2|2) superalgebra proposed by Beisert and Koroteev [1]. We derive an alternative representation for the R-matrix in which the matrix elements are given in terms of rational functions depending on weights sited on a degree-six surface. For a generic gauge the geometry of the weights is governed by a genus-one ruled surface, while for a symmetric gauge choice the weights lie instead on a genus-five curve. We have written down the polynomial identities satisfied by the R-matrix entries needed to uncover the corresponding geometric properties. For arbitrary gauge the R-matrix geometry is argued to be birational to the direct product CP^1 × CP^1 × A, where A is an Abelian surface. For the symmetric gauge we present evidence that the geometric content is that of a surface of general type lying on the so-called Severi line, with irregularity two and geometric genus nine. We discuss potential geometric degenerations when the two free couplings are restricted to certain one-dimensional subspaces.

  4. Joint modeling and registration of cell populations in cohorts of high-dimensional flow cytometric data.

    PubMed

    Pyne, Saumyadipta; Lee, Sharon X; Wang, Kui; Irish, Jonathan; Tamayo, Pablo; Nazaire, Marc-Danie; Duong, Tarn; Ng, Shu-Kay; Hafler, David; Levy, Ronald; Nolan, Garry P; Mesirov, Jill; McLachlan, Geoffrey J

    2014-01-01

    In biomedical applications, an experimenter encounters different potential sources of variation in data such as individual samples, multiple experimental conditions, and multivariate responses of a panel of markers such as from a signaling network. In multiparametric cytometry, which is often used for analyzing patient samples, such issues are critical. While computational methods can identify cell populations in individual samples, without the ability to automatically match them across samples, it is difficult to compare and characterize the populations in typical experiments, such as those responding to various stimulations or distinctive of particular patients or time-points, especially when there are many samples. Joint Clustering and Matching (JCM) is a multi-level framework for simultaneous modeling and registration of populations across a cohort. JCM models every population with a robust multivariate probability distribution. Simultaneously, JCM fits a random-effects model to construct an overall batch template--used for registering populations across samples, and classifying new samples. By tackling systems-level variation, JCM supports practical biomedical applications involving large cohorts. Software for fitting the JCM models have been implemented in an R package EMMIX-JCM, available from http://www.maths.uq.edu.au/~gjm/mix_soft/EMMIX-JCM/.

  5. Enforcing positivity in intrusive PC-UQ methods for reactive ODE systems

    DOE PAGES

    Najm, Habib N.; Valorani, Mauro

    2014-04-12

    We explore the relation between the development of a non-negligible probability of negative states and the instability of numerical integration of the intrusive Galerkin ordinary differential equation system describing uncertain chemical ignition. To prevent this instability without resorting to either multi-element local polynomial chaos (PC) methods or increasing the order of the PC representation in time, we propose a procedure aimed at modifying the amplitude of the PC modes to bring the probability of negative state values below a user-defined threshold. This modification can be effectively described as a filtering procedure of the spectral PC coefficients, which is applied on-the-fly during the numerical integration when the current value of the probability of negative states exceeds the prescribed threshold. We demonstrate the filtering procedure using a simple model of an ignition process in a batch reactor. This is carried out by comparing different observables and error measures as obtained by non-intrusive Monte Carlo and Gauss-quadrature integration and the filtered intrusive procedure. Lastly, the filtering procedure has been shown to effectively stabilize divergent intrusive solutions, and also to improve the accuracy of stable intrusive solutions which are close to the stability limits.
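
    A toy version of such a filter, for a one-dimensional Hermite PC expansion, might look as follows; the threshold, damping factor and mode values are hypothetical, and the paper's actual filter design is not reproduced here.

        import numpy as np

        rng = np.random.default_rng(6)

        def prob_negative(modes, xi):
            # evaluate the probabilists' Hermite PC expansion at samples xi
            # and estimate the probability of negative state values
            psi = np.polynomial.hermite_e.hermevander(xi, len(modes) - 1)
            return np.mean(psi @ modes < 0.0)

        def filter_modes(modes, xi, threshold=1e-3, damping=0.9, max_iter=200):
            # damp the fluctuation modes (k >= 1) until P(state < 0) <= threshold;
            # a crude stand-in for the spectral-coefficient filtering described above
            modes = np.array(modes, dtype=float)
            for _ in range(max_iter):
                if prob_negative(modes, xi) <= threshold:
                    break
                modes[1:] *= damping
            return modes

        xi = rng.standard_normal(100_000)
        modes = np.array([0.2, 0.3, 0.05])       # mean 0.2 with sizeable fluctuations
        print("P(negative) before filtering:", prob_negative(modes, xi))
        print("P(negative) after filtering :", prob_negative(filter_modes(modes, xi), xi))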

  6. National Dam Inspection Program. Baggaley Dam. (NDI Number PA-00454, PennDER Number-65-10) Ohio River Basin, Indian Camp Run, Westmoreland County, Pennsylvania. Phase I Inspection Report.

    DTIC Science & Technology

    1980-05-01


  7. Tracheoesophageal Prosthesis Use Is Associated With Improved Overall Quality of Life in Veterans With Laryngeal Cancer.

    PubMed

    Patel, Ramya S; Mohr, Tiffany; Hartman, Christine; Stach, Carol; Sikora, Andrew G; Zevallos, Jose P; Sandulache, Vlad C

    2018-05-01

    Veterans have an increased risk of laryngeal cancer, yet their oncologic and functional outcomes remain understudied. We sought to determine the longitudinal impact of tracheoesophageal puncture and voice prosthesis on quality-of-life measures in veterans following total laryngectomy (TL). We performed a cross-sectional analysis of TL patients (n = 68) treated at the Michael E. DeBakey Veterans Affairs Medical Center using the Voice Handicap Index (VHI), MD Anderson Dysphagia Index (MDADI), and University of Washington Quality of Life Index (UW-QOL). Using tracheoesophageal (TE) speech was associated with significantly better VHI, MDADI, and UW-QOL scores compared to other forms of communication. The association between TE speech use and VHI, MDADI, and UW-QOL scores persisted even when the analysis was limited to patients with >5-year follow-up and was maintained on multivariate analysis that accounted for a history of radiation and laryngectomy for recurrent laryngeal cancer. Using tracheoesophageal speech after total laryngectomy is associated with durable improvements in quality of life and functional outcomes in veterans. Tracheoesophageal voice restoration should be attempted whenever technically feasible in patients who meet the complex psychosocial and physical requirements to appropriately utilize TE speech.

  8. Predicting the Dynamics of Protein Abundance

    PubMed Central

    Mehdi, Ahmed M.; Patrick, Ralph; Bailey, Timothy L.; Bodén, Mikael

    2014-01-01

    Protein synthesis is finely regulated across all organisms, from bacteria to humans, and its integrity underpins many important processes. Emerging evidence suggests that the dynamic range of protein abundance is greater than that observed at the transcript level. Technological breakthroughs now mean that sequencing-based measurement of mRNA levels is routine, but protocols for measuring protein abundance remain both complex and expensive. This paper introduces a Bayesian network that integrates transcriptomic and proteomic data to predict protein abundance and to model the effects of its determinants. We aim to use this model to follow a molecular response over time, from condition-specific data, in order to understand adaptation during processes such as the cell cycle. With microarray data now available for many conditions, the general utility of a protein abundance predictor is broad. Whereas most quantitative proteomics studies have focused on higher organisms, we developed a predictive model of protein abundance for both Saccharomyces cerevisiae and Schizosaccharomyces pombe to explore the latitude at the protein level. Our predictor primarily relies on mRNA level, mRNA–protein interaction, mRNA folding energy and half-life, and tRNA adaptation. The combination of key features, allowing for the low certainty and uneven coverage of experimental observations, gives comparatively minor but robust prediction accuracy. The model substantially improved the analysis of protein regulation during the cell cycle: predicted protein abundance identified twice as many cell-cycle-associated proteins as experimental mRNA levels. Predicted protein abundance was more dynamic than observed mRNA expression, agreeing with experimental protein abundance from a human cell line. We illustrate how the same model can be used to predict the folding energy of mRNA when protein abundance is available, lending credence to the emerging view that mRNA folding affects translation efficiency. The software and data used in this research are available at http://bioinf.scmb.uq.edu.au/proteinabundance/. PMID:24532840

  9. Predicting the dynamics of protein abundance.

    PubMed

    Mehdi, Ahmed M; Patrick, Ralph; Bailey, Timothy L; Bodén, Mikael

    2014-05-01

    Protein synthesis is finely regulated across all organisms, from bacteria to humans, and its integrity underpins many important processes. Emerging evidence suggests that the dynamic range of protein abundance is greater than that observed at the transcript level. Technological breakthroughs now mean that sequencing-based measurement of mRNA levels is routine, but protocols for measuring protein abundance remain both complex and expensive. This paper introduces a Bayesian network that integrates transcriptomic and proteomic data to predict protein abundance and to model the effects of its determinants. We aim to use this model to follow a molecular response over time, from condition-specific data, in order to understand adaptation during processes such as the cell cycle. With microarray data now available for many conditions, the general utility of a protein abundance predictor is broad. Whereas most quantitative proteomics studies have focused on higher organisms, we developed a predictive model of protein abundance for both Saccharomyces cerevisiae and Schizosaccharomyces pombe to explore the latitude at the protein level. Our predictor primarily relies on mRNA level, mRNA-protein interaction, mRNA folding energy and half-life, and tRNA adaptation. The combination of key features, allowing for the low certainty and uneven coverage of experimental observations, gives comparatively minor but robust prediction accuracy. The model substantially improved the analysis of protein regulation during the cell cycle: predicted protein abundance identified twice as many cell-cycle-associated proteins as experimental mRNA levels. Predicted protein abundance was more dynamic than observed mRNA expression, agreeing with experimental protein abundance from a human cell line. We illustrate how the same model can be used to predict the folding energy of mRNA when protein abundance is available, lending credence to the emerging view that mRNA folding affects translation efficiency. The software and data used in this research are available at http://bioinf.scmb.uq.edu.au/proteinabundance/.

  10. DLocalMotif: a discriminative approach for discovering local motifs in protein sequences.

    PubMed

    Mehdi, Ahmed M; Sehgal, Muhammad Shoaib B; Kobe, Bostjan; Bailey, Timothy L; Bodén, Mikael

    2013-01-01

    Local motifs are patterns of DNA or protein sequences that occur within a sequence interval relative to a biologically defined anchor or landmark. Current protein motif discovery methods do not adequately consider such constraints to identify biologically significant motifs that are only weakly over-represented but spatially confined. Using negatives, i.e. sequences known to not contain a local motif, can further increase the specificity of their discovery. This article introduces the method DLocalMotif that makes use of positional information and negative data for local motif discovery in protein sequences. DLocalMotif combines three scoring functions, measuring degrees of motif over-representation, entropy and spatial confinement, specifically designed to discriminatively exploit the availability of negative data. The method is shown to outperform current methods that use only a subset of these motif characteristics. We apply the method to several biological datasets. The analysis of peroxisomal targeting signals uncovers several novel motifs that occur immediately upstream of the dominant peroxisomal targeting signal-1 signal. The analysis of proline-tyrosine nuclear localization signals uncovers multiple novel motifs that overlap with C2H2 zinc finger domains. We also evaluate the method on classical nuclear localization signals and endoplasmic reticulum retention signals and find that DLocalMotif successfully recovers biologically relevant sequence properties. http://bioinf.scmb.uq.edu.au/dlocalmotif/
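
    DLocalMotif scores candidate motifs by combining over-representation, entropy and spatial confinement relative to an anchor, using negative sequences. The fragment below is a loose, hypothetical illustration of combining such ingredients for an exact k-mer (here a C-terminal SKL-like signal); the scoring formulas are stand-ins and do not reproduce the published method.

      # Hypothetical illustration of scoring a candidate k-mer by combining
      # over-representation (positives vs negatives), positional entropy, and
      # spatial confinement relative to a sequence anchor (here: the C-terminus).
      # Stand-in formulas only, not DLocalMotif's published scoring functions.
      import math

      def occurrences(seqs, kmer):
          """Offsets of kmer occurrences, measured from the end of each sequence."""
          hits = []
          for s in seqs:
              for i in range(len(s) - len(kmer) + 1):
                  if s[i:i + len(kmer)] == kmer:
                      hits.append(len(s) - i)
          return hits

      def score(kmer, positives, negatives, window=(1, 15)):
          pos_hits = occurrences(positives, kmer)
          neg_hits = occurrences(negatives, kmer)
          # 1) over-representation: smoothed log-odds of hit rate in positives vs negatives
          enrich = math.log(((len(pos_hits) + 1) / len(positives)) /
                            ((len(neg_hits) + 1) / len(negatives)))
          # 2) positional entropy of offsets (lower entropy = more localized)
          counts = {}
          for o in pos_hits:
              counts[o] = counts.get(o, 0) + 1
          total = sum(counts.values()) or 1
          entropy = -sum(c / total * math.log(c / total) for c in counts.values())
          # 3) spatial confinement: fraction of positive hits inside the anchored window
          confined = sum(window[0] <= o <= window[1] for o in pos_hits) / max(len(pos_hits), 1)
          return enrich - entropy + confined

      positives = ["MKLSSKL", "MAASKL", "MPPSKL"]   # toy sequences ending in an SKL-like signal
      negatives = ["MKLAAA", "MAAGGG"]
      print(score("SKL", positives, negatives))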

  11. Supercritical fluid extraction and ultra performance liquid chromatography of respiratory quinones for microbial community analysis in environmental and biological samples.

    PubMed

    Hanif, Muhammad; Atsuta, Yoichi; Fujie, Koichi; Daimon, Hiroyuki

    2012-03-05

    Microbial community structure plays a significant role in environmental assessment and animal health management. The development of a superior analytical strategy for the characterization of microbial community structure is an ongoing challenge. In this study, we developed an effective supercritical fluid extraction (SFE) and ultra performance liquid chromatography (UPLC) method for the analysis of bacterial respiratory quinones (RQ) in environmental and biological samples. RQ profile analysis is one of the most widely used culture-independent tools for characterizing microbial community structure. A UPLC equipped with a photo diode array (PDA) detector was successfully applied to the simultaneous determination of ubiquinones (UQ) and menaquinones (MK) without tedious pretreatment. Supercritical carbon dioxide (scCO(2)) extraction with the solid-phase cartridge trap proved to be a more effective and rapid method for extracting respiratory quinones, compared to a conventional organic solvent extraction method. This methodology leads to a successful analytical procedure that involves a significant reduction in the complexity and sample preparation time. Application of the optimized methodology to characterize microbial communities based on the RQ profile was demonstrated for a variety of environmental samples (activated sludge, digested sludge, and compost) and biological samples (swine and Japanese quail feces).

  12. Introducing Scenario Based Learning interactive to postgraduates in UQ Orthodontic Program.

    PubMed

    Naser-ud-Din, S

    2015-08-01

    E-learning has gained momentum in health sciences and seems to have great potential in specialist dental education. Higher acceptability by learners is particularly associated with the surge of smart devices. Currently, there is a limited number of e-learning modules available for dental education, particularly in Orthodontics. Scenario Based Learning interactive (SBLi®) software was used for the first time in Orthodontics postgraduate training at the University of Queensland. Nine interactive modules were created, embedding clinical procedure videos, web-links and evidence-based literature, along with opportunities for self-assessment and evaluation. Qualitative data were collected before and after the administration of the SBLi® modules for Orthodontics. The purpose of these data was to investigate learning styles and the acceptance of e-modules as part of postgraduate training. Advantages of the package included a high acceptance rate, greater confidence in the application of clinical skills covered in the modules, and reduced contact time, which is particularly valuable with limited academic staff. E-modules demonstrated high compatibility with the learning styles of the participants and were considered engaging. It seems apparent that e-learning is most effective in a blended learning environment, supplemented with the traditional classroom approach, rather than as a sole mechanism for postgraduate training. © 2014 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.


  13. Near Real-Time Probabilistic Damage Diagnosis Using Surrogate Modeling and High Performance Computing

    NASA Technical Reports Server (NTRS)

    Warner, James E.; Zubair, Mohammad; Ranjan, Desh

    2017-01-01

    This work investigates novel approaches to probabilistic damage diagnosis that utilize surrogate modeling and high performance computing (HPC) to achieve substantial computational speedup. Motivated by Digital Twin, a structural health management (SHM) paradigm that integrates vehicle-specific characteristics with continual in-situ damage diagnosis and prognosis, the methods studied herein yield near real-time damage assessments that could enable monitoring of a vehicle's health while it is operating (i.e. online SHM). High-fidelity modeling and uncertainty quantification (UQ), both critical to Digital Twin, are incorporated using finite element method simulations and Bayesian inference, respectively. The crux of the proposed Bayesian diagnosis methods, however, is the reformulation of the numerical sampling algorithms (e.g. Markov chain Monte Carlo) used to generate the resulting probabilistic damage estimates. To this end, three distinct methods are demonstrated for rapid sampling that utilize surrogate modeling and exploit various degrees of parallelism for leveraging HPC. The accuracy and computational efficiency of the methods are compared on the problem of strain-based crack identification in thin plates. While each approach has inherent problem-specific strengths and weaknesses, all approaches are shown to provide accurate probabilistic damage diagnoses and several orders of magnitude computational speedup relative to a baseline Bayesian diagnosis implementation.
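
    As a sketch of the general recipe described above (surrogate modeling plus Bayesian sampling), the following stand-in example fits a cheap polynomial surrogate to a pretend forward model and runs random-walk Metropolis on a one-parameter crack-length posterior. The forward model, noise level and prior are invented; the NASA work uses finite element simulations and more elaborate samplers.

      # Sketch of surrogate-accelerated Bayesian damage estimation: a cheap surrogate
      # replaces the forward model inside a random-walk Metropolis sampler.
      # The "true" forward model, surrogate, and measurement setup below are invented
      # stand-ins, not the Digital Twin models.
      import numpy as np

      rng = np.random.default_rng(0)

      def forward_model(crack_length):
          """Pretend high-fidelity model: strain response as a function of crack length."""
          return 100.0 * np.tanh(0.5 * crack_length)

      # Build a cheap polynomial surrogate from a handful of expensive evaluations.
      train_x = np.linspace(0.0, 10.0, 15)
      train_y = forward_model(train_x)
      surrogate = np.poly1d(np.polyfit(train_x, train_y, deg=5))

      # Synthetic noisy strain measurement for a "true" crack length of 4.0.
      sigma = 2.0
      measurement = forward_model(4.0) + rng.normal(0.0, sigma)

      def log_posterior(a):
          if not 0.0 <= a <= 10.0:            # uniform prior on [0, 10]
              return -np.inf
          resid = measurement - surrogate(a)  # surrogate replaces the expensive model
          return -0.5 * (resid / sigma) ** 2

      # Random-walk Metropolis sampling of the crack-length posterior.
      samples, a = [], 5.0
      lp = log_posterior(a)
      for _ in range(20000):
          prop = a + rng.normal(0.0, 0.3)
          lp_prop = log_posterior(prop)
          if np.log(rng.uniform()) < lp_prop - lp:
              a, lp = prop, lp_prop
          samples.append(a)

      samples = np.array(samples[5000:])      # drop burn-in
      print(f"posterior mean crack length: {samples.mean():.2f} +/- {samples.std():.2f}")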

  14. Uncertainty quantification analysis of the dynamics of an electrostatically actuated microelectromechanical switch model

    NASA Astrophysics Data System (ADS)

    Snow, Michael G.; Bajaj, Anil K.

    2015-08-01

    This work presents an uncertainty quantification (UQ) analysis of a comprehensive model for an electrostatically actuated microelectromechanical system (MEMS) switch. The goal is to elucidate the effects of parameter variations on certain key performance characteristics of the switch. A sufficiently detailed model of the electrostatically actuated switch in the basic configuration of a clamped-clamped beam is developed. This multi-physics model accounts for various physical effects, including the electrostatic fringing field, finite length of electrodes, squeeze film damping, and contact between the beam and the dielectric layer. The performance characteristics of immediate interest are the static and dynamic pull-in voltages for the switch. Numerical approaches for evaluating these characteristics are developed and described. Using Latin Hypercube Sampling and other sampling methods, the model is evaluated to find these performance characteristics when variability in the model's geometric and physical parameters is specified. Response surfaces of these results are constructed via a Multivariate Adaptive Regression Splines (MARS) technique. Using a Direct Simulation Monte Carlo (DSMC) technique on these response surfaces gives smooth probability density functions (PDFs) of the output characteristics when input probability characteristics are specified. The relative variation in the two pull-in voltages due to each of the input parameters is used to determine the critical parameters.
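
    The workflow in this record (Latin Hypercube sampling of uncertain inputs, a response surface fit to the model outputs, and dense Monte Carlo on that surface to obtain smooth output PDFs) can be sketched as below. The pull-in voltage formula and parameter ranges are invented, and a polynomial ridge response surface stands in for MARS, which is not part of the standard scientific Python stack.

      # Sketch of the sampling-and-surrogate workflow: LHS samples of uncertain inputs
      # feed a (stand-in) model, a response surface is fit, and dense Monte Carlo on
      # the surface yields a smooth output PDF.
      import numpy as np
      from scipy.stats import qmc, gaussian_kde
      from sklearn.preprocessing import PolynomialFeatures
      from sklearn.linear_model import Ridge
      from sklearn.pipeline import make_pipeline

      rng = np.random.default_rng(1)

      def pull_in_voltage(gap_um, thickness_um):
          """Toy stand-in for the MEMS model's static pull-in voltage."""
          return 12.0 * gap_um ** 1.5 / np.sqrt(thickness_um)

      # 1) Latin Hypercube samples over the uncertain geometric parameters.
      bounds_lo, bounds_hi = np.array([1.5, 1.0]), np.array([2.5, 2.0])   # gap, thickness (um)
      X = qmc.scale(qmc.LatinHypercube(d=2, seed=1).random(200), bounds_lo, bounds_hi)
      y = pull_in_voltage(X[:, 0], X[:, 1])

      # 2) Fit a response surface to the sampled model evaluations.
      surface = make_pipeline(PolynomialFeatures(degree=3), Ridge(alpha=1e-6)).fit(X, y)

      # 3) Dense Monte Carlo on the response surface with assumed input distributions.
      mc = np.column_stack([rng.normal(2.0, 0.1, 100000), rng.normal(1.5, 0.08, 100000)])
      v_mc = surface.predict(mc)
      pdf = gaussian_kde(v_mc)
      print(f"pull-in voltage: mean {v_mc.mean():.2f} V, std {v_mc.std():.2f} V")
      print(f"PDF value at the mean: {pdf(v_mc.mean())[0]:.3f}")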

  15. An arsenate-reducing and alkane-metabolizing novel bacterium, Rhizobium arsenicireducens sp. nov., isolated from arsenic-rich groundwater.

    PubMed

    Mohapatra, Balaram; Sarkar, Angana; Joshi, Swati; Chatterjee, Atrayee; Kazy, Sufia Khannam; Maiti, Mrinal Kumar; Satyanarayana, Tulasi; Sar, Pinaki

    2017-03-01

    A novel arsenic (As)-resistant, arsenate-respiring, alkane-metabolizing bacterium KAs 5-22T, isolated from As-rich groundwater of West Bengal, was characterized by physiological and genomic properties. Cells of strain KAs 5-22T were Gram-stain-negative, rod-shaped, motile, and facultatively anaerobic. Growth occurred optimally at pH 6.0-7.0 and 30 °C. 16S rRNA gene analysis affiliated the strain KAs 5-22T to the genus Rhizobium, showing maximum similarity (98.4 %) with the type strain of Rhizobium naphthalenivorans TSY03bT, followed by (98.0 % similarity) Rhizobium selenitireducens B1T. The genomic G+C content was 59.4 mol%, and DNA-DNA relatedness with its closest phylogenetic neighbors was 50.2 %. Chemotaxonomy indicated UQ-10 as the major quinone; phosphatidylethanolamine, phosphatidylglycerol, and diphosphatidylglycerol as major polar lipids; and C16:0, C17:0, 2-OH C10:0, 3-OH C16:0, and unresolved C18:1 ω7c/ω9c as predominant fatty acids. The cells were found to reduce O2, As5+, NO3-, SO42- and Fe3+ as alternate electron acceptors. The strain's ability to metabolize dodecane or other alkanes as sole carbon source using As5+ as terminal electron acceptor was supported by the presence of genes encoding benzyl succinate synthase (bssA like) and the molybdopterin-binding site (mopB) of the As5+ respiratory reductase (arrA). Differential phenotypic, chemotaxonomic, genotypic as well as physiological properties revealed that the strain KAs 5-22T is separated from its nearest recognized Rhizobium species. On the basis of the data presented, strain KAs 5-22T is considered to represent a novel species of the genus Rhizobium, for which the name Rhizobium arsenicireducens sp. nov. is proposed, with KAs 5-22T as the type strain (=LMG 28795T =MTCC 12115T).

  16. Critical Transitions in Thin Layer Turbulence

    NASA Astrophysics Data System (ADS)

    Benavides, Santiago; Alexakis, Alexandros

    2017-11-01

    We investigate a model of thin layer turbulence that follows the evolution of the two-dimensional motions u_2D(x, y) along the horizontal directions (x, y) coupled to a single Fourier mode along the vertical direction (z) of the form u_q(x, y, z) = [v_x(x, y) sin(qz), v_y(x, y) sin(qz), v_z(x, y) cos(qz)], thus reducing the system to two coupled, two-dimensional equations. Its reduced dimensionality allows a thorough investigation of the transition from a forward to an inverse cascade of energy as the thickness of the layer H = π/q is varied. Starting from a thick layer and reducing its thickness, it is shown that two critical heights are met: (i) one for which the forward unidirectional cascade (similar to three-dimensional turbulence) transitions to a bidirectional cascade transferring energy to both small and large scales, and (ii) one for which the bidirectional cascade transitions to a unidirectional inverse cascade when the layer becomes very thin (similar to two-dimensional turbulence). The two critical heights are shown to have different properties close to criticality that we are able to analyze with numerical simulations for a wide range of Reynolds numbers and aspect ratios. This work was granted access to the HPC resources of MesoPSL financed by the Region Ile de France and the project Equip@Meso (reference ANR-10-EQPX-29-01).
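
    For readability, the single-mode vertical ansatz quoted in this record can also be written in LaTeX form (a restatement of the formula above, not additional content from the paper):

      \[
        \mathbf{u}_q(x,y,z) \;=\; \bigl( v_x(x,y)\,\sin(qz),\; v_y(x,y)\,\sin(qz),\; v_z(x,y)\,\cos(qz) \bigr),
        \qquad H = \pi/q ,
      \]

    which, together with the purely horizontal field u_2D(x, y), closes the system into two coupled two-dimensional equations.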

  17. Revealing the functionality of hypothetical protein KPN00728 from Klebsiella pneumoniae MGH78578: molecular dynamics simulation approaches

    PubMed Central

    2011-01-01

    Background Previously, the hypothetical protein KPN00728 from Klebsiella pneumoniae MGH78578 was identified as the succinate dehydrogenase (SDH) chain C subunit via structural prediction and molecular docking simulation studies. However, due to limitations in docking simulation, an in-depth understanding of how the SDH interaction occurs across the transmembrane region could not be provided. Results In the present study, molecular dynamics (MD) simulation of KPN00728 and SDH chain D in a membrane was performed in order to gain a deeper insight into its molecular role as SDH. Structural stability was confirmed by calculations of area per lipid, tail order parameter, lipid thickness and secondary structural properties. Interestingly, water molecules were found to very likely mediate the interaction between ubiquinone (UQ) and SDH chain C via interactions with the Ser27 and Arg31 residues, as compared with the earlier docking study. Polar residues such as Asp95 and Glu101 (KPN00728) and Asp15 and Glu78 (SDH chain D) might have contributed to the creation of a polar environment, which is essential for the electron transport chain in the Krebs cycle. Conclusions Apart from the comparable structural stability, the dynamics of the interacting residues and the hydrogen bonding analysis further support that the interaction of KPN00728 as SDH is preserved, in good agreement with our earlier postulation. PMID:22372825

  18. Failure analysis of parameter-induced simulation crashes in climate models

    NASA Astrophysics Data System (ADS)

    Lucas, D. D.; Klein, R.; Tannahill, J.; Ivanova, D.; Brandon, S.; Domyancic, D.; Zhang, Y.

    2013-01-01

    Simulations using IPCC-class climate models are subject to fail or crash for a variety of reasons. Quantitative analysis of the failures can yield useful insights to better understand and improve the models. During the course of uncertainty quantification (UQ) ensemble simulations to assess the effects of ocean model parameter uncertainties on climate simulations, we experienced a series of simulation crashes within the Parallel Ocean Program (POP2) component of the Community Climate System Model (CCSM4). About 8.5% of our CCSM4 simulations failed for numerical reasons at combinations of POP2 parameter values. We apply support vector machine (SVM) classification from machine learning to quantify and predict the probability of failure as a function of the values of 18 POP2 parameters. A committee of SVM classifiers readily predicts model failures in an independent validation ensemble, as assessed by the area under the receiver operating characteristic (ROC) curve metric (AUC > 0.96). The causes of the simulation failures are determined through a global sensitivity analysis. Combinations of 8 parameters related to ocean mixing and viscosity from three different POP2 parameterizations are the major sources of the failures. This information can be used to improve POP2 and CCSM4 by incorporating correlations across the relevant parameters. Our method can also be used to quantify, predict, and understand simulation crashes in other complex geoscientific models.
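
    A committee of SVM classifiers evaluated by ROC AUC, as described above, can be sketched as follows. The 18-dimensional inputs and the crash rule are synthetic stand-ins for the POP2 parameter ensemble; the point is only to show the bootstrap-committee and AUC mechanics.

      # Sketch of the classification step: a committee of SVM classifiers predicts
      # simulation failure from parameter values, assessed by ROC AUC on a held-out
      # ensemble. The data here are synthetic stand-ins, not the CCSM4 ensemble.
      import numpy as np
      from sklearn.svm import SVC
      from sklearn.model_selection import train_test_split
      from sklearn.metrics import roc_auc_score

      rng = np.random.default_rng(0)
      n, d = 2000, 18
      X = rng.uniform(0.0, 1.0, size=(n, d))
      # Hypothetical failure rule: crashes at joint extremes of two "mixing" parameters.
      y = ((X[:, 0] > 0.85) & (X[:, 1] < 0.2) | (rng.uniform(size=n) < 0.02)).astype(int)

      X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

      # Committee: several SVMs trained on bootstrap resamples; average their scores.
      scores = np.zeros(len(X_test))
      n_members = 10
      for seed in range(n_members):
          idx = np.random.default_rng(seed).integers(0, len(X_train), len(X_train))
          clf = SVC(kernel="rbf", C=10.0, gamma="scale", probability=True, random_state=seed)
          clf.fit(X_train[idx], y_train[idx])
          scores += clf.predict_proba(X_test)[:, 1] / n_members

      print(f"committee ROC AUC: {roc_auc_score(y_test, scores):.3f}")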

  19. Failure analysis of parameter-induced simulation crashes in climate models

    NASA Astrophysics Data System (ADS)

    Lucas, D. D.; Klein, R.; Tannahill, J.; Ivanova, D.; Brandon, S.; Domyancic, D.; Zhang, Y.

    2013-08-01

    Simulations using IPCC (Intergovernmental Panel on Climate Change)-class climate models are subject to fail or crash for a variety of reasons. Quantitative analysis of the failures can yield useful insights to better understand and improve the models. During the course of uncertainty quantification (UQ) ensemble simulations to assess the effects of ocean model parameter uncertainties on climate simulations, we experienced a series of simulation crashes within the Parallel Ocean Program (POP2) component of the Community Climate System Model (CCSM4). About 8.5% of our CCSM4 simulations failed for numerical reasons at combinations of POP2 parameter values. We applied support vector machine (SVM) classification from machine learning to quantify and predict the probability of failure as a function of the values of 18 POP2 parameters. A committee of SVM classifiers readily predicted model failures in an independent validation ensemble, as assessed by the area under the receiver operating characteristic (ROC) curve metric (AUC > 0.96). The causes of the simulation failures were determined through a global sensitivity analysis. Combinations of 8 parameters related to ocean mixing and viscosity from three different POP2 parameterizations were the major sources of the failures. This information can be used to improve POP2 and CCSM4 by incorporating correlations across the relevant parameters. Our method can also be used to quantify, predict, and understand simulation crashes in other complex geoscientific models.

  20. Teaching style beliefs among U.S. and Israeli faculty.

    PubMed

    Behar-Horenstein, Linda S; Mitchell, Gail S; Notzer, Netta; Penfield, Randy; Eli, Ilana

    2006-08-01

    The purpose of this study was to determine if self-reported teaching style beliefs were different among faculty at a U.S. and an Israeli dental school. Teacher-centered practices refer to beliefs that the teacher holds the subject matter expertise and students are generally passive learners who must be told what to think. Student-centered practices refer to beliefs that students must learn how to construct their own understanding. Student-centered teaching is directed towards enabling students to think about complex issues. Twenty-seven of fifty-eight (47.37 percent) faculty at a dental school in the United States and thirty of thirty-four (88 percent) Israeli dental faculty teaching in basic science courses completed the Teaching Behavior Preferences Survey (TBPS). The TBPS is a thirty-item instrument that measures two domains of teaching styles--teacher-centered (TC) and student-centered (SC)--and four subdomains: methods of instruction (MI), classroom milieu (CM), use of questions (UQ), and use of assessment (UA). Findings revealed that there were no significant differences in student-centered and teacher-centered teaching practices and methods of instruction, classroom milieu, and use of questions. There was a significant difference between the U.S. and Israeli groups in their reported use of assessment. The U.S. faculty reported a greater preference for student-centered assessment practices than did the Israeli faculty.

  1. Development of a Mobile Phone-Based Weight Loss Lifestyle Intervention for Filipino Americans with Type 2 Diabetes: Protocol and Early Results From the PilAm Go4Health Randomized Controlled Trial

    PubMed Central

    2016-01-01

    Background Filipino Americans are the second largest Asian subgroup in the United States, and were found to have the highest prevalence of obesity and type 2 diabetes (T2D) compared to all Asian subgroups and non-Hispanic whites. In addition to genetic factors, risk factors for Filipinos that contribute to this health disparity include high sedentary rates and high fat diets. However, Filipinos are seriously underrepresented in preventive health research. Research is needed to identify effective interventions to reduce Filipino diabetes risks, subsequent comorbidities, and premature death. Objective The overall goal of this project is to assess the feasibility and potential efficacy of the Filipino Americans Go4Health Weight Loss Program (PilAm Go4Health). This program is a culturally adapted weight loss lifestyle intervention, using digital technology for Filipinos with T2D, to reduce their risk for metabolic syndrome. Methods This study was a 3-month mobile phone-based pilot randomized controlled trial (RCT) weight loss intervention with a wait list active control, followed by a 3-month maintenance phase design for 45 overweight Filipinos with T2D. Participants were randomized to an intervention group (n=22) or active control group (n=23), and analyses of the results are underway. The primary outcome will be percent weight change of the participants, and secondary outcomes will include changes in waist circumference, fasting plasma glucose, glycated hemoglobin A1c, physical activity, fat intake, and sugar-sweetened beverage intake. Data analyses will include descriptive statistics to describe sample characteristics and a feasibility assessment based on recruitment, adherence, and retention. Chi-square, Fisher's exact tests, t-tests, and nonparametric rank tests will be used to assess characteristics of randomized groups. Primary analyses will use analysis of covariance and linear mixed models to compare primary and secondary outcomes at 3 months, compared by arm and controlled for baseline levels. Results Recruitment was completed in January, 2016, and participant follow-up continued through June, 2016. At baseline, mean age was 57 years, 100% (45/45) of participants self-identified as Filipinos, and the cohort was comprised of 17 males and 28 females. Overall, participants were obese with a baseline mean body mass index of 30.2 kg/m2 (standard deviation 4.9). The majority of participants were immigrants (84%, 38/45), with 47% (21/45) living in the United States for more than 10 years. One third of all participants (33%, 15/45) had previously used a pedometer. Conclusions This study will provide preliminary evidence to determine if the PilAm Go4Health weight loss lifestyle intervention is feasible, and if the program demonstrates potential efficacy to reduce risks for metabolic syndrome in Filipinos with T2D. Positive results will lend support for a larger RCT to evaluate the effectiveness of the PilAm Go4Health intervention for Filipinos. ClinicalTrial ClinicalTrials.gov: NCT02290184; https://clinicaltrials.gov/ct2/show/NCT02290184 (Archived at http://www.webcitation.org/6k1kUqKSP) PMID:27608829

  2. Development of a Mobile Phone-Based Weight Loss Lifestyle Intervention for Filipino Americans with Type 2 Diabetes: Protocol and Early Results From the PilAm Go4Health Randomized Controlled Trial.

    PubMed

    Bender, Melinda Sarmiento; Santos, Glenn-Milo; Villanueva, Carissa; Arai, Shoshana

    2016-09-08

    Filipino Americans are the second largest Asian subgroup in the United States, and were found to have the highest prevalence of obesity and type 2 diabetes (T2D) compared to all Asian subgroups and non-Hispanic whites. In addition to genetic factors, risk factors for Filipinos that contribute to this health disparity include high sedentary rates and high fat diets. However, Filipinos are seriously underrepresented in preventive health research. Research is needed to identify effective interventions to reduce Filipino diabetes risks, subsequent comorbidities, and premature death. The overall goal of this project is to assess the feasibility and potential efficacy of the Filipino Americans Go4Health Weight Loss Program (PilAm Go4Health). This program is a culturally adapted weight loss lifestyle intervention, using digital technology for Filipinos with T2D, to reduce their risk for metabolic syndrome. This study was a 3-month mobile phone-based pilot randomized controlled trial (RCT) weight loss intervention with a wait list active control, followed by a 3-month maintenance phase design for 45 overweight Filipinos with T2D. Participants were randomized to an intervention group (n=22) or active control group (n=23), and analyses of the results are underway. The primary outcome will be percent weight change of the participants, and secondary outcomes will include changes in waist circumference, fasting plasma glucose, glycated hemoglobin A1c, physical activity, fat intake, and sugar-sweetened beverage intake. Data analyses will include descriptive statistics to describe sample characteristics and a feasibility assessment based on recruitment, adherence, and retention. Chi-square, Fisher's exact tests, t-tests, and nonparametric rank tests will be used to assess characteristics of randomized groups. Primary analyses will use analysis of covariance and linear mixed models to compare primary and secondary outcomes at 3 months, compared by arm and controlled for baseline levels. Recruitment was completed in January, 2016, and participant follow-up continued through June, 2016. At baseline, mean age was 57 years, 100% (45/45) of participants self-identified as Filipinos, and the cohort was comprised of 17 males and 28 females. Overall, participants were obese with a baseline mean body mass index of 30.2 kg/m2 (standard deviation 4.9). The majority of participants were immigrants (84%, 38/45), with 47% (21/45) living in the United States for more than 10 years. One third of all participants (33%, 15/45) had previously used a pedometer. This study will provide preliminary evidence to determine if the PilAm Go4Health weight loss lifestyle intervention is feasible, and if the program demonstrates potential efficacy to reduce risks for metabolic syndrome in Filipinos with T2D. Positive results will lend support for a larger RCT to evaluate the effectiveness of the PilAm Go4Health intervention for Filipinos. ClinicalTrials.gov: NCT02290184; https://clinicaltrials.gov/ct2/show/NCT02290184 (Archived at http://www.webcitation.org/6k1kUqKSP).

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    DeChant, Lawrence Justin; Smith, Justin A.

    Here we discuss an improved Corcos (1963) style cross-spectral density utilizing zero-pressure-gradient, supersonic (Beresh et al. (2013)) data sets. Using the connection between narrow-band measurements and the broadband cross-spectral density, i.e. Γ(ξ, η, ω) = Φ(ω) A(ωη/U) exp(-iωξ/U), we focus on estimating coherence expressions of the form A(ξω_nb/U) and B(ηω_nb/U), where ω_nb denotes the narrow-band frequency, i.e. the band center frequency value, and ξ and η are sensor spacings in the streamwise/longitudinal and cross-stream/lateral directions, respectively. A methodology to estimate the parameters is discussed which retains the Corcos exponential functional forms, A(ξω/U) = exp(-k_long ξω/U) and B(ηω/U) = exp(-k_lat ηω/U), but identifies new parameters (constants) consistent with the Beresh et al. data sets. The Corcos result requires that the data be properly explained by the self-similar variables ξω/U and ηω/U. The longitudinal (streamwise) variable ξω/U tends to provide a better data collapse, while, consistent with the literature, the lateral ηω/U is only successful for higher band center frequencies. Assuming the similarity variables provide a useful description of the data, the longitudinal coherence decay constant obtained using the Beresh et al. data sets, k_long ≈ 0.28-0.36, is approximately 3x larger than the “traditional” (low speed, large Reynolds number and zero pressure gradient) value of k_long ≈ 0.11. We suggest that the most likely reason the Beresh et al. data sets incur increased longitudinal decay, which results in reduced coherence lengths, is wall-shear-induced compression causing an adverse pressure gradient. Focusing on the higher band center frequency measurements where the frequency-dependent similarity variables are applicable, the lateral or transverse coherence decay constant k_lat ≈ 0.7 is consistent with the “traditional” (low speed, large Reynolds number and zero pressure gradient) value. It should be noted that the longitudinal/streamwise coherence decay deviates from the value observed by other researchers, while the lateral/cross-stream value is consistent with what has been observed by other researchers. We believe that while the measurements used to obtain the new decay constant estimates are from internal wind tunnel tests, they likely provide a useful estimate of expected reentry flow behavior and are therefore recommended for use. These data could also be useful in determining the uncertainty of correlation length for an uncertainty quantification (UQ) analysis.
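
    The parameter-estimation step described in this record amounts to collapsing measured coherence onto the similarity variable (spacing times omega over U) and fitting an exponential decay constant. A hedged sketch with synthetic coherence values (not the Beresh et al. measurements) follows; the assumed convective speed and sensor spacings are placeholders.

      # Sketch of estimating a Corcos-style coherence decay constant: coherence at
      # several sensor spacings and band-center frequencies is collapsed onto the
      # similarity variable s = xi * omega / U and fit with exp(-k * s).
      # The "measured" coherence values below are synthetic.
      import numpy as np
      from scipy.optimize import curve_fit

      U = 600.0                                                   # convective speed, m/s (assumed)
      spacings = np.array([0.01, 0.02, 0.03, 0.04])               # sensor spacing, m
      freqs = 2.0 * np.pi * np.array([2000.0, 5000.0, 10000.0])   # band centers, rad/s

      # Similarity variable for every spacing/frequency pair.
      s = np.concatenate([spacings * w / U for w in freqs])
      k_true = 0.32
      coherence = np.exp(-k_true * s) * (1.0 + 0.03 * np.random.default_rng(2).normal(size=s.size))

      def decay(s, k):
          return np.exp(-k * s)

      (k_fit,), _ = curve_fit(decay, s, coherence, p0=[0.1])
      print(f"fitted longitudinal decay constant k ~= {k_fit:.3f}")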

  4. Annual Report: Carbon Capture Simulation Initiative (CCSI) (30 September 2013)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miller, David C.; Syamlal, Madhava; Cottrell, Roger

    2013-09-30

    The Carbon Capture Simulation Initiative (CCSI) is a partnership among national laboratories, industry and academic institutions that is developing and deploying state-of-the-art computational modeling and simulation tools to accelerate the commercialization of carbon capture technologies from discovery to development, demonstration, and ultimately the widespread deployment to hundreds of power plants. The CCSI Toolset will provide end users in industry with a comprehensive, integrated suite of scientifically validated models, with uncertainty quantification (UQ), optimization, risk analysis and decision making capabilities. The CCSI Toolset incorporates commercial and open-source software currently in use by industry and is also developing new software tools as necessary to fill technology gaps identified during execution of the project. Ultimately, the CCSI Toolset will (1) enable promising concepts to be more quickly identified through rapid computational screening of devices and processes; (2) reduce the time to design and troubleshoot new devices and processes; (3) quantify the technical risk in taking technology from laboratory-scale to commercial-scale; and (4) stabilize deployment costs more quickly by replacing some of the physical operational tests with virtual power plant simulations. CCSI is led by the National Energy Technology Laboratory (NETL) and leverages the Department of Energy (DOE) national laboratories’ core strengths in modeling and simulation, bringing together the best capabilities at NETL, Los Alamos National Laboratory (LANL), Lawrence Berkeley National Laboratory (LBNL), Lawrence Livermore National Laboratory (LLNL), and Pacific Northwest National Laboratory (PNNL). The CCSI’s industrial partners provide representation from the power generation industry, equipment manufacturers, technology providers and engineering and construction firms. The CCSI’s academic participants (Carnegie Mellon University, Princeton University, West Virginia University, Boston University and the University of Texas at Austin) bring unparalleled expertise in multiphase flow reactors, combustion, process synthesis and optimization, planning and scheduling, and process control techniques for energy processes. During Fiscal Year (FY) 13, CCSI announced the initial release of its first set of computational tools and models during the October 2012 meeting of its Industry Advisory Board. This initial release led to five companies licensing the CCSI Toolset under a Test and Evaluation Agreement this year. By the end of FY13, the CCSI Technical Team had completed development of an updated suite of computational tools and models. The list below summarizes the new and enhanced toolset components that were released following comprehensive testing during October 2013.
    1. FOQUS: Framework for Optimization and Quantification of Uncertainty and Sensitivity. The package includes the FOQUS Graphic User Interface (GUI), a simulation-based optimization engine, the Turbine Client, and heat integration capabilities. There is also an updated simulation interface and a new configuration GUI for connecting Aspen Plus or Aspen Custom Modeler (ACM) simulations to FOQUS and the Turbine Science Gateway.
    2. A new MFIX-based Computational Fluid Dynamics (CFD) model to predict particle attrition.
    3. A new dynamic reduced model (RM) builder, which generates computationally efficient RMs of the behavior of a dynamic system.
    4. A completely re-written version of the algebraic surrogate model builder for optimization (ALAMO). The new version is several orders of magnitude faster than the initial release and eliminates the MATLAB dependency.
    5. A new suite of high resolution filtered models for the hydrodynamics associated with horizontal cylindrical objects in a flow path.
    6. The new Turbine Science Gateway (Cluster), which supports FOQUS for running multiple simulations for optimization or UQ using a local computer or cluster.
    7. A new statistical tool (BSS-ANOVA-UQ) for calibration and validation of CFD models.
    8. A new basic data submodel in Aspen Plus format for a representative high viscosity capture solvent, the 2-MPZ system.
    9. An updated RM tool for CFD (REVEAL) that can create an RM from MFIX. A new lightweight, stand-alone version will be available in late 2013.
    10. An updated RM integration tool to convert the RM from REVEAL into a CAPE-OPEN or ACM model for use in a process simulator.
    11. An updated suite of unified steady-state and dynamic process models for solid sorbent carbon capture, including bubbling fluidized bed and moving bed reactors.
    12. An updated and unified set of compressor models, including a steady-state design point model and a dynamic model with surge detection.
    13. A new framework for the synthesis and optimization of coal oxycombustion power plants using advanced optimization algorithms. This release focuses on modeling and optimization of a cryogenic air separation unit (ASU).
    14. A new technical risk model in spreadsheet format.
    15. An updated version of the sorbent kinetic/equilibrium model for parameter estimation for the 1st generation sorbent model.
    16. An updated process synthesis superstructure model to determine optimal process configurations utilizing surrogate models from ALAMO for adsorption and regeneration in a solid sorbent process.
    17. Validation models for the NETL Carbon Capture Unit utilizing sorbent AX. Additional validation models will be available for sorbent 32D in 2014.
    18. An updated hollow fiber membrane model and system example for carbon capture.
    19. An updated reference power plant model in Thermoflex that includes additional steam extraction and reinjection points to enable the heat integration module.
    20. An updated financial risk model in spreadsheet format.

  5. Quinone reduction via secondary B-branch electron transfer in mutant bacterial reaction centers.

    PubMed

    Laible, Philip D; Kirmaier, Christine; Udawatte, Chandani S M; Hofman, Samuel J; Holten, Dewey; Hanson, Deborah K

    2003-02-18

    Symmetry-related branches of electron-transfer cofactors-initiating with a primary electron donor (P) and terminating in quinone acceptors (Q)-are common features of photosynthetic reaction centers (RC). Experimental observations show activity of only one of them-the A branch-in wild-type bacterial RCs. In a mutant RC, we now demonstrate that electron transfer can occur along the entire, normally inactive B-branch pathway to reduce the terminal acceptor Q(B) on the time scale of nanoseconds. The transmembrane charge-separated state P(+)Q(B)(-) is created in this manner in a Rhodobacter capsulatus RC containing the F(L181)Y-Y(M208)F-L(M212)H-W(M250)V mutations (YFHV). The W(M250)V mutation quantitatively blocks binding of Q(A), thereby eliminating Q(B) reduction via the normal A-branch pathway. Full occupancy of the Q(B) site by the native UQ(10) is ensured (without the necessity of reconstitution by exogenous quinone) by purification of RCs with the mild detergent, Deriphat 160-C. The lifetime of P(+)Q(B)(-) in the YFHV mutant RC is >6 s (at pH 8.0, 298 K). This charge-separated state is not formed upon addition of competitive inhibitors of Q(B) binding (terbutryn or stigmatellin). Furthermore, this lifetime is much longer than the value of approximately 1-1.5 s found when P(+)Q(B)(-) is produced in the wild-type RC by A-side activity alone. Collectively, these results demonstrate that P(+)Q(B)(-) is formed solely by activity of the B-branch carriers in the YFHV RC. In comparison, P(+)Q(B)(-) can form by either the A or B branches in the YFH RC, as indicated by the biexponential lifetimes of approximately 1 and approximately 6-10 s. These findings suggest that P(+)Q(B)(-) states formed via the two branches are distinct and that P(+)Q(B)(-) formed by the B side does not decay via the normal (indirect) pathway that utilizes the A-side cofactors when present. These differences may report on structural and energetic factors that further distinguish the functional asymmetry of the two cofactor branches.

  6. Stochastic Partial Differential Equation Solver for Hydroacoustic Modeling: Improvements to Paracousti Sound Propagation Solver

    NASA Astrophysics Data System (ADS)

    Preston, L. A.

    2017-12-01

    Marine hydrokinetic (MHK) devices offer a clean, renewable alternative energy source for the future. Responsible utilization of MHK devices, however, requires that the effects of acoustic noise produced by these devices on marine life and marine-related human activities be well understood. Paracousti is a 3-D full waveform acoustic modeling suite that can accurately propagate MHK noise signals in the complex bathymetry found in the near-shore to open ocean environment and considers real properties of the seabed, water column, and air-surface interface. However, this is a deterministic simulation that assumes the environment and source are exactly known. In reality, environmental and source characteristics are often only known in a statistical sense. Thus, to fully characterize the expected noise levels within the marine environment, this uncertainty in environmental and source factors should be incorporated into the acoustic simulations. One method is to use Monte Carlo (MC) techniques where simulation results from a large number of deterministic solutions are aggregated to provide statistical properties of the output signal. However, MC methods can be computationally prohibitive since they can require tens of thousands or more simulations to build up an accurate representation of those statistical properties. An alternative method, using the technique of stochastic partial differential equations (SPDE), allows computation of the statistical properties of output signals at a small fraction of the computational cost of MC. We are developing a SPDE solver for the 3-D acoustic wave propagation problem called Paracousti-UQ to help regulators and operators assess the statistical properties of environmental noise produced by MHK devices. In this presentation, we present the SPDE method and compare statistical distributions of simulated acoustic signals in simple models to MC simulations to show the accuracy and efficiency of the SPDE method. Sandia National Laboratories is a multimission laboratory managed and operated by National Technology and Engineering Solutions of Sandia LLC, a wholly owned subsidiary of Honeywell International Inc. for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-NA0003525.
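
    The record's motivation for an SPDE solver is the cost of plain Monte Carlo, whose standard error falls only like 1/sqrt(N). The toy example below illustrates that scaling with an invented spreading-loss formula standing in for a Paracousti run; it is not related to the actual solver.

      # Illustration of why plain Monte Carlo over environmental uncertainty is costly:
      # the standard error of the ensemble mean falls only like 1/sqrt(N). The "model"
      # here is a toy transmission-loss formula, not Paracousti.
      import numpy as np

      rng = np.random.default_rng(3)

      def received_level(source_level_db, range_m, alpha_db_per_km):
          """Toy transmission-loss model: spherical spreading plus absorption."""
          return source_level_db - 20.0 * np.log10(range_m) - alpha_db_per_km * range_m / 1000.0

      for n in (100, 1000, 10000, 100000):
          sl = rng.normal(170.0, 3.0, n)            # uncertain source level (dB)
          alpha = rng.uniform(0.04, 0.08, n)        # uncertain absorption (dB/km)
          levels = received_level(sl, 2000.0, alpha)
          print(f"N={n:6d}  mean={levels.mean():6.2f} dB  "
                f"std-error of mean={levels.std(ddof=1) / np.sqrt(n):.3f} dB")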

  7. A methodology for on‐board CBCT imaging dose using optically stimulated luminescence detectors

    PubMed Central

    Yusuf, Muhammad; Alothmany, Nazeeh; Kinsara, A. Abdulrahman; Abdulkhaliq, Fahad; Ghamdi, Suliman M.; Saoudi, Abdelhamid

    2016-01-01

    Cone‐beam computed tomography (CBCT) systems are used in radiation therapy for patient alignment and positioning. The CBCT imaging procedure for patient setup adds substantial radiation dose to the patient's normal tissue. This study presents a complete procedure for CBCT dosimetry using InLight optically‐stimulated‐luminescence (OSL) nanoDots. We report five dose parameters: the mean slice dose (DMSD); the cone beam dose index (CBDIW); the mean volume dose (DMVD); the point‐dose profile, D(FOV); and the off‐field dose. In addition, CBCT skin doses for seven pelvic tumor patients are reported. CBCT‐dose measurement was performed on a custom‐made cylindrical acrylic body phantom (50 cm length, 32 cm diameter). We machined 25 circular disks (2 cm thick) with grooves and holes to hold OSL‐nanoDots. OSLs that showed similar sensitivities were selected and calibrated against a Farmer‐type ionization‐chamber (0.6 CT) before being inserted into the grooves and holes. For the phantom scan, a standard CBCT‐imaging protocol (pelvic sites: 125 kVp, 80 mA and 25 ms) was used. Five dose parameters were quantified: DMSD, CBDIW, DMVD, D(FOV), and the off‐field dose. The DMSD for the central slice was 31.1±0.85 mGy, and CBDIW was 34.5±0.6 mGy at 16 cm FOV. The DMVD was 25.6±1.1 mGy. The off‐field dose was 10.5 mGy. For patients, the anterior and lateral skin doses attributable to CBCT imaging were 39.04±4.4 and 27.1±1.3 mGy, respectively. OSL nanoDots were convenient to use in measuring CBCT dose. The method of selecting the nanoDots greatly reduced uncertainty in the OSL measurements. Our detailed calibration procedure and CBCT dose measurements and calculations could prove useful in developing OSL routines for CBCT quality assessment; the high spatial resolution of the nanoDots also gives them the potential for measuring dose in regions of severe dose‐gradients. PACS number(s): 87.57.‐s, 87.57.Q, 87.57.uq PMID:27685143

  8. Molecular and eco-physiological characterization of arsenic (As)-transforming Achromobacter sp. KAs 3-5T from As-contaminated groundwater of West Bengal, India.

    PubMed

    Mohapatra, Balaram; Satyanarayana, Tulasi; Sar, Pinaki

    2018-05-02

    Molecular and eco-physiological characterization of the arsenic (As)-transforming and hydrocarbon-utilizing Achromobacter type strain KAs 3-5T has been investigated in order to gain an insight into As geomicrobiology in the contaminated groundwater. The bacterium was isolated from As-rich groundwater of West Bengal, India. Comparative 16S rRNA gene sequence phylogenetic analysis confirmed that the strain KAs 3-5T is closely related to Achromobacter mucicolens LMG 26685T (99.17%) and Achromobacter animicus LMG 26690T (99.17%), and is thus affiliated to the genus Achromobacter. Strain KAs 3-5T is a nonflagellated, mesophilic, facultative anaerobe with a broad metabolic repertoire, using various sugars, sugar/fatty acids, and hydrocarbons as principal carbon substrates, and O2, NO3-, NO2-, and Fe3+ as terminal electron acceptors. Growth with hydrocarbons led to cellular aggregation and adherence of the cells to the hydrocarbon particles, confirmed through electron microscopic observations. The strain KAs 3-5T showed high As resistance (MIC of 5 mM for As3+, 25 mM for As5+) and reductive transformation of As5+ under aerobic conditions while utilizing both sugars and hydrocarbons. Molecular taxonomy indicated a high genomic GC content (65.5 mol%), ubiquinone 8 (UQ-8) as the respiratory quinone, and spermidine as the predominant polyamine in the bacterium. The differential presence of C12:0, C14:0 2-OH, C18:1 ω7c, and C14:0 iso 3-OH/C16:1 iso fatty acids; phosphatidylglycerol (PG), phosphatidylcholine (PC), and two unknown phospholipids (PL1, PL2) as polar lipids; low DNA-DNA relatedness (33.0-41.0%) with the Achromobacter members; and unique metabolic capacities clearly indicated the distinct genomic and physiological properties of strain KAs 3-5T among known species of the genus Achromobacter. These findings improve our understanding of the metabolic flexibility of bacteria residing in As-contaminated groundwater and of As-bacteria interactions within the oligotrophic aquifer system.

  9. Large-Scale Uncertainty and Error Analysis for Time-dependent Fluid/Structure Interactions in Wind Turbine Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Alonso, Juan J.; Iaccarino, Gianluca

    2013-08-25

    The following is the final report covering the entire period of this aforementioned grant, June 1, 2011 - May 31, 2013 for the portion of the effort corresponding to Stanford University (SU). SU has partnered with Sandia National Laboratories (PI: Mike S. Eldred) and Purdue University (PI: Dongbin Xiu) to complete this research project and this final report includes those contributions made by the members of the team at Stanford. Dr. Eldred is continuing his contributions to this project under a no-cost extension and his contributions to the overall effort will be detailed at a later time (once his effort has concluded) on a separate project submitted by Sandia National Laboratories. At Stanford, the team is made up of Profs. Alonso, Iaccarino, and Duraisamy, post-doctoral researcher Vinod Lakshminarayan, and graduate student Santiago Padron. At Sandia National Laboratories, the team includes Michael Eldred, Matt Barone, John Jakeman, and Stefan Domino, and at Purdue University, we have Prof. Dongbin Xiu as our main collaborator. The overall objective of this project was to develop a novel, comprehensive methodology for uncertainty quantification by combining stochastic expansions (nonintrusive polynomial chaos and stochastic collocation), the adjoint approach, and fusion with experimental data to account for aleatory and epistemic uncertainties from random variable, random field, and model form sources. The expected outcomes of this activity were detailed in the proposal and are repeated here to set the stage for the results that we have generated during the time period of execution of this project: 1. The rigorous determination of an error budget comprising numerical errors in physical space and statistical errors in stochastic space and its use for optimal allocation of resources; 2. A considerable increase in efficiency when performing uncertainty quantification with a large number of uncertain variables in complex non-linear multi-physics problems; 3. A solution to the long-time integration problem of spectral chaos approaches; 4. A rigorous methodology to account for aleatory and epistemic uncertainties, to emphasize the most important variables via dimension reduction and dimension-adaptive refinement, and to support fusion with experimental data using Bayesian inference; 5. The application of novel methodologies to time-dependent reliability studies in wind turbine applications including a number of efforts relating to the uncertainty quantification in vertical-axis wind turbine applications. In this report, we summarize all accomplishments in the project (during the time period specified) focusing on advances in UQ algorithms and deployment efforts to the wind turbine application area. Detailed publications in each of these areas have also been completed and are available from the respective conference proceedings and journals as detailed in a later section.
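
    One ingredient named above, nonintrusive polynomial chaos, can be illustrated in one dimension: project a toy quantity of interest with a standard-normal input onto probabilists' Hermite polynomials by Gauss-Hermite quadrature and read the mean and variance off the coefficients. The model function and truncation order below are arbitrary choices for illustration, not part of the project.

      # Minimal one-dimensional non-intrusive polynomial chaos sketch. Illustrative
      # only; not the project's UQ machinery.
      import numpy as np
      from numpy.polynomial import hermite_e as He
      from math import factorial, sqrt, pi

      def qoi(x):
          """Toy model output as a function of a standard-normal random input."""
          return np.exp(0.3 * x) + 0.5 * x ** 2

      order = 6
      nodes, weights = He.hermegauss(30)          # weight exp(-x^2/2), sum(weights) = sqrt(2*pi)

      coeffs = []
      for n in range(order + 1):
          basis = He.hermeval(nodes, [0] * n + [1])            # He_n evaluated at the nodes
          c_n = np.sum(weights * qoi(nodes) * basis) / (sqrt(2 * pi) * factorial(n))
          coeffs.append(c_n)

      mean = coeffs[0]
      variance = sum(factorial(n) * coeffs[n] ** 2 for n in range(1, order + 1))
      print(f"PCE mean ~= {mean:.4f}, PCE variance ~= {variance:.4f}")

      # Cross-check against brute-force Monte Carlo.
      x = np.random.default_rng(0).normal(size=200000)
      print(f"MC  mean ~= {qoi(x).mean():.4f}, MC  variance ~= {qoi(x).var():.4f}")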

  10. Development of an Uncertainty Quantification Predictive Chemical Reaction Model for Syngas Combustion

    DOE PAGES

    Slavinskaya, N. A.; Abbasi, M.; Starcke, J. H.; ...

    2017-01-24

    An automated data-centric infrastructure, Process Informatics Model (PrIMe), was applied to validation and optimization of a syngas combustion model. The Bound-to-Bound Data Collaboration (B2BDC) module of PrIMe was employed to discover the limits of parameter modifications based on uncertainty quantification (UQ) and consistency analysis of the model–data system and experimental data, including shock-tube ignition delay times and laminar flame speeds. Existing syngas reaction models are reviewed, and the selected kinetic data are described in detail. Empirical rules were developed and applied to evaluate the uncertainty bounds of the literature experimental data. Here, the initial H2/CO reaction model, assembled from 73 reactions and 17 species, was subjected to a B2BDC analysis. For this purpose, a dataset was constructed that included a total of 167 experimental targets and 55 active model parameters. Consistency analysis of the composed dataset revealed disagreement between models and data. Further analysis suggested that removing 45 experimental targets, 8 of which were self-inconsistent, would lead to a consistent dataset. This dataset was subjected to a correlation analysis, which highlights possible directions for parameter modification and model improvement. Additionally, several methods of parameter optimization were applied, some of them unique to the B2BDC framework. The optimized models demonstrated improved agreement with experiments compared to the initially assembled model, and their predictions for experiments not included in the initial dataset (i.e., a blind prediction) were investigated. The results demonstrate benefits of applying the B2BDC methodology for developing predictive kinetic models.

  11. Development of an Uncertainty Quantification Predictive Chemical Reaction Model for Syngas Combustion

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Slavinskaya, N. A.; Abbasi, M.; Starcke, J. H.

    An automated data-centric infrastructure, Process Informatics Model (PrIMe), was applied to validation and optimization of a syngas combustion model. The Bound-to-Bound Data Collaboration (B2BDC) module of PrIMe was employed to discover the limits of parameter modifications based on uncertainty quantification (UQ) and consistency analysis of the model–data system and experimental data, including shock-tube ignition delay times and laminar flame speeds. Existing syngas reaction models are reviewed, and the selected kinetic data are described in detail. Empirical rules were developed and applied to evaluate the uncertainty bounds of the literature experimental data. Here, the initial H2/CO reaction model, assembled from 73 reactions and 17 species, was subjected to a B2BDC analysis. For this purpose, a dataset was constructed that included a total of 167 experimental targets and 55 active model parameters. Consistency analysis of the composed dataset revealed disagreement between models and data. Further analysis suggested that removing 45 experimental targets, 8 of which were self-inconsistent, would lead to a consistent dataset. This dataset was subjected to a correlation analysis, which highlights possible directions for parameter modification and model improvement. Additionally, several methods of parameter optimization were applied, some of them unique to the B2BDC framework. The optimized models demonstrated improved agreement with experiments compared to the initially assembled model, and their predictions for experiments not included in the initial dataset (i.e., a blind prediction) were investigated. The results demonstrate benefits of applying the B2BDC methodology for developing predictive kinetic models.

  12. Predicting disulfide connectivity from protein sequence using multiple sequence feature vectors and secondary structure.

    PubMed

    Song, Jiangning; Yuan, Zheng; Tan, Hao; Huber, Thomas; Burrage, Kevin

    2007-12-01

    Disulfide bonds are primary covalent crosslinks between two cysteine residues in proteins that play critical roles in stabilizing the protein structures and are commonly found in extracytoplasmatic or secreted proteins. In protein folding prediction, the localization of disulfide bonds can greatly reduce the search in conformational space. Therefore, there is a great need to develop computational methods capable of accurately predicting disulfide connectivity patterns in proteins that could have potentially important applications. We have developed a novel method to predict disulfide connectivity patterns from protein primary sequence, using a support vector regression (SVR) approach based on multiple sequence feature vectors and secondary structure predicted by the PSIPRED program. The results indicate that our method could achieve a prediction accuracy of 74.4% and 77.9%, respectively, when averaged over proteins with two to five disulfide bridges using 4-fold cross-validation, measured at the protein and cysteine-pair levels on a well-defined non-homologous dataset. We assessed the effects of different sequence encoding schemes on the prediction performance of disulfide connectivity. It has been shown that the sequence encoding scheme based on multiple sequence feature vectors coupled with predicted secondary structure can significantly improve the prediction accuracy, thus enabling our method to outperform most other currently available predictors. Our work provides a complementary approach to the current algorithms that should be useful in computationally assigning disulfide connectivity patterns and helps in the annotation of protein sequences generated by large-scale whole-genome projects. The prediction web server and Supplementary Material are accessible at http://foo.maths.uq.edu.au/~huber/disulfide
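
    As a rough illustration of the SVR-plus-pairing idea (not the paper's feature encoding or training data), the sketch below trains a support vector regressor on random stand-in feature vectors for cysteine pairs and then greedily assembles a connectivity pattern from the predicted pair scores.

      # Rough sketch of the support vector regression (SVR) ingredient: score candidate
      # cysteine pairs from feature vectors, then pick the highest-scoring pairing.
      # Features and labels here are random stand-ins.
      import numpy as np
      from itertools import combinations
      from sklearn.svm import SVR

      rng = np.random.default_rng(4)
      X_train = rng.normal(size=(500, 40))                 # stand-in feature vectors for cysteine pairs
      y_train = X_train[:, :5].sum(axis=1) + 0.1 * rng.normal(size=500)  # pretend bonding score
      model = SVR(kernel="rbf", C=1.0).fit(X_train, y_train)

      # Score all pairs among 4 cysteines of a new protein and greedily select bridges.
      cys = ["C12", "C45", "C77", "C103"]
      pair_feats = {p: rng.normal(size=40) for p in combinations(cys, 2)}
      scores = {p: float(model.predict(f.reshape(1, -1))[0]) for p, f in pair_feats.items()}
      bridges, used = [], set()
      for pair, _ in sorted(scores.items(), key=lambda kv: -kv[1]):
          if used.isdisjoint(pair):
              bridges.append(pair)
              used.update(pair)
      print("predicted connectivity:", bridges)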

  13. Electrocardiogram‐gated coronary CT angiography dose estimates using ImPACT

    PubMed Central

    Asada, Yasuki; Matsubara, Kosuke; Suzuki, Shouichi; Koshida, Kichiro; Matsunaga, Yuta; Haba, Tomonobu; Kawaguchi, Ai; Toyama, Hiroshi; Kato, Ryoichi

    2016-01-01

    The primary study objective was to assess radiation doses using a modified form of the Imaging Performance Assessment of Computed Tomography (CT) scanner (ImPACT) patient dosimetry for cardiac applications on an Aquilion ONE ViSION Edition scanner, including the Ca score, target computed tomography angiography (CTA), prospective CTA, continuous CTA/cardiac function analysis (CFA), and CTA/CFA modulation. Accordingly, we clarified the CT dose index (CTDI) to determine the relationship between heart rate (HR) and X‐ray exposure. As a secondary objective, we compared radiation doses using modified ImPACT, a whole‐body dosimetry phantom study, and the k‐factor method to verify the validity of the dose results obtained with modified ImPACT. The effective doses determined for the reference person (4.66 mSv at 60 beats per minute (bpm) and 33.43 mSv at 90 bpm) were approximately 10% less than those determined in the phantom study (5.28 mSv and 36.68 mSv). The effective doses according to the k‐factor (0.014 mSv·mGy−1·cm−1; 2.57 mSv and 17.10 mSv) were significantly lower than those obtained with the other two methods. In the present study, we have shown that ImPACT, when modified for cardiac applications, can assess both absorbed and effective doses. The results of our dose comparison indicate that modified ImPACT dose assessment is a promising and practical method for evaluating coronary CTA. PACS number(s): 87.57.Q‐, 87.59.Dj, 87.57.uq PMID:27455500
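
    The k-factor method referenced in this record is simply E = k x DLP. The dose-length products below are hypothetical placeholders chosen so the arithmetic roughly reproduces the quoted effective doses; the record itself does not report DLP values.

      # The k-factor method estimates effective dose as E = k * DLP.
      # The DLP values below are hypothetical placeholders used to show the arithmetic.
      k = 0.014                              # mSv per (mGy * cm), conversion factor from the record
      for dlp_mgy_cm in (184.0, 1221.0):     # hypothetical dose-length products
          print(f"DLP = {dlp_mgy_cm:7.1f} mGy*cm  ->  E = {k * dlp_mgy_cm:5.2f} mSv")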

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tang, Kunkun, E-mail: ktg@illinois.edu; Inria Bordeaux – Sud-Ouest, Team Cardamom, 200 avenue de la Vieille Tour, 33405 Talence; Congedo, Pietro M.

    The Polynomial Dimensional Decomposition (PDD) is employed in this work for the global sensitivity analysis and uncertainty quantification (UQ) of stochastic systems subject to a moderate to large number of input random variables. Due to the intimate connection between the PDD and the Analysis of Variance (ANOVA) approaches, PDD is able to provide a simpler and more direct evaluation of the Sobol' sensitivity indices, when compared to the Polynomial Chaos expansion (PC). Unfortunately, the number of PDD terms grows exponentially with respect to the size of the input random vector, which makes the computational cost of standard methods unaffordable for real engineering applications. In order to address the problem of the curse of dimensionality, this work proposes essentially variance-based adaptive strategies aiming to build a cheap meta-model (i.e. surrogate model) by employing the sparse PDD approach with its coefficients computed by regression. Three levels of adaptivity are carried out in this paper: 1) the truncated dimensionality for ANOVA component functions, 2) the active dimension technique especially for second- and higher-order parameter interactions, and 3) the stepwise regression approach designed to retain only the most influential polynomials in the PDD expansion. During this adaptive procedure featuring stepwise regressions, the surrogate model representation keeps containing few terms, so that the cost to resolve repeatedly the linear systems of the least-squares regression problem is negligible. The size of the finally obtained sparse PDD representation is much smaller than the one of the full expansion, since only significant terms are eventually retained. Consequently, a much smaller number of calls to the deterministic model is required to compute the final PDD coefficients.
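
    The stepwise-regression level of adaptivity described above can be illustrated generically: from a dictionary of candidate polynomial terms, add one term at a time, keeping only those that appreciably reduce the least-squares residual. The sketch below is plain forward selection on a toy five-variable model, not the paper's sparse PDD algorithm.

      # Greedy forward selection of polynomial terms for a sparse surrogate.
      # Generic illustration only; not the sparse PDD implementation.
      import numpy as np
      from itertools import combinations_with_replacement

      rng = np.random.default_rng(5)
      d, n = 5, 400
      X = rng.uniform(-1.0, 1.0, size=(n, d))
      y = 2.0 * X[:, 0] + X[:, 1] * X[:, 2] + 0.5 * X[:, 3] ** 2 + 0.05 * rng.normal(size=n)

      # Candidate dictionary: all monomials up to total degree 2 (plus the constant).
      terms = [()] + [c for deg in (1, 2) for c in combinations_with_replacement(range(d), deg)]
      columns = np.column_stack([np.prod(X[:, list(t)], axis=1) if t else np.ones(n) for t in terms])

      selected, resid = [], y.copy()
      for _ in range(6):                                  # keep at most 6 terms
          gains = []
          for j in range(columns.shape[1]):
              if j in selected:
                  gains.append(-np.inf)
                  continue
              A = columns[:, selected + [j]]
              r = y - A @ np.linalg.lstsq(A, y, rcond=None)[0]
              gains.append(resid @ resid - r @ r)         # drop in sum of squared residuals
          best = int(np.argmax(gains))
          if gains[best] < 1e-6:                          # stop when no term helps
              break
          selected.append(best)
          A = columns[:, selected]
          resid = y - A @ np.linalg.lstsq(A, y, rcond=None)[0]

      print("selected monomials (variable index tuples):", [terms[j] for j in selected])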

  15. Annual Report: Carbon Capture Simulation Initiative (CCSI) (30 September 2012)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miller, David C.; Syamlal, Madhava; Cottrell, Roger

    2012-09-30

    The Carbon Capture Simulation Initiative (CCSI) is a partnership among national laboratories, industry and academic institutions that is developing and deploying state-of-the-art computational modeling and simulation tools to accelerate the commercialization of carbon capture technologies from discovery to development, demonstration, and ultimately the widespread deployment to hundreds of power plants. The CCSI Toolset will provide end users in industry with a comprehensive, integrated suite of scientifically validated models, with uncertainty quantification (UQ), optimization, risk analysis and decision making capabilities. The CCSI Toolset incorporates commercial and open-source software currently in use by industry and is also developing new software tools as necessary to fill technology gaps identified during execution of the project. Ultimately, the CCSI Toolset will (1) enable promising concepts to be more quickly identified through rapid computational screening of devices and processes; (2) reduce the time to design and troubleshoot new devices and processes; (3) quantify the technical risk in taking technology from laboratory-scale to commercial-scale; and (4) stabilize deployment costs more quickly by replacing some of the physical operational tests with virtual power plant simulations. CCSI is organized into 8 technical elements that fall under two focus areas. The first focus area (Physicochemical Models and Data) addresses the steps necessary to model and simulate the various technologies and processes needed to bring a new Carbon Capture and Storage (CCS) technology into production. The second focus area (Analysis & Software) is developing the software infrastructure to integrate the various components and implement the tools that are needed to make quantifiable decisions regarding the viability of new CCS technologies. CCSI also has an Industry Advisory Board (IAB). By working closely with industry from the inception of the project to identify industrial challenge problems, CCSI ensures that the simulation tools are developed for the carbon capture technologies of most relevance to industry. CCSI is led by the National Energy Technology Laboratory (NETL) and leverages the Department of Energy (DOE) national laboratories' core strengths in modeling and simulation, bringing together the best capabilities at NETL, Los Alamos National Laboratory (LANL), Lawrence Berkeley National Laboratory (LBNL), Lawrence Livermore National Laboratory (LLNL), and Pacific Northwest National Laboratory (PNNL). The CCSI's industrial partners provide representation from the power generation industry, equipment manufacturers, technology providers and engineering and construction firms. The CCSI's academic participants (Carnegie Mellon University, Princeton University, West Virginia University, and Boston University) bring unparalleled expertise in multiphase flow reactors, combustion, process synthesis and optimization, planning and scheduling, and process control techniques for energy processes. During Fiscal Year (FY) 12, CCSI released its first set of computational tools and models. This pre-release, a year ahead of the originally planned first release, is the result of intense industry interest in getting early access to the tools and the phenomenal progress of the CCSI technical team.
These initial components of the CCSI Toolset provide new models and computational capabilities that will accelerate the commercial development of carbon capture technologies as well as related technologies, such as those found in the power, refining, chemicals, and gas production industries. The release consists of new tools for process synthesis and optimization to help identify promising concepts more quickly, new physics-based models of potential capture equipment and processes that will reduce the time to design and troubleshoot new systems, a framework to quantify the uncertainty of model predictions, and various enabling tools that provide new capabilities such as creating reduced order models (ROMs) from reacting multiphase flow simulations and running thousands of process simulations concurrently for optimization and UQ.

  16. Modeling transport phenomena and uncertainty quantification in solidification processes

    NASA Astrophysics Data System (ADS)

    Fezi, Kyle S.

    Direct chill (DC) casting is the primary processing route for wrought aluminum alloys. This semicontinuous process consists of primary cooling as the metal is pulled through a water cooled mold followed by secondary cooling with a water jet spray and free falling water. To gain insight into this complex solidification process, a fully transient model of DC casting was developed to predict the transport phenomena of aluminum alloys for various conditions. This model is capable of solving mixture mass, momentum, energy, and species conservation equations during multicomponent solidification. Various DC casting process parameters were examined for their effect on transport phenomena predictions in an alloy of commercial interest (aluminum alloy 7050). The practice of placing a wiper to divert cooling water from the ingot surface was studied and the results showed that placement closer to the mold causes remelting at the surface and increases susceptibility to bleed outs. Numerical models of metal alloy solidification, like the one previously mentioned, are used to gain insight into physical phenomena that cannot be observed experimentally. However, uncertainty in model inputs causes uncertainty in results and in those insights. The effect of model assumptions and probable input variability on the level of uncertainty in model predictions has not yet been quantified in solidification modeling. As a step towards understanding the effect of uncertain inputs on solidification modeling, uncertainty quantification (UQ) and sensitivity analysis were first performed on a transient solidification model of a simple binary alloy (Al-4.5wt.%Cu) in a rectangular cavity with both columnar and equiaxed solid growth models. This analysis was followed by quantifying the uncertainty in predictions from the recently developed transient DC casting model. The PRISM Uncertainty Quantification (PUQ) framework quantified the uncertainty and sensitivity in macrosegregation, solidification time, and sump profile predictions. Uncertain model inputs of interest included the secondary dendrite arm spacing, equiaxed particle size, equiaxed packing fraction, heat transfer coefficient, and material properties. The most influential input parameters for predicting the macrosegregation level were the dendrite arm spacing, which also strongly depended on the choice of mushy zone permeability model, and the equiaxed packing fraction. Additionally, the degree of uncertainty required to produce accurate predictions depended on the output of interest from the model.

  17. Efficient Screening of Climate Model Sensitivity to a Large Number of Perturbed Input Parameters [plus supporting information]

    DOE PAGES

    Covey, Curt; Lucas, Donald D.; Tannahill, John; ...

    2013-07-01

    Modern climate models contain numerous input parameters, each with a range of possible values. Since the volume of parameter space increases exponentially with the number of parameters N, it is generally impossible to directly evaluate a model throughout this space even if just 2-3 values are chosen for each parameter. Sensitivity screening algorithms, however, can identify input parameters having relatively little effect on a variety of output fields, either individually or in nonlinear combination. This can aid both model development and the uncertainty quantification (UQ) process. Here we report results from a parameter sensitivity screening algorithm hitherto untested in climate modeling, the Morris one-at-a-time (MOAT) method. This algorithm drastically reduces the computational cost of estimating sensitivities in a high dimensional parameter space because the sample size grows linearly rather than exponentially with N. It nevertheless samples over much of the N-dimensional volume and allows assessment of parameter interactions, unlike traditional elementary one-at-a-time (EOAT) parameter variation. We applied both EOAT and MOAT to the Community Atmosphere Model (CAM), assessing CAM’s behavior as a function of 27 uncertain input parameters related to the boundary layer, clouds, and other subgrid scale processes. For radiation balance at the top of the atmosphere, EOAT and MOAT rank most input parameters similarly, but MOAT identifies a sensitivity that EOAT underplays for two convection parameters that operate nonlinearly in the model. MOAT’s ranking of input parameters is robust to modest algorithmic variations, and it is qualitatively consistent with model development experience. Supporting information is also provided at the end of the full text of the article.
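
    The sketch below illustrates the Morris one-at-a-time idea referenced above on a toy function: random trajectories perturb one parameter per step, elementary effects are collected, and parameters are ranked by the mean absolute effect while the standard deviation flags nonlinearity or interactions. It is not the CAM experiment; the function, step size, and trajectory count are illustrative.

```python
# Minimal sketch of Morris one-at-a-time (MOAT) screening on a toy function.
# r trajectories of N+1 runs each => cost grows linearly with the number of
# parameters N. Not the CAM experiment; parameter names are illustrative.
import numpy as np

rng = np.random.default_rng(1)
N, r, delta = 5, 20, 0.5          # parameters, trajectories, step size in [0,1]

def model(x):                      # toy model with a nonlinear interaction
    return x[0] + 2 * x[1] + 10 * x[2] * x[3] + 0.1 * x[4]

effects = [[] for _ in range(N)]
for _ in range(r):
    x = rng.uniform(0.0, 1.0 - delta, size=N)   # trajectory start point
    y = model(x)
    for i in rng.permutation(N):                # perturb one parameter at a time
        x_new = x.copy()
        x_new[i] += delta
        y_new = model(x_new)
        effects[i].append((y_new - y) / delta)  # elementary effect for parameter i
        x, y = x_new, y_new

# Rank parameters by the mean absolute elementary effect (mu*); sigma flags
# nonlinearity / interactions, as in the MOAT discussion above.
mu_star = [np.mean(np.abs(e)) for e in effects]
sigma = [np.std(e) for e in effects]
for i in np.argsort(mu_star)[::-1]:
    print(f"param {i}: mu* = {mu_star[i]:.2f}, sigma = {sigma[i]:.2f}")
```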

  18. Level-2 Milestone 3244: Deploy Dawn ID Machine for Initial Science Runs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fox, D

    2009-09-21

    This report documents the delivery, installation, integration, testing, and acceptance of the Dawn system, ASC L2 milestone 3244: Deploy Dawn ID Machine for Initial Science Runs, due September 30, 2009. The full text of the milestone is included in Attachment 1. The description of the milestone is: This milestone will be a result of work started three years ago with the planning for a multi-petaFLOPS UQ-focused platform (Sequoia) and will be satisfied when a smaller ID version of the final system is delivered, installed, integrated, tested, accepted, and deployed at LLNL for initial science runs in support of SSP mission. The deliverable for this milestone will be a LA petascale computing system (named Dawn) usable for code development and scaling necessary to ensure effective use of a final Sequoia platform (expected in 2011-2012), and for urgent SSP program needs. Allocation and scheduling of Dawn as an LA system will likely be performed informally, similar to what has been used for BlueGene/L. However, provision will be made to allow for dedicated access times for application scaling studies across the entire Dawn resource. The milestone was completed on April 1, 2009, when science runs began on the Dawn system. The following sections describe the Dawn system architecture, current status, installation and integration time line, and testing and acceptance process. A project plan is included as Attachment 2. Attachment 3 is a letter certifying the handoff of the system to a nuclear weapons stockpile customer. Attachment 4 presents the results of science runs completed on the system.

  19. ASC-AD penetration modeling FY05 status report.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kistler, Bruce L.; Ostien, Jakob T.; Chiesa, Michael L.

    2006-04-01

    Sandia currently lacks a high fidelity method for predicting loads on and subsequent structural response of earth penetrating weapons. This project seeks to test, debug, improve and validate methodologies for modeling earth penetration. Results of this project will allow us to optimize and certify designs for the B61-11, Robust Nuclear Earth Penetrator (RNEP), PEN-X and future nuclear and conventional penetrator systems. Since this is an ASC Advanced Deployment project, the primary goal of the work is to test, debug, verify and validate new Sierra (and Nevada) tools. Also, since this project is part of the V&V program within ASC, uncertainty quantification (UQ), optimization using DAKOTA [1] and sensitivity analysis are an integral part of the work. This project evaluates, verifies and validates new constitutive models, penetration methodologies and Sierra/Nevada codes. In FY05 the project focused mostly on PRESTO [2] using the Spherical Cavity Expansion (SCE) [3,4] and PRESTO Lagrangian analysis with a preformed hole (Pen-X) methodologies. Modeling penetration tests using PRESTO with a pilot hole was also attempted to evaluate constitutive models. Future years' work would include the Alegra/SHISM [5] and Alegra/EP (Earth Penetration) methodologies when they are ready for validation testing. Constitutive models such as Soil-and-Foam, the Sandia Geomodel [6], and the K&C Concrete model [7] were also tested and evaluated. This report is submitted to satisfy annual documentation requirements for the ASC Advanced Deployment program. This report summarizes FY05 work performed in the Penetration Mechanical Response (ASC-APPS) and Penetration Mechanics (ASC-V&V) projects. A single report is written to document the two projects because of the significant amount of technical overlap.

  20. ADMultiImg: a novel missing modality transfer learning based CAD system for diagnosis of MCI due to AD using incomplete multi-modality imaging data

    NASA Astrophysics Data System (ADS)

    Liu, Xiaonan; Chen, Kewei; Wu, Teresa; Weidman, David; Lure, Fleming; Li, Jing

    2018-02-01

    Alzheimer's Disease (AD) is the most common cause of dementia and currently has no cure. Treatments targeting early stages of AD such as Mild Cognitive Impairment (MCI) may be most effective to decelerate AD, thus attracting increasing attention. However, MCI has substantial heterogeneity in that it can be caused by various underlying conditions, not only AD. To detect MCI due to AD, NIA-AA published updated consensus criteria in 2011, in which the use of multi-modality images was highlighted as one of the most promising methods. It is of great interest to develop a CAD system based on automatic, quantitative analysis of multi-modality images and machine learning algorithms to help physicians more adequately diagnose MCI due to AD. The challenge, however, is that multi-modality images are not universally available for many patients due to cost, access, safety, and lack of consent. We developed a novel Missing Modality Transfer Learning (MMTL) algorithm capable of utilizing whatever imaging modalities are available for an MCI patient to diagnose the patient's likelihood of MCI due to AD. Furthermore, we integrated MMTL with radiomics steps including image processing, feature extraction, and feature screening, and a post-processing step for uncertainty quantification (UQ), and developed a CAD system called "ADMultiImg" to assist clinical diagnosis of MCI due to AD using multi-modality images together with patient demographic and genetic information. Tested on ADNI data, our system can generate a diagnosis with high accuracy even for patients with only partially available image modalities (AUC=0.94), and therefore may have broad clinical utility.

  1. Workflows in bioinformatics: meta-analysis and prototype implementation of a workflow generator.

    PubMed

    Garcia Castro, Alexander; Thoraval, Samuel; Garcia, Leyla J; Ragan, Mark A

    2005-04-07

    Computational methods for problem solving need to interleave information access and algorithm execution in a problem-specific workflow. The structures of these workflows are defined by a scaffold of syntactic, semantic and algebraic objects capable of representing them. Despite the proliferation of GUIs (Graphic User Interfaces) in bioinformatics, only some of them provide workflow capabilities; surprisingly, no meta-analysis of workflow operators and components in bioinformatics has been reported. We present a set of syntactic components and algebraic operators capable of representing analytical workflows in bioinformatics. Iteration, recursion, the use of conditional statements, and management of suspend/resume tasks have traditionally been implemented on an ad hoc basis and hard-coded; by having these operators properly defined it is possible to use and parameterize them as generic re-usable components. To illustrate how these operations can be orchestrated, we present GPIPE, a prototype graphic pipeline generator for PISE that allows the definition of a pipeline, parameterization of its component methods, and storage of metadata in XML formats. This implementation goes beyond the macro capacities currently in PISE. As the entire analysis protocol is defined in XML, a complete bioinformatic experiment (linked sets of methods, parameters and results) can be reproduced or shared among users. http://if-web1.imb.uq.edu.au/Pise/5.a/gpipe.html (interactive), ftp://ftp.pasteur.fr/pub/GenSoft/unix/misc/Pise/ (download). From our meta-analysis we have identified syntactic structures and algebraic operators common to many workflows in bioinformatics. The workflow components and algebraic operators can be assimilated into re-usable software components. GPIPE, a prototype implementation of this framework, provides a GUI builder to facilitate the generation of workflows and integration of heterogeneous analytical tools.

  2. Explicit blow-up solutions to the Schroedinger maps from R{sup 2} to the hyperbolic 2-space H{sup 2}

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ding Qing

    2009-10-15

    In this article, we prove that the equation of the Schroedinger maps from $R^2$ to the hyperbolic 2-space $H^2$ is SU(1,1)-gauge equivalent to the following 1+2 dimensional nonlinear Schroedinger-type system of three unknown complex functions p, q, r, and a real function u: $iq_t + q_{zz} - 2uq + 2(pq)_z - 2pq_z - 4|p|^2 q = 0$, $ir_t - r_{zz} + 2ur + 2(pr)_z - 2pr_z + 4|p|^2 r = 0$, $ip_t + (qr)_z - u_z = 0$, $p_z + p_z = -|q|^2 + |r|^2$, $-r_z + q_z = -2(pr + pq)$, where $z$ is a complex coordinate of the plane $R^2$ and $\bar z$ is the complex conjugate of $z$. Although this nonlinear Schroedinger-type system looks complicated, it admits a class of explicit blow-up smooth solutions: $p = 0$, $q = (e^{i(bzz/2(a+bt))}/(a+bt))\alpha z$, $r = (e^{-i(bzz/2(a+bt))}/(a+bt))\alpha z$, $u = 2\alpha^2 zz/(a+bt)^2$, where a and b are real numbers with $ab < 0$ and $\alpha$ satisfies $\alpha^2 = b^2/16$. From these facts, we explicitly construct smooth solutions to the Schroedinger maps from $R^2$ to the hyperbolic 2-space $H^2$ by using the gauge transformations such that the absolute values of their gradients blow up in finite time. This reveals some blow-up phenomenon of Schroedinger maps.

  3. A mixture model-based approach to the clustering of microarray expression data.

    PubMed

    McLachlan, G J; Bean, R W; Peel, D

    2002-03-01

    This paper introduces the software EMMIX-GENE that has been developed for the specific purpose of a model-based approach to the clustering of microarray expression data, in particular, of tissue samples on a very large number of genes. The latter is a nonstandard problem in parametric cluster analysis because the dimension of the feature space (the number of genes) is typically much greater than the number of tissues. A feasible approach is provided by first selecting a subset of the genes relevant for the clustering of the tissue samples by fitting mixtures of t distributions to rank the genes in order of increasing size of the likelihood ratio statistic for the test of one versus two components in the mixture model. The imposition of a threshold on the likelihood ratio statistic used in conjunction with a threshold on the size of a cluster allows the selection of a relevant set of genes. However, even this reduced set of genes will usually be too large for a normal mixture model to be fitted directly to the tissues, and so the use of mixtures of factor analyzers is exploited to reduce effectively the dimension of the feature space of genes. The usefulness of the EMMIX-GENE approach for the clustering of tissue samples is demonstrated on two well-known data sets on colon and leukaemia tissues. For both data sets, relevant subsets of the genes are able to be selected that reveal interesting clusterings of the tissues that are either consistent with the external classification of the tissues or with background and biological knowledge of these sets. EMMIX-GENE is available at http://www.maths.uq.edu.au/~gjm/emmix-gene/
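
    A rough sketch of the gene-screening step described above follows: for each gene, one- and two-component mixtures are fitted across tissues, and genes are ranked by the likelihood-ratio statistic. EMMIX-GENE fits mixtures of t distributions; scikit-learn's GaussianMixture is used here only as a convenient stand-in, and the synthetic data and threshold are illustrative.

```python
# Rough sketch of the gene-screening step described above: for each gene, fit
# one- and two-component mixtures across tissues and rank genes by the
# likelihood-ratio statistic. EMMIX-GENE uses mixtures of t distributions;
# scikit-learn's GaussianMixture is used here only as a convenient stand-in.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(2)
n_tissues, n_genes = 60, 50

# Synthetic expression matrix: the first 5 genes separate two tissue groups
X = rng.normal(size=(n_tissues, n_genes))
X[:30, :5] += 2.5

lr_stats = []
for g in range(n_genes):
    col = X[:, [g]]
    gm1 = GaussianMixture(n_components=1).fit(col)
    gm2 = GaussianMixture(n_components=2, n_init=3, random_state=0).fit(col)
    # -2 log(lambda): twice the gain in total log-likelihood from the 2nd component
    lr = 2 * (gm2.score(col) - gm1.score(col)) * n_tissues
    lr_stats.append(lr)

threshold = 8.0                     # illustrative cut-off, not the paper's value
selected = [g for g, lr in enumerate(lr_stats) if lr > threshold]
print("genes retained for clustering:", selected)
```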

  4. Social Responsibility and the State's Duty to provide Healthcare: An Islamic Ethico-Legal Perspective.

    PubMed

    Padela, Aasim I

    2017-12-01

    The United Nations Educational, Scientific and Cultural Organization's (UNESCO) Declaration on Bioethics and Human Rights asserts that governments are morally obliged to promote health and to provide access to quality healthcare, essential medicines and adequate nutrition and water to all members of society. According to UNESCO, this obligation is grounded in a moral commitment to promoting fundamental human rights and emerges from the principle of social responsibility. Yet in an era of ethical pluralism and contentions over the universality of human rights conventions, the extent to which the UNESCO Declaration can motivate behaviors and policies rests, at least in part, upon accepting the moral arguments it makes. In this essay I reflect on a state's moral obligation to provide healthcare from the perspective of Islamic moral theology and law. I examine how Islamic ethico-legal conceptual analogues for human rights and communal responsibility, ḥuqūq al-'ibād and farḍ al-kifāyah, and other related constructs might be used to advance a moral argument for healthcare provision by the state. Moving from theory to application, I next illustrate how notions of human rights and social responsibility were used by Muslim stakeholders to buttress moral arguments to support American healthcare reform. In this way, the paper advances discourses on a universal bioethics and common morality by bringing into view the concordances and discordances between Islamic ethico-legal constructs and moral arguments advanced by transnational health policy advocates. It also provides insight into applied Islamic bioethics by demonstrating how Islamic ethico-legal values might inform the discursive outputs of Muslim organizations. © 2016 John Wiley & Sons Ltd.

  5. Adaptive surrogate modeling by ANOVA and sparse polynomial dimensional decomposition for global sensitivity analysis in fluid simulation

    NASA Astrophysics Data System (ADS)

    Tang, Kunkun; Congedo, Pietro M.; Abgrall, Rémi

    2016-06-01

    The Polynomial Dimensional Decomposition (PDD) is employed in this work for the global sensitivity analysis and uncertainty quantification (UQ) of stochastic systems subject to a moderate to large number of input random variables. Due to the intimate connection between the PDD and the Analysis of Variance (ANOVA) approaches, PDD is able to provide a simpler and more direct evaluation of the Sobol' sensitivity indices, when compared to the Polynomial Chaos expansion (PC). Unfortunately, the number of PDD terms grows exponentially with respect to the size of the input random vector, which makes the computational cost of standard methods unaffordable for real engineering applications. In order to address the problem of the curse of dimensionality, this work proposes essentially variance-based adaptive strategies aiming to build a cheap meta-model (i.e. surrogate model) by employing the sparse PDD approach with its coefficients computed by regression. Three levels of adaptivity are carried out in this paper: 1) the truncated dimensionality for ANOVA component functions, 2) the active dimension technique especially for second- and higher-order parameter interactions, and 3) the stepwise regression approach designed to retain only the most influential polynomials in the PDD expansion. During this adaptive procedure featuring stepwise regressions, the surrogate model representation retains only a few terms, so that the cost of repeatedly solving the linear systems of the least-squares regression problem is negligible. The size of the finally obtained sparse PDD representation is much smaller than that of the full expansion, since only significant terms are eventually retained. Consequently, a much smaller number of calls to the deterministic model is required to compute the final PDD coefficients.

  6. Partitioning of electron flux between the respiratory chains of the yeast Candida parapsilosis: parallel working of the two chains.

    PubMed

    Guerin, M G; Camougrand, N M

    1994-02-08

    Partitioning of the electron flux between the classical and the alternative respiratory chains of the yeast Candida parapsilosis was measured as a function of the oxidation rate and of the Q-pool redox poise. At low respiration rate, electrons from external NADH travelled preferentially through the alternative pathway as indicated by the antimycin A-insensitivity of electron flow. Inhibition of the alternative pathway by SHAM restored full antimycin A-sensitivity to the remaining electron flow. The dependence of the respiratory rate on the redox poise of the quinone pool was investigated when the electron flux was mediated either by the main respiratory chain (growth in the absence of antimycin A) or by the second respiratory chain (growth in the presence of antimycin A). In the former case, a linear relationship was found between these two parameters. In contrast, in the latter case, the relationship between Q-pool reduction level and electron flux was non-linear, but it could be resolved into two distinct curves. This second quinone is not reducible in the presence of antimycin A but only in the presence of high concentrations of myxothiazol or cyanide. Since two quinone species exist in C. parapsilosis, UQ9 and Qx (C33H54O4), we hypothesized that these two curves could correspond to the functioning of the second quinone engaged during the alternative pathway activity. Partitioning of electrons between both respiratory chains could occur upstream of complex III with the second chain functioning in parallel to the main one, and with the additional possibility of merging into the main one at the complex IV level.

  7. Numerical Error Estimation with UQ

    NASA Astrophysics Data System (ADS)

    Ackmann, Jan; Korn, Peter; Marotzke, Jochem

    2014-05-01

    Ocean models are still in need of means to quantify model errors, which are inevitably made when running numerical experiments. The total model error can formally be decomposed into two parts, the formulation error and the discretization error. The formulation error arises from the continuous formulation of the model not fully describing the studied physical process. The discretization error arises from having to solve a discretized model instead of the continuously formulated model. Our work on error estimation is concerned with the discretization error. Given a solution of a discretized model, our general problem statement is to find a way to quantify the uncertainties due to discretization in physical quantities of interest (diagnostics), which are frequently used in Geophysical Fluid Dynamics. The approach we use to tackle this problem is called the "Goal Error Ensemble method". The basic idea of the Goal Error Ensemble method is that errors in diagnostics can be translated into a weighted sum of local model errors, which makes it conceptually based on the Dual Weighted Residual method from Computational Fluid Dynamics. In contrast to the Dual Weighted Residual method, these local model errors are not considered deterministically but interpreted as local model uncertainty and described stochastically by a random process. The parameters for the random process are tuned with high-resolution near-initial model information. However, the original Goal Error Ensemble method, introduced in [1], was successfully evaluated only in the case of inviscid flows without lateral boundaries in a shallow-water framework and is hence only of limited use in a numerical ocean model. Our work consists in extending the method to bounded, viscous flows in a shallow-water framework. As our numerical model, we use the ICON-Shallow-Water model. In viscous flows our high-resolution information is dependent on the viscosity parameter, making our uncertainty measures viscosity-dependent. We will show that we can choose a sensible parameter by using the Reynolds number as a criterion. Another topic we will discuss is the choice of the underlying distribution of the random process. This is especially important in the presence of lateral boundaries. We will present resulting error estimates for different height- and velocity-based diagnostics applied to the Munk gyre experiment. References [1] F. RAUSER: Error Estimation in Geophysical Fluid Dynamics through Learning; PhD Thesis, IMPRS-ESM, Hamburg, 2010 [2] F. RAUSER, J. MAROTZKE, P. KORN: Ensemble-type numerical uncertainty quantification from single model integrations; SIAM/ASA Journal on Uncertainty Quantification, submitted
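
    The following is a generic, heavily simplified sketch of the ensemble idea described above: local model errors are treated as draws from an assumed random process, weighted by the diagnostic's sensitivity to each cell, and accumulated into an ensemble distribution for the error in the diagnostic. The weights, error scales, and sizes are placeholders, not the ICON-Shallow-Water setup.

```python
# Generic sketch of the ensemble idea described above: treat local model errors
# as random, weight them by the diagnostic's sensitivity to each cell, and build
# an ensemble distribution for the error in the diagnostic. All numbers and the
# weighting are illustrative; this is not the ICON-Shallow-Water implementation.
import numpy as np

rng = np.random.default_rng(3)
n_cells, n_members = 200, 1000

# Stand-ins: per-cell sensitivity weights of the diagnostic (the "dual" weights)
# and a per-cell scale for the local discretization error, e.g. tuned from
# high-resolution near-initial information.
weights = np.exp(-np.linspace(0, 4, n_cells))        # diagnostic cares most about a region
local_error_scale = 0.01 * (1 + np.linspace(0, 1, n_cells))

# Ensemble: each member draws local errors from the assumed random process and
# accumulates them into an error estimate for the diagnostic.
local_errors = rng.normal(0.0, local_error_scale, size=(n_members, n_cells))
diag_error = local_errors @ weights

print(f"estimated diagnostic error: {diag_error.mean():+.4f} "
      f"+/- {diag_error.std():.4f} (ensemble std)")
```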

  8. On the predictivity of pore-scale simulations: Estimating uncertainties with multilevel Monte Carlo

    NASA Astrophysics Data System (ADS)

    Icardi, Matteo; Boccardo, Gianluca; Tempone, Raúl

    2016-09-01

    A fast method with tunable accuracy is proposed to estimate errors and uncertainties in pore-scale and Digital Rock Physics (DRP) problems. The overall predictivity of these studies can be, in fact, hindered by many factors including sample heterogeneity, computational and imaging limitations, model inadequacy and not perfectly known physical parameters. The typical objective of pore-scale studies is the estimation of macroscopic effective parameters such as permeability, effective diffusivity and hydrodynamic dispersion. However, these are often non-deterministic quantities (i.e., results obtained for a specific pore-scale sample and setup are not totally reproducible by another 'equivalent' sample and setup). The stochastic nature can arise due to the multi-scale heterogeneity, the computational and experimental limitations in considering large samples, and the complexity of the physical models. These approximations, in fact, introduce an error that, being dependent on a large number of complex factors, can be modeled as random. We propose a general simulation tool, based on multilevel Monte Carlo, that can drastically reduce the computational cost needed for computing accurate statistics of effective parameters and other quantities of interest, under any of these random errors. This is, to our knowledge, the first attempt to include Uncertainty Quantification (UQ) in pore-scale physics and simulation. The method can also provide estimates of the discretization error and it is tested on three-dimensional transport problems in heterogeneous materials, where the sampling procedure is done by generation algorithms able to reproduce realistic consolidated and unconsolidated random sphere and ellipsoid packings and arrangements. A totally automatic workflow is developed in an open-source code [1] that includes rigid body physics and random packing algorithms, unstructured mesh discretization, finite volume solvers, extrapolation and post-processing techniques. The proposed method can be efficiently used in many porous media applications for problems such as stochastic homogenization/upscaling, propagation of uncertainty from microscopic fluid and rock properties to macro-scale parameters, robust estimation of Representative Elementary Volume size for arbitrary physics.
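
    A minimal multilevel Monte Carlo sketch is given below: the expectation of a quantity of interest is estimated through a telescoping sum over discretization levels, with many cheap samples on coarse levels and few on fine ones. The level-dependent "solver" is a toy stand-in, not the pore-scale workflow of the paper.

```python
# Minimal multilevel Monte Carlo sketch: estimate the expectation of a quantity
# of interest using a telescoping sum over discretization levels, putting many
# cheap samples on coarse levels and few on fine ones. The "solver" below is a
# toy stand-in for a pore-scale simulation, not the workflow in the paper.
import numpy as np

rng = np.random.default_rng(4)

def qoi(level, sample):
    """Toy level-dependent quantity of interest: bias shrinks as level grows,
    mimicking a permeability estimate that converges with mesh refinement."""
    bias = 2.0 ** (-level)
    noise = 0.5 * np.sin(10 * sample)
    return 1.0 + bias * (1 + noise)

levels = [0, 1, 2, 3]
samples_per_level = [4000, 1000, 250, 60]     # decreasing sample budget per level

estimate = 0.0
for ell, n_ell in zip(levels, samples_per_level):
    xs = rng.uniform(size=n_ell)
    fine = np.array([qoi(ell, x) for x in xs])
    if ell == 0:
        estimate += fine.mean()               # E[Q_0]
    else:
        coarse = np.array([qoi(ell - 1, x) for x in xs])
        estimate += (fine - coarse).mean()    # E[Q_l - Q_{l-1}], same samples
print(f"MLMC estimate of the effective parameter: {estimate:.4f}")
```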

  9. The role of correlations in uncertainty quantification of transportation relevant fuel models

    DOE PAGES

    Fridlyand, Aleksandr; Johnson, Matthew S.; Goldsborough, S. Scott; ...

    2017-02-03

    Large reaction mechanisms are often used to describe the combustion behavior of transportation-relevant fuels like gasoline, where these are typically represented by surrogate blends, e.g., n-heptane/iso-octane/toluene. We describe efforts to quantify the uncertainty in the predictions of such mechanisms at realistic engine conditions, seeking to better understand the robustness of the model as well as the important reaction pathways and their impacts on combustion behavior. In this work, we examine the importance of taking into account correlations among reactions that utilize the same rate rules and those with multiple product channels on forward propagation of uncertainty by Monte Carlo simulations. Automated means are developed to generate the uncertainty factor assignment for a detailed chemical kinetic mechanism, by first uniquely identifying each reacting species, then sorting each of the reactions based on the rate rule utilized. Simulation results reveal that in the low temperature combustion regime for iso-octane, the majority of the uncertainty in the model predictions can be attributed to low temperature reactions of the fuel sub-mechanism. The foundational, or small-molecule, chemistry (C0-C4) only contributes significantly to uncertainties in the predictions at the highest temperatures (Tc=900 K). Accounting for correlations between important reactions is shown to produce non-negligible differences in the estimates of uncertainty. Including correlations among reactions that use the same rate rules increases uncertainty in the model predictions, while accounting for correlations among reactions with multiple branches decreases uncertainty in some cases. Significant non-linear response is observed in the model predictions depending on how the probability distributions of the uncertain rate constants are defined. Finally, we concluded that care must be exercised in defining these probability distributions in order to reduce bias and physically unrealistic estimates in the forward propagation of uncertainty for a range of UQ activities.
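
    The sketch below illustrates the correlation treatment discussed above: reactions sharing a rate rule receive one common uncertainty multiplier (fully correlated), rather than independent multipliers per reaction, before forward propagation by Monte Carlo. The reaction list, uncertainty factors, and "ignition delay" response are toy placeholders, not a real mechanism.

```python
# Sketch of the correlation treatment discussed above: reactions that share a
# rate rule get one common uncertainty multiplier (fully correlated), instead of
# independent multipliers per reaction. The "ignition delay" response below is a
# toy placeholder, not a kinetic mechanism.
import numpy as np

rng = np.random.default_rng(5)
n_samples = 5000

# Each reaction belongs to a rate-rule family and has an uncertainty factor f
# (rate-constant multiplier sampled log-uniformly in [1/f, f]).
reactions = [
    {"name": "R1", "rule": "H_abstraction", "f": 3.0},
    {"name": "R2", "rule": "H_abstraction", "f": 3.0},
    {"name": "R3", "rule": "RO2_isomerization", "f": 5.0},
    {"name": "R4", "rule": "beta_scission", "f": 2.0},
]

def toy_ignition_delay(mults):
    # Placeholder response: faster abstraction/isomerization shortens the delay
    return 1.0 / (mults[0] * mults[1]) ** 0.4 / mults[2] ** 0.3 * mults[3] ** 0.1

def sample(correlated):
    delays = np.empty(n_samples)
    for i in range(n_samples):
        z_by_rule = {}                               # one draw per rule family
        mults = []
        for rxn in reactions:
            if correlated:
                if rxn["rule"] not in z_by_rule:
                    z_by_rule[rxn["rule"]] = rng.uniform(-1, 1)
                z = z_by_rule[rxn["rule"]]
            else:
                z = rng.uniform(-1, 1)               # independent draw per reaction
            mults.append(rxn["f"] ** z)
        delays[i] = toy_ignition_delay(mults)
    return delays

for flag in (False, True):
    d = sample(flag)
    print(f"correlated={flag}: 2.5-97.5% spread = "
          f"[{np.percentile(d, 2.5):.3f}, {np.percentile(d, 97.5):.3f}]")
```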

  10. Rapamycin and CHIR99021 Coordinate Robust Cardiomyocyte Differentiation From Human Pluripotent Stem Cells Via Reducing p53-Dependent Apoptosis.

    PubMed

    Qiu, Xiao-Xu; Liu, Yang; Zhang, Yi-Fan; Guan, Ya-Na; Jia, Qian-Qian; Wang, Chen; Liang, He; Li, Yong-Qin; Yang, Huang-Tian; Qin, Yong-Wen; Huang, Shuang; Zhao, Xian-Xian; Jing, Qing

    2017-10-02

    Cardiomyocytes differentiated from human pluripotent stem cells can serve as an unexhausted source for a cellular cardiac disease model. Although small molecule-mediated cardiomyocyte differentiation methods have been established, the differentiation efficiency is relatively unsatisfactory in multiple lines due to line-to-line variation. Additionally, hurdles including line-specific low expression of endogenous growth factors and the high apoptotic tendency of human pluripotent stem cells also need to be overcome to establish robust and efficient cardiomyocyte differentiation. We used the H9-human cardiac troponin T-eGFP reporter cell line to screen for small molecules that promote cardiac differentiation in a monolayer-based and growth factor-free differentiation model. We found that collaterally treating human pluripotent stem cells with rapamycin and CHIR99021 during the initial stage was essential for efficient and reliable cardiomyocyte differentiation. Moreover, this method maintained consistency in efficiency across different human embryonic stem cell and human induced pluripotent stem cell lines without specifically optimizing multiple parameters (the efficiency in H7, H9, and UQ1 human induced pluripotent stem cells is 98.3%, 93.3%, and 90.6%, respectively). This combination also increased the yield of cardiomyocytes (1:24) and at the same time reduced medium consumption by about 50% when compared with the previous protocols. Further analysis indicated that inhibition of the mammalian target of rapamycin allows efficient cardiomyocyte differentiation through overcoming p53-dependent apoptosis of human pluripotent stem cells during high-density monolayer culture via blunting p53 translation and mitochondrial reactive oxygen species production. We have demonstrated that mammalian target of rapamycin exerts a stage-specific and multifaceted regulation over cardiac differentiation and provides an optimized approach for generating large numbers of functional cardiomyocytes for disease modeling and in vitro drug screening. © 2017 The Authors. Published on behalf of the American Heart Association, Inc., by Wiley.

  11. A dosimetry technique for measuring kilovoltage cone‐beam CT dose on a linear accelerator using radiotherapy equipment

    PubMed Central

    Lawford, Catherine E.

    2014-01-01

    This work develops a technique for kilovoltage cone‐beam CT (CBCT) dosimetry that incorporates both point dose and integral dose in the form of dose length product, and uses readily available radiotherapy equipment. The dose from imaging protocols for a range of imaging parameters and treatment sites was evaluated. Conventional CT dosimetry using 100 mm long pencil chambers has been shown to be inadequate for the large fields in CBCT and has been replaced in this work by a combination of point dose and integral dose. Absolute dose measurements were made with a small volume ion chamber at the central slice of a radiotherapy phantom. Beam profiles were measured using a linear diode array large enough to capture the entire imaging field. These profiles were normalized to absolute dose to form dose line integrals, which were then weighted with radial depth to form the DLP_CBCT. This metric is analogous to the standard dose length product (DLP), but derived differently to suit the unique properties of CBCT. Imaging protocols for head and neck, chest, and prostate sites delivered absolute doses of 0.9, 2.2, and 2.9 cGy to the center of the phantom, and DLP_CBCT of 28.2, 665.1, and 565.3 mGy·cm, respectively. Results are displayed as dose per 100 mAs and as a function of key imaging parameters such as kVp, mAs, and collimator selection in a summary table. DLP_CBCT was found to correlate closely with the dimension of the imaging region and provided a good indication of integral dose. It is important to assess integral dose when determining radiation doses to patients using CBCT. By incorporating measured beam profiles and DLP, this technique provides CBCT dosimetry in radiotherapy phantoms and allows the prediction of imaging dose for new CBCT protocols. PACS number: 87.57.uq PMID:25207398
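
    As a numerical illustration of the flavor of this metric, the sketch below normalizes a beam profile to an absolute central dose and integrates it along the scan axis to obtain a dose line integral; the radial-depth weighting described above is noted but omitted. The profile and numbers are placeholders, not the paper's data.

```python
# Numerical sketch of turning a measured beam profile into a dose-length-product
# style metric: normalize the profile to an absolute ion-chamber dose at the
# centre, then integrate along the scan axis. The profile, normalization and
# (omitted) radial-depth weighting are placeholders, not the paper's data.
import numpy as np

z = np.linspace(-150.0, 150.0, 301)                 # mm along the phantom axis
profile = np.exp(-0.5 * (z / 80.0) ** 2)            # relative diode profile (a.u.)

central_dose_mGy = 22.0                             # ion-chamber dose at the centre
centre = np.argmin(np.abs(z))
dose_profile_mGy = central_dose_mGy * profile / profile[centre]

# Dose line integral (mGy*cm); the radial-depth weighting described above would
# be applied point by point before summing to obtain the DLP_CBCT-style metric.
dz_cm = (z[1] - z[0]) / 10.0
dlp_mGy_cm = float(np.sum(dose_profile_mGy) * dz_cm)
print(f"dose line integral ~ {dlp_mGy_cm:.1f} mGy*cm")
```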

  12. Quantitative signal intensity alteration in infrapatellar fat pad predict incident radiographic osteoarthritis: the Osteoarthritis Initiative.

    PubMed

    Wang, Kang; Ding, Changhai; Hannon, Michael J; Chen, Zhongshan; Kwoh, C Kent; Hunter, David J

    2018-04-12

    To determine if infrapatellar fat pad (IPFP) signal intensity (SI) measures are predictive of incident radiographic osteoarthritis (iROA) over 4 years in the OA Initiative (OAI) study. Case knees (n=355) defined by iROA were matched one-to-one by gender, age and radiographic status with control knees. T2-weighted MR images were assessed at P0 (the visit when iROA was found on radiograph), P-1 (1 year prior to P0) and baseline, and utilized to assess IPFP SI semi-automatically using MATLAB. Conditional logistic regression analyses were used to assess risk of iROA associated with IPFP SI alteration after adjustment for covariates. Participants were on average 60.2 years old, predominantly female (66.7%) and overweight (mean BMI: 28.3). Baseline IPFP measures including mean value and standard deviation of IPFP SI [Mean(IPFP), sDev(IPFP)] (HR, 95%CI: 5.2, 1.1 to 23.6 and 5.7, 2.2 to 14.5, respectively), mean value and standard deviation of IPFP high SI [Mean(H), sDev(H)] (HR, 95%CI: 3.3, 1.7 to 6.4 and 3.1, 1.3 to 7.7, respectively), median value and upper quartile value of IPFP high SI [Median(H), UQ(H)], and clustering effect of high SI [Clustering factor(H)] were associated with iROA during 4 years. All P-1 IPFP measures were associated with iROA after 12 months. P-0 IPFP SI measures were all associated with ROA. The quantitative segmentation of high signal in the IPFP confirms previous work based on semiquantitative assessment, suggesting its predictive validity. The IPFP high SI alteration could be an important imaging biomarker to predict the occurrence of radiographic OA. This article is protected by copyright. All rights reserved.

  13. Process compensated resonance testing modeling for damage evolution and uncertainty quantification

    NASA Astrophysics Data System (ADS)

    Biedermann, Eric; Heffernan, Julieanne; Mayes, Alexander; Gatewood, Garrett; Jauriqui, Leanne; Goodlet, Brent; Pollock, Tresa; Torbet, Chris; Aldrin, John C.; Mazdiyasni, Siamack

    2017-02-01

    Process Compensated Resonance Testing (PCRT) is a nondestructive evaluation (NDE) method based on the fundamentals of Resonant Ultrasound Spectroscopy (RUS). PCRT is used for material characterization, defect detection, process control and life monitoring of critical gas turbine engine and aircraft components. Forward modeling and model inversion for PCRT have the potential to greatly increase the method's material characterization capability while reducing its dependence on compiling a large population of physical resonance measurements. This paper presents progress on forward modeling studies for damage mechanisms and defects common to structural materials for gas turbine engines. Finite element method (FEM) models of single crystal (SX) Ni-based superalloy Mar-M247 dog bones and Ti-6Al-4V cylindrical bars were created, and FEM modal analyses calculated the resonance frequencies for the samples in their baseline condition. Then the frequency effects of superalloy creep (high-temperature plastic deformation) and macroscopic texture (preferred crystallographic orientation of grains detrimental to fatigue properties) were evaluated. A PCRT sorting module for creep damage in Mar-M247 was trained with a virtual database made entirely of modeled design points. The sorting module demonstrated successful discrimination of design points with as little as 1% creep strain in the gauge section from a population of acceptable design points with a range of material and geometric variation. The resonance frequency effects of macro-scale texture in Ti-6Al-4V were quantified with forward models of cylinder samples. FEM-based model inversion was demonstrated for Mar-M247 bulk material properties and variations in crystallographic orientation. PCRT uncertainty quantification (UQ) was performed using Monte Carlo studies for Mar-M247 that quantified the overall uncertainty in resonance frequencies resulting from coupled variation in geometry, material properties, crystallographic orientation and creep damage. A model calibration process was also developed that evaluates inversion fitting to differences from a designated reference sample rather than absolute property values, yielding a reduction in fit error.

  14. On two parabolic systems: Convergence and blowup

    NASA Astrophysics Data System (ADS)

    Huang, Yamin

    1998-12-01

    This dissertation studies two parabolic systems. It consists of two parts. In part one (chapter one), we prove a convergence result, namely, the solution $(A_K, B_K)$ of a system of chemical diffusion-reaction equations (with reaction rate $K$) converges to the solution $(A, B)$ of a diffusion-instantaneous-reaction equation. To prove our main result, we use some $L^1$ and $L^2$ 'energy' estimates and a compactness result due to Aubin (1). As a by-product we also prove that as $K$ approaches infinity, the limit solution exhibits phase separation between $A$ and $B$. In part two (chapter two), we study the blowup rate for a system of heat equations $u_t = \Delta u$, $v_t = \Delta v$ in a bounded domain $\Omega \times (0,T)$ coupled in the nonlinear Neumann boundary conditions $\partial u/\partial n = v^p$, $\partial v/\partial n = u^q$ on $\partial\Omega \times [0,T)$, where $p > 0$, $q > 0$, $pq > 1$ and $n$ is the exterior normal vector on $\partial\Omega$. Under certain assumptions, we establish the exact blowup rate, which generalizes the corresponding results of some authors' recent work including Deng (2), Deng-Fila-Levine (3) and Hu-Yin (4). References: (1) J. P. Aubin, Un theoreme de compacite, C. R. Acad. Sci., 256(1963), pp. 5042-5044. (2) K. Deng, Blow-up rates for parabolic systems, Z. Angew. Math. Phys., 47(1996), No. 1, pp. 132-143. (3) K. Deng, M. Fila and H. A. Levine, On critical exponents for a system of heat equations coupled in the boundary conditions, Acta Math. Univ. Comenian. (N.S.), 36(1994), No. 2, pp. 169-192. (4) B. Hu and H. M. Yin, The profile near blowup time for solutions of the heat equation with a nonlinear boundary condition, Trans. Amer. Math. Soc., 346(1994), pp. 117-135.

  15. Gaussian processes with built-in dimensionality reduction: Applications to high-dimensional uncertainty propagation

    NASA Astrophysics Data System (ADS)

    Tripathy, Rohit; Bilionis, Ilias; Gonzalez, Marcial

    2016-09-01

    Uncertainty quantification (UQ) tasks, such as model calibration, uncertainty propagation, and optimization under uncertainty, typically require several thousand evaluations of the underlying computer codes. To cope with the cost of simulations, one replaces the real response surface with a cheap surrogate based, e.g., on polynomial chaos expansions, neural networks, support vector machines, or Gaussian processes (GP). However, the number of simulations required to learn a generic multivariate response grows exponentially as the input dimension increases. This curse of dimensionality can only be addressed if the response exhibits some special structure that can be discovered and exploited. A wide range of physical responses exhibit a special structure known as an active subspace (AS). An AS is a linear manifold of the stochastic space characterized by maximal response variation. The idea is that one should first identify this low dimensional manifold, project the high-dimensional input onto it, and then link the projection to the output. If the dimensionality of the AS is low enough, then learning the link function is a much easier problem than the original problem of learning a high-dimensional function. The classic approach to discovering the AS requires gradient information, a fact that severely limits its applicability. Furthermore, and partly because of its reliance on gradients, it is not able to handle noisy observations. The latter is an essential trait if one wants to be able to propagate uncertainty through stochastic simulators, e.g., through molecular dynamics codes. In this work, we develop a probabilistic version of AS which is gradient-free and robust to observational noise. Our approach relies on a novel Gaussian process regression with built-in dimensionality reduction. In particular, the AS is represented as an orthogonal projection matrix that serves as yet another covariance function hyper-parameter to be estimated from the data. To train the model, we design a two-step maximum likelihood optimization procedure that ensures the orthogonality of the projection matrix by exploiting recent results on the Stiefel manifold, i.e., the manifold of matrices with orthogonal columns. The additional benefit of our probabilistic formulation is that it allows us to select the dimensionality of the AS via the Bayesian information criterion. We validate our approach by showing that it can discover the right AS in synthetic examples without gradient information using both noiseless and noisy observations. We demonstrate that our method is able to discover the same AS as the classical approach in a challenging one-hundred-dimensional problem involving an elliptic stochastic partial differential equation with random conductivity. Finally, we use our approach to study the effect of geometric and material uncertainties in the propagation of solitary waves in a one dimensional granular system.
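
    A simplified sketch of the "project, then regress" idea above: inputs in a 20-dimensional space are mapped through a one-dimensional projection and a Gaussian process is fitted on the projected coordinate (the link function). The paper learns the projection on the Stiefel manifold as a covariance hyper-parameter; here it is fixed and known purely for illustration.

```python
# Simplified sketch of the "project then regress" idea above: inputs in R^20 are
# mapped through a 1-D projection W and a Gaussian process is fit on the
# projected coordinate. The paper learns W on the Stiefel manifold as a GP
# hyper-parameter; here W is known and fixed for illustration only.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(6)
dim, n_train = 20, 120

w_true = rng.normal(size=(dim, 1))
w_true /= np.linalg.norm(w_true)                 # orthonormal 1-D active subspace

def response(x):
    t = x @ w_true                               # only the projection matters
    return np.sin(3 * t).ravel() + 0.5 * t.ravel() ** 2

X = rng.uniform(-1, 1, size=(n_train, dim))
y = response(X) + 0.01 * rng.normal(size=n_train)

# GP on the 1-D projected input (the "link function" g in the AS formulation)
gp = GaussianProcessRegressor(kernel=RBF() + WhiteKernel(), normalize_y=True)
gp.fit(X @ w_true, y)

X_test = rng.uniform(-1, 1, size=(500, dim))
pred = gp.predict(X_test @ w_true)
err = np.sqrt(np.mean((pred - response(X_test)) ** 2))
print(f"RMSE of the projected-input GP surrogate: {err:.3f}")
```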

  16. Gaussian processes with built-in dimensionality reduction: Applications to high-dimensional uncertainty propagation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tripathy, Rohit, E-mail: rtripath@purdue.edu; Bilionis, Ilias, E-mail: ibilion@purdue.edu; Gonzalez, Marcial, E-mail: marcial-gonzalez@purdue.edu

    2016-09-15

    Uncertainty quantification (UQ) tasks, such as model calibration, uncertainty propagation, and optimization under uncertainty, typically require several thousand evaluations of the underlying computer codes. To cope with the cost of simulations, one replaces the real response surface with a cheap surrogate based, e.g., on polynomial chaos expansions, neural networks, support vector machines, or Gaussian processes (GP). However, the number of simulations required to learn a generic multivariate response grows exponentially as the input dimension increases. This curse of dimensionality can only be addressed if the response exhibits some special structure that can be discovered and exploited. A wide range of physical responses exhibit a special structure known as an active subspace (AS). An AS is a linear manifold of the stochastic space characterized by maximal response variation. The idea is that one should first identify this low dimensional manifold, project the high-dimensional input onto it, and then link the projection to the output. If the dimensionality of the AS is low enough, then learning the link function is a much easier problem than the original problem of learning a high-dimensional function. The classic approach to discovering the AS requires gradient information, a fact that severely limits its applicability. Furthermore, and partly because of its reliance on gradients, it is not able to handle noisy observations. The latter is an essential trait if one wants to be able to propagate uncertainty through stochastic simulators, e.g., through molecular dynamics codes. In this work, we develop a probabilistic version of AS which is gradient-free and robust to observational noise. Our approach relies on a novel Gaussian process regression with built-in dimensionality reduction. In particular, the AS is represented as an orthogonal projection matrix that serves as yet another covariance function hyper-parameter to be estimated from the data. To train the model, we design a two-step maximum likelihood optimization procedure that ensures the orthogonality of the projection matrix by exploiting recent results on the Stiefel manifold, i.e., the manifold of matrices with orthogonal columns. The additional benefit of our probabilistic formulation is that it allows us to select the dimensionality of the AS via the Bayesian information criterion. We validate our approach by showing that it can discover the right AS in synthetic examples without gradient information using both noiseless and noisy observations. We demonstrate that our method is able to discover the same AS as the classical approach in a challenging one-hundred-dimensional problem involving an elliptic stochastic partial differential equation with random conductivity. Finally, we use our approach to study the effect of geometric and material uncertainties in the propagation of solitary waves in a one dimensional granular system.

  17. Biplane interventional pediatric system with cone‐beam CT: dose and image quality characterization for the default protocols

    PubMed Central

    Vañó, Eliseo; Alejo, Luis; Ubeda, Carlos; Gutiérrez‐Larraya, Federico; Garayoa, Julia

    2016-01-01

    The aim of this study was to assess image quality and radiation dose of a biplane angiographic system with cone‐beam CT (CBCT) capability tuned for pediatric cardiac procedures. The results of this study can be used to explore dose reduction techniques. For pulsed fluoroscopy and cine modes, polymethyl methacrylate phantoms of various thicknesses and a Leeds TOR 18‐FG test object were employed. Various fields of view (FOV) were selected. For CBCT, the study employed head and body dose phantoms, Catphan 504, and an anthropomorphic cardiology phantom. The study also compared two 3D rotational angiography protocols. The entrance surface air kerma per frame increases by a factor of 3–12 when comparing cine and fluoroscopy frames. The biggest difference in the signal‐to‐noise ratio between fluoroscopy and cine modes occurs at FOV 32 cm because fluoroscopy is acquired at a 1440×1440 pixel matrix size and in unbinned mode, whereas cine is acquired at 720×720 pixels and in binned mode. The high‐contrast spatial resolution of cine is better than that of fluoroscopy, except for FOV 32 cm, because fluoroscopy mode with 32 cm FOV is unbinned. Acquiring CBCT series with a 16 cm head phantom using the standard dose protocol results in a threefold dose increase compared with the low‐dose protocol. Although the amount of noise present in the images acquired with the low‐dose protocol is much higher than that obtained with the standard mode, the images present better spatial resolution. A 1 mm diameter rod with 250 Hounsfield units can be distinguished in reconstructed images with an 8 mm slice width. Pediatric‐specific protocols provide lower doses while maintaining sufficient image quality. The system offers a novel 3D imaging mode. The acquisition of CBCT images results in increased doses administered to the patients, but also provides further diagnostic information contained in the volumetric images. The assessed CBCT protocols provide images that are noisy, but with very good spatial resolution. PACS number(s): 87.59.‐e, 87.59.‐C, 87.59.‐cf, 87.59.Dj, 87.57. uq PMID:27455474

  18. Inverse methods-based estimation of plate coupling in a plate motion model governed by mantle flow

    NASA Astrophysics Data System (ADS)

    Ratnaswamy, V.; Stadler, G.; Gurnis, M.

    2013-12-01

    Plate motion is primarily controlled by buoyancy (slab pull) which occurs at convergent plate margins where oceanic plates undergo deformation near the seismogenic zone. Yielding within subducting plates, lateral variations in viscosity, and the strength of seismic coupling between plate margins likely have an important control on plate motion. Here, we wish to infer the inter-plate coupling for different subduction zones, and develop a method for inferring it as a PDE-constrained optimization problem, where the cost functional is the misfit in plate velocities and is constrained by the nonlinear Stokes equation. The inverse models have well resolved slabs, plates, and plate margins in addition to a power law rheology with yielding in the upper mantle. Additionally, a Newton method is used to solve the nonlinear Stokes equation with viscosity bounds. We infer plate boundary strength using an inexact Gauss-Newton method with line search for backtracking. Each inverse model is applied to two simple 2-D scenarios (each with three subduction zones), one with back-arc spreading and one without. For each case we examine the sensitivity of the inversion to the amount of surface velocity used: 1) full surface velocity data and 2) surface velocity data simplified using a single scalar average (2-D equivalent to an Euler pole) for each plate. We can recover plate boundary strength in each case, even in the presence of highly nonlinear flow with extreme variations in viscosity. Additionally, we ascribe an uncertainty in each plate's velocity and perform an uncertainty quantification (UQ) through the Hessian of the misfit in plate velocities. We find that as plate boundaries become strongly coupled, the uncertainty in the inferred plate boundary strength decreases. For very weak, uncoupled subduction zones, the uncertainty of inferred plate margin strength increases since there is little sensitivity between plate margin strength and plate velocity. This result is significant because it implies we can infer which plate boundaries are more coupled (seismically) for a realistic dynamic model of plates and mantle flow.
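
    The sketch below shows a schematic Gauss-Newton loop with a backtracking line search for a small least-squares misfit between observed and predicted "plate velocities". The forward model and parameters are placeholders; the actual inversion is constrained by the nonlinear Stokes equations and uses adjoint machinery not reproduced here.

```python
# Schematic Gauss-Newton loop with a backtracking line search for a toy
# least-squares misfit between observed and predicted "plate velocities".
# The forward model is a placeholder; the real problem is constrained by the
# nonlinear Stokes equations and uses adjoints, which are not reproduced here.
import numpy as np

def forward(m):
    """Toy forward model: two 'plate velocities' from two coupling parameters."""
    return np.array([np.exp(-m[0]) + 0.2 * m[1],
                     0.5 / (1.0 + m[0] ** 2) + m[1] ** 2])

def jacobian(m, eps=1e-6):
    """Finite-difference Jacobian (stands in for an adjoint-based derivative)."""
    J = np.zeros((2, 2))
    for j in range(2):
        dm = np.zeros(2); dm[j] = eps
        J[:, j] = (forward(m + dm) - forward(m - dm)) / (2 * eps)
    return J

obs = forward(np.array([1.3, 0.4]))          # synthetic "observed" velocities
m = np.array([0.2, 0.0])                     # initial guess for the coupling

for it in range(20):
    r = forward(m) - obs                     # residual (velocity misfit)
    J = jacobian(m)
    step, *_ = np.linalg.lstsq(J, -r, rcond=None)   # Gauss-Newton direction
    alpha, phi0 = 1.0, 0.5 * r @ r
    while 0.5 * np.sum((forward(m + alpha * step) - obs) ** 2) > phi0 and alpha > 1e-8:
        alpha *= 0.5                         # backtracking line search
    m = m + alpha * step
    if np.linalg.norm(r) < 1e-10:
        break
print("recovered coupling parameters:", m)
```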

  19. QMU as an approach to strengthening the predictive capabilities of complex models.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gray, Genetha Anne.; Boggs, Paul T.; Grace, Matthew D.

    2010-09-01

    Complex systems are made up of multiple interdependent parts, and the behavior of the entire system cannot always be directly inferred from the behavior of the individual parts. They are nonlinear and system responses are not necessarily additive. Examples of complex systems include energy, cyber and telecommunication infrastructures, human and animal social structures, and biological structures such as cells. To meet the goals of infrastructure development, maintenance, and protection for cyber-related complex systems, novel modeling and simulation technology is needed. Sandia has shown success using M&S in the nuclear weapons (NW) program. However, complex systems represent a significant challenge and relative departure from the classical M&S exercises, and many of the scientific and mathematical M&S processes must be re-envisioned. Specifically, in the NW program, requirements and acceptable margins for performance, resilience, and security are well-defined and given quantitatively from the start. The Quantification of Margins and Uncertainties (QMU) process helps to assess whether or not these safety, reliability and performance requirements have been met after a system has been developed. In this sense, QMU is used as a sort of check that requirements have been met once the development process is completed. In contrast, performance requirements and margins may not have been defined a priori for many complex systems (i.e., the Internet, electrical distribution grids, etc.), particularly not in quantitative terms. This project addresses this fundamental difference by investigating the use of QMU at the start of the design process for complex systems. Three major tasks were completed. First, the characteristics of the cyber infrastructure problem were collected and considered in the context of QMU-based tools. Second, UQ methodologies for the quantification of model discrepancies were considered in the context of statistical models of cyber activity. Third, Bayesian methods for optimal testing in the QMU framework were developed. The completion of this project represents an increased understanding of how to apply and use the QMU process as a means for improving model predictions of the behavior of complex systems.

  20. Tolerance and UQ4SIM: Nimble Uncertainty Documentation and Analysis Software

    NASA Technical Reports Server (NTRS)

    Kleb, Bil

    2008-01-01

    Ultimately, scientific numerical models need quantified output uncertainties so that modeling can evolve to better match reality. Documenting model input uncertainties and variabilities is a necessary first step toward that goal. Without known input parameter uncertainties, model sensitivities are all one can determine, and without code verification, output uncertainties are simply not reliable. The basic premise of uncertainty markup is to craft a tolerance and tagging mini-language that offers a natural, unobtrusive presentation and does not depend on parsing each type of input file format. Each file is marked up with tolerances and, optionally, associated tags that serve to label the parameters and their uncertainties. The evolution of such a language, often called a Domain Specific Language or DSL, is given in [1], but in final form it parallels tolerances specified on an engineering drawing, e.g., 1 +/- 0.5, 5 +/- 10%, 2 +/- 1o, where % signifies percent and o signifies order of magnitude. Tags, necessary for error propagation, can be added by placing a quotation-mark-delimited tag after the tolerance, e.g., 0.7 +/- 20% 'T_effective'. In addition, tolerances might have different underlying distributions, e.g., Uniform, Normal, or Triangular, or the tolerances may merely be intervals due to lack of knowledge (uncertainty). Finally, to address pragmatic considerations such as older models that require specific number-field formats, C-style format specifiers can be appended to the tolerance like so, 1.35 +/- 10U_3.2f. As an example of use, consider figure 1, where a chemical reaction input file has been marked up to include tolerances and tags per table 1. Not only does the technique provide a natural method of specifying tolerances, but it also serves as in situ documentation of model uncertainties. This tolerance language comes with a utility to strip the tolerances (and tags), to provide a path to the nominal model parameter file. And, as shown in [1], having the ability to quickly mark and identify model parameter uncertainties facilitates error propagation, which in turn yields output uncertainties.
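
    A rough idea of how such a tolerance mini-language might be parsed is sketched below. The regular expression and the `Tolerance` record are illustrative assumptions, not UQ4SIM's actual grammar; they only cover the plain, percent, and tagged forms quoted in the abstract.

```python
import re
from dataclasses import dataclass
from typing import Optional

# Matches e.g. "1 +/- 0.5", "5 +/- 10%", "0.7 +/- 20% 'T_effective'"
_TOL = re.compile(
    r"(?P<nominal>[-+]?\d*\.?\d+)\s*\+/-\s*"
    r"(?P<tol>\d*\.?\d+)(?P<unit>%?)"
    r"(?:\s*'(?P<tag>[^']*)')?"
)

@dataclass
class Tolerance:
    nominal: float
    low: float
    high: float
    tag: Optional[str]

def parse_tolerances(text: str):
    """Extract tolerance annotations from an input-file string."""
    out = []
    for m in _TOL.finditer(text):
        nom = float(m.group("nominal"))
        tol = float(m.group("tol"))
        half = abs(nom) * tol / 100.0 if m.group("unit") == "%" else tol
        out.append(Tolerance(nom, nom - half, nom + half, m.group("tag")))
    return out

print(parse_tolerances("rate = 0.7 +/- 20% 'T_effective'"))
# [Tolerance(nominal=0.7, low=0.56, high=0.84, tag='T_effective')]
```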

  1. Microbial Communities in Subpermafrost Saline Fracture Water at the Lupin Au Mine, Nunavut, Canada

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Onstott, Tullis; McGown, Daniel; Bakermans, Corien

    2009-01-01

    We report the first investigation of a deep subpermafrost microbial ecosystem, a terrestrial analog for the Martian subsurface. Our multidisciplinary team analyzed fracture water collected at 890 and 1,130 m depths beneath a 540-m-thick permafrost layer at the Lupin Au mine (Nunavut, Canada). ^14C, ^3H, and noble gas isotope analyses suggest that the Na-Ca-Cl, suboxic fracture water represents a mixture of geologically ancient brine, ~25-kyr-old meteoric water, and a minor modern talik-water component. Microbial planktonic concentrations were ~10^3 cells mL^-1. Analysis of the 16S rRNA gene from extracted DNA and enrichment cultures revealed 42 unique operational taxonomic units in 11 genera, with Desulfosporosinus, Halothiobacillus, and Pseudomonas representing the most prominent phylotypes, and failed to detect Archaea. The abundance of terminally branched and midchain-branched saturated fatty acids (5 to 15 mol%) was consistent with the abundance of Gram-positive bacteria in the clone libraries. Geochemical data, the ubiquinone (UQ) abundance (3 to 11 mol%), and the presence of both aerobic and anaerobic bacteria indicated that the environment was suboxic, not anoxic. Stable sulfur isotope analyses of the fracture water detected the presence of microbial sulfate reduction, and analyses of the vein-filling pyrite indicated that it was in isotopic equilibrium with the dissolved sulfide. Free energy calculations revealed that sulfate reduction and sulfide oxidation via denitrification, and not methanogenesis, were the most thermodynamically viable, consistent with the principal metabolisms inferred from the 16S rRNA community composition and with CH4 isotopic compositions. The sulfate-reducing bacteria most likely colonized the subsurface during the Pleistocene or earlier, whereas aerobic bacteria may have entered the fracture water networks either during deglaciation prior to permafrost formation 9,000 years ago or from the nearby talik through the hydrologic gradient created during mine dewatering. Although the absence of methanogens from this subsurface ecosystem is somewhat surprising, it may be attributable to an energy bottleneck that restricts their migration from surface permafrost deposits where they are frequently reported. These results have implications for the biological origin of CH4 on Mars.

  2. Evaluation of cumulative dose for cone‐beam computed tomography (CBCT) scans within phantoms made from different compositions using Monte Carlo simulations

    PubMed Central

    Martin, Colin J.; Sankaralingam, Marimuthu; Oomen, Kurian; Gentle, David J.

    2015-01-01

    Measurement of cumulative dose f(0,150) with a small ionization chamber within standard polymethyl methacrylate (PMMA) CT head and body phantoms, 150 mm in length, is a possible practical method for cone-beam computed tomography (CBCT) dosimetry. This differs from evaluating cumulative dose under scatter equilibrium conditions within an infinitely long phantom f(0,∞), which is proposed by AAPM TG-111 for CBCT dosimetry. The aim of this study was to investigate the feasibility of using f(0,150) to estimate values for f(0,∞) in long head and body phantoms made of PMMA, polyethylene (PE), and water, using beam qualities for tube potentials of 80-140 kV. The study also investigated the possibility of using 150 mm PE phantoms for assessment of f(0,∞) within long PE phantoms such as the ICRU/AAPM phantom. The influence of scan parameters, composition, and length of the phantoms was investigated. The capability of f(0,150) to assess f(0,∞) has been defined as the efficiency and assessed in terms of the ratios ϵ(f(0,150)/f(0,∞)). The efficiencies were calculated using Monte Carlo simulations for an On-Board Imager (OBI) system mounted on a TrueBeam linear accelerator. Head and body scanning protocols with beams of width 40-500 mm were used. Efficiencies ϵ(PMMA/PMMA) and ϵ(PE/PE) as a function of beam width exhibited three separate regions. For beam widths <150 mm, ϵ(PMMA/PMMA) and ϵ(PE/PE) values were greater than 90% for the head and body phantoms. The efficiency values then fell rapidly with increasing beam width before levelling off at 74% for ϵ(PMMA/PMMA) and 69% for ϵ(PE/PE) for a 500 mm beam width. The quantities ϵ(PMMA/PE) and ϵ(PMMA/Water) varied with beam width in a different manner. Values at the centers of the phantoms for narrow beams were lower and increased to a steady state for ~100-150 mm wide beams, before declining with increasing beam width, whereas values at the peripheries decreased steadily with beam width. Results for ϵ(PMMA/PMMA) were virtually independent of tube potential, but there was more variation for ϵ(PMMA/PE) and ϵ(PMMA/Water). f(0,150) underestimated f(0,∞) for beam widths used for CBCT scans, thus it is necessary to use long phantoms, or to apply conversion factors (Cfs) to measurements with standard PMMA CT phantoms. The efficiency values have been used to derive Cfs that allow evaluation of f(0,∞) from measurements of f(0,150). The Cfs showed only a weak dependence on scan parameters and scanner type, and so may be suitable for general application. PACS number: 87.55.K-, 87.57.Q-, 87.57.uq. PMID:26699590
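
    The practical relationship the study exploits is that the infinite-phantom cumulative dose can be estimated from a 150 mm measurement by dividing by the efficiency, i.e. f(0,∞) ≈ f(0,150)/ε. The sketch below applies that correction; the efficiency value used is a hypothetical placeholder, not one of the study's tabulated results.

```python
def infinite_phantom_dose(f_150_mGy: float, efficiency: float) -> float:
    """Estimate f(0, inf) from a 150 mm phantom measurement.

    efficiency -- ratio f(0,150)/f(0,inf) for the matching phantom,
                  beam width, and tube potential (equivalently, 1/Cf).
    """
    return f_150_mGy / efficiency

# Hypothetical example: a 15.0 mGy reading with an efficiency of 0.74
# (i.e., a conversion factor Cf of about 1.35) gives ~20.3 mGy.
print(round(infinite_phantom_dose(15.0, 0.74), 1))
```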

  3. Human Research Program Integrated Research Plan: December 20, 2007, Interim Baseline

    NASA Technical Reports Server (NTRS)

    2008-01-01

    The Human Research Program (HRP) delivers human health and performance countermeasures, knowledge, technologies, and tools to enable safe, reliable, and productive human space exploration. This Integrated Research Plan (IRP) describes the program's research activities that are intended to address the needs of human space exploration and serve HRP customers. The timescale of human space exploration is envisioned to take many decades. The IRP illustrates the program's research plan through the timescale of early lunar missions of extended duration. The document serves several purposes for the Human Research Program: the IRP provides a means to assure that the most significant risks to human space explorers are being adequately mitigated and/or addressed; it shows the relationship of research activities to expected outcomes and need dates; it shows the interrelationships among research activities that may interact to produce products that are integrative or that cross defined research disciplines; it illustrates the non-deterministic nature of research and technology activities by showing expected decision points and potential follow-on activities; it shows the assignments of responsibility within the program organization and, as practical, the intended solicitation approach; and it shows the intended use of research platforms such as the International Space Station, NASA Space Radiation Laboratory, and various space flight analogs. The IRP does not show all budgeted activities of the Human Research Program, as some of these are enabling functions, such as management, facilities, and infrastructure.

  4. UQ for Decision Making: How (at least five) Kinds of Probability Might Come Into Play

    NASA Astrophysics Data System (ADS)

    Smith, L. A.

    2013-12-01

    In 1959 IJ Good published the discussion "Kinds of Probability" in Science. Good identified (at least) five kinds. The need for (at least) a sixth kind of probability when quantifying uncertainty in the context of climate science is discussed. This discussion brings out the differences between weather-like forecasting tasks and climate-like tasks, with a focus on the effective use both of science and of modelling in support of decision making. Good also introduced the idea of a "Dynamic probability", a probability one expects to change without any additional empirical evidence; the probabilities assigned by a chess-playing program when it is only halfway through its analysis being an example. This case is contrasted with the case of "Mature probabilities", where a forecast algorithm (or model) has converged on its asymptotic probabilities and the question hinges on whether or not those probabilities are expected to change significantly before the event in question occurs, even in the absence of new empirical evidence. If so, then how might one report and deploy such immature probabilities rationally in scientific support of decision making? Mature probability is suggested as a useful sixth kind; although Good would doubtless argue that we can get by with just one, effective communication with decision makers may be enhanced by speaking as if the others existed. This again highlights the distinction between weather-like contexts and climate-like contexts. In the former context one has access to a relevant climatology (a relevant, arguably informative distribution prior to any model simulations); in the latter context that information is not available, although one can fall back on the scientific basis upon which the model itself rests and estimate the probability that the model output is in fact misinformative. This subjective "probability of a big surprise" is one way to communicate the probability of model-based information holding in practice, that is, the probability that the information the model-based probability is conditioned on holds. It is argued that no model-based climate-like probability forecast is complete without a quantitative estimate of its own irrelevance, and that the clear identification of model-based probability forecasts as mature or immature is a critical element for maintaining the credibility of science-based decision support, and can shape uncertainty quantification more widely.

  5. How Prepared Are MSW Graduates for Doctoral Research? Views of PhD Research Faculty

    ERIC Educational Resources Information Center

    Drisko, James W.; Evans, Kristin

    2018-01-01

    This national survey of PhD faculty assessed the research preparation of entering doctoral social work students on a wide range of research knowledge and related skills. The prior literature shows that PhD programs repeat much BSW and MSW research course content. This study shows that the trend continues and has perhaps widened. PhD research…

  6. Review of Research Shows, Overall, Acupuncture Did Not Increase Pregnancy Rates with IVF

    MedlinePlus

    Review of Research Shows, Overall, Acupuncture Did Not Increase Pregnancy Rates With IVF. An analysis of research conducted on acupuncture as an adjuvant (booster) treatment to in vitro ...

  7. Researches in agri-food supply chain: A bibliometric study

    NASA Astrophysics Data System (ADS)

    Hisjam, Muhammad; Sutopo, Wahyudi

    2017-11-01

    Agri-food is very important for human beings, and the problems of managing it are complicated. Many entities are involved in managing agri-food, and conflicts of interest among them make the problems even more complicated. Using supply chain approaches in agri-food will help solve these problems. The purpose of this paper is to show that publication in the agri-food supply chain research area is still promising and to show the research trend in agri-food supply chain. The study was a bibliometric study using queries on the website with the largest database of peer-reviewed literature. The queries used various categories and refinements. First, the study explored all publications in this research area in several categories and then divided the time span into two intervals. The last query determined how many publications are review-type publications. The results show that the number of publications on agri-food supply chain topics is still limited and tends to increase, which means that research in this area is still promising. The results also show which source titles, countries, and affiliations produce the most publications, as well as the research trend in this area. The number of review-type publications in agri-food supply chain is still small, which shows the need for more review-type publications in this area.

  8. INTEGRIN-MEDIATED CELL ATTACHMENT SHOWS TIME-DEPENDENT UPREGULATION OF GAP JUNCTION COMMUNICATION.

    EPA Science Inventory


    Integrin-mediated Cell Attachment Shows Time-Dependent Upregulation of Gap Junction Communication

    Rachel Grindstaff and Carl Blackman, National Health & Environmental Effects Research Laboratory, Office of Research and Development, US EPA, Research Triang...

  9. A plea for pragmatism in clinical research ethics.

    PubMed

    Brendel, David H; Miller, Franklin G

    2008-04-01

    Pragmatism is a distinctive approach to clinical research ethics that can guide bioethicists and members of institutional review boards (IRBs) as they struggle to balance the competing values of promoting medical research and protecting human subjects participating in it. After defining our understanding of pragmatism in the setting of clinical research ethics, we show how a pragmatic approach can provide guidance not only for the day-to-day functioning of the IRB, but also for evaluation of policy standards, such as the one that addresses acceptable risks for healthy children in clinical research trials. We also show how pragmatic considerations might influence the debate about the use of deception in clinical research. Finally, we show how a pragmatic approach, by regarding the promotion of human research and the protection of human subjects as equally important values, helps to break down the false dichotomy between science and ethics in clinical research.

  10. The Application of a Multiphase Triangulation Approach to Mixed Methods: The Research of an Aspiring School Principal Development Program

    ERIC Educational Resources Information Center

    Youngs, Howard; Piggot-Irvine, Eileen

    2012-01-01

    Mixed methods research has emerged as a credible alternative to unitary research approaches. The authors show how a combination of a triangulation convergence model with a triangulation multilevel model was used to research an aspiring school principal development pilot program. The multilevel model is used to show the national and regional levels…

  11. Partnerships and Pricing Services. Research Notes.

    ERIC Educational Resources Information Center

    Jordan, Debra J.

    1998-01-01

    Research shows that partnerships have become crucial to long-term organizational success. Benefits and constraints of partnerships are outlined. A second research article on pricing shows that establishing and advertising an anchor price helps consumers understand increases and discounts. Implications for camp management are discussed. (SAS)

  12. 47 CFR 5.63 - Supplementary statements required.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... authorization in the Experimental Radio Service to be used for communications essential to a research project in... the research project being conducted. (2) A showing that communications facilities are necessary for the research project involved. (3) A showing that existing communications facilities are inadequate or...

  13. The effect of different volumes of high-intensity interval training on proinsulin in participants with the metabolic syndrome: a randomised trial.

    PubMed

    Ramos, Joyce S; Dalleck, Lance C; Borrani, Fabio; Mallard, Alistair R; Clark, Bronwyn; Keating, Shelley E; Fassett, Robert G; Coombes, Jeff S

    2016-11-01

    The continuous demand for insulin in the face of insulin resistance, coupled with the glucolipotoxic environment associated with the metabolic syndrome (MetS), adversely affects the quality of insulin produced and secreted by the pancreatic beta cells. This is depicted by increased circulating intact proinsulin concentration, which is associated with increased MetS severity and risk of cardiovascular (CV) mortality. High-intensity interval training (HIIT) has been shown to reduce insulin resistance and other CV disease risk factors to a greater degree than moderate-intensity continuous training (MICT). We therefore aimed to investigate the impact of MICT and different volumes of HIIT on circulating intact proinsulin concentration. This was a substudy of the 'Exercise in prevention of Metabolic Syndrome' (EX-MET) multicentre trial. Sixty-six individuals with MetS were randomised to 16 weeks of: (1) MICT (n = 21, 30 min at 60-70% peak heart rate [HRpeak], five times/week); (2) 4HIIT (n = 22, 4 × 4 min bouts at 85-95% HRpeak, interspersed with 3 min of active recovery at 50-70% HRpeak, three times/week); or (3) 1HIIT (n = 23, 1 × 4 min bout at 85-95% HRpeak, three times/week). A subanalysis investigated the differential impact of these training programmes on intact proinsulin concentration in MetS individuals with type 2 diabetes (MICT, n = 6; 4HIIT, n = 9; 1HIIT, n = 12) and without type 2 diabetes (MICT, n = 15; 4HIIT, n = 13; 1HIIT, n = 11). Intact proinsulin, insulin and C-peptide concentrations were measured in duplicate via ELISA, following a 12 h fast, before and after the exercise programme. Fasting intact proinsulin concentration was also expressed relative to insulin and C-peptide concentrations. Following the exercise training, there were no significant (p > 0.05) changes in fasting intact proinsulin concentration indices in all participants (pre- vs post-programme proinsulin, proinsulin:insulin, proinsulin:C-peptide: MICT 19% decrease, 6% increase, 4% increase; 4HIIT 19% decrease, 8% decrease, 11% decrease; 1HIIT 34% increase, 49% increase, 36% increase). In participants who did not have type 2 diabetes, only 4HIIT significantly (p < 0.05) reduced fasting intact proinsulin concentration indices from pre to post intervention (pre- vs post-programme proinsulin, proinsulin:insulin, proinsulin:C-peptide: 4HIIT 32% decrease, 26% decrease, 32% decrease, p < 0.05; 1HIIT, 14% increase, 32% increase, 16% increase, p > 0.05; MICT 27% decrease, 17% decrease, 11% decrease), with a group × time interaction effect, indicating a greater reduction in intact proinsulin indices following 4HIIT compared with MICT and 1HIIT. There were no significant (p > 0.05) changes in intact proinsulin concentration indices in participants with type 2 diabetes. Higher-volume HIIT (4HIIT) improved insulin quality in MetS participants without type 2 diabetes. ClinicalTrials.gov NCT01676870 FUNDING: The study was funded by the Norwegian University of Science and Technology and from an unrestricted research grant from the Coca-Cola company. Funding for the collection of physical activity data was derived from a 'UQ New Staff Start Up' grant awarded to B. Clark.

  14. NASA Glenn's Contributions to Aircraft Engine Noise Research

    NASA Technical Reports Server (NTRS)

    Huff, Dennis L.

    2014-01-01

    This presentation reviews engine noise research conducted at the NASA Glenn Research Center over the past 70 years. This report includes a historical perspective of the Center and the facilities used to conduct the research. Major noise research programs are highlighted to show their impact on industry and on the development of aircraft noise reduction technology. Noise reduction trends are discussed, and future aircraft concepts are presented. Since the 1960s, research results show that the average perceived noise level has been reduced by about 20 decibels (dB). Studies also show that, depending on the size of the airport, the aircraft fleet mix, and the actual growth in air travel, another 15 to 17 dB reduction will be required to achieve NASA's long-term goal of providing technologies to limit objectionable noise to the boundaries of an average airport.

  15. NASA Glenn's Contributions to Aircraft Engine Noise Research

    NASA Technical Reports Server (NTRS)

    Huff, Dennis L.

    2013-01-01

    This report reviews all engine noise research conducted at the NASA Glenn Research Center over the past 70 years. This report includes a historical perspective of the Center and the facilities used to conduct the research. Major noise research programs are highlighted to show their impact on industry and on the development of aircraft noise reduction technology. Noise reduction trends are discussed, and future aircraft concepts are presented. Since the 1960s, research results show that the average perceived noise level has been reduced by about 20 decibels (dB). Studies also show that, depending on the size of the airport, the aircraft fleet mix, and the actual growth in air travel, another 15 to 17 dB reduction will be required to achieve NASA's long-term goal of providing technologies to limit objectionable noise to the boundaries of an average airport.

  16. Hibiscus

    MedlinePlus

    ... pressure. Most early research shows that drinking hibiscus tea for 2-6 weeks decreases blood pressure by ... pressure. Some early research shows that drinking hibiscus tea might be as effective as the prescription drugs ...

  17. Developing a Teachers' Gender Stereotype Scale toward Mathematics

    ERIC Educational Resources Information Center

    Nurlu, Özge

    2017-01-01

    Gender has become a focus of mathematics education research. While some research shows that there are no differences between boys and girls, numerous research studies have indicated that boys have outperformed girls. It is suggested that gender stereotypes, such as expecting girls to show less achievement in mathematics compared to boys, have an…

  18. Undergraduate Students' Resistance to Study Skills Course

    ERIC Educational Resources Information Center

    Yuksel, Sedat

    2006-01-01

    Research indicates that students generally fail to benefit from study skills courses and show resistance to such courses at the higher education level. The purpose of this research is to investigate the reasons why students show resistance to the course on study skills and habits. In this research, a qualitative design utilizing retrospective interviews was…

  19. 2016 Research Final Presentation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Koskelo, EliseAnne Corinne

    These slides show an example of research done at Los Alamos National Laboratory by E. C. Koskelo, prepared to show college professors in the hope of earning a research or fellowship position. In summary, this researcher developed a new in-situ technique for the inspection of additively manufactured parts, created an algorithm which can correct "skewed" scans of angular parts taken at oblique angles, and used AWS to detect hidden defects and thickness changes in aerospace composites.

  20. The Role of Attendance in Lecture Classes: You Can Lead a Horse to Water...

    ERIC Educational Resources Information Center

    Golding, Jonathan M.

    2011-01-01

    A review of prior research on the role of attendance policies in large lecture classes (including psychology) is presented. This research showed that although students often did not attend class, various policies were effective in getting students to the classroom. Moreover, some research showed that an attendance policy did not lower instructor…

  1. Research and ECEC for Children under Three in France: A Brief Review

    ERIC Educational Resources Information Center

    Rayna, Sylvie

    2010-01-01

    This article provides an overview of ECEC services in France for children under three. It reviews research carried out in this area and shows its genesis, main issues, cultural context, and modalities of its production. The French provision for the under-threes is characterized by its increasing diversification. French research shows a diversity…

  2. Theory and Practice in the Teaching of Composition: Processing, Distancing, and Modeling.

    ERIC Educational Resources Information Center

    Myers, Miles, Ed.; Gray, James, Ed.

    Intended to show teachers how their approaches to the teaching of writing reflect a particular area of research and to show researchers how the intuitions of teachers reflect research findings, the articles in this book are classified according to three approaches to writing: processing, distancing, and modeling. After an introductory essay that…

  3. Insights into teaching quantum mechanics in secondary and lower undergraduate education

    NASA Astrophysics Data System (ADS)

    Krijtenburg-Lewerissa, K.; Pol, H. J.; Brinkman, A.; van Joolingen, W. R.

    2017-06-01

    This study presents a review of the current state of research on teaching quantum mechanics in secondary and lower undergraduate education. A conceptual approach to quantum mechanics is being implemented in more and more introductory physics courses around the world. Because of the differences between the conceptual nature of quantum mechanics and classical physics, research on misconceptions, testing, and teaching strategies for introductory quantum mechanics is needed. For this review, 74 articles were selected and analyzed for the misconceptions, research tools, teaching strategies, and multimedia applications investigated. Outcomes were categorized according to their contribution to the various subtopics of quantum mechanics. Analysis shows that students have difficulty relating quantum physics to physical reality. It also shows that the teaching of complex quantum behavior, such as time dependence, superposition, and the measurement problem, has barely been investigated for the secondary and lower undergraduate level. At the secondary school level, this article shows a need to investigate student difficulties concerning wave functions and potential wells. Investigation of research tools shows the necessity for the development of assessment tools for secondary and lower undergraduate education, which cover all major topics and are suitable for statistical analysis. Furthermore, this article shows the existence of very diverse ideas concerning teaching strategies for quantum mechanics and a lack of research into which strategies promote understanding. This article underlines the need for more empirical research into student difficulties, teaching strategies, activities, and research tools intended for a conceptual approach for quantum mechanics.

  4. A Conceptual Model for Urgent Acquisition Programs

    DTIC Science & Technology

    2017-04-06

    The last 60 years of history show three broad examples of priority reset. Welby (2016), Assistant Secretary of Defense for Research and...

  5. Contributions of South American research centers to Carbohydrate Research.

    PubMed

    Stortz, Carlos A

    2014-03-24

    The present article shows the objective figures of the contributions of South American research centers to Carbohydrate Research during its 50 years of history, measured in terms of members of the Editorial Board, number of articles and citations to them, together with a country-based comparison, and the progression of these contributions with time. In addition, it also shows the subjective feelings of the author toward the same journal. Copyright © 2014 Elsevier Ltd. All rights reserved.

  6. Contributions of South American research centers to Carbohydrate Research.

    PubMed

    Stortz, Carlos A

    2015-02-11

    The present article shows the objective figures of the contributions of South American research centers to Carbohydrate Research during its 50 years of history, measured in terms of members of the Editorial Board, number of articles and citations to them, together with a country-based comparison, and the progression of these contributions with time. In addition, it also shows the subjective feelings of the author toward the same journal. Copyright © 2015. Published by Elsevier Ltd.

  7. Researchers Find Essential Brain Circuit in Visual Development

    MedlinePlus

    ... Release Monday, August 26, 2013 Researchers find essential brain circuit in visual development NIH-funded study could ... shows the connections from the eyes to the brain in a mouse. The right image shows the ...

  8. 8. INTERIOR VIEW, SHOWING CONVERTED STAND FOR COMPRESSOR RESEARCH. ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    8. INTERIOR VIEW, SHOWING CONVERTED STAND FOR COMPRESSOR RESEARCH. - Wright-Patterson Air Force Base, Area B, Building No. 20A, Propeller Test Complex, Seventh Street, from E to G Streets, Dayton, Montgomery County, OH

  9. The Road to Reason.

    ERIC Educational Resources Information Center

    Wright, Benjamin D.

    2000-01-01

    Summarizes the distinctions between qualitative and quantitative research and shows their complementary aspects. Shows there is no contradiction or conflict between the qualitative and the quantitative and discusses Rasch measurement as the construction tool of quantitative research. (SLD)

  10. Performance evaluation of iterative reconstruction algorithms for achieving CT radiation dose reduction — a phantom study

    PubMed Central

    Dodge, Cristina T.; Tamm, Eric P.; Cody, Dianna D.; Liu, Xinming; Jensen, Corey T.; Wei, Wei; Kundra, Vikas

    2016-01-01

    The purpose of this study was to characterize image quality and dose performance with GE CT iterative reconstruction techniques, adaptive statistical iterative reconstruction (ASiR), and model-based iterative reconstruction (MBIR), over a range of typical to low-dose intervals using the Catphan 600 and the anthropomorphic Kyoto Kagaku abdomen phantoms. The scope of the project was to quantitatively describe the advantages and limitations of these approaches. The Catphan 600 phantom, supplemented with a fat-equivalent oval ring, was scanned using a GE Discovery HD750 scanner at 120 kVp, 0.8 s rotation time, and pitch factors of 0.516, 0.984, and 1.375. The mA was selected for each pitch factor to achieve CTDIvol values of 24, 18, 12, 6, 3, 2, and 1 mGy. Images were reconstructed at 2.5 mm thickness with filtered back-projection (FBP); 20%, 40%, and 70% ASiR; and MBIR. The potential for dose reduction and low-contrast detectability were evaluated from noise and contrast-to-noise ratio (CNR) measurements in the CTP 404 module of the Catphan. Hounsfield units (HUs) of several materials were evaluated from the cylinder inserts in the CTP 404 module, and the modulation transfer function (MTF) was calculated from the air insert. The results were confirmed in the anthropomorphic Kyoto Kagaku abdomen phantom at 6, 3, 2, and 1 mGy. MBIR reduced noise levels five-fold and increased CNR by a factor of five compared to FBP below 6 mGy CTDIvol, resulting in a substantial improvement in image quality. Compared to ASiR and FBP, HU in images reconstructed with MBIR were consistently lower, and this discrepancy was reversed by higher pitch factors in some materials. MBIR improved the conspicuity of the high-contrast spatial resolution bar pattern, and MTF quantification confirmed the superior spatial resolution performance of MBIR versus FBP and ASiR at higher dose levels. While ASiR and FBP were relatively insensitive to changes in dose and pitch, the spatial resolution for MBIR improved with increasing dose and pitch. Unlike FBP, MBIR and ASiR may have the potential for patient imaging at around 1 mGy CTDIvol. The improved low-contrast detectability observed with MBIR, especially at low-dose levels, indicates the potential for considerable dose reduction. PACS number(s): 87.57.Q-, 87.57.nf, 87.57.C-, 87.57.cj, 87.57.cf, 87.57.cm, 87.57.uq. PMID:27074454
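
    For reference, the contrast-to-noise ratio used to judge low-contrast detectability in studies like this one is typically computed from two regions of interest, as in the sketch below; the array values are synthetic placeholders rather than measurements from the phantom scans.

```python
import numpy as np

def contrast_to_noise_ratio(roi_object: np.ndarray, roi_background: np.ndarray) -> float:
    """CNR = |mean(object ROI) - mean(background ROI)| / std(background ROI)."""
    return abs(roi_object.mean() - roi_background.mean()) / roi_background.std()

# Synthetic example: a low-contrast insert 10 HU above a noisy background.
rng = np.random.default_rng(0)
background = rng.normal(loc=0.0, scale=5.0, size=(50, 50))   # HU values
insert = rng.normal(loc=10.0, scale=5.0, size=(20, 20))
print(round(contrast_to_noise_ratio(insert, background), 2))
```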

  11. Total protein, albumin and low-molecular-weight protein excretion in HIV-positive patients.

    PubMed

    Campbell, Lucy J; Dew, Tracy; Salota, Rashim; Cheserem, Emily; Hamzah, Lisa; Ibrahim, Fowzia; Sarafidis, Pantelis A; Moniz, Caje F; Hendry, Bruce M; Poulton, Mary; Sherwood, Roy A; Post, Frank A

    2012-08-10

    Chronic kidney disease is common in HIV positive patients and renal tubular dysfunction has been reported in those receiving combination antiretroviral therapy (cART). Tenofovir (TFV) in particular has been linked to severe renal tubular disease as well as proximal tubular dysfunction. Markedly elevated urinary concentrations of retinol-binding protein (RBP) have been reported in patients with severe renal tubular disease, and low-molecular-weight proteins (LMWP) such as RBP may be useful in clinical practice to assess renal tubular function in patients receiving TFV. We analysed 3 LMWP as well as protein and albumin in the urine of a sample of HIV positive patients. In a cross-sectional fashion, total protein, albumin, RBP, cystatin C, and neutrophil gelatinase-associated lipocalin (NGAL) were quantified in random urine samples of 317 HIV positive outpatients and expressed as the ratio-to-creatinine (RBPCR, CCR and NGALCR). Exposure to cART was categorised as none, cART without TFV, and cART containing TFV and a non-nucleoside reverse-transcriptase-inhibitor (TFV/NNRTI) or TFV and a protease-inhibitor (TFV/PI). Proteinuria was present in 10.4 % and microalbuminuria in 16.7 % of patients. Albumin accounted for approximately 10 % of total urinary protein. RBPCR was within the reference range in 95 % of patients while NGALCR was elevated in 67 % of patients. No overall differences in urine protein, albumin, and LMWP levels were observed among patients stratified by cART exposure, although a greater proportion of patients exposed to TFV/PI had RBPCR >38.8 μg/mmol (343 μg/g) (p = 0.003). In multivariate analyses, black ethnicity (OR 0.43, 95 % CI 0.24, 0.77) and eGFR <75 mL/min/1.73 m2 (OR 3.54, 95 % CI 1.61, 7.80) were independently associated with upper quartile (UQ) RBPCR. RBPCR correlated well to CCR (r2 = 0.71), but not to NGALCR, PCR or ACR. In HIV positive patients, proteinuria was predominantly of tubular origin and microalbuminuria was common. RBPCR in patients without overt renal tubular disease was generally within the reference range, including those receiving TFV. RBP therefore appears to be a promising biomarker for monitoring renal tubular function in patients receiving TFV and for distinguishing patients with normal tubular function or mild tubular dysfunction from those with severe renal tubular disease or Fanconi syndrome.

  12. Tracking Middle Grades Climate Data to Inform School Change. REL West Research Digest

    ERIC Educational Resources Information Center

    Regional Educational Laboratory West, 2015

    2015-01-01

    A growing body of research shows that positive school climate is a key lever for students' academic and social development and success. This research digest shows how an alliance of California schools and districts, school climate experts, and state education agency personnel have teamed up to use school climate data to drive a continuous cycle of…

  13. Research notes : durability of composite repairs on bridges.

    DOT National Transportation Integrated Search

    2009-08-01

    The research showed that conditions that allow moisture to get under the carbon fiber reinforced polymer composites (CFRP) combined with freeze-thaw were detrimental to durability. In addition, the results showed that the American Concrete Institute ...

  14. Differential impact of science policy on subfields of human embryonic stem cell research.

    PubMed

    Moon, Seongwuk; Cho, Seong Beom

    2014-01-01

    In this research, we examine how restrictive policy influenced performance in human embryonic stem cell (hESC) research between 1998 and 2008. Previous researchers have debated whether restrictive policy decreased the performance of stem cell research in some nations, especially the US. Here, we hypothesize that this policy influenced specific subfields of hESC research. To investigate these selective policy effects, we categorize hESC research publications into three subfields: derivation, differentiation, and medical application research. Our analysis shows that restrictive policy had different effects on different subfields. In general, the US outperformed in overall hESC research throughout this period. In the derivation of hESC, however, the US almost lost its competence under restrictive policy. Interestingly, the US scientific community showed prominent resilience in hESC research through international collaboration. We conclude that the US resilience and performance stemmed from the wide breadth of US scientists' research portfolios across the hESC subfields, combined with their strategic efforts to collaborate internationally on derivation research.

  15. Association between Nurses' Education about Research and Their Research Use.

    ERIC Educational Resources Information Center

    McCleary, Lynn; Brown, G. Ted

    2003-01-01

    Responses from 178 of 528 pediatric nurses showed that higher education levels or courses in research design and use were associated with positive attitudes toward research. Higher education levels were associated with self-reported research use; completing research-related courses was not independently associated with higher research use.…

  16. Visualized analysis of developing trends and hot topics in natural disaster research.

    PubMed

    Shen, Shi; Cheng, Changxiu; Yang, Jing; Yang, Shanli

    2018-01-01

    This study visualized and analyzed the developing trends and hot topics in natural disaster research. 19694 natural disaster-related articles (January 1900 to June 2015) are indexed in the Web of Science database. The first step in this study is using complex networks to visualize and analyze these articles. CiteSpace and Gephi were employed to generate a countries collaboration network and a disciplines collaboration network, and then attached hot topics to countries and disciplines, respectively. The results show that USA, China, and Italy are the three major contributors to natural disaster research. "Prediction model", "social vulnerability", and "landslide inventory map" are three hot topics in recent years. They have attracted attention not only from large countries like China but also from small countries like Panama and Turkey. Comparing two hybrid networks provides details of natural disaster research. Scientists from USA and China use image data to research earthquakes. Indonesia and Germany collaboratively study tsunamis in the Indian Ocean. However, Indonesian studies focus on modeling and simulations, while German research focuses on early warning technology. This study also introduces an activity index (AI) and an attractive index (AAI) to generate time evolution trajectories of some major countries from 2000 to 2013 and evaluate their trends and performance. Four patterns of evolution are visible during this 14-year period. China and India show steadily rising contributions and impacts, USA and England show relatively decreasing research efforts and impacts, Japan and Australia show fluctuating activities and stable attraction, and Spain and Germany show fluctuating activities and increasing impacts.

  17. Visualized analysis of developing trends and hot topics in natural disaster research

    PubMed Central

    Shen, Shi; Cheng, Changxiu; Yang, Jing; Yang, Shanli

    2018-01-01

    This study visualized and analyzed the developing trends and hot topics in natural disaster research. 19694 natural disaster-related articles (January 1900 to June 2015) are indexed in the Web of Science database. The first step in this study is using complex networks to visualize and analyze these articles. CiteSpace and Gephi were employed to generate a countries collaboration network and a disciplines collaboration network, and then attached hot topics to countries and disciplines, respectively. The results show that USA, China, and Italy are the three major contributors to natural disaster research. “Prediction model”, “social vulnerability”, and “landslide inventory map” are three hot topics in recent years. They have attracted attention not only from large countries like China but also from small countries like Panama and Turkey. Comparing two hybrid networks provides details of natural disaster research. Scientists from USA and China use image data to research earthquakes. Indonesia and Germany collaboratively study tsunamis in the Indian Ocean. However, Indonesian studies focus on modeling and simulations, while German research focuses on early warning technology. This study also introduces an activity index (AI) and an attractive index (AAI) to generate time evolution trajectories of some major countries from 2000 to 2013 and evaluate their trends and performance. Four patterns of evolution are visible during this 14-year period. China and India show steadily rising contributions and impacts, USA and England show relatively decreasing research efforts and impacts, Japan and Australia show fluctuating activities and stable attraction, and Spain and Germany show fluctuating activities and increasing impacts. PMID:29351350
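
    As a rough illustration of how such bibliometric indices are commonly computed (assuming the standard Frame-style definitions rather than any formula given in the study), the sketch below derives an activity index from publication counts and an attractivity index from citation counts; all numbers are invented placeholders.

```python
def activity_index(country_field_pubs, country_total_pubs, world_field_pubs, world_total_pubs):
    """AI = (country's share of field publications) / (country's share of all publications).

    AI > 1 means the country publishes relatively more in this field than average.
    """
    return (country_field_pubs / world_field_pubs) / (country_total_pubs / world_total_pubs)

def attractivity_index(country_field_cites, country_total_cites, world_field_cites, world_total_cites):
    """Same ratio computed on citation counts instead of publication counts."""
    return (country_field_cites / world_field_cites) / (country_total_cites / world_total_cites)

# Invented example: a country with 5% of world publications but 8% of
# natural-disaster publications has AI = 1.6 in that field.
print(activity_index(800, 50_000, 10_000, 1_000_000))   # 1.6
```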

  18. Deforestation planning for cattle grazing in Amazon Basin using LANDSAT data

    NASA Technical Reports Server (NTRS)

    Dejesusparada, N. (Principal Investigator); Dossantos, A. P.; Demoraisnovo, E. M. L.

    1978-01-01

    The author has identified the following significant results. This research did not demonstrate the full potential of the LANDSAT system, but sought to open up new research avenues for the utilization of LANDSAT data in natural resource control. Results obtained through this research showed that LANDSAT data can be used to develop monitoring programs in the tropical forest areas of Brazil.

  19. Calibrating Bayesian Network Representations of Social-Behavioral Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Whitney, Paul D.; Walsh, Stephen J.

    2010-04-08

    While human behavior has long been studied, recent and ongoing advances in computational modeling present opportunities for recasting research outcomes in human behavior. In this paper we describe how Bayesian networks can represent outcomes of human behavior research. We demonstrate a Bayesian network that represents political radicalization research, and show a corresponding visual representation of aspects of this research outcome. Since Bayesian networks can be quantitatively compared with external observations, the representation can also be used for empirical assessments of the research which the network summarizes. For a political radicalization model based on published research, we show this empirical comparison with data taken from the Minorities at Risk Organizational Behaviors database.
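
    To give a flavor of the kind of representation described, the sketch below encodes a tiny two-node Bayesian network with hand-specified conditional probability tables and scores it against observed data via the log-likelihood. The variables, probabilities, and observations are entirely hypothetical and are not taken from the radicalization model or the Minorities at Risk database.

```python
import math

# P(grievance) and P(mobilization | grievance): illustrative CPTs only.
p_grievance = {True: 0.3, False: 0.7}
p_mobilization_given_g = {True: {True: 0.6, False: 0.4},
                          False: {True: 0.1, False: 0.9}}

def joint_probability(grievance: bool, mobilization: bool) -> float:
    """P(G, M) = P(G) * P(M | G) for the two-node network."""
    return p_grievance[grievance] * p_mobilization_given_g[grievance][mobilization]

def log_likelihood(observations):
    """Score observed (grievance, mobilization) pairs against the network."""
    return sum(math.log(joint_probability(g, m)) for g, m in observations)

# Hypothetical observations; a higher log-likelihood means the network
# is more consistent with the data than an alternative parameterization.
data = [(True, True), (False, False), (True, False), (False, False)]
print(round(log_likelihood(data), 3))
```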

  20. Community pharmacists' interest in and attitude to pharmacy practice research in Ethiopia: A cross-sectional study.

    PubMed

    Bhagavathula, Akshaya Srikanth; Gebreyohannes, Eyob Alemayehu; Gebresillassie, Begashaw Melaku; Erku, Daniel Asfaw; Negesse, Chernet Tafere; Belay, Yared Belete

    2017-01-01

    Pharmacy practice research has become an important component of pharmacy practice. However, few studies have been conducted in sub-Saharan Africa to understand pharmacists' interest in and attitude towards pharmacy practice research. We aimed to assess community pharmacists' interest in and attitude towards pharmacy practice research in Ethiopia. A cross-sectional survey was conducted among community pharmacists in eight major cities in Ethiopia. A validated 25-item self-administered questionnaire covering interest and attitude related to pharmacy practice research was distributed. Responses were analysed using descriptive and inferential statistics. A total of 389 community pharmacists responded to the survey (response rate 88.4%). Most community pharmacists showed a high level of interest in, and a positive attitude towards, being involved in all aspects of pharmacy practice research. The median summary scores for interest and attitude were 38 (IQR 20-40) (possible range 10-50) and 30 (IQR 18-39), respectively. Sixty-seven percent of the respondents had thought about being involved in research, felt research is important for their career (57.6%), felt confident to conduct research (56.2%), and agreed that research is a part of pharmacy practice (48.5%). However, only forty-six percent agreed that they had undergone research training. A multivariate analysis showed that females were more interested in pharmacy practice research than males [AOR: 1.50, 95% CI: 0.99-2.27; p<0.05]. Community pharmacists showed high interest towards several areas of research competency and demonstrated a positive attitude towards pharmacy practice research. Our findings suggest that providing research training to community pharmacists may help them undertake research activities and build research capacity in Ethiopia.

  1. Community pharmacists’ interest in and attitude to pharmacy practice research in Ethiopia: A cross-sectional study

    PubMed Central

    Negesse, Chernet Tafere; Belay, Yared Belete

    2017-01-01

    Pharmacy practice research has become an important component of pharmacy practice. However, few studies have been conducted in sub-Saharan Africa to understand pharmacists' interest in and attitude towards pharmacy practice research. We aimed to assess community pharmacists' interest in and attitude towards pharmacy practice research in Ethiopia. A cross-sectional survey was conducted among community pharmacists in eight major cities in Ethiopia. A validated 25-item self-administered questionnaire covering interest and attitude related to pharmacy practice research was distributed. Responses were analysed using descriptive and inferential statistics. A total of 389 community pharmacists responded to the survey (response rate 88.4%). Most community pharmacists showed a high level of interest in, and a positive attitude towards, being involved in all aspects of pharmacy practice research. The median summary scores for interest and attitude were 38 (IQR 20–40) (possible range 10–50) and 30 (IQR 18–39), respectively. Sixty-seven percent of the respondents had thought about being involved in research, felt research is important for their career (57.6%), felt confident to conduct research (56.2%), and agreed that research is a part of pharmacy practice (48.5%). However, only forty-six percent agreed that they had undergone research training. A multivariate analysis showed that females were more interested in pharmacy practice research than males [AOR: 1.50, 95% CI: 0.99–2.27; p<0.05]. Community pharmacists showed high interest towards several areas of research competency and demonstrated a positive attitude towards pharmacy practice research. Our findings suggest that providing research training to community pharmacists may help them undertake research activities and build research capacity in Ethiopia. PMID:28617834

  2. HELIOSPHERIC STRUCTURE: THE BOW WAVE AND THE HYDROGEN WALL

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zank, G. P.; Heerikhuisen, J.; Pogorelov, N. V.

    2013-01-20

    Recent IBEX observations indicate that the local interstellar medium (LISM) flow speed is less than previously thought (23.2 km s^-1 rather than 26 km s^-1). Reasonable LISM plasma parameters indicate that the LISM flow may be either marginally super-fast magnetosonic or sub-fast magnetosonic. This raises two challenging questions: (1) Can a LISM model that is barely super-fast or sub-fast magnetosonic account for Lyα observations that rely critically on the additional absorption provided by the hydrogen wall (H-wall)? and (2) If the LISM flow is weakly super-fast magnetosonic, does the transition assume the form of a traditional shock or does neutral hydrogen (H) mediate shock dissipation and hence structure through charge exchange? Both questions are addressed using three three-dimensional self-consistently coupled magnetohydrodynamic plasma-kinetic H models with different LISM magnetic field strengths (2, 3, and 4 μG) as well as plasma and neutral H number densities. The 2 and 3 μG models are fast magnetosonic far upwind of the heliopause whereas the 4 μG model is fully subsonic. The 2 μG model admits a broad (~50-75 AU) bow-shock-like structure. The 3 μG model has a smooth super-fast-sub-fast magnetosonic transition that resembles a very broad, ~200 AU thick, bow wave. A theoretical analysis shows that the transition from a super-fast to a sub-fast magnetosonic downstream state is due to the charge exchange of fast neutral H and hot neutral H created in the supersonic solar wind and hot inner heliosheath, respectively. For both the 2 μG and the 3 μG models, the super-fast magnetosonic LISM flow passes through a critical point located where the fast magnetosonic Mach number M = 1 and Q_e = γ/(γ − 1) U Q_m, where Q_e and Q_m are the plasma energy and momentum source terms due to charge exchange, U is the LISM flow speed, and γ is the plasma adiabatic index. Because the Mach number is only barely super-fast magnetosonic in the 3 μG case, the hot and fast neutral H can completely mediate the transition and impose a charge exchange length scale on the structure, making the solar-wind-LISM interaction effectively bow-shock-free. The charge exchange of fast and hot heliospheric neutral H therefore provides a primary dissipation mechanism at the weak heliospheric bow shock, in some cases effectively creating a one-shock heliosphere (i.e., a heliospheric termination shock only). Both super-fast magnetosonic models produce a sizeable H-wall. We find that (1) a sub-fast magnetosonic LISM flow cannot model the observed Lyα absorption profiles along the four sightlines considered (α Cen, 36 Oph, DK UMa, and χ^1 Ori: upwind, sidewind, and downwind, respectively); (2) both super-fast magnetosonic models can account for the Lyα observations, with possibly the bow-shock-free 3 μG model being slightly favored. Subject to further modeling and comparison against further lines of sight, we conclude with the tantalizing possibility that IBEX may have discovered a class of interstellar shocks mediated by neutral H.
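
    For orientation, the sketch below estimates the fast magnetosonic Mach number of an interstellar-like flow from illustrative plasma parameters (flow speed, field strength, density, temperature); the numbers are placeholders chosen only to show the calculation, not the LISM values adopted in the models, and the sound speed here uses a simple proton-only estimate.

```python
import math

MU0 = 4e-7 * math.pi       # vacuum permeability [H/m]
K_B = 1.380649e-23         # Boltzmann constant [J/K]
M_P = 1.67262192e-27       # proton mass [kg]

def fast_magnetosonic_mach(u_km_s, b_microgauss, n_cm3, temperature_k, gamma=5.0 / 3.0):
    """Mach number U / sqrt(c_s^2 + v_A^2) (fast speed perpendicular to B)."""
    rho = n_cm3 * 1e6 * M_P                                  # mass density [kg/m^3]
    v_a = (b_microgauss * 1e-10) / math.sqrt(MU0 * rho)      # Alfven speed [m/s]
    c_s = math.sqrt(gamma * K_B * temperature_k / M_P)       # proton sound speed [m/s]
    return (u_km_s * 1e3) / math.sqrt(v_a ** 2 + c_s ** 2)

# Placeholder parameters: 23.2 km/s flow, 3 microgauss field, 0.07 cm^-3 protons, 7000 K.
print(round(fast_magnetosonic_mach(23.2, 3.0, 0.07, 7000.0), 2))
```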

  3. Design optimization and uncertainty quantification for aeromechanics forced response of a turbomachinery blade

    NASA Astrophysics Data System (ADS)

    Modgil, Girish A.

    Gas turbine engines for aerospace applications have evolved dramatically over the last 50 years through the constant pursuit of better specific fuel consumption, higher thrust-to-weight ratio, and lower noise and emissions, all while maintaining reliability and affordability. An important step in enabling these improvements is a forced response aeromechanics analysis involving structural dynamics and aerodynamics of the turbine. It is well documented that forced response vibration is a very critical problem in aircraft engine design, causing High Cycle Fatigue (HCF). Pushing the envelope on engine design has led to increased forced response problems and subsequently an increased risk of HCF failure. Forced response analysis is used to assess design feasibility of turbine blades for HCF using a material limit boundary set by the Goodman Diagram envelope that combines the effects of steady and vibratory stresses. Forced response analysis is computationally expensive, time consuming, and requires multi-domain experts to finalize a result. As a consequence, high-fidelity aeromechanics analysis is performed deterministically and is usually done at the end of the blade design process, when it is very costly to make significant changes to geometry or aerodynamic design. To address uncertainties in the system (engine operating point, temperature distribution, mistuning, etc.) and variability in material properties, designers apply conservative safety factors in the traditional deterministic approach, which leads to bulky designs. Moreover, using a deterministic approach does not provide a calculated risk of HCF failure. This thesis describes a process that begins with the optimal aerodynamic design of a turbomachinery blade developed using surrogate models of high-fidelity analyses. The resulting optimal blade undergoes probabilistic evaluation to generate aeromechanics results that provide a calculated likelihood of failure from HCF. An existing Rolls-Royce High Work Single Stage (HWSS) turbine blisk provides a baseline to demonstrate the process. The generalized polynomial chaos (gPC) toolbox that was developed includes sampling methods and constructs polynomial approximations. The toolbox provides not only the means for uncertainty quantification of the final blade design, but also facilitates construction of the surrogate models used for the blade optimization. This thesis shows that gPC, with a small number of samples, achieves very fast rates of convergence and high accuracy in describing probability distributions without loss of detail in the tails. First, an optimization problem maximizes stage efficiency using turbine aerodynamic design rules as constraints; the function evaluations for this optimization are surrogate models from detailed 3D steady Computational Fluid Dynamics (CFD) analyses. The resulting optimal shape provides a starting point for the 3D high-fidelity aeromechanics (unsteady CFD and 3D Finite Element Analysis (FEA)) UQ study assuming three uncertain input parameters. This investigation seeks to find the steady and vibratory stresses associated with the first torsion mode for the HWSS turbine blisk near maximum operating speed of the engine. Using gPC to provide uncertainty estimates of the steady and vibratory stresses enables the creation of a Probabilistic Goodman Diagram, which, to the authors' best knowledge, is the first of its kind using high fidelity aeromechanics for turbomachinery blades.
The Probabilistic Goodman Diagram enables turbine blade designers to make more informed design decisions and it allows the aeromechanics expert to assess quantitatively the risk associated with HCF for any mode crossing based on high fidelity simulations.
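
    As a minimal illustration of the non-intrusive gPC idea (not the toolbox developed in the thesis), the sketch below builds a 1-D Hermite polynomial chaos expansion of a model output with a single standard-normal input, using Gauss-Hermite quadrature to project onto the basis and then reading the output mean and variance off the coefficients. The response function is an arbitrary placeholder for an expensive aeromechanics solve.

```python
import math
import numpy as np
from numpy.polynomial import hermite_e as He

def response(xi):
    """Placeholder for an expensive solver output, e.g. vibratory stress vs. a normal input."""
    return np.exp(0.3 * xi) + 0.1 * xi ** 2

def pce_coefficients(f, order, n_quad=20):
    """Project f onto probabilists' Hermite polynomials He_k for xi ~ N(0, 1)."""
    x, w = He.hermegauss(n_quad)              # nodes/weights for weight exp(-x^2/2)
    w = w / math.sqrt(2.0 * math.pi)          # normalize so weights integrate the Gaussian pdf
    fx = f(x)
    coeffs = []
    for k in range(order + 1):
        basis = He.hermeval(x, [0] * k + [1])                        # He_k at the nodes
        coeffs.append(np.sum(w * fx * basis) / math.factorial(k))    # <He_k, He_k> = k!
    return np.array(coeffs)

c = pce_coefficients(response, order=6)
mean = c[0]                                                           # output mean
variance = sum(math.factorial(k) * c[k] ** 2 for k in range(1, len(c)))
print(round(mean, 4), round(variance, 4))
```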

  4. Effects of a Multilingual Information Website Intervention on the Levels of Depression Literacy and Depression-Related Stigma in Greek-Born and Italian-Born Immigrants Living in Australia: A Randomized Controlled Trial

    PubMed Central

    Griffiths, Kathleen M; Blashki, Grant

    2011-01-01

    Background Little is known about the efficacy of Internet-based information interventions in increasing depression literacy or reducing depression stigma and depressive symptoms in people from non–English-speaking backgrounds. Objective Our objective was to investigate the effects of Multicultural Information on Depression Online (MIDonline), an Internet-based multilingual depression-specific information resource, on depression literacy, depression stigma, and depressive symptoms in Greek-born and Italian-born immigrants to Australia. Method In all, 202 Greek- and Italian-born immigrants aged 48 to 88 years were randomly allocated to an online depression information intervention (n = 110) or a depression interview control group (n = 92). Participants allocated to the information intervention only had access to the website during the 1- to 1.5-hour intervention session. The primary outcome measures were depression literacy (depression knowledge), personal stigma (personal stigma toward people with a mental illness), perceived stigma (participants’ views about the probable attitude of the general community toward people with mental illness), and depressive symptoms. Depression literacy, personal and perceived stigma, and depressive symptoms were assessed at preassessment, postassessment, and at a 1-week follow-up assessment. The trial was undertaken at Monash University, Melbourne, Australia. Randomization and allocation to trial group were carried out using a computer-generated table. Results For depression literacy, there was a significant difference between the MIDonline and the control group with those in the MIDonline intervention displaying higher depression literacy scores postassessment (F1,178 = 144.99, P < .001) and at the follow-up assessment (F1,178 = 129.13, P < .001) than those in the control group. In addition, those in the MIDonline intervention showed a significantly greater decrease in mean personal stigma scores postassessment (F1,178 = 38.75, P < .001) and at the follow-up assessment (F1,176 = 11.08, P = .001) than those in the control group. For perceived stigma, there was no significant difference between the MIDonline intervention and the control group at postassessment (F1,178 = 0.60, P = .44) and at the follow-up assessment (F1,176 = 1.06, P = .30). For level of depression, there was no significant difference between the MIDonline intervention and the control group at preassessment (F1,201 = 0.56, P = .45), postassessment (F1,178 = 0.03, P = .86), or at the follow-up assessment (F1,175 = 1.71, P = .19). Within group effect sizes for depression literacy were −1.78 (MIDonline) and −0.07 (control); for personal stigma, they were 0.83 (MIDonline) and 0.06 (control); for perceived stigma, they were 0.14 (MIDonline) and 0.16 (control); and for depressive symptoms, they were 0.10 (MIDonline) and 0.10 (control). Conclusions Current results suggested that the Internet may be a feasible and effective means for increasing depression knowledge and decreasing personal stigma in non–English-speaking immigrant populations residing in English-speaking countries. The lack of change in perceived stigma in this trial is consistent with results in other trials examining online depression stigma interventions in English-speaking groups. Trial Registration ISRCTN76460837; http://www.controlled-trials.com/ISRCTN76460837 (Archived by WebCite at http://www.webcitation.org/5xjxva4Uq) PMID:21504872
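
    The F statistics and within-group effect sizes reported above follow standard formulas; the sketch below shows one common way to reproduce such quantities in Python. The score arrays are invented placeholders (not the MIDonline trial data), and the pre-to-post standardized-difference convention is only one of several possible effect-size definitions.

    ```python
    import numpy as np
    from scipy import stats

    # Hypothetical post-assessment depression-literacy scores for the two arms
    # (placeholder values with the trial's group sizes, not the real data).
    rng = np.random.default_rng(1)
    intervention_post = rng.normal(16.0, 3.0, size=110)
    control_post = rng.normal(11.0, 3.0, size=92)

    # Between-group comparison at post-assessment via one-way ANOVA (an F test).
    f_stat, p_value = stats.f_oneway(intervention_post, control_post)
    print(f"F = {f_stat:.2f}, p = {p_value:.3g}")

    # Within-group effect size as a standardized pre-to-post change (one common
    # convention; the abstract does not state the exact formula the authors used).
    intervention_pre = rng.normal(10.0, 3.0, size=110)
    d_within = (intervention_pre.mean() - intervention_post.mean()) / intervention_pre.std(ddof=1)
    print(f"within-group effect size = {d_within:.2f}")
    ```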

  5. E-Beam Capture Aid Drawing Based Modelling on Cell Biology

    NASA Astrophysics Data System (ADS)

    Hidayat, T.; Rahmat, A.; Redjeki, S.; Rahman, T.

    2017-09-01

    The objective of this research is to find out how far a Drawing-based Modeling approach assisted with E-Beam Capture could support students' scientific reasoning skills. The research design used is a pre-test and post-test design. Data on scientific reasoning skills were collected by giving multiple-choice questions before and after the lesson and were analyzed using a scientific reasoning assessment rubric. The results show an improvement in students' scientific reasoning in every indicator: 2 students achieved high scores in generativity, 3 students in elaboration reasoning, 4 students in justification, 3 students in explanation, 3 students in logic coherency, and 2 students in synthesis. Explanation reasoning had the highest number of students with high scores, with 20 students in the pre-test and 23 students in the post-test, while synthesis reasoning had the lowest, with 1 student in the pre-test and 3 students in the post-test. The research concludes that the Drawing-based Modeling approach assisted with E-Beam Capture could not yet support students' scientific reasoning skills comprehensively.

  6. Measuring interdisciplinary research and education outcomes in the Vienna Doctoral Programme on Water Resource Systems

    NASA Astrophysics Data System (ADS)

    Carr, Gemma; Loucks, Daniel Pete; Blaschke, Alfred Paul; Bucher, Christian; Farnleitner, Andreas; Fürnkranz-Prskawetz, Alexia; Parajka, Juraj; Pfeifer, Norbert; Rechberger, Helmut; Wagner, Wolfgang; Zessner, Matthias; Blöschl, Günter

    2015-04-01

    The interdisciplinary postgraduate research and education programme - the Vienna Doctoral Programme on Water Resource Systems - was initiated in 2009. To date, 35 research students, three post-docs and ten faculty members have been engaged in the Programme, from ten research fields (aquatic microbiology, hydrology, hydro-climatology, hydro-geology, mathematical economics, photogrammetry, remote sensing, resource management, structural mechanics, and water quality). The Programme aims to develop research students with the capacity to work across the disciplines, to conduct cutting-edge research and to foster an international perspective. To do this, a variety of mechanisms are adopted that include research cluster groups, joint study sites, joint supervision, a basic study programme and a research semester abroad. The Programme offers a unique case study to explore if and how these mechanisms lead to research and education outcomes. Outcomes are grouped according to whether they are tangible (publications with co-authors from more than one research field, analysis of graduate profiles and career destinations) or non-tangible (interaction between researchers, networks and trust). A mixed methods approach that includes bibliometric analysis combined with interviews with students is applied. Bibliometric analysis shows that as the Programme has evolved the amount of multi-disciplinary work has increased (32% of the 203 full papers produced by the programme's researchers have authors from more than one research field). Network analysis to explore which research fields collaborate most frequently shows that hydrology plays a significant role and has collaborated with seven of the ten research fields. Hydrology researchers seem to interact the most strongly with other research fields as they contribute understanding on water system processes. Network analysis to explore which individuals collaborate shows that much joint work takes place through the five research cluster groups (water resource management, land-surface processes, Hydrological Open Air Laboratory, water and health, modelling and risk). Student interviews highlight that trust between colleagues and supervisors, and the role of spaces for interaction (joint study sites, cluster group meetings, shared offices etc.) are important for joint work. Graduate analysis shows that students develop skills and confidence to work across disciplines through collaborating on their doctoral research. Working collaboratively during the doctorate appears to be strongly correlated with continuing to work in this way after graduation.
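
    The co-authorship network analysis mentioned above can be sketched as follows: build a weighted graph whose nodes are research fields and whose edges count joint papers, then rank fields by how many other fields they have collaborated with. The paper list is a toy example, not the programme's bibliometric data, and networkx is assumed as the graph library.

    ```python
    import itertools
    import networkx as nx

    # Toy records: the set of research fields represented on each paper
    # (illustrative only, not the programme's actual 203 publications).
    papers = [
        {"hydrology", "remote sensing"},
        {"hydrology", "water quality", "aquatic microbiology"},
        {"hydrology", "hydro-climatology"},
        {"structural mechanics", "resource management"},
    ]

    G = nx.Graph()
    for fields in papers:
        for a, b in itertools.combinations(sorted(fields), 2):
            # Accumulate how often each pair of fields appears on the same paper.
            weight = G.get_edge_data(a, b, default={"weight": 0})["weight"]
            G.add_edge(a, b, weight=weight + 1)

    # Degree = number of distinct fields a field has co-authored with, mirroring
    # the observation that hydrology collaborates with the most other fields.
    for field, degree in sorted(G.degree, key=lambda item: -item[1]):
        print(f"{field}: collaborates with {degree} other field(s)")
    ```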

  7. Metaphor, Model, and Theory in Education Research.

    ERIC Educational Resources Information Center

    Dickmeyer, Nathan

    1989-01-01

    The concepts of metaphor, model, and theory are defined and used to show how social science research in general, and education research in particular, has differed from Popper's description of natural science research. (IAH)

  8. Outlook on Research in Education for Sustainable Development

    ERIC Educational Resources Information Center

    Grasel, Cornelia; Bormann, Inka; Schutte, Kerstin; Trempler, Kati; Fischbach, Robert

    2013-01-01

    This article provides an overview of current research on Education for Sustainable Development (ESD). It shows a lack of correspondence between ESD research and recent debates in educational research. Research on ESD has become established as a field of research with insufficient relations to other fields of educational research. Based on the overview…

  9. Communicating in context: a priority for gene therapy researchers.

    PubMed

    Robillard, Julie M

    2015-03-01

    History shows that public opinion of emerging biotechnologies has the potential to impact the research process through mechanisms such as funding and advocacy. It is critical, therefore, to consider public attitudes towards modern biotechnology such as gene therapy and more specifically towards the ethics of gene therapy, alongside advances in basic and clinical research. Research conducted through social media recently assessed how online users view the ethics of gene therapy and showed that while acceptability is high, significant ethical concerns remain. To address these concerns, the development of effective and evidence-based communication strategies that engage a wide range of stakeholders should be a priority for researchers.

  10. On the Ocean, Communicating Science Through Radio Broadcasts

    NASA Astrophysics Data System (ADS)

    Daugherty, M.; Campbell, L.

    2016-02-01

    The outcomes of oceanic research are of critical importance to the general public. Communicating these results in a relatable and efficient manner, however, is no simple task. To further the cause of scientific outreach done for the benefit of society, a weekly radio show was created at Texas A&M University, taking cutting-edge research and translating it into applicable, interesting radio segments. The show, named "On the Ocean", was created by the Department of Oceanography to inform and entertain general-public listeners about marine issues affecting their lives. On the Ocean is an effort to present high-level research without sacrificing the complexity of the science conducted. On the Ocean is a uniquely designed module with a systematic approach to teaching a new oceanographic concept each month. On the Ocean has a format of monthly topics with a two-minute show each week. The first monthly installment is general, introducing the topic and its relevancy. The second and third shows cover causes or effects, or something very interesting the public would not already know. The fourth installment highlights how researchers study the topic, with the contributing professor's specific research methods emphasized. All shows are co-created with, and inspected for validity by, Texas A&M University professors, and edited for radio adaptation by graduate students. In addition to airing on public broadcast radio to the College Station/Bryan TX area, the show also includes a globally accessible interactive website with podcasts, additional figures, and links to better elaborate on the material presented, as well as credit the contributing professors. The website also allows these professors the opportunity to present their research visually and link to their current work. Overall, On the Ocean is a new tool to deliver applicable science.

  11. What Is This Thing Called "Design" in Design Research and Instructional Design

    ERIC Educational Resources Information Center

    Cronje, Johannes

    2013-01-01

    This paper will consider the phenomenon of design research. It will then consider four research positions in social science research. The paper will show how the design perspectives map onto the research paradigms and how by rotating through these paradigms, a design research cycle is formed. Finally, the paper will discuss four research questions…

  12. A comparative analysis of biomedical research ethics regulation systems in Europe and Latin America with regard to the protection of human subjects.

    PubMed

    Lamas, Eugenia; Ferrer, Marcela; Molina, Alberto; Salinas, Rodrigo; Hevia, Adriana; Bota, Alexandre; Feinholz, Dafna; Fuchs, Michael; Schramm, Roland; Tealdi, Juan-Carlos; Zorrilla, Sergio

    2010-12-01

    The European project European and Latin American Systems of Ethics Regulation of Biomedical Research Project (EULABOR) has carried out the first comparative analysis of ethics regulation systems for biomedical research in seven countries in Europe and Latin America, evaluating their roles in the protection of human subjects. We developed a conceptual and methodological framework defining 'ethics regulation system for biomedical research' as a set of actors, institutions, codes and laws involved in overseeing the ethics of biomedical research on humans. This framework allowed us to develop comprehensive national reports by conducting semi-structured interviews with key informants. These reports were summarised and analysed in a comparative analysis. The study showed that the regulatory frameworks for clinical research in these countries differ in scope. It showed that despite the different political contexts, actors involved and motivations for creating the regulation, in most of the studied countries it was the government that took the lead in setting up the system. The study also showed that Europe and Latin America are similar regarding national bodies and research ethics committees, but the Brazilian system has strong and noteworthy specificities.

  13. Evaluation of the pH of a Self-Adhesive Flowable Composite Over 3 Months

    DTIC Science & Technology

    2016-04-01

    Flowable Materials. Dental Research Journal 2012; 9(4): 460-465. 12. Goracci C, Margvelashvili M, Giovannetti A, Vichi A, Ferrari M: Shear Bond...responsible for the vast field of current products on the dental market today. These developments have led researchers to focus on combining the...reduces internal voids. Some studies even show better marginal adaptation when a flowable composite is used, though some research shows it doesn’t

  14. Educator Effectiveness Research Alliance: Using Research and Data to Understand and Improve Educator Preparation and Evaluation

    ERIC Educational Resources Information Center

    Regional Educational Laboratory Southwest, 2018

    2018-01-01

    Research shows that teachers affect student learning more than any other factor. The Educator Effectiveness Research Alliance, a collaborative partnership of educators, policymakers, and researchers, seeks to improve educator quality through research and analytic technical support. Initially focused on Texas, the alliance has expanded to include…

  15. A survey of publication practices of single-case design researchers when treatments have small or large effects.

    PubMed

    Shadish, William R; Zelinsky, Nicole A M; Vevea, Jack L; Kratochwill, Thomas R

    2016-09-01

    The published literature often underrepresents studies that do not find evidence for a treatment effect; this is often called publication bias. Literature reviews that fail to include such studies may overestimate the size of an effect. Only a few studies have examined publication bias in single-case design (SCD) research, but those studies suggest that publication bias may occur. This study surveyed SCD researchers about publication preferences in response to simulated SCD results that show a range of small to large effects. Results suggest that SCD researchers are more likely to submit manuscripts that show large effects for publication and are more likely to recommend acceptance of manuscripts that show large effects when they act as a reviewer. A nontrivial minority of SCD researchers (4% to 15%) would drop 1 or 2 cases from the study if the effect size is small and then submit for publication. This article ends with a discussion of implications for publication practices in SCD research. © 2016 Society for the Experimental Analysis of Behavior.

  16. Bridging Research and Environmental Regulatory Processes: The Role of Knowledge Brokers

    PubMed Central

    Pennell, Kelly G.; Thompson, Marcella; Rice, James W.; Senier, Laura; Brown, Phil; Suuberg, Eric

    2013-01-01

    Federal funding agencies increasingly require research investigators to ensure that federally-sponsored research demonstrates broader societal impact. Specifically, the National Institute of Environmental Health Sciences (NIEHS) Superfund Research Program (SRP) requires research centers to include research translation and community engagement cores to achieve broader impacts, with special emphasis on improving environmental health policies through better scientific understanding. This paper draws on theoretical insights from the social sciences to show how incorporating knowledge brokers in research centers can facilitate translation of scientific expertise to influence regulatory processes and thus promote public health. Knowledge brokers connect academic researchers with decision-makers, to facilitate the translation of research findings into policies and programs. In this article, we describe the stages of the regulatory process and highlight the role of the knowledge broker and scientific expert at each stage. We illustrate the cooperation of knowledge brokers, scientific experts and policymakers using a case from the Brown University (Brown) SRP. We show how the Brown SRP incorporated knowledge brokers to engage scientific experts with regulatory officials around the emerging public health problem of vapor intrusion. In the Brown SRP, the knowledge broker brought regulatory officials into the research process, to help scientific experts understand the critical nature of this emerging public health threat, and helped scientific experts develop a research agenda that would inform the development of timely measures to protect public health. Our experience shows that knowledge brokers can enhance the impact of environmental research on public health by connecting policy decision-makers with scientific experts at critical points throughout the regulatory process. PMID:24083557

  17. Physiology response of fourth generation saline resistant soybean (Glycine max (L.) Merrill) with application of several types of antioxidants

    NASA Astrophysics Data System (ADS)

    Manurung, I. R.; Rosmayati; Rahmawati, N.

    2018-02-01

    Antioxidant applications are expected to reduce the adverse effects of saline soil. This research was conducted in a plastic house, the Plant Tissue Laboratory of the Faculty of Agriculture and the Plant Physiology Laboratory of the Faculty of Mathematics and Natural Sciences, Universitas Sumatera Utara, Medan, as well as at the Research Centers and Industry Standardization, Medan, from July to December 2016. The objective of the research was to determine the effect of various antioxidant treatments at different concentrations (control; ascorbic acid 250, 500 and 750 ppm; salicylic acid 250, 500 and 750 ppm; α-tocopherol 250, 500 and 750 ppm) on the physiology of fourth-generation soybean under saline conditions (electrical conductivity 5-6 dS/m). The results of this research showed that antioxidant type and concentration did not significantly affect the physiology of the fourth-generation soybean. Descriptively, the highest averages of superoxide dismutase and peroxidase were observed with ascorbic acid 250 ppm. The highest average of ascorbate peroxidase was observed with α-tocopherol 750 ppm. The highest average carotenoid content was observed with ascorbic acid 500 ppm. The highest average chlorophyll content was observed with α-tocopherol 250 ppm. The highest average K/Na ratio was observed with salicylic acid 250 ppm.

  18. 13. Historic view of Building 100 control room, showing personnel ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    13. Historic view of Building 100 control room, showing personnel with data recording instrumentation. 1957. On file at NASA Plumbrook Research Facility, Sandusky, Ohio. NASA photo number C-46211. - Rocket Engine Testing Facility, GRC Building No. 100, NASA Glenn Research Center, Cleveland, Cuyahoga County, OH

  19. Review of a nursing research report. Young people with depression: review of a nursing research report.

    PubMed

    Collins, Janay

    2013-01-01

    The research study by McCann et al. (2012) revealed several adverse effects that depression can have on young adults. The findings showed that depression in young adults can be life-threatening if not treated (McCann et al., 2012). One implication for evidence-based nursing practice would be to educate family and friends on the signs of depression and how to respond to them. A suggestion for future research would be to conduct a study showing the effectiveness of different treatment methods (e.g., therapy, medications) on adolescent depression.

  20. Invitro Study on the Fluid From Banana Stem Bioprocess as Direct Fed Microbial

    NASA Astrophysics Data System (ADS)

    Mutaqin, B. K.; Tanuwiria, U. H.; Hernawan, E.

    2018-02-01

    The purpose of this research was to study the liquid produced by the bioprocessing of banana stem as a Direct Fed Microbial (DFM) to enhance local sheep productivity, evaluated in vitro. The use of the DFM was studied in two feeds in vitro, and the variables observed were fermentability and digestibility values. The method was experimental, using a factorial design with two factors: the first factor was the DFM level (0, 0.2, 0.4 and 0.6%), and the second factor was the feed type (complete feed or Pennisetum purpureum only), with three replications per treatment. This research showed that fermentability and digestibility values were influenced by the DFM in the complete feed in vitro. The results were analyzed using MANOVA, with Duncan's test for further comparisons. The conclusion is that the interaction of DFM with the complete feed improves fermentability and digestibility values, and the 0.6% DFM level shows the highest values.
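
    The 4 x 2 factorial layout described above (four DFM levels by two feed types, three replications) can be analyzed with standard tools; the sketch below runs a two-way ANOVA on one simulated response with statsmodels. The study itself reports MANOVA followed by Duncan's test, so this univariate sketch with invented digestibility values only illustrates the design, not the authors' analysis.

    ```python
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm
    from statsmodels.formula.api import ols

    # Simulated responses for the 4 x 2 factorial design with three replications
    # (placeholder values, not the study's fermentability/digestibility data).
    rng = np.random.default_rng(2)
    rows = []
    for dfm in (0.0, 0.2, 0.4, 0.6):             # DFM inclusion level, percent
        for feed in ("complete", "pennisetum"):  # feed type
            for _ in range(3):                   # three replications
                value = 50 + 10 * dfm + (5 if feed == "complete" else 0) + rng.normal(0, 1)
                rows.append({"dfm": dfm, "feed": feed, "digestibility": value})
    data = pd.DataFrame(rows)

    # Two-way ANOVA with interaction between DFM level and feed type.
    model = ols("digestibility ~ C(dfm) * C(feed)", data=data).fit()
    print(sm.stats.anova_lm(model, typ=2))
    ```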

  1. Community-Based Research: From Practice to Theory and Back Again.

    ERIC Educational Resources Information Center

    Stoecker, Randy

    2003-01-01

    Explores the theoretical strands being combined in community-based research--charity service learning, social justice service learning, action research, and participatory research. Shows how different models of community-based research, based in different theories of society and different approaches to community work, may combine or conflict. (EV)

  2. Using the Pyramid Approach to Teaching Marketing Research.

    ERIC Educational Resources Information Center

    Peltier, James W.; Westfall, John; Ainscough, Thomas L.

    2001-01-01

    Underscores the need for teaching marketing research skills at the secondary level and shows how marketing research fits into marketing education. Provides an example of how to use the pyramid approach to research, which involves review of secondary sources, key informant interviews, focus groups, and quantitative research. (Author/JOW)

  3. Keeping in Touch

    ERIC Educational Resources Information Center

    Tonn, Jessica L.

    2005-01-01

    Overwhelming evidence shows that family involvement--both in school and at home--has a positive impact on student achievement. Researchers have found that parental involvement tends to drop off when students enter middle school. Research also shows that this does not have to be the case. In this article, the author discusses the school-family-community…

  4. Departmental Publication Productivity and Reputational Quality: Disciplinary Differences

    ERIC Educational Resources Information Center

    Baird, Leonard

    2009-01-01

    Using US national data, this study examines the levels and interactions between reputational rankings, average publications, citations, and external research support across 30 disciplines. The analyses show great variation among the disciplines in average and range of publications, citations, and external research support. They also show that the…

  5. Interleaved Practice in Multi-Dimensional Learning Tasks: Which Dimension Should We Interleave?

    ERIC Educational Resources Information Center

    Rau, Martina A.; Aleven, Vincent; Rummel, Nikol

    2013-01-01

    Research shows that multiple representations can enhance student learning. Many curricula use multiple representations across multiple task types. The temporal sequence of representations and task types is likely to impact student learning. Research on contextual interference shows that interleaving learning tasks leads to better learning results…

  6. An Assessment and Annotated Bibliography of Marine Bioluminescence Research: 1979-1987.

    DTIC Science & Technology

    1993-01-01

    to Leona Cole, Division Secretary for NOARL Code 330, who prepared the typed draft of this manuscript. The mention of commercial products or company...a look at the end points of the curves suggests a small drop in research productivity in three areas. The graph shows the number of publications each year...Table 2 shows the most productive agencies...dominated the funding for U.S. research institutions. In the next decade, we may expect...on bioluminescence

  7. Are vegans the same as vegetarians? The effect of diet on perceptions of masculinity.

    PubMed

    Thomas, Margaret A

    2016-02-01

    Food and food consumption matters in interpersonal interactions. Foods consumed can affect how a person is perceived by others in terms of morality, likeability, and gender. Food consumption can be used as a strategy for gendered presentation, either in terms of what foods are consumed or in the amount of food consumed. Finally, foods themselves are associated with gender. Previous research (Browarnik, 2012; Ruby & Heine, 2011) shows inconsistent patterns in the association between vegetarianism and masculinity. The current research conceptually replicates and extends this research by including the explicit label of vegetarian. The four studies in this article provide increased information about the effects of diet on gendered perceptions. Study 1 shows that vegetarian and omnivorous targets are rated equally in terms of masculinity. Study 2 shows that perceptions of vegetarians and vegans are similar, though comparing this research with past research indicates that perceptions of vegetarians are more variable. Study 3 shows that veganism leads to perceptions of decreased masculinity relative to omnivores. Finally, Study 4 tests one possible mechanism for the results of Study 3, that it is the choice to be vegan that impacts perceptions of gender. Implications include increased knowledge about how meatless diets can affect the perceptions of gender in others. Multiple directions for future research are discussed. Copyright © 2015 Elsevier Ltd. All rights reserved.

  8. The development of thematic materials using project based learning for elementary school

    NASA Astrophysics Data System (ADS)

    Yuliana, M.; Wiryawan, S. A.; Riyadi

    2018-05-01

    Teaching materials are one of the important factors supporting the learning process. This paper discusses the development of thematic materials using project-based learning. Thematic materials are designed to make students active, creative, cooperative, and able to think through problem solving. The purpose of the research was to develop thematic materials using project-based learning that meet validity criteria. The research method followed the four stages of research and development proposed by Thiagarajan, namely: (1) definition, (2) design, (3) development, and (4) dissemination. The first stage involved research and information collection in the form of a needs analysis using questionnaires, observation, interviews, and document analysis. The design stage was based on the competencies and indicators. The third stage, development, was used for product validation by experts; the validation involved media, material, and linguistic validators. Expert validation of the thematic materials showed a very good overall rating on a 1-to-5 Likert scale: media validation had a mean score of 4.83, material validation 4.68, and linguistic validation 4.74. This shows that the thematic materials using project-based learning are valid and feasible to implement in the context of thematic learning.

  9. Institutional Controls and Educational Research.

    ERIC Educational Resources Information Center

    Homan, Roger

    1990-01-01

    Recognizing tendencies toward contract research and possible consequences, advocates creating a conduct code to regulate educational research and protect its integrity. Reports survey responses from 48 British institutions, showing no systematic code. States confidence in supervisory discretion currently guides research. Proposes a specific code…

  10. Tackling complexities in understanding the social determinants of health: the contribution of ethnographic research.

    PubMed

    Bandyopadhyay, Mridula

    2011-11-25

    The complexities inherent in understanding the social determinants of health are often not well-served by quantitative approaches. My aim is to show that well-designed and well-conducted ethnographic studies have an important contribution to make in this regard. Ethnographic research designs are a difficult but rigorous approach to research questions that require us to understand the complexity of people's social and cultural lives. I draw on an ethnographic study to describe the complexities of studying maternal health in a rural area in India. I then show how the lessons learnt in that setting and context can be applied to studies done in very different settings. I show how ethnographic research depends for rigour on a theoretical framework for sample selection; why immersion in the community under study, and rapport building with research participants, is important to ensure rich and meaningful data; and how flexible approaches to data collection lead to the gradual emergence of an analysis based on intense cross-referencing with community views and thus a conclusion that explains the similarities and differences observed. When using ethnographic research design it can be difficult to specify in advance the exact details of the study design. Researchers can encounter issues in the field that require them to change what they planned on doing. In rigorous ethnographic studies, the researcher in the field is the research instrument and needs to be well trained in the method. Ethnographic research is challenging, but nevertheless provides a rewarding way of researching complex health problems that require an understanding of the social and cultural determinants of health.

  11. Trends and topics in sports research in the Social Science Citation Index from 1993 to 2008.

    PubMed

    Gau, Li-Shiue

    2013-02-01

    This descriptive study evaluated behavioral and social science research on sport for 1993 through 2008, examined the characteristics of sport research, and identified mainstream issues appearing during these 16 years. Based on the Social Science Citation Index (SSCI) database from 1993 to 2008, 7,655 articles referring to sport or sports were available. The publication analyses showed that 13 core journals published the most articles in the behavioral sciences of sport. By analyzing all titles, author keywords, and KeyWords Plus, the results showed that physical education, athlete performance, and sports participation were the mainstream issues of sport research in the 16-year study period. The words adolescent, youth, and children frequently appeared, indicating that the emphasis of sport research focused on these participant groups. This bibliometric study reviewed global sports research in SSCI, and described certain patterns or trends in prior research on sport.

  12. An Analysis and Evaluation of Research in Cognition and Learning among Older Adults.

    ERIC Educational Resources Information Center

    Wass, Hannelore; Olejnik, Stephen F.

    1983-01-01

    Reviews research literature to determine implications for educational programs for elderly persons. Showed that, in general, researchers fall far short of providing useful information to practitioners in this field. Suggested that laboratory research on cognition and learning must be followed by research conducted in the actual educational…

  13. Cultural Relevance in Research Methodology/Paradigm/Terminology: Dilemma, Contradiction and Challenge.

    ERIC Educational Resources Information Center

    Maina, Faith

    This paper describes an incident between an academic researcher and a community member. The encounter, in which a researcher asked questions about farming practices, shows how cultural misunderstanding and failure to communicate the gains of research to the community has the potential to generate distorted information. The academic researcher has…

  14. Primary prevention research: a preliminary review of program outcome studies.

    PubMed

    Schaps, E; Churgin, S; Palley, C S; Takata, B; Cohen, A Y

    1980-07-01

    This article reviews 35 drug abuse prevention program evaluations employing drug-specific outcome measures. Many of these evaluations assessed the effects of "new generation" prevention strategies: affective, peer-oriented, and multidimensional approaches. Only 14 studies evaluated purely informational programs. Evaluations were analyzed to ascertain (1) characteristics of the programs under study, (2) characteristics of the research designs, and (3) patterns among findings. This review provides some evidence that the newer prevention strategies may produce more positive and fewer negative outcomes than did older drug information approaches. Over 70% of the programs using the newer strategies produced some positive effects; only 29% showed negative effects. In contrast, 46% of informational programs showed positive effects; 46% showed negative effects. These findings must be approached with great caution, since the research was frequently scientifically inadequate, and since rigor of research was negatively correlated with intensity and duration of program services.

  15. Valuing School Quality Using Boundary Discontinuities. CEE DP 132

    ERIC Educational Resources Information Center

    Gibbons, Stephen; Machin, Stephen; Silva, Olmo

    2012-01-01

    Existing research shows that house prices respond to local school quality as measured by average test scores. However, higher test scores could signal better quality teaching and academic value-added, or higher ability, sought-after intakes. In our research, we show decisively that value-added drives households' demand for good schooling. However,…

  16. "Cool" English: Stylized Native-Speaker English in Japanese Television Shows

    ERIC Educational Resources Information Center

    Furukawa, Gavin

    2015-01-01

    This article analyzes stylized pronunciations of English by Japanese speakers on televised variety shows in Japan. Research on style and mocking has done much to reveal how linguistic forms are utilized in interaction as resources of identity construction that can oftentimes subvert hegemonic discourse (Chun 2004). Within this research area,…

  17. The Dynamics of Blog Peer Feedback in ESL Classroom

    ERIC Educational Resources Information Center

    Gedera, Dilani S. P.

    2012-01-01

    Over the past decade, aspects pertinent to the area of feedback have been extensively explored by researchers. While some of the studies show positive effects of peer review, others discuss its problematic areas. In spite of the controversies, new ways of integrating peer feedback in ESL classrooms are being explored. Researchers show an…

  18. 36. Historic photo of Building 202 interior, shows shop area ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    36. Historic photo of Building 202 interior, shows shop area with engineers assembling twenty-thousand-pound-thrust rocket engine, December 15, 1958. On file at NASA Plumbrook Research Center, Sandusky, Ohio. NASA photo number C-49343. - Rocket Engine Testing Facility, GRC Building No. 202, NASA Glenn Research Center, Cleveland, Cuyahoga County, OH

  19. 12. Historic view of Building 100 control room, showing television ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    12. Historic view of Building 100 control room, showing television monitoring of tests and personnel operating rocket engine test controls. May 27, 1957. On file at NASA Plumbrook Research Facility, Sandusky, Ohio. NASA photo number C-45021. - Rocket Engine Testing Facility, GRC Building No. 100, NASA Glenn Research Center, Cleveland, Cuyahoga County, OH

  20. 56. Historic photo of excavation work at Building 202, shows ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    56. Historic photo of excavation work at Building 202, shows facility with exhaust scrubber in foreground, February 24, 1969. On file at NASA Plumbrook Research Center, Sandusky, Ohio. NASA photo number C-69-712. - Rocket Engine Testing Facility, GRC Building No. 202, NASA Glenn Research Center, Cleveland, Cuyahoga County, OH

  1. 55. Historic photo of excavation work at Building 202, shows ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    55. Historic photo of excavation work at Building 202, shows facility with detention tank in foreground, February 24, 1969. On file at NASA Plumbrook Research Center, Sandusky, Ohio. NASA photo number C-69-711. - Rocket Engine Testing Facility, GRC Building No. 202, NASA Glenn Research Center, Cleveland, Cuyahoga County, OH

  2. Behavioral and Social Science Research: A National Resource. Part II.

    ERIC Educational Resources Information Center

    Adams, Robert McC., Ed.; And Others

    Areas of behavioral and social science research that have achieved significant breakthroughs in knowledge or application or that show future promise of achieving such breakthroughs are discussed in 12 papers. For example, the paper on formal demography shows how mathematical or statistical techniques can be used to explain and predict change in…

  3. Entering Research

    ERIC Educational Resources Information Center

    Lawless, Ann; Sedorkin, Barbara

    2007-01-01

    This article presents a short story of the authors, who show how they have "entered research", that is, entered the earliest conception of research and the early formation of research collaboration. As the authors worked together, they realised they had common concerns and life experiences. Each proudly identifies as working class…

  4. Installing a practical research project and interpreting research results

    Treesearch

    R. Kasten Dumroese; David L. Weny

    2002-01-01

    We review the basic concepts of science and research and the scientific process. Using an example from a bareroot nursery, we show how a practical research project can be done at any type of nursery, meshing sound statistical principles with limitations of busy nursery managers.

  5. Biometric Communication Research for Television.

    ERIC Educational Resources Information Center

    Malik, M. F.

    Biometric communication research is defined as research dealing with the information impact of a film or television show, photographic picture, painting, exhibition, display, or any literary or functional texts or verbal stimuli on human beings, both as individuals and in groups (mass audiences). Biometric communication research consists of a…

  6. Analysis of the research sample collections of Uppsala biobank.

    PubMed

    Engelmark, Malin T; Beskow, Anna H

    2014-10-01

    Uppsala Biobank is the joint and only biobank organization of the two principals, Uppsala University and Uppsala University Hospital. Biobanks are required to have updated registries on sample collection composition and management in order to fulfill legal regulations. We report here the results from the first comprehensive and overall analysis of the 131 research sample collections organized in the biobank. The results show that the median of the number of samples in the collections was 700 and that the number of samples varied from less than 500 to over one million. Blood samples, such as whole blood, serum, and plasma, were included in the vast majority, 84.0%, of the research sample collections. Also, as much as 95.5% of the newly collected samples within healthcare included blood samples, which further supports the concept that blood samples have fundamental importance for medical research. Tissue samples were also commonly used and occurred in 39.7% of the research sample collections, often combined with other types of samples. In total, 96.9% of the 131 sample collections included samples collected for healthcare, showing the importance of healthcare as a research infrastructure. Of the collections that had accessed existing samples from healthcare, as much as 96.3% included tissue samples from the Department of Pathology, which shows the importance of pathology samples as a resource for medical research. Analysis of different research areas shows that the most common of known public health diseases are covered. Collections that had generated the most publications, up to over 300, contained a large number of samples collected systematically and repeatedly over many years. More knowledge about existing biobank materials, together with public registries on sample collections, will support research collaborations, improve transparency, and bring us closer to the goals of biobanks, which is to save and prolong human lives and improve health and quality of life.

  7. Tackling complexities in understanding the social determinants of health: the contribution of ethnographic research

    PubMed Central

    2011-01-01

    Objective The complexities inherent in understanding the social determinants of health are often not well-served by quantitative approaches. My aim is to show that well-designed and well-conducted ethnographic studies have an important contribution to make in this regard. Ethnographic research designs are a difficult but rigorous approach to research questions that require us to understand the complexity of people’s social and cultural lives. Approach I draw on an ethnographic study to describe the complexities of studying maternal health in a rural area in India. I then show how the lessons learnt in that setting and context can be applied to studies done in very different settings. Results I show how ethnographic research depends for rigour on a theoretical framework for sample selection; why immersion in the community under study, and rapport building with research participants, is important to ensure rich and meaningful data; and how flexible approaches to data collection lead to the gradual emergence of an analysis based on intense cross-referencing with community views and thus a conclusion that explains the similarities and differences observed. Conclusion When using ethnographic research design it can be difficult to specify in advance the exact details of the study design. Researchers can encounter issues in the field that require them to change what they planned on doing. In rigorous ethnographic studies, the researcher in the field is the research instrument and needs to be well trained in the method. Implication Ethnographic research is challenging, but nevertheless provides a rewarding way of researching complex health problems that require an understanding of the social and cultural determinants of health. PMID:22168509

  8. Research Trends in Evidence-Based Medicine: A Joinpoint Regression Analysis of More than 50 Years of Publication Data

    PubMed Central

    Hung, Bui The; Long, Nguyen Phuoc; Hung, Le Phi; Luan, Nguyen Thien; Anh, Nguyen Hoang; Nghi, Tran Diem; Van Hieu, Mai; Trang, Nguyen Thi Huyen; Rafidinarivo, Herizo Fabien; Anh, Nguyen Ky; Hawkes, David; Huy, Nguyen Tien; Hirayama, Kenji

    2015-01-01

    Background Evidence-based medicine (EBM) has developed as the dominant paradigm of assessment of evidence that is used in clinical practice. Since its development, EBM has been applied to integrate the best available research into diagnosis and treatment with the purpose of improving patient care. In the EBM era, a hierarchy of evidence has been proposed, including various types of research methods, such as meta-analysis (MA), systematic review (SRV), randomized controlled trial (RCT), case report (CR), practice guideline (PGL), and so on. Although there are numerous studies examining the impact and importance of specific cases of EBM in clinical practice, there is a lack of research quantitatively measuring publication trends in the growth and development of EBM. Therefore, a bibliometric analysis was constructed to determine the scientific productivity of EBM research over decades. Methods NCBI PubMed database was used to search, retrieve and classify publications according to research method and year of publication. Joinpoint regression analysis was undertaken to analyze trends in research productivity and the prevalence of individual research methods. Findings Analysis indicates that MA and SRV, which are classified as the highest ranking of evidence in the EBM, accounted for a relatively small but auspicious number of publications. For most research methods, the annual percent change (APC) indicates a consistent increase in publication frequency. MA, SRV and RCT show the highest rate of publication growth in the past twenty years. Only controlled clinical trials (CCT) shows a non-significant reduction in publications over the past ten years. Conclusions Higher quality research methods, such as MA, SRV and RCT, are showing continuous publication growth, which suggests an acknowledgement of the value of these methods. This study provides the first quantitative assessment of research method publication trends in EBM. PMID:25849641
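
    Joinpoint regression, as used in this study, fits piecewise log-linear segments to annual counts; within any one segment the annual percent change (APC) follows from the slope of log(count) against year. The sketch below computes the APC for a single segment with invented counts; locating the joinpoints themselves requires a segmented-regression search that is omitted here.

    ```python
    import numpy as np

    # Hypothetical annual publication counts for one research method
    # (placeholder values, not the study's PubMed counts).
    years = np.arange(2000, 2011)
    counts = np.array([120, 135, 150, 170, 185, 210, 230, 260, 290, 320, 355])

    # Within a single joinpoint segment, log(count) is modeled as linear in year,
    # and the slope b converts to the annual percent change: APC = (exp(b) - 1) * 100.
    slope, intercept = np.polyfit(years, np.log(counts), deg=1)
    apc = (np.exp(slope) - 1.0) * 100.0
    print(f"APC ~ {apc:.1f}% per year")
    ```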

  9. Building Interdisciplinary Qualitative Research Networks: Reflections on Qualitative Research Group (QRG) at the University of Manitoba

    ERIC Educational Resources Information Center

    Roger, Kerstin Stieber; Halas, Gayle

    2012-01-01

    As qualitative research methodologies continue to evolve and develop, both students and experienced researchers are showing greater interest in learning about and developing new approaches. To meet this need, faculty at the University of Manitoba created the Qualitative Research Group (QRG), a community of practice that utilizes experiential…

  10. As if by Machinery: The Levelling of Educational Research

    ERIC Educational Resources Information Center

    Smith, Richard

    2006-01-01

    Much current educational research shows the influence of two powerful but potentially pernicious lines of thought. The first, which can be traced at least as far back as Francis Bacon, is the ambition to formulate precise techniques of research, or "research methods", which can be applied reliably irrespective of the talent of the researcher. The…

  11. Potential of antioxidant and toxicity of some medical plants used by sub-ethnic communities of Bahau in East Kalimantan

    NASA Astrophysics Data System (ADS)

    Rohim, P.; Arung, E. T.; Kusuma, I. W.

    2018-04-01

    The purpose of this research is to assay the antioxidant potential and toxicity of several plants used by the Bahau, a sub-ethnic community in East Kalimantan, in regard to their utilization as traditional medicines. This research includes phytochemical analysis, DPPH radical and superoxide radical scavenging activity assays, and a toxicity assay using Artemia salina shrimp larvae. The extraction results showed that the highest yield, 2.91%, was obtained from avung tanaq (Ficus uncinata), while the lowest, 1.14%, was obtained from tevoqsalah (Saccharum sp.). The phytochemical screening showed that all plants contain alkaloids and carbohydrates, while carotenoids, saponins, triterpenoids and steroids were absent in all plant extracts. The DPPH radical scavenging activity test showed that the lowest IC50 value, 23.96 μg/mL, was obtained for kayog kue (Dictamnus albus). The superoxide radical scavenging activity assay showed that the IC50 values of all extract samples were >100 μg/mL. The toxicity assay showed that the LC50 values of all extracts tested were >1000 μg/mL. The present research suggests good potential activity of some plants used by the Bahau ethnic group, and further research oriented toward wider use of these plants as herbal products is needed.
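
    An IC50 like the 23.96 μg/mL value reported above is the concentration at which radical scavenging reaches 50%; a quick estimate interpolates the 50% crossing of the dose-response data. The readings below are invented placeholders, not the Bahau-plant measurements, and a full analysis would typically fit a dose-response curve rather than interpolate.

    ```python
    import numpy as np

    # Hypothetical DPPH assay readings for one extract: percent radical scavenging
    # measured over a serial dilution (placeholder data only).
    conc_ug_ml = np.array([3.125, 6.25, 12.5, 25.0, 50.0, 100.0])
    inhibition_pct = np.array([12.0, 22.0, 38.0, 55.0, 71.0, 84.0])

    # Quick IC50 estimate: interpolate the 50% inhibition crossing on a log-dose
    # scale (inhibition must be monotonically increasing for np.interp to apply).
    log_ic50 = np.interp(50.0, inhibition_pct, np.log10(conc_ug_ml))
    print(f"IC50 ~ {10 ** log_ic50:.1f} ug/mL")
    ```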

  12. Photos That Increase Feelings of Learning Promote Positive Evaluations

    ERIC Educational Resources Information Center

    Cardwell, Brittany A.; Newman, Eryn J.; Garry, Maryanne; Mantonakis, Antonia; Beckett, Randi

    2017-01-01

    Research shows that when semantic context makes it feel easier for people to bring related thoughts and images to mind, people can misinterpret that feeling of ease as evidence that information is positive. But research also shows that semantic context does more than help people bring known concepts to mind--it also teaches people new concepts. In…

  13. North wall, central part, showing partial partition wall at left. ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    North wall, central part, showing partial partition wall at left. This area is labeled “Pioneering Research” on drawing copy NV-35-B-5 (submitted with HABS No. NV-35-B) (series 2 of 4) - Bureau of Mines Metallurgical Research Laboratory, Original Building, Date Street north of U.S. Highway 93, Boulder City, Clark County, NV

  14. 39. Historic photo of Building 202 test cell exterior, showing ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    39. Historic photo of Building 202 test cell exterior, showing fiberglass cladding blown out by hydrogen fire during rocket engine testing, April 27, 1959. On file at NASA Plumbrook Research Center, Sandusky, Ohio. NASA photo number C-50472. - Rocket Engine Testing Facility, GRC Building No. 202, NASA Glenn Research Center, Cleveland, Cuyahoga County, OH

  15. 11. Historic view of Building 100 control room, showing personnel ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    11. Historic view of Building 100 control room, showing personnel operating rocket engine test controls and observer watching activity from observation room. May 27, 1957. On file at NASA Plumbrook Research Center, Sandusky, Ohio. NASA photo number C-45020. - Rocket Engine Testing Facility, GRC Building No. 100, NASA Glenn Research Center, Cleveland, Cuyahoga County, OH

  16. 38. Historic photo of Building 202 test cell interior, showing ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    38. Historic photo of Building 202 test cell interior, showing damage to test stand A and rocket engine after failure and explosion of engine, December 12, 1958. On file at NASA Plumbrook Research Center, Sandusky, Ohio. NASA photo number C-49376. - Rocket Engine Testing Facility, GRC Building No. 202, NASA Glenn Research Center, Cleveland, Cuyahoga County, OH

  17. Light Up Their Lives: A Review of Research on the Effects of Lighting on Children's Achievement and Behavior.

    ERIC Educational Resources Information Center

    Dunn, Rita; And Others

    1985-01-01

    Cites research showing individual reactions to bright and dim light in the classroom. Shows individual susceptibility to extreme negativism in inappropriate lighting conditions and suggests that students' predispositions for illumination be identified. Notes that restless, fidgety youngsters should be placed into softly lit sections, with the…

  18. Guidelines for the Development of Procedural Schematics for Research.

    ERIC Educational Resources Information Center

    Dolim, Michael P.

    To aid the student or researcher in the development of an effective procedural schematic (a graphic description of a research plan showing the steps needed to reach a stated objective), guidelines are presented under three topic headings: Compositional elements, taxonomy of research terms, and examples of procedural schematics. An introduction…

  19. Research Evaluation and the Assessment of Public Value

    ERIC Educational Resources Information Center

    Molas-Gallart, Jordi

    2015-01-01

    Funding organisations are increasingly asking academics to show evidence of the economic and social value generated by their research. These requests have often been associated with the emergence of a so-called "new social contract for research" and are related to the implementation of new research evaluation systems. Although the…

  20. Denial, Opposition, Rejection or Dissent: Why Do Teachers Contest Research Evidence?

    ERIC Educational Resources Information Center

    Cain, Tim

    2017-01-01

    Internationally, efforts are being made for educational practice to be research-informed on the grounds that schoolteachers will implement what research shows will "work". However, teachers do not necessarily accept research findings; sometimes they contest them. This article considers teachers' contestation in two empirical studies,…

  1. External Research: Helping Education Change for the Better, 1983-84.

    ERIC Educational Resources Information Center

    Austin Independent School District, TX. Office of Research and Evaluation.

    The Austin Independent School District (AISD) Office of Research and Development publishes abstracts of research projects conducted within the AISD by external agencies or individuals. This compilation begins with a roster of external research projects in tabular form showing the AISD project number, title, project director and sponsor, schools…

  2. Research Administrator Salary: Association with Education, Experience, Credentials and Gender

    ERIC Educational Resources Information Center

    Shambrook, Jennifer; Roberts, Thomas J.; Triscari, Robert

    2011-01-01

    The 2010 Research Administrators Stress Perception Survey (2010 RASPerS) collected data from 1,131 research administrators on salary, years experience, educational level, Certified Research Administrator (CRA) status, and gender. Using these data, comparisons were made to show how salary levels are associated with each of these variables. Using…

  3. A Sampling of Success Stories of Federal Investment in University Research.

    ERIC Educational Resources Information Center

    2003

    This volume illustrates how Canadian federal investment is advancing key measures of success in building research capacity at Ontario universities. Program descriptions illustrate federal investment in university-based research at 18 institutions of higher education in Ontario. A look at these programs shows that government-funded research is…

  4. The Joint Institute for Nuclear Research in Experimental Physics of Elementary Particles

    NASA Astrophysics Data System (ADS)

    Bednyakov, V. A.; Russakovich, N. A.

    2018-05-01

    The year 2016 marks the 60th anniversary of the Joint Institute for Nuclear Research (JINR) in Dubna, an international intergovernmental organization for basic research in the fields of elementary particles, atomic nuclei, and condensed matter. Highly productive advances over this long road clearly show that the international basis and diversity of research guarantees successful development (and maintenance) of fundamental science. This is especially important for experimental research. In this review, the most significant achievements are briefly described with an attempt to look into the future (seven to ten years ahead) and show the role of JINR in solution of highly important problems in elementary particle physics, which is a fundamental field of modern natural sciences. This glimpse of the future is full of justified optimism.

  5. Principals of Effective Schools Are Strong Instructional Leaders.

    ERIC Educational Resources Information Center

    Jordan, Ian

    1986-01-01

    Examines research on effective schools and on principals as instructional leaders. Research shows that principals of effective schools are strong instructional leaders. Mention is made of weaknesses found in the research on principals as instructional leaders. (MD)

  6. Basic Research and Progress against Cancer

    Cancer.gov

    An infographic about the importance of basic research for making progress against cancer. The graphic shows the research milestones that led to the development and approval of crizotinib (Xalkori®) to treat certain non-small cell lung cancers.

  7. The Influence of Accelerator Science on Physics Research

    NASA Astrophysics Data System (ADS)

    Haussecker, Enzo F.; Chao, Alexander W.

    2011-06-01

    We evaluate accelerator science in the context of its contributions to the physics community. We address the problem of quantifying these contributions and present a scheme for a numerical evaluation of them. We show by using a statistical sample of important developments in modern physics that accelerator science has influenced 28% of post-1938 physicists and also 28% of post-1938 physics research. We also examine how the influence of accelerator science has evolved over time, and show that on average it has contributed to Nobel Prize-winning physics research every 2.9 years.

  8. Budget boosts overall research but cuts NOAA and USGS funds

    NASA Astrophysics Data System (ADS)

    Richman, Barbara T.

    Science in general, and physical sciences in particular, show growth far above projected inflation in President Ronald Reagan's fiscal 1984 budget proposal. Total funding requested for all federal research and development, including facilities, is $47 billion, up 17.2% over fiscal 1983, well ahead of the 5% projected inflation rate. Defense R&D is slated to soar 29% to $30.3 billion, while non-defense R&D would rise 0.4% to $16.7 billion. Table 1 shows the proposed research and development budgets by major departments and agencies.

  9. Pavement Technology and Airport Infrastructure Expansion Impact

    NASA Astrophysics Data System (ADS)

    Sabib; Setiawan, M. I.; Kurniasih, N.; Ahmar, A. S.; Hasyim, C.

    2018-01-01

    This research aims to analyze the potential contribution of construction and infrastructure development activities to Airport Performance. It is a correlational study whose variables are Airport Performance (X) and construction and infrastructure development activities (Y). The population in this research is 148 airports in Indonesia; the sampling technique is total sampling, meaning that all 148 airports in the population were included as samples. The coefficient of correlation (R) test showed that the construction and infrastructure development activities variable has a relatively strong relationship with the Airport Performance variable, but the value of Adjusted R Square shows that an increase in construction and infrastructure development activities is influenced by factors other than Airport Performance.
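
    The abstract reports a coefficient of correlation (R) and an Adjusted R Square without giving the computation. As a minimal illustrative sketch only, the snippet below shows how these two quantities are typically obtained for a simple one-predictor regression; the airport scores used here are made-up placeholder numbers, not data from the study.

    ```python
    # Minimal sketch (not the study's actual analysis): Pearson R and adjusted
    # R-squared for a simple linear regression on hypothetical airport scores.
    import numpy as np

    def adjusted_r_squared(y, y_pred, n_predictors):
        """Adjusted R^2 = 1 - (1 - R^2) * (n - 1) / (n - k - 1)."""
        ss_res = np.sum((y - y_pred) ** 2)
        ss_tot = np.sum((y - np.mean(y)) ** 2)
        r2 = 1.0 - ss_res / ss_tot
        n = len(y)
        return 1.0 - (1.0 - r2) * (n - 1) / (n - n_predictors - 1)

    # Hypothetical scores for a handful of airports (illustration only).
    infrastructure_activity = np.array([2.0, 3.5, 1.0, 4.2, 3.0, 5.1])
    airport_performance = np.array([55.0, 64.0, 48.0, 70.0, 61.0, 75.0])

    r = np.corrcoef(infrastructure_activity, airport_performance)[0, 1]
    slope, intercept = np.polyfit(infrastructure_activity, airport_performance, 1)
    predicted = slope * infrastructure_activity + intercept

    print(f"Pearson R = {r:.3f}")
    print(f"Adjusted R^2 = {adjusted_r_squared(airport_performance, predicted, 1):.3f}")
    ```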

  10. Is Afro-American Studies Research in Jeopardy? A Review of Recent Trends in Federal Research Support.

    ERIC Educational Resources Information Center

    Tucker, M. Belinda

    1984-01-01

    An investigation of Federal funding for Afro-American related projects for 1978 through 1983 shows that, during this period, research in the areas where Afro-American research is overwhelmingly concentrated--the social sciences and the humanities--has not kept pace with the substantial increases apparent in the funding of research in the…

  11. Measuring and Maximising Research Impact in Applied Social Science Research Settings. Good Practice Guide

    ERIC Educational Resources Information Center

    Stanwick, John; Hargreaves, Jo

    2012-01-01

    This guide describes the National Centre for Vocational Education Research (NCVER) approach to measuring impact using examples from its own case studies, as well as showing how to maximise the impact of applied social science research. Applied social science research needs to demonstrate that it is relevant and useful both to public policy and…

  12. The Research-Teaching Nexus: Using a Construction Teaching Event as a Research Tool

    ERIC Educational Resources Information Center

    Casanovas-Rubio, Maria del Mar; Ahearn, Alison; Ramos, Gonzalo; Popo-Ola, Sunday

    2016-01-01

    In principle, the research-teaching nexus should be seen as a two-way link, showing not only ways in which research supports teaching but also ways in which teaching supports research. In reality, the discussion has been limited almost entirely to the first of these practices. This paper presents a case study in which some student field-trip…

  13. Assessing the benefits of OHER (Office of Health and Environmental Research) research: Three case studies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nesse, R.J.; Callaway, J.M.; Englin, J.E.

    1987-09-01

    This research was undertaken to estimate the societal benefits and costs of selected past research performed for the Office of Health and Environmental Research (OHER) of the US Department of Energy (DOE). Three case studies of representative OHER and DOE research were performed. One of these, the acid rain case study, includes research conducted elsewhere in DOE. The other two cases were the OHER marine research program and the development of high-purity germanium that is used in radiation detectors. The acid rain case study looked at the research benefits and costs of furnace sorbent injection and duct injection, technologies that might reduce acid deposition precursors. Both appear to show benefits in excess of costs. We examined in detail one of the OHER marine research program's accomplishments - the increase in environmental information used by the Outer Continental Shelf leasing program to manage bidding for off-shore oil drilling. The results of an econometric model show that environmental information of the type supported by OHER is unequivocally linked to government and industry leasing decisions. The germanium case study indicated that the benefits of germanium radiation detectors were significant.

  14. Higher Education Research: Its Relationship to Policy and Practice. Issues in Higher Education Series.

    ERIC Educational Resources Information Center

    Teichler, Ulrich, Ed.; Sadlack, Jan, Ed.

    Contributions to this volume show the current state of self-reflection among researchers in higher education, including the conditions they face and the ways in which they research and communicate their findings. The chapters are: (1) "The Relationships between Higher Education Research and Higher Education Policy and Practice: The…

  15. A Hands-On Experience of English Language Teachers as Researchers

    ERIC Educational Resources Information Center

    Yayli, Demet

    2012-01-01

    This study presents the results of a teacher research project. The analysis aimed to explore both the four teacher researchers' interpretations of conducting research in English language teaching and the nature of their collaboration with their supervisor in the procedure. The results showed that qualitative data analysis and interpreting the…

  16. Research by External Agencies or Individuals in AISD.

    ERIC Educational Resources Information Center

    Austin Independent School District, TX.

    Abstracts of 34 research projects conducted in the Austin (Texas) Independent School District (AISD) are presented. A roster summarizing the projects by external researchers is also included. The roster shows, for each project, the project number, title, director, sponsor, schools where research was conducted, and whether a full report is on file.…

  17. Headmaster Leadership and Teacher Competence in Increasing Student Achievement in School

    ERIC Educational Resources Information Center

    Wahyuddin, Wawan

    2017-01-01

    The purposes of this research are to identify and analyze headmaster leadership and teacher competence in increasing student achievement in school. The research was conducted at a Private Islamic Junior High School in Serang, Banten, Indonesia, using descriptive and inferential methods. The results of this research showed that there is…

  18. University Faculty Value the CRA Designation--They Just Don't Realize It Yet!

    ERIC Educational Resources Information Center

    Cole, Kimberley W.

    2013-01-01

    The Certified Research Administrator (CRA) certification has enjoyed success and recognition among research administration professionals. However, this recognition is parochial and does not extend much past the walls of research administration. Results of a recent research study showed that Principal Investigators value and expect certain aspects…

  19. A Rationale for Mixed Methods (Integrative) Research Programmes in Education

    ERIC Educational Resources Information Center

    Niaz, Mansoor

    2008-01-01

    Recent research shows that research programmes (quantitative, qualitative and mixed) in education are not displaced (as suggested by Kuhn) but rather lead to integration. The objective of this study is to present a rationale for mixed methods (integrative) research programs based on contemporary philosophy of science (Lakatos, Giere, Cartwright,…

  20. Rattling Cages--Comments on "Hearing Voices" Commentary.

    ERIC Educational Resources Information Center

    Hall, Ruth L.

    1999-01-01

    The exploration of the uses of research by G. Russell and J. Bohan illustrates applied feminist research in the use of research results to give something back to the community. Their projects also show that it is possible to be creative in the way in which researchers do give something back to the community. (SLD)

  1. Learning Qualitative Research

    ERIC Educational Resources Information Center

    Gerhart, Lael

    2009-01-01

    In this article I explore through a narrative how I came to do a research project in East New York. I show how first contact was established, how local contacts were made, and how trust between my research participants and me was created. I then explore how the research topic evolved through informal conversations, open-ended interviews, and…

  2. Mixed or Complementary Messages: Making the Most of Unexpected Assessment Results

    ERIC Educational Resources Information Center

    Jones, Phil; Bauder, Julia; Engel, Kevin

    2016-01-01

    Grinnell College participated in ACRL's [Association of College and Research Libraries] first cohort of Assessment in Action (AiA), undertaking a mixed-methods action research project to assess the effectiveness of librarian-led research literacy sessions in improving students' research skills. The quantitative data showed that the quality of…

  3. Researching Language-in-Education in Diverse, Twenty-First Century Settings

    ERIC Educational Resources Information Center

    Martin-Jones, Marilyn

    2016-01-01

    The two opening sections of this Afterword show how the studies in this collection reflect wider trends in research related to language-in-education policy and practice in contemporary contexts of linguistic and cultural diversity: namely, the turn towards interpretive research and the diversification of research sites. The third section focuses…

  4. A Survey of Neural Network Publications.

    ERIC Educational Resources Information Center

    Vijayaraman, Bindiganavale S.; Osyk, Barbara

    This paper is a survey of publications on artificial neural networks published in business journals for the period ending July 1996. Its purpose is to identify and analyze trends in neural network research during that period. This paper shows which topics have been heavily researched, when these topics were researched, and how that research has…

  5. APPLYING RESEARCH FINDINGS IN COMPREHENSION TO CLASSROOM PRACTICE.

    ERIC Educational Resources Information Center

    WILLIAMS, RICHARD P.

    Research shows that, in spite of the favorable attitude toward scientific research, a gap exists between the initiation of an innovation and its wide acceptance. To help close the gap, teachers are encouraged to apply research findings to classroom practice and to determine their feasibility. Sixteen studies on comprehension cited in this article…

  6. Transnational Lives in European Educational Research

    ERIC Educational Resources Information Center

    Lawn, Martin

    2014-01-01

    Transnational collaboration by educational researchers in Europe has grown fast since the mid-1990s and the means to support it have become more easily accessible. A study of the growth of the European Educational Research Association (EERA) since its foundation in the mid-1990s shows how transnational research in European education began, and how…

  7. Getting inside the Insider Researcher: Does Race-Symmetry Help or Hinder Research?

    ERIC Educational Resources Information Center

    Vass, Greg

    2017-01-01

    This article engages with methodological concerns connected to insider education research and the "race-symmetry" shared between the researcher and teacher participants. To do this, race critical reflexive strategies are utilized to show how and why this practice productively contributed to the knowledge about race making constructed in…

  8. Changing over a Project: Research Supervision as a Conversation

    ERIC Educational Resources Information Center

    Clarke, Helen; Ryan, Charly

    2006-01-01

    Extracts from the written conversation between research student and supervisor show the nature of educative research supervision. The authors argue that researcher-supervisor relationships are methodological in nature as they shape and influence the people, the project and the field. Such relationships, which construct meanings, are complex. A…

  9. A Note on the Estimator of the Alpha Coefficient for Standardized Variables Under Normality

    ERIC Educational Resources Information Center

    Hayashi, Kentaro; Kamata, Akihito

    2005-01-01

    The asymptotic standard deviation (SD) of the alpha coefficient with standardized variables is derived under normality. The research shows that the SD of the standardized alpha coefficient becomes smaller as the number of examinees and/or items increases. Furthermore, this research shows that the degree of the dependence of the SD on the number of…
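
    For readers unfamiliar with the coefficient in question, the sketch below computes the standardized alpha from the mean inter-item correlation, alpha = k·r̄ / (1 + (k − 1)·r̄), on simulated item scores. It does not reproduce the paper's asymptotic SD derivation, and the simulated data and parameter choices are illustrative assumptions only.

    ```python
    # Minimal sketch (hypothetical data, not from the paper): the standardized
    # alpha coefficient, computed as alpha = k * r_bar / (1 + (k - 1) * r_bar),
    # where k is the number of items and r_bar the mean inter-item correlation.
    import numpy as np

    rng = np.random.default_rng(0)
    n_examinees, n_items = 200, 10
    # Hypothetical item scores driven by a shared common factor plus noise.
    common = rng.normal(size=(n_examinees, 1))
    items = common + rng.normal(scale=1.0, size=(n_examinees, n_items))

    corr = np.corrcoef(items, rowvar=False)         # k x k inter-item correlations
    off_diag = corr[~np.eye(n_items, dtype=bool)]   # drop the diagonal
    r_bar = off_diag.mean()
    alpha_std = n_items * r_bar / (1 + (n_items - 1) * r_bar)
    print(f"standardized alpha ≈ {alpha_std:.3f}")
    ```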

  10. Scenario-Based Case Study Method and the Functionality of the Section Called "From Production to Consumption" from the Perspective of Primary School Students

    ERIC Educational Resources Information Center

    Taneri, Ahu

    2018-01-01

    In this research, the aim was to present students' evaluations of the scenario-based case study method and to show the functionality of the studied section called "from production to consumption". Qualitative research methods and content analysis were used to reveal participants' experiences and meaningful relations regarding…

  11. The Changing Value of Vigorous Activity and the Paradox of Utilising Exercise as Punishment in Physical Education

    ERIC Educational Resources Information Center

    Aasland, Erik; Walseth, Kristin; Engelsrud, Gunn

    2017-01-01

    Background: Previous research on physical education (PE) teaching practice indicates that an exercise physiology discourse has assumed a dominant position within the field. Research shows that PE teachers are likely to emphasise physical fitness training in their teaching, and PE teachers seem to appreciate pupils who show high levels of physical…

  12. Advice from Blind Teachers on How to Teach Statistics to Blind Students

    ERIC Educational Resources Information Center

    Godfrey, A. Jonathan R.; Loots, M. Theodor

    2015-01-01

    Blind students are bound to make up a very small part of the population most university lecturers will encounter during their careers. Research to date shows that good communication between staff and student improves the chances of a successful outcome for both parties. The research does show, however, that the exercise seems to be one of…

  13. The U.S. Counterterrorism Strategy: Addressing Radical Ideologies

    DTIC Science & Technology

    2015-06-12

    Prophet Muhammad (Readings, Brandon, and Phelps 2010, 267). Assumptions This study has two primary assumptions supporting the research. The first...negative circumstances and failed or oppressive states (Wiktorowicz 2004, 4). Research studies on the sociology and psychology of terrorists show that... psychological issues. However, studies show that it is not possible to extrapolate a definitive root cause for radicalization or violent extremism

  14. Broken Stringers Can Be Recovered By Splicing, Research By Pallet Lab Shows

    Treesearch

    Chaille Brindley

    1997-01-01

    With the increasing prices of lumber, pallet manufacturers and recyclers are looking to squeeze every dollar out of their operations. A recent study on stringer repair reveals another potential area of the business that may be squeezed. The study by Dr. Marshall White, director of the pallet and container research laboratory at Virginia Tech, shows broken stringers can...

  15. Children's and Young People's Reading Habits and Preferences: The Who, What, Why, Where and When

    ERIC Educational Resources Information Center

    Clark, Christina; Foster, Amelia

    2005-01-01

    This report, based on a recent survey of over 8,000 primary and secondary pupils in England, explores why some pupils choose to read and others do not. The research literature shows that reading for pleasure benefits children in numerous ways. Yet, research also shows that young people's reading enjoyment may be declining. Given current political…

  16. Developing measures of community-relevant outcomes for violence prevention programs: a community-based participatory research approach to measurement.

    PubMed

    Hausman, Alice J; Baker, Courtney N; Komaroff, Eugene; Thomas, Nicole; Guerra, Terry; Hohl, Bernadette C; Leff, Stephen S

    2013-12-01

    Community-Based Participatory Research is a research paradigm that encourages community participation in designing and implementing evaluation research, though the actual outcome measures usually reflect the "external" academic researchers' view of program effect and the policy-makers' needs for decision-making. This paper describes a replicable process by which existing standardized psychometric scales commonly used in youth-related intervention programs were modified to measure indicators of program success defined by community partners. This study utilizes a secondary analysis of data gathered in the context of a community-based youth violence prevention program. Data were retooled into new measures developed using items from the Alabama Parenting Questionnaire, the Hare Area Specific Self-Esteem Scale, and the Youth Asset Survey. These measures evaluated two community-defined outcome indicators, "More Parental Involvement" and "Showing Kids Love." Results showed that existing scale items can be re-organized to create measures of community-defined outcomes that are psychometrically reliable and valid. Results also show that the community definitions of parent or parenting caregivers exemplified by the two indicators are similar to how these constructs have been defined in previous research, but they are not synonymous. There are nuanced differences that are important and worthy of better understanding, in part through better measurement.

  17. Materials sciences research. [research facilities, research projects, and technical reports of materials tests

    NASA Technical Reports Server (NTRS)

    1973-01-01

    Research projects involving materials research conducted by various international test facilities are reported. Much of the materials research is classified in the following areas: (1) acousto-optic, acousto-electric, and ultrasonic research, (2) research for elucidating transport phenomena in well characterized oxides, (3) research in semiconductor materials and semiconductor devices, (4) the study of interfaces and interfacial phenomena, and (5) materials research relevant to natural resources. Descriptions of the individual research programs are listed alphabetically by the name of the author and show all personnel involved, resulting publications, and associated meeting speeches.

  18. The Influence of Materialism on Purebred Dogs' Welfare Among Two Different Generations in Colombia (South America).

    PubMed

    Luna-Cortés, Gonzalo

    2018-03-27

    Some consumers in Colombia show a clear preference for purebred dogs. At the same time, there are many abandoned dogs on the streets and in shelters in this country. Previous research has revealed that appearances of the breeds influence the caregivers' (owners') choice. A choice based on appearances has been connected with materialism in the psychology and consumer behavior literature. Buying purebred dogs based on materialistic standards could affect the welfare of these nonhuman animals. With the use of quantitative research and the methodology of structural equation modeling, this research demonstrated that more materialistic consumers in Colombia have purebred dogs who, in the owners' opinions, show more behavioral problems. Furthermore, the results showed that materialism influenced the owners' intentions to abandon their companion animals when they perceived these problems. Finally, this research examined the moderating effect of generational segmentation regarding these relationships. It was observed that the intention to abandon the dogs was greater among members of Generation X than among members of Generation Y.

  19. New targets for immunotherapy-based treatment of HPV-related cancers | Center for Cancer Research

    Cancer.gov

    Scientists at the Center for Cancer Research and three other cancer research institutions show that immunotherapy treatments that resulted in complete regression of metastatic cervical cancer largely targeted two non-viral antigens.

  20. Study shows women fail to land top grants

    NASA Astrophysics Data System (ADS)

    Jenner, Nicola

    2014-05-01

    Women in the UK are less successful than men at securing research council funding at almost every stage of their careers, according to an analysis published by Research Councils UK (RCUK) - the umbrella organization for the country's seven research councils.

  1. An overview of Quality Management System implementation in a research laboratory

    NASA Astrophysics Data System (ADS)

    Molinéro-Demilly, Valérie; Charki, Abdérafi; Jeoffrion, Christine; Lyonnet, Barbara; O'Brien, Steve; Martin, Luc

    2018-02-01

    The aim of this paper is to show the advantages of implementing a Quality Management System (QMS) in a research laboratory in order to improve the management of risks specific to research programmes and to increase the reliability of results. This paper also presents experience gained from feedback following the implementation of the Quality process in a research laboratory at INRA, the French National Institute for Agronomic Research, and details the various challenges encountered and solutions proposed to help achieve smoother adoption of a QMS process. The 7Ms (Management, Measurement, Manpower, Methods, Materials, Machinery, Mother-nature) methodology based on the Ishikawa 'Fishbone' diagram is used to show the effectiveness of the actions considered by a QMS, which involve both the organization and the activities of the laboratory. Practical examples illustrate the benefits and improvements observed in the laboratory.

  2. The inaction effect in the psychology of regret.

    PubMed

    Zeelenberg, Marcel; van de Bos, Kees; van Dijk, Eric; Pieters, Rik

    2002-03-01

    Previous research showed that decisions to act (i.e., actions) produce more regret than decisions not to act (i.e., inactions). This previous research focused on decisions made in isolation and ignored that decisions are often made in response to earlier outcomes. The authors show in 4 experiments that these prior outcomes may promote action and hence make inaction more abnormal. They manipulated information about a prior outcome. As hypothesized, when prior outcomes were positive or absent, people attributed more regret to action than to inaction. However, as predicted and counter to previous research, following negative prior outcomes, more regret was attributed to inaction, a finding that the authors label the inaction effect. Experiment 4, showing differential effects for regret and disappointment, demonstrates the need for emotion-specific predictions.

  3. Tools for Linking Research and Practice in the Helping Professions: Research Abstract Worksheets and Personal Reviews of the Literature.

    ERIC Educational Resources Information Center

    Burlingame, Martin

    This document is comprised of four chapters that show how to use research-abstract worksheets and personal reviews of the literature as tools for linking research and practice in the helping professions. The research tools help to condense lengthy reports, place them into a consistent format, and actively involve the information seeker. Chapter 1…

  4. Establishing an Intellectual and Theoretical Foundation for the After Action Review Process - A Literature Review

    DTIC Science & Technology

    2011-04-01

    U.S. Army Research Institute for the Behavioral and Social Sciences, Technology-Based Training Research Unit (Stephen L. Goldberg, Chief), April 2011. ...statements of approval voiced by command elements. Rather, researchers must complete a program of transfer of training studies to show that variations in

  5. Gun Shows and Gun Violence: Fatally Flawed Study Yields Misleading Results

    PubMed Central

    Hemenway, David; Webster, Daniel; Pierce, Glenn; Braga, Anthony A.

    2010-01-01

    A widely publicized but unpublished study of the relationship between gun shows and gun violence is being cited in debates about the regulation of gun shows and gun commerce. We believe the study is fatally flawed. A working paper entitled “The Effect of Gun Shows on Gun-Related Deaths: Evidence from California and Texas” outlined this study, which found no association between gun shows and gun-related deaths. We believe the study reflects a limited understanding of gun shows and gun markets and is not statistically powered to detect even an implausibly large effect of gun shows on gun violence. In addition, the research contains serious ascertainment and classification errors, produces results that are sensitive to minor specification changes in key variables and in some cases have no face validity, and is contradicted by 1 of its own authors’ prior research. The study should not be used as evidence in formulating gun policy. PMID:20724672

  6. Gun shows and gun violence: fatally flawed study yields misleading results.

    PubMed

    Wintemute, Garen J; Hemenway, David; Webster, Daniel; Pierce, Glenn; Braga, Anthony A

    2010-10-01

    A widely publicized but unpublished study of the relationship between gun shows and gun violence is being cited in debates about the regulation of gun shows and gun commerce. We believe the study is fatally flawed. A working paper entitled "The Effect of Gun Shows on Gun-Related Deaths: Evidence from California and Texas" outlined this study, which found no association between gun shows and gun-related deaths. We believe the study reflects a limited understanding of gun shows and gun markets and is not statistically powered to detect even an implausibly large effect of gun shows on gun violence. In addition, the research contains serious ascertainment and classification errors, produces results that are sensitive to minor specification changes in key variables and in some cases have no face validity, and is contradicted by 1 of its own authors' prior research. The study should not be used as evidence in formulating gun policy.

  7. Academic Excellence: The Role of Research in the Physical Sciences at Undergraduate Institutions.

    ERIC Educational Resources Information Center

    Doyle, Michael P., Ed.

    Chapters of this collection show that students benefit from a research-based teaching environment, and that students who have the opportunity for research complete their science programs in greater numbers than those who do not. The chapters of section 1, "Achieving Excellence," are: (1) "The Role of Research at Undergraduate Institution: Why Is…

  8. Articles Published in Six School Psychology Journals from 2005-2009: Where's the Intervention Research?

    ERIC Educational Resources Information Center

    Villarreal, Victor; Gonzalez, Jorge E.; McCormick, Anita S.; Simek, Amber; Yoon, Hyunhee

    2013-01-01

    This article reports on a content analysis of six school psychology journals spanning the years 2005-2009, with a particular focus on published intervention research. The analysis showed that (a) research articles were the most frequently published, with the largest category being descriptive research; (b) the percentage of intervention studies…

  9. An Alternative Collaborative Supervision Practice between University-Based Teachers and School-Based Teachers

    ERIC Educational Resources Information Center

    Steele, Annfrid R.

    2017-01-01

    There is an increased focus in teacher education on research-based teaching as a means to develop a more research-based professional knowledge. However, research from several Western countries shows that neither school-based nor university-based teachers are familiar with how to integrate research-based knowledge in professional teacher practice.…

  10. X-38 Ship #2 Mated to B-52 Mothership in Flight

    NASA Image and Video Library

    1999-07-09

    This photo shows one of the X-38 lifting-body research vehicles mated to NASA's B-52 mothership in flight prior to launch. The B-52 has been a workhorse for the Dryden Flight Research Center for more than 40 years, carrying numerous research vehicles aloft and conducting a variety of other research flight experiments.

  11. Improving Your Reflective Practice through Stories of Practitioner Research. Pen Green Books for Early Years Educators

    ERIC Educational Resources Information Center

    Arnold, Cath, Ed.

    2012-01-01

    "Improving Your Reflective Practice through Stories of Practitioner Research" shows how research has informed and created effective and valuable reflective practice in early years education, and offers depth to the arguments for a research-orientated stance to this vital field of study. This thought-provoking text explores and documents a variety…

  12. What Does Good Education Research Look Like? Conducting Educational Research

    ERIC Educational Resources Information Center

    Yates, Lyn

    2004-01-01

    This book explains the debates that bedevil education research--for example that it is low quality, or not scientific enough, or not useful enough--and shows how research in education must meet different demands in different places, times and conditions. A major part of the book provides detailed analyses and guidance to different areas in which…

  13. The Research Interview as Discourses Crossing Swords: The Researcher and Apprentice on Crossing Roads

    ERIC Educational Resources Information Center

    Tanggaard, Lene

    2007-01-01

    This article presents a conception of the qualitative research interview as discourses crossing swords. The article draws on examples showing how the researchers' view on learning is challenged by the interviewed apprentices. The apprentices do not assume learning in itself to be an important aspect of their lives. They consider the process of…

  14. The Correlation between the Communication of the Health Risks of Ecstasy (MDMA) and the Drug's Use among College Students.

    ERIC Educational Resources Information Center

    Campe, Brian; Frye, Kristin; Hood, Caitlin; Kuznekoff, Jeffrey; Parsons, Michael

    The purpose of this study is to explore the relationship between college students and their awareness of the hazardous effects of the drug Ecstasy. Ecstasy use has risen among college students even though readily available research shows that Ecstasy use has extremely hazardous effects on its users. Research also shows a lack of communication about…

  15. What Is the Perceived Value of Weekly Participation of Art Classes in the Elementary Grades?

    ERIC Educational Resources Information Center

    Gallagher, Adam

    2013-01-01

    This research study is dedicated to the importance of the arts and the importance of being taught art by a licensed art educator. Bringing in his personal perspective as an art teacher, the researcher shows his dedication to demonstrating how the arts impact lives on a daily basis. The overriding question of this study is,…

  16. Research mapping in North Sumatra based on Scopus

    NASA Astrophysics Data System (ADS)

    Nasution, M. K. M.; Sitepu, R.; Rosmayati; Bakti, D.; Hardi, S. M.

    2018-02-01

    Research is needed to improve the capacity of human resources to manage natural resources for human well-being. Research is done by institutions such as universities or research institutes, but a picture of research related to human welfare is not easy to obtain. Since research can be evidenced through scientific publications, databases of scientific publications can be used to observe research behaviour. Research mapping in North Sumatra needs to be done to see how well the research conducted matches development needs in North Sumatra; as an illustration, the mapping for Universitas Sumatera Utara shows that its research has 60% strength, especially in the exact sciences.

  17. Qualitative Health Research with Children.

    ERIC Educational Resources Information Center

    Ireland, Lorraine; Holloway, Immy

    1996-01-01

    Uses a study about children's experience of asthma to show that qualitative research with children has inherent difficulties relating to access and ethical and developmental issues. Asserts that because of children's stage of development and the asymmetrical relationship between researcher and informants, adequate safeguards and awareness of these…

  18. Behavior, Experience and Expression: Some Research Considerations.

    ERIC Educational Resources Information Center

    Romanyshyn, Robert D.

    Utilizing research conducted on nostalgia, this paper shows how a phenomenological approach assists in understanding behavior, experience and expression. Moreover, a clearer understanding of them aids one's research with and comprehension of nostalgia. Human action can be studied from the experiential, behavioral and expressive perspectives. These…

  19. Sex Differences in Influenceability

    ERIC Educational Resources Information Center

    Eagly, Alice H.

    1978-01-01

    Examines the hypothesis that women are more easily influenced than men by reviewing the literature on persuasion and conformity research. Persuasion research and conformity studies not involving group pressure show scant empirical support for sex differences. For group pressure conformity research, a substantial minority of studies support the…

  20. Fragments of peer review: A quantitative analysis of the literature (1969-2015)

    PubMed Central

    Grimaldo, Francisco; Marušić, Ana

    2018-01-01

    This paper examines research on peer review between 1969 and 2015 by looking at records indexed in the Scopus database. Although it is often argued that peer review has been poorly investigated, we found that the number of publications in this field has doubled since 2005. Half of this work was indexed as research articles, a third as editorial notes and literature reviews, and the rest as book chapters or letters. We identified the most prolific and influential scholars, the most cited publications and the most important journals in the field. Co-authorship network analysis showed that research on peer review is fragmented, with the largest group of co-authors including only 2.1% of the whole community. Co-citation network analysis indicated a fragmented structure also in terms of knowledge. This shows that despite its central role in research, peer review has been examined only through small-scale research projects. Our findings would suggest that there is a need to encourage collaboration and knowledge sharing across different research communities. PMID:29466467
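
    As a rough illustration of the co-authorship fragmentation measure described above (the share of the community contained in the largest group of connected co-authors), here is a minimal sketch using the networkx library on a tiny made-up set of author lists; the names and paper counts are placeholders, not the Scopus records analysed in the paper.

    ```python
    # Minimal sketch (made-up records, not the Scopus data): fragmentation of a
    # co-authorship network as the share of authors in the largest connected component.
    from itertools import combinations
    import networkx as nx

    papers = [                     # each paper represented only by its author list
        ["Alice", "Bob"],
        ["Bob", "Carol"],
        ["Dave", "Erin"],
        ["Frank"],
    ]

    g = nx.Graph()
    for authors in papers:
        g.add_nodes_from(authors)
        g.add_edges_from(combinations(authors, 2))   # link every co-author pair

    largest = max(nx.connected_components(g), key=len)
    print(f"largest component covers {len(largest) / g.number_of_nodes():.1%} of authors")
    ```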

  1. Evidence-based nursing practice: both state of the art in general and specific to pressure sores.

    PubMed

    Buss, I C; Halfens, R J; Abu-Saad, H H; Kok, G

    1999-01-01

    The importance of research-based practice in nursing has been frequently stressed, and a number of nursing studies have been conducted whose results enable nursing to improve knowledge and practice. This study reports a literature review in which the current status of knowledge and research utilization with regard to pressure sores is described. This review first gives an overview of studies on knowledge utilization in general and shows that the spontaneous diffusion of knowledge is inappropriate. Furthermore, an overview of planned research utilization activities focusing on pressure sore prevention and treatment in nursing is presented. The results of these studies show that planned research utilization activities performed in individual organizations lead to positive outcomes in almost all cases. Therefore, it could be concluded that implementing planned research utilization activities in individual health care institutions seems to be an effective strategy to decrease pressure sore incidence and prevalence rates.

  2. The correlation between concept mastery and stage of moral reasoning student using socio-scientific issues on reproductive system material

    NASA Astrophysics Data System (ADS)

    Lestari, T. A.; Saefudin; Priyandoko, D.

    2018-05-01

    This research aims to analyze the correlation between concept mastery and students' stage of moral reasoning. The research method is a correlational study with a stratified random sampling technique. The population is all eleventh grade students in senior high schools in Bandung. Data were collected from 297 eleventh grade students at three senior high schools in Bandung, using an examination instrument and a stage-of-moral-reasoning questionnaire. The stage of moral reasoning in this research consists of two categories of students' moral reasoning, based on 16 questionnaire items as indicators from Jones et al. (2007). The results of this research show that the average stage of moral reasoning of the eleventh grade students is the advanced stage, and that concept mastery and stage of moral reasoning have a positive correlation of 0.370. This research provides an overview of eleventh grade students' concept mastery and stage of moral reasoning using socio-scientific issues.

  3. Crossmaps: Visualization of overlapping relationships in collections of journal papers

    PubMed Central

    Morris, Steven A.; Yen, Gary G.

    2004-01-01

    A crossmapping technique is introduced for visualizing multiple and overlapping relations among entity types in collections of journal articles. Groups of entities from two entity types are crossplotted to show correspondence of relations. For example, author collaboration groups are plotted on the x axis against groups of papers (research fronts) on the y axis. At the intersection of each pair of author group/research front pairs a circular symbol is plotted whose size is proportional to the number of times that authors in the group appear as authors in papers in the research front. Entity groups are found by agglomerative hierarchical clustering using conventional similarity measures. Crossmaps comprise a simple technique that is particularly suited to showing overlap in relations among entity groups. Particularly useful crossmaps are: research fronts against base reference clusters, research fronts against author collaboration groups, and research fronts against term co-occurrence clusters. When exploring the knowledge domain of a collection of journal papers, it is useful to have several crossmaps of different entity pairs, complemented by research front timelines and base reference cluster timelines. PMID:14762168
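
    To make the crossplot idea concrete, the following sketch draws a crossmap-style figure for a small hypothetical co-occurrence matrix (author groups × research fronts), with marker area proportional to the overlap count. The group labels, counts, and output filename are assumptions for illustration, not the authors' data or code.

    ```python
    # Minimal crossmap sketch (hypothetical counts, not the paper's data): plot a
    # circle at each (author group, research front) intersection whose area is
    # proportional to how many papers in that front were written by that group.
    import numpy as np
    import matplotlib.pyplot as plt

    counts = np.array([   # rows = author groups, columns = research fronts
        [5, 0, 1],
        [0, 3, 2],
        [1, 1, 4],
    ])

    rows, cols = np.nonzero(counts)
    plt.scatter(cols, rows, s=counts[rows, cols] * 80)   # marker area ~ overlap count
    plt.xticks(range(counts.shape[1]), ["front A", "front B", "front C"])
    plt.yticks(range(counts.shape[0]), ["group 1", "group 2", "group 3"])
    plt.xlabel("research fronts")
    plt.ylabel("author collaboration groups")
    plt.title("crossmap sketch")
    plt.tight_layout()
    plt.savefig("crossmap_sketch.png")
    ```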

  4. Training Students’ Science Process Skills through Didactic Design on Work and Energy

    NASA Astrophysics Data System (ADS)

    Ramayanti, S.; Utari, S.; Saepuzaman, D.

    2017-09-01

    Science Process Skills (SPS) have not been optimally trained in students during learning activities. The aim of this research is to find ways to train SPS on the subject of Work and Energy. A one-shot case study design was used with 32 students in one of the high schools in Bandung. The students' SPS responses were analyzed with SPS-based assessment portfolios developed for the study. The results showed that the didactic design created to train the skills of identifying variables and formulating hypotheses, and the experiment activity, produced visible development. However, the didactic design for improving the students' predicting skills is still not optimal. Therefore, future studies need to develop didactic designs on the subject of Work and Energy that exercise these skills.

  5. Calculable People? Standardising Assessment Guidelines for Alzheimer’s Disease in 1980s Britain

    PubMed Central

    Wilson, Duncan

    2017-01-01

    This article shows how funding research on Alzheimer’s disease became a priority for the British Medical Research Council (MRC) in the late 1970s and 1980s, thanks to work that isolated new pathological and biochemical markers and showed that the disease affected a significant proportion of the elderly population. In contrast to histories that focus on the emergence of new and competing theories of disease causation in this period, I argue that concerns over the use of different assessment methods ensured the MRC’s immediate priority was standardising the ways in which researchers identified and recorded symptoms of Alzheimer’s disease in potential research subjects. I detail how the rationale behind the development of standard assessment guidelines was less about arriving at a firm diagnosis and more about facilitating research by generating data that could be easily compared across the disciplines and sites that constitute modern biomedicine. Drawing on criticism of specific tests in the MRC’s guidelines, which some psychiatrists argued were ‘middle class biased’, I also show that debates over standardisation did not simply reflect concerns specific to the fields or areas of research that the MRC sought to govern. Questions about the validity of standard assessment guidelines for Alzheimer’s disease embodied broader concerns about education and social class, which ensured that distinguishing normal from pathological in old age remained a contested and historically contingent process. PMID:28901868

  6. Bifidobacteria

    MedlinePlus

    ... colitis. Airway infections. Most research shows that using probiotics containing bifidobacteria helps prevent airway infections such as ... helps prevent traveler's diarrhea when used with other probiotics such as lactobacillus or streptococcus. Ulcerative colitis. Research ...

  7. Small Colleges and New Faculty Pay

    ERIC Educational Resources Information Center

    Marthers, Paul; Parker, Jeff

    2008-01-01

    Do liberal arts colleges act like research universities when they seek to appoint new faculty members? Evidence shows that research universities bid aggressively for talent, using discretionary salary policies to achieve a diverse professoriate, appoint research stars, and fill vacancies in fields where market forces require differential salaries.…

  8. 32 CFR 701.42 - Categories of requesters-applicable fees.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... an institution of vocational education, which operates a program or programs of scholarly research... scholarly research. Requesters must reasonably describe the records sought. (2) Requesters must show that... sought for commercial use, but in furtherance of scholarly research. (3) Fees shall be waived or reduced...

  9. Helping Teens Develop Healthy Social Skills and Relationships: What the Research Shows about Navigating Adolescence. American Teens. Child Trends Research Brief.

    ERIC Educational Resources Information Center

    Hair, Elizabeth C.; Jager, Justin; Garrett, Sarah B.

    As adolescents mature, their social skills are called upon to form and maintain relationships. Noting that the quality of those relationships has important consequences for adolescent development, this research brief presents information on social competency in adolescence. More than 360 research studies were reviewed to examine the factors that…

  10. Supervision and Early Career Work Experiences of Estonian Humanities Researchers under the Conditions of Project-Based Funding

    ERIC Educational Resources Information Center

    Eigi, Jaana; Põiklik, Pille; Lõhkivi, Endla; Velbaum, Katrin

    2014-01-01

    We analyze a series of interviews with Estonian humanities researchers to explore topics related to the beginning of academic careers and the relationships with supervisors and mentors. We show how researchers strive to have meaningful relationships and produce what they consider quality research in the conditions of a system that is very strongly…

  11. Marking Time, Making Methods: Temporality and Untimely Dilemmas in the Sociology of Youth and Educational Change

    ERIC Educational Resources Information Center

    McLeod, Julie

    2017-01-01

    This article explores how temporality and temporal regimes might be engaged in qualitative research in the sociology of education, proposing that such questions matter in relation to how research is done, not only to the topics and themes researched. The article shows how temporality enters into research designs, practices and imaginaries, arguing…

  12. No need to go! Workplace studies and the resources of the revised National Statement.

    PubMed

    Cordner, Christopher; Thomson, Colin

    2007-07-01

    In their article 'Unintended consequences of human research ethics committees: au revoir workplace studies?', Greg Bamber and Jennifer Sappey set out some real obstacles, in the practices and attitudes of some Human Research Ethics Committees (HRECs), to research in the social sciences and particularly in industrial sociology. They sheet home these attitudes and practices to the way in which various statements in the NHMRC's National Statement [1999] are implemented, which they say is often in 'conflict with an important stream of industrial sociological research' in Australia. They do not discuss the recently completed revision of the NS. We undertake to show that the revised National Statement meets their concerns about research in industrial sociology, and to draw attention to the resources of the revised National Statement that engage with those concerns. A more general aim is to display the greater scope, in the revised National Statement, for researchers to show to HRECs that their research is justified by virtue of its reflecting the established methodology and traditions of their discipline. The revised National Statement, we suggest, provides for a more flexible and responsive approach than its predecessor to the ethical review of many areas of research.

  13. Balanced performance measurement in research hospitals: the participative case study of a haematology department.

    PubMed

    Catuogno, Simona; Arena, Claudia; Saggese, Sara; Sarto, Fabrizia

    2017-08-03

    The paper aims to review, design and implement a multidimensional performance measurement system for a public research hospital in order to address the complexity of its multifaceted stakeholder requirements and its double institutional aim of care and research. The methodology relies on a participative case study performed by external researchers in close collaboration with the staff of an Italian research hospital. The paper develops and applies a customized version of balanced scorecard based on a new set of performance measures. Our findings suggest that it can be considered an effective framework for measuring the research hospital performance, thanks to a combination of generalizable and context-specific factors. By showing how the balanced scorecard framework can be customized to research hospitals, the paper is especially of interest for complex healthcare organizations that are implementing management accounting practices. The paper contributes to the body of literature on the application of the balanced scorecard in healthcare through an examination of the challenges in designing and implementing this multidimensional performance tool. This is one of the first papers that show how the balanced scorecard model can be adapted to fit the specific requirements of public research hospitals.

  14. Sequim Marine Research Laboratory routine environmental measurements during CY-1978. [Monitoring for laboratory-related radioactivity and pollutants in environment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Houston, J.R.; Blumer, P.J.

    1979-03-01

    Environmental data collected during 1978 in the vicinity of the Marine Research Laboratory show continued compliance with all applicable state and federal regulations and furthermore show no detectable change from conditions that existed in previous years. Samples collected for radiological analysis included soil, drinking water, bay water, clams, and seaweed. Radiation dose rates at 1 meter aboveground were also measured.

  15. Optical properties of different graphene concentration in P3HT

    NASA Astrophysics Data System (ADS)

    Shariff, N. S. M.; Sarah, M. S. P.; Rusop, M.

    2018-05-01

    The discovery of graphene has led to many new findings in materials research. P3HT is a polymer widely used in photovoltaic studies, but its main problem is a low photocurrent due to its low electron mobility. The objective of this research is therefore to increase the mobility in order to achieve a higher photocurrent. In this research, P3HT is mixed with graphene and used as an active layer, fabricated by a spin coating technique. Optical properties such as absorbance, transmittance and photoluminescence are characterized. Each optical property shows a positive result when compared to the P3HT layer. A concentration of 2 wt% shows the optimum absorbance and transmittance, while a quenching effect can be seen when compared to the P3HT layer.

  16. Status of Suspended Particulate Matters Pollution at Traditional Markets in Makassar City

    NASA Astrophysics Data System (ADS)

    Suryani, Sri; Fahrunnisa

    2018-03-01

    Research on the status of suspended particulate matter pollution in four traditional markets located in Makassar city has been done. The purpose of this research is to determine the air quality in the traditional market areas, especially that caused by suspended particulate matter. The background of this research is that traders in traditional markets generally peddle their goods along dusty roads, and suspended particulate matter in the dust can be inhaled when vehicles pass; these suspended particulate matter pollutants can cause lung disease. The results showed that the level of suspended particulate matter pollution fluctuates every year depending on the local wind speed, humidity, and temperature. The results also showed values that exceed the standard set by the Governor of South Sulawesi regulation.

  17. A study on scientific collaboration and co-authorship patterns in library and information science studies in Iran between 2005 and 2009.

    PubMed

    Siamaki, Saba; Geraei, Ehsan; Zare-Farashbandi, Firoozeh

    2014-01-01

    Scientific collaboration is among the most important subjects in scientometrics, and many studies have investigated this concept. The goal of the current study is the investigation of scientific collaboration and co-authorship patterns of researchers in the field of library and information science in Iran between 2005 and 2009. The study uses a scientometrics method. The statistical population consists of 942 documents published in Iranian library and information science journals between 2005 and 2009. The collaboration coefficient, collaboration index (CI), and degree of collaboration (DC) were used for data analysis. The findings showed that among the 942 investigated documents, 506 documents (53.70%) were created by a single researcher and 436 documents (46.30%) were the result of collaboration between two or more researchers. Also, the highest rank among the different authorship patterns belonged to the National Journal of Librarianship and Information Organization (code H). The average collaboration coefficient for library and information science researchers in the investigated time frame was 0.23; the closer this coefficient is to 1, the higher the level of collaboration between authors, and a coefficient near zero shows a tendency to prefer single-authored articles. The highest collaboration index, with an average of 1.92 authors per paper, was seen in the Iranian calendar year 1388. The five-year collaboration index in library and information science in Iran was 1.58, and the average degree of collaboration between researchers in the investigated papers was 0.46, which shows that library and information science researchers have a tendency toward co-authorship. Co-authorship has increased in recent years, reaching its highest level in year 1388, and the researchers' collaboration coefficient also shows a relative increase between the years 1384 and 1388. The National Journal of Librarianship and Information Organization has the highest rank among all the investigated journals based on the collaboration coefficient, collaboration index (CI), and degree of collaboration (DC).
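
    The three measures named in the abstract are standard scientometric indices. Assuming their usual definitions (collaboration index as mean authors per paper, degree of collaboration as the share of multi-authored papers, and the collaborative coefficient in the sense of Ajiferuke and colleagues), the sketch below computes them for a hypothetical list of author counts rather than the study's 942 documents.

    ```python
    # Minimal sketch (hypothetical counts, not the study's data) of the indices
    # named in the abstract, under their standard definitions:
    #   CI (collaboration index)      = total authorships / total papers
    #   DC (degree of collaboration)  = multi-authored papers / total papers
    #   CC (collaborative coefficient) = 1 - sum(f_j / j) / N,
    # where f_j is the number of papers with j authors and N the total papers.
    from collections import Counter

    authors_per_paper = [1, 1, 2, 3, 1, 2, 4, 1, 2, 1]   # hypothetical sample
    n_papers = len(authors_per_paper)
    freq = Counter(authors_per_paper)                     # f_j: papers with j authors

    ci = sum(authors_per_paper) / n_papers
    dc = sum(1 for j in authors_per_paper if j > 1) / n_papers
    cc = 1 - sum(f / j for j, f in freq.items()) / n_papers

    print(f"CI = {ci:.2f}, DC = {dc:.2f}, CC = {cc:.2f}")
    ```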

  18. NCVER Building Researcher Capacity Scholarship: A Rural Participant's Perspective

    ERIC Educational Resources Information Center

    Bowden, Anne

    2015-01-01

    This article uses an autoethnographic methodology to describe the experience of a novice practitioner-researcher engaging in the NCVER community of practice (CoP). The author's experience of the journey from vocational education and training (VET) practitioner to practitioner-researcher is recorded. The findings show that the numerous aspirations…

  19. Pointer Animation Implementation at Development of Multimedia Learning of Java Programming

    ERIC Educational Resources Information Center

    Rusli, Muhammad; Atmojo, Yohanes Priyo

    2015-01-01

    This is development research that uses the results of previous research related to the development of interactive, learner-controlled multimedia learning, especially the effectiveness and efficiency of multimedia learning for content developed with a pointer animation implementation showing the content in…

  20. Research on the Treatment of Couple Distress

    ERIC Educational Resources Information Center

    Lebow, Jay L.; Chambers, Anthony L.; Christensen, Andrew; Johnson, Susan M.

    2012-01-01

    This article reviews the research on couple therapy over the last decade. The research shows that couple therapy positively impacts 70% of couples receiving treatment. The effectiveness rates of couple therapy are comparable to the effectiveness rates of individual therapies and vastly superior to control groups not receiving treatment. The…

  1. Navigating the Challenges Arising from University-School Collaborative Action Research

    ERIC Educational Resources Information Center

    Yuan, Rui; Mak, Pauline

    2016-01-01

    Despite increasing evidence showing the benefits language teachers can reap from university-school collaborative action research (CAR), scant attention has been given to how university researchers collaborate with language teachers, what challenges they might encounter, and how they navigate such challenges in CAR. To fill the gap, this study…

  2. 13 CFR 102.6 - Fees.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... solely for the purpose of conducting scientific research the results of which are not intended to promote... records are not sought for a commercial use but are sought to further scientific research. SBA will charge... research. An educational institution requester must show that the request is authorized by and is made...

  3. 13 CFR 102.6 - Fees.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... solely for the purpose of conducting scientific research the results of which are not intended to promote... records are not sought for a commercial use but are sought to further scientific research. SBA will charge... research. An educational institution requester must show that the request is authorized by and is made...

  4. 13 CFR 102.6 - Fees.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... solely for the purpose of conducting scientific research the results of which are not intended to promote... records are not sought for a commercial use but are sought to further scientific research. SBA will charge... research. An educational institution requester must show that the request is authorized by and is made...

  5. 13 CFR 102.6 - Fees.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... solely for the purpose of conducting scientific research the results of which are not intended to promote... records are not sought for a commercial use but are sought to further scientific research. SBA will charge... research. An educational institution requester must show that the request is authorized by and is made...

  6. Educational Research as a Practical Science

    ERIC Educational Resources Information Center

    Carr, Wilfred

    2007-01-01

    This article offers a philosophical contribution to recent debates about the assessment of quality in educational research. It shows how criticisms of the quality of educational research--pointing to its failure to meet epistemic criteria of rigour and practical criteria of relevance--are an inevitable manifestation of the flawed assumption that…

  7. Working Environment and the Research Productivity of Doctoral Students in Management

    ERIC Educational Resources Information Center

    Kim, Kiwan; Karau, Steven J.

    2010-01-01

    The authors examined the influence of creative personality and creative working environment on the research productivity of doctoral students in business. Students in management doctoral programs (N = 200) participated in an online survey. The results show that faculty support was positively associated with research productivity. Among demographic…

  8. 78 FR 29441 - Child Care and Development Fund (CCDF) Program

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-20

    ... tax dollars, and leverage the latest knowledge and research in the field of early care and education... school success. A growing body of research demonstrates that the first five years of a child's cognitive... language development and problem-solving skills. Research shows that the quality and stability of adult...

  9. 19. Photocopy of photograph (original in the Langley Research Center ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    19. Photocopy of photograph (original in the Langley Research Center Archives, Hampton, VA LaRC) (L79758) INTERIOR VIEW SHOWING TURNING VANES AND PERSONNEL IN THE 8-FOOT HIGH SPEED WIND TUNNEL. - NASA Langley Research Center, 8-Foot High Speed Wind Tunnel, 641 Thornell Avenue, Hampton, Hampton, VA

  10. 18. Photocopy of photograph (original in the Langley Research Center ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    18. Photocopy of photograph (original in the Langley Research Center Archives, Hampton, VA LaRC) (L86-10235) INTERIOR VIEW SHOWING TURNING VANES IN 8-FOOT HIGH SPEED WIND TUNNEL. - NASA Langley Research Center, 8-Foot High Speed Wind Tunnel, 641 Thornell Avenue, Hampton, Hampton, VA

  11. 20. Photocopy of photograph (original in the Langley Research Center ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    20. Photocopy of photograph (original in the Langley Research Center Archives, Hampton, VA LaRC) INTERIOR VIEW SHOWING TURNING VANES AND PERSONNEL IN THE 8-FOOT HIGH SPEED WIND TUNNEL. - NASA Langley Research Center, 8-Foot High Speed Wind Tunnel, 641 Thornell Avenue, Hampton, Hampton, VA

  12. "Why Are We Here?" Taking "Place" into Account in UK Outdoor Environmental Education

    ERIC Educational Resources Information Center

    Harrison, Sam

    2010-01-01

    "Place" is an under-researched and poorly documented element of UK outdoor environmental education. In the international literature, North American and Australian researchers and practitioners show considerable attention to "place". Yet UK outdoor environmental educators and researchers seem to have neglected this area despite…

  13. Maximizing the Benefits of Student Diversity: Lessons from School Desegregation Research.

    ERIC Educational Resources Information Center

    Schofield, Janet Ward

    This chapter considers the implications for higher education of existing research on the effects of desegregation at the elementary and secondary school level. Research shows that school desegregation enhances the academic progress of African American students, increases suspension rates but cuts dropout rates among minority students, positively…

  14. Highlights of Programmatic, Interdisciplinary Research on Writing

    ERIC Educational Resources Information Center

    Berninger, Virginia W.

    2009-01-01

    An overview of research topics and findings from an interdisciplinary, programmatic line of research on writing over the past 25 years is presented. The cross-sectional assessment studies (grades 1 to 9) showed which measures uniquely explained variance in handwriting, spelling, and composing and thus validated their use in assessment. These and…

  15. Teacher Identity Development through Action Research: A Chinese Experience

    ERIC Educational Resources Information Center

    Yuan, Rui; Burns, Anne

    2017-01-01

    This study explores how two language teachers constructed and reconstructed their professional identities through their action research (AR) facilitated by university researchers in China. Informed by the theory of 'community of practice', the findings of the study show that AR exerted a transformative impact on the teachers' identity development.…

  16. How Well Do Researchers Report Their Measures? An Evaluation of Measurement in Published Educational Research.

    ERIC Educational Resources Information Center

    Whittington, Dale

    1998-01-01

    This study describes how much and in what ways authors of research studies fail to include adequate information about data collection. Results based on analysis of 220 articles from 22 journals show that the quality of measurement reporting continues to be a problem. (SLD)

  17. Internet research: improving traditional community analysis before launching a practice.

    PubMed

    Barresi, B; Scott, C

    2000-01-01

    Optometric practice management experts have always recommended that optometrists thoroughly research the communities in which they are considering practicing. Until the Internet came along, demographic research was possible but often daunting. Today, say these authors, it's becoming quite a bit easier ... and they show us how.

  18. Early Learning: Return on Investment. Annotated Bibliography

    ERIC Educational Resources Information Center

    Hite, Jenny

    2014-01-01

    Today's researchers seek to determine if contemporary pre-K programs provide the strong return on investment found by researchers in the 1960s High/Scope Perry Preschool Program and 1970s North Carolina Abecedarian Project. Research then showed that these two programs created positive academic effects that accompanied their students as they…

  19. From the Knowledge of Understanding to Military Deception

    DTIC Science & Technology

    2008-05-21

    and decision-making. Experimental research could lead to a validation of the theory. In the experiment, I tried to cause a decrease in ambiguity and...strong evidence that indicates a relationship. Further research with a larger sample size might show a significant statistical relationship. In order...

  20. Towards Understanding EFL Teachers' Conceptions of Research: Findings from Argentina

    ERIC Educational Resources Information Center

    Banegas, Darío Luis

    2018-01-01

    This paper investigates the conceptions of research held by English as a foreign language teachers in Argentina. Quantitative data from 622 participants from an online questionnaire were followed by qualitative data from online interviews with 40 of those participants. Results show that the teachers conceptualised research through conventional…
