Combined computational-experimental analyses of selected metabolic enzymes in Pseudomonas species.
Perumal, Deepak; Lim, Chu Sing; Chow, Vincent T K; Sakharkar, Kishore R; Sakharkar, Meena K
2008-09-10
Comparative genomic analysis has revolutionized our ability to predict the metabolic subsystems that occur in newly sequenced genomes, and to explore the functional roles of the set of genes within each subsystem. These computational predictions can considerably reduce the volume of experimental studies required to assess basic metabolic properties of multiple bacterial species. However, experimental validations are still required to resolve the apparent inconsistencies in the predictions by multiple resources. Here, we present combined computational-experimental analyses on eight completely sequenced Pseudomonas species. Comparative pathway analyses reveal that several pathways within the Pseudomonas species show high plasticity and versatility. Potential bypasses in 11 metabolic pathways were identified. We further confirmed the presence of the enzyme O-acetyl homoserine (thiol) lyase (EC: 2.5.1.49) in P. syringae pv. tomato that revealed inconsistent annotations in KEGG and in the recently published SYSTOMONAS database. These analyses connect and integrate systematic data generation, computational data interpretation, and experimental validation and represent a synergistic and powerful means for conducting biological research.
Aerodynamic Database Development for Mars Smart Lander Vehicle Configurations
NASA Technical Reports Server (NTRS)
Bobskill, Glenn J.; Parikh, Paresh C.; Prabhu, Ramadas K.; Tyler, Erik D.
2002-01-01
An aerodynamic database has been generated for the Mars Smart Lander Shelf-All configuration using computational fluid dynamics (CFD) simulations. Three different CFD codes were used: USM3D and FELISA, both based on unstructured grid technology, and LAURA, an established and validated structured-grid CFD code. As part of this database development, the results for the Mars continuum regime were validated with experimental data and comparisons were made where applicable. The validation of USM3D and LAURA against the Unitary experimental data, the use of intermediate LAURA check analyses, and the validation of FELISA against the Mach 6 CF4 experimental data provided higher confidence in the ability of CFD to provide aerodynamic data for determining the static trim characteristics for longitudinal stability. The analyses of the noncontinuum regime showed the existence of multiple trim angles of attack that can be either stable or unstable trim points. This information is needed to design the guidance controller throughout the trajectory.
Experimental validation of structural optimization methods
NASA Technical Reports Server (NTRS)
Adelman, Howard M.
1992-01-01
The topic of validating structural optimization methods by use of experimental results is addressed. The need for validating the methods, as a way of effecting a greater and accelerated acceptance of formal optimization methods by practicing engineering designers, is described. The range of validation strategies is defined, including comparison of optimization results with more traditional design approaches, establishing the accuracy of the analyses used, and finally experimental validation of the optimization results. Examples of the use of experimental results to validate optimization techniques are described, including experimental validation of the following: optimum design of a trussed beam; combined control-structure design of a cable-supported beam simulating an actively controlled space structure; minimum weight design of a beam with frequency constraints; minimization of the vibration response of a helicopter rotor blade; minimum weight design of a turbine blade disk; aeroelastic optimization of an aircraft vertical fin; airfoil shape optimization for drag minimization; optimization of the shape of a hole in a plate for stress minimization; optimization to minimize beam dynamic response; and structural optimization of a low-vibration helicopter rotor.
NASA Astrophysics Data System (ADS)
Etxeberria, A.; Vechiu, I.; Baudoin, S.; Camblong, H.; Kreckelbergh, S.
2014-02-01
The increasing use of distributed generators, which are mainly based on renewable sources, can create several issues in the operation of the electric grid. The microgrid is being analysed as a solution for integrating renewable sources into the grid at a high penetration level in a controlled way. Storage systems play a vital role in keeping the energy and power balance of the microgrid. Due to the technical limitations of currently available storage systems, it is necessary to use more than one storage technology to satisfy the requirements of the microgrid application. This work validates, in simulation and experimentally, the use of a Three-Level Neutral Point Clamped converter to control the power flow of a hybrid storage system formed by a SuperCapacitor and a Vanadium Redox Battery. The operation of the system is validated in two case studies on the experimental platform installed at ESTIA. The experimental results prove the validity of the proposed system as well as the designed control algorithm. The good agreement between experimental and simulation results also validates the simulation model, which can therefore be used to analyse the operation of the system in different case studies.
Valid randomization-based p-values for partially post hoc subgroup analyses.
Lee, Joseph J; Rubin, Donald B
2015-10-30
By 'partially post hoc' subgroup analyses, we mean analyses that compare existing data from a randomized experiment-from which a subgroup specification is derived-to new, subgroup-only experimental data. We describe a motivating example in which partially post hoc subgroup analyses instigated statistical debate about a medical device's efficacy. We clarify the source of such analyses' invalidity and then propose a randomization-based approach for generating valid posterior predictive p-values for such partially post hoc subgroups. Lastly, we investigate the approach's operating characteristics in a simple illustrative setting through a series of simulations, showing that it can have desirable properties under both null and alternative hypotheses. Copyright © 2015 John Wiley & Sons, Ltd.
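As background to the randomization-based machinery this abstract invokes, the sketch below computes a plain randomization (permutation) p-value for a two-group mean difference. It is a minimal illustration only: the paper's posterior predictive procedure for partially post hoc subgroups is more elaborate, and the data here are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def randomization_p_value(treated, control, n_perm=10_000):
    """Two-sided randomization-based p-value for a difference in means."""
    pooled = np.concatenate([treated, control])
    n_t = len(treated)
    observed = treated.mean() - control.mean()
    stats = np.empty(n_perm)
    for i in range(n_perm):
        perm = rng.permutation(pooled)  # re-randomize group labels
        stats[i] = perm[:n_t].mean() - perm[n_t:].mean()
    return np.mean(np.abs(stats) >= abs(observed))

# Hypothetical subgroup-only experimental data
treated = rng.normal(1.0, 1.0, size=30)
control = rng.normal(0.5, 1.0, size=30)
print(f"p = {randomization_p_value(treated, control):.4f}")
```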
Conditions for the Validity of Faraday's Law of Induction and Their Experimental Confirmation
ERIC Educational Resources Information Center
Lopez-Ramos, A.; Menendez, J. R.; Pique, C.
2008-01-01
This paper, as its main didactic objective, shows the conditions needed for the validity of Faraday's law of induction. Inadequate comprehension of these conditions has given rise to several paradoxes about the issue; some are analysed and solved in this paper in the light of the theoretical deduction of the induction law. Furthermore, an…
Masonry structures built with fictile tubules: Experimental and numerical analyses
NASA Astrophysics Data System (ADS)
Tiberti, Simone; Scuro, Carmelo; Codispoti, Rosamaria; Olivito, Renato S.; Milani, Gabriele
2017-11-01
Masonry structures built with fictile tubules represent a distinctive building technique of the Mediterranean area. Dating back to Roman and early Christian times, the technique was used to build vaulted constructions and domes of various geometrical forms by virtue of its modular structure. In the present work, experimental tests were carried out to identify the mechanical properties of hollow clay fictile tubules and a possible reinforcing technique for existing buildings employing such elements. The experimental results were then used to validate numerical models built in the FE software Abaqus, which were also aimed at investigating the structural behavior of an arch via linear and nonlinear static analyses.
Moreno-Murcia, Juan A; Martínez-Galindo, Celestina; Moreno-Pérez, Víctor; Marcos, Pablo J.; Borges, Fernanda
2012-01-01
This study aimed to cross-validate the psychometric properties of the Basic Psychological Needs in Exercise Scale (BPNES) by Vlachopoulos and Michailidou (2006) in a Spanish context. Two studies were conducted. Confirmatory factor analysis results confirmed the hypothesized three-factor solution. In addition, we documented evidence of reliability, analysed as internal consistency and temporal stability. Future studies should analyse the scale's validity and reliability with different populations and test its experimental effect. Key points: The BPNES is valid and reliable for measuring basic psychological needs in healthy physical exercise in the Spanish context. The factor structure of three correlated factors has shown minimal invariance across gender. PMID:24149130
Panagiotopoulou, O.; Wilshin, S. D.; Rayfield, E. J.; Shefelbine, S. J.; Hutchinson, J. R.
2012-01-01
Finite element modelling is well entrenched in comparative vertebrate biomechanics as a tool to assess the mechanical design of skeletal structures and to better comprehend the complex interaction of their form–function relationships. But what makes a reliable subject-specific finite element model? To approach this question, we here present a set of convergence and sensitivity analyses and a validation study as an example, for finite element analysis (FEA) in general, of ways to ensure a reliable model. We detail how choices of element size, type and material properties in FEA influence the results of simulations. We also present an empirical model for estimating heterogeneous material properties throughout an elephant femur (but of broad applicability to FEA). We then use an ex vivo experimental validation test of a cadaveric femur to check our FEA results and find that the heterogeneous model matches the experimental results extremely well, and far better than the homogeneous model. We emphasize how considering heterogeneous material properties in FEA may be critical, so this should become standard practice in comparative FEA studies along with convergence analyses, consideration of element size, type and experimental validation. These steps may be required to obtain accurate models and derive reliable conclusions from them. PMID:21752810
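The convergence analyses the authors recommend can be illustrated on a toy problem. The sketch below is a minimal example, not drawn from the paper: it refines a one-dimensional finite element mesh for a tapered linear-elastic bar and tracks how the computed tip displacement approaches the analytical value as element size shrinks.

```python
import numpy as np

def tip_displacement(n_elem, P=1.0, L=1.0, EA0=1.0):
    """Tip displacement of a tapered bar, EA(x) = EA0*(1 - 0.5*x/L),
    fixed at x=0 and loaded by P at x=L, using linear bar elements
    with element-wise constant stiffness sampled at midpoints."""
    h = L / n_elem
    x_mid = (np.arange(n_elem) + 0.5) * h
    k = EA0 * (1.0 - 0.5 * x_mid / L) / h   # element stiffnesses
    K = np.zeros((n_elem, n_elem))          # reduced system (node 0 fixed)
    K[0, 0] = k[0]
    for e in range(1, n_elem):
        K[e - 1, e - 1] += k[e]
        K[e, e] += k[e]
        K[e - 1, e] -= k[e]
        K[e, e - 1] -= k[e]
    F = np.zeros(n_elem)
    F[-1] = P                               # end load
    return np.linalg.solve(K, F)[-1]

exact = 2.0 * np.log(2.0)                   # analytical tip displacement
for n in [2, 4, 8, 16, 32]:
    u = tip_displacement(n)
    print(f"{n:3d} elements: u_tip = {u:.6f}, rel. error = {abs(u - exact) / exact:.2e}")
```

The relative error shrinks roughly fourfold per mesh doubling, which is the kind of behaviour a convergence study looks for before trusting a mesh density.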
Structurally compliant rocket engine combustion chamber: Experimental and analytical validation
NASA Technical Reports Server (NTRS)
Jankovsky, Robert S.; Arya, Vinod K.; Kazaroff, John M.; Halford, Gary R.
1994-01-01
A new, structurally compliant rocket engine combustion chamber design has been validated through analysis and experiment. Subscale, tubular channel chambers have been cyclically tested and analytically evaluated. Cyclic lives were determined to have the potential for a 1000 percent increase over those of rectangular channel designs, the current state of the art. Greater structural compliance in the circumferential direction gave rise to lower thermal strains during hot firing, resulting in lower thermal strain ratcheting and longer predicted fatigue lives. Thermal, structural, and durability analyses of the combustion chamber design, involving cyclic temperatures, strains, and low-cycle fatigue lives, have corroborated the experimental observations.
Williams, G E; Cuvo, A J
1986-01-01
The research was designed to validate procedures to teach apartment upkeep skills to severely handicapped clients with various categorical disabilities. Methodological features of this research included performance comparisons between general and specific task analyses, the effect of an impasse correction baseline procedure, social validation of training goals, natural environment assessments and contingencies, as well as long-term follow-up. Subjects were taught to perform upkeep responses on their air conditioner-heating unit, electric range, refrigerator, and electrical appliances within the context of a multiple-probe, across-subjects experimental design. The results showed acquisition, long-term maintenance, and generalization of the upkeep skills to a nontraining apartment. General task analyses were recommended for assessment and specific task analyses for training. The impasse correction procedure generally did not produce acquisition. PMID:3710947
DOE Office of Scientific and Technical Information (OSTI.GOV)
Woods, Brian; Gutowska, Izabela; Chiger, Howard
Computer simulations of nuclear reactor thermal-hydraulic phenomena are often used in the design and licensing of nuclear reactor systems. In order to assess the accuracy of these computer simulations, computer codes and methods are often validated against experimental data. This experimental data must be of sufficiently high quality in order to conduct a robust validation exercise. In addition, this experimental data is generally collected at experimental facilities that are of a smaller scale than the reactor systems being simulated, due to cost considerations. Therefore, smaller scale test facilities must be designed and constructed in such a fashion as to ensure that the prototypical behavior of a particular nuclear reactor system is preserved. The work completed through this project has resulted in scaling analyses and conceptual design development for a test facility capable of collecting code validation data for the following high temperature gas reactor systems and events: (1) passive natural circulation core cooling system, (2) pebble bed gas reactor concept, (3) General Atomics Energy Multiplier Module reactor, and (4) prismatic block design steam-water ingress event. In the event that code validation data for these systems or events is needed in the future, significant progress in the design of an appropriate integral-type test facility has already been completed as a result of this project. Where applicable, the next step would be to begin the detailed design development and material procurement. As part of this project, applicable scaling analyses were completed and test facility design requirements developed. Conceptual designs were developed for the implementation of these design requirements at the Oregon State University (OSU) High Temperature Test Facility (HTTF). The original HTTF is based on a ¼-scale model of a high temperature gas reactor concept with the capability for both forced and natural circulation flow through a prismatic core with an electrical heat source. The peak core region temperature capability is 1400°C. As part of this project, an inventory of test facilities that could be used for these experimental programs was completed. Several of these facilities showed some promise; however, upon further investigation it became clear that only the OSU HTTF had the power and/or peak temperature limits that would allow for the experimental programs envisioned herein. Thus the conceptual design and feasibility study development focused on examining the feasibility of configuring the current HTTF to collect validation data for these experimental programs. In addition to the scaling analyses and conceptual design development, a test plan was developed for the envisioned modified test facility. This test plan included a discussion of an appropriate shakedown test program as well as the specific matrix tests. Finally, a feasibility study was completed to determine the cost and schedule considerations that would be important to any test program developed to investigate these designs and events.
Neman, R
1975-03-01
The Zigler and Seitz (1975) critique was carefully examined with respect to the conclusions of the Neman et al. (1975) study. Particular attention was given to the following questions: (a) did experimenter bias or commitment account for the results, (b) were unreliable and invalid psychometric instruments used, (c) were the statistical analyses insufficient or incorrect, (d) did the results reflect no more than the operation of chance, and (e) were the results biased by artifactually inflated profile scores. Experimenter bias and commitment were shown to be insufficient to account for the results; a further review of Buros (1972) showed that there was no need for apprehension about the testing instruments; the statistical analyses were shown to exceed prevailing standards for research reporting; the results were shown to reflect valid findings at the .05 probability level; and the Neman et al. (1975) results for the profile measure were equally significant using either "raw" neurological scores or "scaled" neurological age scores. Zigler, Seitz, and I agreed on the needs for (a) using multivariate analyses, where applicable, in studies having more than one dependent variable; (b) defining the population for which sensorimotor training procedures may be appropriately prescribed; and (c) validating the profile measure as a tool to assess neurological disorganization.
Recent Work in Hybrid Radiation Transport Methods with Applications to Commercial Nuclear Power
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kulesza, Joel A.
This talk will begin with an overview of hybrid radiation transport methods followed by a discussion of the author’s work to advance current capabilities. The talk will then describe applications for these methods in commercial nuclear power reactor analyses and techniques for experimental validation. When discussing these analytical and experimental activities, the importance of technical standards such as those created and maintained by ASTM International will be demonstrated.
Bad Behavior: Improving Reproducibility in Behavior Testing.
Andrews, Anne M; Cheng, Xinyi; Altieri, Stefanie C; Yang, Hongyan
2018-01-24
Systems neuroscience research is increasingly possible through the use of integrated molecular and circuit-level analyses. These studies depend on the use of animal models and, in many cases, molecular and circuit-level analyses associated with genetic, pharmacologic, epigenetic, and other types of environmental manipulations. We illustrate typical pitfalls resulting from poor validation of behavior tests. We describe experimental designs and enumerate controls needed to improve reproducibility in the investigation and reporting of behavioral phenotypes.
Principles for valid histopathologic scoring in research
Gibson-Corley, Katherine N.; Olivier, Alicia K.; Meyerholz, David K.
2013-01-01
Histopathologic scoring is a tool by which semi-quantitative data can be obtained from tissues. Initially, a thorough understanding of the experimental design, study objectives and methods are required to allow the pathologist to appropriately examine tissues and develop lesion scoring approaches. Many principles go into the development of a scoring system such as tissue examination, lesion identification, scoring definitions and consistency in interpretation. Masking (a.k.a. “blinding”) of the pathologist to experimental groups is often necessary to constrain bias and multiple mechanisms are available. Development of a tissue scoring system requires appreciation of the attributes and limitations of the data (e.g. nominal, ordinal, interval and ratio data) to be evaluated. Incidence, ordinal and rank methods of tissue scoring are demonstrated along with key principles for statistical analyses and reporting. Validation of a scoring system occurs through two principal measures: 1) validation of repeatability and 2) validation of tissue pathobiology. Understanding key principles of tissue scoring can help in the development and/or optimization of scoring systems so as to consistently yield meaningful and valid scoring data. PMID:23558974
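Validation of repeatability, the first measure named above, is often quantified with an agreement statistic suited to ordinal data. The sketch below uses hypothetical scores rather than data from the paper, and computes a quadratic-weighted Cohen's kappa between two masked scoring sessions as one common choice of repeatability metric.

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score

rng = np.random.default_rng(1)

# Hypothetical ordinal lesion scores (0-4) from two masked scoring sessions
session_1 = rng.integers(0, 5, size=40)
session_2 = np.clip(session_1 + rng.integers(-1, 2, size=40), 0, 4)

# Quadratic weights penalize large disagreements more than adjacent-score
# disagreements, matching the ordinal nature of histopathologic scores
kappa = cohen_kappa_score(session_1, session_2, weights="quadratic")
print(f"weighted kappa = {kappa:.2f}")
```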
Considering RNAi experimental design in parasitic helminths.
Dalzell, Johnathan J; Warnock, Neil D; McVeigh, Paul; Marks, Nikki J; Mousley, Angela; Atkinson, Louise; Maule, Aaron G
2012-04-01
Almost a decade has passed since the first report of RNA interference (RNAi) in a parasitic helminth. Whilst much progress has been made with RNAi informing gene function studies in disparate nematode and flatworm parasites, substantial and seemingly prohibitive difficulties have been encountered in some species, hindering progress. An appraisal of current practices, trends and ideals of RNAi experimental design in parasitic helminths is both timely and necessary for a number of reasons: firstly, the increasing availability of parasitic helminth genome/transcriptome resources means there is a growing need for gene function tools such as RNAi; secondly, fundamental differences and unique challenges exist for parasite species which do not apply to model organisms; thirdly, the inherent variation in experimental design, and reported difficulties with reproducibility undermine confidence. Ideally, RNAi studies of gene function should adopt standardised experimental design to aid reproducibility, interpretation and comparative analyses. Although the huge variations in parasite biology and experimental endpoints make RNAi experimental design standardization difficult or impractical, we must strive to validate RNAi experimentation in helminth parasites. To aid this process we identify multiple approaches to RNAi experimental validation and highlight those which we deem to be critical for gene function studies in helminth parasites.
Vesterinen, Hanna V; Egan, Kieren; Deister, Amelie; Schlattmann, Peter; Macleod, Malcolm R; Dirnagl, Ulrich
2011-04-01
Translating experimental findings into clinically effective therapies is one of the major bottlenecks of modern medicine. As this has been particularly true for cerebrovascular research, attention has turned to the quality and validity of experimental cerebrovascular studies. We set out to assess the study design, statistical analyses, and reporting of cerebrovascular research. We assessed all original articles published in the Journal of Cerebral Blood Flow and Metabolism during the year 2008 against a checklist designed to capture the key attributes relating to study design, statistical analyses, and reporting. A total of 156 original publications were included (animal, in vitro, human). Few studies reported a primary research hypothesis, statement of purpose, or measures to safeguard internal validity (such as randomization, blinding, exclusion or inclusion criteria). Many studies lacked sufficient information regarding methods and results to form a reasonable judgment about their validity. In nearly 20% of studies, statistical tests were either not appropriate or information to allow assessment of appropriateness was lacking. This study identifies a number of factors that should be addressed if the quality of research in basic and translational biomedicine is to be improved. We support the widespread implementation of the ARRIVE (Animal Research Reporting In Vivo Experiments) statement for the reporting of experimental studies in biomedicine, for improving training in proper study design and analysis, and that reviewers and editors adopt a more constructively critical approach in the assessment of manuscripts for publication.
NASA Technical Reports Server (NTRS)
Gouldin, F. C.
1982-01-01
Fluid mechanical effects on combustion processes in steady flow combustors, especially gas turbine combustors were investigated. Flow features of most interest were vorticity, especially swirl, and turbulence. Theoretical analyses, numerical calculations, and experiments were performed. The theoretical and numerical work focused on noncombusting flows, while the experimental work consisted of both reacting and nonreacting flow studies. An experimental data set, e.g., velocity, temperature and composition, was developed for a swirl flow combustor for use by combustion modelers for development and validation work.
Ingram, Paul B; Ternes, Michael S
2016-05-01
This study synthesized research evaluating the effectiveness of the over-reporting validity scales of the Minnesota Multiphasic Personality Inventory-2-Restructured Form (MMPI-2-RF) for detecting intentionally feigned over-endorsement of symptoms, using a moderated meta-analysis. After identifying experimental and quasi-experimental studies for inclusion (k = 25) in which the validity scales of the MMPI-2-RF were compared between groups of respondents, moderated meta-analyses were conducted for each of its five over-reporting scales. These meta-analyses explored the general effectiveness of each scale across studies, as well as the impact that several moderators had on scale performance, including comparison group, study type (i.e. real versus simulation), age, education, sex, and diagnosis. The over-reporting scales of the MMPI-2-RF act as effective general measures for the detection of malingering and over-endorsement of symptoms, with individual scales ranging in effectiveness from an effect size of 1.08 (Symptom Validity; FBS-r) to 1.43 (Infrequent Pathology; Fp-r), each with different patterns of moderating influence. The MMPI-2-RF validity scales effectively discriminate between groups of respondents presenting in either an honest manner or with patterned exaggeration and over-endorsement of symptoms. The magnitude of difference observed between honest and malingering groups was substantially narrower than might be expected using traditional cut-scores for the validity scales, making interpretation within the evaluation context particularly important. While all over-reporting scales are effective, the FBS-r and RBS scales are those least influenced by common and context-specific moderating influences, such as respondent or comparison grouping.
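For orientation, the effect sizes quoted above (1.08 for FBS-r, 1.43 for Fp-r) are standardized mean differences between honest and over-reporting groups. The abstract does not name the estimator; a common choice, shown here as an assumption, is Hedges' g with its small-sample correction:

```latex
g = J\,\frac{\bar{x}_1 - \bar{x}_2}{s_p},\qquad
s_p = \sqrt{\frac{(n_1 - 1)s_1^2 + (n_2 - 1)s_2^2}{n_1 + n_2 - 2}},\qquad
J = 1 - \frac{3}{4(n_1 + n_2 - 2) - 1}
```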
Effects of human running cadence and experimental validation of the bouncing ball model
NASA Astrophysics Data System (ADS)
Bencsik, László; Zelei, Ambrus
2017-05-01
The biomechanical analysis of human running is a complex problem because of the large number of parameters and degrees of freedom. However, simplified models can be constructed, usually characterized by a few fundamental parameters such as step length, foot strike pattern, and cadence. The bouncing ball model of human running is analysed theoretically and experimentally in this work. It is a minimally complex dynamic model suited to estimating the energy cost of running and the tendency of ground-foot impact intensity as a function of cadence. The model shows that cadence has a direct effect on the energy efficiency of running and on ground-foot impact intensity; in particular, higher cadence implies lower risk of injury and better energy efficiency. Experimental data collected from 121 amateur runners are presented. The experimental results validate the model and provide information about the walk-to-run transition speed and the typical development of cadence and grounded phase ratio in different running speed ranges.
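A minimal sketch of the bouncing-ball reasoning follows, assuming a symmetric ballistic flight phase; the cadence and grounded-ratio values are hypothetical, and the paper's actual model and fitted parameters may differ.

```python
import numpy as np

G = 9.81  # gravitational acceleration, m/s^2

def landing_speed(cadence_hz, grounded_ratio):
    """Vertical landing speed under a ballistic (bouncing-ball) flight phase.

    cadence_hz: step frequency (steps per second)
    grounded_ratio: fraction of the step cycle spent in ground contact
    """
    t_air = (1.0 - grounded_ratio) / cadence_hz  # aerial time per step
    return G * t_air / 2.0                       # symmetric ballistic flight

# Higher cadence at a fixed grounded ratio shortens the aerial phase,
# lowering the landing speed and hence the ground-foot impact intensity
for cadence in [2.4, 2.8, 3.2]:  # steps/s (144-192 steps/min)
    print(f"{cadence * 60:5.0f} steps/min -> v_land = {landing_speed(cadence, 0.6):.3f} m/s")
```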
Felipe-Sesé, Luis; López-Alba, Elías; Hannemann, Benedikt; Schmeer, Sebastian; Diaz, Francisco A
2017-06-28
A quasi-static indentation numerical analysis in a round-section specimen made of soft material has been performed and validated with a full-field experimental technique, i.e., Digital Image Correlation 3D. The contact experiment specifically consisted of loading a 25 mm diameter rubber cylinder up to a 5 mm indentation and then unloading. Experimental strain fields measured at the surface of the specimen during the experiment were compared with those obtained by performing two numerical analyses employing two different hyperelastic material models. The comparison was performed using a new Image Decomposition methodology that makes a direct comparison of full-field data possible independently of their scale or orientation. Numerical results show a good level of agreement with those measured during the experiments. However, since image decomposition allows the differences to be quantified, it was observed that one of the adopted material models reproduces lower differences with respect to the experimental results.
Critical analysis of adsorption data statistically
NASA Astrophysics Data System (ADS)
Kaushal, Achla; Singh, S. K.
2017-10-01
Statistical methods allow experimental data to be presented, processed, and critically analysed in a rigorous way. A variety of statistical tests are used to make decisions about the significance and validity of experimental data. In the present study, adsorption was carried out to remove zinc ions from contaminated aqueous solution using mango leaf powder. The experimental data were analysed statistically by hypothesis testing, applying the t test, paired t test, and Chi-square test to (a) test the optimum value of the process pH, (b) verify the success of the experiment, and (c) study the effect of adsorbent dose on zinc ion removal from aqueous solutions. Comparison of calculated and tabulated values of t and χ² showed the results in favour of the data collected from the experiment, and this has been shown on probability charts. The K value obtained for the Langmuir isotherm was 0.8582 and the m value for the Freundlich adsorption isotherm was 0.725; both are <1, indicating favourable isotherms. Karl Pearson's correlation coefficient values for the Langmuir and Freundlich adsorption isotherms were 0.99 and 0.95 respectively, which indicate a high degree of correlation between the variables. This validates the data obtained for adsorption of zinc ions from the contaminated aqueous solution with the help of mango leaf powder.
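Isotherm fits of the kind reported above can be reproduced in outline with a nonlinear least-squares fit. The sketch below uses hypothetical equilibrium data, not the study's measurements, and fits the standard Langmuir and Freundlich forms, reporting Pearson's correlation between measured and fitted uptake.

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import pearsonr

def langmuir(ce, q_max, K):
    # q_e = q_max * K * C_e / (1 + K * C_e)
    return q_max * K * ce / (1.0 + K * ce)

def freundlich(ce, K_f, n):
    # q_e = K_f * C_e^(1/n)
    return K_f * ce ** (1.0 / n)

# Hypothetical equilibrium data (C_e in mg/L, q_e in mg/g)
ce = np.array([2.0, 5.0, 10.0, 20.0, 40.0])
qe = np.array([1.6, 3.1, 4.6, 6.0, 7.0])

for name, model, p0 in [("Langmuir", langmuir, (10.0, 0.1)),
                        ("Freundlich", freundlich, (1.0, 2.0))]:
    popt, _ = curve_fit(model, ce, qe, p0=p0)
    r, _ = pearsonr(qe, model(ce, *popt))
    print(f"{name}: params = {np.round(popt, 3)}, Pearson r = {r:.3f}")
```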
Dehghanian, Fariba; Hojati, Zohreh; Esmaeili, Fariba; Masoudi-Nejad, Ali
2018-05-21
The Hippo signaling pathway has been identified as a regulatory pathway that plays critical roles in differentiation and stem cell self-renewal. Yap1 is a primary transcriptional effector of this pathway. The importance of Yap1 in embryonic stem cells (ESCs) and in the differentiation process remains a challenging question, since two conflicting observations have been reported. To answer this question we used co-expression network and differential co-expression analyses followed by experimental validations. Our results indicate that Yap1 is highly co-expressed with stem cell markers in ESCs but not in differentiated cells (DCs). Significant Yap1 down-regulation, together with translocation of Yap1 into the cytoplasm, was also detected during P19 cell differentiation. Moreover, our results suggest the E2f7, Lin28a and Dppa4 genes as possible regulatory nuclear factors of the Hippo pathway in stem cells. The present findings are consistent with studies that suggested Yap1 as an essential factor for stem cell self-renewal. Copyright © 2018 Elsevier Inc. All rights reserved.
NASA Technical Reports Server (NTRS)
Papadakis, M.; Breer, M.; Craig, N.; Liu, X.
1994-01-01
An experimental method has been developed to determine the water droplet impingement characteristics of two- and three-dimensional aircraft surfaces. The experimental water droplet impingement data are used to validate particle trajectory analysis codes that are used in aircraft icing analyses and engine inlet particle separator analyses. The aircraft surface is covered with thin strips of blotter paper in areas of interest. The surface is then exposed to an airstream that contains a dyed-water spray cloud. The water droplet impingement data are extracted from the dyed blotter paper strips by measuring the optical reflectance of each strip with an automated reflectometer. Experimental impingement efficiency data are presented for a NLF(1)-0414 airfoil, a swept MS(1)-0317 airfoil, a Boeing 737-300 engine inlet model, two simulated ice shapes, and a swept NACA 0012 wingtip. Analytical impingement efficiency data are also presented for the NLF(1)-0414 airfoil and the Boeing 737-300 engine inlet model.
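For context, the local impingement (collection) efficiency β extracted from such dye traces is commonly defined from droplet trajectories as

```latex
\beta = \frac{dy_0}{ds}
```

where y_0 is the release ordinate of a droplet trajectory far upstream and s is the surface distance to its impact point; β equals the local ratio of impinging water mass flux to the free-stream flux. This is the standard definition in the icing literature, stated here as background rather than quoted from the paper.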
Holgado-Tello, Fco P; Chacón-Moscoso, Salvador; Sanduvete-Chaves, Susana; Pérez-Gil, José A
2016-01-01
The Campbellian tradition provides a conceptual framework to assess threats to validity. On the other hand, different models of causal analysis have been developed to control estimation biases in different research designs. However, the link between design features, measurement issues, and concrete impact estimation analyses is weak. In order to provide an empirical solution to this problem, we use Structural Equation Modeling (SEM) as a first approximation to operationalize the analytical implications of threats to validity in quasi-experimental designs. Based on the analogies established between the Classical Test Theory (CTT) and causal analysis, we describe an empirical study based on SEM in which range restriction and statistical power have been simulated in two different models: (1) A multistate model in the control condition (pre-test); and (2) A single-trait-multistate model in the control condition (post-test), adding a new mediator latent exogenous (independent) variable that represents a threat to validity. Results show, empirically, how the differences between both the models could be partially or totally attributed to these threats. Therefore, SEM provides a useful tool to analyze the influence of potential threats to validity.
NASA Astrophysics Data System (ADS)
Zhou, Abel; White, Graeme L.; Davidson, Rob
2018-02-01
Anti-scatter grids are commonly used in x-ray imaging systems to reduce the scatter radiation reaching the image receptor. Anti-scatter grid performance can be simulated and validated using Monte Carlo (MC) methods. Our recently reported work has modified existing MC codes, resulting in improved performance when simulating x-ray imaging. The aim of this work is to validate the transmission of x-ray photons in grids computed by the recently reported new MC codes against experimental results and results previously reported in other literature. The results of this work show that the scatter-to-primary ratio (SPR) and the transmissions of primary (T_p), scatter (T_s), and total (T_t) radiation determined using this new MC code system agree closely with the experimental results and the results reported in the literature. T_p, T_s, T_t, and SPR determined by this new MC simulation code system are therefore valid. The results also show that the interference effect on Rayleigh scattering should not be neglected in the evaluation of both mammographic and general grids. Our new MC simulation code system has been shown to be valid and can be used for analysing and evaluating grid designs.
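The quantities named above have standard definitions. Writing P and S for the primary and scatter contributions at the receptor, with subscript 0 denoting values measured without the grid, a common formulation (an assumption here, as the abstract does not spell them out) is

```latex
T_p = \frac{P_\text{grid}}{P_0},\qquad
T_s = \frac{S_\text{grid}}{S_0},\qquad
T_t = \frac{P_\text{grid} + S_\text{grid}}{P_0 + S_0},\qquad
\mathrm{SPR} = \frac{S}{P}
```

from which grid figures of merit such as the contrast improvement factor K = T_p/T_t and the Bucky factor B = 1/T_t follow.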
NASA Technical Reports Server (NTRS)
Papadakis, M.; Elangovan, E.; Freund, G. A., Jr.; Breer, M. D.
1987-01-01
An experimental method has been developed to determine the droplet impingement characteristics of two- and three-dimensional bodies. The experimental results provide the essential droplet impingement data required to validate particle trajectory codes used in aircraft icing analyses and engine inlet particle separator analyses. A body whose water droplet impingement characteristics are required is covered at strategic locations with thin strips of moisture-absorbing (blotter) paper and then exposed to an air stream containing a dyed-water spray cloud. Water droplet impingement data are extracted from the dyed blotter strips by measuring the optical reflectance of the dye deposit on the strips using an automated reflectometer. Impingement efficiency data obtained for a NACA 65(2)015 airfoil section, a supercritical airfoil section, and Boeing 737-300 and axisymmetric inlet models are presented in this paper.
NASA Astrophysics Data System (ADS)
Takeda, M.; Nakajima, H.; Zhang, M.; Hiratsuka, T.
2008-04-01
To obtain reliable diffusion parameters for diffusion testing, multiple experiments should not only be cross-checked but the internal consistency of each experiment should also be verified. In the through- and in-diffusion tests with solution reservoirs, test interpretation of different phases often makes use of simplified analytical solutions. This study explores the feasibility of steady, quasi-steady, equilibrium and transient-state analyses using simplified analytical solutions with respect to (i) valid conditions for each analytical solution, (ii) potential error, and (iii) experimental time. For increased generality, a series of numerical analyses are performed using unified dimensionless parameters and the results are all related to dimensionless reservoir volume (DRV) which includes only the sorptive parameter as an unknown. This means the above factors can be investigated on the basis of the sorption properties of the testing material and/or tracer. The main findings are that steady, quasi-steady and equilibrium-state analyses are applicable when the tracer is not highly sorptive. However, quasi-steady and equilibrium-state analyses become inefficient or impractical compared to steady state analysis when the tracer is non-sorbing and material porosity is significantly low. Systematic and comprehensive reformulation of analytical models enables the comparison of experimental times between different test methods. The applicability and potential error of each test interpretation can also be studied. These can be applied in designing, performing, and interpreting diffusion experiments by deducing DRV from the available information for the target material and tracer, combined with the results of this study.
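As a reference point for the steady-state analysis discussed above, the idealized through-diffusion test with constant upstream concentration C_0 and a near-zero downstream boundary admits the classical interpretation

```latex
J_\text{ss} = \frac{D_e C_0}{L},\qquad
t_\text{lag} = \frac{\alpha L^2}{6 D_e},\qquad
\alpha = \varepsilon + \rho_d K_d
```

where J_ss is the steady-state flux through a sample of thickness L, t_lag is the time-lag intercept of the cumulative-mass asymptote, and α is the rock capacity factor combining porosity ε and sorption (bulk density ρ_d times distribution coefficient K_d). These are the textbook relations for the idealized boundary case; finite reservoir volumes, the subject of the paper, perturb them.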
On vital aid: the why, what and how of validation
Kleywegt, Gerard J.
2009-01-01
Limitations to the data and subjectivity in the structure-determination process may cause errors in macromolecular crystal structures. Appropriate validation techniques may be used to reveal problems in structures, ideally before they are analysed, published or deposited. Additionally, such techniques may be used a posteriori to assess the (relative) merits of a model by potential users. Weak validation methods and statistics assess how well a model reproduces the information that was used in its construction (i.e. experimental data and prior knowledge). Strong methods and statistics, on the other hand, test how well a model predicts data or information that were not used in the structure-determination process. These may be data that were excluded from the process on purpose, general knowledge about macromolecular structure, information about the biological role and biochemical activity of the molecule under study or its mutants or complexes and predictions that are based on the model and that can be tested experimentally. PMID:19171968
Verloock, Leen; Joseph, Wout; Gati, Azeddine; Varsier, Nadège; Flach, Björn; Wiart, Joe; Martens, Luc
2013-06-01
An experimental validation of a low-cost method for extrapolation and estimation of the maximal electromagnetic-field exposure from long-term evolution (LTE) radio base station installations is presented. No knowledge of downlink band occupation or service characteristics is required for the low-cost method, which is applicable in situ. It requires only a basic spectrum analyser with appropriate field probes, without the need for expensive dedicated LTE decoders. The method is validated both in the laboratory and in situ, for a single-input single-output antenna LTE system and a 2×2 multiple-input multiple-output system, with low deviations in comparison with signals measured using dedicated LTE decoders.
NASA Astrophysics Data System (ADS)
Kawamura, Yoshifumi; Hikage, Takashi; Nojima, Toshio
The aim of this study is to develop a new whole-body averaged specific absorption rate (SAR) estimation method based on the external-cylindrical field scanning technique. This technique is adopted with the goal of simplifying the dosimetry estimation of human phantoms that have different postures or sizes. An experimental scaled model system is constructed. In order to examine the validity of the proposed method for realistic human models, we discuss the pros and cons of measurements and numerical analyses based on the finite-difference time-domain (FDTD) method. We consider the anatomical European human phantoms and plane-wave in the 2 GHz mobile phone frequency band. The measured whole-body averaged SAR results obtained by the proposed method are compared with the results of the FDTD analyses.
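For reference, the quantity being estimated is defined pointwise and as a whole-body average by

```latex
\mathrm{SAR}(\mathbf{r}) = \frac{\sigma(\mathbf{r})\,\lvert E(\mathbf{r})\rvert^{2}}{\rho(\mathbf{r})},\qquad
\mathrm{SAR}_\text{wb} = \frac{1}{M}\int_{V} \sigma(\mathbf{r})\,\lvert E(\mathbf{r})\rvert^{2}\,dV
```

where σ is the tissue conductivity, ρ the mass density, E the internal electric field, and M the whole-body mass. These are the standard dosimetric definitions, stated here as background.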
Beer, Lucian; Mlitz, Veronika; Gschwandtner, Maria; Berger, Tanja; Narzt, Marie-Sophie; Gruber, Florian; Brunner, Patrick M; Tschachler, Erwin; Mildner, Michael
2015-10-01
Quantitative reverse transcription polymerase chain reaction (qRT-PCR) has become a mainstay in many areas of skin research. To enable quantitative analysis, it is necessary to analyse the expression of reference genes (RGs) for normalization of target gene expression. The selection of reliable RGs therefore has an important impact on the experimental outcome. In this study, we aimed to identify and validate the best-suited RGs for qRT-PCR in human primary keratinocytes (KCs) over a broad range of experimental conditions using the novel bioinformatics tool 'RefGenes', which is based on a manually curated database of published microarray data. Expression of 6 RGs identified by the RefGenes software and of 12 commonly used RGs was validated by qRT-PCR. We assessed whether these 18 markers fulfilled the requirements for a valid RG by the comprehensive ranking of four bioinformatics tools and the coefficient of variation (CV). In an overall ranking, we found GUSB to be the most stably expressed RG, whereas the expression values of the commonly used RGs GAPDH and B2M were significantly affected by varying experimental conditions. Our results identify RefGenes as a powerful tool for the identification of valid RGs and suggest GUSB as the most reliable RG for KCs. © 2015 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
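A minimal sketch of the coefficient-of-variation ranking mentioned above, run on hypothetical Cq values rather than the study's data (the study additionally used the comprehensive ranking of four bioinformatics tools):

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(2)

# Hypothetical Cq values for candidate reference genes across 8 conditions
conditions = [f"cond_{i}" for i in range(8)]
cq = pd.DataFrame({
    "GUSB":  20 + rng.normal(0, 0.15, 8),
    "GAPDH": 18 + rng.normal(0, 0.60, 8),
    "B2M":   17 + rng.normal(0, 0.50, 8),
}, index=conditions)

# Rank candidates by coefficient of variation of linear expression (2^-Cq);
# a lower CV indicates more stable expression across experimental conditions
linear = 2.0 ** (-cq)
cv = linear.std() / linear.mean()
print(cv.sort_values().rename("CV"))
```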
Validation of Laser-Induced Fluorescent Photogrammetric Targets on Membrane Structures
NASA Technical Reports Server (NTRS)
Jones, Thomas W.; Dorrington, Adrian A.; Shortis, Mark R.; Hendricks, Aron R.
2004-01-01
The need for static and dynamic characterization of a new generation of inflatable space structures requires the advancement of classical metrology techniques. A new photogrammetric-based method for non-contact ranging and surface profiling has been developed at NASA Langley Research Center (LaRC) to support modal analyses and structural validation of this class of space structures. This full field measurement method, known as Laser-Induced Fluorescence (LIF) photogrammetry, has previously yielded promising experimental results. However, data indicating the achievable measurement precision had not been published. This paper provides experimental results that indicate the LIF-photogrammetry measurement precision for three different target types used on a reflective membrane structure. The target types were: (1) non-contact targets generated using LIF, (2) surface attached retro-reflective targets, and (3) surface attached diffuse targets. Results from both static and dynamic investigations are included.
NASA Technical Reports Server (NTRS)
Seybert, A. F.; Wu, T. W.; Wu, X. F.
1994-01-01
This research report is presented in three parts. In the first part, acoustical analyses were performed on modes of vibration of the housing of a transmission of a gear test rig developed by NASA. The modes of vibration of the transmission housing were measured using experimental modal analysis. The boundary element method (BEM) was used to calculate the sound pressure and sound intensity on the surface of the housing and the radiation efficiency of each mode. The radiation efficiency of each of the transmission housing modes was then compared to theoretical results for a finite baffled plate. In the second part, analytical and experimental validation of methods to predict structural vibration and radiated noise are presented. A rectangular box excited by a mechanical shaker was used as a vibrating structure. Combined finite element method (FEM) and boundary element method (BEM) models of the apparatus were used to predict the noise level radiated from the box. The FEM was used to predict the vibration, while the BEM was used to predict the sound intensity and total radiated sound power using surface vibration as the input data. Vibration predicted by the FEM model was validated by experimental modal analysis; noise predicted by the BEM was validated by measurements of sound intensity. Three types of results are presented for the total radiated sound power: sound power predicted by the BEM model using vibration data measured on the surface of the box; sound power predicted by the FEM/BEM model; and sound power measured by an acoustic intensity scan. In the third part, the structure used in part two was modified. A rib was attached to the top plate of the structure. The FEM and BEM were then used to predict structural vibration and radiated noise respectively. The predicted vibration and radiated noise were then validated through experimentation.
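The radiation efficiency computed for each housing mode in the first part is conventionally defined as

```latex
\sigma = \frac{W_\text{rad}}{\rho_0 c_0\, S\, \langle \overline{v_n^{2}} \rangle}
```

where W_rad is the radiated sound power, ρ_0 c_0 the characteristic impedance of air, S the radiating surface area, and the bracketed term the space- and time-averaged mean-square normal surface velocity. This is the standard acoustics definition, included here for orientation rather than quoted from the report.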
Experimental validation of solid rocket motor damping models
NASA Astrophysics Data System (ADS)
Riso, Cristina; Fransen, Sebastiaan; Mastroddi, Franco; Coppotelli, Giuliano; Trequattrini, Francesco; De Vivo, Alessio
2017-12-01
In design and certification of spacecraft, payload/launcher coupled load analyses are performed to simulate the satellite dynamic environment. To obtain accurate predictions, the system damping properties must be properly taken into account in the finite element model used for coupled load analysis. This is typically done using a structural damping characterization in the frequency domain, which is not applicable in the time domain. Therefore, the structural damping matrix of the system must be converted into an equivalent viscous damping matrix when a transient coupled load analysis is performed. This paper focuses on the validation of equivalent viscous damping methods for dynamically condensed finite element models via correlation with experimental data for a realistic structure representative of a slender launch vehicle with solid rocket motors. A second scope of the paper is to investigate how to conveniently choose a single combination of Young's modulus and structural damping coefficient—complex Young's modulus—to approximate the viscoelastic behavior of a solid propellant material in the frequency band of interest for coupled load analysis. A scaled-down test article inspired to the Z9-ignition Vega launcher configuration is designed, manufactured, and experimentally tested to obtain data for validation of the equivalent viscous damping methods. The Z9-like component of the test article is filled with a viscoelastic material representative of the Z9 solid propellant that is also preliminarily tested to investigate the dependency of the complex Young's modulus on the excitation frequency and provide data for the test article finite element model. Experimental results from seismic and shock tests performed on the test configuration are correlated with numerical results from frequency and time domain analyses carried out on its dynamically condensed finite element model to assess the applicability of different equivalent viscous damping methods to describe damping properties of slender launch vehicles in payload/launcher coupled load analysis.
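One common structural-to-viscous conversion of the kind the paper assesses (shown here as background, not as the authors' specific method) starts from the complex Young's modulus and matches the energy dissipated per cycle:

```latex
E^{*} = E\,(1 + i\eta),\qquad
c_\text{eq}(\omega) = \frac{\eta k}{\omega},\qquad
\zeta_\text{eq} \approx \frac{\eta}{2}
```

where η is the loss factor, k a modal stiffness, and ζ_eq the equivalent viscous damping ratio obtained when the energy match is made at the modal resonance frequency.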
Network news: prime time for systems biology of the plant circadian clock.
McClung, C Robertson; Gutiérrez, Rodrigo A
2010-12-01
Whole-transcriptome analyses have established that the plant circadian clock regulates virtually every plant biological process and most prominently hormonal and stress response pathways. Systems biology efforts have successfully modeled the plant central clock machinery and an iterative process of model refinement and experimental validation has contributed significantly to the current view of the central clock machinery. The challenge now is to connect this central clock to the output pathways for understanding how the plant circadian clock contributes to plant growth and fitness in a changing environment. Undoubtedly, systems approaches will be needed to integrate and model the vastly increased volume of experimental data in order to extract meaningful biological information. Thus, we have entered an era of systems modeling, experimental testing, and refinement. This approach, coupled with advances from the genetic and biochemical analyses of clock function, is accelerating our progress towards a comprehensive understanding of the plant circadian clock network. Copyright © 2010 Elsevier Ltd. All rights reserved.
Cervical Spine Injuries: A Whole-Body Musculoskeletal Model for the Analysis of Spinal Loading.
Cazzola, Dario; Holsgrove, Timothy P; Preatoni, Ezio; Gill, Harinderjit S; Trewartha, Grant
2017-01-01
Cervical spine trauma from sport or traffic collisions can have devastating consequences for individuals and a high societal cost. The precise mechanisms of such injuries are still unknown as investigation is hampered by the difficulty in experimentally replicating the conditions under which these injuries occur. We harness the benefits of computer simulation to report on the creation and validation of i) a generic musculoskeletal model (MASI) for the analyses of cervical spine loading in healthy subjects, and ii) a population-specific version of the model (Rugby Model), for investigating cervical spine injury mechanisms during rugby activities. The musculoskeletal models were created in OpenSim, and validated against in vivo data of a healthy subject and a rugby player performing neck and upper limb movements. The novel aspects of the Rugby Model comprise i) population-specific inertial properties and muscle parameters representing rugby forward players, and ii) a custom scapula-clavicular joint that allows the application of multiple external loads. We confirm the utility of the developed generic and population-specific models via verification steps and validation of kinematics, joint moments and neuromuscular activations during rugby scrummaging and neck functional movements, which achieve results comparable with in vivo and in vitro data. The Rugby Model was validated and used for the first time to provide insight into anatomical loading and cervical spine injury mechanisms related to rugby, whilst the MASI introduces a new computational tool to allow investigation of spinal injuries arising from other sporting activities, transport, and ergonomic applications. The models used in this study are freely available at simtk.org and allow in silico analyses to be integrated with experimental approaches to injury prevention.
NASA Astrophysics Data System (ADS)
Nair, B. G.; Winter, N.; Daniel, B.; Ward, R. M.
2016-07-01
Direct measurement of the flow of electric current during VAR is extremely difficult due to the aggressive environment, as the arc process itself controls the distribution of current. In previous studies the technique of “magnetic source tomography” was presented; this was shown to be effective, but it used a computationally intensive iterative method to analyse the distribution of arc centre position. In this paper we present faster computational methods requiring less numerical optimisation to determine the centre position of a single distributed arc both numerically and experimentally. Numerical validation of the algorithms was performed on models, and experimental validation on measurements of titanium and nickel alloys (Ti6Al4V and INCONEL 718). The results are used to comment on the effects of process parameters on arc behaviour during VAR.
NASA Astrophysics Data System (ADS)
Lee, Chang-Chun; Shih, Yan-Shin; Wu, Chih-Sheng; Tsai, Chia-Hao; Yeh, Shu-Tang; Peng, Yi-Hao; Chen, Kuang-Jung
2012-07-01
This work analyses the overall stress/strain characteristics of flexible encapsulations with organic light-emitting diode (OLED) devices. A robust methodology composed of a mechanical model of multi-thin-film structures under bending loads and related stress simulations based on nonlinear finite element analysis (FEA) is proposed and validated against related experimental data. With various geometrical combinations of cover plate, stacked thin films and plastic substrate, the position of the neutral axis (NA) plane, which is regarded as a key design parameter to minimize stress impact for the concerned OLED devices, is acquired using the present methodology. The results point out that both the thickness and mechanical properties of the cover plate help in determining the NA location. In addition, several concave and convex radii are applied to examine the reliable mechanical tolerance and to provide an insight into the estimated reliability of foldable OLED encapsulations.
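The neutral-axis reasoning in the preceding abstract can be illustrated with classical laminated-beam theory: for perfectly bonded layers in pure bending, the NA height is the modulus-weighted centroid of the stack, y_NA = Σ E_i t_i ȳ_i / Σ E_i t_i. The following sketch evaluates that formula for an invented cover-plate/film/substrate stack; it is a simplified stand-in for the paper's nonlinear FEA methodology, valid only under small-strain, perfect-bonding assumptions.

```python
# Hedged illustration: neutral-axis (NA) height of a multilayer stack in
# pure bending, from the modulus-weighted centroid of the layers.
# Layer moduli and thicknesses are hypothetical examples.
def neutral_axis(layers):
    """layers: list of (E, t) tuples from bottom to top (consistent units,
    e.g. E in GPa and t in um). Returns NA height above the bottom surface."""
    y = num = den = 0.0
    for E, t in layers:
        y_mid = y + t / 2.0   # centroid height of this layer
        num += E * t * y_mid
        den += E * t
        y += t
    return num / den

stack = [
    (2.5, 100.0),   # plastic substrate (assumed)
    (80.0, 0.3),    # lumped device thin films (assumed)
    (70.0, 50.0),   # cover plate (assumed)
]
print(f"NA at {neutral_axis(stack):.1f} um above the bottom surface")
```

Moving the NA onto the brittle device layer, for instance by tuning cover-plate thickness or modulus, is exactly the design lever the abstract describes.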
Zuthi, M F R; Ngo, H H; Guo, W S; Nghiem, L D; Hai, F I; Xia, S Q; Zhang, Z Q; Li, J X
2015-08-01
This study investigates the influence of key biomass parameters on specific oxygen uptake rate (SOUR) in a sponge submerged membrane bioreactor (SSMBR) to develop mathematical models of biomass viability. Extra-cellular polymeric substances (EPS) were considered as a lumped parameter of bound EPS (bEPS) and soluble microbial products (SMP). Statistical analyses of experimental results indicate that the bEPS, SMP, mixed liquor suspended solids and volatile suspended solids (MLSS and MLVSS) have functional relationships with SOUR and their relative influence on SOUR was in the order of EPS>bEPS>SMP>MLVSS/MLSS. Based on correlations among biomass parameters and SOUR, two independent empirical models of biomass viability were developed. The models were validated using results of the SSMBR. However, further validation of the models for different operating conditions is suggested. Copyright © 2015 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Serna Moreno, M. C.; Romero Gutierrez, A.; Martínez Vicente, J. L.
2016-07-01
An analytical model has been derived for describing the results of three-point-bending tests in materials with different behaviour under tension and compression. The shift of the neutral plane and the damage initiation mode and its location have been defined. The validity of the equations has been reviewed by testing carbon fibre-reinforced polymers (CFRP), typically employed in different weight-critical applications. Both unidirectional and cross-ply laminates have been studied. The initial failure mode depends directly on the beam span-to-thickness ratio. Therefore, specimens with different thicknesses have been analysed to examine the damage initiation due to either the bending moment or the out-of-plane shear load. The experimental description of the damage initiation and evolution has been shown by means of optical microscopy. The good agreement between the analytical estimations and the experimental results shows the validity of the analytical model presented.
Development and validation of the multidimensional state boredom scale.
Fahlman, Shelley A; Mercer-Lynn, Kimberley B; Flora, David B; Eastwood, John D
2013-02-01
This article describes the development and validation of the Multidimensional State Boredom Scale (MSBS)-the first and only full-scale measure of state boredom. It was developed based on a theoretically and empirically grounded definition of boredom. A five-factor structure of the scale (Disengagement, High Arousal, Low Arousal, Inattention, and Time Perception) was supported by exploratory factor analyses and confirmatory factor analyses of two independent samples. Furthermore, all subscales were significantly related to a single, second-order factor. The MSBS factor structure was shown to be invariant across gender. MSBS scores were significantly correlated with measures of trait boredom, depression, anxiety, anger, inattention, impulsivity, neuroticism, life satisfaction, and purpose in life. Finally, MSBS scores distinguished between participants who were experimentally manipulated into a state of boredom and those who were not, above and beyond measures of trait boredom, negative affect, and depression.
Khanfar, Mohammad A; Banat, Fahmy; Alabed, Shada; Alqtaishat, Saja
2017-02-01
High expression of Nek2 has been detected in several types of cancer, and it represents a novel target for human cancer. In the current study, structure-based pharmacophore modeling combined with multiple linear regression (MLR)-based QSAR analyses was applied to disclose the structural requirements for NEK2 inhibition. Generated pharmacophoric models were initially validated with receiver operating characteristic (ROC) curves, and optimum models were subsequently implemented in QSAR modeling with other physicochemical descriptors. QSAR-selected models were employed as 3D search filters to mine the National Cancer Institute (NCI) database for novel NEK2 inhibitors, whereas the associated QSAR model prioritized the bioactivities of captured hits for in vitro evaluation. Experimental validation identified several potent NEK2 inhibitors of novel structural scaffolds. The most potent captured hit exhibited an [Formula: see text] value of 237 nM.
Phenomenological study of decoherence in solid-state spin qubits due to nuclear spin diffusion
NASA Astrophysics Data System (ADS)
Biercuk, Michael J.; Bluhm, Hendrik
2011-06-01
We present a study of the prospects for coherence preservation in solid-state spin qubits using dynamical decoupling protocols. Recent experiments have provided the first demonstrations of multipulse dynamical decoupling sequences in this qubit system, but quantitative analyses of potential coherence improvements have been hampered by a lack of concrete knowledge of the relevant noise processes. We present calculations of qubit coherence under the application of arbitrary dynamical decoupling pulse sequences based on an experimentally validated semiclassical model. This phenomenological approach bundles the details of underlying noise processes into a single experimentally relevant noise power spectral density. Our results show that the dominant features of experimental measurements in a two-electron singlet-triplet spin qubit can be replicated using a 1/ω2 noise power spectrum associated with nuclear spin flips in the host material. Beginning with this validation, we address the effects of nuclear programming, high-frequency nuclear spin dynamics, and other high-frequency classical noise sources, with conjectures supported by physical arguments and microscopic calculations where relevant. Our results provide expected performance bounds and identify diagnostic metrics that can be measured experimentally in order to better elucidate the underlying nuclear spin dynamics.
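The semiclassical picture in this abstract has a compact quantitative core: coherence decays as W(t) = exp(-χ(t)), where χ(t) folds the noise power spectral density S(ω) through the filter function F(ωt) of the chosen pulse sequence. The sketch below evaluates one common convention of that integral (prefactors differ between references) for a Hahn echo, F(ωt) = 8 sin⁴(ωt/4), and an assumed 1/ω² spectrum; the amplitude, frequency cutoffs, and times are all illustrative, not values from the paper.

```python
# Illustrative filter-function calculation for a Hahn (single pi-pulse)
# echo under 1/omega^2 noise. Convention (one of several in the
# literature): chi(t) = (2/pi) * Integral S(w) * F(w t) / w^2 dw,
# with F(w t) = 8 * sin^4(w t / 4) for the Hahn echo.
import numpy as np
from scipy.integrate import trapezoid

A = 1e13                         # assumed noise amplitude, sets the time scale
S = lambda w: A / w**2           # assumed 1/w^2 power spectral density

w = np.logspace(0, 7, 200_000)   # integration grid [rad/s], assumed cutoffs

def coherence_hahn(t):
    chi = (2 / np.pi) * trapezoid(S(w) * 8 * np.sin(w * t / 4) ** 4 / w**2, w)
    return np.exp(-chi)

for t in (1e-5, 1e-4, 3e-4):     # seconds
    print(f"t = {t:.0e} s -> W(t) = {coherence_hahn(t):.4f}")
```

For this spectrum the integral has a closed form (χ grows as At³), so the numerical result doubles as a self-check; multipulse CPMG or UDD sequences simply swap in their own F(ωt).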
Thermodynamic analyses and the experimental validation of the Pulse Tube Expander system
NASA Astrophysics Data System (ADS)
Jia, Qiming; Gong, Linghui; Feng, Guochao; Zou, Longhui
2018-04-01
A Pulse Tube Expander (PTE) for small and medium capacity cryogenic refrigeration systems is described in this paper. An analysis of the Pulse Tube Expander is developed based on the thermodynamic analyses of the system. It is shown that the gas expansion is isentropic in the cold end of the pulse tube. The temperature variation at the outlet of Pulse Tube Expander is measured and the isentropic efficiency is calculated to be 0.455 at 2 Hz. The pressure oscillations in the pulse tube are obtained at different frequencies. The limitations and advantages of this system are also discussed.
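The isentropic efficiency quoted above follows from textbook ideal-gas relations: the ideal outlet temperature of an isentropic expansion is T_out,s = T_in (p_out/p_in)^((γ-1)/γ), and the efficiency compares the actual to the ideal temperature drop. The numbers below are invented for illustration and merely land in the same range as the paper's 0.455.

```python
# Worked example: isentropic efficiency of an expander from measured
# temperatures and pressures (ideal-gas relations; values assumed).
def isentropic_efficiency(T_in, T_out, p_in, p_out, gamma=5.0 / 3.0):
    T_out_s = T_in * (p_out / p_in) ** ((gamma - 1.0) / gamma)  # ideal outlet T
    return (T_in - T_out) / (T_in - T_out_s)

# Hypothetical helium expansion: 300 K inlet, 2:1 pressure ratio
eta_s = isentropic_efficiency(T_in=300.0, T_out=270.0, p_in=2.0e5, p_out=1.0e5)
print(f"eta_s = {eta_s:.3f}")
```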
Rubble masonry response under cyclic actions: The experience of L’Aquila city (Italy)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fonti, Roberta, E-mail: roberta.fonti@tum.de; Barthel, Rainer, E-mail: r.barthel@lrz.tu-muenchen.de; Formisano, Antonio, E-mail: antoform@unina.it
2015-12-31
Several methods of analysis are available in engineering practice to study old masonry constructions. Two commonly used approaches in the field of seismic engineering are global and local analyses. Despite several years of research in this field, the various methodologies suffer from a lack of comprehensive experimental validation. This is mainly due to the difficulty in simulating the many different kinds of masonry and, accordingly, the non-linear response under horizontal actions. This issue can be addressed by examining the local response of isolated panels under monotonic and/or alternate actions. Different testing methodologies are commonly used to identify the local response of old masonry. These range from simplified pull-out tests to sophisticated in-plane monotonic tests. However, there is a lack of both knowledge and critical comparison between experimental validations and numerical simulations. This is mainly due to the difficulties in implementing irregular settings within both simplified and advanced numerical analyses. Similarly, the simulation of degradation effects within laboratory tests is difficult with respect to old masonry in-situ boundary conditions. Numerical models, particularly on rubble masonry, are commonly simplified. They are mainly based on a kinematic chain of rigid blocks able to perform different “modes of damage” of structures subjected to horizontal actions. This paper presents an innovative methodology for testing; its aim is to identify a simplified model for out-of-plane response of rubbleworks with respect to the experimental evidence. The case study of L’Aquila district is discussed.
Assessing the significance of pedobarographic signals using random field theory.
Pataky, Todd C
2008-08-07
Traditional pedobarographic statistical analyses are conducted over discrete regions. Recent studies have demonstrated that regionalization can corrupt pedobarographic field data through conflation when arbitrary dividing lines inappropriately delineate smooth field processes. An alternative is to register images such that homologous structures optimally overlap and then conduct statistical tests at each pixel to generate statistical parametric maps (SPMs). The significance of SPM processes may be assessed within the framework of random field theory (RFT). RFT is ideally suited to pedobarographic image analysis because its fundamental data unit is a lattice sampling of a smooth and continuous spatial field. To correct for the vast number of multiple comparisons inherent in such data, recent pedobarographic studies have employed a Bonferroni correction to retain a constant family-wise error rate. This approach unfortunately neglects the spatial correlation of neighbouring pixels, so provides an overly conservative (albeit valid) statistical threshold. RFT generally relaxes the threshold depending on field smoothness and on the geometry of the search area, but it also provides a framework for assigning p values to suprathreshold clusters based on their spatial extent. The current paper provides an overview of basic RFT concepts and uses simulated and experimental data to validate both RFT-relevant field smoothness estimations and RFT predictions regarding the topological characteristics of random pedobarographic fields. Finally, previously published experimental data are re-analysed using RFT inference procedures to demonstrate how RFT yields easily understandable statistical results that may be incorporated into routine clinical and laboratory analyses.
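The practical payoff of the RFT approach described above can be seen by comparing thresholds. A Bonferroni correction treats every pixel as independent, whereas RFT sets the threshold so that the expected Euler characteristic of the suprathreshold excursion set equals α, using the resel (resolution element) count derived from field smoothness. The sketch below uses the standard 2D Gaussian-field expression as applied in SPM-style analyses; the pixel count and FWHM smoothness are invented.

```python
# Bonferroni vs. random field theory (RFT) thresholds for a smooth 2D
# Gaussian statistic field. Search-area size and smoothness are assumed.
import numpy as np
from scipy.stats import norm
from scipy.optimize import brentq

n_pixels = 10_000            # pixels in the search area (assumed)
fwhm = 8.0                   # field smoothness in pixels (assumed)
resels = n_pixels / fwhm**2  # 2D resel count

z_bonferroni = norm.isf(0.05 / n_pixels)

# RFT: expected Euler characteristic of the excursion set at height z,
# 2D case: E[EC] = resels * (4 ln 2) * (2 pi)^(-3/2) * z * exp(-z^2 / 2)
expected_ec = lambda z: resels * 4 * np.log(2) * (2 * np.pi) ** -1.5 \
                        * z * np.exp(-z**2 / 2)
z_rft = brentq(lambda z: expected_ec(z) - 0.05, 2.0, 10.0)

print(f"Bonferroni z = {z_bonferroni:.2f}, RFT z = {z_rft:.2f}")
```

With these inputs the RFT threshold comes out noticeably lower than the Bonferroni one, which is exactly the conservatism the abstract attributes to ignoring spatial correlation.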
NASA Astrophysics Data System (ADS)
Amanowicz, Łukasz; Wojtkowiak, Janusz
2017-11-01
In this paper the experimentally obtained flow characteristics of multi-pipe earth-to-air heat exchangers (EAHEs) were used to validate the EAHE flow performance numerical model prepared by means of the CFD software Ansys Fluent. Cut-cell meshing and the realizable k-ε turbulence model with default coefficient values and enhanced wall treatment were used. The total pressure losses and the airflow in each pipe of the multi-pipe exchangers were investigated both experimentally and numerically. The results show that the airflow in each pipe of multi-pipe EAHE structures is not equal. The validated numerical model can be used for the proper design of multi-pipe EAHEs from the flow characteristics point of view. The influence of EAHE geometrical parameters on the total pressure losses and the airflow division between the exchanger pipes can also be analysed. Using CFD to design EAHEs can help HVAC (heating, ventilation and air conditioning) engineers optimize the geometrical structure of multi-pipe EAHEs in order to save energy and decrease the operational costs of low-energy buildings.
Caldeira, Letícia Gomes Magnago; Santos, Flávio Alves; de Oliveira, Andréa Melo Garcia; Lima, Josefa Abucater; de Souza, Leonardo Francisco; da Silva, Guilherme Resende; de Assis, Débora Cristina Sampaio
2017-01-01
A multiresidue method by UHPLC/MS-MS was optimized and validated for the screening and semiquantitative detection of antimicrobial residues from the tetracycline, aminoglycoside, quinolone, lincosamide, β-lactam, sulfonamide, and macrolide families in eggs. A qualitative approach was used to ensure adequate sensitivity to detect residues at the level of interest, defined as the maximum residue limit (MRL), or less. The applicability of the method was assessed by analyzing egg samples from hens that had been subjected to pharmacological treatment with neomycin, enrofloxacin, lincomycin, oxytetracycline, and doxycycline for five days and after discontinuation of medication (10 days). The method was adequate for screening all studied analytes in eggs, since the performance parameters ensured a false-compliant rate below or equal to 5%, except for flumequine. In the analyses of eggs from laying hens subjected to pharmacological treatment, all antimicrobial residues were detected throughout the experimental period, even after discontinuation of medication, except for neomycin, demonstrating the applicability of the method for analyses of antimicrobial residues in eggs. PMID:29181222
Toward a CFD nose-to-tail capability - Hypersonic unsteady Navier-Stokes code validation
NASA Technical Reports Server (NTRS)
Edwards, Thomas A.; Flores, Jolen
1989-01-01
Computational fluid dynamics (CFD) research for hypersonic flows presents new problems in code validation because of the added complexity of the physical models. This paper surveys code validation procedures applicable to hypersonic flow models that include real gas effects. The current status of hypersonic CFD flow analysis is assessed with the Compressible Navier-Stokes (CNS) code as a case study. The methods of code validation discussed go beyond comparison with experimental data to include comparisons with other codes and formulations, component analyses, and estimation of numerical errors. Current results indicate that predicting hypersonic flows of perfect gases and equilibrium air is well in hand. Pressure, shock location, and integrated quantities are relatively easy to predict accurately, while surface quantities such as heat transfer are more sensitive to the solution procedure. Modeling transition to turbulence needs refinement, though preliminary results are promising.
Methodological quality of meta-analyses of single-case experimental studies.
Jamshidi, Laleh; Heyvaert, Mieke; Declercq, Lies; Fernández-Castilla, Belén; Ferron, John M; Moeyaert, Mariola; Beretvas, S Natasha; Onghena, Patrick; Van den Noortgate, Wim
2017-12-28
Methodological rigor is a fundamental factor in the validity and credibility of the results of a meta-analysis. Following an increasing interest in single-case experimental design (SCED) meta-analyses, the current study investigates the methodological quality of SCED meta-analyses. We assessed the methodological quality of 178 SCED meta-analyses published between 1985 and 2015 through the modified Revised-Assessment of Multiple Systematic Reviews (R-AMSTAR) checklist. The main finding of the current review is that the methodological quality of the SCED meta-analyses has increased over time, but is still low according to the R-AMSTAR checklist. A remarkable percentage of the studies (93.80% of the included SCED meta-analyses) did not even reach the midpoint score (22, on a scale of 0-44). The mean and median methodological quality scores were 15.57 and 16, respectively. Relatively high scores were observed for "providing the characteristics of the included studies" and "doing comprehensive literature search". The key areas of deficiency were "reporting an assessment of the likelihood of publication bias" and "using the methods appropriately to combine the findings of studies". Although the results of the current review reveal that the methodological quality of the SCED meta-analyses has increased over time, still more efforts are needed to improve their methodological quality. Copyright © 2017 Elsevier Ltd. All rights reserved.
Design and experimental results of the 1-T Bitter Electromagnet Testing Apparatus (BETA)
NASA Astrophysics Data System (ADS)
Bates, E. M.; Birmingham, W. J.; Romero-Talamás, C. A.
2018-05-01
The Bitter Electromagnet Testing Apparatus (BETA) is a 1-Tesla (T) technical prototype of the 10 T Adjustable Long Pulsed High-Field Apparatus. BETA's final design specifications, which include electromagnetic, thermal, and stress analyses, are highlighted in this paper. We discuss here the design and fabrication of BETA's core, vessel, cooling, and electrical subsystems. The electrical system of BETA is composed of a scalable solid-state DC breaker circuit. Experimental results display the stable operation of BETA at 1 T. These results are compared to both analytical design and finite element calculations. Experimental results validate the analytical magnet design methods developed at the Dusty Plasma Laboratory. The theoretical steady-state maxima and the limits of BETA's design are explored in this paper.
Collapse of a Liquid Column: Numerical Simulation and Experimental Validation
NASA Astrophysics Data System (ADS)
Cruchaga, Marcela A.; Celentano, Diego J.; Tezduyar, Tayfun E.
2007-03-01
This paper is focused on the numerical and experimental analyses of the collapse of a liquid column. The measurements of the interface position in a set of experiments carried out with shampoo and water for two different initial column aspect ratios are presented together with the corresponding numerical predictions. The experimental procedure was found to provide acceptable recurrence in the observation of the interface evolution. Basic models describing some of the relevant physical aspects, e.g. wall friction and turbulence, are included in the simulations. Numerical experiments are conducted to evaluate the influence of the parameters involved in the modeling by comparing the results with the data from the measurements. The numerical predictions reasonably describe the physical trends.
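A useful analytical cross-check for the early stage of such a collapse (not part of the paper's interface-tracking model) is Ritter's shallow-water solution for an instantaneous dam break on a dry frictionless bed, whose wave front advances as x_front(t) = 2t√(gh₀). The snippet below evaluates it for an assumed column height.

```python
# Ritter dam-break front position: x_front(t) = 2 * t * sqrt(g * h0).
# Inviscid shallow-water idealization; h0 is an assumed column height.
import math

g = 9.81     # m/s^2
h0 = 0.15    # m, assumed initial liquid column height

for t in (0.05, 0.10, 0.20):  # seconds after release
    x_front = 2.0 * t * math.sqrt(g * h0)
    print(f"t = {t:.2f} s -> front at {x_front:.3f} m")
```

Wall friction and turbulence, which the paper models explicitly, slow the real front relative to this idealized bound.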
BATMAN-TCM: a Bioinformatics Analysis Tool for Molecular mechANism of Traditional Chinese Medicine.
Liu, Zhongyang; Guo, Feifei; Wang, Yong; Li, Chun; Zhang, Xinlei; Li, Honglei; Diao, Lihong; Gu, Jiangyong; Wang, Wei; Li, Dong; He, Fuchu
2016-02-16
Traditional Chinese Medicine (TCM), with a history of thousands of years of clinical practice, is gaining more and more attention and application worldwide. And TCM-based new drug development, especially for the treatment of complex diseases is promising. However, owing to the TCM's diverse ingredients and their complex interaction with human body, it is still quite difficult to uncover its molecular mechanism, which greatly hinders the TCM modernization and internationalization. Here we developed the first online Bioinformatics Analysis Tool for Molecular mechANism of TCM (BATMAN-TCM). Its main functions include 1) TCM ingredients' target prediction; 2) functional analyses of targets including biological pathway, Gene Ontology functional term and disease enrichment analyses; 3) the visualization of ingredient-target-pathway/disease association network and KEGG biological pathway with highlighted targets; 4) comparison analysis of multiple TCMs. Finally, we applied BATMAN-TCM to Qishen Yiqi dripping Pill (QSYQ) and combined with subsequent experimental validation to reveal the functions of renin-angiotensin system responsible for QSYQ's cardioprotective effects for the first time. BATMAN-TCM will contribute to the understanding of the "multi-component, multi-target and multi-pathway" combinational therapeutic mechanism of TCM, and provide valuable clues for subsequent experimental validation, accelerating the elucidation of TCM's molecular mechanism. BATMAN-TCM is available at http://bionet.ncpsb.org/batman-tcm.
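Enrichment analyses of the kind listed among BATMAN-TCM's functions are conventionally built on a one-sided hypergeometric test: given N annotated genes of which K belong to a pathway, the p-value asks how surprising it is that k of the n predicted targets fall in that pathway. The sketch below shows that generic test with invented counts; it is not a reproduction of the server's exact statistics.

```python
# Generic pathway enrichment test (one-sided hypergeometric). Counts are
# hypothetical; scipy's convention is hypergeom(M, n, N) with M = population
# size, n = successes in the population, N = draws.
from scipy.stats import hypergeom

N_genes = 20_000   # annotated genes in the background (assumed)
K_pathway = 150    # genes annotated to the pathway (assumed)
n_targets = 80     # predicted targets of the ingredients (assumed)
k_overlap = 7      # predicted targets inside the pathway (assumed)

p = hypergeom.sf(k_overlap - 1, N_genes, K_pathway, n_targets)  # P(X >= k)
print(f"enrichment p-value = {p:.3e}")
```

With these counts the expected overlap by chance is under one gene, so observing seven yields a very small p-value, the kind of signal such tools report after multiple-testing correction.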
Han, Sang-Uk; Ahn, Dae-Gyun; Lee, Myeong-Gon; Lee, Kwon-Hee; Han, Seung-Ho
2014-01-01
Valves that are used to control cooling water in the primary coolant loop, which prevents boiling within the reactor in a nuclear power plant, must be capable of withstanding earthquakes or other dangerous situations without losing structural integrity. In this study, numerical analyses using a finite element method, that is, static and dynamic analyses according to the rigid or flexible characteristics of the dynamic properties of a 200A butterfly valve, were performed according to the KEPIC MFA. An experimental vibration test was also carried out in order to verify the results from the modal analysis, in which a validated finite element model was obtained via a model-updating method that considers changes in the in situ experimental data. By using the validated finite element model, the equivalent static load under SSE conditions stipulated by the KEPIC MFA gave a stress of 135 MPa at the connections of the stem and body. A larger stress of 183 MPa was induced when the CQC method was used with a design response spectrum that assumes a 2% damping ratio. These values were lower than the allowable strength of the materials used for manufacturing the butterfly valve, and, therefore, its structural safety met the KEPIC MFA requirements.
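The CQC (complete quadratic combination) rule invoked above combines peak modal responses with cross-modal correlation coefficients; for equal modal damping the widely used Der Kiureghian expression is ρ_ij = 8ζ²(1+β)β^(3/2) / [(1-β²)² + 4ζ²β(1+β)²] with β = ω_j/ω_i. The sketch below implements that rule with made-up modal responses, not the valve model's actual results; the 2% damping ratio mirrors the design spectrum mentioned in the abstract.

```python
# CQC modal combination with the equal-damping Der Kiureghian correlation
# coefficient. Modal responses and frequencies are hypothetical inputs.
import numpy as np

def cqc(r_modal, omega, zeta=0.02):
    """r_modal: peak responses per mode; omega: circular frequencies
    [rad/s]; zeta: common modal damping ratio."""
    r = np.asarray(r_modal, dtype=float)
    w = np.asarray(omega, dtype=float)
    b = w[None, :] / w[:, None]                      # frequency ratios
    rho = (8 * zeta**2 * (1 + b) * b**1.5) / \
          ((1 - b**2) ** 2 + 4 * zeta**2 * b * (1 + b) ** 2)
    return float(np.sqrt(r @ rho @ r))               # combined peak response

r_modes = [120.0, 60.0, 25.0]                          # assumed peak stresses
w_modes = [2 * np.pi * f for f in (12.0, 14.0, 40.0)]  # assumed frequencies
print(f"CQC response = {cqc(r_modes, w_modes):.1f}")
```

Because the first two assumed modes are closely spaced, their positive correlation term pushes the CQC result above the simple SRSS value, which is why CQC is preferred when modal frequencies cluster.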
Supersonic unstalled flutter. [aerodynamic loading of thin airfoils induced by cascade motion
NASA Technical Reports Server (NTRS)
Adamczyk, J. J.; Goldstein, M. E.; Hartmann, M. J.
1978-01-01
Flutter analyses were developed to predict the onset of supersonic unstalled flutter of a cascade of two-dimensional airfoils. The first of these analyzes the onset of supersonic flutter at low levels of aerodynamic loading (i.e., backpressure), while the second examines the occurrence of supersonic flutter at moderate levels of aerodynamic loading. Both of these analyses are based on the linearized unsteady inviscid equations of gas dynamics to model the flow field surrounding the cascade. These analyses are utilized in a parametric study to show the effects of cascade geometry, inlet Mach number, and backpressure on the onset of single and multi-degree-of-freedom unstalled supersonic flutter. Several of the results are correlated against qualitative experimental observations to validate the models.
Bredbenner, Todd L.; Eliason, Travis D.; Francis, W. Loren; McFarland, John M.; Merkle, Andrew C.; Nicolella, Daniel P.
2014-01-01
Cervical spinal injuries are a significant concern in all trauma injuries. Recent military conflicts have demonstrated the substantial risk of spinal injury for the modern warfighter. Finite element models used to investigate injury mechanisms often fail to examine the effects of variation in geometry or material properties on mechanical behavior. The goals of this study were to model geometric variation for a set of cervical spines, to extend this model to a parametric finite element model, and, as a first step, to validate the parametric model against experimental data for low-loading conditions. Individual finite element models were created using cervical spine (C3–T1) computed tomography data for five male cadavers. Statistical shape modeling (SSM) was used to generate a parametric finite element model incorporating variability of spine geometry, and soft-tissue material property variation was also included. The probabilistic loading response of the parametric model was determined under flexion-extension, axial rotation, and lateral bending and validated by comparison to experimental data. Based on qualitative and quantitative comparison of the experimental loading response and model simulations, we suggest that the model performs adequately under relatively low-level loading conditions in multiple loading directions. In conclusion, SSM methods coupled with finite element analyses within a probabilistic framework, along with the ability to statistically validate the overall model performance, provide innovative and important steps toward describing the differences in vertebral morphology, spinal curvature, and variation in material properties. We suggest that these methods, with additional investigation and validation under injurious loading conditions, will lead to understanding and mitigating the risks of injury in the spine and other musculoskeletal structures. PMID:25506051
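The statistical shape modeling step described above is, at its core, a principal component decomposition of registered landmark sets: every spine is flattened into a coordinate vector, the mean shape is removed, and the leading singular vectors become shape modes, so a new geometry is mean + Σ b_k · mode_k. The sketch below runs that pipeline on random stand-in data rather than the study's CT-derived landmarks.

```python
# Minimal statistical shape model via SVD/PCA on flattened landmark
# coordinates. Data are random placeholders for registered (x, y, z)
# landmark sets from several subjects.
import numpy as np

rng = np.random.default_rng(0)
n_subjects, n_landmarks = 5, 300
X = rng.normal(size=(n_subjects, 3 * n_landmarks))   # one row per subject

mean_shape = X.mean(axis=0)
U, s, Vt = np.linalg.svd(X - mean_shape, full_matrices=False)

modes = Vt                               # orthonormal shape modes
var = s**2 / (n_subjects - 1)            # variance carried by each mode
b = rng.normal(scale=np.sqrt(var[:2]))   # sample plausible mode weights

new_shape = mean_shape + b @ modes[:2]   # synthesize a new spine geometry
print("variance fractions:", np.round(var / var.sum(), 3))
```

Sampling the weights b within their observed variance is what turns the shape model into the probabilistic, parametric FE input the abstract describes.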
NASA Astrophysics Data System (ADS)
Rajak, D. K.; Deshpande, P. G.; Kumaraswamidhas, L. A.
2017-08-01
This paper presents an experimental investigation of the compressive behaviour of square tubes filled with pumice lightweight concrete (PLC). A square section of 20×20×30 mm, which forms the backbone structure, is investigated. The compression deformation results show improved folding mechanisms, displacement values, and energy absorption. Aluminium thin-walled tubes filled with PLC exhibited superior energy absorption capacity (EAC) under low strain rates at room temperature. This superior EAC, which results from the mutual deformation benefit between the aluminium section and the PLC, is also analysed. The PLC was characterised by Fourier Transform Infrared (FTIR) spectroscopy, Field Emission Scanning Electron Microscopy (FESEM), and Energy Dispersive X-ray Spectrometry (EDX) analyses for a better understanding of the material behaviour. Individual and comparative load-bearing graphs are recorded for better analysis. This novel approach aims at validating porous lightweight concrete as an improved lightweight energy-absorbing filler material.
PCA as a practical indicator of OPLS-DA model reliability.
Worley, Bradley; Powers, Robert
Principal Component Analysis (PCA) and Orthogonal Projections to Latent Structures Discriminant Analysis (OPLS-DA) are powerful statistical modeling tools that provide insights into separations between experimental groups based on high-dimensional spectral measurements from NMR, MS or other analytical instrumentation. However, when used without validation, these tools may lead investigators to statistically unreliable conclusions. This danger is especially real for Partial Least Squares (PLS) and OPLS, which aggressively force separations between experimental groups. As a result, OPLS-DA is often used as an alternative method when PCA fails to expose group separation, but this practice is highly dangerous. Without rigorous validation, OPLS-DA can easily yield statistically unreliable group separation. A Monte Carlo analysis of PCA group separations and OPLS-DA cross-validation metrics was performed on NMR datasets with statistically significant separations in scores-space. A linearly increasing amount of Gaussian noise was added to each data matrix followed by the construction and validation of PCA and OPLS-DA models. With increasing added noise, the PCA scores-space distance between groups rapidly decreased and the OPLS-DA cross-validation statistics simultaneously deteriorated. A decrease in correlation between the estimated loadings (added noise) and the true (original) loadings was also observed. While the validity of the OPLS-DA model diminished with increasing added noise, the group separation in scores-space remained basically unaffected. Supported by the results of Monte Carlo analyses of PCA group separations and OPLS-DA cross-validation metrics, we provide practical guidelines and cross-validatory recommendations for reliable inference from PCA and OPLS-DA models.
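A toy version of the Monte Carlo experiment described above is easy to reproduce: start from a two-group matrix with genuine separation, add increasing Gaussian noise, and track a normalized group separation in PCA scores space. The synthetic data below stand in for the NMR datasets; scikit-learn supplies the PCA.

```python
# Monte Carlo-style illustration: PCA scores-space group separation
# degrades as Gaussian noise is added. Synthetic data, not NMR spectra.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)
n, p = 20, 100
group = np.repeat([0, 1], n // 2)
X = rng.normal(size=(n, p))
X[group == 1, :5] += 3.0          # genuine separation in 5 variables

for noise_sd in (0.0, 1.0, 3.0, 6.0):
    Xn = X + rng.normal(scale=noise_sd, size=X.shape)
    scores = PCA(n_components=2).fit_transform(Xn)
    gap = np.linalg.norm(scores[group == 0].mean(0) - scores[group == 1].mean(0))
    spread = np.sqrt(scores.var(axis=0).sum())   # overall scores scatter
    print(f"noise sd = {noise_sd:.0f}: normalized separation = {gap / spread:.2f}")
```

An OPLS-DA model fitted to the same noisy matrices would still display separated scores, which is the danger the abstract warns about; only cross-validation statistics reveal the deterioration.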
Validation of the mean radiant temperature simulated by the RayMan software in urban environments.
Lee, Hyunjung; Mayer, Helmut
2016-11-01
The RayMan software is applied worldwide in investigations on different issues in human biometeorology. However, only the simulated mean radiant temperature (Tmrt) has been validated so far, in a few case studies. These are based on Tmrt values that were experimentally determined in urban environments by use of a globe thermometer or by applying the six-directional method. This study analyses previous Tmrt validations in a comparative manner. Their results are extended by a recent validation of Tmrt in an urban micro-environment in Freiburg (southwest Germany), which can be regarded as relatively heterogeneous due to different shading intensities by tree crowns. In addition, a validation of the physiologically equivalent temperature (PET) simulated by RayMan is conducted for the first time. The validations are based on experimentally determined Tmrt and PET values, which were calculated from measured meteorological variables in the daytime of a clear-sky summer day. In total, the validation results show that RayMan is capable of simulating Tmrt satisfactorily under relatively homogeneous site conditions. However, the inaccuracy of simulated Tmrt increases with lower sun elevation and growing heterogeneity of the simulation site. As Tmrt is the meteorological variable that most strongly governs PET in the daytime of clear-sky summer days, the accuracy of simulated Tmrt is mainly responsible for the accuracy of simulated PET. The Tmrt validations result in some recommendations, which concern an update of the physical principles applied in the RayMan software to simulate the short- and long-wave radiant flux densities, especially from vertical building walls and tree crowns.
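For reference, the six-directional method mentioned above converts six measured short-wave (K_i) and long-wave (L_i) flux densities into Tmrt by weighting each direction, summing the absorbed radiation, and inverting the Stefan-Boltzmann law. The sketch follows the commonly used convention for a standing person (lateral weights 0.22, vertical weights 0.06; short-wave absorption a_k ≈ 0.7, emissivity ε_p ≈ 0.97) with invented flux values, so it illustrates the formula rather than the study's measurements.

```python
# Six-directional mean radiant temperature (Tmrt). Directional weights and
# absorption coefficients follow common human-biometeorology conventions;
# the K/L flux densities below are invented examples [W/m2].
SIGMA = 5.67e-8  # Stefan-Boltzmann constant [W m-2 K-4]

def tmrt_six_directional(K, L, a_k=0.7, eps_p=0.97,
                         W=(0.22, 0.22, 0.22, 0.22, 0.06, 0.06)):
    """K, L: six short-/long-wave flux densities (E, W, S, N, up, down)."""
    s_str = sum(w * (a_k * k + eps_p * l) for w, k, l in zip(W, K, L))
    return (s_str / (eps_p * SIGMA)) ** 0.25 - 273.15  # degC

K = (120.0, 80.0, 200.0, 60.0, 350.0, 90.0)     # assumed short-wave fluxes
L = (420.0, 415.0, 430.0, 410.0, 380.0, 460.0)  # assumed long-wave fluxes
print(f"Tmrt = {tmrt_six_directional(K, L):.1f} degC")
```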
NASA Astrophysics Data System (ADS)
Goit, Chandra Shekhar; Saitoh, Masato
2013-03-01
Horizontal impedance functions of inclined single piles are measured experimentally for model soil-pile systems with both the effects of local soil nonlinearity and resonant characteristics. Two practical pile inclinations of 5° and 10° in addition to a vertical pile embedded in cohesionless soil and subjected to lateral harmonic pile head loadings for a wide range of frequencies are considered. Results obtained with low-to-high amplitude of lateral loadings on model soil-pile systems encased in a laminar shear box show that the local nonlinearities have a profound impact on the horizontal impedance functions of piles. Horizontal impedance functions of inclined piles are found to be smaller than the vertical pile and the values decrease as the angle of pile inclination increases. Distinct values of horizontal impedance functions are obtained for the 'positive' and 'negative' cycles of harmonic loadings, leading to asymmetric force-displacement relationships for the inclined piles. Validation of these experimental results is carried out through three-dimensional nonlinear finite element analyses, and the results from the numerical models are in good agreement with the experimental data. Sensitivity analyses conducted on the numerical models suggest that the consideration of local nonlinearity at the vicinity of the soil-pile interface influences the response of the soil-pile systems.
Validation of hydrogen gas stratification and mixing models
Wu, Hsingtzu; Zhao, Haihua
2015-05-26
Two validation benchmarks confirm that the BMIX++ code is capable of simulating unintended hydrogen release scenarios efficiently. The BMIX++ (UC Berkeley mechanistic MIXing code in C++) code has been developed to accurately and efficiently predict the fluid mixture distribution and heat transfer in large stratified enclosures for accident analyses and design optimizations. The BMIX++ code uses a scaling based one-dimensional method to achieve large reduction in computational effort compared to a 3-D computational fluid dynamics (CFD) simulation. Two BMIX++ benchmark models have been developed. One is for a single buoyant jet in an open space and another is for a large sealed enclosure with both a jet source and a vent near the floor. Both of them have been validated by comparisons with experimental data. Excellent agreements are observed. Entrainment coefficients of 0.09 and 0.08 are found to best fit the experimental data for hydrogen leaks with Froude numbers of 99 and 268, respectively. In addition, the BMIX++ simulation results of the average helium concentration for an enclosure with a vent and a single jet agree with the experimental data within a margin of about 10% for jet flow rates ranging from 1.21 × 10⁻⁴ to 3.29 × 10⁻⁴ m³/s. In conclusion, the computing time for each BMIX++ model with a normal desktop computer is less than 5 min.
Physical validation of a patient-specific contact finite element model of the ankle.
Anderson, Donald D; Goldsworthy, Jane K; Li, Wendy; James Rudert, M; Tochigi, Yuki; Brown, Thomas D
2007-01-01
A validation study was conducted to determine the extent to which computational ankle contact finite element (FE) results agreed with experimentally measured tibio-talar contact stress. Two cadaver ankles were loaded in separate test sessions, during which ankle contact stresses were measured with a high-resolution (Tekscan) pressure sensor. Corresponding contact FE analyses were subsequently performed for comparison. The agreement was good between FE-computed and experimentally measured mean (3.2% discrepancy for one ankle, 19.3% for the other) and maximum (1.5% and 6.2%) contact stress, as well as for contact area (1.7% and 14.9%). There was also excellent agreement between histograms of fractional areas of cartilage experiencing specific ranges of contact stress. Finally, point-by-point comparisons between the computed and measured contact stress distributions over the articular surface showed substantial agreement, with correlation coefficients of 90% for one ankle and 86% for the other. In the past, general qualitative, but little direct quantitative agreement has been demonstrated with articular joint contact FE models. The methods used for this validation enable formal comparison of computational and experimental results, and open the way for objective statistical measures of regional correlation between FE-computed contact stress distributions from comparison articular joint surfaces (e.g., those from an intact versus those with residual intra-articular fracture incongruity).
Validation of Cross Sections for Monte Carlo Simulation of the Photoelectric Effect
NASA Astrophysics Data System (ADS)
Han, Min Cheol; Kim, Han Sung; Pia, Maria Grazia; Basaglia, Tullio; Batič, Matej; Hoff, Gabriela; Kim, Chan Hyeong; Saracco, Paolo
2016-04-01
Several total and partial photoionization cross section calculations, based on both theoretical and empirical approaches, are quantitatively evaluated with statistical analyses using a large collection of experimental data retrieved from the literature to identify the state of the art for modeling the photoelectric effect in Monte Carlo particle transport. Some of the examined cross section models are available in general purpose Monte Carlo systems, while others have been implemented and subjected to validation tests for the first time to estimate whether they could improve the accuracy of particle transport codes. The validation process identifies Scofield's 1973 non-relativistic calculations, tabulated in the Evaluated Photon Data Library (EPDL), as the one best reproducing experimental measurements of total cross sections. Specialized total cross section models, some of which derive from more recent calculations, do not provide significant improvements. Scofield's non-relativistic calculations are not surpassed regarding the compatibility with experiment of K and L shell photoionization cross sections either, although in a few test cases Ebel's parameterization produces more accurate results close to absorption edges. Modifications to Biggs and Lighthill's parameterization implemented in Geant4 significantly reduce the accuracy of total cross sections at low energies with respect to its original formulation. The scarcity of suitable experimental data hinders a similar extensive analysis for the simulation of the photoelectron angular distribution, which is limited to a qualitative appraisal.
Optimization of Nd: YAG Laser Marking of Alumina Ceramic Using RSM And ANN
NASA Astrophysics Data System (ADS)
Peter, Josephine; Doloi, B.; Bhattacharyya, B.
2011-01-01
The present research paper deals with artificial neural network (ANN)- and response surface methodology (RSM)-based mathematical modeling, as well as an optimization analysis of marking characteristics on alumina ceramic. The experiments were planned and carried out based on design of experiments (DOE). The paper also analyses the influence of the major laser marking process parameters, and the optimal combination of laser marking process parameter settings has been obtained. The output of the RSM-based optimization is validated through experimentation and the ANN predictive model. Good agreement is observed between the results based on the ANN predictive model and the actual experimental observations.
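The RSM half of such a study boils down to fitting a second-order polynomial response surface to designed-experiment data and then optimizing over it. The sketch below fits that generic quadratic model with scikit-learn; the factors, runs, and responses are synthetic placeholders, not the paper's laser-marking measurements.

```python
# Generic second-order response-surface fit (the regression core of RSM).
# Factor settings and responses are synthetic, generated from an assumed
# underlying quadratic with noise.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import PolynomialFeatures

rng = np.random.default_rng(2)
X = rng.uniform(-1, 1, size=(20, 3))   # coded factors, e.g. current, frequency, speed
y = (14 + 2.0 * X[:, 0] + 1.5 * X[:, 1] - 1.0 * X[:, 2]
     + 1.2 * X[:, 0] * X[:, 1] - 0.8 * X[:, 0] ** 2
     + rng.normal(scale=0.2, size=len(X)))

quad = PolynomialFeatures(degree=2, include_bias=True)
model = LinearRegression(fit_intercept=False).fit(quad.fit_transform(X), y)

terms = quad.get_feature_names_out(["I", "f", "v"])
print({t: round(c, 2) for t, c in zip(terms, model.coef_)})
```

In a real RSM workflow the fitted surface would then be searched for the optimal parameter setting, the step the abstract validates experimentally and against the ANN model.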
Analysis of Tile-Reinforced Composite Armor. Part 1; Advanced Modeling and Strength Analyses
NASA Technical Reports Server (NTRS)
Davila, C. G.; Chen, Tzi-Kang; Baker, D. J.
1998-01-01
The results of an analytical and experimental study of the structural response and strength of tile-reinforced components of the Composite Armored Vehicle are presented. The analyses are based on specialized finite element techniques that properly account for the effects of the interaction between the armor tiles, the surrounding elastomers, and the glass-epoxy sublaminates. To validate the analytical predictions, tests were conducted with panels subjected to three-point bending loads. The sequence of progressive failure events for the laminates is described. This paper describes the results of Part 1 of a study of the response and strength of tile-reinforced composite armor.
Computer aided manual validation of mass spectrometry-based proteomic data.
Curran, Timothy G; Bryson, Bryan D; Reigelhaupt, Michael; Johnson, Hannah; White, Forest M
2013-06-15
Advances in mass spectrometry-based proteomic technologies have increased the speed of analysis and the depth provided by a single analysis. Computational tools to evaluate the accuracy of peptide identifications from these high-throughput analyses have not kept pace with technological advances; currently the most common quality evaluation methods are based on statistical analysis of the likelihood of false positive identifications in large-scale data sets. While helpful, these calculations do not consider the accuracy of each identification, thus creating a precarious situation for biologists relying on the data to inform experimental design. Manual validation is the gold standard approach to confirm accuracy of database identifications, but is extremely time-intensive. To palliate the increasing time required to manually validate large proteomic datasets, we provide computer aided manual validation software (CAMV) to expedite the process. Relevant spectra are collected, catalogued, and pre-labeled, allowing users to efficiently judge the quality of each identification and summarize applicable quantitative information. CAMV significantly reduces the burden associated with manual validation and will hopefully encourage broader adoption of manual validation in mass spectrometry-based proteomics. Copyright © 2013 Elsevier Inc. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Marshall, William BJ J; Rearden, Bradley T
The validation of neutron transport methods used in nuclear criticality safety analyses is required by consensus American National Standards Institute/American Nuclear Society (ANSI/ANS) standards. In the last decade, there has been an increased interest in correlations among critical experiments used in validation that have shared physical attributes and which impact the independence of each measurement. The statistical methods included in many of the frequently cited guidance documents on performing validation calculations incorporate the assumption that all individual measurements are independent, so little guidance is available to practitioners on the topic. Typical guidance includes recommendations to select experiments from multiple facilities and experiment series in an attempt to minimize the impact of correlations or common-cause errors in experiments. Recent efforts have been made both to determine the magnitude of such correlations between experiments and to develop and apply methods for adjusting the bias and bias uncertainty to account for the correlations. This paper describes recent work performed at Oak Ridge National Laboratory using the Sampler sequence from the SCALE code system to develop experimental correlations using a Monte Carlo sampling technique. Sampler will be available for the first time with the release of SCALE 6.2, and a brief introduction to the methods used to calculate experiment correlations within this new sequence is presented in this paper. Techniques to utilize these correlations in the establishment of upper subcritical limits are the subject of a companion paper and will not be discussed here. Example experimental uncertainties and correlation coefficients are presented for a variety of low-enriched uranium water-moderated lattice experiments selected for use in a benchmark exercise by the Working Party on Nuclear Criticality Safety Subgroup on Uncertainty Analysis in Criticality Safety Analyses. The results include studies on the effect of fuel rod pitch on the correlations, and some observations are also made regarding difficulties in determining experimental correlations using the Monte Carlo sampling technique.
Kong, Qiusheng; Yuan, Jingxian; Gao, Lingyun; Zhao, Shuang; Jiang, Wei; Huang, Yuan; Bie, Zhilong
2014-01-01
Watermelon is one of the major Cucurbitaceae crops, and the recent availability of its genome sequence greatly facilitates fundamental research on it. Quantitative real-time reverse transcriptase PCR (qRT-PCR) is the preferred method for gene expression analyses, and using validated reference genes for normalization is crucial to ensure the accuracy of this method. However, a systematic validation of reference genes has not been conducted on watermelon. In this study, transcripts of 15 candidate reference genes were quantified in watermelon using qRT-PCR, and the stability of these genes was compared using geNorm and NormFinder. geNorm identified ClTUA and ClACT, ClEF1α and ClACT, and ClCAC and ClTUA as the best pairs of reference genes in watermelon organs and tissues under normal growth conditions, abiotic stress, and biotic stress, respectively. NormFinder identified ClYLS8, ClUBCP, and ClCAC as the best single reference genes under the above experimental conditions, respectively. ClYLS8 and ClPP2A were identified as the best reference genes across all samples. Two to nine reference genes were required for more reliable normalization depending on the experimental conditions. The widely used watermelon reference gene 18SrRNA was less stable than the other reference genes under the experimental conditions. Catalase family genes were identified in the watermelon genome and used to validate the reliability of the identified reference genes. ClCAT1 and ClCAT2 were induced and upregulated in the first 24 h, whereas ClCAT3 was downregulated in the leaves under low temperature stress. However, the expression levels of these genes were significantly overestimated and misinterpreted when 18SrRNA was used as a reference gene. These results provide a good starting point for reference gene selection in qRT-PCR analyses involving watermelon.
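The geNorm stability measure used in this abstract has a compact definition (Vandesompele et al. 2002): for each candidate gene, compute the standard deviation of log2 expression ratios against every other candidate across all samples, then average them; the gene with the lowest mean value M is the most stable. The sketch below re-implements that measure on simulated expression values, purely to make the computation concrete.

```python
# Compact geNorm-style stability measure M on simulated expression data.
# Lower M = more stable candidate reference gene.
import numpy as np

def genorm_m(expr):
    """expr: (n_samples, n_genes) array of relative expression values."""
    log_expr = np.log2(expr)
    n_genes = expr.shape[1]
    M = np.empty(n_genes)
    for j in range(n_genes):
        ratios = log_expr[:, [j]] - log_expr   # log2(gene_j / gene_k)
        sds = ratios.std(axis=0, ddof=1)       # pairwise variation V_jk
        M[j] = np.delete(sds, j).mean()        # average over k != j
    return M

rng = np.random.default_rng(3)
expr = rng.lognormal(mean=5.0, sigma=0.3, size=(25, 6))
expr[:, 0] *= rng.lognormal(0.0, 0.8, size=25)  # make gene 0 unstable
print(np.round(genorm_m(expr), 3))              # gene 0 gets the largest M
```

geNorm's stepwise exclusion of the highest-M gene, plus its pairwise variation criterion, is what yields the "two to nine reference genes" recommendation reported above.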
NASA Astrophysics Data System (ADS)
Shan, Hangying; Xiao, Jun; Chu, Qiyi
2018-05-01
The Z-Pin interfacial bond properties play an important role in the structural performance of X-Cor® sandwich structures. This paper presents an experimental investigation of the bond-slip behavior of Z-Pin interfaces using Z-Pin pull-out tests. Based on the experimental data, the whole Z-Pin pull-out process consists of three stages: initial bonding, debonding and frictional sliding. A comparative experimental study on the influence of design parameters on the bond-slip behavior of Z-Pin interfaces has also been performed. Numerical analyses were conducted with the ABAQUS finite element (FE) program to simulate the Z-Pin bond-slip response in the pull-out test. The Z-Pin interfacial bond-slip behavior was implemented using nonlinear spring elements characterized with the constitutive relation from experimental results. Numerical results were validated by comparison with experimental data, and reasonably good agreement was achieved between experimental and analytical pull-out force-slip curves.
Xu, H; Li, C; Zeng, Q; Agrawal, I; Zhu, X; Gong, Z
2016-06-01
In this study, to systematically identify the most stably expressed genes for internal reference in zebrafish Danio rerio investigations, 37 D. rerio transcriptomic datasets (both RNA sequencing and microarray data) were collected from gene expression omnibus (GEO) database and unpublished data, and gene expression variations were analysed under three experimental conditions: tissue types, developmental stages and chemical treatments. Forty-four putative candidate genes were identified with the c.v. <0·2 from all datasets. Following clustering into different functional groups, 21 genes, in addition to four conventional housekeeping genes (eef1a1l1, b2m, hrpt1l and actb1), were selected from different functional groups for further quantitative real-time (qrt-)PCR validation using 25 RNA samples from different adult tissues, developmental stages and chemical treatments. The qrt-PCR data were then analysed using the statistical algorithm refFinder for gene expression stability. Several new candidate genes showed better expression stability than the conventional housekeeping genes in all three categories. It was found that sep15 and metap1 were the top two stable genes for tissue types, ube2a and tmem50a the top two for different developmental stages, and rpl13a and rp1p0 the top two for chemical treatments. Thus, based on the extensive transcriptomic analyses and qrt-PCR validation, these new reference genes are recommended for normalization of D. rerio qrt-PCR data respectively for the three different experimental conditions. © 2016 The Fisheries Society of the British Isles.
NASA Astrophysics Data System (ADS)
Aziz, A. M. Y.; Harun, M. N.; Syahrom, Ardiyansyah; Omar, A. H.
2017-04-01
This paper presents a study of the hydrodynamics of several rowing blade designs. The study was done using computational fluid dynamics (CFD), which enabled the investigation to be done similarly to the experimental study, but with additional hydrodynamic visualization for further analysis and understanding. The CFD method was validated using quasi-static experimental data from Caplan (2007). The proposed CFD analyses also improved on previous CFD results, with errors of 6.58 percent in lift and 0.69 percent in drag force, compared with 33.65 and 18.75 percent obtained by Coppel (2010). Following the successful validation, the study proceeded with the real sizes of the Macon, Big Blade and Fat blade. It was found that the hydrodynamic performance of the Fat blade was the highest due to the area, aspect ratio and shape of the blade. Besides that, the pressure distributions for all models were also investigated, which deepened the understanding of the fluid mechanics of rowing blades.
Recent developments in deployment analysis simulation using a multi-body computer code
NASA Technical Reports Server (NTRS)
Housner, Jerrold M.
1989-01-01
Deployment is a candidate mode for construction of structural space system components. By its very nature, deployment is a dynamic event, often involving large-angle unfolding of flexible beam members. Validation of proposed designs and conceptual deployment mechanisms is enhanced through analysis. Analysis may be used to determine member loads, thus helping to establish deployment rates and deployment control requirements for a given concept. Furthermore, member flexibility, joint free-play, manufacturing tolerances, and imperfections can affect the reliability of deployment. Analyses which include these effects can aid in reducing risks associated with a particular concept. Ground tests, which can play a role similar to that of analyses, are difficult and expensive to perform. Suspension systems just for vibration ground tests of large space structures in a 1 g environment present many challenges. Suspension of a structure which spatially expands is even more challenging. Analysis validation through experimental confirmation on relatively small, simple models would permit analytical extrapolation to larger, more complex space structures.
Rocket-Based Combined Cycle Engine Technology Development: Inlet CFD Validation and Application
NASA Technical Reports Server (NTRS)
DeBonis, J. R.; Yungster, S.
1996-01-01
A CFD methodology has been developed for inlet analyses of Rocket-Based Combined Cycle (RBCC) Engines. A full Navier-Stokes analysis code, NPARC, was used in conjunction with pre- and post-processing tools to obtain a complete description of the flow field and integrated inlet performance. This methodology was developed and validated using results from a subscale test of the inlet to a RBCC 'Strut-Jet' engine performed in the NASA Lewis 1 x 1 ft. supersonic wind tunnel. Results obtained from this study include analyses at flight Mach numbers of 5 and 6 for super-critical operating conditions. These results showed excellent agreement with experimental data. The analysis tools were also used to obtain pre-test performance and operability predictions for the RBCC demonstrator engine planned for testing in the NASA Lewis Hypersonic Test Facility. This analysis calculated the baseline fuel-off internal force of the engine which is needed to determine the net thrust with fuel on.
BATMAN-TCM: a Bioinformatics Analysis Tool for Molecular mechANism of Traditional Chinese Medicine
NASA Astrophysics Data System (ADS)
Liu, Zhongyang; Guo, Feifei; Wang, Yong; Li, Chun; Zhang, Xinlei; Li, Honglei; Diao, Lihong; Gu, Jiangyong; Wang, Wei; Li, Dong; He, Fuchu
2016-02-01
Traditional Chinese Medicine (TCM), with a history of thousands of years of clinical practice, is gaining increasing attention and application worldwide, and TCM-based new drug development, especially for the treatment of complex diseases, is promising. However, owing to TCM's diverse ingredients and their complex interactions with the human body, it is still quite difficult to uncover its molecular mechanism, which greatly hinders TCM modernization and internationalization. Here we developed the first online Bioinformatics Analysis Tool for Molecular mechANism of TCM (BATMAN-TCM). Its main functions include 1) TCM ingredients' target prediction; 2) functional analyses of targets including biological pathway, Gene Ontology functional term and disease enrichment analyses; 3) the visualization of ingredient-target-pathway/disease association networks and KEGG biological pathways with highlighted targets; 4) comparison analysis of multiple TCMs. Finally, we applied BATMAN-TCM to Qishen Yiqi dripping Pill (QSYQ) and, combined with subsequent experimental validation, revealed the functions of the renin-angiotensin system responsible for QSYQ's cardioprotective effects for the first time. BATMAN-TCM will contribute to the understanding of the "multi-component, multi-target and multi-pathway" combinational therapeutic mechanism of TCM, and provide valuable clues for subsequent experimental validation, accelerating the elucidation of TCM's molecular mechanism. BATMAN-TCM is available at http://bionet.ncpsb.org/batman-tcm.
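The pathway and disease enrichment analyses listed above typically reduce to an over-representation test. Below is a minimal sketch of such a test using the hypergeometric distribution; all gene counts are hypothetical, and the actual statistics used by BATMAN-TCM may differ.

```python
# Over-representation test: are predicted TCM targets enriched in a pathway?
# All counts below are hypothetical placeholders.
from scipy.stats import hypergeom

background = 20000   # genes in the background genome (assumed)
in_pathway = 150     # genes annotated to the pathway (assumed)
targets = 80         # predicted targets of the TCM ingredients (assumed)
overlap = 12         # predicted targets that fall in the pathway (assumed)

# P(X >= overlap): chance of seeing at least this much overlap at random
p_value = hypergeom.sf(overlap - 1, background, in_pathway, targets)
print(f"enrichment p-value: {p_value:.3g}")
```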
Rennenberg, Heinz; Herschbach, Cornelia
2014-11-01
Understanding the dynamics of physiological processes in the systems biology era requires approaches at the genome, transcriptome, proteome, and metabolome levels. In this context, metabolite flux experiments have been used in mapping metabolite pathways and analysing metabolic control. In the present review, sulphur metabolism was taken to illustrate current challenges of metabolic flux analyses. At the cellular level, restrictions in metabolite flux analyses originate from incomplete knowledge of the compartmentation network of metabolic pathways. Transport of metabolites through membranes is usually not considered in flux experiments but may be involved in controlling the whole pathway. Hence, steady-state and snapshot readings need to be expanded to time-course studies in combination with compartment-specific metabolite analyses. Because of species-specific differences, differences between tissues, and stress-related responses, the quantitative significance of different sulphur sinks has to be elucidated; this requires the development of methods for whole-sulphur metabolome approaches. Different cell types can contribute to metabolite fluxes to different extents at the tissue and organ level. Cell type-specific analyses are needed to characterize these contributions. Based on such approaches, metabolite flux analyses can be expanded to the whole-plant level by considering long-distance transport and, thus, the interaction of roots and the shoot in metabolite fluxes. However, whole-plant studies need detailed empirical and mathematical modelling that has to be validated by experimental analyses.
Optimization of Nd:YAG Laser Marking of Alumina Ceramic Using RSM and ANN
DOE Office of Scientific and Technical Information (OSTI.GOV)
Peter, Josephine; Doloi, B.; Bhattacharyya, B.
The present research paper deals with artificial neural network (ANN) and response surface methodology (RSM) based mathematical modeling, and also an optimization analysis of marking characteristics on alumina ceramic. The experiments have been planned and carried out based on Design of Experiments (DOE). The paper also analyses the influence of the major laser marking process parameters, and the optimal combination of laser marking process parameter settings has been obtained. The RSM optimal output is validated through experimentation and the ANN predictive model. A good agreement is observed between the results based on the ANN predictive model and actual experimental observations.
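As an illustration of the RSM half of such a study, a second-order response surface can be fitted to designed-experiment data as below; the coded factors and response are synthetic stand-ins, not the authors' laser-marking measurements.

```python
# Fit a quadratic response surface to designed-experiment data (synthetic).
import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(27, 3))   # coded process parameters (hypothetical)
y = 1.0 + X[:, 0] - 0.5 * X[:, 1] ** 2 + rng.normal(scale=0.05, size=27)

quad = PolynomialFeatures(degree=2, include_bias=False)  # linear, interaction, square terms
model = LinearRegression().fit(quad.fit_transform(X), y)
print(f"R^2 of the fitted surface: {model.score(quad.transform(X), y):.3f}")
```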
Glenn, Beth A.; Bastani, Roshan; Maxwell, Annette E.
2013-01-01
Objective Threats to external validity including pretest sensitization and the interaction of selection and an intervention are frequently overlooked by researchers despite their potential to significantly influence study outcomes. The purpose of this investigation was to conduct secondary data analyses to assess the presence of external validity threats in the setting of a randomized trial designed to promote mammography use in a high risk sample of women. Design During the trial, recruitment and intervention implementation took place in three cohorts (with different ethnic composition), utilizing two different designs (pretest-posttest control group design; posttest only control group design). Results Results reveal that the intervention produced different outcomes across cohorts, dependent upon the research design used and the characteristics of the sample. Conclusion These results illustrate the importance of weighing the pros and cons of potential research designs before making a selection and attending more closely to issues of external validity. PMID:23289517
Numerical modeling and preliminary validation of drag-based vertical axis wind turbine
NASA Astrophysics Data System (ADS)
Krysiński, Tomasz; Buliński, Zbigniew; Nowak, Andrzej J.
2015-03-01
The main purpose of this article is to verify and validate the mathematical description of the airflow around a wind turbine with a vertical axis of rotation, which could be considered representative of this type of device. Mathematical modeling of the airflow around wind turbines, in particular those with a vertical axis, is a problematic matter due to the complex nature of this highly swirled flow. Moreover, the flow is turbulent and is accompanied by rotor rotation and dynamic boundary-layer separation. In such conditions, the key aspects of the mathematical model are an accurate turbulence description, the definition of circular motion as well as accompanying effects like the centrifugal force or the Coriolis force, and the parameters of spatial and temporal discretization. The paper presents the impact of the different simulation parameters on the obtained results of the wind turbine simulation. The analysed models have been validated against experimental data published in the literature.
Experimental validation of docking and capture using space robotics testbeds
NASA Technical Reports Server (NTRS)
Spofford, John; Schmitz, Eric; Hoff, William
1991-01-01
This presentation describes the application of robotic and computer vision systems to validate docking and capture operations for space cargo transfer vehicles. Three applications are discussed: (1) air bearing systems in two dimensions that yield high quality free-flying, flexible, and contact dynamics; (2) validation of docking mechanisms with misalignment and target dynamics; and (3) computer vision technology for target location and real-time tracking. All the testbeds are supported by a network of engineering workstations for dynamic and controls analyses. Dynamic simulation of multibody rigid and elastic systems are performed with the TREETOPS code. MATRIXx/System-Build and PRO-MATLAB/Simulab are the tools for control design and analysis using classical and modern techniques such as H-infinity and LQG/LTR. SANDY is a general design tool to optimize numerically a multivariable robust compensator with a user-defined structure. Mathematica and Macsyma are used to derive symbolically dynamic and kinematic equations.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bucknor, Matthew; Hu, Rui; Lisowski, Darius
2016-04-17
The Reactor Cavity Cooling System (RCCS) is an important passive safety system being incorporated into the overall safety strategy for high temperature advanced reactor concepts such as the High Temperature Gas-Cooled Reactors (HTGR). The Natural Convection Shutdown Heat Removal Test Facility (NSTF) at Argonne National Laboratory (Argonne) reflects a 1/2-scale model of the primary features of one conceptual air-cooled RCCS design. The project conducts ex-vessel, passive heat removal experiments in support of the Department of Energy Office of Nuclear Energy's Advanced Reactor Technology (ART) program, while also generating data for code validation purposes. While experiments are being conducted at the NSTF to evaluate the feasibility of the passive RCCS, parallel modeling and simulation efforts are ongoing to support the design, fabrication, and operation of these natural convection systems. Both system-level and high-fidelity computational fluid dynamics (CFD) analyses were performed to gain a complete understanding of the complex flow and heat transfer phenomena in natural convection systems. This paper provides a summary of the RELAP5-3D NSTF model development efforts and provides comparisons between simulation results and experimental data from the NSTF. Overall, the simulation results compared favorably to the experimental data; however, further analyses need to be conducted to investigate any identified differences.
Code of Federal Regulations, 2014 CFR
2014-07-01
... the validation study and subsequent use during decontamination. 761.386 Section 761.386 Protection of... experimental conditions for the validation study and subsequent use during decontamination. The following experimental conditions apply for any solvent: (a) Temperature and pressure. Conduct the validation study and...
Code of Federal Regulations, 2012 CFR
2012-07-01
... the validation study and subsequent use during decontamination. 761.386 Section 761.386 Protection of... experimental conditions for the validation study and subsequent use during decontamination. The following experimental conditions apply for any solvent: (a) Temperature and pressure. Conduct the validation study and...
Code of Federal Regulations, 2013 CFR
2013-07-01
... the validation study and subsequent use during decontamination. 761.386 Section 761.386 Protection of... experimental conditions for the validation study and subsequent use during decontamination. The following experimental conditions apply for any solvent: (a) Temperature and pressure. Conduct the validation study and...
Evaluation of MARC for the analysis of rotating composite blades
NASA Technical Reports Server (NTRS)
Bartos, Karen F.; Ernst, Michael A.
1993-01-01
The suitability of the MARC code for the analysis of rotating composite blades was evaluated using a four-task process. A nonlinear displacement analysis and subsequent eigenvalue analysis were performed on a rotating spring mass system to ensure that displacement-dependent centrifugal forces were accounted for in the eigenvalue analysis. Normal modes analyses were conducted on isotropic plates with various degrees of twist to evaluate MARC's ability to handle blade twist. Normal modes analyses were conducted on flat composite plates to validate the newly developed coupled COBSTRAN-MARC methodology. Finally, normal modes analyses were conducted on four composite propfan blades that were designed, analyzed, and fabricated at NASA Lewis Research Center. Results were compared with experimental data. The research documented herein presents MARC as a viable tool for the analysis of rotating composite blades.
NASA Technical Reports Server (NTRS)
Allgood, Daniel C.; Graham, Jason S.; McVay, Greg P.; Langford, Lester L.
2008-01-01
A unique assessment of acoustic similarity scaling laws and acoustic analogy methodologies in predicting the far-field acoustic signature from a sub-scale altitude rocket test facility at the NASA Stennis Space Center was performed. A directional, point-source similarity analysis was implemented for predicting the acoustic far-field. In this approach, experimental acoustic data obtained from "similar" rocket engine tests were appropriately scaled using key geometric and dynamic parameters. The accuracy of this engineering-level method is discussed by comparing the predictions with acoustic far-field measurements obtained. In addition, a CFD solver was coupled with a Lilley's acoustic analogy formulation to determine the improvement of using a physics-based methodology over an experimental correlation approach. In the current work, steady-state Reynolds-averaged Navier-Stokes calculations were used to model the internal flow of the rocket engine and altitude diffuser. These internal flow simulations provided the necessary realistic input conditions for external plume simulations. The CFD plume simulations were then used to provide the spatial turbulent noise source distributions in the acoustic analogy calculations. Preliminary findings of these studies will be discussed.
Gadkar, Vijay J; Filion, Martin
2013-06-01
In various experimental systems, limiting available amounts of RNA may prevent a researcher from performing large-scale analyses of gene transcripts. One way to circumvent this is to 'pre-amplify' the starting RNA/cDNA, so that sufficient amounts are available for any downstream analysis. In the present study, we report the development of a novel protocol for constructing amplified cDNA libraries using the Phi29 DNA polymerase based multiple displacement amplification (MDA) system. Using as little as 200 ng of total RNA, we developed a linear concatenation strategy to make the single-stranded cDNA template amenable for MDA. The concatenation, made possible by the template switching property of the reverse transcriptase enzyme, resulted in the amplified cDNA library with intact 5' ends. MDA generated micrograms of template, allowing large-scale polymerase chain reaction analyses or other large-scale downstream applications. As the amplified cDNA library contains intact 5' ends, it is also compatible with 5' RACE analyses of specific gene transcripts. Empirical validation of this protocol is demonstrated on a highly characterized (tomato) and an uncharacterized (corn gromwell) experimental system.
Validation of endogenous internal real-time PCR controls in renal tissues.
Cui, Xiangqin; Zhou, Juling; Qiu, Jing; Johnson, Martin R; Mrug, Michal
2009-01-01
Endogenous internal controls ('reference' or 'housekeeping' genes) are widely used in real-time PCR (RT-PCR) analyses. Their use relies on the premise of consistently stable expression across studied experimental conditions. Unfortunately, none of these controls fulfills this premise across a wide range of experimental conditions; consequently, none of them can be recommended for universal use. To determine which endogenous RT-PCR controls are suitable for analyses of renal tissues altered by kidney disease, we studied the expression of 16 commonly used 'reference genes' in 7 mildly and 7 severely affected whole kidney tissues from a well-characterized cystic kidney disease model. Expression levels of these 16 genes, determined by TaqMan RT-PCR analyses and Affymetrix GeneChip arrays, were normalized and tested for overall variance and equivalence of the means. Both statistical approaches and both TaqMan- and GeneChip-based methods converged on 3 out of the 4 top-ranked genes (Ppia, Gapdh and Pgk1) that had the most constant expression levels across the studied phenotypes. A combination of the top-ranked genes will provide a suitable endogenous internal control for similar studies of kidney tissues across a wide range of disease severity.
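The stability screening described above amounts to ranking candidate genes by their expression variability across samples. A minimal sketch on synthetic data follows; the expression values are hypothetical, not the study's measurements.

```python
# Rank candidate reference genes by coefficient of variation across samples.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
# rows: 16 candidate genes, columns: 14 kidney samples (synthetic data)
expr = pd.DataFrame(rng.lognormal(mean=5.0, sigma=0.2, size=(16, 14)),
                    index=[f"gene_{i}" for i in range(16)])

cv = expr.std(axis=1) / expr.mean(axis=1)   # coefficient of variation per gene
print(cv.sort_values().head(4))             # most stable candidates first
```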
Design data needs modular high-temperature gas-cooled reactor. Revision 2
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1987-03-01
The Design Data Needs (DDNs) provide summary statements, for program management, of the designer's need for experimental data to confirm or validate assumptions made in the design. These assumptions were developed using the Integrated Approach and are tabulated in the Functional Analysis Report. These assumptions were also necessary in the analyses or trade studies (A/TS) used to develop selections of hardware design or design requirements. Each DDN includes statements providing traceability to the function and the associated assumption that requires the need.
Implementation of a Blowing Boundary Condition in the LAURA Code
NASA Technical Reports Server (NTRS)
Thompson, Richard A.; Gnoffo, Peter A.
2008-01-01
Preliminary steps toward modeling a coupled ablation problem using a finite-volume Navier-Stokes code (LAURA) are presented in this paper. Implementation of a surface boundary condition with mass transfer (blowing) is described followed by verification and validation through comparisons with analytic results and experimental data. Application of the code to a carbon-nosetip ablation problem is demonstrated and the results are compared with previously published data. It is concluded that the code and coupled procedure are suitable to support further ablation analyses and studies.
NASA Technical Reports Server (NTRS)
Stauter, R. C.; Fleeter, S.
1982-01-01
Three-dimensional aerodynamic data, required to validate and/or indicate necessary refinements to inviscid and viscous analyses of the flow through turbomachine blade rows, are discussed. Instrumentation and capabilities for pressure measurement, probe insertion and traversing, and flow visualization are reviewed. Advanced measurement techniques, including Laser Doppler Anemometers, are considered. Data processing is reviewed. Predictions were correlated with the experimental data. A flow visualization technique using helium-filled soap bubbles was demonstrated.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Richard R. Schultz; Paul D. Bayless; Richard W. Johnson
2010-09-01
The Oregon State University (OSU) High Temperature Test Facility (HTTF) is an integral experimental facility that will be constructed on the OSU campus in Corvallis, Oregon. The HTTF project was initiated by the U.S. Nuclear Regulatory Commission (NRC) on September 5, 2008, as Task 4 of the 5-year High Temperature Gas Reactor Cooperative Agreement via NRC Contract 04-08-138. Until August 2010, when a DOE contract was initiated to fund additional capabilities for the HTTF project, all of the funding support for the HTTF was provided by the NRC via their cooperative agreement. The U.S. Department of Energy (DOE) began their involvement with the HTTF project in late 2009 via the Next Generation Nuclear Plant (NGNP) project. Because the NRC interests in HTTF experiments were only centered on the depressurized conduction cooldown (DCC) scenario, NGNP involvement focused on expanding the experimental envelope of the HTTF to include steady-state operations and also the pressurized conduction cooldown (PCC). Since DOE has incorporated the HTTF as an ingredient in the NGNP thermal-fluids validation program, several important outcomes should be noted: 1. The reference prismatic reactor design, which serves as the basis for scaling the HTTF, became the modular high temperature gas-cooled reactor (MHTGR). The MHTGR has also been chosen as the reference design for all of the other NGNP thermal-fluid experiments. 2. The NGNP validation matrix is being planned using the same scaling strategy that has been implemented to design the HTTF, i.e., the hierarchical two-tiered scaling methodology developed by Zuber in 1991. Using this approach, a preliminary validation matrix has been designed that integrates the HTTF experiments with the other experiments planned for the NGNP thermal-fluids verification and validation project. 3. Initial analyses showed that the inherent power capability of the OSU infrastructure, which only allowed a total operational facility power capability of 0.6 MW, is inadequate to permit steady-state operation at reasonable conditions. 4. To enable the HTTF to operate at more representative steady-state conditions, DOE recently allocated funding via a DOE subcontract to HTTF to permit an OSU infrastructure upgrade such that 2.2 MW will become available for HTTF experiments. 5. Analyses have been performed to study the relationship between the HTTF and the MHTGR via the hierarchical two-tiered scaling methodology, which has been used successfully in the past, e.g., APEX facility scaling to the Westinghouse AP600 plant. These analyses have focused on the relationship between key variables that will be measured in the HTTF and the counterpart variables in the MHTGR, with a focus on natural circulation, using nitrogen as a working fluid, and core heat transfer. 6. Both RELAP5-3D and computational fluid dynamics (CD-Adapco's STAR-CCM+) numerical models of the MHTGR and the HTTF have been constructed, and analyses are underway to study the relationship between the reference reactor and the HTTF. The HTTF is presently being designed. It has a ¼-scale relationship to the MHTGR in both height and diameter. Decisions have been made to design the reactor cavity cooling system (RCCS) simulation as a boundary condition for the HTTF to ensure that (a) the boundary condition is well defined and (b) the boundary condition can be modified easily to achieve the desired heat transfer sink for HTTF experimental operations.
Development and Initial Validation of the Pain Resilience Scale.
Slepian, P Maxwell; Ankawi, Brett; Himawan, Lina K; France, Christopher R
2016-04-01
Over the past decade, the role of positive psychology in pain experience has gained increasing attention. One such positive factor, identified as resilience, has been defined as the ability to maintain positive emotional and physical functioning despite physical or psychological adversity. Although cross-situational measures of resilience have been shown to be related to pain, it was hypothesized that a pain-specific resilience measure would serve as a stronger predictor of acute pain experience. To test this hypothesis, we conducted a series of studies to develop and validate the Pain Resilience Scale. Study 1 described exploratory and confirmatory factor analyses that support a scale with 2 distinct factors, Cognitive/Affective Positivity and Behavioral Perseverance. Study 2 showed test-retest reliability and construct validity of this new scale, including moderate positive relationships with measures of positive psychological functioning and small to moderate negative relationships with vulnerability measures such as pain catastrophizing. Finally, consistent with our initial hypothesis, Study 3 showed that the Pain Resilience Scale is more strongly related to ischemic pain responses than existing measures of general resilience. Together, these studies support the predictive utility of this new pain-specific measure of resilience in the context of acute experimental pain. The Pain Resilience Scale represents a novel measure of Cognitive/Affective Positivity and Behavioral Perseverance during exposure to noxious stimuli. Construct validity is supported by expected relationships with existing pain-coping measures, and predictive validity is shown by individual differences in response to acute experimental pain.
Towards interoperable and reproducible QSAR analyses: Exchange of datasets.
Spjuth, Ola; Willighagen, Egon L; Guha, Rajarshi; Eklund, Martin; Wikberg, Jarl Es
2010-06-30
QSAR is a widely used method to relate chemical structures to responses or properties based on experimental observations. Much effort has been made to evaluate and validate the statistical modeling in QSAR, but these analyses treat the dataset as fixed. An overlooked but highly important issue is the validation of the setup of the dataset, which comprises the addition of chemical structures as well as the selection of descriptors and software implementations prior to calculations. This process is hampered by the lack of standards and exchange formats in the field, making it virtually impossible to reproduce and validate analyses and drastically constraining collaborations and re-use of data. We present a step towards standardizing QSAR analyses by defining interoperable and reproducible QSAR datasets, consisting of an open XML format (QSAR-ML) which builds on an open and extensible descriptor ontology. The ontology provides an extensible way of uniquely defining descriptors for use in QSAR experiments, and the exchange format supports multiple versioned implementations of these descriptors. Hence, a dataset described by QSAR-ML makes its setup completely reproducible. We also provide a reference implementation as a set of plugins for Bioclipse which simplifies the setup of QSAR datasets, and allows for exporting in QSAR-ML as well as old-fashioned CSV formats. The implementation facilitates the addition of new descriptor implementations from locally installed software and remote Web services; the latter is demonstrated with REST and XMPP Web services. Standardized QSAR datasets open up new ways to store, query, and exchange data for subsequent analyses. QSAR-ML supports completely reproducible creation of datasets, solving the problems of defining which software components were used and their versions, and the descriptor ontology eliminates confusion regarding descriptors by defining them crisply. This makes it easy to join, extend, and combine datasets and hence work collectively, but also allows for analyzing the effect descriptors have on the statistical model's performance. The presented Bioclipse plugins equip scientists with graphical tools that make QSAR-ML easily accessible for the community.
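The core reproducibility idea is that descriptor values travel with the implementation and version that produced them. The snippet below is a simplified JSON illustration of that bookkeeping, not the actual QSAR-ML XML schema; all names and values are hypothetical.

```python
# Illustrative (hypothetical) dataset record pairing descriptor values with
# the software implementation and version that computed them.
import json

dataset = {
    "structures": [{"id": "mol1", "smiles": "c1ccccc1O"}],
    "descriptors": [{"name": "XLogP",
                     "implementation": "CDK",      # assumed provider
                     "version": "1.3.5"}],         # assumed version
    "values": {"mol1": {"XLogP": 1.39}},           # illustrative value
}
print(json.dumps(dataset, indent=2))
```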
Calder, Stefan; O'Grady, Greg; Cheng, Leo K; Du, Peng
2018-04-27
Electrogastrography (EGG) is a non-invasive method for measuring gastric electrical activity. Recent simulation studies have attempted to extend the current clinical utility of the EGG, in particular by providing a theoretical framework for distinguishing specific gastric slow wave dysrhythmias. In this paper we implement an experimental setup called a 'torso-tank' with the aim of expanding and experimentally validating these previous simulations. The torso-tank was developed using an adult male torso phantom with 190 electrodes embedded throughout the torso. The gastric slow waves were reproduced using an artificial current source capable of producing 3D electrical fields. Multiple gastric dysrhythmias were reproduced based on high-resolution mapping data from cases of human gastric dysfunction (gastric re-entry, conduction blocks and ectopic pacemakers) in addition to normal test data. Each case was recorded and compared to the previously-presented simulated results. Qualitative and quantitative analyses were performed to define the accuracy, showing a 1.8% difference, 0.99 correlation, and 0.04 normalised RMS error between experimental and simulated findings. These results reaffirm previous findings, and these methods in unison therefore present a promising morphological-based methodology for advancing the understanding and clinical applications of EGG.
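The agreement figures quoted above (correlation and normalised RMS error) can be reproduced on any pair of traces as in this sketch; the signals here are synthetic stand-ins for the simulated and measured EGG recordings.

```python
# Correlation and normalised RMS error between two traces (synthetic signals).
import numpy as np

rng = np.random.default_rng(0)
simulated = np.sin(np.linspace(0.0, 10.0 * np.pi, 1000))    # stand-in slow wave
measured = simulated + rng.normal(scale=0.05, size=1000)    # add sensor noise

corr = np.corrcoef(simulated, measured)[0, 1]
nrmse = np.sqrt(np.mean((simulated - measured) ** 2)) / np.ptp(measured)
print(f"correlation: {corr:.3f}, normalised RMS error: {nrmse:.3f}")
```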
NASA Astrophysics Data System (ADS)
Toledo Fuentes, A.; Kipfmueller, M.; José Prieto, M. A.
2017-10-01
Mobile manipulators are becoming a key instrument for increasing flexibility in industrial processes. Their requirements include handling objects of different weights and sizes and transporting them quickly, without jeopardizing production workers and machines. Compensation of the forces affecting the system dynamics is therefore needed to avoid unwanted oscillations and tilting under sudden accelerations and decelerations. One general solution is the implementation of external positioning elements to actively stabilize the system. To develop such a stabilization mechanism, the dynamic behavior of a robotic arm and a mobile platform was investigated using multibody simulations. The methodology was divided into two phases for each subsystem: first, natural frequencies and mode shapes were obtained from experimental modal analyses; then, based on these experimental results, multibody simulation (MBS) models were set up and their dynamical parameters adjusted. The mode shapes, together with the obtained natural frequencies, allowed a quantitative and qualitative analysis. In summary, the MBS models were successfully validated against the real subsystems, with a maximum percentage error of 15%. These models will serve as the basis for future steps in the design of the external actuators and their control strategy using a co-simulation tool.
Supersonic Retro-Propulsion Experimental Design for Computational Fluid Dynamics Model Validation
NASA Technical Reports Server (NTRS)
Berry, Scott A.; Laws, Christopher T.; Kleb, W. L.; Rhode, Matthew N.; Spells, Courtney; McCrea, Andrew C.; Truble, Kerry A.; Schauerhamer, Daniel G.; Oberkampf, William L.
2011-01-01
The development of supersonic retro-propulsion, an enabling technology for heavy-payload exploration missions to Mars, is the primary focus of the present paper. A new experimental model, intended to provide computational fluid dynamics model validation data, was recently designed for the Langley Research Center Unitary Plan Wind Tunnel Test Section 2. Pre-test computations were instrumental for sizing and refining the model, over the Mach number range of 2.4 to 4.6, such that tunnel blockage and internal flow separation issues would be minimized. A 5-in diameter 70-deg sphere-cone forebody, which accommodates up to four 4:1 area ratio nozzles, followed by a 10-in long cylindrical aftbody, was developed for this study based on the computational results. The model was designed to allow for a large number of surface pressure measurements on the forebody and aftbody. Supplemental data included high-speed Schlieren video and internal pressures and temperatures. The run matrix was developed to allow for the quantification of various sources of experimental uncertainty, such as random errors due to run-to-run variations and bias errors due to flow field or model misalignments. Some preliminary results and observations from the test are presented, although detailed analyses of the data and uncertainties are still ongoing.
Alonso-López, Diego; Gutiérrez, Miguel A.; Lopes, Katia P.; Prieto, Carlos; Santamaría, Rodrigo; De Las Rivas, Javier
2016-01-01
APID (Agile Protein Interactomes DataServer) is an interactive web server that provides unified generation and delivery of protein interactomes mapped to their respective proteomes. This resource is a new, fully redesigned server that includes a comprehensive collection of protein interactomes for more than 400 organisms (25 of which include more than 500 interactions) produced by the integration of only experimentally validated protein–protein physical interactions. For each protein–protein interaction (PPI) the server includes currently reported information about its experimental validation to allow selection and filtering at different quality levels. As a whole, it provides easy access to the interactomes from specific species and includes a global uniform compendium of 90,379 distinct proteins and 678,441 singular interactions. APID integrates and unifies PPIs from major primary databases of molecular interactions, from other specific repositories and also from experimentally resolved 3D structures of protein complexes where more than two proteins were identified. For this purpose, a collection of 8,388 structures were analyzed to identify specific PPIs. APID also includes a new graph tool (based on Cytoscape.js) for visualization and interactive analyses of PPI networks. The server does not require registration and it is freely available for use at http://apid.dep.usal.es. PMID:27131791
Experimental Design in Clinical 'Omics Biomarker Discovery.
Forshed, Jenny
2017-11-03
This tutorial highlights some issues in the experimental design of clinical 'omics biomarker discovery, how to avoid bias and get as true quantities as possible from biochemical analyses, and how to select samples to improve the chance of answering the clinical question at issue. This includes the importance of defining clinical aim and end point, knowing the variability in the results, randomization of samples, sample size, statistical power, and how to avoid confounding factors by including clinical data in the sample selection, that is, how to avoid unpleasant surprises at the point of statistical analysis. The aim of this Tutorial is to help translational clinical and preclinical biomarker candidate research and to improve the validity and potential of future biomarker candidate findings.
NASA Technical Reports Server (NTRS)
Walberg, G.
1974-01-01
The present work describes a facility designed to validate the various aspects of radiative flow field theory, including the absorption of shock layer radiation by ablation products. The facility is capable of producing radiation with a spectrum similar to that of an entry vehicle shock layer and is designed to allow measurements at vacuum ultraviolet wavelengths where the most significant absorption by ablation products is predicted to occur. The design concept of the facility is presented along with results of theoretical analyses carried out to assess its research potential. Experimental data obtained during tests that simulated earth and Venusian entry and in which simulated ablation products were injected into the stagnation region flow field are discussed.
NASA Astrophysics Data System (ADS)
Rudrapati, R.; Sahoo, P.; Bandyopadhyay, A.
2016-09-01
The main aim of the present work is to analyse the significance of turning parameters for surface roughness in computer numerically controlled (CNC) turning of an aluminium alloy. Spindle speed, feed rate and depth of cut have been considered as the machining parameters. Experimental runs have been conducted as per the Box-Behnken design method. After experimentation, surface roughness has been measured using a stylus profilometer. Factor effects have been studied through analysis of variance. Mathematical modelling has been done by response surface methodology to establish relationships between the input parameters and the output response. Finally, process optimization has been carried out with the teaching-learning-based optimization (TLBO) algorithm, and the predicted turning condition has been validated through a confirmatory experiment.
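For readers unfamiliar with TLBO, the sketch below shows its two phases (teacher and learner) on a stand-in objective; it is not the authors' fitted roughness model or their parameter ranges.

```python
# Minimal TLBO loop on a hypothetical surrogate objective (not the paper's model).
import numpy as np

def objective(x):
    return np.sum((x - 0.3) ** 2)   # stand-in for the fitted roughness model

rng = np.random.default_rng(0)
pop = rng.uniform(0.0, 1.0, size=(20, 3))   # 20 learners, 3 coded parameters
for _ in range(100):
    # teacher phase: move learners toward the best solution, away from the mean
    scores = np.array([objective(p) for p in pop])
    teacher, mean = pop[np.argmin(scores)], pop.mean(axis=0)
    tf = rng.integers(1, 3)                 # teaching factor, 1 or 2
    cand = pop + rng.random(pop.shape) * (teacher - tf * mean)
    better = np.array([objective(c) for c in cand]) < scores
    pop[better] = cand[better]
    # learner phase: each learner moves relative to a random peer
    scores = np.array([objective(p) for p in pop])
    partners = rng.permutation(len(pop))
    step = np.where((scores < scores[partners])[:, None],
                    pop - pop[partners], pop[partners] - pop)
    cand = pop + rng.random(pop.shape) * step
    better = np.array([objective(c) for c in cand]) < scores
    pop[better] = cand[better]

print("best parameters:", pop[np.argmin([objective(p) for p in pop])])
```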
Numerical framework for the modeling of electrokinetic flows
NASA Astrophysics Data System (ADS)
Deshpande, Manish; Ghaddar, Chahid; Gilbert, John R.; St. John, Pamela M.; Woudenberg, Timothy M.; Connell, Charles R.; Molho, Joshua; Herr, Amy; Mungal, Godfrey; Kenny, Thomas W.
1998-09-01
This paper presents a numerical framework for design-based analyses of electrokinetic flow in interconnects. Electrokinetic effects, which can be broadly divided into electrophoresis and electroosmosis, are important as a transport mechanism in microfluidic devices for both pumping and separation. Models for the electrokinetic effects can be derived and coupled to the fluid dynamic equations through appropriate source terms. In the design of practical microdevices, however, accurate coupling of the electrokinetic effects requires knowledge of several material and physical parameters, such as the diffusivity and the mobility of the solute in the solvent. Additionally, wall-based effects such as chemical binding sites might exist that affect the flow patterns. In this paper, we address some of these issues by describing a synergistic numerical/experimental process to extract the required parameters. Experiments were conducted to provide the numerical simulations with a mechanism to extract these parameters based on quantitative comparisons with each other. These parameters were then applied in predicting further experiments to validate the process. As part of this research, we have created NetFlow, a tool for micro-fluid analyses. The tool can be validated and applied in existing technologies by first creating test structures to extract representations of the physical phenomena in the device, and then applying them in the design analyses to predict correct behavior.
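As a concrete instance of the electroosmotic source term discussed above, the classical Helmholtz-Smoluchowski relation gives the wall-driven slip velocity; the buffer, zeta potential and field values below are assumptions for illustration, not values from the paper.

```python
# Electroosmotic slip velocity from the Helmholtz-Smoluchowski relation.
EPS0 = 8.854e-12      # vacuum permittivity, F/m
eps_r = 78.5          # relative permittivity of an aqueous buffer (assumed)
zeta = -0.05          # wall zeta potential, V (assumed)
mu = 1.0e-3           # dynamic viscosity, Pa*s
E_field = 2.0e4       # applied axial electric field, V/m (assumed)

mobility = -EPS0 * eps_r * zeta / mu    # electroosmotic mobility, m^2/(V*s)
u_eo = mobility * E_field               # slip velocity, m/s
print(f"electroosmotic velocity: {u_eo * 1e3:.2f} mm/s")
```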
Confidence crisis of results in biomechanics research.
Knudson, Duane
2017-11-01
Many biomechanics studies have small sample sizes and incorrect statistical analyses, so reporting of inaccurate inferences and inflated magnitude of effects are common in the field. This review examines these issues in biomechanics research and summarises potential solutions from research in other fields to increase the confidence in the experimental effects reported in biomechanics. Authors, reviewers and editors of biomechanics research reports are encouraged to improve sample sizes and the resulting statistical power, improve reporting transparency, improve the rigour of statistical analyses used, and increase the acceptance of replication studies to improve the validity of inferences from data in biomechanics research. The application of sports biomechanics research results would also improve if a larger percentage of unbiased effects and their uncertainty were reported in the literature.
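The sample-size improvements urged here are normally planned with a prospective power analysis; below is a minimal sketch for a two-sample t-test, assuming a hypothetical moderate effect size.

```python
# Required sample size per group for a two-sample t-test (hypothetical effect).
from statsmodels.stats.power import TTestIndPower

analysis = TTestIndPower()
n_per_group = analysis.solve_power(effect_size=0.5, alpha=0.05, power=0.8)
print(f"participants needed per group: {n_per_group:.1f}")
```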
DOE Office of Scientific and Technical Information (OSTI.GOV)
Henry, Christopher S.; Bernstein, Hans C.; Weisenhorn, Pamela
Metabolic network modeling of microbial communities provides an in-depth understanding of community-wide metabolic and regulatory processes. Compared to single-organism analyses, community metabolic network modeling is more complex because it needs to account for interspecies interactions. To date, most approaches focus on the reconstruction of high-quality individual networks so that, when combined, they can predict community behaviors as a result of interspecies interactions. However, this conventional method becomes ineffective for communities whose members are not well characterized and cannot be experimentally interrogated in isolation. Here, we tested a new approach that uses community-level data as a critical input for the network reconstruction process. This method focuses on directly predicting interspecies metabolic interactions in a community, when axenic information is insufficient. We validated our method through the case study of a bacterial photoautotroph-heterotroph consortium that was used to provide data needed for a community-level metabolic network reconstruction. The resulting simulations provided experimentally validated predictions of how a photoautotrophic cyanobacterium supports the growth of an obligate heterotrophic species by providing organic carbon and nitrogen sources.
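The community-level prediction can be caricatured with a toy flux-balance calculation in which heterotroph growth is capped by the organic carbon the phototroph secretes; the network and numbers below are hypothetical, not the authors' reconstruction.

```python
# Toy flux-balance problem: heterotroph growth limited by phototroph-supplied
# organics. The stoichiometry and bounds are hypothetical.
import numpy as np
from scipy.optimize import linprog

# heterotroph fluxes: [organic uptake, growth, biomass drain]
S = np.array([
    [1.0, -1.0, 0.0],   # organic carbon: uptake produces, growth consumes
    [0.0, 1.0, -1.0],   # biomass: growth produces, drain removes
])
supply = 5.0            # organics secreted by the phototroph, mmol/gDW/h (assumed)
bounds = [(0, supply), (0, None), (0, None)]
c = [0.0, -1.0, 0.0]    # maximize growth flux (linprog minimizes)

res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds)
print(f"predicted heterotroph growth flux: {res.x[1]:.2f}")
```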
Navier-Stokes Computations of Longitudinal Forces and Moments for a Blended Wing Body
NASA Technical Reports Server (NTRS)
Pao, S. Paul; Biedron, Robert T.; Park, Michael A.; Fremaux, C. Michael; Vicroy, Dan D.
2005-01-01
The object of this paper is to investigate the feasibility of applying CFD methods to aerodynamic analyses for aircraft stability and control. The integrated aerodynamic parameters used in stability and control, however, are not necessarily those extensively validated in the state of the art CFD technology. Hence, an exploratory study of such applications and the comparison of the solutions to available experimental data will help to assess the validity of the current computation methods. In addition, this study will also examine issues related to wind tunnel measurements such as measurement uncertainty and support interference effects. Several sets of experimental data from the NASA Langley 14x22-Foot Subsonic Tunnel and the National Transonic Facility are presented. Two Navier-Stokes flow solvers, one using structured meshes and the other unstructured meshes, were used to compute longitudinal static stability derivatives for an advanced Blended Wing Body configuration over a wide range of angles of attack. The computations were performed for two different Reynolds numbers and the resulting forces and moments are compared with the above mentioned wind tunnel data.
OVERVIEW OF NEUTRON MEASUREMENTS IN JET FUSION DEVICE.
Batistoni, P; Villari, R; Obryk, B; Packer, L W; Stamatelatos, I E; Popovichev, S; Colangeli, A; Colling, B; Fonnesu, N; Loreti, S; Klix, A; Klosowski, M; Malik, K; Naish, J; Pillon, M; Vasilopoulou, T; De Felice, P; Pimpinella, M; Quintieri, L
2017-10-05
The design and operation of the ITER experimental fusion reactor requires the development of neutron measurement techniques and numerical tools to derive the fusion power and the radiation field in the device and in the surrounding areas. Nuclear analyses provide essential input to the conceptual design, optimisation, engineering and safety case in ITER and power plant studies. The required radiation transport calculations are extremely challenging because of the large physical extent of the reactor plant, the complexity of the geometry, and the combination of deep penetration and streaming paths. This article reports the experimental activities which are carried out at JET to validate the neutronics measurement methods and numerical tools used in ITER and power plant design. A new deuterium-tritium campaign is proposed in 2019 at JET: the unique 14 MeV neutron yields produced will be exploited as much as possible to validate measurement techniques, codes, procedures and data currently used in ITER design, thus reducing the related uncertainties and the associated risks in machine operation.
BETA (Bitter Electromagnet Testing Apparatus)
NASA Astrophysics Data System (ADS)
Bates, Evan M.; Birmingham, William J.; Rivera, William F.; Romero-Talamas, Carlos A.
2017-10-01
The Bitter Electromagnet Testing Apparatus (BETA) is a 1-Tesla (T) prototype of the 10-T Adjustable Long Pulse High-Field Apparatus (ALPHA). These water-cooled resistive magnets use high DC currents to produce strong uniform magnetic fields. Presented here is the successful completion of the BETA project and experimental results validating the analytical magnet design methods developed at the Dusty Plasma Laboratory (DPL). BETA's final design specifications will be highlighted, including the electromagnetic, thermal and stress analyses. The magnet core design will be explained, including the Bitter arcs, helix starters, and clamping annuli. The final version of the magnet's vessel and cooling system is also presented, as well as the electrical system of BETA, which is composed of a unique solid-state breaker circuit. Experimental results presented will show the operation of BETA at 1 T. The results are compared to both analytical design methods and finite element analysis calculations. We also explore the steady-state maximums and theoretical limits of BETA's design. The completion of BETA validates the design and manufacturing techniques that will be used in the succeeding magnet, ALPHA.
Validating Experimental and Theoretical Langmuir Probe Analyses
NASA Astrophysics Data System (ADS)
Pilling, Lawrence Stuart; Carnegie, Dale
2004-11-01
Analysis of Langmuir probe characteristics contains a paradox in that it is unknown a priori which theory is applicable before it is applied. Often theories are assumed to be correct when certain criteria are met, although these criteria may not validate the approach used. We have analysed Langmuir probe data from cylindrical double and single probes acquired from a DC discharge plasma over a wide variety of conditions. This discharge contains a dual temperature distribution, and hence fitting a theoretically generated curve is impractical. To determine the densities, an examination of the current theories was necessary. For conditions where the probe radius is of the same order of magnitude as the Debye length, the gradient expected for orbital-motion-limited (OML) theory is approximately the same as the radial-motion gradients. An analysis of the gradients from radial-motion theory was able to resolve the differences from the OML gradient value of two. The method was also able to determine whether the radial or OML theory applied without knowledge of the electron temperature. Only the position of the space charge potential is necessary to determine the applicable theory.
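The regime condition at the heart of this analysis, a probe radius of the same order as the Debye length, can be checked directly; the plasma parameters below are assumed values for illustration.

```python
# Compare probe radius to the Debye length before choosing OML or radial theory.
import numpy as np

E0 = 8.854e-12      # vacuum permittivity, F/m
QE = 1.602e-19      # elementary charge, C

te_ev = 2.0         # electron temperature, eV (assumed)
ne = 1e16           # electron density, m^-3 (assumed)
r_probe = 1.0e-4    # probe radius, m (assumed)

# lambda_D = sqrt(eps0 * k*Te / (n * e^2)); k*Te in joules is Te[eV] * e
debye = np.sqrt(E0 * te_ev * QE / (ne * QE**2))
print(f"Debye length: {debye * 1e6:.1f} um, r_p/lambda_D = {r_probe / debye:.2f}")
```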
Automated identification of reference genes based on RNA-seq data.
Carmona, Rosario; Arroyo, Macarena; Jiménez-Quesada, María José; Seoane, Pedro; Zafra, Adoración; Larrosa, Rafael; Alché, Juan de Dios; Claros, M Gonzalo
2017-08-18
Gene expression analyses demand appropriate reference genes (RGs) for normalization in order to obtain reliable assessments. Ideally, RG expression levels should remain constant in all cells, tissues or experimental conditions under study. Housekeeping genes traditionally fulfilled this requirement, but they have been reported to be less invariant than expected; therefore, RGs should be tested and validated for every particular situation. Microarray data have been used to propose new RGs, but only a limited set of model species and conditions are available; on the contrary, RNA-seq experiments are more and more frequent and constitute a new source of candidate RGs. An automated workflow based on mapped NGS reads has been constructed to obtain highly and invariantly expressed RGs, based on normalized expression in reads per mapped million and the coefficient of variation. This workflow has been tested with Roche/454 reads from reproductive tissues of olive tree (Olea europaea L.), as well as with Illumina paired-end reads from two different accessions of Arabidopsis thaliana and three different human cancers (prostate, small-cell lung cancer and lung adenocarcinoma). Candidate RGs have been proposed for each species, and many of them have been previously reported as RGs in the literature. Experimental validation of significant RGs in olive tree is provided to support the algorithm. Regardless of sequencing technology, number of replicates, and library sizes, when RNA-seq experiments are designed and performed, the same datasets can be analyzed with our workflow to extract suitable RGs for subsequent PCR validation. Moreover, different subsets of experimental conditions can provide different suitable RGs.
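The workflow's central computation, RPM normalization followed by a coefficient-of-variation filter, can be sketched as follows on synthetic counts; the thresholds are illustrative, not the authors' settings.

```python
# Screen for reference genes: RPM-normalize counts, keep high-expression,
# low-variability genes. Counts and thresholds are synthetic/illustrative.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
counts = pd.DataFrame(rng.poisson(200, size=(1000, 6)),
                      index=[f"g{i}" for i in range(1000)])

rpm = counts / counts.sum(axis=0) * 1e6     # reads per mapped million, per library
cv = rpm.std(axis=1) / rpm.mean(axis=1)     # stability across libraries
mask = (rpm.mean(axis=1) > rpm.mean(axis=1).median()) & (cv < 0.1)
print(f"{mask.sum()} candidate reference genes")
```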
Lee, Stella Juhyun; Brennan, Emily; Gibson, Laura Anne; Tan, Andy S. L.; Kybert-Momjian, Ani; Liu, Jiaying; Hornik, Robert
2016-01-01
Several message topic selection approaches propose that messages based on beliefs pretested and found to be more strongly associated with intentions will be more effective in changing population intentions and behaviors when used in a campaign. This study aimed to validate the underlying causal assumption of these approaches which rely on cross-sectional belief–intention associations. We experimentally tested whether messages addressing promising themes as identified by the above criterion were more persuasive than messages addressing less promising themes. Contrary to expectations, all messages increased intentions. Interestingly, mediation analyses showed that while messages deemed promising affected intentions through changes in targeted promising beliefs, messages deemed less promising also achieved persuasion by influencing nontargeted promising beliefs. Implications for message topic selection are discussed. PMID:27867218
Modal Test/Analysis Correlation of Space Station Structures Using Nonlinear Sensitivity
NASA Technical Reports Server (NTRS)
Gupta, Viney K.; Newell, James F.; Berke, Laszlo; Armand, Sasan
1992-01-01
The modal correlation problem is formulated as a constrained optimization problem for validation of finite element models (FEM's). For large-scale structural applications, a pragmatic procedure for substructuring, model verification, and system integration is described to achieve effective modal correlation. The space station substructure FEM's are reduced using Lanczos vectors and integrated into a system FEM using Craig-Bampton component modal synthesis. The optimization code is interfaced with MSC/NASTRAN to solve the problem of modal test/analysis correlation; that is, the problem of validating FEM's for launch and on-orbit coupled loads analysis against experimentally observed frequencies and mode shapes. An iterative perturbation algorithm is derived and implemented to update nonlinear sensitivity (derivatives of eigenvalues and eigenvectors) during optimizer iterations, which reduced the number of finite element analyses.
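The formulation can be illustrated on a toy problem: updating the stiffness parameters of a two-degree-of-freedom model until its analytical frequencies match "measured" ones. The model and target frequencies are hypothetical, and a generic least-squares optimizer stands in for the MSC/NASTRAN-coupled procedure described above.

```python
# Toy modal test/analysis correlation: tune stiffnesses to match "test" frequencies.
import numpy as np
from scipy.linalg import eigh
from scipy.optimize import least_squares

M = np.diag([1.0, 1.0])                        # mass matrix, kg (assumed)

def frequencies(k):
    k1, k2 = k
    K = np.array([[k1 + k2, -k2], [-k2, k2]])  # 2-DOF chain stiffness, N/m
    w2 = eigh(K, M, eigvals_only=True)         # generalized eigenvalues (rad/s)^2
    return np.sqrt(w2) / (2.0 * np.pi)         # natural frequencies, Hz

f_measured = np.array([1.1, 2.9])              # hypothetical "test" frequencies, Hz

res = least_squares(lambda k: frequencies(k) - f_measured,
                    x0=[100.0, 100.0], bounds=(1.0, 1e4))
print("updated stiffnesses:", res.x, "-> frequencies:", frequencies(res.x))
```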
QSPR for predicting chloroform formation in drinking water disinfection.
Luilo, G B; Cabaniss, S E
2011-01-01
Chlorination is the most widely used technique for water disinfection, but may lead to the formation of chloroform (trichloromethane; TCM) and other by-products. This article reports the first quantitative structure-property relationship (QSPR) for predicting the formation of TCM in chlorinated drinking water. Model compounds (n = 117) drawn from 10 literature sources were divided into training data (n = 90, analysed by five-way leave-many-out internal cross-validation) and external validation data (n = 27). QSPR internal cross-validation had Q² = 0.94 and a root mean square error (RMSE) of 0.09 moles TCM per mole compound, consistent with an external validation Q² of 0.94 and RMSE of 0.08 moles TCM per mole compound, and met criteria for high predictive power and robustness. In contrast, a log-transformed TCM QSPR performed poorly and did not meet the criteria for predictive power. The QSPR predictions were consistent with experimental values for TCM formation from tannic acid and for model fulvic acid structures. The descriptors used are consistent with a relatively small number of important TCM precursor structures based upon 1,3-dicarbonyls or 1,3-diphenols.
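The validation statistics reported above follow from standard definitions; the sketch below computes Q² against the training-set mean, one common convention (the paper's exact convention may differ):

```python
import numpy as np

def q2_rmse(y_obs, y_pred, y_train_mean):
    """Predictive squared correlation and RMSE as used to judge a QSPR.
    Q2 compares prediction errors with the variance about the training
    mean; Q2 near 1 and a small RMSE indicate high predictive power."""
    y_obs, y_pred = np.asarray(y_obs, float), np.asarray(y_pred, float)
    press = np.sum((y_obs - y_pred) ** 2)        # prediction error sum of squares
    tss = np.sum((y_obs - y_train_mean) ** 2)
    return 1.0 - press / tss, np.sqrt(press / len(y_obs))
```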
Monte Carlo-based validation of neutronic methodology for EBR-II analyses
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liaw, J.R.; Finck, P.J.
1993-01-01
The continuous-energy Monte Carlo code VIM (Ref. 1) has been validated extensively over the years against fast critical experiments and other neutronic analysis codes. A high degree of confidence in VIM for predicting reactor physics parameters has been firmly established. This paper presents a numerical validation of two conventional multigroup neutronic analysis codes, DIF3D (Ref. 4) and VARIANT (Ref. 5), against VIM for two Experimental Breeder Reactor II (EBR-II) core loadings in detailed three-dimensional hexagonal-z geometry. The DIF3D code is based on nodal diffusion theory, and it is used in calculations for day-to-day reactor operations, whereas the VARIANT code is based on nodal transport theory and is used with increasing frequency for specific applications. Both DIF3D and VARIANT rely on multigroup cross sections generated from ENDF/B-V by the ETOE-2/MC²-II/SDX (Ref. 6) code package. Hence, this study also validates the multigroup cross-section processing methodology against the continuous-energy approach used in VIM.
Integration of design and inspection
NASA Astrophysics Data System (ADS)
Simmonds, William H.
1990-08-01
Developments in advanced computer integrated manufacturing technology, coupled with the emphasis on Total Quality Management, are exposing needs for new techniques to integrate all functions from design through to support of the delivered product. One critical functional area that must be integrated into design is that embracing the measurement, inspection and test activities necessary for validation of the delivered product. This area is being tackled by a collaborative project supported by the UK Government Department of Trade and Industry. The project is aimed at developing techniques for analysing validation needs and for planning validation methods. Within the project an experimental Computer Aided Validation Expert system (CAVE) is being constructed. This operates with a generalised model of the validation process and helps with all design stages: specification of product requirements; analysis of the assurance provided by a proposed design and method of manufacture; development of the inspection and test strategy; and analysis of feedback data. The kernel of the system is a knowledge base containing knowledge of the manufacturing process capabilities and of the available inspection and test facilities. The CAVE system is being integrated into a real life advanced computer integrated manufacturing facility for demonstration and evaluation.
Hülsemann, Frank; Koehler, Karsten; Wittsiepe, Jürgen; Wilhelm, Michael; Hilbig, Annett; Kersting, Mathilde; Braun, Hans; Flenker, Ulrich; Schänzer, Wilhelm
2017-08-01
Natural stable isotope ratios (δ15N) of humans can be used for nutritional analyses and dietary reconstruction of modern and historic individuals and populations. Information about an individual's metabolic state can be obtained by comparison of tissue and dietary δ15N. Different methods have been used to estimate dietary δ15N in the past; however, such predictions have not previously been validated against experimental values. For a total of 56 meals and 21 samples of 24-h diets, predicted and experimental δ15N values were compared. The δ15N values were predicted from self-recorded food intake and compared with experimental δ15N values. Predicted and experimental δ15N values were in good agreement for meals and preparations (r = 0.89, p < .001) as well as for the 24-h diets (r = 0.76, p < .001). Dietary δ15N was mainly determined by the amount of fish, whereas the contribution of meat to dietary δ15N values was less pronounced. Prediction of human dietary δ15N values using standardised food records and representative δ15N data sets yields reliable data for dietary δ15N intake. A differentiated analysis of the primary protein sources is necessary when estimating the proportion of animal-derived protein in the diet by δ15N analysis.
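Predicting dietary δ15N from a food record reduces to a nitrogen-weighted mass balance over the recorded items; the sketch below uses protein mass as a proxy for nitrogen contribution and wholly illustrative δ15N values, not the study's reference data set:

```python
def predict_diet_d15N(foods):
    """Nitrogen-weighted mass balance for dietary delta-15N.

    foods: list of (grams_protein, d15N_permil) tuples assembled from a
    self-recorded food diary and a reference delta-15N data set."""
    total_n = sum(p for p, _ in foods)
    return sum(p * d for p, d in foods) / total_n

# Example meal (hypothetical values): fish dominates the predicted d15N.
meal = [(30.0, 12.5),   # fish
        (10.0, 5.0),    # dairy
        (8.0, 2.0)]     # cereal
print(round(predict_diet_d15N(meal), 2))
```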
Mead, Emma J; Chiverton, Lesley M; Spurgeon, Sarah K; Martin, Elaine B; Montague, Gary A; Smales, C Mark; von der Haar, Tobias
2012-01-01
Monoclonal antibodies are commercially important, high value biotherapeutic drugs used in the treatment of a variety of diseases. These complex molecules consist of two heavy chain and two light chain polypeptides covalently linked by disulphide bonds. They are usually expressed as recombinant proteins from cultured mammalian cells, which are capable of correctly modifying, folding and assembling the polypeptide chains into the native quaternary structure. Such recombinant cell lines often vary in the amounts of product produced and in the heterogeneity of the secreted products. The biological mechanisms of this variation are not fully defined. Here we have utilised experimental and modelling strategies to characterise and define the biology underpinning product heterogeneity in cell lines exhibiting varying antibody expression levels, and then experimentally validated these models. In undertaking these studies we applied and validated biochemical (rate-constant based) and engineering (nonlinear) models of antibody expression to experimental data from four NS0 cell lines with different IgG4 secretion rates. The models predict that export of the full antibody and its fragments are intrinsically linked, and cannot therefore be manipulated individually at the level of the secretory machinery. Instead, the models highlight strategies for the manipulation at the precursor species level to increase recombinant protein yields in both high and low producing cell lines. The models also highlight cell line specific limitations in the antibody expression pathway.
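A rate-constant based model of the kind applied here can be sketched as a small ODE system; the species lumping (free heavy- and light-chain pools assembling into one secretable species) and all rate constants below are illustrative assumptions, not the paper's fitted model:

```python
import numpy as np
from scipy.integrate import odeint

def antibody_model(y, t, sH, sL, ka, kdeg, ksec):
    """Toy rate-constant model of antibody expression: free heavy (H)
    and light (L) chain pools assemble into a secretable species A.
    The kdeg terms lump degradation of unassembled precursors."""
    H, L, A = y
    assembly = ka * H * L
    dH = sH - kdeg * H - assembly
    dL = sL - kdeg * L - assembly
    dA = assembly - ksec * A
    return [dH, dL, dA]

t = np.linspace(0.0, 48.0, 200)                      # hours
sol = odeint(antibody_model, [0.0, 0.0, 0.0], t,
             args=(1.0, 1.5, 0.05, 0.1, 0.2))        # illustrative rates
secretion_rate = 0.2 * sol[:, 2]                     # ksec * A
```

Fitting such rate constants to measured intra- and extracellular species levels is what lets one ask where the precursor-level bottlenecks sit.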
Validation of tungsten cross sections in the neutron energy region up to 100 keV
NASA Astrophysics Data System (ADS)
Pigni, Marco T.; Žerovnik, Gašper; Leal, Luiz. C.; Trkov, Andrej
2017-09-01
Following a series of recent cross section evaluations on tungsten isotopes performed at Oak Ridge National Laboratory (ORNL), this paper presents the validation work carried out to test the performance of the evaluated cross sections based on lead-slowing-down (LSD) benchmarks conducted in Grenoble. ORNL completed the resonance parameter evaluation of four tungsten isotopes - 182,183,184,186W - in August 2014 and submitted it as an ENDF-compatible file to be part of the next release of the ENDF/B-VIII.0 nuclear data library. The evaluations were performed with support from the US Nuclear Criticality Safety Program in an effort to provide improved tungsten cross section and covariance data for criticality safety sensitivity analyses. The validation analysis based on the LSD benchmarks showed an improved agreement with the experimental response when the ORNL tungsten evaluations were included in the ENDF/B-VII.1 library. Comparison with the results obtained with the JEFF-3.2 nuclear data library are also discussed.
Reactance to Health Warnings Scale: Development and Validation
Hall, Marissa G.; Sheeran, Paschal; Noar, Seth M.; Ribisl, Kurt M.; Bach, Laura E.; Brewer, Noel T.
2016-01-01
Background Health warnings may be less effective if they elicit reactance, a motivation to resist a threat to freedom, yet we lack a standard measure of reactance. Purpose We sought to validate a new health warning reactance scale in the context of pictorial cigarette pack warnings. Methods A national sample of adults (n=1,413) responded to reactance survey questions while viewing randomly assigned pictorial or text warnings on images of cigarette packs. A separate longitudinal sample of adult smokers received the warnings on their own cigarette packs (n=46). Results Factor analyses identified a reliable and valid 27-item Reactance to Health Warnings Scale. In our experimental study, smokers rated pictorial warnings as being able to motivate quitting more than text warnings. However, five reactance scale factors weakened the warnings’ impact (anger, exaggeration, government, manipulation, and personal attack; all p<.05). Conclusions The Reactance to Health Warnings Scale had good psychometric properties. Reactance weakened the impact of pictorial warnings on smokers’ evaluation of the warning’s ability to motivate quitting. PMID:27333895
Substantiation Data for Advanced Beaded and Tubular Structural Panels. Volume 3: Testing
NASA Technical Reports Server (NTRS)
Hedges, P. C.; Greene, B. E.
1974-01-01
The test program is described, which was conducted to provide the necessary experimental data to verify the design and analysis methods developed for beaded and tubular panels. Test results are summarized and presented for all local buckling and full size panel tests. Selected representative test data from each of these tests is presented in detail. The results of this program established a valid analysis and design procedure for circular tube panels. Test results from three other configurations show deformational modes which are not adequately accounted for in the present analyses.
NASA Astrophysics Data System (ADS)
Souley, Mountaka; Lopez, Philippe; Boulon, Marc; Thoraval, Alain
2015-05-01
The experimental device previously used to study the hydromechanical behaviour of individual fractures on a laboratory scale was adapted to make it possible to measure flow through porous rock mass samples in addition to fracture flows. A first series of tests was performed to characterize the hydromechanical behaviour of the fracture individually as well as that of the porous matrix (sandstone) comprising the fracture walls. A third test in this series was used to validate the experimental approach. These tests showed non-linear evolution of the contact area on the fracture walls with respect to effective normal stress. Consequently, a non-linear relationship was noted between the hydraulic aperture on the one hand, and the effective normal stress and mechanical opening on the other hand. The results of the three tests were then analysed by numerical modelling. The VIPLEF/HYDREF numerical codes used take into account the dual-porosity of the sample (fracture + rock matrix) and can be used to reproduce hydromechanical loading accurately. The analyses show that the relationship between the hydraulic aperture of the fracture and the mechanical closure has a significant effect on fracture flow rate predictions. By taking simultaneous measurements of flow in both fracture and rock matrix, we were able to carry out a global evaluation of the conceptual approach used.
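The stress-dependent fracture flow described above is commonly idealized with a cubic-law flow rate whose hydraulic aperture closes non-linearly with effective normal stress; the sketch below uses a Bandis-type hyperbolic closure with illustrative parameters, not the calibrated VIPLEF/HYDREF relations:

```python
def hydraulic_aperture(sigma_eff, b0=2e-4, dv_max=1.5e-4, sigma_half=5e6):
    """Hyperbolic closure law: aperture (m) decreases non-linearly with
    effective normal stress (Pa); all parameter values are illustrative."""
    return b0 - dv_max * sigma_eff / (sigma_eff + sigma_half)

def fracture_flow(dp, length, width, sigma_eff, mu=1e-3):
    """Cubic-law volumetric flow rate through a single fracture of
    hydraulic aperture b under pressure drop dp over `length` (SI units)."""
    b = hydraulic_aperture(sigma_eff)
    return width * b**3 / (12.0 * mu) * dp / length

print(fracture_flow(dp=1e5, length=0.1, width=0.05, sigma_eff=2e6))
```

Because flow scales with the cube of the aperture, even modest errors in the aperture-closure relationship dominate the predicted flow rate, which is the paper's central observation.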
Integrative analyses shed new light on human ribosomal protein gene regulation
Li, Xin; Zheng, Yiyu; Hu, Haiyan; Li, Xiaoman
2016-01-01
Ribosomal protein genes (RPGs) are important house-keeping genes that are well-known for their coordinated expression. Previous studies on RPGs are largely limited to their promoter regions. Recent high-throughput studies provide an unprecedented opportunity to study how human RPGs are transcriptionally modulated and how such transcriptional regulation may contribute to the coordinated gene expression in various tissues and cell types. By analyzing the DNase I hypersensitive sites under 349 experimental conditions, we predicted 217 RPG regulatory regions in the human genome. More than 86.6% of these computationally predicted regulatory regions were partially corroborated by independent experimental measurements. Motif analyses on these predicted regulatory regions identified 31 DNA motifs, including 57.1% of experimentally validated motifs in literature that regulate RPGs. Interestingly, we observed that the majority of the predicted motifs were shared by the predicted distal and proximal regulatory regions of the same RPGs, a likely general mechanism for enhancer-promoter interactions. We also found that RPGs may be differently regulated in different cells, indicating that condition-specific RPG regulatory regions still need to be discovered and investigated. Our study advances the understanding of how RPGs are coordinately modulated, which sheds light on the general principles of gene transcriptional regulation in mammals. PMID:27346035
NASA Astrophysics Data System (ADS)
Sheikh, Muhammad; Elmarakbi, Ahmed; Elkady, Mustafa
2017-12-01
This paper focuses on state of charge (SOC) dependent mechanical failure analysis of an 18650 lithium-ion battery to detect signs of thermal runaway. Quasi-static loading conditions are used with four test protocols (rod, circular punch, three-point bend and flat plate) to analyse the propagation of mechanical failures and failure-induced temperature changes. Finite element analysis (FEA) is used to model a single battery cell with a concentric layered formation that represents a complete cell. The numerical simulation model is designed with a solid element formulation, where the steel casing and all layers follow the same formulation, and a fine mesh is used for all layers. Experimental work is also performed to analyse deformation of the 18650 lithium-ion cell. The numerical simulation model is validated with experimental results. Deformation of the cell mimics thermal runaway, and various thermal runaway detection strategies are employed in this work, including force-displacement, voltage-temperature, stress-strain, SOC dependency and separator failure. Results show that the cell can undergo severe conditions even with no fracture or rupture; these conditions may be slow to develop, but they can lead to catastrophic failures. The numerical simulation technique proves useful in predicting initial battery failures, and results are in good correlation with the experimental results.
Validation of Endogenous Internal Real-Time PCR Controls in Renal Tissues
Cui, Xiangqin; Zhou, Juling; Qiu, Jing; Johnson, Martin R.; Mrug, Michal
2009-01-01
Background: Endogenous internal controls (‘reference’ or ‘housekeeping’ genes) are widely used in real-time PCR (RT-PCR) analyses. Their use relies on the premise of consistently stable expression across studied experimental conditions. Unfortunately, none of these controls fulfills this premise across a wide range of experimental conditions; consequently, none of them can be recommended for universal use. Methods: To determine which endogenous RT-PCR controls are suitable for analyses of renal tissues altered by kidney disease, we studied the expression of 16 commonly used ‘reference genes’ in 7 mildly and 7 severely affected whole kidney tissues from a well-characterized cystic kidney disease model. Expression levels of these 16 genes, determined by TaqMan® RT-PCR analyses and Affymetrix GeneChip® arrays, were normalized and tested for overall variance and equivalence of the means. Results: Both statistical approaches and both TaqMan- and GeneChip-based methods converged on 3 out of the 4 top-ranked genes (Ppia, Gapdh and Pgk1) that had the most constant expression levels across the studied phenotypes. Conclusion: A combination of the top-ranked genes will provide a suitable endogenous internal control for similar studies of kidney tissues across a wide range of disease severity. PMID:19729889
Damage Modeling of Injection-Molded Short- and Long-Fiber Thermoplastics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nguyen, Ba Nghiep; Kunc, Vlastimil; Bapanapalli, Satish K.
2009-10-30
This article applies the recent anisotropic rotary diffusion – reduced strain closure (ARD-RSC) model for predicting fiber orientation and a new damage model for injection-molded long-fiber thermoplastics (LFTs) to analyze progressive damage leading to total failure of injection-molded long-glass-fiber/polypropylene (PP) specimens. The ARD-RSC model was implemented in a research version of the Autodesk Moldflow Plastics Insight (MPI) processing code, and it has been used to simulate injection-molding of a long-glass-fiber/PP plaque. The damage model combines micromechanical modeling with a continuum damage mechanics description to predict the nonlinear behavior due to plasticity coupled with damage in LFTs. This model has been implemented in the ABAQUS finite element code via user-subroutines and has been used in the damage analyses of tensile specimens removed from the injection-molded long-glass-fiber/PP plaques. Experimental characterization and mechanical testing were performed to provide input data to support and validate both process modeling and damage analyses. The predictions are in agreement with the experimental results.
Data Analysis for the LISA Pathfinder Mission
NASA Technical Reports Server (NTRS)
Thorpe, James Ira
2009-01-01
The LTP (LISA Technology Package) is the core part of the Laser Interferometer Space Antenna (LISA) Pathfinder mission. The main goal of the mission is to study the sources of any disturbances that perturb the motion of the freely-falling test masses from their geodesic trajectories, as well as to test various technologies needed for LISA. The LTP experiment is designed as a sequence of experimental runs in which the performance of the instrument is studied and characterized under different operating conditions. In order to best optimize subsequent experimental runs, each run must be promptly analysed to ensure that the following ones make best use of the available knowledge of the instrument. In order to do this, all analyses must be designed and tested in advance of the mission and have sufficient built-in flexibility to account for unexpected results or behaviour. To support this activity, a robust and flexible data analysis software package is also required. This poster presents two of the main components that make up the data analysis effort: the data analysis software and the mock-data challenges used to validate analysis procedures and experiment designs.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wilson, William; Krakowiak, Konrad J.; Ulm, Franz-Josef, E-mail: ulm@mit.edu
2014-01-15
According to recent developments in cement clinker engineering, the optimization of chemical substitutions in the main clinker phases offers a promising approach to improve both reactivity and grindability of clinkers. Thus, monitoring the chemistry of the phases may become part of the quality control at the cement plants, along with the usual measurements of the abundance of the mineralogical phases (quantitative X-ray diffraction) and the bulk chemistry (X-ray fluorescence). This paper presents a new method to assess these three complementary quantities with a single experiment. The method is based on electron microprobe spot analyses, performed over a grid located on a representative surface of the sample and interpreted with advanced statistical tools. This paper describes the method and the experimental program performed on industrial clinkers to establish the accuracy in comparison to conventional methods. Highlights: • A new method of clinker characterization. • Combination of electron probe technique with cluster analysis. • Simultaneous assessment of phase abundance, composition and bulk chemistry. • Experimental validation performed on industrial clinkers.
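The statistical interpretation of the gridded spot analyses can be sketched as a cluster analysis: each spot is assigned to a phase, phase abundance follows from point counting, and phase composition from the cluster means. A minimal sketch, assuming a CSV of oxide weight percentages and four clinker phases (both assumptions; the file name is hypothetical):

```python
import numpy as np
from sklearn.cluster import KMeans

# Each row: one microprobe spot analysis; columns: oxide wt% (e.g. CaO,
# SiO2, Al2O3, Fe2O3, MgO ...).  Cluster count set to the four expected
# clinker phases (alite, belite, aluminate, ferrite).
spots = np.loadtxt("spot_analyses.csv", delimiter=",")   # hypothetical file
km = KMeans(n_clusters=4, n_init=10, random_state=0).fit(spots)

for k in range(4):
    members = spots[km.labels_ == k]
    frac = len(members) / len(spots)      # phase abundance by point counting
    print(f"phase {k}: {100 * frac:.1f} % of spots, "
          f"mean composition = {members.mean(axis=0).round(2)}")

bulk = spots.mean(axis=0)                 # bulk chemistry from the same grid
```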
Intralaminar and Interlaminar Progressive Failure Analysis of Composite Panels with Circular Cutouts
NASA Technical Reports Server (NTRS)
Goyal, Vinay K.; Jaunky, Navin; Johnson, Eric R.; Ambur, Damodar
2002-01-01
A progressive failure methodology is developed and demonstrated to simulate the initiation and material degradation of a laminated panel due to intralaminar and interlaminar failures. Initiation of intralaminar failure can be by a matrix-cracking mode, a fiber-matrix shear mode, and a fiber failure mode. Subsequent material degradation is modeled using damage parameters for each mode to selectively reduce lamina material properties. The interlaminar failure mechanism, such as delamination, is simulated by positioning interface elements between adjacent sublaminates. A nonlinear constitutive law is postulated for the interface element that accounts for a multi-axial stress criterion to detect the initiation of delamination, a mixed-mode fracture criterion for delamination progression, and a damage parameter to prevent restoration of a previous cohesive state. The methodology is validated using experimental data available in the literature on the response and failure of quasi-isotropic panels with centrally located circular cutouts loaded into the postbuckling regime. Very good agreement between the progressive failure analyses and the experimental results is achieved if the failure analysis includes the interaction of intralaminar and interlaminar failures.
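A common concrete form of such an interface law is the bilinear traction-separation relation with an irreversible damage variable; the mode-I sketch below omits the multi-axial initiation and mixed-mode propagation criteria of the full methodology, and its parameters are illustrative:

```python
class BilinearCohesive:
    """Mode-I bilinear traction-separation law with irreversible damage.

    Linear response up to the initiation opening delta0, then linear
    softening to zero traction at delta_f.  The damage variable only
    grows, so a previously failed cohesive state cannot be restored."""

    def __init__(self, K0=1e6, delta0=1e-5, delta_f=1e-4):
        self.K0, self.delta0, self.delta_f = K0, delta0, delta_f
        self.damage = 0.0

    def traction(self, delta):
        if delta <= self.delta0:
            d = 0.0
        elif delta >= self.delta_f:
            d = 1.0
        else:                             # linear softening branch
            d = (self.delta_f * (delta - self.delta0)
                 / (delta * (self.delta_f - self.delta0)))
        self.damage = max(self.damage, d)     # irreversibility
        return (1.0 - self.damage) * self.K0 * max(delta, 0.0)
```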
Snorradóttir, Bergthóra S; Jónsdóttir, Fjóla; Sigurdsson, Sven Th; Másson, Már
2014-08-01
A model is presented for transdermal drug delivery from single-layered silicone matrix systems. The work is based on our previous results that, in particular, extend the well-known Higuchi model. Recently, we have introduced a numerical transient model describing matrix systems where the drug dissolution can be non-instantaneous. Furthermore, our model can describe complex interactions within a multi-layered matrix and the matrix to skin boundary. The power of the modelling approach presented here is further illustrated by allowing the possibility of a donor solution. The model is validated by a comparison with experimental data, as well as validating the parameter values against each other, using various configurations with donor solution, silicone matrix and skin. Our results show that the model is a good approximation to real multi-layered delivery systems. The model offers the ability of comparing drug release for ibuprofen and diclofenac, which cannot be analysed by the Higuchi model because the dissolution in the latter case turns out to be limited. The experiments and numerical model outlined in this study could also be adjusted to more general formulations, which enhances the utility of the numerical model as a design tool for the development of drug-loaded matrices for trans-membrane and transdermal delivery. © 2014 Wiley Periodicals, Inc. and the American Pharmacists Association.
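For reference, the Higuchi limit that the numerical model extends has a closed form for cumulative release per unit area; it assumes instantaneous dissolution and loading well above solubility, which is exactly what breaks down for diclofenac here. A sketch with illustrative parameter values (SI units):

```python
import numpy as np

def higuchi_release(t, D, A, Cs):
    """Classical Higuchi expression for cumulative drug release per unit
    area from a matrix with diffusivity D, initial loading A and drug
    solubility Cs (valid for A >> Cs with instantaneous dissolution)."""
    return np.sqrt(D * t * (2.0 * A - Cs) * Cs)

t = np.linspace(0.0, 24 * 3600.0, 100)            # one day, in seconds
q = higuchi_release(t, D=1e-11, A=50.0, Cs=2.0)   # illustrative values
```

When dissolution is rate-limiting, the square-root-of-time profile no longer holds, which is why the transient numerical model is needed.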
Applying Propensity Score Methods in Medical Research: Pitfalls and Prospects
Luo, Zhehui; Gardiner, Joseph C.; Bradley, Cathy J.
2012-01-01
The authors review experimental and nonexperimental causal inference methods, focusing on assumptions for the validity of instrumental variables and propensity score (PS) methods. They provide guidance in four areas for the analysis and reporting of PS methods in medical research and selectively evaluate mainstream medical journal articles from 2000 to 2005 in the four areas, namely, examination of balance, overlapping support description, use of estimated PS for evaluation of treatment effect, and sensitivity analyses. In spite of the many pitfalls, when appropriately evaluated and applied, PS methods can be powerful tools in assessing average treatment effects in observational studies. Appropriate PS applications can create experimental conditions using observational data when randomized controlled trials are not feasible and, thus, lead researchers to an efficient estimator of the average treatment effect. PMID:20442340
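As a concrete instance of the estimators being reviewed, an inverse-probability-weighted average treatment effect with an estimated propensity score can be sketched as follows; the clipping threshold is an illustrative overlap safeguard, and the balance and sensitivity checks the authors call for are still required:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def ipw_ate(X, treated, y):
    """Average treatment effect by inverse-probability weighting with an
    estimated propensity score (Hajek-style normalized weights).

    X       : (n, p) covariate matrix
    treated : (n,) 0/1 treatment indicator
    y       : (n,) outcome
    """
    ps = LogisticRegression(max_iter=1000).fit(X, treated).predict_proba(X)[:, 1]
    ps = np.clip(ps, 0.01, 0.99)                 # crude overlap safeguard
    w1, w0 = treated / ps, (1 - treated) / (1 - ps)
    return np.sum(w1 * y) / np.sum(w1) - np.sum(w0 * y) / np.sum(w0)
```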
The Validity of Conscientiousness Is Overestimated in the Prediction of Job Performance.
Kepes, Sven; McDaniel, Michael A
2015-01-01
Sensitivity analyses refer to investigations of the degree to which the results of a meta-analysis remain stable when conditions of the data or the analysis change. To the extent that results remain stable, one can refer to them as robust. Sensitivity analyses are rarely conducted in the organizational science literature. Despite conscientiousness being a valued predictor in employment selection, sensitivity analyses have not been conducted with respect to meta-analytic estimates of the correlation (i.e., validity) between conscientiousness and job performance. To address this deficiency, we reanalyzed the largest collection of conscientiousness validity data in the personnel selection literature and conducted a variety of sensitivity analyses. Publication bias analyses demonstrated that the validity of conscientiousness is moderately overestimated (by around 30%; a correlation difference of about .06). The misestimation of the validity appears to be due primarily to suppression of small effect sizes in the journal literature. These inflated validity estimates result in an overestimate of the dollar utility of personnel selection by millions of dollars and should be of considerable concern for organizations. The fields of management and applied psychology seldom conduct sensitivity analyses. Through the use of sensitivity analyses, this paper documents that the existing literature overestimates the validity of conscientiousness in the prediction of job performance. Our data show that effect sizes from journal articles are largely responsible for this overestimation. PMID:26517553
Zhou, Bailing; Zhao, Huiying; Yu, Jiafeng; Guo, Chengang; Dou, Xianghua; Song, Feng; Hu, Guodong; Cao, Zanxia; Qu, Yuanxu; Yang, Yuedong; Zhou, Yaoqi; Wang, Jihua
2018-01-04
Long non-coding RNAs (lncRNAs) play important functional roles in various biological processes. Early databases were utilized to deposit all lncRNA candidates produced by high-throughput experimental and/or computational techniques to facilitate classification, assessment and validation. As more lncRNAs are validated by low-throughput experiments, several databases were established for experimentally validated lncRNAs. However, these databases are small in scale (with a few hundreds of lncRNAs only) and specific in their focuses (plants, diseases or interactions). Thus, it is highly desirable to have a comprehensive dataset for experimentally validated lncRNAs as a central repository for all of their structures, functions and phenotypes. Here, we established EVLncRNAs by curating lncRNAs validated by low-throughput experiments (up to 1 May 2016) and integrating specific databases (lncRNAdb, LncRNADisease, Lnc2Cancer and PLNlncRbase) with additional functional and disease-specific information not covered previously. The current version of EVLncRNAs contains 1543 lncRNAs from 77 species, which is 2.9 times larger than the current largest database for experimentally validated lncRNAs. Seventy-four percent of lncRNA entries are partially or completely new compared to all existing experimentally validated databases. The established database allows users to browse, search and download as well as to submit experimentally validated lncRNAs. The database is available at http://biophy.dzu.edu.cn/EVLncRNAs. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research. PMID:28985416
Verification and Validation of Residual Stresses in Bi-Material Composite Rings
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nelson, Stacy Michelle; Hanson, Alexander Anthony; Briggs, Timothy
Process-induced residual stresses commonly occur in composite structures composed of dissimilar materials. These residual stresses form due to differences in the composite materials’ coefficients of thermal expansion and the shrinkage upon cure exhibited by polymer matrix materials. Depending upon the specific geometric details of the composite structure and the materials’ curing parameters, it is possible that these residual stresses could result in interlaminar delamination or fracture within the composite. Therefore, the consideration of potential residual stresses is important when designing composite parts and their manufacturing processes. However, the experimental determination of residual stresses in prototype parts can be time and cost prohibitive. As an alternative to physical measurement, it is possible for computational tools to be used to quantify potential residual stresses in composite prototype parts. Therefore, the objectives of the presented work are to demonstrate a simplistic method for simulating residual stresses in composite parts, as well as the potential value of sensitivity and uncertainty quantification techniques during analyses for which material property parameters are unknown. Specifically, a simplified residual stress modeling approach, which accounts for coefficient of thermal expansion mismatch and polymer shrinkage, is implemented within the Sandia National Laboratories’ developed SIERRA/SolidMechanics code. Concurrent with the model development, two simple, bi-material structures composed of a carbon fiber/epoxy composite and aluminum, a flat plate and a cylinder, are fabricated and the residual stresses are quantified through the measurement of deformation. Then, in the process of validating the developed modeling approach with the experimental residual stress data, manufacturing process simulations of the two simple structures are developed and undergo a formal verification and validation process, including a mesh convergence study, sensitivity analysis, and uncertainty quantification. The simulations’ final results show adequate agreement with the experimental measurements, indicating the validity of a simple modeling approach, as well as a necessity for the inclusion of material parameter uncertainty in the final residual stress predictions.
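A useful first-order sanity check on such simulations is the fully constrained thermal mismatch stress; the sketch below ignores geometry and cure shrinkage, and its material values are illustrative, not the measured properties of these specimens:

```python
def mismatch_stress(E, nu, d_alpha, dT):
    """First-order residual stress estimate in a thin layer bonded to a
    much stiffer substrate: full constraint of the thermal mismatch
    strain d_alpha * dT, using the biaxial modulus E / (1 - nu).  Cure
    shrinkage can be added as an extra inelastic strain term."""
    return E / (1.0 - nu) * d_alpha * dT

# Epoxy-matrix layer on aluminium, cooled from cure (illustrative numbers):
print(mismatch_stress(E=8e9, nu=0.3, d_alpha=23e-6 - 3e-6, dT=-100.0) / 1e6, "MPa")
```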
Bianchi, F; Careri, M; Maffini, M; Mangia, A; Mucchino, C
2003-01-01
A sensitive method for the simultaneous determination of ⁷Li, ²⁷Al and ⁵⁶Fe by cold plasma ICP-MS was developed and validated. Experimental design was used to investigate the effects of torch position, torch power, lens 2 voltage, and coolant flow. Regression models and desirability functions were applied to find the experimental conditions providing the highest global sensitivity in a multi-elemental analysis. Validation was performed in terms of limits of detection (LOD), limits of quantitation (LOQ), linearity and precision. LODs were 1.4 and 159 ng L⁻¹ for ⁷Li and ⁵⁶Fe, respectively; the highest LOD found was that for ²⁷Al (425 ng L⁻¹). Linear ranges of 5 orders of magnitude for Li and 3 orders for Fe were statistically verified for each element. Precision was evaluated by testing two concentration levels, and good results in terms of both intra-day repeatability and intermediate precision were obtained. RSD values lower than 4.8% at the lowest concentration level were calculated for intra-day repeatability. Commercially available soft drinks and alcoholic beverages contained in different packaging materials (Tetra Pak, polyethylene terephthalate (PET), commercial cans and glass) were analysed, and all the analytes were detected and quantitated. Copyright 2002 John Wiley & Sons, Ltd.
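The LOD and LOQ figures quoted above follow from the usual blank-based definitions; a sketch, assuming replicate blank signals and a calibration-curve slope are available:

```python
import numpy as np

def lod_loq(blank_signals, sensitivity):
    """Blank-based detection limits: LOD = 3 * sigma_blank / slope and
    LOQ = 10 * sigma_blank / slope, where `sensitivity` is the
    calibration-curve slope (signal per concentration unit)."""
    s_blank = np.std(blank_signals, ddof=1)
    return 3.0 * s_blank / sensitivity, 10.0 * s_blank / sensitivity
```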
GIMDA: Graphlet interaction-based MiRNA-disease association prediction.
Chen, Xing; Guan, Na-Na; Li, Jian-Qiang; Yan, Gui-Ying
2018-03-01
MicroRNAs (miRNAs) have been confirmed to be closely related to various human complex diseases by many experimental studies. It is necessary and valuable to develop powerful and effective computational models to predict potential associations between miRNAs and diseases. In this work, we presented a prediction model of Graphlet Interaction for MiRNA-Disease Association prediction (GIMDA) by integrating the disease semantic similarity, miRNA functional similarity, Gaussian interaction profile kernel similarity and the experimentally confirmed miRNA-disease associations. The related score of a miRNA to a disease was calculated by measuring the graphlet interactions between two miRNAs or two diseases. The novelty of GIMDA lies in that we used graphlet interaction to analyse the complex relationships between two nodes in a graph. The AUCs of GIMDA in global and local leave-one-out cross-validation (LOOCV) turned out to be 0.9006 and 0.8455, respectively. The average result of five-fold cross-validation reached 0.8927 ± 0.0012. In case studies of colon neoplasms, kidney neoplasms and prostate neoplasms based on the HMDD V2.0 database, 45, 45, and 41 of the top 50 potential miRNAs predicted by GIMDA were validated by dbDEMC and miR2Disease, respectively. Additionally, in the case study of new diseases without any known associated miRNAs and the case study of predicting potential miRNA-disease associations using HMDD V1.0, there were also high percentages of the top 50 miRNAs verified by the experimental literature. © 2017 The Authors. Journal of Cellular and Molecular Medicine published by John Wiley & Sons Ltd and Foundation for Cellular and Molecular Medicine.
Edmonds, Lisa A; Donovan, Neila J
2014-06-01
Virtually no valid materials are available to evaluate confrontation naming in Spanish-English bilingual adults in the U.S. In a recent study, a large group of young Spanish-English bilingual adults were evaluated on An Object and Action Naming Battery (Edmonds & Donovan in Journal of Speech, Language, and Hearing Research 55:359-381, 2012). Rasch analyses of the responses resulted in evidence for the content and construct validity of the retained items. However, the scope of that study did not allow for extensive examination of individual item characteristics, group analyses of participants, or the provision of testing and scoring materials or raw data, thereby limiting the ability of researchers to administer the test to Spanish-English bilinguals and to score the items with confidence. In this study, we present the in-depth information described above on the basis of further analyses, including (1) online searchable spreadsheets with extensive empirical (e.g., accuracy and name agreeability) and psycholinguistic item statistics; (2) answer sheets and instructions for scoring and interpreting the responses to the Rasch items; (3) tables of alternative correct responses for English and Spanish; (4) ability strata determined for all naming conditions (English and Spanish nouns and verbs); and (5) comparisons of accuracy across proficiency groups (i.e., Spanish dominant, English dominant, and balanced). These data indicate that the Rasch items from An Object and Action Naming Battery are valid and sensitive for the evaluation of naming in young Spanish-English bilingual adults. Additional information based on participant responses for all of the items on the battery can provide researchers with valuable information to aid in stimulus development and response interpretation for experimental studies in this population.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Radulescu, Georgeta; Gauld, Ian C; Ilas, Germina
2011-01-01
The expanded use of burnup credit in the United States (U.S.) for storage and transport casks, particularly in the acceptance of credit for fission products, has been constrained by the availability of experimental fission product data to support code validation. The U.S. Nuclear Regulatory Commission (NRC) staff has noted that the rationale for restricting the Interim Staff Guidance on burnup credit for storage and transportation casks (ISG-8) to actinide-only is based largely on the lack of clear, definitive experiments that can be used to estimate the bias and uncertainty for computational analyses associated with using burnup credit. To address the issues of burnup credit criticality validation, the NRC initiated a project with the Oak Ridge National Laboratory to (1) develop and establish a technically sound validation approach for commercial spent nuclear fuel (SNF) criticality safety evaluations based on best-available data and methods and (2) apply the approach for representative SNF storage and transport configurations/conditions to demonstrate its usage and applicability, as well as to provide reference bias results. The purpose of this paper is to describe the isotopic composition (depletion) validation approach and resulting observations and recommendations. Validation of the criticality calculations is addressed in a companion paper at this conference. For isotopic composition validation, the approach is to determine burnup-dependent bias and uncertainty in the effective neutron multiplication factor (keff) due to bias and uncertainty in isotopic predictions, via comparisons of calculated isotopic compositions with measured isotopic compositions from destructive radiochemical assay, utilizing as much assay data as is available, and a best-estimate Monte Carlo based method. This paper (1) provides a detailed description of the burnup credit isotopic validation approach and its technical bases, (2) describes the application of the approach for representative pressurized water reactor and boiling water reactor safety analysis models to demonstrate its usage and applicability, and (3) provides reference bias and uncertainty results based on a quality-assurance-controlled prerelease version of the Scale 6.1 code package and the ENDF/B-VII nuclear cross section data.
COLORcation: A new application to phenotype exploratory behavior models of anxiety in mice.
Dagan, Shachar Y; Tsoory, Michael M; Fainzilber, Mike; Panayotis, Nicolas
2016-09-01
Behavioral analyses in rodents have successfully delineated the function of many genes and signaling pathways in the brain. Behavioral testing uses highly defined experimental conditions to identify abnormalities in a given mouse strain or genotype. The open field (OF) is widely used to assess both locomotion and anxiety in rodents. In this test, the more a mouse explores and spends time in the center of the arena, the less anxious it is considered to be. However, the simplistic distinction between center and border substantially reduces the information content of the analysis and may fail to detect biologically meaningful differences. Here we describe COLORcation, a new application for improved analyses of mouse behavior in the OF. The application analyses animal exploration patterns at detailed spatial resolution (e.g. 10×10 bins) to provide a color-encoded heat map of mouse activity. In addition, COLORcation provides new parameters to track activity and locomotion of the test animals. We demonstrate the use of COLORcation in different experimental paradigms, including pharmacological and restraint-based induction of stress and anxiety. COLORcation is compatible with multiple acquisition systems and takes as input organized text files containing the times and coordinates of animal locations, giving users the option to make the most of their raw data. These analyses validate the utility of the software and establish its reliability and potential as a new tool to analyze OF data. Copyright © 2016 Elsevier B.V. All rights reserved.
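The core of such a heat-map analysis is a two-dimensional occupancy histogram over the arena; a minimal sketch (arena size, bin count and color map are illustrative, not COLORcation's defaults):

```python
import numpy as np
import matplotlib.pyplot as plt

def of_heatmap(x, y, arena=40.0, bins=10):
    """Color-encoded occupancy map of open-field tracking data.

    x, y : arrays of animal coordinates (cm) sampled at a fixed rate,
    so bin counts are proportional to time spent per location."""
    h, _, _ = np.histogram2d(x, y, bins=bins, range=[[0, arena], [0, arena]])
    plt.imshow(h.T, origin="lower", extent=[0, arena, 0, arena], cmap="hot")
    plt.colorbar(label="samples per bin")
    plt.xlabel("x (cm)"); plt.ylabel("y (cm)")
    return h
```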
Sabeh, Michael; Duceppe, Marc-Olivier; St-Arnaud, Marc; Mimee, Benjamin
2018-01-01
Relative gene expression analyses by qRT-PCR (quantitative reverse transcription PCR) require an internal control to normalize the expression data of genes of interest and eliminate the unwanted variation introduced by sample preparation. A perfect reference gene should have a constant expression level under all the experimental conditions. However, the same few housekeeping genes selected from the literature or successfully used in previous unrelated experiments are often routinely used in new conditions without proper validation of their stability across treatments. The advent of RNA-Seq and the availability of public datasets for numerous organisms are opening the way to finding better reference genes for expression studies. Globodera rostochiensis is a plant-parasitic nematode that is particularly yield-limiting for potato. The aim of our study was to identify a reliable set of reference genes to study G. rostochiensis gene expression. Gene expression levels from an RNA-Seq database were used to identify putative reference genes and were validated with qRT-PCR analysis. Three genes, GR, PMP-3, and aaRS, were found to be very stable within the experimental conditions of this study and are proposed as reference genes for future work.
Shock tube and chemical kinetic modeling study of the oxidation of 2,5-dimethylfuran.
Sirjean, Baptiste; Fournet, René; Glaude, Pierre-Alexandre; Battin-Leclerc, Frédérique; Wang, Weijing; Oehlschlaeger, Matthew A
2013-02-21
A detailed kinetic model describing the oxidation of 2,5-dimethylfuran (DMF), a potential second-generation biofuel, is proposed. The kinetic model is based upon quantum chemical calculations for the initial DMF consumption reactions and important reactions of intermediates. The model is validated by comparison to new DMF shock tube ignition delay time measurements (over the temperature range 1300-1831 K and at nominal pressures of 1 and 4 bar) and the DMF pyrolysis speciation measurements of Lifshitz et al. [J. Phys. Chem. A 1998, 102(52), 10655-10670]. Globally, modeling predictions are in good agreement with the considered experimental targets. In particular, ignition delay times are predicted well by the new model, with model-experiment deviations of at most a factor of 2, and DMF pyrolysis conversion is predicted well, to within experimental scatter of the Lifshitz et al. data. Additionally, comparisons of measured and model predicted pyrolysis speciation provide validation of theoretically calculated channels for the oxidation of DMF. Sensitivity and reaction flux analyses highlight important reactions as well as the primary reaction pathways responsible for the decomposition of DMF and formation and destruction of key intermediate and product species.
Validating experimental and theoretical Langmuir probe analyses
NASA Astrophysics Data System (ADS)
Pilling, L. S.; Carnegie, D. A.
2007-08-01
Analysis of Langmuir probe characteristics contains a paradox in that it is unknown a priori which theory is applicable before it is applied. Often theories are assumed to be correct when certain criteria are met although they may not validate the approach used. We have analysed the Langmuir probe data from cylindrical double and single probes acquired from a dc discharge plasma over a wide variety of conditions. This discharge contains a dual-temperature distribution and hence fitting a theoretically generated curve is impractical. To determine the densities, an examination of the current theories was necessary. For the conditions where the probe radius is the same order of magnitude as the Debye length, the gradient expected for orbital-motion limited (OML) is approximately the same as the radial-motion gradients. An analysis of the 'gradients' from the radial-motion theory was able to resolve the differences from the OML gradient value of two. The method was also able to determine whether radial or OML theories applied without knowledge of the electron temperature, or separation of the ion and electron contributions. Only the value of the space potential is necessary to determine the applicable theory.
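The gradient test mentioned above can be sketched as a log-log fit over the ion branch of the probe characteristic; following the abstract's convention, the OML expectation on these axes is a gradient of two (ion current proportional to the square root of bias voltage):

```python
import numpy as np

def loglog_gradient(voltage, ion_current):
    """Least-squares gradient of ln(V) against ln(I_ion) over the ion
    saturation branch of a cylindrical-probe characteristic.  OML theory
    predicts I_ion ~ V**0.5, i.e. a gradient of 2 on these axes;
    radial-motion theories give nearby but distinct values."""
    lv = np.log(np.abs(np.asarray(voltage, float)))
    li = np.log(np.abs(np.asarray(ion_current, float)))
    return np.polyfit(li, lv, 1)[0]
```

Comparing the fitted gradient with the theoretical values is the kind of discrimination between OML and radial-motion regimes the abstract describes, without needing the electron temperature.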
NASA Astrophysics Data System (ADS)
Rubinato, Matteo; Martins, Ricardo; Kesserwani, Georges; Leandro, Jorge; Djordjević, Slobodan; Shucksmith, James
2017-09-01
The linkage between sewer pipe flow and floodplain flow is recognised to induce an important source of uncertainty within two-dimensional (2D) urban flood models. This uncertainty is often attributed to the use of empirical hydraulic formulae (the one-dimensional (1D) weir and orifice steady flow equations) to achieve data-connectivity at the linking interface, which require the determination of discharge coefficients. Because of the paucity of high resolution localised data for this type of flow, the current understanding and quantification of a suitable range for those discharge coefficients is somewhat lacking. To fulfil this gap, this work presents the results acquired from an instrumented physical model designed to study the interaction between a pipe network flow and a floodplain flow. The full range of sewer-to-surface and surface-to-sewer flow conditions at the exchange zone are experimentally analysed in both steady and unsteady flow regimes. Steady state measured discharges are first analysed considering the relationship between the energy heads from the sewer flow and the floodplain flow; these results show that existing weir and orifice formulae are valid for describing the flow exchange for the present physical model, and yield new calibrated discharge coefficients for each of the flow conditions. The measured exchange discharges are also integrated (as a source term) within a 2D numerical flood model (a finite volume solver to the 2D Shallow Water Equations (SWE)), which is shown to reproduce the observed coefficients. This calibrated numerical model is then used to simulate a series of unsteady flow tests reproduced within the experimental facility. Results show that the numerical model overestimated the mean surcharge flow rate. This suggests the occurrence of additional head losses in unsteady conditions which are not currently accounted for within flood models calibrated in steady flow conditions.
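The empirical linking formulae in question are the standard weir and orifice relations, with the discharge coefficients cw and co as the calibrated quantities; a sketch (SI units; the 2/3 factor of some weir conventions is absorbed into cw here):

```python
import numpy as np

G = 9.81  # gravitational acceleration, m/s^2

def weir_q(cw, width, head):
    """Free weir-type exchange over a crest of length `width` (m) at
    surcharge head `head` (m); cw is the calibrated weir coefficient."""
    return cw * width * np.sqrt(2.0 * G) * head ** 1.5

def orifice_q(co, area, dh):
    """Submerged orifice-type exchange through opening `area` (m^2),
    driven by the sewer-floodplain head difference dh (m); the sign of
    dh gives the flow direction."""
    return np.sign(dh) * co * area * np.sqrt(2.0 * G * abs(dh))
```

Embedding these discharges as a source term in the 2D shallow-water solver is exactly the coupling strategy the abstract evaluates.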
Threats to the Internal Validity of Experimental and Quasi-Experimental Research in Healthcare.
Flannelly, Kevin J; Flannelly, Laura T; Jankowski, Katherine R B
2018-01-01
The article defines, describes, and discusses the seven threats to the internal validity of experiments discussed by Donald T. Campbell in his classic 1957 article: history, maturation, testing, instrument decay, statistical regression, selection, and mortality. These concepts are said to be threats to the internal validity of experiments because they pose alternate explanations for the apparent causal relationship between the independent variable and dependent variable of an experiment if they are not adequately controlled. A series of simple diagrams illustrate three pre-experimental designs and three true experimental designs discussed by Campbell in 1957 and several quasi-experimental designs described in his book written with Julian C. Stanley in 1966. The current article explains why each design controls for or fails to control for these seven threats to internal validity.
Liebl, Hans; Garcia, Eduardo Grande; Holzner, Fabian; Noel, Peter B.; Burgkart, Rainer; Rummeny, Ernst J.; Baum, Thomas; Bauer, Jan S.
2015-01-01
Purpose: To experimentally validate a non-linear finite element analysis (FEA) modeling approach assessing in-vitro fracture risk at the proximal femur, and to transfer the method to standard in-vivo multi-detector computed tomography (MDCT) data of the hip, aiming to predict additional hip fracture risk in subjects with and without osteoporosis-associated vertebral fractures using bone mineral density (BMD) measurements as the gold standard. Methods: One fresh-frozen human femur specimen was mechanically tested and fractured simulating stance and clinically relevant fall loading configurations to the hip. After experimental in-vitro validation, the FEA simulation protocol was transferred to standard contrast-enhanced in-vivo MDCT images to calculate individual hip fracture risk for 4 subjects each with and without a history of osteoporotic vertebral fractures, matched by age and gender. In addition, FEA based risk factor calculations were compared to manual femoral BMD measurements of all subjects. Results: In-vitro simulations showed good correlation with the experimentally measured strains both in stance (R² = 0.963) and fall configuration (R² = 0.976). The simulated maximum stress criterion overestimated the experimental failure load (4743 N) by 14.7% (5440 N), while the simulated maximum strain criterion overestimated it by 4.7% (4968 N). The simulated failed elements coincided precisely with the experimentally determined fracture locations. BMD measurements in subjects with a history of osteoporotic vertebral fractures did not differ significantly from subjects without fragility fractures (femoral head: p = 0.989; femoral neck: p = 0.366), but showed higher FEA based risk factors for additional incident hip fractures (p = 0.028). Conclusion: FEA simulations were successfully validated by elastic and destructive in-vitro experiments. In the subsequent in-vivo analyses, MDCT based FEA risk factor differences for additional hip fractures were not mirrored by corresponding BMD measurements. Our data suggest that MDCT derived FEA models may assess bone strength more accurately than BMD measurements alone, providing a valuable in-vivo fracture risk assessment tool. PMID:25723187
Benchmarking comparison and validation of MCNP photon interaction data
NASA Astrophysics Data System (ADS)
Colling, Bethany; Kodeli, I.; Lilley, S.; Packer, L. W.
2017-09-01
The objective of the research was to test available photoatomic data libraries for fusion relevant applications, comparing against experimental and computational neutronics benchmarks. Photon flux and heating were compared using the photon interaction data libraries (mcplib 04p, 05t, 84p and 12p). Suitable benchmark experiments (iron and water) were selected from the SINBAD database and analysed to compare experimental values with MCNP calculations using mcplib 04p, 84p and 12p. In both the computational and experimental comparisons, the majority of results with the 04p, 84p and 12p photon data libraries were within 1σ of the mean MCNP statistical uncertainty. Larger differences were observed when comparing computational results with the 05t test photon library. The Doppler broadening sampling bug in MCNP-5 is shown to be corrected for fusion relevant problems through use of the 84p photon data library. The recommended libraries for fusion neutronics are 84p (or 04p) with MCNP6 and 84p if using MCNP-5.
A Comprehensive Validation Methodology for Sparse Experimental Data
NASA Technical Reports Server (NTRS)
Norman, Ryan B.; Blattnig, Steve R.
2010-01-01
A comprehensive program of verification and validation has been undertaken to assess the applicability of models to space radiation shielding applications and to track progress as models are developed over time. The models are placed under configuration control, and automated validation tests are used so that comparisons can readily be made as models are improved. Though direct comparisons between theoretical results and experimental data are desired for validation purposes, such comparisons are not always possible due to lack of data. In this work, two uncertainty metrics are introduced that are suitable for validating theoretical models against sparse experimental databases. The nuclear physics models, NUCFRG2 and QMSFRG, are compared to an experimental database consisting of over 3600 experimental cross sections to demonstrate the applicability of the metrics. A cumulative uncertainty metric is applied to the question of overall model accuracy, while a metric based on the median uncertainty is used to analyze the models from the perspective of model development by analyzing subsets of the model parameter space.
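The two metrics can be sketched generically as summary statistics of the relative model-experiment deviations over the database; the exact definitions used for NUCFRG2 and QMSFRG may differ in detail:

```python
import numpy as np

def validation_metrics(model, experiment):
    """Summary uncertainty metrics over a sparse experimental database:
    a cumulative (mean) relative deviation, sensitive to overall model
    accuracy, and the median relative deviation, which is robust to
    outliers and useful when comparing subsets of the parameter space
    during model development."""
    model = np.asarray(model, float)
    experiment = np.asarray(experiment, float)
    rel = np.abs(model - experiment) / np.abs(experiment)
    return rel.mean(), np.median(rel)
```

Tracking both statistics over successive model versions under configuration control is what allows progress to be measured as the models evolve.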
Experimental Characterization and Validation of Simultaneous Gust Alleviation and Energy Harvesting for Multifunctional Wing Spars
AFOSR
2012-08-01
[Report fragments: gust and clear-sky wind simulation using a Dryden PSD (U0 = 15 m/s, Lv = 350 m); an energy control law based on limited-energy constraints; experimentally validated simultaneous energy harvesting and vibration control.]
Study on bamboo gluing performance numerical simulation
NASA Astrophysics Data System (ADS)
Zhao, Z. R.; Sun, W. H.; Sui, X. M.; Zhang, X. F.
2018-01-01
Bamboo glued laminated timber is a green building material that can be widely used for beams and columns in modern buildings. Existing bamboo glulam is usually produced from bamboo culms, or from bamboo bundles rolled from such culms, and the performance of the new material is determined by the adhesion behavior of the bamboo. On this basis, a cohesive damage model of the bamboo glue line is created and validated against experimental results, with which the model agrees well. The relationship between bamboo bonding length and gluing performance is analysed. The model is helpful for applications of bamboo integrated timber.
Residualization is not the answer: Rethinking how to address multicollinearity.
York, Richard
2012-11-01
Here I show that a commonly used procedure to address problems stemming from collinearity and multicollinearity among independent variables in regression analysis, "residualization", leads to biased coefficient and standard error estimates and does not address the fundamental problem of collinearity, which is a lack of information. I demonstrate this using visual representations of collinearity, hypothetical experimental designs, and analyses of both artificial and real world data. I conclude by noting the importance of examining methodological practices to ensure that their validity can be established based on rational criteria. Copyright © 2012 Elsevier Inc. All rights reserved.
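The bias York describes can be reproduced in a few lines. The following Python sketch (with simulated data, not the paper's) regresses an outcome on two collinear predictors, once jointly and once after "residualizing" the second predictor, and shows the coefficient on the first predictor drifting away from its true value.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000
x1 = rng.normal(size=n)
x2 = 0.8 * x1 + 0.6 * rng.normal(size=n)      # collinear with x1
y  = 1.0 * x1 + 1.0 * x2 + rng.normal(size=n) # true coefficients are both 1

def ols(X, y):
    X = np.column_stack([np.ones(len(y)), *X])
    return np.linalg.lstsq(X, y, rcond=None)[0][1:]

# Joint regression recovers the true coefficients (up to noise).
print("joint OLS:", ols([x1, x2], y))

# "Residualization": replace x2 by its residual from a regression on x1,
# then regress y on x1 and that residual. The coefficient on x1 now absorbs
# x2's correlated component and is biased away from its structural value of 1.
b = np.polyfit(x1, x2, 1)
x2_resid = x2 - (b[0] * x1 + b[1])
print("residualized OLS:", ols([x1, x2_resid], y))
```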
Experimental study of hybrid interface cooling system using air ventilation and nanofluid
NASA Astrophysics Data System (ADS)
Rani, M. F. H.; Razlan, Z. M.; Bakar, S. A.; Desa, H.; Wan, W. K.; Ibrahim, I.; Kamarrudin, N. S.; Bin-Abdun, Nazih A.
2017-09-01
A hybrid interface cooling system needs to be established to chill the battery compartment of an electric car and maintain the ambient temperature inside the compartment between 25°C and 35°C. An air cooling experiment was conducted to verify the cooling capacity, compressor displacement volume, dehumidifying value and mass flow rate of the refrigerant (R-410A). At the same time, the liquid cooling system was analysed theoretically by comparing the performance of two types of nanofluid, i.e., CuO + water and Al2O3 + water, based on the heat load generated inside the compartment. For the results to be valid and reliable, several assumptions were made during the experimental and theoretical analyses. Results show that the efficiency of the hybrid interface cooling system is improved compared with either individual cooling system.
Simultaneous enumeration of cancer and immune cell types from bulk tumor gene expression data.
Racle, Julien; de Jonge, Kaat; Baumgaertner, Petra; Speiser, Daniel E; Gfeller, David
2017-11-13
Immune cells infiltrating tumors can have important impact on tumor progression and response to therapy. We present an efficient algorithm to simultaneously estimate the fraction of cancer and immune cell types from bulk tumor gene expression data. Our method integrates novel gene expression profiles from each major non-malignant cell type found in tumors, renormalization based on cell-type-specific mRNA content, and the ability to consider uncharacterized and possibly highly variable cell types. Feasibility is demonstrated by validation with flow cytometry, immunohistochemistry and single-cell RNA-Seq analyses of human melanoma and colorectal tumor specimens. Altogether, our work not only improves accuracy but also broadens the scope of absolute cell fraction predictions from tumor gene expression data, and provides a unique novel experimental benchmark for immunogenomics analyses in cancer research (http://epic.gfellerlab.org).
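The core deconvolution step can be illustrated with non-negative least squares. Note this is only a generic sketch with simulated reference profiles, not the EPIC implementation, which additionally renormalizes by cell-type mRNA content and models the uncharacterized component explicitly.

```python
import numpy as np
from scipy.optimize import nnls

# Hypothetical reference profiles R (genes x cell types) and a bulk sample b.
rng = np.random.default_rng(1)
R = rng.lognormal(size=(500, 4))          # 4 reference immune cell types
f_true = np.array([0.3, 0.1, 0.15, 0.05]) # true fractions (0.4 "unknown")
b = R @ f_true + rng.normal(scale=0.05, size=500)

f_hat, _ = nnls(R, b)                     # non-negative fractions
print("estimated:", f_hat.round(3))
print("uncharacterized remainder:", round(max(0.0, 1 - f_hat.sum()), 3))
```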
ODE constrained mixture modelling: a method for unraveling subpopulation structures and dynamics.
Hasenauer, Jan; Hasenauer, Christine; Hucho, Tim; Theis, Fabian J
2014-07-01
Functional cell-to-cell variability is ubiquitous in multicellular organisms as well as bacterial populations. Even genetically identical cells of the same cell type can respond differently to identical stimuli. Methods have been developed to analyse heterogeneous populations, e.g., mixture models and stochastic population models. The available methods are, however, either incapable of simultaneously analysing different experimental conditions or are computationally demanding and difficult to apply. Furthermore, they do not account for biological information available in the literature. To overcome disadvantages of existing methods, we combine mixture models and ordinary differential equation (ODE) models. The ODE models provide a mechanistic description of the underlying processes while mixture models provide an easy way to capture variability. In a simulation study, we show that the class of ODE constrained mixture models can unravel the subpopulation structure and determine the sources of cell-to-cell variability. In addition, the method provides reliable estimates for kinetic rates and subpopulation characteristics. We use ODE constrained mixture modelling to study NGF-induced Erk1/2 phosphorylation in primary sensory neurones, a process relevant in inflammatory and neuropathic pain. We propose a mechanistic pathway model for this process and reconstructed static and dynamical subpopulation characteristics across experimental conditions. We validate the model predictions experimentally, which verifies the capabilities of ODE constrained mixture models. These results illustrate that ODE constrained mixture models can reveal novel mechanistic insights and possess a high sensitivity.
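A stripped-down version of the idea, with a two-component mixture whose means follow an exponential-decay ODE, can be written in a few lines of Python; the model, rates, and noise level below are hypothetical and far simpler than the NGF-induced Erk1/2 pathway model used in the paper.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

# Minimal ODE-constrained mixture sketch (not the authors' implementation):
# two subpopulations decay as x(t) = exp(-k * t) with different rates k,
# and single-cell measurements at each time point are noisy around those means.
rng = np.random.default_rng(2)
t = np.repeat([1.0, 2.0, 4.0], 200)
k_true, w_true, sd = (0.3, 1.2), 0.6, 0.05
comp = rng.random(t.size) < w_true
x = np.exp(-np.where(comp, k_true[0], k_true[1]) * t) + rng.normal(0, sd, t.size)

def neg_loglik(p):
    k1, k2, w, s = p
    m1, m2 = np.exp(-k1 * t), np.exp(-k2 * t)     # ODE solutions as means
    like = w * norm.pdf(x, m1, s) + (1 - w) * norm.pdf(x, m2, s)
    return -np.log(like + 1e-300).sum()

fit = minimize(neg_loglik, x0=[0.2, 1.0, 0.5, 0.1],
               bounds=[(1e-3, 5), (1e-3, 5), (0.01, 0.99), (1e-3, 1)])
print(fit.x)  # recovered (k1, k2, w, sigma)
```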
Biswas, Kaushik; Abhari, Ramin
2014-10-03
A promising approach to increasing the energy efficiency of buildings is the implementation of a phase change material (PCM) in the building envelope. Numerous studies over the last two decades have reported the energy saving potential of PCMs in building envelopes, but their wide application has been inhibited, in part, by their high cost. This article describes a novel PCM made of naturally occurring fatty acids/glycerides trapped in high density polyethylene (HDPE) pellets and its performance in a building envelope application. The PCM-HDPE pellets were mixed with cellulose insulation and then added to an exterior wall of a test building in a hot and humid climate, and tested over a period of several months. To demonstrate the efficacy of the PCM-enhanced cellulose insulation in reducing the building envelope heat gains and losses, a side-by-side comparison was performed with another wall section filled with cellulose-only insulation. Further, numerical modeling of the test wall was performed to determine the actual impact of the PCM-HDPE pellets on wall-generated heating and cooling loads and the associated electricity consumption. The model was first validated using experimental data and then used for annual simulations with typical meteorological year (TMY3) weather data. This article presents the experimental data and numerical analyses showing the energy-saving potential of the new PCM.
Experimental Validation of Model Updating and Damage Detection via Eigenvalue Sensitivity Methods with Artificial Boundary Conditions
Bouwense, Matthew D.
2017-09-01
[DTIC title-page fragments; approved for public release, distribution unlimited.]
Broadband Transmission Loss Due to Reverberant Excitation
NASA Technical Reports Server (NTRS)
Barisciano, Lawrence P. Jr.
1999-01-01
The noise transmission characteristics of candidate curved aircraft sidewall panel constructions are examined analytically using finite element models of the selected panel geometries. The models are validated by experimental modal analyses and transmission loss testing. The structural and acoustic responses of the models are then examined when subjected to random or reverberant excitation, the simulation of which is also discussed. For a candidate curved honeycomb panel, the effect of add-on trim panel treatments is examined. Specifically, two different mounting configurations are discussed and their effect on the transmission loss of the panel is presented. This study finds that the add-on acoustical treatments do improve on the primary structure's transmission loss characteristics; however, much more research is necessary to draw valid conclusions about the optimal configuration for maximum noise transmission loss. This paper describes several directions for the extension of this work.
Glauser, Gaétan; Grund, Baptiste; Gassner, Anne-Laure; Menin, Laure; Henry, Hugues; Bromirski, Maciej; Schütz, Frédéric; McMullen, Justin; Rochat, Bertrand
2016-03-15
A paradigm shift is underway in the field of quantitative liquid chromatography-mass spectrometry (LC-MS) analysis thanks to the arrival of recent high-resolution mass spectrometers (HRMS). The capability of HRMS to perform sensitive and reliable quantifications of a large variety of analytes in HR-full scan mode is showing that it is now realistic to perform quantitative and qualitative analysis with the same instrument. Moreover, HR-full scan acquisition offers a global view of sample extracts and allows retrospective investigations as virtually all ionized compounds are detected with a high sensitivity. In time, the versatility of HRMS together with the increasing need for relative quantification of hundreds of endogenous metabolites should promote a shift from triple-quadrupole MS to HRMS. However, a current "pitfall" in quantitative LC-HRMS analysis is the lack of HRMS-specific guidance for validated quantitative analyses. Indeed, false positive and false negative HRMS detections are rare, albeit possible, if inadequate parameters are used. Here, we investigated two key parameters for the validation of LC-HRMS quantitative analyses: the mass accuracy (MA) and the mass-extraction-window (MEW) that is used to construct the extracted-ion-chromatograms. We propose MA-parameters, graphs, and equations to calculate rational MEW width for the validation of quantitative LC-HRMS methods. MA measurements were performed on four different LC-HRMS platforms. Experimentally determined MEW values ranged between 5.6 and 16.5 ppm and depended on the HRMS platform, its working environment, the calibration procedure, and the analyte considered. The proposed procedure provides a fit-for-purpose MEW determination and prevents false detections.
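For example, converting a chosen MEW width in ppm into m/z bounds for constructing an extracted-ion chromatogram is straightforward; the sketch below assumes the MEW is expressed as a full window width, and the analyte mass is hypothetical.

```python
def mew_bounds(mz, mew_ppm):
    """m/z window for extracted-ion chromatograms given a mass-extraction
    window (MEW) in ppm. The paper's procedure derives the MEW width from
    the measured mass accuracy of each platform; here it is simply an input,
    interpreted as the full window width centered on the target."""
    half_width = mz * mew_ppm / 2 / 1e6   # ppm -> Da, split around the target
    return mz - half_width, mz + half_width

# Example: a 10 ppm MEW around m/z 524.2648 (hypothetical analyte).
print(mew_bounds(524.2648, 10.0))
```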
Validation and Qualification Sciences Experimental Complex (VQSEC)
[Web-page fragment describing the Validation and Qualification Sciences Experimental Complex (VQSEC) at Sandia, including complex water impact testing.]
Hypersonic Experimental and Computational Capability, Improvement and Validation. Volume 2
NASA Technical Reports Server (NTRS)
Muylaert, Jean (Editor); Kumar, Ajay (Editor); Dujarric, Christian (Editor)
1998-01-01
The results of the phase 2 effort conducted under AGARD Working Group 18 on Hypersonic Experimental and Computational Capability, Improvement and Validation are presented in this report. The first volume, published in May 1996, mainly focused on the design methodology, plans and some initial results of experiments that had been conducted to serve as validation benchmarks. The current volume presents the detailed experimental and computational data base developed during this effort.
Experimental Design and Some Threats to Experimental Validity: A Primer
ERIC Educational Resources Information Center
Skidmore, Susan
2008-01-01
Experimental designs are distinguished as the best method to respond to questions involving causality. The purpose of the present paper is to explicate the logic of experimental design and why it is so vital to questions that demand causal conclusions. In addition, types of internal and external validity threats are discussed. To emphasize the…
Hybrid, experimental and computational, investigation of mechanical components
NASA Astrophysics Data System (ADS)
Furlong, Cosme; Pryputniewicz, Ryszard J.
1996-07-01
Computational and experimental methodologies have unique features for the analysis and solution of a wide variety of engineering problems. Computations provide results that depend on selection of input parameters such as geometry, material constants, and boundary conditions which, for correct modeling purposes, have to be appropriately chosen. In addition, it is relatively easy to modify the input parameters in order to computationally investigate different conditions. Experiments provide solutions which characterize the actual behavior of the object of interest subjected to specific operating conditions. However, it is impractical to experimentally perform parametric investigations. This paper discusses the use of a hybrid, computational and experimental, approach for study and optimization of mechanical components. Computational techniques are used for modeling the behavior of the object of interest while it is experimentally tested using noninvasive optical techniques. Comparisons are performed through a fringe predictor program used to facilitate the correlation between both techniques. In addition, experimentally obtained quantitative information, such as displacements and shape, can be applied in the computational model in order to improve this correlation. The result is a validated computational model that can be used for performing quantitative analyses and structural optimization. Practical application of the hybrid approach is illustrated with a representative example which demonstrates the viability of the approach as an engineering tool for structural analysis and optimization.
Gleadall, Andrew; Pan, Jingzhe; Kruft, Marc-Anton; Kellomäki, Minna
2014-05-01
This paper presents an understanding of how initial molecular weight and initial monomer fraction affect the degradation of bioresorbable polymers in terms of the underlying hydrolysis mechanisms. A mathematical model was used to analyse the effects of initial molecular weight for various hydrolysis mechanisms including noncatalytic random scission, autocatalytic random scission, noncatalytic end scission or autocatalytic end scission. Different behaviours were identified to relate initial molecular weight to the molecular weight half-life and to the time until the onset of mass loss. The behaviours were validated by fitting the model to experimental data for molecular weight reduction and mass loss of samples with different initial molecular weights. Several publications that consider initial molecular weight were reviewed. The effect of residual monomer on degradation was also analysed, and shown to accelerate the reduction of molecular weight and mass loss. An inverse square root law relationship was found between molecular weight half-life and initial monomer fraction for autocatalytic hydrolysis. The relationship was tested by fitting the model to experimental data with various residual monomer contents. Copyright © 2014 Acta Materialia Inc. Published by Elsevier Ltd. All rights reserved.
Code of Federal Regulations, 2011 CFR
2011-07-01
... the validation study and subsequent use during decontamination. 761.386 Section 761.386 Protection of... (PCBs) MANUFACTURING, PROCESSING, DISTRIBUTION IN COMMERCE, AND USE PROHIBITIONS Comparison Study for... experimental conditions for the validation study and subsequent use during decontamination. The following...
Code of Federal Regulations, 2010 CFR
2010-07-01
... the validation study and subsequent use during decontamination. 761.386 Section 761.386 Protection of... (PCBs) MANUFACTURING, PROCESSING, DISTRIBUTION IN COMMERCE, AND USE PROHIBITIONS Comparison Study for... experimental conditions for the validation study and subsequent use during decontamination. The following...
Methodological convergence of program evaluation designs.
Chacón-Moscoso, Salvador; Anguera, M Teresa; Sanduvete-Chaves, Susana; Sánchez-Martín, Milagrosa
2014-01-01
Nowadays, the confronting dichotomous view between experimental/quasi-experimental and non-experimental/ethnographic studies still exists but, despite the extensive use of non-experimental/ethnographic studies, the most systematic work on methodological quality has been developed based on experimental and quasi-experimental studies. This hinders evaluators and planners' practice of empirical program evaluation, a sphere in which the distinction between types of study is changing continually and is less clear. Based on the classical validity framework of experimental/quasi-experimental studies, we carry out a review of the literature in order to analyze the convergence of design elements in methodological quality in primary studies in systematic reviews and ethnographic research. We specify the relevant design elements that should be taken into account in order to improve validity and generalization in program evaluation practice in different methodologies from a practical methodological and complementary view. We recommend ways to improve design elements so as to enhance validity and generalization in program evaluation practice.
Experimental Validation Techniques for the HELEEOS Off-Axis Laser Propagation Model
2010-03-01
Haiducek, John, 1st Lt, USAF
AFIT/GAP/ENP/10-M07, Air Force Institute of Technology. Approved for public release; distribution unlimited. [DTIC title-page fragments; the abstract is truncated: "The High Energy Laser End-to-End..."]
Experimental characterization of an adaptive aileron: lab tests and FE correlation
NASA Astrophysics Data System (ADS)
Amendola, Gianluca; Dimino, Ignazio; Amoroso, Francesco; Pecora, Rosario
2016-04-01
Like any other technology, morphing has to demonstrate system-level performance benefits prior to implementation on a real aircraft. Current morphing-structures research efforts (such as those sponsored by the European Union) involve the design of several subsystems that have to be tested individually in order to consolidate their performance in view of the final integration into a flyable device. This requires a fundamental understanding of the interaction between aerodynamics, structure and control systems. Major international research collaborations have been established to exchange acquired experience and to investigate innovative technologies for morphing structures. The "Adaptive Aileron" project is a joint cooperation between Canadian and Italian research centers and leading industries. In this framework, an overview of the design, manufacturing and testing of a variable-camber aileron for a regional aircraft is presented. The key enabling technology for the presented morphing aileron is the actuation structural system, integrating a suitable motor and a load-bearing architecture. The paper describes the lab test campaign of the developed device. The implementation of a distributed actuation system follows the current trend in aeronautical research toward the use of electrical power for non-propulsive systems. The aileron design features are validated by targeted experimental tests, demonstrating its adaptive capability, its robustness under operative loads and its dynamic behavior for further aeroelastic analyses. The experimental results show a satisfactory correlation with the numerical predictions, thus validating the design approach followed.
Monitoring Building Deformation with InSAR: Experiments and Validation.
Yang, Kui; Yan, Li; Huang, Guoman; Chen, Chu; Wu, Zhengpeng
2016-12-20
Synthetic Aperture Radar Interferometry (InSAR) techniques are increasingly applied for monitoring land subsidence. The advantages of InSAR include high accuracy and the ability to cover large areas; nevertheless, research validating the use of InSAR for building deformation is limited. In this paper, we test the monitoring capability of InSAR in experiments on two landmark buildings, the Bohai Building and the China Theater, located in Tianjin, China. They were selected as real examples for comparing InSAR and leveling approaches to building deformation. Ten TerraSAR-X images spanning half a year were used in Permanent Scatterer InSAR processing. The extracted InSAR results were processed considering the diversity in both direction and spatial distribution, and were compared with true leveling values in both Ordinary Least Squares (OLS) regression and measurement-of-error analyses. The detailed experimental results for the Bohai Building and the China Theater showed a high correlation between InSAR results and leveling values. At the same time, the two Root Mean Square Error (RMSE) indexes had values of approximately 1 mm. These analyses show that millimeter-level accuracy can be achieved with the InSAR technique when measuring building deformation. We discuss the differences in accuracy between the OLS regression and measurement-of-error analyses, and compare them with the accuracy index of leveling in order to propose InSAR accuracy levels appropriate for monitoring building deformation. After assessing the advantages and limitations of InSAR techniques in monitoring buildings, further applications are evaluated.
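The two accuracy measures used here, OLS regression of InSAR against leveling and the RMSE of their differences, are easy to state concretely; the sketch below uses made-up paired deformation values, not the Bohai Building or China Theater data.

```python
import numpy as np

# Hypothetical paired deformation measurements (mm) at common points:
# InSAR-derived values vs. leveling "truth". The paper's data are not shown.
insar    = np.array([-3.1, -1.8, 0.4, 2.2, 4.9, 6.8])
leveling = np.array([-3.0, -2.1, 0.2, 2.5, 5.1, 6.5])

# Ordinary least-squares regression of InSAR on leveling.
slope, intercept = np.polyfit(leveling, insar, 1)
pred = slope * leveling + intercept
r2 = 1 - ((insar - pred)**2).sum() / ((insar - insar.mean())**2).sum()

# Root-mean-square error of the direct differences.
rmse = np.sqrt(((insar - leveling)**2).mean())
print(f"slope={slope:.2f}, R^2={r2:.3f}, RMSE={rmse:.2f} mm")
```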
Integrating cell biology and proteomic approaches in plants.
Takáč, Tomáš; Šamajová, Olga; Šamaj, Jozef
2017-10-03
Significant improvements in protein extraction, separation, mass spectrometry and bioinformatics have nurtured advancements of proteomics during the past years. The usefulness of proteomics in the investigation of biological problems can be enhanced by integration with other experimental methods from cell biology, genetics, biochemistry, pharmacology, molecular biology and other omics approaches including transcriptomics and metabolomics. This review aims to summarize current trends integrating cell biology and proteomics in plant science. Cell biology approaches are most frequently used in proteomic studies investigating subcellular and developmental proteomes; however, they have also been employed in proteomic studies exploring abiotic and biotic stress responses, vesicular transport, the cytoskeleton and protein posttranslational modifications. They are used either for detailed cellular or ultrastructural characterization of the object subjected to proteomic study, for validation of proteomic results, or to expand proteomic data. In this respect, a broad spectrum of methods is employed to support proteomic studies, including ultrastructural electron microscopy studies, histochemical staining, immunochemical localization, in vivo imaging of fluorescently tagged proteins and visualization of protein-protein interactions. Thus, cell biological observations on fixed or living cell compartments, cells, tissues and organs are feasible, and in some cases fundamental, for the validation and complementation of proteomic data. Validation of proteomic data by independent experimental methods requires the development of new complementary approaches. The benefits of cell biology methods and techniques are not sufficiently highlighted in current proteomic studies. This encouraged us to review the most popular cell biology methods used in proteomic studies and to evaluate their relevance and potential for proteomic data validation and enrichment of purely proteomic analyses. We also provide examples of representative studies combining proteomic and cell biology methods for various purposes. Integrating cell biology approaches with proteomic ones allows validation and better interpretation of proteomic data. Moreover, cell biology methods remarkably extend the knowledge provided by proteomic studies and might be fundamental for the functional complementation of proteomic data. This review article summarizes the current literature linking proteomics with cell biology. Copyright © 2017 Elsevier B.V. All rights reserved.
Schütte, Judith; Wang, Huange; Antoniou, Stella; Jarratt, Andrew; Wilson, Nicola K; Riepsaame, Joey; Calero-Nieto, Fernando J; Moignard, Victoria; Basilico, Silvia; Kinston, Sarah J; Hannah, Rebecca L; Chan, Mun Chiang; Nürnberg, Sylvia T; Ouwehand, Willem H; Bonzanni, Nicola; de Bruijn, Marella FTR; Göttgens, Berthold
2016-01-01
Transcription factor (TF) networks determine cell-type identity by establishing and maintaining lineage-specific expression profiles, yet reconstruction of mammalian regulatory network models has been hampered by a lack of comprehensive functional validation of regulatory interactions. Here, we report comprehensive ChIP-Seq, transgenic and reporter gene experimental data that have allowed us to construct an experimentally validated regulatory network model for haematopoietic stem/progenitor cells (HSPCs). Model simulation coupled with subsequent experimental validation using single cell expression profiling revealed potential mechanisms for cell state stabilisation, and also how a leukaemogenic TF fusion protein perturbs key HSPC regulators. The approach presented here should help to improve our understanding of both normal physiological and disease processes. DOI: http://dx.doi.org/10.7554/eLife.11469.001 PMID:26901438
Taxonomic and systematic revisions to the North American Nimravidae (Mammalia, Carnivora)
2016-01-01
The Nimravidae is a family of extinct carnivores commonly referred to as "false saber-tooth cats." Since their initial discovery, they have prompted difficulty in taxonomic assignment and in establishing the number of valid species. Past revisions have examined only a handful of genera, while recent advances in cladistic and morphometric analyses have opened additional avenues for resolving questions about valid nimravid taxa and their phylogenetic relationships. To resolve issues of specific validity, the phylogenetic species concept (PSC) was utilized to maintain consistency in diagnosing valid species, while character and linear morphometric analyses were simultaneously employed to confirm the validity of taxa. The species determined to be valid and the taxonomically informative characters were then employed in two differential cladistic analyses to create competing hypotheses of interspecific relationships. The results suggest the validity of twelve species and six monophyletic genera. The first in-depth reviews of Pogonodon and Dinictis returned two valid species (P. platycopis, P. davisi) for the former, and only one for the latter (D. felina). The taxonomic validity of Nanosmilus is upheld. Two main clades with substantial support were returned in all cladistic analyses, the Hoplophoneini and Nimravini, with ambiguous positions relative to these main clades for the European taxa Eofelis, Dinailurictis bonali, and Quercylurus major, and the North American taxa Dinictis and Pogonodon. Eusmilus is determined to be a non-valid genus for North American taxa, suggesting non-validity for the Old World nimravid species as well. Finally, Hoplophoneus mentalis is found to be a junior synonym of Hoplophoneus primaevus, while the validity of Hoplophoneus oharrai is reinstated. PMID:26893959
van Gestel, Aukje; Severens, Johan L; Webers, Carroll A B; Beckers, Henny J M; Jansonius, Nomdo M; Schouten, Jan S A G
2010-01-01
Discrete event simulation (DES) modeling has several advantages over simpler modeling techniques in health economics, such as increased flexibility and the ability to model complex systems. Nevertheless, these benefits may come at the cost of reduced transparency, which may compromise the model's face validity and credibility. We aimed to produce a transparent report on the construction and validation of a DES model using a recently developed model of ocular hypertension and glaucoma. Current evidence of associations between prognostic factors and disease progression in ocular hypertension and glaucoma was translated into DES model elements. The model was extended to simulate treatment decisions and effects. Utility and costs were linked to disease status and treatment, and clinical and health economic outcomes were defined. The model was validated at several levels. The soundness of design and the plausibility of input estimates were evaluated in interdisciplinary meetings (face validity). Individual patients were traced throughout the simulation under a multitude of model settings to debug the model, and the model was run with a variety of extreme scenarios to compare the outcomes with prior expectations (internal validity). Finally, several intermediate (clinical) outcomes of the model were compared with those observed in experimental or observational studies (external validity), and the feasibility of evaluating hypothetical treatment strategies was tested. The model performed well in all validity tests. Analyses of hypothetical treatment strategies took about 30 minutes per cohort and led to plausible health-economic outcomes. There is added value in DES models for complex treatment strategies such as those in glaucoma. Achieving transparency in model structure and outcomes may require some effort in reporting and validating the model, but it is feasible.
Experimental Replication of an Aeroengine Combustion Instability
NASA Technical Reports Server (NTRS)
Cohen, J. M.; Hibshman, J. R.; Proscia, W.; Rosfjord, T. J.; Wake, B. E.; McVey, J. B.; Lovett, J.; Ondas, M.; DeLaat, J.; Breisacher, K.
2000-01-01
Combustion instabilities in gas turbine engines are most frequently encountered during the late phases of engine development, at which point they are difficult and expensive to fix. The ability to replicate an engine-traceable combustion instability in a laboratory-scale experiment offers the opportunity to economically diagnose the problem (to determine the root cause), and to investigate solutions to the problem, such as active control. The development and validation of active combustion instability control requires that the causal dynamic processes be reproduced in experimental test facilities which can be used as a test bed for control system evaluation. This paper discusses the process through which a laboratory-scale experiment was designed to replicate an instability observed in a developmental engine. The scaling process used physically-based analyses to preserve the relevant geometric, acoustic and thermo-fluid features. The process increases the probability that results achieved in the single-nozzle experiment will be scalable to the engine.
Design and analysis issues in quantitative proteomics studies.
Karp, Natasha A; Lilley, Kathryn S
2007-09-01
Quantitative proteomics is the comparison of distinct proteomes which enables the identification of protein species which exhibit changes in expression or post-translational state in response to a given stimulus. Many different quantitative techniques are being utilized and generate large datasets. Independent of the technique used, these large datasets need robust data analysis to ensure valid conclusions are drawn from such studies. Approaches to address the problems that arise with large datasets are discussed to give insight into the types of statistical analyses of data appropriate for the various experimental strategies that can be employed by quantitative proteomic studies. This review also highlights the importance of employing a robust experimental design and highlights various issues surrounding the design of experiments. The concepts and examples discussed within will show how robust design and analysis will lead to confident results that will ensure quantitative proteomics delivers.
Simple model of a photoacoustic system as a CR circuit
NASA Astrophysics Data System (ADS)
Fukuhara, Akiko; Kaneko, Fumitoshi; Ogawa, Naohisa
2012-05-01
We introduce the photoacoustic educational system (PAES), with which one can identify, in a classroom, which gases cause the greenhouse effect (Kaneko et al 2010 J. Chem. Educ. 87 202-4). PAES is an experimental system in which a pulse of infrared (IR) radiation absorbed by a gas raises its internal energy, producing a pressure oscillation (sound), so that the IR absorptance can be measured from the strength of the sound. In this paper, we construct a simple mathematical model of PAES that is equivalent to a CR circuit. The energy absorption of an IR pulse by the gas corresponds to the charging of a capacitor, and the heat diffusion to the outside corresponds to energy dissipation through an electrical resistance. We analyse the experimental results using this simple model and check its validity. Although the model is simple, it explains the phenomena occurring in PAES and can be a good educational resource.
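The analogy can be stated compactly (a sketch consistent with the description above; the paper's notation may differ): the absorbed IR power plays the role of the charging current, the heat capacity of the gas that of the capacitance, and heat leakage to the cell walls that of the resistance, giving a first-order response

```latex
C_{th}\,\frac{d\,\Delta T}{dt} \;=\; P_{abs}(t) \;-\; \frac{\Delta T}{R_{th}}
\quad\Longrightarrow\quad
\Delta T(t) \;\propto\; 1 - e^{-t/R_{th}C_{th}} \ \text{(during the pulse)},
\qquad
\Delta T(t) \;\propto\; e^{-t/R_{th}C_{th}} \ \text{(after the pulse)}
```

Since the pressure oscillation tracks the temperature rise, the measured sound amplitude saturates with the time constant R_th C_th, exactly as the voltage across the capacitor in a CR circuit.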
Design and Testing of CO 2 Compression Using Supersonic Shock Wave Technology
DOE Office of Scientific and Technical Information (OSTI.GOV)
Koopman, Aaron
This report summarizes work performed by Ramgen and subcontractors in pursuit of the design and construction of a 10 MW supersonic CO2 compressor and supporting facility. The compressor will demonstrate application of Ramgen's supersonic compression technology at an industrial scale using CO2 in a closed loop. The report includes details of early feasibility studies, CFD validation and comparison to experimental data, static test experimental results, compressor and facility design and analyses, and development of aerodynamic tools. A summary of Ramgen's ISC Engine program activity is also included. This program will demonstrate the adaptation of Ramgen's supersonic compression and advanced vortex combustion technology to result in a highly efficient and cost effective alternative to traditional gas turbine engines. The build-out of a 1.5 MW test facility to support the engine and associated subcomponent test program is summarized.
Recent advances in ChIP-seq analysis: from quality management to whole-genome annotation.
Nakato, Ryuichiro; Shirahige, Katsuhiko
2017-03-01
Chromatin immunoprecipitation followed by sequencing (ChIP-seq) analysis can detect protein/DNA-binding and histone-modification sites across an entire genome. Recent advances in sequencing technologies and analyses enable us to compare hundreds of samples simultaneously; such large-scale analysis has the potential to reveal high-dimensional interrelationships among regulatory elements and to annotate novel functional genomic regions de novo. Because many experimental considerations are relevant to the choice of a method in a ChIP-seq analysis, the overall design and quality management of the experiment are of critical importance. This review offers guiding principles of computation and sample preparation for ChIP-seq analyses, highlighting the validity and limitations of the state-of-the-art procedures at each step. We also discuss the latest challenges of single-cell analysis that will encourage a new era in this field. © The Author 2016. Published by Oxford University Press.
NASA Astrophysics Data System (ADS)
Grujicic, M.; Arakere, G.; Hariharan, A.; Pandurangan, B.
2012-06-01
The introduction of newer joining technologies, such as friction-stir welding (FSW), into automotive engineering requires knowledge of the joint-material microstructure and properties. Since the development of vehicles (including military vehicles capable of surviving blast and ballistic impacts) nowadays involves extensive use of computational engineering analyses (CEA), robust high-fidelity material models are needed for FSW joints. A two-level material-homogenization procedure is proposed and utilized in this study to help manage the computational cost and computer storage requirements of such CEAs. The method utilizes experimental (microstructure, microhardness, tensile testing, and x-ray diffraction) data to construct: (a) a material model for each weld zone and (b) a material model for the entire weld. The procedure is validated by comparing its predictions with those of more detailed but more costly computational analyses.
Kemege, Kyle E.; Hickey, John M.; Lovell, Scott; Battaile, Kevin P.; Zhang, Yang; Hefty, P. Scott
2011-01-01
Chlamydia trachomatis is a medically important pathogen that encodes a relatively high percentage of proteins with unknown function. The three-dimensional structure of a protein can be very informative regarding the protein's functional characteristics; however, determining protein structures experimentally can be very challenging. Computational methods that model protein structures with sufficient accuracy to facilitate functional studies have had notable successes. To evaluate the accuracy and potential impact of computational protein structure modeling of hypothetical proteins encoded by Chlamydia, a successful computational method termed I-TASSER was utilized to model the three-dimensional structure of a hypothetical protein encoded by open reading frame (ORF) CT296. CT296 has been reported to exhibit functional properties of a divalent cation transcription repressor (DcrA), with similarity to the Escherichia coli iron-responsive transcriptional repressor, Fur. Unexpectedly, the I-TASSER model of CT296 exhibited no structural similarity to any DNA-interacting proteins or motifs. To validate the I-TASSER-generated model, the structure of CT296 was solved experimentally using X-ray crystallography. Impressively, the ab initio I-TASSER-generated model closely matched (2.72-Å Cα root mean square deviation [RMSD]) the high-resolution (1.8-Å) crystal structure of CT296. Modeled and experimentally determined structures of CT296 share structural characteristics of non-heme Fe(II) 2-oxoglutarate-dependent enzymes, although key enzymatic residues are not conserved, suggesting a unique biochemical process is likely associated with CT296 function. Additionally, functional analyses did not support prior reports that CT296 has properties shared with divalent cation repressors such as Fur. PMID:21965559
Hooijmans, Carlijn R; Tillema, Alice; Leenaars, Marlies; Ritskes-Hoitinga, Merel
2010-01-01
Collecting and analysing all available literature before starting an animal experiment is important and it is indispensable when writing a systematic review (SR) of animal research. Writing such review prevents unnecessary duplication of animal studies and thus unnecessary animal use (Reduction). One of the factors currently impeding the production of ‘high-quality’ SRs in laboratory animal science is the fact that searching for all available literature concerning animal experimentation is rather difficult. In order to diminish these difficulties, we developed a search filter for PubMed to detect all publications concerning animal studies. This filter was compared with the method most frequently used, the PubMed Limit: Animals, and validated further by performing two PubMed topic searches. Our filter performs much better than the PubMed limit: it retrieves, on average, 7% more records. Other important advantages of our filter are that it also finds the most recent records and that it is easy to use. All in all, by using our search filter in PubMed, all available literature concerning animal studies on a specific topic can easily be found and assessed, which will help in increasing the scientific quality and thereby the ethical validity of animal experiments. PMID:20551243
Fuzzy-PI-based centralised control of semi-isolated FP-SEPIC/ZETA BDC in a PV/battery hybrid system
NASA Astrophysics Data System (ADS)
Mahendran, Venmathi; Ramabadran, Ramaprabha
2016-11-01
Multiport converters with a centralised controller have been most commonly used in stand-alone photovoltaic (PV)/battery hybrid systems to supply the load smoothly without disturbances. This study presents a performance analysis of a four-port SEPIC/ZETA bidirectional converter (FP-SEPIC/ZETA BDC) using various types of centralised control schemes: a fuzzy-tuned proportional integral (Fuzzy-PI) controller, a fuzzy logic controller (FLC) and a conventional proportional integral (PI) controller. The proposed FP-SEPIC/ZETA BDC with each control strategy is derived for the simultaneous power management of a PV source using a distributed maximum power point tracking (DMPPT) algorithm, a rechargeable battery, and a load by means of a centralised controller. The steady-state and dynamic responses of the FP-SEPIC/ZETA BDC are analysed for the three controllers under line and load regulation. The Fuzzy-PI control scheme improves the dynamic response of the system compared with the FLC and the conventional PI controller. Power balance between the ports is achieved by a pseudorandom carrier modulation scheme. The response of the FP-SEPIC/ZETA BDC is also validated experimentally using a 500 W hardware prototype. The effectiveness of the control strategy is confirmed by both simulation and experimental results.
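As a rough illustration of a fuzzy-tuned PI loop (the membership shapes, rule base, gains, and plant below are placeholders, not the converter model in the paper):

```python
def fuzzy_gain(e, de):
    """Toy fuzzy tuner: scale the PI gains up when |error| is large and
    down when the error is changing quickly. Placeholder rule base."""
    big_error = min(1.0, abs(e) / 10.0)        # "error is big" membership
    fast_change = min(1.0, abs(de) / 5.0)      # "error changes fast" membership
    return 1.0 + 0.5 * big_error - 0.2 * fast_change

def fuzzy_pi_step(e, e_prev, integ, dt, kp=0.8, ki=2.0):
    g = fuzzy_gain(e, (e - e_prev) / dt)
    integ += e * dt
    u = g * kp * e + g * ki * integ            # fuzzy-scaled PI law
    return u, integ

# Example: regulate a first-order stand-in plant x' = (-x + u)/tau to 48 V.
x, integ, e_prev, dt = 0.0, 0.0, 0.0, 1e-3
for _ in range(5000):
    e = 48.0 - x
    u, integ = fuzzy_pi_step(e, e_prev, integ, dt)
    x += dt * (-x + u) / 0.05
    e_prev = e
print(round(x, 2))  # approaches the 48 V setpoint
```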
ATtRACT-a database of RNA-binding proteins and associated motifs.
Giudice, Girolamo; Sánchez-Cabo, Fátima; Torroja, Carlos; Lara-Pezzi, Enrique
2016-01-01
RNA-binding proteins (RBPs) play a crucial role in key cellular processes, including RNA transport, splicing, polyadenylation and stability. Understanding the interaction between RBPs and RNA is key to improving our knowledge of RNA processing, localization and regulation in a global manner. Despite advances in recent years, a unified non-redundant resource that includes information on experimentally validated motifs, RBPs and integrated tools to exploit this information has been lacking. Here, we developed a database named ATtRACT (available at http://attract.cnic.es) that compiles information on 370 RBPs and 1583 RBP consensus binding motifs, 192 of which are not present in any other database. To populate ATtRACT we (i) extracted and hand-curated experimentally validated data from the CISBP-RNA, SpliceAid-F and RBPDB databases, (ii) integrated and updated the unavailable ASD database and (iii) extracted information from protein-RNA complexes present in the Protein Data Bank through computational analyses. ATtRACT also provides efficient algorithms to search for a specific motif and scan one or more RNA sequences at a time. It also allows discovering de novo motifs enriched in a set of related sequences and comparing them with the motifs included in the database. Database URL: http://attract.cnic.es. © The Author(s) 2016. Published by Oxford University Press.
Quantification of DNA cleavage specificity in Hi-C experiments.
Meluzzi, Dario; Arya, Gaurav
2016-01-08
Hi-C experiments produce large numbers of DNA sequence read pairs that are typically analyzed to deduce genomewide interactions between arbitrary loci. A key step in these experiments is the cleavage of cross-linked chromatin with a restriction endonuclease. Although this cleavage should happen specifically at the enzyme's recognition sequence, an unknown proportion of cleavage events may involve other sequences, owing to the enzyme's star activity or to random DNA breakage. A quantitative estimation of these non-specific cleavages may enable simulating realistic Hi-C read pairs for validation of downstream analyses, monitoring the reproducibility of experimental conditions and investigating biophysical properties that correlate with DNA cleavage patterns. Here we describe a computational method for analyzing Hi-C read pairs to estimate the fractions of cleavages at different possible targets. The method relies on expressing an observed local target distribution downstream of aligned reads as a linear combination of known conditional local target distributions. We validated this method using Hi-C read pairs obtained by computer simulation. Application of the method to experimental Hi-C datasets from murine cells revealed interesting similarities and differences in patterns of cleavage across the various experiments considered. © The Author(s) 2015. Published by Oxford University Press on behalf of Nucleic Acids Research.
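The linear-combination idea can be made concrete with non-negative least squares; the distance-bin distributions below are invented for illustration and are not derived from real Hi-C data.

```python
import numpy as np
from scipy.optimize import nnls

# Sketch of the estimation idea: the observed distribution of distances to the
# nearest downstream cleavage target (binned, pooled over aligned reads) is
# modeled as a linear combination of the conditional distributions expected
# when the break occurred at each candidate target class.
D = np.array([[0.50, 0.10, 0.20],    # rows: distance bins
              [0.30, 0.30, 0.30],    # cols: cleavage at recognition site,
              [0.15, 0.35, 0.25],    #       star-activity site, random break
              [0.05, 0.25, 0.25]])
observed = np.array([0.42, 0.30, 0.19, 0.09])

frac, _ = nnls(D, observed)
frac /= frac.sum()                   # normalize to cleavage fractions
print(frac.round(3))                 # e.g. mostly specific cleavage
```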
Ramo, Nicole L.; Puttlitz, Christian M.
2018-01-01
Compelling evidence that many biological soft tissues display both strain- and time-dependent behavior has led to the development of fully non-linear viscoelastic modeling techniques to represent the tissue’s mechanical response under dynamic conditions. Since the current stress state of a viscoelastic material is dependent on all previous loading events, numerical analyses are complicated by the requirement of computing and storing the stress at each step throughout the load history. This requirement quickly becomes computationally expensive, and in some cases intractable, for finite element models. Therefore, we have developed a strain-dependent numerical integration approach for capturing non-linear viscoelasticity that enables calculation of the current stress from a strain-dependent history state variable stored from the preceding time step only, which improves both fitting efficiency and computational tractability. This methodology was validated based on its ability to recover non-linear viscoelastic coefficients from simulated stress-relaxation (six strain levels) and dynamic cyclic (three frequencies) experimental stress-strain data. The model successfully fit each data set with average errors in recovered coefficients of 0.3% for stress-relaxation fits and 0.1% for cyclic. The results support the use of the presented methodology to develop linear or non-linear viscoelastic models from stress-relaxation or cyclic experimental data of biological soft tissues. PMID:29293558
Assessment of brain reference genes for RT-qPCR studies in neurodegenerative diseases
Rydbirk, Rasmus; Folke, Jonas; Winge, Kristian; Aznar, Susana; Pakkenberg, Bente; Brudek, Tomasz
2016-01-01
Evaluation of gene expression levels by reverse transcription quantitative real-time PCR (RT-qPCR) has for many years been the favourite approach for discovering disease-associated alterations. Normalization of results to stably expressed reference genes (RGs) is pivotal to obtain reliable results. This is especially important in relation to neurodegenerative diseases where disease-related structural changes may affect the most commonly used RGs. We analysed 15 candidate RGs in 98 brain samples from two brain regions from Alzheimer’s disease (AD), Parkinson’s disease (PD), Multiple System Atrophy, and Progressive Supranuclear Palsy patients. Using RefFinder, a web-based tool for evaluating RG stability, we identified the most stable RGs to be UBE2D2, CYC1, and RPL13 which we recommend for future RT-qPCR studies on human brain tissue from these patients. None of the investigated genes were affected by experimental variables such as RIN, PMI, or age. Findings were further validated by expression analyses of a target gene GSK3B, known to be affected by AD and PD. We obtained high variations in GSK3B levels when contrasting the results using different sets of common RG underlining the importance of a priori validation of RGs for RT-qPCR studies. PMID:27853238
Identification of Reference Genes for RT-qPCR Data Normalization in Cannabis sativa Stem Tissues.
Mangeot-Peter, Lauralie; Legay, Sylvain; Hausman, Jean-Francois; Esposito, Sergio; Guerriero, Gea
2016-09-15
Gene expression profiling via quantitative real-time PCR is a robust technique widely used in the life sciences to compare gene expression patterns in, e.g., different tissues, growth conditions, or after specific treatments. In the field of plant science, real-time PCR is the gold standard to study the dynamics of gene expression and is used to validate the results generated with high throughput techniques, e.g., RNA-Seq. An accurate relative quantification of gene expression relies on the identification of appropriate reference genes, that need to be determined for each experimental set-up used and plant tissue studied. Here, we identify suitable reference genes for expression profiling in stems of textile hemp (Cannabis sativa L.), whose tissues (isolated bast fibres and core) are characterized by remarkable differences in cell wall composition. We additionally validate the reference genes by analysing the expression of putative candidates involved in the non-oxidative phase of the pentose phosphate pathway and in the first step of the shikimate pathway. The goal is to describe the possible regulation pattern of some genes involved in the provision of the precursors needed for lignin biosynthesis in the different hemp stem tissues. The results here shown are useful to design future studies focused on gene expression analyses in hemp.
CFD Validation Experiment of a Mach 2.5 Axisymmetric Shock-Wave/Boundary-Layer Interaction
NASA Technical Reports Server (NTRS)
Davis, David O.
2015-01-01
Experimental investigations of specific flow phenomena, e.g., Shock Wave Boundary-Layer Interactions (SWBLI), provide great insight into the flow behavior but often lack the details necessary to be useful as CFD validation experiments. Reasons include: (1) undefined boundary conditions leading to inconsistent results, (2) undocumented 3D effects (centerline-only measurements), and (3) lack of uncertainty analysis. While there are a number of good subsonic experimental investigations that are sufficiently documented to be considered test cases for CFD and turbulence-model validation, the number of supersonic and hypersonic cases is much smaller. This was highlighted by Settles and Dodson's [1] comprehensive review of available supersonic and hypersonic experimental studies. In all, several hundred studies were considered for their database. Of these, over a hundred were subjected to rigorous acceptance criteria. Based on their criteria, only 19 (12 supersonic, 7 hypersonic) were considered of sufficient quality to be used for validation purposes. Aeschliman and Oberkampf [2] recognized the need to develop a specific methodology for experimental studies intended specifically for validation purposes.
Validation of the Work-Life Balance Culture Scale (WLBCS).
Nitzsche, Anika; Jung, Julia; Kowalski, Christoph; Pfaff, Holger
2014-01-01
The purpose of this paper is to describe the theoretical development and initial validation of the newly developed Work-Life Balance Culture Scale (WLBCS), an instrument for measuring an organizational culture that promotes the work-life balance of employees. In Study 1 (N=498), the scale was developed and its factorial validity tested through exploratory factor analyses. In Study 2 (N=513), confirmatory factor analysis (CFA) was performed to examine model fit and retest the dimensional structure of the instrument. To assess construct validity, a priori hypotheses were formulated and subsequently tested using correlation analyses. Exploratory and confirmatory factor analyses revealed a one-factor model. Results of the bivariate correlation analyses may be interpreted as preliminary evidence of the scale's construct validity. The five-item WLBCS is a new and efficient instrument with good overall quality. Its conciseness makes it particularly suitable for use in employee surveys to gain initial insight into a company's perceived work-life balance culture.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-08-31
... factors as the approved models, are validated by experimental test data, and receive the Administrator's... stage of the MEP involves applying the model against a database of experimental test cases including..., particularly the requirement for validation by experimental test data. That guidance is based on the MEP's...
Schlenker, Philippe; Chemla, Emmanuel; Zuberbühler, Klaus
2016-12-01
A field of primate linguistics is gradually emerging. It combines general questions and tools from theoretical linguistics with rich data gathered in experimental primatology. Analyses of several monkey systems have uncovered very simple morphological and syntactic rules and have led to the development of a primate semantics that asks new questions about the division of semantic labor between the literal meaning of monkey calls, additional mechanisms of pragmatic enrichment, and the environmental context. We show that comparative studies across species may validate this program and may in some cases help in reconstructing the evolution of monkey communication over millions of years. Copyright © 2016. Published by Elsevier Ltd.
LHC collider phenomenology of minimal universal extra dimensions
NASA Astrophysics Data System (ADS)
Beuria, Jyotiranjan; Datta, AseshKrishna; Debnath, Dipsikha; Matchev, Konstantin T.
2018-05-01
We discuss the collider phenomenology of the model of Minimal Universal Extra Dimensions (MUED) at the Large Hadron Collider (LHC). We derive analytical results for all relevant strong pair-production processes of two level-1 Kaluza-Klein partners and use them to validate and correct the existing MUED implementation in the Fortran version of the PYTHIA event generator. We also develop a new implementation of the model in the C++ version of PYTHIA. We use our implementations in conjunction with the CHECKMATE package to derive the LHC bounds on MUED from a large number of published experimental analyses from Run 1 at the LHC.
Cell size control and homeostasis in bacteria
NASA Astrophysics Data System (ADS)
Bradde, Serena; Taheri, Sattar; Sauls, John; Hill, Nobert; Levine, Petra; Paulsson, Johan; Vergassola, Massimo; Jun, Suckjoon
2015-03-01
How cells control their size is a fundamental question in biology. Proposed mechanisms based on sensing size, time, or a combination of the two are not supported by experimental evidence. By analysing distributions of size at division, size at birth, and generation time for hundreds of thousands of Gram-negative E. coli and Gram-positive B. subtilis cells under a wide range of tightly controlled steady-state growth conditions, we are now in a position to validate different theoretical models. In this talk I will present the possible models in detail and present a general mechanism that quantitatively explains all measurable aspects of growth and cell division at both population and single-cell levels.
Receding horizon online optimization for torque control of gasoline engines.
Kang, Mingxin; Shen, Tielong
2016-11-01
This paper proposes a model-based nonlinear receding horizon optimal control scheme for the engine torque tracking problem. The controller design directly employs a nonlinear model based on the mean-value modeling principle of engine systems, without any linearizing reformulation, and the online optimization is achieved by applying the Continuation/GMRES (generalized minimum residual) approach. Several receding horizon control schemes are designed to investigate the effects of the integral action and integral gain selection. Simulation analyses and experimental validations demonstrate the real-time optimization performance and control effects of the proposed torque tracking controllers. Copyright © 2016 ISA. Published by Elsevier Ltd. All rights reserved.
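The receding-horizon recipe described above solves a finite-horizon tracking problem at each sampling instant and applies only the first input to the plant. The sketch below illustrates that loop with a generic optimizer and a hypothetical first-order mean-value torque model; it is not the paper's engine model or its Continuation/GMRES solver.

    import numpy as np
    from scipy.optimize import minimize

    tau, k, dt, N = 0.5, 40.0, 0.02, 10   # hypothetical lag, gain, step, horizon

    def rollout(x0, u_seq):
        # Forward-Euler simulation of torque x under input sequence u_seq
        x, xs = x0, []
        for u in u_seq:
            x = x + dt * (-(x - k * u) / tau)
            xs.append(x)
        return np.array(xs)

    def cost(u_seq, x0, ref):
        # Tracking error over the horizon plus a small input-smoothness penalty
        xs = rollout(x0, u_seq)
        return np.sum((xs - ref) ** 2) + 1e-3 * np.sum(np.diff(u_seq) ** 2)

    x, ref = 0.0, 30.0                     # current torque and torque setpoint
    for step in range(50):                 # receding-horizon loop
        res = minimize(cost, np.full(N, 0.5), args=(x, ref))
        u_now = res.x[0]                   # apply only the first optimal input
        x = x + dt * (-(x - k * u_now) / tau)   # plant advances one step

C/GMRES replaces this repeated full optimization with a continuation update of the optimality conditions, which is what makes the scheme real-time capable.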
Results of a real-time irradiation of lithium P/N and conventional N/P silicon solar cells.
NASA Technical Reports Server (NTRS)
Reynard, D. L.; Peterson, D. G.
1972-01-01
Eight types of lithium-diffused P/N and three types of conventional 10 ohm-cm N/P silicon solar cells were irradiated at four different temperatures with a strontium-90 radioisotope at a rate typical of that expected in earth orbit. The six-month irradiation confirmed earlier accelerator results, showed that certain cell types outperform others at the various temperatures, and, in general, verified the recent improvements and potential usefulness of lithium solar cells. The experimental approach and statistical methods and analyses employed yielded increased confidence in the validity of the results. Injection level effects were observed to be significant.
Feed-forward control of a solid oxide fuel cell system with anode offgas recycle
NASA Astrophysics Data System (ADS)
Carré, Maxime; Brandenburger, Ralf; Friede, Wolfgang; Lapicque, François; Limbeck, Uwe; da Silva, Pedro
2015-05-01
In this work a combined heat and power unit (CHP unit) based on the solid oxide fuel cell (SOFC) technology is analysed. This unit has a special feature: the anode offgas is partially recycled to the anode inlet. Thus it is possible to increase the electrical efficiency and the system can be operated without external water feeding. A feed-forward control concept which allows secure operating conditions of the CHP unit as well as a maximization of its electrical efficiency is introduced and validated experimentally. The control algorithm requires a limited number of measurement values and few deterministic relations for its description.
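The abstract does not give the control law itself; the sketch below only illustrates the Faraday-law relation on which a feed-forward fuel schedule for an SOFC stack is commonly built. Anode offgas recycle would modify the effective utilization, and all numerical values here are hypothetical.

    F = 96485.0  # Faraday constant, C/mol

    def h2_feed_rate(stack_current, n_cells, fuel_utilization):
        # Each H2 molecule supplies two electrons, so the electrochemically
        # consumed molar flow is I * n_cells / (2 F); dividing by the target
        # fuel utilization gives the flow to supply at the anode inlet (mol/s).
        consumed = stack_current * n_cells / (2.0 * F)
        return consumed / fuel_utilization

    print(h2_feed_rate(stack_current=30.0, n_cells=60, fuel_utilization=0.75))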
NASA Technical Reports Server (NTRS)
Bert, C. W.; Clary, R. R.
1974-01-01
Various methods potentially usable for determining dynamic stiffness and damping of composite materials are reviewed. Of these, the following most widely used techniques are singled out for more detailed discussion: free vibration, pulse propagation, and forced vibration response. To illustrate the usefulness and validity of dynamic property data, their application in dynamic analyses and comparison with measured structural response are described for the following composite-material structures: free-free sandwich beam with glass-epoxy facings, clamped-edge sandwich plate with similar facings, free-end sandwich conical shell with similar facings, and boron-epoxy free plate with layers arranged at various orientations.
Petkova, Rumena; Chelenkova, Pavlina; Georgieva, Elena; Chakarov, Stoian
2014-01-02
The individual variance in the efficiency of repair of damage induced by genotoxic therapies may be an important factor in the assessment of eligibility for different anticancer treatments, the outcomes of various treatments and the therapy-associated complications, including acute and delayed toxicity and acquired drug resistance. The second part of this paper analyses the currently available information about the possibilities of using experimentally obtained knowledge about individual repair capacity for the purposes of personalised medicine and healthcare.
Expanding the horizons of microRNA bioinformatics.
Huntley, Rachael P; Kramarz, Barbara; Sawford, Tony; Umrao, Zara; Kalea, Anastasia Z; Acquaah, Vanessa; Martin, Maria-Jesus; Mayr, Manuel; Lovering, Ruth C
2018-06-05
MicroRNA regulation of key biological and developmental pathways is a rapidly expanding area of research, accompanied by vast amounts of experimental data. These data, however, are not widely available in bioinformatic resources, making it difficult for researchers to find and analyse microRNA-related experimental data and define further research projects. We are addressing this problem by providing two new bioinformatics datasets that contain experimentally verified functional information for mammalian microRNAs involved in cardiovascular-relevant, and other, processes. To date, our resource provides over 3,900 Gene Ontology annotations associated with almost 500 miRNAs from human, mouse and rat and over 2,200 experimentally validated miRNA:target interactions. We illustrate how this resource can be used to create miRNA-focused interaction networks with a biological context using the known biological role of miRNAs and the mRNAs they regulate, enabling discovery of associations between gene products, biological pathways and, ultimately, diseases. These data will be crucial in advancing the field of microRNA bioinformatics and will establish consistent datasets for reproducible functional analysis of microRNAs across all biological research areas. Published by Cold Spring Harbor Laboratory Press for the RNA Society.
Supersonic Retropropulsion Experimental Results from the NASA Langley Unitary Plan Wind Tunnel
NASA Technical Reports Server (NTRS)
Berry, Scott A.; Rhode, Matthew N.; Edquist, Karl T.; Player, Charles J.
2011-01-01
A new supersonic retropropulsion experimental effort, intended to provide code validation data, was recently completed in the Langley Research Center Unitary Plan Wind Tunnel Test Section 2 over the Mach number range from 2.4 to 4.6. The experimental model was designed using insights gained from pre-test computations, which were instrumental for sizing and refining the model to minimize tunnel wall interference and internal flow separation concerns. A 5-in diameter 70-deg sphere-cone forebody with a roughly 10-in long cylindrical aftbody was the baseline configuration selected for this study. The forebody was designed to accommodate up to four 4:1 area ratio supersonic nozzles. Primary measurements for this model were a large number of surface pressures on the forebody and aftbody. Supplemental data included high-speed Schlieren video and internal pressures and temperatures. The run matrix was developed to allow for the quantification of various sources of experimental uncertainty, such as random errors due to run-to-run variations and bias errors due to flow field or model misalignments. Preliminary results and observations from the test are presented, while detailed data and uncertainty analyses are ongoing.
Görgen, Kai; Hebart, Martin N; Allefeld, Carsten; Haynes, John-Dylan
2017-12-27
Standard neuroimaging data analysis based on traditional principles of experimental design, modelling, and statistical inference is increasingly complemented by novel analysis methods, driven, for example, by machine learning. While these novel approaches provide new insights into neuroimaging data, they often have unexpected properties, generating a growing literature on possible pitfalls. We propose to meet this challenge by adopting a habit of systematic testing of experimental design, analysis procedures, and statistical inference. Specifically, we suggest applying the analysis method used for experimental data also to aspects of the experimental design, simulated confounds, simulated null data, and control data. We stress the importance of keeping the analysis method the same in main and test analyses, because only in this way can possible confounds and unexpected properties be reliably detected and avoided. We describe and discuss this Same Analysis Approach in detail, and demonstrate it in two worked examples using multivariate decoding. With these examples, we reveal two sources of error: a mismatch between counterbalancing (crossover designs) and cross-validation, which leads to systematic below-chance accuracies, and linear decoding of a nonlinear effect, a difference in variance. Copyright © 2017 Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Suwono, H.; Susanti, S.; Lestari, U.
2017-04-01
Learning activities that engage students in active learning are one of the characteristics of a quality education. Guided inquiry is a learning strategy that involves students in active learning. Today's learning problems concern the development of metacognitive skills and cognitive learning outcomes. This study is the research and development of a learning module using Thiagarajan's 4D model. The first phase, Define, analyses the problems and needs required prior to preparing the module. The second phase, Design, formulates the learning design and devices to obtain an initial draft of the module. The third phase, Develop, comprises developing and writing the module, module validation, product testing, and revision, yielding the final module as the end product. The fourth phase, Disseminate, distributes the validated product. The module was validated by education experts, practitioners, subject-matter experts, and an expert in online media. The validation results indicated that the module was valid and could be used in teaching and learning. In the testing phase, we used an experiment to determine the differences in metacognitive skills and learning outcomes between a control group and an experimental group. The experimental design was a one-group pretest-posttest design. Data analysis showed that the module could enhance metacognitive skills and learning outcomes. The advantages of this module are as follows: (1) the module is accompanied by a link to a website with videos of practical activities appropriate to Curriculum 2013; (2) the module is accompanied by a link to a website with videos of the manual laboratory activities to be used in face-to-face classes, so that students are prepared for laboratory work; (3) the module can be used online, with chat, to increase students' understanding. The disadvantage of this module is that the material presented is limited. For better use of the online activities, it is suggested that students attend every meeting so that all students participate actively, and that schools set up facilities to support blended learning.
Experimental investigation of an RNA sequence space
NASA Technical Reports Server (NTRS)
Lee, Youn-Hyung; Dsouza, Lisa; Fox, George E.
1993-01-01
Modern rRNAs are the historic consequence of an ongoing evolutionary exploration of a sequence space. These extant sequences belong to a special subset of the sequence space that is comprised only of those primary sequences that can validly perform the biological function(s) required of the particular RNA. If it were possible to readily identify all such valid sequences, stochastic predictions could be made about the relative likelihood of various evolutionary pathways available to an RNA. Herein an experimental system which can assess whether a particular sequence is likely to have validity as a eubacterial 5S rRNA is described. A total of ten naturally occurring, and hence known to be valid, sequences and two point mutants of unknown validity were used to test the usefulness of the approach. Nine of the ten valid sequences tested positive whereas both mutants tested as clearly defective. The tenth valid sequence gave results that would be interpreted as reflecting a borderline status were the answer not known. These results demonstrate that it is possible to experimentally determine which sequences in local regions of the sequence space are potentially valid 5S rRNAs.
Experimental Validation and Combustion Modeling of a JP-8 Surrogate in a Single Cylinder Diesel Engine
Shrestha, Amit; Joshi, Umashankar; Zheng, Ziliang; Badawy, Tamer; Henein, Naeim A. (Wayne State University, Detroit, MI, USA)
2014-04-15
A two-component JP-8 surrogate is validated in a single cylinder diesel engine; validation parameters include ignition delay.
ODE Constrained Mixture Modelling: A Method for Unraveling Subpopulation Structures and Dynamics
Hasenauer, Jan; Hasenauer, Christine; Hucho, Tim; Theis, Fabian J.
2014-01-01
Functional cell-to-cell variability is ubiquitous in multicellular organisms as well as bacterial populations. Even genetically identical cells of the same cell type can respond differently to identical stimuli. Methods have been developed to analyse heterogeneous populations, e.g., mixture models and stochastic population models. The available methods are, however, either incapable of simultaneously analysing different experimental conditions or are computationally demanding and difficult to apply. Furthermore, they do not account for biological information available in the literature. To overcome disadvantages of existing methods, we combine mixture models and ordinary differential equation (ODE) models. The ODE models provide a mechanistic description of the underlying processes while mixture models provide an easy way to capture variability. In a simulation study, we show that the class of ODE constrained mixture models can unravel the subpopulation structure and determine the sources of cell-to-cell variability. In addition, the method provides reliable estimates for kinetic rates and subpopulation characteristics. We use ODE constrained mixture modelling to study NGF-induced Erk1/2 phosphorylation in primary sensory neurones, a process relevant in inflammatory and neuropathic pain. We propose a mechanistic pathway model for this process and reconstruct static and dynamic subpopulation characteristics across experimental conditions. We validate the model predictions experimentally, which verifies the capabilities of ODE constrained mixture models. These results illustrate that ODE constrained mixture models can reveal novel mechanistic insights and possess a high sensitivity. PMID:24992156
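A minimal sketch of the ODE-constrained mixture idea: each subpopulation's mean trajectory is the solution of a kinetic ODE, and snapshot data are modelled as a two-component mixture around those means. The one-state phosphorylation model and all parameter names below are illustrative stand-ins for the authors' NGF-induced Erk1/2 pathway model.

    import numpy as np
    from scipy.integrate import odeint
    from scipy.stats import norm

    def mean_traj(k_on, k_off, t):
        # One-state ODE for a phosphorylated fraction x(t); t[0] is assumed
        # to be 0 with initial condition x(0) = 0
        dx = lambda x, t: k_on * (1.0 - x) - k_off * x
        return odeint(dx, 0.0, t)[:, 0]

    def neg_log_lik(params, t, y):
        # y: (n_times, n_cells) snapshot measurements; w is the mixture weight
        k1, k2, k3, k4, w, sigma = params
        mu1 = mean_traj(k1, k2, t)[:, None]
        mu2 = mean_traj(k3, k4, t)[:, None]
        lik = w * norm.pdf(y, mu1, sigma) + (1 - w) * norm.pdf(y, mu2, sigma)
        return -np.sum(np.log(lik + 1e-300))

The parameters (kinetic rates, subpopulation weight, and noise level) can then be estimated by minimizing neg_log_lik with a standard optimizer such as scipy.optimize.minimize.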
Letaief, Rabia; Rebours, Emmanuelle; Grohs, Cécile; Meersseman, Cédric; Fritz, Sébastien; Trouilh, Lidwine; Esquerré, Diane; Barbieri, Johanna; Klopp, Christophe; Philippe, Romain; Blanquet, Véronique; Boichard, Didier; Rocha, Dominique; Boussaha, Mekki
2017-10-24
Copy number variations (CNV) are known to play a major role in genetic variability and disease pathogenesis in several species including cattle. In this study, we report the identification and characterization of CNV in eight French beef and dairy breeds using whole-genome sequence data from 200 animals. Bioinformatics analyses to search for CNV were carried out using four different but complementary tools and we validated a subset of the CNV by both in silico and experimental approaches. We report the identification and localization of 4178 putative deletion-only, duplication-only and CNV regions, which cover 6% of the bovine autosomal genome; they were validated by two in silico approaches and/or experimentally validated using array-based comparative genomic hybridization and single nucleotide polymorphism genotyping arrays. The size of these variants ranged from 334 bp to 7.7 Mb, with an average size of ~ 54 kb. Of these 4178 variants, 3940 were deletions, 67 were duplications and 171 corresponded to both deletions and duplications, which were defined as potential CNV regions. Gene content analysis revealed that, among these variants, 1100 deletions and duplications encompassed 1803 known genes, which affect a wide spectrum of molecular functions, and 1095 overlapped with known QTL regions. Our study is a large-scale survey of CNV in eight French dairy and beef breeds. These CNV will be useful to study the link between genetic variability and economically important traits, and to improve our knowledge on the genomic architecture of cattle.
Numerical simulation of turbulent gas flames in tubes.
Salzano, E; Marra, F S; Russo, G; Lee, J H S
2002-12-02
Computational fluid dynamics (CFD) is an emerging technique for predicting the possible consequences of gas explosions and is often considered a powerful and accurate tool for obtaining detailed results. However, systematic analyses of the reliability of this approach for real-scale industrial configurations are still needed. Furthermore, few experimental data are available for comparison and validation. In this work, a set of well-documented experimental data on flame acceleration in obstacle-filled tubes filled with flammable gas-air mixtures has been simulated. In these experiments, terminal steady flame speeds corresponding to different propagation regimes were observed, thus allowing a clear and prompt characterisation of the numerical results with respect to numerical parameters (e.g., grid definition), geometrical parameters (e.g., blockage ratio), and mixture parameters (e.g., mixture reactivity). The CFD code AutoReaGas was used for the simulations. Numerical predictions were compared with available experimental data and some insights into the code accuracy were obtained. Computational results are satisfactory for the relatively slower turbulent deflagration regimes and become only fair when the choking regime is observed, whereas transition to quasi-detonation or Chapman-Jouguet (CJ) detonation was never predicted.
Numerical and experimental investigations of human swimming motions
Takagi, Hideki; Nakashima, Motomu; Sato, Yohei; Matsuuchi, Kazuo; Sanders, Ross H.
2016-01-01
This paper reviews unsteady flow conditions in human swimming and identifies the limitations and future potential of the current methods of analysing unsteady flow. The capability of computational fluid dynamics (CFD) has been extended from approaches assuming steady-state conditions to consideration of unsteady/transient conditions associated with the body motion of a swimmer. However, to predict hydrodynamic forces and the swimmer's potential speeds accurately, more robust and efficient numerical methods are necessary, coupled with validation procedures, requiring detailed experimental data reflecting local flow. Experimental data obtained by particle image velocimetry (PIV) in this area are limited, because at present observations are restricted to a two-dimensional 1.0 m² area, though this could be improved if the output range of the associated laser sheet increased. Simulations of human swimming are expected to improve competitive swimming, and our review has identified two important advances relating to understanding the flow conditions affecting performance in front crawl swimming: one is a mechanism for generating unsteady fluid forces, and the other is a theory relating to increased speed and efficiency. PMID:26699925
Jones, Cameron C; McDonough, James M; Capasso, Patrizio; Wang, Dongfang; Rosenstein, Kyle S; Zwischenberger, Joseph B
2013-10-01
Computational fluid dynamics (CFD) is a useful tool in characterizing artificial lung designs by providing predictions of device performance through analyses of pressure distribution, perfusion dynamics, and gas transport properties. Validation of numerical results in membrane oxygenators has been predominantly based on experimental pressure measurements with little emphasis placed on confirmation of the velocity fields due to opacity of the fiber membrane and limitations of optical velocimetric methods. Biplane X-ray digital subtraction angiography was used to visualize flow of a blood analogue through a commercial membrane oxygenator at 1-4.5 L/min. Permeability and inertial coefficients of the Ergun equation were experimentally determined to be 180 and 2.4, respectively. Numerical simulations treating the fiber bundle as a single momentum sink according to the Ergun equation accurately predicted pressure losses across the fiber membrane, but significantly underestimated velocity magnitudes in the fiber bundle. A scaling constant was incorporated into the numerical porosity and reduced the average difference between experimental and numerical values in the porous media regions from 44 ± 4% to 6 ± 5%.
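For orientation, the porous-media momentum sink described above follows the Ergun form; in the sketch below the viscous and inertial coefficients default to the experimentally fitted values quoted in the abstract (180 and 2.4), while the blood-analogue properties and fibre geometry are hypothetical.

    def ergun_dp_per_length(v, eps, d_fiber, mu, rho, A=180.0, B=2.4):
        # Pressure gradient (Pa/m) across a fiber bundle of porosity eps for
        # superficial velocity v (m/s); A and B are the viscous and inertial
        # coefficients of the Ergun equation.
        viscous = A * mu * (1 - eps) ** 2 / (eps ** 3 * d_fiber ** 2) * v
        inertial = B * rho * (1 - eps) / (eps ** 3 * d_fiber) * v ** 2
        return viscous + inertial

    # Hypothetical values: blood analogue in a hollow-fiber bundle
    print(ergun_dp_per_length(v=0.05, eps=0.55, d_fiber=380e-6,
                              mu=3.5e-3, rho=1050.0))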
Zare, Yasser; Rhim, Sungsoo; Garmabi, Hamid; Rhee, Kyong Yop
2018-04-01
The networks of nanoparticles in nanocomposites cause solid-like behavior, demonstrated by a constant storage modulus at low frequencies. This study examines the storage modulus of poly(lactic acid)/poly(ethylene oxide)/carbon nanotube (CNT) nanocomposites. The experimental data for the storage modulus in the plateau regions are obtained by a frequency sweep test. In addition, a simple model is developed to predict the constant storage modulus from the properties of the interphase regions and the CNT networks. The model calculations are compared with the experimental results, and parametric analyses are applied to validate the predictability of the developed model. The calculations agree well with the experimental data at all polymer and CNT concentrations. Moreover, all parameters acceptably modulate the constant storage modulus. The percentage of networked CNT, the modulus of the networks, and the thickness and modulus of the interphase regions directly govern the storage modulus of nanocomposites. The outputs reveal the important roles of the interphase properties in the storage modulus. Copyright © 2018 Elsevier Ltd. All rights reserved.
ERIC Educational Resources Information Center
Rossi, Robert Joseph
Methods drawn from four logical theories associated with studies of inductive processes are applied to the assessment and evaluation of experimental episode construct validity. It is shown that this application provides for estimates of episode informativeness with respect to the person examined in terms of the construct and to the construct…
ERIC Educational Resources Information Center
Mihura, Joni L.; Meyer, Gregory J.; Dumitrascu, Nicolae; Bombel, George
2013-01-01
We systematically evaluated the peer-reviewed Rorschach validity literature for the 65 main variables in the popular Comprehensive System (CS). Across 53 meta-analyses examining variables against externally assessed criteria (e.g., observer ratings, psychiatric diagnosis), the mean validity was r = 0.27 (k = 770) as compared to r = 0.08 (k = 386)…
Ancient Glass: A Literature Search and its Role in Waste Management
DOE Office of Scientific and Technical Information (OSTI.GOV)
Strachan, Denis M.; Pierce, Eric M.
2010-07-01
When developing a performance assessment (PA) model for the long-term disposal of immobilized low-activity waste (ILAW) glass, it is desirable to determine the durability of glass forms over very long periods of time. However, testing is limited to short time spans, so experiments are performed under conditions that accelerate the key geochemical processes that control weathering. Verification that the models currently being used can reliably calculate the long-term behavior of ILAW glass is a key component of the overall PA strategy. Therefore, Pacific Northwest National Laboratory was contracted by Washington River Protection Solutions, LLC to evaluate alternative strategies that can be used for PA source term model validation. One viable alternative strategy is the use of independent experimental data from archaeological studies of ancient or natural glass contained in the literature. These results represent a potential independent experiment dating back approximately 3600 years, to 1600 before the current era (BCE), in the case of ancient glass, and 10^6 years or more in the case of natural glass. The results of this literature review suggest that additional experimental data may be needed before the results from archaeological studies can be used as a tool for model validation of glass weathering and, more specifically, disposal facility performance. This is largely because none of the existing data sets contains all of the information required to conduct PA source term calculations. For example, in many cases the sediments surrounding the glass were not collected and analyzed; the data required to compare computer simulations of concentration flux are therefore not available. This type of information is important to understanding the element release profile from the glass to the surrounding environment and provides a metric that can be used to calibrate source term models. Although useful, the available literature sources do not contain the information needed to simulate the long-term performance of nuclear waste glasses in near-surface or deep geologic repositories. The information that will be required includes (1) experimental measurements to quantify the model parameters, (2) detailed analyses of altered glass samples, and (3) detailed analyses of the sediment surrounding the ancient glass samples.
Li, B; Matter, E K; Hoppert, H T; Grayson, B E; Seeley, R J; Sandoval, D A
2014-02-01
Obesity has a complicated metabolic pathology, and defining the underlying mechanisms of obesity requires integrative studies with molecular end points. Real-time quantitative PCR (RT-qPCR) is a powerful tool that has been widely utilized. However, the importance of using carefully validated reference genes in RT-qPCR seems to have been overlooked in obesity-related research. The objective of this study was to select a set of reference genes with stable expression to be used for RT-qPCR normalization in rats under fasted vs re-fed and chow vs high-fat diet (HFD) conditions. Male Long-Evans rats were treated under four conditions: chow/fasted, chow/re-fed, HFD/fasted and HFD/re-fed. Expression stabilities of 13 candidate reference genes were evaluated in the rat hypothalamus, duodenum, jejunum and ileum using the RefFinder software program. The optimal number of reference genes needed for RT-qPCR analyses was determined using geNorm. Using geNorm analysis, we found that it was sufficient to use the two most stably expressed genes as references in RT-qPCR analyses for each tissue under specific experimental conditions. B2M and RPLP0 in the hypothalamus, RPS18 and HMBS in the duodenum, RPLP2 and RPLP0 in the jejunum and RPS18 and YWHAZ in the ileum were the most suitable pairs for normalization when the four aforementioned experimental conditions were considered. Our study demonstrates that the expression levels of reference genes commonly used in obesity-related studies, such as ACTB or RPS18, are altered by changes in acute or chronic energy status. These findings underline the importance of using reference genes whose expression is stable across experimental conditions when studying the rat hypothalamus and intestine, because these tissues have an integral role in the regulation of energy homeostasis. It is our hope that this study will raise awareness among obesity researchers of the essential need for reference gene validation in gene expression studies.
Enhancement of CFD validation exercise along the roof profile of a low-rise building
NASA Astrophysics Data System (ADS)
Deraman, S. N. C.; Majid, T. A.; Zaini, S. S.; Yahya, W. N. W.; Abdullah, J.; Ismail, M. A.
2018-04-01
The aim of this study is to enhance the validation of a CFD exercise along the roof profile of a low-rise building. An isolated gable-roof house with a 26.6° roof pitch was simulated to obtain the pressure coefficients around the house. Validation of a CFD analysis against experimental data requires many input parameters. This study performed CFD simulations based on the data from a previous study; where the input parameters were not clearly stated, new input parameters were established from the open literature. The numerical simulations were performed in FLUENT 14.0 by applying the Computational Fluid Dynamics (CFD) approach based on the steady RANS equations together with the RNG k-ɛ model. The CFD results were then analysed quantitatively (statistical analysis) and compared with the CFD results from the previous study. The statistical analysis, comprising an ANOVA test and error measures, showed that the CFD results from the current study were in good agreement and exhibited the smallest error compared to the previous study. The input data used in this study can be extended to other types of CFD simulation involving wind flow over an isolated single-storey house.
Application of the Virtual Fields Method to a relaxation behaviour of rubbers
NASA Astrophysics Data System (ADS)
Yoon, Sung-ho; Siviour, Clive R.
2018-07-01
This paper presents the application of the Virtual Fields Method (VFM) for the characterization of viscoelastic behaviour of rubbers. The relaxation behaviour of the rubbers following a dynamic loading event is characterized using the dynamic VFM in which full-field (two dimensional) strain and acceleration data, obtained from high-speed imaging, are analysed by the principle of virtual work without traction force data, instead using the acceleration fields in the specimen to provide stress information. Two (silicone and nitrile) rubbers were tested in tension using a drop-weight apparatus. It is assumed that the dynamic behaviour is described by the combination of hyperelastic and Prony series models. A VFM based procedure is designed and used to produce the identification of the modulus term of a hyperelastic model and the Prony series parameters within a time scale determined by two experimental factors: imaging speed and loading duration. Then, the time range of the data is extended using experiments at different temperatures combined with the time-temperature superposition principle. Prior to these experimental analyses, finite element simulations were performed to validate the application of the proposed VFM analysis. Therefore, for the first time, it has been possible to identify relaxation behaviour of a material following dynamic loading, using a technique that can be applied to both small and large deformations.
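A short sketch of the two model ingredients named above, assuming a relaxation modulus in Prony-series form and a WLF-type shift factor for time-temperature superposition; the functional forms are standard, the WLF constants shown are the common "universal" defaults, and in the paper the parameter values come from the VFM identification.

    import numpy as np

    def relaxation_modulus(t, e_inf, e_i, tau_i):
        # Prony series: E(t) = E_inf + sum_i E_i * exp(-t / tau_i)
        t = np.atleast_1d(t).astype(float)[:, None]
        return e_inf + np.sum(e_i * np.exp(-t / tau_i), axis=1)

    def reduced_time(t, T, T_ref, c1=17.4, c2=51.6):
        # WLF shift: log10 a_T = -c1 (T - T_ref) / (c2 + T - T_ref); data
        # measured at temperature T map to reduced time t / a_T at T_ref
        log_aT = -c1 * (T - T_ref) / (c2 + (T - T_ref))
        return t / 10.0 ** log_aT

    # Hypothetical two-term Prony series (moduli in MPa, times in s)
    print(relaxation_modulus(np.array([1e-3, 1e-2, 1e-1]),
                             e_inf=1.0,
                             e_i=np.array([2.0, 0.5]),
                             tau_i=np.array([1e-3, 1e-1])))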
NASA Astrophysics Data System (ADS)
Ozdemir, Ozan C.; Widener, Christian A.; Carter, Michael J.; Johnson, Kyle W.
2017-10-01
As the industrial application of the cold spray technology grows, the need to optimize both the cost and the quality of the process grows with it. Parameter selection techniques available today require the use of a coupled system of equations to be solved to involve the losses due to particle loading in the gas stream. Such analyses cause a significant increase in the computational time in comparison with calculations with isentropic flow assumptions. In cold spray operations, engineers and operators may, therefore, neglect the effects of particle loading to simplify the multiparameter optimization process. In this study, two-way coupled (particle-fluid) quasi-one-dimensional fluid dynamics simulations are used to test the particle loading effects under many potential cold spray scenarios. Output of the simulations is statistically analyzed to build regression models that estimate the changes in particle impact velocity and temperature due to particle loading. This approach eases particle loading optimization for more complete analysis on deposition cost and time. The model was validated both numerically and experimentally. Further numerical analyses were completed to test the particle loading capacity and limitations of a nozzle with a commonly used throat size. Additional experimentation helped document the physical limitations to high-rate deposition.
McAuliff, Bradley D; Kovera, Margaret Bull; Nunez, Gabriel
2009-06-01
This study examined the ability of jury-eligible community members (N = 248) to detect internal validity threats in psychological science presented during a trial. Participants read a case summary in which an expert testified about a study that varied in internal validity (valid, missing control group, confound, and experimenter bias) and ecological validity (high, low). Ratings of expert evidence quality and expert credibility were higher for the valid versus missing control group versions only. Internal validity did not influence verdict or ratings of plaintiff credibility and no differences emerged as a function of ecological validity. Expert evidence quality, expert credibility, and plaintiff credibility were positively correlated with verdict. Implications for the scientific reasoning literature and for trials containing psychological science are discussed.
Monitoring Building Deformation with InSAR: Experiments and Validation
Yang, Kui; Yan, Li; Huang, Guoman; Chen, Chu; Wu, Zhengpeng
2016-01-01
Synthetic Aperture Radar Interferometry (InSAR) techniques are increasingly applied for monitoring land subsidence. The advantages of InSAR include high accuracy and the ability to cover large areas; nevertheless, research validating the use of InSAR for building deformation is limited. In this paper, we test the monitoring capability of InSAR in experiments on two landmark buildings, the Bohai Building and the China Theater, located in Tianjin, China. They were selected as real examples for comparing InSAR and leveling approaches to building deformation. Ten TerraSAR-X images spanning half a year were used in Permanent Scatterer InSAR processing. The extracted InSAR results were processed considering the diversity in both direction and spatial distribution, and were compared with true leveling values in both Ordinary Least Squares (OLS) regression and measurement-of-error analyses. The detailed experimental results for the Bohai Building and the China Theater showed a high correlation between InSAR results and leveling values. At the same time, the two Root Mean Square Error (RMSE) indexes had values of approximately 1 mm. These analyses show that millimeter-level accuracy can be achieved by means of the InSAR technique when measuring building deformation. We discuss the differences in accuracy between OLS regression and measurement-of-error analyses, and compare them with the accuracy index of leveling in order to propose InSAR accuracy levels appropriate for monitoring building deformation. After assessing the advantages and limitations of InSAR techniques in monitoring buildings, further applications are evaluated. PMID:27999403
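The validation comparison described, OLS regression plus an error measure between InSAR-derived and levelled deformation at coincident points, reduces to a few lines; the paired values below are hypothetical placeholders, not the Tianjin measurements.

    import numpy as np

    # Hypothetical paired deformation values (mm) at coincident points
    insar    = np.array([-2.1, -1.4, -0.6, 0.3, 1.2, 2.0])
    leveling = np.array([-2.3, -1.2, -0.8, 0.4, 1.0, 2.2])

    slope, intercept = np.polyfit(leveling, insar, 1)   # OLS regression
    r = np.corrcoef(leveling, insar)[0, 1]              # correlation
    rmse = np.sqrt(np.mean((insar - leveling) ** 2))    # measurement of error
    print(f"slope={slope:.2f} intercept={intercept:.2f} "
          f"r={r:.2f} RMSE={rmse:.2f} mm")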
Validation of WIND for a Series of Inlet Flows
NASA Technical Reports Server (NTRS)
Slater, John W.; Abbott, John M.; Cavicchi, Richard H.
2002-01-01
Validation assessments compare WIND CFD simulations to experimental data for a series of inlet flows ranging in Mach number from low subsonic to hypersonic. The validation procedures follow the guidelines of the AIAA. The WIND code performs well in matching the available experimental data. The assessments demonstrate the use of WIND and provide confidence in its use for the analysis of aircraft inlets.
Orsi, Rebecca
2017-02-01
Concept mapping is now a commonly-used technique for articulating and evaluating programmatic outcomes. However, research regarding validity of knowledge and outcomes produced with concept mapping is sparse. The current study describes quantitative validity analyses using a concept mapping dataset. We sought to increase the validity of concept mapping evaluation results by running multiple cluster analysis methods and then using several metrics to choose from among solutions. We present four different clustering methods based on analyses using the R statistical software package: partitioning around medoids (PAM), fuzzy analysis (FANNY), agglomerative nesting (AGNES) and divisive analysis (DIANA). We then used the Dunn and Davies-Bouldin indices to assist in choosing a valid cluster solution for a concept mapping outcomes evaluation. We conclude that the validity of the outcomes map is high, based on the analyses described. Finally, we discuss areas for further concept mapping methods research. Copyright © 2016 Elsevier Ltd. All rights reserved.
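The study itself works in R; purely to illustrate the index-guided selection step, the sketch below runs one hierarchical method (an AGNES analogue) in Python for several cluster counts on synthetic data and scores each solution with a Dunn index (higher is better) and the Davies-Bouldin index (lower is better).

    import numpy as np
    from sklearn.cluster import AgglomerativeClustering
    from sklearn.metrics import davies_bouldin_score, pairwise_distances

    def dunn_index(X, labels):
        # Minimum between-cluster distance / maximum within-cluster diameter
        d = pairwise_distances(X)
        ks = np.unique(labels)
        intra = max(d[np.ix_(labels == k, labels == k)].max() for k in ks)
        inter = min(d[np.ix_(labels == a, labels == b)].min()
                    for a in ks for b in ks if a < b)
        return inter / intra

    rng = np.random.default_rng(1)
    centers = np.array([[0.0, 0.0], [4.0, 0.0], [0.0, 4.0]])
    X = np.vstack([rng.normal(loc=c, scale=0.5, size=(20, 2)) for c in centers])

    for k in range(2, 6):   # compare candidate cluster solutions
        labels = AgglomerativeClustering(n_clusters=k).fit_predict(X)
        print(k, dunn_index(X, labels), davies_bouldin_score(X, labels))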
Experimental Validation of a Thermoelastic Model for SMA Hybrid Composites
NASA Technical Reports Server (NTRS)
Turner, Travis L.
2001-01-01
This study presents results from experimental validation of a recently developed model for predicting the thermomechanical behavior of shape memory alloy hybrid composite (SMAHC) structures, composite structures with an embedded SMA constituent. The model captures the material nonlinearity of the material system with temperature and is capable of modeling constrained, restrained, or free recovery behavior from experimental measurement of fundamental engineering properties. A brief description of the model and analysis procedures is given, followed by an overview of a parallel effort to fabricate and characterize the material system of SMAHC specimens. Static and dynamic experimental configurations for the SMAHC specimens are described and experimental results for thermal post-buckling and random response are presented. Excellent agreement is achieved between the measured and predicted results, fully validating the theoretical model for constrained recovery behavior of SMAHC structures.
Caffeine expectancy: instrument development in the Rasch measurement framework.
Heinz, Adrienne J; Kassel, Jon D; Smith, Everett V
2009-09-01
Although caffeine is the most widely consumed psychoactive drug in the world, the mechanisms associated with consumption are not well understood. Nonetheless, outcome expectancies for caffeine use are thought to underlie caffeine's reinforcing properties. To date, however, there is no available, sufficient measure by which to assess caffeine expectancy. Therefore, the current study sought to develop such a measure employing Rasch measurement models. Unlike traditional measurement development techniques, Rasch analyses afford dynamic and interactive control of the analysis process and generate helpful information to guide instrument construction. A 5-stage developmental process is described, ultimately yielding a 37-item Caffeine Expectancy Questionnaire (CEQ) comprised of 4 factors representing "withdrawal symptoms," "positive effects," "acute negative effects," and "mood effects." Initial evaluation of the CEQ yielded sufficient evidence for various aspects of validity. Although additional research with more heterogeneous samples is required to further assess the measure's reliability and validity, the CEQ demonstrates potential with regard to its utility in experimental laboratory research and clinical application. 2009 APA, all rights reserved.
A pulse tube cryocooler with a cold reservoir
NASA Astrophysics Data System (ADS)
Zhang, X. B.; Zhang, K. H.; Qiu, L. M.; Gan, Z. H.; Shen, X.; Xiang, S. J.
2013-02-01
The phase difference between the pressure wave and the mass flow is decisive for the cooling capacity of regenerative cryocoolers. Unlike the direct phase shifting using a piston or displacer in conventional Stirling or GM cryocoolers, the pulse tube cryocooler (PTC) adjusts the cold-end phase indirectly, owing to the absence of moving parts at the cold end. The present paper proposes and validates, theoretically and experimentally, a novel PTC configuration, termed the cold reservoir PTC, in which a reservoir together with an adjustable orifice is connected to the cold end of the pulse tube. The impedance from the additional orifice at the cold end helps to bring the mass flow into phase with the pressure wave at the cold end. Theoretical analyses with the linear model for the orifice and double-inlet PTCs indicate that the cooling performance can be improved by introducing the cold reservoir. Preliminary experiments with a home-made single-stage GM-type PTC further validated these results, provided the opening of the cold-end orifice is kept small.
Effect of a Diffusion Zone on Fatigue Crack Propagation in Layered FGMs
NASA Astrophysics Data System (ADS)
Hauber, Brett; Brockman, Robert; Paulino, Glaucio
2008-02-01
Research into functionally graded materials (FGMs) has led to advances in our ability to analyze cracks. However, two prominent aspects remain relatively unexplored: 1) development and validation of modeling methods for fatigue crack propagation in FGMs, and 2) experimental validation of stress intensity models in engineered materials such as two phase monolithic and graded materials. This work addresses some of these problems for a limited set of conditions, material systems (e.g., Ti/TiB), and material gradients. Numerical analyses are conducted for single edge notch bend (SENB) specimens. Stress intensity factors are computed using the specialized finite element code I-Franc (Illinois Fracture Analysis Code), which is tailored for both homogeneous and graded materials, as well as Franc2DL and ABAQUS. Crack extension is considered by means of specified crack increments, together with fatigue evaluations to predict crack propagation life. Results will be used to determine linear material gradient parameters that are significant for prediction of fatigue crack growth behavior.
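The fatigue evaluation step, integrating a crack-growth law over specified crack increments to estimate propagation life, can be sketched with a Paris-type law. In a graded material the geometry factor and the Paris constants vary with crack position; they are held constant below for brevity, and all values are hypothetical.

    import numpy as np

    def paris_life(a0, af, C, m, delta_sigma, Y=1.12, n=100000):
        # Cycles to grow a crack from a0 to af (m) under stress range
        # delta_sigma (MPa), with da/dN = C * (dK)^m and
        # dK = Y * delta_sigma * sqrt(pi * a) in MPa*sqrt(m)
        a = np.linspace(a0, af, n)
        dK = Y * delta_sigma * np.sqrt(np.pi * a)
        dN_da = 1.0 / (C * dK ** m)
        # trapezoidal integration of dN/da over the crack increments
        return np.sum(0.5 * (dN_da[1:] + dN_da[:-1]) * np.diff(a))

    # Hypothetical constants loosely representative of a titanium alloy
    print(paris_life(a0=1e-3, af=1e-2, C=1e-11, m=3.0, delta_sigma=200.0))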
Multivariate Analyses of Quality Metrics for Crystal Structures in the PDB Archive.
Shao, Chenghua; Yang, Huanwang; Westbrook, John D; Young, Jasmine Y; Zardecki, Christine; Burley, Stephen K
2017-03-07
Following deployment of an augmented validation system by the Worldwide Protein Data Bank (wwPDB) partnership, the quality of crystal structures entering the PDB has improved. Of significance are improvements in quality measures now prominently displayed in the wwPDB validation report. Comparisons of PDB depositions made before and after introduction of the new reporting system show improvements in quality measures relating to pairwise atom-atom clashes, side-chain torsion angle rotamers, and local agreement between the atomic coordinate structure model and experimental electron density data. These improvements are largely independent of resolution limit and sample molecular weight. No significant improvement in the quality of associated ligands was observed. Principal component analysis revealed that structure quality could be summarized with three measures (Rfree, real-space R factor Z score, and a combined molecular geometry quality metric), which can in turn be reduced to a single overall quality metric readily interpretable by all PDB archive users. Copyright © 2017 Elsevier Ltd. All rights reserved.
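As an illustration of the dimensionality-reduction step described above, the sketch below standardizes a table of per-structure quality metrics and extracts principal components; the column names mirror the metrics discussed, but the data are random placeholders rather than PDB values.

    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.preprocessing import StandardScaler

    # Placeholder matrix: rows = PDB entries, columns = quality metrics
    # (e.g., Rfree, RSR Z score, clashscore, rotamer and Ramachandran outliers)
    rng = np.random.default_rng(3)
    metrics = rng.normal(size=(500, 5))

    pca = PCA(n_components=3).fit(StandardScaler().fit_transform(metrics))
    print(pca.explained_variance_ratio_)   # variance captured per component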
Li, Xiuying; Yang, Qiwei; Bai, Jinping; Xuan, Yali; Wang, Yimin
2015-01-01
Normalization to a reference gene is the method of choice for quantitative reverse transcription-PCR (RT-qPCR) analysis. The stability of reference genes is critical for accurate experimental results and conclusions. We evaluated the expression stability of eight commonly used reference genes in four different types of human mesenchymal stem cells (MSC). Using the geNorm, NormFinder and BestKeeper algorithms, we show that beta-2-microglobulin and peptidyl-prolyl isomerase A were the optimal reference genes for normalizing RT-qPCR data obtained from MSC, whereas the TATA box binding protein was not suitable due to its extensive variability in expression. Our findings emphasize the significance of validating reference genes for qPCR analyses. We offer a short list of reference genes to use for normalization and recommend some commercially available software programs as a rapid approach to validate reference genes. We also demonstrate that β-actin and glyceraldehyde-3-phosphate dehydrogenase, two frequently used reference genes, are not always suitable.
Weller, Kathryn E; Greene, Geoffrey W; Redding, Colleen A; Paiva, Andrea L; Lofgren, Ingrid; Nash, Jessica T; Kobayashi, Hisanori
2014-01-01
To develop and validate an instrument to assess environmentally conscious eating (Green Eating [GE]) behavior (BEH) and GE Transtheoretical Model constructs including Stage of Change (SOC), Decisional Balance (DB), and Self-efficacy (SE). Cross-sectional instrument development survey. Convenience sample (n = 954) of 18- to 24-year-old college students from a northeastern university. The sample was randomly split: (N1) and (N2). N1 was used for exploratory factor analyses using principal components analyses; N2 was used for confirmatory analyses (structural modeling) and reliability analyses (coefficient α). The full sample was used for measurement invariance (multi-group confirmatory analyses) and convergent validity (BEH) and known group validation (DB and SE) by SOC using analysis of variance. Reliable (α > .7), psychometrically sound, and stable measures included 2 correlated 5-item DB subscales (Pros and Cons), 2 correlated SE subscales (school [5 items] and home [3 items]), and a single 6-item BEH scale. Most students (66%) were in Precontemplation and Contemplation SOC. Behavior, DB, and SE scales differed significantly by SOC (P < .001) with moderate to large effect sizes, as predicted by the Transtheoretical Model, which supported the validity of these measures. Successful development and preliminary validation of this 25-item GE instrument provides a basis for assessment as well as development of tailored interventions for college students. Copyright © 2014 Society for Nutrition Education and Behavior. Published by Elsevier Inc. All rights reserved.
Supply Chain Collaboration: Information Sharing in a Tactical Operating Environment
2013-06-01
In this architecture, there are four tiers: Client (Web Application Clients), Presentation (Web-Server), Processing (Application-Server), and Data (Database) ... organization in each period. These data will be collected for analysis ... notes, outstanding deliveries, and inventory. Analyses and Validation: we will perform statistical tests on these data, Pareto analyses, and confirmation.
ERIC Educational Resources Information Center
Gordon, Rachel A.; Fujimoto, Ken; Kaestner, Robert; Korenman, Sanders; Abner, Kristin
2013-01-01
The Early Childhood Environment Rating Scale-Revised (ECERS-R) is widely used to associate child care quality with child development, but its validity for this purpose is not well established. We examined the validity of the ECERS-R using the multidimensional Rasch partial credit model (PCM), factor analyses, and regression analyses with data from…
Good quantification practices of flavours and fragrances by mass spectrometry.
Begnaud, Frédéric; Chaintreau, Alain
2016-10-28
Over the past 15 years, chromatographic techniques with mass spectrometric detection have been increasingly used to monitor the rapidly expanding list of regulated flavour and fragrance ingredients. This trend entails a need for good quantification practices suitable for complex media, especially for multi-analyte methods. In this article, we present the experimental precautions needed to perform the analyses and ways to process the data according to the most recent approaches. This notably includes the identification of analytes during their quantification and method validation, when applied to real matrices, based on accuracy profiles. A brief survey of application studies based on such practices is given. This article is part of the themed issue 'Quantitative mass spectrometry'. © 2016 The Authors.
Li, Hongdong; Zhang, Yang; Guan, Yuanfang; Menon, Rajasree; Omenn, Gilbert S
2017-01-01
Tens of thousands of splice isoforms of proteins have been catalogued as predicted sequences from transcripts in humans and other species. Relatively few have been characterized biochemically or structurally. With the extensive development of protein bioinformatics, the characterization and modeling of isoform features, isoform functions, and isoform-level networks have advanced notably. Here we present applications of the I-TASSER family of algorithms for folding and functional predictions and the IsoFunc, MIsoMine, and Hisonet data resources for isoform-level analyses of network and pathway-based functional predictions and protein-protein interactions. Hopefully, predictions and insights from protein bioinformatics will stimulate many experimental validation studies.
Comparison of Analytical Predictions and Experimental Results for a Dual Brayton Power System
NASA Technical Reports Server (NTRS)
Johnson, Paul
2007-01-01
NASA Glenn Research Center (GRC) contracted Barber-Nichols, Arvada, CO to construct a dual Brayton power conversion system for use as a hardware proof of concept and to validate results from a computational code known as the Closed Cycle System Simulation (CCSS). Initial checkout tests were performed at Barber-Nichols to ready the system for delivery to GRC. This presentation describes the system hardware components and lists the types of checkout tests performed, along with a couple of issues encountered while conducting the tests. A description of the CCSS model is also presented. The checkout tests did not focus on generating data; therefore, no test data or model analyses are presented.
Some important considerations in the development of stress corrosion cracking test methods.
NASA Technical Reports Server (NTRS)
Wei, R. P.; Novak, S. R.; Williams, D. P.
1972-01-01
Discussion of some of the precautions that the development of fracture-mechanics-based test methods for studying stress corrosion cracking requires. Following a review of pertinent analytical fracture mechanics considerations and of basic test methods, the implications for stress corrosion cracking studies of the crack growth kinetics that determine time to failure and life are examined. It is shown that the basic assumptions of the linear-elastic fracture mechanics analyses must be clearly recognized and satisfied in experimentation, and that the effects of incubation and nonsteady-state crack growth must also be properly taken into account in determining the crack growth kinetics, if valid data are to be obtained from fracture-mechanics-based test methods.
Image encryption using a synchronous permutation-diffusion technique
NASA Astrophysics Data System (ADS)
Enayatifar, Rasul; Abdullah, Abdul Hanan; Isnin, Ismail Fauzi; Altameem, Ayman; Lee, Malrey
2017-03-01
Over the past decade, interest in digital image security has increased among scientists. A synchronous permutation and diffusion technique is designed in order to protect gray-level image content while sending it over the Internet. To implement the proposed method, the two-dimensional plain image is converted to one dimension. Afterward, in order to reduce the transmission-processing time, the permutation and diffusion steps for each pixel are performed at the same time. The permutation step uses a chaotic map and deoxyribonucleic acid (DNA) to permute a pixel, while diffusion employs a DNA sequence and a DNA operator to encrypt the pixel. Experimental results and extensive security analyses demonstrate the feasibility and validity of the proposed image encryption method.
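A minimal sketch of a synchronous permutation-diffusion pass driven by a chaotic logistic map: the pixel order is permuted and the ciphertext is chained within the same loop, echoing the single-pass idea above. The XOR keystream stands in for the DNA coding and DNA operators of the paper, and the key value is arbitrary.

    import numpy as np

    def logistic_sequence(n, x0, r=3.99):
        # Chaotic logistic map x <- r * x * (1 - x) as the keystream source
        xs, x = np.empty(n), x0
        for i in range(n):
            x = r * x * (1.0 - x)
            xs[i] = x
        return xs

    def encrypt(img_flat, key=0.3579):
        # img_flat: flattened gray-level image as a 1-D uint8 array
        n = img_flat.size
        chaos = logistic_sequence(n, x0=key)
        perm = np.argsort(chaos)                    # permutation step
        keystream = (chaos * 255).astype(np.uint8)
        out, prev = np.empty(n, dtype=np.uint8), np.uint8(0)
        for i in range(n):
            # diffusion: XOR with keystream and the previous cipher pixel,
            # applied to the permuted pixel in the same pass
            out[i] = img_flat[perm[i]] ^ keystream[i] ^ prev
            prev = out[i]
        return out

    cipher = encrypt(np.arange(256, dtype=np.uint8))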
Analysis on ultrashort-pulse laser ablation for nanoscale film of ceramics
NASA Astrophysics Data System (ADS)
Ho, C. Y.; Tsai, Y. H.; Chiou, Y. J.
2017-06-01
This paper uses the dual-phase-lag model to study the ablation characteristics of femtosecond laser processing of nanometer-sized ceramic films. In ultrafast processes and at ultrasmall sizes, where the two lags occur, the dual-phase-lag model can be applied to analyse the ablation characteristics of femtosecond laser processing of materials. In this work, the ablation rates of nanometer-sized lead zirconate titanate (PZT) ceramics are investigated using the dual-phase-lag model, which is solved by the Laplace transform method. The results obtained in this work are validated against the available experimental data. The effects of material thermal properties on the ablation characteristics of femtosecond laser processing of ceramics are also discussed.
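For reference, the first-order dual-phase-lag constitutive relation commonly used in such analyses links the heat flux and the temperature gradient through two lag times (this is the standard form; the paper's exact formulation is not reproduced in the abstract). In LaTeX notation:

    \mathbf{q} + \tau_q \frac{\partial \mathbf{q}}{\partial t}
      = -k \left( \nabla T + \tau_T \frac{\partial (\nabla T)}{\partial t} \right)

Here \tau_q and \tau_T are the phase lags of the heat flux and of the temperature gradient, respectively, and k is the thermal conductivity; setting both lags to zero recovers classical Fourier conduction.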
Buekenhout, Imke; Leitão, José; Gomes, Ana A
2018-05-24
Month ordering tasks have been used in experimental settings to obtain measures of working memory (WM) capacity in older/clinical groups based solely on their face validity. We sought to assess the appropriateness of using a month ordering task in other contexts, including clinical settings, as a psychometrically sound WM assessment. To this end, we constructed a month ordering task (ucMOT), studied its reliability (internal consistency and temporal stability), and gathered construct-related and criterion-related validity evidence for its use as a WM assessment. The ucMOT proved to be internally consistent and temporally stable, and analyses of the criterion-related validity evidence revealed that its scores predicted the efficiency of language comprehension processes known to depend crucially on WM resources, namely, processes involved in pronoun interpretation. Furthermore, all ucMOT items discriminated between younger and older age groups; the global scores were significantly correlated with scores on well-established WM tasks and presented lower correlations with instruments that evaluate different (although related) processes, namely, inhibition and processing speed. We conclude that the ucMOT possesses solid psychometric properties. Accordingly, we acquired normative data for the Portuguese population, which we present as a regression-based algorithm that yields z scores adjusted for age, gender, and years of formal education. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
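As an illustration of the regression-based normative approach described above, the sketch below shows the usual computation; all coefficients are hypothetical placeholders, not the published ucMOT norms.

```python
def ucmot_z_score(raw, age, gender, edu_years,
                  b0=20.0, b_age=-0.08, b_gender=0.3, b_edu=0.25,
                  rmse=3.2):
    """Hypothetical regression-based norm: z-score adjusted for age,
    gender (coded 0/1), and years of formal education.
    All coefficients and the residual RMSE are illustrative only."""
    expected = b0 + b_age * age + b_gender * gender + b_edu * edu_years
    return (raw - expected) / rmse

# Example: a 70-year-old woman (gender=1) with 12 years of education
print(ucmot_z_score(raw=18, age=70, gender=1, edu_years=12))
```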
NASA Astrophysics Data System (ADS)
Rai, Hari Mohan; Saxena, Shailendra K.; Mishra, Vikash; Kumar, Rajesh; Sagdeo, P. R.
2017-08-01
Magnetodielectric (MD) materials have attracted considerable attention due to their intriguing physics and potential future applications. However, the intrinsic nature of the MD effect is always a major concern in such materials, as the MD effect may also arise from the MR (magnetoresistance) effect. In the present case study, we report an experimental approach to analyse and separate the intrinsic and MR-dominated contributions to the MD phenomenon. For this purpose, polycrystalline samples of LaGa1-xAxO3 (A = Mn/Fe) have been prepared by the solid state reaction method. The purity of their structural phase (orthorhombic) has been validated by refining the X-ray diffraction data. The RTMD (room temperature MD) response has been recorded over a frequency range of 20 Hz to 10 MHz. In order to analyse the intrinsic nature of the MD effect, FDMR (frequency-dependent MR) measurements by means of IS (impedance spectroscopy), together with dc MR measurements in four-probe geometry, have been carried out at RT. A significant RTMD effect has been observed in selected Mn/Fe-doped LaGaO3 (LGO) compositions. The mechanism of the MR-free/intrinsic MD effect observed in Mn/Fe-doped LGO is understood speculatively in terms of a modified cell volume associated with the reorientation/retransformation of spin-coupled Mn/Fe orbitals under an applied magnetic field. The present analysis suggests that, in order to establish the intrinsic/resistive origin of the MD phenomenon, FDMR measurements are more useful than measuring only dc MR or analysing the trends of the magnetic-field-dependent change in the dielectric constant and tanδ. On the basis of the present case study, we propose that IS (FDMR) alone can be used as an effective experimental tool to detect and analyse the resistive and intrinsic contributions to the MD phenomenon.
Validation of the thermal challenge problem using Bayesian Belief Networks.
DOE Office of Scientific and Technical Information (OSTI.GOV)
McFarland, John; Swiler, Laura Painton
The thermal challenge problem has been developed at Sandia National Laboratories as a testbed for demonstrating various types of validation approaches and prediction methods. This report discusses one particular methodology to assess the validity of a computational model given experimental data. This methodology is based on Bayesian Belief Networks (BBNs) and can incorporate uncertainty in experimental measurements, in physical quantities, and in the model itself. The approach uses the prior and posterior distributions of model output to compute a validation metric based on Bayesian hypothesis testing (a Bayes' factor). This report discusses various aspects of the BBN, specifically in the context of the thermal challenge problem. A BBN is developed for a given set of experimental data in a particular experimental configuration. The development of the BBN and the method for "solving" the BBN to obtain the posterior distribution of model output through Markov chain Monte Carlo sampling are discussed in detail. The use of the BBN to compute a Bayes' factor is demonstrated.
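A minimal sketch of a Bayes-factor validation metric of this kind, under simplifying Gaussian assumptions and without the report's BBN machinery: compare the posterior and prior densities of the model output at the observed value (a Savage-Dickey-style ratio estimated from samples).

```python
import numpy as np
from scipy.stats import gaussian_kde

def bayes_factor(prior_samples, posterior_samples, observed):
    """Ratio of posterior to prior density of the model output at the
    observed value; B > 1 means the data moved belief toward the
    observation. A simplified stand-in for the report's metric."""
    prior_pdf = gaussian_kde(prior_samples)
    post_pdf = gaussian_kde(posterior_samples)
    return post_pdf(observed)[0] / prior_pdf(observed)[0]

# Toy example: model output believed ~N(100, 15) a priori; after
# conditioning on experiments (e.g., via MCMC) the spread narrows.
rng = np.random.default_rng(0)
prior = rng.normal(100.0, 15.0, 20000)      # prior model-output samples
posterior = rng.normal(103.0, 4.0, 20000)   # posterior samples (from MCMC)
print(bayes_factor(prior, posterior, observed=102.0))
```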
Michel-Sendis, F.; Gauld, I.; Martinez, J. S.; ...
2017-08-02
SFCOMPO-2.0 is the new release of the Organisation for Economic Co-operation and Development (OECD) Nuclear Energy Agency (NEA) database of experimental assay measurements. These measurements are isotopic concentrations from destructive radiochemical analyses of spent nuclear fuel (SNF) samples. We supplement the measurements with design information for the fuel assembly and fuel rod from which each sample was taken, as well as with relevant information on operating conditions and characteristics of the host reactors. These data are necessary for modeling and simulation of the isotopic evolution of the fuel during irradiation. SFCOMPO-2.0 has been developed and is maintained by the OECD NEA under the guidance of the Expert Group on Assay Data of Spent Nuclear Fuel (EGADSNF), which is part of the NEA Working Party on Nuclear Criticality Safety (WPNCS). Significant efforts aimed at establishing a thorough, reliable, publicly available resource for code validation and safety applications have led to the capture and standardization of experimental data from 750 SNF samples from more than 40 reactors. These efforts have resulted in the creation of the SFCOMPO-2.0 database, which is publicly available from the NEA Data Bank. Our paper describes the new database, and applications of SFCOMPO-2.0 for computer code validation, integral nuclear data benchmarking, and uncertainty analysis in nuclear waste package analysis are briefly illustrated.
Huard, Jérémy; Mueller, Stephanie; Gilles, Ernst D; Klingmüller, Ursula; Klamt, Steffen
2012-01-01
During liver regeneration, quiescent hepatocytes re-enter the cell cycle to proliferate and compensate for lost tissue. Multiple signals including hepatocyte growth factor, epidermal growth factor, tumor necrosis factor α, interleukin-6, insulin and transforming growth factor β orchestrate these responses and are integrated during the G1 phase of the cell cycle. To investigate how these inputs influence DNA synthesis as a measure for proliferation, we established a large-scale integrated logical model connecting multiple signaling pathways and the cell cycle. We constructed our model based upon established literature knowledge, and successively improved and validated its structure using hepatocyte-specific literature as well as experimental DNA synthesis data. Model analyses showed that activation of the mitogen-activated protein kinase and phosphatidylinositol 3-kinase pathways was sufficient and necessary for triggering DNA synthesis. In addition, we identified key species in these pathways that mediate DNA replication. Our model predicted oncogenic mutations that were compared with the COSMIC database, and proposed intervention targets to block hepatocyte growth factor-induced DNA synthesis, which we validated experimentally. Our integrative approach demonstrates that, despite the complexity and size of the underlying interlaced network, logical modeling enables an integrative understanding of signaling-controlled proliferation at the cellular level, and thus can provide intervention strategies for distinct perturbation scenarios at various regulatory levels. PMID:22443451
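To illustrate the logical-modeling approach, here is a toy synchronous Boolean network in Python; the rules are hypothetical stand-ins loosely inspired by the abstract (the published model connects far more pathway nodes to the cell cycle).

```python
# Toy synchronous Boolean network: growth-factor signaling gating
# DNA synthesis. Rules are hypothetical illustrations, not the
# published hepatocyte model.
rules = {
    "MAPK": lambda s: s["HGF"] or s["EGF"],
    "PI3K": lambda s: s["HGF"] or s["Insulin"],
    "TGFb_block": lambda s: not s["TGFb"],
    # DNA synthesis requires both pathways and absence of TGFb arrest
    "DNA_synthesis": lambda s: s["MAPK"] and s["PI3K"] and s["TGFb_block"],
}

def simulate(inputs, steps=5):
    state = {n: False for n in rules}
    state.update(inputs)
    for _ in range(steps):  # synchronous update: all rules see the old state
        state.update({n: f(state) for n, f in rules.items()})
    return state["DNA_synthesis"]

print(simulate({"HGF": True, "EGF": False, "Insulin": True, "TGFb": False}))  # True
print(simulate({"HGF": True, "EGF": True, "Insulin": False, "TGFb": True}))   # False
```

In this toy version, "MAPK and PI3K are sufficient and necessary" appears directly in the gating rule; in the full model it emerges from analysis of the network structure.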
Dubois, F; Depresseux, J C; Bontemps, L; Demaison, L; Keriel, C; Mathieu, J P; Pernin, C; Marti-Batlle, D; Vidal, M; Cuchet, P
1986-01-01
The aim of the present study was to demonstrate that it is possible to estimate the intracellular metabolism of a fatty acid labelled with iodine using external radioactivity measurements. 123I-16-iodo-9-hexadecenoic acid (IHA) was injected close to the coronary arteries of isolated rat hearts perfused according to the Langendorff technique. The time course of the cardiac radioactivity was measured using a NaI crystal coupled to an analyser. The curves obtained were analysed using a four-compartment mathematical model, with the compartments corresponding to the vascular IHA (0), intramyocardial free IHA (1), esterified IHA (2) and iodide (3) pools. Curve analysis using this model demonstrated that, as compared to substrate-free perfusion, the presence of glucose (11 mM) increased IHA storage and decreased its oxidation. These changes were enhanced by the presence of insulin. A comparison of these results with measurements of the radioactivity levels within the various cellular fractions validated our proposed mathematical model. Thus, using only a mathematical analysis of a cardiac time-activity curve, it is possible to obtain quantitative information about IHA distribution in the different intracellular metabolic pathways. This technique is potentially useful for the study of metabolic effects of ischaemia or anoxia, as well as for the study of the influence of various substrates or drugs on IHA metabolism in isolated rat hearts.
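A minimal sketch of a linear four-compartment model of this kind; the rate constants and the exact transfer topology are hypothetical, and the paper's fitted model may differ.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Compartments: 0 vascular IHA, 1 free IHA, 2 esterified IHA, 3 iodide.
# Rate constants (1/min) and topology are illustrative only.
k01, k12, k21, k13, kw = 0.5, 0.15, 0.05, 0.10, 0.20

def model(t, y):
    c0, c1, c2, c3 = y
    return [
        -k01 * c0,                               # vascular pool feeding the tissue
        k01 * c0 - (k12 + k13) * c1 + k21 * c2,  # free IHA: esterified or oxidized
        k12 * c1 - k21 * c2,                     # esterified (storage) pool
        k13 * c1 - kw * c3,                      # iodide from beta-oxidation, washed out
    ]

sol = solve_ivp(model, (0.0, 60.0), [1.0, 0.0, 0.0, 0.0], max_step=0.5)
external_curve = sol.y.sum(axis=0)  # what the external NaI detector records
```

Fitting the rate constants to the measured time-activity curve is what yields the split between storage (esterification) and oxidation, which is how the glucose and insulin effects were quantified.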
NASA Technical Reports Server (NTRS)
Wayner, P. C., Jr.; Plawsky, J. L.; Wong, Harris
2004-01-01
The major accomplishments of the experimental portion of the research were documented in Ling Zheng's doctoral dissertation. Using pentane, he obtained a considerable amount of data on the stability and heat transfer characteristics of an evaporating meniscus. The important points are that experimental equipment to obtain data on the stability and heat transfer characteristics of an evaporating meniscus was built and successfully operated. The data and subsequent analyses were accepted by the Journal of Heat Transfer for publication in 2004 [PU4]. The work was continued by a new graduate student using HFE-7000 [PU3] and then pentane at lower heat fluxes. The pentane results are being analyzed for publication. The experimental techniques are currently being used in our other NASA grant. The oscillation of the contact line observed in the experiments involves evaporation (retraction phase) and spreading. Since both processes occur with finite contact angles, it is important to derive a precise equation for the intermolecular forces (disjoining pressure) valid for non-zero contact angles. This theoretical derivation was accepted for publication by the Journal of Fluid Mechanics [PU5]. The evaporation process near the contact line is complicated, and an idealized micro heat pipe has been proposed to help elucidate the detailed evaporation process [manuscripts in preparation].
Kredel, Ralf; Vater, Christian; Klostermann, André; Hossner, Ernst-Joachim
2017-01-01
Reviewing 60 studies on natural gaze behavior in sports, it becomes clear that, over the last 40 years, the use of eye-tracking devices has considerably increased. Specifically, this review reveals the large variance of methods applied, analyses performed, and measures derived within the field. The results of sub-sample analyses suggest that sports-related eye-tracking research strives, on the one hand, for ecologically valid test settings (i.e., viewing conditions and response modes) and, on the other, for experimental control along with high measurement accuracy (i.e., controlled test conditions with high-frequency eye-trackers linked to algorithmic analyses). To meet both demands, some promising methodological compromises have been proposed, in particular the integration of robust mobile eye-trackers in motion-capture systems. However, as the fundamental trade-off between laboratory and field research cannot be solved by technological means, researchers need to carefully weigh the arguments for one or the other approach by accounting for the respective consequences. Nevertheless, for future research on dynamic gaze behavior in sports, further development of the current mobile eye-tracking methodology seems highly advisable to allow for the acquisition and algorithmic analysis of larger amounts of gaze data and, further, to increase the explanatory power of the derived results.
NASA Astrophysics Data System (ADS)
Shojaeefard, Mohammad Hasan; Khalkhali, Abolfazl; Yarmohammadisatri, Sadegh
2017-06-01
The main purpose of this paper is to propose a new method for designing the Macpherson suspension, based on Sobol indices in terms of the Pearson correlation, which determines the importance of each member for the behaviour of the vehicle suspension. The formulation of the dynamic analysis of the Macpherson suspension system is developed using the suspension members as modified links in order to achieve the desired kinematic behaviour. The mechanical system is replaced with equivalent constrained links, and kinematic laws are then utilised to obtain a new modified geometry of the Macpherson suspension. The equivalent mechanism of the Macpherson suspension increases the speed of analysis and reduces its complexity. The ADAMS/CAR software is utilised to simulate a full vehicle, a Renault Logan car, in order to assess the accuracy of the modified geometry model. An experimental 4-poster test rig is used to validate both the ADAMS/CAR simulation and the analytical geometry model. The Pearson correlation coefficient is applied to analyse the sensitivity of each suspension member with respect to vehicle objective functions such as sprung mass acceleration. In addition, the estimation of the Pearson correlation coefficient between variables is analysed. The results show that the Pearson correlation coefficient is an efficient method for analysing the vehicle suspension, leading to a better design of the Macpherson suspension system.
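A minimal sketch of correlation-based sensitivity ranking of this kind (the member names and response model below are invented for illustration; the paper ranks real suspension members against measured objectives):

```python
import numpy as np

def pearson_sensitivity(X, y):
    """Rank design parameters by Pearson correlation with an objective.
    X: (n_samples, n_params) sampled suspension-member parameters;
    y: (n_samples,) objective, e.g. sprung mass acceleration."""
    Xc = X - X.mean(axis=0)
    yc = y - y.mean()
    return (Xc * yc[:, None]).sum(axis=0) / (
        np.sqrt((Xc**2).sum(axis=0)) * np.sqrt((yc**2).sum())
    )

# Toy example with hypothetical members: spring rate, damper rate, arm length
rng = np.random.default_rng(1)
X = rng.uniform(0.8, 1.2, size=(500, 3))
y = 2.0 * X[:, 0] - 0.5 * X[:, 1] + 0.1 * rng.standard_normal(500)
print(pearson_sensitivity(X, y))  # parameter 0 dominates the response
```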
Liu, Mingying; Jiang, Jing; Han, Xiaojiao; Qiao, Guirong; Zhuo, Renying
2014-01-01
Dendrocalamus latiflorus Munro is widely distributed in subtropical areas and plays a vital role as a valuable natural resource. Transcriptome sequencing of D. latiflorus Munro has been performed, revealing numerous genes, especially those predicted to be unique to D. latiflorus Munro. qRT-PCR has become a feasible approach to uncover gene expression profiles, and the accuracy and reliability of the results obtained depend upon the proper selection of stable reference genes for accurate normalization. Therefore, a set of suitable internal controls should be validated for D. latiflorus Munro. In this report, twelve candidate reference genes were selected and their expression stability was assessed in ten tissue samples and four leaf samples from seedlings and anther-regenerated plants of different ploidy. The PCR amplification efficiency was estimated, and the candidate genes were ranked according to their expression stability using three software packages: geNorm, NormFinder and BestKeeper. GAPDH and EF1α were the most stable genes across different tissues and in all the sample pools, while CYP showed low expression stability. RPL3 performed best among the four leaf samples. The application of the verified reference genes was illustrated by analyzing ferritin and laccase expression profiles among different experimental sets. The analysis revealed the biological variation in ferritin and laccase transcript expression among the tissues studied and the individual plants. geNorm, NormFinder, and BestKeeper analyses recommended different suitable reference gene(s) for normalization according to the experimental set. GAPDH and EF1α had the highest expression stability across different tissues, and RPL3 for the other sample set. This study emphasizes the importance of validating superior reference genes for qRT-PCR analysis to accurately normalize gene expression in D. latiflorus Munro.
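For readers unfamiliar with how such stability rankings are computed, here is a compact re-implementation of the geNorm stability measure M (the published idea, not the tool itself; the toy data are invented):

```python
import numpy as np

def genorm_m(expr):
    """geNorm-style stability measure M for candidate reference genes.
    expr: (n_samples, n_genes) expression levels on a linear scale.
    For each gene, M is the mean standard deviation of its log2 ratios
    against every other candidate; lower M = more stable."""
    logs = np.log2(expr)
    n = logs.shape[1]
    m = np.empty(n)
    for j in range(n):
        ratios = logs[:, [j]] - np.delete(logs, j, axis=1)
        m[j] = ratios.std(axis=0, ddof=1).mean()
    return m

# Toy data: gene 0 stable, gene 2 variable across 8 tissue samples
rng = np.random.default_rng(2)
expr = np.exp2(rng.normal([10, 10, 10], [0.1, 0.3, 1.0], size=(8, 3)))
print(genorm_m(expr))
```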
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nguyen, Ba Nghiep; Fifield, Leonard S.; Wang, Jin
2016-06-01
This project aimed to integrate, optimize, and validate the fiber orientation and length distribution models previously developed and implemented in the Autodesk® Simulation Moldflow® Insight (ASMI) software package for injection-molded long-carbon-fiber (LCF) thermoplastic composite structures. The project was organized into two phases. Phase 1 demonstrated the ability of the advanced ASMI package to predict fiber orientation and length distributions in LCF/polypropylene (PP) and LCF/polyamide-6,6 (PA66) plaques within 15% of experimental results. Phase 2 validated the advanced ASMI package by predicting fiber orientation and length distributions within 15% of experimental results for a complex three-dimensional (3D) Toyota automotive part injection-molded from LCF/PP and LCF/PA66 materials. Work under Phase 2 also included estimates of weight savings and cost impacts for a vehicle system using ASMI, as well as structural analyses of the complex part. The present report summarizes the completion of Phase 1 and Phase 2 work activities and accomplishments achieved by the team comprising Pacific Northwest National Laboratory (PNNL); Purdue University (Purdue); Virginia Polytechnic Institute and State University (Virginia Tech); Autodesk, Inc. (Autodesk); PlastiComp, Inc. (PlastiComp); Toyota Research Institute North America (Toyota); Magna Exteriors and Interiors Corp. (Magna); and the University of Illinois. Figure 1 illustrates the technical approach adopted in this project, which progressed from compounding LCF/PP and LCF/PA66 materials, to process model improvement and implementation, to molding and modeling LCF/PP and LCF/PA66 plaques. The lessons learned from the plaque study and the successful validation of improved process models for fiber orientation and length distributions enabled the project to proceed to Phase 2 to mold, model, and optimize the 3D complex part.
Rossi, Marcel M; Alderson, Jacqueline; El-Sallam, Amar; Dowling, James; Reinbolt, Jeffrey; Donnelly, Cyril J
2016-12-08
The aims of this study were to: (i) establish a new criterion method to validate inertia tensor estimates by setting the experimental angular velocity data of an airborne object as ground truth against simulations run with the estimated tensors, and (ii) test the sensitivity of the simulations to changes in the inertia tensor components. A rigid steel cylinder was covered with reflective kinematic markers and projected through a calibrated motion capture volume. Simulations of the airborne motion were run with two models, using inertia tensors estimated with geometric formulas or the compound pendulum technique. The deviation angles between experimental (ground truth) and simulated angular velocity vectors and the root mean squared deviation angle were computed for every simulation. Monte Carlo analyses were performed to assess the sensitivity of the simulations to changes in the magnitude of the principal moments of inertia within ±10% and to changes in the orientation of the principal axes of inertia within ±10° (of the geometry-based inertia tensor). Root mean squared deviation angles ranged between 2.9° and 4.3° for the inertia tensor estimated geometrically, and between 11.7° and 15.2° for the compound pendulum values. Errors up to 10% in the magnitude of the principal moments of inertia yielded root mean squared deviation angles ranging between 3.2° and 6.6°, and between 5.5° and 7.9° when combined with errors of 10° in the orientation of the principal axes of inertia. The proposed technique can effectively validate inertia tensors from novel estimation methods of body segment inertial parameters. The orientation of the principal axes of inertia should not be neglected when modelling human/animal mechanics. Copyright © 2016 Elsevier Ltd. All rights reserved.
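The validation metric itself is simple to state in code; the sketch below computes the per-sample deviation angle and its RMS value from two angular velocity time series (toy data, not the study's measurements):

```python
import numpy as np

def deviation_angles(omega_exp, omega_sim):
    """Angle (deg) between experimental and simulated angular velocity
    vectors at each time sample, plus the RMS deviation angle used as
    the validation metric. Both arrays have shape (n, 3)."""
    dot = (omega_exp * omega_sim).sum(axis=1)
    norms = np.linalg.norm(omega_exp, axis=1) * np.linalg.norm(omega_sim, axis=1)
    ang = np.degrees(np.arccos(np.clip(dot / norms, -1.0, 1.0)))
    return ang, np.sqrt((ang**2).mean())

# Toy check with a small constant misalignment
t = np.linspace(0, 0.5, 100)
omega_exp = np.stack([np.cos(10 * t), np.sin(10 * t), np.ones_like(t)], axis=1)
ang, rmsd = deviation_angles(omega_exp, omega_exp + [0.02, 0.0, -0.01])
print(rmsd)
```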
NASA Technical Reports Server (NTRS)
Schoenenberger, Mark; VanNorman, John; Rhode, Matthew; Paulson, John
2013-01-01
On August 5, 2012, the Mars Science Laboratory (MSL) entry capsule successfully entered Mars' atmosphere and landed the Curiosity rover in Gale Crater. The capsule used a reaction control system (RCS) consisting of four pairs of hydrazine thrusters to fly a guided entry. The RCS provided bank control to fly along a flight path commanded by an onboard computer and also damped unwanted rates due to atmospheric disturbances and any dynamic instabilities of the capsule. A preliminary assessment of MSL's flight data from entry showed that the capsule flew much as predicted. This paper describes how the MSL aerodynamics team used engineering analyses, computational codes and wind tunnel testing in concert to develop the RCS system and certify it for flight. Over the course of MSL's development, the RCS configuration underwent a number of design iterations to accommodate mechanical constraints, aeroheating concerns and excessive aero/RCS interactions. A brief overview of the MSL RCS configuration design evolution is provided, followed by a brief description of how the computational predictions of RCS jet interactions were validated. The primary work to certify that the RCS interactions were acceptable for flight centered on validating computational predictions at hypersonic speeds. A comparison of computational fluid dynamics (CFD) predictions to wind tunnel force and moment data gathered in the NASA Langley 31-Inch Mach 10 Tunnel was the linchpin for validating the CFD codes used to predict aero/RCS interactions. Using the CFD predictions and experimental data, an interaction model was developed for Monte Carlo analyses using 6-degree-of-freedom trajectory simulation. The interaction model used in the flight simulation is presented.
Gan, Han Ming; Dailey, Lucas K.; Halliday, Nigel; Williams, Paul; Hudson, André O.
2016-01-01
Background: Members of the genus Novosphingobium have been isolated from a variety of environmental niches. Although genomic analyses have suggested the presence of genes associated with quorum sensing signal production, e.g., N-acyl-homoserine lactone (AHL) synthase (luxI) homologs, in various Novosphingobium species, to date no luxI homologs have been experimentally validated. Methods: In this study, we report the draft genome of the AHL-producing bacterium Novosphingobium subterraneum DSM 12447 and validate the functions of predicted luxI homologs from the bacterium through inducible heterologous expression in Agrobacterium tumefaciens strain NTL4. We developed a two-dimensional thin layer chromatography bioassay and used LC-ESI MS/MS analyses to separate, detect and identify the AHL signals produced by the N. subterraneum DSM 12447 strain. Results: Three predicted luxI homologs were annotated to the locus tags NJ75_2841 (NovINsub1), NJ75_2498 (NovINsub2), and NJ75_4146 (NovINsub3). Inducible heterologous expression of each luxI homolog, followed by LC-ESI MS/MS and two-dimensional reverse phase thin layer chromatography bioassays with bioluminescence CCD camera imaging, indicated that the three LuxI homologs are able to produce a variety of medium-chain-length AHL compounds. New insights into LuxI phylogeny were also gleaned, as inferred by Bayesian analysis. Discussion: This study significantly adds to our current understanding of quorum sensing in the genus Novosphingobium and provides the framework for future characterization of the phylogenetically interesting LuxI homologs from members of the genus Novosphingobium and, more generally, the family Sphingomonadaceae. PMID:27635318
Validation of the Soil Moisture Active Passive mission using USDA-ARS experimental watersheds
USDA-ARS?s Scientific Manuscript database
The calibration and validation program of the Soil Moisture Active Passive mission (SMAP) relies upon an international cooperative of in situ networks to provide ground truth references across a variety of landscapes. The USDA Agricultural Research Service operates several experimental watersheds wh...
Fischer, Kenneth J; Johnson, Joshua E; Waller, Alexander J; McIff, Terence E; Toby, E Bruce; Bilgen, Mehmet
2011-10-01
The objective of this study was to validate the MRI-based joint contact modeling methodology in the radiocarpal joints by comparison of model results with invasive specimen-specific radiocarpal contact measurements from four cadaver experiments. We used a single validation criterion for multiple outcome measures to characterize the utility and overall validity of the modeling approach. For each experiment, a Pressurex film and a Tekscan sensor were sequentially placed into the radiocarpal joints during simulated grasp. Computer models were constructed based on MRI visualization of the cadaver specimens without load. Images were also acquired during the loaded configuration used with the direct experimental measurements. Geometric surface models of the radius, scaphoid and lunate (including cartilage) were constructed from the images acquired without the load. The carpal bone motions from the unloaded state to the loaded state were determined using a series of 3D image registrations. Cartilage thickness was assumed uniform at 1.0 mm with an effective compressive modulus of 4 MPa. Validation was based on experimental versus model contact area, contact force, average contact pressure and peak contact pressure for the radioscaphoid and radiolunate articulations. Contact area was also measured directly from images acquired under load and compared to the experimental and model data. Qualitatively, there was good correspondence between the MRI-based model data and experimental data, with consistent relative size, shape and location of radioscaphoid and radiolunate contact regions. Quantitative data from the model generally compared well with the experimental data for all specimens. Contact area from the MRI-based model was very similar to the contact area measured directly from the images. For all outcome measures except average and peak pressures, at least two specimen models met the validation criteria with respect to experimental measurements for both articulations. Only the model for one specimen met the validation criteria for average and peak pressure of both articulations; however the experimental measures for peak pressure also exhibited high variability. MRI-based modeling can reliably be used for evaluating the contact area and contact force with similar confidence as in currently available experimental techniques. Average contact pressure, and peak contact pressure were more variable from all measurement techniques, and these measures from MRI-based modeling should be used with some caution.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jiang, J.; Yuan, B.; Jin, M.
2012-07-01
Three-dimensional neutronics optimization calculations were performed to analyse the Tritium Breeding Ratio (TBR) and maximum average Power Density (PDmax) in the blanket of a helium-cooled multi-functional experimental fusion-fission hybrid reactor named FDS-MFX (Fusion-Driven hybrid System, Multi-Functional eXperimental). Three-stage tests will be carried out successively, in which a tritium breeding blanket, a uranium-fueled blanket and a spent-fuel-fueled blanket will be utilized, respectively. The main goal of the FDS-MFX blanket is to achieve a PDmax of about 100 MW/m³ with tritium self-sufficiency (TBR ≥ 1.05), based on the second-stage test with the uranium-fueled blanket, in order to check and validate demonstration reactor blanket technologies built on viable fusion and fission technologies. Four different uranium fuels were considered in evaluating PDmax in the subcritical blanket: (i) natural uranium, (ii) 3.2% enriched uranium, (iii) 19.75% enriched uranium, and (iv) 64.4% enriched uranium carbide. These calculations and analyses were performed using the home-developed code VisualBUS and the Hybrid Evaluated Nuclear Data Library (HENDL). The results showed that the performance of the blanket loaded with 64.4% enriched uranium carbide was the most attractive: it could effectively achieve tritium self-sufficiency (TBR ≥ 1.05) and a high maximum average power density (~100 MW/m³) with a 235U loading of about 1 ton. (authors)
Effects of seawater acidification on gene expression: resolving broader-scale trends in sea urchins.
Evans, Tyler G; Watson-Wynn, Priscilla
2014-06-01
Sea urchins are ecologically and economically important calcifying organisms threatened by acidification of the global ocean caused by anthropogenic CO2 emissions. Propelled by the sequencing of the purple sea urchin (Strongylocentrotus purpuratus) genome, profiling changes in gene expression during exposure to high pCO2 seawater has emerged as a powerful and increasingly common method to infer the response of urchins to ocean change. However, analyses of gene expression are sensitive to experimental methodology, and comparisons between studies of genes regulated by ocean acidification are most often made in the context of major caveats. Here we perform meta-analyses as a means of minimizing experimental discrepancies and resolving broader-scale trends regarding the effects of ocean acidification on gene expression in urchins. Analyses across eight studies and four urchin species largely support prevailing hypotheses about the impact of ocean acidification on marine calcifiers. The predominant expression pattern involved the down-regulation of genes within energy-producing pathways, a clear indication of metabolic depression. Genes with functions in ion transport were significantly over-represented and are most plausibly contributing to intracellular pH regulation. Expression profiles provided extensive evidence for an impact on biomineralization, epitomized by the down-regulation of seven spicule matrix proteins. In contrast, expression profiles provided limited evidence for CO2-mediated developmental delay or induction of a cellular stress response. Congruence between studies of gene expression and the ocean acidification literature in general validates the accuracy of gene expression in predicting the consequences of ocean change and justifies its continued use in future studies. © 2014 Marine Biological Laboratory.
Geist, Rebecca E; DuBois, Chase H; Nichols, Timothy C; Caughey, Melissa C; Merricks, Elizabeth P; Raymer, Robin; Gallippi, Caterina M
2016-09-01
Acoustic radiation force impulse (ARFI) Surveillance of Subcutaneous Hemorrhage (ASSH) has previously been demonstrated to differentiate bleeding phenotypes and responses to therapy in dogs and humans, but to date the method has lacked experimental validation. This work explores experimental validation of ASSH in a poroelastic tissue-mimic and in vivo in dogs. The experimental design exploits calibrated flow rates and infusion durations of evaporated milk in tofu or heparinized autologous blood in dogs. The validation approach enables controlled comparisons of the ASSH-derived bleeding rate (BR) and time to hemostasis (TTH) metrics. In tissue-mimicking experiments, halving the calibrated flow rate yielded ASSH-derived BRs that decreased by 44% to 48%. Furthermore, for calibrated flow durations of 5.0 minutes and 7.0 minutes, the average ASSH-derived TTH was 5.2 minutes and 7.0 minutes, respectively, with ASSH predicting the correct TTH in 78% of trials. In dogs undergoing calibrated autologous blood infusion, ASSH measured a 3-minute increase in TTH, corresponding to the same increase in the calibrated flow duration. For a measured 5% decrease in autologous infusion flow rate, ASSH detected a 7% decrease in BR. These tissue-mimicking and in vivo preclinical experimental validation studies suggest that the ASSH BR and TTH measures reflect bleeding dynamics. © The Author(s) 2015.
Technical Data to Justify Full Burnup Credit in Criticality Safety Licensing Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Enercon Services, Inc.
2011-03-14
Enercon Services, Inc. (ENERCON) was requested under Task Order No. 2 to identify scientific and technical data needed to benchmark and justify Full Burnup Credit, which adds 16 fission products and 4 minor actinides to Actinide-Only burnup credit. The historical perspective for Full Burnup Credit is discussed, and interviews of organizations participating in burnup credit activities are summarized as a basis for identifying additional data needs and making recommendations. Input from burnup credit participants representing two segments of the commercial nuclear industry is provided. First, the Electric Power Research Institute (EPRI) has been very active in the development of Full Burnup Credit, representing the interests of nuclear utilities in achieving capacity gains for storage and transport casks. EPRI and its utility customers are interested in a swift resolution of the validation issues that are delaying the implementation of Full Burnup Credit [EPRI 2010b]. Second, used nuclear fuel storage and transportation cask vendors favor improving burnup credit beyond Actinide-Only burnup credit, although their discussion of specific burnup credit achievements and data needs was limited, citing business-sensitive and technical proprietary concerns. While cask vendor proprietary items are not specifically identified in this report, the needs of all nuclear industry participants are reflected in the conclusions and recommendations of this report. In addition, Oak Ridge National Laboratory (ORNL) and Sandia National Laboratory (SNL) were interviewed for their input on additional data needs to achieve Full Burnup Credit. ORNL was very open to discussions of Full Burnup Credit, with several telecons and a visit by ENERCON to ORNL. For many years, ORNL has provided extensive support to the NRC regarding burnup credit in all of its forms. Discussions with ORNL focused on potential resolutions to the validation issues for the use of fission products. SNL was helpful in furthering ENERCON's understanding of the difficult issues related to obtaining and analyzing additional cross section test data to support Full Burnup Credit. A PIRT (Phenomena Identification and Ranking Table) analysis was performed by ENERCON to evaluate the costs and benefits of acquiring different types of nuclear data in support of Full Burnup Credit. A PIRT exercise is a formal expert elicitation process whose final output is the ranking tables. The PIRT analysis (Table 7-4: Results of PIRT Evaluation) showed that the acquisition of additional Actinide-Only experimental data, although beneficial, is associated with high cost and is not necessarily needed. The conclusion was that the existing Radiochemical Assay (RCA) data plus the French Haut Taux de Combustion (HTC) and handbook Laboratory Critical Experiment (LCE) data provide adequate benchmark validation for Actinide-Only Burnup Credit. The PIRT analysis indicated that the costs and schedule to obtain sufficient additional experimental data to support the addition of 16 fission products to Actinide-Only Burnup Credit to produce Full Burnup Credit are quite substantial. ENERCON estimates the cost to be $50M to $100M with a schedule of five or more years. The PIRT analysis highlights another option for fission product burnup credit, which is the application of computer-based uncertainty analyses (S/U - Sensitivity/Uncertainty methodologies), confirmed by the limited experimental data that are already available.
S/U analyses essentially transform the cross section uncertainty information contained in the cross section libraries into a reactivity bias and uncertainty. Recent work by ORNL and EPRI has shown that a methodology to support Full Burnup Credit is possible using a combination of traditional RCA and LCE validation plus S/U validation for fission product isotopics and cross sections. Further, the most recent cross section data (ENDF/B-VII) can be incorporated into the burnup credit codes at a reasonable cost compared to the acquisition of equivalent experimental data. ENERCON concludes that, even with the cost of updating code data libraries, the use of S/U analysis methodologies could be accomplished on a shorter schedule and at lower cost than the gathering of sufficient experimental data. ENERCON estimates the costs of an updated S/U computer code and data suite at $5M to $10M with a schedule of two to three years. Recent ORNL analyses using the S/U analysis method show that the bias and uncertainty values for fission product cross sections are smaller than previously expected. This result is confirmed by a similar EPRI approach using different data and computer codes. ENERCON also found that some issues regarding the implementation of burnup credit appear to have been successfully resolved, especially the axial burnup profile issue and the depletion parameter issue. These issues were resolved through data gathering activities at the Yucca Mountain Project and ORNL.
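For context, S/U methodologies of this kind rest on the standard first-order "sandwich" propagation of nuclear data covariances into a reactivity uncertainty (a textbook statement of the approach, not ENERCON's specific implementation):

```latex
\sigma_{k}^{2} \;=\; \mathbf{S}^{\mathsf T}\,\mathbf{C}_{\alpha\alpha}\,\mathbf{S},
\qquad
S_{i} \;=\; \frac{\alpha_{i}}{k_{\mathrm{eff}}}\,
\frac{\partial k_{\mathrm{eff}}}{\partial \alpha_{i}},
```

where S collects the relative sensitivities of k_eff to the nuclear data parameters α (e.g., fission product cross sections) and C_αα is their relative covariance matrix; the resulting σ_k feeds the bias and uncertainty terms discussed above.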
An experimental phylogeny to benchmark ancestral sequence reconstruction
Randall, Ryan N.; Radford, Caelan E.; Roof, Kelsey A.; Natarajan, Divya K.; Gaucher, Eric A.
2016-01-01
Ancestral sequence reconstruction (ASR) is a still-burgeoning method that has revealed many key mechanisms of molecular evolution. One criticism of the approach is the inability to validate its algorithms within a biological context as opposed to a computer simulation. Here we build an experimental phylogeny using the gene of a single red fluorescent protein to address this criticism. The evolved phylogeny consists of 19 operational taxonomic units (leaves) and 17 ancestral bifurcations (nodes) that display a wide variety of fluorescent phenotypes. The 19 leaves then serve as 'modern' sequences that we subject to ASR analyses using various algorithms and benchmark against the known ancestral genotypes and ancestral phenotypes. We confirm computer simulations showing that all algorithms infer ancient sequences with high accuracy, yet we also reveal wide variation in the phenotypes encoded by incorrectly inferred sequences. Specifically, Bayesian methods incorporating rate variation significantly outperform the maximum parsimony criterion in phenotypic accuracy. Subsampling of extant sequences had a minor effect on the inference of ancestral sequences. PMID:27628687
Overview of Glenn Mechanical Components Branch Research
NASA Astrophysics Data System (ADS)
Zakrajsek, James
2002-09-01
Mr. James Zakrajsek, chief of the Mechanical Components Branch, gave an overview of research conducted by the branch. Branch members perform basic research on mechanical components and systems, including gears and bearings, turbine seals, structural and thermal barrier seals, and space mechanisms. The research is focused on propulsion systems for present and advanced aerospace vehicles. For rotorcraft and conventional aircraft, we conduct research to develop technology needed to enable the design of low noise, ultra safe geared drive systems. We develop and validate analytical models for gear crack propagation, gear dynamics and noise, gear diagnostics, bearing dynamics, and thermal analyses of gear systems using experimental data from various component test rigs. In seal research we develop and test advanced turbine seal concepts to increase efficiency and durability of turbine engines. We perform experimental and analytical research to develop advanced thermal barrier seals and structural seals for current and next generation space vehicles. Our space mechanisms research involves fundamental investigation of lubricants, materials, components and mechanisms for deep space and planetary environments.
Data Assimilation - Advances and Applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Williams, Brian J.
2014-07-30
This presentation provides an overview of data assimilation (model calibration) for complex computer experiments. Calibration refers to the process of probabilistically constraining uncertain physics/engineering model inputs to be consistent with observed experimental data. An initial probability distribution for these parameters is updated using the experimental information. Utilization of surrogate models and empirical adjustment for model form error in code calibration form the basis for the statistical methodology considered. The role of probabilistic code calibration in supporting code validation is discussed. Incorporation of model form uncertainty in rigorous uncertainty quantification (UQ) analyses is also addressed. Design criteria used within a batch sequential design algorithm are introduced for efficiently achieving predictive maturity and improved code calibration. Predictive maturity refers to obtaining stable predictive inference with calibrated computer codes. These approaches allow for augmentation of initial experiment designs for collecting new physical data. A standard framework for data assimilation is presented, and techniques for updating the posterior distribution of the state variables based on particle filtering and the ensemble Kalman filter are introduced.
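A minimal sketch of the ensemble Kalman filter analysis step mentioned above, in its standard stochastic (perturbed-observation) textbook form rather than the presentation's specific algorithm:

```python
import numpy as np

def enkf_update(ensemble, obs, obs_std, H):
    """Stochastic EnKF analysis step.
    ensemble: (n_state, n_members) forecast ensemble
    obs: (n_obs,) observation vector; obs_std: observation error std
    H: (n_obs, n_state) linear observation operator."""
    n_state, n_mem = ensemble.shape
    rng = np.random.default_rng(0)
    Xp = ensemble - ensemble.mean(axis=1, keepdims=True)   # state anomalies
    Yp = H @ Xp                                            # observed anomalies
    R = (obs_std**2) * np.eye(len(obs))
    K = (Xp @ Yp.T / (n_mem - 1)) @ np.linalg.inv(Yp @ Yp.T / (n_mem - 1) + R)
    # perturbed observations keep the analysis spread statistically consistent
    obs_pert = obs[:, None] + rng.normal(0.0, obs_std, (len(obs), n_mem))
    return ensemble + K @ (obs_pert - H @ ensemble)

# Toy example: 2-state system, observe the first state only
ens = np.random.default_rng(1).normal([[1.0], [0.0]], 0.5, (2, 100))
H = np.array([[1.0, 0.0]])
analysis = enkf_update(ens, np.array([1.4]), 0.1, H)
print(analysis.mean(axis=1))
```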
A zero torsional stiffness twist morphing blade as a wind turbine load alleviation device
NASA Astrophysics Data System (ADS)
Lachenal, X.; Daynes, S.; Weaver, P. M.
2013-06-01
This paper presents the design, analysis and realization of a zero stiffness twist morphing wind turbine blade. The morphing blade is designed to actively twist as a means of alleviating the gust loads which reduce the fatigue life of wind turbine blades. The morphing structure exploits an elastic strain energy balance within the blade to enable large twisting deformations with modest actuation requirements. While twist is introduced using the warping of the blade skin, internal pre-stressed members ensure that a constant strain energy balance is achieved throughout the deformation, resulting in a zero torsional stiffness structure. The torsional stability of the morphing blade is characterized by analysing the elastic strain energy in the device. Analytical models of the skin, the pre-stressed components and the complete blade are compared to their respective finite element models as well as experimental results. The load alleviation potential of the adaptive structure is quantified using a two-dimensional steady flow aerodynamic model which is experimentally validated with wind tunnel measurements.
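The zero-stiffness condition can be stated compactly (a generic formulation consistent with the abstract, not the paper's exact notation): if the skin and the pre-stressed members exchange strain energy so that the total is constant in twist, both the restoring moment and the torsional stiffness vanish.

```latex
U_{\text{total}}(\theta) = U_{\text{skin}}(\theta) + U_{\text{prestress}}(\theta) = \text{const}
\;\Longrightarrow\;
M(\theta) = \frac{\mathrm{d}U_{\text{total}}}{\mathrm{d}\theta} = 0,
\qquad
k_\theta = \frac{\mathrm{d}M}{\mathrm{d}\theta} = 0 .
```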
NASA Astrophysics Data System (ADS)
Bruen, Thomas; Marco, James
2016-04-01
Variations in cell properties are unavoidable and can be caused by manufacturing tolerances and usage conditions. As a result of this, cells connected in series may have different voltages and states of charge that limit the energy and power capability of the complete battery pack. Methods of removing this energy imbalance have been extensively reported within literature. However, there has been little discussion around the effect that such variation has when cells are connected electrically in parallel. This work aims to explore the impact of connecting cells, with varied properties, in parallel and the issues regarding energy imbalance and battery management that may arise. This has been achieved through analysing experimental data and a validated model. The main results from this study highlight that significant differences in current flow can occur between cells within a parallel stack that will affect how the cells age and the temperature distribution within the battery assembly.
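A minimal sketch of the imbalance mechanism the paper studies: cells in parallel share one terminal voltage, so any mismatch in open-circuit voltage or internal resistance redistributes the current. This purely resistive snapshot (no dynamics or aging) uses invented cell values.

```python
import numpy as np

def parallel_currents(ocv, r_int, i_total):
    """Instantaneous current split between parallel-connected cells.
    Each cell k obeys i_k = (ocv_k - V) / r_k with a common terminal
    voltage V and sum(i_k) = i_total; solve for V, then the split."""
    ocv, r_int = np.asarray(ocv, float), np.asarray(r_int, float)
    v = (np.sum(ocv / r_int) - i_total) / np.sum(1.0 / r_int)
    return (ocv - v) / r_int

# Two nominally 3.7 V cells with a 20% resistance mismatch, 10 A load:
# the lower-resistance cell carries well over half the current.
print(parallel_currents([3.70, 3.68], [0.010, 0.012], i_total=10.0))
```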
Automated Reconstruction of Three-Dimensional Fish Motion, Forces, and Torques
Voesenek, Cees J.; Pieters, Remco P. M.; van Leeuwen, Johan L.
2016-01-01
Fish can move freely through the water column and make complex three-dimensional motions to explore their environment, escape or feed. Nevertheless, the majority of swimming studies are currently limited to two-dimensional analyses. Accurate experimental quantification of changes in body shape, position and orientation (swimming kinematics) in three dimensions is therefore essential to advance biomechanical research on fish swimming. Here, we present a validated method that automatically tracks a swimming fish in three dimensions from multi-camera high-speed video. We use an optimisation procedure to fit a parameterised, morphology-based fish model to each set of video images. This results in a time sequence of position, orientation and body curvature. We post-process these data to derive additional kinematic parameters (e.g. velocities, accelerations) and propose an inverse-dynamics method to compute the resultant forces and torques during swimming. The presented method for quantifying 3D fish motion paves the way for future analyses of swimming biomechanics. PMID:26752597
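A minimal sketch of the inverse-dynamics idea: differentiate the tracked centre-of-mass trajectory twice and apply Newton's second law. This simplified version ignores added-mass effects and rotational dynamics, which a full treatment would include.

```python
import numpy as np

def resultant_force(com_positions, mass, dt, g=np.array([0.0, 0.0, -9.81])):
    """Resultant external (non-gravitational) force on the fish from the
    second derivative of its centre-of-mass trajectory: F = m*(a - g).
    com_positions: (n, 3) tracked positions sampled every dt seconds."""
    acc = np.gradient(np.gradient(com_positions, dt, axis=0), dt, axis=0)
    return mass * (acc - g)

# Toy trajectory sampled at 1 kHz: constant 1 m/s^2 acceleration in x
t = np.arange(0, 0.2, 0.001)[:, None]
com = np.hstack([0.5 * t**2, np.zeros_like(t), np.zeros_like(t)])
print(resultant_force(com, mass=0.005, dt=0.001)[50])  # ~[0.005, 0, 0.049] N
```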
Fatigue Damage of Collagenous Tissues: Experiment, Modeling and Simulation Studies
Martin, Caitlin; Sun, Wei
2017-01-01
Mechanical fatigue damage is a critical issue for soft tissues and tissue-derived materials, particularly for musculoskeletal and cardiovascular applications; yet, our understanding of the fatigue damage process is incomplete. Soft tissue fatigue experiments are often difficult and time-consuming to perform, which has hindered progress in this area. However, the recent development of soft-tissue fatigue-damage constitutive models has enabled simulation-based fatigue analyses of tissues under various conditions. Computational simulations facilitate highly controlled and quantitative analyses to study the distinct effects of various loading conditions and design features on tissue durability; thus, they are advantageous over complex fatigue experiments. Although significant work to calibrate the constitutive models from fatigue experiments and to validate predictability remains, further development in these areas will add to our knowledge of soft-tissue fatigue damage and will facilitate the design of durable treatments and devices. In this review, the experimental, modeling, and simulation efforts to study collagenous tissue fatigue damage are summarized and critically assessed. PMID:25955007
Rolling Bearing Fault Diagnosis Based on an Improved HTT Transform
Tang, Guiji; Tian, Tian; Zhou, Chong
2018-01-01
When rolling bearing failure occurs, vibration signals generally contain different signal components, such as impulsive fault feature signals, background noise and harmonic interference signals. One of the most challenging aspects of rolling bearing fault diagnosis is how to inhibit noise and harmonic interference signals, while enhancing impulsive fault feature signals. This paper presents a novel bearing fault diagnosis method, namely an improved Hilbert time–time (IHTT) transform, by combining a Hilbert time–time (HTT) transform with principal component analysis (PCA). Firstly, the HTT transform was performed on vibration signals to derive a HTT transform matrix. Then, PCA was employed to de-noise the HTT transform matrix in order to improve the robustness of the HTT transform. Finally, the diagonal time series of the de-noised HTT transform matrix was extracted as the enhanced impulsive fault feature signal and the contained fault characteristic information was identified through further analyses of amplitude and envelope spectrums. Both simulated and experimental analyses validated the superiority of the presented method for detecting bearing failures. PMID:29662013
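The PCA de-noising step can be sketched generically as a truncated singular value decomposition of the transform matrix, keeping only the leading components; the IHTT details may differ, and the toy matrix below is invented.

```python
import numpy as np

def pca_denoise(matrix, n_components):
    """Denoise a (time x time) transform matrix by keeping only the
    leading principal components (truncated SVD)."""
    mean = matrix.mean(axis=0, keepdims=True)
    U, s, Vt = np.linalg.svd(matrix - mean, full_matrices=False)
    s[n_components:] = 0.0
    return U @ np.diag(s) @ Vt + mean

# Toy HTT-like matrix: low-rank periodic structure plus noise
rng = np.random.default_rng(3)
t = np.linspace(0, 1, 200)
clean = np.outer(np.sin(2 * np.pi * 5 * t), np.sin(2 * np.pi * 5 * t))
noisy = clean + 0.3 * rng.standard_normal(clean.shape)
denoised = pca_denoise(noisy, n_components=2)
feature = np.diag(denoised)  # diagonal time series as the enhanced fault signal
```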
Further Validation of a CFD Code for Calculating the Performance of Two-Stage Light Gas Guns
NASA Technical Reports Server (NTRS)
Bogdanoff, David W.
2017-01-01
Earlier validations of a higher-order Godunov code for modeling the performance of two-stage light gas guns are reviewed. These validation comparisons were made between code predictions and experimental data from the NASA Ames 1.5" and 0.28" guns and covered muzzle velocities of 6.5 to 7.2 km/s. In the present report, five more series of code validation comparisons, involving experimental data from the Ames 0.22" (1.28" pump tube diameter), 0.28", 0.50", 1.00" and 1.50" guns, are presented. The total muzzle velocity range of the validation data presented herein is 3 to 11.3 km/s. The agreement between the experimental data and CFD results is judged to be very good. Muzzle velocities were predicted within 0.35 km/s for 74% of the cases studied, with maximum differences of 0.5 km/s; for 4 of the 50 cases, differences were 0.5 to 0.7 km/s.
Ramayo-Caldas, Yuliaxis; Ballester, Maria; Fortes, Marina R S; Esteve-Codina, Anna; Castelló, Anna; Noguera, Jose L; Fernández, Ana I; Pérez-Enciso, Miguel; Reverter, Antonio; Folch, Josep M
2014-03-26
Fatty acids (FA) play a critical role in energy homeostasis and metabolic diseases; in the context of livestock species, their profile also impacts meat quality for healthy human consumption. Molecular pathways controlling lipid metabolism are highly interconnected and are not fully understood. Elucidating these molecular processes will aid technological development towards improvement of pork meat quality and increase knowledge of FA metabolism, underpinning metabolic diseases in humans. The results from genome-wide association studies (GWAS) across 15 phenotypes were subjected to an Association Weight Matrix (AWM) approach to predict a network of 1,096 genes related to intramuscular FA composition in pigs. To identify the key regulators of FA metabolism, we focused on the minimal set of transcription factors (TF) that explored the majority of the network topology. Pathway and network analyses pointed towards a trio of TF as key regulators of FA metabolism: NCOA2, FHL2 and EP300. Promoter sequence analyses confirmed that these TF have binding sites for some well-known regulators of lipid and carbohydrate metabolism. For the first time in a non-model species, some of the co-associations observed at the genetic level were validated through co-expression at the transcriptomic level, based on real-time PCR of 40 genes in adipose tissue and a further 55 genes in liver. In particular, liver expression of NCOA2 and EP300 differed between pig breeds (Iberian and Landrace) that are extreme in terms of fat deposition. Highly clustered co-expression networks were observed in both liver and adipose tissue. EP300 and NCOA2 showed centrality parameters above average in both networks. Over all genes, co-expression analyses confirmed 28.9% of the AWM-predicted gene-gene interactions in liver and 33.0% in adipose tissue. The magnitude of this validation varied across genes, with up to 60.8% of the connections of NCOA2 in adipose tissue being validated via co-expression. Our results recapitulate the known transcriptional regulation of FA metabolism, predict gene interactions that can be experimentally validated, and suggest that genetic variants mapped to EP300, FHL2, and NCOA2 modulate lipid metabolism and control energy homeostasis in pigs.
Multitrait-Multimethod Analyses of Two Self-Concept Instruments.
ERIC Educational Resources Information Center
Marsh, Herbert W.; Smith, Ian D.
1982-01-01
The multidimensionality of self-concept and the use of factor analysis in the development of self-concept instruments are supported in multitrait-multimethod analyses of the Sears and Coopersmith instruments. Convergent validity and discriminant validity of subscales in factor analysis and multitrait-multimethod analysis of longitudinal data are…
Hofstadter-Duke, Kristi L; Daly, Edward J
2015-03-01
This study investigated a method for conducting experimental analyses of academic responding. In the experimental analyses, academic responding (math computation), rather than problem behavior, was reinforced across conditions. Two separate experimental analyses (one with fluent math computation problems and one with non-fluent math computation problems) were conducted with three elementary school children using identical contingencies while math computation rate was measured. Results indicate that the experimental analysis with non-fluent problems produced undifferentiated responding across participants; however, differentiated responding was achieved for all participants in the experimental analysis with fluent problems. A subsequent comparison of the single-most effective condition from the experimental analyses replicated the findings with novel computation problems. Results are discussed in terms of the critical role of stimulus control in identifying controlling consequences for academic deficits, and recommendations for future research refining and extending experimental analysis to academic responding are made. © The Author(s) 2014.
The teamwork in assertive community treatment (TACT) scale: development and validation.
Wholey, Douglas R; Zhu, Xi; Knoke, David; Shah, Pri; Zellmer-Bruhn, Mary; Witheridge, Thomas F
2012-11-01
Team design is meticulously specified for assertive community treatment (ACT) teams, yet performance can vary across ACT teams, even those with high fidelity. By developing and validating the Teamwork in Assertive Community Treatment (TACT) scale, investigators examined the role of team processes in ACT performance. The TACT scale measuring ACT teamwork was developed from a conceptual model grounded in organizational research and adapted for the ACT and mental health context. TACT subscales were constructed after exploratory and confirmatory factor analyses. The reliability, discriminant validity, predictive validity, temporal stability, internal consistency, and within-team agreement were established with surveys from approximately 300 members of 26 Minnesota ACT teams who completed the questionnaire three times, at six-month intervals. Nine TACT subscales emerged from the analyses: exploration, exploitation of new and existing knowledge, psychological safety, goal agreement, conflict, constructive controversy, information accessibility, encounter preparedness, and consumer-centered care. These nine subscales demonstrated fit and temporal stability (confirmatory factor analysis), high internal consistency (Cronbach's alpha), and within-team agreement and between-team differences (rwg and intraclass correlations). Correlational analyses of the subscales revealed that they measure related yet distinctive aspects of ACT team processes, and regression analyses demonstrated predictive validity (encounter preparedness is related to staff outcomes). The TACT scale demonstrated high reliability and validity and can be included in research and evaluation of teamwork in ACT and mental health teams.
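For readers unfamiliar with the agreement statistics mentioned above, the sketch below computes two of them, Cronbach's alpha and the within-group agreement index r_wg, in their standard textbook forms (the toy data are invented, not TACT survey responses):

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    k = items.shape[1]
    return k / (k - 1) * (1 - items.var(axis=0, ddof=1).sum()
                          / items.sum(axis=1).var(ddof=1))

def rwg(group_scores, n_options):
    """Within-group agreement r_wg for one item on an n_options Likert
    scale, against a uniform (no-agreement) null distribution."""
    sigma_eu = (n_options**2 - 1) / 12.0  # variance of the uniform null
    return 1.0 - np.var(group_scores, ddof=1) / sigma_eu

# Toy example: 6 team members rating 4 items on a 5-point scale
scores = np.array([[4, 4, 5, 4], [4, 5, 4, 4], [5, 4, 4, 5],
                   [4, 4, 4, 4], [5, 5, 4, 4], [4, 4, 5, 5]])
print(cronbach_alpha(scores), rwg(scores[:, 0], n_options=5))
```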
Vivaldi: visualization and validation of biomacromolecular NMR structures from the PDB.
Hendrickx, Pieter M S; Gutmanas, Aleksandras; Kleywegt, Gerard J
2013-04-01
We describe Vivaldi (VIsualization and VALidation DIsplay; http://pdbe.org/vivaldi), a web-based service for the analysis, visualization, and validation of NMR structures in the Protein Data Bank (PDB). Vivaldi provides access to model coordinates and several types of experimental NMR data using interactive visualization tools, augmented with structural annotations and model-validation information. The service presents information about the modeled NMR ensemble, validation of experimental chemical shifts, residual dipolar couplings, distance and dihedral angle constraints, as well as validation scores based on empirical knowledge and databases. Vivaldi was designed for both expert NMR spectroscopists and casual non-expert users who wish to obtain a better grasp of the information content and quality of NMR structures in the public archive. Copyright © 2013 Wiley Periodicals, Inc.
Experimental validation of predicted cancer genes using FRET
NASA Astrophysics Data System (ADS)
Guala, Dimitri; Bernhem, Kristoffer; Ait Blal, Hammou; Jans, Daniel; Lundberg, Emma; Brismar, Hjalmar; Sonnhammer, Erik L. L.
2018-07-01
Huge amounts of data are generated in genome-wide experiments designed to investigate diseases with complex genetic causes. Follow-up of all potential leads produced by such experiments is currently cost prohibitive and time consuming. Gene prioritization tools alleviate these constraints by directing further experimental efforts towards the most promising candidate targets. Recently a gene prioritization tool called MaxLink was shown to outperform other widely used state-of-the-art prioritization tools in a large-scale in silico benchmark. An experimental validation of predictions made by MaxLink has however been lacking. In this study we used Fluorescence Resonance Energy Transfer, an established experimental technique for detection of protein-protein interactions, to validate potential cancer genes predicted by MaxLink. Our results provide confidence in the use of MaxLink for selection of new targets in the battle with polygenic diseases.
Solar-Diesel Hybrid Power System Optimization and Experimental Validation
NASA Astrophysics Data System (ADS)
Jacobus, Headley Stewart
As of 2008, 1.46 billion people, or 22 percent of the world's population, were without electricity. Many of these people live in remote areas where decentralized generation is the only method of electrification. Most mini-grids are powered by diesel generators, but new hybrid power systems are becoming a reliable method to incorporate renewable energy while also reducing total system cost. This thesis quantifies the measurable operational costs for an experimental hybrid power system in Sierra Leone. Two software programs, Hybrid2 and HOMER, are used during the system design and subsequent analysis. Experimental data from the installed system are used to validate the two programs and to quantify the savings created by each component within the hybrid system. This thesis bridges the gap between design optimization studies that frequently lack subsequent validation and experimental hybrid system performance studies.
Macias, Cathaleene; Barreira, Paul; Hargreaves, William; Bickman, Leonard; Fisher, William; Aronson, Elliot
2005-04-01
The inability to blind research participants to their experimental conditions is the Achilles' heel of mental health services research. When one experimental condition receives more disappointed participants, or more satisfied participants, research findings can be biased in spite of random assignment. The authors explored the potential for research participants' preference for one experimental program over another to compromise the generalizability and validity of randomized controlled service evaluations as well as cross-study comparisons. Three Cox regression analyses measured the impact of applicants' service assignment preference on research project enrollment, engagement in assigned services, and a service-related outcome, competitive employment. A stated service preference, referral by an agency with a low level of continuity in outpatient care, and willingness to switch from current services were significant positive predictors of research enrollment. Match to service assignment preference was a significant positive predictor of service engagement, and mismatch to assignment preference was a significant negative predictor of both service engagement and employment outcome. Referral source type and service assignment preference should be routinely measured and statistically controlled for in all studies of mental health service effectiveness to provide a sound empirical base for evidence-based practice.
Dynamic Investigation of Static Divergence: Analysis and Testing
NASA Technical Reports Server (NTRS)
Heeg, Jennifer
2000-01-01
The phenomenon known as aeroelastic divergence is the focus of this work. The analyses and experiment presented here show that divergence can occur without a structural dynamic mode losing its oscillatory nature. Aeroelastic divergence occurs when the structural restorative capability or stiffness of a structure is overwhelmed by the static aerodynamic moment. This static aeroelastic coupling does not require the structural dynamic system behavior to cease, however. Aeroelastic changes in the dynamic mode behavior are governed not only by the stiffness, but by damping and inertial properties. The work presented here supports these fundamental assertions by examining a simple system: a typical section airfoil with only a rotational structural degree of freedom. Analytical results identified configurations that exhibit different types of dynamic mode behavior as the system encounters divergence. A wind tunnel model was designed and tested to examine divergence experimentally. The experimental results validate the analytical calculations and explicitly examine the divergence phenomenon where the dynamic mode persists. Three configurations of the wind tunnel model were tested. The experimental results agree very well with the analytical predictions of subcritical characteristics, divergence velocity, and behavior of the noncritical dynamic mode at divergence.
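For reference, the classical static result for such a typical section follows from equating the torsional restoring moment to the aerodynamic pitching moment; a textbook sketch (symbols are generic, not the paper's model parameters):

$$ K_\alpha\,\alpha = q\,S\,e\,C_{L_\alpha}\,\alpha \quad\Rightarrow\quad q_D = \frac{K_\alpha}{S\,e\,C_{L_\alpha}}, $$

where K_alpha is the rotational stiffness, q the dynamic pressure, S the reference area, e the offset of the aerodynamic center from the elastic axis, and C_{L_alpha} the lift-curve slope. As the abstract emphasizes, the behavior of the dynamic mode near q_D additionally depends on damping and inertial properties, not on stiffness alone.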
System to monitor data analyses and results of physics data validation between pulses at DIII-D
NASA Astrophysics Data System (ADS)
Flanagan, S.; Schachter, J. M.; Schissel, D. P.
2004-06-01
A data analysis monitoring (DAM) system has been developed to monitor between pulse physics analysis at the DIII-D National Fusion Facility (http://nssrv1.gat.com:8000/dam). The system allows for rapid detection of discrepancies in diagnostic measurements or the results from physics analysis codes. This enables problems to be detected and possibly fixed between pulses as opposed to after the experimental run has concluded, thus increasing the efficiency of experimental time. An example of a consistency check is comparing the experimentally measured neutron rate and the expected neutron emission, RDD0D. A significant difference between these two values could indicate a problem with one or more diagnostics, or the presence of unanticipated phenomena in the plasma. This system also tracks the progress of MDSplus dispatched data analysis software and the loading of analyzed data into MDSplus. DAM uses a Java Servlet to receive messages, C Language Integrated Production System to implement expert system logic, and displays its results to multiple web clients via Hypertext Markup Language. If an error is detected by DAM, users can view more detailed information so that steps can be taken to eliminate the error for the next pulse.
A self-sensing magnetorheological damper with power generation
NASA Astrophysics Data System (ADS)
Chen, Chao; Liao, Wei-Hsin
2012-02-01
Magnetorheological (MR) dampers are promising for semi-active vibration control of various dynamic systems. In the current MR damper systems, a separate power supply and dynamic sensor are required. To enable the MR damper to be self-powered and self-sensing in the future, in this paper we propose and investigate a self-sensing MR damper with power generation, which integrates energy harvesting, dynamic sensing and MR damping technologies into one device. This MR damper has self-contained power generation and velocity sensing capabilities, and is applicable to various dynamic systems. It combines the advantages of energy harvesting—reusing wasted energy, MR damping—controllable damping force, and sensing—providing dynamic information for controlling system dynamics. This multifunctional integration would bring great benefits such as energy saving, size and weight reduction, lower cost, high reliability, and less maintenance for the MR damper systems. In this paper, a prototype of the self-sensing MR damper with power generation was designed, fabricated, and tested. Theoretical analyses and experimental studies on power generation were performed. A velocity-sensing method was proposed and experimentally validated. The magnetic-field interference among three functions was prevented by a combined magnetic-field isolation method. Modeling, analysis, and experimental results on damping forces are also presented.
Revisit the faster-is-slower effect for an exit at a corner
NASA Astrophysics Data System (ADS)
Chen, Jun Min; Lin, Peng; Wu, Fan Yu; Li Gao, Dong; Wang, Guo Yuan
2018-02-01
The faster-is-slower (FIS) effect, whereby a crowd moving at a sufficiently high velocity can significantly increase the time needed to escape through an exit, is an interesting phenomenon in pedestrian dynamics. It has been studied widely and experimentally verified in different systems of discrete particles flowing through a centre exit. Experimentally validating this phenomenon with people under high pressure is difficult due to ethical issues. A mouse, similar to a human, is a self-driven, soft-bodied creature with competitive behaviour under stressed conditions. Therefore, mice were used to escape through an exit at a corner. A number of repeated tests were conducted and the average escape time per mouse at different levels of stimulus was analysed. The escape times do not increase appreciably with the level of stimulus for the corner exit, contrary to the experiment with the centre exit. The experimental results show that the FIS effect is not necessarily a universal law for any discrete system. This observation could help the design of buildings: relocating exits to the corners of rooms may avoid the formation of the FIS effect.
NASA Astrophysics Data System (ADS)
Ferreira, A.
1996-04-01
This paper describes an automated test system for piezoelectric motors that allows experimental characterization of their electromechanical behaviour. In the first part, an experimental method is given for evaluating the losses generated in the different mechanisms of conversion: electric energy into ultrasonic vibration energy, and ultrasonic vibration energy into mechanical energy of rotary motion. In the second part, the method is experimentally validated on a travelling-wave-type rotary motor (Shinsei USR-60). The free stator vibration is analysed by a laser vibrometer, which gives a picture of both the amplitude and the phase of the vibration. This allows identification of the vibration modes and an evaluation of the ultrasonic vibration energy and the electromechanical efficiency. To characterize the working of the complete motor, the no-load working mode is considered first. The measurement of the maximal mechanical characteristics (maximal no-load rotating speed, maximal driving torque) with respect to axial load allows the choice of an optimum axial load. For this optimum value, the load working mode is finally investigated to evaluate the load characteristics and conversion losses.
Mass transfer in thin films under counter-current gas: experiments and numerical study
NASA Astrophysics Data System (ADS)
Lucquiaud, Mathieu; Lavalle, Gianluca; Schmidt, Patrick; Ausner, Ilja; Wehrli, Marc; O Naraigh, Lennon; Valluri, Prashant
2016-11-01
Mass transfer in liquid-gas stratified flows is strongly affected by the waviness of the interface. For reactive flows, the chemical reactions occurring at the liquid-gas interface also influence the mass transfer rate. This situation is encountered in several technological applications, such as absorption units for carbon capture. We investigate the absorption rate of carbon dioxide in a liquid solution. The experimental set-up consists of a vertical channel in which a falling film is sheared by a counter-current gas flow. We measure the absorption at different flow conditions by changing the liquid solution, the liquid flow rate and the gas composition. With the aim of supporting the experimental results with numerical simulations, we implement in our level-set flow solver a novel mass-transfer module based on a variant of the ghost-fluid formalism. We first validate the pure mass transfer case, with and without hydrodynamics, by comparing the species concentration in the bulk flow to the analytical solution. In a final stage, we analyse the absorption rate in reactive flows and try to reproduce the experimental results by means of numerical simulations, to explore the active role of the waves at the interface.
NASA Astrophysics Data System (ADS)
Vintila, Iuliana; Gavrus, Adinel
2017-10-01
The present research paper proposes the validation of a rigorous computation model used as a numerical tool to identify the rheological behavior of complex W/O emulsions. Starting from a three-dimensional description of a general viscoplastic flow, the thermo-mechanical equations used to identify the rheological laws of fluids or soft materials from global experimental measurements are detailed. Analyses are conducted for complex W/O emulsions, which generally exhibit Bingham behavior, using a power-law shear stress versus strain rate dependency and an improved analytical model. Experimental results are investigated for the rheological behavior of crude and refined rapeseed/soybean oils and four types of corresponding W/O emulsions with different physical-chemical compositions. The rheological behavior model was correlated with the thermo-mechanical analysis of a plane-plane rheometer, oil content, chemical composition, particle size and emulsifier concentration. The parameters of the rheological laws describing the behavior of the industrial oils and the concentrated W/O emulsions were computed from estimated shear stresses using a non-linear regression technique and from experimental torques using the inverse analysis tool designed by A. Gavrus (1992-2000).
ISPyB: an information management system for synchrotron macromolecular crystallography.
Delagenière, Solange; Brenchereau, Patrice; Launer, Ludovic; Ashton, Alun W; Leal, Ricardo; Veyrier, Stéphanie; Gabadinho, José; Gordon, Elspeth J; Jones, Samuel D; Levik, Karl Erik; McSweeney, Seán M; Monaco, Stéphanie; Nanao, Max; Spruce, Darren; Svensson, Olof; Walsh, Martin A; Leonard, Gordon A
2011-11-15
Individual research groups now analyze thousands of samples per year at synchrotron macromolecular crystallography (MX) resources. The efficient management of experimental data is thus essential if the best possible experiments are to be performed and the best possible data used in downstream processes in structure determination pipelines. The Information System for Protein crystallography Beamlines (ISPyB), a Laboratory Information Management System (LIMS) with an underlying data model allowing for the integration of analyses downstream of the data collection experiment, was developed to facilitate such data management. ISPyB is now a multisite, generic LIMS for synchrotron-based MX experiments. Its initial functionality has been enhanced to include improved sample tracking and reporting of experimental protocols, the direct ranking of the diffraction characteristics of individual samples, and the archiving of raw data and results from ancillary experiments and post-experiment data processing protocols. This latter feature paves the way for ISPyB to play a central role in future macromolecular structure solution pipelines and validates the application of the approach used in ISPyB to other experimental techniques, such as biological solution Small Angle X-ray Scattering and spectroscopy, which have similar sample tracking and data handling requirements.
Hariharan, Prasanna; D'Souza, Gavin A; Horner, Marc; Morrison, Tina M; Malinauskas, Richard A; Myers, Matthew R
2017-01-01
A "credible" computational fluid dynamics (CFD) model has the potential to provide a meaningful evaluation of safety in medical devices. One major challenge in establishing "model credibility" is to determine the required degree of similarity between the model and experimental results for the model to be considered sufficiently validated. This study proposes a "threshold-based" validation approach that provides a well-defined acceptance criteria, which is a function of how close the simulation and experimental results are to the safety threshold, for establishing the model validity. The validation criteria developed following the threshold approach is not only a function of Comparison Error, E (which is the difference between experiments and simulations) but also takes in to account the risk to patient safety because of E. The method is applicable for scenarios in which a safety threshold can be clearly defined (e.g., the viscous shear-stress threshold for hemolysis in blood contacting devices). The applicability of the new validation approach was tested on the FDA nozzle geometry. The context of use (COU) was to evaluate if the instantaneous viscous shear stress in the nozzle geometry at Reynolds numbers (Re) of 3500 and 6500 was below the commonly accepted threshold for hemolysis. The CFD results ("S") of velocity and viscous shear stress were compared with inter-laboratory experimental measurements ("D"). The uncertainties in the CFD and experimental results due to input parameter uncertainties were quantified following the ASME V&V 20 standard. The CFD models for both Re = 3500 and 6500 could not be sufficiently validated by performing a direct comparison between CFD and experimental results using the Student's t-test. However, following the threshold-based approach, a Student's t-test comparing |S-D| and |Threshold-S| showed that relative to the threshold, the CFD and experimental datasets for Re = 3500 were statistically similar and the model could be considered sufficiently validated for the COU. However, for Re = 6500, at certain locations where the shear stress is close the hemolysis threshold, the CFD model could not be considered sufficiently validated for the COU. Our analysis showed that the model could be sufficiently validated either by reducing the uncertainties in experiments, simulations, and the threshold or by increasing the sample size for the experiments and simulations. The threshold approach can be applied to all types of computational models and provides an objective way of determining model credibility and for evaluating medical devices.
Abdelhafiz, Ali A; Ganzoury, Mohamed A; Amer, Ahmad W; Faiad, Azza A; Khalifa, Ahmed M; AlQaradawi, Siham Y; El-Sayed, Mostafa A; Alamgir, Faisal M; Allam, Nageh K
2018-04-18
Understanding the nature of interfacial defects of materials is a critical undertaking for the design of high-performance hybrid electrodes for photocatalysis applications. Theoretical and computational endeavors to achieve this have touched boundaries far ahead of their experimental counterparts. However, to achieve any industrial benefit out of such studies, experimental validation needs to be systematically undertaken. In this sense, we present herein experimental insights into the synergistic relationship between the lattice position and oxidation state of tungsten ions inside a TiO2 lattice, and the respective nature of the created defect states. Consequently, a roadmap to tune the defect states in anodically-fabricated, ultrathin-walled W-doped TiO2 nanotubes is proposed. Annealing the nanotubes in different gas streams enabled the engineering of defects in such structures, as confirmed by XRD and XPS measurements. While annealing under hydrogen stream resulted in the formation of abundant Wn+ (n < 6) ions at the interstitial sites of the TiO2 lattice, oxygen- and air-annealing induced W6+ ions at substitutional sites. EIS and Mott-Schottky analyses indicated the formation of deep-natured trap states in the hydrogen-annealed samples, and predominantly shallow donating defect states in the oxygen- and air-annealed samples. Consequently, the photocatalytic performance of the latter was significantly higher than those of the hydrogen-annealed counterparts. Upon increasing the W content, photoelectrochemical performance deteriorated due to the formation of WO3 crystallites that hindered charge transfer through the photoanode, as evident from the structural and chemical characterization. To this end, this study validates the previous theoretical predictions on the detrimental effect of interstitial W ions. In addition, it sheds light on the importance of defect states and their nature for tuning the photoelectrochemical performance of the investigated materials.
NASA Astrophysics Data System (ADS)
Boztepe, Sinan; Gilblas, Remi; de Almeida, Olivier; Le Maoult, Yannick; Schmidt, Fabrice
2017-10-01
Most thermoforming processes for thermoplastic polymers and their composites combine heating and forming stages, in which a precursor is heated prior to forming in order to improve formability by softening the thermoplastic polymer. Due to the low thermal conductivity and the semi-transparency of polymers, infrared (IR) heating is widely used for thermoforming such materials. Predictive radiation heat transfer models for temperature distributions are therefore critical for optimizing the thermoforming process. One of the key challenges is to build a predictive model that includes the physical background of radiation heat transfer in semi-crystalline thermoplastics, as their microcrystalline structure introduces an optically heterogeneous medium. In addition, the accuracy of a predictive model must be validated experimentally; IR thermography is one suitable method for such validation, as it provides a non-invasive, full-field surface temperature measurement. Although IR cameras provide a non-invasive measurement, obtaining a reliable measurement depends on the optical characteristics of the heated material and the operating spectral band of the IR camera. The surface of the material should have a spectral band in which it behaves as opaque, and the IR camera should operate in that band. In this study, the optical characteristics of the PO-based polymer are discussed and an experimental approach is proposed to measure the surface temperature of the PO-based polymer via IR thermography. Preliminary analyses showed that IR thermographic measurements cannot simply be performed on PO-based polymers and require a correction method, as their semi-transparent medium makes reliable surface temperature measurements challenging.
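The opacity argument above can be made concrete with a simplified Beer-Lambert estimate, neglecting the scattering by crystallites that the paper identifies as an additional complication: the spectral transmittance of a layer of thickness d is

$$ \tau(\lambda) = e^{-\kappa(\lambda)\,d}, $$

so an IR camera band is suitable for surface thermography when kappa(lambda)d is large enough that tau(lambda) is approximately zero, i.e. the material behaves as opaque in that band.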
NASA Astrophysics Data System (ADS)
Korhonen, Rami K.; Saarakkala, Simo; Töyräs, Juha; Laasanen, Mikko S.; Kiviranta, Ilkka; Jurvelin, Jukka S.
2003-06-01
Softening of articular cartilage, mainly attributable to deterioration of superficial collagen network and depletion of proteoglycans, is a sign of incipient osteoarthrosis. Early diagnosis of osteoarthrosis is essential to prevent the further destruction of the tissue. During the past decade, a few arthroscopic instruments have been introduced for the measurement of cartilage stiffness; these can be used to provide a sensitive measure of cartilage status. Ease of use, accuracy and reproducibility of the measurements as well as a low risk of damaging cartilage are the main qualities needed in any clinically applicable instrument. In this study, we have modified a commercially available arthroscopic indentation instrument to better fulfil these requirements when measuring cartilage stiffness in joints with thin cartilage. Our novel configuration was validated by experimental testing as well as by finite element (FE) modelling. Experimental and numerical tests indicated that it would be better to use a smaller reference plate and a lower pressing force (3 N) than those used in the original instrument (7-10 N). The reproducibility (CV = 5.0%) of the in situ indentation measurements was improved over that of the original instrument (CV = 7.6%), and the effect of material thickness on the indentation response was smaller than that obtained with the original instrument. The novel configuration showed a significant linear correlation between the indenter force and the reference dynamic modulus of cartilage in unconfined compression, especially in soft tissue (r = 0.893, p < 0.001, n = 16). FE analyses with a transversely isotropic poroelastic model indicated that the instrument was suitable for detecting the degeneration of superficial cartilage. In summary, the instrument presented in this study allows easy and reproducible measurement of cartilage stiffness, also in thin cartilage, and therefore represents a technical improvement for the early diagnosis of osteoarthrosis during arthroscopy.
Kong, Fan-Zhi; Yang, Ying; He, Yu-Chen; Zhang, Qiang; Li, Guo-Qing; Fan, Liu-Yin; Xiao, Hua; Li, Shan; Cao, Cheng-Xi
2016-09-01
In this work, charge-to-mass ratio (C/M) and band-broadening analyses were combined to provide better guidance for the design of the carrier buffer (CB) for free-flow zone electrophoresis. First, the C/M analyses of hemoglobin and C-phycocyanin (C-PC) at different pH values were performed with the CLC Protein Workbench software. Second, band dispersion due to the initial bandwidth, diffusion, and hydrodynamic broadening was discussed. Based on the C/M and band-broadening analyses, better guidance for the preparation of the free-flow zone electrophoresis CB was obtained. A series of experiments was performed to validate the proposed method. The experimental data showed high accordance with our prediction, allowing the CB to be prepared easily with the proposed method. To further evaluate the method, C-PC was purified from crude extracts of Spirulina platensis under the selected separation conditions. Results showed that C-PC was well separated from other phycobiliproteins with similar physicochemical properties, and an analytical-grade product with purity up to 4.5 (A620/A280) was obtained. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
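To illustrate the kind of C/M reasoning described, the sketch below estimates a protein's net charge as a function of pH from Henderson-Hasselbalch fractions of its ionizable groups. The pKa table, the sequence, and the average-residue-mass shortcut are all illustrative assumptions; the authors used CLC Protein Workbench for their calculations.

```python
import numpy as np

# Textbook side-chain and terminal pKa values (assumed, not from the paper).
PKA_POS = {'K': 10.5, 'R': 12.5, 'H': 6.0, 'N_term': 9.0}
PKA_NEG = {'D': 3.9, 'E': 4.1, 'C': 8.3, 'Y': 10.1, 'C_term': 3.1}

def net_charge(sequence, ph):
    """Estimate net protein charge at a given pH (Henderson-Hasselbalch)."""
    charge = 1.0 / (1.0 + 10 ** (ph - PKA_POS['N_term']))
    charge -= 1.0 / (1.0 + 10 ** (PKA_NEG['C_term'] - ph))
    for aa in sequence:
        if aa in PKA_POS:
            charge += 1.0 / (1.0 + 10 ** (ph - PKA_POS[aa]))
        elif aa in PKA_NEG:
            charge -= 1.0 / (1.0 + 10 ** (PKA_NEG[aa] - ph))
    return charge

seq = "MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ"  # hypothetical sequence
mass = 110.0 * len(seq)  # crude average residue mass in Da
for ph in (5.0, 7.0, 9.0):
    # The C/M ratio at candidate pH values guides carrier-buffer design.
    print(ph, net_charge(seq, ph) / mass)
```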
Phase 1 Validation Testing and Simulation for the WEC-Sim Open Source Code
NASA Astrophysics Data System (ADS)
Ruehl, K.; Michelen, C.; Gunawan, B.; Bosma, B.; Simmons, A.; Lomonaco, P.
2015-12-01
WEC-Sim is an open source code to model wave energy converter performance in operational waves, developed by Sandia and NREL and funded by the US DOE. The code is a time-domain modeling tool developed in MATLAB/SIMULINK using the multibody dynamics solver SimMechanics, and solves the WEC's governing equations of motion using the Cummins time-domain impulse response formulation in 6 degrees of freedom. The WEC-Sim code has undergone verification through code-to-code comparisons; however, validation of the code has been limited to publicly available experimental data sets. While these data sets provide preliminary code validation, the experimental tests were not explicitly designed for code validation and, as a result, are limited in their ability to validate the full functionality of the WEC-Sim code. Therefore, dedicated physical model tests for WEC-Sim validation have been performed. This presentation provides an overview of the WEC-Sim validation experimental wave tank tests performed at Oregon State University's Directional Wave Basin at the Hinsdale Wave Research Laboratory. Phase 1 of experimental testing was focused on device characterization and completed in Fall 2015. Phase 2 is focused on WEC performance and scheduled for Winter 2015/2016. These experimental tests were designed explicitly to validate the performance of the WEC-Sim code and its new feature additions. Upon completion, the WEC-Sim validation data set will be made publicly available to the wave energy community. For the physical model test, a controllable model of a floating wave energy converter has been designed and constructed. The instrumentation includes state-of-the-art devices to measure pressure fields, motions in 6 DOF, multi-axial load cells, torque transducers, position transducers, and encoders. The model also incorporates a fully programmable power-take-off system which can be used to generate or absorb wave energy. Numerical simulations of the experiments using WEC-Sim will be presented. These simulations highlight the code features included in the latest release of WEC-Sim (v1.2), including: wave directionality, nonlinear hydrostatics and hydrodynamics, user-defined wave elevation time-series, state-space radiation, and WEC-Sim compatibility with BEMIO (an open source AQWA/WAMIT/NEMOH coefficient parser).
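For orientation, the Cummins formulation referenced above can be written, in schematic single-degree-of-freedom form (symbols are generic, not WEC-Sim variable names):

$$ (m + A_\infty)\,\ddot{x}(t) + \int_0^t K(t-\tau)\,\dot{x}(\tau)\,\mathrm{d}\tau + C\,x(t) = F_{exc}(t) + F_{PTO}(t), $$

where A_infinity is the added mass at infinite frequency, K the radiation impulse-response kernel, C the hydrostatic stiffness, F_exc the wave excitation force and F_PTO the power-take-off force; WEC-Sim solves the 6-degree-of-freedom counterpart of this equation.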
Personality and job performance: the Big Five revisited.
Hurtz, G M; Donovan, J J
2000-12-01
Prior meta-analyses investigating the relation between the Big 5 personality dimensions and job performance have all contained a threat to construct validity, in that much of the data included within these analyses was not derived from actual Big 5 measures. In addition, these reviews did not address the relations between the Big 5 and contextual performance. Therefore, the present study sought to provide a meta-analytic estimate of the criterion-related validity of explicit Big 5 measures for predicting job performance and contextual performance. The results for job performance closely paralleled 2 of the previous meta-analyses, whereas analyses with contextual performance showed more complex relations among the Big 5 and performance. A more critical interpretation of the Big 5-performance relationship is presented, and suggestions for future research aimed at enhancing the validity of personality predictors are provided.
Ravikumar, Balaguru; Parri, Elina; Timonen, Sanna; Airola, Antti; Wennerberg, Krister
2017-01-01
Due to the relatively high costs and labor required for experimental profiling of the full target space of chemical compounds, various machine learning models have been proposed as cost-effective means to advance this process in terms of predicting the most potent compound-target interactions for subsequent verification. However, most of the model predictions lack direct experimental validation in the laboratory, making their practical benefits for drug discovery or repurposing applications largely unknown. Here, we therefore introduce and carefully test a systematic computational-experimental framework for the prediction and pre-clinical verification of drug-target interactions using a well-established kernel-based regression algorithm as the prediction model. To evaluate its performance, we first predicted unmeasured binding affinities in a large-scale kinase inhibitor profiling study, and then experimentally tested 100 compound-kinase pairs. The relatively high correlation of 0.77 (p < 0.0001) between the predicted and measured bioactivities supports the potential of the model for filling the experimental gaps in existing compound-target interaction maps. Further, we subjected the model to a more challenging task of predicting target interactions for a new candidate drug compound that lacks prior binding-profile information. As a specific case study, we used tivozanib, an investigational VEGF receptor inhibitor with a currently unknown off-target profile. Among 7 kinases with high predicted affinity, we experimentally validated 4 new off-targets of tivozanib, namely the Src-family kinases FRK and FYN A, the non-receptor tyrosine kinase ABL1, and the serine/threonine kinase SLK. Our subsequent experimental validation protocol effectively avoids any possible information leakage between the training and validation data, and therefore enables rigorous model validation for practical applications. These results demonstrate that the kernel-based modeling approach offers practical benefits for probing novel insights into the mode of action of investigational compounds, and for the identification of new target selectivities for drug repurposing applications. PMID:28787438
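As a rough illustration of the modeling approach, the sketch below fits a kernel-based regression to hypothetical compound-kinase affinity data with scikit-learn's KernelRidge. The features, the RBF kernel choice, the hyperparameters, and all data are placeholders, not the study's actual descriptors or tuned model:

```python
import numpy as np
from sklearn.kernel_ridge import KernelRidge

# Hypothetical training data: each row describes a compound-kinase pair
# (e.g., concatenated compound and kinase descriptors); y holds measured
# binding affinities. Shapes and values are placeholders.
rng = np.random.default_rng(0)
X_train = rng.normal(size=(200, 32))
y_train = rng.normal(size=200)
X_candidates = rng.normal(size=(5, 32))  # unmeasured pairs to prioritize

# Kernel ridge regression with an RBF kernel is one common instantiation
# of "kernel-based regression"; hyperparameters here are untuned defaults.
model = KernelRidge(kernel="rbf", alpha=1.0, gamma=0.05)
model.fit(X_train, y_train)

# Predicted affinities rank candidate pairs for experimental verification.
print(model.predict(X_candidates))
```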
Mantzoukas, Stefanos
2009-04-01
Evidence-based practice has become an imperative for efficient, effective and safe practice. Furthermore, evidence emerging from published research is considered a valid source of knowledge to guide practice. The aim of this paper is to review all research articles published in the top 10 general nursing journals for the years 2000-2006 to identify the methodologies used, the types of evidence these studies produced and the issues they addressed. Quantitative content analysis was used to study all published research papers of these journals. The abstracts of all research articles were analysed with regard to the methodologies of enquiry, the types of evidence produced and the issues studied. Percentages were calculated to enable conclusions to be drawn. The results for the category of methodologies used were 7% experimental, 6% quasi-experimental, 39% non-experimental, 2% ethnographical studies, 7% phenomenological, 4% grounded theory, 1% action research, 1% case study, 15% unspecified, 5.5% other, 0.5% meta-synthesis, 2% meta-analysis, 5% literature reviews and 3% secondary analysis. For the category of types of evidence, the results were 4% hypothesis/theory testing, 11% evaluative, 5% comparative, 2% correlational, 46% descriptive, 5% interpretative and 27% exploratory. For the category of issues of study, the results were 45% practice/clinical, 8% educational, 11% professional, 3% spiritual/ethical/metaphysical, 26% health promotion and 7% managerial/policy. Published studies can provide adequate evidence for practice if nursing journals conceptualise evidence emerging from non-experimental and qualitative studies as relevant types of evidence for practice and develop appropriate mechanisms for assessing their validity. Nursing journals also need to increase and encourage the publication of studies that implement RCT methodology, systematic reviews, meta-synthesis and meta-analysis methodologies. Finally, nursing journals need to encourage more high-quality research evidence deriving from interpretative, theory-testing and evaluative studies that are relevant to practice.
Test system stability and natural variability of a Lemna gibba L. bioassay.
Scherr, Claudia; Simon, Meinhard; Spranger, Jörg; Baumgartner, Stephan
2008-09-04
In ecotoxicological and environmental studies Lemna spp. are used as test organisms due to their small size, rapid predominantly vegetative reproduction, easy handling and high sensitivity to various chemicals. However, there is not much information available concerning spatial and temporal stability of experimental set-ups used for Lemna bioassays, though this is essential for interpretation and reliability of results. We therefore investigated stability and natural variability of a Lemna gibba bioassay assessing area-related and frond number-related growth rates under controlled laboratory conditions over about one year. Lemna gibba L. was grown in beakers with Steinberg medium for one week. Area-related and frond number-related growth rates (r_area and r_num) were determined with a non-destructive image processing system. To assess inter-experimental stability, 35 independent experiments were performed with 10 beakers each in the course of one year. We observed changes in growth rates by a factor of two over time. These did not correlate well with temperature or relative humidity in the growth chamber. In order to assess intra-experimental stability, we analysed six systematic negative control experiments (nontoxicant tests) with 96 replicate beakers each. Evaluation showed that the chosen experimental set-up was stable and did not produce false positive results. The coefficient of variation was lower for r_area (2.99%) than for r_num (4.27%). It is hypothesised that the variations in growth rates over time under controlled conditions are partly due to endogenic periodicities in Lemna gibba. The relevance of these variations for toxicity investigations should be investigated more closely. Area-related growth rate seems to be more precise as a non-destructive calculation parameter than number-related growth rate. Furthermore, we propose two new validity criteria for Lemna gibba bioassays: variability of average specific and section-by-section segmented growth rate, complementary to average specific growth rate as the only validity criterion existing in guidelines for duckweed bioassays.
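The growth rates discussed above are, in the convention of duckweed test guidelines, average specific growth rates; assuming the standard definition (the paper additionally proposes section-by-section segmented variants),

$$ r = \frac{\ln x(t_2) - \ln x(t_1)}{t_2 - t_1}, $$

where x is either frond area (giving r_area) or frond number (giving r_num), and the reported coefficient of variation is the standard deviation of r across replicate beakers divided by its mean.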
Commisso, Maria S; Martínez-Reina, Javier; Mayo, Juana; Domínguez, Jaime
2013-02-01
The main objectives of this work are: (a) to introduce an algorithm for adjusting the quasi-linear viscoelastic model to fit a material using a stress relaxation test and (b) to validate a protocol for performing such tests in temporomandibular joint discs. This algorithm is intended for fitting the Prony series coefficients and the hyperelastic constants of the quasi-linear viscoelastic model by considering that the relaxation test is performed with an initial ramp loading at a certain rate. This algorithm was validated before being applied to achieve the second objective. Generally, the complete three-dimensional formulation of the quasi-linear viscoelastic model is very complex. Therefore, it is necessary to design an experimental test to ensure a simple stress state, such as uniaxial compression to facilitate obtaining the viscoelastic properties. This work provides some recommendations about the experimental setup, which are important to follow, as an inadequate setup could produce a stress state far from uniaxial, thus, distorting the material constants determined from the experiment. The test considered is a stress relaxation test using unconfined compression performed in cylindrical specimens extracted from temporomandibular joint discs. To validate the experimental protocol, the test was numerically simulated using finite-element modelling. The disc was arbitrarily assigned a set of quasi-linear viscoelastic constants (c1) in the finite-element model. Another set of constants (c2) was obtained by fitting the results of the simulated test with the proposed algorithm. The deviation of constants c2 from constants c1 measures how far the stresses are from the uniaxial state. The effects of the following features of the experimental setup on this deviation have been analysed: (a) the friction coefficient between the compression plates and the specimen (which should be as low as possible); (b) the portion of the specimen glued to the compression plates (smaller areas glued are better); and (c) the variation in the thickness of the specimen. The specimen's faces should be parallel to ensure a uniaxial stress state. However, this is not possible in real specimens, and a criterion must be defined to accept the specimen in terms of the specimen's thickness variation and the deviation of the fitted constants arising from such a variation.
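For context, the quasi-linear viscoelastic model referred to above is usually written, in one-dimensional form (a standard formulation; the paper's exact notation may differ):

$$ \sigma(t) = \int_0^t G(t-\tau)\,\frac{\partial \sigma^{e}(\varepsilon)}{\partial \varepsilon}\,\frac{\partial \varepsilon}{\partial \tau}\,\mathrm{d}\tau, \qquad G(t) = g_\infty + \sum_{i=1}^{N} g_i\,e^{-t/\tau_i}, $$

where sigma^e(epsilon) is the instantaneous elastic (hyperelastic) response and G(t) is the reduced relaxation function expressed as a Prony series, normalized so that G(0) = 1; the fitting algorithm identifies the g_i, tau_i and the hyperelastic constants while accounting for the finite loading ramp of the relaxation test.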
Validation of NASA Thermal Ice Protection Computer Codes. Part 1; Program Overview
NASA Technical Reports Server (NTRS)
Miller, Dean; Bond, Thomas; Sheldon, David; Wright, William; Langhals, Tammy; Al-Khalil, Kamel; Broughton, Howard
1996-01-01
The Icing Technology Branch at NASA Lewis has been involved in an effort to validate two thermal ice protection codes developed at the NASA Lewis Research Center: LEWICE/Thermal (electrothermal deicing & anti-icing) and ANTICE (hot-gas & electrothermal anti-icing). The Thermal Code Validation effort was designated as a priority during a 1994 'peer review' of the NASA Lewis Icing program, and was implemented as a cooperative effort with industry. During April 1996, the first of a series of experimental validation tests was conducted in the NASA Lewis Icing Research Tunnel (IRT). The purpose of the April 1996 test was to validate the electrothermal predictive capabilities of both LEWICE/Thermal and ANTICE. A heavily instrumented test article was designed and fabricated for this test, with the capability of simulating electrothermal de-icing and anti-icing modes of operation. Thermal measurements were then obtained over a range of test conditions, for comparison with analytical predictions. This paper will present an overview of the test, including a detailed description of: (1) the validation process; (2) test article design; (3) test matrix development; and (4) test procedures. Selected experimental results will be presented for de-icing and anti-icing modes of operation. Finally, the status of the validation effort at this point will be summarized. Detailed comparisons between analytical predictions and experimental results are contained in the following two papers: 'Validation of NASA Thermal Ice Protection Computer Codes: Part 2 - The Validation of LEWICE/Thermal' and 'Validation of NASA Thermal Ice Protection Computer Codes: Part 3 - The Validation of ANTICE'.
Pablos, Leticia; Doetjes, Jenny; Cheng, Lisa L-S
2017-01-01
The empirical study of language is a young field in contemporary linguistics. This being the case, and following a natural development process, the field is currently at a stage where different research methods and experimental approaches are being put into question in terms of their validity. Without pretending to provide an answer with respect to the best way to conduct linguistics related experimental research, in this article we aim at examining the process that researchers follow in the design and implementation of experimental linguistics research with a goal to validate specific theoretical linguistic analyses. First, we discuss the general challenges that experimental work faces in finding a compromise between addressing theoretically relevant questions and being able to implement these questions in a specific controlled experimental paradigm. We discuss the Granularity Mismatch Problem (Poeppel and Embick, 2005) which addresses the challenges that research that is trying to bridge the representations and computations of language and their psycholinguistic/neurolinguistic evidence faces, and the basic assumptions that interdisciplinary research needs to consider due to the different conceptual granularity of the objects under study. To illustrate the practical implications of the points addressed, we compare two approaches to perform linguistic experimental research by reviewing a number of our own studies strongly grounded on theoretically informed questions. First, we show how linguistic phenomena similar at a conceptual level can be tested within the same language using measurement of event-related potentials (ERP) by discussing results from two ERP experiments on the processing of long-distance backward dependencies that involve coreference and negative polarity items respectively in Dutch. Second, we examine how the same linguistic phenomenon can be tested in different languages using reading time measures by discussing the outcome of four self-paced reading experiments on the processing of in-situ wh-questions in Mandarin Chinese and French. Finally, we review the implications that our findings have for the specific theoretical linguistics questions that we originally aimed to address. We conclude with an overview of the general insights that can be gained from the role of structural hierarchy and grammatical constraints in processing and the existing limitations on the generalization of results.
2016-05-24
... is to obtain high-fidelity experimental data critically needed to validate research codes at relevant conditions, and to develop systematic and ... validated with experimental data. However, the time and length scales, and energy deposition rates in the canonical laboratory flames that have been studied over the ...
Retrieval of droplet-size density distribution from multiple-field-of-view cross-polarized lidar signals: theory and experimental validation
2016-06-02
... Theoretical and experimental studies of multiple scattering and multiple-field-of-view (MFOV) lidar detection have made possible the retrieval of cloud ... droplet cloud are typical of Rayleigh scattering, with a signature close to a dipole (phase function quasi-flat and a zero-depolarization ratio ...
Parametric Study of Advanced Mixing of Fuel/Oxidant System in High Speed Gaseous Flows and Experimental Validation Planning
2001-08-30
Final Report on ISTC Contract # 1809p. ... Body with Thermo-Chemical Distribution of Heat-Protected System. In: Physical and Gasdynamic Phenomena in Supersonic Flows Over Bodies. Edit. by ...
Experimental validation of ultrasonic guided modes in electrical cables by optical interferometry.
Mateo, Carlos; de Espinosa, Francisco Montero; Gómez-Ullate, Yago; Talavera, Juan A
2008-03-01
In this work, the dispersion curves of elastic waves propagating in electrical cables and in bare copper wires are obtained theoretically and validated experimentally. The theoretical model, based on Gazis equations formulated according to the global matrix methodology, is resolved numerically. Viscoelasticity and attenuation are modeled theoretically using the Kelvin-Voigt model. Experimental tests are carried out using interferometry. There is good agreement between the simulations and the experiments despite the peculiarities of electrical cables.
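The Kelvin-Voigt model mentioned above combines a spring and a dashpot in parallel; in its usual one-dimensional form,

$$ \sigma(t) = E\,\varepsilon(t) + \eta\,\dot{\varepsilon}(t), $$

with E the elastic modulus and eta the viscosity, which yields a frequency-dependent complex modulus and hence the attenuation used to model lossy propagation along the cable.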
Federal Register 2010, 2011, 2012, 2013, 2014
2012-01-05
... modeling needs and experimental validation techniques for complex flow phenomena in and around offshore ... experimental validation. Ultimately, research in this area may lead to significant improvements in wind plant ... meeting will consist of an initial plenary session in which invited speakers will survey available ...
Validation of a Monte Carlo simulation of the Inveon PET scanner using GATE
NASA Astrophysics Data System (ADS)
Lu, Lijun; Zhang, Houjin; Bian, Zhaoying; Ma, Jianhua; Feng, Qiangjin; Chen, Wufan
2016-08-01
The purpose of this study is to validate a model of the Siemens Inveon small-animal PET system built with the GATE (Geant4 Application for Tomographic Emission) Monte Carlo simulation toolkit. The simulation results were validated against experimental/published data in accordance with the NEMA NU-4 2008 protocol for standardized evaluation of the spatial resolution, sensitivity, scatter fraction (SF) and noise equivalent count rate (NECR) of a preclinical PET system. Agreement within 18% was obtained between the simulated and experimental radial, tangential and axial spatial resolutions. The simulated peak NECR of the mouse-size phantom agreed with the experimental result, while for the rat-size phantom the simulated value was higher than the experimental result. The simulated and experimental SFs of the mouse- and rat-size phantoms agreed within 2%. These results demonstrate the feasibility of our GATE model to accurately simulate, within certain limits, all major performance characteristics of the Inveon PET system.
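For reference, the NEMA counting-rate figures of merit quoted above are commonly defined as

$$ \mathrm{SF} = \frac{S}{T + S}, \qquad \mathrm{NECR} = \frac{T^2}{T + S + R}, $$

where T, S and R are the true, scattered and random coincidence rates, respectively.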
A new simple local muscle recovery model and its theoretical and experimental validation.
Ma, Liang; Zhang, Wei; Wu, Su; Zhang, Zhanwu
2015-01-01
This study was conducted to provide theoretical and experimental validation of a local muscle recovery model. Muscle recovery has been modeled in different empirical and theoretical approaches to determine work-rest allowances for musculoskeletal disorder (MSD) prevention. However, time-related parameters and individual attributes have not been sufficiently considered in conventional approaches. A new muscle recovery model was proposed by integrating time-related task parameters and individual attributes. Theoretically, this muscle recovery model was compared to other theoretical models mathematically. Experimentally, a total of 20 subjects participated in the experimental validation. Hand grip force recovery and shoulder joint strength recovery were measured after a fatiguing operation. The recovery profile was fitted using the recovery model, and individual recovery rates were calculated after fitting. Good fitting values (r² > .8) were found for all the subjects. Significant differences in recovery rates were found among different muscle groups (p < .05). The theoretical muscle recovery model was primarily validated by characterization of the recovery process after a fatiguing operation. The determined recovery rate may be useful for representing individual recovery attributes.
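Since the paper's exact model form is not reproduced in the abstract, the following sketch merely illustrates the fitting step under an assumed single-exponential recovery of strength toward baseline: a per-subject recovery rate is estimated with scipy's curve_fit and an r² of the kind reported (> .8) is computed. All numbers are synthetic.

```python
import numpy as np
from scipy.optimize import curve_fit

# Assumed exponential return of strength toward the rested baseline;
# this stands in for the paper's actual recovery model, which is not
# given in the abstract. Data points are synthetic.
def recovery(t, R):
    """Fraction of rested strength recovered after rest time t (min)."""
    return 1.0 - 0.4 * np.exp(-R * t)   # assumes 0.6 of strength at t = 0

t = np.array([0.0, 1, 2, 4, 8, 16])                 # rest duration, min
f = np.array([0.61, 0.68, 0.74, 0.83, 0.93, 0.99])  # measured strength ratio

(R_hat,), _ = curve_fit(recovery, t, f, p0=[0.2])
residuals = f - recovery(t, R_hat)
r2 = 1 - np.sum(residuals**2) / np.sum((f - f.mean())**2)
print(f"recovery rate = {R_hat:.3f} 1/min, r^2 = {r2:.3f}")
```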
Leoni-Scheiber, Claudia; Gothe, Raffaella Matteucci; Müller-Staub, Maria
2016-02-01
The attitude of nurses influences their application of the Advanced Nursing Process. Studies reveal deficits in the application of the Advanced Nursing Process, which is based on valid assessments and nursing classifications. These deficits affect decision-making and, as a result, nursing care quality. In German-speaking countries, nurses' attitudes towards nursing diagnoses as part of the Advanced Nursing Process had not yet been measured. The aim of this study was to evaluate the effects of an educational intervention on nurses' attitude. A quasi-experimental intervention study was carried out in Austria and Germany. Before and after a standardised educational intervention, 51 nurses rated their attitude with the instrument Positions on Nursing Diagnosis (PND). Analyses were performed with Wilcoxon and U-tests. Before the educational intervention, the average attitude score of the Austrian nurses was more positive than that of the German group. After the study intervention, both groups regarded nursing diagnostics as statistically significantly more convincing and better understandable. However, both groups still described the application of the Advanced Nursing Process as difficult and demanding to perform. In the future, more attention should be given to the reflection and development of nurses' attitudes towards the Advanced Nursing Process, because attitudes guide nurses' actions. Further studies will analyse influencing organizational and structural factors in diverse settings.
NASA Astrophysics Data System (ADS)
Dörr, Dominik; Joppich, Tobias; Schirmaier, Fabian; Mosthaf, Tobias; Kärger, Luise; Henning, Frank
2016-10-01
Thermoforming of continuously fiber-reinforced thermoplastics (CFRTP) is ideally suited to thin-walled and complex-shaped products. By means of forming simulation, an initial validation of the producibility of a specific geometry, an optimization of the forming process, and the prediction of fiber reorientation due to forming are possible. Nevertheless, the applied methods need to be validated. Therefore, a method is presented which enables the calculation of error measures for the mismatch between simulation results and experimental tests, based on measurements with a conventional coordinate measuring device. As a quantitative measure describing the curvature is provided, the presented method is also suitable for numerical or experimental sensitivity studies on wrinkling behavior. The applied methods for forming simulation, implemented in Abaqus explicit, are presented and applied to a generic geometry. The same geometry is tested experimentally, and simulation and test results are compared using the proposed validation method.
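The abstract does not spell out the error measure itself, so the sketch below shows one plausible construction under that assumption: nearest-neighbour distances from each coordinate-measured point to the simulated formed geometry, aggregated into RMS and maximum mismatch values.

```python
import numpy as np
from scipy.spatial import cKDTree

# A plausible mismatch measure in the spirit of the paper (assumption,
# not the authors' published formula): for every point probed with the
# coordinate measuring device, find the closest node of the simulated
# geometry and aggregate the distances. Point sets here are random
# stand-ins for the real scan and simulation mesh.
rng = np.random.default_rng(0)
simulated_nodes = rng.uniform(0, 100, size=(5000, 3))   # mm
measured_points = rng.uniform(0, 100, size=(800, 3))    # mm

tree = cKDTree(simulated_nodes)
dist, _ = tree.query(measured_points)    # nearest-neighbour distances, mm

rms_error = np.sqrt(np.mean(dist**2))    # scalar error measure
max_error = dist.max()
print(f"RMS mismatch {rms_error:.2f} mm, worst point {max_error:.2f} mm")
```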
Validation of an automated mite counter for Dermanyssus gallinae in experimental laying hen cages.
Mul, Monique F; van Riel, Johan W; Meerburg, Bastiaan G; Dicke, Marcel; George, David R; Groot Koerkamp, Peter W G
2015-08-01
For integrated pest management (IPM) programs to be maximally effective, monitoring of the growth and decline of the pest populations is essential. Here, we present the validation results of a new automated monitoring device for the poultry red mite (Dermanyssus gallinae), a serious pest in laying hen facilities world-wide. This monitoring device (called an "automated mite counter") was validated in experimental laying hen cages with live birds and a growing population of D. gallinae. This validation study resulted in 17 data points of 'number of mites counted' by the automated mite counter and the 'number of mites present' in the experimental laying hen cages. The study demonstrated that the automated mite counter was able to track the D. gallinae population effectively. A wider evaluation showed that this automated mite counter can become a useful tool in IPM of D. gallinae in laying hen facilities.
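A minimal sketch of the kind of check such a validation implies, assuming the 17 pairs of counted versus present mites are compared by linear regression; the value pairs below are invented, not the study's data.

```python
import numpy as np
from scipy.stats import linregress

# Regress the counter's output on the number of mites actually present.
# The 17 pairs are invented for illustration.
present = np.array([120, 340, 560, 900, 1500, 2300, 3100, 4000, 5200,
                    6500, 8000, 9800, 12000, 15000, 18000, 22000, 27000])
counted = np.array([110, 300, 520, 870, 1350, 2100, 2950, 3700, 4900,
                    6100, 7600, 9200, 11500, 14100, 17200, 20500, 25800])

fit = linregress(present, counted)
print(f"slope {fit.slope:.2f}, r^2 {fit.rvalue**2:.3f}")
# A slope near 1 with high r^2 indicates the counter tracks the
# population; a slope below 1 indicates systematic undercounting.
```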
van de Streek, Jacco; Neumann, Marcus A
2010-10-01
This paper describes the validation of a dispersion-corrected density functional theory (d-DFT) method for the purpose of assessing the correctness of experimental organic crystal structures and enhancing the information content of purely experimental data. 241 experimental organic crystal structures from the August 2008 issue of Acta Cryst. Section E were energy-minimized in full, including unit-cell parameters. The differences between the experimental and the minimized crystal structures were subjected to statistical analysis. The r.m.s. Cartesian displacement excluding H atoms upon energy minimization with flexible unit-cell parameters is selected as a pertinent indicator of the correctness of a crystal structure. All 241 experimental crystal structures are reproduced very well: the average r.m.s. Cartesian displacement for the 241 crystal structures, including 16 disordered structures, is only 0.095 Å (0.084 Å for the 225 ordered structures). R.m.s. Cartesian displacements above 0.25 Å either indicate incorrect experimental crystal structures or reveal interesting structural features such as exceptionally large temperature effects, incorrectly modelled disorder or symmetry-breaking H atoms. After validation, the method is applied to nine examples that are known to be ambiguous or subtly incorrect.
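The paper's indicator is easy to state in code. The sketch below computes the r.m.s. Cartesian displacement of non-hydrogen atoms between two structures assumed to be expressed in the same Cartesian frame after minimization; the coordinates are toy values.

```python
import numpy as np

# Minimal version of the indicator: r.m.s. Cartesian displacement of
# the non-hydrogen atoms between the experimental and energy-minimized
# structures, assumed already mapped into the same Cartesian frame.
def rmsd_non_h(symbols, xyz_exp, xyz_min):
    heavy = np.array([s != "H" for s in symbols])
    d = xyz_exp[heavy] - xyz_min[heavy]
    return np.sqrt((d**2).sum(axis=1).mean())

symbols = ["C", "C", "O", "H", "H"]
xyz_exp = np.array([[0.0, 0, 0], [1.52, 0, 0], [2.1, 1.1, 0],
                    [-0.5, 0.9, 0], [-0.5, -0.9, 0]])
xyz_min = xyz_exp + 0.05                     # toy uniform 0.05 Å shift
print(f"{rmsd_non_h(symbols, xyz_exp, xyz_min):.3f} Å")   # ~0.087 Å
```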
A model for flexi-bar to evaluate intervertebral disc and muscle forces in exercises.
Abdollahi, Masoud; Nikkhoo, Mohammad; Ashouri, Sajad; Asghari, Mohsen; Parnianpour, Mohamad; Khalaf, Kinda
2016-10-01
This study developed and validated a lumped-parameter model for the FLEXI-BAR, a popular training instrument that provides vibration stimulation. The model, which can be used in conjunction with musculoskeletal-modeling software for quantitative biomechanical analyses, consists of 3 rigid segments, 2 torsional springs, and 2 torsional dashpots. Two different sets of experiments were conducted to determine the model's key parameters, including the stiffness of the springs and the damping ratio of the dashpots. In the first set of experiments, the free vibration of the FLEXI-BAR with an initial displacement at its end was considered, while in the second set, forced oscillations of the bar were studied. The properties of the mechanical elements in the lumped-parameter model were derived using a non-linear optimization algorithm that minimized the difference between the model's predictions and the experimental data. The results showed that the model is valid (8% error) and can be used for simulating exercises with the FLEXI-BAR for excitations in the range of the natural frequency. The model was then validated in combination with the AnyBody musculoskeletal modeling software, where various lumbar disc, spinal muscle and hand muscle forces were determined during different FLEXI-BAR exercise simulations. Copyright © 2016 IPEM. Published by Elsevier Ltd. All rights reserved.
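The paper identifies the spring and dashpot values by nonlinear optimization against free- and forced-vibration data. As a simpler illustration of the free-vibration part only, the sketch below uses the classical logarithmic-decrement shortcut to estimate a damping ratio and natural frequency from successive tip-displacement peaks; the values are synthetic, not FLEXI-BAR measurements.

```python
import numpy as np

# Classical logarithmic-decrement estimate from a free-vibration decay,
# a crude stand-in for the paper's optimization-based identification.
peaks = np.array([0.30, 0.24, 0.192, 0.154])   # successive peak amplitudes, m
period = 0.21                                   # time between peaks, s

delta = np.mean(np.log(peaks[:-1] / peaks[1:]))         # log decrement
zeta = delta / np.sqrt(4 * np.pi**2 + delta**2)         # damping ratio
omega_n = 2 * np.pi / (period * np.sqrt(1 - zeta**2))   # natural frequency
print(f"zeta = {zeta:.3f}, f_n = {omega_n / (2 * np.pi):.2f} Hz")
```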
Packham, B; Barnes, G; Dos Santos, G Sato; Aristovich, K; Gilad, O; Ghosh, A; Oh, T; Holder, D
2016-06-01
Electrical impedance tomography (EIT) allows for the reconstruction of internal conductivity from surface measurements. A change in conductivity occurs as ion channels open during neural activity, making EIT a potential tool for functional brain imaging. EIT images can have >10 000 voxels, which means statistical analysis of such images presents a substantial multiple testing problem. One way to optimally correct for these issues and still maintain the flexibility of complicated experimental designs is to use random field theory. This parametric method estimates the distribution of peaks one would expect by chance in a smooth random field of a given size. Random field theory has been used in several other neuroimaging techniques but never validated for EIT images of fast neural activity; such validation can be achieved using non-parametric techniques. Both parametric and non-parametric techniques were used to analyze a set of 22 images collected from 8 rats. Significant group activations were detected using both techniques (corrected p < 0.05). Both parametric and non-parametric analyses yielded similar results, although the latter was less conservative. These results demonstrate the first statistical analysis of such an image set and indicate that such an analysis is a suitable approach for EIT images of neural activity.
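The non-parametric alternative referred to above can be sketched compactly: a sign-flip permutation test using the maximum statistic across voxels, which controls the family-wise error without random-field assumptions. The data below are synthetic (22 images, 1000 voxels, a small planted effect), not the rat EIT images.

```python
import numpy as np

# Sign-flip permutation test with the max statistic across voxels.
# Synthetic data: 22 images x 1000 voxels with a planted activation.
rng = np.random.default_rng(1)
images = rng.normal(0, 1, size=(22, 1000))
images[:, :50] += 1.2                       # planted activation

def max_t(data):
    t = data.mean(0) / (data.std(0, ddof=1) / np.sqrt(len(data)))
    return t, np.abs(t).max()

t_obs, _ = max_t(images)
# Each permutation flips the sign of whole images (one flip per image).
null_max = np.array([max_t(images * rng.choice([-1, 1], (22, 1)))[1]
                     for _ in range(1000)])
threshold = np.quantile(null_max, 0.95)     # corrected p < 0.05
print((np.abs(t_obs) > threshold).sum(), "significant voxels")
```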
Assessment of protein set coherence using functional annotations
Chagoyen, Monica; Carazo, Jose M; Pascual-Montano, Alberto
2008-01-01
Background Analysis of large-scale experimental datasets frequently produces one or more sets of proteins that are subsequently mined for functional interpretation and validation. To this end, a number of computational methods have been devised that rely on the analysis of functional annotations. Although current methods provide valuable information (e.g. significantly enriched annotations, pairwise functional similarities), they do not specifically measure the degree of homogeneity of a protein set. Results In this work we present a method that scores the degree of functional homogeneity, or coherence, of a set of proteins on the basis of the global similarity of their functional annotations. The method uses statistical hypothesis testing to assess the significance of the set in the context of the functional space of a reference set. As such, it can be used as a first step in the validation of sets expected to be homogeneous prior to further functional interpretation. Conclusion We evaluate our method by analysing known biologically relevant sets as well as random ones. The known relevant sets comprise macromolecular complexes, cellular components and pathways described for Saccharomyces cerevisiae, which are mostly significantly coherent. Finally, we illustrate the usefulness of our approach for validating 'functional modules' obtained from computational analysis of protein-protein interaction networks. Matlab code and supplementary data are available at PMID:18937846
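A toy version of the idea, under the assumption that annotation similarity is measured by pairwise Jaccard overlap (a stand-in for the paper's actual annotation-similarity measure): score the query set by its mean pairwise similarity, then ask whether random sets of the same size drawn from the reference proteome score as high.

```python
import numpy as np
from itertools import combinations

# Toy coherence test: mean pairwise Jaccard similarity of annotation
# sets, with an empirical p-value from random same-size sets. All
# proteins and annotations here are synthetic.
rng = np.random.default_rng(2)
annotations = {f"P{i}": set(rng.choice(200, size=rng.integers(3, 15),
                                       replace=False)) for i in range(500)}

def coherence(names):
    sims = [len(annotations[a] & annotations[b]) /
            len(annotations[a] | annotations[b])
            for a, b in combinations(names, 2)]
    return np.mean(sims)

query = [f"P{i}" for i in range(8)]          # the set under test
obs = coherence(query)
null = [coherence(rng.choice(list(annotations), 8, replace=False))
        for _ in range(500)]
p = (np.sum(np.array(null) >= obs) + 1) / (len(null) + 1)
print(f"coherence {obs:.3f}, empirical p = {p:.3f}")
```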
Packham, B; Barnes, G; dos Santos, G Sato; Aristovich, K; Gilad, O; Ghosh, A; Oh, T; Holder, D
2016-01-01
Electrical impedance tomography (EIT) allows for the reconstruction of internal conductivity from surface measurements. A change in conductivity occurs as ion channels open during neural activity, making EIT a potential tool for functional brain imaging. EIT images can have >10 000 voxels, which means statistical analysis of such images presents a substantial multiple testing problem. One way to optimally correct for these issues and still maintain the flexibility of complicated experimental designs is to use random field theory. This parametric method estimates the distribution of peaks one would expect by chance in a smooth random field of a given size. Random field theory has been used in several other neuroimaging techniques but never validated for EIT images of fast neural activity; such validation can be achieved using non-parametric techniques. Both parametric and non-parametric techniques were used to analyze a set of 22 images collected from 8 rats. Significant group activations were detected using both techniques (corrected p < 0.05). Both parametric and non-parametric analyses yielded similar results, although the latter was less conservative. These results demonstrate the first statistical analysis of such an image set and indicate that such an analysis is a suitable approach for EIT images of neural activity. PMID:27203477
Lance, Blake W.; Smith, Barton L.
2016-06-23
Transient convection has been investigated experimentally for the purpose of providing Computational Fluid Dynamics (CFD) validation benchmark data. A specialized facility for validation benchmark experiments called the Rotatable Buoyancy Tunnel was used to acquire thermal and velocity measurements of flow over a smooth, vertical heated plate. The initial condition was forced convection downward with subsequent transition to mixed convection, ending with natural convection upward after a flow reversal. Data acquisition through the transient was repeated for ensemble-averaged results. With simple flow geometry, validation data were acquired at the benchmark level. All boundary conditions (BCs) were measured and their uncertainties quantified. Temperature profiles on all four walls and the inlet were measured, as well as as-built test section geometry. Inlet velocity profiles and turbulence levels were quantified using Particle Image Velocimetry. System Response Quantities (SRQs) were measured for comparison with CFD outputs and include velocity profiles, wall heat flux, and wall shear stress. Extra effort was invested in documenting and preserving the validation data. Details about the experimental facility, instrumentation, experimental procedure, materials, BCs, and SRQs are made available through this paper. As a result, the latter two are available for download and the other details are included in this work.
Bayesian cross-entropy methodology for optimal design of validation experiments
NASA Astrophysics Data System (ADS)
Jiang, X.; Mahadevan, S.
2006-07-01
An important concern in the design of validation experiments is how to incorporate the mathematical model in the design in order to allow conclusive comparisons of model prediction with experimental output in model assessment. The classical experimental design methods are more suitable for phenomena discovery and may result in a subjective, expensive, time-consuming and ineffective design that may adversely impact these comparisons. In this paper, an integrated Bayesian cross-entropy methodology is proposed to perform the optimal design of validation experiments incorporating the computational model. The expected cross entropy, an information-theoretic distance between the distributions of model prediction and experimental observation, is defined as a utility function to measure the similarity of two distributions. A simulated annealing algorithm is used to find optimal values of input variables through minimizing or maximizing the expected cross entropy. The measured data after testing with the optimum input values are used to update the distribution of the experimental output using Bayes theorem. The procedure is repeated to adaptively design the required number of experiments for model assessment, each time ensuring that the experiment provides effective comparison for validation. The methodology is illustrated for the optimal design of validation experiments for a three-leg bolted joint structure and a composite helicopter rotor hub component.
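A schematic of that design loop is sketched below, with invented one-dimensional response curves standing in for the model prediction and the expected observation, both approximated as Gaussians so the cross entropy has a closed form; simulated annealing then searches for the most informative test input. The functions, bounds, and standard deviations are all assumptions for illustration, not the paper's bolted-joint or rotor-hub applications.

```python
import numpy as np

# Score a candidate test input x by the cross entropy between the
# (Gaussian-approximated) model prediction and the expected
# experimental output, and search x with simulated annealing.
def cross_entropy(mu_p, sig_p, mu_q, sig_q):
    """H(p, q) for two univariate Gaussians p and q."""
    return (0.5 * np.log(2 * np.pi * sig_q**2)
            + (sig_p**2 + (mu_p - mu_q)**2) / (2 * sig_q**2))

model = lambda x: np.sin(x) + 0.10 * x        # model prediction mean
exper = lambda x: np.sin(x) + 0.05 * x**2     # expected observation mean

rng = np.random.default_rng(3)
x = 1.0
cur = cross_entropy(model(x), 0.2, exper(x), 0.25)
x_best, h_best, T = x, cur, 1.0
for _ in range(2000):                         # anneal to maximize H(p, q)
    cand = np.clip(x + rng.normal(0, 0.3), 0.0, 5.0)
    h = cross_entropy(model(cand), 0.2, exper(cand), 0.25)
    if h > cur or rng.random() < np.exp((h - cur) / T):
        x, cur = cand, h
        if h > h_best:
            x_best, h_best = cand, h
    T *= 0.998                                # geometric cooling schedule
print(f"most informative test input: x = {x_best:.2f}")
```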
Computational fluid dynamics modeling of laboratory flames and an industrial flare.
Singh, Kanwar Devesh; Gangadharan, Preeti; Chen, Daniel H; Lou, Helen H; Li, Xianchang; Richmond, Peyton
2014-11-01
A computational fluid dynamics (CFD) methodology for simulating the combustion process has been validated with experimental results. Three different types of experimental setups were used to validate the CFD model: an industrial-scale flare setup and two lab-scale flames. The CFD study also involved three different fuels: C3H6/CH/Air/N2, C2H4/O2/Ar and CH4/Air. In the first setup, flare efficiency data from the Texas Commission on Environmental Quality (TCEQ) 2010 field tests were used to validate the CFD model. In the second setup, a McKenna burner with flat flames was simulated. Temperature and mass fractions of important species were compared with the experimental data. Finally, results of an experimental study done at Sandia National Laboratories to generate a lifted jet flame were used for the purpose of validation. The reduced 50-species mechanism LU 1.1, the realizable k-epsilon turbulence model, and the EDC turbulence-chemistry interaction model were used for this work. Flare efficiency, axial profiles of temperature, and mass fractions of various intermediate species obtained in the simulation were compared with experimental data, and good agreement between the profiles was clearly observed. In particular, the simulation match with the TCEQ 2010 flare tests has been significantly improved (within 5% of the data) compared to the results reported by Singh et al. in 2012. Validation against the speciated flat-flame data supports the view that flares can be a primary source of formaldehyde emission.
NASA Technical Reports Server (NTRS)
Hazelton, R. C.; Yadlowsky, E. J.; Churchill, R. J.; Parker, L. W.; Sellers, B.
1981-01-01
The effect of differential charging of spacecraft thermal control surfaces is assessed by studying the dynamics of the charging process. A program to experimentally validate a computer model of the charging process was established. Time-resolved measurements of the surface potential were obtained for samples of Kapton and Teflon irradiated with a monoenergetic electron beam. Results indicate that the computer model and experimental measurements agree well and that, for Teflon, secondary emission is the governing factor. Experimental data indicate that bulk conductivities play a significant role in the charging of Kapton.
FY2017 Pilot Project Plan for the Nuclear Energy Knowledge and Validation Center Initiative
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ren, Weiju
To prepare for technical development of computational code validation under the Nuclear Energy Knowledge and Validation Center (NEKVAC) initiative, several meetings were held by a group of experts of the Idaho National Laboratory (INL) and the Oak Ridge National Laboratory (ORNL) to develop requirements of, and formulate a structure for, a transient fuel database through leveraging existing resources. It was concluded in discussions of these meetings that a pilot project is needed to address the most fundamental issues that can generate immediate stimulus to near-future validation developments as well as long-lasting benefits to NEKVAC operation. The present project is proposed based on the consensus of these discussions. Analysis of common scenarios in code validation indicates that the incapability of acquiring satisfactory validation data is often a showstopper that must first be tackled before any confident validation developments can be carried out. Validation data are usually found scattered in different places most likely with interrelationships among the data not well documented, incomplete with information for some parameters missing, nonexistent, or unrealistic to experimentally generate. Furthermore, with very different technical backgrounds, the modeler, the experimentalist, and the knowledgebase developer that must be involved in validation data development often cannot communicate effectively without a data package template that is representative of the data structure for the information domain of interest to the desired code validation. This pilot project is proposed to use the legendary TREAT Experiments Database to provide core elements for creating an ideal validation data package. Data gaps and missing data interrelationships will be identified from these core elements. All the identified missing elements will then be filled in with experimental data if available from other existing sources or with dummy data if nonexistent. The resulting hybrid validation data package (composed of experimental and dummy data) will provide a clear and complete instance delineating the structure of the desired validation data and enabling effective communication among the modeler, the experimentalist, and the knowledgebase developer. With a good common understanding of the desired data structure by the three parties of subject matter experts, further existing data hunting will be effectively conducted, new experimental data generation will be realistically pursued, knowledgebase schema will be practically designed; and code validation will be confidently planned.
Progress Towards a Microgravity CFD Validation Study Using the ISS SPHERES-SLOSH Experiment
NASA Technical Reports Server (NTRS)
Storey, Jedediah M.; Kirk, Daniel; Marsell, Brandon (Editor); Schallhorn, Paul (Editor)
2017-01-01
Understanding, predicting, and controlling fluid slosh dynamics is critical to safety and improving performance of space missions when a significant percentage of the spacecraft's mass is a liquid. Computational fluid dynamics simulations can be used to predict the dynamics of slosh, but these programs require extensive validation. Many CFD programs have been validated by slosh experiments using various fluids in earth gravity, but prior to the ISS SPHERES-Slosh experiment, little experimental data for long-duration, zero-gravity slosh existed. This paper presents the current status of an ongoing CFD validation study using the ISS SPHERES-Slosh experimental data.
Progress Towards a Microgravity CFD Validation Study Using the ISS SPHERES-SLOSH Experiment
NASA Technical Reports Server (NTRS)
Storey, Jed; Kirk, Daniel (Editor); Marsell, Brandon (Editor); Schallhorn, Paul (Editor)
2017-01-01
Understanding, predicting, and controlling fluid slosh dynamics is critical to safety and improving performance of space missions when a significant percentage of the spacecraft's mass is a liquid. Computational fluid dynamics simulations can be used to predict the dynamics of slosh, but these programs require extensive validation. Many CFD programs have been validated by slosh experiments using various fluids in earth gravity, but prior to the ISS SPHERES-Slosh experiment, little experimental data for long-duration, zero-gravity slosh existed. This paper presents the current status of an ongoing CFD validation study using the ISS SPHERES-Slosh experimental data.
NASA Technical Reports Server (NTRS)
Johnson, Paul K.
2007-01-01
NASA Glenn Research Center (GRC) contracted Barber-Nichols of Arvada, CO, to construct a dual Brayton power conversion system for use as a hardware proof of concept and to validate results from a computational code known as the Closed Cycle System Simulation (CCSS). Initial checkout tests were performed at Barber-Nichols to ready the system for delivery to GRC. This presentation describes the system hardware components and lists the types of checkout tests performed, along with a couple of issues encountered while conducting the tests. A description of the CCSS model is also presented. The checkout tests did not focus on generating data; therefore, no test data or model analyses are presented.
The Dream Property Scale: an exploratory English version.
Takeuchi, T; Ogilvie, R D; Ferrelli, A V; Murphy, T I; Belicki, K
2001-09-01
Our goal is to develop an English version of the Dream Property Scale (DPS-E) based on the original normed scale in Japan (DPS-J). Factor analyses extracted four factors (Emotionality, Rationality, Activity, and Impression), and the factor structure was apparently similar to that of the DPS-J. The DPS-E was also shown to be related to EEG power spectral values. These results indicate that the DPS-E may provide an exploratory basis for a reliable and valid tool for capturing and quantifying the properties of dream experiences that could reflect physiological activities without the intervention of experimenters. We suggest that the DPS-E will develop into a useful tool to help clarify dream production mechanisms through further investigation. Copyright 2001 Academic Press.
Validation of Shock Layer Radiation: Perspectives for Test Cases
NASA Technical Reports Server (NTRS)
Brandis, Aaron
2012-01-01
This paper presents a review of the analysis and measurement of radiation data obtained in the NASA Ames Research Center's Electric Arc Shock Tube (EAST) facility. The goal of these experiments was to measure the level of radiation encountered during atmospheric entry. The data obtained from these experiments are highlighted by providing the first spectrally and spatially resolved data for high-speed Earth entry and measurements of the CO 4th positive band for conditions relevant to Mars entry. Comparisons of the EAST data with experimental results obtained from shock tunnels at JAXA and the University of Queensland are presented. Furthermore, the paper details initial analyses into the influence and characterization of the measured non-equilibrium radiation.
Enhancing nurses' ethical practice: development of a clinical ethics program.
McDaniel, C
1998-06-01
There is increasing attention paid to ethics under managed care; however, few clinical-based ethics programs are reported. This paper reports the assessment and outcomes of one such program. A quasi-experimental research design with t-tests is used to assess the outcome differences between participants and control groups. There are twenty nurses in each; they are assessed for comparability. Differences are predicted on two outcomes using reliable and valid measures: nurses' time with their patients in ethics discussions, and nurses' opinions regarding their clinical ethics environments. Results reveal a statistically significant difference (p <.05) between the two groups, with modest positive change in the participants. Additional exploratory analyses are reported on variables influential in health care services.
Parallel-Connected Photovoltaic Inverters: Zero Frequency Sequence Harmonic Analysis and Solution
NASA Astrophysics Data System (ADS)
Carmeli, Maria Stefania; Mauri, Marco; Frosio, Luisa; Bezzolato, Alberto; Marchegiani, Gabriele
2013-05-01
High-power photovoltaic (PV) plants usually consist of the connection of different PV subfields, each with its own interface transformer. Different solutions have been studied to improve the efficiency of the whole generation system. In particular, transformerless configurations are the most attractive from an efficiency and cost point of view. This paper focuses on transformerless PV configurations characterised by the parallel connection of interface inverters. The problem of zero-sequence current, due to both the parallel connection and the presence of undesirable parasitic earth capacitances, is considered, and a solution, which consists of synchronising the pulse-width modulation triangular carriers, is proposed and theoretically analysed. The theoretical analysis has been validated through simulation and experimental results.
NASA Technical Reports Server (NTRS)
Defelice, David M.; Aydelott, John C.
1987-01-01
The resupply of cryogenic propellants is an enabling technology for space-based orbit transfer vehicles. As part of NASA Lewis's ongoing efforts in microgravity fluid management, thermodynamic analysis and subscale modeling techniques were developed to support an on-orbit test bed for cryogenic fluid management technologies. Analytical results have shown that subscale experimental modeling of liquid resupply can be used to validate analytical models when the appropriate target temperature is selected to relate the model to its prototype system. Further analyses were used to develop a thermodynamic model of the tank chilldown process, which is required prior to the no-vent fill operation. These efforts were incorporated into two FORTRAN programs, which were used to present preliminary analytical results.
Ehrhart, Mark G.; Torres, Elisa M.; Finn, Natalie K.; Roesch, Scott C.
2016-01-01
There have been recent calls for pragmatic measures to assess factors that influence evidence-based practice (EBP) implementation processes and outcomes. The Implementation Leadership Scale (ILS) is a brief and efficient measure that can be used for research or organizational development purposes to assess leader behaviors and actions that actively support effective EBP implementation. The ILS was developed and validated in mental health settings. This study validates the ILS factor structure with providers in alcohol and other drug (AOD) use treatment agencies. Participants were 323 service providers working in 72 workgroups from three AOD use treatment agencies. Confirmatory factor analyses and reliability analyses were conducted to examine the psychometric properties of the ILS. Convergent and discriminant validity were also assessed. Confirmatory factor analyses demonstrated good fit to the hypothesized first and second order factor structure. Internal consistency reliability was excellent. Convergent and discriminant validity was supported. The ILS psychometric characteristics, reliability, and validity were supported in AOD use treatment agencies. The ILS is a brief and pragmatic measure that can be used for research and practice to assess leadership for EBP implementation in AOD use treatment agencies. PMID:27431044
Aarons, Gregory A; Ehrhart, Mark G; Torres, Elisa M; Finn, Natalie K; Roesch, Scott C
2016-09-01
There have been recent calls for pragmatic measures to assess factors that influence evidence-based practice (EBP) implementation processes and outcomes. The Implementation Leadership Scale (ILS) is a brief and efficient measure that can be used for research or organizational development purposes to assess leader behaviors and actions that actively support effective EBP implementation. The ILS was developed and validated in mental health settings. This study validates the ILS factor structure with providers in alcohol and other drug (AOD) use treatment agencies. Participants were 323 service providers working in 72 workgroups from three AOD use treatment agencies. Confirmatory factor analyses and reliability analyses were conducted to examine the psychometric properties of the ILS. Convergent and discriminant validity were also assessed. Confirmatory factor analyses demonstrated good fit to the hypothesized first and second order factor structure. Internal consistency reliability was excellent. Convergent and discriminant validity was supported. The ILS psychometric characteristics, reliability, and validity were supported in AOD use treatment agencies. The ILS is a brief and pragmatic measure that can be used for research and practice to assess leadership for EBP implementation in AOD use treatment agencies. Copyright © 2016 Elsevier Inc. All rights reserved.
Marvel Analysis of the Measured High-resolution Rovibronic Spectra of TiO
NASA Astrophysics Data System (ADS)
McKemmish, Laura K.; Masseron, Thomas; Sheppard, Samuel; Sandeman, Elizabeth; Schofield, Zak; Furtenbacher, Tibor; Császár, Attila G.; Tennyson, Jonathan; Sousa-Silva, Clara
2017-02-01
Accurate, experimental rovibronic energy levels, with associated labels and uncertainties, are reported for 11 low-lying electronic states of the diatomic ^48Ti^16O molecule, determined using the Marvel (Measured Active Rotational-Vibrational Energy Levels) algorithm. All levels are based on lines corresponding to critically reviewed and validated high-resolution experimental spectra taken from 24 literature sources. The transition data are in the 2-22,160 cm^-1 region. Out of the 49,679 measured transitions, 43,885 are triplet-triplet, 5710 are singlet-singlet, and 84 are triplet-singlet transitions. A careful analysis of the resulting experimental spectroscopic network (SN) allows 48,590 transitions to be validated. The transitions determine 93 vibrational band origins of ^48Ti^16O, including 71 triplet and 22 singlet ones. There are 276 (73) triplet-triplet (singlet-singlet) band-heads derived from Marvel experimental energies, 123 (38) of which have never been assigned in low- or high-resolution experiments. The highest J value, where J stands for the total angular momentum, for which an energy level is validated is 163. The number of experimentally derived triplet and singlet ^48Ti^16O rovibrational energy levels is 8682 and 1882, respectively. The lists of validated lines and levels for ^48Ti^16O are deposited in the supporting information to this paper.
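The inversion at the heart of a Marvel-type analysis can be sketched in a few lines: every measured transition contributes one linear equation E_upper - E_lower = wavenumber, and the level energies follow from a least-squares solve with the ground level pinned to zero. The toy network below omits the uncertainty weighting and the network consistency checking that the real algorithm performs.

```python
import numpy as np

# Toy spectroscopic-network inversion: each line gives one equation
# E[up] - E[lo] = nu; solve for level energies by least squares with
# the ground level fixed at zero. Line values are invented.
lines = [(1, 0, 100.02), (2, 1, 99.97), (2, 0, 200.01),
         (3, 1, 250.04), (3, 0, 350.00)]       # (upper, lower, cm-1)
n_levels = 4

A = np.zeros((len(lines) + 1, n_levels))
b = np.zeros(len(lines) + 1)
for row, (up, lo, nu) in enumerate(lines):
    A[row, up], A[row, lo], b[row] = 1.0, -1.0, nu
A[-1, 0] = 1.0                                  # constraint E_0 = 0

energies, *_ = np.linalg.lstsq(A, b, rcond=None)
print(np.round(energies, 3))                    # level energies, cm-1
```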
Houdek, Petr
2017-01-01
The aim of this perspective article is to show that current experimental evidence on factors influencing dishonesty has limited external validity. Most experimental studies are built on random assignment, in which control/experimental groups of subjects face varied sizes of the expected reward for behaving dishonestly, opportunities for cheating, means of rationalizing dishonest behavior, etc., and mean group reactions are observed. These studies have internal validity in assessing the causal influence of these and other factors, but they lack external validity in organizational, market and other environments. If people can opt into or out of diverse real-world environments, an experiment aimed at studying factors influencing the real-life degree of dishonesty should permit such an option. The behavior of such self-selected groups of marginal subjects would probably contain a larger level of (non)deception than the behavior of average people. The article warns that there are not many studies that enable self-selection or sorting of participants into varying environments, which limits current knowledge of the extent and dynamics of dishonest and fraudulent behavior. The article focuses on suggestions for how to improve dishonesty research, especially how to avoid experimenter demand bias.
Houdek, Petr
2017-01-01
The aim of this perspective article is to show that current experimental evidence on factors influencing dishonesty has limited external validity. Most experimental studies are built on random assignment, in which control/experimental groups of subjects face varied sizes of the expected reward for behaving dishonestly, opportunities for cheating, means of rationalizing dishonest behavior, etc., and mean group reactions are observed. These studies have internal validity in assessing the causal influence of these and other factors, but they lack external validity in organizational, market and other environments. If people can opt into or out of diverse real-world environments, an experiment aimed at studying factors influencing the real-life degree of dishonesty should permit such an option. The behavior of such self-selected groups of marginal subjects would probably contain a larger level of (non)deception than the behavior of average people. The article warns that there are not many studies that enable self-selection or sorting of participants into varying environments, which limits current knowledge of the extent and dynamics of dishonest and fraudulent behavior. The article focuses on suggestions for how to improve dishonesty research, especially how to avoid experimenter demand bias. PMID:28955279
Duality, Gauge Symmetries, Renormalization Groups and the BKT Transition
NASA Astrophysics Data System (ADS)
José, Jorge V.
2017-03-01
In this chapter, I will briefly review, from my own perspective, the situation within theoretical physics at the beginning of the 1970s, and the advances that played an important role in providing a solid theoretical and experimental foundation for the Berezinskii-Kosterlitz-Thouless theory (BKT). Over this period, it became clear that the Abelian gauge symmetry of the 2D-XY model had to be preserved to get the right phase structure of the model. In previous analyses, this symmetry was broken when using low order calculational approximations. Duality transformations at that time for two-dimensional models with compact gauge symmetries were introduced by José, Kadanoff, Nelson and Kirkpatrick (JKKN). Their goal was to analyze the phase structure and excitations of XY and related models, including symmetry breaking fields which are experimentally important. In a separate context, Migdal had earlier developed an approximate Renormalization Group (RG) algorithm to implement Wilson’s RG for lattice gauge theories. Although Migdal’s RG approach, later extended by Kadanoff, did not produce a true phase transition for the XY model, it almost did asymptotically in terms of a non-perturbative expansion in the coupling constant with an essential singularity. Using these advances, including work done on instantons (vortices), JKKN analyzed the behavior of the spin-spin correlation functions of the 2D XY-model in terms of an expansion in temperature and vortex-pair fugacity. Their analysis led to a perturbative derivation of RG equations for the XY model which are the same as those first derived by Kosterlitz for the two-dimensional Coulomb gas. JKKN’s results gave a theoretical foundation and justification for BKT’s sound physical assumptions and for the validity of their calculational approximations that were, in principle, strictly valid only at very low temperatures, away from the critical temperature T_BKT. The theoretical predictions were soon tested successfully against experimental results on superfluid helium films. The success of the BKT theory also gave one of the first quantitative proofs of the validity of the RG theory.
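For reference, the RG equations referred to in this account are usually written, in one common convention (prefactors vary between papers), in terms of the dimensionless spin-wave stiffness K and the vortex fugacity y:

```latex
\frac{dK^{-1}}{d\ell} = 4\pi^{3} y^{2} + O(y^{4}), \qquad
\frac{dy}{d\ell} = \left(2 - \pi K\right) y + O(y^{3})
```

The line of fixed points at y = 0 loses stability where πK = 2, which sets the transition temperature T_BKT and the universal jump of the stiffness.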
Duality, Gauge Symmetries, Renormalization Groups and the BKT Transition
NASA Astrophysics Data System (ADS)
José, Jorge V.
2013-06-01
In this chapter, I will briefly review, from my own perspective, the situation within theoretical physics at the beginning of the 1970s, and the advances that played an important role in providing a solid theoretical and experimental foundation for the Berezinskii-Kosterlitz-Thouless theory (BKT). Over this period, it became clear that the Abelian gauge symmetry of the 2D-XY model had to be preserved to get the right phase structure of the model. In previous analyses, this symmetry was broken when using low order calculational approximations. Duality transformations at that time for two-dimensional models with compact gauge symmetries were introduced by José, Kadanoff, Nelson and Kirkpatrick (JKKN). Their goal was to analyze the phase structure and excitations of XY and related models, including symmetry breaking fields which are experimentally important. In a separate context, Migdal had earlier developed an approximate Renormalization Group (RG) algorithm to implement Wilson's RG for lattice gauge theories. Although Migdal's RG approach, later extended by Kadanoff, did not produce a true phase transition for the XY model, it almost did asymptotically in terms of a non-perturbative expansion in the coupling constant with an essential singularity. Using these advances, including work done on instantons (vortices), JKKN analyzed the behavior of the spin-spin correlation functions of the 2D XY-model in terms of an expansion in temperature and vortex-pair fugacity. Their analysis led to a perturbative derivation of RG equations for the XY model which are the same as those first derived by Kosterlitz for the two-dimensional Coulomb gas. JKKN's results gave a theoretical foundation and justification for BKT's sound physical assumptions and for the validity of their calculational approximations that were, in principle, strictly valid only at very low temperatures, away from the critical temperature T_BKT. The theoretical predictions were soon tested successfully against experimental results on superfluid helium films. The success of the BKT theory also gave one of the first quantitative proofs of the validity of the RG theory.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Franco, Manuel
The objective of this work was to characterize the neutron irradiation system consisting of americium-241 beryllium (241AmBe) neutron sources placed in a polyethylene shielding for use at Sandia National Laboratories (SNL) Low Dose Rate Irradiation Facility (LDRIF). With a total activity of 0.3 TBq (9 Ci), the source consisted of three recycled 241AmBe sources of different activities that had been combined into a single source. The source in its polyethylene shielding will be used in neutron irradiation testing of components. The characterization of the source-shielding system was necessary to evaluate the radiation environment for future experiments. Characterization of the source was also necessary because the documentation for the three component sources and their relative alignment within the Special Form Capsule (SFC) was inadequate. The system consisting of the source and shielding was modeled using the Monte Carlo N-Particle transport code (MCNP). The model was validated by benchmarking it against measurements using multiple techniques. To characterize the radiation fields over the full spatial geometry of the irradiation system, it was necessary to use a number of instruments of varying sensitivities. First, computed photon radiography assisted in determining the orientation of the component sources. With the capsule properly oriented inside the shielding, the neutron spectra were measured using a variety of techniques. An N-probe Microspec and a neutron Bubble Dosimeter Spectrometer (BDS) set were used to characterize the neutron spectra/field in several locations. In the third technique, neutron foil activation was used to ascertain the neutron spectra. A high-purity germanium (HPGe) detector was used to characterize the photon spectrum. The experimentally measured spectra and the MCNP results compared well. Once the MCNP model was validated to an adequate level of confidence, parametric analyses were performed on the model to optimize for potential experimental configurations and neutron spectra for component irradiation. The final product of this work is an MCNP model validated by measurements, an overall understanding of the neutron irradiation system including photon/neutron transport and effective dose rates throughout the system, and possible experimental configurations for future irradiation of components.
Advanced Numerical Model for Irradiated Concrete
DOE Office of Scientific and Technical Information (OSTI.GOV)
Giorla, Alain B.
In this report, we establish a numerical model for concrete exposed to irradiation to address these three critical points. The model accounts for creep in the cement paste and its coupling with damage, temperature and relative humidity. The shift in failure mode with the loading rate is also properly represented. The numerical model for creep has been validated and calibrated against different experiments in the literature [Wittmann, 1970, Le Roy, 1995]. Results from a simplified model are shown to showcase the ability of numerical homogenization to simulate irradiation effects in concrete. In future works, the complete model will be applied to the analysis of the irradiation experiments of Elleuch et al. [1972] and Kelly et al. [1969]. This requires a careful examination of the experimental environmental conditions, as in both cases certain critical information is missing, including the relative humidity history. A sensitivity analysis will be conducted to provide lower and upper bounds of the concrete expansion under irradiation, and to check whether the scatter in the simulated results matches the one found in experiments. The numerical and experimental results will be compared in terms of expansion and loss of mechanical stiffness and strength. Both effects should be captured accordingly by the model to validate it. Once the model has been validated on these two experiments, it can be applied to simulate concrete from nuclear power plants. To do so, the materials used in these concretes must be as well characterized as possible. The main parameters required are the mechanical properties of each constituent in the concrete (aggregates, cement paste), namely the elastic modulus, the creep properties, the tensile and compressive strength, the thermal expansion coefficient, and the drying shrinkage. These can be either measured experimentally, estimated from the initial composition in the case of cement paste, or back-calculated from mechanical tests on concrete. If some are unknown, a sensitivity analysis must be carried out to provide lower and upper bounds of the material behaviour. Finally, the model can be used as a basis to formulate a macroscopic material model for concrete subject to irradiation, which later can be used in structural analyses to estimate the structural impact of irradiation on nuclear power plants.
Evaluating the Dimensionality of Self-Determination Theory's Relative Autonomy Continuum.
Sheldon, Kennon M; Osin, Evgeny N; Gordeeva, Tamara O; Suchkov, Dmitry D; Sychev, Oleg A
2017-09-01
We conducted a theoretical and psychometric evaluation of self-determination theory's "relative autonomy continuum" (RAC), an important aspect of the theory whose validity has recently been questioned. We first derived a Comprehensive Relative Autonomy Index (C-RAI) containing six subscales and 24 items, by conducting a paired paraphrase content analysis of existing RAI measures. We administered the C-RAI to multiple U.S. and Russian samples, assessing motivation to attend class, study a major, and take responsibility. Item-level and scale-level multidimensional scaling analyses, confirmatory factor analyses, and simplex/circumplex modeling analyses reaffirmed the validity of the RAC, across multiple samples, stems, and studies. Validation analyses predicting subjective well-being and trait autonomy from the six separate subscales, in combination with various higher order composites (weighted and unweighted), showed that an aggregate unweighted RAI score provides the most unbiased and efficient indicator of the overall quality of motivation within the behavioral domain being assessed.
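The composites compared in those validation analyses are simple to compute. The sketch below contrasts an unweighted RAI (mean of the autonomous subscales minus mean of the controlled ones) with the classic weighted RAI; the subscale names and the -2/-1/+1/+2 weights follow the familiar four-subscale convention rather than the six C-RAI subscales, and the scores are invented.

```python
import numpy as np

# Illustrative RAI composites. Subscale set and weights are the classic
# four-subscale convention, used here as a stand-in for the C-RAI.
scores = {"external": 3.1, "introjected": 4.0,
          "identified": 5.8, "intrinsic": 5.2}     # 1-7 ratings

# Unweighted RAI: autonomous minus controlled subscale means.
rai_unweighted = (np.mean([scores["identified"], scores["intrinsic"]])
                  - np.mean([scores["external"], scores["introjected"]]))

# Classic weighted RAI.
weights = {"external": -2, "introjected": -1,
           "identified": 1, "intrinsic": 2}
rai_weighted = sum(weights[k] * scores[k] for k in scores)

print(f"unweighted RAI {rai_unweighted:.2f}, weighted RAI {rai_weighted:.1f}")
```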
Fatigue Failure of Space Shuttle Main Engine Turbine Blades
NASA Technical Reports Server (NTRS)
Swanson, Gregory R.; Arakere, Nagaraj K.
2000-01-01
Experimental validation of finite element modeling of single-crystal turbine blades is presented. Experimental results from uniaxial high cycle fatigue (HCF) test specimens and full-scale Space Shuttle Main Engine test firings with the High Pressure Fuel Turbopump/Alternate Turbopump (HPFTP/AT) provide the data used for the validation. The conclusions show the significant contribution of the crystal orientation within the blade to the resulting life of the component, that the analysis can predict this variation, and that experimental testing demonstrates it.
NASA Astrophysics Data System (ADS)
Zhang, Yaning; Xu, Fei; Li, Bingxi; Kim, Yong-Song; Zhao, Wenke; Xie, Gongnan; Fu, Zhongbin
2018-04-01
This study aims to validate the three-phase heat and mass transfer model developed in the first part (Three-phase heat and mass transfer model for unsaturated soil freezing process: Part 1 - model development). Experimental results from previous studies and experiments were used for the validation. The results showed that the correlation coefficients between the simulated and experimental water contents at different soil depths were between 0.83 and 0.92. The correlation coefficients between the simulated and experimental liquid water contents at different soil temperatures were between 0.95 and 0.99. With these high accuracies, the developed model can be reliably used to predict the water contents at different soil depths and temperatures.
Experimental design methodologies in the optimization of chiral CE or CEC separations: an overview.
Dejaegher, Bieke; Mangelings, Debby; Vander Heyden, Yvan
2013-01-01
In this chapter, an overview of experimental designs to develop chiral capillary electrophoresis (CE) and capillary electrochromatographic (CEC) methods is presented. Method development is generally divided into technique selection, method optimization, and method validation. In the method optimization part, often two phases can be distinguished, i.e., a screening and an optimization phase. In method validation, the method is evaluated on its fit for purpose. A validation item, also applying experimental designs, is robustness testing. In the screening phase and in robustness testing, screening designs are applied. During the optimization phase, response surface designs are used. The different design types and their application steps are discussed in this chapter and illustrated by examples of chiral CE and CEC methods.
NASA Astrophysics Data System (ADS)
Hegazy, Maha A.; Lotfy, Hayam M.; Mowaka, Shereen; Mohamed, Ekram Hany
2016-07-01
Wavelets have been adapted for a vast number of signal-processing applications due to the amount of information that can be extracted from a signal. In this work, a comparative study was conducted on the efficiency of the continuous wavelet transform (CWT) as a signal-processing tool in univariate regression and as a pre-processing tool in multivariate analysis using partial least squares (CWT-PLS). These were applied to complex spectral signals of ternary and quaternary mixtures. The CWT-PLS method succeeded in the simultaneous determination of a quaternary mixture of drotaverine (DRO), caffeine (CAF), paracetamol (PAR) and p-aminophenol (PAP, the major impurity of paracetamol). The univariate CWT, in contrast, failed to determine the quaternary mixture simultaneously and was able to determine only PAR and PAP, as well as the ternary mixtures of DRO, CAF, and PAR and of CAF, PAR, and PAP. During the CWT calculations, different wavelet families were tested. The univariate CWT method was validated according to the ICH guidelines. For the development of the CWT-PLS model, a calibration set was prepared by means of an orthogonal experimental design, and the absorption spectra were recorded and processed by CWT. The CWT-PLS model was constructed by regression between the wavelet coefficients and concentration matrices, and validation was performed by both cross-validation and external validation sets. Both methods were successfully applied to the determination of the studied drugs in pharmaceutical formulations.
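A bare-bones CWT-PLS pipeline in the spirit of the paper is sketched below: each absorption spectrum is replaced by its CWT coefficients at a single scale, and PLS then regresses the concentration matrix on those coefficients. The spectra and concentrations are random placeholders, and the 'mexh' wavelet at scale 16 stands in for whatever family and scale a real calibration study would select.

```python
import numpy as np
import pywt
from sklearn.cross_decomposition import PLSRegression

# Synthetic calibration set: 25 "spectra" (random walks) and the
# concentrations of 4 analytes. Real data would come from measured
# absorption spectra of designed mixtures.
rng = np.random.default_rng(4)
n_samples, n_points = 25, 300
spectra = rng.normal(0, 1, (n_samples, n_points)).cumsum(axis=1)
conc = rng.uniform(1, 10, (n_samples, 4))

def cwt_features(spectrum, scale=16.0):
    """Replace a spectrum by its CWT coefficients at one scale."""
    coef, _ = pywt.cwt(spectrum, scales=[scale], wavelet="mexh")
    return coef[0]

X = np.vstack([cwt_features(s) for s in spectra])
pls = PLSRegression(n_components=4).fit(X, conc)
print("calibration R^2:", round(pls.score(X, conc), 3))
```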
Olondo, C; Legarda, F; Herranz, M; Idoeta, R
2017-04-01
This paper shows the procedure performed to validate the migration equation and the migration parameters' values presented in a previous paper (Legarda et al., 2011) regarding the migration of 137Cs in Spanish mainland soils. In this paper, the model validation has been carried out by checking experimentally obtained activity concentration values against those predicted by the model. The experimental data come from the measured vertical activity profiles of 8 new sampling points located in northern Spain. Before testing the predicted values of the model, the uncertainty of those values was assessed with an appropriate uncertainty analysis. Once the uncertainty of the model was established, the two sets of activity concentration values, experimental versus model-predicted, were compared. Model validation was performed by analyzing the model's accuracy, studying it as a whole and also at different depth intervals. As a result, this model has been validated as a tool to predict 137Cs behaviour in a Mediterranean environment. Copyright © 2017 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Ben Mosbah, Abdallah
In order to improve the quality of wind tunnel tests and of the tools used to perform aerodynamic tests on aircraft wings in the wind tunnel, new methodologies were developed and tested on rigid and flexible wing models. The flexible wing concept consists of replacing a portion (lower and/or upper) of the skin with a flexible portion whose shape can be changed using an actuation system installed inside the wing. The main purpose of this concept is to improve the aerodynamic performance of the aircraft, and especially to reduce its fuel consumption. Numerical and experimental analyses were conducted to develop and test the methodologies proposed in this thesis. To control the flow inside the test section of the Price-Paidoussis wind tunnel of LARCASE, numerical and experimental analyses were performed. Computational fluid dynamics calculations were made in order to obtain a database used to develop a new hybrid methodology for wind tunnel calibration, which allows the flow in the test section of the Price-Paidoussis wind tunnel to be controlled. For the fast determination of aerodynamic parameters, new hybrid methodologies were proposed. These methodologies were used to control flight parameters by calculating the drag, lift and pitching moment coefficients and the pressure distribution around an airfoil. These aerodynamic coefficients were calculated from known airflow conditions such as the angle of attack and the Mach and Reynolds numbers. In order to modify the shape of the wing skin, electric actuators were installed inside the wing to obtain the desired shape. These deformations provide optimal profiles for different flight conditions in order to reduce the fuel consumption. A controller based on neural networks was implemented to obtain the desired actuator displacements. A metaheuristic algorithm was used in hybridization with neural network and support vector machine approaches; their combination was optimized, and very good results were obtained in reduced computing time. The results obtained with the methodologies presented in this thesis were validated with numerical data obtained using the XFoil and Fluent codes, and with experimental data obtained in the subsonic Price-Paidoussis blow-down wind tunnel.
Monse, Bella; Benzian, Habib; Naliponguit, Ella; Belizario, Vincente; Schratz, Alexander; van Palenstein Helderman, Wim
2013-03-21
Child health in many low- and middle-income countries lags behind international goals and affects children's education, well-being, and general development. Large-scale school health programmes can be effective in reducing preventable diseases through cost-effective interventions. This paper outlines the baseline and 1-year results of a longitudinal health study assessing the impact of the Fit for School Programme in the Philippines. A longitudinal 4-year cohort study was conducted in the province of Camiguin, Mindanao (experimental group); an external concurrent control group was studied in Gingoog, Mindanao. The study has three experimental groups: group 1, daily handwashing with soap, daily brushing with fluoride toothpaste, and biannual deworming with 400 mg albendazole (Essential Health Care Program [EHCP]); group 2, EHCP plus twice-a-year access to school-based Oral Urgent Treatment; group 3, EHCP plus weekly toothbrushing with high-fluoride-concentration gel. A non-concurrent internal control group was also included. Baseline data on anthropometric indicators to calculate body mass index (BMI), soil-transmitted helminth (STH) infection in stool samples, and dental caries were collected in August 2009 and August 2010. Data were analysed to assess the validity of the control group design, the baseline, and the 1-year results. In the cohort study, 412 children were examined at baseline and 341 one year after the intervention. The baseline results were in line with national averages for STH infection, BMI, and dental caries in group 1 and the control groups. Children lost to follow-up had similar baseline characteristics in the experimental and control groups. After 1 year, group 1 showed a significantly higher increase in mean BMI and a lower prevalence of moderate to heavy STH infection than the external concurrent control group. The increases in caries and dental infections were reduced but not statistically significant. The results for groups 2 and 3 will be reported separately. Despite the short 1-year observation period, the study found a reduction in the prevalence of moderate to heavy STH infections, a rise in mean BMI, and a (statistically non-significant) reduction in dental caries and infections. The study design proved functional in actual field conditions. Critical aspects affecting the validity of cohort studies are analysed and discussed. Trial registration: DRKS00003431; WHO Universal Trial Number U1111-1126-0718.
Family Early Literacy Practices Questionnaire: A Validation Study for a Spanish-Speaking Population
ERIC Educational Resources Information Center
Lewis, Kandia
2012-01-01
The purpose of the current study was to evaluate the psychometric validity of a Spanish translated version of a family involvement questionnaire (the FELP) using a mixed-methods design. Thus, statistical analyses (i.e., factor analysis, reliability analysis, and item analysis) and qualitative analyses (i.e., focus group data) were assessed.…
Mangia, Anna Lisa; Cortesi, Matteo; Fantozzi, Silvia; Giovanardi, Andrea; Borra, Davide; Gatta, Giorgio
2017-01-01
The aims of the present study were the instrumental validation of inertial-magnetic measurement units (IMMUs) in water, and the description of their use in clinical and sports aquatic applications with customized 3D multi-body models. First, several tests were performed to map the magnetic field in the swimming pool and to identify the best volume for experimental test acquisition, with a mean dynamic orientation error lower than 5°. Subsequently, gait and swimming analyses were explored in terms of spatiotemporal and joint kinematic variables. The extraction of spatiotemporal parameters alone highlighted several critical issues, and the joint kinematic information proved to be an added value for both rehabilitative and sport training purposes. Furthermore, 3D joint kinematics obtained with the IMMUs provided quantitative information similar to that of more expensive and bulky systems, but with a simpler and faster setup, a less time-consuming processing phase, and the possibility to record and analyze a higher number of strides/strokes without the limitations imposed by cameras. PMID:28441739
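One common way to quantify a "dynamic orientation error" like the 5° figure above is the rotation angle of the relative quaternion between the IMMU estimate and a reference system; the sketch below assumes [w, x, y, z] quaternion ordering and uses invented values, not the study's data.

```python
# Hedged sketch: orientation error as the angle of the relative rotation
# between an estimated and a reference quaternion.
import numpy as np

def quat_conj(q):
    return np.array([q[0], -q[1], -q[2], -q[3]])

def quat_mul(a, b):
    w1, x1, y1, z1 = a
    w2, x2, y2, z2 = b
    return np.array([
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2,
    ])

def orientation_error_deg(q_est, q_ref):
    """Angle of the rotation taking the reference frame to the estimate."""
    dq = quat_mul(quat_conj(q_ref), q_est)
    return np.degrees(2.0 * np.arccos(np.clip(abs(dq[0]), -1.0, 1.0)))

q_ref = np.array([1.0, 0.0, 0.0, 0.0])                      # identity
q_est = np.array([np.cos(np.radians(2)), 0, 0, np.sin(np.radians(2))])
print(orientation_error_deg(q_est, q_ref))                  # ~4 degrees
```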
Ultrasound SIV measurement of helical valvular flow behind the great saphenous vein
NASA Astrophysics Data System (ADS)
Park, Jun Hong; Kim, Jeong Ju; Lee, Sang Joon; Yeom, Eunseop; Experimental Fluid Mechanics Laboratory Team; Microthermal and Microfluidic Measurements Laboratory Collaboration
2017-11-01
Dysfunction of the venous valve and the induced secondary abnormal flow are closely associated with venous diseases. Thus, detailed analysis of venous valvular flow is invaluable from biological and medical perspectives. However, most previous studies on venous perivalvular flows were based on qualitative analyses, and quantitative understanding of these flows remains limited. In this study, 3D valvular flows under in vitro and in vivo conditions were experimentally investigated using ultrasound speckle image velocimetry (SIV) to analyze their flow characteristics. The in vitro results obtained by the SIV technique were compared with those derived by numerical simulation and by the color Doppler method to validate its measurement accuracy. Blood flow in the human great saphenous vein was then measured using SIV with respect to a dimensionless index, the helical intensity. The results obtained by the SIV method matched well with those obtained by numerical simulation and the color Doppler method. The hemodynamic characteristics of 3D valvular flows measured by the validated SIV method would be helpful in the diagnosis of valve-related venous diseases.
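The core of speckle image velocimetry is recovering the displacement of a speckle pattern between two frames from the peak of their cross-correlation; the sketch below demonstrates the FFT-based correlation step on a synthetic window (window size and shift invented).

```python
# Hedged sketch of the SIV/PIV correlation step: the inter-frame shift of a
# speckle window is the location of the cross-correlation peak.
import numpy as np

rng = np.random.default_rng(1)
win = rng.random((64, 64))                 # speckle window, frame 1
shift = (3, 5)                             # true displacement in pixels
win2 = np.roll(win, shift, axis=(0, 1))    # frame 2 = shifted copy

# FFT-based cross-correlation
corr = np.fft.ifft2(np.fft.fft2(win2) * np.conj(np.fft.fft2(win))).real
dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
# wrap displacements larger than half a window
dy, dx = [d - s if d > s // 2 else d for d, s in zip((dy, dx), corr.shape)]
print(dy, dx)                              # -> 3 5; velocity = shift / dt
```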
Accuracy Analysis for Automatic Orientation of a Tumbling Oblique Viewing Sensor System
NASA Astrophysics Data System (ADS)
Stebner, K.; Wieden, A.
2014-03-01
Dynamic camera systems with moving parts are difficult to handle in the photogrammetric workflow, because it is not ensured that the dynamics are constant over the recording period. Even minimal changes of the camera's orientation greatly influence the projection of oblique images. In this publication these effects - originating from the kinematic chain of a dynamic camera system - are analysed and validated. A member of the Modular Airborne Camera System family - MACS-TumbleCam - consisting of a vertical viewing and a tumbling oblique camera was used for this investigation. The focus is on dynamic geometric modeling and the stability of the kinematic chain. To validate the experimental findings, the determined parameters are applied to the exterior orientation of an actual aerial image acquisition campaign using MACS-TumbleCam. The quality of the parameters is sufficient for direct georeferencing of oblique image data from the orientation information of a synchronously captured vertical image dataset. Relative accuracy for the oblique dataset ranges from 1.5 pixels, when using all images of the image block, to 0.3 pixels when using only adjacent images.
Welham, Nathan V.; Ling, Changying; Dawson, John A.; Kendziorski, Christina; Thibeault, Susan L.; Yamashita, Masaru
2015-01-01
The vocal fold (VF) mucosa confers elegant biomechanical function for voice production but is susceptible to scar formation following injury. Current understanding of VF wound healing is hindered by a paucity of data and is therefore often generalized from research conducted in skin and other mucosal systems. Here, using a previously validated rat injury model, expression microarray technology and an empirical Bayes analysis approach, we generated a VF-specific transcriptome dataset to better capture the system-level complexity of wound healing in this specialized tissue. We measured differential gene expression at 3, 14 and 60 days post-injury compared to experimentally naïve controls, pursued functional enrichment analyses to refine and add greater biological definition to the previously proposed temporal phases of VF wound healing, and validated the expression and localization of a subset of previously unidentified repair- and regeneration-related genes at the protein level. Our microarray dataset is a resource for the wider research community and has the potential to stimulate new hypotheses and avenues of investigation, improve biological and mechanistic insight, and accelerate the identification of novel therapeutic targets. PMID:25592437
Integrated modeling analysis of a novel hexapod and its application in active surface
NASA Astrophysics Data System (ADS)
Yang, Dehua; Zago, Lorenzo; Li, Hui; Lambert, Gregory; Zhou, Guohua; Li, Guoping
2011-09-01
This paper presents the concept and integrated modeling analysis of a novel mechanism, a 3-CPS/RPPS hexapod, for supporting segmented reflectors for radio telescopes and eventually segmented mirrors of optical telescopes. The concept comprises a novel type of hexapod with an original organization of actuators, and hence of degrees of freedom, based on a swaying-arm design concept. With specially designed connecting joints between panels/segments, an iso-static master-slave active surface concept can then be achieved for any triangular and/or hexagonal panel/segment pattern. The integrated modeling comprises all the multifold sizing and performance aspects that must be evaluated concurrently in order to optimize and validate the design and the configuration. In particular, a comprehensive investigation of kinematic behavior, dynamic analysis, wave-front error and sensitivity analysis is carried out using common tools such as MATLAB/SimMechanics, CALFEM and ANSYS. Notably, we introduce the finite element method as a competent approach for analysing the multi-degree-of-freedom mechanism. Experimental verifications already performed to validate individual aspects of the integrated concept are also presented, together with the results obtained.
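As a rough illustration of the kinematic computations underlying such analyses, the sketch below solves the inverse kinematics of a generic hexapod (Stewart platform): given a platform pose, each actuator length follows from the distance between its base and platform joints. The joint layout is invented; the actual 3-CPS/RPPS geometry differs.

```python
# Hedged sketch: inverse kinematics of a generic six-legged parallel
# mechanism, not the paper's specific 3-CPS/RPPS arrangement.
import numpy as np

def rot_z(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

# Base and platform joints on circles (radii in metres, invented)
ang = np.radians([0, 60, 120, 180, 240, 300])
base = np.column_stack([1.0*np.cos(ang), 1.0*np.sin(ang), np.zeros(6)])
plat = np.column_stack([0.6*np.cos(ang), 0.6*np.sin(ang), np.zeros(6)])

def leg_lengths(t, rz):
    """Actuator lengths for platform translation t and rotation rz about z."""
    R = rot_z(rz)
    tips = plat @ R.T + np.asarray(t)      # platform joints in base frame
    return np.linalg.norm(tips - base, axis=1)

print(leg_lengths([0.0, 0.0, 0.8], np.radians(2.0)))  # six leg lengths [m]
```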
NASA Technical Reports Server (NTRS)
Gaddis, Stephen W.; Hudson, Susan T.; Johnson, P. D.
1992-01-01
NASA's Marshall Space Flight Center has established a cold airflow turbine test program to experimentally determine the performance of liquid rocket engine turbopump drive turbines. Testing of the SSME alternate turbopump development (ATD) fuel turbine was conducted for back-to-back comparisons with the baseline SSME fuel turbine results obtained in the first quarter of 1991. Turbine performance, Reynolds number effects, and turbine diagnostics, such as stage reactions and exit swirl angles, were investigated at the turbine design point and at off-design conditions. The test data showed that the ATD fuel turbine test article was approximately 1.4 percent higher in efficiency and flowed 5.3 percent more than the baseline fuel turbine test article. This paper describes the method and results used to validate the ATD fuel turbine aerodynamic design. The results are being used to determine the ATD high pressure fuel turbopump (HPFTP) turbine performance over its operating range, anchor the SSME ATD steady-state performance model, and validate various prediction and design analyses.
NASA Astrophysics Data System (ADS)
Nazir, Mohd Yusuf Mohd; Al-Shorgani, Najeeb Kaid Nasser; Kalil, Mohd Sahaid; Hamid, Aidil Abdul
2015-09-01
In this study, three factors (fructose concentration, agitation speed and monosodium glutamate (MSG) concentration) were optimized to enhance DHA production by Schizochytrium SW1 using response surface methodology (RSM). A central composite design was applied as the experimental design, and analysis of variance (ANOVA) was used to analyze the data. The experiments were conducted in 500 mL flasks with a 100 mL working volume at 30°C for 96 hours. ANOVA revealed that the process was adequately represented by the quadratic model (p<0.0001) and that two of the factors, agitation speed and MSG concentration, significantly affected DHA production (p<0.005). The level of influence of each variable and a quadratic polynomial equation for DHA production were obtained by multiple regression analyses. The estimated optimum conditions for maximizing DHA production by SW1 were 70 g/L fructose, 250 rpm agitation speed and 12 g/L MSG. The quadratic model was then validated by applying the estimated optimum conditions, which confirmed the model validity, with a DHA yield of 52.86%.
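A minimal sketch of the RSM step described above: fit a second-order (quadratic) response surface to designed experiments and locate its optimum within the experimental region. The data points and factor ranges below are invented, not the SW1 measurements.

```python
# Hedged sketch: quadratic response-surface fit plus optimum search,
# illustrating RSM mechanics on synthetic data.
import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression
from scipy.optimize import minimize

rng = np.random.default_rng(2)
# Factors: fructose [g/L], agitation [rpm], MSG [g/L] (ranges invented)
X = rng.uniform([40, 150, 4], [100, 350, 16], size=(20, 3))
# Invented response with an interior optimum near (70, 250, 12)
y = (5 - 0.002*(X[:, 0]-70)**2 - 0.0001*(X[:, 1]-250)**2
       - 0.05*(X[:, 2]-12)**2 + rng.normal(0, 0.05, 20))

poly = PolynomialFeatures(degree=2, include_bias=False)
model = LinearRegression().fit(poly.fit_transform(X), y)

# Maximize the fitted surface inside the experimental region
res = minimize(lambda x: -model.predict(poly.transform([x]))[0],
               x0=[70, 250, 10], bounds=[(40, 100), (150, 350), (4, 16)])
print(res.x)   # estimated optimum factor settings
```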
NASA Astrophysics Data System (ADS)
Mehrpooya, Mehdi; Dehghani, Hossein; Ali Moosavian, S. M.
2016-02-01
A combined system containing a solid oxide fuel cell-gas turbine power plant, a Rankine steam cycle and an ammonia-water absorption refrigeration system is introduced and analyzed. In this process, power, heat and cooling are produced. Energy and exergy analyses, along with economic factors, are used to identify the optimum operating point of the system. The developed electrochemical model of the fuel cell is validated against experimental results, as are the thermodynamic package and the main parameters of the absorption refrigeration system. The power output of the system is 500 kW. An optimization problem is defined in order to find the optimal operating point. Decision variables are the current density, the temperature of the exhaust gases from the boiler, the steam turbine pressures (high and medium), the generator temperature and the cooling water consumption. Results indicate that the electrical efficiency of the combined system is 62.4% (LHV). Produced refrigeration (at -10 °C) and heat recovery are 101 kW and 22.1 kW, respectively. The investment cost for the combined system (without the absorption cycle) is about $2917 per kW.
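A hedged sketch of what the headline efficiency figure means: LHV electrical efficiency is net electric power divided by fuel chemical power. The fuel flow below is back-calculated for illustration, assuming methane fuel, and is not taken from the paper.

```python
# Hedged sketch: LHV-basis electrical efficiency bookkeeping.
P_el = 500.0            # net electric output [kW]
LHV_CH4 = 50_000.0      # lower heating value of methane [kJ/kg] (assumed)
eta_el = 0.624          # reported LHV electrical efficiency

m_fuel = P_el / (eta_el * LHV_CH4)      # implied fuel mass flow [kg/s]
print(f"fuel flow ~ {m_fuel*1000:.2f} g/s")
print(f"check: eta = {P_el / (m_fuel * LHV_CH4):.3f}")   # -> 0.624
```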
Experimental Errors in QSAR Modeling Sets: What We Can Do and What We Cannot Do.
Zhao, Linlin; Wang, Wenyi; Sedykh, Alexander; Zhu, Hao
2017-06-30
Numerous chemical data sets have become available for quantitative structure-activity relationship (QSAR) modeling studies. However, the quality of different data sources may differ, depending on the nature of the experimental protocols. Potential experimental errors in the modeling sets may therefore lead to the development of poor QSAR models and further affect the predictions of new compounds. In this study, we explored the relationship between the ratio of questionable data in the modeling sets, which was obtained by simulating experimental errors, and the QSAR modeling performance. To this end, we used eight data sets (four continuous endpoints and four categorical endpoints) that have been extensively curated both in-house and by our collaborators to create over 1800 QSAR models. Each data set was duplicated to create several new modeling sets with different ratios of simulated experimental errors (i.e., randomizing the activities of part of the compounds) in the modeling process. A fivefold cross-validation process was used to evaluate the modeling performance, which deteriorates as the ratio of experimental errors increases. All of the resulting models were also used to predict external sets of new compounds, which were excluded at the beginning of the modeling process. The modeling results showed that the compounds with relatively large prediction errors in the cross-validation process are likely to be those with simulated experimental errors. However, after removing a certain number of compounds with large prediction errors in the cross-validation process, the external predictions of new compounds did not improve. Our conclusion is that QSAR predictions, especially consensus predictions, can identify compounds with potential experimental errors, but removing those compounds on the basis of the cross-validation procedure is not a reasonable means of improving model predictivity, as it leads to overfitting.
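The error-simulation protocol described above can be sketched in a few lines: corrupt a chosen fraction of training activities, then measure fivefold cross-validation performance. The data below are synthetic stand-ins generated with scikit-learn, not the curated chemical sets used in the study.

```python
# Hedged sketch: label-noise injection plus fivefold cross-validation,
# mirroring the "simulated experimental errors" idea on synthetic data.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(3)
X, y = make_classification(n_samples=500, n_features=30, random_state=3)

for error_ratio in [0.0, 0.1, 0.2, 0.4]:
    y_noisy = y.copy()
    idx = rng.choice(len(y), size=int(error_ratio * len(y)), replace=False)
    y_noisy[idx] = 1 - y_noisy[idx]          # simulated experimental errors
    cv = cross_val_score(RandomForestClassifier(random_state=0),
                         X, y_noisy, cv=5, scoring="balanced_accuracy")
    print(f"{error_ratio:.0%} errors -> CV balanced accuracy {cv.mean():.2f}")
```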
NASA Astrophysics Data System (ADS)
Nir, A.; Doughty, C.; Tsang, C. F.
Validation methods that were developed in the context of the deterministic concepts of past generations often cannot be directly applied to environmental problems, which may be characterized by limited reproducibility of results and highly complex models. Instead, validation is interpreted here as a series of activities, including both theoretical and experimental tests, designed to enhance our confidence in the capability of a proposed model to describe some aspect of reality. We examine the validation process applied to a project concerned with heat and fluid transport in porous media, in which mathematical modeling, simulation, and results of field experiments are evaluated in order to determine the feasibility of a system for seasonal thermal energy storage in shallow unsaturated soils. Technical details of the field experiments are not included, but appear in previous publications. Validation activities are divided into three stages. The first stage, carried out prior to the field experiments, is concerned with modeling the relevant physical processes, optimization of the heat-exchanger configuration and the shape of the storage volume, and multi-year simulation. Subjects requiring further theoretical and experimental study are identified at this stage. The second stage encompasses the planning and evaluation of the initial field experiment. Simulations are made to determine the experimental time scale and optimal sensor locations. Soil thermal parameters and temperature boundary conditions are estimated using an inverse method. Then results of the experiment are compared with model predictions using different parameter values and modeling approximations. In the third stage, results of an experiment performed under different boundary conditions are compared to predictions made by the models developed in the second stage. Various aspects of this theoretical and experimental field study are described as examples of the verification and validation procedure. There is no attempt to validate a specific model; rather, several models of increasing complexity are compared with experimental results. The outcome is interpreted as a demonstration of the paradigm proposed by van der Heijde [26] that different constituencies have different objectives for the validation process and that their acceptance criteria therefore differ as well.
Pandey, Ramakant; Premalatha, M
2017-03-01
Open raceway ponds are widely adopted for cultivating microalgae on a large scale. The working depth of the raceway pond is the major parameter to be analysed for increasing the volume-to-surface-area ratio. The working depth is limited to 5-15 cm in conventional ponds, but in this analysis a working depth of 25 cm is considered. In this work, the positioning of the paddle wheel is analysed and the corresponding Vertical Mixing Index values are calculated using CFD. The flow pattern along the length of the raceway pond at three different paddle wheel speeds is analysed for L/W ratios of 6, 8 and 10. The effect of the clearance (C) between the rotor blade tip and the bottom surface is also analysed for four clearance conditions, i.e. C = 2, 5, 10 and 15. The moving reference frame method of Fluent is used for modeling the six-blade paddle wheel, and the realizable k-ε model is used for capturing turbulence characteristics. The overall objective of this work is to analyse the geometry required to maintain a minimum flow velocity that avoids settling of algae at the 25 cm working depth. The geometry given in [13] is built using ANSYS DesignModeler, and CFD results are generated using ANSYS FLUENT for the purpose of validation. Good agreement is observed between the CFD and experimental particle image velocimetry results, with a deviation of 7.23%.
Pre-test CFD Calculations for a Bypass Flow Standard Problem
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rich Johnson
The bypass flow in a prismatic high temperature gas-cooled reactor (HTGR) is the flow that occurs between adjacent graphite blocks. Gaps exist between blocks due to variances in their manufacture and installation and because of the expansion and shrinkage of the blocks from heating and irradiation. Although the temperature of fuel compacts and graphite is sensitive to the presence of bypass flow, there is great uncertainty in the level and effects of the bypass flow. The Next Generation Nuclear Plant (NGNP) program at the Idaho National Laboratory has undertaken to produce experimental data of isothermal bypass flow between three adjacent graphite blocks. These data are intended to provide validation for computational fluid dynamic (CFD) analyses of the bypass flow. Such validation data sets are called Standard Problems in the nuclear safety analysis field. Details of the experimental apparatus as well as several pre-test calculations of the bypass flow are provided. Pre-test calculations are useful in examining the nature of the flow and to see if there are any problems associated with the flow and its measurement. The apparatus is designed to be able to provide three different gap widths in the vertical direction (the direction of the normal coolant flow) and two gap widths in the horizontal direction. It is expected that the vertical bypass flow will range from laminar to transitional to turbulent flow for the different gap widths that will be available.
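A hedged back-of-envelope sketch of the regime question raised above: the flow regime in a thin rectangular gap can be gauged from the Reynolds number based on hydraulic diameter. All numbers below (gap sizes, velocity, air properties) are illustrative only and are not the facility's design values.

```python
# Hedged sketch: Reynolds number in a rectangular gap, hydraulic-diameter
# based, to see when gap flow is laminar vs. transitional/turbulent.
def gap_reynolds(width_m, depth_m, velocity_m_s,
                 rho=1.2, mu=1.8e-5):            # air near room conditions
    area = width_m * depth_m
    perimeter = 2.0 * (width_m + depth_m)
    d_h = 4.0 * area / perimeter                 # hydraulic diameter
    return rho * velocity_m_s * d_h / mu

for gap_mm in (2.0, 6.0, 10.0):                  # candidate gap widths
    re = gap_reynolds(gap_mm / 1000.0, 0.3, 5.0)
    regime = "laminar" if re < 2300 else "transitional/turbulent"
    print(f"gap {gap_mm} mm: Re ~ {re:.0f} ({regime})")
```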
Epstein, Jonathan; Osborne, Richard H; Elsworth, Gerald R; Beaton, Dorcas E; Guillemin, Francis
2015-04-01
To assess the contribution of back-translation and expert committee to the content and psychometric properties of a translated multidimensional questionnaire. Recommendations for questionnaire translation include back-translation and expert committee, but their contribution to measurement properties is unknown. Four English-to-French translations of the Health Education Impact Questionnaire were generated with and without expert committee or back-translation. Face validity, acceptability, and structural properties were compared after random assignment to people with rheumatoid arthritis (N = 1,168), chronic renal failure (N = 2,368), and diabetes (N = 538). For face validity, 15 bilingual people compared the quality of the translations with the original. Psychometric properties were examined using confirmatory factor analysis (metric and scalar invariance) and item response theory. Qualitatively, there were five types of translation errors: style, intensity, frequency/time frame, breadth, and meaning. Bilingual assessors ranked the translations produced with expert committee best (P = 0.0026). All translations had good structural properties (root mean square error of approximation <0.05; comparative fit index [CFI] ≥0.899; and Tucker-Lewis index ≥0.889). Full measurement invariance was observed between translations (ΔCFI ≤ 0.01), with metric invariance between translations and the original (lowest ΔCFI = 0.022 between fully constrained models and models with free intercepts). Item characteristic curve analyses revealed no significant differences. This is the first experimental evidence that back-translation has moderate impact, whereas expert committee helps to ensure accurate content. Copyright © 2015 Elsevier Inc. All rights reserved.
D’Urso, Gianluca; Giardini, Claudio
2016-01-01
The present study was carried out to evaluate how the friction stir spot welding (FSSW) process parameters affect the temperature distribution in the welding region, the welding forces and the mechanical properties of the joints. The experimental study was performed by means of a CNC machine tool, obtaining FSSW lap joints on AA7050 aluminum alloy plates. Three thermocouples were inserted into the samples to measure the temperatures at different distances from the joint axis during the whole FSSW process. Experiments were repeated varying the process parameters, namely rotational speed, axial feed rate and plunging depth. Axial welding forces were measured during the tests using a piezoelectric load cell, while the mechanical properties of the joints were evaluated by performing shear tests on the specimens. The correlation found between process parameters and joint properties allowed the identification of the best technological window. The data collected during the experiments were also used to validate a simulation model of the FSSW process. The model was set up using a 2D approach for the simulation of a 3D problem, in order to guarantee a very simple and practical solution for achieving results in a very short time. A specific external routine for the calculation of the thermal energy due to the friction acting between pin and sheet was developed. An index for the prediction of the joint mechanical properties using the FEM simulations was finally presented and validated. PMID:28773810
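As a rough illustration of what such an external friction-heating routine computes (the paper's actual formulation is not reproduced here), the sketch below integrates the local frictional heating μ·p·ω·r over a circular pin face; friction coefficient, contact pressure and geometry are invented values.

```python
# Hedged sketch: total friction power for a rotating circular contact,
# q(r) = mu * p * omega * r integrated over annular face elements.
import numpy as np

mu = 0.4                    # friction coefficient (assumed)
p = 50e6                    # contact pressure [Pa] (assumed)
rpm = 2000.0                # tool rotational speed (assumed)
omega = 2 * np.pi * rpm / 60.0
R = 2.5e-3                  # pin radius [m] (assumed)

r = np.linspace(0.0, R, 1000)
q_total = np.trapz(mu * p * omega * r * 2 * np.pi * r, r)   # [W]
print(f"friction power ~ {q_total:.0f} W")
# Closed form for constant pressure: (2/3) * pi * mu * p * omega * R**3
print(f"closed form     ~ {2/3 * np.pi * mu * p * omega * R**3:.0f} W")
```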
Thermal performances of vertical hybrid PV/T air collector
NASA Astrophysics Data System (ADS)
Tabet, I.; Touafek, K.; Bellel, N.; Khelifa, A.
2016-11-01
In this work, numerical analyses and the experimental validation of the thermal behavior of a vertical photovoltaic thermal (PV/T) air collector are presented. The thermal model is developed using the energy balance equations of the PV/T air collector. Experimental tests are conducted to validate the mathematical model. The tests are performed in the southern Algerian region (Ghardaïa) under clear sky conditions. The prototype of the PV/T air collector is vertically erected and south oriented. The absorber upper plate temperature, glass cover temperature, air temperature at the inlet and outlet of the collector, ambient temperature, wind speed, and solar radiation are measured. The efficiency of the collector increases with the air mass flow rate, although higher flow rates reduce the temperature of the system. The efficiency of the PV/T air collector also increases with the number of fins added. In the experiments, the air temperature difference between the inlet and the outlet of the PV/T air collector reached 10 °C on November 21, 2014, during the interval between 10:00 and 14:00, and the temperature of the upper plate reached 45 °C at noon. The mathematical model describing the dynamic behavior of the PV/T air collector is evaluated by calculating the root mean square error and the mean absolute percentage error. A good agreement between the experimental and simulation results is obtained.
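A hedged sketch of the two error metrics named above, RMSE and MAPE; the measured and simulated temperature series are invented placeholders, not the Ghardaïa data.

```python
# Hedged sketch: the RMSE and MAPE used to score model-vs-experiment fit.
import numpy as np

measured  = np.array([31.0, 35.2, 40.1, 44.8, 42.3])   # e.g. plate T [°C]
simulated = np.array([30.2, 36.0, 39.5, 45.9, 41.1])

rmse = np.sqrt(np.mean((simulated - measured) ** 2))
mape = 100.0 * np.mean(np.abs((simulated - measured) / measured))
print(f"RMSE = {rmse:.2f} °C, MAPE = {mape:.1f} %")
```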
Development and validation of a weight-bearing finite element model for total knee replacement.
Woiczinski, M; Steinbrück, A; Weber, P; Müller, P E; Jansson, V; Schröder, Ch
2016-01-01
Total knee arthroplasty (TKA) is a successful procedure for osteoarthritis. However, some patients (19%) do have pain after surgery. A finite element model was developed based on the boundary conditions of a knee rig. A 3D model of an anatomical full leg was generated from magnetic resonance image data, and a total knee prosthesis was implanted without patella resurfacing. In the finite element model, a restarting procedure was programmed in order to hold the ground reaction force constant with an adapted quadriceps muscle force during a squat from 20° to 105° of flexion. Knee rig experimental data were used to validate the numerical model in the patellofemoral and femorotibial joints. Furthermore, sensitivity analyses of the Young's modulus of the patella cartilage, the posterior cruciate ligament (PCL) stiffness, and the patella tendon origin were performed. Pearson's correlations for retropatellar contact area, pressure, patella flexion, and femorotibial ap-movement were close to 1. The lowest root mean square errors for retropatellar pressure, patella flexion, and femorotibial ap-movement were found for the baseline model setup, with a Young's modulus of 5 MPa for the patella cartilage, a PCL stiffness downscaled to 25% of the literature value, and an anatomical origin of the patella tendon. The results of the finite element model are comparable with the experimental results. Therefore, the finite element model developed in this study can be used for further clinical investigations and will help to better understand the clinical aspects after TKA with an unresurfaced patella.
Functional validation and comparison framework for EIT lung imaging.
Grychtol, Bartłomiej; Elke, Gunnar; Meybohm, Patrick; Weiler, Norbert; Frerichs, Inéz; Adler, Andy
2014-01-01
Electrical impedance tomography (EIT) is an emerging clinical tool for monitoring ventilation distribution in mechanically ventilated patients, for which many image reconstruction algorithms have been suggested. We propose an experimental framework to assess such algorithms with respect to their ability to correctly represent well-defined physiological changes. We defined a set of clinically relevant ventilation conditions and induced them experimentally in 8 pigs by controlling three ventilator settings (tidal volume, positive end-expiratory pressure and the fraction of inspired oxygen). In this way, large and discrete shifts in global and regional lung air content were elicited. We use the framework to compare twelve 2D EIT reconstruction algorithms, including backprojection (the original and still most frequently used algorithm), GREIT (a more recent consensus algorithm for lung imaging), truncated singular value decomposition (TSVD), several variants of the one-step Gauss-Newton approach and two iterative algorithms. We consider the effects of using a 3D finite element model, assuming non-uniform background conductivity, noise modeling, reconstructing for electrode movement, total variation (TV) reconstruction, robust error norms, smoothing priors, and using difference vs. normalized difference data. Our results indicate that, while variation in the appearance of images reconstructed from the same data is not negligible, clinically relevant parameters do not vary considerably among the advanced algorithms. Among the analysed algorithms, several advanced algorithms perform well, while some others are significantly worse. Given its vintage and ad hoc formulation, backprojection works surprisingly well, supporting the validity of previous studies in lung EIT.
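One of the compared algorithms, TSVD, is easy to sketch in isolation: a linear difference-EIT reconstruction keeps only the k largest singular values of the sensitivity (Jacobian) matrix when inverting the voltage-difference data. The Jacobian and data below are random stand-ins; real ones come from a finite element forward model.

```python
# Hedged sketch: truncated-SVD reconstruction for a linear inverse problem,
# the generic form of the TSVD algorithm in the comparison above.
import numpy as np

rng = np.random.default_rng(4)
J = rng.normal(size=(208, 576))      # e.g. 208 measurements, 576 pixels
dv = rng.normal(size=208)            # difference voltage data

def tsvd_reconstruct(J, dv, k):
    U, s, Vt = np.linalg.svd(J, full_matrices=False)
    inv_s = np.zeros_like(s)
    inv_s[:k] = 1.0 / s[:k]          # truncate: drop small singular values
    return Vt.T @ (inv_s * (U.T @ dv))

image = tsvd_reconstruct(J, dv, k=50)    # conductivity-change image (flat)
print(image.shape)                        # -> (576,)
```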
Information Quality in Regulatory Decision Making: Peer Review versus Good Laboratory Practice.
McCarty, Lynn S; Borgert, Christopher J; Mihaich, Ellen M
2012-07-01
There is an ongoing discussion on the provenance of toxicity testing data regarding how best to ensure its validity and credibility. A central argument is whether journal peer-review procedures are superior to Good Laboratory Practice (GLP) standards employed for compliance with regulatory mandates. We sought to evaluate the rationale for regulatory decision making based on peer-review procedures versus GLP standards. We examined pertinent published literature regarding how scientific data quality and validity are evaluated for peer review, GLP compliance, and development of regulations. Some contend that peer review is a coherent, consistent evaluative procedure providing quality control for experimental data generation, analysis, and reporting sufficient to reliably establish relative merit, whereas GLP is seen as merely a tracking process designed to thwart investigator corruption. This view is not supported by published analyses pointing to subjectivity and variability in peer-review processes. Although GLP is not designed to establish relative merit, it is an internationally accepted quality assurance, quality control method for documenting experimental conduct and data. Neither process is completely sufficient for establishing relative scientific soundness. However, changes occurring both in peer-review processes and in regulatory guidance resulting in clearer, more transparent communication of scientific information point to an emerging convergence in ensuring information quality. The solution to determining relative merit lies in developing a well-documented, generally accepted weight-of-evidence scheme to evaluate both peer-reviewed and GLP information used in regulatory decision making where both merit and specific relevance inform the process.
Short-term airing by natural ventilation - modeling and control strategies.
Perino, M; Heiselberg, P
2009-10-01
The need to improve the energy efficiency of buildings requires new and more efficient ventilation systems. It has been demonstrated that innovative operating concepts that make use of natural ventilation tend to be better appreciated by occupants. This kind of system frequently integrates traditional mechanical ventilation components with natural ventilation devices, such as motorized windows and louvers. Among the various ventilation strategies currently available, buoyancy-driven single-sided natural ventilation has proved to be very effective and can provide high air change rates for temperature and IAQ control. However, to promote wider application of these systems, an improved understanding of their working principles and the availability of new design and simulation tools are necessary. In this context, the paper analyses and presents the results of research aimed at developing and validating numerical models for the analysis of buoyancy-driven single-sided natural ventilation systems. Once validated, these models can be used to optimize control strategies in order to achieve satisfactory indoor comfort conditions and IAQ. Practical implications: numerical and experimental analyses have proved that short-term airing by intermittent ventilation is an effective measure to satisfactorily control IAQ. Different control strategies have been investigated to optimize the capabilities of the systems. The proposed zonal model has shown good performance and could be adopted as a design tool, while CFD simulations can be profitably used for detailed studies of the pollutant concentration distribution in a room and to address local discomfort problems.
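As a rough illustration of the driving physics (not the paper's zonal model), a widely quoted first estimate for buoyancy-driven single-sided airflow through an opening is Warren's correlation; the sketch below assumes a discharge coefficient of 0.6 and invented window dimensions.

```python
# Hedged sketch: Warren-type estimate of single-sided, buoyancy-driven
# ventilation: Q = (Cd*A/3) * sqrt(g*H*dT/T_mean).
import numpy as np

def single_sided_flow(A, H, dT, T_mean=293.0, Cd=0.6, g=9.81):
    """Airflow rate Q [m^3/s] through an opening of area A and height H
    for an inside-outside temperature difference dT [K]."""
    return (Cd * A / 3.0) * np.sqrt(g * H * dT / T_mean)

# Example: 1.2 m^2 window of 1.0 m height, 5 K temperature difference
print(single_sided_flow(A=1.2, H=1.0, dT=5.0))   # ~0.1 m^3/s
```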
Selecting and Improving Quasi-Experimental Designs in Effectiveness and Implementation Research.
Handley, Margaret A; Lyles, Courtney R; McCulloch, Charles; Cattamanchi, Adithya
2018-04-01
Interventional researchers face many design challenges when assessing intervention implementation in real-world settings. Intervention implementation requires holding fast on internal validity needs while incorporating external validity considerations (such as uptake by diverse subpopulations, acceptability, cost, and sustainability). Quasi-experimental designs (QEDs) are increasingly employed to achieve a balance between internal and external validity. Although these designs are often referred to and summarized in terms of logistical benefits, there is still uncertainty about (a) selecting from among various QEDs and (b) developing strategies to strengthen the internal and external validity of QEDs. We focus here on commonly used QEDs (prepost designs with nonequivalent control groups, interrupted time series, and stepped-wedge designs) and discuss several variants that maximize internal and external validity at the design, execution and implementation, and analysis stages.
A Systematic Method for Verification and Validation of Gyrokinetic Microstability Codes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bravenec, Ronald
My original proposal for the period Feb. 15, 2014 through Feb. 14, 2017 called for an integrated validation and verification effort carried out by myself with collaborators. The validation component would require experimental profile and power-balance analysis. In addition, it would require running the gyrokinetic codes varying the input profiles within experimental uncertainties to seek agreement with experiment before discounting a code as invalidated. Therefore, validation would require a major increase of effort over my previous grant periods which covered only code verification (code benchmarking). Consequently, I had requested full-time funding. Instead, I am being funded at somewhat less than half time (5 calendar months per year). As a consequence, I decided to forego the validation component and to only continue the verification efforts.
TREAT Neutronics Analysis and Design Support, Part II: Multi-SERTTA-CAL
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bess, John D.; Woolstenhulme, Nicolas E.; Hill, Connie M.
2016-08-01
Experiment vehicle design is necessary in preparation for the Transient Reactor Test (TREAT) facility restart and the resumption of transient testing to support Accident Tolerant Fuel (ATF) characterization and other future fuels testing requirements. Currently the most mature vehicle design is the Multi-SERTTA (Static Environments Rodlet Transient Test Apparatuses), which can accommodate up to four concurrent rodlet-sized specimens under separate environmental conditions. Robust test vehicle design requires neutronics analyses to support design development, optimization of the power coupling factor (PCF) to efficiently maximize energy generation in the test fuel rodlets, and experiment safety analyses. An integral aspect of prior TREAT transient testing was the incorporation of calibration experiments to experimentally evaluate and validate test conditions in preparation for the actual fuel testing. The calibration experiment package established the test parameter conditions to support fine-tuning of the computational models to deliver the required energy deposition to the fuel samples. The calibration vehicle was designed to be as nearly neutronically equivalent to the experiment vehicle as possible to minimize errors between the calibration and final tests. The Multi-SERTTA-CAL vehicle was designed to serve as the calibration vehicle supporting Multi-SERTTA experimentation. Models of the Multi-SERTTA-CAL vehicle containing typical PWR-fuel rodlets were prepared and neutronics calculations were performed using MCNP6.1 with ENDF/B-VII.1 nuclear data libraries; these results were then compared against those performed for Multi-SERTTA to determine the similarity and possible design modifications necessary prior to construction of these experiment vehicles. The estimated reactivity insertion worth into the TREAT core is very similar between the two vehicle designs, with the primary physical difference being a hollow Inconel tube running down the length of the calibration vehicle. Calculations of PCF indicate that on average there is a reduction of approximately 6.3 and 12.6%, respectively, for PWR fuel rodlets irradiated under wet and dry conditions. Changes to the primary or secondary vessel structure in the calibration vehicle can be performed to offset this discrepancy and maintain neutronic equivalency. Current possible modifications to the calibration vehicle include reduction of the primary vessel wall thickness, swapping Zircaloy-4 for stainless steel 316 in the secondary containment, or slight modification of the temperature and pressure of the water environment within the primary vessel. Removal of some of the instrumentation within the calibration vehicle can also serve to slightly increase the PCF. Future efforts include further modification and optimization of the Multi-SERTTA and Multi-SERTTA-CAL designs in preparation for actual TREAT transient testing. Experimental results from both test vehicles will be compared against calculational results and methods to provide validation and support additional neutronics analyses.
Pérez-González, A; González-Lluch, C; Sancho-Bru, J L; Rodríguez-Cervantes, P J; Barjau-Escribano, A; Forner-Navarro, L
2012-03-01
The aim of this study was to analyse the strength and failure mode of teeth restored with fibre posts under retention and flexural-compressive loads at different stages of the restoration, and to analyse whether including a simulated ligament in the experimental setup has any effect on the strength or the failure mode. Thirty human maxillary central incisors were distributed into three groups, restored to simulate different restoration stages (1: only post; 2: post and core; 3: post, core and crown), using Rebilda fibre posts. The specimens were inserted in resin blocks and loaded by means of a universal testing machine until failure, under tension (stage 1) or 50° flexion (stages 2-3). Half the specimens in each group were restored using a simulated ligament between the root dentine and the resin block; the other half did not include this element. Failure in stage 1 always occurred at the post-dentine interface, with a mean failure load of 191.2 N. Failure in stage 2 was located mainly in the core or coronal dentine (mean failure load of 505.9 N). Failure in stage 3 was observed in the coronal dentine (mean failure load 397.4 N). The failure loads registered were greater than expected masticatory loads. Fracture modes were mostly reparable, thus indicating that this post is clinically valid at the different stages of restoration studied. The inclusion of the simulated ligament in the experimental system did not show a statistically significant effect on the failure load or the failure mode. © 2011 Blackwell Publishing Ltd.
NASA Astrophysics Data System (ADS)
Kim, Sungwon; Uprety, Bibhisha; Mathews, V. John; Adams, Daniel O.
2015-03-01
Structural Health Monitoring (SHM) based on Acoustic Emission (AE) is dependent on both the sensors to detect an impact event as well as an algorithm to determine the impact location. The propagation of Lamb waves produced by an impact event in thin composite structures is affected by several unique aspects including material anisotropy, ply orientations, and geometric discontinuities within the structure. The development of accurate numerical models of Lamb wave propagation has important benefits towards the development of AE-based SHM systems for impact location estimation. Currently, many impact location algorithms utilize the time of arrival or velocities of Lamb waves. Therefore the numerical prediction of characteristic wave velocities is of great interest. Additionally, the propagation of the initial symmetric (S0) and asymmetric (A0) wave modes is important, as these wave modes are used for time of arrival estimation. In this investigation, finite element analyses were performed to investigate aspects of Lamb wave propagation in composite plates with active signal excitation. A comparative evaluation of two three-dimensional modeling approaches was performed, with emphasis placed on the propagation and velocity of both the S0 and A0 wave modes. Results from numerical simulations are compared to experimental results obtained from active AE testing. Of particular interest is the directional dependence of Lamb waves in quasi-isotropic carbon/epoxy composite plates. Numerical and experimental results suggest that although a quasi-isotropic composite plate may have the same effective elastic modulus in all in-plane directions, the Lamb wave velocity may have some directional dependence. Further numerical analyses were performed to investigate Lamb wave propagation associated with circular cutouts in composite plates.
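The time-of-arrival quantities these wave-velocity studies feed into can be sketched directly: with an assumed direction-independent wave speed, the impact location follows from least squares on arrival-time differences between sensors. Sensor layout, wave speed and source position below are invented.

```python
# Hedged sketch: impact localization from arrival-time differences (TDOA)
# assuming a single, direction-independent Lamb wave group velocity.
import numpy as np
from scipy.optimize import least_squares

sensors = np.array([[0.0, 0.0], [0.5, 0.0], [0.5, 0.5], [0.0, 0.5]])  # m
c = 1500.0                       # assumed A0 group velocity [m/s]
src_true = np.array([0.33, 0.21])

# Synthetic arrival times, referenced to the first sensor
t = np.linalg.norm(sensors - src_true, axis=1) / c
dt_meas = t - t[0]

def residuals(xy):
    t_model = np.linalg.norm(sensors - xy, axis=1) / c
    return (t_model - t_model[0]) - dt_meas

print(least_squares(residuals, x0=[0.25, 0.25]).x)   # ~ [0.33, 0.21]
```

Note that the directional dependence of velocity reported above is exactly what breaks the single-speed assumption in this sketch; anisotropy-aware algorithms replace the scalar c with a direction-dependent velocity profile.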
Recent Advances in Simulation of Eddy Current Testing of Tubes and Experimental Validations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Reboud, C.; Premel, D.; Lesselier, D.
2007-03-21
Eddy current testing (ECT) is widely used in iron and steel industry for the inspection of tubes during manufacturing. A collaboration between CEA and the Vallourec Research Center led to the development of new numerical functionalities dedicated to the simulation of ECT of non-magnetic tubes by external probes. The achievement of experimental validations led us to the integration of these models into the CIVA platform. Modeling approach and validation results are discussed here. A new numerical scheme is also proposed in order to improve the accuracy of the model.
van der Sluis, Olaf; Vossen, Bart; Geers, Marc
2018-01-01
Metal-elastomer interfacial systems, often encountered in stretchable electronics, demonstrate remarkably high interface fracture toughness values. Evidently, a large gap exists between the rather small adhesion energy levels at the microscopic scale (‘intrinsic adhesion’) and the large measured macroscopic work-of-separation. This energy gap is closed here by unravelling the underlying dissipative mechanisms through a systematic numerical/experimental multi-scale approach. This self-contained contribution collects and reviews previously published results and addresses the remaining open questions by providing new and independent results obtained from an alternative experimental set-up. In particular, the experimental studies on Cu-PDMS (poly(dimethylsiloxane)) samples conclusively reveal the essential role of fibrillation mechanisms at the micrometre scale during the metal-elastomer delamination process. The micro-scale numerical analyses of single and multiple fibrils show that the dynamic release of the stored elastic energy by multiple fibril fracture, including the interaction with the adjacent deforming bulk PDMS and its highly nonlinear behaviour, provides a mechanistic understanding of the high work-of-separation. An experimentally validated quantitative relation between the macroscopic work-of-separation and the peel front height is established from the simulation results. Finally, it is shown that a micro-mechanically motivated shape of the traction-separation law in cohesive zone models is essential to describe the delamination process in fibrillating metal-elastomer systems in a physically meaningful way. PMID:29393908
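For orientation, the sketch below shows the generic bilinear traction-separation law, the standard cohesive-zone form whose shape the paper argues should instead be micro-mechanically motivated for fibrillating interfaces; all parameter values are invented.

```python
# Hedged sketch: bilinear cohesive traction-separation law. The area under
# the curve is the work-of-separation, 0.5 * t_max * delta_f.
import numpy as np

def bilinear_traction(delta, t_max=5.0, delta_0=0.01, delta_f=0.5):
    """Traction [MPa] vs opening [mm]: linear rise, then linear softening."""
    delta = np.asarray(delta, dtype=float)
    rise = t_max * delta / delta_0
    soften = t_max * (delta_f - delta) / (delta_f - delta_0)
    t = np.where(delta <= delta_0, rise, soften)
    return np.clip(t, 0.0, None)            # zero traction beyond delta_f

d = np.linspace(0.0, 0.6, 601)
print(np.trapz(bilinear_traction(d), d))    # ~1.25, i.e. 0.5 * 5.0 * 0.5
```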
Szymczynska, P; Walsh, S; Greenberg, L; Priebe, S
2017-07-01
Essential criteria for the methodological quality and validity of randomized controlled trials are the drop-out rates from both the experimental intervention and the study as a whole. This systematic review and meta-analysis assessed these drop-out rates in non-pharmacological schizophrenia trials. A systematic literature search was used to identify relevant trials with ≥100 sample size and to extract the drop-out data. The rates of drop-out from the experimental intervention and study were calculated with meta-analysis of proportions. Meta-regression was applied to explore the association between the study and sample characteristics and the drop-out rates. 43 RCTs were found, with drop-out from intervention ranging from 0% to 63% and study drop-out ranging from 4% to 71%. Meta-analyses of proportions showed an overall drop-out rate of 14% (95% CI: 13-15%) at the experimental intervention level and 20% (95% CI: 17-24%) at the study level. Meta-regression showed that the active intervention drop-out rates were predicted by the number of intervention sessions. In non-pharmacological schizophrenia trials, drop-out rates of less than 20% can be achieved for both the study and the experimental intervention. A high heterogeneity of drop-out rates across studies shows that even lower rates are achievable. Copyright © 2017 Elsevier Ltd. All rights reserved.
Initial Reliability and Validity of the Perceived Social Competence Scale
ERIC Educational Resources Information Center
Anderson-Butcher, Dawn; Iachini, Aidyn L.; Amorose, Anthony J.
2008-01-01
Objective: This study describes the development and validation of a perceived social competence scale that social workers can easily use to assess children's and youth's social competence. Method: Exploratory and confirmatory factor analyses were conducted on a calibration and a cross-validation sample of youth. Predictive validity was also…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Arcilesi, David J.; Ham, Tae Kyu; Kim, In Hun
2015-07-01
A critical event in the safety analysis of the very high-temperature gas-cooled reactor (VHTR) is an air-ingress accident. This accident is initiated, in its worst case scenario, by a double-ended guillotine break of the coaxial cross vessel, which leads to a rapid reactor vessel depressurization. In a VHTR, the reactor vessel is located within a reactor cavity that is filled with air during normal operating conditions. Following the vessel depressurization, the dominant mode of ingress of an air-helium mixture into the reactor vessel will either be molecular diffusion or density-driven stratified flow. The mode of ingress is hypothesized to depend largely on the break conditions of the cross vessel. Since the time scales of these two ingress phenomena differ by orders of magnitude, it is imperative to understand under which conditions each of these mechanisms will dominate in the air ingress process. Computer models have been developed to analyze this type of accident scenario. There are, however, limited experimental data available to understand the phenomenology of the air-ingress accident and to validate these models. Therefore, there is a need to design and construct a scaled-down experimental test facility to simulate the air-ingress accident scenarios and to collect experimental data. The current paper focuses on the analyses performed for the design and operation of a 1/8th geometric scale (by height and diameter), high-temperature test facility. A geometric scaling analysis for the VHTR, a time scale analysis of the air-ingress phenomenon, a transient depressurization analysis of the reactor vessel, a hydraulic similarity analysis of the test facility, a heat transfer characterization of the hot plenum, a power scaling analysis for the reactor system, and a design analysis of the containment vessel are discussed.
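A hedged order-of-magnitude sketch of why the two ingress modes above differ so strongly in time scale: molecular diffusion over the cross-vessel length versus a lock-exchange-type gravity current. Geometry and property values are illustrative only, not the VHTR's.

```python
# Hedged sketch: contrasting diffusive vs. density-driven ingress times.
import numpy as np

L = 2.0          # cross-vessel length scale [m] (assumed)
D = 6.0e-4       # air-helium binary diffusivity [m^2/s] (assumed, high T)
H = 1.0          # duct height [m] (assumed)
g = 9.81
drho_over_rho = 0.8   # large air/helium density contrast (assumed)

t_diffusion = L**2 / D                              # diffusive fill time
u_front = 0.5 * np.sqrt(g * drho_over_rho * H)      # lock-exchange estimate
t_stratified = L / u_front

print(f"diffusion  ~ {t_diffusion:8.0f} s")
print(f"stratified ~ {t_stratified:8.2f} s")        # orders of magnitude faster
```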
Gene Expression Analyses of Subchondral Bone in Early Experimental Osteoarthritis by Microarray
Chen, YuXian; Shen, Jun; Lu, HuaDing; Zeng, Chun; Ren, JianHua; Zeng, Hua; Li, ZhiFu; Chen, ShaoMing; Cai, DaoZhang; Zhao, Qing
2012-01-01
Osteoarthritis (OA) is a degenerative joint disease that affects both cartilage and bone. A better understanding of the early molecular changes in subchondral bone may help elucidate the pathogenesis of OA. We used microarray technology to investigate the time course of molecular changes in the subchondral bone in the early stages of experimental osteoarthritis in a rat model. We identified 2,234 differentially expressed (DE) genes at 1 week, 1,944 at 2 weeks and 1,517 at 4 weeks post-surgery. Further analyses of the dysregulated genes indicated that the events underlying subchondral bone remodeling occurred sequentially and in a time-dependent manner at the gene expression level. Some of the identified dysregulated genes have known or suspected roles in bone development or remodeling; these genes include Alp, Igf1, Tgfβ1, Postn, Mmp3, Tnfsf11, Acp5, Bmp5, Aspn and Ihh. The differences in the expression of these genes were confirmed by real-time PCR, and the results indicated that our microarray data accurately reflected gene expression patterns characteristic of early OA. To validate the results of our microarray analysis at the protein level, immunohistochemistry staining was used to investigate the expression of Mmp3 and Aspn protein in tissue sections. These analyses indicated that Mmp3 protein expression completely matched the results of both the microarray and real-time PCR analyses; however, Aspn protein expression was not observed to differ at any time point. In summary, our study demonstrated a simple method for separating subchondral bone samples from the rat knee joint, which effectively avoids bone RNA degradation. These findings also revealed the gene expression profiles of subchondral bone in the rat OA model at multiple time points post-surgery and identified important DE genes with known or suspected roles in bone development or remodeling. These genes may be novel diagnostic markers or therapeutic targets for OA. PMID:22384228
Experimental validation of the DARWIN2.3 package for fuel cycle applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
San-Felice, L.; Eschbach, R.; Bourdot, P.
2012-07-01
The DARWIN package, developed by the CEA and its French partners (AREVA and EDF), provides the required parameters for fuel cycle applications: fuel inventory, decay heat, activity, neutron, {gamma}, {alpha} and {beta} sources and spectra, and radiotoxicity. This paper presents the DARWIN2.3 experimental validation for fuel inventory and decay heat calculations on Pressurized Water Reactors (PWR). In order to validate this code system for spent fuel inventory, a large programme has been undertaken, based on spent fuel chemical assays. This paper deals with the experimental validation of DARWIN2.3 for the PWR Uranium Oxide (UOX) and Mixed Oxide (MOX) fuel inventory calculation, focused on the isotopes involved in Burn-Up Credit (BUC) applications and decay heat computations. The calculation-to-experiment (C/E-1) discrepancies are calculated with the latest European evaluation file JEFF-3.1.1 associated with the SHEM energy mesh. An overview of the tendencies is obtained over a complete burn-up range from 10 to 85 GWd/t (10 to 60 GWd/t for MOX fuel). The experimental validation of the DARWIN2.3 package for decay heat calculation is performed using calorimetric measurements carried out at the Swedish Interim Spent Fuel Storage Facility for PWR assemblies, covering a large burn-up (20 to 50 GWd/t) and cooling time range (10 to 30 years). (authors)
Viability of Cross-Flow Fan with Helical Blades for Vertical Take-off and Landing Aircraft
2012-09-01
Using the computational fluid dynamics (CFD) software ANSYS CFX, a three-dimensional (3-D) straight-bladed model was validated against a previous study's experimental results.
2011-09-01
For a quality evaluation with limited data, a model-based assessment must be used. Topics include the factors that affect system performance, a multistage approach to system validation, and a modeling and experimental methodology for efficiently addressing a wide range…
NASA Astrophysics Data System (ADS)
Rizzo, Axel; Vaglio-Gaudard, Claire; Martin, Julie-Fiona; Noguère, Gilles; Eschbach, Romain
2017-09-01
DARWIN2.3 is the reference package used for fuel cycle applications in France. It solves the Boltzmann and Bateman equations in a coupled manner, with the European JEFF-3.1.1 nuclear data library, to compute the fuel cycle values of interest. It includes the deterministic transport codes APOLLO2 (for light water reactors) and ERANOS2 (for fast reactors), and the DARWIN/PEPIN2 depletion code, each developed by CEA/DEN with the support of its industrial partners. The DARWIN2.3 package has been experimentally validated for pressurized and boiling water reactors, as well as for sodium fast reactors; this experimental validation relies on the analysis of post-irradiation experiments (PIE). The DARWIN2.3 experimental validation work points out some isotopes for which the calculation of depleted concentrations can be improved. Other nuclides have no available experimental validation, and the uncertainty of their calculated concentrations is provided by propagating a priori nuclear data uncertainties. This paper describes the work plan of studies initiated this year to improve the accuracy of the DARWIN2.3 depleted material balance calculation for some nuclides of interest for the fuel cycle.
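Since the Bateman equations for a depletion chain are linear ODEs of the form dN/dt = AN, their solution structure can be sketched with a matrix exponential on a toy two-nuclide chain; the decay constants below are arbitrary assumptions, and this is only an illustration of the linear structure, not the DARWIN/PEPIN2 solver.

```python
# Sketch: solving the Bateman equations dN/dt = A N for a toy decay chain
# 1 -> 2 -> (stable) with a matrix exponential. Decay constants are arbitrary.
import numpy as np
from scipy.linalg import expm

lam1, lam2 = 1e-3, 5e-4          # decay constants [1/s], illustrative
A = np.array([[-lam1, 0.0],
              [ lam1, -lam2]])   # decay/depletion matrix
N0 = np.array([1.0, 0.0])        # initial concentrations

for t in (0.0, 1e3, 1e4):
    N = expm(A * t) @ N0
    print(f"t = {t:8.0f} s  N1 = {N[0]:.4f}  N2 = {N[1]:.4f}")
```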
ERIC Educational Resources Information Center
Brückner, Sebastian; Pellegrino, James W.
2016-01-01
The Standards for Educational and Psychological Testing indicate that validation of assessments should include analyses of participants' response processes. However, such analyses typically are conducted only to supplement quantitative field studies with qualitative data, and seldom are such data connected to quantitative data on student or item…
ERIC Educational Resources Information Center
James, Chris; James, Jane; Potter, Ian
2017-01-01
An adult ego development (AED) perspective accepts that the way adults interpret and interact in the social world can change during their life-span. This article seeks to analyse the validity and potential of AED for enhancing understandings of educational leadership practice and development. We analysed the AED literature and interviewed 16…
Burns, Gully A.P.C.; Turner, Jessica A.
2015-01-01
Neuroimaging data is raw material for cognitive neuroscience experiments, leading to scientific knowledge about human neurological and psychological disease, language, perception, attention and ultimately, cognition. The structure of the variables used in the experimental design defines the structure of the data gathered in the experiments; this in turn structures the interpretative assertions that may be presented as experimental conclusions. Representing these assertions and the experimental data which support them in a computable way means that they could be used in logical reasoning environments, i.e. for automated meta-analyses, or linking hypotheses and results across different levels of neuroscientific experiments. Therefore, a crucial first step in being able to represent neuroimaging results in a clear, computable way is to develop representations for the scientific variables involved in neuroimaging experiments. These representations should be expressive, computable, valid, extensible, and easy-to-use. They should also leverage existing semantic standards to interoperate easily with other systems. We present an ontology design pattern called the Ontology of Experimental Variables and Values (OoEVV). This is designed to provide a lightweight framework to capture mathematical properties of data, with appropriate ‘hooks’ to permit linkage to other ontology-driven projects (such as the Ontology of Biomedical Investigations, OBI). We instantiate the OoEVV system with a small number of functional Magnetic Resonance Imaging datasets, to demonstrate the system’s ability to describe the variables of a neuroimaging experiment. OoEVV is designed to be compatible with the XCEDE neuroimaging data standard for data collection terminology, and with the Cognitive Paradigm Ontology (CogPO) for specific reasoning elements of neuroimaging experimental designs. PMID:23684873
2010-01-01
Background The finite volume solver Fluent (Lebanon, NH, USA) is a computational fluid dynamics software package employed to analyse biological mass transport in the vasculature. A principal consideration in computational modelling of blood-side mass transport is the selection of the convection-diffusion discretisation scheme. Because numerous discretisation schemes are available when developing a mass-transport numerical model, the results obtained should be validated against either benchmark theoretical solutions or experimentally obtained results. Methods An idealised aneurysm model was selected for the experimental and computational mass-transport analysis of species concentration because of its well-defined recirculation region within the aneurysmal sac, allowing species concentration to vary slowly with time. The experimental results were obtained from fluid samples extracted from a glass aneurysm model, using the direct spectrophotometric concentration measurement technique. The computational analysis was conducted using the four convection-diffusion discretisation schemes available to the Fluent user: the First-Order Upwind, the Power Law, the Second-Order Upwind and the Quadratic Upstream Interpolation for Convective Kinetics (QUICK) schemes. The fluid has a diffusivity of 3.125 × 10⁻¹⁰ m²/s in water, resulting in a Peclet number of 2,560,000, indicating strongly convection-dominated flow. Results The discretisation scheme applied to the solution of the convection-diffusion equation for blood-side mass transport within the vasculature has a significant influence on the resultant species concentration field. The First-Order Upwind and the Power Law schemes produce similar results. The Second-Order Upwind and QUICK schemes also correlate well but differ considerably from the concentration contour plots of the First-Order Upwind and Power Law schemes. The computational results were then compared to the experimental findings. Average errors of 140% and 116% were found between the experimental results and those obtained from the First-Order Upwind and Power Law schemes, respectively. However, both the Second-Order Upwind and QUICK schemes accurately predict species concentration under high-Peclet-number, convection-dominated flow conditions. Conclusion Convection-diffusion discretisation scheme selection has a strong influence on resultant species concentration fields, as determined by CFD. Either the Second-Order Upwind or QUICK discretisation scheme should be implemented when numerically modelling convection-dominated mass-transport conditions, and care should be taken not to trade accuracy in the resultant species concentration for computationally inexpensive discretisation schemes. PMID:20642816
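To make the scheme sensitivity concrete, here is a minimal sketch of the same effect on a steady 1D convection-diffusion problem, comparing first-order upwind against central (second-order) differencing and the exact solution; the parameters are illustrative assumptions, not those of the aneurysm model, and central differencing stands in generically for a higher-order scheme.

```python
# Sketch: effect of convection-diffusion discretisation at high Peclet number.
# Steady 1D problem u*dphi/dx = D*d2phi/dx2, phi(0)=0, phi(L)=1. With the
# chosen grid the cell Peclet number u*h/D = 2.5 > 2, so central differencing
# oscillates while first-order upwind is smooth but numerically diffusive.
import numpy as np

L, u, D, n = 1.0, 1.0, 0.02, 21           # global Peclet number u*L/D = 50
x = np.linspace(0.0, L, n)
h = x[1] - x[0]

def solve(scheme):
    A = np.zeros((n, n)); b = np.zeros(n)
    A[0, 0] = A[-1, -1] = 1.0; b[-1] = 1.0    # Dirichlet boundary conditions
    for i in range(1, n - 1):
        if scheme == "central":
            A[i, i-1] = -D/h**2 - u/(2*h)
            A[i, i]   =  2*D/h**2
            A[i, i+1] = -D/h**2 + u/(2*h)
        else:                                  # first-order upwind (u > 0)
            A[i, i-1] = -D/h**2 - u/h
            A[i, i]   =  2*D/h**2 + u/h
            A[i, i+1] = -D/h**2
    return np.linalg.solve(A, b)

exact = (np.exp(u * x / D) - 1.0) / (np.exp(u * L / D) - 1.0)
for s in ("upwind", "central"):
    print(s, "max error:", np.abs(solve(s) - exact).max())
```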
Ràfols, Clara; Bosch, Elisabeth; Barbas, Rafael; Prohens, Rafel
2016-07-01
A study of the suitability of the chelation reaction of Ca(2+) with ethylenediaminetetraacetic acid (EDTA) as a validation standard for isothermal titration calorimeter measurements has been performed, exploring the common experimental variables (buffer, pH, ionic strength and temperature). Results obtained under a variety of experimental conditions have been corrected for the side reactions involved in the main process and for the experimental ionic strength and, finally, validated by comparison with the potentiometric reference values. It is demonstrated that the chelation reaction performed in 0.1 M acetate buffer at 25°C gives accurate and precise results and is robust enough to be adopted as a standard calibration process. Copyright © 2016 Elsevier B.V. All rights reserved.
Experimental validation of an ultrasonic flowmeter for unsteady flows
NASA Astrophysics Data System (ADS)
Leontidis, V.; Cuvier, C.; Caignaert, G.; Dupont, P.; Roussette, O.; Fammery, S.; Nivet, P.; Dazin, A.
2018-04-01
An ultrasonic flowmeter was developed for further applications in cryogenic conditions and for measuring flow rate fluctuations in the range of 0 to 70 Hz. The prototype was installed in a flow test rig and validated experimentally under both steady and unsteady water flow conditions. A Coriolis flowmeter was used for calibration under steady-state conditions, whereas in the unsteady case the validation was done simultaneously against two methods: particle image velocimetry (PIV), and pressure transducers installed flush on the wall of the pipe. The results show that the developed flowmeter and the proposed methodology can accurately measure the frequency and amplitude of unsteady fluctuations over the experimental range of 0-9 l s⁻¹ mean main flow rate and 0-70 Hz imposed disturbances.
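A minimal sketch of the kind of post-processing implied here: estimating the frequency and amplitude of an imposed flow-rate fluctuation from a sampled signal via an FFT. The sampling rate and the signal itself are synthetic assumptions, not data from the flow test rig.

```python
# Sketch: dominant frequency and amplitude of a flow-rate fluctuation.
# The signal below is synthetic: mean 5 l/s with a 0.4 l/s ripple at 20 Hz.
import numpy as np

fs = 1000.0                                   # sampling rate [Hz], assumed
t = np.arange(0.0, 2.0, 1.0 / fs)
q = 5.0 + 0.4 * np.sin(2 * np.pi * 20.0 * t)  # flow rate [l/s]

spec = np.fft.rfft(q - q.mean())              # remove mean, single-sided FFT
freqs = np.fft.rfftfreq(len(q), 1.0 / fs)
k = np.argmax(np.abs(spec))
amplitude = 2.0 * np.abs(spec[k]) / len(q)    # single-sided amplitude
print(f"dominant fluctuation: {freqs[k]:.1f} Hz, amplitude {amplitude:.2f} l/s")
```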
Flight Research and Validation Formerly Experimental Capabilities Supersonic Project
NASA Technical Reports Server (NTRS)
Banks, Daniel
2009-01-01
This slide presentation reviews the work of the Experimental Capabilities Supersonic project, which is being reorganized into Flight Research and Validation. The work of the Experimental Capabilities project in FY '09 is reviewed, and the specific centers assigned to the work are identified. The portfolio of the newly formed Flight Research and Validation (FRV) group is also reviewed. The various FY '10 projects for the FRV group are detailed. These projects include: Eagle Probe, Channeled Centerbody Inlet Experiment (CCIE), Supersonic Boundary Layer Transition test (SBLT), Aero-elastic Test Wing-2 (ATW-2), G-V External Vision Systems (G5 XVS), Air-to-Air Schlieren (A2A), In-Flight Background Oriented Schlieren (BOS), Dynamic Inertia Measurement technique (DIM), and Advanced In-Flight IR Thermography (AIR-T).
Issues and approach to develop validated analysis tools for hypersonic flows: One perspective
NASA Technical Reports Server (NTRS)
Deiwert, George S.
1993-01-01
Critical issues concerning the modeling of low-density hypervelocity flows where thermochemical nonequilibrium effects are pronounced are discussed. Emphasis is on the development of validated analysis tools, and the activity in the NASA Ames Research Center's Aerothermodynamics Branch is described. Inherent in the process is a strong synergism between ground test and real-gas computational fluid dynamics (CFD). Approaches to develop and/or enhance phenomenological models and incorporate them into computational flowfield simulation codes are discussed. These models were partially validated with experimental data for flows where the gas temperature is raised (compressive flows). Expanding flows, where temperatures drop, however, exhibit somewhat different behavior. Experimental data for these expanding flow conditions are sparse, and reliance must be placed on intuition and guidance from computational chemistry to model transport processes under these conditions. Ground-based experimental studies used to provide the necessary data for model development and validation are described. Included are the performance characteristics of high-enthalpy flow facilities, such as shock tubes and ballistic ranges.
NASA Technical Reports Server (NTRS)
Baumeister, Joseph F.
1994-01-01
A non-flowing, electrically heated test rig was developed to verify computer codes that calculate radiant energy propagation from nozzle geometries that represent aircraft propulsion nozzle systems. Since there are a variety of analysis tools used to evaluate thermal radiation propagation from partially enclosed nozzle surfaces, an experimental benchmark test case was developed for code comparison. This paper briefly describes the nozzle test rig and the developed analytical nozzle geometry used to compare the experimental and predicted thermal radiation results. A major objective of this effort was to make available the experimental results and the analytical model in a format to facilitate conversion to existing computer code formats. For code validation purposes this nozzle geometry represents one validation case for one set of analysis conditions. Since each computer code has advantages and disadvantages based on scope, requirements, and desired accuracy, the usefulness of this single nozzle baseline validation case can be limited for some code comparisons.
Issues and approach to develop validated analysis tools for hypersonic flows: One perspective
NASA Technical Reports Server (NTRS)
Deiwert, George S.
1992-01-01
Critical issues concerning the modeling of low-density hypervelocity flows where thermochemical nonequilibrium effects are pronounced are discussed. Emphasis is on the development of validated analysis tools. A description of the activity in the Ames Research Center's Aerothermodynamics Branch is also given. Inherent in the process is a strong synergism between ground test and real-gas computational fluid dynamics (CFD). Approaches to develop and/or enhance phenomenological models and incorporate them into computational flow-field simulation codes are discussed. These models have been partially validated with experimental data for flows where the gas temperature is raised (compressive flows). Expanding flows, where temperatures drop, however, exhibit somewhat different behavior. Experimental data for these expanding flow conditions are sparse; reliance must be made on intuition and guidance from computational chemistry to model transport processes under these conditions. Ground-based experimental studies used to provide necessary data for model development and validation are described. Included are the performance characteristics of high-enthalpy flow facilities, such as shock tubes and ballistic ranges.
NASA Astrophysics Data System (ADS)
Mashayekhi, Somayeh; Miles, Paul; Hussaini, M. Yousuff; Oates, William S.
2018-02-01
In this paper, fractional and non-fractional viscoelastic models for elastomeric materials are derived and analyzed in comparison to experimental results. The viscoelastic models are derived by expanding thermodynamic balance equations for both fractal and non-fractal media. The order of the fractional time derivative is shown to strongly affect the accuracy of the viscoelastic constitutive predictions. Model validation uses experimental data describing viscoelasticity of the dielectric elastomer Very High Bond (VHB) 4910. Since these materials are known for their broad applications in smart structures, it is important to characterize and accurately predict their behavior across a large range of time scales. Whereas integer order viscoelastic models can yield reasonable agreement with data, the model parameters often lack robustness in prediction at different deformation rates. Alternatively, fractional order models of viscoelasticity provide an alternative framework to more accurately quantify complex rate-dependent behavior. Prior research that has considered fractional order viscoelasticity lacks experimental validation and contains limited links between viscoelastic theory and fractional order derivatives. To address these issues, we use fractional order operators to experimentally validate fractional and non-fractional viscoelastic models in elastomeric solids using Bayesian uncertainty quantification. The fractional order model is found to be advantageous as predictions are significantly more accurate than integer order viscoelastic models for deformation rates spanning four orders of magnitude.
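As a concrete illustration of the fractional framework, here is a minimal sketch of the simplest fractional viscoelastic element, the Scott-Blair "spring-pot" σ(t) = E τ^α D^α ε(t), with the fractional derivative approximated by Grünwald-Letnikov weights; the parameters are arbitrary assumptions, and this is not the paper's VHB 4910 model.

```python
# Sketch: stress response of a Scott-Blair spring-pot element,
#   sigma(t) = E * tau**alpha * D^alpha eps(t),
# using the Grünwald-Letnikov approximation
#   D^alpha f(t_i) ~ h**(-alpha) * sum_k w_k * f(t_{i-k}).
import numpy as np

alpha, E, tau = 0.5, 1.0e6, 1.0      # fractional order, modulus [Pa], time const [s]
dt, n = 1e-3, 2000
t = np.arange(n) * dt
eps = 0.1 * t                         # constant-rate strain ramp (illustrative)

# GL weights: w0 = 1, wk = w_{k-1} * (1 - (alpha + 1) / k)
w = np.ones(n)
for k in range(1, n):
    w[k] = w[k - 1] * (1.0 - (alpha + 1.0) / k)

d_alpha = np.array([np.dot(w[:i + 1], eps[i::-1]) for i in range(n)]) / dt**alpha
sigma = E * tau**alpha * d_alpha
print(f"stress at t = {t[-1]:.2f} s: {sigma[-1]:.3e} Pa")
```

For the ramp ε = at, the exact half-order derivative is 2a√(t/π), which the discrete result approaches as dt shrinks; changing alpha between 0 (elastic limit) and 1 (viscous limit) interpolates the rate dependence.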
Hernansaiz-Garrido, Helena; Alonso-Tapia, Jesús
2017-01-01
Internalized stigma and disclosure concerns are key elements for the study of mental health in people living with HIV. Since no measures of these constructs were available for Spanish population, this study sought to develop such instruments, to analyze their reliability and validity and to provide a short version. A heterogeneous sample of 458 adults from different Spanish-speaking countries completed the HIV-Internalized Stigma Scale and the HIV-Disclosure Concerns Scale, along with the Hospital Anxiety and Depression Scale, Rosenberg's Self-esteem Scale and other socio-demographic variables. Reliability and correlation analyses, exploratory factor analyses, path analyses with latent variables, and ANOVAs were conducted to test the scales' psychometric properties. The scales showed good reliability in terms of internal consistency and temporal stability, as well as good sensitivity and factorial and criterion validity. The HIV-Internalized Stigma Scale and the HIV-Disclosure Concerns Scale are reliable and valid means to assess these variables in several contexts.
Lyon, Aaron R; Pullmann, Michael D; Dorsey, Shannon; Martin, Prerna; Grigore, Alexandra A; Becker, Emily M; Jensen-Doss, Amanda
2018-05-11
Measurement-based care (MBC) is an increasingly popular, evidence-based practice, but there are no tools with established psychometrics to evaluate clinician use of MBC practices in mental health service delivery. The current study evaluated the reliability, validity, and factor structure of scores generated from a brief, standardized tool to measure MBC practices, the Current Assessment Practice Evaluation-Revised (CAPER). Survey data from a national sample of 479 mental health clinicians were used to conduct exploratory and confirmatory factor analyses, as well as reliability and validity analyses (e.g., relationships between CAPER subscales and clinician MBC attitudes). Analyses revealed competing two- and three-factor models. Regardless of the model used, scores from CAPER subscales demonstrated good reliability and convergent and divergent validity with MBC attitudes in the expected directions. The CAPER appears to be a psychometrically sound tool for assessing clinician MBC practices. Future directions for development and application of the tool are discussed.
NASA Technical Reports Server (NTRS)
Magee, Todd E.; Fugal, Spencer R.; Fink, Lawrence E.; Adamson, Eric E.; Shaw, Stephen G.
2015-01-01
This report describes the work conducted under NASA funding for the Boeing N+2 Supersonic Experimental Validation project to experimentally validate the conceptual design of a supersonic airliner feasible for entry into service in the 2018-to-2020 timeframe (NASA N+2 generation). The primary goal of the project was to develop a low-boom configuration optimized for a minimum sonic boom signature (65 to 70 PLdB). This was a very aggressive goal that could be achieved only through integrated multidisciplinary optimization tools validated in relevant ground and, later, flight environments. The project was split into two phases. Phase I of the project covered the detailed aerodynamic design of a low-boom airliner as well as the wind tunnel tests to validate that design (ref. 1). This report covers Phase II of the project, which continued the design methodology development of Phase I with a focus on the propulsion integration aspects as well as the testing involved to validate those designs. One of the major airplane configuration features of the Boeing N+2 low-boom design was the overwing nacelle. The location of the nacelle allowed for a minimal effect on the boom signature; however, it added a level of difficulty to designing an inlet with acceptable performance in the overwing flow field. Using the Phase I work as the starting point, the goals of the Phase II project were to design and verify inlet performance while maintaining a low-boom signature. The Phase II project was successful in meeting all contract objectives. New modular nacelles were built for the larger Performance Model along with a propulsion rig with an electrically actuated mass flow plug. Two new mounting struts were built for the smaller Boom Model, along with new nacelles. Propulsion integration testing was performed using an instrumented fan face and a mass flow plug, while boom signatures were measured using a wall-mounted pressure rail. A side study of testing in different wind tunnels was completed as a precursor to the selection of the facilities used for validation testing. As facility schedules allowed, the propulsion testing was done in the NASA Glenn Research Center (GRC) 8 x 6-Foot wind tunnel, while boom and force testing was done in the NASA Ames Research Center (ARC) 9 x 7-Foot wind tunnel. During boom testing, a live balance was used for gathering force data. This report is broken down into nine sections. The first technical section (Section 2) covers the general scope of the Phase II activities, goals, a description of the design and testing efforts, and the project plan and schedule. Section 3 covers the details of the propulsion system concepts and design evolution. A series of short tests to evaluate the suitability of different wind tunnels for boom, propulsion, and force testing was also performed under the Phase II effort, with the results covered in Section 4. The propulsion integration testing is covered in Section 5 and the boom and force testing in Section 6. CFD comparisons and analyses are included in Section 7. Section 8 includes the conclusions and lessons learned.
Dahlke, Jeffrey A; Kostal, Jack W; Sackett, Paul R; Kuncel, Nathan R
2018-05-03
We explore potential explanations for validity degradation using a unique predictive validation data set containing up to four consecutive years of high school students' cognitive test scores and four complete years of those students' college grades. This data set permits analyses that disentangle the effects of predictor-score age and timing of criterion measurements on validity degradation. We investigate the extent to which validity degradation is explained by criterion dynamism versus the limited shelf-life of ability scores. We also explore whether validity degradation is attributable to fluctuations in criterion variability over time and/or GPA contamination from individual differences in course-taking patterns. Analyses of multiyear predictor data suggest that changes to the determinants of performance over time have much stronger effects on validity degradation than does the shelf-life of cognitive test scores. The age of predictor scores had only a modest relationship with criterion-related validity when the criterion measurement occasion was held constant. Practical implications and recommendations for future research are discussed. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
Hishinuma, Yuri; Horiuchi, Shigeko; Yanai, Haruo
2016-06-01
Midwives are always involved in educational activities whenever novice midwives are present. Although various scales for measuring the educational competencies of nurses have been developed in previous studies, a scale for the educational competencies particular to midwives has yet to be developed, and no previous studies have examined midwives' functions as clinical educators. The purpose of this study was to develop a scale to measure the mentoring competencies of clinical midwives (MCCM Scale) and to confirm its validity and reliability. An exploratory quantitative research study. Questionnaires were distributed to 1,645 midwives at 148 facilities who had previously instructed novice midwives. 1,004 midwives (61.0%) voluntarily returned valid responses and 296 (18.0%) voluntarily agreed to participate in the survey for test-retest reliability. Exploratory factor analyses were performed on 41 items, and the following seven factors were extracted, with a reliability coefficient (Cronbach's α) of 0.953: (i) supporting experimental study, (ii) personal characteristics particular to clinical educators, (iii) thoughtfulness and empathy for new midwives, (iv) self-awareness and self-reflection for finding confidence, (v) making effective use of the new midwives' own experience, (vi) commitment to educational activities, and (vii) sharing their midwifery practice. Test-retest reliability was measured based on a convenience sample of 246 (83.1%). Pearson's test-retest correlation coefficient for the entire scale was r=0.863. The factor loadings of each item on its respective factor were 0.313-0.925. The total score of the MCCM Scale was positively correlated with that of the Quality of Nurses' Occupational Experience Scale (r=0.641, p=0.000) and negatively correlated with the total score of the Japanese Burnout Scale (r=-0.480, p=0.000). The MCCM Scale is composed of 41 items and three subscales measured from a total of seven factors. The validity and reliability of the MCCM Scale were supported by the statistical analyses. Copyright © 2016 Elsevier Ltd. All rights reserved.
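For reference, a minimal sketch of the internal-consistency statistic reported above (Cronbach's α); the response matrix is random illustration data, so the resulting α will be near zero rather than the scale's 0.953.

```python
# Sketch: Cronbach's alpha, alpha = k/(k-1) * (1 - sum(item variances)/var(total)).
# Rows are respondents, columns are scale items; data here are random placeholders.
import numpy as np

rng = np.random.default_rng(0)
scores = rng.integers(1, 6, size=(100, 41)).astype(float)  # 100 respondents x 41 items

def cronbach_alpha(x):
    k = x.shape[1]
    item_var = x.var(axis=0, ddof=1).sum()
    total_var = x.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1.0 - item_var / total_var)

print(f"alpha = {cronbach_alpha(scores):.3f}")
```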
76 FR 39343 - Reducing Regulatory Burden; Retrospective Review Under E.O. 13563
Federal Register 2010, 2011, 2012, 2013, 2014
2011-07-06
... on ED regulations. c. How, if at all, will the agency incorporate experimental designs into retrospective analyses? Although ED will not be incorporating experimental designs into its analyses, its... evaluations, including those that use experimental designs, reveal about the efficacy of the regulations and...
Computer Simulations of Coronary Blood Flow Through a Constriction
2014-03-01
interventional procedures (e.g., stent deployment). Building off previous models that have been partially validated with experimental data, this thesis continues to develop the... the artery and increase blood flow. Generally a stent, or a mesh wire tube, is permanently inserted in order to scaffold open the artery wall
Perceptions vs Reality: A Longitudinal Experiment in Influenced Judgement Performance
2003-03-25
validity were manifested equally between treatment and control groups, thereby lending further validity to the experimental research design. External... Stanley (1975) identify this as a True Experimental Design: Pretest-Posttest Control Group Design. However, due to the longitudinal aspect required to... (1975:43). Nonequivalence will be ruled out as pretest equivalence is shown between treatment and control groups (1975:47). For quasi
Assessing the stability of human locomotion: a review of current measures
Bruijn, S. M.; Meijer, O. G.; Beek, P. J.; van Dieën, J. H.
2013-01-01
Falling poses a major threat to the steadily growing population of the elderly in modern-day society. A major challenge in the prevention of falls is the identification of individuals who are at risk of falling owing to an unstable gait. At present, several methods are available for estimating gait stability, each with its own advantages and disadvantages. In this paper, we review the currently available measures: the maximum Lyapunov exponent (λS and λL), the maximum Floquet multiplier, variability measures, long-range correlations, extrapolated centre of mass, stabilizing and destabilizing forces, foot placement estimator, gait sensitivity norm and maximum allowable perturbation. We explain what these measures represent and how they are calculated, and we assess their validity, divided up into construct validity, predictive validity in simple models, convergent validity in experimental studies, and predictive validity in observational studies. We conclude that (i) the validity of variability measures and λS is best supported across all levels, (ii) the maximum Floquet multiplier and λL have good construct validity, but negative predictive validity in models, negative convergent validity and (for λL) negative predictive validity in observational studies, (iii) long-range correlations lack construct validity and predictive validity in models and have negative convergent validity, and (iv) measures derived from perturbation experiments have good construct validity, but data are lacking on convergent validity in experimental studies and predictive validity in observational studies. In closing, directions for future research on dynamic gait stability are discussed. PMID:23516062
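As an illustration of one of the reviewed measures, here is a minimal sketch of a Rosenstein-style estimate of the maximum Lyapunov exponent (delay embedding, nearest neighbours, mean log divergence); the embedding parameters and the test signal are illustrative assumptions, not recommendations from the review.

```python
# Sketch: Rosenstein-style maximum Lyapunov exponent estimate for a 1D signal.
import numpy as np
from scipy.spatial.distance import cdist

def max_lyapunov(x, dim=5, tau=10, horizon=100, min_sep=50):
    m = len(x) - (dim - 1) * tau                      # number of embedded points
    emb = np.column_stack([x[i * tau:i * tau + m] for i in range(dim)])
    n = m - horizon                                   # points we can evolve forward
    d = cdist(emb[:n], emb[:n])
    for i in range(n):                                # exclude temporally close pairs
        d[i, max(0, i - min_sep):i + min_sep + 1] = np.inf
    nn = np.argmin(d, axis=1)                         # nearest neighbour of each point
    div = [np.mean(np.log(np.linalg.norm(emb[np.arange(n) + k] - emb[nn + k],
                                         axis=1) + 1e-12))
           for k in range(horizon)]                   # mean log divergence vs time
    k0 = np.arange(horizon // 4)                      # initial, roughly linear region
    return np.polyfit(k0, np.array(div)[:len(k0)], 1)[0]

rng = np.random.default_rng(1)
x = np.sin(0.1 * np.arange(1500)) + 0.05 * rng.standard_normal(1500)
print("lambda_max estimate (per sample):", round(max_lyapunov(x), 4))
```

A near-zero slope, as expected for this noisy periodic test signal, indicates locally stable dynamics; positive values indicate exponential divergence of nearby trajectories.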
Assessment of mechanical properties of human head tissues for trauma modelling.
Lozano-Mínguez, Estívaliz; Palomar, Marta; Infante-García, Diego; Rupérez, María José; Giner, Eugenio
2018-05-01
Many discrepancies are found in the literature regarding the damage and constitutive models for head tissues as well as the values of the constants involved in the constitutive equations. Their proper definition is required for consistent numerical model performance when predicting human head behaviour, and hence skull fracture and brain damage. The objective of this research is to perform a critical review of constitutive models and damage indicators describing human head tissue response under impact loading. A 3D finite element human head model has been generated by using computed tomography images, which has been validated through the comparison to experimental data in the literature. The threshold values of the skull and the scalp that lead to fracture have been analysed. We conclude that (1) compact bone properties are critical in skull fracture, (2) the elastic constants of the cerebrospinal fluid affect the intracranial pressure distribution, and (3) the consideration of brain tissue as a nearly incompressible solid with a high (but not complete) water content offers pressure responses consistent with the experimental data. Copyright © 2018 John Wiley & Sons, Ltd.
Virtual hybrid test control of sinuous crack
NASA Astrophysics Data System (ADS)
Jailin, Clément; Carpiuc, Andreea; Kazymyrenko, Kyrylo; Poncelet, Martin; Leclerc, Hugo; Hild, François; Roux, Stéphane
2017-05-01
The present study aims at proposing a new generation of experimental protocols for analysing crack propagation in quasi-brittle materials. The boundary conditions are controlled in real time to conform to a predefined crack path. Servo-control is achieved through a full-field measurement technique to determine the pre-set fracture path and a simple predictor model based on linear elastic fracture mechanics to prescribe the boundary conditions on the fly, so that the actual crack path follows the predefined trajectory as closely as possible. The final goal is to identify, for instance, non-local damage models involving internal lengths. The validation of this novel procedure is performed via a virtual test case based on an enriched damage model with an internal length scale, a prior chosen sinusoidal crack path, and a concrete sample. Notwithstanding the fact that the predictor model selected for monitoring the test is a highly simplified picture of the targeted constitutive law, the proposed protocol exhibits much improved sensitivity to the sought parameters, such as internal lengths, as assessed from the comparison with other available experimental tests.
NASA Astrophysics Data System (ADS)
Flanagan, S.; Schachter, J. M.; Schissel, D. P.
2001-10-01
A Data Analysis Monitoring (DAM) system has been developed to monitor between-pulse physics analysis at the DIII-D National Fusion Facility. The system allows for rapid detection of discrepancies in diagnostic measurements or in the results of physics analysis codes. This enables problems to be detected, and possibly fixed, between pulses rather than after the experimental run has concluded, thus increasing the efficiency of experimental time. An example of a consistency check is comparing the stored energy obtained by integrating the measured kinetic profiles with that calculated from magnetic measurements by EFIT. The system also tracks the progress of MDSplus dispatching of data analysis software and the loading of analyzed data into MDSplus. DAM uses a Java Servlet to receive messages, CLIPS to implement the expert system logic, and displays its results to multiple web clients via HTML. If an error is detected by DAM, users can view more detailed information so that steps can be taken to eliminate the error for the next pulse. A demonstration of this system, including a simulated DIII-D pulse cycle, will be presented.
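A minimal sketch of such a between-pulse consistency check: integrate kinetic profiles into a stored energy and flag disagreement with the magnetic value. The profiles, volume element, EFIT value and tolerance below are invented placeholders, not DIII-D data.

```python
# Sketch: stored-energy consistency check, W = (3/2) * integral (ne*Te + ni*Ti) dV,
# compared against a magnetics-based (EFIT-style) value. All values are placeholders.
import numpy as np

rho = np.linspace(0.0, 1.0, 50)                 # normalised radius
ne = ni = 5e19 * (1 - rho**2)                   # densities [m^-3], placeholder
Te = Ti = 3.0e3 * (1 - rho**2) * 1.602e-19      # temperatures [J] (3 keV core), placeholder
dVdrho = 40.0 * rho                             # volume element dV/drho [m^3], placeholder

f = (ne * Te + ni * Ti) * dVdrho
w_kinetic = 1.5 * float(np.sum(0.5 * (f[1:] + f[:-1]) * np.diff(rho)))  # trapezoidal rule
w_efit = 0.95 * w_kinetic                        # pretend magnetics result

rel = abs(w_kinetic - w_efit) / w_efit
print(f"W_kin = {w_kinetic/1e6:.2f} MJ, W_efit = {w_efit/1e6:.2f} MJ -> "
      f"{'OK' if rel < 0.10 else 'FLAG for next pulse'} ({100*rel:.1f}% diff)")
```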
Inherent rhythm of smooth muscle cells in rat mesenteric arterioles: An eigensystem formulation
NASA Astrophysics Data System (ADS)
Ho, I. Lin; Moshkforoush, Arash; Hong, Kwangseok; Meininger, Gerald A.; Hill, Michael A.; Tsoukias, Nikolaos M.; Kuo, Watson
2016-04-01
On the basis of experimental data and mathematical equations in the literature, we recast the ionic dynamics of smooth muscle cells (SMCs) as an eigensystem formulation, which is valid for investigating finite variations of variables about equilibrium, such as those in common experimental operations. This algorithm provides an alternative viewpoint to frequency-domain analysis and enables one to probe the functionalities of the SMC rhythm by means of a resonance-related mechanism. Numerical results show three types of calcium oscillation in SMCs of mesenteric arterioles: spontaneous calcium oscillations, agonist-dependent calcium oscillations, and agonist-dependent calcium spikes. For simple single and double SMCs, we demonstrate synchronization properties of complex signals related to calcium oscillations, and show different correlations between calcium and voltage signals under various synchronization and resonance conditions. For practical cell clusters, our analyses indicate that the rhythm of SMCs can (1) enhance signal communication among remote cells, (2) respond with significant calcium peaking to transient stimulations, triggering globally oscillating modes, and (3) characterize the globally oscillating modes via frog-leap (non-molecular-diffusion) calcium waves across inhomogeneous SMCs.
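To illustrate the eigensystem viewpoint, here is a minimal sketch that linearises a toy two-variable excitable model (FitzHugh-Nagumo-like, not the paper's SMC ionic model) at its equilibrium and reads a natural rhythm off the complex Jacobian eigenvalues.

```python
# Sketch: natural rhythm from Jacobian eigenvalues at an equilibrium.
# Toy FitzHugh-Nagumo-like dynamics; parameters are standard textbook values.
import numpy as np

a, b, eps = 0.7, 0.8, 0.08

def f(v, w):
    return np.array([v - v**3 / 3.0 - w, eps * (v + a - b * w)])

# Relax to the stable equilibrium with forward Euler
state = np.array([-1.0, -0.5])
for _ in range(5000):
    state = state + 0.05 * f(*state)

# Finite-difference Jacobian at the fixed point (columns are d f / d v, d f / d w)
h = 1e-6
J = np.column_stack([(f(state[0] + h, state[1]) - f(state[0] - h, state[1])) / (2 * h),
                     (f(state[0], state[1] + h) - f(state[0], state[1] - h)) / (2 * h)])

eig = np.linalg.eigvals(J)
print("equilibrium:", state.round(3), "eigenvalues:", eig.round(4))
if abs(eig[0].imag) > 0:        # complex pair -> damped oscillatory (resonant) mode
    print("damped natural frequency ~",
          round(abs(eig[0].imag) / (2 * np.pi), 4), "cycles per time unit")
```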
Assessment of a 3-D boundary layer code to predict heat transfer and flow losses in a turbine
NASA Technical Reports Server (NTRS)
Anderson, O. L.
1984-01-01
Zonal concepts are utilized to delineate regions of application of three-dimensional boundary layer (3-DBL) theory. The zonal approach requires three distinct analyses. A modified version of the 3-DBL code named TABLET is used to analyze the boundary layer flow. This modified code solves the finite difference form of the compressible 3-DBL equations in a nonorthogonal surface coordinate system that includes Coriolis forces produced by coordinate rotation. These equations are solved using an efficient, implicit, fully coupled finite difference procedure. The nonorthogonal surface coordinate system is calculated using a general analysis based on the transfinite mapping of Gordon, which is valid for any arbitrary surface. Experimental data are used to determine the boundary layer edge conditions: the boundary layer edge equations, which are the Euler equations at the edge of the boundary layer, are integrated using the known experimental wall pressure distribution. Starting solutions along the inflow boundaries are estimated by solving the appropriate limiting form of the 3-DBL equations.
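A minimal sketch of the edge-condition step in its simplest (incompressible Bernoulli) limit of the edge equations, u_e(s) = sqrt(2(p0 - p(s))/rho); the pressure values below are invented placeholders, not turbine data, and the actual code integrates the compressible form.

```python
# Sketch: boundary-layer edge velocity from a measured wall pressure
# distribution, incompressible Bernoulli limit. All values are placeholders.
import numpy as np

rho, p0 = 1.2, 101325.0                       # density [kg/m^3], total pressure [Pa]
s = np.linspace(0.0, 0.1, 6)                  # surface arc length [m]
p_wall = p0 - np.array([0., 200., 600., 900., 800., 700.])  # static pressure [Pa]

u_e = np.sqrt(2.0 * (p0 - p_wall) / rho)
for si, ui in zip(s, u_e):
    print(f"s = {si:.3f} m  u_e = {ui:6.2f} m/s")
```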
NASA Astrophysics Data System (ADS)
Zhao, Jiaye; Wen, Huihui; Liu, Zhanwei; Rong, Jili; Xie, Huimin
2018-05-01
Three-dimensional (3D) deformation measurements are a key issue in experimental mechanics. In this paper, a displacement field correlation (DFC) method to measure centrosymmetric 3D dynamic deformation using a single camera is proposed for the first time. When 3D deformation information is collected by a camera at a tilted angle, the measured displacement fields are coupling fields of both the in-plane and out-of-plane displacements. The features of the coupling field are analysed in detail, and a decoupling algorithm based on DFC is proposed. The 3D deformation to be measured can be inverted and reconstructed using only one coupling field. The accuracy of this method was validated by a high-speed impact experiment that simulated an underwater explosion. The experimental results show that the approach proposed in this paper can be used in 3D deformation measurements with higher sensitivity and accuracy, and is especially suitable for high-speed centrosymmetric deformation. In addition, this method avoids the non-synchronisation problem associated with using a pair of high-speed cameras, as is common in 3D dynamic measurements.
Experimental test of the viscous anisotropy hypothesis for partially molten rocks
Qi, Chao; Kohlstedt, David L.; Katz, Richard F.; Takei, Yasuko
2015-01-01
Chemical differentiation of rocky planets occurs by melt segregation away from the region of melting. The mechanics of this process, however, are complex and incompletely understood. In partially molten rocks undergoing shear deformation, melt pockets between grains align coherently in the stress field; it has been hypothesized that this anisotropy in microstructure creates an anisotropy in the viscosity of the aggregate. With the inclusion of anisotropic viscosity, continuum, two-phase-flow models reproduce the emergence and angle of melt-enriched bands that form in laboratory experiments. In the same theoretical context, these models also predict sample-scale melt migration due to a gradient in shear stress. Under torsional deformation, melt is expected to segregate radially inward. Here we present torsional deformation experiments on partially molten rocks that test this prediction. Microstructural analyses of the distribution of melt and solid reveal a radial gradient in melt fraction, with more melt toward the center of the cylinder. The extent of this radial melt segregation grows with progressive strain, consistent with theory. The agreement between theoretical prediction and experimental observation provides a validation of this theory. PMID:26417107
Experimental test of the viscous anisotropy hypothesis for partially molten rocks.
Qi, Chao; Kohlstedt, David L; Katz, Richard F; Takei, Yasuko
2015-10-13
Chemical differentiation of rocky planets occurs by melt segregation away from the region of melting. The mechanics of this process, however, are complex and incompletely understood. In partially molten rocks undergoing shear deformation, melt pockets between grains align coherently in the stress field; it has been hypothesized that this anisotropy in microstructure creates an anisotropy in the viscosity of the aggregate. With the inclusion of anisotropic viscosity, continuum, two-phase-flow models reproduce the emergence and angle of melt-enriched bands that form in laboratory experiments. In the same theoretical context, these models also predict sample-scale melt migration due to a gradient in shear stress. Under torsional deformation, melt is expected to segregate radially inward. Here we present torsional deformation experiments on partially molten rocks that test this prediction. Microstructural analyses of the distribution of melt and solid reveal a radial gradient in melt fraction, with more melt toward the center of the cylinder. The extent of this radial melt segregation grows with progressive strain, consistent with theory. The agreement between theoretical prediction and experimental observation provides a validation of this theory.
Modeling of Stone-impact Resistance of Monolithic Glass Ply Using Continuum Damage Mechanics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sun, Xin; Khaleel, Mohammad A.; Davies, Richard W.
2005-04-01
We study the stone-impact resistance of a monolithic glass ply using a combined experimental and computational approach. Instrumented stone impact tests were first carried out in a controlled environment. Explicit finite element analyses were then used to simulate the interactions of the indentor and the glass layer during the impact event, and a continuum damage mechanics (CDM) model was used to describe the constitutive behavior of the glass. The experimentally measured strain histories for low-velocity impact served as validation of the modeling procedures. Next, stair-stepping impact experiments were performed with two indentor sizes on two glass ply thicknesses, and the test results were used to calibrate the critical stress parameters used in the CDM constitutive model. The purpose of this study is to establish the modeling procedures and the CDM critical stress parameters under impact loading conditions. The modeling procedures and the CDM model will be used in our future studies to predict through-thickness damage evolution patterns for different laminated windshield designs in automotive applications.
Juan, J A; Prat, J; Vera, P; Hoyos, J V; Sánchez-Lacuesta, J; Peris, J L; Dejoz, R; Alepuz, R
1992-09-01
A theoretical analysis of several external fixators (Hoffmann, Wagner, Orthofix and Ilizarov) was carried out with a finite element model (FEM). This study assumed a logarithmic progression of the elastic characteristics of the callus. A standard configuration of each fixator was defined, and its design and application characteristics were then varied. A comparison among the standard configurations, and of the influence of each variation, was made with regard to displacement and load transmission at the fracture site. An experimental evaluation of the standard configurations was performed with a testing machine. After experimental validation of the theoretical model, the application of the physiological loads acting on a fractured limb during normal gait was analysed. A minimal contribution from the external fixator to the total rigidity of the bone-callus-fixator system was found once a callus with even minimal elastic characteristics had been established. Insufficient rigidity of the fixation devices to assure adequate immobilization during the early stages of fracture healing was verified. However, regardless of the external fixator, callus development was the overriding element in the rigidity of the fixator-bone system.
Verevkin, Sergey P; Zaitsau, Dzmitry H; Emel'yanenko, Vladimir N; Yermalayeu, Andrei V; Schick, Christoph; Liu, Hongjun; Maginn, Edward J; Bulut, Safak; Krossing, Ingo; Kalb, Roland
2013-05-30
Vaporization enthalpy of an ionic liquid (IL) is a key physical property for applications of ILs as thermofluids and also is useful in developing liquid state theories and validating intermolecular potential functions used in molecular modeling of these liquids. Compilation of the data for a homologous series of 1-alkyl-3-methylimidazolium bis(trifluoromethane-sulfonyl)imide ([C(n)mim][NTf2]) ILs has revealed an embarrassing disarray of literature results. New experimental data, based on the concurring results from quartz crystal microbalance, thermogravimetric analyses, and molecular dynamics simulation have revealed a clear linear dependence of IL vaporization enthalpies on the chain length of the alkyl group on the cation. Ambiguity of the procedure for extrapolation of vaporization enthalpies to the reference temperature 298 K was found to be a major source of the discrepancies among previous data sets. Two simple methods for temperature adjustment of vaporization enthalpies have been suggested. Resulting vaporization enthalpies obey group additivity, although the values of the additivity parameters for ILs are different from those for molecular compounds.
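A minimal sketch of the kind of temperature adjustment at issue, using a constant heat-capacity difference, ΔvapH(298 K) = ΔvapH(T) + Δ(l→g)Cp · (298 K − T); the ΔCp value and example enthalpy are assumed placeholders, and this is not necessarily either of the paper's two suggested methods.

```python
# Sketch: adjusting a vaporisation enthalpy measured at temperature T to 298 K
# with a constant heat-capacity difference. dCp (gas minus liquid) is an
# assumed placeholder, typically negative for vaporisation.
def adjust_to_298(dh_vap_T_kJ, T_K, dCp_J_per_mol_K=-100.0):
    return dh_vap_T_kJ + dCp_J_per_mol_K * (298.15 - T_K) / 1000.0

# e.g. a hypothetical IL measured at 450 K with dHvap = 120 kJ/mol
print(f"{adjust_to_298(120.0, 450.0):.1f} kJ/mol at 298 K")
```

Because ΔCp is negative, the adjusted enthalpy at 298 K is larger than the high-temperature measurement, which is the physically expected direction.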
Lenain, Christelle; Gamboa, Bastien; Perrin, Agnes; Séraïdaris, Alexia; Bertino, Béatrice; Rival, Yves; Bernardi, Mathieu; Piwnica, David; Méhul, Bruno
2018-05-01
We investigated UV-induced signalling in an ex vivo skin organ culture model using a phospho-antibody array. Phosphorylation modulations were analysed in time-course experiments following exposure to solar-simulated UV and validated by Western blot analyses. We found that UV induced P-p38 and its substrates, P-ERK1/2 and P-AKT, which were previously shown to be upregulated by UV in cultured keratinocytes and in vivo human skin. This indicates that the phospho-antibody array applied to ex vivo skin organ culture is a relevant experimental system for investigating signalling events following perturbations. As the identified proteins are components of pathways implicated in skin tumorigenesis, the UV-exposed skin organ culture model could be used to investigate the effect on these pathways of non-melanoma skin cancer (NMSC) drug candidates. In addition, we found that phospho-HCK is induced upon UV exposure, providing a new candidate for future studies investigating its role in the skin response to UV and UV-induced carcinogenesis. © 2017 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
NASA Astrophysics Data System (ADS)
Caltaru, M.; Badicioiu, M.; Ripeanu, R. G.; Dinita, A.; Minescu, M.; Laudacescu, E.
2018-01-01
Drill pipe is seamless steel pipe with upset ends fitted with special threaded connections known as tool joints. During drilling operations, the wall thickness of the drill pipe and the outside diameter of the tool joints are gradually reduced by wear. The present research investigates the possibility of reconditioning drill pipe tool joints by hardbanding with a new metal-cored coppered flux-cored wire, Cr-Mo alloyed, using the gas metal active welding process, considering two different hardbanding technologies: hardbanding drill pipe tool joints after removing the old hardbanding material and reconstructing the surface with a compensation material (case A), and hardbanding drill pipe tool joints without removing the old hardbanding material (case B). The paper presents experimental research on the tribological characterization of the reconditioned drill pipe tool joints, comprising macroscopic analyses, metallographic analyses, Vickers hardness measurements, chemical composition measurements and wear tests conducted on ball-on-disk friction couples, in order to certify the quality of the hardbanding obtained by the different technological approaches and to validate the optimum technology.
Li, Jian-Hao; Zuehlsdorff, T J; Payne, M C; Hine, N D M
2015-05-14
We show that the transition origins of electronic excitations identified by quantified natural transition orbital (QNTO) analysis can be employed to connect potential energy surfaces (PESs) according to their character across a wide range of molecular geometries. This is achieved by locating the switching of transition origins of adiabatic potential surfaces as the geometry changes. The transition vectors for analysing transition origins are provided by linear response time-dependent density functional theory (TDDFT) calculations under the Tamm-Dancoff approximation. We study the photochemical C-O ring opening of oxirane as an example and show that the results corroborate the traditional Gomer-Noyes mechanism derived experimentally. The knowledge of the specific states involved in the reaction also agrees well with that given by previous theoretical work using TDDFT surface-hopping dynamics that was validated by high-quality quantum Monte Carlo calculations. We also show that QNTO can be useful for considerably larger and more complex systems: by projecting the excitations onto those of a reference oxirane molecule, the approach is able to identify and analyse specific excitations of a trans-2,3-diphenyloxirane molecule.
Transient Thermal Analyses of Passive Systems on SCEPTOR X-57
NASA Technical Reports Server (NTRS)
Chin, Jeffrey C.; Schnulo, Sydney L.; Smith, Andrew D.
2017-01-01
As efficiency, emissions, and noise become increasingly prominent considerations in aircraft design, turning to an electric propulsion system is a desirable solution. Achieving the intended benefits of distributed electric propulsion (DEP) requires thermally demanding high-power systems, presenting a different set of challenges compared to traditional aircraft propulsion. The embedded nature of these heat sources often precludes the use of traditional thermal management systems in order to maximize performance, with less opportunity to exhaust waste heat to the surrounding environment. This paper summarizes the thermal analyses of X-57 vehicle subsystems that do not employ externally air-cooled heat sinks. The high-power battery, wires, high-lift motors, and aircraft outer surface are subjected to heat loads under stringent thermal constraints. The temperatures of these components are tracked transiently, since they never reach a steady-state equilibrium. Through analysis and testing, this report demonstrates that properly characterizing the material properties is key to accurately modeling the peak temperatures of these systems, with less concern for spatial thermal gradients. Experimentally validated results show that the thermal profiles of these systems can be sufficiently estimated using reduced-order approximations.
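A minimal sketch of transient temperature tracking in the lumped-capacitance limit, m c dT/dt = Q − hA(T − T∞); all parameter values are invented placeholders, not X-57 hardware data.

```python
# Sketch: lumped-capacitance transient temperature of a component with an
# internal heat load and weak convection (nearly passive). Values are placeholders.
m, c = 20.0, 900.0        # mass [kg], specific heat [J/(kg K)]
Q = 1500.0                # dissipated power [W]
hA = 3.0                  # convective conductance [W/K]
T_inf, T = 30.0, 30.0     # ambient and initial temperature [C]
dt = 1.0                  # time step [s]

for step in range(1, 1801):                     # a 30-minute mission segment
    T += dt * (Q - hA * (T - T_inf)) / (m * c)  # forward Euler
    if step % 600 == 0:
        print(f"t = {step/60:4.0f} min  T = {T:6.1f} C")
```

With these placeholder numbers the temperature climbs throughout the segment without levelling off, mirroring the report's point that such components never reach steady state within a mission.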
NASA Astrophysics Data System (ADS)
Yehia, Ali M.; Arafa, Reham M.; Abbas, Samah S.; Amer, Sawsan M.
2016-01-01
Spectral resolution of cefquinome sulfate (CFQ) in the presence of its degradation products was studied. Three selective, accurate and rapid spectrophotometric methods were developed for the determination of CFQ in the presence of either its hydrolytic, oxidative or photo-degradation products. The proposed ratio difference, derivative ratio and mean centering methods are ratio-manipulating spectrophotometric methods that were satisfactorily applied for the selective determination of CFQ within a linear range of 5.0-40.0 μg mL⁻¹. Concentration Residuals Augmented Classical Least Squares was applied and evaluated for the determination of the cited drug in the presence of all of its degradation products. Traditional Partial Least Squares regression was also applied and benchmarked against the proposed advanced multivariate calibration. Twenty-five experimentally designed synthetic mixtures of three factors at five levels were used to calibrate and validate the multivariate models. The advanced chemometric methods succeeded in the quantitative and qualitative analyses of CFQ along with its hydrolytic, oxidative and photo-degradation products. The proposed methods were applied successfully to the analysis of different pharmaceutical formulations. These methods are simple and cost-effective compared with the manufacturer's RP-HPLC method.
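For orientation, a minimal sketch of a traditional Partial Least Squares calibration of the kind benchmarked above; the design, spectra and concentrations are random placeholders, not the CFQ data set.

```python
# Sketch: PLS calibration on synthetic "spectra" (Gaussian band + noise).
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)
conc = rng.uniform(5.0, 40.0, size=(25, 1))          # 25 designed mixtures [ug/mL]
pure = np.exp(-0.5 * ((np.arange(200) - 100) / 15.0) ** 2)   # pure-component band
spectra = conc * pure + 0.01 * rng.standard_normal((25, 200))

pls = PLSRegression(n_components=3)
pls.fit(spectra, conc)
unknown = 22.0 * pure + 0.01 * rng.standard_normal(200)
print(f"predicted concentration: {pls.predict(unknown[None, :])[0, 0]:.1f} ug/mL")
```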
Experimental Results from the Active Aeroelastic Wing Wind Tunnel Test Program
NASA Technical Reports Server (NTRS)
Heeg, Jennifer; Spain, Charles V.; Florance, James R.; Wieseman, Carol D.; Ivanco, Thomas G.; DeMoss, Joshua; Silva, Walter A.; Panetta, Andrew; Lively, Peter; Tumwa, Vic
2005-01-01
The Active Aeroelastic Wing (AAW) program is a cooperative effort among NASA, the Air Force Research Laboratory and the Boeing Company, encompassing flight testing, wind tunnel testing and analyses. The objective of the AAW program is to investigate the improvements that can be realized by exploiting aeroelastic characteristics, rather than viewing them as a detriment to vehicle performance and stability. To meet this objective, a wind tunnel model was crafted to duplicate the static aeroelastic behavior of the AAW flight vehicle. The model was tested in the NASA Langley Transonic Dynamics Tunnel in July and August 2004. The wind tunnel investigation served the program goal in three ways. First, the wind tunnel provided a benchmark for comparison with the flight vehicle and various levels of theoretical analyses. Second, it provided detailed insight highlighting the effects of individual parameters upon the aeroelastic response of the AAW vehicle. This parameter identification can then be used for future aeroelastic vehicle design guidance. Third, it provided data to validate scaling laws and their applicability with respect to statically scaled aeroelastic models.
Meta-Analysis of Inquiry-Based Instruction Research
NASA Astrophysics Data System (ADS)
Hasanah, N.; Prasetyo, A. P. B.; Rudyatmi, E.
2017-04-01
Inquiry-based instruction in biology has been the focus of educational research conducted by Unnes biology department students in collaboration with their university supervisors. This study aimed to critically describe the methodological aspects and inquiry teaching methods of four selected student research reports, grounded in inquiry, from the Unnes biology department 2014 database, and to analyse their results claims. Four of 16 experimental quantitative studies were selected as research objects by a purposive sampling technique. Data collected through documentation study were qualitatively analysed with regard to the methods used, the quality of the inquiry syntax, and the claims made about findings. Findings showed that the student research still lacked relevant aspects of research methodology, namely inappropriate sampling procedures, limited validity tests of the research instruments, and parametric statistics (t-tests) not supported beforehand by data normality tests. The consistent inquiry syntax supported the four mini-thesis claims that inquiry-based teaching significantly influenced the dependent variables. In other words, the findings indicated that the positive claims of the research results were not fully supported by good research methods and well-defined implementation of inquiry procedures.
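A minimal sketch of the missing methodological step flagged above: checking normality before applying a parametric t-test. The score data are random placeholders, not the reviewed studies' data.

```python
# Sketch: Shapiro-Wilk normality check, then a Welch t-test on two groups.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
inquiry = rng.normal(75.0, 8.0, 30)      # post-test scores, inquiry class (synthetic)
control = rng.normal(70.0, 8.0, 30)      # post-test scores, control class (synthetic)

for name, g in (("inquiry", inquiry), ("control", control)):
    w, p = stats.shapiro(g)
    print(f"{name}: Shapiro-Wilk p = {p:.3f} "
          f"({'consistent with normality' if p > 0.05 else 'non-normal'})")

t, p = stats.ttest_ind(inquiry, control, equal_var=False)   # Welch's t-test
print(f"Welch t = {t:.2f}, p = {p:.4f}")
```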
Unsteady Analyses of Valve Systems in Rocket Engine Testing Environments
NASA Technical Reports Server (NTRS)
Shipman, Jeremy; Hosangadi, Ashvin; Ahuja, Vineet
2004-01-01
This paper discusses simulation technology used to support the testing of rocket propulsion systems by performing high fidelity analyses of feed system components. A generalized multi-element framework has been used to perform simulations of control valve systems. This framework provides the flexibility to resolve the structural and functional complexities typically associated with valve-based high pressure feed systems that are difficult to deal with using traditional Computational Fluid Dynamics (CFD) methods. In order to validate this framework for control valve systems, results are presented for simulations of a cryogenic control valve at various plug settings and compared to both experimental data and simulation results obtained at NASA Stennis Space Center. A detailed unsteady analysis has also been performed for a pressure regulator type control valve used to support rocket engine and component testing at Stennis Space Center. The transient simulation captures the onset of a modal instability that has been observed in the operation of the valve. A discussion of the flow physics responsible for the instability and a prediction of the dominant modes associated with the fluctuations is presented.
Detection of internal cracks in rubber composite structures using an impact acoustic modality
NASA Astrophysics Data System (ADS)
Shen, Q.; Kurfess, T. R.; Omar, M.; Gramling, F.
2014-01-01
The objective of this study is to investigate the use of impact acoustic signals to non-intrusively inspect rubber composite structures for the presence of internal cracks, such as those found in an automobile tyre. Theoretical contact dynamics models for both intact and defective rubber structures are developed based on Hertz's impact model, further modified for rubber composite materials. The model predicts the major impact dynamic quantities, namely the maximum impact force, impact duration and contact deformation; these parameters are also theoretically shown to be correlated with the presence of internal cracks. The tyre structures are simplified into cubic rubber blocks to reduce the complexity of the analytical modelling. Both impact force and impact sound signals are measured experimentally, and useful features are extracted from both signals for defect identification. The impact force provides two direct measurements of the theoretical impact dynamic quantities. A good correlation between these experimental discriminators and the theoretical dynamic quantities provides validation of the contact dynamics models. Defect discriminators extracted from the impact sound rely on both time- and frequency-domain analyses. All the discriminators are closely connected with the theoretical dynamic quantities and are experimentally verified as good indicators of internal cracks in rubber composite structures.
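For context, a minimal sketch of the classical Hertz impact estimates for a sphere striking an elastic half-space, i.e. the quantities named above (maximum force, impact duration, contact deformation); material and test parameters are illustrative assumptions, not the study's rubber-block values, and the rubber-specific modifications described in the paper are omitted.

```python
# Sketch: classical Hertz impact of a rigid sphere on an elastic half-space.
#   delta_max = (15 m v^2 / (16 E* sqrt(R)))^(2/5)
#   F_max     = (4/3) E* sqrt(R) delta_max^(3/2)
#   t_contact ~ 2.94 delta_max / v
import numpy as np

R, m, v = 0.01, 0.03, 1.0            # indentor radius [m], mass [kg], speed [m/s]
E, nu = 5e6, 0.49                    # rubber-like modulus [Pa], Poisson ratio
E_star = E / (1.0 - nu**2)           # effective modulus, rigid indentor

delta_max = (15.0 * m * v**2 / (16.0 * E_star * np.sqrt(R))) ** 0.4
F_max = (4.0 / 3.0) * E_star * np.sqrt(R) * delta_max**1.5
t_contact = 2.94 * delta_max / v

print(f"delta_max = {delta_max*1e3:.2f} mm, F_max = {F_max:.1f} N, "
      f"duration = {t_contact*1e3:.2f} ms")
```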
Kerry, Matthew J; Embretson, Susan E
2017-01-01
Future time perspective (FTP) is defined as "perceptions of the future as being limited or open-ended" (Lang and Carstensen, 2002; p. 125). The construct figures prominently in both workplace and retirement domains, but the age predictions compete: workplace research predicts decreasing FTP age-change, whereas retirement scholars predict increasing FTP age-change. For the first time, these competing predictions are pitted against each other in an experimental manipulation of subjective life expectancy (SLE). A sample of N = 207 older adults (age 45-60) working full-time (>30 h/week) were randomly assigned to SLE questions framed as either 'Live-to' or 'Die-by' to evaluate the competing predictions for FTP. Results generally support decreasing age-change in FTP, indicated by independent-samples t-tests showing lower FTP in the 'Die-by' framing condition. Further general linear model analyses were conducted to test for interaction effects of retirement planning with the experimental framings on FTP and intended retirement. While retirement planning buffered the decrease in FTP, simple-effects analyses also revealed that retirement planning increased intentions for sooner retirement, whereas lack of planning increased intentions for later retirement. Discussion centers on the practical implications of our findings and their consequences for validity evidence in future empirical research on FTP in both workplace and retirement domains.
Scaling in biomechanical experimentation: a finite similitude approach.
Ochoa-Cabrero, Raul; Alonso-Rasgado, Teresa; Davey, Keith
2018-06-01
Biological experimentation faces many obstacles: resource limitations, unavailability of materials, manufacturing complexities and ethical compliance issues; any approach that resolves all or some of these is of interest. The aim of this study is to apply the recently discovered concept of finite similitude as a novel approach for the design of scaled biomechanical experiments, supported with analysis using a commercial finite-element package and validated by means of image correlation software. The study of isotropic scaling of synthetic bones leads to the selection of three-dimensional (3D) printed materials as the trial-space materials. These materials, conforming to the theory, are analysed in finite-element models of cylinder and femur geometries undergoing compression, tension, torsion and bending tests to assess the efficacy of the approach using reverse scaling. The finite-element results show similar surface strain patterns, with a maximum difference of less than 10% for the cylinder and less than 4% for the femur across all tests. Finally, the trial-space physical experimentation using 3D printed materials for compression and bending testing shows good agreement in a Bland-Altman statistical analysis, providing good supporting evidence for the practicality of the approach. © 2018 The Author(s).
Bitter, Thom; Khan, Imran; Marriott, Tim; Lovelady, Elaine; Verdonschot, Nico; Janssen, Dennis
2017-09-01
Fretting corrosion at the taper interface of modular hip implants has been implicated as a possible cause of implant failure. This study was set up to gain more insight into the taper mechanics that lead to fretting corrosion. The objectives of this study therefore were (1) to select experimental loading conditions that reproduce clinically relevant fretting corrosion features observed in retrieved components, (2) to develop a finite element model consistent with the fretting experiments and (3) to apply more complicated loading conditions representing activities of daily living in the finite element model to study the taper mechanics. The experiments showed wear patterns on the taper surface similar to those observed in retrievals. The finite element wear score based on Archard's law did not correlate well with the amount of material loss measured in the experiments. However, similar patterns were observed between the simulated micromotions and the experimental wear measurements. Although the finite element model could not be validated, the loading conditions based on activities of daily living demonstrate the importance of the assembly load on the wear potential. These findings suggest that finite element models that do not incorporate geometry updates to account for wear loss may not be appropriate for predicting wear volumes of taper connections.
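For readers unfamiliar with the wear score mentioned above, the following minimal sketch shows an Archard-type accumulation of wear depth from nodal contact pressure and micromotion, in the general form h = (k/H) * sum(p_i * s_i). The constants and the fabricated pressure/slip arrays are illustrative, not the study's finite element outputs.

```python
import numpy as np

def archard_wear_depth(pressure, slip, k_wear, hardness):
    """Cumulative Archard wear depth per contact node.
    pressure: contact pressure per node per load increment [Pa], shape (nodes, steps)
    slip: incremental micromotion per node per increment [m], same shape
    Returns h = (k/H) * sum_i(p_i * s_i) per node [m]."""
    return (k_wear / hardness) * np.sum(pressure * slip, axis=1)

rng = np.random.default_rng(0)
p = rng.uniform(50e6, 200e6, size=(100, 64))   # taper contact pressures (fabricated)
s = rng.uniform(0.0, 5e-6, size=(100, 64))     # fretting micromotions (fabricated)
h = archard_wear_depth(p, s, k_wear=1e-4, hardness=2.4e9)
print("max predicted wear depth [um]:", 1e6 * h.max())
```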
Model validations for low-global warming potential refrigerants in mini-split air-conditioning units
Shen, Bo; Shrestha, Som; Abdelaziz, Omar
2016-09-02
To identify low-GWP (global warming potential) refrigerants to replace R-22 and R-410A, extensive experimental evaluations were conducted for multiple refrigerant candidates at the standard test conditions and at high-ambient conditions, with outdoor temperature varying from 27.8°C to 55.0°C. In the study, R-22 was compared to propane (R-290), DR-3, ARM-20B, N-20B and R-444B in a mini-split air conditioning unit originally designed for R-22; R-410A was compared to R-32, DR-55, ARM-71A and L41-2 (R-447A) in a mini-split unit designed for R-410A. To reveal the physics behind the measured performance results, thermodynamic properties of the alternative refrigerants were analysed. In addition, the experimental data was used to calibrate a physics-based equipment model, i.e. the ORNL Heat Pump Design Model (HPDM). The calibrated model translated the experimental results into key calculated parameters, i.e. compressor efficiencies and refrigerant-side two-phase heat transfer coefficients, corresponding to each refrigerant. These calculated values provide scientific insights on the performance of the alternative refrigerants and are useful for other applications beyond mini-split air conditioning units.
NASA Astrophysics Data System (ADS)
Weres, Jerzy; Kujawa, Sebastian; Olek, Wiesław; Czajkowski, Łukasz
2016-04-01
Knowledge of the physical properties of biomaterials is important in understanding and designing agri-food and wood processing industries. In the study presented in this paper, computational methods were developed and combined with experiments to enhance the identification of agri-food and forest product properties and to predict heat and water transport in such products. They were based on a finite element model of heat and water transport and were supplemented with experimental data. Algorithms were proposed for image processing, geometry meshing, and inverse/direct finite element modelling. The resulting software system was composed of integrated subsystems for 3D geometry data acquisition and mesh generation, for 3D geometry modelling and visualization, and for inverse/direct problem computations for the heat and water transport processes. Auxiliary packages were developed to assess performance, accuracy and unification of data access. The software was validated by identifying selected properties and using the estimated values to predict the examined processes, and then comparing predictions to experimental data. The geometry, thermal conductivity, specific heat, coefficient of water diffusion, equilibrium water content and convective heat and water transfer coefficients in the boundary layer were analysed. The estimated values, used as an input for simulation of the examined processes, enabled a reduction in the uncertainty associated with predictions.
Jahromi, Hamed Dehdashti; Mahmoodi, Ali; Sheikhi, Mohammad Hossein; Zarifkar, Abbas
2016-10-20
Reduction of dark current at high-temperature operation is a great challenge in conventional quantum dot infrared photodetectors, as the rate of the thermal excitations responsible for the dark current increases exponentially with temperature. A resonant tunneling barrier is the best candidate for suppression of dark current, enhancement of the signal-to-noise ratio, and selective extraction of different wavelength responses. In this paper, we use a physical model recently developed by the authors to design a proper resonant tunneling barrier for quantum dot infrared photodetectors and to study and analyze the spectral response of these devices. The transmission coefficient of electrons calculated by this model, and its dependence on bias voltage, are in agreement with experimental results. Furthermore, based on the calculated transmission coefficient, the dark current of a quantum dot infrared photodetector with a resonant tunneling barrier is calculated and compared with the experimental data. The validity of our model is supported by this comparison. The dark current predicted by our model shows better agreement with the experimental data and is more accurate than that of the previously developed model. Moreover, noise in the device is calculated. Finally, the effects of different parameters, such as temperature, size of quantum dots, and bias voltage, on the performance of the device are simulated and studied.
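The transmission coefficient mentioned above is conventionally obtained from a one-dimensional transfer-matrix calculation; the hedged sketch below implements that standard textbook method for a double (resonant tunneling) barrier with a constant effective mass. It is not the authors' published model, and the layer stack and energies are assumed.

```python
import numpy as np

HBAR = 1.054571817e-34   # J*s
ME = 9.1093837015e-31    # electron mass [kg]
EV = 1.602176634e-19     # J per eV

def transmission(E_eV, layers, m_eff=0.067 * ME):
    """Plane-wave transmission through piecewise-constant barriers.
    layers: list of (thickness [m], potential [eV]); the leads have V = 0."""
    E = E_eV * EV
    V = [0.0] + [v * EV for _, v in layers] + [0.0]   # region potentials
    x = np.cumsum([0.0] + [t for t, _ in layers])     # interface positions
    k = [np.sqrt(2.0 * m_eff * (E - v) + 0j) / HBAR for v in V]
    M = np.eye(2, dtype=complex)
    for j in range(len(x)):                           # chain interface matrices, left to right
        ka, kb, xj = k[j], k[j + 1], x[j]
        r = kb / ka
        Mj = 0.5 * np.array(
            [[(1 + r) * np.exp(1j * (kb - ka) * xj), (1 - r) * np.exp(-1j * (kb + ka) * xj)],
             [(1 - r) * np.exp(1j * (kb + ka) * xj), (1 + r) * np.exp(-1j * (kb - ka) * xj)]])
        M = M @ Mj
    # With unit transmitted amplitude, the incident amplitude is M[0, 0]
    return float(np.real(k[-1] / k[0]) / abs(M[0, 0]) ** 2)

# Double barrier reminiscent of a GaAs/AlGaAs stack (all values assumed)
stack = [(2e-9, 0.3), (5e-9, 0.0), (2e-9, 0.3)]
for e in (0.05, 0.10, 0.15, 0.20):
    print(f"E = {e:.2f} eV -> T = {transmission(e, stack):.3e}")
```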
Investigation of combustion characteristics in a scramjet combustor using a modified flamelet model
NASA Astrophysics Data System (ADS)
Zhao, Guoyan; Sun, Mingbo; Wang, Hongbo; Ouyang, Hao
2018-07-01
In this study, the characteristics of supersonic combustion inside an ethylene-fueled scramjet combustor equipped with multi-cavities were investigated under different injection schemes. Experimental results showed that the flames concentrated in the cavity and in the separated boundary layer downstream of the cavity, and that they occupied the flow channel, further enhancing the bulk flow compression. The flame structure in the distributed injection scheme differed from that in the centralized injection scheme. In the numerical simulations, a modified flamelet model was introduced to account for the fact that the pressure distribution is far from homogeneous inside the scramjet combustor. Compared with the original flamelet model, numerical predictions based on the modified model showed better agreement with the experimental results, validating the reliability of the calculations. Based on the modified model, simulations with different injection schemes were analysed. The predicted flame structure agreed reasonably with the experimental observations. The CO mass was concentrated in the cavity and in the subsonic region adjacent to the cavity shear layer, leading to intense heat release. Compared with the centralized scheme, the higher jet mixing efficiency of the distributed scheme induced intense combustion in the posterior upper cavity and downstream of the cavity. From the streamlines and isolation surfaces, the combustion at the trailing edge of the lower cavity was suppressed, since the bulk flow downstream of the cavity is pushed down.
Macias, Cathaleene; Barreira, Paul; Hargreaves, William; Bickman, Leonard; Fisher, William; Aronson, Elliot
2009-01-01
Objective The inability to blind research participants to their experimental conditions is the Achilles’ heel of mental health services research. When one experimental condition receives more disappointed participants, or more satisfied participants, research findings can be biased in spite of random assignment. The authors explored the potential for research participants’ preference for one experimental program over another to compromise the generalizability and validity of randomized controlled service evaluations as well as cross-study comparisons. Method Three Cox regression analyses measured the impact of applicants’ service assignment preference on research project enrollment, engagement in assigned services, and a service-related outcome, competitive employment. Results A stated service preference, referral by an agency with a low level of continuity in outpatient care, and willingness to switch from current services were significant positive predictors of research enrollment. Match to service assignment preference was a significant positive predictor of service engagement, and mismatch to assignment preference was a significant negative predictor of both service engagement and employment outcome. Conclusions Referral source type and service assignment preference should be routinely measured and statistically controlled for in all studies of mental health service effectiveness to provide a sound empirical base for evidence-based practice. PMID:15800153
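To make the statistical approach concrete, here is a hedged sketch of a Cox proportional-hazards regression of time-to-enrollment on a stated service preference, using the lifelines package on fabricated data; the variable names, censoring rule and effect sizes are illustrative only, not the study's.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n = 200
pref = rng.integers(0, 2, n)        # applicant stated a service preference (assumed covariate)
ref_low = rng.integers(0, 2, n)     # referral agency with low continuity of care (assumed)
# Assumed effect: stated preference and low-continuity referral shorten time-to-enrollment
t = rng.exponential(scale=8.0 / (1.0 + pref + 0.5 * ref_low), size=n)

df = pd.DataFrame({
    "weeks": np.minimum(t, 12.0),          # administrative censoring at 12 weeks
    "enrolled": (t < 12.0).astype(int),    # event indicator
    "stated_preference": pref,
    "referral_low_continuity": ref_low,
})

cph = CoxPHFitter()
cph.fit(df, duration_col="weeks", event_col="enrolled")
cph.print_summary()   # hazard ratios > 1 indicate faster research enrollment
```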
Acoustic emission by self-organising effects of micro-hollow cathode discharges
NASA Astrophysics Data System (ADS)
Kotschate, Daniel; Gaal, Mate; Kersten, Holger
2018-04-01
We designed micro-hollow cathode discharge prototypes under atmospheric pressure and investigated their acoustic characteristics. For the acoustic model of the discharge, we correlated the self-organisation effect of the current density distribution with the ideal model of an acoustic membrane. For validation of the obtained model, sound particle velocity spectroscopy was used to detect and analyse the acoustic emission experimentally. The results have shown a behaviour similar to the ideal acoustic membrane. Therefore, the acoustic excitation is decomposable into its eigenfrequencies and predictable. The model was unified utilising the gas exhaust velocity caused by the electrohydrodynamic force. The results may allow a contactless prediction of the current density distribution by measuring the acoustic emission or using the micro-discharge as a tunable acoustic source for specific applications as well.
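The ideal-membrane decomposition described above can be reproduced in a few lines: the eigenfrequencies of a circular membrane are f_mn = alpha_mn * c / (2*pi*a), with alpha_mn the n-th zero of the Bessel function J_m and c = sqrt(T/sigma). The sketch below computes the first few modes; the radius, tension and areal density are assumed values, not the prototype's.

```python
import numpy as np
from scipy.special import jn_zeros

def membrane_modes(a, tension, sigma, m_max=2, n_max=3):
    """Eigenfrequencies of an ideal circular membrane.
    a: radius [m], tension: membrane tension [N/m], sigma: areal density [kg/m^2]."""
    c = np.sqrt(tension / sigma)    # transverse wave speed [m/s]
    modes = {}
    for m in range(m_max + 1):
        for n, alpha in enumerate(jn_zeros(m, n_max), start=1):
            modes[(m, n)] = alpha * c / (2 * np.pi * a)
    return modes

# Assumed micro-discharge-scale parameters: 0.5 mm radius, 20 N/m, 0.1 kg/m^2
for (m, n), f in sorted(membrane_modes(5e-4, 20.0, 0.1).items(), key=lambda kv: kv[1]):
    print(f"mode ({m},{n}): {f/1e3:.1f} kHz")
```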
Lu, Liang-Xing; Wang, Ying-Min; Srinivasan, Bharathi Madurai; Asbahi, Mohamed; Yang, Joel K W; Zhang, Yong-Wei
2016-09-01
We perform a systematic two-dimensional energetic analysis to study the stability of various nanostructures formed by dewetting solid films deposited on patterned substrates. Our analytical results show that by controlling system parameters such as the substrate surface pattern, film thickness and wetting angle, a variety of equilibrium nanostructures can be obtained. Phase diagrams are presented to show the complex relations between these system parameters and the various nanostructure morphologies. We further carry out both phase field simulations and dewetting experiments to validate the analytically derived phase diagrams. Good agreement between the results from our energetic analyses and those from our phase field simulations and experiments verifies the analysis. Hence, the phase diagrams presented here provide guidelines for using solid-state dewetting as a tool to achieve various nanostructures.
A critical examination of the validity of simplified models for radiant heat transfer analysis.
NASA Technical Reports Server (NTRS)
Toor, J. S.; Viskanta, R.
1972-01-01
The directional effects of simplified models are examined by comparing experimental data with predictions based on simple and more detailed models for the radiation characteristics of surfaces. Analytical results indicate that the constant-property diffuse and specular models do not yield the upper and lower bounds on local radiant heat flux. In general, the constant-property specular analysis yields higher values of irradiation than the constant-property diffuse analysis. A diffuse surface in the enclosure appears to destroy the effect of specularity of the other surfaces. Semigray and gray analyses predict the irradiation reasonably well provided that the directional properties and the specularity of the surfaces are taken into account. The uniform and nonuniform radiosity diffuse models are in satisfactory agreement with each other.
Acoustic wave transmission through piezoelectric structured materials.
Lam, M; Le Clézio, E; Amorín, H; Algueró, M; Holc, Janez; Kosec, Marija; Hladky-Hennion, A C; Feuillard, G
2009-05-01
This paper deals with the transmission of acoustic waves through multilayered piezoelectric materials. It is modeled in an octet formalism via the hybrid matrix of the structure. The theoretical evolution with the angle and frequency of the transmission coefficients of ultrasonic plane waves propagating through a partially depoled PZT plate is compared to finite element calculations showing that both methods are in very good agreement. The model is then used to study a periodic stack of 0.65 PMN-0.35 PT/0.90 PMN-0.10 PT layers. The transmission spectra are interpreted in terms of a dispersive behavior of the critical angles of longitudinal and transverse waves, and band gap structures are analysed. Transmission measurements confirm the theoretical calculations and deliver an experimental validation of the model.
Extra virgin olive oil bitterness evaluation by sensory and chemical analyses.
Favati, Fabio; Condelli, Nicola; Galgano, Fernanda; Caruso, Marisa Carmela
2013-08-15
An experimental investigation was performed on blended extra virgin olive oils (EVOOs) from different cultivars and on EVOOs from different olive monovarieties (Coratina, Leccino, Maiatica, Ogliarola) with the aim of evaluating the possibility of estimating the perceived bitterness intensity from chemical indices, such as the total phenol content and the compounds responsible for oil bitterness measured spectrophotometrically at 225 nm (the K225 value), as bitterness predictors in different EVOOs. A bitterness predictive model, based on the relationship between the perceived bitterness intensity of the selected stimuli and the chosen chemical parameters, was built and validated. The results indicated that oil bitterness intensity could be satisfactorily predicted using the K225 values of the oil samples. Copyright © 2013 Elsevier Ltd. All rights reserved.
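As a minimal illustration of the kind of predictor validated above, the sketch below fits bitterness intensity to the K225 index by ordinary least squares and reports R-squared; the data pairs are invented, not the study's measurements.

```python
import numpy as np

k225 = np.array([0.12, 0.18, 0.25, 0.31, 0.40, 0.48])        # absorbance at 225 nm (invented)
bitterness = np.array([1.0, 1.9, 2.8, 3.5, 4.6, 5.4])        # panel bitterness intensity (invented)

slope, intercept = np.polyfit(k225, bitterness, deg=1)       # linear calibration model
pred = slope * k225 + intercept
r2 = 1 - np.sum((bitterness - pred)**2) / np.sum((bitterness - bitterness.mean())**2)

print(f"bitterness ~ {slope:.2f}*K225 + {intercept:.2f}, R^2 = {r2:.3f}")
print("predicted bitterness at K225 = 0.35:", round(slope * 0.35 + intercept, 2))
```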
Clark, Robin A; Shoaib, Mohammed; Hewitt, Katherine N; Stanford, S Clare; Bate, Simon T
2012-08-01
InVivoStat is a free-to-use statistical software package for analysis of data generated from animal experiments. The package is designed specifically for researchers in the behavioural sciences, where exploiting the experimental design is crucial for reliable statistical analyses. This paper compares the analysis of three experiments conducted using InVivoStat with other widely used statistical packages: SPSS (V19), PRISM (V5), UniStat (V5.6) and Statistica (V9). We show that InVivoStat provides results that are similar to those from the other packages and, in some cases, are more advanced. This investigation provides evidence of further validation of InVivoStat and should strengthen users' confidence in this new software package.
Experimentally validated finite element model of electrocaloric multilayer ceramic structures
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, N. A. S.; Correia, T. M.; Rokosz, M. K.
2014-07-28
A novel finite element model to simulate the electrocaloric response of a multilayer ceramic capacitor (MLCC) under real environment and operational conditions has been developed. The two-dimensional transient conductive heat transfer model presented includes the electrocaloric effect as a source term, as well as accounting for radiative and convective effects. The model has been validated with experimental data obtained from the direct imaging of MLCC transient temperature variation under application of an electric field. The good agreement between simulated and experimental data suggests that the novel experimental direct measurement methodology and the finite element model could be used to support the design of optimised electrocaloric units and operating conditions.
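To convey the core of such a model, the hedged sketch below solves one-dimensional transient conduction through an MLCC-like slab with the electrocaloric effect entering as a volumetric source term and convection at the surfaces; the geometry, material properties and pulse magnitude are assumed, and the paper's model is two-dimensional and also includes radiation.

```python
import numpy as np

L, nx = 1e-3, 51                      # slab thickness [m], grid points (assumed)
k_th, rho, cp = 3.0, 7500.0, 450.0    # conductivity, density, heat capacity (assumed)
h_conv, T_inf = 15.0, 298.15          # convection coefficient [W/m^2K], ambient [K]
alpha = k_th / (rho * cp)
dx = L / (nx - 1)
dt = 0.4 * dx**2 / alpha              # stable explicit time step
T = np.full(nx, T_inf)

def ec_source(t):
    """Electrocaloric heating pulse on field application (assumed magnitude/timing)."""
    return 2e8 if 0.0 <= t < 0.01 else 0.0    # W/m^3

t = 0.0
while t < 0.1:
    Tn = T.copy()
    # interior nodes: conduction + electrocaloric source term
    T[1:-1] = Tn[1:-1] + alpha * dt / dx**2 * (Tn[2:] - 2 * Tn[1:-1] + Tn[:-2]) \
              + dt * ec_source(t) / (rho * cp)
    # convective boundaries via a first-order surface energy balance
    T[0] = Tn[0] + dt * (k_th * (Tn[1] - Tn[0]) / dx - h_conv * (Tn[0] - T_inf)) / (rho * cp * dx)
    T[-1] = Tn[-1] + dt * (k_th * (Tn[-2] - Tn[-1]) / dx - h_conv * (Tn[-1] - T_inf)) / (rho * cp * dx)
    t += dt

print(f"peak temperature rise: {T.max() - T_inf:.3f} K")
```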
Waldman, Irwin D; Poore, Holly E; van Hulle, Carol; Rathouz, Paul J; Lahey, Benjamin B
2016-11-01
Several recent studies of the hierarchical phenotypic structure of psychopathology have identified a General psychopathology factor in addition to the more expected specific Externalizing and Internalizing dimensions in both youth and adult samples and some have found relevant unique external correlates of this General factor. We used data from 1,568 twin pairs (599 MZ & 969 DZ) age 9 to 17 to test hypotheses for the underlying structure of youth psychopathology and the external validity of the higher-order factors. Psychopathology symptoms were assessed via structured interviews of caretakers and youth. We conducted phenotypic analyses of competing structural models using Confirmatory Factor Analysis and used Structural Equation Modeling and multivariate behavior genetic analyses to understand the etiology of the higher-order factors and their external validity. We found that both a General factor and specific Externalizing and Internalizing dimensions are necessary for characterizing youth psychopathology at both the phenotypic and etiologic levels, and that the 3 higher-order factors differed substantially in the magnitudes of their underlying genetic and environmental influences. Phenotypically, the specific Externalizing and Internalizing dimensions were slightly negatively correlated when a General factor was included, which reflected a significant inverse correlation between the nonshared environmental (but not genetic) influences on Internalizing and Externalizing. We estimated heritability of the general factor of psychopathology for the first time. Its moderate heritability suggests that it is not merely an artifact of measurement error but a valid construct. The General, Externalizing, and Internalizing factors differed in their relations with 3 external validity criteria: mother's smoking during pregnancy, parent's harsh discipline, and the youth's association with delinquent peers. Multivariate behavior genetic analyses supported the external validity of the 3 higher-order factors by suggesting that the General, Externalizing, and Internalizing factors were correlated with peer delinquency and parent's harsh discipline for different etiologic reasons. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
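A rough intuition for the twin-based heritability estimates reported above comes from Falconer's classical decomposition, sketched below with invented correlations; the paper itself uses full multivariate structural equation models rather than this back-of-envelope formula.

```python
def falconer(r_mz, r_dz):
    """Falconer's ACE decomposition from MZ and DZ twin correlations."""
    h2 = 2 * (r_mz - r_dz)    # additive genetic variance (A)
    c2 = 2 * r_dz - r_mz      # shared environment (C)
    e2 = 1 - r_mz             # nonshared environment + measurement error (E)
    return h2, c2, e2

# Hypothetical twin correlations for a General psychopathology factor score
h2, c2, e2 = falconer(r_mz=0.55, r_dz=0.30)
print(f"A = {h2:.2f}, C = {c2:.2f}, E = {e2:.2f}")
```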
Experimental and Quasi-Experimental Design.
ERIC Educational Resources Information Center
Cottrell, Edward B.
With an emphasis on the problems of control of extraneous variables and threats to internal and external validity, the arrangement or design of experiments is discussed. The purpose of experimentation in an educational institution, and the principles governing true experimentation (randomization, replication, and control) are presented, as are…
Mobashsher, Ahmed Toaha; Abbosh, A M
2016-11-29
Rapid, on-the-spot diagnostic and monitoring systems are vital for the survival of patients with intracranial hematoma, as their condition deteriorates drastically with time. To address the limited accessibility, high costs and static structure of currently used MRI and CT scanners, a portable non-invasive multi-slice microwave imaging system is presented for accurate 3D localization of hematoma inside the human head. This diagnostic system provides fast data acquisition and imaging compared to existing systems by means of a compact array of low-profile, unidirectional antennas with wideband operation. The 3D-printed, low-cost and portable system can be installed in an ambulance for rapid on-site diagnosis by paramedics. In this paper, the multi-slice head imaging system's operating principle is numerically analysed and experimentally validated on realistic head phantoms. Quantitative analyses demonstrate that the multi-slice head imaging system is able to generate better-quality reconstructed images, providing 70% higher average signal-to-clutter ratio, 25% enhanced maximum signal-to-clutter ratio and around 60% improvement in hematoma target localization compared to previous head imaging systems. Moreover, numerical and experimental results demonstrate that previously reported 2D imaging systems are vulnerable to localization error, which is overcome in the presented multi-slice 3D imaging system. The non-ionizing system, which uses safe levels of very low microwave power, is also tested on human subjects. Results on realistic phantoms and human subjects demonstrate the feasibility of the system for future preclinical trials.
Flaws in animal studies exploring statins and impact on meta-analysis.
Moja, Lorenzo; Pecoraro, Valentina; Ciccolallo, Laura; Dall'Olmo, Luigi; Virgili, Gianni; Garattini, Silvio
2014-06-01
Animal experiments should be appropriately designed, correctly analysed and transparently reported to increase their scientific validity and maximise the knowledge gained from each experiment. This systematic review of animal experiments investigating statins evaluates their quality of reporting and methodological aspects, as well as the implications for the conduct of meta-analyses. We searched MEDLINE and EMBASE for studies reporting research on statins in mice, rats and rabbits. We collected detailed information about the characteristics of the studies, animals and experimental methods. We retrieved 161 studies. A little over half did not report randomisation (55%) and most did not describe blinding (88%). All studies reported details of the experimental procedure, although many omitted information about animal gender, age or weight. Four percent did not report the number of animals used. None reported a sample size calculation. Fixed- and random-effects models gave different results (the ratio of effect sizes increased fivefold). Heterogeneity was consistently substantial within animal models, and accounting for covariates had minimal impact. Publication bias is highly suspected across studies. Although statins showed efficacy in animal models, these preclinical studies highlighted fundamental problems in the way such research is conducted and reported. Results were often difficult to interpret and reproduce. Different meta-analytic approaches were highly inconsistent: no reliable approach to estimating the true parameter was apparent. Policies that address these issues are required from investigators, editors and institutions that care about the quality standards and ethics of animal research. © 2014 Stichting European Society for Clinical Investigation Journal Foundation.
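The divergence between fixed- and random-effects results noted above is easy to reproduce: the sketch below computes an inverse-variance fixed effect and a DerSimonian-Laird random-effects estimate on fabricated, heterogeneous study results.

```python
import numpy as np

y = np.array([0.80, 0.10, 1.40, 0.30, 0.95])   # per-study effect sizes (fabricated)
v = np.array([0.02, 0.05, 0.04, 0.03, 0.10])   # per-study variances (fabricated)

w = 1 / v                                       # inverse-variance weights
theta_fixed = np.sum(w * y) / np.sum(w)         # fixed-effect estimate

Q = np.sum(w * (y - theta_fixed) ** 2)          # Cochran's Q heterogeneity statistic
k = len(y)
tau2 = max(0.0, (Q - (k - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))

w_star = 1 / (v + tau2)                         # DerSimonian-Laird weights
theta_random = np.sum(w_star * y) / np.sum(w_star)

print(f"fixed: {theta_fixed:.3f}, random: {theta_random:.3f}, tau^2 = {tau2:.3f}")
```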
Reliability and validity of the Salford-Scott Nursing Values Questionnaire in Turkish.
Ulusoy, Hatice; Güler, Güngör; Yıldırım, Gülay; Demir, Ecem
2018-02-01
Developing professional values among nursing students is important because values are a significant predictor of the quality of care that will be provided, of the clients' recognition, and consequently of the nurses' job satisfaction. The literature analysis showed that there is only one validated tool available in Turkish that examines both the personal and the professional values of nursing students. The aim of this study was to assess the reliability and validity of the Salford-Scott Nursing Values Questionnaire in Turkish. This study was a Turkish linguistic and cultural adaptation of a research tool. Participants and research context: the sample of this study consisted of 627 undergraduate nursing students from different geographical areas of Turkey. Two questionnaires were used for data collection: a socio-demographic form and the Salford-Scott Nursing Values Questionnaire. For the Salford-Scott Nursing Values Questionnaire, construct validity was examined using factor analyses. Ethical considerations: the study was approved by the Cumhuriyet University Faculty of Medicine Research Ethics Board. Students were informed that participation in the study was entirely voluntary and anonymous. The item content validity index ranged from 0.66 to 1.0, and the total content validity index was 0.94. The Kaiser-Meyer-Olkin measure of sampling adequacy was 0.870, and Bartlett's test of sphericity was statistically significant (χ² = 3108.714, p < 0.001). Construct validity was examined using factor analyses, and six factors were identified. Cronbach's alpha was used to assess internal consistency reliability, and a value of 0.834 was obtained. Our analyses showed that the Turkish version of the Salford-Scott Nursing Values Questionnaire has high validity and reliability.
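For reference, the internal-consistency statistic reported above can be computed directly from an item-response matrix, as in this sketch with simulated data; alpha = k/(k-1) * (1 - sum of item variances / variance of the total score).

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (respondents x items) response matrix."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

rng = np.random.default_rng(42)
latent = rng.normal(size=(627, 1))                            # common trait (simulated)
responses = latent + rng.normal(scale=1.0, size=(627, 10))    # 10 correlated items
print(f"alpha = {cronbach_alpha(responses):.3f}")
```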
NASA Astrophysics Data System (ADS)
Giardina, G.; Mandaglio, G.; Nasirov, A. K.; Anastasi, A.; Curciarello, F.; Fazio, G.
2018-02-01
Experimental and theoretical results for the P_CN fusion probability of reactants in the entrance channel and the W_sur survival probability against fission at deexcitation of the compound nucleus formed in heavy-ion collisions are discussed. The theoretical results for a set of nuclear reactions leading to the formation of compound nuclei (CNs) with charge number Z = 102-122 reveal a strong sensitivity of P_CN to the characteristics of the colliding nuclei in the entrance channel, the dynamics of the reaction mechanism, and the excitation energy of the system. We discuss the validity of assumptions and procedures for the analysis of experimental data, and also the limits of validity of theoretical results obtained by the use of phenomenological models. The comparison of results obtained in many investigated reactions reveals serious limits of validity of the data analysis and calculation procedures.
Hovering Dual-Spin Vehicle Groundwork for Bias Momentum Sizing Validation Experiment
NASA Technical Reports Server (NTRS)
Rothhaar, Paul M.; Moerder, Daniel D.; Lim, Kyong B.
2008-01-01
Angular bias momentum offers significant stability augmentation for hovering flight vehicles. The reliance of the vehicle on thrust vectoring for agility and disturbance rejection is greatly reduced with significant levels of stored angular momentum in the system. A methodical procedure for bias momentum sizing has been developed in previous studies. This current study provides groundwork for experimental validation of that method using an experimental vehicle called the Dual-Spin Test Device, a thrust-levitated platform. Using measured data the vehicle's thrust vectoring units are modeled and a gust environment is designed and characterized. Control design is discussed. Preliminary experimental results of the vehicle constrained to three rotational degrees of freedom are compared to simulation for a case containing no bias momentum to validate the simulation. A simulation of a bias momentum dominant case is presented.
WEC-SIM Phase 1 Validation Testing -- Numerical Modeling of Experiments: Preprint
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ruehl, Kelley; Michelen, Carlos; Bosma, Bret
2016-08-01
The Wave Energy Converter Simulator (WEC-Sim) is an open-source code jointly developed by Sandia National Laboratories and the National Renewable Energy Laboratory. It is used to model wave energy converters subjected to operational and extreme waves. In order for the WEC-Sim code to be beneficial to the wave energy community, code verification and physical model validation is necessary. This paper describes numerical modeling of the wave tank testing for the 1:33-scale experimental testing of the floating oscillating surge wave energy converter. The comparison between WEC-Sim and the Phase 1 experimental data set serves as code validation. This paper is a follow-up to the WEC-Sim paper on experimental testing, and describes the WEC-Sim numerical simulations for the floating oscillating surge wave energy converter.
Examining students' views about validity of experiments: From introductory to Ph.D. students
NASA Astrophysics Data System (ADS)
Hu, Dehui; Zwickl, Benjamin M.
2018-06-01
We investigated physics students' epistemological views on measurements and validity of experimental results. The roles of experiments in physics have been underemphasized in previous research on students' personal epistemology, and there is a need for a broader view of personal epistemology that incorporates experiments. An epistemological framework incorporating the structure, methodology, and validity of scientific knowledge guided the development of an open-ended survey. The survey was administered to students in algebra-based and calculus-based introductory physics courses, upper-division physics labs, and physics Ph.D. students. Within our sample, we identified several differences in students' ideas about validity and uncertainty in measurement. The majority of introductory students justified the validity of results through agreement with theory or with results from others. Alternatively, Ph.D. students frequently justified the validity of results based on the quality of the experimental process and repeatability of results. When asked about the role of uncertainty analysis, introductory students tended to focus on the representational roles (e.g., describing imperfections, data variability, and human mistakes). However, advanced students focused on the inferential roles of uncertainty analysis (e.g., quantifying reliability, making comparisons, and guiding refinements). The findings suggest that lab courses could emphasize a variety of approaches to establish validity, such as by valuing documentation of the experimental process when evaluating the quality of student work. In order to emphasize the role of uncertainty in an authentic way, labs could provide opportunities to iterate, make repeated comparisons, and make decisions based on those comparisons.
NASA Technical Reports Server (NTRS)
Geng, Tao; Paxson, Daniel E.; Zheng, Fei; Kuznetsov, Andrey V.; Roberts, William L.
2008-01-01
Pulsed combustion is receiving renewed interest as a potential route to higher performance in air breathing propulsion systems. Pulsejets offer a simple experimental device with which to study unsteady combustion phenomena and validate simulations. Previous computational fluid dynamic (CFD) simulation work focused primarily on the pulsejet combustion and exhaust processes. This paper describes a new inlet sub-model which simulates the fluidic and mechanical operation of a valved pulsejet head. The governing equations for this sub-model are described. Sub-model validation is provided through comparisons of simulated and experimentally measured reed valve motion, and time averaged inlet mass flow rate. The updated pulsejet simulation, with the inlet sub-model implemented, is validated through comparison with experimentally measured combustion chamber pressure, inlet mass flow rate, operational frequency, and thrust. Additionally, the simulated pulsejet exhaust flowfield, which is dominated by a starting vortex ring, is compared with particle imaging velocimetry (PIV) measurements on the bases of velocity, vorticity, and vortex location. The results show good agreement between simulated and experimental data. The inlet sub-model is shown to be critical for the successful modeling of pulsejet operation. This sub-model correctly predicts both the inlet mass flow rate and its phase relationship with the combustion chamber pressure. As a result, the predicted pulsejet thrust agrees very well with experimental data.
Experimental Validation of an Integrated Controls-Structures Design Methodology
NASA Technical Reports Server (NTRS)
Maghami, Peiman G.; Gupta, Sandeep; Elliot, Kenny B.; Walz, Joseph E.
1996-01-01
The first experimental validation of an integrated controls-structures design methodology for a class of large-order, flexible space structures is described. Integrated redesign of the controls-structures-interaction evolutionary model, a laboratory testbed at NASA Langley, was described earlier. The redesigned structure was fabricated, assembled in the laboratory, and experimentally tested against the original structure. Experimental results indicate that the structure redesigned using the integrated design methodology requires significantly less average control power than the nominal structure with control-optimized designs, while maintaining the required line-of-sight pointing performance. Thus, the superiority of the integrated design methodology over the conventional design approach is experimentally demonstrated. Furthermore, the amenability of the integrated design structure to other control strategies is evaluated, both analytically and experimentally. Using Linear-Quadratic-Gaussian optimal dissipative controllers, it is observed that the redesigned structure leads to significantly improved performance with alternate controllers as well.
Detection of overreported psychopathology with the MMPI-2-RF [corrected] validity scales.
Sellbom, Martin; Bagby, R Michael
2010-12-01
We examined the utility of the validity scales on the recently released Minnesota Multiphasic Personality Inventory-2 Restructured Form (MMPI-2 RF; Ben-Porath & Tellegen, 2008) to detect overreported psychopathology. This set of validity scales includes a newly developed scale and revised versions of the original MMPI-2 validity scales. We used an analogue, experimental simulation in which MMPI-2 RF responses (derived from archived MMPI-2 protocols) of undergraduate students instructed to overreport psychopathology (in either a coached or noncoached condition) were compared with those of psychiatric inpatients who completed the MMPI-2 under standardized instructions. The MMPI-2 RF validity scale Infrequent Psychopathology Responses best differentiated the simulation groups from the sample of patients, regardless of experimental condition. No other validity scale added consistent incremental predictive utility to Infrequent Psychopathology Responses in distinguishing the simulation groups from the sample of patients. Classification accuracy statistics confirmed the recommended cut scores in the MMPI-2 RF manual (Ben-Porath & Tellegen, 2008).
NASA Astrophysics Data System (ADS)
Reddy, Vanteru M.; Rahman, Mustafa M.; Gandi, Appala N.; Elbaz, Ayman M.; Schrecengost, Robert A.; Roberts, William L.
2016-01-01
Heavy fuel oil (HFO) as a fuel in industrial and power generation plants ensures the availability of energy at low cost. Coke and cenosphere emissions from HFO combustion need to be controlled by particulate control equipment such as electrostatic precipitators, and collection effectiveness is impacted by the properties of these particulates. Cenosphere formation is a function of HFO composition, which varies depending on the source of the HFO. Numerical modelling of the cenosphere formation mechanism presented in this paper is an economical method of characterising the cenosphere formation potential of an HFO, in comparison to experimental analysis of individual HFO samples, leading to better control and collection. In the present work, a novel numerical model is developed for understanding the global cenosphere formation mechanism. The critical diameter of the cenosphere is modelled based on the balance between two pressures developed in an HFO droplet. The first is the pressure (Pr_pf) developed at the interface of the liquid surface and the inner surface of the accumulated coke, due to the flow restriction of volatile components from the interior of the droplet. The second is the pressure due to the outer shell strength (Pr_C), gained from the van der Waals energy of the coke layers and the surface energy. In the present study it is considered that when Pr_C ≥ Pr_pf the outer shell starts to harden. The internal motion in the shell layer ceases and the outer diameter (D_S,Out) of the shell is then fixed. The entire process of cenosphere formation is analysed in three phases: regression, shell formation and hardening, and post shell hardening. Variations in pressures during shell formation are analysed. Shell (cenosphere) dimensions are evaluated at the completion of droplet evaporation. The rate of fuel evaporation, the rate of coke formation and the coke accumulation are analysed. The model predicts shell outer diameters of 650, 860 and 1040 µm and inner diameters of 360, 410 and 430 µm, respectively, for 700, 900 and 1100 µm HFO droplets. The present numerical model is validated with experimental results available from the literature; the total deviation between computational and experimental results is in the range of 3-7%.
Fault-tolerant clock synchronization validation methodology. [in computer systems
NASA Technical Reports Server (NTRS)
Butler, Ricky W.; Palumbo, Daniel L.; Johnson, Sally C.
1987-01-01
A validation method for the synchronization subsystem of a fault-tolerant computer system is presented. The high reliability requirement of flight-crucial systems precludes the use of most traditional validation methods. The method presented utilizes formal design proof to uncover design and coding errors and experimentation to validate the assumptions of the design proof. The experimental method is described and illustrated by validating the clock synchronization system of the Software Implemented Fault Tolerance computer. The design proof of the algorithm includes a theorem that defines the maximum skew between any two nonfaulty clocks in the system in terms of specific system parameters. Most of these parameters are deterministic. One crucial parameter is the upper bound on the clock read error, which is stochastic. The probability that this upper bound is exceeded is calculated from data obtained by the measurement of system parameters. This probability is then included in a detailed reliability analysis of the system.
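The final step described above, turning measured samples into a probability that the read-error bound is exceeded, can be sketched as follows; the error distribution and bound are simulated, and the rule-of-three upper bound used when zero exceedances are observed is a standard conservative estimate, not necessarily the paper's exact procedure.

```python
import numpy as np

rng = np.random.default_rng(7)
read_errors = np.abs(rng.normal(scale=2.0, size=20000))   # measured clock read errors [us] (simulated)
bound = 10.0                                              # design proof's assumed upper bound [us]

exceed = np.count_nonzero(read_errors > bound)
n = read_errors.size
# Point estimate, or the "rule of three" 95% upper bound when nothing exceeds the bound
p_hat = exceed / n if exceed else 3.0 / n
print(f"{exceed} exceedances in {n} samples -> P(error > bound) <= {p_hat:.2e}")
```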
Richardson, Michelle; Katsakou, Christina; Torres-González, Francisco; Onchev, George; Kallert, Thomas; Priebe, Stefan
2011-06-30
Patients' views of inpatient care need to be assessed for research and routine evaluation. For this a valid instrument is required. The Client Assessment of Treatment Scale (CAT) has been used in large scale international studies, but its psychometric properties have not been well established. The structural validity of the CAT was tested among involuntary inpatients with psychosis. Data from locations in three separate European countries (England, Spain and Bulgaria) were collected. The factorial validity was initially tested using single sample confirmatory factor analyses in each country. Subsequent multi-sample analyses were used to test for invariance of the factor loadings, and factor variances across the countries. Results provide good initial support for the factorial validity and invariance of the CAT scores. Future research is needed to cross-validate these findings and to generalise them to other countries, treatment settings, and patient populations. Copyright © 2011 Elsevier Ltd. All rights reserved.
Validation of a quality-of-life instrument for patients with nonmelanoma skin cancer.
Rhee, John S; Matthews, B Alex; Neuburg, Marcy; Logan, Brent R; Burzynski, Mary; Nattinger, Ann B
2006-01-01
To validate a disease-specific quality-of-life instrument--the Skin Cancer Index--intended to measure quality-of-life issues relevant to patients with nonmelanoma skin cancer. Internal reliability, convergent and divergent validity with existing scales, and factor analyses were performed in a cross-sectional study of 211 patients presenting with cervicofacial nonmelanoma skin cancer to a dermatologic surgery clinic. Factor analyses of the Skin Cancer Index confirmed a multidimensional scale with 3 distinct subscales: emotional, social, and appearance. Excellent internal validity of the 3 subscales was demonstrated. Substantial evidence was observed for convergent validity with the Dermatology Life Quality Index, Rosenberg Self-Esteem Scale, Lerman's Cancer Worry Scale, and Medical Outcomes Survey Short-Form 12 domains for vitality, emotion, social function, and mental health. These findings validate a new disease-specific quality-of-life instrument for patients with cervicofacial nonmelanoma skin cancer. Studies on the responsiveness of the Skin Cancer Index to clinical intervention are currently under way.
ERIC Educational Resources Information Center
Cory, Charles H.
This report presents data concerning the validity of a set of experimental computerized and paper-and-pencil tests for measures of on-job performance on global and job elements. It reports on the usefulness of 30 experimental and operational variables for predicting marks on 42 job elements and on a global criterion for Electrician's Mate,…
Viscoelasticity of Axisymmetric Composite Structures: Analysis and Experimental Validation
2013-02-01
[Abstract fragments:] ... compressive stress at the interface between the composite and steel prior to the sheath's cut-off; accordingly, the viscoelastic analysis is used ... The hoop-stress profile shows the steel region is in compression, resulting from the winding tension of the composite overwrap ... mechanical and thermal loads. Experimental validation of the model is conducted using a high-tensioned composite overwrapped on a steel cylinder. The creep ...
Zimmerman, C.E.
2005-01-01
Analysis of otolith strontium (Sr) or strontium-to-calcium (Sr:Ca) ratios provides a powerful tool to reconstruct the chronology of migration among salinity environments for diadromous salmonids. Although use of this method has been validated by examination of known individuals and translocation experiments, it has never been validated under controlled experimental conditions. In this study, incorporation of otolith Sr was tested across a range of salinities and resulting levels of ambient Sr and Ca concentrations in juvenile chinook salmon (Oncorhynchus tshawytscha), coho salmon (Oncorhynchus kisutch), sockeye salmon (Oncorhynchus nerka), rainbow trout (Oncorhynchus mykiss), and Arctic char (Salvelinus alpinus). Experimental water was mixed, using stream water and seawater as end members, to create experimental salinities of 0.1, 6.3, 12.7, 18.6, 25.5, and 33.0 psu. Otolith Sr and Sr:Ca ratios were significantly related to salinity for all species (r2 range: 0.80-0.91) but provide only enough predictive resolution to discriminate among freshwater, brackish-water, and saltwater residency. These results validate the use of otolith Sr:Ca ratios to broadly discriminate salinity histories encountered by salmonids but highlight the need for further research concerning the influence of osmoregulation and the physiological changes associated with smolting on otolith microchemistry.
Cern, Ahuva; Barenholz, Yechezkel; Tropsha, Alexander; Goldblum, Amiram
2014-01-10
Previously we have developed and statistically validated Quantitative Structure Property Relationship (QSPR) models that correlate drugs' structural, physical and chemical properties as well as experimental conditions with the relative efficiency of remote loading of drugs into liposomes (Cern et al., J. Control. Release 160 (2012) 147-157). Herein, these models have been used to virtually screen a large drug database to identify novel candidate molecules for liposomal drug delivery. Computational hits were considered for experimental validation based on their predicted remote loading efficiency as well as additional considerations such as availability, recommended dose and relevance to the disease. Three compounds were selected for experimental testing which were confirmed to be correctly classified by our previously reported QSPR models developed with Iterative Stochastic Elimination (ISE) and k-Nearest Neighbors (kNN) approaches. In addition, 10 new molecules with known liposome remote loading efficiency that were not used by us in QSPR model development were identified in the published literature and employed as an additional model validation set. The external accuracy of the models was found to be as high as 82% or 92%, depending on the model. This study presents the first successful application of QSPR models for the computer-model-driven design of liposomal drugs. © 2013.
Hegazy, Maha A; Lotfy, Hayam M; Mowaka, Shereen; Mohamed, Ekram Hany
2016-07-05
Wavelets have been adapted for a vast number of signal-processing applications due to the amount of information that can be extracted from a signal. In this work, a comparative study was conducted on the efficiency of the continuous wavelet transform (CWT) as a signal-processing tool in univariate regression and as a pre-processing tool in multivariate analysis using partial least squares (CWT-PLS). These were applied to complex spectral signals of ternary and quaternary mixtures. The CWT-PLS method succeeded in the simultaneous determination of a quaternary mixture of drotaverine (DRO), caffeine (CAF), paracetamol (PAR) and p-aminophenol (PAP, the major impurity of paracetamol). The univariate CWT, in contrast, failed to determine the quaternary mixture components simultaneously; it was able to determine only PAR and PAP, and the ternary mixtures of DRO, CAF and PAR, and of CAF, PAR and PAP. During the CWT calculations, different wavelet families were tested. The univariate CWT method was validated according to the ICH guidelines. For the development of the CWT-PLS model, a calibration set was prepared by means of an orthogonal experimental design, and the absorption spectra were recorded and processed by CWT. The CWT-PLS model was constructed by regression between the wavelet coefficients and the concentration matrices, and validation was performed by both cross-validation and external validation sets. Both methods were successfully applied to the determination of the studied drugs in pharmaceutical formulations. Copyright © 2016 Elsevier B.V. All rights reserved.
Berger, Martin D; Stintzing, Sebastian; Heinemann, Volker; Cao, Shu; Yang, Dongyun; Sunakawa, Yu; Matsusaka, Satoshi; Ning, Yan; Okazaki, Satoshi; Miyamoto, Yuji; Suenaga, Mitsukuni; Schirripa, Marta; Hanna, Diana L; Soni, Shivani; Puccini, Alberto; Zhang, Wu; Cremolini, Chiara; Falcone, Alfredo; Loupakis, Fotios; Lenz, Heinz-Josef
2018-02-15
Purpose: Vitamin D exerts its inhibitory influence on colon cancer growth by inhibiting Wnt signaling and angiogenesis. We hypothesized that SNPs in genes involved in vitamin D transport, metabolism, and signaling are associated with outcome in metastatic colorectal cancer (mCRC) patients treated with first-line FOLFIRI and bevacizumab. Experimental Design: 522 mCRC patients enrolled in the FIRE-3 (discovery cohort) and TRIBE (validation set) trials treated with FOLFIRI/bevacizumab were included in this study. 278 patients receiving FOLFIRI and cetuximab (FIRE-3) served as a control cohort. Six SNPs in 6 genes (GC, CYP24A1, CYP27B1, VDR, DKK1, CST5) were analyzed. Results: In the discovery cohort, AA carriers of the GC rs4588 SNP, encoding the vitamin D-binding protein, treated with FOLFIRI/bevacizumab had a shorter overall survival (OS) than those harboring any C allele (15.9 vs. 25.1 months) in both univariable (P = 0.001) and multivariable analyses (P = 0.047). This association was confirmed in the validation cohort in multivariable analysis (OS 18.1 vs. 26.2 months, HR, 1.83; P = 0.037). Interestingly, AA carriers in the control set exhibited a longer OS (48.0 vs. 25.2 months, HR, 0.50; P = 0.021). This association was further confirmed in a second validation cohort comprising refractory mCRC patients treated with cetuximab ± irinotecan (PFS 8.7 vs. 3.7 months) in univariable (P = 0.033) and multivariable analyses (P = 0.046). Conclusions: GC rs4588 SNP might serve as a predictive marker in mCRC patients treated with FOLFIRI/bevacizumab or FOLFIRI/cetuximab. Whereas AA carriers derive a survival benefit with FOLFIRI/cetuximab, treatment with FOLFIRI/bevacizumab is associated with a worse outcome. Clin Cancer Res; 24(4); 784-93. ©2017 AACR.
The Validity of Selection and Classification Procedures for Predicting Job Performance.
1987-04-01
[OCR-damaged front matter; recoverable table-of-contents entries: I. Alternative Selection Procedures (p. 56); J. Meta-Analyses of Validities (p. 58); K. Meta-Analytic Comparisons of ...; recoverable abbreviations: Aptitude Test Battery; GM General Maintenance; GS General Science; GVN Cognitive Ability; HS&T Health, Social and Technology; K Motor Coordination; KFM ...]
A Surrogate Approach to the Experimental Optimization of Multielement Airfoils
NASA Technical Reports Server (NTRS)
Otto, John C.; Landman, Drew; Patera, Anthony T.
1996-01-01
The incorporation of experimental test data into the optimization process is accomplished through the use of Bayesian-validated surrogates. In the surrogate approach, a surrogate for the experiment (e.g., a response surface) serves in the optimization process. The validation step of the framework provides a qualitative assessment of the surrogate quality and bounds the surrogate-for-experiment error on designs "near" surrogate-predicted optimal designs. The utility of the framework is demonstrated through its application to the experimental selection of the trailing edge flap position to achieve a design lift coefficient for a three-element airfoil.
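A minimal sketch of this surrogate idea: fit an inexpensive response surface to a few experimental points, then search the surrogate for the flap setting that meets a target lift coefficient. The data, the quadratic surrogate form and the target value are invented, and the Bayesian validation step is omitted.

```python
import numpy as np
from scipy.optimize import minimize_scalar

flap_pos = np.array([0.0, 0.5, 1.0, 1.5, 2.0, 2.5])       # flap setting (arbitrary units, invented)
cl_meas = np.array([1.80, 2.05, 2.21, 2.26, 2.18, 2.02])  # measured lift coefficients (invented)
cl_target = 2.15                                          # design lift coefficient (assumed)

# Quadratic response surface serving as the surrogate for the experiment
surrogate = np.poly1d(np.polyfit(flap_pos, cl_meas, deg=2))

res = minimize_scalar(lambda x: (surrogate(x) - cl_target) ** 2,
                      bounds=(0.0, 2.5), method="bounded")
print(f"flap setting {res.x:.2f} -> surrogate CL = {surrogate(res.x):.3f}")
```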
The Use of Virtual Reality in the Study of People's Responses to Violent Incidents.
Rovira, Aitor; Swapp, David; Spanlang, Bernhard; Slater, Mel
2009-01-01
This paper reviews experimental methods for the study of the responses of people to violence in digital media, and in particular considers the issues of internal validity and ecological validity or generalisability of results to events in the real world. Experimental methods typically involve a significant level of abstraction from reality, with participants required to carry out tasks that are far removed from violence in real life, and hence their ecological validity is questionable. On the other hand studies based on field data, while having ecological validity, cannot control multiple confounding variables that may have an impact on observed results, so that their internal validity is questionable. It is argued that immersive virtual reality may provide a unification of these two approaches. Since people tend to respond realistically to situations and events that occur in virtual reality, and since virtual reality simulations can be completely controlled for experimental purposes, studies of responses to violence within virtual reality are likely to have both ecological and internal validity. This depends on a property that we call 'plausibility' - including the fidelity of the depicted situation with prior knowledge and expectations. We illustrate this with data from a previously published experiment, a virtual reprise of Stanley Milgram's 1960s obedience experiment, and also with pilot data from a new study being developed that looks at bystander responses to violent incidents.
Validation of Magnetic Resonance Thermometry by Computational Fluid Dynamics
NASA Astrophysics Data System (ADS)
Rydquist, Grant; Owkes, Mark; Verhulst, Claire M.; Benson, Michael J.; Vanpoppel, Bret P.; Burton, Sascha; Eaton, John K.; Elkins, Christopher P.
2016-11-01
Magnetic Resonance Thermometry (MRT) is a new experimental technique that can create fully three-dimensional temperature fields in a noninvasive manner. However, validation is still required to determine the accuracy of measured results. One method of examination is to compare data gathered experimentally to data computed with computational fluid dynamics (CFD). In this study, large-eddy simulations have been performed with the NGA computational platform to generate data for a comparison with previously run MRT experiments. The experimental setup consisted of a heated jet inclined at 30° injected into a larger channel. In the simulations, viscosity and density were scaled according to the local temperature to account for differences in buoyant and viscous forces. A mesh-independence study was performed with 5 million-, 15 million-, and 45 million-cell meshes. The program Star-CCM+ was also used to simulate the complete experimental geometry, and its results were compared with the data generated from NGA. Overall, both programs show good agreement with the experimental data gathered with MRT. With these data, the validity of MRT as a diagnostic tool has been shown, and the tool can be used to further our understanding of a range of flows with non-trivial temperature distributions.
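A comparison like the one described, between an MRT-measured and a CFD-computed temperature field on a common grid, reduces to a field-difference metric. The sketch below is illustrative only: the metric, grid sizes, and synthetic fields are assumptions, and the study's actual comparison procedure may differ.

```python
import numpy as np

def temperature_field_error(t_cfd, t_mrt, t_cool, t_jet):
    """Normalized RMS difference between simulated and measured 3-D
    temperature fields on a common grid (an illustrative metric; the
    study's actual comparison procedure may differ)."""
    diff = (t_cfd - t_mrt) / (t_jet - t_cool)  # normalize by the temperature span
    return np.sqrt(np.mean(diff ** 2))

# Synthetic stand-in fields; in practice the LES solution would be
# interpolated onto the MRT voxel grid before differencing.
rng = np.random.default_rng(0)
t_mrt = 300.0 + 40.0 * rng.random((64, 64, 32))    # "measured" field, K
t_cfd = t_mrt + rng.normal(0.0, 1.5, t_mrt.shape)  # ~1.5 K discrepancy
print(f"normalized RMS error: {temperature_field_error(t_cfd, t_mrt, 300.0, 340.0):.3f}")
```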
The Question of Education Science: "Experiment"ism Versus "Experimental"ism
ERIC Educational Resources Information Center
Howe, Kenneth R.
2005-01-01
The ascendant view in the current debate about education science -- experimentism -- is a reassertion of the randomized experiment as the methodological gold standard. Advocates of this view have ignored, not answered, long-standing criticisms of the randomized experiment: its frequent impracticality, its lack of external validity, its confinement…
Internal Validity: A Must in Research Designs
ERIC Educational Resources Information Center
Cahit, Kaya
2015-01-01
In experimental research, internal validity refers to the extent to which researchers can conclude that changes in the dependent variable (i.e., the outcome) are caused by manipulations of the independent variable. The causal inference permits researchers to meaningfully interpret research results. This article discusses (a) internal validity threats in social and…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Freed, Melanie; Miller, Stuart; Tang, Katherine
Purpose: MANTIS is a Monte Carlo code developed for the detailed simulation of columnar CsI scintillator screens in x-ray imaging systems. Validation of this code is needed to provide a reliable and valuable tool for system optimization and accurate reconstructions for a variety of x-ray applications. Whereas previous validation efforts have focused on matching of summary statistics, in this work the authors examine the complete point response function (PRF) of the detector system in addition to relative light output values. Methods: Relative light output values and high-resolution PRFs have been experimentally measured with a custom setup. A corresponding set of simulated light output values and PRFs have also been produced, where detailed knowledge of the experimental setup and CsI:Tl screen structures are accounted for in the simulations. Four different screens were investigated with different thicknesses, column tilt angles, and substrate types. A quantitative comparison between the experimental and simulated PRFs was performed for four different incidence angles (0 deg., 15 deg., 30 deg., and 45 deg.) and two different x-ray spectra (40 and 70 kVp). The figure of merit (FOM) used measures the normalized differences between the simulated and experimental data averaged over a region of interest. Results: Experimental relative light output values ranged from 1.456 to 1.650 and were in approximate agreement for aluminum substrates, but poor agreement for graphite substrates. The FOMs for all screen types, incidence angles, and energies ranged from 0.1929 to 0.4775. To put these FOMs in context, the same FOM was computed for 2D symmetric Gaussians fit to the same experimental data. These FOMs ranged from 0.2068 to 0.8029. Our analysis demonstrates that MANTIS reproduces experimental PRFs with higher accuracy than a symmetric 2D Gaussian fit to the experimental data in the majority of cases. Examination of the spatial distribution of differences between the PRFs shows that the main reason for errors between MANTIS and the experimental data is that MANTIS-generated PRFs are sharper than the experimental PRFs. Conclusions: The experimental validation of MANTIS performed in this study demonstrates that MANTIS is able to reliably predict experimental PRFs, especially for thinner screens, and can reproduce the highly asymmetric shape seen in the experimental data. As a result, optimizations and reconstructions carried out using MANTIS should yield results indicative of actual detector performance. Better characterization of screen properties is necessary to reconcile the simulated light output values with experimental data.
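The comparison machinery described, a normalized-difference FOM over a region of interest plus a symmetric 2D Gaussian reference fit, can be sketched as follows. This is one plausible reading of the metric: the exact normalization of the published FOM is not specified here, and a synthetic PRF stands in for measured data.

```python
import numpy as np
from scipy.optimize import curve_fit

def fom(simulated, measured):
    """Normalized difference between two point-response functions averaged
    over a region of interest (an assumed form, not the paper's exact FOM)."""
    norm_s = simulated / simulated.sum()
    norm_m = measured / measured.sum()
    return np.mean(np.abs(norm_s - norm_m)) / norm_m.mean()

def gauss2d(xy, amp, x0, y0, sigma):
    """Symmetric 2-D Gaussian, the reference shape the authors compare against."""
    x, y = xy
    return amp * np.exp(-((x - x0) ** 2 + (y - y0) ** 2) / (2 * sigma ** 2))

# Synthetic "experimental" PRF: a slightly asymmetric blob stands in for data.
y, x = np.mgrid[0:64, 0:64]
prf = np.exp(-(((x - 32) / 6.0) ** 2 + ((y - 30) / 5.0) ** 2))

# Fit the symmetric Gaussian and score it with the FOM, as the paper does
# to put the MANTIS FOMs in context.
popt, _ = curve_fit(gauss2d, (x.ravel(), y.ravel()), prf.ravel(),
                    p0=(1.0, 32.0, 32.0, 5.0))
gauss_fit = gauss2d((x, y), *popt)
print(f"FOM of Gaussian fit vs 'experiment': {fom(gauss_fit, prf):.4f}")
```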
Validity of the Microcomputer Evaluation Screening and Assessment Aptitude Scores.
ERIC Educational Resources Information Center
Janikowski, Timothy P.; And Others
1991-01-01
Examined validity of Microcomputer Evaluation Screening and Assessment (MESA) aptitude scores relative to General Aptitude Test Battery (GATB) using multitrait-multimethod correlational analyses. Findings from 54 rehabilitation clients and 29 displaced workers revealed no evidence to support the construct validity of the MESA. (Author/NB)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jernigan, Dann A.; Blanchat, Thomas K.
For validation of the SIERRA/FUEGO/SYRINX fire and SIERRA/CALORE codes, it is necessary to improve understanding and to develop temporally and spatially resolved, integral-scale validation data of the heat flux incident to a complex object, in addition to measuring the thermal response of that object within the fire plume. To meet this objective, a complex calorimeter with sufficient instrumentation to allow validation of the coupling between FUEGO/SYRINX/CALORE has been designed, fabricated, and tested in the Fire Laboratory for Accreditation of Models and Experiments (FLAME) facility. Validation experiments are specifically designed for direct comparison with the computational predictions. Making meaningful comparisons between the computational and experimental results requires careful characterization and control of the experimental features or parameters used as inputs into the computational model. Validation experiments must be designed to capture the essential physical phenomena, including all relevant initial and boundary conditions. This report presents the data validation steps and processes, the results of the penlight radiant heat experiments (for the purpose of validating the CALORE heat transfer modeling of the complex calorimeter), and the results of the fire tests in FLAME.
Evaluation and cross-validation of Environmental Models
NASA Astrophysics Data System (ADS)
Lemaire, Joseph
Before scientific models (statistical or empirical models based on experimental measurements; physical or mathematical models) can be proposed and selected as ISO Environmental Standards, a Commission of professional experts appointed by an established International Union or Association (e.g. IAGA for Geomagnetism and Aeronomy) should have been able to study, document, evaluate and validate the best alternative models available at a given epoch. Examples will be given indicating that different values for the Earth radius have been employed in different data-processing laboratories, institutes or agencies to process, analyse or retrieve series of experimental observations. Furthermore, invariant magnetic coordinates like B and L, commonly used in the study of Earth's radiation belt fluxes and for their mapping, differ from one space-mission data center to another, from team to team, and from country to country. Worse, users of empirical models generally fail to use the original magnetic model that had been employed to compile B and L, and thus to build these environmental models. These are just some flagrant examples of inconsistencies and misuses identified so far; there are probably more to be uncovered by careful, independent examination and benchmarking. The metre prototype, the standard unit of length, was adopted on 20 May 1875 during the Diplomatic Conference of the Metre and deposited at the BIPM (Bureau International des Poids et Mesures). In the same spirit, to coordinate and safeguard progress in the field of Space Weather, similar initiatives need to be undertaken to prevent the wild, uncontrolled dissemination of pseudo Environmental Models and Standards. Indeed, unless validation tests have been performed, there is no guarantee, a priori, that all models on the market have been built consistently with the same system of units, or that they are based on identical definitions of the coordinate systems, etc. Therefore, preliminary analyses should be carried out under the control and authority of an established international professional Organization or Association before any final political decision is made by ISO to select a specific Environmental Model, such as IGRF or DGRF. Of course, Commissions responsible for checking the consistency of definitions, methods and algorithms for data processing might consider delegating specific tasks (e.g. benchmarking the technical tools, the calibration procedures, the methods of data analysis, and the software algorithms employed in building the different types of models, as well as their usage) to private, intergovernmental or international organizations/agencies (e.g. NASA, ESA, AGU, EGU, COSPAR); eventually, the latter should report conclusions to the Commission members appointed by IAGA or any established authority like IUGG.
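The B/L inconsistency the abstract describes is easy to reproduce: even in the pure-dipole approximation, the derived L-shell shifts when two teams assume different Earth radii or dipole moments. A minimal sketch under assumed constants (the two radii below are common conventions; the dipole moment is approximate):

```python
import math

B0_NT = 31_100.0   # equatorial dipole field at one Earth radius, nT (approximate)

def dipole_L(r_km, mag_lat_deg, re_km=6371.2):
    """McIlwain L for a pure dipole: r = L * RE * cos^2(lambda)."""
    lam = math.radians(mag_lat_deg)
    return (r_km / re_km) / math.cos(lam) ** 2

def dipole_B(r_km, mag_lat_deg, re_km=6371.2):
    """Dipole field magnitude (nT) at radius r and magnetic latitude lambda."""
    lam = math.radians(mag_lat_deg)
    return B0_NT * (re_km / r_km) ** 3 * math.sqrt(1.0 + 3.0 * math.sin(lam) ** 2)

# The same point in space yields different L values under two common
# Earth-radius conventions: the kind of inconsistency the abstract flags.
for re_km in (6371.2, 6378.137):
    print(f"RE = {re_km} km -> L = {dipole_L(20000.0, 30.0, re_km):.4f}")
```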
Achieving external validity in home advantage research: generalizing crowd noise effects
Myers, Tony D.
2014-01-01
Different factors have been postulated to explain the home advantage phenomenon in sport. One plausible explanation investigated has been the influence of a partisan home crowd on sports officials' decisions. Different types of studies have tested the crowd influence hypothesis, including purposefully designed experiments. However, while experimental studies investigating crowd influences have high levels of internal validity, they suffer from a lack of external validity; decision-making in a laboratory setting bears little resemblance to decision-making in live sports settings. This focused review initially considers threats to external validity in applied and theoretical experimental research, and discusses how such threats can be addressed through representative design, focusing on a recently published study that arguably provides the first experimental evidence of the impact of live crowd noise on officials in sport. The findings of this controlled experiment, conducted in a real tournament setting, offer a level of confirmation of the findings of laboratory studies in the area. Finally, directions for future research and the future conduct of crowd noise studies are discussed. PMID:24917839
Functional Validation and Comparison Framework for EIT Lung Imaging
Meybohm, Patrick; Weiler, Norbert; Frerichs, Inéz; Adler, Andy
2014-01-01
Introduction: Electrical impedance tomography (EIT) is an emerging clinical tool for monitoring ventilation distribution in mechanically ventilated patients, for which many image reconstruction algorithms have been suggested. We propose an experimental framework to assess such algorithms with respect to their ability to correctly represent well-defined physiological changes. We defined a set of clinically relevant ventilation conditions and induced them experimentally in 8 pigs by controlling three ventilator settings (tidal volume, positive end-expiratory pressure, and the fraction of inspired oxygen). In this way, large and discrete shifts in global and regional lung air content were elicited. Methods: We use the framework to compare twelve 2D EIT reconstruction algorithms, including backprojection (the original and still most frequently used algorithm), GREIT (a more recent consensus algorithm for lung imaging), truncated singular value decomposition (TSVD), several variants of the one-step Gauss-Newton approach, and two iterative algorithms. We consider the effects of using a 3D finite element model, assuming non-uniform background conductivity, noise modeling, reconstructing for electrode movement, total variation (TV) reconstruction, robust error norms, smoothing priors, and using difference vs. normalized difference data. Results and Conclusions: Our results indicate that, while variation in appearance of images reconstructed from the same data is not negligible, clinically relevant parameters do not vary considerably among the advanced algorithms. Among the analysed algorithms, several advanced algorithms perform well, while some others are significantly worse. Given its vintage and ad hoc formulation, backprojection works surprisingly well, supporting the validity of previous studies in lung EIT. PMID:25110887
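Of the algorithm families compared, the one-step Gauss-Newton reconstruction admits a particularly compact statement: the conductivity-change image solves a regularized normal equation. The sketch below is a generic zeroth-order Tikhonov variant with a random stand-in Jacobian, not any of the paper's specific implementations (real priors such as NOSER or GREIT's training-based approach differ).

```python
import numpy as np

def gauss_newton_difference(J, dv, lam=0.05, R=None):
    """One-step regularized Gauss-Newton difference reconstruction:
    x = (J^T J + lam^2 R^T R)^(-1) J^T dv.  Generic Tikhonov sketch."""
    if R is None:
        R = np.eye(J.shape[1])                  # identity prior (assumed)
    A = J.T @ J + lam ** 2 * (R.T @ R)
    return np.linalg.solve(A, J.T @ dv)

# Stand-in sensitivity matrix (n_measurements x n_pixels); in a real system
# J comes from a finite-element forward model of the thorax.
rng = np.random.default_rng(1)
J = rng.normal(size=(208, 256))                 # e.g. 16 electrodes, 16x16 image
x_true = np.zeros(256)
x_true[100:120] = 1.0                           # a local conductivity change
dv = J @ x_true + rng.normal(0.0, 0.01, 208)    # noisy difference voltages
x_hat = gauss_newton_difference(J, dv)
print(f"reconstruction correlation: {np.corrcoef(x_hat, x_true)[0, 1]:.3f}")
```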
NASA Astrophysics Data System (ADS)
Petrie, Christian M.; Koyanagi, Takaaki; McDuffee, Joel L.; Deck, Christian P.; Katoh, Yutai; Terrani, Kurt A.
2017-08-01
The purpose of this work is to design an irradiation vehicle for testing silicon carbide (SiC) fiber-reinforced SiC matrix composite cladding materials under conditions representative of a light water reactor, in order to validate thermo-mechanical models of stress states in these materials due to irradiation swelling and differential thermal expansion. The design allows for a constant tube outer surface temperature in the range of 300-350 °C under a representative high heat flux (~0.66 MW/m²) during one cycle of irradiation in an un-instrumented "rabbit" capsule in the High Flux Isotope Reactor. An engineered aluminum foil was developed to absorb the expansion of the cladding tubes, due to irradiation swelling, without changing the thermal resistance of the gap between the cladding and irradiation capsule. Finite-element analyses of the capsule were performed, and the models used to calculate thermal contact resistance were validated by out-of-pile testing and post-irradiation examination of the foils and passive SiC thermometry. Six cladding tubes (both monoliths and composites) were irradiated and subsequently disassembled in a hot cell. The calculated temperatures of passive SiC thermometry inside the capsules showed good agreement with temperatures measured post-irradiation, with two calculated temperatures falling within 10 °C of experimental measurements. The success of this design could lead to new opportunities for irradiation applications with materials that suffer from irradiation swelling, creep, or other dimensional changes that can affect the specimen temperature during irradiation.
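The thermal design logic, a fixed heat flux driving a temperature drop across an engineered gap, can be illustrated with a one-dimensional radial conduction estimate. Every number below is an assumed, illustrative value chosen only to land near the 300-350 °C band; none are the actual design parameters of the cited capsule. The point of the engineered foil is visible in the formula: if the gap's effective resistance stays constant as the cladding swells, the surface temperature stays pinned.

```python
import math

# 1-D radial heat-flow estimate for a "rabbit"-style capsule: heat leaving
# the cladding surface crosses an annular gap (gas plus foil) to the wall.
q_flux = 0.66e6     # W/m^2, heat flux at the cladding outer surface
r_clad = 5.0e-3     # m, assumed cladding outer radius
r_caps = 5.2e-3     # m, assumed capsule inner radius
k_gap = 0.48        # W/(m K), assumed effective gap + foil conductivity
t_caps = 60.0       # deg C, assumed capsule inner-wall temperature

q_line = q_flux * 2.0 * math.pi * r_clad                        # W per meter of tube
dT_gap = q_line * math.log(r_caps / r_clad) / (2.0 * math.pi * k_gap)
print(f"cladding surface temperature ~ {t_caps + dT_gap:.0f} deg C")  # ~330 deg C
```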
Excore Modeling with VERAShift
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pandya, Tara M.; Evans, Thomas M.
It is important to be able to accurately predict the neutron flux outside the immediate reactor core for a variety of safety and material analyses. Monte Carlo radiation transport calculations are required to produce the high-fidelity excore responses. Under this milestone, VERA (specifically the VERAShift package) has been extended to perform excore calculations by running radiation transport calculations with Shift. This package couples VERA-CS with Shift to perform excore tallies for multiple state points concurrently, with each component capable of parallel execution on independent domains. Specifically, this package performs fluence calculations in the core barrel and vessel, or performs the requested tallies in any user-defined excore regions. VERAShift takes advantage of the general geometry package in Shift. This gives VERAShift the flexibility to explicitly model features outside the core barrel, including detailed vessel models, detectors, and power plant details. A very limited set of experimental and numerical benchmarks is available for excore simulation comparison. The Consortium for Advanced Simulation of Light Water Reactors (CASL) has developed a set of excore benchmark problems to include as part of the VERA-CS verification and validation (V&V) problems. The excore capability in VERAShift has been tested on small representative assembly problems, multi-assembly problems, and quarter-core problems. VERAView has also been extended to visualize the vessel fluence results from VERAShift. Preliminary vessel fluence results for quarter-core multistate calculations look very promising. Further development is needed to determine the details relevant to excore simulations. Validation of VERA for fluence and excore detectors still needs to be performed against experimental and numerical results.
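The excore tally concept can be caricatured in one dimension: particles born in a core region stream outward through a shield and deposit track length in a user-defined excore cell. The toy Monte Carlo below is only a sketch of the tally idea; the constant cross-section, first-collision absorption, and all numerical values are assumptions, whereas Shift performs full 3-D continuous-energy transport on general geometry.

```python
import numpy as np

rng = np.random.default_rng(42)
SIGMA_T = 0.25            # 1/cm, assumed total cross-section of the shield
A_EX, B_EX = 30.0, 35.0   # cm, a user-defined excore tally region

n = 1_000_000
x0 = rng.uniform(0.0, 10.0, n)              # birth sites inside the "core" slab
mu = rng.uniform(1e-6, 1.0, n)              # outward direction cosines
dist = rng.exponential(1.0 / SIGMA_T, n)    # distance to first collision
x1 = x0 + mu * dist                         # crude: absorb at first collision

# Track-length estimator: chord length each history lays down in the cell.
overlap = np.clip(np.minimum(x1, B_EX) - np.maximum(x0, A_EX), 0.0, None)
track = overlap / mu                        # path length = delta-x / mu
flux = track.sum() / (n * (B_EX - A_EX))    # scalar flux per source particle
print(f"excore scalar flux estimate: {flux:.3e} per source particle")
```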