Sample records for jannus experimental validation

  1. Experimental validation of structural optimization methods

    NASA Technical Reports Server (NTRS)

    Adelman, Howard M.

    1992-01-01

    The topic of validating structural optimization methods by use of experimental results is addressed. The need to validate the methods as a way of effecting a greater and accelerated acceptance of formal optimization methods by practicing engineering designers is described. The range of validation strategies is defined, which includes comparison of optimization results with more traditional design approaches, establishing the accuracy of the analyses used, and finally experimental validation of the optimization results. Examples of the use of experimental results to validate optimization techniques are described. The examples include experimental validation of the following: optimum design of a trussed beam; combined control-structure design of a cable-supported beam simulating an actively controlled space structure; minimum weight design of a beam with frequency constraints; minimization of the vibration response of a helicopter rotor blade; minimum weight design of a turbine blade disk; aeroelastic optimization of an aircraft vertical fin; airfoil shape optimization for drag minimization; optimization of the shape of a hole in a plate for stress minimization; optimization to minimize beam dynamic response; and structural optimization of a low-vibration helicopter rotor.

  2. A Comprehensive Validation Methodology for Sparse Experimental Data

    NASA Technical Reports Server (NTRS)

    Norman, Ryan B.; Blattnig, Steve R.

    2010-01-01

    A comprehensive program of verification and validation has been undertaken to assess the applicability of models to space radiation shielding applications and to track progress as models are developed over time. The models are placed under configuration control, and automated validation tests are used so that comparisons can readily be made as models are improved. Though direct comparisons between theoretical results and experimental data are desired for validation purposes, such comparisons are not always possible due to lack of data. In this work, two uncertainty metrics are introduced that are suitable for validating theoretical models against sparse experimental databases. The nuclear physics models, NUCFRG2 and QMSFRG, are compared to an experimental database consisting of over 3600 experimental cross sections to demonstrate the applicability of the metrics. A cumulative uncertainty metric is applied to the question of overall model accuracy, while a metric based on the median uncertainty is used to analyze the models from the perspective of model development by analyzing subsets of the model parameter space.
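
    The two styles of metric described in this abstract can be illustrated with a small sketch. All function names and numbers below are invented for illustration; the actual metric definitions used with NUCFRG2 and QMSFRG are given in the paper itself.

```python
# Illustrative sketch only: a cumulative (overall) metric versus a median-based
# (robust, subset-wise) metric for comparing model predictions to sparse
# experimental data. Function names and values are hypothetical, not the
# paper's definitions.

def relative_errors(model, experiment):
    """Relative deviation of each model value from its measured value."""
    return [abs(m - e) / e for m, e in zip(model, experiment)]

def cumulative_uncertainty(model, experiment):
    """One aggregate number for overall accuracy: RMS of relative errors."""
    errs = relative_errors(model, experiment)
    return (sum(err ** 2 for err in errs) / len(errs)) ** 0.5

def median_uncertainty(model, experiment):
    """Median relative error: robust to a few badly predicted points,
    useful when analyzing subsets of the model parameter space."""
    errs = sorted(relative_errors(model, experiment))
    n = len(errs)
    mid = n // 2
    return errs[mid] if n % 2 else 0.5 * (errs[mid - 1] + errs[mid])

model = [10.0, 20.0, 30.0, 44.0]        # hypothetical model cross sections
experiment = [10.0, 25.0, 30.0, 40.0]   # hypothetical measurements

overall = cumulative_uncertainty(model, experiment)
typical = median_uncertainty(model, experiment)
```

    A single outlier inflates the cumulative metric much more than the median metric, which is why the two answer different questions: overall model accuracy versus typical model behavior on a subset.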

  3. Experimental Design and Some Threats to Experimental Validity: A Primer

    ERIC Educational Resources Information Center

    Skidmore, Susan

    2008-01-01

    Experimental designs are distinguished as the best method for responding to questions involving causality. The purpose of the present paper is to explicate the logic of experimental design and why it is so vital to questions that demand causal conclusions. In addition, types of internal and external validity threats are discussed. To emphasize the…

  4. Threats to the Internal Validity of Experimental and Quasi-Experimental Research in Healthcare.

    PubMed

    Flannelly, Kevin J; Flannelly, Laura T; Jankowski, Katherine R B

    2018-01-01

    The article defines, describes, and discusses the seven threats to the internal validity of experiments discussed by Donald T. Campbell in his classic 1957 article: history, maturation, testing, instrument decay, statistical regression, selection, and mortality. These concepts are said to be threats to the internal validity of experiments because they pose alternate explanations for the apparent causal relationship between the independent variable and dependent variable of an experiment if they are not adequately controlled. A series of simple diagrams illustrate three pre-experimental designs and three true experimental designs discussed by Campbell in 1957 and several quasi-experimental designs described in his book written with Julian C. Stanley in 1966. The current article explains why each design controls for or fails to control for these seven threats to internal validity.

  5. Experimental validation of predicted cancer genes using FRET

    NASA Astrophysics Data System (ADS)

    Guala, Dimitri; Bernhem, Kristoffer; Ait Blal, Hammou; Jans, Daniel; Lundberg, Emma; Brismar, Hjalmar; Sonnhammer, Erik L. L.

    2018-07-01

    Huge amounts of data are generated in genome-wide experiments designed to investigate diseases with complex genetic causes. Follow-up of all potential leads produced by such experiments is currently cost-prohibitive and time consuming. Gene prioritization tools alleviate these constraints by directing further experimental efforts towards the most promising candidate targets. Recently, a gene prioritization tool called MaxLink was shown to outperform other widely used state-of-the-art prioritization tools in a large-scale in silico benchmark. An experimental validation of predictions made by MaxLink has, however, been lacking. In this study we used Fluorescence Resonance Energy Transfer, an established experimental technique for detection of protein-protein interactions, to validate potential cancer genes predicted by MaxLink. Our results provide confidence in the use of MaxLink for selection of new targets in the battle against polygenic diseases.

  6. Hypersonic Experimental and Computational Capability, Improvement and Validation. Volume 2

    NASA Technical Reports Server (NTRS)

    Muylaert, Jean (Editor); Kumar, Ajay (Editor); Dujarric, Christian (Editor)

    1998-01-01

    The results of the phase 2 effort conducted under AGARD Working Group 18 on Hypersonic Experimental and Computational Capability, Improvement and Validation are presented in this report. The first volume, published in May 1996, mainly focused on the design methodology, plans and some initial results of experiments that had been conducted to serve as validation benchmarks. The current volume presents the detailed experimental and computational data base developed during this effort.

  7. Solar-Diesel Hybrid Power System Optimization and Experimental Validation

    NASA Astrophysics Data System (ADS)

    Jacobus, Headley Stewart

    As of 2008, 1.46 billion people, or 22 percent of the world's population, were without electricity. Many of these people live in remote areas where decentralized generation is the only method of electrification. Most mini-grids are powered by diesel generators, but new hybrid power systems are becoming a reliable method of incorporating renewable energy while also reducing total system cost. This thesis quantifies the measurable operational costs for an experimental hybrid power system in Sierra Leone. Two software programs, Hybrid2 and HOMER, are used during the system design and subsequent analysis. Experimental data from the installed system are used to validate the two programs and to quantify the savings created by each component within the hybrid system. This thesis bridges the gap between design optimization studies, which frequently lack subsequent validation, and experimental hybrid system performance studies.

  8. Experimental Validation of a Thermoelastic Model for SMA Hybrid Composites

    NASA Technical Reports Server (NTRS)

    Turner, Travis L.

    2001-01-01

    This study presents results from experimental validation of a recently developed model for predicting the thermomechanical behavior of shape memory alloy hybrid composite (SMAHC) structures, composite structures with an embedded SMA constituent. The model captures the material nonlinearity of the material system with temperature and is capable of modeling constrained, restrained, or free recovery behavior from experimental measurement of fundamental engineering properties. A brief description of the model and analysis procedures is given, followed by an overview of a parallel effort to fabricate and characterize the material system of SMAHC specimens. Static and dynamic experimental configurations for the SMAHC specimens are described and experimental results for thermal post-buckling and random response are presented. Excellent agreement is achieved between the measured and predicted results, fully validating the theoretical model for constrained recovery behavior of SMAHC structures.

  9. Experimental Validation Techniques for the Heleeos Off-Axis Laser Propagation Model

    DTIC Science & Technology

    2010-03-01

    [Front-matter fragments extracted from the thesis title pages:] EXPERIMENTAL VALIDATION TECHNIQUES FOR THE HELEEOS OFF-AXIS LASER PROPAGATION MODEL. Thesis, John Haiducek, 1st Lt, USAF (BS, Physics), AFIT/GAP/ENP/10-M07, March 2010. Approved for public release; distribution unlimited. The available abstract text is truncated and begins: "The High Energy Laser End-to-End

  10. Experimental validation of an ultrasonic flowmeter for unsteady flows

    NASA Astrophysics Data System (ADS)

    Leontidis, V.; Cuvier, C.; Caignaert, G.; Dupont, P.; Roussette, O.; Fammery, S.; Nivet, P.; Dazin, A.

    2018-04-01

    An ultrasonic flowmeter was developed for further applications in cryogenic conditions and for measuring flow rate fluctuations in the range of 0 to 70 Hz. The prototype was installed in a flow test rig and validated experimentally in both steady and unsteady water flow conditions. A Coriolis flowmeter was used for calibration under steady-state conditions, whereas in the unsteady case the validation was done simultaneously against two methods: particle image velocimetry (PIV), and pressure transducers installed flush on the wall of the pipe. The results show that the developed flowmeter and the proposed methodology can accurately measure the frequency and amplitude of unsteady fluctuations over the experimental range of 0-9 l s⁻¹ of the mean main flow rate and 0-70 Hz of the imposed disturbances.

  11. Flight Research and Validation Formerly Experimental Capabilities Supersonic Project

    NASA Technical Reports Server (NTRS)

    Banks, Daniel

    2009-01-01

    This slide presentation reviews the work of the Experimental Capabilities Supersonic project, which is being reorganized into Flight Research and Validation. The work of the Experimental Capabilities project in FY '09 is reviewed, and the specific centers assigned to do the work are given. The portfolio of the newly formed Flight Research and Validation (FRV) group is also reviewed. The various FY '10 projects for the FRV are detailed. These projects include: Eagle Probe, Channeled Centerbody Inlet Experiment (CCIE), Supersonic Boundary Layer Transition test (SBLT), Aero-elastic Test Wing-2 (ATW-2), G-V External Vision Systems (G5 XVS), Air-to-Air Schlieren (A2A), In-Flight Background Oriented Schlieren (BOS), Dynamic Inertia Measurement Technique (DIM), and Advanced In-Flight IR Thermography (AIR-T).

  12. Experimental validation of a new heterogeneous mechanical test design

    NASA Astrophysics Data System (ADS)

    Aquino, J.; Campos, A. Andrade; Souto, N.; Thuillier, S.

    2018-05-01

    Standard material parameter identification strategies generally use an extensive number of classical tests to collect the required experimental data. However, a great effort has recently been made by the scientific and industrial communities to base this experimental database on heterogeneous tests. These tests can provide richer information on the material behavior, allowing the identification of a more complete set of material parameters. This is a result of the recent development of full-field measurement techniques, like digital image correlation (DIC), that can capture the heterogeneous deformation fields on the specimen surface during the test. Recently, new specimen geometries were designed to enhance the richness of the strain field and capture supplementary strain states. The butterfly specimen is an example of these new geometries, designed through a numerical optimization procedure based on an indicator capable of evaluating the heterogeneity and richness of the strain information. However, no experimental validation had yet been performed. The aim of this work is to experimentally validate the heterogeneous butterfly mechanical test in the parameter identification framework. To this end, the DIC technique and a Finite Element Model Updating inverse strategy are used together for the parameter identification of a DC04 steel, as well as for the calculation of the indicator. The experimental tests are carried out in a universal testing machine with the ARAMIS measuring system to provide the strain states on the specimen surface. The identification strategy is accomplished with the data obtained from the experimental tests, and the results are compared to a reference numerical solution.

  13. INL Experimental Program Roadmap for Thermal Hydraulic Code Validation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Glenn McCreery; Hugh McIlroy

    2007-09-01

    Advanced computer modeling and simulation tools and protocols will be heavily relied on for a wide variety of system studies, engineering design activities, and other aspects of the Next Generation Nuclear Plant (NGNP) Very High Temperature Reactor (VHTR), the DOE Global Nuclear Energy Partnership (GNEP), and light-water reactors. The goal is for all modeling and simulation tools to be demonstrated accurate and reliable through a formal Verification and Validation (V&V) process, especially where such tools are to be used to establish safety margins and support regulatory compliance, or to design a system in a manner that reduces the role of expensive mockups and prototypes. Recent literature identifies specific experimental principles that must be followed in order to ensure that experimental data meet the standards required for a “benchmark” database. Even for well-conducted experiments, missing experimental details, such as geometrical definition, data reduction procedures, and manufacturing tolerances, have led to poor benchmark calculations. The INL has a long and deep history of research in thermal hydraulics, especially in the 1960s through 1980s, when many programs such as LOFT and Semiscale were devoted to light-water reactor safety research, the EBR-II fast reactor was in operation, and a strong geothermal energy program was established. The past can serve as a partial guide for reinvigorating thermal hydraulic research at the laboratory. However, new research programs need to fully incorporate modern experimental methods such as measurement techniques using the latest instrumentation, computerized data reduction, and scaling methodology. The path forward for establishing experimental research for code model validation will require benchmark experiments conducted in suitable facilities located at the INL. This document describes thermal hydraulic facility requirements and candidate buildings and presents examples of suitable validation experiments.

  14. Experimental validation of calculated atomic charges in ionic liquids

    NASA Astrophysics Data System (ADS)

    Fogarty, Richard M.; Matthews, Richard P.; Ashworth, Claire R.; Brandt-Talbot, Agnieszka; Palgrave, Robert G.; Bourne, Richard A.; Vander Hoogerstraete, Tom; Hunt, Patricia A.; Lovelock, Kevin R. J.

    2018-05-01

    A combination of X-ray photoelectron spectroscopy and near edge X-ray absorption fine structure spectroscopy has been used to provide an experimental measure of nitrogen atomic charges in nine ionic liquids (ILs). These experimental results are used to validate charges calculated with three computational methods: charges from electrostatic potentials using a grid-based method (ChelpG), natural bond orbital population analysis, and the atoms in molecules approach. By combining these results with those from a previous study on sulfur, we find that ChelpG charges provide the best description of the charge distribution in ILs. However, we find that ChelpG charges can lead to significant conformational dependence and therefore advise that small differences in ChelpG charges (<0.3 e) should be interpreted with care. We use these validated charges to provide physical insight into nitrogen atomic charges for the ILs probed.

  15. 40 CFR 761.386 - Required experimental conditions for the validation study and subsequent use during decontamination.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... the validation study and subsequent use during decontamination. 761.386 Section 761.386 Protection of... experimental conditions for the validation study and subsequent use during decontamination. The following experimental conditions apply for any solvent: (a) Temperature and pressure. Conduct the validation study and...

  16. 40 CFR 761.386 - Required experimental conditions for the validation study and subsequent use during decontamination.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... the validation study and subsequent use during decontamination. 761.386 Section 761.386 Protection of... experimental conditions for the validation study and subsequent use during decontamination. The following experimental conditions apply for any solvent: (a) Temperature and pressure. Conduct the validation study and...

  17. 40 CFR 761.386 - Required experimental conditions for the validation study and subsequent use during decontamination.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... the validation study and subsequent use during decontamination. 761.386 Section 761.386 Protection of... experimental conditions for the validation study and subsequent use during decontamination. The following experimental conditions apply for any solvent: (a) Temperature and pressure. Conduct the validation study and...

  18. Experimental Validation of an Integrated Controls-Structures Design Methodology

    NASA Technical Reports Server (NTRS)

    Maghami, Peiman G.; Gupta, Sandeep; Elliot, Kenny B.; Walz, Joseph E.

    1996-01-01

    The first experimental validation of an integrated controls-structures design methodology for a class of large order, flexible space structures is described. Integrated redesign of the controls-structures-interaction evolutionary model, a laboratory testbed at NASA Langley, was described earlier. The redesigned structure was fabricated, assembled in the laboratory, and experimentally tested against the original structure. Experimental results indicate that the structure redesigned using the integrated design methodology requires significantly less average control power than the nominal structure with control-optimized designs, while maintaining the required line-of-sight pointing performance. Thus, the superiority of the integrated design methodology over the conventional design approach is experimentally demonstrated. Furthermore, amenability of the integrated design structure to other control strategies is evaluated, both analytically and experimentally. Using Linear-Quadratic-Gaussian optimal dissipative controllers, it is observed that the redesigned structure leads to significantly improved performance with alternate controllers as well.

  19. Experimental validation of wireless communication with chaos.

    PubMed

    Ren, Hai-Peng; Bai, Chao; Liu, Jian; Baptista, Murilo S; Grebogi, Celso

    2016-08-01

    The constraints of a wireless physical medium, such as multi-path propagation and complex ambient noise, prevent information from being communicated at low bit error rates. Surprisingly, it has only recently been shown that, from a theoretical perspective, chaotic signals are optimal for communication: they maximise the receiver signal-to-noise performance, consequently minimizing the bit error rate. This work demonstrates numerically and experimentally that chaotic systems can in fact be used to create a reliable and efficient wireless communication system. Toward this goal, we propose an impulsive control method to generate chaotic wave signals that encode arbitrary binary information signals, and an integration logic together with a matched filter capable of decreasing the noise effect over a wireless channel. The experimental validation is conducted by inputting the signals generated by an electronic transmitting circuit to an electronic circuit that emulates a wireless channel, where the signals travel along three different paths. The output signal is decoded by an electronic receiver after passing through a matched filter.
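
    The matched-filter decoding step can be illustrated generically. In this sketch, a fixed known pulse shape and synthetic Gaussian noise stand in for the chaotic waveforms and emulated multi-path channel used in the paper; all names and values are hypothetical.

```python
import random

# Generic matched-filter decoding sketch (not the chaos-specific filter of the
# paper): correlate each received symbol interval against the known pulse
# shape and decide the bit from the sign of the correlation.

random.seed(1)
PULSE = [1.0, 1.0, -1.0, -1.0]                 # known transmitted pulse shape

def transmit(bits, noise=0.4):
    """Map each bit to +/-PULSE and add synthetic channel noise."""
    out = []
    for b in bits:
        sign = 1.0 if b else -1.0
        out.extend(sign * p + random.gauss(0.0, noise) for p in PULSE)
    return out

def matched_filter_decode(signal):
    """Correlate each symbol interval with PULSE; decide by the sign."""
    bits = []
    for i in range(0, len(signal), len(PULSE)):
        chunk = signal[i:i + len(PULSE)]
        corr = sum(s * p for s, p in zip(chunk, PULSE))
        bits.append(corr > 0)
    return bits

bits = [True, False, True, True, False]
decoded = matched_filter_decode(transmit(bits))
```

    Correlating against the known pulse concentrates the signal energy while the zero-mean noise largely cancels, which is the sense in which a matched filter maximizes the receiver signal-to-noise ratio.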

  20. Experimental validation of wireless communication with chaos

    NASA Astrophysics Data System (ADS)

    Ren, Hai-Peng; Bai, Chao; Liu, Jian; Baptista, Murilo S.; Grebogi, Celso

    2016-08-01

    The constraints of a wireless physical medium, such as multi-path propagation and complex ambient noise, prevent information from being communicated at low bit error rates. Surprisingly, it has only recently been shown that, from a theoretical perspective, chaotic signals are optimal for communication: they maximise the receiver signal-to-noise performance, consequently minimizing the bit error rate. This work demonstrates numerically and experimentally that chaotic systems can in fact be used to create a reliable and efficient wireless communication system. Toward this goal, we propose an impulsive control method to generate chaotic wave signals that encode arbitrary binary information signals, and an integration logic together with a matched filter capable of decreasing the noise effect over a wireless channel. The experimental validation is conducted by inputting the signals generated by an electronic transmitting circuit to an electronic circuit that emulates a wireless channel, where the signals travel along three different paths. The output signal is decoded by an electronic receiver after passing through a matched filter.

  21. Experimental validation of the DARWIN2.3 package for fuel cycle applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    San-Felice, L.; Eschbach, R.; Bourdot, P.

    2012-07-01

    The DARWIN package, developed by the CEA and its French partners (AREVA and EDF), provides the required parameters for fuel cycle applications: fuel inventory, decay heat, activity, neutron, γ, α, β sources and spectrum, and radiotoxicity. This paper presents the experimental validation of DARWIN2.3 for fuel inventory and decay heat calculations on Pressurized Water Reactors (PWRs). In order to validate the code system for spent fuel inventory, a large program based on spent fuel chemical assays has been undertaken. The fuel inventory validation covers Uranium Oxide (UOX) and Mixed Oxide (MOX) fuels and focuses on the isotopes involved in Burn-Up Credit (BUC) applications and decay heat computations. The calculation-experiment (C/E-1) discrepancies are calculated with the latest European evaluation file, JEFF-3.1.1, associated with the SHEM energy mesh. An overview of the tendencies is obtained on a complete burn-up range from 10 to 85 GWd/t (10 to 60 GWd/t for MOX fuel). The experimental validation of the DARWIN2.3 package for decay heat calculation is performed using calorimetric measurements carried out at the Swedish Interim Spent Fuel Storage Facility for Pressurized Water Reactor (PWR) assemblies, covering a large burn-up (20 to 50 GWd/t) and cooling time range (10 to 30 years). (authors)

  22. Experimental validation of flexible robot arm modeling and control

    NASA Technical Reports Server (NTRS)

    Ulsoy, A. Galip

    1989-01-01

    Flexibility is important for high speed, high precision operation of lightweight manipulators. Accurate dynamic modeling of flexible robot arms is needed. Previous work has mostly been based on linear elasticity with prescribed rigid body motions (i.e., no effect of flexible motion on rigid body motion). Little or no experimental validation of dynamic models for flexible arms is available. Experimental results are also limited for flexible arm control. Researchers include the effects of prismatic as well as revolute joints. They investigate the effect of full coupling between the rigid and flexible motions, and of axial shortening, and consider the control of flexible arms using only additional sensors.

  23. Experimental Characterization and Validation of Simultaneous Gust Alleviation and Energy Harvesting for Multifunctional Wing Spars

    DTIC Science & Technology

    2012-08-01

    [Fragments extracted from the report's briefing charts: gust simulation using a Dryden PSD (U0 = 15 m/s, Lv = 350 m) for cloud wind and clear-sky conditions; energy harvested from normal vibration; an energy control law based on limited energy constraints; and experimentally validated simultaneous energy harvesting and vibration control. Sponsored by AFOSR.]

  24. Experimentally validated finite element model of electrocaloric multilayer ceramic structures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, N. A. S.; Correia, T. M.; Rokosz, M. K.

    2014-07-28

    A novel finite element model to simulate the electrocaloric response of a multilayer ceramic capacitor (MLCC) under realistic environmental and operational conditions has been developed. The two-dimensional transient conductive heat transfer model presented includes the electrocaloric effect as a source term, as well as accounting for radiative and convective effects. The model has been validated with experimental data obtained from direct imaging of the MLCC transient temperature variation under application of an electric field. The good agreement between simulated and experimental data suggests that the novel experimental direct measurement methodology and the finite element model could be used to support the design of optimised electrocaloric units and operating conditions.

  25. A new simple local muscle recovery model and its theoretical and experimental validation.

    PubMed

    Ma, Liang; Zhang, Wei; Wu, Su; Zhang, Zhanwu

    2015-01-01

    This study was conducted to provide theoretical and experimental validation of a local muscle recovery model. Muscle recovery has been modeled in different empirical and theoretical approaches to determine work-rest allowances for musculoskeletal disorder (MSD) prevention. However, time-related parameters and individual attributes have not been sufficiently considered in conventional approaches. A new muscle recovery model was proposed by integrating time-related task parameters and individual attributes. Theoretically, this muscle recovery model was compared mathematically to other theoretical models. Experimentally, a total of 20 subjects participated in the experimental validation. Hand grip force recovery and shoulder joint strength recovery were measured after a fatiguing operation. The recovery profile was fitted using the recovery model, and individual recovery rates were calculated after fitting. Good fitting values (r² > .8) were found for all subjects. Significant differences in recovery rates were found among different muscle groups (p < .05). The theoretical muscle recovery model was primarily validated by characterization of the recovery process after a fatiguing operation. The determined recovery rate may be useful for representing individual recovery attributes.
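
    As a sketch of what fitting a recovery profile and extracting an individual recovery rate can look like, assume a simple exponential recovery law. The model form, function names, and numbers below are hypothetical and synthetic, not the study's actual model or data.

```python
import math

# Hypothetical exponential recovery profile
#   F(t) = F_max - (F_max - F_0) * exp(-R * t)
# fitted to strength measurements taken after a fatiguing task; R is the
# individual recovery rate.

def fit_recovery_rate(times, forces, f_max):
    """Least-squares slope of ln(f_max - F(t)) versus t; the slope is -R."""
    ys = [math.log(f_max - f) for f in forces]
    n = len(times)
    t_bar = sum(times) / n
    y_bar = sum(ys) / n
    num = sum((t - t_bar) * (y - y_bar) for t, y in zip(times, ys))
    den = sum((t - t_bar) ** 2 for t in times)
    return -num / den

# Synthetic measurements generated with R = 0.3/min, F_0 = 60, F_max = 100 (%MVC)
times = [0.0, 1.0, 2.0, 4.0, 8.0]                     # minutes after the task
forces = [100.0 - 40.0 * math.exp(-0.3 * t) for t in times]
rate = fit_recovery_rate(times, forces, 100.0)        # recovers R = 0.3
```

    Linearizing the exponential makes the fit a one-line least-squares slope; with noisy real data a nonlinear fit of the full profile would be used instead.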

  26. Experimental validation of solid rocket motor damping models

    NASA Astrophysics Data System (ADS)

    Riso, Cristina; Fransen, Sebastiaan; Mastroddi, Franco; Coppotelli, Giuliano; Trequattrini, Francesco; De Vivo, Alessio

    2017-12-01

    In design and certification of spacecraft, payload/launcher coupled load analyses are performed to simulate the satellite dynamic environment. To obtain accurate predictions, the system damping properties must be properly taken into account in the finite element model used for coupled load analysis. This is typically done using a structural damping characterization in the frequency domain, which is not applicable in the time domain. Therefore, the structural damping matrix of the system must be converted into an equivalent viscous damping matrix when a transient coupled load analysis is performed. This paper focuses on the validation of equivalent viscous damping methods for dynamically condensed finite element models via correlation with experimental data for a realistic structure representative of a slender launch vehicle with solid rocket motors. A second aim of the paper is to investigate how to conveniently choose a single combination of Young's modulus and structural damping coefficient, i.e. a complex Young's modulus, to approximate the viscoelastic behavior of a solid propellant material in the frequency band of interest for coupled load analysis. A scaled-down test article inspired by the Z9-ignition Vega launcher configuration is designed, manufactured, and experimentally tested to obtain data for validation of the equivalent viscous damping methods. The Z9-like component of the test article is filled with a viscoelastic material representative of the Z9 solid propellant that is also preliminarily tested to investigate the dependency of the complex Young's modulus on the excitation frequency and provide data for the test article finite element model. Experimental results from seismic and shock tests performed on the test configuration are correlated with numerical results from frequency and time domain analyses carried out on its dynamically condensed finite element model to assess the applicability of different equivalent viscous damping methods to describe

  27. Experimental validation of solid rocket motor damping models

    NASA Astrophysics Data System (ADS)

    Riso, Cristina; Fransen, Sebastiaan; Mastroddi, Franco; Coppotelli, Giuliano; Trequattrini, Francesco; De Vivo, Alessio

    2018-06-01

    In design and certification of spacecraft, payload/launcher coupled load analyses are performed to simulate the satellite dynamic environment. To obtain accurate predictions, the system damping properties must be properly taken into account in the finite element model used for coupled load analysis. This is typically done using a structural damping characterization in the frequency domain, which is not applicable in the time domain. Therefore, the structural damping matrix of the system must be converted into an equivalent viscous damping matrix when a transient coupled load analysis is performed. This paper focuses on the validation of equivalent viscous damping methods for dynamically condensed finite element models via correlation with experimental data for a realistic structure representative of a slender launch vehicle with solid rocket motors. A second aim of the paper is to investigate how to conveniently choose a single combination of Young's modulus and structural damping coefficient, i.e. a complex Young's modulus, to approximate the viscoelastic behavior of a solid propellant material in the frequency band of interest for coupled load analysis. A scaled-down test article inspired by the Z9-ignition Vega launcher configuration is designed, manufactured, and experimentally tested to obtain data for validation of the equivalent viscous damping methods. The Z9-like component of the test article is filled with a viscoelastic material representative of the Z9 solid propellant that is also preliminarily tested to investigate the dependency of the complex Young's modulus on the excitation frequency and provide data for the test article finite element model. Experimental results from seismic and shock tests performed on the test configuration are correlated with numerical results from frequency and time domain analyses carried out on its dynamically condensed finite element model to assess the applicability of different equivalent viscous damping methods to describe

  8. Validation of experimental molecular crystal structures with dispersion-corrected density functional theory calculations.

    PubMed

    van de Streek, Jacco; Neumann, Marcus A

    2010-10-01

    This paper describes the validation of a dispersion-corrected density functional theory (d-DFT) method for the purpose of assessing the correctness of experimental organic crystal structures and enhancing the information content of purely experimental data. 241 experimental organic crystal structures from the August 2008 issue of Acta Cryst. Section E were energy-minimized in full, including unit-cell parameters. The differences between the experimental and the minimized crystal structures were subjected to statistical analysis. The r.m.s. Cartesian displacement excluding H atoms upon energy minimization with flexible unit-cell parameters is selected as a pertinent indicator of the correctness of a crystal structure. All 241 experimental crystal structures are reproduced very well: the average r.m.s. Cartesian displacement for the 241 crystal structures, including 16 disordered structures, is only 0.095 Å (0.084 Å for the 225 ordered structures). R.m.s. Cartesian displacements above 0.25 Å either indicate incorrect experimental crystal structures or reveal interesting structural features such as exceptionally large temperature effects, incorrectly modelled disorder, or symmetry-breaking H atoms. After validation, the method is applied to nine examples that are known to be ambiguous or subtly incorrect.
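    The r.m.s. Cartesian displacement indicator is straightforward to compute once the experimental and energy-minimized coordinates are paired atom by atom. The sketch below is a simplified illustration: it assumes both structures are already expressed in the same Cartesian frame with matched atom ordering, and it skips the space-group and cell-flexing bookkeeping a real comparison needs. The coordinates and element symbols are invented.

```python
import math

def rmsd_non_h(expt, minimized, symbols):
    """R.m.s. Cartesian displacement (Å) between paired experimental and
    energy-minimized atomic coordinates, excluding hydrogen atoms."""
    pairs = [(a, b) for a, b, s in zip(expt, minimized, symbols) if s != 'H']
    sq = sum((ax - bx)**2 + (ay - by)**2 + (az - bz)**2
             for (ax, ay, az), (bx, by, bz) in pairs)
    return math.sqrt(sq / len(pairs))

# Invented three-atom example; the H atom is excluded from the metric.
expt = [(0.0, 0.0, 0.0), (1.5, 0.0, 0.0), (1.0, 1.0, 0.0)]
mini = [(0.05, 0.0, 0.0), (1.45, 0.0, 0.0), (1.0, 1.0, 0.5)]
symbols = ['C', 'N', 'H']
rmsd = rmsd_non_h(expt, mini, symbols)
needs_inspection = rmsd > 0.25  # the paper's flag threshold in Å
```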

  9. An experimentally validated network of nine haematopoietic transcription factors reveals mechanisms of cell state stability

    PubMed Central

    Schütte, Judith; Wang, Huange; Antoniou, Stella; Jarratt, Andrew; Wilson, Nicola K; Riepsaame, Joey; Calero-Nieto, Fernando J; Moignard, Victoria; Basilico, Silvia; Kinston, Sarah J; Hannah, Rebecca L; Chan, Mun Chiang; Nürnberg, Sylvia T; Ouwehand, Willem H; Bonzanni, Nicola; de Bruijn, Marella FTR; Göttgens, Berthold

    2016-01-01

    Transcription factor (TF) networks determine cell-type identity by establishing and maintaining lineage-specific expression profiles, yet reconstruction of mammalian regulatory network models has been hampered by a lack of comprehensive functional validation of regulatory interactions. Here, we report comprehensive ChIP-Seq, transgenic and reporter gene experimental data that have allowed us to construct an experimentally validated regulatory network model for haematopoietic stem/progenitor cells (HSPCs). Model simulation coupled with subsequent experimental validation using single cell expression profiling revealed potential mechanisms for cell state stabilisation, and also how a leukaemogenic TF fusion protein perturbs key HSPC regulators. The approach presented here should help to improve our understanding of both normal physiological and disease processes. DOI: http://dx.doi.org/10.7554/eLife.11469.001 PMID:26901438

  10. Experimental Validation: Subscale Aircraft Ground Facilities and Integrated Test Capability

    NASA Technical Reports Server (NTRS)

    Bailey, Roger M.; Hostetler, Robert W., Jr.; Barnes, Kevin N.; Belcastro, Celeste M.; Belcastro, Christine M.

    2005-01-01

    Experimental testing is an important aspect of validating complex integrated safety critical aircraft technologies. The Airborne Subscale Transport Aircraft Research (AirSTAR) Testbed is being developed at NASA Langley to validate technologies under conditions that cannot be flight validated with full-scale vehicles. The AirSTAR capability comprises a series of flying sub-scale models, associated ground-support equipment, and a base research station at NASA Langley. The subscale model capability utilizes a generic 5.5% scaled transport class vehicle known as the Generic Transport Model (GTM). The AirSTAR Ground Facilities encompass the hardware and software infrastructure necessary to provide comprehensive support services for the GTM testbed. The ground facilities support remote piloting of the GTM aircraft, and include all subsystems required for data/video telemetry, experimental flight control algorithm implementation and evaluation, GTM simulation, data recording/archiving, and audio communications. The ground facilities include a self-contained, motorized vehicle serving as a mobile research command/operations center, capable of deployment to remote sites when conducting GTM flight experiments. The ground facilities also include a laboratory based at NASA LaRC providing near identical capabilities as the mobile command/operations center, as well as the capability to receive data/video/audio from, and send data/audio to the mobile command/operations center during GTM flight experiments.

  11. Viscoelasticity of Axisymmetric Composite Structures: Analysis and Experimental Validation

    DTIC Science & Technology

    2013-02-01

    compressive stress at the interface between the composite and steel prior to the sheath's cut-off. Accordingly, the viscoelastic analysis is used... The hoop-stress profile in figure 6 shows the steel region is in compression, resulting from the winding tension of the composite overwrap. The stress... mechanical and thermal loads. Experimental validation of the model is conducted using a high-tensioned composite overwrapped on a steel cylinder. The creep...

  12. Experimental validation of wireless communication with chaos

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ren, Hai-Peng; Bai, Chao; Liu, Jian

    The constraints of the wireless physical medium, such as multi-path propagation and complex ambient noise, prevent information from being communicated at a low bit error rate. Surprisingly, it has only recently been shown that, from a theoretical perspective, chaotic signals are optimal for communication: they maximize the receiver signal-to-noise performance and consequently minimize the bit error rate. This work demonstrates numerically and experimentally that chaotic systems can in fact be used to create a reliable and efficient wireless communication system. Toward this goal, we propose an impulsive control method to generate chaotic wave signals that encode arbitrary binary information signals, and an integration logic together with the match filter capable of decreasing the noise effect over a wireless channel. The experimental validation is conducted by inputting the signals generated by an electronic transmitting circuit to an electronic circuit that emulates a wireless channel, where the signals travel along three different paths. The output signal is decoded by an electronic receiver after passing through a match filter.
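    The match-filter decision principle the abstract relies on can be sketched independently of the chaos generator: each bit is transmitted as a basis waveform or its negation, and the receiver correlates every received segment against the template and takes the sign. This toy is not the paper's impulsive-control transmitter or its multi-path emulator; the template samples and noise values are invented.

```python
def matched_filter_decode(received, template):
    """Correlation (matched-filter) receiver for antipodal signalling:
    correlate each template-length chunk with the template and decide the
    bit from the sign of the correlation."""
    n = len(template)
    bits = []
    for i in range(0, len(received), n):
        chunk = received[i:i + n]
        corr = sum(r * t for r, t in zip(chunk, template))
        bits.append(1 if corr >= 0 else 0)
    return bits

template = [0.3, -0.7, 1.1, -0.2]        # stand-in for one basis segment
tx = template + [-x for x in template]   # encodes bits 1, 0
noise = [0.05, -0.04, 0.02, 0.03, -0.02, 0.05, -0.03, 0.01]
rx = [x + n for x, n in zip(tx, noise)]  # mildly corrupted channel output
bits = matched_filter_decode(rx, template)
```

    The matched filter maximizes output signal-to-noise ratio for additive white noise, which is the property the abstract appeals to when arguing for low bit error rates.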

  13. Summary: Experimental validation of real-time fault-tolerant systems

    NASA Technical Reports Server (NTRS)

    Iyer, R. K.; Choi, G. S.

    1992-01-01

    Testing and validation of real-time systems is always difficult to perform, since neither the error generation process nor the fault propagation problem is easy to comprehend. There is no better substitute for results based on actual measurements and experimentation. Such results are essential for developing a rational basis for evaluation and validation of real-time systems. However, with physical experimentation, controllability and observability are limited to the external instrumentation that can be hooked up to the system under test, and this is a difficult, if not impossible, task for a complex system. Also, to set up such experiments for measurements, physical hardware must exist. On the other hand, a simulation approach allows flexibility that is unequaled by any other existing method for system evaluation. A simulation methodology for system evaluation was successfully developed and implemented, and the environment was demonstrated using existing real-time avionic systems. The research was oriented toward evaluating the impact of permanent and transient faults in aircraft control computers. Results were obtained for the Bendix BDX 930 system and Hamilton Standard EEC131 jet engine controller. The studies showed that simulated fault injection is valuable, in the design stage, to evaluate the susceptibility of computing systems to different types of failures.
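    Simulated fault injection of the kind described, corrupting state in a running computation and checking whether the corruption propagates to an observable output, can be sketched in a few lines. This toy campaign is in no way the BDX 930/EEC131 simulation environment; the workload (a simple sum) and all values are illustrative.

```python
def flip_bit(value, bit):
    """Inject a transient fault by flipping one bit of an integer value."""
    return value ^ (1 << bit)

def run_with_fault(inputs, faulty_index, bit):
    """Toy fault-injection trial: corrupt one input, rerun the workload,
    and compare with the golden (fault-free) output to see whether the
    fault propagates to an observable result."""
    golden = sum(inputs)
    corrupted = list(inputs)
    corrupted[faulty_index] = flip_bit(corrupted[faulty_index], bit)
    faulty = sum(corrupted)
    return golden, faulty, golden != faulty

golden, faulty, propagated = run_with_fault([3, 5, 7], faulty_index=1, bit=2)
# flipping bit 2 of the value 5 yields 1, so this fault is observable
```

    Real campaigns sweep fault location, time, and duration, and classify outcomes (masked, detected, silent data corruption), which is exactly the kind of coverage statistics physical instrumentation struggles to provide.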

  14. Numerical Modelling of Femur Fracture and Experimental Validation Using Bone Simulant.

    PubMed

    Marco, Miguel; Giner, Eugenio; Larraínzar-Garijo, Ricardo; Caeiro, José Ramón; Miguélez, María Henar

    2017-10-01

    Bone fracture pattern prediction is still a challenge and an active field of research. The main goal of this article is to present a combined methodology (experimental and numerical) for femur fracture onset analysis. Experimental work includes the characterization of the mechanical properties and fracture testing on a bone simulant. The numerical work focuses on the development of a model whose material properties are provided by the characterization tests. The fracture location and the early stages of the crack propagation are modelled using the extended finite element method and the model is validated by fracture tests developed in the experimental work. It is shown that the accuracy of the numerical results strongly depends on a proper bone behaviour characterization.

  15. Experimental validation of ultrasonic guided modes in electrical cables by optical interferometry.

    PubMed

    Mateo, Carlos; de Espinosa, Francisco Montero; Gómez-Ullate, Yago; Talavera, Juan A

    2008-03-01

    In this work, the dispersion curves of elastic waves propagating in electrical cables and in bare copper wires are obtained theoretically and validated experimentally. The theoretical model, based on Gazis equations formulated according to the global matrix methodology, is resolved numerically. Viscoelasticity and attenuation are modeled theoretically using the Kelvin-Voigt model. Experimental tests are carried out using interferometry. There is good agreement between the simulations and the experiments despite the peculiarities of electrical cables.

  16. Fractional viscoelasticity in fractal and non-fractal media: Theory, experimental validation, and uncertainty analysis

    NASA Astrophysics Data System (ADS)

    Mashayekhi, Somayeh; Miles, Paul; Hussaini, M. Yousuff; Oates, William S.

    2018-02-01

    In this paper, fractional and non-fractional viscoelastic models for elastomeric materials are derived and analyzed in comparison to experimental results. The viscoelastic models are derived by expanding thermodynamic balance equations for both fractal and non-fractal media. The order of the fractional time derivative is shown to strongly affect the accuracy of the viscoelastic constitutive predictions. Model validation uses experimental data describing viscoelasticity of the dielectric elastomer Very High Bond (VHB) 4910. Since these materials are known for their broad applications in smart structures, it is important to characterize and accurately predict their behavior across a large range of time scales. Whereas integer order viscoelastic models can yield reasonable agreement with data, the model parameters often lack robustness in prediction at different deformation rates. Alternatively, fractional order models of viscoelasticity provide an alternative framework to more accurately quantify complex rate-dependent behavior. Prior research that has considered fractional order viscoelasticity lacks experimental validation and contains limited links between viscoelastic theory and fractional order derivatives. To address these issues, we use fractional order operators to experimentally validate fractional and non-fractional viscoelastic models in elastomeric solids using Bayesian uncertainty quantification. The fractional order model is found to be advantageous as predictions are significantly more accurate than integer order viscoelastic models for deformation rates spanning four orders of magnitude.
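    The fractional time derivatives underpinning such models are commonly discretized with the Grünwald-Letnikov scheme, whose weights follow a simple recursion. The sketch below shows only that generic discretization, not the paper's thermodynamically derived constitutive model or its Bayesian calibration for VHB 4910.

```python
def gl_fractional_derivative(x, alpha, dt):
    """Grünwald-Letnikov approximation of the order-alpha derivative of a
    uniformly sampled signal x: D^a x(t_n) ~ dt**(-a) * sum_k w_k * x[n-k],
    with recursive weights w_0 = 1, w_k = w_{k-1} * (1 - (alpha + 1) / k).
    For alpha = 1 this reduces to the backward difference; for alpha = 0
    it returns the signal itself."""
    n = len(x)
    w = [1.0]
    for k in range(1, n):
        w.append(w[-1] * (1.0 - (alpha + 1.0) / k))
    return [sum(w[k] * x[i - k] for k in range(i + 1)) / dt**alpha
            for i in range(n)]

# Fractional Kelvin-Voigt-type stress sigma = E*eps + eta_v*D^alpha eps
# (illustrative moduli, not identified VHB 4910 parameters)
eps = [0.0, 0.01, 0.04, 0.09]
d_eps = gl_fractional_derivative(eps, 0.5, dt=0.1)
sigma = [1.0e6 * e + 2.0e4 * de for e, de in zip(eps, d_eps)]
```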

  17. A Philosophical Perspective on Construct Validation: Application of Inductive Logic to the Analysis of Experimental Episode Construct Validity.

    ERIC Educational Resources Information Center

    Rossi, Robert Joseph

    Methods drawn from four logical theories associated with studies of inductive processes are applied to the assessment and evaluation of experimental episode construct validity. It is shown that this application provides for estimates of episode informativeness with respect to the person examined in terms of the construct and to the construct…

  18. Numerical modeling and experimental validation of thermoplastic composites induction welding

    NASA Astrophysics Data System (ADS)

    Palmieri, Barbara; Nele, Luigi; Galise, Francesco

    2018-05-01

    In this work, a numerical simulation and an experimental test of the induction welding of continuous fibre-reinforced thermoplastic composites (CFRTPCs) are presented. The thermoplastic polyamide 66 (PA66) with carbon fiber fabric was used. Using dedicated software (JMag Designer), the influence of the fundamental process parameters such as temperature, current and holding time was investigated. In order to validate the results of the simulations, and therefore the numerical model used, experimental tests were carried out; the temperature values measured during the tests with an optical pyrometer were compared with those provided by the numerical simulation. The mechanical properties of the welded joints were evaluated by single-lap shear tests.

  19. Three-dimensional shape optimization of a cemented hip stem and experimental validations.

    PubMed

    Higa, Masaru; Tanino, Hiromasa; Nishimura, Ikuya; Mitamura, Yoshinori; Matsuno, Takeo; Ito, Hiroshi

    2015-03-01

    This study proposes a novel optimized stem geometry with low stress values in the cement, using a finite element (FE) analysis combined with an optimization procedure and experimental measurements of cement stress in vitro. We first optimized an existing stem geometry using a three-dimensional FE analysis combined with a shape optimization technique. One of the most important factors in cemented stem design is to reduce stress in the cement. Hence, in the optimization study, we minimized the largest tensile principal stress in the cement mantle under a physiological loading condition by changing the stem geometry. As the next step, the optimized stem and the existing stem were manufactured to validate the usefulness of the numerical models and the results of the optimization in vitro. In the experimental study, strain gauges were embedded in the cement mantle to measure the strain in the cement mantle adjacent to the stems. The overall trend of the experimental study was in good agreement with the results of the numerical study, and we were able to reduce the largest stress by more than 50% in both shape optimization and strain gauge measurements. Thus, we could validate the usefulness of the numerical models and the results of the optimization using the experimental models. The optimization employed in this study is a useful approach for developing new stem designs.

  20. Experimental validation of ultrasonic NDE simulation software

    NASA Astrophysics Data System (ADS)

    Dib, Gerges; Larche, Michael; Diaz, Aaron A.; Crawford, Susan L.; Prowant, Matthew S.; Anderson, Michael T.

    2016-02-01

    Computer modeling and simulation is becoming an essential tool for transducer design and insight into ultrasonic nondestructive evaluation (UT-NDE). As the popularity of simulation tools for UT-NDE increases, it becomes important to assess their reliability to model acoustic responses from defects in operating components and provide information that is consistent with in-field inspection data. This includes information about the detectability of different defect types for a given UT probe. Recently, a cooperative program between the Electric Power Research Institute and the U.S. Nuclear Regulatory Commission was established to validate numerical modeling software commonly used for simulating UT-NDE of nuclear power plant components. In the first phase of this cooperative program, extensive experimental UT measurements were conducted on machined notches with varying depth, length, and orientation in stainless steel plates. Then, the notches were modeled in CIVA, a semi-analytical NDE simulation platform developed by the French Commissariat à l'Energie Atomique, and their responses compared with the experimental measurements. Discrepancies between experimental and simulation results are due either to improper inputs to the simulation model, or to incorrect approximations and assumptions in the numerical models. To address the former, a variation study was conducted on the different parameters that are required as inputs for the model, specifically the specimen and transducer properties. Then, the ability of simulations to give accurate predictions regarding the detectability of the different defects was demonstrated. This includes the results in terms of the variations in defect amplitude indications, and the ratios between tip-diffracted and specular signal amplitudes.

  1. Validation of an automated mite counter for Dermanyssus gallinae in experimental laying hen cages.

    PubMed

    Mul, Monique F; van Riel, Johan W; Meerburg, Bastiaan G; Dicke, Marcel; George, David R; Groot Koerkamp, Peter W G

    2015-08-01

    For integrated pest management (IPM) programs to be maximally effective, monitoring of the growth and decline of the pest populations is essential. Here, we present the validation results of a new automated monitoring device for the poultry red mite (Dermanyssus gallinae), a serious pest in laying hen facilities world-wide. This monitoring device (called an "automated mite counter") was validated in experimental laying hen cages with live birds and a growing population of D. gallinae. This validation study resulted in 17 data points of 'number of mites counted' by the automated mite counter and the 'number of mites present' in the experimental laying hen cages. The study demonstrated that the automated mite counter was able to track the D. gallinae population effectively. A wider evaluation showed that this automated mite counter can become a useful tool in IPM of D. gallinae in laying hen facilities.

  2. Experimental Validation of Model Updating and Damage Detection via Eigenvalue Sensitivity Methods with Artificial Boundary Conditions

    DTIC Science & Technology

    2017-09-01

    EXPERIMENTAL VALIDATION OF MODEL UPDATING AND DAMAGE DETECTION VIA EIGENVALUE SENSITIVITY METHODS WITH ARTIFICIAL BOUNDARY CONDITIONS, by Matthew D. Bouwense. Approved for public release; distribution is unlimited.

  3. Computational Fluid Dynamics Modeling of the Human Pulmonary Arteries with Experimental Validation.

    PubMed

    Bordones, Alifer D; Leroux, Matthew; Kheyfets, Vitaly O; Wu, Yu-An; Chen, Chia-Yuan; Finol, Ender A

    2018-05-21

    Pulmonary hypertension (PH) is a chronic progressive disease characterized by elevated pulmonary arterial pressure, caused by an increase in pulmonary arterial impedance. Computational fluid dynamics (CFD) can be used to identify metrics representative of the stage of PH disease. However, experimental validation of CFD models is often not pursued due to the geometric complexity of the model or uncertainties in the reproduction of the required flow conditions. The goal of this work is to validate experimentally a CFD model of a pulmonary artery phantom using a particle image velocimetry (PIV) technique. Rapid prototyping was used for the construction of the patient-specific pulmonary geometry, derived from chest computed tomography angiography images. CFD simulations were performed with the pulmonary model with a Reynolds number matching those of the experiments. Flow rates, the velocity field, and shear stress distributions obtained with the CFD simulations were compared to their counterparts from the PIV flow visualization experiments. Computationally predicted flow rates were within 1% of the experimental measurements for three of the four branches of the CFD model. The mean velocities in four transversal planes of study were within 5.9 to 13.1% of the experimental mean velocities. Shear stresses were qualitatively similar between the two methods with some discrepancies in the regions of high velocity gradients. The fluid flow differences between the CFD model and the PIV phantom are attributed to experimental inaccuracies and the relative compliance of the phantom. This comparative analysis yielded valuable information on the accuracy of CFD predicted hemodynamics in pulmonary circulation models.

  4. Experimental validation of 2D uncertainty quantification for DIC.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Reu, Phillip L.

    Because digital image correlation (DIC) has become such an important and standard tool in the toolbox of experimental mechanicists, a complete uncertainty quantification of the method is needed. It should be remembered that each DIC setup and series of images will have a unique uncertainty based on the calibration quality and the image and speckle quality of the analyzed images. Any pretest work done with a calibrated DIC stereo-rig to quantify the errors using known shapes and translations, while useful, does not necessarily reveal the uncertainty of a later test. This is particularly true with high-speed applications, where actual test images are often less than ideal. Work on the mathematical underpinnings of DIC uncertainty quantification has previously been completed and published; this paper presents the corresponding experimental work used to check the validity of the uncertainty equations.

  5. 40 CFR 761.386 - Required experimental conditions for the validation study and subsequent use during decontamination.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... the validation study and subsequent use during decontamination. 761.386 Section 761.386 Protection of... (PCBs) MANUFACTURING, PROCESSING, DISTRIBUTION IN COMMERCE, AND USE PROHIBITIONS Comparison Study for... experimental conditions for the validation study and subsequent use during decontamination. The following...

  6. 40 CFR 761.386 - Required experimental conditions for the validation study and subsequent use during decontamination.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... the validation study and subsequent use during decontamination. 761.386 Section 761.386 Protection of... (PCBs) MANUFACTURING, PROCESSING, DISTRIBUTION IN COMMERCE, AND USE PROHIBITIONS Comparison Study for... experimental conditions for the validation study and subsequent use during decontamination. The following...

  7. Relationship of otolith strontium-to-calcium ratios and salinity: Experimental validation for juvenile salmonids

    USGS Publications Warehouse

    Zimmerman, C.E.

    2005-01-01

    Analysis of otolith strontium (Sr) or strontium-to-calcium (Sr:Ca) ratios provides a powerful tool to reconstruct the chronology of migration among salinity environments for diadromous salmonids. Although use of this method has been validated by examination of known individuals and translocation experiments, it has never been validated under controlled experimental conditions. In this study, incorporation of otolith Sr was tested across a range of salinities and resulting levels of ambient Sr and Ca concentrations in juvenile chinook salmon (Oncorhynchus tshawytscha), coho salmon (Oncorhynchus kisutch), sockeye salmon (Oncorhynchus nerka), rainbow trout (Oncorhynchus mykiss), and Arctic char (Salvelinus alpinus). Experimental water was mixed, using stream water and seawater as end members, to create experimental salinities of 0.1, 6.3, 12.7, 18.6, 25.5, and 33.0 psu. Otolith Sr and Sr:Ca ratios were significantly related to salinity for all species (r² range: 0.80-0.91) but provide only enough predictive resolution to discriminate among freshwater, brackish-water, and saltwater residency. These results validate the use of otolith Sr:Ca ratios to broadly discriminate salinity histories encountered by salmonids but highlight the need for further research concerning the influence of osmoregulation and physiological changes associated with smolting on otolith microchemistry.
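    The kind of calibration this study performs, regressing salinity against otolith Sr:Ca and then reading back only a coarse residency class, can be sketched with ordinary least squares. The salinity levels below are the study's treatment values, but the Sr:Ca ratios are invented for illustration, as is the classification cut-off scheme.

```python
def fit_line(x, y):
    """Ordinary least-squares slope and intercept (here: salinity vs Sr:Ca)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
             / sum((xi - mx) ** 2 for xi in x))
    return slope, my - slope * mx

def classify(salinity):
    """Coarse residency class; the study found otolith Sr:Ca resolves only
    fresh vs. brackish vs. salt water (boundaries here are illustrative)."""
    if salinity < 0.5:
        return 'fresh'
    return 'brackish' if salinity < 30.0 else 'salt'

salinities = [0.1, 6.3, 12.7, 18.6, 25.5, 33.0]  # psu, study treatments
sr_ca = [0.5, 1.1, 1.8, 2.4, 3.0, 3.7]           # illustrative ratios only
slope, intercept = fit_line(sr_ca, salinities)
```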

  8. Experimental validation of clock synchronization algorithms

    NASA Technical Reports Server (NTRS)

    Palumbo, Daniel L.; Graham, R. Lynn

    1992-01-01

    The objective of this work is to validate mathematically derived clock synchronization theories and their associated algorithms through experiment. Two theories are considered, the Interactive Convergence Clock Synchronization Algorithm and the Midpoint Algorithm. Special clock circuitry was designed and built so that several operating conditions and failure modes (including malicious failures) could be tested. Both theories are shown to predict conservative upper bounds (i.e., measured values of clock skew were always less than the theory prediction). Insight gained during experimentation led to alternative derivations of the theories. These new theories accurately predict the behavior of the clock system. It is found that a 100 percent penalty is paid to tolerate worst-case failures. It is also shown that under optimal conditions (with minimum error and no failures) the clock skew can be as much as three clock ticks. Clock skew grows to six clock ticks when failures are present. Finally, it is concluded that one cannot rely solely on test procedures or theoretical analysis to predict worst-case conditions.
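    Of the two algorithms studied, the Midpoint Algorithm is the simpler to sketch: each clock sorts the readings it obtains of the other clocks, discards the f largest and f smallest (which may come from faulty or malicious clocks), and adjusts toward the midpoint of the surviving extremes. The sketch below shows only that decision rule under invented readings, not the special clock circuitry or failure modes of the experiment.

```python
def midpoint_correction(readings, f):
    """Fault-tolerant midpoint rule: drop the f largest and f smallest
    clock readings, then return the midpoint of the surviving extremes.
    Tolerates up to f arbitrarily faulty clocks given enough readings."""
    s = sorted(readings)
    trimmed = s[f:len(s) - f]
    return (trimmed[0] + trimmed[-1]) / 2.0

# Four clocks, one malicious outlier; tolerate f = 1 fault.
readings = [100.0, 101.0, 102.0, 250.0]
target = midpoint_correction(readings, f=1)
```

    Trimming before taking the midpoint is what bounds the influence of a malicious clock: the surviving extremes are always readings from, or bracketed by, correct clocks.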

  9. Modeling and Experimental Validation for 3D mm-wave Radar Imaging

    NASA Astrophysics Data System (ADS)

    Ghazi, Galia

    As the problem of identifying suicide bombers wearing explosives concealed under clothing becomes increasingly important, it becomes essential to detect suspicious individuals at a distance. Systems which employ multiple sensors to determine the presence of explosives on people are being developed. Their functions include observing and following individuals with intelligent video, identifying explosives residues or heat signatures on the outer surface of their clothing, and characterizing explosives using penetrating X-rays, terahertz waves, neutron analysis, or nuclear quadrupole resonance. At present, mm-wave radar is the only modality that can both penetrate and sense beneath clothing at a distance of 2 to 50 meters without causing physical harm. Unfortunately, current mm-wave radar systems capable of performing high-resolution, real-time imaging require using arrays with a large number of transmitting and receiving modules; therefore, these systems present undesired large size, weight and power consumption, as well as extremely complex hardware architecture. The overarching goal of this thesis is the development and experimental validation of a next generation inexpensive, high-resolution radar system that can distinguish security threats hidden on individuals located at 2-10 meters range. 
In pursuit of this goal, this thesis proposes the following contributions: (1) Development and experimental validation of a new current-based, high-frequency computational method to model large scattering problems (hundreds of wavelengths) involving lossy, penetrable and multi-layered dielectric and conductive structures, which is needed for an accurate characterization of the wave-matter interaction and EM scattering in the target region; (2) Development of combined Norm-1, Norm-2 regularized imaging algorithms, which are needed for enhancing the resolution of the images while using a minimum number of transmitting and receiving antennas; (3) Implementation and experimental

  10. Validation of the Soil Moisture Active Passive mission using USDA-ARS experimental watersheds

    USDA-ARS?s Scientific Manuscript database

    The calibration and validation program of the Soil Moisture Active Passive mission (SMAP) relies upon an international cooperative of in situ networks to provide ground truth references across a variety of landscapes. The USDA Agricultural Research Service operates several experimental watersheds wh...

  11. Experimental Validation and Combustion Modeling of a JP-8 Surrogate in a Single Cylinder Diesel Engine

    DTIC Science & Technology

    2014-04-15

    EXPERIMENTAL VALIDATION AND COMBUSTION MODELING OF A JP-8 SURROGATE IN A SINGLE CYLINDER DIESEL ENGINE, by Amit Shrestha, Umashankar Joshi, Ziliang Zheng, Tamer Badawy, Naeim A. Henein, Wayne State University, Detroit, MI, USA (report date 13-03-2014). The work validates a two-component JP-8 surrogate in a single cylinder diesel engine; validation parameters include ignition delay.

  12. Recent Advances in Simulation of Eddy Current Testing of Tubes and Experimental Validations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Reboud, C.; Premel, D.; Lesselier, D.

    2007-03-21

    Eddy current testing (ECT) is widely used in iron and steel industry for the inspection of tubes during manufacturing. A collaboration between CEA and the Vallourec Research Center led to the development of new numerical functionalities dedicated to the simulation of ECT of non-magnetic tubes by external probes. The achievement of experimental validations led us to the integration of these models into the CIVA platform. Modeling approach and validation results are discussed here. A new numerical scheme is also proposed in order to improve the accuracy of the model.

  13. Recent Advances in Simulation of Eddy Current Testing of Tubes and Experimental Validations

    NASA Astrophysics Data System (ADS)

    Reboud, C.; Prémel, D.; Lesselier, D.; Bisiaux, B.

    2007-03-01

    Eddy current testing (ECT) is widely used in iron and steel industry for the inspection of tubes during manufacturing. A collaboration between CEA and the Vallourec Research Center led to the development of new numerical functionalities dedicated to the simulation of ECT of non-magnetic tubes by external probes. The achievement of experimental validations led us to the integration of these models into the CIVA platform. Modeling approach and validation results are discussed here. A new numerical scheme is also proposed in order to improve the accuracy of the model.

  14. Modeling and experimental validation of a Hybridized Energy Storage System for automotive applications

    NASA Astrophysics Data System (ADS)

    Fiorenti, Simone; Guanetti, Jacopo; Guezennec, Yann; Onori, Simona

    2013-11-01

    This paper presents the development and experimental validation of a dynamic model of a Hybridized Energy Storage System (HESS) consisting of a parallel connection of a lead acid (PbA) battery and double layer capacitors (DLCs), for automotive applications. The dynamic modeling of both the PbA battery and the DLC has been tackled via the equivalent electric circuit based approach. Experimental tests are designed for identification purposes. Parameters of the PbA battery model are identified as a function of state of charge and current direction, whereas parameters of the DLC model are identified for different temperatures. A physical HESS has been assembled at the Center for Automotive Research The Ohio State University and used as a test-bench to validate the model against a typical current profile generated for Start&Stop applications. The HESS model is then integrated into a vehicle simulator to assess the effects of the battery hybridization on the vehicle fuel economy and mitigation of the battery stress.
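    The equivalent-electric-circuit approach mentioned in the abstract can be illustrated with the simplest possible HESS: an ideal battery source E_b with series resistance R_b in parallel with a double-layer capacitor (capacitance C, series resistance R_c) feeding a shared load current. At each step the load current splits so both branches see the same terminal voltage. The parameter values below are illustrative round numbers, not the parameters identified on the paper's test bench, and the model omits the state-of-charge and temperature dependence the paper identifies.

```python
def simulate_hess(i_load, dt, E_b=12.0, R_b=0.05, C=100.0, R_c=0.01, v_c0=12.0):
    """Toy HESS equivalent circuit: ideal battery (E_b, series R_b) in
    parallel with a DLC (C, series R_c). Equal terminal voltage gives the
    branch split i_c = (v_c - E_b + R_b*i) / (R_b + R_c); the DLC state
    then integrates its branch current. Returns (i_batt, i_dlc, v_c) per step."""
    v_c, log = v_c0, []
    for i in i_load:
        i_c = (v_c - E_b + R_b * i) / (R_b + R_c)  # DLC branch current
        i_b = i - i_c                              # battery branch current
        v_c -= i_c * dt / C                        # DLC voltage update
        log.append((i_b, i_c, v_c))
    return log

# 1 s pulse of 200 A (Start&Stop-like event), 10 ms steps: the low-impedance
# DLC initially absorbs most of the pulse, relieving the PbA battery.
log = simulate_hess([200.0] * 100, dt=0.01)
```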

  15. Experimental validation of prototype high voltage bushing

    NASA Astrophysics Data System (ADS)

    Shah, Sejal; Tyagi, H.; Sharma, D.; Parmar, D.; M. N., Vishnudev; Joshi, K.; Patel, K.; Yadav, A.; Patel, R.; Bandyopadhyay, M.; Rotti, C.; Chakraborty, A.

    2017-08-01

The Prototype High Voltage Bushing (PHVB) is a scaled-down configuration of the DNB High Voltage Bushing (HVB) of ITER. It is designed for operation at 50 kV DC to verify operational performance and thereby confirm the design configuration of the DNB HVB. Two concentric insulators, viz. ceramic and fiber-reinforced polymer (FRP) rings, serve as a double-layered vacuum boundary providing 50 kV isolation between the grounded and high-voltage flanges. Stress shields are designed for a smooth electric field distribution. During ceramic-to-Kovar brazing, spilling cannot be controlled, which may lead to high localized electrostatic stress. To understand the spilling phenomenon and calculate stresses precisely, quantitative analysis was performed using Scanning Electron Microscopy (SEM) of a brazed sample, and a similar configuration was modeled in the Finite Element (FE) analysis. FE analysis of the PHVB was performed to determine the electrical stresses in different areas of the PHVB, which are kept similar to those of the DNB HV Bushing. With this configuration, the experiment was performed under ITER-like vacuum and electrical parameters. An initial HV test was performed with temporary vacuum sealing arrangements using gaskets/O-rings at both ends in order to achieve the desired vacuum and keep the system maintainable. During the validation test, a 50 kV voltage withstand was maintained for one hour. A voltage withstand test at 60 kV DC (20% above the rated voltage) has also been performed without any breakdown. Successful operation of the PHVB confirms the design of the DNB HV Bushing. In this paper, the configuration of the PHVB is presented together with experimental validation data.

  16. A ferrofluid based energy harvester: Computational modeling, analysis, and experimental validation

    NASA Astrophysics Data System (ADS)

    Liu, Qi; Alazemi, Saad F.; Daqaq, Mohammed F.; Li, Gang

    2018-03-01

    A computational model is described and implemented in this work to analyze the performance of a ferrofluid based electromagnetic energy harvester. The energy harvester converts ambient vibratory energy into an electromotive force through a sloshing motion of a ferrofluid. The computational model solves the coupled Maxwell's equations and Navier-Stokes equations for the dynamic behavior of the magnetic field and fluid motion. The model is validated against experimental results for eight different configurations of the system. The validated model is then employed to study the underlying mechanisms that determine the electromotive force of the energy harvester. Furthermore, computational analysis is performed to test the effect of several modeling aspects, such as three-dimensional effect, surface tension, and type of the ferrofluid-magnetic field coupling on the accuracy of the model prediction.

  17. Bayesian truthing and experimental validation in homeland security and defense

    NASA Astrophysics Data System (ADS)

    Jannson, Tomasz; Forrester, Thomas; Wang, Wenjian; Kostrzewski, Andrew; Pradhan, Ranjit

    2014-05-01

In this paper we discuss the relations between Bayesian Truthing (experimental validation), Bayesian statistics, and binary sensing in the context of selected Homeland Security and Intelligence, Surveillance, Reconnaissance (ISR) optical and nonoptical application scenarios. The basic Figure of Merit (FoM) is the Positive Predictive Value (PPV), together with the false-positive and false-negative rates. By using these simple binary statistics, we can analyze, classify, and evaluate a broad variety of events including: ISR; natural disasters; QC; and terrorism-related, GIS-related, law-enforcement-related, and other C3I events.
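The binary statistics named above reduce to simple ratios over confusion-matrix counts. A minimal sketch, with hypothetical counts that are not from the paper:

```python
def binary_metrics(tp, fp, tn, fn):
    """Basic binary-detection figures of merit from confusion-matrix counts."""
    ppv = tp / (tp + fp)      # Positive Predictive Value: P(real event | alarm)
    fpr = fp / (fp + tn)      # false-positive rate
    fnr = fn / (fn + tp)      # false-negative rate
    return ppv, fpr, fnr

# Hypothetical screening run: 90 hits, 10 false alarms, 880 correct rejections, 20 misses
ppv, fpr, fnr = binary_metrics(90, 10, 880, 20)
print(round(ppv, 2), round(fpr, 4), round(fnr, 4))  # 0.9 0.0112 0.1818
```

Note that PPV depends on event prevalence as well as detector quality, which is why it is a useful complement to the raw false-positive and false-negative rates.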

  18. OECD-NEA Expert Group on Multi-Physics Experimental Data, Benchmarks and Validation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Valentine, Timothy; Rohatgi, Upendra S.

High-fidelity, multi-physics modeling and simulation (M&S) tools are being developed and utilized for a variety of applications in nuclear science and technology and show great promise in their abilities to reproduce observed phenomena for many applications. Even with the increasing fidelity and sophistication of coupled multi-physics M&S tools, the underpinning models and data still need to be validated against experiments that may require a more complex array of validation data because of the great breadth of the time, energy and spatial domains of the physical phenomena that are being simulated. The Expert Group on Multi-Physics Experimental Data, Benchmarks and Validation (MPEBV) of the Nuclear Energy Agency (NEA) of the Organization for Economic Cooperation and Development (OECD) was formed to address the challenges with the validation of such tools. The work of the MPEBV expert group is shared among three task forces to fulfill its mandate, and specific exercises are being developed to demonstrate validation principles for common industrial challenges. This paper describes the overall mission of the group, the specific objectives of the task forces, the linkages among the task forces, and the development of a validation exercise that focuses on a specific reactor challenge problem.

  19. Experimental Validation of Displacement Underestimation in ARFI Ultrasound

    PubMed Central

    Czernuszewicz, Tomasz J.; Streeter, Jason E.; Dayton, Paul A.; Gallippi, Caterina M.

    2014-01-01

    Acoustic radiation force impulse (ARFI) imaging is an elastography technique that uses ultrasonic pulses to both displace and track tissue motion. Previous modeling studies have shown that ARFI displacements are susceptible to underestimation due to lateral and elevational shearing that occurs within the tracking resolution cell. In this study, optical tracking was utilized to experimentally measure the displacement underestimation achieved by acoustic tracking using a clinical ultrasound system. Three optically translucent phantoms of varying stiffness were created, embedded with sub-wavelength diameter microspheres, and ARFI excitation pulses with F/1.5 or F/3 lateral focal configurations were transmitted from a standard linear array to induce phantom motion. Displacements were tracked using confocal optical and acoustic methods. As predicted by earlier FEM studies, significant acoustic displacement underestimation was observed for both excitation focal configurations; the maximum underestimation error was 35% of the optically measured displacement for the F/1.5 excitation pulse in the softest phantom. Using higher F/#, less tightly focused beams in the lateral dimension improved accuracy of displacements by approximately 10 percentage points. This work experimentally demonstrates limitations of ARFI implemented on a clinical scanner using a standard linear array and sets up a framework for future displacement tracking validation studies. PMID:23858054
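The underestimation error reported above is simply the shortfall of the acoustic estimate relative to the optical ground truth. A one-line sketch with hypothetical paired displacements (the values are illustrative, not measured data from the study):

```python
def underestimation_pct(optical_um, acoustic_um):
    """Percent by which acoustic tracking underestimates the optical displacement."""
    return 100.0 * (optical_um - acoustic_um) / optical_um

# Hypothetical paired measurement (micrometers): optical truth vs. acoustic estimate
print(underestimation_pct(10.0, 6.5))  # 35.0
```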

  20. Computational design and experimental validation of new thermal barrier systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Guo, Shengmin

    2015-03-31

The focus of this project is on the development of a reliable and efficient ab initio based computational high-temperature material design method which can be used to assist Thermal Barrier Coating (TBC) bond-coat and top-coat design. Experimental evaluations of the new TBCs are conducted to confirm their properties. Southern University is the subcontractor on this project, with a focus on computational simulation method development. We have applied the ab initio density functional theory (DFT) method and molecular dynamics simulations to screen top coats and bond coats for gas turbine thermal barrier coating design and validation applications. For experimental validations, our focus is on the hot corrosion performance of different TBC systems. For example, for one of the top coatings studied, we examined the thermal stability of TaZr2.75O8 and confirmed its hot corrosion performance.

  1. Experimental validation of boundary element methods for noise prediction

    NASA Technical Reports Server (NTRS)

    Seybert, A. F.; Oswald, Fred B.

    1992-01-01

    Experimental validation of methods to predict radiated noise is presented. A combined finite element and boundary element model was used to predict the vibration and noise of a rectangular box excited by a mechanical shaker. The predicted noise was compared to sound power measured by the acoustic intensity method. Inaccuracies in the finite element model shifted the resonance frequencies by about 5 percent. The predicted and measured sound power levels agree within about 2.5 dB. In a second experiment, measured vibration data was used with a boundary element model to predict noise radiation from the top of an operating gearbox. The predicted and measured sound power for the gearbox agree within about 3 dB.

  2. Computerized Planning of Cryosurgery Using Bubble Packing: An Experimental Validation on a Phantom Material

    PubMed Central

    Rossi, Michael R.; Tanaka, Daigo; Shimada, Kenji; Rabin, Yoed

    2009-01-01

The current study focuses on experimentally validating a planning scheme based on the so-called bubble-packing method. This study is part of an ongoing effort to develop computerized planning tools for cryosurgery, in which bubble packing was previously developed as a means to find an initial, uniform distribution of cryoprobes within a given domain; the so-called force-field analogy was then used to move the cryoprobes to their optimum layout. However, given the high quality of the cryoprobe distribution suggested by bubble packing and its low computational cost, it has been argued that a planning scheme based solely on bubble packing may be more clinically relevant. To test this argument, an experimental validation is performed on a simulated cross-section of the prostate, using gelatin solution as a phantom material, proprietary liquid-nitrogen-based cryoprobes, and a cryoheater to simulate urethral warming. Experimental results are compared with numerically simulated temperature histories resulting from planning. Results indicate an average disagreement of 0.8 mm in identifying the freezing-front location, which is an acceptable level of uncertainty in the context of prostate cryosurgery imaging. PMID:19885373

  3. An experimental validation of genomic selection in octoploid strawberry

    PubMed Central

    Gezan, Salvador A; Osorio, Luis F; Verma, Sujeet; Whitaker, Vance M

    2017-01-01

The primary goal of genomic selection is to increase genetic gains for complex traits by predicting performance of individuals for which phenotypic data are not available. The objective of this study was to experimentally evaluate the potential of genomic selection in strawberry breeding and to define a strategy for its implementation. Four clonally replicated field trials, two in each of two years and comprising a total of 1628 individuals, were established in 2013–2014 and 2014–2015. Five complex yield and fruit quality traits with moderate to low heritability were assessed in each trial. High-density genotyping was performed with the Affymetrix Axiom IStraw90 single-nucleotide polymorphism array, and 17 479 polymorphic markers were chosen for analysis. Several methods were compared, including Genomic BLUP, Bayes B, Bayes C, Bayesian LASSO Regression, Bayesian Ridge Regression and Reproducing Kernel Hilbert Spaces. Cross-validation within training populations resulted in higher values than true validations across trials. For true validations, Bayes B gave the highest predictive abilities on average and also the highest selection efficiencies, particularly for the yield traits, which had the lowest heritabilities. Selection efficiencies using Bayes B for parent selection ranged from 74% for average fruit weight to 34% for early marketable yield. A breeding strategy is proposed in which advanced selection trials are utilized as training populations and in which genomic selection can reduce the breeding cycle from 3 to 2 years for a subset of untested parents based on their predicted genomic breeding values. PMID:28090334
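Predictive ability in studies like this is typically the correlation between genomic predictions and observed phenotypes in a held-out validation set. As a hedged illustration only, the sketch below uses ridge regression on synthetic genotypes as a crude stand-in for Genomic BLUP; none of the data, dimensions, or parameters come from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 200, 500                                    # individuals, markers
X = rng.integers(0, 3, size=(n, m)).astype(float)  # 0/1/2 genotype codes
beta = rng.normal(0.0, 0.1, m)                     # simulated marker effects
y = X @ beta + rng.normal(0.0, 1.0, n)             # phenotype = genetic value + noise

train, test = np.arange(150), np.arange(150, 200)  # training vs. validation split
lam = 10.0                                         # ridge penalty (shrinkage, as in GBLUP)
A = X[train].T @ X[train] + lam * np.eye(m)
b_hat = np.linalg.solve(A, X[train].T @ y[train])  # shrunken marker-effect estimates

pred = X[test] @ b_hat                             # genomic predictions for untested lines
r = np.corrcoef(pred, y[test])[0, 1]               # predictive ability
print(0.0 < r < 1.0)
```

The same correlation computed within the training set would be optimistically inflated, which mirrors the paper's observation that cross-validation within training populations gave higher values than true validations across trials.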

  4. Thermal conductivity of microporous layers: Analytical modeling and experimental validation

    NASA Astrophysics Data System (ADS)

    Andisheh-Tadbir, Mehdi; Kjeang, Erik; Bahrami, Majid

    2015-11-01

A new compact relationship is developed for the thermal conductivity of the microporous layer (MPL) used in polymer electrolyte fuel cells as a function of pore size distribution, porosity, and compression pressure. The proposed model is successfully validated against experimental data obtained from a transient plane source thermal constants analyzer. The thermal conductivities of carbon paper samples with and without MPL were measured as a function of load (1-6 bars), and the MPL thermal conductivity was found to be between 0.13 and 0.17 W m-1 K-1. The proposed analytical model predicts the experimental thermal conductivities within 5%. A correlation generated from the analytical model was used in a multi-objective genetic algorithm to predict the pore size distribution and porosity for an MPL with optimized thermal conductivity and mass diffusivity. The results suggest that an optimized MPL, in terms of heat and mass transfer coefficients, has an average pore size of 122 nm and 63% porosity.
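The "within 5%" claim above is a relative-error check of model predictions against measurements. A minimal sketch, with hypothetical conductivity values chosen inside the reported 0.13-0.17 W m-1 K-1 range (not the paper's actual data):

```python
def within_tolerance(model, measured, tol=0.05):
    """True if every model prediction is within a relative tolerance of its measurement."""
    return all(abs(p - m) / m <= tol for p, m in zip(model, measured))

# Hypothetical MPL thermal conductivities (W m^-1 K^-1) at increasing compression loads
measured = [0.130, 0.140, 0.155, 0.170]
model    = [0.133, 0.138, 0.152, 0.168]
print(within_tolerance(model, measured))  # True
```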

  5. Experimental validation of finite element and boundary element methods for predicting structural vibration and radiated noise

    NASA Technical Reports Server (NTRS)

    Seybert, A. F.; Wu, T. W.; Wu, X. F.

    1994-01-01

    This research report is presented in three parts. In the first part, acoustical analyses were performed on modes of vibration of the housing of a transmission of a gear test rig developed by NASA. The modes of vibration of the transmission housing were measured using experimental modal analysis. The boundary element method (BEM) was used to calculate the sound pressure and sound intensity on the surface of the housing and the radiation efficiency of each mode. The radiation efficiency of each of the transmission housing modes was then compared to theoretical results for a finite baffled plate. In the second part, analytical and experimental validation of methods to predict structural vibration and radiated noise are presented. A rectangular box excited by a mechanical shaker was used as a vibrating structure. Combined finite element method (FEM) and boundary element method (BEM) models of the apparatus were used to predict the noise level radiated from the box. The FEM was used to predict the vibration, while the BEM was used to predict the sound intensity and total radiated sound power using surface vibration as the input data. Vibration predicted by the FEM model was validated by experimental modal analysis; noise predicted by the BEM was validated by measurements of sound intensity. Three types of results are presented for the total radiated sound power: sound power predicted by the BEM model using vibration data measured on the surface of the box; sound power predicted by the FEM/BEM model; and sound power measured by an acoustic intensity scan. In the third part, the structure used in part two was modified. A rib was attached to the top plate of the structure. The FEM and BEM were then used to predict structural vibration and radiated noise respectively. The predicted vibration and radiated noise were then validated through experimentation.

  6. Experimental Validation of ARFI Surveillance of Subcutaneous Hemorrhage (ASSH) Using Calibrated Infusions in a Tissue-Mimicking Model and Dogs.

    PubMed

    Geist, Rebecca E; DuBois, Chase H; Nichols, Timothy C; Caughey, Melissa C; Merricks, Elizabeth P; Raymer, Robin; Gallippi, Caterina M

    2016-09-01

    Acoustic radiation force impulse (ARFI) Surveillance of Subcutaneous Hemorrhage (ASSH) has been previously demonstrated to differentiate bleeding phenotype and responses to therapy in dogs and humans, but to date, the method has lacked experimental validation. This work explores experimental validation of ASSH in a poroelastic tissue-mimic and in vivo in dogs. The experimental design exploits calibrated flow rates and infusion durations of evaporated milk in tofu or heparinized autologous blood in dogs. The validation approach enables controlled comparisons of ASSH-derived bleeding rate (BR) and time to hemostasis (TTH) metrics. In tissue-mimicking experiments, halving the calibrated flow rate yielded ASSH-derived BRs that decreased by 44% to 48%. Furthermore, for calibrated flow durations of 5.0 minutes and 7.0 minutes, average ASSH-derived TTH was 5.2 minutes and 7.0 minutes, respectively, with ASSH predicting the correct TTH in 78% of trials. In dogs undergoing calibrated autologous blood infusion, ASSH measured a 3-minute increase in TTH, corresponding to the same increase in the calibrated flow duration. For a measured 5% decrease in autologous infusion flow rate, ASSH detected a 7% decrease in BR. These tissue-mimicking and in vivo preclinical experimental validation studies suggest the ASSH BR and TTH measures reflect bleeding dynamics. © The Author(s) 2015.

  7. Structurally compliant rocket engine combustion chamber: Experimental and analytical validation

    NASA Technical Reports Server (NTRS)

    Jankovsky, Robert S.; Arya, Vinod K.; Kazaroff, John M.; Halford, Gary R.

    1994-01-01

    A new, structurally compliant rocket engine combustion chamber design has been validated through analysis and experiment. Subscale, tubular channel chambers have been cyclically tested and analytically evaluated. Cyclic lives were determined to have a potential for 1000 percent increase over those of rectangular channel designs, the current state of the art. Greater structural compliance in the circumferential direction gave rise to lower thermal strains during hot firing, resulting in lower thermal strain ratcheting and longer predicted fatigue lives. Thermal, structural, and durability analyses of the combustion chamber design, involving cyclic temperatures, strains, and low-cycle fatigue lives, have corroborated the experimental observations.

  8. Experimental validation of Monte Carlo (MANTIS) simulated x-ray response of columnar CsI scintillator screens

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Freed, Melanie; Miller, Stuart; Tang, Katherine

Purpose: MANTIS is a Monte Carlo code developed for the detailed simulation of columnar CsI scintillator screens in x-ray imaging systems. Validation of this code is needed to provide a reliable and valuable tool for system optimization and accurate reconstructions for a variety of x-ray applications. Whereas previous validation efforts have focused on matching of summary statistics, in this work the authors examine the complete point response function (PRF) of the detector system in addition to relative light output values. Methods: Relative light output values and high-resolution PRFs have been experimentally measured with a custom setup. A corresponding set of simulated light output values and PRFs have also been produced, where detailed knowledge of the experimental setup and CsI:Tl screen structures are accounted for in the simulations. Four different screens were investigated with different thicknesses, column tilt angles, and substrate types. A quantitative comparison between the experimental and simulated PRFs was performed for four different incidence angles (0 deg., 15 deg., 30 deg., and 45 deg.) and two different x-ray spectra (40 and 70 kVp). The figure of merit (FOM) used measures the normalized differences between the simulated and experimental data averaged over a region of interest. Results: Experimental relative light output values ranged from 1.456 to 1.650 and were in approximate agreement for aluminum substrates, but poor agreement for graphite substrates. The FOMs for all screen types, incidence angles, and energies ranged from 0.1929 to 0.4775. To put these FOMs in context, the same FOM was computed for 2D symmetric Gaussians fit to the same experimental data. These FOMs ranged from 0.2068 to 0.8029. Our analysis demonstrates that MANTIS reproduces experimental PRFs with higher accuracy than a symmetric 2D Gaussian fit to the experimental data in the majority of cases. Examination of the spatial distribution of differences between the

  9. Computational Modelling of Patella Femoral Kinematics During Gait Cycle and Experimental Validation

    NASA Astrophysics Data System (ADS)

    Maiti, Raman

    2016-06-01

The effect of loading and boundary conditions on patellar mechanics is significant due to the complications arising in patella femoral joints during total knee replacements. To understand the patellar mechanics with respect to loading and motion, a computational model representing the patella femoral joint was developed and validated against experimental results. The computational model was created in IDEAS NX and simulated in MSC ADAMS/VIEW software. The results obtained in the form of internal-external rotations and anterior-posterior displacements for a new and experimentally simulated specimen for the patella femoral joint under standard gait condition were compared with experimental measurements performed on the Leeds ProSim knee simulator. A good overall agreement between the computational prediction and the experimental data was obtained for patella femoral kinematics. Good agreement between the model and past studies was observed when the ligament load was removed and the medial-lateral displacement was constrained. The model is sensitive to ±5% change in kinematics, frictional, force and stiffness coefficients and insensitive to the time step.

  10. Computational Modelling of Patella Femoral Kinematics During Gait Cycle and Experimental Validation

    NASA Astrophysics Data System (ADS)

    Maiti, Raman

    2018-06-01

The effect of loading and boundary conditions on patellar mechanics is significant due to the complications arising in patella femoral joints during total knee replacements. To understand the patellar mechanics with respect to loading and motion, a computational model representing the patella femoral joint was developed and validated against experimental results. The computational model was created in IDEAS NX and simulated in MSC ADAMS/VIEW software. The results obtained in the form of internal-external rotations and anterior-posterior displacements for a new and experimentally simulated specimen for the patella femoral joint under standard gait condition were compared with experimental measurements performed on the Leeds ProSim knee simulator. A good overall agreement between the computational prediction and the experimental data was obtained for patella femoral kinematics. Good agreement between the model and past studies was observed when the ligament load was removed and the medial-lateral displacement was constrained. The model is sensitive to ±5% change in kinematics, frictional, force and stiffness coefficients and insensitive to the time step.

  11. Experimentally validated quantitative linear model for the device physics of elastomeric microfluidic valves

    NASA Astrophysics Data System (ADS)

    Kartalov, Emil P.; Scherer, Axel; Quake, Stephen R.; Taylor, Clive R.; Anderson, W. French

    2007-03-01

A systematic experimental study and theoretical modeling of the device physics of polydimethylsiloxane "pushdown" microfluidic valves are presented. The phase space is charted by 1587 dimension combinations and encompasses 45–295 μm lateral dimensions, 16–39 μm membrane thickness, and 1–28 psi closing pressure. Three linear models are developed and tested against the empirical data, and then combined into a fourth-power-polynomial superposition. The experimentally validated final model offers a useful quantitative prediction for a valve's properties as a function of its dimensions. Typical valves (80–150 μm width) are shown to behave like thin springs.
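A superposition of simple terms like the one described can be fit to calibration data by least squares. The sketch below is purely illustrative: the data points and the choice of a (t/w)^4 feature are assumptions for demonstration, not the authors' actual model or measurements:

```python
import numpy as np

# Hypothetical calibration data: valve width w (um), membrane thickness t (um),
# and closing pressure p (psi), loosely spanning the ranges quoted above.
w = np.array([80.0, 100.0, 120.0, 150.0])
t = np.array([16.0, 20.0, 28.0, 39.0])
p = np.array([4.0, 7.0, 15.0, 28.0])

# Illustrative design matrix: constant, two linear terms, and one fourth-power term.
X = np.column_stack([np.ones_like(w), w, t, (t / w) ** 4])
coef, *_ = np.linalg.lstsq(X, p, rcond=None)  # least-squares coefficients

pred = X @ coef
print(bool(np.allclose(pred, p)))  # four points, four parameters: exact fit
```

With more calibration points than parameters, the same call returns the least-squares compromise rather than an exact fit, and the residual indicates how well the chosen terms capture the physics.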

  12. Parametric model of servo-hydraulic actuator coupled with a nonlinear system: Experimental validation

    NASA Astrophysics Data System (ADS)

    Maghareh, Amin; Silva, Christian E.; Dyke, Shirley J.

    2018-05-01

    Hydraulic actuators play a key role in experimental structural dynamics. In a previous study, a physics-based model for a servo-hydraulic actuator coupled with a nonlinear physical system was developed. Later, this dynamical model was transformed into controllable canonical form for position tracking control purposes. For this study, a nonlinear device is designed and fabricated to exhibit various nonlinear force-displacement profiles depending on the initial condition and the type of materials used as replaceable coupons. Using this nonlinear system, the controllable canonical dynamical model is experimentally validated for a servo-hydraulic actuator coupled with a nonlinear physical system.

  13. Experimental validation of finite element modelling of a modular metal-on-polyethylene total hip replacement.

    PubMed

    Hua, Xijin; Wang, Ling; Al-Hajjar, Mazen; Jin, Zhongmin; Wilcox, Ruth K; Fisher, John

    2014-07-01

Finite element models are becoming increasingly useful tools to conduct parametric analysis, design optimisation and pre-clinical testing for hip joint replacements. However, verification of the finite element model is critically important. The purposes of this study were to develop a three-dimensional anatomic finite element model of a modular metal-on-polyethylene total hip replacement for predicting its contact mechanics and to conduct experimental validation of a simple finite element model simplified from the anatomic model. An anatomic modular metal-on-polyethylene total hip replacement model (anatomic model) was first developed and then simplified, with reasonable accuracy, to a simple modular total hip replacement model (simplified model) for validation. The contact areas on the articulating surfaces of three polyethylene liners of modular metal-on-polyethylene total hip replacement bearings with different clearances were measured experimentally in the Leeds ProSim hip joint simulator under a series of loading conditions and different cup inclination angles. The contact areas predicted from the simplified model were then compared with those measured experimentally under the same conditions. The results showed that the simplification made to the anatomic model did not change the predictions of contact mechanics of the modular metal-on-polyethylene total hip replacement substantially (less than 12% for contact stresses and contact areas). Good agreement of contact areas between the finite element predictions from the simplified model and the experimental measurements was obtained, with a maximum difference of 14% across all conditions considered. This indicated that the simplification and assumptions made in the anatomic model were reasonable and the finite element predictions from the simplified model were valid. © IMechE 2014.

  14. Two-Phase Flow Model and Experimental Validation for Bubble Augmented Waterjet Propulsion Nozzle

    NASA Astrophysics Data System (ADS)

    Choi, J.-K.; Hsiao, C.-T.; Wu, X.; Singh, S.; Jayaprakash, A.; Chahine, G.

    2011-11-01

The concept of thrust augmentation through bubble injection into a waterjet has been the subject of many patents and publications over the past several decades, and there is simplified computational and experimental evidence of thrust increase. In this work, we present more rigorous numerical and experimental studies aimed at investigating two-phase waterjet propulsion systems. The numerical model is based on a Lagrangian-Eulerian method, which considers the bubbly mixture flow both at the microscopic level, where individual bubble dynamics are tracked, and at the macroscopic level, where bubbles are collectively described by the local void fraction of the mixture. DYNAFLOW's unsteady RANS solver, 3DYNAFS-Vis, is used to solve the macro-level variable-density mixture medium, with fully unsteady two-way coupling to the bubble dynamics/tracking code 3DYNAFS-DSM. Validation studies using measurements in a half 3-D experimental setup composed of divergent and convergent sections are presented. Visualization of the bubbles, PIV measurements of the flow, and observations of bubble size and behavior are reported, and the measured flow field data are used to validate the models. Thrust augmentation as high as 50% could be confirmed both by predictions and by experiments. This work was supported by the Office of Naval Research under the contract N00014-07-C-0427, monitored by Dr. Ki-Han Kim.

  15. Molecular simulation and experimental validation of resorcinol adsorption on Ordered Mesoporous Carbon (OMC).

    PubMed

    Ahmad, Zaki Uddin; Chao, Bing; Konggidinata, Mas Iwan; Lian, Qiyu; Zappi, Mark E; Gang, Daniel Dianchen

    2018-04-27

Numerous research works in the adsorption area have relied on experimental approaches. These approaches involve trial and error and are extremely time consuming. Molecular simulation is a newer tool that can be used to design and predict the performance of an adsorbent. This research proposes a simulation technique that can greatly reduce the time needed to design an adsorbent. In this study, a new Rhombic ordered mesoporous carbon (OMC) model is proposed and constructed with various pore sizes and oxygen contents using the Materials Visualizer module to optimize the structure of OMC for resorcinol adsorption. The specific surface area, pore volume, small-angle X-ray diffraction pattern, and resorcinol adsorption capacity were calculated with the Forcite and Sorption modules in the Materials Studio package. The simulation results were validated experimentally by synthesizing OMC with different pore sizes and oxygen contents prepared via the hard-template method employing an SBA-15 silica scaffold. Boric acid was used as the pore-expanding reagent to synthesize OMC with different pore sizes (from 4.6 to 11.3 nm) and varying oxygen contents (from 11.9% to 17.8%). Based on the simulation and experimental validation, the optimal pore size was found to be 6 nm for maximum adsorption of resorcinol. Copyright © 2018 Elsevier B.V. All rights reserved.

  16. Advanced Reactors-Intermediate Heat Exchanger (IHX) Coupling: Theoretical Modeling and Experimental Validation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Utgikar, Vivek; Sun, Xiaodong; Christensen, Richard

    2016-12-29

The overall goal of the research project was to model the behavior of the advanced reactor-intermediate heat exchange system and to develop advanced control techniques for off-normal conditions. The specific objectives defined for the project were: 1. To develop the steady-state thermal hydraulic design of the intermediate heat exchanger (IHX); 2. To develop mathematical models to describe the advanced nuclear reactor-IHX-chemical process/power generation coupling during normal and off-normal operations, and to simulate the models using multiphysics software; 3. To develop control strategies using genetic algorithm or neural network techniques and couple these techniques with the multiphysics software; 4. To validate the models experimentally. The project objectives were accomplished by defining and executing four different tasks corresponding to these specific objectives. The first task involved selection of IHX candidates and developing steady-state designs for those. The second task involved modeling of the transient and off-normal operation of the reactor-IHX system. The subsequent task dealt with the development of control strategies and involved algorithm development and simulation. The last task involved experimental validation of the thermal hydraulic performance of the two prototype heat exchangers designed and fabricated for the project at steady-state and transient conditions to simulate the coupling of the reactor-IHX-process plant system. The experimental work utilized two test facilities at The Ohio State University (OSU): the existing High-Temperature Helium Test Facility (HTHF) and a newly developed high-temperature molten salt facility.

  17. miRTarBase update 2018: a resource for experimentally validated microRNA-target interactions.

    PubMed

    Chou, Chih-Hung; Shrestha, Sirjana; Yang, Chi-Dung; Chang, Nai-Wen; Lin, Yu-Ling; Liao, Kuang-Wen; Huang, Wei-Chi; Sun, Ting-Hsuan; Tu, Siang-Jyun; Lee, Wei-Hsiang; Chiew, Men-Yee; Tai, Chun-San; Wei, Ting-Yen; Tsai, Tzi-Ren; Huang, Hsin-Tzu; Wang, Chung-Yu; Wu, Hsin-Yi; Ho, Shu-Yi; Chen, Pin-Rong; Chuang, Cheng-Hsun; Hsieh, Pei-Jung; Wu, Yi-Shin; Chen, Wen-Liang; Li, Meng-Ju; Wu, Yu-Chun; Huang, Xin-Yi; Ng, Fung Ling; Buddhakosai, Waradee; Huang, Pei-Chun; Lan, Kuan-Chun; Huang, Chia-Yen; Weng, Shun-Long; Cheng, Yeong-Nan; Liang, Chao; Hsu, Wen-Lian; Huang, Hsien-Da

    2018-01-04

    MicroRNAs (miRNAs) are small non-coding RNAs of ∼22 nucleotides that are involved in negative regulation of mRNA at the post-transcriptional level. Previously, we developed miRTarBase, which provides information about experimentally validated miRNA-target interactions (MTIs). Here, we describe an updated database containing 422,517 curated MTIs from 4,076 miRNAs and 23,054 target genes collected from over 8,500 articles. The number of MTIs curated with strong evidence has increased ∼1.4-fold since the last update in 2016. In this updated version, target sites validated by reporter assay that are available in the literature can be downloaded. New features can be extracted from the target site sequences for analysis via machine learning approaches, which can help to evaluate the performance of miRNA-target prediction tools. Furthermore, different browsing modes make it easier for users to find specific MTIs. With these improvements, miRTarBase serves as a more comprehensively annotated database of experimentally validated miRNA-target interactions for miRNA-related research. miRTarBase is available at http://miRTarBase.mbc.nctu.edu.tw/. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.

  18. Analytical and experimental validation of the Oblique Detonation Wave Engine concept

    NASA Technical Reports Server (NTRS)

    Adelman, Henry G.; Cambier, Jean-Luc; Menees, Gene P.; Balboni, John A.

    1988-01-01

    The Oblique Detonation Wave Engine (ODWE) for hypersonic flight has been studied analytically by NASA using CFD codes that fully couple finite-rate chemistry with fluid dynamics. Fuel injector designs investigated included wall and strut injectors; in-stream strut injectors were chosen to provide good mixing with minimal stagnation pressure losses. Plans for experimentally validating the ODWE concept in an arc-jet hypersonic wind tunnel are discussed. Measurements of the flow field properties behind the oblique wave will be compared to analytical predictions.

  19. Validation of the revised Mystical Experience Questionnaire in experimental sessions with psilocybin.

    PubMed

    Barrett, Frederick S; Johnson, Matthew W; Griffiths, Roland R

    2015-11-01

    The 30-item revised Mystical Experience Questionnaire (MEQ30) was previously developed within an online survey of mystical-type experiences occasioned by psilocybin-containing mushrooms. The rated experiences occurred on average eight years before completion of the questionnaire. The current paper validates the MEQ30 using data from experimental studies with controlled doses of psilocybin. Data were pooled and analyzed from five laboratory experiments in which participants (n=184) received a moderate to high oral dose of psilocybin (at least 20 mg/70 kg). Results of confirmatory factor analysis demonstrate the reliability and internal validity of the MEQ30. Structural equation models demonstrate the external and convergent validity of the MEQ30 by showing that latent variable scores on the MEQ30 positively predict persisting change in attitudes, behavior, and well-being attributed to experiences with psilocybin while controlling for the contribution of the participant-rated intensity of drug effects. These findings support the use of the MEQ30 as an efficient measure of individual mystical experiences. A method to score a "complete mystical experience" that was used in previous versions of the mystical experience questionnaire is validated in the MEQ30, and a stand-alone version of the MEQ30 is provided for use in future research. © The Author(s) 2015.

  1. Experimental validation of the RATE tool for inferring HLA restrictions of T cell epitopes.

    PubMed

    Paul, Sinu; Arlehamn, Cecilia S Lindestam; Schulten, Veronique; Westernberg, Luise; Sidney, John; Peters, Bjoern; Sette, Alessandro

    2017-06-21

    The RATE tool was recently developed to computationally infer the HLA restriction of given epitopes from immune response data of HLA-typed subjects, without additional cumbersome experimentation. Here, RATE was validated using experimentally defined restriction data from a set of 191 tuberculosis-derived epitopes and 63 healthy individuals with MTB infection from the Western Cape Region of South Africa. Using this experimental dataset, the parameters utilized by the RATE tool to infer restriction were optimized, namely the relative frequency (RF) of subjects responding to a given epitope and expressing a given allele, as compared to the general test population, and the associated p-value in a Fisher's exact test. We also examined the potential for further optimization based on the predicted binding affinity of epitopes to potential restricting HLA alleles, and on the absolute number of individuals expressing a given allele and responding to the specific epitope. Different statistical measures, including the Matthews correlation coefficient, accuracy, sensitivity and specificity, were used to evaluate the performance of RATE as a function of these criteria. Based on our results, we recommend selecting HLA restrictions with cutoffs of p-value < 0.01 and RF ≥ 1.3. The usefulness of the tool was demonstrated by inferring new HLA restrictions for epitope sets where restrictions could not be experimentally determined due to a lack of the necessary cell lines, and for an additional dataset related to recognition of pollen-derived epitopes in allergic patients. Experimental datasets were used to validate the RATE tool, the parameters used to infer restriction were optimized, and new HLA restrictions were identified using the optimized tool.
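    The two recommended cutoffs can be illustrated with a short, self-contained sketch. This is not the published RATE implementation; the 2x2-table layout, counts, and function names below are hypothetical, and the one-sided Fisher's exact test is computed directly from the hypergeometric tail:

    ```python
    from math import comb

    def fisher_exact_p(a, b, c, d):
        """One-sided Fisher's exact test (enrichment) for the 2x2 table
        [[a, b], [c, d]], summing the hypergeometric tail from the observed
        count a upward."""
        n = a + b + c + d
        row1, col1 = a + b, a + c
        denom = comb(n, col1)
        p = 0.0
        for k in range(a, min(row1, col1) + 1):
            p += comb(row1, k) * comb(n - row1, col1 - k) / denom
        return p

    def infer_restriction(resp_with, resp_without, nonresp_with, nonresp_without,
                          p_cut=0.01, rf_cut=1.3):
        """Toy RATE-style call: an allele is a candidate restriction when the
        responder pool is enriched for it (RF >= rf_cut) and the enrichment is
        significant (Fisher p < p_cut)."""
        responders = resp_with + resp_without
        total = responders + nonresp_with + nonresp_without
        allele_freq_resp = resp_with / responders          # among responders
        allele_freq_all = (resp_with + nonresp_with) / total  # in test population
        rf = allele_freq_resp / allele_freq_all
        p = fisher_exact_p(resp_with, resp_without, nonresp_with, nonresp_without)
        return rf, p, (rf >= rf_cut and p < p_cut)
    ```

    For example, 8 of 10 responders carrying an allele that only 13 of 63 subjects express would pass both cutoffs comfortably.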

  2. miRwayDB: a database for experimentally validated microRNA-pathway associations in pathophysiological conditions

    PubMed Central

    Das, Sankha Subhra; Saha, Pritam

    2018-01-01

    MicroRNAs (miRNAs) are well known as key regulators of diverse biological pathways. A body of experimental evidence has shown that abnormal miRNA expression profiles are responsible for various pathophysiological conditions by modulating genes in disease-associated pathways. Despite the rapid increase in research data confirming such associations, scientists still do not have access to a consolidated database offering these miRNA-pathway association details for critical diseases. We have developed miRwayDB, a database providing comprehensive information on experimentally validated miRNA-pathway associations in various pathophysiological conditions, using data collected from the published literature. To the best of our knowledge, it is the first database that provides information about experimentally validated, miRNA-mediated pathway dysregulation as seen specifically in critical human diseases, and hence indicative of a cause-and-effect relationship in most cases. The current version of miRwayDB collects an exhaustive list of miRNA-pathway association entries for 76 critical disease conditions, drawn from a review of 663 published articles. Each database entry contains complete information on the name of the pathophysiological condition, the associated miRNA(s), experimental sample type(s), the regulation pattern (up/down) of the miRNA, pathway association(s), the targeted member(s) of the dysregulated pathway(s) and a brief description. In addition, miRwayDB provides miRNA, gene and pathway scores to evaluate the role of miRNA-regulated pathways in various pathophysiological conditions. The database can also support other biomedical tasks, such as validating computational analyses, performing integrated analyses and building predictive computational models. It also offers a submission page for novel data from recently published studies. We believe that miRwayDB will be a useful tool for the miRNA research community. Database URL: http://www.mirway.iitkgp.ac.in PMID:29688364

  3. Nonsequential modeling of laser diode stacks using Zemax: simulation, optimization, and experimental validation.

    PubMed

    Coluccelli, Nicola

    2010-08-01

    The modeling of a real laser diode stack with the Zemax ray-tracing software operating in nonsequential mode is reported. The implementation of the model is presented, together with the geometric and optical parameters to be adjusted to calibrate the model and match the simulated intensity irradiance profiles with the experimental profiles. The calibration of the model is based on one near-field and one far-field measurement. The model was validated by comparing simulated and experimental transverse irradiance profiles at different positions along the caustic formed by a lens. Spot sizes and waist locations are predicted with a maximum error below 6%.

  4. Neuroinflammatory targets and treatments for epilepsy validated in experimental models.

    PubMed

    Aronica, Eleonora; Bauer, Sebastian; Bozzi, Yuri; Caleo, Matteo; Dingledine, Raymond; Gorter, Jan A; Henshall, David C; Kaufer, Daniela; Koh, Sookyong; Löscher, Wolfgang; Louboutin, Jean-Pierre; Mishto, Michele; Norwood, Braxton A; Palma, Eleonora; Poulter, Michael O; Terrone, Gaetano; Vezzani, Annamaria; Kaminski, Rafal M

    2017-07-01

    A large body of evidence that has accumulated over the past decade strongly supports the role of inflammation in the pathophysiology of human epilepsy. Specific inflammatory molecules and pathways have been identified that influence various pathologic outcomes in different experimental models of epilepsy. Most importantly, the same inflammatory pathways have also been found in surgically resected brain tissue from patients with treatment-resistant epilepsy. New antiseizure therapies may be derived from these novel potential targets. An essential and crucial question is whether targeting these molecules and pathways may result in anti-ictogenesis, antiepileptogenesis, and/or disease-modification effects. Therefore, preclinical testing in models mimicking relevant aspects of epileptogenesis is needed to guide integrated experimental and clinical trial designs. We discuss the most recent preclinical proof-of-concept studies validating a number of therapeutic approaches against inflammatory mechanisms in animal models that could represent novel avenues for drug development in epilepsy. Finally, we suggest future directions to accelerate preclinical to clinical translation of these recent discoveries. Wiley Periodicals, Inc. © 2017 International League Against Epilepsy.

  5. Experimental validation of 2D uncertainty quantification for digital image correlation.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Reu, Phillip L.

    Because digital image correlation (DIC) has become such an important and standard tool in the toolbox of experimental mechanicists, a complete uncertainty quantification of the method is needed. It should be remembered that each DIC setup and series of images has a unique uncertainty based on the calibration quality and on the image and speckle quality of the analyzed images. Any pretest work done with a calibrated DIC stereo-rig to quantify the errors using known shapes and translations, while useful, does not necessarily reveal the uncertainty of a later test. This is particularly true in high-speed applications, where actual test images are often less than ideal. Work on the mathematical underpinnings of DIC uncertainty quantification has previously been completed and published; this paper presents the corresponding experimental work used to check the validity of the uncertainty equations.

  6. Tyre tread-block friction: modelling, simulation and experimental validation

    NASA Astrophysics Data System (ADS)

    Wallaschek, Jörg; Wies, Burkard

    2013-07-01

    Pneumatic tyres have been used in vehicles since the beginning of the last century. They generate braking and steering forces for bicycles, motorcycles, cars, buses, trucks, agricultural vehicles and aircraft. These forces are generated in the usually very small contact area between tyre and road, and their characteristics are of eminent importance for safety and comfort. Much research has been devoted to optimising tyre design with respect to footprint pressure and friction. In this context, the development of virtual tyre prototypes, that is, simulation models of the tyre, has grown into a science in its own right. While the modelling of the structural dynamics of the tyre has reached a very advanced level, which makes it possible to account for effects like the rate-independent inelasticity of filled elastomers or the transient 3D deformations of the ply-reinforced tread, shoulder and sidewalls, little is known about the friction between tread-block elements and the road. This is particularly obvious when snow, ice, water or a third-body layer is present in the tyre-road contact. In the present paper, we give a survey of the present state of knowledge in the modelling, simulation and experimental validation of tyre tread-block friction processes, concentrating on experimental techniques.

  7. Experimental Validation of Normalized Uniform Load Surface Curvature Method for Damage Localization

    PubMed Central

    Jung, Ho-Yeon; Sung, Seung-Hoon; Jung, Hyung-Jo

    2015-01-01

    In this study, we experimentally validated the normalized uniform load surface (NULS) curvature method, which was recently developed to assess damage localization in beam-type structures. The normalization technique allows accurate damage localization with greater sensitivity, irrespective of the damage location. Damage to a simply supported beam was numerically and experimentally investigated on the basis of changes in the NULS curvatures, which were estimated from the modal flexibility matrices obtained from acceleration responses under ambient excitation. Two damage scenarios were considered, a single-damage case and a multiple-damage case, created by reducing the bending stiffness (EI) of the affected element(s). Numerical simulations were performed using MATLAB as a preliminary step. During the validation experiments, a series of tests was performed. The damage locations could be identified successfully without any false-positive or false-negative detections using the proposed method. For comparison, the damage detection performance was compared with that of two other well-known methods based on the modal flexibility matrix, namely the uniform load surface (ULS) method and the ULS curvature method. The proposed method proved more effective than the two conventional methods for locating damage in simply supported beams, in terms of sensitivity to damage under measurement noise. PMID:26501286
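    The quantities involved are simple to compute once a modal flexibility matrix is available. The sketch below, in Python for illustration only (the study used MATLAB), forms a uniform load surface as the row sums of the flexibility matrix and estimates its curvature by central differences; the max-scaling shown is a generic stand-in, not necessarily the authors' NULS normalization:

    ```python
    def uls_curvature(flexibility):
        """Uniform load surface from an (n x n) modal flexibility matrix:
        the deflection under a unit load applied at every DOF is the row sum.
        Returns a normalized curvature estimate over the interior DOFs."""
        uls = [sum(row) for row in flexibility]
        # second-order central difference as a curvature estimate (unit spacing)
        curv = [uls[i - 1] - 2 * uls[i] + uls[i + 1] for i in range(1, len(uls) - 1)]
        # scale by the peak magnitude so values are comparable along the span
        m = max(abs(c) for c in curv) or 1.0
        return [c / m for c in curv]
    ```

    A localized stiffness reduction shows up as a local spike in this curvature relative to the undamaged baseline.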

  8. BioNetCAD: design, simulation and experimental validation of synthetic biochemical networks

    PubMed Central

    Rialle, Stéphanie; Felicori, Liza; Dias-Lopes, Camila; Pérès, Sabine; El Atia, Sanaâ; Thierry, Alain R.; Amar, Patrick; Molina, Franck

    2010-01-01

    Motivation: Synthetic biology studies how to design and construct biological systems with functions that do not exist in nature. Biochemical networks, although easier to control, have been used less frequently than genetic networks as a base on which to build a synthetic system. To date, no clear engineering principles exist for designing such cell-free biochemical networks. Results: We describe a methodology for the construction of synthetic biochemical networks based on three main steps: design, simulation and experimental validation. We developed BioNetCAD to help users go through these steps. BioNetCAD allows the design of abstract networks that can be implemented thanks to CompuBioTicDB, a database of parts for synthetic biology. BioNetCAD also enables simulations with the HSim software and with classical ordinary differential equations (ODEs). We demonstrate with a case study that BioNetCAD can rationalize and reduce further experimental validation during the construction of a biochemical network. Availability and implementation: BioNetCAD is freely available at http://www.sysdiag.cnrs.fr/BioNetCAD. It is implemented in Java and supported on MS Windows. CompuBioTicDB is freely accessible at http://compubiotic.sysdiag.cnrs.fr/ Contact: stephanie.rialle@sysdiag.cnrs.fr; franck.molina@sysdiag.cnrs.fr Supplementary information: Supplementary data are available at Bioinformatics online. PMID:20628073

  9. Experimental Validation of an Ion Beam Optics Code with a Visualized Ion Thruster

    NASA Astrophysics Data System (ADS)

    Nakayama, Yoshinori; Nakano, Masakatsu

    To validate an ion beam optics code, the behavior of ion beam optics was experimentally observed and evaluated with a two-dimensional visualized ion thruster (VIT). Since the observed beam focus positions, sheath positions and measured ion beam currents were in good agreement with the numerical results, it was confirmed that the numerical model of the code is appropriate. In addition, it was confirmed that the beam focus position moves along the center axis of the grid hole according to the applied grid potentials, which differs from the conventional understanding. VIT operations may be useful not only for the validation of ion beam optics codes but also for a fundamental and intuitive understanding of Child-law sheath theory.

  10. Development and Validation of a Rubric for Diagnosing Students’ Experimental Design Knowledge and Difficulties

    PubMed Central

    Dasgupta, Annwesa P.; Anderson, Trevor R.

    2014-01-01

    It is essential to teach students about experimental design, as this facilitates their deeper understanding of how most biological knowledge was generated and gives them tools to perform their own investigations. Despite the importance of this area, surprisingly little is known about what students actually learn from designing biological experiments. In this paper, we describe a rubric for experimental design (RED) that can be used to measure knowledge of, and diagnose difficulties with, experimental design. The development and validation of the RED were informed by a literature review and an empirical analysis of undergraduate biology students’ responses to three published assessments. Five areas of difficulty with experimental design were identified: the variable properties of an experimental subject; the manipulated variables; measurement of outcomes; accounting for variability; and the scope of inference appropriate for experimental findings. Our findings revealed that certain difficulties, documented some 50 years ago, still exist among our undergraduate students, while others remain poorly investigated. The RED shows great promise for diagnosing students’ experimental design knowledge in lecture settings, laboratory courses, research internships, and course-based undergraduate research experiences. It also shows potential for guiding the development and selection of assessment and instructional activities that foster experimental design. PMID:26086658

  11. Relationships between the decoupled and coupled transfer functions: Theoretical studies and experimental validation

    NASA Astrophysics Data System (ADS)

    Wang, Zengwei; Zhu, Ping; Liu, Zhao

    2018-01-01

    A generalized method for predicting the decoupled transfer functions based on in-situ transfer functions is proposed. The method allows predicting the decoupled transfer functions using coupled transfer functions, without disassembling the system. Two ways to derive relationships between the decoupled and coupled transfer functions are presented. Issues related to immeasurability of coupled transfer functions are also discussed. The proposed method is validated by numerical and experimental case studies.

  12. Effects of human running cadence and experimental validation of the bouncing ball model

    NASA Astrophysics Data System (ADS)

    Bencsik, László; Zelei, Ambrus

    2017-05-01

    The biomechanical analysis of human running is a complex problem because of the large number of parameters and degrees of freedom. However, simplified models can be constructed, usually characterized by a few fundamental parameters such as step length, foot strike pattern and cadence. The bouncing ball model of human running is analysed theoretically and experimentally in this work. It is a minimally complex dynamic model for estimating the energy cost of running and the tendency of ground-foot impact intensity as a function of cadence. The model shows that cadence has a direct effect on the energy efficiency of running and on ground-foot impact intensity; in particular, higher cadence implies lower risk of injury and better energy efficiency. Experimental data collected from 121 amateur runners are presented. The experimental results validate the model and provide information about the walk-to-run transition speed and the typical development of cadence and grounded-phase ratio in different running speed ranges.
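    The bouncing-ball analogy can be illustrated with elementary ballistics. The sketch below is a hypothetical toy, not the authors' model: it treats each step as a ballistic flight whose duration shrinks as cadence rises, so the landing speed (a crude proxy for impact intensity) and the vertical excursion both decrease; the grounded-phase ratio used here is an assumed parameter:

    ```python
    G = 9.81  # gravitational acceleration, m/s^2

    def ballistic_step(cadence_hz, grounded_ratio=0.3):
        """Bouncing-ball analogy for one running step: the airborne part of
        the step is a ballistic flight. Higher cadence -> shorter flight ->
        lower vertical landing speed and smaller vertical travel."""
        step_time = 1.0 / cadence_hz                 # duration of one step, s
        flight_time = step_time * (1.0 - grounded_ratio)
        v_landing = G * flight_time / 2.0            # vertical landing speed, m/s
        rise = G * flight_time ** 2 / 8.0            # vertical excursion, m
        return v_landing, rise
    ```

    Comparing 2.5 steps/s against 3.0 steps/s with this toy shows both the landing speed and the vertical excursion falling with cadence, which is the qualitative trend the abstract describes.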

  13. Flutter suppression for the Active Flexible Wing - Control system design and experimental validation

    NASA Technical Reports Server (NTRS)

    Waszak, M. R.; Srinathkumar, S.

    1992-01-01

    The synthesis and experimental validation of a control law for an active flutter suppression system for the Active Flexible Wing wind-tunnel model is presented. The design was accomplished with traditional root locus and Nyquist methods, using interactive computer graphics tools and extensive simulation-based analysis. The design approach relied on a fundamental understanding of the flutter mechanism to formulate a simple control law structure. Experimentally, the flutter suppression controller succeeded in simultaneously suppressing two flutter modes, significantly increasing the flutter dynamic pressure despite errors in the design model. The flutter suppression controller was also successfully operated in combination with a rolling maneuver controller to perform flutter suppression during rapid rolling maneuvers.

  14. Computational Modeling and Experimental Validation of Shock Induced Damage in Woven E-Glass/Vinylester Laminates

    NASA Astrophysics Data System (ADS)

    Hufner, D. R.; Augustine, M. R.

    2018-05-01

    A novel experimental method was developed to simulate underwater explosion pressure pulses in a laboratory environment. An impact-based experimental apparatus was constructed, capable of generating pressure pulses with basic character similar to underwater explosions while also allowing the pulse to be tuned to different intensities. The capability to vary the shock impulse was considered essential to producing various levels of shock-induced damage without the need to modify the fixture. The experimental apparatus and test method are considered well suited to investigating the shock response of composite material systems and to the experimental validation of new material models. One such test program is presented herein, in which a series of E-glass/vinylester laminates was subjected to a range of shock pulses that induced varying degrees of damage. Analysis-test correlations were performed using a rate-dependent constitutive model capable of representing anisotropic damage and ultimate yarn failure. Agreement between analytical predictions and experimental results was considered acceptable.

  15. Experimental validation for thermal transmittances of window shading systems with perimeter gaps

    DOE PAGES

    Hart, Robert; Goudey, Howdy; Curcija, D. Charlie

    2018-02-22

    Virtually all residential and commercial windows in the U.S. have some form of window attachment, but few have been designed for energy savings. ISO 15099 presents a simulation framework for determining the thermal performance of window attachments, but the model has not been validated for these products. This paper outlines a review and validation of the ISO 15099 centre-of-glass heat transfer correlations for perimeter gaps (top, bottom, and side) in naturally ventilated cavities, through measurement and simulation. The thermal transmittance impact of dimensional variations of these gaps is measured experimentally, simulated using computational fluid dynamics, and simulated using the simplified correlations from ISO 15099. Results show that the ISO 15099 correlations produce a mean error between measured and simulated heat flux of 2.5 ± 7%. These tolerances are similar to those obtained from sealed-cavity comparisons and are deemed acceptable within the ISO 15099 framework.

  16. Supersonic, nonlinear, attached-flow wing design for high lift with experimental validation

    NASA Technical Reports Server (NTRS)

    Pittman, J. L.; Miller, D. S.; Mason, W. H.

    1984-01-01

    Results of the experimental validation are presented for a three-dimensional cambered wing designed to achieve attached supercritical crossflow at lifting conditions typical of supersonic maneuver. The design point was a lift coefficient of 0.4 at Mach 1.62 and 12 deg angle of attack. Results from the nonlinear full-potential method are presented to show the validity of the design process, along with results from linear theory codes. Longitudinal force and moment data and static pressure data were obtained in the Langley Unitary Plan Wind Tunnel at Mach numbers of 1.58, 1.62, 1.66, 1.70, and 2.00 over an angle-of-attack range of 0 to 14 deg at a Reynolds number of 2.0 × 10^6 per foot. Oil flow photographs of the upper surface were obtained at M = 1.62 for alpha ≈ 8, 10, 12, and 14 deg.

  17. Experimental validation of thermo-chemical algorithm for a simulation of pultrusion processes

    NASA Astrophysics Data System (ADS)

    Barkanov, E.; Akishin, P.; Miazza, N. L.; Galvez, S.; Pantelelis, N.

    2018-04-01

    To provide a better understanding of pultrusion processes, with or without temperature control, and to support pultrusion tooling design, an algorithm based on a mixed time-integration scheme and the nodal control volume method has been developed. In the present study, its experimental validation is carried out with cure sensors that measure the electrical resistivity and temperature on the profile surface. Through this verification process, the set of initial data used for the simulation of the pultrusion of a rod profile has been successfully corrected and finally defined.

  18. Integral Reactor Containment Condensation Model and Experimental Validation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wu, Qiao; Corradini, Michael

    This NEUP-funded project, NEUP 12-3630, comprises experimental, numerical and analytical studies on high-pressure steam condensation phenomena in a steel containment vessel connected to a water cooling tank, carried out at Oregon State University (OrSU) and the University of Wisconsin at Madison (UW-Madison). Over the three-year investigation, following the original proposal, the planned tasks were completed: (1) Performed a scaling study for the full-pressure test facility applicable to the reference design for the condensation heat transfer process during design basis accidents (DBAs), modified the existing test facility to route the steady-state secondary steam flow into the high-pressure containment for controllable condensation tests, and extended operations to negative gage pressure conditions (OrSU). (2) Conducted a series of DBA and quasi-steady experiments using the full-pressure test facility to provide a reliable high-pressure condensation database (OrSU). (3) Analyzed experimental data, evaluated the condensation model for the experimental conditions, and predicted the prototypic containment performance under accident conditions (UW-Madison). A film flow model was developed for the scaling analysis, and the results suggest that the 1/3-scale test facility covers a large portion of the laminar film flow regime, leading to a lower average heat transfer coefficient compared to the prototypic value. Although this is conservative for reactor safety analysis, the significant reduction of the heat transfer coefficient (50%) could underestimate the prototypic condensation heat transfer rate, resulting in inaccurate prediction of the decay heat removal capability. Further investigation is thus needed to quantify the scaling distortion for safety analysis code validation. Experimental investigations were performed in the existing MASLWR test facility at OrSU with minor modifications. A total of 13 containment condensation tests were conducted for

  19. Experimental program for real gas flow code validation at NASA Ames Research Center

    NASA Technical Reports Server (NTRS)

    Deiwert, George S.; Strawa, Anthony W.; Sharma, Surendra P.; Park, Chul

    1989-01-01

    The experimental program for validating real-gas hypersonic flow codes at NASA Ames Research Center is described. Ground-based test facilities used include ballistic ranges, shock tubes and shock tunnels, arc-jet facilities and heated-air hypersonic wind tunnels. Also included are large-scale computer systems for kinetic theory simulations and benchmark code solutions. Flight tests consist of the Aeroassist Flight Experiment, the Space Shuttle, Project Fire 2, and planetary probes such as Galileo, Pioneer Venus, and PAET.

  20. Experimental evaluation of certification trails using abstract data type validation

    NASA Technical Reports Server (NTRS)

    Wilson, Dwight S.; Sullivan, Gregory F.; Masson, Gerald M.

    1993-01-01

    Certification trails are a recently introduced and promising approach to fault detection and fault tolerance. Recent experimental work reveals many cases in which a certification-trail approach allows for significantly faster program execution time than a basic time-redundancy approach. Algorithms for answer validation of abstract data types allow a certification-trail approach to be used for a wide variety of problems. An attempt to assess the performance of algorithms utilizing certification trails on abstract data types is reported. Specifically, this method was applied to the following problems: heapsort, Huffman tree, shortest path, and skyline. Previous results used certification trails specific to a particular problem and implementation. The approach allows certification trails to be localized to 'data structure modules,' making the use of this technique transparent to the user of such modules.

  1. Design and experimental validation of a flutter suppression controller for the active flexible wing

    NASA Technical Reports Server (NTRS)

    Waszak, Martin R.; Srinathkumar, S.

    1992-01-01

    The synthesis and experimental validation of an active flutter suppression controller for the Active Flexible Wing wind tunnel model is presented. The design is accomplished with traditional root locus and Nyquist methods using interactive computer graphics tools and extensive simulation based analysis. The design approach uses a fundamental understanding of the flutter mechanism to formulate a simple controller structure to meet stringent design specifications. Experimentally, the flutter suppression controller succeeded in simultaneous suppression of two flutter modes, significantly increasing the flutter dynamic pressure despite modeling errors in predicted flutter dynamic pressure and flutter frequency. The flutter suppression controller was also successfully operated in combination with another controller to perform flutter suppression during rapid rolling maneuvers.

  2. The impact of crowd noise on officiating in muay thai: achieving external validity in an experimental setting.

    PubMed

    Myers, Tony; Balmer, Nigel

    2012-01-01

    Numerous factors have been proposed to explain the home advantage in sport. Several authors have suggested that a partisan home crowd enhances home advantage and that this is at least in part a consequence of their influence on officiating. However, while experimental studies examining this phenomenon have high levels of internal validity (since only the "crowd noise" intervention is allowed to vary), they suffer from a lack of external validity, with decision-making in a laboratory setting typically bearing little resemblance to decision-making in live sports settings. Conversely, observational and quasi-experimental studies with high levels of external validity suffer from low levels of internal validity as countless factors besides crowd noise vary. The present study provides a unique opportunity to address these criticisms, by conducting a controlled experiment on the impact of crowd noise on officiating in a live tournament setting. Seventeen qualified judges officiated on thirty Thai boxing bouts in a live international tournament setting featuring "home" and "away" boxers. In each bout, judges were randomized into a "noise" (live sound) or "no crowd noise" (noise-canceling headphones and white noise) condition, resulting in 59 judgments in the "no crowd noise" and 61 in the "crowd noise" condition. The results provide the first experimental evidence of the impact of live crowd noise on officials in sport. A cross-classified statistical model indicated that crowd noise had a statistically significant impact, equating to just over half a point per bout (in the context of five round bouts with the "10-point must" scoring system shared with professional boxing). The practical significance of the findings, their implications for officiating and for the future conduct of crowd noise studies are discussed.

  3. The Impact of Crowd Noise on Officiating in Muay Thai: Achieving External Validity in an Experimental Setting

    PubMed Central

    Myers, Tony; Balmer, Nigel

    2012-01-01

    Numerous factors have been proposed to explain the home advantage in sport. Several authors have suggested that a partisan home crowd enhances home advantage and that this is at least in part a consequence of their influence on officiating. However, while experimental studies examining this phenomenon have high levels of internal validity (since only the “crowd noise” intervention is allowed to vary), they suffer from a lack of external validity, with decision-making in a laboratory setting typically bearing little resemblance to decision-making in live sports settings. Conversely, observational and quasi-experimental studies with high levels of external validity suffer from low levels of internal validity as countless factors besides crowd noise vary. The present study provides a unique opportunity to address these criticisms, by conducting a controlled experiment on the impact of crowd noise on officiating in a live tournament setting. Seventeen qualified judges officiated on thirty Thai boxing bouts in a live international tournament setting featuring “home” and “away” boxers. In each bout, judges were randomized into a “noise” (live sound) or “no crowd noise” (noise-canceling headphones and white noise) condition, resulting in 59 judgments in the “no crowd noise” and 61 in the “crowd noise” condition. The results provide the first experimental evidence of the impact of live crowd noise on officials in sport. A cross-classified statistical model indicated that crowd noise had a statistically significant impact, equating to just over half a point per bout (in the context of five round bouts with the “10-point must” scoring system shared with professional boxing). The practical significance of the findings, their implications for officiating and for the future conduct of crowd noise studies are discussed. PMID:23049520

  4. A Perspective on Research on Dishonesty: Limited External Validity Due to the Lack of Possibility of Self-Selection in Experimental Designs.

    PubMed

    Houdek, Petr

    2017-01-01

    The aim of this perspective article is to show that current experimental evidence on factors influencing dishonesty has limited external validity. Most experimental studies are built on random assignment, in which control/experimental groups of subjects face varied sizes of the expected reward for behaving dishonestly, opportunities for cheating, means of rationalizing dishonest behavior, etc., and mean group reactions are observed. These studies have internal validity in assessing the causal influence of these and other factors, but they lack external validity in organizational, market, and other environments. If people can opt into or out of diverse real-world environments, an experiment aimed at studying the factors influencing the real-life degree of dishonesty should permit such an option. The behavior of such self-selected groups of marginal subjects would probably contain a larger level of (non)deception than the behavior of average people. The article warns that few studies enable self-selection or sorting of participants into varying environments, which limits current knowledge of the extent and dynamics of dishonest and fraudulent behavior. The article concludes with suggestions on how to improve dishonesty research, especially how to avoid experimenter demand bias.

  5. A Perspective on Research on Dishonesty: Limited External Validity Due to the Lack of Possibility of Self-Selection in Experimental Designs

    PubMed Central

    Houdek, Petr

    2017-01-01

    The aim of this perspective article is to show that current experimental evidence on factors influencing dishonesty has limited external validity. Most experimental studies are built on random assignment, in which control/experimental groups of subjects face varied sizes of the expected reward for behaving dishonestly, opportunities for cheating, means of rationalizing dishonest behavior, etc., and mean group reactions are observed. These studies have internal validity in assessing the causal influence of these and other factors, but they lack external validity in organizational, market, and other environments. If people can opt into or out of diverse real-world environments, an experiment aimed at studying the factors influencing the real-life degree of dishonesty should permit such an option. The behavior of such self-selected groups of marginal subjects would probably contain a larger level of (non)deception than the behavior of average people. The article warns that few studies enable self-selection or sorting of participants into varying environments, which limits current knowledge of the extent and dynamics of dishonest and fraudulent behavior. The article concludes with suggestions on how to improve dishonesty research, especially how to avoid experimenter demand bias. PMID:28955279

  6. Validation of reference genes for quantitative gene expression analysis in experimental epilepsy.

    PubMed

    Sadangi, Chinmaya; Rosenow, Felix; Norwood, Braxton A

    2017-12-01

    To grasp the molecular mechanisms and pathophysiology underlying epilepsy development (epileptogenesis) and epilepsy itself, it is important to understand the gene expression changes that occur during these phases. Quantitative real-time polymerase chain reaction (qPCR) is a technique that rapidly and accurately determines gene expression changes. It is crucial, however, that stable reference genes are selected for each experimental condition to ensure that accurate values are obtained for genes of interest. If reference genes are unstably expressed, this can lead to inaccurate data and erroneous conclusions. To date, epilepsy studies have used mostly single, nonvalidated reference genes. This is the first study to systematically evaluate reference genes in male Sprague-Dawley rat models of epilepsy. We assessed 15 potential reference genes in hippocampal tissue obtained from two different models during epileptogenesis, one model during chronic epilepsy, and a model of noninjurious seizures. Reference gene ranking varied between models and also differed between epileptogenesis and chronic epilepsy time points. There was also some variance between the four mathematical models used to rank reference genes. Notably, we found novel reference genes to be more stably expressed than those most often used in experimental epilepsy studies. The consequence of these findings is that reference genes suitable for one epilepsy model may not be appropriate for others and that reference genes can change over time. It is, therefore, critically important to validate potential reference genes before using them as normalizing factors in expression analysis in order to ensure accurate, valid results. © 2017 Wiley Periodicals, Inc.

  7. Experimental and statistical post-validation of positive example EST sequences carrying peroxisome targeting signals type 1 (PTS1).

    PubMed

    Lingner, Thomas; Kataya, Amr R A; Reumann, Sigrun

    2012-02-01

    We recently developed the first algorithms specifically for plants to predict proteins carrying peroxisome targeting signals type 1 (PTS1) from genome sequences. As validated experimentally, the prediction methods are able to correctly predict unknown peroxisomal Arabidopsis proteins and to infer novel PTS1 tripeptides. The high prediction performance is primarily determined by the large number and sequence diversity of the underlying positive example sequences, which were mainly derived from EST databases. However, a few constructs remained cytosolic in experimental validation studies, indicating sequencing errors in some ESTs. To identify erroneous sequences, we validated subcellular targeting of additional positive example sequences in the present study. Moreover, we analyzed the distribution of prediction scores separately for each orthologous group of PTS1 proteins, which generally resembled normal distributions with group-specific mean values. The cytosolic sequences commonly represented outliers of low prediction scores and were located at the very tail of a fitted normal distribution. Three statistical methods for identifying outliers were compared in terms of sensitivity and specificity. Their combined application allows elimination of erroneous ESTs from positive example data sets. This new post-validation method will further improve the prediction accuracy of both PTS1 and PTS2 protein prediction models for plants, fungi, and mammals.
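The low-tail screening described above can be illustrated with two common outlier tests applied to prediction scores assumed to be roughly normal. The scores below are made up, and the z-score and Tukey-fence cutoffs are conventional defaults, not the thresholds used in the study:

```python
import statistics

def zscore_outliers(scores, z_cut=2.5):
    """Flag scores far below the fitted normal (low-tail outliers)."""
    mu = statistics.mean(scores)
    sd = statistics.stdev(scores)
    return [s for s in scores if (s - mu) / sd < -z_cut]

def iqr_outliers(scores, k=1.5):
    """Tukey fences: flag scores below Q1 - k*IQR."""
    qs = statistics.quantiles(scores, n=4)  # exclusive quartiles by default
    q1, q3 = qs[0], qs[2]
    return [s for s in scores if s < q1 - k * (q3 - q1)]

# Hypothetical per-ortholog-group scores; one suspiciously low entry
scores = [0.91, 0.88, 0.93, 0.90, 0.89, 0.92, 0.87, 0.90, 0.35]
low_z = zscore_outliers(scores)
low_iqr = iqr_outliers(scores)
```

Both tests flag only the 0.35 entry here; in practice the paper combined several such tests, accepting a sequence as erroneous only when the methods agree.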

  8. Experimental and statistical post-validation of positive example EST sequences carrying peroxisome targeting signals type 1 (PTS1)

    PubMed Central

    Lingner, Thomas; Kataya, Amr R. A.; Reumann, Sigrun

    2012-01-01

    We recently developed the first algorithms specifically for plants to predict proteins carrying peroxisome targeting signals type 1 (PTS1) from genome sequences. As validated experimentally, the prediction methods are able to correctly predict unknown peroxisomal Arabidopsis proteins and to infer novel PTS1 tripeptides. The high prediction performance is primarily determined by the large number and sequence diversity of the underlying positive example sequences, which were mainly derived from EST databases. However, a few constructs remained cytosolic in experimental validation studies, indicating sequencing errors in some ESTs. To identify erroneous sequences, we validated subcellular targeting of additional positive example sequences in the present study. Moreover, we analyzed the distribution of prediction scores separately for each orthologous group of PTS1 proteins, which generally resembled normal distributions with group-specific mean values. The cytosolic sequences commonly represented outliers of low prediction scores and were located at the very tail of a fitted normal distribution. Three statistical methods for identifying outliers were compared in terms of sensitivity and specificity. Their combined application allows elimination of erroneous ESTs from positive example data sets. This new post-validation method will further improve the prediction accuracy of both PTS1 and PTS2 protein prediction models for plants, fungi, and mammals. PMID:22415050

  9. Experimental validation of Swy-2 clay standard's PHREEQC model

    NASA Astrophysics Data System (ADS)

    Szabó, Zsuzsanna; Hegyfalvi, Csaba; Freiler, Ágnes; Udvardi, Beatrix; Kónya, Péter; Székely, Edit; Falus, György

    2017-04-01

    One of the challenges of the present century is to limit greenhouse gas emissions for the mitigation of climate change, which is possible, for example, through a transitional technology, CCS (Carbon Capture and Storage), and, among others, through an increased nuclear share in the energy mix. Clay minerals are considered to be responsible for the low permeability and sealing capacity of caprocks sealing off stored CO2, and they are also the main constituents of bentonite in high-level radioactive waste disposal facilities. The understanding of clay behaviour in these deep geological environments is possible through laboratory batch experiments on well-known standards and coupled geochemical models. Such experimentally validated models are scarce, even though they allow deriving more precise long-term predictions of mineral reactions and of rock and bentonite degradation underground, therefore ensuring the safety of the above technologies and increasing their public acceptance. This ongoing work aims to create a kinetic geochemical model of the Na-montmorillonite standard Swy-2 in the widely used PHREEQC code, supported by solution and mineral composition results from batch experiments. Several four-day experiments have been carried out at a 1:35 rock-to-water ratio at atmospheric conditions, and with an inert and a supercritical CO2 phase at 100 bar and 80 °C, relevant to the potential Hungarian CO2 reservoir complex. Solution samples have been taken during and after experiments, and their compositions were measured by ICP-OES. The treated solid phase has been analysed by XRD and ATR-FTIR and compared to references measured in parallel (dried Swy-2). Kinetic geochemical modelling of the experimental conditions has been performed in PHREEQC version 3 using equations and kinetic rate parameters from the USGS report of Palandri and Kharaka (2004). The visualization of experimental and modelling results has been automated in R. Experiments and models show very fast
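The rate expressions PHREEQC evaluates from the Palandri and Kharaka (2004) compilation have the general form r = k(T) · (1 − Ω^p)^q, with k(T) given by an Arrhenius correction from 25 °C. A minimal sketch of that form, reduced to a single (neutral) mechanism; the rate constant, activation energy, and the saturation-ratio closure below are placeholders, not the study's calibrated values:

```python
import math

R_GAS = 8.314  # J/(mol K)

def dissolution_rate(T_K, k25, Ea, omega, p=1.0, q=1.0):
    """Palandri-Kharaka style rate [mol/(m^2 s)], neutral mechanism only:
    r = k25 * exp(-(Ea/R) * (1/T - 1/298.15)) * (1 - omega**p)**q,
    where omega is the mineral saturation ratio (omega = 1 at equilibrium)."""
    k_T = k25 * math.exp(-(Ea / R_GAS) * (1.0 / T_K - 1.0 / 298.15))
    return k_T * (1.0 - omega**p) ** q

def integrate(moles0, area, T_K, k25, Ea, dt, steps, omega_of_moles):
    """Forward-Euler integration of moles dissolved from a fixed reactive
    surface area [m^2]; omega_of_moles maps dissolved moles to omega."""
    m = moles0
    for _ in range(steps):
        r = dissolution_rate(T_K, k25, Ea, omega_of_moles(m))
        m += r * area * dt
    return m

# Hypothetical run at 80 degC: dissolution slows to zero as omega -> 1
m_diss = integrate(0.0, 1.0, 353.15, 1e-10, 50e3, dt=60.0, steps=100,
                   omega_of_moles=lambda m: min(m / 1e-6, 1.0))
```

PHREEQC handles the same bookkeeping internally via its KINETICS/RATES blocks; the point of the sketch is only the shape of the rate law and its equilibrium shutoff.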

  10. Validation of reference genes for RT-qPCR studies of gene expression in banana fruit under different experimental conditions.

    PubMed

    Chen, Lei; Zhong, Hai-ying; Kuang, Jian-fei; Li, Jian-guo; Lu, Wang-jin; Chen, Jian-ye

    2011-08-01

    Reverse transcription quantitative real-time PCR (RT-qPCR) is a sensitive technique for quantifying gene expression, but its success depends on the stability of the reference gene(s) used for data normalization. Only a few studies on validation of reference genes have been conducted in fruit trees, and none yet in banana. In the present work, 20 candidate reference genes were selected, and their expression stability in 144 banana samples was evaluated and analyzed using two algorithms, geNorm and NormFinder. The samples consisted of eight sample sets collected under different experimental conditions, including various tissues, developmental stages, postharvest ripening, stresses (chilling, high temperature, and pathogen), and hormone treatments. Our results showed that different suitable reference gene(s) or combinations of reference genes should be selected for normalization depending on the experimental conditions. The RPS2 and UBQ2 genes were validated as the most suitable reference genes across all tested samples. More importantly, our data further showed that the widely used reference genes, ACT and GAPDH, were not the most suitable reference genes in many banana sample sets. In addition, the expression of MaEBF1, a gene of interest that plays an important role in regulating fruit ripening, under different experimental conditions was used to further confirm the validated reference genes. Taken together, our results provide guidelines for reference gene(s) selection under different experimental conditions and a foundation for more accurate and widespread use of RT-qPCR in banana.
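The geNorm stability measure used above can be computed directly: for each candidate gene, M is the mean standard deviation of pairwise log2 expression ratios against all other candidates (lower M = more stable). The expression values below are synthetic, and the gene names are borrowed from the abstract only for illustration:

```python
import math
import statistics

def genorm_m(expr):
    """expr: dict gene -> list of expression values (same sample order).
    Returns gene -> geNorm M stability value (Vandesompele et al., 2002):
    mean stdev of pairwise log2 ratios against every other candidate."""
    genes = list(expr)
    m = {}
    for j in genes:
        vs = []
        for k in genes:
            if k == j:
                continue
            ratios = [math.log2(a / b) for a, b in zip(expr[j], expr[k])]
            vs.append(statistics.stdev(ratios))
        m[j] = sum(vs) / len(vs)
    return m

expr = {
    "RPS2":  [10.0, 11.0, 10.5, 10.2],   # synthetic values
    "UBQ2":  [20.0, 22.0, 21.0, 20.4],   # co-varies with RPS2
    "GAPDH": [5.0, 15.0, 2.0, 30.0],     # erratic across samples
}
m = genorm_m(expr)
```

Because RPS2 and UBQ2 keep a constant ratio across samples, both get low M values, while the erratic GAPDH scores high, mirroring the paper's finding that GAPDH is often a poor choice.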

  11. Model Development and Experimental Validation of the Fusible Heat Sink Design for Exploration Vehicles

    NASA Technical Reports Server (NTRS)

    Cognata, Thomas J.; Leimkuehler, Thomas O.; Sheth, Rubik B.; Le, Hung

    2012-01-01

    The Fusible Heat Sink is a novel vehicle heat rejection technology which combines a flow-through radiator with a phase change material. The combined technologies create a multi-function device able to shield crew members against Solar Particle Events (SPE), reduce radiator extent by permitting sizing to the average vehicle heat load rather than to the peak vehicle heat load, and substantially absorb heat load excursions from the average while constantly maintaining thermal control system setpoints. This multi-function technology provides great flexibility for mission planning, making it possible to operate a vehicle in hot or cold environments and under high or low heat load conditions for extended periods of time. This paper describes the model development and experimental validation of the Fusible Heat Sink technology. The model developed was intended to meet the radiation and heat rejection requirements of a nominal MMSEV mission. Development parameters and results, including sizing and model performance, will be discussed. From this flight-sized model, a scaled test-article design was modeled, designed, and fabricated for experimental validation of the technology at Johnson Space Center thermal vacuum chamber facilities. Testing showed performance comparable to the model at nominal loads and the capability to maintain heat loads substantially greater than nominal for extended periods of time.

  12. Model Development and Experimental Validation of the Fusible Heat Sink Design for Exploration Vehicles

    NASA Technical Reports Server (NTRS)

    Cognata, Thomas J.; Leimkuehler, Thomas; Sheth, Rubik; Le, Hung

    2013-01-01

    The Fusible Heat Sink is a novel vehicle heat rejection technology which combines a flow-through radiator with a phase change material. The combined technologies create a multi-function device able to shield crew members against Solar Particle Events (SPE), reduce radiator extent by permitting sizing to the average vehicle heat load rather than to the peak vehicle heat load, and substantially absorb heat load excursions from the average while constantly maintaining thermal control system setpoints. This multi-function technology provides great flexibility for mission planning, making it possible to operate a vehicle in hot or cold environments and under high or low heat load conditions for extended periods of time. This paper describes the modeling and experimental validation of the Fusible Heat Sink technology. The model developed was intended to meet the radiation and heat rejection requirements of a nominal MMSEV mission. Development parameters and results, including sizing and model performance, will be discussed. From this flight-sized model, a scaled test-article design was modeled, designed, and fabricated for experimental validation of the technology at Johnson Space Center thermal vacuum chamber facilities. Testing showed performance comparable to the model at nominal loads and the capability to maintain heat loads substantially greater than nominal for extended periods of time.

  13. Experimental validation of systematically designed acoustic hyperbolic metamaterial slab exhibiting negative refraction

    NASA Astrophysics Data System (ADS)

    Christiansen, Rasmus E.; Sigmund, Ole

    2016-09-01

    This Letter reports on the experimental validation of a two-dimensional acoustic hyperbolic metamaterial slab optimized to exhibit negative refractive behavior. The slab was designed using a topology optimization based systematic design method allowing for tailoring the refractive behavior. The experimental results confirm the predicted refractive capability as well as the predicted transmission at an interface. The study simultaneously provides an estimate of the attenuation inside the slab stemming from the boundary layer effects—insight which can be utilized in the further design of the metamaterial slabs. The capability of tailoring the refractive behavior opens possibilities for different applications. For instance, a slab exhibiting zero refraction across a wide angular range is capable of funneling acoustic energy through it, while a material exhibiting the negative refractive behavior across a wide angular range provides lensing and collimating capabilities.

  14. Experimental validation of a numerical model predicting the charging characteristics of Teflon and Kapton under electron beam irradiation

    NASA Technical Reports Server (NTRS)

    Hazelton, R. C.; Yadlowsky, E. J.; Churchill, R. J.; Parker, L. W.; Sellers, B.

    1981-01-01

    The effect of differential charging of spacecraft thermal control surfaces is assessed by studying the dynamics of the charging process. A program to experimentally validate a computer model of the charging process was established. Time-resolved measurements of the surface potential were obtained for samples of Kapton and Teflon irradiated with a monoenergetic electron beam. Results indicate that the computer model and experimental measurements agree well and that, for Teflon, secondary emission is the governing factor. Experimental data indicate that bulk conductivities play a significant role in the charging of Kapton.

  15. Computer-aided design of liposomal drugs: In silico prediction and experimental validation of drug candidates for liposomal remote loading.

    PubMed

    Cern, Ahuva; Barenholz, Yechezkel; Tropsha, Alexander; Goldblum, Amiram

    2014-01-10

    Previously we have developed and statistically validated Quantitative Structure Property Relationship (QSPR) models that correlate drugs' structural, physical and chemical properties as well as experimental conditions with the relative efficiency of remote loading of drugs into liposomes (Cern et al., J. Control. Release 160 (2012) 147-157). Herein, these models have been used to virtually screen a large drug database to identify novel candidate molecules for liposomal drug delivery. Computational hits were considered for experimental validation based on their predicted remote loading efficiency as well as additional considerations such as availability, recommended dose and relevance to the disease. Three compounds were selected for experimental testing which were confirmed to be correctly classified by our previously reported QSPR models developed with Iterative Stochastic Elimination (ISE) and k-Nearest Neighbors (kNN) approaches. In addition, 10 new molecules with known liposome remote loading efficiency that were not used by us in QSPR model development were identified in the published literature and employed as an additional model validation set. The external accuracy of the models was found to be as high as 82% or 92%, depending on the model. This study presents the first successful application of QSPR models for the computer-model-driven design of liposomal drugs. © 2013.
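A toy version of the kNN classification step mentioned above, predicting whether a candidate loads efficiently from its nearest labeled neighbors. The descriptors and training points are hypothetical stand-ins (e.g. logP, pKa, molecular weight / 100), not the QSPR model's actual feature set:

```python
import math

def knn_predict(train, query, k=3):
    """train: list of (feature_vector, label); label 1 = high remote-loading
    efficiency, 0 = low. Plain Euclidean kNN with a majority vote."""
    dists = sorted((math.dist(x, query), y) for x, y in train)
    votes = [y for _, y in dists[:k]]
    return int(sum(votes) * 2 > len(votes))

# Hypothetical descriptor vectors: [logP, pKa, MW/100]
train = [
    ([1.2, 8.6, 3.2], 1), ([0.9, 9.1, 2.8], 1), ([1.5, 8.2, 3.5], 1),
    ([3.8, 4.0, 6.1], 0), ([4.2, 3.5, 5.8], 0), ([3.5, 4.4, 6.4], 0),
]
pred = knn_predict(train, [1.1, 8.8, 3.0])  # lands near the label-1 cluster
```

The published models additionally weight and select descriptors (ISE) and validate against held-out data; this sketch only shows the voting mechanic.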

  16. Computer-aided design of liposomal drugs: in silico prediction and experimental validation of drug candidates for liposomal remote loading

    PubMed Central

    Cern, Ahuva; Barenholz, Yechezkel; Tropsha, Alexander; Goldblum, Amiram

    2014-01-01

    Previously we have developed and statistically validated Quantitative Structure Property Relationship (QSPR) models that correlate drugs’ structural, physical and chemical properties as well as experimental conditions with the relative efficiency of remote loading of drugs into liposomes (Cern et al., J. Control. Release 160 (2012) 147-157). Herein, these models have been used to virtually screen a large drug database to identify novel candidate molecules for liposomal drug delivery. Computational hits were considered for experimental validation based on their predicted remote loading efficiency as well as additional considerations such as availability, recommended dose and relevance to the disease. Three compounds were selected for experimental testing which were confirmed to be correctly classified by our previously reported QSPR models developed with Iterative Stochastic Elimination (ISE) and k-nearest neighbors (kNN) approaches. In addition, 10 new molecules with known liposome remote loading efficiency that were not used in QSPR model development were identified in the published literature and employed as an additional model validation set. The external accuracy of the models was found to be as high as 82% or 92%, depending on the model. This study presents the first successful application of QSPR models for the computer-model-driven design of liposomal drugs. PMID:24184343

  17. Experimental validation benchmark data for CFD of transient convection from forced to natural with flow reversal on a vertical flat plate

    DOE PAGES

    Lance, Blake W.; Smith, Barton L.

    2016-06-23

    Transient convection has been investigated experimentally for the purpose of providing Computational Fluid Dynamics (CFD) validation benchmark data. A specialized facility for validation benchmark experiments called the Rotatable Buoyancy Tunnel was used to acquire thermal and velocity measurements of flow over a smooth, vertical heated plate. The initial condition was forced convection downward with subsequent transition to mixed convection, ending with natural convection upward after a flow reversal. Data acquisition through the transient was repeated for ensemble-averaged results. With simple flow geometry, validation data were acquired at the benchmark level. All boundary conditions (BCs) were measured and their uncertainties quantified. Temperature profiles on all four walls and the inlet were measured, as well as the as-built test section geometry. Inlet velocity profiles and turbulence levels were quantified using Particle Image Velocimetry. System Response Quantities (SRQs) were measured for comparison with CFD outputs and include velocity profiles, wall heat flux, and wall shear stress. Extra effort was invested in documenting and preserving the validation data. Details about the experimental facility, instrumentation, experimental procedure, materials, BCs, and SRQs are made available through this paper. As a result, the latter two are available for download and the other details are included in this work.
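The ensemble averaging of repeated transients described above reduces, per time index, to a mean and a standard error across runs; a minimal sketch with synthetic data (the run values below are made up, not the facility's measurements):

```python
import statistics

def ensemble_average(runs):
    """runs: repeated transient records, each a list of samples taken at the
    same instants after the trigger. Returns (mean, sem) per time index,
    the usual reduction for repeated-transient validation data."""
    n = len(runs)
    means, sems = [], []
    for samples in zip(*runs):  # gather all runs' values at one instant
        means.append(statistics.mean(samples))
        sems.append(statistics.stdev(samples) / n ** 0.5)
    return means, sems

# Three synthetic repetitions of a four-instant transient (e.g. wall velocity)
runs = [
    [1.00, 0.60, 0.20, -0.15],
    [1.04, 0.58, 0.22, -0.13],
    [0.96, 0.62, 0.18, -0.17],
]
mean, sem = ensemble_average(runs)
```

The standard error column is what feeds the validation uncertainty budget when the ensemble mean is compared against a CFD prediction at each instant.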

  18. Experimental validation of beam quality correction factors for proton beams

    NASA Astrophysics Data System (ADS)

    Gomà, Carles; Hofstetter-Boillat, Bénédicte; Safai, Sairos; Vörös, Sándor

    2015-04-01

    This paper presents a method to experimentally validate the beam quality correction factors (kQ) tabulated in IAEA TRS-398 for proton beams and to determine the kQ of non-tabulated ionization chambers (based on the already tabulated values). The method is based exclusively on ionometry and it consists in comparing the reading of two ionization chambers under the same reference conditions in a proton beam quality Q and a reference beam quality 60Co. This allows one to experimentally determine the ratio between the kQ of the two ionization chambers. In this work, 7 different ionization chamber models were irradiated under the IAEA TRS-398 reference conditions for 60Co beams and proton beams. For the latter, the reference conditions for both modulated beams (spread-out Bragg peak field) and monoenergetic beams (pseudo-monoenergetic field) were studied. For monoenergetic beams, it was found that the experimental kQ values obtained for plane-parallel chambers are consistent with the values tabulated in IAEA TRS-398; whereas the kQ values obtained for cylindrical chambers are not consistent—being higher than the tabulated values. These results support the suggestion (of previous publications) that the IAEA TRS-398 reference conditions for monoenergetic proton beams should be revised so that the effective point of measurement of cylindrical ionization chambers is taken into account when positioning the reference point of the chamber at the reference depth. For modulated proton beams, the tabulated kQ values of all the ionization chambers studied in this work were found to be consistent with each other—except for the IBA FC65-G, whose experimental kQ value was found to be 0.6% lower than the tabulated one. The kQ of the PTW Advanced Markus chamber, which is not tabulated in IAEA TRS-398, was found to be 0.997 ± 0.042 (k = 2), based on the tabulated value of the PTW Markus chamber.
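The ionometry-only comparison described above yields the ratio of two chambers' kQ values without any absolute dose standard: writing D = M · N_D,w · kQ for each chamber and equating the dose both chambers see in the proton beam Q and in 60Co (where kQ = 1 by definition) cancels the calibration coefficients N_D,w. A sketch with illustrative, made-up readings:

```python
def kq_ratio(M_ref_Q, M_test_Q, M_ref_Co, M_test_Co):
    """Ratio kQ_test / kQ_ref from paired readings M of two chambers under
    the same reference conditions in a proton beam Q and in 60Co.
    From D = M * N_D,w * kQ in each beam (kQ = 1 in 60Co):
      kQ_test / kQ_ref = (M_ref_Q / M_test_Q) * (M_test_Co / M_ref_Co)."""
    return (M_ref_Q / M_test_Q) * (M_test_Co / M_ref_Co)

# Illustrative normalized charge readings (not the paper's data)
ratio = kq_ratio(M_ref_Q=1.000, M_test_Q=0.985,
                 M_ref_Co=1.000, M_test_Co=0.988)
```

Multiplying this measured ratio by a tabulated kQ for the reference chamber then gives an experimental kQ for the test chamber, which is how the non-tabulated Advanced Markus value was derived.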

  19. Parametric Study of Advanced Mixing of Fuel/Oxidant System in High Speed Gaseous Flows and Experimental Validation Planning

    DTIC Science & Technology

    2001-08-30

    Final Report on ISTC Contract # 1809p: Parametric Study of Advanced Mixing of Fuel/Oxidant System in High Speed Gaseous Flows and Experimental Validation Planning.

  20. Computational identification of structural factors affecting the mutagenic potential of aromatic amines: study design and experimental validation.

    PubMed

    Slavov, Svetoslav H; Stoyanova-Slavova, Iva; Mattes, William; Beger, Richard D; Brüschweiler, Beat J

    2018-07-01

    A grid-based, alignment-independent 3D-SDAR (three-dimensional spectral data-activity relationship) approach based on simulated 13C and 15N NMR chemical shifts augmented with through-space interatomic distances was used to model the mutagenicity of 554 primary and 419 secondary aromatic amines. A robust modeling strategy supported by extensive validation including randomized training/hold-out test set pairs, validation sets, "blind" external test sets as well as experimental validation was applied to avoid over-parameterization and build Organization for Economic Cooperation and Development (OECD 2004) compliant models. Based on an experimental validation set of 23 chemicals tested in a two-strain Salmonella typhimurium Ames assay, 3D-SDAR was able to achieve performance comparable to 5-strain (Ames) predictions by Lhasa Limited's Derek and Sarah Nexus for the same set. Furthermore, mapping of the most frequently occurring bins on the primary and secondary aromatic amine structures allowed the identification of molecular features that were associated either positively or negatively with mutagenicity. Prominent structural features found to enhance the mutagenic potential included: nitrobenzene moieties, conjugated π-systems, nitrothiophene groups, and aromatic hydroxylamine moieties. 3D-SDAR was also able to capture "true" negative contributions that are particularly difficult to detect through alternative methods. These include sulphonamide, acetamide, and other functional groups, which not only lack contributions to the overall mutagenic potential, but are known to actively lower it, if present in the chemical structures of what otherwise would be potential mutagens.

  1. Development and Validation of a Rubric for Diagnosing Students' Experimental Design Knowledge and Difficulties.

    PubMed

    Dasgupta, Annwesa P; Anderson, Trevor R; Pelaez, Nancy

    2014-01-01

    It is essential to teach students about experimental design, as this facilitates their deeper understanding of how most biological knowledge was generated and gives them tools to perform their own investigations. Despite the importance of this area, surprisingly little is known about what students actually learn from designing biological experiments. In this paper, we describe a rubric for experimental design (RED) that can be used to measure knowledge of and diagnose difficulties with experimental design. The development and validation of the RED was informed by a literature review and empirical analysis of undergraduate biology students' responses to three published assessments. Five areas of difficulty with experimental design were identified: the variable properties of an experimental subject; the manipulated variables; measurement of outcomes; accounting for variability; and the scope of inference appropriate for experimental findings. Our findings revealed that some difficulties, documented some 50 yr ago, still exist among our undergraduate students, while others remain poorly investigated. The RED shows great promise for diagnosing students' experimental design knowledge in lecture settings, laboratory courses, research internships, and course-based undergraduate research experiences. It also shows potential for guiding the development and selection of assessment and instructional activities that foster experimental design. © 2014 A. P. Dasgupta et al. CBE—Life Sciences Education © 2014 The American Society for Cell Biology. This article is distributed by The American Society for Cell Biology under license from the author(s). It is available to the public under an Attribution–Noncommercial–Share Alike 3.0 Unported Creative Commons License (http://creativecommons.org/licenses/by-nc-sa/3.0).

  2. Experimental validation of tape springs to be used as thin-walled space structures

    NASA Astrophysics Data System (ADS)

    Oberst, S.; Tuttle, S. L.; Griffin, D.; Lambert, A.; Boyce, R. R.

    2018-04-01

    With the advent of standardised launch geometries and off-the-shelf payloads, space programs utilising nano-satellite platforms are growing worldwide. Thin-walled, flexible and self-deployable structures are commonly used for antennae, instrument booms or solar panels owing to their light weight, ideal packaging characteristics and near-zero energy consumption. However their behaviour in space, in particular in Low Earth Orbits with continually changing environmental conditions, raises many questions. Accurate numerical models, which are often not available due to the difficulty of experimental testing under 1g conditions, are needed to answer these questions. In this study, we present on-earth experimental validations as a starting point for studying the response of a tape spring, as a representative of thin-walled flexible structures, under static and vibrational loading. Material parameters of tape springs in a singly curved (straight, open cylinder) and a doubly curved design are compared to each other by combining finite element calculations with experimental laser vibrometry within a single- and multi-stage model updating approach. While the determination of the Young's modulus is unproblematic, the damping is found to be inversely proportional to deployment length. With updated material properties, the buckling instability margin is calculated using different slenderness ratios. Results indicate a high sensitivity of thin-walled structures to minuscule perturbations, which makes proper experimental testing a key requirement for stability prediction of thin elastic space structures. The doubly curved tape spring provides closer agreement with experimental results than a straight tape spring design.

  3. Analytical modeling and experimental validation of a magnetorheological mount

    NASA Astrophysics Data System (ADS)

    Nguyen, The; Ciocanel, Constantin; Elahinia, Mohammad

    2009-03-01

    Magnetorheological (MR) fluid has been increasingly researched and applied in vibration isolation devices. To date, the suspension system of several high performance vehicles has been equipped with MR fluid based dampers and research is ongoing to develop MR fluid based mounts for engine and powertrain isolation. MR fluid based devices have received attention due to the MR fluid's capability to change its properties in the presence of a magnetic field. This characteristic places MR mounts in the class of semiactive isolators, making them a desirable substitute for passive hydraulic mounts. In this research, an analytical model of a mixed-mode MR mount was constructed. The magnetorheological mount employs flow (valve) mode and squeeze mode. Each mode is powered by an independent electromagnet, so one mode does not affect the operation of the other. The analytical model was used to predict the performance of the MR mount with different sets of parameters. Furthermore, in order to produce the actual prototype, the analytical model was used to identify the optimal geometry of the mount. The experimental phase of this research was carried out by fabricating and testing the actual MR mount. The manufactured mount was tested to evaluate the effectiveness of each mode individually and in combination. The experimental results were also used to validate the ability of the analytical model to predict the response of the MR mount. Based on the observed response of the mount, a suitable controller can be designed for it. However, the control scheme is not addressed in this study.

  4. Servo-hydraulic actuator in controllable canonical form: Identification and experimental validation

    NASA Astrophysics Data System (ADS)

    Maghareh, Amin; Silva, Christian E.; Dyke, Shirley J.

    2018-02-01

    Hydraulic actuators have been widely used to experimentally examine structural behavior at multiple scales. Real-time hybrid simulation (RTHS) is one innovative testing method that largely relies on such servo-hydraulic actuators. In RTHS, interface conditions must be enforced in real time, and controllers are often used to achieve tracking of the desired displacements. Thus, neglecting the dynamics of the hydraulic transfer system may result in either system instability or sub-optimal performance. Herein, we propose a nonlinear dynamical model for a servo-hydraulic actuator (a.k.a. hydraulic transfer system) coupled with a nonlinear physical specimen. The nonlinear dynamical model is transformed into controllable canonical form for further tracking control design purposes. Through a number of experiments, the controllable canonical model is validated.

  5. Virtual Reality for Enhanced Ecological Validity and Experimental Control in the Clinical, Affective and Social Neurosciences

    PubMed Central

    Parsons, Thomas D.

    2015-01-01

    An essential tension can be found between researchers interested in ecological validity and those concerned with maintaining experimental control. Research in the human neurosciences often involves the use of simple and static stimuli lacking many of the potentially important aspects of real world activities and interactions. While this research is valuable, there is a growing interest in the human neurosciences to use cues about target states in the real world via multimodal scenarios that involve visual, semantic, and prosodic information. These scenarios should include dynamic stimuli presented concurrently or serially in a manner that allows researchers to assess the integrative processes carried out by perceivers over time. Furthermore, there is growing interest in contextually embedded stimuli that can constrain participant interpretations of cues about a target’s internal states. Virtual reality environments proffer assessment paradigms that combine the experimental control of laboratory measures with emotionally engaging background narratives to enhance affective experience and social interactions. The present review highlights the potential of virtual reality environments for enhanced ecological validity in the clinical, affective, and social neurosciences. PMID:26696869

  7. Standing wave design and experimental validation of a tandem simulated moving bed process for insulin purification.

    PubMed

    Xie, Yi; Mun, Sungyong; Kim, Jinhyun; Wang, Nien-Hwa Linda

    2002-01-01

    A tandem simulated moving bed (SMB) process for insulin purification has been proposed and validated experimentally. The mixture to be separated consists of insulin, high molecular weight proteins, and zinc chloride. A systematic approach based on the standing wave design, rate model simulations, and experiments was used to develop this multicomponent separation process. The standing wave design was applied to specify the SMB operating conditions of a lab-scale unit with 10 columns. The design was validated with rate model simulations prior to experiments. The experimental results show 99.9% purity and 99% yield, which closely agree with the model predictions and the standing wave design targets. The agreement proves that the standing wave design can ensure high purity and high yield for the tandem SMB process. Compared to a conventional batch size-exclusion chromatography (SEC) process, the tandem SMB has 10% higher yield, 400% higher throughput, and 72% lower eluant consumption. In contrast, a design that ignores the effects of mass transfer and nonideal flow cannot meet the purity requirement and gives less than 96% yield.

  8. Numerical and experimental validation of a particle Galerkin method for metal grinding simulation

    NASA Astrophysics Data System (ADS)

    Wu, C. T.; Bui, Tinh Quoc; Wu, Youcai; Luo, Tzui-Liang; Wang, Morris; Liao, Chien-Chih; Chen, Pei-Yin; Lai, Yu-Sheng

    2018-03-01

    In this paper, a numerical approach with an experimental validation is introduced for modelling high-speed metal grinding processes in 6061-T6 aluminum alloys. The derivation of the present numerical method starts with an establishment of a stabilized particle Galerkin approximation. A non-residual penalty term from strain smoothing is introduced as a means of stabilizing the particle Galerkin method. Additionally, second-order strain gradients are introduced to the penalized functional for the regularization of the damage-induced strain localization problem. To handle the severe deformation in metal grinding simulation, an adaptive anisotropic Lagrangian kernel is employed. Finally, the formulation incorporates a bond-based failure criterion to bypass the prospective spurious damage growth issues in material failure and cutting debris simulation. A three-dimensional metal grinding problem is analyzed and compared with the experimental results to demonstrate the effectiveness and accuracy of the proposed numerical approach.

  9. Retrieval of Droplet-Size Density Distribution from Multiple-Field-of-View Cross-Polarized Lidar Signals: Theory and Experimental Validation

    DTIC Science & Technology

    2016-06-02

    Retrieval of droplet-size density distribution from multiple-field-of-view cross-polarized lidar signals: theory and experimental validation...theoretical and experimental studies of multiple scattering and multiple-field-of-view (MFOV) lidar detection have made possible the retrieval of cloud...droplet cloud are typical of Rayleigh scattering, with a signature close to a dipole (phase function quasi-flat and a zero-depolarization ratio

  10. The International Experimental Thermal Hydraulic Systems database – TIETHYS: A new NEA validation tool

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rohatgi, Upendra S.

    Nuclear reactor codes require validation with appropriate data representing the plant for specific scenarios. The thermal-hydraulic data is scattered in different locations and in different formats. Some of the data is in danger of being lost. A relational database is being developed to organize the international thermal-hydraulic test data for various reactor concepts and different scenarios. At the reactor system level, the data is organized to include separate effect tests and integral effect tests for specific scenarios and corresponding phenomena. The database relies on the phenomena identification sections of expert-developed PIRTs. The database will provide a summary of appropriate data, a review of facility information, test descriptions, instrumentation, references for the experimental data and some examples of application of the data for validation. The current database platform includes scenarios for PWR, BWR, VVER, and specific benchmarks for CFD modelling data, and is to be expanded to include references for molten salt reactors. There are placeholders for high temperature gas cooled reactors, CANDU and liquid metal reactors. This relational database is called The International Experimental Thermal Hydraulic Systems (TIETHYS) database and currently resides at the Nuclear Energy Agency (NEA) of the OECD, freely open to public access. Going forward, the database will be extended to include additional links and data as they become available. https://www.oecd-nea.org/tiethysweb/
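    The record structure described (facilities, separate/integral effect tests, scenarios, linked phenomena) maps naturally onto a small relational schema. The following is a hypothetical illustration of that structure only, not the actual TIETHYS schema:

```python
import sqlite3

# Toy schema: a facility hosts tests; each test is a separate effect test
# ('SET') or integral effect test ('IET') for a scenario, and addresses
# one or more PIRT phenomena.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE facility (id INTEGER PRIMARY KEY, name TEXT, reactor_type TEXT);
CREATE TABLE test (id INTEGER PRIMARY KEY,
                   facility_id INTEGER REFERENCES facility(id),
                   kind TEXT CHECK (kind IN ('SET', 'IET')),
                   scenario TEXT);
CREATE TABLE phenomenon (id INTEGER PRIMARY KEY,
                         test_id INTEGER REFERENCES test(id),
                         name TEXT);
""")
con.execute("INSERT INTO facility VALUES (1, 'Example loop', 'PWR')")
con.execute("INSERT INTO test VALUES (1, 1, 'SET', 'LB-LOCA')")
con.execute("INSERT INTO phenomenon VALUES (1, 1, 'critical flow')")

# A validation analyst's query: which facility/test pairs cover a phenomenon?
rows = con.execute("""SELECT f.name, t.kind, p.name
                      FROM phenomenon p
                      JOIN test t ON p.test_id = t.id
                      JOIN facility f ON t.facility_id = f.id""").fetchall()
print(rows)  # [('Example loop', 'SET', 'critical flow')]
```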

  11. How to validate similarity in linear transform models of event-related potentials between experimental conditions?

    PubMed

    Cong, Fengyu; Lin, Qiu-Hua; Astikainen, Piia; Ristaniemi, Tapani

    2014-10-30

    It is well-known that data of event-related potentials (ERPs) conform to the linear transform model (LTM). For group-level ERP data processing using principal/independent component analysis (PCA/ICA), ERP data of different experimental conditions and different participants are often concatenated. It is theoretically assumed that different experimental conditions and different participants possess the same LTM. However, how to validate this assumption has seldom been addressed in terms of signal processing methods, and we found no methods for straightforward comparison. When ICA decomposition is globally optimized for ERP data of one stimulus, we obtain the ratio between two coefficients mapping a source in the brain to two points on the scalp. Based on such a ratio, we defined a relative mapping coefficient (RMC). If the RMCs of an ERP between two conditions are not significantly different in practice, the mapping coefficients of this ERP between the two conditions are statistically identical. We examined whether the same LTM of ERP data could be applied for two different stimulus types, fearful and happy facial expressions, used in an ignore-oddball paradigm in adult human participants. We found no significant difference in the LTMs (based on ICASSO) of N170 responses to the fearful and the happy faces in terms of RMCs of N170. The proposed RMC in light of ICA decomposition is an effective approach for validating the similarity of LTMs of ERPs between experimental conditions, which is fundamental for applying group-level PCA/ICA to ERP data. Copyright © 2014 Elsevier B.V. All rights reserved.
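    The invariance that makes such a ratio useful can be shown in a few lines: dividing each column of an ICA mixing matrix by the coefficient of a reference channel cancels the unknown per-source scaling, so two conditions sharing the same LTM yield identical ratios. A toy numerical illustration (matrices and names are hypothetical, not the authors' code):

```python
import numpy as np

def relative_mapping_coefficients(mixing, ref_channel=0):
    """Divide each mixing-matrix column (one column per source) by the
    coefficient of a reference channel, giving scale-free ratios."""
    return mixing / mixing[ref_channel, :]

# Two "conditions" whose mixing matrices differ only by per-source scaling
# (columns multiplied by arbitrary constants) have identical ratios: the
# LTM is effectively the same even though the raw coefficients differ.
cond_a = np.array([[1.0, 2.0],
                   [3.0, 4.0],
                   [0.5, 1.5]])   # 3 channels x 2 sources
cond_b = cond_a * np.array([5.0, 0.2])   # per-source rescaling only
assert np.allclose(relative_mapping_coefficients(cond_a),
                   relative_mapping_coefficients(cond_b))
```

    In practice the comparison is statistical (the paper tests whether RMCs differ significantly across conditions), but the scale-cancellation above is the core idea.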

  12. On the experimental validation of model-based dose calculation algorithms for 192Ir HDR brachytherapy treatment planning

    NASA Astrophysics Data System (ADS)

    Pappas, Eleftherios P.; Zoros, Emmanouil; Moutsatsos, Argyris; Peppa, Vasiliki; Zourari, Kyveli; Karaiskos, Pantelis; Papagiannis, Panagiotis

    2017-05-01

    There is an acknowledged need for the design and implementation of physical phantoms appropriate for the experimental validation of model-based dose calculation algorithms (MBDCA) introduced recently in 192Ir brachytherapy treatment planning systems (TPS), and this work investigates whether it can be met. A PMMA phantom was prepared to accommodate material inhomogeneities (air and Teflon), four plastic brachytherapy catheters, as well as 84 LiF TLD dosimeters (MTS-100M 1 × 1 × 1 mm3 microcubes), two radiochromic films (Gafchromic EBT3) and a plastic 3D dosimeter (PRESAGE). An irradiation plan consisting of 53 source dwell positions was prepared on phantom CT images using a commercially available TPS and taking into account the calibration dose range of each detector. Irradiation was performed using an 192Ir high dose rate (HDR) source. Dose to medium in medium, Dmm, was calculated using the MBDCA option of the same TPS as well as Monte Carlo (MC) simulation with the MCNP code and a benchmarked methodology. Measured and calculated dose distributions were spatially registered and compared. The total standard (k = 1) spatial uncertainties for TLD, film and PRESAGE were 0.71, 1.58 and 2.55 mm, respectively. Corresponding percentage total dosimetric uncertainties were 5.4-6.4%, 2.5-6.4% and 4.85%, owing mainly to the absorbed dose sensitivity correction and the relative energy dependence correction (position dependent) for TLD, the film sensitivity calibration (dose dependent) and the dependencies of PRESAGE sensitivity. Results imply a LiF over-response due to a relative intrinsic energy dependence between 192Ir and megavoltage calibration energies, and a dose rate dependence of PRESAGE sensitivity at low dose rates (<1 Gy min-1). Calculations were experimentally validated within uncertainties except for MBDCA results for points in the phantom periphery and dose levels <20%. Experimental MBDCA validation is laborious, yet feasible.

  13. Supersonic Retro-Propulsion Experimental Design for Computational Fluid Dynamics Model Validation

    NASA Technical Reports Server (NTRS)

    Berry, Scott A.; Laws, Christopher T.; Kleb, W. L.; Rhode, Matthew N.; Spells, Courtney; McCrea, Andrew C.; Truble, Kerry A.; Schauerhamer, Daniel G.; Oberkampf, William L.

    2011-01-01

    The development of supersonic retro-propulsion, an enabling technology for heavy payload exploration missions to Mars, is the primary focus for the present paper. A new experimental model, intended to provide computational fluid dynamics model validation data, was recently designed for the Langley Research Center Unitary Plan Wind Tunnel Test Section 2. Pre-test computations were instrumental for sizing and refining the model, over the Mach number range of 2.4 to 4.6, such that tunnel blockage and internal flow separation issues would be minimized. A 5-in diameter 70-deg sphere-cone forebody, which accommodates up to four 4:1 area ratio nozzles, followed by a 10-in long cylindrical aftbody, was developed for this study based on the computational results. The model was designed to allow for a large number of surface pressure measurements on the forebody and aftbody. Supplemental data included high-speed Schlieren video and internal pressures and temperatures. The run matrix was developed to allow for the quantification of various sources of experimental uncertainty, such as random errors due to run-to-run variations and bias errors due to flow field or model misalignments. Some preliminary results and observations from the test are presented, although detailed analyses of the data and uncertainties are still ongoing.

  14. Experimental validation of finite element model analysis of a steel frame in simulated post-earthquake fire environments

    NASA Astrophysics Data System (ADS)

    Huang, Ying; Bevans, W. J.; Xiao, Hai; Zhou, Zhi; Chen, Genda

    2012-04-01

    During or after an earthquake event, building systems often experience large strains due to shaking effects, as observed during recent earthquakes, causing permanent inelastic deformation. In addition to this earthquake-induced inelastic deformation, post-earthquake fires associated with short circuits in electrical systems and leaking gas devices can further strain structures already damaged by the earthquake, potentially leading to a progressive collapse of buildings. Under these harsh conditions, sensor measurements on the affected building can provide only limited structural health information. Finite element model analysis, on the other hand, if validated by predesigned experiments, can provide detailed structural behavior information for the entire structure. In this paper, a temperature-dependent nonlinear 3-D finite element model (FEM) of a one-story steel frame is set up in ABAQUS based on steel material properties cited from EN 1993-1.2 and the AISC manuals. The FEM is validated by testing the modeled steel frame in simulated post-earthquake environments. Comparisons between the FEM analysis and the experimental results show that the FEM predicts the structural behavior of the steel frame in post-earthquake fire conditions reasonably well. With experimental validation, FEM analysis could continuously predict the behavior of critical structures in these harsh environments, better assisting firefighters in their rescue efforts and helping to save fire victims.

  15. Systematic bioinformatics and experimental validation of yeast complexes reduces the rate of attrition during structural investigations.

    PubMed

    Brooks, Mark A; Gewartowski, Kamil; Mitsiki, Eirini; Létoquart, Juliette; Pache, Roland A; Billier, Ysaline; Bertero, Michela; Corréa, Margot; Czarnocki-Cieciura, Mariusz; Dadlez, Michal; Henriot, Véronique; Lazar, Noureddine; Delbos, Lila; Lebert, Dorothée; Piwowarski, Jan; Rochaix, Pascal; Böttcher, Bettina; Serrano, Luis; Séraphin, Bertrand; van Tilbeurgh, Herman; Aloy, Patrick; Perrakis, Anastassis; Dziembowski, Andrzej

    2010-09-08

    For high-throughput structural studies of protein complexes of composition inferred from proteomics data, it is crucial that candidate complexes are selected accurately. Herein, we exemplify a procedure that combines a bioinformatics tool for complex selection with in vivo validation, to deliver structural results in a medium-throughput manner. We have selected a set of 20 yeast complexes, which were predicted to be feasible by either an automated bioinformatics algorithm, by manual inspection of primary data, or by literature searches. These complexes were validated with two straightforward and efficient biochemical assays, and heterologous expression technologies of complex components were then used to produce the complexes to assess their feasibility experimentally. Approximately one-half of the selected complexes were useful for structural studies, and we detail one particular success story. Our results underscore the importance of accurate target selection and validation in avoiding transient, unstable, or simply nonexistent complexes from the outset. Copyright © 2010 Elsevier Ltd. All rights reserved.

  16. Out-of-plane buckling of pantographic fabrics in displacement-controlled shear tests: experimental results and model validation

    NASA Astrophysics Data System (ADS)

    Barchiesi, Emilio; Ganzosch, Gregor; Liebold, Christian; Placidi, Luca; Grygoruk, Roman; Müller, Wolfgang H.

    2018-01-01

    Due to the latest advancements in 3D printing technology and rapid prototyping techniques, the production of materials with complex geometries has become more affordable than ever. Pantographic structures, because of their attractive features, both in dynamics and statics and both in elastic and inelastic deformation regimes, deserve to be thoroughly investigated with experimental and theoretical tools. Herein, experimental results relative to displacement-controlled large deformation shear loading tests of pantographic structures are reported. In particular, five differently sized samples are analyzed up to first rupture. Results show that the deformation behavior is strongly nonlinear, and the structures are capable of undergoing large elastic deformations without reaching complete failure. Finally, a cutting edge model is validated by means of these experimental results.

  17. Experimental and Quasi-Experimental Design.

    ERIC Educational Resources Information Center

    Cottrell, Edward B.

    With an emphasis on the problems of control of extraneous variables and threats to internal and external validity, the arrangement or design of experiments is discussed. The purpose of experimentation in an educational institution, and the principles governing true experimentation (randomization, replication, and control) are presented, as are…

  18. Experimental Validation Plan for the Xolotl Plasma-Facing Component Simulator Using Tokamak Sample Exposures

    NASA Astrophysics Data System (ADS)

    Chan, V. S.; Wong, C. P. C.; McLean, A. G.; Luo, G. N.; Wirth, B. D.

    2013-10-01

    The Xolotl code under development by PSI-SciDAC will enhance predictive modeling capability of plasma-facing materials under burning plasma conditions. The availability and application of experimental data to compare to code-calculated observables are key requirements to validate the breadth and content of physics included in the model and ultimately gain confidence in its results. A dedicated effort has been in progress to collect and organize a) a database of relevant experiments and their publications as previously carried out at sample exposure facilities in US and Asian tokamaks (e.g., DIII-D DiMES, and EAST MAPES), b) diagnostic and surface analysis capabilities available at each device, and c) requirements for future experiments with code validation in mind. The content of this evolving database will serve as a significant resource for the plasma-material interaction (PMI) community. Work supported in part by the US Department of Energy under GA-DE-SC0008698, DE-AC52-07NA27344 and DE-AC05-00OR22725.

  19. Faster experimental validation of microRNA targets using cold fusion cloning and a dual firefly-Renilla luciferase reporter assay.

    PubMed

    Alvarez, M Lucrecia

    2014-01-01

    Different target prediction algorithms have been developed to provide a list of candidate target genes for a given animal microRNA (miRNA). However, these computational approaches produce both false-positive and false-negative predictions. Therefore, the target genes of a specific miRNA identified in silico should be experimentally validated. In this chapter, we describe a step-by-step protocol for the experimental validation of a direct miRNA target using a faster Dual Firefly-Renilla Luciferase Reporter Assay. We describe how to construct reporter plasmids using the simple, fast, and highly efficient cold fusion cloning technology, which does not require ligase, phosphatase, or restriction enzymes. In addition, we provide a protocol for co-transfection of reporter plasmids with either miRNA mimics or miRNA inhibitors in human embryonic kidney 293 (HEK293) cells, as well as a description of how to measure Firefly and Renilla luciferase activity using the Dual-Glo Luciferase Assay kit. As an example of the use of this technology, we validate glucose-6-phosphate dehydrogenase (G6PD) as a direct target of miR-1207-5p.
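    The readout of such an assay reduces to simple arithmetic: Firefly activity from the 3'UTR reporter is normalized per well to the co-transfected Renilla control, and repression is the drop in the mean normalized ratio for the miRNA mimic relative to the negative control. A sketch with made-up replicate values (not data from the chapter):

```python
from statistics import mean

def normalized_ratios(firefly, renilla):
    """Per-well Firefly/Renilla ratios; the Renilla control corrects for
    transfection efficiency and well-to-well variability."""
    return [f / r for f, r in zip(firefly, renilla)]

def percent_repression(mimic, control):
    """Reduction of the mimic's mean normalized ratio vs. the control."""
    return 100.0 * (1.0 - mean(mimic) / mean(control))

# Hypothetical triplicate luminescence readings (arbitrary units)
control = normalized_ratios([980, 1010, 995], [1000, 1000, 1000])
mimic = normalized_ratios([590, 610, 600], [1000, 1000, 1000])
print(round(percent_repression(mimic, control), 1))  # prints 39.7, i.e. ~40% knock-down
```

    A substantial, statistically significant repression of the wild-type 3'UTR reporter (and its loss with a mutated target site) is what supports calling the interaction direct.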

  20. System-Level Experimental Validations for Supersonic Commercial Transport Aircraft Entering Service in the 2018-2020 Time Period

    NASA Technical Reports Server (NTRS)

    Magee, Todd E.; Wilcox, Peter A.; Fugal, Spencer R.; Acheson, Kurt E.; Adamson, Eric E.; Bidwell, Alicia L.; Shaw, Stephen G.

    2013-01-01

    This report describes the work conducted by The Boeing Company under American Recovery and Reinvestment Act (ARRA) and NASA funding to experimentally validate the conceptual design of a supersonic airliner feasible for entry into service in the 2018 to 2020 timeframe (NASA N+2 generation). The report discusses the design, analysis and development of a low-boom concept that meets aggressive sonic boom and performance goals for a cruise Mach number of 1.8. The design is achieved through integrated multidisciplinary optimization tools. The report also describes the detailed design and fabrication of both sonic boom and performance wind tunnel models of the low-boom concept. Additionally, a description of the detailed validation wind tunnel testing that was performed with the wind tunnel models is provided along with validation comparisons with pretest Computational Fluid Dynamics (CFD). Finally, the report describes the evaluation of existing NASA sonic boom pressure rail measurement instrumentation and a detailed description of new sonic boom measurement instrumentation that was constructed for the validation wind tunnel testing.

  1. Using experimental human influenza infections to validate a viral dynamic model and the implications for prediction.

    PubMed

    Chen, S C; You, S H; Liu, C Y; Chio, C P; Liao, C M

    2012-09-01

    The aim of this work was to use experimental infection data of human influenza to assess a simple viral dynamics model in epithelial cells and better understand the underlying complex factors governing the infection process. The developed study model expands on previous reports of a target cell-limited model with delayed virus production. Data from 10 published experimental infection studies of human influenza was used to validate the model. Our results elucidate, mechanistically, the associations between epithelial cells, human immune responses, and viral titres and were supported by the experimental infection data. We report that the maximum total number of free virions following infection is 1000-fold higher than the initially introduced titre. Our results indicated that the infection rates of unprotected epithelial cells probably play an important role in affecting viral dynamics. By simulating an advanced model of viral dynamics and applying it to experimental infection data of human influenza, we obtained important estimates of the infection rate. This work provides epidemiologically meaningful results, meriting further efforts to understand the causes and consequences of influenza A infection.
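    The target cell-limited model with delayed virus production mentioned above is conventionally written as four coupled ODEs for susceptible target cells T, latently infected cells I1, productively infected cells I2 and free virus V. A forward-Euler sketch under illustrative parameter values of the order used in the influenza-dynamics literature (not the paper's fitted estimates):

```python
def simulate_influenza(beta=2.7e-5, k=4.0, delta=4.0, p=1.2e-2, c=3.0,
                       T0=4e8, V0=7.5e-2, days=10.0, dt=1e-3):
    """Target cell-limited model with delayed virus production:
       dT/dt  = -beta*T*V          (cells become infected)
       dI1/dt =  beta*T*V - k*I1   (eclipse phase, not yet producing)
       dI2/dt =  k*I1 - delta*I2   (productively infected cells)
       dV/dt  =  p*I2 - c*V        (virion production and clearance)
    Integrates with a simple forward-Euler scheme and returns the peak
    free-virus titre reached over the simulated interval."""
    T, I1, I2, V = T0, 0.0, 0.0, V0
    v_peak = V
    for _ in range(round(days / dt)):
        dT = -beta * T * V
        dI1 = beta * T * V - k * I1
        dI2 = k * I1 - delta * I2
        dV = p * I2 - c * V
        T += dT * dt
        I1 += dI1 * dt
        I2 += dI2 * dt
        V += dV * dt
        v_peak = max(v_peak, V)
    return v_peak
```

    With these illustrative values the simulated peak titre exceeds the inoculum by several orders of magnitude, consistent with the amplification noted above; fitting rates such as the infection rate beta to experimental infection data is what the paper actually does.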

  2. Experimental validation of an integrated controls-structures design methodology for a class of flexible space structures

    NASA Technical Reports Server (NTRS)

    Maghami, Peiman G.; Gupta, Sandeep; Elliott, Kenny B.; Joshi, Suresh M.; Walz, Joseph E.

    1994-01-01

    This paper describes the first experimental validation of an optimization-based integrated controls-structures design methodology for a class of flexible space structures. The Controls-Structures-Interaction (CSI) Evolutionary Model, a laboratory test bed at Langley, is redesigned based on the integrated design methodology with two different dissipative control strategies. The redesigned structure is fabricated, assembled in the laboratory, and experimentally compared with the original test structure. Design guides are proposed and used in the integrated design process to ensure that the resulting structure can be fabricated. Experimental results indicate that the integrated design requires greater than 60 percent less average control power (by thruster actuators) than the conventional control-optimized design while maintaining the required line-of-sight performance, thereby confirming the analytical findings about the superiority of the integrated design methodology. Amenability of the integrated design structure to other control strategies is considered and evaluated analytically and experimentally. This work also demonstrates the capabilities of the Langley-developed design tool CSI DESIGN which provides a unified environment for structural and control design.

  3. Contact-coupled impact of slender rods: analysis and experimental validation

    PubMed Central

    Tibbitts, Ira B.; Kakarla, Deepika; Siskey, Stephanie; Ochoa, Jorge A.; Ong, Kevin L.; Brannon, Rebecca M.

    2013-01-01

    To validate models of contact mechanics in low speed structural impact, slender rods were impacted in a drop tower, and measurements of the contact and vibration were compared to analytical and finite element (FE) models. The contact area was recorded using a novel thin-film transfer technique, and the contact duration was measured using electrical continuity. Strain gages recorded the vibratory strain in one rod, and a laser Doppler vibrometer measured speed. The experiment was modeled analytically on a one-dimensional spatial domain using a quasi-static Hertzian contact law and a system of delay differential equations. The three-dimensional FE model used hexahedral elements, a penalty contact algorithm, and explicit time integration. A small submodel taken from the initial global FE model economically refined the analysis in the small contact region. Measured contact areas were within 6% of both models’ predictions, peak speeds within 2%, cyclic strains within 12 με (RMS value), and contact durations within 2 μs. The global FE model and the measurements revealed small disturbances, not predicted by the analytical model, believed to be caused by interactions of the non-planar stress wavefront with the rod’s ends. The accuracy of the predictions for this simple test, as well as the versatility of the diagnostic tools, validates the theoretical and computational models, corroborates instrument calibration, and establishes confidence that the same methods may be used in experimental and computational study of contact mechanics during impact of more complicated structures. Recommendations are made for applying the methods to a particular biomechanical problem: the edge-loading of a loose prosthetic hip joint which can lead to premature wear and prosthesis failure. PMID:24729630
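The quasi-static Hertzian contact law used in the analytical rod-impact model above can be sketched in a few lines. The material and geometry values (steel-on-steel, 10 mm radius) are illustrative assumptions, not the study's rod properties.

```python
import math

# Hedged sketch of a quasi-static Hertzian contact law for two
# identical elastic bodies. Material/geometry values are illustrative.

def hertz(delta, R=5e-3, E=200e9, nu=0.3):
    """Return (force in N, contact radius in m) for approach delta (m).
    E_star is the combined contact modulus; R_eff combines the two
    surface radii via 1/R_eff = 1/R1 + 1/R2."""
    E_star = E / (2.0 * (1.0 - nu ** 2))
    R_eff = R / 2.0
    a = math.sqrt(R_eff * delta)                      # contact radius
    F = (4.0 / 3.0) * E_star * math.sqrt(R_eff) * delta ** 1.5
    return F, a
```

The characteristic Hertz scalings follow directly: doubling the approach multiplies the force by 2^(3/2) and the contact radius by sqrt(2), which is what the thin-film contact-area measurements in the study are sensitive to.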

  4. Reversed phase HPLC for strontium ranelate: Method development and validation applying experimental design.

    PubMed

    Kovács, Béla; Kántor, Lajos Kristóf; Croitoru, Mircea Dumitru; Kelemen, Éva Katalin; Obreja, Mona; Nagy, Előd Ernő; Székely-Szentmiklósi, Blanka; Gyéresi, Árpád

    2018-06-01

A reversed-phase HPLC (RP-HPLC) method was developed for strontium ranelate using a full factorial screening experimental design. The analytical procedure was validated according to international guidelines for linearity, selectivity, sensitivity, accuracy and precision. A separate experimental design was used to demonstrate the robustness of the method. Strontium ranelate eluted at 4.4 minutes and showed no interference at 321 nm with the excipients used in the formulation. The method is linear in the range of 20-320 μg mL^-1 (R^2 = 0.99998). Recovery, tested in the range of 40-120 μg mL^-1, was 96.1-102.1%. Intra-day and intermediate precision RSDs ranged from 1.0-1.4% and 1.2-1.4%, respectively. The limit of detection and limit of quantitation were 0.06 and 0.20 μg mL^-1, respectively. The proposed technique is fast, cost-effective, reliable and reproducible, and is proposed for the routine analysis of strontium ranelate.
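The linearity figure reported in this record comes from an ordinary least-squares calibration fit, which is easy to sketch. The concentration/peak-area pairs below are synthetic placeholders spanning the stated 20-320 μg/mL range, not the paper's measured data.

```python
# Sketch of an ICH-style linearity check: fit peak area vs.
# concentration by ordinary least squares and report R^2.
# Data points are synthetic illustrations.

def linear_fit(xs, ys):
    """Return (slope, intercept, r_squared) of an OLS line."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    intercept = my - slope * mx
    ss_res = sum((y - (slope * x + intercept)) ** 2
                 for x, y in zip(xs, ys))
    ss_tot = sum((y - my) ** 2 for y in ys)
    return slope, intercept, 1.0 - ss_res / ss_tot

conc = [20, 80, 140, 200, 260, 320]           # ug/mL (synthetic)
area = [4.1, 16.3, 28.2, 40.6, 52.4, 64.9]    # detector units (synthetic)
slope, intercept, r2 = linear_fit(conc, area)
```

An R^2 at or above roughly 0.999 over the working range is the usual acceptance signal for linearity; the paper reports 0.99998.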

  5. Numerical predictions of shock propagation through unreactive and reactive liquids with experimental validation

    NASA Astrophysics Data System (ADS)

    Stekovic, Svjetlana; Nissen, Erin; Bhowmick, Mithun; Stewart, Donald S.; Dlott, Dana D.

    2017-06-01

The objective of this work is to numerically analyze shock behavior as it propagates through compressed unreactive and reactive liquids, such as liquid water and liquid nitromethane. Parameters such as pressure and density are analyzed using the Mie-Gruneisen EOS, and each multi-material system is modeled using the ALE3D software. The motivation for this study comes from high-resolution optical interferometer (PDV) and optical pyrometer measurements. In the experimental set-up, a liquid is placed between an Al 1100 plate and Pyrex BK-7 glass. A laser-driven Al 1100 flyer impacts the plate, causing the liquid to be highly compressed. The numerical model investigates the high-pressure, shock-compressed behavior of each liquid, the energy transfer, and the wave impedance at the interface of each material in contact. The numerical results from ALE3D will be validated against the experimental data. This work aims to provide further understanding of shock-compressed behavior and of how the shock influences phase transitions in each liquid.

  6. Numerical Simulation and Experimental Validation of Failure Caused by Vibration of a Fan

    NASA Astrophysics Data System (ADS)

    Zhou, Qiang; Han, Wu; Feng, Jianmei; Jia, Xiaohan; Peng, Xueyuan

    2017-08-01

This paper presents the root cause analysis of an unexpected fracture that occurred on the blades of a motor fan used in a natural gas reciprocating compressor unit. A finite element model was established to investigate the natural frequencies and modal shapes of the fan, and a modal test was performed to verify the numerical results, which agreed well with the experimental data. The third-order natural frequency was close to six times the excitation frequency, and the corresponding modal shape was a combination of bending and torsional vibration, which consequently contributed to low-order resonance and fracture failure of the fan. The torsional moment obtained from a torsional vibration analysis of the compressor shaft system was applied to the numerical model of the fan to evaluate its dynamic stress response. The results showed that the stress concentration regions in the numerical model were consistent with the locations of the fractures on the fan. Based on the numerical simulation and experimental validation, recommendations were given to improve the reliability of the motor fan.

  7. Electrocatalysis of borohydride oxidation: a review of density functional theory approach combined with experimental validation.

    PubMed

    Escaño, Mary Clare Sison; Arevalo, Ryan Lacdao; Gyenge, Elod; Kasai, Hideaki

    2014-09-03

    The electrocatalysis of borohydride oxidation is a complex, up-to-eight-electron transfer process, which is essential for development of efficient direct borohydride fuel cells. Here we review the progress achieved by density functional theory (DFT) calculations in explaining the adsorption of BH4(-) on various catalyst surfaces, with implications for electrocatalyst screening and selection. Wherever possible, we correlate the theoretical predictions with experimental findings, in order to validate the proposed models and to identify potential directions for further advancements.

  8. Electrocatalysis of borohydride oxidation: a review of density functional theory approach combined with experimental validation

    NASA Astrophysics Data System (ADS)

    Sison Escaño, Mary Clare; Lacdao Arevalo, Ryan; Gyenge, Elod; Kasai, Hideaki

    2014-09-01

    The electrocatalysis of borohydride oxidation is a complex, up-to-eight-electron transfer process, which is essential for development of efficient direct borohydride fuel cells. Here we review the progress achieved by density functional theory (DFT) calculations in explaining the adsorption of BH4- on various catalyst surfaces, with implications for electrocatalyst screening and selection. Wherever possible, we correlate the theoretical predictions with experimental findings, in order to validate the proposed models and to identify potential directions for further advancements.

  9. Experimental Definition and Validation of Protein Coding Transcripts in Chlamydomonas reinhardtii

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kourosh Salehi-Ashtiani; Jason A. Papin

Algal fuel sources promise unsurpassed yields in a carbon neutral manner that minimizes resource competition between agriculture and fuel crops. Many challenges must be addressed before algal biofuels can be accepted as a component of the fossil fuel replacement strategy. One significant challenge is that the cost of algal fuel production must become competitive with existing fuel alternatives. Algal biofuel production presents the opportunity to fine-tune microbial metabolic machinery for an optimal blend of biomass constituents and desired fuel molecules. Genome-scale model-driven algal metabolic design promises to facilitate both goals by directing the utilization of metabolites in the complex, interconnected metabolic networks to optimize production of the compounds of interest. Using Chlamydomonas reinhardtii as a model, we developed a systems-level methodology bridging metabolic network reconstruction with annotation and experimental verification of enzyme encoding open reading frames. We reconstructed a genome-scale metabolic network for this alga and devised a novel light-modeling approach that enables quantitative growth prediction for a given light source, resolving wavelength and photon flux. We experimentally verified transcripts accounted for in the network and physiologically validated model function through simulation and generation of new experimental growth data, providing high confidence in network contents and predictive applications. The network offers insight into algal metabolism and potential for genetic engineering and efficient light source design, a pioneering resource for studying light-driven metabolism and quantitative systems biology. Our approach to generate a predictive metabolic model integrated with cloned open reading frames provides a cost-effective platform to generate metabolic engineering resources. While the generated resources are specific to algal systems, the approach that we have developed is not specific to

  10. Integral nuclear data validation using experimental spent nuclear fuel compositions

    DOE PAGES

    Gauld, Ian C.; Williams, Mark L.; Michel-Sendis, Franco; ...

    2017-07-19

Measurements of the isotopic contents of spent nuclear fuel provide experimental data that are a prerequisite for validating computer codes and nuclear data for many spent fuel applications. Under the auspices of the Organisation for Economic Co-operation and Development (OECD) Nuclear Energy Agency (NEA) and guidance of the Expert Group on Assay Data of Spent Nuclear Fuel of the NEA Working Party on Nuclear Criticality Safety, a new database of expanded spent fuel isotopic compositions has been compiled. The database, Spent Fuel Compositions (SFCOMPO) 2.0, includes measured data for more than 750 fuel samples acquired from 44 different reactors and representing eight different reactor technologies. Measurements for more than 90 isotopes are included. This new database provides data essential for establishing the reliability of code systems for inventory predictions, but it also has broader potential application to nuclear data evaluation. Finally, the database is described together with adjoint-based sensitivity and uncertainty tools for transmutation systems, developed to quantify the importance of nuclear data to nuclide concentrations.

  11. Integral nuclear data validation using experimental spent nuclear fuel compositions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gauld, Ian C.; Williams, Mark L.; Michel-Sendis, Franco

Measurements of the isotopic contents of spent nuclear fuel provide experimental data that are a prerequisite for validating computer codes and nuclear data for many spent fuel applications. Under the auspices of the Organisation for Economic Co-operation and Development (OECD) Nuclear Energy Agency (NEA) and guidance of the Expert Group on Assay Data of Spent Nuclear Fuel of the NEA Working Party on Nuclear Criticality Safety, a new database of expanded spent fuel isotopic compositions has been compiled. The database, Spent Fuel Compositions (SFCOMPO) 2.0, includes measured data for more than 750 fuel samples acquired from 44 different reactors and representing eight different reactor technologies. Measurements for more than 90 isotopes are included. This new database provides data essential for establishing the reliability of code systems for inventory predictions, but it also has broader potential application to nuclear data evaluation. Finally, the database is described together with adjoint-based sensitivity and uncertainty tools for transmutation systems, developed to quantify the importance of nuclear data to nuclide concentrations.

  12. Optimization and experimental validation of electrostatic adhesive geometry

    NASA Astrophysics Data System (ADS)

    Ruffatto, D.; Shah, J.; Spenko, M.

This paper introduces a method to optimize the electrode geometry of electrostatic adhesives for robotic gripping, attachment, and manipulation applications. Electrostatic adhesion is achieved by applying a high voltage potential, on the order of kV, to a set of electrodes, which generates an electric field. The electric field polarizes the substrate material and creates an adhesion force. Previous attempts at creating electrostatic adhesives have shown them to be effective, but researchers have made no effort to optimize the electrode configuration and geometry. We have shown that by optimizing the geometry of the electrode configuration, the electric field strength, and therefore the adhesion force, is enhanced. To accomplish this, COMSOL Multiphysics was utilized to evaluate the average electric field generated by a given electrode geometry. Several electrode patterns were evaluated, including parallel conductors, concentric circles, Hilbert curves (a fractal geometry) and spirals. The arrangement of the electrodes in concentric circles with varying electrode widths proved to be the most effective. The most effective sizing was to use the smallest gap spacing allowable coupled with a variable electrode width. These results were experimentally validated on several different surfaces including drywall, wood, tile, glass, and steel. A new manufacturing process allowing for the fabrication of thin, conformal electrostatic adhesive pads was utilized. By combining the optimized electrode geometry with the new fabrication process we are able to demonstrate a marked improvement of up to 500% in shear pressure when compared to previously published values.
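The link between field strength and adhesion that motivates the geometry optimization can be seen in a back-of-envelope estimate: in a parallel-plate idealization, the normal (Maxwell) pressure on a polarized substrate scales with the square of the field. The voltage, gap, and permittivity values below are illustrative assumptions; the interdigitated geometries in the paper require a field solver such as the COMSOL model it describes.

```python
# Parallel-plate idealization of electrostatic adhesion pressure:
# P = 0.5 * eps0 * eps_r * E^2, with E = V / gap.
# All numeric inputs are illustrative assumptions.

EPS0 = 8.854e-12  # vacuum permittivity, F/m

def adhesion_pressure(volts, gap_m, eps_r):
    """Estimated normal adhesion pressure (Pa)."""
    field = volts / gap_m
    return 0.5 * EPS0 * eps_r * field ** 2
```

Because the pressure goes as E^2, doubling the voltage (or halving the effective gap) quadruples the estimate, which is why geometry changes that raise the average field pay off directly in adhesion force.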

  13. Three-dimensional deformation response of a NiTi shape memory helical-coil actuator during thermomechanical cycling: experimentally validated numerical model

    NASA Astrophysics Data System (ADS)

    Dhakal, B.; Nicholson, D. E.; Saleeb, A. F.; Padula, S. A., II; Vaidyanathan, R.

    2016-09-01

    Shape memory alloy (SMA) actuators often operate under a complex state of stress for an extended number of thermomechanical cycles in many aerospace and engineering applications. Hence, it becomes important to account for multi-axial stress states and deformation characteristics (which evolve with thermomechanical cycling) when calibrating any SMA model for implementation in large-scale simulation of actuators. To this end, the present work is focused on the experimental validation of an SMA model calibrated for the transient and cyclic evolutionary behavior of shape memory Ni49.9Ti50.1, for the actuation of axially loaded helical-coil springs. The approach requires both experimental and computational aspects to appropriately assess the thermomechanical response of these multi-dimensional structures. As such, an instrumented and controlled experimental setup was assembled to obtain temperature, torque, degree of twist and extension, while controlling end constraints during heating and cooling of an SMA spring under a constant externally applied axial load. The computational component assesses the capabilities of a general, multi-axial, SMA material-modeling framework, calibrated for Ni49.9Ti50.1 with regard to its usefulness in the simulation of SMA helical-coil spring actuators. Axial extension, being the primary response, was examined on an axially-loaded spring with multiple active coils. Two different conditions of end boundary constraint were investigated in both the numerical simulations as well as the validation experiments: Case (1) where the loading end is restrained against twist (and the resulting torque measured as the secondary response) and Case (2) where the loading end is free to twist (and the degree of twist measured as the secondary response). The present study focuses on the transient and evolutionary response associated with the initial isothermal loading and the subsequent thermal cycles under applied constant axial load. The experimental

  14. Three-dimensional computational fluid dynamics modelling and experimental validation of the Jülich Mark-F solid oxide fuel cell stack

    NASA Astrophysics Data System (ADS)

    Nishida, R. T.; Beale, S. B.; Pharoah, J. G.; de Haart, L. G. J.; Blum, L.

    2018-01-01

This work is among the first in which the results of an extensive experimental research programme are compared to performance calculations from a comprehensive computational fluid dynamics model of a solid oxide fuel cell stack. The model, which combines electrochemical reactions with momentum, heat, and mass transport, is used to obtain results for an established industrial-scale fuel cell stack design with complex manifolds. To validate the model, comparisons with experimentally gathered voltage and temperature data are made for the Jülich Mark-F, 18-cell stack operating in a test furnace. Good agreement is obtained between the model and experiment for cell voltages and temperature distributions, confirming the validity of the computational methodology for stack design. Transient effects during the current ramp-up in the experiment may explain why the measured average voltage is lower than the model's prediction for the power curve.

  15. Simulation of particle motion in a closed conduit validated against experimental data

    NASA Astrophysics Data System (ADS)

    Dolanský, Jindřich

    2015-05-01

Motion of a number of spherical particles in a closed conduit is examined by means of both simulation and experiment. The bed of the conduit is covered by stationary spherical particles of the size of the moving particles. The flow is driven by experimentally measured velocity profiles, which are inputs to the simulation. Altering the input velocity profiles generates various trajectory patterns. A lattice Boltzmann method (LBM) based simulation is developed to study the mutual interactions of the flow and the particles, modelling both the particle motion and the fluid flow. The entropic LBM is employed to deal with flows characterized by high Reynolds numbers. The entropic modification of the LBM, along with the enhanced refinement of the lattice grid, increases the demands on computational resources. Because the LBM is inherently parallel, the computations can be accelerated with the MATLAB Parallel Computing Toolbox and CUDA GPU computing. The trajectories of the particles determined within the LBM simulation are validated against data gained from the experiments. The compatibility of the simulation results with the outputs of experimental measurements is evaluated, the accuracy of the applied approach is assessed, and the stability and efficiency of the simulation are also considered.

  16. A Simulation Study of Threats to Validity in Quasi-Experimental Designs: Interrelationship between Design, Measurement, and Analysis.

    PubMed

    Holgado-Tello, Fco P; Chacón-Moscoso, Salvador; Sanduvete-Chaves, Susana; Pérez-Gil, José A

    2016-01-01

    The Campbellian tradition provides a conceptual framework to assess threats to validity. On the other hand, different models of causal analysis have been developed to control estimation biases in different research designs. However, the link between design features, measurement issues, and concrete impact estimation analyses is weak. In order to provide an empirical solution to this problem, we use Structural Equation Modeling (SEM) as a first approximation to operationalize the analytical implications of threats to validity in quasi-experimental designs. Based on the analogies established between the Classical Test Theory (CTT) and causal analysis, we describe an empirical study based on SEM in which range restriction and statistical power have been simulated in two different models: (1) A multistate model in the control condition (pre-test); and (2) A single-trait-multistate model in the control condition (post-test), adding a new mediator latent exogenous (independent) variable that represents a threat to validity. Results show, empirically, how the differences between both the models could be partially or totally attributed to these threats. Therefore, SEM provides a useful tool to analyze the influence of potential threats to validity.

  17. A Simulation Study of Threats to Validity in Quasi-Experimental Designs: Interrelationship between Design, Measurement, and Analysis

    PubMed Central

    Holgado-Tello, Fco. P.; Chacón-Moscoso, Salvador; Sanduvete-Chaves, Susana; Pérez-Gil, José A.

    2016-01-01

    The Campbellian tradition provides a conceptual framework to assess threats to validity. On the other hand, different models of causal analysis have been developed to control estimation biases in different research designs. However, the link between design features, measurement issues, and concrete impact estimation analyses is weak. In order to provide an empirical solution to this problem, we use Structural Equation Modeling (SEM) as a first approximation to operationalize the analytical implications of threats to validity in quasi-experimental designs. Based on the analogies established between the Classical Test Theory (CTT) and causal analysis, we describe an empirical study based on SEM in which range restriction and statistical power have been simulated in two different models: (1) A multistate model in the control condition (pre-test); and (2) A single-trait-multistate model in the control condition (post-test), adding a new mediator latent exogenous (independent) variable that represents a threat to validity. Results show, empirically, how the differences between both the models could be partially or totally attributed to these threats. Therefore, SEM provides a useful tool to analyze the influence of potential threats to validity. PMID:27378991

  18. Experimental validation of a numerical model for subway induced vibrations

    NASA Astrophysics Data System (ADS)

    Gupta, S.; Degrande, G.; Lombaert, G.

    2009-04-01

    This paper presents the experimental validation of a coupled periodic finite element-boundary element model for the prediction of subway induced vibrations. The model fully accounts for the dynamic interaction between the train, the track, the tunnel and the soil. The periodicity or invariance of the tunnel and the soil in the longitudinal direction is exploited using the Floquet transformation, which allows for an efficient formulation in the frequency-wavenumber domain. A general analytical formulation is used to compute the response of three-dimensional invariant or periodic media that are excited by moving loads. The numerical model is validated by means of several experiments that have been performed at a site in Regent's Park on the Bakerloo line of London Underground. Vibration measurements have been performed on the axle boxes of the train, on the rail, the tunnel invert and the tunnel wall, and in the free field, both at the surface and at a depth of 15 m. Prior to these vibration measurements, the dynamic soil characteristics and the track characteristics have been determined. The Bakerloo line tunnel of London Underground has been modelled using the coupled periodic finite element-boundary element approach and free field vibrations due to the passage of a train at different speeds have been predicted and compared to the measurements. The correspondence between the predicted and measured response in the tunnel is reasonably good, although some differences are observed in the free field. The discrepancies are explained on the basis of various uncertainties involved in the problem. The variation in the response with train speed is similar for the measurements as well as the predictions. This study demonstrates the applicability of the coupled periodic finite element-boundary element model to make realistic predictions of the vibrations from underground railways.

  19. Comparison of LIDAR system performance for alternative single-mode receiver architectures: modeling and experimental validation

    NASA Astrophysics Data System (ADS)

    Toliver, Paul; Ozdur, Ibrahim; Agarwal, Anjali; Woodward, T. K.

    2013-05-01

In this paper, we describe a detailed performance comparison of alternative single-pixel, single-mode LIDAR architectures including (i) linear-mode APD-based direct-detection, (ii) optically-preamplified PIN receiver, (iii) PIN-based coherent-detection, and (iv) Geiger-mode single-photon-APD counting. Such a comparison is useful when considering next-generation LIDAR on a chip, which would allow one to leverage extensive waveguide-based structures and processing elements developed for telecom and apply them to small form-factor sensing applications. Models of four LIDAR transmit and receive systems are described in detail, which include not only the dominant sources of receiver noise commonly assumed in each of the four detection limits, but also additional noise terms present in realistic implementations. These receiver models are validated through the analysis of detection statistics collected from an experimental LIDAR testbed. The receiver is reconfigurable into four modes of operation, while transmit waveforms and channel characteristics are held constant. The use of a diffuse hard target highlights the importance of including speckle noise terms in the overall system analysis. All measurements are done at 1550 nm, which offers multiple system advantages including less stringent eye safety requirements and compatibility with available telecom components, optical amplification, and photonic integration. Ultimately, the experimentally-validated detection statistics can be used as part of an end-to-end system model for projecting rate, range, and resolution performance limits and tradeoffs of alternative integrated LIDAR architectures.

  20. Experimental validation of a coupled neutron-photon inverse radiation transport solver

    NASA Astrophysics Data System (ADS)

    Mattingly, John; Mitchell, Dean J.; Harding, Lee T.

    2011-10-01

    Sandia National Laboratories has developed an inverse radiation transport solver that applies nonlinear regression to coupled neutron-photon deterministic transport models. The inverse solver uses nonlinear regression to fit a radiation transport model to gamma spectrometry and neutron multiplicity counting measurements. The subject of this paper is the experimental validation of that solver. This paper describes a series of experiments conducted with a 4.5 kg sphere of α-phase, weapons-grade plutonium. The source was measured bare and reflected by high-density polyethylene (HDPE) spherical shells with total thicknesses between 1.27 and 15.24 cm. Neutron and photon emissions from the source were measured using three instruments: a gross neutron counter, a portable neutron multiplicity counter, and a high-resolution gamma spectrometer. These measurements were used as input to the inverse radiation transport solver to evaluate the solver's ability to correctly infer the configuration of the source from its measured radiation signatures.

  1. Zero-G experimental validation of a robotics-based inertia identification algorithm

    NASA Astrophysics Data System (ADS)

    Bruggemann, Jeremy J.; Ferrel, Ivann; Martinez, Gerardo; Xie, Pu; Ma, Ou

    2010-04-01

The need to efficiently identify the changing inertial properties of on-orbit spacecraft is becoming more critical as satellite on-orbit services, such as refueling and repairing, become increasingly aggressive and complex. This need stems from the fact that a spacecraft's control system relies on knowledge of the spacecraft's inertia parameters. However, the inertia parameters may change during flight for reasons such as fuel usage, payload deployment or retrieval, and docking/capturing operations. New Mexico State University's Dynamics, Controls, and Robotics Research Group has proposed a robotics-based method of identifying unknown spacecraft inertia properties [1]. Previous methods require firing thrusters with known thrust and measuring the thrust together with the resulting velocity and acceleration changes. The new method utilizes the concept of momentum conservation, while employing a robotic device powered by renewable energy to excite the state of the satellite. Thus, it requires no fuel usage or force and acceleration measurements. The method has been well studied in theory and demonstrated by simulation. However, its experimental validation is challenging because a 6-degree-of-freedom motion in a zero-gravity condition is required. This paper presents an ongoing effort to test the inertia identification method onboard the NASA zero-G aircraft. The design and capability of the test unit will be discussed in addition to the flight data. This paper also introduces the design and development of an air-bearing-based test used to partially validate the method, in addition to the approach used to obtain reference values for the test system's inertia parameters for comparison with the algorithm results.
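The momentum-conservation idea behind the method can be illustrated with a single-axis toy version: with no external torque, the total angular momentum I*omega + h_arm is constant, so moving an arm of known momentum h and measuring the body rate omega before and after yields the unknown inertia I without any thruster firings. The numbers and the one-axis reduction are illustrative; the actual method identifies the full set of rigid-body inertia parameters in 6-DOF motion.

```python
# Single-axis toy of momentum-conservation inertia identification.
# Conservation gives I*omega1 + h1 == I*omega2 + h2, so the body
# inertia I follows from rate measurements alone. Values are synthetic.

def identify_inertia(omega1, h1, omega2, h2):
    """Solve I*omega1 + h1 == I*omega2 + h2 for I."""
    return (h2 - h1) / (omega1 - omega2)

# Synthetic "measurement": a body with true inertia 50 kg m^2.
I_true = 50.0
omega1, h1 = 0.10, 0.0                          # rad/s, N m s
h2 = 2.0                                        # arm momentum after maneuver
omega2 = (I_true * omega1 + h1 - h2) / I_true   # what a gyro would read
I_est = identify_inertia(omega1, h1, omega2, h2)
```

No force or acceleration measurement appears anywhere in the estimate, which is the point the abstract makes about avoiding thruster firings.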

  2. Computational Prediction and Rationalization, and Experimental Validation of Handedness Induction in Helical Aromatic Oligoamide Foldamers.

    PubMed

    Liu, Zhiwei; Hu, Xiaobo; Abramyan, Ara M; Mészáros, Ádám; Csékei, Márton; Kotschy, András; Huc, Ivan; Pophristic, Vojislava

    2017-03-13

    Metadynamics simulations were used to describe the conformational energy landscapes of several helically folded aromatic quinoline carboxamide oligomers bearing a single chiral group at either the C or N terminus. The calculations allowed the prediction of whether a helix handedness bias occurs under the influence of the chiral group and gave insight into the interactions (sterics, electrostatics, hydrogen bonds) responsible for a particular helix sense preference. In the case of camphanyl-based and morpholine-based chiral groups, experimental data confirming the validity of the calculations were already available. New chiral groups with a proline residue were also investigated and were predicted to induce handedness. This prediction was verified experimentally through the synthesis of proline-containing monomers, their incorporation into an oligoamide sequence by solid phase synthesis and the investigation of handedness induction by NMR spectroscopy and circular dichroism. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  3. Experimentally valid predictions of muscle force and EMG in models of motor-unit function are most sensitive to neural properties.

    PubMed

    Keenan, Kevin G; Valero-Cuevas, Francisco J

    2007-09-01

    Computational models of motor-unit populations are the objective implementations of the hypothesized mechanisms by which neural and muscle properties give rise to electromyograms (EMGs) and force. However, the variability/uncertainty of the parameters used in these models--and how they affect predictions--confounds assessing these hypothesized mechanisms. We perform a large-scale computational sensitivity analysis on the state-of-the-art computational model of surface EMG, force, and force variability by combining a comprehensive review of published experimental data with Monte Carlo simulations. To exhaustively explore model performance and robustness, we ran numerous iterative simulations each using a random set of values for nine commonly measured motor neuron and muscle parameters. Parameter values were sampled across their reported experimental ranges. Convergence after 439 simulations found that only 3 simulations met our two fitness criteria: approximating the well-established experimental relations for the scaling of EMG amplitude and force variability with mean force. An additional 424 simulations preferentially sampling the neighborhood of those 3 valid simulations converged to reveal 65 additional sets of parameter values for which the model predictions approximate the experimentally known relations. We find the model is not sensitive to muscle properties but very sensitive to several motor neuron properties--especially peak discharge rates and recruitment ranges. Therefore to advance our understanding of EMG and muscle force, it is critical to evaluate the hypothesized neural mechanisms as implemented in today's state-of-the-art models of motor unit function. We discuss experimental and analytical avenues to do so as well as new features that may be added in future implementations of motor-unit models to improve their experimental validity.
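The Monte Carlo strategy in this abstract, sampling parameters uniformly across their reported experimental ranges and keeping only sets that satisfy fitness criteria, can be sketched generically. The parameter names and ranges below are illustrative stand-ins, not the paper's actual motor-neuron/muscle values:

```python
import numpy as np

# Hypothetical parameter ranges (name: (low, high)) standing in for the nine
# experimentally measured motor neuron and muscle parameters.
RANGES = {
    "peak_discharge_rate": (20.0, 50.0),   # Hz
    "recruitment_range": (10.0, 70.0),     # % of max excitation
    "twitch_tension_ratio": (50.0, 130.0),
}

def sample_parameters(rng):
    """Draw one random parameter set, each value uniform over its range."""
    return {k: rng.uniform(lo, hi) for k, (lo, hi) in RANGES.items()}

def monte_carlo_search(predict, fitness, n_iter, seed=0):
    """Keep only parameter sets whose model predictions satisfy every fitness
    criterion (analogous to requiring that simulated EMG amplitude and force
    variability scale with mean force as observed experimentally)."""
    rng = np.random.default_rng(seed)
    kept = []
    for _ in range(n_iter):
        params = sample_parameters(rng)
        if all(crit(predict(params)) for crit in fitness):
            kept.append(params)
    return kept
```

Sensitivity then follows from comparing the distribution of each parameter within the valid sets against its full sampled range: parameters the model is sensitive to end up tightly clustered.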

  4. Multiscale free-space optical interconnects for intrachip global communication: motivation, analysis, and experimental validation.

    PubMed

    McFadden, Michael J; Iqbal, Muzammil; Dillon, Thomas; Nair, Rohit; Gu, Tian; Prather, Dennis W; Haney, Michael W

    2006-09-01

    The use of optical interconnects for communication between points on a microchip is motivated by system-level interconnect modeling showing the saturation of metal wire capacity at the global layer. Free-space optical solutions are analyzed for intrachip communication at the global layer. A multiscale solution comprising microlenses, etched compound slope microprisms, and a curved mirror is shown to outperform a single-scale alternative. Microprisms are designed and fabricated and inserted into an optical setup apparatus to experimentally validate the concept. The multiscale free-space system is shown to have the potential to provide the bandwidth density and configuration flexibility required for global communication in future generations of microchips.

  5. Experimental validation of the Helmholtz equation for the surface potential of Langmuir monolayers

    NASA Astrophysics Data System (ADS)

    El Abed, Abdel I.

    2009-10-01

    We show in this paper that monolayers of the nonhydrophilic F8H18 semifluorinated n-alkane, when spread on the hydrophobic top of an alamethicin Langmuir monolayer, constitute a very good experimental system for checking the validity of the Helmholtz equation. This system allows for good agreement between measured and calculated surface potentials of un-ionized Langmuir monolayers. We also show that the relative dielectric constant of the F8H18 monolayer does not vary upon compression of the monolayer; the measured value of 2.9 is in very good agreement with literature data. We attribute this behavior to the self-aggregation of F8H18 molecules in nanosized circular domains whose size remains constant upon compression, as shown by atomic force microscopy.
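The Helmholtz equation under test here is the standard parallel-plate-capacitor model for the surface potential of a monolayer, which in its usual textbook form (not quoted from this paper) reads:

```latex
\Delta V \;=\; \frac{\mu_{\perp}}{\varepsilon_0\,\varepsilon_r\,A}
```

where \(\Delta V\) is the surface potential change, \(\mu_{\perp}\) the normal component of the effective molecular dipole moment, \(A\) the area per molecule, \(\varepsilon_0\) the vacuum permittivity, and \(\varepsilon_r\) the relative dielectric constant of the film; the measured value \(\varepsilon_r \approx 2.9\) reported above enters this expression directly.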

  6. Numerical and experimental validation for the thermal transmittance of windows with cellular shades

    DOE PAGES

    Hart, Robert

    2018-02-21

    Some highly energy-efficient window attachment products are available today, but more rapid market adoption would be facilitated by fair performance metrics. It is important to have validated simulation tools to provide a basis for this analysis. This paper outlines a review and validation of the ISO 15099 center-of-glass zero-solar-load heat transfer correlations for windows with cellular shades. Thermal transmittance was measured experimentally, simulated using computational fluid dynamics (CFD) analysis, and simulated utilizing correlations from ISO 15099 as implemented in Berkeley Lab WINDOW and THERM software. CFD analysis showed ISO 15099 underestimates heat flux of rectangular cavities by up to 60% when aspect ratio (AR) = 1 and overestimates heat flux by up to 20% when AR = 0.5. CFD analysis also showed that the wave-type surfaces of cellular shades have less than a 2% impact on heat flux through the cavities and less than 5% on natural convection at the room-side surface. WINDOW was shown to accurately represent heat flux of the measured configurations to a mean relative error of 0.5% and a standard deviation of 3.8%. Finally, several shade parameters showed significant influence on correlation accuracy, including the distance between shade and glass, inconsistency in cell stretch, size of perimeter gaps, and the mounting hardware.

  7. Numerical and experimental validation for the thermal transmittance of windows with cellular shades

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hart, Robert

    Some highly energy-efficient window attachment products are available today, but more rapid market adoption would be facilitated by fair performance metrics. It is important to have validated simulation tools to provide a basis for this analysis. This paper outlines a review and validation of the ISO 15099 center-of-glass zero-solar-load heat transfer correlations for windows with cellular shades. Thermal transmittance was measured experimentally, simulated using computational fluid dynamics (CFD) analysis, and simulated utilizing correlations from ISO 15099 as implemented in Berkeley Lab WINDOW and THERM software. CFD analysis showed ISO 15099 underestimates heat flux of rectangular cavities by up to 60% when aspect ratio (AR) = 1 and overestimates heat flux by up to 20% when AR = 0.5. CFD analysis also showed that the wave-type surfaces of cellular shades have less than a 2% impact on heat flux through the cavities and less than 5% on natural convection at the room-side surface. WINDOW was shown to accurately represent heat flux of the measured configurations to a mean relative error of 0.5% and a standard deviation of 3.8%. Finally, several shade parameters showed significant influence on correlation accuracy, including the distance between shade and glass, inconsistency in cell stretch, size of perimeter gaps, and the mounting hardware.

  8. Experimental Validation of Numerical Simulations for an Acoustic Liner in Grazing Flow

    NASA Technical Reports Server (NTRS)

    Tam, Christopher K. W.; Pastouchenko, Nikolai N.; Jones, Michael G.; Watson, Willie R.

    2013-01-01

    A coordinated experimental and numerical simulation effort is carried out to improve our understanding of the physics of acoustic liners in a grazing flow as well as our computational aeroacoustics (CAA) prediction capability. A numerical simulation code based on advanced CAA methods is developed. In a parallel effort, experiments are performed using the Grazing Flow Impedance Tube at the NASA Langley Research Center. In the experiment, a liner is installed in the upper wall of a rectangular flow duct with a 2 inch by 2.5 inch cross section. Spatial distributions of sound pressure levels and relative phases are measured on the wall opposite the liner in the presence of a Mach 0.3 grazing flow. The computer code is validated by comparing computed results with experimental measurements. Good agreement is found. The numerical simulation code is then used to investigate the physical properties of the acoustic liner. It is shown that an acoustic liner can produce self-noise in the presence of a grazing flow and that a feedback acoustic resonance mechanism is responsible for the generation of this liner self-noise. In addition, the same mechanism also creates additional liner drag. An estimate, based on numerical simulation data, indicates that for a resonant liner with a 10% open area ratio, the drag increase would be about 4% of the turbulent boundary layer drag over a flat wall.

  9. Final Design and Experimental Validation of the Thermal Performance of the LHC Lattice Cryostats

    NASA Astrophysics Data System (ADS)

    Bourcey, N.; Capatina, O.; Parma, V.; Poncet, A.; Rohmig, P.; Serio, L.; Skoczen, B.; Tock, J.-P.; Williams, L. R.

    2004-06-01

    The recent commissioning and operation of the LHC String 2 have given a first experimental validation of the global thermal performance of the LHC lattice cryostat at nominal cryogenic conditions. The cryostat, designed to minimize heat inleak from ambient temperature, houses under vacuum and thermally protects the cold mass, which contains the LHC twin-aperture superconducting magnets operating at 1.9 K in superfluid helium. Mechanical components linking the cold mass to the vacuum vessel, such as support posts and insulation vacuum barriers, are designed with efficient thermalisations for heat interception to minimise heat conduction. Heat inleak by radiation is reduced by employing multilayer insulation (MLI) wrapped around the cold mass and around an aluminium thermal shield cooled to about 60 K. Measurements of the total helium vaporization rate in String 2 give, after subtraction of supplementary heat loads and end effects, an estimate of the total thermal load to a standard LHC cell (107 m) including two Short Straight Sections and six dipole cryomagnets. Temperature sensors installed at critical locations provide a temperature mapping which allows validation of the calculated and estimated thermal performance of the cryostat components, including the efficiency of the heat interceptions.

  10. Finite Element Vibration Modeling and Experimental Validation for an Aircraft Engine Casing

    NASA Astrophysics Data System (ADS)

    Rabbitt, Christopher

    This thesis presents a procedure for the development and validation of a theoretical vibration model, applies this procedure to a pair of aircraft engine casings, and compares select parameters from experimental testing of those casings to those from a theoretical model using the Modal Assurance Criterion (MAC) and linear regression coefficients. A novel method of determining the optimal MAC between axisymmetric results is developed and employed. It is concluded that the dynamic finite element models developed as part of this research are fully capable of modelling the modal parameters within the frequency range of interest. Confidence intervals calculated in this research for correlation coefficients provide important information regarding the reliability of predictions, and it is recommended that these intervals be calculated for all comparable coefficients. The procedure outlined for aligning mode shapes around an axis of symmetry proved useful, and the results are promising for the development of further optimization techniques.
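The Modal Assurance Criterion used above to compare experimental and finite-element mode shapes is a standard normalized correlation between two shape vectors; a minimal implementation (generic, not the thesis's code) is:

```python
import numpy as np

def mac(phi_a, phi_b):
    """Modal Assurance Criterion between two mode-shape vectors:
    MAC = |phi_a^H . phi_b|^2 / ((phi_a^H . phi_a)(phi_b^H . phi_b)).
    Returns 1.0 when the shapes are identical up to scaling, and values
    near 0 when they are (nearly) orthogonal."""
    num = abs(np.vdot(phi_a, phi_b)) ** 2
    den = np.vdot(phi_a, phi_a).real * np.vdot(phi_b, phi_b).real
    return num / den
```

For the axisymmetric-alignment problem the thesis addresses, one would evaluate this MAC over a sweep of circumferential phase shifts of one of the two shapes and keep the maximum, since an axisymmetric casing's mode shapes are defined only up to a rotation about the symmetry axis.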

  11. Development of robust flexible OLED encapsulations using simulated estimations and experimental validations

    NASA Astrophysics Data System (ADS)

    Lee, Chang-Chun; Shih, Yan-Shin; Wu, Chih-Sheng; Tsai, Chia-Hao; Yeh, Shu-Tang; Peng, Yi-Hao; Chen, Kuang-Jung

    2012-07-01

    This work analyses the overall stress/strain characteristics of flexible encapsulations with organic light-emitting diode (OLED) devices. A robust methodology, composed of a mechanical model of multi-thin-film stacks under bending loads and related stress simulations based on nonlinear finite element analysis (FEA), is proposed and validated against related experimental data. With various geometrical combinations of cover plate, stacked thin films and plastic substrate, the position of the neutral axis (NA) plane, which is regarded as a key design parameter for minimizing stress impact on the OLED devices concerned, is acquired using the present methodology. The results point out that both the thickness and the mechanical properties of the cover plate help determine the NA location. In addition, several concave and convex bending radii are applied to examine the mechanical tolerance and to provide insight into the estimated reliability of foldable OLED encapsulations.
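The neutral-axis location discussed above can be estimated with the classical laminate formula z_NA = Σ E_i t_i z_i / Σ E_i t_i, where z_i is each layer's mid-plane height. The sketch below is a generic implementation of that textbook relation, not the paper's FEA model, and ignores refinements such as adhesive compliance or nonlinear material behavior:

```python
def neutral_axis(layers):
    """Bending neutral-axis position of a multilayer stack (classical
    laminate estimate): z_NA = sum(E_i*t_i*z_i) / sum(E_i*t_i), with z_i
    each layer's mid-plane height measured from the bottom surface.
    `layers` is a list of (elastic_modulus, thickness) from bottom to top."""
    z = 0.0
    num = den = 0.0
    for E, t in layers:
        zc = z + t / 2.0      # mid-plane height of this layer
        num += E * t * zc
        den += E * t
        z += t                # advance to the top of this layer
    return num / den
```

Placing the brittle OLED layers near this height minimizes their bending strain, which is why the abstract treats the NA position as the key design parameter; the formula makes explicit that both cover-plate thickness and modulus shift it.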

  12. Experimentally Manipulating Items Informs on the (Limited) Construct and Criterion Validity of the Humor Styles Questionnaire

    PubMed Central

    Ruch, Willibald; Heintz, Sonja

    2017-01-01

    How strongly does humor (i.e., the construct-relevant content) in the Humor Styles Questionnaire (HSQ; Martin et al., 2003) determine the responses to this measure (i.e., construct validity)? Also, how much does humor influence the relationships of the four HSQ scales, namely affiliative, self-enhancing, aggressive, and self-defeating, with personality traits and subjective well-being (i.e., criterion validity)? The present paper answers these two questions by experimentally manipulating the 32 items of the HSQ to only (or mostly) contain humor (i.e., construct-relevant content) or to substitute the humor content with non-humorous alternatives (i.e., only assessing construct-irrelevant context). Study 1 (N = 187) showed that the HSQ affiliative scale was mainly determined by humor, self-enhancing and aggressive were determined by both humor and non-humorous context, and self-defeating was primarily determined by the context. This suggests that humor is not the primary source of the variance in three of the HSQ scales, thereby limiting their construct validity. Study 2 (N = 261) showed that the relationships of the HSQ scales to the Big Five personality traits and subjective well-being (positive affect, negative affect, and life satisfaction) were consistently reduced (personality) or vanished (subjective well-being) when the non-humorous contexts in the HSQ items were controlled for. For the HSQ self-defeating scale, the pattern of relationships to personality was also altered, supporting a positive rather than a negative view of the humor in this humor style. The present findings thus call for a reevaluation of the role that humor plays in the HSQ (construct validity) and in the relationships to personality and well-being (criterion validity). PMID:28473794

  13. Construction and Experimental Validation of a Petri Net Model of Wnt/β-Catenin Signaling.

    PubMed

    Jacobsen, Annika; Heijmans, Nika; Verkaar, Folkert; Smit, Martine J; Heringa, Jaap; van Amerongen, Renée; Feenstra, K Anton

    2016-01-01

    The Wnt/β-catenin signaling pathway is important for multiple developmental processes and tissue maintenance in adults. Consequently, deregulated signaling is involved in a range of human diseases including cancer and developmental defects. A better understanding of the intricate regulatory mechanism and effect of physiological (active) and pathophysiological (hyperactive) WNT signaling is important for predicting treatment response and developing novel therapies. The constitutively expressed CTNNB1 (commonly and hereafter referred to as β-catenin) is degraded by a destruction complex, composed of amongst others AXIN1 and GSK3. The destruction complex is inhibited during active WNT signaling, leading to β-catenin stabilization and induction of β-catenin/TCF target genes. In this study we investigated the mechanism and effect of β-catenin stabilization during active and hyperactive WNT signaling in a combined in silico and in vitro approach. We constructed a Petri net model of Wnt/β-catenin signaling including main players from the plasma membrane (WNT ligands and receptors), cytoplasmic effectors and the downstream negative feedback target gene AXIN2. We validated that our model can be used to simulate both active (WNT stimulation) and hyperactive (GSK3 inhibition) signaling by comparing our simulation and experimental data. We used this experimentally validated model to get further insights into the effect of the negative feedback regulator AXIN2 upon WNT stimulation and observed an attenuated β-catenin stabilization. We furthermore simulated the effect of APC inactivating mutations, yielding a stabilization of β-catenin levels comparable to the Wnt-pathway activities observed in colorectal and breast cancer. Our model can be used for further investigation and viable predictions of the role of Wnt/β-catenin signaling in oncogenesis and development.
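A Petri net model of the kind described above reduces to token markings and transition firings: a transition is enabled when all its input places hold enough tokens, and firing consumes and produces tokens. The toy fragment below (plain Python; the place and transition names are illustrative stand-ins, not the paper's actual net) shows just the firing mechanics:

```python
# Minimal discrete Petri net. A transition is a pair (inputs, outputs), each a
# dict mapping place name -> token count. Firing an enabled transition consumes
# the input tokens and produces the output tokens.
def fire(marking, transition):
    inputs, outputs = transition
    if all(marking.get(p, 0) >= n for p, n in inputs.items()):
        m = dict(marking)
        for p, n in inputs.items():
            m[p] -= n
        for p, n in outputs.items():
            m[p] = m.get(p, 0) + n
        return m
    return marking  # transition not enabled: marking unchanged

# Toy fragment in the spirit of the model: a WNT stimulus inhibits the
# destruction complex, which lets beta-catenin accumulate.
t_inhibit = ({"WNT": 1, "destruction_complex": 1},
             {"WNT": 1, "inhibited_dc": 1})
t_stabilize = ({"b_catenin_pool": 1},
               {"b_catenin_pool": 1, "b_catenin": 1})
```

Simulating active (WNT stimulation) versus hyperactive (GSK3 inhibition) signaling then amounts to choosing which transitions are present or disabled in the net and repeatedly firing enabled transitions from an initial marking.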

  14. Construction and Experimental Validation of a Petri Net Model of Wnt/β-Catenin Signaling

    PubMed Central

    Heijmans, Nika; Verkaar, Folkert; Smit, Martine J.; Heringa, Jaap

    2016-01-01

    The Wnt/β-catenin signaling pathway is important for multiple developmental processes and tissue maintenance in adults. Consequently, deregulated signaling is involved in a range of human diseases including cancer and developmental defects. A better understanding of the intricate regulatory mechanism and effect of physiological (active) and pathophysiological (hyperactive) WNT signaling is important for predicting treatment response and developing novel therapies. The constitutively expressed CTNNB1 (commonly and hereafter referred to as β-catenin) is degraded by a destruction complex, composed of amongst others AXIN1 and GSK3. The destruction complex is inhibited during active WNT signaling, leading to β-catenin stabilization and induction of β-catenin/TCF target genes. In this study we investigated the mechanism and effect of β-catenin stabilization during active and hyperactive WNT signaling in a combined in silico and in vitro approach. We constructed a Petri net model of Wnt/β-catenin signaling including main players from the plasma membrane (WNT ligands and receptors), cytoplasmic effectors and the downstream negative feedback target gene AXIN2. We validated that our model can be used to simulate both active (WNT stimulation) and hyperactive (GSK3 inhibition) signaling by comparing our simulation and experimental data. We used this experimentally validated model to get further insights into the effect of the negative feedback regulator AXIN2 upon WNT stimulation and observed an attenuated β-catenin stabilization. We furthermore simulated the effect of APC inactivating mutations, yielding a stabilization of β-catenin levels comparable to the Wnt-pathway activities observed in colorectal and breast cancer. Our model can be used for further investigation and viable predictions of the role of Wnt/β-catenin signaling in oncogenesis and development. PMID:27218469

  15. Experimental validation of photon-heating calculation for the Jules Horowitz Reactor

    NASA Astrophysics Data System (ADS)

    Lemaire, M.; Vaglio-Gaudard, C.; Lyoussi, A.; Reynard-Carette, C.; Di Salvo, J.; Gruel, A.

    2015-04-01

    The Jules Horowitz Reactor (JHR) is the next Material-Testing Reactor (MTR) under construction at CEA Cadarache. High values of photon heating (up to 20 W/g) are expected in this MTR. As temperature is a key parameter for material behavior, the accuracy of photon-heating calculations in the different JHR structures is an important issue with regard to JHR safety and performance. In order to experimentally validate the calculation of photon heating in the JHR, an integral experiment called AMMON was carried out in the critical mock-up EOLE at CEA Cadarache to help ascertain the calculation bias and its associated uncertainty. Nuclear heating was measured in different JHR-representative AMMON core configurations using ThermoLuminescent Detectors (TLDs) and Optically Stimulated Luminescent Detectors (OSLDs). This article presents the interpretation methodology and the calculation-to-experiment (C/E) ratios for all the TLD and OSLD measurements conducted in AMMON. It then deals with the representativeness of the AMMON experiment with regard to the JHR and establishes the calculation biases (and their associated uncertainties) applicable to photon-heating calculations for the JHR.

  16. Optimization of critical quality attributes in continuous twin-screw wet granulation via design space validated with pilot scale experimental data.

    PubMed

    Liu, Huolong; Galbraith, S C; Ricart, Brendon; Stanton, Courtney; Smith-Goettler, Brandye; Verdi, Luke; O'Connor, Thomas; Lee, Sau; Yoon, Seongkyu

    2017-06-15

    In this study, the influence of key process variables (screw speed, throughput and liquid-to-solid (L/S) ratio) of a continuous twin screw wet granulation (TSWG) process was investigated using a central composite face-centered (CCF) experimental design method. Regression models were developed to predict the process responses (motor torque, granule residence time), granule properties (size distribution, volume average diameter, yield, relative width, flowability) and tablet properties (tensile strength). The effects of the three key process variables were analyzed via contour and interaction plots. The experimental results demonstrated that all the process responses, granule properties and tablet properties are influenced by changing the screw speed, throughput and L/S ratio. The TSWG process was optimized to produce granules with a specific volume average diameter of 150 μm and a yield of 95% based on the developed regression models. A design space (DS) was built based on a volume average granule diameter between 90 and 200 μm and a granule yield larger than 75%, with a failure probability analysis using Monte Carlo simulations. Validation experiments successfully validated the robustness and accuracy of the DS generated using the CCF experimental design in optimizing a continuous TSWG process. Copyright © 2017 Elsevier B.V. All rights reserved.
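The Monte Carlo failure-probability analysis behind such a design space can be sketched as follows. The quality thresholds (diameter 90-200 μm, yield ≥ 75%) come from the abstract, but the function and parameter names are illustrative stand-ins for the paper's fitted regression models:

```python
import numpy as np

def failure_probability(predict, nominal, noise_sd, n=10000, seed=0):
    """Monte Carlo estimate of the probability that a candidate operating
    point fails the quality criteria (granule diameter outside 90-200 um or
    yield below 75%). `predict` maps a process-variable vector, e.g.
    (screw_speed, throughput, ls_ratio), to (diameter_um, yield_pct);
    Gaussian perturbations of the nominal point mimic process/model
    uncertainty. All names are hypothetical, not the paper's."""
    rng = np.random.default_rng(seed)
    x = np.asarray(nominal, dtype=float) + rng.normal(
        0.0, noise_sd, size=(n, len(nominal)))
    d, y = np.array([predict(xi) for xi in x]).T
    failed = (d < 90.0) | (d > 200.0) | (y < 75.0)
    return failed.mean()
```

The design space is then the region of operating points whose estimated failure probability stays below a chosen risk level; sweeping this function over a grid of candidate points reproduces that construction in miniature.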

  17. Experimental Equipment Validation for Methane (CH4) and Carbon Dioxide (CO2) Hydrates

    NASA Astrophysics Data System (ADS)

    Saad Khan, Muhammad; Yaqub, Sana; Manner, Naathiya; Ani Karthwathi, Nur; Qasim, Ali; Mellon, Nurhayati Binti; Lal, Bhajan

    2018-04-01

    Clathrate hydrates are well-known structures regarded as a threat to the oil and gas industry in light of their propensity to plug subsea pipelines. For natural gas transmission and processing, gas hydrate formation is one of the main flow assurance problems and has led researchers to conduct fresh and meticulous studies on various aspects of gas hydrates. This paper presents a thermodynamic analysis of pure CH4 and CO2 gas hydrates in custom-fabricated equipment (a sapphire-cell hydrate reactor) for experimental validation. CO2 gas hydrate formed at a lower pressure (41 bar) than CH4 gas hydrate (70 bar), and a comparison of thermodynamic properties between CH4 and CO2 is also presented in this study. This preliminary study could provide pathways for the quest for potent hydrate inhibitors.

  18. Elasto-dynamic analysis of a gear pump-Part III: Experimental validation procedure and model extension to helical gears

    NASA Astrophysics Data System (ADS)

    Mucchi, E.; Dalpiaz, G.

    2015-01-01

    This work concerns external gear pumps for automotive applications, which operate at high speed and low pressure. In previous works of the authors (Part I and II, [1,2]), a non-linear lumped-parameter kineto-elastodynamic model for the prediction of the dynamic behaviour of external gear pumps was presented. It takes into account the most important phenomena involved in the operation of this kind of machine. The two main sources of noise and vibration are considered: pressure pulsation and gear meshing. The model has been used in order to foresee the influence of working conditions and design modifications on vibration generation. The model's experimental validation is a difficult task. Thus, Part III proposes a novel methodology for the validation carried out by the comparison of simulations and experimental results concerning forces and moments: it deals with the external and inertial components acting on the gears, estimated by the model, and the reactions and inertial components on the pump casing and the test plate, obtained by measurements. The validation is carried out comparing the level of the time synchronous average in the time domain and the waterfall maps in the frequency domain, with particular attention to identify system resonances. The validation results are satisfactory globally, but discrepancies are still present. Moreover, the assessed model has been properly modified for the application to a new virtual pump prototype with helical gears in order to foresee gear accelerations and dynamic forces. Part IV is focused on improvements in the modelling and analysis of the phenomena bound to the pressure evolution around the gears in order to achieve results closer to the measured values. As a matter of fact, the simulation results have shown that a variable meshing stiffness has a notable contribution on the dynamic behaviour of the pump but this is not as important as the pressure phenomena. As a consequence, the original model was modified with the

  19. Experimental validation of the TOPAS Monte Carlo system for passive scattering proton therapy

    PubMed Central

    Testa, M.; Schümann, J.; Lu, H.-M.; Shin, J.; Faddegon, B.; Perl, J.; Paganetti, H.

    2013-01-01

    Purpose: TOPAS (TOol for PArticle Simulation) is a particle simulation code recently developed with the specific aim of making Monte Carlo simulations user-friendly for research and clinical physicists in the particle therapy community. The authors present a thorough and extensive experimental validation of Monte Carlo simulations performed with TOPAS in a variety of setups relevant for proton therapy applications. The set of validation measurements performed in this work represents an overall end-to-end testing strategy recommended for all clinical centers planning to rely on TOPAS for quality assurance or patient dose calculation and, more generally, for all the institutions using passive-scattering proton therapy systems. Methods: The authors systematically compared TOPAS simulations with measurements that are performed routinely within the quality assurance (QA) program in our institution as well as experiments specifically designed for this validation study. First, the authors compared TOPAS simulations with measurements of depth-dose curves for spread-out Bragg peak (SOBP) fields. Second, absolute dosimetry simulations were benchmarked against measured machine output factors (OFs). Third, the authors simulated and measured 2D dose profiles and analyzed the differences in terms of field flatness and symmetry and usable field size. Fourth, the authors designed a simple experiment using a half-beam shifter to assess the effects of multiple Coulomb scattering, beam divergence, and inverse square attenuation on lateral and longitudinal dose profiles measured and simulated in a water phantom. Fifth, TOPAS’ capability to simulate time-dependent beam delivery was benchmarked against dose rate functions (i.e., dose per unit time vs time) measured at different depths inside an SOBP field. Sixth, simulations of the charge deposited by protons fully stopping in two different types of multilayer Faraday cups (MLFCs) were compared with measurements to benchmark the

  20. Validation of a buffet meal design in an experimental restaurant.

    PubMed

    Allirot, Xavier; Saulais, Laure; Disse, Emmanuel; Roth, Hubert; Cazal, Camille; Laville, Martine

    2012-06-01

    We assessed the reproducibility of intakes and meal mechanics parameters (cumulative energy intake (CEI), number of bites, bite rate, mean energy content per bite) during a buffet meal designed in a natural setting, and their sensitivity to food deprivation. Fourteen men were invited to three lunch sessions in an experimental restaurant. Subjects ate their regular breakfast before sessions A and B. They skipped breakfast before session FAST. The same ad libitum buffet was offered each time. Energy intakes and meal mechanics were assessed by weighing foods and by video recording. Intrasubject reproducibility was evaluated by determining intraclass correlation coefficients (ICC). Mixed models were used to assess the effects of the sessions on CEI. We found good reproducibility between A and B for total energy (ICC=0.82), carbohydrate (ICC=0.83), lipid (ICC=0.81) and protein intake (ICC=0.79) and for meal mechanics parameters. Total energy, lipid and carbohydrate intakes were higher in FAST than in A and B. CEI was found to be sensitive to differences in hunger level, while the other meal mechanics parameters were stable between sessions. In conclusion, a buffet meal in a normal eating environment is a valid tool for assessing the effects of interventions on intakes. Copyright © 2012 Elsevier Ltd. All rights reserved.

  1. An Experimentally Validated Numerical Modeling Technique for Perforated Plate Heat Exchangers

    PubMed Central

    Nellis, G. F.; Kelin, S. A.; Zhu, W.; Gianchandani, Y.

    2010-01-01

    Cryogenic and high-temperature systems often require compact heat exchangers with a high resistance to axial conduction in order to control the heat transfer induced by axial temperature differences. One attractive design for such applications is a perforated plate heat exchanger that utilizes high conductivity perforated plates to provide the stream-to-stream heat transfer and low conductivity spacers to prevent axial conduction between the perforated plates. This paper presents a numerical model of a perforated plate heat exchanger that accounts for axial conduction, external parasitic heat loads, variable fluid and material properties, and conduction to and from the ends of the heat exchanger. The numerical model is validated by experimentally testing several perforated plate heat exchangers that are fabricated using microelectromechanical systems based manufacturing methods. This type of heat exchanger was investigated for potential use in a cryosurgical probe. One of these heat exchangers included perforated plates with integrated platinum resistance thermometers. These plates provided in situ measurements of the internal temperature distribution in addition to the temperature, pressure, and flow rate measured at the inlet and exit ports of the device. The platinum wires were deposited between the fluid passages on the perforated plate and are used to measure the temperature at the interface between the wall material and the flowing fluid. The experimental testing demonstrates the ability of the numerical model to accurately predict both the overall performance and the internal temperature distribution of perforated plate heat exchangers over a range of geometry and operating conditions. The parameters that were varied include the axial length, temperature range, mass flow rate, and working fluid. PMID:20976021

  2. An Experimentally Validated Numerical Modeling Technique for Perforated Plate Heat Exchangers.

    PubMed

    White, M J; Nellis, G F; Kelin, S A; Zhu, W; Gianchandani, Y

    2010-11-01

    Cryogenic and high-temperature systems often require compact heat exchangers with a high resistance to axial conduction in order to control the heat transfer induced by axial temperature differences. One attractive design for such applications is a perforated plate heat exchanger that utilizes high conductivity perforated plates to provide the stream-to-stream heat transfer and low conductivity spacers to prevent axial conduction between the perforated plates. This paper presents a numerical model of a perforated plate heat exchanger that accounts for axial conduction, external parasitic heat loads, variable fluid and material properties, and conduction to and from the ends of the heat exchanger. The numerical model is validated by experimentally testing several perforated plate heat exchangers that are fabricated using microelectromechanical systems based manufacturing methods. This type of heat exchanger was investigated for potential use in a cryosurgical probe. One of these heat exchangers included perforated plates with integrated platinum resistance thermometers. These plates provided in situ measurements of the internal temperature distribution in addition to the temperature, pressure, and flow rate measured at the inlet and exit ports of the device. The platinum wires were deposited between the fluid passages on the perforated plate and are used to measure the temperature at the interface between the wall material and the flowing fluid. The experimental testing demonstrates the ability of the numerical model to accurately predict both the overall performance and the internal temperature distribution of perforated plate heat exchangers over a range of geometry and operating conditions. The parameters that were varied include the axial length, temperature range, mass flow rate, and working fluid.

  3. An experimentally validated model for geometrically nonlinear plucking-based frequency up-conversion in energy harvesting

    NASA Astrophysics Data System (ADS)

    Kathpalia, B.; Tan, D.; Stern, I.; Erturk, A.

    2018-01-01

    It is well known that plucking-based frequency up-conversion can enhance the power output in piezoelectric energy harvesting by enabling cyclic free vibration at the fundamental bending mode of the harvester even for very low excitation frequencies. In this work, we present a geometrically nonlinear plucking-based framework for frequency up-conversion in piezoelectric energy harvesting under quasistatic excitations associated with low-frequency stimuli such as walking and similar rigid body motions. Axial shortening of the plectrum is essential to enable plucking excitation, which requires a nonlinear framework relating the plectrum parameters (e.g. overlap length between the plectrum and harvester) to the overall electrical power output. Von Kármán-type geometrically nonlinear deformation of the flexible plectrum cantilever is employed to relate the overlap length between the flexible (nonlinear) plectrum and the stiff (linear) harvester to the transverse quasistatic tip displacement of the plectrum, and thereby the tip load on the linear harvester in each plucking cycle. By combining the nonlinear plectrum mechanics and linear harvester dynamics with two-way electromechanical coupling, the electrical power output is obtained directly in terms of the overlap length. Experimental case studies and validations are presented for various overlap lengths and a set of electrical load resistance values. Further analysis results are reported regarding the combined effects of plectrum thickness and overlap length on the plucking force and harvested power output. The experimentally validated nonlinear plectrum-linear harvester framework proposed herein can be employed to design and optimize frequency up-conversion by properly choosing the plectrum parameters (geometry, material, overlap length, etc) as well as the harvester parameters.

  4. Meta-Analysis and Experimental Validation Identified FREM2 and SPRY1 as New Glioblastoma Marker Candidates.

    PubMed

    Vidak, Marko; Jovcevska, Ivana; Samec, Neja; Zottel, Alja; Liovic, Mirjana; Rozman, Damjana; Dzeroski, Saso; Juvan, Peter; Komel, Radovan

    2018-05-04

    Glioblastoma (GB) is the most aggressive brain malignancy. Although some potential glioblastoma biomarkers have already been identified, there is a lack of cell membrane-bound biomarkers capable of distinguishing brain tissue from glioblastoma and/or glioblastoma stem cells (GSC), which are responsible for rapid post-operative tumor recurrence. In order to find new GB/GSC marker candidates among cell surface proteins (CSP), we performed a meta-analysis of genome-scale mRNA expression data from three data repositories (GEO, ArrayExpress and GLIOMASdb). The search yielded ten appropriate datasets; three (GSE4290/GDS1962, GSE23806/GDS3885, and GLIOMASdb) were used for selection of new GB/GSC marker candidates, while the other seven (GSE4412/GDS1975, GSE4412/GDS1976, E-GEOD-52009, E-GEOD-68848, E-GEOD-16011, E-GEOD-4536, and E-GEOD-74571) were used for bioinformatic validation. The selection identified four new CSP-encoding candidate genes (CD276, FREM2, SPRY1, and SLC47A1), and the bioinformatic validation confirmed these findings. A review of the literature revealed that CD276 is not a novel candidate, while SLC47A1 had lower validation test scores than the other new candidates and was therefore not considered for experimental validation. The experimental validation revealed that the expression of FREM2, but not SPRY1, is higher in glioblastoma cell lines than in non-malignant astrocytes. In addition, FREM2 gene and protein expression levels are higher in GB stem-like cell lines than in conventional glioblastoma cell lines. FREM2 is thus proposed as a novel GB biomarker and a putative biomarker of glioblastoma stem cells. Both FREM2 and SPRY1 are expressed on the surface of GB cells, while SPRY1 alone was found overexpressed in the cytosol of non-malignant astrocytes.

  5. Image-based multi-scale simulation and experimental validation of thermal conductivity of lanthanum zirconate

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Guo, Xingye; Hu, Bin; Wei, Changdong

    Lanthanum zirconate (La2Zr2O7) is a promising candidate material for thermal barrier coating (TBC) applications due to its low thermal conductivity and high-temperature phase stability. In this work, a novel image-based multi-scale simulation framework combining molecular dynamics (MD) and finite element (FE) calculations is proposed to study the thermal conductivity of La2Zr2O7 coatings. Since no experimental data exist for single-crystal La2Zr2O7 thermal conductivity, a reverse non-equilibrium molecular dynamics (reverse NEMD) approach is first employed to compute the temperature-dependent thermal conductivity of single-crystal La2Zr2O7. The single-crystal data are then passed to a FE model which takes into account realistic thermal barrier coating microstructures. The predicted thermal conductivities from the FE model are in good agreement with experimental validations using both the laser flash technique and pulsed thermal imaging-multilayer analysis. The framework proposed in this work provides a powerful tool for the future design of advanced coating systems.
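
    In a reverse NEMD calculation, the heat flux J is imposed and the resulting temperature gradient is measured, so the conductivity follows from Fourier's law, k = -J/(dT/dx). A minimal sketch of that final extraction step (the values are illustrative, not La2Zr2O7 data):

```python
def conductivity(heat_flux, positions, temperatures):
    """Fourier-law extraction k = -J / (dT/dx), with dT/dx taken from a
    least-squares line through the measured temperature profile."""
    n = len(positions)
    mean_x = sum(positions) / n
    mean_t = sum(temperatures) / n
    slope = (sum((x - mean_x) * (t - mean_t)
                 for x, t in zip(positions, temperatures))
             / sum((x - mean_x) ** 2 for x in positions))
    return -heat_flux / slope

# Illustrative linear profile: T drops 2 K per nm over 10 nm under an
# imposed flux J = 2e9 W/m^2, so dT/dx = -2e9 K/m and k = 1 W/(m K).
x = [i * 1e-9 for i in range(11)]         # position, m
T = [320.0 - 2.0 * i for i in range(11)]  # temperature, K
print(round(conductivity(2e9, x, T), 6))  # 1.0
```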

  6. On-line experimental validation of a model-based diagnostic algorithm dedicated to a solid oxide fuel cell system

    NASA Astrophysics Data System (ADS)

    Polverino, Pierpaolo; Esposito, Angelo; Pianese, Cesare; Ludwig, Bastian; Iwanschitz, Boris; Mai, Andreas

    2016-02-01

    In the current energy scenario, Solid Oxide Fuel Cells (SOFCs) exhibit appealing features which make them suitable for environmentally friendly power production, especially in stationary applications. An example is represented by micro-combined heat and power (μ-CHP) generation units based on SOFC stacks, which are able to produce electric and thermal power with high efficiency and low pollutant and greenhouse gas emissions. However, the main barriers to their diffusion into the mass market are high maintenance and production costs and short lifetime. To improve these aspects, current research focuses on the development of robust and generalizable diagnostic techniques aimed at detecting and isolating faults within the entire system (i.e. SOFC stack and balance of plant). Coupled with appropriate recovery strategies, diagnosis can prevent undesired system shutdowns during faulty conditions, with a consequent increase in lifetime and reduction in maintenance costs. This paper deals with the on-line experimental validation of a model-based diagnostic algorithm applied to a pre-commercial SOFC system. The proposed algorithm exploits a Fault Signature Matrix based on a Fault Tree Analysis and improved through fault simulations. The algorithm is characterized for the considered system and validated by means of experimental induction of faulty states under controlled conditions.

  7. Numerical model validation using experimental data: Application of the area metric on a Francis runner

    NASA Astrophysics Data System (ADS)

    Chatenet, Q.; Tahan, A.; Gagnon, M.; Chamberland-Lauzon, J.

    2016-11-01

    Nowadays, engineers are able to solve complex equations thanks to the increase in computing capacity, and finite element software is widely used, especially in the field of mechanics, to predict part behavior such as strain, stress and natural frequency. However, it can be difficult to determine whether a model is right or wrong, or whether one model is better than another. Nevertheless, during the design phase, it is very important to estimate how hydroelectric turbine blades will behave under the stress to which they are subjected. Indeed, the static and dynamic stress levels influence a blade's fatigue resistance and thus its lifetime, which is a significant feature. In industry, engineers generally use graphic representation, hypothesis tests such as the Student test, or linear regressions to compare experimental data with estimates from the numerical model. Due to the variability in personal interpretation (reproducibility), graphical validation is not considered objective. For an objective assessment, it is essential to use a robust validation metric to measure the conformity of predictions against data. We propose to use the area metric in the case of a turbine blade, which meets the key points of the ASME Standards and produces a quantitative measure of agreement between simulations and empirical data. This validation metric excludes any belief or criterion in accepting a model, which increases robustness. The present work applies a validation method according to ASME V&V 10 recommendations. Firstly, the area metric is applied to the case of a real Francis runner whose geometry and boundary conditions are complex. Secondly, the area metric is compared to classical regression methods to evaluate the performance of the method. Finally, we discuss the use of the area metric as a tool to correct simulations.
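
    The area metric referenced above measures the area between the empirical cumulative distribution functions of the simulated and measured quantities: identical distributions give zero, and a pure location shift of d gives an area of d. A minimal sketch (illustrative, not the paper's implementation):

```python
from bisect import bisect_right

def area_metric(sim, exp):
    """Area between the empirical CDFs of two samples: the integral of
    |F_sim(q) - F_exp(q)| over q (the validation 'area metric')."""
    sim, exp = sorted(sim), sorted(exp)
    grid = sorted(sim + exp)  # pooled support; both CDFs are steps on it
    area = 0.0
    for left, right in zip(grid, grid[1:]):
        f_sim = bisect_right(sim, left) / len(sim)  # F_sim on [left, right)
        f_exp = bisect_right(exp, left) / len(exp)
        area += abs(f_sim - f_exp) * (right - left)
    return area

# Identical samples disagree by 0; shifting one sample by 0.5 gives 0.5.
print(area_metric([1, 2, 3, 4], [1, 2, 3, 4]))          # 0.0
print(area_metric([1, 2, 3, 4], [1.5, 2.5, 3.5, 4.5]))  # 0.5
```

    Unlike a correlation coefficient, the metric retains the physical units of the compared quantity, which is why it lends itself to quantitative accept/reject criteria.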

  8. CPV cells cooling system based on submerged jet impingement: CFD modeling and experimental validation

    NASA Astrophysics Data System (ADS)

    Montorfano, Davide; Gaetano, Antonio; Barbato, Maurizio C.; Ambrosetti, Gianluca; Pedretti, Andrea

    2014-09-01

    Concentrating photovoltaic (CPV) cells offer higher efficiencies than conventional PV cells and allow a strong reduction in the overall solar cell area. However, to operate correctly and exploit their advantages, their temperature has to be kept low and as uniform as possible, and the cooling circuit pressure drops need to be limited. In this work an impingement water jet cooling system specifically designed for an industrial HCPV receiver is studied. Based on the literature and by means of accurate computational fluid dynamics (CFD) simulations, the nozzle-to-plate distance, the number of jets and the nozzle pitch, i.e. the distance between adjacent jets, were optimized. Afterwards, extensive experimental tests were performed to validate the pressure drop and cooling power simulation results.

  9. Relationship disruption stress in human infants: a validation study with experimental and control groups.

    PubMed

    Haley, David W

    2011-09-01

    The current study examined whether the psychological stress of the still-face (SF) task (i.e. stress resulting from a parent's unresponsiveness) is a valid laboratory stress paradigm for evaluating infant cortisol reactivity. Given that factors external to the experimental paradigm, such as arriving at a new place, may cause an elevation in cortisol secretion, we tested the hypothesis that infants would show a cortisol response to the SF task but not to a normal face-to-face (FF) task (control). Saliva was collected for cortisol measurement from 6-month-old infants (n = 31) randomly assigned to either a repeated SF task or a continuous FF task. Parent-infant dyads were videotaped. Salivary cortisol concentration was measured at baseline and at 20 and 30 min after the start of the procedure. Infant salivary cortisol concentrations showed a significant increase over time for the SF task but not for the FF task. The results provide new evidence that the repeated SF task poses a psychological challenge that is due to the SF condition rather than to some non-task-related factor; these results establish internal validity for the paradigm. The study offers new insight into the role of parent-infant interactions in the activation of the infant stress response system.

  10. A Validation Approach for Quasistatic Numerical/Experimental Indentation Analysis in Soft Materials Using 3D Digital Image Correlation.

    PubMed

    Felipe-Sesé, Luis; López-Alba, Elías; Hannemann, Benedikt; Schmeer, Sebastian; Diaz, Francisco A

    2017-06-28

    A quasistatic indentation numerical analysis of a round-section specimen made of soft material has been performed and validated with a full-field experimental technique, i.e., 3D Digital Image Correlation. The contact experiment consisted of loading a 25 mm diameter rubber cylinder up to a 5 mm indentation and then unloading. Experimental strain fields measured at the surface of the specimen during the experiment were compared with those obtained from two numerical analyses employing two different hyperelastic material models. The comparison was performed using a new Image Decomposition methodology that makes possible a direct comparison of full-field data independently of their scale or orientation. Numerical results show a good level of agreement with those measured during the experiments. However, since Image Decomposition allows the differences to be quantified, it was observed that one of the adopted material models shows smaller differences from the experimental results.

  11. A Validation Approach for Quasistatic Numerical/Experimental Indentation Analysis in Soft Materials Using 3D Digital Image Correlation

    PubMed Central

    Felipe-Sesé, Luis; López-Alba, Elías; Hannemann, Benedikt; Schmeer, Sebastian; Diaz, Francisco A.

    2017-01-01

    A quasistatic indentation numerical analysis of a round-section specimen made of soft material has been performed and validated with a full-field experimental technique, i.e., 3D Digital Image Correlation. The contact experiment consisted of loading a 25 mm diameter rubber cylinder up to a 5 mm indentation and then unloading. Experimental strain fields measured at the surface of the specimen during the experiment were compared with those obtained from two numerical analyses employing two different hyperelastic material models. The comparison was performed using a new Image Decomposition methodology that makes possible a direct comparison of full-field data independently of their scale or orientation. Numerical results show a good level of agreement with those measured during the experiments. However, since Image Decomposition allows the differences to be quantified, it was observed that one of the adopted material models shows smaller differences from the experimental results. PMID:28773081

  12. Multiphysics modelling and experimental validation of an air-coupled array of PMUTs with residual stresses

    NASA Astrophysics Data System (ADS)

    Massimino, G.; Colombo, A.; D'Alessandro, L.; Procopio, F.; Ardito, R.; Ferrera, M.; Corigliano, A.

    2018-05-01

    In this paper a complete multiphysics modelling via the finite element method (FEM) of an air-coupled array of piezoelectric micromachined ultrasonic transducers (PMUT) and its experimental validation are presented. Two numerical models, axisymmetric and 3D, are described for the single transducer, with the following features: the presence of fabrication-induced residual stresses, which determine a non-linear initial deformed configuration of the diaphragm and a substantial fundamental-mode frequency shift; and the multiple coupling between different physics, namely electro-mechanical coupling for the piezoelectric model, thermo-acoustic-structural interaction and thermo-acoustic-pressure interaction for wave propagation in the surrounding fluid. The single-transducer model is enhanced by considering the full set of PMUTs belonging to the silicon die in a 4 × 4 array configuration. The results of the numerical multiphysics models are compared with experimental ones in terms of the initial static pre-deflection, the spectrum of the diaphragm central point, and the sound intensity at 3.5 cm along the vertical axis of the diaphragm.

  13. Experimental Validation of a Closed Brayton Cycle System Transient Simulation

    NASA Technical Reports Server (NTRS)

    Johnson, Paul K.; Hervol, David S.

    2006-01-01

    The Brayton Power Conversion Unit (BPCU) located at NASA Glenn Research Center (GRC) in Cleveland, Ohio was used to validate the results of a computational code known as Closed Cycle System Simulation (CCSS). Conversion system thermal transient behavior was the focus of this validation. The BPCU was operated at various steady state points and then subjected to transient changes involving shaft rotational speed and thermal energy input. These conditions were then duplicated in CCSS. Validation of the CCSS BPCU model provides confidence in developing future Brayton power system performance predictions, and helps to guide high power Brayton technology development.

  14. Experimental validation of a model for diffusion-controlled absorption of organic compounds in the trachea

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gerde, P.; Muggenburg, B.A.; Thornton-Manning, J.R.

    1995-12-01

    Most chemically induced lung cancer originates in the epithelial cells of the airways. Common conceptions are that chemicals deposited on the airway surface are rapidly absorbed through mucous membranes, limited primarily by the rate of blood perfusion in the mucosa. It is also commonly thought that for chemicals to induce toxicity at the site of entry, they must be either rapidly reactive, readily metabolizable, or especially toxic to the tissues at the site of entry. For highly lipophilic toxicants, there is a third option. Our mathematical model predicts that as lipophilicity increases, chemicals partition more readily into the cellular lipid membranes and diffuse more slowly through the tissues. Therefore, absorption of very lipophilic compounds will be almost entirely limited by the rate of diffusion through the epithelium rather than by perfusion of the capillary bed in the subepithelium. We have reported on a preliminary model for absorption through mucous membranes of any substance with a lipid/aqueous partition coefficient larger than one. The purpose of this work was to experimentally validate the model in Beagle dogs. This validated model of toxicant absorption in the airway mucosa will improve risk assessment of inhaled toxicants.

  15. Metal-backed versus all-polyethylene unicompartmental knee arthroplasty: Proximal tibial strain in an experimentally validated finite element model.

    PubMed

    Scott, C E H; Eaton, M J; Nutton, R W; Wade, F A; Evans, S L; Pankaj, P

    2017-01-01

    Up to 40% of unicompartmental knee arthroplasty (UKA) revisions are performed for unexplained pain, which may be caused by elevated proximal tibial bone strain. This study investigates the effect of tibial component metal backing and polyethylene thickness on bone strain in a cemented fixed-bearing medial UKA using a finite element model (FEM) validated experimentally by digital image correlation (DIC) and acoustic emission (AE). A total of ten composite tibias implanted with all-polyethylene (AP) and metal-backed (MB) tibial components were loaded to 2500 N. Cortical strain was measured using DIC and cancellous microdamage using AE. FEMs were created and validated, and polyethylene thickness was varied from 6 mm to 10 mm. The volume of cancellous bone exposed to < -3000 µε (pathological loading) and < -7000 µε (yield point) minimum principal (compressive) microstrain and to > 3000 µε and > 7000 µε maximum principal (tensile) microstrain was computed. Experimental AE data and the FEM volume of cancellous bone with compressive strain < -3000 µε correlated strongly: R = 0.947, R² = 0.847, percentage error 12.5% (p < 0.001). DIC and FEM data correlated: R = 0.838, R² = 0.702, percentage error 4.5% (p < 0.001). FEM strain patterns included MB lateral edge concentrations and AP concentrations at the keel, peg and region of load application. Cancellous strains were higher in AP implants at all loads: 2.2 times (10 mm) to 3.2 times (6 mm) the volume of cancellous bone compressively strained < -7000 µε. AP tibial components display greater volumes of pathologically overstrained cancellous bone than MB implants of the same geometry. Increasing AP thickness does not overcome these pathological forces and comes at the cost of greater bone resection. Cite this article: C. E. H. Scott, M. J. Eaton, R. W. Nutton, F. A. Wade, S. L. Evans, P. Pankaj. Metal-backed versus all-polyethylene unicompartmental knee arthroplasty: Proximal tibial strain in an experimentally validated finite element model.
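
    The over-strain volumes reported above reduce, computationally, to a thresholded sum over the FE mesh: each element's minimum principal (compressive) microstrain is compared with the -3000 µε (pathological) or -7000 µε (yield) limit and the element volume is accumulated. A toy sketch with hypothetical element data:

```python
def strained_volume(elem_strain, elem_vol, threshold_microstrain):
    """Total volume of elements whose minimum principal (compressive)
    microstrain lies below a negative threshold, e.g. -3000 or -7000 µε."""
    return sum(v for s, v in zip(elem_strain, elem_vol)
               if s < threshold_microstrain)

# Hypothetical per-element results (microstrain, mm^3):
strains = [-8000, -5000, -2500, -100, -7500]
vols    = [2.0,   1.5,   3.0,   4.0,  0.5]
print(strained_volume(strains, vols, -3000))  # 4.0 (pathologically loaded)
print(strained_volume(strains, vols, -7000))  # 2.5 (beyond yield)
```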

  16. Experimental validation of plastic constitutive hardening relationship based upon the direction of the Net Burgers Density Vector

    NASA Astrophysics Data System (ADS)

    Sarac, Abdulhamit; Kysar, Jeffrey W.

    2018-02-01

    We present a new methodology for experimental validation of single crystal plasticity constitutive relationships based upon spatially resolved measurements of the direction of the Net Burgers Density Vector, which we refer to as the β-field. The β-variable contains information about the active slip systems as well as the ratios of the Geometrically Necessary Dislocation (GND) densities on the active slip systems. We demonstrate the methodology by comparing single crystal plasticity finite element simulations of plane strain wedge indentations into face-centered cubic nickel with detailed experimental measurements of the β-field. We employ the classical Peirce-Asaro-Needleman (PAN) hardening model in this study due to the straightforward physical interpretation of its constitutive parameters, which include the latent hardening ratio, initial hardening modulus and saturation stress. The saturation stress and the initial hardening modulus have a relatively large influence on the β-variable compared to the latent hardening ratio. A change in the initial hardening modulus leads to a shift in the boundaries of plastic slip sectors within the plastically deforming region. As the saturation strength varies, both the magnitude of the β-variable and the boundaries of the plastic slip sectors change. We thus demonstrate that the β-variable is sensitive to changes in the constitutive parameters, making it suitable for validation purposes. We identify a set of constitutive parameters that is consistent with the β-field obtained from the experiment.

  17. Experimental validation of damping properties and solar pressure effects on flexible, high area-to-mass ratio debris model

    NASA Astrophysics Data System (ADS)

    Channumsin, Sittiporn; Ceriotti, Matteo; Radice, Gianmarco; Watson, Ian

    2017-09-01

    Multilayer insulation (MLI) is a recently discovered type of debris originating from delamination of aging spacecraft; it is mostly detected near the geosynchronous orbit (GEO). Observation data indicate that these objects are characterised by high reflectivity, high area-to-mass ratio (HAMR), fast rotation, high sensitivity to perturbations (especially solar radiation pressure) and change of area-to-mass ratio (AMR) over time. As a result, traditional models (e.g. cannonball) are unsuitable to represent and predict this debris' orbital evolution. Previous work by the authors modelled the flexible debris by means of multibody dynamics to improve prediction accuracy; the orbital evolution obtained with the flexible model differed significantly from that of the rigid model. This paper presents a methodology to determine the dynamic properties of thin membranes in order to validate the deformation characteristics of the flexible model. The experimental setup is a high-vacuum chamber (10⁻⁴ mbar), which greatly reduces air friction, inside which a thin membrane is hinged at one end but free at the other. A free-motion test is used to determine the damping characteristics and natural frequency of the thin membrane via logarithmic decrement and frequency response. The membrane can swing freely in the chamber and its motion is tracked by a static optical camera; a Kalman filter is implemented in the tracking algorithm to reduce noise and increase the tracking accuracy of the oscillating motion. Then, the effect of solar radiation pressure on the thin membrane is investigated: a high-power spotlight (500-2000 W) is used to illuminate the sample and any displacement of the membrane is measured by means of a high-resolution laser sensor. Analytic methods based on the natural frequency response and Finite Element Analysis (FEA), including multibody simulations of both experimental setups, are used for the validation of the flexible model.
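
    The logarithmic-decrement step mentioned above recovers the damping ratio from the decay of successive same-sign peaks of the free-vibration record: over n cycles, δ = (1/n)·ln(x₀/xₙ) and ζ = δ/√(4π² + δ²). A minimal sketch on synthetic peak data:

```python
import math

def damping_from_peaks(peaks):
    """Damping ratio of a lightly damped free oscillation from successive
    same-sign peak amplitudes via the logarithmic decrement:
    delta = (1/n) * ln(x0 / xn),  zeta = delta / sqrt(4*pi^2 + delta^2)."""
    n = len(peaks) - 1                       # number of full cycles spanned
    delta = math.log(peaks[0] / peaks[-1]) / n
    return delta / math.sqrt(4 * math.pi ** 2 + delta ** 2)

# Synthetic peaks of an exponential decay with a known ratio zeta = 0.05:
zeta = 0.05
per_cycle = math.exp(-2 * math.pi * zeta / math.sqrt(1 - zeta ** 2))
peaks = [per_cycle ** k for k in range(6)]
print(round(damping_from_peaks(peaks), 4))  # 0.05
```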

  18. Experimental validation and model development for thermal transmittances of porous window screens and horizontal louvred blind systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hart, Robert; Goudey, Howdy; Curcija, D. Charlie

    Virtually every home in the US has some form of shades, blinds, drapes, or other window attachment, but few have been designed for energy savings. In order to provide a common basis of comparison for thermal performance, it is important to have validated simulation tools. This study outlines a review and validation of the ISO 15099 centre-of-glass thermal transmittance correlations for naturally ventilated cavities through measurement and detailed simulations. The focus is on the impacts of room-side ventilated cavities, such as those found with solar screens and horizontal louvred blinds. The thermal transmittance of these systems is measured experimentally, simulated using computational fluid dynamics analysis, and simulated utilizing simplified correlations from ISO 15099. Finally, correlation coefficients are proposed for the ISO 15099 algorithm that reduce the mean error between measured and simulated heat flux from 16% to 3.5% for typical solar screens and from 13% to 1% for horizontal blinds.

  19. CFD Simulation and Experimental Validation of Fluid Flow and Particle Transport in a Model of Alveolated Airways

    PubMed Central

    Ma, Baoshun; Ruwet, Vincent; Corieri, Patricia; Theunissen, Raf; Riethmuller, Michel; Darquenne, Chantal

    2009-01-01

    Accurate modeling of air flow and aerosol transport in the alveolated airways is essential for quantitative predictions of pulmonary aerosol deposition. However, experimental validation of such modeling studies has been scarce. The objective of this study is to validate CFD predictions of flow field and particle trajectory with experiments within a scaled-up model of alveolated airways. Steady flow (Re = 0.13) of silicone oil was captured by particle image velocimetry (PIV), and the trajectories of 0.5 mm and 1.2 mm spherical iron beads (representing 0.7 to 14.6 μm aerosol in vivo) were obtained by particle tracking velocimetry (PTV). At twelve selected cross sections, the velocity profiles obtained by CFD matched well with those by PIV (within 1.7% on average). The CFD predicted trajectories also matched well with PTV experiments. These results showed that air flow and aerosol transport in models of human alveolated airways can be simulated by CFD techniques with reasonable accuracy. PMID:20161301

  20. CFD Simulation and Experimental Validation of Fluid Flow and Particle Transport in a Model of Alveolated Airways.

    PubMed

    Ma, Baoshun; Ruwet, Vincent; Corieri, Patricia; Theunissen, Raf; Riethmuller, Michel; Darquenne, Chantal

    2009-05-01

    Accurate modeling of air flow and aerosol transport in the alveolated airways is essential for quantitative predictions of pulmonary aerosol deposition. However, experimental validation of such modeling studies has been scarce. The objective of this study is to validate CFD predictions of flow field and particle trajectory with experiments within a scaled-up model of alveolated airways. Steady flow (Re = 0.13) of silicone oil was captured by particle image velocimetry (PIV), and the trajectories of 0.5 mm and 1.2 mm spherical iron beads (representing 0.7 to 14.6 μm aerosol in vivo) were obtained by particle tracking velocimetry (PTV). At twelve selected cross sections, the velocity profiles obtained by CFD matched well with those by PIV (within 1.7% on average). The CFD predicted trajectories also matched well with PTV experiments. These results showed that air flow and aerosol transport in models of human alveolated airways can be simulated by CFD techniques with reasonable accuracy.

  1. Experimental validation and model development for thermal transmittances of porous window screens and horizontal louvred blind systems

    DOE PAGES

    Hart, Robert; Goudey, Howdy; Curcija, D. Charlie

    2017-05-16

    Virtually every home in the US has some form of shades, blinds, drapes, or other window attachment, but few have been designed for energy savings. In order to provide a common basis of comparison for thermal performance, it is important to have validated simulation tools. This study outlines a review and validation of the ISO 15099 centre-of-glass thermal transmittance correlations for naturally ventilated cavities through measurement and detailed simulations. The focus is on the impacts of room-side ventilated cavities, such as those found with solar screens and horizontal louvred blinds. The thermal transmittance of these systems is measured experimentally, simulated using computational fluid dynamics analysis, and simulated utilizing simplified correlations from ISO 15099. Finally, correlation coefficients are proposed for the ISO 15099 algorithm that reduce the mean error between measured and simulated heat flux from 16% to 3.5% for typical solar screens and from 13% to 1% for horizontal blinds.
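
    The quoted error figures compare measured and simulated heat flux; assuming a mean relative error as the bookkeeping (the study's exact error measure may differ), the comparison can be sketched as:

```python
def mean_relative_error(measured, simulated):
    """Mean of |simulated - measured| / |measured|, as a percentage."""
    errs = [abs(s - m) / abs(m) for m, s in zip(measured, simulated)]
    return 100.0 * sum(errs) / len(errs)

# Illustrative heat-flux values in W/m^2 (not the study's data):
measured  = [100.0, 120.0, 80.0]
simulated = [104.0, 118.0, 83.0]
print(round(mean_relative_error(measured, simulated), 2))  # 3.14
```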

  2. Experimentally validated computational modeling of organic binder burnout from green ceramic compacts

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ewsuk, K.G.; Cochran, R.J.; Blackwell, B.F.

    The properties and performance of a ceramic component are determined by a combination of the materials from which it was fabricated and how it was processed. Most ceramic components are manufactured by dry pressing a powder/binder system in which the organic binder provides formability and green compact strength. A key step in this manufacturing process is the removal of the binder from the powder compact after pressing. The organic binder is typically removed by a thermal decomposition process in which heating rate, temperature, and time are the key process parameters. Empirical approaches are generally used to design the burnout time-temperature cycle, often resulting in excessive processing times and energy usage, and higher overall manufacturing costs. Ideally, binder burnout should be completed as quickly as possible without damaging the compact, while using a minimum of energy. Process and computational modeling offer one means to achieve this end. The objective of this study is to develop an experimentally validated computer model that can be used to better understand, control, and optimize binder burnout from green ceramic compacts.

  3. MATLAB/Simulink Pulse-Echo Ultrasound System Simulator Based on Experimentally Validated Models.

    PubMed

    Kim, Taehoon; Shin, Sangmin; Lee, Hyongmin; Lee, Hyunsook; Kim, Heewon; Shin, Eunhee; Kim, Suhwan

    2016-02-01

    A flexible clinical ultrasound system must operate with different transducers, which have characteristic impulse responses and widely varying impedances. The impulse response determines the shape of the high-voltage pulse that is transmitted and the specifications of the front-end electronics that receive the echo; the impedance determines the specification of the matching network through which the transducer is connected. System-level optimization of these subsystems requires accurate modeling of pulse-echo (two-way) response, which in turn demands a unified simulation of the ultrasonics and electronics. In this paper, this is realized by combining MATLAB/Simulink models of the high-voltage transmitter, the transmission interface, the acoustic subsystem which includes wave propagation and reflection, the receiving interface, and the front-end receiver. To demonstrate the effectiveness of our simulator, the models are experimentally validated by comparing the simulation results with the measured data from a commercial ultrasound system. This simulator could be used to quickly provide system-level feedback for an optimized tuning of electronic design parameters.
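
    At its simplest, the pulse-echo (two-way) response described above is the drive pulse convolved with the transducer impulse response on transmit, with the medium reflectivity, and with the impulse response again on receive. A heavily simplified sketch (the pulse, impulse response, and reflector values are invented for illustration; the actual simulator also models the matching network and front-end electronics):

```python
def convolve(a, b):
    """Discrete linear convolution of two sequences."""
    out = [0.0] * (len(a) + len(b) - 1)
    for i, x in enumerate(a):
        for j, y in enumerate(b):
            out[i + j] += x * y
    return out

# Toy two-way chain: drive pulse -> transducer (transmit) -> medium
# reflectivity (one reflector, delayed 4 samples) -> transducer (receive).
pulse = [1.0, -1.0]               # bipolar drive pulse (illustrative)
transducer = [0.5, 1.0, 0.5]      # assumed transducer impulse response
reflectivity = [0, 0, 0, 0, 1.0]  # single reflector at sample 4
echo = convolve(convolve(convolve(pulse, transducer), reflectivity),
                transducer)
print(echo)  # [0.0, 0.0, 0.0, 0.0, 0.25, 0.75, 0.5, -0.5, -0.75, -0.25]
```

    The delay of the echo encodes the reflector depth, and the double pass through the transducer response is what makes the two-way pulse longer and smoother than the drive pulse.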

  4. Experimental and modeling uncertainties in the validation of lower hybrid current drive

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Poli, F. M.; Bonoli, P. T.; Chilenski, M.

    Our work discusses sources of uncertainty in the validation of lower hybrid wave current drive simulations against experiments, by evolving self-consistently the magnetic equilibrium and the heating and current drive profiles, calculated with a combined toroidal ray tracing code and 3D Fokker–Planck solver. The simulations indicate a complex interplay of elements, where uncertainties in the input plasma parameters, in the models and in the transport solver combine and compensate each other, at times. It is concluded that ray-tracing calculations should include a realistic representation of the density and temperature in the region between the confined plasma and the wall, which is especially important in regimes where the LH waves are weakly damped and undergo multiple reflections from the plasma boundary. Uncertainties introduced in the processing of diagnostic data as well as uncertainties introduced by model approximations are assessed. We show that, by comparing the evolution of the plasma parameters in self-consistent simulations with available data, inconsistencies can be identified and limitations in the models or in the experimental data assessed.

  5. Experimental Validation of Various Temperature Models for Semi-Physical Tyre Model Approaches

    NASA Astrophysics Data System (ADS)

    Hackl, Andreas; Scherndl, Christoph; Hirschberg, Wolfgang; Lex, Cornelia

    2017-10-01

    With the increasing level of complexity and automation in automotive engineering, the simulation of safety-relevant Advanced Driver Assistance Systems (ADAS) leads to increasing accuracy demands in the description of tyre contact forces. In recent years, with improvements in tyre simulation, the need to cope with tyre temperatures and the resulting changes in tyre characteristics has risen significantly. Therefore, an experimental validation of three different temperature model approaches is carried out, discussed and compared in the scope of this article. To evaluate the range of application of the presented approaches with respect to further implementation in semi-physical tyre models, the main focus lies on a physical parameterisation. Aside from good modelling accuracy, attention is also paid to computational time and the complexity of the parameterisation process. To evaluate this process and discuss the results, measurements of a Hoosier racing tyre 6.0 / 18.0 10 LCO C2000 from an industrial flat test bench are used. Finally, the simulation results are compared with the measurement data.

  6. Experimental and modeling uncertainties in the validation of lower hybrid current drive

    DOE PAGES

    Poli, F. M.; Bonoli, P. T.; Chilenski, M.; ...

    2016-07-28

    Our work discusses sources of uncertainty in the validation of lower hybrid wave current drive simulations against experiments, by evolving self-consistently the magnetic equilibrium and the heating and current drive profiles, calculated with a combined toroidal ray tracing code and 3D Fokker–Planck solver. The simulations indicate a complex interplay of elements, where uncertainties in the input plasma parameters, in the models and in the transport solver combine and compensate each other, at times. It is concluded that ray-tracing calculations should include a realistic representation of the density and temperature in the region between the confined plasma and the wall, which is especially important in regimes where the LH waves are weakly damped and undergo multiple reflections from the plasma boundary. Uncertainties introduced in the processing of diagnostic data as well as uncertainties introduced by model approximations are assessed. We show that, by comparing the evolution of the plasma parameters in self-consistent simulations with available data, inconsistencies can be identified and limitations in the models or in the experimental data assessed.

  7. The Question of Education Science: "Experiment"ism Versus "Experimental"ism

    ERIC Educational Resources Information Center

    Howe, Kenneth R.

    2005-01-01

    The ascendant view in the current debate about education science -- experimentism -- is a reassertion of the randomized experiment as the methodological gold standard. Advocates of this view have ignored, not answered, long-standing criticisms of the randomized experiment: its frequent impracticality, its lack of external validity, its confinement…

  8. Experimental Validation of Pulse Phase Tracking for X-Ray Pulsar Based

    NASA Technical Reports Server (NTRS)

    Anderson, Kevin

    2012-01-01

    Pulsars are a form of variable celestial source that have been shown to be usable as aids for autonomous, deep-space navigation. Sources emitting in the X-ray band are particularly well suited to navigation because they allow smaller detector sizes. In this paper, X-ray photons arriving from a pulsar are modeled as a non-homogeneous Poisson process. The method of pulse phase tracking is then investigated as a technique to measure the radial distance traveled by a spacecraft over an observation interval. A maximum-likelihood phase estimator (MLE) is used for the case where the observed frequency signal is constant. For the varying signal frequency case, an algorithm is used in which the observation window is broken up into smaller blocks over which an MLE is used. The outputs of this phase estimation process were then looped through a digital phase-locked loop (DPLL) in order to reduce the errors and produce estimates of the Doppler frequency. These phase tracking algorithms were tested both in a computer simulation environment and using the NASA Goddard Space Flight Center X-ray Navigation Laboratory Testbed (GXLT). This provided an experimental validation with photons being emitted by a modulated X-ray source and detected by a silicon-drift detector. Models of the Crab pulsar and the pulsar B1821-24 were used in order to generate test scenarios. Three different simulated detector trajectories were tracked by the phase tracking algorithm: a stationary case, one with constant velocity, and one with constant acceleration. All three were performed in one dimension along the line of sight to the pulsar. The first two had a constant signal frequency and the third had a time-varying frequency. All of the constant frequency cases were processed using the MLE, and it was shown that they tracked the initial phase within 0.15% for the simulations and 2.5% in the experiments, based on an average of ten runs.
The MLE-DPLL cascade version of the phase tracking algorithm was used in
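The constant-frequency MLE described above can be sketched as follows: photon arrivals are drawn from a non-homogeneous Poisson process by thinning, and the phase is recovered by maximising the photon log-likelihood over a phase grid. The raised-cosine rate profile and all rate parameters are illustrative assumptions, not the Crab or B1821-24 templates used in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
f, phi_true = 5.0, 1.3            # pulse frequency (Hz) and true phase (rad), assumed
lam_b, m = 100.0, 0.8             # mean photon rate (1/s) and modulation depth, assumed

def rate(t, phi):
    """Assumed raised-cosine pulsar intensity profile, photons/s."""
    return lam_b * (1.0 + m * np.cos(2 * np.pi * f * t - phi))

# Generate photon arrival times on [0, T] by thinning a homogeneous process
T = 10.0                                           # integer number of pulse cycles
lam_max = lam_b * (1.0 + m)
cand = np.cumsum(rng.exponential(1.0 / lam_max, size=int(3 * lam_max * T)))
cand = cand[cand < T]
keep = rng.uniform(0, lam_max, size=cand.size) < rate(cand, phi_true)
arrivals = cand[keep]

# MLE: over an integer number of cycles the integral term of the Poisson
# log-likelihood is phase-independent, so maximising sum(log lam(t_i)) suffices.
phis = np.linspace(0, 2 * np.pi, 400, endpoint=False)
loglik = [np.sum(np.log(rate(arrivals, p))) for p in phis]
phi_hat = phis[int(np.argmax(loglik))]
```

In the varying-frequency case the abstract describes, the same estimator would be applied block-by-block and the per-block phases handed to a DPLL.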

  9. Design and Experimental Validation for Direct-Drive Fault-Tolerant Permanent-Magnet Vernier Machines

    PubMed Central

    Liu, Guohai; Yang, Junqin; Chen, Ming; Chen, Qian

    2014-01-01

    A fault-tolerant permanent-magnet vernier (FT-PMV) machine is designed for direct-drive applications, incorporating the merits of high torque density and high reliability. Based on the so-called magnetic gearing effect, PMV machines have the ability of high torque density by introducing the flux-modulation poles (FMPs). This paper investigates the fault-tolerant characteristic of PMV machines and provides a design method, which is able to not only meet the fault-tolerant requirements but also keep the ability of high torque density. The operation principle of the proposed machine has been analyzed. The design process and optimization are presented specifically, such as the combination of slots and poles, the winding distribution, and the dimensions of PMs and teeth. By using the time-stepping finite element method (TS-FEM), the machine performances are evaluated. Finally, the FT-PMV machine is manufactured, and the experimental results are presented to validate the theoretical analysis. PMID:25045729

  10. Design and experimental validation for direct-drive fault-tolerant permanent-magnet vernier machines.

    PubMed

    Liu, Guohai; Yang, Junqin; Chen, Ming; Chen, Qian

    2014-01-01

    A fault-tolerant permanent-magnet vernier (FT-PMV) machine is designed for direct-drive applications, incorporating the merits of high torque density and high reliability. Based on the so-called magnetic gearing effect, PMV machines have the ability of high torque density by introducing the flux-modulation poles (FMPs). This paper investigates the fault-tolerant characteristic of PMV machines and provides a design method, which is able to not only meet the fault-tolerant requirements but also keep the ability of high torque density. The operation principle of the proposed machine has been analyzed. The design process and optimization are presented specifically, such as the combination of slots and poles, the winding distribution, and the dimensions of PMs and teeth. By using the time-stepping finite element method (TS-FEM), the machine performances are evaluated. Finally, the FT-PMV machine is manufactured, and the experimental results are presented to validate the theoretical analysis.

  11. CFD simulation and experimental validation of a GM type double inlet pulse tube refrigerator

    NASA Astrophysics Data System (ADS)

    Banjare, Y. P.; Sahoo, R. K.; Sarangi, S. K.

    2010-04-01

    The pulse tube refrigerator has the advantages of long life and low vibration over conventional cryocoolers, such as GM and Stirling coolers, because of the absence of moving parts at low temperature. This paper performs a three-dimensional computational fluid dynamic (CFD) simulation of a GM type double inlet pulse tube refrigerator (DIPTR), vertically aligned, operating under a variety of thermal boundary conditions. A commercial CFD software package, Fluent 6.1, is used to model the oscillating flow inside the pulse tube refrigerator. The simulation represents fully coupled systems operating in steady-periodic mode. The externally imposed boundary conditions are a sinusoidal pressure inlet, applied through a user-defined function at one end of the tube, and constant temperature or heat flux boundaries at the external walls of the cold-end heat exchangers. The experimental method to evaluate the optimum parameters of the DIPTR is difficult. On the other hand, developing a computer code for CFD analysis is equally complex. The objectives of the present investigation are to ascertain the suitability of the CFD-based commercial package Fluent for the study of energy and fluid flow in the DIPTR and to validate the CFD simulation results with available experimental data. The general results, such as the cool-down behaviour of the system, the phase relation between mass flow rate and pressure at the cold end, the temperature profile along the wall of the cooler and the refrigeration load, are presented for different boundary conditions of the system. The results confirm that CFD-based Fluent simulations are capable of elucidating complex periodic processes in the DIPTR. The results also show that there is excellent agreement between the CFD simulation results and the experimental results.

  12. Model development and experimental validation of capnophilic lactic fermentation and hydrogen synthesis by Thermotoga neapolitana.

    PubMed

    Pradhan, Nirakar; Dipasquale, Laura; d'Ippolito, Giuliana; Fontana, Angelo; Panico, Antonio; Pirozzi, Francesco; Lens, Piet N L; Esposito, Giovanni

    2016-08-01

    The aim of the present study was to develop a kinetic model for a recently proposed unique and novel metabolic process called capnophilic (CO2-requiring) lactic fermentation (CLF) pathway in Thermotoga neapolitana. The model was based on Monod kinetics and the mathematical expressions were developed to enable the simulation of biomass growth, substrate consumption and product formation. The calibrated kinetic parameters such as maximum specific uptake rate (k), semi-saturation constant (kS), biomass yield coefficient (Y) and endogenous decay rate (kd) were 1.30 h(-1), 1.42 g/L, 0.1195 and 0.0205 h(-1), respectively. A high correlation (>0.98) was obtained between the experimental data and model predictions for both model validation and cross validation processes. An increase of the lactate production in the range of 40-80% was obtained through CLF pathway compared to the classic dark fermentation model. The proposed kinetic model is the first mechanistically based model for the CLF pathway. This model provides useful information to improve the knowledge about how acetate and CO2 are recycled back by Thermotoga neapolitana to produce lactate without compromising the overall hydrogen yield. Copyright © 2016 Elsevier Ltd. All rights reserved.
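The calibrated Monod model can be sketched as a pair of ODEs for biomass X and substrate S, using the parameter values quoted in the abstract. The initial conditions, integration settings, and the omission of explicit product-formation terms are illustrative simplifications of the paper's full model.

```python
import numpy as np

# Parameter values quoted in the abstract: k (1/h), Ks (g/L), Y (-), kd (1/h)
k, Ks, Y, kd = 1.30, 1.42, 0.1195, 0.0205

def simulate(X0=0.05, S0=10.0, t_end=48.0, dt=0.01):
    """Euler integration of Monod growth with endogenous decay:
       dS/dt = -k*S/(Ks+S)*X,   dX/dt = (Y*k*S/(Ks+S) - kd)*X.
    Initial conditions X0, S0 (g/L) are assumed, not from the study."""
    X, S = X0, S0
    Xs = [X]
    for _ in range(int(t_end / dt)):
        q = k * S / (Ks + S)          # specific substrate uptake rate, 1/h
        S = max(S - q * X * dt, 0.0)
        X += (Y * q - kd) * X * dt
        Xs.append(X)
    return np.array(Xs), S

X_hist, S_final = simulate()
```

The characteristic shape, exponential growth while substrate is plentiful, a biomass peak near Y·S0, then slow endogenous decay, is what such a model is calibrated against.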

  13. Experimental Validation of a Branched Solution Model for Magnetosonic Ionization Waves in Plasma Accelerators

    NASA Astrophysics Data System (ADS)

    Underwood, Thomas; Loebner, Keith; Cappelli, Mark

    2015-11-01

    Detailed measurements of the thermodynamic and electrodynamic plasma state variables within the plume of a pulsed plasma accelerator are presented. A quadruple Langmuir probe operating in current-saturation mode is used to obtain time resolved measurements of the plasma density, temperature, potential, and velocity along the central axis of the accelerator. This data is used in conjunction with a fast-framing, intensified CCD camera to develop and validate a model predicting the existence of two distinct types of ionization waves corresponding to the upper and lower solution branches of the Hugoniot curve. A deviation of less than 8% is observed between the quasi-steady, one-dimensional theoretical model and the experimentally measured plume velocity. This work is supported by the U.S. Department of Energy Stewardship Science Academic Program in addition to the National Defense Science Engineering Graduate Fellowship.

  14. Thermodynamic modeling and experimental validation of the Fe-Al-Ni-Cr-Mo alloy system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Teng, Zhenke; Zhang, F; Miller, Michael K

    2012-01-01

    NiAl-type precipitate-strengthened ferritic steels have been known as potential materials for the steam turbine applications. In this study, thermodynamic descriptions of the B2-NiAl type nano-scaled precipitates and body-centered-cubic (BCC) Fe matrix phase for four alloys based on the Fe-Al-Ni-Cr-Mo system were developed as a function of the alloy composition at the aging temperature. The calculated phase structure, composition, and volume fraction were validated by the experimental investigations using synchrotron X-ray diffraction and atom probe tomography. With the ability to accurately predict the key microstructural features related to the mechanical properties in a given alloy system, the established thermodynamic model in the current study may significantly accelerate the alloy design process of the NiAl-strengthened ferritic steels.

  15. Computational model for calculating the dynamical behaviour of generators caused by unbalanced magnetic pull and experimental validation

    NASA Astrophysics Data System (ADS)

    Pennacchi, Paolo

    2008-04-01

    The modelling of the unbalanced magnetic pull (UMP) in generators and the experimental validation of the proposed method are presented in this paper. The UMP is one of the most remarkable effects of electromechanical interaction in rotating machinery. As a consequence of rotor eccentricity, the imbalance of the electromagnetic forces acting between rotor and stator generates a net radial force. This phenomenon can be avoided by means of careful assembly and manufacture in small, stiff machines, like electric motors. On the contrary, the eccentricity of the active part of the rotor with respect to the stator is unavoidable in the big generators of power plants, because they operate above their first critical speed and are supported by oil-film bearings. In the first part of the paper, a method aimed at calculating the UMP force is described. This model is more general than those available in the literature, which are limited to circular orbits. The model is based on the actual position of the rotor inside the stator, and therefore on the actual air-gap distribution, regardless of the orbit type. The closed form of the nonlinear UMP force components is presented. In the second part, the experimental validation of the proposed model is presented. The dynamical behaviour in the time domain of a steam turbo-generator of a power plant is considered, and it is shown that the model is able to reproduce the dynamical effects due to the excitation of the magnetic field in the generator.
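The dependence of the net radial pull on rotor eccentricity can be illustrated with a simple Maxwell-stress integral over the air gap. The one-term flux model B ∝ 1/g(θ) and all dimensions below are illustrative assumptions, far cruder than the closed-form model of the paper.

```python
import numpy as np

MU0 = 4e-7 * np.pi  # vacuum permeability, H/m

def ump_force(ecc, g0=0.05, B0=0.9, R=0.6, L=6.0, n=2000):
    """Net radial force (N) on a rotor displaced by `ecc` (m) along +x.
    Assumed flux model: B(theta) = B0*g0/g(theta), with air gap
    g(theta) = g0 - ecc*cos(theta). Dimensions g0, R, L are illustrative."""
    theta = np.linspace(0, 2 * np.pi, n, endpoint=False)
    g = g0 - ecc * np.cos(theta)
    B = B0 * g0 / g
    sigma = B ** 2 / (2 * MU0)                         # Maxwell stress, N/m^2
    # Integrate the x-component of the stress over the rotor surface
    return float(np.sum(sigma * np.cos(theta)) * (2 * np.pi / n) * R * L)

F_zero = ump_force(0.0)
F_small = ump_force(0.1 * 0.05)   # 10% relative eccentricity
F_large = ump_force(0.3 * 0.05)   # 30% relative eccentricity
```

The pull vanishes for a centred rotor and grows faster than linearly with eccentricity, which is the destabilising behaviour the paper's orbit-dependent model captures for arbitrary whirl orbits.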

  16. Polypropylene Production Optimization in Fluidized Bed Catalytic Reactor (FBCR): Statistical Modeling and Pilot Scale Experimental Validation

    PubMed Central

    Khan, Mohammad Jakir Hossain; Hussain, Mohd Azlan; Mujtaba, Iqbal Mohammed

    2014-01-01

    Polypropylene is a type of plastic that is widely used in our everyday life. This study focuses on the identification and justification of the optimum process parameters for polypropylene production in a novel pilot-plant-based fluidized bed reactor. This first-of-its-kind statistical modeling with experimental validation for the process parameters of polypropylene production was conducted by applying the ANOVA (analysis of variance) method to Response Surface Methodology (RSM). Three important process variables, i.e., reaction temperature, system pressure and hydrogen percentage, were considered as the important input factors for polypropylene production in the analysis performed. In order to examine the effect of the process parameters and their interactions, the ANOVA method was utilized among a range of other statistical diagnostic tools, such as the correlation between actual and predicted values, the residuals and predicted response, the outlier t plot, and 3D response surface and contour analysis plots. The statistical analysis showed that the proposed quadratic model had a good fit with the experimental results. At the optimum conditions, with a temperature of 75°C, a system pressure of 25 bar and a hydrogen percentage of 2%, the highest polypropylene production obtained is 5.82% per pass. Hence it is concluded that the developed experimental design and proposed model can be successfully employed with over a 95% confidence level for optimum polypropylene production in a fluidized bed catalytic reactor (FBCR). PMID:28788576
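A quadratic response-surface model of the kind fitted above has the generic form y = b0 + Σ bi·xi + Σ bii·xi² + Σ bij·xi·xj. The sketch below fits such a model by least squares on a three-level full factorial in coded units; the coefficients are synthetic and noise-free, not the study's regression.

```python
import numpy as np
from itertools import product

def design(X):
    """Full quadratic design matrix for 3 factors:
    [1, x1, x2, x3, x1^2, x2^2, x3^2, x1*x2, x1*x3, x2*x3]."""
    x1, x2, x3 = X.T
    return np.column_stack([np.ones(len(X)), x1, x2, x3,
                            x1 ** 2, x2 ** 2, x3 ** 2,
                            x1 * x2, x1 * x3, x2 * x3])

# 3-level full factorial in coded units (-1, 0, +1): 27 runs
X = np.array(list(product([-1.0, 0.0, 1.0], repeat=3)))

# Synthetic "true" coefficients (illustrative only)
b_true = np.array([5.0, 0.8, -0.4, 0.3, -0.6, -0.2, 0.1, 0.25, -0.15, 0.05])
y = design(X) @ b_true                     # noise-free synthetic response

b_hat, *_ = np.linalg.lstsq(design(X), y, rcond=None)
```

With real data, the ANOVA step then partitions the regression variance term by term to decide which linear, quadratic, and interaction effects are significant.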

  17. Experimental validation of the intrinsic spatial efficiency method over a wide range of sizes for cylindrical sources

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ortiz-Ramírez, Pablo, E-mail: rapeitor@ug.uchile.cl; Larroquette, Philippe; Camilla, S.

    The intrinsic spatial efficiency method is a new absolute method to determine the efficiency of a gamma spectroscopy system for any extended source. In the original work the method was experimentally demonstrated and validated for homogeneous cylindrical sources containing {sup 137}Cs, whose sizes varied over a small range (29.5 mm radius and 15.0 to 25.9 mm height). In this work we present an extension of the validation over a wide range of sizes. The dimensions of the cylindrical sources vary between 10 and 40 mm in height and 8 and 30 mm in radius. The cylindrical sources were prepared using the reference material IAEA-372, which had a specific activity of 11320 Bq/kg as of July 2006. The best results were obtained for the sources with 29 mm radius, showing relative bias less than 5%, and for the sources with 10 mm height, showing relative bias less than 6%. In comparison with the results obtained in the work where the method was first presented, the majority of these results show excellent agreement.

  18. A Methodology for the Derivation of Unloaded Abdominal Aortic Aneurysm Geometry With Experimental Validation

    PubMed Central

    Chandra, Santanu; Gnanaruban, Vimalatharmaiyah; Riveros, Fabian; Rodriguez, Jose F.; Finol, Ender A.

    2016-01-01

    In this work, we present a novel method for the derivation of the unloaded geometry of an abdominal aortic aneurysm (AAA) from a pressurized geometry in turn obtained by 3D reconstruction of computed tomography (CT) images. The approach was experimentally validated with an aneurysm phantom loaded with gauge pressures of 80, 120, and 140 mm Hg. The unloaded phantom geometries estimated from these pressurized states were compared to the actual unloaded phantom geometry, resulting in mean nodal surface distances of up to 3.9% of the maximum aneurysm diameter. An in-silico verification was also performed using a patient-specific AAA mesh, resulting in maximum nodal surface distances of 8 μm after running the algorithm for eight iterations. The methodology was then applied to 12 patient-specific AAA for which their corresponding unloaded geometries were generated in 5–8 iterations. The wall mechanics resulting from finite element analysis of the pressurized (CT image-based) and unloaded geometries were compared to quantify the relative importance of using an unloaded geometry for AAA biomechanics. The pressurized AAA models underestimate peak wall stress (quantified by the first principal stress component) on average by 15% compared to the unloaded AAA models. The validation and application of the method, readily compatible with any finite element solver, underscores the importance of generating the unloaded AAA volume mesh prior to using wall stress as a biomechanical marker for rupture risk assessment. PMID:27538124
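The iterative scheme behind such unloaded-geometry recovery can be illustrated in one dimension: repeatedly correct the unloaded estimate by the mismatch between the forward (loaded) prediction and the image-derived geometry. The forward model below is an arbitrary smooth stand-in, not the paper's finite-element solver, and the convergence tolerance is illustrative.

```python
import math

def deform(x):
    """Stand-in forward model: loaded position of an unloaded material point x.
    Any smooth map with a small displacement gradient behaves similarly."""
    return x + 0.1 * math.sin(x) + 0.02 * x

x_true = 2.0
x_image = deform(x_true)        # plays the role of the CT-derived pressurized geometry

# Fixed-point iteration: x <- x - (deform(x) - x_image), started from the image
x = x_image
iters = 0
while abs(deform(x) - x_image) > 1e-10 and iters < 50:
    x -= deform(x) - x_image
    iters += 1
```

Applied node-by-node to a mesh with a nonlinear FE solve as `deform`, this is the same "correct by the forward-model mismatch" idea that the abstract reports converging in 5 to 8 iterations.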

  19. Experimental validation of a theoretical model of dual wavelength photoacoustic (PA) excitation in fluorophores

    NASA Astrophysics Data System (ADS)

    Märk, Julia; Theiss, Christoph; Schmitt, Franz-Josef; Laufer, Jan

    2015-03-01

    Fluorophores, such as exogenous dyes and genetically expressed proteins, exhibit radiative relaxation with long excited state lifetimes. This can be exploited for PA detection based on dual wavelength excitation using pump and probe wavelengths that coincide with the absorption and emission spectra, respectively. While the pump pulse raises the fluorophore to a long-lived excited state, simultaneous illumination with the probe pulse reduces the excited state lifetime due to stimulated emission (SE). This leads to a change in thermalized energy, and hence PA signal amplitude, compared to single wavelength illumination. By introducing a time delay between pump and probe pulses, the change in PA amplitude can be modulated. Since the effect is not observed in endogenous chromophores, it provides a contrast mechanism for the detection of fluorophores via PA difference imaging. In this study, a theoretical model of the PA signal generation in fluorophores was developed and experimentally validated. The model is based on a system of coupled rate equations, which describe the spatial and temporal changes in the population of the molecular energy levels of a fluorophore as a function of pump-probe energy and concentration. This allows the prediction of the thermalized energy distribution, and hence the time-resolved PA signal amplitude. The model was validated by comparing its predictions to PA signals measured in solutions of rhodamine 6G, a well-known laser dye, and Atto680, a NIR fluorophore.
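The core mechanism can be sketched with a single excited-state population: stimulated emission adds a radiative drain that shortens the lifetime and so reduces the fraction of absorbed energy thermalized as heat, which is what changes the PA amplitude. The rate constants below are illustrative, not the paper's fitted values, and the full model couples many more levels.

```python
def heat_fraction(k_r=0.25e9, k_nr=0.75e9, k_se=0.0):
    """Excited state decays as dN/dt = -(k_r + k_nr + k_se)*N with N(0) = 1.
    Heat deposited = integral of k_nr*N dt = k_nr / (k_r + k_nr + k_se).
    Radiative (k_r) and stimulated-emission (k_se) channels leave as light.
    Rates in 1/s; values are illustrative assumptions."""
    return k_nr / (k_r + k_nr + k_se)

heat_pump_only = heat_fraction()                 # probe off
heat_pump_probe = heat_fraction(k_se=2.0e9)      # probe on: SE shortens the lifetime
pa_difference = heat_pump_only - heat_pump_probe
```

The positive `pa_difference` is the fluorophore-specific contrast exploited by PA difference imaging; endogenous chromophores, lacking the long-lived state, give no such difference.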

  20. Validation of Reference Genes for RT-qPCR Studies of Gene Expression in Preharvest and Postharvest Longan Fruits under Different Experimental Conditions

    PubMed Central

    Wu, Jianyang; Zhang, Hongna; Liu, Liqin; Li, Weicai; Wei, Yongzan; Shi, Shengyou

    2016-01-01

    Reverse transcription quantitative PCR (RT-qPCR) is an accurate and sensitive method for gene expression analysis, but the veracity and reliability of its results depend on the selection of appropriate reference genes. To date, several reliable reference gene validations have been reported in fruit trees, but none have been done on preharvest and postharvest longan fruits. In this study, 12 candidate reference genes, namely, CYP, RPL, GAPDH, TUA, TUB, Fe-SOD, Mn-SOD, Cu/Zn-SOD, 18SrRNA, Actin, Histone H3, and EF-1a, were selected. Expression stability of these genes in 150 longan samples was evaluated and analyzed using the geNorm and NormFinder algorithms. Preharvest samples consisted of seven experimental sets, including different developmental stages, organs, hormone stimuli (NAA, 2,4-D, and ethephon) and abiotic stresses (bagging and girdling with defoliation). Postharvest samples consisted of different temperature treatments (4 and 22°C) and varieties. Our findings indicate that appropriate reference gene(s) should be chosen for each experimental condition. Our data further showed that the commonly used reference gene Actin does not exhibit stable expression across experimental conditions in longan. Expression levels of the DlACO gene, a key gene involved in regulating fruit abscission under the girdling with defoliation treatment, were evaluated to validate our findings. In conclusion, our data provide a useful framework for the choice of suitable reference genes across different experimental conditions for RT-qPCR analysis of preharvest and postharvest longan fruits. PMID:27375640
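The geNorm stability measure used above can be sketched as follows: for each candidate gene, M is the average standard deviation of its log2 expression ratio against every other candidate across all samples, and a lower M means more stable. The synthetic expression matrix below is illustrative, not longan data.

```python
import numpy as np

def genorm_m(expr):
    """expr: (samples x genes) matrix of linear-scale expression values.
    Returns the geNorm stability value M per gene (lower = more stable)."""
    log_expr = np.log2(expr)
    n_genes = expr.shape[1]
    M = np.empty(n_genes)
    for j in range(n_genes):
        sds = [np.std(log_expr[:, j] - log_expr[:, k], ddof=1)
               for k in range(n_genes) if k != j]
        M[j] = np.mean(sds)
    return M

rng = np.random.default_rng(1)
n = 30                                          # samples
base = rng.uniform(8, 12, size=n)               # per-sample loading effect (log2 scale)
stable_a = base + rng.normal(0, 0.05, n)        # two stable reference candidates
stable_b = base + rng.normal(0, 0.05, n)
unstable = base + rng.normal(0, 0.8, n)         # condition-sensitive gene (e.g. Actin-like)
expr = 2.0 ** np.column_stack([stable_a, stable_b, unstable])

M = genorm_m(expr)
```

geNorm proper then iteratively drops the gene with the highest M and recomputes, which is how a ranking such as "Actin is unsuitable in longan" is obtained.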

  1. System-Level Experimental Validations for Supersonic Commercial Transport Aircraft Entering Service in the 2018-2020 Time Period

    NASA Technical Reports Server (NTRS)

    Magee, Todd E.; Fugal, Spencer R.; Fink, Lawrence E.; Adamson, Eric E.; Shaw, Stephen G.

    2015-01-01

    This report describes the work conducted under NASA funding for the Boeing N+2 Supersonic Experimental Validation project to experimentally validate the conceptual design of a supersonic airliner feasible for entry into service in the 2018-to-2020 timeframe (NASA N+2 generation). The primary goal of the project was to develop a low-boom configuration optimized for minimum sonic boom signature (65 to 70 PLdB). This was a very aggressive goal that could be achieved only through integrated multidisciplinary optimization tools validated in relevant ground and, later, flight environments. The project was split into two phases. Phase I of the project covered the detailed aerodynamic design of a low-boom airliner as well as the wind tunnel tests to validate that design (ref. 1). This report covers Phase II of the project, which continued the design methodology development of Phase I with a focus on the propulsion integration aspects as well as the testing involved to validate those designs. One of the major airplane configuration features of the Boeing N+2 low-boom design was the overwing nacelle. The location of the nacelle allowed for a minimal effect on the boom signature; however, it added a level of difficulty to designing an inlet with acceptable performance in the overwing flow field. Using the Phase I work as the starting point, the goals of the Phase II project were to design and verify inlet performance while maintaining a low-boom signature. The Phase II project was successful in meeting all contract objectives. New modular nacelles were built for the larger Performance Model along with a propulsion rig with an electrically-actuated mass flow plug. Two new mounting struts were built for the smaller Boom Model, along with new nacelles. Propulsion integration testing was performed using an instrumented fan face and a mass flow plug, while boom signatures were measured using a wall-mounted pressure rail. A side study of testing in different wind tunnels was

  2. External gear pumps operating with non-Newtonian fluids: Modelling and experimental validation

    NASA Astrophysics Data System (ADS)

    Rituraj, Fnu; Vacca, Andrea

    2018-06-01

    External gear pumps are used in various industries to pump non-Newtonian viscoelastic fluids like plastics, paints, inks, etc. For both design and analysis purposes, it is often of interest to understand the features of the displacing action realized by the meshing of the gears and to describe the behavior of the leakages for this kind of pump. However, very limited work can be found in the literature about methodologies suitable for modelling such phenomena. This article describes a technique for modelling external gear pumps that operate with non-Newtonian fluids. In particular, it explains how the displacing action of the unit can be modelled using a lumped-parameter approach, which involves dividing the fluid domain into several control volumes and internal flow connections. This work is built upon the HYGESim simulation tool, conceived by the authors' research team in the last decade, which is for the first time extended to the simulation of non-Newtonian fluids. The article also describes several comparisons between simulation results and experimental data obtained from numerous experiments performed for validation of the presented methodology. Finally, the operation of an external gear pump with fluids having different viscosity characteristics is discussed.
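The lumped-parameter idea can be sketched for a single control volume: the pressure build-up equation dp/dt = (β/V)·(ΣQ_in − ΣQ_out), with turbulent orifice flows connecting neighbouring volumes. All parameter values below are illustrative assumptions; HYGESim itself resolves many coupled tooth-space volumes and non-Newtonian leakage flows.

```python
import math

beta = 1.5e9       # fluid bulk modulus, Pa (assumed)
V = 1e-4           # control volume, m^3 (assumed)
rho = 850.0        # fluid density, kg/m^3 (assumed)
CdA = 1e-7         # discharge coefficient * orifice area, m^2 (assumed)
p_hi, p_lo = 2.0e5, 0.0   # pressures of the neighbouring volumes, Pa

def orifice_q(p_up, p_down):
    """Signed turbulent orifice flow between two volumes, m^3/s."""
    dp = p_up - p_down
    return math.copysign(CdA * math.sqrt(2.0 * abs(dp) / rho), dp)

# Explicit Euler integration of the pressure build-up equation
p, dt = p_lo, 1e-4
for _ in range(2000):                        # 0.2 s of simulated time
    q_in = orifice_q(p_hi, p)
    q_out = orifice_q(p, p_lo)
    p += dt * (beta / V) * (q_in - q_out)
```

With equal inlet and outlet orifices the volume settles at the midpoint pressure, where the two flows balance; in a gear pump model the same equation is written for every tooth-space volume as its geometry rotates.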

  3. LES Modeling with Experimental Validation of a Compound Channel having Converging Floodplain

    NASA Astrophysics Data System (ADS)

    Mohanta, Abinash; Patra, K. C.

    2018-04-01

    Computational fluid dynamics (CFD) is often used to predict flow structures in developing areas of a flow field for the determination of the velocity field, pressure, shear stresses, the effect of turbulence and other quantities. A two-phase, three-dimensional CFD model together with the large eddy simulation (LES) model is used to solve the turbulence equations. This study aims to validate CFD simulations of free-surface (open channel) flow using the volume of fluid (VOF) method against data observed in the hydraulics laboratory of the National Institute of Technology, Rourkela. The finite volume method with a dynamic sub-grid-scale model was applied for a constant aspect ratio and convergence condition. The results show that the secondary flow and centrifugal force influence the flow pattern and show good agreement with experimental data. Within this paper, over-bank flows have been numerically simulated using LES in order to predict open channel flow behavior accurately. The LES results are shown to accurately predict the flow features, specifically the distribution of secondary circulations, both for in-bank channels and for over-bank channels at varying depth and width ratios in symmetrically converging floodplain compound sections.

  4. Mixing characterisation of full-scale membrane bioreactors: CFD modelling with experimental validation.

    PubMed

    Brannock, M; Wang, Y; Leslie, G

    2010-05-01

    Membrane Bioreactors (MBRs) have been successfully used in aerobic biological wastewater treatment to solve the perennial problem of effective solids-liquid separation. The optimisation of MBRs requires knowledge of the membrane fouling, biokinetics and mixing. However, research has mainly concentrated on the fouling and biokinetics (Ng and Kim, 2007). Current methods of design for a desired flow regime within MBRs are largely based on assumptions (e.g. complete mixing of tanks) and empirical techniques (e.g. specific mixing energy). However, it is difficult to predict how sludge rheology and vessel design in full-scale installations affects hydrodynamics, hence overall performance. Computational Fluid Dynamics (CFD) provides a method for prediction of how vessel features and mixing energy usage affect the hydrodynamics. In this study, a CFD model was developed which accounts for aeration, sludge rheology and geometry (i.e. bioreactor and membrane module). This MBR CFD model was then applied to two full-scale MBRs and was successfully validated against experimental results. The effect of sludge settling and rheology was found to have a minimal impact on the bulk mixing (i.e. the residence time distribution).

  5. Helium release during shale deformation: Experimental validation

    DOE PAGES

    Bauer, Stephen J.; Gardner, W. Payton; Heath, Jason E.

    2016-07-01

    This paper describes initial experimental results of helium tracer release monitoring during deformation of shale. Naturally occurring radiogenic 4He is present in high concentration in most shales. During rock deformation, accumulated helium could be released as fractures open and new transport pathways form. We present the results of an experimental study in which confined reservoir shale samples, cored parallel and perpendicular to bedding and initially saturated with helium to simulate reservoir conditions, are subjected to triaxial compressive deformation. During the deformation experiment, differential stress and axial and radial strains are systematically tracked. Release of helium is dynamically measured using a helium mass spectrometer leak detector. Helium released during deformation is observable at the laboratory scale, and the release is tightly coupled to the shale deformation. These first measurements of dynamic helium release from rocks undergoing deformation show that helium provides information on the evolution of microstructure as a function of changes in stress and strain.

  6. Experimental Evaluation of Verification and Validation Tools on Martian Rover Software

    NASA Technical Reports Server (NTRS)

    Brat, Guillaume; Giannakopoulou, Dimitra; Goldberg, Allen; Havelund, Klaus; Lowry, Mike; Pasareanu, Corina; Venet, Arnaud; Visser, Willem; Washington, Rich

    2003-01-01

    We report on a study to determine the maturity of different verification and validation (V&V) technologies on a representative example of NASA flight software. The study consisted of a controlled experiment in which three technologies (static analysis, runtime analysis and model checking) were compared to traditional testing with respect to their ability to find seeded errors in a prototype Mars Rover. What makes this study unique is that it is, to the best of our knowledge, the first controlled experiment to compare formal-methods-based tools to testing on a realistic, industrial-size example, with the emphasis on collecting as much data as possible on the performance of the tools and the participants. The paper includes a description of the Rover code that was analyzed and the tools used, as well as a detailed description of the experimental setup and the results. Due to the complexity of setting up the experiment, our results cannot be generalized, but we believe the study can still serve as a valuable point of reference for future studies of this kind. It confirmed our belief that advanced tools can outperform testing when trying to locate concurrency errors. Furthermore, the results of the experiment inspired a novel framework for testing the next generation of the Rover.

  7. Unfolding linac photon spectra and incident electron energies from experimental transmission data, with direct independent validation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ali, E. S. M.; McEwen, M. R.; Rogers, D. W. O.

    2012-11-15

    Purpose: In a recent computational study, an improved physics-based approach was proposed for unfolding linac photon spectra and incident electron energies from transmission data. In this approach, energy differentiation is improved by simultaneously using transmission data for multiple attenuators and detectors, and the unfolding robustness is improved by using a four-parameter functional form to describe the photon spectrum. The purpose of the current study is to validate this approach experimentally, and to demonstrate its application on a typical clinical linac. Methods: The validation makes use of the recent transmission measurements performed on the Vickers research linac of National Research Council Canada. For this linac, the photon spectra were previously measured using a NaI detector, and the incident electron parameters are independently known. The transmission data are for eight beams in the range 10-30 MV using thick Be, Al and Pb bremsstrahlung targets. To demonstrate the approach on a typical clinical linac, new measurements are performed on an Elekta Precise linac for 6, 10 and 25 MV beams. The different experimental setups are modeled using EGSnrc, with the newly added photonuclear attenuation included. Results: For the validation on the research linac, the 95% confidence bounds of the unfolded spectra fall within the noise of the NaI data. The unfolded spectra agree with the EGSnrc spectra (calculated using independently known electron parameters) with RMS energy fluence deviations of 4.5%. The accuracy of unfolding the incident electron energy is shown to be approximately 3%. A transmission cutoff of only 10% is suitable for accurate unfolding, provided that the other components of the proposed approach are implemented. For the demonstration on a clinical linac, the unfolded incident electron energies and their 68% confidence bounds for the 6, 10 and 25 MV beams are 6.1 ± 0.1, 9.3 ± 0.1, and 19.3 ± 0.2 MeV, respectively. The unfolded
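    The structure of this kind of transmission unfolding can be sketched with a toy forward model. Everything below is invented for illustration (the energy grid, the attenuation curve mu(E), and a two-parameter spectral shape standing in for the paper's four-parameter form); it only shows how spectral parameters are fitted to attenuation data by least squares.

```python
import numpy as np
from scipy.optimize import curve_fit

# Energy grid (MeV) and an invented attenuation coefficient mu(E) in 1/cm,
# decreasing with energy as for a light attenuator material.
E = np.linspace(0.1, 10.0, 200)
dE = E[1] - E[0]
mu = 0.2 + 0.5 / E

def spectrum(a, b):
    # Two-parameter bremsstrahlung-like shape, normalized to unit area.
    phi = E**a * np.exp(-b * E)
    return phi / (phi.sum() * dE)

def transmission(thicknesses, a, b):
    # Narrow-beam transmission for each attenuator thickness (cm).
    phi = spectrum(a, b)
    return np.array([(phi * np.exp(-mu * t)).sum() * dE for t in thicknesses])

# Synthetic "measured" transmission data from a known true spectrum.
thicknesses = np.linspace(0.0, 20.0, 12)
rng = np.random.default_rng(1)
data = transmission(thicknesses, 1.2, 0.6) + rng.normal(0.0, 1e-4, 12)

# Unfold the spectral parameters by least-squares fitting.
popt, _ = curve_fit(transmission, thicknesses, data, p0=[1.0, 1.0])
```

    The paper's actual approach additionally combines multiple attenuators and detectors and uses a physics-motivated functional form; this sketch only conveys the fit-the-forward-model structure.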

  8. Experimental Evaluation of Verification and Validation Tools on Martian Rover Software

    NASA Technical Reports Server (NTRS)

    Brat, Guillaume; Giannakopoulou, Dimitra; Goldberg, Allen; Havelund, Klaus; Lowry, Mike; Pasareanu, Corina; Venet, Arnaud; Visser, Willem

    2003-01-01

    To achieve its science objectives in deep space exploration, NASA needs science platform vehicles that autonomously make control decisions in a time frame that excludes intervention from Earth-based controllers. Round-trip light time is one significant factor motivating autonomy capability; another is the need to reduce ground support operations cost. An unsolved problem potentially impeding the adoption of autonomy capability is the verification and validation of such software systems, which exhibit far more behaviors (and hence distinct execution paths in the software) than is typical in current deep-space platforms. Hence the need for a study to benchmark advanced verification and validation (V&V) tools on representative autonomy software. The objective of the study was to assess the maturity of different technologies, to provide data indicative of potential synergies between them, and to identify gaps in the technologies with respect to the challenge of autonomy V&V. The study consisted of two parts: first, a set of relatively independent case studies of different tools on the same autonomy code; second, a carefully controlled experiment with human participants on a subset of these technologies. This paper describes the second part of the study. Overall, nearly four hundred hours of data on human use of three different advanced V&V tools were accumulated, with a control group that used conventional testing methods. The experiment simulated four independent V&V teams debugging three successive versions of an executive controller for a Martian Rover. Defects were carefully seeded into the three versions based on a profile of defects from CVS logs that occurred in the actual development of the executive controller. The rest of the document is structured as follows. In Sections 2 and 3, we respectively describe the tools used in the study and the rover software that was analyzed. In Section 4 the methodology for the experiment is described; this

  9. Hypersonic nozzle/afterbody CFD code validation. I - Experimental measurements

    NASA Technical Reports Server (NTRS)

    Spaid, Frank W.; Keener, Earl R.

    1993-01-01

    This study was conducted to obtain a detailed experimental description of the flow field created by the interaction of a single-expansion-ramp-nozzle flow with a hypersonic external stream. Data were obtained from a generic nozzle/afterbody model in the 3.5-Foot Hypersonic Wind Tunnel of the NASA Ames Research Center in a cooperative experimental program involving Ames and the McDonnell Douglas Research Laboratories. This paper presents experimental results consisting primarily of surveys obtained with a five-hole total-pressure/flow-direction probe and a total-temperature probe. These surveys were obtained in the flow field created by the interaction between the underexpanded jet plume and the external flow.

  10. Construction of a stochastic model of track geometry irregularities and validation through experimental measurements of dynamic loading

    NASA Astrophysics Data System (ADS)

    Panunzio, Alfonso M.; Puel, G.; Cottereau, R.; Simon, S.; Quost, X.

    2017-03-01

    This paper describes the construction of a stochastic model of urban railway track geometry irregularities, based on experimental data. The considered irregularities are track gauge, superelevation, horizontal and vertical curvatures. They are modelled as random fields whose statistical properties are extracted from a large set of on-track measurements of the geometry of an urban railway network. About 300-1000 terms are used in the Karhunen-Loève/Polynomial Chaos expansions to represent the random fields with appropriate accuracy. The construction of the random fields is then validated by comparing on-track measurements of the contact forces and numerical dynamics simulations for different operational conditions (train velocity and car load) and horizontal layouts (alignment, curve). The dynamics simulations are performed both with and without randomly generated geometrical irregularities for the track. The power spectrum densities obtained from the dynamics simulations with the model of geometrical irregularities compare extremely well with those obtained from the experimental contact forces. Without irregularities, the spectrum is 10-50 dB too low.
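    The Karhunen-Loève construction used above can be sketched as a discrete eigendecomposition of a covariance kernel. The exponential kernel, correlation length and variance below are invented for illustration, not extracted from the network measurements, and the Polynomial Chaos layer is omitted.

```python
import numpy as np

# Curvilinear abscissa along the track (m) for one irregularity channel,
# e.g. track gauge; kernel parameters here are assumed, not measured.
x = np.linspace(0.0, 500.0, 501)
ell, sigma = 25.0, 1.0                 # correlation length (m), std dev
C = sigma**2 * np.exp(-np.abs(x[:, None] - x[None, :]) / ell)

# Discrete Karhunen-Loeve expansion: eigen-decompose the covariance matrix
# and keep the modes carrying 99% of the variance.
eigval, eigvec = np.linalg.eigh(C)
order = np.argsort(eigval)[::-1]       # sort modes by decreasing energy
eigval, eigvec = eigval[order], eigvec[:, order]
n_terms = int(np.searchsorted(np.cumsum(eigval) / eigval.sum(), 0.99)) + 1

# One random realization of the irregularity field, built from n_terms
# uncorrelated standard Gaussian coefficients.
rng = np.random.default_rng(0)
xi = rng.standard_normal(n_terms)
field = eigvec[:, :n_terms] @ (np.sqrt(eigval[:n_terms]) * xi)
```

    In the paper, statistics of the KL coefficients are themselves represented by Polynomial Chaos expansions fitted to the measurements; here the coefficients are simply taken as independent standard Gaussians.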

  11. Validation of the generalized model of two-phase thermosyphon loop based on experimental measurements of volumetric flow rate

    NASA Astrophysics Data System (ADS)

    Bieliński, Henryk

    2016-09-01

    The current paper presents the experimental validation of the generalized model of the two-phase thermosyphon loop. The generalized model is based on mass, momentum and energy balances in the evaporators, rising tube, condensers and falling tube. The theoretical analysis and the experimental data have been obtained for a newly designed variant: a thermosyphon loop with both minichannels and conventional tubes. The loop consists of an evaporator on the lower vertical section and a condenser on the upper vertical section. The one-dimensional homogeneous and separated two-phase flow models were used in the calculations, applying the latest minichannel heat transfer correlations available in the literature. A numerical analysis of the volumetric flow rate at steady state has been performed. The experiment was conducted on a specially designed test apparatus with ultrapure water as the working fluid. The results show that the theoretical predictions are in good agreement with the measured volumetric flow rate at steady state.

  12. VIRmiRNA: a comprehensive resource for experimentally validated viral miRNAs and their targets.

    PubMed

    Qureshi, Abid; Thakur, Nishant; Monga, Isha; Thakur, Anamika; Kumar, Manoj

    2014-01-01

    Viral microRNAs (miRNAs) regulate gene expression of viral and/or host genes to benefit the virus. Hence, miRNAs play a key role in host-virus interactions and the pathogenesis of viral diseases. Lately, miRNAs have also shown potential as important targets for the development of novel antiviral therapeutics. Although several miRNA and miRNA-target repositories are available for human and other organisms in the literature, a dedicated resource on viral miRNAs and their targets has been lacking. Therefore, we have developed a comprehensive viral miRNA resource harboring information on 9133 entries in three subdatabases. This includes 1308 experimentally validated miRNA sequences with their isomiRs encoded by 44 viruses in the viral miRNA database 'VIRmiRNA' and 7283 of their target genes in 'VIRmiRTar'. Additionally, there is information on 542 antiviral miRNAs encoded by the host against 24 viruses in the antiviral miRNA database 'AVIRmir'. The web interface was developed using the Linux-Apache-MySQL-PHP (LAMP) software bundle. User-friendly browse, search, advanced search and useful analysis tools are provided on the web interface. VIRmiRNA is the first specialized resource of experimentally proven virus-encoded miRNAs and their associated targets. This database should enhance the understanding of viral/host gene regulation and may also prove beneficial in the development of antiviral therapeutics. Database URL: http://crdd.osdd.net/servers/virmirna. © The Author(s) 2014. Published by Oxford University Press.

  13. Experimental validation of alternate integral-formulation method for predicting acoustic radiation based on particle velocity measurements.

    PubMed

    Ni, Zhi; Wu, Sean F

    2010-09-01

    This paper presents experimental validation of an alternate integral-formulation method (AIM) for predicting acoustic radiation from an arbitrary structure based on the particle velocities specified on a hypothetical surface enclosing the target source. Both the normal and tangential components of the particle velocity on this hypothetical surface are measured and taken as the input to AIM codes to predict the acoustic pressures in both exterior and interior regions. The results obtained are compared with the benchmark values measured by microphones at the same locations. To gain some insight into practical applications of AIM, laser Doppler anemometer (LDA) and double hotwire sensor (DHS) are used as measurement devices to collect the particle velocities in the air. Measurement limitations of using LDA and DHS are discussed.

  14. Model validation using CFD-grade experimental database for NGNP Reactor Cavity Cooling Systems with water and air

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Manera, Annalisa; Corradini, Michael; Petrov, Victor

    This project has been focused on the experimental and numerical investigation of the water-cooled and air-cooled Reactor Cavity Cooling System (RCCS) designs. To this aim, we leveraged an existing experimental facility at the University of Wisconsin-Madison (UW), and we designed and built a separate-effect test facility at the University of Michigan. The UW facility has undergone several upgrades, including the installation of advanced instrumentation (i.e., wire-mesh sensors) built at the University of Michigan. These provide high-resolution, time-resolved measurements of the void-fraction distribution in the risers of the water-cooled RCCS facility. A phenomenological model has been developed to assess the water-cooled RCCS system stability and determine the root cause of the oscillatory behavior that occurs under normal two-phase operation. Tests under various perturbations to the water-cooled RCCS facility resulted in changes in the stability of the integral system. In particular, the effects on stability of inlet orifices, water tank volume, and system pressure have been investigated. MELCOR was used as a predictive tool when performing inlet orificing tests and was able to capture the Density Wave Oscillations (DWOs) that occurred upon reaching saturation in the risers. The experimental and numerical results were then used to provide RCCS design recommendations. The experimental facility built at the University of Michigan was aimed at investigating mixing in the upper plenum of the air-cooled RCCS design. The facility has been equipped with state-of-the-art high-resolution instrumentation to achieve so-called CFD-grade experiments, which can be used for the validation of Computational Fluid Dynamics (CFD) models, both RANS (Reynolds-Averaged) and LES (Large Eddy Simulation). The effect of riser penetration in the upper plenum has been investigated as well.

  15. Survey of computer programs for prediction of crash response and of its experimental validation

    NASA Technical Reports Server (NTRS)

    Kamat, M. P.

    1976-01-01

    The author critically assesses the potential of mathematical and hybrid simulators that predict the post-impact response of transportation vehicles. A strictly rigorous numerical analysis of a phenomenon as complex as a crash may leave much to be desired with regard to the fidelity of the mathematical simulation. Hybrid simulations, on the other hand, which exploit experimentally observed features of the deformations, appear to hold considerable promise. MARC, ANSYS, NONSAP, DYCAST, ACTION, WHAM II and KRASH are among the simulators examined for their capabilities in predicting the post-impact response of vehicles. A review of these simulators reveals that considerably more analysis capability is desirable than is currently available. NASA's crashworthiness testing program, in conjunction with similar programs of various other agencies, besides generating a large database, will be equally useful in validating new mathematical concepts of nonlinear analysis and in successfully extending other techniques to crashworthiness.

  16. Experimental validation of an 8 element EMAT phased array probe for longitudinal wave generation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Le Bourdais, Florian, E-mail: florian.lebourdais@cea.fr; Marchand, Benoit, E-mail: florian.lebourdais@cea.fr

    2015-03-31

    Sodium-cooled Fast Reactors (SFR) use liquid sodium as a coolant. Liquid sodium being opaque, optical techniques cannot be applied to reactor vessel inspection. This makes it necessary to develop alternative ways of assessing the state of the structures immersed in the medium. Ultrasonic pressure waves are well suited for inspection tasks in this environment, especially using pulsed electromagnetic acoustic transducers (EMAT) that generate the ultrasound directly in the liquid sodium. The work carried out at CEA LIST is aimed at developing phased array EMAT probes conditioned for reactor use. The present work focuses on the experimental validation of a newly manufactured 8-element probe which was designed for beam-forming imaging in a liquid sodium environment. A parametric study is carried out to determine the optimal setup of the magnetic assembly used in this probe. First laboratory tests on an aluminium block show that the probe has the required beam-steering capabilities.

  17. Experimental Methodology in English Teaching and Learning: Method Features, Validity Issues, and Embedded Experimental Design

    ERIC Educational Resources Information Center

    Lee, Jang Ho

    2012-01-01

    Experimental methods have played a significant role in the growth of English teaching and learning studies. The paper presented here outlines basic features of experimental design, including the manipulation of independent variables, the role and practicality of randomised controlled trials (RCTs) in educational research, and alternative methods…

  18. Validations of calibration-free measurements of electron temperature using double-pass Thomson scattering diagnostics from theoretical and experimental aspects.

    PubMed

    Tojo, H; Yamada, I; Yasuhara, R; Ejiri, A; Hiratsuka, J; Togashi, H; Yatsuka, E; Hatae, T; Funaba, H; Hayashi, H; Takase, Y; Itami, K

    2016-09-01

    This paper evaluates the accuracy of electron temperature measurements and relative transmissivities of double-pass Thomson scattering diagnostics. The electron temperature (Te) is obtained from the ratio of signals from a double-pass scattering system, and relative transmissivities are then calculated from the measured Te and the intensity of the signals. The accuracy of these values depends on the electron temperature and the scattering angle (θ), and it was therefore evaluated experimentally using the Large Helical Device (LHD) and the Tokyo spherical tokamak-2 (TST-2). Analysis of the TST-2 data indicates that a high Te and a large scattering angle yield accurate values. Indeed, the errors for scattering angle θ = 135° are approximately half of those for θ = 115°. The method of determining Te over a wide range spanning two orders of magnitude (0.01-1.5 keV) was validated using the experimental results of the LHD and TST-2. A simple method to provide relative transmissivities, which include contributions from the collection optics, vacuum window, optical fibers, and polychromators, is also presented. The relative errors were less than approximately 10%. Numerical simulations also indicate that the Te measurements are valid under harsh radiation conditions. This method of obtaining Te can inform the design of Thomson scattering systems for high-performance plasmas that generate harsh radiation environments.

  19. Experimental validation of coil phase parametrisation on ASDEX Upgrade, and extension to ITER

    NASA Astrophysics Data System (ADS)

    Ryan, D. A.; Liu, Y. Q.; Kirk, A.; Suttrop, W.; Dudson, B.; Dunne, M.; Willensdorfer, M.; the ASDEX Upgrade team; the EUROfusion MST1 team

    2018-06-01

    It has been previously demonstrated in Li et al (2016 Nucl. Fusion 56 126007) that the optimum upper/lower coil phase shift ΔΦopt for alignment of RMP coils for ELM mitigation depends sensitively on q 95, and other equilibrium plasma parameters. Therefore, ΔΦopt is expected to vary widely during the current ramp of ITER plasmas, with negative implications for ELM mitigation during this period. A previously derived and numerically benchmarked parametrisation of the coil phase for optimal ELM mitigation on ASDEX Upgrade (Ryan et al 2017 Plasma Phys. Control. Fusion 59 024005) is validated against experimental measurements of ΔΦopt, made by observing the changes to the ELM frequency as the coil phase is scanned. It is shown that the parametrisation may predict the optimal coil phase to within 32° of the experimental measurement for n = 2 applied perturbations. It is explained that this agreement is sufficient to ensure that the ELM mitigation is not compromised by poor coil alignment. It is also found that the phase which maximizes ELM mitigation is shifted from the phase which maximizes density pump-out, in contrast to theoretical expectations that ELM mitigation and density pump-out have the same ΔΦul dependence. A time lag between the ELM frequency response and density response to the RMP is suggested as the cause. The method for numerically deriving the parametrisation is repeated for the ITER coil set, using the baseline scenario as a reference equilibrium, and the parametrisation coefficients given for future use in a feedback coil alignment system. The relative merits of square or sinusoidal toroidal current waveforms for ELM mitigation are briefly discussed.

  20. Experimental validation of a Monte-Carlo-based inversion scheme for 3D quantitative photoacoustic tomography

    NASA Astrophysics Data System (ADS)

    Buchmann, Jens; Kaplan, Bernhard A.; Prohaska, Steffen; Laufer, Jan

    2017-03-01

    Quantitative photoacoustic tomography (qPAT) aims to extract physiological parameters, such as blood oxygen saturation (sO2), from measured multi-wavelength image data sets. The challenge of this approach lies in the inherently nonlinear fluence distribution in the tissue, which has to be accounted for by using an appropriate model, and the large scale of the inverse problem. In addition, the accuracy of experimental and scanner-specific parameters, such as the wavelength dependence of the incident fluence, the acoustic detector response, the beam profile and divergence, needs to be considered. This study aims at quantitative imaging of blood sO2, as it has been shown to be a more robust parameter compared to absolute concentrations. We propose a Monte-Carlo-based inversion scheme in conjunction with a reduction in the number of variables achieved using image segmentation. The inversion scheme is experimentally validated in tissue-mimicking phantoms consisting of polymer tubes suspended in a scattering liquid. The tubes were filled with chromophore solutions at different concentration ratios. 3-D multi-spectral image data sets were acquired using a Fabry-Perot-based PA scanner. A quantitative comparison of the measured data with the output of the forward model is presented. Parameter estimates of chromophore concentration ratios were found to be within 5% of the true values.

  1. ColE1-Plasmid Production in Escherichia coli: Mathematical Simulation and Experimental Validation.

    PubMed

    Freudenau, Inga; Lutter, Petra; Baier, Ruth; Schleef, Martin; Bednarz, Hanna; Lara, Alvaro R; Niehaus, Karsten

    2015-01-01

    Plasmids have become very important as pharmaceutical gene vectors in the fields of gene therapy and genetic vaccination in the past years. In this study, we present a dynamic model to simulate the ColE1-like plasmid replication control, once for a DH5α-strain carrying a low copy plasmid (DH5α-pSUP 201-3) and once for a DH5α-strain carrying a high copy plasmid (DH5α-pCMV-lacZ) by using ordinary differential equations and the MATLAB software. The model includes the plasmid replication control by two regulatory RNA molecules (RNAI and RNAII) as well as the replication control by uncharged tRNA molecules. To validate the model, experimental data like RNAI- and RNAII concentration, plasmid copy number (PCN), and growth rate for three different time points in the exponential phase were determined. Depending on the sampled time point, the measured RNAI- and RNAII concentrations for DH5α-pSUP 201-3 reside between 6 ± 0.7 and 34 ± 7 RNAI molecules per cell and 0.44 ± 0.1 and 3 ± 0.9 RNAII molecules per cell. The determined PCNs averaged between 46 ± 26 and 48 ± 30 plasmids per cell. The experimentally determined data for DH5α-pCMV-lacZ reside between 345 ± 203 and 1086 ± 298 RNAI molecules per cell and 22 ± 2 and 75 ± 10 RNAII molecules per cell with an averaged PCN of 1514 ± 1301 and 5806 ± 4828 depending on the measured time point. As the model was shown to be consistent with the experimentally determined data, measured at three different time points within the growth of the same strain, we performed predictive simulations concerning the effect of uncharged tRNA molecules on the ColE1-like plasmid replication control. The hypothesis is that these tRNA molecules would have an enhancing effect on the plasmid production. The in silico analysis predicts that uncharged tRNA molecules would indeed increase the plasmid DNA production.
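    The inhibitor-controlled replication loop described above can be sketched as a generic ODE system. The structure (RNAI titrating the RNAII primer, plasmid replication driven by free RNAII, dilution by growth) follows the abstract's description, but every rate constant below is invented and the uncharged-tRNA term is omitted; this is not the authors' fitted model.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Invented rate constants (per hour); not the paper's fitted values.
k1, k2 = 50.0, 5.0    # RNAI / RNAII transcription rates per plasmid copy
d1, d2 = 2.0, 2.0     # RNA decay rates
ki = 0.1              # RNAI:RNAII duplex formation (inhibition) rate
krep = 0.5            # replication initiations per free RNAII primer
mu = 0.02             # dilution of plasmids by cell growth

def rhs(t, y):
    rna1, rna2, pcn = y
    duplex = ki * rna1 * rna2          # inhibition removes both RNA species
    return [k1 * pcn - d1 * rna1 - duplex,
            k2 * pcn - d2 * rna2 - duplex,
            krep * rna2 - mu * pcn]

# Start from a single plasmid copy and integrate toward steady state.
sol = solve_ivp(rhs, (0.0, 400.0), [0.0, 0.0, 1.0], rtol=1e-8, atol=1e-10)
rna1, rna2, pcn = sol.y[:, -1]
```

    As in the measured data, the negative feedback drives RNAI to a large excess over RNAII at steady state, and the plasmid copy number settles at the balance between replication initiation and growth dilution.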

  2. SU-F-J-41: Experimental Validation of a Cascaded Linear System Model for MVCBCT with a Multi-Layer EPID

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hu, Y; Rottmann, J; Myronakis, M

    2016-06-15

    Purpose: The purpose of this study was to validate the use of a cascaded linear system model for MV cone-beam CT (CBCT) using a multi-layer (MLI) electronic portal imaging device (EPID) and to provide experimental insight into image formation. A validated 3D model provides insight into the salient factors affecting reconstructed image quality, with potential for optimizing detector design for CBCT applications. Methods: A cascaded linear system model was developed to investigate the potential improvement in reconstructed image quality for MV CBCT using an MLI EPID. Inputs to the three-dimensional (3D) model include the projection-space MTF and NPS. Experimental validation was performed on a prototype MLI detector installed on the portal imaging arm of a Varian TrueBeam radiotherapy system. CBCT scans of up to 898 projections over 360 degrees were acquired at exposures of 16 and 64 MU. Image volumes were reconstructed using a Feldkamp-type (FDK) filtered backprojection (FBP) algorithm. Flat-field images and scans of a Catphan model 604 phantom were acquired. The effect of 2×2 and 4×4 detector binning was also examined. Results: Using projection flat fields as an input, the modeled and measured NPS in the axial plane exhibit good agreement. Binning projection images was shown to improve axial-slice SDNR by a factor of approximately 1.4. This improvement is largely driven by a decrease in image noise of roughly 20%; however, it is accompanied by a loss in image resolution. Conclusion: The measured axial NPS shows good agreement with the theoretical calculation using a linear system model. Binning of projection images improves the SNR of large objects on the Catphan phantom by decreasing noise. Specific imaging tasks will dictate whether binning is applied to the two-dimensional projection images. The project was partially supported by a grant from Varian Medical Systems, Inc. and grant No. R01CA188446-01 from the National Cancer

  3. Variable viscosity and density biofilm simulations using an immersed boundary method, part II: Experimental validation and the heterogeneous rheology-IBM

    NASA Astrophysics Data System (ADS)

    Stotsky, Jay A.; Hammond, Jason F.; Pavlovsky, Leonid; Stewart, Elizabeth J.; Younger, John G.; Solomon, Michael J.; Bortz, David M.

    2016-07-01

    The goal of this work is to develop a numerical simulation that accurately captures the biomechanical response of bacterial biofilms and their associated extracellular matrix (ECM). In this, the second of a two-part effort, the primary focus is on formally presenting the heterogeneous rheology Immersed Boundary Method (hrIBM) and validating our model by comparison to experimental results. With this extension of the Immersed Boundary Method (IBM), we use the techniques originally developed in Part I ([19]) to treat biofilms as viscoelastic fluids possessing variable rheological properties anchored to a set of moving locations (i.e., the bacteria locations). In particular, we incorporate spatially continuous variable viscosity and density fields into our model. Although in [14,15] variable viscosity is used in an IBM context to model discrete viscosity changes across interfaces, to our knowledge this work and Part I are the first to apply the IBM to model a continuously variable viscosity field. We validate our modeling approach from Part I by comparing dynamic moduli and compliance moduli computed from our model to data from mechanical characterization experiments on Staphylococcus epidermidis biofilms. The experimental setup is described in [26], in which biofilms are grown and tested in a parallel plate rheometer. In order to initialize the positions of bacteria in the biofilm, experimentally obtained three-dimensional coordinate data were used. One of the major conclusions of this effort is that treating the spring-like connections between bacteria as Maxwell or Zener elements provides good agreement with the mechanical characterization data. We also found that initializing the simulations with different coordinate data sets only led to small changes in the mechanical characterization results. MATLAB code used to produce results in this paper will be available at https://github.com/MathBioCU/BiofilmSim.

  4. Experimental validation of the MODTRAN 5.3 sea surface radiance model using MIRAMER campaign measurements.

    PubMed

    Ross, Vincent; Dion, Denis; St-Germain, Daniel

    2012-05-01

    Radiometric images taken in mid-wave and long-wave infrared bands are used as a basis for validating a sea surface bidirectional reflectance distribution function (BRDF) being implemented into MODTRAN 5 (Berk et al., Proc. SPIE 5806, 662 (2005)). The images were obtained during the MIRAMER campaign that took place in May 2008 in the Mediterranean Sea near Toulon, France. When atmospheric radiances are matched at the horizon to remove possible calibration offsets, the implementation of the BRDF in MODTRAN produces good sea surface radiance agreement, usually within 2% and at worst 4% of off-glint azimuthally averaged measurements. Simulations also compare quite favorably to glint measurements. The observed sea radiance deviations between model and measurements are not systematic and are well within expected experimental uncertainties. This is largely attributed to proper radiative coupling between the surface and the atmosphere, implemented using the DISORT multiple scattering algorithm.

  5. Testing the Validity of Local Flux Laws in an Experimental Eroding Landscape

    NASA Astrophysics Data System (ADS)

    Sweeney, K. E.; Roering, J. J.; Ellis, C.

    2015-12-01

    Linking sediment transport to landscape evolution is fundamental to interpreting climate and tectonic signals from topography and sedimentary deposits. Most geomorphic process laws consist of simple continuum relationships between sediment flux and local topography. However, recent work has shown that nonlocal formulations, whereby sediment flux depends on upslope conditions, are more accurate descriptions of sediment motion, particularly in steep topography. Discriminating between local and nonlocal processes in natural landscapes is complicated by the scarcity of high-resolution topographic data and by the difficulty of measuring sediment flux. To test the validity of local formulations of sediment transport, we use an experimental erosive landscape that combines disturbance-driven, diffusive sediment transport and surface runoff. We conducted our experiments in the eXperimental Landscape Model at St. Anthony Falls Laboratory, a 0.5 x 0.5 m test flume filled with crystalline silica (D50 = 30 μm) mixed with water to increase cohesion and preclude surface infiltration. Topography is measured with a sheet laser scanner; total sediment flux is tracked with a series of load cells. We simulate uplift (relative base-level fall) by dropping two parallel weirs at the edges of the experiment. Diffusive sediment transport in our experiments is driven by rainsplash from a constant-head drip tank fitted with 625 blunt needles of fixed diameter; sediment is mobilized both through drop impact and the subsequent runoff of the drops. To drive advective transport, we produce surface runoff via a ring of misters that produce droplets too small to disturb the sediment surface on impact. Using the results from five experiments that systematically vary the time of drip-box rainfall relative to misting rainfall, we calculate local erosion in our experiments by differencing successive time-slices of topography and test whether these patterns are related to local topographic

  6. Experimental Validation of a Fast Forward Model for Guided Wave Tomography of Pipe Elbows.

    PubMed

    Brath, Alex J; Simonetti, Francesco; Nagy, Peter B; Instanes, Geir

    2017-05-01

    Ultrasonic guided wave tomography (GWT) methods for the detection of corrosion and erosion damage in straight pipe sections are now well advanced. However, successful application of GWT to pipe bends has not yet been demonstrated due to the computational burden associated with the complex forward model required to simulate guided wave propagation through the bend. In a previous paper [Brath et al., IEEE Trans. Ultrason., Ferroelectr., Freq. Control, vol. 61, pp. 815-829, 2014], we have shown that the speed of the forward model can be increased by replacing the 3-D pipe bend with a 2-D rectangular domain in which guided wave propagation is formulated based on an artificially inhomogeneous and elliptically anisotropic (INELAN) acoustic model. This paper provides further experimental validation of the INELAN model by studying the traveltime shifts caused by the introduction of shallow defects on the elbow of a pipe bend. Comparison between experiments and simulations confirms that a defect can be modeled as a phase velocity perturbation to the INELAN velocity field with accuracy that is within the experimental error of the measurements. In addition, it is found that the sensitivity of traveltime measurements to the presence of damage decreases as the damage position moves from the interior side of the bend (intrados) to the exterior one (extrados). This effect is due to the nonuniform ray coverage obtainable when transmitting the guided wave signals with one ring array of sources on one side of the elbow and receiving with a second array on the other side.

  7. Combined Heat Transfer in High-Porosity High-Temperature Fibrous Insulations: Theory and Experimental Validation

    NASA Technical Reports Server (NTRS)

    Daryabeigi, Kamran; Cunnington, George R.; Miller, Steve D.; Knutson, Jeffry R.

    2010-01-01

    Combined radiation and conduction heat transfer through various high-temperature, high-porosity, unbonded (loose) fibrous insulations was modeled based on first principles. The diffusion approximation was used for modeling the radiation component of heat transfer in the optically thick insulations. The relevant parameters needed for the heat transfer model were derived from experimental data. Semi-empirical formulations were used to model the solid conduction contribution of heat transfer in fibrous insulations with the relevant parameters inferred from thermal conductivity measurements at cryogenic temperatures in a vacuum. The specific extinction coefficient for radiation heat transfer was obtained from high-temperature steady-state thermal measurements with large temperature gradients maintained across the sample thickness in a vacuum. Standard gas conduction modeling was used in the heat transfer formulation. This heat transfer modeling methodology was applied to silica, two types of alumina, and a zirconia-based fibrous insulation, and to a variation of opacified fibrous insulation (OFI). OFI is a class of insulations manufactured by embedding efficient ceramic opacifiers in various unbonded fibrous insulations to significantly attenuate the radiation component of heat transfer. The heat transfer modeling methodology was validated by comparison with more rigorous analytical solutions and with standard thermal conductivity measurements. The validated heat transfer model is applicable to various densities of these high-porosity insulations as long as the fiber properties are the same (index of refraction, size distribution, orientation, and length). Furthermore, the heat transfer data for these insulations can be obtained at any static pressure in any working gas environment without the need to perform tests in various gases at various pressures.
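    In the optically thick limit, the diffusion approximation used above reduces the radiation component to an equivalent radiative conductivity that scales with the cube of temperature and inversely with the extinction of the fibers. A hedged sketch of that standard Rosseland-type relation, assuming an index of refraction of unity (the density and specific extinction values below are hypothetical illustrations, not the paper's measured parameters):

    ```python
    SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

    def radiative_conductivity(T, rho, e_spec):
        """Diffusion-approximation (Rosseland) radiative conductivity for an
        optically thick fibrous insulation, taking the index of refraction as 1.
        T: temperature [K]; rho: bulk density [kg/m^3];
        e_spec: specific extinction coefficient [m^2/kg]."""
        return 16.0 * SIGMA * T**3 / (3.0 * rho * e_spec)

    # Hypothetical values for illustration only.
    print(radiative_conductivity(1000.0, 48.0, 30.0))  # ≈ 0.21 W/(m K)
    ```

    The T³ dependence is why opacifiers, which raise the specific extinction coefficient, attenuate the radiation component of heat transfer most strongly at high temperature, as in the OFI insulations described above.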

  8. Characterization of Aluminum Honeycomb and Experimentation for Model Development and Validation, Volume I: Discovery and Characterization Experiments for High-Density Aluminum Honeycomb

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lu, Wei-Yang; Korellis, John S.; Lee, Kenneth L.

    2006-08-01

    Honeycomb is a structure that consists of two-dimensional regular arrays of open cells. High-density aluminum honeycomb has been used in weapon assemblies to mitigate shock and protect payloads because of its excellent crush properties. In order to use honeycomb efficiently and to certify that the payload is protected by the honeycomb under various loading conditions, a validated honeycomb crush model is required, and the mechanical properties of the honeycombs need to be fully characterized. Volume I of this report documents an experimental study of the crush behavior of high-density honeycombs. Two sets of honeycombs were included in this investigation: commercial grade for initial exploratory experiments, and weapon grade, which satisfied B61 specifications. This investigation also includes developing proper experimental methods for crush characterization, conducting discovery experiments to explore crush behaviors for model improvement, and identifying experimental and material uncertainties.

  9. Experimental and numerical validation for the novel configuration of an arthroscopic indentation instrument

    NASA Astrophysics Data System (ADS)

    Korhonen, Rami K.; Saarakkala, Simo; Töyräs, Juha; Laasanen, Mikko S.; Kiviranta, Ilkka; Jurvelin, Jukka S.

    2003-06-01

    Softening of articular cartilage, mainly attributable to deterioration of the superficial collagen network and depletion of proteoglycans, is a sign of incipient osteoarthrosis. Early diagnosis of osteoarthrosis is essential to prevent further destruction of the tissue. During the past decade, a few arthroscopic instruments have been introduced for the measurement of cartilage stiffness; these can be used to provide a sensitive measure of cartilage status. Ease of use, accuracy and reproducibility of the measurements, as well as a low risk of damaging cartilage, are the main qualities needed in any clinically applicable instrument. In this study, we have modified a commercially available arthroscopic indentation instrument to better fulfil these requirements when measuring cartilage stiffness in joints with thin cartilage. Our novel configuration was validated by experimental testing as well as by finite element (FE) modelling. Experimental and numerical tests indicated that it would be better to use a smaller reference plate and a lower pressing force (3 N) than those used in the original instrument (7-10 N). The reproducibility (CV = 5.0%) of the in situ indentation measurements was improved over that of the original instrument (CV = 7.6%), and the effect of material thickness on the indentation response was smaller than that obtained with the original instrument. The novel configuration showed a significant linear correlation between the indenter force and the reference dynamic modulus of cartilage in unconfined compression, especially in soft tissue (r = 0.893, p < 0.001, n = 16). FE analyses with a transversely isotropic poroelastic model indicated that the instrument was suitable for detecting the degeneration of superficial cartilage. In summary, the instrument presented in this study allows easy and reproducible measurement of cartilage stiffness, even in thin cartilage, and therefore represents a technical improvement for the early diagnosis of osteoarthrosis.

  10. Experimental validation of convection-diffusion discretisation scheme employed for computational modelling of biological mass transport

    PubMed Central

    2010-01-01

    Background The finite volume solver Fluent (Lebanon, NH, USA) is a computational fluid dynamics software employed to analyse biological mass transport in the vasculature. A principal consideration for computational modelling of blood-side mass transport is the selection of the convection-diffusion discretisation scheme. Because numerous discretisation schemes are available when developing a mass-transport numerical model, the results obtained should be validated against either benchmark theoretical solutions or experimentally obtained results. Methods An idealised aneurysm model was selected for the experimental and computational mass-transport analysis of species concentration due to its well-defined recirculation region within the aneurysmal sac, allowing species concentration to vary slowly with time. The experimental results were obtained from fluid samples extracted from a glass aneurysm model, using the direct spectrophotometric concentration measurement technique. The computational analysis was conducted using the four convection-diffusion discretisation schemes available to the Fluent user: the First-Order Upwind, the Power Law, the Second-Order Upwind and the Quadratic Upstream Interpolation for Convective Kinetics (QUICK) schemes. The fluid has a diffusivity of 3.125 × 10⁻¹⁰ m²/s in water, resulting in a Peclet number of 2,560,000, indicating strongly convection-dominated flow. Results The discretisation scheme applied to the solution of the convection-diffusion equation, for blood-side mass transport within the vasculature, has a significant influence on the resultant species concentration field. The First-Order Upwind and the Power Law schemes produce similar results. The Second-Order Upwind and QUICK schemes also correlate well but differ considerably from the concentration contour plots of the First-Order Upwind and Power Law schemes. The computational results were then compared to the experimental findings. An average error of 140% and 116% was demonstrated
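    The Peclet number quoted above follows directly from Pe = UL/D, the ratio of convective to diffusive transport. A small sketch (the velocity and length scale below are hypothetical values chosen only to reproduce the stated Peclet number; the abstract does not give them):

    ```python
    def peclet(U, L, D):
        """Peclet number: ratio of convective to diffusive mass transport.
        U: characteristic velocity [m/s]; L: characteristic length [m];
        D: diffusivity [m^2/s]."""
        return U * L / D

    D = 3.125e-10       # diffusivity in water, m^2/s (from the abstract)
    U, L = 0.4, 0.002   # hypothetical velocity and length scale
    print(peclet(U, L, D))  # ≈ 2.56e6: strongly convection-dominated
    ```

    Pe >> 1 is exactly the regime where the choice of convection-diffusion discretisation scheme matters most, since numerical diffusion in low-order schemes can swamp the tiny physical diffusivity.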

  11. Serum paraoxonase type-1 activity in pigs: assay validation and evolution after an induced experimental inflammation.

    PubMed

    Escribano, Damián; Tvarijonaviciute, Asta; Tecles, Fernando; Cerón, José J

    2015-02-15

    Paraoxonase 1 (PON1) is a serum enzyme synthesised and secreted primarily by the liver. It possesses anti-inflammatory properties, limiting the production of pro-inflammatory mediators. The objectives of this study were to validate three spectrophotometric assays for the quantification of PON1 activity in pig serum, and to determine whether PON1 activity in pigs behaves as a negative acute phase protein (APP), decreasing in inflammatory conditions. An analytical validation using three different substrates - 5-thiobutyl butyrolactone (TBBL), phenylacetate (PA) and 4-(p)-nitrophenyl acetate (pNA) - was performed. In addition, inflammation was experimentally induced in five pigs by subcutaneous injection of turpentine oil, while five control pigs were left untreated. The treated pigs showed significant increases in CRP and decreases in albumin, indicating an inflammatory condition. The three substrates would be suitable for PON1 activity measurements in serum samples, since they offer adequate precision (coefficients of variation < 10%), sensitivity (0.01, 0.15, 0.02 U/mL for TBBL, pNA and PA respectively) and accuracy (r=0.99). In addition, PON1 behaves as a negative APP in pigs, since a significant decrease (P<0.05) in its activity was observed with all substrates 72 h after the induction of inflammation. Copyright © 2015 Elsevier B.V. All rights reserved.
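    The precision criterion above (coefficients of variation < 10%) is the standard CV = 100 · SD / mean computed over assay replicates. A minimal sketch with hypothetical replicate values, not data from the study:

    ```python
    import statistics

    def cv_percent(replicates):
        """Coefficient of variation (%) = 100 * sample SD / mean,
        the precision metric behind an assay's CV < 10% criterion."""
        return 100.0 * statistics.stdev(replicates) / statistics.mean(replicates)

    # Hypothetical replicate PON1 activities (U/mL) for one serum sample.
    print(cv_percent([101.0, 99.0, 100.0, 98.5, 101.5]))  # well under 10%
    ```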

  12. Numerical modelling of transdermal delivery from matrix systems: parametric study and experimental validation with silicone matrices.

    PubMed

    Snorradóttir, Bergthóra S; Jónsdóttir, Fjóla; Sigurdsson, Sven Th; Másson, Már

    2014-08-01

    A model is presented for transdermal drug delivery from single-layered silicone matrix systems. The work is based on our previous results that, in particular, extend the well-known Higuchi model. Recently, we have introduced a numerical transient model describing matrix systems where the drug dissolution can be non-instantaneous. Furthermore, our model can describe complex interactions within a multi-layered matrix and the matrix to skin boundary. The power of the modelling approach presented here is further illustrated by allowing the possibility of a donor solution. The model is validated by a comparison with experimental data, as well as validating the parameter values against each other, using various configurations with donor solution, silicone matrix and skin. Our results show that the model is a good approximation to real multi-layered delivery systems. The model offers the ability of comparing drug release for ibuprofen and diclofenac, which cannot be analysed by the Higuchi model because the dissolution in the latter case turns out to be limited. The experiments and numerical model outlined in this study could also be adjusted to more general formulations, which enhances the utility of the numerical model as a design tool for the development of drug-loaded matrices for trans-membrane and transdermal delivery. © 2014 Wiley Periodicals, Inc. and the American Pharmacists Association.
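    The classical Higuchi model that this work extends gives the cumulative amount released per unit matrix area as Q(t) = sqrt(D · Cs · (2A − Cs) · t), valid when dissolution is instantaneous and loading A exceeds solubility Cs, which is precisely the assumption the authors relax. A sketch for orientation (the parameter values are hypothetical illustrations, not the paper's fitted ibuprofen or diclofenac data):

    ```python
    import math

    def higuchi_release(D, A, Cs, t):
        """Classical Higuchi model: cumulative drug released per unit area
        of a planar matrix at time t, assuming instantaneous dissolution.
        D: diffusion coefficient in the matrix [cm^2/s];
        A: initial drug loading [g/cm^3]; Cs: drug solubility [g/cm^3]."""
        return math.sqrt(D * Cs * (2.0 * A - Cs) * t)

    # Hypothetical drug-in-silicone parameters, for illustration only.
    D, A, Cs = 1e-8, 0.05, 0.004
    for hours in (1, 4, 24):
        print(hours, higuchi_release(D, A, Cs, hours * 3600.0))
    ```

    The square-root-of-time release profile is the signature of the Higuchi regime; dissolution-limited drugs such as the diclofenac case above deviate from it, which is why a transient numerical model is needed.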

  13. Experimental validation of FINDSITEcomb virtual ligand screening results for eight proteins yields novel nanomolar and micromolar binders

    PubMed Central

    2014-01-01

    Background Identification of ligand-protein binding interactions is a critical step in drug discovery. Experimental screening of large chemical libraries, despite its specific role and importance in drug discovery, suffers from the disadvantages of being random, time-consuming and expensive. To accelerate the process, traditional structure- or ligand-based VLS approaches are combined with experimental high-throughput screening, HTS. Often a single protein or, at most, a protein family is considered. Large-scale VLS benchmarking across diverse protein families is rarely done, and the reported success rate is very low. Here, we demonstrate the experimental HTS validation of a novel VLS approach, FINDSITEcomb, across a diverse set of medically relevant proteins. Results For eight different proteins belonging to different fold classes and from diverse organisms, the top 1% of FINDSITEcomb's VLS predictions were tested, and depending on the protein target, 4%-47% of the predicted ligands were shown to bind with μM or better affinities. In total, 47 small molecule binders were identified. Low nanomolar (nM) binders for dihydrofolate reductase and protein tyrosine phosphatases (PTPs) and micromolar binders for the other proteins were identified. Six novel molecules had cytotoxic activity (<10 μg/ml) against the HCT-116 colon carcinoma cell line and one novel molecule had potent antibacterial activity. Conclusions We show that FINDSITEcomb is a promising new VLS approach that can assist drug discovery. PMID:24936211

  14. Validation of WIND for a Series of Inlet Flows

    NASA Technical Reports Server (NTRS)

    Slater, John W.; Abbott, John M.; Cavicchi, Richard H.

    2002-01-01

    Validation assessments compare WIND CFD simulations to experimental data for a series of inlet flows ranging in Mach number from low subsonic to hypersonic. The validation procedures follow the guidelines of the AIAA. The WIND code performs well in matching the available experimental data. The assessments demonstrate the use of WIND and provide confidence in its use for the analysis of aircraft inlets.

  15. Validations of calibration-free measurements of electron temperature using double-pass Thomson scattering diagnostics from theoretical and experimental aspects

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tojo, H., E-mail: tojo.hiroshi@qst.go.jp; Hiratsuka, J.; Yatsuka, E.

    2016-09-15

    This paper evaluates the accuracy of electron temperature measurements and relative transmissivities of double-pass Thomson scattering diagnostics. The electron temperature (T{sub e}) is obtained from the ratio of signals from a double-pass scattering system, and relative transmissivities are then calculated from the measured T{sub e} and the intensity of the signals. The accuracy of these values depends on the electron temperature and the scattering angle (θ), and it was therefore evaluated experimentally using the Large Helical Device (LHD) and the Tokyo spherical tokamak-2 (TST-2). Analysis of the TST-2 data indicates that a high T{sub e} and a large scattering angle yield accurate values. Indeed, the errors for scattering angle θ = 135° are approximately half of those for θ = 115°. The method of determining T{sub e} over a wide range spanning two orders of magnitude (0.01–1.5 keV) was validated using the experimental results of the LHD and TST-2. A simple method to provide relative transmissivities, which include inputs from collection optics, the vacuum window, optical fibers, and polychromators, is also presented. The relative errors were less than approximately 10%. Numerical simulations also indicate that the T{sub e} measurements are valid under harsh radiation conditions. This method of obtaining T{sub e} can be considered in the design of Thomson scattering systems for high-performance plasmas that generate harsh radiation environments.

  16. Validation of the actuator disc and actuator line techniques for yawed rotor flows using the New MEXICO experimental data

    NASA Astrophysics Data System (ADS)

    Breton, S.-P.; Shen, W. Z.; Ivanell, S.

    2017-05-01

    Experimental data acquired in the New MEXICO experiment on a yawed 4.5 m diameter rotor model turbine are used here to validate the actuator line (AL) and actuator disc (AD) models implemented in the Large Eddy Simulation code EllipSys3D in terms of loading and velocity field. Even without modelling the geometry of the hub and nacelle, the AL and AD models produce similar results that are generally in good agreement with the experimental data under the various configurations considered. As expected, the AL model does better at capturing the induction effects from the individual blade tip vortices, while the AD model can reproduce the averaged features of the flow. The importance of using high quality airfoil data (including 3D corrections) as well as a fine grid resolution is highlighted by the results obtained. Overall, it is found that both models can satisfactorily predict the 3D velocity field and blade loading of the New MEXICO rotor under yawed inflow.

  17. Demonstrating Experimenter "Ineptitude" as a Means of Teaching Internal and External Validity

    ERIC Educational Resources Information Center

    Treadwell, Kimberli R.H.

    2008-01-01

    Internal and external validity are key concepts in understanding the scientific method and fostering critical thinking. This article describes a class demonstration of a "botched" experiment to teach validity to undergraduates. Psychology students (N = 75) completed assessments at the beginning of the semester, prior to and immediately following…

  18. RotCFD Software Validation - Computational and Experimental Data Comparison

    NASA Technical Reports Server (NTRS)

    Fernandez, Ovidio Montalvo

    2014-01-01

    RotCFD is a software package intended to ease the design of NextGen rotorcraft. Since RotCFD is new software still under development, its results need to be validated to determine the software's accuracy. The purpose of the present document is to explain one of the approaches to accomplishing that goal.

  19. Students' Epistemologies about Experimental Physics: Validating the Colorado Learning Attitudes about Science Survey for Experimental Physics

    ERIC Educational Resources Information Center

    Wilcox, Bethany R.; Lewandowski, H. J.

    2016-01-01

    Student learning in instructional physics labs represents a growing area of research that includes investigations of students' beliefs and expectations about the nature of experimental physics. To directly probe students' epistemologies about experimental physics and support broader lab transformation efforts at the University of Colorado Boulder…

  20. PBPK modeling for PFOS and PFOA: validation with human experimental data.

    PubMed

    Fàbrega, Francesc; Kumar, Vikas; Schuhmacher, Marta; Domingo, José L; Nadal, Martí

    2014-10-15

    In recent years, because of their potential human toxicity, concern over perfluoroalkyl substances (PFASs) has increased notably, with special attention to perfluorooctanoic acid (PFOA) and perfluorooctane sulfonate (PFOS). Unfortunately, there is currently an important knowledge gap on the burdens of these chemicals in most human tissues, as the reported studies have mainly focused on plasma. In order to overcome these limitations, the use of physiologically-based pharmacokinetic (PBPK) models has been extended. The present study was aimed at testing an existing PBPK model for its ability to predict PFOS and PFOA levels in a new case study, and also at adapting it to estimate the PFAS content in human tissue compartments. Model validation was conducted by means of PFOA and PFOS concentrations in food and drinking water from Tarragona County (Catalonia, Spain), with the predicted results compared with those experimentally found in human tissues (blood, liver, kidney and brain) of subjects from the same area of study. The use of human-derived partition coefficient (Pk) data proved more suitable for application to this PBPK model than rat-based Pk values. However, the uncertainty and variability of the data are still too high to draw conclusive results. Consequently, further efforts should be made to reduce the parametric uncertainty of PBPK models. More specifically, deeper knowledge of the distribution of PFOA and PFOS within the human body should be obtained by enlarging the number of biological monitoring studies on PFASs. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  1. Design and experimental validation of Unilateral Linear Halbach magnet arrays for single-sided magnetic resonance.

    PubMed

    Bashyam, Ashvin; Li, Matthew; Cima, Michael J

    2018-07-01

    Single-sided NMR has the potential for broad utility and has found applications in healthcare, materials analysis, food quality assurance, and the oil and gas industry. These sensors require a remote, strong, uniform magnetic field to perform high sensitivity measurements. We demonstrate a new permanent magnet geometry, the Unilateral Linear Halbach, that combines design principles from "sweet-spot" and linear Halbach magnets to achieve this goal through more efficient use of magnetic flux. We perform sensitivity analysis using numerical simulations to produce a framework for Unilateral Linear Halbach design and assess tradeoffs between design parameters. Additionally, the use of hundreds of small, discrete magnets within the assembly allows for a tunable design, improved robustness to variability in magnetization strength, and increased safety during construction. Experimental validation using a prototype magnet shows close agreement with the simulated magnetic field. The Unilateral Linear Halbach magnet increases the sensitivity, portability, and versatility of single-sided NMR. Copyright © 2018 Elsevier Inc. All rights reserved.
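    The "more efficient use of magnetic flux" described above comes from the Halbach principle: rotating the magnetization from element to element reinforces the field on one side of the array while suppressing it on the other. A toy point-dipole superposition illustrates the one-sidedness (array size, spacing, and moments are arbitrary illustrative values, not the authors' design):

    ```python
    import math

    MU0_4PI = 1e-7  # mu0 / (4*pi), in T*m/A

    def dipole_field(m, r):
        """Field of a point dipole with moment m at displacement r,
        restricted to 2D vectors in the x-z plane."""
        rx, rz = r
        mag = math.hypot(rx, rz)
        ux, uz = rx / mag, rz / mag
        mdotu = m[0] * ux + m[1] * uz
        f = MU0_4PI / mag**3
        return (f * (3.0 * mdotu * ux - m[0]), f * (3.0 * mdotu * uz - m[1]))

    def halbach_field(n_magnets, spacing, moment, point):
        """Sum dipole fields of a linear Halbach array whose moments rotate
        by 90 degrees per element, reinforcing the field on one side."""
        bx = bz = 0.0
        for i in range(n_magnets):
            angle = i * math.pi / 2.0      # 90-degree rotation per magnet
            m = (moment * math.sin(angle), moment * math.cos(angle))
            x_i = (i - (n_magnets - 1) / 2.0) * spacing
            dbx, dbz = dipole_field(m, (point[0] - x_i, point[1]))
            bx += dbx
            bz += dbz
        return bx, bz

    # Field magnitudes one spacing above and below the array centre:
    # the two sides differ markedly, which is the single-sided effect.
    for z in (0.01, -0.01):
        print(z, math.hypot(*halbach_field(9, 0.01, 1.0, (0.0, z))))
    ```

    In a real design the discrete magnets are optimized (as in the paper's sensitivity analysis) rather than simply rotated in 90° steps, but even this crude sketch concentrates the flux on one side of the array.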

  2. Design and experimental validation of Unilateral Linear Halbach magnet arrays for single-sided magnetic resonance

    NASA Astrophysics Data System (ADS)

    Bashyam, Ashvin; Li, Matthew; Cima, Michael J.

    2018-07-01

    Single-sided NMR has the potential for broad utility and has found applications in healthcare, materials analysis, food quality assurance, and the oil and gas industry. These sensors require a remote, strong, uniform magnetic field to perform high sensitivity measurements. We demonstrate a new permanent magnet geometry, the Unilateral Linear Halbach, that combines design principles from "sweet-spot" and linear Halbach magnets to achieve this goal through more efficient use of magnetic flux. We perform sensitivity analysis using numerical simulations to produce a framework for Unilateral Linear Halbach design and assess tradeoffs between design parameters. Additionally, the use of hundreds of small, discrete magnets within the assembly allows for a tunable design, improved robustness to variability in magnetization strength, and increased safety during construction. Experimental validation using a prototype magnet shows close agreement with the simulated magnetic field. The Unilateral Linear Halbach magnet increases the sensitivity, portability, and versatility of single-sided NMR.

  3. Numerical simulation and experimental validation of the dynamics of multiple bubble merger during pool boiling under microgravity conditions.

    PubMed

    Abarajith, H S; Dhir, V K; Warrier, G; Son, G

    2004-11-01

    Numerical simulation and experimental validation of the growth and departure of multiple merging bubbles, and of the associated heat transfer on a horizontal heated surface during pool boiling under variable gravity conditions, have been performed. A finite difference scheme is used to solve the equations governing mass, momentum, and energy in the vapor and liquid phases. The vapor-liquid interface is captured by a level set method that is modified to include the influence of phase change at the liquid-vapor interface. Water is used as the test liquid. The effects of reduced gravity conditions and the orientation of the bubbles on the bubble diameter, interfacial structure, bubble merger time, and departure time, as well as on local heat fluxes, are studied. In the experiments, multiple vapor bubbles are produced on artificial cavities in the 2-10 micrometer diameter range, microfabricated with a given spacing on a polished silicon wafer. The wafer was heated electrically from the back with miniature strain-gage-type heating elements in order to control the nucleation superheat. Experiments conducted in normal Earth gravity and in the low-gravity environment of a KC-135 aircraft are used to validate the numerical simulations.

  4. Experimental investigation of an RNA sequence space

    NASA Technical Reports Server (NTRS)

    Lee, Youn-Hyung; Dsouza, Lisa; Fox, George E.

    1993-01-01

    Modern rRNAs are the historic consequence of an ongoing evolutionary exploration of a sequence space. These extant sequences belong to a special subset of the sequence space that is comprised only of those primary sequences that can validly perform the biological function(s) required of the particular RNA. If it were possible to readily identify all such valid sequences, stochastic predictions could be made about the relative likelihood of various evolutionary pathways available to an RNA. Herein an experimental system which can assess whether a particular sequence is likely to have validity as a eubacterial 5S rRNA is described. A total of ten naturally occurring, and hence known to be valid, sequences and two point mutants of unknown validity were used to test the usefulness of the approach. Nine of the ten valid sequences tested positive whereas both mutants tested as clearly defective. The tenth valid sequence gave results that would be interpreted as reflecting a borderline status were the answer not known. These results demonstrate that it is possible to experimentally determine which sequences in local regions of the sequence space are potentially valid 5S rRNAs.

  5. An experimental validation of a statistical-based damage detection approach.

    DOT National Transportation Integrated Search

    2011-01-01

    In this work, a previously-developed, statistical-based, damage-detection approach was validated for its ability to autonomously detect damage in bridges. The damage-detection approach uses statistical differences in the actual and predicted beha...

  6. Experimental validation of a true-scale morphing flap for large civil aircraft applications

    NASA Astrophysics Data System (ADS)

    Pecora, R.; Amoroso, F.; Arena, M.; Noviello, M. C.; Rea, F.

    2017-04-01

    systems were duly analyzed and experimentally validated thus proving the overall device compliance with industrial standards and applicable airworthiness requirements.

  7. Internal Validity: A Must in Research Designs

    ERIC Educational Resources Information Center

    Cahit, Kaya

    2015-01-01

    In experimental research, internal validity refers to the extent to which researchers can conclude that changes in the dependent variable (i.e., the outcome) are caused by manipulations of the independent variable. The causal inference permits researchers to meaningfully interpret research results. This article discusses (a) internal validity threats in social and…

  8. Systematic Validation of Protein Force Fields against Experimental Data

    PubMed Central

    Eastwood, Michael P.; Dror, Ron O.; Shaw, David E.

    2012-01-01

    Molecular dynamics simulations provide a vehicle for capturing the structures, motions, and interactions of biological macromolecules in full atomic detail. The accuracy of such simulations, however, is critically dependent on the force field—the mathematical model used to approximate the atomic-level forces acting on the simulated molecular system. Here we present a systematic and extensive evaluation of eight different protein force fields based on comparisons of experimental data with molecular dynamics simulations that reach a previously inaccessible timescale. First, through extensive comparisons with experimental NMR data, we examined the force fields' abilities to describe the structure and fluctuations of folded proteins. Second, we quantified potential biases towards different secondary structure types by comparing experimental and simulation data for small peptides that preferentially populate either helical or sheet-like structures. Third, we tested the force fields' abilities to fold two small proteins—one α-helical, the other with β-sheet structure. The results suggest that force fields have improved over time, and that the most recent versions, while not perfect, provide an accurate description of many structural and dynamical properties of proteins. PMID:22384157

  9. Thermodynamic Properties of CO{sub 2} Capture Reaction by Solid Sorbents: Theoretical Predictions and Experimental Validations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Duan, Yuhua; Luebke, David; Pennline, Henry

    2012-01-01

    It is generally accepted that current technologies for capturing CO{sub 2} are still too energy intensive. Hence, there is a critical need for the development of new materials that can capture CO{sub 2} reversibly with acceptable energy costs. Accordingly, solid sorbents have been proposed for CO{sub 2} capture applications through a reversible chemical transformation. By combining thermodynamic database mining with first-principles density functional theory and phonon lattice dynamics calculations, a theoretical screening methodology to identify the most promising CO{sub 2} sorbent candidates from the vast array of possible solid materials has been proposed and validated. The calculated thermodynamic properties of different classes of solid materials versus temperature and pressure changes were further used to evaluate the equilibrium properties of the CO{sub 2} adsorption/desorption cycles. According to the requirements imposed by pre- and post-combustion technologies, and based on our calculated thermodynamic properties for the CO{sub 2} capture reactions of the solids of interest, we were able to screen for only those solid materials for which lower capture energy costs are expected at the desired pressure and temperature conditions. These CO{sub 2} sorbent candidates were further considered for experimental validation. In this presentation, we first introduce our screening methodology and validate it against a dataset of alkali and alkaline-earth metal oxides, hydroxides and bicarbonates whose thermodynamic properties are available. Then, by studying a series of lithium silicates, we found that increasing the Li{sub 2}O/SiO{sub 2} ratio in the lithium silicates raises their corresponding turnover temperatures for CO{sub 2} capture reactions. Compared to anhydrous K{sub 2}CO{sub 3}, the hydrated K{sub 2}CO{sub 3}·1.5H{sub 2}O can only be applied for post-combustion CO{sub 2} capture technology at temperatures lower than its phase
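
    The turnover-temperature reasoning above can be sketched with a simple equilibrium estimate. This is an illustrative calculation, not the authors' code: the enthalpy and entropy values below are rough handbook-scale numbers for CaO + CO2 -> CaCO3, and the linearized free-energy expression ignores the temperature dependence of dH and dS that the phonon calculations capture.

```python
import math

R = 8.314  # gas constant, J/(mol K)

def turnover_temperature(dH, dS, p_co2=1.0, p0=1.0):
    """Temperature at which Delta G of a capture reaction MO + CO2 -> MCO3
    crosses zero (sorption/desorption equilibrium), assuming
    Delta G(T) ~ dH - T*dS - R*T*ln(p_co2/p0)."""
    return dH / (dS + R * math.log(p_co2 / p0))

# Illustrative values (roughly CaO + CO2 -> CaCO3): dH ~ -178 kJ/mol,
# dS ~ -160 J/(mol K); at 1 bar CO2 this gives about 1113 K.
T1 = turnover_temperature(-178e3, -160.0)
# Raising the CO2 partial pressure raises the turnover temperature, which is
# why pre- and post-combustion conditions favor different sorbents.
T10 = turnover_temperature(-178e3, -160.0, p_co2=10.0)
```
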

  10. Magnetohydrodynamic Modeling and Experimental Validation of Convection Inside Electromagnetically Levitated Co-Cu Droplets

    NASA Astrophysics Data System (ADS)

    Lee, Jonghyun; Matson, Douglas M.; Binder, Sven; Kolbe, Matthias; Herlach, Dieter; Hyers, Robert W.

    2014-06-01

    A magnetohydrodynamic model of internal convection in a molten Co-Cu droplet processed by ground-based electromagnetic levitation (EML) was developed. For the calculation of the electromagnetic field generated by the copper coils, a simplified form of Maxwell's equations was solved. The calculated Lorentz force per unit volume was used as a momentum source in the Navier-Stokes equations, which were solved using a commercial computational fluid dynamics package. The RNG k- ɛ model was adopted for the prediction of turbulent flow. For validation of the developed model, a Co16Cu84 sample was tested using the EML facility at the German Aerospace Center, Cologne, Germany. The sample was subjected to a full melt cycle, during which the surface of the sample was captured by a high-speed camera. With sufficient undercooling, liquid phase separation occurred and Co-rich liquid droplets could be observed floating on the surface along streamlines. The convection velocity was estimated from the displacement of the Co-rich particles and the temporal resolution of the high-speed camera. The numerical and experimental results showed excellent agreement in the surface convection velocity.
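
    The velocity-estimation step described above reduces to simple kinematics: distance traversed by a tracked particle divided by the time spanned by the frames. A minimal sketch, with hypothetical numbers rather than data from the experiment:

```python
def surface_velocity(displacement_m, n_frames, fps):
    """Estimate surface convection velocity from a tracked particle.

    displacement_m: distance the Co-rich particle moves across the
                    tracked frames (m)
    n_frames:       number of frames spanned by the track
    fps:            camera frame rate (frames per second)
    """
    elapsed = n_frames / fps          # time spanned by the track (s)
    return displacement_m / elapsed   # mean speed along the streamline (m/s)

# Hypothetical: 2 mm traversed over 10 frames at 1000 fps -> ~0.2 m/s
v = surface_velocity(2e-3, 10, 1000)
```
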

  11. Theoretical modeling and experimental validation of a torsional piezoelectric vibration energy harvesting system

    NASA Astrophysics Data System (ADS)

    Qian, Feng; Zhou, Wanlu; Kaluvan, Suresh; Zhang, Haifeng; Zuo, Lei

    2018-04-01

    Vibration energy harvesting has been extensively studied in recent years as a continuous power source for sensor networks and low-power electronics. Torsional vibration is widespread in mechanical engineering; however, it has not yet been well exploited for energy harvesting. This paper presents a theoretical model and an experimental validation of a torsional vibration energy harvesting system comprising a shaft and a shear-mode piezoelectric transducer. The position of the piezoelectric transducer on the surface of the shaft is parameterized by two variables that are optimized to obtain the maximum power output. Depending on its attachment angle, the transducer can work in the d15 mode (pure shear mode), the coupled mode of d31 and d33, or the coupled mode of d33, d31 and d15. Approximate expressions for voltage and power are derived from the theoretical model and give predictions in good agreement with analytical solutions. A physical interpretation of the implicit relationship between the power output and the position parameters of the piezoelectric transducer is given based on the derived approximate expressions. The optimal position and angle of the piezoelectric transducer are determined; in this case, the transducer works in the coupled mode of d15, d31 and d33.

  12. PSI-Center Validation Studies

    NASA Astrophysics Data System (ADS)

    Nelson, B. A.; Akcay, C.; Glasser, A. H.; Hansen, C. J.; Jarboe, T. R.; Marklin, G. J.; Milroy, R. D.; Morgan, K. D.; Norgaard, P. C.; Shumlak, U.; Sutherland, D. A.; Victor, B. S.; Sovinec, C. R.; O'Bryan, J. B.; Held, E. D.; Ji, J.-Y.; Lukin, V. S.

    2014-10-01

    The Plasma Science and Innovation Center (PSI-Center - http://www.psicenter.org) supports collaborating validation platform experiments with 3D extended MHD simulations using the NIMROD, HiFi, and PSI-TET codes. Collaborators include the Bellan Plasma Group (Caltech), CTH (Auburn U), HBT-EP (Columbia), HIT-SI (U Wash-UW), LTX (PPPL), MAST (Culham), Pegasus (U Wisc-Madison), SSX (Swarthmore College), TCSU (UW), and ZaP/ZaP-HD (UW). The PSI-Center is exploring the application of validation metrics between experimental data and simulation results. Biorthogonal decomposition (BOD) is used to compare experiments with simulations. BOD separates data sets into spatial and temporal structures, giving greater weight to dominant structures. Several BOD metrics are being formulated with the goal of quantitative validation. Results from these simulation and validation studies, as well as an overview of PSI-Center status, will be presented.
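
    In practice, biorthogonal decomposition of a space-time data matrix is a singular value decomposition. The sketch below shows how a BOD-based comparison metric might be formed; the specific subspace-alignment metric is an illustration under that assumption, not one of the PSI-Center's formulations:

```python
import numpy as np

def bod(data):
    """Biorthogonal decomposition of a space-time matrix (n_space x n_time).

    Returns spatial modes ("topos"), weights, and temporal modes
    ("chronos"); the weights rank structures by their energy content."""
    topos, weights, chronos = np.linalg.svd(data, full_matrices=False)
    return topos, weights, chronos

def subspace_metric(exp_data, sim_data, k=3):
    """Illustrative scalar metric: alignment of the k dominant spatial
    modes of experiment and simulation (1.0 = identical structures)."""
    U_e, _, _ = bod(exp_data)
    U_s, _, _ = bod(sim_data)
    # cosines of principal angles between the dominant subspaces
    s = np.linalg.svd(U_e[:, :k].T @ U_s[:, :k], compute_uv=False)
    return float(np.mean(s))

rng = np.random.default_rng(0)
x = rng.standard_normal((32, 200))   # synthetic 32-probe, 200-sample record
m = subspace_metric(x, x)            # identical data: approximately 1.0
```
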

  13. Validating LES for Jet Aeroacoustics

    NASA Technical Reports Server (NTRS)

    Bridges, James; Wernet, Mark P.

    2011-01-01

    Engineers charged with making jet aircraft quieter have long dreamed of being able to see exactly how turbulent eddies produce sound and this dream is now coming true with the advent of large eddy simulation (LES). Two obvious challenges remain: validating the LES codes at the resolution required to see the fluid-acoustic coupling, and the interpretation of the massive datasets that are produced. This paper addresses the former, the use of advanced experimental techniques such as particle image velocimetry (PIV) and Raman and Rayleigh scattering, to validate the computer codes and procedures used to create LES solutions. This paper argues that the issue of accuracy of the experimental measurements be addressed by cross-facility and cross-disciplinary examination of modern datasets along with increased reporting of internal quality checks in PIV analysis. Further, it argues that the appropriate validation metrics for aeroacoustic applications are increasingly complicated statistics that have been shown in aeroacoustic theory to be critical to flow-generated sound, such as two-point space-time velocity correlations. A brief review of data sources available is presented along with examples illustrating cross-facility and internal quality checks required of the data before it should be accepted for validation of LES.
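
    A two-point space-time velocity correlation of the kind the paper advocates as a validation metric can be sketched as follows. This is a simplified normalized estimator, assuming discretely sampled velocity-fluctuation records at fixed probe locations:

```python
import numpy as np

def two_point_correlation(u, i, j, lag):
    """Normalized two-point space-time correlation of velocity fluctuations.

    u:   array of shape (n_probes, n_samples)
    i,j: probe indices (the two spatial points)
    lag: time separation in samples (lag >= 0)"""
    a = u[i] - u[i].mean()
    b = u[j] - u[j].mean()
    if lag > 0:
        a, b = a[:-lag], b[lag:]
    return float(np.mean(a * b) / (u[i].std() * u[j].std()))

rng = np.random.default_rng(1)
u = rng.standard_normal((4, 512))        # synthetic 4-probe record
R00 = two_point_correlation(u, 0, 0, 0)  # zero-lag autocorrelation, ~1.0
```
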

  14. An eleven-year validation of a physically-based distributed dynamic ecohydrological model tRIBS+VEGGIE: Walnut Gulch Experimental Watershed

    NASA Astrophysics Data System (ADS)

    Sivandran, G.; Bisht, G.; Ivanov, V. Y.; Bras, R. L.

    2008-12-01

    A coupled, dynamic vegetation and hydrologic model, tRIBS+VEGGIE, was applied to the semiarid Walnut Gulch Experimental Watershed in Arizona. The physically-based, distributed nature of the coupled model allows for parameterization and simulation of watershed vegetation-water-energy dynamics on timescales varying from hourly to interannual. The model also allows for explicit spatial representation of processes that vary with complex topography, such as lateral redistribution of moisture and the partitioning of radiation with respect to aspect and slope. Model parameterization and forcing were conducted using readily available databases for topography, soil types, and land use cover, as well as data from a network of meteorological stations located within the Walnut Gulch watershed. In order to test the performance of the model, three sets of simulations were conducted over an 11-year period from 1997 to 2007. Two simulations focus on heavily instrumented nested watersheds within the Walnut Gulch basin: (i) the Kendall watershed, which is dominated by annual grasses; and (ii) the Lucky Hills watershed, which is dominated by a mixture of deciduous and evergreen shrubs. The third set of simulations covers the entire Walnut Gulch watershed. Model validation and performance were evaluated in relation to three broad categories: (i) energy balance components, where the network of meteorological stations was used to validate the key energy fluxes; (ii) water balance components, where the network of flumes, rain gauges and soil moisture stations installed within the watershed was utilized to validate the manner in which the model partitions moisture; and (iii) vegetation dynamics, where remote sensing products from MODIS were used to validate spatial and temporal vegetation dynamics. Model results demonstrate satisfactory spatial and temporal agreement with observed data, giving confidence that key ecohydrological processes can be adequately represented for future applications of tRIBS+VEGGIE in

  15. HIPdb: a database of experimentally validated HIV inhibiting peptides.

    PubMed

    Qureshi, Abid; Thakur, Nishant; Kumar, Manoj

    2013-01-01

    Besides antiretroviral drugs, peptides have also demonstrated potential to inhibit the Human immunodeficiency virus (HIV). For example, T20 has been shown to effectively block HIV entry and was approved by the FDA as a novel anti-HIV peptide (AHP). We have collated all experimental information on AHPs on a single platform. HIPdb is a manually curated database of experimentally verified HIV inhibiting peptides targeting various steps or proteins involved in the life cycle of HIV, e.g. fusion, integration, reverse transcription, etc. The database provides experimental information on 981 peptides. These are of varying length, obtained from natural as well as synthetic sources, and tested on different cell lines. Important fields include peptide sequence, length, source, target, cell line, inhibition/IC(50), assay and reference. The database provides user-friendly browse, search, sort and filter options. It also contains useful services like BLAST and 'Map' for alignment with user-provided sequences. In addition, predicted structures and physicochemical properties of the peptides are included. The HIPdb database is freely available at http://crdd.osdd.net/servers/hipdb. The comprehensive information in this database will be helpful in selecting/designing effective anti-HIV peptides; thus it may prove a useful resource to researchers for peptide-based therapeutics development.

  16. Design and experimental validation of linear and nonlinear vehicle steering control strategies

    NASA Astrophysics Data System (ADS)

    Menhour, Lghani; Lechner, Daniel; Charara, Ali

    2012-06-01

    This paper proposes the design of three control laws dedicated to vehicle steering control, two based on robust linear control strategies and one based on a nonlinear control strategy, and presents a comparison between them. The two robust linear control laws (indirect and direct methods) are built around M linear bicycle models; each of these control laws is composed of two banks of M proportional integral derivative (PID) controllers: one to control the lateral deviation and the other to control the vehicle yaw angle. The indirect control law is designed using an oscillation method and a nonlinear optimisation subject to an H ∞ constraint. The direct control law is designed using a linear matrix inequality optimisation in order to achieve H ∞ performance. The nonlinear control method used for the correction of the lateral deviation is based on a continuous first-order sliding-mode controller. The different methods are designed using a linear bicycle vehicle model with varying parameters, but the aim is to simulate the nonlinear vehicle behaviour under high dynamic demands with a four-wheel vehicle model. These steering controls are validated experimentally using data acquired with a laboratory vehicle, a Peugeot 307, developed by the National Institute for Transport and Safety Research - Accident Mechanism Analysis Laboratory (INRETS-MA), and their performance results are compared. Moreover, an unknown-input sliding-mode observer is introduced to estimate the road bank angle.
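
    Each PID controller above acts on an error signal (lateral deviation or yaw-angle error). A generic discrete PID update, a sketch rather than the paper's tuned gain-scheduled controller bank, looks like:

```python
class PID:
    """Discrete PID controller: u = kp*e + ki*integral(e) + kd*de/dt."""

    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, error):
        self.integral += error * self.dt                  # accumulate I term
        derivative = (error - self.prev_error) / self.dt  # backward difference
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Hypothetical gains and a single step on a 1.5 m lateral-deviation error
pid = PID(kp=2.0, ki=0.1, kd=0.05, dt=0.01)
u_cmd = pid.update(error=1.5)
```
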

  17. Experimental validation of a sub-surface model of solar power for distributed marine sensor systems

    NASA Astrophysics Data System (ADS)

    Hahn, Gregory G.; Cantin, Heather P.; Shafer, Michael W.

    2016-04-01

    The capabilities of distributed sensor systems such as marine wildlife telemetry tags could be significantly enhanced through the integration of photovoltaic modules. Photovoltaic cells could be used to supplement the primary batteries of wildlife telemetry tags to allow extended tag deployments, wherein larger amounts of data could be collected and transmitted in near real time. In this article, we present experimental results used to validate and improve key aspects of our original model of sub-surface solar power. We discuss the test methods and results, comparing analytic predictions to experimental results. In a previous work, we introduced a model of sub-surface solar power that used analytic models and empirical data to predict the solar irradiance available for harvest at any depth under the ocean's surface over the course of a year. This model presented underwater photovoltaic transduction as a viable means of supplementing energy for marine wildlife telemetry tags. The additional data made possible by improved daily energy budgets would enhance the temporal and spatial comprehension of the host's activities and environments. Photovoltaic transduction has not been widely deployed in sub-surface marine environments despite its widespread use in tag systems for terrestrial and avian species. Until now, the use of photovoltaic cells for underwater energy harvesting has generally been disregarded as a viable energy source in this arena. In addition to marine telemetry systems, photovoltaic energy harvesting could also serve as a means of energy supply for autonomous underwater vehicles (AUVs), as well as for submersible buoys for oceanographic data collection.
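
    The depth dependence at the heart of such a model is typically an exponential attenuation of downwelling irradiance. A hedged sketch follows; the attenuation coefficient and PV efficiency below are illustrative assumptions, not the authors' fitted values:

```python
import math

def subsurface_irradiance(surface_irradiance, depth_m, k_d=0.15):
    """Downwelling irradiance at depth via exponential (Beer-Lambert style)
    attenuation: E(z) = E(0) * exp(-k_d * z).

    k_d is a diffuse attenuation coefficient (1/m); 0.15 is an illustrative
    value for moderately clear coastal water."""
    return surface_irradiance * math.exp(-k_d * depth_m)

def pv_power(irradiance, area_m2, efficiency=0.2):
    """Electrical power (W) from a PV cell of given area and efficiency."""
    return irradiance * area_m2 * efficiency

# ~1000 W/m^2 at the surface leaves roughly 223 W/m^2 at 10 m with k_d = 0.15
e10 = subsurface_irradiance(1000.0, 10.0)
```
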

  18. Solar power plant performance evaluation: simulation and experimental validation

    NASA Astrophysics Data System (ADS)

    Natsheh, E. M.; Albarbar, A.

    2012-05-01

    In this work the performance of a solar power plant is evaluated using a developed model comprising a photovoltaic array, battery storage, a controller and converters. The model is implemented in the MATLAB/SIMULINK software package. A perturb and observe (P&O) algorithm is used to maximize the generated power through a maximum power point tracker (MPPT) implementation. The outcomes of the developed model are validated and supported by a case study carried out on an operational 28.8 kW grid-connected solar power plant located in central Manchester. Measurements were taken over a 21-month period using hourly average irradiance and cell temperature. It was found that system degradation could be clearly monitored by determining the residual (the difference) between the output power predicted by the model and the actual measured power. The residual exceeded the healthy threshold, 1.7 kW, due to heavy snow in Manchester during the winter. More importantly, the developed performance evaluation technique could be adopted to detect other factors that may degrade the performance of the PV panels, such as shading and dirt. Repeatability and reliability of the developed system were validated during this period. Good agreement was achieved between the theoretical simulation and real-time measurements taken from the online grid-connected solar power plant.
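
    The perturb-and-observe logic referenced above is a simple hill-climbing loop: nudge the operating voltage, keep the perturbation direction if power rose, reverse it if power fell. A minimal sketch against a toy P-V curve; the curve, step size and starting point are illustrative, not plant parameters:

```python
def perturb_and_observe(measure, v_step=0.5, steps=50, v0=20.0):
    """P&O MPPT: climb the P-V curve by perturbing the operating voltage.

    measure(v) returns array power at operating voltage v."""
    v, direction = v0, 1.0
    p_prev = measure(v)
    for _ in range(steps):
        v += direction * v_step
        p = measure(v)
        if p < p_prev:
            direction = -direction  # power fell: reverse the perturbation
        p_prev = p
    return v  # oscillates in a small band around the maximum power point

# Toy concave P-V characteristic with its maximum power point at 30 V
pv_curve = lambda v: -(v - 30.0) ** 2 + 900.0
v_mpp = perturb_and_observe(pv_curve)
```
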

  19. Linear-dichroic infrared spectroscopy—Validation and experimental design of the new orientation technique of solid samples as suspension in nematic liquid crystal

    NASA Astrophysics Data System (ADS)

    Ivanova, B. B.; Simeonov, V. D.; Arnaudov, M. G.; Tsalev, D. L.

    2007-05-01

    A validation of a newly developed orientation method for solid samples, as a suspension in a nematic liquid crystal (NLC), applied in linear-dichroic infrared (IR-LD) spectroscopy has been carried out using the model system DL-isoleucine (DL-isoleu). Accuracy, precision and the influence of the liquid crystal medium on the peak positions and integral absorbances of guest molecules are presented, and the experimental conditions have been optimized. An experimental design has been used for quantitative evaluation of the impact of four input factors (the number of scans, the rubbing-out of the KBr pellets, the amount of the studied compound included in the liquid crystal medium, and the ratio of Lorentzian to Gaussian peak functions in the curve-fitting procedure) on the spectroscopic signal at five different frequencies, indicating important specificities of the system.

  20. Thermomechanical simulations and experimental validation for high speed incremental forming

    NASA Astrophysics Data System (ADS)

    Ambrogio, Giuseppina; Gagliardi, Francesco; Filice, Luigino; Romero, Natalia

    2016-10-01

    Incremental sheet forming (ISF) consists in deforming only a small region of the workpiece through a punch driven by an NC machine. The drawback of this process is its slowness. In this study, a high-speed variant has been investigated from both numerical and experimental points of view. The aim has been the design of a FEM model able to reproduce the material behavior during the high-speed process by defining a thermomechanical model. An experimental campaign has been performed on a CNC lathe at high speed to test process feasibility. The first results show that the material exhibits the same performance as in conventional-speed ISF and, in some cases, better material behavior due to the temperature increase. An accurate numerical simulation has been performed to investigate the material behavior during the high-speed process, substantially confirming the experimental evidence.

  1. On-chip gradient generation in 256 microfluidic cell cultures: simulation and experimental validation.

    PubMed

    Somaweera, Himali; Haputhanthri, Shehan O; Ibraguimov, Akif; Pappas, Dimitri

    2015-08-07

    A microfluidic diffusion diluter was used to create a stable concentration gradient for dose-response studies. The microfluidic diffusion diluter used in this study consisted of 128 culture chambers on each side of the main fluidic channel. A calibration method was used to find unknown concentrations with 12% error. Flow rate-dependent studies showed that changing the flow rates generated different gradient patterns. Mathematical simulations using COMSOL Multiphysics were performed to validate the experimental data, and the experimental data obtained in the flow rate studies agreed with the simulation results. Cells could be loaded into the culture chambers using vacuum actuation and cultured for extended periods under low shear stress. Decreasing the size of the culture chambers resulted in faster gradient formation (20 min). Mass transport into the side channels of the microfluidic diffusion diluter used in this study is an important factor in creating the gradient via diffusional mixing as a function of distance. To demonstrate the device's utility, an H2O2 gradient was generated while culturing Ramos cells. Cell viability was assayed in the 256 culture chambers, each at a discrete H2O2 concentration. As expected, the cell viability for the high-concentration side channels increased (by injecting H2O2) whereas the cell viability in the low-concentration side channels decreased along the chip due to diffusional mixing as a function of distance. COMSOL simulations were used to identify the effective concentration of H2O2 for cell viability in each side chamber at 45 min. The gradient effects were confirmed using traditional H2O2 culture experiments. Viability of cells in the microfluidic device under gradient conditions showed a linear relationship with viability in the traditional culture experiment. The microfluidic device developed in this study could be used to study hundreds of concentrations of a compound in a single experiment.
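
    The diffusional-mixing behavior that builds the gradient along the chip can be illustrated with a 1-D explicit finite-difference model. This is a toy stand-in for the COMSOL simulation; the geometry and coefficients are illustrative, not the device's:

```python
import numpy as np

def diffuse_1d(c0, D, dx, dt, steps):
    """Explicit finite-difference solution of dc/dt = D * d2c/dx2 with the
    two end concentrations held fixed (high side and low side)."""
    c = c0.copy()
    r = D * dt / dx**2   # stability requires r <= 0.5 (here r = 0.2)
    for _ in range(steps):
        c[1:-1] += r * (c[2:] - 2.0 * c[1:-1] + c[:-2])
    return c

n = 51
c0 = np.zeros(n)
c0[0] = 1.0   # source side held at normalized concentration 1, far side at 0
# D ~ 1e-9 m^2/s (small-molecule scale), 10 um grid spacing, 100 s of diffusion
c = diffuse_1d(c0, D=1e-9, dx=1e-5, dt=2e-2, steps=5000)
```

The resulting profile decreases monotonically with distance from the source channel, mirroring the concentration fall-off along the chip's side chambers.
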

  2. In silico analysis and experimental validation of azelastine hydrochloride (N4) targeting sodium taurocholate co-transporting polypeptide (NTCP) in HBV therapy.

    PubMed

    Fu, L-L; Liu, J; Chen, Y; Wang, F-T; Wen, X; Liu, H-Q; Wang, M-Y; Ouyang, L; Huang, J; Bao, J-K; Wei, Y-Q

    2014-08-01

    The aim of this study was to explore how sodium taurocholate co-transporting polypeptide (NTCP) exerts its function in hepatitis B virus (HBV) infection and to identify candidate compounds targeting it for HBV therapy. Identification of NTCP as a novel HBV target for screening candidate small molecules was carried out by phylogenetic analysis, network construction, molecular modelling, molecular docking and molecular dynamics (MD) simulation. In vitro virological examination, q-PCR, western blotting and cytotoxicity studies were used to validate the efficacy of the candidate compound. We performed a phylogenetic analysis of NTCP and constructed its protein-protein interaction network. We also screened compounds from DrugBank and ZINC, five of which were validated in HepG2.2.15 cells. We then selected compound N4 (azelastine hydrochloride) as the most potent of them. It showed good inhibitory activity against HBsAg (IC50 = 7.5 μM) and HBeAg (IC50 = 3.7 μM), as well as a high SI value (SI = 4.68). Further MD simulation results supported a good interaction between compound N4 and NTCP. In silico analysis and experimental validation together demonstrated that compound N4 can target NTCP in HepG2.2.15 cells, which may shed light on exploring it as a potential anti-HBV drug. © 2014 John Wiley & Sons Ltd.
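
    For context, the selectivity index (SI) quoted above is conventionally the ratio of the cytotoxic concentration to the effective antiviral concentration. A trivial helper with generic numbers, not the paper's data:

```python
def selectivity_index(cc50, ic50):
    """SI = CC50 / IC50: how much higher the cytotoxic concentration is
    than the antiviral effective concentration (higher is safer)."""
    return cc50 / ic50

si = selectivity_index(20.0, 4.0)  # generic example values -> 5.0
```
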

  3. Validation of the thermal challenge problem using Bayesian Belief Networks.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McFarland, John; Swiler, Laura Painton

    The thermal challenge problem has been developed at Sandia National Laboratories as a testbed for demonstrating various types of validation approaches and prediction methods. This report discusses one particular methodology to assess the validity of a computational model given experimental data. This methodology is based on Bayesian Belief Networks (BBNs) and can incorporate uncertainty in experimental measurements, in physical quantities, and in the model. The approach uses the prior and posterior distributions of model output to compute a validation metric based on Bayesian hypothesis testing (a Bayes' factor). This report discusses various aspects of the BBN, specifically in the context of the thermal challenge problem. A BBN is developed for a given set of experimental data in a particular experimental configuration. The development of the BBN and the method for "solving" the BBN to develop the posterior distribution of model output through Markov chain Monte Carlo sampling are discussed in detail. The use of the BBN to compute a Bayes' factor is demonstrated.
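
    The Bayes-factor idea can be sketched in a few lines: compare the prior and posterior odds that the model output falls within a tolerance of the observation. This toy Monte Carlo version illustrates the concept only and is not the report's BBN machinery:

```python
import numpy as np

def interval_bayes_factor(prior_samples, posterior_samples, observed, tol):
    """Toy validation metric in the spirit of Bayesian hypothesis testing.

    H: |model output - observed| <= tol. The Bayes factor is the ratio of
    posterior to prior odds of H, estimated from Monte Carlo samples of
    the model output; BF > 1 means the data support H."""
    def odds(samples):
        p = np.mean(np.abs(samples - observed) <= tol)
        p = min(max(p, 1e-9), 1.0 - 1e-9)  # guard against odds of 0 or inf
        return p / (1.0 - p)
    return odds(posterior_samples) / odds(prior_samples)

# Synthetic example: a broad prior predictive vs a data-informed posterior
rng = np.random.default_rng(2)
prior = rng.normal(0.0, 5.0, 100_000)
posterior = rng.normal(0.0, 0.5, 100_000)
bf = interval_bayes_factor(prior, posterior, observed=0.0, tol=1.0)
```
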

  4. Dynamic modelling and experimental validation of three wheeled tilting vehicles

    NASA Astrophysics Data System (ADS)

    Amati, Nicola; Festini, Andrea; Pelizza, Luigi; Tonoli, Andrea

    2011-06-01

    The present paper describes a study of the straight-running stability of a three-wheeled tilting vehicle for urban and sub-urban mobility. The analysis was carried out by developing a multibody model in the Matlab/Simulink SimMechanics environment. An Adams-Motorcycle model and an equivalent analytical model were developed for cross-validation and for highlighting the similarities with the lateral dynamics of motorcycles. Field tests were carried out to validate the model and identify some critical parameters, such as the damping of the steering system. The stability analysis demonstrates that the lateral dynamics are characterised by vibration modes similar to those of a motorcycle. Additionally, it shows that the wobble mode is significantly affected by the castor trail, whereas it is only slightly affected by the dynamics of the front suspension. For the present case study, frame compliance also has no influence on the weave and wobble modes.

  5. Experimental validation of docking and capture using space robotics testbeds

    NASA Technical Reports Server (NTRS)

    Spofford, John; Schmitz, Eric; Hoff, William

    1991-01-01

    This presentation describes the application of robotic and computer vision systems to validate docking and capture operations for space cargo transfer vehicles. Three applications are discussed: (1) air bearing systems in two dimensions that yield high quality free-flying, flexible, and contact dynamics; (2) validation of docking mechanisms with misalignment and target dynamics; and (3) computer vision technology for target location and real-time tracking. All the testbeds are supported by a network of engineering workstations for dynamic and controls analyses. Dynamic simulation of multibody rigid and elastic systems are performed with the TREETOPS code. MATRIXx/System-Build and PRO-MATLAB/Simulab are the tools for control design and analysis using classical and modern techniques such as H-infinity and LQG/LTR. SANDY is a general design tool to optimize numerically a multivariable robust compensator with a user-defined structure. Mathematica and Macsyma are used to derive symbolically dynamic and kinematic equations.

  6. Experimental validation of a numerical 3-D finite model applied to wind turbines design under vibration constraints: TREVISE platform

    NASA Astrophysics Data System (ADS)

    Sellami, Takwa; Jelassi, Sana; Darcherif, Abdel Moumen; Berriri, Hanen; Mimouni, Med Faouzi

    2018-04-01

    With the advancement of wind turbines towards complex structures, the requirement for trustworthy structural models has become more apparent. Hence, the vibration characteristics of the wind turbine components, like the blades and the tower, have to be extracted under vibration constraints. Although extracting the modal properties of blades is a simple task, calculating precise modal data for a whole wind turbine coupled to its tower/foundation is still a perplexing task. In this framework, this paper focuses on the structural modeling approach for modern commercial micro-turbines. Thus, the structural model of a complex commercial wind turbine, the Rutland 504, is established based on both experimental and numerical methods. A three-dimensional (3-D) numerical model of the structure was set up based on the finite volume method (FVM) using the academic finite element analysis software ANSYS. To validate the created model, experimental vibration tests were carried out using the vibration test system of the TREVISE platform at ECAM-EPMI. The tests were based on the experimental modal analysis (EMA) technique, one of the most efficient techniques for identifying structural parameters. The poles and residues of the frequency response functions (FRFs) between input and output spectra were calculated to extract the mode shapes and natural frequencies of the structure. Based on the obtained modal parameters, the numerical model was updated.

  7. Vivaldi: visualization and validation of biomacromolecular NMR structures from the PDB.

    PubMed

    Hendrickx, Pieter M S; Gutmanas, Aleksandras; Kleywegt, Gerard J

    2013-04-01

    We describe Vivaldi (VIsualization and VALidation DIsplay; http://pdbe.org/vivaldi), a web-based service for the analysis, visualization, and validation of NMR structures in the Protein Data Bank (PDB). Vivaldi provides access to model coordinates and several types of experimental NMR data using interactive visualization tools, augmented with structural annotations and model-validation information. The service presents information about the modeled NMR ensemble, validation of experimental chemical shifts, residual dipolar couplings, distance and dihedral angle constraints, as well as validation scores based on empirical knowledge and databases. Vivaldi was designed for both expert NMR spectroscopists and casual non-expert users who wish to obtain a better grasp of the information content and quality of NMR structures in the public archive. Copyright © 2013 Wiley Periodicals, Inc.

  8. Experimentally validated multiphysics computational model of focusing and shock wave formation in an electromagnetic lithotripter.

    PubMed

    Fovargue, Daniel E; Mitran, Sorin; Smith, Nathan B; Sankin, Georgy N; Simmons, Walter N; Zhong, Pei

    2013-08-01

    A multiphysics computational model of the focusing of an acoustic pulse and subsequent shock wave formation that occurs during extracorporeal shock wave lithotripsy is presented. In the electromagnetic lithotripter modeled in this work the focusing is achieved via a polystyrene acoustic lens. The transition of the acoustic pulse through the solid lens is modeled by the linear elasticity equations and the subsequent shock wave formation in water is modeled by the Euler equations with a Tait equation of state. Both sets of equations are solved simultaneously in subsets of a single computational domain within the BEARCLAW framework which uses a finite-volume Riemann solver approach. This model is first validated against experimental measurements with a standard (or original) lens design. The model is then used to successfully predict the effects of a lens modification in the form of an annular ring cut. A second model which includes a kidney stone simulant in the domain is also presented. Within the stone the linear elasticity equations incorporate a simple damage model.
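
    The Tait equation of state mentioned above relates water pressure to density. A sketch with common literature constants for water, not values taken from the paper:

```python
def tait_pressure(rho, rho0=998.0, B=3.045e8, gamma=7.15):
    """Tait equation of state: p = B * ((rho/rho0)**gamma - 1).

    rho0 (kg/m^3), B (Pa) and gamma are typical literature values for
    water. The steep rise of pressure with even small compressions is
    what allows the focused pulse to steepen into a shock."""
    return B * ((rho / rho0) ** gamma - 1.0)

p_ambient = tait_pressure(998.0)  # reference density -> zero gauge pressure
p_shock = tait_pressure(1030.0)   # ~3% compression -> tens of MPa
```
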

  9. Steam-air blown bubbling fluidized bed biomass gasification (BFBBG): Multi-scale models and experimental validation

    DOE PAGES

    Bates, Richard B.; Ghoniem, Ahmed F.; Jablonski, Whitney S.; ...

    2017-02-02

    During fluidized bed biomass gasification, complex gas-solid mixing patterns and numerous chemical and physical phenomena make identification of optimal operating conditions challenging. In this work, a parametric experimental campaign was carried out alongside the development of a coupled reactor network model which successfully integrates the individually validated sub-models to predict steady-state reactor performance metrics and outputs. The experiments utilized an integrated gasification system consisting of an externally heated, bench-scale, 4-in., 5 kWth fluidized bed steam/air-blown gasifier fed with woody biomass and equipped with a molecular beam mass spectrometer to directly measure tar species. The operating temperature (750-850°C) and air/fuel equivalence ratio (ER = 0-0.157) were independently varied to isolate their effects. Elevating the temperature is shown to improve the char gasification rate and reduce tar concentrations. In conclusion, air strongly impacts the composition of tar, accelerating the conversion of lighter polycyclic aromatic hydrocarbons into soot precursors, while also improving the overall carbon conversion.
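
    The air/fuel equivalence ratio swept in the experiments is defined relative to stoichiometric combustion. A trivial helper; the stoichiometric air demand used below (~6 kg air per kg dry woody biomass) is a rough textbook figure, not the paper's value:

```python
def equivalence_ratio(air_per_fuel, stoich_air_per_fuel=6.0):
    """ER = (air/fuel)_actual / (air/fuel)_stoichiometric.

    ER = 0 is pure steam gasification, ER = 1 is complete combustion;
    air-blown gasifiers operate in between (here ER = 0-0.157)."""
    return air_per_fuel / stoich_air_per_fuel

er_top = equivalence_ratio(0.94)  # ~0.157, the top of the tested range
```
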

  10. Experimentally validated multiphysics computational model of focusing and shock wave formation in an electromagnetic lithotripter

    PubMed Central

    Fovargue, Daniel E.; Mitran, Sorin; Smith, Nathan B.; Sankin, Georgy N.; Simmons, Walter N.; Zhong, Pei

    2013-01-01

    A multiphysics computational model of the focusing of an acoustic pulse and subsequent shock wave formation that occurs during extracorporeal shock wave lithotripsy is presented. In the electromagnetic lithotripter modeled in this work the focusing is achieved via a polystyrene acoustic lens. The transition of the acoustic pulse through the solid lens is modeled by the linear elasticity equations and the subsequent shock wave formation in water is modeled by the Euler equations with a Tait equation of state. Both sets of equations are solved simultaneously in subsets of a single computational domain within the BEARCLAW framework which uses a finite-volume Riemann solver approach. This model is first validated against experimental measurements with a standard (or original) lens design. The model is then used to successfully predict the effects of a lens modification in the form of an annular ring cut. A second model which includes a kidney stone simulant in the domain is also presented. Within the stone the linear elasticity equations incorporate a simple damage model. PMID:23927200
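The Tait equation of state that closes the Euler equations in the fluid part of the model relates pressure to density with a stiff power law. A sketch with common literature constants for water (B, n, rho0, p0 are assumed illustrative values, not necessarily those used in the paper):

```python
import math

# Tait equation of state for water: p = B[(rho/rho0)^n - 1] + p0.
B = 3.046e8      # Pa, Tait stiffness constant (assumed)
N = 7.15         # Tait exponent (assumed)
RHO0 = 998.0     # kg/m^3, reference density
P0 = 1.013e5     # Pa, reference (atmospheric) pressure

def tait_pressure(rho: float) -> float:
    """Pressure from density via the Tait equation of state."""
    return B * ((rho / RHO0) ** N - 1.0) + P0

def sound_speed(rho: float) -> float:
    """c = sqrt(dp/drho) = sqrt(n*B/rho0 * (rho/rho0)^(n-1))."""
    return math.sqrt(N * B / RHO0 * (rho / RHO0) ** (N - 1.0))

print(round(tait_pressure(RHO0)))   # reference pressure at reference density
print(round(sound_speed(RHO0)))     # ~1477 m/s, close to the sound speed of water
```

The steep exponent is what lets small density changes produce the large pressure excursions characteristic of the focused lithotripter pulse.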

  11. An experimental study of the validity of the heat-field concept for sonic-boom alleviation

    NASA Technical Reports Server (NTRS)

    Swigart, R. J.

    1974-01-01

    An experimental program was carried out in the NASA-Langley 4 ft x 4 ft supersonic pressure tunnel to investigate the validity of the heat-field concept for sonic boom alleviation. The concept involves heating the flow about a supersonic aircraft in such a manner as to obtain an increase in effective aircraft length and yield an effective aircraft shape that will result in a shock-free pressure signature on the ground. First, a basic body-of-revolution representing an SST configuration with its lift equivalence in volume was tested to provide a baseline pressure signature. Second, a model having a 5/2-power area distribution which, according to theory, should yield a linear pressure rise with no front shock wave was tested. Third, the concept of providing the 5/2-power area distribution by using an off-axis slender fin below the basic body was investigated. Then a substantial portion (approximately 40 percent) of the solid fin was replaced by a heat field generated by passing heated nitrogen through the rear of the fin.
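The 5/2-power equivalent-area distribution central to the experiment is simple to state: area grows as the 5/2 power of distance along the body, which linear sonic-boom theory predicts gives a linear pressure rise with no front shock. A sketch; the base area and body length are hypothetical values for illustration.

```python
import math

# 5/2-power equivalent-area distribution: A(x) = A_L * (x/L)**2.5.
# A_L (base area) and L (body length) below are illustrative, not the model's.

def area(x: float, length: float = 1.0, base_area: float = 0.01) -> float:
    """Equivalent cross-sectional area at station x along the body."""
    return base_area * (x / length) ** 2.5

def radius(x: float, length: float = 1.0, base_area: float = 0.01) -> float:
    """Radius of the equivalent body of revolution at station x."""
    return math.sqrt(area(x, length, base_area) / math.pi)

print(round(area(1.0), 4))    # base area at x = L
print(area(0.5) / area(1.0))  # (0.5)**2.5, the power-law scaling
```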

  12. Experimental validation of phase-only pre-compensation over 494 m free-space propagation.

    PubMed

    Brady, Aoife; Berlich, René; Leonhard, Nina; Kopf, Teresa; Böttner, Paul; Eberhardt, Ramona; Reinlein, Claudia

    2017-07-15

    It is anticipated that ground-to-geostationary orbit (GEO) laser communication will benefit from pre-compensation of atmospheric turbulence for laser beam propagation through the atmosphere. Theoretical simulations and laboratory experiments have determined its feasibility; extensive free-space experimental validation has, however, yet to be fulfilled. Therefore, we designed and implemented an adaptive optical (AO)-box which pre-compensates an outgoing laser beam (uplink) using the measurements of an incoming beam (downlink). The setup was designed to approximate the baseline scenario over a horizontal test range of 0.5 km and consisted of a ground terminal with the AO-box and a simplified approximation of a satellite terminal. Our results confirmed that we could focus the uplink beam on the satellite terminal using AO under a point-ahead angle of 28 μrad. Furthermore, we demonstrated a considerable increase in the intensity received at the satellite. These results are further testimony to AO pre-compensation being a viable technique to enhance Earth-to-GEO optical communication.

  13. Principles for valid histopathologic scoring in research

    PubMed Central

    Gibson-Corley, Katherine N.; Olivier, Alicia K.; Meyerholz, David K.

    2013-01-01

    Histopathologic scoring is a tool by which semi-quantitative data can be obtained from tissues. Initially, a thorough understanding of the experimental design, study objectives and methods are required to allow the pathologist to appropriately examine tissues and develop lesion scoring approaches. Many principles go into the development of a scoring system such as tissue examination, lesion identification, scoring definitions and consistency in interpretation. Masking (a.k.a. “blinding”) of the pathologist to experimental groups is often necessary to constrain bias and multiple mechanisms are available. Development of a tissue scoring system requires appreciation of the attributes and limitations of the data (e.g. nominal, ordinal, interval and ratio data) to be evaluated. Incidence, ordinal and rank methods of tissue scoring are demonstrated along with key principles for statistical analyses and reporting. Validation of a scoring system occurs through two principal measures: 1) validation of repeatability and 2) validation of tissue pathobiology. Understanding key principles of tissue scoring can help in the development and/or optimization of scoring systems so as to consistently yield meaningful and valid scoring data. PMID:23558974
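Repeatability of an ordinal scoring system is commonly summarized with Cohen's kappa between two scoring sessions or two masked pathologists. A minimal sketch of the unweighted statistic; the example scores are invented for illustration, not from the paper.

```python
# Unweighted Cohen's kappa: chance-corrected agreement between two raters
# scoring the same samples on the same ordinal scale.

def cohens_kappa(scores_a, scores_b):
    assert len(scores_a) == len(scores_b)
    n = len(scores_a)
    cats = sorted(set(scores_a) | set(scores_b))
    # observed agreement
    p_obs = sum(a == b for a, b in zip(scores_a, scores_b)) / n
    # agreement expected by chance from each rater's marginal frequencies
    p_exp = sum((scores_a.count(c) / n) * (scores_b.count(c) / n) for c in cats)
    return (p_obs - p_exp) / (1.0 - p_exp)

rater1 = [0, 1, 1, 2, 2, 3, 0, 1, 2, 3]   # hypothetical lesion scores
rater2 = [0, 1, 2, 2, 2, 3, 0, 1, 1, 3]
print(round(cohens_kappa(rater1, rater2), 3))
```

A weighted kappa, which penalizes large disagreements more than adjacent-category ones, is often preferred for ordinal scores.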

  14. Simulation study of amplitude-modulated (AM) harmonic motion imaging (HMI) for stiffness contrast quantification with experimental validation.

    PubMed

    Maleke, Caroline; Luo, Jianwen; Gamarnik, Viktor; Lu, Xin L; Konofagou, Elisa E

    2010-07-01

    The objective of this study is to show that Harmonic Motion Imaging (HMI) can be used as a reliable tumor-mapping technique based on the tumor's distinct stiffness at the early onset of disease. HMI is a radiation-force-based imaging method that generates a localized vibration deep inside the tissue to estimate the relative tissue stiffness based on the resulting displacement amplitude. In this paper, a finite-element model (FEM) study is presented, followed by an experimental validation in tissue-mimicking polyacrylamide gels and excised human breast tumors ex vivo. This study compares the resulting tissue motion in simulations and experiments at four different gel stiffnesses and three distinct spherical inclusion diameters. The elastic moduli of the gels were separately measured using mechanical testing. Identical transducer parameters were used in both the FEM and experimental studies, i.e., a 4.5-MHz single-element focused ultrasound (FUS) and a 7.5-MHz diagnostic (pulse-echo) transducer. In the simulation, an acoustic pressure field was used as the input stimulus to generate a localized vibration inside the target. Radiofrequency (rf) signals were then simulated using a 2D convolution model. A one-dimensional cross-correlation technique was performed on the simulated and experimental rf signals to estimate the axial displacement resulting from the harmonic radiation force. In order to measure the reliability of the displacement profiles in estimating the tissue stiffness distribution, the contrast-transfer efficiency (CTE) was calculated. For tumor mapping ex vivo, a harmonic radiation force was applied using a 2D raster-scan technique. The 2D HMI images of the breast tumor ex vivo could detect a malignant tumor (20 x 10 mm2) surrounded by glandular and fat tissues. The FEM and experimental results from both gels and breast tumors ex vivo demonstrated that HMI was capable of detecting and mapping the tumor or stiff inclusion with various diameters or
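The one-dimensional cross-correlation step described above can be sketched as follows: the lag of the correlation peak between a reference rf line and a displaced rf line gives the shift in samples, which an assumed sound speed converts to axial displacement. The rf data here are synthetic white "speckle," and the sampling frequency, sound speed, and shift are illustrative values.

```python
import numpy as np

# Cross-correlation displacement estimation between two rf lines.
fs = 100e6                              # rf sampling frequency, Hz (assumed)
c = 1540.0                              # m/s, assumed tissue sound speed
rng = np.random.default_rng(0)
rf_ref = rng.standard_normal(2048)      # reference rf line (synthetic speckle)
true_shift = 7                          # displacement in samples
rf_disp = np.roll(rf_ref, true_shift)   # displaced rf line

corr = np.correlate(rf_disp, rf_ref, mode="full")
lag = int(np.argmax(corr)) - (len(rf_ref) - 1)   # lag of the correlation peak
displacement_m = lag * c / (2.0 * fs)   # two-way echo travel, hence the factor 2
print(lag)                              # recovers the 7-sample shift
```

In practice, subsample resolution is obtained by interpolating around the correlation peak rather than taking the integer argmax.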

  15. Experimental validation of an analytical kinetic model for edge-localized modes in JET-ITER-like wall

    NASA Astrophysics Data System (ADS)

    Guillemaut, C.; Metzger, C.; Moulton, D.; Heinola, K.; O’Mullane, M.; Balboa, I.; Boom, J.; Matthews, G. F.; Silburn, S.; Solano, E. R.; contributors, JET

    2018-06-01

    The design and operation of future fusion devices relying on H-mode plasmas requires reliable modelling of edge-localized modes (ELMs) for precise prediction of divertor target conditions. An extensive experimental validation of simple analytical predictions of the time evolution of target plasma loads during ELMs has been carried out here in more than 70 JET-ITER-like wall H-mode experiments with a wide range of conditions. Comparisons of these analytical predictions with diagnostic measurements of target ion flux density, power density, impact energy and electron temperature during ELMs are presented in this paper and show excellent agreement. The analytical predictions tested here are made with the ‘free-streaming’ kinetic model (FSM) which describes ELMs as a quasi-neutral plasma bunch expanding along the magnetic field lines into the Scrape-Off Layer without collisions. Consequences of the FSM on energy reflection and deposition on divertor targets during ELMs are also discussed.
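The free-streaming picture can be sketched with a one-line change of variables: if an ELM releases a Maxwellian plasma bunch at t = 0 a connection length L from the target, each ion arrives ballistically at t = L/v, so the target flux is Gamma(t) = n0 (L/t^2) f(L/t). The full FSM expressions in the paper include more physics; the constants below are illustrative assumptions, not JET values.

```python
import math

# Free-streaming arrival flux at the divertor target after an ELM.
L = 30.0          # m, assumed parallel connection length
V_T = 2.0e5       # m/s, assumed ion thermal speed
N0 = 1.0e19       # source particles per unit area (illustrative)

def f_1d(v: float) -> float:
    """1D Maxwellian velocity distribution (per unit velocity, both signs)."""
    return math.exp(-v * v / (2 * V_T * V_T)) / (math.sqrt(2 * math.pi) * V_T)

def target_flux(t: float) -> float:
    """Collisionless flux at the target: Gamma(t) = n0 * (L/t^2) * f(L/t)."""
    return N0 * (L / t**2) * f_1d(L / t)

# Conservation check: integrating the arrival flux over all time recovers the
# half of the source population that streams toward the target (n0 / 2).
dt = 2.0e-6
total = sum(target_flux(i * dt) * dt for i in range(1, 200_000))
print(round(total / (N0 / 2), 3))   # ≈ 1.0 (particle conservation)
```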

  16. Wetting boundary condition for the color-gradient lattice Boltzmann method: Validation with analytical and experimental data

    NASA Astrophysics Data System (ADS)

    Akai, Takashi; Bijeljic, Branko; Blunt, Martin J.

    2018-06-01

    In the color gradient lattice Boltzmann model (CG-LBM), a fictitious-density wetting boundary condition has been widely used because of its ease of implementation. However, as we show, this may lead to inaccurate results in some cases. In this paper, a new scheme for the wetting boundary condition is proposed which can handle complicated 3D geometries. The validity of our method for static problems is demonstrated by comparing the simulated results to analytical solutions in 2D and 3D geometries with curved boundaries. Then, capillary rise simulations are performed to study dynamic problems where the three-phase contact line moves. The results are compared to experimental results in the literature (Heshmati and Piri, 2014). If a constant contact angle is assumed, the simulations agree with the analytical solution based on the Lucas-Washburn equation. However, to match the experiments, we need to implement a dynamic contact angle that varies with the flow rate.
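The Lucas-Washburn solution used as the benchmark for the constant-contact-angle case has a closed form, l(t) = sqrt(sigma r cos(theta) t / (2 mu)), valid in the early-time limit where gravity is neglected. A sketch with water-like illustrative properties, not the paper's fluids:

```python
import math

# Lucas-Washburn capillary rise (gravity neglected).
SIGMA = 0.072               # N/m, surface tension (assumed)
MU = 1.0e-3                 # Pa*s, dynamic viscosity (assumed)
R = 1.0e-4                  # m, capillary radius (assumed)
THETA = math.radians(45.0)  # static contact angle (assumed)

def lucas_washburn_length(t: float) -> float:
    """Imbibition length at time t: l = sqrt(sigma*r*cos(theta)*t / (2*mu))."""
    return math.sqrt(SIGMA * R * math.cos(THETA) * t / (2.0 * MU))

# The hallmark sqrt(t) scaling: quadrupling the time doubles the length.
l1 = lucas_washburn_length(0.01)
l4 = lucas_washburn_length(0.04)
print(round(l4 / l1, 6))   # 2.0
```

A flow-rate-dependent (dynamic) contact angle, as needed to match the experiments, would make cos(theta) a function of the instantaneous interface velocity.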

  17. Experimental validation of spatial Fourier transform-based multiple sound zone generation with a linear loudspeaker array.

    PubMed

    Okamoto, Takuma; Sakaguchi, Atsushi

    2017-03-01

    Generating acoustically bright and dark zones using loudspeakers is gaining attention as one of the most important acoustic communication techniques for such uses as personal sound systems and multilingual guide services. Although most conventional methods are based on numerical solutions, an analytical approach based on the spatial Fourier transform with a linear loudspeaker array has been proposed, and its effectiveness relative to conventional acoustic energy difference maximization has been demonstrated by computer simulations. To establish the effectiveness of the proposal in actual environments, this paper presents an experimental validation of the proposed approach with rectangular and Hann windows and compares it with three conventional methods: simple delay-and-sum beamforming, contrast maximization, and least squares-based pressure matching, using an actually implemented linear array of 64 loudspeakers in an anechoic chamber. The results of both the computer simulations and the actual experiments show that the proposed approach with a Hann window more accurately controlled the bright and dark zones than the conventional methods.
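The reason a Hann window improves dark-zone control can be seen from the windows' spectra: a rectangular (truncated) array has high sidelobes that leak energy outside the intended zone, while a Hann window trades main-lobe width for much lower sidelobes. A sketch comparing peak sidelobe levels; the array geometry and zone layout of the paper are not modeled here.

```python
import numpy as np

# Peak sidelobe level of a window's spectrum, in dB relative to the main lobe.
N = 64        # number of elements, matching the paper's 64-loudspeaker array
PAD = 4096    # zero-padding for a finely sampled spectrum

def peak_sidelobe_db(window: np.ndarray) -> float:
    spec = np.abs(np.fft.rfft(window, n=PAD))
    spec /= spec.max()
    mag_db = 20 * np.log10(np.maximum(spec, 1e-12))
    # locate the first spectral null, then take the highest lobe beyond it
    first_null = next(i for i in range(1, len(spec) - 1)
                      if spec[i] < spec[i - 1] and spec[i] < spec[i + 1])
    return float(mag_db[first_null:].max())

rect = np.ones(N)
hann = np.hanning(N)
print(round(peak_sidelobe_db(rect), 1))   # ≈ -13.3 dB
print(round(peak_sidelobe_db(hann), 1))   # ≈ -31.5 dB
```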

  18. Construct Validity: Advances in Theory and Methodology

    PubMed Central

    Strauss, Milton E.; Smith, Gregory T.

    2008-01-01

    Measures of psychological constructs are validated by testing whether they relate to measures of other constructs as specified by theory. Each test of relations between measures reflects on the validity of both the measures and the theory driving the test. Construct validation concerns the simultaneous process of measure and theory validation. In this chapter, we review the recent history of validation efforts in clinical psychological science that has led to this perspective, and we review five recent advances in validation theory and methodology of importance for clinical researchers. These are: the emergence of nonjustificationist philosophy of science; an increasing appreciation for theory and the need for informative tests of construct validity; valid construct representation in experimental psychopathology; the need to avoid representing multidimensional constructs with a single score; and the emergence of effective new statistical tools for the evaluation of convergent and discriminant validity. PMID:19086835

  19. EVLncRNAs: a manually curated database for long non-coding RNAs validated by low-throughput experiments.

    PubMed

    Zhou, Bailing; Zhao, Huiying; Yu, Jiafeng; Guo, Chengang; Dou, Xianghua; Song, Feng; Hu, Guodong; Cao, Zanxia; Qu, Yuanxu; Yang, Yuedong; Zhou, Yaoqi; Wang, Jihua

    2018-01-04

    Long non-coding RNAs (lncRNAs) play important functional roles in various biological processes. Early databases were utilized to deposit all lncRNA candidates produced by high-throughput experimental and/or computational techniques to facilitate classification, assessment and validation. As more lncRNAs are validated by low-throughput experiments, several databases were established for experimentally validated lncRNAs. However, these databases are small in scale (with a few hundred lncRNAs only) and specific in their focuses (plants, diseases or interactions). Thus, it is highly desirable to have a comprehensive dataset for experimentally validated lncRNAs as a central repository for all of their structures, functions and phenotypes. Here, we established EVLncRNAs by curating lncRNAs validated by low-throughput experiments (up to 1 May 2016) and integrating specific databases (lncRNAdb, LncRNADisease, Lnc2Cancer and PLNlncRbase) with additional functional and disease-specific information not covered previously. The current version of EVLncRNAs contains 1543 lncRNAs from 77 species and is 2.9 times larger than the current largest database for experimentally validated lncRNAs. Seventy-four percent of the lncRNA entries are partially or completely new compared with all existing experimentally validated databases. The established database allows users to browse, search and download as well as to submit experimentally validated lncRNAs. The database is available at http://biophy.dzu.edu.cn/EVLncRNAs. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.

  20. EVLncRNAs: a manually curated database for long non-coding RNAs validated by low-throughput experiments

    PubMed Central

    Zhao, Huiying; Yu, Jiafeng; Guo, Chengang; Dou, Xianghua; Song, Feng; Hu, Guodong; Cao, Zanxia; Qu, Yuanxu

    2018-01-01

    Long non-coding RNAs (lncRNAs) play important functional roles in various biological processes. Early databases were utilized to deposit all lncRNA candidates produced by high-throughput experimental and/or computational techniques to facilitate classification, assessment and validation. As more lncRNAs are validated by low-throughput experiments, several databases were established for experimentally validated lncRNAs. However, these databases are small in scale (with a few hundred lncRNAs only) and specific in their focuses (plants, diseases or interactions). Thus, it is highly desirable to have a comprehensive dataset for experimentally validated lncRNAs as a central repository for all of their structures, functions and phenotypes. Here, we established EVLncRNAs by curating lncRNAs validated by low-throughput experiments (up to 1 May 2016) and integrating specific databases (lncRNAdb, LncRNADisease, Lnc2Cancer and PLNlncRbase) with additional functional and disease-specific information not covered previously. The current version of EVLncRNAs contains 1543 lncRNAs from 77 species and is 2.9 times larger than the current largest database for experimentally validated lncRNAs. Seventy-four percent of the lncRNA entries are partially or completely new compared with all existing experimentally validated databases. The established database allows users to browse, search and download as well as to submit experimentally validated lncRNAs. The database is available at http://biophy.dzu.edu.cn/EVLncRNAs. PMID:28985416

  1. Bayesian cross-entropy methodology for optimal design of validation experiments

    NASA Astrophysics Data System (ADS)

    Jiang, X.; Mahadevan, S.

    2006-07-01

    An important concern in the design of validation experiments is how to incorporate the mathematical model in the design in order to allow conclusive comparisons of model prediction with experimental output in model assessment. The classical experimental design methods are more suitable for phenomena discovery and may result in a subjective, expensive, time-consuming and ineffective design that may adversely impact these comparisons. In this paper, an integrated Bayesian cross-entropy methodology is proposed to perform the optimal design of validation experiments incorporating the computational model. The expected cross entropy, an information-theoretic distance between the distributions of model prediction and experimental observation, is defined as a utility function to measure the similarity of two distributions. A simulated annealing algorithm is used to find optimal values of input variables through minimizing or maximizing the expected cross entropy. The measured data after testing with the optimum input values are used to update the distribution of the experimental output using Bayes theorem. The procedure is repeated to adaptively design the required number of experiments for model assessment, each time ensuring that the experiment provides effective comparison for validation. The methodology is illustrated for the optimal design of validation experiments for a three-leg bolted joint structure and a composite helicopter rotor hub component.
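The optimization loop described above can be sketched in a few lines: a simulated-annealing walk over candidate experiment inputs, accepting uphill moves with a Boltzmann probability under a geometric cooling schedule. The objective below is an arbitrary multimodal stand-in for the expected cross-entropy utility, not the paper's actual expression.

```python
import math
import random

def objective(x: float) -> float:
    """Toy multimodal objective with global minimum near x ≈ 2.2 (stand-in)."""
    return (x - 2.0) ** 2 + 2.0 * math.sin(5.0 * x)

def anneal(obj, x0, temp=1.0, cooling=0.999, steps=5000, seed=1):
    rng = random.Random(seed)
    x, fx = x0, obj(x0)
    best_x, best_f = x, fx
    for _ in range(steps):
        cand = x + rng.gauss(0.0, 1.0)           # random Gaussian proposal
        fc = obj(cand)
        # accept improvements always; accept uphill moves with prob e^(-df/T)
        if fc < fx or rng.random() < math.exp(-(fc - fx) / temp):
            x, fx = cand, fc
            if fx < best_f:
                best_x, best_f = x, fx
        temp *= cooling                          # geometric cooling schedule
    return best_x, best_f

best_x, best_f = anneal(objective, x0=-5.0)
print(round(best_x, 2), round(best_f, 2))
```

In the paper's setting each objective evaluation is itself expensive (it requires propagating the computational model), which is why an efficient global search over the input space matters.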

  2. Theoretical design and analysis of multivolume digital assays with wide dynamic range validated experimentally with microfluidic digital PCR.

    PubMed

    Kreutz, Jason E; Munson, Todd; Huynh, Toan; Shen, Feng; Du, Wenbin; Ismagilov, Rustem F

    2011-11-01

    This paper presents a protocol using theoretical methods and free software to design and analyze multivolume digital PCR (MV digital PCR) devices; the theory and software are also applicable to design and analysis of dilution series in digital PCR. MV digital PCR minimizes the total number of wells required for "digital" (single molecule) measurements while maintaining high dynamic range and high resolution. In some examples, multivolume designs with fewer than 200 total wells are predicted to provide dynamic range with 5-fold resolution similar to that of single-volume designs requiring 12,000 wells. Mathematical techniques were utilized and expanded to maximize the information obtained from each experiment and to quantify performance of devices and were experimentally validated using the SlipChip platform. MV digital PCR was demonstrated to perform reliably, and results from wells of different volumes agreed with one another. No artifacts due to different surface-to-volume ratios were observed, and single molecule amplification in volumes ranging from 1 to 125 nL was self-consistent. The device presented here was designed to meet the testing requirements for measuring clinically relevant levels of HIV viral load at the point-of-care (in plasma, <500 molecules/mL to >1,000,000 molecules/mL), and the predicted resolution and dynamic range were experimentally validated using a control sequence of DNA. This approach simplifies digital PCR experiments, saves space, and thus enables multiplexing using separate areas for each sample on one chip, and facilitates the development of new high-performance diagnostic tools for resource-limited applications. The theory and software presented here are general and are applicable to designing and analyzing other digital analytical platforms including digital immunoassays and digital bacterial analysis. It is not limited to SlipChip and could also be useful for the design of systems on platforms including valve-based and droplet
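The statistics underlying multivolume digital PCR follow from Poisson partitioning: a well of volume v is positive with probability 1 − exp(−c·v) at template concentration c, and a single concentration is recovered by maximum likelihood jointly over all well volumes. A sketch; the well volumes and counts below are illustrative, not the paper's device design.

```python
import math

volumes_nl = [1.0, 5.0, 25.0, 125.0]   # nL, one device region per volume (assumed)
n_wells = [160, 160, 160, 160]         # wells per volume (assumed)

def log_likelihood(c, positives):
    """Binomial log-likelihood of the positive-well counts at concentration c."""
    ll = 0.0
    for v, n, k in zip(volumes_nl, n_wells, positives):
        p = 1.0 - math.exp(-c * v)          # P(well positive) at volume v
        ll += k * math.log(max(p, 1e-300)) + (n - k) * (-c * v)
        # note: log(1 - p) = -c*v exactly for the negative wells
    return ll

def mle_concentration(positives, lo=1e-6, hi=1.0, steps=20000):
    """Grid-search maximum-likelihood concentration (molecules per nL)."""
    best_c, best_ll = lo, -math.inf
    for i in range(steps):
        c = lo + (hi - lo) * i / (steps - 1)
        ll = log_likelihood(c, positives)
        if ll > best_ll:
            best_c, best_ll = c, ll
    return best_c

# Expected positive counts at c = 0.02 molecules/nL, rounded to whole wells:
c_true = 0.02
positives = [round(n * (1.0 - math.exp(-c_true * v)))
             for v, n in zip(volumes_nl, n_wells)]
c_hat = mle_concentration(positives)
print(round(c_hat, 3))   # ≈ 0.02, recovering the true concentration
```

The large wells carry the information at low concentrations (where small wells are all negative) and the small wells at high concentrations (where large wells saturate), which is how few wells span a wide dynamic range.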

  3. Quantitative comparison of PZT and CMUT probes for photoacoustic imaging: Experimental validation.

    PubMed

    Vallet, Maëva; Varray, François; Boutet, Jérôme; Dinten, Jean-Marc; Caliano, Giosuè; Savoia, Alessandro Stuart; Vray, Didier

    2017-12-01

    Photoacoustic (PA) signals are short ultrasound (US) pulses typically characterized by a single-cycle shape, often referred to as N-shape. The spectral content of such wideband signals ranges from a few hundred kilohertz to several tens of megahertz. Typical reception frequency responses of classical piezoelectric US imaging transducers, based on PZT technology, are not sufficiently broadband to fully preserve the entire information contained in PA signals, which are then filtered, thus limiting PA imaging performance. Capacitive micromachined ultrasonic transducers (CMUT) are rapidly emerging as a valid alternative to conventional PZT transducers in several medical ultrasound imaging applications. As compared to PZT transducers, CMUTs exhibit both higher sensitivity and significantly broader frequency response in reception, making their use attractive in PA imaging applications. This paper explores the advantages of the CMUT larger bandwidth in PA imaging by carrying out an experimental comparative study using various CMUT and PZT probes from different research laboratories and manufacturers. PA acquisitions are performed on a suture wire and on several home-made bimodal phantoms with both PZT and CMUT probes. Three criteria, based on the evaluation of pure receive impulse response, signal-to-noise ratio (SNR) and contrast-to-noise ratio (CNR) respectively, have been used for a quantitative comparison of imaging results. The measured fractional bandwidths of the CMUT arrays are larger compared to PZT probes. Moreover, both SNR and CNR are enhanced by at least 6 dB with CMUT technology. This work highlights the potential of CMUT technology for PA imaging through qualitative and quantitative parameters.
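The SNR and CNR criteria can be sketched in their common image-domain forms; definitions vary between groups, so these are one standard choice assumed for illustration, and the amplitude values are invented.

```python
import math

# SNR = 20*log10(mu_signal / sigma_noise);
# CNR = |mu_in - mu_out| / sqrt(sigma_in^2 + sigma_out^2).

def mean_std(x):
    m = sum(x) / len(x)
    return m, math.sqrt(sum((v - m) ** 2 for v in x) / len(x))

def snr_db(signal_region, noise_region):
    mu_s, _ = mean_std(signal_region)
    _, sigma_n = mean_std(noise_region)
    return 20.0 * math.log10(mu_s / sigma_n)

def cnr(inside, outside):
    mu_i, s_i = mean_std(inside)
    mu_o, s_o = mean_std(outside)
    return abs(mu_i - mu_o) / math.sqrt(s_i ** 2 + s_o ** 2)

# Illustrative envelope amplitudes from a target region vs. background:
target = [8.0, 9.0, 10.0, 11.0, 12.0]
background = [0.8, 1.0, 1.2, 1.0, 1.0]
print(round(snr_db(target, background), 1))
print(round(cnr(target, background), 2))
```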

  4. Development of a new calibration procedure and its experimental validation applied to a human motion capture system.

    PubMed

    Royo Sánchez, Ana Cristina; Aguilar Martín, Juan José; Santolaria Mazo, Jorge

    2014-12-01

    Motion capture systems are often used for checking and analyzing human motion in biomechanical applications. It is important, in this context, that the systems provide the best possible accuracy. Among existing capture systems, optical systems are those with the highest accuracy. In this paper, the development of a new calibration procedure for optical human motion capture systems is presented. The performance and effectiveness of that new calibration procedure are also checked by experimental validation. The new calibration procedure consists of two stages. In the first stage, initial estimators of intrinsic and extrinsic parameters are sought. The camera calibration method used in this stage is the one proposed by Tsai. These parameters are determined from the camera characteristics, the spatial position of the camera, and the center of the capture volume. In the second stage, a simultaneous nonlinear optimization of all parameters is performed to identify the optimal values, which minimize the objective function. The objective function, in this case, minimizes two errors. The first error is the distance error between two markers placed in a wand. The second error is the error of position and orientation of the retroreflective markers of a static calibration object. The real co-ordinates of the two objects are calibrated in a co-ordinate measuring machine (CMM). The OrthoBio system is used to validate the new calibration procedure. The resulting errors are 90% lower than those from the previous calibration software and broadly comparable with results from a similarly configured Vicon system.

  5. Guidelines for experimental design protocol and validation procedure for the measurement of heat resistance of microorganisms in milk.

    PubMed

    Condron, Robin; Farrokh, Choreh; Jordan, Kieran; McClure, Peter; Ross, Tom; Cerf, Olivier

    2015-01-02

    Studies on the heat resistance of dairy pathogens are a vital part of assessing the safety of dairy products. However, harmonized methodology for the study of heat resistance of food pathogens is lacking, even though there is a need for such harmonized experimental design protocols and for harmonized validation procedures for heat treatment studies. Such an approach is of particular importance to allow international agreement on appropriate risk management of emerging potential hazards for human and animal health. This paper is working toward establishment of a harmonized protocol for the study of the heat resistance of pathogens, identifying critical issues for establishment of internationally agreed protocols, including a harmonized framework for reporting and interpretation of heat inactivation studies of potentially pathogenic microorganisms. Copyright © 2014 Elsevier B.V. All rights reserved.
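Heat-resistance results of the kind such a protocol would harmonize are usually reported through the log-linear survival model, where counts fall one log10 cycle per D-value of holding time. A sketch that fits D by linear regression of log-survivors against time; the data are synthetic, generated with an exact D of 2.0 min.

```python
import math

def fit_d_value(times_min, counts):
    """Least-squares slope of log10(count) vs time; D-value = -1/slope."""
    logs = [math.log10(c) for c in counts]
    n = len(times_min)
    mt = sum(times_min) / n
    ml = sum(logs) / n
    slope = (sum((t - mt) * (l - ml) for t, l in zip(times_min, logs))
             / sum((t - mt) ** 2 for t in times_min))
    return -1.0 / slope

times = [0, 2, 4, 6, 8]                            # min at constant temperature
counts = [1e6 * 10 ** (-t / 2.0) for t in times]   # exact 2-minute D-value
print(round(fit_d_value(times, counts), 3))        # 2.0
```

Agreeing on how such parameters are estimated and reported (model, temperature control, replicates) is exactly the kind of detail a harmonized protocol must fix.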

  6. Experimental Validation of the Piezoelectric Triple Hybrid Actuation System (TriHYBAS)

    NASA Technical Reports Server (NTRS)

    Xu, Tian-Bing; Jiang, Xiaoning; Su, Ji

    2008-01-01

    A piezoelectric triple hybrid actuation system (TriHYBAS) has been developed. In this brief presentation of the validation process the displacement profile of TriHYBAS and findings regarding displacement versus applied voltage are highlighted.

  7. Experimental validation of pulsed column inventory estimators

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Beyerlein, A.L.; Geldard, J.F.; Weh, R.

    Near-real-time accounting (NRTA) for reprocessing plants relies on the timely measurement of all transfers through the process area and all inventory in the process. It is difficult to measure the inventory of the solvent contactors; therefore, estimation techniques are considered. We have used experimental data obtained at the TEKO facility in Karlsruhe and have applied computer codes developed at Clemson University to analyze this data. For uranium extraction, the computer predictions agree to within 15% of the measured inventories. We believe this study is significant in demonstrating that using theoretical models with a minimum amount of process data may be an acceptable approach to column inventory estimation for NRTA. 15 refs., 7 figs.

  8. Prediction of hip joint load and translation using musculoskeletal modelling with force-dependent kinematics and experimental validation.

    PubMed

    Zhang, Xuan; Chen, Zhenxian; Wang, Ling; Yang, Wenjian; Li, Dichen; Jin, Zhongmin

    2015-07-01

    Musculoskeletal lower limb models are widely used to predict the resultant contact force in the hip joint as a non-invasive alternative to instrumented implants. Previous musculoskeletal models based on rigid body assumptions treated the hip joint as an ideal sphere with only three rotational degrees of freedom. A musculoskeletal model that considered force-dependent kinematics with three additional translational degrees of freedom was developed and validated in this study by comparing it with a previous experimental measurement. A 32-mm femoral head against a polyethylene cup was considered in the musculoskeletal model for calculating the contact forces. The changes in the main modelling parameters were found to have little influence on the hip joint forces (relative deviation of peak value < 10 BW%, mean trial deviation < 20 BW%). The centre of the hip joint translation was more sensitive to the changes in the main modelling parameters, especially muscle recruitment type (relative deviation of peak value < 20%, mean trial deviation < 0.02 mm). The predicted hip contact forces showed consistent profiles, compared with the experimental measurements, except in the lateral-medial direction. The ratio-average analysis, based on the Bland-Altman's plots, showed better limits of agreement in climbing stairs (mean limits of agreement: -2.0 to 6.3 in walking, mean limits of agreement: -0.5 to 3.1 in climbing stairs). Better agreement of the predicted hip contact forces was also found during the stance phase. The force-dependent kinematics approach underestimated the maximum hip contact force by a mean value of 6.68 ± 1.75% BW compared with the experimental measurements. The predicted maximum translations of the hip joint centres were 0.125 ± 0.03 mm in level walking and 0.123 ± 0.005 mm in climbing stairs. © IMechE 2015.
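The limits-of-agreement analysis used above can be sketched in its classical difference form: mean difference (bias) ± 1.96 standard deviations of the paired differences (the paper uses a ratio-average variant of the same idea). The paired values below are invented for illustration.

```python
import math

# Bland-Altman limits of agreement for paired predicted vs. measured values.
def bland_altman_limits(a, b):
    diffs = [x - y for x, y in zip(a, b)]
    n = len(diffs)
    bias = sum(diffs) / n                          # mean difference
    sd = math.sqrt(sum((d - bias) ** 2 for d in diffs) / (n - 1))
    return bias - 1.96 * sd, bias, bias + 1.96 * sd

predicted = [220.0, 235.0, 250.0, 240.0, 230.0]   # hypothetical %BW peak forces
measured  = [210.0, 240.0, 245.0, 235.0, 225.0]
lo, bias, hi = bland_altman_limits(predicted, measured)
print(round(bias, 2))   # mean bias = 4.0
```

If 95% of future paired differences can be expected to fall within (lo, hi), and that interval is clinically acceptable, the two methods are considered to agree.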

  9. A three-stage experimental strategy to evaluate and validate an interplate IC50 format.

    PubMed

    Rodrigues, Daniel J; Lyons, Richard; Laflin, Philip; Pointon, Wayne; Kammonen, Juha

    2007-12-01

    The serial dilution of compounds to establish potency against target enzymes or receptors can at times be a rate-limiting step in project progression. We have investigated the possibility of running 50% inhibitory concentration (IC50) experiments in an interplate format, with dose ranges constructed across plates. The advantages associated with this format include a faster reformatting time for the compounds while also increasing the number of doses that can potentially be generated. These two factors, in particular, would lend themselves to higher-throughput and more timely testing of compounds, while also maximizing the chances of capturing fully developed dose-response curves. The key objective of this work was to establish a strategy for assessing the feasibility of an interplate format and ensuring that the quality of the data generated would be equivalent to that of historical formats. A three-stage approach was adopted to assess and validate running an assay in an interplate format, compared to an intraplate format. Although the three-stage strategy was tested with two different assay formats, it would be necessary to investigate its feasibility for other assay types. The recommendation is that the three-stage experimental strategy defined here be used to assess the feasibility of other assay formats.
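Whether the doses sit on one plate or are spread across plates, an IC50 is read off the same dose-response relationship. A sketch assuming a four-parameter logistic response and a simple log-linear interpolation at the 50% level; all parameters and doses are illustrative.

```python
import math

def four_pl(conc, bottom=0.0, top=100.0, ic50=1.0, hill=1.0):
    """Four-parameter logistic: percent inhibition at a given concentration."""
    return bottom + (top - bottom) / (1.0 + (ic50 / conc) ** hill)

# A 10-point half-log dilution series (e.g. five doses on each of two plates):
doses = [10 ** (0.5 * i - 2) for i in range(10)]    # 0.01 to ~316 uM (assumed)
resp = [four_pl(c, ic50=3.16) for c in doses]       # noise-free responses

def ic50_interpolated(doses, resp, level=50.0):
    """Log-linear interpolation of the concentration at 50% inhibition."""
    for i in range(len(resp) - 1):
        if resp[i] <= level <= resp[i + 1]:
            f = (level - resp[i]) / (resp[i + 1] - resp[i])
            return 10 ** (math.log10(doses[i])
                          + f * (math.log10(doses[i + 1]) - math.log10(doses[i])))
    raise ValueError("50% level not bracketed by the dose range")

print(round(ic50_interpolated(doses, resp), 2))   # ≈ 3.16, the true IC50
```

In practice the full 4PL curve is fitted to noisy data rather than interpolated, but the interpolation shows why more doses across plates help: the 50% level is more likely to be tightly bracketed.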

  10. The Geant4 physics validation repository

    NASA Astrophysics Data System (ADS)

    Wenzel, H.; Yarba, J.; Dotti, A.

    2015-12-01

    The Geant4 collaboration regularly performs validation and regression tests. The results are stored in a central repository and can be easily accessed via a web application. In this article we describe the Geant4 physics validation repository which consists of a relational database storing experimental data and Geant4 test results, a java API and a web application. The functionality of these components and the technology choices we made are also described.

  11. The Geant4 physics validation repository

    DOE PAGES

    Wenzel, H.; Yarba, J.; Dotti, A.

    2015-12-23

    The Geant4 collaboration regularly performs validation and regression tests. The results are stored in a central repository and can be easily accessed via a web application. In this article we describe the Geant4 physics validation repository which consists of a relational database storing experimental data and Geant4 test results, a java API and a web application. Lastly, the functionality of these components and the technology choices we made are also described.
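A relational layout of the kind described, pairing experimental reference points with simulation results through a shared observable, can be sketched with an in-memory database. The schema, table names, and rows below are hypothetical illustrations, not the repository's actual design.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE observable (
    id INTEGER PRIMARY KEY,
    name TEXT NOT NULL              -- e.g. a cross section vs. energy
);
CREATE TABLE experiment_point (     -- reference measurements
    observable_id INTEGER REFERENCES observable(id),
    x REAL, value REAL, error REAL, reference TEXT
);
CREATE TABLE geant4_point (         -- simulation results, tagged by release
    observable_id INTEGER REFERENCES observable(id),
    geant4_version TEXT, physics_list TEXT, x REAL, value REAL
);
""")
conn.execute("INSERT INTO observable VALUES (1, 'demo observable')")
conn.execute("INSERT INTO experiment_point VALUES (1, 0.5, 10.2, 0.4, 'demo ref')")
conn.execute("INSERT INTO geant4_point VALUES (1, '10.7', 'FTFP_BERT', 0.5, 9.9)")

# Pair simulation with data for a validation/regression comparison plot:
row = conn.execute("""
    SELECT g.geant4_version, g.value, e.value, e.error
    FROM geant4_point g JOIN experiment_point e
      ON g.observable_id = e.observable_id AND g.x = e.x
""").fetchone()
print(row)
```

Keeping results keyed by release and physics list is what makes regression testing across Geant4 versions a simple query.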

  12. Validity of Highlighting on Text Comprehension

    NASA Astrophysics Data System (ADS)

    So, Joey C. Y.; Chan, Alan H. S.

    2009-10-01

    In this study, 38 university students were tested with a Chinese reading task on a light-emitting diode (LED) display under different task conditions to determine the effects of highlighting and its validity on comprehension performance. Four levels of validity (0%, 33%, 67% and 100%) and a control condition with no highlighting were tested. Each subject performed all five experimental conditions, reading and comprehending different passages in each. The results showed that the condition with 100% highlighting validity yielded better comprehension performance than the other validity levels and the no-highlighting condition. The comprehension score of the no-highlighting condition was lower than those of the highlighting conditions with distracters, though not significantly.

  13. Validating Experimental and Theoretical Langmuir Probe Analyses

    NASA Astrophysics Data System (ADS)

    Pilling, Lawrence Stuart; Carnegie, Dale

    2004-11-01

    Analysis of Langmuir probe characteristics contains a paradox in that it is unknown a priori which theory is applicable before it is applied. Often theories are assumed to be correct when certain criteria are met although they may not validate the approach used. We have analysed the Langmuir probe data from cylindrical double and single probes acquired from a DC discharge plasma over a wide variety of conditions. This discharge contains a dual temperature distribution and hence fitting a theoretically generated curve is impractical. To determine the densities an examination of the current theories was necessary. For the conditions where the probe radius is the same order of magnitude as the Debye length, the gradient expected for orbital motion limited (OML) is approximately the same as the radial motion gradients. An analysis of the gradients from the radial motion theory was able to resolve the differences from the OML gradient value of two. The method was also able to determine whether radial or OML theories applied without knowledge of the electron temperature. Only the position of the space charge potential is necessary to determine the applicable theory.

  14. Validating experimental and theoretical Langmuir probe analyses

    NASA Astrophysics Data System (ADS)

    Pilling, L. S.; Carnegie, D. A.

    2007-08-01

    Analysis of Langmuir probe characteristics contains a paradox in that it is unknown a priori which theory is applicable before it is applied. Often theories are assumed to be correct when certain criteria are met although they may not validate the approach used. We have analysed the Langmuir probe data from cylindrical double and single probes acquired from a dc discharge plasma over a wide variety of conditions. This discharge contains a dual-temperature distribution and hence fitting a theoretically generated curve is impractical. To determine the densities, an examination of the current theories was necessary. For the conditions where the probe radius is the same order of magnitude as the Debye length, the gradient expected for orbital-motion limited (OML) is approximately the same as the radial-motion gradients. An analysis of the 'gradients' from the radial-motion theory was able to resolve the differences from the OML gradient value of two. The method was also able to determine whether radial or OML theories applied without knowledge of the electron temperature, or separation of the ion and electron contributions. Only the value of the space potential is necessary to determine the applicable theory.
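    Both versions of this work hinge on comparing a measured log-log gradient against the OML value of two (in the orbital-motion-limited regime the ion current follows V ∝ I²). The fitting step can be sketched as below; the data are synthetic, generated to follow the OML law exactly, and the constant and current range are arbitrary rather than the papers' values:

    ```python
    import numpy as np

    # Hypothetical ion-current samples following an OML-like law V ∝ I**2,
    # so the fitted log-log gradient d(ln V)/d(ln I) should come out near 2.
    current = np.linspace(1e-6, 1e-5, 50)   # probe ion current [A]
    bias = 3.0e11 * current**2              # bias relative to space potential [V]

    # Least-squares slope in log-log space
    slope, intercept = np.polyfit(np.log(current), np.log(bias), 1)
    print(f"log-log gradient: {slope:.3f}")  # near 2 -> OML theory applies
    ```

    A measured gradient departing from two would, by the papers' argument, point toward the radial-motion theory instead.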

  15. Optimal control model predictions of system performance and attention allocation and their experimental validation in a display design study

    NASA Technical Reports Server (NTRS)

    Johannsen, G.; Govindaraj, T.

    1980-01-01

    The influence of different types of predictor displays in a longitudinal vertical takeoff and landing (VTOL) hover task is analyzed in a theoretical study. Several cases with differing amounts of predictive and rate information are compared. The optimal control model of the human operator is used to estimate human and system performance in terms of root-mean-square (rms) values and to compute optimized attention allocation. The only part of the model which is varied to predict these data is the observation matrix. Typical cases are selected for a subsequent experimental validation. The rms values as well as eye-movement data are recorded. The results agree favorably with those of the theoretical study in terms of relative differences. Better matching is achieved by revised model input data.

  16. Validation of NASA Thermal Ice Protection Computer Codes. Part 1; Program Overview

    NASA Technical Reports Server (NTRS)

    Miller, Dean; Bond, Thomas; Sheldon, David; Wright, William; Langhals, Tammy; Al-Khalil, Kamel; Broughton, Howard

    1996-01-01

    The Icing Technology Branch at NASA Lewis has been involved in an effort to validate two thermal ice protection codes developed at the NASA Lewis Research Center: LEWICE/Thermal (electrothermal de-icing and anti-icing) and ANTICE (hot-gas and electrothermal anti-icing). The Thermal Code Validation effort was designated as a priority during a 1994 'peer review' of the NASA Lewis icing program, and was implemented as a cooperative effort with industry. During April 1996, the first of a series of experimental validation tests was conducted in the NASA Lewis Icing Research Tunnel (IRT). The purpose of the April 1996 test was to validate the electrothermal predictive capabilities of both LEWICE/Thermal and ANTICE. A heavily instrumented test article was designed and fabricated for this test, with the capability of simulating electrothermal de-icing and anti-icing modes of operation. Thermal measurements were then obtained over a range of test conditions for comparison with analytical predictions. This paper presents an overview of the test, including a detailed description of: (1) the validation process; (2) test article design; (3) test matrix development; and (4) test procedures. Selected experimental results are presented for de-icing and anti-icing modes of operation. Finally, the status of the validation effort is summarized. Detailed comparisons between analytical predictions and experimental results are contained in the following two papers: 'Validation of NASA Thermal Ice Protection Computer Codes: Part 2 - The Validation of LEWICE/Thermal' and 'Validation of NASA Thermal Ice Protection Computer Codes: Part 3 - The Validation of ANTICE'.

  17. Experimental validation of A-mode ultrasound acquisition system for computer assisted orthopaedic surgery

    NASA Astrophysics Data System (ADS)

    De Lorenzo, Danilo; De Momi, Elena; Beretta, Elisa; Cerveri, Pietro; Perona, Franco; Ferrigno, Giancarlo

    2009-02-01

    Computer Assisted Orthopaedic Surgery (CAOS) systems improve the results and the standardization of surgical interventions. Detection of anatomical landmarks and bone surfaces is needed both to register the surgical space with the pre-operative imaging space and to compute biomechanical parameters for prosthesis alignment. Surface-point acquisition increases the invasiveness of the intervention and can be influenced by the interposed soft-tissue layer (7-15 mm localization errors). This study is aimed at evaluating the accuracy of a custom-made A-mode ultrasound (US) system for non-invasive detection of anatomical landmarks and surfaces. An A-mode solution eliminates the need for US image segmentation, offers real-time signal processing and requires less invasive equipment. The system consists of a single-transducer US probe that is optically tracked, a pulser/receiver, an FPGA-based board responsible for logic control command generation and for real-time signal processing, and three custom-made boards (signal acquisition, blanking and synchronization). We propose a new calibration method for the US system. The experimental validation was then performed by measuring the length of known-shape polymethylmethacrylate boxes filled with pure water and by acquiring bone surface points on a bovine bone phantom covered with soft-tissue-mimicking materials. Measurement errors were computed through MR and CT image acquisitions of the phantom. Point acquisition on the bone surface with the US system demonstrated lower errors (1.2 mm) than standard pointer acquisition (4.2 mm).
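    At its core, an A-mode measurement reduces to time-of-flight ranging: with the sound speed of the medium known, a detected echo delay converts directly to a depth. A minimal sketch, with an assumed water sound speed and a hypothetical echo delay (not values from this study):

    ```python
    # A-mode ultrasound ranging: an echo received after time t corresponds to a
    # reflector at depth v * t / 2 (the pulse travels out and back).
    SPEED_OF_SOUND_WATER = 1480.0   # m/s, assumed value for pure water

    def echo_depth(delay_s: float, v: float = SPEED_OF_SOUND_WATER) -> float:
        """Depth in metres of the reflector producing an echo after delay_s seconds."""
        return v * delay_s / 2.0

    # Hypothetical echo arriving 54 microseconds after the pulse
    print(f"depth = {echo_depth(54e-6) * 1000:.1f} mm")
    ```

    In the real system, the calibration step relates this depth to the optically tracked probe pose so that detected points land in the surgical coordinate frame.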

  18. Validation of lower body negative pressure as an experimental model of hemorrhage

    PubMed Central

    Shade, Robert E.; Muniz, Gary W.; Bauer, Cassondra; Goei, Kathleen A.; Pidcoke, Heather F.; Chung, Kevin K.; Cap, Andrew P.; Convertino, Victor A.

    2013-01-01

    Lower body negative pressure (LBNP), a model of hemorrhage (Hem), shifts blood to the legs and elicits central hypovolemia. This study compared responses to LBNP and actual Hem in sedated baboons. Arterial pressure, pulse pressure (PP), central venous pressure (CVP), heart rate, stroke volume (SV), and +dP/dt were measured. Hem steps were 6.25%, 12.5%, 18.75%, and 25% of total estimated blood volume. Shed blood was returned, and 4 wk after Hem, the same animals were subjected to four LBNP levels which elicited equivalent changes in PP and CVP observed during Hem. Blood gases, hematocrit (Hct), hemoglobin (Hb), plasma renin activity (PRA), vasopressin (AVP), epinephrine (EPI), and norepinephrine (NE) were measured at baseline and maximum Hem or LBNP. LBNP levels matched with 6.25%, 12.5%, 18.75%, and 25% hemorrhage were −22 ± 6, −41 ± 7, −54 ± 10, and −71 ± 7 mmHg, respectively (mean ± SD). Hemodynamic responses to Hem and LBNP were similar. SV decreased linearly such that 25% Hem and matching LBNP caused a 50% reduction in SV. Hem caused a decrease in Hct, Hb, and central venous oxygen saturation (ScvO2). In contrast, LBNP increased Hct and Hb, while ScvO2 remained unchanged. Hem caused greater elevations in AVP and NE than LBNP, while PRA, EPI, and other hematologic indexes did not differ between studies. These results indicate that while LBNP does not elicit the same effect on blood cell loss as Hem, LBNP mimics the integrative cardiovascular response to Hem, and validates the use of LBNP as an experimental model of central hypovolemia associated with Hem. PMID:24356525

  19. Fault-tolerant clock synchronization validation methodology. [in computer systems

    NASA Technical Reports Server (NTRS)

    Butler, Ricky W.; Palumbo, Daniel L.; Johnson, Sally C.

    1987-01-01

    A validation method for the synchronization subsystem of a fault-tolerant computer system is presented. The high reliability requirement of flight-crucial systems precludes the use of most traditional validation methods. The method presented utilizes formal design proof to uncover design and coding errors and experimentation to validate the assumptions of the design proof. The experimental method is described and illustrated by validating the clock synchronization system of the Software Implemented Fault Tolerance computer. The design proof of the algorithm includes a theorem that defines the maximum skew between any two nonfaulty clocks in the system in terms of specific system parameters. Most of these parameters are deterministic. One crucial parameter is the upper bound on the clock read error, which is stochastic. The probability that this upper bound is exceeded is calculated from data obtained by the measurement of system parameters. This probability is then included in a detailed reliability analysis of the system.
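    The stochastic step described above, estimating the probability that the clock-read error exceeds the bound assumed in the design proof, amounts to an empirical tail estimate over measured samples. The error distribution and bound below are assumed for illustration, not taken from the SIFT measurements:

    ```python
    import random

    # Hypothetical measured clock-read errors (microseconds); in the actual study
    # these come from instrumenting the synchronization subsystem.
    random.seed(0)
    read_errors = [abs(random.gauss(0.0, 5.0)) for _ in range(10_000)]

    READ_ERROR_BOUND = 15.0  # assumed upper bound used by the design proof

    # Empirical probability that the stochastic read error exceeds the bound
    p_exceed = sum(e > READ_ERROR_BOUND for e in read_errors) / len(read_errors)
    print(f"P(read error > bound) ~ {p_exceed:.4f}")
    ```

    This exceedance probability is the quantity that then feeds the detailed reliability analysis of the system.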

  20. Considering RNAi experimental design in parasitic helminths.

    PubMed

    Dalzell, Johnathan J; Warnock, Neil D; McVeigh, Paul; Marks, Nikki J; Mousley, Angela; Atkinson, Louise; Maule, Aaron G

    2012-04-01

    Almost a decade has passed since the first report of RNA interference (RNAi) in a parasitic helminth. Whilst much progress has been made with RNAi informing gene function studies in disparate nematode and flatworm parasites, substantial and seemingly prohibitive difficulties have been encountered in some species, hindering progress. An appraisal of current practices, trends and ideals of RNAi experimental design in parasitic helminths is both timely and necessary for a number of reasons: firstly, the increasing availability of parasitic helminth genome/transcriptome resources means there is a growing need for gene function tools such as RNAi; secondly, fundamental differences and unique challenges exist for parasite species which do not apply to model organisms; thirdly, the inherent variation in experimental design, and reported difficulties with reproducibility undermine confidence. Ideally, RNAi studies of gene function should adopt standardised experimental design to aid reproducibility, interpretation and comparative analyses. Although the huge variations in parasite biology and experimental endpoints make RNAi experimental design standardization difficult or impractical, we must strive to validate RNAi experimentation in helminth parasites. To aid this process we identify multiple approaches to RNAi experimental validation and highlight those which we deem to be critical for gene function studies in helminth parasites.

  1. Experimental Study of Impinging Jets Flow-Fields

    DTIC Science & Technology

    2016-07-27

    1 Grant # N000141410830 Experimental Study of Impinging Jet Flow-Fields Final Report for Period: Jun 15, 2014 - Jun 14, 2016 PI: Dennis K...impinging jet model in the absence of any jet heating. The results of the computations had been compared with the experimental data produced in the...of the validity of the computations, and also of the experimental approach.

  2. A PFC3D-based numerical simulation of cutting load for lunar rock simulant and experimental validation

    NASA Astrophysics Data System (ADS)

    Li, Peng; Jiang, Shengyuan; Tang, Dewei; Xu, Bo

    2017-05-01

    To strike a balance between the need for drilling efficiency and the constraints of the power budget on the moon, the penetration per revolution of the drill bit is generally limited to around 0.1 mm, and the geometric angles of the cutting blade need to be well designed. This paper introduces a simulation approach based on PFC3D (particle flow code in 3 dimensions) for analyzing the cutting load on a lunar rock simulant produced by blades of different geometric angles at a small cutting depth. The mean values of the cutting force of five blades in the survey region (four on the boundary points and one on the center point) are selected as the macroscopic responses of the model. An experimental design method comprising Plackett-Burman (PB) screening and the central composite design (CCD) method is adopted in the matching procedure for the microparameters of the PFC model, and the optimum set of microparameters is acquired by enumeration. Experimental validation is then carried out using another twenty-five blades with different geometric angles, and the simulations and laboratory tests show fair agreement. Additionally, the rock-breaking processes of the different blades are quantified from the simulation analysis. This research provides theoretical support for refining the prediction of rock cutting loads and the geometric design of the cutting blades on the drill bit.
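    The enumeration step of the microparameter-matching procedure can be sketched as follows. The surrogate "simulation", the measured force values and the microparameter grid are all illustrative stand-ins; in the paper each candidate set would be evaluated by a full PFC3D run:

    ```python
    import itertools

    # Measured mean cutting forces at the five survey blades (N) -- illustrative
    # values, not the paper's data.
    measured = [52.0, 61.0, 48.0, 57.0, 55.0]

    def simulate_cutting_forces(stiffness, bond_strength, friction):
        """Stand-in for a PFC3D run: a cheap surrogate mapping microparameters
        to the five macroscopic force responses."""
        base = 0.5 * stiffness + 40.0 * bond_strength + 10.0 * friction
        return [base * f for f in (0.95, 1.10, 0.88, 1.04, 1.00)]

    # Enumerate the candidate microparameter grid and keep the best match
    grid = itertools.product([80.0, 100.0, 120.0],   # contact stiffness
                             [0.2, 0.3, 0.4],        # parallel-bond strength
                             [0.3, 0.5, 0.6])        # friction coefficient
    best = min(grid, key=lambda p: sum((s - m) ** 2 for s, m in
                                       zip(simulate_cutting_forces(*p), measured)))
    print("best microparameter set:", best)
    ```

    The PB screening and CCD steps in the paper serve to shrink this grid to the influential microparameters before the enumeration is run.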

  3. Software aspects of the Geant4 validation repository

    NASA Astrophysics Data System (ADS)

    Dotti, Andrea; Wenzel, Hans; Elvira, Daniel; Genser, Krzysztof; Yarba, Julia; Carminati, Federico; Folger, Gunter; Konstantinov, Dmitri; Pokorski, Witold; Ribon, Alberto

    2017-10-01

    The Geant4, GeantV and GENIE collaborations regularly perform validation and regression tests for simulation results. DoSSiER (Database of Scientific Simulation and Experimental Results) is being developed as a central repository to store the simulation results as well as the experimental data used for validation. DoSSiER is easily accessible via a web application. In addition, a web service allows for programmatic access to the repository to extract records in JSON or XML exchange formats. In this article, we describe the functionality and the current status of various components of DoSSiER as well as the technology choices we made.
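    A sketch of what consuming a JSON exchange record might look like on the client side. The field names and values below are hypothetical; the actual DoSSiER record schema and endpoints are defined by the web service itself:

    ```python
    import json

    # A hypothetical DoSSiER-style exchange record (illustrative schema only)
    record_json = """
    {
      "testName": "pion-production",
      "beam": "proton",
      "target": "Ta",
      "datatable": {"energy_GeV": [1.0, 3.0, 8.0],
                    "cross_section_mb": [105.2, 98.7, 91.4]}
    }
    """

    record = json.loads(record_json)
    points = list(zip(record["datatable"]["energy_GeV"],
                      record["datatable"]["cross_section_mb"]))
    print(f'{record["beam"]} on {record["target"]}: {len(points)} data points')
    ```

    The same record fetched in the XML exchange format would carry identical content, only the wire format differs.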

  4. Software Aspects of the Geant4 Validation Repository

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dotti, Andrea; Wenzel, Hans; Elvira, Daniel

    2016-01-01

    The Geant4, GeantV and GENIE collaborations regularly perform validation and regression tests for simulation results. DoSSiER (Database of Scientific Simulation and Experimental Results) is being developed as a central repository to store the simulation results as well as the experimental data used for validation. DoSSiER is easily accessible via a web application. In addition, a web service allows for programmatic access to the repository to extract records in JSON or XML exchange formats. In this article, we describe the functionality and the current status of various components of DoSSiER as well as the technology choices we made.

  5. Collapse of a Liquid Column: Numerical Simulation and Experimental Validation

    NASA Astrophysics Data System (ADS)

    Cruchaga, Marcela A.; Celentano, Diego J.; Tezduyar, Tayfun E.

    2007-03-01

    This paper is focused on the numerical and experimental analyses of the collapse of a liquid column. The measurements of the interface position in a set of experiments carried out with shampoo and water for two different initial column aspect ratios are presented together with the corresponding numerical predictions. The experimental procedure was found to provide acceptable recurrence in the observation of the interface evolution. Basic models describing some of the relevant physical aspects, e.g. wall friction and turbulence, are included in the simulations. Numerical experiments are conducted to evaluate the influence of the parameters involved in the modeling by comparing the results with the data from the measurements. The numerical predictions reasonably describe the physical trends.

  6. Brazilian Center for the Validation of Alternative Methods (BraCVAM) and the process of validation in Brazil.

    PubMed

    Presgrave, Octavio; Moura, Wlamir; Caldeira, Cristiane; Pereira, Elisabete; Bôas, Maria H Villas; Eskes, Chantra

    2016-03-01

    The need for the creation of a Brazilian centre for the validation of alternative methods was recognised in 2008, and members of academia, industry and existing international validation centres immediately engaged with the idea. In 2012, co-operation between the Oswaldo Cruz Foundation (FIOCRUZ) and the Brazilian Health Surveillance Agency (ANVISA) instigated the establishment of the Brazilian Center for the Validation of Alternative Methods (BraCVAM), which was officially launched in 2013. The Brazilian validation process follows OECD Guidance Document No. 34, where BraCVAM functions as the focal point to identify and/or receive requests from parties interested in submitting tests for validation. BraCVAM then informs the Brazilian National Network on Alternative Methods (RENaMA) of promising assays, which helps with prioritisation and contributes to the validation studies of selected assays. A Validation Management Group supervises the validation study, and the results obtained are peer-reviewed by an ad hoc Scientific Review Committee, organised under the auspices of BraCVAM. Based on the peer-review outcome, BraCVAM will prepare recommendations on the validated test method, which will be sent to the National Council for the Control of Animal Experimentation (CONCEA). CONCEA is in charge of the regulatory adoption of all validated test methods in Brazil, following an open public consultation. 2016 FRAME.

  7. Model based multivariable controller for large scale compression stations. Design and experimental validation on the LHC 18KW cryorefrigerator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bonne, François; Bonnay, Patrick; Alamir, Mazen

    2014-01-29

    In this paper, a multivariable model-based non-linear controller for Warm Compression Stations (WCS) is proposed. The strategy is to replace all the PID loops controlling the WCS with an optimally designed model-based multivariable loop. This new strategy leads to high stability and fast disturbance rejection such as those induced by a turbine or a compressor stop, a key-aspect in the case of large scale cryogenic refrigeration. The proposed control scheme can be used to have precise control of every pressure in normal operation or to stabilize and control the cryoplant under high variation of thermal loads (such as a pulsed heat load expected to take place in future fusion reactors such as those expected in the cryogenic cooling systems of the International Thermonuclear Experimental Reactor ITER or the Japan Torus-60 Super Advanced fusion experiment JT-60SA). The paper details how to set the WCS model up to synthesize the Linear Quadratic Optimal feedback gain and how to use it. After preliminary tuning at CEA-Grenoble on the 400W@1.8K helium test facility, the controller has been implemented on a Schneider PLC and fully tested first on the CERN's real-time simulator. Then, it was experimentally validated on a real CERN cryoplant. The efficiency of the solution is experimentally assessed using a reasonable operating scenario of start and stop of compressors and cryogenic turbines. This work is partially supported through the European Fusion Development Agreement (EFDA) Goal Oriented Training Program, task agreement WP10-GOT-GIRO.

  8. Model based multivariable controller for large scale compression stations. Design and experimental validation on the LHC 18KW cryorefrigerator

    NASA Astrophysics Data System (ADS)

    Bonne, François; Alamir, Mazen; Bonnay, Patrick; Bradu, Benjamin

    2014-01-01

    In this paper, a multivariable model-based non-linear controller for Warm Compression Stations (WCS) is proposed. The strategy is to replace all the PID loops controlling the WCS with an optimally designed model-based multivariable loop. This new strategy leads to high stability and fast disturbance rejection such as those induced by a turbine or a compressor stop, a key-aspect in the case of large scale cryogenic refrigeration. The proposed control scheme can be used to have precise control of every pressure in normal operation or to stabilize and control the cryoplant under high variation of thermal loads (such as a pulsed heat load expected to take place in future fusion reactors such as those expected in the cryogenic cooling systems of the International Thermonuclear Experimental Reactor ITER or the Japan Torus-60 Super Advanced fusion experiment JT-60SA). The paper details how to set the WCS model up to synthesize the Linear Quadratic Optimal feedback gain and how to use it. After preliminary tuning at CEA-Grenoble on the 400W@1.8K helium test facility, the controller has been implemented on a Schneider PLC and fully tested first on the CERN's real-time simulator. Then, it was experimentally validated on a real CERN cryoplant. The efficiency of the solution is experimentally assessed using a reasonable operating scenario of start and stop of compressors and cryogenic turbines. This work is partially supported through the European Fusion Development Agreement (EFDA) Goal Oriented Training Program, task agreement WP10-GOT-GIRO.
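    The Linear Quadratic Optimal feedback gain mentioned in the abstract can be synthesized by iterating the discrete-time Riccati recursion to a fixed point. The sketch below uses a toy two-state system with illustrative matrices, not the CERN plant model:

    ```python
    import numpy as np

    # Toy two-state surrogate of a pressure-dynamics model (illustrative only)
    A = np.array([[0.98, 0.05],
                  [0.00, 0.90]])
    B = np.array([[0.0],
                  [0.1]])
    Q = np.eye(2)          # state weighting
    R = np.array([[1.0]])  # input weighting

    # Backward Riccati iteration: K = (R + B'PB)^-1 B'PA, P = Q + A'PA - A'PBK
    P = Q.copy()
    for _ in range(500):
        K = np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)
        P = Q + A.T @ P @ A - A.T @ P @ B @ K

    # The closed loop x+ = (A - B K) x should be stable (all |eig| < 1)
    eig = np.linalg.eigvals(A - B @ K)
    print("feedback gain K:", K.ravel(), "closed-loop |eig|:", np.abs(eig))
    ```

    The multivariable gain replaces the bank of independent PID loops with a single coordinated feedback law, which is what yields the fast disturbance rejection described.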

  9. A Systematic Method for Verification and Validation of Gyrokinetic Microstability Codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bravenec, Ronald

    My original proposal for the period Feb. 15, 2014 through Feb. 14, 2017 called for an integrated validation and verification effort carried out by myself with collaborators. The validation component would require experimental profile and power-balance analysis. In addition, it would require running the gyrokinetic codes varying the input profiles within experimental uncertainties to seek agreement with experiment before discounting a code as invalidated. Therefore, validation would require a major increase of effort over my previous grant periods which covered only code verification (code benchmarking). Consequently, I had requested full-time funding. Instead, I am being funded at somewhat less than half time (5 calendar months per year). As a consequence, I decided to forego the validation component and to only continue the verification efforts.

  10. Squeaking friction phenomena in ceramic hip endoprosthesis: Modeling and experimental validation

    NASA Astrophysics Data System (ADS)

    Ouenzerfi, G.; Massi, F.; Renault, E.; Berthier, Y.

    2015-06-01

    The modern evolution of ceramic bearing surfaces for total hip arthroplasty has allowed longer implant longevity with lower amounts of osteolysis. It has been applied to younger patients expecting improved survivorship compared with traditional bearing surfaces. However, the phenomenon of an audible squeaking produced by implants during daily activities is reported as an annoying complication for patients. Although recent studies have been carried out on this topic, the origin of squeaking and the factors leading to this phenomenon are not completely identified. Numerical analyses are still not able to reproduce precisely the in vitro and in vivo observations, and this lack of understanding of the physics of the issue remains an obstacle to finding appropriate solutions to prevent it. In this paper, numerical and experimental approaches to reproduce squeaking are presented. A pre-stressed modal analysis is performed to identify the unstable eigenfrequencies that cause the vibrations and the perceived acoustic emission. The numerical results are validated by experiments on a laboratory test bench, and the predicted frequencies are compared to the squeaking frequencies found both in vitro and in vivo. The natural frequencies related to the femoral components are closest to the observed squeaking frequency. Simulation results confirmed that these vibrations are related to the stem dynamic response, which has a strong influence on the squeaking characteristics. On the other hand, the cup and the ceramic components play a main indirect role, providing the frictional pair between the head and the liner. The analysis suggests that one possible mechanism at the origin of squeaking is the coupling of two modes of vibration of the stem under frictional contact. The numerical model will allow the dominant factors and parameters affecting squeaking to be identified, in order to avoid the unstable mode coupling.
Squeaking can be reduced clinically by

  11. A recursive Bayesian approach for fatigue damage prognosis: An experimental validation at the reliability component level

    NASA Astrophysics Data System (ADS)

    Gobbato, Maurizio; Kosmatka, John B.; Conte, Joel P.

    2014-04-01

    Fatigue-induced damage is one of the most uncertain and highly unpredictable failure mechanisms for a large variety of mechanical and structural systems subjected to cyclic and random loads during their service life. A health monitoring system capable of (i) monitoring the critical components of these systems through non-destructive evaluation (NDE) techniques, (ii) assessing their structural integrity, (iii) recursively predicting their remaining fatigue life (RFL), and (iv) providing a cost-efficient reliability-based inspection and maintenance plan (RBIM) is therefore ultimately needed. In contribution to these objectives, the first part of the paper provides an overview and extension of a comprehensive reliability-based fatigue damage prognosis methodology — previously developed by the authors — for recursively predicting and updating the RFL of critical structural components and/or sub-components in aerospace structures. In the second part of the paper, a set of experimental fatigue test data, available in the literature, is used to provide a numerical verification and an experimental validation of the proposed framework at the reliability component level (i.e., single damage mechanism evolving at a single damage location). The results obtained from this study demonstrate (i) the importance and the benefits of a nearly continuous NDE monitoring system, (ii) the efficiency of the recursive Bayesian updating scheme, and (iii) the robustness of the proposed framework in recursively updating and improving the RFL estimations. This study also demonstrates that the proposed methodology can lead either to an extension of the RFL (with a consequent economic gain without compromising the minimum safety requirements) or to an increase of safety by detecting a premature fault and therefore avoiding a very costly catastrophic failure.
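    The recursive Bayesian updating scheme can be illustrated with a one-parameter grid version: a prior over a damage-growth rate is sequentially updated with NDE crack-size measurements. The linear growth model, noise level and grid below are assumed for illustration and are not the paper's fatigue model:

    ```python
    import math

    # Candidate damage-growth rates (mm per inspection block of 100 cycles) and
    # a flat prior over them -- all values illustrative.
    rates = [i * 0.001 for i in range(1, 101)]
    posterior = [1.0 / len(rates)] * len(rates)

    TRUE_RATE, SIGMA = 0.05, 0.5   # assumed truth and NDE measurement noise [mm]
    a0 = 1.0                       # initial crack size [mm]

    def likelihood(measured, predicted):
        # Gaussian measurement model (unnormalized)
        return math.exp(-0.5 * ((measured - predicted) / SIGMA) ** 2)

    # Sequentially assimilate measurements at inspection blocks k = 1..10
    for k in range(1, 11):
        measured = a0 + TRUE_RATE * 100 * k   # noise-free here for brevity
        weights = [p * likelihood(measured, a0 + r * 100 * k)
                   for p, r in zip(posterior, rates)]
        total = sum(weights)
        posterior = [w / total for w in weights]

    best = rates[posterior.index(max(posterior))]
    print(f"MAP growth-rate estimate after 10 inspections: {best:.3f}")
    ```

    In the full framework the posterior over damage parameters is then propagated forward to obtain the updated RFL distribution after each inspection.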

  12. Experimental and theoretical investigations on the validity of the geometrical optics model for calculating the stability of optical traps.

    PubMed

    Schut, T C; Hesselink, G; de Grooth, B G; Greve, J

    1991-01-01

    We have developed a computer program based on the geometrical optics approach proposed by Roosen to calculate the forces on dielectric spheres in focused laser beams. We have explicitly taken into account the polarization of the laser light and the divergence of the laser beam. The model can be used to evaluate the stability of optical traps in a variety of different optical configurations. Our calculations explain the experimental observation by Ashkin that a stable single-beam optical trap, without the help of the gravitation force, can be obtained with a strongly divergent laser beam. Our calculations also predict a different trap stability in the directions orthogonal and parallel to the polarization direction of the incident light. Different experimental methods were used to test the predictions of the model for the gravity trap. A new method for measuring the radiation force along the beam axis in both the stable and unstable regions is presented. Measurements of the radiation force on polystyrene spheres with diameters of 7.5 and 32 microns in a TEM00-mode laser beam showed a good qualitative correlation with the predictions and a slight quantitative difference. The validity of the geometrical approximations involved in the model is discussed for spheres of different sizes and refractive indices.

  13. Identification and experimental validation of damping ratios of different human body segments through anthropometric vibratory model in standing posture.

    PubMed

    Gupta, T C

    2007-08-01

    A 15 degrees of freedom lumped parameter vibratory model of the human body is developed, for vertical mode vibrations, using anthropometric data of the 50th percentile US male. The mass and stiffness of the various segments are determined from the elastic moduli of bones and tissues and from the anthropometric data available, assuming the shape of all the segments is ellipsoidal. The damping ratio of each segment is estimated on the basis of the physical structure of the body in a particular posture. Damping constants of the various segments are calculated from these damping ratios. The human body is modeled as a linear spring-mass-damper system. The optimal values of the damping ratios of the body segments are estimated, for the 15 degrees of freedom model of the 50th percentile US male, by comparing the response of the model with the experimental response. The modeling procedure is validated by formulating a similar vibratory model of the 50th percentile Indian male and comparing the frequency response of the model with the experimental response of the same group of subjects. A range of damping ratios has been considered to develop a vibratory model which can predict the vertical harmonic response of the human body.
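    The step of computing damping constants from damping ratios follows the standard single-degree-of-freedom relation c = 2ζ√(km). A minimal sketch with illustrative (not anthropometric) segment values:

    ```python
    import math

    # Per-segment damping constants from damping ratios for a lumped
    # spring-mass-damper chain: c_i = 2 * zeta_i * sqrt(k_i * m_i).
    # Segment masses, stiffnesses and ratios below are illustrative only.
    segments = [
        {"name": "pelvis", "m": 12.0, "k": 2.5e5, "zeta": 0.30},
        {"name": "torso",  "m": 28.0, "k": 1.8e5, "zeta": 0.25},
        {"name": "head",   "m": 5.5,  "k": 1.2e5, "zeta": 0.20},
    ]

    for s in segments:
        s["c"] = 2.0 * s["zeta"] * math.sqrt(s["k"] * s["m"])   # N*s/m
        s["fn"] = math.sqrt(s["k"] / s["m"]) / (2.0 * math.pi)  # undamped natural freq [Hz]
        print(f'{s["name"]:6s} c = {s["c"]:8.1f} N*s/m, fn = {s["fn"]:5.1f} Hz')
    ```

    Assembled into mass, damping and stiffness matrices, these constants determine the model's frequency response that is compared against the experimental data.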

  14. Development of a Conservative Model Validation Approach for Reliable Analysis

    DTIC Science & Technology

    2015-01-01

    CIE 2015 August 2-5, 2015, Boston, Massachusetts, USA [DRAFT] DETC2015-46982 DEVELOPMENT OF A CONSERVATIVE MODEL VALIDATION APPROACH FOR RELIABLE...obtain a conservative simulation model for reliable design even with limited experimental data. Very little research has taken into account the...3, the proposed conservative model validation is briefly compared to the conventional model validation approach. Section 4 describes how to account

  15. Partition method and experimental validation for impact dynamics of flexible multibody system

    NASA Astrophysics Data System (ADS)

    Wang, J. Y.; Liu, Z. Y.; Hong, J. Z.

    2018-06-01

    The impact problem of a flexible multibody system is a non-smooth, high-transient, and strong-nonlinear dynamic process with variable boundary. How to model the contact/impact process accurately and efficiently is one of the main difficulties in many engineering applications. The numerical approaches being used widely in impact analysis are mainly from two fields: multibody system dynamics (MBS) and computational solid mechanics (CSM). Approaches based on MBS provide a more efficient yet less accurate analysis of the contact/impact problems, while approaches based on CSM are well suited for particularly high accuracy needs, yet require very high computational effort. To bridge the gap between accuracy and efficiency in the dynamic simulation of a flexible multibody system with contacts/impacts, a partition method is presented considering that the contact body is divided into two parts, an impact region and a non-impact region. The impact region is modeled using the finite element method to guarantee the local accuracy, while the non-impact region is modeled using the modal reduction approach to raise the global efficiency. A three-dimensional rod-plate impact experiment is designed and performed to validate the numerical results. The principle for how to partition the contact bodies is proposed: the maximum radius of the impact region can be estimated by an analytical method, and the modal truncation orders of the non-impact region can be estimated by the highest frequency of the signal measured. The simulation results using the presented method are in good agreement with the experimental results. It shows that this method is an effective formulation considering both accuracy and efficiency. Moreover, a more complicated multibody impact problem of a crank slider mechanism is investigated to strengthen this conclusion.

  16. Design and validation of instruments to measure knowledge.

    PubMed

    Elliott, T E; Regal, R R; Elliott, B A; Renier, C M

    2001-01-01

Measuring health care providers' learning after they have participated in educational interventions that use experimental designs requires valid, reliable, and practical instruments. A literature review was conducted. In addition, experience gained from designing and validating instruments for measuring the effect of an educational intervention informed this process. The eight main steps for designing, validating, and testing the reliability of instruments for measuring learning outcomes are presented. The key considerations and rationale for this process are discussed. Methods for critiquing and adapting existing instruments and creating new ones are offered. This study may help other investigators develop valid, reliable, and practical instruments for measuring the outcomes of educational activities.

  17. Field validation of experimental challenge models for IPN vaccines.

    PubMed

    Ramstad, A; Romstad, A B; Knappskog, D H; Midtlyng, P J

    2007-12-01

Atlantic salmon S1/2 pre-smolts from the VESO Vikan hatchery were assigned to study groups and i.p. immunized with commercially available, multivalent oil-adjuvanted vaccines with (Norvax Compact 6 - NC-6) or without (Norvax Compact 4 - NC-4) recombinant infectious pancreatic necrosis virus (IPNV) antigen. A control group received saline solution. When ready for sea, the fish were transported to the VESO Vikan experimental laboratory, where two identical tanks were stocked with 75 fish per group before being transferred to 10 degrees C sea water and exposed by bath to first-passage IPNV grown in CHSE-214 cells. A third tank containing 40 fish from each group was challenged by the introduction of 116 fish that had received an i.p. injection of IPNV challenge material. The remaining vaccinated fish were transported to the VESO Vikan marine field trial site and placed in two identical pens, each containing approximately 53 000 fish from the NC-6 group and 9000 fish from the NC-4 group. In the experimental bath challenge trial, the cumulative mortality was 75% and 78% in the control groups, and the relative percentage survival (RPS) of the NC-6-immunized fish vs. the reference vaccine groups was 60% and 82%, respectively. In the cohabitation challenge, the control mortality reached 74% and the IPNV-specific vaccine RPS was 72%. In both models, the reference vaccine lacking IPNV antigen gave a moderate but statistically significant non-specific protection. In the field, a natural outbreak of infectious pancreatic necrosis (IPN) occurred after 7 weeks, lasting approximately 3.5 months before problems due to winter ulcers became dominant. During this outbreak, mortality in the NC-4 groups was 33.5% and 31.6%, respectively, whereas mortality in the NC-6 groups was 6.9% and 5.3%, respectively, amounting to 81% IPNV-specific protection. In conclusion, the IPN protection estimates obtained by experimental challenges were consistent between tanks, and were confirmed by
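The relative percentage survival values quoted above follow the standard RPS formula, RPS = (1 - mortality of vaccinated fish / mortality of controls) x 100. A minimal check against the field numbers in the abstract:

```python
def relative_percent_survival(mortality_vaccinated_pct, mortality_control_pct):
    """RPS = (1 - Mv / Mc) * 100, with both mortalities in percent."""
    return (1.0 - mortality_vaccinated_pct / mortality_control_pct) * 100.0

# Field outbreak above: the NC-6 pens averaged (6.9 + 5.3)/2 = 6.1% mortality
# versus (33.5 + 31.6)/2 = 32.55% in the NC-4 pens, reproducing the ~81%
# IPNV-specific protection quoted in the abstract.
rps_field = relative_percent_survival(6.1, 32.55)
print(round(rps_field))   # -> 81
```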

  18. A theoretical model of the application of RF energy to the airway wall and its experimental validation.

    PubMed

    Jarrard, Jerry; Wizeman, Bill; Brown, Robert H; Mitzner, Wayne

    2010-11-27

Bronchial thermoplasty is a novel technique designed to reduce an airway's ability to contract by reducing the amount of airway smooth muscle through controlled heating of the airway wall. This method has been examined in animal models and as a treatment for asthma in human subjects. At the present time, little research has been published about how radiofrequency (RF) energy and heat are transferred to the airways of the lung during bronchial thermoplasty procedures. In this manuscript we describe a computational, theoretical model of the delivery of RF energy to the airway wall. An electro-thermal finite-element-analysis model was designed to simulate the delivery of temperature-controlled RF energy to airway walls of the in vivo lung. The model includes predictions of heat generation due to RF Joule heating and transfer of heat within an airway wall due to thermal conduction. To implement the model, we use known physical characteristics and dimensions of the airway and lung tissues. The model predictions were tested with measurements of temperature, impedance, energy, and power in an experimental canine model. Model predictions of electrode temperature, voltage, and current, along with tissue impedance and delivered energy, were compared to experimental measurements and were within ±5% of experimental averages taken over 157 sample activations. The experimental results show remarkable agreement with the model predictions, and thus validate the use of this model to predict the heat generation and transfer within the airway wall during bronchial thermoplasty. The model also demonstrated the importance of evaporation as a loss term that affected both electrical measurements and heat distribution. The model predictions showed excellent agreement with the empirical results, and thus support using the model to develop the next generation of devices for bronchial thermoplasty.
Our results suggest that comparing model results to RF generator electrical measurements
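The electro-thermal coupling described above (Joule heating plus conduction) can be sketched with a toy 1-D explicit finite-difference model. This is purely illustrative of the physics, not the paper's FEA: the tissue properties, field strength, and geometry are nominal assumed values, and evaporation losses are ignored.

```python
import numpy as np

k, rho, cp = 0.5, 1050.0, 3600.0      # W/m/K, kg/m^3, J/kg/K (assumed tissue values)
sigma, E = 0.3, 2.0e3                 # S/m, V/m (assumed field near the electrode)
alpha = k / (rho * cp)                # thermal diffusivity

nx, dx = 50, 1e-4                     # 5 mm slab of airway wall
dt = 0.4 * dx**2 / alpha              # stable explicit time step
T = np.full(nx, 37.0)                 # start at body temperature, deg C

q = np.zeros(nx)
q[:5] = sigma * E**2                  # volumetric Joule source near the electrode

for _ in range(2000):                 # ~60 s of heating
    lap = np.zeros(nx)
    lap[1:-1] = (T[2:] - 2*T[1:-1] + T[:-2]) / dx**2
    T[1:-1] += dt * (alpha * lap[1:-1] + q[1:-1] / (rho * cp))
    T[0], T[-1] = T[1], 37.0          # insulated inner face, body-temp far side

print(round(float(T.max()), 1))       # peak wall temperature, deg C
```

The peak sits near the heated zone and decays into the wall by conduction, which is the qualitative behavior the validated model quantifies.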

  19. CFD Validation Experiment of a Mach 2.5 Axisymmetric Shock-Wave/Boundary-Layer Interaction

    NASA Technical Reports Server (NTRS)

    Davis, David O.

    2015-01-01

Experimental investigations of specific flow phenomena, e.g., Shock-Wave/Boundary-Layer Interactions (SWBLI), provide great insight into the flow behavior but often lack the details necessary to be useful as CFD validation experiments. Reasons include: (1) undefined boundary conditions and inconsistent results; (2) undocumented three-dimensional effects (centerline-only measurements); and (3) lack of uncertainty analysis. While there are a number of good subsonic experimental investigations that are sufficiently documented to be considered test cases for CFD and turbulence-model validation, the number of supersonic and hypersonic cases is much smaller. This was highlighted by Settles and Dodson's [1] comprehensive review of available supersonic and hypersonic experimental studies. In all, several hundred studies were considered for their database. Of these, over a hundred were subjected to rigorous acceptance criteria. Based on their criteria, only 19 (12 supersonic, 7 hypersonic) were considered of sufficient quality to be used for validation purposes. Aeschliman and Oberkampf [2] recognized the need to develop a specific methodology for experimental studies intended specifically for validation purposes.

  20. Free-space optical channel characterization and experimental validation in a coastal environment.

    PubMed

    Alheadary, Wael G; Park, Ki-Hong; Alfaraj, Nasir; Guo, Yujian; Stegenburgs, Edgars; Ng, Tien Khee; Ooi, Boon S; Alouini, Mohamed-Slim

    2018-03-19

Over the years, free-space optical (FSO) communication has attracted considerable research interest owing to its high transmission rates via unbounded and unlicensed bandwidth. Nevertheless, various weather conditions lead to significant deterioration of FSO link capabilities. In this context, we report on the modelling of the channel attenuation coefficient (β) for a coastal environment and related ambient conditions, considering the effect of coastal air temperature (T), relative humidity (RH), and dew point (TD), by employing a mobile FSO communication system capable of achieving a transmission rate of 1 Gbps at an outdoor distance of 70 m for optical beam wavelengths of 1310 nm and 1550 nm. For further validation of the proposed models, an indoor measurement over a 1.5 m distance utilizing 1310 nm, 1550 nm, and 1064 nm lasers was also performed. The first model provides a general link between T and β, while the second model relates β to RH and TD. By validating our attenuation coefficient model with actual outdoor and indoor experiments, we obtained scaling parameter x and decay parameter c values of 19.94, 40.02, 45.82 and 0.03015, 0.04096, 0.0428 for wavelengths of 1550, 1310, and 1064 nm, respectively. The proposed models are well validated over the large variation of temperature and humidity over the FSO link in a coastal region and an emulated indoor environment.
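A hedged sketch of how such a fitted attenuation coefficient feeds into a link budget via the Beer-Lambert law. The exponential form `beta(T) = x * exp(-c * T)` is our assumption for illustration: the abstract names the parameters x and c (values below are the quoted 1550 nm fit) but does not give the closed form, and the transmit power and units are invented.

```python
import math

def beta_from_temperature(x, c, T_celsius):
    """Assumed model form linking air temperature to attenuation (1/km)."""
    return x * math.exp(-c * T_celsius)

def received_power(P_tx_mW, beta_per_km, length_km):
    """Beer-Lambert law: P_rx = P_tx * exp(-beta * L)."""
    return P_tx_mW * math.exp(-beta_per_km * length_km)

beta = beta_from_temperature(x=19.94, c=0.03015, T_celsius=25.0)  # 1550 nm fit
P_rx = received_power(P_tx_mW=10.0, beta_per_km=beta, length_km=0.07)  # 70 m link
print(beta, P_rx)
```

Warmer air gives a smaller β under this assumed form, i.e., less attenuation over the 70 m outdoor path.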

  1. Interaction of 1.319 μm laser with skin: an optical-thermal-damage model and experimental validation

    NASA Astrophysics Data System (ADS)

    Jiao, Luguang; Yang, Zaifu; Wang, Jiarui

    2014-09-01

With the widespread use of high-power laser systems operating within the wavelength region of approximately 1.3 to 1.4 μm, it becomes necessary to refine the laser safety guidelines setting the exposure limits for the eye and skin. In this paper, an optical-thermal-damage model was developed to simulate laser propagation, energy deposition, heat transfer, and thermal damage in the skin for 1.319 μm laser irradiation. An experiment was also conducted in vitro to measure the temperature history of a porcine skin specimen irradiated by a 1.319 μm laser. Predictions from the model included the light distribution in the skin, the temperature response, and the thermal damage level of the tissue. It was shown that the light distribution region was much larger than that of the incident laser at the wavelength of 1.319 μm, and that the maximum value of the fluence rate was located in the interior region of the skin, not on the surface. The calculated temperature curve showed good agreement with the experimentally recorded temperature data, which validated the numerical model. The model also indicated that the damage integral changed little while the temperature of the skin tissue was below about 55 °C; beyond that, the integral increased rapidly and denaturation of the tissue would occur. Based on this model, we can further explore the damage mechanisms and trends for the skin and eye within the wavelength region of 1.3 to 1.4 μm, in combination with in vivo experimental investigations.
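The "damage integral" referenced above is conventionally the Arrhenius integral, Omega(t) = integral of A*exp(-Ea/(R*T(t))) dt, with Omega = 1 taken as the damage threshold. The sketch below uses the classic Henriques-Moritz skin coefficients; the paper's own calibrated values may differ.

```python
import math

A, Ea, R = 3.1e98, 6.28e5, 8.314   # 1/s, J/mol, J/mol/K (Henriques-Moritz skin values)

def damage_integral(temps_celsius, dt):
    """Accumulate Omega over a sampled temperature history T(t)."""
    return sum(A * math.exp(-Ea / (R * (T + 273.15))) * dt
               for T in temps_celsius)

# 10 s held at 45 C accumulates negligible damage; 10 s at 60 C far exceeds
# the Omega = 1 threshold, consistent with the abstract's observation that
# damage stays small below about 55 C and then grows rapidly.
omega_45 = damage_integral([45.0] * 100, dt=0.1)
omega_60 = damage_integral([60.0] * 100, dt=0.1)
print(omega_45 < 1.0 < omega_60)   # -> True
```

The steep Arrhenius exponent is what produces the sharp knee near 55 C: a few degrees change the rate by orders of magnitude.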

  2. Characterization and experimental validation of a squeeze film damper with MR fluid in a rotor-bearing system

    NASA Astrophysics Data System (ADS)

    Dominguez-Nuñez, L. A.; Silva-Navarro, G.

    2014-04-01

The study and application of Magneto-Rheological (MR) dampers has spread in recent years, but only a few studies have focused on vibration control problems in rotor-bearing systems. Squeeze-Film Dampers (SFD) are now commonly used to passively control the vibration response of rotor-bearing systems because they provide flexibility and damping and extend the so-called stability thresholds in rotating machinery. More recently, SFDs have been combined with MR or Electro-Rheological (ER) fluids to introduce a semiactive control mechanism to modify the rotordynamic coefficients and address the robust performance of the overall system response at higher operating speeds. There are, however, some theoretical and technological problems that complicate their extensive use, like the relationship between the centering spring flexibility and the rheological behavior of the smart fluid in producing the SFD forces. This work considers an SFD with MR fluid and a set of circular-section beams in a squirrel cage arrangement, in combination with latex seals, as centering springs. The mathematical model analysis includes the controllable viscoelastic properties associated with the MR fluid. The characterization of the SFD is made by determining coefficients associated with a modified Choi-Lee-Park polynomial model. The analysis considers a rotor-bearing system modeled using finite element methods. The SFD with MR fluid is connected to an experimental platform to validate and experimentally evaluate the overall system. Finally, to improve the open-loop system performance, a methodology for the use of different control schemes is proposed.

  3. Experimental Validation of L1 Adaptive Control: Rohrs' Counterexample in Flight

    NASA Technical Reports Server (NTRS)

    Xargay, Enric; Hovakimyan, Naira; Dobrokhodov, Vladimir; Kaminer, Issac; Kitsios, Ioannis; Cao, Chengyu; Gregory, Irene M.; Valavani, Lena

    2010-01-01

The paper presents new results on the verification and in-flight validation of an L1 adaptive flight control system, and proposes a general methodology for verification and validation of adaptive flight control algorithms. The proposed framework is based on Rohrs' counterexample, a benchmark problem presented in the early 1980s to show the limitations of the adaptive controllers developed at that time. In this paper, the framework is used to evaluate the performance and robustness characteristics of an L1 adaptive control augmentation loop implemented onboard a small unmanned aerial vehicle. Hardware-in-the-loop simulations and flight test results confirm the ability of the L1 adaptive controller to maintain stability and predictable performance of the closed-loop adaptive system in the presence of general (artificially injected) unmodeled dynamics. The results demonstrate the advantages of L1 adaptive control as a verifiable robust adaptive control architecture with the potential of reducing flight control design costs and facilitating the transition of adaptive control into advanced flight control systems.

  4. A method for landing gear modeling and simulation with experimental validation

    NASA Technical Reports Server (NTRS)

    Daniels, James N.

    1996-01-01

This document presents an approach for modeling and simulating landing gear systems. Specifically, a nonlinear model of an A-6 Intruder main gear is developed, simulated, and validated against static and dynamic test data. This model includes nonlinear effects such as a polytropic gas model, velocity-squared damping, a geometry-governed model for the discharge coefficients, stick-slip friction effects, and a nonlinear tire spring and damping model. An Adams-Moulton predictor-corrector was used to integrate the equations of motion until a discontinuity caused by the stick-slip friction model was reached, at which point a Runge-Kutta routine integrated past the discontinuity and returned the solution to the predictor-corrector. Run times of this software are around 2 minutes per 1 second of simulation under dynamic circumstances. To validate the model, engineers at the Aircraft Landing Dynamics facility at NASA Langley Research Center installed one A-6 main gear on a drop carriage and used a hydraulic shaker table to provide simulated runway inputs to the gear. Model parameters were tuned to produce excellent agreement for many cases.
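The integrator-switching idea described above (a multistep method that needs a smooth history, restarted by a self-starting Runge-Kutta step at each stick-slip discontinuity) can be sketched on a toy Coulomb-friction oscillator. This is a minimal illustration of the strategy, not the report's software; the pair used here is a 2nd-order Adams-Bashforth-Moulton, not the report's scheme.

```python
import math

def f(t, y, mu=0.3):
    x, v = y
    # Mass-spring with Coulomb friction: the sign(v) term is discontinuous.
    fric = -mu * math.copysign(1.0, v) if v != 0.0 else 0.0
    return (v, -x + fric)

def rk4_step(t, y, h):
    k1 = f(t, y); k2 = f(t + h/2, [y[i] + h/2*k1[i] for i in range(2)])
    k3 = f(t + h/2, [y[i] + h/2*k2[i] for i in range(2)])
    k4 = f(t + h, [y[i] + h*k3[i] for i in range(2)])
    return [y[i] + h/6*(k1[i] + 2*k2[i] + 2*k3[i] + k4[i]) for i in range(2)]

def ab2_am2_step(t, y, f_prev, h):
    # Adams-Bashforth-2 predictor, Adams-Moulton (trapezoidal) corrector.
    fy = f(t, y)
    pred = [y[i] + h*(1.5*fy[i] - 0.5*f_prev[i]) for i in range(2)]
    fp = f(t + h, pred)
    return [y[i] + h/2*(fy[i] + fp[i]) for i in range(2)], fy

h, t, y = 0.01, 0.0, [1.0, 0.0]
hist_f = None                          # multistep history (f at previous point)
for _ in range(500):
    v_before = y[1]
    if hist_f is None:                 # (re)start: no valid history yet
        y_new = rk4_step(t, y, h)
        hist_f = f(t, y)
    else:
        y_new, hist_f = ab2_am2_step(t, y, hist_f, h)
    if v_before * y_new[1] < 0.0:      # velocity sign change: discontinuity
        hist_f = None                  # discard history, force an RK restart
    y, t = y_new, t + h
print(abs(y[0]) < 1.0)                 # friction dissipates energy -> True
```

Discarding the history at each sign change is the key move: the multistep formula silently assumes the right-hand side was smooth over its past points, which the friction discontinuity violates.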

  5. Experimental validation of a transformation optics based lens for beam steering

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yi, Jianjia; Burokur, Shah Nawaz, E-mail: shah-nawaz.burokur@u-psud.fr; Lustrac, André de

    2015-10-12

A transformation optics based lens for beam control is experimentally realized and measured at microwave frequencies. Laplace's equation is adopted to construct the mapping between the virtual and physical spaces. The metamaterial-based lens prototype is designed using electric LC resonators. A planar microstrip antenna source is used as a transverse-electric-polarized wave launcher for the lens. Both the far-field radiation patterns and the near-field distributions have been measured to experimentally demonstrate the beam steering properties. Measurements agree quantitatively and qualitatively with numerical simulations, and operation over a relatively broad frequency bandwidth is observed.

  6. Experimental comparison and validation of hot-ball method with guarded hot plate method on polyurethane foams

    NASA Astrophysics Data System (ADS)

    Hudec, Ján; Glorieux, Christ; Dieška, Peter; Kubičár, Ľudovít

    2016-07-01

The hot-ball method is an innovative transient method for measuring thermophysical properties. Its principle is based on heating a small ball, incorporated in the measured medium, with constant heating power while simultaneously measuring the ball's temperature response from the moment heating is initiated. The shape of the temperature response depends on the thermophysical properties of the medium in which the sensor is placed. The method is patented by the Institute of Physics, SAS, where the method and sensors based on it are being developed. At the beginning of sensor development we focused on monitoring applications, where relative precision is much more important than accuracy. Since then, the quality of the sensors has improved enough for a new application: absolute measurement of the thermophysical parameters of low-thermal-conductivity materials. This paper describes the experimental verification and validation of measurement by the hot-ball method. Thanks to cooperation with the Laboratory of Soft Matter and Biophysics of the Catholic University of Leuven in Belgium, the established guarded hot plate method was used as a reference. Details of the measuring setups, a description of the experiments, and the results of the comparison are presented.
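A commonly quoted working relation for the hot-ball sensor: a ball of radius r delivering constant power q into an infinite medium reaches a long-time surface temperature rise dT = q / (4*pi*r*lambda), so the conductivity follows from the measured plateau. The numbers below are invented for illustration, not data from the comparison in the paper.

```python
import math

def conductivity_from_plateau(q_watts, r_m, dT_kelvin):
    """Invert dT = q / (4 pi r lam) for the thermal conductivity lam (W/m/K)."""
    return q_watts / (4.0 * math.pi * r_m * dT_kelvin)

# Example: 1 mW into a 1 mm ball with a 2.65 K plateau gives roughly
# 0.03 W/m/K, the order of magnitude expected for polyurethane foam.
lam = conductivity_from_plateau(q_watts=0.001, r_m=1.0e-3, dT_kelvin=2.65)
print(round(lam, 4))
```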

  7. Numerical prediction of fiber orientation in injection-molded short-fiber/thermoplastic composite parts with experimental validation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thi, Thanh Binh Nguyen; Morioka, Mizuki; Yokoyama, Atsushi

Numerical prediction of the fiber orientation in short-glass-fiber (GF) reinforced polyamide 6 (PA6) composites with fiber weight concentrations of 30%, 50%, and 70%, manufactured by the injection molding process, is presented. The fiber orientation was also directly observed and measured through X-ray computed tomography. During the injection molding of a short-fiber/thermoplastic composite, the fiber orientation is produced by the flow state and fiber-fiber interaction. The Folgar-Tucker equation is well known for modeling the fiber orientation in a concentrated suspension; it adds to Jeffery's equation a diffusive term with a phenomenological coefficient to account for fiber-fiber interaction. Our model for the fiber-fiber interaction was proposed by modifying the rotary diffusion term of the Folgar-Tucker equation, and was presented in a conference paper of the 29th International Conference of the Polymer Processing Society published in an AIP conference proceeding. For modeling fiber interaction, fiber dynamic simulation was introduced in order to obtain a global fiber interaction coefficient, which is a function of the fiber concentration, aspect ratio, and angular velocity. The fiber orientation is predicted by incorporating the proposed fiber interaction model into the computer-aided engineering simulation package C-Mold. An experimental program was carried out in which the fiber orientation distribution was measured in a 100 x 100 x 2 mm injection-molded plate and a 100 x 80 x 2 mm injection-molded weld, analyzed with a high-resolution 3D X-ray computed tomography system (XVA-160α) and calculated by X-ray computed tomography imaging. The numerical prediction shows good agreement with the experimental validation, and the complex fiber orientation in the injection-molded weld was investigated.

  8. Numerical prediction of fiber orientation in injection-molded short-fiber/thermoplastic composite parts with experimental validation

    NASA Astrophysics Data System (ADS)

    Thi, Thanh Binh Nguyen; Morioka, Mizuki; Yokoyama, Atsushi; Hamanaka, Senji; Yamashita, Katsuhisa; Nonomura, Chisato

    2015-05-01

Numerical prediction of the fiber orientation in short-glass-fiber (GF) reinforced polyamide 6 (PA6) composites with fiber weight concentrations of 30%, 50%, and 70%, manufactured by the injection molding process, is presented. The fiber orientation was also directly observed and measured through X-ray computed tomography. During the injection molding of a short-fiber/thermoplastic composite, the fiber orientation is produced by the flow state and fiber-fiber interaction. The Folgar-Tucker equation is well known for modeling the fiber orientation in a concentrated suspension; it adds to Jeffery's equation a diffusive term with a phenomenological coefficient to account for fiber-fiber interaction. Our model for the fiber-fiber interaction was proposed by modifying the rotary diffusion term of the Folgar-Tucker equation, and was presented in a conference paper of the 29th International Conference of the Polymer Processing Society published in an AIP conference proceeding. For modeling fiber interaction, fiber dynamic simulation was introduced in order to obtain a global fiber interaction coefficient, which is a function of the fiber concentration, aspect ratio, and angular velocity. The fiber orientation is predicted by incorporating the proposed fiber interaction model into the computer-aided engineering simulation package C-Mold. An experimental program was carried out in which the fiber orientation distribution was measured in a 100 x 100 x 2 mm injection-molded plate and a 100 x 80 x 2 mm injection-molded weld, analyzed with a high-resolution 3D X-ray computed tomography system (XVA-160α) and calculated by X-ray computed tomography imaging. The numerical prediction shows good agreement with the experimental validation, and the complex fiber orientation in the injection-molded weld was investigated.
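A minimal sketch of the standard Folgar-Tucker orientation-tensor equation in 2-D simple shear, using the quadratic closure A4:D ≈ A(A:D). Here C_I is the constant phenomenological interaction coefficient; the paper's contribution is precisely to replace this constant with a function of concentration, aspect ratio, and angular velocity, which is not reproduced here.

```python
import numpy as np

def folgar_tucker_rhs(A, L, C_I, xi=1.0):
    D = 0.5 * (L + L.T)                    # rate-of-deformation tensor
    W = 0.5 * (L - L.T)                    # vorticity tensor
    gdot = np.sqrt(2.0 * np.sum(D * D))    # scalar shear rate
    A4D = A * np.tensordot(A, D)           # quadratic closure for A4 : D
    d = A.shape[0]                         # 2-D: isotropic term uses (I - 2A)
    return (W @ A - A @ W
            + xi * (D @ A + A @ D - 2.0 * A4D)
            + 2.0 * C_I * gdot * (np.eye(d) - d * A))

L = np.array([[0.0, 1.0], [0.0, 0.0]])     # simple shear, gdot = 1
A = 0.5 * np.eye(2)                        # isotropic initial orientation
dt = 1e-3
for _ in range(20000):                     # explicit Euler to (near) steady state
    A = A + dt * folgar_tucker_rhs(A, L, C_I=0.01)

print(round(float(np.trace(A)), 3))        # trace is conserved at 1
print(bool(A[0, 0] > 0.5))                 # fibers align with the flow direction
```

The rotary diffusion term (the C_I part) is what keeps the steady state from being perfect alignment, mimicking fiber-fiber collisions.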

  9. Fractional differential equations based modeling of microbial survival and growth curves: model development and experimental validation.

    PubMed

    Kaur, A; Takhar, P S; Smith, D M; Mann, J E; Brashears, M M

    2008-10-01

A fractional differential equations (FDEs)-based theory involving 1- and 2-term equations was developed to predict the nonlinear survival and growth curves of foodborne pathogens. It is interesting to note that the solution of the 1-term FDE leads to the Weibull model. Nonlinear regression (Gauss-Newton method) was performed to calculate the parameters of the 1-term and 2-term FDEs. The experimental inactivation data of a Salmonella cocktail in ground turkey breast, ground turkey thigh, and pork shoulder, and of a cocktail of Salmonella, E. coli, and Listeria monocytogenes in ground beef exposed to isothermal cooking conditions of 50 to 66 degrees C, were used for validation. To evaluate the performance of the 2-term FDE in predicting growth curves, the growth of Salmonella Typhimurium, Salmonella Enteritidis, and background flora in ground pork and boneless pork chops, and of E. coli O157:H7 in ground beef, in the temperature range of 22.2 to 4.4 degrees C was chosen. A program was written in Matlab to predict the model parameters and the survival and growth curves. The 2-term FDE was more successful in describing the complex shapes of microbial survival and growth curves than the linear and Weibull models. Predicted curves of the 2-term FDE had higher magnitudes of R(2) (0.89 to 0.99) and lower magnitudes of root mean square error (0.0182 to 0.5461) for all experimental cases in comparison to the linear and Weibull models. This model was capable of predicting the tails in survival curves, which was not possible using the Weibull and linear models. The developed model can be used for other foodborne pathogens in a variety of food products to study destruction and growth behavior.
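The Weibull limit mentioned above has a simple closed form. In Mafart's parameterization the log surviving fraction is log10(N/N0) = -(t/delta)^p; the delta and p below are invented for illustration, not fitted values from the paper.

```python
def weibull_log_survival(t, delta, p):
    """log10 of the surviving fraction at time t (Mafart parameterization)."""
    return -((t / delta) ** p)

# A shape parameter p < 1 gives a concave survival curve: fast early kill
# followed by a long tail, the behavior a log-linear model cannot capture.
curve = [weibull_log_survival(t, delta=2.0, p=0.6) for t in (1, 5, 10, 20)]
print([round(v, 2) for v in curve])   # -> [-0.66, -1.73, -2.63, -3.98]
```

The 2-term FDE generalizes further still, adding a second relaxation term so that shoulders and tails can appear in the same curve.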

  10. Experimental calibration and validation of sewer/surface flow exchange equations in steady and unsteady flow conditions

    NASA Astrophysics Data System (ADS)

    Rubinato, Matteo; Martins, Ricardo; Kesserwani, Georges; Leandro, Jorge; Djordjević, Slobodan; Shucksmith, James

    2017-09-01

The linkage between sewer pipe flow and floodplain flow is recognised to induce an important source of uncertainty within two-dimensional (2D) urban flood models. This uncertainty is often attributed to the use of empirical hydraulic formulae (the one-dimensional (1D) weir and orifice steady flow equations) to achieve data-connectivity at the linking interface, which require the determination of discharge coefficients. Because of the paucity of high resolution localised data for this type of flow, the current understanding and quantification of a suitable range for those discharge coefficients is somewhat lacking. To fill this gap, this work presents the results acquired from an instrumented physical model designed to study the interaction between a pipe network flow and a floodplain flow. The full range of sewer-to-surface and surface-to-sewer flow conditions at the exchange zone are experimentally analysed in both steady and unsteady flow regimes. Steady state measured discharges are first analysed considering the relationship between the energy heads from the sewer flow and the floodplain flow; these results show that existing weir and orifice formulae are valid for describing the flow exchange for the present physical model, and yield new calibrated discharge coefficients for each of the flow conditions. The measured exchange discharges are also integrated (as a source term) within a 2D numerical flood model (a finite volume solver of the 2D Shallow Water Equations (SWE)), which is shown to reproduce the observed coefficients. This calibrated numerical model is then used to simulate a series of unsteady flow tests reproduced within the experimental facility. Results show that the numerical model overestimated the mean surcharge flow rate. This suggests the occurrence of additional head losses in unsteady conditions which are not currently accounted for within flood models calibrated in steady flow conditions.
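The two steady exchange laws being calibrated are the textbook weir and orifice relations. The sketch below uses generic coefficient values and an invented manhole geometry, not the newly calibrated coefficients from the paper.

```python
import math

g = 9.81  # m/s^2

def weir_discharge(Cw, crest_length, head):
    """Free, weir-like inflow when the opening is not submerged:
    Q = Cw * L * sqrt(2g) * h^(3/2)."""
    return Cw * crest_length * math.sqrt(2.0 * g) * head ** 1.5

def orifice_discharge(Cd, area, head_difference):
    """Submerged, orifice-like exchange driven by the head difference:
    Q = Cd * A * sqrt(2g * |dh|), signed by the flow direction."""
    sign = 1.0 if head_difference >= 0 else -1.0
    return sign * Cd * area * math.sqrt(2.0 * g * abs(head_difference))

# Invented example: 0.3 m crest, 2 cm head (weir); 0.071 m^2 opening,
# 5 cm head difference (orifice), both with a generic coefficient of 0.6.
Q_weir = weir_discharge(Cw=0.6, crest_length=0.3, head=0.02)
Q_orif = orifice_discharge(Cd=0.6, area=0.071, head_difference=0.05)
print(Q_weir, Q_orif)   # m^3/s
```

In a coupled 1D/2D model these Q values enter the 2D solver as point source/sink terms at the manhole, which is exactly how the measured discharges were integrated in the study.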

  11. Circulation Control Model Experimental Database for CFD Validation

    NASA Technical Reports Server (NTRS)

    Paschal, Keith B.; Neuhart, Danny H.; Beeler, George B.; Allan, Brian G.

    2012-01-01

    A 2D circulation control wing was tested in the Basic Aerodynamic Research Tunnel at the NASA Langley Research Center. A traditional circulation control wing employs tangential blowing along the span over a trailing-edge Coanda surface for the purpose of lift augmentation. This model has been tested extensively at the Georgia Tech Research Institute for the purpose of performance documentation at various blowing rates. The current study seeks to expand on the previous work by documenting additional flow-field data needed for validation of computational fluid dynamics. Two jet momentum coefficients were tested during this entry: 0.047 and 0.114. Boundary-layer transition was investigated and turbulent boundary layers were established on both the upper and lower surfaces of the model. Chordwise and spanwise pressure measurements were made, and tunnel sidewall pressure footprints were documented. Laser Doppler Velocimetry measurements were made on both the upper and lower surface of the model at two chordwise locations (x/c = 0.8 and 0.9) to document the state of the boundary layers near the spanwise blowing slot.
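The blowing rates quoted above are jet momentum coefficients, C_mu = (mdot * V_jet) / (q_inf * S). The numbers below are invented to show the arithmetic (they happen to land near the 0.047 test condition); they are not the tunnel's actual operating values.

```python
def momentum_coefficient(mdot, v_jet, q_inf, ref_area):
    """C_mu: jet momentum flux normalized by freestream dynamic pressure times
    reference area. mdot in kg/s, v_jet in m/s, q_inf in Pa, ref_area in m^2."""
    return (mdot * v_jet) / (q_inf * ref_area)

c_mu = momentum_coefficient(mdot=0.01, v_jet=100.0, q_inf=600.0, ref_area=0.0355)
print(round(c_mu, 3))   # -> 0.047
```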

  12. Theoretical research and experimental validation of quasi-static load spectra on bogie frame structures of high-speed trains

    NASA Astrophysics Data System (ADS)

    Zhu, Ning; Sun, Shou-Guang; Li, Qiang; Zou, Hua

    2014-12-01

One of the major problems in structural fatigue life analysis is establishing structural load spectra under actual operating conditions. This study conducts theoretical research and experimental validation of quasi-static load spectra on bogie frame structures of high-speed trains. The quasi-static load series that correspond to quasi-static deformation modes are identified according to the structural form and bearing conditions of high-speed train bogie frames. Moreover, a force-measuring frame is designed and manufactured based on the quasi-static load series. The load decoupling model of the quasi-static load series is then established via calibration tests. Quasi-static load-time histories, together with online tests and decoupling analysis, are obtained for the intermediate range of the Beijing-Shanghai dedicated passenger line. The damage consistency calibration of the quasi-static discrete load spectra is performed according to a damage consistency criterion and a genetic algorithm. The calibrated damage corresponding to the quasi-static discrete load spectra satisfies the safety requirements of bogie frames.
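Once a discrete load spectrum is in hand, damage consistency is conventionally checked with Palmgren-Miner accumulation, D = sum(n_i / N_i), with cycles-to-failure N_i from an S-N curve. The Basquin constants and the three-bin spectrum below are invented for illustration, not bogie-frame material data or measured line data.

```python
def basquin_life(stress_amplitude_mpa, C=1.0e12, m=3.0):
    """Cycles to failure from an assumed Basquin S-N curve, N = C / S^m."""
    return C / stress_amplitude_mpa ** m

def miner_damage(spectrum):
    """Palmgren-Miner sum over (stress_amplitude_MPa, cycle_count) bins."""
    return sum(n / basquin_life(s) for s, n in spectrum)

spectrum = [(80.0, 2.0e5), (120.0, 5.0e4), (160.0, 1.0e4)]
D = miner_damage(spectrum)
print(D < 1.0)   # damage sum below 1: spectrum within the fatigue allowable
```

A damage-consistency calibration like the paper's adjusts the discrete spectrum so that this sum matches the damage of the measured load-time history.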

  13. Experimental validation of a 0-D numerical model for phase change thermal management systems in lithium-ion batteries

    NASA Astrophysics Data System (ADS)

    Schweitzer, Ben; Wilke, Stephen; Khateeb, Siddique; Al-Hallaj, Said

    2015-08-01

    A lumped (0-D) numerical model has been developed for simulating the thermal response of a lithium-ion battery pack with a phase-change composite (PCC™) thermal management system. A small 10s4p battery pack utilizing PCC material was constructed and subjected to discharge at various C-rates in order to validate the lumped model. The 18650 size Li-ion cells used in the pack were electrically characterized to determine their heat generation, and various PCC materials were thermally characterized to determine their apparent specific heat as a function of temperature. Additionally, a 2-D FEA thermal model was constructed to help understand the magnitude of spatial temperature variation in the pack, and to understand the limitations of the lumped model. Overall, good agreement is seen between experimentally measured pack temperatures and the 0-D model, and the 2-D FEA model predicts minimal spatial temperature variation for PCC-based packs at C-rates of 1C and below.
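A 0-D energy balance in the spirit of the lumped model: m * cp_eff(T) * dT/dt = Q_gen - hA * (T - T_amb), with the latent heat of the phase-change composite folded into an apparent specific heat over the melt range (the "apparent specific heat as a function of temperature" characterized above). All parameter values below are invented for illustration, not the pack's measured properties.

```python
def simulate(Q_gen, t_end, latent, dt=1.0, m=0.8, hA=0.5, T_amb=25.0,
             cp_solid=2000.0, T_lo=42.0, T_hi=47.0):
    """Forward-Euler lumped thermal model; latent=0 disables the PCC."""
    T = T_amb
    for _ in range(int(t_end / dt)):
        # Apparent cp: a rectangular spike spreads the latent heat (J/kg)
        # uniformly across the assumed melting range [T_lo, T_hi].
        cp = cp_solid + (latent / (T_hi - T_lo) if T_lo <= T <= T_hi else 0.0)
        T += dt * (Q_gen - hA * (T - T_amb)) / (m * cp)
    return T

T_pcc = simulate(Q_gen=40.0, t_end=1800.0, latent=180000.0)  # with phase change
T_bare = simulate(Q_gen=40.0, t_end=1800.0, latent=0.0)      # no phase change
print(T_pcc < T_bare)   # -> True: the melt plateau clamps the temperature rise
```

The plateau inside the melt range is the mechanism by which the PCC buys thermal headroom during a high-rate discharge.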

  14. Experimental validation of the AVIVET trap, a tool to quantitatively monitor the dynamics of Dermanyssus gallinae populations in laying hens.

    PubMed

    Lammers, G A; Bronneberg, R G G; Vernooij, J C M; Stegeman, J A

    2017-06-01

Dermanyssus gallinae (D. gallinae) infestation causes economic losses due to impaired health and production of hens and the costs of parasite control across the world. Moreover, infestations are associated with reduced welfare of hens and may cause itching in humans. To effectively implement control methods it is crucially important to have high-quality information about D. gallinae populations in poultry houses in space and time. At present no validated tool is available to quantitatively monitor the dynamics of all four stages of D. gallinae (i.e., eggs, larvae, nymphs, and adults) in poultry houses. This article describes the experimental validation of the AVIVET trap, a device to quantitatively monitor the dynamics of D. gallinae infestations. We used the device to study D. gallinae in fully equipped cages with two white specific-pathogen-free Leghorn laying hens experimentally exposed to three different infestation levels of D. gallinae (low to high). The AVIVET trap successfully detected D. gallinae at high (5,000 D. gallinae), medium (2,500 D. gallinae), and low (50 D. gallinae) levels of infestation. The linear equation Y = 0.47 + 1.21X, with Y = log10 (total number of D. gallinae nymphs and adults in the cage) and X = log10 (total number of D. gallinae nymphs and adults in the AVIVET trap), explained 93.8% of the variation. The weight of D. gallinae in the AVIVET trap also appears to be a reliable parameter for quantifying D. gallinae infestation in a poultry house: it correlates 99.6% (P < 0.001) with the counted number of all stages of D. gallinae in the trap (i.e., eggs, larvae, nymphs, and adults), indicating that the trap is highly specific. From this experiment it can be concluded that the AVIVET trap is promising as a quantitative tool for monitoring D. gallinae dynamics in a poultry house. © 2016 Poultry Science Association Inc.
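The reported calibration is a log-log regression: log10 of the cage population against log10 of the trap count, with intercept 0.47 and slope 1.21. Inverting it gives a population estimate from a trap count (the 100-mite example below is ours, not a measurement from the study):

```python
import math

def estimate_cage_population(n_trap):
    """Predict total D. gallinae nymphs + adults in the cage from the trap
    count, using the fitted line log10(N_cage) = 0.47 + 1.21 * log10(N_trap)."""
    return 10 ** (0.47 + 1.21 * math.log10(n_trap))

# A trap count of 100 mites predicts roughly 780 nymphs + adults in the cage.
print(round(estimate_cage_population(100)))   # -> 776
```

The slope above 1 means the trap captures a slightly decreasing fraction of the population as infestation grows, which is why the log-log form fits better than a simple proportionality.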

  15. Progress Towards a Microgravity CFD Validation Study Using the ISS SPHERES-SLOSH Experiment

    NASA Technical Reports Server (NTRS)

    Storey, Jedediah M.; Kirk, Daniel; Marsell, Brandon (Editor); Schallhorn, Paul (Editor)

    2017-01-01

Understanding, predicting, and controlling fluid slosh dynamics is critical to safety and to improving the performance of space missions when a significant percentage of the spacecraft's mass is a liquid. Computational fluid dynamics (CFD) simulations can be used to predict the dynamics of slosh, but these programs require extensive validation. Many CFD programs have been validated by slosh experiments using various fluids in Earth gravity, but prior to the ISS SPHERES-Slosh experiment, little experimental data for long-duration, zero-gravity slosh existed. This paper presents the current status of an ongoing CFD validation study using the ISS SPHERES-Slosh experimental data.

  16. Progress Towards a Microgravity CFD Validation Study Using the ISS SPHERES-SLOSH Experiment

    NASA Technical Reports Server (NTRS)

    Storey, Jed; Kirk, Daniel (Editor); Marsell, Brandon (Editor); Schallhorn, Paul (Editor)

    2017-01-01

Understanding, predicting, and controlling fluid slosh dynamics is critical to safety and to improving the performance of space missions when a significant percentage of the spacecraft's mass is a liquid. Computational fluid dynamics (CFD) simulations can be used to predict the dynamics of slosh, but these programs require extensive validation. Many CFD programs have been validated by slosh experiments using various fluids in Earth gravity, but prior to the ISS SPHERES-Slosh experiment, little experimental data for long-duration, zero-gravity slosh existed. This paper presents the current status of an ongoing CFD validation study using the ISS SPHERES-Slosh experimental data.

  17. Validation of Magnetic Resonance Thermometry by Computational Fluid Dynamics

    NASA Astrophysics Data System (ADS)

    Rydquist, Grant; Owkes, Mark; Verhulst, Claire M.; Benson, Michael J.; Vanpoppel, Bret P.; Burton, Sascha; Eaton, John K.; Elkins, Christopher P.

    2016-11-01

Magnetic Resonance Thermometry (MRT) is a new experimental technique that can measure fully three-dimensional temperature fields in a noninvasive manner. However, validation is still required to determine the accuracy of measured results. One method of examination is to compare data gathered experimentally with data computed with computational fluid dynamics (CFD). In this study, large-eddy simulations were performed with the NGA computational platform to generate data for comparison with previously run MRT experiments. The experimental setup consisted of a heated jet inclined at 30° injected into a larger channel. In the simulations, viscosity and density were scaled according to the local temperature to account for differences in buoyant and viscous forces. A mesh-independence study was performed with 5-million-, 15-million-, and 45-million-cell meshes. The program STAR-CCM+ was used to simulate the complete experimental geometry, and its results were compared with the data generated from NGA. Overall, both programs show good agreement with the experimental data gathered with MRT. With these data, the validity of MRT as a diagnostic tool has been shown, and the tool can be used to further our understanding of a range of flows with non-trivial temperature distributions.
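The temperature-dependent property scaling described above can be illustrated with standard models (a sketch only: the abstract does not name the correlations used, so Sutherland's law and the ideal-gas law are assumptions on our part):

```python
def sutherland_viscosity(T, mu_ref=1.716e-5, T_ref=273.15, S=110.4):
    """Dynamic viscosity of air [Pa·s] from Sutherland's law."""
    return mu_ref * (T / T_ref) ** 1.5 * (T_ref + S) / (T + S)

def ideal_gas_density(T, p=101325.0, R=287.05):
    """Density of air [kg/m^3] from the ideal-gas law at pressure p."""
    return p / (R * T)

# In an LES of a heated jet, each cell's viscosity and density would be
# updated from its local temperature, changing the buoyant and viscous forces.
```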

  18. Phase 1 Validation Testing and Simulation for the WEC-Sim Open Source Code

    NASA Astrophysics Data System (ADS)

    Ruehl, K.; Michelen, C.; Gunawan, B.; Bosma, B.; Simmons, A.; Lomonaco, P.

    2015-12-01

WEC-Sim is an open-source code for modeling the performance of wave energy converters (WECs) in operational waves, developed by Sandia and NREL and funded by the US DOE. The code is a time-domain modeling tool developed in MATLAB/SIMULINK using the multibody dynamics solver SimMechanics, and it solves the WEC's governing equations of motion in 6 degrees of freedom using the Cummins time-domain impulse response formulation. The WEC-Sim code has undergone verification through code-to-code comparisons; however, validation of the code has been limited to publicly available experimental data sets. While these data sets provide preliminary code validation, the experimental tests were not explicitly designed for code validation and as a result are limited in their ability to validate the full functionality of the WEC-Sim code. Therefore, dedicated physical model tests for WEC-Sim validation have been performed. This presentation provides an overview of the WEC-Sim validation experimental wave tank tests performed at Oregon State University's Directional Wave Basin at the Hinsdale Wave Research Laboratory. Phase 1 of experimental testing focused on device characterization and was completed in Fall 2015. Phase 2 focuses on WEC performance and is scheduled for Winter 2015/2016. These experimental tests were designed explicitly to validate the performance of the WEC-Sim code and its new feature additions. Upon completion, the WEC-Sim validation data set will be made publicly available to the wave energy community. For the physical model test, a controllable model of a floating wave energy converter has been designed and constructed. The instrumentation includes state-of-the-art devices to measure pressure fields and motions in 6 DOF, along with multi-axial load cells, torque transducers, position transducers, and encoders. The model also incorporates a fully programmable power-take-off system which can be used to generate or absorb wave energy. Numerical simulations of the experiments using WEC-Sim will be

  19. Experimentally validated 3D MD model for AFM-based tip-based nanomanufacturing

    NASA Astrophysics Data System (ADS)

    Promyoo, Rapeepan

In order to control AFM-based TBN to produce precise nano-geometry efficiently, there is a need for a more focused study of the effects of different parameters, such as feed, speed, and depth of cut, on process performance and outcome. This is achieved by experimentally validating an MD simulation model of nanomachining and using it to conduct parametric studies to guide AFM-based TBN. A 3D MD model with a larger domain size was developed and used to gain a unique insight into the nanoindentation and nanoscratching processes, such as the effect of tip speed on indentation force above 10 nm of indentation depth. The model also supported a more comprehensive parametric study than other published work, in terms of the number of parameters and ranges of values investigated, as well as a more cost-effective design of experiments. The model was also used to predict material properties at the nanoscale (e.g., the hardness of gold was predicted within 6% error). In parallel, a comprehensive experimental parametric study was conducted to produce a database used to select proper machining conditions for guiding the fabrication of nanochannels (e.g., scratch rate = 0.996 Hz and trigger threshold = 1 V to achieve a nanochannel depth of 50 nm in the gold device). Similar trends were found for the variation of indentation force with depth of cut and for the pattern of material pile-up around the indentation mark or scratched groove. The parametric studies conducted using both MD model simulations and AFM experiments showed the following: normal forces for both nanoindentation and nanoscratching increase as the depth of cut increases; the indentation depth increases with tip speed, but the depth of scratch decreases with increasing tip speed; and the width and depth of the scratched groove also depend on the scratch angle, with 90° being the recommended angle.
The surface roughness increases with step over, especially when the step over is larger

  20. Detection of overreported psychopathology with the MMPI-2-RF [corrected] validity scales.

    PubMed

    Sellbom, Martin; Bagby, R Michael

    2010-12-01

We examined the utility of the validity scales on the recently released Minnesota Multiphasic Personality Inventory-2 Restructured Form (MMPI-2-RF; Ben-Porath & Tellegen, 2008) to detect overreported psychopathology. This set of validity scales includes a newly developed scale and revised versions of the original MMPI-2 validity scales. We used an analogue, experimental simulation in which MMPI-2-RF responses (derived from archived MMPI-2 protocols) of undergraduate students instructed to overreport psychopathology (in either a coached or noncoached condition) were compared with those of psychiatric inpatients who completed the MMPI-2 under standardized instructions. The MMPI-2-RF validity scale Infrequent Psychopathology Responses best differentiated the simulation groups from the sample of patients, regardless of experimental condition. No other validity scale added consistent incremental predictive utility to Infrequent Psychopathology Responses in distinguishing the simulation groups from the sample of patients. Classification accuracy statistics confirmed the recommended cut scores in the MMPI-2-RF manual (Ben-Porath & Tellegen, 2008).
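Cut-score classification accuracy of the kind reported above can be computed as follows (an illustrative sketch with made-up T-scores; the function, cut score, and data are not from the study):

```python
def classification_accuracy(clinical_scores, simulator_scores, cut):
    """Sensitivity and specificity of a validity-scale cut score:
    simulators at or above the cut are 'detected' as overreporting;
    patients below it are correctly classified as genuine responders."""
    sensitivity = sum(s >= cut for s in simulator_scores) / len(simulator_scores)
    specificity = sum(s < cut for s in clinical_scores) / len(clinical_scores)
    return sensitivity, specificity

# Hypothetical scores: most simulators exceed the cut, most patients do not.
sens, spec = classification_accuracy([55, 60, 72, 95], [88, 100, 110, 79], 80)
```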

  1. 6 DOF articulated-arm robot and mobile platform: Dynamic modelling as Multibody System and its validation via Experimental Modal Analysis.

    NASA Astrophysics Data System (ADS)

    Toledo Fuentes, A.; Kipfmueller, M.; José Prieto, M. A.

    2017-10-01

Mobile manipulators are becoming a key instrument for increasing flexibility in industrial processes. Their requirements include handling objects of different weights and sizes and transporting them quickly, without jeopardizing production workers and machines. Compensation of the forces affecting the system dynamics is therefore needed to avoid unwanted oscillations and tilting caused by sudden accelerations and decelerations. One general solution may be the implementation of external positioning elements to actively stabilize the system. To accomplish this approach, the dynamic behavior of a robotic arm and a mobile platform was investigated in order to develop the stabilization mechanism using multibody simulations. The methodology was divided into two phases for each subsystem: first, the natural frequencies and mode shapes were obtained using experimental modal analyses; then, based on these experimental results, multibody simulation (MBS) models were set up and their dynamic parameters adjusted. The mode shapes, together with the obtained natural frequencies, allowed a quantitative and qualitative analysis. In summary, the MBS models were successfully validated against the real subsystems, with a maximum error of 15%. These models will serve as the basis for future steps in the design of the external actuators and their control strategy using a co-simulation tool.

  2. Experimental Validation of a Closed Brayton Cycle System Transient Simulation

    NASA Technical Reports Server (NTRS)

    Johnson, Paul K.; Hervol, David S.

    2006-01-01

The Brayton Power Conversion Unit (BPCU) is a closed-cycle system with an inert-gas working fluid, located in Vacuum Facility 6 at NASA Glenn Research Center. It was used in previous solar dynamic technology efforts (SDGTD) and was modified to its present configuration by replacing the solar receiver with an electrical resistance heater. It is the first closed Brayton cycle to be coupled with an ion propulsion system and has been used to examine mechanical dynamic characteristics and responses. The focus of this work was the validation of a computer model of the BPCU. The model was built using the Closed Cycle System Simulation (CCSS) design and analysis tool, and test conditions were then duplicated in CCSS: various steady-state points as well as transients involving changes in shaft rotational speed and heat input. Testing to date has shown that the BPCU is able to generate meaningful, repeatable data that can be used for computer model validation. Results generated by CCSS demonstrated that the model sufficiently reproduced the thermal transients exhibited by the BPCU system. CCSS was also used to match BPCU steady-state operating points. Cycle temperatures were within 4.1% of the data (most were within 1%), and cycle pressures were all within 3.2%. Error in alternator power (as much as 13.5%) was attributed to uncertainties in the compressor and turbine maps and in the alternator and bearing loss models. The acquired understanding of the BPCU's behavior provides useful insight for improvements to the CCSS model, as well as ideas for future testing and possible system modifications.

  3. Testing and validating environmental models

    USGS Publications Warehouse

    Kirchner, J.W.; Hooper, R.P.; Kendall, C.; Neal, C.; Leavesley, G.

    1996-01-01

    Generally accepted standards for testing and validating ecosystem models would benefit both modellers and model users. Universally applicable test procedures are difficult to prescribe, given the diversity of modelling approaches and the many uses for models. However, the generally accepted scientific principles of documentation and disclosure provide a useful framework for devising general standards for model evaluation. Adequately documenting model tests requires explicit performance criteria, and explicit benchmarks against which model performance is compared. A model's validity, reliability, and accuracy can be most meaningfully judged by explicit comparison against the available alternatives. In contrast, current practice is often characterized by vague, subjective claims that model predictions show 'acceptable' agreement with data; such claims provide little basis for choosing among alternative models. Strict model tests (those that invalid models are unlikely to pass) are the only ones capable of convincing rational skeptics that a model is probably valid. However, 'false positive' rates as low as 10% can substantially erode the power of validation tests, making them insufficiently strict to convince rational skeptics. Validation tests are often undermined by excessive parameter calibration and overuse of ad hoc model features. Tests are often also divorced from the conditions under which a model will be used, particularly when it is designed to forecast beyond the range of historical experience. In such situations, data from laboratory and field manipulation experiments can provide particularly effective tests, because one can create experimental conditions quite different from historical data, and because experimental data can provide a more precisely defined 'target' for the model to hit. We present a simple demonstration showing that the two most common methods for comparing model predictions to environmental time series (plotting model time series
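The abstract's point that false-positive rates as low as 10% can erode the power of a validation test can be made concrete with a small Bayesian calculation (a sketch of the reasoning, not taken from the paper; the priors and pass rates are illustrative):

```python
def posterior_validity(prior, p_pass_if_valid, p_pass_if_invalid):
    """Probability the model is valid given that it passed the test
    (Bayes' rule with a point prior)."""
    num = p_pass_if_valid * prior
    return num / (num + p_pass_if_invalid * (1.0 - prior))

# A strict test (invalid models pass 1% of the time) makes a pass strong
# evidence; a lax test (10% false-positive rate) leaves much more doubt.
strict = posterior_validity(0.5, 0.95, 0.01)
lax = posterior_validity(0.5, 0.95, 0.10)
```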

  4. Dosimetric accuracy of a treatment planning system for actively scanned proton beams and small target volumes: Monte Carlo and experimental validation

    NASA Astrophysics Data System (ADS)

    Magro, G.; Molinelli, S.; Mairani, A.; Mirandola, A.; Panizza, D.; Russo, S.; Ferrari, A.; Valvo, F.; Fossati, P.; Ciocca, M.

    2015-09-01

This study was performed to evaluate the accuracy of a commercial treatment planning system (TPS) in optimising proton pencil beam dose distributions for small targets of different sizes (5-30 mm side) located at increasing depths in water. The TPS analytical algorithm was benchmarked against experimental data and the FLUKA Monte Carlo (MC) code, previously validated for the selected beam-line. We tested the Siemens syngo® TPS plan optimisation module for water cubes, fixing the configurable parameters at clinical standards, with homogeneous target coverage to a 2 Gy (RBE) dose prescription as the unique goal. Plans were delivered, and the dose at each volume centre was measured in water with a calibrated PTW Advanced Markus® chamber. An EBT3® film was also positioned at the phantom entrance window for the acquisition of 2D dose maps. Discrepancies between TPS-calculated and MC-simulated values were mainly due to the different lateral spread modelling and were found to be related to the field-to-spot size ratio. The accuracy of the TPS proved to be clinically acceptable in all cases except very small and shallow volumes. In this context, the use of MC to validate TPS results proved to be a reliable procedure for pre-treatment plan verification.

  5. Dosimetric accuracy of a treatment planning system for actively scanned proton beams and small target volumes: Monte Carlo and experimental validation.

    PubMed

    Magro, G; Molinelli, S; Mairani, A; Mirandola, A; Panizza, D; Russo, S; Ferrari, A; Valvo, F; Fossati, P; Ciocca, M

    2015-09-07

This study was performed to evaluate the accuracy of a commercial treatment planning system (TPS) in optimising proton pencil beam dose distributions for small targets of different sizes (5-30 mm side) located at increasing depths in water. The TPS analytical algorithm was benchmarked against experimental data and the FLUKA Monte Carlo (MC) code, previously validated for the selected beam-line. We tested the Siemens syngo® TPS plan optimisation module for water cubes, fixing the configurable parameters at clinical standards, with homogeneous target coverage to a 2 Gy (RBE) dose prescription as the unique goal. Plans were delivered, and the dose at each volume centre was measured in water with a calibrated PTW Advanced Markus® chamber. An EBT3® film was also positioned at the phantom entrance window for the acquisition of 2D dose maps. Discrepancies between TPS-calculated and MC-simulated values were mainly due to the different lateral spread modelling and were found to be related to the field-to-spot size ratio. The accuracy of the TPS proved to be clinically acceptable in all cases except very small and shallow volumes. In this context, the use of MC to validate TPS results proved to be a reliable procedure for pre-treatment plan verification.

  6. Validation of Multitemperature Nozzle Flow Code

    NASA Technical Reports Server (NTRS)

Park, Chul; Lee, Seung-Ho

    1994-01-01

The computer code NOZNT (nozzle in n-temperatures), which calculates one-dimensional flows of partially dissociated and ionized air in an expanding nozzle, is tested against three existing sets of experimental data taken in arcjet wind tunnels. The code accounts for the differences among various temperatures, i.e., the translational-rotational temperature, the vibrational temperatures of individual molecular species, and the electron-electronic temperature, as well as the effects of impurities. The experimental data considered are (1) spectroscopic emission data; (2) electron-beam data on vibrational temperature; and (3) mass-spectrometric species concentration data. It is shown that the impurities are inconsequential for the arcjet flows, and the NOZNT code is validated by numerically reproducing the experimental data.

  7. Experimental validation of docking and capture using space robotics testbeds

    NASA Technical Reports Server (NTRS)

    Spofford, John

    1991-01-01

    Docking concepts include capture, berthing, and docking. The definitions of these terms, consistent with AIAA, are as follows: (1) capture (grasping)--the use of a manipulator to make initial contact and attachment between transfer vehicle and a platform; (2) berthing--positioning of a transfer vehicle or payload into platform restraints using a manipulator; and (3) docking--propulsive mechanical connection between vehicle and platform. The combination of the capture and berthing operations is effectively the same as docking; i.e., capture (grasping) + berthing = docking. These concepts are discussed in terms of Martin Marietta's ability to develop validation methods using robotics testbeds.

  8. Design and experimental validation of a simple controller for a multi-segment magnetic crawler robot

    NASA Astrophysics Data System (ADS)

    Kelley, Leah; Ostovari, Saam; Burmeister, Aaron B.; Talke, Kurt A.; Pezeshkian, Narek; Rahimi, Amin; Hart, Abraham B.; Nguyen, Hoa G.

    2015-05-01

A novel multi-segmented magnetic crawler robot has been designed for ship hull inspection. In its simplest version, passive linkages that provide two degrees of relative motion connect the front and rear driving modules, so the robot can twist and turn. This permits navigation over surface discontinuities while maintaining adhesion to the hull. During operation, the magnetic crawler receives forward and turning velocity commands from either a tele-operator or a high-level autonomous control computer, and a low-level embedded microcomputer handles the commands to the driving motors. This paper presents the development of a simple, low-level, leader-follower controller that permits the rear module to follow the front module. The kinematics and dynamics of the two-module magnetic crawler robot are described. The robot's geometry, kinematic constraints, and the user-commanded velocities are used to calculate the desired instantaneous center of rotation and the corresponding central-linkage angle necessary for the back module to follow the front module when turning. The commands to the rear driving motors are determined by applying PID control to the error between the desired and measured linkage angle. The controller is designed and tested using MATLAB Simulink, then implemented and tested on an early two-module magnetic crawler prototype robot. Results of the simulations and experimental validation of the controller design are presented.
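The low-level loop described above, PID control on the error between desired and measured linkage angle, can be sketched as follows (the gains, timestep, and simple integrator plant are illustrative assumptions, not values from the paper):

```python
class PID:
    """Minimal discrete PID controller."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, desired, measured):
        error = desired - measured
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Drive a simple integrator plant (angle rate = command) to a 0.3 rad setpoint.
pid = PID(kp=2.0, ki=0.5, kd=0.1, dt=0.01)
angle = 0.0
for _ in range(2000):  # 20 s of simulated time
    angle += pid.update(0.3, angle) * 0.01
```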

  9. Modelling Short-Term Maximum Individual Exposure from Airborne Hazardous Releases in Urban Environments. Part II: Validation of a Deterministic Model with Wind Tunnel Experimental Data.

    PubMed

    Efthimiou, George C; Bartzis, John G; Berbekar, Eva; Hertwig, Denise; Harms, Frank; Leitl, Bernd

    2015-06-26

The capability to predict short-term maximum individual exposure is very important for several applications including, for example, deliberate or accidental releases of hazardous substances, odour fluctuations, or material flammability level exceedance. Recently, the authors proposed a simple approach relating maximum individual exposure to parameters such as the fluctuation intensity and the concentration integral time scale. In the first part of this study (Part I), the methodology was validated against field measurements, which are governed by the natural variability of atmospheric boundary conditions. In Part II of this study, an in-depth validation of the approach is performed using reference data recorded under truly stationary and well-documented flow conditions. For this reason, a boundary-layer wind-tunnel experiment was used. The experimental dataset includes 196 time-resolved concentration measurements which capture the dispersion from a continuous point source within an urban model of semi-idealized complexity. The data analysis allowed the improvement of an important model parameter. The model performed very well in predicting the maximum individual exposure, with 95% of predictions within a factor of two of observations. For large time intervals, an exponential correction term was introduced into the model based on the experimental observations. The new model is capable of predicting all time intervals, with 100% of predictions within a factor of two of observations.
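The "factor of two of observations" (FAC2) metric used above can be computed as follows (a generic sketch; the pairing and positivity filtering are our assumptions, not details from the paper):

```python
def fac2(predicted, observed):
    """Fraction of pairs whose prediction/observation ratio lies in [0.5, 2]."""
    pairs = [(p, o) for p, o in zip(predicted, observed) if p > 0 and o > 0]
    hits = sum(1 for p, o in pairs if 0.5 <= p / o <= 2.0)
    return hits / len(pairs)

# Two of the four predictions below fall within a factor of two of the
# observations, so FAC2 = 0.5.
score = fac2([1.0, 1.9, 10.0, 0.4], [1.0, 1.0, 1.0, 1.0])
```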

  10. Validation of High-Fidelity Reactor Physics Models for Support of the KJRR Experimental Campaign in the Advanced Test Reactor

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nigg, David W.; Nielsen, Joseph W.; Norman, Daren R.

The Korea Atomic Energy Research Institute is currently in the process of qualifying a Low-Enriched Uranium fuel element design for the new Ki-Jang Research Reactor (KJRR). As part of this effort, a prototype KJRR fuel element was irradiated for several operating cycles in the Northeast Flux Trap of the Advanced Test Reactor (ATR) at the Idaho National Laboratory. The KJRR fuel element contained a very large quantity of fissile material (618 g 235U) in comparison with historical ATR experiment standards (<1 g 235U), and its presence in the ATR flux trap was expected to create a neutronic configuration that would be well outside of the approved validation envelope for the reactor physics analysis methods used to support ATR operations. Accordingly it was necessary, prior to high-power irradiation of the KJRR fuel element in the ATR, to conduct an extensive set of new low-power physics measurements with the KJRR fuel element installed in the ATR Critical Facility (ATRC), a companion facility to the ATR that is located in an immediately adjacent building, sharing the same fuel handling and storage canal. The new measurements had the objective of expanding the validation envelope for the computational reactor physics tools used to support ATR operations and safety analysis to include the planned KJRR irradiation in the ATR and similar experiments that are anticipated in the future. The computational and experimental results demonstrated that the neutronic behavior of the KJRR fuel element in the ATRC is well understood, both in terms of its general effects on core excess reactivity and fission power distributions and its effects on the calibration of the core lobe power measurement system, as well as in terms of its own internal fission rate distribution and total fission power per unit ATRC core power.
Taken as a whole, these results have significantly extended the ATR physics validation envelope, thereby enabling an entirely new class of irradiation experiments.

  11. Experimental annotation of the human genome using microarray technology.

    PubMed

    Shoemaker, D D; Schadt, E E; Armour, C D; He, Y D; Garrett-Engele, P; McDonagh, P D; Loerch, P M; Leonardson, A; Lum, P Y; Cavet, G; Wu, L F; Altschuler, S J; Edwards, S; King, J; Tsang, J S; Schimmack, G; Schelter, J M; Koch, J; Ziman, M; Marton, M J; Li, B; Cundiff, P; Ward, T; Castle, J; Krolewski, M; Meyer, M R; Mao, M; Burchard, J; Kidd, M J; Dai, H; Phillips, J W; Linsley, P S; Stoughton, R; Scherer, S; Boguski, M S

    2001-02-15

    The most important product of the sequencing of a genome is a complete, accurate catalogue of genes and their products, primarily messenger RNA transcripts and their cognate proteins. Such a catalogue cannot be constructed by computational annotation alone; it requires experimental validation on a genome scale. Using 'exon' and 'tiling' arrays fabricated by ink-jet oligonucleotide synthesis, we devised an experimental approach to validate and refine computational gene predictions and define full-length transcripts on the basis of co-regulated expression of their exons. These methods can provide more accurate gene numbers and allow the detection of mRNA splice variants and identification of the tissue- and disease-specific conditions under which genes are expressed. We apply our technique to chromosome 22q under 69 experimental condition pairs, and to the entire human genome under two experimental conditions. We discuss implications for more comprehensive, consistent and reliable genome annotation, more efficient, full-length complementary DNA cloning strategies and application to complex diseases.
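Grouping exons into transcripts by co-regulated expression, as described above, reduces to comparing expression profiles across conditions. A minimal sketch (the correlation measure, threshold, and function names are our assumptions, not the paper's actual algorithm):

```python
def pearson(x, y):
    """Pearson correlation between two expression profiles."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    var_x = sum((a - mx) ** 2 for a in x)
    var_y = sum((b - my) ** 2 for b in y)
    return cov / (var_x * var_y) ** 0.5

def same_transcript(exon_a, exon_b, threshold=0.9):
    """Call two exons co-regulated if their profiles correlate strongly
    across the experimental conditions."""
    return pearson(exon_a, exon_b) >= threshold
```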

  12. Achieving external validity in home advantage research: generalizing crowd noise effects

    PubMed Central

    Myers, Tony D.

    2014-01-01

Different factors have been postulated to explain the home advantage phenomenon in sport. One plausible explanation investigated has been the influence of a partisan home crowd on sports officials' decisions. Different types of studies have tested the crowd influence hypothesis, including purposefully designed experiments. However, while experimental studies investigating crowd influences have high levels of internal validity, they suffer from a lack of external validity; decision-making in a laboratory setting bears little resemblance to decision-making in live sports settings. This focused review initially considers threats to external validity in applied and theoretical experimental research. It then discusses how such threats can be addressed using representative design, focusing on a recently published study that arguably provides the first experimental evidence of the impact of live crowd noise on officials in sport. The findings of this controlled experiment, conducted in a real tournament setting, offer a level of confirmation of the findings of laboratory studies in the area. Finally, directions for future research and the future conduct of crowd noise studies are discussed. PMID:24917839

  13. [Animal experimentation, computer simulation and surgical research].

    PubMed

    Carpentier, Alain

    2009-11-01

We live in a digital world. In medicine, computers are providing new tools for data collection, imaging, and treatment. During research and development of complex technologies and devices such as artificial hearts, computer simulation can provide more reliable information than experimentation on large animals. In these specific settings, animal experimentation should serve more to validate computer models of complex devices than to demonstrate their reliability.

  14. FY2017 Pilot Project Plan for the Nuclear Energy Knowledge and Validation Center Initiative

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ren, Weiju

To prepare for technical development of computational code validation under the Nuclear Energy Knowledge and Validation Center (NEKVAC) initiative, several meetings were held by a group of experts from the Idaho National Laboratory (INL) and the Oak Ridge National Laboratory (ORNL) to develop requirements for, and formulate a structure for, a transient fuel database by leveraging existing resources. It was concluded in these meetings that a pilot project is needed to address the most fundamental issues, one that can generate immediate stimulus for near-future validation developments as well as long-lasting benefits for NEKVAC operation. The present project is proposed based on the consensus of these discussions. Analysis of common scenarios in code validation indicates that the incapability of acquiring satisfactory validation data is often a showstopper that must first be tackled before any confident validation developments can be carried out. Validation data are usually found scattered in different places, most likely with interrelationships among the data not well documented, incomplete with information for some parameters missing, nonexistent, or unrealistic to generate experimentally. Furthermore, with very different technical backgrounds, the modeler, the experimentalist, and the knowledgebase developer that must be involved in validation data development often cannot communicate effectively without a data package template that is representative of the data structure for the information domain of interest to the desired code validation. This pilot project is proposed to use the legendary TREAT Experiments Database to provide core elements for creating an ideal validation data package. Data gaps and missing data interrelationships will be identified from these core elements. All the identified missing elements will then be filled in with experimental data if available from other existing sources, or with dummy data if nonexistent.
The resulting

  15. Knowledge Based Systems (KBS) Verification, Validation, Evaluation, and Testing (VVE&T) Bibliography: Topical Categorization

    DTIC Science & Technology

    2003-03-01

Different?," Jour. of Experimental & Theoretical Artificial Intelligence, Special Issue on AI for Systems Validation and Verification, 12(4), 2000, pp. ... Hamilton, D., "Experiences in Improving the State of Practice in Verification and Validation of Knowledge-Based Systems," Workshop Notes of the AAAI ... Unsuspected Power of the Standard Turing Test," Jour. of Experimental & Theoretical Artificial Intelligence, 12, 2000, pp. 331-340. [30] Gaschnig

  16. IFMIF: overview of the validation activities

    NASA Astrophysics Data System (ADS)

    Knaster, J.; Arbeiter, F.; Cara, P.; Favuzza, P.; Furukawa, T.; Groeschel, F.; Heidinger, R.; Ibarra, A.; Matsumoto, H.; Mosnier, A.; Serizawa, H.; Sugimoto, M.; Suzuki, H.; Wakai, E.

    2013-11-01

    The Engineering Validation and Engineering Design Activities (EVEDA) for the International Fusion Materials Irradiation Facility (IFMIF), an international collaboration under the Broader Approach Agreement between the Government of Japan and EURATOM, aim at allowing a rapid construction phase of IFMIF in due time and with an understanding of the costs involved. The three main facilities of IFMIF, (1) the Accelerator Facility, (2) the Target Facility and (3) the Test Facility, are the subject of validation activities that include the construction of either full-scale prototypes or smartly devised scaled-down facilities that allow a straightforward extrapolation to IFMIF needs. By July 2013, the engineering design activities of IFMIF had matured with the delivery of an Intermediate IFMIF Engineering Design Report (IIEDR) supported by experimental results. The installation of a 1.125 MW linac (125 mA of 9 MeV deuterons) started in March 2013 in Rokkasho (Japan). The world's largest liquid Li test loop is running in Oarai (Japan) with an ambitious experimental programme for the years ahead. A full-scale high-flux test module that will house ∼1000 small specimens, developed jointly in Europe and Japan for the fusion programme, has been constructed by KIT (Karlsruhe) together with its He gas cooling loop. A full-scale medium-flux test module for on-line creep measurement has been validated by CRPP (Villigen).

  17. Optimization and experimental validation of stiff porous phononic plates for widest complete bandgap of mixed fundamental guided wave modes

    NASA Astrophysics Data System (ADS)

    Hedayatrasa, Saeid; Kersemans, Mathias; Abhary, Kazem; Uddin, Mohammad; Van Paepegem, Wim

    2018-01-01

    Phononic crystal plates (PhPs) have promising applications in the manipulation of guided waves for the design of low-loss acoustic devices and built-in acoustic metamaterial lenses in plate structures. The prominent feature of phononic crystals is the existence of frequency bandgaps over which waves are stopped, or are resonated and guided within appropriate defects. Maximized bandgaps of PhPs are therefore desirable to enhance their phononic controllability. Porous PhPs produced by perforating a uniform background plate, in which the porous interfaces act as strong reflectors of wave energy, are relatively easy to produce. However, research into the optimization of porous PhPs and experimental validation of the achieved topologies has been very limited, and particularly focused on bandgaps of flexural (asymmetric) wave modes. In this paper, porous PhPs are optimized through an efficient multiobjective genetic algorithm for the widest complete bandgap of mixed fundamental guided wave modes (symmetric and asymmetric) and maximized stiffness. The Pareto front of the optimization is analyzed, and the variation of bandgap efficiency with stiffness is presented for various optimized topologies. Selected optimized topologies from the stiff and compliant regimes of the Pareto front are manufactured by water-jetting an aluminum plate, and their promising bandgap efficiency is experimentally observed. An optimized Pareto topology is also chosen and manufactured by laser-cutting a Plexiglas (PMMA) plate, and its performance in self-collimation and focusing of guided waves is verified against calculated dispersion properties.
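
    The multiobjective optimization above rests on Pareto dominance between the two objectives, here both maximized (complete bandgap width and stiffness). As an illustrative sketch only, not the authors' genetic-algorithm implementation, the dominance test and front extraction can be written as:

```python
# Illustrative sketch (not the paper's code): Pareto dominance and front
# extraction for two maximization objectives, e.g. complete bandgap width
# and effective stiffness of a candidate porous-plate topology.

def dominates(a, b):
    """True if design a dominates design b (all objectives maximized)."""
    return all(x >= y for x, y in zip(a, b)) and any(x > y for x, y in zip(a, b))

def pareto_front(designs):
    """Return the non-dominated subset of (bandgap_width, stiffness) tuples."""
    return [d for d in designs if not any(dominates(o, d) for o in designs if o != d)]

# Invented candidate designs; the (0.40, 0.4) design is dominated and drops out.
candidates = [(0.30, 0.9), (0.45, 0.5), (0.20, 1.0), (0.40, 0.4)]
print(pareto_front(candidates))
```

    A genetic algorithm such as the one described would repeatedly apply this kind of dominance ranking when selecting parents for the next generation.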

  18. Validation Results for LEWICE 2.0. [Supplement

    NASA Technical Reports Server (NTRS)

    Wright, William B.; Rutkowski, Adam

    1999-01-01

    Two CD-ROMs contain the experimental ice shapes and code predictions used for validation of LEWICE 2.0 (see NASA/CR-1999-208690, CASI ID 19990021235). The data include ice shapes from both experiment and LEWICE, all of the input and output files for the LEWICE cases, JPG files of all plots generated, an electronic copy of the text of the validation report, and a Microsoft Excel(R) spreadsheet containing all of the quantitative measurements taken. The LEWICE source code and executable are not contained on the discs.

  19. Multi-actuators vehicle collision avoidance system - Experimental validation

    NASA Astrophysics Data System (ADS)

    Hamid, Umar Zakir Abdul; Zakuan, Fakhrul Razi Ahmad; Akmal Zulkepli, Khairul; Zulfaqar Azmi, Muhammad; Zamzuri, Hairi; Rahman, Mohd Azizi Abdul; Aizzat Zakaria, Muhammad

    2018-04-01

    The Insurance Institute for Highway Safety (IIHS) of the United States of America has reported that a significant share of road accidents could be prevented if more automated active safety applications, including collision avoidance systems, were adopted into vehicles. Autonomous intervention by active steering and braking systems in hazardous scenarios can aid the driver in mitigating collisions. In this work, a real-time platform for a multi-actuator vehicle collision avoidance system is developed, as part of a continuing research effort to develop a fully autonomous vehicle in Malaysia. The vehicle is a modular platform that can be utilized for different research purposes and is denominated the Intelligent Drive Project (iDrive). The proposed collision avoidance design is validated in a controlled environment, where the coupled longitudinal and lateral motion control system is expected to provide the desired braking and steering actuation in the presence of a frontal static obstacle. Results indicate the ability of the platform to yield multi-actuator collision avoidance navigation in the hazardous scenario, thus avoiding the obstacle. The findings of this work are beneficial for the development of more complex and nonlinear real-time collision avoidance work in the future.

  20. Crack Detection in Fibre Reinforced Plastic Structures Using Embedded Fibre Bragg Grating Sensors: Theory, Model Development and Experimental Validation

    PubMed Central

    Pereira, G. F.; Mikkelsen, L. P.; McGugan, M.

    2015-01-01

    In a fibre-reinforced polymer (FRP) structure designed using the emerging damage tolerance and structural health monitoring philosophy, sensors and models that describe crack propagation will enable a structure to operate despite the presence of damage by fully exploiting the material’s mechanical properties. When applying this concept to different structures, sensor systems and damage types, a combination of damage mechanics, monitoring technology, and modelling is required. The primary objective of this article is to demonstrate such a combination. This article is divided in three main topics: the damage mechanism (delamination of FRP), the structural health monitoring technology (fibre Bragg gratings to detect delamination), and the finite element method model of the structure that incorporates these concepts into a final and integrated damage-monitoring concept. A novel method for assessing a crack growth/damage event in fibre-reinforced polymer or structural adhesive-bonded structures using embedded fibre Bragg grating (FBG) sensors is presented by combining conventional measured parameters, such as wavelength shift, with parameters associated with measurement errors, typically ignored by the end-user. Conjointly, a novel model for sensor output prediction (virtual sensor) was developed using this FBG sensor crack monitoring concept and implemented in a finite element method code. The monitoring method was demonstrated and validated using glass fibre double cantilever beam specimens instrumented with an array of FBG sensors embedded in the material and tested using an experimental fracture procedure. The digital image correlation technique was used to validate the model prediction by correlating the specific sensor response caused by the crack with the developed model. PMID:26513653

  1. Hovering Dual-Spin Vehicle Groundwork for Bias Momentum Sizing Validation Experiment

    NASA Technical Reports Server (NTRS)

    Rothhaar, Paul M.; Moerder, Daniel D.; Lim, Kyong B.

    2008-01-01

    Angular bias momentum offers significant stability augmentation for hovering flight vehicles. With significant levels of stored angular momentum in the system, the vehicle's reliance on thrust vectoring for agility and disturbance rejection is greatly reduced. A methodical procedure for bias momentum sizing was developed in previous studies. The current study provides groundwork for experimental validation of that method using an experimental vehicle called the Dual-Spin Test Device, a thrust-levitated platform. Using measured data, the vehicle's thrust-vectoring units are modeled and a gust environment is designed and characterized. Control design is discussed. Preliminary experimental results for the vehicle constrained to three rotational degrees of freedom are compared to simulation for a case containing no bias momentum, to validate the simulation. A simulation of a bias-momentum-dominant case is presented.

  2. DEM modeling of ball mills with experimental validation: influence of contact parameters on charge motion and power draw

    NASA Astrophysics Data System (ADS)

    Boemer, Dominik; Ponthot, Jean-Philippe

    2017-01-01

    Discrete element method simulations of a 1:5-scale laboratory ball mill are presented in this paper to study the influence of the contact parameters on the charge motion and the power draw. The position density limit is introduced as an efficient mathematical tool to describe and compare the macroscopic charge motion in different scenarios, among others with different values of the contact parameters. While the charge motion and the power draw are relatively insensitive to the stiffness and the damping coefficient of the linear spring-slider-damper contact law, the coefficient of friction has a strong influence since it controls the sliding propensity of the charge. Based on experimental calibration and validation by charge motion photographs and power draw measurements, the descriptive and predictive capabilities of the position density limit and the discrete element method are demonstrated: the real position of the charge is precisely delimited by the respective position density limit, and the power draw can be predicted with an accuracy of about 5%.
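
    The linear spring-slider-damper contact law referred to above can be sketched as a simple force law per contact; this is an illustrative simplification with placeholder parameter values (k, c, mu), not the values calibrated in the paper:

```python
# Illustrative sketch of a linear spring-slider-damper contact law of the
# kind named in the abstract; stiffness k (N/m), damping c (N*s/m) and
# friction coefficient mu are placeholder values, not calibrated ones.

def contact_forces(overlap, overlap_rate, slip_speed, k=1e6, c=50.0, mu=0.3):
    """Return (normal, tangential) contact force magnitudes for one contact."""
    f_n = max(0.0, k * overlap + c * overlap_rate)  # no tensile contact force
    f_t = mu * f_n if slip_speed > 0 else 0.0       # Coulomb slider limit
    return f_n, f_t

f_n, f_t = contact_forces(overlap=1e-4, overlap_rate=0.01, slip_speed=0.2)
print(f_n, f_t)  # normal force 100.5 N; friction capped at mu * f_n
```

    The abstract's sensitivity finding maps directly onto this form: varying k and c changes f_n only weakly for stiff contacts, while mu scales the tangential (sliding) force that governs charge motion.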

  3. Experimental Validation of Lightning-Induced Electromagnetic (Indirect) Coupling to Short Monopole Antennas

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Crull, E W; Brown Jr., C G; Perkins, M P

    2008-07-30

    For short monopoles in this low-power case, it has been shown that a simple circuit model is capable of accurate predictions for the shape and magnitude of the antenna response to lightning-generated electric field coupling effects, provided that the elements of the circuit model have accurate values. Numerical EM simulation can be used to provide more accurate values for the circuit elements than the simple analytical formulas, since the analytical formulas are used outside of their region of validity. However, even with the approximate analytical formulas the simple circuit model produces reasonable results, which would improve if more accurate analytical models were used. This report discusses the coupling analysis approaches taken to understand the interaction between a time-varying EM field and a short monopole antenna, within the context of lightning safety for nuclear weapons at DOE facilities. It describes the validation of a simple circuit model using laboratory study in order to understand the indirect coupling of energy into a part, and the resulting voltage. Results show that in this low-power case, the circuit model predicts peak voltages within approximately 32% using circuit component values obtained from analytical formulas and within about 13% using circuit component values obtained from numerical EM simulation. We note that the analytical formulas are used outside of their region of validity: first, the antenna is insulated rather than a bare wire, and there are likely fringing-field effects near the termination of the outer conductor that the formula does not take into account; also, the effective height formula is for a monopole directly over a ground plane, while in the time-domain measurement setup the monopole is elevated above the ground plane by about 1.5 inch (refer to Figure 5).

  4. Development and validation of LC-MS/MS method for the quantification of oxcarbazepine in human plasma using an experimental design.

    PubMed

    Srinubabu, Gedela; Ratnam, Bandaru Veera Venkata; Rao, Allam Appa; Rao, Medicherla Narasimha

    2008-01-01

    A rapid tandem mass spectrometric (MS/MS) method for the quantification of oxcarbazepine (OXB) in human plasma using imipramine as an internal standard (IS) has been developed and validated. Chromatographic separation was achieved isocratically on a C18 reversed-phase column within 3.0 min, using a mobile phase of acetonitrile-10 mM ammonium formate (90:10 v/v) at a flow rate of 0.3 ml/min. Quantitation was achieved using multiple reaction monitoring (MRM) at the transitions m/z 253→208 and m/z 281→86 for OXB and the IS, respectively. Calibration curves were linear over the concentration range 0.2-16 µg/ml (r>0.999), with a limit of quantification of 0.2 µg/ml. Analytical recoveries of OXB from spiked human plasma were in the range of 74.9 to 76.3%. A Plackett-Burman design was applied for screening of chromatographic and mass spectrometric factors; a factorial design was applied for optimization of the essential factors in the robustness study. A linear model was postulated, and a 2^3 full factorial design was employed to estimate the model coefficients for intermediate precision. More specifically, experimental design helps the researcher verify whether changes in factor values produce a statistically significant variation of the observed response. The strategy is most effective if statistical design is used in most or all stages of the screening and optimization process for future method validation in pharmacokinetic and bioequivalence studies.
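
    A 2^3 full factorial design of the kind used in the robustness study simply enumerates all eight combinations of three factors at two coded levels (-1/+1). A minimal sketch follows; the factor names are illustrative assumptions, not necessarily the factors the authors varied:

```python
# Illustrative 2^3 full factorial design matrix in coded units (-1/+1).
# The three factor names are hypothetical examples for an LC-MS/MS
# robustness study, not taken from the paper.
from itertools import product

factors = ["flow_rate", "buffer_conc", "column_temp"]
design = list(product((-1, +1), repeat=len(factors)))  # 2^3 = 8 runs

for run, levels in enumerate(design, start=1):
    print(run, dict(zip(factors, levels)))
```

    Fitting a linear model to responses measured at these eight runs yields the main-effect coefficients used to judge whether each factor significantly perturbs the response.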

  5. Process simulation and experimental validation of Hot Metal Gas Forming with new press hardening steels

    NASA Astrophysics Data System (ADS)

    Paul, A.; Reuther, F.; Neumann, S.; Albert, A.; Landgrebe, D.

    2017-09-01

    One field of work at the Fraunhofer Institute for Machine Tools and Forming Technology IWU in Chemnitz is applied industrial research in Hot Metal Gas Forming combined with press hardening in a single process step. In this paper, the results of investigations on new press hardening steels from SSAB AB (Docol®1800 Bor and Docol®2000 Bor) are presented. Hot tensile tests recorded by the project partner (University of West Bohemia, Faculty of Mechanical Engineering) were used to create a material model for thermo-mechanical forming simulations. For this purpose, the provided raw data were converted into flow curve approximations of the true stress-true strain curves for both materials and then integrated into an LS-DYNA simulation model of Hot Metal Gas Forming with all relevant boundary conditions and sub-stages. Preliminary experimental tests were carried out using a tool at room temperature to permit evaluation of the forming behaviour of Docol 1800 Bor and Docol 2000 Bor tubes as well as validation of the simulation model. Using this demonstrator geometry (outer diameter 57 mm, tube length 300 mm, wall thickness 1.5 mm), the intention was to perform a series of tests with different furnace temperatures (from 870 °C to 1035 °C), maximum internal pressures (up to 67 MPa) and pressure build-up rates (up to 40 MPa/s) to evaluate the formability of Docol 1800 Bor and Docol 2000 Bor. Selected demonstrator parts produced in this way were subsequently analysed by wall thickness and hardness measurements. The tests were carried out using the completely modernized Dunkes/AP&T HS3-1500 hydroforming press at the Fraunhofer IWU. In summary, a consistent simulation model with all relevant sub-stages was successfully established in LS-DYNA. The computational results show a high correlation with the experimental data regarding the thinning behaviour. Hot Metal Gas Forming of the demonstrator geometry was successfully established as well. Different hardness values …

  6. Experimental economics' inconsistent ban on deception.

    PubMed

    Hersch, Gil

    2015-08-01

    According to what I call the 'argument from public bads', if a researcher deceived subjects in the past, there is a chance that subjects will discount the information that a subsequent researcher provides, thus compromising the validity of the subsequent researcher's experiment. While this argument is taken to justify an existing informal ban on explicit deception in experimental economics, it can also apply to implicit deception, yet implicit deception is not banned and is sometimes used in experimental economics. Thus, experimental economists are being inconsistent when they appeal to the argument from public bads to justify banning explicit deception but not implicit deception. Copyright © 2015 Elsevier Ltd. All rights reserved.

  7. Development of the Biological Experimental Design Concept Inventory (BEDCI)

    ERIC Educational Resources Information Center

    Deane, Thomas; Nomme, Kathy; Jeffery, Erica; Pollock, Carol; Birol, Gulnur

    2014-01-01

    Interest in student conception of experimentation inspired the development of a fully validated 14-question inventory on experimental design in biology (BEDCI) by following established best practices in concept inventory (CI) design. This CI can be used to diagnose specific examples of non-expert-like thinking in students and to evaluate the…

  8. Use of the FDA nozzle model to illustrate validation techniques in computational fluid dynamics (CFD) simulations.

    PubMed

    Hariharan, Prasanna; D'Souza, Gavin A; Horner, Marc; Morrison, Tina M; Malinauskas, Richard A; Myers, Matthew R

    2017-01-01

    A "credible" computational fluid dynamics (CFD) model has the potential to provide a meaningful evaluation of safety in medical devices. One major challenge in establishing "model credibility" is to determine the required degree of similarity between the model and experimental results for the model to be considered sufficiently validated. This study proposes a "threshold-based" validation approach that provides a well-defined acceptance criterion for establishing model validity, which is a function of how close the simulation and experimental results are to the safety threshold. The validation criterion developed following the threshold approach is not only a function of the comparison error E (the difference between experiments and simulations) but also takes into account the risk to patient safety posed by E. The method is applicable to scenarios in which a safety threshold can be clearly defined (e.g., the viscous shear-stress threshold for hemolysis in blood-contacting devices). The applicability of the new validation approach was tested on the FDA nozzle geometry. The context of use (COU) was to evaluate whether the instantaneous viscous shear stress in the nozzle geometry at Reynolds numbers (Re) of 3500 and 6500 was below the commonly accepted threshold for hemolysis. The CFD results ("S") for velocity and viscous shear stress were compared with inter-laboratory experimental measurements ("D"). The uncertainties in the CFD and experimental results due to input parameter uncertainties were quantified following the ASME V&V 20 standard. The CFD models for both Re = 3500 and 6500 could not be sufficiently validated by a direct comparison between CFD and experimental results using Student's t-test. However, following the threshold-based approach, a Student's t-test comparing |S-D| and |Threshold-S| showed that, relative to the threshold, the CFD and experimental datasets for Re = 3500 were statistically similar and the model could be …
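
    The threshold-based comparison can be sketched with invented numbers (not the FDA inter-laboratory data): form the samples |S-D| (simulation vs. experiment) and |Threshold-S| (margin to the safety threshold) and compare them statistically, here with a hand-rolled Welch t statistic rather than any particular statistics package:

```python
# Illustrative sketch with invented numbers, not the FDA nozzle data.
# If |S - D| is statistically much smaller than |T - S|, the model error
# is small relative to the margin to the hemolysis threshold.
from statistics import mean, stdev

S = [310.0, 305.0, 298.0, 312.0]   # simulated peak viscous shear stress, Pa
D = [320.0, 300.0, 290.0, 318.0]   # experimental measurements, Pa
T = 600.0                          # assumed hemolysis threshold, Pa

err = [abs(s - d) for s, d in zip(S, D)]   # |S - D| per comparison
margin = [abs(T - s) for s in S]           # |Threshold - S| per comparison

def welch_t(a, b):
    """Two-sample Welch t statistic (unequal variances)."""
    va, vb = stdev(a) ** 2 / len(a), stdev(b) ** 2 / len(b)
    return (mean(a) - mean(b)) / (va + vb) ** 0.5

print(welch_t(err, margin))  # strongly negative here: error << margin
```

    In the actual method the statistic would be referred to a t distribution to obtain a significance level; the sketch only shows how the two samples are constructed.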

  9. MRI-based modeling for radiocarpal joint mechanics: validation criteria and results for four specimen-specific models.

    PubMed

    Fischer, Kenneth J; Johnson, Joshua E; Waller, Alexander J; McIff, Terence E; Toby, E Bruce; Bilgen, Mehmet

    2011-10-01

    The objective of this study was to validate the MRI-based joint contact modeling methodology in the radiocarpal joints by comparison of model results with invasive specimen-specific radiocarpal contact measurements from four cadaver experiments. We used a single validation criterion for multiple outcome measures to characterize the utility and overall validity of the modeling approach. For each experiment, a Pressurex film and a Tekscan sensor were sequentially placed into the radiocarpal joints during simulated grasp. Computer models were constructed based on MRI visualization of the cadaver specimens without load. Images were also acquired during the loaded configuration used with the direct experimental measurements. Geometric surface models of the radius, scaphoid and lunate (including cartilage) were constructed from the images acquired without the load. The carpal bone motions from the unloaded state to the loaded state were determined using a series of 3D image registrations. Cartilage thickness was assumed uniform at 1.0 mm with an effective compressive modulus of 4 MPa. Validation was based on experimental versus model contact area, contact force, average contact pressure and peak contact pressure for the radioscaphoid and radiolunate articulations. Contact area was also measured directly from images acquired under load and compared to the experimental and model data. Qualitatively, there was good correspondence between the MRI-based model data and experimental data, with consistent relative size, shape and location of radioscaphoid and radiolunate contact regions. Quantitative data from the model generally compared well with the experimental data for all specimens. Contact area from the MRI-based model was very similar to the contact area measured directly from the images. For all outcome measures except average and peak pressures, at least two specimen models met the validation criteria with respect to experimental measurements for both articulations.

  10. Issues and approach to develop validated analysis tools for hypersonic flows: One perspective

    NASA Technical Reports Server (NTRS)

    Deiwert, George S.

    1993-01-01

    Critical issues concerning the modeling of low-density hypervelocity flows where thermochemical nonequilibrium effects are pronounced are discussed. Emphasis is on the development of validated analysis tools, and the activity in the NASA Ames Research Center's Aerothermodynamics Branch is described. Inherent in the process is a strong synergism between ground test and real-gas computational fluid dynamics (CFD). Approaches to develop and/or enhance phenomenological models and incorporate them into computational flowfield simulation codes are discussed. These models have been partially validated with experimental data for flows where the gas temperature rises (compressive flows). Expanding flows, where temperatures drop, however, exhibit somewhat different behavior. Experimental data for these expanding flow conditions are sparse, and reliance must be placed on intuition and guidance from computational chemistry to model transport processes under these conditions. Ground-based experimental studies used to provide the necessary data for model development and validation are described, including the performance characteristics of high-enthalpy flow facilities such as shock tubes and ballistic ranges.

  11. A Surrogate Approach to the Experimental Optimization of Multielement Airfoils

    NASA Technical Reports Server (NTRS)

    Otto, John C.; Landman, Drew; Patera, Anthony T.

    1996-01-01

    The incorporation of experimental test data into the optimization process is accomplished through the use of Bayesian-validated surrogates. In the surrogate approach, a surrogate for the experiment (e.g., a response surface) serves in the optimization process. The validation step of the framework provides a qualitative assessment of the surrogate quality and bounds the surrogate-for-experiment error on designs "near" surrogate-predicted optimal designs. The utility of the framework is demonstrated through its application to the experimental selection of the trailing-edge flap position to achieve a design lift coefficient for a three-element airfoil.

  12. Validating LES for Jet Aeroacoustics

    NASA Technical Reports Server (NTRS)

    Bridges, James

    2011-01-01

    Engineers charged with making jet aircraft quieter have long dreamed of being able to see exactly how turbulent eddies produce sound, and this dream is now coming true with the advent of large eddy simulation (LES). Two obvious challenges remain: validating the LES codes at the resolution required to see the fluid-acoustic coupling, and interpreting the massive datasets that result. This paper primarily addresses the former: the use of advanced experimental techniques, such as particle image velocimetry (PIV) and Raman and Rayleigh scattering, to validate the computer codes and procedures used to create LES solutions. It also addresses the latter in discussing which measures critical for aeroacoustics should be used in validating LES codes. These new diagnostic techniques deliver measurements and flow statistics of increasing sophistication and capability, but what of their accuracy? And what measures should be used in validation? This paper argues that the issue of accuracy be addressed by cross-facility and cross-disciplinary examination of modern datasets, along with increased reporting of internal quality checks in PIV analysis. Further, it is argued that the appropriate validation metrics for aeroacoustic applications are increasingly complicated statistics that aeroacoustic theory has shown to be critical to flow-generated sound.

  13. Two-Speed Gearbox Dynamic Simulation Predictions and Test Validation

    NASA Technical Reports Server (NTRS)

    Lewicki, David G.; DeSmidt, Hans; Smith, Edward C.; Bauman, Steven W.

    2010-01-01

    Dynamic simulations and experimental validation tests were performed on a two-stage, two-speed gearbox as part of the drive system research activities of the NASA Fundamental Aeronautics Subsonics Rotary Wing Project. The gearbox was driven by two electromagnetic motors and had two electromagnetic, multi-disk clutches to control output speed. A dynamic model of the system was created which included a direct current electric motor with proportional-integral-derivative (PID) speed control, a two-speed gearbox with dual electromagnetically actuated clutches, and an eddy current dynamometer. A six degree-of-freedom model of the gearbox accounted for the system torsional dynamics and included gear, clutch, shaft, and load inertias as well as shaft flexibilities and a dry clutch stick-slip friction model. Experimental validation tests were performed on the gearbox in the NASA Glenn gear noise test facility. Gearbox output speed and torque as well as drive motor speed and current were compared to those from the analytical predictions. The experiments correlate very well with the predictions, thus validating the dynamic simulation methodologies.
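
    One element of the torsional model above, the dry clutch stick-slip friction, can be sketched as a simple torque law; the capacity and threshold values below are placeholders for illustration, not parameters of the NASA test gearbox:

```python
# Illustrative dry-clutch stick-slip torque law, one ingredient of a
# torsional drive-train model like the one described above. The torque
# capacity and stick threshold are hypothetical placeholder values.

def clutch_torque(slip_speed, applied_torque, capacity=50.0, eps=1e-3):
    """Torque transmitted by a dry clutch (N*m).

    Sticking (|slip_speed| < eps): transmit the applied torque, saturated
    at the friction capacity. Slipping: kinetic friction torque opposing
    the relative speed.
    """
    if abs(slip_speed) < eps:                          # stick phase
        return max(-capacity, min(capacity, applied_torque))
    return -capacity if slip_speed > 0 else capacity   # slip phase

print(clutch_torque(0.0, 30.0))   # stick: transmits the applied torque
print(clutch_torque(5.0, 30.0))   # slip: capacity torque opposing motion
```

    In a full simulation this torque feeds the torsional equations of motion for the gear, clutch, shaft, and load inertias at each time step.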

  14. Approach and Instrument Placement Validation

    NASA Technical Reports Server (NTRS)

    Ator, Danielle

    2005-01-01

    The Mars Exploration Rovers (MER) from the 2003 flight mission represent the state of the art in target approach and instrument placement on Mars. It currently takes 3 sols (Martian days) for a rover to place an instrument on a designated rock target that is about 10 to 20 m away. The objective of this project is to provide an experimentally validated single-sol instrument placement capability to future Mars missions. After numerous test runs on the Rocky8 rover under various test conditions, it has been observed that lighting conditions, shadow effects, target features and the initial target distance affect the performance and reliability of the tracking software. Additional software validation testing will be conducted in the months to come.

  15. Validation of Structures in the Protein Data Bank.

    PubMed

    Gore, Swanand; Sanz García, Eduardo; Hendrickx, Pieter M S; Gutmanas, Aleksandras; Westbrook, John D; Yang, Huanwang; Feng, Zukang; Baskaran, Kumaran; Berrisford, John M; Hudson, Brian P; Ikegawa, Yasuyo; Kobayashi, Naohiro; Lawson, Catherine L; Mading, Steve; Mak, Lora; Mukhopadhyay, Abhik; Oldfield, Thomas J; Patwardhan, Ardan; Peisach, Ezra; Sahni, Gaurav; Sekharan, Monica R; Sen, Sanchayita; Shao, Chenghua; Smart, Oliver S; Ulrich, Eldon L; Yamashita, Reiko; Quesada, Martha; Young, Jasmine Y; Nakamura, Haruki; Markley, John L; Berman, Helen M; Burley, Stephen K; Velankar, Sameer; Kleywegt, Gerard J

    2017-12-05

    The Worldwide PDB recently launched a deposition, biocuration, and validation tool: OneDep. At various stages of OneDep data processing, validation reports for three-dimensional structures of biological macromolecules are produced. These reports are based on recommendations of expert task forces representing the crystallography, nuclear magnetic resonance, and cryoelectron microscopy communities. The reports provide useful metrics with which depositors can evaluate the quality of the experimental data, the structural model, and the fit between them. The validation module is also available as a stand-alone web server and as a programmatically accessible web service. A growing number of journals require the official wwPDB validation reports (produced at biocuration) to accompany manuscripts describing macromolecular structures. Upon public release of the structure, the validation report becomes part of the public PDB archive. Geometric quality scores for proteins in the PDB archive have improved over the past decade. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.

  16. Validation of Lower Body Negative Pressure as an Experimental Model of Hemorrhage

    DTIC Science & Technology

    2013-12-19

    saving intervention (15). Therefore it is important to develop a valid model for understanding the physiology of human hemorrhage, especially during the ... hemorrhage to investigate the physiological responses to hypovolemia (7). LBNP causes a reduction in pressure surrounding the lower extremities. As ... from that observed with hemorrhage reflects the physiological mechanisms producing central hypovolemia. During LBNP, intravascular fluid shifts to the

  17. WEC-SIM Phase 1 Validation Testing -- Numerical Modeling of Experiments: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ruehl, Kelley; Michelen, Carlos; Bosma, Bret

    2016-08-01

    The Wave Energy Converter Simulator (WEC-Sim) is an open-source code jointly developed by Sandia National Laboratories and the National Renewable Energy Laboratory. It is used to model wave energy converters subjected to operational and extreme waves. In order for the WEC-Sim code to be beneficial to the wave energy community, code verification and physical model validation are necessary. This paper describes numerical modeling of the wave tank testing for the 1:33-scale experimental testing of the floating oscillating surge wave energy converter. The comparison between WEC-Sim and the Phase 1 experimental data set serves as code validation. This paper is a follow-up to the WEC-Sim paper on experimental testing, and describes the WEC-Sim numerical simulations for the floating oscillating surge wave energy converter.

  18. PSI-Center Simulations of Validation Platform Experiments

    NASA Astrophysics Data System (ADS)

    Nelson, B. A.; Akcay, C.; Glasser, A. H.; Hansen, C. J.; Jarboe, T. R.; Marklin, G. J.; Milroy, R. D.; Morgan, K. D.; Norgaard, P. C.; Shumlak, U.; Victor, B. S.; Sovinec, C. R.; O'Bryan, J. B.; Held, E. D.; Ji, J.-Y.; Lukin, V. S.

    2013-10-01

    The Plasma Science and Innovation Center (PSI-Center - http://www.psicenter.org) supports collaborating validation platform experiments with extended MHD simulations. Collaborators include the Bellan Plasma Group (Caltech), CTH (Auburn U), FRX-L (Los Alamos National Laboratory), HIT-SI (U Wash - UW), LTX (PPPL), MAST (Culham), Pegasus (U Wisc-Madison), PHD/ELF (UW/MSNW), SSX (Swarthmore College), TCSU (UW), and ZaP/ZaP-HD (UW). Modifications have been made to the NIMROD, HiFi, and PSI-Tet codes to specifically model these experiments, including mesh generation/refinement, non-local closures, appropriate boundary conditions (external fields, insulating BCs, etc.), and kinetic and neutral particle interactions. The PSI-Center is exploring application of validation metrics between experimental data and simulation results. Biorthogonal decomposition is proving to be a powerful method to compare global temporal and spatial structures for validation. Results from these simulation and validation studies, as well as an overview of the PSI-Center status, will be presented.
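    In practice, biorthogonal decomposition of space-time diagnostic data reduces to a singular value decomposition of the data matrix: the left singular vectors give the spatial structures ("topos"), the right singular vectors the temporal structures ("chronos"), and the singular values their weights. A minimal sketch on synthetic data (the field, array shapes, and noise level are hypothetical, not PSI-Center data):

    ```python
    import numpy as np

    def biorthogonal_decomposition(data):
        """SVD of an (n_space x n_time) data matrix: spatial modes ("topos"),
        mode weights, and temporal modes ("chronos")."""
        topos, weights, chronos = np.linalg.svd(data, full_matrices=False)
        return topos, weights, chronos

    # Synthetic, nearly rank-one field: one coherent mode plus weak noise
    rng = np.random.default_rng(0)
    x = np.linspace(0.0, 1.0, 50)
    t = np.linspace(0.0, 2.0 * np.pi, 200)
    field = np.outer(np.sin(np.pi * x), np.cos(3.0 * t))
    data = field + 0.01 * rng.standard_normal((50, 200))

    topos, weights, chronos = biorthogonal_decomposition(data)
    energy_fraction = weights[0] ** 2 / np.sum(weights ** 2)
    print(f"dominant-mode energy fraction: {energy_fraction:.3f}")  # close to 1
    ```

    Comparing the dominant topos/chronos pairs from experiment and simulation (rather than raw signals) is what makes this useful as a validation metric.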

  19. Further Validation of a CFD Code for Calculating the Performance of Two-Stage Light Gas Guns

    NASA Technical Reports Server (NTRS)

    Bogdanoff, David W.

    2017-01-01

    Earlier validations of a higher-order Godunov code for modeling the performance of two-stage light gas guns are reviewed. These validation comparisons were made between code predictions and experimental data from the NASA Ames 1.5" and 0.28" guns and covered muzzle velocities of 6.5 to 7.2 km/s. In the present report, five more series of code validation comparisons involving experimental data from the Ames 0.22" (1.28" pump tube diameter), 0.28", 0.50", 1.00" and 1.50" guns are presented. The total muzzle velocity range of the validation data presented herein is 3 to 11.3 km/s. The agreement between the experimental data and CFD results is judged to be very good. Muzzle velocities were predicted within 0.35 km/s for 74% of the cases studied, with maximum differences of 0.5 km/s and, for 4 out of 50 cases, 0.5 to 0.7 km/s.
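    Agreement statistics of this kind (fraction of cases within a tolerance, maximum difference) are simple to compute; a sketch with invented velocities, not the report's 50-case dataset:

    ```python
    import numpy as np

    # Hypothetical predicted vs. measured muzzle velocities in km/s
    # (illustrative values only)
    predicted = np.array([6.5, 7.0, 8.2, 9.1, 11.0])
    measured = np.array([6.3, 7.4, 8.0, 9.7, 10.9])

    diff = np.abs(predicted - measured)
    frac_within = np.mean(diff <= 0.35)  # fraction predicted within 0.35 km/s
    max_diff = diff.max()
    print(f"{100 * frac_within:.0f}% within 0.35 km/s; max difference {max_diff:.2f} km/s")
    ```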

  20. Experimental aerothermodynamic research of hypersonic aircraft

    NASA Technical Reports Server (NTRS)

    Cleary, Joseph W.

    1987-01-01

    The 2-D and 3-D advanced computer codes being developed for use in the design of such hypersonic aircraft as the National Aero-Space Plane require comparison of the computational results with a broad spectrum of experimental data to fully assess the validity of the codes. This is particularly true for complex flow fields with control surfaces present and for flows with separation, such as leeside flow. Therefore, the objective is to provide the hypersonic experimental database required for validation of advanced computational fluid dynamics (CFD) computer codes and for developing a more thorough understanding of the flow physics necessary for these codes. This is being done by implementing a comprehensive test program for a generic all-body hypersonic aircraft model in the NASA/Ames 3.5-foot Hypersonic Wind Tunnel over a broad range of test conditions to obtain pertinent surface and flowfield data. Results from the flow visualization portion of the investigation are presented.

  1. [Contribution of animal experimentation to pharmacology].

    PubMed

    Sassard, Jean; Hamon, Michel; Galibert, Francis

    2009-11-01

    Animal experimentation is of considerable importance in pharmacology and cannot yet be avoided when studying complex, highly integrated physiological functions. The use of animals has been drastically reduced in the classical phases of pharmacological research, for example when comparing several compounds belonging to the same pharmacological class. However, animal experiments remain crucial for generating and validating new therapeutic concepts. Three examples of such research, conducted in strict ethical conditions, will be used to illustrate the different ways in which animal experimentation has contributed to human therapeutics.

  2. Detection of tunnel excavation using fiber optic reflectometry: experimental validation

    NASA Astrophysics Data System (ADS)

    Linker, Raphael; Klar, Assaf

    2013-06-01

    Cross-border smuggling tunnels enable unmonitored movement of people and goods, and pose a severe threat to homeland security. In recent years, we have been working on the development of a system based on fiber-optic Brillouin time domain reflectometry (BOTDR) for detecting tunnel excavation. In two previous SPIE publications we have reported the initial development of the system as well as its validation using small-scale experiments. This paper reports, for the first time, results of full-scale experiments and discusses the system performance. The results confirm that distributed measurement of strain profiles in fiber cables buried at shallow depth enables detection of tunnel excavation, and by proper data processing, these measurements enable precise localization of the tunnel, as well as reasonable estimation of its depth.

  3. Validation of a Monte Carlo simulation of the Inveon PET scanner using GATE

    NASA Astrophysics Data System (ADS)

    Lu, Lijun; Zhang, Houjin; Bian, Zhaoying; Ma, Jianhua; Feng, Qiangjin; Chen, Wufan

    2016-08-01

    The purpose of this study is to validate the application of the GATE (Geant4 Application for Tomographic Emission) Monte Carlo simulation toolkit for modeling the performance characteristics of the Siemens Inveon small animal PET system. The simulation results were validated against experimental/published data in accordance with the NEMA NU-4 2008 protocol for standardized evaluation of spatial resolution, sensitivity, scatter fraction (SF) and noise equivalent counting rate (NECR) of a preclinical PET system. An agreement of less than 18% was obtained between the radial, tangential and axial spatial resolutions of the simulated and experimental results. The simulated peak NECR of the mouse-size phantom agreed with the experimental result, while for the rat-size phantom the simulated value was higher than the experimental result. The simulated and experimental SFs of the mouse- and rat-size phantoms both reached an agreement of less than 2%. These results demonstrate the feasibility of the GATE model to accurately simulate, within certain limits, all major performance characteristics of the Inveon PET system.
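    The agreement figures quoted above are relative differences between simulated and experimental values; a sketch of that comparison with invented numbers (not Inveon data):

    ```python
    def percent_agreement(simulated, experimental):
        """Relative difference in percent between a simulated and an
        experimental value (a common way to summarize NEMA-style comparisons)."""
        return 100.0 * abs(simulated - experimental) / abs(experimental)

    # Hypothetical radial spatial resolution (FWHM, mm); illustrative only
    sim_fwhm, exp_fwhm = 1.45, 1.65
    print(f"{percent_agreement(sim_fwhm, exp_fwhm):.1f}% difference")
    ```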

  4. Validation of NASA Thermal Ice Protection Computer Codes. Part 3; The Validation of Antice

    NASA Technical Reports Server (NTRS)

    Al-Khalil, Kamel M.; Horvath, Charles; Miller, Dean R.; Wright, William B.

    2001-01-01

    An experimental program was generated by the Icing Technology Branch at NASA Glenn Research Center to validate two ice protection simulation codes: (1) LEWICE/Thermal for transient electrothermal de-icing and anti-icing simulations, and (2) ANTICE for steady state hot gas and electrothermal anti-icing simulations. An electrothermal ice protection system was designed and constructed integral to a 36 inch chord NACA0012 airfoil. The model was fully instrumented with thermocouples, RTDs, and heat flux gages. Tests were conducted at several icing environmental conditions during a two-week period at the NASA Glenn Icing Research Tunnel. Experimental results of running-wet and evaporative cases were compared to the ANTICE computer code predictions and are presented in this paper.

  5. Examining students' views about validity of experiments: From introductory to Ph.D. students

    NASA Astrophysics Data System (ADS)

    Hu, Dehui; Zwickl, Benjamin M.

    2018-06-01

    We investigated physics students' epistemological views on measurements and validity of experimental results. The roles of experiments in physics have been underemphasized in previous research on students' personal epistemology, and there is a need for a broader view of personal epistemology that incorporates experiments. An epistemological framework incorporating the structure, methodology, and validity of scientific knowledge guided the development of an open-ended survey. The survey was administered to students in algebra-based and calculus-based introductory physics courses, upper-division physics labs, and physics Ph.D. students. Within our sample, we identified several differences in students' ideas about validity and uncertainty in measurement. The majority of introductory students justified the validity of results through agreement with theory or with results from others. Alternatively, Ph.D. students frequently justified the validity of results based on the quality of the experimental process and repeatability of results. When asked about the role of uncertainty analysis, introductory students tended to focus on the representational roles (e.g., describing imperfections, data variability, and human mistakes). However, advanced students focused on the inferential roles of uncertainty analysis (e.g., quantifying reliability, making comparisons, and guiding refinements). The findings suggest that lab courses could emphasize a variety of approaches to establish validity, such as by valuing documentation of the experimental process when evaluating the quality of student work. In order to emphasize the role of uncertainty in an authentic way, labs could provide opportunities to iterate, make repeated comparisons, and make decisions based on those comparisons.

  6. Acoustic-Structure Interaction in Rocket Engines: Validation Testing

    NASA Technical Reports Server (NTRS)

    Davis, R. Benjamin; Joji, Scott S.; Parks, Russel A.; Brown, Andrew M.

    2009-01-01

    While analyzing a rocket engine component, it is often necessary to account for any effects that adjacent fluids (e.g., liquid fuels or oxidizers) might have on the structural dynamics of the component. To better characterize the fully coupled fluid-structure system responses, an analytical approach that models the system as a coupled expansion of rigid wall acoustic modes and in vacuo structural modes has been proposed. The present work seeks to experimentally validate this approach. To experimentally observe well-coupled system modes, the test article and fluid cavities are designed such that the uncoupled structural frequencies are comparable to the uncoupled acoustic frequencies. The test measures the natural frequencies, mode shapes, and forced response of cylindrical test articles in contact with fluid-filled cylindrical and/or annular cavities. The test article is excited with a stinger and the fluid-loaded response is acquired using a laser-doppler vibrometer. The experimentally determined fluid-loaded natural frequencies are compared directly to the results of the analytical model. Due to the geometric configuration of the test article, the analytical model is found to be valid for natural modes with circumferential wave numbers greater than four. In the case of these modes, the natural frequencies predicted by the analytical model demonstrate excellent agreement with the experimentally determined natural frequencies.

  7. Experimental Validation of Plasma Metasurfaces as Tunable THz Reflectors

    NASA Astrophysics Data System (ADS)

    Colon Quinones, Roberto; Underwood, Thomas; Cappelli, Mark

    2016-10-01

    Measurements are presented which validate the use of plasma metasurfaces (PMs) as potential tunable THz reflectors. The PM considered here is an n x n array of laser produced plasma kernels generated by focusing the fundamental output from a 2 J/p Q-switched Nd:YAG laser through a multi-lens array (MLA) and into a gas of varying pressure. An M Squared Firefly-THz laser is used to generate a collimated pulse of THz light, which is then directed to the PM at varying angles of incidence. The reflected energy is measured using a Gentec-EO SDX-1187 joulemeter probe to characterize the surface impedance or reflectivity. In this presentation, we will compare the measured reflectance to values obtained from theoretical predictions and 3D finite-difference time-domain (FDTD) simulations. Work supported by the Air Force Office of Scientific Research (AFOSR). R. Colon Quinones and T. Underwood acknowledge the support of the Department of Defense (DoD) through the National Defense Science & Engineering Graduate Fellowship (NDSEG) Program.

  8. Simulation of the AC corona phenomenon with experimental validation

    NASA Astrophysics Data System (ADS)

    Villa, Andrea; Barbieri, Luca; Marco, Gondola; Malgesini, Roberto; Leon-Garzon, Andres R.

    2017-11-01

    The corona effect, and in particular the Trichel phenomenon, is an important aspect of plasma physics with many technical applications, such as pollution reduction and surface and medical treatments. The phenomenon is also associated with components used in the power industry, where it is, in many cases, the source of electromagnetic disturbance, noise and the production of undesired chemically active species. Although the power industry to date uses mainly alternating current (AC) transmission, most studies of the corona effect have been carried out with direct current (DC) sources. There is therefore technical interest in validating numerical codes capable of simulating the AC phenomenon. In this work we describe a set of partial differential equations comprehensive enough to reproduce the distinctive features of the corona in an AC regime. The model embeds selectable chemical databases, comprising tens of chemical species and hundreds of reactions, the thermal dynamics of neutral species, and photoionization. A large set of parameters, deduced from experiments and numerical estimations, is compared to assess the effectiveness of the proposed approach.

  9. Highlights of Transient Plume Impingement Model Validation and Applications

    NASA Technical Reports Server (NTRS)

    Woronowicz, Michael

    2011-01-01

    This paper describes highlights of an ongoing validation effort conducted to assess the viability of applying a set of analytic point source transient free molecule equations to model behavior ranging from molecular effusion to rocket plumes. The validation effort includes encouraging comparisons to both steady and transient studies involving experimental data and direct simulation Monte Carlo results. Finally, this model is applied to describe features of two exotic transient scenarios involving NASA Goddard Space Flight Center satellite programs.

  10. Validation of OVERFLOW for Supersonic Retropropulsion

    NASA Technical Reports Server (NTRS)

    Schauerhamer, Guy

    2012-01-01

    The goal is to softly land high-mass vehicles (tens of metric tons) on Mars. Supersonic Retropropulsion (SRP) is a potential method of deceleration. The current method of supersonic parachutes does not scale past 1 metric ton. CFD is of increasing importance since flight and experimental data at these conditions are difficult to obtain. CFD must first be validated at these conditions.

  11. Verification and Validation Studies for the LAVA CFD Solver

    NASA Technical Reports Server (NTRS)

    Moini-Yekta, Shayan; Barad, Michael F; Sozer, Emre; Brehm, Christoph; Housman, Jeffrey A.; Kiris, Cetin C.

    2013-01-01

    The verification and validation of the Launch Ascent and Vehicle Aerodynamics (LAVA) computational fluid dynamics (CFD) solver is presented. A modern strategy for verification and validation is described incorporating verification tests, validation benchmarks, continuous integration and version control methods for automated testing in a collaborative development environment. The purpose of the approach is to integrate the verification and validation process into the development of the solver and improve productivity. This paper uses the Method of Manufactured Solutions (MMS) for the verification of 2D Euler equations, 3D Navier-Stokes equations as well as turbulence models. A method for systematic refinement of unstructured grids is also presented. Verification using inviscid vortex propagation and flow over a flat plate is highlighted. Simulation results using laminar and turbulent flow past a NACA 0012 airfoil and ONERA M6 wing are validated against experimental and numerical data.
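    The Method of Manufactured Solutions verifies a solver by checking that the observed order of accuracy, computed from errors on successively refined grids, matches the scheme's formal order. A sketch of that calculation (the error values are hypothetical, not LAVA results):

    ```python
    import numpy as np

    def observed_order(errors, refinement_ratio=2.0):
        """Observed order of accuracy p between successive grids, each refined
        by refinement_ratio: p = log(e_coarse / e_fine) / log(ratio)."""
        e = np.asarray(errors, dtype=float)
        return np.log(e[:-1] / e[1:]) / np.log(refinement_ratio)

    # Hypothetical L2 errors from an MMS study on grids h, h/2, h/4, h/8
    errors = [4.0e-2, 1.0e-2, 2.5e-3, 6.25e-4]
    print(observed_order(errors))  # a second-order scheme gives p close to 2 throughout
    ```

    Verification passes when the observed order approaches the formal order as the grid is refined; a lower observed order usually signals a discretization bug or insufficient resolution.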

  12. Computational-experimental approach to drug-target interaction mapping: A case study on kinase inhibitors

    PubMed Central

    Ravikumar, Balaguru; Parri, Elina; Timonen, Sanna; Airola, Antti; Wennerberg, Krister

    2017-01-01

    Due to the relatively high costs and labor required for experimental profiling of the full target space of chemical compounds, various machine learning models have been proposed as cost-effective means to advance this process in terms of predicting the most potent compound-target interactions for subsequent verification. However, most of the model predictions lack direct experimental validation in the laboratory, making their practical benefits for drug discovery or repurposing applications largely unknown. Here, we therefore introduce and carefully test a systematic computational-experimental framework for the prediction and pre-clinical verification of drug-target interactions using a well-established kernel-based regression algorithm as the prediction model. To evaluate its performance, we first predicted unmeasured binding affinities in a large-scale kinase inhibitor profiling study, and then experimentally tested 100 compound-kinase pairs. The relatively high correlation of 0.77 (p < 0.0001) between the predicted and measured bioactivities supports the potential of the model for filling the experimental gaps in existing compound-target interaction maps. Further, we subjected the model to a more challenging task of predicting target interactions for a new candidate drug compound that lacks prior binding profile information. As a specific case study, we used tivozanib, an investigational VEGF receptor inhibitor with a currently unknown off-target profile. Among 7 kinases with high predicted affinity, we experimentally validated 4 new off-targets of tivozanib, namely the Src-family kinases FRK and FYN A, the non-receptor tyrosine kinase ABL1, and the serine/threonine kinase SLK. Our subsequent experimental validation protocol effectively avoids any possible information leakage between the training and validation data, and therefore enables rigorous model validation for practical applications. These results demonstrate that the kernel-based modeling approach
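    The prediction model above is a kernel-based regression; a generic kernel ridge regression of the same family can be sketched in a few lines. The 1-D toy "descriptors" and Gaussian kernel below are illustrative assumptions, not the study's actual features or kernel:

    ```python
    import numpy as np

    def kernel_ridge_fit_predict(K_train, y_train, K_test, lam=0.1):
        """Kernel ridge regression: solve (K + lam*I) alpha = y, then predict
        with K_test @ alpha. A generic stand-in for a kernel-based bioactivity model."""
        n = K_train.shape[0]
        alpha = np.linalg.solve(K_train + lam * np.eye(n), y_train)
        return K_test @ alpha

    # Toy 1-D "compound descriptors" with a Gaussian kernel (illustrative only)
    rng = np.random.default_rng(1)
    x_train = rng.uniform(-3.0, 3.0, 40)
    y_train = np.sin(x_train)
    x_test = np.linspace(-3.0, 3.0, 10)
    gauss = lambda a, b: np.exp(-(a[:, None] - b[None, :]) ** 2)

    y_pred = kernel_ridge_fit_predict(gauss(x_train, x_train), y_train,
                                      gauss(x_test, x_train))
    r = np.corrcoef(y_pred, np.sin(x_test))[0, 1]
    print(f"correlation on held-out points: {r:.2f}")
    ```

    The correlation between predicted and held-out values plays the same role here as the reported 0.77 correlation between predicted and measured bioactivities.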

  13. Si amorphization by focused ion beam milling: Point defect model with dynamic BCA simulation and experimental validation.

    PubMed

    Huang, J; Loeffler, M; Muehle, U; Moeller, W; Mulders, J J L; Kwakman, L F Tz; Van Dorp, W F; Zschech, E

    2018-01-01

    A Ga focused ion beam (FIB) is often used in transmission electron microscopy (TEM) sample preparation. In the case of a crystalline Si sample, an amorphous near-surface layer is formed by the FIB process. In order to optimize the FIB recipe by minimizing the amorphization, it is important to predict the amorphous layer thickness by simulation. Molecular Dynamics (MD) simulation has been used to describe the amorphization; however, it is limited by computational power for a realistic FIB process simulation. On the other hand, Binary Collision Approximation (BCA) simulation is able to simulate the ion-solid interaction process at a realistic scale and has been used to do so. In this study, a Point Defect Density approach is introduced into a dynamic BCA simulation that accounts for dynamic ion-solid interactions. We used this method to predict the amorphization of crystalline Si caused by FIB milling. To validate the method, dedicated TEM studies were performed, which show that the amorphous layer thickness predicted by the numerical simulation is consistent with the experimental data. In summary, the thickness of the near-surface Si amorphization layer caused by FIB milling can be well predicted using the Point Defect Density approach within the dynamic BCA model. Copyright © 2017 Elsevier B.V. All rights reserved.

  14. New Monte Carlo model of cylindrical diffusing fibers illustrates axially heterogeneous fluorescence detection: simulation and experimental validation

    PubMed Central

    Baran, Timothy M.; Foster, Thomas H.

    2011-01-01

    We present a new Monte Carlo model of cylindrical diffusing fibers that is implemented with a graphics processing unit. Unlike previously published models that approximate the diffuser as a linear array of point sources, this model is based on the construction of these fibers. This allows for accurate determination of fluence distributions and modeling of fluorescence generation and collection. We demonstrate that our model generates fluence profiles similar to a linear array of point sources, but reveals axially heterogeneous fluorescence detection. With axially homogeneous excitation fluence, approximately 90% of detected fluorescence is collected by the proximal third of the diffuser for μs'/μa = 8 in the tissue and 70 to 88% is collected in this region for μs'/μa = 80. Increased fluorescence detection by the distal end of the diffuser relative to the center section is also demonstrated. Validation of these results was performed by creating phantoms consisting of layered fluorescent regions. Diffusers were inserted into these layered phantoms and fluorescence spectra were collected. Fits to these spectra show quantitative agreement between simulated fluorescence collection sensitivities and experimental results. These results will be applicable to the use of diffusers as detectors for dosimetry in interstitial photodynamic therapy. PMID:21895311

  15. Modeling, construction and experimental validation of actuated rolling dynamics of the cylindrical Transforming Roving-Rolling Explorer (TRREx)

    NASA Astrophysics Data System (ADS)

    Edwin, L.; Mazzoleni, A.; Gemmer, T.; Ferguson, S.

    2017-03-01

    Planetary surface exploration technology over the past few years has seen significant advancements on multiple fronts. Robotic exploration platforms are becoming more sophisticated and capable of embarking on more challenging missions. More unconventional designs, particularly transforming architectures that have multiple modes of locomotion, are being studied. This work explores the capabilities of one such novel transforming rover called the Transforming Roving-Rolling Explorer (TRREx). Biologically inspired by the armadillo and the golden-wheel spider, the TRREx has two modes of locomotion: it can traverse on six wheels like a conventional rover on benign terrain, but can transform into a sphere when necessary to negotiate steep rugged slopes. The ability to self-propel in the spherical configuration, even in the absence of a negative gradient, increases the TRREx's versatility and its concept value. This paper describes construction and testing of a prototype cylindrical TRREx that demonstrates that "actuated rolling" can be achieved, and also presents a dynamic model of this prototype version of the TRREx that can be used to investigate the feasibility and value of such self-propelled locomotion. Finally, we present results that validate our dynamic model by comparing results from computer simulations made using the dynamic model to experimental results acquired from test runs using the prototype.

  16. Show and tell: disclosure and data sharing in experimental pathology.

    PubMed

    Schofield, Paul N; Ward, Jerrold M; Sundberg, John P

    2016-06-01

    Reproducibility of data from experimental investigations using animal models is increasingly under scrutiny because of the potentially negative impact of poor reproducibility on the translation of basic research. Histopathology is a key tool in biomedical research, in particular for the phenotyping of animal models to provide insights into the pathobiology of diseases. Failure to disclose and share crucial histopathological experimental details compromises the validity of the review process and reliability of the conclusions. We discuss factors that affect the interpretation and validation of histopathology data in publications and the importance of making these data accessible to promote replicability in research. © 2016. Published by The Company of Biologists Ltd.

  17. DoSSiER: Database of scientific simulation and experimental results

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wenzel, Hans; Yarba, Julia; Genser, Krzysztof

    The Geant4, GeantV and GENIE collaborations regularly perform validation and regression tests for simulation results. DoSSiER (Database of Scientific Simulation and Experimental Results) is being developed as a central repository to store the simulation results as well as the experimental data used for validation. DoSSiER can be easily accessed via a web application. In addition, a web service allows for programmatic access to the repository to extract records in json or xml exchange formats. In this paper, we describe the functionality and the current status of various components of DoSSiER as well as the technology choices we made.

  18. DoSSiER: Database of scientific simulation and experimental results

    DOE PAGES

    Wenzel, Hans; Yarba, Julia; Genser, Krzysztof; ...

    2016-08-01

    The Geant4, GeantV and GENIE collaborations regularly perform validation and regression tests for simulation results. DoSSiER (Database of Scientific Simulation and Experimental Results) is being developed as a central repository to store the simulation results as well as the experimental data used for validation. DoSSiER can be easily accessed via a web application. In addition, a web service allows for programmatic access to the repository to extract records in json or xml exchange formats. In this paper, we describe the functionality and the current status of various components of DoSSiER as well as the technology choices we made.

  19. Numerical and Experimental Validation of a New Damage Initiation Criterion

    NASA Astrophysics Data System (ADS)

    Sadhinoch, M.; Atzema, E. H.; Perdahcioglu, E. S.; van den Boogaard, A. H.

    2017-09-01

    Most commercial finite element software packages, like Abaqus, have a built-in coupled damage model where a damage evolution needs to be defined in terms of a single fracture energy value for all stress states. The Johnson-Cook criterion has been modified to be Lode parameter dependent and this Modified Johnson-Cook (MJC) criterion is used as a Damage Initiation Surface (DIS) in combination with the built-in Abaqus ductile damage model. An exponential damage evolution law has been used with a single fracture energy value. Ultimately, the simulated force-displacement curves are compared with experiments to validate the MJC criterion. 7 out of 9 fracture experiments were predicted accurately. The limitations and accuracy of the failure predictions of the newly developed damage initiation criterion will be discussed shortly.
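    An exponential damage evolution law driven by a single fracture energy has the general form d = 1 - exp(-W(u)/G_f), where W(u) is the work dissipated per unit area after damage initiation. A sketch of that evolution on a discretized displacement history (constant yield stress and all parameter values are hypothetical, and this is not Abaqus's internal implementation):

    ```python
    import numpy as np

    def exponential_damage(u_eq, sigma_y, G_f):
        """Exponential damage evolution d = 1 - exp(-W(u)/G_f), with the
        post-initiation plastic work W(u) integrated by the trapezoid rule."""
        increments = np.diff(u_eq) * 0.5 * (sigma_y[1:] + sigma_y[:-1])
        dissipated = np.concatenate(([0.0], np.cumsum(increments)))
        return 1.0 - np.exp(-dissipated / G_f)

    u = np.linspace(0.0, 0.1, 6)           # mm, hypothetical displacement history
    sigma = np.full_like(u, 400.0)         # MPa, idealized constant yield stress
    d = exponential_damage(u, sigma, G_f=20.0)  # N/mm, assumed fracture energy
    print(np.round(d, 3))  # grows monotonically from 0 toward 1
    ```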

  20. Experimental Validation Data for Computational Fluid Dynamics of Forced Convection on a Vertical Flat Plate

    DOE PAGES

    Harris, Jeff R.; Lance, Blake W.; Smith, Barton L.

    2015-08-10

    We present a computational fluid dynamics (CFD) validation dataset for turbulent forced convection on a vertical plate. The design of the apparatus is based on recent validation literature and provides a means to simultaneously measure boundary conditions (BCs) and system response quantities (SRQs). Important inflow quantities for Reynolds-Averaged Navier-Stokes (RANS) CFD are also measured. Data are acquired at two heating conditions and cover the range 40,000 < Re_x < 300,000, 357 < Re_δ2 < 813, and 0.02 < Gr/Re² < 0.232.

  1. Use of the FDA nozzle model to illustrate validation techniques in computational fluid dynamics (CFD) simulations

    PubMed Central

    Hariharan, Prasanna; D’Souza, Gavin A.; Horner, Marc; Morrison, Tina M.; Malinauskas, Richard A.; Myers, Matthew R.

    2017-01-01

    A “credible” computational fluid dynamics (CFD) model has the potential to provide a meaningful evaluation of safety in medical devices. One major challenge in establishing “model credibility” is to determine the required degree of similarity between the model and experimental results for the model to be considered sufficiently validated. This study proposes a “threshold-based” validation approach that provides a well-defined acceptance criterion, which is a function of how close the simulation and experimental results are to the safety threshold, for establishing model validity. The validation criterion developed following the threshold approach is not only a function of the Comparison Error, E (the difference between experiments and simulations), but also takes into account the risk to patient safety due to E. The method is applicable for scenarios in which a safety threshold can be clearly defined (e.g., the viscous shear-stress threshold for hemolysis in blood contacting devices). The applicability of the new validation approach was tested on the FDA nozzle geometry. The context of use (COU) was to evaluate if the instantaneous viscous shear stress in the nozzle geometry at Reynolds numbers (Re) of 3500 and 6500 was below the commonly accepted threshold for hemolysis. The CFD results (“S”) of velocity and viscous shear stress were compared with inter-laboratory experimental measurements (“D”). The uncertainties in the CFD and experimental results due to input parameter uncertainties were quantified following the ASME V&V 20 standard. The CFD models for both Re = 3500 and 6500 could not be sufficiently validated by performing a direct comparison between CFD and experimental results using the Student’s t-test. However, following the threshold-based approach, a Student’s t-test comparing |S-D| and |Threshold-S| showed that relative to the threshold, the CFD and experimental datasets for Re = 3500 were statistically similar and
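    The core of the threshold-based comparison is a Student's t-test between the comparison error |S-D| and the margin to the safety threshold |Threshold-S|. A sketch using Welch's t statistic on synthetic samples (all values are hypothetical, not the FDA nozzle data):

    ```python
    import numpy as np

    def welch_t(a, b):
        """Welch's t statistic for two samples with unequal variances."""
        va = a.var(ddof=1) / a.size
        vb = b.var(ddof=1) / b.size
        return (a.mean() - b.mean()) / np.sqrt(va + vb)

    # Synthetic viscous shear-stress samples in Pa (illustrative only)
    rng = np.random.default_rng(2)
    S = rng.normal(150.0, 10.0, 30)     # simulated values
    D = rng.normal(160.0, 12.0, 30)     # measured values
    threshold = 600.0                   # assumed hemolysis threshold

    t = welch_t(np.abs(S - D), np.abs(threshold - S))
    print(f"t = {t:.1f}")  # strongly negative: comparison error is far below the margin
    ```

    A strongly negative t here supports the conclusion that, relative to the threshold, simulation and experiment are statistically similar even when |S-D| alone would fail a direct comparison.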

  2. Validation of GC and HPLC systems for residue studies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Williams, M.

    1995-12-01

    For residue studies, GC and HPLC system performance must be validated prior to and during use. One excellent measure of system performance is the standard curve and the associated chromatograms used to construct that curve. The standard curve is a model of system response to an analyte over a specific time period, and is prima facie evidence of system performance beginning at the autosampler and proceeding through the injector, column, detector, electronics, data-capture device, and printer/plotter. This tool measures the performance of the entire chromatographic system; its power negates most of the benefits associated with costly and time-consuming validation of individual system components. Other measures of instrument and method validation will be discussed, including quality control charts and experimental designs for method validation.
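    A standard curve check of this kind is a linear fit of detector response versus analyte amount, with the coefficient of determination as a summary performance measure; a sketch with invented calibration points:

    ```python
    import numpy as np

    # Invented GC standard-curve points: analyte amount (ng) vs. peak area
    amount = np.array([0.5, 1.0, 2.0, 5.0, 10.0])
    area = np.array([1.1e3, 2.0e3, 4.1e3, 10.2e3, 19.8e3])

    slope, intercept = np.polyfit(amount, area, 1)
    fit = slope * amount + intercept
    ss_res = np.sum((area - fit) ** 2)
    ss_tot = np.sum((area - area.mean()) ** 2)
    r_squared = 1.0 - ss_res / ss_tot
    print(f"slope = {slope:.0f} area/ng, R^2 = {r_squared:.4f}")
    ```

    Tracking the slope and R² of successive standard curves over time (e.g., on a quality control chart) flags drift anywhere in the chromatographic system.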

  3. In silico simulations of experimental protocols for cardiac modeling.

    PubMed

    Carro, Jesus; Rodriguez, Jose Felix; Pueyo, Esther

    2014-01-01

    A mathematical model of the action potential (AP) involves the sum of different transmembrane ionic currents and the balance of intracellular ionic concentrations. To each ionic current corresponds an equation involving several effects. There are a number of model parameters that must be identified using specific experimental protocols in which the effects are considered as independent. However, when the model complexity grows, the interaction between effects becomes increasingly important. Therefore, model parameters identified by considering the different effects as independent might be misleading. In this work, a novel methodology consisting of performing in silico simulations of the experimental protocol and then comparing experimental and simulated outcomes is proposed for model parameter identification and validation. The potential of the methodology is demonstrated by validating voltage-dependent L-type calcium current (ICaL) inactivation in recently proposed human ventricular AP models with different formulations. Our results show large differences between ICaL inactivation as calculated from the model equation and ICaL inactivation from the in silico simulations, due to the interaction between effects and/or to the experimental protocol. Our results suggest that, when proposing any new model formulation, consistency between such formulation and the corresponding experimental data that is aimed at being reproduced needs to be first verified considering all involved factors.
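    Steady-state ICaL inactivation is commonly summarized by a Boltzmann curve evaluated over a range of holding potentials, which is the kind of model-equation quantity the in silico protocol simulation is compared against. A minimal sketch (the half-inactivation voltage and slope factor are illustrative, not from any published AP model):

    ```python
    import numpy as np

    def boltzmann_inactivation(v, v_half=-30.0, k=6.0):
        """Steady-state inactivation of Boltzmann form, often used for ICaL
        (v_half and k here are hypothetical parameter values)."""
        return 1.0 / (1.0 + np.exp((v - v_half) / k))

    voltages = np.arange(-80.0, 21.0, 10.0)   # hypothetical holding potentials (mV)
    vals = boltzmann_inactivation(voltages)
    print(np.round(vals, 3))  # decreases monotonically; equals 0.5 at v_half
    ```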

  4. Intermediate Scale Experimental Design to Validate a Subsurface Inverse Theory Applicable to Data-sparse Conditions

    NASA Astrophysics Data System (ADS)

    Jiao, J.; Trautz, A.; Zhang, Y.; Illangasekera, T.

    2017-12-01

    Subsurface flow and transport characterization under data-sparse conditions is addressed by a new and computationally efficient inverse theory that simultaneously estimates parameters, state variables, and boundary conditions. Uncertainty in static data can be accounted for, while the parameter structure can be complex due to process uncertainty. The approach has been successfully extended to inverting transient and unsaturated flows, as well as to contaminant source identification under unknown initial and boundary conditions. In one example, using sampled data from numerical experiments simulating two-dimensional steady-state flow with tracer migration, a sequential inversion scheme first estimates the flow field and permeability structure, after which the evolution of the tracer plume and the dispersivities are jointly estimated. Unlike traditional inversion techniques, the theory does not use forward simulations to assess model-data misfits, so knowledge of the difficult-to-determine site boundary conditions is not required. To test the general applicability of the theory, data generated during high-precision intermediate-scale experiments (i.e., at a scale intermediate between the field and column scales) in large synthetic aquifers can be used. The design of such experiments is not trivial, as laboratory conditions have to be selected to mimic natural systems in order to provide useful data, requiring a variety of sensors and data collection strategies. This paper presents the design of such an experiment in a synthetic, multi-layered aquifer with dimensions of 242.7 × 119.3 × 7.7 cm. Different experimental scenarios that will generate data to validate the theory are presented.

  5. Revealing the Effects of the Herbal Pair of Euphorbia kansui and Glycyrrhiza on Hepatocellular Carcinoma Ascites with Integrating Network Target Analysis and Experimental Validation

    PubMed Central

    Zhang, Yanqiong; Lin, Ya; Zhao, Haiyu; Guo, Qiuyan; Yan, Chen; Lin, Na

    2016-01-01

    Although the herbal pair of Euphorbia kansui (GS) and Glycyrrhiza (GC) is one of the so-called "eighteen antagonistic medicaments" in the Chinese medicinal literature, it is prescribed in Gansui-Banxia-Tang, a classic Traditional Chinese Medicine (TCM) formula for cancerous ascites, suggesting that GS and GC may exhibit synergistic or antagonistic effects in different combination designs. Here, we modeled the effects of the GS/GC combination with a target-interaction network and clarified the associations between the network topologies involving the drug targets and the drug combination effects. Moreover, the "edge-betweenness" values, defined as the frequency with which an edge lies on the shortest paths between all pairs of modules in the network, were calculated; the ADRB1-PIK3CG interaction exhibited the greatest edge-betweenness value, suggesting its crucial role in connecting the other edges in the network. Because ADRB1 and PIK3CG were putative targets of GS and GC, respectively, and both had functional interactions with AVPR2, an approved therapeutic target for ascites, we proposed that the ADRB1-PIK3CG-AVPR2 signal axis might be involved in the effects of the GS-GC combination on ascites. This proposal was further validated experimentally in an H22 hepatocellular carcinoma (HCC) ascites model. Collectively, this systems-level investigation integrated drug-target prediction and network analysis to reveal the combination principles of the herbal pair of GS and GC. Experimental validation in an in vivo system provided convincing evidence that different combination designs of GS and GC might result in synergistic or antagonistic effects on HCC ascites, partially related to their regulation of the ADRB1-PIK3CG-AVPR2 signal axis. PMID:27143956

  6. Multiplexing T- and B-Cell FLUOROSPOT Assays: Experimental Validation of the Multi-Color ImmunoSpot® Software Based on Center of Mass Distance Algorithm.

    PubMed

    Karulin, Alexey Y; Megyesi, Zoltán; Caspell, Richard; Hanson, Jodi; Lehmann, Paul V

    2018-01-01

    Over the past decade, ELISPOT has become a highly implemented mainstream assay in immunological research, immune monitoring, and vaccine development. Unique single-cell resolution along with high-throughput potential sets ELISPOT apart from flow cytometry, ELISA, and microarray- and bead-based multiplex assays. The need to unambiguously identify individual T and B cells that do, or do not, co-express certain analytes, including polyfunctional cytokine-producing T cells, has stimulated the development of multi-color ELISPOT assays. The success of these assays has also been driven by limited sample/cell availability and by resource constraints on reagents and labor. At present, few commercial test kits and instruments are available for multi-color FLUOROSPOT, and beyond commercial descriptions of competing systems, little is known about their accuracy in experimental settings at detecting individual cells that secrete multiple analytes vs. random overlays of spots. Here, we present a theoretical and experimental validation study for three- and four-color T- and B-cell FLUOROSPOT data analysis. The ImmunoSpot® Fluoro-X™ analysis system we used includes an automatic image acquisition unit that generates individual color images free of spectral overlaps, and multi-color spot-counting software based on the maximal allowed distance between the centers of spots of different colors, the Center of Mass Distance (COMD). Using four-color B-cell FLUOROSPOT for IgM, IgA, IgG1, and IgG3, and three/four-color T-cell FLUOROSPOT for IL-2, IFN-γ, TNF-α, and GzB, in serial dilution experiments, we demonstrate the validity and accuracy of the Fluoro-X™ multi-color spot-counting algorithms. Statistical predictions based on the Poisson spatial distribution, coupled with scrambled-image counting, permit objective correction of true multi-color spot counts to exclude randomly overlaid spots.
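    The two ideas named in this record can be sketched in a few lines: pair spots of different colors whose centers lie within a maximum distance, and estimate how many pairings would occur by chance under a Poisson spatial assumption. The greedy matching rule and all coordinates below are illustrative stand-ins, not the Fluoro-X™ implementation.

```python
import math

def match_spots(spots_a, spots_b, max_dist):
    """Greedily pair spots from two color channels whose centers lie within
    max_dist of each other -- a simple stand-in for a COMD-style rule."""
    pairs, used = [], set()
    for a in spots_a:
        best, best_d = None, max_dist
        for j, b in enumerate(spots_b):
            d = math.hypot(a[0] - b[0], a[1] - b[1])
            if j not in used and d <= best_d:
                best, best_d = j, d
        if best is not None:
            used.add(best)
            pairs.append((a, spots_b[best]))
    return pairs

def expected_random_overlaps(n_a, n_b, max_dist, well_area):
    """Poisson-style estimate of chance co-localizations: each of the
    n_a * n_b cross-channel spot pairs coincides with probability
    roughly pi * max_dist**2 / well_area."""
    return n_a * n_b * math.pi * max_dist ** 2 / well_area

# Two single-color channels (illustrative coordinates):
green = [(10.0, 10.0), (40.0, 40.0), (80.0, 15.0)]
red = [(11.0, 10.5), (70.0, 70.0)]
double = match_spots(green, red, max_dist=3.0)
print(len(double))  # 1 genuine two-color spot
```

    Subtracting the `expected_random_overlaps` estimate from the raw pair count is one way to correct multi-color counts for random overlays, in the spirit of the scrambled-image correction the record describes.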

  7. Experimental Flight Characterization of a Canard-Controlled, Subsonic Missile

    DTIC Science & Technology

    2017-08-01

    ARL-TR-8086, August 2017. US Army Research Laboratory. Experimental Flight Characterization of a Canard-Controlled, Subsonic Missile, by Frank Fresconi, Ilmars Celmins, James Maley, et al.

  8. Issues and approach to develop validated analysis tools for hypersonic flows: One perspective

    NASA Technical Reports Server (NTRS)

    Deiwert, George S.

    1992-01-01

    Critical issues concerning the modeling of low-density hypervelocity flows where thermochemical nonequilibrium effects are pronounced are discussed. Emphasis is on the development of validated analysis tools. A description of the activity in the Ames Research Center's Aerothermodynamics Branch is also given. Inherent in the process is a strong synergism between ground test and real-gas computational fluid dynamics (CFD). Approaches to develop and/or enhance phenomenological models and incorporate them into computational flow-field simulation codes are discussed. These models have been partially validated with experimental data for flows where the gas temperature is raised (compressive flows). Expanding flows, where temperatures drop, however, exhibit somewhat different behavior. Experimental data for these expanding flow conditions are sparse; reliance must be made on intuition and guidance from computational chemistry to model transport processes under these conditions. Ground-based experimental studies used to provide necessary data for model development and validation are described. Included are the performance characteristics of high-enthalpy flow facilities, such as shock tubes and ballistic ranges.

  9. Modeling and Validation of a Three-Stage Solidification Model for Sprays

    NASA Astrophysics Data System (ADS)

    Tanner, Franz X.; Feigl, Kathleen; Windhab, Erich J.

    2010-09-01

    A three-stage freezing model and its validation are presented. In the first stage, the cooling of the droplet down to the freezing temperature is described as a convective heat transfer process in turbulent flow. In the second stage, when the droplet has reached the freezing temperature, the solidification process is initiated via nucleation and crystal growth. The latent heat release is related to the amount of heat convected away from the droplet, and the rate of solidification is expressed with a freezing progress variable. After completion of the solidification process, in stage three, the cooling of the solidified droplet (particle) is again described by a convective heat transfer process until the particle approaches the temperature of the gaseous environment. The model has been validated against experimental data for a single cocoa butter droplet suspended in air. The subsequent spray validations were performed with data obtained from a cocoa butter melt in an experimental spray tower using the open-source computational fluid dynamics code KIVA-3.
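    The three stages described above can be illustrated with a lumped-capacitance sketch. The stage logic mirrors the record (convective cooling, isothermal solidification with a progress variable, then cooling of the solid), but the explicit-Euler treatment and every parameter value are illustrative assumptions, not the authors' model or measured cocoa-butter properties.

```python
def droplet_history(T_gas, T0, T_freeze, latent, c_liq, c_sol, hA_over_m,
                    dt, t_end):
    """Lumped three-stage freezing sketch. Stage 1: convective cooling of the
    liquid droplet; stage 2: isothermal solidification, with the latent heat
    release balancing the heat convected away (progress variable f in [0, 1]);
    stage 3: convective cooling of the solid particle. Per-unit-mass units."""
    T, f, t, hist = T0, 0.0, 0.0, []
    while t < t_end:
        q = hA_over_m * (T - T_gas)             # heat flux to the gas, W/kg
        if T > T_freeze:                        # stage 1: liquid cooling
            T = max(T - q / c_liq * dt, T_freeze)
        elif f < 1.0:                           # stage 2: solidification
            f = min(1.0, f + q / latent * dt)
        else:                                   # stage 3: solid cooling
            T -= q / c_sol * dt
        t += dt
        hist.append((t, T, f))
    return hist

# Illustrative (not measured) parameters, loosely in the range of a fat melt:
hist = droplet_history(T_gas=20.0, T0=45.0, T_freeze=28.0, latent=4.0e4,
                       c_liq=2000.0, c_sol=1500.0, hA_over_m=20.0,
                       dt=0.05, t_end=600.0)
t_fin, T_fin, f_fin = hist[-1]
print(round(T_fin, 1), f_fin)  # fully solid (f = 1.0), near gas temperature
```

    The freezing-temperature plateau in the middle of the history is the signature of stage 2: temperature is constant while the progress variable climbs from 0 to 1.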

  10. Experimental design and reporting standards for improving the internal validity of pre-clinical studies in the field of pain: Consensus of the IMI-Europain consortium.

    PubMed

    Knopp, K L; Stenfors, C; Baastrup, C; Bannon, A W; Calvo, M; Caspani, O; Currie, G; Finnerup, N B; Huang, W; Kennedy, J D; Lefevre, I; Machin, I; Macleod, M; Rees, H; Rice, A S C; Rutten, K; Segerdahl, M; Serra, J; Wodarski, R; Berge, O-G; Treedef, R-D

    2017-12-29

    …and blinding are discussed. In addition, considerations of how stress and normal rodent physiology affect the outcomes of analgesic drug studies are examined. Flow diagrams are a standard requirement in all clinical trials; flow diagrams for preclinical trials, describing the number of animals included/excluded and the reasons for exclusion, are proposed. The creation of a trial registry for pre-clinical studies focused on drug development, in order to estimate possible publication bias, is discussed. Conclusions: More systematic research is needed to analyze how inadequate internal validity and/or experimental bias may impact reproducibility across pre-clinical pain studies. Addressing the potential threats to internal validity and the sources of experimental bias, as well as increasing transparency in reporting, is likely to improve preclinical research broadly by ensuring that relevant progress is made in advancing the knowledge of chronic pain pathophysiology and in identifying novel analgesics. Implications: We are now disseminating these Europain processes for discussion in the wider pain research community. Any benefit from these guidelines will depend on acceptance and disciplined implementation across pre-clinical laboratories, funding agencies, and journal editors, but it is anticipated that these guidelines will be a first step towards improving scientific rigor across the field of pre-clinical pain research.

  11. Comparison Between Numerically Simulated and Experimentally Measured Flowfield Quantities Behind a Pulsejet

    NASA Technical Reports Server (NTRS)

    Geng, Tao; Paxson, Daniel E.; Zheng, Fei; Kuznetsov, Andrey V.; Roberts, William L.

    2008-01-01

    Pulsed combustion is receiving renewed interest as a potential route to higher performance in air breathing propulsion systems. Pulsejets offer a simple experimental device with which to study unsteady combustion phenomena and validate simulations. Previous computational fluid dynamic (CFD) simulation work focused primarily on the pulsejet combustion and exhaust processes. This paper describes a new inlet sub-model which simulates the fluidic and mechanical operation of a valved pulsejet head. The governing equations for this sub-model are described. Sub-model validation is provided through comparisons of simulated and experimentally measured reed valve motion, and time averaged inlet mass flow rate. The updated pulsejet simulation, with the inlet sub-model implemented, is validated through comparison with experimentally measured combustion chamber pressure, inlet mass flow rate, operational frequency, and thrust. Additionally, the simulated pulsejet exhaust flowfield, which is dominated by a starting vortex ring, is compared with particle imaging velocimetry (PIV) measurements on the bases of velocity, vorticity, and vortex location. The results show good agreement between simulated and experimental data. The inlet sub-model is shown to be critical for the successful modeling of pulsejet operation. This sub-model correctly predicts both the inlet mass flow rate and its phase relationship with the combustion chamber pressure. As a result, the predicted pulsejet thrust agrees very well with experimental data.

  12. Validated MicroRNA Target Databases: An Evaluation.

    PubMed

    Lee, Yun Ji Diana; Kim, Veronica; Muth, Dillon C; Witwer, Kenneth W

    2015-11-01

    Positive findings from preclinical and clinical studies involving depletion or supplementation of microRNA (miRNA) engender optimism about miRNA-based therapeutics. However, off-target effects must be considered, and predicting them is complicated: each miRNA may target many gene transcripts, and the rules governing imperfectly complementary miRNA:target interactions are incompletely understood. Several databases provide lists of the relatively small number of experimentally confirmed miRNA:target pairs. Although incomplete, this information might allow assessment of at least some off-target effects. We evaluated the performance of four databases of experimentally validated miRNA:target interactions (miRWalk 2.0, miRTarBase, miRecords, and TarBase 7.0) using a list of 50 alphabetically consecutive genes. We examined the provided citations to determine the degree to which each interaction was experimentally supported. To assess stability, we tested at the beginning and end of a five-month period. Results varied widely by database, and two of the databases changed significantly over the five months. Most reported evidence for miRNA:target interactions was indirect or otherwise weak, and relatively few interactions were supported by more than one publication. Some returned results appear to arise from simplistic text searches that offer no insight into the relationship of the search terms, may not even include the reported gene or miRNA, and may thus be invalid. We conclude that validation databases provide important information, but not all information in all extant databases is up-to-date or accurate. Nevertheless, the more comprehensive validation databases may provide useful starting points for investigating off-target effects of proposed small-RNA therapies.

  13. Experimental validation of tonal noise control from subsonic axial fans using flow control obstructions

    NASA Astrophysics Data System (ADS)

    Gérard, Anthony; Berry, Alain; Masson, Patrice; Gervais, Yves

    2009-03-01

    This paper presents the acoustic performance of a novel approach for the passive adaptive control of tonal noise radiated from subsonic fans. Tonal noise originates from non-uniform flow, which causes circumferentially varying blade forces and gives rise to a considerably larger radiated dipolar sound at the blade passage frequency (BPF) and its harmonics than a uniform flow would generate. The approach presented here places obstructions in the flow to interfere destructively with the primary tonal noise arising from various flow conditions. The acoustic radiation of the obstructions is first demonstrated experimentally. Indirect on-axis acoustic measurements are used to validate the analytical prediction of the circumferential spectrum of the blade unsteady lift and related indicators generated by the trapezoidal and sinusoidal obstructions presented in Ref. [A. Gérard, A. Berry, P. Masson, Y. Gervais, Modelling of tonal noise control from subsonic axial fans using flow control obstructions, Journal of Sound and Vibration (2008), this issue, doi: 10.1016/j.jsv.2008.09.027] and by the cylindrical obstructions used in the literature. The directivity and sound power attenuation are then given in free field for the control of the BPF tone generated by rotor/outlet guide vane (OGV) interaction, and for the control of an amplified BPF tone generated by the rotor/OGV interaction with a triangular obstruction added between two outlet guide vanes to enhance the primary non-uniform flow. Global control was demonstrated in free field: an attenuation of up to 8.4 dB in acoustic power at the BPF was measured. Finally, the aerodynamic performance of the automotive fan used in this study is almost unaffected by the presence of the control obstruction.

  14. Experimental methods to validate measures of emotional state and readiness for duty in critical operations.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Weston, Louise Marie

    2007-09-01

    A recent report on criticality accidents in nuclear facilities indicates that human error played a major role in a significant number of incidents with serious consequences, and that some of these errors may be related to the emotional state of the individual. A pre-shift test to detect a deleterious emotional state could reduce the occurrence of such errors in critical operations. The effectiveness of pre-shift testing is a challenge because of the need to gather predictive data in a relatively short test period and the potential for learning effects arising from frequent testing. This report reviews the different types of reliability and validity methods and the testing and statistical analysis procedures used to validate measures of emotional state. The ultimate value of a validation study depends upon the percentage of human errors in critical operations that are due to the emotional state of the individual. A review of the literature to identify the most promising predictors of emotional state for this application is highly recommended.

  15. A More Rigorous Quasi-Experimental Alternative to the One-Group Pretest-Posttest Design.

    ERIC Educational Resources Information Center

    Johnson, Craig W.

    1986-01-01

    A simple quasi-experimental design is described that may have utility in a variety of applied and laboratory research settings where the one-group pretest-posttest pre-experimental design might otherwise be the procedure of choice. The design approaches the internal validity of true experimental designs while optimizing external…

  16. Experimental Validation of the Transverse Shear Behavior of a Nomex Core for Sandwich Panels

    NASA Astrophysics Data System (ADS)

    Farooqi, M. I.; Nasir, M. A.; Ali, H. M.; Ali, Y.

    2017-05-01

    This work deals with the determination of the transverse shear moduli of a Nomex® honeycomb core for sandwich panels, on which the panels' out-of-plane shear characteristics depend. These moduli were determined experimentally, numerically, and analytically: numerical simulations were performed using a unit-cell model, and three analytical approaches were applied. Two of the analytical approaches provided reasonable predictions of the transverse shear modulus compared with the experimental results, whereas the approach based on classical lamination theory showed large deviations from the experimental data. The numerical simulations showed a trend similar to that of the analytical models.

  17. Quasi-experimental study designs series-paper 4: uses and value.

    PubMed

    Bärnighausen, Till; Tugwell, Peter; Røttingen, John-Arne; Shemilt, Ian; Rockers, Peter; Geldsetzer, Pascal; Lavis, John; Grimshaw, Jeremy; Daniels, Karen; Brown, Annette; Bor, Jacob; Tanner, Jeffery; Rashidian, Arash; Barreto, Mauricio; Vollmer, Sebastian; Atun, Rifat

    2017-09-01

    Quasi-experimental studies are increasingly used to establish causal relationships in epidemiology and health systems research. Quasi-experimental studies offer important opportunities to increase and improve evidence on causal effects: (1) they can generate causal evidence when randomized controlled trials are impossible; (2) they typically generate causal evidence with a high degree of external validity; (3) they avoid the threats to internal validity that arise when participants in nonblinded experiments change their behavior in response to the experimental assignment to either intervention or control arm (such as compensatory rivalry or resentful demoralization); (4) they are often well suited to generate causal evidence on long-term health outcomes of an intervention, as well as nonhealth outcomes such as economic and social consequences; and (5) they can often generate evidence faster and at lower cost than experiments and other intervention studies.

  18. Conditions for the Validity of Faraday's Law of Induction and Their Experimental Confirmation

    ERIC Educational Resources Information Center

    Lopez-Ramos, A.; Menendez, J. R.; Pique, C.

    2008-01-01

    This paper, as its main didactic objective, shows the conditions needed for the validity of Faraday's law of induction. Inadequate comprehension of these conditions has given rise to several paradoxes about the issue; some are analysed and solved in this paper in the light of the theoretical deduction of the induction law. Furthermore, an…

  19. Experimental Space Shuttle Orbiter Studies to Acquire Data for Code and Flight Heating Model Validation

    NASA Technical Reports Server (NTRS)

    Wadhams, T. P.; Holden, M. S.; MacLean, M. G.; Campbell, Charles

    2010-01-01

    …thin-film resolution in both the span and chord directions in the area of peak heating. Additional objectives of this first study included: obtaining natural or tripped turbulent wing-leading-edge heating levels; assessing the effectiveness of protuberances and cavities placed at specified locations on the orbiter over a range of Mach and Reynolds numbers, for evaluation against existing engineering and computational tools; obtaining cavity-floor heating to aid in the verification of cavity heating correlations; acquiring control-surface deflection heating data on both the main body flap and the elevons; and obtaining high-speed schlieren videos of the interaction of the orbiter nose bow shock with the wing leading edge. To support these objectives, the stainless steel 1.8%-scale orbiter model was, in addition to the sensors on the wing leading edge, instrumented down the windward centerline and over the wing acreage on the port side, and painted with temperature-sensitive paint on the starboard-side wing acreage. In all, the model was instrumented with over three hundred highly sensitive thin-film heating sensors, two hundred of which were located in the wing-leading-edge shock interaction region. Further experimental studies will be performed following the successful acquisition of flight data during the Orbiter Entry Boundary Layer Flight Experiment and HYTHIRM on STS-119, at specific data points simulating flight conditions and geometries. Additional instrumentation and a protuberance matching the layout present during the STS-119 boundary-layer-transition flight experiment were added, with testing performed at Mach and Reynolds number conditions simulating those experienced in flight. In addition to the experimental studies, CUBRC also performed a large amount of CFD analysis to confirm and validate not only the tunnel freestream conditions but also the 3D flows over the orbiter acreage, wing leading edge, and…

  20. Validation and upgrading of physically based mathematical models

    NASA Technical Reports Server (NTRS)

    Duval, Ronald

    1992-01-01

    The validation of physically based mathematical models against experimental results is discussed. Systematic techniques are used for: (1) isolating subsets of the simulator mathematical model and comparing the response of each subset to its experimental response for the same input conditions; (2) evaluating the response error to determine whether it results from incorrect parameter values, an incorrect structure of the model subset, or unmodeled external effects or cross-coupling; and (3) modifying and upgrading the model and its parameter values to determine the most physically appropriate combination of changes.

  1. Behavior under the Microscope: Increasing the Resolution of Our Experimental Procedures

    ERIC Educational Resources Information Center

    Palmer, David C.

    2010-01-01

    Behavior analysis has exploited conceptual tools whose experimental validity has been amply demonstrated, but their relevance to large-scale and fine-grained behavioral phenomena remains uncertain, because the experimental analysis of these domains faces formidable obstacles of measurement and control. In this essay I suggest that, at least at the…

  2. Development and validation of a general approach to predict and quantify the synergism of anti-cancer drugs using experimental design and artificial neural networks.

    PubMed

    Pivetta, Tiziana; Isaia, Francesco; Trudu, Federica; Pani, Alessandra; Manca, Matteo; Perra, Daniela; Amato, Filippo; Havel, Josef

    2013-10-15

    The combination of two or more drugs in multidrug mixtures is a trend in the treatment of cancer. The goal is to search for a synergistic effect and thereby reduce the required dose and inhibit the development of resistance. An advanced model-free approach for data exploration and analysis, based on artificial neural networks (ANN) and experimental design, is proposed to predict and quantify the synergism of drugs. The proposed method non-linearly correlates the concentrations of the drugs with the cytotoxicity of the mixture, making it possible to choose the optimal drug combination that gives the maximum synergism. The use of an ANN allows the cytotoxicity of each combination of drugs in the chosen concentration interval to be predicted. The method was validated by preparing and experimentally testing the combinations with the predicted highest synergistic effect; in all cases, the data predicted by the network were experimentally confirmed. The method was applied to several binary mixtures of cisplatin and [Cu(1,10-orthophenanthroline)2(H2O)](ClO4)2, Cu(1,10-orthophenanthroline)(H2O)2(ClO4)2 or [Cu(1,10-orthophenanthroline)2(imidazolidine-2-thione)](ClO4)2. The cytotoxicity of the two drugs, alone and in combination, was determined against human acute T-lymphoblastic leukemia cells (CCRF-CEM). For all systems, a synergistic effect was found for selected combinations.
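    The record does not specify how synergism was scored, but one common reference model for quantifying combination effects is Bliss independence. The sketch below is my own illustration of that metric, not the authors' ANN pipeline, and the effect values are hypothetical.

```python
def bliss_expected(e_a, e_b):
    """Expected fractional effect of a two-drug combination under Bliss
    independence (each effect is a fraction of cells killed, in [0, 1])."""
    return e_a + e_b - e_a * e_b

def synergy_score(e_obs, e_a, e_b):
    """Observed minus Bliss-expected effect: positive values indicate
    synergism, negative values antagonism."""
    return e_obs - bliss_expected(e_a, e_b)

# Hypothetical single-agent effects of 0.30 and 0.40; the mixture kills 0.75:
print(round(bliss_expected(0.30, 0.40), 2))       # 0.58
print(round(synergy_score(0.75, 0.30, 0.40), 2))  # 0.17 -> synergistic
```

    An ANN trained on measured mixture cytotoxicities plays the role of `e_obs` here: once it can predict the effect at any concentration pair, a score such as this can be evaluated over the whole concentration grid to locate the most synergistic combination.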

  3. Theoretical research and experimental validation of elastic dynamic load spectra on bogie frame of high-speed train

    NASA Astrophysics Data System (ADS)

    Zhu, Ning; Sun, Shouguang; Li, Qiang; Zou, Hua

    2016-05-01

    When a train runs at high speed, the external excitation frequencies approach the natural frequencies of critical bogie components, inducing strong elastic vibrations. The present international reliability-test evaluation standards and design criteria for bogie frames are based on the quasi-static deformation hypothesis; structural fatigue damage generated by elastic vibrations has not yet been included. In this paper, theoretical research and experimental validation are carried out on elastic dynamic load spectra for the bogie frame of a high-speed train. The construction of the load series that correspond to the elastic dynamic deformation modes is studied, and a simplified form of the load series is obtained. A theory of simplified dynamic load-time histories is then deduced. Measured data from the Beijing-Shanghai Dedicated Passenger Line are used to derive the simplified dynamic load-time histories, and the simplified dynamic discrete load spectra of the bogie frame are established. Based on the damage consistency criterion and a genetic algorithm, damage-consistency calibration of the simplified dynamic load spectra is finally performed. The computed results show that the simplified load series is reasonable: the calibrated damage corresponding to the elastic dynamic discrete load spectra covers the actual damage under operating conditions and satisfies the safety requirement of the damage consistency criterion for the bogie frame. This research is helpful for investigating standardized load spectra for the bogie frames of high-speed trains.

  4. Monitoring tooth profile faults in epicyclic gearboxes using synchronously averaged motor currents: Mathematical modeling and experimental validation

    NASA Astrophysics Data System (ADS)

    Ottewill, J. R.; Ruszczyk, A.; Broda, D.

    2017-02-01

    Time-varying transmission paths and inaccessibility can increase the difficulty of both acquiring and processing vibration signals for the purpose of monitoring epicyclic gearboxes. Recent work has shown that the synchronous signal averaging approach may be applied to measured motor currents in order to diagnose tooth faults in parallel-shaft gearboxes. In this paper we further develop the approach so that it may also be applied to monitor tooth faults in epicyclic gearboxes. A low-degree-of-freedom model of an epicyclic gearbox is introduced that can simulate tooth faults, as well as any subsequent loss of tooth contact due to these faults. By combining this model with a simple space-phasor model of an induction motor, it is possible to show that, in theory, tooth faults in epicyclic gearboxes may be identified from motor currents. Applying the synchronous averaging approach to experimentally recorded motor currents and to angular displacements from a shaft-mounted encoder validates this finding. Comparison between experiment and theory highlights the influence of operating conditions, backlash, and shaft couplings on the transient response excited in the currents by the tooth fault. The results suggest that the method may be a viable alternative or complement to more traditional methods of monitoring gearboxes, although further investigation of its sensitivity and robustness would be beneficial.
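    The synchronous-averaging step itself is simple to sketch: slice the measured signal into whole cycles and average sample-by-sample, so that content phase-locked to the cycle survives while asynchronous content averages toward zero. The fault pattern and disturbance below are idealized assumptions for illustration, not the paper's measured motor currents.

```python
def synchronous_average(signal, samples_per_cycle):
    """Average a signal over its repeated cycles: content locked to the cycle
    (e.g., a tooth-fault signature) survives; asynchronous content decays."""
    n_cycles = len(signal) // samples_per_cycle
    avg = [0.0] * samples_per_cycle
    for c in range(n_cycles):
        for i in range(samples_per_cycle):
            avg[i] += signal[c * samples_per_cycle + i] / n_cycles
    return avg

# A cycle-locked "fault" pattern buried under a disturbance that flips sign
# every cycle (so it cancels exactly over an even number of cycles):
pattern = [0.0, 1.0, 0.0, -1.0]
signal = [p + (0.5 if c % 2 == 0 else -0.5)
          for c in range(10) for p in pattern]
recovered = synchronous_average(signal, samples_per_cycle=4)
print([round(x, 6) for x in recovered])  # [0.0, 1.0, 0.0, -1.0]
```

    In practice the slicing is keyed to an angular reference (here, the encoder or the inferred shaft angle) rather than a fixed sample count, so that speed fluctuations do not smear the average.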

  5. Experimental validation of Critical Temperature-Pressure theory of scuffing

    NASA Astrophysics Data System (ADS)

    Lee, Si C.; Chen, Huanliang

    1995-07-01

    A series of experiments was conducted to validate a newly developed theory of scuffing. The Critical Temperature-Pressure (CTP) theory is based on the physisorption behavior of lubricants and is capable of predicting the onset of scuffing failures over a wide range of operating conditions, including contacts operating in the boundary lubrication and partial elastohydrodynamic lubrication (EHL) regimes. According to the CTP theory, failure occurs when the contact temperature exceeds a critical value that is a function of the lubricant pressure generated by the hydrodynamic action of the EHL contact. A special device capable of simulating the ambient conditions of partial EHL conjunctions (contact temperature, pressure, and lubricant pressure) was constructed: a ball-on-flat wear tester was placed inside a pressure vessel, completely immersed in a highly pressurized bath of mineral oil. The temperature of the flat specimen was gradually increased while the ball was slowly traversed. At a certain critical temperature, the friction force abruptly jumped, indicating incipient lubrication breakdown. This experiment was repeated for several levels of hydrostatic pressure, and the corresponding critical temperatures were obtained. The test results showed excellent correlation with the newly developed CTP theory.

  6. Construct and Concurrent Validity of a Prototype Questionnaire to Survey Public Attitudes toward Stuttering

    ERIC Educational Resources Information Center

    St. Louis, Kenneth O.; Reichel, Isabella K.; Yaruss, J. Scott; Lubker, Bobbie Boyd

    2009-01-01

    Purpose: Construct validity and concurrent validity were investigated in a prototype survey instrument, the "Public Opinion Survey of Human Attributes-Experimental Edition" (POSHA-E). The POSHA-E was designed to measure public attitudes toward stuttering within the context of eight other attributes, or "anchors," assumed to range from negative…

  7. Numerical Modeling and Experimental Validation by Calorimetric Detection of Energetic Materials Using Thermal Bimorph Microcantilever Array: A Case Study on Sensing Vapors of Volatile Organic Compounds (VOCs)

    PubMed Central

    Kang, Seok-Won; Fragala, Joe; Banerjee, Debjyoti

    2015-01-01

    Bi-layer (Au-Si3N4) microcantilevers fabricated in an array were used to detect vapors of energetic materials such as explosives under ambient conditions. The changes in the bending response of each thermal bimorph (i.e., microcantilever) with changes in actuation currents were experimentally monitored by measuring the angle of the reflected ray from a laser source used to illuminate the gold nanocoating on the surface of silicon nitride microcantilevers in the absence and presence of a designated combustible species. Experiments were performed to determine the signature response of this nano-calorimeter platform for each explosive material considered for this study. Numerical modeling was performed to predict the bending response of the microcantilevers for various explosive materials, species concentrations, and actuation currents. The experimental validation of the numerical predictions demonstrated that in the presence of different explosive or combustible materials, the microcantilevers exhibited unique trends in their bending responses with increasing values of the actuation current. PMID:26334276

  8. Experimental design methodologies in the optimization of chiral CE or CEC separations: an overview.

    PubMed

    Dejaegher, Bieke; Mangelings, Debby; Vander Heyden, Yvan

    2013-01-01

    In this chapter, an overview of experimental designs to develop chiral capillary electrophoresis (CE) and capillary electrochromatographic (CEC) methods is presented. Method development is generally divided into technique selection, method optimization, and method validation. In the method optimization part, often two phases can be distinguished, i.e., a screening and an optimization phase. In method validation, the method is evaluated on its fit for purpose. A validation item, also applying experimental designs, is robustness testing. In the screening phase and in robustness testing, screening designs are applied. During the optimization phase, response surface designs are used. The different design types and their application steps are discussed in this chapter and illustrated by examples of chiral CE and CEC methods.

  9. Standoff determination of the particle size and concentration of small optical depth clouds based on double scattering measurements: concept and experimental validation with bioaerosols.

    PubMed

    Roy, Gilles; Roy, Nathalie

    2008-03-20

    A multiple-field-of-view (MFOV) lidar is used to characterize size and optical depth of low concentration of bioaerosol clouds. The concept relies on the measurement of the forward scattered light by using the background aerosols at various distances at the back of a subvisible cloud. It also relies on the subtraction of the background aerosol forward scattering contribution and on the partial attenuation of the first-order backscattering. The validity of the concept developed to retrieve the effective diameter and the optical depth of low concentration bioaerosol clouds with good precision is demonstrated using simulation results and experimental MFOV lidar measurements. Calculations are also done to show that the method presented can be extended to small optical depth cloud retrieval.

  10. Turbofan Engine Post-Instability Behavior - Computer Simulations, Test Validation, and Application of Simulations,

    DTIC Science & Technology

    COMPRESSORS, *AIR FLOW, TURBOFAN ENGINES, TRANSIENTS, SURGES, STABILITY, COMPUTERIZED SIMULATION, EXPERIMENTAL DATA, VALIDATION, DIGITAL SIMULATION, INLET GUIDE VANES, ROTATION, STALLING, RECOVERY, HYSTERESIS

  11. Identification and validation of loss of function variants in clinical contexts.

    PubMed

    Lescai, Francesco; Marasco, Elena; Bacchelli, Chiara; Stanier, Philip; Mantovani, Vilma; Beales, Philip

    2014-01-01

    The choice of an appropriate variant calling pipeline for exome sequencing data is becoming increasingly important in translational medicine projects and clinical contexts. Within GOSgene, which facilitates genetic analysis as part of a joint effort of University College London and Great Ormond Street Hospital, we aimed to optimize a variant calling pipeline suitable for our clinical context. We implemented the GATK/Queue framework and evaluated the performance of its two callers: the classical UnifiedGenotyper and the new variant discovery tool HaplotypeCaller. We performed an experimental validation of the loss-of-function (LoF) variants called by the two methods using Sequenom technology. UnifiedGenotyper showed a total validation rate of 97.6% for LoF single-nucleotide polymorphisms (SNPs) and 92.0% for insertions or deletions (INDELs), whereas HaplotypeCaller achieved 91.7% for SNPs and 55.9% for INDELs. We confirm that GATK/Queue is a reliable pipeline in translational medicine and clinical contexts. We conclude that in our working environment, UnifiedGenotyper is the caller of choice, being an accurate method with a high validation rate for error-prone calls such as LoF variants. We finally highlight the importance of experimental validation, especially for INDELs, as part of a standard pipeline in clinical environments.

  12. A vortex model for forces and moments on low-aspect-ratio wings in side-slip with experimental validation

    PubMed Central

    DeVoria, Adam C.

    2017-01-01

    This paper studies low-aspect-ratio (AR) rectangular wings at high incidence and in side-slip. The main objective is to incorporate the effects of high angle of attack and side-slip into a simplified vortex model for the forces and moments. Experiments are also performed and are used to validate assumptions made in the model. The model asymptotes to the potential flow result of classical aerodynamics for an infinite aspect ratio. The AR → 0 limit of a rectangular wing is considered with slender body theory, where the side-edge vortices merge into a vortex doublet. Hence, the velocity fields transition from being dominated by a spanwise vorticity monopole (AR ≫ 1) to a streamwise vorticity dipole (AR ∼ 1). We theoretically derive a spanwise loading distribution that is parabolic instead of elliptic, and this physically represents the additional circulation around the wing that is associated with reattached flow. This is a fundamental feature of wings with a broad-facing leading edge. The experimental measurements of the spanwise circulation closely approximate a parabolic distribution. The vortex model yields very agreeable comparison with direct measurement of the lift and drag, and the roll moment prediction is acceptable for AR ≤ 1 prior to the roll stall angle and up to side-slip angles of 20°. PMID:28293139
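
    The parabolic-versus-elliptic loading distinction can be made concrete with a small numerical check. This is a sketch only, with the span coordinate normalized as eta = 2y/b and unit root circulation; it is not the paper's vortex model:

```python
import math

def span_integral(gamma, n=100000):
    """Integrate a circulation distribution Gamma(eta) over eta in [-1, 1]
    with the midpoint rule; the result is in units of Gamma0 * b / 2."""
    h = 2.0 / n
    return sum(gamma(-1.0 + (i + 0.5) * h) for i in range(n)) * h

parabolic = span_integral(lambda e: 1.0 - e * e)             # exact: 4/3
elliptic  = span_integral(lambda e: math.sqrt(1.0 - e * e))  # exact: pi/2

# For equal root circulation, the parabolic distribution carries
# (4/3)/(pi/2) = 8/(3*pi), about 85%, of the elliptic total.
print(f"parabolic={parabolic:.4f}, elliptic={elliptic:.4f}")
```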

  13. Experimental Errors in QSAR Modeling Sets: What We Can Do and What We Cannot Do.

    PubMed

    Zhao, Linlin; Wang, Wenyi; Sedykh, Alexander; Zhu, Hao

    2017-06-30

    Numerous chemical data sets have become available for quantitative structure-activity relationship (QSAR) modeling studies. However, data quality may vary between sources depending on the nature of the experimental protocols. Potential experimental errors in the modeling sets may therefore lead to poor QSAR models and in turn affect the predictions of new compounds. In this study, we explored the relationship between the ratio of questionable data in the modeling sets, obtained by simulating experimental errors, and QSAR modeling performance. To this end, we used eight data sets (four continuous endpoints and four categorical endpoints) that have been extensively curated both in-house and by our collaborators to create over 1800 QSAR models. Each data set was duplicated to create several new modeling sets with different ratios of simulated experimental errors (i.e., randomizing the activities of a portion of the compounds) in the modeling process. A fivefold cross-validation process was used to evaluate the modeling performance, which deteriorates as the ratio of experimental errors increases. All of the resulting models were also used to predict external sets of new compounds, which were excluded at the beginning of the modeling process. The modeling results showed that the compounds with relatively large prediction errors in cross-validation are likely to be those with simulated experimental errors. However, after removing a certain number of compounds with large cross-validation prediction errors, the external predictions of new compounds did not improve. Our conclusion is that QSAR predictions, especially consensus predictions, can identify compounds with potential experimental errors, but removing those compounds via the cross-validation procedure is not a reasonable means to improve model predictivity, due to overfitting.
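
    The error-simulation idea can be sketched in a few lines. The synthetic data, the 1-nearest-neighbour "model", and the 30% flip rate below are all illustrative stand-ins, not the study's QSAR workflow:

```python
import random

random.seed(42)

# Synthetic binary endpoint: class 1 clusters around x = 1, class 0 around -1.
data = [(random.gauss(-1.0 if c == 0 else 1.0, 0.7), c)
        for c in (0, 1) for _ in range(100)]
random.shuffle(data)

def one_nn(train, x):
    """Predict the label of the nearest training point (a toy model)."""
    return min(train, key=lambda p: abs(p[0] - x))[1]

def cv_accuracy(dataset, folds=5):
    """Fivefold cross-validation accuracy, as used to track degradation."""
    hits = 0
    for f in range(folds):
        test = dataset[f::folds]
        train = [p for i, p in enumerate(dataset) if i % folds != f]
        hits += sum(one_nn(train, x) == y for x, y in test)
    return hits / len(dataset)

def with_simulated_errors(dataset, ratio):
    """Flip the activity label of a random fraction of compounds."""
    out = list(dataset)
    for i in random.sample(range(len(out)), int(ratio * len(out))):
        x, y = out[i]
        out[i] = (x, 1 - y)
    return out

clean = cv_accuracy(data)
noisy = cv_accuracy(with_simulated_errors(data, 0.3))
print(f"5-fold CV accuracy: clean={clean:.2f}, 30% errors={noisy:.2f}")
```

    As in the study, performance deteriorates as the simulated error ratio grows.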

  15. 2-D Circulation Control Airfoil Benchmark Experiments Intended for CFD Code Validation

    NASA Technical Reports Server (NTRS)

    Englar, Robert J.; Jones, Gregory S.; Allan, Brian G.; Lin, John C.

    2009-01-01

    A current NASA Research Announcement (NRA) project being conducted by Georgia Tech Research Institute (GTRI) personnel and NASA collaborators includes the development of Circulation Control (CC) blown airfoils to improve subsonic aircraft high-lift and cruise performance. The emphasis of this program is the development of CC active flow control concepts for high-lift augmentation, drag control, and cruise efficiency. The project includes work by NASA research engineers on CFD validation and flow physics experiments, part of NASA's systematic approach to developing design and optimization tools for CC applications to fixed-wing aircraft. The design space for CESTOL-type aircraft is focusing on geometries that depend on advanced flow control technologies, including Circulation Control aerodynamics. The ability to consistently predict advanced aircraft performance requires improvements in design tools to include these advanced concepts. Validation of these tools will be based on experimental methods applied to complex flows that go beyond conventional aircraft modeling techniques. This paper focuses on recent and ongoing benchmark high-lift experiments and CFD efforts intended to provide 2-D CFD validation data sets related to NASA's Cruise Efficient Short Take Off and Landing (CESTOL) study. Both the experimental data and the related CFD predictions are discussed.

  16. Ionic polymer-metal composite torsional sensor: physics-based modeling and experimental validation

    NASA Astrophysics Data System (ADS)

    Aidi Sharif, Montassar; Lei, Hong; Khalid Al-Rubaiai, Mohammed; Tan, Xiaobo

    2018-07-01

    Ionic polymer-metal composites (IPMCs) have intrinsic sensing and actuation properties. Typical IPMC sensors are in the shape of beams and only respond to stimuli acting along beam-bending directions. Rod or tube-shaped IPMCs have been explored as omnidirectional bending actuators or sensors. In this paper, physics-based modeling is studied for a tubular IPMC sensor under pure torsional stimulus. The Poisson–Nernst–Planck model is used to describe the fundamental physics within the IPMC, where it is hypothesized that the anion concentration is coupled to the sum of shear strains induced by the torsional stimulus. Finite element simulation is conducted to solve for the torsional sensing response, where some of the key parameters are identified based on experimental measurements using an artificial neural network. Additional experimental results suggest that the proposed model is able to capture the torsional sensing dynamics for different amplitudes and rates of the torsional stimulus.

  17. Anesthetics and analgesics in experimental traumatic brain injury: Selection based on experimental objectives

    PubMed Central

    Rowe, Rachel K.; Harrison, Jordan L.; Thomas, Theresa C.; Pauly, James R.; Adelson, P. David; Lifshitz, Jonathan

    2013-01-01

    The use of animal modeling in traumatic brain injury (TBI) research is justified by the lack of sufficiently comprehensive in vitro and computer modeling that incorporates all components of the neurovascular unit. Valid animal modeling of TBI requires accurate replication of both the mechanical forces and secondary injury conditions observed in human patients. Regulatory requirements for animal modeling emphasize the administration of appropriate anesthetics and analgesics unless withholding these drugs is scientifically justified. The objective of this review is to present scientific justification for standardizing the use of anesthetics and analgesics, within a study, when modeling TBI in order to preserve study validity. Evidence for the interference of anesthetics and analgesics in the natural course of brain injury calls for consistent consideration of pain management regimens when conducting TBI research. Anesthetics administered at the time of or shortly after induction of brain injury can alter cognitive, motor, and histological outcomes following TBI. A consistent anesthesia protocol based on experimental objectives within each individual study is imperative when conducting TBI studies to control for the confounding effects of anesthesia on outcome parameters. Experimental studies that replicate the clinical condition are essential to gain further understanding and evaluate possible treatments for TBI. However, with animal models of TBI it is essential that investigators assure a uniform drug delivery protocol that minimizes confounding variables, while minimizing pain and suffering. PMID:23877609

  18. Finite Element Model Development and Validation for Aircraft Fuselage Structures

    NASA Technical Reports Server (NTRS)

    Buehrle, Ralph D.; Fleming, Gary A.; Pappa, Richard S.; Grosveld, Ferdinand W.

    2000-01-01

    The ability to extend the valid frequency range for finite element based structural dynamic predictions using detailed models of the structural components and attachment interfaces is examined for several stiffened aircraft fuselage structures. This extended dynamic prediction capability is needed for the integration of mid-frequency noise control technology. Beam, plate and solid element models of the stiffener components are evaluated. Attachment models between the stiffener and panel skin range from a line along the rivets of the physical structure to a constraint over the entire contact surface. The finite element models are validated using experimental modal analysis results. The increased frequency range results in a corresponding increase in the number of modes, modal density and spatial resolution requirements. In this study, conventional modal tests using accelerometers are complemented with Scanning Laser Doppler Velocimetry and Electro-Optic Holography measurements to further resolve the spatial response characteristics. Whenever possible, component and subassembly modal tests are used to validate the finite element models at lower levels of assembly. Normal mode predictions for different finite element representations of components and assemblies are compared with experimental results to assess the most accurate techniques for modeling aircraft fuselage type structures.

  19. Importance of the pharmacological profile of the bound ligand in enrichment on nuclear receptors: toward the use of experimentally validated decoy ligands.

    PubMed

    Lagarde, Nathalie; Zagury, Jean-François; Montes, Matthieu

    2014-10-27

    The evaluation of virtual ligand screening methods is of major importance to ensure their reliability. Taking into account the agonist/antagonist pharmacological profile should improve the quality of the benchmarking data sets since ligand binding can induce conformational changes in the nuclear receptor structure and such changes may vary according to the agonist/antagonist ligand profile. We indeed found that splitting the agonist and antagonist ligands into two separate data sets for a given nuclear receptor target significantly enhances the quality of the evaluation. The pharmacological profile of the ligand bound in the binding site of the target structure was also found to be an additional critical parameter. We also illustrate that active compound data sets for a given pharmacological activity can be used as a set of experimentally validated decoy ligands for another pharmacological activity to ensure a reliable and challenging evaluation of virtual screening methods.

  20. Refined method for predicting electrochemical windows of ionic liquids and experimental validation studies.

    PubMed

    Zhang, Yong; Shi, Chaojun; Brennecke, Joan F; Maginn, Edward J

    2014-06-12

    A combined classical molecular dynamics (MD) and ab initio MD (AIMD) method was developed for the calculation of electrochemical windows (ECWs) of ionic liquids. In the method, the liquid phase of the ionic liquid is explicitly sampled using classical MD. The electrochemical window, estimated from the energy difference between the highest occupied molecular orbital (HOMO) and the lowest unoccupied molecular orbital (LUMO), is calculated at the density functional theory (DFT) level based on snapshots obtained from classical MD trajectories. The snapshots were relaxed using AIMD and quenched to their local energy minima, which ensures that the HOMO/LUMO calculations are based on stable configurations on the same potential energy surface. The new procedure was applied to a group of ionic liquids for which the ECWs were also experimentally measured in a self-consistent manner. The predicted ECWs were found not only to agree very well with the experimental trend but also to be quantitatively accurate. The proposed method provides an efficient way to compare the ECWs of ionic liquids in the same context, which has been difficult in experiment or simulation because ECW values depend sensitively on the experimental setup and conditions.
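
    The HOMO-LUMO estimate of the ECW reduces to simple bookkeeping over orbital energies. A minimal sketch, assuming closed-shell snapshots with made-up orbital spectra (the real workflow obtains these from DFT on AIMD-relaxed configurations):

```python
HARTREE_TO_EV = 27.2114  # conversion factor, 1 hartree in eV

def homo_lumo_gap_ev(orbital_energies_hartree, n_electrons):
    """HOMO-LUMO gap for a closed-shell configuration.

    With doubly occupied orbitals, the HOMO is orbital n_electrons//2 - 1
    (0-based) in the sorted spectrum and the LUMO is the next one up.
    """
    levels = sorted(orbital_energies_hartree)
    homo = levels[n_electrons // 2 - 1]
    lumo = levels[n_electrons // 2]
    return (lumo - homo) * HARTREE_TO_EV

def ecw_estimate(snapshot_energies, n_electrons):
    """Average the gap over MD snapshots, as in the described scheme."""
    gaps = [homo_lumo_gap_ev(e, n_electrons) for e in snapshot_energies]
    return sum(gaps) / len(gaps)

# Toy orbital spectra (hartree) for two hypothetical snapshots.
snap_a = [-0.60, -0.45, -0.30, 0.05, 0.20]
snap_b = [-0.58, -0.44, -0.28, 0.03, 0.18]
print(f"ECW ~ {ecw_estimate([snap_a, snap_b], 6):.2f} eV")
```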

  1. Hyper-X: Flight Validation of Hypersonic Airbreathing Technology

    NASA Technical Reports Server (NTRS)

    Rausch, Vincent L.; McClinton, Charles R.; Crawford, J. Larry

    1997-01-01

    This paper provides an overview of NASA's focused hypersonic technology program, i.e., the Hyper-X program. This program is designed to move hypersonic, air-breathing vehicle technology from the laboratory environment to the flight environment, the last stage preceding prototype development. This paper presents some history leading to the flight test program, the research objectives, approach, schedule, and status. A substantial experimental database and concept validation have been completed. The program is concentrating on Mach 7 vehicle development, verification, and validation in preparation for wind tunnel testing in 1998 and flight testing in 1999. It is also concentrating on finalization of the Mach 5 and 10 vehicle designs. Detailed evaluation of the Mach 7 vehicle at flight conditions is nearing completion and will provide a database for validation of design methods once flight test data are available.

  2. Gaseous Sulfate Solubility in Glass: Experimental Method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bliss, Mary

    2013-11-30

    Sulfate solubility in glass is a key parameter in many commercial glasses and nuclear waste glasses. This report summarizes key publications specific to sulfate solubility experimental methods and the underlying physical chemistry calculations. The published methods and experimental data are used to verify the calculations in this report and are expanded to a range of current technical interest. The calculations and experimental methods described in this report will guide several experiments on sulfate solubility and saturation for the Hanford Waste Treatment Plant Enhanced Waste Glass Models effort. There are several tables of sulfate gas equilibrium values at high temperature to guide experimental gas mixing and to achieve desired SO3 levels. This report also describes the necessary equipment and best practices to perform sulfate saturation experiments for molten glasses. Results and findings will be published when experimental work is finished and this report is validated from the data obtained.

  3. Experimental aeroelasticity history, status and future in brief

    NASA Technical Reports Server (NTRS)

    Ricketts, Rodney H.

    1990-01-01

    NASA conducts wind tunnel experiments to determine and understand the aeroelastic characteristics of new and advanced flight vehicles, including fixed-wing, rotary-wing, and space-launch configurations. A review and assessment is made of the state of the art in experimental aeroelasticity with regard to available facilities, measurement techniques, and other means and devices useful in testing. In addition, some past experimental programs are described that assisted in the development of new technology, validated new analysis codes, or provided information needed for clearing flight envelopes of unwanted aeroelastic response. Finally, needs and requirements for advances and improvements in testing capabilities for future experimental research and development programs are described.

  4. Quantification of mitral regurgitation by automated cardiac output measurement: experimental and clinical validation

    NASA Technical Reports Server (NTRS)

    Sun, J. P.; Yang, X. S.; Qin, J. X.; Greenberg, N. L.; Zhou, J.; Vazquez, C. J.; Griffin, B. P.; Stewart, W. J.; Thomas, J. D.

    1998-01-01

    OBJECTIVES: To develop and validate an automated noninvasive method to quantify mitral regurgitation. BACKGROUND: Automated cardiac output measurement (ACM), which integrates digital color Doppler velocities in space and in time, has been validated for the left ventricular (LV) outflow tract but has not been tested for the LV inflow tract or to assess mitral regurgitation (MR). METHODS: First, to validate ACM against a gold standard (ultrasonic flow meter), 8 dogs were studied at 40 different stages of cardiac output (CO). Second, to compare ACM to the LV outflow (ACMa) and inflow (ACMm) tracts, 50 normal volunteers without MR or aortic regurgitation (44±5 years, 31 male) were studied. Third, to compare ACM with the standard pulsed Doppler-two-dimensional echocardiographic (PD-2D) method for quantification of MR, 51 patients (61±14 years, 30 male) with MR were studied. RESULTS: In the canine studies, CO by ACM (1.32±0.3 liter/min, y) and flow meter (1.35±0.3 liter/min, x) showed good correlation (r=0.95, y=0.89x+0.11) and agreement (ΔCO(y-x)=0.03±0.08 [mean±SD] liter/min). In the normal subjects, CO measured by ACMm agreed with CO by ACMa (r=0.90, p < 0.0001, ΔCO=-0.09±0.42 liter/min), PD (r=0.87, p < 0.0001, ΔCO=0.12±0.49 liter/min) and 2D (r=0.84, p < 0.0001, ΔCO=-0.16±0.48 liter/min). In the patients, mitral regurgitant volume (MRV) by ACMm-ACMa agreed with PD-2D (r=0.88, y=0.88x+6.6, p < 0.0001, ΔMRV=2.68±9.7 ml). CONCLUSIONS: We determined that ACM is a feasible new method for quantifying LV outflow and inflow volume to measure MRV and that ACM automatically performs calculations that are equivalent to more time-consuming Doppler and 2D measurements. Additionally, ACM should improve MR quantification in routine clinical practice.
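
    The core arithmetic behind MRV is the per-beat difference between inflow and outflow volumes. A minimal sketch with illustrative numbers, not data from the study:

```python
def regurgitant_volume_ml(co_inflow_l_min, co_outflow_l_min, heart_rate_bpm):
    """Mitral regurgitant volume per beat.

    Inflow CO (ACMm) measures total LV filling (forward plus regurgitant);
    outflow CO (ACMa) measures forward output only, so the per-beat
    difference of the two stroke volumes is the regurgitant volume.
    """
    sv_in = co_inflow_l_min * 1000.0 / heart_rate_bpm    # mL/beat
    sv_out = co_outflow_l_min * 1000.0 / heart_rate_bpm  # mL/beat
    return sv_in - sv_out

# Illustrative values: 7.2 L/min inflow vs 4.8 L/min outflow at 75 bpm.
mrv = regurgitant_volume_ml(7.2, 4.8, 75)
print(f"regurgitant volume ~ {mrv:.0f} mL/beat")
```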

  5. Development plan for the External Hazards Experimental Group. Light Water Reactor Sustainability Program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Coleman, Justin Leigh; Smith, Curtis Lee; Burns, Douglas Edward

    This report describes the development plan for a new multi-partner External Hazards Experimental Group (EHEG) coordinated by Idaho National Laboratory (INL) within the Risk-Informed Safety Margin Characterization (RISMC) technical pathway of the Light Water Reactor Sustainability Program. Currently, there is limited data available for development and validation of the tools and methods being developed in the RISMC Toolkit. The EHEG is being developed to obtain high-quality, small- and large-scale experimental data for validation of RISMC tools and methods in a timely and cost-effective way. The group of universities and national laboratories that will eventually form the EHEG (which is ultimately expected to include both the initial participants and other universities and national laboratories that have been identified) have the expertise and experimental capabilities needed both to obtain and compile existing data archives and to perform additional seismic and flooding experiments. The data developed by the EHEG will be stored in databases for use within RISMC. These databases will be used to validate the advanced external hazard tools and methods.

  6. Validation Database Based Thermal Analysis of an Advanced RPS Concept

    NASA Technical Reports Server (NTRS)

    Balint, Tibor S.; Emis, Nickolas D.

    2006-01-01

    Advanced RPS concepts can be conceived, designed and assessed using high-end computational analysis tools. These predictions may provide an initial insight into the potential performance of these models, but verification and validation are necessary and required steps to gain confidence in the numerical analysis results. This paper discusses the findings from a numerical validation exercise for a small advanced RPS concept, based on a thermal analysis methodology developed at JPL and on a validation database obtained from experiments performed at Oregon State University. Both the numerical and experimental configurations utilized a single GPHS module enabled design, resembling a Mod-RTG concept. The analysis focused on operating and environmental conditions during the storage phase only. This validation exercise helped to refine key thermal analysis and modeling parameters, such as heat transfer coefficients, and conductivity and radiation heat transfer values. Improved understanding of the Mod-RTG concept through validation of the thermal model allows for future improvements to this power system concept.

  7. Validation of the Monte Carlo simulator GATE for indium-111 imaging.

    PubMed

    Assié, K; Gardin, I; Véra, P; Buvat, I

    2005-07-07

    Monte Carlo simulations are useful for optimizing and assessing single photon emission computed tomography (SPECT) protocols, especially when aiming at measuring quantitative parameters from SPECT images. Before Monte Carlo simulated data can be trusted, the simulation model must be validated. The purpose of this work was to validate the use of GATE, a new Monte Carlo simulation platform based on GEANT4, for modelling indium-111 SPECT data, the quantification of which is of foremost importance for dosimetric studies. To that end, acquisitions of (111)In line sources in air and in water and of a cylindrical phantom were performed, together with the corresponding simulations. The simulation model included Monte Carlo modelling of the camera collimator and of a back-compartment accounting for photomultiplier tubes and associated electronics. Energy spectra, spatial resolution, sensitivity values, images and count profiles obtained for experimental and simulated data were compared. An excellent agreement was found between experimental and simulated energy spectra. For source-to-collimator distances varying from 0 to 20 cm, simulated and experimental spatial resolution differed by less than 2% in air, while the simulated sensitivity values were within 4% of the experimental values. The simulation of the cylindrical phantom closely reproduced the experimental data. These results suggest that GATE enables accurate simulation of (111)In SPECT acquisitions.
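
    A validation comparison of this kind reduces to checking relative differences against tolerances. The sketch below uses made-up numbers chosen only to mirror the reported tolerance bands (<2% for spatial resolution in air, within 4% for sensitivity); it is not the paper's data:

```python
def percent_diff(simulated, measured):
    """Relative difference of a simulated figure of merit vs experiment."""
    return abs(simulated - measured) / measured * 100.0

def within(simulated, measured, tol_percent):
    """Acceptance check used when comparing simulation to measurement."""
    return percent_diff(simulated, measured) <= tol_percent

# Illustrative values only: resolution (mm) and relative sensitivity.
assert within(simulated=7.95, measured=8.10, tol_percent=2.0)   # resolution
assert within(simulated=104.0, measured=100.0, tol_percent=4.0) # sensitivity
print("simulated values fall within the validation tolerances")
```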

  8. Turbine-99 unsteady simulations - Validation

    NASA Astrophysics Data System (ADS)

    Cervantes, M. J.; Andersson, U.; Lövgren, H. M.

    2010-08-01

    The Turbine-99 test case, a Kaplan draft tube model, aimed to determine the state of the art in draft tube simulation. Three workshops were organized on the matter, in 1999, 2001 and 2005, where the geometry and experimental data were provided as boundary conditions to the participants. Since the last workshop, computational power and flow modelling have advanced, and the available data have been completed with unsteady pressure measurements and phase-resolved velocity measurements in the cone. This new set of data, together with the corresponding phase-resolved velocity boundary conditions, offers new possibilities to validate unsteady numerical simulations of Kaplan draft tubes. The present work presents simulations of the Turbine-99 test case with time-dependent, angular-resolved inlet velocity boundary conditions. Different grids and time steps are investigated. The results are compared to experimental time-dependent pressure and velocity measurements.

  9. Animal Experimentation: Issues for the 1980s.

    ERIC Educational Resources Information Center

    Zola, Judith C.; And Others

    1984-01-01

    Examines the extent to which issues related to animal experimentation are in conflict and proposes choices that might least comprise them. These issues include animal well-being, human well-being, self-interest of science, scientific validity and responsibility, progress in biomedical and behavioral science, and the future quality of medical care.…

  10. The Ca(2+)-EDTA chelation as standard reaction to validate Isothermal Titration Calorimeter measurements (ITC).

    PubMed

    Ràfols, Clara; Bosch, Elisabeth; Barbas, Rafael; Prohens, Rafel

    2016-07-01

    A study of the suitability of the chelation reaction of Ca(2+) with ethylenediaminetetraacetic acid (EDTA) as a validation standard for Isothermal Titration Calorimeter measurements has been performed, exploring the common experimental variables (buffer, pH, ionic strength and temperature). Results obtained under a variety of experimental conditions were corrected for the side reactions involved in the main process and for the experimental ionic strength and, finally, validated by comparison with the potentiometric reference values. It is demonstrated that the chelation reaction performed in 0.1 M acetate buffer at 25°C gives accurate and precise results and is robust enough to be adopted as a standard calibration process. Copyright © 2016 Elsevier B.V. All rights reserved.
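    The pH-dependent side-reaction correction invoked here can be illustrated with the usual conditional-constant calculation. In the Python sketch below, the EDTA pKa values and log K(CaY) are approximate textbook figures, not results from this study:

```python
import math

# Approximate textbook acid-dissociation constants of EDTA (four-pKa treatment)
# and the Ca-EDTA formation constant; illustrative literature values only.
pKa = [2.0, 2.66, 6.16, 10.24]
Ka = [10.0 ** -p for p in pKa]
logK_CaY = 10.65

def alpha_Y4(pH):
    # Fraction of EDTA present as the fully deprotonated Y(4-) form
    h = 10.0 ** -pH
    terms = [h ** 4,
             h ** 3 * Ka[0],
             h ** 2 * Ka[0] * Ka[1],
             h * Ka[0] * Ka[1] * Ka[2],
             Ka[0] * Ka[1] * Ka[2] * Ka[3]]
    return terms[-1] / sum(terms)

def logK_conditional(pH):
    # Conditional (effective) formation constant, corrected for protonation
    # of the ligand at the working pH
    return logK_CaY + math.log10(alpha_Y4(pH))

print(logK_conditional(4.76), logK_conditional(10.0))
```

The conditional constant drops sharply at acidic pH, which is why the buffer and pH must be fixed when the reaction is used as a calorimetric standard.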

  11. Three phase heat and mass transfer model for unsaturated soil freezing process: Part 2 - model validation

    NASA Astrophysics Data System (ADS)

    Zhang, Yaning; Xu, Fei; Li, Bingxi; Kim, Yong-Song; Zhao, Wenke; Xie, Gongnan; Fu, Zhongbin

    2018-04-01

    This study aims to validate the three-phase heat and mass transfer model developed in the first part (Three phase heat and mass transfer model for unsaturated soil freezing process: Part 1 - model development). Experimental results from previous studies and experiments were used for the validation. The results showed that the correlation coefficients for the simulated and experimental water contents at different soil depths were between 0.83 and 0.92. The correlation coefficients for the simulated and experimental liquid water contents at different soil temperatures were between 0.95 and 0.99. Given this accuracy, the developed model can reliably predict water contents at different soil depths and temperatures.
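    The agreement metric quoted above is the Pearson correlation coefficient between simulated and measured profiles. A minimal Python sketch, with hypothetical water-content values:

```python
import numpy as np

def pearson_r(sim, exp):
    """Pearson correlation coefficient between simulated and measured values."""
    sim, exp = np.asarray(sim, float), np.asarray(exp, float)
    return float(np.corrcoef(sim, exp)[0, 1])

# Hypothetical water-content profiles (simulated vs. measured) at five depths
simulated = [0.31, 0.28, 0.25, 0.22, 0.20]
measured  = [0.30, 0.29, 0.24, 0.23, 0.19]

print(f"r = {pearson_r(simulated, measured):.3f}")
```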

  12. Does Linguistic Analysis Confirm the Validity of Facilitated Communication?

    ERIC Educational Resources Information Center

    Saloviita, Timo

    2018-01-01

    Facilitated communication (FC) has been interpreted as an ideomotor phenomenon, in which one person physically supports another person's hand and unconsciously affects the content of the writing. Despite the strong experimental evidence against the authenticity of FC output, several studies claim to support its validity based on idiosyncrasies…

  13. Assessment of leaf carotenoids content with a new carotenoid index: Development and validation on experimental and model data

    NASA Astrophysics Data System (ADS)

    Zhou, Xianfeng; Huang, Wenjiang; Kong, Weiping; Ye, Huichun; Dong, Yingying; Casa, Raffaele

    2017-05-01

    Leaf carotenoids content (LCar) is an important indicator of plant physiological status. Accurate estimation of LCar provides valuable insight into early detection of stress in vegetation. With spectroscopy techniques, a semi-empirical approach based on spectral indices was extensively used for carotenoids content estimation. However, established spectral indices for carotenoids that generally rely on limited measured data, might lack predictive accuracy for carotenoids estimation in various species and at different growth stages. In this study, we propose a new carotenoid index (CARI) for LCar assessment based on a large synthetic dataset simulated from the leaf radiative transfer model PROSPECT-5, and evaluate its capability with both simulated data from PROSPECT-5 and 4SAIL and extensive experimental datasets: the ANGERS dataset and experimental data acquired in field experiments in China in 2004. Results show that CARI was the index most linearly correlated with carotenoids content at the leaf level using a synthetic dataset (R2 = 0.943, RMSE = 1.196 μg/cm2), compared with published spectral indices. Cross-validation results with CARI using ANGERS data achieved quite an accurate estimation (R2 = 0.545, RMSE = 3.413 μg/cm2), though the RBRI performed as the best index (R2 = 0.727, RMSE = 2.640 μg/cm2). CARI also showed good accuracy (R2 = 0.639, RMSE = 1.520 μg/cm2) for LCar assessment with leaf level field survey data, though PRI performed better (R2 = 0.710, RMSE = 1.369 μg/cm2). Whereas RBRI, PRI and other assessed spectral indices showed a good performance for a given dataset, overall their estimation accuracy was not consistent across all datasets used in this study. Conversely CARI was more robust showing good results in all datasets. Further assessment of LCar with simulated and measured canopy reflectance data indicated that CARI might not be very sensitive to LCar changes at low leaf area index (LAI) value, and in these conditions soil moisture

  14. Ab Initio Structural Modeling of and Experimental Validation for Chlamydia trachomatis Protein CT296 Reveal Structural Similarity to Fe(II) 2-Oxoglutarate-Dependent Enzymes▿

    PubMed Central

    Kemege, Kyle E.; Hickey, John M.; Lovell, Scott; Battaile, Kevin P.; Zhang, Yang; Hefty, P. Scott

    2011-01-01

    Chlamydia trachomatis is a medically important pathogen that encodes a relatively high percentage of proteins with unknown function. The three-dimensional structure of a protein can be very informative regarding the protein's functional characteristics; however, determining protein structures experimentally can be very challenging. Computational methods that model protein structures with sufficient accuracy to facilitate functional studies have had notable successes. To evaluate the accuracy and potential impact of computational protein structure modeling of hypothetical proteins encoded by Chlamydia, a successful computational method termed I-TASSER was utilized to model the three-dimensional structure of a hypothetical protein encoded by open reading frame (ORF) CT296. CT296 has been reported to exhibit functional properties of a divalent cation transcription repressor (DcrA), with similarity to the Escherichia coli iron-responsive transcriptional repressor, Fur. Unexpectedly, the I-TASSER model of CT296 exhibited no structural similarity to any DNA-interacting proteins or motifs. To validate the I-TASSER-generated model, the structure of CT296 was solved experimentally using X-ray crystallography. Impressively, the ab initio I-TASSER-generated model closely matched (2.72-Å Cα root mean square deviation [RMSD]) the high-resolution (1.8-Å) crystal structure of CT296. Modeled and experimentally determined structures of CT296 share structural characteristics of non-heme Fe(II) 2-oxoglutarate-dependent enzymes, although key enzymatic residues are not conserved, suggesting a unique biochemical process is likely associated with CT296 function. Additionally, functional analyses did not support prior reports that CT296 has properties shared with divalent cation repressors such as Fur. PMID:21965559
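    Cα RMSD figures like the 2.72-Å value above are computed after optimally superposing the two coordinate sets. A minimal Python sketch of the standard Kabsch superposition algorithm (a generic implementation with synthetic coordinates, not the authors' code):

```python
import numpy as np

def kabsch_rmsd(P, Q):
    """RMSD between two N x 3 coordinate sets after optimal superposition
    (Kabsch algorithm: centre, SVD of the covariance, rotate, compare)."""
    P = P - P.mean(axis=0)
    Q = Q - Q.mean(axis=0)
    H = P.T @ Q
    U, S, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    P_rot = P @ R.T
    return float(np.sqrt(((P_rot - Q) ** 2).sum() / len(P)))

# Sanity check: a rigidly rotated copy of a structure has RMSD ~ 0
rng = np.random.default_rng(3)
coords = rng.normal(size=(100, 3))
theta = 0.7
Rz = np.array([[np.cos(theta), -np.sin(theta), 0.0],
               [np.sin(theta),  np.cos(theta), 0.0],
               [0.0, 0.0, 1.0]])
print(kabsch_rmsd(coords @ Rz.T, coords))
```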

  15. Medical experimentation in the elderly.

    PubMed

    Bernstein, J E; Nelson, F K

    1975-07-01

    Participation in human experimental research constitutes a major problem for the geriatric subject. Because there is a high incidence of noncontagious disease in the elderly, they are the group most useful for the study of new therapeutic agents or procedures. However, normal aging processes, often coupled with disease of the central nervous system, render elderly persons less able to comprehend the nature and risks of such studies. These factors permit easy exploitation of geriatric subjects in medical experimentation, with possible exposure to a significant risk of serious drug reactions and unnecessary hospitalization. Recent federal regulations have given "special protections" to children, prisoners, and the mentally infirm in experimental research, to guard against abuse of their human rights. A basic requirement is that informed consent be carefully obtained and documented. Such "special protections" should now be extended to geriatric subjects so that there will be no further exploitation in the course of valid clinical research.

  16. An Experimental Study of Characteristic Combustion-Driven Flow for CFD Validation

    NASA Technical Reports Server (NTRS)

    Santoro, Robert J.

    1997-01-01

    A series of uni-element rocket injector studies were completed to provide benchmark quality data needed to validate computational fluid dynamic models. A shear coaxial injector geometry was selected as the primary injector for study using gaseous hydrogen/oxygen and gaseous hydrogen/liquid oxygen propellants. Emphasis was placed on the use of nonintrusive diagnostic techniques to characterize the flowfields inside an optically-accessible rocket chamber. Measurements of the velocity and species fields were obtained using laser velocimetry and Raman spectroscopy, respectively. Qualitative flame shape information was also obtained using laser-induced fluorescence excited from OH radicals and laser light scattering studies of aluminum oxide particle seeded combusting flows. The gaseous hydrogen/liquid oxygen propellant studies for the shear coaxial injector focused on breakup mechanisms associated with the liquid oxygen jet under subcritical pressure conditions. Laser sheet illumination techniques were used to visualize the core region of the jet and a Phase Doppler Particle Analyzer was utilized for drop velocity, size and size distribution characterization. The results of these studies indicated that the shear coaxial geometry configuration was a relatively poor injector in terms of mixing. The oxygen core was observed to extend well downstream of the injector and a significant fraction of the mixing occurred in the near nozzle region where measurements were not possible to obtain. Detailed velocity and species measurements were obtained to allow CFD model validation and this set of benchmark data represents the most comprehensive data set available to date. As an extension of the investigation, a series of gas/gas injector studies were conducted in support of the X-33 Reusable Launch Vehicle program. A Gas/Gas Injector Technology team was formed consisting of the Marshall Space Flight Center, the NASA Lewis Research Center, Rocketdyne and Penn State. Injector

  18. Improvements to a five-phase ABS algorithm for experimental validation

    NASA Astrophysics Data System (ADS)

    Gerard, Mathieu; Pasillas-Lépine, William; de Vries, Edwin; Verhaegen, Michel

    2012-10-01

    The anti-lock braking system (ABS) is the most important active safety system for passenger cars. Unfortunately, the literature is not really precise about its description, stability and performance. This research improves a five-phase hybrid ABS control algorithm based on wheel deceleration [W. Pasillas-Lépine, Hybrid modeling and limit cycle analysis for a class of five-phase anti-lock brake algorithms, Veh. Syst. Dyn. 44 (2006), pp. 173-188] and validates it on a tyre-in-the-loop laboratory facility. Five relevant effects are modelled so that the simulation matches the reality: oscillations in measurements, wheel acceleration reconstruction, brake pressure dynamics, brake efficiency changes and tyre relaxation. The time delays in measurement and actuation have been identified as the main difficulty for the initial algorithm to work in practice. Three methods are proposed in order to deal with these delays. It is verified that the ABS limit cycles encircle the optimal braking point, without assuming any tyre parameter being a priori known. The ABS algorithm is compared with the commercial algorithm developed by Bosch.

  19. Experimental studies of characteristic combustion-driven flows for CFD validation

    NASA Technical Reports Server (NTRS)

    Santoro, R. J.; Moser, M.; Anderson, W.; Pal, S.; Ryan, H.; Merkle, C. L.

    1992-01-01

    A series of rocket-related studies intended to develop a suitable data base for validation of Computational Fluid Dynamics (CFD) models of characteristic combustion-driven flows was undertaken at the Propulsion Engineering Research Center at Penn State. Included are studies of coaxial and impinging jet injectors as well as chamber wall heat transfer effects. The objective of these studies is to provide fundamental understanding and benchmark quality data for phenomena important to rocket combustion under well-characterized conditions. Diagnostic techniques utilized in these studies emphasize determinations of velocity, temperature, spray and droplet characteristics, and combustion zone distribution. Since laser diagnostic approaches are favored, the development of an optically accessible rocket chamber has been a high priority in the initial phase of the project. During the design phase for this chamber, the advice and input of the CFD modeling community were actively sought through presentations and written surveys. Based on this procedure, a suitable uni-element rocket chamber was fabricated and is presently under preliminary testing. Results of these tests, as well as the survey findings leading to the chamber design, were presented.

  20. Development and Initial Validation of the Pain Resilience Scale.

    PubMed

    Slepian, P Maxwell; Ankawi, Brett; Himawan, Lina K; France, Christopher R

    2016-04-01

    Over the past decade, the role of positive psychology in pain experience has gained increasing attention. One such positive factor, identified as resilience, has been defined as the ability to maintain positive emotional and physical functioning despite physical or psychological adversity. Although cross-situational measures of resilience have been shown to be related to pain, it was hypothesized that a pain-specific resilience measure would serve as a stronger predictor of acute pain experience. To test this hypothesis, we conducted a series of studies to develop and validate the Pain Resilience Scale. Study 1 described exploratory and confirmatory factor analyses that support a scale with 2 distinct factors, Cognitive/Affective Positivity and Behavioral Perseverance. Study 2 showed test-retest reliability and construct validity of this new scale, including moderate positive relationships with measures of positive psychological functioning and small to moderate negative relationships with vulnerability measures such as pain catastrophizing. Finally, consistent with our initial hypothesis, study 3 showed that the Pain Resilience Scale is more strongly related to ischemic pain responses than existing measures of general resilience. Together, these studies support the predictive utility of this new pain-specific measure of resilience in the context of acute experimental pain. The Pain Resilience Scale represents a novel measure of Cognitive/Affective Positivity and Behavioral Perseverance during exposure to noxious stimuli. Construct validity is supported by expected relationships with existing pain-coping measures, and predictive validity is shown by individual differences in response to acute experimental pain. Copyright © 2016 American Pain Society. Published by Elsevier Inc. All rights reserved.

  1. Validation of design procedure and performance modeling of a heat and fluid transport field experiment in the unsaturated zone

    NASA Astrophysics Data System (ADS)

    Nir, A.; Doughty, C.; Tsang, C. F.

    Validation methods that were developed in the context of the deterministic concepts of past generations often cannot be directly applied to environmental problems, which may be characterized by limited reproducibility of results and highly complex models. Instead, validation is interpreted here as a series of activities, including both theoretical and experimental tests, designed to enhance our confidence in the capability of a proposed model to describe some aspect of reality. We examine the validation process applied to a project concerned with heat and fluid transport in porous media, in which mathematical modeling, simulation, and results of field experiments are evaluated in order to determine the feasibility of a system for seasonal thermal energy storage in shallow unsaturated soils. Technical details of the field experiments are not included, but appear in previous publications. Validation activities are divided into three stages. The first stage, carried out prior to the field experiments, is concerned with modeling the relevant physical processes, optimization of the heat-exchanger configuration and the shape of the storage volume, and multi-year simulation. Subjects requiring further theoretical and experimental study are identified at this stage. The second stage encompasses the planning and evaluation of the initial field experiment. Simulations are made to determine the experimental time scale and optimal sensor locations. Soil thermal parameters and temperature boundary conditions are estimated using an inverse method. Then results of the experiment are compared with model predictions using different parameter values and modeling approximations. In the third stage, results of an experiment performed under different boundary conditions are compared to predictions made by the models developed in the second stage. Various aspects of this theoretical and experimental field study are described as examples of the verification and validation procedure.

  2. The validation of a generalized Hooke's law for coronary arteries.

    PubMed

    Wang, Chong; Zhang, Wei; Kassab, Ghassan S

    2008-01-01

    The exponential form of constitutive model is widely used in biomechanical studies of blood vessels. There are two main issues, however, with this model: 1) the curve fits of experimental data are not always satisfactory, and 2) the material parameters may be oversensitive. A new type of strain measure in a generalized Hooke's law for blood vessels was recently proposed by our group to address these issues. The new model has one nonlinear parameter and six linear parameters. In this study, the stress-strain equation is validated by fitting the model to experimental data of porcine coronary arteries. Material constants of left anterior descending artery and right coronary artery for the Hooke's law were computed with a separable nonlinear least-squares method with an excellent goodness of fit. A parameter sensitivity analysis shows that the stability of material constants is improved compared with the exponential model and a biphasic model. A boundary value problem was solved to demonstrate that the model prediction can match the measured arterial deformation under experimental loading conditions. The validated constitutive relation will serve as a basis for the solution of various boundary value problems of cardiovascular biomechanics.
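    A separable (variable-projection) nonlinear least-squares fit of this kind can be sketched generically: for each trial value of the single nonlinear parameter, the linear parameters are eliminated by ordinary least squares, leaving a one-dimensional search. The model form, parameter values and synthetic data below are illustrative, not the generalized Hooke's law of the paper:

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Synthetic data from a model with one nonlinear parameter k and two linear
# parameters c0, c1 (illustrative form: y = c0 + c1 * (exp(k*x) - 1))
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 50)
k_true, c_true = 2.0, np.array([0.5, 1.5])
y = c_true[0] + c_true[1] * (np.exp(k_true * x) - 1.0)
y += rng.normal(0.0, 0.01, x.size)

def basis(x, k):
    # Design matrix for the linear parameters at a fixed nonlinear parameter k
    return np.column_stack([np.ones_like(x), np.exp(k * x) - 1.0])

def residual_norm(k):
    # Project out the linear parameters with ordinary least squares
    A = basis(x, k)
    c, *_ = np.linalg.lstsq(A, y, rcond=None)
    return np.linalg.norm(A @ c - y)

# One-dimensional bounded search over the single nonlinear parameter
res = minimize_scalar(residual_norm, bounds=(0.1, 5.0), method="bounded")
k_hat = res.x
c_hat, *_ = np.linalg.lstsq(basis(x, k_hat), y, rcond=None)
print(k_hat, c_hat)
```

Separating the linear and nonlinear parameters in this way is what makes the fit stable: the search space is one-dimensional, so the linear constants cannot trade off against each other as they can in a fully nonlinear fit.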

  3. Ab initio random structure searching of organic molecular solids: assessment and validation against experimental data.

    PubMed

    Zilka, Miri; Dudenko, Dmytro V; Hughes, Colan E; Williams, P Andrew; Sturniolo, Simone; Franks, W Trent; Pickard, Chris J; Yates, Jonathan R; Harris, Kenneth D M; Brown, Steven P

    2017-10-04

    This paper explores the capability of using the DFT-D ab initio random structure searching (AIRSS) method to generate crystal structures of organic molecular materials, focusing on a system (m-aminobenzoic acid; m-ABA) that is known from experimental studies to exhibit abundant polymorphism. Within the structural constraints selected for the AIRSS calculations (specifically, centrosymmetric structures with Z = 4 for zwitterionic m-ABA molecules), the method is shown to successfully generate the two known polymorphs of m-ABA (form III and form IV) that have these structural features. We highlight various issues that are encountered in comparing crystal structures generated by AIRSS to experimental powder X-ray diffraction (XRD) data and solid-state magic-angle spinning (MAS) NMR data, demonstrating successful fitting for some of the lowest energy structures from the AIRSS calculations against experimental low-temperature powder XRD data for known polymorphs of m-ABA, and showing that comparison of computed and experimental solid-state NMR parameters allows different hydrogen-bonding motifs to be discriminated.

  4. Can jurors recognize missing control groups, confounds, and experimenter bias in psychological science?

    PubMed

    McAuliff, Bradley D; Kovera, Margaret Bull; Nunez, Gabriel

    2009-06-01

    This study examined the ability of jury-eligible community members (N = 248) to detect internal validity threats in psychological science presented during a trial. Participants read a case summary in which an expert testified about a study that varied in internal validity (valid, missing control group, confound, and experimenter bias) and ecological validity (high, low). Ratings of expert evidence quality and expert credibility were higher for the valid versus missing control group versions only. Internal validity did not influence verdict or ratings of plaintiff credibility and no differences emerged as a function of ecological validity. Expert evidence quality, expert credibility, and plaintiff credibility were positively correlated with verdict. Implications for the scientific reasoning literature and for trials containing psychological science are discussed.

  5. Photogrammetric measurement of 3D freeform millimetre-sized objects with micro features: an experimental validation of the close-range camera calibration model for narrow angles of view

    NASA Astrophysics Data System (ADS)

    Percoco, Gianluca; Sánchez Salmerón, Antonio J.

    2015-09-01

    The measurement of millimetre and micro-scale features is performed by high-cost systems based on technologies with narrow working ranges to accurately control the position of the sensors. Photogrammetry would lower the costs of 3D inspection of micro-features and would also be applicable to the inspection of non-removable micro parts of large objects. Unfortunately, the behaviour of photogrammetry when applied to micro-features is not well characterized. In this paper, the authors address these issues towards the application of digital close-range photogrammetry (DCRP) to the micro-scale, taking into account that research papers in the literature state that an angle of view (AOV) of around 10° is the lower limit for the application of the traditional pinhole close-range calibration model (CRCM), which is the basis of DCRP. First, a general calibration procedure is introduced, with the aid of an open-source software library, to calibrate narrow-AOV cameras with the CRCM. Subsequently the procedure is validated using a reflex camera with a 60 mm macro lens, equipped with extension tubes (20 and 32 mm), achieving magnification of approximately 2×, to verify the literature findings with experimental photogrammetric 3D measurements of millimetre-sized objects with micro-features. The limitations of laser printing, used to produce the two-dimensional calibration pattern on common paper, were overcome using an accurate pattern manufactured with a photolithographic process. The results of the experimental activity prove that the CRCM is valid for AOVs down to 3.4° and that DCRP results are comparable with the results of existing and more expensive commercial techniques.

  6. Identification of nonlinear modes using phase-locked-loop experimental continuation and normal form

    NASA Astrophysics Data System (ADS)

    Denis, V.; Jossic, M.; Giraud-Audine, C.; Chomette, B.; Renault, A.; Thomas, O.

    2018-06-01

    In this article, we address the model identification of nonlinear vibratory systems, with a specific focus on systems modeled with distributed nonlinearities, such as geometrically nonlinear mechanical structures. The proposed strategy theoretically relies on the concept of nonlinear modes of the underlying conservative unforced system and the use of normal forms. Within this framework, it is shown that, in the absence of internal resonance, a valid reduced-order model for a nonlinear mode is a single Duffing oscillator. We then propose an efficient experimental strategy to measure the backbone curve of a particular nonlinear mode, and we use it to identify the free parameters of the reduced-order model. The experimental part relies on a phase-locked loop (PLL) and enables a robust and automatic measurement of backbone curves as well as forced responses. It is theoretically and experimentally shown that the PLL is able to stabilize the unstable part of Duffing-like frequency responses, thus enabling their robust experimental measurement. Finally, the whole procedure is tested on three experimental systems: a circular plate, a Chinese gong and a piezoelectric cantilever beam. This enables the procedure to be validated by comparison with available theoretical models as well as with other experimental identification methods.
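    The single-Duffing-oscillator reduced-order model mentioned above has a well-known first-order backbone curve, ω(a) ≈ ω0(1 + 3γa²/(8ω0²)). The Python sketch below (with illustrative parameter values, not taken from the article) checks this approximation against direct numerical integration of the undamped oscillator:

```python
import numpy as np
from scipy.integrate import solve_ivp

omega0, gamma = 1.0, 0.5  # illustrative hardening-Duffing parameters

def backbone(a):
    # First-order normal-form backbone of x'' + omega0^2 x + gamma x^3 = 0:
    # omega(a) ~= omega0 * (1 + 3*gamma*a**2 / (8*omega0**2))
    return omega0 * (1.0 + 3.0 * gamma * a ** 2 / (8.0 * omega0 ** 2))

def numerical_frequency(a):
    # Integrate the undamped oscillator released from rest at amplitude a
    f = lambda t, s: [s[1], -omega0 ** 2 * s[0] - gamma * s[0] ** 3]
    sol = solve_ivp(f, (0.0, 200.0), [a, 0.0], max_step=0.01)
    t, x = sol.t, sol.y[0]
    # Estimate the period from successive downward zero crossings
    crossings = t[1:][(x[:-1] > 0) & (x[1:] <= 0)]
    return 2.0 * np.pi / np.mean(np.diff(crossings))

a = 0.5
print(backbone(a), numerical_frequency(a))
```

For a hardening nonlinearity (γ > 0) the backbone bends to higher frequency with amplitude, which is exactly the curve the PLL-based continuation traces experimentally.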

  7. Model-Free Primitive-Based Iterative Learning Control Approach to Trajectory Tracking of MIMO Systems With Experimental Validation.

    PubMed

    Radac, Mircea-Bogdan; Precup, Radu-Emil; Petriu, Emil M

    2015-11-01

    This paper proposes a novel model-free trajectory tracking of multiple-input multiple-output (MIMO) systems by the combination of iterative learning control (ILC) and primitives. The optimal trajectory tracking solution is obtained in terms of previously learned solutions to simple tasks called primitives. The library of primitives that are stored in memory consists of pairs of reference input/controlled output signals. The reference input primitives are optimized in a model-free ILC framework without using knowledge of the controlled process. The guaranteed convergence of the learning scheme is built upon a model-free virtual reference feedback tuning design of the feedback decoupling controller. Each new complex trajectory to be tracked is decomposed into the output primitives regarded as basis functions. The optimal reference input for the control system to track the desired trajectory is next recomposed from the reference input primitives. This is advantageous because the optimal reference input is computed straightforward without the need to learn from repeated executions of the tracking task. In addition, the optimization problem specific to trajectory tracking of square MIMO systems is decomposed in a set of optimization problems assigned to each separate single-input single-output control channel that ensures a convenient model-free decoupling. The new model-free primitive-based ILC approach is capable of planning, reasoning, and learning. A case study dealing with the model-free control tuning for a nonlinear aerodynamic system is included to validate the new approach. The experimental results are given.
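    The decomposition/recomposition idea can be sketched for a simple linear plant: the desired output is projected onto the stored output primitives, and the same coefficients recombine the stored reference inputs. The plant, library size and signals below are illustrative assumptions, not the MIMO system of the paper:

```python
import numpy as np

rng = np.random.default_rng(1)
N = 50
h = 0.8 ** np.arange(N)  # impulse response of an assumed LTI plant (illustrative)

def plant(r):
    # Controlled output: convolution of the reference input with h
    return np.convolve(r, h)[:N]

# Library of primitives: pairs of reference-input / controlled-output signals
R = rng.normal(size=(5, N))           # reference-input primitives
Y = np.array([plant(r) for r in R])   # corresponding output primitives

# A new desired trajectory (here constructed to lie in the span of the library)
alpha_true = np.array([1.0, -0.5, 0.3, 0.0, 2.0])
y_d = alpha_true @ Y

# Step 1: decompose the desired output onto the output primitives
alpha, *_ = np.linalg.lstsq(Y.T, y_d, rcond=None)

# Step 2: recompose the reference input from the reference-input primitives;
# by linearity, applying it to the plant reproduces y_d without re-learning
r_new = alpha @ R
y_new = plant(r_new)
print(np.max(np.abs(y_new - y_d)))
```

The key property exploited is linearity: because each stored pair satisfies the plant map, any linear combination of output primitives is reproduced exactly by the same combination of input primitives, so no repeated executions of the new task are needed.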

  8. Research Directions for Cyber Experimentation: Workshop Discussion Analysis.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    DeWaard, Elizabeth; Deccio, Casey; Fritz, David Jakob

    Sandia National Laboratories hosted a workshop on August 11, 2017, entitled "Research Directions for Cyber Experimentation," which focused on identifying and addressing research gaps within the field of cyber experimentation, particularly emulation testbeds. This report mainly documents the discussion toward the end of the workshop, which included research gaps such as developing a sustainable research infrastructure, expanding cyber experimentation, and making the field more accessible to subject matter experts who may not have a background in computer science. Other gaps include methodologies for rigorous experimentation, validation, and uncertainty quantification, which, if addressed, also have the potential to bridge the gap between cyber experimentation and cyber engineering. Workshop attendees presented various ways to overcome these research gaps; however, the main conclusion for overcoming these gaps is better communication through increased workshops, conferences, email lists, and Slack channels, among other opportunities.

  9. Validation of drift and diffusion coefficients from experimental data

    NASA Astrophysics Data System (ADS)

    Riera, R.; Anteneodo, C.

    2010-04-01

    Many fluctuation phenomena, in physics and other fields, can be modeled by Fokker-Planck or stochastic differential equations whose coefficients, associated with drift and diffusion components, may be estimated directly from the observed time series. The correct characterization of these coefficients is crucial to determine the system quantifiers. However, due to the finite sampling rates of real data, the empirical estimates may significantly differ from their true functional forms. In the literature, low-order corrections, or even no corrections, have been applied to the finite-time estimates. A frequent outcome consists of linear drift and quadratic diffusion coefficients. For this case, exact corrections have recently been found from Itô-Taylor expansions. Nevertheless, model validation constitutes a necessary step before determining and applying the appropriate corrections. Here, we exploit the consequences of the exact theoretical results obtained for the linear-quadratic model. In particular, we discuss whether the observed finite-time estimates are actually a manifestation of that model. The relevance of this analysis is demonstrated by its application to two contrasting real-data examples in which finite-time linear drift and quadratic diffusion coefficients are observed. In one case the linear-quadratic model is readily rejected, while in the other, although the model constitutes a very good approximation, low-order corrections are inappropriate. These examples give warning signs about the proper interpretation of finite-time analysis even in more general diffusion processes.
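    The finite-time estimates discussed here are typically obtained from conditional moments of the increments (Kramers-Moyal-type estimates). The sketch below applies this to a simulated Ornstein-Uhlenbeck process, which has a linear drift and, unlike the paper's quadratic case, a constant diffusion; all parameters are illustrative:

```python
import numpy as np

rng = np.random.default_rng(2)

# Simulate an Ornstein-Uhlenbeck process dx = -theta*x dt + sigma dW
# (Euler-Maruyama); theta, sigma and the sampling interval dt are illustrative.
theta, sigma, dt, n = 1.0, 0.5, 0.01, 400_000
x = np.empty(n)
x[0] = 0.0
noise = rng.normal(0.0, np.sqrt(dt), n - 1)
for i in range(n - 1):
    x[i + 1] = x[i] - theta * x[i] * dt + sigma * noise[i]

# Finite-time estimates of drift D1 and diffusion D2 from conditional moments
# of the increments, binned over the state variable
dx = np.diff(x)
bins = np.linspace(-1.0, 1.0, 21)
centers = 0.5 * (bins[:-1] + bins[1:])
idx = np.digitize(x[:-1], bins) - 1
D1 = np.array([dx[idx == k].mean() / dt for k in range(20)])
D2 = np.array([(dx[idx == k] ** 2).mean() / (2 * dt) for k in range(20)])

# For the OU process the true drift is -theta*x and D2 ~= sigma**2 / 2
slope = np.polyfit(centers, D1, 1)[0]
print(slope, D2.mean())
```

Even in this benign case the estimates carry a finite-sampling bias of order dt, which is precisely the effect the corrections discussed in the abstract address.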

  10. Transcranial Assessment and Visualization of Acoustic Cavitation: Modeling and Experimental Validation

    PubMed Central

    Clement, Gregory T.; McDannold, Nathan

    2015-01-01

    The interaction of ultrasonically-controlled microbubble oscillations (acoustic cavitation) with tissues and biological media has been shown to induce a wide range of bioeffects that may have significant impact to therapy and diagnosis of central nervous system diseases and disorders. However, the inherently non-linear microbubble oscillations combined with the micrometer and microsecond scales involved in these interactions and the limited methods to assess and visualize them transcranially hinder both their optimal use and translation to the clinics. To overcome these challenges, we present a noninvasive and clinically relevant framework that combines numerical simulations with multimodality imaging to assess and visualize the microbubble oscillations transcranially. In the present work, acoustic cavitation was studied with an integrated US and MR imaging guided clinical FUS system in non-human primates. This multimodality imaging system allowed us to concurrently induce and visualize acoustic cavitation transcranially. A high-resolution brain CT-scan that allowed us to determine the head acoustic properties (density, speed of sound, and absorption) was also co-registered to the US and MR images. The derived acoustic properties and the location of the targets that were determined by the 3D-CT scans and the post sonication MRI respectively were then used as inputs to two-and three-dimensional Finite Difference Time Domain (2D, 3D-FDTD) simulations that matched the experimental conditions and geometry. At the experimentally-determined target locations, synthetic point sources with pressure amplitude traces derived by either a Gaussian function or the output of a microbubble dynamics model were numerically excited and propagated through the skull towards a virtual US imaging array. Then, using passive acoustic mapping that was refined to incorporate variable speed of sound, we assessed the losses and aberrations induced by the skull as a function of the acoustic
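    The FDTD propagation step that such 2D/3D solvers build on can be illustrated in one dimension. The sketch below advances the scalar wave equation with a second-order finite-difference scheme and injects a Gaussian-envelope point source; the grid, source shape, and sound speed are illustrative assumptions, not the study's parameters.

    ```python
    import numpy as np

    c, dx = 1500.0, 1e-3          # sound speed (m/s) and grid spacing (m)
    dt = 0.5 * dx / c             # time step satisfying the CFL condition
    n, steps = 400, 300
    p = np.zeros(n)
    p_prev = np.zeros(n)

    # Gaussian-envelope source amplitude over time (illustrative).
    src = np.exp(-0.5 * ((np.arange(steps) - 60) / 15.0) ** 2)

    coef = (c * dt / dx) ** 2
    for step in range(steps):
        # Discrete Laplacian; the boundary cells stay clamped at zero.
        lap = np.zeros(n)
        lap[1:-1] = p[2:] - 2 * p[1:-1] + p[:-2]
        # Second-order-in-time update of the wave equation.
        p_next = 2 * p - p_prev + coef * lap
        p_next[n // 2] += src[step]      # point source at the grid center
        p_prev, p = p, p_next

    print(f"peak |p| after {steps} steps: {np.abs(p).max():.3f}")
    ```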

  11. Upgrade of the gas flow control system of the resistive current leads of the LHC inner triplet magnets: Simulation and experimental validation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Perin, A.; Casas-Cubillos, J.; Pezzetti, M.

    2014-01-29

    The 600 A and 120 A circuits of the inner triplet magnets of the Large Hadron Collider are powered by resistive gas-cooled current leads. The current solution for controlling the gas flow of these leads has shown severe operability limitations. In order to allow more precise and more reliable control of the cooling gas flow, new flowmeters will be installed during the first long shutdown of the LHC. Because of the high level of radiation in the area next to the current leads, the flowmeters will be installed in shielded areas located up to 50 m away from the current leads. Since the control valves remain located next to the current leads, this configuration results in long piping between the valves and the flowmeters. In order to determine its dynamic behaviour, the proposed system was simulated with a numerical model and validated with experimental measurements performed on a dedicated test bench.
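    The dynamic effect of placing the flowmeter far from the control valve can be illustrated with a lumped model. The sketch below treats the line as a transport delay followed by a first-order lag and integrates the step response with explicit Euler; the time constant and delay values are assumptions for illustration, not CERN measurements.

    ```python
    import numpy as np

    dt = 0.01                        # s, integration step
    t = np.arange(0.0, 30.0, dt)
    tau = 2.0                        # s, assumed first-order lag of the gas line
    delay = 1.5                      # s, assumed transport delay over the pipe
    u = (t >= 1.0).astype(float)     # unit flow step commanded at the valve

    # Pure transport delay: shift the input by `delay` seconds.
    shift = int(delay / dt)
    u_delayed = np.concatenate([np.zeros(shift), u[:-shift]])

    # Explicit Euler integration of  tau * dy/dt = u_delayed - y.
    y = np.zeros_like(t)
    for i in range(len(t) - 1):
        y[i + 1] = y[i] + dt * (u_delayed[i] - y[i]) / tau

    # The remote flowmeter reading settles to the commanded flow after a
    # few time constants, but lags the valve action during transients.
    print(f"reading at t=30 s: {y[-1]:.3f}")
    ```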

  12. Radiant Energy Measurements from a Scaled Jet Engine Axisymmetric Exhaust Nozzle for a Baseline Code Validation Case

    NASA Technical Reports Server (NTRS)

    Baumeister, Joseph F.

    1994-01-01

    A non-flowing, electrically heated test rig was developed to verify computer codes that calculate radiant energy propagation from nozzle geometries that represent aircraft propulsion nozzle systems. Since there are a variety of analysis tools used to evaluate thermal radiation propagation from partially enclosed nozzle surfaces, an experimental benchmark test case was developed for code comparison. This paper briefly describes the nozzle test rig and the developed analytical nozzle geometry used to compare the experimental and predicted thermal radiation results. A major objective of this effort was to make available the experimental results and the analytical model in a format to facilitate conversion to existing computer code formats. For code validation purposes this nozzle geometry represents one validation case for one set of analysis conditions. Since each computer code has advantages and disadvantages based on scope, requirements, and desired accuracy, the usefulness of this single nozzle baseline validation case can be limited for some code comparisons.
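    The radiosity-type exchange such codes compute can be illustrated with its simplest building block, the net radiant exchange between two diffuse gray surfaces forming an enclosure. The temperatures, areas, emissivities, and view factor below are illustrative assumptions, not the rig's values.

    ```python
    SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

    def net_exchange(T1, T2, A1, A2, eps1, eps2, F12):
        """Net radiant heat flow Q12 (W) from surface 1 to surface 2 for a
        two-surface gray enclosure (series surface and space resistances)."""
        resistance = ((1 - eps1) / (eps1 * A1)
                      + 1.0 / (A1 * F12)
                      + (1 - eps2) / (eps2 * A2))
        return SIGMA * (T1**4 - T2**4) / resistance

    # Heated nozzle wall radiating to a cooler detector surface (assumed values).
    Q = net_exchange(T1=800.0, T2=300.0, A1=0.10, A2=0.10,
                     eps1=0.9, eps2=0.9, F12=0.8)
    print(f"net exchange: {Q:.1f} W")
    ```

    Full nozzle codes generalize this to many surface patches by solving the radiosity system with geometry-dependent view factors, which is exactly where benchmark cases like this one are needed.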

  13. Design, development, testing and validation of a Photonics Virtual Laboratory for the study of LEDs

    NASA Astrophysics Data System (ADS)

    Naranjo, Francisco L.; Martínez, Guadalupe; Pérez, Ángel L.; Pardo, Pedro J.

    2014-07-01

    This work presents the design, development, testing and validation of a Photonics Virtual Laboratory, highlighting the study of LEDs. The study was conducted from a conceptual, experimental and didactic standpoint, using e-learning and m-learning platforms. Specifically, teaching tools have been developed that help ensure that our students achieve meaningful learning. The scientific aspect, such as the study of LEDs, has been brought together with techniques for the generation and transfer of knowledge through the selection, hierarchization and structuring of information using concept maps. For the validation of the didactic materials developed, procedures with various assessment tools were used for the collection and processing of data, applied in the context of an experimental design. Additionally, a statistical analysis was performed to determine the validity of the materials developed. The assessment has been designed to validate the contributions of the new materials over the traditional method of teaching, and to quantify the learning achieved by students, in order to draw conclusions that serve as a reference for their application in teaching and learning processes, and to comprehensively validate the work carried out.

  14. A combined computational-experimental analyses of selected metabolic enzymes in Pseudomonas species.

    PubMed

    Perumal, Deepak; Lim, Chu Sing; Chow, Vincent T K; Sakharkar, Kishore R; Sakharkar, Meena K

    2008-09-10

    Comparative genomic analysis has revolutionized our ability to predict the metabolic subsystems that occur in newly sequenced genomes, and to explore the functional roles of the set of genes within each subsystem. These computational predictions can considerably reduce the volume of experimental studies required to assess basic metabolic properties of multiple bacterial species. However, experimental validations are still required to resolve the apparent inconsistencies in the predictions by multiple resources. Here, we present combined computational-experimental analyses on eight completely sequenced Pseudomonas species. Comparative pathway analyses reveal that several pathways within the Pseudomonas species show high plasticity and versatility. Potential bypasses in 11 metabolic pathways were identified. We further confirmed the presence of the enzyme O-acetyl homoserine (thiol) lyase (EC: 2.5.1.49) in P. syringae pv. tomato that revealed inconsistent annotations in KEGG and in the recently published SYSTOMONAS database. These analyses connect and integrate systematic data generation, computational data interpretation, and experimental validation and represent a synergistic and powerful means for conducting biological research.

  15. Selecting and Improving Quasi-Experimental Designs in Effectiveness and Implementation Research.

    PubMed

    Handley, Margaret A; Lyles, Courtney R; McCulloch, Charles; Cattamanchi, Adithya

    2018-04-01

    Interventional researchers face many design challenges when assessing intervention implementation in real-world settings. Intervention implementation requires holding fast on internal validity needs while incorporating external validity considerations (such as uptake by diverse subpopulations, acceptability, cost, and sustainability). Quasi-experimental designs (QEDs) are increasingly employed to achieve a balance between internal and external validity. Although these designs are often referred to and summarized in terms of logistical benefits, there is still uncertainty about (a) selecting from among various QEDs and (b) developing strategies to strengthen the internal and external validity of QEDs. We focus here on commonly used QEDs (prepost designs with nonequivalent control groups, interrupted time series, and stepped-wedge designs) and discuss several variants that maximize internal and external validity at the design, execution and implementation, and analysis stages.
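    One of the quasi-experimental designs named above, the interrupted time series, is typically analyzed with segmented regression. The sketch below simulates a series with a level jump and a slope change at the intervention and recovers both with ordinary least squares; the effect sizes and noise level are made-up illustrative values.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # 24 observations before and 24 after the intervention (illustrative).
    n_pre, n_post = 24, 24
    t = np.arange(n_pre + n_post)
    post = (t >= n_pre).astype(float)               # intervention indicator
    t_since = np.where(post == 1, t - n_pre + 1, 0) # time since intervention

    true_level, true_slope = 10.0, 0.2
    true_jump, true_slope_change = 3.0, 0.1
    y = (true_level + true_slope * t + true_jump * post
         + true_slope_change * t_since + rng.normal(0, 0.5, t.size))

    # OLS fit of the segmented model  y = b0 + b1*t + b2*post + b3*t_since.
    X = np.column_stack([np.ones_like(t), t, post, t_since])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    b0, b1, b2, b3 = beta
    print(f"level change at intervention: {b2:.2f} (true {true_jump})")
    print(f"slope change after intervention: {b3:.2f} (true {true_slope_change})")
    ```

    In practice the error term is rarely independent; autocorrelation-robust standard errors or ARIMA-type models strengthen inference from this design.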

  16. Design, fabrication and experimental validation of a novel dry-contact sensor for measuring electroencephalography signals without skin preparation.

    PubMed

    Liao, Lun-De; Wang, I-Jan; Chen, Sheng-Fu; Chang, Jyh-Yeong; Lin, Chin-Teng

    2011-01-01

    In the present study, novel dry-contact sensors for measuring electro-encephalography (EEG) signals without any skin preparation are designed, fabricated by an injection molding manufacturing process and experimentally validated. Conventional wet electrodes are commonly used to measure EEG signals; they provide excellent EEG signals subject to proper skin preparation and conductive gel application. However, a series of skin preparation procedures for applying the wet electrodes is always required and usually creates trouble for users. To overcome these drawbacks, novel dry-contact EEG sensors were proposed for potential operation in the presence or absence of hair and without any skin preparation or conductive gel usage. The dry EEG sensors were designed to contact the scalp surface with 17 spring contact probes. Each probe was designed to include a probe head, plunger, spring, and barrel. The 17 probes were inserted into a flexible substrate using a one-time forming process via an established injection molding procedure. With these 17 spring contact probes, the flexible substrate allows for high geometric conformity between the sensor and the irregular scalp surface to maintain low skin-sensor interface impedance. Additionally, the flexible substrate also initiates a sensor buffer effect, eliminating pain when force is applied. The proposed dry EEG sensor was reliable in measuring EEG signals without any skin preparation or conductive gel usage, as compared with the conventional wet electrodes.

  17. Design, Fabrication and Experimental Validation of a Novel Dry-Contact Sensor for Measuring Electroencephalography Signals without Skin Preparation

    PubMed Central

    Liao, Lun-De; Wang, I-Jan; Chen, Sheng-Fu; Chang, Jyh-Yeong; Lin, Chin-Teng

    2011-01-01

    In the present study, novel dry-contact sensors for measuring electro-encephalography (EEG) signals without any skin preparation are designed, fabricated by an injection molding manufacturing process and experimentally validated. Conventional wet electrodes are commonly used to measure EEG signals; they provide excellent EEG signals subject to proper skin preparation and conductive gel application. However, a series of skin preparation procedures for applying the wet electrodes is always required and usually creates trouble for users. To overcome these drawbacks, novel dry-contact EEG sensors were proposed for potential operation in the presence or absence of hair and without any skin preparation or conductive gel usage. The dry EEG sensors were designed to contact the scalp surface with 17 spring contact probes. Each probe was designed to include a probe head, plunger, spring, and barrel. The 17 probes were inserted into a flexible substrate using a one-time forming process via an established injection molding procedure. With these 17 spring contact probes, the flexible substrate allows for high geometric conformity between the sensor and the irregular scalp surface to maintain low skin-sensor interface impedance. Additionally, the flexible substrate also initiates a sensor buffer effect, eliminating pain when force is applied. The proposed dry EEG sensor was reliable in measuring EEG signals without any skin preparation or conductive gel usage, as compared with the conventional wet electrodes. PMID:22163929

  18. Nuclear Energy Knowledge and Validation Center (NEKVaC) Needs Workshop Summary Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gougar, Hans

    2015-02-01

    The Department of Energy (DOE) has made significant progress in developing simulation tools to predict the behavior of nuclear systems with greater accuracy and in increasing our capability to predict the behavior of these systems outside of the standard range of applications. These analytical tools require a more complex array of validation tests to accurately simulate the physics and multiple length and time scales. Results from modern simulations will allow experiment designers to narrow the range of conditions needed to bound system behavior and to optimize the deployment of instrumentation to limit the breadth and cost of the campaign. Modern validation, verification and uncertainty quantification (VVUQ) techniques enable analysts to extract information from experiments in a systematic manner and provide the users with a quantified uncertainty estimate. Unfortunately, the capability to perform experiments that would enable taking full advantage of the formalisms of these modern codes has progressed relatively little (with some notable exceptions in fuels and thermal-hydraulics); the majority of the experimental data available today is the "historic" data accumulated over the last decades of nuclear systems R&D. A validated code-model is a tool for users. An unvalidated code-model is useful for code developers to gain understanding, publish research results, attract funding, etc. As nuclear analysis codes have become more sophisticated, so have the measurement and validation methods and the challenges that confront them. A successful yet cost-effective validation effort requires expertise possessed only by a few, resources possessed only by the well-capitalized (or a willing collective), and a clear, well-defined objective (validating a code that is developed to satisfy the need(s) of an actual user). To that end, the Idaho National Laboratory established the Nuclear Energy Knowledge and Validation Center to address the challenges of modern code validation.

  19. Development of the Biological Experimental Design Concept Inventory (BEDCI)

    PubMed Central

    Deane, Thomas; Jeffery, Erica; Pollock, Carol; Birol, Gülnur

    2014-01-01

    Interest in student conception of experimentation inspired the development of a fully validated 14-question inventory on experimental design in biology (BEDCI) by following established best practices in concept inventory (CI) design. This CI can be used to diagnose specific examples of non–expert-like thinking in students and to evaluate the success of teaching strategies that target conceptual changes. We used BEDCI to diagnose non–expert-like student thinking in experimental design at the pre- and posttest stage in five courses (total n = 580 students) at a large research university in western Canada. Calculated difficulty and discrimination metrics indicated that BEDCI questions are able to effectively capture learning changes at the undergraduate level. A high correlation (r = 0.84) between responses by students in similar courses and at the same stage of their academic career, also suggests that the test is reliable. Students showed significant positive learning changes by the posttest stage, but some non–expert-like responses were widespread and persistent. BEDCI is a reliable and valid diagnostic tool that can be used in a variety of life sciences disciplines. PMID:25185236
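    The difficulty and discrimination metrics mentioned for BEDCI come from classical test theory: difficulty is the proportion of correct answers on an item, and a common discrimination index is the difference in proportion correct between the top and bottom 27% of scorers. The sketch below computes both on synthetic response data; the student model and sizes are illustrative assumptions, not BEDCI data.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    # Generate plausible 0/1 responses from a simple logistic model in
    # which higher-ability students answer easier items correctly more often.
    n_students, n_items = 200, 14
    ability = rng.normal(0, 1, n_students)
    ease = rng.uniform(-1, 1, n_items)
    p = 1.0 / (1.0 + np.exp(-(ability[:, None] + ease[None, :])))
    responses = (rng.random((n_students, n_items)) < p).astype(int)

    # Upper and lower 27% groups by total score.
    total = responses.sum(axis=1)
    order = np.argsort(total)
    k = int(0.27 * n_students)
    lower, upper = order[:k], order[-k:]

    difficulty = responses.mean(axis=0)                   # proportion correct
    discrimination = (responses[upper].mean(axis=0)
                      - responses[lower].mean(axis=0))    # upper minus lower
    print("mean difficulty:", round(float(difficulty.mean()), 2))
    print("mean discrimination:", round(float(discrimination.mean()), 2))
    ```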

  20. Reliability and Validity of Rubrics for Assessment through Writing

    ERIC Educational Resources Information Center

    Rezaei, Ali Reza; Lovorn, Michael

    2010-01-01

    This experimental project investigated the reliability and validity of rubrics in assessment of students' written responses to a social science "writing prompt". The participants were asked to grade one of the two samples of writing assuming it was written by a graduate student. In fact both samples were prepared by the authors. The…
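    Rubric-reliability studies of this kind commonly report chance-corrected inter-rater agreement, for which Cohen's kappa is the standard statistic. The sketch below implements it for two raters; the rater scores are made up for illustration.

    ```python
    from collections import Counter

    def cohens_kappa(rater_a, rater_b):
        """Cohen's kappa: observed agreement corrected for the agreement
        expected by chance from each rater's marginal score distribution."""
        assert len(rater_a) == len(rater_b)
        n = len(rater_a)
        observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
        counts_a, counts_b = Counter(rater_a), Counter(rater_b)
        expected = sum(counts_a[c] * counts_b[c] for c in counts_a) / n**2
        return (observed - expected) / (1.0 - expected)

    # Hypothetical rubric scores (1-5) given by two raters to ten essays.
    a = [3, 4, 2, 5, 3, 3, 4, 2, 1, 4]
    b = [3, 4, 3, 5, 3, 2, 4, 2, 1, 4]
    print(f"kappa = {cohens_kappa(a, b):.2f}")  # → kappa = 0.74
    ```

    For ordinal rubric scores, a weighted kappa that penalizes large disagreements more than near-misses is often preferred.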