ERIC Educational Resources Information Center
Rossi, Robert Joseph
Methods drawn from four logical theories associated with studies of inductive processes are applied to the assessment and evaluation of experimental episode construct validity. It is shown that this application provides for estimates of episode informativeness with respect to the person examined in terms of the construct and to the construct…
Validation of WIND for a Series of Inlet Flows
NASA Technical Reports Server (NTRS)
Slater, John W.; Abbott, John M.; Cavicchi, Richard H.
2002-01-01
Validation assessments compare WIND CFD simulations to experimental data for a series of inlet flows ranging in Mach number from low subsonic to hypersonic. The validation procedures follow the guidelines of the AIAA. The WIND code performs well in matching the available experimental data. The assessments demonstrate the use of WIND and provide confidence in its use for the analysis of aircraft inlets.
Validation of the Soil Moisture Active Passive mission using USDA-ARS experimental watersheds
USDA-ARS's Scientific Manuscript database
The calibration and validation program of the Soil Moisture Active Passive mission (SMAP) relies upon an international cooperative of in situ networks to provide ground truth references across a variety of landscapes. The USDA Agricultural Research Service operates several experimental watersheds wh...
Experimental validation of calculated atomic charges in ionic liquids
NASA Astrophysics Data System (ADS)
Fogarty, Richard M.; Matthews, Richard P.; Ashworth, Claire R.; Brandt-Talbot, Agnieszka; Palgrave, Robert G.; Bourne, Richard A.; Vander Hoogerstraete, Tom; Hunt, Patricia A.; Lovelock, Kevin R. J.
2018-05-01
A combination of X-ray photoelectron spectroscopy and near edge X-ray absorption fine structure spectroscopy has been used to provide an experimental measure of nitrogen atomic charges in nine ionic liquids (ILs). These experimental results are used to validate charges calculated with three computational methods: charges from electrostatic potentials using a grid-based method (ChelpG), natural bond orbital population analysis, and the atoms in molecules approach. By combining these results with those from a previous study on sulfur, we find that ChelpG charges provide the best description of the charge distribution in ILs. However, we find that ChelpG charges can lead to significant conformational dependence and therefore advise that small differences in ChelpG charges (<0.3 e) should be interpreted with care. We use these validated charges to provide physical insight into nitrogen atomic charges for the ILs probed.
The Fresnel Zone Light Field Spectral Imager
2017-03-23
Marciniak, AFIT-ENP-MS-17-M-095. This thesis provides a computational model and the first experimental demonstration of a Fresnel zone...Fresnel propagation. It was validated experimentally and provides an excellent demonstration of system capabilities. The experimentally demonstrated system...in the measured light fields, they did not degrade the system's performance. Experimental demonstration also showed the capability to resolve between
Aerodynamic Database Development for Mars Smart Lander Vehicle Configurations
NASA Technical Reports Server (NTRS)
Bobskill, Glenn J.; Parikh, Paresh C.; Prabhu, Ramadas K.; Tyler, Erik D.
2002-01-01
An aerodynamic database has been generated for the Mars Smart Lander Shelf-All configuration using computational fluid dynamics (CFD) simulations. Three different CFD codes were used: USM3D and FELISA, which are based on unstructured grid technology, and LAURA, an established and validated structured CFD code. As part of this database development, the results for the Mars continuum were validated with experimental data and comparisons were made where applicable. The validation of USM3D and LAURA with the Unitary experimental data, the use of intermediate LAURA check analyses, as well as the validation of FELISA with the Mach 6 CF(sub 4) experimental data provided higher confidence in the ability of CFD to provide aerodynamic data for determining the static trim characteristics for longitudinal stability. The analyses of the noncontinuum regime showed the existence of multiple trim angles of attack that can be unstable or stable trim points. This information is needed to design the guidance controller throughout the trajectory.
Hariharan, Prasanna; D’Souza, Gavin A.; Horner, Marc; Morrison, Tina M.; Malinauskas, Richard A.; Myers, Matthew R.
2017-01-01
A “credible” computational fluid dynamics (CFD) model has the potential to provide a meaningful evaluation of safety in medical devices. One major challenge in establishing “model credibility” is to determine the required degree of similarity between the model and experimental results for the model to be considered sufficiently validated. This study proposes a “threshold-based” validation approach that provides a well-defined acceptance criterion, which is a function of how close the simulation and experimental results are to the safety threshold, for establishing model validity. The validation criterion developed following the threshold approach is not only a function of the Comparison Error, E (the difference between experiments and simulations), but also takes into account the risk to patient safety posed by E. The method is applicable to scenarios in which a safety threshold can be clearly defined (e.g., the viscous shear-stress threshold for hemolysis in blood-contacting devices). The applicability of the new validation approach was tested on the FDA nozzle geometry. The context of use (COU) was to evaluate whether the instantaneous viscous shear stress in the nozzle geometry at Reynolds numbers (Re) of 3500 and 6500 was below the commonly accepted threshold for hemolysis. The CFD results (“S”) of velocity and viscous shear stress were compared with inter-laboratory experimental measurements (“D”). The uncertainties in the CFD and experimental results due to input parameter uncertainties were quantified following the ASME V&V 20 standard. The CFD models for both Re = 3500 and 6500 could not be sufficiently validated by performing a direct comparison between CFD and experimental results using the Student’s t-test. 
However, following the threshold-based approach, a Student’s t-test comparing |S-D| and |Threshold-S| showed that, relative to the threshold, the CFD and experimental datasets for Re = 3500 were statistically similar and the model could be considered sufficiently validated for the COU. However, for Re = 6500, at certain locations where the shear stress is close to the hemolysis threshold, the CFD model could not be considered sufficiently validated for the COU. Our analysis showed that the model could be sufficiently validated either by reducing the uncertainties in the experiments, simulations, and the threshold or by increasing the sample size for the experiments and simulations. The threshold approach can be applied to all types of computational models and provides an objective way of determining model credibility and evaluating medical devices. PMID:28594889
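The threshold-based comparison described above can be sketched numerically. This is a minimal illustration with synthetic data: the threshold value, means, spreads, and sample sizes are invented, and the ASME V&V 20 uncertainty propagation is omitted for brevity.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical values, not taken from the study.
threshold = 600.0                  # assumed hemolysis shear-stress threshold (Pa)
S = rng.normal(420.0, 15.0, 10)    # simulated shear-stress samples ("S")
D = rng.normal(430.0, 20.0, 10)    # experimental measurements ("D")

# Direct comparison: are simulation and experiment statistically similar?
t_direct, p_direct = stats.ttest_ind(S, D, equal_var=False)

# Threshold-based comparison: is the simulation-experiment gap |S - D|
# small relative to the margin to the safety threshold |threshold - S|?
gap = np.abs(S - D)
margin = np.abs(threshold - S)
t_thresh, p_thresh = stats.ttest_ind(gap, margin, equal_var=False)

# A gap significantly smaller than the margin suggests the comparison
# error is negligible relative to the safety threshold.
print(f"direct-comparison p = {p_direct:.3f}")
print(f"threshold-based p = {p_thresh:.3g}, gap << margin: {gap.mean() < margin.mean()}")
```

Here the simulated stresses sit far below the assumed threshold, so the comparison error is small relative to the safety margin; near the threshold the same test would fail, mirroring the Re = 6500 result.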
CFD Validation Experiment of a Mach 2.5 Axisymmetric Shock-Wave/Boundary-Layer Interaction
NASA Technical Reports Server (NTRS)
Davis, David O.
2015-01-01
Experimental investigations of specific flow phenomena, e.g., Shock-Wave/Boundary-Layer Interactions (SWBLI), provide great insight into the flow behavior but often lack the necessary details to be useful as CFD validation experiments. Reasons include: (1) undefined boundary conditions and inconsistent results, (2) undocumented 3D effects (centerline-only measurements), and (3) lack of uncertainty analysis. While there are a number of good subsonic experimental investigations that are sufficiently documented to be considered test cases for CFD and turbulence model validation, the number of supersonic and hypersonic cases is much smaller. This was highlighted by Settles and Dodson's [1] comprehensive review of available supersonic and hypersonic experimental studies. In all, several hundred studies were considered for their database. Of these, over a hundred were subjected to rigorous acceptance criteria. Based on their criteria, only 19 (12 supersonic, 7 hypersonic) were considered of sufficient quality to be used for validation purposes. Aeschliman and Oberkampf [2] recognized the need to develop a specific methodology for experimental studies intended specifically for validation purposes.
Alvarez, M Lucrecia
2014-01-01
Different target prediction algorithms have been developed to provide a list of candidate target genes for a given animal microRNA (miRNA). However, these computational approaches produce both false-positive and false-negative predictions. Therefore, the target genes of a specific miRNA identified in silico should be experimentally validated. In this chapter, we describe a step-by-step protocol for the experimental validation of a direct miRNA target using a faster Dual Firefly-Renilla Luciferase Reporter Assay. We describe how to construct reporter plasmids using the simple, fast, and highly efficient cold fusion cloning technology, which does not require ligase, phosphatase, or restriction enzymes. In addition, we provide a protocol for co-transfection of reporter plasmids with either miRNA mimics or miRNA inhibitors in human embryonic kidney 293 (HEK293) cells, as well as a description of how to measure Firefly and Renilla luciferase activity using the Dual-Glo Luciferase Assay kit. As an example of the use of this technology, we validate glucose-6-phosphate dehydrogenase (G6PD) as a direct target of miR-1207-5p.
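The normalization at the heart of a dual-luciferase readout can be illustrated with plain arithmetic. The luminescence counts below are hypothetical, not measurements from the protocol; a real assay would use more replicates and a statistical test on the ratios.

```python
# Hypothetical luminescence counts for three replicates per condition.
def relative_activity(firefly, renilla):
    """Normalize Firefly counts to the Renilla co-reporter to correct
    for differences in transfection efficiency between wells."""
    return [f / r for f, r in zip(firefly, renilla)]

control = relative_activity([9800, 10100, 9900], [5000, 5100, 4900])  # control mimic
mimic = relative_activity([4100, 3900, 4000], [5000, 4950, 5050])     # + miRNA mimic

def mean(xs):
    return sum(xs) / len(xs)

# A ratio well below 1 indicates repression of the reporter carrying
# the candidate 3'-UTR target site by the co-transfected mimic.
repression = mean(mimic) / mean(control)
print(f"relative luciferase activity vs control: {repression:.2f}")
```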
Investigation of the Thermomechanical Response of Shape Memory Alloy Hybrid Composite Beams
NASA Technical Reports Server (NTRS)
Davis, Brian A.
2005-01-01
Previous work at NASA Langley Research Center (LaRC) involved fabrication and testing of composite beams with embedded, pre-strained shape memory alloy (SMA) ribbons. That study also provided comparison of experimental results with numerical predictions from a research code making use of a new thermoelastic model for shape memory alloy hybrid composite (SMAHC) structures. The previous work showed qualitative validation of the numerical model. However, deficiencies in the experimental-numerical correlation were noted and hypotheses for the discrepancies were given for further investigation. The goal of this work is to refine the experimental measurement and numerical modeling approaches in order to better understand the discrepancies, improve the correlation between prediction and measurement, and provide rigorous quantitative validation of the numerical model. Thermal buckling, post-buckling, and random responses to thermal and inertial (base acceleration) loads are studied. Excellent agreement is achieved between the predicted and measured results, thereby quantitatively validating the numerical tool.
Fatigue Failure of Space Shuttle Main Engine Turbine Blades
NASA Technical Reports Server (NTRS)
Swanson, Gregory R.; Arakere, Nagaraj K.
2000-01-01
Experimental validation of finite element modeling of single crystal turbine blades is presented. Experimental results from uniaxial high cycle fatigue (HCF) test specimens and full-scale Space Shuttle Main Engine test firings with the High Pressure Fuel Turbopump/Alternate Turbopump (HPFTP/AT) provide the data used for the validation. The conclusions show the significant contribution of crystal orientation within the blade to the resulting life of the component, that the analysis can predict this variation, and that experimental testing demonstrates it.
Numerical modeling and experimental validation of thermoplastic composites induction welding
NASA Astrophysics Data System (ADS)
Palmieri, Barbara; Nele, Luigi; Galise, Francesco
2018-05-01
In this work, a numerical simulation and experimental testing of the induction welding of continuous fiber-reinforced thermoplastic composites (CFRTPCs) are presented. The thermoplastic polyamide 66 (PA66) with carbon fiber fabric was used. Using dedicated software (JMag Designer), the influence of the fundamental process parameters such as temperature, current, and holding time was investigated. In order to validate the results of the simulations, and therefore the numerical model used, experimental tests were carried out, and the temperature values measured during the tests with an optical pyrometer were compared with those provided by the numerical simulation. The mechanical properties of the welded joints were evaluated by single-lap shear tests.
NASA Technical Reports Server (NTRS)
Freeman, Delman C., Jr.; Reubush, David E.; McClinton, Charles R.; Rausch, Vincent L.; Crawford, J. Larry
1997-01-01
This paper provides an overview of NASA's Hyper-X Program, a focused hypersonic technology effort designed to move hypersonic, airbreathing vehicle technology from the laboratory environment to the flight environment. This paper presents an overview of the flight test program, research objectives, approach, schedule, and status. A substantial experimental database has been established and concept validation completed. The program is currently concentrating on the first, Mach 7, vehicle development, verification, and validation in preparation for wind-tunnel testing in 1998 and flight testing in 1999. Parallel to this effort, the Mach 5 and 10 vehicle designs are being finalized. Detailed analytical and experimental evaluation of the Mach 7 vehicle at the flight conditions is nearing completion and will provide a database for validation of design methods once flight test data are available.
[Animal experimentation, computer simulation and surgical research].
Carpentier, Alain
2009-11-01
We live in a digital world. In medicine, computers are providing new tools for data collection, imaging, and treatment. During research and development of complex technologies and devices such as artificial hearts, computer simulation can provide more reliable information than experimentation on large animals. In these specific settings, animal experimentation should serve more to validate computer models of complex devices than to demonstrate their reliability.
Vivaldi: visualization and validation of biomacromolecular NMR structures from the PDB.
Hendrickx, Pieter M S; Gutmanas, Aleksandras; Kleywegt, Gerard J
2013-04-01
We describe Vivaldi (VIsualization and VALidation DIsplay; http://pdbe.org/vivaldi), a web-based service for the analysis, visualization, and validation of NMR structures in the Protein Data Bank (PDB). Vivaldi provides access to model coordinates and several types of experimental NMR data using interactive visualization tools, augmented with structural annotations and model-validation information. The service presents information about the modeled NMR ensemble, validation of experimental chemical shifts, residual dipolar couplings, distance and dihedral angle constraints, as well as validation scores based on empirical knowledge and databases. Vivaldi was designed for both expert NMR spectroscopists and casual non-expert users who wish to obtain a better grasp of the information content and quality of NMR structures in the public archive. Copyright © 2013 Wiley Periodicals, Inc.
Zimmerman, C.E.
2005-01-01
Analysis of otolith strontium (Sr) or strontium-to-calcium (Sr:Ca) ratios provides a powerful tool to reconstruct the chronology of migration among salinity environments for diadromous salmonids. Although use of this method has been validated by examination of known individuals and translocation experiments, it has never been validated under controlled experimental conditions. In this study, incorporation of otolith Sr was tested across a range of salinities and resulting levels of ambient Sr and Ca concentrations in juvenile chinook salmon (Oncorhynchus tshawytscha), coho salmon (Oncorhynchus kisutch), sockeye salmon (Oncorhynchus nerka), rainbow trout (Oncorhynchus mykiss), and Arctic char (Salvelinus alpinus). Experimental water was mixed, using stream water and seawater as end members, to create experimental salinities of 0.1, 6.3, 12.7, 18.6, 25.5, and 33.0 psu. Otolith Sr and Sr:Ca ratios were significantly related to salinity for all species (r2 range: 0.80-0.91) but provide only enough predictive resolution to discriminate among fresh water, brackish water, and saltwater residency. These results validate the use of otolith Sr:Ca ratios to broadly discriminate salinity histories encountered by salmonids but highlight the need for further research concerning the influence of osmoregulation and physiological changes associated with smolting on otolith microchemistry.
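A regression of the kind reported (otolith Sr:Ca against salinity, r2 of 0.80-0.91) can be sketched as follows. Only the six salinity levels are taken from the study; the Sr:Ca values are hypothetical, invented to illustrate the analysis.

```python
import numpy as np
from scipy import stats

# Salinity levels (psu) from the study; Sr:Ca values (mmol/mol) are
# hypothetical placeholders for illustration only.
salinity = np.array([0.1, 6.3, 12.7, 18.6, 25.5, 33.0])
sr_ca = np.array([0.9, 1.6, 2.2, 2.6, 3.3, 3.8])

# Ordinary least-squares regression of otolith Sr:Ca on salinity
fit = stats.linregress(salinity, sr_ca)
r_squared = fit.rvalue ** 2
print(f"slope = {fit.slope:.3f} per psu, r^2 = {r_squared:.2f}")
```

A strong positive slope supports broad discrimination among freshwater, brackish, and saltwater residency, but overlapping scatter at neighboring salinities limits finer resolution, as the abstract notes.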
Phase 1 Validation Testing and Simulation for the WEC-Sim Open Source Code
NASA Astrophysics Data System (ADS)
Ruehl, K.; Michelen, C.; Gunawan, B.; Bosma, B.; Simmons, A.; Lomonaco, P.
2015-12-01
WEC-Sim is an open source code to model wave energy converter performance in operational waves, developed by Sandia and NREL and funded by the US DOE. The code is a time-domain modeling tool developed in MATLAB/SIMULINK using the multibody dynamics solver SimMechanics, and solves the WEC's governing equations of motion using the Cummins time-domain impulse response formulation in 6 degrees of freedom. The WEC-Sim code has undergone verification through code-to-code comparisons; however, validation of the code has been limited to publicly available experimental data sets. While these data sets provide preliminary code validation, the experimental tests were not explicitly designed for code validation and, as a result, are limited in their ability to validate the full functionality of the WEC-Sim code. Therefore, dedicated physical model tests for WEC-Sim validation have been performed. This presentation provides an overview of the WEC-Sim validation experimental wave tank tests performed at Oregon State University's Directional Wave Basin at Hinsdale Wave Research Laboratory. Phase 1 of experimental testing was focused on device characterization and completed in Fall 2015. Phase 2 is focused on WEC performance and scheduled for Winter 2015/2016. These experimental tests were designed explicitly to validate the performance of the WEC-Sim code and its new feature additions. Upon completion, the WEC-Sim validation data set will be made publicly available to the wave energy community. For the physical model test, a controllable model of a floating wave energy converter has been designed and constructed. The instrumentation includes state-of-the-art devices to measure pressure fields, motions in 6 DOF, multi-axial load cells, torque transducers, position transducers, and encoders. The model also incorporates a fully programmable Power-Take-Off system which can be used to generate or absorb wave energy. 
Numerical simulations of the experiments using WEC-Sim will be presented. These simulations highlight the code features included in the latest release of WEC-Sim (v1.2), including: wave directionality, nonlinear hydrostatics and hydrodynamics, user-defined wave elevation time-series, state-space radiation, and WEC-Sim compatibility with BEMIO (an open source AQWA/WAMIT/NEMOH coefficient parser).
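The Cummins time-domain impulse response formulation mentioned above can be written, for a single degree of freedom, roughly as:

```latex
% Cummins equation of motion (single-DOF sketch); symbols follow common
% usage in the wave-energy literature, not necessarily WEC-Sim's notation.
(m + A_\infty)\,\ddot{x}(t)
  + \int_0^t K(t-\tau)\,\dot{x}(\tau)\,\mathrm{d}\tau
  + C\,x(t)
  = F_{\mathrm{exc}}(t) + F_{\mathrm{PTO}}(t)
```

Here $m$ is the body mass, $A_\infty$ the added mass at infinite frequency, $K$ the radiation impulse-response kernel, $C$ the hydrostatic stiffness, $F_{\mathrm{exc}}$ the wave excitation force, and $F_{\mathrm{PTO}}$ the power-take-off force; WEC-Sim solves the 6-DOF generalization of this equation.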
Experimental validation of predicted cancer genes using FRET
NASA Astrophysics Data System (ADS)
Guala, Dimitri; Bernhem, Kristoffer; Ait Blal, Hammou; Jans, Daniel; Lundberg, Emma; Brismar, Hjalmar; Sonnhammer, Erik L. L.
2018-07-01
Huge amounts of data are generated in genome wide experiments, designed to investigate diseases with complex genetic causes. Follow up of all potential leads produced by such experiments is currently cost prohibitive and time consuming. Gene prioritization tools alleviate these constraints by directing further experimental efforts towards the most promising candidate targets. Recently a gene prioritization tool called MaxLink was shown to outperform other widely used state-of-the-art prioritization tools in a large scale in silico benchmark. An experimental validation of predictions made by MaxLink has however been lacking. In this study we used Fluorescence Resonance Energy Transfer, an established experimental technique for detection of protein-protein interactions, to validate potential cancer genes predicted by MaxLink. Our results provide confidence in the use of MaxLink for selection of new targets in the battle with polygenic diseases.
Holgado-Tello, Fco P; Chacón-Moscoso, Salvador; Sanduvete-Chaves, Susana; Pérez-Gil, José A
2016-01-01
The Campbellian tradition provides a conceptual framework to assess threats to validity. On the other hand, different models of causal analysis have been developed to control estimation biases in different research designs. However, the link between design features, measurement issues, and concrete impact estimation analyses is weak. In order to provide an empirical solution to this problem, we use Structural Equation Modeling (SEM) as a first approximation to operationalize the analytical implications of threats to validity in quasi-experimental designs. Based on the analogies established between the Classical Test Theory (CTT) and causal analysis, we describe an empirical study based on SEM in which range restriction and statistical power have been simulated in two different models: (1) A multistate model in the control condition (pre-test); and (2) A single-trait-multistate model in the control condition (post-test), adding a new mediator latent exogenous (independent) variable that represents a threat to validity. Results show, empirically, how the differences between both the models could be partially or totally attributed to these threats. Therefore, SEM provides a useful tool to analyze the influence of potential threats to validity.
NASA Astrophysics Data System (ADS)
Dörr, Dominik; Joppich, Tobias; Schirmaier, Fabian; Mosthaf, Tobias; Kärger, Luise; Henning, Frank
2016-10-01
Thermoforming of continuously fiber-reinforced thermoplastics (CFRTP) is ideally suited to thin-walled and complex-shaped products. By means of forming simulation, an initial validation of the producibility of a specific geometry, an optimization of the forming process, and the prediction of fiber reorientation due to forming are possible. Nevertheless, the applied methods need to be validated. Therefore, a method is presented which enables the calculation of error measures for the mismatch between simulation results and experimental tests, based on measurements with a conventional coordinate measuring device. As a quantitative measure describing the curvature is provided, the presented method is also suitable for numerical or experimental sensitivity studies on wrinkling behavior. The applied methods for forming simulation, implemented in Abaqus explicit, are presented and applied to a generic geometry. The same geometry is tested experimentally, and simulation and test results are compared by the proposed validation method.
Boundary Layer Transition Experiments in Support of the Hypersonics Program
NASA Technical Reports Server (NTRS)
Berry, Scott A.; Chen, Fang-Jenq; Wilder, Michael C.; Reda, Daniel C.
2007-01-01
Two experimental boundary layer transition studies in support of fundamental hypersonics research are reviewed. The two studies are the HyBoLT flight experiment and a new ballistic range effort. Details are provided of the objectives and approach associated with each experimental program. The experimental databases established from ground and flight tests are to provide a better understanding of high-speed flows as well as data to validate and guide the development of simulation tools.
ERIC Educational Resources Information Center
Rizvi, Shireen L.; Nock, Matthew K.
2008-01-01
Single-case experimental designs (SCEDs) provide a time- and cost-effective alternative to randomized clinical trials and offer significant advantages in terms of internal and external validity. A brief history and primer on SCEDs is provided, specifically for use in suicide intervention research. Various SCED methodologies, such as AB, ABAB,…
FY2017 Pilot Project Plan for the Nuclear Energy Knowledge and Validation Center Initiative
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ren, Weiju
To prepare for technical development of computational code validation under the Nuclear Energy Knowledge and Validation Center (NEKVAC) initiative, several meetings were held by a group of experts of the Idaho National Laboratory (INL) and the Oak Ridge National Laboratory (ORNL) to develop requirements of, and formulate a structure for, a transient fuel database through leveraging existing resources. It was concluded in discussions of these meetings that a pilot project is needed to address the most fundamental issues that can generate immediate stimulus to near-future validation developments as well as long-lasting benefits to NEKVAC operation. The present project is proposed based on the consensus of these discussions. Analysis of common scenarios in code validation indicates that the incapability of acquiring satisfactory validation data is often a showstopper that must first be tackled before any confident validation developments can be carried out. Validation data are usually found scattered in different places, most likely with interrelationships among the data not well documented, incomplete with information for some parameters missing, nonexistent, or unrealistic to experimentally generate. Furthermore, with very different technical backgrounds, the modeler, the experimentalist, and the knowledgebase developer that must be involved in validation data development often cannot communicate effectively without a data package template that is representative of the data structure for the information domain of interest to the desired code validation. This pilot project is proposed to use the legendary TREAT Experiments Database to provide core elements for creating an ideal validation data package. Data gaps and missing data interrelationships will be identified from these core elements. All the identified missing elements will then be filled in with experimental data if available from other existing sources or with dummy data if nonexistent. 
The resulting hybrid validation data package (composed of experimental and dummy data) will provide a clear and complete instance delineating the structure of the desired validation data and enabling effective communication among the modeler, the experimentalist, and the knowledgebase developer. With a good common understanding of the desired data structure by the three parties of subject matter experts, further existing data hunting will be effectively conducted, new experimental data generation will be realistically pursued, knowledgebase schema will be practically designed, and code validation will be confidently planned.
A Surrogate Approach to the Experimental Optimization of Multielement Airfoils
NASA Technical Reports Server (NTRS)
Otto, John C.; Landman, Drew; Patera, Anthony T.
1996-01-01
The incorporation of experimental test data into the optimization process is accomplished through the use of Bayesian-validated surrogates. In the surrogate approach, a surrogate for the experiment (e.g., a response surface) serves in the optimization process. The validation step of the framework provides a qualitative assessment of the surrogate quality, and bounds the surrogate-for-experiment error on designs "near" surrogate-predicted optimal designs. The utility of the framework is demonstrated through its application to the experimental selection of the trailing edge flap position to achieve a design lift coefficient for a three-element airfoil.
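As a sketch of the surrogate idea, a quadratic response surface can stand in for the wind-tunnel experiment and be optimized cheaply in its place. The flap positions and lift coefficients below are invented for illustration, and the Bayesian validation step that bounds the surrogate-for-experiment error is not shown.

```python
import numpy as np

# Hypothetical wind-tunnel data: lift coefficient vs. flap position (deg).
flap = np.array([0.0, 5.0, 10.0, 15.0, 20.0, 25.0])
cl = np.array([1.10, 1.45, 1.72, 1.88, 1.90, 1.80])

# Quadratic response surface acting as the surrogate for the experiment:
# cl ~ a*flap^2 + b*flap + c, fitted by least squares.
a, b, c = np.polyfit(flap, cl, 2)

# The surrogate is then optimized instead of the experiment itself.
flap_opt = -b / (2.0 * a)  # stationary point of the fitted quadratic
print(f"surrogate-predicted optimal flap position: {flap_opt:.1f} deg")
```

In the full framework, further experiments "near" this surrogate-predicted optimum would then be used to validate the surrogate and bound its error.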
Experimental validation of a new heterogeneous mechanical test design
NASA Astrophysics Data System (ADS)
Aquino, J.; Campos, A. Andrade; Souto, N.; Thuillier, S.
2018-05-01
Standard material parameter identification strategies generally use an extensive number of classical tests for collecting the required experimental data. However, a great effort has been made recently by the scientific and industrial communities to base this experimental database on heterogeneous tests. These tests can provide richer information on the material behavior, allowing the identification of a more complete set of material parameters. This is a result of the recent development of full-field measurement techniques, like digital image correlation (DIC), that can capture the heterogeneous deformation fields on the specimen surface during the test. Recently, new specimen geometries were designed to enhance the richness of the strain field and capture supplementary strain states. The butterfly specimen is an example of these new geometries, designed through a numerical optimization procedure based on an indicator capable of evaluating the heterogeneity and the richness of the strain information. However, no experimental validation has yet been performed. The aim of this work is to experimentally validate the heterogeneous butterfly mechanical test in the parameter identification framework. To this end, the DIC technique and a Finite Element Model Updating inverse strategy are used together for the parameter identification of a DC04 steel, as well as for the calculation of the indicator. The experimental tests are carried out in a universal testing machine with the ARAMIS measuring system to provide the strain states on the specimen surface. The identification strategy is accomplished with the data obtained from the experimental tests, and the results are compared to a reference numerical solution.
A new simple local muscle recovery model and its theoretical and experimental validation.
Ma, Liang; Zhang, Wei; Wu, Su; Zhang, Zhanwu
2015-01-01
This study was conducted to provide theoretical and experimental validation of a local muscle recovery model. Muscle recovery has been modeled in different empirical and theoretical approaches to determine work-rest allowance for musculoskeletal disorder (MSD) prevention. However, time-related parameters and individual attributes have not been sufficiently considered in conventional approaches. A new muscle recovery model was proposed by integrating time-related task parameters and individual attributes. Theoretically, this muscle recovery model was compared to other theoretical models mathematically. Experimentally, a total of 20 subjects participated in the experimental validation. Hand grip force recovery and shoulder joint strength recovery were measured after a fatiguing operation. The recovery profile was fitted by using the recovery model, and individual recovery rates were calculated as well after fitting. Good fitting values (r² > .8) were found for all the subjects. Significant differences in recovery rates were found among different muscle groups (p < .05). The theoretical muscle recovery model was primarily validated by characterization of the recovery process after fatiguing operation. The determined recovery rate may be useful to represent individual recovery attribute.
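The fitting step described above (recovery profile fitted to measured strength, individual rate extracted, goodness of fit checked) can be sketched with an assumed exponential recovery law; the functional form, parameter values, and noise level are illustrative, not the authors' exact model:

```python
import numpy as np

# Assumed exponential recovery law: strength recovers from the fatigued level
# f0 toward the rested maximum fmax at an individual rate R (illustrative form).
def recovery(t, R, f0=60.0, fmax=100.0):
    return fmax - (fmax - f0) * np.exp(-R * t)

rng = np.random.default_rng(1)
t = np.linspace(0.0, 10.0, 30)               # minutes after the fatiguing task
R_true = 0.45
data = recovery(t, R_true) + rng.normal(0.0, 1.0, t.size)

# Fit the individual recovery rate by least squares over a grid of rates.
rates = np.linspace(0.05, 2.0, 1000)
sse = np.array([np.sum((recovery(t, R) - data) ** 2) for R in rates])
R_hat = float(rates[int(np.argmin(sse))])
r2 = 1.0 - float(sse.min()) / float(np.sum((data - data.mean()) ** 2))
```

The fitted rate `R_hat` plays the role of the individual recovery attribute, and `r2` is the goodness-of-fit value reported per subject.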
NASA Astrophysics Data System (ADS)
Christiansen, Rasmus E.; Sigmund, Ole
2016-09-01
This Letter reports on the experimental validation of a two-dimensional acoustic hyperbolic metamaterial slab optimized to exhibit negative refractive behavior. The slab was designed using a topology optimization based systematic design method allowing for tailoring the refractive behavior. The experimental results confirm the predicted refractive capability as well as the predicted transmission at an interface. The study simultaneously provides an estimate of the attenuation inside the slab stemming from the boundary layer effects—insight which can be utilized in the further design of the metamaterial slabs. The capability of tailoring the refractive behavior opens possibilities for different applications. For instance, a slab exhibiting zero refraction across a wide angular range is capable of funneling acoustic energy through it, while a material exhibiting the negative refractive behavior across a wide angular range provides lensing and collimating capabilities.
Show and tell: disclosure and data sharing in experimental pathology.
Schofield, Paul N; Ward, Jerrold M; Sundberg, John P
2016-06-01
Reproducibility of data from experimental investigations using animal models is increasingly under scrutiny because of the potentially negative impact of poor reproducibility on the translation of basic research. Histopathology is a key tool in biomedical research, in particular for the phenotyping of animal models to provide insights into the pathobiology of diseases. Failure to disclose and share crucial histopathological experimental details compromises the validity of the review process and reliability of the conclusions. We discuss factors that affect the interpretation and validation of histopathology data in publications and the importance of making these data accessible to promote replicability in research. © 2016. Published by The Company of Biologists Ltd.
Hovering Dual-Spin Vehicle Groundwork for Bias Momentum Sizing Validation Experiment
NASA Technical Reports Server (NTRS)
Rothhaar, Paul M.; Moerder, Daniel D.; Lim, Kyong B.
2008-01-01
Angular bias momentum offers significant stability augmentation for hovering flight vehicles. The reliance of the vehicle on thrust vectoring for agility and disturbance rejection is greatly reduced with significant levels of stored angular momentum in the system. A methodical procedure for bias momentum sizing has been developed in previous studies. This current study provides groundwork for experimental validation of that method using an experimental vehicle called the Dual-Spin Test Device, a thrust-levitated platform. Using measured data the vehicle's thrust vectoring units are modeled and a gust environment is designed and characterized. Control design is discussed. Preliminary experimental results of the vehicle constrained to three rotational degrees of freedom are compared to simulation for a case containing no bias momentum to validate the simulation. A simulation of a bias momentum dominant case is presented.
Numerical Modelling of Femur Fracture and Experimental Validation Using Bone Simulant.
Marco, Miguel; Giner, Eugenio; Larraínzar-Garijo, Ricardo; Caeiro, José Ramón; Miguélez, María Henar
2017-10-01
Bone fracture pattern prediction is still a challenge and an active field of research. The main goal of this article is to present a combined methodology (experimental and numerical) for femur fracture onset analysis. Experimental work includes the characterization of the mechanical properties and fracture testing on a bone simulant. The numerical work focuses on the development of a model whose material properties are provided by the characterization tests. The fracture location and the early stages of the crack propagation are modelled using the extended finite element method and the model is validated by fracture tests developed in the experimental work. It is shown that the accuracy of the numerical results strongly depends on a proper bone behaviour characterization.
NASA Technical Reports Server (NTRS)
Noll, Thomas E.; Perry, Boyd, III; Tiffany, Sherwood H.; Cole, Stanley R.; Buttrill, Carey S.; Adams, William M., Jr.; Houck, Jacob A.; Srinathkumar, S.; Mukhopadhyay, Vivek; Pototzky, Anthony S.
1989-01-01
The status of the joint NASA/Rockwell Active Flexible Wing Wind-Tunnel Test Program is described. The objectives are to develop and validate the analysis, design, and test methodologies required to apply multifunction active control technology for improving aircraft performance and stability. Major tasks include designing digital multi-input/multi-output flutter-suppression and rolling-maneuver-load alleviation concepts for a flexible full-span wind-tunnel model, obtaining an experimental data base for the basic model and each control concept and providing comparisons between experimental and analytical results to validate the methodologies. The opportunity is provided to improve real-time simulation techniques and to gain practical experience with digital control law implementation procedures.
Experimental Validation: Subscale Aircraft Ground Facilities and Integrated Test Capability
NASA Technical Reports Server (NTRS)
Bailey, Roger M.; Hostetler, Robert W., Jr.; Barnes, Kevin N.; Belcastro, Celeste M.; Belcastro, Christine M.
2005-01-01
Experimental testing is an important aspect of validating complex integrated safety critical aircraft technologies. The Airborne Subscale Transport Aircraft Research (AirSTAR) Testbed is being developed at NASA Langley to validate technologies under conditions that cannot be flight validated with full-scale vehicles. The AirSTAR capability comprises a series of flying sub-scale models, associated ground-support equipment, and a base research station at NASA Langley. The subscale model capability utilizes a generic 5.5% scaled transport class vehicle known as the Generic Transport Model (GTM). The AirSTAR Ground Facilities encompass the hardware and software infrastructure necessary to provide comprehensive support services for the GTM testbed. The ground facilities support remote piloting of the GTM aircraft, and include all subsystems required for data/video telemetry, experimental flight control algorithm implementation and evaluation, GTM simulation, data recording/archiving, and audio communications. The ground facilities include a self-contained, motorized vehicle serving as a mobile research command/operations center, capable of deployment to remote sites when conducting GTM flight experiments. The ground facilities also include a laboratory based at NASA LaRC providing capabilities nearly identical to those of the mobile command/operations center, as well as the capability to receive data/video/audio from, and send data/audio to, the mobile command/operations center during GTM flight experiments.
Issues and approach to develop validated analysis tools for hypersonic flows: One perspective
NASA Technical Reports Server (NTRS)
Deiwert, George S.
1993-01-01
Critical issues concerning the modeling of low density hypervelocity flows where thermochemical nonequilibrium effects are pronounced are discussed. Emphasis is on the development of validated analysis tools, and the activity in the NASA Ames Research Center's Aerothermodynamics Branch is described. Inherent in the process is a strong synergism between ground test and real gas computational fluid dynamics (CFD). Approaches to develop and/or enhance phenomenological models and incorporate them into computational flowfield simulation codes are discussed. These models were partially validated with experimental data for flows where the gas temperature is raised (compressive flows). Expanding flows, where temperatures drop, however, exhibit somewhat different behavior. Experimental data for these expanding flow conditions are sparse, and reliance must be placed on intuition and guidance from computational chemistry to model transport processes under these conditions. Ground based experimental studies used to provide necessary data for model development and validation are described. Included are the performance characteristics of high enthalpy flow facilities, such as shock tubes and ballistic ranges.
Issues and approach to develop validated analysis tools for hypersonic flows: One perspective
NASA Technical Reports Server (NTRS)
Deiwert, George S.
1992-01-01
Critical issues concerning the modeling of low-density hypervelocity flows where thermochemical nonequilibrium effects are pronounced are discussed. Emphasis is on the development of validated analysis tools. A description of the activity in the Ames Research Center's Aerothermodynamics Branch is also given. Inherent in the process is a strong synergism between ground test and real-gas computational fluid dynamics (CFD). Approaches to develop and/or enhance phenomenological models and incorporate them into computational flow-field simulation codes are discussed. These models have been partially validated with experimental data for flows where the gas temperature is raised (compressive flows). Expanding flows, where temperatures drop, however, exhibit somewhat different behavior. Experimental data for these expanding flow conditions are sparse; reliance must be made on intuition and guidance from computational chemistry to model transport processes under these conditions. Ground-based experimental studies used to provide necessary data for model development and validation are described. Included are the performance characteristics of high-enthalpy flow facilities, such as shock tubes and ballistic ranges.
Crowdsourcing for Cognitive Science – The Utility of Smartphones
Brown, Harriet R.; Zeidman, Peter; Smittenaar, Peter; Adams, Rick A.; McNab, Fiona; Rutledge, Robb B.; Dolan, Raymond J.
2014-01-01
By 2015, there will be an estimated two billion smartphone users worldwide. This technology presents exciting opportunities for cognitive science as a medium for rapid, large-scale experimentation and data collection. At present, cost and logistics limit most study populations to small samples, restricting the experimental questions that can be addressed. In this study we investigated whether the mass collection of experimental data using smartphone technology is valid, given the variability of data collection outside of a laboratory setting. We presented four classic experimental paradigms as short games, available as a free app; over the first month, 20,800 users submitted data. We found that the large sample size vastly outweighed the noise inherent in collecting data outside a controlled laboratory setting, and show that canonical results were reproduced for all four games. For the first time, we provide experimental validation for the use of smartphones for data collection in cognitive science, which can lead to the collection of richer data sets and a significant cost reduction as well as provide an opportunity for efficient phenotypic screening of large populations. PMID:25025865
Lance, Blake W.; Smith, Barton L.
2016-06-23
Transient convection has been investigated experimentally for the purpose of providing Computational Fluid Dynamics (CFD) validation benchmark data. A specialized facility for validation benchmark experiments, called the Rotatable Buoyancy Tunnel, was used to acquire thermal and velocity measurements of flow over a smooth, vertical heated plate. The initial condition was forced convection downward, with subsequent transition to mixed convection, ending with natural convection upward after a flow reversal. Data acquisition through the transient was repeated for ensemble-averaged results. With simple flow geometry, validation data were acquired at the benchmark level. All boundary conditions (BCs) were measured and their uncertainties quantified. Temperature profiles on all four walls and at the inlet were measured, as well as the as-built test section geometry. Inlet velocity profiles and turbulence levels were quantified using Particle Image Velocimetry. System Response Quantities (SRQs) were measured for comparison with CFD outputs and include velocity profiles, wall heat flux, and wall shear stress. Extra effort was invested in documenting and preserving the validation data: details about the experimental facility, instrumentation, experimental procedure, materials, BCs, and SRQs are made available through this paper, with the BCs and SRQs available for download.
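The ensemble-averaging step mentioned above (repeating the transient and averaging runs at matching times) can be sketched as follows; the transient shape, run count, and noise level are invented for illustration:

```python
import numpy as np

# Ensemble averaging of a repeated transient: each run records the same
# quantity through the transient, and runs are averaged at matching times
# to suppress run-to-run turbulent fluctuations.
rng = np.random.default_rng(2)
n_runs, n_times = 50, 120
time = np.linspace(0.0, 60.0, n_times)
true_signal = 1.0 - np.exp(-time / 15.0)     # idealized flow-reversal transient
runs = true_signal + rng.normal(0.0, 0.1, (n_runs, n_times))

ensemble_mean = runs.mean(axis=0)
# Random uncertainty of the mean: standard error across the ensemble.
std_error = runs.std(axis=0, ddof=1) / np.sqrt(n_runs)
```

The standard error shrinks with the square root of the number of repeated runs, which is what makes the repeated-transient procedure worthwhile for benchmark-level uncertainty quantification.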
Bayesian cross-entropy methodology for optimal design of validation experiments
NASA Astrophysics Data System (ADS)
Jiang, X.; Mahadevan, S.
2006-07-01
An important concern in the design of validation experiments is how to incorporate the mathematical model in the design in order to allow conclusive comparisons of model prediction with experimental output in model assessment. The classical experimental design methods are more suitable for phenomena discovery and may result in a subjective, expensive, time-consuming and ineffective design that may adversely impact these comparisons. In this paper, an integrated Bayesian cross-entropy methodology is proposed to perform the optimal design of validation experiments incorporating the computational model. The expected cross entropy, an information-theoretic distance between the distributions of model prediction and experimental observation, is defined as a utility function to measure the similarity of two distributions. A simulated annealing algorithm is used to find optimal values of input variables through minimizing or maximizing the expected cross entropy. The measured data after testing with the optimum input values are used to update the distribution of the experimental output using Bayes' theorem. The procedure is repeated to adaptively design the required number of experiments for model assessment, each time ensuring that the experiment provides effective comparison for validation. The methodology is illustrated for the optimal design of validation experiments for a three-leg bolted joint structure and a composite helicopter rotor hub component.
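A toy sketch of the core loop (simulated annealing over a design input, with the cross entropy of two distributions as the utility) is given below, using closed-form Gaussians to stand in for the model-prediction and experimental-output distributions; every function and constant here is an illustrative assumption, not the paper's bolted-joint or rotor-hub setup:

```python
import math, random

# Toy stand-in for the design problem: at design input x, the model predicts a
# Gaussian response and the experimental output is another Gaussian. All
# functions and constants below are illustrative assumptions.
def model_mean(x):      return 2.0 * x
def experiment_mean(x): return 2.0 * x + 0.8 * math.sin(3.0 * x)
SIGMA_M, SIGMA_E = 0.3, 0.4

def cross_entropy(x):
    # Closed-form cross entropy H(p_exp, p_model) for two Gaussians.
    d = experiment_mean(x) - model_mean(x)
    return 0.5 * math.log(2.0 * math.pi * SIGMA_M ** 2) \
        + (SIGMA_E ** 2 + d * d) / (2.0 * SIGMA_M ** 2)

# Simulated annealing over the design input, maximizing the utility.
random.seed(3)
x = best_x = 0.1
T = 1.0
for _ in range(4000):
    cand = min(1.0, max(0.0, x + random.gauss(0.0, 0.1)))
    delta = cross_entropy(cand) - cross_entropy(x)
    if delta > 0.0 or random.random() < math.exp(delta / T):
        x = cand
    if cross_entropy(x) > cross_entropy(best_x):
        best_x = x
    T = max(1e-3, T * 0.999)
```

In the full methodology, each selected design point would be tested experimentally and the experimental-output distribution updated via Bayes' theorem before the next design iteration.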
Das, Sankha Subhra; Saha, Pritam
2018-01-01
MicroRNAs (miRNAs) are well known as key regulators of diverse biological pathways. A body of experimental evidence has shown that abnormal miRNA expression profiles are responsible for various pathophysiological conditions by modulating genes in disease-associated pathways. In spite of the rapid increase in research data confirming such associations, scientists still do not have access to a consolidated database offering these miRNA-pathway association details for critical diseases. We have developed miRwayDB, a database providing comprehensive information on experimentally validated miRNA-pathway associations in various pathophysiological conditions, utilizing data collected from the published literature. To the best of our knowledge, it is the first database that provides information about experimentally validated miRNA-mediated pathway dysregulation as seen specifically in critical human diseases, and hence indicative of a cause-and-effect relationship in most cases. The current version of miRwayDB collects an exhaustive list of miRNA-pathway association entries for 76 critical disease conditions by reviewing 663 published articles. Each database entry contains complete information on the name of the pathophysiological condition, associated miRNA(s), experimental sample type(s), regulation pattern (up/down) of the miRNA, pathway association(s), targeted member(s) of the dysregulated pathway(s) and a brief description. In addition, miRwayDB provides miRNA, gene and pathway scores to evaluate the role of miRNA-regulated pathways in various pathophysiological conditions. The database can also be used for other biomedical approaches, such as validation of computational analyses, integrated analyses and prediction with computational models. It also offers a submission page to submit novel data from recently published studies. We believe that miRwayDB will be a useful tool for the miRNA research community. Database URL: http://www.mirway.iitkgp.ac.in PMID:29688364
Myers, Tony; Balmer, Nigel
2012-01-01
Numerous factors have been proposed to explain the home advantage in sport. Several authors have suggested that a partisan home crowd enhances home advantage and that this is at least in part a consequence of their influence on officiating. However, while experimental studies examining this phenomenon have high levels of internal validity (since only the "crowd noise" intervention is allowed to vary), they suffer from a lack of external validity, with decision-making in a laboratory setting typically bearing little resemblance to decision-making in live sports settings. Conversely, observational and quasi-experimental studies with high levels of external validity suffer from low levels of internal validity as countless factors besides crowd noise vary. The present study provides a unique opportunity to address these criticisms, by conducting a controlled experiment on the impact of crowd noise on officiating in a live tournament setting. Seventeen qualified judges officiated on thirty Thai boxing bouts in a live international tournament setting featuring "home" and "away" boxers. In each bout, judges were randomized into a "noise" (live sound) or "no crowd noise" (noise-canceling headphones and white noise) condition, resulting in 59 judgments in the "no crowd noise" and 61 in the "crowd noise" condition. The results provide the first experimental evidence of the impact of live crowd noise on officials in sport. A cross-classified statistical model indicated that crowd noise had a statistically significant impact, equating to just over half a point per bout (in the context of five round bouts with the "10-point must" scoring system shared with professional boxing). The practical significance of the findings, their implications for officiating and for the future conduct of crowd noise studies are discussed.
78 FR 37228 - Cooperative Agreement To Support the Western Center for Food Safety
Federal Register 2010, 2011, 2012, 2013, 2014
2013-06-20
... Davis main campus and experimental stations provide invaluable access to one of the leading food... sites for experimental trials is instrumental to FDA receiving the most current scientifically validated... facilitate industry compliance with preventive control standards. Information gleaned from this research has...
Evaluation of a Consistent LES/PDF Method Using a Series of Experimental Spray Flames
NASA Astrophysics Data System (ADS)
Heye, Colin; Raman, Venkat
2012-11-01
A consistent method for the evolution of the joint-scalar probability density function (PDF) transport equation is proposed for application to large eddy simulation (LES) of turbulent reacting flows containing evaporating spray droplets. PDF transport equations provide the benefit of including the chemical source term in closed form; however, additional terms describing LES subfilter mixing must be modeled. The recent availability of detailed experimental measurements provides model validation data for a wide range of evaporation rates and combustion regimes, as are known to occur in spray flames. In this work, the experimental data will be used to investigate the impact of droplet mass loading and evaporation rates on the subfilter scalar PDF shape in comparison with conventional flamelet models. In addition, existing model term closures in the PDF transport equations are evaluated with a focus on their validity in the presence of regime changes.
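The flamelet-model comparison mentioned above typically involves a presumed-shape subfilter PDF: a beta distribution on [0, 1] parameterized by the subfilter mean and variance of a conserved scalar. A minimal sketch of that closure (the mean/variance values are illustrative, not from the cited measurements):

```python
import math

# Presumed-shape subfilter PDF: a beta distribution on [0, 1] parameterized by
# the subfilter mean and variance of the mixture fraction, as in flamelet-type
# closures. The mean/variance values below are illustrative.
def beta_pdf(z, mean, var):
    factor = mean * (1.0 - mean) / var - 1.0   # requires var < mean * (1 - mean)
    a, b = mean * factor, (1.0 - mean) * factor
    norm = math.gamma(a + b) / (math.gamma(a) * math.gamma(b))
    return norm * z ** (a - 1.0) * (1.0 - z) ** (b - 1.0)

# Sanity check: the PDF should integrate to ~1 (trapezoidal rule, open interval).
zs = [i / 1000.0 for i in range(1, 1000)]
vals = [beta_pdf(z, 0.3, 0.02) for z in zs]
area = sum(0.5 * (vals[i] + vals[i + 1]) * 0.001 for i in range(len(vals) - 1))
```

A transported-PDF method, by contrast, evolves the subfilter PDF directly rather than presuming its shape, which is what the comparison against measured PDF shapes is meant to probe.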
NASA Astrophysics Data System (ADS)
Davis, Brian; Turner, Travis L.; Seelecke, Stefan
2005-05-01
Previous work at NASA Langley Research Center (LaRC) involved fabrication and testing of composite beams with embedded, pre-strained shape memory alloy (SMA) ribbons within the beam structures. That study also provided comparison of experimental results with numerical predictions from a research code making use of a new thermoelastic model for shape memory alloy hybrid composite (SMAHC) structures. The previous work showed qualitative validation of the numerical model. However, deficiencies in the experimental-numerical correlation were noted and hypotheses for the discrepancies were given for further investigation. The goal of this work is to refine the experimental measurement and numerical modeling approaches in order to better understand the discrepancies, improve the correlation between prediction and measurement, and provide rigorous quantitative validation of the numerical analysis/design tool. The experimental investigation is refined by a more thorough test procedure and incorporation of higher fidelity measurements such as infrared thermography and projection moire interferometry. The numerical results are produced by a recently commercialized version of the constitutive model as implemented in ABAQUS and are refined by incorporation of additional measured parameters such as geometric imperfection. Thermal buckling, post-buckling, and random responses to thermal and inertial (base acceleration) loads are studied. The results demonstrate the effectiveness of SMAHC structures in controlling static and dynamic responses by adaptive stiffening. Excellent agreement is achieved between the predicted and measured results of the static and dynamic thermomechanical response, thereby providing quantitative validation of the numerical tool.
Models of protein–ligand crystal structures: trust, but verify
Deller, Marc C.; Rupp, Bernhard
2015-01-01
X-ray crystallography provides the most accurate models of protein–ligand structures. These models serve as the foundation of many computational methods including structure prediction, molecular modelling, and structure-based drug design. The success of these computational methods ultimately depends on the quality of the underlying protein–ligand models. X-ray crystallography offers the unparalleled advantage of a clear mathematical formalism relating the experimental data to the protein–ligand model. In the case of X-ray crystallography, the primary experimental evidence is the electron density of the molecules forming the crystal. The first step in the generation of an accurate and precise crystallographic model is the interpretation of the electron density of the crystal, typically carried out by construction of an atomic model. The atomic model must then be validated for fit to the experimental electron density and also for agreement with prior expectations of stereochemistry. Stringent validation of protein–ligand models has become possible as a result of the mandatory deposition of primary diffraction data, and many computational tools are now available to aid in the validation process. Validation of protein–ligand complexes has revealed some instances of overenthusiastic interpretation of ligand density. Fundamental concepts and metrics of protein–ligand quality validation are discussed and we highlight software tools to assist in this process. It is essential that end users select high quality protein–ligand models for their computational and biological studies, and we provide an overview of how this can be achieved. PMID:25665575
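The fit-to-density validation described above is often quantified by a real-space correlation between observed and model-calculated electron density over a ligand's grid points. A sketch of that check with synthetic density arrays (the data and thresholds are illustrative stand-ins, not crystallographic values):

```python
import numpy as np

# Real-space-correlation-style check: Pearson correlation between observed and
# model-calculated electron density over a ligand's grid points. The density
# arrays and thresholds are synthetic stand-ins, not crystallographic data.
rng = np.random.default_rng(4)
rho_calc = rng.random(500)                            # model-derived density
rho_obs_good = rho_calc + rng.normal(0.0, 0.05, 500)  # ligand supported by density
rho_obs_bad = rng.random(500)                         # density unrelated to model

def density_corr(obs, calc):
    return float(np.corrcoef(obs, calc)[0, 1])

good = density_corr(rho_obs_good, rho_calc)
bad = density_corr(rho_obs_bad, rho_calc)
```

A low correlation for a ligand is exactly the signature of the "overenthusiastic interpretation of ligand density" that stringent validation is designed to catch.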
NASA Astrophysics Data System (ADS)
Rizzo, Axel; Vaglio-Gaudard, Claire; Martin, Julie-Fiona; Noguère, Gilles; Eschbach, Romain
2017-09-01
DARWIN2.3 is the reference package used for fuel cycle applications in France. It solves the Boltzmann and Bateman equations in a coupled manner, with the European JEFF-3.1.1 nuclear data library, to compute the fuel cycle values of interest. It includes both the deterministic transport codes APOLLO2 (for light water reactors) and ERANOS2 (for fast reactors) and the DARWIN/PEPIN2 depletion code, each of them developed by CEA/DEN with the support of its industrial partners. The DARWIN2.3 package has been experimentally validated for pressurized and boiling water reactors, as well as for sodium fast reactors; this experimental validation relies on the analysis of post-irradiation experiments (PIE). The DARWIN2.3 experimental validation work points out some isotopes for which the depleted concentration calculation can be improved. Some other nuclides have no available experimental validation, and the uncertainty in their calculated concentrations is provided by the propagation of a priori nuclear data uncertainties. This paper describes the work plan of studies initiated this year to improve the accuracy of the DARWIN2.3 depleted material balance calculation for some nuclides of interest for the fuel cycle.
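The Bateman side of such a coupled solve reduces, for a single parent-daughter chain, to a small linear ODE system with a closed-form solution. A sketch with illustrative rate constants (not DARWIN2.3's actual data, numerics, or chain structure):

```python
import math

# Minimal Bateman chain (parent -> daughter): dN1/dt = -l1*N1,
# dN2/dt = l1*N1 - l2*N2. Rates and inventories are illustrative; a real
# depletion code couples many such chains to the transport solution.
l1, l2 = 0.30, 0.05          # effective removal rates (1/day)
N1_0 = 1.0e24                # initial parent inventory (atoms)

def analytic(t):
    N1 = N1_0 * math.exp(-l1 * t)
    N2 = N1_0 * l1 / (l2 - l1) * (math.exp(-l1 * t) - math.exp(-l2 * t))
    return N1, N2

# Explicit Euler time stepping as a cross-check of the analytic solution.
dt, steps = 0.01, 1000       # integrate to t = 10 days
N1, N2 = N1_0, 0.0
for _ in range(steps):
    N1, N2 = N1 + dt * (-l1 * N1), N2 + dt * (l1 * N1 - l2 * N2)
```

Production depletion solvers handle thousands of nuclides with widely separated rates, so they use stiff or matrix-exponential integrators rather than explicit stepping; the explicit scheme here is only for cross-checking the two-nuclide analytic result.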
Analytical and experimental validation of the Oblique Detonation Wave Engine concept
NASA Technical Reports Server (NTRS)
Adelman, Henry G.; Cambier, Jean-Luc; Menees, Gene P.; Balboni, John A.
1988-01-01
The Oblique Detonation Wave Engine (ODWE) for hypersonic flight has been analytically studied by NASA using CFD codes that fully couple finite-rate chemistry with fluid dynamics. Fuel injector designs investigated included wall and strut injectors; the in-stream strut injectors were chosen to provide good mixing with minimal stagnation pressure losses. Plans for experimentally validating the ODWE concept in an arc-jet hypersonic wind tunnel are discussed. Measurements of the flow field properties behind the oblique wave will be compared to analytical predictions.
Experimental validation of thermo-chemical algorithm for a simulation of pultrusion processes
NASA Astrophysics Data System (ADS)
Barkanov, E.; Akishin, P.; Miazza, N. L.; Galvez, S.; Pantelelis, N.
2018-04-01
To provide a better understanding of pultrusion processes with and without temperature control, and to support pultrusion tooling design, an algorithm based on a mixed time integration scheme and a nodal control volume method has been developed. In the present study, its experimental validation is carried out with purpose-developed cure sensors that measure electrical resistivity and temperature on the profile surface. Through this verification process, the set of initial data used for simulating the pultrusion of a rod profile has been successfully corrected and finally defined.
Perspectives on the simulation of protein–surface interactions using empirical force field methods
Latour, Robert A.
2014-01-01
Protein–surface interactions are of fundamental importance for a broad range of applications in the fields of biomaterials and biotechnology. Present experimental methods are limited in their ability to provide a comprehensive depiction of these interactions at the atomistic level. In contrast, empirical force field based simulation methods inherently provide the ability to predict and visualize protein–surface interactions with full atomistic detail. These methods, however, must be carefully developed, validated, and properly applied before confidence can be placed in results from the simulations. In this perspectives paper, I provide an overview of the critical aspects that I consider to be of greatest importance for the development of these methods, with a focus on the research that my combined experimental and molecular simulation groups have conducted over the past decade to address these issues. These critical issues include the tuning of interfacial force field parameters to accurately represent the thermodynamics of interfacial behavior, adequate sampling of these types of complex molecular systems to generate results that can be compared with experimental data, and the generation of experimental data that can be used for evaluation and validation of simulation results. PMID:25028242
Validated MicroRNA Target Databases: An Evaluation.
Lee, Yun Ji Diana; Kim, Veronica; Muth, Dillon C; Witwer, Kenneth W
2015-11-01
Positive findings from preclinical and clinical studies involving depletion or supplementation of microRNA (miRNA) engender optimism about miRNA-based therapeutics. However, off-target effects must be considered, and predicting them is complicated: each miRNA may target many gene transcripts, and the rules governing imperfectly complementary miRNA:target interactions are incompletely understood. Several databases provide lists of the relatively small number of experimentally confirmed miRNA:target pairs. Although incomplete, this information might allow assessment of at least some of the off-target effects. We evaluated the performance of four databases of experimentally validated miRNA:target interactions (miRWalk 2.0, miRTarBase, miRecords, and TarBase 7.0) using a list of 50 alphabetically consecutive genes. We examined the provided citations to determine the degree to which each interaction was experimentally supported. To assess stability, we tested at the beginning and end of a five-month period. Results varied widely by database, and two of the databases changed significantly over those five months. Most reported evidence for miRNA:target interactions was indirect or otherwise weak, and relatively few interactions were supported by more than one publication. Some returned results appear to arise from simplistic text searches that offer no insight into the relationship of the search terms, may not even include the reported gene or miRNA, and may thus be invalid. We conclude that validation databases provide important information, but not all information in all extant databases is up-to-date or accurate. Nevertheless, the more comprehensive validation databases may provide useful starting points for investigation of off-target effects of proposed small RNA therapies. © 2015 Wiley Periodicals, Inc.
Supersonic Combustion Research at NASA
NASA Technical Reports Server (NTRS)
Drummond, J. P.; Danehy, Paul M.; Gaffney, Richard L., Jr.; Tedder, Sarah A.; Cutler, Andrew D.; Bivolaru, Daniel
2007-01-01
This paper discusses the progress of work to model high-speed supersonic reacting flow. The purpose of the work is to improve the state of the art of CFD capabilities for predicting the flow in high-speed propulsion systems, particularly combustor flowpaths. The program has several components including the development of advanced algorithms and models for simulating engine flowpaths as well as a fundamental experimental and diagnostic development effort to support the formulation and validation of the mathematical models. The paper will provide details of current work on experiments that will provide data for the modeling efforts along with the associated nonintrusive diagnostics used to collect the data from the experimental flowfield. Simulation of a recent experiment to partially validate the accuracy of a combustion code is also described.
NASA Astrophysics Data System (ADS)
Huang, Ying; Bevans, W. J.; Xiao, Hai; Zhou, Zhi; Chen, Genda
2012-04-01
During or after an earthquake, building systems often experience large strains due to shaking, as observed in recent earthquakes, causing permanent inelastic deformation. In addition to this earthquake-induced inelastic deformation, post-earthquake fires associated with shorted electrical systems and leaking gas devices can further strain the already damaged structures, potentially leading to progressive collapse of buildings. In such harsh environments, sensor measurements on the affected building can provide only limited structural health information. Finite element model analysis, on the other hand, if validated by predesigned experiments, can provide detailed structural behavior information for the entire structure. In this paper, a temperature-dependent nonlinear 3-D finite element model (FEM) of a one-story steel frame is set up in ABAQUS based on the steel material properties cited in EN 1993-1.2 and the AISC manuals. The FEM is validated by testing the modeled steel frame in simulated post-earthquake environments. Comparisons between the FEM analysis and the experimental results show that the FEM predicts the structural behavior of the steel frame in post-earthquake fire conditions reasonably well. With such experimental validation, FEM analysis of critical structures in these harsh environments could be used to continuously predict structural behavior, better assisting fire fighters in their rescue efforts and helping save fire victims.
Experimental validation of the DARWIN2.3 package for fuel cycle applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
San-Felice, L.; Eschbach, R.; Bourdot, P.
2012-07-01
The DARWIN package, developed by the CEA and its French partners (AREVA and EDF), provides the required parameters for fuel cycle applications: fuel inventory, decay heat, activity, neutron, γ, α and β sources and spectra, and radiotoxicity. This paper presents the DARWIN2.3 experimental validation for fuel inventory and decay heat calculations on Pressurized Water Reactors (PWR). In order to validate this code system for spent fuel inventory, a large program based on spent fuel chemical assays has been undertaken. The validation covers Uranium Oxide (UOX) and Mixed Oxide (MOX) fuel inventory calculations, focused on the isotopes involved in Burn-Up Credit (BUC) applications and decay heat computations. The calculation-to-experiment (C/E-1) discrepancies are calculated with the latest European evaluation file, JEFF-3.1.1, associated with the SHEM energy mesh. An overview of the tendencies is obtained over a complete burn-up range from 10 to 85 GWd/t (10 to 60 GWd/t for MOX fuel). The experimental validation of the DARWIN2.3 package for decay heat calculation is performed using calorimetric measurements carried out at the Swedish Interim Spent Fuel Storage Facility on PWR assemblies, covering a large burn-up (20 to 50 GWd/t) and cooling time range (10 to 30 years). (authors)
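The (C/E-1) metric quoted above is simply the relative deviation of a calculated value from its measured counterpart; a minimal sketch with placeholder numbers (not data from the study):

```python
# Calculation-to-experiment discrepancy (C/E - 1), expressed in percent.
# The concentrations below are illustrative placeholders, not DARWIN2.3 results.

def ce_discrepancy(calculated, measured):
    """Return (C/E - 1) * 100 for each calculated/measured pair."""
    return [(c / e - 1.0) * 100.0 for c, e in zip(calculated, measured)]

# Hypothetical isotope inventories: calculated vs. chemical-assay values
calc = [1.02, 0.97, 1.10]
meas = [1.00, 1.00, 1.00]
discrepancies = ce_discrepancy(calc, meas)
print(discrepancies)  # per-isotope biases in percent
```

A tendency plot of these per-isotope percentages versus burn-up is the usual way such validation results are summarized.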
Achieving external validity in home advantage research: generalizing crowd noise effects
Myers, Tony D.
2014-01-01
Different factors have been postulated to explain the home advantage phenomenon in sport. One plausible explanation investigated has been the influence of a partisan home crowd on sports officials' decisions. Different types of studies have tested the crowd influence hypothesis including purposefully designed experiments. However, while experimental studies investigating crowd influences have high levels of internal validity, they suffer from a lack of external validity; decision-making in a laboratory setting bearing little resemblance to decision-making in live sports settings. This focused review initially considers threats to external validity in applied and theoretical experimental research. Discussing how such threats can be addressed using representative design by focusing on a recently published study that arguably provides the first experimental evidence of the impact of live crowd noise on officials in sport. The findings of this controlled experiment conducted in a real tournament setting offer a level of confirmation of the findings of laboratory studies in the area. Finally directions for future research and the future conduct of crowd noise studies are discussed. PMID:24917839
ERIC Educational Resources Information Center
Schubert, T. F., Jr.; Kim, E. M.
2009-01-01
The use of Miller's Theorem in the determination of the high-frequency cutoff frequency of transistor amplifiers was recently challenged by a paper published in this TRANSACTIONS. Unfortunately, that paper provided no simulation or experimental results to bring credence to the challenge or to validate the alternate method of determination…
NASA Astrophysics Data System (ADS)
Mashayekhi, Somayeh; Miles, Paul; Hussaini, M. Yousuff; Oates, William S.
2018-02-01
In this paper, fractional and non-fractional viscoelastic models for elastomeric materials are derived and analyzed in comparison to experimental results. The viscoelastic models are derived by expanding thermodynamic balance equations for both fractal and non-fractal media. The order of the fractional time derivative is shown to strongly affect the accuracy of the viscoelastic constitutive predictions. Model validation uses experimental data describing viscoelasticity of the dielectric elastomer Very High Bond (VHB) 4910. Since these materials are known for their broad applications in smart structures, it is important to characterize and accurately predict their behavior across a large range of time scales. Whereas integer order viscoelastic models can yield reasonable agreement with data, the model parameters often lack robustness in prediction at different deformation rates. Fractional order models of viscoelasticity, by contrast, provide a framework to more accurately quantify complex rate-dependent behavior. Prior research that has considered fractional order viscoelasticity lacks experimental validation and contains limited links between viscoelastic theory and fractional order derivatives. To address these issues, we use fractional order operators to experimentally validate fractional and non-fractional viscoelastic models in elastomeric solids using Bayesian uncertainty quantification. The fractional order model is found to be advantageous, as its predictions are significantly more accurate than those of integer order viscoelastic models for deformation rates spanning four orders of magnitude.
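The role of the fractional derivative order can be illustrated with a Grünwald-Letnikov discretization of a single springpot (Scott-Blair) element, σ(t) = E τ^α D^α ε(t). This is a generic sketch of fractional viscoelasticity, not the VHB 4910 model of the paper; all parameter values are illustrative:

```python
def gl_weights(alpha, n):
    """Grunwald-Letnikov weights w_k = (-1)**k * binom(alpha, k), built recursively."""
    w = [1.0]
    for k in range(1, n):
        w.append(w[-1] * (k - 1.0 - alpha) / k)
    return w

def springpot_stress(strain, dt, alpha, modulus=1.0, tau=1.0):
    """Stress of a springpot element: sigma = E * tau**alpha * D**alpha(strain).

    alpha=0 recovers a spring (sigma ~ strain); alpha=1 recovers a dashpot
    (sigma ~ strain rate); intermediate alpha gives power-law relaxation.
    """
    n = len(strain)
    w = gl_weights(alpha, n)
    scale = modulus * tau**alpha / dt**alpha
    return [scale * sum(w[k] * strain[i - k] for k in range(i + 1))
            for i in range(n)]

# Step strain: the fractional element relaxes as a power law t**(-alpha)
sigma = springpot_stress([1.0] * 50, dt=0.1, alpha=0.5)
```

Sweeping α between 0 and 1 interpolates continuously between elastic and viscous limits, which is why the fit accuracy depends so strongly on the derivative order.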
3-D and quasi-2-D discrete element modeling of grain commingling in a bucket elevator boot system
USDA-ARS?s Scientific Manuscript database
Unwanted grain commingling impedes new quality-based grain handling systems and has proven to be an expensive and time consuming issue to study experimentally. Experimentally validated models may reduce the time and expense of studying grain commingling while providing additional insight into detail...
MIDURA (Minefield Detection Using Reconnaissance Assets) 1982-1983 Experimental Test Plan.
1982-04-01
3.2.4.2 Subjective Validation at the Salem ONG; 3.2.4.3 Objective Validation at Fort Huachuca; 4. Test Flights at Arrays IIa, IIb, IIIa and IIIb. Validation proceeds in three phases: (1) subjective image interpretation at ERIM; (2) subjective validation at the Salem ONG; (3) objective validation at Fort Huachuca. The ERIM image interpreters will provide, for each image, estimates of PD, PC and PFA on a 0.00 to 1.00 scale. PD is defined as the subjective probability estimate that…
NASA Technical Reports Server (NTRS)
Geng, Tao; Paxson, Daniel E.; Zheng, Fei; Kuznetsov, Andrey V.; Roberts, William L.
2008-01-01
Pulsed combustion is receiving renewed interest as a potential route to higher performance in air breathing propulsion systems. Pulsejets offer a simple experimental device with which to study unsteady combustion phenomena and validate simulations. Previous computational fluid dynamic (CFD) simulation work focused primarily on the pulsejet combustion and exhaust processes. This paper describes a new inlet sub-model which simulates the fluidic and mechanical operation of a valved pulsejet head. The governing equations for this sub-model are described. Sub-model validation is provided through comparisons of simulated and experimentally measured reed valve motion, and time averaged inlet mass flow rate. The updated pulsejet simulation, with the inlet sub-model implemented, is validated through comparison with experimentally measured combustion chamber pressure, inlet mass flow rate, operational frequency, and thrust. Additionally, the simulated pulsejet exhaust flowfield, which is dominated by a starting vortex ring, is compared with particle imaging velocimetry (PIV) measurements on the bases of velocity, vorticity, and vortex location. The results show good agreement between simulated and experimental data. The inlet sub-model is shown to be critical for the successful modeling of pulsejet operation. This sub-model correctly predicts both the inlet mass flow rate and its phase relationship with the combustion chamber pressure. As a result, the predicted pulsejet thrust agrees very well with experimental data.
Modeling Combustion in Supersonic Flows
NASA Technical Reports Server (NTRS)
Drummond, J. Philip; Danehy, Paul M.; Bivolaru, Daniel; Gaffney, Richard L.; Tedder, Sarah A.; Cutler, Andrew D.
2007-01-01
This paper discusses the progress of work to model high-speed supersonic reacting flow. The purpose of the work is to improve the state of the art of CFD capabilities for predicting the flow in high-speed propulsion systems, particularly combustor flow-paths. The program has several components including the development of advanced algorithms and models for simulating engine flowpaths as well as a fundamental experimental and diagnostic development effort to support the formulation and validation of the mathematical models. The paper will provide details of current work on experiments that will provide data for the modeling efforts along with the associated nonintrusive diagnostics used to collect the data from the experimental flowfield. Simulation of a recent experiment to partially validate the accuracy of a combustion code is also described.
1981-01-01
per-rev, ring weighting factor, etc.) and with compression system design. A detailed description of the SAE methodology is provided in Ref. 1… offers insights into the practical application of experimental aeromechanical procedures and establishes the process of valid design assessment, avoiding… considerations given to the total engine system. Design Verification in the Experimental Laboratory: certain key parameters influence the design of modern…
The Use of Virtual Reality in the Study of People's Responses to Violent Incidents.
Rovira, Aitor; Swapp, David; Spanlang, Bernhard; Slater, Mel
2009-01-01
This paper reviews experimental methods for the study of the responses of people to violence in digital media, and in particular considers the issues of internal validity and ecological validity or generalisability of results to events in the real world. Experimental methods typically involve a significant level of abstraction from reality, with participants required to carry out tasks that are far removed from violence in real life, and hence their ecological validity is questionable. On the other hand studies based on field data, while having ecological validity, cannot control multiple confounding variables that may have an impact on observed results, so that their internal validity is questionable. It is argued that immersive virtual reality may provide a unification of these two approaches. Since people tend to respond realistically to situations and events that occur in virtual reality, and since virtual reality simulations can be completely controlled for experimental purposes, studies of responses to violence within virtual reality are likely to have both ecological and internal validity. This depends on a property that we call 'plausibility' - including the fidelity of the depicted situation with prior knowledge and expectations. We illustrate this with data from a previously published experiment, a virtual reprise of Stanley Milgram's 1960s obedience experiment, and also with pilot data from a new study being developed that looks at bystander responses to violent incidents.
Examining students' views about validity of experiments: From introductory to Ph.D. students
NASA Astrophysics Data System (ADS)
Hu, Dehui; Zwickl, Benjamin M.
2018-06-01
We investigated physics students' epistemological views on measurements and validity of experimental results. The roles of experiments in physics have been underemphasized in previous research on students' personal epistemology, and there is a need for a broader view of personal epistemology that incorporates experiments. An epistemological framework incorporating the structure, methodology, and validity of scientific knowledge guided the development of an open-ended survey. The survey was administered to students in algebra-based and calculus-based introductory physics courses, upper-division physics labs, and physics Ph.D. students. Within our sample, we identified several differences in students' ideas about validity and uncertainty in measurement. The majority of introductory students justified the validity of results through agreement with theory or with results from others. Alternatively, Ph.D. students frequently justified the validity of results based on the quality of the experimental process and repeatability of results. When asked about the role of uncertainty analysis, introductory students tended to focus on the representational roles (e.g., describing imperfections, data variability, and human mistakes). However, advanced students focused on the inferential roles of uncertainty analysis (e.g., quantifying reliability, making comparisons, and guiding refinements). The findings suggest that lab courses could emphasize a variety of approaches to establish validity, such as by valuing documentation of the experimental process when evaluating the quality of student work. In order to emphasize the role of uncertainty in an authentic way, labs could provide opportunities to iterate, make repeated comparisons, and make decisions based on those comparisons.
Design and validation of instruments to measure knowledge.
Elliott, T E; Regal, R R; Elliott, B A; Renier, C M
2001-01-01
Measuring health care providers' learning after they have participated in educational interventions that use experimental designs requires valid, reliable, and practical instruments. A literature review was conducted. In addition, experience gained from designing and validating instruments for measuring the effect of an educational intervention informed this process. The eight main steps for designing, validating, and testing the reliability of instruments for measuring learning outcomes are presented. The key considerations and rationale for this process are discussed. Methods for critiquing and adapting existent instruments and creating new ones are offered. This study may help other investigators in developing valid, reliable, and practical instruments for measuring the outcomes of educational activities.
Buschmann, Dominik; Haberberger, Anna; Kirchner, Benedikt; Spornraft, Melanie; Riedmaier, Irmgard; Schelling, Gustav; Pfaffl, Michael W.
2016-01-01
Small RNA-Seq has emerged as a powerful tool in transcriptomics, gene expression profiling and biomarker discovery. Sequencing cell-free nucleic acids, particularly microRNA (miRNA), from liquid biopsies additionally provides exciting possibilities for molecular diagnostics, and might help establish disease-specific biomarker signatures. The complexity of the small RNA-Seq workflow, however, bears challenges and biases that researchers need to be aware of in order to generate high-quality data. Rigorous standardization and extensive validation are required to guarantee reliability, reproducibility and comparability of research findings. Hypotheses based on flawed experimental conditions can be inconsistent and even misleading. Comparable to the well-established MIQE guidelines for qPCR experiments, this work aims at establishing guidelines for experimental design and pre-analytical sample processing, standardization of library preparation and sequencing reactions, as well as facilitating data analysis. We highlight bottlenecks in small RNA-Seq experiments, point out the importance of stringent quality control and validation, and provide a primer for differential expression analysis and biomarker discovery. Following our recommendations will encourage better sequencing practice, increase experimental transparency and lead to more reproducible small RNA-Seq results. This will ultimately enhance the validity of biomarker signatures, and allow reliable and robust clinical predictions. PMID:27317696
Vlachos, Ioannis S; Paraskevopoulou, Maria D; Karagkouni, Dimitra; Georgakilas, Georgios; Vergoulis, Thanasis; Kanellos, Ilias; Anastasopoulos, Ioannis-Laertis; Maniou, Sofia; Karathanou, Konstantina; Kalfakakou, Despina; Fevgas, Athanasios; Dalamagas, Theodore; Hatzigeorgiou, Artemis G
2015-01-01
microRNAs (miRNAs) are short non-coding RNA species, which act as potent gene expression regulators. Accurate identification of miRNA targets is crucial to understanding their function. Currently, hundreds of thousands of miRNA:gene interactions have been experimentally identified. However, this wealth of information is fragmented and hidden in thousands of manuscripts and raw next-generation sequencing data sets. DIANA-TarBase was initially released in 2006 and it was the first database aiming to catalog published experimentally validated miRNA:gene interactions. DIANA-TarBase v7.0 (http://www.microrna.gr/tarbase) aims to provide for the first time hundreds of thousands of high-quality manually curated experimentally validated miRNA:gene interactions, enhanced with detailed meta-data. DIANA-TarBase v7.0 enables users to easily identify positive or negative experimental results, the utilized experimental methodology, experimental conditions including cell/tissue type and treatment. The new interface provides also advanced information ranging from the binding site location, as identified experimentally as well as in silico, to the primer sequences used for cloning experiments. More than half a million miRNA:gene interactions have been curated from published experiments on 356 different cell types from 24 species, corresponding to 9- to 250-fold more entries than any other relevant database. DIANA-TarBase v7.0 is freely available. © The Author(s) 2014. Published by Oxford University Press on behalf of Nucleic Acids Research.
Experimental Validation of a Closed Brayton Cycle System Transient Simulation
NASA Technical Reports Server (NTRS)
Johnson, Paul K.; Hervol, David S.
2006-01-01
The Brayton Power Conversion Unit (BPCU) located at NASA Glenn Research Center (GRC) in Cleveland, Ohio was used to validate the results of a computational code known as Closed Cycle System Simulation (CCSS). Conversion system thermal transient behavior was the focus of this validation. The BPCU was operated at various steady state points and then subjected to transient changes involving shaft rotational speed and thermal energy input. These conditions were then duplicated in CCSS. Validation of the CCSS BPCU model provides confidence in developing future Brayton power system performance predictions, and helps to guide high power Brayton technology development.
NASA Technical Reports Server (NTRS)
Stefanescu, D. M.; Catalina, A. V.; Juretzko, Frank R.; Sen, Subhayu; Curreri, P. A.
2003-01-01
The objectives of the work on Particle Engulfment and Pushing by Solidifying Interfaces (PEP) include: 1) to obtain fundamental understanding of the physics of particle pushing and engulfment, 2) to develop mathematical models to describe the phenomenon, and 3) to perform critical experiments in the microgravity environment of space to provide benchmark data for model validation. Successful completion of this project will yield vital information relevant to diverse terrestrial applications. With PEP being a long-term research effort, this report focuses on advances in the theoretical treatment of the solid/liquid interface interaction with an approaching particle, experimental validation of some aspects of the developed models, and the experimental design aspects of future experiments to be performed on board the International Space Station.
Experimental annotation of the human genome using microarray technology.
Shoemaker, D D; Schadt, E E; Armour, C D; He, Y D; Garrett-Engele, P; McDonagh, P D; Loerch, P M; Leonardson, A; Lum, P Y; Cavet, G; Wu, L F; Altschuler, S J; Edwards, S; King, J; Tsang, J S; Schimmack, G; Schelter, J M; Koch, J; Ziman, M; Marton, M J; Li, B; Cundiff, P; Ward, T; Castle, J; Krolewski, M; Meyer, M R; Mao, M; Burchard, J; Kidd, M J; Dai, H; Phillips, J W; Linsley, P S; Stoughton, R; Scherer, S; Boguski, M S
2001-02-15
The most important product of the sequencing of a genome is a complete, accurate catalogue of genes and their products, primarily messenger RNA transcripts and their cognate proteins. Such a catalogue cannot be constructed by computational annotation alone; it requires experimental validation on a genome scale. Using 'exon' and 'tiling' arrays fabricated by ink-jet oligonucleotide synthesis, we devised an experimental approach to validate and refine computational gene predictions and define full-length transcripts on the basis of co-regulated expression of their exons. These methods can provide more accurate gene numbers and allow the detection of mRNA splice variants and identification of the tissue- and disease-specific conditions under which genes are expressed. We apply our technique to chromosome 22q under 69 experimental condition pairs, and to the entire human genome under two experimental conditions. We discuss implications for more comprehensive, consistent and reliable genome annotation, more efficient, full-length complementary DNA cloning strategies and application to complex diseases.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hansen, C., E-mail: hansec@uw.edu; Columbia University, New York, New York 10027; Victor, B.
We present the application of three scalar metrics derived from the Biorthogonal Decomposition (BD) technique to evaluate the level of agreement between macroscopic plasma dynamics in different data sets. BD decomposes large data sets, as produced by distributed diagnostic arrays, into principal mode structures without assumptions on spatial or temporal structure. These metrics have been applied to validation of the Hall-MHD model using experimental data from the Helicity Injected Torus with Steady Inductive helicity injection (HIT-SI) experiment. Each metric provides a measure of correlation between mode structures extracted from experimental data and simulations for an array of 192 surface-mounted magnetic probes. Numerical validation studies have been performed using the NIMROD code, where the injectors are modeled as boundary conditions on the flux conserver, and the PSI-TET code, where the entire plasma volume is treated. Initial results from a comprehensive validation study of high performance operation with different injector frequencies are presented, illustrating application of the BD method. Using a simplified (constant, uniform density and temperature) Hall-MHD model, simulation results agree with experimental observation for two of the three defined metrics when the injectors are driven with a frequency of 14.5 kHz.
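Numerically, the BD of a (time × probe) data matrix is a singular value decomposition, and one scalar agreement metric is the correlation between spatial modes extracted from two data sets. The sketch below uses synthetic rank-one data standing in for the 192-probe array; it is a generic illustration, not the analysis code used in the study:

```python
import numpy as np

def bd_modes(data):
    """Biorthogonal decomposition of a (time x probe) matrix via SVD:
    temporal modes (columns of u), mode amplitudes (s), spatial modes (rows of vt)."""
    u, s, vt = np.linalg.svd(data, full_matrices=False)
    return u, s, vt

def mode_correlation(a, b):
    """|cosine| between two spatial mode vectors (the sign of an SVD mode is arbitrary)."""
    return abs(np.dot(a, b)) / (np.linalg.norm(a) * np.linalg.norm(b))

# Synthetic "experiment" and "simulation" sharing one dominant spatial structure
t = np.linspace(0.0, 1.0, 200)[:, None]                       # time samples
probes = np.sin(np.linspace(0.0, 2.0 * np.pi, 192))[None, :]  # 192-probe pattern
exp_data = np.sin(2.0 * np.pi * 14.5 * t) * probes            # 14.5 kHz drive (t in ms)
sim_data = 0.8 * exp_data                                     # same structure, lower amplitude
_, _, vt_exp = bd_modes(exp_data)
_, _, vt_sim = bd_modes(sim_data)
corr = mode_correlation(vt_exp[0], vt_sim[0])
print(corr)  # ~1.0: identical leading spatial structures
```

Because BD makes no assumption about the spatial or temporal form of the modes, the same comparison applies unchanged to measured probe signals and simulated ones.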
Zhuang, Jinda; Ju, Y Sungtaek
2015-09-22
The deformation and rupture of axisymmetric liquid bridges being stretched between two fully wetted coaxial disks are studied experimentally and theoretically. We numerically solve the time-dependent Navier-Stokes equations while tracking the deformation of the liquid-air interface using the arbitrary Lagrangian-Eulerian (ALE) moving mesh method to fully account for the effects of inertia and viscous forces on bridge dynamics. The effects of the stretching velocity, liquid properties, and liquid volume on the dynamics of liquid bridges are systematically investigated to provide direct experimental validation of our numerical model for stretching velocities as high as 3 m/s. The Ohnesorge number (Oh) of a liquid bridge is a primary factor governing the dynamics of its rupture, especially the dependence of the rupture distance on the stretching velocity. The rupture distance generally increases with the stretching velocity, far in excess of the static stability limit. For bridges with low Ohnesorge numbers, however, the rupture distance stays nearly constant or decreases with the stretching velocity within certain velocity windows, due to switching of the relative rupture position and changes in thread shape. Our work provides an experimentally validated modeling approach and experimental data that help establish a foundation for further systematic studies and applications of liquid bridges.
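The Ohnesorge number referenced above compares viscous forces to inertial and capillary forces, Oh = μ/√(ρσL). A quick sketch with illustrative water-like property values (not data from the study):

```python
import math

def ohnesorge(viscosity, density, surface_tension, length):
    """Oh = mu / sqrt(rho * sigma * L); low Oh means inertia and capillarity dominate viscosity."""
    return viscosity / math.sqrt(density * surface_tension * length)

# Illustrative values: water-like liquid bridging a 1 mm gap
oh = ohnesorge(viscosity=1.0e-3,       # Pa.s
               density=1000.0,         # kg/m^3
               surface_tension=0.072,  # N/m
               length=1.0e-3)          # m
print(f"Oh = {oh:.4f}")  # low-Oh regime, where rupture distance can plateau with velocity
```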
Phenomenological study of decoherence in solid-state spin qubits due to nuclear spin diffusion
NASA Astrophysics Data System (ADS)
Biercuk, Michael J.; Bluhm, Hendrik
2011-06-01
We present a study of the prospects for coherence preservation in solid-state spin qubits using dynamical decoupling protocols. Recent experiments have provided the first demonstrations of multipulse dynamical decoupling sequences in this qubit system, but quantitative analyses of potential coherence improvements have been hampered by a lack of concrete knowledge of the relevant noise processes. We present calculations of qubit coherence under the application of arbitrary dynamical decoupling pulse sequences based on an experimentally validated semiclassical model. This phenomenological approach bundles the details of underlying noise processes into a single experimentally relevant noise power spectral density. Our results show that the dominant features of experimental measurements in a two-electron singlet-triplet spin qubit can be replicated using a 1/ω2 noise power spectrum associated with nuclear spin flips in the host material. Beginning with this validation, we address the effects of nuclear programming, high-frequency nuclear spin dynamics, and other high-frequency classical noise sources, with conjectures supported by physical arguments and microscopic calculations where relevant. Our results provide expected performance bounds and identify diagnostic metrics that can be measured experimentally in order to better elucidate the underlying nuclear spin dynamics.
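The semiclassical calculation described above is usually phrased in terms of a filter function: coherence decays as W(t) = exp(−χ(t)), with χ(t) an overlap integral of the noise power spectrum S(ω) and a sequence-dependent filter F(ωt). The sketch below is a qualitative illustration only — prefactors and conventions vary between references, and the 1/ω² amplitude and frequency cutoffs are arbitrary assumptions — but it reproduces the key point that multipulse decoupling suppresses low-frequency nuclear noise relative to free evolution.

```python
import numpy as np

def filter_function(omega, t, pulse_fractions):
    """|y(omega*t)|^2 for ideal pi pulses at fractions delta_j of total time t.
    With no pulses this reduces to the free-induction form 4*sin^2(omega*t/2)."""
    n = len(pulse_fractions)
    y = 1.0 + (-1.0) ** (n + 1) * np.exp(1j * omega * t)
    for j, d in enumerate(pulse_fractions, start=1):
        y = y + 2.0 * (-1.0) ** j * np.exp(1j * omega * d * t)
    return np.abs(y) ** 2

def chi(t, pulse_fractions, noise_power, omega):
    """chi(t) = (2/pi) * integral of S(omega) F(omega*t) / omega^2 (uniform grid)."""
    integrand = noise_power(omega) * filter_function(omega, t, pulse_fractions) / omega**2
    return (2.0 / np.pi) * np.sum(integrand) * (omega[1] - omega[0])

omega = np.linspace(1e2, 1e6, 100_000)          # rad/s grid (cutoffs assumed)
s_nuclear = lambda w: 1e8 / w**2                # assumed 1/omega^2 spectrum

t = 1e-4                                        # 100 us total evolution
chi_fid = chi(t, [], s_nuclear, omega)          # free induction decay
cpmg8 = [(j - 0.5) / 8 for j in range(1, 9)]    # 8-pulse CPMG timing
chi_cpmg = chi(t, cpmg8, s_nuclear, omega)
print(f"chi FID  = {chi_fid:.3e}")
print(f"chi CPMG = {chi_cpmg:.3e}")
```

For a 1/ω² spectrum, the CPMG filter's steeper low-frequency roll-off yields χ_CPMG ≪ χ_FID, i.e. dramatically extended coherence, which is the effect the paper quantifies.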
PCA as a practical indicator of OPLS-DA model reliability.
Worley, Bradley; Powers, Robert
Principal Component Analysis (PCA) and Orthogonal Projections to Latent Structures Discriminant Analysis (OPLS-DA) are powerful statistical modeling tools that provide insights into separations between experimental groups based on high-dimensional spectral measurements from NMR, MS or other analytical instrumentation. However, when used without validation, these tools may lead investigators to statistically unreliable conclusions. This danger is especially real for Partial Least Squares (PLS) and OPLS, which aggressively force separations between experimental groups. As a result, OPLS-DA is often used as an alternative method when PCA fails to expose group separation, but this practice is highly dangerous. Without rigorous validation, OPLS-DA can easily yield statistically unreliable group separation. A Monte Carlo analysis of PCA group separations and OPLS-DA cross-validation metrics was performed on NMR datasets with statistically significant separations in scores-space. A linearly increasing amount of Gaussian noise was added to each data matrix followed by the construction and validation of PCA and OPLS-DA models. With increasing added noise, the PCA scores-space distance between groups rapidly decreased and the OPLS-DA cross-validation statistics simultaneously deteriorated. A decrease in correlation between the estimated loadings (added noise) and the true (original) loadings was also observed. While the validity of the OPLS-DA model diminished with increasing added noise, the group separation in scores-space remained basically unaffected. Supported by the results of Monte Carlo analyses of PCA group separations and OPLS-DA cross-validation metrics, we provide practical guidelines and cross-validatory recommendations for reliable inference from PCA and OPLS-DA models.
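The core of the Monte Carlo experiment described above — adding increasing Gaussian noise and watching scores-space separation collapse — can be sketched in a few lines. The data, the PCA implementation, and the centroid-distance separation metric below are all illustrative assumptions, not the authors' NMR datasets or their exact statistics.

```python
import numpy as np

rng = np.random.default_rng(1)

def pca_scores(X, n_pc=2):
    """Project mean-centered data onto its first n_pc principal components."""
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_pc].T

def group_separation(scores, labels):
    """Centroid distance scaled by mean within-group spread (illustrative metric)."""
    a, b = scores[labels == 0], scores[labels == 1]
    between = np.linalg.norm(a.mean(axis=0) - b.mean(axis=0))
    within = 0.5 * (a.std() + b.std())
    return between / within

# Two groups of synthetic "spectra": 30 samples x 100 variables, with a true
# class difference confined to the first 5 variables.
n, p = 30, 100
labels = np.repeat([0, 1], n // 2)
X = rng.standard_normal((n, p))
X[labels == 1, :5] += 4.0

separations = {}
for noise_sd in (0.0, 2.0, 8.0):
    noisy = X + noise_sd * rng.standard_normal(X.shape)
    separations[noise_sd] = group_separation(pca_scores(noisy), labels)
    print(f"added noise sd {noise_sd:>4}: scores-space separation {separations[noise_sd]:.2f}")
```

PCA separation degrades honestly with noise; the paper's warning is that OPLS-DA scores can continue to show apparent separation on the same degraded data, which is why cross-validation statistics, not scores plots, must be the arbiter.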
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pritychenko, B.
The precision of double-beta (ββ) decay experimental half-lives and their uncertainties is reanalyzed. The method of Benford's distributions has been applied to nuclear reaction, structure, and decay data sets. The first-digit distribution trend for ββ-decay T^(2ν)_(1/2) values is consistent with large nuclear reaction and structure data sets and provides validation of the experimental half-lives. A complementary analysis of the decay uncertainties indicates deficiencies due to the small size of the statistical samples and incomplete collection of experimental information. Further experimental and theoretical efforts would lead toward more precise values of ββ-decay half-lives and nuclear matrix elements.
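Benford's law, the first-digit test applied above, states that the leading digit d of many physical data sets occurs with probability P(d) = log10(1 + 1/d). A minimal self-check with synthetic "half-lives" (log-uniform samples, which follow Benford's law; the decade range is an arbitrary assumption, not the evaluated ββ data):

```python
import math
import random

def first_digit(x):
    """Leading decimal digit of a positive number."""
    while x >= 10:
        x /= 10
    while x < 1:
        x *= 10
    return int(x)

def benford_expected(d):
    return math.log10(1.0 + 1.0 / d)

random.seed(0)
# Log-uniform samples spanning many decades (e.g. 1e18..1e26 "years").
sample = [10 ** random.uniform(18, 26) for _ in range(20000)]
counts = {d: 0 for d in range(1, 10)}
for x in sample:
    counts[first_digit(x)] += 1

for d in range(1, 10):
    observed = counts[d] / len(sample)
    print(f"digit {d}: observed {observed:.3f}, Benford {benford_expected(d):.3f}")
```

A data set whose first-digit frequencies deviate strongly from these expected values would be flagged for closer scrutiny, which is the sense in which the method "validates" experimental compilations.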
NASA Technical Reports Server (NTRS)
Magee, Todd E.; Wilcox, Peter A.; Fugal, Spencer R.; Acheson, Kurt E.; Adamson, Eric E.; Bidwell, Alicia L.; Shaw, Stephen G.
2013-01-01
This report describes the work conducted by The Boeing Company under American Recovery and Reinvestment Act (ARRA) and NASA funding to experimentally validate the conceptual design of a supersonic airliner feasible for entry into service in the 2018 to 2020 timeframe (NASA N+2 generation). The report discusses the design, analysis and development of a low-boom concept that meets aggressive sonic boom and performance goals for a cruise Mach number of 1.8. The design is achieved through integrated multidisciplinary optimization tools. The report also describes the detailed design and fabrication of both sonic boom and performance wind tunnel models of the low-boom concept. Additionally, a description of the detailed validation wind tunnel testing that was performed with the wind tunnel models is provided along with validation comparisons with pretest Computational Fluid Dynamics (CFD). Finally, the report describes the evaluation of existing NASA sonic boom pressure rail measurement instrumentation and a detailed description of new sonic boom measurement instrumentation that was constructed for the validation wind tunnel testing.
Testing and validating environmental models
Kirchner, J.W.; Hooper, R.P.; Kendall, C.; Neal, C.; Leavesley, G.
1996-01-01
Generally accepted standards for testing and validating ecosystem models would benefit both modellers and model users. Universally applicable test procedures are difficult to prescribe, given the diversity of modelling approaches and the many uses for models. However, the generally accepted scientific principles of documentation and disclosure provide a useful framework for devising general standards for model evaluation. Adequately documenting model tests requires explicit performance criteria, and explicit benchmarks against which model performance is compared. A model's validity, reliability, and accuracy can be most meaningfully judged by explicit comparison against the available alternatives. In contrast, current practice is often characterized by vague, subjective claims that model predictions show 'acceptable' agreement with data; such claims provide little basis for choosing among alternative models. Strict model tests (those that invalid models are unlikely to pass) are the only ones capable of convincing rational skeptics that a model is probably valid. However, 'false positive' rates as low as 10% can substantially erode the power of validation tests, making them insufficiently strict to convince rational skeptics. Validation tests are often undermined by excessive parameter calibration and overuse of ad hoc model features. Tests are often also divorced from the conditions under which a model will be used, particularly when it is designed to forecast beyond the range of historical experience. In such situations, data from laboratory and field manipulation experiments can provide particularly effective tests, because one can create experimental conditions quite different from historical data, and because experimental data can provide a more precisely defined 'target' for the model to hit. 
We present a simple demonstration showing that the two most common methods for comparing model predictions to environmental time series (plotting model time series against data time series, and plotting predicted versus observed values) have little diagnostic power. We propose that it may be more useful to statistically extract the relationships of primary interest from the time series, and test the model directly against them.
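The point about diagnostic power can be made concrete: two models can match an observed time series almost equally well by eye (and by predicted-vs-observed correlation) while differing sharply in the relationship of primary interest. The sketch below uses entirely synthetic data — the driver/response setup and the sensitivity values are illustrative assumptions, not the authors' demonstration.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic daily driver (e.g. rainfall) and an observed response with a
# known sensitivity of 0.6 plus a seasonal cycle and noise.
driver = np.clip(rng.normal(5.0, 2.0, 365), 0.0, None)
seasonal = 2.0 * np.sin(np.linspace(0.0, 2.0 * np.pi, 365))
observed = 0.6 * driver + seasonal + rng.normal(0.0, 0.5, 365)

models = {
    "A": 0.6 * driver + seasonal,        # correct sensitivity to the driver
    "B": 0.1 * driver + seasonal + 2.5,  # wrong sensitivity, similar mean level
}

results = {}
for name, predicted in models.items():
    r = np.corrcoef(predicted, observed)[0, 1]    # predicted-vs-observed score
    slope = np.polyfit(driver, predicted, 1)[0]   # extracted driver sensitivity
    results[name] = (r, slope)
    print(f"model {name}: r = {r:.2f}, driver sensitivity = {slope:.2f}")
```

Both models post a "respectable" correlation because the shared seasonal cycle dominates, but regressing the prediction against the driver immediately exposes model B's wrong sensitivity — exactly the kind of extracted relationship the authors propose testing directly.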
Code of Federal Regulations, 2014 CFR
2014-07-01
... the validation study and subsequent use during decontamination. 761.386 Section 761.386 Protection of... experimental conditions for the validation study and subsequent use during decontamination. The following experimental conditions apply for any solvent: (a) Temperature and pressure. Conduct the validation study and...
Code of Federal Regulations, 2012 CFR
2012-07-01
... the validation study and subsequent use during decontamination. 761.386 Section 761.386 Protection of... experimental conditions for the validation study and subsequent use during decontamination. The following experimental conditions apply for any solvent: (a) Temperature and pressure. Conduct the validation study and...
Code of Federal Regulations, 2013 CFR
2013-07-01
... the validation study and subsequent use during decontamination. 761.386 Section 761.386 Protection of... experimental conditions for the validation study and subsequent use during decontamination. The following experimental conditions apply for any solvent: (a) Temperature and pressure. Conduct the validation study and...
Forensic Uncertainty Quantification of Explosive Dispersal of Particles
NASA Astrophysics Data System (ADS)
Hughes, Kyle; Park, Chanyoung; Haftka, Raphael; Kim, Nam-Ho
2017-06-01
In addition to the numerical challenges of simulating the explosive dispersal of particles, validation of the simulation is often plagued by poor knowledge of the experimental conditions. The level of experimental detail required for validation is beyond what is usually included in the literature. This presentation proposes the use of forensic uncertainty quantification (UQ) to investigate validation-quality experiments to discover possible sources of uncertainty that may have been missed in the initial design of experiments or under-reported. The current experience of the authors has found that making an analogy to crime scene investigation when looking at validation experiments can yield valuable insights. One examines all the data and documentation provided by the validation experimentalists, corroborates evidence, and quantifies large sources of uncertainty a posteriori with empirical measurements. In addition, it is proposed that forensic UQ may benefit from an independent investigator to help remove possible implicit biases and increase the likelihood of discovering unrecognized uncertainty. Forensic UQ concepts will be discussed and then applied to a set of validation experiments performed at Eglin Air Force Base. This work was supported in part by the U.S. Department of Energy, National Nuclear Security Administration, Advanced Simulation and Computing Program.
Improving the seismic small-scale modelling by comparison with numerical methods
NASA Astrophysics Data System (ADS)
Pageot, Damien; Leparoux, Donatienne; Le Feuvre, Mathieu; Durand, Olivier; Côte, Philippe; Capdeville, Yann
2017-10-01
Experimental seismic modelling at reduced scale provides an intermediate step between numerical tests and geophysical campaigns on field sites. Recent technologies such as laser interferometers offer the opportunity to acquire data without any coupling effects. This kind of device is used in the Mesures Ultrasonores Sans Contact (MUSC) measurement bench, whose automated support system makes it possible to generate multisource and multireceiver seismic data at laboratory scale. Experimental seismic modelling would become a valuable stage in validating imaging processes if (1) the experimental measurement chain is perfectly mastered, so that the experimental data are reproducible with a numerical tool, and (2) the effective source is reproducible across the measurement setup. These aspects of quantitative validation for devices with piezoelectric sources and a laser interferometer have not yet been studied quantitatively in the published literature. As a new stage for the experimental modelling approach, these two key issues are therefore tackled in this paper in order to precisely define the quality of the experimental small-scale data provided by the MUSC bench, which are available to the scientific community. These two validation steps are treated independently of any imaging technique, so that geophysicists who want to use these freely delivered data can know their quality precisely before testing an imaging method. First, in order to avoid the 2-D-3-D correction usually applied in seismic processing when comparing 2-D numerical data with 3-D experimental measurements, we refined the comparison between numerical and experimental data by generating accurate experimental line sources, removing the need for geometrical spreading corrections of 3-D point-source data. 
The comparison with 2-D and 3-D numerical modelling is based on the Spectral Element Method. The approach shows the relevance of building a line source by sampling several source points, apart from boundary effects at later arrival times. Indeed, the experimental results reproduce the amplitude behaviour and the π/4 phase delay of a line source in the same manner as the numerical data. In contrast, 2-D corrections applied to 3-D data show discrepancies that are larger for experimental data than for numerical data, owing to the source wavelet shape and interference between different arrivals. The experimental results from the approach proposed here show that these discrepancies are avoided, especially for the reflected echoes. Concerning the second point, assessing the experimental reproducibility of the source, correlation coefficients are calculated for recordings of repeated source impacts on a homogeneous model. The quality of the results, with coefficients higher than 0.98, allows a mean source wavelet to be calculated by inversion of a mean data set. Results obtained on a more realistic model simulating clays over limestones confirm the reproducibility of the source impact.
Hyper-X: Flight Validation of Hypersonic Airbreathing Technology
NASA Technical Reports Server (NTRS)
Rausch, Vincent L.; McClinton, Charles R.; Crawford, J. Larry
1997-01-01
This paper provides an overview of NASA's focused hypersonic technology program, the Hyper-X program. This program is designed to move hypersonic, air-breathing vehicle technology from the laboratory environment to the flight environment, the last stage preceding prototype development. This paper presents some history leading to the flight test program, research objectives, approach, schedule, and status. A substantial experimental database and concept validation have been completed. The program is concentrating on Mach 7 vehicle development, verification, and validation in preparation for wind tunnel testing in 1998 and flight testing in 1999. It is also concentrating on finalization of the Mach 5 and 10 vehicle designs. Detailed evaluation of the Mach 7 vehicle at flight conditions is nearing completion and will provide a database for validation of design methods once flight test data are available.
Harris, Jeff R.; Lance, Blake W.; Smith, Barton L.
2015-08-10
We present a computational fluid dynamics (CFD) validation dataset for turbulent forced convection on a vertical plate. The design of the apparatus is based on recent validation literature and provides a means to simultaneously measure boundary conditions (BCs) and system response quantities (SRQs). Important inflow quantities for Reynolds-Averaged Navier-Stokes (RANS) CFD are also measured. Data are acquired at two heating conditions and cover the range 40,000 < Re_x < 300,000, 357 < Re_δ2 < 813, and 0.02 < Gr/Re² < 0.232.
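The quoted ranges are straightforward to place physically. A quick sketch with nominal air properties (the viscosity, expansion coefficient, and flow conditions below are assumptions for illustration, not the apparatus's measured values):

```python
# Nominal air properties near 300 K (illustrative assumptions).
NU = 1.6e-5        # kinematic viscosity, m^2/s
BETA = 1.0 / 300   # ideal-gas thermal expansion coefficient, 1/K
G = 9.81           # gravitational acceleration, m/s^2

def re_x(velocity, x):
    """Streamwise Reynolds number Re_x = U*x/nu."""
    return velocity * x / NU

def gr_over_re2(delta_t, x, velocity):
    """Mixed-convection parameter Gr/Re^2 (buoyancy vs inertia)."""
    gr = G * BETA * delta_t * x**3 / NU**2
    return gr / re_x(velocity, x) ** 2

# e.g. 4 m/s air at x = 1 m with a 20 K plate-to-air temperature difference
print(f"Re_x    = {re_x(4.0, 1.0):.0f}")           # ~250,000, inside the quoted range
print(f"Gr/Re^2 = {gr_over_re2(20.0, 1.0, 4.0):.3f}")  # inside 0.02..0.232
```

Gr/Re² well below 1 confirms these are forced-convection-dominated conditions with a measurable buoyancy influence, consistent with a dataset aimed at validating RANS forced-convection models.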
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bonior, Jason D; Hu, Zhen; Guo, Terry N.
This letter presents an experimental demonstration of software-defined-radio-based wireless tomography using computer-hosted radio devices called Universal Software Radio Peripheral (USRP). This experimental brief follows our vision and previous theoretical study of wireless tomography that combines wireless communication and RF tomography to provide a novel approach to remote sensing. Automatic data acquisition is performed inside an RF anechoic chamber. Semidefinite relaxation is used for phase retrieval, and the Born iterative method is utilized for imaging the target. Experimental results are presented, validating our vision of wireless tomography.
The flaws and human harms of animal experimentation.
Akhtar, Aysha
2015-10-01
Nonhuman animal ("animal") experimentation is typically defended by arguments that it is reliable, that animals provide sufficiently good models of human biology and diseases to yield relevant information, and that, consequently, its use provides major human health benefits. I demonstrate that a growing body of scientific literature critically assessing the validity of animal experimentation generally (and animal modeling specifically) raises important concerns about its reliability and predictive value for human outcomes and for understanding human physiology. The unreliability of animal experimentation across a wide range of areas undermines scientific arguments in favor of the practice. Additionally, I show how animal experimentation often significantly harms humans through misleading safety studies, potential abandonment of effective therapeutics, and direction of resources away from more effective testing methods. The resulting evidence suggests that the collective harms and costs to humans from animal experimentation outweigh potential benefits and that resources would be better invested in developing human-based testing methods.
NASA Astrophysics Data System (ADS)
Stekovic, Svjetlana; Nissen, Erin; Bhowmick, Mithun; Stewart, Donald S.; Dlott, Dana D.
2017-06-01
The objective of this work is to numerically analyze shock behavior as it propagates through compressed, unreactive and reactive liquids, such as liquid water and liquid nitromethane. Parameters such as pressure and density are analyzed using the Mie-Gruneisen EOS, and each multi-material system is modeled using the ALE3D software. The motivation for this study is provided by high-resolution photonic Doppler velocimetry (PDV) and optical pyrometer measurements. In the experimental set-up, a liquid is placed between an Al 1100 plate and Pyrex BK-7 glass. A laser-driven Al 1100 flyer impacts the plate, causing the liquid to be highly compressed. The numerical model investigates the influence of the high-pressure, shock-compressed behavior in each liquid, the energy transfer, and the wave impedance at the interface of each material in contact. The numerical results using ALE3D will be validated against experimental data. This work aims to provide further understanding of shock-compressed behavior and how the shock influences phase transitions in each liquid.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Henry, Christopher S.; Bernstein, Hans C.; Weisenhorn, Pamela
Metabolic network modeling of microbial communities provides an in-depth understanding of community-wide metabolic and regulatory processes. Compared to single organism analyses, community metabolic network modeling is more complex because it needs to account for interspecies interactions. To date, most approaches focus on reconstruction of high-quality individual networks so that, when combined, they can predict community behaviors as a result of interspecies interactions. However, this conventional method becomes ineffective for communities whose members are not well characterized and cannot be experimentally interrogated in isolation. Here, we tested a new approach that uses community-level data as a critical input for the network reconstruction process. This method focuses on directly predicting interspecies metabolic interactions in a community, when axenic information is insufficient. We validated our method through the case study of a bacterial photoautotroph-heterotroph consortium that was used to provide data needed for a community-level metabolic network reconstruction. Resulting simulations provided experimentally validated predictions of how a photoautotrophic cyanobacterium supports the growth of an obligate heterotrophic species by providing organic carbon and nitrogen sources.
ERIC Educational Resources Information Center
Barnette, J. Jackson; Wallis, Anne Baber
2005-01-01
We rely a great deal on the schematic descriptions that represent experimental and quasi-experimental design arrangements, as well as the discussions of threats to validity associated with these, provided by Campbell and his associates: Stanley, Cook, and Shadish. Some of these designs include descriptions of treatments removed, removed and then…
AeroValve Experimental Test Data Final Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Noakes, Mark W.
This report documents the collection of experimental test data and presents performance characteristics for the AeroValve brand prototype pneumatic bidirectional solenoid valves tested at the Oak Ridge National Laboratory (ORNL) in July/August 2014 as part of a validation of AeroValve energy efficiency claims. The test stand and control programs were provided by AeroValve. All raw data and processing are included in the report attachments.
Chen, Lei; Zhong, Hai-ying; Kuang, Jian-fei; Li, Jian-guo; Lu, Wang-jin; Chen, Jian-ye
2011-08-01
Reverse transcription quantitative real-time PCR (RT-qPCR) is a sensitive technique for quantifying gene expression, but its success depends on the stability of the reference gene(s) used for data normalization. Only a few studies on validation of reference genes have been conducted in fruit trees, and none yet in banana. In the present work, 20 candidate reference genes were selected, and their expression stability in 144 banana samples was evaluated and analyzed using two algorithms, geNorm and NormFinder. The samples consisted of eight sample sets collected under different experimental conditions, including various tissues, developmental stages, postharvest ripening, stresses (chilling, high temperature, and pathogen), and hormone treatments. Our results showed that different suitable reference gene(s) or combinations of reference genes should be selected for normalization depending on the experimental conditions. The RPS2 and UBQ2 genes were validated as the most suitable reference genes across all tested samples. More importantly, our data further showed that the widely used reference genes, ACT and GAPDH, were not the most suitable reference genes in many banana sample sets. In addition, the expression of MaEBF1, a gene of interest that plays an important role in regulating fruit ripening, under different experimental conditions was used to further confirm the validated reference genes. Taken together, our results provide guidelines for reference gene selection under different experimental conditions and a foundation for more accurate and widespread use of RT-qPCR in banana.
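The geNorm stability measure used in this study has a simple definition: for each candidate gene j, M_j is the mean, over all other candidates k, of the standard deviation of log2(expression_j / expression_k) across samples; lower M means more stable. A minimal sketch with invented expression values (gene names echo the abstract, but the numbers are illustrative only):

```python
import math

def stdev(xs):
    """Sample standard deviation."""
    m = sum(xs) / len(xs)
    return math.sqrt(sum((x - m) ** 2 for x in xs) / (len(xs) - 1))

def genorm_m(expression):
    """geNorm-style M-values: expression maps gene -> per-sample values."""
    genes = list(expression)
    m_values = {}
    for j in genes:
        variations = []
        for k in genes:
            if k == j:
                continue
            log_ratios = [math.log2(a / b)
                          for a, b in zip(expression[j], expression[k])]
            variations.append(stdev(log_ratios))
        m_values[j] = sum(variations) / len(variations)
    return m_values

expression = {
    "RPS2": [100, 105, 98, 102, 101],    # stable across conditions
    "UBQ2": [200, 210, 196, 205, 202],   # stable, tracks RPS2
    "ACT":  [150, 90, 220, 60, 180],     # variable across conditions
}
m = genorm_m(expression)
for gene, value in sorted(m.items(), key=lambda kv: kv[1]):
    print(f"{gene}: M = {value:.3f}")    # lower M = more stable reference
```

With these made-up numbers the variable ACT-like gene scores the worst M, mirroring the paper's finding that the popular ACT and GAPDH genes are often poor references in banana.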
Oulas, Anastasis; Karathanasis, Nestoras; Louloupi, Annita; Pavlopoulos, Georgios A; Poirazi, Panayiota; Kalantidis, Kriton; Iliopoulos, Ioannis
2015-01-01
Computational methods for miRNA target prediction are currently undergoing extensive review and evaluation. There is still a great need for improvement of these tools and bioinformatics approaches are looking towards high-throughput experiments in order to validate predictions. The combination of large-scale techniques with computational tools will not only provide greater credence to computational predictions but also lead to the better understanding of specific biological questions. Current miRNA target prediction tools utilize probabilistic learning algorithms, machine learning methods and even empirical biologically defined rules in order to build models based on experimentally verified miRNA targets. Large-scale protein downregulation assays and next-generation sequencing (NGS) are now being used to validate methodologies and compare the performance of existing tools. Tools that exhibit greater correlation between computational predictions and protein downregulation or RNA downregulation are considered the state of the art. Moreover, efficiency in prediction of miRNA targets that are concurrently verified experimentally provides additional validity to computational predictions and further highlights the competitive advantage of specific tools and their efficacy in extracting biologically significant results. In this review paper, we discuss the computational methods for miRNA target prediction and provide a detailed comparison of methodologies and features utilized by each specific tool. Moreover, we provide an overview of current state-of-the-art high-throughput methods used in miRNA target prediction.
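The benchmarking principle described above — rank tools by how well their predicted target scores correlate with measured downregulation — can be sketched directly. All scores and measurements below are invented for illustration; no real tool or dataset is represented.

```python
def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

# Measured protein downregulation for five putative targets (hypothetical).
measured_downregulation = [0.9, 0.7, 0.6, 0.3, 0.1]

# Two hypothetical prediction tools scoring the same five targets.
tool_scores = {
    "tool_a": [0.95, 0.80, 0.55, 0.35, 0.05],   # tracks the measurements
    "tool_b": [0.40, 0.90, 0.10, 0.80, 0.30],   # poorly correlated
}

ranking = sorted(tool_scores,
                 key=lambda t: pearson(tool_scores[t], measured_downregulation),
                 reverse=True)
for tool in ranking:
    print(tool, round(pearson(tool_scores[tool], measured_downregulation), 2))
```

In practice such comparisons use large-scale pSILAC or NGS downregulation data and rank correlations rather than five points, but the "state of the art = highest correlation with experiment" criterion is exactly this.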
NASA Astrophysics Data System (ADS)
Johnston, Michael A.; Farrell, Damien; Nielsen, Jens Erik
2012-04-01
The exchange of information between experimentalists and theoreticians is crucial to improving the predictive ability of theoretical methods and hence our understanding of the related biology. However, many barriers exist which prevent the flow of information between the two disciplines. Enabling effective collaboration requires that experimentalists can easily apply computational tools to their data, share their data with theoreticians, and that both the experimental data and computational results are accessible to the wider community. We present a prototype collaborative environment for developing and validating predictive tools for protein biophysical characteristics. The environment is built on two central components: a new Python-based integration module which allows theoreticians to provide and manage remote access to their programs; and PEATDB, a program for storing and sharing experimental data from protein biophysical characterisation studies. We demonstrate our approach by integrating PEATSA, a web-based service for predicting changes in protein biophysical characteristics, into PEATDB. Furthermore, we illustrate how the resulting environment aids method development using the Potapov dataset of experimentally measured ΔΔGfold values, previously employed to validate and train protein stability prediction algorithms.
McFadden, Michael J; Iqbal, Muzammil; Dillon, Thomas; Nair, Rohit; Gu, Tian; Prather, Dennis W; Haney, Michael W
2006-09-01
The use of optical interconnects for communication between points on a microchip is motivated by system-level interconnect modeling showing the saturation of metal wire capacity at the global layer. Free-space optical solutions are analyzed for intrachip communication at the global layer. A multiscale solution comprising microlenses, etched compound slope microprisms, and a curved mirror is shown to outperform a single-scale alternative. Microprisms are designed and fabricated and inserted into an optical setup apparatus to experimentally validate the concept. The multiscale free-space system is shown to have the potential to provide the bandwidth density and configuration flexibility required for global communication in future generations of microchips.
Acoustically Driven Fluid and Particle Motion in Confined and Leaky Systems
NASA Astrophysics Data System (ADS)
Barnkob, Rune; Nama, Nitesh; Ren, Liqiang; Huang, Tony Jun; Costanzo, Francesco; Kähler, Christian J.
2018-01-01
The acoustic motion of fluids and particles in confined and acoustically leaky systems is receiving increasing attention for its use in medicine and biotechnology. A number of contradicting physical and numerical models currently exist, but their validity is uncertain because the experimental data needed for validation are difficult to access. We provide experimental benchmarking data by measuring 3D particle trajectories and demonstrate that the particle trajectories can be described numerically without any fitting parameter by a reduced-fluid model with leaky impedance-wall conditions. The results reveal the hitherto unknown existence of a pseudo-standing wave that drives the acoustic streaming as well as the acoustic radiation force on suspended particles.
Haley, David W
2011-09-01
The current study examined whether the psychological stress of the still-face (SF) task (i.e. stress resulting from a parent's unresponsiveness) is a valid laboratory stress paradigm for evaluating infant cortisol reactivity. Given that factors external to the experimental paradigm, such as arriving at a new place, may cause an elevation in cortisol secretion, we tested the hypothesis that infants would show a cortisol response to the SF task but not to a normal face-to-face (FF) task (control). Saliva was collected for cortisol measurement from 6-month-old infants (n = 31) randomly assigned to either a repeated SF task or to a continuous FF task. Parent-infant dyads were videotaped. Salivary cortisol concentration was measured at baseline, 20, and 30 min after the start of the procedure. Infant salivary cortisol concentrations showed a significant increase over time for the SF task but not for the FF task. The results provide new evidence that the repeated SF task poses a psychological challenge that is due to the SF condition rather than to some non-task-related factor; these results support the internal validity of the paradigm. The study offers new insight into the role of parent-infant interactions in the activation of the infant stress response system.
Orchestrating TRANSP Simulations for Interpretative and Predictive Tokamak Modeling with OMFIT
DOE Office of Scientific and Technical Information (OSTI.GOV)
Grierson, B. A.; Yuan, X.; Gorelenkova, M.
TRANSP simulations are being used in the OMFIT workflow manager to enable a machine-independent means of experimental analysis, postdictive validation, and predictive time-dependent simulations on the DIII-D, NSTX, JET and C-MOD tokamaks. The procedures for preparing the input data from plasma profile diagnostics and equilibrium reconstruction, as well as processing of the time-dependent heating and current drive sources and assumptions about the neutral recycling, vary across machines, but are streamlined by using a common workflow manager. Settings for TRANSP simulation fidelity are incorporated into the OMFIT framework, contrasting between-shot analysis, power balance, and fast-particle simulations. A previously established series of data consistency metrics are computed, such as comparison of experimental vs. calculated neutron rate, equilibrium stored energy vs. total stored energy from profile and fast-ion pressure, and experimental vs. computed surface loop voltage. Discrepancies between data consistency metrics can indicate errors in input quantities such as electron density profile or Zeff, or indicate anomalous fast-particle transport. Measures to assess the sensitivity of the verification metrics to input quantities are provided by OMFIT, including scans of the input profiles and standardized post-processing visualizations. For predictive simulations, TRANSP uses GLF23 or TGLF to predict core plasma profiles, with user-defined boundary conditions in the outer region of the plasma. ITPA validation metrics are provided in post-processing to assess the transport model validity. By using OMFIT to orchestrate the steps for experimental data preparation, selection of operating mode, submission, post-processing and visualization, we have streamlined and standardized the usage of TRANSP.
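The data-consistency check described above amounts to comparing measured and computed quantities and flagging large relative discrepancies for follow-up. A minimal sketch — the quantity names, values, and 10% threshold are illustrative assumptions, not OMFIT's actual metrics or tolerances:

```python
def consistency_report(metrics, threshold=0.10):
    """metrics: {name: (measured, computed)} -> list of flagged names.
    Flags entries whose relative discrepancy exceeds the threshold."""
    flagged = []
    for name, (measured, computed) in metrics.items():
        rel = abs(measured - computed) / abs(measured)
        status = "FLAG" if rel > threshold else "ok"
        print(f"{name:>18}: {100 * rel:5.1f}% {status}")
        if rel > threshold:
            flagged.append(name)
    return flagged

# Hypothetical shot: neutron rate and stored energy agree, loop voltage does not.
flagged = consistency_report({
    "neutron rate":      (2.4e15, 2.3e15),  # ~4% discrepancy
    "stored energy":     (0.95e6, 0.90e6),  # ~5% discrepancy
    "surface loop volt": (0.08, 0.12),      # large: suspect Zeff or density input
})
print("flagged:", flagged)
```

As the abstract notes, which metric fails is itself diagnostic: a loop-voltage mismatch with a consistent neutron rate points toward resistivity inputs (Zeff, density) rather than anomalous fast-particle transport.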
Orchestrating TRANSP Simulations for Interpretative and Predictive Tokamak Modeling with OMFIT
Grierson, B. A.; Yuan, X.; Gorelenkova, M.; ...
2018-02-21
TRANSP simulations are being used in the OMFIT workflow manager to enable a machine-independent means of experimental analysis, postdictive validation, and predictive time-dependent simulations on the DIII-D, NSTX, JET and C-MOD tokamaks. The procedures for preparing the input data from plasma profile diagnostics and equilibrium reconstruction, as well as processing of the time-dependent heating and current drive sources and assumptions about the neutral recycling, vary across machines, but are streamlined by using a common workflow manager. Settings for TRANSP simulation fidelity are incorporated into the OMFIT framework, contrasting between-shot analysis, power balance, and fast-particle simulations. A previously established series of data consistency metrics are computed, such as comparison of experimental vs. calculated neutron rate, equilibrium stored energy vs. total stored energy from profile and fast-ion pressure, and experimental vs. computed surface loop voltage. Discrepancies between data consistency metrics can indicate errors in input quantities such as electron density profile or Zeff, or indicate anomalous fast-particle transport. Measures to assess the sensitivity of the verification metrics to input quantities are provided by OMFIT, including scans of the input profiles and standardized post-processing visualizations. For predictive simulations, TRANSP uses GLF23 or TGLF to predict core plasma profiles, with user-defined boundary conditions in the outer region of the plasma. ITPA validation metrics are provided in post-processing to assess the transport model validity. By using OMFIT to orchestrate the steps for experimental data preparation, selection of operating mode, submission, post-processing and visualization, we have streamlined and standardized the usage of TRANSP.
NSRD-10: Leak Path Factor Guidance Using MELCOR
DOE Office of Scientific and Technical Information (OSTI.GOV)
Louie, David; Humphries, Larry L.
Estimating the source term from a U.S. Department of Energy (DOE) nuclear facility requires that analysts know how to apply the simulation tools used, such as the MELCOR code, particularly for a complicated facility that may include an air ventilation system and other active systems that can influence the environmental pathway of the materials released. DOE has designated MELCOR 1.8.5, an unsupported version, as a DOE ToolBox code in its Central Registry, which includes a leak-path-factor guidance report written in 2004 that did not include experimental validation data. Continuing to use this MELCOR version would require additional verification and validation, which may not be feasible from a project cost standpoint. Instead, the recent MELCOR should be used. Without developer support and experimental data validation, it is difficult to convince regulators that the calculated source term from the DOE facility is accurate and defensible. This research replaces the obsolete version in the 2004 DOE leak path factor guidance report by using MELCOR 2.1 (the latest version of MELCOR with continuing model development and user support) and by including applicable experimental data from the reactor safety arena and from the experimental data used in DOE-HDBK-3010. This research provides best-practice values used in MELCOR 2.1 specifically for leak path determination. With these enhancements, the revised leak-path-guidance report should provide confidence to the DOE safety analyst using MELCOR as a source-term determination tool for mitigated accident evaluations.
Fast scattering simulation tool for multi-energy x-ray imaging
NASA Astrophysics Data System (ADS)
Sossin, A.; Tabary, J.; Rebuffel, V.; Létang, J. M.; Freud, N.; Verger, L.
2015-12-01
A combination of Monte Carlo (MC) and deterministic approaches was employed as a means of creating a simulation tool capable of providing energy resolved x-ray primary and scatter images within a reasonable time interval. Libraries of Sindbad, a previously developed x-ray simulation software, were used in the development. The scatter simulation capabilities of the tool were validated through simulation with the aid of GATE and through experimentation by using a spectrometric CdTe detector. A simple cylindrical phantom with cavities and an aluminum insert was used. Cross-validation with GATE showed good agreement with a global spatial error of 1.5% and a maximum scatter spectrum error of around 6%. Experimental validation also supported the accuracy of the simulations obtained from the developed software with a global spatial error of 1.8% and a maximum error of around 8.5% in the scatter spectra.
miRTarBase update 2018: a resource for experimentally validated microRNA-target interactions.
Chou, Chih-Hung; Shrestha, Sirjana; Yang, Chi-Dung; Chang, Nai-Wen; Lin, Yu-Ling; Liao, Kuang-Wen; Huang, Wei-Chi; Sun, Ting-Hsuan; Tu, Siang-Jyun; Lee, Wei-Hsiang; Chiew, Men-Yee; Tai, Chun-San; Wei, Ting-Yen; Tsai, Tzi-Ren; Huang, Hsin-Tzu; Wang, Chung-Yu; Wu, Hsin-Yi; Ho, Shu-Yi; Chen, Pin-Rong; Chuang, Cheng-Hsun; Hsieh, Pei-Jung; Wu, Yi-Shin; Chen, Wen-Liang; Li, Meng-Ju; Wu, Yu-Chun; Huang, Xin-Yi; Ng, Fung Ling; Buddhakosai, Waradee; Huang, Pei-Chun; Lan, Kuan-Chun; Huang, Chia-Yen; Weng, Shun-Long; Cheng, Yeong-Nan; Liang, Chao; Hsu, Wen-Lian; Huang, Hsien-Da
2018-01-04
MicroRNAs (miRNAs) are small non-coding RNAs of ~22 nucleotides that are involved in negative regulation of mRNA at the post-transcriptional level. Previously, we developed miRTarBase, which provides information about experimentally validated miRNA-target interactions (MTIs). Here, we describe an updated database containing 422,517 curated MTIs from 4,076 miRNAs and 23,054 target genes collected from over 8,500 articles. The number of MTIs curated with strong evidence has increased ~1.4-fold since the last update in 2016. In this updated version, target sites validated by reporter assay that are available in the literature can be downloaded. New features can be extracted from the target site sequences for analysis via a machine learning approach, which can help evaluate the performance of miRNA-target prediction tools. Furthermore, different browsing modes make it easier for users to find specific MTIs. With these improvements, miRTarBase serves as a more comprehensively annotated database of experimentally validated miRNA-target interactions for miRNA-related research. miRTarBase is available at http://miRTarBase.mbc.nctu.edu.tw/. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.
2-D Circulation Control Airfoil Benchmark Experiments Intended for CFD Code Validation
NASA Technical Reports Server (NTRS)
Englar, Robert J.; Jones, Gregory S.; Allan, Brian G.; Lin, John C.
2009-01-01
A current NASA Research Announcement (NRA) project being conducted by Georgia Tech Research Institute (GTRI) personnel and NASA collaborators includes the development of Circulation Control (CC) blown airfoils to improve subsonic aircraft high-lift and cruise performance. The emphasis of this program is the development of CC active flow control concepts for high-lift augmentation, drag control, and cruise efficiency. A collaboration in this project includes work by NASA research engineers, whereas CFD validation and flow physics experimental research are part of NASA's systematic approach to developing design and optimization tools for CC applications to fixed-wing aircraft. The design space for CESTOL-type aircraft is focusing on geometries that depend on advanced flow control technologies, including Circulation Control aerodynamics. The ability to consistently predict advanced aircraft performance requires improvements in design tools to include these advanced concepts. Validation of these tools will be based on experimental methods applied to complex flows that go beyond conventional aircraft modeling techniques. This paper focuses on recent and ongoing benchmark high-lift experiments and CFD efforts intended to provide 2-D CFD validation data sets related to NASA's Cruise Efficient Short Take Off and Landing (CESTOL) study. Both the experimental data and related CFD predictions are discussed.
Effects of human running cadence and experimental validation of the bouncing ball model
NASA Astrophysics Data System (ADS)
Bencsik, László; Zelei, Ambrus
2017-05-01
The biomechanical analysis of human running is a complex problem because of the large number of parameters and degrees of freedom. However, simplified models can be constructed, which are usually characterized by some fundamental parameters, like step length, foot strike pattern and cadence. The bouncing ball model of human running is analysed theoretically and experimentally in this work. It is a minimally complex dynamic model when the aim is to estimate the energy cost of running and the tendency of ground-foot impact intensity as a function of cadence. The model shows that cadence has a direct effect on the energy efficiency of running and on ground-foot impact intensity. Furthermore, it shows that higher cadence implies lower risk of injury and better energy efficiency. An experimental data collection of 121 amateur runners is presented. The experimental results validate the model and provide information about the walk-to-run transition speed and the typical development of cadence and grounded phase ratio in different running speed ranges.
Chen, S C; You, S H; Liu, C Y; Chio, C P; Liao, C M
2012-09-01
The aim of this work was to use experimental infection data of human influenza to assess a simple viral dynamics model in epithelial cells and better understand the underlying complex factors governing the infection process. The developed study model expands on previous reports of a target cell-limited model with delayed virus production. Data from 10 published experimental infection studies of human influenza were used to validate the model. Our results elucidate, mechanistically, the associations between epithelial cells, human immune responses, and viral titres and were supported by the experimental infection data. We report that the maximum total number of free virions following infection is 10^3-fold higher than the initially introduced titre. Our results indicated that the infection rates of unprotected epithelial cells probably play an important role in affecting viral dynamics. By simulating an advanced model of viral dynamics and applying it to experimental infection data of human influenza, we obtained important estimates of the infection rate. This work provides epidemiologically meaningful results, meriting further efforts to understand the causes and consequences of influenza A infection.
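The target cell-limited model with delayed virus production referenced above is commonly written as four coupled ODEs. The sketch below integrates them with a simple Euler scheme; the parameter values are illustrative placeholders in the style of published influenza fits, not the estimates obtained in this study.

```python
# Target cell-limited influenza model with delayed virus production:
#   dT/dt  = -beta*T*V          (target cells become infected)
#   dI1/dt =  beta*T*V - k*I1   (latently infected, not yet producing virus)
#   dI2/dt =  k*I1 - delta*I2   (productively infected cells)
#   dV/dt  =  p*I2 - c*V        (free-virus production and clearance)
# Parameter values are illustrative, not this study's estimates.
def simulate(beta=2.7e-5, k=4.0, delta=4.0, p=1.2e-2, c=3.0,
             T0=4e8, V0=0.75, dt=1e-3, days=8.0):
    T, I1, I2, V = T0, 0.0, 0.0, V0
    peak = V
    for _ in range(int(days / dt)):
        dT = -beta * T * V
        dI1 = beta * T * V - k * I1
        dI2 = k * I1 - delta * I2
        dV = p * I2 - c * V
        T += dT * dt
        I1 += dI1 * dt
        I2 += dI2 * dt
        V += dV * dt
        peak = max(peak, V)
    return peak, V

peak, final = simulate()
print(f"peak/initial titre ratio: {peak / 0.75:.1e}")
```

With these placeholder parameters the viral titre grows by several orders of magnitude before target-cell depletion caps it, the same qualitative behaviour the abstract reports.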
NASA Technical Reports Server (NTRS)
Maghami, Peiman G.; Gupta, Sandeep; Elliott, Kenny B.; Joshi, Suresh M.; Walz, Joseph E.
1994-01-01
This paper describes the first experimental validation of an optimization-based integrated controls-structures design methodology for a class of flexible space structures. The Controls-Structures-Interaction (CSI) Evolutionary Model, a laboratory test bed at Langley, is redesigned based on the integrated design methodology with two different dissipative control strategies. The redesigned structure is fabricated, assembled in the laboratory, and experimentally compared with the original test structure. Design guides are proposed and used in the integrated design process to ensure that the resulting structure can be fabricated. Experimental results indicate that the integrated design requires greater than 60 percent less average control power (by thruster actuators) than the conventional control-optimized design while maintaining the required line-of-sight performance, thereby confirming the analytical findings about the superiority of the integrated design methodology. Amenability of the integrated design structure to other control strategies is considered and evaluated analytically and experimentally. This work also demonstrates the capabilities of the Langley-developed design tool CSI DESIGN which provides a unified environment for structural and control design.
Comparing fluid mechanics models with experimental data.
Spedding, G R
2003-01-01
The art of modelling the physical world lies in the appropriate simplification and abstraction of the complete problem. In fluid mechanics, the Navier-Stokes equations provide a model that is valid under most circumstances germane to animal locomotion, but the complexity of solutions provides strong incentive for the development of further, more simplified practical models. When the flow organizes itself so that all shearing motions are collected into localized patches, then various mathematical vortex models have been very successful in predicting and furthering the physical understanding of many flows, particularly in aerodynamics. Experimental models have the significant added convenience that the fluid mechanics can be generated by a real fluid, not a model, provided the appropriate dimensionless groups have similar values. Then, analogous problems can be encountered in making intelligible but independent descriptions of the experimental results. Finally, model predictions and experimental results may be compared if, and only if, numerical estimates of the likely variations in the tested quantities are provided. Examples from recent experimental measurements of wakes behind a fixed wing and behind a bird in free flight are used to illustrate these principles. PMID:14561348
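The dimensionless-group matching mentioned above can be illustrated with a Reynolds number check; the numbers below are hypothetical, chosen only to show how an experiment in a different fluid can reproduce the flow regime of free flight.

```python
# Dynamic similarity via matched Reynolds numbers, Re = U * L / nu.
# All values are illustrative, not measurements from the paper.
def reynolds(velocity_m_s, length_m, kinematic_viscosity_m2_s):
    return velocity_m_s * length_m / kinematic_viscosity_m2_s

NU_AIR = 1.5e-5    # m^2/s, approximate for air at room temperature
NU_WATER = 1.0e-6  # m^2/s, approximate for water

# A 5 cm wing chord at 10 m/s in air...
re_flight = reynolds(10.0, 0.05, NU_AIR)
# ...is matched by the same-size model towed in water at a lower speed.
re_model = reynolds(10.0 * NU_WATER / NU_AIR, 0.05, NU_WATER)
print(re_flight, re_model)
```

Because both flows share the same Reynolds number, measurements on the towed model can stand in for the free-flight flow despite the different fluid.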
Alonso-López, Diego; Gutiérrez, Miguel A.; Lopes, Katia P.; Prieto, Carlos; Santamaría, Rodrigo; De Las Rivas, Javier
2016-01-01
APID (Agile Protein Interactomes DataServer) is an interactive web server that provides unified generation and delivery of protein interactomes mapped to their respective proteomes. This resource is a new, fully redesigned server that includes a comprehensive collection of protein interactomes for more than 400 organisms (25 of which include more than 500 interactions) produced by the integration of only experimentally validated protein–protein physical interactions. For each protein–protein interaction (PPI) the server includes currently reported information about its experimental validation to allow selection and filtering at different quality levels. As a whole, it provides easy access to the interactomes from specific species and includes a global uniform compendium of 90,379 distinct proteins and 678,441 singular interactions. APID integrates and unifies PPIs from major primary databases of molecular interactions, from other specific repositories and also from experimentally resolved 3D structures of protein complexes where more than two proteins were identified. For this purpose, a collection of 8,388 structures were analyzed to identify specific PPIs. APID also includes a new graph tool (based on Cytoscape.js) for visualization and interactive analyses of PPI networks. The server does not require registration and it is freely available for use at http://apid.dep.usal.es. PMID:27131791
NASA Technical Reports Server (NTRS)
Garg, Vijay K.
2001-01-01
The turbine gas path is a very complex flow field. This is due to a variety of flow and heat transfer phenomena encountered in turbine passages. This manuscript provides an overview of the current work in this field at the NASA Glenn Research Center. Also, based on the author's preference, more emphasis is on the computational work. There is much more experimental work in progress at GRC than that reported here. While much has been achieved, more needs to be done in terms of validating the predictions against experimental data. More experimental data, especially on film cooled and rough turbine blades, are required for code validation. Also, the combined film cooling and internal cooling flow computation for a real blade is yet to be performed. While most computational work to date has assumed steady state conditions, the flow is clearly unsteady due to the presence of wakes. All this points to a long road ahead. However, we are well on course.
Experimental aerothermodynamic research of hypersonic aircraft
NASA Technical Reports Server (NTRS)
Cleary, Joseph W.
1987-01-01
The 2-D and 3-D advance computer codes being developed for use in the design of such hypersonic aircraft as the National Aero-Space Plane require comparison of the computational results with a broad spectrum of experimental data to fully assess the validity of the codes. This is particularly true for complex flow fields with control surfaces present and for flows with separation, such as leeside flow. Therefore, the objective is to provide a hypersonic experimental data base required for validation of advanced computational fluid dynamics (CFD) computer codes and for development of more thorough understanding of the flow physics necessary for these codes. This is being done by implementing a comprehensive test program for a generic all-body hypersonic aircraft model in the NASA/Ames 3.5 foot Hypersonic Wind Tunnel over a broad range of test conditions to obtain pertinent surface and flowfield data. Results from the flow visualization portion of the investigation are presented.
Physical Justification for Negative Remanent Magnetization in Homogeneous Nanoparticles
Gu, Shuo; He, Weidong; Zhang, Ming; Zhuang, Taisen; Jin, Yi; ElBidweihy, Hatem; Mao, Yiwu; Dickerson, James H.; Wagner, Michael J.; Torre, Edward Della; Bennett, Lawrence H.
2014-01-01
The phenomenon of negative remanent magnetization (NRM) has been observed experimentally in a number of heterogeneous magnetic systems and has been considered anomalous. The existence of NRM in homogeneous magnetic materials is still in debate, mainly due to the lack of compelling support from experimental data and of a convincing theoretical explanation for its thermodynamic validation. Here we resolve the long-standing controversy by presenting experimental evidence and physical justification that NRM is real in a prototype homogeneous ferromagnetic nanoparticle, a europium sulfide nanoparticle. We provide novel insights into major and minor hysteresis behavior that illuminate the true nature of the observed inverted hysteresis and validate its thermodynamic permissibility and, for the first time, present counterintuitive magnetic aftereffect behavior that is consistent with the mechanism of magnetization reversal, possessing a unique capability to identify NRM. The origin and conditions of NRM are explained quantitatively via a wasp-waist model, in combination with energy calculations. PMID:25183061
Experimental validation of bracing recommendations for long-span concrete girders : final report.
DOT National Transportation Integrated Search
2012-12-01
During bridge construction, flexible support conditions provided by steel-reinforced neoprene bearing pads supporting precast, prestressed concrete girders may allow the girders to become unstable, rolling about an axis parallel to the span of the gi...
Computer aided manual validation of mass spectrometry-based proteomic data.
Curran, Timothy G; Bryson, Bryan D; Reigelhaupt, Michael; Johnson, Hannah; White, Forest M
2013-06-15
Advances in mass spectrometry-based proteomic technologies have increased the speed of analysis and the depth provided by a single analysis. Computational tools to evaluate the accuracy of peptide identifications from these high-throughput analyses have not kept pace with technological advances; currently the most common quality evaluation methods are based on statistical analysis of the likelihood of false positive identifications in large-scale data sets. While helpful, these calculations do not consider the accuracy of each identification, thus creating a precarious situation for biologists relying on the data to inform experimental design. Manual validation is the gold standard approach to confirm accuracy of database identifications, but is extremely time-intensive. To palliate the increasing time required to manually validate large proteomic datasets, we provide computer aided manual validation software (CAMV) to expedite the process. Relevant spectra are collected, catalogued, and pre-labeled, allowing users to efficiently judge the quality of each identification and summarize applicable quantitative information. CAMV significantly reduces the burden associated with manual validation and will hopefully encourage broader adoption of manual validation in mass spectrometry-based proteomics. Copyright © 2013 Elsevier Inc. All rights reserved.
Shock compression response of cold-rolled Ni/Al multilayer composites
Specht, Paul E.; Weihs, Timothy P.; Thadhani, Naresh N.
2017-01-06
Uniaxial strain, plate-on-plate impact experiments were performed on cold-rolled Ni/Al multilayer composites and the resulting Hugoniot was determined through time-resolved measurements combined with impedance matching. The experimental Hugoniot agreed with that previously predicted by two dimensional (2D) meso-scale calculations. Additional 2D meso-scale simulations were performed using the same computational method as the prior study to reproduce the experimentally measured free surface velocities and stress profiles. Finally, these simulations accurately replicated the experimental profiles, providing additional validation for the previous computational work.
Establishment of a VISAR Measurement System for Material Model Validation in DSTO
2013-02-01
advancements published in the works by L.M. Barker, R.E. Hollenbach and W.F. Hemsing [1-3] and results in the user-friendly interface and configuration of the... VISAR system [4] used in the current work. VISAR tests are among the mandatory instrumentation techniques when validating material models and... The present work reports on preliminary tests using the recently commissioned DSTO VISAR system, providing an assessment of the experimental set-up
NASA Astrophysics Data System (ADS)
Leclaire, N.; Cochet, B.; Le Dauphin, F. X.; Haeck, W.; Jacquet, O.
2014-06-01
The present paper aims at providing experimental validation for the use of the MORET 5 code for advanced concepts of reactor involving thorium and heavy water. It therefore constitutes an opportunity to test and improve the thermal-scattering data of heavy water and also to test the recent implementation of probability tables in the MORET 5 code.
Estimating Flow-Through Balance Momentum Tares with CFD
NASA Technical Reports Server (NTRS)
Melton, John E.; James, Kevin D.; Long, Kurtis R.; Flamm, Jeffrey D.
2016-01-01
This paper describes the process used for estimating flow-through balance momentum tares. The interaction of jet engine exhausts on the BOEINGERA Hybrid Wing Body (HWB) was simulated in the NFAC 40x80 wind tunnel at NASA Ames using a pair of turbine powered simulators (TPS). High-pressure air was passed through a flow-through balance and manifold before being delivered to the TPS units. The force and moment tares that result from the internal shear and pressure distribution were estimated using CFD. Validation of the CFD simulations for these complex internal flows is a challenge, given limited experimental data due to the complications of the internal geometry. Two CFD validation efforts are documented, and comparisons with experimental data from the final model installation are provided.
NASA Astrophysics Data System (ADS)
Banica, M. C.; Chun, J.; Scheuermann, T.; Weigand, B.; Wolfersdorf, J. v.
2009-01-01
Scramjet-powered vehicles can decrease the cost of access to space, but substantial obstacles to their realization remain. For example, experiments in the relevant Mach number regime are difficult to perform and flight testing is expensive. Therefore, numerical methods are often employed for system layout, but they require validation against experimental data. Here, we validate the commercial code CFD++ against experimental results for hydrogen combustion in the supersonic combustion facility of the Institute of Aerospace Thermodynamics (ITLR) at the Universität Stuttgart. Fuel is injected through a lobed strut injector, which provides rapid mixing. Our numerical data show reasonable agreement with experiments. We further investigate the effects of varying equivalence ratios on several important performance parameters.
NASA Technical Reports Server (NTRS)
Sellers, William L., III; Dwoyer, Douglas L.
1992-01-01
The design of a hypersonic aircraft poses unique challenges to the engineering community. Problems with duplicating flight conditions in ground based facilities have made performance predictions risky. Computational fluid dynamics (CFD) has been proposed as an additional means of providing design data. At the present time, CFD codes are being validated based on sparse experimental data and then used to predict performance at flight conditions with generally unknown levels of uncertainty. This paper will discuss the facility and measurement techniques that are required to support CFD development for the design of hypersonic aircraft. Illustrations are given of recent success in combining experimental and direct numerical simulation in CFD model development and validation for hypersonic perfect gas flows.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Freed, Melanie; Miller, Stuart; Tang, Katherine
Purpose: MANTIS is a Monte Carlo code developed for the detailed simulation of columnar CsI scintillator screens in x-ray imaging systems. Validation of this code is needed to provide a reliable and valuable tool for system optimization and accurate reconstructions for a variety of x-ray applications. Whereas previous validation efforts have focused on matching of summary statistics, in this work the authors examine the complete point response function (PRF) of the detector system in addition to relative light output values. Methods: Relative light output values and high-resolution PRFs have been experimentally measured with a custom setup. A corresponding set of simulated light output values and PRFs have also been produced, where detailed knowledge of the experimental setup and CsI:Tl screen structures is accounted for in the simulations. Four different screens were investigated with different thicknesses, column tilt angles, and substrate types. A quantitative comparison between the experimental and simulated PRFs was performed for four different incidence angles (0 deg., 15 deg., 30 deg., and 45 deg.) and two different x-ray spectra (40 and 70 kVp). The figure of merit (FOM) used measures the normalized differences between the simulated and experimental data averaged over a region of interest. Results: Experimental relative light output values ranged from 1.456 to 1.650 and were in approximate agreement for aluminum substrates, but poor agreement for graphite substrates. The FOMs for all screen types, incidence angles, and energies ranged from 0.1929 to 0.4775. To put these FOMs in context, the same FOM was computed for 2D symmetric Gaussians fit to the same experimental data. These FOMs ranged from 0.2068 to 0.8029. Our analysis demonstrates that MANTIS reproduces experimental PRFs with higher accuracy than a symmetric 2D Gaussian fit to the experimental data in the majority of cases.
Examination of the spatial distribution of differences between the PRFs shows that the main reason for errors between MANTIS and the experimental data is that MANTIS-generated PRFs are sharper than the experimental PRFs. Conclusions: The experimental validation of MANTIS performed in this study demonstrates that MANTIS is able to reliably predict experimental PRFs, especially for thinner screens, and can reproduce the highly asymmetric shape seen in the experimental data. As a result, optimizations and reconstructions carried out using MANTIS should yield results indicative of actual detector performance. Better characterization of screen properties is necessary to reconcile the simulated light output values with experimental data.
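The normalized-difference figure of merit described above can be sketched as follows; the exact normalization used in the MANTIS study may differ, so treat this as one plausible convention (mean absolute difference over a region of interest, normalized by the experimental peak).

```python
import numpy as np

# One plausible normalized-difference FOM between simulated and experimental
# point response functions over a region of interest (assumed convention,
# not necessarily the paper's exact definition).
def fom(simulated, experimental, roi=None):
    sim = np.asarray(simulated, dtype=float)
    exp_ = np.asarray(experimental, dtype=float)
    if roi is not None:
        sim, exp_ = sim[roi], exp_[roi]
    return float(np.mean(np.abs(sim - exp_)) / exp_.max())

rng = np.random.default_rng(0)
prf = rng.random((32, 32))
assert fom(prf, prf) == 0.0  # identical PRFs score a perfect 0
noisy = prf + 0.05 * rng.random((32, 32))
print(fom(prf, noisy))       # small positive value for a close match
```

Under a metric of this form, lower values mean better agreement, consistent with MANTIS's 0.19-0.48 range beating the Gaussian fits' 0.21-0.80 range.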
2014-08-22
higher frequencies due to weaves with smaller unit cells. A second predicts the dielectric properties of unidirectional composite fabrics and laminates... effective dielectric properties of composite laminates within the X-band (8-12 GHz). The circuit analog method becomes less accurate as the... architectures and to multilayered laminates. In this project, experimental validation from 4-50 GHz is provided for single layers of dry structural grade
Identification of widespread adenosine nucleotide binding in Mycobacterium tuberculosis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ansong, Charles; Ortega, Corrie; Payne, Samuel H.
The annotation of protein function is almost completely performed by in silico approaches. However, computational prediction of protein function is frequently incomplete and error prone. In Mycobacterium tuberculosis (Mtb), ~25% of all genes have no predicted function and are annotated as hypothetical proteins. This lack of functional information severely limits our understanding of Mtb pathogenicity. Current tools for experimental functional annotation are limited and often do not scale to entire protein families. Here, we report a generally applicable chemical biology platform to functionally annotate bacterial proteins by combining activity-based protein profiling (ABPP) and quantitative LC-MS-based proteomics. As an example of this approach for high-throughput protein functional validation and discovery, we experimentally annotate the families of ATP-binding proteins in Mtb. Our data experimentally validate prior in silico predictions of >250 ATPases and adenosine nucleotide-binding proteins, and reveal 73 hypothetical proteins as novel ATP-binding proteins. We identify adenosine cofactor interactions with many hypothetical proteins containing a diversity of unrelated sequences, providing a new and expanded view of adenosine nucleotide binding in Mtb. Furthermore, many of these hypothetical proteins are both unique to Mycobacteria and essential for infection, suggesting specialized functions in mycobacterial physiology and pathogenicity. Thus, we provide a generally applicable approach for high-throughput protein function discovery and validation, and highlight several ways in which application of activity-based proteomics data can improve the quality of functional annotations to facilitate novel biological insights.
Overview of HIT-SI3 experiment: Simulations, Diagnostics, and Summary of Current Results
NASA Astrophysics Data System (ADS)
Penna, James; Jarboe, Thomas; Nelson, Brian; Hossack, Aaron; Sutherland, Derek; Morgan, Kyle; Hansen, Chris; Benedett, Thomas; Everson, Chris; Victor, Brian
2016-10-01
The Helicity Injected Torus-Steady Inductive 3 (HIT-SI3) experiment forms and maintains spheromaks via Steady Inductive Helicity Injection (SIHI). Three injector units allow for continuous injection of helicity into a copper flux conserver in order to sustain a spheromak. Firing the injectors with a phase difference allows finite rotation of the plasma to provide a stabilizing effect. Simulations in the MHD code NIMROD and the fluid-model code PSI-TET provide validation and a basis for interpretation of the observed experimental data. Thomson scattering (TS) and far-infrared (FIR) interferometer systems allow temperature and line-averaged density measurements to be taken. An ion Doppler spectroscopy (IDS) system allows measurement of the plasma rotation and velocity. HIT-SI3 data have been used for validation of IDCD predictions, in particular the projected impedance of helicity injectors according to the theory. The experimental impedances have been calculated here for the first time for different HIT-SI3 regimes. Such experimental evidence will contribute to the design of future experiments employing IDCD as a current-drive mechanism. Work supported by the D.O.E., Office of Science, Office of Fusion Science.
NASA Technical Reports Server (NTRS)
Hessenius, K. A.; Goorjian, P. M.
1981-01-01
A high frequency extension of the unsteady, transonic code LTRAN2 was created and is evaluated by comparisons with experimental results. The experimental test case is a NACA 64A010 airfoil in pitching motion at a Mach number of 0.8 over a range of reduced frequencies. Comparisons indicate that the modified code is an improvement of the original LTRAN2 and provides closer agreement with experimental lift and moment coefficients. A discussion of the code modifications, which involve the addition of high frequency terms of the boundary conditions of the numerical algorithm, is included.
Experimental aeroelasticity history, status and future in brief
NASA Technical Reports Server (NTRS)
Ricketts, Rodney H.
1990-01-01
NASA conducts wind tunnel experiments to determine and understand the aeroelastic characteristics of new and advanced flight vehicles, including fixed-wing, rotary-wing and space-launch configurations. Review and assessments are made of the state-of-the-art in experimental aeroelasticity regarding available facilities, measurement techniques, and other means and devices useful in testing. In addition, some past experimental programs are described which assisted in the development of new technology, validated new analysis codes, or provided needed information for clearing flight envelopes of unwanted aeroelastic response. Finally, needs and requirements for advances and improvements in testing capabilities for future experimental research and development programs are described.
Experimental economics' inconsistent ban on deception.
Hersch, Gil
2015-08-01
According to what I call the 'argument from public bads', if a researcher deceived subjects in the past, there is a chance that subjects will discount the information that a subsequent researcher provides, thus compromising the validity of the subsequent researcher's experiment. While this argument is taken to justify an existing informal ban on explicit deception in experimental economics, it can also apply to implicit deception, yet implicit deception is not banned and is sometimes used in experimental economics. Thus, experimental economists are being inconsistent when they appeal to the argument from public bads to justify banning explicit deception but not implicit deception. Copyright © 2015 Elsevier Ltd. All rights reserved.
Universal Effectiveness of Inducing Magnetic Moments in Graphene by Amino-Type sp3-Defects
Wu, Liting; Gao, Shengqing; Li, Ming; Wen, Jianfeng; Li, Xinyu; Liu, Fuchi
2018-01-01
Inducing magnetic moments in graphene is very important for its potential application in spintronics. Introducing sp3-defects on the graphene basal plane is deemed the most promising approach to produce magnetic graphene. However, its universal validity has not been well verified experimentally. By functionalizing the graphene basal plane with approximately pure amino groups, a spin-generation efficiency of ~1 μB/100 NH2 was obtained for the first time, providing substantial evidence for the validity of inducing magnetic moments by sp3-defects. Amino groups also provide another potential sp3-type candidate for preparing magnetic graphene. PMID:29673185
Testability of evolutionary game dynamics based on experimental economics data
NASA Astrophysics Data System (ADS)
Wang, Yijia; Chen, Xiaojie; Wang, Zhijian
In order to better understand the dynamic processes of a real game system, we need an appropriate dynamics model, and evaluating the validity of such a model is not a trivial task. Here, we demonstrate an approach that takes the macroscopic dynamical patterns of angular momentum and speed as measurement variables to evaluate the validity of various dynamics models. Using data from real-time Rock-Paper-Scissors (RPS) game experiments, we obtain the experimental dynamic patterns and then derive the corresponding theoretical dynamic patterns from a series of typical dynamics models. By testing the goodness-of-fit between the experimental and theoretical patterns, the validity of the models can be evaluated. One result of our case study is that, among all the nonparametric models tested, the best-known Replicator dynamics model performs almost the worst, while the Projection dynamics model performs best. Besides providing new empirical macroscopic patterns of social dynamics, we demonstrate that the approach can be an effective and rigorous tool for testing game dynamics models. Fundamental Research Funds for the Central Universities (SSEYI2014Z) and the National Natural Science Foundation of China (Grants No. 61503062).
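The Replicator dynamics mentioned in the abstract can be sketched in a few lines. This is a generic textbook formulation (standard zero-sum RPS payoff matrix, forward-Euler integration), not the authors' estimation procedure or their measured patterns:

```python
# Replicator dynamics for Rock-Paper-Scissors: dx_i/dt = x_i * ((Ax)_i - x.Ax).
# Standard zero-sum RPS payoff matrix assumed (illustrative, not the paper's data).
A = [[0, -1, 1],
     [1, 0, -1],
     [-1, 1, 0]]

def replicator_step(x, dt=0.01):
    """One forward-Euler step of the replicator equation on the 3-simplex."""
    fitness = [sum(A[i][j] * x[j] for j in range(3)) for i in range(3)]
    avg = sum(x[i] * fitness[i] for i in range(3))
    return [x[i] + dt * x[i] * (fitness[i] - avg) for i in range(3)]

# Trajectory starting near the mixed equilibrium (1/3, 1/3, 1/3):
# for RPS the interior equilibrium is a center, so the state orbits it.
x = [0.4, 0.35, 0.25]
for _ in range(1000):
    x = replicator_step(x)

# The dynamics conserve total probability: components still sum to 1.
print(round(sum(x), 6))
```

Theoretical patterns such as angular momentum around the equilibrium can then be computed from trajectories like this one and compared against the experimental patterns.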
Chen, Weixin; Chen, Jianye; Lu, Wangjin; Chen, Lei; Fu, Danwen
2012-01-01
Real-time reverse transcription PCR (RT-qPCR) is a preferred method for rapid and accurate quantification of gene expression. Appropriate application of RT-qPCR requires accurate normalization through the use of reference genes. As no single reference gene is universally suitable for all experiments, validation of reference gene(s) under different experimental conditions is crucial for RT-qPCR analysis. To date, only a few studies on reference genes have been done in other plants and none in papaya. In the present work, we selected 21 candidate reference genes and evaluated their expression stability in 246 papaya fruit samples using three algorithms: geNorm, NormFinder and RefFinder. The samples consisted of 13 sets collected under different experimental conditions, including various tissues, different storage temperatures, different cultivars, developmental stages, postharvest ripening, modified atmosphere packaging, 1-methylcyclopropene (1-MCP) treatment, hot water treatment, biotic stress and hormone treatment. Our results demonstrated that expression stability varied greatly between reference genes and that suitable reference gene(s), or combinations of reference genes, should be validated for each set of experimental conditions. In general, the reference genes EIF (Eukaryotic initiation factor 4A), TBP1 (TATA binding protein 1) and TBP2 (TATA binding protein 2) performed well under most experimental conditions, whereas the most widely used reference genes, ACTIN (Actin 2), 18S rRNA (18S ribosomal RNA) and GAPDH (Glyceraldehyde-3-phosphate dehydrogenase), were unsuitable under many conditions. In addition, two commonly used programs, geNorm and NormFinder, proved sufficient for the validation. This work provides the first systematic analysis for the selection of superior reference genes for accurate transcript normalization in papaya under different experimental conditions.
PMID:22952972
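The geNorm stability measure used above can be sketched as follows. This is a minimal illustration of the published idea (the gene-stability measure M: the mean, over all other genes, of the standard deviation across samples of the pairwise log2 expression ratio), not the geNorm software itself, and the expression values are invented, not from the papaya study:

```python
import math
from statistics import stdev

# Illustrative relative-expression values (gene -> value per sample).
# A stable reference gene keeps a constant ratio to other stable genes;
# the deliberately noisy "GAPDH" row mimics an unstable gene.
expression = {
    "EIF":   [10.0, 10.5, 9.8, 10.2],
    "TBP1":  [5.0, 5.2, 4.9, 5.1],
    "GAPDH": [8.0, 14.0, 6.0, 11.0],
}

def m_value(gene, data):
    """geNorm-style M: mean over other genes of stdev of log2 ratios. Lower = more stable."""
    sds = []
    for other in data:
        if other == gene:
            continue
        ratios = [math.log2(a / b) for a, b in zip(data[gene], data[other])]
        sds.append(stdev(ratios))
    return sum(sds) / len(sds)

m = {g: m_value(g, expression) for g in expression}
print(max(m, key=m.get))  # the noisy gene scores least stable (highest M)
```

geNorm then iteratively discards the gene with the highest M until the most stable pair remains.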
Development and validation of a 10-year-old child ligamentous cervical spine finite element model.
Dong, Liqiang; Li, Guangyao; Mao, Haojie; Marek, Stanley; Yang, King H
2013-12-01
Although a number of finite element (FE) adult cervical spine models have been developed to understand the injury mechanisms of the neck in automotive crash scenarios, there have been fewer efforts to develop a child neck model. In this study, a 10-year-old ligamentous cervical spine FE model was developed for application in the improvement of pediatric safety related to motor vehicle crashes. The model geometry was obtained from medical scans and meshed using a multi-block approach. Appropriate properties, based on a review of the literature in conjunction with scaling, were assigned to different parts of the model. Child tensile force-deformation data in three segments, Occipital-C2 (C0-C2), C4-C5 and C6-C7, were used to validate the cervical spine model and predict failure forces and displacements. Design of computer experiments was performed to determine the failure properties for intervertebral discs and ligaments needed to set up the FE model. The model-predicted ultimate displacements and forces were within the experimental range. The cervical spine FE model was validated in flexion and extension against the child experimental data in the three segments C0-C2, C4-C5 and C6-C7. Other model predictions were found to be consistent with the experimental responses scaled from adult data. The whole cervical spine model was also validated in tension, flexion and extension against the child experimental data. This study provided methods for developing a child ligamentous cervical spine FE model and for predicting soft tissue failures in tension.
Experimental validation of structural optimization methods
NASA Technical Reports Server (NTRS)
Adelman, Howard M.
1992-01-01
The topic of validating structural optimization methods by use of experimental results is addressed. The need for validating the methods as a way of effecting a broader and accelerated acceptance of formal optimization methods by practicing engineering designers is described. The range of validation strategies is defined, including comparison of optimization results with more traditional design approaches, establishing the accuracy of the analyses used, and finally experimental validation of the optimization results. Examples of the use of experimental results to validate optimization techniques are described, including: optimum design of a trussed beam; combined control-structure design of a cable-supported beam simulating an actively controlled space structure; minimum-weight design of a beam with frequency constraints; minimization of the vibration response of a helicopter rotor blade; minimum-weight design of a turbine blade disk; aeroelastic optimization of an aircraft vertical fin; airfoil shape optimization for drag minimization; optimization of the shape of a hole in a plate for stress minimization; optimization to minimize beam dynamic response; and structural optimization of a low-vibration helicopter rotor.
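The validation strategy of checking an optimizer against a known answer can be illustrated with the simplest structural sizing problem. The numbers and the search approach below are illustrative, not from any of the programs listed: size a tension bar's cross-section to minimize weight subject to a stress constraint, where the analytic optimum (the constraint active, A* = P/σ_allow) is known in advance:

```python
# Minimum-weight sizing of a tension bar (illustrative sketch):
# minimize weight rho*L*A subject to the stress constraint P/A <= sigma_allow.
P = 10_000.0         # axial load, N (assumed value)
sigma_allow = 250e6  # allowable stress, Pa (assumed value)
rho, L = 7850.0, 2.0 # density (kg/m^3) and length (m)

def weight(A):
    return rho * L * A

def feasible(A):
    return P / A <= sigma_allow

# Crude search over candidate areas (m^2); a real study would use a
# formal optimizer, but the validation logic is the same.
candidates = [i * 1e-7 for i in range(1, 2001)]
best = min((A for A in candidates if feasible(A)), key=weight)

A_star = P / sigma_allow  # analytic optimum: stress constraint active
print(abs(best - A_star) < 1e-7)  # search reproduces the known optimum
```

Experimental validation plays the same role for problems where no closed-form optimum exists.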
NASA Technical Reports Server (NTRS)
Davis, Brian; Turner, Travis L.; Seelecke, Stefan
2008-01-01
An experimental and numerical investigation into the static and dynamic responses of shape memory alloy hybrid composite (SMAHC) beams is performed to provide quantitative validation of a recently commercialized numerical analysis/design tool for SMAHC structures. The SMAHC beam specimens consist of a composite matrix with embedded pre-strained SMA actuators, which act against the mechanical boundaries of the structure when thermally activated to adaptively stiffen the structure. A rigorous experimental investigation is undertaken to acquire high-fidelity measurements, including infrared thermography and projection moiré interferometry for full-field temperature and displacement measurements, respectively. High-fidelity numerical results are obtained from the numerical model, as implemented in the commercial finite element code ABAQUS, and incorporate measured parameters such as geometric imperfection and thermal load. Excellent agreement is achieved between the predicted and measured results of the static and dynamic thermomechanical response, thereby providing quantitative validation of the numerical tool.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kasten, C. P., E-mail: ckasten@alum.mit.edu; White, A. E.; Irby, J. H.
2014-04-15
Accurately predicting the turbulent transport properties of magnetically confined plasmas is a major challenge of fusion energy research. Validation of transport models is typically done by applying so-called “synthetic diagnostics” to the output of nonlinear gyrokinetic simulations, and the results are compared to experimental data. As part of the validation process, comparing two independent turbulence measurements to each other provides the opportunity to test the synthetic diagnostics themselves; a step which is rarely possible due to limited availability of redundant fluctuation measurements on magnetic confinement experiments. At Alcator C-Mod, phase-contrast imaging (PCI) is a commonly used turbulence diagnostic. PCI measures line-integrated electron density fluctuations with high sensitivity and wavenumber resolution (1.6 cm⁻¹ ≲ |k_R| ≲ 11 cm⁻¹). A new fast two-color interferometry (FTCI) diagnostic on the Alcator C-Mod tokamak measures long-wavelength (|k_R| ≲ 3.0 cm⁻¹) line-integrated electron density fluctuations. Measurements of coherent and broadband fluctuations made by PCI and FTCI are compared here for the first time. Good quantitative agreement is found between the two measurements. This provides experimental validation of the low-wavenumber region of the PCI calibration, and also helps validate the low-wavenumber portions of the synthetic PCI diagnostic that has been used in gyrokinetic model validation work in the past. We discuss possibilities to upgrade FTCI, so that a similar comparison could be done at higher wavenumbers in the future.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, Kandler A; Santhanagopalan, Shriram; Yang, Chuanbo
Computer models are helping to accelerate the design and validation of next generation batteries and provide valuable insights not possible through experimental testing alone. Validated 3-D physics-based models exist for predicting electrochemical performance, thermal and mechanical response of cells and packs under normal and abuse scenarios. The talk describes present efforts to make the models better suited for engineering design, including improving their computation speed, developing faster processes for model parameter identification including under aging, and predicting the performance of a proposed electrode material recipe a priori using microstructure models.
Zhou, Bailing; Zhao, Huiying; Yu, Jiafeng; Guo, Chengang; Dou, Xianghua; Song, Feng; Hu, Guodong; Cao, Zanxia; Qu, Yuanxu; Yang, Yuedong; Zhou, Yaoqi; Wang, Jihua
2018-01-04
Long non-coding RNAs (lncRNAs) play important functional roles in various biological processes. Early databases were utilized to deposit all lncRNA candidates produced by high-throughput experimental and/or computational techniques to facilitate classification, assessment and validation. As more lncRNAs are validated by low-throughput experiments, several databases have been established for experimentally validated lncRNAs. However, these databases are small in scale (with a few hundred lncRNAs only) and specific in their focuses (plants, diseases or interactions). Thus, it is highly desirable to have a comprehensive dataset for experimentally validated lncRNAs as a central repository for all of their structures, functions and phenotypes. Here, we established EVLncRNAs by curating lncRNAs validated by low-throughput experiments (up to 1 May 2016) and integrating specific databases (lncRNAdb, LncRNADisease, Lnc2Cancer and PLNIncRBase) with additional functional and disease-specific information not covered previously. The current version of EVLncRNAs contains 1543 lncRNAs from 77 species, which is 2.9 times larger than the current largest database for experimentally validated lncRNAs. Seventy-four percent of the lncRNA entries are partially or completely new compared with all existing experimentally validated databases. The established database allows users to browse, search and download as well as to submit experimentally validated lncRNAs. The database is available at http://biophy.dzu.edu.cn/EVLncRNAs. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.
Kliegl, Reinhold; Wei, Ping; Dambacher, Michael; Yan, Ming; Zhou, Xiaolin
2011-01-01
Linear mixed models (LMMs) provide a still underused methodological perspective on combining experimental and individual-differences research. Here we illustrate this approach with two-rectangle cueing in visual attention (Egly et al., 1994). We replicated previous experimental cue-validity effects relating to a spatial shift of attention within an object (spatial effect), to attention switch between objects (object effect), and to the attraction of attention toward the display centroid (attraction effect), also taking into account the design-inherent imbalance of valid and other trials. We simultaneously estimated variance/covariance components of subject-related random effects for these spatial, object, and attraction effects in addition to their mean reaction times (RTs). The spatial effect showed a strong positive correlation with mean RT and a strong negative correlation with the attraction effect. The analysis of individual differences suggests that slow subjects engage attention more strongly at the cued location than fast subjects. We compare this joint LMM analysis of experimental effects and associated subject-related variances and correlations with two frequently used alternative statistical procedures. PMID:21833292
MCMEG: Simulations of both PDD and TPR for 6 MV LINAC photon beam using different MC codes
NASA Astrophysics Data System (ADS)
Fonseca, T. C. F.; Mendes, B. M.; Lacerda, M. A. S.; Silva, L. A. C.; Paixão, L.; Bastos, F. M.; Ramirez, J. V.; Junior, J. P. R.
2017-11-01
The Monte Carlo Modelling Expert Group (MCMEG) is an expert network specializing in Monte Carlo radiation transport modelling and simulation applied to the radiation protection and dosimetry research field. For its first inter-comparison task, the group launched an exercise to model and simulate a 6 MV LINAC photon beam using the Monte Carlo codes available within their laboratories and to validate their simulated results by comparing them with experimental measurements carried out at the National Cancer Institute (INCA) in Rio de Janeiro, Brazil. The experimental measurements were performed using an ionization chamber with calibration traceable to a Secondary Standard Dosimetry Laboratory (SSDL). The detector was immersed in a water phantom at different depths and was irradiated with a radiation field size of 10×10 cm². This exposure setup was used to determine the dosimetric parameters Percentage Depth Dose (PDD) and Tissue Phantom Ratio (TPR). The validation process compares the MC calculated results to the experimentally measured PDD20,10 and TPR20,10. Simulations reproduced the experimental TPR20,10 quality index, providing a satisfactory description of both the PDD curve and the transverse profiles at the two measured depths. This paper reports in detail the modelling process using the MCNPx, MCNP6, EGSnrc and Penelope Monte Carlo codes, the source and tally descriptions, the validation processes and the results.
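The beam-quality indices named above can be computed from a sampled PDD curve. The depth-dose values below are typical illustrative 6 MV numbers, not the INCA measurements; the PDD-to-TPR conversion is the empirical relation given in IAEA TRS-398:

```python
# PDD20,10 is the ratio of the percentage depth doses at 20 cm and 10 cm
# depth (10x10 cm^2 field). TRS-398 gives the empirical conversion
# TPR20,10 ~= 1.2661 * PDD20,10 - 0.0595.
pdd = {  # depth (cm) -> percentage depth dose; illustrative 6 MV values
    10.0: 67.0,
    20.0: 38.5,
}

pdd_20_10 = pdd[20.0] / pdd[10.0]
tpr_20_10 = 1.2661 * pdd_20_10 - 0.0595

print(round(pdd_20_10, 3), round(tpr_20_10, 3))
```

A TPR20,10 near 0.67, as this sketch yields, is in the range expected for a 6 MV photon beam.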
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hart, Robert; Goudey, Howdy; Curcija, D. Charlie
Virtually every home in the US has some form of shades, blinds, drapes, or other window attachments, but few have been designed for energy savings. In order to provide a common basis of comparison for thermal performance, it is important to have validated simulation tools. This study outlines a review and validation of the ISO 15099 centre-of-glass thermal transmittance correlations for naturally ventilated cavities through measurement and detailed simulations. The focus is on the impacts of room-side ventilated cavities, such as those found with solar screens and horizontal louvred blinds. The thermal transmittance of these systems is measured experimentally, simulated using computational fluid dynamics analysis, and simulated using simplified correlations from ISO 15099. Finally, correlation coefficients are proposed for the ISO 15099 algorithm that reduce the mean error between measured and simulated heat flux from 16% to 3.5% for typical solar screens and from 13% to 1% for horizontal blinds.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Leichner, P.K.
This report summarizes research in beta-particle dosimetry and quantitative single-photon emission computed tomography (SPECT), the clinical implementation of these two areas of research in radioimmunotherapy (RIT), and postgraduate training provided since the inception of this grant on July 15, 1989. To improve beta-particle dosimetry, a point source function was developed that is valid for a wide range of beta emitters. Analytical solutions for beta-particle dose rates within and outside slabs of finite thickness were validated in experimental tumors and are now being used in clinical RIT. Quantitative SPECT based on the circular harmonic transform (CHT) algorithm was validated in phantom, experimental, and clinical studies. This has led to improved macrodosimetry in clinical RIT. In dosimetry at the multicellular level, studies were made of the HepG2 human hepatoblastoma grown subcutaneously in nude mice. Histologic sections and autoradiographs were prepared to quantitate activity distributions of radiolabeled antibodies. Absorbed-dose calculations are being carried out for ¹³¹I and ⁹⁰Y beta particles for these antibody distributions.
Validation of Structures in the Protein Data Bank.
Gore, Swanand; Sanz García, Eduardo; Hendrickx, Pieter M S; Gutmanas, Aleksandras; Westbrook, John D; Yang, Huanwang; Feng, Zukang; Baskaran, Kumaran; Berrisford, John M; Hudson, Brian P; Ikegawa, Yasuyo; Kobayashi, Naohiro; Lawson, Catherine L; Mading, Steve; Mak, Lora; Mukhopadhyay, Abhik; Oldfield, Thomas J; Patwardhan, Ardan; Peisach, Ezra; Sahni, Gaurav; Sekharan, Monica R; Sen, Sanchayita; Shao, Chenghua; Smart, Oliver S; Ulrich, Eldon L; Yamashita, Reiko; Quesada, Martha; Young, Jasmine Y; Nakamura, Haruki; Markley, John L; Berman, Helen M; Burley, Stephen K; Velankar, Sameer; Kleywegt, Gerard J
2017-12-05
The Worldwide PDB recently launched a deposition, biocuration, and validation tool: OneDep. At various stages of OneDep data processing, validation reports for three-dimensional structures of biological macromolecules are produced. These reports are based on recommendations of expert task forces representing the crystallography, nuclear magnetic resonance, and cryoelectron microscopy communities. The reports provide useful metrics with which depositors can evaluate the quality of the experimental data, the structural model, and the fit between them. The validation module is also available as a stand-alone web server and as a programmatically accessible web service. A growing number of journals require the official wwPDB validation reports (produced at biocuration) to accompany manuscripts describing macromolecular structures. Upon public release of the structure, the validation report becomes part of the public PDB archive. Geometric quality scores for proteins in the PDB archive have improved over the past decade. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.
Motivation Interventions in Education: A Meta-Analytic Review
ERIC Educational Resources Information Center
Lazowski, Rory A.; Hulleman, Chris S.
2016-01-01
This meta-analysis provides an extensive and organized summary of intervention studies in education that are grounded in motivation theory. We identified 74 published and unpublished papers that experimentally manipulated an independent variable and measured an authentic educational outcome within an ecologically valid educational context. Our…
Shock compression response of cold-rolled Ni/Al multilayer composites
NASA Astrophysics Data System (ADS)
Specht, Paul E.; Weihs, Timothy P.; Thadhani, Naresh N.
2017-01-01
Uniaxial strain, plate-on-plate impact experiments were performed on cold-rolled Ni/Al multilayer composites and the resulting Hugoniot was determined through time-resolved measurements combined with impedance matching. The experimental Hugoniot agreed with that previously predicted by two dimensional (2D) meso-scale calculations [Specht et al., J. Appl. Phys. 111, 073527 (2012)]. Additional 2D meso-scale simulations were performed using the same computational method as the prior study to reproduce the experimentally measured free surface velocities and stress profiles. These simulations accurately replicated the experimental profiles, providing additional validation for the previous computational work.
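The impedance-matching step mentioned in the abstract follows from pressure and particle-velocity continuity at the flyer/target interface. The sketch below uses textbook linear Us-up Hugoniot parameters for a copper flyer on an aluminum target (illustrative handbook-style values, not the measured Ni/Al composite data):

```python
# Impedance matching with linear Hugoniots Us = C0 + s*up and P = rho0*Us*up.
# With rho0 in g/cm^3 and velocities in km/s, P comes out in GPa.
def hugoniot_pressure(rho0, C0, s, up):
    return rho0 * (C0 + s * up) * up

# Handbook-style parameters (assumed, for illustration only).
flyer = dict(rho0=8.93, C0=3.94, s=1.49)   # copper
target = dict(rho0=2.70, C0=5.35, s=1.34)  # aluminum
v = 1.0  # impact velocity, km/s

# Continuity at the interface: P_flyer(v - u) = P_target(u).
# Solve for the interface particle velocity u by bisection on (0, v).
lo, hi = 0.0, v
for _ in range(60):
    u = 0.5 * (lo + hi)
    diff = hugoniot_pressure(**flyer, up=v - u) - hugoniot_pressure(**target, up=u)
    if diff > 0:
        lo = u
    else:
        hi = u

P = hugoniot_pressure(**target, up=u)  # shock pressure in the target, GPa
print(round(u, 3), round(P, 2))
```

Because copper has the higher shock impedance, the matched particle velocity exceeds v/2, as the sketch shows.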
DOE Office of Scientific and Technical Information (OSTI.GOV)
Oßwald, Patrick; Köhler, Markus
A new high-temperature flow reactor experiment utilizing the powerful molecular beam mass spectrometry (MBMS) technique for detailed observation of gas-phase kinetics in reacting flows is presented. The reactor design provides a systematic extension of the experimental portfolio of validation experiments for combustion reaction kinetics. Temperatures up to 1800 K are accessible via three individually controlled temperature zones in this atmospheric-pressure flow reactor. Detailed speciation data are obtained using the sensitive MBMS technique, providing in situ access to almost all chemical species involved in the combustion process, including highly reactive species such as radicals. Strategies for quantifying the experimental data are presented alongside a careful characterization of the experimental boundary conditions to enable precise numerical reproduction of the experimental results. The general capabilities of this new analytical tool for the investigation of reacting flows are demonstrated for a selected range of conditions, fuels, and applications. A detailed dataset for the well-known gaseous fuels methane and ethylene is provided and used to verify the experimental approach. Furthermore, application to liquid fuels and fuel components important for technical combustors such as gas turbines and engines is demonstrated. Besides the detailed investigation of novel fuels and fuel components, the wide range of operating conditions gives access to extended combustion topics, such as the super-rich, high-temperature conditions important for gasification processes, or the peroxy chemistry governing the low-temperature oxidation regime. These demonstrations are accompanied by a first kinetic modeling approach, examining the opportunities for model validation.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ricks, Allen; Blanchat, Thomas K.; Jernigan, Dann A.
2006-06-01
Improved understanding and validation data of the heat flux incident on an object located within the fire plume are necessary for the validation of SIERRA/FUEGO/SYRINX fire and SIERRA/CALORE. One key aspect of the validation data sets is the determination of the relative contributions of the radiative and convective heat fluxes. To meet this objective, a cylindrical calorimeter with sufficient instrumentation to measure total and radiative heat flux has been designed and fabricated. This calorimeter will be tested both in the controlled radiative environment of the Penlight facility and in a fire environment in the FLAME/Radiant Heat (FRH) facility. Validation experiments are specifically designed for direct comparison with the computational predictions. Making meaningful comparisons between the computational and experimental results requires careful characterization and control of the experimental features or parameters used as inputs into the computational model. Validation experiments must be designed to capture the essential physical phenomena, including all relevant initial and boundary conditions. A significant question of interest in modeling heat flux incident on an object in or near a fire is the contribution of the radiation and convection modes of heat transfer. The series of experiments documented in this test plan is designed to provide data on the radiation partitioning, defined as the fraction of the total heat flux that is due to radiation.
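The radiation-partitioning quantity defined above reduces to simple arithmetic once the instrumentation provides both readings. The flux values below are invented for illustration, not FRH data:

```python
# Radiation partitioning from a co-located total-heat-flux gauge and
# radiometer (illustrative readings, kW/m^2).
q_total = 120.0  # total incident heat flux (radiative + convective)
q_rad = 90.0     # radiative component, from the radiometer

f_rad = q_rad / q_total    # radiation partition: fraction due to radiation
q_conv = q_total - q_rad   # convective remainder

print(round(f_rad, 2), q_conv)
```

In practice the hard part, which the calorimeter instrumentation addresses, is obtaining clean, co-located measurements of the two fluxes rather than the arithmetic itself.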
Barrett, Frederick S; Johnson, Matthew W; Griffiths, Roland R
2015-11-01
The 30-item revised Mystical Experience Questionnaire (MEQ30) was previously developed within an online survey of mystical-type experiences occasioned by psilocybin-containing mushrooms. The rated experiences occurred on average eight years before completion of the questionnaire. The current paper validates the MEQ30 using data from experimental studies with controlled doses of psilocybin. Data were pooled and analyzed from five laboratory experiments in which participants (n=184) received a moderate to high oral dose of psilocybin (at least 20 mg/70 kg). Results of confirmatory factor analysis demonstrate the reliability and internal validity of the MEQ30. Structural equation models demonstrate the external and convergent validity of the MEQ30 by showing that latent variable scores on the MEQ30 positively predict persisting change in attitudes, behavior, and well-being attributed to experiences with psilocybin while controlling for the contribution of the participant-rated intensity of drug effects. These findings support the use of the MEQ30 as an efficient measure of individual mystical experiences. A method to score a "complete mystical experience" that was used in previous versions of the mystical experience questionnaire is validated in the MEQ30, and a stand-alone version of the MEQ30 is provided for use in future research. © The Author(s) 2015.
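The "complete mystical experience" scoring rule mentioned above can be sketched as follows. Factor sizes and the 60% threshold are as published for the MEQ30 (four factors: mystical, 15 items; positive mood, 6; transcendence of time and space, 6; ineffability, 3; items rated 0-5); the example score vector is invented for illustration:

```python
# MEQ30 "complete mystical experience" criterion: at least 60% of the
# maximum possible score on every one of the four factors.
FACTOR_SIZES = {"mystical": 15, "positive_mood": 6,
                "transcendence": 6, "ineffability": 3}
MAX_ITEM = 5      # each item rated 0-5
THRESHOLD = 0.60  # 60% of each factor's maximum

def complete_mystical_experience(factor_scores):
    """factor_scores: factor name -> summed item ratings for that factor."""
    return all(
        factor_scores[f] >= THRESHOLD * MAX_ITEM * n
        for f, n in FACTOR_SIZES.items()
    )

# Invented example: clears every factor threshold (45, 18, 18, 9).
scores = {"mystical": 60, "positive_mood": 25,
          "transcendence": 20, "ineffability": 10}
print(complete_mystical_experience(scores))
```

Note the all-factors requirement: a high total score does not qualify if any single factor falls below its threshold.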
Funding for the 2ND IAEA technical meeting on fusion data processing, validation and analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Greenwald, Martin
The International Atomic Energy Agency (IAEA) will organize the second Technical Meeting on Fusion Data Processing, Validation and Analysis from 30 May to 02 June, 2017, in Cambridge, MA, USA. The meeting will be hosted by the MIT Plasma Science and Fusion Center (PSFC). The objective of the meeting is to provide a platform where a set of topics relevant to fusion data processing, validation and analysis are discussed with a view to the extrapolation needs of next-step fusion devices such as ITER. The validation and analysis of experimental data obtained from diagnostics used to characterize fusion plasmas are crucial for a knowledge-based understanding of the physical processes governing the dynamics of these plasmas. The meeting will aim at fostering, in particular, discussions of research and development results that set out or underline trends observed in the current major fusion confinement devices. General information on the IAEA, including its mission and organization, can be found at the IAEA website. Topics include: uncertainty quantification (UQ); model selection, validation, and verification (V&V); probability theory and statistical analysis; inverse problems and equilibrium reconstruction; integrated data analysis; real-time data analysis; machine learning; signal/image processing and pattern recognition; experimental design and synthetic diagnostics; and data management.
Development and validation of a low-cost mobile robotics testbed
NASA Astrophysics Data System (ADS)
Johnson, Michael; Hayes, Martin J.
2012-03-01
This paper considers the design, construction and validation of a low-cost experimental robotic testbed, which allows for the localisation and tracking of multiple robotic agents in real time. The testbed system is suitable for research and education in a range of different mobile robotic applications, for validating theoretical as well as practical research work in the field of digital control, mobile robotics, graphical programming and video tracking systems. It provides a reconfigurable floor space for mobile robotic agents to operate within, while tracking the position of multiple agents in real-time using the overhead vision system. The overall system provides a highly cost-effective solution to the topical problem of providing students with practical robotics experience within severe budget constraints. Several problems encountered in the design and development of the mobile robotic testbed and associated tracking system, such as radial lens distortion and the selection of robot identifier templates are clearly addressed. The testbed performance is quantified and several experiments involving LEGO Mindstorm NXT and Merlin System MiaBot robots are discussed.
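The radial lens distortion mentioned above is commonly handled with the Brown polynomial model; the sketch below is a generic illustration of that model, not code from the paper, and the coefficient value is hypothetical:

```python
def radial_distort(x: float, y: float, k1: float, k2: float = 0.0):
    """Brown radial-distortion model on normalized image coordinates:
    scale = 1 + k1*r^2 + k2*r^4, applied symmetrically about the optical axis."""
    r2 = x * x + y * y
    scale = 1.0 + k1 * r2 + k2 * r2 * r2
    return x * scale, y * scale

# Hypothetical barrel-distortion coefficient for an overhead wide-angle camera;
# a tracking system inverts this mapping (e.g., iteratively) before locating robots.
print(radial_distort(0.5, 0.0, k1=-0.2))
```

Calibrating k1 (and k2) against a known grid lets an overhead vision system convert distorted pixel positions into accurate floor coordinates for the tracked agents.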
NASA Astrophysics Data System (ADS)
Vedartham, Padmaja B.
Snap-through buckling provides an intricate force-displacement relationship for study. With the possibility of multiple limit points, pitchfork bifurcations, and large regions of instability, experimental validation of numerical analysis can become difficult. It requires stabilization of unstable static equilibria, for which limited prior research exists. For all but the simplest cases, more than one actuator is needed, increasing the complexity of the experiment to the point of intractability without a control system. In this thesis, the necessary conditions for stabilization of a buckled beam with pinned boundaries under transverse loading were determined. By combining various nonlinear solution methods, a control system was created that could stabilize any branch of the force-displacement response. Experimental traversal of an unstable branch is presented along with other unstable static equilibrium configurations. The control system had numerical limitations, losing convergence near singular points. The groundwork for experimental stabilization was validated and demonstrated.
Fluidic Vectoring of a Planar Incompressible Jet Flow
NASA Astrophysics Data System (ADS)
Mendez, Miguel Alfonso; Scelzo, Maria Teresa; Enache, Adriana; Buchlin, Jean-Marie
2018-06-01
This paper presents an experimental, numerical, and theoretical analysis of the performance of a fluidic vectoring device for controlling the direction of a turbulent, two-dimensional, low-Mach-number (incompressible) jet flow. The investigated design is co-flow secondary injection with a Coanda surface, which allows vectoring angles up to 25° with no need for moving mechanical parts. A simple empirical model of the vectoring process is presented and validated against experimental and numerical data. The experiments consist of flow visualization and image processing for automatic detection of the jet centerline; the numerical simulations solve the Unsteady Reynolds-Averaged Navier-Stokes (URANS) equations closed with the k-ω SST turbulence model, using the PisoFoam solver from OpenFOAM. The experimental validation on three different geometrical configurations has shown that the model is capable of providing a fast and reliable evaluation of the device performance as a function of the operating conditions.
Computational Fluid Dynamics Uncertainty Analysis Applied to Heat Transfer over a Flat Plate
NASA Technical Reports Server (NTRS)
Groves, Curtis Edward; Ilie, Marcel; Schallhorn, Paul A.
2013-01-01
There have been few discussions on using Computational Fluid Dynamics (CFD) without experimental validation. Pairing experimental data, uncertainty analysis, and analytical predictions provides a comprehensive approach to verification and is the current state of the art. With tight budgets, however, collecting experimental data is rare or non-existent. This paper investigates and proposes a method to perform CFD uncertainty analysis from computational data alone. The method uses current CFD uncertainty techniques coupled with the Student-t distribution to predict the heat transfer coefficient over a flat plate. The inputs to the CFD model are varied within a specified tolerance or bias error, and the difference in the results is used to estimate the uncertainty. The variation in each input is ranked from least to greatest to determine the order of importance. The results are compared to heat transfer correlations, and conclusions are drawn about the feasibility of using CFD without experimental data. The results provide a tactic to analytically estimate the uncertainty in a CFD model when experimental data are unavailable.
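The sampling side of such an approach can be sketched briefly. This is a generic illustration, not the paper's procedure: several CFD runs with inputs perturbed within their tolerances yield a sample of outputs, and a Student-t interval on the sample mean serves as the uncertainty estimate. The t-value table and the run data below are illustrative:

```python
import statistics

# Two-sided 95% Student-t critical values (df -> t) for small samples.
T95 = {2: 4.303, 3: 3.182, 4: 2.776, 5: 2.571, 6: 2.447, 7: 2.365, 8: 2.306, 9: 2.262}

def cfd_uncertainty(samples: list) -> tuple:
    """Mean and 95% t-interval half-width from a set of perturbed-input CFD runs."""
    n = len(samples)
    m = statistics.mean(samples)
    s = statistics.stdev(samples)           # sample standard deviation (ddof=1)
    half = T95[n - 1] * s / n ** 0.5        # half-width of the confidence interval
    return m, half

# Hypothetical heat-transfer coefficients (W/m^2.K) from five runs in which
# inlet velocity and fluid properties were varied within their bias tolerances.
h_runs = [102.0, 98.5, 101.2, 99.8, 100.5]
h_mean, h_unc = cfd_uncertainty(h_runs)
print(f"h = {h_mean:.1f} +/- {h_unc:.1f} W/m^2.K")
```

Ranking the output change caused by each input perturbation, as the abstract describes, then identifies which tolerances dominate the estimated uncertainty.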
An experimental and modeling study of isothermal charge/discharge behavior of commercial Ni-MH cells
NASA Astrophysics Data System (ADS)
Pan, Y. H.; Srinivasan, V.; Wang, C. Y.
In this study, a previously developed nickel-metal hydride (Ni-MH) battery model is applied in conjunction with experimental characterization. Important geometric parameters, including the active surface area and micro-diffusion length for both electrodes, are measured and incorporated in the model. The kinetic parameters of the oxygen evolution reaction are also characterized using constant potential experiments. Two separate equilibrium equations for the Ni electrode, one for charge and the other for discharge, are determined to provide a better description of the electrode hysteresis effect, and their use results in better agreement of simulation results with experimental data on both charge and discharge. The Ni electrode kinetic parameters are re-calibrated for the battery studied. The Ni-MH cell model coupled with the updated electrochemical properties is then used to simulate a wide range of experimental discharge and charge curves with satisfactory agreement. The experimentally validated model is used to predict and compare various charge algorithms so as to provide guidelines for application-specific optimization.
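Side-reaction kinetics of the kind characterized above are conventionally described with a Butler-Volmer rate expression. The sketch below is generic, with hypothetical parameter values rather than the paper's fitted ones:

```python
import math

F = 96485.0   # Faraday constant, C/mol
R = 8.314     # universal gas constant, J/(mol K)

def butler_volmer(eta: float, i0: float, alpha_a: float = 0.5,
                  alpha_c: float = 0.5, T: float = 298.15) -> float:
    """Current density (A/m^2) at surface overpotential eta (V),
    via the Butler-Volmer equation with anodic/cathodic transfer coefficients."""
    return i0 * (math.exp(alpha_a * F * eta / (R * T))
                 - math.exp(-alpha_c * F * eta / (R * T)))

# Hypothetical exchange current density for the oxygen-evolution side reaction,
# which becomes significant only at large positive overpotentials on charge.
i0_o2 = 1e-7  # A/m^2
print(butler_volmer(0.3, i0_o2))
```

In a full cell model, expressions of this form for the main and side reactions are coupled with mass balances; fitting i0 and the transfer coefficients to constant-potential data is the characterization step the abstract refers to.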
Experimental Validation of the Transverse Shear Behavior of a Nomex Core for Sandwich Panels
NASA Astrophysics Data System (ADS)
Farooqi, M. I.; Nasir, M. A.; Ali, H. M.; Ali, Y.
2017-05-01
This work deals with the determination of the transverse shear moduli of a Nomex® honeycomb core for sandwich panels. The out-of-plane shear characteristics of such panels depend on the transverse shear moduli of the honeycomb core. These moduli were determined experimentally, numerically, and analytically. Numerical simulations were performed using a unit-cell model, and three analytical approaches were evaluated. Analytical calculations showed that two of the approaches provided reasonable predictions for the transverse shear modulus as compared with experimental results. However, the approach based upon classical lamination theory showed large deviations from experimental data. Numerical simulations also showed a trend similar to that resulting from the analytical models.
Experimental study of isolas in nonlinear systems featuring modal interactions
Noël, Jean-Philippe; Virgin, Lawrence N.; Kerschen, Gaëtan
2018-01-01
The objective of the present paper is to provide experimental evidence of isolated resonances in the frequency response of nonlinear mechanical systems. More specifically, this work explores the presence of isolas, which are periodic solutions detached from the main frequency response, in the case of a nonlinear set-up consisting of two masses sliding on a horizontal guide. A careful experimental investigation of isolas is carried out using responses to swept-sine and stepped-sine excitations. The experimental findings are validated with advanced numerical simulations combining nonlinear modal analysis and bifurcation monitoring. In particular, the interactions between two nonlinear normal modes are shown to be responsible for the creation of the isolas. PMID:29584758
Shanks, Ryan A; Robertson, Chuck L; Haygood, Christian S; Herdliksa, Anna M; Herdliska, Heather R; Lloyd, Steven A
2017-01-01
Introductory biology courses provide an important opportunity to prepare students for future courses, yet existing cookbook labs, although valuable in their own way, fail to provide many of the advantages of semester-long research experiences. Engaging, authentic research experiences aid biology students in meeting many learning goals. Overlaying a research experience onto the existing lab structure therefore allows faculty to overcome barriers to curricular change. Here we propose a working model for this overlay design in an introductory biology course and detail a means of conducting this lab with minimal increases in student and faculty workloads. Furthermore, we conducted an exploratory factor analysis of the Experimental Design Ability Test (EDAT) and uncovered two latent factors that provide a valid means of assessing this overlay model's ability to increase advanced experimental design abilities. In a pre-test/post-test design, we demonstrate significant increases in both basic and advanced experimental design abilities in an experimental and a comparison group, and we measured significantly higher gains in advanced experimental design understanding in students in the experimental group. We believe this overlay model and the EDAT factor analysis contribute a novel means of conducting and assessing the effectiveness of authentic research experiences in an introductory course without major changes to the course curriculum and with minimal increases in faculty and student workloads.
Measuring landscape esthetics: the scenic beauty estimation method
Terry C. Daniel; Ron S. Boster
1976-01-01
The Scenic Beauty Estimation Method (SBE) provides quantitative measures of esthetic preferences for alternative wildland management systems. Extensive experimentation and testing with user, interest, and professional groups validated the method. SBE shows promise as an efficient and objective means for assessing the scenic beauty of public forests and wildlands, and...
A laboratory-scale experimental program was designed to standardize each of four black carbon measurement methods, provide appropriate quality assurance/control procedures for these techniques, and compare measurements made by these methods to a NIST traceable standard (filter gr...
Management Strategies for Promoting Teacher Collective Learning
ERIC Educational Resources Information Center
Cheng, Eric C. K.
2011-01-01
This paper aims to validate a theoretical model for developing teacher collective learning by using a quasi-experimental design, and explores the management strategies that would provide a school administrator practical steps to effectively promote collective learning in the school organization. Twenty aided secondary schools in Hong Kong were…
Threats to the Internal Validity of Experimental and Quasi-Experimental Research in Healthcare.
Flannelly, Kevin J; Flannelly, Laura T; Jankowski, Katherine R B
2018-01-01
The article defines, describes, and discusses the seven threats to the internal validity of experiments discussed by Donald T. Campbell in his classic 1957 article: history, maturation, testing, instrument decay, statistical regression, selection, and mortality. These concepts are said to be threats to the internal validity of experiments because they pose alternate explanations for the apparent causal relationship between the independent variable and dependent variable of an experiment if they are not adequately controlled. A series of simple diagrams illustrate three pre-experimental designs and three true experimental designs discussed by Campbell in 1957 and several quasi-experimental designs described in his book written with Julian C. Stanley in 1966. The current article explains why each design controls for or fails to control for these seven threats to internal validity.
NASA Astrophysics Data System (ADS)
Stastnik, S.
2016-06-01
Development of materials for vertical outer building structures tends toward the application of hollow clay blocks filled with an appropriate insulation material. Ceramic fittings provide high thermal resistance, but walls built from them frequently suffer from condensation of air humidity during the winter season. The paper presents the computational simulation and experimental laboratory validation of the moisture behaviour of such masonry with insulation prepared from waste fibres under Central European climatic conditions.
Fault detection, isolation, and diagnosis of self-validating multifunctional sensors.
Yang, Jing-Li; Chen, Yin-Sheng; Zhang, Li-Li; Sun, Zhen
2016-06-01
A novel fault detection, isolation, and diagnosis (FDID) strategy for self-validating multifunctional sensors is presented in this paper. A sparse non-negative matrix factorization-based method can effectively detect faults by using the squared prediction error (SPE) statistic, and variable contribution plots based on the SPE statistic help to locate and isolate the faulty sensitive units. Complete ensemble empirical mode decomposition is employed to decompose the fault signals into a series of intrinsic mode functions (IMFs) and a residual. The sample entropy (SampEn)-weighted energy values of each IMF and the residual are estimated to represent the characteristics of the fault signals. A multi-class support vector machine is introduced to identify the fault mode, with the purpose of diagnosing the status of the faulty sensitive units. The performance of the proposed strategy is compared with that of other fault detection strategies, such as principal component analysis and independent component analysis, and of fault diagnosis strategies such as empirical mode decomposition coupled with a support vector machine. The proposed strategy is fully evaluated in a real self-validating multifunctional sensor experimental system, and the experimental results demonstrate that it provides an excellent solution to the FDID problem for self-validating multifunctional sensors.
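The SPE-based detection step can be sketched in a few lines. The sketch below substitutes plain PCA for the paper's sparse non-negative matrix factorization, and the sensor data are synthetic, so it illustrates only the general idea: fit a low-rank model to normal data, then flag samples whose squared residual is large:

```python
import numpy as np

def spe_statistic(X_train: np.ndarray, x_new: np.ndarray, k: int) -> float:
    """Squared prediction error of x_new against a rank-k PCA model of X_train."""
    mu = X_train.mean(axis=0)
    _, _, Vt = np.linalg.svd(X_train - mu, full_matrices=False)
    P = Vt[:k].T                                  # loadings of retained subspace
    r = (x_new - mu) - P @ P.T @ (x_new - mu)     # residual outside the subspace
    return float(r @ r)

# Synthetic multifunctional-sensor readings: three correlated channels.
rng = np.random.default_rng(1)
t = rng.normal(size=(200, 1))
X = np.hstack([t, 2 * t, -t]) + 0.01 * rng.normal(size=(200, 3))
normal_sample = X[0]
faulty_sample = normal_sample + np.array([0.0, 1.5, 0.0])  # bias fault, channel 2
print(spe_statistic(X, normal_sample, k=1), spe_statistic(X, faulty_sample, k=1))
```

A detection threshold on the SPE is normally set from the training distribution; the contribution-plot step then attributes the residual to individual channels to isolate the faulty unit.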
Automated identification of reference genes based on RNA-seq data.
Carmona, Rosario; Arroyo, Macarena; Jiménez-Quesada, María José; Seoane, Pedro; Zafra, Adoración; Larrosa, Rafael; Alché, Juan de Dios; Claros, M Gonzalo
2017-08-18
Gene expression analyses demand appropriate reference genes (RGs) for normalization in order to obtain reliable assessments. Ideally, RG expression levels should remain constant in all cells, tissues, or experimental conditions under study. Housekeeping genes traditionally fulfilled this requirement, but they have been reported to be less invariant than expected; therefore, RGs should be tested and validated for every particular situation. Microarray data have been used to propose new RGs, but only a limited set of model species and conditions is available; in contrast, RNA-seq experiments are increasingly frequent and constitute a new source of candidate RGs. An automated workflow based on mapped NGS reads has been constructed to obtain highly and invariantly expressed RGs, based on normalized expression in reads per mapped million and the coefficient of variation. This workflow has been tested with Roche/454 reads from reproductive tissues of olive tree (Olea europaea L.), as well as with Illumina paired-end reads from two different accessions of Arabidopsis thaliana and three different human cancers (prostate, small-cell lung cancer, and lung adenocarcinoma). Candidate RGs have been proposed for each species, and many of them have been previously reported as RGs in the literature. Experimental validation of significant RGs in olive tree is provided to support the algorithm. Regardless of sequencing technology, number of replicates, and library sizes, when RNA-seq experiments are designed and performed, the same datasets can be analyzed with our workflow to extract suitable RGs for subsequent PCR validation. Moreover, different subsets of experimental conditions can provide different suitable RGs.
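The core filter of such a workflow (high normalized expression, low coefficient of variation) can be sketched as follows. Thresholds, counts, and library sizes are illustrative, not the paper's:

```python
import numpy as np

def candidate_reference_genes(counts: np.ndarray, lib_sizes: np.ndarray,
                              min_rpm: float = 50.0, max_cv: float = 0.2):
    """Rank genes by coefficient of variation of reads-per-mapped-million.

    counts: (n_genes, n_samples) raw mapped-read counts.
    Returns indices of genes that are highly (mean RPM >= min_rpm) and
    invariantly (CV <= max_cv) expressed, sorted by ascending CV.
    """
    rpm = counts / lib_sizes * 1e6                 # normalize by library size
    mean_rpm = rpm.mean(axis=1)
    cv = rpm.std(axis=1, ddof=1) / mean_rpm        # coefficient of variation
    keep = np.where((mean_rpm >= min_rpm) & (cv <= max_cv))[0]
    return keep[np.argsort(cv[keep])]

# Hypothetical counts for four genes across three libraries of different sizes.
counts = np.array([[500, 1000, 750],    # stable, high  -> good RG candidate
                   [500, 2000, 300],    # variable      -> rejected
                   [1, 2, 1],           # too low       -> rejected
                   [400, 820, 610]])    # stable, high  -> good RG candidate
libs = np.array([1e6, 2e6, 1.5e6])
print(candidate_reference_genes(counts, libs))
```

Genes passing both filters become candidates for the RT-qPCR validation step the abstract describes.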
Wu, Jianyang; Zhang, Hongna; Liu, Liqin; Li, Weicai; Wei, Yongzan; Shi, Shengyou
2016-01-01
Reverse transcription quantitative PCR (RT-qPCR) is an accurate and sensitive method for gene expression analysis, but the veracity and reliability of its results depend on the selection of an appropriate reference gene. To date, several reliable reference gene validations have been reported in fruit trees, but none have been done on preharvest and postharvest longan fruits. In this study, 12 candidate reference genes, namely, CYP, RPL, GAPDH, TUA, TUB, Fe-SOD, Mn-SOD, Cu/Zn-SOD, 18SrRNA, Actin, Histone H3, and EF-1a, were selected. Expression stability of these genes in 150 longan samples was evaluated and analyzed using the geNorm and NormFinder algorithms. Preharvest samples consisted of seven experimental sets, including different developmental stages, organs, hormone stimuli (NAA, 2,4-D, and ethephon), and abiotic stresses (bagging and girdling with defoliation). Postharvest samples consisted of different temperature treatments (4 and 22°C) and varieties. Our findings indicate that appropriate reference gene(s) should be selected for each experimental condition. Our data further showed that the commonly used reference gene Actin does not exhibit stable expression across experimental conditions in longan. Expression of the DlACO gene, a key gene involved in regulating fruit abscission under the girdling-with-defoliation treatment, was evaluated to validate our findings. In conclusion, our data provide a useful framework for the choice of suitable reference genes across different experimental conditions for RT-qPCR analysis of preharvest and postharvest longan fruits. PMID:27375640
Reactivity loss validation of high burn-up PWR fuels with pile-oscillation experiments in MINERVE
DOE Office of Scientific and Technical Information (OSTI.GOV)
Leconte, P.; Vaglio-Gaudard, C.; Eschbach, R.
2012-07-01
The ALIX experimental program relies on the experimental validation of the spent fuel inventory, by chemical analysis of samples irradiated in a PWR for between 5 and 7 cycles, and also on the experimental validation of the spent fuel reactivity loss with burn-up, obtained by pile-oscillation measurements in the MINERVE reactor. These latter experiments provide an overall validation of both the fuel inventory and the nuclear data responsible for the reactivity loss. The program also offers unique experimental data for fuels with a burn-up reaching 85 GWd/t, as spent fuel in French PWRs has never exceeded 70 GWd/t to date. The analysis of these experiments is done in two steps with the APOLLO2/SHEM-MOC/CEA2005v4 package. In the first, the fuel inventory of each sample is obtained by assembly calculations. The calculation route consists of the self-shielding of cross sections on the 281-energy-group SHEM mesh, followed by the flux calculation by the Method of Characteristics in a 2D-exact heterogeneous geometry of the assembly, and finally a depletion calculation by an iterative resolution of the Bateman equations. In the second step, the fuel inventory is used in the analysis of pile-oscillation experiments in which the reactivity of the ALIX spent fuel samples is compared to the reactivity of fresh fuel samples. The comparison between experiment and calculation shows satisfactory results with the JEFF3.1.1 library, which predicts the reactivity loss within 2% for a burn-up of ~75 GWd/t and within 4% for a burn-up of ~85 GWd/t. (authors)
NASA Technical Reports Server (NTRS)
Nguyen, Quang-Viet; Kojima, Jun
2005-01-01
Researchers from NASA Glenn Research Center's Combustion Branch and the Ohio Aerospace Institute (OAI) have developed a transferable calibration standard for an optical technique called spontaneous Raman scattering (SRS) in high-pressure flames. SRS is perhaps the only technique that provides spatially and temporally resolved, simultaneous multiscalar measurements in turbulent flames. Such measurements are critical for the validation of numerical models of combustion. This study has been a combined experimental and theoretical effort to develop a spectral calibration database for multiscalar diagnostics using SRS in high-pressure flames. In the past, such measurements have used a one-of-a-kind experimental setup and a setup-dependent calibration procedure to empirically account for spectral interferences, or crosstalk, among the major species of interest. Such calibration procedures, being non-transferable, are prohibitively expensive to duplicate. A goal of this effort is to provide an SRS calibration database using transferable standards that can be implemented widely by other researchers for both atmospheric-pressure and high-pressure (less than 30 atm) SRS studies. A secondary goal is to provide quantitative multiscalar diagnostics in high-pressure environments to validate computational combustion codes.
Observations on CFD Verification and Validation from the AIAA Drag Prediction Workshops
NASA Technical Reports Server (NTRS)
Morrison, Joseph H.; Kleb, Bil; Vassberg, John C.
2014-01-01
The authors provide observations from the AIAA Drag Prediction Workshops that have spanned over a decade and from a recent validation experiment at NASA Langley. These workshops provide an assessment of the predictive capability of forces and moments, focused on drag, for transonic transports. It is very difficult to manage the consistency of results in a workshop setting to perform verification and validation at the scientific level, but it may be sufficient to assess it at the level of practice. Observations thus far: 1) due to simplifications in the workshop test cases, wind tunnel data are not necessarily the “correct” results that CFD should match; 2) an average of core CFD data is not necessarily a better estimate of the true solution, as it is merely an average of other solutions and has many coupled sources of variation; 3) outlier solutions should be investigated and understood; and 4) the DPW series does not have the systematic build-up and definition on both the computational and experimental side that is required for detailed verification and validation. Several observations regarding the importance of the grid, effects of physical modeling, benefits of open forums, and guidance for validation experiments are discussed. The increased variation in results when predicting regions of flow separation and increased variation due to interaction effects, e.g., fuselage and horizontal tail, point out the need for validation data sets for these important flow phenomena. Experiences with a recent validation experiment at NASA Langley are included to provide guidance on validation experiments.
Ocean Observations with EOS/MODIS: Algorithm Development and Post Launch Studies
NASA Technical Reports Server (NTRS)
Gordon, Howard R.; Conboy, Barbara (Technical Monitor)
1999-01-01
This separation has been logical thus far; however, as launch of AM-1 approaches, it must be recognized that many of these activities will shift emphasis from algorithm development to validation. For example, the second, third, and fifth bullets will become almost totally validation-focused activities in the post-launch era, providing the core of our experimental validation effort. Work under the first bullet will continue into the post-launch time frame, driven in part by algorithm deficiencies revealed as a result of validation activities. Prior to the start of the 1999 fiscal year (FY99) we were requested to prepare a brief plan for our FY99 activities. This plan is included as Appendix 1. The present report describes the progress made on our planned activities.
Cloud computing and validation of expandable in silico livers.
Ropella, Glen E P; Hunt, C Anthony
2010-12-03
In Silico Livers (ISLs) are works in progress. They are used to challenge multilevel, multi-attribute, mechanistic hypotheses about the hepatic disposition of xenobiotics coupled with hepatic responses. To enhance ISL-to-liver mappings, we added discrete-time metabolism, biliary elimination, and bolus dosing features to a previously validated ISL and initiated re-validation experiments that required scaling to more simulated lobules than previously, more than could be achieved using the local cluster technology. Rather than dramatically increasing the size of our local cluster, we undertook the re-validation experiments using the Amazon EC2 cloud platform. Doing so required demonstrating the efficacy of scaling a simulation to use more cluster nodes and assessing the scientific equivalence of local cluster validation experiments with those executed using the cloud platform. The local cluster technology was duplicated in the Amazon EC2 cloud platform. Synthetic modeling protocols were followed to identify a successful parameterization. Experiment sample sizes (number of simulated lobules) on both platforms were 49, 70, 84, and 152 (cloud only). Experimental indistinguishability was demonstrated for ISL outflow profiles of diltiazem using both platforms for experiments consisting of 84 or more samples. The process was analogous to demonstrating results equivalency from two different wet-labs. The results provide additional evidence that disposition simulations using ISLs can cover the behavior space of liver experiments in distinct experimental contexts (there is in silico-to-wet-lab phenotype similarity). The scientific value of experimenting with multiscale biomedical models has been limited to research groups with access to computer clusters.
The availability of cloud technology coupled with the evidence of scientific equivalency has lowered the barrier and will greatly facilitate model sharing as well as provide straightforward tools for scaling simulations to encompass greater detail with no extra investment in hardware.
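One common way to quantify "experimental indistinguishability" between two platforms is a two-sample Kolmogorov-Smirnov comparison of their output distributions. The sketch below, with synthetic profile samples, illustrates that general idea; the authors' actual equivalence criteria may differ:

```python
import numpy as np

def ks_statistic(a: np.ndarray, b: np.ndarray) -> float:
    """Two-sample Kolmogorov-Smirnov statistic: max distance between ECDFs."""
    grid = np.sort(np.concatenate([a, b]))
    ecdf_a = np.searchsorted(np.sort(a), grid, side="right") / len(a)
    ecdf_b = np.searchsorted(np.sort(b), grid, side="right") / len(b)
    return float(np.abs(ecdf_a - ecdf_b).max())

# Synthetic per-lobule outflow measurements from the two platforms.
rng = np.random.default_rng(2)
local = rng.normal(1.0, 0.1, size=152)
cloud = rng.normal(1.0, 0.1, size=152)
print(ks_statistic(local, cloud))
```

A small statistic (below the critical value for the chosen significance level and sample sizes) supports treating the two platforms as scientifically equivalent, analogous to comparing results from two wet-labs.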
A Comprehensive Validation Methodology for Sparse Experimental Data
NASA Technical Reports Server (NTRS)
Norman, Ryan B.; Blattnig, Steve R.
2010-01-01
A comprehensive program of verification and validation has been undertaken to assess the applicability of models to space radiation shielding applications and to track progress as models are developed over time. The models are placed under configuration control, and automated validation tests are used so that comparisons can readily be made as models are improved. Though direct comparisons between theoretical results and experimental data are desired for validation purposes, such comparisons are not always possible due to lack of data. In this work, two uncertainty metrics are introduced that are suitable for validating theoretical models against sparse experimental databases. The nuclear physics models, NUCFRG2 and QMSFRG, are compared to an experimental database consisting of over 3600 experimental cross sections to demonstrate the applicability of the metrics. A cumulative uncertainty metric is applied to the question of overall model accuracy, while a metric based on the median uncertainty is used to analyze the models from the perspective of model development by analyzing subsets of the model parameter space.
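The flavor of such metrics can be sketched in a few lines. The definitions and data below are illustrative; the paper's exact cumulative and median metrics may be formulated differently:

```python
import numpy as np

def uncertainty_metrics(model: np.ndarray, experiment: np.ndarray):
    """Relative uncertainties of model predictions against measured cross sections.

    Returns (cumulative, median): the summed relative deviation, addressing
    overall accuracy, and the median relative deviation, which is robust to
    outliers and so useful for comparing subsets of the parameter space.
    """
    rel = np.abs(model - experiment) / np.abs(experiment)
    return float(rel.sum()), float(np.median(rel))

# Hypothetical fragmentation cross sections (mb) for a handful of reactions.
exp_xs = np.array([120.0, 45.0, 300.0, 80.0])
model_xs = np.array([110.0, 50.0, 310.0, 60.0])
cum, med = uncertainty_metrics(model_xs, exp_xs)
print(cum, med)
```

Automated tests of this kind, run against the full experimental database whenever a model changes, make progress over time directly comparable, which is the configuration-control point the abstract emphasizes.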
NASA Technical Reports Server (NTRS)
Cognata, Thomas J.; Leimkuehler, Thomas O.; Sheth, Rubik B.; Le, Hung
2012-01-01
The Fusible Heat Sink is a novel vehicle heat rejection technology which combines a flow through radiator with a phase change material. The combined technologies create a multi-function device able to shield crew members against Solar Particle Events (SPE), reduce radiator extent by permitting sizing to the average vehicle heat load rather than to the peak vehicle heat load, and to substantially absorb heat load excursions from the average while constantly maintaining thermal control system setpoints. This multi-function technology provides great flexibility for mission planning, making it possible to operate a vehicle in hot or cold environments and under high or low heat load conditions for extended periods of time. This paper describes the model development and experimental validation of the Fusible Heat Sink technology. The model developed was intended to meet the radiation and heat rejection requirements of a nominal MMSEV mission. Development parameters and results, including sizing and model performance will be discussed. From this flight-sized model, a scaled test-article design was modeled, designed, and fabricated for experimental validation of the technology at Johnson Space Center thermal vacuum chamber facilities. Testing showed performance comparable to the model at nominal loads and the capability to maintain heat loads substantially greater than nominal for extended periods of time.
NASA Technical Reports Server (NTRS)
Cognata, Thomas J.; Leimkuehler, Thomas; Sheth, Rubik; Le, Hung
2013-01-01
The Fusible Heat Sink is a novel vehicle heat rejection technology which combines a flow through radiator with a phase change material. The combined technologies create a multi-function device able to shield crew members against Solar Particle Events (SPE), reduce radiator extent by permitting sizing to the average vehicle heat load rather than to the peak vehicle heat load, and to substantially absorb heat load excursions from the average while constantly maintaining thermal control system setpoints. This multi-function technology provides great flexibility for mission planning, making it possible to operate a vehicle in hot or cold environments and under high or low heat load conditions for extended periods of time. This paper describes the modeling and experimental validation of the Fusible Heat Sink technology. The model developed was intended to meet the radiation and heat rejection requirements of a nominal MMSEV mission. Development parameters and results, including sizing and model performance will be discussed. From this flight-sized model, a scaled test-article design was modeled, designed, and fabricated for experimental validation of the technology at Johnson Space Center thermal vacuum chamber facilities. Testing showed performance comparable to the model at nominal loads and the capability to maintain heat loads substantially greater than nominal for extended periods of time.
2012-08-01
Experimental Characterization and Validation of Simultaneous Gust Alleviation and Energy Harvesting for Multifunctional Wing Spars (AFOSR). [Briefing-chart residue; recoverable points: gust simulation using the Dryden PSD (U0 = 15 m/s, Lv = 350 m); energy harvested from normal vibration; an energy control law based on limited energy constraints; experimentally validated simultaneous energy harvesting and vibration control.]
Supersonic Coaxial Jet Experiment for CFD Code Validation
NASA Technical Reports Server (NTRS)
Cutler, A. D.; Carty, A. A.; Doerner, S. E.; Diskin, G. S.; Drummond, J. P.
1999-01-01
A supersonic coaxial jet facility has been designed to provide experimental data suitable for the validation of CFD codes used to analyze high-speed propulsion flows. The center jet is of a light gas and the coflow jet is of air, and the mixing layer between them is compressible. Various methods have been employed in characterizing the jet flow field, including schlieren visualization, pitot, total temperature and gas sampling probe surveying, and RELIEF velocimetry. A Navier-Stokes code has been used to calculate the nozzle flow field and the results compared to the experiment.
NASA Astrophysics Data System (ADS)
Silvernail, Nathan L.
This research was carried out in collaboration with the United Launch Alliance (ULA), to advance an innovative Centaur-based on-orbit propellant storage and transfer system that takes advantage of rotational settling to simplify Fluid Management (FM), specifically enabling settled fluid transfer between two tanks and settled pressure control. This research consists of two specific objectives: (1) technique and process validation and (2) computational model development. In order to raise the Technology Readiness Level (TRL) of this technology, the corresponding FM techniques and processes must be validated in a series of experimental tests, including laboratory/ground testing, microgravity flight testing, suborbital flight testing, and orbital testing. Researchers from Embry-Riddle Aeronautical University (ERAU) have joined with the Massachusetts Institute of Technology (MIT) Synchronized Position Hold Engage and Reorient Experimental Satellites (SPHERES) team to develop a prototype FM system for operations aboard the International Space Station (ISS). Testing of the integrated system in a representative environment will raise the FM system to TRL 6. The tests will demonstrate the FM system and provide unique data pertaining to the vehicle's rotational dynamics while undergoing fluid transfer operations. These data sets provide insight into the behavior and physical tendencies of the on-orbit refueling system. Furthermore, they provide a baseline for comparison against the data produced by various computational models, thus verifying the accuracy of the models' output and validating the modeling approach. Once these preliminary models have been validated, the parameters defined by them will provide the basis for developing accurate simulations of full-scale, on-orbit systems.
The completion of this project and the models being developed will accelerate the commercialization of on-orbit propellant storage and transfer technologies as well as all in-space technologies that utilize or will utilize similar FM techniques and processes.
Quantitative validation of carbon-fiber laminate low velocity impact simulations
English, Shawn A.; Briggs, Timothy M.; Nelson, Stacy M.
2015-09-26
Simulations of low velocity impact with a flat cylindrical indenter upon a carbon fiber fabric reinforced polymer laminate are rigorously validated. Comparison of the impact energy absorption between the model and experiment is used as the validation metric. Additionally, non-destructive evaluation, including ultrasonic scans and three-dimensional computed tomography, provides qualitative validation of the models. The simulations include delamination, matrix cracks and fiber breaks. An orthotropic damage and failure constitutive model, capable of predicting progressive damage and failure, is developed in conjunction and described. An ensemble of simulations incorporating model parameter uncertainties is used to predict a response distribution, which is then compared to experimental output using appropriate statistical methods. Lastly, the model form errors are exposed and corrected for use in an additional blind validation analysis. The result is a quantifiable confidence in material characterization and model physics when simulating low velocity impact in structures of interest.
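The ensemble-based comparison described above can be sketched in a few lines. The model, parameter spreads, and measured value below are hypothetical stand-ins for the finite element simulation and impact data, chosen only to illustrate propagating parameter uncertainty into a response distribution and scoring the experiment against it:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for the impact simulation: absorbed energy (J)
# as a simple function of two uncertain material parameters.
def simulate_absorbed_energy(modulus, strength):
    return 0.4 * modulus + 0.08 * strength

# Ensemble: propagate parameter uncertainty through the model.
modulus = rng.normal(60.0, 3.0, size=500)     # GPa, assumed spread
strength = rng.normal(800.0, 40.0, size=500)  # MPa, assumed spread
ensemble = simulate_absorbed_energy(modulus, strength)

# Validation metric: where does the measured value fall within the
# predicted response distribution?
measured = 89.0  # J, hypothetical experimental impact energy absorption
z = (measured - ensemble.mean()) / ensemble.std(ddof=1)
print(f"predicted {ensemble.mean():.1f} +/- {ensemble.std(ddof=1):.1f} J, z = {z:.2f}")
```

A standardized score like `z` (or a more formal statistical test on the ensemble) gives the quantifiable confidence the abstract refers to.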
Validation of Heat Transfer Thermal Decomposition and Container Pressurization of Polyurethane Foam.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Scott, Sarah Nicole; Dodd, Amanda B.; Larsen, Marvin E.
Polymer foam encapsulants provide mechanical, electrical, and thermal isolation in engineered systems. In fire environments, gas pressure from thermal decomposition of polymers can cause mechanical failure of sealed systems. In this work, a detailed uncertainty quantification study of PMDI-based polyurethane foam is presented to assess the validity of the computational model. Both experimental measurement uncertainty and model prediction uncertainty are examined and compared. Both the mean value method and Latin hypercube sampling approach are used to propagate the uncertainty through the model. In addition to comparing computational and experimental results, the importance of each input parameter on the simulation result is also investigated. These results show that further development in the physics model of the foam and appropriate associated material testing are necessary to improve model accuracy.
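A minimal sketch of the two propagation approaches named above, using a toy stand-in for the foam pressurization model (the function, input ranges, and units are illustrative assumptions, not values from the study): Latin hypercube sampling stratifies each input marginal, while the mean value method estimates the output spread from first-order sensitivities at the input means.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy stand-in for the container pressurization model: peak pressure as a
# function of decomposition activation energy and gas yield (both hypothetical).
def peak_pressure(e_act, gas_yield):
    return 2.0e4 / e_act + 50.0 * gas_yield

# Latin hypercube sample of one input: stratify the unit interval into
# n bins, draw one point per bin, then shuffle the bin order.
def latin_hypercube(n, rng):
    u = (np.arange(n) + rng.random(n)) / n
    return rng.permutation(u)

n = 200
e_act = 180.0 + 20.0 * latin_hypercube(n, rng)     # kJ/mol, assumed range
gas_yield = 0.30 + 0.10 * latin_hypercube(n, rng)  # kg gas / kg foam, assumed
p = peak_pressure(e_act, gas_yield)

# Mean value method: first-order uncertainty estimate at the input means
# using central finite-difference sensitivities.
mu = np.array([190.0, 0.35])
h = np.array([1.0, 0.01])
grad = np.array([
    (peak_pressure(mu[0] + h[0], mu[1]) - peak_pressure(mu[0] - h[0], mu[1])) / (2 * h[0]),
    (peak_pressure(mu[0], mu[1] + h[1]) - peak_pressure(mu[0], mu[1] - h[1])) / (2 * h[1]),
])
sigma = np.array([20.0 / np.sqrt(12), 0.10 / np.sqrt(12)])  # std devs of the uniform inputs
mv_std = np.sqrt(np.sum((grad * sigma) ** 2))
print(f"LHS: {p.mean():.1f} +/- {p.std(ddof=1):.1f}; mean value method: +/- {mv_std:.1f}")
```

For a mildly nonlinear model like this, the two spread estimates agree closely; the sampling approach becomes preferable as nonlinearity grows. The squared terms `(grad * sigma) ** 2` also rank input importance, as in the parameter study mentioned above.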
Turbine-99 unsteady simulations - Validation
NASA Astrophysics Data System (ADS)
Cervantes, M. J.; Andersson, U.; Lövgren, H. M.
2010-08-01
The Turbine-99 test case, a Kaplan draft tube model, aimed to determine the state of the art in draft tube simulation. Three workshops were organized on the matter, in 1999, 2001 and 2005, where the geometry and experimental data were provided as boundary conditions to the participants. Since the last workshop, computational power and flow modelling have developed, and the available data have been completed with unsteady pressure measurements and phase-resolved velocity measurements in the cone. This new set of data, together with the corresponding phase-resolved velocity boundary conditions, offers new possibilities for validating unsteady numerical simulations of Kaplan draft tubes. The present work presents simulations of the Turbine-99 test case with time-dependent, angularly resolved inlet velocity boundary conditions. Different grids and time steps are investigated. The results are compared to experimental time-dependent pressure and velocity measurements.
LEWICE 2.2 Capabilities and Thermal Validation
NASA Technical Reports Server (NTRS)
Wright, William B.
2002-01-01
Computational models of bleed air anti-icing and electrothermal de-icing have been added to the LEWICE 2.0 software by integrating the capabilities of two previous programs, ANTICE and LEWICE/Thermal. This combined model has been released as LEWICE version 2.2. Several advancements have also been added to the previous capabilities of each module. This report presents the capabilities of the software package and provides results for both bleed air and electrothermal cases. A comprehensive validation effort has also been performed to compare the predictions to an existing electrothermal database. A quantitative comparison shows that for de-icing cases the average difference is 9.4 F (26%), compared to an experimental error of 3 F, while for evaporative cases the average difference is 2 F (32%), compared to an experimental error of 4 F.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vlcek, Lukas; Chialvo, Ariel; Simonson, J Michael
2013-01-01
Molecular models and experimental estimates based on the cluster pair approximation (CPA) provide inconsistent predictions of absolute single-ion hydration properties. To understand the origin of this discrepancy we used molecular simulations to study the transition between hydration of alkali metal and halide ions in small aqueous clusters and bulk water. The results demonstrate that the assumptions underlying the CPA are not generally valid as a result of a significant shift in the ion hydration free energies (~15 kJ/mol) and enthalpies (~47 kJ/mol) in the intermediate range of cluster sizes. When this effect is accounted for, the systematic differences between models and experimental predictions disappear, and the value of the absolute proton hydration enthalpy based on the CPA comes into closer agreement with other estimates.
Experimental Equipment Validation for Methane (CH4) and Carbon Dioxide (CO2) Hydrates
NASA Astrophysics Data System (ADS)
Saad Khan, Muhammad; Yaqub, Sana; Manner, Naathiya; Ani Karthwathi, Nur; Qasim, Ali; Mellon, Nurhayati Binti; Lal, Bhajan
2018-04-01
Clathrate hydrates are well-known structures regarded as a threat to the oil and gas industry in light of their propensity to plug subsea pipelines. In natural gas transmission and processing, gas hydrate formation is one of the main flow assurance problems and has led researchers to conduct fresh and meticulous studies on various aspects of gas hydrates. This paper presents a thermodynamic analysis of pure CH4 and CO2 gas hydrates in custom-fabricated equipment (a sapphire-cell hydrate reactor) for experimental validation. CO2 gas hydrate formed at a lower pressure (41 bar) than CH4 gas hydrate (70 bar), and a comparison of thermodynamic properties between CH4 and CO2 hydrates is also presented. This preliminary study could provide pathways in the quest for potent hydrate inhibitors.
Neutron capture on short-lived nuclei via the surrogate (d,pγ) reaction
NASA Astrophysics Data System (ADS)
Cizewski, Jolie A.; Ratkiewicz, Andrew
2018-05-01
Rapid r-process nucleosynthesis is responsible for the creation of about half of the elements heavier than iron. Neutron capture on short-lived nuclei in cold processes or during freeze-out from hot processes can have a significant impact on the final observed r-process abundances. We are validating the (d,pγ) reaction as a surrogate for neutron capture with measurements on 95Mo targets and a focus on discrete transitions. The experimental results have been analyzed within the Hauser-Feshbach approach, with non-elastic breakup of the deuteron providing a neutron to be captured. Preliminary results support the (d,pγ) reaction as a valid surrogate for neutron capture. Following the development of these experimental techniques, we are poised to measure the (d,pγ) reaction in inverse kinematics with unstable beams.
Validation of Laser-Induced Fluorescent Photogrammetric Targets on Membrane Structures
NASA Technical Reports Server (NTRS)
Jones, Thomas W.; Dorrington, Adrian A.; Shortis, Mark R.; Hendricks, Aron R.
2004-01-01
The need for static and dynamic characterization of a new generation of inflatable space structures requires the advancement of classical metrology techniques. A new photogrammetric-based method for non-contact ranging and surface profiling has been developed at NASA Langley Research Center (LaRC) to support modal analyses and structural validation of this class of space structures. This full field measurement method, known as Laser-Induced Fluorescence (LIF) photogrammetry, has previously yielded promising experimental results. However, data indicating the achievable measurement precision had not been published. This paper provides experimental results that indicate the LIF-photogrammetry measurement precision for three different target types used on a reflective membrane structure. The target types were: (1) non-contact targets generated using LIF, (2) surface attached retro-reflective targets, and (3) surface attached diffuse targets. Results from both static and dynamic investigations are included.
Wightman, Jade; Julio, Flávia; Virués-Ortega, Javier
2014-05-01
Experimental functional analysis is an assessment methodology to identify the environmental factors that maintain problem behavior in individuals with developmental disabilities and in other populations. Functional analysis provides the basis for the development of reinforcement-based approaches to treatment. This article reviews the procedures, validity, and clinical implementation of the methodological variations of functional analysis and function-based interventions. We present six variations of functional analysis methodology in addition to the typical functional analysis: brief functional analysis, single-function tests, latency-based functional analysis, functional analysis of precursors, and trial-based functional analysis. We also present the three general categories of function-based interventions: extinction, antecedent manipulation, and differential reinforcement. Functional analysis methodology is a valid and efficient approach to the assessment of problem behavior and the selection of treatment strategies.
2017-09-01
Experimental Validation of Model Updating and Damage Detection via Eigenvalue Sensitivity Methods with Artificial Boundary Conditions, by Matthew D. Bouwense. Approved for public release; distribution is unlimited.
Matrix Dominated Failure of Fiber-Reinforced Composite Laminates Under Static and Dynamic Loading
NASA Astrophysics Data System (ADS)
Schaefer, Joseph Daniel
Hierarchical material systems provide the unique opportunity to connect material knowledge to solving specific design challenges. Representing the quickest growing class of hierarchical materials in use, fiber-reinforced polymer composites (FRPCs) offer superior strength and stiffness-to-weight ratios, damage tolerance, and decreasing production costs compared to metals and alloys. However, the implementation of FRPCs has historically been fraught with inadequate knowledge of the material failure behavior due to incomplete verification of recent computational constitutive models and improper (or non-existent) experimental validation, which has severely slowed creation and development. Noted by the recent Materials Genome Initiative and the Worldwide Failure Exercise, current state of the art qualification programs endure a 20 year gap between material conceptualization and implementation due to the lack of effective partnership between computational coding (simulation) and experimental characterization. Qualification processes are primarily experiment driven; the anisotropic nature of composites predisposes matrix-dominant properties to be sensitive to strain rate, which necessitates extensive testing. To decrease the qualification time, a framework that practically combines theoretical prediction of material failure with limited experimental validation is required. In this work, the Northwestern Failure Theory (NU Theory) for composite lamina is presented as the theoretical basis from which the failure of unidirectional and multidirectional composite laminates is investigated. From an initial experimental characterization of basic lamina properties, the NU Theory is employed to predict the matrix-dependent failure of composites under any state of biaxial stress from quasi-static to 1000 s-1 strain rates. 
It was found that the number of experiments required to characterize the strain-rate-dependent failure of a new composite material was reduced by an order of magnitude, and the resulting strain-rate dependence was applicable to a large class of materials. The presented framework provides engineers with the capability to quickly identify fiber and matrix combinations for a given application and determine the failure behavior over the range of practical loading cases. The failure-mode-based NU Theory may be especially useful when partnered with computational approaches (which often employ micromechanics to determine constituent and constitutive response) to provide accurate validation of the matrix-dominated failure modes experienced by laminates during progressive failure.
Bredbenner, Todd L.; Eliason, Travis D.; Francis, W. Loren; McFarland, John M.; Merkle, Andrew C.; Nicolella, Daniel P.
2014-01-01
Cervical spinal injuries are a significant concern in all trauma injuries. Recent military conflicts have demonstrated the substantial risk of spinal injury for the modern warfighter. Finite element models used to investigate injury mechanisms often fail to examine the effects of variation in geometry or material properties on mechanical behavior. The goals of this study were to model geometric variation for a set of cervical spines, to extend this model to a parametric finite element model, and, as a first step, to validate the parametric model against experimental data for low-loading conditions. Individual finite element models were created using cervical spine (C3–T1) computed tomography data for five male cadavers. Statistical shape modeling (SSM) was used to generate a parametric finite element model incorporating variability of spine geometry, and soft-tissue material property variation was also included. The probabilistic loading response of the parametric model was determined under flexion-extension, axial rotation, and lateral bending and validated by comparison to experimental data. Based on qualitative and quantitative comparison of the experimental loading response and model simulations, we suggest that the model performs adequately under relatively low-level loading conditions in multiple loading directions. In conclusion, SSM methods coupled with finite element analyses within a probabilistic framework, along with the ability to statistically validate the overall model performance, provide innovative and important steps toward describing the differences in vertebral morphology, spinal curvature, and variation in material properties. We suggest that these methods, with additional investigation and validation under injurious loading conditions, will lead to understanding and mitigating the risks of injury in the spine and other musculoskeletal structures. PMID:25506051
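Statistical shape modeling of this kind is typically a principal component analysis of corresponded landmark coordinates. The sketch below, with synthetic landmark data standing in for the C3-T1 geometries, shows how modes of shape variation are extracted and how a new parametric geometry is generated along a mode:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical landmark data: 5 subjects x 12 landmarks x 3 coordinates,
# standing in for corresponded points on the CT-derived spine geometries.
n_subjects, n_landmarks = 5, 12
mean_shape = rng.normal(0.0, 10.0, size=(n_landmarks, 3))
shapes = mean_shape + rng.normal(0.0, 0.5, size=(n_subjects, n_landmarks, 3))

# Statistical shape model: PCA on the flattened, centered coordinate vectors.
X = shapes.reshape(n_subjects, -1)
Xc = X - X.mean(axis=0)
# SVD of the centered data matrix yields the modes of shape variation.
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
var = S**2 / (n_subjects - 1)
explained = var / var.sum()

# A new parametric geometry: the mean shape plus b standard deviations
# along the first mode, suitable as input to a finite element mesh morph.
b = 1.5
new_shape = (X.mean(axis=0) + b * np.sqrt(var[0]) * Vt[0]).reshape(n_landmarks, 3)
print(f"mode 1 explains {explained[0]:.0%} of shape variance")
```

Sampling the mode coefficients (here `b`) from their fitted distributions is what turns the shape model into the probabilistic, parametric finite element model described above.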
2014-01-01
In omic research, such as genome wide association studies, researchers seek to repeat their results in other datasets to reduce false positive findings and thus provide evidence for the existence of true associations. Unfortunately this standard validation approach cannot completely eliminate false positive conclusions, and it can also mask many true associations that might otherwise advance our understanding of pathology. These issues beg the question: How can we increase the amount of knowledge gained from high throughput genetic data? To address this challenge, we present an approach that complements standard statistical validation methods by drawing attention to both potential false negative and false positive conclusions, as well as providing broad information for directing future research. The Diverse Convergent Evidence approach (DiCE) we propose integrates information from multiple sources (omics, informatics, and laboratory experiments) to estimate the strength of the available corroborating evidence supporting a given association. This process is designed to yield an evidence metric that has utility when etiologic heterogeneity, variable risk factor frequencies, and a variety of observational data imperfections might lead to false conclusions. We provide proof of principle examples in which DiCE identified strong evidence for associations that have established biological importance, when standard validation methods alone did not provide support. If used as an adjunct to standard validation methods this approach can leverage multiple distinct data types to improve genetic risk factor discovery/validation, promote effective science communication, and guide future research directions. PMID:25071867
Complex Water Impact: the Validation and Qualification Sciences Experimental Complex (VQSEC) at Sandia.
Hypersonic Experimental and Computational Capability, Improvement and Validation. Volume 2
NASA Technical Reports Server (NTRS)
Muylaert, Jean (Editor); Kumar, Ajay (Editor); Dujarric, Christian (Editor)
1998-01-01
The results of the phase 2 effort conducted under AGARD Working Group 18 on Hypersonic Experimental and Computational Capability, Improvement and Validation are presented in this report. The first volume, published in May 1996, mainly focused on the design methodology, plans and some initial results of experiments that had been conducted to serve as validation benchmarks. The current volume presents the detailed experimental and computational data base developed during this effort.
Biomarkers of exposure to new and emerging tobacco delivery products.
Schick, Suzaynn F; Blount, Benjamin C; Jacob, Peyton; Saliba, Najat A; Bernert, John T; El Hellani, Ahmad; Jatlow, Peter; Pappas, R Steven; Wang, Lanqing; Foulds, Jonathan; Ghosh, Arunava; Hecht, Stephen S; Gomez, John C; Martin, Jessica R; Mesaros, Clementina; Srivastava, Sanjay; St Helen, Gideon; Tarran, Robert; Lorkiewicz, Pawel K; Blair, Ian A; Kimmel, Heather L; Doerschuk, Claire M; Benowitz, Neal L; Bhatnagar, Aruni
2017-09-01
Accurate and reliable measurements of exposure to tobacco products are essential for identifying and confirming patterns of tobacco product use and for assessing their potential biological effects in both human populations and experimental systems. Due to the introduction of new tobacco-derived products and the development of novel ways to modify and use conventional tobacco products, precise and specific assessments of exposure to tobacco are now more important than ever. Biomarkers that were developed and validated to measure exposure to cigarettes are being evaluated to assess their use for measuring exposure to these new products. Here, we review current methods for measuring exposure to new and emerging tobacco products, such as electronic cigarettes, little cigars, water pipes, and cigarillos. Rigorously validated biomarkers specific to these new products have not yet been identified. Here, we discuss the strengths and limitations of current approaches, including whether they provide reliable exposure estimates for new and emerging products. We provide specific guidance for choosing practical and economical biomarkers for different study designs and experimental conditions. Our goal is to help both new and experienced investigators measure exposure to tobacco products accurately and avoid common experimental errors. With the identification of the capacity gaps in biomarker research on new and emerging tobacco products, we hope to provide researchers, policymakers, and funding agencies with a clear action plan for conducting and promoting research on the patterns of use and health effects of these products.
Experimental Design and Some Threats to Experimental Validity: A Primer
ERIC Educational Resources Information Center
Skidmore, Susan
2008-01-01
Experimental designs are distinguished as the best method to respond to questions involving causality. The purpose of the present paper is to explicate the logic of experimental design and why it is so vital to questions that demand causal conclusions. In addition, types of internal and external validity threats are discussed. To emphasize the…
Role of metabolism and viruses in aflatoxin-induced liver cancer
DOE Office of Scientific and Technical Information (OSTI.GOV)
Groopman, John D.; Kensler, Thomas W.
The use of biomarkers in molecular epidemiology studies for identifying stages in the progression of development of the health effects of environmental agents has the potential for providing important information for critical regulatory, clinical and public health problems. Investigations of aflatoxins probably represent one of the most extensive data sets in the field and this work may serve as a template for future studies of other environmental agents. The aflatoxins are naturally occurring mycotoxins found on foods such as corn, peanuts, various other nuts and cottonseed, and they have been demonstrated to be carcinogenic in many experimental models. As a result of nearly 30 years of study, experimental data and epidemiological studies in human populations, aflatoxin B1 was classified as carcinogenic to humans by the International Agency for Research on Cancer. The long-term goal of the research described herein is the application of biomarkers to the development of preventative interventions for use in human populations at high risk for cancer. Several of the aflatoxin-specific biomarkers have been validated in epidemiological studies and are now being used as intermediate biomarkers in prevention studies. The development of these aflatoxin biomarkers has been based upon the knowledge of the biochemistry and toxicology of aflatoxins gleaned from both experimental and human studies. These biomarkers have subsequently been utilized in experimental models to provide data on the modulation of these markers under different situations of disease risk. This systematic approach provides encouragement for preventive interventions and should serve as a template for the development, validation and application of other chemical-specific biomarkers to cancer or other chronic diseases.
ERIC Educational Resources Information Center
Hamby, Tyler; Taylor, Wyn
2016-01-01
This study examined the predictors and psychometric outcomes of survey satisficing, wherein respondents provide quick, "good enough" answers (satisficing) rather than carefully considered answers (optimizing). We administered surveys to university students and respondents--half of whom held college degrees--from a for-pay survey website,…
21 CFR 514.117 - Adequate and well-controlled studies.
Code of Federal Regulations, 2011 CFR
2011-04-01
... experimental units. When the effect of such variables is accounted for by an appropriate design, and when... and well-controlled study should provide sufficient details of study design, conduct, and analysis to... the new animal drug used in the study. (4) The study uses a design that permits a valid comparison...
21 CFR 514.117 - Adequate and well-controlled studies.
Code of Federal Regulations, 2012 CFR
2012-04-01
... experimental units. When the effect of such variables is accounted for by an appropriate design, and when... and well-controlled study should provide sufficient details of study design, conduct, and analysis to... the new animal drug used in the study. (4) The study uses a design that permits a valid comparison...
21 CFR 514.117 - Adequate and well-controlled studies.
Code of Federal Regulations, 2014 CFR
2014-04-01
... experimental units. When the effect of such variables is accounted for by an appropriate design, and when... and well-controlled study should provide sufficient details of study design, conduct, and analysis to... the new animal drug used in the study. (4) The study uses a design that permits a valid comparison...
ERIC Educational Resources Information Center
Howard, George S.; And Others
1979-01-01
True experimental designs are thought to provide internally valid results. In this investigation of the evaluations of five interventions, a source of internal invalidity is identified when self-report measures are used. An alternative approach is presented and implications of the findings for evaluation research are discussed. (JKS)
21 CFR 514.117 - Adequate and well-controlled studies.
Code of Federal Regulations, 2013 CFR
2013-04-01
... experimental units. When the effect of such variables is accounted for by an appropriate design, and when... and well-controlled study should provide sufficient details of study design, conduct, and analysis to... the new animal drug used in the study. (4) The study uses a design that permits a valid comparison...
21 CFR 514.117 - Adequate and well-controlled studies.
Code of Federal Regulations, 2010 CFR
2010-04-01
... experimental units. When the effect of such variables is accounted for by an appropriate design, and when... and well-controlled study should provide sufficient details of study design, conduct, and analysis to... the new animal drug used in the study. (4) The study uses a design that permits a valid comparison...
A summary and evaluation of semi-empirical methods for the prediction of helicopter rotor noise
NASA Technical Reports Server (NTRS)
Pegg, R. J.
1979-01-01
Existing prediction techniques are compiled and described. The descriptions include input and output parameter lists, required equations and graphs, and the range of validity for each part of the prediction procedures. Examples are provided illustrating the analysis procedure and the degree of agreement with experimental results.
As part of our efforts to develop a public platform to provide access to predictive models we have attempted to disentangle the influence of the quality versus quantity of data available to develop and validate QSAR models. Using a thorough manual review of the data underlying t...
Integral nuclear data validation using experimental spent nuclear fuel compositions
Gauld, Ian C.; Williams, Mark L.; Michel-Sendis, Franco; ...
2017-07-19
Measurements of the isotopic contents of spent nuclear fuel provide experimental data that are a prerequisite for validating computer codes and nuclear data for many spent fuel applications. Under the auspices of the Organisation for Economic Co-operation and Development (OECD) Nuclear Energy Agency (NEA) and guidance of the Expert Group on Assay Data of Spent Nuclear Fuel of the NEA Working Party on Nuclear Criticality Safety, a new database of expanded spent fuel isotopic compositions has been compiled. The database, Spent Fuel Compositions (SFCOMPO) 2.0, includes measured data for more than 750 fuel samples acquired from 44 different reactors and representing eight different reactor technologies. Measurements for more than 90 isotopes are included. This new database provides data essential for establishing the reliability of code systems for inventory predictions, but it also has broader potential application to nuclear data evaluation. Furthermore, the database is described together with adjoint-based sensitivity and uncertainty tools for transmutation systems, developed to quantify the importance of nuclear data for nuclide concentrations.
Alimenti, Federico; Bonafoni, Stefania; Roselli, Luca
2017-01-01
Controlled measurements by a low-cost single-pixel microwave radiometer operating at 12.65 GHz were carried out to assess the detection and counting capability for targets warmer than the surroundings. The adopted reference test targets were pre-warmed water and oil, and a hand, both bare and gloved. The results showed the reliability of microwave radiometry for counting operations under controlled conditions, and its effectiveness at detecting even warm targets masked by unheated dielectric layers. An electromagnetic model describing the scenario sensed by the radiometer antenna is proposed, and comparison with the experimental observations shows good agreement. The measurements prove that reliable counting is enabled by an antenna temperature increment, for each target sample added, of around 1 K. Starting from this value, an analysis of the antenna filling factor was performed to provide an instrument useful for evaluating real applicability in many practical situations. This study also allows the direct people-counting problem to be addressed, providing preliminary operational indications, reference numbers and experimental validation. PMID:28613264
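The counting principle reported above (roughly 1 K of antenna temperature per added target) can be illustrated with a toy beam-filling model; all numbers here are assumptions for illustration, not values from the paper:

```python
# Each target occupying a fraction of the antenna beam raises the antenna
# temperature, so the count follows from the measured increment.
T_background = 293.0    # K, unheated scene (assumed)
T_target = 305.0        # K, warm target, e.g. a hand (assumed)
fill_per_target = 0.08  # beam filling factor per target (assumed)

def antenna_temperature(n_targets):
    f = min(n_targets * fill_per_target, 1.0)
    return f * T_target + (1.0 - f) * T_background

# Per-target increment: about 1 K, consistent with the reported threshold.
delta = antenna_temperature(1) - antenna_temperature(0)
measured_increment = 2.9  # K, hypothetical radiometer reading
estimated_count = round(measured_increment / delta)
print(f"increment per target = {delta:.2f} K, estimated count = {estimated_count}")
```

The filling-factor analysis in the paper amounts to asking how large `fill_per_target` is for a given antenna and scene, which sets how many targets can be resolved before the beam saturates.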
An experimental study of the validity of the heat-field concept for sonic-boom alleviation
NASA Technical Reports Server (NTRS)
Swigart, R. J.
1974-01-01
An experimental program was carried out in the NASA-Langley 4 ft x 4 ft supersonic pressure tunnel to investigate the validity of the heat-field concept for sonic boom alleviation. The concept involves heating the flow about a supersonic aircraft in such a manner as to obtain an increase in effective aircraft length and yield an effective aircraft shape that will result in a shock-free pressure signature on the ground. First, a basic body-of-revolution representing an SST configuration with its lift equivalence in volume was tested to provide a baseline pressure signature. Second, a model having a 5/2-power area distribution which, according to theory, should yield a linear pressure rise with no front shock wave was tested. Third, the concept of providing the 5/2-power area distribution by using an off-axis slender fin below the basic body was investigated. Then a substantial portion (approximately 40 percent) of the solid fin was replaced by a heat field generated by passing heated nitrogen through the rear of the fin.
The Effects of Magnetic Nozzle Configurations on Plasma Thrusters
NASA Technical Reports Server (NTRS)
Turchi, P. J.
1997-01-01
Over the course of eight years, the Ohio State University has performed research in support of electric propulsion development efforts at the NASA Lewis Research Center, Cleveland, OH. This research has been largely devoted to plasma propulsion systems including MagnetoPlasmaDynamic (MPD) thrusters with externally-applied, solenoidal magnetic fields, hollow cathodes, and Pulsed Plasma Microthrusters (PPT's). Both experimental and theoretical work has been performed, as documented in four master's theses, two doctoral dissertations, and numerous technical papers. The present document is the final report for the grant period 5 December 1987 to 31 December 1995, and summarizes all activities. Detailed discussions of each area of activity are provided in appendices: Appendix 1 - Experimental studies of magnetic nozzle effects on plasma thrusters; Appendix 2 - Numerical modeling of applied-field MPD thrusters; Appendix 3 - Theoretical and experimental studies of hollow cathodes; and Appendix 4 -Theoretical, numerical and experimental studies of pulsed plasma thrusters. Especially notable results include the efficacy of using a solenoidal magnetic field downstream of a plasma thruster to collimate the exhaust flow, the development of a new understanding of applied-field MPD thrusters (based on experimentally-validated results from state-of-the art, numerical simulation) leading to predictions of improved performance, an experimentally-validated, first-principles model for orificed, hollow-cathode behavior, and the first time-dependent, two-dimensional calculations of ablation-fed, pulsed plasma thrusters.
Validating the BISON fuel performance code to integral LWR experiments
Williamson, R. L.; Gamble, K. A.; Perez, D. M.; ...
2016-03-24
BISON is a modern finite element-based nuclear fuel performance code that has been under development at the Idaho National Laboratory (INL) since 2009. The code is applicable to both steady and transient fuel behavior and has been used to analyze a variety of fuel forms in 1D spherical, 2D axisymmetric, or 3D geometries. Code validation is underway and is the subject of this study. A brief overview of BISON's computational framework, governing equations, and general material and behavioral models is provided. BISON code and solution verification procedures are described, followed by a summary of the experimental data used to date for validation of Light Water Reactor (LWR) fuel. Validation comparisons focus on fuel centerline temperature, fission gas release, and rod diameter both before and following fuel-clad mechanical contact. Comparisons for 35 LWR rods are consolidated to provide an overall view of how the code is predicting physical behavior, with a few select validation cases discussed in greater detail. Our results demonstrate that 1) fuel centerline temperature comparisons through all phases of fuel life are very reasonable, with deviations between predictions and experimental data within ±10% for early life through high burnup fuel and only slightly out of these bounds for power ramp experiments, 2) accuracy in predicting fission gas release appears to be consistent with state-of-the-art modeling and with the involved uncertainties, and 3) comparison of rod diameter results indicates a tendency to overpredict clad diameter reduction early in life, when clad creepdown dominates, and to more significantly overpredict the diameter increase late in life, when fuel expansion controls the mechanical response. Initial rod diameter comparisons were unsatisfactory and have led to consideration of additional separate-effects experiments to better understand and predict clad and fuel mechanical behavior.
Results from this study are being used to define priorities for ongoing code development and validation activities.
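The ±10% acceptance band used in the fuel centerline temperature comparisons can be expressed as a simple screening check. A minimal sketch, assuming hypothetical rod identifiers and temperatures (none of these values are from the BISON validation set):

```python
# Sketch: flag validation comparisons outside a +/-10% deviation band,
# in the spirit of the BISON fuel centerline temperature assessments.
# All rod IDs and temperatures below are hypothetical placeholders.

def percent_deviation(predicted, measured):
    """Signed deviation of a prediction from experiment, in percent."""
    return 100.0 * (predicted - measured) / measured

# (rod_id, predicted K, measured K) -- illustrative values only
comparisons = [
    ("rod-A", 1205.0, 1180.0),
    ("rod-B", 990.0, 1015.0),
    ("rod-C", 1410.0, 1240.0),  # deliberately outside the band
]

out_of_band = [
    (rod, percent_deviation(p, m))
    for rod, p, m in comparisons
    if abs(percent_deviation(p, m)) > 10.0
]
print(out_of_band)  # only rod-C exceeds the +/-10% criterion
```

The same pattern extends directly to fission gas release or rod diameter comparisons by swapping in the relevant measured quantity.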
Code of Federal Regulations, 2011 CFR
2011-07-01
... the validation study and subsequent use during decontamination. 761.386 Section 761.386 Protection of... (PCBs) MANUFACTURING, PROCESSING, DISTRIBUTION IN COMMERCE, AND USE PROHIBITIONS Comparison Study for... experimental conditions for the validation study and subsequent use during decontamination. The following...
Code of Federal Regulations, 2010 CFR
2010-07-01
... the validation study and subsequent use during decontamination. 761.386 Section 761.386 Protection of... (PCBs) MANUFACTURING, PROCESSING, DISTRIBUTION IN COMMERCE, AND USE PROHIBITIONS Comparison Study for... experimental conditions for the validation study and subsequent use during decontamination. The following...
Methodological convergence of program evaluation designs.
Chacón-Moscoso, Salvador; Anguera, M Teresa; Sanduvete-Chaves, Susana; Sánchez-Martín, Milagrosa
2014-01-01
The dichotomous opposition between experimental/quasi-experimental and non-experimental/ethnographic studies persists today, yet despite the extensive use of non-experimental/ethnographic studies, the most systematic work on methodological quality has been based on experimental and quasi-experimental studies. This hinders the practice of evaluators and planners in empirical program evaluation, a sphere in which the distinction between types of study is continually changing and increasingly unclear. Based on the classical validity framework of experimental/quasi-experimental studies, we review the literature to analyze the convergence of design elements bearing on methodological quality in primary studies in systematic reviews and in ethnographic research. We specify the relevant design elements that should be taken into account to improve validity and generalization in program evaluation practice across methodologies, from a practical and complementary methodological view, and we recommend ways to improve these design elements so as to enhance validity and generalization in program evaluation practice.
Validation Database Based Thermal Analysis of an Advanced RPS Concept
NASA Technical Reports Server (NTRS)
Balint, Tibor S.; Emis, Nickolas D.
2006-01-01
Advanced RPS concepts can be conceived, designed and assessed using high-end computational analysis tools. These predictions may provide an initial insight into the potential performance of these models, but verification and validation are necessary and required steps to gain confidence in the numerical analysis results. This paper discusses the findings from a numerical validation exercise for a small advanced RPS concept, based on a thermal analysis methodology developed at JPL and on a validation database obtained from experiments performed at Oregon State University. Both the numerical and experimental configurations utilized a single GPHS module enabled design, resembling a Mod-RTG concept. The analysis focused on operating and environmental conditions during the storage phase only. This validation exercise helped to refine key thermal analysis and modeling parameters, such as heat transfer coefficients, and conductivity and radiation heat transfer values. Improved understanding of the Mod-RTG concept through validation of the thermal model allows for future improvements to this power system concept.
NASA Technical Reports Server (NTRS)
Arnold, Steven M.; Goldberg, Robert K.; Lerch, Bradley A.; Saleeb, Atef F.
2009-01-01
Herein a general, multimechanism, physics-based viscoelastoplastic model is presented in the context of an integrated diagnosis and prognosis methodology which is proposed for structural health monitoring, with particular applicability to gas turbine engine structures. In this methodology, diagnostics and prognostics will be linked through state awareness variable(s). Key technologies which comprise the proposed integrated approach include (1) diagnostic/detection methodology, (2) prognosis/lifing methodology, (3) diagnostic/prognosis linkage, (4) experimental validation, and (5) material data information management system. A specific prognosis lifing methodology, experimental characterization and validation and data information management are the focal point of current activities being pursued within this integrated approach. The prognostic lifing methodology is based on an advanced multimechanism viscoelastoplastic model which accounts for both stiffness and/or strength reduction damage variables. Methods to characterize both the reversible and irreversible portions of the model are discussed. Once the multiscale model is validated the intent is to link it to appropriate diagnostic methods to provide a full-featured structural health monitoring system.
Continued Development and Validation of Methods for Spheromak Simulation
NASA Astrophysics Data System (ADS)
Benedett, Thomas
2015-11-01
The HIT-SI experiment has demonstrated stable sustainment of spheromaks; determining how the underlying physics extrapolate to larger, higher-temperature regimes is of prime importance in determining the viability of the inductively-driven spheromak. It is thus prudent to develop and validate a computational model that can be used to study current results and provide an intermediate step between theory and future experiments. A zero-beta Hall-MHD model has shown good agreement with experimental data at 14.5 kHz injector operation. Experimental observations at higher frequency, where the best performance is achieved, indicate pressure effects are important and likely required to attain quantitative agreement with simulations. Efforts to extend the existing validation to high frequency (~ 36-68 kHz) using an extended MHD model implemented in the PSI-TET arbitrary-geometry 3D MHD code will be presented. Results from verification of the PSI-TET extended MHD model using the GEM magnetic reconnection challenge will also be presented along with investigation of injector configurations for future SIHI experiments using Taylor state equilibrium calculations. Work supported by DoE.
Studying Sexual Aggression: A Review of the Evolution and Validity of Laboratory Paradigms
Davis, Kelly Cue; George, William H.; Nagayama Hall, Gordon C.; Parrott, Dominic J.; Tharp, Andra Teten; Stappenbeck, Cynthia A.
2018-01-01
Objective: Researchers have endeavored for decades to develop and implement experimental assessments of sexual aggression and its precursors to capitalize on the many scientific advantages offered by laboratory experiments, such as rigorous control of key variables and identification of causal relationships. The purpose of this review is to provide an overview of and commentary on the evolution of these laboratory-based methods. Conclusions: To date, two primary types of sexual aggression laboratory studies have been developed: those that involve behavioral analogues of sexual aggression and those that assess postulated precursors to sexually aggressive behavior. Although the study of sexual aggression in the laboratory is fraught with methodological challenges, validity concerns, and ethical considerations, advances in the field have resulted in greater methodological rigor, more precise dependent measures, and improved experimental validity, reliability, and realism. Because highly effective sexual aggression prevention strategies remain elusive, continued laboratory-based investigation of sexual aggression coupled with translation of critical findings to the development and modification of sexual aggression prevention programs remains an important task for the field. PMID:29675289
EMDataBank unified data resource for 3DEM.
Lawson, Catherine L; Patwardhan, Ardan; Baker, Matthew L; Hryc, Corey; Garcia, Eduardo Sanz; Hudson, Brian P; Lagerstedt, Ingvar; Ludtke, Steven J; Pintilie, Grigore; Sala, Raul; Westbrook, John D; Berman, Helen M; Kleywegt, Gerard J; Chiu, Wah
2016-01-04
Three-dimensional Electron Microscopy (3DEM) has become a key experimental method in structural biology for a broad spectrum of biological specimens from molecules to cells. The EMDataBank project provides a unified portal for deposition, retrieval and analysis of 3DEM density maps, atomic models and associated metadata (emdatabank.org). We provide here an overview of the rapidly growing 3DEM structural data archives, which include maps in EM Data Bank and map-derived models in the Protein Data Bank. In addition, we describe progress and approaches toward development of validation protocols and methods, working with the scientific community, in order to create a validation pipeline for 3DEM data. © The Author(s) 2015. Published by Oxford University Press on behalf of Nucleic Acids Research.
Ulmer, Candice Z; Ragland, Jared M; Koelmel, Jeremy P; Heckert, Alan; Jones, Christina M; Garrett, Timothy J; Yost, Richard A; Bowden, John A
2017-12-19
As advances in analytical separation techniques, mass spectrometry instrumentation, and data processing platforms continue to spur growth in the lipidomics field, more structurally unique lipid species are detected and annotated. The lipidomics community is in need of benchmark reference values to assess the validity of various lipidomics workflows in providing accurate quantitative measurements across the diverse lipidome. LipidQC addresses the harmonization challenge in lipid quantitation by providing a semiautomated process, independent of analytical platform, for visual comparison of experimental results of National Institute of Standards and Technology Standard Reference Material (SRM) 1950, "Metabolites in Frozen Human Plasma", against benchmark consensus mean concentrations derived from the NIST Lipidomics Interlaboratory Comparison Exercise.
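The benchmark comparison that LipidQC automates can be illustrated with a minimal sketch: checking whether an experimental concentration falls within a consensus mean plus or minus an uncertainty band. The lipid names, concentrations, and uncertainties below are hypothetical placeholders, not NIST consensus values for SRM 1950:

```python
# Sketch of a consensus-band check in the spirit of LipidQC.
# All numbers below are hypothetical, not NIST reference values.

def within_band(measured, consensus_mean, uncertainty):
    """True if a measurement lies within mean +/- uncertainty."""
    return abs(measured - consensus_mean) <= uncertainty

# lipid name -> (measured uM, consensus mean uM, uncertainty uM)
results = {
    "PC 16:0_18:1": (210.0, 200.0, 25.0),  # inside the band
    "CE 18:2": (610.0, 500.0, 60.0),       # outside the band
}

flags = {lipid: within_band(*vals) for lipid, vals in results.items()}
print(flags)
```

A real workflow would apply this check across the full panel of consensus lipids and visualize the deviations, which is the part LipidQC semiautomates.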
Experimental Validation Techniques for the Heleeos Off-Axis Laser Propagation Model
2010-03-01
EXPERIMENTAL VALIDATION TECHNIQUES FOR THE HELEEOS OFF-AXIS LASER PROPAGATION MODEL THESIS John Haiducek, 1st Lt, USAF AFIT/GAP/ENP/10-M07 DEPARTMENT...Department of Defense, or the United States Government. AFIT/GAP/ENP/10-M07 EXPERIMENTAL VALIDATION TECHNIQUES FOR THE HELEEOS OFF-AXIS LASER ...BS, Physics 1st Lt, USAF March 2010 APPROVED FOR PUBLIC RELEASE; DISTRIBUTION UNLIMITED. AFIT/GAP/ENP/10-M07 Abstract The High Energy Laser End-to-End
Shanks, Ryan A.; Robertson, Chuck L.; Haygood, Christian S.; Herdliksa, Anna M.; Herdliska, Heather R.; Lloyd, Steven A.
2017-01-01
Introductory biology courses provide an important opportunity to prepare students for future courses, yet existing cookbook labs, although important in their own way, fail to provide many of the advantages of semester-long research experiences. Engaging, authentic research experiences aid biology students in meeting many learning goals. Therefore, overlaying a research experience onto the existing lab structure allows faculty to overcome barriers involving curricular change. Here we propose a working model for this overlay design in an introductory biology course and detail a means to conduct this lab with minimal increases in student and faculty workloads. Furthermore, we conducted exploratory factor analysis of the Experimental Design Ability Test (EDAT) and uncovered two latent factors which provide valid means to assess this overlay model’s ability to increase advanced experimental design abilities. In a pre-test/post-test design, we demonstrate significant increases in both basic and advanced experimental design abilities in an experimental and comparison group. We measured significantly higher gains in advanced experimental design understanding in students in the experimental group. We believe this overlay model and EDAT factor analysis contribute a novel means to conduct and assess the effectiveness of authentic research experiences in an introductory course without major changes to the course curriculum and with minimal increases in faculty and student workloads. PMID:28904647
Herbort, Maike C.; Iseev, Jenny; Stolz, Christopher; Roeser, Benedict; Großkopf, Nora; Wüstenberg, Torsten; Hellweg, Rainer; Walter, Henrik; Dziobek, Isabel; Schott, Björn H.
2016-01-01
We present the ToMenovela, a stimulus set that has been developed to provide normatively rated socio-emotional stimuli showing a varying number of characters in emotionally laden interactions for experimental investigations of (i) cognitive and (ii) affective Theory of Mind (ToM), (iii) emotional reactivity, and (iv) complex emotion judgment with respect to Ekman's basic emotions (happiness, anger, disgust, fear, sadness, surprise; Ekman and Friesen, 1975). Stimuli were generated with a focus on ecological validity and consist of 190 scenes depicting daily-life situations. Two or more of eight main characters with distinct biographies and personalities are depicted in each scene picture. To obtain an initial evaluation of the stimulus set and to pave the way for future studies in clinical populations, normative data on each stimulus of the set were obtained from a sample of 61 neurologically and psychiatrically healthy participants (31 female, 30 male; mean age 26.74 ± 5.84), including a visual analog scale rating of Ekman's basic emotions (happiness, anger, disgust, fear, sadness, surprise) and free-text descriptions of the content of each scene. The ToMenovela is being developed to provide standardized material of social scenes that is available to researchers in the study of social cognition. It should facilitate experimental control while keeping ecological validity high. PMID:27994562
Experimental evaluation of the certification-trail method
NASA Technical Reports Server (NTRS)
Sullivan, Gregory F.; Wilson, Dwight S.; Masson, Gerald M.; Itoh, Mamoru; Smith, Warren W.; Kay, Jonathan S.
1993-01-01
Certification trails are a recently introduced and promising approach to fault detection and fault tolerance. A comprehensive attempt to assess experimentally the performance and overall value of the method is reported. The method is applied to algorithms for the following problems: Huffman tree, shortest path, minimum spanning tree, sorting, and convex hull. Our results reveal many cases in which a certification-trail approach allows significantly faster overall program execution than a basic time-redundancy approach. Algorithms for the answer-validation problem for abstract data types were also examined. This kind of problem provides a basis for applying the certification-trail method to wide classes of algorithms. Answer-validation solutions for two types of priority queues were implemented and analyzed. In both cases, the algorithm that performs answer validation is substantially faster than the original algorithm for computing the answer. Next, a probabilistic model and analysis enabling comparison between the certification-trail method and the time-redundancy approach are presented. The analysis reveals some substantial and sometimes surprising advantages for the certification-trail method. Finally, the work our group performed on the design and implementation of fault-injection testbeds for experimental analysis of the certification-trail technique is discussed. This work employs two distinct methodologies: software fault injection (modification of instruction, data, and stack segments of programs on a Sun Sparcstation ELC and on an IBM 386 PC) and hardware fault injection (control, address, and data lines of a Motorola MC68000-based target system pulsed at logical zero/one values). Our results indicate the viability of the certification-trail technique. The tools developed are also believed to provide a solid base for additional exploration.
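The answer-validation idea is easy to illustrate: a checker accepts or rejects a claimed result faster than recomputing it from scratch. A minimal sketch for a sorting answer (an analogy to, not a reproduction of, the paper's priority-queue implementations): validation costs a linear pass plus a multiset comparison, versus O(n log n) for re-sorting.

```python
# Sketch of answer validation: accept a claimed sorted result only if it
# is an ordered permutation of the input. This checker is cheaper than
# recomputing the sort, which is the core economy of certification trails.
from collections import Counter

def validate_sorted_answer(data, claimed):
    """Accept `claimed` iff it is a sorted permutation of `data`."""
    if Counter(data) != Counter(claimed):
        return False  # not the same multiset of elements
    return all(a <= b for a, b in zip(claimed, claimed[1:]))

data = [5, 1, 4, 1, 3]
print(validate_sorted_answer(data, [1, 1, 3, 4, 5]))  # True
print(validate_sorted_answer(data, [1, 3, 4, 5, 5]))  # False: wrong multiset
```

A fault in the primary computation (here, a corrupted output) is caught by the independent, cheaper check rather than by running the computation twice.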
Schütte, Judith; Wang, Huange; Antoniou, Stella; Jarratt, Andrew; Wilson, Nicola K; Riepsaame, Joey; Calero-Nieto, Fernando J; Moignard, Victoria; Basilico, Silvia; Kinston, Sarah J; Hannah, Rebecca L; Chan, Mun Chiang; Nürnberg, Sylvia T; Ouwehand, Willem H; Bonzanni, Nicola; de Bruijn, Marella FTR; Göttgens, Berthold
2016-01-01
Transcription factor (TF) networks determine cell-type identity by establishing and maintaining lineage-specific expression profiles, yet reconstruction of mammalian regulatory network models has been hampered by a lack of comprehensive functional validation of regulatory interactions. Here, we report comprehensive ChIP-Seq, transgenic and reporter gene experimental data that have allowed us to construct an experimentally validated regulatory network model for haematopoietic stem/progenitor cells (HSPCs). Model simulation coupled with subsequent experimental validation using single cell expression profiling revealed potential mechanisms for cell state stabilisation, and also how a leukaemogenic TF fusion protein perturbs key HSPC regulators. The approach presented here should help to improve our understanding of both normal physiological and disease processes. DOI: http://dx.doi.org/10.7554/eLife.11469.001 PMID:26901438
[The ethical aspects of physiological experiment].
Al'bertin, S V
2014-01-01
A modern classification of invasive procedures, developed according to international bioethical principles, is presented. The experimental data convincingly demonstrate that the use of noninvasive approaches and techniques provides a good opportunity to reduce the number of animals recruited into experiments and to preserve the animals' normal (non-distressed) physiological functions. The data presented stress that the development of noninvasive techniques is closely related to both the scientific and social aspects of research, allowing scientists to ensure high validity of the experimental data obtained while upholding humane standards.
Experimental investigation of hypersonic aerodynamics
NASA Technical Reports Server (NTRS)
Heinemann, K.; Intrieri, Peter F.
1987-01-01
An extensive series of ballistic range tests is currently being conducted at the Ames Research Center. These tests are intended to investigate the hypersonic aerodynamic characteristics of two basic configurations: the blunt-cone Galileo probe, which is scheduled to be launched in late 1989 and will enter the atmosphere of Jupiter in 1994, and a generic slender-cone configuration intended to provide experimental aerodynamic data, including good flow-field definition, which computational aerodynamicists could use to validate their computer codes. Some of the results obtained thus far are presented and work for the near future is discussed.
Aerothermal Testing for Project Orion Crew Exploration Vehicle
NASA Technical Reports Server (NTRS)
Berry, Scott A.; Horvath, Thomas J.; Lillard, Randolph P.; Kirk, Benjamin S.; Fischer-Cassady, Amy
2009-01-01
The Project Orion Crew Exploration Vehicle aerothermodynamic experimentation strategy, as it relates to flight database development, is reviewed. Experimental data has been obtained to both validate the computational predictions utilized as part of the database and support the development of engineering models for issues not adequately addressed with computations. An outline is provided of the working groups formed to address the key deficiencies in data and knowledge for blunt reentry vehicles. The facilities utilized to address these deficiencies are reviewed, along with some of the important results obtained thus far. For smooth wall comparisons of computational convective heating predictions against experimental data from several facilities, confidence was gained with the use of algebraic turbulence model solutions as part of the database. For cavities and protuberances, experimental data is being used for screening various designs, plus providing support to the development of engineering models. With the reaction-control system testing, experimental data were acquired on the surface in combination with off-body flow visualization of the jet plumes and interactions. These results are being compared against predictions for improved understanding of aftbody thermal environments and uncertainties.
NASA Astrophysics Data System (ADS)
Cosh, M. H.; Jackson, T. J.; Colliander, A.; Bindlish, R.; McKee, L.; Goodrich, D. C.; Prueger, J. H.; Hornbuckle, B. K.; Coopersmith, E. J.; Holifield Collins, C.; Smith, J.
2016-12-01
With the launch of the Soil Moisture Active Passive (SMAP) mission in 2015, a new era of soil moisture monitoring began. Soil moisture is available on a near-daily basis at a 36 km resolution for the globe. But this dataset is only valuable if its products are accurate and reliable. Therefore, to demonstrate the accuracy of the soil moisture product, NASA enacted an extensive calibration and validation program with many in situ soil moisture networks contributing data across a variety of landscape regimes. However, not all questions can be answered by these networks. As a result, two intensive field experiments were executed to provide more detailed reference points for calibration and validation. Multi-week field campaigns were conducted in Arizona and Iowa at the USDA Agricultural Research Service Walnut Gulch and South Fork Experimental Watersheds, respectively. Aircraft observations were made to provide a high-resolution data product. Soil moisture, soil roughness, and vegetation data were collected at high resolution to provide a downscaled dataset to compare against aircraft and satellite estimates.
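Satellite-versus-in-situ comparisons of this kind are commonly summarized with bias, RMSE, and unbiased RMSE (ubRMSE, which removes the systematic offset). A minimal sketch; the paired soil moisture values (m³/m³) below are hypothetical, not campaign data:

```python
# Sketch of standard soil moisture calibration/validation metrics.
# The paired satellite / in situ values are hypothetical placeholders.
from math import sqrt

def cal_val_metrics(sat, insitu):
    """Return (bias, RMSE, ubRMSE) for paired retrievals and ground truth."""
    n = len(sat)
    bias = sum(s - i for s, i in zip(sat, insitu)) / n
    rmse = sqrt(sum((s - i) ** 2 for s, i in zip(sat, insitu)) / n)
    ubrmse = sqrt(max(rmse ** 2 - bias ** 2, 0.0))  # bias removed
    return bias, rmse, ubrmse

sat    = [0.21, 0.25, 0.18, 0.30]   # satellite retrievals, m^3/m^3
insitu = [0.20, 0.22, 0.17, 0.26]   # in situ references, m^3/m^3
bias, rmse, ubrmse = cal_val_metrics(sat, insitu)
print(round(bias, 4), round(rmse, 4), round(ubrmse, 4))
```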
Validating Inertial Confinement Fusion (ICF) predictive capability using perturbed capsules
NASA Astrophysics Data System (ADS)
Schmitt, Mark; Magelssen, Glenn; Tregillis, Ian; Hsu, Scott; Bradley, Paul; Dodd, Evan; Cobble, James; Flippo, Kirk; Offerman, Dustin; Obrey, Kimberly; Wang, Yi-Ming; Watt, Robert; Wilke, Mark; Wysocki, Frederick; Batha, Steven
2009-11-01
Achieving ignition on NIF is a monumental step on the path toward utilizing fusion as a controlled energy source. Obtaining robust ignition requires accurate ICF models to predict the degradation of ignition caused by heterogeneities in capsule construction and irradiation. LANL has embarked on a project to induce controlled defects in capsules to validate our ability to predict their effects on fusion burn. These efforts include the validation of feature-driven hydrodynamics and mix in a convergent geometry. This capability is needed to determine the performance of capsules imploded under less-than-optimum conditions on future IFE facilities. LANL's recently initiated Defect Implosion Experiments (DIME) conducted at Rochester's Omega facility are providing input for these efforts. Recent simulation and experimental results will be shown.
Approach and Instrument Placement Validation
NASA Technical Reports Server (NTRS)
Ator, Danielle
2005-01-01
The Mars Exploration Rovers (MER) from the 2003 flight mission represents the state of the art technology for target approach and instrument placement on Mars. It currently takes 3 sols (Martian days) for the rover to place an instrument on a designated rock target that is about 10 to 20 m away. The objective of this project is to provide an experimentally validated single-sol instrument placement capability to future Mars missions. After completing numerous test runs on the Rocky8 rover under various test conditions, it has been observed that lighting conditions, shadow effects, target features and the initial target distance have an effect on the performance and reliability of the tracking software. Additional software validation testing will be conducted in the months to come.
Martins, Raquel R; McCracken, Andrew W; Simons, Mirre J P; Henriques, Catarina M; Rera, Michael
2018-02-05
The Smurf Assay (SA) was initially developed in the model organism Drosophila melanogaster, where a dramatic increase of intestinal permeability has been shown to occur during aging (Rera et al., 2011). We have since validated the protocol in multiple other model organisms (Dambroise et al., 2016) and have utilized the assay to further our understanding of aging (Tricoire and Rera, 2015; Rera et al., 2018). The SA has now also been used by other labs to assess intestinal barrier permeability (Clark et al., 2015; Katzenberger et al., 2015; Barekat et al., 2016; Chakrabarti et al., 2016; Gelino et al., 2016). The SA in itself is simple; however, numerous small details can have a considerable impact on its experimental validity and subsequent interpretation. Here, we provide a detailed update on the SA technique and explain how to catch a Smurf while avoiding the most common experimental fallacies.
Finite Element Vibration Modeling and Experimental Validation for an Aircraft Engine Casing
NASA Astrophysics Data System (ADS)
Rabbitt, Christopher
This thesis presents a procedure for the development and validation of a theoretical vibration model, applies this procedure to a pair of aircraft engine casings, and compares select parameters from experimental testing of those casings to those from a theoretical model using the Modal Assurance Criterion (MAC) and linear regression coefficients. A novel method of determining the optimal MAC between axisymmetric results is developed and employed. It is concluded that the dynamic finite element models developed as part of this research are fully capable of modelling the modal parameters within the frequency range of interest. Confidence intervals calculated in this research for correlation coefficients provide important information regarding the reliability of predictions, and it is recommended that these intervals be calculated for all comparable coefficients. The procedure outlined for aligning mode shapes around an axis of symmetry proved useful, and the results are promising for the development of further optimization techniques.
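The Modal Assurance Criterion used in such model-versus-test comparisons has a standard closed form, MAC = |φₐᵀφᵦ|² / ((φₐᵀφₐ)(φᵦᵀφᵦ)). A minimal sketch with illustrative mode-shape vectors (not the engine-casing data):

```python
# Sketch of the Modal Assurance Criterion (MAC) for comparing a finite
# element mode shape against an experimental one. Vectors are illustrative.

def mac(phi_a, phi_b):
    """MAC = |phi_a . phi_b|^2 / ((phi_a . phi_a) * (phi_b . phi_b))."""
    dot = sum(a * b for a, b in zip(phi_a, phi_b))
    return dot * dot / (
        sum(a * a for a in phi_a) * sum(b * b for b in phi_b)
    )

fe_mode   = [0.0, 0.31, 0.59, 0.81, 0.95, 1.00]  # model prediction
test_mode = [0.0, 0.30, 0.61, 0.79, 0.96, 0.98]  # measured shape
print(mac(fe_mode, test_mode))  # close to 1 for well-correlated modes
```

MAC is insensitive to mode-shape scaling, which is why it pairs naturally with the regression coefficients used in the thesis to capture amplitude agreement.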
Space Station UCS antenna pattern computation and measurement. [UHF Communication Subsystem
NASA Technical Reports Server (NTRS)
Hwu, Shian U.; Lu, Ba P.; Johnson, Larry A.; Fournet, Jon S.; Panneton, Robert J.; Ngo, John D.; Eggers, Donald S.; Arndt, G. D.
1993-01-01
The purpose of this paper is to analyze the interference to the Space Station Ultrahigh Frequency (UHF) Communication Subsystem (UCS) antenna radiation pattern caused by its environment, the Space Station structure. A hybrid Computational Electromagnetics (CEM) technique was applied in this study. The antenna was modeled using the Method of Moments (MOM), and the radiation patterns were computed using the Uniform Geometrical Theory of Diffraction (GTD), in which the effects of the reflected and diffracted fields from surfaces, edges, and vertices of the Space Station structures were included. To validate the CEM techniques and provide confidence in the computer-generated results, a comparison with experimental measurements was made for a 1/15-scale Space Station mockup. Good agreement between experimental and computed results was obtained. The computed results using the CEM techniques for the Space Station UCS antenna pattern predictions have been validated.
Optimal coordination and control of posture and movements.
Johansson, Rolf; Fransson, Per-Anders; Magnusson, Måns
2009-01-01
This paper presents a theoretical model of stability and coordination of posture and locomotion, together with algorithms for continuous-time quadratic optimization of motion control. Explicit solutions to the Hamilton-Jacobi equation for optimal control of rigid-body motion are obtained by solving an algebraic matrix equation. The stability is investigated with Lyapunov function theory and it is shown that global asymptotic stability holds. It is also shown how optimal control and adaptive control may act in concert in the case of unknown or uncertain system parameters. The solution describes motion strategies of minimum effort and variance. The proposed optimal control is formulated to be suitable as a posture and movement model for experimental validation and verification. The combination of adaptive and optimal control makes this algorithm a candidate for coordination and control of functional neuromuscular stimulation as well as of prostheses. Validation examples with experimental data are provided.
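For a scalar system the algebraic (Riccati-type) matrix equation mentioned above collapses to a quadratic that can be solved in closed form, which makes the structure of the optimal feedback easy to see. A minimal sketch under illustrative system parameters and weights (not the paper's rigid-body motion model):

```python
# Scalar LQR sketch: for x' = a*x + b*u with cost integral(q*x^2 + r*u^2),
# the algebraic Riccati equation (b^2/r)*p^2 - 2*a*p - q = 0 gives the
# stabilizing p, and the optimal feedback is u = -k*x with k = b*p/r.
# Parameters below are illustrative only.
from math import sqrt

def scalar_lqr_gain(a, b, q, r):
    """Stabilizing solution of the scalar algebraic Riccati equation."""
    p = r * (a + sqrt(a * a + (b * b * q) / r)) / (b * b)
    return b * p / r

k = scalar_lqr_gain(a=1.0, b=1.0, q=1.0, r=1.0)
closed_loop = 1.0 - 1.0 * k   # a - b*k; must be negative for stability
print(round(k, 4), closed_loop < 0)
```

The closed-loop pole a − bk is strictly negative, the scalar analogue of the global asymptotic stability the paper establishes via Lyapunov theory.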
NASA Astrophysics Data System (ADS)
Lee, Chang-Chun; Shih, Yan-Shin; Wu, Chih-Sheng; Tsai, Chia-Hao; Yeh, Shu-Tang; Peng, Yi-Hao; Chen, Kuang-Jung
2012-07-01
This work analyses the overall stress/strain characteristics of flexible encapsulations with organic light-emitting diode (OLED) devices. A robust methodology composed of a mechanical model of multiple thin films under bending loads and related stress simulations based on nonlinear finite element analysis (FEA) is proposed and validated against related experimental data. With various geometrical combinations of cover plate, stacked thin films and plastic substrate, the position of the neutral axis (NA) plane, which is regarded as a key design parameter for minimizing stress impact on the concerned OLED devices, is acquired using the present methodology. The results point out that both the thickness and the mechanical properties of the cover plate help determine the NA location. In addition, several concave and convex radii are applied to examine the reliable mechanical tolerance and to provide insight into the estimated reliability of foldable OLED encapsulations.
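For a layered stack in pure bending, a common first-order estimate places the neutral axis at the stiffness-weighted centroid, z_NA = Σ Eᵢtᵢz̄ᵢ / Σ Eᵢtᵢ, with z̄ᵢ the mid-height of layer i. A minimal sketch of that estimate; the layer moduli and thicknesses below are hypothetical, not the OLED stack from the paper:

```python
# Sketch: neutral-axis height of a multilayer stack under pure bending,
# using the stiffness-weighted centroid approximation. Layers are
# hypothetical (substrate / device films / cover plate), bottom first.

def neutral_axis(layers):
    """layers: list of (modulus E in GPa, thickness t in um), bottom first.
    Returns NA height above the bottom of the stack, in um."""
    z = 0.0
    num = den = 0.0
    for E, t in layers:
        zbar = z + t / 2.0    # mid-plane height of this layer
        num += E * t * zbar
        den += E * t
        z += t
    return num / den

# plastic substrate, thin device films, stiff cover plate (hypothetical)
stack = [(4.0, 100.0), (80.0, 2.0), (70.0, 50.0)]
print(round(neutral_axis(stack), 2))
```

Placing the brittle device films near this height is the design idea behind treating the NA position as the key parameter: bending strain vanishes at the neutral axis.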
A standardized set of 3-D objects for virtual reality research and applications.
Peeters, David
2018-06-01
The use of immersive virtual reality as a research tool is rapidly increasing in numerous scientific disciplines. By combining ecological validity with strict experimental control, immersive virtual reality provides the potential to develop and test scientific theories in rich environments that closely resemble everyday settings. This article introduces the first standardized database of colored three-dimensional (3-D) objects that can be used in virtual reality and augmented reality research and applications. The 147 objects have been normed for name agreement, image agreement, familiarity, visual complexity, and corresponding lexical characteristics of the modal object names. The availability of standardized 3-D objects for virtual reality research is important, because reaching valid theoretical conclusions hinges critically on the use of well-controlled experimental stimuli. Sharing standardized 3-D objects across different virtual reality labs will allow for science to move forward more quickly.
Performance optimization and validation of ADM1 simulations under anaerobic thermophilic conditions.
Atallah, Nabil M; El-Fadel, Mutasem; Ghanimeh, Sophia; Saikaly, Pascal; Abou-Najm, Majdi
2014-12-01
In this study, two experimental data sets, each involving two thermophilic anaerobic digesters treating food waste, were simulated using the Anaerobic Digestion Model No. 1 (ADM1). A sensitivity analysis was conducted, using both data sets of one digester, for parameter optimization based on five measured performance indicators: methane generation, pH, acetate, total COD, and ammonia, as well as an equally weighted combination of the five. The simulation results revealed that while optimization with respect to methane alone, a commonly adopted approach, succeeded in reproducing the methane experimental results, it predicted the other intermediary outputs less accurately. The multi-objective optimization, on the other hand, provided better overall results than methane-only optimization, although some intermediary outputs were still not fully captured. The results of the parameter optimization were validated by independent application to the data sets of the second digester. Copyright © 2014 Elsevier Ltd. All rights reserved.
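An equally weighted multi-indicator criterion of the kind described can be sketched as a mean of per-indicator normalized errors, so indicators with different units are comparable. The indicator names and normalization below are hypothetical illustrations, not ADM1's actual interface:

```python
import numpy as np

def nrmse(sim, obs):
    """Range-normalized root-mean-square error for one indicator."""
    sim, obs = np.asarray(sim, float), np.asarray(obs, float)
    return np.sqrt(np.mean((sim - obs) ** 2)) / (obs.max() - obs.min())

def combined_objective(simulated, observed, indicators):
    """Equally weighted mean of per-indicator NRMSEs, so indicators
    with different units (e.g. mL CH4, pH units, mg/L) contribute
    comparably to the calibration objective."""
    return float(np.mean([nrmse(simulated[k], observed[k])
                          for k in indicators]))

indicators = ["methane", "pH", "acetate", "total_COD", "ammonia"]
```

A parameter search would then minimize `combined_objective` instead of the methane-only error.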
Liu, Guohai; Yang, Junqin; Chen, Ming; Chen, Qian
2014-01-01
A fault-tolerant permanent-magnet vernier (FT-PMV) machine is designed for direct-drive applications, incorporating the merits of high torque density and high reliability. Based on the so-called magnetic gearing effect, PMV machines have the ability of high torque density by introducing the flux-modulation poles (FMPs). This paper investigates the fault-tolerant characteristic of PMV machines and provides a design method, which is able to not only meet the fault-tolerant requirements but also keep the ability of high torque density. The operation principle of the proposed machine has been analyzed. The design process and optimization are presented specifically, such as the combination of slots and poles, the winding distribution, and the dimensions of PMs and teeth. By using the time-stepping finite element method (TS-FEM), the machine performances are evaluated. Finally, the FT-PMV machine is manufactured, and the experimental results are presented to validate the theoretical analysis.
Monte Carlo simulation of Ray-Scan 64 PET system and performance evaluation using GATE toolkit
NASA Astrophysics Data System (ADS)
Li, Suying; Zhang, Qiushi; Vuletic, Ivan; Xie, Zhaoheng; Yang, Kun; Ren, Qiushi
2017-02-01
In this study, we aimed to develop a GATE model for the simulation of the Ray-Scan 64 PET scanner and to model its performance characteristics. A detailed implementation of the system geometry and physical processes was included in the simulation model. We then modeled the performance characteristics of the Ray-Scan 64 PET system for the first time, based on National Electrical Manufacturers Association (NEMA) NU-2 2007 protocols, and validated the model against experimental measurements, including spatial resolution, sensitivity, counting rates, and noise equivalent count rate (NECR). Moreover, an accurate dead time module was investigated to simulate the counting rate performance. Overall, the results showed reasonable agreement between simulation and experimental data. The validation results showed the reliability and feasibility of the GATE model for evaluating the major performance characteristics of the Ray-Scan 64 PET system. It provides a useful tool for a wide range of research applications.
Investigating Mechanisms of Chronic Kidney Disease in Mouse Models
Eddy, Allison A.; Okamura, Daryl M.; Yamaguchi, Ikuyo; López-Guisa, Jesús M.
2011-01-01
Animal models of chronic kidney disease (CKD) are important experimental tools that are used to investigate novel mechanistic pathways and to validate potential new therapeutic interventions prior to clinical testing in humans. Over the past several years, mouse CKD models have been extensively used for these purposes. Despite significant limitations, the model of unilateral ureteral obstruction (UUO) has essentially become the high-throughput in vivo model, as it recapitulates the fundamental pathogenetic mechanisms that typify all forms of CKD in a relatively short time span. In addition, several alternative mouse models are available that can be used to validate new mechanistic paradigms and/or novel therapies. Several models are reviewed – both genetic and experimentally induced – that provide investigators with an opportunity to include renal functional study end-points together with quantitative measures of fibrosis severity, something that is not possible with the UUO model. PMID:21695449
Hypersonic Magneto-Fluid-Dynamic Compression in Cylindrical Inlet
NASA Technical Reports Server (NTRS)
Shang, Joseph S.; Chang, Chau-Lyan
2007-01-01
Hypersonic magneto-fluid-dynamic interaction has been successfully performed as a virtual leading-edge strake and a virtual cowl of a cylindrical inlet. In a side-by-side experimental and computational study, the magnitude of the induced compression was found to depend on the configuration and electrode placement. To better understand the interaction phenomenon, the present investigation focuses on a direct-current discharge at the leading edge of a cylindrical inlet for which validating experimental data are available. The present computational result is obtained by solving the magneto-fluid-dynamic equations in the low magnetic Reynolds number limit and using a nonequilibrium weakly ionized gas model based on drift-diffusion theory. The numerical simulation provides a detailed description of the intriguing physics. After validation with experimental measurements, the computed results further quantify the effectiveness of magneto-fluid-dynamic compression for a hypersonic cylindrical inlet. A minuscule power input to a direct-current surface discharge of 8.14 watts per square centimeter of electrode area produces an additional compression of 6.7 percent for a constant cross-section cylindrical inlet.
Reader, Arran T; Holmes, Nicholas P
2016-01-01
Social interaction is an essential part of the human experience, and much work has been done to study it. However, several common approaches to examining social interactions in psychological research may inadvertently either unnaturally constrain the observed behaviour by causing it to deviate from naturalistic performance, or introduce unwanted sources of variance. In particular, these sources are the differences between naturalistic and experimental behaviour that occur from changes in visual fidelity (quality of the observed stimuli), gaze (whether it is controlled for in the stimuli), and social potential (potential for the stimuli to provide actual interaction). We expand on these possible sources of extraneous variance and why they may be important. We review the ways in which experimenters have developed novel designs to remove these sources of extraneous variance. New experimental designs using a 'two-person' approach are argued to be one of the most effective ways to develop more ecologically valid measures of social interaction, and we suggest that future work on social interaction should use these designs wherever possible.
Small-scale experimental study of vaporization flux of liquid nitrogen released on water.
Gopalaswami, Nirupama; Olewski, Tomasz; Véchot, Luc N; Mannan, M Sam
2015-10-30
A small-scale experimental study was conducted using liquid nitrogen to investigate the convective heat transfer behavior of cryogenic liquids released on water. The experiment was performed by spilling five different amounts of liquid nitrogen at different release rates and initial water temperatures. The vaporization mass fluxes of liquid nitrogen were determined directly from the mass loss measured during the experiment. A variation of the initial vaporization fluxes and a subsequent shift in the heat transfer mechanism were observed with changes in initial water temperature. The initial vaporization fluxes depended directly on the liquid nitrogen spill rate. The heat flux from water to liquid nitrogen determined from the experimental data was validated with two theoretical correlations for convective boiling. This validation also showed that the liquid nitrogen was predominantly in the film boiling regime. The results provide a suitable procedure for predicting the heat flux from water to cryogenic liquids that is required for source term modeling. Copyright © 2015 Elsevier B.V. All rights reserved.
NASA Technical Reports Server (NTRS)
Suzen, Y. B.; Huang, P. G.; Ashpis, D. E.; Volino, R. J.; Corke, T. C.; Thomas, F. O.; Huang, J.; Lake, J. P.; King, P. I.
2007-01-01
A transport equation for the intermittency factor is employed to predict transitional flows in low-pressure turbines. The intermittent behavior of the transitional flows is taken into account and incorporated into the computations by modifying the eddy viscosity, mu(sub t), with the intermittency factor, gamma. Turbulent quantities are predicted using Menter's two-equation (SST) turbulence model. The intermittency factor is obtained from a transport equation model which can produce both the experimentally observed streamwise variation of intermittency and a realistic profile in the cross-stream direction. The model had previously been validated against low-pressure turbine experiments with success. In this paper, the model is applied to predictions of three sets of recent low-pressure turbine experiments on the Pack B blade to further validate its predictive capabilities under various flow conditions. Comparisons of computational results with experimental data are provided. Overall, good agreement between the experimental data and computational results is obtained. The new model has been shown to be capable of accurately predicting transitional flows under a wide range of low-pressure turbine conditions.
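The central modification described above, scaling the fully turbulent eddy viscosity by the local intermittency factor, can be sketched in a few lines. This is a schematic illustration only, not the actual solver implementation:

```python
def effective_eddy_viscosity(mu_t, gamma):
    """Blend laminar and fully turbulent behavior: the fully turbulent
    eddy viscosity mu_t (e.g. from the SST model) is scaled by the
    local intermittency factor gamma in [0, 1]. gamma = 0 recovers
    laminar flow; gamma = 1 recovers the fully turbulent value, so
    transition appears as gamma grows downstream of onset."""
    if not 0.0 <= gamma <= 1.0:
        raise ValueError("intermittency factor must lie in [0, 1]")
    return gamma * mu_t
```

In a flow solver this scaling would be applied cell by cell, with gamma supplied by its own transport equation.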
A simplified approach to characterizing a kilovoltage source spectrum for accurate dose computation.
Poirier, Yannick; Kouznetsov, Alexei; Tambasco, Mauro
2012-06-01
To investigate and validate the clinical feasibility of using half-value layer (HVL) and peak tube potential (kVp) for characterizing a kilovoltage (kV) source spectrum for the purpose of computing kV x-ray dose accrued from imaging procedures. To use this approach to characterize a Varian® On-Board Imager® (OBI) source and perform experimental validation of a novel in-house hybrid dose computation algorithm for kV x-rays. We characterized the spectrum of an imaging kV x-ray source using the HVL and the kVp as the sole beam quality identifiers using third-party freeware Spektr to generate the spectra. We studied the sensitivity of our dose computation algorithm to uncertainties in the beam's HVL and kVp by systematically varying these spectral parameters. To validate our approach experimentally, we characterized the spectrum of a Varian® OBI system by measuring the HVL using a Farmer-type Capintec ion chamber (0.06 cc) in air and compared dose calculations using our computationally validated in-house kV dose calculation code to measured percent depth-dose and transverse dose profiles for 80, 100, and 125 kVp open beams in a homogeneous phantom and a heterogeneous phantom comprising tissue, lung, and bone equivalent materials. The sensitivity analysis of the beam quality parameters (i.e., HVL, kVp, and field size) on dose computation accuracy shows that typical measurement uncertainties in the HVL and kVp (±0.2 mm Al and ±2 kVp, respectively) source characterization parameters lead to dose computation errors of less than 2%. Furthermore, for an open beam with no added filtration, HVL variations affect dose computation accuracy by less than 1% for a 125 kVp beam when field size is varied from 5 × 5 cm(2) to 40 × 40 cm(2). The central axis depth dose calculations and experimental measurements for the 80, 100, and 125 kVp energies agreed within 2% for the homogeneous and heterogeneous block phantoms, and agreement for the transverse dose profiles was within 6%. 
The HVL and kVp are sufficient for characterizing a kV x-ray source spectrum for accurate dose computation. As these parameters can be easily and accurately measured, they provide a clinically feasible approach to characterizing a kV energy spectrum for patient-specific x-ray dose computations. Furthermore, these results provide experimental validation of our novel hybrid dose computation algorithm. © 2012 American Association of Physicists in Medicine.
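The HVL that anchors this spectrum characterization can be computed for any tabulated spectrum from per-bin Beer-Lambert attenuation. A minimal sketch with hypothetical values; this is not Spektr's API, and the energy-dependent detector response is ignored for simplicity:

```python
import numpy as np
from scipy.optimize import brentq

def hvl_mm_al(fluence, mu_al_per_mm):
    """Half-value layer (mm Al) of a beam, given per-energy-bin photon
    fluences and linear attenuation coefficients of Al (per mm).

    Transmission through t mm of Al follows per-bin Beer-Lambert
    attenuation:  T(t) = sum_i phi_i * exp(-mu_i * t) / sum_i phi_i,
    and the HVL is the thickness t at which T(t) = 0.5."""
    fluence = np.asarray(fluence, float)
    mu = np.asarray(mu_al_per_mm, float)

    def transmission(t):
        return np.sum(fluence * np.exp(-mu * t)) / np.sum(fluence)

    # Root-find T(t) = 0.5 over a bracket where T crosses 0.5.
    return brentq(lambda t: transmission(t) - 0.5, 1e-9, 1e3)
```

For a monoenergetic bin the result reduces to the textbook ln(2)/mu, which makes the routine easy to sanity-check.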
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mattsson, Ann E.
Density Functional Theory (DFT) based Equation of State (EOS) construction is a prominent part of Sandia’s capabilities to support engineering sciences. This capability is based on augmenting experimental data with information gained from computational investigations, especially in those parts of the phase space where experimental data is hard, dangerous, or expensive to obtain. A key part of the success of the Sandia approach is the fundamental science work supporting the computational capability. Not only does this work enhance the capability to perform highly accurate calculations, but it also provides crucial insight into the limitations of the computational tools, providing high confidence in the results even where results cannot be, or have not yet been, validated by experimental data. This report concerns the key ingredient of projector augmented-wave (PAW) potentials for use in pseudo-potential computational codes. Using the tools discussed in SAND2012-7389, we assess the standard Vienna Ab-initio Simulation Package (VASP) PAWs for Molybdenum.
Statistical Methodologies to Integrate Experimental and Computational Research
NASA Technical Reports Server (NTRS)
Parker, P. A.; Johnson, R. T.; Montgomery, D. C.
2008-01-01
Development of advanced algorithms for simulating engine flow paths requires the integration of fundamental experiments with the validation of enhanced mathematical models. In this paper, we provide an overview of statistical methods to strategically and efficiently conduct experiments and computational model refinement. Moreover, the integration of experimental and computational research efforts is emphasized. With a statistical engineering perspective, scientific and engineering expertise is combined with statistical sciences to gain deeper insights into experimental phenomena and code development performance, supporting the overall research objectives. The particular statistical methods discussed are design of experiments, response surface methodology, and uncertainty analysis and planning. Their application is illustrated with a coaxial free jet experiment and a turbulence model refinement investigation. Our goal is to provide an overview, focusing on concepts rather than practice, to demonstrate the benefits of using statistical methods in research and development, thereby encouraging their broader and more systematic application.
Photogrammetric Technique for Center of Gravity Determination
NASA Technical Reports Server (NTRS)
Jones, Thomas W.; Johnson, Thomas H.; Shemwell, Dave; Shreves, Christopher M.
2012-01-01
A new measurement technique for determination of the center of gravity (CG) for large scale objects has been demonstrated. The experimental method was conducted as part of an LS-DYNA model validation program for the Max Launch Abort System (MLAS) crew module. The test was conducted on the full scale crew module concept at NASA Langley Research Center. Multi-camera photogrammetry was used to measure the test article in several asymmetric configurations. The objective of these measurements was to provide validation of the CG as computed from the original mechanical design. The methodology, measurement technique, and measurement results are presented.
Calibrated Blade-Element/Momentum Theory Aerodynamic Model of the MARIN Stock Wind Turbine: Preprint
DOE Office of Scientific and Technical Information (OSTI.GOV)
Goupee, A.; Kimball, R.; de Ridder, E. J.
2015-04-02
In this paper, a calibrated blade-element/momentum theory aerodynamic model of the MARIN stock wind turbine is developed and documented. The model is created using open-source software and calibrated to closely emulate experimental data obtained by the DeepCwind Consortium using a genetic algorithm optimization routine. The provided model will be useful for those interested in validating floating wind turbine numerical simulators that rely on experiments utilizing the MARIN stock wind turbine—for example, the International Energy Agency Wind Task 30’s Offshore Code Comparison Collaboration Continued, with Correlation project.
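The genetic-algorithm calibration step can be illustrated schematically. The sketch below fits a single scalar parameter of a stand-in model to synthetic data; the model, bounds, and GA settings are all hypothetical and are not those of the actual MARIN turbine calibration:

```python
import random

def calibrate(model, observed, xs, bounds, pop=30, gens=40, seed=0):
    """Tiny genetic-algorithm-style search for a single scalar model
    parameter minimizing squared error against experimental data."""
    rng = random.Random(seed)
    lo, hi = bounds

    def err(p):
        return sum((model(x, p) - y) ** 2 for x, y in zip(xs, observed))

    population = [rng.uniform(lo, hi) for _ in range(pop)]
    for _ in range(gens):
        population.sort(key=err)
        parents = population[: pop // 4]               # truncation selection
        children = []
        while len(children) < pop - len(parents):
            a, b = rng.sample(parents, 2)
            child = 0.5 * (a + b)                      # midpoint crossover
            child += rng.gauss(0.0, 0.05 * (hi - lo))  # Gaussian mutation
            children.append(min(hi, max(lo, child)))   # clip to bounds
        population = parents + children                # elitism: parents kept
    return min(population, key=err)

# Hypothetical stand-in model: "thrust" proportional to wind speed squared.
model = lambda x, p: p * x ** 2
xs = [4.0, 6.0, 8.0, 10.0, 12.0]
observed = [model(x, 1.7) for x in xs]   # synthetic "experimental" data
p_best = calibrate(model, observed, xs, bounds=(0.0, 5.0))
```

The real calibration optimizes airfoil polars against measured rotor loads; the structure of the search, however, is the same.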
Linear relations in microbial reaction systems: a general overview of their origin, form, and use.
Noorman, H J; Heijnen, J J; Luyben, K Ch A M
1991-09-01
In microbial reaction systems, there are a number of linear relations among net conversion rates. These can be very useful in the analysis of experimental data. This article provides a general approach for the formulation and application of these linear relations. Two types of system description are encountered: one considering the biomass as a black box, and the other based on metabolic pathways. Both are defined in a linear vector and matrix algebra framework. A correct a priori description can be obtained by three useful tests: the independency, consistency, and observability tests, whose outcomes differ for the two descriptions. The black box approach provides only conservation relations. They are derived from element, electrical charge, energy, and Gibbs energy balances. The metabolic approach provides, in addition to the conservation relations, metabolic and reaction relations. These result from component, energy, and Gibbs energy balances. It is thus more attractive to use the metabolic description than the black box approach. A number of different types of linear relations given in the literature are reviewed. They are classified according to the categories that result from the black box or the metabolic system description. Validation of hypotheses related to metabolic pathways can be supported by experimental validation of the linear metabolic relations. However, definite proof from biochemical evidence remains indispensable.
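The consistency test mentioned above amounts to checking that the measured net conversion rates lie in the null space of an elemental composition matrix. A minimal black-box sketch, with a hypothetical biomass composition (CH1.8O0.5N0.2) and hypothetical rates:

```python
import numpy as np

# Black-box conversion: glucose (CH2O, per C-mol) + O2 + NH3 ->
# biomass (hypothetical composition CH1.8O0.5N0.2) + CO2 + H2O.
# Columns: glucose, O2, NH3, biomass, CO2, H2O; rows: C, H, O, N.
E = np.array([
    [1.0, 0.0, 0.0, 1.0, 1.0, 0.0],   # carbon balance
    [2.0, 0.0, 3.0, 1.8, 0.0, 2.0],   # hydrogen balance
    [1.0, 2.0, 0.0, 0.5, 2.0, 1.0],   # oxygen balance
    [0.0, 0.0, 1.0, 0.2, 0.0, 0.0],   # nitrogen balance
])

def is_consistent(rates, tol=1e-6):
    """Consistency test: net conversion rates (negative = consumed)
    must satisfy every elemental conservation relation, E @ r = 0,
    within the tolerance allowed by measurement error."""
    return bool(np.all(np.abs(E @ np.asarray(rates, float)) < tol))

# A consistent rate vector (per C-mol glucose consumed), as an example:
r = [-1.0, -0.475, -0.1, 0.5, 0.5, 0.7]
```

Charge, energy, and Gibbs energy balances would simply add further rows to E.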
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rohatgi, Upendra S.
Nuclear reactor codes require validation with appropriate data representing the plant for specific scenarios. The thermal-hydraulic data are scattered in different locations and in different formats. Some of the data are in danger of being lost. A relational database is being developed to organize the international thermal-hydraulic test data for various reactor concepts and different scenarios. At the reactor system level, the data are organized to include separate effect tests and integral effect tests for specific scenarios and corresponding phenomena. The database relies on the phenomena identification sections of expert-developed PIRTs. The database will provide a summary of appropriate data, a review of facility information, test descriptions, instrumentation, references for the experimental data, and some examples of application of the data for validation. The current database platform includes scenarios for PWR, BWR, VVER, and specific benchmarks for CFD modelling data, and is to be expanded to include references for molten salt reactors. There are placeholders for high temperature gas cooled reactors, CANDU, and liquid metal reactors. This relational database is called The International Experimental Thermal Hydraulic Systems (TIETHYS) database and currently resides at the Nuclear Energy Agency (NEA) of the OECD and is freely open to public access. Going forward, the database will be extended to include additional links and data as they become available. https://www.oecd-nea.org/tiethysweb/
Validation of Cross Sections for Monte Carlo Simulation of the Photoelectric Effect
NASA Astrophysics Data System (ADS)
Han, Min Cheol; Kim, Han Sung; Pia, Maria Grazia; Basaglia, Tullio; Batič, Matej; Hoff, Gabriela; Kim, Chan Hyeong; Saracco, Paolo
2016-04-01
Several total and partial photoionization cross section calculations, based on both theoretical and empirical approaches, are quantitatively evaluated with statistical analyses using a large collection of experimental data retrieved from the literature to identify the state of the art for modeling the photoelectric effect in Monte Carlo particle transport. Some of the examined cross section models are available in general purpose Monte Carlo systems, while others have been implemented and subjected to validation tests for the first time to estimate whether they could improve the accuracy of particle transport codes. The validation process identifies Scofield's 1973 non-relativistic calculations, tabulated in the Evaluated Photon Data Library (EPDL), as the one best reproducing experimental measurements of total cross sections. Specialized total cross section models, some of which derive from more recent calculations, do not provide significant improvements. Scofield's non-relativistic calculations are not surpassed regarding the compatibility with experiment of K and L shell photoionization cross sections either, although in a few test cases Ebel's parameterization produces more accurate results close to absorption edges. Modifications to Biggs and Lighthill's parameterization implemented in Geant4 significantly reduce the accuracy of total cross sections at low energies with respect to its original formulation. The scarcity of suitable experimental data hinders a similar extensive analysis for the simulation of the photoelectron angular distribution, which is limited to a qualitative appraisal.
Non-Linear System Identification for Aeroelastic Systems with Application to Experimental Data
NASA Technical Reports Server (NTRS)
Kukreja, Sunil L.
2008-01-01
Representation and identification of a non-linear aeroelastic pitch-plunge system as a model of the NARMAX class is considered. A non-linear difference equation describing this aircraft model is derived theoretically and shown to be of the NARMAX form. Identification methods for NARMAX models are applied to aeroelastic dynamics and its properties demonstrated via continuous-time simulations of experimental conditions. Simulation results show that (i) the outputs of the NARMAX model match closely those generated using continuous-time methods and (ii) NARMAX identification methods applied to aeroelastic dynamics provide accurate discrete-time parameter estimates. Application of NARMAX identification to experimental pitch-plunge dynamics data gives a high percent fit for cross-validated data.
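Because a NARMAX/NARX model is nonlinear in the signals but linear in its parameters, identification reduces to ordinary least squares on lagged regressors. A toy sketch with a hypothetical model structure and coefficients, not the paper's pitch-plunge model:

```python
import numpy as np

def identify_narx(u, y):
    """Estimate the parameters of the toy NARX structure
        y[k] = a1*y[k-1] + a2*y[k-2] + b1*u[k-1] + c1*y[k-1]*u[k-1]
    by linear least squares: the model is nonlinear in the signals
    but linear in the parameters, so ordinary lstsq applies."""
    Y = y[2:]
    Phi = np.column_stack([y[1:-1], y[:-2], u[1:-1], y[1:-1] * u[1:-1]])
    theta, *_ = np.linalg.lstsq(Phi, Y, rcond=None)
    return theta

# Simulate noise-free data from known parameters, then re-identify them.
rng = np.random.default_rng(1)
u = rng.uniform(-1.0, 1.0, 500)
y = np.zeros(500)
true = np.array([0.5, -0.2, 0.8, 0.1])
for k in range(2, 500):
    y[k] = (true[0] * y[k-1] + true[1] * y[k-2]
            + true[2] * u[k-1] + true[3] * y[k-1] * u[k-1])
theta = identify_narx(u, y)
```

On noise-free data the estimates recover the true coefficients essentially exactly; with measurement noise, the same regression yields the kind of cross-validated fit the abstract reports.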
Bed inventory overturn in a circulating fluid bed riser with pant-leg structure
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jinjing Li; Wei Wang; Hairui Yang
2009-05-15
The special phenomenon termed bed inventory overturn in a circulating fluid bed (CFB) riser with pant-leg structure was studied with model calculations and experimental work. A compound pressure drop mathematical model was developed and validated with the experimental data from a cold experimental test rig. The model calculation results agree well with the measured data. In addition, the intensity of bed inventory overturn is directly proportional to the fluidizing velocity and inversely proportional to the branch point height. The results of the present study provide significant information for the design and operation of a CFB boiler with pant-leg structure. 15 refs., 10 figs., 1 tab.
Prediction of physical protein-protein interactions
NASA Astrophysics Data System (ADS)
Szilágyi, András; Grimm, Vera; Arakaki, Adrián K.; Skolnick, Jeffrey
2005-06-01
Many essential cellular processes such as signal transduction, transport, cellular motion and most regulatory mechanisms are mediated by protein-protein interactions. In recent years, new experimental techniques have been developed to discover the protein-protein interaction networks of several organisms. However, the accuracy and coverage of these techniques have proven to be limited, and computational approaches remain essential both to assist in the design and validation of experimental studies and for the prediction of interaction partners and detailed structures of protein complexes. Here, we provide a critical overview of existing structure-independent and structure-based computational methods. Although these techniques have significantly advanced in the past few years, we find that most of them are still in their infancy. We also provide an overview of experimental techniques for the detection of protein-protein interactions. Although the developments are promising, false positive and false negative results are common, and reliable detection is possible only by taking a consensus of different experimental approaches. The shortcomings of experimental techniques affect both the further development and the fair evaluation of computational prediction methods. For an adequate comparative evaluation of prediction and high-throughput experimental methods, an appropriately large benchmark set of biophysically characterized protein complexes would be needed, but is sorely lacking.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-08-31
... factors as the approved models, are validated by experimental test data, and receive the Administrator's... stage of the MEP involves applying the model against a database of experimental test cases including..., particularly the requirement for validation by experimental test data. That guidance is based on the MEP's...
Rabbah, Jean-Pierre; Saikrishnan, Neelakantan; Yoganathan, Ajit P
2013-02-01
Numerical models of the mitral valve have been used to elucidate mitral valve function and mechanics. These models have evolved from simple two-dimensional approximations to complex three-dimensional fully coupled fluid structure interaction models. However, to date these models lack direct one-to-one experimental validation. As computational solvers vary considerably, experimental benchmark data are critically important to ensure model accuracy. In this study, a novel left heart simulator was designed specifically for the validation of numerical mitral valve models. Several distinct experimental techniques were collectively performed to resolve mitral valve geometry and hemodynamics. In particular, micro-computed tomography was used to obtain accurate and high-resolution (39 μm voxel) native valvular anatomy, which included the mitral leaflets, chordae tendinae, and papillary muscles. Three-dimensional echocardiography was used to obtain systolic leaflet geometry. Stereoscopic digital particle image velocimetry provided all three components of fluid velocity through the mitral valve, resolved every 25 ms in the cardiac cycle. A strong central filling jet (V ~ 0.6 m/s) was observed during peak systole with minimal out-of-plane velocities. In addition, physiologic hemodynamic boundary conditions were defined and all data were synchronously acquired through a central trigger. Finally, the simulator is a precisely controlled environment, in which flow conditions and geometry can be systematically prescribed and resultant valvular function and hemodynamics assessed. Thus, this work represents the first comprehensive database of high fidelity experimental data, critical for extensive validation of mitral valve fluid structure interaction simulations.
Rabbah, Jean-Pierre; Saikrishnan, Neelakantan; Yoganathan, Ajit P.
2012-01-01
Numerical models of the mitral valve have been used to elucidate mitral valve function and mechanics. These models have evolved from simple two-dimensional approximations to complex three-dimensional fully coupled fluid structure interaction models. However, to date these models lack direct one-to-one experimental validation. As computational solvers vary considerably, experimental benchmark data are critically important to ensure model accuracy. In this study, a novel left heart simulator was designed specifically for the validation of numerical mitral valve models. Several distinct experimental techniques were collectively performed to resolve mitral valve geometry and hemodynamics. In particular, micro-computed tomography was used to obtain accurate and high-resolution (39 µm voxel) native valvular anatomy, which included the mitral leaflets, chordae tendinae, and papillary muscles. Three-dimensional echocardiography was used to obtain systolic leaflet geometry for direct comparison of resultant leaflet kinematics. Stereoscopic digital particle image velocimetry provided all three components of fluid velocity through the mitral valve, resolved every 25 ms in the cardiac cycle. A strong central filling jet (V ~ 0.6 m/s) was observed during peak systole, with minimal out-of-plane velocities. In addition, physiologic hemodynamic boundary conditions were defined and all data were synchronously acquired through a central trigger. Finally, the simulator is a precisely controlled environment, in which flow conditions and geometry can be systematically prescribed and resultant valvular function and hemodynamics assessed. Thus, these data represent the first comprehensive database of high fidelity experimental data, critical for extensive validation of mitral valve fluid structure interaction simulations. PMID:22965640
Development of a model counter-rotating type horizontal-axis tidal turbine
NASA Astrophysics Data System (ADS)
Huang, B.; Yoshida, K.; Kanemoto, T.
2016-05-01
In the past decade, tidal energy has attracted worldwide attention as it can provide a regular and predictable renewable energy resource for power generation. The majority of technologies for exploiting tidal stream energy are based on the concept of the horizontal-axis tidal turbine (HATT). A unique counter-rotating type HATT was proposed in the present work. The original blade profiles were designed according to the developed blade element momentum theory (BEMT). CFD simulations and experimental tests were adopted to evaluate the performance of the model counter-rotating type HATT. The experimental data provide evidence for validation of the CFD model. Further optimization of the blade profiles was also carried out based on the CFD results.
Equally parsimonious pathways through an RNA sequence space are not equally likely
NASA Technical Reports Server (NTRS)
Lee, Y. H.; DSouza, L. M.; Fox, G. E.
1997-01-01
An experimental system for determining the potential ability of sequences resembling 5S ribosomal RNA (rRNA) to perform as functional 5S rRNAs in vivo in the Escherichia coli cellular environment was devised previously. Presumably, the only 5S rRNA sequences that would have been fixed by ancestral populations are ones that were functionally valid, and hence the actual historical paths taken through RNA sequence space during 5S rRNA evolution would have most likely utilized valid sequences. Herein, we examine the potential validity of all sequence intermediates along alternative equally parsimonious trajectories through RNA sequence space which connect two pairs of sequences that had previously been shown to behave as valid 5S rRNAs in E. coli. The first trajectory requires a total of four changes. The 14 sequence intermediates provide 24 apparently equally parsimonious paths by which the transition could occur. The second trajectory involves three changes, six intermediate sequences, and six potentially equally parsimonious paths. In total, only eight of the 20 sequence intermediates were found to be clearly invalid. As a consequence of the position of these invalid intermediates in the sequence space, seven of the 30 possible paths consisted of exclusively valid sequences. In several cases, the apparent validity/invalidity of the intermediate sequences could not be anticipated on the basis of current knowledge of the 5S rRNA structure. This suggests that the interdependencies in RNA sequence space may be more complex than currently appreciated. If ancestral sequences predicted by parsimony are to be regarded as actual historical sequences, then the present results would suggest that they should also satisfy a validity requirement and that, in at least limited cases, this conjecture can be tested experimentally.
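The path counts quoted above (24 paths for four changes, six for three) are simply the orderings of the substitutions; invalid intermediates then prune some orderings. A small sketch of that counting, with a hypothetical invalid set:

```python
from itertools import permutations

def valid_paths(n_changes, invalid_intermediates):
    """Count orderings of n_changes substitutions in which no
    intermediate sequence (a proper subset of applied changes) falls
    in the invalid set; the two endpoint sequences are assumed valid.
    Intermediates are represented as frozensets of change indices."""
    count = 0
    for order in permutations(range(n_changes)):
        applied = set()
        ok = True
        for step in order[:-1]:   # the final step yields the valid endpoint
            applied.add(step)
            if frozenset(applied) in invalid_intermediates:
                ok = False
                break
        if ok:
            count += 1
    return count
```

With an empty invalid set this gives 4! = 24 and 3! = 6 paths, matching the trajectories in the study; marking intermediates invalid reproduces the pruning that left only seven of the 30 paths fully valid.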
Experimental investigation of an RNA sequence space
NASA Technical Reports Server (NTRS)
Lee, Youn-Hyung; Dsouza, Lisa; Fox, George E.
1993-01-01
Modern rRNAs are the historic consequence of an ongoing evolutionary exploration of a sequence space. These extant sequences belong to a special subset of the sequence space that is comprised only of those primary sequences that can validly perform the biological function(s) required of the particular RNA. If it were possible to readily identify all such valid sequences, stochastic predictions could be made about the relative likelihood of various evolutionary pathways available to an RNA. Herein an experimental system which can assess whether a particular sequence is likely to have validity as a eubacterial 5S rRNA is described. A total of ten naturally occurring, and hence known to be valid, sequences and two point mutants of unknown validity were used to test the usefulness of the approach. Nine of the ten valid sequences tested positive whereas both mutants tested as clearly defective. The tenth valid sequence gave results that would be interpreted as reflecting a borderline status were the answer not known. These results demonstrate that it is possible to experimentally determine which sequences in local regions of the sequence space are potentially valid 5S rRNAs.
Methodological issues in microdialysis sampling for pharmacokinetic studies.
de Lange, E C; de Boer, A G; Breimer, D D
2000-12-15
Microdialysis is an in vivo technique that permits monitoring of local concentrations of drugs and metabolites at specific sites in the body. Microdialysis has several characteristics that make it an attractive tool for pharmacokinetic research. About a decade ago the microdialysis technique entered the field of pharmacokinetic research, first in the brain and later also in peripheral tissues and blood. Within this period much has been learned about the proper use of this technique. Today it has outgrown its early teething problems, and its potential and limitations have become more or less well defined. As microdialysis is a delicate technique, for which experimental factors appear to be critical to the validity of the experimental outcomes, several factors should be considered. These include the probe; the perfusion solution; the post-surgery interval in relation to surgical trauma, tissue integrity and repeated experiments; the analysis of microdialysate samples; and the quantification of microdialysate data. Provided that experimental conditions are optimized to give valid and quantitative results, microdialysis can provide numerous data points from a relatively small number of individual animals to determine detailed pharmacokinetic information. One of the added values of this technique compared with other in vivo pharmacokinetic techniques is that microdialysis reflects free concentrations in tissues and plasma. This provides the opportunity to assess drug transport and equilibration across membranes such as the blood-brain barrier, which has already provided new insights. With the progress of analytical methodology, especially with respect to low-volume/low-concentration measurements and simultaneous measurement of multiple compounds, the applications and importance of the microdialysis technique in pharmacokinetic research will continue to increase.
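The quantification step mentioned above typically converts measured dialysate concentrations into free tissue concentrations via a probe recovery factor. A minimal sketch with invented numbers, assuming a simple in vitro recovery calibration (not the authors' specific quantification method):

```python
# Dialysate concentrations underestimate true extracellular levels because
# the probe samples only a fraction of the analyte. A relative recovery
# factor (here assumed to come from a hypothetical in vitro calibration)
# is used to back-calculate the free tissue concentration.
def free_concentration(dialysate_conc, relative_recovery):
    """Free tissue concentration from dialysate concentration and recovery."""
    if not 0 < relative_recovery <= 1:
        raise ValueError("relative recovery must be in (0, 1]")
    return dialysate_conc / relative_recovery

# e.g. 12 ng/mL measured in dialysate with 30% in vitro recovery
print(free_concentration(12.0, 0.30))  # ~40 ng/mL free tissue concentration
```

In practice recovery depends on perfusion rate, probe membrane and tissue, which is exactly why the abstract flags quantification as a critical experimental factor.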
Experimental Validation and Combustion Modeling of a JP-8 Surrogate in a Single Cylinder Diesel Engine
Shrestha, Amit; Joshi, Umashankar; Zheng, Ziliang; Badawy, Tamer; Henein, Naeim A. (Wayne State University, Detroit, MI, USA)
2014-04-15
The objective of this work is to validate a two-component JP-8 surrogate in a single cylinder diesel engine. Validation parameters include ignition delay.
Experimental validation of tape springs to be used as thin-walled space structures
NASA Astrophysics Data System (ADS)
Oberst, S.; Tuttle, S. L.; Griffin, D.; Lambert, A.; Boyce, R. R.
2018-04-01
With the advent of standardised launch geometries and off-the-shelf payloads, space programs utilising nano-satellite platforms are growing worldwide. Thin-walled, flexible and self-deployable structures are commonly used for antennae, instrument booms or solar panels owing to their low weight, ideal packaging characteristics and near-zero energy consumption. However, their behaviour in space, in particular in Low Earth Orbits with continually changing environmental conditions, raises many questions. Accurate numerical models, which are often not available due to the difficulty of experimental testing under 1g conditions, are needed to answer these questions. In this study, we present on-earth experimental validation as a starting point to study the response of a tape spring, as a representative of thin-walled flexible structures, under static and vibrational loading. Material parameters of tape springs in a singly curved (straight, open-cylinder) and a doubly curved design are compared to each other by combining finite element calculations with experimental laser vibrometry within single- and multi-stage model updating approaches. While the determination of the Young's modulus is unproblematic, the damping is found to be inversely proportional to deployment length. With updated material properties, the buckling instability margin is calculated using different slenderness ratios. Results indicate a high sensitivity of thin-walled structures to minuscule perturbations, which makes proper experimental testing a key requirement for stability prediction of thin elastic space structures. The doubly curved tape spring provides closer agreement with experimental results than the straight tape spring design.
DOE Office of Scientific and Technical Information (OSTI.GOV)
English, Shawn A.; Briggs, Timothy M.; Nelson, Stacy M.
Simulations of low velocity impact with a flat cylindrical indenter upon a carbon fiber fabric reinforced polymer laminate are rigorously validated. Comparison of the impact energy absorption between model and experiment is used as the validation metric. Additionally, non-destructive evaluation, including ultrasonic scans and three-dimensional computed tomography, provides qualitative validation of the models. The simulations include delamination, matrix cracks and fiber breaks. An orthotropic damage and failure constitutive model, capable of predicting progressive damage and failure, is developed and described. An ensemble of simulations incorporating model parameter uncertainties is used to predict a response distribution, which is then compared to experimental output using appropriate statistical methods. Lastly, the model form errors are exposed and corrected for use in an additional blind validation analysis. The result is a quantifiable confidence in material characterization and model physics when simulating low velocity impact in structures of interest.
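The ensemble idea described above, propagating model parameter uncertainty through repeated simulations and comparing the predicted response distribution with experiment, can be sketched as follows. The stand-in "model", the parameter distributions and all numbers are invented for illustration; they are not the paper's constitutive model or data:

```python
import random
import statistics

# Stand-in for the impact simulation: a trivial function of two uncertain
# material parameters. The real model would be the orthotropic damage
# simulation; this is illustration only.
def model(stiffness, strength):
    return 0.4 * strength + 0.1 * stiffness  # "energy absorbed", invented

rng = random.Random(1)
# Sample parameters from assumed uncertainty distributions and build the
# ensemble of predicted responses.
ensemble = [model(rng.gauss(70.0, 3.5), rng.gauss(50.0, 5.0)) for _ in range(1000)]

mu = statistics.fmean(ensemble)
sigma = statistics.pstdev(ensemble)
experiment = 27.5  # hypothetical measured energy absorption
z = (experiment - mu) / sigma  # simple distance-based validation metric
```

A small |z| means the measurement falls well inside the predicted distribution; more rigorous metrics (area metrics, hypothesis tests) follow the same pattern of distribution-to-observation comparison.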
NASA Technical Reports Server (NTRS)
Bellan, Josette; Harstad, Kenneth; Ohsaka, Kenichi
2003-01-01
Although the high pressure multicomponent fluid conservation equations have already been derived and approximately validated for binary mixtures by this PI, the validation of the multicomponent theory is hampered by the lack of existing mixing rules for property calculations. Classical gas dynamics theory can provide property mixing-rules at low pressures exclusively. While thermal conductivity and viscosity high-pressure mixing rules have been documented in the literature, there is no such equivalent for the diffusion coefficients and the thermal diffusion factors. The primary goal of this investigation is to extend the low pressure mixing rule theory to high pressures and validate the new theory with experimental data from levitated single drops. The two properties that will be addressed are the diffusion coefficients and the thermal diffusion factors. To validate/determine the property calculations, ground-based experiments from levitated drops are being conducted.
TH-CD-207A-08: Simulated Real-Time Image Guidance for Lung SBRT Patients Using Scatter Imaging
DOE Office of Scientific and Technical Information (OSTI.GOV)
Redler, G; Cifter, G; Templeton, A
2016-06-15
Purpose: To develop a comprehensive Monte Carlo-based model for the acquisition of scatter images of patient anatomy in real time during lung SBRT treatment. Methods: During SBRT treatment, images of patient anatomy can be acquired from scattered radiation. To rigorously examine the utility of scatter images for image guidance, a model is developed using MCNP code to simulate scatter images of phantoms and lung cancer patients. The model is validated by comparing experimental and simulated images of phantoms of different complexity. The differentiation between tissue types is investigated by imaging objects of known compositions (water, lung, and bone equivalent). A lung tumor phantom, simulating materials and geometry encountered during lung SBRT treatments, is used to investigate image noise properties for various quantities of delivered radiation (monitor units (MU)). Patient scatter images are simulated using the validated simulation model. 4DCT patient data are converted to an MCNP input geometry accounting for different tissue compositions and densities. Lung tumor phantom images acquired with decreasing imaging time (decreasing MU) are used to model the expected noise amplitude in patient scatter images, producing realistic simulated patient scatter images with varying temporal resolution. Results: Image intensity in simulated and experimental scatter images of tissue equivalent objects (water, lung, bone) match within the uncertainty (∼3%). Lung tumor phantom images agree as well; specifically, tumor-to-lung contrast matches within the uncertainty. The addition of random noise approximating quantum noise in experimental images to simulated patient images shows that scatter imaging of lung tumors can provide images in as little as 0.5 seconds with CNR∼2.7. Conclusions: A scatter imaging simulation model is developed and validated using experimental phantom scatter images. Following validation, lung cancer patient scatter images are simulated. These simulated patient images demonstrate the clinical utility of scatter imaging for real-time tumor tracking during lung SBRT.
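The contrast-to-noise ratio used above to judge image quality is conventionally the tumor/background mean difference divided by the background noise. A minimal sketch with invented pixel values (not the study's data):

```python
import statistics

# CNR = |mean(tumor ROI) - mean(background ROI)| / sd(background ROI).
# Pixel values below are made up for illustration.
def cnr(tumor_pixels, background_pixels):
    noise = statistics.pstdev(background_pixels)
    contrast = abs(statistics.fmean(tumor_pixels) - statistics.fmean(background_pixels))
    return contrast / noise

print(cnr([10, 11, 9, 10], [5, 6, 4, 5]))  # contrast 5 over noise ~0.707
```

Because noise grows as imaging time (MU) shrinks, CNR is the natural quantity for trading temporal resolution against detectability, as in the 0.5-second result quoted above.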
Utilizing Metalized Fabrics for Liquid and Rip Detection and Localization
DOE Office of Scientific and Technical Information (OSTI.GOV)
Holland, Stephen; Mahan, Cody; Kuhn, Michael J
2013-01-01
This paper proposes a novel technique for utilizing conductive textiles as a distributed sensor for detecting and localizing liquids (e.g., blood), rips (e.g., bullet holes), and potentially biosignals. The proposed technique is verified through both simulation and experimental measurements. Circuit theory is utilized to depict conductive fabric as a bounded, near-infinite grid of resistors. Solutions to the well-known infinite resistance grid problem are used to confirm the accuracy and validity of this modeling approach. Simulations allow for discontinuities to be placed within the resistor matrix to illustrate the effects of bullet holes within the fabric. A real-time experimental system was developed that uses a multiplexed Wheatstone bridge approach to reconstruct the resistor grid across the conductive fabric and detect liquids and rips. The resistor grid model is validated through a comparison of simulated and experimental results. Results suggest accuracy proportional to the electrode spacing in determining the presence and location of discontinuities in conductive fabric samples. Future work is focused on refining the experimental system to provide more accuracy in detecting and localizing events as well as developing a complete prototype that can be deployed for field testing. Potential applications include intelligent clothing; flexible, lightweight sensing systems; and combat wound detection.
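The resistor-grid model referenced above can be reproduced in miniature: nodal analysis on a finite grid of 1-ohm resistors approaches the classic infinite-grid result that the effective resistance between adjacent nodes tends to R/2. The grid size and node choice below are illustrative, not the authors' configuration:

```python
# Effective resistance between two nodes on an n x n grid of 1-ohm
# resistors, via nodal analysis (Kirchhoff) and Gaussian elimination.
def grid_resistance(n, a, b):
    """Effective resistance between flat-index nodes a and b."""
    idx = lambda i, j: i * n + j
    m = n * n
    G = [[0.0] * m for _ in range(m)]  # conductance (Laplacian) matrix
    for i in range(n):
        for j in range(n):
            u = idx(i, j)
            for di, dj in ((0, 1), (1, 0)):
                ni, nj = i + di, j + dj
                if ni < n and nj < n:
                    v = idx(ni, nj)
                    G[u][u] += 1.0; G[v][v] += 1.0
                    G[u][v] -= 1.0; G[v][u] -= 1.0
    # Ground node 0: drop its row and column, then solve G_red V = I.
    A = [row[1:] for row in G[1:]]
    rhs = [0.0] * (m - 1)
    if a != 0: rhs[a - 1] += 1.0   # inject 1 A at node a
    if b != 0: rhs[b - 1] -= 1.0   # extract 1 A at node b
    k = m - 1
    for col in range(k):           # forward elimination with partial pivoting
        p = max(range(col, k), key=lambda r: abs(A[r][col]))
        A[col], A[p] = A[p], A[col]
        rhs[col], rhs[p] = rhs[p], rhs[col]
        for r in range(col + 1, k):
            f = A[r][col] / A[col][col]
            for c in range(col, k):
                A[r][c] -= f * A[col][c]
            rhs[r] -= f * rhs[col]
    x = [0.0] * k                  # back substitution
    for r in range(k - 1, -1, -1):
        s = rhs[r] - sum(A[r][c] * x[c] for c in range(r + 1, k))
        x[r] = s / A[r][r]
    V = [0.0] + x
    return V[a] - V[b]             # voltage drop per unit current = resistance

n = 10
center = (n // 2) * n + n // 2          # a node near the grid centre
r = grid_resistance(n, center, center + 1)  # adjacent horizontal neighbour
print(round(r, 3))  # infinite-grid theory gives exactly 0.5 ohm
```

Cutting edges out of `G` (setting their conductances to zero) is the simulation analogue of the paper's bullet-hole discontinuities, which shift the solved voltage pattern and so reveal the damage location.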
DOE Office of Scientific and Technical Information (OSTI.GOV)
Manera, Annalisa; Corradini, Michael; Petrov, Victor
This project has focused on experimental and numerical investigations of water-cooled and air-cooled Reactor Cavity Cooling System (RCCS) designs. To this end, we have leveraged an existing experimental facility at the University of Wisconsin-Madison (UW), and we have designed and built a separate effect test facility at the University of Michigan. The experimental facility at UW has undergone several upgrades, including the installation of advanced instrumentation (i.e., wire-mesh sensors) built at the University of Michigan. These provide high-resolution, time-resolved measurements of the void-fraction distribution in the risers of the water-cooled RCCS facility. A phenomenological model has been developed to assess the water-cooled RCCS system stability and determine the root cause of the oscillatory behavior that occurs under normal two-phase operation. Testing under various perturbations to the water-cooled RCCS facility has resulted in changes in the stability of the integral system. In particular, the effects on stability of inlet orifices, water tank volume and system pressure have been investigated. MELCOR was used as a predictive tool when performing inlet orificing tests and was able to capture the Density Wave Oscillations (DWOs) that occurred upon reaching saturation in the risers. The experimental and numerical results have then been used to provide RCCS design recommendations. The experimental facility built at the University of Michigan was aimed at the investigation of mixing in the upper plenum of the air-cooled RCCS design. The facility has been equipped with state-of-the-art high-resolution instrumentation to achieve so-called CFD-grade experiments that can be used for the validation of Computational Fluid Dynamics (CFD) models, both RANS (Reynolds-Averaged) and LES (Large Eddy Simulation). The effect of riser penetration into the upper plenum has been investigated as well.
Nuclear Energy Knowledge and Validation Center (NEKVaC) Needs Workshop Summary Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gougar, Hans
2015-02-01
The Department of Energy (DOE) has made significant progress in developing simulation tools to predict the behavior of nuclear systems with greater accuracy and in increasing our capability to predict the behavior of these systems outside of the standard range of applications. These analytical tools require a more complex array of validation tests to accurately simulate the physics across multiple length and time scales. Results from modern simulations will allow experiment designers to narrow the range of conditions needed to bound system behavior and to optimize the deployment of instrumentation to limit the breadth and cost of the campaign. Modern validation, verification and uncertainty quantification (VVUQ) techniques enable analysts to extract information from experiments in a systematic manner and provide the users with a quantified uncertainty estimate. Unfortunately, the capability to perform experiments that would enable taking full advantage of the formalisms of these modern codes has progressed relatively little (with some notable exceptions in fuels and thermal-hydraulics); the majority of the experimental data available today is the "historic" data accumulated over the last decades of nuclear systems R&D. A validated code-model is a tool for users; an unvalidated code-model is useful for code developers to gain understanding, publish research results, attract funding, etc. As nuclear analysis codes have become more sophisticated, so have the measurement and validation methods and the challenges that confront them. A successful yet cost-effective validation effort requires expertise possessed only by a few, resources possessed only by the well-capitalized (or a willing collective), and a clear, well-defined objective (validating a code that is developed to satisfy the need(s) of an actual user).
To that end, the Idaho National Laboratory established the Nuclear Energy Knowledge and Validation Center (NEKVaC or the 'Center') to address the challenges of modern code validation and to manage the knowledge from past, current, and future experimental campaigns. By pulling together the best minds involved in code development, experiment design, and validation to establish and disseminate best practices and new techniques, the Center will be a resource for industry, DOE programs, and academia validation efforts.
2011-01-01
Work was begun by a Research Associate at ARL with WRA and largely completed more recently at the Dept. of Chem., SUNY, Cortland, NY. The report provides an extensive, definitive review critically assessing our current understanding of DZ structure and chemistry.
2017-11-01
The Under-body Blast Methodology (UBM) for the Test and Evaluation (T&E) program was established to provide a capability for the US Army Test and Evaluation Command to assess the vulnerability of vehicles to under-body blast. Finite element (FE) models are part of the current UBM for T&E methodology.
Design data needs: Modular high-temperature gas-cooled reactor. Revision 2
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1987-03-01
The Design Data Needs (DDNs) provide summary statements, for program management, of the designer's need for experimental data to confirm or validate assumptions made in the design. These assumptions were developed using the Integrated Approach and are tabulated in the Functional Analysis Report. They were also necessary in the analyses or trade studies (A/TS) used to develop selections of hardware design or design requirements. Each DDN includes statements providing traceability to the function and the associated assumption that requires the need.
Mingus Discontinuous Multiphysics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pat Notz, Dan Turner
Mingus provides hybrid coupled local/non-local mechanics analysis capabilities that extend several traditional methods to applications with inherent discontinuities. Its primary features include adaptations of solid mechanics, fluid dynamics and digital image correlation that naturally accommodate disjointed data or irregular solution fields by assimilating a variety of discretizations (such as control volume finite elements, peridynamics and meshless control point clouds). The goal of this software is to provide an analysis framework for multiphysics engineering problems with an integrated image correlation capability that can be used for experimental validation and model
What can music tell us about social interaction?
D'Ausilio, Alessandro; Novembre, Giacomo; Fadiga, Luciano; Keller, Peter E
2015-03-01
Humans are innately social creatures, but cognitive neuroscience, which has traditionally focused on individual brains, is only now beginning to investigate social cognition through realistic interpersonal interaction. Music provides an ideal domain for doing so because it offers a promising solution for balancing the trade-off between ecological validity and experimental control when testing cognitive and brain functions. Musical ensembles constitute a microcosm that provides a platform for parametrically modeling the complexity of human social interaction. Copyright © 2015 Elsevier Ltd. All rights reserved.
Proceedings of the Twenty-Third Annual Software Engineering Workshop
NASA Technical Reports Server (NTRS)
1999-01-01
The Twenty-third Annual Software Engineering Workshop (SEW) provided 20 presentations designed to further the goals of the Software Engineering Laboratory (SEL) of NASA-GSFC. The presentations were selected for their creativity. The sessions, held on 2-3 December 1998, centered on the SEL, Experimentation, Inspections, Fault Prediction, Verification and Validation, and Embedded Systems and Safety-Critical Systems.
ERIC Educational Resources Information Center
Donmoyer, Robert; Galloway, Fred
2010-01-01
In recent years, policy makers and researchers once again have embraced the traditional idea that quasi-experimental research designs (or reasonable facsimiles) can provide the sort of valid and generalizable knowledge about "what works" that educational researchers had promised--but never really produced--during the previous century. Although…
ERIC Educational Resources Information Center
Davis, Gregory J.; Gibson, Bradley S.
2012-01-01
Voluntary shifts of attention are often motivated in experimental contexts by using well-known symbols that accurately predict the direction of targets. The authors report 3 experiments, which showed that the presentation of predictive spatial information does not provide sufficient incentive to elicit voluntary shifts of attention. For instance,…
New Reactor Physics Benchmark Data in the March 2012 Edition of the IRPhEP Handbook
DOE Office of Scientific and Technical Information (OSTI.GOV)
John D. Bess; J. Blair Briggs; Jim Gulliford
2012-11-01
The International Reactor Physics Experiment Evaluation Project (IRPhEP) was established to preserve integral reactor physics experimental data, including separate or special effects data, for nuclear energy and technology applications. Numerous experiments that have been performed worldwide represent a large investment of infrastructure, expertise, and cost, and are valuable resources of data for present and future research. These valuable assets provide the basis for recording, development, and validation of methods. If the experimental data are lost, the high cost to repeat many of these measurements may be prohibitive. The purpose of the IRPhEP is to provide an extensively peer-reviewed set of reactor physics-related integral data that can be used by reactor designers and safety analysts to validate the analytical tools used to design next-generation reactors and establish the safety basis for operation of these reactors. Contributors from around the world collaborate in the evaluation and review of selected benchmark experiments for inclusion in the International Handbook of Evaluated Reactor Physics Benchmark Experiments (IRPhEP Handbook) [1]. Several new evaluations have been prepared for inclusion in the March 2012 edition of the IRPhEP Handbook.
NASA Technical Reports Server (NTRS)
Marvin, Joseph G.; Brown, James L.; Gnoffo, Peter A.
2013-01-01
A database compilation of hypersonic shock-wave/turbulent boundary layer (SWTBL) experiments is provided. The experiments selected for the database are either 2D or axisymmetric, and include both compression-corner and impinging-type SWTBL interactions. The strength of the interactions ranges from attached to incipient separation to fully separated flows. The experiments were chosen based on criteria that ensure the quality of the datasets, relevance to NASA's missions, and usefulness for validation and uncertainty assessment of CFD Navier-Stokes predictive methods, both now and in the future. Emphasis in the selected datasets was on surface pressures and surface heating throughout the interaction, but some wall shear stress distributions and flowfield profiles are included. Also included, for selected cases, are example CFD grids and setup information, along with surface pressure and wall heating results from simulations using current NASA real-gas Navier-Stokes codes, against which future CFD investigators can compare and evaluate physics modeling improvements and validation and uncertainty assessments of future CFD code developments. The experimental database is presented in tables in the Appendices describing each experiment. The database is also provided in computer-readable ASCII files located on a companion DVD.
NASA Astrophysics Data System (ADS)
van Ness, Katherine; Hill, Craig; Aliseda, Alberto; Polagye, Brian
2017-11-01
Experimental measurements of a 0.45-m diameter, variable-pitch marine hydrokinetic (MHK) turbine were collected in a tow tank at different tip speed ratios and blade pitch angles. The coefficients of power and thrust are computed from direct measurements of torque, force and angular speed at the hub. Loads on individual blades were measured with a six-degree-of-freedom load cell mounted at the root of one of the turbine blades. This information is used to validate the performance predictions provided by blade element model (BEM) simulations used in the turbine design, specifically the open-source code WTPerf developed by the National Renewable Energy Lab (NREL). Predictions of blade and hub loads by NREL's AeroDyn are also validated for the first time for an axial-flow MHK turbine. The influence of design twist angle, combined with the variable pitch angle, on flow separation and subsequent blade loading will be analyzed with the complementary information from simulations and experiments. Funding for this research was provided by the United States Naval Facilities Engineering Command.
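The performance coefficients mentioned above are conventionally nondimensionalized as C_P = Qω/(½ρAU³) and C_T = T/(½ρAU²), with tip speed ratio λ = ωR/U. A minimal helper with invented measurement values; the fresh-water density default is an assumption, not the tow tank's actual conditions:

```python
import math

# Nondimensional turbine performance from hub-level measurements:
# torque Q [N·m], thrust T [N], angular speed omega [rad/s],
# inflow speed u_inf [m/s], rotor radius [m], water density rho [kg/m^3].
def coefficients(torque, thrust, omega, u_inf, radius, rho=1000.0):
    area = math.pi * radius ** 2
    q = 0.5 * rho * area * u_inf ** 2      # dynamic pressure times swept area
    cp = torque * omega / (q * u_inf)      # power coefficient: Q*omega / (0.5*rho*A*U^3)
    ct = thrust / q                        # thrust coefficient: T / (0.5*rho*A*U^2)
    tsr = omega * radius / u_inf           # tip speed ratio
    return cp, ct, tsr

# Hypothetical tow-tank point for the 0.45-m (0.225-m radius) rotor:
cp, ct, tsr = coefficients(torque=5.0, thrust=60.0, omega=7.0, u_inf=1.0, radius=0.225)
print(round(cp, 3), round(ct, 3), round(tsr, 3))
```

BEM validation then amounts to comparing such measured (C_P, C_T) points against the code's predictions at the same λ and pitch angle.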
Cloud computing and validation of expandable in silico livers
2010-01-01
Background In Silico Livers (ISLs) are works in progress. They are used to challenge multilevel, multi-attribute, mechanistic hypotheses about the hepatic disposition of xenobiotics coupled with hepatic responses. To enhance ISL-to-liver mappings, we added discrete time metabolism, biliary elimination, and bolus dosing features to a previously validated ISL and initiated re-validation experiments that required scaling to use more simulated lobules than previously, more than could be achieved using the local cluster technology. Rather than dramatically increasing the size of our local cluster, we undertook the re-validation experiments using the Amazon EC2 cloud platform. Doing so required demonstrating the efficacy of scaling a simulation to use more cluster nodes and assessing the scientific equivalence of local cluster validation experiments with those executed using the cloud platform. Results The local cluster technology was duplicated in the Amazon EC2 cloud platform. Synthetic modeling protocols were followed to identify a successful parameterization. Experiment sample sizes (number of simulated lobules) on both platforms were 49, 70, 84, and 152 (cloud only). Experimental indistinguishability was demonstrated for ISL outflow profiles of diltiazem using both platforms for experiments consisting of 84 or more samples. The process was analogous to demonstrating results equivalency from two different wet-labs. Conclusions The results provide additional evidence that disposition simulations using ISLs can cover the behavior space of liver experiments in distinct experimental contexts (there is in silico-to-wet-lab phenotype similarity). The scientific value of experimenting with multiscale biomedical models has been limited to research groups with access to computer clusters.
The availability of cloud technology coupled with the evidence of scientific equivalency has lowered the barrier and will greatly facilitate model sharing as well as provide straightforward tools for scaling simulations to encompass greater detail with no extra investment in hardware. PMID:21129207
Integrating cell biology and proteomic approaches in plants.
Takáč, Tomáš; Šamajová, Olga; Šamaj, Jozef
2017-10-03
Significant improvements of protein extraction, separation, mass spectrometry and bioinformatics nurtured advancements of proteomics during the past years. The usefulness of proteomics in the investigation of biological problems can be enhanced by integration with other experimental methods from cell biology, genetics, biochemistry, pharmacology, molecular biology and other omics approaches including transcriptomics and metabolomics. This review aims to summarize current trends integrating cell biology and proteomics in plant science. Cell biology approaches are most frequently used in proteomic studies investigating subcellular and developmental proteomes; however, they have also been employed in proteomic studies exploring abiotic and biotic stress responses, vesicular transport, cytoskeleton and protein posttranslational modifications. They are used either for detailed cellular or ultrastructural characterization of the object subjected to proteomic study, validation of proteomic results or to expand proteomic data. In this respect, a broad spectrum of methods is employed to support proteomic studies, including ultrastructural electron microscopy studies, histochemical staining, immunochemical localization, in vivo imaging of fluorescently tagged proteins and visualization of protein-protein interactions. Thus, cell biological observations on fixed or living cell compartments, cells, tissues and organs are feasible, and in some cases fundamental, for the validation and complementation of proteomic data. Validation of proteomic data by independent experimental methods requires development of new complementary approaches. Benefits of cell biology methods and techniques are not sufficiently highlighted in current proteomic studies. This encouraged us to review the most popular cell biology methods used in proteomic studies and to evaluate their relevance and potential for proteomic data validation and enrichment of purely proteomic analyses.
We also provide examples of representative studies combining proteomic and cell biology methods for various purposes. Integrating cell biology approaches with proteomic ones allows validation and better interpretation of proteomic data. Moreover, cell biology methods remarkably extend the knowledge provided by proteomic studies and might be fundamental for the functional complementation of proteomic data. This review article summarizes current literature linking proteomics with cell biology. Copyright © 2017 Elsevier B.V. All rights reserved.
Yoder, Keith J; Belmonte, Matthew K
2010-12-16
Experimental paradigms are valuable insofar as the timing and other parameters of their stimuli are well specified and controlled, and insofar as they yield data relevant to the cognitive processing that occurs under ecologically valid conditions. These two goals often are at odds, since well controlled stimuli often are too repetitive to sustain subjects' motivation. Studies employing electroencephalography (EEG) are often especially sensitive to this dilemma between ecological validity and experimental control: attaining sufficient signal-to-noise in physiological averages demands large numbers of repeated trials within lengthy recording sessions, limiting the subject pool to individuals with the ability and patience to perform a set task over and over again. This constraint severely limits researchers' ability to investigate younger populations as well as clinical populations associated with heightened anxiety or attentional abnormalities. Even adult, non-clinical subjects may not be able to achieve their typical levels of performance or cognitive engagement: an unmotivated subject for whom an experimental task is little more than a chore is not the same, behaviourally, cognitively, or neurally, as a subject who is intrinsically motivated and engaged with the task. A growing body of literature demonstrates that embedding experiments within video games may provide a way between the horns of this dilemma between experimental control and ecological validity. The narrative of a game provides a more realistic context in which tasks occur, enhancing their ecological validity (Chaytor & Schmitter-Edgecombe, 2003). Moreover, this context provides motivation to complete tasks. In our game, subjects perform various missions to collect resources, fend off pirates, intercept communications or facilitate diplomatic relations. 
In so doing, they also perform an array of cognitive tasks, including a Posner attention-shifting paradigm (Posner, 1980), a go/no-go test of motor inhibition, a psychophysical motion coherence threshold task, the Embedded Figures Test (Witkin, 1950, 1954) and a theory-of-mind (Wimmer & Perner, 1983) task. The game software automatically registers game stimuli and subjects' actions and responses in a log file, and sends event codes to synchronise with physiological data recorders. Thus the game can be combined with physiological measures such as EEG or fMRI, and with moment-to-moment tracking of gaze. Gaze tracking can verify subjects' compliance with behavioural tasks (e.g. fixation) and overt attention to experimental stimuli, and can also index physiological arousal as reflected in pupil dilation (Bradley et al., 2008). At great enough sampling frequencies, gaze tracking may also help assess covert attention as reflected in microsaccades - eye movements that are too small to foveate a new object, but are as rapid in onset and have the same relationship between angular distance and peak velocity as do saccades that traverse greater distances. The distribution of directions of microsaccades correlates with the (otherwise) covert direction of attention (Hafed & Clark, 2002).
From Single-Cell Dynamics to Scaling Laws in Oncology
NASA Astrophysics Data System (ADS)
Chignola, Roberto; Sega, Michela; Stella, Sabrina; Vyshemirsky, Vladislav; Milotti, Edoardo
We are developing a biophysical model of tumor biology. We follow a strictly quantitative approach where each step of model development is validated by comparing simulation outputs with experimental data. While this strategy may slow down our advancements, at the same time it provides an invaluable reward: we can trust simulation outputs and use the model to explore territories of cancer biology where current experimental techniques fail. Here, we review our multi-scale biophysical modeling approach and show how a description of cancer at the cellular level has led us to general laws obeyed by both in vitro and in vivo tumors.
Fracture Test Methods for Plastically Responding COPV Liners
NASA Technical Reports Server (NTRS)
Dawicke, David S.; Lewis, Joseph C.
2009-01-01
An experimental procedure is described for evaluating the validity of using uniaxial tests to provide a conservative bound on the fatigue crack growth rate behavior of small cracks in bi-axially loaded Composite Overwrapped Pressure Vessel (COPV) liners. The experimental procedure included the use of a laser notch to quickly generate small surface fatigue cracks with the desired sizes and aspect ratios. An out-of-plane constraint system was designed to allow fully reversed, fully plastic testing of thin-sheet uniaxial coupons. Finally, a method was developed to initiate small cracks in the liner of COPVs.
Experimentally validated modification to Cook-Torrance BRDF model for improved accuracy
NASA Astrophysics Data System (ADS)
Butler, Samuel D.; Ethridge, James A.; Nauyoks, Stephen E.; Marciniak, Michael A.
2017-09-01
The BRDF describes optical scatter off realistic surfaces. The microfacet BRDF model assumes geometric optics but is computationally simple compared to wave optics models. In this work, MERL BRDF data are fitted to the original Cook-Torrance microfacet model and to a modified Cook-Torrance model that uses the polarization factor in place of the mathematically problematic cross-section conversion and geometric attenuation terms. The results provide experimental evidence that this modified Cook-Torrance model leads to improved fits, particularly at large incident and scattered angles. These results are expected to lead to more accurate BRDF modeling for remote sensing.
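For readers unfamiliar with the baseline model, here is a minimal sketch of the classic Cook-Torrance specular term, D·F·G / (4 cosθi cosθo). This is the original model, not the authors' polarization-factor modification, and the Beckmann roughness `m` and Fresnel normal reflectance `f0` defaults are illustrative assumptions:

```python
import math

def beckmann_D(cos_h, m):
    # Beckmann microfacet distribution for half-vector angle cos_h, roughness m
    c2 = cos_h * cos_h
    t2 = (1.0 - c2) / c2
    return math.exp(-t2 / (m * m)) / (math.pi * m * m * c2 * c2)

def fresnel_schlick(cos_d, f0):
    # Schlick approximation to the Fresnel reflectance
    return f0 + (1.0 - f0) * (1.0 - cos_d) ** 5

def geometric_attenuation(cos_i, cos_o, cos_h, cos_d):
    # classic V-cavity shadowing/masking term
    return min(1.0, 2.0 * cos_h * cos_i / cos_d, 2.0 * cos_h * cos_o / cos_d)

def cook_torrance(cos_i, cos_o, cos_h, cos_d, m=0.3, f0=0.04):
    # specular BRDF: D * F * G / (4 cos_i cos_o)
    D = beckmann_D(cos_h, m)
    F = fresnel_schlick(cos_d, f0)
    G = geometric_attenuation(cos_i, cos_o, cos_h, cos_d)
    return D * F * G / (4.0 * cos_i * cos_o)
```

The angles enter only through cosines: incident direction, scattered direction, half-vector against the normal, and the angle between the half-vector and the incident ray.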
Uncertainty Assessment of Hypersonic Aerothermodynamics Prediction Capability
NASA Technical Reports Server (NTRS)
Bose, Deepak; Brown, James L.; Prabhu, Dinesh K.; Gnoffo, Peter; Johnston, Christopher O.; Hollis, Brian
2011-01-01
The present paper provides the background of a focused effort to assess uncertainties in predictions of heat flux and pressure in hypersonic flight (airbreathing or atmospheric entry) using state-of-the-art aerothermodynamics codes. The assessment is performed for four mission-relevant problems: (1) shock/turbulent boundary layer interaction on a compression corner, (2) shock/turbulent boundary layer interaction due to an impinging shock, (3) high-mass Mars entry and aerocapture, and (4) high-speed return to Earth. A validation-based uncertainty assessment approach with reliance on subject matter expertise is used. A code verification exercise with code-to-code comparisons and comparisons against well-established correlations is also included in this effort. A thorough review of the literature in search of validation experiments is performed, which identified a scarcity of ground-based validation experiments at hypersonic conditions. In particular, a shortage of usable experimental data at flight-like enthalpies and Reynolds numbers is found. The uncertainty was quantified using metrics that measure the discrepancy between model predictions and experimental data. The discrepancy data are statistically analyzed and investigated for physics-based trends in order to define a meaningful quantified uncertainty. The detailed uncertainty assessment of each mission-relevant problem is found in the four companion papers.
Final Design and Experimental Validation of the Thermal Performance of the LHC Lattice Cryostats
NASA Astrophysics Data System (ADS)
Bourcey, N.; Capatina, O.; Parma, V.; Poncet, A.; Rohmig, P.; Serio, L.; Skoczen, B.; Tock, J.-P.; Williams, L. R.
2004-06-01
The recent commissioning and operation of the LHC String 2 have given a first experimental validation of the global thermal performance of the LHC lattice cryostat at nominal cryogenic conditions. The cryostat, designed to minimize the heat inleak from ambient temperature, houses under vacuum and thermally protects the cold mass, which contains the LHC twin-aperture superconducting magnets operating at 1.9 K in superfluid helium. Mechanical components linking the cold mass to the vacuum vessel, such as support posts and insulation vacuum barriers, are designed with efficient thermalisations for heat interception to minimise heat conduction. Heat inleak by radiation is reduced by employing multilayer insulation (MLI) wrapped around the cold mass and around an aluminium thermal shield cooled to about 60 K. Measurement of the total helium vaporization rate in String 2 gives, after subtraction of supplementary heat loads and end effects, an estimate of the total thermal load of a standard LHC cell (107 m), including two Short Straight Sections and six dipole cryomagnets. Temperature sensors installed at critical locations provide a temperature mapping which allows validation of the calculated and estimated thermal performance of the cryostat components, including the efficiency of the heat interceptions.
McAuliff, Bradley D; Kovera, Margaret Bull; Nunez, Gabriel
2009-06-01
This study examined the ability of jury-eligible community members (N = 248) to detect internal validity threats in psychological science presented during a trial. Participants read a case summary in which an expert testified about a study that varied in internal validity (valid, missing control group, confound, and experimenter bias) and ecological validity (high, low). Ratings of expert evidence quality and expert credibility were higher for the valid versus missing control group versions only. Internal validity did not influence verdict or ratings of plaintiff credibility and no differences emerged as a function of ecological validity. Expert evidence quality, expert credibility, and plaintiff credibility were positively correlated with verdict. Implications for the scientific reasoning literature and for trials containing psychological science are discussed.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hu, Y; Rottmann, J; Myronakis, M
2016-06-15
Purpose: The purpose of this study was to validate the use of a cascaded linear system model for MV cone-beam CT (CBCT) using a multi-layer (MLI) electronic portal imaging device (EPID) and provide experimental insight into image formation. A validated 3D model provides insight into salient factors affecting reconstructed image quality, allowing potential for optimizing detector design for CBCT applications. Methods: A cascaded linear system model was developed to investigate the potential improvement in reconstructed image quality for MV CBCT using an MLI EPID. Inputs to the three-dimensional (3D) model include projection-space MTF and NPS. Experimental validation was performed on a prototype MLI detector installed on the portal imaging arm of a Varian TrueBeam radiotherapy system. CBCT scans of up to 898 projections over 360 degrees were acquired at exposures of 16 and 64 MU. Image volumes were reconstructed using a Feldkamp-type (FDK) filtered backprojection (FBP) algorithm. Flat-field images and scans of a Catphan model 604 phantom were acquired. The effect of 2×2 and 4×4 detector binning was also examined. Results: Using projection flat fields as an input, the modeled and measured NPS in the axial plane exhibit good agreement. Binning projection images was shown to improve axial slice SDNR by a factor of approximately 1.4. This improvement is largely driven by a decrease in image noise of roughly 20%. However, this effect is accompanied by a loss in image resolution. Conclusion: The measured axial NPS shows good agreement with the theoretical calculation using a linear system model. Binning of projection images improves the SNR of large objects in the Catphan phantom by decreasing noise. Specific imaging tasks will dictate the implementation of image binning for two-dimensional projection images. The project was partially supported by a grant from Varian Medical Systems, Inc. and grant No. R01CA188446-01 from the National Cancer Institute.
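The noise benefit of detector binning reported above can be illustrated with a toy sketch: for ideal, uncorrelated pixel noise, averaging 2×2 blocks halves the noise standard deviation, whereas the measured ~20% reduction reflects correlated noise in the real MLI detector. The image size and noise level below are arbitrary assumptions:

```python
import random
import statistics

def bin2x2(img):
    # average non-overlapping 2x2 blocks (ideal detector binning)
    h, w = len(img), len(img[0])
    return [[(img[y][x] + img[y][x + 1] + img[y + 1][x] + img[y + 1][x + 1]) / 4.0
             for x in range(0, w, 2)] for y in range(0, h, 2)]

random.seed(0)
# synthetic 64x64 flat field: mean 100, uncorrelated Gaussian noise sigma = 10
flat = [[100.0 + random.gauss(0.0, 10.0) for _ in range(64)] for _ in range(64)]
binned = bin2x2(flat)

sigma0 = statistics.pstdev(v for row in flat for v in row)
sigma1 = statistics.pstdev(v for row in binned for v in row)
# sigma0 / sigma1 is close to 2 here because the noise is uncorrelated;
# real detectors show smaller gains due to pixel-to-pixel noise correlations
```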
Experimental Validation of a Thermoelastic Model for SMA Hybrid Composites
NASA Technical Reports Server (NTRS)
Turner, Travis L.
2001-01-01
This study presents results from experimental validation of a recently developed model for predicting the thermomechanical behavior of shape memory alloy hybrid composite (SMAHC) structures, composite structures with an embedded SMA constituent. The model captures the material nonlinearity of the material system with temperature and is capable of modeling constrained, restrained, or free recovery behavior from experimental measurement of fundamental engineering properties. A brief description of the model and analysis procedures is given, followed by an overview of a parallel effort to fabricate and characterize the material system of SMAHC specimens. Static and dynamic experimental configurations for the SMAHC specimens are described and experimental results for thermal post-buckling and random response are presented. Excellent agreement is achieved between the measured and predicted results, fully validating the theoretical model for constrained recovery behavior of SMAHC structures.
A Validated Open-Source Multisolver Fourth-Generation Composite Femur Model.
MacLeod, Alisdair R; Rose, Hannah; Gill, Harinderjit S
2016-12-01
Synthetic biomechanical test specimens are frequently used for preclinical evaluation of implant performance, often in combination with numerical modeling, such as finite-element (FE) analysis. Commercial and freely available FE packages are widely used, with three FE packages in particular gaining popularity: abaqus (Dassault Systèmes, Johnston, RI), ansys (ANSYS, Inc., Canonsburg, PA), and febio (University of Utah, Salt Lake City, UT). To the best of our knowledge, no study has yet made a comparison of these three commonly used solvers. Additionally, despite the femur being the most extensively studied bone in the body, no freely available validated model exists. The primary aim of the study was to conduct a comparison of mesh convergence and strain prediction between the three solvers (abaqus, ansys, and febio) and to provide validated open-source models of a fourth-generation composite femur for use with all three FE packages. Second, we evaluated the geometric variability around the femoral neck region of the composite femurs. Experimental testing was conducted using fourth-generation Sawbones® composite femurs instrumented with strain gauges at four locations. A generic FE model and four specimen-specific FE models were created from CT scans. The study found that the three solvers produced excellent agreement, with strain predictions being within an average of 3.0% for all solvers (r2 > 0.99) and 1.4% for the two commercial codes. The average root mean squared error against the experimental results was 134.5% (r2 = 0.29) for the generic model and 13.8% (r2 = 0.96) for the specimen-specific models. It was found that the composite femurs had variations in cortical thickness around the neck of up to 48.4%. For the first time, an experimentally validated finite-element model of the femur is presented for use in three solvers. This model is freely available online along with all the supporting validation data.
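The agreement metrics quoted above (percent error and r²) can be reproduced from gauge readings with a short sketch; the strain values below are hypothetical illustrations, not the study's measurements:

```python
import math

def r_squared(pred, meas):
    # coefficient of determination between predicted and measured strains
    mean = sum(meas) / len(meas)
    ss_res = sum((p - m) ** 2 for p, m in zip(pred, meas))
    ss_tot = sum((m - mean) ** 2 for m in meas)
    return 1.0 - ss_res / ss_tot

def pct_rmse(pred, meas):
    # RMSE normalised by the mean absolute measured strain, in percent
    rmse = math.sqrt(sum((p - m) ** 2 for p, m in zip(pred, meas)) / len(meas))
    return 100.0 * rmse / (sum(abs(m) for m in meas) / len(meas))

# hypothetical microstrain readings at four strain-gauge locations
measured = [820.0, -410.0, 655.0, -290.0]
predicted = [801.0, -398.0, 671.0, -302.0]
```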
Validation of the SimSET simulation package for modeling the Siemens Biograph mCT PET scanner
NASA Astrophysics Data System (ADS)
Poon, Jonathan K.; Dahlbom, Magnus L.; Casey, Michael E.; Qi, Jinyi; Cherry, Simon R.; Badawi, Ramsey D.
2015-02-01
Monte Carlo simulation provides a valuable tool in performance assessment and optimization of system design parameters for PET scanners. SimSET is a popular Monte Carlo simulation toolkit that features fast simulation time, as well as variance reduction tools to further enhance computational efficiency. However, SimSET has lacked the ability to simulate block detectors until its most recent release. Our goal is to validate new features of SimSET by developing a simulation model of the Siemens Biograph mCT PET scanner and comparing the results to a simulation model developed in the GATE simulation suite and to experimental results. We used the NEMA NU-2 2007 scatter fraction, count rates, and spatial resolution protocols to validate the SimSET simulation model and its new features. The SimSET model overestimated the experimental results of the count rate tests by 11-23% and the spatial resolution test by 13-28%, which is comparable to previous validation studies of other PET scanners in the literature. The difference between the SimSET and GATE simulation was approximately 4-8% for the count rate test and approximately 3-11% for the spatial resolution test. In terms of computational time, SimSET performed simulations approximately 11 times faster than GATE simulations. The new block detector model in SimSET offers a fast and reasonably accurate simulation toolkit for PET imaging applications.
Nonlinear System Identification for Aeroelastic Systems with Application to Experimental Data
NASA Technical Reports Server (NTRS)
Kukreja, Sunil L.
2008-01-01
Representation and identification of a nonlinear aeroelastic pitch-plunge system as a model of the Nonlinear AutoRegressive, Moving Average eXogenous (NARMAX) class is considered. A nonlinear difference equation describing this aircraft model is derived theoretically and shown to be of the NARMAX form. Identification methods for NARMAX models are applied to aeroelastic dynamics and its properties demonstrated via continuous-time simulations of experimental conditions. Simulation results show that (1) the outputs of the NARMAX model closely match those generated using continuous-time methods, and (2) NARMAX identification methods applied to aeroelastic dynamics provide accurate discrete-time parameter estimates. Application of NARMAX identification to experimental pitch-plunge dynamics data gives a high percent fit for cross-validated data.
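As a toy illustration of NARMAX-style identification (the model structure and coefficients below are hypothetical and far simpler than the pitch-plunge dynamics in the paper), one can simulate a polynomial difference equation and recover its parameters by least squares on the regressor matrix:

```python
import random

# toy NARMAX-style difference equation (hypothetical coefficients):
# y[k] = a*y[k-1] + b*u[k-1] + c*y[k-1]*u[k-1]
a, b, c = 0.5, 1.0, 0.2
random.seed(1)
u = [random.uniform(-1.0, 1.0) for _ in range(500)]
y = [0.0]
for k in range(1, 500):
    y.append(a * y[k - 1] + b * u[k - 1] + c * y[k - 1] * u[k - 1])

# build the regressor rows and solve the normal equations A x = v
rows = [(y[k - 1], u[k - 1], y[k - 1] * u[k - 1]) for k in range(1, 500)]
t = y[1:]
n = 3
A = [[sum(r[i] * r[j] for r in rows) for j in range(n)] for i in range(n)]
v = [sum(r[i] * tk for r, tk in zip(rows, t)) for i in range(n)]

def solve(A, v):
    # Gaussian elimination with partial pivoting on the augmented matrix
    M = [row[:] + [vi] for row, vi in zip(A, v)]
    for i in range(n):
        p = max(range(i, n), key=lambda r: abs(M[r][i]))
        M[i], M[p] = M[p], M[i]
        for r in range(i + 1, n):
            f = M[r][i] / M[i][i]
            for j in range(i, n + 1):
                M[r][j] -= f * M[i][j]
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        x[i] = (M[i][n] - sum(M[i][j] * x[j] for j in range(i + 1, n))) / M[i][i]
    return x

est = solve(A, v)  # estimated (a, b, c)
```

Because the simulated data are noise-free, the least-squares solution recovers the coefficients essentially exactly; with experimental data, cross-validation as described in the abstract guards against overfitting.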
Collapse of a Liquid Column: Numerical Simulation and Experimental Validation
NASA Astrophysics Data System (ADS)
Cruchaga, Marcela A.; Celentano, Diego J.; Tezduyar, Tayfun E.
2007-03-01
This paper is focused on the numerical and experimental analyses of the collapse of a liquid column. The measurements of the interface position in a set of experiments carried out with shampoo and water for two different initial column aspect ratios are presented together with the corresponding numerical predictions. The experimental procedure was found to provide acceptable recurrence in the observation of the interface evolution. Basic models describing some of the relevant physical aspects, e.g. wall friction and turbulence, are included in the simulations. Numerical experiments are conducted to evaluate the influence of the parameters involved in the modeling by comparing the results with the data from the measurements. The numerical predictions reasonably describe the physical trends.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Whelan, B; Keall, P; Holloway, L
Purpose: MRI guided radiation therapy (MRIgRT) is a rapidly growing field; however, Linac operation in MRI fringe fields represents an ongoing challenge. We have previously shown in-silico that Linacs could be redesigned to function in the in-line orientation with no magnetic shielding by adopting an RF-gun configuration. Other authors have also published insilico studies of Linac operation in magnetic fields; however to date no experimental validation data is published. This work details the design, construction, and installation of an experimental beam line to validate our in-silico results. Methods: An RF-gun comprising 1.5 accelerating cells and capable of generating electron energiesmore » up to 3.2MeV is used. The experimental apparatus was designed to monitor both beam current (toroid current monitor), spot size (two phosphor screens with viewports), and generate peak magnetic fields of at least 1000G (three variable current electromagnetic coils). Thermal FEM simulations were developed to ensure coil temperature remained within 100degC. Other design considerations included beam disposal, vacuum maintenance, radiation shielding, earthquake safety, and machine protection interlocks. Results: The beam line has been designed, built, and installed in a radiation shielded bunker. Water cooling, power supplies, thermo-couples, cameras, and radiation shielding have been successfully connected and tested. Interlock testing, vacuum processing, and RF processing have been successfully completed. The first beam on is expected within weeks. The coil heating simulations show that with care, peak fields of up to 1200G (320G at cathode) can be produced using 40A current, which is well within the fields expected for MRI-Linac systems. The maximum coil temperature at this current was 84degC after 6 minutes. 
Conclusion: An experimental beam line has been constructed and installed at SLAC in order to experimentally characterise RF gun performance in in-line magnetic fields, validate in-silico design work, and provide the first published experimental data relating to accelerator functionality for MRIgRT.« less
2018-01-01
Although it is becoming increasingly popular to monitor parameters related to training, recovery, and health with wearable sensor technology (wearables), scientific evaluation of the reliability, sensitivity, and validity of such data is limited and, where available, has involved a wide variety of approaches. To improve the trustworthiness of data collected by wearables and facilitate comparisons, we have outlined recommendations for standardized evaluation. We discuss the wearable devices themselves, as well as experimental and statistical considerations. Adherence to these recommendations should be beneficial not only for the individual, but also for regulatory organizations and insurance companies. PMID:29712629
Virtual Diagnostic Interface: Aerospace Experimentation in the Synthetic Environment
NASA Technical Reports Server (NTRS)
Schwartz, Richard J.; McCrea, Andrew C.
2009-01-01
The Virtual Diagnostics Interface (ViDI) methodology combines two-dimensional image processing and three-dimensional computer modeling to provide comprehensive in-situ visualizations commonly utilized for in-depth planning of wind tunnel and flight testing, real time data visualization of experimental data, and unique merging of experimental and computational data sets in both real-time and post-test analysis. The preparation of such visualizations encompasses the realm of interactive three-dimensional environments, traditional and state of the art image processing techniques, database management and development of toolsets with user friendly graphical user interfaces. ViDI has been under development at the NASA Langley Research Center for over 15 years, and has a long track record of providing unique and insightful solutions to a wide variety of experimental testing techniques and validation of computational simulations. This report will address the various aspects of ViDI and how it has been applied to test programs as varied as NASCAR race car testing in NASA wind tunnels to real-time operations concerning Space Shuttle aerodynamic flight testing. In addition, future trends and applications will be outlined in the paper.
Systematic Validation of Protein Force Fields against Experimental Data
Eastwood, Michael P.; Dror, Ron O.; Shaw, David E.
2012-01-01
Molecular dynamics simulations provide a vehicle for capturing the structures, motions, and interactions of biological macromolecules in full atomic detail. The accuracy of such simulations, however, is critically dependent on the force field—the mathematical model used to approximate the atomic-level forces acting on the simulated molecular system. Here we present a systematic and extensive evaluation of eight different protein force fields based on comparisons of experimental data with molecular dynamics simulations that reach a previously inaccessible timescale. First, through extensive comparisons with experimental NMR data, we examined the force fields' abilities to describe the structure and fluctuations of folded proteins. Second, we quantified potential biases towards different secondary structure types by comparing experimental and simulation data for small peptides that preferentially populate either helical or sheet-like structures. Third, we tested the force fields' abilities to fold two small proteins—one α-helical, the other with β-sheet structure. The results suggest that force fields have improved over time, and that the most recent versions, while not perfect, provide an accurate description of many structural and dynamical properties of proteins. PMID:22384157
Experimental statistical signature of many-body quantum interference
NASA Astrophysics Data System (ADS)
Giordani, Taira; Flamini, Fulvio; Pompili, Matteo; Viggianiello, Niko; Spagnolo, Nicolò; Crespi, Andrea; Osellame, Roberto; Wiebe, Nathan; Walschaers, Mattia; Buchleitner, Andreas; Sciarrino, Fabio
2018-03-01
Multi-particle interference is an essential ingredient for fundamental quantum mechanics phenomena and for quantum information processing to provide a computational advantage, as recently emphasized by boson sampling experiments. Hence, developing a reliable and efficient technique to witness its presence is pivotal in achieving the practical implementation of quantum technologies. Here, we experimentally identify genuine many-body quantum interference via a recent efficient protocol, which exploits statistical signatures at the output of a multimode quantum device. We successfully apply the test to validate three-photon experiments in an integrated photonic circuit, providing an extensive analysis on the resources required to perform it. Moreover, drawing upon established techniques of machine learning, we show how such tools help to identify the—a priori unknown—optimal features to witness these signatures. Our results provide evidence on the efficacy and feasibility of the method, paving the way for its adoption in large-scale implementations.
Schnoes, Alexandra M.; Ream, David C.; Thorman, Alexander W.; Babbitt, Patricia C.; Friedberg, Iddo
2013-01-01
The ongoing functional annotation of proteins relies upon the work of curators to capture experimental findings from the scientific literature and apply them to protein sequence and structure data. However, with the increasing use of high-throughput experimental assays, a small number of experimental studies dominate the functional protein annotations collected in databases. Here, we investigate just how prevalent the “few articles - many proteins” phenomenon is. We examine the experimentally validated annotation of proteins provided by several groups in the GO Consortium, and show that the distribution of proteins per published study is exponential, with 0.14% of articles providing the source of annotations for 25% of the proteins in the UniProt-GOA compilation. Since each of the dominant articles describes the use of an assay that can find only one function or a small group of functions, this leads to substantial biases in what we know about the function of many proteins. Mass spectrometry, microscopy, and RNAi experiments dominate high-throughput experiments. Consequently, the functional information derived from these experiments mostly concerns the subcellular location of proteins and the participation of proteins in embryonic developmental pathways. For some organisms, the information provided by different studies overlaps to a large extent. We also show that the information provided by high-throughput experiments is less specific than that provided by low-throughput experiments. Given the experimental techniques available, certain biases in protein function annotation due to high-throughput experiments are unavoidable. Knowing that these biases exist and understanding their characteristics and extent is important for database curators, developers of function annotation programs, and anyone who uses protein function annotation data to plan experiments. PMID:23737737
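The "few articles - many proteins" statistic can be computed directly from per-article annotation counts: sort articles by the number of proteins they annotate and count how many are needed to cover a quarter of all annotations. The counts below are invented to mimic a heavy-tailed distribution, not the UniProt-GOA data:

```python
# synthetic, heavy-tailed counts of proteins annotated per article:
# three dominant high-throughput papers plus many small studies
counts = sorted([50000, 30000, 12000] + [5] * 2000, reverse=True)
total = sum(counts)

# smallest number of top articles covering 25% of all annotated proteins
covered, n_articles = 0, 0
for c in counts:
    if covered >= 0.25 * total:
        break
    covered += c
    n_articles += 1

frac_articles = n_articles / len(counts)
# with this synthetic skew, a tiny fraction of articles covers 25% of proteins
```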
Caselli, Federica; Bisegna, Paolo
2017-10-01
The performance of a novel microfluidic impedance cytometer (MIC) with coplanar configuration is investigated in silico. The main feature of the device is the ability to provide accurate particle-sizing despite the well-known measurement sensitivity to particle trajectory. The working principle of the device is presented and validated by means of an original virtual laboratory providing close-to-experimental synthetic data streams. It is shown that a metric correlating with particle trajectory can be extracted from the signal traces and used to compensate the trajectory-induced error in the estimated particle size, thus reaching high-accuracy. An analysis of relevant parameters of the experimental setup is also presented. Copyright © 2017 IPEM. Published by Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Capelli, Riccardo; Tiana, Guido; Camilloni, Carlo
2018-05-01
Inferential methods can be used to integrate experimental information and molecular simulations. The maximum entropy principle provides a framework for using equilibrium experimental data, and it has been shown that replica-averaged simulations, restrained using a static potential, are a practical and powerful implementation of such a principle. Here we show that replica-averaged simulations restrained using a time-dependent potential are equivalent to the principle of maximum caliber, the dynamic version of the principle of maximum entropy, and thus may allow us to integrate time-resolved data in molecular dynamics simulations. We provide an analytical proof of the equivalence as well as a computational validation making use of simple models and synthetic data. Some limitations and possible solutions are also discussed.
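The replica-averaged restraint described above can be sketched as a single energy term added to the force field; the notation here (N replicas with coordinates x_r, observables f_i, experimental targets f_i^exp, force constant k) is generic rather than taken from the paper:

```latex
V_{\mathrm{restr}}(t) \;=\; \frac{k}{2}\sum_i
\left( \frac{1}{N}\sum_{r=1}^{N} f_i\big(\mathbf{x}_r(t)\big) \;-\; f_i^{\mathrm{exp}}(t) \right)^{2}
```

With a static target $f_i^{\mathrm{exp}}$ this implements the maximum entropy principle; making the target time-dependent, as written, corresponds to the maximum-caliber case discussed in the abstract.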
MO-AB-BRA-02: A Novel Scatter Imaging Modality for Real-Time Image Guidance During Lung SBRT
DOE Office of Scientific and Technical Information (OSTI.GOV)
Redler, G; Bernard, D; Templeton, A
2015-06-15
Purpose: A novel scatter imaging modality is developed and its feasibility for image-guided radiation therapy (IGRT) during stereotactic body radiation therapy (SBRT) for lung cancer patients is assessed using analytic and Monte Carlo models as well as experimental testing. Methods: During treatment, incident radiation interacts and scatters from within the patient. The presented methodology forms an image of patient anatomy from the scattered radiation for real-time localization of the treatment target. A radiographic flat panel-based pinhole camera provides spatial information regarding the origin of detected scattered radiation. An analytical model is developed, which provides a mathematical formalism for describing themore » scatter imaging system. Experimental scatter images are acquired by irradiating an object using a Varian TrueBeam accelerator. The differentiation between tissue types is investigated by imaging simple objects of known compositions (water, lung, and cortical bone equivalent). A lung tumor phantom, simulating materials and geometry encountered during lung SBRT treatments, is fabricated and imaged to investigate image quality for various quantities of delivered radiation. Monte Carlo N-Particle (MCNP) code is used for validation and testing by simulating scatter image formation using the experimental pinhole camera setup. Results: Analytical calculations, MCNP simulations, and experimental results when imaging the water, lung, and cortical bone equivalent objects show close agreement, thus validating the proposed models and demonstrating that scatter imaging differentiates these materials well. Lung tumor phantom images have sufficient contrast-to-noise ratio (CNR) to clearly distinguish tumor from surrounding lung tissue. CNR=4.1 and CNR=29.1 for 10MU and 5000MU images (equivalent to 0.5 and 250 second images), respectively. Conclusion: Lung SBRT provides favorable treatment outcomes, but depends on accurate target localization. 
A comprehensive approach, employing multiple simulation techniques and experiments, is taken to demonstrate the feasibility of a novel scatter imaging modality for the necessary real-time image guidance.
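The contrast-to-noise figures quoted above follow the standard region-of-interest definition. A minimal sketch in Python (the image, masks, contrast, and noise levels below are synthetic stand-ins, not data from the study):

```python
import numpy as np

def cnr(image, tumor_mask, background_mask):
    """Contrast-to-noise ratio: |mean(tumor) - mean(bg)| / std(bg)."""
    tumor = image[tumor_mask]
    bg = image[background_mask]
    return abs(tumor.mean() - bg.mean()) / bg.std()

# Synthetic example: a bright tumor disk in a noisy lung background.
rng = np.random.default_rng(0)
img = rng.normal(100.0, 5.0, size=(64, 64))    # background: mean 100, sigma 5
yy, xx = np.mgrid[:64, :64]
tumor = (yy - 32) ** 2 + (xx - 32) ** 2 < 8 ** 2
img[tumor] += 30.0                             # assumed tumor contrast, 30 counts
print(round(cnr(img, tumor, ~tumor), 1))
```

With a synthetic contrast of 30 counts over a background sigma of 5, the computed CNR lands near 6; the study's reported values of 4.1 and 29.1 would come out of exactly this kind of ROI statistic on the phantom images.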
Tojo, H; Yamada, I; Yasuhara, R; Ejiri, A; Hiratsuka, J; Togashi, H; Yatsuka, E; Hatae, T; Funaba, H; Hayashi, H; Takase, Y; Itami, K
2016-09-01
This paper evaluates the accuracy of electron temperature measurements and relative transmissivities of double-pass Thomson scattering diagnostics. The electron temperature (Te) is obtained from the ratio of signals from a double-pass scattering system, and relative transmissivities are then calculated from the measured Te and the intensity of the signals. The accuracy of these values depends on the electron temperature and the scattering angle (θ), and was therefore evaluated experimentally using the Large Helical Device (LHD) and the Tokyo spherical tokamak-2 (TST-2). Analysis of the TST-2 data indicates that a high Te and a large scattering angle yield accurate values. Indeed, the errors for scattering angle θ = 135° are approximately half of those for θ = 115°. The method of determining Te over a wide range spanning two orders of magnitude (0.01-1.5 keV) was validated using the experimental results of the LHD and TST-2. A simple method to provide relative transmissivities, which include inputs from the collection optics, vacuum window, optical fibers, and polychromators, is also presented. The relative errors were less than approximately 10%. Numerical simulations also indicate that the Te measurements are valid under harsh radiation conditions. This method of obtaining Te can be considered in the design of Thomson scattering systems for high-performance plasmas that generate harsh radiation environments.
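The inversion step, from a measured double-pass signal ratio to Te, amounts to inverting a monotonic calibration curve. The curve below is a made-up placeholder (the real dependence follows from the relativistic Thomson scattering spectrum at the given scattering angle), so only the interpolation logic carries over:

```python
import numpy as np

# Hypothetical monotonic calibration: double-pass signal ratio vs. Te.
te_grid = np.logspace(-2, np.log10(1.5), 200)       # 0.01-1.5 keV, as in the paper
ratio_grid = 1.0 + 0.8 * np.log10(te_grid / 0.01)   # placeholder curve (assumed)

def te_from_ratio(measured_ratio):
    """Invert the (assumed monotonic) ratio -> Te calibration by interpolation."""
    return float(np.interp(measured_ratio, ratio_grid, te_grid))

print(round(te_from_ratio(2.6), 3))   # ratio corresponding to Te = 1.0 keV here
```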
CANEapp: a user-friendly application for automated next generation transcriptomic data analysis.
Velmeshev, Dmitry; Lally, Patrick; Magistri, Marco; Faghihi, Mohammad Ali
2016-01-13
Next generation sequencing (NGS) technologies are indispensable for molecular biology research, but data analysis represents the bottleneck in their application. Users need to be familiar with computer terminal commands, the Linux environment, and various software tools and scripts. Analysis workflows have to be optimized and experimentally validated to extract biologically meaningful data. Moreover, as larger datasets are being generated, their analysis requires use of high-performance servers. To address these needs, we developed CANEapp (application for Comprehensive automated Analysis of Next-generation sequencing Experiments), a unique suite that combines a Graphical User Interface (GUI) and an automated server-side analysis pipeline that is platform-independent, making it suitable for any server architecture. The GUI runs on a PC or Mac and seamlessly connects to the server to provide full GUI control of RNA-sequencing (RNA-seq) project analysis. The server-side analysis pipeline contains a framework that is implemented on a Linux server through completely automated installation of software components and reference files. Analysis with CANEapp is also fully automated and performs differential gene expression analysis and novel noncoding RNA discovery through alternative workflows (Cuffdiff and R packages edgeR and DESeq2). We compared CANEapp to other similar tools, and it significantly improves on previous developments. We experimentally validated CANEapp's performance by applying it to data derived from different experimental paradigms and confirming the results with quantitative real-time PCR (qRT-PCR). CANEapp adapts to any server architecture by effectively using available resources and thus handles large amounts of data efficiently. CANEapp performance has been experimentally validated on various biological datasets. CANEapp is available free of charge at http://psychiatry.med.miami.edu/research/laboratory-of-translational-rna-genomics/CANE-app . 
We believe that CANEapp will serve both biologists with no computational experience and bioinformaticians as a simple, time-saving, yet accurate and powerful tool for analyzing large RNA-seq datasets, and that it will provide foundations for future development of integrated and automated high-throughput genomics data analysis tools. Due to its inherently standardized pipeline and its combination of automated analysis and platform independence, CANEapp is ideal for large-scale collaborative RNA-seq projects between different institutions and research groups.
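The differential-expression core that such a pipeline automates can be illustrated on a toy count table. This sketch uses a plain log2 fold change and t-test rather than the negative-binomial models of edgeR/DESeq2 that CANEapp actually runs, so it shows only the shape of the comparison:

```python
import numpy as np
from scipy import stats

# Toy count matrix: rows = genes, columns = 3 control then 3 treated replicates.
counts = np.array([
    [100, 110,  95, 400, 390, 410],   # up-regulated gene
    [200, 210, 190, 205, 195, 200],   # unchanged gene
])
ctrl, trt = counts[:, :3], counts[:, 3:]

log2fc = np.log2(trt.mean(axis=1) / ctrl.mean(axis=1))   # per-gene fold change
pvals = stats.ttest_ind(trt, ctrl, axis=1).pvalue        # naive significance test

for fc, p in zip(log2fc, pvals):
    print(f"log2FC={fc:+.2f}  p={p:.3g}")
```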
NASA Astrophysics Data System (ADS)
Xu, Huixia; Zhang, Lijun; Cheng, Kaiming; Chen, Weimin; Du, Yong
2017-04-01
To establish an accurate atomic mobility database for solder alloys, a reassessment of atomic mobilities in the fcc (face-centered cubic) Cu-Ag-Sn system was performed in the present work. Three fcc Cu-Sn diffusion couples were first prepared and used to determine the composition-dependent interdiffusivities at 873 K, 923 K, and 973 K, validating the literature data and providing new experimental data at low temperatures. Atomic mobilities in the three boundary binaries, fcc Cu-Sn, fcc Ag-Sn, and fcc Cu-Ag, were then updated based on the various experimental diffusivities obtained from the literature and the present work, together with the available thermodynamic database for solder alloys. Finally, based on the large number of interdiffusivities recently measured by the present authors, atomic mobilities in the fcc Cu-Ag-Sn ternary system were carefully evaluated. A comprehensive comparison between various calculated/model-predicted diffusion properties and the experimental data validated the reliability of the obtained atomic mobilities in ternary fcc Cu-Ag-Sn alloys.
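The temperature dependence of such diffusivities is conventionally summarized in Arrhenius form, D(T) = D0 exp(-Q/RT). A sketch evaluated at the three couple temperatures, with D0 and Q as assumed round numbers rather than the assessment's fitted values:

```python
import math

D0 = 4.0e-5   # pre-exponential factor, m^2/s (assumed for illustration)
Q = 180e3     # activation energy, J/mol (assumed for illustration)
R = 8.314     # gas constant, J/(mol K)

def d_inter(T):
    """Arrhenius interdiffusivity: D(T) = D0 * exp(-Q / (R * T))."""
    return D0 * math.exp(-Q / (R * T))

for T in (873.0, 923.0, 973.0):   # the three diffusion-couple temperatures
    print(f"{T:.0f} K: {d_inter(T):.2e} m^2/s")
```

A CALPHAD-type mobility assessment fits parameters of essentially this form (composition-dependent) to the measured interdiffusivities.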
NASA Astrophysics Data System (ADS)
Freeman, A. J.; Yu, Jaejun
1990-04-01
For years, there has been controversy over whether the normal state of the Cu-oxide superconductors is a Fermi liquid or some other exotic ground state. However, some experimentalists are clarifying the nature of the normal state of the high T(sub c) superconductors by surmounting the experimental difficulties in producing clean, well-characterized surfaces so as to obtain meaningful, highly resolved photoemission data, which agree with earlier positron-annihilation experiments. The experimental work on high-resolution angle-resolved photoemission by Campuzano et al. and positron-annihilation studies by Smedskjaer et al. have verified the calculated Fermi surfaces in YBa2Cu3O7 superconductors and provided evidence for the validity of the energy band approach. Similarly good agreement was found for Bi2Sr2CaCu2O8 by Olson et al. As the Fermi liquid (metallic) nature of the normal state of the high T(sub c) superconductors becomes evident, these experimental observations have served to confirm the predictions of the local density functional calculations and hence the energy band approach as a valid natural starting point for further studies of their superconductivity.
Kobayashi, T.; Itoh, K.; Ido, T.; Kamiya, K.; Itoh, S.-I.; Miura, Y.; Nagashima, Y.; Fujisawa, A.; Inagaki, S.; Ida, K.; Hoshino, K.
2016-01-01
Self-regulation between structure and turbulence, a fundamental process in complex systems, has been widely regarded as one of the central issues in modern physics. A typical example in magnetically confined plasmas is the Low-confinement-mode to High-confinement-mode (L-H) transition, which has been intensely studied for more than thirty years since it provides a confinement improvement necessary for the realization of the fusion reactor. An essential issue in L-H transition physics is the mechanism of the abrupt “radial” electric field generation in toroidal plasmas. To date, several models of the L-H transition have been proposed, but their systematic experimental validation has remained challenging. Here we report, for the first time, systematic and quantitative validations of models of the radial electric field excitation mechanism, using a data set of the turbulence and the radial electric field with high spatiotemporal resolution. Examining the time derivative of Poisson’s equation, the sum of the loss-cone loss current and the neoclassical bulk viscosity current is found to behave as the experimentally observed radial current that excites the radial electric field, agreeing within a factor of a few in magnitude. PMID:27489128
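The charge balance examined in the paper can be written schematically as follows (a hedged sketch using commonly used symbols; the exact terms and coefficients are those of the paper):

```latex
% Time derivative of Poisson's equation: the net radial current density J_r
% charges up the radial electric field E_r.
\epsilon_0 \epsilon_\perp \frac{\partial E_r}{\partial t} = -J_r,
\qquad J_r \simeq J_{\mathrm{lc}} + J_{\mathrm{visc}},
```

where $J_{\mathrm{lc}}$ denotes the loss-cone loss current and $J_{\mathrm{visc}}$ the neoclassical bulk-viscosity current whose sum is compared against the observed field evolution.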
Validation of a C2-C7 cervical spine finite element model using specimen-specific flexibility data.
Kallemeyn, Nicole; Gandhi, Anup; Kode, Swathi; Shivanna, Kiran; Smucker, Joseph; Grosland, Nicole
2010-06-01
This study presents a specimen-specific C2-C7 cervical spine finite element model that was developed using multiblock meshing techniques. The model was validated using in-house experimental flexibility data obtained from the cadaveric specimen used for mesh development. The C2-C7 specimen was subjected to pure continuous moments up to +/-1.0 N m in flexion, extension, lateral bending, and axial rotation, and the motions at each level were obtained. Additionally, the specimen was divided into C2-C3, C4-C5, and C6-C7 functional spinal units (FSUs), which were tested in the intact state as well as after sequential removal of the interspinous, ligamentum flavum, and capsular ligaments. The finite element model was initially assigned baseline material properties based on the literature, but was calibrated using the experimental motion data obtained in-house while utilizing the ranges of material property values reported in the literature. The calibrated model provided good agreement with the nonlinear experimental loading curves and can be used to further study the response of the cervical spine in various biomechanical investigations. Copyright 2010 IPEM. Published by Elsevier Ltd. All rights reserved.
Alonso-Torres, Beatriz; Hernández-Pérez, José Alfredo; Sierra-Espinoza, Fernando; Schenker, Stefan; Yeretzian, Chahan
2013-01-01
Heat and mass transfer in individual coffee beans during roasting were simulated using computational fluid dynamics (CFD). Numerical equations for heat and mass transfer inside the coffee bean were solved using the finite volume technique in the commercial CFD code Fluent; the software was complemented with specific user-defined functions (UDFs). To experimentally validate the numerical model, a single coffee bean was placed in a cylindrical glass tube and roasted by a hot air flow, using the identical geometrical 3D configuration and hot air flow conditions as the ones used for numerical simulations. Temperature and humidity calculations obtained with the model were compared with experimental data. The model predicts the actual process quite accurately and represents a useful approach to monitor the coffee roasting process in real time. It provides valuable information on time-resolved process variables that are otherwise difficult to obtain experimentally, but critical to a better understanding of the coffee roasting process at the individual bean level. This includes variables such as time-resolved 3D profiles of bean temperature and moisture content, and temperature profiles of the roasting air in the vicinity of the coffee bean.
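As a much-reduced illustration of the heat-transfer half of such a model, here is a 1D explicit finite-volume conduction sketch for a bean-sized slab heated by roasting air. The diffusivity, geometry, and Dirichlet surface condition are assumed round numbers, not the paper's 3D CFD setup with UDFs:

```python
import numpy as np

alpha = 1.0e-7          # thermal diffusivity, m^2/s (assumed)
L, n = 5e-3, 51         # slab thickness 5 mm, grid points
dx = L / (n - 1)
dt = 0.4 * dx**2 / alpha            # satisfies the explicit stability limit (<0.5)
T = np.full(n, 20.0)                # initial bean temperature, deg C
T_air = 200.0                       # roasting air temperature, deg C

for _ in range(int(60.0 / dt)):     # simulate 60 s of roasting
    T[0] = T[-1] = T_air            # simplified Dirichlet surface condition
    T[1:-1] += alpha * dt / dx**2 * (T[2:] - 2 * T[1:-1] + T[:-2])

print(round(T[n // 2], 1))          # center temperature after 60 s, deg C
```

The full model additionally couples moisture diffusion and convective boundary conditions, which is where the CFD treatment of the surrounding air flow comes in.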
DOE Office of Scientific and Technical Information (OSTI.GOV)
Guo, Xingye; Hu, Bin; Wei, Changdong
Lanthanum zirconate (La2Zr2O7) is a promising candidate material for thermal barrier coating (TBC) applications due to its low thermal conductivity and high-temperature phase stability. In this work, a novel image-based multi-scale simulation framework combining molecular dynamics (MD) and finite element (FE) calculations is proposed to study the thermal conductivity of La2Zr2O7 coatings. Since there is no experimental data on single-crystal La2Zr2O7 thermal conductivity, a reverse non-equilibrium molecular dynamics (reverse NEMD) approach is first employed to compute the temperature-dependent thermal conductivity of single-crystal La2Zr2O7. The single-crystal data is then passed to a FE model which takes into account realistic thermal barrier coating microstructures. The predicted thermal conductivities from the FE model are in good agreement with experimental validations using both the flash laser technique and pulsed thermal imaging-multilayer analysis. The framework proposed in this work provides a powerful tool for the future design of advanced coating systems. (C) 2016 Elsevier Ltd. All rights reserved.
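In a reverse NEMD calculation the heat flux is imposed by construction and the conductivity follows from Fourier's law applied to the resulting steady-state temperature profile. A sketch of that evaluation step, with illustrative numbers rather than La2Zr2O7 data:

```python
import numpy as np

heat_flux = 5.0e9                    # imposed flux, W/m^2 (illustrative)
z = np.linspace(0.0, 4.0e-9, 20)     # slab bin positions, m
T = 320.0 - 2.5e9 * z                # sampled temperature profile, K (illustrative)

dT_dz = np.polyfit(z, T, 1)[0]       # linear fit of the profile -> gradient, K/m
k = -heat_flux / dT_dz               # Fourier's law: J = -k dT/dz
print(round(k, 3))                   # thermal conductivity, W/(m K)
```

In practice the profile is time-averaged over the MD run and the regions next to the heat source and sink are excluded from the fit.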
Černý, Jiří; Schneider, Bohdan; Biedermannová, Lada
2017-07-14
Water molecules represent an integral part of proteins and a key determinant of protein structure, dynamics and function. WatAA is a newly developed, web-based atlas of amino-acid hydration in proteins. The atlas provides information about the ordered first hydration shell of the most populated amino-acid conformers in proteins. The data presented in the atlas are drawn from two sources: experimental data and ab initio quantum-mechanics calculations. The experimental part is based on a data-mining study of a large set of high-resolution protein crystal structures. The crystal-derived data include 3D maps of water distribution around amino-acids and probability of occurrence of each of the identified hydration sites. The quantum mechanics calculations validate and extend this primary description by optimizing the water position for each hydration site, by providing hydrogen atom positions and by quantifying the interaction energy that stabilizes the water molecule at the particular hydration site position. The calculations show that the majority of experimentally derived hydration sites are positioned near local energy minima for water, and the calculated interaction energies help to assess the preference of water for the individual hydration sites. We propose that the atlas can be used to validate water placement in electron density maps in crystallographic refinement, to locate water molecules mediating protein-ligand interactions in drug design, and to prepare and evaluate molecular dynamics simulations. WatAA: Atlas of Protein Hydration is freely available without login at .
NASA Astrophysics Data System (ADS)
Lumentut, M. F.; Howard, I. M.
2013-03-01
Power harvesters that extract energy from vibrating systems via piezoelectric transduction show strong potential for powering smart wireless sensor devices in applications of health condition monitoring of rotating machinery and structures. This paper presents an analytical method for modelling an electromechanical piezoelectric bimorph beam with tip mass under two input base transverse and longitudinal excitations. The Euler-Bernoulli beam equations were used to model the piezoelectric bimorph beam. The polarity-electric field of the piezoelectric element is excited by the strain field caused by the base input excitation, resulting in electrical charge. The governing electromechanical dynamic equations were derived analytically using the weak form of the Hamiltonian principle to obtain the constitutive equations. Three constitutive electromechanical dynamic equations based on independent coefficients of virtual displacement vectors were formulated and then further modelled using the normalised Ritz eigenfunction series. The electromechanical formulations include both the series and parallel connections of the piezoelectric bimorph. The multi-mode frequency response functions (FRFs) under varying electrical load resistance were formulated using Laplace transformation for the multi-input mechanical vibrations to provide the multi-output dynamic displacement, velocity, voltage, current and power. The experimental and theoretical validations, reduced to the single-mode system, were shown to provide reasonable predictions. The model results from polar base excitation for off-axis input motions were validated with experimental results, showing the change in the electrical power frequency response amplitude as a function of excitation angle, with relevance for practical implementation.
A MPPT Algorithm Based PV System Connected to Single Phase Voltage Controlled Grid
NASA Astrophysics Data System (ADS)
Sreekanth, G.; Narender Reddy, N.; Durga Prasad, A.; Nagendrababu, V.
2012-10-01
Future ancillary services provided by photovoltaic (PV) systems could facilitate their penetration in power systems. In addition, low-power PV systems can be designed to improve power quality. This paper presents a single-phase PV system that provides grid voltage support and compensation of harmonic distortion at the point of common coupling thanks to a repetitive controller. The power provided by the PV panels is controlled by a Maximum Power Point Tracking algorithm based on the incremental conductance method, specifically modified to control the phase of the PV inverter voltage. Simulation and experimental results validate the presented solution.
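The textbook incremental-conductance update can be sketched as below. At the maximum power point dP/dV = 0, which is equivalent to dI/dV = -I/V; the sign of the mismatch tells the tracker which way to move. Note that the paper's variant drives the phase of the inverter voltage rather than a DC voltage reference, which this sketch does not capture:

```python
def incond_step(v, i, v_prev, i_prev, v_ref, dv_step=0.1):
    """One incremental-conductance MPPT iteration; returns the new voltage reference."""
    dv, di = v - v_prev, i - i_prev
    if dv == 0:
        if di > 0:                 # irradiance rose: move toward higher voltage
            v_ref += dv_step
        elif di < 0:
            v_ref -= dv_step
    else:
        g = di / dv                # incremental conductance dI/dV
        if g > -i / v:             # left of the MPP: raise the voltage
            v_ref += dv_step
        elif g < -i / v:           # right of the MPP: lower the voltage
            v_ref -= dv_step
    return v_ref                   # unchanged when dI/dV == -I/V (at the MPP)

# One step at a toy operating point left of the MPP:
print(incond_step(v=30.0, i=5.0, v_prev=29.5, i_prev=5.02, v_ref=30.0))
```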
Rationales and Approaches for Studying Metabolism in Eukaryotic Microalgae
Veyel, Daniel; Erban, Alexander; Fehrle, Ines; Kopka, Joachim; Schroda, Michael
2014-01-01
The generation of efficient production strains is essential for the use of eukaryotic microalgae in biofuel production. Systems biology approaches, including metabolite profiling of promising microalgal strains, will provide a better understanding of their metabolic networks, which is crucial for metabolic engineering efforts. Chlamydomonas reinhardtii represents a well-suited model system for this purpose. We give an overview of genetically amenable microalgal strains with the potential for biofuel production and provide a critical review of currently used protocols for metabolite profiling in Chlamydomonas. We provide our own experimental data to underpin the validity of the conclusions drawn. PMID:24957022
Hybrid, experimental and computational, investigation of mechanical components
NASA Astrophysics Data System (ADS)
Furlong, Cosme; Pryputniewicz, Ryszard J.
1996-07-01
Computational and experimental methodologies have unique features for the analysis and solution of a wide variety of engineering problems. Computations provide results that depend on selection of input parameters such as geometry, material constants, and boundary conditions which, for correct modeling purposes, have to be appropriately chosen. In addition, it is relatively easy to modify the input parameters in order to computationally investigate different conditions. Experiments provide solutions which characterize the actual behavior of the object of interest subjected to specific operating conditions. However, it is impractical to experimentally perform parametric investigations. This paper discusses the use of a hybrid, computational and experimental, approach for study and optimization of mechanical components. Computational techniques are used for modeling the behavior of the object of interest while it is experimentally tested using noninvasive optical techniques. Comparisons are performed through a fringe predictor program used to facilitate the correlation between both techniques. In addition, experimentally obtained quantitative information, such as displacements and shape, can be applied in the computational model in order to improve this correlation. The result is a validated computational model that can be used for performing quantitative analyses and structural optimization. Practical application of the hybrid approach is illustrated with a representative example which demonstrates the viability of the approach as an engineering tool for structural analysis and optimization.
Experimental Definition and Validation of Protein Coding Transcripts in Chlamydomonas reinhardtii
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kourosh Salehi-Ashtiani; Jason A. Papin
Algal fuel sources promise unsurpassed yields in a carbon neutral manner that minimizes resource competition between agriculture and fuel crops. Many challenges must be addressed before algal biofuels can be accepted as a component of the fossil fuel replacement strategy. One significant challenge is that the cost of algal fuel production must become competitive with existing fuel alternatives. Algal biofuel production presents the opportunity to fine-tune microbial metabolic machinery for an optimal blend of biomass constituents and desired fuel molecules. Genome-scale model-driven algal metabolic design promises to facilitate both goals by directing the utilization of metabolites in the complex, interconnected metabolic networks to optimize production of the compounds of interest. Using Chlamydomonas reinhardtii as a model, we developed a systems-level methodology bridging metabolic network reconstruction with annotation and experimental verification of enzyme encoding open reading frames. We reconstructed a genome-scale metabolic network for this alga and devised a novel light-modeling approach that enables quantitative growth prediction for a given light source, resolving wavelength and photon flux. We experimentally verified transcripts accounted for in the network and physiologically validated model function through simulation and generation of new experimental growth data, providing high confidence in network contents and predictive applications. The network offers insight into algal metabolism and potential for genetic engineering and efficient light source design, a pioneering resource for studying light-driven metabolism and quantitative systems biology. Our approach to generate a predictive metabolic model integrated with cloned open reading frames provides a cost-effective platform to generate metabolic engineering resources.
While the generated resources are specific to algal systems, the approach that we have developed is not specific to algae and can be readily expanded to other microbial systems as well as higher plants and animals.
Aerodynamic Validation of Emerging Projectile Configurations
2011-12-01
was benchmarked against modern aerodynamic prediction programs like ANSYS CFX and Aero-Prediction 09 (AP09). Next, a comparison was made between two types of angle of attack generation methods in ANSYS CFX. The research then focused on controlled tilting of the projectile's nose to investigate the resulting aerodynamic effects. ANSYS CFX was found to provide better agreement with the experimental data than AP09.
Accelerated lattice Boltzmann model for colloidal suspensions rheology and interface morphology
NASA Astrophysics Data System (ADS)
Farhat, Hassan
Colloids are ubiquitous in the food, medical, cosmetic, polymer, water purification and pharmaceutical industries. Their thermal, mechanical and storage properties are highly dependent on their interface morphology and their rheological behavior. Numerical methods provide a cheap and reliable virtual laboratory for the study of colloids. However, efficiency is a major concern when using numerical methods for practical applications. This work introduces the main building blocks for an improved lattice Boltzmann-based numerical tool designed for the study of colloidal rheology and interface morphology. The efficiency of the proposed model is enhanced by using the recently developed and validated migrating multi-block algorithms for the lattice Boltzmann method (LBM). The migrating multi-block was used to simulate single component, multi-component, multiphase and single component multiphase flows. Results were validated by experimental, numerical and analytical solutions. Contamination of the fluid-fluid interface influences colloid morphology; this issue was addressed by the introduction of a hybrid LBM for surfactant-covered droplets. The module was used for the simulation of surfactant-covered droplet deformation under shear flow, under uniaxial extensional flow, and under buoyancy. Validation with experimental and theoretical results was provided. Colloids are non-Newtonian fluids which exhibit rich rheological behavior. The suppression-of-coalescence module is the part of the proposed model which facilitates the study of colloid rheology. The model results for the relative viscosity were in agreement with some theoretical results. Biological suspensions such as blood are macro-colloids by nature. The study of blood flow in the microvasculature was heuristically approached by treating the red blood cells as surfactant-covered droplets.
The effects of interfacial tension on the flow velocity and the droplet exclusion from the walls in parabolic flows were in qualitative agreement with some experimental and numerical results. The Fahraeus and the Fahraeus-Lindqvist effects were reproduced. The proposed LBM model provides a flexible numerical platform consisting of various modules which could be used separately or in combination for the study of a variety of colloids and biological suspensions flow deformation problems.
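The stream-and-collide skeleton shared by all the LBM modules above can be shown with a minimal single-component D1Q3 diffusion example. This is illustrative only; the thesis model is multi-block, multicomponent, and multiphase:

```python
import numpy as np

# Minimal D1Q3 BGK lattice Boltzmann sketch of pure diffusion.
n, tau, steps = 100, 0.8, 500
w = np.array([2 / 3, 1 / 6, 1 / 6])      # D1Q3 lattice weights
rho = np.ones(n)
rho[n // 2] = 2.0                        # initial density bump
f = w[:, None] * rho                     # start from local equilibrium

for _ in range(steps):
    rho = f.sum(axis=0)                  # macroscopic density
    feq = w[:, None] * rho               # diffusion equilibrium (no advection)
    f += (feq - f) / tau                 # BGK collision, relaxation time tau
    f[1] = np.roll(f[1], 1)              # stream right-moving population
    f[2] = np.roll(f[2], -1)             # stream left-moving population

rho = f.sum(axis=0)
print(round(float(rho.sum()), 6), round(float(rho.max()), 3))
```

Collision conserves mass exactly and the bump spreads with effective diffusivity cs^2 (tau - 1/2), here 0.1 in lattice units; multiphase and surfactant physics enter through added forcing and extra distribution functions.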
Integrated computational model of the bioenergetics of isolated lung mitochondria
Zhang, Xiao; Jacobs, Elizabeth R.; Camara, Amadou K. S.; Clough, Anne V.
2018-01-01
Integrated computational modeling provides a mechanistic and quantitative framework for describing lung mitochondrial bioenergetics. Thus, the objective of this study was to develop and validate a thermodynamically-constrained integrated computational model of the bioenergetics of isolated lung mitochondria. The model incorporates the major biochemical reactions and transport processes in lung mitochondria. A general framework was developed to model those biochemical reactions and transport processes. Intrinsic model parameters such as binding constants were estimated using previously published kinetic data for isolated enzymes and transporters. Extrinsic model parameters such as maximal reaction and transport velocities were estimated by fitting the integrated bioenergetics model to published and new tricarboxylic acid cycle and respirometry data measured in isolated rat lung mitochondria. The integrated model was then validated by assessing its ability to predict experimental data not used for the estimation of the extrinsic model parameters. For example, the model was able to predict reasonably well the substrate and temperature dependency of mitochondrial oxygen consumption, kinetics of NADH redox status, and the kinetics of mitochondrial accumulation of the cationic dye rhodamine 123, driven by mitochondrial membrane potential, under different respiratory states. The latter required the coupling of the integrated bioenergetics model to a pharmacokinetic model for the mitochondrial uptake of rhodamine 123 from buffer. The integrated bioenergetics model provides a mechanistic and quantitative framework for 1) integrating experimental data from isolated lung mitochondria under diverse experimental conditions, and 2) assessing the impact of a change in one or more mitochondrial processes on overall lung mitochondrial bioenergetics.
In addition, the model provides important insights into the bioenergetics and respiration of lung mitochondria and how they differ from those of mitochondria from other organs. To the best of our knowledge, this model is the first for the bioenergetics of isolated lung mitochondria. PMID:29889855
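One representative ingredient of such a model, Michaelis-Menten oxygen consumption, can be sketched with a forward-Euler integration. Vmax and Km are assumed round numbers for illustration, not the fitted lung values:

```python
# d[O2]/dt = -Vmax * [O2] / (Km + [O2]): near zero order at high O2,
# first order once [O2] falls below Km.
vmax, km = 10.0, 5.0     # nmol O2/(min*mg protein), uM (assumed)
o2, dt = 100.0, 0.001    # initial O2 (uM), time step (min)

for _ in range(int(30.0 / dt)):          # 30 minutes of respiration
    o2 -= dt * vmax * o2 / (km + o2)     # explicit Euler step

print(f"{o2:.2e}")                       # remaining O2 after 30 min, uM
```

The integrated model couples many such rate laws (plus membrane-potential-driven transport, as for rhodamine 123) into one ODE system, with the extrinsic Vmax-type parameters fitted to the respirometry data.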
NASA Astrophysics Data System (ADS)
Gobbato, Maurizio; Kosmatka, John B.; Conte, Joel P.
2014-04-01
Fatigue-induced damage is one of the most uncertain and highly unpredictable failure mechanisms for a large variety of mechanical and structural systems subjected to cyclic and random loads during their service life. A health monitoring system capable of (i) monitoring the critical components of these systems through non-destructive evaluation (NDE) techniques, (ii) assessing their structural integrity, (iii) recursively predicting their remaining fatigue life (RFL), and (iv) providing a cost-efficient reliability-based inspection and maintenance plan (RBIM) is therefore ultimately needed. In contribution to these objectives, the first part of the paper provides an overview and extension of a comprehensive reliability-based fatigue damage prognosis methodology — previously developed by the authors — for recursively predicting and updating the RFL of critical structural components and/or sub-components in aerospace structures. In the second part of the paper, a set of experimental fatigue test data, available in the literature, is used to provide a numerical verification and an experimental validation of the proposed framework at the reliability component level (i.e., a single damage mechanism evolving at a single damage location). The results obtained from this study demonstrate (i) the importance and benefits of a nearly continuous NDE monitoring system, (ii) the efficiency of the recursive Bayesian updating scheme, and (iii) the robustness of the proposed framework in recursively updating and improving the RFL estimates. This study also demonstrates that the proposed methodology can lead either to an extension of the RFL (with a consequent economic gain and without compromising the minimum safety requirements) or to an increase in safety by detecting a premature fault and thereby avoiding a very costly catastrophic failure.
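The recursive Bayesian updating idea can be illustrated with a grid posterior over a single hypothetical crack-growth-rate parameter, refined as each new NDE measurement arrives. The linear growth law and Gaussian noise below are toy stand-ins for the fracture-mechanics damage models and measurement models of the actual framework:

```python
import numpy as np

m_grid = np.linspace(0.05, 0.5, 200)          # candidate growth rates (mm per block)
posterior = np.ones_like(m_grid) / m_grid.size  # flat prior
a0, sigma = 1.0, 0.05                         # initial crack size (mm), NDE noise (mm)
true_m = 0.2                                  # "true" rate generating the data
rng = np.random.default_rng(1)

for k in range(1, 11):                        # ten inspection cycles
    a_meas = a0 + true_m * k + rng.normal(0.0, sigma)          # noisy NDE reading
    lik = np.exp(-0.5 * ((a_meas - (a0 + m_grid * k)) / sigma) ** 2)
    posterior *= lik                          # recursive Bayes: prior <- posterior
    posterior /= posterior.sum()

m_map = float(m_grid[np.argmax(posterior)])
print(round(m_map, 2))                        # MAP estimate of the growth rate
```

As measurements accumulate, the posterior tightens around the true rate, which is exactly the mechanism by which the RFL prediction improves recursively.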
Developing a predictive model for the chemical composition of soot nanoparticles
DOE Office of Scientific and Technical Information (OSTI.GOV)
Violi, Angela; Michelsen, Hope; Hansen, Nils
In order to provide the scientific foundation to enable technology breakthroughs in transportation fuel, it is important to develop a combustion modeling capability to optimize the operation and design of evolving fuels in advanced engines for transportation applications. The goal of this proposal is to develop a validated predictive model to describe the chemical composition of soot nanoparticles in premixed and diffusion flames. Atomistic studies in conjunction with state-of-the-art experiments are the distinguishing characteristics of this unique interdisciplinary effort. The modeling effort has been conducted at the University of Michigan by Prof. A. Violi. The experimental work has entailed a series of studies using different techniques to analyze gas-phase soot precursor chemistry and soot particle production in premixed and diffusion flames. Measurements have provided spatial distributions of polycyclic aromatic hydrocarbons and other gas-phase species, as well as the size and composition of incipient soot nanoparticles, for comparison with model results. The experimental team includes Dr. N. Hansen and Dr. H. Michelsen at Sandia National Labs' Combustion Research Facility, and Dr. K. Wilson as collaborator at Lawrence Berkeley National Lab's Advanced Light Source. Our results show that the chemical and physical properties of nanoparticles affect the coagulation behavior in soot formation, and our experimentally validated, predictive model for the chemical composition of soot nanoparticles will not only enhance our understanding of soot formation but will also allow the prediction of particle size distributions under combustion conditions. These results provide a novel description of soot formation based on the physical and chemical properties of the particles for use in the next generation of soot models and an enhanced capability for facilitating the design of alternative fuels and the engines they will power.
Follicle Online: an integrated database of follicle assembly, development and ovulation.
Hua, Juan; Xu, Bo; Yang, Yifan; Ban, Rongjun; Iqbal, Furhan; Cooke, Howard J; Zhang, Yuanwei; Shi, Qinghua
2015-01-01
Folliculogenesis is an important part of ovarian function as it provides the oocytes for female reproductive life. Characterizing the genes/proteins involved in folliculogenesis is fundamental for understanding the mechanisms associated with this biological function and for treating the diseases associated with folliculogenesis. A large number of genes/proteins associated with folliculogenesis have been identified from different species. However, no dedicated public resource is currently available for folliculogenesis-related genes/proteins that are validated by experiments. Here, we report a database, 'Follicle Online', that provides an experimentally validated gene/protein map of folliculogenesis in a number of species. Follicle Online is a web-based database system for storing and retrieving folliculogenesis-related experimental data. It provides detailed information for 580 genes/proteins (from 23 model organisms, including Homo sapiens, Mus musculus, Rattus norvegicus, Mesocricetus auratus, Bos taurus, Drosophila and Xenopus laevis) that have been reported to be involved in folliculogenesis, POF (premature ovarian failure) and PCOS (polycystic ovary syndrome). The literature was manually curated from more than 43,000 published articles (up to 1 March 2014). The Follicle Online database is implemented in PHP + MySQL + JavaScript and this user-friendly web application provides access to the stored data. In summary, we have developed a centralized database that provides users with comprehensive information about the genes/proteins involved in folliculogenesis. This database can be accessed freely and all the stored data can be viewed without any registration. Database URL: http://mcg.ustc.edu.cn/sdap1/follicle/index.php © The Author(s) 2015. Published by Oxford University Press.
Design and Experimental Validation for Direct-Drive Fault-Tolerant Permanent-Magnet Vernier Machines
Liu, Guohai; Yang, Junqin; Chen, Ming; Chen, Qian
2014-01-01
A fault-tolerant permanent-magnet vernier (FT-PMV) machine is designed for direct-drive applications, combining the merits of high torque density and high reliability. Based on the so-called magnetic gearing effect, PMV machines achieve high torque density by introducing flux-modulation poles (FMPs). This paper investigates the fault-tolerant characteristics of PMV machines and provides a design method that is able not only to meet the fault-tolerant requirements but also to retain high torque density. The operation principle of the proposed machine is analyzed. The design process and optimization are presented in detail, including the combination of slots and poles, the winding distribution, and the dimensions of the PMs and teeth. By using the time-stepping finite element method (TS-FEM), the machine performances are evaluated. Finally, the FT-PMV machine is manufactured, and experimental results are presented to validate the theoretical analysis. PMID:25045729
MATLAB/Simulink Pulse-Echo Ultrasound System Simulator Based on Experimentally Validated Models.
Kim, Taehoon; Shin, Sangmin; Lee, Hyongmin; Lee, Hyunsook; Kim, Heewon; Shin, Eunhee; Kim, Suhwan
2016-02-01
A flexible clinical ultrasound system must operate with different transducers, which have characteristic impulse responses and widely varying impedances. The impulse response determines the shape of the high-voltage pulse that is transmitted and the specifications of the front-end electronics that receive the echo; the impedance determines the specification of the matching network through which the transducer is connected. System-level optimization of these subsystems requires accurate modeling of pulse-echo (two-way) response, which in turn demands a unified simulation of the ultrasonics and electronics. In this paper, this is realized by combining MATLAB/Simulink models of the high-voltage transmitter, the transmission interface, the acoustic subsystem which includes wave propagation and reflection, the receiving interface, and the front-end receiver. To demonstrate the effectiveness of our simulator, the models are experimentally validated by comparing the simulation results with the measured data from a commercial ultrasound system. This simulator could be used to quickly provide system-level feedback for an optimized tuning of electronic design parameters.
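At its simplest, the pulse-echo (two-way) response described above can be approximated by convolving the excitation pulse with the transducer impulse response once for transmission and once again for reception. A minimal sketch under assumed transducer parameters (not the paper's MATLAB/Simulink models):

```python
import numpy as np

# Minimal two-way response sketch: the received echo is the transmit
# pulse filtered by the transducer impulse response on transmission
# and again on reception. All values below are illustrative.

fs = 50e6                                  # sampling rate [Hz] (assumed)
n = 128
t = np.arange(n) / fs

# Gaussian-windowed 5 MHz impulse response (assumed transducer shape)
f0, tau = 5e6, 0.3e-6
h = np.exp(-((t - 0.5e-6) / tau) ** 2) * np.sin(2 * np.pi * f0 * t)

pulse = np.zeros_like(t)                   # idealized high-voltage spike
pulse[0] = 1.0

one_way = np.convolve(pulse, h)            # transmitted acoustic pulse
echo = np.convolve(one_way, h)             # echo after reception (two-way)
```

The two-way response is narrower in bandwidth than the one-way response, which is why system-level tuning of the front-end electronics needs the full pulse-echo model rather than the transmit path alone.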
Quasi-experimental study designs series-paper 6: risk of bias assessment.
Waddington, Hugh; Aloe, Ariel M; Becker, Betsy Jane; Djimeu, Eric W; Hombrados, Jorge Garcia; Tugwell, Peter; Wells, George; Reeves, Barney
2017-09-01
Rigorous and transparent bias assessment is a core component of high-quality systematic reviews. We assess modifications to existing risk of bias approaches to incorporate rigorous quasi-experimental approaches with selection on unobservables. These are nonrandomized studies using design-based approaches to control for unobservable sources of confounding, such as difference studies, instrumental variables, interrupted time series, natural experiments, and regression-discontinuity designs. We review existing risk of bias tools. Drawing on these tools, we present domains of bias and suggest directions for evaluation questions. The review suggests that existing risk of bias tools provide, to varying degrees, incomplete criteria for transparently assessing the validity of these designs. The paper then presents an approach to evaluating the internal validity of quasi-experiments with selection on unobservables. We conclude that tools for nonrandomized studies of interventions need to be further developed to incorporate evaluation questions for quasi-experiments with selection on unobservables. Copyright © 2017 Elsevier Inc. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chauvin, J. P.; Lebrat, J. F.; Soule, R.
Since 1991, the CEA has studied the physics of hybrid systems, involving a sub-critical reactor coupled with an accelerator. These studies have provided information on the potential of hybrid systems to transmute actinides and long-lived fission products. The potential of such a system remains to be proven, specifically in terms of the physical understanding of the different phenomena involved and their modelling, as well as in terms of experimental validation of coupled systems (sub-critical environment/accelerator). This validation must be achieved through mock-up studies of sub-critical environments coupled to a source of external neutrons. The MUSE-4 mock-up experiment is planned at the MASURCA facility and will use an accelerator coupled to a tritium target. The large step from the neutron generator used in the past to this accelerator will make it possible to deepen the understanding of hybrid-system physics and to reduce the experimental biases and measurement uncertainties.
NASA Technical Reports Server (NTRS)
Sances, Dillon J.; Gangadharan, Sathya N.; Sudermann, James E.; Marsell, Brandon
2010-01-01
Liquid sloshing within spacecraft propellant tanks causes rapid energy dissipation at resonant modes, which can result in attitude destabilization of the vehicle. Identifying resonant slosh modes currently requires experimental testing and mechanical pendulum analogs to characterize the slosh dynamics. Computational Fluid Dynamics (CFD) techniques have recently been validated as an effective tool for simulating fuel slosh within free-surface propellant tanks. Propellant tanks often incorporate an internal flexible diaphragm to separate ullage and propellant which increases modeling complexity. A coupled fluid-structure CFD model is required to capture the damping effects of a flexible diaphragm on the propellant. ANSYS multidisciplinary engineering software employs a coupled solver for analyzing two-way Fluid Structure Interaction (FSI) cases such as the diaphragm propellant tank system. Slosh models generated by ANSYS software are validated by experimental lateral slosh test results. Accurate data correlation would produce an innovative technique for modeling fuel slosh within diaphragm tanks and provide an accurate and efficient tool for identifying resonant modes and the slosh dynamic response.
Modeling, simulation, and estimation of optical turbulence
NASA Astrophysics Data System (ADS)
Formwalt, Byron Paul
This dissertation documents three new contributions to the simulation and modeling of optical turbulence. The first contribution is the formalization, optimization, and validation of a modeling technique called successively conditioned rendering (SCR). The SCR technique is empirically validated by comparing the statistical error of random phase screens generated with the technique. The second contribution is the derivation of the covariance delineation theorem, which provides theoretical bounds on the error associated with SCR. It is shown empirically that the theoretical bound may be used to predict relative algorithm performance; the covariance delineation theorem is therefore a powerful tool for optimizing SCR algorithms. For the third contribution, we introduce a new method for passively estimating optical turbulence parameters and demonstrate it experimentally using a 100 m horizontal path at 1.25 m above sun-heated tarmac on a clear afternoon. For this experiment, we estimated Cn^2 ≈ 6.01 × 10^-9 m^(-2/3), l0 ≈ 17.9 mm, and L0 ≈ 15.5 m.
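For scale, a path-averaged Cn^2 of this magnitude can be converted to a plane-wave Fried parameter via the standard relation r0 = (0.423 k^2 Cn^2 L)^(-3/5) for a uniform path; the wavelength below is an assumed value for illustration only.

```python
import math

# Sketch: plane-wave Fried parameter r0 from a path-averaged Cn^2
# over a uniform horizontal path. The 1.55 um wavelength is an
# assumption for illustration, not a value from the experiment.

Cn2 = 6.01e-9        # m^(-2/3), as reported in the abstract
L = 100.0            # path length [m]
lam = 1.55e-6        # assumed wavelength [m]
k = 2.0 * math.pi / lam

r0 = (0.423 * k**2 * Cn2 * L) ** (-3.0 / 5.0)
print(f"r0 = {r0 * 1e3:.3f} mm")   # sub-millimeter: very strong turbulence
```

A sub-millimeter r0 over a 100 m path is consistent with the extreme near-ground turbulence expected above sun-heated tarmac.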
Experimental and Numerical Investigation of Flow Properties of Supersonic Helium-Air Jets
NASA Technical Reports Server (NTRS)
Miller, Steven A. E.; Veltin, Jeremy
2010-01-01
Heated high-speed subsonic and supersonic jets operating on- or off-design are a source of noise that is not yet fully understood. Helium-air mixtures can be used in the correct ratio to simulate the total temperature ratio of heated air jets and hence have the potential to provide inexpensive and reliable flow and acoustic measurements. This study presents a combination of flow measurements of helium-air high-speed jets and numerical simulations of similar helium-air mixture and heated air jets. Jets issuing from axisymmetric convergent and convergent-divergent nozzles are investigated, and the results show very strong similarity with heated air jet measurements found in the literature. Together with the excellent agreement obtained between the numerical predictions and the experiments, this demonstrates the validity of simulating heated high-speed jets with helium-air mixtures in the laboratory. The very close match between the numerical and experimental data also validates the frozen chemistry model used in the numerical simulation.
Validation of the thermal challenge problem using Bayesian Belief Networks.
DOE Office of Scientific and Technical Information (OSTI.GOV)
McFarland, John; Swiler, Laura Painton
The thermal challenge problem has been developed at Sandia National Laboratories as a testbed for demonstrating various types of validation approaches and prediction methods. This report discusses one particular methodology to assess the validity of a computational model given experimental data. This methodology is based on Bayesian Belief Networks (BBNs) and can incorporate uncertainty in experimental measurements, in physical quantities, and in the model itself. The approach uses the prior and posterior distributions of model output to compute a validation metric based on Bayesian hypothesis testing (a Bayes' factor). This report discusses various aspects of the BBN, specifically in the context of the thermal challenge problem. A BBN is developed for a given set of experimental data in a particular experimental configuration. The development of the BBN and the method for "solving" the BBN to develop the posterior distribution of model output through Markov chain Monte Carlo sampling are discussed in detail. The use of the BBN to compute a Bayes' factor is demonstrated.
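The Bayes'-factor validation metric mentioned above can be sketched in miniature: compare the hypothesis that the model-minus-experiment residuals have zero bias against a diffuse-bias alternative, estimating the alternative's marginal likelihood by Monte Carlo. The residuals, noise level, and prior below are made-up illustrative values, not the report's BBN.

```python
import numpy as np

# Toy Bayes factor for model validation: H0 "model bias is zero"
# vs. H1 "bias theta ~ N(0, tau^2)", given model-minus-experiment
# residuals. All numbers are illustrative.

residuals = np.array([0.10, -0.20, 0.05, 0.15, -0.08, 0.02, -0.12, 0.07])
sigma, tau = 1.0, 1.0                      # measurement sd, prior sd (assumed)

def log_lik(theta):
    """Gaussian log-likelihood of the residuals given bias theta."""
    return np.sum(-0.5 * ((residuals[None, :] - theta[:, None]) / sigma) ** 2
                  - 0.5 * np.log(2 * np.pi * sigma**2), axis=1)

# Marginal likelihood under H1 by averaging over the prior on theta
rng = np.random.default_rng(1)
theta = rng.normal(0.0, tau, size=200_000)
log_p_h1 = float(np.log(np.mean(np.exp(log_lik(theta)))))

# Under H0 the bias is fixed at zero
log_p_h0 = float(log_lik(np.array([0.0]))[0])

bayes_factor = float(np.exp(log_p_h0 - log_p_h1))   # B01 > 1 favors the model
```

With small residuals, the Bayes factor comes out above 1, favoring the "model valid" hypothesis; a full BBN treatment additionally propagates uncertainty in the physical inputs rather than treating the residuals as given.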
An improved swarm optimization for parameter estimation and biological model selection.
Abdullah, Afnizanfaizal; Deris, Safaai; Mohamad, Mohd Saberi; Anwar, Sohail
2013-01-01
One of the key aspects of computational systems biology is the investigation on the dynamic biological processes within cells. Computational models are often required to elucidate the mechanisms and principles driving the processes because of the nonlinearity and complexity. The models usually incorporate a set of parameters that signify the physical properties of the actual biological systems. In most cases, these parameters are estimated by fitting the model outputs with the corresponding experimental data. However, this is a challenging task because the available experimental data are frequently noisy and incomplete. In this paper, a new hybrid optimization method is proposed to estimate these parameters from the noisy and incomplete experimental data. The proposed method, called Swarm-based Chemical Reaction Optimization, integrates the evolutionary searching strategy employed by the Chemical Reaction Optimization, into the neighbouring searching strategy of the Firefly Algorithm method. The effectiveness of the method was evaluated using a simulated nonlinear model and two biological models: synthetic transcriptional oscillators, and extracellular protease production models. The results showed that the accuracy and computational speed of the proposed method were better than the existing Differential Evolution, Firefly Algorithm and Chemical Reaction Optimization methods. The reliability of the estimated parameters was statistically validated, which suggests that the model outputs produced by these parameters were valid even when noisy and incomplete experimental data were used. Additionally, Akaike Information Criterion was employed to evaluate the model selection, which highlighted the capability of the proposed method in choosing a plausible model based on the experimental data. In conclusion, this paper presents the effectiveness of the proposed method for parameter estimation and model selection problems using noisy and incomplete experimental data. 
It is hoped that this study will provide new insight into the development of more accurate and reliable biological models from limited, low-quality experimental data.
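The parameter-estimation task described above, fitting model outputs to noisy data, can be sketched generically; here a coarse grid search stands in for the paper's swarm-based optimizer, and the exponential-decay model and noise level are illustrative assumptions.

```python
import numpy as np

# Generic illustration of parameter estimation by minimizing the sum
# of squared errors (SSE) between model output and noisy data. The
# decay model and a coarse grid search are stand-in assumptions, not
# the paper's Swarm-based Chemical Reaction Optimization method.

rng = np.random.default_rng(42)
t = np.linspace(0.0, 8.0, 40)
a_true, b_true = 2.0, 0.5
data = a_true * np.exp(-b_true * t) + rng.normal(0.0, 0.05, t.size)

def sse(a, b):
    """Sum of squared errors between model output and the noisy data."""
    return float(np.sum((a * np.exp(-b * t) - data) ** 2))

# Exhaustive search over a parameter grid (the swarm would explore
# this same objective surface adaptively instead)
best = min(((sse(a, b), a, b)
            for a in np.linspace(0.5, 4.0, 71)
            for b in np.linspace(0.05, 1.5, 146)),
           key=lambda x: x[0])
_, a_hat, b_hat = best
```

Even this brute-force search recovers the true parameters when the noise is modest; the metaheuristics compared in the paper matter when the parameter space is too large or too rugged for exhaustive search.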
Probing the free energy landscape of the FBP28WW domain using multiple techniques.
Periole, Xavier; Allen, Lucy R; Tamiola, Kamil; Mark, Alan E; Paci, Emanuele
2009-05-01
The free-energy landscape of a small protein, the FBP28 WW domain, has been explored using molecular dynamics (MD) simulations with alternative descriptions of the molecule. The molecular models used range from coarse-grained to all-atom with either an implicit or explicit treatment of the solvent. Sampling of conformation space was performed using both conventional and temperature-replica exchange MD simulations. Experimental chemical shifts and NOEs were used to validate the simulations, and experimental phi values were used both for validation and as restraints. This combination of different approaches has provided insight into the free energy landscape and barriers encountered by the protein during folding and enabled the characterization of native, denatured and transition states which are compatible with the available experimental data. All the molecular models used stabilize well defined native and denatured basins; however, the degree of agreement with the available experimental data varies. While the most detailed, explicit solvent model predicts the data reasonably accurately, it does not fold despite a simulation time 10 times that of the experimental folding time. The less detailed models performed poorly relative to the explicit solvent model: an implicit solvent model stabilizes a ground state which differs from the experimental native state, and a structure-based model underestimates the size of the barrier between the two states. The use of experimental phi values both as restraints, and to extract structures from unfolding simulations, results in conformations which, although not necessarily true transition states, appear to share the geometrical characteristics of transition state structures. In addition to characterizing the native, transition and denatured states of this particular system, the advantages and limitations of using varying levels of representation are discussed. 2008 Wiley Periodicals, Inc.
Development of the trickle roof cooling and heating system: Experimental plan
NASA Astrophysics Data System (ADS)
Haves, P.; Jankovic, T.; Doderer, E.
1982-07-01
A passive system applicable both to retrofit and new construction was developed. This system (the trickle roof system) dissipates heat from a thin film of water flowing over the roof. A small scale trickle roof system dissipator was tested at Trinity University under a range of ambient conditions and operating configurations. The results suggest that trickle roof systems should have performance comparable to roof pond systems. A review is provided of the trickle roof system concept, several possible configurations, and the benefits the systems can provide. Test module experiments and results are presented in detail. The requirements for full scale testing are discussed and a plan is outlined using the two identical residential scale passive test facility buildings at Trinity University, San Antonio, Texas. Full scale experimental results would be used to validate computer algorithms, provide system optimization, and produce a nationwide performance assessment and design guidelines. This would provide industry with the information necessary to determine the commercial potential of the trickle roof system.
Moyle, Richard L.; Carvalhais, Lilia C.; Pretorius, Lara-Simone; Nowak, Ekaterina; Subramaniam, Gayathery; Dalton-Morgan, Jessica; Schenk, Peer M.
2017-01-01
Studies investigating the action of small RNAs on computationally predicted target genes require some form of experimental validation. Classical molecular methods of validating microRNA action on target genes are laborious, while approaches that tag predicted target sequences to qualitative reporter genes encounter technical limitations. The aim of this study was to address the challenge of experimentally validating large numbers of computationally predicted microRNA-target transcript interactions using an optimized, quantitative, cost-effective, and scalable approach. The presented method combines transient expression via agroinfiltration of Nicotiana benthamiana leaves with a quantitative dual luciferase reporter system, where firefly luciferase is used to report the microRNA-target sequence interaction and Renilla luciferase is used as an internal standard to normalize expression between replicates. We report the appropriate concentration of N. benthamiana leaf extracts and dilution factor to apply in order to avoid inhibition of firefly LUC activity. Furthermore, the optimal ratio of microRNA precursor expression construct to reporter construct and duration of the incubation period post-agroinfiltration were determined. The optimized dual luciferase assay provides an efficient, repeatable and scalable method to validate and quantify microRNA action on predicted target sequences. The optimized assay was used to validate five predicted targets of rice microRNA miR529b, with as few as six technical replicates. The assay can be extended to assess other small RNA-target sequence interactions, including assessing the functionality of an artificial miRNA or an RNAi construct on a targeted sequence. PMID:28979287
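The normalization step described above reduces to a simple ratio: firefly activity divided by Renilla activity, compared between the miRNA treatment and a control. The luminescence counts below are made-up example readings, not data from the study.

```python
# Sketch of the dual-luciferase readout: firefly (FLUC) activity
# reports the microRNA-target interaction and is normalized by
# Renilla (RLUC) activity to control for transfection efficiency.
# All counts are made-up example luminescence readings.

def normalized_activity(fluc, rluc):
    """Firefly activity normalized to the Renilla internal standard."""
    return fluc / rluc

# miRNA + target-sensor co-infiltration vs. an empty-vector control;
# repression shows up as a reduced FLUC/RLUC ratio.
target = normalized_activity(fluc=200.0, rluc=100.0)
control = normalized_activity(fluc=800.0, rluc=100.0)

relative_activity = target / control    # fraction of control activity
```

Because each replicate is normalized to its own Renilla signal, leaf-to-leaf differences in agroinfiltration efficiency cancel out, which is what makes the assay quantitative across replicates.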
Young, Jasmine Y; Westbrook, John D; Feng, Zukang; Sala, Raul; Peisach, Ezra; Oldfield, Thomas J; Sen, Sanchayita; Gutmanas, Aleksandras; Armstrong, David R; Berrisford, John M; Chen, Li; Chen, Minyu; Di Costanzo, Luigi; Dimitropoulos, Dimitris; Gao, Guanghua; Ghosh, Sutapa; Gore, Swanand; Guranovic, Vladimir; Hendrickx, Pieter M S; Hudson, Brian P; Igarashi, Reiko; Ikegawa, Yasuyo; Kobayashi, Naohiro; Lawson, Catherine L; Liang, Yuhe; Mading, Steve; Mak, Lora; Mir, M Saqib; Mukhopadhyay, Abhik; Patwardhan, Ardan; Persikova, Irina; Rinaldi, Luana; Sanz-Garcia, Eduardo; Sekharan, Monica R; Shao, Chenghua; Swaminathan, G Jawahar; Tan, Lihua; Ulrich, Eldon L; van Ginkel, Glen; Yamashita, Reiko; Yang, Huanwang; Zhuravleva, Marina A; Quesada, Martha; Kleywegt, Gerard J; Berman, Helen M; Markley, John L; Nakamura, Haruki; Velankar, Sameer; Burley, Stephen K
2017-03-07
OneDep, a unified system for deposition, biocuration, and validation of experimentally determined structures of biological macromolecules to the PDB archive, has been developed as a global collaboration by the worldwide PDB (wwPDB) partners. This new system was designed to ensure that the wwPDB could meet the evolving archiving requirements of the scientific community over the coming decades. OneDep unifies deposition, biocuration, and validation pipelines across all wwPDB, EMDB, and BMRB deposition sites with improved focus on data quality and completeness in these archives, while supporting growth in the number of depositions and increases in their average size and complexity. In this paper, we describe the design, functional operation, and supporting infrastructure of the OneDep system, and provide initial performance assessments. Published by Elsevier Ltd.
NASA Technical Reports Server (NTRS)
Storms, Bruce L.; Ross, James C.; Heineck, James T.; Walker, Stephen M.; Driver, David M.; Zilliac, Gregory G.; Bencze, Daniel P. (Technical Monitor)
2001-01-01
The 1/8-scale Ground Transportation System (GTS) model was studied experimentally in the NASA Ames 7- by 10-Ft Wind Tunnel. Designed for validation of computational fluid dynamics (CFD), the GTS model has a simplified geometry with a cab-over-engine design and no tractor-trailer gap. As a further simplification, all measurements of the GTS model were made without wheels. Aerodynamic boattail plates were also tested on the rear of the trailer to provide a simple geometry modification for computation. The experimental measurements include body-axis drag, surface pressures, surface hot-film anemometry, oil-film interferometry, and 3-D particle image velocimetry (PIV). The wind-averaged drag coefficient with and without boattail plates was 0.225 and 0.277, respectively. PIV measurements behind the model reveal a significant reduction in the wake size due to the flow turning provided by the boattail plates. Hot-film measurements on the side of the cab indicate laminar separation with turbulent reattachment within 0.08 trailer width for zero and +/- 10 degrees yaw. Oil film interferometry provided quantitative measurements of skin friction and qualitative oil flow images. A complete set of the experimental data and the surface definition of the model are included on a CD-ROM for further analysis and comparison.
HIPdb: a database of experimentally validated HIV inhibiting peptides.
Qureshi, Abid; Thakur, Nishant; Kumar, Manoj
2013-01-01
Besides antiretroviral drugs, peptides have also demonstrated potential to inhibit the Human immunodeficiency virus (HIV). For example, T20 has been discovered to effectively block HIV entry and was approved by the FDA as a novel anti-HIV peptide (AHP). We have collated all experimental information on AHPs at a single platform. HIPdb is a manually curated database of experimentally verified HIV inhibiting peptides targeting various steps or proteins involved in the life cycle of HIV, e.g., fusion, integration, and reverse transcription. This database provides experimental information for 981 peptides. These are of varying length, obtained from natural as well as synthetic sources, and tested on different cell lines. Important fields included are peptide sequence, length, source, target, cell line, inhibition/IC50, assay and reference. The database provides user-friendly browse, search, sort and filter options. It also contains useful services like BLAST and 'Map' for alignment with user-provided sequences. In addition, predicted structure and physicochemical properties of the peptides are also included. The HIPdb database is freely available at http://crdd.osdd.net/servers/hipdb. The comprehensive information in this database will be helpful in selecting/designing effective anti-HIV peptides, and thus it may prove a useful resource to researchers for peptide-based therapeutics development.
Modeling Specular Exchange Between Concentric Cylinders in a Radiative Shielded Furnace
NASA Technical Reports Server (NTRS)
Schunk, Richard Gregory; Wessling, Francis C.
2000-01-01
The objective of this research is to develop and validate mathematical models to characterize the thermal performance of a radiative shielded furnace, the University of Alabama in Huntsville (UAH) Isothermal Diffusion Oven. The mathematical models are validated against experimental data obtained from testing the breadboard oven in a terrestrial laboratory environment. It is anticipated that the validation will produce math models capable of predicting the thermal performance of the furnace over a wide range of operating conditions, including those for which no experimental data are available. Of particular interest are the furnace core temperature versus heater power parametric relationship and the transient thermal response of the furnace. Application to a microgravity environment is not considered, although it is conjectured that the removal of any gravity-dependent terms from the math models developed for the terrestrial application should yield adequate results in a microgravity environment. The UAH Isothermal Diffusion Oven is designed to provide a thermal environment that is conducive to measuring the diffusion of high temperature liquid metals. In addition to achieving the temperatures required to melt a sample placed within the furnace, reducing or eliminating convective motions within the melt is an important design consideration [1]. Both of these influences are reflected in the design of the furnace. Reducing unwanted heat losses from the furnace is achieved through the use of low conductivity materials and reflective shielding. As evidenced by the highly conductive copper core used to house the sample within the furnace, convective motions can be greatly suppressed by providing an essentially uniform thermal environment. An oven of this design could ultimately be utilized in a microgravity environment, presumably as an experiment payload.
Such an application precipitates other design requirements that limit the resources available to the furnace such as power, mass, volume, and possibly even time. Through the experimental and numerical results obtained, the power requirements and thermal response time of the breadboard furnace are quantified.
Validation of the Physics Analysis used to Characterize the AGR-1 TRISO Fuel Irradiation Test
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sterbentz, James W.; Harp, Jason M.; Demkowicz, Paul A.
2015-05-01
The results of a detailed physics depletion calculation used to characterize the AGR-1 TRISO-coated particle fuel test irradiated in the Advanced Test Reactor (ATR) at the Idaho National Laboratory are compared to measured data for the purpose of validation. The particle fuel was irradiated for 13 ATR power cycles over three calendar years. The physics analysis predicts compact burnups ranging from 11.30-19.56% FIMA and cumulative fast neutron fluences from 2.21-4.39 × 10^25 n/m^2 under simulated high-temperature gas-cooled reactor conditions in the ATR. The physics depletion calculation can provide a full characterization of all 72 irradiated TRISO-coated particle compacts during and post-irradiation, so validation of this physics calculation was a top priority. The validation of the physics analysis was done through comparisons with available measured experimental data, which included: 1) high-resolution gamma scans for compact activity and burnup, 2) mass spectrometry for compact burnup, 3) flux wires for cumulative fast fluence, and 4) mass spectrometry for individual actinide and fission product concentrations. The measured data are generally in very good agreement with the calculated results, and therefore provide an adequate validation of the physics analysis and the results used to characterize the irradiated AGR-1 TRISO fuel.
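The burnup unit used above, percent FIMA, is fissions per 100 initial heavy-metal atoms. A small sketch, where the ~9.4 MWd/kgHM per %FIMA conversion (based on roughly 200 MeV per fission) is a rule-of-thumb assumption, not a value from the report:

```python
# Sketch of the %FIMA burnup unit: fissions per initial metal atom,
# expressed as a percentage. The conversion factor to MWd/kgHM is a
# rough rule of thumb for uranium fuel, not a value from the report.

def pct_fima(fissions, initial_heavy_metal_atoms):
    """Burnup as fissions per 100 initial heavy-metal atoms."""
    return 100.0 * fissions / initial_heavy_metal_atoms

# Hypothetical compact inventory chosen to land at the report's
# maximum burnup of 19.56% FIMA
burnup = pct_fima(fissions=1.956e21, initial_heavy_metal_atoms=1.0e22)
burnup_mwd_per_kg = burnup * 9.4   # ~9.4 MWd/kgHM per %FIMA (assumed)
```

This is why %FIMA can be cross-checked both by gamma scanning (fission-product activity) and by mass spectrometry (actinide depletion), as done in the validation above.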
Rational selection of training and test sets for the development of validated QSAR models
NASA Astrophysics Data System (ADS)
Golbraikh, Alexander; Shen, Min; Xiao, Zhiyan; Xiao, Yun-De; Lee, Kuo-Hsiung; Tropsha, Alexander
2003-02-01
Quantitative Structure-Activity Relationship (QSAR) models are used increasingly to screen chemical databases and/or virtual chemical libraries for potentially bioactive molecules. These developments emphasize the importance of rigorous model validation to ensure that the models have acceptable predictive power. Using the k nearest neighbors (kNN) variable selection QSAR method for the analysis of several datasets, we have demonstrated recently that the widely accepted leave-one-out (LOO) cross-validated R2 (q2) is an inadequate characteristic to assess the predictive ability of the models [Golbraikh, A., Tropsha, A. Beware of q2! J. Mol. Graphics Mod. 20, 269-276, (2002)]. Herein, we provide additional evidence that there exists no correlation between the values of q2 for the training set and accuracy of prediction (R2) for the test set and argue that this observation is a general property of any QSAR model developed with LOO cross-validation. We suggest that external validation using rationally selected training and test sets provides a means to establish a reliable QSAR model. We propose several approaches to the division of experimental datasets into training and test sets and apply them in QSAR studies of 48 functionalized amino acid anticonvulsants and a series of 157 epipodophyllotoxin derivatives with antitumor activity. We formulate a set of general criteria for the evaluation of predictive power of QSAR models.
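One concrete instance of rational training/test selection is a Kennard-Stone-style maximin split in descriptor space; the abstract does not specify the authors' exact algorithms, so the sketch below is only illustrative, and the function name and arguments are assumptions.

```python
import numpy as np

def kennard_stone_split(X, n_train):
    """Select n_train rows of descriptor matrix X as a training set using the
    Kennard-Stone maximin procedure; the remaining rows form the test set."""
    X = np.asarray(X, dtype=float)
    dist = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)  # pairwise distances
    # seed the training set with the two most distant compounds
    i, j = np.unravel_index(np.argmax(dist), dist.shape)
    train = [int(i), int(j)]
    remaining = [k for k in range(len(X)) if k not in train]
    while len(train) < n_train:
        # pick the compound farthest from its nearest already-selected neighbor
        d_min = dist[np.ix_(remaining, train)].min(axis=1)
        nxt = remaining[int(np.argmax(d_min))]
        train.append(nxt)
        remaining.remove(nxt)
    return sorted(train), sorted(remaining)
```

Because each new training compound is the one farthest from all previously selected ones, the training set spans descriptor space and the held-out test compounds fall inside its coverage, which is the property external validation relies on.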
Fischer, Kenneth J; Johnson, Joshua E; Waller, Alexander J; McIff, Terence E; Toby, E Bruce; Bilgen, Mehmet
2011-10-01
The objective of this study was to validate the MRI-based joint contact modeling methodology in the radiocarpal joints by comparison of model results with invasive specimen-specific radiocarpal contact measurements from four cadaver experiments. We used a single validation criterion for multiple outcome measures to characterize the utility and overall validity of the modeling approach. For each experiment, a Pressurex film and a Tekscan sensor were sequentially placed into the radiocarpal joints during simulated grasp. Computer models were constructed based on MRI visualization of the cadaver specimens without load. Images were also acquired during the loaded configuration used with the direct experimental measurements. Geometric surface models of the radius, scaphoid and lunate (including cartilage) were constructed from the images acquired without the load. The carpal bone motions from the unloaded state to the loaded state were determined using a series of 3D image registrations. Cartilage thickness was assumed uniform at 1.0 mm with an effective compressive modulus of 4 MPa. Validation was based on experimental versus model contact area, contact force, average contact pressure and peak contact pressure for the radioscaphoid and radiolunate articulations. Contact area was also measured directly from images acquired under load and compared to the experimental and model data. Qualitatively, there was good correspondence between the MRI-based model data and experimental data, with consistent relative size, shape and location of radioscaphoid and radiolunate contact regions. Quantitative data from the model generally compared well with the experimental data for all specimens. Contact area from the MRI-based model was very similar to the contact area measured directly from the images. For all outcome measures except average and peak pressures, at least two specimen models met the validation criteria with respect to experimental measurements for both articulations. 
Only the model for one specimen met the validation criteria for average and peak pressure of both articulations; however, the experimental measures for peak pressure also exhibited high variability. MRI-based modeling can reliably be used for evaluating contact area and contact force with confidence similar to that of currently available experimental techniques. Average contact pressure and peak contact pressure were more variable across all measurement techniques, and these measures from MRI-based modeling should be used with some caution.
MAVRIC Flutter Model Transonic Limit Cycle Oscillation Test
NASA Technical Reports Server (NTRS)
Edwards, John W.; Schuster, David M.; Spain, Charles V.; Keller, Donald F.; Moses, Robert W.
2001-01-01
The Models for Aeroelastic Validation Research Involving Computation semi-span wind-tunnel model (MAVRIC-I), a business jet wing-fuselage flutter model, was tested in NASA Langley's Transonic Dynamics Tunnel with the goal of obtaining experimental data suitable for Computational Aeroelasticity code validation at transonic separation onset conditions. This research model is notable for its inexpensive construction and instrumentation installation procedures. Unsteady pressures and wing responses were obtained for three wingtip configurations: clean, tipstore, and winglet. Traditional flutter boundaries were measured over the range of M = 0.6 to 0.9 and maps of Limit Cycle Oscillation (LCO) behavior were made in the range of M = 0.85 to 0.95. Effects of dynamic pressure and angle-of-attack were measured. Testing in both R134a heavy gas and air provided unique data on Reynolds number, transition effects, and the effect of speed of sound on LCO behavior. The data set provides excellent code validation test cases for the important class of flow conditions involving shock-induced transonic flow separation onset at low wing angles, including LCO behavior.
Geist, Rebecca E; DuBois, Chase H; Nichols, Timothy C; Caughey, Melissa C; Merricks, Elizabeth P; Raymer, Robin; Gallippi, Caterina M
2016-09-01
Acoustic radiation force impulse (ARFI) Surveillance of Subcutaneous Hemorrhage (ASSH) has previously been demonstrated to differentiate bleeding phenotype and response to therapy in dogs and humans, but to date the method has lacked experimental validation. This work explores experimental validation of ASSH in a poroelastic tissue-mimic and in vivo in dogs. The experimental design exploits calibrated flow rates and infusion durations of evaporated milk in tofu or heparinized autologous blood in dogs. The validation approach enables controlled comparisons of ASSH-derived bleeding rate (BR) and time to hemostasis (TTH) metrics. In tissue-mimicking experiments, halving the calibrated flow rate yielded ASSH-derived BRs that decreased by 44% to 48%. Furthermore, for calibrated flow durations of 5.0 minutes and 7.0 minutes, average ASSH-derived TTH was 5.2 minutes and 7.0 minutes, respectively, with ASSH predicting the correct TTH in 78% of trials. In dogs undergoing calibrated autologous blood infusion, ASSH measured a 3-minute increase in TTH, corresponding to the same increase in the calibrated flow duration. For a measured 5% decrease in autologous infusion flow rate, ASSH detected a 7% decrease in BR. These tissue-mimicking and in vivo preclinical experimental validation studies suggest that the ASSH BR and TTH measures reflect bleeding dynamics.
A joint-space numerical model of metabolic energy expenditure for human multibody dynamic system.
Kim, Joo H; Roberts, Dustyn
2015-09-01
Metabolic energy expenditure (MEE) is a critical performance measure of human motion. In this study, a general joint-space numerical model of MEE is derived by integrating the laws of thermodynamics and principles of multibody system dynamics, which can evaluate MEE without the limitations inherent in experimental measurements (phase delays, steady state and task restrictions, and limited range of motion) or muscle-space models (complexities and indeterminacies from excessive DOFs, contacts and wrapping interactions, and reliance on in vitro parameters). Muscle energetic components are mapped to the joint space, in which the MEE model is formulated. A constrained multi-objective optimization algorithm is established to estimate the model parameters from experimental walking data also used for initial validation. The joint-space parameters estimated directly from active subjects provide reliable MEE estimates (mean absolute error of 3.6 ± 3.6% relative to validation values) and can be used to evaluate MEE for complex non-periodic tasks that may not be experimentally verifiable. This model also enables real-time calculations of instantaneous MEE rate as a function of time for transient evaluations. Although experimental measurements may not be completely replaced by model evaluations, predicted quantities can be used as strong complements to increase reliability of the results and yield unique insights for various applications.
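A heavily simplified sketch of the joint-space idea follows; it is a toy, not the paper's thermodynamic model, and the basal rate and efficiency constants are illustrative assumptions.

```python
import numpy as np

def instantaneous_mee_rate(torques, velocities, basal_rate=80.0, efficiency=0.25):
    """Illustrative joint-space energy-rate sketch (NOT the paper's full model):
    total metabolic rate [W] approximated as an assumed whole-body basal rate
    plus mechanical joint power scaled by an assumed muscular efficiency.
    torques in N·m, angular velocities in rad/s, one entry per joint."""
    mech_power = np.sum(np.abs(np.asarray(torques) * np.asarray(velocities)))
    return float(basal_rate + mech_power / efficiency)
```

Because the rate depends only on the current joint torques and velocities, it can be evaluated at every time step of a multibody simulation, which is the kind of transient, real-time evaluation the abstract describes.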
NASA Astrophysics Data System (ADS)
Holobar, A.; Minetto, M. A.; Farina, D.
2014-02-01
Objective. A signal-based metric for assessment of accuracy of motor unit (MU) identification from high-density surface electromyograms (EMG) is introduced. This metric, the so-called pulse-to-noise ratio (PNR), is computationally efficient, requires no additional experimental cost and can be applied to every MU that is identified by the previously developed convolution kernel compensation technique. Approach. The analytical derivation of the newly introduced metric is provided, along with its extensive experimental validation on both synthetic and experimental surface EMG signals with signal-to-noise ratios ranging from 0 to 20 dB and muscle contraction forces from 5% to 70% of the maximum voluntary contraction. Main results. In all the experimental and simulated signals, the newly introduced metric correlated significantly with both sensitivity and false alarm rate in identification of MU discharges. Practically all the MUs with PNR > 30 dB exhibited sensitivity >90% and false alarm rates <2%. Therefore, a threshold of 30 dB in PNR can be used as a simple method for selecting only reliably decomposed units. Significance. The newly introduced metric is considered a robust and reliable indicator of accuracy of MU identification. The study also shows that high-density surface EMG can be reliably decomposed at contraction forces as high as 70% of the maximum.
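The core of the PNR idea can be sketched as a ratio of pulse energy to baseline energy in the estimated pulse train; the paper's exact definition operates on the decomposed train of each MU, so the function below is only an illustrative approximation with assumed inputs.

```python
import numpy as np

def pulse_to_noise_ratio(pulse_train, discharge_idx):
    """Simplified PNR sketch: 10*log10 of the mean squared value of the
    estimated pulse train at identified discharge instants over the mean
    squared value of all remaining (baseline noise) samples."""
    t = np.asarray(pulse_train, dtype=float)
    mask = np.zeros(len(t), dtype=bool)
    mask[list(discharge_idx)] = True
    signal_power = np.mean(t[mask] ** 2)   # energy at discharge instants
    noise_power = np.mean(t[~mask] ** 2)   # energy everywhere else
    return 10.0 * np.log10(signal_power / noise_power)
```

A cleanly decomposed unit has tall, isolated pulses against a flat baseline, giving a large ratio; per the study, values above roughly 30 dB flag reliably decomposed units.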
Development of Switchable Polarity Solvent Draw Solutes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wilson, Aaron D.
Results of a computational fluid dynamic (CFD) study of flow and heat transfer in a printed circuit heat exchanger (PCHE) geometry are presented. CFD results obtained from a two-plate model are compared to corresponding experimental results for validation. This process provides the basis for further application of the CFD code to PCHE design and performance analysis in a variety of internal flow geometries. As a part of the code verification and validation (V&V) process, CFD simulation of a single semicircular straight channel under laminar isothermal conditions was also performed and compared to theoretical results. This comparison yielded excellent agreement with the theoretical values. The two-plate CFD model based on the experimental PCHE design overestimated the effectiveness and underestimated the pressure drop. However, it was found that the discrepancy between the CFD results and the experimental data was mainly caused by uncertainty in the geometry of the heat exchanger introduced during fabrication. The CFD results obtained using a slightly smaller channel diameter yielded good agreement with the experimental data. A separate investigation revealed that the average channel diameter of the OSU PCHE after diffusion bonding was 1.93 mm on the cold fluid side and 1.90 mm on the hot fluid side, both smaller than the nominal design value. Consequently, the CFD code was shown to have sufficient capability to evaluate the heat exchanger's thermal-hydraulic performance.
Turbulence Modeling Validation, Testing, and Development
NASA Technical Reports Server (NTRS)
Bardina, J. E.; Huang, P. G.; Coakley, T. J.
1997-01-01
The primary objective of this work is to provide accurate numerical solutions for selected flow fields and to compare and evaluate the performance of selected turbulence models against experimental results. Four popular turbulence models have been tested and validated against experimental data of ten turbulent flows. The models are: (1) the two-equation k-omega model of Wilcox, (2) the two-equation k-epsilon model of Launder and Sharma, (3) the two-equation k-omega/k-epsilon SST model of Menter, and (4) the one-equation model of Spalart and Allmaras. The flows investigated are five free shear flows, consisting of a mixing layer, a round jet, a plane jet, a plane wake, and a compressible mixing layer; and five boundary layer flows, consisting of an incompressible flat plate, a Mach 5 adiabatic flat plate, a separated boundary layer, an axisymmetric shock-wave/boundary layer interaction, and an RAE 2822 transonic airfoil. The experimental data for these flows are well established and have been extensively used in model development. The results are shown in the following four sections: Part A describes the equations of motion and boundary conditions; Part B describes the model equations, constants, parameters, boundary conditions, and numerical implementation; and Parts C and D describe the experimental data and the performance of the models in the free-shear flows and the boundary layer flows, respectively.
Bayesian Inference in the Modern Design of Experiments
NASA Technical Reports Server (NTRS)
DeLoach, Richard
2008-01-01
This paper provides an elementary tutorial overview of Bayesian inference and its potential for application in aerospace experimentation in general and wind tunnel testing in particular. Bayes Theorem is reviewed and examples are provided to illustrate how it can be applied to objectively revise prior knowledge by incorporating insights subsequently obtained from additional observations, resulting in new (posterior) knowledge that combines information from both sources. A logical merger of Bayesian methods and certain aspects of Response Surface Modeling is explored. Specific applications to wind tunnel testing, computational code validation, and instrumentation calibration are discussed.
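The objective revision of prior knowledge described above can be illustrated with a minimal grid-based Bayesian update; the measured quantity, prior, and noise values below are invented purely for illustration.

```python
import numpy as np

# Grid-based Bayesian update for a wind-tunnel-style measurement: prior belief
# about a dimensionless coefficient is refined by one noisy observation.
theta = np.linspace(0.0, 1.0, 201)                      # candidate values
prior = np.exp(-0.5 * ((theta - 0.40) / 0.10) ** 2)     # prior ~ N(0.40, 0.10)
prior /= prior.sum()

obs, sigma = 0.50, 0.05                                  # measurement and its noise
likelihood = np.exp(-0.5 * ((obs - theta) / sigma) ** 2)

posterior = prior * likelihood                           # Bayes' theorem (unnormalized)
posterior /= posterior.sum()

# posterior mean combines prior and data, weighted by their precisions
post_mean = float(np.sum(theta * posterior))
```

For these Gaussian choices the posterior mean lands near 0.48, between the prior mean (0.40) and the observation (0.50) and closer to the more precise source, which is exactly the merging of information from both sources that the tutorial describes.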
Horton, pipe hydraulics, and the atmospheric boundary layer (The Robert E. Horton Memorial Lecture)
NASA Technical Reports Server (NTRS)
Brutsaert, Wilfried
1993-01-01
The early stages of Horton's scientific career, which provided the opportunity and stimulus to delve into the origins of some contemporary concepts on the atmospheric boundary layer, are reviewed. The study of Saph and Schoder provided the basis for the experimental verification and validation of similarity by Blasius and by Stanton and Pannell, and for the subsequent developments that led to the present understanding of the turbulent boundary layer. Particular attention is given to the incorporation of similarity and scaling in the analysis of turbulent flow.
Paliwal, Nikhil; Damiano, Robert J; Varble, Nicole A; Tutino, Vincent M; Dou, Zhongwang; Siddiqui, Adnan H; Meng, Hui
2017-12-01
Computational fluid dynamics (CFD) is a promising tool to aid in clinical diagnoses of cardiovascular diseases. However, it uses assumptions that simplify the complexities of the real cardiovascular flow. Given the high stakes in the clinical setting, it is critical to calculate the effect of these assumptions on the CFD simulation results. Existing CFD validation approaches do not quantify error in the simulation results due to the CFD solver's modeling assumptions; instead, they directly compare CFD simulation results against validation data. Thus, to quantify the accuracy of a CFD solver, we developed a validation methodology that calculates the CFD model error (arising from modeling assumptions). Our methodology identifies independent error sources in CFD and validation experiments, and calculates the model error by parsing out other sources of error inherent in simulation and experiments. To demonstrate the method, we simulated the flow field of a patient-specific intracranial aneurysm (IA) in the commercial CFD software STAR-CCM+. Particle image velocimetry (PIV) provided validation datasets for the flow field on two orthogonal planes. The average model error in the STAR-CCM+ solver was 5.63 ± 5.49% along the intersecting validation line of the orthogonal planes. Furthermore, we demonstrated that our validation method is superior to existing validation approaches by applying three representative existing validation techniques to our CFD and experimental dataset, and comparing the validation results. Our validation methodology offers a streamlined workflow to extract the "true" accuracy of a CFD solver.
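The idea of parsing out independent error sources resembles the comparison-error decomposition of the ASME V&V 20 standard; the sketch below follows that standard's form as an analogy, not necessarily the authors' exact formulation.

```python
import math

def comparison_error(sim_value, exp_value, u_num, u_input, u_exp):
    """ASME V&V 20-style decomposition (an analogous approach, not the paper's
    exact method): the comparison error E = S - D estimates the model error,
    while the validation uncertainty u_val bounds the combined contribution of
    numerical, input-parameter, and experimental error sources."""
    E = sim_value - exp_value
    u_val = math.sqrt(u_num**2 + u_input**2 + u_exp**2)
    return E, u_val  # model error lies in E +/- u_val at the stated confidence
```

When |E| is comparable to u_val, the observed discrepancy is explained by the non-model error sources; only the excess of |E| over u_val points at the solver's modeling assumptions.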
Numerical Modeling of Active Flow Control in a Boundary Layer Ingesting Offset Inlet
NASA Technical Reports Server (NTRS)
Allan, Brian G.; Owens, Lewis R.; Berrier, Bobby L.
2004-01-01
This investigation evaluates the numerical prediction of flow distortion and pressure recovery for a boundary layer ingesting offset inlet with active flow control devices. The numerical simulations are computed using a Reynolds averaged Navier-Stokes code developed at NASA. The numerical results are validated by comparison to experimental wind tunnel tests conducted at NASA Langley Research Center at both low and high Mach numbers. Baseline comparisons showed good agreement between numerical and experimental results. Numerical simulations for the inlet with passive and active flow control also showed good agreement at low Mach numbers where experimental data has already been acquired. Numerical simulations of the inlet at high Mach numbers with flow control jets showed an improvement of the flow distortion. Studies on the location of the jet actuators, for the high Mach number case, were conducted to provide guidance for the design of a future experimental wind tunnel test.
NASA Astrophysics Data System (ADS)
Hoang, Thai M.; Pan, Rui; Ahn, Jonghoon; Bang, Jaehoon; Quan, H. T.; Li, Tongcang
2018-02-01
Nonequilibrium processes of small systems such as molecular machines are ubiquitous in biology, chemistry, and physics but are often challenging to comprehend. In the past two decades, several exact thermodynamic relations of nonequilibrium processes, collectively known as fluctuation theorems, have been discovered and provided critical insights. These fluctuation theorems are generalizations of the second law and can be unified by a differential fluctuation theorem. Here we perform the first experimental test of the differential fluctuation theorem using an optically levitated nanosphere in both underdamped and overdamped regimes and in both spatial and velocity spaces. We also test several theorems that can be obtained from it directly, including a generalized Jarzynski equality that is valid for arbitrary initial states, and the Hummer-Szabo relation. Our study experimentally verifies these fundamental theorems and initiates the experimental study of stochastic energetics with the instantaneous velocity measurement.
Experimental study and simulation of 63Zn production via proton-induced reaction.
Rostampour, Malihe; Sadeghi, Mahdi; Aboudzadeh, Mohammadreza; Hamidi, Saeid; Soltani, Naser; Novin, Fatemeh Bolouri; Rahiminejad, Ali; Rajabifar, Saeid
2018-06-01
63Zn was produced by 16.8 MeV proton irradiation of natural copper. The thick target yield for 63Zn in the energy range 16.8 → 12.2 MeV was 2.47 ± 0.12 GBq/μA·h. Reasonable agreement between the achieved experimental data and the theoretical value of the thick target yield for 63Zn was observed. A simple procedure for separating 63Zn from the copper target was developed using cation exchange chromatography. About 88 ± 5% of the loaded activity was recovered. The ability of FLUKA to reproduce experimental thick target yield data for 63Zn was validated: results from the code were compared with the corresponding experimental data, and the comparison demonstrated that FLUKA provides a suitable tool for the simulation of radionuclide production by proton irradiation.
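The thick target yield quoted above follows from integrating the excitation function over the proton's slowing-down range in the target, Y ∝ ∫ σ(E)/(dE/dx) dE. The sketch below shows the integral numerically with schematic cross-section and stopping-power values, not the measured 63Zn data.

```python
import numpy as np

def thick_target_yield(energies, sigma, stopping_power, n_density):
    """Reactions per incident proton for a thick target, by trapezoidal
    integration of n * sigma(E) / (dE/dx) over the energy range.
    energies [MeV] from entry to exit energy, sigma [cm^2],
    stopping power [MeV/cm], target nuclei density [1/cm^3]."""
    E = np.asarray(energies, dtype=float)
    integrand = n_density * np.asarray(sigma, dtype=float) / np.asarray(stopping_power, dtype=float)
    widths = np.abs(np.diff(E))  # works for descending entry->exit grids
    return float(np.sum(0.5 * (integrand[:-1] + integrand[1:]) * widths))
```

Multiplying the result by the proton current (protons per second per μA) and the irradiation time, and correcting for decay during irradiation, converts it to an activity yield such as the GBq/μA·h figure reported in the abstract.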
Creation of the new industry-standard space test of laser retroreflectors for the GNSS and LAGEOS
NASA Astrophysics Data System (ADS)
Dell'Agnello, S.; Delle Monache, G. O.; Currie, D. G.; Vittori, R.; Cantone, C.; Garattini, M.; Boni, A.; Martini, M.; Lops, C.; Intaglietta, N.; Tauraso, R.; Arnold, D. A.; Pearlman, M. R.; Bianco, G.; Zerbini, S.; Maiello, M.; Berardi, S.; Porcelli, L.; Alley, C. O.; McGarry, J. F.; Sciarretta, C.; Luceri, V.; Zagwodzki, T. W.
2011-03-01
We built a new experimental apparatus (the “Satellite/lunar laser ranging Characterization Facility”, SCF) and created a new test procedure (the SCF-Test) to characterize and model the detailed thermal behavior and the optical performance of cube corner laser retroreflectors in space for industrial and scientific applications. The primary goal of these innovative tools is to provide critical design and diagnostic capabilities for Satellite Laser Ranging (SLR) to Galileo and other GNSS (Global Navigation Satellite System) constellations. The capability will allow us to optimize the design of GNSS laser retroreflector payloads to maximize ranging efficiency, to improve signal-to-noise conditions in daylight and to provide pre-launch validation of retroreflector performance under laboratory-simulated space conditions. Implementation of new retroreflector designs being studied will help to improve GNSS orbits, which will then increase the accuracy, stability, and distribution of the International Terrestrial Reference Frame (ITRF), to provide better definition of the geocenter (origin) and the scale (length unit). Our key experimental innovation is the concurrent measurement and modeling of the optical Far Field Diffraction Pattern (FFDP) and the temperature distribution of the SLR retroreflector payload under thermal conditions produced with a close-match solar simulator. The apparatus includes infrared cameras for non-invasive thermometry, thermal control and real-time movement of the payload to experimentally simulate satellite orientation on orbit with respect to both solar illumination and laser interrogation beams. 
These unique capabilities provide experimental validation of the space segment for SLR and Lunar Laser Ranging (LLR). We used the SCF facility and the SCF-Test to perform a comprehensive, non-invasive space characterization of older generation, back-coated retroreflectors of the GIOVE-A and -B (Galileo In-Orbit Validation Elements) and the GPS-35 and -36 designs. First, using a full GPS flight model at laser wavelengths of 532 and 632 nm, we found its “effective optical cross section” in air, under isothermal conditions, to be six times lower than the Retroreflector Standard for GNSS satellites (100 × 10⁶ m² at 20,000 km altitude for GPS and 180 × 10⁶ m² for Galileo at 23,200 km altitude), issued by the International Laser Ranging Service (ILRS). Under the simulated thermal and space conditions of the SCF, we also showed that in some space configurations the “effective optical cross section” is further reduced by the thermal degradation of the FFDP. Using the same SCF-Test configuration on an individual GIOVE prototype cube, we measured severe thermal degradation in optical performance, which appears to be caused by the retroreflector metal coating and the non-optimized thermal conductance of the mounting. Uncoated retroreflectors with proper mounting can minimize thermal degradation and significantly increase the optical performance, and as such, are emerging as the recommended design for modern GNSS satellites. The COMPASS-M1 and GLONASS-115 GNSS satellites use uncoated cubes. They provide better efficiency than those on GPS and GIOVE, including better daylight ranging performance. However, these retroreflectors were not characterized in the laboratory under space conditions prior to launch, so we have no basis to evaluate how well they were optimized for future GNSS satellites. 
SCF-Testing, under a non-disclosure agreement between INFN-LNF and the European Space Agency (ESA), of prototype uncoated cubes for the first four Galileo satellites to be launched (named “IOV”, In-Orbit Validation satellites) is a major step forward. An SCF-Test performed on a LAGEOS (LAser GEOdynamics Satellite) engineering model retroreflector array provided by NASA showed the good space performance of what is now a reference ILRS payload standard. The IOV and LAGEOS measurements demonstrated the effectiveness of the SCF-Test as a laser retroreflector array (LRA) diagnostic, optimization and validation tool in use by NASA, ESA and ASI.
Fan, Ang-Xiao; Dakpé, Stéphanie; Dao, Tien Tuan; Pouletaut, Philippe; Rachik, Mohamed; Ho Ba Tho, Marie Christine
2017-07-01
Finite element simulation of facial mimics provides objective indicators about soft tissue functions for improving diagnosis, treatment and follow-up of facial disorders. There is a lack of in vivo experimental data for model development and validation. In this study, the contribution of the paired Zygomaticus Major (ZM) muscle contraction on the facial mimics was investigated using in vivo experimental data derived from MRI. Maximal relative differences of 7.7% and 37% were noted between MRI-based measurements and numerical outcomes for ZM and skin deformation behaviors respectively. This study opens a new direction to simulate facial mimics with in vivo data.
Are there reliable constitutive laws for dynamic friction?
Woodhouse, Jim; Putelat, Thibaut; McKay, Andrew
2015-09-28
Structural vibration controlled by interfacial friction is widespread, ranging from friction dampers in gas turbines to the motion of violin strings. To predict, control or prevent such vibration, a constitutive description of frictional interactions is inevitably required. A variety of friction models are discussed to assess their scope and validity, in the light of constraints provided by different experimental observations. Three contrasting case studies are used to illustrate how predicted behaviour can be extremely sensitive to the choice of frictional constitutive model, and to explore possible experimental paths to discriminate between and calibrate dynamic friction models over the full parameter range needed for real applications.
Further Validation of a CFD Code for Calculating the Performance of Two-Stage Light Gas Guns
NASA Technical Reports Server (NTRS)
Bogdanoff, David W.
2017-01-01
Earlier validations of a higher-order Godunov code for modeling the performance of two-stage light gas guns are reviewed. These validation comparisons were made between code predictions and experimental data from the NASA Ames 1.5" and 0.28" guns and covered muzzle velocities of 6.5 to 7.2 km/s. In the present report, five more series of code validation comparisons, involving experimental data from the Ames 0.22" (1.28" pump tube diameter), 0.28", 0.50", 1.00" and 1.50" guns, are presented. The total muzzle velocity range of the validation data presented herein is 3 to 11.3 km/s. The agreement between the experimental data and CFD results is judged to be very good. Muzzle velocities were predicted within 0.35 km/s for 74% of the cases studied; the maximum difference was 0.5 km/s for all but 4 of the 50 cases, which differed by 0.5 to 0.7 km/s.
Experimental investigation and model verification for a GAX absorber
DOE Office of Scientific and Technical Information (OSTI.GOV)
Palmer, S.C.; Christensen, R.N.
1996-12-31
In the ammonia-water generator-absorber heat exchange (GAX) absorption heat pump, the heat and mass transfer processes which occur between the generator and absorber are the most crucial in assuring that the heat pump will achieve COPs competitive with those of current technologies. In this study, a model is developed for the heat and mass transfer processes that occur in a counter-current vertical fluted tube absorber (VFTA) with inserts. Correlations for heat and mass transfer in annuli are used to model the processes in the VFTA. Experimental data are used to validate the model for three different insert geometries. Comparison of model results with experimental data provides insight into model corrections necessary to bring the model into agreement with the physical phenomena observed in the laboratory.
Patient-reported outcomes in borderline personality disorder.
Hasler, Gregor; Hopwood, Christopher J; Jacob, Gitta A; Brändle, Laura S; Schulte-Vels, Thomas
2014-06-01
Patient-reported outcome (PRO) refers to measures that emphasize the subjective view of patients about their health-related conditions and behaviors. Typically, PROs include self-report questionnaires and clinical interviews. Defining PROs for borderline personality disorder (BPD) is particularly challenging given the disorder's high symptomatic heterogeneity, high comorbidity with other psychiatric conditions, highly fluctuating symptoms, weak correlations between symptoms and functional outcomes, and lack of valid and reliable experimental measures to complement self-report data. Here, we provide an overview of currently used BPD outcome measures and discuss them from clinical, psychometric, experimental, and patient perspectives. In addition, we review the most promising leads to improve BPD PROs, including the DSM-5 Section III, the Recovery Approach, Ecological Momentary Assessments, and novel experimental measures of social functioning that are associated with functional and social outcomes.
Fink, Andreas; Benedek, Mathias; Grabner, Roland H; Staudt, Beate; Neubauer, Aljoscha C
2007-05-01
The psychometric assessment of different facets of creative abilities as well as the availability of experimental tasks for the neuroscientific study of creative thinking has replaced the view of creativity as an unsearchable trait. In this article we provide a brief overview of contemporary methodologies used for the operationalization of creative thinking in a neuroscientific context. Empirical studies are reported which measured brain activity (by means of EEG, fMRI, NIRS or PET) during the performance of different experimental tasks. These tasks, along with creative idea generation tasks used in our laboratory, constitute useful tools in uncovering possible brain correlates of creative thinking. Nevertheless, much more work is needed in order to establish reliable and valid measures of creative thinking, in particular measures of novelty or originality of creative insights.
Update of Standard Practices for New Method Validation in Forensic Toxicology.
Wille, Sarah M R; Coucke, Wim; De Baere, Thierry; Peters, Frank T
2017-01-01
International agreement concerning validation guidelines is important to obtain quality forensic bioanalytical research and routine applications, as it all starts with the reporting of reliable analytical data. Standards for fundamental validation parameters are provided in guidelines such as those from the US Food and Drug Administration (FDA), the European Medicines Agency (EMA), the German-speaking Gesellschaft fur Toxikologie und Forensische Chemie (GTFCh) and the Scientific Working Group for Forensic Toxicology (SWGTOX). These validation parameters include selectivity, matrix effects, method limits, calibration, accuracy and stability, as well as other parameters such as carryover, dilution integrity and incurred sample reanalysis. It is, however, not easy for laboratories to implement these guidelines in practice, as the international guidelines remain nonbinding protocols that depend on the applied analytical technique and need to be updated according to the analyst's method requirements and the application type. In this manuscript, a review of the current guidelines and literature concerning bioanalytical validation parameters in a forensic context is given and discussed. In addition, suggestions for the experimental set-up, the pros and cons of statistical approaches, and adequate acceptance criteria for the validation of bioanalytical applications are given.
An experimental and theoretical investigation of deposition patterns from an agricultural airplane
NASA Technical Reports Server (NTRS)
Morris, D. J.; Croom, C. C.; Vandam, C. P.; Holmes, B. J.
1984-01-01
A flight test program has been conducted with a representative agricultural airplane to provide data for validating a computer program model which predicts aerially applied particle deposition. Test procedures and the data from this test are presented and discussed. The computer program features are summarized, and comparisons of predicted and measured particle deposition are presented. Applications of the computer program for spray pattern improvement are illustrated.
ERIC Educational Resources Information Center
LeGeros, Life
2013-01-01
This quasi-experimental value-added study provided evidence for the predictive validity of the Massachusetts MTEL General Curriculum Mathematics Subtest by finding an association between the licensure test results of 130 teachers and the growth of their 2640 grade 4 and 5 students. The study took advantage of a natural experiment that arose due to…
1992-05-12
compared to an untested one. The quasi-experimental design with nonequivalent comparison groups included leadership training between pre- and posttests...experimental, pretest-posttest, nonequivalent comparison groups (Wave A & Wave B) design. It allowed investigation of the influence of leadership training on...provided comparison groups. According to Burns and Grove (1987), pretest-posttest designs have inherent threats to validity. Pretest administration
Márquez-Islas, Roberto; Sánchez-Pérez, Celia; García-Valenzuela, Augusto
2014-02-01
We describe a method for obtaining the refractive index (RI), size, and concentration of nonabsorbing nanoparticles in suspension from relatively simple optical measurements. The method requires measuring the complex effective RI of two dilute suspensions of the particles in liquids of different refractive indices. We describe the theoretical basis of the proposed method and provide experimental results validating the procedure.
A dual-user teleoperation system with Online Authority Adjustment for haptic training.
Fei Liu; Leleve, Arnaud; Eberard, Damien; Redarce, Tanneguy
2015-08-01
This paper introduces a dual-user teleoperation system for hands-on medical training. A shared-control architecture is presented for authority management. In this structure, the combination of control signals is obtained using a dominance factor. Its main improvement is Online Authority Adjustment (OAA): the authority can be adjusted manually or automatically as training progresses. Experimental results are provided to validate the performance of the system.
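The dominance-factor combination of control signals described in this abstract can be sketched as a simple convex blend of the two users' commands. This is a minimal illustration only; the function and variable names are assumptions, not the paper's actual control law, which operates inside a full teleoperation control loop:

```python
def blend_commands(u_trainer, u_trainee, alpha):
    """Convex combination of two users' control signals.

    alpha is the dominance factor in [0, 1]: alpha = 1 gives the
    trainer full authority, alpha = 0 gives it to the trainee.
    Online Authority Adjustment amounts to changing alpha over time.
    """
    if not 0.0 <= alpha <= 1.0:
        raise ValueError("dominance factor must lie in [0, 1]")
    return alpha * u_trainer + (1.0 - alpha) * u_trainee
```

Varying `alpha` continuously (by the instructor, or automatically from a performance metric) shifts authority without discontinuities in the blended command.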
Resource and environmental surveys from space with the thematic mapper in the 1980's
NASA Technical Reports Server (NTRS)
1976-01-01
Observation of vegetation is the primary optimization objective of the thematic mapper. The following are aspects of plans for the thematic mapper: (1) to include an appropriately modified first generation MSS in the thematic mapper mission; (2) to provide assured coverage for a minimum of six years to give agencies and other users an opportunity to justify the necessary commitment of resources for the transition into a completely valid operational phase; (3) to provide for global, direct data read-out, without the necessity for on-board data storage or dependence on foreign receiving stations; (4) to recognize the operational character of the thematic mapper after successful completion of its experimental evaluation; and (5) to combine future experimental packages with compatible orbits as part of the operational LANDSAT follow-on payloads.
Translational MR Neuroimaging of Stroke and Recovery
Mandeville, Emiri T.; Ayata, Cenk; Zheng, Yi; Mandeville, Joseph B.
2016-01-01
Multiparametric magnetic resonance imaging (MRI) has become a critical clinical tool for diagnosing focal ischemic stroke severity, staging treatment, and predicting outcome. Imaging during the acute phase focuses on tissue viability in the stroke vicinity, while imaging during recovery requires the evaluation of distributed structural and functional connectivity. Preclinical MRI of experimental stroke models provides validation of non-invasive biomarkers in terms of cellular and molecular mechanisms, while also providing a translational platform for evaluation of prospective therapies. This brief review of translational stroke imaging discusses the acute to chronic imaging transition, the principles underlying common MRI methods employed in stroke research, and experimental results obtained by clinical and preclinical imaging to determine tissue viability, vascular remodeling, structural connectivity of major white matter tracts, and functional connectivity using task-based and resting-state fMRI during the stroke recovery process. PMID:27578048
Thaden, Joshua T; Mogno, Ilaria; Wierzbowski, Jamey; Cottarel, Guillaume; Kasif, Simon; Collins, James J; Gardner, Timothy S
2007-01-01
Machine learning approaches offer the potential to systematically identify transcriptional regulatory interactions from a compendium of microarray expression profiles. However, experimental validation of the performance of these methods at the genome scale has remained elusive. Here we assess the global performance of four existing classes of inference algorithms using 445 Escherichia coli Affymetrix arrays and 3,216 known E. coli regulatory interactions from RegulonDB. We also developed and applied the context likelihood of relatedness (CLR) algorithm, a novel extension of the relevance networks class of algorithms. CLR demonstrates an average precision gain of 36% relative to the next-best performing algorithm. At a 60% true positive rate, CLR identifies 1,079 regulatory interactions, of which 338 were in the previously known network and 741 were novel predictions. We tested the predicted interactions for three transcription factors with chromatin immunoprecipitation, confirming 21 novel interactions and verifying our RegulonDB-based performance estimates. CLR also identified a regulatory link providing central metabolic control of iron transport, which we confirmed with real-time quantitative PCR. The compendium of expression data compiled in this study, coupled with RegulonDB, provides a valuable model system for further improvement of network inference algorithms using experimental data. PMID:17214507
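The CLR scoring step summarized above can be sketched as follows. This is a simplified illustration assuming a plain histogram mutual-information estimator and row/column z-scoring; the published algorithm uses a more careful MI estimator, and none of the names below come from the paper:

```python
import numpy as np

def mutual_information(x, y, bins=10):
    # Histogram-based MI estimate between two expression profiles.
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = pxy / pxy.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

def clr_scores(expr, bins=10):
    """expr: genes x arrays matrix. Returns symmetric CLR edge scores.

    Each MI value is z-scored against the background MI distribution
    of both genes involved; negative z-scores are clipped to zero and
    the two are combined as sqrt(z_i**2 + z_j**2).
    """
    n = expr.shape[0]
    mi = np.zeros((n, n))
    for i in range(n):
        for j in range(i + 1, n):
            mi[i, j] = mi[j, i] = mutual_information(expr[i], expr[j], bins)
    mu = mi.mean(axis=1)
    sd = mi.std(axis=1) + 1e-12
    z = np.maximum((mi - mu[:, None]) / sd[:, None], 0.0)
    return np.sqrt(z**2 + z.T**2)
```

The background z-scoring is the key idea: an edge is kept only if its MI is unusually high relative to everything else each of the two genes is measured against, which suppresses indirect correlations.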
High shear rate flow in a linear stroke magnetorheological energy absorber
NASA Astrophysics Data System (ADS)
Hu, W.; Wereley, N. M.; Hiemenz, G. J.; Ngatu, G. T.
2014-05-01
To provide an adaptive stroking load in the crew seats of ground vehicles and protect the crew from blast or impact loads, a magnetorheological energy absorber (MREA), or shock absorber, was developed. The MREA provides appropriate levels of controllable stroking load for different occupant weights and peak accelerations. However, the viscous stroking load generated by the MREA increases with velocity squared, thereby reducing its controllable range at high piston velocity. Therefore, MREA behavior at high piston velocity is analyzed and validated experimentally in order to investigate the effects of velocity and magnetic field on MREA performance. The analysis used to predict the MREA force as a function of piston velocity squared and applied field is presented. A conical fairing is mounted on the piston head of the MREA in order to reduce the predicted inlet flow loss by 9% at a nominal velocity of 8 m/s, which resulted in a viscous force reduction of nominally 4%. The MREA behavior is experimentally measured using a high-speed servo-hydraulic testing system for speeds up to 8 m/s. The measured MREA force is used to validate the analysis, which captures the transient force quite accurately, although the peak force is under-predicted at the peak speed of 8 m/s.
Crea, Simona; Cipriani, Christian; Donati, Marco; Carrozza, Maria Chiara; Vitiello, Nicola
2015-03-01
Here we describe a novel wearable feedback apparatus for lower-limb amputees. The system is based on three modules: a pressure-sensitive insole for the measurement of the plantar pressure distribution under the prosthetic foot during gait, a computing unit for data processing and gait segmentation, and a set of vibrating elements placed on the thigh skin. The feedback strategy relies on the detection of specific gait-phase transitions of the amputated leg. Vibrating elements are activated in a time-discrete manner, simultaneously with the occurrence of the detected gait-phase transitions. Usability and effectiveness of the apparatus were successfully assessed through an experimental validation involving ten healthy volunteers.
Yamaguchi, Takumi; Sakae, Yoshitake; Zhang, Ying; Yamamoto, Sayoko; Okamoto, Yuko; Kato, Koichi
2014-10-06
Exploration of the conformational spaces of flexible biomacromolecules is essential for quantitatively understanding the energetics of their molecular recognition processes. We employed stable isotope- and lanthanide-assisted NMR approaches in conjunction with replica-exchange molecular dynamics (REMD) simulations to obtain atomic descriptions of the conformational dynamics of high-mannose-type oligosaccharides, which harbor intracellular glycoprotein-fate determinants in their triantennary structures. The experimentally validated REMD simulation provided quantitative views of the dynamic conformational ensembles of the complicated, branched oligosaccharides, and indicated significant expansion of the conformational space upon removal of a terminal mannose residue during the functional glycan-processing pathway.
NASA Technical Reports Server (NTRS)
Eisfeld, Bernhard; Rumsey, Chris; Togiti, Vamshi
2015-01-01
The implementation of the SSG/LRR-omega differential Reynolds stress model into the NASA flow solvers CFL3D and FUN3D and the DLR flow solver TAU is verified by studying the grid convergence of the solution of three different test cases from the Turbulence Modeling Resource Website. The model's predictive capabilities are assessed based on four basic and four extended validation cases also provided on this website, involving attached and separated boundary layer flows, effects of streamline curvature and secondary flow. Simulation results are compared against experimental data and predictions by the eddy-viscosity models of Spalart-Allmaras (SA) and Menter's Shear Stress Transport (SST).
Pulsed Inductive Thruster (PIT): Modeling and Validation Using the MACH2 Code
NASA Technical Reports Server (NTRS)
Schneider, Steven (Technical Monitor); Mikellides, Pavlos G.
2003-01-01
Numerical modeling of the Pulsed Inductive Thruster using the magnetohydrodynamics code MACH2 aims to provide bilateral validation of the thruster's measured performance and of the code's capability to capture the pertinent physical processes. Computed impulse values for helium and argon propellants demonstrate excellent correlation with the experimental data over a range of energy levels and propellant-mass values. The effects of the vacuum tank wall and mass-injection scheme were investigated and shown to produce only trivial changes in the overall performance. An idealized model for these energy levels and propellants deduces that the energy expended on the internal energy modes and plasma dissipation processes is independent of the propellant type, mass, and energy level.
Furrer, F; Franz, T; Berta, M; Leverrier, A; Scholz, V B; Tomamichel, M; Werner, R F
2012-09-07
We provide a security analysis for continuous variable quantum key distribution protocols based on the transmission of two-mode squeezed vacuum states measured via homodyne detection. We employ a version of the entropic uncertainty relation for smooth entropies to give a lower bound on the number of secret bits which can be extracted from a finite number of runs of the protocol. This bound is valid under general coherent attacks, and gives rise to keys which are composably secure. For comparison, we also give a lower bound valid under the assumption of collective attacks. For both scenarios, we find positive key rates using experimental parameters reachable today.
Solar-Diesel Hybrid Power System Optimization and Experimental Validation
NASA Astrophysics Data System (ADS)
Jacobus, Headley Stewart
As of 2008, 1.46 billion people, or 22 percent of the world's population, were without electricity. Many of these people live in remote areas where decentralized generation is the only method of electrification. Most mini-grids are powered by diesel generators, but new hybrid power systems are becoming a reliable method to incorporate renewable energy while also reducing total system cost. This thesis quantifies the measurable operational costs for an experimental hybrid power system in Sierra Leone. Two software programs, Hybrid2 and HOMER, are used during the system design and subsequent analysis. Experimental data from the installed system are used to validate the two programs and to quantify the savings created by each component within the hybrid system. This thesis bridges the gap between design optimization studies, which frequently lack subsequent validation, and experimental hybrid system performance studies.
Outcome evaluation of a new model of critical care orientation.
Morris, Linda L; Pfeifer, Pamela; Catalano, Rene; Fortney, Robert; Nelson, Greta; Rabito, Robb; Harap, Rebecca
2009-05-01
The shortage of critical care nurses and the service expansion of 2 intensive care units provided a unique opportunity to create a new model of critical care orientation. The goal was to design a program that assessed critical thinking, validated competence, and provided learning pathways that accommodated diverse experience. The objective was to determine the effect of a new model of critical care orientation on satisfaction, retention, turnover, vacancy, preparedness to manage a patient care assignment, length of orientation, and cost of orientation. A prospective, quasi-experimental design with both quantitative and qualitative methods was used. The new model improved satisfaction scores, retention rates, and recruitment of critical care nurses. Length of orientation was unchanged. Cost was increased, primarily because a full-time education consultant was added. A new model for nurse orientation that was focused on critical thinking and competence validation improved retention and satisfaction and serves as a template for orientation of nurses throughout the medical center.
Campaign 2 Level 2 Milestone Review 2009: Milestone # 3131 Grain Scale Simulation of Pore Collapse
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schwartz, A J
2009-09-28
The milestone reviewed on Sept. 16, 2009 was 'High-fidelity simulation of shock initiation of high explosives at the grain scale using coupled hydrodynamics, thermal transport and chemistry'. It is the opinion of the committee that the team has satisfied the milestone. A detailed description of how the goals were met is provided. The milestone leveraged capabilities from the ASC Physics and Engineering Materials program combined with experimental input from Campaign 2. A combined experimental-multiscale simulation approach was used to create and validate the various TATB model components. At the lowest length scale, quantum chemical calculations were used to determine equations of state, thermal transport properties and reaction rates for TATB as it is decomposing. High-pressure experiments conducted in diamond anvil cells, gas guns and the Z machine were used to validate the EOS, thermal conductivity, specific heat and predictions of water formation. The predicted reaction networks and chemical kinetic equations were implemented in Cheetah and validated against the lower length scale data. Cheetah was then used within the ASC code ALE3D for high-resolution, thermo-mechanically coupled simulations of pore collapse at the micron size scale to predict conditions for detonation initiation.
Numerical and experimental validation for the thermal transmittance of windows with cellular shades
Hart, Robert
2018-02-21
Some highly energy efficient window attachment products are available today, but more rapid market adoption would be facilitated by fair performance metrics. It is important to have validated simulation tools to provide a basis for this analysis. This paper outlines a review and validation of the ISO 15099 center-of-glass zero-solar-load heat transfer correlations for windows with cellular shades. Thermal transmittance was measured experimentally, simulated using computational fluid dynamics (CFD) analysis, and simulated utilizing correlations from ISO 15099 as implemented in Berkeley Lab WINDOW and THERM software. CFD analysis showed ISO 15099 underestimates heat flux of rectangular cavities by up to 60% when aspect ratio (AR) = 1 and overestimates heat flux up to 20% when AR = 0.5. CFD analysis also showed that wave-type surfaces of cellular shades have less than 2% impact on heat flux through the cavities and less than 5% for natural convection of room-side surface. WINDOW was shown to accurately represent heat flux of the measured configurations to a mean relative error of 0.5% and standard deviation of 3.8%. Finally, several shade parameters showed significant influence on correlation accuracy, including distance between shade and glass, inconsistency in cell stretch, size of perimeter gaps, and the mounting hardware.
Simon, Daniela; Kischkel, Eva; Spielberg, Rüdiger; Kathmann, Norbert
2012-06-30
Distressing symptom-related anxiety is difficult to study in obsessive-compulsive disorder (OCD) due to the disorder's heterogeneity. Our aim was to develop and validate a set of pictures and films comprising a variety of prominent OCD triggers that can be used for individually tailored symptom provocation in experimental studies. In a two-stage production procedure, a large pool of OCD triggers and neutral contents was produced and preselected by three psychotherapists specialized in OCD. A sample of 13 OCD patients and 13 controls rated their anxiety, aversiveness and arousal during exposure to OCD-relevant, aversive and neutral control stimuli. Our findings demonstrate differences between the responses of patients and controls to OCD triggers only. Symptom-related anxiety was stronger in response to dynamic than to static OCD-relevant stimuli. Given the small number of 13 patients included in the study, only tentative conclusions can be drawn, and this study provides merely a first step of validation. These standardized sets constitute valuable tools for experimental studies on the brain correlates of OCD symptoms and for the study of therapeutic interventions, contributing to future developments in the field.
The early maximum likelihood estimation model of audiovisual integration in speech perception.
Andersen, Tobias S
2015-05-01
Speech perception is facilitated by seeing the articulatory mouth movements of the talker. This is due to perceptual audiovisual integration, which also causes the McGurk-MacDonald illusion, and for which a comprehensive computational account is still lacking. Decades of research have largely focused on the fuzzy logical model of perception (FLMP), which provides excellent fits to experimental observations but also has been criticized for being too flexible, post hoc and difficult to interpret. The current study introduces the early maximum likelihood estimation (MLE) model of audiovisual integration to speech perception along with three model variations. In early MLE, integration is based on a continuous internal representation before categorization, which can make the model more parsimonious by imposing constraints that reflect experimental designs. The study also shows that cross-validation can evaluate models of audiovisual integration based on typical data sets taking both goodness-of-fit and model flexibility into account. All models were tested on a published data set previously used for testing the FLMP. Cross-validation favored the early MLE while more conventional error measures favored more complex models. This difference between conventional error measures and cross-validation was found to be indicative of over-fitting in more complex models such as the FLMP.
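The core of MLE cue integration is inverse-variance (reliability) weighting of the unimodal estimates on a continuous internal representation, prior to categorization. A minimal sketch of that weighting step, with illustrative variable names that are not from the paper's implementation:

```python
def mle_fuse(x_a, var_a, x_v, var_v):
    """Maximum-likelihood fusion of an auditory and a visual estimate.

    Each cue is weighted by its reliability (inverse variance); the
    fused estimate has lower variance than either cue alone.
    """
    w_a = (1.0 / var_a) / (1.0 / var_a + 1.0 / var_v)
    x_av = w_a * x_a + (1.0 - w_a) * x_v
    var_av = 1.0 / (1.0 / var_a + 1.0 / var_v)
    return x_av, var_av
```

With equal variances the fused estimate is the midpoint of the two cues; as one cue becomes noisier, the estimate shifts toward the more reliable cue, which is the mechanism that can produce McGurk-style percepts when vision dominates.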
Pradhan, Nirakar; Dipasquale, Laura; d'Ippolito, Giuliana; Fontana, Angelo; Panico, Antonio; Pirozzi, Francesco; Lens, Piet N L; Esposito, Giovanni
2016-08-01
The aim of the present study was to develop a kinetic model for a recently proposed unique and novel metabolic process called the capnophilic (CO2-requiring) lactic fermentation (CLF) pathway in Thermotoga neapolitana. The model was based on Monod kinetics, and mathematical expressions were developed to enable the simulation of biomass growth, substrate consumption and product formation. The calibrated kinetic parameters, i.e., the maximum specific uptake rate (k), semi-saturation constant (kS), biomass yield coefficient (Y) and endogenous decay rate (kd), were 1.30 h(-1), 1.42 g/L, 0.1195 and 0.0205 h(-1), respectively. A high correlation (>0.98) was obtained between the experimental data and model predictions for both the model validation and cross-validation processes. An increase of lactate production in the range of 40-80% was obtained through the CLF pathway compared to the classic dark fermentation model. The proposed kinetic model is the first mechanistically based model for the CLF pathway. It provides useful information to improve knowledge about how acetate and CO2 are recycled back by Thermotoga neapolitana to produce lactate without compromising the overall hydrogen yield.
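Using the calibrated parameters reported in this abstract, a Monod-type uptake/growth system can be integrated numerically. The following forward-Euler sketch is illustrative only: the initial conditions and the exact form of the rate expressions are assumptions, and the paper's full model additionally tracks product formation:

```python
# Calibrated parameters from the abstract:
# k (1/h), kS (g/L), Y (dimensionless), kd (1/h)
k, kS, Y, kd = 1.30, 1.42, 0.1195, 0.0205

def simulate(S0=10.0, X0=0.05, t_end=24.0, dt=0.01):
    """Euler integration of Monod substrate uptake with endogenous decay.

    Returns a list of (time, substrate, biomass) tuples.
    """
    S, X = S0, X0
    out = []
    for step in range(int(t_end / dt)):
        uptake = k * S / (kS + S) * X          # Monod-limited uptake rate
        S = max(S - uptake * dt, 0.0)          # substrate cannot go negative
        X = X + (Y * uptake - kd * X) * dt     # growth minus endogenous decay
        out.append((step * dt, S, X))
    return out
```

The qualitative behavior matches the model structure described: biomass grows while substrate is abundant, uptake saturates near `kS`, and biomass decays at rate `kd` once substrate is exhausted.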
Ravikumar, Balaguru; Parri, Elina; Timonen, Sanna; Airola, Antti; Wennerberg, Krister
2017-01-01
Due to the relatively high costs and labor required for experimental profiling of the full target space of chemical compounds, various machine learning models have been proposed as cost-effective means to advance this process by predicting the most potent compound-target interactions for subsequent verification. However, most of the model predictions lack direct experimental validation in the laboratory, making their practical benefits for drug discovery or repurposing applications largely unknown. Here, we therefore introduce and carefully test a systematic computational-experimental framework for the prediction and pre-clinical verification of drug-target interactions, using a well-established kernel-based regression algorithm as the prediction model. To evaluate its performance, we first predicted unmeasured binding affinities in a large-scale kinase inhibitor profiling study, and then experimentally tested 100 compound-kinase pairs. The relatively high correlation of 0.77 (p < 0.0001) between the predicted and measured bioactivities supports the potential of the model for filling the experimental gaps in existing compound-target interaction maps. Further, we subjected the model to the more challenging task of predicting target interactions for a new candidate drug compound that lacks prior binding profile information. As a specific case study, we used tivozanib, an investigational VEGF receptor inhibitor with a currently unknown off-target profile. Among 7 kinases with high predicted affinity, we experimentally validated 4 new off-targets of tivozanib, namely the Src-family kinases FRK and FYN A, the non-receptor tyrosine kinase ABL1, and the serine/threonine kinase SLK. Our subsequent experimental validation protocol effectively avoids any possible information leakage between the training and validation data, and therefore enables rigorous model validation for practical applications.
These results demonstrate that the kernel-based modeling approach offers practical benefits for probing novel insights into the mode of action of investigational compounds, and for the identification of new target selectivities for drug repurposing applications. PMID:28787438
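The kernel-based regression backbone of such a framework can be sketched with plain kernel ridge regression. This is an illustrative stand-in under stated assumptions: the study's actual model and its kernels over compound and kinase descriptors are more elaborate, and the feature setup below is a toy:

```python
import numpy as np

def rbf_kernel(A, B, gamma=0.5):
    # Gaussian (RBF) kernel between rows of A and rows of B.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def fit_kernel_ridge(K, y, lam=1e-3):
    # Closed-form dual solution: alpha = (K + lam*I)^{-1} y
    return np.linalg.solve(K + lam * np.eye(len(y)), y)

def predict(K_new_train, alpha):
    # Predicted affinities for new pairs vs. the training set.
    return K_new_train @ alpha
```

In a drug-target setting, the rows would be compound-kinase pairs and the kernel would encode similarity between pairs; measured affinities train the model, and unmeasured pairs are ranked by predicted affinity for experimental follow-up.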
Validation of NASA Thermal Ice Protection Computer Codes. Part 1; Program Overview
NASA Technical Reports Server (NTRS)
Miller, Dean; Bond, Thomas; Sheldon, David; Wright, William; Langhals, Tammy; Al-Khalil, Kamel; Broughton, Howard
1996-01-01
The Icing Technology Branch at NASA Lewis has been involved in an effort to validate two thermal ice protection codes developed at the NASA Lewis Research Center: LEWICE/Thermal (electrothermal de-icing and anti-icing) and ANTICE (hot-gas and electrothermal anti-icing). The Thermal Code Validation effort was designated as a priority during a 1994 'peer review' of the NASA Lewis icing program, and was implemented as a cooperative effort with industry. During April 1996, the first of a series of experimental validation tests was conducted in the NASA Lewis Icing Research Tunnel (IRT). The purpose of the April 1996 test was to validate the electrothermal predictive capabilities of both LEWICE/Thermal and ANTICE. A heavily instrumented test article was designed and fabricated for this test, with the capability of simulating electrothermal de-icing and anti-icing modes of operation. Thermal measurements were then obtained over a range of test conditions for comparison with analytical predictions. This paper presents an overview of the test, including a detailed description of: (1) the validation process; (2) test article design; (3) test matrix development; and (4) test procedures. Selected experimental results are presented for de-icing and anti-icing modes of operation. Finally, the current status of the validation effort is summarized. Detailed comparisons between analytical predictions and experimental results are contained in the following two papers: 'Validation of NASA Thermal Ice Protection Computer Codes: Part 2, The Validation of LEWICE/Thermal' and 'Validation of NASA Thermal Ice Protection Computer Codes: Part 3, The Validation of ANTICE'.
From military to civil loadings: Preliminary numerical-based thorax injury criteria investigations.
Goumtcha, Aristide Awoukeng; Bodo, Michèle; Taddei, Lorenzo; Roth, Sébastien
2016-03-01
The effects of the impact of a mechanical structure on the human body are of great interest in the understanding of body trauma. Experimental tests with PMHS (Post Mortem Human Subjects), observing impact forces or displacement time histories, have led to first conclusions about the dangerousness of an impact and have provided valuable data for the development and validation of numerical biomechanical models. These models, widely used in the framework of automotive crashworthiness, have led to the development of numerical-based injury criteria and tolerance thresholds. The aim of this process is to improve the safety of mechanical structures interacting with the body. In a military context, investigations at both the experimental and numerical levels are less advanced. For both military and civil frameworks, the literature lists a number of numerical analyses that propose injury mechanisms and tolerance thresholds based on biofidelic finite element (FE) models of different parts of the human body. However, the link between the two frameworks is not obvious, since many parameters differ: large-mass impacts at relatively low velocity for civil impacts (falls, automotive crashworthiness) versus low-mass impacts at very high velocity for military loadings (ballistic, blast). In this study, different accident cases were investigated and replicated with a previously developed and validated FE model of the human thorax named the Hermaphrodite Universal Biomechanical YX model (HUBYX model). Its previous validations included replications of standard experimental tests often used to validate models in the automotive industry context, experimental ballistic tests in high-speed dynamic impact, and numerical replication of a blast loading test, ensuring its biofidelity. In order to extend the use of this model to other frameworks, some real-world accidents were reconstructed, and the consequences of these loadings on the FE model were explored.
These various numerical replications of accidents from different contexts raise the question of the ability of an FE model to correctly predict several kinds of trauma, from blast or ballistic impacts to falls, sports or automotive ones, in the context of investigating numerical injury mechanisms and tolerance limits.
2016-05-24
experimental data. However, the time and length scales, and energy deposition rates in the canonical laboratory flames that have been studied over the...is to obtain high-fidelity experimental data critically needed to validate research codes at relevant conditions, and to develop systematic and...validated with experimental data. However, the time and length scales, and energy deposition rates in the canonical laboratory flames that have been
2016-06-02
Retrieval of droplet-size density distribution from multiple-field-of-view cross-polarized lidar signals: theory and experimental validation...theoretical and experimental studies of multiple scattering and multiple-field-of-view (MFOV) lidar detection have made possible the retrieval of cloud...droplet cloud are typical of Rayleigh scattering, with a signature close to a dipole (phase function quasi-flat and a zero-depolarization ratio
2001-08-30
Body with Thermo-Chemical distribution of Heat-Protected System. In: Physical and Gasdynamic Phenomena in Supersonic Flows Over Bodies. Edit. By...Final Report on ISTC Contract # 1809p Parametric Study of Advanced Mixing of Fuel/Oxidant System in High Speed Gaseous Flows and Experimental...of Advanced Mixing of Fuel/Oxidant System in High Speed Gaseous Flows and Experimental Validation Planning
Experimental validation of ultrasonic guided modes in electrical cables by optical interferometry.
Mateo, Carlos; de Espinosa, Francisco Montero; Gómez-Ullate, Yago; Talavera, Juan A
2008-03-01
In this work, the dispersion curves of elastic waves propagating in electrical cables and in bare copper wires are obtained theoretically and validated experimentally. The theoretical model, based on Gazis equations formulated according to the global matrix methodology, is resolved numerically. Viscoelasticity and attenuation are modeled theoretically using the Kelvin-Voigt model. Experimental tests are carried out using interferometry. There is good agreement between the simulations and the experiments despite the peculiarities of electrical cables.
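The Kelvin-Voigt treatment mentioned above replaces the elastic modulus with a complex, frequency-dependent one, from which both phase velocity and attenuation follow. A minimal sketch, with copper-like stiffness and density but a purely illustrative viscosity (not the paper's fitted cable parameters):

```python
import cmath
import math

def kelvin_voigt_wavenumber(omega, E, eta, rho):
    """Complex longitudinal wavenumber for a Kelvin-Voigt solid.

    The Kelvin-Voigt model replaces the elastic modulus E with the complex,
    frequency-dependent modulus E* = E + i*omega*eta, so phase velocity and
    attenuation both follow from k = omega / sqrt(E*/rho).
    """
    E_star = complex(E, omega * eta)      # E* = E + i*omega*eta
    c_star = cmath.sqrt(E_star / rho)     # complex phase velocity
    return omega / c_star                 # k = k' - i*alpha

# Copper-like stiffness and density with a small, purely illustrative
# viscosity eta; these are not the paper's fitted cable parameters.
omega = 2.0 * math.pi * 100e3             # 100 kHz angular frequency
k = kelvin_voigt_wavenumber(omega, 110e9, 5e3, 8900.0)
phase_velocity = omega / k.real           # m/s
attenuation = -k.imag                     # Np/m
```

The single complex modulus is what makes the Kelvin-Voigt model convenient for dispersion-curve codes: damping enters the same matrix formulation as elasticity.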
Translating Fatigue to Human Performance.
Enoka, Roger M; Duchateau, Jacques
2016-11-01
Despite flourishing interest in the topic of fatigue-as indicated by the many presentations on fatigue at the 2015 Annual Meeting of the American College of Sports Medicine-surprisingly little is known about its effect on human performance. There are two main reasons for this dilemma: 1) the inability of current terminology to accommodate the scope of the conditions ascribed to fatigue, and 2) a paucity of validated experimental models. In contrast to current practice, a case is made for a unified definition of fatigue to facilitate its management in health and disease. On the basis of the classic two-domain concept of Mosso, fatigue is defined as a disabling symptom in which physical and cognitive function is limited by interactions between performance fatigability and perceived fatigability. As a symptom, fatigue can only be measured by self-report, quantified as either a trait characteristic or a state variable. One consequence of such a definition is that the word fatigue should not be preceded by an adjective (e.g., central, mental, muscle, peripheral, and supraspinal) to suggest the locus of the changes responsible for an observed level of fatigue. Rather, mechanistic studies should be performed with validated experimental models to identify the changes responsible for the reported fatigue. As indicated by three examples (walking endurance in old adults, time trials by endurance athletes, and fatigue in persons with multiple sclerosis) discussed in the review, however, it has proven challenging to develop valid experimental models of fatigue. The proposed framework provides a foundation to address the many gaps in knowledge of how laboratory measures of fatigue and fatigability affect real-world performance.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tojo, H., E-mail: tojo.hiroshi@qst.go.jp; Hiratsuka, J.; Yatsuka, E.
2016-09-15
This paper evaluates the accuracy of electron temperature measurements and relative transmissivities of double-pass Thomson scattering diagnostics. The electron temperature (Te) is obtained from the ratio of signals from a double-pass scattering system, and relative transmissivities are then calculated from the measured Te and the intensity of the signals. The accuracy of these values depends on the electron temperature and the scattering angle (θ), and was therefore evaluated experimentally using the Large Helical Device (LHD) and the Tokyo spherical tokamak-2 (TST-2). Analysis of the TST-2 data indicates that a high Te and a large scattering angle yield accurate values. Indeed, the errors for scattering angle θ = 135° are approximately half of those for θ = 115°. The method of determining Te over a range spanning two orders of magnitude (0.01-1.5 keV) was validated using the experimental results from the LHD and TST-2. A simple method for obtaining relative transmissivities, which include contributions from the collection optics, vacuum window, optical fibers, and polychromators, is also presented. The relative errors were less than approximately 10%. Numerical simulations also indicate that the Te measurements remain valid under harsh radiation conditions. This method of obtaining Te can be considered in the design of Thomson scattering systems for high-performance plasmas that generate harsh radiation environments.
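The core of the ratio method described above is inverting a monotone calibration curve of signal ratio versus Te. A minimal sketch with a toy calibration function (the real curve comes from the polychromator spectral response convolved with the scattered spectrum, not this expression):

```python
import math

# Toy monotone calibration curve: signal ratio as a function of T_e (keV).
# In a real double-pass system this curve comes from the polychromator
# spectral response convolved with the Thomson-scattered spectrum; the
# expression below is a made-up stand-in with the right monotone shape.
def calib_ratio(te_kev):
    return 1.0 - math.exp(-te_kev / 0.5)

def te_from_ratio(r, lo=1e-3, hi=2.0, tol=1e-6):
    """Recover T_e (keV) by bisecting the monotone calibration curve."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if calib_ratio(mid) < r:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Round trip: the ratio produced by T_e = 0.8 keV maps back to ~0.8 keV.
te = te_from_ratio(calib_ratio(0.8))
```

Because the ratio is monotone in Te over the calibrated range, a simple bracketing search suffices; no detailed spectral fit is needed at measurement time.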
Federal Register 2010, 2011, 2012, 2013, 2014
2012-01-05
... modeling needs and experimental validation techniques for complex flow phenomena in and around offshore... experimental validation. Ultimately, research in this area may lead to significant improvements in wind plant... meeting will consist of an initial plenary session in which invited speakers will survey available...
Validation of a Monte Carlo simulation of the Inveon PET scanner using GATE
NASA Astrophysics Data System (ADS)
Lu, Lijun; Zhang, Houjin; Bian, Zhaoying; Ma, Jianhua; Feng, Qiangjin; Chen, Wufan
2016-08-01
The purpose of this study is to validate the application of the GATE (Geant4 Application for Tomographic Emission) Monte Carlo simulation toolkit for modeling the performance characteristics of the Siemens Inveon small-animal PET system. The simulation results were validated against experimental/published data in accordance with the NEMA NU-4 2008 protocol for standardized evaluation of the spatial resolution, sensitivity, scatter fraction (SF), and noise equivalent count rate (NECR) of a preclinical PET system. Agreement within 18% was obtained between the radial, tangential, and axial spatial resolutions of the simulated and experimental results. The simulated peak NECR of the mouse-size phantom agreed with the experimental result, while for the rat-size phantom the simulated value was higher than the experimental result. The simulated and experimental SFs of the mouse- and rat-size phantoms agreed to within 2%. These results demonstrate the feasibility of our GATE model to accurately simulate, within certain limits, all major performance characteristics of the Inveon PET system.
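The NEMA NU-4 quantities compared above reduce to simple count-rate formulas. A sketch with made-up coincidence rates (not Inveon data):

```python
def scatter_fraction(trues, scatters):
    """NEMA scatter fraction SF = S / (T + S)."""
    return scatters / (trues + scatters)

def necr(trues, scatters, randoms):
    """Noise equivalent count rate NECR = T^2 / (T + S + R)."""
    return trues ** 2 / (trues + scatters + randoms)

# Illustrative coincidence count rates in counts/s (not Inveon data).
T, S, R = 100e3, 10e3, 5e3
sf = scatter_fraction(T, S)
peak_necr = necr(T, S, R)
```

NECR penalizes scattered and random coincidences because they add noise without adding signal, which is why it is the headline count-rate figure of merit in these comparisons.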
Validation of an automated mite counter for Dermanyssus gallinae in experimental laying hen cages.
Mul, Monique F; van Riel, Johan W; Meerburg, Bastiaan G; Dicke, Marcel; George, David R; Groot Koerkamp, Peter W G
2015-08-01
For integrated pest management (IPM) programs to be maximally effective, monitoring of the growth and decline of the pest populations is essential. Here, we present the validation results of a new automated monitoring device for the poultry red mite (Dermanyssus gallinae), a serious pest in laying hen facilities world-wide. This monitoring device (called an "automated mite counter") was validated in experimental laying hen cages with live birds and a growing population of D. gallinae. This validation study resulted in 17 data points of 'number of mites counted' by the automated mite counter and the 'number of mites present' in the experimental laying hen cages. The study demonstrated that the automated mite counter was able to track the D. gallinae population effectively. A wider evaluation showed that this automated mite counter can become a useful tool in IPM of D. gallinae in laying hen facilities.
Review and assessment of turbulence models for hypersonic flows
NASA Astrophysics Data System (ADS)
Roy, Christopher J.; Blottner, Frederick G.
2006-10-01
Accurate aerodynamic prediction is critical for the design and optimization of hypersonic vehicles. Turbulence modeling remains a major source of uncertainty in the computational prediction of aerodynamic forces and heating for these systems. The first goal of this article is to update the previous comprehensive review of hypersonic shock/turbulent boundary-layer interaction experiments published in 1991 by Settles and Dodson (Hypersonic shock/boundary-layer interaction database. NASA CR 177577, 1991). In their review, Settles and Dodson developed a methodology for assessing experiments appropriate for turbulence model validation and critically surveyed the existing hypersonic experiments. We limit the scope of our current effort by considering only two-dimensional (2D)/axisymmetric flows in the hypersonic flow regime where calorically perfect gas models are appropriate. We extend the prior database of recommended hypersonic experiments (on four 2D and two 3D shock-interaction geometries) by adding three new geometries. The first two geometries, the flat plate/cylinder and the sharp cone, are canonical, zero-pressure gradient flows which are amenable to theory-based correlations, and these correlations are discussed in detail. The third geometry added is the 2D shock impinging on a turbulent flat plate boundary layer. The current 2D hypersonic database for shock-interaction flows thus consists of nine experiments on five different geometries. The second goal of this study is to review and assess the validation usage of various turbulence models on the existing experimental database. Here we limit the scope to one- and two-equation turbulence models where integration to the wall is used (i.e., we omit studies involving wall functions). A methodology for validating turbulence models is given, followed by an extensive evaluation of the turbulence models on the current hypersonic experimental database. 
A total of 18 one- and two-equation turbulence models are reviewed, and results of turbulence model assessments for the six models that have been extensively applied to the hypersonic validation database are compiled and presented in graphical form. While some of the turbulence models do provide reasonable predictions for the surface pressure, the predictions for surface heat flux are generally poor, and often in error by a factor of four or more. In the vast majority of the turbulence model validation studies we review, the authors fail to adequately address the numerical accuracy of the simulations (i.e., discretization and iterative error) and the sensitivities of the model predictions to freestream turbulence quantities or near-wall y+ mesh spacing. We recommend new hypersonic experiments be conducted which (1) measure not only surface quantities but also mean and fluctuating quantities in the interaction region and (2) provide careful estimates of both random experimental uncertainties and correlated bias errors for the measured quantities and freestream conditions. For the turbulence models, we recommend that a wide range of turbulence models (including newer models) be re-examined on the current hypersonic experimental database, including the more recent experiments. Any future turbulence model validation efforts should carefully assess the numerical accuracy and model sensitivities. In addition, model corrections (e.g., compressibility corrections) should be carefully examined for their effects on a standard, low-speed validation database. Finally, as new experiments or direct numerical simulation data become available with information on mean and fluctuating quantities, they should be used to improve the turbulence models and thus increase their predictive capability.
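One of the sensitivities flagged above is the near-wall y+ mesh spacing. A common pre-meshing estimate of the first-cell height for a target y+ uses a flat-plate skin-friction correlation; a sketch with illustrative freestream values:

```python
import math

def first_cell_height(y_plus, u_inf, rho, mu, length):
    """Wall-normal first-cell height for a target y+ (a meshing estimate).

    Uses the common flat-plate correlation Cf = 0.026 / Re_x^(1/7) for the
    skin friction, then y = y+ * mu / (rho * u_tau). This is a pre-meshing
    estimate only; the achieved y+ must still be checked after the run.
    """
    re_x = rho * u_inf * length / mu
    cf = 0.026 / re_x ** (1.0 / 7.0)
    tau_w = 0.5 * cf * rho * u_inf ** 2          # wall shear stress
    u_tau = math.sqrt(tau_w / rho)               # friction velocity
    return y_plus * mu / (rho * u_tau)

# Illustrative high-speed air flow (all freestream values made up).
h = first_cell_height(y_plus=1.0, u_inf=1500.0, rho=0.1, mu=1.5e-5, length=0.5)
```

For wall-integrated one- and two-equation models, a first cell at y+ of order one is the usual target; reporting the achieved y+ is part of the numerical-accuracy discipline the review calls for.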
van de Streek, Jacco; Neumann, Marcus A
2010-10-01
This paper describes the validation of a dispersion-corrected density functional theory (d-DFT) method for the purpose of assessing the correctness of experimental organic crystal structures and enhancing the information content of purely experimental data. 241 experimental organic crystal structures from the August 2008 issue of Acta Cryst. Section E were energy-minimized in full, including unit-cell parameters. The differences between the experimental and the minimized crystal structures were subjected to statistical analysis. The r.m.s. Cartesian displacement excluding H atoms upon energy minimization with flexible unit-cell parameters is selected as a pertinent indicator of the correctness of a crystal structure. All 241 experimental crystal structures are reproduced very well: the average r.m.s. Cartesian displacement for the 241 crystal structures, including 16 disordered structures, is only 0.095 Å (0.084 Å for the 225 ordered structures). R.m.s. Cartesian displacements above 0.25 Å either indicate incorrect experimental crystal structures or reveal interesting structural features such as exceptionally large temperature effects, incorrectly modelled disorder or symmetry-breaking H atoms. After validation, the method is applied to nine examples that are known to be ambiguous or subtly incorrect.
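The r.m.s. Cartesian displacement indicator used above is straightforward to compute once H atoms are filtered out. A minimal sketch with made-up coordinates:

```python
import math

def rmsd(coords_a, coords_b):
    """R.m.s. Cartesian displacement between two conformations (Angstroms).

    Inputs are matched lists of (x, y, z) tuples in the same frame; H atoms
    are assumed to have been filtered out upstream, as in the paper.
    """
    n = len(coords_a)
    sq = sum((ax - bx) ** 2 + (ay - by) ** 2 + (az - bz) ** 2
             for (ax, ay, az), (bx, by, bz) in zip(coords_a, coords_b))
    return math.sqrt(sq / n)

# Two atoms each displaced 0.1 A along x give an r.m.s. displacement of
# 0.1 A, well below the 0.25 A threshold flagged as suspicious above.
before = [(0.0, 0.0, 0.0), (1.5, 0.0, 0.0)]
after = [(0.1, 0.0, 0.0), (1.6, 0.0, 0.0)]
d = rmsd(before, after)
```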
QSAR modeling: where have you been? Where are you going to?
Cherkasov, Artem; Muratov, Eugene N; Fourches, Denis; Varnek, Alexandre; Baskin, Igor I; Cronin, Mark; Dearden, John; Gramatica, Paola; Martin, Yvonne C; Todeschini, Roberto; Consonni, Viviana; Kuz'min, Victor E; Cramer, Richard; Benigni, Romualdo; Yang, Chihae; Rathman, James; Terfloth, Lothar; Gasteiger, Johann; Richard, Ann; Tropsha, Alexander
2014-06-26
Quantitative structure-activity relationship modeling is one of the major computational tools employed in medicinal chemistry. However, throughout its entire history it has drawn both praise and criticism concerning its reliability, limitations, successes, and failures. In this paper, we discuss (i) the development and evolution of QSAR; (ii) the current trends, unsolved problems, and pressing challenges; and (iii) several novel and emerging applications of QSAR modeling. Throughout this discussion, we provide guidelines for QSAR development, validation, and application, which are summarized in best practices for building rigorously validated and externally predictive QSAR models. We hope that this Perspective will help communications between computational and experimental chemists toward collaborative development and use of QSAR models. We also believe that the guidelines presented here will help journal editors and reviewers apply more stringent scientific standards to manuscripts reporting new QSAR studies, as well as encourage the use of high quality, validated QSARs for regulatory decision making.
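The external-validation discipline recommended above can be sketched in a few lines: fit only on a training set, then score on compounds the fit never saw. The data and the one-descriptor linear model below are hypothetical:

```python
# Hypothetical one-descriptor data set, split into training and external
# test compounds *before* any fitting (all numbers are made up).
def fit_line(xs, ys):
    """Ordinary least-squares fit of y = m*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    m = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return m, my - m * mx

def r_squared(ys, preds):
    """Coefficient of determination on held-out data."""
    my = sum(ys) / len(ys)
    ss_res = sum((y - p) ** 2 for y, p in zip(ys, preds))
    ss_tot = sum((y - my) ** 2 for y in ys)
    return 1.0 - ss_res / ss_tot

train_x, train_y = [0.0, 1.0, 2.0, 3.0], [0.0, 1.0, 2.0, 3.0]
test_x, test_y = [4.0, 5.0], [4.1, 5.1]

m, b = fit_line(train_x, train_y)                      # fit on training only
external_r2 = r_squared(test_y, [m * x + b for x in test_x])
```

Internal goodness-of-fit on the training set says nothing about predictivity; the external score, computed only on held-out compounds, is the number the best-practice guidelines ask for.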
Assessing Students' Understanding of Macroevolution: Concerns regarding the validity of the MUM
NASA Astrophysics Data System (ADS)
Novick, Laura R.; Catley, Kefyn M.
2012-11-01
In a recent article, Nadelson and Southerland (2010. Development and preliminary evaluation of the Measure of Understanding of Macroevolution: Introducing the MUM. The Journal of Experimental Education, 78, 151-190) reported on their development of a multiple-choice concept inventory intended to assess college students' understanding of macroevolutionary concepts, the Measure of Understanding Macroevolution (MUM). Given that the only existing evolution inventories assess understanding of natural selection, a microevolutionary concept, a valid assessment of students' understanding of macroevolution would be a welcome and necessary addition to the field of science education. Although the conceptual framework underlying Nadelson and Southerland's test is promising, we believe the test has serious shortcomings with respect to validity evidence for the construct being tested. We argue and provide evidence that these problems are serious enough that the MUM should not be used in its current form to measure students' understanding of macroevolution.
Young, Jasmine Y.; Westbrook, John D.; Feng, Zukang; Sala, Raul; Peisach, Ezra; Oldfield, Thomas J.; Sen, Sanchayita; Gutmanas, Aleksandras; Armstrong, David R.; Berrisford, John M.; Chen, Li; Chen, Minyu; Di Costanzo, Luigi; Dimitropoulos, Dimitris; Gao, Guanghua; Ghosh, Sutapa; Gore, Swanand; Guranovic, Vladimir; Hendrickx, Pieter MS; Hudson, Brian P.; Igarashi, Reiko; Ikegawa, Yasuyo; Kobayashi, Naohiro; Lawson, Catherine L.; Liang, Yuhe; Mading, Steve; Mak, Lora; Mir, M. Saqib; Mukhopadhyay, Abhik; Patwardhan, Ardan; Persikova, Irina; Rinaldi, Luana; Sanz-Garcia, Eduardo; Sekharan, Monica R.; Shao, Chenghua; Swaminathan, G. Jawahar; Tan, Lihua; Ulrich, Eldon L.; van Ginkel, Glen; Yamashita, Reiko; Yang, Huanwang; Zhuravleva, Marina A.; Quesada, Martha; Kleywegt, Gerard J.; Berman, Helen M.; Markley, John L.; Nakamura, Haruki; Velankar, Sameer; Burley, Stephen K.
2017-01-01
OneDep, a unified system for deposition, biocuration, and validation of experimentally determined structures of biological macromolecules to the Protein Data Bank (PDB) archive, has been developed as a global collaboration by the Worldwide Protein Data Bank (wwPDB) partners. This new system was designed to ensure that the wwPDB could meet the evolving archiving requirements of the scientific community over the coming decades. OneDep unifies deposition, biocuration, and validation pipelines across all wwPDB, EMDB, and BMRB deposition sites with improved focus on data quality and completeness in these archives, while supporting growth in the number of depositions and increases in their average size and complexity. In this paper, we describe the design, functional operation, and supporting infrastructure of the OneDep system, and provide initial performance assessments. PMID:28190782
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pardini, Tom; Aquila, Andrew; Boutet, Sebastien
2017-06-15
Numerical simulations of the current and future pulse intensity distributions at selected locations along the Far Experimental Hall, the hard X-ray section of the Linac Coherent Light Source (LCLS), are provided. Estimates are given for the pulse fluence, energy and size in and out of focus, taking into account effects due to the experimentally measured divergence of the X-ray beam, and measured figure errors of all X-ray optics in the beam path. Out-of-focus results are validated by comparison with experimental data. Previous work is expanded on, providing quantitatively correct predictions of the pulse intensity distribution. Numerical estimates in focus are particularly important given that the latter cannot be measured with direct imaging techniques due to detector damage. Finally, novel numerical estimates of improvements to the pulse intensity distribution expected as part of the on-going upgrade of the LCLS X-ray transport system are provided. As a result, we suggest how the new generation of X-ray optics to be installed would outperform the old one, satisfying the tight requirements imposed by X-ray free-electron laser facilities.
Rousu, Matthew C.; Thrasher, James F.
2014-01-01
Experimental and observational research often involves asking consumers to self-report the impact of some proposed option. Because self-reported responses involve no consequence to the respondent for falsely revealing how he or she feels about an issue, self-reports may be subject to social desirability and other influences that bias responses in important ways. In this article, we analyzed data from an experiment on the impact of cigarette packaging and pack warnings, comparing smokers’ self-reported impact (four-item scale) and the bids they placed in experimental auctions to estimate differences in demand. The results were consistent across methods; however, the estimated effect size associated with different warning labels was two times greater for the four-item self-reported response scale when compared to the change in demand as indicated by auction bids. Our study provides evidence that self-reported psychosocial responses provide a valid proxy for behavioral change as reflected by experimental auction bidding behavior. More research is needed to better understand the advantages and disadvantages of behavioral economic methods and traditional self-report approaches to evaluating health behavior change interventions. PMID:24399267
Expanding the horizons of microRNA bioinformatics.
Huntley, Rachael P; Kramarz, Barbara; Sawford, Tony; Umrao, Zara; Kalea, Anastasia Z; Acquaah, Vanessa; Martin, Maria-Jesus; Mayr, Manuel; Lovering, Ruth C
2018-06-05
MicroRNA regulation of key biological and developmental pathways is a rapidly expanding area of research, accompanied by vast amounts of experimental data. This data, however, is not widely available in bioinformatic resources, making it difficult for researchers to find and analyse microRNA-related experimental data and define further research projects. We are addressing this problem by providing two new bioinformatics datasets that contain experimentally verified functional information for mammalian microRNAs involved in cardiovascular-relevant, and other, processes. To date, our resource provides over 3,900 Gene Ontology annotations associated with almost 500 miRNAs from human, mouse and rat and over 2,200 experimentally validated miRNA:target interactions. We illustrate how this resource can be used to create miRNA-focused interaction networks with a biological context using the known biological role of miRNAs and the mRNAs they regulate, enabling discovery of associations between gene products, biological pathways and, ultimately, diseases. This data will be crucial in advancing the field of microRNA bioinformatics and will establish consistent datasets for reproducible functional analysis of microRNAs across all biological research areas. Published by Cold Spring Harbor Laboratory Press for the RNA Society.
ARMOUR - A Rice miRNA: mRNA Interaction Resource.
Sanan-Mishra, Neeti; Tripathi, Anita; Goswami, Kavita; Shukla, Rohit N; Vasudevan, Madavan; Goswami, Hitesh
2018-01-01
ARMOUR was developed as A Rice miRNA:mRNA interaction resource. This informative and interactive database includes the experimentally validated expression profiles of miRNAs under different developmental and abiotic stress conditions across seven Indian rice cultivars. This comprehensive database covers 689 known and 1664 predicted novel miRNAs and their expression profiles in more than 38 different tissues or conditions, along with their predicted/known target transcripts. The understanding of the miRNA:mRNA interactome in the regulation of functional cellular machinery is supported by the sequence information of the mature and hairpin structures. ARMOUR offers users flexibility in querying the database in multiple ways, such as by known gene identifiers, gene ontology identifiers, or KEGG identifiers, and also allows on-the-fly fold-change analysis and sequence search queries with an inbuilt BLAST algorithm. ARMOUR provides a cohesive platform for novel and mature miRNAs and their expression in different experimental conditions, and allows searching for their interacting mRNA targets, GO annotations, and their involvement in various biological pathways. The database includes a provision for users to add more experimental data, with the aim of developing it into a platform for sharing and comparing experimental data contributed by research groups working on rice.
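The on-the-fly fold-change analysis mentioned above typically amounts to a log2 ratio of expression values. A minimal sketch; the counts and the pseudocount convention are illustrative, not ARMOUR's actual implementation:

```python
import math

def log2_fold_change(count_a, count_b, pseudocount=1.0):
    """log2 fold change between two conditions with a pseudocount.

    The pseudocount keeps zero read counts finite; both the counts below
    and the pseudocount convention are illustrative, not ARMOUR's own.
    """
    return math.log2((count_a + pseudocount) / (count_b + pseudocount))

# A miRNA counted 127 times under stress vs 31 in control comes out two
# log2 units up-regulated with these made-up counts.
fc = log2_fold_change(127, 31)
```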
NASA Astrophysics Data System (ADS)
Murray, Mark; Shorter, Alex; Howle, Laurens; Johnson, Mark; Moore, Michael
2012-11-01
The improvement and miniaturization of sensing technologies has made bio-logging tags, used for the study of marine mammal behavior, more practical. These sophisticated sensing packages require a housing that protects the electronics from the environment and provides a means of attachment to the animal. The hydrodynamic forces on these housings can inadvertently remove the tag or adversely affect the behavior or energetics of the animal. A modification to the original design of a suction cup bio-logging tag housing was desired to minimize these adverse forces. In this work, the hydrodynamic loading of two suction cup tag housings, the original and a modified design, was analyzed using computational fluid dynamics (CFD) models and validated experimentally. Overall, the simulation and experimental results demonstrated that a tag housing that minimizes geometric disruptions to the flow reduces drag forces, and that a tag housing with a small frontal cross-sectional area close to the attachment surface reduces lift forces. Preliminary results from experimental work with a common dolphin cadaver indicate that the suction cups used to attach the tags to the animal provide sufficient attachment force to resist failure at the drag and lift forces predicted for a 10 m/s flow.
Computational fluid dynamics modeling of laboratory flames and an industrial flare.
Singh, Kanwar Devesh; Gangadharan, Preeti; Chen, Daniel H; Lou, Helen H; Li, Xianchang; Richmond, Peyton
2014-11-01
A computational fluid dynamics (CFD) methodology for simulating the combustion process has been validated against experimental results. Three different experimental setups were used to validate the CFD model: an industrial-scale flare setup and two lab-scale flames. The CFD study also involved three different fuels: C3H6/CH/Air/N2, C2H4/O2/Ar, and CH4/Air. In the first setup, flare efficiency data from the Texas Commission on Environmental Quality (TCEQ) 2010 field tests were used to validate the CFD model. In the second setup, a McKenna burner with flat flames was simulated, and temperature and mass fractions of important species were compared with the experimental data. Finally, results of an experimental study done at Sandia National Laboratories to generate a lifted jet flame were used for the purpose of validation. The reduced 50-species mechanism LU 1.1, the realizable k-epsilon turbulence model, and the EDC turbulence-chemistry interaction model were used for this work. Flare efficiency, axial profiles of temperature, and mass fractions of various intermediate species obtained in the simulation were compared with experimental data, and good agreement between the profiles was clearly observed. In particular, the simulation match with the TCEQ 2010 flare tests has been significantly improved (to within 5% of the data) compared with the results reported by Singh et al. in 2012. Validation of the speciated flat-flame data supports the view that flares can be a primary source of formaldehyde emission.
An Experimentally Validated Numerical Modeling Technique for Perforated Plate Heat Exchangers
Nellis, G. F.; Kelin, S. A.; Zhu, W.; Gianchandani, Y.
2010-01-01
Cryogenic and high-temperature systems often require compact heat exchangers with a high resistance to axial conduction in order to control the heat transfer induced by axial temperature differences. One attractive design for such applications is a perforated plate heat exchanger that utilizes high conductivity perforated plates to provide the stream-to-stream heat transfer and low conductivity spacers to prevent axial conduction between the perforated plates. This paper presents a numerical model of a perforated plate heat exchanger that accounts for axial conduction, external parasitic heat loads, variable fluid and material properties, and conduction to and from the ends of the heat exchanger. The numerical model is validated by experimentally testing several perforated plate heat exchangers that are fabricated using microelectromechanical systems based manufacturing methods. This type of heat exchanger was investigated for potential use in a cryosurgical probe. One of these heat exchangers included perforated plates with integrated platinum resistance thermometers. These plates provided in situ measurements of the internal temperature distribution in addition to the temperature, pressure, and flow rate measured at the inlet and exit ports of the device. The platinum wires were deposited between the fluid passages on the perforated plate and are used to measure the temperature at the interface between the wall material and the flowing fluid. The experimental testing demonstrates the ability of the numerical model to accurately predict both the overall performance and the internal temperature distribution of perforated plate heat exchangers over a range of geometry and operating conditions. The parameters that were varied include the axial length, temperature range, mass flow rate, and working fluid. PMID:20976021
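The design idea above, low-conductivity spacers suppressing axial conduction between high-conductivity plates, can be illustrated with a series thermal-resistance estimate; all resistance values below are made up:

```python
def stack_resistance(n_plates, plate_r, spacer_r):
    """Axial thermal resistance of an alternating plate/spacer stack (K/W).

    Plates and spacers act in series along the axis, so the low-conductivity
    spacers dominate and suppress the axial heat leak.
    """
    return n_plates * plate_r + (n_plates - 1) * spacer_r

# 20 high-conductivity plates (0.01 K/W each) separated by low-conductivity
# spacers (5 K/W each); all values are made up for illustration.
r_axial = stack_resistance(20, 0.01, 5.0)
q_axial = 100.0 / r_axial     # axial heat leak (W) for a 100 K difference
```

Even though the plates conduct three orders of magnitude better than the spacers, the series arrangement means the spacers set the axial leak, which is exactly why the stream-to-stream heat transfer in the plates can be high while axial conduction stays low.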
NASA Astrophysics Data System (ADS)
Egron, Sylvain; Soummer, Rémi; Lajoie, Charles-Philippe; Bonnefois, Aurélie; Long, Joseph; Michau, Vincent; Choquet, Elodie; Ferrari, Marc; Leboulleux, Lucie; Levecq, Olivier; Mazoyer, Johan; N'Diaye, Mamadou; Perrin, Marshall; Petrone, Peter; Pueyo, Laurent; Sivaramakrishnan, Anand
2017-09-01
The James Webb Space Telescope (JWST) Optical Simulation Testbed (JOST) is a tabletop experiment designed to study wavefront sensing and control for a segmented space telescope, such as JWST. With the JWST Science and Operations Center co-located at STScI, JOST was developed both to provide a platform for staff training and to test alternate wavefront sensing and control strategies for independent validation or future improvements beyond the baseline operations. The design of JOST reproduces the physics of JWST's three-mirror anastigmat (TMA) using three custom aspheric lenses. It provides image quality similar to that of JWST (80% Strehl ratio) over a field equivalent to a NIRCam module, but at 633 nm. An Iris AO segmented mirror stands in for the segmented primary mirror of JWST. Actuators allow us to control (1) the 18 segments of the segmented mirror in piston, tip, and tilt and (2) the second lens, which stands in for the secondary mirror, in tip, tilt, and x, y, z positions. We present the most recent experimental results for the segmented mirror alignment. Our implementation of the wavefront sensing (WFS) algorithms using phase diversity is tested both in simulation and experimentally. The wavefront control (WFC) algorithms, which rely on a linear model for optical aberrations induced by misalignment of the secondary lens and the segmented mirror, are tested and validated both in simulation and experimentally. In this proceeding, we present the performance of the full active optics control loop in the presence of perturbations on the segmented mirror, and we detail the quality of the alignment correction.
NASA Technical Reports Server (NTRS)
Hazelton, R. C.; Yadlowsky, E. J.; Churchill, R. J.; Parker, L. W.; Sellers, B.
1981-01-01
The effect of differential charging of spacecraft thermal control surfaces is assessed by studying the dynamics of the charging process. A program was established to experimentally validate a computer model of the charging process. Time-resolved measurements of the surface potential were obtained for samples of Kapton and Teflon irradiated with a monoenergetic electron beam. Results indicate that the computer model and the experimental measurements agree well and that, for Teflon, secondary emission is the governing factor. Experimental data indicate that bulk conductivities play a significant role in the charging of Kapton.
Progress Towards a Microgravity CFD Validation Study Using the ISS SPHERES-SLOSH Experiment
NASA Technical Reports Server (NTRS)
Storey, Jedediah M.; Kirk, Daniel; Marsell, Brandon (Editor); Schallhorn, Paul (Editor)
2017-01-01
Understanding, predicting, and controlling fluid slosh dynamics is critical to safety and to improving the performance of space missions when a significant percentage of the spacecraft's mass is a liquid. Computational fluid dynamics (CFD) simulations can be used to predict the dynamics of slosh, but these programs require extensive validation. Many CFD programs have been validated by slosh experiments using various fluids in Earth gravity, but prior to the ISS SPHERES-Slosh experiment, little experimental data for long-duration, zero-gravity slosh existed. This paper presents the current status of an ongoing CFD validation study using the ISS SPHERES-Slosh experimental data.
Design and Implementation of a Smart Home System Using Multisensor Data Fusion Technology.
Hsu, Yu-Liang; Chou, Po-Huan; Chang, Hsing-Cheng; Lin, Shyan-Lung; Yang, Shih-Chin; Su, Heng-Yi; Chang, Chih-Chien; Cheng, Yuan-Sheng; Kuo, Yu-Chen
2017-07-15
This paper aims to develop a multisensor data fusion technology-based smart home system by integrating wearable intelligent technology, artificial intelligence, and sensor fusion technology. We have developed the following three systems to create an intelligent smart home environment: (1) a wearable motion sensing device to be placed on residents' wrists and its corresponding 3D gesture recognition algorithm to implement a convenient automated household appliance control system; (2) a wearable motion sensing device mounted on a resident's feet and its indoor positioning algorithm to realize an effective indoor pedestrian navigation system for smart energy management; (3) a multisensor circuit module and an intelligent fire detection and alarm algorithm to realize a home safety and fire detection system. In addition, an intelligent monitoring interface is developed to provide real-time information about the smart home system, such as environmental temperatures, CO concentrations, communicative environmental alarms, household appliance status, human motion signals, and the results of gesture recognition and indoor positioning. Furthermore, an experimental testbed for validating the effectiveness and feasibility of the smart home system was built and verified experimentally. The results showed that the 3D gesture recognition algorithm achieved recognition rates for automated household appliance control of 92.0%, 94.8%, 95.3%, and 87.7% under the 2-fold, 5-fold, 10-fold, and leave-one-subject-out cross-validation strategies, respectively. For indoor positioning and smart energy management, the distance accuracy and positioning accuracy were around 0.22% and 3.36% of the total traveled distance in the indoor environment. For home safety and fire detection, the classification rate achieved 98.81% accuracy in determining the conditions of the indoor living environment.
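The validation strategies named in the abstract (k-fold and leave-one-subject-out cross-validation) can be sketched as index-splitting routines. This is a minimal illustrative sketch, not the authors' implementation; the function names and the round-robin fold assignment are my own assumptions.

```python
from typing import List, Sequence, Tuple

def k_fold_indices(n_samples: int, k: int) -> List[Tuple[List[int], List[int]]]:
    """Split sample indices into k (train, test) folds, round-robin style,
    as used when reporting k-fold cross-validated recognition rates."""
    folds = [list(range(i, n_samples, k)) for i in range(k)]
    splits = []
    for i, test in enumerate(folds):
        train = [idx for j, fold in enumerate(folds) if j != i for idx in fold]
        splits.append((train, test))
    return splits

def leave_one_subject_out(subject_ids: Sequence[int]) -> List[Tuple[List[int], List[int]]]:
    """Each unique subject becomes one held-out test set; all other
    subjects' samples form the training set."""
    splits = []
    for s in sorted(set(subject_ids)):
        test = [i for i, sid in enumerate(subject_ids) if sid == s]
        train = [i for i, sid in enumerate(subject_ids) if sid != s]
        splits.append((train, test))
    return splits
```

Leave-one-subject-out typically yields lower rates (87.7% here versus 92-95% for k-fold) because the classifier never sees the test subject's motion style during training.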
Pre-test CFD Calculations for a Bypass Flow Standard Problem
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rich Johnson
The bypass flow in a prismatic high temperature gas-cooled reactor (HTGR) is the flow that occurs between adjacent graphite blocks. Gaps exist between blocks due to variances in their manufacture and installation and because of the expansion and shrinkage of the blocks from heating and irradiation. Although the temperature of fuel compacts and graphite is sensitive to the presence of bypass flow, there is great uncertainty in the level and effects of the bypass flow. The Next Generation Nuclear Plant (NGNP) program at the Idaho National Laboratory has undertaken to produce experimental data of isothermal bypass flow between three adjacent graphite blocks. These data are intended to provide validation for computational fluid dynamic (CFD) analyses of the bypass flow. Such validation data sets are called Standard Problems in the nuclear safety analysis field. Details of the experimental apparatus as well as several pre-test calculations of the bypass flow are provided. Pre-test calculations are useful in examining the nature of the flow and to see if there are any problems associated with the flow and its measurement. The apparatus is designed to be able to provide three different gap widths in the vertical direction (the direction of the normal coolant flow) and two gap widths in the horizontal direction. It is expected that the vertical bypass flow will range from laminar to transitional to turbulent flow for the different gap widths that will be available.
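The laminar-to-turbulent classification mentioned for the different gap widths is conventionally done with a Reynolds number based on the gap's hydraulic diameter. The sketch below is a textbook-style illustration under assumed geometry and regime boundaries (the classic ~2300/~4000 pipe-flow thresholds), not the NGNP program's actual analysis.

```python
def gap_reynolds_number(mass_flow_kg_s: float, gap_width_m: float,
                        gap_depth_m: float, viscosity_pa_s: float) -> float:
    """Reynolds number for flow through a rectangular bypass gap,
    using the hydraulic diameter D_h = 4*A/P of the gap cross-section."""
    area = gap_width_m * gap_depth_m
    perimeter = 2.0 * (gap_width_m + gap_depth_m)
    d_h = 4.0 * area / perimeter
    mass_flux = mass_flow_kg_s / area  # G = mdot / A
    return mass_flux * d_h / viscosity_pa_s

def flow_regime(re: float) -> str:
    """Classify using conventional internal-flow thresholds."""
    if re < 2300:
        return "laminar"
    if re < 4000:
        return "transitional"
    return "turbulent"
```

For a narrow gap (e.g., 2 mm wide, 1 m deep) carrying 0.01 kg/s of gas with viscosity 1.8e-5 Pa·s, the resulting Reynolds number falls in the laminar range; widening the gap at fixed mass flux pushes the flow toward transition.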
Assessing the significance of pedobarographic signals using random field theory.
Pataky, Todd C
2008-08-07
Traditional pedobarographic statistical analyses are conducted over discrete regions. Recent studies have demonstrated that regionalization can corrupt pedobarographic field data through conflation when arbitrary dividing lines inappropriately delineate smooth field processes. An alternative is to register images such that homologous structures optimally overlap and then conduct statistical tests at each pixel to generate statistical parametric maps (SPMs). The significance of SPM processes may be assessed within the framework of random field theory (RFT). RFT is ideally suited to pedobarographic image analysis because its fundamental data unit is a lattice sampling of a smooth and continuous spatial field. To correct for the vast number of multiple comparisons inherent in such data, recent pedobarographic studies have employed a Bonferroni correction to retain a constant family-wise error rate. This approach unfortunately neglects the spatial correlation of neighbouring pixels, so provides an overly conservative (albeit valid) statistical threshold. RFT generally relaxes the threshold depending on field smoothness and on the geometry of the search area, but it also provides a framework for assigning p values to suprathreshold clusters based on their spatial extent. The current paper provides an overview of basic RFT concepts and uses simulated and experimental data to validate both RFT-relevant field smoothness estimations and RFT predictions regarding the topological characteristics of random pedobarographic fields. Finally, previously published experimental data are re-analysed using RFT inference procedures to demonstrate how RFT yields easily understandable statistical results that may be incorporated into routine clinical and laboratory analyses.
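The contrast drawn above, Bonferroni treating every pixel as an independent test versus RFT relaxing the threshold according to field smoothness, can be made concrete. The sketch below uses a crude "resel" (resolution element) count rather than the full RFT expected Euler characteristic formula; the function names and the 2D resel approximation are my own illustrative assumptions.

```python
from statistics import NormalDist

def bonferroni_z_threshold(alpha: float, n_pixels: int) -> float:
    """Per-pixel z threshold keeping the family-wise error rate at alpha
    under Bonferroni: each pixel is tested at alpha / n_pixels."""
    return NormalDist().inv_cdf(1.0 - alpha / n_pixels)

def resel_z_threshold(alpha: float, n_pixels: int, fwhm_pixels: float) -> float:
    """Smoothness-aware threshold using a crude 2D resel count,
    n_resels = n_pixels / FWHM^2. Illustrative simplification only:
    full RFT corrects via the expected Euler characteristic instead."""
    n_resels = max(1.0, n_pixels / fwhm_pixels ** 2)
    return NormalDist().inv_cdf(1.0 - alpha / n_resels)
```

For a 100x100-pixel pressure image smoothed to a 5-pixel FWHM, Bonferroni demands z above roughly 4.4, while the resel-based count (400 effective tests) lowers the threshold to about 3.7, illustrating why Bonferroni is valid but overly conservative on spatially correlated fields.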
Numerical Investigation of Plasma Detachment in Magnetic Nozzle Experiments
NASA Technical Reports Server (NTRS)
Sankaran, Kamesh; Polzin, Kurt A.
2008-01-01
At present there exists no generally accepted theoretical model that provides a consistent physical explanation of plasma detachment from an externally-imposed magnetic nozzle. To make progress towards that end, simulation of plasma flow in the magnetic nozzle of an arcjet experiment is performed using a multidimensional numerical simulation tool that includes theoretical models of the various dispersive and dissipative processes present in the plasma. This is an extension of the simulation tool employed in previous work by Sankaran et al. The aim is to compare the computational results with various proposed magnetic nozzle detachment theories to develop an understanding of the physical mechanisms that cause detachment. An applied magnetic field topology is obtained using a magnetostatic field solver (see Fig. I), and this field is superimposed on the time-dependent magnetic field induced in the plasma to provide a self-consistent field description. The applied magnetic field and model geometry match those found in experiments by Kuriki and Okada. This geometry is modeled because there is a substantial amount of experimental data that can be compared to the computational results, allowing for validation of the model. In addition, comparison of the simulation results with the experimentally obtained plasma parameters will provide insight into the mechanisms that lead to plasma detachment, revealing how they scale with different input parameters. Further studies will focus on modeling literature experiments both for the purpose of additional code validation and to extract physical insight regarding the mechanisms driving detachment.
Schlosser, Ralf W; Belfiore, Phillip J; Sigafoos, Jeff; Briesch, Amy M; Wendt, Oliver
2018-05-28
Evidence-based practice as a process requires the appraisal of research as a critical step. In the field of developmental disabilities, single-case experimental designs (SCEDs) figure prominently as a means for evaluating the effectiveness of non-reversible instructional interventions. Comparative SCEDs contrast two or more instructional interventions to document their relative effectiveness and efficiency. As such, these designs have great potential to inform evidence-based decision-making. To harness this potential, however, interventionists and authors of systematic reviews need tools to appraise the evidence generated by these designs. Our literature review revealed that existing tools do not adequately address the specific methodological considerations of comparative SCEDs that aim to compare instructional interventions of non-reversible target behaviors. The purpose of this paper is to introduce the Comparative Single-Case Experimental Design Rating System (CSCEDARS, "cedars") as a tool for appraising the internal validity of comparative SCEDs of two or more non-reversible instructional interventions. Pertinent literature will be reviewed to establish the need for this tool and to underpin the rationales for individual rating items. Initial reliability information will be provided as well. Finally, directions for instrument validation will be proposed.
Calder, Stefan; O'Grady, Greg; Cheng, Leo K; Du, Peng
2018-04-27
Electrogastrography (EGG) is a non-invasive method for measuring gastric electrical activity. Recent simulation studies have attempted to extend the current clinical utility of the EGG, in particular by providing a theoretical framework for distinguishing specific gastric slow wave dysrhythmias. In this paper we implement an experimental setup called a 'torso-tank' with the aim of expanding and experimentally validating these previous simulations. The torso-tank was developed using an adult male torso phantom with 190 electrodes embedded throughout the torso. The gastric slow waves were reproduced using an artificial current source capable of producing 3D electrical fields. Multiple gastric dysrhythmias were reproduced based on high-resolution mapping data from cases of human gastric dysfunction (gastric re-entry, conduction blocks and ectopic pacemakers) in addition to normal test data. Each case was recorded and compared to the previously-presented simulated results. Qualitative and quantitative analyses were performed to define the accuracy showing [Formula: see text] 1.8% difference, [Formula: see text] 0.99 correlation, and [Formula: see text] 0.04 normalised RMS error between experimental and simulated findings. These results reaffirm previous findings and these methods in unison therefore present a promising morphological-based methodology for advancing the understanding and clinical applications of EGG.
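The abstract's ~0.04 normalised RMS error between experimental and simulated recordings can be computed as below. This is an illustrative sketch; the exact normalisation the authors used (signal range, mean, or RMS of the reference) is not stated in the abstract, so range normalisation here is my own assumption.

```python
import math

def normalised_rms_error(measured, simulated) -> float:
    """RMS difference between two equal-length signals, normalised by
    the measured signal's range. One common NRMSE definition;
    definitions vary across papers."""
    if len(measured) != len(simulated) or not measured:
        raise ValueError("signals must be the same nonzero length")
    mse = sum((m - s) ** 2 for m, s in zip(measured, simulated)) / len(measured)
    span = max(measured) - min(measured)
    return math.sqrt(mse) / span
```

A constant offset of 10% of the signal range, for example, yields an NRMSE of 0.1, so values near 0.04 indicate the simulated torso-surface potentials track the recorded ones closely.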
Supersonic Retro-Propulsion Experimental Design for Computational Fluid Dynamics Model Validation
NASA Technical Reports Server (NTRS)
Berry, Scott A.; Laws, Christopher T.; Kleb, W. L.; Rhode, Matthew N.; Spells, Courtney; McCrea, Andrew C.; Truble, Kerry A.; Schauerhamer, Daniel G.; Oberkampf, William L.
2011-01-01
The development of supersonic retro-propulsion, an enabling technology for heavy payload exploration missions to Mars, is the primary focus for the present paper. A new experimental model, intended to provide computational fluid dynamics model validation data, was recently designed for the Langley Research Center Unitary Plan Wind Tunnel Test Section 2. Pre-test computations were instrumental for sizing and refining the model, over the Mach number range of 2.4 to 4.6, such that tunnel blockage and internal flow separation issues would be minimized. A 5-in diameter 70-deg sphere-cone forebody, which accommodates up to four 4:1 area ratio nozzles, followed by a 10-in long cylindrical aftbody was developed for this study based on the computational results. The model was designed to allow for a large number of surface pressure measurements on the forebody and aftbody. Supplemental data included high-speed Schlieren video and internal pressures and temperatures. The run matrix was developed to allow for the quantification of various sources of experimental uncertainty, such as random errors due to run-to-run variations and bias errors due to flow field or model misalignments. Some preliminary results and observations from the test are presented, although detailed analyses of the data and uncertainties are still ongoing.
Determination of the core temperature of a Li-ion cell during thermal runaway
NASA Astrophysics Data System (ADS)
Parhizi, M.; Ahmed, M. B.; Jain, A.
2017-12-01
Safety and performance of Li-ion cells is severely affected by thermal runaway where exothermic processes within the cell cause uncontrolled temperature rise, eventually leading to catastrophic failure. Most past experimental papers on thermal runaway only report surface temperature measurement, while the core temperature of the cell remains largely unknown. This paper presents an experimentally validated method based on thermal conduction analysis to determine the core temperature of a Li-ion cell during thermal runaway using surface temperature and chemical kinetics data. Experiments conducted on a thermal test cell show that core temperature computed using this method is in good agreement with independent thermocouple-based measurements in a wide range of experimental conditions. The validated method is used to predict core temperature as a function of time for several previously reported thermal runaway tests. In each case, the predicted peak core temperature is found to be several hundreds of degrees Celsius higher than the measured surface temperature. This shows that surface temperature alone is not sufficient for thermally characterizing the cell during thermal runaway. Besides providing key insights into the fundamental nature of thermal runaway, the ability to determine the core temperature shown here may lead to practical tools for characterizing and mitigating thermal runaway.
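The gap between surface and core temperature that the paper quantifies can be illustrated with the steady-state 1D radial conduction solution for a cylinder with uniform volumetric heat generation. This textbook relation is only a rough stand-in for the paper's transient, kinetics-coupled method, and the parameter values in the usage note are hypothetical.

```python
def core_temperature_estimate(surface_temp_c: float, heat_gen_w_m3: float,
                              radius_m: float, k_w_mk: float) -> float:
    """Steady 1D radial conduction in a solid cylinder with uniform
    volumetric heat generation q''' gives a centerline excess of
    q''' * R^2 / (4 k) above the surface temperature."""
    return surface_temp_c + heat_gen_w_m3 * radius_m ** 2 / (4.0 * k_w_mk)
```

Because the radial thermal conductivity of a wound cell is low (often well below 1 W/m-K), even modest sustained heat generation during runaway supports a core-to-surface difference of hundreds of degrees, consistent with the paper's finding that surface temperature alone understates the cell's thermal state.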
Capelli, Claudio; Biglino, Giovanni; Petrini, Lorenza; Migliavacca, Francesco; Cosentino, Daria; Bonhoeffer, Philipp; Taylor, Andrew M; Schievano, Silvia
2012-12-01
Finite element (FE) modelling can be a very resourceful tool in the field of cardiovascular devices. To ensure result reliability, FE models must be validated experimentally against physical data. Their clinical application (e.g., patients' suitability, morphological evaluation) also requires fast simulation process and access to results, while engineering applications need highly accurate results. This study shows how FE models with different mesh discretisations can suit clinical and engineering requirements for studying a novel device designed for percutaneous valve implantation. Following sensitivity analysis and experimental characterisation of the materials, the stent-graft was first studied in a simplified geometry (i.e., compliant cylinder) and validated against in vitro data, and then in a patient-specific implantation site (i.e., distensible right ventricular outflow tract). Different meshing strategies using solid, beam and shell elements were tested. Results showed excellent agreement between computational and experimental data in the simplified implantation site. Beam elements were found to be convenient for clinical applications, providing reliable results in less than one hour in a patient-specific anatomical model. Solid elements remain the FE choice for engineering applications, albeit more computationally expensive (>100 times). This work also showed how information on device mechanical behaviour differs when acquired in a simplified model as opposed to a patient-specific model.
NASA Astrophysics Data System (ADS)
Torbahn, Lutz; Weuster, Alexander; Handl, Lisa; Schmidt, Volker; Kwade, Arno; Wolf, Dietrich E.
2017-06-01
The interdependency of the structure and mechanical features of a cohesive powder packing is a current scientific focus and far from being well understood. Although the Discrete Element Method provides a well applicable and widely used tool to model powder behavior, the non-trivial contact mechanics of micron-sized particles demand a sophisticated contact model. Here, a direct comparison between experiment and simulation at the particle level offers a proper approach for model validation. However, simulating a full-scale shear-tester experiment with micron-sized particles, and hence validating such a simulation, remains a challenge. We address this task by scaling down the experimental setup: a fully functional micro shear-tester was developed and implemented in an X-ray tomography device in order to visualize the sample at the bulk and particle levels within small bulk volumes on the order of a few microliters under well-defined consolidation. Using spherical micron-sized particles (30 μm), shear tests with a particle number accessible to simulations can be performed. Moreover, particle-level analysis allows for a direct comparison of experimental and numerical results, e.g., regarding structural evolution. In this talk, we focus on density inhomogeneity and shear-induced heterogeneity during compaction and shear deformation.
Marvel Analysis of the Measured High-resolution Rovibronic Spectra of TiO
NASA Astrophysics Data System (ADS)
McKemmish, Laura K.; Masseron, Thomas; Sheppard, Samuel; Sandeman, Elizabeth; Schofield, Zak; Furtenbacher, Tibor; Császár, Attila G.; Tennyson, Jonathan; Sousa-Silva, Clara
2017-02-01
Accurate, experimental rovibronic energy levels, with associated labels and uncertainties, are reported for 11 low-lying electronic states of the diatomic ^48Ti^16O molecule, determined using the Marvel (Measured Active Rotational-Vibrational Energy Levels) algorithm. All levels are based on lines corresponding to critically reviewed and validated high-resolution experimental spectra taken from 24 literature sources. The transition data are in the 2-22,160 cm^-1 region. Out of the 49,679 measured transitions, 43,885 are triplet-triplet, 5710 are singlet-singlet, and 84 are triplet-singlet transitions. A careful analysis of the resulting experimental spectroscopic network (SN) allows 48,590 transitions to be validated. The transitions determine 93 vibrational band origins of ^48Ti^16O, including 71 triplet and 22 singlet ones. There are 276 (73) triplet-triplet (singlet-singlet) band-heads derived from Marvel experimental energies, 123 (38) of which have never been assigned in low- or high-resolution experiments. The highest J value, where J stands for the total angular momentum, for which an energy level is validated is 163. The number of experimentally derived triplet and singlet ^48Ti^16O rovibrational energy levels is 8682 and 1882, respectively. The lists of validated lines and levels for ^48Ti^16O are deposited in the supporting information to this paper.
Houdek, Petr
2017-01-01
The aim of this perspective article is to show that current experimental evidence on factors influencing dishonesty has limited external validity. Most experimental studies are built on random assignment, in which control/experimental groups of subjects face varied sizes of the expected reward for behaving dishonestly, opportunities for cheating, means of rationalizing dishonest behavior, etc., and mean group reactions are observed. The studies have internal validity in assessing the causal influence of these and other factors, but they lack external validity in organizational, market, and other environments. If people can opt into or out of diverse real-world environments, an experiment aimed at studying the factors influencing the real-life degree of dishonesty should allow for such an option. The behavior of such self-selected groups of marginal subjects would probably contain a larger level of (non)deception than the behavior of average people. The article warns that there are not many studies that enable self-selection or sorting of participants into varying environments, which limits current knowledge of the extent and dynamics of dishonest and fraudulent behavior. The article focuses on suggestions for how to improve dishonesty research, especially how to avoid experimenter demand bias.
Experimental investigations of turbulent temperature fluctuations and phase angles in ASDEX Upgrade
NASA Astrophysics Data System (ADS)
Freethy, Simon
2017-10-01
A complete experimental understanding of the turbulent fluctuations in tokamak plasmas is essential for providing confidence in the extrapolation of heat transport models to future experimental devices and reactors. Guided by ``predict first'' nonlinear gyrokinetic simulations with the GENE code, two new turbulence diagnostics were designed and have been installed on ASDEX Upgrade (AUG) to probe the fundamentals of ion-scale turbulent electron heat transport. The first, a 30-channel correlation ECE (CECE) radiometer, measures radial profiles (0.5
New Turbulent Multiphase Flow Facilities for Simulation Benchmarking
NASA Astrophysics Data System (ADS)
Teoh, Chee Hau; Salibindla, Ashwanth; Masuk, Ashik Ullah Mohammad; Ni, Rui
2017-11-01
The Fluid Transport Lab at Penn State has devoted the last few years to developing new experimental facilities to unveil the underlying physics of the coupling between solid-gas and gas-liquid multiphase flow in a turbulent environment. In this poster, I will introduce one bubbly flow facility and one dusty flow facility for validating and verifying simulation results. Financial support for this project was provided by the National Science Foundation under Grant Numbers 1653389 and 1705246.
Multiscale and Multiphysics Modeling of Additive Manufacturing of Advanced Materials
NASA Technical Reports Server (NTRS)
Liou, Frank; Newkirk, Joseph; Fan, Zhiqiang; Sparks, Todd; Chen, Xueyang; Fletcher, Kenneth; Zhang, Jingwei; Zhang, Yunlu; Kumar, Kannan Suresh; Karnati, Sreekar
2015-01-01
The objective of this proposed project is to research and develop a prediction tool for advanced additive manufacturing (AAM) processes for advanced materials and to develop experimental methods that provide fundamental properties and establish validation data. Aircraft structures and engines demand materials that are stronger, usable at much higher temperatures, provide less acoustic transmission, and enable more aeroelastic tailoring than those currently used. Significant improvements in properties can only be achieved by processing the materials under nonequilibrium conditions, such as AAM processes. AAM processes encompass a class of processes that use a focused heat source to create a melt pool on a substrate. Examples include Electron Beam Freeform Fabrication and Direct Metal Deposition. These types of additive processes enable fabrication of parts directly from CAD drawings. To achieve the desired material properties and geometries of the final structure, it is necessary to assess the impact of process parameters and to predict optimized conditions using numerical modeling as an effective prediction tool. The processing targets are multiple and span different spatial scales, and the associated physical phenomena are inherently multiphysics and multiscale. In this project, the research work modeled AAM processes using a multiscale, multiphysics approach. A macroscale model was developed to investigate the residual stresses and distortion in AAM processes. A sequentially coupled, thermomechanical, finite element model was developed and validated experimentally. The results showed the temperature distribution, residual stress, and deformation within the formed deposits and substrates. A mesoscale model was developed to include heat transfer, phase change with a mushy zone, incompressible free surface flow, solute redistribution, and surface tension. Because of the excessive computing time needed, a parallel computing approach was also tested.
In addition, after investigating various methods, a Smoothed Particle Hydrodynamics (SPH) model was developed to model the wire-feeding process. Its computational efficiency and simple architecture make it more robust and flexible than other models, although more research on material properties may be needed to model the AAM processes realistically. A microscale model was developed to investigate heterogeneous nucleation, dendritic grain growth, epitaxial growth of columnar grains, the columnar-to-equiaxed transition, grain transport in the melt, and other phenomena. The orientations of the columnar grains were almost perpendicular to the direction of laser motion. Compared with similar studies in the literature, the multi-grain morphology modeling results are of the same order of magnitude as the optical morphologies observed experimentally. Experimental work was conducted to validate the different models. An infrared camera was incorporated as a process monitoring and validation tool to identify the solidus and mushy zones during deposition, and the images were successfully processed to identify these regions. This research project has investigated the multiscale, multiphysics nature of complex AAM processes, leading to an advanced understanding of these processes. The project has also developed several modeling tools and experimental validation tools that will be critical to future AAM process qualification and certification.
NASA Astrophysics Data System (ADS)
Zhang, Yaning; Xu, Fei; Li, Bingxi; Kim, Yong-Song; Zhao, Wenke; Xie, Gongnan; Fu, Zhongbin
2018-04-01
This study aims to validate the three-phase heat and mass transfer model developed in the first part (Three phase heat and mass transfer model for unsaturated soil freezing process: Part 1 - model development). Experimental results from earlier studies were used for the validation. The correlation coefficients between the simulated and experimental water contents at different soil depths were between 0.83 and 0.92, and those between the simulated and experimental liquid water contents at different soil temperatures were between 0.95 and 0.99. Given these high correlations, the developed model can be used with confidence to predict water contents at different soil depths and temperatures.
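The agreement metric reported above is a standard Pearson correlation between simulated and measured profiles. The sketch below shows that calculation; the water-content values are hypothetical illustrations, not data from the study.

```python
import math

def pearson_r(simulated, measured):
    """Pearson correlation coefficient between model output and data."""
    n = len(simulated)
    mean_s = sum(simulated) / n
    mean_m = sum(measured) / n
    cov = sum((s - mean_s) * (m - mean_m)
              for s, m in zip(simulated, measured))
    var_s = sum((s - mean_s) ** 2 for s in simulated)
    var_m = sum((m - mean_m) ** 2 for m in measured)
    return cov / math.sqrt(var_s * var_m)

# Hypothetical volumetric water contents (%) at five soil depths
simulated = [28.1, 26.4, 24.9, 23.0, 21.7]
measured = [27.5, 26.9, 24.1, 23.6, 21.0]
r = pearson_r(simulated, measured)
```

A coefficient near 1 (as in the 0.83-0.99 range reported) indicates that the model tracks the measured depth and temperature profiles closely.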
Experimental design methodologies in the optimization of chiral CE or CEC separations: an overview.
Dejaegher, Bieke; Mangelings, Debby; Vander Heyden, Yvan
2013-01-01
In this chapter, an overview of experimental designs to develop chiral capillary electrophoresis (CE) and capillary electrochromatographic (CEC) methods is presented. Method development is generally divided into technique selection, method optimization, and method validation. In the method optimization part, often two phases can be distinguished, i.e., a screening and an optimization phase. In method validation, the method is evaluated on its fitness for purpose. One validation item that also applies experimental designs is robustness testing. In the screening phase and in robustness testing, screening designs are applied. During the optimization phase, response surface designs are used. The different design types and their application steps are discussed in this chapter and illustrated by examples of chiral CE and CEC methods.
Patient-reported outcomes in borderline personality disorder
Hasler, Gregor; Hopwood, Christopher J.; Jacob, Gitta A.; Brändle, Laura S.; Schulte-Vels, Thomas
2014-01-01
Patient-reported outcome (PRO) refers to measures that emphasize the subjective view of patients about their health-related conditions and behaviors. Typically, PROs include self-report questionnaires and clinical interviews. Defining PROs for borderline personality disorder (BPD) is particularly challenging given the disorder's high symptomatic heterogeneity, high comorbidity with other psychiatric conditions, highly fluctuating symptoms, weak correlations between symptoms and functional outcomes, and lack of valid and reliable experimental measures to complement self-report data. Here, we provide an overview of currently used BPD outcome measures and discuss them from clinical, psychometric, experimental, and patient perspectives. In addition, we review the most promising leads to improve BPD PROs, including the DSM-5 Section III, the Recovery Approach, Ecological Momentary Assessments, and novel experimental measures of social functioning that are associated with functional and social outcomes. PMID:25152662
Effective wavefront aberration measurement of spectacle lenses in as-worn status
NASA Astrophysics Data System (ADS)
Jia, Zhigang; Xu, Kai; Fang, Fengzhou
2018-04-01
An effective wavefront aberration analysis method for measuring spectacle lenses in as-worn status was proposed and verified using an experimental apparatus based on an eye rotation model. Two strategies were employed to improve the accuracy of measuring the effective wavefront aberrations on the corneal sphere. The influences of three as-worn parameters (vertex distance, pantoscopic angle, and face form angle), together with the eye rotation and the corresponding incident beams, were obtained objectively and quantitatively. Experimental measurements of spherical single-vision and freeform progressive addition lenses demonstrate the accuracy and validity of the proposed method and apparatus. These provide a potential means of achieving supernormal vision correction, with customization and personalization, in optimizing the as-worn design of spectacle lenses and in evaluating their manufacturing and imaging qualities.
Pilot Wave Model for Impulsive Thrust from RF Test Device Measured in Vacuum
NASA Technical Reports Server (NTRS)
White, Harold; Lawrence, James; Sylvester, Andre; Vera, Jerry; Chap, Andrew; George, Jeff
2017-01-01
A physics model is developed in detail and its place in the taxonomy of ideas about the nature of the quantum vacuum is discussed. The experimental results from the recently completed vacuum test campaign evaluating the impulsive thrust performance of a tapered RF test article excited in the TM212 mode at 1,937 megahertz (MHz) are summarized. The empirical data from this campaign are compared to the predictions of the physics model tools. A discussion is provided to further elaborate on the possible implications of the proposed model if it is physically valid. Based on the correlation of the analysis predictions with the experimental data collected, it is proposed that the observed anomalous thrust forces are real, not due to experimental error, and are due to a new type of interaction with quantum vacuum fluctuations.
Experimental Characterization of the Jet Wiping Process
NASA Astrophysics Data System (ADS)
Mendez, Miguel Alfonso; Enache, Adriana; Gosset, Anne; Buchlin, Jean-Marie
2018-06-01
This paper presents an experimental characterization of the jet wiping process, used in continuous coating applications to control the thickness of a liquid coat using an impinging gas jet. Time Resolved Particle Image Velocimetry (TR-PIV) is used to characterize the impinging gas flow, while an automatic interface detection algorithm is developed to track the liquid interface at the impact point. The study of the flow interaction is combined with time-resolved 3D thickness measurements of the liquid film remaining after wiping, via Time Resolved Light Absorption (TR-LAbs). The simultaneous frequency analysis of the liquid and gas flows makes it possible to correlate their respective instabilities, provides an experimental data set for the validation of numerical studies, and allows a working hypothesis to be formulated on the origin of the coat non-uniformity encountered in many jet wiping processes.
Olondo, C; Legarda, F; Herranz, M; Idoeta, R
2017-04-01
This paper shows the procedure performed to validate the migration equation and the migration parameter values presented in a previous paper (Legarda et al., 2011) regarding the migration of 137Cs in Spanish mainland soils. The model validation was carried out by checking experimentally obtained activity concentration values against those predicted by the model. The experimental data come from the measured vertical activity profiles of 8 new sampling points located in northern Spain. Before testing the predicted values of the model, their uncertainty was assessed with an appropriate uncertainty analysis. Once the uncertainty of the model was established, the experimental and model-predicted activity concentration values were compared. Model validation was performed by analyzing the model's accuracy, both as a whole and at different depth intervals. As a result, this model has been validated as a tool to predict 137Cs behaviour in a Mediterranean environment. Copyright © 2017 Elsevier Ltd. All rights reserved.
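Comparing model-predicted and measured activity concentrations, each with its own uncertainty, is commonly done with a normalized deviation (zeta score). The sketch below illustrates that generic consistency check with hypothetical numbers; it is not the paper's specific uncertainty analysis.

```python
def zeta_score(predicted, measured, u_pred, u_meas):
    """Normalized deviation between a model prediction and a measurement,
    given their standard uncertainties; |zeta| <= 2 is a common
    consistency criterion."""
    combined_u = (u_pred ** 2 + u_meas ** 2) ** 0.5
    return (predicted - measured) / combined_u

# Hypothetical 137Cs activity concentrations (Bq/kg) at one depth interval
z = zeta_score(predicted=118.0, measured=104.0, u_pred=9.0, u_meas=12.0)
consistent = abs(z) <= 2.0
```

Applying such a check interval by interval gives a depth-resolved picture of model accuracy, as well as an overall verdict when the scores are pooled.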
Zuckerman, Amitai; Ram, Omri; Ifergane, Gal; Matar, Michael A; Sagi, Ram; Ostfeld, Ishay; Hoffman, Jay R; Kaplan, Zeev; Sadot, Oren; Cohen, Hagit
2017-01-01
The intense focus in the clinical literature on the mental and neurocognitive sequelae of explosive blast-wave exposure, especially when comorbid with post-traumatic stress-related disorders (PTSD), is justified and warrants the design of translationally valid animal studies to provide complementary basic data. We employed a controlled experimental blast-wave paradigm in which unanesthetized animals were exposed to visual, auditory, olfactory, and tactile effects of an explosive blast-wave produced by exploding a thin copper wire. By combining cognitive-behavioral paradigms and ex vivo brain MRI to assess mild traumatic brain injury (mTBI) phenotype with a validated behavioral model for PTSD, complemented by morphological assessments, this study sought to examine our ability to evaluate the biobehavioral effects of low-intensity blast overpressure on rats in a translationally valid manner. There were no significant differences between blast- and sham-exposed rats on motor coordination and strength, or sensory function. Whereas most male rats exposed to the blast-wave displayed normal behavioral and cognitive responses, 23.6% of the rats displayed a significant retardation of spatial learning acquisition, fulfilling criteria for mTBI-like responses. In addition, 5.4% of the blast-exposed animals displayed an extreme response in the behavioral tasks used to define PTSD-like criteria, whereas 10.9% of the rats developed both long-lasting and progressively worsening behavioral and cognitive "symptoms," suggesting comorbid PTSD-mTBI-like behavioral and cognitive response patterns. Neither group displayed changes on MRI. Exposure to the experimental blast-wave elicited distinct behavioral and morphological responses modelling mTBI-like, PTSD-like, and comorbid mTBI-PTSD-like responses. This experimental animal model can be a useful tool for elucidating neurobiological mechanisms underlying the effects of blast-wave-induced mTBI and PTSD and comorbid mTBI-PTSD.
Experimental Errors in QSAR Modeling Sets: What We Can Do and What We Cannot Do.
Zhao, Linlin; Wang, Wenyi; Sedykh, Alexander; Zhu, Hao
2017-06-30
Numerous chemical data sets have become available for quantitative structure-activity relationship (QSAR) modeling studies. However, the quality of different data sources may differ depending on the nature of the experimental protocols. Therefore, potential experimental errors in the modeling sets may lead to the development of poor QSAR models and further affect the predictions of new compounds. In this study, we explored the relationship between the ratio of questionable data in the modeling sets, which was obtained by simulating experimental errors, and the QSAR modeling performance. To this end, we used eight data sets (four continuous endpoints and four categorical endpoints) that have been extensively curated both in-house and by our collaborators to create over 1800 QSAR models. Each data set was duplicated to create several new modeling sets with different ratios of simulated experimental errors (i.e., randomizing the activities of part of the compounds) in the modeling process. A fivefold cross-validation process was used to evaluate the modeling performance, which deteriorated as the ratio of experimental errors increased. All of the resulting models were also used to predict external sets of new compounds, which were excluded at the beginning of the modeling process. The modeling results showed that the compounds with relatively large prediction errors in cross-validation processes are likely to be those with simulated experimental errors. However, after removing a certain number of compounds with large prediction errors in the cross-validation process, the external predictions of new compounds did not show improvement. Our conclusion is that the QSAR predictions, especially consensus predictions, can identify compounds with potential experimental errors. But removing those compounds by the cross-validation procedure is not a reasonable means to improve model predictivity due to overfitting.
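The error-simulation step described above can be sketched as follows. The function randomizes the activity values of a chosen fraction of compounds, mirroring the protocol in spirit; the exact randomization scheme and the data are assumptions for illustration, not the study's own.

```python
import random

def inject_experimental_errors(activities, error_ratio, rng=None):
    """Replace a given fraction of activity values with random draws from
    the observed activity range, simulating experimental error."""
    rng = rng or random.Random(0)          # fixed seed for reproducibility
    corrupted = list(activities)
    n_errors = int(round(error_ratio * len(activities)))
    lo, hi = min(activities), max(activities)
    for i in rng.sample(range(len(activities)), n_errors):
        corrupted[i] = rng.uniform(lo, hi)
    return corrupted

# Duplicate a (hypothetical) modeling set with 20% simulated errors
clean = [float(i) for i in range(100)]
noisy = inject_experimental_errors(clean, 0.20)
```

Fitting the same QSAR workflow to `clean` and to `noisy` copies at increasing `error_ratio` values is what lets the cross-validated performance be plotted against the fraction of questionable data.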
Implementation of an experimental fault-tolerant memory system
NASA Technical Reports Server (NTRS)
Carter, W. C.; Mccarthy, C. E.
1976-01-01
The experimental fault-tolerant memory system described in this paper has been designed to enable the modular addition of spares, to validate the theoretical fault-secure and self-testing properties of the translator/corrector, to provide a basis for experiments using the new testing and correction processes for recovery, and to determine the practicality of such systems. The hardware design and implementation are described, together with methods of fault insertion. The hardware/software interface, including a restricted single error correction/double error detection (SEC/DED) code, is specified. Procedures are described in detail that (1) test for specified physical faults, (2) ensure that single-error corrections are not miscorrections due to triple faults, and (3) enable recovery from double errors.
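The SEC/DED behavior described can be illustrated with the classic Hamming(7,4)-plus-overall-parity construction: single-bit errors are corrected via the syndrome, and double-bit errors are detected by the disagreement between syndrome and overall parity. This is a generic textbook sketch, not the restricted code used in the memory system itself.

```python
def hamming_encode(nibble):
    """Encode 4 data bits into an 8-bit SEC/DED codeword:
    Hamming(7,4) plus an overall parity bit."""
    d1, d2, d3, d4 = nibble
    p1 = d1 ^ d2 ^ d4                     # checks positions 3, 5, 7
    p2 = d1 ^ d3 ^ d4                     # checks positions 3, 6, 7
    p3 = d2 ^ d3 ^ d4                     # checks positions 5, 6, 7
    word = [p1, p2, d1, p3, d2, d3, d4]   # codeword positions 1..7
    overall = 0
    for b in word:                        # overall parity covers all bits
        overall ^= b
    return word + [overall]

def hamming_decode(word):
    """Return (data_bits, status): 'ok', 'corrected' (single error fixed),
    or 'double_error' (detected but uncorrectable)."""
    syndrome = 0
    for pos in range(1, 8):               # XOR of 1-based positions set
        if word[pos - 1]:
            syndrome ^= pos
    overall = 0
    for b in word:
        overall ^= b
    if syndrome == 0 and overall == 0:
        status = "ok"
    elif overall == 1:                    # odd overall parity: single error
        word = list(word)
        if syndrome:
            word[syndrome - 1] ^= 1       # flip the erroneous bit back
        else:
            word[7] ^= 1                  # error was in the parity bit itself
        status = "corrected"
    else:                                 # even parity but nonzero syndrome
        return None, "double_error"
    return [word[2], word[4], word[5], word[6]], status
```

Deliberately flipping one or two bits of an encoded word exercises exactly the correction and detection paths that the paper's fault-insertion experiments probe.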
Experimental Demonstration of X-Ray Drive Enhancement with Rugby-Shaped Hohlraums
NASA Astrophysics Data System (ADS)
Philippe, F.; Casner, A.; Caillaud, T.; Landoas, O.; Monteil, M. C.; Liberatore, S.; Park, H. S.; Amendt, P.; Robey, H.; Sorce, C.; Li, C. K.; Seguin, F.; Rosenberg, M.; Petrasso, R.; Glebov, V.; Stoeckl, C.
2010-01-01
Rugby-shaped hohlraums have been suggested as a way to enhance x-ray drive in the indirect drive approach to inertial confinement fusion. This Letter presents an experimental comparison of rugby-shaped and cylinder hohlraums used for D2- and D3He-filled capsule implosions on the Omega laser facility, demonstrating an increase in x-ray flux of 18% in rugby-shaped hohlraums. The highest yields to date for deuterium gas implosions in indirect drive on Omega (1.5×10^10 neutrons) were obtained, allowing for the first time the measurement of a DD burn history. Proton spectra measurements provide additional validation of the higher drive in rugby-shaped hohlraums.
NASA Astrophysics Data System (ADS)
Nir, A.; Doughty, C.; Tsang, C. F.
Validation methods that were developed in the context of the deterministic concepts of past generations often cannot be directly applied to environmental problems, which may be characterized by limited reproducibility of results and highly complex models. Instead, validation is interpreted here as a series of activities, including both theoretical and experimental tests, designed to enhance our confidence in the capability of a proposed model to describe some aspect of reality. We examine the validation process applied to a project concerned with heat and fluid transport in porous media, in which mathematical modeling, simulation, and results of field experiments are evaluated in order to determine the feasibility of a system for seasonal thermal energy storage in shallow unsaturated soils. Technical details of the field experiments are not included, but appear in previous publications. Validation activities are divided into three stages. The first stage, carried out prior to the field experiments, is concerned with modeling the relevant physical processes, optimization of the heat-exchanger configuration and the shape of the storage volume, and multi-year simulation. Subjects requiring further theoretical and experimental study are identified at this stage. The second stage encompasses the planning and evaluation of the initial field experiment. Simulations are made to determine the experimental time scale and optimal sensor locations. Soil thermal parameters and temperature boundary conditions are estimated using an inverse method. Then results of the experiment are compared with model predictions using different parameter values and modeling approximations. In the third stage, results of an experiment performed under different boundary conditions are compared to predictions made by the models developed in the second stage. Various aspects of this theoretical and experimental field study are described as examples of the verification and validation procedure.
There is no attempt to validate a specific model; rather, several models of increasing complexity are compared with experimental results. The outcome is interpreted as a demonstration of the paradigm proposed by van der Heijde [26], that different constituencies have different objectives for the validation process and that their acceptance criteria therefore differ as well.
NASA Astrophysics Data System (ADS)
Stigliano, Robert Vincent
The use of magnetic nanoparticles (mNPs) to induce local hyperthermia has been emerging in recent years as a promising cancer therapy, in both stand-alone and combination treatment settings, including surgery, radiation, and chemotherapy. The mNP solution can be injected either directly into the tumor or administered intravenously. Studies have shown that some cancer cells associate with, internalize, and aggregate mNPs more preferentially than normal cells, with and without antibody targeting. Once the mNPs are delivered inside the cells, a low-frequency (30-300 kHz) alternating electromagnetic field is used to activate them. The nanoparticles absorb the applied field and provide localized heat generation at nanometer-to-micron scales. Treatment planning models have been shown to improve treatment efficacy in radiation therapy by limiting normal tissue damage while maximizing dose to the tumor. To date, there does not exist a clinical treatment planning model for magnetic nanoparticle hyperthermia that is robust, validated, and commercially available. The focus of this research is on the development and experimental validation of a treatment planning model, consisting of a coupled electromagnetic and thermal model that predicts dynamic thermal distributions during treatment. When allowed to incubate, the mNPs are often sequestered by cancer cells and packed into endosomes. The proximity of the mNPs has a strong influence on their ability to heat due to interparticle magnetic interaction effects. A model of mNP heating that takes into account the effects of magnetic interaction was developed and validated against experimental data. An animal study in mice was conducted to determine the effects of mNP solution injection duration and PEGylation on macroscale mNP distribution within the tumor, in order to further inform the treatment planning model and future experimental technique.
In clinical applications, a critical limiting factor for the maximum applied field is the heating caused by eddy currents, which are induced in the noncancerous tissue. Phantom studies were conducted to validate the ability of the model to accurately predict eddy current heating in the case of zero blood perfusion, and preliminary data were collected to show the validity of the model, incorporating blood perfusion, in live mice.
Pivetta, Tiziana; Isaia, Francesco; Trudu, Federica; Pani, Alessandra; Manca, Matteo; Perra, Daniela; Amato, Filippo; Havel, Josef
2013-10-15
The combination of two or more drugs using multidrug mixtures is a trend in the treatment of cancer. The goal is to search for a synergistic effect and thereby reduce the required dose and inhibit the development of resistance. An advanced model-free approach for data exploration and analysis, based on artificial neural networks (ANN) and experimental design is proposed to predict and quantify the synergism of drugs. The proposed method non-linearly correlates the concentrations of drugs with the cytotoxicity of the mixture, providing the possibility of choosing the optimal drug combination that gives the maximum synergism. The use of ANN allows for the prediction of the cytotoxicity of each combination of drugs in the chosen concentration interval. The method was validated by preparing and experimentally testing the combinations with the predicted highest synergistic effect. In all cases, the data predicted by the network were experimentally confirmed. The method was applied to several binary mixtures of cisplatin and [Cu(1,10-orthophenanthroline)2(H2O)](ClO4)2, Cu(1,10-orthophenanthroline)(H2O)2(ClO4)2 or [Cu(1,10-orthophenanthroline)2(imidazolidine-2-thione)](ClO4)2. The cytotoxicity of the two drugs, alone and in combination, was determined against human acute T-lymphoblastic leukemia cells (CCRF-CEM). For all systems, a synergistic effect was found for selected combinations. © 2013 Elsevier B.V. All rights reserved.
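The paper quantifies synergy via an ANN fit over the full concentration space. As a simpler, standard point of comparison (a swapped-in metric, not the authors' method), Bliss independence gives the expected additive effect of two independent drugs, and any observed excess over it indicates synergy. The effect values below are hypothetical.

```python
def bliss_expected(effect_a, effect_b):
    """Expected combined fractional effect of two independent drugs
    (Bliss independence); effects are fractions in [0, 1]."""
    return effect_a + effect_b - effect_a * effect_b

def bliss_excess(observed_combo, effect_a, effect_b):
    """Observed effect minus the Bliss expectation: > 0 suggests synergy,
    < 0 antagonism."""
    return observed_combo - bliss_expected(effect_a, effect_b)

# Hypothetical single-agent effects of 0.30 and 0.40 with an observed
# combination effect of 0.70: the excess over 0.58 indicates synergy
excess = bliss_excess(0.70, 0.30, 0.40)
```

The advantage of the ANN approach described above is that it interpolates such excess-over-additivity surfaces across all tested concentration pairs, so the most synergistic combination can be located and then confirmed experimentally.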
NASA Astrophysics Data System (ADS)
Hernandez, K. F.; Shah-Fairbank, S.
2016-12-01
The San Dimas Experimental Forest has been designated as a research area by the United States Forest Service for use as a hydrologic testing facility since 1933 to investigate the watershed hydrology of its 27 square miles of land. Incorporating a computer model lends validity to testing of the physical model. This study focuses on the San Dimas Experimental Forest's Bell Canyon, one of the triad of watersheds contained within the Big Dalton watershed. A scaled physical model of Bell Canyon was constructed to highlight watershed characteristics and their effects on runoff. The physical model offers a comprehensive visualization of a natural watershed and can vary rainfall intensity, slope, and roughness through interchangeable parts and adjustments to the system. The scaled physical model is validated and calibrated against a HEC-HMS model to ensure similitude of the system. Preliminary results of the physical model suggest that a 50-year storm event can be represented by a peak discharge of 2.2 × 10^-3 cfs. When comparing the results to HEC-HMS, this equates to a flow relationship of approximately 1:160,000, which can be used to model other return periods. The completed Bell Canyon physical model can be used for educational instruction in the classroom, outreach in the community, and further research as an accurate representation of the watershed in the San Dimas Experimental Forest.
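The reported 1:160,000 flow relationship turns any measured model discharge into an equivalent prototype discharge by simple scaling. The sketch below shows that arithmetic; the function name is illustrative.

```python
MODEL_TO_PROTOTYPE = 160_000  # discharge scale ratio from the HEC-HMS comparison

def prototype_discharge_cfs(model_q_cfs, ratio=MODEL_TO_PROTOTYPE):
    """Convert a measured model discharge (cfs) to the equivalent
    prototype watershed discharge using the reported flow relationship."""
    return model_q_cfs * ratio

# The 50-year model peak of 2.2e-3 cfs scales to ~352 cfs at prototype scale
q50 = prototype_discharge_cfs(2.2e-3)
```

The same one-line conversion lets measured model peaks for other return periods be read off in prototype units.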
Towards oscillations-based simulation of social systems: a neurodynamic approach
NASA Astrophysics Data System (ADS)
Plikynas, Darius; Basinskas, Gytis; Laukaitis, Algirdas
2015-04-01
This multidisciplinary work presents a synopsis of theories in the search for common field-like fundamental principles of self-organisation and communication existing at the quantum, cellular, and even social levels. Based on these fundamental principles, we formulate a conceptually novel social neuroscience paradigm (OSIMAS), which envisages social systems emerging from the coherent neurodynamical processes taking place in individual mind-fields. In this way, societies are understood as global processes emerging from the superposition of the conscious and subconscious mind-fields of individual members of society. For the experimental validation of the biologically inspired OSIMAS paradigm, we have designed a framework of EEG-based experiments. Initial baseline individual tests of spectral cross-correlations of EEG-recorded brainwave patterns for some mental states are provided in this paper. Preliminary experimental results do not refute the main OSIMAS postulates. This paper also provides some insights for the construction of OSIMAS-based simulation models.
Metabolic network reconstruction of Chlamydomonas offers insight into light-driven algal metabolism
Chang, Roger L; Ghamsari, Lila; Manichaikul, Ani; Hom, Erik F Y; Balaji, Santhanam; Fu, Weiqi; Shen, Yun; Hao, Tong; Palsson, Bernhard Ø; Salehi-Ashtiani, Kourosh; Papin, Jason A
2011-01-01
Metabolic network reconstruction encompasses existing knowledge about an organism's metabolism and genome annotation, providing a platform for omics data analysis and phenotype prediction. The model alga Chlamydomonas reinhardtii is employed to study diverse biological processes from photosynthesis to phototaxis. Recent heightened interest in this species results from an international movement to develop algal biofuels. Integrating biological and optical data, we reconstructed a genome-scale metabolic network for this alga and devised a novel light-modeling approach that enables quantitative growth prediction for a given light source, resolving wavelength and photon flux. We experimentally verified transcripts accounted for in the network and physiologically validated model function through simulation and generation of new experimental growth data, providing high confidence in network contents and predictive applications. The network offers insight into algal metabolism and potential for genetic engineering and efficient light source design, a pioneering resource for studying light-driven metabolism and quantitative systems biology. PMID:21811229
The mathematics of cancer: integrating quantitative models.
Altrock, Philipp M; Liu, Lin L; Michor, Franziska
2015-12-01
Mathematical modelling approaches have become increasingly abundant in cancer research. The complexity of cancer is well suited to quantitative approaches as it provides challenges and opportunities for new developments. In turn, mathematical modelling contributes to cancer research by helping to elucidate mechanisms and by providing quantitative predictions that can be validated. The recent expansion of quantitative models addresses many questions regarding tumour initiation, progression and metastases as well as intra-tumour heterogeneity, treatment responses and resistance. Mathematical models can complement experimental and clinical studies, but also challenge current paradigms, redefine our understanding of mechanisms driving tumorigenesis and shape future research in cancer biology.
Acoustic Shielding for a Model Scale Counter-rotation Open Rotor
NASA Technical Reports Server (NTRS)
Stephens, David B.; Envia, Edmane
2012-01-01
The noise shielding benefit of installing an open rotor above a simplified wing or tail is explored experimentally. The test results provide both a benchmark data set for validating shielding prediction tools and an opportunity for a system level evaluation of the noise reduction potential of propulsion noise shielding by an airframe component. A short barrier near the open rotor was found to provide up to 8.5 dB of attenuation at some directivity angles, with tonal sound particularly well shielded. Predictions from two simple shielding theories were found to overestimate the shielding benefit.
Towards natural language question generation for the validation of ontologies and mappings.
Ben Abacha, Asma; Dos Reis, Julio Cesar; Mrabet, Yassine; Pruski, Cédric; Da Silveira, Marcos
2016-08-08
The increasing number of open-access ontologies and their key role in several applications such as decision-support systems highlight the importance of their validation. Human expertise is crucial for the validation of ontologies from a domain point of view. However, the growing number of ontologies and their fast evolution over time make manual validation challenging. We propose a novel semi-automatic approach based on the generation of natural language (NL) questions to support the validation of ontologies and their evolution. The proposed approach includes the automatic generation, factorization, and ordering of NL questions from medical ontologies. The final validation and correction are performed by submitting these questions to domain experts and automatically analyzing their feedback. We also propose a second approach for the validation of mappings impacted by ontology changes. The method exploits the context of the changes to propose correction alternatives presented as multiple-choice questions. This research provides a question optimization strategy to maximize the validation of ontology entities with a reduced number of questions. We evaluate our approach for the validation of three medical ontologies. We also evaluate the feasibility and efficiency of our mappings validation approach in the context of ontology evolution. These experiments are performed with different versions of SNOMED-CT and ICD9. The obtained experimental results suggest the feasibility and adequacy of our approach to support the validation of interconnected and evolving ontologies. Results also suggest that taking into account RDFS and OWL entailment helps reduce the number of questions and the validation time. The application of our approach to validating mapping evolution also shows the difficulty of adapting mappings over time and highlights the importance of semi-automatic validation.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Radulescu, Georgeta; Gauld, Ian C; Ilas, Germina
2011-01-01
The expanded use of burnup credit in the United States (U.S.) for storage and transport casks, particularly in the acceptance of credit for fission products, has been constrained by the availability of experimental fission product data to support code validation. The U.S. Nuclear Regulatory Commission (NRC) staff has noted that the rationale for restricting the Interim Staff Guidance on burnup credit for storage and transportation casks (ISG-8) to actinide-only is based largely on the lack of clear, definitive experiments that can be used to estimate the bias and uncertainty for computational analyses associated with using burnup credit. To address the issues of burnup credit criticality validation, the NRC initiated a project with the Oak Ridge National Laboratory to (1) develop and establish a technically sound validation approach for commercial spent nuclear fuel (SNF) criticality safety evaluations based on best-available data and methods and (2) apply the approach for representative SNF storage and transport configurations/conditions to demonstrate its usage and applicability, as well as to provide reference bias results. The purpose of this paper is to describe the isotopic composition (depletion) validation approach and the resulting observations and recommendations. Validation of the criticality calculations is addressed in a companion paper at this conference. For isotopic composition validation, the approach is to determine burnup-dependent bias and uncertainty in the effective neutron multiplication factor (keff) due to bias and uncertainty in isotopic predictions, via comparisons of calculated isotopic composition predictions with measured isotopic compositions from destructive radiochemical assay, utilizing as much assay data as is available, together with a best-estimate Monte Carlo based method.
This paper (1) provides a detailed description of the burnup credit isotopic validation approach and its technical bases, (2) describes the application of the approach to representative pressurized water reactor and boiling water reactor safety analysis models to demonstrate its usage and applicability, and (3) provides reference bias and uncertainty results based on a quality-assurance-controlled prerelease version of the Scale 6.1 code package and the ENDF/B-VII nuclear cross section data.
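The core of the depletion validation step — comparing calculated and measured isotopic compositions — can be illustrated with a minimal sketch. This is not the ORNL procedure (which propagates isotopic bias into keff via a Monte Carlo method); it only shows the first ingredient, per-nuclide bias and spread estimated from calculated-to-experimental (C/E) ratios, using invented numbers.

```python
import statistics

# Illustrative sketch (not the ORNL methodology): estimate the bias and
# uncertainty of a code's prediction for one nuclide from C/E ratios across
# radiochemical assay samples. All values below are invented.
def bias_and_uncertainty(calculated, measured):
    ce = [c / e for c, e in zip(calculated, measured)]
    bias = statistics.mean(ce) - 1.0    # mean relative over/under-prediction
    uncert = statistics.stdev(ce)       # spread of C/E about its mean
    return bias, uncert

calc = [1.02, 0.97, 1.05, 1.01]   # code predictions (normalized)
meas = [1.00, 1.00, 1.00, 1.00]   # assay measurements (normalized)
bias, uncert = bias_and_uncertainty(calc, meas)
```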
Selecting and Improving Quasi-Experimental Designs in Effectiveness and Implementation Research.
Handley, Margaret A; Lyles, Courtney R; McCulloch, Charles; Cattamanchi, Adithya
2018-04-01
Interventional researchers face many design challenges when assessing intervention implementation in real-world settings. Intervention implementation requires holding fast to internal validity needs while incorporating external validity considerations (such as uptake by diverse subpopulations, acceptability, cost, and sustainability). Quasi-experimental designs (QEDs) are increasingly employed to achieve a balance between internal and external validity. Although these designs are often referred to and summarized in terms of logistical benefits, there is still uncertainty about (a) selecting from among various QEDs and (b) developing strategies to strengthen the internal and external validity of QEDs. We focus here on commonly used QEDs (pre-post designs with nonequivalent control groups, interrupted time series, and stepped-wedge designs) and discuss several variants that maximize internal and external validity at the design, execution and implementation, and analysis stages.
A Systematic Method for Verification and Validation of Gyrokinetic Microstability Codes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bravenec, Ronald
My original proposal for the period Feb. 15, 2014 through Feb. 14, 2017 called for an integrated validation and verification effort carried out by myself with collaborators. The validation component would require experimental profile and power-balance analysis. In addition, it would require running the gyrokinetic codes varying the input profiles within experimental uncertainties to seek agreement with experiment before discounting a code as invalidated. Therefore, validation would require a major increase of effort over my previous grant periods, which covered only code verification (code benchmarking). Consequently, I had requested full-time funding. Instead, I am being funded at somewhat less than half time (5 calendar months per year). As a consequence, I decided to forego the validation component and to only continue the verification efforts.
Experimental and theoretical study of magnetohydrodynamic ship models.
Cébron, David; Viroulet, Sylvain; Vidal, Jérémie; Masson, Jean-Paul; Viroulet, Philippe
2017-01-01
Magnetohydrodynamic (MHD) ships represent a clear demonstration of the Lorentz force in fluids, which explains the number of student practicals and exercises described on the web. However, the related literature is rather specific and no complete comparison between theory and typical small-scale experiments is currently available. This work provides, in a self-consistent framework, a detailed presentation of the relevant theoretical equations for small MHD ships and experimental measurements for future benchmarks. Theoretical results from the literature are adapted to these simple battery/magnet-powered ships moving on salt water. Comparisons between theory and experiment are performed to validate each theoretical step, such as the Tafel and Kohlrausch laws, or the predicted ship speed. A successful agreement is obtained without any adjustable parameter. Finally, based on these results, an optimal design is deduced from the theory. This work therefore provides a solid theoretical and experimental ground for small-scale MHD ships, by presenting in detail several approximations and how they affect the boat efficiency. Moreover, the theory is general enough to be adapted to other contexts, such as large-scale ships or industrial flow measurement techniques.
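The steady-state speed prediction for such a ship reduces to a force balance: Lorentz thrust F = B·I·L on the current-carrying channel against quadratic hull drag. A back-of-envelope sketch with invented parameter values (not the paper's measurements):

```python
import math

# Illustrative force-balance sketch for a small battery/magnet MHD ship:
# thrust F = B*I*L, balanced at steady state by drag 0.5*rho*Cd*A*v^2.
# All parameter values below are invented for illustration.
def ship_speed(B, I, L, rho, Cd, A):
    thrust = B * I * L                               # Lorentz force, N
    return math.sqrt(2.0 * thrust / (rho * Cd * A))  # steady speed, m/s

v = ship_speed(B=0.3,      # magnetic field, T
               I=2.0,      # electrode current, A
               L=0.05,     # electrode length, m
               rho=1025.0, # salt water density, kg/m^3
               Cd=0.8,     # hull drag coefficient (assumed)
               A=2e-3)     # wetted frontal area, m^2
```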
Physical realization of the Glauber quantum oscillator.
Gentilini, Silvia; Braidotti, Maria Chiara; Marcucci, Giulia; DelRe, Eugenio; Conti, Claudio
2015-11-02
More than thirty years ago, Glauber suggested that the link between the reversible microscopic and the irreversible macroscopic world can be formulated in physical terms through an inverted harmonic oscillator describing quantum amplifiers. Further theoretical studies have shown that the paradigm for irreversibility is indeed the reversed harmonic oscillator. As outlined by Glauber, providing experimental evidence of these idealized physical systems could open the way to a variety of fundamental studies, for example, to simulate irreversible quantum dynamics and explain the arrow of time. However, supporting experimental evidence of reversed quantized oscillators has been lacking. We report the direct observation of exploding n = 0 and n = 2 discrete states and of the quantized decay rates Γ0 and Γ2 of a reversed harmonic oscillator generated by an optical photothermal nonlinearity. Our results give experimental validation to the main prediction of irreversible quantum mechanics, namely the existence of states with quantized decay rates. Our results also provide a novel perspective on optical shock waves, potentially useful for applications such as lasers, optical amplifiers, and white-light and X-ray generation.
Electrode Coverage Optimization for Piezoelectric Energy Harvesting from Tip Excitation
Chen, Guangzhu; Bai, Nan
2018-01-01
Piezoelectric energy harvesting using cantilever-type structures has been extensively investigated due to its potential application in providing power supplies for wireless sensor networks, but the low output power has been a bottleneck for its further commercialization. To improve the power conversion capability, a piezoelectric beam with different electrode coverage ratios is studied theoretically and experimentally in this paper. A distributed-parameter theoretical model is established for a bimorph piezoelectric beam with consideration of the electrode coverage area. The impact of the electrode coverage on the capacitance, the output power and the optimal load resistance is analyzed, showing that the piezoelectric beam has the best performance with an electrode coverage of 66.1%. An experimental study was then carried out to validate the theoretical results using a piezoelectric beam fabricated with segmented electrodes. The experimental results fit well with the theoretical model. A 12% improvement in the Root-Mean-Square (RMS) output power was achieved with the optimized electrode coverage ratio (66.1%). This work provides a simple approach to utilizing piezoelectric beams in a more efficient way. PMID:29518934
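The load-matching part of the analysis can be illustrated with a much simpler model than the paper's distributed-parameter one: off resonance, a piezoelectric harvester behaves roughly like a voltage source behind its clamped capacitance C_p, so delivered power peaks near R_opt = 1/(2·pi·f·C_p). The sketch below uses invented values; the 66.1% coverage result above comes from the full model, not from this approximation.

```python
import math

# Hedged sketch: power delivered to a resistive load by a harvester modeled
# as a voltage source in series with its clamped capacitance C_p.
# Parameter values are invented for illustration.
def load_power(V_rms, C_p, f, R):
    Xc = 1.0 / (2 * math.pi * f * C_p)       # reactance of clamped capacitance
    return V_rms ** 2 * R / (R ** 2 + Xc ** 2)

C_p, f, V = 20e-9, 100.0, 5.0                # assumed capacitance, frequency, voltage
R_opt = 1.0 / (2 * math.pi * f * C_p)        # impedance-matched load
p_opt = load_power(V, C_p, f, R_opt)
p_off = load_power(V, C_p, f, R_opt / 10)    # badly mismatched load for comparison
```

At the matched load the expression reduces to P = V²/(2·R_opt), the familiar maximum-power-transfer result for a purely capacitive source impedance.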
Experimental and theoretical study of magnetohydrodynamic ship models
Viroulet, Sylvain; Vidal, Jérémie; Masson, Jean-Paul; Viroulet, Philippe
2017-01-01
Magnetohydrodynamic (MHD) ships represent a clear demonstration of the Lorentz force in fluids, which explains the number of student practicals and exercises described on the web. However, the related literature is rather specific and no complete comparison between theory and typical small-scale experiments is currently available. This work provides, in a self-consistent framework, a detailed presentation of the relevant theoretical equations for small MHD ships and experimental measurements for future benchmarks. Theoretical results from the literature are adapted to these simple battery/magnet-powered ships moving on salt water. Comparisons between theory and experiment are performed to validate each theoretical step, such as the Tafel and Kohlrausch laws, or the predicted ship speed. A successful agreement is obtained without any adjustable parameter. Finally, based on these results, an optimal design is deduced from the theory. This work therefore provides a solid theoretical and experimental ground for small-scale MHD ships, by presenting in detail several approximations and how they affect the boat efficiency. Moreover, the theory is general enough to be adapted to other contexts, such as large-scale ships or industrial flow measurement techniques. PMID:28665941
Wind tunnel validation of AeroDyn within LIFES50+ project: imposed Surge and Pitch tests
NASA Astrophysics Data System (ADS)
Bayati, I.; Belloli, M.; Bernini, L.; Zasso, A.
2016-09-01
This paper presents the first set of results from the steady and unsteady wind tunnel tests performed at the Politecnico di Milano wind tunnel on a 1/75 rigid scale model of the DTU 10 MW wind turbine, within the LIFES50+ project. The aim of these tests is the validation of the open-source code AeroDyn developed at NREL. Numerical and experimental steady results are compared in terms of thrust and torque coefficients, showing good agreement; unsteady measurements were gathered with a 2-degree-of-freedom test rig capable of imposing displacements at the base of the model, reproducing the surge and pitch motions of the floating offshore wind turbine (FOWT) scale model. The measurements from the unsteady test configuration are compared with results from the AeroDyn/Dynin module, which implements the generalized dynamic wake (GDW) model. The numerical and experimental results show similar behaviour in terms of nonlinear hysteresis; however, some discrepancies in the aerodynamic integral quantities are reported here and require further data analysis and interpretation, with special attention to the physics of the unsteady phenomenon.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ewsuk, K.G.; Cochran, R.J.; Blackwell, B.F.
The properties and performance of a ceramic component are determined by a combination of the materials from which it was fabricated and how it was processed. Most ceramic components are manufactured by dry pressing a powder/binder system in which the organic binder provides formability and green compact strength. A key step in this manufacturing process is the removal of the binder from the powder compact after pressing. The organic binder is typically removed by a thermal decomposition process in which heating rate, temperature, and time are the key process parameters. Empirical approaches are generally used to design the burnout time-temperature cycle, often resulting in excessive processing times and energy usage, and higher overall manufacturing costs. Ideally, binder burnout should be completed as quickly as possible without damaging the compact, while using a minimum of energy. Process and computational modeling offer one means to achieve this end. The objective of this study is to develop an experimentally validated computer model that can be used to better understand, control, and optimize binder burnout from green ceramic compacts.
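The kinetic core of such a burnout model is often a first-order decomposition with an Arrhenius rate under a prescribed heating schedule. A minimal sketch follows, with invented kinetic parameters; a validated model like the one described above would couple this to heat and mass transport in the compact.

```python
import math

# Minimal burnout-kinetics sketch (invented parameters, not the study's model):
# first-order binder decomposition d(alpha)/dt = A*exp(-E/(R*T))*(1 - alpha)
# under a linear heating ramp T(t) = T0 + ramp*t, integrated by explicit Euler.
def burnout_fraction(A, E, T0, ramp, t_end, dt=0.1, R=8.314):
    alpha, t = 0.0, 0.0
    while t < t_end:
        T = T0 + ramp * t                                  # temperature, K
        alpha += dt * A * math.exp(-E / (R * T)) * (1.0 - alpha)
        t += dt
    return min(alpha, 1.0)

# the same soak time at a slower ramp removes much less binder
fast = burnout_fraction(A=1e8, E=1.5e5, T0=300.0, ramp=0.2, t_end=3600.0)
slow = burnout_fraction(A=1e8, E=1.5e5, T0=300.0, ramp=0.1, t_end=3600.0)
```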
A Validated Multiscale In-Silico Model for Mechano-sensitive Tumour Angiogenesis and Growth
Loizidou, Marilena; Stylianopoulos, Triantafyllos; Hawkes, David J.
2017-01-01
Vascularisation is a key feature of cancer growth, invasion and metastasis. To better understand the governing biophysical processes and their relative importance, it is instructive to develop physiologically representative mathematical models with which to compare to experimental data. Previous studies have successfully applied this approach to test the effect of various biochemical factors on tumour growth and angiogenesis. However, these models do not account for the experimentally observed dependency of angiogenic network evolution on growth-induced solid stresses. This work introduces two novel features: the effects of hapto- and mechanotaxis on vessel sprouting, and mechano-sensitive dynamic vascular remodelling. The proposed three-dimensional, multiscale, in-silico model of dynamically coupled angiogenic tumour growth is parameterised with in-vivo and in-vitro data, chosen, where possible, to provide a physiologically consistent description. The model is then validated against in-vivo data from murine mammary carcinomas, with particular focus placed on identifying the influence of mechanical factors. Crucially, we find that it is necessary to include hapto- and mechanotaxis to recapitulate observed time-varying spatial distributions of angiogenic vasculature. PMID:28125582
MAJIQ-SPEL: Web-tool to interrogate classical and complex splicing variations from RNA-Seq data.
Green, Christopher J; Gazzara, Matthew R; Barash, Yoseph
2017-09-11
Analysis of RNA sequencing (RNA-Seq) data has highlighted the fact that most genes undergo alternative splicing (AS) and that these patterns are tightly regulated. Many of these events are complex, resulting in numerous possible isoforms that quickly become difficult to visualize, interpret, and experimentally validate. To address these challenges we developed MAJIQ-SPEL, a web-tool that takes as input local splicing variations (LSVs) quantified from RNA-Seq data and provides users with visualization and quantification of the gene isoforms associated with them. Importantly, MAJIQ-SPEL is able to handle both classical (binary) and complex, non-binary, splicing variations. Using a matching primer design algorithm, it also suggests possible primers for experimental validation by RT-PCR and displays those, along with the matching protein domains affected by the LSV, on the UCSC Genome Browser for further downstream analysis. Program and code will be available at http://majiq.biociphers.org/majiq-spel. Supplementary data are available at Bioinformatics online.
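The quantity underlying an LSV quantification is the relative usage of each junction, the "percent spliced in" (PSI). A toy sketch of that ratio for a non-binary LSV, with invented read counts (MAJIQ itself uses a Bayesian model over read rates, not this naive ratio):

```python
# Toy PSI sketch: fraction of reads supporting each junction of a local
# splicing variation (LSV). Junction names and read counts are invented;
# this naive ratio stands in for MAJIQ's Bayesian PSI estimate.
def psi(junction_reads):
    total = sum(junction_reads.values())
    return {junction: n / total for junction, n in junction_reads.items()}

# a complex (non-binary) LSV: one source exon splicing to three targets
reads = {"e1-e2": 60, "e1-e3": 30, "e1-e4": 10}
ratios = psi(reads)
```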
OVERVIEW OF NEUTRON MEASUREMENTS IN JET FUSION DEVICE.
Batistoni, P; Villari, R; Obryk, B; Packer, L W; Stamatelatos, I E; Popovichev, S; Colangeli, A; Colling, B; Fonnesu, N; Loreti, S; Klix, A; Klosowski, M; Malik, K; Naish, J; Pillon, M; Vasilopoulou, T; De Felice, P; Pimpinella, M; Quintieri, L
2017-10-05
The design and operation of the ITER experimental fusion reactor require the development of neutron measurement techniques and numerical tools to derive the fusion power and the radiation field in the device and in the surrounding areas. Nuclear analyses provide essential input to the conceptual design, optimisation, engineering and safety case in ITER and power plant studies. The required radiation transport calculations are extremely challenging because of the large physical extent of the reactor plant, the complexity of the geometry, and the combination of deep penetration and streaming paths. This article reports the experimental activities carried out at JET to validate the neutronics measurement methods and numerical tools used in ITER and power plant design. A new deuterium-tritium campaign is proposed in 2019 at JET: the unique 14 MeV neutron yields produced will be exploited as much as possible to validate measurement techniques, codes, procedures and data currently used in ITER design, thus reducing the related uncertainties and the associated risks in machine operation.
A Validation of Object-Oriented Design Metrics
NASA Technical Reports Server (NTRS)
Basili, Victor R.; Briand, Lionel; Melo, Walcelio L.
1995-01-01
This paper presents the results of a study conducted at the University of Maryland in which we experimentally investigated the suite of Object-Oriented (OO) design metrics introduced by [Chidamber and Kemerer, 1994]. In order to do this, we assessed these metrics as predictors of fault-prone classes. This study is complementary to [Li and Henry, 1993], where the same suite of metrics had been used to assess frequencies of maintenance changes to classes. To perform our validation accurately, we collected data on the development of eight medium-sized information management systems based on identical requirements. All eight projects were developed using a sequential life cycle model, a well-known OO analysis/design method and the C++ programming language. Based on experimental results, the advantages and drawbacks of these OO metrics are discussed and suggestions for improvement are provided. Several of Chidamber and Kemerer's OO metrics appear to be adequate to predict class fault-proneness during the early phases of the life-cycle. We also showed that they are, on our data set, better predictors than "traditional" code metrics, which can only be collected at a later phase of the software development process.
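Two of the Chidamber-Kemerer metrics studied above, Depth of Inheritance Tree (DIT) and Number of Children (NOC), are simple enough to sketch from a class hierarchy. The toy inheritance map below (child to parent) uses invented class names; the study itself extracted such metrics from C++ designs.

```python
# Sketch of two CK metrics over a toy child -> parent inheritance map.
# DIT: depth of inheritance tree; NOC: number of immediate subclasses.
# Class names are invented for illustration.
def dit(cls, parents):
    depth = 0
    while cls in parents:          # walk up until we reach a root class
        cls = parents[cls]
        depth += 1
    return depth

def noc(cls, parents):
    return sum(1 for parent in parents.values() if parent == cls)

parents = {"Button": "Widget", "Slider": "Widget", "Widget": "Object"}
d = dit("Button", parents)   # Button -> Widget -> Object
n = noc("Widget", parents)   # Button and Slider inherit from Widget
```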
Wang, Hanghang; Muehlbauer, Michael J.; O’Neal, Sara K.; Newgard, Christopher B.; Hauser, Elizabeth R.; Shah, Svati H.
2017-01-01
The field of metabolomics as applied to human disease and health is rapidly expanding. In recent efforts of metabolomics research, greater emphasis has been placed on quality control and method validation. In this study, we report an experience with quality control and a practical application of method validation. Specifically, we sought to identify and modify steps in gas chromatography-mass spectrometry (GC-MS)-based, non-targeted metabolomic profiling of human plasma that could influence metabolite identification and quantification. Our experimental design included two studies: (1) a limiting-dilution study, which investigated the effects of dilution on analyte identification and quantification; and (2) a concentration-specific study, which compared the optimal plasma extract volume established in the first study with the volume used in the current institutional protocol. We confirmed that contaminants, concentration, repeatability and intermediate precision are major factors influencing metabolite identification and quantification. In addition, we established methods for improved metabolite identification and quantification, which were summarized to provide recommendations for experimental design of GC-MS-based non-targeted profiling of human plasma. PMID:28841195
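A standard repeatability check in such QC work is the relative standard deviation (CV%) of a metabolite's peak area across replicate injections. A minimal sketch with invented peak areas; the 15% acceptance threshold is an assumed, commonly cited figure, not necessarily the study's criterion.

```python
import statistics

# QC sketch: coefficient of variation (CV%) of replicate peak areas, used to
# judge repeatability / intermediate precision. Peak areas are invented, and
# the 15% threshold is an assumed example criterion.
def cv_percent(values):
    return 100.0 * statistics.stdev(values) / statistics.mean(values)

replicates = [10400.0, 9800.0, 10150.0, 10050.0]  # replicate injections
cv = cv_percent(replicates)
acceptable = cv < 15.0
```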
NASA Astrophysics Data System (ADS)
Fredette, Luke; Singh, Rajendra
2017-02-01
A spectral element approach is proposed to determine the multi-axis dynamic stiffness terms of elastomeric isolators with fractional damping over a broad range of frequencies. The dynamic properties of a class of cylindrical isolators are modeled by using the continuous system theory in terms of homogeneous rods or Timoshenko beams. The transfer matrix type dynamic stiffness expressions are developed from exact harmonic solutions given translational or rotational displacement excitations. Broadband dynamic stiffness magnitudes (say up to 5 kHz) are computationally verified for axial, torsional, shear, flexural, and coupled stiffness terms using a finite element model. Some discrepancies are found between finite element and spectral element models for the axial and flexural motions, illustrating certain limitations of each method. Experimental validation is provided for an isolator with two cylindrical elements (that work primarily in the shear mode) using dynamic measurements, as reported in the prior literature, up to 600 Hz. Superiority of the fractional damping formulation over structural or viscous damping models is illustrated via experimental validation. Finally, the strengths and limitations of the spectral element approach are briefly discussed.
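The flavor of the approach can be shown for the simplest member of the family: the axial dynamic stiffness of a continuous rod fixed at its far end, K(ω) = E*A·k·cot(kL), with a fractional-derivative complex modulus. This is a hedged sketch with invented material and geometry values, not the paper's isolator model (which also covers Timoshenko bending and coupled terms).

```python
import cmath
import math

# Hedged sketch (not the paper's formulation): axial dynamic stiffness of a
# rod fixed at x = L and driven at x = 0, K(w) = E*(w)*A*k*cot(k*L), with a
# fractional Kelvin-Voigt modulus E*(w) = E0*(1 + b*(1j*w)**alpha).
# Material and geometry values are invented.
def dynamic_stiffness(omega, E0=5e6, b=0.002, alpha=0.5,
                      rho=1100.0, A=1e-3, L=0.05):
    E = E0 * (1.0 + b * (1j * omega) ** alpha)   # complex modulus
    k = omega * cmath.sqrt(rho / E)              # complex wavenumber
    return E * A * k / cmath.tan(k * L)          # cot(kL) as 1/tan(kL)

K_low = dynamic_stiffness(2 * math.pi * 10)      # well below first resonance
K_qs = 5e6 * 1e-3 / 0.05                         # quasi-static E0*A/L for comparison
```

At low frequency the real part approaches the quasi-static stiffness E0·A/L, while the positive imaginary part reflects the fractional damping term.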
Brannock, M; Wang, Y; Leslie, G
2010-05-01
Membrane Bioreactors (MBRs) have been successfully used in aerobic biological wastewater treatment to solve the perennial problem of effective solids-liquid separation. The optimisation of MBRs requires knowledge of the membrane fouling, biokinetics and mixing. However, research has mainly concentrated on the fouling and biokinetics (Ng and Kim, 2007). Current methods of design for a desired flow regime within MBRs are largely based on assumptions (e.g. complete mixing of tanks) and empirical techniques (e.g. specific mixing energy). However, it is difficult to predict how sludge rheology and vessel design in full-scale installations affect hydrodynamics, and hence overall performance. Computational Fluid Dynamics (CFD) provides a method for predicting how vessel features and mixing energy usage affect the hydrodynamics. In this study, a CFD model was developed which accounts for aeration, sludge rheology and geometry (i.e. bioreactor and membrane module). This MBR CFD model was then applied to two full-scale MBRs and was successfully validated against experimental results. The effect of sludge settling and rheology was found to have a minimal impact on the bulk mixing (i.e. the residence time distribution).
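The residence time distribution (RTD) used above as the bulk-mixing yardstick is characterized by its moments, computed from a tracer response curve. A minimal sketch with a synthetic E(t) curve on a uniform time grid (trapezoidal integration; the values are invented, not the study's data):

```python
# Sketch: mean residence time from a tracer RTD curve E(t), the quantity used
# to compare bulk mixing between model and experiment. E(t) values are synthetic.
def rtd_mean(times, E):
    def trapz(y):
        return sum(0.5 * (y[i] + y[i + 1]) * (times[i + 1] - times[i])
                   for i in range(len(times) - 1))
    area = trapz(E)                                     # ~1 for a normalized RTD
    return trapz([t * e for t, e in zip(times, E)]) / area

times = [0, 1, 2, 3, 4, 5, 6]                # hours
E = [0.0, 0.1, 0.3, 0.3, 0.2, 0.1, 0.0]      # invented tracer response, 1/h
tau = rtd_mean(times, E)
```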
A New Viewpoint (The expanding universe, Dark energy and Dark matter)
NASA Astrophysics Data System (ADS)
Cwele, Daniel
2011-10-01
Just as the relativity paradox once threatened the validity of physics in Albert Einstein's days, the cosmos paradox, the galaxy rotation paradox and the experimental invalidity of the theory of dark matter and dark energy threaten the stability and validity of physics today. These theories and ideas and many others, including the Big Bang theory, all depend almost entirely on the notion of the expanding universe, Edwin Hubble's observations and reports and the observational inconsistencies of modern day theoretical Physics and Astrophysics on related subjects. However, much of the evidence collected in experimental Physics and Astronomy aimed at proving many of these ideas and theories is ambiguous, and can be used to prove other theories, given a different interpretation of its implications. The argument offered here is aimed at providing one such interpretation, attacking the present day theories of dark energy, dark matter and the Big Bang, and proposing a new Cosmological theory based on a modification of Isaac Newton's laws and an expansion on Albert Einstein's theories, without assuming any invalidity or questionability on present day cosmological data and astronomical observations.
Recent Advances in Simulation of Eddy Current Testing of Tubes and Experimental Validations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Reboud, C.; Premel, D.; Lesselier, D.
2007-03-21
Eddy current testing (ECT) is widely used in the iron and steel industry for the inspection of tubes during manufacturing. A collaboration between CEA and the Vallourec Research Center led to the development of new numerical functionalities dedicated to the simulation of ECT of non-magnetic tubes by external probes. The success of these experimental validations led to the integration of the models into the CIVA platform. The modeling approach and validation results are discussed here. A new numerical scheme is also proposed in order to improve the accuracy of the model.
Recent Advances in Simulation of Eddy Current Testing of Tubes and Experimental Validations
NASA Astrophysics Data System (ADS)
Reboud, C.; Prémel, D.; Lesselier, D.; Bisiaux, B.
2007-03-01
Eddy current testing (ECT) is widely used in the iron and steel industry for the inspection of tubes during manufacturing. A collaboration between CEA and the Vallourec Research Center led to the development of new numerical functionalities dedicated to the simulation of ECT of non-magnetic tubes by external probes. The success of these experimental validations led to the integration of the models into the CIVA platform. The modeling approach and validation results are discussed here. A new numerical scheme is also proposed in order to improve the accuracy of the model.
NASA Astrophysics Data System (ADS)
Soligo, Riccardo
In this work, the insight provided by our sophisticated Full Band Monte Carlo simulator is used to analyze the behavior of state-of-the-art devices such as GaN High Electron Mobility Transistors and Hot Electron Transistors. Chapter 1 is dedicated to the description of the simulation tool used to obtain the results shown in this work. Moreover, a separate section is dedicated to the setup of a procedure to validate the tunneling algorithm recently implemented in the simulator. Chapter 2 introduces High Electron Mobility Transistors (HEMTs), state-of-the-art devices characterized by highly nonlinear transport phenomena that require the use of advanced simulation methods. The techniques for device modeling are described and applied to a recent GaN-HEMT, and they are validated with experimental measurements. The main characterization techniques are also described, including the original contribution provided by this work. Chapter 3 focuses on a popular technique to enhance HEMT performance: the down-scaling of the device dimensions. In particular, this chapter is dedicated to lateral scaling and the calculation of a limiting cutoff frequency for a device of vanishing length. Finally, Chapter 4 and Chapter 5 describe the modeling of Hot Electron Transistors (HETs). The simulation approach is validated by matching the current characteristics with the experimental ones before variations of the layouts are proposed to increase the current gain to values suitable for amplification. The frequency response of these layouts is calculated and modeled by a small-signal circuit. For this purpose, a method to directly calculate the capacitance is developed, which provides a graphical picture of the capacitive phenomena that limit the frequency response in devices. In Chapter 5 the properties of the hot electrons are investigated for different injection energies, which are obtained by changing the layout of the emitter barrier.
Moreover, the large signal characterization of the HET is shown for different layouts, where the collector barrier was scaled.
Comprehensive overview of the Point-by-Point model of prompt emission in fission
NASA Astrophysics Data System (ADS)
Tudora, A.; Hambsch, F.-J.
2017-08-01
The investigation of prompt emission in fission is very important for understanding the fission process and for improving the quality of evaluated nuclear data required for new applications. In the last decade remarkable efforts were made both in the development of prompt emission models and in the experimental investigation of the properties of fission fragments and of the prompt neutron and γ-ray emission. The accurate experimental data concerning the prompt neutron multiplicity as a function of fragment mass and total kinetic energy for 252Cf(SF) and 235U(n, f), recently measured at JRC-Geel, as well as various other prompt emission data, allow a consistent and very detailed validation of the Point-by-Point (PbP) deterministic model of prompt emission. The PbP model results describe very well a large variety of experimental data, starting from the multi-parametric matrices of prompt neutron multiplicity ν(A, TKE) and γ-ray energy E_γ(A, TKE), which validate the model itself, passing through different average prompt emission quantities as a function of A (e.g., ν(A), E_γ(A), ⟨ε⟩(A), etc.) and as a function of TKE (e.g., ν(TKE), E_γ(TKE)), up to the prompt neutron distribution P(ν) and the total average prompt neutron spectrum. The PbP model does not use free or adjustable parameters. To calculate the multi-parametric matrices it needs only data included in the Reference Input Parameter Library (RIPL) of the IAEA. To provide average prompt emission quantities as a function of A, as a function of TKE, and total average quantities, the multi-parametric matrices are averaged over reliable experimental fragment distributions. The PbP results are also in agreement with the results of the Monte Carlo prompt emission codes FIFRELIN, CGMF and FREYA.
The good description of a large variety of experimental data proves the capability of the PbP model to be used in nuclear data evaluations and its reliability to predict prompt emission data for fissioning nuclei and incident energies for which the experimental information is completely missing. The PbP treatment can also provide input parameters of the improved Los Alamos model with non-equal residual temperature distributions recently reported by Madland and Kahler, especially for fissioning nuclei without any experimental information concerning the prompt emission.
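The averaging step described above — collapsing a multi-parametric matrix over an experimental fragment yield distribution — can be sketched directly. The tiny ν(A, TKE) and Y(A, TKE) matrices below are invented placeholders, not PbP output or measured yields.

```python
# Sketch of averaging a multi-parametric matrix nu(A, TKE) over a fragment
# yield Y(A, TKE) to obtain nu-bar(A). Matrices here are invented toy values.
def average_over_tke(nu, Y):
    """nu, Y: dict[A] -> dict[TKE] -> value; returns dict[A] -> nu-bar."""
    nubar = {}
    for A in nu:
        weight = sum(Y[A].values())
        nubar[A] = sum(nu[A][tke] * Y[A][tke] for tke in nu[A]) / weight
    return nubar

nu = {100: {170: 1.2, 180: 0.8}, 140: {170: 2.0, 180: 1.5}}  # multiplicities
Y = {100: {170: 3.0, 180: 1.0}, 140: {170: 1.0, 180: 1.0}}   # fragment yields
nubar = average_over_tke(nu, Y)
```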
An energy-dependent numerical model for the condensation probability, γ j
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kerby, Leslie Marie
The “condensation” probability, γ_j, is an important variable in the preequilibrium stage of nuclear spallation reactions. It represents the probability that p_j excited nucleons (excitons) will “condense” to form a complex particle of type j in the excited residual nucleus. In addition, it has a significant impact on the emission width, or probability of emitting a fragment of type j from the residual nucleus. Previous formulations for γ_j were energy-independent and valid for fragments up to 4He only. This paper explores the formulation of a new model for γ_j, one which is energy-dependent, valid for fragments up to 28Mg, and which provides improved fits to experimental fragment spectra.
NASA Astrophysics Data System (ADS)
Sanford, T. W. L.; Beutler, D. E.; Halbleib, J. A.; Knott, D. P.
1991-12-01
The radiation produced by a 15.5-MeV monoenergetic electron beam incident on optimized and nonoptimized bremsstrahlung targets is characterized using the ITS Monte Carlo code and measurements with equilibrated and nonequilibrated TLD dosimetry. Comparisons between calculations and measurements verify the calculations and demonstrate that the code can be used to predict both bremsstrahlung production and TLD response for radiation fields that are characteristic of those produced by pulsed simulators of gamma rays. The comparisons provide independent confirmation of the validity of the TLD calibration for photon fields characteristic of gamma-ray simulators. The empirical Martin equation, which is often used to calculate radiation dose from optimized bremsstrahlung targets, is examined, and its range of validity is established.
Predictive searching algorithm for Fourier ptychography
NASA Astrophysics Data System (ADS)
Li, Shunkai; Wang, Yifan; Wu, Weichen; Liang, Yanmei
2017-12-01
By capturing a set of low-resolution images under different illumination angles and stitching them together in the Fourier domain, Fourier ptychography (FP) is capable of providing a high-resolution image with a large field of view. Despite its effectiveness, its long acquisition time limits real-time application. In this paper we propose an incomplete sampling scheme, termed the predictive searching algorithm, to shorten the acquisition and recovery time. Informative sub-regions of the sample's spectrum are searched, and the corresponding images along the most informative directions are captured for spectrum expansion. Its effectiveness is validated by both simulated and experimental results, reducing the data requirement by ~64% to ~90% without sacrificing image reconstruction quality compared with the conventional FP method.
An energy-dependent numerical model for the condensation probability, γ j
Kerby, Leslie Marie
2016-12-09
The “condensation” probability, γ_j, is an important variable in the preequilibrium stage of nuclear spallation reactions. It represents the probability that p_j excited nucleons (excitons) will “condense” to form a complex particle of type j in the excited residual nucleus. In addition, it has a significant impact on the emission width, or probability of emitting a fragment of type j from the residual nucleus. Previous formulations for γ_j were energy-independent and valid for fragments up to 4He only. This paper explores the formulation of a new model for γ_j, one which is energy-dependent, valid for fragments up to 28Mg, and which provides improved fits to experimental fragment spectra.
Integration of design and inspection
NASA Astrophysics Data System (ADS)
Simmonds, William H.
1990-08-01
Developments in advanced computer-integrated manufacturing technology, coupled with the emphasis on Total Quality Management, are exposing needs for new techniques to integrate all functions from design through to support of the delivered product. One critical functional area that must be integrated into design is that embracing the measurement, inspection and test activities necessary for validation of the delivered product. This area is being tackled by a collaborative project supported by the UK Government Department of Trade and Industry. The project is aimed at developing techniques for analysing validation needs and for planning validation methods. Within the project an experimental Computer Aided Validation Expert system (CAVE) is being constructed. This operates with a generalised model of the validation process and helps with all design stages: specification of product requirements; analysis of the assurance provided by a proposed design and method of manufacture; development of the inspection and test strategy; and analysis of feedback data. The kernel of the system is a knowledge base containing knowledge of the manufacturing process capabilities and of the available inspection and test facilities. The CAVE system is being integrated into a real-life advanced computer-integrated manufacturing facility for demonstration and evaluation.
Experimental validation of solid rocket motor damping models
NASA Astrophysics Data System (ADS)
Riso, Cristina; Fransen, Sebastiaan; Mastroddi, Franco; Coppotelli, Giuliano; Trequattrini, Francesco; De Vivo, Alessio
2017-12-01
In design and certification of spacecraft, payload/launcher coupled load analyses are performed to simulate the satellite dynamic environment. To obtain accurate predictions, the system damping properties must be properly taken into account in the finite element model used for coupled load analysis. This is typically done using a structural damping characterization in the frequency domain, which is not applicable in the time domain. Therefore, the structural damping matrix of the system must be converted into an equivalent viscous damping matrix when a transient coupled load analysis is performed. This paper focuses on the validation of equivalent viscous damping methods for dynamically condensed finite element models via correlation with experimental data for a realistic structure representative of a slender launch vehicle with solid rocket motors. A second aim of the paper is to investigate how to conveniently choose a single combination of Young's modulus and structural damping coefficient—complex Young's modulus—to approximate the viscoelastic behavior of a solid propellant material in the frequency band of interest for coupled load analysis. A scaled-down test article inspired by the Z9-ignition Vega launcher configuration is designed, manufactured, and experimentally tested to obtain data for validation of the equivalent viscous damping methods. The Z9-like component of the test article is filled with a viscoelastic material representative of the Z9 solid propellant that is also preliminarily tested to investigate the dependency of the complex Young's modulus on the excitation frequency and provide data for the test article finite element model.
Experimental results from seismic and shock tests performed on the test configuration are correlated with numerical results from frequency and time domain analyses carried out on its dynamically condensed finite element model to assess the applicability of different equivalent viscous damping methods to describe damping properties of slender launch vehicles in payload/launcher coupled load analysis.
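The structural-to-viscous conversion described above can be sketched in minimal modal form. Assuming a uniform structural damping coefficient η and equating the energy dissipated per cycle at each natural frequency ω_i gives c_i = η k_i/ω_i and a modal damping ratio ζ_i = η/2; this is the textbook relation, not the paper's specific condensed-model procedure, and the modal data below are hypothetical:

```python
import numpy as np

def equivalent_viscous_damping(k_modal, omega, eta):
    """Convert a structural (hysteretic) damping coefficient eta into
    equivalent modal viscous damping coefficients c_i = eta * k_i / omega_i,
    matching the energy dissipated per cycle at each natural frequency."""
    k_modal = np.asarray(k_modal, dtype=float)
    omega = np.asarray(omega, dtype=float)
    return eta * k_modal / omega

# Hypothetical modal stiffnesses [N/m] and natural frequencies [rad/s]
k = [4.0e6, 9.0e6, 2.5e7]
w = [2 * np.pi * 10, 2 * np.pi * 25, 2 * np.pi * 60]
c = equivalent_viscous_damping(k, w, eta=0.04)
zeta = 0.04 / 2  # equivalent damping ratio, identical for every mode
```

Note that the equivalence holds exactly only at the frequencies used in the conversion, which is precisely why the choice of frequency band matters in transient coupled load analysis.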
NASA Astrophysics Data System (ADS)
Stotsky, Jay A.; Hammond, Jason F.; Pavlovsky, Leonid; Stewart, Elizabeth J.; Younger, John G.; Solomon, Michael J.; Bortz, David M.
2016-07-01
The goal of this work is to develop a numerical simulation that accurately captures the biomechanical response of bacterial biofilms and their associated extracellular matrix (ECM). In this, the second of a two-part effort, the primary focus is on formally presenting the heterogeneous rheology Immersed Boundary Method (hrIBM) and validating our model by comparison to experimental results. With this extension of the Immersed Boundary Method (IBM), we use the techniques originally developed in Part I ([19]) to treat biofilms as viscoelastic fluids possessing variable rheological properties anchored to a set of moving locations (i.e., the bacteria locations). In particular, we incorporate spatially continuous variable viscosity and density fields into our model. Although in [14,15] variable viscosity is used in an IBM context to model discrete viscosity changes across interfaces, to our knowledge this work and Part I are the first to apply the IBM to model a continuously variable viscosity field. We validate our modeling approach from Part I by comparing dynamic moduli and compliance moduli computed from our model to data from mechanical characterization experiments on Staphylococcus epidermidis biofilms. The experimental setup, in which biofilms are grown and tested in a parallel plate rheometer, is described in [26]. In order to initialize the positions of bacteria in the biofilm, experimentally obtained three-dimensional coordinate data were used. One of the major conclusions of this effort is that treating the spring-like connections between bacteria as Maxwell or Zener elements provides good agreement with the mechanical characterization data. We also found that initializing the simulations with different coordinate data sets only led to small changes in the mechanical characterization results. MATLAB code used to produce results in this paper will be available at https://github.com/MathBioCU/BiofilmSim.
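For reference, the dynamic moduli of a single Maxwell element (a spring of modulus G in series with a dashpot, relaxation time τ) take a standard closed form; the sketch below uses hypothetical parameter values and is not the authors' hrIBM code:

```python
import numpy as np

def maxwell_moduli(omega, G, tau):
    """Storage and loss moduli of a single Maxwell element:
       G'(w)  = G * (w*tau)**2 / (1 + (w*tau)**2)
       G''(w) = G *  (w*tau)   / (1 + (w*tau)**2)
    Both cross at G/2 when w*tau = 1."""
    wt = np.asarray(omega, dtype=float) * tau
    G_storage = G * wt**2 / (1.0 + wt**2)
    G_loss = G * wt / (1.0 + wt**2)
    return G_storage, G_loss

# Illustrative parameters only (not fitted biofilm values)
omega = np.logspace(-2, 2, 5)              # rad/s: 0.01, 0.1, 1, 10, 100
Gs, Gl = maxwell_moduli(omega, G=100.0, tau=1.0)
```

The crossover of G′ and G″ at ωτ = 1 is the signature feature that such frequency-sweep rheometer data are fitted against.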
NASA Astrophysics Data System (ADS)
Joiner, N.; Esser, B.; Fertig, M.; Gülhan, A.; Herdrich, G.; Massuti-Ballester, B.
2016-12-01
This paper summarises the final synthesis of an ESA technology research programme entitled "Development of an Innovative Validation Strategy of Gas Surface Interaction Modelling for Re-entry Applications". The focus of the project was to demonstrate the correct pressure dependency of catalytic surface recombination, with an emphasis on Low Earth Orbit (LEO) re-entry conditions and thermal protection system materials. A physics-based model describing the prevalent recombination mechanisms was proposed for implementation into two CFD codes, TINA and TAU. A dedicated experimental campaign was performed to calibrate and validate the CFD model on TPS materials pertinent to the EXPERT space vehicle at a wide range of temperatures and pressures relevant to LEO. A new set of catalytic recombination data was produced that was able to improve the chosen model calibration for CVD-SiC and provide the first model calibration for the nickel-chromium super-alloy PM1000. The experimentally observed pressure dependency of catalytic recombination can only be reproduced by the Langmuir-Hinshelwood recombination mechanism. Because the enthalpy, and hence the degree of dissociation, decreases with facility stagnation pressure, it was not possible to obtain catalytic recombination coefficients from the measurements at high experimental stagnation pressures. Therefore, the CFD model calibration has been improved by this activity based on the low-pressure results. The results of the model calibration were applied to the existing EXPERT mission profile to examine the impact of the experimentally calibrated model at flight-relevant conditions.
The heat flux overshoot at the CVD-SiC/PM1000 junction on EXPERT is confirmed to produce radiative equilibrium temperatures in close proximity to the PM1000 melt temperature. This was anticipated within the margins of the vehicle design; however, because the measurements made here are the first at temperatures relevant to the junction, increased confidence is now placed in the computations.
Mirage: a visible signature evaluation tool
NASA Astrophysics Data System (ADS)
Culpepper, Joanne B.; Meehan, Alaster J.; Shao, Q. T.; Richards, Noel
2017-10-01
This paper presents the Mirage visible signature evaluation tool, designed to provide a visible signature evaluation capability that appropriately reflects the effect of scene content on the detectability of targets, so that visible signatures can be assessed in the context of their environment. Mirage is based on a parametric evaluation of input images, assessing the value of a range of image metrics and combining them using the boosted decision tree machine learning method to produce target detectability estimates. It has been developed using experimental data from photosimulation experiments, where human observers search for vehicle targets in a variety of digital images. The images used for tool development are synthetic (computer generated) images, showing vehicles in many different scenes and exhibiting a wide variation in scene content. A preliminary validation has been performed using k-fold cross validation, where 90% of the image data set was used for training and 10% of the image data set was used for testing. The results of the k-fold validation from 200 independent tests show agreement between Mirage-predicted and observed detection probabilities of r(262) = 0.63, p < 0.0001 (Pearson correlation), with a mean absolute error (MAE) of 0.21.
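The reported accuracy metrics, Pearson correlation and mean absolute error, are straightforward to compute; a minimal sketch, using synthetic stand-in data rather than the actual photosimulation results:

```python
import numpy as np

def pearson_r(x, y):
    """Pearson correlation coefficient between two 1-D arrays."""
    x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
    xc, yc = x - x.mean(), y - y.mean()
    return (xc @ yc) / np.sqrt((xc @ xc) * (yc @ yc))

def mean_absolute_error(pred, obs):
    """Mean absolute deviation between predicted and observed values."""
    return np.abs(np.asarray(pred, dtype=float) - np.asarray(obs, dtype=float)).mean()

# Synthetic stand-in data: 264 points gives df = n - 2 = 262, as in r(262)
rng = np.random.default_rng(0)
observed = rng.uniform(0.0, 1.0, 264)
predicted = np.clip(observed + rng.normal(0.0, 0.2, 264), 0.0, 1.0)
r = pearson_r(predicted, observed)
mae = mean_absolute_error(predicted, observed)
```

In practice a library routine such as `scipy.stats.pearsonr` would also return the p-value directly.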
NASA Technical Reports Server (NTRS)
Anusonti-Inthra, Phuriwat
2010-01-01
This paper presents validations of a novel rotorcraft analysis that couples Computational Fluid Dynamics (CFD), Computational Structural Dynamics (CSD), and Particle Vortex Transport Method (PVTM) methodologies. The CSD with associated vehicle trim analysis is used to calculate blade deformations and trim parameters. The near-body CFD analysis is employed to provide detailed near-body flow field information, which is used to obtain high-fidelity blade aerodynamic loadings. The far-field wake-dominated region is simulated using the PVTM analysis, which provides accurate prediction of the evolution of the rotor wake released from the near-body CFD domains. A loose coupling methodology between the CSD and CFD/PVTM modules is used, with appropriate information exchange amongst the CSD/CFD/PVTM modules. The coupled CSD/CFD/PVTM methodology is used to simulate various rotorcraft flight conditions (i.e., hover, transition, and high-speed flight), and the results are compared with several sets of experimental data. For the hover condition, the results are compared with hover data for the HART II rotor tested at the DLR Institute of Flight Systems, Germany. For the forward flight conditions, the results are validated with the UH-60A flight test data.
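The loose-coupling exchange described above can be sketched schematically: the structural/trim solver and the aerodynamic solver alternate, passing deformations one way and airload corrections the other, until the trim controls stop changing. The solver functions below are toy stand-ins (hypothetical), not the actual CSD/CFD/PVTM codes:

```python
def loosely_coupled_trim(csd_solve, aero_airloads, max_iters=20, tol=1e-4):
    """Alternate between a structural/trim solution and an aerodynamic
    loads update until the trim control settles (delta-airloads coupling)."""
    controls = None
    airload_correction = 0.0
    for _ in range(max_iters):
        new_controls, deformations = csd_solve(airload_correction)
        airload_correction = aero_airloads(deformations)
        if controls is not None and abs(new_controls - controls) < tol:
            return new_controls
        controls = new_controls
    return controls

# Toy stand-in solvers to exercise the control flow
def csd_stub(correction):
    ctrl = 1.0 / (1.0 + correction)   # settles as the correction settles
    return ctrl, ctrl                 # (trim control, blade deformation)

def aero_stub(deformation):
    return 0.5 * deformation          # correction proportional to deformation

trimmed = loosely_coupled_trim(csd_stub, aero_stub)
```

The point of the sketch is the data flow, not the physics: each module sees only the other's converged output from the previous pass, which is what distinguishes loose from tight coupling.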
Radiation and matter: Electrodynamics postulates and Lorenz gauge
NASA Astrophysics Data System (ADS)
Bobrov, V. B.; Trigger, S. A.; van Heijst, G. J.; Schram, P. P.
2016-11-01
In general terms, we consider matter as a system of charged particles and a quantized electromagnetic field. For a consistent description of the thermodynamic properties of matter, especially in an extreme state, the problem of quantizing the longitudinal and scalar potentials must be solved. In this connection, we point out that the conflict with the traditional postulates of electrodynamics, which hold that only electric and magnetic fields are observable, is resolved by abandoning the assumption that the Maxwell equations are valid for microscopic fields. The Maxwell equations, as a generalization of experimental data, are valid only for averaged quantities. We show that microscopic electrodynamics may be based on postulating the d'Alembert equations for the four-vector potential of the electromagnetic field. The Lorenz gauge holds for the averaged potentials (and thereby ensures that the Maxwell equations hold for the averages). The suggested concept overcomes the difficulties of the electromagnetic field quantization procedure while remaining in accordance with the results of quantum electrodynamics. As a result, longitudinal and scalar photons become real rather than virtual and may, in principle, be observed. The longitudinal and scalar photons provide not only the Coulomb interaction of charged particles but also allow the electric Aharonov-Bohm effect.
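The postulated field equations and the gauge condition on the averaged potentials can be written out explicitly (a standard-notation sketch in Gaussian units; the sign and unit conventions are assumptions here, not taken from the abstract):

```latex
\Box A^{\mu} \equiv \left(\frac{1}{c^{2}}\frac{\partial^{2}}{\partial t^{2}} - \nabla^{2}\right) A^{\mu}
  = \frac{4\pi}{c}\, j^{\mu},
\qquad
\partial_{\mu}\langle A^{\mu}\rangle
  = \frac{1}{c}\frac{\partial \langle \varphi \rangle}{\partial t}
  + \nabla \cdot \langle \mathbf{A} \rangle = 0 .
```

The first equation is imposed on the microscopic potentials themselves, while the Lorenz condition constrains only the averaged potentials, which is what recovers the Maxwell equations for the averaged fields.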