Validation of the thermal challenge problem using Bayesian Belief Networks.
DOE Office of Scientific and Technical Information (OSTI.GOV)
McFarland, John; Swiler, Laura Painton
The thermal challenge problem has been developed at Sandia National Laboratories as a testbed for demonstrating various types of validation approaches and prediction methods. This report discusses one particular methodology to assess the validity of a computational model given experimental data. This methodology is based on Bayesian Belief Networks (BBNs) and can incorporate uncertainty in experimental measurements and physical quantities, as well as model uncertainty. The approach uses the prior and posterior distributions of model output to compute a validation metric based on Bayesian hypothesis testing (a Bayes factor). This report discusses various aspects of the BBN, specifically in the context of the thermal challenge problem. A BBN is developed for a given set of experimental data in a particular experimental configuration. The development of the BBN and the method for "solving" the BBN to develop the posterior distribution of model output through Markov chain Monte Carlo sampling are discussed in detail. The use of the BBN to compute a Bayes factor is demonstrated.
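As an aside on the metric itself: one simple way to realize a Bayes-factor-style comparison from prior and posterior predictive samples is to evaluate both predictive densities at the observation. The sketch below is a minimal illustration under that reading, not the report's actual BBN formulation; all values and names are made up.

```python
import numpy as np
from scipy.stats import gaussian_kde

# Hypothetical inputs: MCMC samples of the model's predicted temperature
# under the prior and under the posterior (after conditioning on data),
# plus one experimental measurement. All values are illustrative.
rng = np.random.default_rng(0)
prior_samples = rng.normal(500.0, 40.0, 20000)      # prior predictive draws
posterior_samples = rng.normal(520.0, 15.0, 20000)  # posterior predictive draws
y_obs = 525.0                                        # measured temperature [K]

# Evaluate each predictive density at the observation with a kernel
# density estimate; their ratio is a simple Bayes-factor-style metric:
# values > 1 indicate the data support the posterior-updated model.
prior_pdf = gaussian_kde(prior_samples)
post_pdf = gaussian_kde(posterior_samples)
bayes_factor = post_pdf(y_obs)[0] / prior_pdf(y_obs)[0]
print(f"Bayes factor at y_obs: {bayes_factor:.2f}")
```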
A Comprehensive Validation Methodology for Sparse Experimental Data
NASA Technical Reports Server (NTRS)
Norman, Ryan B.; Blattnig, Steve R.
2010-01-01
A comprehensive program of verification and validation has been undertaken to assess the applicability of models to space radiation shielding applications and to track progress as models are developed over time. The models are placed under configuration control, and automated validation tests are used so that comparisons can readily be made as models are improved. Though direct comparisons between theoretical results and experimental data are desired for validation purposes, such comparisons are not always possible due to lack of data. In this work, two uncertainty metrics are introduced that are suitable for validating theoretical models against sparse experimental databases. The nuclear physics models, NUCFRG2 and QMSFRG, are compared to an experimental database consisting of over 3600 experimental cross sections to demonstrate the applicability of the metrics. A cumulative uncertainty metric is applied to the question of overall model accuracy, while a metric based on the median uncertainty is used to analyze the models from the perspective of model development by analyzing subsets of the model parameter space.
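The abstract does not give the metrics' exact definitions, so the following is only a plausible reading: a cumulative relative uncertainty for overall accuracy and a median relative uncertainty for subset analysis, computed here on made-up cross-section values.

```python
import numpy as np

# Plausible stand-ins for the two uncertainty metrics described above;
# arrays are illustrative, not NUCFRG2/QMSFRG cross sections.
model = np.array([120.0, 85.0, 40.0, 230.0, 15.0])   # model predictions [mb]
exper = np.array([110.0, 90.0, 55.0, 210.0, 14.0])   # measurements [mb]

# Pointwise relative uncertainty of the model at each measurement.
rel_unc = np.abs(model - exper) / exper

# Cumulative metric: aggregate deviation normalized by total signal,
# addressing overall model accuracy across the whole database.
cumulative = np.abs(model - exper).sum() / exper.sum()

# Median metric: robust to outliers, suited to comparing model
# variants on subsets of the parameter space during development.
median_unc = np.median(rel_unc)
print(f"cumulative: {cumulative:.3f}, median: {median_unc:.3f}")
```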
FY2017 Pilot Project Plan for the Nuclear Energy Knowledge and Validation Center Initiative
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ren, Weiju
To prepare for technical development of computational code validation under the Nuclear Energy Knowledge and Validation Center (NEKVAC) initiative, several meetings were held by a group of experts from the Idaho National Laboratory (INL) and the Oak Ridge National Laboratory (ORNL) to develop requirements for, and formulate a structure of, a transient fuel database by leveraging existing resources. It was concluded in these meetings that a pilot project is needed to address the most fundamental issues, one that can generate immediate stimulus for near-future validation developments as well as long-lasting benefits for NEKVAC operation. The present project is proposed based on the consensus of these discussions. Analysis of common scenarios in code validation indicates that the inability to acquire satisfactory validation data is often a showstopper that must be tackled before any confident validation development can be carried out. Validation data are usually scattered across different places, often with interrelationships among the data poorly documented; the data may be incomplete, with information for some parameters missing, nonexistent, or unrealistic to generate experimentally. Furthermore, with very different technical backgrounds, the modeler, the experimentalist, and the knowledgebase developer who must be involved in validation data development often cannot communicate effectively without a data package template that is representative of the data structure for the information domain of interest to the desired code validation. This pilot project is proposed to use the legendary TREAT Experiments Database to provide core elements for creating an ideal validation data package. Data gaps and missing data interrelationships will be identified from these core elements. All identified missing elements will then be filled in with experimental data, if available from other existing sources, or with dummy data if nonexistent. The resulting hybrid validation data package (composed of experimental and dummy data) will provide a clear and complete instance delineating the structure of the desired validation data and enabling effective communication among the modeler, the experimentalist, and the knowledgebase developer. With a good common understanding of the desired data structure by the three parties of subject matter experts, searches for further existing data can be conducted effectively, new experimental data generation can be pursued realistically, the knowledgebase schema can be designed practically, and code validation can be planned confidently.
Hypersonic Experimental and Computational Capability, Improvement and Validation. Volume 2
NASA Technical Reports Server (NTRS)
Muylaert, Jean (Editor); Kumar, Ajay (Editor); Dujarric, Christian (Editor)
1998-01-01
The results of the phase 2 effort conducted under AGARD Working Group 18 on Hypersonic Experimental and Computational Capability, Improvement and Validation are presented in this report. The first volume, published in May 1996, mainly focused on the design methodology, plans and some initial results of experiments that had been conducted to serve as validation benchmarks. The current volume presents the detailed experimental and computational data base developed during this effort.
NASA Astrophysics Data System (ADS)
Zhang, Yaning; Xu, Fei; Li, Bingxi; Kim, Yong-Song; Zhao, Wenke; Xie, Gongnan; Fu, Zhongbin
2018-04-01
This study aims to validate the three-phase heat and mass transfer model developed in the first part (Three-phase heat and mass transfer model for unsaturated soil freezing process: Part 1 - model development). Experimental results from previous studies and from dedicated experiments were used for the validation. The results showed that the correlation coefficients for the simulated and experimental water contents at different soil depths were between 0.83 and 0.92. The correlation coefficients for the simulated and experimental liquid water contents at different soil temperatures were between 0.95 and 0.99. Given these high accuracies, the developed model can be used with confidence to predict the water contents at different soil depths and temperatures.
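The quoted agreement measure is an ordinary correlation coefficient between simulated and measured values; a minimal sketch with illustrative numbers:

```python
import numpy as np

# Illustrative simulated vs. measured water contents at several depths
# (volumetric fraction); not the study's data.
simulated = np.array([0.21, 0.24, 0.28, 0.31, 0.35])
measured = np.array([0.20, 0.26, 0.27, 0.33, 0.34])

# Pearson correlation coefficient between simulation and experiment,
# the agreement measure quoted in the abstract (0.83-0.99 there).
r = np.corrcoef(simulated, measured)[0, 1]
print(f"correlation coefficient: {r:.2f}")
```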
Issues and approach to develop validated analysis tools for hypersonic flows: One perspective
NASA Technical Reports Server (NTRS)
Deiwert, George S.
1993-01-01
Critical issues concerning the modeling of low density hypervelocity flows where thermochemical nonequilibrium effects are pronounced are discussed. Emphasis is on the development of validated analysis tools, and the activity in the NASA Ames Research Center's Aerothermodynamics Branch is described. Inherent in the process is a strong synergism between ground test and real gas computational fluid dynamics (CFD). Approaches to develop and/or enhance phenomenological models and incorporate them into computational flowfield simulation codes are discussed. These models were partially validated with experimental data for flows where the gas temperature is raised (compressive flows). Expanding flows, where temperatures drop, however, exhibit somewhat different behavior. Experimental data for these expanding flow conditions are sparse, and reliance must be placed on intuition and guidance from computational chemistry to model transport processes under these conditions. Ground based experimental studies used to provide necessary data for model development and validation are described. Included are the performance characteristics of high enthalpy flow facilities, such as shock tubes and ballistic ranges.
Issues and approach to develop validated analysis tools for hypersonic flows: One perspective
NASA Technical Reports Server (NTRS)
Deiwert, George S.
1992-01-01
Critical issues concerning the modeling of low-density hypervelocity flows where thermochemical nonequilibrium effects are pronounced are discussed. Emphasis is on the development of validated analysis tools. A description of the activity in the Ames Research Center's Aerothermodynamics Branch is also given. Inherent in the process is a strong synergism between ground test and real-gas computational fluid dynamics (CFD). Approaches to develop and/or enhance phenomenological models and incorporate them into computational flow-field simulation codes are discussed. These models have been partially validated with experimental data for flows where the gas temperature is raised (compressive flows). Expanding flows, where temperatures drop, however, exhibit somewhat different behavior. Experimental data for these expanding flow conditions are sparse; reliance must be made on intuition and guidance from computational chemistry to model transport processes under these conditions. Ground-based experimental studies used to provide necessary data for model development and validation are described. Included are the performance characteristics of high-enthalpy flow facilities, such as shock tubes and ballistic ranges.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Coleman, Justin Leigh; Smith, Curtis Lee; Burns, Douglas Edward
This report describes the development plan for a new multi-partner External Hazards Experimental Group (EHEG) coordinated by Idaho National Laboratory (INL) within the Risk-Informed Safety Margin Characterization (RISMC) technical pathway of the Light Water Reactor Sustainability Program. Currently, limited data are available for development and validation of the tools and methods in the RISMC Toolkit. The EHEG is being developed to obtain high-quality, small- and large-scale experimental data for validation of RISMC tools and methods in a timely and cost-effective way. The group of universities and national laboratories that will eventually form the EHEG (ultimately expected to include both the initial participants and other universities and national laboratories that have been identified) has the expertise and experimental capabilities needed both to obtain and compile existing data archives and to perform additional seismic and flooding experiments. The data developed by the EHEG will be stored in databases for use within RISMC. These databases will be used to validate the advanced external hazard tools and methods.
NASA Technical Reports Server (NTRS)
Baumeister, Joseph F.
1994-01-01
A non-flowing, electrically heated test rig was developed to verify computer codes that calculate radiant energy propagation from nozzle geometries that represent aircraft propulsion nozzle systems. Since there are a variety of analysis tools used to evaluate thermal radiation propagation from partially enclosed nozzle surfaces, an experimental benchmark test case was developed for code comparison. This paper briefly describes the nozzle test rig and the developed analytical nozzle geometry used to compare the experimental and predicted thermal radiation results. A major objective of this effort was to make available the experimental results and the analytical model in a format to facilitate conversion to existing computer code formats. For code validation purposes this nozzle geometry represents one validation case for one set of analysis conditions. Since each computer code has advantages and disadvantages based on scope, requirements, and desired accuracy, the usefulness of this single nozzle baseline validation case can be limited for some code comparisons.
Experimental design methodologies in the optimization of chiral CE or CEC separations: an overview.
Dejaegher, Bieke; Mangelings, Debby; Vander Heyden, Yvan
2013-01-01
In this chapter, an overview of experimental designs to develop chiral capillary electrophoresis (CE) and capillary electrochromatographic (CEC) methods is presented. Method development is generally divided into technique selection, method optimization, and method validation. In the method optimization part, two phases can often be distinguished, i.e., a screening and an optimization phase. In method validation, the method is evaluated for its fitness for purpose. One validation item that also applies experimental designs is robustness testing. In the screening phase and in robustness testing, screening designs are applied. During the optimization phase, response surface designs are used. The different design types and their application steps are discussed in this chapter and illustrated by examples of chiral CE and CEC methods.
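To make the screening/optimization distinction concrete, the sketch below generates a small two-level fractional factorial screening design (a 2^(4-1) design with the defining relation D = ABC); the factor names and levels are illustrative, not taken from the chapter.

```python
import itertools
import numpy as np

# Minimal sketch of a two-level fractional factorial screening design
# of the kind used in screening and robustness testing. Factors might
# be pH, buffer concentration, temperature, and voltage (illustrative);
# the fourth factor is aliased with the three-way interaction
# (D = ABC), giving 8 runs instead of 16.
base = np.array(list(itertools.product([-1, 1], repeat=3)))  # A, B, C
d = (base[:, 0] * base[:, 1] * base[:, 2]).reshape(-1, 1)    # D = ABC
design = np.hstack([base, d])

print("run  A  B  C  D")
for i, row in enumerate(design, 1):
    print(f"{i:>3} " + " ".join(f"{v:>2}" for v in row))
```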
Design, development, testing and validation of a Photonics Virtual Laboratory for the study of LEDs
NASA Astrophysics Data System (ADS)
Naranjo, Francisco L.; Martínez, Guadalupe; Pérez, Ángel L.; Pardo, Pedro J.
2014-07-01
This work presents the design, development, testing and validation of a Photonics Virtual Laboratory, highlighting the study of LEDs. The study was conducted from a conceptual, experimental and didactic standpoint, using e-learning and m-learning platforms. Specifically, teaching tools were developed that help ensure that students achieve meaningful learning. The scientific aspect, such as the study of LEDs, was brought together with techniques for the generation and transfer of knowledge through the selection, hierarchization and structuring of information using concept maps. For the validation of the didactic materials developed, procedures with various assessment tools were used for the collection and processing of data, applied in the context of an experimental design. Additionally, a statistical analysis was performed to determine the validity of the materials developed. The assessment was designed to validate the contributions of the new materials over the traditional method of teaching, and to quantify the learning achieved by students, in order to draw conclusions that serve as a reference for their application in teaching and learning processes, and to comprehensively validate the work carried out.
Methodological convergence of program evaluation designs.
Chacón-Moscoso, Salvador; Anguera, M Teresa; Sanduvete-Chaves, Susana; Sánchez-Martín, Milagrosa
2014-01-01
Nowadays, the dichotomous confrontation between experimental/quasi-experimental and non-experimental/ethnographic studies still exists; despite the extensive use of non-experimental/ethnographic studies, the most systematic work on methodological quality has been developed based on experimental and quasi-experimental studies. This hinders evaluators' and planners' practice of empirical program evaluation, a sphere in which the distinction between types of study is changing continually and is less clear. Based on the classical validity framework of experimental/quasi-experimental studies, we carry out a review of the literature in order to analyze the convergence of design elements bearing on methodological quality in primary studies in systematic reviews and in ethnographic research. We specify the relevant design elements that should be taken into account to improve validity and generalization in program evaluation practice across different methodologies, from a practical and complementary methodological view. We recommend ways to improve design elements so as to enhance validity and generalization in program evaluation practice.
Development of a Conservative Model Validation Approach for Reliable Analysis
2015-01-01
CIE 2015, August 2-5, 2015, Boston, Massachusetts, USA. [DRAFT] DETC2015-46982: Development of a Conservative Model Validation Approach for Reliable Analysis. ... obtain a conservative simulation model for reliable design even with limited experimental data. Very little research has taken into account the ... Section 3, the proposed conservative model validation is briefly compared to the conventional model validation approach. Section 4 describes how to account ...
Experimental validation of an ultrasonic flowmeter for unsteady flows
NASA Astrophysics Data System (ADS)
Leontidis, V.; Cuvier, C.; Caignaert, G.; Dupont, P.; Roussette, O.; Fammery, S.; Nivet, P.; Dazin, A.
2018-04-01
An ultrasonic flowmeter was developed for further applications in cryogenic conditions and for measuring flow rate fluctuations in the range of 0 to 70 Hz. The prototype was installed in a flow test rig and validated experimentally in both steady and unsteady water flow conditions. A Coriolis flowmeter was used for the calibration under steady-state conditions, whereas in the unsteady case the validation was done simultaneously against two methods: particle image velocimetry (PIV) and pressure transducers installed flush on the wall of the pipe. The results show that the developed flowmeter and the proposed methodology can accurately measure the frequency and amplitude of unsteady fluctuations over the experimental range of 0-9 L/s mean flow rate and 0-70 Hz imposed disturbances.
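To illustrate what "frequency and amplitude of unsteady fluctuations" means operationally, the sketch below recovers both from a synthetic flow-rate signal via an FFT; the sampling rate and signal values are invented, not the experiment's.

```python
import numpy as np

# Illustrative check of how the frequency and amplitude of an imposed
# flow-rate fluctuation could be recovered from a sampled signal.
fs = 1000.0                       # sampling rate [Hz]
t = np.arange(0, 5.0, 1.0 / fs)   # 5 s record
q = 5.0 + 0.4 * np.sin(2 * np.pi * 30.0 * t)  # mean 5 L/s, 30 Hz ripple

spec = np.fft.rfft(q - q.mean())
freqs = np.fft.rfftfreq(len(q), 1.0 / fs)
k = np.argmax(np.abs(spec))
amplitude = 2.0 * np.abs(spec[k]) / len(q)    # single-sided amplitude
print(f"dominant fluctuation: {freqs[k]:.1f} Hz, amplitude {amplitude:.2f} L/s")
```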
Cern, Ahuva; Barenholz, Yechezkel; Tropsha, Alexander; Goldblum, Amiram
2014-01-10
Previously we have developed and statistically validated Quantitative Structure Property Relationship (QSPR) models that correlate drugs' structural, physical and chemical properties as well as experimental conditions with the relative efficiency of remote loading of drugs into liposomes (Cern et al., J. Control. Release 160 (2012) 147-157). Herein, these models have been used to virtually screen a large drug database to identify novel candidate molecules for liposomal drug delivery. Computational hits were considered for experimental validation based on their predicted remote loading efficiency as well as additional considerations such as availability, recommended dose and relevance to the disease. Three compounds were selected for experimental testing which were confirmed to be correctly classified by our previously reported QSPR models developed with Iterative Stochastic Elimination (ISE) and k-Nearest Neighbors (kNN) approaches. In addition, 10 new molecules with known liposome remote loading efficiency that were not used by us in QSPR model development were identified in the published literature and employed as an additional model validation set. The external accuracy of the models was found to be as high as 82% or 92%, depending on the model. This study presents the first successful application of QSPR models for the computer-model-driven design of liposomal drugs.
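For readers unfamiliar with the external-validation step, the sketch below trains a kNN classifier on stand-in descriptors and scores it on a held-out external set, mirroring the workflow described above; it is not the authors' ISE/kNN model, and all data are random placeholders.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.metrics import accuracy_score

# Minimal sketch of external validation of a kNN classifier like the
# QSPR workflow described above; descriptors and labels are random
# stand-ins, not the paper's dataset.
rng = np.random.default_rng(1)
X_train = rng.normal(size=(100, 6))   # molecular descriptors (illustrative)
y_train = (X_train[:, 0] + X_train[:, 1] > 0).astype(int)  # 1 = high loading

model = KNeighborsClassifier(n_neighbors=5).fit(X_train, y_train)

# "External" set: molecules with known loading efficiency that were
# never used in model development, mirroring the 10-compound set above.
X_ext = rng.normal(size=(10, 6))
y_ext = (X_ext[:, 0] + X_ext[:, 1] > 0).astype(int)
print(f"external accuracy: {accuracy_score(y_ext, model.predict(X_ext)):.0%}")
```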
NASA Technical Reports Server (NTRS)
Stankovic, Ana V.
2003-01-01
Professor Stankovic will be developing and refining Simulink-based models of the PM alternator and comparing the simulation results with experimental measurements taken from the unit. Her first task is to validate the models using the experimental data. Her next task is to develop alternative control techniques for the application of the Brayton Cycle PM Alternator in a nuclear electric propulsion vehicle. The control techniques will first be simulated using the validated models and then tried experimentally with hardware available at NASA. Testing and simulation of a 2 kW PM synchronous generator with diode bridge output are described. The parameters of a synchronous PM generator have been measured and used in simulation. Test procedures have been developed to verify the PM generator model with diode bridge output. Experimental and simulation results are in excellent agreement.
CFD Validation Experiment of a Mach 2.5 Axisymmetric Shock-Wave/Boundary-Layer Interaction
NASA Technical Reports Server (NTRS)
Davis, David O.
2015-01-01
Experimental investigations of specific flow phenomena, e.g., Shock Wave Boundary-Layer Interactions (SWBLI), provide great insight into the flow behavior but often lack the necessary details to be useful as CFD validation experiments. Reasons include: (1) undefined boundary conditions and inconsistent results; (2) undocumented 3D effects (centerline-only measurements); and (3) lack of uncertainty analysis. While there are a number of good subsonic experimental investigations that are sufficiently documented to be considered test cases for CFD and turbulence model validation, the number of supersonic and hypersonic cases is much smaller. This was highlighted by Settles and Dodson's [1] comprehensive review of available supersonic and hypersonic experimental studies. In all, several hundred studies were considered for their database. Of these, over a hundred were subjected to rigorous acceptance criteria. Based on their criteria, only 19 (12 supersonic, 7 hypersonic) were considered of sufficient quality to be used for validation purposes. Aeschliman and Oberkampf [2] recognized the need to develop a specific methodology for experimental studies intended specifically for validation purposes.
Aerodynamic Database Development for Mars Smart Lander Vehicle Configurations
NASA Technical Reports Server (NTRS)
Bobskill, Glenn J.; Parikh, Paresh C.; Prabhu, Ramadas K.; Tyler, Erik D.
2002-01-01
An aerodynamic database has been generated for the Mars Smart Lander Shelf-All configuration using computational fluid dynamics (CFD) simulations. Three different CFD codes were used: USM3D and FELISA, based on unstructured grid technology, and LAURA, an established and validated structured CFD code. As part of this database development, the results for the Mars continuum regime were validated against experimental data and comparisons made where applicable. The validation of USM3D and LAURA with the Unitary experimental data, the use of intermediate LAURA check analyses, and the validation of FELISA with the Mach 6 CF4 experimental data provided higher confidence in the ability of CFD to provide aerodynamic data for determining the static trim characteristics for longitudinal stability. The analyses of the noncontinuum regime showed the existence of multiple trim angles of attack that can be either stable or unstable trim points. This information is needed to design the guidance controller throughout the trajectory.
Development of the Biological Experimental Design Concept Inventory (BEDCI)
ERIC Educational Resources Information Center
Deane, Thomas; Nomme, Kathy; Jeffery, Erica; Pollock, Carol; Birol, Gulnur
2014-01-01
Interest in student conception of experimentation inspired the development of a fully validated 14-question inventory on experimental design in biology (BEDCI) by following established best practices in concept inventory (CI) design. This CI can be used to diagnose specific examples of non-expert-like thinking in students and to evaluate the…
2016-05-24
... experimental data. However, the time and length scales, and energy deposition rates in the canonical laboratory flames that have been studied over the ... is to obtain high-fidelity experimental data critically needed to validate research codes at relevant conditions, and to develop systematic and ...
Principles for valid histopathologic scoring in research
Gibson-Corley, Katherine N.; Olivier, Alicia K.; Meyerholz, David K.
2013-01-01
Histopathologic scoring is a tool by which semi-quantitative data can be obtained from tissues. Initially, a thorough understanding of the experimental design, study objectives and methods are required to allow the pathologist to appropriately examine tissues and develop lesion scoring approaches. Many principles go into the development of a scoring system such as tissue examination, lesion identification, scoring definitions and consistency in interpretation. Masking (a.k.a. “blinding”) of the pathologist to experimental groups is often necessary to constrain bias and multiple mechanisms are available. Development of a tissue scoring system requires appreciation of the attributes and limitations of the data (e.g. nominal, ordinal, interval and ratio data) to be evaluated. Incidence, ordinal and rank methods of tissue scoring are demonstrated along with key principles for statistical analyses and reporting. Validation of a scoring system occurs through two principal measures: 1) validation of repeatability and 2) validation of tissue pathobiology. Understanding key principles of tissue scoring can help in the development and/or optimization of scoring systems so as to consistently yield meaningful and valid scoring data. PMID:23558974
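Validation of repeatability for an ordinal scoring system is often quantified with an inter-rater agreement statistic; the paper does not prescribe a specific one, so the sketch below uses a weighted Cohen's kappa on invented scores from two masked scorers.

```python
from sklearn.metrics import cohen_kappa_score

# Illustrative ordinal lesion scores (0-3) assigned to the same ten
# tissue sections by two masked pathologists; values are made up.
scorer_a = [0, 1, 2, 2, 3, 1, 0, 2, 3, 1]
scorer_b = [0, 1, 2, 3, 3, 1, 1, 2, 3, 1]

# Weighted kappa credits near-misses on an ordinal scale, a common
# way to validate repeatability of a scoring system.
kappa = cohen_kappa_score(scorer_a, scorer_b, weights="quadratic")
print(f"weighted kappa: {kappa:.2f}")
```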
Numerical Modelling of Femur Fracture and Experimental Validation Using Bone Simulant.
Marco, Miguel; Giner, Eugenio; Larraínzar-Garijo, Ricardo; Caeiro, José Ramón; Miguélez, María Henar
2017-10-01
Bone fracture pattern prediction is still a challenge and an active field of research. The main goal of this article is to present a combined methodology (experimental and numerical) for femur fracture onset analysis. Experimental work includes the characterization of the mechanical properties and fracture testing on a bone simulant. The numerical work focuses on the development of a model whose material properties are provided by the characterization tests. The fracture location and the early stages of the crack propagation are modelled using the extended finite element method and the model is validated by fracture tests developed in the experimental work. It is shown that the accuracy of the numerical results strongly depends on a proper bone behaviour characterization.
Experimental Validation of a Thermoelastic Model for SMA Hybrid Composites
NASA Technical Reports Server (NTRS)
Turner, Travis L.
2001-01-01
This study presents results from experimental validation of a recently developed model for predicting the thermomechanical behavior of shape memory alloy hybrid composite (SMAHC) structures, composite structures with an embedded SMA constituent. The model captures the material nonlinearity of the material system with temperature and is capable of modeling constrained, restrained, or free recovery behavior from experimental measurement of fundamental engineering properties. A brief description of the model and analysis procedures is given, followed by an overview of a parallel effort to fabricate and characterize the material system of SMAHC specimens. Static and dynamic experimental configurations for the SMAHC specimens are described and experimental results for thermal post-buckling and random response are presented. Excellent agreement is achieved between the measured and predicted results, fully validating the theoretical model for constrained recovery behavior of SMAHC structures.
Dasgupta, Annwesa P.; Anderson, Trevor R.
2014-01-01
It is essential to teach students about experimental design, as this facilitates their deeper understanding of how most biological knowledge was generated and gives them tools to perform their own investigations. Despite the importance of this area, surprisingly little is known about what students actually learn from designing biological experiments. In this paper, we describe a rubric for experimental design (RED) that can be used to measure knowledge of and diagnose difficulties with experimental design. The development and validation of the RED was informed by a literature review and empirical analysis of undergraduate biology students’ responses to three published assessments. Five areas of difficulty with experimental design were identified: the variable properties of an experimental subject; the manipulated variables; measurement of outcomes; accounting for variability; and the scope of inference appropriate for experimental findings. Our findings revealed that some difficulties, documented some 50 yr ago, still exist among our undergraduate students, while others remain poorly investigated. The RED shows great promise for diagnosing students’ experimental design knowledge in lecture settings, laboratory courses, research internships, and course-based undergraduate research experiences. It also shows potential for guiding the development and selection of assessment and instructional activities that foster experimental design. PMID:26086658
Validation of NASA Thermal Ice Protection Computer Codes. Part 1; Program Overview
NASA Technical Reports Server (NTRS)
Miller, Dean; Bond, Thomas; Sheldon, David; Wright, William; Langhals, Tammy; Al-Khalil, Kamel; Broughton, Howard
1996-01-01
The Icing Technology Branch at NASA Lewis has been involved in an effort to validate two thermal ice protection codes developed at the NASA Lewis Research Center: LEWICE/Thermal (electrothermal de-icing and anti-icing) and ANTICE (hot-gas and electrothermal anti-icing). The thermal code validation effort was designated as a priority during a 1994 'peer review' of the NASA Lewis icing program and was implemented as a cooperative effort with industry. During April 1996, the first of a series of experimental validation tests was conducted in the NASA Lewis Icing Research Tunnel (IRT). The purpose of the April 1996 test was to validate the electrothermal predictive capabilities of both LEWICE/Thermal and ANTICE. A heavily instrumented test article was designed and fabricated for this test, with the capability of simulating electrothermal de-icing and anti-icing modes of operation. Thermal measurements were then obtained over a range of test conditions for comparison with analytical predictions. This paper presents an overview of the test, including a detailed description of: (1) the validation process; (2) test article design; (3) test matrix development; and (4) test procedures. Selected experimental results are presented for de-icing and anti-icing modes of operation. Finally, the status of the validation effort is summarized. Detailed comparisons between analytical predictions and experimental results are contained in the following two papers: 'Validation of NASA Thermal Ice Protection Computer Codes: Part 2 - The Validation of LEWICE/Thermal' and 'Validation of NASA Thermal Ice Protection Computer Codes: Part 3 - The Validation of ANTICE'.
Development and validation of a 10-year-old child ligamentous cervical spine finite element model.
Dong, Liqiang; Li, Guangyao; Mao, Haojie; Marek, Stanley; Yang, King H
2013-12-01
Although a number of finite element (FE) adult cervical spine models have been developed to understand the injury mechanisms of the neck in automotive related crash scenarios, there have been fewer efforts to develop a child neck model. In this study, a 10-year-old ligamentous cervical spine FE model was developed for application in the improvement of pediatric safety related to motor vehicle crashes. The model geometry was obtained from medical scans and meshed using a multi-block approach. Appropriate properties based on review of literature in conjunction with scaling were assigned to different parts of the model. Child tensile force-deformation data in three segments, Occipital-C2 (C0-C2), C4-C5 and C6-C7, were used to validate the cervical spine model and predict failure forces and displacements. Design of computer experiments was performed to determine failure properties for intervertebral discs and ligaments needed to set up the FE model. The model-predicted ultimate displacements and forces were within the experimental range. The cervical spine FE model was validated in flexion and extension against the child experimental data in three segments, C0-C2, C4-C5 and C6-C7. Other model predictions were found to be consistent with the experimental responses scaled from adult data. The whole cervical spine model was also validated in tension, flexion and extension against the child experimental data. This study provided methods for developing a child ligamentous cervical spine FE model and to predict soft tissue failures in tension.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Utgikar, Vivek; Sun, Xiaodong; Christensen, Richard
2016-12-29
The overall goal of the research project was to model the behavior of the advanced reactor-intermediate heat exchange system and to develop advanced control techniques for off-normal conditions. The specific objectives defined for the project were: 1. To develop the steady-state thermal hydraulic design of the intermediate heat exchanger (IHX); 2. To develop mathematical models to describe the advanced nuclear reactor-IHX-chemical process/power generation coupling during normal and off-normal operations, and to simulate the models using multiphysics software; 3. To develop control strategies using genetic algorithm or neural network techniques and couple these techniques with the multiphysics software; 4. To validate the models experimentally. The project objectives were accomplished by defining and executing four tasks corresponding to these specific objectives. The first task involved selecting IHX candidates and developing steady-state designs for them. The second task involved modeling the transient and off-normal operation of the reactor-IHX system. The third task dealt with the development of control strategies and involved algorithm development and simulation. The last task involved experimental validation of the thermal hydraulic performance of the two prototype heat exchangers designed and fabricated for the project, at steady-state and transient conditions, to simulate the coupling of the reactor-IHX-process plant system. The experimental work utilized two test facilities at The Ohio State University (OSU): the existing High-Temperature Helium Test Facility (HTHF) and a newly developed high-temperature molten salt facility.
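As a toy illustration of the genetic-algorithm control development named in objective 3, the sketch below evolves a single feedback gain against a first-order plant surrogate; the plant, cost function, and GA settings are all illustrative assumptions, not the project's models.

```python
import numpy as np

# Toy genetic algorithm: evolve a scalar feedback gain that minimizes
# a quadratic tracking cost on a first-order plant surrogate.
rng = np.random.default_rng(2)

def cost(k):
    # Euler-integrate x' = -x + u with u = k * (setpoint - x);
    # accumulate squared tracking error over the horizon.
    x, err = 0.0, 0.0
    for _ in range(200):
        u = k * (1.0 - x)
        x += 0.05 * (-x + u)
        err += (1.0 - x) ** 2
    return err

pop = rng.uniform(0.0, 10.0, 20)                   # initial gains
for _ in range(30):                                # generations
    fitness = np.array([cost(k) for k in pop])
    parents = pop[np.argsort(fitness)[:10]]        # keep the best half
    children = parents + rng.normal(0.0, 0.3, 10)  # mutated offspring
    pop = np.concatenate([parents, children])
best = pop[np.argmin([cost(k) for k in pop])]
print(f"evolved gain: {best:.2f}")
```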
Computer Simulations of Coronary Blood Flow Through a Constriction
2014-03-01
... interventional procedures (e.g., stent deployment). Building off previous models that have been partially validated with experimental data, this thesis continues to develop the ... the artery and increase blood flow. Generally a stent, or a mesh wire tube, is permanently inserted in order to scaffold open the artery wall ...
Fuel Combustion and Engine Performance | Transportation Research | NREL
... Through modeling, simulation, and experimental validation, researchers examine what happens to fuel inside the engine. Combustion and engine research activities include: developing experimental and simulation research platforms; developing and refining accurate, efficient kinetic mechanisms for fuel ignition; and investigating low-speed pre...
Developing an Experimental Model of Vascular Toxicity in Embryonic Zebrafish
Developing an Experimental Model of Vascular Toxicity in Embryonic Zebrafish Tamara Tal, Integrated Systems Toxicology Division, U.S. EPA Background: There are tens of thousands of chemicals that have yet to be fully evaluated for their toxicity by validated in vivo testing ...
NASA Technical Reports Server (NTRS)
Stefanescu, D. M.; Catalina, A. V.; Juretzko, Frank R.; Sen, Subhayu; Curreri, P. A.
2003-01-01
The objectives of the work on Particle Engulfment and Pushing by Solidifying Interfaces (PEP) include: (1) to obtain fundamental understanding of the physics of particle pushing and engulfment, (2) to develop mathematical models to describe the phenomenon, and (3) to perform critical experiments in the microgravity environment of space to provide benchmark data for model validation. Successful completion of this project will yield vital information relevant to a diverse area of terrestrial applications. With PEP being a long-term research effort, this report focuses on advances in the theoretical treatment of the solid/liquid interface interaction with an approaching particle, experimental validation of some aspects of the developed models, and the experimental design aspects of future experiments to be performed on board the International Space Station.
Experimental validation of thermo-chemical algorithm for a simulation of pultrusion processes
NASA Astrophysics Data System (ADS)
Barkanov, E.; Akishin, P.; Miazza, N. L.; Galvez, S.; Pantelelis, N.
2018-04-01
To provide a better understanding of pultrusion processes, with or without temperature control, and to support pultrusion tooling design, an algorithm based on a mixed time-integration scheme and the nodal control volumes method has been developed. In the present study, its experimental validation is carried out with the developed cure sensors, which measure the electrical resistivity and temperature on the profile surface. Through this verification process, the set of initial data used for a simulation of the pultrusion process with a rod profile has been successfully corrected and finally defined.
Southern Regional Center for Lightweight Innovative Design
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, Paul T.
The Southern Regional Center for Lightweight Innovative Design (SRCLID) has developed an experimentally validated cradle-to-grave modeling and simulation effort to optimize automotive components in order to decrease weight and cost, yet increase performance and safety in crash scenarios. In summary, the three major objectives of this project were accomplished: to develop experimentally validated cradle-to-grave modeling and simulation tools to optimize automotive and truck components for lightweighting materials (aluminum, steel, and Mg alloys and polymer-based composites) with consideration of uncertainty, in order to decrease weight and cost yet increase performance and safety in impact scenarios; to develop multiscale computational models that quantify microstructure-property relations by evaluating various length scales, from the atomic through component levels, for each step of the manufacturing process for vehicles; and to develop an integrated K-12 educational program to educate students on lightweighting designs and impact scenarios. In this final report, the content is divided into two parts: the first part contains the development of building blocks for the project, including materials and process models, process-structure-property (PSP) relationships, and experimental validation capabilities; the second part presents the demonstration task for Mg front-end work associated with USAMP projects.
Recent Advances in Simulation of Eddy Current Testing of Tubes and Experimental Validations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Reboud, C.; Premel, D.; Lesselier, D.
2007-03-21
Eddy current testing (ECT) is widely used in the iron and steel industry for the inspection of tubes during manufacturing. A collaboration between CEA and the Vallourec Research Center led to the development of new numerical functionalities dedicated to the simulation of ECT of non-magnetic tubes by external probes. The achievement of experimental validations led us to integrate these models into the CIVA platform. Modeling approach and validation results are discussed here. A new numerical scheme is also proposed in order to improve the accuracy of the model.
OECD-NEA Expert Group on Multi-Physics Experimental Data, Benchmarks and Validation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Valentine, Timothy; Rohatgi, Upendra S.
High-fidelity, multi-physics modeling and simulation (M&S) tools are being developed and utilized for a variety of applications in nuclear science and technology and show great promise in their abilities to reproduce observed phenomena for many applications. Even with the increasing fidelity and sophistication of coupled multi-physics M&S tools, the underpinning models and data still need to be validated against experiments that may require a more complex array of validation data because of the great breadth of the time, energy and spatial domains of the physical phenomena that are being simulated. The Expert Group on Multi-Physics Experimental Data, Benchmarks and Validation (MPEBV) of the Nuclear Energy Agency (NEA) of the Organization for Economic Cooperation and Development (OECD) was formed to address the challenges with the validation of such tools. The work of the MPEBV expert group is shared among three task forces to fulfill its mandate, and specific exercises are being developed to demonstrate validation principles for common industrial challenges. This paper describes the overall mission of the group, the specific objectives of the task forces, the linkages among the task forces, and the development of a validation exercise that focuses on a specific reactor challenge problem.
Phase 1 Validation Testing and Simulation for the WEC-Sim Open Source Code
NASA Astrophysics Data System (ADS)
Ruehl, K.; Michelen, C.; Gunawan, B.; Bosma, B.; Simmons, A.; Lomonaco, P.
2015-12-01
WEC-Sim is an open source code to model wave energy converter performance in operational waves, developed by Sandia and NREL and funded by the US DOE. The code is a time-domain modeling tool developed in MATLAB/Simulink using the multibody dynamics solver SimMechanics, and solves the WEC's governing equations of motion using the Cummins time-domain impulse response formulation in 6 degrees of freedom. The WEC-Sim code has undergone verification through code-to-code comparisons; however, validation of the code has been limited to publicly available experimental data sets. While these data sets provide preliminary code validation, the experimental tests were not explicitly designed for code validation and, as a result, are limited in their ability to validate the full functionality of the WEC-Sim code. Therefore, dedicated physical model tests for WEC-Sim validation have been performed. This presentation provides an overview of the WEC-Sim validation experimental wave tank tests performed at Oregon State University's Directional Wave Basin at the Hinsdale Wave Research Laboratory. Phase 1 of experimental testing was focused on device characterization and completed in Fall 2015. Phase 2 is focused on WEC performance and scheduled for Winter 2015/2016. These experimental tests were designed explicitly to validate the performance of the WEC-Sim code and its new feature additions. Upon completion, the WEC-Sim validation data set will be made publicly available to the wave energy community. For the physical model test, a controllable model of a floating wave energy converter has been designed and constructed. The instrumentation includes state-of-the-art devices to measure pressure fields, motions in 6 DOF, multi-axial load cells, torque transducers, position transducers, and encoders. The model also incorporates a fully programmable power take-off system which can be used to generate or absorb wave energy. Numerical simulations of the experiments using WEC-Sim will be presented. These simulations highlight the code features included in the latest release of WEC-Sim (v1.2), including: wave directionality, nonlinear hydrostatics and hydrodynamics, user-defined wave elevation time series, state-space radiation, and WEC-Sim compatibility with BEMIO (an open source AQWA/WAMIT/NEMOH coefficient parser).
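For reference, the Cummins impulse-response formulation named above is commonly written as follows; the notation is the generic textbook form rather than WEC-Sim's internal variables.

```latex
% Cummins equation of motion for a floating body (standard form):
%   M      rigid-body mass matrix
%   A_inf  infinite-frequency added-mass matrix
%   K      radiation impulse-response (memory) kernel
%   C      linear hydrostatic restoring matrix
%   F_exc  wave excitation force; F_PTO power take-off force
\[
  (M + A_\infty)\,\ddot{X}(t)
  + \int_0^t K(t - \tau)\,\dot{X}(\tau)\,\mathrm{d}\tau
  + C\,X(t)
  = F_{\mathrm{exc}}(t) + F_{\mathrm{PTO}}(t)
\]
```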
An experimental validation of a statistical-based damage detection approach.
DOT National Transportation Integrated Search
2011-01-01
In this work, a previously developed, statistical-based damage-detection approach was validated for its ability to autonomously detect damage in bridges. The damage-detection approach uses statistical differences in the actual and predicted beha...
Experimental Validation of the Piezoelectric Triple Hybrid Actuation System (TriHYBAS)
NASA Technical Reports Server (NTRS)
Xu, Tian-Bing; Jiang, Xiaoning; Su, Ji
2008-01-01
A piezoelectric triple hybrid actuation system (TriHYBAS) has been developed. In this brief presentation of the validation process the displacement profile of TriHYBAS and findings regarding displacement versus applied voltage are highlighted.
NASA Astrophysics Data System (ADS)
Johnston, Michael A.; Farrell, Damien; Nielsen, Jens Erik
2012-04-01
The exchange of information between experimentalists and theoreticians is crucial to improving the predictive ability of theoretical methods and hence our understanding of the related biology. However many barriers exist which prevent the flow of information between the two disciplines. Enabling effective collaboration requires that experimentalists can easily apply computational tools to their data, share their data with theoreticians, and that both the experimental data and computational results are accessible to the wider community. We present a prototype collaborative environment for developing and validating predictive tools for protein biophysical characteristics. The environment is built on two central components; a new python-based integration module which allows theoreticians to provide and manage remote access to their programs; and PEATDB, a program for storing and sharing experimental data from protein biophysical characterisation studies. We demonstrate our approach by integrating PEATSA, a web-based service for predicting changes in protein biophysical characteristics, into PEATDB. Furthermore, we illustrate how the resulting environment aids method development using the Potapov dataset of experimentally measured ΔΔGfold values, previously employed to validate and train protein stability prediction algorithms.
Energy transformation, transfer, and release dynamics in high speed turbulent flows
2017-03-01
... experimental techniques developed allowed non-intrusive measurement of convecting velocity fields in supersonic flows and were used for validation of LES of ... by the absence of (near-)normal shocks that normal injection generates. New experimental techniques were developed that allowed the non-intrusive ... and was comprised of several parts in which significant accomplishments were made: 1. An experimental effort focusing on investigations in: a ...
2-D Circulation Control Airfoil Benchmark Experiments Intended for CFD Code Validation
NASA Technical Reports Server (NTRS)
Englar, Robert J.; Jones, Gregory S.; Allan, Brian G.; Lin, John C.
2009-01-01
A current NASA Research Announcement (NRA) project being conducted by Georgia Tech Research Institute (GTRI) personnel and NASA collaborators includes the development of Circulation Control (CC) blown airfoils to improve subsonic aircraft high-lift and cruise performance. The emphasis of this program is the development of CC active flow control concepts for high-lift augmentation, drag control, and cruise efficiency. The project includes work by NASA research engineers, whereas CFD validation and flow-physics experimental research are part of NASA's systematic approach to developing design and optimization tools for CC applications to fixed-wing aircraft. The design space for CESTOL-type aircraft is focusing on geometries that depend on advanced flow control technologies, including Circulation Control aerodynamics. The ability to consistently predict advanced aircraft performance requires improvements in design tools to include these advanced concepts. Validation of these tools will be based on experimental methods applied to complex flows that go beyond conventional aircraft modeling techniques. This paper focuses on recent and ongoing benchmark high-lift experiments and CFD efforts intended to provide 2-D CFD validation data sets related to NASA's Cruise Efficient Short Take Off and Landing (CESTOL) study. Both the experimental data and related CFD predictions are discussed.
An Integrated Study on a Novel High Temperature High Entropy Alloy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yang, Shizhong
2016-12-31
This report summarizes our recent work on theoretical modeling, simulation, and experimental validation of the simulation results for new refractory high entropy alloy (HEA) design and oxide-doped refractory HEA research. Simulations of the stability and thermodynamics of potentially thermally stable candidates were performed, and related oxide-doped HEA samples were synthesized and characterized. The development of ab initio density functional theory and molecular dynamics methods for simulating HEA physical properties and of experimental texture validation techniques, the achievements already reached, course work development, student and postdoc training, and future research directions for improvement are briefly introduced.
NASA Astrophysics Data System (ADS)
Maślak, Mariusz; Pazdanowski, Michał; Woźniczka, Piotr
2018-01-01
Validation of fire resistance for the same steel frame bearing structure is performed here using three different numerical models: a bar model prepared in the SAFIR environment and two 3D models, one developed within the framework of Autodesk Simulation Mechanical (ASM) and an alternative one developed in the environment of the Abaqus code. The results of the computer simulations performed are compared with the experimental results obtained previously in a laboratory fire test on a structure having the same characteristics and subjected to the same heating regime. Comparison of the experimentally and numerically determined displacement evolution paths for selected nodes of the considered frame during the simulated fire exposure constitutes the basic criterion applied to evaluate the validity of the numerical results. The experimentally and numerically determined estimates of the critical temperature specific to the considered frame, related to the limit state of bearing capacity in fire, have been verified as well.
Fast scattering simulation tool for multi-energy x-ray imaging
NASA Astrophysics Data System (ADS)
Sossin, A.; Tabary, J.; Rebuffel, V.; Létang, J. M.; Freud, N.; Verger, L.
2015-12-01
A combination of Monte Carlo (MC) and deterministic approaches was employed to create a simulation tool capable of providing energy-resolved x-ray primary and scatter images within a reasonable time interval. Libraries of Sindbad, a previously developed x-ray simulation software package, were used in the development. The scatter simulation capabilities of the tool were validated against simulations performed with GATE and against experiments using a spectrometric CdTe detector. A simple cylindrical phantom with cavities and an aluminum insert was used. Cross-validation with GATE showed good agreement, with a global spatial error of 1.5% and a maximum scatter spectrum error of around 6%. Experimental validation also supported the accuracy of the simulations obtained from the developed software, with a global spatial error of 1.8% and a maximum error of around 8.5% in the scatter spectra.
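The abstract quotes a "global spatial error" and a maximum spectral error without defining them, so the sketch below computes plausible stand-in versions (total absolute image error normalized by total signal, and the largest per-bin relative spectrum error) on made-up arrays.

```python
import numpy as np

# Plausible stand-in definitions for the quoted agreement figures; the
# arrays here are illustrative, not real scatter images or spectra.
rng = np.random.default_rng(3)
sim_image = rng.uniform(0.95, 1.05, (64, 64))   # simulated scatter image
ref_image = np.ones((64, 64))                   # reference (GATE / measured)

# Global spatial error: total absolute deviation over the image,
# normalized by the total reference signal.
global_spatial_error = np.abs(sim_image - ref_image).sum() / ref_image.sum()

# Maximum scatter-spectrum error: worst per-bin relative deviation.
sim_spec = np.array([0.98, 1.03, 0.95, 1.06])
ref_spec = np.ones(4)
max_spectrum_error = np.max(np.abs(sim_spec - ref_spec) / ref_spec)
print(f"global: {global_spatial_error:.1%}, max spectral: {max_spectrum_error:.1%}")
```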
Selecting and Improving Quasi-Experimental Designs in Effectiveness and Implementation Research.
Handley, Margaret A; Lyles, Courtney R; McCulloch, Charles; Cattamanchi, Adithya
2018-04-01
Interventional researchers face many design challenges when assessing intervention implementation in real-world settings. Intervention implementation requires holding fast to internal validity needs while incorporating external validity considerations (such as uptake by diverse subpopulations, acceptability, cost, and sustainability). Quasi-experimental designs (QEDs) are increasingly employed to achieve a balance between internal and external validity. Although these designs are often referred to and summarized in terms of logistical benefits, there is still uncertainty about (a) selecting from among various QEDs and (b) developing strategies to strengthen the internal and external validity of QEDs. We focus here on commonly used QEDs (pre-post designs with nonequivalent control groups, interrupted time series, and stepped-wedge designs) and discuss several variants that maximize internal and external validity at the design, execution and implementation, and analysis stages.
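As a concrete example of one QED named above, the sketch below fits a segmented regression for an interrupted time series, estimating the level and trend changes at the intervention; the data are simulated, not from any real evaluation.

```python
import numpy as np

# Minimal interrupted-time-series sketch: segmented regression with a
# level shift and a slope change at the intervention point.
rng = np.random.default_rng(4)
t = np.arange(24)                 # 24 monthly observations
post = (t >= 12).astype(float)    # intervention at month 12
y = 10 + 0.2 * t + 3.0 * post + 0.5 * post * (t - 12) + rng.normal(0, 1, 24)

# Design matrix: intercept, baseline trend, level change, trend change.
X = np.column_stack([np.ones_like(t, dtype=float), t, post, post * (t - 12)])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print(f"level change: {beta[2]:.2f}, trend change: {beta[3]:.2f}")
```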
NASA Astrophysics Data System (ADS)
Nir, A.; Doughty, C.; Tsang, C. F.
Validation methods that developed in the context of the deterministic concepts of past generations often cannot be applied directly to environmental problems, which may be characterized by limited reproducibility of results and highly complex models. Instead, validation is interpreted here as a series of activities, including both theoretical and experimental tests, designed to enhance our confidence in the capability of a proposed model to describe some aspect of reality. We examine the validation process applied to a project concerned with heat and fluid transport in porous media, in which mathematical modeling, simulation, and results of field experiments are evaluated in order to determine the feasibility of a system for seasonal thermal energy storage in shallow unsaturated soils. Technical details of the field experiments are not included, but appear in previous publications. Validation activities are divided into three stages. The first stage, carried out prior to the field experiments, is concerned with modeling the relevant physical processes, optimization of the heat-exchanger configuration and the shape of the storage volume, and multi-year simulation. Subjects requiring further theoretical and experimental study are identified at this stage. The second stage encompasses the planning and evaluation of the initial field experiment. Simulations are made to determine the experimental time scale and optimal sensor locations. Soil thermal parameters and temperature boundary conditions are estimated using an inverse method. Then results of the experiment are compared with model predictions using different parameter values and modeling approximations. In the third stage, results of an experiment performed under different boundary conditions are compared to predictions made by the models developed in the second stage. Various aspects of this theoretical and experimental field study are described as examples of the verification and validation procedure. There is no attempt to validate a specific model; rather, several models of increasing complexity are compared with experimental results. The outcome is interpreted as a demonstration of the paradigm proposed by van der Heijde [26], that different constituencies have different objectives for the validation process and therefore their acceptance criteria differ also.
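The inverse parameter estimation mentioned in the second stage can be illustrated with a minimal example: fitting a soil thermal diffusivity to temperature data using the analytic step-heating solution for a semi-infinite medium. The model choice and every number below are assumptions for illustration, not the project's actual inversion.

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.special import erfc

# Estimate soil thermal diffusivity from temperature measurements by
# fitting the semi-infinite-medium step-heating solution.
def temperature(t, alpha, T0=15.0, Ts=45.0, depth=0.5):
    # T(x,t) for a surface suddenly held at Ts, initial temperature T0.
    return T0 + (Ts - T0) * erfc(depth / (2.0 * np.sqrt(alpha * t)))

# Synthetic "measurements" at 0.5 m depth with a little noise.
t_obs = np.linspace(1e4, 2e6, 15)                       # times [s]
T_obs = temperature(t_obs, 5e-7) + np.random.default_rng(5).normal(0, 0.1, 15)

# Inverse step: least-squares fit of alpha only (other args defaulted).
(alpha_hat,), _ = curve_fit(temperature, t_obs, T_obs, p0=[1e-6])
print(f"estimated diffusivity: {alpha_hat:.2e} m^2/s")
```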
USDA-ARS?s Scientific Manuscript database
A predictive mathematical model was developed to simulate heat transfer in a tomato undergoing double sided infrared (IR) heating in a dry-peeling process. The aims of this study were to validate the developed model using experimental data and to investigate different engineering parameters that mos...
ENFIN--A European network for integrative systems biology.
Kahlem, Pascal; Clegg, Andrew; Reisinger, Florian; Xenarios, Ioannis; Hermjakob, Henning; Orengo, Christine; Birney, Ewan
2009-11-01
Integration of biological data of various types and the development of adapted bioinformatics tools represent critical objectives to enable research at the systems level. The European Network of Excellence ENFIN is engaged in developing an adapted infrastructure to connect databases and platforms, enabling both the generation of new bioinformatics tools and the experimental validation of computational predictions. With the aim of bridging the gap existing between standard wet laboratories and bioinformatics, the ENFIN Network runs integrative research projects to bring the latest computational techniques to bear directly on questions dedicated to systems biology in the wet laboratory environment. The Network maintains internally close collaboration between experimental and computational research, enabling a permanent cycling of experimental validation and improvement of computational prediction methods. The computational work includes the development of a database infrastructure (EnCORE), bioinformatics analysis methods, and a novel platform for protein function analysis, FuncNet.
Experimentally validated finite element model of electrocaloric multilayer ceramic structures
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, N. A. S. (nadia.smith@npl.co.uk); Correia, T. M. (tatiana.correia@npl.co.uk); Rokosz, M. K. (maciej.rokosz@npl.co.uk)
2014-07-28
A novel finite element model to simulate the electrocaloric response of a multilayer ceramic capacitor (MLCC) under real environment and operational conditions has been developed. The two-dimensional transient conductive heat transfer model presented includes the electrocaloric effect as a source term, as well as accounting for radiative and convective effects. The model has been validated with experimental data obtained from the direct imaging of MLCC transient temperature variation under application of an electric field. The good agreement between simulated and experimental data suggests that the novel experimental direct measurement methodology and the finite element model could be used to support the design of optimised electrocaloric units and operating conditions.
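The structure of such a model, transient conduction with the electrocaloric effect entering as a source term and convective losses at the boundaries, can be sketched at toy scale with an explicit finite-difference update. Material properties, geometry, and the source magnitude below are made-up placeholders; this illustrates only the governing-equation structure, not the authors' finite element model.

```python
import numpy as np

# Toy 2D transient conduction with an electrocaloric source term and a
# crude convective loss, illustrating the model structure only.
nx = ny = 41
dx = 1e-4                            # 0.1 mm grid spacing [m]
k, rho, cp = 1.5, 6000.0, 500.0      # made-up MLCC-like properties
alpha = k / (rho * cp)
h = 20.0                             # convective coefficient [W/m^2 K]
T_env = 293.15
dt = 0.2 * dx**2 / alpha             # well under the explicit stability limit

T = np.full((ny, nx), T_env)
q_ec = 2e7        # EC heating power density while the field is on [W/m^3]

for step in range(2000):
    src = q_ec if step * dt < 0.05 else 0.0   # field applied for 50 ms
    lap = (np.roll(T, 1, 0) + np.roll(T, -1, 0) +
           np.roll(T, 1, 1) + np.roll(T, -1, 1) - 4 * T) / dx**2
    T += dt * (alpha * lap + src / (rho * cp))
    # crude convective loss applied on all four edges
    for edge in (T[0], T[-1], T[:, 0], T[:, -1]):
        edge -= dt * h * (edge - T_env) / (rho * cp * dx)

print(f"peak temperature rise: {T.max() - T_env:.3f} K")
```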
DOE Office of Scientific and Technical Information (OSTI.GOV)
Woods, Brian; Gutowska, Izabela; Chiger, Howard
Computer simulations of nuclear reactor thermal-hydraulic phenomena are often used in the design and licensing of nuclear reactor systems. In order to assess the accuracy of these computer simulations, computer codes and methods are often validated against experimental data. This experimental data must be of sufficiently high quality in order to conduct a robust validation exercise. In addition, this experimental data is generally collected at experimental facilities that are of a smaller scale than the reactor systems that are being simulated due to cost considerations. Therefore, smaller scale test facilities must be designed and constructed in such a fashion to ensure that the prototypical behavior of a particular nuclear reactor system is preserved. The work completed through this project has resulted in scaling analyses and conceptual design development for a test facility capable of collecting code validation data for the following high temperature gas reactor systems and events: 1. passive natural circulation core cooling system, 2. pebble bed gas reactor concept, 3. General Atomics Energy Multiplier Module reactor, and 4. prismatic block design steam-water ingress event. In the event that code validation data for these systems or events is needed in the future, significant progress in the design of an appropriate integral-type test facility has already been completed as a result of this project. Where applicable, the next step would be to begin the detailed design development and material procurement. As part of this project, applicable scaling analyses were completed and test facility design requirements developed. Conceptual designs were developed for the implementation of these design requirements at the Oregon State University (OSU) High Temperature Test Facility (HTTF). The original HTTF is based on a ¼-scale model of a high temperature gas reactor concept with the capability for both forced and natural circulation flow through a prismatic core with an electrical heat source. The peak core region temperature capability is 1400°C. As part of this project, an inventory of test facilities that could be used for these experimental programs was completed. Several of these facilities showed some promise; however, upon further investigation it became clear that only the OSU HTTF had the power and/or peak temperature limits that would allow for the experimental programs envisioned herein. Thus the conceptual design and feasibility study development focused on examining the feasibility of configuring the current HTTF to collect validation data for these experimental programs. In addition to the scaling analyses and conceptual design development, a test plan was developed for the envisioned modified test facility. This test plan included a discussion of an appropriate shakedown test program as well as the specific matrix tests. Finally, a feasibility study was completed to determine the cost and schedule considerations that would be important to any test program developed to investigate these designs and events.
Rossi, Michael R.; Tanaka, Daigo; Shimada, Kenji; Rabin, Yoed
2009-01-01
The current study focuses on experimentally validating a planning scheme based on the so-called bubble-packing method. This study is a part of an ongoing effort to develop computerized planning tools for cryosurgery, where bubble packing has been previously developed as a means to find an initial, uniform distribution of cryoprobes within a given domain; the so-called force-field analogy was then used to move cryoprobes to their optimum layout. However, due to the high quality of the cryoprobe distribution suggested by bubble packing, and its low computational cost, it has been argued that a planning scheme based solely on bubble packing may be more clinically relevant. To test this argument, an experimental validation is performed on a simulated cross-section of the prostate, using gelatin solution as a phantom material, proprietary liquid-nitrogen based cryoprobes, and a cryoheater to simulate urethral warming. Experimental results are compared with numerically simulated temperature histories resulting from planning. Results indicate an average disagreement of 0.8 mm in identifying the freezing front location, which is an acceptable level of uncertainty in the context of prostate cryosurgery imaging. PMID:19885373
Development and application of transcriptomics-based gene classifiers for ecotoxicological applications lag far behind those of human biomedical science. Many such classifiers discovered thus far lack vigorous statistical and experimental validations, with their stability and rel...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liu, Zi-Kui; Gleeson, Brian; Shang, Shunli
This project developed computational tools that can complement and support experimental efforts in order to enable discovery and more efficient development of Ni-base structural materials and coatings. The project goal was reached through an integrated computation-predictive and experimental-validation approach, including first-principles calculations, thermodynamic CALPHAD (CALculation of PHAse Diagram), and experimental investigations on compositions relevant to Ni-base superalloys and coatings in terms of oxide layer growth and microstructure stabilities. The developed description included composition ranges typical for coating alloys and, hence, allows for prediction of thermodynamic properties for these material systems. The calculation of phase compositions, phase fractions, and phase stabilities, which are directly related to properties such as ductility and strength, was a valuable contribution, along with the collection of computational tools that are required to meet the increasing demands for strong, ductile and environmentally-protective coatings. Specifically, a suitable thermodynamic description for the Ni-Al-Cr-Co-Si-Hf-Y system was developed for bulk alloy and coating compositions. Experiments were performed to validate and refine the thermodynamics from the CALPHAD modeling approach. Additionally, alloys produced using predictions from the current computational models were studied in terms of their oxidation performance. Finally, results obtained from experiments aided in the development of a thermodynamic modeling automation tool called ESPEI/pycalphad for more rapid discovery and development of new materials.
Experimental Validation of a Closed Brayton Cycle System Transient Simulation
NASA Technical Reports Server (NTRS)
Johnson, Paul K.; Hervol, David S.
2006-01-01
The Brayton Power Conversion Unit (BPCU) located at NASA Glenn Research Center (GRC) in Cleveland, Ohio was used to validate the results of a computational code known as Closed Cycle System Simulation (CCSS). Conversion system thermal transient behavior was the focus of this validation. The BPCU was operated at various steady state points and then subjected to transient changes involving shaft rotational speed and thermal energy input. These conditions were then duplicated in CCSS. Validation of the CCSS BPCU model provides confidence in developing future Brayton power system performance predictions, and helps to guide high power Brayton technology development.
Software aspects of the Geant4 validation repository
NASA Astrophysics Data System (ADS)
Dotti, Andrea; Wenzel, Hans; Elvira, Daniel; Genser, Krzysztof; Yarba, Julia; Carminati, Federico; Folger, Gunter; Konstantinov, Dmitri; Pokorski, Witold; Ribon, Alberto
2017-10-01
The Geant4, GeantV and GENIE collaborations regularly perform validation and regression tests for simulation results. DoSSiER (Database of Scientific Simulation and Experimental Results) is being developed as a central repository to store the simulation results as well as the experimental data used for validation. DoSSiER is easily accessible via a web application. In addition, a web service allows for programmatic access to the repository to extract records in JSON or XML exchange formats. In this article, we describe the functionality and the current status of various components of DoSSiER as well as the technology choices we made.
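Programmatic access of the kind described would look roughly like the sketch below. The host, path, query parameters, and record fields are invented placeholders for illustration; the real endpoint and schema are defined by the DoSSiER service itself.

```python
import json
import urllib.request

# Hypothetical query against a DoSSiER-style web service; the host, path,
# and parameter names below are placeholders, not the real DoSSiER API.
url = "https://example.org/dossier/api/records?experiment=thin-target&format=json"
with urllib.request.urlopen(url) as resp:
    records = json.load(resp)

# Each record would pair simulated and measured values for one observable.
for rec in records.get("results", [])[:5]:
    print(rec.get("observable"), rec.get("beam"), rec.get("reference"))
```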
Software Aspects of the Geant4 Validation Repository
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dotti, Andrea; Wenzel, Hans; Elvira, Daniel
2016-01-01
The Geant4, GeantV and GENIE collaborations regularly perform validation and regression tests for simulation results. DoSSiER (Database of Scientific Simulation and Experimental Results) is being developed as a central repository to store the simulation results as well as the experimental data used for validation. DoSSiER is easily accessible via a web application. In addition, a web service allows for programmatic access to the repository to extract records in JSON or XML exchange formats. In this article, we describe the functionality and the current status of various components of DoSSiER as well as the technology choices we made.
NASA Technical Reports Server (NTRS)
Cognata, Thomas J.; Leimkuehler, Thomas O.; Sheth, Rubik B.; Le, Hung
2012-01-01
The Fusible Heat Sink is a novel vehicle heat rejection technology which combines a flow through radiator with a phase change material. The combined technologies create a multi-function device able to shield crew members against Solar Particle Events (SPE), reduce radiator extent by permitting sizing to the average vehicle heat load rather than to the peak vehicle heat load, and to substantially absorb heat load excursions from the average while constantly maintaining thermal control system setpoints. This multi-function technology provides great flexibility for mission planning, making it possible to operate a vehicle in hot or cold environments and under high or low heat load conditions for extended periods of time. This paper describes the model development and experimental validation of the Fusible Heat Sink technology. The model developed was intended to meet the radiation and heat rejection requirements of a nominal MMSEV mission. Development parameters and results, including sizing and model performance will be discussed. From this flight-sized model, a scaled test-article design was modeled, designed, and fabricated for experimental validation of the technology at Johnson Space Center thermal vacuum chamber facilities. Testing showed performance comparable to the model at nominal loads and the capability to maintain heat loads substantially greater than nominal for extended periods of time.
Morsink, Maarten C; Dukers, Danny F
2009-03-01
Animal models have been widely used for studying the physiology and pharmacology of psychiatric and neurological diseases. The concepts of face, construct, and predictive validity are used as indicators to estimate the extent to which the animal model mimics the disease. Here, we used these three concepts to design a theoretical assignment to integrate the teaching of neurophysiology, neuropharmacology, and experimental design. For this purpose, seven case studies were developed in which animal models for several psychiatric and neurological diseases were described and in which neuroactive drugs used to treat or study these diseases were introduced. Groups of undergraduate students were assigned to one of these case studies and asked to give a classroom presentation in which 1) the disease and underlying pathophysiology are described, 2) face and construct validity of the animal model are discussed, and 3) a pharmacological experiment with the associated neuroactive drug to assess predictive validity is presented. After evaluation of the presentations, we found that the students had gained considerable insight into disease phenomenology, its underlying neurophysiology, and the mechanism of action of the neuroactive drug. Moreover, the assignment was very useful in the teaching of experimental design, allowing an in-depth discussion of experimental control groups and the prediction of outcomes in these groups if the animal model were to display predictive validity. Finally, the highly positive responses in the student evaluation forms indicated that the assignment was of great interest to the students. Hence, the currently developed case studies constitute a very useful tool for teaching neurophysiology, neuropharmacology, and experimental design.
Escaño, Mary Clare Sison; Arevalo, Ryan Lacdao; Gyenge, Elod; Kasai, Hideaki
2014-09-03
The electrocatalysis of borohydride oxidation is a complex, up-to-eight-electron transfer process, which is essential for development of efficient direct borohydride fuel cells. Here we review the progress achieved by density functional theory (DFT) calculations in explaining the adsorption of BH4(-) on various catalyst surfaces, with implications for electrocatalyst screening and selection. Wherever possible, we correlate the theoretical predictions with experimental findings, in order to validate the proposed models and to identify potential directions for further advancements.
NASA Astrophysics Data System (ADS)
Sison Escaño, Mary Clare; Lacdao Arevalo, Ryan; Gyenge, Elod; Kasai, Hideaki
2014-09-01
The electrocatalysis of borohydride oxidation is a complex, up-to-eight-electron transfer process, which is essential for development of efficient direct borohydride fuel cells. Here we review the progress achieved by density functional theory (DFT) calculations in explaining the adsorption of BH4- on various catalyst surfaces, with implications for electrocatalyst screening and selection. Wherever possible, we correlate the theoretical predictions with experimental findings, in order to validate the proposed models and to identify potential directions for further advancements.
NASA Astrophysics Data System (ADS)
Lezberg, Erwin A.; Mularz, Edward J.; Liou, Meng-Sing
1991-03-01
The objectives and accomplishments of research in chemical reacting flows, including both experimental and computational problems, are described. The experimental research emphasizes the acquisition of reliable reacting-flow data for code validation, the development of chemical kinetics mechanisms, and the understanding of two-phase flow dynamics. Typical results from two nonreacting spray studies are presented. The computational fluid dynamics (CFD) research emphasizes the development of efficient and accurate algorithms and codes, as well as validation of methods and modeling (turbulence and kinetics) for reacting flows. Major developments of the RPLUS code and its application to mixing concepts, the General Electric combustor, and the Government baseline engine for the National Aerospace Plane are detailed. Finally, the turbulence research in the newly established Center for Modeling of Turbulence and Transition (CMOTT) is described.
NASA Technical Reports Server (NTRS)
Sellers, William L., III; Dwoyer, Douglas L.
1992-01-01
The design of a hypersonic aircraft poses unique challenges to the engineering community. Problems with duplicating flight conditions in ground based facilities have made performance predictions risky. Computational fluid dynamics (CFD) has been proposed as an additional means of providing design data. At the present time, CFD codes are being validated based on sparse experimental data and then used to predict performance at flight conditions with generally unknown levels of uncertainty. This paper will discuss the facility and measurement techniques that are required to support CFD development for the design of hypersonic aircraft. Illustrations are given of recent success in combining experimental and direct numerical simulation in CFD model development and validation for hypersonic perfect gas flows.
Hovering Dual-Spin Vehicle Groundwork for Bias Momentum Sizing Validation Experiment
NASA Technical Reports Server (NTRS)
Rothhaar, Paul M.; Moerder, Daniel D.; Lim, Kyong B.
2008-01-01
Angular bias momentum offers significant stability augmentation for hovering flight vehicles. The reliance of the vehicle on thrust vectoring for agility and disturbance rejection is greatly reduced with significant levels of stored angular momentum in the system. A methodical procedure for bias momentum sizing has been developed in previous studies. This current study provides groundwork for experimental validation of that method using an experimental vehicle called the Dual-Spin Test Device, a thrust-levitated platform. Using measured data the vehicle's thrust vectoring units are modeled and a gust environment is designed and characterized. Control design is discussed. Preliminary experimental results of the vehicle constrained to three rotational degrees of freedom are compared to simulation for a case containing no bias momentum to validate the simulation. A simulation of a bias momentum dominant case is presented.
WEC-SIM Phase 1 Validation Testing -- Numerical Modeling of Experiments: Preprint
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ruehl, Kelley; Michelen, Carlos; Bosma, Bret
2016-08-01
The Wave Energy Converter Simulator (WEC-Sim) is an open-source code jointly developed by Sandia National Laboratories and the National Renewable Energy Laboratory. It is used to model wave energy converters subjected to operational and extreme waves. In order for the WEC-Sim code to be beneficial to the wave energy community, code verification and physical model validation is necessary. This paper describes numerical modeling of the wave tank testing for the 1:33-scale experimental testing of the floating oscillating surge wave energy converter. The comparison between WEC-Sim and the Phase 1 experimental data set serves as code validation. This paper is a follow-up to the WEC-Sim paper on experimental testing, and describes the WEC-Sim numerical simulations for the floating oscillating surge wave energy converter.
DoSSiER: Database of scientific simulation and experimental results
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wenzel, Hans; Yarba, Julia; Genser, Krzystof
The Geant4, GeantV and GENIE collaborations regularly perform validation and regression tests for simulation results. DoSSiER (Database of Scientific Simulation and Experimental Results) is being developed as a central repository to store the simulation results as well as the experimental data used for validation. DoSSiER can be easily accessed via a web application. In addition, a web service allows for programmatic access to the repository to extract records in json or xml exchange formats. In this paper, we describe the functionality and the current status of various components of DoSSiER as well as the technology choices we made.
NASA Astrophysics Data System (ADS)
Kartalov, Emil P.; Scherer, Axel; Quake, Stephen R.; Taylor, Clive R.; Anderson, W. French
2007-03-01
A systematic experimental study and theoretical modeling of the device physics of polydimethylsiloxane "pushdown" microfluidic valves are presented. The phase space is charted by 1587 dimension combinations and encompasses 45-295 μm lateral dimensions, 16-39 μm membrane thickness, and 1-28 psi closing pressure. Three linear models are developed and tested against the empirical data, and then combined into a fourth-power-polynomial superposition. The experimentally validated final model offers a useful quantitative prediction for a valve's properties as a function of its dimensions. Typical valves (80-150 μm width) are shown to behave like thin springs.
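Without reproducing the paper's coefficients, the final fitting step, a fourth-power polynomial built on linear sub-models of the valve dimensions, can be sketched as ordinary least squares on polynomial features. The generating function and all numbers below are synthetic stand-ins, not the published model.

```python
import numpy as np

# Synthetic stand-ins: valve width w, length l, membrane thickness t [um],
# and closing pressure p [psi]; a real fit would use the measured data.
rng = np.random.default_rng(1)
n = 200
w = rng.uniform(45.0, 295.0, n)
l = rng.uniform(45.0, 295.0, n)
t = rng.uniform(16.0, 39.0, n)
p = 40.0 * (t / w) + 40.0 * (t / l) + rng.normal(0.0, 0.5, n)

# Polynomial features up to fourth power of the thin-spring ratios t/w, t/l.
X = np.column_stack([np.ones(n), t / w, t / l, (t / w) ** 2, (t / l) ** 2,
                     (t / w) * (t / l), (t / w) ** 4, (t / l) ** 4])
coef, *_ = np.linalg.lstsq(X, p, rcond=None)
pred = X @ coef
print(f"RMS fit error: {np.sqrt(np.mean((pred - p) ** 2)):.2f} psi")
```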
NASA Astrophysics Data System (ADS)
Maghareh, Amin; Silva, Christian E.; Dyke, Shirley J.
2018-05-01
Hydraulic actuators play a key role in experimental structural dynamics. In a previous study, a physics-based model for a servo-hydraulic actuator coupled with a nonlinear physical system was developed. Later, this dynamical model was transformed into controllable canonical form for position tracking control purposes. For this study, a nonlinear device is designed and fabricated to exhibit various nonlinear force-displacement profiles depending on the initial condition and the type of materials used as replaceable coupons. Using this nonlinear system, the controllable canonical dynamical model is experimentally validated for a servo-hydraulic actuator coupled with a nonlinear physical system.
DoSSiER: Database of scientific simulation and experimental results
Wenzel, Hans; Yarba, Julia; Genser, Krzystof; ...
2016-08-01
The Geant4, GeantV and GENIE collaborations regularly perform validation and regression tests for simulation results. DoSSiER (Database of Scientific Simulation and Experimental Results) is being developed as a central repository to store the simulation results as well as the experimental data used for validation. DoSSiER can be easily accessed via a web application. In addition, a web service allows for programmatic access to the repository to extract records in json or xml exchange formats. In this paper, we describe the functionality and the current status of various components of DoSSiER as well as the technology choices we made.
Medina-Figueroa, Alda María; Espinosa-Alarcón, Patricia Atzimba
2007-01-01
To estimate the achievement of an educative strategy that promoted participation in the development of the clinical aptitude of undergraduate medical students with regard to patients with diabetes. We conducted a quasi-experimental study with two groups of undergraduate medical students. We validated an instrument to explore clinical aptitude concerning diabetes mellitus (ACDIME) with 30 items for each of six indicators. The instrument was applied at two general hospitals, before and after carrying out the educative strategies. In the experimental group, we conducted an educative strategy that promotes participation in developing clinical aptitude, while the customary strategy was followed in the control group. ACDIME consistency was 0.80. Both study groups were similar before the educative strategies (p = 0.165). Statistically significant differences existed after the strategies in all evaluated indicators, in favor of the experimental group. The tendency to change, with a criterion of 50% or more, was only statistically significant in the experimental group (p < 0.0001). The ACDIME instrument is valid and reliable. The educative strategy that promoted participation is clearly superior to the customary strategy with regard to achievement.
Benchmark tests for a Formula SAE Student car prototyping
NASA Astrophysics Data System (ADS)
Mariasiu, Florin
2011-12-01
Aerodynamic characteristics of a vehicle are important elements in its design and construction. A low drag coefficient brings significant fuel savings and increased engine power efficiency. In designing and developing vehicles through computer simulation, dedicated CFD (Computational Fluid Dynamics) software packages are used to determine a vehicle's aerodynamic characteristics. However, the results obtained by this faster and cheaper method are validated by wind tunnel experiments, which are expensive and require complex testing equipment at relatively high cost. Therefore, the emergence and development of new low-cost testing methods to validate CFD simulation results would bring great economic benefits to the vehicle prototyping process. This paper presents the initial development process of a Formula SAE Student race-car prototype using CFD simulation and also presents a measurement system based on low-cost sensors through which the CFD simulation results were experimentally validated. The CFD software package used for simulation was SolidWorks with the FloXpress add-on, and the experimental measurement system was built using four FlexiForce piezoresistive force sensors.
Finite Element Model of the Knee for Investigation of Injury Mechanisms: Development and Validation
Kiapour, Ali; Kiapour, Ata M.; Kaul, Vikas; Quatman, Carmen E.; Wordeman, Samuel C.; Hewett, Timothy E.; Demetropoulos, Constantine K.; Goel, Vijay K.
2014-01-01
Multiple computational models have been developed to study knee biomechanics. However, the majority of these models are mainly validated against a limited range of loading conditions and/or do not include sufficient details of the critical anatomical structures within the joint. Due to the multifactorial dynamic nature of knee injuries, anatomic finite element (FE) models validated against multiple factors under a broad range of loading conditions are necessary. This study presents a validated FE model of the lower extremity with an anatomically accurate representation of the knee joint. The model was validated against tibiofemoral kinematics, ligaments strain/force, and articular cartilage pressure data measured directly from static, quasi-static, and dynamic cadaveric experiments. Strong correlations were observed between model predictions and experimental data (r > 0.8 and p < 0.0005 for all comparisons). FE predictions showed low deviations (root-mean-square (RMS) error) from average experimental data under all modes of static and quasi-static loading, falling within 2.5 deg of tibiofemoral rotation, 1% of anterior cruciate ligament (ACL) and medial collateral ligament (MCL) strains, 17 N of ACL load, and 1 mm of tibiofemoral center of pressure. Similarly, the FE model was able to accurately predict tibiofemoral kinematics and ACL and MCL strains during simulated bipedal landings (dynamic loading). In addition to minimal deviation from direct cadaveric measurements, all model predictions fell within 95% confidence intervals of the average experimental data. Agreement between model predictions and experimental data demonstrates the ability of the developed model to predict the kinematics of the human knee joint as well as the complex, nonuniform stress and strain fields that occur in biological soft tissue. Such a model will facilitate the in-depth understanding of a multitude of potential knee injury mechanisms with special emphasis on ACL injury. PMID:24763546
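Two of the validation statistics quoted above, the model-experiment correlation coefficient and the RMS deviation from the average experimental curve, are simple to compute once paired curves are in hand. The arrays below are made-up stand-ins for one such comparison, not the study's data.

```python
import numpy as np
from scipy import stats

# Made-up stand-ins for one comparison, e.g. ACL strain vs. applied load:
experiment = np.array([0.0, 0.8, 1.9, 3.1, 4.2, 5.0])  # mean of cadaver tests [%]
model      = np.array([0.1, 0.9, 1.7, 3.3, 4.1, 5.2])  # FE prediction [%]

r, p_value = stats.pearsonr(model, experiment)          # correlation
rms = np.sqrt(np.mean((model - experiment) ** 2))       # RMS deviation
print(f"r = {r:.3f} (p = {p_value:.2e}), RMS deviation = {rms:.2f} %")
```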
ERIC Educational Resources Information Center
Gervais, Matthew M.
2017-01-01
Experimental economic games reveal significant population variation in human social behavior. However, most protocols involve anonymous recipients, limiting their validity to fleeting interactions. Understanding human relationship dynamics will require methods with the virtues of economic games that also tap recipient identity-conditioned…
NASA Astrophysics Data System (ADS)
Rizzo, Axel; Vaglio-Gaudard, Claire; Martin, Julie-Fiona; Noguère, Gilles; Eschbach, Romain
2017-09-01
DARWIN2.3 is the reference package used for fuel cycle applications in France. It solves the Boltzmann and Bateman equations in a coupled manner, with the European JEFF-3.1.1 nuclear data library, to compute the fuel cycle values of interest. It includes both the deterministic transport codes APOLLO2 (for light water reactors) and ERANOS2 (for fast reactors), and the DARWIN/PEPIN2 depletion code, each of them developed by CEA/DEN with the support of its industrial partners. The DARWIN2.3 package has been experimentally validated for pressurized and boiling water reactors, as well as for sodium fast reactors; this experimental validation relies on the analysis of post-irradiation experiments (PIE). The DARWIN2.3 experimental validation work points out some isotopes for which the depleted concentration calculation can be improved. Some other nuclides have no available experimental validation, and their concentration calculation uncertainty is provided by the propagation of a priori nuclear data uncertainties. This paper describes the work plan of studies initiated this year to improve the accuracy of the DARWIN2.3 depleted material balance calculation for some nuclides of interest for the fuel cycle.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lu, Wei-Yang; Korellis, John S.; Lee, Kenneth L.
2006-08-01
Honeycomb is a structure that consists of two-dimensional regular arrays of open cells. High-density aluminum honeycomb has been used in weapon assemblies to mitigate shock and protect payloads because of its excellent crush properties. In order to use honeycomb efficiently and to certify that the payload is protected by the honeycomb under various loading conditions, a validated honeycomb crush model is required and the mechanical properties of the honeycombs need to be fully characterized. Volume I of this report documents an experimental study of the crush behavior of high-density honeycombs. Two sets of honeycombs were included in this investigation: commercial grade for initial exploratory experiments, and weapon grade, which satisfied B61 specifications. This investigation also includes developing proper experimental methods for crush characterization, conducting discovery experiments to explore crush behaviors for model improvement, and identifying experimental and material uncertainties.
NASA Astrophysics Data System (ADS)
Allred, C. Jeff; Churchill, David; Buckner, Gregory D.
2017-07-01
This paper presents a novel approach to monitoring rotor blade flap, lead-lag and pitch using an embedded gyroscope and symmetrically mounted MEMS accelerometers. The central hypothesis is that differential accelerometer measurements are proportional only to blade motion; fuselage acceleration and blade bending are inherently compensated for. The inverse kinematic relationships (from blade position to acceleration and angular rate) are derived and simulated to validate this hypothesis. An algorithm to solve the forward kinematic relationships (from sensor measurement to blade position) is developed using these simulation results. This algorithm is experimentally validated using a prototype device. The experimental results justify continued development of this kinematic estimation approach.
Hybrid Particle-Element Simulation of Impact on Composite Orbital Debris Shields
NASA Technical Reports Server (NTRS)
Fahrenthold, Eric P.
2004-01-01
This report describes the development of new numerical methods and new constitutive models for the simulation of hypervelocity impact effects on spacecraft. The research has included parallel implementation of the numerical methods and material models developed under the project. Validation work has included both one dimensional simulations, for comparison with exact solutions, and three dimensional simulations of published hypervelocity impact experiments. The validated formulations have been applied to simulate impact effects in a velocity and kinetic energy regime outside the capabilities of current experimental methods. The research results presented here allow for the expanded use of numerical simulation, as a complement to experimental work, in future design of spacecraft for hypervelocity impact effects.
NASA Technical Reports Server (NTRS)
Nehl, T. W.; Demerdash, N. A.
1983-01-01
Mathematical models capable of simulating the transient, steady state, and faulted performance characteristics of various brushless dc machine-PSA (power switching assembly) configurations were developed. These systems are intended for possible future use as prime movers in EMAs (electromechanical actuators) for flight control applications. These machine-PSA configurations include wye, delta, and open-delta connected systems. The research performed under this contract was initially broken down into the following six tasks: development of mathematical models for various machine-PSA configurations; experimental validation of the model for failure modes; experimental validation of the mathematical model for shorted turn-failure modes; tradeoff study; and documentation of results and methodology.
Liu, Huolong; Galbraith, S C; Ricart, Brendon; Stanton, Courtney; Smith-Goettler, Brandye; Verdi, Luke; O'Connor, Thomas; Lee, Sau; Yoon, Seongkyu
2017-06-15
In this study, the influence of key process variables (screw speed, throughput and liquid-to-solid (L/S) ratio) of a continuous twin screw wet granulation (TSWG) process was investigated using a central composite face-centered (CCF) experimental design method. Regression models were developed to predict the process responses (motor torque, granule residence time), granule properties (size distribution, volume average diameter, yield, relative width, flowability) and tablet properties (tensile strength). The effects of the three key process variables were analyzed via contour and interaction plots. The experimental results demonstrated that all the process responses, granule properties and tablet properties are influenced by changing the screw speed, throughput and L/S ratio. The TSWG process was optimized to produce granules with a specific volume average diameter of 150 μm and a yield of 95% based on the developed regression models. A design space (DS) was built based on a volume average granule diameter between 90 and 200 μm and a granule yield larger than 75%, with a failure probability analysis using Monte Carlo simulations. Validation experiments successfully validated the robustness and accuracy of the DS generated using the CCF experimental design in optimizing a continuous TSWG process.
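The workflow this abstract describes, a face-centered central composite design, a quadratic response-surface regression, and a Monte Carlo failure-probability probe of the design space, can be sketched as follows. The response values, noise level, and spec limits below are invented stand-ins, not the study's measurements.

```python
import itertools
import numpy as np

# Face-centered central composite design in coded units for three factors:
# screw speed, throughput, L/S ratio. Axial points sit on the faces (alpha = 1).
corners = np.array(list(itertools.product([-1, 1], repeat=3)), dtype=float)
axial = np.vstack([v * np.eye(3)[i] for i in range(3) for v in (-1.0, 1.0)])
center = np.zeros((3, 3))                      # replicated center points
X = np.vstack([corners, axial, center])

def quad_features(pts):
    x1, x2, x3 = pts.T
    return np.column_stack([np.ones(len(pts)), x1, x2, x3,
                            x1 * x2, x1 * x3, x2 * x3,
                            x1 ** 2, x2 ** 2, x3 ** 2])

# Hypothetical measured responses at the design points (stand-in numbers):
rng = np.random.default_rng(2)
d50 = 150 + 35 * X[:, 2] + 12 * X[:, 0] - 8 * X[:, 2] ** 2 \
      + rng.normal(0.0, 5.0, len(X))
beta, *_ = np.linalg.lstsq(quad_features(X), d50, rcond=None)

# Monte Carlo probe of the design space: probability that granule size
# stays inside 90-200 um given noise on the fitted response.
pts = rng.uniform(-1.0, 1.0, (100_000, 3))
pred = quad_features(pts) @ beta + rng.normal(0.0, 5.0, 100_000)
print("P(in spec):", np.mean((pred > 90) & (pred < 200)))
```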
Detection of overreported psychopathology with the MMPI-2-RF [corrected] validity scales.
Sellbom, Martin; Bagby, R Michael
2010-12-01
We examined the utility of the validity scales on the recently released Minnesota Multiphasic Personality Inventory-2 Restructured Form (MMPI-2 RF; Ben-Porath & Tellegen, 2008) to detect overreported psychopathology. This set of validity scales includes a newly developed scale and revised versions of the original MMPI-2 validity scales. We used an analogue, experimental simulation in which MMPI-2 RF responses (derived from archived MMPI-2 protocols) of undergraduate students instructed to overreport psychopathology (in either a coached or noncoached condition) were compared with those of psychiatric inpatients who completed the MMPI-2 under standardized instructions. The MMPI-2 RF validity scale Infrequent Psychopathology Responses best differentiated the simulation groups from the sample of patients, regardless of experimental condition. No other validity scale added consistent incremental predictive utility to Infrequent Psychopathology Responses in distinguishing the simulation groups from the sample of patients. Classification accuracy statistics confirmed the recommended cut scores in the MMPI-2 RF manual (Ben-Porath & Tellegen, 2008).
Code Validation Studies of High-Enthalpy Flows
2006-12-01
…stage of future hypersonic vehicles. The development and design of such vehicles is aided by the use of experimentation and numerical simulation… numerical predictions and experimental measurements. 3. Summary of Previous Work: We have studied extensively hypersonic double-cone flows with and in… the experimental measurements and the numerical predictions. When we accounted for that effect in numerical simulations, and also augmented the…
Hua, Xijin; Wang, Ling; Al-Hajjar, Mazen; Jin, Zhongmin; Wilcox, Ruth K; Fisher, John
2014-07-01
Finite element models are becoming increasingly useful tools to conduct parametric analysis, design optimisation and pre-clinical testing for hip joint replacements. However, the verification of the finite element model is critically important. The purposes of this study were to develop a three-dimensional anatomic finite element model for a modular metal-on-polyethylene total hip replacement for predicting its contact mechanics and to conduct experimental validation for a simple finite element model which was simplified from the anatomic finite element model. An anatomic modular metal-on-polyethylene total hip replacement model (anatomic model) was first developed and then simplified with reasonable accuracy to a simple modular total hip replacement model (simplified model) for validation. The contact areas on the articulating surface of three polyethylene liners of modular metal-on-polyethylene total hip replacement bearings with different clearances were measured experimentally in the Leeds ProSim hip joint simulator under a series of loading conditions and different cup inclination angles. The contact areas predicted from the simplified model were then compared with that measured experimentally under the same conditions. The results showed that the simplification made for the anatomic model did not change the predictions of contact mechanics of the modular metal-on-polyethylene total hip replacement substantially (less than 12% for contact stresses and contact areas). Good agreements of contact areas between the finite element predictions from the simplified model and experimental measurements were obtained, with maximum difference of 14% across all conditions considered. This indicated that the simplification and assumptions made in the anatomic model were reasonable and the finite element predictions from the simplified model were valid.
NASA Technical Reports Server (NTRS)
Ku, Jentung; Ottenstein, Laura; Douglas, Donya; Hoang, Triem
2010-01-01
Under NASA's New Millennium Program Space Technology 8 (ST 8) Project, Goddard Space Flight Center has conducted a Thermal Loop experiment to advance the maturity of the Thermal Loop technology from proof of concept to prototype demonstration in a relevant environment, i.e., from a technology readiness level (TRL) of 3 to a level of 6. The Thermal Loop is an advanced thermal control system consisting of a miniature loop heat pipe (MLHP) with multiple evaporators and multiple condensers designed for future small system applications requiring low mass, low power, and compactness. The MLHP retains all features of state-of-the-art loop heat pipes (LHPs) and offers additional advantages to enhance the functionality, performance, versatility, and reliability of the system. An MLHP breadboard was built and tested in the laboratory and thermal vacuum environments for the TRL 4 and TRL 5 validations, respectively, and an MLHP proto-flight unit was built and tested in a thermal vacuum chamber for the TRL 6 validation. In addition, an analytical model was developed to simulate the steady state and transient behaviors of the MLHP during various validation tests. The MLHP demonstrated excellent performance during experimental tests, and the analytical model predictions agreed very well with experimental data. All success criteria at various TRLs were met. Hence, the Thermal Loop technology has reached a TRL of 6. This paper presents the validation results, both experimental and analytical, of this technology development effort.
NASA Technical Reports Server (NTRS)
Cognata, Thomas J.; Leimkuehler, Thomas; Sheth, Rubik; Le, Hung
2013-01-01
The Fusible Heat Sink is a novel vehicle heat rejection technology which combines a flow through radiator with a phase change material. The combined technologies create a multi-function device able to shield crew members against Solar Particle Events (SPE), reduce radiator extent by permitting sizing to the average vehicle heat load rather than to the peak vehicle heat load, and to substantially absorb heat load excursions from the average while constantly maintaining thermal control system setpoints. This multi-function technology provides great flexibility for mission planning, making it possible to operate a vehicle in hot or cold environments and under high or low heat load conditions for extended periods of time. This paper describes the modeling and experimental validation of the Fusible Heat Sink technology. The model developed was intended to meet the radiation and heat rejection requirements of a nominal MMSEV mission. Development parameters and results, including sizing and model performance will be discussed. From this flight-sized model, a scaled test-article design was modeled, designed, and fabricated for experimental validation of the technology at Johnson Space Center thermal vacuum chamber facilities. Testing showed performance comparable to the model at nominal loads and the capability to maintain heat loads substantially greater than nominal for extended periods of time.
Computational design and experimental validation of new thermal barrier systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Guo, Shengmin
2015-03-31
The focus of this project is on the development of a reliable and efficient ab initio based computational high temperature material design method which can be used to assist the Thermal Barrier Coating (TBC) bond-coat and top-coat design. Experimental evaluations on the new TBCs are conducted to confirm the new TBCs' properties. Southern University is the subcontractor on this project with a focus on the computational simulation method development. We have performed ab initio density functional theory (DFT) and molecular dynamics simulations to screen the top coats and bond coats for gas turbine thermal barrier coating design and validation applications. For experimental validation, our focus is on the hot corrosion performance of different TBC systems. For example, for one of the top coatings studied, we examined the thermal stability of TaZr2.75O8 and confirmed its hot corrosion performance.
Experimental aerothermodynamic research of hypersonic aircraft
NASA Technical Reports Server (NTRS)
Cleary, Joseph W.
1987-01-01
The 2-D and 3-D advanced computer codes being developed for use in the design of such hypersonic aircraft as the National Aero-Space Plane require comparison of the computational results with a broad spectrum of experimental data to fully assess the validity of the codes. This is particularly true for complex flow fields with control surfaces present and for flows with separation, such as leeside flow. Therefore, the objective is to provide the hypersonic experimental database required for validation of advanced computational fluid dynamics (CFD) computer codes and for the development of a more thorough understanding of the flow physics necessary for these codes. This is being done by implementing a comprehensive test program for a generic all-body hypersonic aircraft model in the NASA/Ames 3.5-foot Hypersonic Wind Tunnel over a broad range of test conditions to obtain pertinent surface and flowfield data. Results from the flow visualization portion of the investigation are presented.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hassan, Tasnim; Lissenden, Cliff; Carroll, Laura
The proposed research will develop systematic sets of uniaxial and multiaxial experimental data at very high temperature (850-950°C) for Alloy 617. The loading histories to be prescribed in the experiments will induce creep-fatigue and creep-ratcheting failure mechanisms. These experimental responses will be scrutinized in order to quantify the influences of temperature and creep on fatigue and ratcheting failures. A unified constitutive model (UCM) will be developed and validated against these experimental responses. The improved UCM will be incorporated into the widely used commercial finite element software package ANSYS. The modified ANSYS will be validated so that it can be used for evaluating the very high temperature ASME-NH design-by-analysis methodology for Alloy 617 and thereby addressing the ASME-NH design code issues.
Modeling and Experimental Validation for 3D mm-wave Radar Imaging
NASA Astrophysics Data System (ADS)
Ghazi, Galia
As the problem of identifying suicide bombers wearing explosives concealed under clothing becomes increasingly important, it becomes essential to detect suspicious individuals at a distance. Systems which employ multiple sensors to determine the presence of explosives on people are being developed. Their functions include observing and following individuals with intelligent video, identifying explosives residues or heat signatures on the outer surface of their clothing, and characterizing explosives using penetrating X-rays, terahertz waves, neutron analysis, or nuclear quadrupole resonance. At present, mm-wave radar is the only modality that can both penetrate and sense beneath clothing at a distance of 2 to 50 meters without causing physical harm. Unfortunately, current mm-wave radar systems capable of performing high-resolution, real-time imaging require using arrays with a large number of transmitting and receiving modules; therefore, these systems present undesired large size, weight and power consumption, as well as extremely complex hardware architecture. The overarching goal of this thesis is the development and experimental validation of a next generation inexpensive, high-resolution radar system that can distinguish security threats hidden on individuals located at 2-10 meters range. In pursuit of this goal, this thesis proposes the following contributions: (1) Development and experimental validation of a new current-based, high-frequency computational method to model large scattering problems (hundreds of wavelengths) involving lossy, penetrable and multi-layered dielectric and conductive structures, which is needed for an accurate characterization of the wave-matter interaction and EM scattering in the target region; (2) Development of combined Norm-1, Norm-2 regularized imaging algorithms, which are needed for enhancing the resolution of the images while using a minimum number of transmitting and receiving antennas; (3) Implementation and experimental validation of new calibration techniques, which are needed for coherent imaging with multistatic configurations; and (4) Investigation of novel compressive antennas, which spatially modulate the wavefield in order to enhance the information transfer efficiency between sampling and imaging regions and use of Compressive Sensing algorithms.
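Contribution (2) combines Norm-1 and Norm-2 regularization. A generic way to pose such a reconstruction is an elastic-net least-squares problem solved by iterative soft thresholding (ISTA); the sketch below uses a random stand-in operator and a synthetic sparse scene, and is not the thesis's algorithm or data.

```python
import numpy as np

def ista_l1_l2(A, b, lam1=0.1, lam2=0.01, iters=500):
    """Minimize ||Ax - b||^2 + lam1*||x||_1 + lam2*||x||_2^2 via ISTA."""
    L = 2 * np.linalg.norm(A, 2) ** 2 + 2 * lam2  # Lipschitz const. of smooth part
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        grad = 2 * A.T @ (A @ x - b) + 2 * lam2 * x
        z = x - grad / L
        x = np.sign(z) * np.maximum(np.abs(z) - lam1 / L, 0.0)  # soft threshold
    return x

# Tiny synthetic test: sparse "reflectivity" recovered from few measurements.
rng = np.random.default_rng(3)
A = rng.normal(size=(40, 100))           # stand-in for the scattering operator
x_true = np.zeros(100)
x_true[[7, 42, 77]] = [1.0, -0.6, 0.8]   # three point scatterers
b = A @ x_true + rng.normal(0.0, 0.01, 40)
x_hat = ista_l1_l2(A, b)
print("recovered support:", np.nonzero(np.abs(x_hat) > 0.1)[0])
```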
Experimental validation of the DARWIN2.3 package for fuel cycle applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
San-Felice, L.; Eschbach, R.; Bourdot, P.
2012-07-01
The DARWIN package, developed by the CEA and its French partners (AREVA and EDF), provides the required parameters for fuel cycle applications: fuel inventory, decay heat, activity, neutron, γ, α, β sources and spectra, and radiotoxicity. This paper presents the DARWIN2.3 experimental validation for fuel inventory and decay heat calculations on Pressurized Water Reactors (PWRs). In order to validate this code system for spent fuel inventory, a large program has been undertaken, based on spent fuel chemical assays. This paper deals with the experimental validation of DARWIN2.3 for the PWR Uranium Oxide (UOX) and Mixed Oxide (MOX) fuel inventory calculation, focused on the isotopes involved in Burn-Up Credit (BUC) applications and decay heat computations. The calculation-to-experiment (C/E-1) discrepancies are calculated with the latest European evaluation file JEFF-3.1.1 associated with the SHEM energy mesh. An overview of the tendencies is obtained over a complete burn-up range from 10 to 85 GWd/t (10 to 60 GWd/t for MOX fuel). The experimental validation of the DARWIN2.3 package for decay heat calculation is performed using calorimetric measurements carried out at the Swedish Interim Spent Fuel Storage Facility for PWR assemblies, covering a large burn-up (20 to 50 GWd/t) and cooling time (10 to 30 years) range. (authors)
Nuclear Energy Knowledge and Validation Center (NEKVaC) Needs Workshop Summary Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gougar, Hans
2015-02-01
The Department of Energy (DOE) has made significant progress developing simulation tools to predict the behavior of nuclear systems with greater accuracy and in increasing our capability to predict the behavior of these systems outside of the standard range of applications. These analytical tools require a more complex array of validation tests to accurately simulate the physics and multiple length and time scales. Results from modern simulations will allow experiment designers to narrow the range of conditions needed to bound system behavior and to optimize the deployment of instrumentation to limit the breadth and cost of the campaign. Modern validation, verification and uncertainty quantification (VVUQ) techniques enable analysts to extract information from experiments in a systematic manner and provide the users with a quantified uncertainty estimate. Unfortunately, the capability to perform experiments that would enable taking full advantage of the formalisms of these modern codes has progressed relatively little (with some notable exceptions in fuels and thermal-hydraulics); the majority of the experimental data available today is the "historic" data accumulated over the last decades of nuclear systems R&D. A validated code-model is a tool for users. An unvalidated code-model is useful for code developers to gain understanding, publish research results, attract funding, etc. As nuclear analysis codes have become more sophisticated, so have the measurement and validation methods and the challenges that confront them. A successful yet cost-effective validation effort requires expertise possessed only by a few, resources possessed only by the well-capitalized (or a willing collective), and a clear, well-defined objective (validating a code that is developed to satisfy the needs of an actual user). To that end, the Idaho National Laboratory established the Nuclear Energy Knowledge and Validation Center (NEKVaC or the 'Center') to address the challenges of modern code validation and to manage the knowledge from past, current, and future experimental campaigns. By pulling together the best minds involved in code development, experiment design, and validation to establish and disseminate best practices and new techniques, the Center will be a resource for industry, DOE programs, and academia validation efforts.
Validity of a Checklist for the Design, Content, and Instructional Qualities of Children's Books
ERIC Educational Resources Information Center
Çer, Erkan; Sahin, Ertugrul
2016-01-01
The purpose of this study was to develop a checklist whose validity has been tested in assessing children's books. Participants consisted of university students who had taken a course in children's literature. They were selected through convenience sampling and randomly assigned into experimental and control groups. Participants in the…
RotCFD Software Validation - Computational and Experimental Data Comparison
NASA Technical Reports Server (NTRS)
Fernandez, Ovidio Montalvo
2014-01-01
RotCFD is a software package intended to ease the design of NextGen rotorcraft. Since RotCFD is new software still under development, its results need to be validated to determine the software's accuracy. The purpose of the present document is to explain one of the approaches to accomplishing that goal.
Dasgupta, Annwesa P; Anderson, Trevor R; Pelaez, Nancy
2014-01-01
It is essential to teach students about experimental design, as this facilitates their deeper understanding of how most biological knowledge was generated and gives them tools to perform their own investigations. Despite the importance of this area, surprisingly little is known about what students actually learn from designing biological experiments. In this paper, we describe a rubric for experimental design (RED) that can be used to measure knowledge of and diagnose difficulties with experimental design. The development and validation of the RED was informed by a literature review and empirical analysis of undergraduate biology students' responses to three published assessments. Five areas of difficulty with experimental design were identified: the variable properties of an experimental subject; the manipulated variables; measurement of outcomes; accounting for variability; and the scope of inference appropriate for experimental findings. Our findings revealed that some difficulties, documented some 50 yr ago, still exist among our undergraduate students, while others remain poorly investigated. The RED shows great promise for diagnosing students' experimental design knowledge in lecture settings, laboratory courses, research internships, and course-based undergraduate research experiences. It also shows potential for guiding the development and selection of assessment and instructional activities that foster experimental design.
ERIC Educational Resources Information Center
Yener, Özen
2014-01-01
In this research, we aim to develop a 5-point Likert scale, establish its validity and reliability, and use it in an experimental application to measure the will perception of teenagers and adults. With this aim, the items were first taken, either unchanged or in modified form, from various scales, and an item pool including 61 items…
Development, Validation, and Application of OSSEs at NASA/GMAO
NASA Technical Reports Server (NTRS)
Errico, Ronald; Prive, Nikki
2015-01-01
During the past several years, NASA Goddard's Global Modeling and Assimilation Office (GMAO) has been developing a framework for conducting Observing System Simulation Experiments (OSSEs). The motivation and design of that framework will be described and a sample of validation results presented. Fundamental issues will be highlighted, particularly the critical importance of appropriately simulating system errors. Some problems that have just arisen in the newest experimental system will also be mentioned.
Finite Element Vibration Modeling and Experimental Validation for an Aircraft Engine Casing
NASA Astrophysics Data System (ADS)
Rabbitt, Christopher
This thesis presents a procedure for the development and validation of a theoretical vibration model, applies this procedure to a pair of aircraft engine casings, and compares select parameters from experimental testing of those casings to those from a theoretical model using the Modal Assurance Criterion (MAC) and linear regression coefficients. A novel method of determining the optimal MAC between axisymmetric results is developed and employed. It is concluded that the dynamic finite element models developed as part of this research are fully capable of modelling the modal parameters within the frequency range of interest. Confidence intervals calculated in this research for correlation coefficients provide important information regarding the reliability of predictions, and it is recommended that these intervals be calculated for all comparable coefficients. The procedure outlined for aligning mode shapes around an axis of symmetry proved useful, and the results are promising for the development of further optimization techniques.
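As an illustration of the comparison metric used above, the following minimal Python sketch computes the Modal Assurance Criterion between two mode-shape vectors. The circular-shift alignment function is a hypothetical stand-in for the thesis's axisymmetric alignment method, not its actual algorithm.

import numpy as np

def mac(phi_a, phi_b):
    """Modal Assurance Criterion between two mode-shape vectors."""
    num = np.abs(np.vdot(phi_a, phi_b)) ** 2
    den = np.vdot(phi_a, phi_a).real * np.vdot(phi_b, phi_b).real
    return num / den

def best_mac_over_rotation(phi_ref, phi_test):
    # Hypothetical alignment step: circularly shift the test shape around
    # the axis of symmetry and keep the orientation that maximizes MAC.
    return max(mac(phi_ref, np.roll(phi_test, k)) for k in range(phi_test.size))

# Example: two noisy copies of the same 2-nodal-diameter shape agree
# (MAC near 1), while a different mode does not (MAC near 0).
theta = np.linspace(0, 2 * np.pi, 64, endpoint=False)
mode_2nd = np.cos(2 * theta)
test = np.cos(2 * theta + 0.05) + 0.02 * np.random.randn(theta.size)
other = np.cos(3 * theta)
print(mac(mode_2nd, test))                      # close to 1
print(mac(mode_2nd, other))                     # close to 0
print(best_mac_over_rotation(mode_2nd, test))   # alignment can only help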
[Animal experimentation, computer simulation and surgical research].
Carpentier, Alain
2009-11-01
We live in a digital world. In medicine, computers are providing new tools for data collection, imaging, and treatment. During research and development of complex technologies and devices such as artificial hearts, computer simulation can provide more reliable information than experimentation on large animals. In these specific settings, animal experimentation should serve more to validate computer models of complex devices than to demonstrate their reliability.
Experimental Errors in QSAR Modeling Sets: What We Can Do and What We Cannot Do.
Zhao, Linlin; Wang, Wenyi; Sedykh, Alexander; Zhu, Hao
2017-06-30
Numerous chemical data sets have become available for quantitative structure-activity relationship (QSAR) modeling studies. However, the quality of different data sources may be different based on the nature of experimental protocols. Therefore, potential experimental errors in the modeling sets may lead to the development of poor QSAR models and further affect the predictions of new compounds. In this study, we explored the relationship between the ratio of questionable data in the modeling sets, which was obtained by simulating experimental errors, and the QSAR modeling performance. To this end, we used eight data sets (four continuous endpoints and four categorical endpoints) that have been extensively curated both in-house and by our collaborators to create over 1800 various QSAR models. Each data set was duplicated to create several new modeling sets with different ratios of simulated experimental errors (i.e., randomizing the activities of part of the compounds) in the modeling process. A fivefold cross-validation process was used to evaluate the modeling performance, which deteriorates when the ratio of experimental errors increases. All of the resulting models were also used to predict external sets of new compounds, which were excluded at the beginning of the modeling process. The modeling results showed that the compounds with relatively large prediction errors in cross-validation processes are likely to be those with simulated experimental errors. However, after removing a certain number of compounds with large prediction errors in the cross-validation process, the external predictions of new compounds did not show improvement. Our conclusion is that the QSAR predictions, especially consensus predictions, can identify compounds with potential experimental errors. But removing those compounds by the cross-validation procedure is not a reasonable means to improve model predictivity due to overfitting.
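The central manipulation described above (randomizing the activities of a fraction of the modeling set and tracking the drop in cross-validated performance) can be sketched as follows. The data, the least-squares model, and the Q2 metric are simplified placeholders, not the paper's curated data sets or 1800-model workflow.

import numpy as np

rng = np.random.default_rng(0)

def five_fold_q2(X, y, k=5):
    """Five-fold cross-validated Q^2 = 1 - PRESS/SS for a least-squares model."""
    idx = rng.permutation(len(y))
    folds = np.array_split(idx, k)
    press = 0.0
    for f in folds:
        train = np.setdiff1d(idx, f)
        coef, *_ = np.linalg.lstsq(X[train], y[train], rcond=None)
        press += np.sum((y[f] - X[f] @ coef) ** 2)
    return 1.0 - press / np.sum((y - y.mean()) ** 2)

# Synthetic "modeling set": 200 compounds, 10 descriptors, linear activity.
X = rng.normal(size=(200, 10))
y = X @ rng.normal(size=10) + 0.1 * rng.normal(size=200)

for error_ratio in (0.0, 0.1, 0.2, 0.4):
    y_err = y.copy()
    bad = rng.choice(len(y), int(error_ratio * len(y)), replace=False)
    y_err[bad] = rng.permutation(y_err[bad])   # simulated experimental errors
    print(f"{error_ratio:.0%} corrupted -> Q2 = {five_fold_q2(X, y_err):.3f}")

As in the study, cross-validated performance deteriorates monotonically as the fraction of corrupted activities grows.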
NASA Technical Reports Server (NTRS)
Freeman, Delman C., Jr.; Reubush, David E.; McClinton, Charles R.; Rausch, Vincent L.; Crawford, J. Larry
1997-01-01
This paper provides an overview of NASA's Hyper-X Program; a focused hypersonic technology effort designed to move hypersonic, airbreathing vehicle technology from the laboratory environment to the flight environment. This paper presents an overview of the flight test program, research objectives, approach, schedule and status. Substantial experimental database and concept validation have been completed. The program is currently concentrating on the first, Mach 7, vehicle development, verification and validation in preparation for wind-tunnel testing in 1998 and flight testing in 1999. Parallel to this effort the Mach 5 and 10 vehicle designs are being finalized. Detailed analytical and experimental evaluation of the Mach 7 vehicle at the flight conditions is nearing completion, and will provide a database for validation of design methods once flight test data are available.
CFD Modeling Needs and What Makes a Good Supersonic Combustion Validation Experiment
NASA Technical Reports Server (NTRS)
Gaffney, Richard L., Jr.; Cutler, Andrew D.
2005-01-01
If a CFD code/model developer is asked what experimental data he wants to validate his code or numerical model, his answer will be: "Everything, everywhere, at all times." Since this is not possible, practical, or even reasonable, the developer must understand what can be measured within the limits imposed by the test article, the test location, the test environment and the available diagnostic equipment. At the same time, it is important for the experimentalist/diagnostician to understand what the CFD developer needs (as opposed to wants) in order to conduct a useful CFD validation experiment. If these needs are not known, it is possible to neglect easily measured quantities at locations needed by the developer, rendering the data set useless for validation purposes. It is also important for the experimentalist/diagnostician to understand what the developer is trying to validate so that the experiment can be designed to isolate (as much as possible) the effects of the particular physical phenomenon that is associated with the model to be validated. The probability of a successful validation experiment can be greatly increased if the two groups work together, each understanding the needs and limitations of the other.
Supersonic Combustion Research at NASA
NASA Technical Reports Server (NTRS)
Drummond, J. P.; Danehy, Paul M.; Gaffney, Richard L., Jr.; Tedder, Sarah A.; Cutler, Andrew D.; Bivolaru, Daniel
2007-01-01
This paper discusses the progress of work to model high-speed supersonic reacting flow. The purpose of the work is to improve the state of the art of CFD capabilities for predicting the flow in high-speed propulsion systems, particularly combustor flowpaths. The program has several components including the development of advanced algorithms and models for simulating engine flowpaths as well as a fundamental experimental and diagnostic development effort to support the formulation and validation of the mathematical models. The paper will provide details of current work on experiments that will provide data for the modeling efforts along with the associated nonintrusive diagnostics used to collect the data from the experimental flowfield. Simulation of a recent experiment to partially validate the accuracy of a combustion code is also described.
NASA Astrophysics Data System (ADS)
Brown, Alexander; Eviston, Connor
2017-02-01
Multiple FEM models of complex eddy current coil geometries were created and validated to calculate the change of impedance due to the presence of a notch. Realistic simulations of eddy current inspections are required for model-assisted probability of detection (MAPOD) studies, inversion algorithms, experimental verification, and tailored probe design for NDE applications. An FEM solver was chosen to model complex real-world situations, including varying probe dimensions and orientations along with complex probe geometries. This will also enable creation of a probe model library database with variable parameters. Verification and validation were performed using other commercially available eddy current modeling software as well as experimentally collected benchmark data. Data analysis and comparison showed that the created models were able to correctly model the probe and conductor interactions and accurately calculate the change in impedance of several experimental scenarios with acceptable error. The promising results of the models enabled the start of an eddy current probe model library to give experimenters easy access to powerful parameter-based eddy current models for alternate project applications.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jernigan, Dann A.; Blanchat, Thomas K.
For validation of the SIERRA/FUEGO/SYRINX fire and SIERRA/CALORE codes, it is necessary to improve understanding and to develop temporally and spatially resolved integral-scale validation data of the heat flux incident to a complex object, in addition to measuring the thermal response of that object located within the fire plume. To meet this objective, a complex calorimeter with sufficient instrumentation to allow validation of the coupling between FUEGO/SYRINX/CALORE has been designed, fabricated, and tested in the Fire Laboratory for Accreditation of Models and Experiments (FLAME) facility. Validation experiments are specifically designed for direct comparison with the computational predictions. Making meaningful comparison between the computational and experimental results requires careful characterization and control of the experimental features or parameters used as inputs into the computational model. Validation experiments must be designed to capture the essential physical phenomena, including all relevant initial and boundary conditions. This report presents the data validation steps and processes, the results of the penlight radiant heat experiments (for the purpose of validating the CALORE heat transfer modeling of the complex calorimeter), and the results of the fire tests in FLAME.
Studying Sexual Aggression: A Review of the Evolution and Validity of Laboratory Paradigms
Davis, Kelly Cue; George, William H.; Nagayama Hall, Gordon C.; Parrott, Dominic J.; Tharp, Andra Teten; Stappenbeck, Cynthia A.
2018-01-01
Objective: Researchers have endeavored for decades to develop and implement experimental assessments of sexual aggression and its precursors to capitalize on the many scientific advantages offered by laboratory experiments, such as rigorous control of key variables and identification of causal relationships. The purpose of this review is to provide an overview of and commentary on the evolution of these laboratory-based methods. Conclusions: To date, two primary types of sexual aggression laboratory studies have been developed: those that involve behavioral analogues of sexual aggression and those that assess postulated precursors to sexually aggressive behavior. Although the study of sexual aggression in the laboratory is fraught with methodological challenges, validity concerns, and ethical considerations, advances in the field have resulted in greater methodological rigor, more precise dependent measures, and improved experimental validity, reliability, and realism. Because highly effective sexual aggression prevention strategies remain elusive, continued laboratory-based investigation of sexual aggression coupled with translation of critical findings to the development and modification of sexual aggression prevention programs remains an important task for the field. PMID:29675289
NASA Astrophysics Data System (ADS)
Hufner, D. R.; Augustine, M. R.
2018-05-01
A novel experimental method was developed to simulate underwater explosion pressure pulses within a laboratory environment. An impact-based experimental apparatus was constructed; capable of generating pressure pulses with basic character similar to underwater explosions, while also allowing the pulse to be tuned to different intensities. Having the capability to vary the shock impulse was considered essential to producing various levels of shock-induced damage without the need to modify the fixture. The experimental apparatus and test method are considered ideal for investigating the shock response of composite material systems and/or experimental validation of new material models. One such test program is presented herein, in which a series of E-glass/Vinylester laminates were subjected to a range of shock pulses that induced varying degrees of damage. Analysis-test correlations were performed using a rate-dependent constitutive model capable of representing anisotropic damage and ultimate yarn failure. Agreement between analytical predictions and experimental results was considered acceptable.
NASA Technical Reports Server (NTRS)
Bache, George
1993-01-01
Validation of CFD codes is a critical first step in the process of developing CFD design capability. The MSFC Pump Technology Team has recognized the importance of validation and has thus funded several experimental programs designed to obtain CFD-quality validation data. The first data set to become available is for the SSME High Pressure Fuel Turbopump Impeller. LDV data were taken at the impeller inlet (to obtain a reliable inlet boundary condition) and at three radial positions at the impeller discharge. Our CFD code, TASCflow, is used within the propulsion and commercial pump industry as a tool for pump design. The objective of this work, therefore, is to further validate TASCflow for application in pump design. TASCflow was used to predict flow at the impeller discharge for flowrates of 80, 100 and 115 percent of design flow. Comparison to data has been made with encouraging results.
Ong, Robert H.; King, Andrew J. C.; Mullins, Benjamin J.; Cooper, Timothy F.; Caley, M. Julian
2012-01-01
We present Computational Fluid Dynamics (CFD) models of the coupled dynamics of water flow, heat transfer and irradiance in and around corals to predict temperatures experienced by corals. These models were validated against controlled laboratory experiments, under constant and transient irradiance, for hemispherical and branching corals. Our CFD models agree very well with experimental studies. A linear relationship between irradiance and coral surface warming was evident in both the simulation and experimental results, agreeing with heat transfer theory. However, CFD models for the steady state simulation produced a better fit to the linear relationship than the experimental data, likely due to experimental error in the empirical measurements. The consistency of our modelling results with experimental observations demonstrates the applicability of CFD simulations, such as the models developed here, to coral bleaching studies. A study of the influence of coral skeletal porosity and skeletal bulk density on surface warming was also undertaken, demonstrating boundary layer behaviour, and interstitial flow magnitude and temperature profiles in coral cross sections. Our models complement recent studies showing systematic changes in these parameters in some coral colonies and have utility in the prediction of coral bleaching. PMID:22701582
Development and Validation of a Constitutive Model for Dental Composites during the Curing Process
NASA Astrophysics Data System (ADS)
Wickham Kolstad, Lauren
Debonding is a critical failure mode of dental composites used for dental restorations. Debonding of a dental composite can be predicted by comparing its shrinkage stress to the debonding strength of the adhesive that bonds it to the tooth surface. It is difficult to measure shrinkage stress experimentally. In this study, finite element analysis is used to predict the stress in the composite during cure. A new constitutive law is presented that will allow composite developers to evaluate composite shrinkage stress at early stages in material development. Shrinkage stress and shrinkage strain experimental data were gathered for three dental resins: Z250, Z350, and P90. The experimental data were used to develop a constitutive model for the Young's modulus of the dental composite as a function of time during cure. A Maxwell model, a spring and dashpot in series, was used to simulate the composite. The compliance of the shrinkage stress device was also taken into account by including a spring in series with the Maxwell model. A coefficient of thermal expansion was also determined for internal loading of the composite by dividing shrinkage strain by time. Three FEA models are presented. A spring-disk model validates that the constitutive law is self-consistent. A quarter cuspal deflection model uses separate experimental data to verify that the constitutive law is valid. Finally, an axisymmetric tooth model is used to predict interfacial stresses in the composite. These stresses are compared to the debonding strength to check whether the composite debonds. The new constitutive model accurately predicted cuspal deflection data. Predictions for interfacial bond stress in the tooth model compare favorably with debonding characteristics observed in practice for dental resins.
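A minimal sketch of the constitutive idea described above: a Maxwell element whose modulus grows during cure, in series with a spring representing instrument compliance, loaded by a constant shrinkage strain rate. All parameter values and the modulus-development curve are hypothetical placeholders, not the fitted values from the study.

import numpy as np

t_end, dt = 600.0, 0.1          # s (assumed cure window and time step)
k_dev = 5.0e9                   # Pa, hypothetical device stiffness (series spring)
eta = 2.0e11                    # Pa.s, hypothetical dashpot viscosity
eps_rate = -1.0e-5              # 1/s, hypothetical constant shrinkage strain rate

def modulus(t):
    """Assumed S-shaped Young's modulus development during cure."""
    return 10.0e9 / (1.0 + np.exp(-(t - 120.0) / 30.0))

sigma = 0.0
for i in range(int(t_end / dt)):
    E = modulus(i * dt)
    tau = eta / max(E, 1.0)     # Maxwell relaxation time eta/E
    # Total strain rate splits between the Maxwell element and the device
    # spring: eps_total' = eps_maxwell' + sigma'/k_dev, with the Maxwell
    # element obeying sigma' = E*eps_maxwell' - sigma/tau. Solving for sigma':
    sigma_rate = (E * eps_rate - sigma / tau) / (1.0 + E / k_dev)
    sigma += sigma_rate * dt

print(f"shrinkage stress after cure: {sigma/1e6:.2f} MPa")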
Code of Federal Regulations, 2014 CFR
2014-07-01
... the validation study and subsequent use during decontamination. 761.386 Section 761.386 Protection of... experimental conditions for the validation study and subsequent use during decontamination. The following experimental conditions apply for any solvent: (a) Temperature and pressure. Conduct the validation study and...
Code of Federal Regulations, 2012 CFR
2012-07-01
... the validation study and subsequent use during decontamination. 761.386 Section 761.386 Protection of... experimental conditions for the validation study and subsequent use during decontamination. The following experimental conditions apply for any solvent: (a) Temperature and pressure. Conduct the validation study and...
Code of Federal Regulations, 2013 CFR
2013-07-01
... the validation study and subsequent use during decontamination. 761.386 Section 761.386 Protection of... experimental conditions for the validation study and subsequent use during decontamination. The following experimental conditions apply for any solvent: (a) Temperature and pressure. Conduct the validation study and...
MRMPlus: an open source quality control and assessment tool for SRM/MRM assay development.
Aiyetan, Paul; Thomas, Stefani N; Zhang, Zhen; Zhang, Hui
2015-12-12
Selected and multiple reaction monitoring involves monitoring a multiplexed assay of proteotypic peptides and associated transitions in mass spectrometry runs. To describe peptide and associated transitions as stable, quantifiable, and reproducible representatives of proteins of interest, experimental and analytical validation is required. However, inadequate and disparate analytical tools and validation methods predispose assay performance measures to errors and inconsistencies. Implemented as a freely available, open-source tool in the platform-independent Java programming language, MRMPlus computes analytical measures as recommended recently by the Clinical Proteomics Tumor Analysis Consortium Assay Development Working Group for "Tier 2" assays - that is, non-clinical assays sufficient to measure changes due to both biological and experimental perturbations. Computed measures include limit of detection, lower limit of quantification, linearity, carry-over, partial validation of specificity, and upper limit of quantification. MRMPlus streamlines the assay development analytical workflow and therefore minimizes error predisposition. MRMPlus may also be used for performance estimation for targeted assays not described by the Assay Development Working Group. MRMPlus' source code and compiled binaries can be freely downloaded from https://bitbucket.org/paiyetan/mrmplusgui and https://bitbucket.org/paiyetan/mrmplusgui/downloads respectively.
ERIC Educational Resources Information Center
Dasgupta, Annwesa P.; Anderson, Trevor R.; Pelaez, Nancy
2014-01-01
It is essential to teach students about experimental design, as this facilitates their deeper understanding of how most biological knowledge was generated and gives them tools to perform their own investigations. Despite the importance of this area, surprisingly little is known about what students actually learn from designing biological…
Development of the Biological Experimental Design Concept Inventory (BEDCI)
Deane, Thomas; Jeffery, Erica; Pollock, Carol; Birol, Gülnur
2014-01-01
Interest in student conception of experimentation inspired the development of a fully validated 14-question inventory on experimental design in biology (BEDCI) by following established best practices in concept inventory (CI) design. This CI can be used to diagnose specific examples of non–expert-like thinking in students and to evaluate the success of teaching strategies that target conceptual changes. We used BEDCI to diagnose non–expert-like student thinking in experimental design at the pre- and posttest stage in five courses (total n = 580 students) at a large research university in western Canada. Calculated difficulty and discrimination metrics indicated that BEDCI questions are able to effectively capture learning changes at the undergraduate level. A high correlation (r = 0.84) between responses by students in similar courses and at the same stage of their academic career also suggests that the test is reliable. Students showed significant positive learning changes by the posttest stage, but some non–expert-like responses were widespread and persistent. BEDCI is a reliable and valid diagnostic tool that can be used in a variety of life sciences disciplines. PMID:25185236
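The difficulty and discrimination metrics mentioned above are standard classical-test-theory quantities. The sketch below computes them for a synthetic 14-item, 0/1 response matrix; the item parameters and student abilities are invented, not BEDCI's.

import numpy as np

def item_stats(responses):
    """Item difficulty and discrimination for a 0/1 response matrix
    (rows = students, columns = items)."""
    responses = np.asarray(responses, dtype=float)
    total = responses.sum(axis=1)
    difficulty = responses.mean(axis=0)   # proportion answering correctly
    disc = np.empty(responses.shape[1])
    for j in range(responses.shape[1]):
        rest = total - responses[:, j]    # rest-of-test score
        disc[j] = np.corrcoef(responses[:, j], rest)[0, 1]  # point-biserial
    return difficulty, disc

rng = np.random.default_rng(1)
ability = rng.normal(size=300)
item_b = np.linspace(-1.5, 1.5, 14)       # 14 items, like BEDCI
prob = 1 / (1 + np.exp(-(ability[:, None] - item_b[None, :])))
answers = (rng.random(prob.shape) < prob).astype(int)
p, d = item_stats(answers)
print("difficulty:    ", np.round(p, 2))
print("discrimination:", np.round(d, 2))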
Experimental aeroelasticity history, status and future in brief
NASA Technical Reports Server (NTRS)
Ricketts, Rodney H.
1990-01-01
NASA conducts wind tunnel experiments to determine and understand the aeroelastic characteristics of new and advanced flight vehicles, including fixed-wing, rotary-wing and space-launch configurations. Review and assessments are made of the state-of-the-art in experimental aeroelasticity regarding available facilities, measurement techniques, and other means and devices useful in testing. In addition, some past experimental programs are described which assisted in the development of new technology, validated new analysis codes, or provided needed information for clearing flight envelopes of unwanted aeroelastic response. Finally, needs and requirements for advances and improvements in testing capabilities for future experimental research and development programs are described.
Robust and real-time rotor control with magnetic bearings
NASA Technical Reports Server (NTRS)
Sinha, A.; Wang, K. W.; Mease, K. L.
1991-01-01
This paper deals with the sliding mode control of a rigid rotor via radial magnetic bearings. The digital control algorithm and the results from numerical simulations are presented for an experimental rig. The experimental system which has been set up to digitally implement and validate the sliding mode control algorithm is described. Two methods for the development of control software are presented. Experimental results for each rotor axis are discussed.
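A minimal sketch of sliding mode control on a single rotor axis, using a boundary-layer (saturated) switching term as is common in digital implementations. The plant, gains, and disturbance are hypothetical and are not taken from the paper.

import numpy as np

# Assumed plant: one radial axis, m*x'' = u + d(t), regulated to the center.
m, lam, k, phi = 10.0, 50.0, 200.0, 0.01   # hypothetical mass and gains
dt = 1e-4                                  # 10 kHz digital control loop
x, v = 1e-3, 0.0                           # initial 1 mm rotor offset

def sat(z):
    """Saturation replaces sign() inside the boundary layer to limit chattering."""
    return np.clip(z, -1.0, 1.0)

for i in range(20000):                     # 2 s of simulated operation
    d = 2.0 * np.sin(2 * np.pi * 60 * i * dt)   # unbalance-like disturbance
    s = v + lam * x                        # sliding surface s = e' + lam*e
    u = -m * lam * v - k * sat(s / phi)    # equivalent + switching control
    a = (u + d) / m
    x += v * dt
    v += a * dt

print(f"final displacement: {x*1e6:.3f} um")   # driven to the micron range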
The Use of Virtual Reality in the Study of People's Responses to Violent Incidents.
Rovira, Aitor; Swapp, David; Spanlang, Bernhard; Slater, Mel
2009-01-01
This paper reviews experimental methods for the study of the responses of people to violence in digital media, and in particular considers the issues of internal validity and ecological validity or generalisability of results to events in the real world. Experimental methods typically involve a significant level of abstraction from reality, with participants required to carry out tasks that are far removed from violence in real life, and hence their ecological validity is questionable. On the other hand studies based on field data, while having ecological validity, cannot control multiple confounding variables that may have an impact on observed results, so that their internal validity is questionable. It is argued that immersive virtual reality may provide a unification of these two approaches. Since people tend to respond realistically to situations and events that occur in virtual reality, and since virtual reality simulations can be completely controlled for experimental purposes, studies of responses to violence within virtual reality are likely to have both ecological and internal validity. This depends on a property that we call 'plausibility' - including the fidelity of the depicted situation with prior knowledge and expectations. We illustrate this with data from a previously published experiment, a virtual reprise of Stanley Milgram's 1960s obedience experiment, and also with pilot data from a new study being developed that looks at bystander responses to violent incidents.
McGowan, C.P.; Neptune, R.R.; Herzog, W.
2009-01-01
History-dependent effects on muscle force development following active changes in length have been measured in a number of experimental studies. However, few muscle models have included these properties or examined their impact on force and power output in dynamic cyclic movements. The goal of this study was to develop and validate a modified Hill-type muscle model that includes shortening-induced force depression and assess its influence on locomotor performance. The magnitude of force depression was defined by empirical relationships based on muscle mechanical work. To validate the model, simulations incorporating force depression were developed to emulate single muscle in situ and whole muscle group leg extension experiments. There was excellent agreement between simulation and experimental values, with in situ force patterns closely matching the experimental data (average RMS error < 1.5 N) and force depression in the simulated leg extension exercise being similar in magnitude to experimental values (6.0% vs 6.5%, respectively). To examine the influence of force depression on locomotor performance, simulations of maximum power pedaling with and without force depression were generated. Force depression decreased maximum crank power by 20%–40%, depending on the relationship between force depression and muscle work used. These results indicate that force depression has the potential to substantially influence muscle power output in dynamic cyclic movements. However, to fully understand the impact of this phenomenon on human movement, more research is needed to characterize the relationship between force depression and mechanical work in large muscles with different morphologies. PMID:19879585
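The paper defines force depression through empirical relationships based on muscle mechanical work. The sketch below shows one assumed form of such a scaling factor applied to the usual Hill-type force product; the linear coefficient and clipping bounds are placeholders, not the paper's fitted relationships.

import numpy as np

def force_depression_factor(work, c=0.06):
    """Assumed linear loss of force with normalized shortening work,
    clipped so the factor stays in a physical range (hypothetical bounds)."""
    return np.clip(1.0 - c * work, 0.7, 1.0)

def muscle_force(act, f_l, f_v, f_max, work):
    """Hill-type force (activation * force-length * force-velocity * F_max)
    scaled by the history-dependent depression factor."""
    return act * f_l * f_v * f_max * force_depression_factor(work)

# Fully active muscle after two units of normalized shortening work:
print(muscle_force(act=1.0, f_l=0.95, f_v=0.8, f_max=1500.0, work=2.0))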
Examining students' views about validity of experiments: From introductory to Ph.D. students
NASA Astrophysics Data System (ADS)
Hu, Dehui; Zwickl, Benjamin M.
2018-06-01
We investigated physics students' epistemological views on measurements and validity of experimental results. The roles of experiments in physics have been underemphasized in previous research on students' personal epistemology, and there is a need for a broader view of personal epistemology that incorporates experiments. An epistemological framework incorporating the structure, methodology, and validity of scientific knowledge guided the development of an open-ended survey. The survey was administered to students in algebra-based and calculus-based introductory physics courses, upper-division physics labs, and physics Ph.D. students. Within our sample, we identified several differences in students' ideas about validity and uncertainty in measurement. The majority of introductory students justified the validity of results through agreement with theory or with results from others. Alternatively, Ph.D. students frequently justified the validity of results based on the quality of the experimental process and repeatability of results. When asked about the role of uncertainty analysis, introductory students tended to focus on the representational roles (e.g., describing imperfections, data variability, and human mistakes). However, advanced students focused on the inferential roles of uncertainty analysis (e.g., quantifying reliability, making comparisons, and guiding refinements). The findings suggest that lab courses could emphasize a variety of approaches to establish validity, such as by valuing documentation of the experimental process when evaluating the quality of student work. In order to emphasize the role of uncertainty in an authentic way, labs could provide opportunities to iterate, make repeated comparisons, and make decisions based on those comparisons.
Lingner, Thomas; Kataya, Amr R. A.; Reumann, Sigrun
2012-01-01
We recently developed the first algorithms specifically for plants to predict proteins carrying peroxisome targeting signals type 1 (PTS1) from genome sequences.1 As validated experimentally, the prediction methods are able to correctly predict unknown peroxisomal Arabidopsis proteins and to infer novel PTS1 tripeptides. The high prediction performance is primarily determined by the large number and sequence diversity of the underlying positive example sequences, which mainly derived from EST databases. However, a few constructs remained cytosolic in experimental validation studies, indicating sequencing errors in some ESTs. To identify erroneous sequences, we validated subcellular targeting of additional positive example sequences in the present study. Moreover, we analyzed the distribution of prediction scores separately for each orthologous group of PTS1 proteins, which generally resembled normal distributions with group-specific mean values. The cytosolic sequences commonly represented outliers of low prediction scores and were located at the very tail of a fitted normal distribution. Three statistical methods for identifying outliers were compared in terms of sensitivity and specificity. Their combined application allows elimination of erroneous ESTs from positive example data sets. This new post-validation method will further improve the prediction accuracy of both PTS1 and PTS2 protein prediction models for plants, fungi, and mammals. PMID:22415050
Verification and Validation Studies for the LAVA CFD Solver
NASA Technical Reports Server (NTRS)
Moini-Yekta, Shayan; Barad, Michael F; Sozer, Emre; Brehm, Christoph; Housman, Jeffrey A.; Kiris, Cetin C.
2013-01-01
The verification and validation of the Launch Ascent and Vehicle Aerodynamics (LAVA) computational fluid dynamics (CFD) solver is presented. A modern strategy for verification and validation is described incorporating verification tests, validation benchmarks, continuous integration and version control methods for automated testing in a collaborative development environment. The purpose of the approach is to integrate the verification and validation process into the development of the solver and improve productivity. This paper uses the Method of Manufactured Solutions (MMS) for the verification of 2D Euler equations, 3D Navier-Stokes equations as well as turbulence models. A method for systematic refinement of unstructured grids is also presented. Verification using inviscid vortex propagation and flow over a flat plate is highlighted. Simulation results using laminar and turbulent flow past a NACA 0012 airfoil and ONERA M6 wing are validated against experimental and numerical data.
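The Method of Manufactured Solutions mentioned above can be illustrated on a 1D Poisson problem: manufacture u(x) = sin(pi x), derive the forcing term analytically, and confirm that a second-order discretization converges at the expected rate. This is a generic textbook example, not LAVA's verification suite.

import numpy as np

# MMS: for -u'' = f on (0,1) with u(0)=u(1)=0, choosing u(x) = sin(pi x)
# manufactures f(x) = pi^2 sin(pi x). Central differences should show the
# max-norm error falling ~4x per grid doubling (observed order ~2).
def solve_poisson(n):
    x = np.linspace(0.0, 1.0, n + 1)
    h = 1.0 / n
    f = np.pi ** 2 * np.sin(np.pi * x[1:-1])
    A = (np.diag(2.0 * np.ones(n - 1))
         - np.diag(np.ones(n - 2), 1)
         - np.diag(np.ones(n - 2), -1)) / h ** 2
    u = np.zeros(n + 1)
    u[1:-1] = np.linalg.solve(A, f)
    return np.max(np.abs(u - np.sin(np.pi * x)))

prev = None
for n in (16, 32, 64, 128):
    err = solve_poisson(n)
    rate = "" if prev is None else f"  observed order ~ {np.log2(prev / err):.2f}"
    print(f"n={n:4d}  error={err:.3e}{rate}")
    prev = err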
Hariharan, Prasanna; D’Souza, Gavin A.; Horner, Marc; Morrison, Tina M.; Malinauskas, Richard A.; Myers, Matthew R.
2017-01-01
A “credible” computational fluid dynamics (CFD) model has the potential to provide a meaningful evaluation of safety in medical devices. One major challenge in establishing “model credibility” is to determine the required degree of similarity between the model and experimental results for the model to be considered sufficiently validated. This study proposes a “threshold-based” validation approach that provides a well-defined acceptance criterion, which is a function of how close the simulation and experimental results are to the safety threshold, for establishing the model validity. The validation criterion developed following the threshold approach is not only a function of the Comparison Error, E (the difference between experiments and simulations) but also takes into account the risk to patient safety because of E. The method is applicable for scenarios in which a safety threshold can be clearly defined (e.g., the viscous shear-stress threshold for hemolysis in blood contacting devices). The applicability of the new validation approach was tested on the FDA nozzle geometry. The context of use (COU) was to evaluate if the instantaneous viscous shear stress in the nozzle geometry at Reynolds numbers (Re) of 3500 and 6500 was below the commonly accepted threshold for hemolysis. The CFD results (“S”) of velocity and viscous shear stress were compared with inter-laboratory experimental measurements (“D”). The uncertainties in the CFD and experimental results due to input parameter uncertainties were quantified following the ASME V&V 20 standard. The CFD models for both Re = 3500 and 6500 could not be sufficiently validated by performing a direct comparison between CFD and experimental results using the Student’s t-test. However, following the threshold-based approach, a Student’s t-test comparing |S-D| and |Threshold-S| showed that relative to the threshold, the CFD and experimental datasets for Re = 3500 were statistically similar and the model could be considered sufficiently validated for the COU. However, for Re = 6500, at certain locations where the shear stress is close to the hemolysis threshold, the CFD model could not be considered sufficiently validated for the COU. Our analysis showed that the model could be sufficiently validated either by reducing the uncertainties in experiments, simulations, and the threshold or by increasing the sample size for the experiments and simulations. The threshold approach can be applied to all types of computational models and provides an objective way of determining model credibility and for evaluating medical devices. PMID:28594889
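A minimal sketch of the threshold-based comparison described above: a Student's t-test between |S-D| and |Threshold-S| samples. The shear-stress values, sample sizes, and threshold below are invented for illustration, and the ASME V&V 20 uncertainty propagation is not reproduced.

import numpy as np
from scipy import stats

rng = np.random.default_rng(2)

# Hypothetical samples at one measurement location: simulated (S) and
# measured (D) viscous shear stress in Pa, plus an assumed hemolysis threshold.
S = rng.normal(420.0, 15.0, size=12)     # simulation replicates (input UQ)
D = rng.normal(435.0, 25.0, size=12)     # inter-laboratory measurements
threshold = 600.0

comparison_error = np.abs(S - D)              # |S - D|
margin_to_threshold = np.abs(threshold - S)   # |Threshold - S|

# If |S - D| is statistically smaller than |Threshold - S|, the model/data
# disagreement is small relative to the distance from the safety threshold.
t, p = stats.ttest_ind(comparison_error, margin_to_threshold, equal_var=False)
verdict = "sufficiently validated for this COU" if (t < 0 and p / 2 < 0.05) else "not validated"
print(f"t = {t:.2f}, one-sided p = {p/2:.4f} -> {verdict}")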
Incremental checking of Master Data Management model based on contextual graphs
NASA Astrophysics Data System (ADS)
Lamolle, Myriam; Menet, Ludovic; Le Duc, Chan
2015-10-01
The validation of models is a crucial step in distributed heterogeneous systems. In this paper, an incremental validation method is proposed in the scope of a Model Driven Engineering (MDE) approach, applied to the Master Data Management (MDM) field where models are represented by XML Schemas. The MDE approach presented in this paper is based on the definition of an abstraction layer using UML class diagrams. The validation method aims to minimise model errors and to optimise the process of model checking. Therefore, the notion of validation contexts is introduced, allowing the verification of data model views. Description logics specify the constraints that the models have to satisfy. An experimental evaluation of the approach is presented through an application developed in the ArgoUML IDE.
NASA Astrophysics Data System (ADS)
Fiorenti, Simone; Guanetti, Jacopo; Guezennec, Yann; Onori, Simona
2013-11-01
This paper presents the development and experimental validation of a dynamic model of a Hybridized Energy Storage System (HESS) consisting of a parallel connection of a lead acid (PbA) battery and double layer capacitors (DLCs), for automotive applications. The dynamic modeling of both the PbA battery and the DLC has been tackled via the equivalent electric circuit based approach. Experimental tests are designed for identification purposes. Parameters of the PbA battery model are identified as a function of state of charge and current direction, whereas parameters of the DLC model are identified for different temperatures. A physical HESS has been assembled at the Center for Automotive Research at The Ohio State University and used as a test-bench to validate the model against a typical current profile generated for Start&Stop applications. The HESS model is then integrated into a vehicle simulator to assess the effects of the battery hybridization on the vehicle fuel economy and mitigation of the battery stress.
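A minimal sketch of the parallel-connection idea described above, assuming the simplest possible equivalent circuits (constant-OCV battery with series resistance; ideal capacitor with ESR). The parameters and the Start&Stop-like load profile are hypothetical, not the identified values from the paper.

import numpy as np

ocv, r_b = 12.6, 0.020          # V, ohm: battery branch (hypothetical)
c_dlc, r_c = 150.0, 0.008       # F, ohm: DLC branch (hypothetical)
dt, t_end = 0.01, 30.0          # s
v_c = ocv                       # DLC pre-charged to the battery OCV

for k in range(int(t_end / dt)):
    t = k * dt
    i_load = 300.0 if (t % 10.0) < 1.0 else 5.0   # 300 A cranking pulses
    # Both branches share the terminal voltage; solving the node equation
    # ocv - (i_load - i_c)*r_b = v_c - i_c*r_c for the DLC current i_c:
    i_c = (v_c - ocv + i_load * r_b) / (r_b + r_c)
    i_b = i_load - i_c
    v_c -= i_c / c_dlc * dt                        # capacitor state update
    if k % 500 == 0:
        v_t = ocv - i_b * r_b
        print(f"t={t:5.1f}s  I_batt={i_b:6.1f} A  I_dlc={i_c:6.1f} A  V={v_t:5.2f} V")

During the pulse, the low-ESR DLC branch absorbs most of the cranking current, which is the battery-stress-mitigation effect the paper quantifies.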
Engineering Property Prediction Tools for Tailored Polymer Composite Structures (49465)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nguyen, Ba Nghiep; Kunc, Vlastimil
2009-12-29
Process and constitutive models as well as characterization tools and testing methods were developed to determine stress-strain responses, damage development, strengths and creep of long-fiber thermoplastics (LFTs). The developed models were implemented in Moldflow and ABAQUS and have been validated against LFT data obtained experimentally.
TH-CD-207A-08: Simulated Real-Time Image Guidance for Lung SBRT Patients Using Scatter Imaging
DOE Office of Scientific and Technical Information (OSTI.GOV)
Redler, G; Cifter, G; Templeton, A
2016-06-15
Purpose: To develop a comprehensive Monte Carlo-based model for the acquisition of scatter images of patient anatomy in real-time, during lung SBRT treatment. Methods: During SBRT treatment, images of patient anatomy can be acquired from scattered radiation. To rigorously examine the utility of scatter images for image guidance, a model is developed using MCNP code to simulate scatter images of phantoms and lung cancer patients. The model is validated by comparing experimental and simulated images of phantoms of different complexity. The differentiation between tissue types is investigated by imaging objects of known compositions (water, lung, and bone equivalent). A lung tumor phantom, simulating materials and geometry encountered during lung SBRT treatments, is used to investigate image noise properties for various quantities of delivered radiation (monitor units (MU)). Patient scatter images are simulated using the validated simulation model. 4DCT patient data are converted to an MCNP input geometry accounting for different tissue compositions and densities. Lung tumor phantom images acquired with decreasing imaging time (decreasing MU) are used to model the expected noise amplitude in patient scatter images, producing realistic simulated patient scatter images with varying temporal resolution. Results: Image intensity in simulated and experimental scatter images of tissue-equivalent objects (water, lung, bone) matches within the uncertainty (∼3%). Lung tumor phantom images agree as well; specifically, tumor-to-lung contrast matches within the uncertainty. The addition of random noise approximating quantum noise in experimental images to simulated patient images shows that scatter imaging of lung tumors can provide images in as fast as 0.5 seconds with CNR∼2.7. Conclusions: A scatter imaging simulation model is developed and validated using experimental phantom scatter images. Following validation, lung cancer patient scatter images are simulated. These simulated patient images demonstrate the clinical utility of scatter imaging for real-time tumor tracking during lung SBRT.
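The contrast-to-noise ratio quoted above can be computed as in the following sketch; the image, ROI positions, and intensity values are synthetic placeholders chosen to land near the reported CNR of about 2.7.

import numpy as np

def cnr(image, tumor_roi, background_roi):
    """CNR = |mean_tumor - mean_background| / std_background."""
    t = image[tumor_roi]
    b = image[background_roi]
    return np.abs(t.mean() - b.mean()) / b.std()

rng = np.random.default_rng(3)
img = rng.normal(100.0, 8.0, size=(64, 64))   # lung-like noisy background
img[28:36, 28:36] += 22.0                     # brighter tumor region
print(f"CNR = {cnr(img, np.s_[28:36, 28:36], np.s_[0:16, 0:16]):.1f}")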
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nguyen, Ba Nghiep; Bapanapalli, Satish K.; Smith, Mark T.
2008-09-01
The objective of our work is to enable the optimum design of lightweight automotive structural components using injection-molded long fiber thermoplastics (LFTs). To this end, an integrated approach that links process modeling to structural analysis with experimental microstructural characterization and validation is developed. First, process models for LFTs are developed and implemented into processing codes (e.g. ORIENT, Moldflow) to predict the microstructure of the as-formed composite (i.e. fiber length and orientation distributions). In parallel, characterization and testing methods are developed to obtain necessary microstructural data to validate process modeling predictions. Second, the predicted LFT composite microstructure is imported into a structural finite element analysis by ABAQUS to determine the response of the as-formed composite to given boundary conditions. At this stage, constitutive models accounting for the composite microstructure are developed to predict various types of behaviors (i.e. thermoelastic, viscoelastic, elastic-plastic, damage, fatigue, and impact) of LFTs. Experimental methods are also developed to determine material parameters and to validate constitutive models. Such a process-linked structural modeling approach allows an LFT composite structure to be designed with confidence through numerical simulations. Some recent results of our collaborative research will be illustrated to show the usefulness and applications of this integrated approach.
Zuthi, M F R; Ngo, H H; Guo, W S; Nghiem, L D; Hai, F I; Xia, S Q; Zhang, Z Q; Li, J X
2015-08-01
This study investigates the influence of key biomass parameters on specific oxygen uptake rate (SOUR) in a sponge submerged membrane bioreactor (SSMBR) to develop mathematical models of biomass viability. Extra-cellular polymeric substances (EPS) were considered as a lumped parameter of bound EPS (bEPS) and soluble microbial products (SMP). Statistical analyses of experimental results indicate that the bEPS, SMP, mixed liquor suspended solids and volatile suspended solids (MLSS and MLVSS) have functional relationships with SOUR and their relative influence on SOUR was in the order of EPS>bEPS>SMP>MLVSS/MLSS. Based on correlations among biomass parameters and SOUR, two independent empirical models of biomass viability were developed. The models were validated using results of the SSMBR. However, further validation of the models for different operating conditions is suggested.
Hijazi, Bilal; Cool, Simon; Vangeyte, Jürgen; Mertens, Koen C; Cointault, Frédéric; Paindavoine, Michel; Pieters, Jan G
2014-11-13
A 3D imaging technique using a high-speed binocular stereovision system was developed in combination with corresponding image processing algorithms for accurate determination of the parameters of particles leaving the spinning disks of centrifugal fertilizer spreaders. Validation of the stereo-matching algorithm using a virtual 3D stereovision simulator indicated an error of less than 2 pixels for 90% of the particles. The setup was validated using the cylindrical spread pattern of an experimental spreader. A 2D correlation coefficient of 90% and a relative error of 27% were found between the experimental results and the (simulated) spread pattern obtained with the developed setup. In combination with a ballistic flight model, the developed image acquisition and processing algorithms can enable fast determination and evaluation of the spread pattern, which can be used as a tool for spreader design and precise machine calibration.
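A sketch of the two agreement metrics quoted above (2D correlation coefficient and relative error), applied to synthetic measured and simulated spread patterns. The definitions below are the common ones and may differ in detail from those used in the paper.

import numpy as np

def corr2d(a, b):
    """Zero-mean 2D correlation coefficient between two patterns."""
    a, b = a - a.mean(), b - b.mean()
    return (a * b).sum() / np.sqrt((a * a).sum() * (b * b).sum())

def relative_error(measured, simulated):
    """Sum of absolute deviations normalized by the measured total."""
    return np.abs(simulated - measured).sum() / measured.sum()

rng = np.random.default_rng(4)
x = np.linspace(-10, 10, 81)
measured = np.exp(-0.5 * (x / 4.0) ** 2)[None, :] * np.ones((5, 1))
simulated = measured * (1 + 0.2 * rng.normal(size=measured.shape))
print(f"r  = {corr2d(measured, simulated):.2f}")
print(f"RE = {relative_error(measured, simulated):.2f}")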
Modeling Combustion in Supersonic Flows
NASA Technical Reports Server (NTRS)
Drummond, J. Philip; Danehy, Paul M.; Bivolaru, Daniel; Gaffney, Richard L.; Tedder, Sarah A.; Cutler, Andrew D.
2007-01-01
This paper discusses the progress of work to model high-speed supersonic reacting flow. The purpose of the work is to improve the state of the art of CFD capabilities for predicting the flow in high-speed propulsion systems, particularly combustor flow-paths. The program has several components including the development of advanced algorithms and models for simulating engine flowpaths as well as a fundamental experimental and diagnostic development effort to support the formulation and validation of the mathematical models. The paper will provide details of current work on experiments that will provide data for the modeling efforts along with the associated nonintrusive diagnostics used to collect the data from the experimental flowfield. Simulation of a recent experiment to partially validate the accuracy of a combustion code is also described.
Kovács, Béla; Kántor, Lajos Kristóf; Croitoru, Mircea Dumitru; Kelemen, Éva Katalin; Obreja, Mona; Nagy, Előd Ernő; Székely-Szentmiklósi, Blanka; Gyéresi, Árpád
2018-06-01
A reverse-phase HPLC (RP-HPLC) method was developed for strontium ranelate using a full factorial, screening experimental design. The analytical procedure was validated according to international guidelines for linearity, selectivity, sensitivity, accuracy and precision. A separate experimental design was used to demonstrate the robustness of the method. Strontium ranelate was eluted at 4.4 minutes and showed no interference with the excipients used in the formulation, at 321 nm. The method is linear in the range of 20-320 μg mL-1 (R2 = 0.99998). Recovery, tested in the range of 40-120 μg mL-1, was found to be 96.1-102.1 %. Intra-day and intermediate precision RSDs ranged from 1.0-1.4 and 1.2-1.4 %, resp. The limit of detection and limit of quantitation were 0.06 and 0.20 μg mL-1, resp. The proposed technique is fast, cost-effective, reliable and reproducible, and is proposed for the routine analysis of strontium ranelate.
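The figures of merit reported above can be estimated from a calibration line in the usual ICH fashion (LOD = 3.3·sigma/slope, LOQ = 10·sigma/slope). The absorbance data below are synthetic placeholders, not the study's measurements.

import numpy as np
from scipy import stats

conc = np.array([20, 40, 80, 160, 240, 320], dtype=float)   # ug/mL levels
rng = np.random.default_rng(5)
absorbance = 0.004 * conc + 0.002 + rng.normal(0, 0.0008, conc.size)

fit = stats.linregress(conc, absorbance)
# Residual standard deviation of the calibration line (2 fitted parameters):
resid_sd = np.std(absorbance - (fit.slope * conc + fit.intercept), ddof=2)
print(f"R^2 = {fit.rvalue**2:.5f}")                 # linearity check
print(f"LOD = {3.3 * resid_sd / fit.slope:.2f} ug/mL")
print(f"LOQ = {10 * resid_sd / fit.slope:.2f} ug/mL")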
Effectiveness of CAI Package on Achievement in Physics of IX Standard Students
ERIC Educational Resources Information Center
Maheswari, I. Uma; Ramakrishnan, N.
2015-01-01
The present study is experimental in nature, aiming to find out the effectiveness of a CAI package on achievement in Physics of IX standard students. For this purpose a CAI package was developed and validated. The validated CAI package formed the independent variable of this study. The dependent variable is students' achievement in physics content. In order to find…
Design and validation of instruments to measure knowledge.
Elliott, T E; Regal, R R; Elliott, B A; Renier, C M
2001-01-01
Measuring health care providers' learning after they have participated in educational interventions that use experimental designs requires valid, reliable, and practical instruments. A literature review was conducted. In addition, experience gained from designing and validating instruments for measuring the effect of an educational intervention informed this process. The eight main steps for designing, validating, and testing the reliability of instruments for measuring learning outcomes are presented. The key considerations and rationale for this process are discussed. Methods for critiquing and adapting existent instruments and creating new ones are offered. This study may help other investigators in developing valid, reliable, and practical instruments for measuring the outcomes of educational activities.
Reverse Engineering Validation using a Benchmark Synthetic Gene Circuit in Human Cells
Kang, Taek; White, Jacob T.; Xie, Zhen; Benenson, Yaakov; Sontag, Eduardo; Bleris, Leonidas
2013-01-01
Multi-component biological networks are often understood incompletely, in large part due to the lack of reliable and robust methodologies for network reverse engineering and characterization. As a consequence, developing automated and rigorously validated methodologies for unraveling the complexity of biomolecular networks in human cells remains a central challenge to life scientists and engineers. Today, when it comes to experimental and analytical requirements, there exists a great deal of diversity in reverse engineering methods, which renders the independent validation and comparison of their predictive capabilities difficult. In this work we introduce an experimental platform customized for the development and verification of reverse engineering and pathway characterization algorithms in mammalian cells. Specifically, we stably integrate a synthetic gene network in human kidney cells and use it as a benchmark for validating reverse engineering methodologies. The network, which is orthogonal to endogenous cellular signaling, contains a small set of regulatory interactions that can be used to quantify the reconstruction performance. By performing successive perturbations to each modular component of the network and comparing protein and RNA measurements, we study the conditions under which we can reliably reconstruct the causal relationships of the integrated synthetic network. PMID:23654266
Boundary Layer Transition Experiments in Support of the Hypersonics Program
NASA Technical Reports Server (NTRS)
Berry, Scott A.; Chen, Fang-Jenq; Wilder, Michael C.; Reda, Daniel C.
2007-01-01
Two experimental boundary layer transition studies in support of fundamental hypersonics research are reviewed. The two studies are the HyBoLT flight experiment and a new ballistic range effort. Details are provided of the objectives and approach associated with each experimental program. The experimental databases established from ground and flight tests are intended to provide a better understanding of high-speed flows as well as data to validate and guide the development of simulation tools.
NASA Technical Reports Server (NTRS)
Sebok, Angelia; Wickens, Christopher; Sargent, Robert
2015-01-01
One human factors challenge is predicting operator performance in novel situations. Approaches such as drawing on relevant previous experience, and developing computational models to predict operator performance in complex situations, offer potential methods to address this challenge. A few concerns with modeling operator performance are that models need to be realistic and that they need to be tested empirically and validated. In addition, many existing human performance modeling tools are complex and require that an analyst gain significant experience to be able to develop models for meaningful data collection. This paper describes an effort to address these challenges by developing an easy-to-use, model-based tool, using models that were developed from a review of the existing human performance literature and targeted experimental studies, and performing an empirical validation of key model predictions.
Alvarez, M Lucrecia
2014-01-01
Different target prediction algorithms have been developed to provide a list of candidate target genes for a given animal microRNA (miRNA). However, these computational approaches produce both false-positive and false-negative predictions. Therefore, the target genes of a specific miRNA identified in silico should be experimentally validated. In this chapter, we describe a step-by-step protocol for the experimental validation of a direct miRNA target using a fast Dual Firefly-Renilla Luciferase Reporter Assay. We describe how to construct reporter plasmids using the simple, fast, and highly efficient cold fusion cloning technology, which does not require ligase, phosphatase, or restriction enzymes. In addition, we provide a protocol for co-transfection of reporter plasmids with either miRNA mimics or miRNA inhibitors in human embryonic kidney 293 (HEK293) cells, as well as a description of how to measure Firefly and Renilla luciferase activity using the Dual-Glo Luciferase Assay kit. As an example of the use of this technology, we validate glucose-6-phosphate dehydrogenase (G6PD) as a direct target of miR-1207-5p.
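The normalization at the heart of a dual-luciferase readout is simple enough to sketch. The following is a minimal illustration with invented plate-reader numbers (not data from the protocol above): Firefly counts from the reporter carrying the candidate 3'UTR are divided by the Renilla co-transfection control, and repression is the drop in that ratio under the miRNA mimic relative to a scrambled control.

```python
import statistics

def relative_activity(firefly, renilla):
    """Normalize Firefly signal by the Renilla co-transfection control."""
    return [f / r for f, r in zip(firefly, renilla)]

# Hypothetical triplicate readings: reporter with the G6PD 3'UTR,
# miR-1207-5p mimic vs. scrambled control.
mimic = relative_activity([12000, 11500, 12400], [98000, 96500, 99000])
control = relative_activity([21000, 20400, 21800], [97000, 95500, 98200])

repression = 1 - statistics.mean(mimic) / statistics.mean(control)
print(f"Repression of reporter activity: {repression:.0%}")
```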
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hansen, C., E-mail: hansec@uw.edu; Columbia University, New York, New York 10027; Victor, B.
We present application of three scalar metrics derived from the Biorthogonal Decomposition (BD) technique to evaluate the level of agreement between macroscopic plasma dynamics in different data sets. BD decomposes large data sets, as produced by distributed diagnostic arrays, into principal mode structures without assumptions on spatial or temporal structure. These metrics have been applied to validation of the Hall-MHD model using experimental data from the Helicity Injected Torus with Steady Inductive helicity injection experiment. Each metric provides a measure of correlation between mode structures extracted from experimental data and simulations for an array of 192 surface-mounted magnetic probes. Numerical validation studies have been performed using the NIMROD code, where the injectors are modeled as boundary conditions on the flux conserver, and the PSI-TET code, where the entire plasma volume is treated. Initial results from a comprehensive validation study of high performance operation with different injector frequencies are presented, illustrating application of the BD method. Using a simplified (constant, uniform density and temperature) Hall-MHD model, simulation results agree with experimental observation for two of the three defined metrics when the injectors are driven with a frequency of 14.5 kHz.
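Although the abstract does not spell out the metric definitions, the BD itself is a singular value decomposition of the space-time data matrix, and a correlation between experimental and simulated spatial modes ("topos") is one natural scalar metric. The sketch below, with synthetic stand-in probe signals, illustrates that idea only; it is not the paper's exact metrics.

```python
import numpy as np

# Synthetic stand-ins for probe data: rows = time samples, cols = 192 probes.
rng = np.random.default_rng(0)
n_t, n_probes = 2000, 192
t = np.linspace(0.0, 1e-3, n_t)[:, None]
x = np.arange(n_probes)[None, :]
data_exp = np.sin(2*np.pi*14.5e3*t) * np.cos(2*np.pi*x/n_probes) \
           + 0.05 * rng.standard_normal((n_t, n_probes))
data_sim = np.sin(2*np.pi*14.5e3*t) * np.cos(2*np.pi*x/n_probes + 0.1)

def bd_topos(data, k=3):
    """Biorthogonal decomposition: SVD of the space-time matrix; keep k spatial modes."""
    _, weights, topos = np.linalg.svd(data, full_matrices=False)
    return topos[:k], weights[:k]

topos_exp, _ = bd_topos(data_exp)
topos_sim, _ = bd_topos(data_sim)

# One possible scalar agreement metric: |cosine similarity| of matching spatial modes.
for k in range(2):
    num = abs(topos_exp[k] @ topos_sim[k])
    den = np.linalg.norm(topos_exp[k]) * np.linalg.norm(topos_sim[k])
    print(f"mode {k}: correlation = {num/den:.3f}")
```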
Fang, Jiansong; Wu, Zengrui; Cai, Chuipu; Wang, Qi; Tang, Yun; Cheng, Feixiong
2017-11-27
Natural products with diverse chemical scaffolds have been recognized as an invaluable source of compounds in drug discovery and development. However, systematic identification of drug targets for natural products at the human proteome level via various experimental assays is highly expensive and time-consuming. In this study, we proposed a systems pharmacology infrastructure to predict new drug targets and anticancer indications of natural products. Specifically, we reconstructed a global drug-target network with 7,314 interactions connecting 751 targets and 2,388 natural products and built predictive network models via a balanced substructure-drug-target network-based inference approach. A high area under the receiver operating characteristic curve of 0.96 was achieved for predicting new targets of natural products during cross-validation. The newly predicted targets of natural products (e.g., resveratrol, genistein, and kaempferol) with high scores were validated by various literature studies. We further built statistical network models for the identification of new anticancer indications of natural products through integration of both experimentally validated and computationally predicted drug-target interactions of natural products with known cancer proteins. We showed that the significantly predicted anticancer indications of multiple natural products (e.g., naringenin, disulfiram, and metformin) with new mechanisms of action were validated by various published experimental evidence. In summary, this study offers powerful computational systems pharmacology approaches and tools for the development of novel targeted cancer therapies by exploiting the polypharmacology of natural products.
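For readers unfamiliar with network-based inference, the core scoring step is a two-round resource diffusion on the bipartite drug-target graph. The toy sketch below illustrates that step only; the balancing parameter and the substructure features of the paper's "balanced substructure-drug-target" variant are omitted, and the adjacency matrix is invented.

```python
import numpy as np

# Toy bipartite adjacency: rows = natural products, columns = protein targets.
A = np.array([[1, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 1, 1],
              [0, 0, 0, 1]], dtype=float)

k_drug = A.sum(axis=1, keepdims=True)    # degree of each product
k_target = A.sum(axis=0, keepdims=True)  # degree of each target

# Two-step diffusion (target -> drug -> target): W[j2, j1] is the fraction of
# resource flowing from target j1 to target j2 through shared products.
W = (A.T @ (A / k_drug)) / k_target
scores = A @ W.T                         # scores[i, j]: strength of candidate pair (i, j)

# Rank only unobserved pairs as predicted new targets.
candidates = np.where(A == 0, scores, -np.inf)
print(np.round(candidates, 3))
```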
Computational Modelling of Patella Femoral Kinematics During Gait Cycle and Experimental Validation
NASA Astrophysics Data System (ADS)
Maiti, Raman
2016-06-01
The effect of loading and boundary conditions on patellar mechanics is significant due to the complications arising in patella femoral joints during total knee replacements. To understand the patellar mechanics with respect to loading and motion, a computational model representing the patella femoral joint was developed and validated against experimental results. The computational model was created in IDEAS NX and simulated in MSC ADAMS/VIEW software. The results obtained in the form of internal-external rotations and anterior-posterior displacements, for a new and an experimentally simulated specimen of the patella femoral joint under standard gait conditions, were compared with experimental measurements performed on the Leeds ProSim knee simulator. A good overall agreement between the computational prediction and the experimental data was obtained for patella femoral kinematics. Good agreement between the model and past studies was observed when the ligament load was removed and the medial-lateral displacement was constrained. The model is sensitive to ±5% changes in kinematic, frictional, force, and stiffness coefficients and insensitive to the time step.
NASA Technical Reports Server (NTRS)
Gouldin, F. C.
1982-01-01
Fluid mechanical effects on combustion processes in steady flow combustors, especially gas turbine combustors were investigated. Flow features of most interest were vorticity, especially swirl, and turbulence. Theoretical analyses, numerical calculations, and experiments were performed. The theoretical and numerical work focused on noncombusting flows, while the experimental work consisted of both reacting and nonreacting flow studies. An experimental data set, e.g., velocity, temperature and composition, was developed for a swirl flow combustor for use by combustion modelers for development and validation work.
NASA Astrophysics Data System (ADS)
Morrissey, Liam S.; Nakhla, Sam
2018-07-01
The effect of porosity on elastic modulus in low-porosity materials is investigated. First, several models used to predict the reduction in elastic modulus due to porosity are compared with a compilation of experimental data to determine their ranges of validity and accuracy. The overlapping solid spheres model is found to agree most closely with the experimental data and to be valid between 3 and 10 pct porosity. Next, an FEM is developed with the objective of demonstrating that a macroscale plate with a center hole can be used to model the effect of microscale porosity on elastic modulus. The FEM agrees best with the overlapping solid spheres model and shows higher accuracy against the experimental data than the overlapping solid spheres model itself.
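Porosity-modulus models of this family are typically power laws in the pore fraction. As a purely illustrative sketch (the threshold p_c and exponent m below are placeholders, not the fitted constants of the overlapping solid spheres model), the form can be evaluated as:

```python
def modulus_ratio(p, p_c=0.8, m=2.0):
    """E/E0 for porosity p under a generic (1 - p/p_c)**m power law.

    p_c and m are placeholder values for illustration; real models fit
    these constants to the pore geometry (e.g., overlapping solid spheres).
    """
    if not 0.0 <= p < p_c:
        raise ValueError("porosity must lie below the percolation threshold")
    return (1.0 - p / p_c) ** m

for p in (0.03, 0.05, 0.10):
    print(f"porosity {p:.0%}: E/E0 = {modulus_ratio(p):.3f}")
```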
Summary: Experimental validation of real-time fault-tolerant systems
NASA Technical Reports Server (NTRS)
Iyer, R. K.; Choi, G. S.
1992-01-01
Testing and validation of real-time systems is always difficult to perform since neither the error generation process nor the fault propagation problem is easy to comprehend. There is no better substitute for results based on actual measurements and experimentation. Such results are essential for developing a rational basis for the evaluation and validation of real-time systems. However, with physical experimentation, controllability and observability are limited to the external instrumentation that can be hooked up to the system under test, and this is quite a difficult, if not impossible, task for a complex system. Also, to set up such experiments for measurements, physical hardware must exist. On the other hand, a simulation approach allows flexibility that is unequaled by any other existing method for system evaluation. A simulation methodology for system evaluation was successfully developed and implemented, and the environment was demonstrated using existing real-time avionic systems. The research was oriented toward evaluating the impact of permanent and transient faults in aircraft control computers. Results were obtained for the Bendix BDX 930 system and the Hamilton Standard EEC131 jet engine controller. The studies showed that simulated fault injection is valuable, in the design stage, for evaluating the susceptibility of computing systems to different types of failures.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Clemens, Noel
This project was a combined computational and experimental effort to improve predictive capability for boundary layer flashback of premixed swirl flames relevant to gas-turbine power plants operating with high-hydrogen-content fuels. During the course of this project, significant progress in modeling was made on four major fronts: 1) use of direct numerical simulation of turbulent flames to understand the coupling between the flame and the turbulent boundary layer; 2) improved modeling capability for flame propagation in stratified pre-mixtures; 3) improved portability of computer codes using the OpenFOAM platform to facilitate transfer to industry and other researchers; and 4) application of LES to flashback in swirl combustors, and a detailed assessment of its capabilities and limitations for predictive purposes. A major component of the project was an experimental program that focused on developing a rich experimental database of boundary layer flashback in swirl flames. Both methane and high-hydrogen fuels, including effects of elevated pressure (1 to 5 atm), were explored. For this project, a new model swirl combustor was developed. Kilohertz-rate stereoscopic PIV and chemiluminescence imaging were used to investigate the flame propagation dynamics. In addition to the planar measurements, a technique capable of detecting the instantaneous, time-resolved 3D flame front topography was developed and applied successfully to investigate the flow-flame interaction. The UT measurements and legacy data were used in a hierarchical validation approach in which flows with increasingly complex physics were used for validation. First, component models were validated against DNS and literature data in simplified configurations; this was followed by validation against the UT 1-atm flashback cases, and then the UT high-pressure flashback cases. The new models and portable code represent a major improvement over what was available before this project was initiated.
Experimental validation of a new heterogeneous mechanical test design
NASA Astrophysics Data System (ADS)
Aquino, J.; Campos, A. Andrade; Souto, N.; Thuillier, S.
2018-05-01
Standard material parameter identification strategies generally use an extensive number of classical tests for collecting the required experimental data. However, a great effort has been made recently by the scientific and industrial communities to base this experimental database on heterogeneous tests. These tests can provide richer information on the material behavior, allowing the identification of a more complete set of material parameters. This is a result of the recent development of full-field measurement techniques, like digital image correlation (DIC), that can capture the heterogeneous deformation fields on the specimen surface during the test. Recently, new specimen geometries were designed to enhance the richness of the strain field and capture supplementary strain states. The butterfly specimen is an example of these new geometries, designed through a numerical optimization procedure that maximizes an indicator evaluating the heterogeneity and the richness of the strain information. However, no experimental validation had yet been performed. The aim of this work is to experimentally validate the heterogeneous butterfly mechanical test in the parameter identification framework. To this end, the DIC technique and a Finite Element Model Updating (FEMU) inverse strategy are used together for the parameter identification of a DC04 steel, as well as for the calculation of the indicator. The experimental tests are carried out in a universal testing machine with the ARAMIS measuring system to provide the strain states on the specimen surface. The identification strategy is applied to the data obtained from the experimental tests, and the results are compared to a reference numerical solution.
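A Finite Element Model Updating loop reduces, in essence, to nonlinear least squares between a measured full-field quantity and its FE prediction. The sketch below shows that structure with a hypothetical one-dimensional stand-in for the FE solver and invented "DIC" data; in the real workflow, simulate_strain() would be a call to the FE model of the butterfly test.

```python
import numpy as np
from scipy.optimize import least_squares

def simulate_strain(params, x):
    """Hypothetical stand-in for the FE prediction of the strain field."""
    K, n = params                      # e.g. hardening-law parameters
    return K * (0.02 + x) ** n

# Invented "DIC measurement": the stand-in model at known parameters plus noise.
x = np.linspace(0.0, 0.2, 50)
eps_dic = simulate_strain([520.0, 0.23], x) \
          + np.random.default_rng(1).normal(0.0, 2.0, x.size)

# FEMU step: drive the FE prediction toward the full-field measurement.
fit = least_squares(lambda p: simulate_strain(p, x) - eps_dic,
                    x0=[400.0, 0.10],
                    bounds=([100.0, 0.01], [1000.0, 0.50]))
print("identified parameters:", np.round(fit.x, 3))
```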
NASA Technical Reports Server (NTRS)
Griffin, Lisa W.; Huber, Frank W.
1992-01-01
The current status of the activities and future plans of the Turbine Technology Team of the Consortium for Computational Fluid Dynamics is reviewed. The activities of the Turbine Team focus on developing and enhancing codes and models, obtaining data for code validation and general understanding of flows through turbines, and developing and analyzing the aerodynamic designs of turbines suitable for use in the Space Transportation Main Engine fuel and oxidizer turbopumps. Future work will include the experimental evaluation of the oxidizer turbine configuration, the development, analysis, and experimental verification of concepts to control secondary and tip losses, and the aerodynamic design, analysis, and experimental evaluation of turbine volutes.
CADASTER QSPR Models for Predictions of Melting and Boiling Points of Perfluorinated Chemicals.
Bhhatarai, Barun; Teetz, Wolfram; Liu, Tao; Öberg, Tomas; Jeliazkova, Nina; Kochev, Nikolay; Pukalov, Ognyan; Tetko, Igor V; Kovarich, Simona; Papa, Ester; Gramatica, Paola
2011-03-14
Quantitative structure-property relationship (QSPR) studies of the melting point (MP) and boiling point (BP) of per- and polyfluorinated chemicals (PFCs) are presented. The training and prediction chemicals used for developing and validating the models were selected from the Syracuse PhysProp database and the literature. The available experimental data sets were split in two different ways: a) random selection on response value, and b) structural similarity verified by self-organizing map (SOM), in order to propose reliable predictive models, developed only on the training sets and externally verified on the prediction sets. Individual models based on linear and non-linear approaches, developed by different CADASTER partners using 0D-2D Dragon descriptors, E-state descriptors, and fragment-based descriptors, as well as a consensus model, are presented together with their predictions. In addition, the predictive performance of the developed models was verified on a blind external validation set (EV-set) prepared using the PERFORCE database, comprising 15 MP and 25 BP data points, respectively. This database contains only long-chain perfluoroalkylated chemicals, particularly monitored by regulatory agencies like US-EPA and EU-REACH. QSPR models with internal and external validation on two different external prediction/validation sets, and a study of the applicability domain highlighting the robustness and high accuracy of the models, are discussed. Finally, MPs for an additional 303 PFCs and BPs for 271 PFCs, for which experimental measurements are unknown, were predicted. Copyright © 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Spielmann, Horst; Grune, Barbara; Liebsch, Manfred; Seiler, Andrea; Vogel, Richard
2008-06-01
A short description of the history of the 3Rs concept is given, which was developed by Russell and Burch more than 40 years ago as the scientific concept to refine, reduce and replace animal experiments. In addition, the legal framework in Europe for developing alternatives to animal experiments is described, and the current status of in vitro systems in pharmacology and toxicology is presented, including an update on metabolising systems. The decrease in experimental animal numbers during the past decade in Europe is illustrated by the situation in Germany, and the contribution of international harmonisation of test guidelines to reducing animal numbers in regulatory testing is described. A review of the development of the principles of experimental validation is given, and the 3T3 NRU in vitro phototoxicity test is used as an example of a successful validation study, which led to the acceptance of the first in vitro toxicity test for regulatory purposes by the OECD. Finally, the currently accepted alternative methods for standardisation and safety testing of drugs, biologicals and medical devices are summarised.
Ocean Observations with EOS/MODIS: Algorithm Development and Post Launch Studies
NASA Technical Reports Server (NTRS)
Gordon, Howard R.; Conboy, Barbara (Technical Monitor)
1999-01-01
This separation has been logical thus far; however, as launch of AM-1 approaches, it must be recognized that many of these activities will shift emphasis from algorithm development to validation. For example, the second, third, and fifth bullets will become almost totally validation-focussed activities in the post-launch era, providing the core of our experimental validation effort. Work under the first bullet will continue into the post-launch time frame, driven in part by algorithm deficiencies revealed as a result of validation activities. Prior to the start of the 1999 fiscal year (FY99) we were requested to prepare a brief plan for our FY99 activities. This plan is included as Appendix 1. The present report describes the progress made on our planned activities.
Validation Testing of a Peridynamic Impact Damage Model Using NASA's Micro-Particle Gun
NASA Technical Reports Server (NTRS)
Baber, Forrest E.; Zelinski, Brian J.; Guven, Ibrahim; Gray, Perry
2017-01-01
Through a collaborative effort between Virginia Commonwealth University and Raytheon, a peridynamic model for sand impact damage has been developed [1-3]. Model development has focused on simulating impacts of sand particles on ZnS traveling at velocities consistent with aircraft take-off and landing speeds. The model reproduces common features of impact damage including pit and radial cracks, and, under some conditions, lateral cracks. This study focuses on a preliminary validation exercise in which simulation results from the peridynamic model are compared to a limited experimental data set generated by NASA's recently developed micro-particle gun (MPG). The MPG facility measures the dimensions and incoming and rebound velocities of the impact particles. It also links each particle to a specific impact site and its associated damage. In this validation exercise, parameters of the peridynamic model are adjusted to fit the experimentally observed pit diameter, average length of radial cracks, and rebound velocities for 4 impacts of 300 µm glass beads on ZnS. Results indicate that a reasonable fit of these impact characteristics can be obtained by suitable adjustment of the peridynamic input parameters, demonstrating that the MPG can be used effectively as a validation tool for impact modeling and that the peridynamic sand impact model described herein possesses not only a qualitative but also a quantitative ability to simulate sand impact events.
NASA Technical Reports Server (NTRS)
Magee, Todd E.; Wilcox, Peter A.; Fugal, Spencer R.; Acheson, Kurt E.; Adamson, Eric E.; Bidwell, Alicia L.; Shaw, Stephen G.
2013-01-01
This report describes the work conducted by The Boeing Company under American Recovery and Reinvestment Act (ARRA) and NASA funding to experimentally validate the conceptual design of a supersonic airliner feasible for entry into service in the 2018 to 2020 timeframe (NASA N+2 generation). The report discusses the design, analysis and development of a low-boom concept that meets aggressive sonic boom and performance goals for a cruise Mach number of 1.8. The design is achieved through integrated multidisciplinary optimization tools. The report also describes the detailed design and fabrication of both sonic boom and performance wind tunnel models of the low-boom concept. Additionally, a description of the detailed validation wind tunnel testing that was performed with the wind tunnel models is provided along with validation comparisons with pretest Computational Fluid Dynamics (CFD). Finally, the report describes the evaluation of existing NASA sonic boom pressure rail measurement instrumentation and a detailed description of new sonic boom measurement instrumentation that was constructed for the validation wind tunnel testing.
Validation of Lower Body Negative Pressure as an Experimental Model of Hemorrhage
2013-12-19
saving intervention (15). Therefore it is important to develop a valid model for understanding the physiology of human hemorrhage, especially during the... hemorrhage to investigate the physiological responses to hypovolemia (7). LBNP causes a reduction in pressure surrounding the lower extremities. As... from that observed with hemorrhage reflects the physiological mechanisms producing central hypovolemia. During LBNP, intravascular fluid shifts to the
Quasiglobal reaction model for ethylene combustion
NASA Technical Reports Server (NTRS)
Singh, D. J.; Jachimowski, Casimir J.
1994-01-01
The objective of this study is to develop a reduced mechanism for ethylene oxidation. The authors are interested in a model with a minimum number of species and reactions that still models the chemistry with reasonable accuracy for the expected combustor conditions. The model will be validated by comparing the results to those calculated with a detailed kinetic model that has been validated against the experimental data.
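A quasiglobal mechanism replaces most elementary steps with one (or a few) global Arrhenius rate expressions. The sketch below shows the generic one-step form for ethylene oxidation; the pre-exponential factor, activation energy, and concentration exponents are placeholders for illustration, not the calibrated constants of the model described above.

```python
import numpy as np

R = 8.314  # J/(mol K), universal gas constant

def global_rate(T, c_fuel, c_o2, A=1.0e12, Ea=1.25e5, a=0.1, b=1.65):
    """One-step global rate for C2H4 + 3 O2 -> 2 CO2 + 2 H2O.

    A, Ea, a, and b are placeholder constants; a real quasiglobal model
    calibrates them against detailed-mechanism or experimental data.
    """
    return A * np.exp(-Ea / (R * T)) * c_fuel**a * c_o2**b

# Fuel/oxidizer concentrations in mol/cm^3 at a combustor-like temperature.
print(f"rate at 1500 K: {global_rate(1500.0, 1e-6, 3e-6):.3e}")
```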
The Effects of Magnetic Nozzle Configurations on Plasma Thrusters
NASA Technical Reports Server (NTRS)
Turchi, P. J.
1997-01-01
Over the course of eight years, the Ohio State University has performed research in support of electric propulsion development efforts at the NASA Lewis Research Center, Cleveland, OH. This research has been largely devoted to plasma propulsion systems including MagnetoPlasmaDynamic (MPD) thrusters with externally-applied, solenoidal magnetic fields, hollow cathodes, and Pulsed Plasma Microthrusters (PPTs). Both experimental and theoretical work has been performed, as documented in four master's theses, two doctoral dissertations, and numerous technical papers. The present document is the final report for the grant period 5 December 1987 to 31 December 1995, and summarizes all activities. Detailed discussions of each area of activity are provided in appendices: Appendix 1 - Experimental studies of magnetic nozzle effects on plasma thrusters; Appendix 2 - Numerical modeling of applied-field MPD thrusters; Appendix 3 - Theoretical and experimental studies of hollow cathodes; and Appendix 4 - Theoretical, numerical and experimental studies of pulsed plasma thrusters. Especially notable results include the efficacy of using a solenoidal magnetic field downstream of a plasma thruster to collimate the exhaust flow, the development of a new understanding of applied-field MPD thrusters (based on experimentally-validated results from state-of-the-art numerical simulation) leading to predictions of improved performance, an experimentally-validated, first-principles model for orificed, hollow-cathode behavior, and the first time-dependent, two-dimensional calculations of ablation-fed, pulsed plasma thrusters.
NASA Astrophysics Data System (ADS)
Serevina, V.; Muliyati, D.
2018-05-01
This research aims to develop a valid and reliable scientific-approach-based instrument for assessing students' performance in the basic physics laboratory on Simple Harmonic Motion (SHM). The study uses the ADDIE model, consisting of the stages Analyze, Design, Development, Implementation, and Evaluation. The developed instrument can be used to measure students' skills in observing, asking, conducting experiments, associating, and communicating experimental results, which are the '5M' stages of the scientific approach. Each assessment item in the instrument was validated by an instrument expert, with all items judged eligible for use (an eligibility percentage of 100%). The instrument was then rated for the quality of its construction, material, and language by a panel of lecturers, with the following results: 85% (very good) for the construction aspect, 87.5% (very good) for the material aspect, and 83% (very good) for the language aspect. A small-group trial yielded an instrument reliability of 0.878, in the high category, against an r-table value of 0.707. A large-group trial yielded an instrument reliability of 0.889, also in the high category, against an r-table value of 0.320. The instrument was declared valid and reliable at the 5% significance level. Based on these results, it can be concluded that the developed scientific-approach-based student performance assessment instrument is valid and reliable for assessing students' skills in SHM experimental activities.
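The abstract does not name the reliability coefficient behind the 0.878 and 0.889 figures; as an illustration, one common choice for multi-item performance instruments is Cronbach's alpha, sketched below on an invented ratings matrix.

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for a respondents-by-items score matrix."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]                          # number of items
    item_var = scores.var(axis=0, ddof=1).sum()  # sum of per-item variances
    total_var = scores.sum(axis=1).var(ddof=1)   # variance of respondent totals
    return k / (k - 1) * (1 - item_var / total_var)

# Hypothetical ratings: 8 students scored on 5 performance items (1-4 scale).
ratings = np.array([[3, 4, 3, 4, 3],
                    [2, 2, 3, 2, 2],
                    [4, 4, 4, 3, 4],
                    [3, 3, 2, 3, 3],
                    [1, 2, 2, 1, 2],
                    [4, 3, 4, 4, 4],
                    [2, 3, 2, 2, 3],
                    [3, 3, 3, 4, 3]])
print(f"alpha = {cronbach_alpha(ratings):.3f}")
```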
1960-03-01
BELL XV-3 (AF54-148) Convertiplane (experimental tilt rotor) IN FLIGHT Note: Used in publication in Flight Research at Ames; 57 Years of Development and Validation of Aeronautical Technology NASA SP-1998-3300 fig. 121
Defense Science Board Task Force Report on Next-Generation Unmanned Undersea Systems
2016-10-01
active learning occurs in an environment that extends beyond choreographed demonstrations designed to validate pre-determined hypotheses. Finally, when... OPNAV N99 should coordinate a broad-based design, development, and experimental effort to bypass traditional limitations for unmanned undersea... approaches that could facilitate rapid experimentation, operational demonstration of capabilities, and deployment of initial capabilities that show
Perspectives on the simulation of protein–surface interactions using empirical force field methods
Latour, Robert A.
2014-01-01
Protein–surface interactions are of fundamental importance for a broad range of applications in the fields of biomaterials and biotechnology. Present experimental methods are limited in their ability to provide a comprehensive depiction of these interactions at the atomistic level. In contrast, empirical force field based simulation methods inherently provide the ability to predict and visualize protein–surface interactions with full atomistic detail. These methods, however, must be carefully developed, validated, and properly applied before confidence can be placed in results from the simulations. In this perspectives paper, I provide an overview of the critical aspects that I consider to be of greatest importance for the development of these methods, with a focus on the research that my combined experimental and molecular simulation groups have conducted over the past decade to address these issues. These critical issues include the tuning of interfacial force field parameters to accurately represent the thermodynamics of interfacial behavior, adequate sampling of these types of complex molecular systems to generate results that can be compared with experimental data, and the generation of experimental data that can be used for the evaluation and validation of simulation results. PMID:25028242
Verification and Validation of Residual Stresses in Bi-Material Composite Rings
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nelson, Stacy Michelle; Hanson, Alexander Anthony; Briggs, Timothy
Process-induced residual stresses commonly occur in composite structures composed of dissimilar materials. These residual stresses form due to differences in the composite materials' coefficients of thermal expansion and the shrinkage upon cure exhibited by polymer matrix materials. Depending upon the specific geometric details of the composite structure and the materials' curing parameters, it is possible that these residual stresses could result in interlaminar delamination or fracture within the composite. Therefore, the consideration of potential residual stresses is important when designing composite parts and their manufacturing processes. However, the experimental determination of residual stresses in prototype parts can be time- and cost-prohibitive. As an alternative to physical measurement, it is possible for computational tools to be used to quantify potential residual stresses in composite prototype parts. Therefore, the objectives of the presented work are to demonstrate a simple method for simulating residual stresses in composite parts, as well as the potential value of sensitivity and uncertainty quantification techniques during analyses for which material property parameters are unknown. Specifically, a simplified residual stress modeling approach, which accounts for coefficient of thermal expansion mismatch and polymer shrinkage, is implemented within the Sandia National Laboratories-developed SIERRA/SolidMechanics code. Concurrent with the model development, two simple bi-material structures composed of a carbon fiber/epoxy composite and aluminum, a flat plate and a cylinder, were fabricated and the residual stresses quantified through the measurement of deformation. Then, in the process of validating the developed modeling approach with the experimental residual stress data, manufacturing process simulations of the two simple structures were developed and underwent a formal verification and validation process, including a mesh convergence study, sensitivity analysis, and uncertainty quantification. The simulations' final results show adequate agreement with the experimental measurements, indicating the validity of the simple modeling approach, as well as the necessity of including material parameter uncertainty in the final residual stress predictions.
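For intuition about the magnitude of such stresses, a fully-constrained biaxial estimate, sigma = E * delta_alpha * delta_T / (1 - nu), is a useful first check before any FE run. The numbers below are generic handbook-style values, not the properties used in this work, and the fully-constrained assumption makes the result an upper bound.

```python
# Upper-bound estimate of cool-down mismatch stress in the aluminum layer of a
# CFRP/aluminum bi-material plate, assuming full in-plane constraint.
E = 70e9            # Pa, aluminum elastic modulus (generic value)
nu = 0.33           # aluminum Poisson's ratio
alpha_al = 23e-6    # 1/K, aluminum CTE
alpha_cfrp = 2e-6   # 1/K, generic in-plane CTE of a carbon/epoxy laminate
dT = -100.0         # K, hypothetical cool-down from the cure temperature

# Tensile in the aluminum, which is prevented from shrinking by the laminate.
sigma = -E * (alpha_al - alpha_cfrp) * dT / (1 - nu)
print(f"biaxial mismatch stress in the aluminum: {sigma/1e6:.0f} MPa")
```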
Multiscale and Multiphysics Modeling of Additive Manufacturing of Advanced Materials
NASA Technical Reports Server (NTRS)
Liou, Frank; Newkirk, Joseph; Fan, Zhiqiang; Sparks, Todd; Chen, Xueyang; Fletcher, Kenneth; Zhang, Jingwei; Zhang, Yunlu; Kumar, Kannan Suresh; Karnati, Sreekar
2015-01-01
The objective of this project is to research and develop a prediction tool for advanced additive manufacturing (AAM) processes for advanced materials, and to develop experimental methods to provide fundamental properties and establish validation data. Aircraft structures and engines demand materials that are stronger, useable at much higher temperatures, provide less acoustic transmission, and enable more aeroelastic tailoring than those currently used. Significant improvements in properties can only be achieved by processing the materials under nonequilibrium conditions, such as AAM processes. AAM processes encompass a class of processes that use a focused heat source to create a melt pool on a substrate. Examples include Electron Beam Freeform Fabrication and Direct Metal Deposition. These types of additive processes enable fabrication of parts directly from CAD drawings. To achieve the desired material properties and geometries of the final structure, it is necessary to assess the impact of process parameters and to predict optimized conditions with numerical modeling as an effective prediction tool. The processing targets are multiple and span different spatial scales, and the associated physical phenomena are multiphysics and multiscale in nature. In this project, the research work was developed to model AAM processes with a multiscale and multiphysics approach. A macroscale model was developed to investigate the residual stresses and distortion in AAM processes. A sequentially coupled, thermomechanical, finite element model was developed and validated experimentally. The results showed the temperature distribution, residual stress, and deformation within the formed deposits and substrates. A mesoscale model was developed to include heat transfer, phase change with a mushy zone, incompressible free surface flow, solute redistribution, and surface tension. Because of the excessive computing time needed, a parallel computing approach was also tested. In addition, after investigating various methods, a Smoothed Particle Hydrodynamics (SPH) model was developed to model the wire feeding process. Its computational efficiency and simple architecture make it more robust and flexible than other models. More research on material properties may be needed to realistically model the AAM processes. A microscale model was developed to investigate heterogeneous nucleation, dendritic grain growth, epitaxial growth of columnar grains, columnar-to-equiaxed transition, grain transport in the melt, and other properties. The orientations of the columnar grains were almost perpendicular to the direction of the laser motion. Compared to similar studies in the literature, the multiple-grain morphology modeling result is of the same order of magnitude as the optical morphologies observed in the experiment. Experimental work was conducted to validate the different models. An infrared camera was incorporated as a process monitoring and validation tool to identify the solidus and mushy zones during deposition. The images were successfully processed to identify these regions. This research project has investigated the multiscale and multiphysics nature of the complex AAM processes, leading to an advanced understanding of these processes. The project has also developed several modeling tools and experimental validation tools that will be critical to the future qualification and certification of AAM processes.
Continued Development and Validation of Methods for Spheromak Simulation
NASA Astrophysics Data System (ADS)
Benedett, Thomas
2015-11-01
The HIT-SI experiment has demonstrated stable sustainment of spheromaks; determining how the underlying physics extrapolate to larger, higher-temperature regimes is of prime importance in determining the viability of the inductively-driven spheromak. It is thus prudent to develop and validate a computational model that can be used to study current results and provide an intermediate step between theory and future experiments. A zero-beta Hall-MHD model has shown good agreement with experimental data at 14.5 kHz injector operation. Experimental observations at higher frequency, where the best performance is achieved, indicate pressure effects are important and likely required to attain quantitative agreement with simulations. Efforts to extend the existing validation to high frequency (~ 36-68 kHz) using an extended MHD model implemented in the PSI-TET arbitrary-geometry 3D MHD code will be presented. Results from verification of the PSI-TET extended MHD model using the GEM magnetic reconnection challenge will also be presented along with investigation of injector configurations for future SIHI experiments using Taylor state equilibrium calculations. Work supported by DoE.
Validation and Continued Development of Methods for Spheromak Simulation
NASA Astrophysics Data System (ADS)
Benedett, Thomas
2016-10-01
The HIT-SI experiment has demonstrated stable sustainment of spheromaks. Determining how the underlying physics extrapolate to larger, higher-temperature regimes is of prime importance in determining the viability of the inductively-driven spheromak. It is thus prudent to develop and validate a computational model that can be used to study current results and study the effect of possible design choices on plasma behavior. A zero-beta Hall-MHD model has shown good agreement with experimental data at 14.5 kHz injector operation. Experimental observations at higher frequency, where the best performance is achieved, indicate pressure effects are important and likely required to attain quantitative agreement with simulations. Efforts to extend the existing validation to high frequency (36-68 kHz) using an extended MHD model implemented in the PSI-TET arbitrary-geometry 3D MHD code will be presented. An implementation of anisotropic viscosity, a feature observed to improve agreement between NIMROD simulations and experiment, will also be presented, along with investigations of flux conserver features and their impact on density control for future SIHI experiments. Work supported by DoE.
Testability of evolutionary game dynamics based on experimental economics data
NASA Astrophysics Data System (ADS)
Wang, Yijia; Chen, Xiaojie; Wang, Zhijian
2017-11-01
Understanding the dynamic processes of a real game system requires an appropriate dynamics model, and rigorously testing a dynamics model is nontrivial. In our methodological research, we develop an approach to testing the validity of game dynamics models that considers the dynamic patterns of angular momentum and speed as measurement variables. Using Rock-Paper-Scissors (RPS) games as an example, we illustrate the geometric patterns in the experiment data. We then derive the related theoretical patterns from a series of typical dynamics models. By testing the goodness-of-fit between the experimental and theoretical patterns, we show that the validity of these models can be evaluated quantitatively. Our approach establishes a link between dynamics models and experimental systems, which is, to the best of our knowledge, the most effective and rigorous strategy for ascertaining the testability of evolutionary game dynamics models.
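As an illustration of the measurement variables involved, the angular momentum of a social-state trajectory can be computed directly from time-series data as the cross product of position (relative to a reference point such as the simplex centroid) and velocity. The sketch below uses an invented circular trajectory as a stand-in for experimental RPS data.

```python
import numpy as np

# Invented trajectory: the population's social state projected onto a plane,
# cycling around the simplex centroid (a stand-in for experimental RPS data).
t = np.linspace(0.0, 20.0, 2001)
traj = np.column_stack([0.15 * np.cos(t), 0.15 * np.sin(t)]) + 1.0 / 3.0

r = traj - np.array([1.0 / 3.0, 1.0 / 3.0])   # position relative to the centroid
v = np.gradient(traj, t, axis=0)              # finite-difference velocity

# Measurement variables: instantaneous angular momentum (z-component of r x v)
# and speed; a persistent nonzero mean angular momentum signals cyclic motion.
L = r[:, 0] * v[:, 1] - r[:, 1] * v[:, 0]
speed = np.linalg.norm(v, axis=1)
print(f"mean angular momentum: {L.mean():+.4f}, mean speed: {speed.mean():.4f}")
```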
[The use of systematic review to develop a self-management program for CKD].
Lee, Yu-Chin; Wu, Shu-Fang Vivienne; Lee, Mei-Chen; Chen, Fu-An; Yao, Yen-Hong; Wang, Chin-Ling
2014-12-01
Chronic kidney disease (CKD) has become a public health issue of international concern due to its high prevalence. The concept of self-management has been comprehensively applied in education programs that address chronic diseases. In recent years, many studies have used self-management programs in CKD interventions and have investigated the pre- and post-intervention physiological and psychological effectiveness of this approach. However, a complete clinical application program based on the self-management model has yet to be developed for use in clinical renal care settings. In this study, a systematic review was used to develop a self-management program for CKD. Three implementation steps were followed: (1) a systematic literature search and review using databases including CEPS (Chinese Electronic Periodical Services) of Airiti, the National Digital Library of Theses and Dissertations in Taiwan, CINAHL, PubMed, Medline, the Cochrane Library, and the Joanna Briggs Institute, from which a total of 22 studies were identified as valid and submitted to rigorous analysis (4 systematic literature reviews, 10 randomized experimental studies, and 8 non-randomized experimental studies); (2) the empirical evidence was then used to draft relevant guidelines for clinical application; and (3) finally, expert panels tested the validity of the draft to ensure the final version was valid for application in practice. This study designed a self-management program for CKD based on the findings of empirical studies. The content of this program included the design principles, categories, elements, and intervention measures used in the self-management program. The program was then assessed using the content validity index (CVI) on a four-point Likert scale, yielding a content validity score of .98. The guidelines for the CKD self-management program were thus developed. This study developed a self-management program applicable to local care of CKD. It is hoped that the guidelines developed in this study offer a reference for clinical caregivers to improve their healthcare practices.
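Assuming the common item-level definition of the CVI (the share of experts rating an item 3 or 4 on the 4-point relevance scale, averaged across items for the scale-level index), the computation is straightforward; the panel ratings below are hypothetical.

```python
# Hypothetical ratings from a 5-expert panel for four program components.
ratings = {
    "design principles": [4, 4, 4, 3, 4],
    "categories":        [4, 3, 4, 2, 4],   # one rating below 3
    "elements":          [4, 4, 3, 4, 4],
    "interventions":     [4, 4, 4, 4, 3],
}

# Item-level CVI: proportion of experts rating the item 3 or 4.
i_cvi = {item: sum(r >= 3 for r in rs) / len(rs) for item, rs in ratings.items()}
s_cvi = sum(i_cvi.values()) / len(i_cvi)     # scale-level CVI (averaging method)
print(i_cvi)
print(f"S-CVI/Ave = {s_cvi:.2f}")
```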
Assessing Students' Understanding of Macroevolution: Concerns regarding the Validity of the MUM
ERIC Educational Resources Information Center
Novick, Laura R.; Catley, Kefyn M.
2012-01-01
In a recent article, Nadelson and Southerland (2010. Development and preliminary evaluation of the Measure of Understanding of Macroevolution: Introducing the MUM. "The Journal of Experimental Education", 78, 151-190) reported on their development of a multiple-choice concept inventory intended to assess college students' understanding…
A Locomotion Control Algorithm for Robotic Linkage Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dohner, Jeffrey L.
This dissertation describes the development of a control algorithm that transitions a robotic linkage system between stabilized states producing responsive locomotion. The developed algorithm is demonstrated using a simple robotic construction consisting of a few links with actuation and sensing at each joint. Numerical and experimental validation is presented.
Xie, Yi; Mun, Sungyong; Kim, Jinhyun; Wang, Nien-Hwa Linda
2002-01-01
A tandem simulated moving bed (SMB) process for insulin purification has been proposed and validated experimentally. The mixture to be separated consists of insulin, high molecular weight proteins, and zinc chloride. A systematic approach based on the standing wave design, rate model simulations, and experiments was used to develop this multicomponent separation process. The standing wave design was applied to specify the SMB operating conditions of a lab-scale unit with 10 columns. The design was validated with rate model simulations prior to experiments. The experimental results show 99.9% purity and 99% yield, which closely agree with the model predictions and the standing wave design targets. The agreement proves that the standing wave design can ensure high purity and high yield for the tandem SMB process. Compared to a conventional batch SEC process, the tandem SMB has 10% higher yield, 400% higher throughput, and 72% lower eluant consumption. In contrast, a design that ignores the effects of mass transfer and nonideal flow cannot meet the purity requirement and gives less than 96% yield.
Alternatives to animal testing: research, trends, validation, regulatory acceptance.
Huggins, Jane
2003-01-01
Current trends and issues in the development of alternatives to the use of animals in biomedical experimentation are discussed in this position paper. Eight topics are considered and include refinement of acute toxicity assays; eye corrosion/irritation alternatives; skin corrosion/irritation alternatives; contact sensitization alternatives; developmental/reproductive testing alternatives; genetic engineering (transgenic) assays; toxicogenomics; and validation of alternative methods. The discussion of refinement of acute toxicity assays is focused primarily on developments with regard to reduction of the number of animals used in the LD(50) assay. However, the substitution of humane endpoints such as clinical signs of toxicity for lethality in these assays is also evaluated. Alternative assays for eye corrosion/irritation as well as those for skin corrosion/irritation are described with particular attention paid to the outcomes, both successful and unsuccessful, of several validation efforts. Alternative assays for contact sensitization and developmental/reproductive toxicity are presented as examples of methods designed for the examination of interactions between toxins and somewhat more complex physiological systems. Moreover, genetic engineering and toxicogenomics are discussed with an eye toward the future of biological experimentation in general. The implications of gene manipulation for research animals, specifically, are also examined. Finally, validation methods are investigated as to their effectiveness, or lack thereof, and suggestions for their standardization and improvement, as well as implementation are reviewed.
Three-dimensional shape optimization of a cemented hip stem and experimental validations.
Higa, Masaru; Tanino, Hiromasa; Nishimura, Ikuya; Mitamura, Yoshinori; Matsuno, Takeo; Ito, Hiroshi
2015-03-01
This study proposes a novel optimized stem geometry with low stress values in the cement, using a finite element (FE) analysis combined with an optimization procedure and experimental measurements of cement stress in vitro. We first optimized an existing stem geometry using a three-dimensional FE analysis combined with a shape optimization technique. One of the most important factors in cemented stem design is to reduce stress in the cement. Hence, in the optimization study, we minimized the largest tensile principal stress in the cement mantle under a physiological loading condition by changing the stem geometry. As the next step, the optimized stem and the existing stem were manufactured to validate the usefulness of the numerical models and the results of the optimization in vitro. In the experimental study, strain gauges were embedded in the cement mantle to measure the strain in the cement mantle adjacent to the stems. The overall trend of the experimental study was in good agreement with the results of the numerical study, and we were able to reduce the largest stress by more than 50% in both the shape optimization and the strain gauge measurements. Thus, we could validate the usefulness of the numerical models and the results of the optimization using the experimental models. The optimization employed in this study is a useful approach for developing new stem designs.
miRTarBase update 2018: a resource for experimentally validated microRNA-target interactions.
Chou, Chih-Hung; Shrestha, Sirjana; Yang, Chi-Dung; Chang, Nai-Wen; Lin, Yu-Ling; Liao, Kuang-Wen; Huang, Wei-Chi; Sun, Ting-Hsuan; Tu, Siang-Jyun; Lee, Wei-Hsiang; Chiew, Men-Yee; Tai, Chun-San; Wei, Ting-Yen; Tsai, Tzi-Ren; Huang, Hsin-Tzu; Wang, Chung-Yu; Wu, Hsin-Yi; Ho, Shu-Yi; Chen, Pin-Rong; Chuang, Cheng-Hsun; Hsieh, Pei-Jung; Wu, Yi-Shin; Chen, Wen-Liang; Li, Meng-Ju; Wu, Yu-Chun; Huang, Xin-Yi; Ng, Fung Ling; Buddhakosai, Waradee; Huang, Pei-Chun; Lan, Kuan-Chun; Huang, Chia-Yen; Weng, Shun-Long; Cheng, Yeong-Nan; Liang, Chao; Hsu, Wen-Lian; Huang, Hsien-Da
2018-01-04
MicroRNAs (miRNAs) are small non-coding RNAs of ∼22 nucleotides that are involved in the negative regulation of mRNA at the post-transcriptional level. Previously, we developed miRTarBase, which provides information about experimentally validated miRNA-target interactions (MTIs). Here, we describe an updated database containing 422,517 curated MTIs from 4,076 miRNAs and 23,054 target genes, collected from over 8,500 articles. The number of MTIs curated with strong evidence has increased ∼1.4-fold since the last update in 2016. In this updated version, target sites validated by reporter assay that are available in the literature can be downloaded. The target site sequences can be used to extract new features for analysis via a machine learning approach, which can help to evaluate the performance of miRNA-target prediction tools. Furthermore, different browsing options make it easier for users to locate specific MTIs. With these improvements, miRTarBase serves as a more comprehensively annotated database of experimentally validated miRNA-target interactions in the field of miRNA-related research. miRTarBase is available at http://miRTarBase.mbc.nctu.edu.tw/. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.
Experimental Validation: Subscale Aircraft Ground Facilities and Integrated Test Capability
NASA Technical Reports Server (NTRS)
Bailey, Roger M.; Hostetler, Robert W., Jr.; Barnes, Kevin N.; Belcastro, Celeste M.; Belcastro, Christine M.
2005-01-01
Experimental testing is an important aspect of validating complex integrated safety critical aircraft technologies. The Airborne Subscale Transport Aircraft Research (AirSTAR) Testbed is being developed at NASA Langley to validate technologies under conditions that cannot be flight validated with full-scale vehicles. The AirSTAR capability comprises a series of flying sub-scale models, associated ground-support equipment, and a base research station at NASA Langley. The subscale model capability utilizes a generic 5.5% scaled transport class vehicle known as the Generic Transport Model (GTM). The AirSTAR Ground Facilities encompass the hardware and software infrastructure necessary to provide comprehensive support services for the GTM testbed. The ground facilities support remote piloting of the GTM aircraft, and include all subsystems required for data/video telemetry, experimental flight control algorithm implementation and evaluation, GTM simulation, data recording/archiving, and audio communications. The ground facilities include a self-contained, motorized vehicle serving as a mobile research command/operations center, capable of deployment to remote sites when conducting GTM flight experiments. The ground facilities also include a laboratory based at NASA LaRC providing near identical capabilities as the mobile command/operations center, as well as the capability to receive data/video/audio from, and send data/audio to the mobile command/operations center during GTM flight experiments.
Experimental validation of structural optimization methods
NASA Technical Reports Server (NTRS)
Adelman, Howard M.
1992-01-01
The topic of validating structural optimization methods by use of experimental results is addressed. The need for validating the methods as a way of effecting a greater and an accelerated acceptance of formal optimization methods by practicing engineering designers is described. The range of validation strategies is defined, including comparison of optimization results with more traditional design approaches, establishing the accuracy of the analyses used, and, finally, experimental validation of the optimization results. Examples of the use of experimental results to validate optimization techniques are described, including experimental validation of the following: optimum design of a trussed beam; combined control-structure design of a cable-supported beam simulating an actively controlled space structure; minimum-weight design of a beam with frequency constraints; minimization of the vibration response of a helicopter rotor blade; minimum-weight design of a turbine blade disk; aeroelastic optimization of an aircraft vertical fin; airfoil shape optimization for drag minimization; optimization of the shape of a hole in a plate for stress minimization; optimization to minimize beam dynamic response; and structural optimization of a low-vibration helicopter rotor.
Hyper-X: Flight Validation of Hypersonic Airbreathing Technology
NASA Technical Reports Server (NTRS)
Rausch, Vincent L.; McClinton, Charles R.; Crawford, J. Larry
1997-01-01
This paper provides an overview of NASA's focused hypersonic technology program, i.e., the Hyper-X program. This program is designed to move hypersonic, air-breathing vehicle technology from the laboratory environment to the flight environment, the last stage preceding prototype development. This paper presents some history leading to the flight test program, research objectives, approach, schedule, and status. A substantial experimental data base has been established and concept validation completed. The program is concentrating on Mach 7 vehicle development, verification, and validation in preparation for wind tunnel testing in 1998 and flight testing in 1999. It is also concentrating on finalization of the Mach 5 and 10 vehicle designs. Detailed evaluation of the Mach 7 vehicle at flight conditions is nearing completion and will provide a data base for validation of design methods once flight test data are available.
Computational fluid dynamic modeling of a medium-sized surface mine blasthole drill shroud
Zheng, Y.; Reed, W.R.; Zhou, L.; Rider, J.P.
2016-01-01
The Pittsburgh Mining Research Division of the U.S. National Institute for Occupational Safety and Health (NIOSH) recently developed a series of models using computational fluid dynamics (CFD) to study airflows and respirable dust distribution associated with a medium-sized surface blasthole drill shroud with a dry dust collector system. Previously run experiments conducted in NIOSH’s full-scale drill shroud laboratory were used to validate the models. The setup values in the CFD models were calculated from experimental data obtained from the drill shroud laboratory and measurements of test material particle size. Subsequent simulation results were compared with the experimental data for several test scenarios, including 0.14 m3/s (300 cfm) and 0.24 m3/s (500 cfm) bailing airflow with 2:1, 3:1 and 4:1 dust collector-to-bailing airflow ratios. For the 2:1 and 3:1 ratios, the calculated dust concentrations from the CFD models were within the 95 percent confidence intervals of the experimental data. This paper describes the methodology used to develop the CFD models, to calculate the model input and to validate the models based on the experimental data. Problem regions were identified and revealed by the study. The simulation results could be used for future development of dust control methods for a surface mine blasthole drill shroud. PMID:27932851
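The acceptance test described here (model predictions falling inside the 95 percent confidence intervals of the experimental data) is straightforward to reproduce. The sketch below uses invented replicate concentrations and a hypothetical CFD value; it shows the t-interval check only, not NIOSH's data.

```python
import numpy as np
from scipy import stats

# Invented replicate dust concentrations (mg/m^3) for one airflow-ratio scenario,
# plus a hypothetical CFD-predicted concentration at the same sampling location.
experimental = np.array([1.92, 2.10, 2.31, 2.05, 2.18])
cfd_prediction = 2.12

m, se = experimental.mean(), stats.sem(experimental)
lo, hi = stats.t.interval(0.95, df=experimental.size - 1, loc=m, scale=se)
verdict = "inside" if lo <= cfd_prediction <= hi else "outside"
print(f"95% CI: [{lo:.2f}, {hi:.2f}] -> CFD value {cfd_prediction} is {verdict}")
```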
NASA Technical Reports Server (NTRS)
Noll, Thomas E.; Perry, Boyd, III; Tiffany, Sherwood H.; Cole, Stanley R.; Buttrill, Carey S.; Adams, William M., Jr.; Houck, Jacob A.; Srinathkumar, S.; Mukhopadhyay, Vivek; Pototzky, Anthony S.
1989-01-01
The status of the joint NASA/Rockwell Active Flexible Wing Wind-Tunnel Test Program is described. The objectives are to develop and validate the analysis, design, and test methodologies required to apply multifunction active control technology for improving aircraft performance and stability. Major tasks include designing digital multi-input/multi-output flutter-suppression and rolling-maneuver-load alleviation concepts for a flexible full-span wind-tunnel model, obtaining an experimental data base for the basic model and each control concept and providing comparisons between experimental and analytical results to validate the methodologies. The opportunity is provided to improve real-time simulation techniques and to gain practical experience with digital control law implementation procedures.
A measure of state persecutory ideation for experimental studies.
Freeman, Daniel; Pugh, Katherine; Green, Catherine; Valmaggia, Lucia; Dunn, Graham; Garety, Philippa
2007-09-01
Experimental research is increasingly important in developing the understanding of paranoid thinking, and an assessment measure of persecutory ideation is necessary for such work. We report the reliability and validity of the first state measure of paranoia: the State Social Paranoia Scale. The items in the measure conform to a recent definition in which persecutory thinking has the two elements of feared harm and perpetrator intent. The measure was tested with 164 nonclinical participants and 21 individuals at high risk of psychosis with attenuated positive symptoms. The participants experienced a social situation presented in virtual reality and completed the new measure. The State Social Paranoia Scale was found to have excellent internal reliability, adequate test-retest reliability, clear convergent validity as assessed by both independent interviewer ratings and self-report measures, and divergent validity with measures of positive and neutral thinking. The measure of paranoia in a recent social situation thus has good psychometric properties.
Modelling and analysis of a direct ascorbic acid fuel cell
NASA Astrophysics Data System (ADS)
Zeng, Yingzhi; Fujiwara, Naoko; Yamazaki, Shin-ichi; Tanimoto, Kazumi; Wu, Ping
L-Ascorbic acid (AA), also known as vitamin C, is an environmentally-benign and biologically-friendly compound that can be used as an alternative fuel for direct oxidation fuel cells. While direct ascorbic acid fuel cells (DAAFCs) have been studied experimentally, modelling and simulation of these devices have been overlooked. In this work, we develop a mathematical model to describe a DAAFC and validate it with experimental data. The model is formulated by integrating the mass and charge balances, and model parameters are estimated by best-fitting to experimental data of current-voltage curves. By comparing the transient voltage curves predicted by dynamic simulation and experiments, the model is further validated. Various parameters that affect the power generation are studied by simulation. The cathodic reaction is found to be the most significant determinant of power generation, followed by fuel feed concentration and the mass-transfer coefficient of ascorbic acid. These studies also reveal that the power density steadily increases with respect to the fuel feed concentration. The results may guide future development and operation of a more efficient DAAFC.
Reader, Arran T; Holmes, Nicholas P
2016-01-01
Social interaction is an essential part of the human experience, and much work has been done to study it. However, several common approaches to examining social interactions in psychological research may inadvertently either constrain the observed behaviour unnaturally, causing it to deviate from naturalistic performance, or introduce unwanted sources of variance. In particular, these sources are the differences between naturalistic and experimental behaviour that arise from changes in visual fidelity (quality of the observed stimuli), gaze (whether it is controlled for in the stimuli), and social potential (potential for the stimuli to provide actual interaction). We expand on these possible sources of extraneous variance and explain why they may be important. We review the ways in which experimenters have developed novel designs to remove these sources of extraneous variance. New experimental designs using a 'two-person' approach are argued to be one of the most effective ways to develop more ecologically valid measures of social interaction, and we suggest that future work on social interaction should use these designs wherever possible.
NASA National Combustion Code Simulations
NASA Technical Reports Server (NTRS)
Iannetti, Anthony; Davoudzadeh, Farhad
2001-01-01
A systematic effort is in progress to further validate the National Combustion Code (NCC) that has been developed at NASA Glenn Research Center (GRC) for comprehensive modeling and simulation of aerospace combustion systems. The validation efforts include numerical simulation of the gas-phase combustor experiments conducted at the Center for Turbulence Research (CTR), Stanford University, followed by comparison and evaluation of the computed results against the experimental data. At GRC, a numerical model of the experimental gaseous combustor has been built. The constructed geometry includes the flow development sections for the air annulus and fuel pipe, the 24-channel air and fuel swirlers, the hub, the combustor, and the tail pipe. Furthermore, a three-dimensional, multi-block grid (1.6 million grid points, three levels of multigrid) has been generated. Computational simulation of the gaseous combustor flow field operating on methane fuel has started. The computational domain includes the whole flow regime, starting from the fuel pipe and the air annulus, through the 12 air and 12 fuel channels, into the combustion region, and through the tail pipe.
A Computational Investigation of Gear Windage
NASA Technical Reports Server (NTRS)
Hill, Matthew J.; Kunz, Robert F.
2012-01-01
A CFD method has been developed for application to gear windage aerodynamics. The goals of this research are to develop and validate numerical and modeling approaches for these systems, to develop physical understanding of the aerodynamics of gear windage loss, including the physics of loss mitigation strategies, and to propose and evaluate new approaches for minimizing loss. Absolute- and relative-frame CFD simulation, overset gridding, multiphase flow analysis, and sub-layer resolved turbulence modeling were brought to bear in achieving these goals. Several spur gear geometries were studied for which experimental data are available, covering various shrouding configurations and free-spinning (no shroud) cases. Comparisons are made with experimental data from the open literature and with data recently obtained in the NASA Glenn Research Center Gear Windage Test Facility. The results show good agreement with experiment. Interrogation of the validation and exploratory CFD results has led, for the first time, to a detailed understanding of the physical mechanisms of gear windage loss, and to newly proposed mitigation strategies whose effectiveness is computationally explored.
Mathematical modeling of a single stage ultrasonically assisted distillation process.
Mahdi, Taha; Ahmad, Arshad; Ripin, Adnan; Abdullah, Tuan Amran Tuan; Nasef, Mohamed M; Ali, Mohamad W
2015-05-01
The ability of sonication to facilitate the separation of azeotropic mixtures presents a promising approach to developing distillation systems that are more intensified and efficient than conventional ones. To expedite this much-needed development, a mathematical model of the system based on conservation principles, vapor-liquid equilibrium, and sonochemistry was developed in this study. The model, founded on a single-stage vapor-liquid equilibrium system enhanced with ultrasonic waves, was coded in MATLAB and validated with experimental data for an ethanol-ethyl acetate mixture. The effects of both ultrasonic frequency and intensity on the relative volatility and azeotropic point were examined, and the optimal conditions were obtained using a genetic algorithm. The experimental data validated the model with reasonable accuracy. The results of this study reveal that the azeotropic point of the mixture can be eliminated entirely with the right combination of sonication parameters, which can facilitate design efforts towards a workable ultrasonically intensified distillation system. Copyright © 2014 Elsevier B.V. All rights reserved.
Effects of human running cadence and experimental validation of the bouncing ball model
NASA Astrophysics Data System (ADS)
Bencsik, László; Zelei, Ambrus
2017-05-01
The biomechanical analysis of human running is a complex problem because of the large number of parameters and degrees of freedom. However, simplified models can be constructed, usually characterized by a few fundamental parameters such as step length, foot strike pattern, and cadence. The bouncing ball model of human running is analysed theoretically and experimentally in this work. It is a minimally complex dynamic model for estimating the energy cost of running and the tendency of ground-foot impact intensity as a function of cadence. The model shows that cadence has a direct effect on the energy efficiency of running and on ground-foot impact intensity: higher cadence implies lower risk of injury and better energy efficiency. Experimental data collected from 121 amateur runners are presented. The experimental results validate the model and provide information about the walk-to-run transition speed and the typical development of cadence and grounded phase ratio in different running speed ranges.
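To make the cadence effect concrete, the following sketch treats each step's aerial phase as purely ballistic, so a higher cadence shortens the flight time and lowers both the touchdown speed and the kinetic energy dissipated at impact. It is a minimal reading of a bouncing-ball-type model; the mass and grounded-phase ratio are assumed placeholder values, not the paper's fitted parameters.

```python
import numpy as np

g = 9.81        # gravitational acceleration [m/s^2]
mass = 70.0     # runner mass [kg] -- assumed, illustrative
duty = 0.35     # grounded-phase ratio -- assumed, illustrative

def impact_speed(cadence_hz):
    """Vertical touchdown speed for a ballistic aerial phase.

    Each step has aerial time t_a = (1 - duty) / cadence; half of the
    flight is spent descending, so the vertical speed at touchdown
    is g * t_a / 2.
    """
    t_aerial = (1.0 - duty) / cadence_hz
    return 0.5 * g * t_aerial

for cadence in (2.4, 2.7, 3.0, 3.3):          # steps per second
    v = impact_speed(cadence)
    e_loss = 0.5 * mass * v**2                # energy lost per impact [J]
    print(f"cadence {cadence:.1f} Hz: touchdown {v:.2f} m/s, "
          f"impact loss {e_loss:.1f} J/step")
```

Running the loop shows the monotonic trend the abstract reports: each increase in cadence reduces both touchdown speed and per-step impact loss.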
NASA Astrophysics Data System (ADS)
Lee, Bo Mi; Loh, Kenneth J.
2017-04-01
Carbon nanotubes can be randomly deposited in polymer thin film matrices to form nanocomposite strain sensors. However, a computational framework that enables the direct design of these nanocomposite thin films is still lacking. The objective of this study is to derive an experimentally validated, two-dimensional numerical model of carbon nanotube-based thin film strain sensors. The study consisted of two parts. First, multi-walled carbon nanotube (MWCNT)-Pluronic strain sensors were fabricated using vacuum filtration, and their physical, electrical, and electromechanical properties were evaluated. Second, scanning electron microscope images of the films were used to identify topological features of the percolated MWCNT network, and the information obtained was then utilized to develop the numerical model. Validation of the numerical model was achieved by ensuring that the area ratios (of MWCNTs relative to the polymer matrix) were equivalent for the experimental and modeled cases. The strain sensing behavior of the percolation-based model was simulated and then compared to experimental test results.
Abbey, Antonia; Wegner, Rhiana
2015-01-01
The goals of this article are to review the major findings from alcohol administration studies that use sexual aggression proxies and to encourage additional experimental research that evaluates hypotheses about the role of alcohol in the etiology of men's sexual aggression. Experiments allow participants to be randomly assigned to drink conditions, thereby ensuring that any differences between drinkers and nondrinkers can be attributed to their alcohol consumption. One of the biggest challenges faced by experimental researchers is the identification of valid operationalizations of key constructs. The tension between internal and external validity is particularly problematic for violence researchers because they cannot allow participants to engage in the target behavior in the laboratory. The strengths and limitations associated with written vignettes, audiotapes, videotapes, and confederate proxies for sexual aggression are described. Suggestions are made for future research to broaden the generalizability of the findings from experimental research. PMID:26048214
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rohatgi, Upendra S.
Nuclear reactor codes require validation with appropriate data representing the plant for specific scenarios. Thermal-hydraulic data are scattered across different locations and formats, and some of the data are in danger of being lost. A relational database is being developed to organize the international thermal-hydraulic test data for various reactor concepts and different scenarios. At the reactor system level, the data are organized to include separate effect tests and integral effect tests for specific scenarios and corresponding phenomena. The database relies on the phenomena identification sections of expert-developed PIRTs (Phenomena Identification and Ranking Tables). The database will provide a summary of appropriate data, a review of facility information, test descriptions, instrumentation, references for the experimental data, and some examples of applying the data for validation. The current database platform includes scenarios for PWR, BWR, VVER, and specific benchmarks for CFD modelling data, and is to be expanded to include references for molten salt reactors. There are placeholders for high-temperature gas-cooled reactors, CANDU, and liquid metal reactors. This relational database, called The International Experimental Thermal Hydraulic Systems (TIETHYS) database, currently resides at the Nuclear Energy Agency (NEA) of the OECD and is freely open to public access. Going forward, the database will be extended to include additional links and data as they become available. https://www.oecd-nea.org/tiethysweb/
Validation of hydrogen gas stratification and mixing models
Wu, Hsingtzu; Zhao, Haihua
2015-05-26
Two validation benchmarks confirm that the BMIX++ code is capable of simulating unintended hydrogen release scenarios efficiently. The BMIX++ (UC Berkeley mechanistic MIXing code in C++) code has been developed to accurately and efficiently predict fluid mixture distribution and heat transfer in large stratified enclosures for accident analyses and design optimizations. The BMIX++ code uses a scaling-based, one-dimensional method to achieve a large reduction in computational effort compared to a 3-D computational fluid dynamics (CFD) simulation. Two BMIX++ benchmark models have been developed: one for a single buoyant jet in an open space and another for a large sealed enclosure with both a jet source and a vent near the floor. Both have been validated by comparison with experimental data, with excellent agreement observed. Entrainment coefficients of 0.09 and 0.08 are found to best fit the experimental data for hydrogen leaks with Froude numbers of 99 and 268, respectively. In addition, the BMIX++ simulation results for the average helium concentration in an enclosure with a vent and a single jet agree with the experimental data within a margin of about 10% for jet flow rates ranging from 1.21 × 10⁻⁴ to 3.29 × 10⁻⁴ m³/s. Computing time for each BMIX++ model on a normal desktop computer is less than 5 min.
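As rough intuition for the entrainment coefficients quoted above, the sketch below integrates the classic top-hat integral model of a round, momentum-driven jet, in which entrainment at rate alpha makes the volume flux grow as dQ/dz = 2·alpha·sqrt(pi·M). This is a constant-density simplification with buoyancy neglected and an assumed source; it is not the BMIX++ scaling-based method itself.

```python
import numpy as np

alpha = 0.08   # entrainment coefficient (the abstract reports 0.08-0.09)

def jet_volume_flux(z, Q0, M0):
    """Volume flux of a top-hat round jet at height z.

    For a pure momentum jet the momentum flux M is conserved and
    dQ/dz = 2 * alpha * sqrt(pi * M), so Q grows linearly with z.
    """
    return Q0 + 2.0 * alpha * np.sqrt(np.pi * M0) * z

# Assumed source: a 5 mm radius orifice discharging at 10 m/s
b0, u0 = 0.005, 10.0
Q0 = np.pi * b0**2 * u0        # source volume flux [m^3/s]
M0 = np.pi * b0**2 * u0**2     # kinematic momentum flux [m^4/s^2]
for z in (0.1, 0.5, 1.0):
    print(f"z = {z:.1f} m: Q = {jet_volume_flux(z, Q0, M0):.3e} m^3/s")
```

The linear growth of Q with height illustrates the one-dimensional behaviour that makes scaling-based methods so much cheaper than 3-D CFD for this class of problem.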
Gorguluarslan, Recep M; Choi, Seung-Kyum; Saldana, Christopher J
2017-07-01
A methodology is proposed for uncertainty quantification and validation to accurately predict the mechanical response of lattice structures used in the design of scaffolds. Effective structural properties of the scaffolds are characterized using a multi-level stochastic upscaling process that propagates the uncertainties quantified at the strut level to the lattice structure level. To obtain realistic simulation models for the stochastic upscaling process and to minimize the experimental cost, high-resolution finite element models of individual struts were reconstructed from micro-CT scan images of lattice structures fabricated by selective laser melting. The upscaling method facilitates the determination of homogenized strut properties, reducing the computational cost of the detailed simulation model for the scaffold. The Bayesian Information Criterion is utilized to quantify the uncertainties with parametric distributions based on the statistical data obtained from the reconstructed strut models. A systematic validation approach that minimizes the experimental cost is also developed to assess the predictive capability of the stochastic upscaling method at both the strut level and the lattice structure level. In comparison with physical compression test results, the proposed methodology of linking uncertainty quantification with the multi-level stochastic upscaling method enabled an accurate prediction of the elastic behavior of the lattice structure with minimal experimental cost, by accounting for the uncertainties induced by the additive manufacturing process. Copyright © 2017 Elsevier Ltd. All rights reserved.
Development and validation of a numerical model of the swine head subjected to open-field blasts
NASA Astrophysics Data System (ADS)
Kalra, A.; Zhu, F.; Feng, K.; Saif, T.; Kallakuri, S.; Jin, X.; Yang, K.; King, A.
2017-11-01
A finite element model of the head of a 55-kg Yucatan pig was developed to calculate the incident pressure and corresponding intracranial pressure due to the explosion of 8 lb (3.63 kg) of C4 at three different distances. The model was validated by comparing its results with data obtained experimentally from five pigs at three blast overpressure levels: low (150 kPa), medium (275 kPa), and high (400 kPa). The peak intracranial pressures from the numerical model at different brain locations, namely the frontal, central, left temporal, right temporal, parietal, and occipital regions, were compared with experimental values. The model was able to predict the peak pressures with reasonable percentage differences: the differences between simulated and experimental peak incident and intracranial pressure values were less than 2.2% and 29.3%, respectively, at all locations other than the frontal region. Additionally, a series of parametric studies shows that the intracranial pressure was very sensitive to sensor locations, the presence of air bubbles, and reflections experienced during the experiments. Further efforts will be undertaken to correlate the different biomechanical response parameters, such as the intracranial pressure gradient, stress, and strain results obtained from the validated model, with injured brain locations once the histology data become available.
Zhou, Bailing; Zhao, Huiying; Yu, Jiafeng; Guo, Chengang; Dou, Xianghua; Song, Feng; Hu, Guodong; Cao, Zanxia; Qu, Yuanxu; Yang, Yuedong; Zhou, Yaoqi; Wang, Jihua
2018-01-04
Long non-coding RNAs (lncRNAs) play important functional roles in various biological processes. Early databases were utilized to deposit all lncRNA candidates produced by high-throughput experimental and/or computational techniques to facilitate classification, assessment and validation. As more lncRNAs are validated by low-throughput experiments, several databases have been established for experimentally validated lncRNAs. However, these databases are small in scale (containing only a few hundred lncRNAs) and specific in their focuses (plants, diseases or interactions). Thus, it is highly desirable to have a comprehensive dataset of experimentally validated lncRNAs as a central repository for all of their structures, functions and phenotypes. Here, we established EVLncRNAs by curating lncRNAs validated by low-throughput experiments (up to 1 May 2016) and integrating specific databases (lncRNAdb, LncRNADisease, Lnc2Cancer and PLNIncRBase) with additional functional and disease-specific information not covered previously. The current version of EVLncRNAs contains 1543 lncRNAs from 77 species, 2.9 times more than the current largest database of experimentally validated lncRNAs. Seventy-four percent of the lncRNA entries are partially or completely new compared with all existing experimentally validated databases. The established database allows users to browse, search and download, as well as to submit, experimentally validated lncRNAs. The database is available at http://biophy.dzu.edu.cn/EVLncRNAs. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.
Chen, S C; You, S H; Liu, C Y; Chio, C P; Liao, C M
2012-09-01
The aim of this work was to use experimental infection data for human influenza to assess a simple model of viral dynamics in epithelial cells and to better understand the underlying complex factors governing the infection process. The model expands on previous reports of a target cell-limited model with delayed virus production. Data from 10 published experimental infection studies of human influenza were used to validate the model. Our results elucidate, mechanistically, the associations between epithelial cells, human immune responses, and viral titres, and are supported by the experimental infection data. We report that the maximum total number of free virions following infection is 10³-fold higher than the initially introduced titre. Our results indicate that the infection rate of unprotected epithelial cells probably plays an important role in the viral dynamics. By simulating an advanced model of viral dynamics and applying it to experimental infection data of human influenza, we obtained important estimates of the infection rate. This work provides epidemiologically meaningful results, meriting further efforts to understand the causes and consequences of influenza A infection.
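For readers unfamiliar with the model class, the sketch below integrates the standard four-compartment form of a target cell-limited model with delayed virus production (target cells, eclipse-phase cells, productively infected cells, free virus). All parameter values are illustrative orders of magnitude, not the estimates fitted in this study.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Illustrative parameters (plausible magnitudes for influenza, not
# the study's calibrated values)
beta  = 3e-5   # infection rate of target cells
k     = 4.0    # eclipse-to-productive transition rate [1/day]
delta = 4.0    # death rate of productively infected cells [1/day]
p     = 0.04   # virion production rate per infected cell [1/day]
c     = 5.0    # free-virus clearance rate [1/day]

def rhs(t, y):
    T, E, I, V = y                        # target, eclipse, infected; virus
    return [-beta * T * V,                # target cells become infected
            beta * T * V - k * E,         # eclipse (non-producing) phase
            k * E - delta * I,            # productive infection and death
            p * I - c * V]                # virus production and clearance

y0 = [4e8, 0.0, 0.0, 1e2]                 # initial cells and inoculum
sol = solve_ivp(rhs, (0.0, 10.0), y0, rtol=1e-8, dense_output=True)
t = np.linspace(0.0, 10.0, 201)
V = sol.sol(t)[3]
print(f"peak titre {V.max():.2e} on day {t[V.argmax()]:.1f} "
      f"({V.max() / y0[3]:.1e}-fold over the inoculum)")
```

The ratio printed on the last line corresponds to the kind of peak-to-inoculum amplification the abstract reports (about 10³-fold), though the exact figure depends on the fitted parameters.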
NASA Technical Reports Server (NTRS)
Maghami, Peiman G.; Gupta, Sandeep; Elliott, Kenny B.; Joshi, Suresh M.; Walz, Joseph E.
1994-01-01
This paper describes the first experimental validation of an optimization-based integrated controls-structures design methodology for a class of flexible space structures. The Controls-Structures-Interaction (CSI) Evolutionary Model, a laboratory test bed at Langley, is redesigned based on the integrated design methodology with two different dissipative control strategies. The redesigned structure is fabricated, assembled in the laboratory, and experimentally compared with the original test structure. Design guides are proposed and used in the integrated design process to ensure that the resulting structure can be fabricated. Experimental results indicate that the integrated design requires greater than 60 percent less average control power (by thruster actuators) than the conventional control-optimized design while maintaining the required line-of-sight performance, thereby confirming the analytical findings about the superiority of the integrated design methodology. Amenability of the integrated design structure to other control strategies is considered and evaluated analytically and experimentally. This work also demonstrates the capabilities of the Langley-developed design tool CSI DESIGN which provides a unified environment for structural and control design.
Benson, Charles T.; Critser, John K.
2014-01-01
Optimization of cryopreservation protocols for cells and tissues requires accurate models of heat and mass transport, and model selection often depends on the configuration of the tissue. Here, a mathematical and conceptual model of water and solute transport for whole hamster pancreatic islets has been developed and experimentally validated, incorporating fundamental biophysical data from previous studies on individual hamster islet cells while retaining whole-islet structural information. It describes coupled transport of water and solutes through the islet by three routes: intracellular, intercellular, and in combination. In particular, we use domain decomposition techniques to couple a transmembrane flux model with an interstitial mass transfer model. The only significant undetermined variable is Ais, the cellular surface area in contact with the intercellularly transported solutes. The model was validated and Ais determined using a 3 × 3 factorial experimental design blocked for experimental day. Whole-islet physical experiments were compared with model predictions at three temperatures, three perfusing solutions, and three islet size groups. A mean of 4.4 islets were compared at each of the 27 experimental conditions, and model and experiment correlated with a coefficient of determination of 0.87 ± 0.06 (mean ± S.D.). Only the treatment variable of perfusing solution was found to be significant (p < 0.05). We have devised a model that retains much of the intrinsic geometric configuration of the system, so that fewer laboratory experiments are needed to determine model parameters and thus to develop new optimized cryopreservation protocols. Additionally, extensions to ovarian follicles and other concentric tissue structures may be made. PMID:24950195
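As background, a single-cell building block of such models is the widely used two-parameter formalism, in which water flux is driven by the transmembrane osmolality difference (through the hydraulic conductivity Lp) and permeating-solute flux by its concentration difference (through the permeability Ps). The sketch below integrates that formalism for one cell exposed to a permeating cryoprotectant; all parameter values are assumed for illustration, and the interstitial (domain-decomposition) coupling of the full islet model is not reproduced.

```python
from scipy.integrate import solve_ivp

# Two-parameter transmembrane flux model for a single cell.
# All values below are assumed, illustrative magnitudes.
Lp   = 0.3              # hydraulic conductivity [um/(min*atm)]
Ps   = 2.0              # solute permeability [um/min]
A    = 1.8e3            # membrane surface area [um^2]
RT   = 0.08206 * 295.0  # gas constant * temperature [L*atm/osmol]
Me_n = 0.29             # external non-permeating osmolality [osmol/L]
Me_s = 1.0              # external permeating solute [osmol/L]
N_n  = 0.29 * 3.6e3 * 1e-15   # intracellular non-permeating osmoles

def rhs(t, y):
    Vw, Ns = y                      # water volume [um^3], solute [osmol]
    Mi_s = Ns / (Vw * 1e-15)        # internal solute osmolality [osmol/L]
    Mi_n = N_n / (Vw * 1e-15)       # internal non-permeating osmolality
    dVw = -Lp * A * RT * ((Me_n + Me_s) - (Mi_n + Mi_s))  # water flux
    dNs = Ps * A * 1e-15 * (Me_s - Mi_s)                  # solute flux
    return [dVw, dNs]

sol = solve_ivp(rhs, (0.0, 10.0), [3.6e3, 0.0], max_step=0.01)
print(f"water volume after 10 min: {sol.y[0, -1]:.0f} um^3")
```

The characteristic shrink-swell response (rapid osmotic water loss followed by re-swelling as the solute permeates) emerges directly from these two coupled fluxes; the whole-islet model couples many such units through interstitial mass transfer via the contact area Ais.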
NSRD-10: Leak Path Factor Guidance Using MELCOR
DOE Office of Scientific and Technical Information (OSTI.GOV)
Louie, David; Humphries, Larry L.
Estimating the source term from a U.S. Department of Energy (DOE) nuclear facility requires that analysts know how to apply the simulation tools used, such as the MELCOR code, particularly for a complicated facility that may include an air ventilation system and other active systems that can influence the environmental pathway of the materials released. DOE has designated MELCOR 1.8.5, an unsupported version, as a DOE ToolBox code in its Central Registry, which includes a leak-path-factor guidance report written in 2004 that did not include experimental validation data. Continuing to use this MELCOR version would require additional verification and validation, which may not be feasible from a project cost standpoint; instead, the most recent MELCOR version should be used. Without developer support and experimental validation data, it is difficult to convince regulators that the calculated source term from a DOE facility is accurate and defensible. This research replaces the obsolete version in the 2004 DOE leak path factor guidance report with MELCOR 2.1 (the latest version of MELCOR, with continuing model development and user support) and includes applicable experimental data from the reactor safety arena and from the experimental data used in DOE-HDBK-3010. The research provides best-practice values for use in MELCOR 2.1 specifically for leak path determination. With these enhancements, the revised leak-path-guidance report should give confidence to DOE safety analysts using MELCOR as a source-term determination tool for mitigated accident evaluations.
Andrade, E L; Bento, A F; Cavalli, J; Oliveira, S K; Freitas, C S; Marcon, R; Schwanke, R C; Siqueira, J M; Calixto, J B
2016-10-24
This review presents a historical overview of drug discovery and the non-clinical stages of the drug development process, from initial target identification and validation, through in silico assays and high-throughput screening (HTS), identification of lead molecules and their optimization, and the selection of a candidate substance for clinical development, to the use of animal models during the early proof-of-concept (or proof-of-principle) studies. The review also discusses the relevance of selecting validated and predictive animal models, as well as the correct use of animal tests with respect to experimental design, execution, and interpretation, all of which affect the reproducibility, quality, and reliability of the non-clinical studies needed to translate to and support clinical studies. Collectively, improving these aspects will certainly contribute to the robustness of both scientific publications and the translation of new substances to clinical development.
A new biodegradation prediction model specific to petroleum hydrocarbons.
Howard, Philip; Meylan, William; Aronson, Dallas; Stiteler, William; Tunkel, Jay; Comber, Michael; Parkerton, Thomas F
2005-08-01
A new predictive model for determining quantitative primary biodegradation half-lives of individual petroleum hydrocarbons has been developed. The model uses a fragment-based approach similar to that of several other biodegradation models, such as those within the Biodegradation Probability Program (BIOWIN). In the present study, a half-life in days is estimated using multiple linear regression against counts of 31 distinct molecular fragments. The model was developed using a data set of 175 compounds with environmentally relevant experimental data, divided into training and validation sets. The original fragments from the Ministry of International Trade and Industry BIOWIN model were used initially as structural descriptors; additional fragments were then added to better describe the ring systems found in petroleum hydrocarbons and to adjust for nonlinearity within the experimental data. The training and validation sets had r² values of 0.91 and 0.81, respectively.
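The estimation step itself is ordinary least squares on a compound-by-fragment count matrix. The sketch below reproduces just those mechanics on synthetic data; the fragment definitions, curated half-life values, and training/validation split of the actual model are not reproduced here.

```python
import numpy as np

# Synthetic stand-in data: rows are compounds, columns are counts of
# the 31 structural fragments; y holds (log) half-lives in days.
rng = np.random.default_rng(0)
n_compounds, n_fragments = 140, 31
X = rng.integers(0, 4, size=(n_compounds, n_fragments)).astype(float)
true_w = rng.normal(0.0, 0.3, size=n_fragments)   # synthetic "effects"
y = X @ true_w + rng.normal(0.0, 0.1, size=n_compounds)

# Multiple linear regression with an intercept column
A = np.column_stack([np.ones(n_compounds), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

pred = A @ coef
ss_res = np.sum((y - pred) ** 2)
ss_tot = np.sum((y - y.mean()) ** 2)
print(f"training r^2 = {1.0 - ss_res / ss_tot:.3f}")
```

Each fitted coefficient plays the role of a fragment's additive contribution to the predicted half-life, which is what makes fragment-based models easy to interpret and to extend with new ring-system fragments.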
Development and Validation of a Mathematical Model for Olive Oil Oxidation
NASA Astrophysics Data System (ADS)
Rahmouni, K.; Bouhafa, H.; Hamdi, S.
2009-03-01
A mathematical model describing the stability, or susceptibility to oxidation, of extra virgin olive oil (EVOO) has been developed. The model was solved by an iterative finite-difference method and validated with experimental data on EVOO oxidation. EVOO stability was tested using a Rancimat at four temperatures, 60, 70, 80 and 90 °C, until peroxide accumulation reached 20 meq/kg. Peroxide formation is relatively slow and follows zero-order kinetics, with linear regression coefficients ranging from 0.98 to 0.99. The mathematical model was used to predict the shelf life of bulk-conditioned olive oil: it describes peroxide accumulation inside a container in excess of oxygen as a function of time at various distances from the air/oil interface. Good correlations were obtained between theoretical and experimental values.
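The container prediction described above amounts to a one-dimensional diffusion-reaction problem. The sketch below is an explicit finite-difference rendering of that idea: oxygen diffuses from the air/oil interface into the oil column and drives zero-order peroxide formation wherever oxygen is present. The geometry, diffusivity, and rate constant are assumed placeholders, not the paper's fitted values.

```python
import numpy as np

L, nx = 0.05, 51                    # oil depth [m], grid points
dx    = L / (nx - 1)
D     = 1.0e-9                      # O2 diffusivity in oil [m^2/s] -- assumed
k0    = 1.0e-5                      # zero-order peroxide rate [meq/kg/s] -- assumed
C_sat = 1.0                         # normalized O2 saturation at the surface
dt    = 0.4 * dx**2 / D             # stable explicit time step

C = np.zeros(nx); C[0] = C_sat      # oxygen profile
P = np.zeros(nx)                    # peroxide value [meq/kg]

t, t_end = 0.0, 30 * 24 * 3600.0    # simulate 30 days
while t < t_end:
    C[1:-1] += dt * D * (C[2:] - 2.0 * C[1:-1] + C[:-2]) / dx**2
    C[0], C[-1] = C_sat, C[-2]      # fixed surface, no-flux bottom
    P += dt * k0 * (C > 1e-6)       # zero-order growth where O2 is present
    t += dt

for i in range(0, nx, 10):
    print(f"depth {100 * i * dx:4.1f} cm: PV = {P[i]:6.2f} meq/kg")
```

The printed profile shows the peroxide value decreasing with distance from the interface, mirroring the position dependence the model describes; the 20 meq/kg threshold would first be crossed at the surface.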
Patterson, P Daniel; Weaver, Matthew D; Fabio, Anthony; Teasley, Ellen M; Renn, Megan L; Curtis, Brett R; Matthews, Margaret E; Kroemer, Andrew J; Xun, Xiaoshuang; Bizhanova, Zhadyra; Weiss, Patricia M; Sequeira, Denisse J; Coppler, Patrick J; Lang, Eddy S; Higgins, J Stephen
2018-02-15
This study sought to systematically search the literature to identify reliable and valid survey instruments for fatigue measurement in the Emergency Medical Services (EMS) occupational setting. A systematic review design was used to search six databases and one website. The research question guiding the search was developed a priori and registered with the PROSPERO database of systematic reviews: "Are there reliable and valid instruments for measuring fatigue among EMS personnel?" (2016:CRD42016040097). The primary outcome of interest was criterion-related validity; other outcomes of interest included reliability (e.g., internal consistency) and indicators of sensitivity and specificity. Members of the research team independently screened records from the databases. Full-text articles were evaluated by adapting the Bolster and Rourke system for categorizing findings of systematic reviews, and the data abstracted from the body of literature were rated as favorable, unfavorable, mixed/inconclusive, or no impact. The Grading of Recommendations, Assessment, Development and Evaluation (GRADE) methodology was used to evaluate the quality of evidence. The search strategy yielded 1,257 unique records, of which 34 unique experimental and non-experimental studies were determined relevant following full-text review. Nineteen studies reported on the reliability and/or validity of ten different fatigue survey instruments, and eighteen studies evaluated the reliability and/or validity of four different sleepiness survey instruments. None of the retained studies reported sensitivity or specificity, and evidence quality was rated as very low across all outcomes. In summary, this systematic review identified limited evidence of the reliability and validity of 14 different survey instruments for assessing the fatigue and/or sleepiness status of EMS personnel and related shift-worker groups.
From military to civil loadings: Preliminary numerical-based thorax injury criteria investigations.
Goumtcha, Aristide Awoukeng; Bodo, Michèle; Taddei, Lorenzo; Roth, Sébastien
2016-03-01
Understanding the effects of the impact of a mechanical structure on the human body is of great interest in the study of body trauma. Experimental tests with PMHS (post-mortem human subjects) have led to initial conclusions about the severity of an impact by observing impact forces or displacement time histories, and have provided valuable data for the development and validation of numerical biomechanical models. These models, widely used in the framework of automotive crashworthiness, have led to numerically based injury criteria and tolerance thresholds, with the aim of improving the safety of mechanical structures interacting with the body. In the military context, investigations at both the experimental and numerical levels are less complete. For both the military and civil frameworks, the literature lists a number of numerical analyses proposing injury mechanisms and tolerance thresholds based on biofidelic finite element (FE) models of different parts of the human body. However, the link between the two frameworks is not obvious, since many parameters differ: large-mass impacts at relatively low velocity in civil cases (falls, automotive crashworthiness) versus low mass at very high velocity in military loadings (ballistic, blast). In this study, different accident cases were investigated and replicated with a previously developed and validated FE model of the human thorax, the Hermaphrodite Universal Biomechanical YX (HUBYX) model. Previous validations of this model included replications of standard experimental tests often used to validate models in the automotive context, experimental ballistic tests in high-speed dynamic impact, and numerical replication of blast loading tests, ensuring its biofidelity. To extend the use of the model to other frameworks, several real-world accidents were reconstructed and the consequences of these loadings on the FE model were explored. These numerical replications of accidents from different contexts raise the question of the ability of an FE model to correctly predict several kinds of trauma, from blast and ballistic impacts to falls, sports and automotive ones, in the context of investigations of numerical injury mechanisms and tolerance limits. Copyright © 2015 John Wiley & Sons, Ltd.
NASA Technical Reports Server (NTRS)
Seybert, A. F.; Wu, T. W.; Wu, X. F.
1994-01-01
This research report is presented in three parts. In the first part, acoustical analyses were performed on modes of vibration of the housing of a transmission of a gear test rig developed by NASA. The modes of vibration of the transmission housing were measured using experimental modal analysis. The boundary element method (BEM) was used to calculate the sound pressure and sound intensity on the surface of the housing and the radiation efficiency of each mode. The radiation efficiency of each of the transmission housing modes was then compared to theoretical results for a finite baffled plate. In the second part, analytical and experimental validation of methods to predict structural vibration and radiated noise are presented. A rectangular box excited by a mechanical shaker was used as a vibrating structure. Combined finite element method (FEM) and boundary element method (BEM) models of the apparatus were used to predict the noise level radiated from the box. The FEM was used to predict the vibration, while the BEM was used to predict the sound intensity and total radiated sound power using surface vibration as the input data. Vibration predicted by the FEM model was validated by experimental modal analysis; noise predicted by the BEM was validated by measurements of sound intensity. Three types of results are presented for the total radiated sound power: sound power predicted by the BEM model using vibration data measured on the surface of the box; sound power predicted by the FEM/BEM model; and sound power measured by an acoustic intensity scan. In the third part, the structure used in part two was modified. A rib was attached to the top plate of the structure. The FEM and BEM were then used to predict structural vibration and radiated noise respectively. The predicted vibration and radiated noise were then validated through experimentation.
Challenges of NDE simulation tool validation, optimization, and utilization for composites
NASA Astrophysics Data System (ADS)
Leckey, Cara A. C.; Seebo, Jeffrey P.; Juarez, Peter
2016-02-01
Rapid, realistic nondestructive evaluation (NDE) simulation tools can aid in inspection optimization and prediction of inspectability for advanced aerospace materials and designs. NDE simulation tools may someday aid in the design and certification of aerospace components; potentially shortening the time from material development to implementation by industry and government. Furthermore, ultrasound modeling and simulation are expected to play a significant future role in validating the capabilities and limitations of guided wave based structural health monitoring (SHM) systems. The current state-of-the-art in ultrasonic NDE/SHM simulation is still far from the goal of rapidly simulating damage detection techniques for large scale, complex geometry composite components/vehicles containing realistic damage types. Ongoing work at NASA Langley Research Center is focused on advanced ultrasonic simulation tool development. This paper discusses challenges of simulation tool validation, optimization, and utilization for composites. Ongoing simulation tool development work is described along with examples of simulation validation and optimization challenges that are more broadly applicable to all NDE simulation tools. The paper will also discuss examples of simulation tool utilization at NASA to develop new damage characterization methods for composites, and associated challenges in experimentally validating those methods.
Generic Skill Development and Learning/Assessment Process: Use of Rubrics and Student Validation
ERIC Educational Resources Information Center
Iborra Urios, Montserrat; Ramirez Rangel, Eliana; Bringué Tomàs, Roger; Tejero Salvador, Javier; Cunill García, Fidel; Fité Piquer, Carles
2015-01-01
To align with the European Higher Education Area framework, the generic skills of team work and written and oral communication were developed and assessed by means of rubrics in the Chemical Engineering undergraduate course of the University of Barcelona named "Chemical Engineering Experimentation II". In order to appraise the methodological…
As part of our efforts to develop a public platform to provide access to predictive models we have attempted to disentangle the influence of the quality versus quantity of data available to develop and validate QSAR models. Using a thorough manual review of the data underlying t...
Pradhan, Nirakar; Dipasquale, Laura; d'Ippolito, Giuliana; Fontana, Angelo; Panico, Antonio; Pirozzi, Francesco; Lens, Piet N L; Esposito, Giovanni
2016-08-01
The aim of the present study was to develop a kinetic model for a recently proposed metabolic process, the capnophilic (CO2-requiring) lactic fermentation (CLF) pathway in Thermotoga neapolitana. The model was based on Monod kinetics, and mathematical expressions were developed to enable the simulation of biomass growth, substrate consumption and product formation. The calibrated kinetic parameters, namely the maximum specific uptake rate (k), semi-saturation constant (kS), biomass yield coefficient (Y) and endogenous decay rate (kd), were 1.30 h⁻¹, 1.42 g/L, 0.1195 and 0.0205 h⁻¹, respectively. A high correlation (>0.98) was obtained between the experimental data and model predictions in both the model validation and cross-validation processes. An increase in lactate production in the range of 40-80% was obtained through the CLF pathway compared with the classic dark fermentation model. The proposed kinetic model is the first mechanistically based model of the CLF pathway. It provides useful information for improving our understanding of how acetate and CO2 are recycled by Thermotoga neapolitana to produce lactate without compromising the overall hydrogen yield. Copyright © 2016 Elsevier Ltd. All rights reserved.
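Given the calibrated constants quoted above, the core of such a model is a small ODE system. The sketch below integrates Monod kinetics with those published parameter values; the initial conditions and the product yield Yp are assumed for illustration, since the abstract does not report them.

```python
from scipy.integrate import solve_ivp

k  = 1.30     # maximum specific substrate uptake rate [1/h] (abstract)
kS = 1.42     # semi-saturation constant [g/L] (abstract)
Y  = 0.1195   # biomass yield coefficient (abstract)
kd = 0.0205   # endogenous decay rate [1/h] (abstract)
Yp = 0.8      # product (lactate) yield [g/g] -- assumed, not reported

def rhs(t, y):
    X, S, P = y                      # biomass, substrate, product [g/L]
    q = k * S / (kS + S)             # Monod specific uptake rate [1/h]
    return [Y * q * X - kd * X,      # growth minus endogenous decay
            -q * X,                  # substrate consumption
            Yp * q * X]              # product formation

sol = solve_ivp(rhs, (0.0, 48.0), [0.1, 10.0, 0.0], max_step=0.1)
X, S, P = sol.y[:, -1]
print(f"after 48 h: X = {X:.2f} g/L, S = {S:.2f} g/L, P = {P:.2f} g/L")
```

With this structure, calibration reduces to adjusting (k, kS, Y, kd) until the simulated biomass, substrate, and product trajectories best match the fermentation data, which is how correlations above 0.98 were assessed.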
NASA Astrophysics Data System (ADS)
Ivanova, B. B.; Simeonov, V. D.; Arnaudov, M. G.; Tsalev, D. L.
2007-05-01
The newly developed method for orienting solid samples as suspensions in a nematic liquid crystal (NLC), applied in linear-dichroic infrared (IR-LD) spectroscopy, has been validated using DL-isoleucine (DL-isoleu) as a model system. Accuracy, precision, and the influence of the liquid crystal medium on the peak positions and integral absorbances of guest molecules are presented, and the experimental conditions have been optimized. An experimental design was used to quantitatively evaluate the impact of four input factors, the number of scans, the rubbing-out of the KBr pellets, the amount of studied compound included in the liquid crystal medium, and the ratio of Lorentzian to Gaussian peak functions in the curve-fitting procedure, on the spectroscopic signal at five different frequencies that indicate important specificities of the system.
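The factor structure described above lends itself to a two-level full factorial layout, which enumerates every combination of the four inputs. The sketch below generates such a design; the level values are hypothetical placeholders, not the levels used in the study.

```python
from itertools import product

# Two-level full factorial over the four input factors named above.
# Level values are hypothetical placeholders.
factors = {
    "n_scans":          (16, 64),
    "pellet_rubbing":   ("light", "thorough"),
    "sample_amount":    ("5 mg", "10 mg"),
    "lorentz_to_gauss": (0.3, 0.7),
}
runs = [dict(zip(factors, levels)) for levels in product(*factors.values())]
for i, run in enumerate(runs, 1):
    print(f"run {i:2d}: {run}")
print(f"{len(runs)} runs, each evaluated at the five analysis frequencies")
```

Sixteen runs suffice to estimate all main effects and interactions of the four factors on the spectroscopic signal at each frequency.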
Modeling aspects of human memory for scientific study.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Caudell, Thomas P.; Watson, Patrick; McDaniel, Mark A.
Working with leading experts in the fields of cognitive neuroscience and computational intelligence, SNL has developed a computational architecture that represents the neurocognitive mechanisms underlying how humans remember past experiences. The architecture represents how knowledge is organized and updated through information from individual experiences (episodes) via the cortical-hippocampal declarative memory system. We compared the simulated behavioral characteristics with those of humans measured under well-established experimental standards, controlling for unmodeled aspects of human processing, such as perception. We used this knowledge to create robust simulations of human memory behaviors that should help move the scientific community closer to understanding how humans remember information. These behaviors were experimentally validated against human subjects, and the results have been published. An important outcome of the validation process is the joining of specific experimental testing procedures from the field of neuroscience with computational representations from the field of cognitive modeling and simulation.
Slavov, Svetoslav H; Stoyanova-Slavova, Iva; Mattes, William; Beger, Richard D; Brüschweiler, Beat J
2018-07-01
A grid-based, alignment-independent 3D-SDAR (three-dimensional spectral data-activity relationship) approach, based on simulated ¹³C and ¹⁵N NMR chemical shifts augmented with through-space interatomic distances, was used to model the mutagenicity of 554 primary and 419 secondary aromatic amines. A robust modeling strategy supported by extensive validation, including randomized training/hold-out test set pairs, validation sets, "blind" external test sets, and experimental validation, was applied to avoid over-parameterization and to build Organization for Economic Cooperation and Development (OECD 2004) compliant models. On an experimental validation set of 23 chemicals tested in a two-strain Salmonella typhimurium Ames assay, 3D-SDAR achieved performance comparable to the 5-strain (Ames) predictions of Lhasa Limited's Derek Nexus and Sarah Nexus for the same set. Furthermore, mapping the most frequently occurring bins onto the primary and secondary aromatic amine structures allowed the identification of molecular features associated either positively or negatively with mutagenicity. Prominent structural features found to enhance the mutagenic potential included nitrobenzene moieties, conjugated π-systems, nitrothiophene groups, and aromatic hydroxylamine moieties. 3D-SDAR was also able to capture "true" negative contributions, which are particularly difficult to detect by alternative methods. These include sulphonamide, acetamide, and other functional groups that not only lack contributions to the overall mutagenic potential but are known to actively lower it when present in the structures of what otherwise would be potential mutagens.
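The fingerprinting idea behind such grid-based approaches can be shown compactly: every atom pair contributes a point (shift A, shift B, through-space distance), and the occupancy of a fixed 3D grid of bins becomes the descriptor vector. The sketch below demonstrates this binning on random stand-in data; the bin edges and shift ranges are assumptions, not the published 3D-SDAR grid.

```python
import numpy as np

rng = np.random.default_rng(1)
n_pairs = 200
shift_a = rng.uniform(0.0, 200.0, n_pairs)   # e.g. 13C shifts [ppm]
shift_b = rng.uniform(0.0, 400.0, n_pairs)   # e.g. 15N shifts [ppm]
dist    = rng.uniform(1.0, 12.0, n_pairs)    # interatomic distances [A]

edges = (np.linspace(0.0, 200.0, 21),        # 10 ppm bins -- assumed
         np.linspace(0.0, 400.0, 21),        # 20 ppm bins -- assumed
         np.linspace(0.0, 12.0, 7))          # 2 A bins -- assumed
fingerprint, _ = np.histogramdd(
    np.column_stack([shift_a, shift_b, dist]), bins=edges)

print("descriptor length:", fingerprint.size,
      "| occupied bins:", np.count_nonzero(fingerprint))
```

Because the descriptor depends only on shifts and internal distances, no 3D alignment of the molecules is needed, which is what makes the approach alignment-independent; bins that recur in mutagens can then be mapped back onto the structures, as done above for the nitrobenzene and hydroxylamine features.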
[The ethical aspects of physiological experiment].
Al'bertin, S V
2014-01-01
A modern classification of invasive procedures, developed according to international bioethical principles, is presented. The experimental data convincingly demonstrate that noninvasive approaches and techniques make it possible to reduce the number of animals recruited into an experiment while preserving the animals' normal (non-distressed) physiological functions. The data also stress that the development of noninvasive techniques is closely related to both scientific and social aspects of life, allowing scientists to ensure the high validity of the experimental data obtained while preserving their own humanity.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Freed, Melanie; Miller, Stuart; Tang, Katherine
Purpose: MANTIS is a Monte Carlo code developed for the detailed simulation of columnar CsI scintillator screens in x-ray imaging systems. Validation of this code is needed to provide a reliable and valuable tool for system optimization and accurate reconstructions for a variety of x-ray applications. Whereas previous validation efforts have focused on matching summary statistics, in this work the authors examine the complete point response function (PRF) of the detector system in addition to relative light output values. Methods: Relative light output values and high-resolution PRFs were experimentally measured with a custom setup. A corresponding set of simulated light output values and PRFs was also produced, with detailed knowledge of the experimental setup and CsI:Tl screen structures accounted for in the simulations. Four screens with different thicknesses, column tilt angles, and substrate types were investigated. A quantitative comparison between the experimental and simulated PRFs was performed for four incidence angles (0°, 15°, 30°, and 45°) and two x-ray spectra (40 and 70 kVp). The figure of merit (FOM) used measures the normalized differences between the simulated and experimental data averaged over a region of interest. Results: Experimental relative light output values ranged from 1.456 to 1.650 and were in approximate agreement for aluminum substrates, but in poor agreement for graphite substrates. The FOMs for all screen types, incidence angles, and energies ranged from 0.1929 to 0.4775. To put these FOMs in context, the same FOM was computed for 2D symmetric Gaussians fit to the same experimental data; these FOMs ranged from 0.2068 to 0.8029. The analysis demonstrates that MANTIS reproduces the experimental PRFs with higher accuracy than a symmetric 2D Gaussian fit in the majority of cases. Examination of the spatial distribution of differences between the PRFs shows that the main source of error is that MANTIS-generated PRFs are sharper than the experimental PRFs. Conclusions: The experimental validation performed in this study demonstrates that MANTIS is able to reliably predict experimental PRFs, especially for thinner screens, and can reproduce the highly asymmetric shapes seen in the experimental data. As a result, optimizations and reconstructions carried out using MANTIS should yield results indicative of actual detector performance. Better characterization of screen properties is necessary to reconcile the simulated light output values with the experimental data.
Holgado-Tello, Fco P; Chacón-Moscoso, Salvador; Sanduvete-Chaves, Susana; Pérez-Gil, José A
2016-01-01
The Campbellian tradition provides a conceptual framework to assess threats to validity. In parallel, different models of causal analysis have been developed to control estimation biases in different research designs. However, the link between design features, measurement issues, and concrete impact estimation analyses is weak. In order to provide an empirical solution to this problem, we use Structural Equation Modeling (SEM) as a first approximation to operationalize the analytical implications of threats to validity in quasi-experimental designs. Based on the analogies established between Classical Test Theory (CTT) and causal analysis, we describe an empirical study based on SEM in which range restriction and statistical power were simulated in two different models: (1) a multistate model in the control condition (pre-test); and (2) a single-trait-multistate model in the control condition (post-test), adding a new mediating latent exogenous (independent) variable that represents a threat to validity. The results show, empirically, how the differences between the two models can be partially or totally attributed to these threats. SEM therefore provides a useful tool for analyzing the influence of potential threats to validity.
Wang, Wenyi; Kim, Marlene T.; Sedykh, Alexander
2015-01-01
Purpose: Experimental Blood–Brain Barrier (BBB) permeability models for drug molecules are expensive and time-consuming. As alternative methods, several traditional Quantitative Structure-Activity Relationship (QSAR) models have been developed previously. In this study, we aimed to improve the predictivity of traditional QSAR BBB permeability models by employing relevant public bio-assay data in the modeling process. Methods: We compiled a BBB permeability database consisting of 439 unique compounds from various resources, split into a modeling set of 341 compounds and a validation set of 98 compounds. A consensus QSAR modeling workflow was employed on the modeling set to develop various QSAR models, a five-fold cross-validation approach was used to validate them, and the resulting models were used to predict the external validation set compounds. Furthermore, we used previously published membrane transporter models to generate transporter profiles for the target compounds, which served as additional biological descriptors for developing hybrid QSAR BBB models. Results: The consensus QSAR models have R² = 0.638 for five-fold cross-validation and R² = 0.504 for external validation. The consensus model developed by pooling chemical and transporter descriptors showed better predictivity (R² = 0.646 for five-fold cross-validation and R² = 0.526 for external validation). Moreover, several external bio-assays that correlate with BBB permeability were identified using our automatic profiling tool. Conclusions: The BBB permeability models developed in this study can be useful for the early evaluation of new compounds (e.g., new drug candidates), and the combination of chemical and biological descriptors is a promising direction for improving traditional QSAR models. PMID:25862462
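The modeling loop described above — fit several member models, average their predictions, and score by cross-validation — can be sketched in a few lines. The example below uses synthetic stand-in data and two arbitrary member learners; the descriptor sets, member models, and data of the actual study are not reproduced.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import Ridge
from sklearn.model_selection import KFold

# Synthetic stand-in: X would hold chemical descriptors (optionally
# augmented with transporter-profile columns); y would hold logBB.
rng = np.random.default_rng(7)
X = rng.normal(size=(341, 50))
y = X[:, :5].sum(axis=1) + rng.normal(scale=0.5, size=341)

preds = np.empty_like(y)
for train, test in KFold(n_splits=5, shuffle=True, random_state=0).split(X):
    members = [RandomForestRegressor(n_estimators=200, random_state=0),
               Ridge(alpha=1.0)]
    fold = [m.fit(X[train], y[train]).predict(X[test]) for m in members]
    preds[test] = np.mean(fold, axis=0)    # consensus = member average

r2 = 1.0 - np.sum((y - preds) ** 2) / np.sum((y - y.mean()) ** 2)
print(f"five-fold cross-validated R^2 = {r2:.3f}")
```

Appending transporter-profile columns to X is the hybrid-descriptor step the study reports; in their data it raised the cross-validated R² from 0.638 to 0.646.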
Transport phenomena in solidification processing of functionally graded materials
NASA Astrophysics Data System (ADS)
Gao, Juwen
A combined numerical and experimental study of the transport phenomena during solidification processing of metal matrix composite functionally graded materials (FGMs) is conducted in this work. A multiphase transport model for the solidification of metal-matrix composite FGMs has been developed that accounts for macroscopic particle segregation due to liquid-particle flow and particle-solid interactions. An experimental study has also been conducted to gain physical insight as well as to validate the model. A novel method to measure the particle volume fraction in situ using fiber-optic probes is developed for transparent analogue solidification systems. The model is first applied to one-dimensional pure-matrix FGM solidification under gravity or a centrifugal field and is extensively validated against the experimental results, and the mechanisms for the formation of the particle concentration gradient are identified. Two-dimensional solidification of pure-matrix FGMs with convection is then studied using the model as well as experiments; the interaction among convective flow, the solidification process, and particle transport is demonstrated, and the results show the importance of convection in the formation of the particle concentration gradient. Simulations for alloy FGM solidification are then carried out for unidirectional solidification as well as two-dimensional solidification with convection, and the interplay among heat and species transport, convection, and particle motion is investigated. Finally, future theoretical and experimental work is outlined.
Utilizing Metalized Fabrics for Liquid and Rip Detection and Localization
DOE Office of Scientific and Technical Information (OSTI.GOV)
Holland, Stephen; Mahan, Cody; Kuhn, Michael J
2013-01-01
This paper proposes a novel technique for utilizing conductive textiles as a distributed sensor for detecting and localizing liquids (e.g., blood), rips (e.g., bullet holes), and potentially biosignals. The proposed technique is verified through both simulation and experimental measurements. Circuit theory is utilized to depict conductive fabric as a bounded, near-infinite grid of resistors. Solutions to the well-known infinite resistance grid problem are used to confirm the accuracy and validity of this modeling approach. The simulations allow discontinuities to be placed within the resistor matrix to illustrate the effects of bullet holes in the fabric. A real-time experimental system was developed that uses a multiplexed Wheatstone bridge approach to reconstruct the resistor grid across the conductive fabric and detect liquids and rips. The resistor grid model is validated through a comparison of simulated and experimental results, which suggest accuracy proportional to the electrode spacing in determining the presence and location of discontinuities in conductive fabric samples. Future work is focused on refining the experimental system to detect and localize events more accurately, and on developing a complete prototype that can be deployed for field testing. Potential applications include intelligent clothing; flexible, lightweight sensing systems; and combat wound detection.
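The bounded resistor-grid abstraction can be made concrete with a node-voltage (Laplacian) solve: unit resistors join 4-connected neighbours, current is injected between two probe electrodes, and a rip is modelled by deleting the resistors around a node. The grid size, probe placement, and rip location below are illustrative, not the paper's configuration.

```python
import numpy as np

n = 12                                     # n x n grid of fabric nodes
idx = lambda r, c: r * n + c

def grid_resistance(src, snk, broken=frozenset()):
    """Two-point resistance of a bounded unit-resistor grid."""
    G = np.zeros((n * n, n * n))           # conductance (Laplacian) matrix
    for r in range(n):
        for c in range(n):
            for dr, dc in ((0, 1), (1, 0)):
                r2, c2 = r + dr, c + dc
                if r2 < n and c2 < n:
                    a, b = idx(r, c), idx(r2, c2)
                    if a in broken or b in broken:
                        continue           # a rip removes these resistors
                    G[a, a] += 1.0; G[b, b] += 1.0
                    G[a, b] -= 1.0; G[b, a] -= 1.0
    for a in range(n * n):                 # pin isolated nodes (rip centres)
        if G[a, a] == 0.0:
            G[a, a] = 1.0
    I = np.zeros(n * n)
    I[src] = 1.0                           # inject 1 A at the source
    G[snk, :] = 0.0; G[snk, snk] = 1.0     # ground the sink node
    V = np.linalg.solve(G, I)
    return V[src] - V[snk]                 # R = V / (1 A)

src, snk = idx(0, 0), idx(n - 1, n - 1)
print(f"intact fabric: {grid_resistance(src, snk):.3f} ohm")
print(f"with a rip:    {grid_resistance(src, snk, {idx(5, 5)}):.3f} ohm")
```

Scanning such two-point resistances over many electrode pairs, as the multiplexed Wheatstone bridge does in hardware, lets measured deviations from the intact-grid baseline be triangulated back to the discontinuity's location.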
Model-Based Experimental Development of Passive Compliant Robot Legs from Fiberglass Composites
Lin, Shang-Chang; Hu, Chia-Jui; Lin, Pei-Chun
2015-01-01
We report on a methodology for developing compliant, half-circular, composite robot legs with designable stiffness. First, force-displacement experiments are performed on flat cantilever composites made from one or multiple fiberglass cloths. By mapping the cantilever mechanics to a virtual spring model, the equivalent elastic moduli of the composites can be derived. Next, by using the model that links the curved-beam mechanics back to the virtual spring, the resultant stiffness of the composite in a half-circular shape can be estimated without intensive experimental tryouts. The overall methodology has been experimentally validated, and the fabricated composites were used on a hexapod robot to perform walking and leaping behaviors. PMID:27065748
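The cantilever-to-virtual-spring mapping reduces to textbook beam theory. A minimal sketch, with illustrative dimensions and force-displacement data rather than the paper's measurements:

import numpy as np

L, b, t = 0.10, 0.02, 0.003    # cantilever length, width, thickness [m] (illustrative)
force = np.array([0.5, 1.0, 1.5, 2.0])               # applied tip loads [N]
disp = np.array([0.0021, 0.0042, 0.0061, 0.0083])    # measured tip deflections [m]

k_spring = np.polyfit(disp, force, 1)[0]   # virtual-spring stiffness [N/m]
I = b * t**3 / 12.0                        # second moment of area [m^4]
E = k_spring * L**3 / (3.0 * I)            # equivalent modulus from k = 3EI/L^3
print(f"k = {k_spring:.0f} N/m, equivalent E = {E / 1e9:.2f} GPa")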
Design and Implementation of a Smart Home System Using Multisensor Data Fusion Technology.
Hsu, Yu-Liang; Chou, Po-Huan; Chang, Hsing-Cheng; Lin, Shyan-Lung; Yang, Shih-Chin; Su, Heng-Yi; Chang, Chih-Chien; Cheng, Yuan-Sheng; Kuo, Yu-Chen
2017-07-15
This paper aims to develop a multisensor data fusion technology-based smart home system by integrating wearable intelligent technology, artificial intelligence, and sensor fusion technology. We have developed the following three systems to create an intelligent smart home environment: (1) a wearable motion sensing device to be placed on residents' wrists and its corresponding 3D gesture recognition algorithm to implement a convenient automated household appliance control system; (2) a wearable motion sensing device mounted on a resident's feet and its indoor positioning algorithm to realize an effective indoor pedestrian navigation system for smart energy management; (3) a multisensor circuit module and an intelligent fire detection and alarm algorithm to realize a home safety and fire detection system. In addition, an intelligent monitoring interface is developed to provide real-time information about the smart home system, such as environmental temperatures, CO concentrations, communicative environmental alarms, household appliance status, human motion signals, and the results of gesture recognition and indoor positioning. Furthermore, an experimental testbed for validating the effectiveness and feasibility of the smart home system was built and verified experimentally. The results showed that the 3D gesture recognition algorithm could achieve recognition rates for automated household appliance control of 92.0%, 94.8%, 95.3%, and 87.7% by the 2-fold cross-validation, 5-fold cross-validation, 10-fold cross-validation, and leave-one-subject-out cross-validation strategies. For indoor positioning and smart energy management, the distance accuracy and positioning accuracy were around 0.22% and 3.36% of the total traveled distance in the indoor environment. For home safety and fire detection, the classification rate achieved 98.81% accuracy for determining the conditions of the indoor living environment.
Generation, Analysis and Characterization of Anisotropic Engineered Meta Materials
NASA Astrophysics Data System (ADS)
Trifale, Ninad T.
A methodology for the systematic generation of highly anisotropic micro-lattice structures was investigated. Multiple algorithms for the generation and validation of engineered structures are developed and evaluated. The set of all possible permutations of structures for an 8-node cubic unit cell was considered, and the degree of anisotropy of meta-properties in heat transport and mechanical elasticity was evaluated. Feasibility checks were performed to ensure that the generated unit-cell network was repeatable and formed a continuous lattice structure. Four different strategies for generating permutations of the structures are discussed. Analytical models were developed to predict the effective thermal, mechanical, and permeability characteristics of these cellular structures. Experimentation and numerical modeling techniques were used to validate the models. A self-consistent mechanical elasticity model was developed that connects the meso-scale properties to the stiffness of individual struts. A three-dimensional thermal resistance network analogy was used to evaluate the effective thermal conductivity of the structures: the struts were modeled as a network of one-dimensional thermal resistive elements and the effective conductivity evaluated. The models were validated against numerical simulations and experimental measurements on 3D-printed samples. A model was also developed to predict the effective permeability of these engineered structures based on Darcy's law: drag coefficients were evaluated for individual connections in the transverse and longitudinal directions, and an interaction term was calibrated from experimental data in the literature. A generic optimization framework coupled to a finite element solver was developed for analyzing any application involving porous structures. Objective functions were formulated to address the frequently observed trade-offs among stiffness, thermal conductivity, permeability, and porosity. Three applications were analyzed for potential use of the engineered materials: a heat spreader involving thermal and mechanical constraints, artificial bone grafts involving mechanical and permeability constraints, and structural materials involving mechanical, thermal, and porosity constraints. Recommendations for optimum topologies for specific operating conditions are provided.
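The thermal-resistance-network analogy can be illustrated with a back-of-envelope sketch; the geometry and solid conductivity below are assumptions, not the thesis data:

import math

strut_k = 180.0      # assumed solid conductivity (aluminum-alloy-like) [W/m-K]
cell = 5e-3          # cubic unit-cell edge [m]
d = 0.5e-3           # strut diameter [m]
n_axial = 4          # struts aligned with the heat-flow axis

A_strut = math.pi * d**2 / 4.0
G_strut = strut_k * A_strut / cell     # one axial strut as a 1-D resistor [W/K]
G_cell = n_axial * G_strut             # axial struts act in parallel
k_eff = G_cell * cell / cell**2        # effective conductivity of the unit cell
print(f"k_eff ~ {k_eff:.1f} W/m-K (diagonal struts and contact resistance ignored)")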
Development and Validation of Accident Models for FeCrAl Cladding
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gamble, Kyle Allan Lawrence; Hales, Jason Dean
2016-08-01
The purpose of this milestone report is to present the work completed on material model development for FeCrAl cladding and to highlight the results of applying these models to Loss of Coolant Accidents (LOCA) and Station Blackouts (SBO). With the limited experimental data available (essentially only the data used to create the models), true validation is not possible. In the absence of an alternative, qualitative comparisons between FeCrAl- and Zircaloy-4-cladded rods during postulated accident scenarios have been completed, demonstrating the superior performance of FeCrAl.
Reduced and Validated Kinetic Mechanisms for Hydrogen-CO-Air Combustion in Gas Turbines
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yiguang Ju; Frederick Dryer
2009-02-07
A rigorous experimental, theoretical, and numerical investigation of various issues relevant to the development of reduced, validated kinetic mechanisms for synthetic gas combustion in gas turbines was carried out, including the construction of new radiation models for combusting flows, improvement of flame speed measurement techniques, measurements and chemical kinetic analysis of H2/CO/CO2/O2/diluent mixtures, revision of the H2/O2 kinetic model to improve flame speed prediction capabilities, and development of a multi-time-scale algorithm to improve computational efficiency in reacting flow simulations.
Quantitative bioanalysis of strontium in human serum by inductively coupled plasma-mass spectrometry
Somarouthu, Srikanth; Ohh, Jayoung; Shaked, Jonathan; Cunico, Robert L; Yakatan, Gerald; Corritori, Suzana; Tami, Joe; Foehr, Erik D
2015-01-01
Aim: A bioanalytical method using inductively coupled plasma-mass spectrometry to measure endogenous levels of strontium in human serum was developed and validated. Results & methodology: This article details the experimental procedures used for the method development and validation, demonstrating the application of the inductively coupled plasma-mass spectrometry method for quantification of strontium in human serum samples. The assay was validated for specificity, linearity, accuracy, precision, recovery, and stability. Significant endogenous levels of strontium are present in human serum samples, ranging from 19 to 96 ng/ml with a mean of 34.6 ± 15.2 ng/ml (SD). Discussion & conclusion: Calibration procedures and sample pretreatment were simplified for high-throughput analysis. The validation demonstrates that the method is sensitive and selective for quantification of strontium (88Sr) and is suitable for routine clinical testing of strontium in human serum samples. PMID:28031925
Threats to the Internal Validity of Experimental and Quasi-Experimental Research in Healthcare.
Flannelly, Kevin J; Flannelly, Laura T; Jankowski, Katherine R B
2018-01-01
The article defines, describes, and discusses the seven threats to the internal validity of experiments discussed by Donald T. Campbell in his classic 1957 article: history, maturation, testing, instrument decay, statistical regression, selection, and mortality. These concepts are said to be threats to the internal validity of experiments because they pose alternate explanations for the apparent causal relationship between the independent variable and dependent variable of an experiment if they are not adequately controlled. A series of simple diagrams illustrate three pre-experimental designs and three true experimental designs discussed by Campbell in 1957 and several quasi-experimental designs described in his book written with Julian C. Stanley in 1966. The current article explains why each design controls for or fails to control for these seven threats to internal validity.
NASA Astrophysics Data System (ADS)
Stigliano, Robert Vincent
The use of magnetic nanoparticles (mNPs) to induce local hyperthermia has been emerging in recent years as a promising cancer therapy, both stand-alone and in combination with surgery, radiation, and chemotherapy. The mNP solution can be injected either directly into the tumor or administered intravenously. Studies have shown that some cancer cells associate with, internalize, and aggregate mNPs more readily than normal cells, with and without antibody targeting. Once the mNPs are delivered inside the cells, a low-frequency (30-300 kHz) alternating electromagnetic field is used to activate them. The nanoparticles absorb the applied field and provide localized heat generation at nanometer-to-micron scales. Treatment planning models have been shown to improve treatment efficacy in radiation therapy by limiting normal tissue damage while maximizing dose to the tumor. To date, there is no clinical treatment planning model for magnetic nanoparticle hyperthermia that is robust, validated, and commercially available. The focus of this research is the development and experimental validation of a treatment planning model, consisting of a coupled electromagnetic and thermal model that predicts dynamic thermal distributions during treatment. When allowed to incubate, the mNPs are often sequestered by cancer cells and packed into endosomes. The proximity of the mNPs strongly influences their ability to heat due to interparticle magnetic interaction effects. A model of mNP heating that accounts for magnetic interaction was developed and validated against experimental data. An animal study in mice was conducted to determine the effects of mNP solution injection duration and PEGylation on the macroscale mNP distribution within the tumor, in order to further inform the treatment planning model and future experimental technique. In clinical applications, a critical limiting factor for the maximum applied field is the heating caused by eddy currents induced in the noncancerous tissue. Phantom studies were conducted to validate the ability of the model to accurately predict eddy current heating in the case of zero blood perfusion, and preliminary data were collected to show the validity of the model in live mice, incorporating blood perfusion.
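The thermal side of such a treatment planning model is commonly a Pennes bioheat equation. The sketch below is a generic 1-D explicit finite-difference illustration with an assumed Gaussian nanoparticle heat source; tissue properties and source strength are placeholders, not the dissertation's calibrated values:

import numpy as np

nx, dx, dt = 101, 1e-3, 0.05             # 10 cm of tissue, 1 mm grid, 50 ms step
k, rho, c = 0.5, 1050.0, 3600.0          # conductivity [W/m-K], density, heat capacity
w_b, rho_b, c_b, T_a = 0.5e-3, 1060.0, 3600.0, 37.0  # perfusion [1/s], blood, arterial T
x = np.arange(nx) * dx
Q = 1e5 * np.exp(-((x - 0.05) / 0.005) ** 2)   # assumed Gaussian mNP source [W/m^3]
T = np.full(nx, 37.0)

for _ in range(int(600 / dt)):                 # 10 minutes of field-on heating
    lap = (np.roll(T, -1) - 2 * T + np.roll(T, 1)) / dx**2
    T += dt / (rho * c) * (k * lap - w_b * rho_b * c_b * (T - T_a) + Q)
    T[0] = T[-1] = 37.0                        # far-field tissue held at core temp

print(f"peak temperature after 10 min of heating: {T.max():.1f} C")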
López, Diego M; Blobel, Bernd; Gonzalez, Carolina
2010-01-01
Requirement analysis, design, implementation, evaluation, use, and maintenance of semantically interoperable Health Information Systems (HIS) have to be based on eHealth standards. HIS-DF is a comprehensive approach for HIS architectural development based on standard information models and vocabulary. The empirical validity of HIS-DF has not been demonstrated so far. Through an empirical experiment, the paper demonstrates that, using HIS-DF and HL7 information models, the semantic quality of an HIS architecture can be improved compared to architectures developed using the traditional RUP process. Semantic quality of the architecture was measured in terms of the model's completeness and validity metrics. The experimental results demonstrated an increase in completeness of 14.38% and an increase in validity of 16.63% when using HIS-DF and HL7 information models in a sample HIS development project. Quality assurance of the system architecture in earlier stages of HIS development implies increased quality of the final HIS, which in turn suggests an indirect impact on patient care.
Supersonic Retro-Propulsion Experimental Design for Computational Fluid Dynamics Model Validation
NASA Technical Reports Server (NTRS)
Berry, Scott A.; Laws, Christopher T.; Kleb, W. L.; Rhode, Matthew N.; Spells, Courtney; McCrea, Andrew C.; Truble, Kerry A.; Schauerhamer, Daniel G.; Oberkampf, William L.
2011-01-01
The development of supersonic retro-propulsion, an enabling technology for heavy payload exploration missions to Mars, is the primary focus of the present paper. A new experimental model, intended to provide computational fluid dynamics model validation data, was recently designed for the Langley Research Center Unitary Plan Wind Tunnel Test Section 2. Pre-test computations were instrumental in sizing and refining the model, over the Mach number range of 2.4 to 4.6, such that tunnel blockage and internal flow separation issues would be minimized. A 5-in diameter 70-deg sphere-cone forebody, which accommodates up to four 4:1 area ratio nozzles, followed by a 10-in long cylindrical aftbody was developed for this study based on the computational results. The model was designed to allow for a large number of surface pressure measurements on the forebody and aftbody. Supplemental data included high-speed Schlieren video and internal pressures and temperatures. The run matrix was developed to allow for the quantification of various sources of experimental uncertainty, such as random errors due to run-to-run variations and bias errors due to flow field or model misalignments. Some preliminary results and observations from the test are presented, although detailed analyses of the data and uncertainties are still ongoing.
Computational Modeling and Validation for Hypersonic Inlets
NASA Technical Reports Server (NTRS)
Povinelli, Louis A.
1996-01-01
Hypersonic inlet research activity at NASA is reviewed. The basis for the paper is the experimental tests performed with three inlets: the NASA Lewis Research Center Mach 5, the McDonnell Douglas Mach 12, and the NASA Langley Mach 18. Both three-dimensional PNS and NS codes have been used to compute the flow within the three inlets. Modeling assumptions in the codes involve the turbulence model, the nature of the boundary layer, shock wave-boundary layer interaction, and the flow spilled to the outside of the inlet. Use of the codes and the experimental data are helping to develop a clearer understanding of the inlet flow physics and to focus on the modeling improvements required in order to arrive at validated codes.
NASA Astrophysics Data System (ADS)
Crâştiu, I.; Nyaguly, E.; Deac, S.; Gozman-Pop, C.; Bârgău, A.; Bereteu, L.
2018-01-01
The purpose of this paper is the development and validation of an impulse excitation technique to determine the flexural critical speeds of single-rotor and multi-rotor shafts. The experimental measurement of the vibroacoustic response is carried out using a condenser microphone as the transducer. By means of modal analysis using the finite element method (FEM), the natural frequencies and mode shapes of one-rotor and three-rotor specimens are determined. The vibration responses of the specimens, in simply supported conditions, are processed using algorithms based on the Fast Fourier Transform (FFT). To validate the modal parameters estimated using finite element analysis (FEA), they are compared with the experimental ones.
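The processing chain, impulse response in and natural frequencies out, can be sketched in a few lines. The decaying two-mode signal below stands in for a recorded microphone response, and the mode frequencies are invented:

import numpy as np
from scipy.signal import find_peaks

fs = 44100
t = np.arange(0, 1.0, 1 / fs)
resp = (np.exp(-8 * t) * np.sin(2 * np.pi * 412.0 * t)             # invented mode 1
        + 0.4 * np.exp(-12 * t) * np.sin(2 * np.pi * 1130.0 * t))  # invented mode 2

spec = np.abs(np.fft.rfft(resp * np.hanning(len(resp))))
freqs = np.fft.rfftfreq(len(resp), 1 / fs)
peaks, _ = find_peaks(spec, prominence=spec.max() * 0.1)
print("estimated natural frequencies [Hz]:", freqs[peaks])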
NASA Astrophysics Data System (ADS)
L'Hostis, V.; Brunet, C.; Poupard, O.; Petre-Lazar, I.
2006-11-01
Several ageing models are available for the prediction of the mechanical consequences of rebar corrosion. They are used for service life prediction of reinforced concrete structures. Concerning corrosion diagnosis of reinforced concrete, some Non Destructive Testing (NDT) tools have been developed and have been in use for some years. However, these developments require validation on existing concrete structures. The French project “Benchmark des Poutres de la Rance” contributes to this aspect. It has two main objectives: (i) validation of mechanical models to estimate the influence of rebar corrosion on the load bearing capacity of a structure, and (ii) qualification of the use of NDT results to collect information on steel corrosion within reinforced-concrete structures. Ten French and European institutions, from both academic research laboratories and industrial companies, contributed during the years 2004 and 2005. This paper presents the project, which was divided into several work packages: (i) the reinforced concrete beams were characterized using non-destructive testing tools, (ii) the mechanical behaviour of the beams was experimentally tested, (iii) complementary laboratory analyses were performed, and (iv) numerical simulation results were compared with the experimental results obtained from the mechanical tests.
DOT National Transportation Integrated Search
2014-10-01
This research program develops and validates structural design guidelines and details for concrete bridge decks with corrosion-resistant reinforcing (CRR) bars. A two-phase experimental program was conducted where a control test set consistent wi...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Whelan, B; Keall, P; Holloway, L
Purpose: MRI guided radiation therapy (MRIgRT) is a rapidly growing field; however, Linac operation in MRI fringe fields represents an ongoing challenge. We have previously shown in-silico that Linacs could be redesigned to function in the in-line orientation with no magnetic shielding by adopting an RF-gun configuration. Other authors have also published in-silico studies of Linac operation in magnetic fields; however, to date no experimental validation data is published. This work details the design, construction, and installation of an experimental beam line to validate our in-silico results. Methods: An RF-gun comprising 1.5 accelerating cells and capable of generating electron energies up to 3.2 MeV is used. The experimental apparatus was designed to monitor both beam current (toroid current monitor) and spot size (two phosphor screens with viewports), and to generate peak magnetic fields of at least 1000 G (three variable-current electromagnetic coils). Thermal FEM simulations were developed to ensure coil temperature remained within 100 °C. Other design considerations included beam disposal, vacuum maintenance, radiation shielding, earthquake safety, and machine protection interlocks. Results: The beam line has been designed, built, and installed in a radiation-shielded bunker. Water cooling, power supplies, thermocouples, cameras, and radiation shielding have been successfully connected and tested. Interlock testing, vacuum processing, and RF processing have been successfully completed. The first beam-on is expected within weeks. The coil heating simulations show that, with care, peak fields of up to 1200 G (320 G at the cathode) can be produced using 40 A of current, which is well within the fields expected for MRI-Linac systems. The maximum coil temperature at this current was 84 °C after 6 minutes. Conclusion: An experimental beam line has been constructed and installed at SLAC in order to experimentally characterise RF-gun performance in in-line magnetic fields, validate in-silico design work, and provide the first published experimental data relating to accelerator functionality for MRIgRT.
A Nonparametric Statistical Approach to the Validation of Computer Simulation Models
1985-11-01
At the Ballistic Research Laboratory, the Experimental Design and Analysis Branch of the Systems Engineering and Concepts Analysis Division was funded to... Winter, E. M., Wisemiler, D. P., and Ujihara, J. K., "Verification and Validation of Engineering Simulations with Minimal Data," Proceedings of the 1976 Summer... used by numerous authors. Law has augmented their approach with specific suggestions for each of the three stages: 1. develop high face-validity
Validation of the SimSET simulation package for modeling the Siemens Biograph mCT PET scanner
NASA Astrophysics Data System (ADS)
Poon, Jonathan K.; Dahlbom, Magnus L.; Casey, Michael E.; Qi, Jinyi; Cherry, Simon R.; Badawi, Ramsey D.
2015-02-01
Monte Carlo simulation provides a valuable tool in performance assessment and optimization of system design parameters for PET scanners. SimSET is a popular Monte Carlo simulation toolkit that features fast simulation time, as well as variance reduction tools to further enhance computational efficiency. However, SimSET has lacked the ability to simulate block detectors until its most recent release. Our goal is to validate new features of SimSET by developing a simulation model of the Siemens Biograph mCT PET scanner and comparing the results to a simulation model developed in the GATE simulation suite and to experimental results. We used the NEMA NU-2 2007 scatter fraction, count rates, and spatial resolution protocols to validate the SimSET simulation model and its new features. The SimSET model overestimated the experimental results of the count rate tests by 11-23% and the spatial resolution test by 13-28%, which is comparable to previous validation studies of other PET scanners in the literature. The difference between the SimSET and GATE simulation was approximately 4-8% for the count rate test and approximately 3-11% for the spatial resolution test. In terms of computational time, SimSET performed simulations approximately 11 times faster than GATE simulations. The new block detector model in SimSET offers a fast and reasonably accurate simulation toolkit for PET imaging applications.
A model of fluid and solute exchange in the human: validation and implications.
Bert, J L; Gyenge, C C; Bowen, B D; Reed, R K; Lund, T
2000-11-01
In order to better understand the complex, dynamic behaviour of the redistribution and exchange of fluid and solutes administered to normal individuals or to those with acute hypovolemia, mathematical models are used in addition to direct experimental investigation. Initial validation of a model developed by our group involved data from animal experiments (Gyenge, C.C., Bowen, B.D., Reed, R.K. & Bert, J.L. 1999b. Am J Physiol 277 (Heart Circ Physiol 46), H1228-H1240). For a first validation involving humans, we compare the results of simulations with a wide range of different types of data from two experimental studies. These studies involved administration of normal saline or hypertonic saline with Dextran to both normal and 10% haemorrhaged subjects. We compared simulations with data including the dynamic changes in plasma and interstitial fluid volumes (V_PL and V_IT), plasma and interstitial colloid osmotic pressures (Π_PL and Π_IT), haematocrit (Hct), plasma solute concentrations, and transcapillary flow rates. The model predictions were overall in very good agreement with the wide range of experimental results considered. Based on the conditions investigated, the model was also validated for humans. We used the model both to investigate mechanisms associated with the redistribution and transport of fluid and solutes administered following a mild haemorrhage and to speculate on the relationship between the timing and amount of fluid infusions and subsequent blood volume expansion.
Development of Officer Selection Battery Forms 3 and 4. Technical Report 603.
ERIC Educational Resources Information Center
Fischl, M. A.; And Others
This report describes the development, standardization, and validation of two parallel forms of the Officer Selection Battery, a 2-hour, group-administrable, paper-and-pencil test for assessing men and women applying for the Reserve Officers Training Corps (ROTC). Based on an extensive job analysis, 1,400 experimental items in 12 job areas were…
Machine learning, medical diagnosis, and biomedical engineering research - commentary.
Foster, Kenneth R; Koprowski, Robert; Skufca, Joseph D
2014-07-05
A large number of papers are appearing in the biomedical engineering literature that describe the use of machine learning techniques to develop classifiers for detection or diagnosis of disease. However, the usefulness of this approach in developing clinically validated diagnostic techniques so far has been limited and the methods are prone to overfitting and other problems which may not be immediately apparent to the investigators. This commentary is intended to help sensitize investigators as well as readers and reviewers of papers to some potential pitfalls in the development of classifiers, and suggests steps that researchers can take to help avoid these problems. Building classifiers should be viewed not simply as an add-on statistical analysis, but as part and parcel of the experimental process. Validation of classifiers for diagnostic applications should be considered as part of a much larger process of establishing the clinical validity of the diagnostic technique.
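One classic pitfall the authors allude to can be demonstrated directly: selecting features on the full dataset before cross-validation inflates accuracy even on pure noise, whereas performing selection inside each fold reports the honest chance-level result. A minimal sketch:

import numpy as np
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5000))            # pure-noise "features"
y = rng.integers(0, 2, 100)                 # random class labels

# Wrong: feature selection sees all labels before cross-validation
X_leaky = SelectKBest(f_classif, k=20).fit_transform(X, y)
leaky = cross_val_score(SVC(), X_leaky, y, cv=5).mean()

# Right: selection happens inside each training fold only
honest = cross_val_score(make_pipeline(SelectKBest(f_classif, k=20), SVC()),
                         X, y, cv=5).mean()
print(f"leaky accuracy: {leaky:.2f}  honest accuracy: {honest:.2f}")

On data with no signal at all, the leaky estimate typically lands well above 0.5 while the honest pipeline stays near chance, which is exactly the kind of self-deception the commentary warns against.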
QSAR modeling: where have you been? Where are you going to?
Cherkasov, Artem; Muratov, Eugene N; Fourches, Denis; Varnek, Alexandre; Baskin, Igor I; Cronin, Mark; Dearden, John; Gramatica, Paola; Martin, Yvonne C; Todeschini, Roberto; Consonni, Viviana; Kuz'min, Victor E; Cramer, Richard; Benigni, Romualdo; Yang, Chihae; Rathman, James; Terfloth, Lothar; Gasteiger, Johann; Richard, Ann; Tropsha, Alexander
2014-06-26
Quantitative structure-activity relationship modeling is one of the major computational tools employed in medicinal chemistry. However, throughout its entire history it has drawn both praise and criticism concerning its reliability, limitations, successes, and failures. In this paper, we discuss (i) the development and evolution of QSAR; (ii) the current trends, unsolved problems, and pressing challenges; and (iii) several novel and emerging applications of QSAR modeling. Throughout this discussion, we provide guidelines for QSAR development, validation, and application, which are summarized in best practices for building rigorously validated and externally predictive QSAR models. We hope that this Perspective will help communications between computational and experimental chemists toward collaborative development and use of QSAR models. We also believe that the guidelines presented here will help journal editors and reviewers apply more stringent scientific standards to manuscripts reporting new QSAR studies, as well as encourage the use of high quality, validated QSARs for regulatory decision making.
A scale for measuring hygiene behavior: development, reliability and validity.
Stevenson, Richard J; Case, Trevor I; Hodgson, Deborah; Porzig-Drummond, Renata; Barouei, Javad; Oaten, Megan J
2009-09-01
There is currently no general self-report measure for assessing hygiene behavior. This article details the development and testing of such a measure. In studies 1 to 4, a total of 855 participants were used for scale and subscale development and for reliability and validity testing. The latter involved establishing the relationships between self-reported hygiene behavior and existing measures, hand hygiene behavior, illness rates, and a physiological marker of immune function. In study 5, a total of 507 participants were used to assess the psychometric properties of the final revised version of the scale. The final 23-item scale comprised 5 subscales: general, household, food-related, handwashing technique, and personal hygiene. Studies 1 to 4 confirmed the scale's reliability and validity, and study 5 confirmed the scale's 5-factor structure. The scale is potentially suitable for multiple uses, in various settings, and for experimental and correlational approaches.
Niu, Ran; Skliar, Mikhail
2012-07-01
In this paper, we develop and validate a method to identify computationally efficient site- and patient-specific models of ultrasound thermal therapies from MR thermal images. The models of the specific absorption rate of the transduced energy and the temperature response of the therapy target are identified in the reduced basis of the proper orthogonal decomposition of thermal images acquired in response to a mild thermal test excitation. The method permits dynamic re-identification of the treatment models during the therapy by recursively utilizing newly acquired images. Such adaptation is particularly important during high-temperature therapies, which are known to substantially and rapidly change tissue properties and blood perfusion. The developed theory was validated for the case of focused ultrasound heating of a tissue phantom. The experimental and computational results indicate that the developed approach produces accurate low-dimensional treatment models despite temporal and spatial noise in the MR images and a slow image acquisition rate.
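The reduced-basis step can be sketched with a plain SVD. Synthetic snapshots stand in for the acquired MR thermal images, and the 99% energy cutoff is an arbitrary illustration, not the paper's criterion:

import numpy as np

rng = np.random.default_rng(0)
n_pix, n_snap = 64 * 64, 40
snapshots = rng.random((n_pix, n_snap))          # placeholder MR image stack

mean = snapshots.mean(axis=1, keepdims=True)
U, s, _ = np.linalg.svd(snapshots - mean, full_matrices=False)
energy = np.cumsum(s**2) / np.sum(s**2)
r = int(np.searchsorted(energy, 0.99)) + 1       # modes capturing 99% of variance
basis = U[:, :r]                                 # reduced POD basis

coeffs = basis.T @ (snapshots - mean)            # images in reduced coordinates
print(f"kept {r} of {n_snap} modes; coefficient matrix shape {coeffs.shape}")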
2012-08-01
Experimental Characterization and Validation of Simultaneous Gust Alleviation and Energy Harvesting for Multifunctional Wing Spars (AFOSR). [Briefing-chart residue; recoverable content: gust simulation using the Dryden PSD (U0 = 15 m/s, Lv = 350 m) for cloud-wind and clear-sky conditions; energy harvested from normal vibration; an energy control law based on limited-energy constraints; experimentally validated simultaneous energy harvesting and vibration control.]
Benson, James D; Benson, Charles T; Critser, John K
2014-08-01
Optimization of cryopreservation protocols for cells and tissues requires accurate models of heat and mass transport. Model selection often depends on the configuration of the tissue. Here, a mathematical and conceptual model of water and solute transport for whole hamster pancreatic islets has been developed and experimentally validated, incorporating fundamental biophysical data from previous studies on individual hamster islet cells while retaining whole-islet structural information. It describes coupled transport of water and solutes through the islet by three routes: intracellular, intercellular, and in combination. In particular, we use domain decomposition techniques to couple a transmembrane flux model with an interstitial mass transfer model. The only significant undetermined variable is the cellular surface area in contact with the intercellularly transported solutes, A_is. The model was validated and A_is determined using a 3×3 factorial experimental design blocked for experimental day. Whole-islet physical experiments were compared with model predictions at three temperatures, three perfusing solutions, and three islet size groups. On average, 4.4 islets were compared at each of the 27 experimental conditions and found to correlate with a coefficient of determination of 0.87±0.06 (mean ± SD). Only the treatment variable of perfusing solution was found to be significant (p<0.05). We have devised a model that retains much of the intrinsic geometric configuration of the system, so that fewer laboratory experiments are needed to determine model parameters and thus to develop new optimized cryopreservation protocols. Additionally, extensions to ovarian follicles and other concentric tissue structures may be made.
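The cell-level building block of such models is typically a two-parameter membrane transport formalism. The sketch below integrates water volume and permeating-solute content for a single cell with generic parameter values (not the hamster-islet fits), reproducing the familiar shrink-swell response:

from scipy.integrate import solve_ivp

Lp, Ps, A = 1.5e-13, 1e-8, 1e-9     # hydraulic cond. [m/Pa-s], CPA perm. [m/s], area [m^2]
RT = 8.314 * 293.0                  # gas constant times temperature [J/mol]
Me_cpa, Me_salt = 2000.0, 300.0     # external osmolalities [mol/m^3]
n_salt = 300.0 * 1e-15              # impermeant intracellular salt [mol] for a 1 pL cell

def rhs(t, y):
    V, n_cpa = y                    # water volume [m^3], intracellular CPA [mol]
    Mi_cpa, Mi_salt = n_cpa / V, n_salt / V
    dV = -Lp * A * RT * ((Me_cpa + Me_salt) - (Mi_cpa + Mi_salt))  # osmotic water flux
    dn = Ps * A * (Me_cpa - Mi_cpa)                                # CPA permeation
    return [dV, dn]

sol = solve_ivp(rhs, (0.0, 120.0), [1e-15, 0.0], max_step=0.05)
v = sol.y[0] / 1e-15
print(f"volume: min {v.min():.2f}, final {v[-1]:.2f} (x initial): shrink-swell")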
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jammes, C.; Filliatre, P.; De Izarra, G.
The neutron flux monitoring system of the French GEN-IV sodium-cooled fast reactor will rely on high-temperature fission chambers installed in the reactor vessel and capable of operating over a wide range of neutron flux. The definition of such a system is presented, and the technological solutions are justified using simulation and experimental results.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Oberhardt, Matthew A.; Zarecki, Raphy; Reshef, Leah
Recent insights suggest that non-specific and/or promiscuous enzymes are common and active across life. Understanding the role of such enzymes is an important open question in biology. Here we develop a genome-wide method, PROPER, that uses a permissive PSI-BLAST approach to predict promiscuous activities of metabolic genes. Enzyme promiscuity is typically studied experimentally using multicopy suppression, in which over-expression of a promiscuous 'replacer' gene rescues lethality caused by inactivation of a 'target' gene. We use PROPER to predict multicopy suppression in Escherichia coli, achieving highly significant overlap with published cases (hypergeometric p = 4.4e-13). We then validate three novel predicted target-replacer gene pairs in new multicopy suppression experiments. We next go beyond PROPER and develop a network-based approach, GEM-PROPER, that integrates PROPER with genome-scale metabolic modeling to predict promiscuous replacements via alternative metabolic pathways. GEM-PROPER predicts a new indirect replacer (thiG) for an essential enzyme (pdxB) in the production of pyridoxal 5'-phosphate (the active form of vitamin B6), which we validate experimentally via multicopy suppression. We also perform a structural analysis of thiG to determine its potential promiscuous active site, which we validate experimentally by inactivating the implicated residues and showing a loss of replacer activity. This study is thus a successful example where a computational investigation leads to a network-based identification of an indirect promiscuous replacement of a key metabolic enzyme, which would have been extremely difficult to identify directly.
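The overlap-significance calculation behind a p-value of this kind is a one-line hypergeometric test. The counts below are illustrative placeholders, not the paper's tallies:

from scipy.stats import hypergeom

N, K, n, k = 20000, 40, 300, 9      # gene pairs, known cases, predictions, overlap
p = hypergeom.sf(k - 1, N, K, n)    # P(overlap >= k) under random prediction
print(f"hypergeometric enrichment p = {p:.2e}")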
Experimental and computational surface and flow-field results for an all-body hypersonic aircraft
NASA Technical Reports Server (NTRS)
Lockman, William K.; Lawrence, Scott L.; Cleary, Joseph W.
1990-01-01
The objective of the present investigation is to establish a benchmark experimental database for a generic hypersonic vehicle shape for validation and/or calibration of advanced computational fluid dynamics computer codes. This paper includes results from the comprehensive test program conducted in the NASA/Ames 3.5-foot Hypersonic Wind Tunnel for a generic all-body hypersonic aircraft model. Experimental and computational results on flow visualization, surface pressures, surface convective heat transfer, and pitot-pressure flow-field surveys are presented. Comparisons of the experimental results with computational results from an upwind parabolized Navier-Stokes code developed at Ames demonstrate the capabilities of this code.
Progressive collapse of a two-story reinforced concrete frame with embedded smart aggregates
NASA Astrophysics Data System (ADS)
Laskar, Arghadeep; Gu, Haichang; Mo, Y. L.; Song, Gangbing
2009-07-01
This paper reports the experimental and analytical results of a two-story reinforced concrete frame instrumented with innovative piezoceramic-based smart aggregates (SAs) and subjected to a monotonic lateral load up to failure. A finite element model of the frame is developed and analyzed using the Open System for Earthquake Engineering Simulation (OpenSees). The finite element analysis (FEA) is used to predict the load-deformation curve as well as the development of plastic hinges in the frame. The load-deformation curve predicted by the FEA matched well with the experimental results. The sequence of development of plastic hinges in the frame is also studied from the FEA results. The locations of the plastic hinges, as obtained from the analysis, were similar to those observed during the experiment. An SA-based approach is also proposed to evaluate the health status of the concrete frame and identify the development of plastic hinges during the loading procedure. The results of the FEA are used to validate the SA-based approach for detecting the locations and occurrence of the plastic hinges leading to the progressive collapse of the frame. The locations and sequential development of the plastic hinges obtained from the SA-based approach correspond well with the FEA results. The proposed SA-based approach, thus validated using FEA and experimental results, has great potential to be applied in the health monitoring of large-scale civil infrastructures.
2017-09-01
Experimental Validation of Model Updating and Damage Detection via Eigenvalue Sensitivity Methods with Artificial Boundary Conditions, by Matthew D. Bouwense (distribution unlimited).
Improved Method for Linear B-Cell Epitope Prediction Using Antigen’s Primary Sequence
Raghava, Gajendra P. S.
2013-01-01
One of the major challenges in designing a peptide-based vaccine is the identification of antigenic regions in an antigen that can stimulate the B-cell response, also called B-cell epitopes. In the past, several methods have been developed for the prediction of conformational and linear (or continuous) B-cell epitopes. However, the existing methods for predicting linear B-cell epitopes are far from perfect. In this study, an attempt has been made to develop an improved method for predicting linear B-cell epitopes. We retrieved experimentally validated B-cell epitopes as well as non-B-cell epitopes from the Immune Epitope Database and derived two types of datasets, called the Lbtope_Variable and Lbtope_Fixed length datasets. The Lbtope_Variable dataset contains 14876 B-cell epitopes and 23321 non-epitopes of variable length, whereas the Lbtope_Fixed length dataset contains 12063 B-cell epitopes and 20589 non-epitopes of fixed length. We also evaluated the performance of models on the above datasets after removing highly identical peptides. In addition, we derived a third dataset, Lbtope_Confirm, having 1042 epitopes and 1795 non-epitopes, where each epitope or non-epitope has been experimentally validated in at least two studies. A number of models have been developed to discriminate epitopes from non-epitopes using different machine-learning techniques such as Support Vector Machine and K-Nearest Neighbor. We achieved accuracies from ∼54% to 86% using diverse features such as binary profiles, dipeptide composition, and AAP (amino acid pair) profiles. In this study, for the first time, experimentally validated non-B-cell epitopes have been used for developing a method for predicting linear B-cell epitopes; in previous studies, random peptides were used as non-B-cell epitopes. In order to provide a service to the scientific community, a web server, LBtope, has been developed for predicting and designing B-cell epitopes (http://crdd.osdd.net/raghava/lbtope/). PMID:23667458
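As an illustration of the feature/classifier setup, the sketch below computes 400-dimensional dipeptide compositions and cross-validates an SVM. The toy random peptides and labels are placeholders for the IEDB-derived datasets, so the reported accuracy is chance level:

import itertools
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

AA = "ACDEFGHIKLMNPQRSTVWY"
DIPEPTIDES = ["".join(p) for p in itertools.product(AA, repeat=2)]

def dipeptide_composition(seq):
    # 400-dimensional normalized dipeptide count vector
    v = np.zeros(len(DIPEPTIDES))
    for i in range(len(seq) - 1):
        v[DIPEPTIDES.index(seq[i:i + 2])] += 1
    return v / max(len(seq) - 1, 1)

rng = np.random.default_rng(1)
peptides = ["".join(rng.choice(list(AA), 15)) for _ in range(200)]  # toy peptides
labels = rng.integers(0, 2, 200)            # 1 = epitope, 0 = non-epitope (toy)

X = np.array([dipeptide_composition(p) for p in peptides])
acc = cross_val_score(SVC(kernel="rbf"), X, labels, cv=5).mean()
print(f"5-fold CV accuracy on toy data: {acc:.2f} (chance level, as expected)")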
Frequency Response Function Based Damage Identification for Aerospace Structures
NASA Astrophysics Data System (ADS)
Oliver, Joseph Acton
Structural health monitoring technologies continue to be pursued for aerospace structures in the interests of increased safety and, when combined with health prognosis, efficiency in life-cycle management. The current dissertation develops and validates damage identification technology as a critical component for structural health monitoring of aerospace structures and, in particular, composite unmanned aerial vehicles. The primary innovation is a statistical least-squares damage identification algorithm based in concepts of parameter estimation and model update. The algorithm uses frequency response function based residual force vectors derived from distributed vibration measurements to update a structural finite element model through statistically weighted least-squares minimization, producing location and quantification of the damage, estimation uncertainty, and an updated model. Advantages compared to other approaches include robust applicability to systems which are heavily damped, large, and noisy, with a relatively low number of distributed measurement points compared to the number of analytical degrees-of-freedom of an associated analytical structural model (e.g., modal finite element model). Motivation, research objectives, and a dissertation summary are discussed in Chapter 1, followed by a literature review in Chapter 2. Chapter 3 gives background theory and the damage identification algorithm derivation, followed by a study of fundamental algorithm behavior on a two degree-of-freedom mass-spring system with generalized damping. Chapter 4 investigates the impact of noise, then successfully proves the algorithm against competing methods using an analytical eight degree-of-freedom mass-spring system with non-proportional structural damping. Chapter 5 extends use of the algorithm to finite element models, including solutions for numerical issues, approaches for modeling damping approximately in reduced coordinates, and analytical validation using a composite sandwich plate model. Chapter 6 presents the final extension to experimental systems, including methods for initial baseline correlation and data reduction, and validates the algorithm on an experimental composite plate with impact damage. The final chapter deviates from development and validation of the primary algorithm to discuss development of an experimental scaled-wing test bed as part of a collaborative effort for developing structural health monitoring and prognosis technology. The dissertation concludes with an overview of technical conclusions and recommendations for future work.
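The core linear-algebra step can be sketched on a 3-DOF spring-mass stand-in for the finite element model: the residual force of the undamaged model acting on the measured response is linear in elemental stiffness-change parameters, so a least-squares solve locates and quantifies the damage. Damping and the statistical weighting are omitted here for brevity, and all values are invented:

import numpy as np

# 3-DOF chain: springs ground-1, 1-2, 2-3, unit masses
M, k = np.eye(3), 1e4
Ke = []
for e in range(3):
    Kel = np.zeros((3, 3))
    if e == 0:
        Kel[0, 0] = k
    else:
        Kel[np.ix_([e - 1, e], [e - 1, e])] = k * np.array([[1.0, -1.0], [-1.0, 1.0]])
    Ke.append(Kel)
K0 = sum(Ke)

theta_true = np.array([0.0, 0.3, 0.0])          # 30% stiffness loss in spring 2
Kd = K0 - sum(t * Kel for t, Kel in zip(theta_true, Ke))

F = np.array([1.0, 0.0, 0.0])                   # known excitation force
rows, rhs = [], []
for f_hz in (4.0, 9.0, 15.0):                   # FRF lines away from resonance
    w = 2 * np.pi * f_hz
    X = np.linalg.solve(Kd - w**2 * M, F)       # plays the role of measured FRF data
    # residual force of the healthy model: (K0 - w^2 M) X - F = sum_j theta_j Ke_j X
    rows.append(np.column_stack([Kel @ X for Kel in Ke]))
    rhs.append((K0 - w**2 * M) @ X - F)

theta_hat = np.linalg.lstsq(np.vstack(rows), np.concatenate(rhs), rcond=None)[0]
print("identified damage parameters:", np.round(theta_hat, 3))   # ~[0, 0.3, 0]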
DOT National Transportation Integrated Search
2009-10-01
In this study, the concept of hybrid FRP-concrete structural systems was applied to both bridge superstructure and deck systems. Results from both the experimental and computational analysis for both the hybrid bridge superstructure and deck ...
Bio-Optical Measurement and Modeling of the California Current and Southern Oceans
NASA Technical Reports Server (NTRS)
Mitchell, B. Gregg; Mitchell, B. Greg
2003-01-01
The SIMBIOS project's principal goals are to validate standard or experimental ocean color products through detailed bio-optical and biogeochemical measurements, and to combine ocean optical observations with modeling to contribute to satellite vicarious radiometric calibration and algorithm development.
A proposed new test for aptitude screening of air traffic controller applicants.
DOT National Transportation Integrated Search
1972-05-01
The study concerns the development and experimental validation of a novel aptitude test, referred to as 'Directional Headings' (or DHT), for the selection of Air Traffic Control Specialist (ATCS) trainees. The test requires the subject to rapidly int...
Performance Validation Approach for the GTX Air-Breathing Launch Vehicle
NASA Technical Reports Server (NTRS)
Trefny, Charles J.; Roche, Joseph M.
2002-01-01
The primary objective of the GTX effort is to determine whether or not air-breathing propulsion can enable a launch vehicle to achieve orbit in a single stage. Structural weight, vehicle aerodynamics, and propulsion performance must be accurately known over the entire flight trajectory in order to make a credible assessment. Structural, aerodynamic, and propulsion parameters are strongly interdependent, which necessitates a system approach to design, evaluation, and optimization of a single-stage-to-orbit concept. The GTX reference vehicle serves this purpose, by allowing design, development, and validation of components and subsystems in a system context. The reference vehicle configuration (including propulsion) was carefully chosen so as to provide high potential for structural and volumetric efficiency, and to allow the high specific impulse of air-breathing propulsion cycles to be exploited. Minor evolution of the configuration has occurred as analytical and experimental results have become available. With this development process comes increasing validation of the weight and performance levels used in system performance determination. This paper presents an overview of the GTX reference vehicle and the approach to its performance validation. Subscale test rigs and numerical studies used to develop and validate component performance levels and unit structural weights are outlined. The sensitivity of the equivalent, effective specific impulse to key propulsion component efficiencies is presented. The role of flight demonstration in development and validation is discussed.
Validation of a C2-C7 cervical spine finite element model using specimen-specific flexibility data.
Kallemeyn, Nicole; Gandhi, Anup; Kode, Swathi; Shivanna, Kiran; Smucker, Joseph; Grosland, Nicole
2010-06-01
This study presents a specimen-specific C2-C7 cervical spine finite element model that was developed using multiblock meshing techniques. The model was validated using in-house experimental flexibility data obtained from the cadaveric specimen used for mesh development. The C2-C7 specimen was subjected to pure continuous moments up to ±1.0 N·m in flexion, extension, lateral bending, and axial rotation, and the motions at each level were obtained. Additionally, the specimen was divided into C2-C3, C4-C5, and C6-C7 functional spinal units (FSUs), which were tested in the intact state as well as after sequential removal of the interspinous, ligamentum flavum, and capsular ligaments. The finite element model was initially assigned baseline material properties from the literature, but was calibrated using the in-house experimental motion data while utilizing the ranges of material property values reported in the literature. The calibrated model provided good agreement with the nonlinear experimental loading curves and can be used to further study the response of the cervical spine in various biomechanical investigations.
The rheology of three-phase suspensions at low bubble capillary number
Truby, J. M.; Mueller, S. P.; Llewellin, E. W.; Mader, H. M.
2015-01-01
We develop a model for the rheology of a three-phase suspension of bubbles and particles in a Newtonian liquid undergoing steady flow. We adopt an ‘effective-medium’ approach in which the bubbly liquid is treated as a continuous medium which suspends the particles. The resulting three-phase model combines separate two-phase models for bubble suspension rheology and particle suspension rheology, which are taken from the literature. The model is validated against new experimental data for three-phase suspensions of bubbles and spherical particles, collected in the low bubble capillary number regime. Good agreement is found across the experimental range of particle volume fraction (0≤ϕp≲0.5) and bubble volume fraction (0≤ϕb≲0.3). Consistent with model predictions, experimental results demonstrate that adding bubbles to a dilute particle suspension at low capillarity increases its viscosity, while adding bubbles to a concentrated particle suspension decreases its viscosity. The model accounts for particle anisometry and is easily extended to account for variable capillarity, but has not been experimentally validated for these cases. PMID:25568617
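The effective-medium construction can be sketched numerically. The closures below, a low-capillary-number bubble factor and a Maron-Pierce particle law with phi_m = 0.585, are common literature choices and are assumptions here, not necessarily the exact calibrated forms of this paper. Note how adding bubbles dilutes the particle fraction, which dominates at high particle loading:

def eta_rel(phi_b, phi_p, phi_m=0.585):
    # relative viscosity of bubbles + particles in a Newtonian liquid (low Ca)
    eta_bubbles = (1.0 - phi_b) ** -1.0            # assumed low-Ca bubble factor
    eta_particles = (1.0 - phi_p / phi_m) ** -2.0  # Maron-Pierce particle law
    return eta_bubbles * eta_particles

phi_b = 0.2
for phi_p0 in (0.1, 0.5):            # particle fraction before bubbles are added
    base = eta_rel(0.0, phi_p0)
    mixed = eta_rel(phi_b, phi_p0 * (1.0 - phi_b))  # bubbles dilute the particles
    print(f"phi_p0 = {phi_p0}: eta_rel {base:.1f} -> {mixed:.1f}")
# prints ~1.5 -> 1.7 (dilute: viscosity rises) and ~47 -> 12 (concentrated: falls)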
A model for generating Surface EMG signal of m. Tibialis Anterior.
Siddiqi, Ariba; Kumar, Dinesh; Arjunan, Sridhar P
2014-01-01
A model that simulates the surface electromyogram (sEMG) signal of m. Tibialis Anterior has been developed and tested. It has a firing-rate equation based on experimental findings and a recruitment threshold based on an observed statistical distribution. Importantly, it considers both slow and fast motor unit types, distinguished by their conduction velocities. The model assumes that the deeper unipennate half of the muscle does not contribute significantly to the potential induced on the surface of the muscle, and approximates the muscle as having a parallel-fibered structure. The model was validated by comparing simulated and experimental sEMG recordings. Experiments were conducted on eight subjects who performed isometric dorsiflexion at 10, 20, 30, 50, 75, and 100% of maximal voluntary contraction. The normalized root mean square and median frequency of the experimental and simulated EMG signals were computed, and the slopes of their linear relationships with force were statistically analyzed. The gradients were found to be similar (p>0.05) for the experimental and simulated sEMG signals, validating the proposed model.
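The two validation metrics are straightforward to compute; the sketch below derives RMS amplitude and Welch-spectrum median frequency from a band-limited noise surrogate standing in for recorded or simulated sEMG:

import numpy as np
from scipy.signal import butter, filtfilt, welch

fs = 2048
rng = np.random.default_rng(0)
b, a = butter(4, [20 / (fs / 2), 450 / (fs / 2)], btype="band")
emg = filtfilt(b, a, rng.normal(size=10 * fs))   # 20-450 Hz noise surrogate

rms = np.sqrt(np.mean(emg**2))                   # amplitude metric
f, pxx = welch(emg, fs=fs, nperseg=1024)
cum = np.cumsum(pxx)
mdf = f[np.searchsorted(cum, cum[-1] / 2.0)]     # frequency splitting power in half
print(f"RMS = {rms:.3f} (a.u.), median frequency = {mdf:.1f} Hz")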
Design and Characterization of a Soft Robotic Therapeutic Glove for Rheumatoid Arthritis.
Chua, Matthew Chin Heng; Lim, Jeong Hoon; Yeow, Raye Chen Hua
2017-07-27
The modeling and experimentation of a pneumatic actuation system for the development of a soft robotic therapeutic glove is presented in this article for the prevention of finger deformities in rheumatoid arthritis (RA) patients. The Rehabilitative Arthritis Glove (RA-Glove) is a soft robotic glove fitted with two internal inflatable actuators for lateral compression and massage of the fingers and their joints. Two mechanical models that predict the indentation and bending characteristics of the inflatable actuators based on their geometrical parameters are presented and validated against experimental results. Experimental validation shows that the model was within one standard deviation of the experimental mean for an input pressure range of 0 to 2 bar. Evaluation of the RA-Glove was also performed on six healthy human subjects. The stress distribution along the fingers of the subjects wearing the RA-Glove was shown to be even and specific to the finger sizes. This article demonstrates the modeling of soft pneumatic actuators and highlights the potential of the RA-Glove as a therapeutic device for the prevention of arthritic deformities of the fingers.
Cuesta-Gragera, Ana; Navarro-Fontestad, Carmen; Mangas-Sanjuan, Victor; González-Álvarez, Isabel; García-Arieta, Alfredo; Trocóniz, Iñaki F; Casabó, Vicente G; Bermejo, Marival
2015-07-10
The objective of this paper is to apply a previously developed semi-physiologic pharmacokinetic model, implemented in NONMEM, to simulate bioequivalence (BE) trials of acetylsalicylic acid (ASA), in order to validate the model performance against ASA human experimental data. ASA is a drug with first-pass hepatic and intestinal metabolism following Michaelis-Menten kinetics that leads to the formation of two main metabolites in two generations (first- and second-generation metabolites). The first aim was to adapt the semi-physiologic model for ASA in NONMEM using ASA pharmacokinetic parameters from the literature, reflecting its sequential metabolism. The second aim was to validate this model by comparing the results obtained in NONMEM simulations with published experimental data at a dose of 1000 mg. The validated model was used to simulate bioequivalence trials at three dose schemes (100, 1000, and 3000 mg) and with six test formulations with decreasing in vivo dissolution rate constants versus the reference formulation (kD = 8 to 0.25 h-1). Finally, the third aim was to determine which analyte (parent drug, first-generation, or second-generation metabolite) was more sensitive to changes in formulation performance. The validation results showed that the simulated concentration-time curves closely reproduced the published experimental data, confirming model performance. The parent drug (ASA) was the analyte most sensitive to decreases in pharmaceutical quality, with the largest decrease in the Cmax and AUC ratios between test and reference formulations.
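The model class described, first-order dissolution and absorption feeding a saturable (Michaelis-Menten) first-pass conversion to a metabolite, can be sketched as a small ODE system. All parameter values below are placeholders, not the NONMEM estimates:

from scipy.integrate import solve_ivp

kD, ka, V = 2.0, 1.5, 10.0      # dissolution [1/h], absorption [1/h], volume [L]
Vmax, Km = 500.0, 20.0          # saturable first-pass metabolism [mg/h], [mg/L]
k_met = 0.7                     # first-generation metabolite elimination [1/h]

def rhs(t, y):
    solid, gut, c_par, c_m1 = y
    v_mm = Vmax * c_par / (Km + c_par)          # Michaelis-Menten conversion [mg/h]
    return [-kD * solid,                        # dissolution of the dose
            kD * solid - ka * gut,              # dissolved drug in the gut
            ka * gut / V - v_mm / V,            # parent plasma concentration
            v_mm / V - k_met * c_m1]            # first-generation metabolite

sol = solve_ivp(rhs, (0.0, 12.0), [1000.0, 0.0, 0.0, 0.0], max_step=0.05)
i = sol.y[2].argmax()
print(f"parent Cmax = {sol.y[2, i]:.1f} mg/L at t = {sol.t[i]:.2f} h")

Slowing kD in such a simulation shifts and flattens the parent curve more strongly than the downstream metabolite curves, which is the qualitative sensitivity result the paper reports.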
Validation and Qualification Sciences Experimental Complex
The Validation and Qualification Sciences Experimental Complex (VQSEC) at Sandia ...
Experimental Design and Some Threats to Experimental Validity: A Primer
ERIC Educational Resources Information Center
Skidmore, Susan
2008-01-01
Experimental designs are distinguished as the best method to respond to questions involving causality. The purpose of the present paper is to explicate the logic of experimental design and why it is so vital to questions that demand causal conclusions. In addition, types of internal and external validity threats are discussed. To emphasize the…
Dekker, Job; Belmont, Andrew S; Guttman, Mitchell; Leshyk, Victor O; Lis, John T; Lomvardas, Stavros; Mirny, Leonid A; O'Shea, Clodagh C; Park, Peter J; Ren, Bing; Politz, Joan C Ritland; Shendure, Jay; Zhong, Sheng
2017-09-13
The 4D Nucleome Network aims to develop and apply approaches to map the structure and dynamics of the human and mouse genomes in space and time with the goal of gaining deeper mechanistic insights into how the nucleus is organized and functions. The project will develop and benchmark experimental and computational approaches for measuring genome conformation and nuclear organization, and investigate how these contribute to gene regulation and other genome functions. Validated experimental technologies will be combined with biophysical approaches to generate quantitative models of spatial genome organization in different biological states, both in cell populations and in single cells.
Dekker, Job; Belmont, Andrew S.; Guttman, Mitchell; Leshyk, Victor O.; Lis, John T.; Lomvardas, Stavros; Mirny, Leonid A.; O’Shea, Clodagh C.; Park, Peter J.; Ren, Bing; Ritland Politz, Joan C.; Shendure, Jay; Zhong, Sheng
2017-01-01
The 4D Nucleome Network aims to develop and apply approaches to map the structure and dynamics of the human and mouse genomes in space and time with the goal of gaining deeper mechanistic understanding of how the nucleus is organized and functions. The project will develop and benchmark experimental and computational approaches for measuring genome conformation and nuclear organization, and investigate how these contribute to gene regulation and other genome functions. Validated experimental approaches will be combined with biophysical modeling to generate quantitative models of spatial genome organization in different biological states, both in cell populations and in single cells. PMID:28905911
From Single-Cell Dynamics to Scaling Laws in Oncology
NASA Astrophysics Data System (ADS)
Chignola, Roberto; Sega, Michela; Stella, Sabrina; Vyshemirsky, Vladislav; Milotti, Edoardo
We are developing a biophysical model of tumor biology. We follow a strictly quantitative approach where each step of model development is validated by comparing simulation outputs with experimental data. While this strategy may slow down our advancements, at the same time it provides an invaluable reward: we can trust simulation outputs and use the model to explore territories of cancer biology where current experimental techniques fail. Here, we review our multi-scale biophysical modeling approach and show how a description of cancer at the cellular level has led us to general laws obeyed by both in vitro and in vivo tumors.
Integration of design and inspection
NASA Astrophysics Data System (ADS)
Simmonds, William H.
1990-08-01
Developments in advanced computer integrated manufacturing technology, coupled with the emphasis on Total Quality Management, are exposing needs for new techniques to integrate all functions from design through to support of the delivered product. One critical functional area that must be integrated into design is that embracing the measurement, inspection and test activities necessary for validation of the delivered product. This area is being tackled by a collaborative project supported by the UK Government Department of Trade and Industry. The project is aimed at developing techniques for analysing validation needs and for planning validation methods. Within the project an experimental Computer Aided Validation Expert system (CAVE) is being constructed. This operates with a generalised model of the validation process and helps with all design stages: specification of product requirements; analysis of the assurance provided by a proposed design and method of manufacture; development of the inspection and test strategy; and analysis of feedback data. The kernel of the system is a knowledge base containing knowledge of the manufacturing process capabilities and of the available inspection and test facilities. The CAVE system is being integrated into a real life advanced computer integrated manufacturing facility for demonstration and evaluation.
Cornelius, Iwan; Guatelli, Susanna; Fournier, Pauline; Crosbie, Jeffrey C; Sanchez Del Rio, Manuel; Bräuer-Krisch, Elke; Rosenfeld, Anatoly; Lerch, Michael
2014-05-01
Microbeam radiation therapy (MRT) is a synchrotron-based radiotherapy modality that uses high-intensity beams of spatially fractionated radiation to treat tumours. The rapid evolution of MRT towards clinical trials demands accurate treatment planning systems (TPS), as well as independent tools for the verification of TPS calculated dose distributions in order to ensure patient safety and treatment efficacy. Monte Carlo computer simulation represents the most accurate method of dose calculation in patient geometries and is best suited for the purpose of TPS verification. A Monte Carlo model of the ID17 biomedical beamline at the European Synchrotron Radiation Facility has been developed, including recent modifications, using the Geant4 Monte Carlo toolkit interfaced with the SHADOW X-ray optics and ray-tracing libraries. The code was benchmarked by simulating dose profiles in water-equivalent phantoms subject to irradiation by broad-beam (without spatial fractionation) and microbeam (with spatial fractionation) fields, and comparing against those calculated with a previous model of the beamline developed using the PENELOPE code. Validation against additional experimental dose profiles in water-equivalent phantoms subject to broad-beam irradiation was also performed. Good agreement between codes was observed, with the exception of out-of-field doses and toward the field edge for larger field sizes. Microbeam results showed good agreement between both codes and experimental results within uncertainties. Results of the experimental validation showed agreement for different beamline configurations. The asymmetry in the out-of-field dose profiles due to polarization effects was also investigated, yielding important information for the treatment planning process in MRT. This work represents an important step in the development of a Monte Carlo-based independent verification tool for treatment planning in MRT.
A Validation of Object-Oriented Design Metrics
NASA Technical Reports Server (NTRS)
Basili, Victor R.; Briand, Lionel; Melo, Walcelio L.
1995-01-01
This paper presents the results of a study conducted at the University of Maryland in which we experimentally investigated the suite of Object-Oriented (OO) design metrics introduced by [Chidamber and Kemerer, 1994]. In order to do this, we assessed these metrics as predictors of fault-prone classes. This study is complementary to [Li and Henry, 1993], where the same suite of metrics had been used to assess frequencies of maintenance changes to classes. To perform our validation accurately, we collected data on the development of eight medium-sized information management systems based on identical requirements. All eight projects were developed using a sequential life cycle model, a well-known OO analysis/design method and the C++ programming language. Based on experimental results, the advantages and drawbacks of these OO metrics are discussed and suggestions for improvement are provided. Several of Chidamber and Kemerer's OO metrics appear to be adequate to predict class fault-proneness during the early phases of the life-cycle. We also showed that they are, on our data set, better predictors than "traditional" code metrics, which can only be collected at a later phase of the software development process.
On the mechanisms of secondary flows in a gas vortex unit
Niyogi, Kaustav; Torregrosa, Maria M.; Marin, Guy B.; Shtern, Vladimir N.
2018-01-01
The hydrodynamics of secondary flow phenomena in a disc‐shaped gas vortex unit (GVU) is investigated using experimentally validated numerical simulations. The simulation using ANSYS FLUENT® v.14a reveals the development of a backflow region along the core of the central gas exhaust, and of a counterflow multivortex region in the bulk of the disc part of the unit. Under the tested conditions, the GVU flow is found to be highly spiraling in nature. Secondary flow phenomena develop as swirl becomes stronger. The backflow region develops first via the swirl‐decay mechanism in the exhaust line. Near‐wall jet formation in the boundary layers near the GVU end‐walls eventually results in flow reversal in the bulk of the unit. When the jets grow stronger the counterflow becomes multivortex. The simulation results are validated with experimental data obtained from Stereoscopic Particle Image Velocimetry and surface oil visualization measurements. © 2018 The Authors AIChE Journal published by Wiley Periodicals, Inc. on behalf of American Institute of Chemical Engineers AIChE J, 64: 1859–1873, 2018 PMID:29937545
NASA Astrophysics Data System (ADS)
Roesch, Frank; Nerb, Josef; Riess, Werner
2015-03-01
Our study investigated whether problem-oriented ecology lessons with phases of direct instruction and of open experimentation foster the development of cross-domain and domain-specific components of experimental problem-solving ability better than conventional science lessons. We used a paper-and-pencil test to assess students' abilities in a quasi-experimental intervention study utilizing a pretest/posttest control-group design (N = 340; average-performing sixth-grade students). The treatment group received lessons on forest ecosystems consistent with the principle of education for sustainable development. This learning environment was expected to help students enhance their ecological knowledge and their theoretical and methodological experimental competencies. Two control groups received either the teachers' usual lessons on forest ecosystems or non-specific lessons on other science topics. We found that the treatment promoted specific components of experimental problem-solving ability (generating epistemic questions, planning two-factorial experiments, and identifying correct experimental controls). However, the observed effects were small, and awareness of aspects of higher ecological experimental validity was not promoted by the treatment.
NASA Astrophysics Data System (ADS)
Chan, V. S.; Wong, C. P. C.; McLean, A. G.; Luo, G. N.; Wirth, B. D.
2013-10-01
The Xolotl code under development by PSI-SciDAC will enhance predictive modeling capability of plasma-facing materials under burning plasma conditions. The availability and application of experimental data to compare to code-calculated observables are key requirements to validate the breadth and content of physics included in the model and ultimately gain confidence in its results. A dedicated effort has been in progress to collect and organize a) a database of relevant experiments and their publications as previously carried out at sample exposure facilities in US and Asian tokamaks (e.g., DIII-D DiMES, and EAST MAPES), b) diagnostic and surface analysis capabilities available at each device, and c) requirements for future experiments with code validation in mind. The content of this evolving database will serve as a significant resource for the plasma-material interaction (PMI) community. Work supported in part by the US Department of Energy under GA-DE-SC0008698, DE-AC52-07NA27344 and DE-AC05-00OR22725.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Leichner, P.K.
This report summarizes research in beta-particle dosimetry, quantitative single-photon emission computed tomography (SPECT), the clinical implementation of these two areas of research in radioimmunotherapy (RIT), and postgraduate training provided since the inception of this grant on July 15, 1989. To improve beta-particle dosimetry, a point source function was developed that is valid for a wide range of beta emitters. Analytical solutions for beta-particle dose rates within and outside slabs of finite thickness were validated in experimental tumors and are now being used in clinical RIT. Quantitative SPECT based on the circular harmonic transform (CHT) algorithm was validated in phantom, experimental, and clinical studies. This has led to improved macrodosimetry in clinical RIT. In dosimetry at the multi-cellular level, studies were made of the HepG2 human hepatoblastoma grown subcutaneously in nude mice. Histologic sections and autoradiographs were prepared to quantitate activity distributions of radiolabeled antibodies. Absorbed-dose calculations are being carried out for ¹³¹I and ⁹⁰Y beta particles for these antibody distributions.
Hart, Robert; Goudey, Howdy; Curcija, D. Charlie
2017-05-16
Virtually every home in the US has some form of shades, blinds, drapes, or other window attachment, but few have been designed for energy savings. In order to provide a common basis of comparison for thermal performance it is important to have validated simulation tools. This study outlines a review and validation of the ISO 15099 centre-of-glass thermal transmittance correlations for naturally ventilated cavities through measurement and detailed simulations. The focus is on the impacts of room-side ventilated cavities, such as those found with solar screens and horizontal louvred blinds. The thermal transmittance of these systems is measured experimentally, simulated using computational fluid dynamics analysis, and simulated utilizing simplified correlations from ISO 15099. Finally, correlation coefficients are proposed for the ISO 15099 algorithm that reduces the mean error between measured and simulated heat flux for typical solar screens from 16% to 3.5% and from 13% to 1% for horizontal blinds.
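The validation metric quoted above reduces to a mean error between measured and simulated heat flux; a minimal sketch of that comparison, with hypothetical values:

```python
# Sketch of the validation metric implied above: mean relative error between
# measured and simulated centre-of-glass heat flux across a set of test
# configurations. Array values are hypothetical.
import numpy as np

q_measured = np.array([48.2, 51.7, 45.9, 50.3])   # W/m^2, hypothetical
q_simulated = np.array([55.6, 59.9, 52.8, 58.4])  # ISO 15099 correlation output

mean_rel_err = np.mean(np.abs(q_simulated - q_measured) / q_measured)
print(f"mean relative heat-flux error: {mean_rel_err:.1%}")
```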
WEC-SIM Validation Testing Plan FY14 Q4.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ruehl, Kelley Michelle
2016-02-01
The WEC-Sim project is currently on track, having met both the SNL and NREL FY14 Milestones, as shown in Table 1 and Table 2. This is also reflected in the Gantt chart uploaded to the WEC-Sim SharePoint site in the FY14 Q4 Deliverables folder. The work completed in FY14 includes code verification through code-to-code comparison (FY14 Q1 and Q2), preliminary code validation through comparison to experimental data (FY14 Q2 and Q3), presentation and publication of the WEC-Sim project at OMAE 2014 [1], [2], [3] and GMREC/METS 2014 [4] (FY14 Q3), WEC-Sim code development and public open-source release (FY14 Q3), and development of a preliminary WEC-Sim validation test plan (FY14 Q4). This report presents the preliminary Validation Testing Plan developed in FY14 Q4. The validation test effort started in FY14 Q4 and will continue through FY15. Thus far the team has developed a device selection method, selected a device, placed a contract with the testing facility, established several collaborations including industry contacts, and formed working ideas on testing details such as scaling, device design, and test conditions.
Development and Initial Validation of the Pain Resilience Scale.
Slepian, P Maxwell; Ankawi, Brett; Himawan, Lina K; France, Christopher R
2016-04-01
Over the past decade, the role of positive psychology in pain experience has gained increasing attention. One such positive factor, identified as resilience, has been defined as the ability to maintain positive emotional and physical functioning despite physical or psychological adversity. Although cross-situational measures of resilience have been shown to be related to pain, it was hypothesized that a pain-specific resilience measure would serve as a stronger predictor of acute pain experience. To test this hypothesis, we conducted a series of studies to develop and validate the Pain Resilience Scale. Study 1 described exploratory and confirmatory factor analyses that support a scale with 2 distinct factors, Cognitive/Affective Positivity and Behavioral Perseverance. Study 2 showed test-retest reliability and construct validity of this new scale, including moderate positive relationships with measures of positive psychological functioning and small to moderate negative relationships with vulnerability measures such as pain catastrophizing. Finally, consistent with our initial hypothesis, study 3 showed that the Pain Resilience Scale is more strongly related to ischemic pain responses than existing measures of general resilience. Together, these studies support the predictive utility of this new pain-specific measure of resilience in the context of acute experimental pain. The Pain Resilience Scale represents a novel measure of Cognitive/Affective Positivity and Behavioral Perseverance during exposure to noxious stimuli. Construct validity is supported by expected relationships with existing pain-coping measures, and predictive validity is shown by individual differences in response to acute experimental pain. Copyright © 2016 American Pain Society. Published by Elsevier Inc. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ricks, Allen; Blanchat, Thomas K.; Jernigan, Dann A.
2006-06-01
It is necessary to improve understanding and develop validation data of the heat flux incident to an object located within the fire plume for the validation of SIERRA/FUEGO/SYRINX fire and SIERRA/CALORE. One key aspect of the validation data sets is the determination of the relative contribution of the radiative and convective heat fluxes. To meet this objective, a cylindrical calorimeter with sufficient instrumentation to measure total and radiative heat flux has been designed and fabricated. This calorimeter will be tested both in the controlled radiative environment of the Penlight facility and in a fire environment in the FLAME/Radiant Heat (FRH) facility. Validation experiments are specifically designed for direct comparison with the computational predictions. Making meaningful comparisons between the computational and experimental results requires careful characterization and control of the experimental features or parameters used as inputs into the computational model. Validation experiments must be designed to capture the essential physical phenomena, including all relevant initial and boundary conditions. A significant question of interest to modeling heat flux incident to an object in or near a fire is the contribution of the radiation and convection modes of heat transfer. The series of experiments documented in this test plan is designed to provide data on the radiation partitioning, defined as the fraction of the total heat flux that is due to radiation.
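In symbols, the radiation partitioning defined in the last sentence is:

```latex
% Radiation partitioning as defined in the test plan: the fraction of the
% total incident heat flux carried by radiation.
\[
  f_{\mathrm{rad}} \;=\; \frac{q_{\mathrm{rad}}}{q_{\mathrm{total}}}
  \;=\; \frac{q_{\mathrm{rad}}}{q_{\mathrm{rad}} + q_{\mathrm{conv}}}
\]
```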
NASA Astrophysics Data System (ADS)
Suwono, H.; Susanti, S.; Lestari, U.
2017-04-01
Learning activities that engage students in active learning are one characteristic of a quality education, and guided inquiry is a learning strategy that involves students' active learning; key learning problems today are growing metacognitive skills and cognitive learning outcomes. This study is the research and development of a learning module using the 4D model of Thiagarajan. The first phase is Define, which analyses the problems and needs prior to preparation of the module. The second phase is Design, which formulates the learning design and devices to obtain an initial draft of the module. The third phase is Develop, which covers writing the module, module validation, product testing, and revision, resulting in the end-product module. The fourth phase is Disseminate, in which the validated product is distributed. The module was validated by education experts, practitioners, subject-matter experts, and an expert in online media; the validation results indicated that the module was valid and could be used in teaching and learning. In the testing phase, we used an experiment to measure differences in metacognitive skills and learning outcomes between the control group and the experimental group; the experimental design was a one-group pretest-posttest design. The data analysis showed that the module could enhance metacognitive skills and learning outcomes. The advantages of this module are as follows: 1) the module is accompanied by a video link on a website containing practical activities appropriate to Curriculum 2013; 2) the module is accompanied by a video link on a website containing the manual laboratory activities to be used in face-to-face classes, so that students are prepared for laboratory work; and 3) the module supports online chat to increase students' understanding. Its main disadvantage is that the material presented in the module is limited. It is suggested that, for better utilisation of the online activities, students be present at every meeting so that all students participate actively, and that schools set up facilities to support blended learning.
A toolbox of immunoprecipitation-grade monoclonal antibodies to human transcription factors.
Venkataraman, Anand; Yang, Kun; Irizarry, Jose; Mackiewicz, Mark; Mita, Paolo; Kuang, Zheng; Xue, Lin; Ghosh, Devlina; Liu, Shuang; Ramos, Pedro; Hu, Shaohui; Bayron Kain, Diane; Keegan, Sarah; Saul, Richard; Colantonio, Simona; Zhang, Hongyan; Behn, Florencia Pauli; Song, Guang; Albino, Edisa; Asencio, Lillyann; Ramos, Leonardo; Lugo, Luvir; Morell, Gloriner; Rivera, Javier; Ruiz, Kimberly; Almodovar, Ruth; Nazario, Luis; Murphy, Keven; Vargas, Ivan; Rivera-Pacheco, Zully Ann; Rosa, Christian; Vargas, Moises; McDade, Jessica; Clark, Brian S; Yoo, Sooyeon; Khambadkone, Seva G; de Melo, Jimmy; Stevanovic, Milanka; Jiang, Lizhi; Li, Yana; Yap, Wendy Y; Jones, Brittany; Tandon, Atul; Campbell, Elliot; Montelione, Gaetano T; Anderson, Stephen; Myers, Richard M; Boeke, Jef D; Fenyö, David; Whiteley, Gordon; Bader, Joel S; Pino, Ignacio; Eichinger, Daniel J; Zhu, Heng; Blackshaw, Seth
2018-03-19
A key component of efforts to address the reproducibility crisis in biomedical research is the development of rigorously validated and renewable protein-affinity reagents. As part of the US National Institutes of Health (NIH) Protein Capture Reagents Program (PCRP), we have generated a collection of 1,406 highly validated immunoprecipitation- and/or immunoblotting-grade mouse monoclonal antibodies (mAbs) to 737 human transcription factors, using an integrated production and validation pipeline. We used HuProt human protein microarrays as a primary validation tool to identify mAbs with high specificity for their cognate targets. We further validated PCRP mAbs by means of multiple experimental applications, including immunoprecipitation, immunoblotting, chromatin immunoprecipitation followed by sequencing (ChIP-seq), and immunohistochemistry. We also conducted a meta-analysis that identified critical variables that contribute to the generation of high-quality mAbs. All validation data, protocols, and links to PCRP mAb suppliers are available at http://proteincapture.org.
Modeling Piezoelectric Stack Actuators for Control of Micromanipulation
NASA Technical Reports Server (NTRS)
Goldfarb, Michael; Celanovic, Nikola
1997-01-01
A nonlinear lumped-parameter model of a piezoelectric stack actuator has been developed to describe actuator behavior for purposes of control system analysis and design, and, in particular, for microrobotic applications requiring accurate position and/or force control. In formulating this model, the authors propose a generalized Maxwell resistive capacitor as a lumped-parameter causal representation of rate-independent hysteresis. Model formulation is validated by comparing results of numerical simulations to experimental data. Validation is followed by a discussion of model implications for purposes of actuator control.
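The generalized Maxwell resistive capacitor can be read as a parallel bank of elasto-slide (Maxwell-slip) elements. A minimal sketch of such a rate-independent operator, with hypothetical stiffnesses and breakaway forces (not the paper's identified values):

```python
# Minimal sketch of a generalized Maxwell-slip hysteresis operator of the
# kind used to represent rate-independent hysteresis; element stiffnesses
# and breakaway forces below are hypothetical, not the paper's values.
import numpy as np

class MaxwellSlip:
    def __init__(self, stiffness, breakaway):
        self.k = np.asarray(stiffness, dtype=float)   # element stiffnesses
        self.f = np.asarray(breakaway, dtype=float)   # breakaway (slip) forces
        self.xb = np.zeros_like(self.k)               # internal slider states

    def force(self, x):
        """Advance all elasto-slide elements to input x; return total force."""
        F = self.k * (x - self.xb)
        slipped = np.abs(F) > self.f
        # sliders that exceed breakaway slip so that |F_i| == f_i afterwards
        self.xb[slipped] = x - np.sign(F[slipped]) * self.f[slipped] / self.k[slipped]
        return np.clip(F, -self.f, self.f).sum()

model = MaxwellSlip(stiffness=[1.0, 2.0, 4.0], breakaway=[0.2, 0.5, 1.0])
for x in np.concatenate([np.linspace(0, 1, 50), np.linspace(1, -1, 100)]):
    y = model.force(x)   # tracing x up then down exposes the hysteresis loop
```

Sweeping the input up and back down traces a loop whose shape depends only on the input history, not its rate, which is the rate-independence property the lumped-parameter model captures.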
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, Kandler A; Santhanagopalan, Shriram; Yang, Chuanbo
Computer models are helping to accelerate the design and validation of next generation batteries and provide valuable insights not possible through experimental testing alone. Validated 3-D physics-based models exist for predicting electrochemical performance, thermal and mechanical response of cells and packs under normal and abuse scenarios. The talk describes present efforts to make the models better suited for engineering design, including improving their computation speed, developing faster processes for model parameter identification including under aging, and predicting the performance of a proposed electrode material recipe a priori using microstructure models.
Photodetachment cross sections of negative ions - The range of validity of the Wigner threshold law
NASA Technical Reports Server (NTRS)
Farley, John W.
1989-01-01
The threshold behavior of the photodetachment cross section of negative ions as a function of photon frequency is usually described by the Wigner law. This paper reports the results of a model calculation using the zero-core-contribution (ZCC) approximation. Theoretical expressions for the leading correction to the Wigner law are developed, giving the range of validity of the Wigner law and the expected accuracy. The results are relevant to extraction of electron affinities from experimental photodetachment data.
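For reference, the threshold scaling discussed above, together with the generic form of its leading correction, can be written as follows, where E is the photon energy above threshold and ℓ the angular momentum of the detached electron; the coefficients A and B are what model calculations such as the ZCC approximation supply:

```latex
% Wigner threshold law for photodetachment: near threshold the cross section
% scales with the excess photon energy E = h\nu - E_A (E_A = electron
% affinity); the leading correction is linear in E.
\[
  \sigma(E) \;\propto\; E^{\ell + 1/2}, \qquad
  \sigma(E) \;=\; A\,E^{\ell + 1/2}\bigl(1 + B\,E + \mathcal{O}(E^{2})\bigr).
\]
```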
Barrett, Frederick S; Johnson, Matthew W; Griffiths, Roland R
2015-11-01
The 30-item revised Mystical Experience Questionnaire (MEQ30) was previously developed within an online survey of mystical-type experiences occasioned by psilocybin-containing mushrooms. The rated experiences occurred on average eight years before completion of the questionnaire. The current paper validates the MEQ30 using data from experimental studies with controlled doses of psilocybin. Data were pooled and analyzed from five laboratory experiments in which participants (n=184) received a moderate to high oral dose of psilocybin (at least 20 mg/70 kg). Results of confirmatory factor analysis demonstrate the reliability and internal validity of the MEQ30. Structural equation models demonstrate the external and convergent validity of the MEQ30 by showing that latent variable scores on the MEQ30 positively predict persisting change in attitudes, behavior, and well-being attributed to experiences with psilocybin while controlling for the contribution of the participant-rated intensity of drug effects. These findings support the use of the MEQ30 as an efficient measure of individual mystical experiences. A method to score a "complete mystical experience" that was used in previous versions of the mystical experience questionnaire is validated in the MEQ30, and a stand-alone version of the MEQ30 is provided for use in future research. © The Author(s) 2015.
Validation of the revised Mystical Experience Questionnaire in experimental sessions with psilocybin
Barrett, Frederick S; Johnson, Matthew W; Griffiths, Roland R
2016-01-01
The 30-item revised Mystical Experience Questionnaire (MEQ30) was previously developed within an online survey of mystical-type experiences occasioned by psilocybin-containing mushrooms. The rated experiences occurred on average eight years before completion of the questionnaire. The current paper validates the MEQ30 using data from experimental studies with controlled doses of psilocybin. Data were pooled and analyzed from five laboratory experiments in which participants (n=184) received a moderate to high oral dose of psilocybin (at least 20 mg/70 kg). Results of confirmatory factor analysis demonstrate the reliability and internal validity of the MEQ30. Structural equation models demonstrate the external and convergent validity of the MEQ30 by showing that latent variable scores on the MEQ30 positively predict persisting change in attitudes, behavior, and well-being attributed to experiences with psilocybin while controlling for the contribution of the participant-rated intensity of drug effects. These findings support the use of the MEQ30 as an efficient measure of individual mystical experiences. A method to score a “complete mystical experience” that was used in previous versions of the mystical experience questionnaire is validated in the MEQ30, and a stand-alone version of the MEQ30 is provided for use in future research. PMID:26442957
Funding for the 2ND IAEA technical meeting on fusion data processing, validation and analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Greenwald, Martin
The International Atomic Energy Agency (IAEA) will organize the second Technical Meeting on Fusion Data Processing, Validation and Analysis from 30 May to 02 June, 2017, in Cambridge, MA USA. The meeting will be hosted by the MIT Plasma Science and Fusion Center (PSFC). The objective of the meeting is to provide a platform where a set of topics relevant to fusion data processing, validation and analysis are discussed with the view of extrapolation needs to next-step fusion devices such as ITER. The validation and analysis of experimental data obtained from diagnostics used to characterize fusion plasmas are crucial for a knowledge-based understanding of the physical processes governing the dynamics of these plasmas. The meeting will aim at fostering, in particular, discussions of research and development results that set out or underline trends observed in the current major fusion confinement devices. General information on the IAEA, including its mission and organization, can be found at the IAEA website. Topics include: uncertainty quantification (UQ); model selection, validation, and verification (V&V); probability theory and statistical analysis; inverse problems and equilibrium reconstruction; integrated data analysis; real-time data analysis; machine learning; signal/image processing and pattern recognition; experimental design and synthetic diagnostics; and data management.
Computational Modeling of Micrometastatic Breast Cancer Radiation Dose Response
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, Daniel L.; Debeb, Bisrat G.; Morgan Welch Inflammatory Breast Cancer Research Program and Clinic, The University of Texas MD Anderson Cancer Center, Houston, Texas
Purpose: Prophylactic cranial irradiation (PCI) involves giving radiation to the entire brain with the goals of reducing the incidence of brain metastasis and improving overall survival. Experimentally, we have demonstrated that PCI prevents brain metastases in a breast cancer mouse model. We developed a computational model to expand on and aid in the interpretation of our experimental results. Methods and Materials: MATLAB was used to develop a computational model of brain metastasis and PCI in mice. Model input parameters were optimized such that the model output would match the experimental number of metastases per mouse from the unirradiated group. An independent in vivo limiting dilution experiment was performed to validate the model. The effect of whole brain irradiation at different measurement points after tumor cells were injected was evaluated in terms of the incidence, number of metastases, and tumor burden and was then compared with the corresponding experimental data. Results: In the optimized model, the correlation between the number of metastases per mouse and the experimental fits was >95%. Our attempt to validate the model with a limiting dilution assay produced 99.9% correlation with respect to the incidence of metastases. The model accurately predicted the effect of whole-brain irradiation given 3 weeks after cell injection but substantially underestimated its effect when delivered 5 days after cell injection. The model further demonstrated that delaying whole-brain irradiation until the development of gross disease introduces a dose threshold that must be reached before a reduction in incidence can be realized. Conclusions: Our computational model of mouse brain metastasis and PCI correlated strongly with our experiments with unirradiated mice. The results further suggest that early treatment of subclinical disease is more effective than irradiating established disease.
Code of Federal Regulations, 2011 CFR
2011-07-01
... the validation study and subsequent use during decontamination. 761.386 Section 761.386 Protection of... (PCBs) MANUFACTURING, PROCESSING, DISTRIBUTION IN COMMERCE, AND USE PROHIBITIONS Comparison Study for... experimental conditions for the validation study and subsequent use during decontamination. The following...
Code of Federal Regulations, 2010 CFR
2010-07-01
... the validation study and subsequent use during decontamination. 761.386 Section 761.386 Protection of... (PCBs) MANUFACTURING, PROCESSING, DISTRIBUTION IN COMMERCE, AND USE PROHIBITIONS Comparison Study for... experimental conditions for the validation study and subsequent use during decontamination. The following...
A validated finite element model of a soft artificial muscle motor
NASA Astrophysics Data System (ADS)
Tse, Tony Chun H.; O'Brien, Benjamin; McKay, Thomas; Anderson, Iain A.
2011-04-01
The Biomimetics Laboratory has developed a soft artificial muscle motor based on Dielectric Elastomers. The motor, 'Flexidrive', is light-weight and has low system complexity. It works by gripping and turning a shaft with a soft gear, like we would with our fingers. The motor's performance depends on many factors, such as actuation waveform, electrode patterning, geometries and contact tribology between the shaft and gear. We have developed a finite element model (FEM) of the motor as a study and design tool. Contact interaction was integrated with previous material and electromechanical coupling models in ABAQUS. The model was experimentally validated through a shape and blocked force analysis.
Skill Transfer and Virtual Training for IND Response Decision-Making: Project Summary and Next Steps
2016-04-12
are likely to be very productive partners—independent video-game developers and academic game degree programs—are not familiar with working with...experimental validation. • Independent Video-Game Developers. Small companies and individuals that pursue video-game design and development can be...complexity, such as an improvised nuclear device (IND) detonation. The effort has examined game-based training methods to determine their suitability
Coarse-grained simulation of polymer-filler blends
NASA Astrophysics Data System (ADS)
Legters, Gregg; Kuppa, Vikram; Beaucage, Gregory; Univ of Dayton Collaboration; Univ of Cincinnati Collaboration
The practical use of polymers often relies on additives that improve the properties of the mixture. Examples of such complex blends include tires, pigments, blowing agents and other reactive additives in thermoplastics, and recycled polymers. Such systems usually exhibit a complex partitioning of the components. Most prior work has either focused on fine-grained details such as molecular modeling of chains at interfaces, or on coarse, heuristic, trial-and-error approaches to compounding (e.g., the tire industry). Thus, there is a significant gap in our understanding of how complex hierarchical structure (across several decades in length) develops in these multicomponent systems. This research employs dissipative particle dynamics (DPD) in conjunction with a pseudo-thermodynamic parameter derived from scattering experiments to represent polymer-filler interactions. DPD simulations will probe how filler dispersion and hierarchical morphology develop in these complex blends, and are validated against experimental (scattering) data. The outcome of our approach is a practical solution to compounding issues, based on a mutually validating experimental and simulation methodology. Support from the NSF (CMMI-1636036/1635865) is gratefully acknowledged.
Predicting cancerlectins by the optimal g-gap dipeptides
NASA Astrophysics Data System (ADS)
Lin, Hao; Liu, Wei-Xin; He, Jiao; Liu, Xin-Hui; Ding, Hui; Chen, Wei
2015-12-01
The cancerlectin plays a key role in the process of tumor cell differentiation. Thus, fully understanding the function of cancerlectins is significant because it sheds light on future directions for cancer therapy. However, traditional wet-lab experimental methods are costly and time-consuming. It is highly desirable to develop an effective and efficient computational tool to identify cancerlectins. In this study, we developed a sequence-based method to discriminate between cancerlectins and non-cancerlectins. The analysis of variance (ANOVA) was used to choose the optimal feature set derived from the g-gap dipeptide composition. The jackknife cross-validated results showed that the proposed method achieved an accuracy of 75.19%, which is superior to other published methods. For the convenience of other researchers, an online web-server CaLecPred was established and can be freely accessed from the website http://lin.uestc.edu.cn/server/CalecPred. We believe that CaLecPred is a powerful tool to study cancerlectins and to guide the related experimental validations.
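A minimal sketch of the g-gap dipeptide composition and an ANOVA-style feature ranking of the kind described above (illustrative, with random toy sequences; not the authors' code or data):

```python
# Sketch of the g-gap dipeptide composition feature: the frequency of
# residue pairs separated by g intervening positions, followed by a simple
# per-feature ANOVA F-score for ranking. Illustrative, not the authors' code.
from itertools import product
import numpy as np
from scipy.stats import f_oneway

AA = "ACDEFGHIKLMNPQRSTVWY"
PAIRS = ["".join(p) for p in product(AA, repeat=2)]  # 400 dipeptides

def g_gap_composition(seq, g):
    """Normalized counts of residue pairs (seq[i], seq[i+g+1])."""
    counts = dict.fromkeys(PAIRS, 0)
    n = len(seq) - g - 1
    for i in range(n):
        pair = seq[i] + seq[i + g + 1]
        if pair in counts:
            counts[pair] += 1
    return np.array([counts[p] / max(n, 1) for p in PAIRS])

# Hypothetical toy data: random sequences standing in for the two classes.
rng = np.random.default_rng(0)
def random_seq(n):
    return "".join(rng.choice(list(AA), size=n))

pos = np.array([g_gap_composition(random_seq(120), g=2) for _ in range(20)])
neg = np.array([g_gap_composition(random_seq(120), g=2) for _ in range(20)])
F = np.nan_to_num([f_oneway(pos[:, j], neg[:, j]).statistic
                   for j in range(len(PAIRS))])
top = np.argsort(F)[::-1][:50]   # keep the 50 highest-scoring dipeptides
```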
ENEL overall PWR plant models and neutronic integrated computing systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pedroni, G.; Pollachini, L.; Vimercati, G.
1987-01-01
To support the design activity of the Italian nuclear energy program for the construction of pressurized water reactors, the Italian Electricity Board (ENEL) needs to verify the design as a whole (that is, the nuclear steam supply system and balance of plant) both in steady-state operation and in transient. The ENEL has therefore developed two computer models to analyze both operational and incidental transients. The models, named STRIP and SFINCS, perform the analysis of the nuclear as well as the conventional part of the plant (the control system being properly taken into account). The STRIP model has been developed by means of the French (Electricite de France) modular code SICLE, while SFINCS is based on the Italian (ENEL) modular code LEGO. STRIP validation was performed with respect to Fessenheim French power plant experimental data. Two significant transients were chosen: load step and total load rejection. SFINCS validation was performed with respect to Saint-Laurent French power plant experimental data and also by comparing the SFINCS-STRIP responses.
Experimental validation of an 8 element EMAT phased array probe for longitudinal wave generation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Le Bourdais, Florian, E-mail: florian.lebourdais@cea.fr; Marchand, Benoit, E-mail: florian.lebourdais@cea.fr
2015-03-31
Sodium cooled Fast Reactors (SFR) use liquid sodium as a coolant. Liquid sodium being opaque, optical techniques cannot be applied to reactor vessel inspection. This makes it necessary to develop alternative ways of assessing the state of the structures immersed in the medium. Ultrasonic pressure waves are well suited for inspection tasks in this environment, especially using pulsed electromagnetic acoustic transducers (EMAT) that generate the ultrasound directly in the liquid sodium. The work carried out at CEA LIST is aimed at developing phased array EMAT probes conditioned for reactor use. The present work focuses on the experimental validation of a newly manufactured 8 element probe which was designed for beam forming imaging in a liquid sodium environment. A parametric study is carried out to determine the optimal setup of the magnetic assembly used in this probe. First laboratory tests on an aluminium block show that the probe has the required beam steering capabilities.
ERIC Educational Resources Information Center
Luze, Gayle J.; Linebarger, Deborah L.; Greenwood, Charles R.; Carta, Judith J.; Walker, Dale; Leitschuh, Carol; Atwater, Jane B.
2001-01-01
Describes the development of an experimental measure for assessing growth in expressive communication in children from birth to 3 years of age. Results from a sample of 50 infants and toddlers assessed monthly for 9 months indicated that the measure displayed adequate psychometric properties of reliability and validity and was sensitive to…
Experimental Validation Techniques for the Heleeos Off-Axis Laser Propagation Model
2010-03-01
Haiducek, John. AFIT/GAP/ENP/10-M07, Air Force Institute of Technology. Approved for public release; distribution unlimited.
Experimental validation of the RATE tool for inferring HLA restrictions of T cell epitopes.
Paul, Sinu; Arlehamn, Cecilia S Lindestam; Schulten, Veronique; Westernberg, Luise; Sidney, John; Peters, Bjoern; Sette, Alessandro
2017-06-21
The RATE tool was recently developed to computationally infer the HLA restriction of given epitopes from immune response data of HLA typed subjects without additional cumbersome experimentation. Here, RATE was validated using experimentally defined restriction data from a set of 191 tuberculosis-derived epitopes and 63 healthy individuals with MTB infection from the Western Cape Region of South Africa. Using this experimental dataset, the parameters utilized by the RATE tool to infer restriction were optimized, which included the relative frequency (RF) of the subjects responding to a given epitope and expressing a given allele as compared to the general test population and the associated p-value in a Fisher's exact test. We also examined the potential for further optimization based on the predicted binding affinity of epitopes to potential restricting HLA alleles, and the absolute number of individuals expressing a given allele and responding to the specific epitope. Different statistical measures, including Matthew's correlation coefficient, accuracy, sensitivity and specificity were used to evaluate the performance of RATE as a function of these criteria. Based on our results we recommend selection of HLA restrictions with cutoffs of p-value < 0.01 and RF ≥ 1.3. The usefulness of the tool was demonstrated by inferring new HLA restrictions for epitope sets where restrictions could not be experimentally determined due to lack of necessary cell lines and for an additional data set related to recognition of pollen derived epitopes from allergic patients. Experimental data sets were used to validate the RATE tool, and the parameters used by the tool to infer restriction were optimized. New HLA restrictions were identified using the optimized RATE tool.
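A minimal sketch of this kind of restriction call, assuming simple responder/allele count inputs (the counts and parameter names below are hypothetical, not the study's data), using the recommended cutoffs:

```python
# Sketch of a RATE-style restriction call: for an (epitope, allele) pair,
# compute the relative frequency (RF) of responders carrying the allele
# versus the cohort, run a Fisher's exact test, and apply the recommended
# cutoffs (p < 0.01, RF >= 1.3). Counts are hypothetical.
from scipy.stats import fisher_exact

def infer_restriction(n_resp_with, n_resp, n_with, n_total,
                      p_cut=0.01, rf_cut=1.3):
    """n_resp_with: responders carrying the allele; n_resp: all responders;
    n_with: subjects carrying the allele; n_total: all typed subjects."""
    rf = (n_resp_with / n_resp) / (n_with / n_total)
    # 2x2 table: responders vs non-responders, allele carriers vs not
    table = [[n_resp_with, n_resp - n_resp_with],
             [n_with - n_resp_with,
              (n_total - n_resp) - (n_with - n_resp_with)]]
    _, p = fisher_exact(table, alternative="greater")
    return rf, p, (p < p_cut and rf >= rf_cut)

rf, p, called = infer_restriction(n_resp_with=9, n_resp=12, n_with=18, n_total=63)
print(f"RF={rf:.2f}, p={p:.3g}, restriction inferred: {called}")
```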
The DoE method as an efficient tool for modeling the behavior of monocrystalline Si-PV module
NASA Astrophysics Data System (ADS)
Kessaissia, Fatma Zohra; Zegaoui, Abdallah; Boutoubat, Mohamed; Allouache, Hadj; Aillerie, Michel; Charles, Jean-Pierre
2018-05-01
The objective of this paper is to apply the Design of Experiments (DoE) method to study and obtain a predictive model of any marketed monocrystalline photovoltaic (mc-PV) module. This technique yields a mathematical model that represents the predicted response as a function of the input factors and experimental data. Therefore, the DoE model for characterization and modeling of mc-PV module behavior can be obtained by performing just a small set of experimental trials. The DoE model of the mc-PV panel evaluates the predicted maximum power as a function of irradiation and temperature in a bounded domain of study for the inputs. For the mc-PV panel, predictive models for both one-level and two-level designs were developed, taking into account both the main effects and the interaction effects of the considered factors. The DoE method is then implemented in code developed under Matlab software. The code allows us to simulate, characterize, and validate the predictive model of the mc-PV panel. The calculated results were compared to the experimental data, errors were estimated, and the accuracy of the predictive models was evaluated via the response surface. Finally, we conclude that the predictive models reproduce the experimental trials with good accuracy.
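A minimal sketch of a two-level, two-factor DoE fit of the kind described, with coded factor levels and hypothetical trial responses (the paper's factors are irradiation and temperature; the numbers below are made up):

```python
# Minimal two-level, two-factor DoE sketch: predicted maximum power as a
# constant plus main effects and one interaction, fitted by least squares
# on coded (-1/+1) factors. Trial data are hypothetical.
import numpy as np

# Coded levels for irradiation (x1) and temperature (x2), with a measured
# response y (maximum power, W) at each trial of the 2^2 design.
X_raw = np.array([[-1, -1], [+1, -1], [-1, +1], [+1, +1]], dtype=float)
y = np.array([38.0, 62.0, 33.0, 54.0])   # hypothetical trial results

# Model matrix: intercept, main effects, interaction x1*x2.
X = np.column_stack([np.ones(len(y)), X_raw[:, 0], X_raw[:, 1],
                     X_raw[:, 0] * X_raw[:, 1]])
b, *_ = np.linalg.lstsq(X, y, rcond=None)
print("y_hat = {:.2f} + {:.2f}*x1 + {:.2f}*x2 + {:.2f}*x1*x2".format(*b))
```

Evaluating the fitted polynomial over a grid of coded (x1, x2) values gives the response surface used to judge the model's accuracy against the experimental trials.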
Turbulence Modeling Validation, Testing, and Development
NASA Technical Reports Server (NTRS)
Bardina, J. E.; Huang, P. G.; Coakley, T. J.
1997-01-01
The primary objective of this work is to provide accurate numerical solutions for selected flow fields and to compare and evaluate the performance of selected turbulence models with experimental results. Four popular turbulence models have been tested and validated against experimental data of ten turbulent flows. The models are: (1) the two-equation k-omega model of Wilcox, (2) the two-equation k-epsilon model of Launder and Sharma, (3) the two-equation k-omega/k-epsilon SST model of Menter, and (4) the one-equation model of Spalart and Allmaras. The flows investigated are five free shear flows consisting of a mixing layer, a round jet, a plane jet, a plane wake, and a compressible mixing layer; and five boundary layer flows consisting of an incompressible flat plate, a Mach 5 adiabatic flat plate, a separated boundary layer, an axisymmetric shock-wave/boundary layer interaction, and an RAE 2822 transonic airfoil. The experimental data for these flows are well established and have been extensively used in model developments. The results are shown in the following four sections: Part A describes the equations of motion and boundary conditions; Part B describes the model equations, constants, parameters, boundary conditions, and numerical implementation; and Parts C and D describe the experimental data and the performance of the models in the free-shear flows and the boundary layer flows, respectively.
NASA Technical Reports Server (NTRS)
Ormsbee, A. I.; Bragg, M. B.; Maughmer, M. D.
1981-01-01
A set of relationships used to scale small-sized dispersion studies to full-size results is experimentally verified and, with some qualifications, basic deposition patterns are presented. In the process of validating these scaling laws, the basic experimental techniques used in conducting such studies, both with and without an operational propeller, were developed. The procedures that evolved are outlined in some detail. The envelope of test conditions that can be accommodated in the Langley Vortex Research Facility, which was developed theoretically, is verified using a series of vortex trajectory experiments that help to define the limitations due to wall interference effects for models of different sizes.
Schütte, Judith; Wang, Huange; Antoniou, Stella; Jarratt, Andrew; Wilson, Nicola K; Riepsaame, Joey; Calero-Nieto, Fernando J; Moignard, Victoria; Basilico, Silvia; Kinston, Sarah J; Hannah, Rebecca L; Chan, Mun Chiang; Nürnberg, Sylvia T; Ouwehand, Willem H; Bonzanni, Nicola; de Bruijn, Marella FTR; Göttgens, Berthold
2016-01-01
Transcription factor (TF) networks determine cell-type identity by establishing and maintaining lineage-specific expression profiles, yet reconstruction of mammalian regulatory network models has been hampered by a lack of comprehensive functional validation of regulatory interactions. Here, we report comprehensive ChIP-Seq, transgenic and reporter gene experimental data that have allowed us to construct an experimentally validated regulatory network model for haematopoietic stem/progenitor cells (HSPCs). Model simulation coupled with subsequent experimental validation using single cell expression profiling revealed potential mechanisms for cell state stabilisation, and also how a leukaemogenic TF fusion protein perturbs key HSPC regulators. The approach presented here should help to improve our understanding of both normal physiological and disease processes. DOI: http://dx.doi.org/10.7554/eLife.11469.001 PMID:26901438
FE Modelling of Tensile and Impact Behaviours of Squeeze Cast Magnesium Alloy AM60
NASA Astrophysics Data System (ADS)
DiCecco, Sante; Altenhof, William; Hu, Henry
In response to the need for reduced global emissions, the transportation industry has been steadily increasing the magnesium content in vehicles. This trend has resulted in experimental documentation of numerous alloy and casting combinations, while comparatively little work has been done regarding the development of numerical material models for vehicle crashworthiness simulations. In this study, material mechanical behaviour was implemented into an existing material model within the nonlinear FEA code LS-DYNA to emulate the mechanical behaviour of squeeze cast magnesium alloy AM60 with a relatively thick, 10 mm section. Model validation was achieved by comparing the numerical and experimental results of a tensile test and a Charpy impact event. Validation found an average absolute error of 5.44% between numerical and experimental tensile test data, whereas a relatively large discrepancy was found during Charpy evaluation. This discrepancy has been attributed to the presence of microstructure inhomogeneity in the squeeze cast magnesium alloy AM60.
Thermal conductivity of microporous layers: Analytical modeling and experimental validation
NASA Astrophysics Data System (ADS)
Andisheh-Tadbir, Mehdi; Kjeang, Erik; Bahrami, Majid
2015-11-01
A new compact relationship is developed for the thermal conductivity of the microporous layer (MPL) used in polymer electrolyte fuel cells as a function of pore size distribution, porosity, and compression pressure. The proposed model is successfully validated against experimental data obtained from a transient plane source thermal constants analyzer. The thermal conductivities of carbon paper samples with and without MPL were measured as a function of load (1-6 bars) and the MPL thermal conductivity was found between 0.13 and 0.17 W m-1 K-1. The proposed analytical model predicts the experimental thermal conductivities within 5%. A correlation generated from the analytical model was used in a multi objective genetic algorithm to predict the pore size distribution and porosity for an MPL with optimized thermal conductivity and mass diffusivity. The results suggest that an optimized MPL, in terms of heat and mass transfer coefficients, has an average pore size of 122 nm and 63% porosity.
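As a sketch of the design search described in the last sentence, a scalarized grid search can stand in for the multi-objective genetic algorithm; both property correlations below are placeholders, not the paper's model:

```python
# Illustrative stand-in for the multi-objective search described above,
# scanning pore size and porosity with *hypothetical* placeholder
# correlations for MPL thermal conductivity k(d, eps) and effective
# diffusivity D(d, eps); the paper's actual correlations differ.
import numpy as np

def k_mpl(d_nm, eps):        # placeholder trend: k falls with porosity
    return 0.25 * (1 - eps) ** 1.5 + 1e-5 * d_nm

def d_eff(d_nm, eps):        # placeholder trend: D rises with both inputs
    return eps ** 1.5 * d_nm / (d_nm + 100.0)

d_grid = np.linspace(50, 300, 60)     # average pore size, nm
e_grid = np.linspace(0.3, 0.8, 60)    # porosity
best = max(((d, e) for d in d_grid for e in e_grid),
           key=lambda p: d_eff(*p) - 2.0 * k_mpl(*p))  # trade-off weight 2.0
print("best (pore size nm, porosity):", best)
```

A weighted sum collapses the heat/mass-transfer trade-off to a single score; a genetic algorithm, as used in the paper, instead explores the Pareto front without fixing the weight in advance.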
An Experimental and Numerical Study of a Supersonic Burner for CFD Model Development
NASA Technical Reports Server (NTRS)
Magnotti, G.; Cutler, A. D.
2008-01-01
A laboratory scale supersonic burner has been developed for validation of computational fluid dynamics models. Detailed numerical simulations were performed for the flow inside the combustor, and coupled with finite element thermal analysis to obtain more accurate outflow conditions. A database of nozzle exit profiles for a wide range of conditions of interest was generated to be used as boundary conditions for simulation of the external jet, or for validation of non-intrusive measurement techniques. A set of experiments was performed to validate the numerical results. In particular, temperature measurements obtained by using an infrared camera show that the computed heat transfer was larger than the measured value. Relaminarization in the convergent part of the nozzle was found to be responsible for this discrepancy, and further numerical simulations sustained this conclusion.
Preliminary Design of Winged Experimental Rocket by University Consortium
NASA Astrophysics Data System (ADS)
Wakita, Masashi; Yonemoto, Koichi; Akiyama, Tomoki; Aso, Shigeru; Kohsetsu, Yuji; Nagata, Harunori
The project of Winged Experimental Rocket described here is a proposal by the alliance of universities (University Consortium) expanding and integrating the research activities on reusable space transportation systems performed by individual universities, and it aims at flight proof of the results of advanced research conducted by the universities and JAXA using university-centered experimental launch systems. This paper verifies the validity of the winged experimental rocket by surveying the technical issues that should be demonstrated and by estimating the airframe scale, weight and finally the total cost. The development schedule of this project was set to five years, in which two airframes of different scales will be developed to minimize the risks. A 1.5-meter-long airframe will first be manufactured and flight-tested in the third year to verify the design issues. Then a 2.5-meter-long airframe will be developed to conduct a complete flight demonstration of the various research issues in the fifth year.
Sfakiotakis, Stelios; Vamvuka, Despina
2015-12-01
The pyrolysis of six waste biomass samples was studied and the fuels were kinetically evaluated. A modified independent parallel reactions scheme (IPR) and a distributed activation energy model (DAEM) were developed, and their validity was assessed and compared by checking their accuracy in fitting the experimental results, as well as their prediction capability under different experimental conditions. The pyrolysis experiments were carried out in a thermogravimetric analyzer, and a fitting procedure based on least-squares minimization was performed simultaneously at different experimental conditions. A modification of the IPR model, considering dependence of the pre-exponential factor on heating rate, was proved to give better fits for the same number of tuned kinetic parameters compared with the standard IPR model, and very good predictions for stepwise experiments. The fit of the calculated data to the experimental data using the developed DAEM model was also proved to be very good. Copyright © 2015 Elsevier Ltd. All rights reserved.
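A minimal sketch of an IPR-style fit, with two parallel first-order Arrhenius pseudo-components fitted by least-squares minimization against thermogravimetric mass-loss data; the kinetic parameters and synthetic "data" below are hypothetical:

```python
# Sketch of an independent-parallel-reactions (IPR) fit to thermogravimetric
# data: overall mass loss modeled as a weighted sum of first-order Arrhenius
# pseudo-components, fitted by least squares. Parameters are hypothetical.
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import least_squares

R = 8.314  # J/mol/K

def ipr_mass(params, T, beta):
    """Mass fraction vs temperature T (K) at heating rate beta (K/s)
    for two parallel first-order reactions; params = (c1, A1, E1, A2, E2)."""
    c1, A1, E1, A2, E2 = params
    def rhs(temp, a):
        # da/dT = (A/beta) * exp(-E/RT) * (1 - a) for each pseudo-component
        return [(A1 / beta) * np.exp(-E1 / (R * temp)) * (1 - a[0]),
                (A2 / beta) * np.exp(-E2 / (R * temp)) * (1 - a[1])]
    sol = solve_ivp(rhs, (T[0], T[-1]), [0.0, 0.0], t_eval=T)
    return 1.0 - c1 * sol.y[0] - (1.0 - c1) * sol.y[1]

T = np.linspace(450.0, 900.0, 200)            # K
beta = 10.0 / 60.0                            # 10 K/min in K/s
m_exp = ipr_mass([0.6, 1e8, 1.2e5, 1e6, 1.6e5], T, beta)  # synthetic "data"

res = least_squares(lambda p: ipr_mass(p, T, beta) - m_exp,
                    x0=[0.5, 5e7, 1.1e5, 5e5, 1.5e5],
                    bounds=([0, 1e3, 5e4, 1e3, 5e4], [1, 1e12, 3e5, 1e12, 3e5]),
                    x_scale=[1.0, 1e8, 1e5, 1e6, 1e5])  # scale disparate params
print("fitted fractions/kinetics:", res.x)
```

Fitting several heating rates simultaneously, as the paper does, simply stacks the residual vectors for each rate before minimization.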
Role of metabolism and viruses in aflatoxin-induced liver cancer
DOE Office of Scientific and Technical Information (OSTI.GOV)
Groopman, John D.; Kensler, Thomas W.
The use of biomarkers in molecular epidemiology studies for identifying stages in the progression of development of the health effects of environmental agents has the potential for providing important information for critical regulatory, clinical and public health problems. Investigations of aflatoxins probably represent one of the most extensive data sets in the field and this work may serve as a template for future studies of other environmental agents. The aflatoxins are naturally occurring mycotoxins found on foods such as corn, peanuts, various other nuts and cottonseed and they have been demonstrated to be carcinogenic in many experimental models. As a result of nearly 30 years of study, experimental data and epidemiological studies in human populations, aflatoxin B1 was classified as carcinogenic to humans by the International Agency for Research on Cancer. The long-term goal of the research described herein is the application of biomarkers to the development of preventative interventions for use in human populations at high-risk for cancer. Several of the aflatoxin-specific biomarkers have been validated in epidemiological studies and are now being used as intermediate biomarkers in prevention studies. The development of these aflatoxin biomarkers has been based upon the knowledge of the biochemistry and toxicology of aflatoxins gleaned from both experimental and human studies. These biomarkers have subsequently been utilized in experimental models to provide data on the modulation of these markers under different situations of disease risk. This systematic approach provides encouragement for preventive interventions and should serve as a template for the development, validation and application of other chemical-specific biomarkers to cancer or other chronic diseases.
Aerothermal Testing for Project Orion Crew Exploration Vehicle
NASA Technical Reports Server (NTRS)
Berry, Scott A.; Horvath, Thomas J.; Lillard, Randolph P.; Kirk, Benjamin S.; Fischer-Cassady, Amy
2009-01-01
The Project Orion Crew Exploration Vehicle aerothermodynamic experimentation strategy, as it relates to flight database development, is reviewed. Experimental data has been obtained to both validate the computational predictions utilized as part of the database and support the development of engineering models for issues not adequately addressed with computations. An outline is provided of the working groups formed to address the key deficiencies in data and knowledge for blunt reentry vehicles. The facilities utilized to address these deficiencies are reviewed, along with some of the important results obtained thus far. For smooth wall comparisons of computational convective heating predictions against experimental data from several facilities, confidence was gained with the use of algebraic turbulence model solutions as part of the database. For cavities and protuberances, experimental data is being used for screening various designs, plus providing support to the development of engineering models. With the reaction-control system testing, experimental data were acquired on the surface in combination with off-body flow visualization of the jet plumes and interactions. These results are being compared against predictions for improved understanding of aftbody thermal environments and uncertainties.
Finite Element Model Development and Validation for Aircraft Fuselage Structures
NASA Technical Reports Server (NTRS)
Buehrle, Ralph D.; Fleming, Gary A.; Pappa, Richard S.; Grosveld, Ferdinand W.
2000-01-01
The ability to extend the valid frequency range for finite element based structural dynamic predictions using detailed models of the structural components and attachment interfaces is examined for several stiffened aircraft fuselage structures. This extended dynamic prediction capability is needed for the integration of mid-frequency noise control technology. Beam, plate and solid element models of the stiffener components are evaluated. Attachment models between the stiffener and panel skin range from a line along the rivets of the physical structure to a constraint over the entire contact surface. The finite element models are validated using experimental modal analysis results. The increased frequency range results in a corresponding increase in the number of modes, modal density and spatial resolution requirements. In this study, conventional modal tests using accelerometers are complemented with Scanning Laser Doppler Velocimetry and Electro-Optic Holography measurements to further resolve the spatial response characteristics. Whenever possible, component and subassembly modal tests are used to validate the finite element models at lower levels of assembly. Normal mode predictions for different finite element representations of components and assemblies are compared with experimental results to assess the most accurate techniques for modeling aircraft fuselage type structures.
The VALiDATe29 MRI Based Multi-Channel Atlas of the Squirrel Monkey Brain.
Schilling, Kurt G; Gao, Yurui; Stepniewska, Iwona; Wu, Tung-Lin; Wang, Feng; Landman, Bennett A; Gore, John C; Chen, Li Min; Anderson, Adam W
2017-10-01
We describe the development of the first digital atlas of the normal squirrel monkey brain and present the resulting product, VALiDATe29. The VALiDATe29 atlas is based on multiple types of magnetic resonance imaging (MRI) contrast acquired on 29 squirrel monkeys, and is created using unbiased, nonlinear registration techniques, resulting in a population-averaged stereotaxic coordinate system. The atlas consists of multiple anatomical templates (proton density, T1, and T2* weighted), diffusion MRI templates (fractional anisotropy and mean diffusivity), and ex vivo templates (fractional anisotropy and a structural MRI). In addition, the templates are combined with histologically defined cortical labels, and diffusion tractography defined white matter labels. The combination of intensity templates and image segmentations makes this atlas suitable for the fundamental atlas applications of spatial normalization and label propagation. Together, this atlas facilitates 3D anatomical localization and region of interest delineation, and enables comparisons of experimental data across different subjects or across different experimental conditions. This article describes the atlas creation and its contents, and demonstrates the use of the VALiDATe29 atlas in typical applications. The atlas is freely available to the scientific community.
An observational examination of the literature in diagnostic anatomic pathology.
Foucar, Elliott; Wick, Mark R
2005-05-01
Original research published in the medical literature confronts the reader with three very basic and closely linked questions--are the authors' conclusions true in the contextual setting in which the work was performed (internally valid); if so, are the conclusions also applicable in other practice settings (externally valid); and, if the conclusions of the study are bona fide, do they represent an important contribution to medical practice or are they true-but-insignificant? Most publications attempt to convince readers that the researchers' conclusions are both internally valid and important, and occasionally papers also directly address external validity. Developing standardized methods to facilitate the prospective determination of research importance would be useful to both journals and their readers, but has proven difficult. In contrast, the evidence-based medicine (EBM) movement has had more success with understanding and codifying factors thought to promote research validity. Of the many variables that can influence research validity, research design is the one that has received the most attention. The present paper reviews the contributions of EBM to understanding research validity, looking for areas where EBM's body of knowledge is applicable to the anatomic pathology (AP) literature. As part of this project, the authors performed a pilot observational analysis of a representative sample of the current pertinent literature on diagnostic tissue pathology. The results of that review showed that most of the latter publications employ one of the four categories of "observational" research design that have been delineated by the EBM movement, and that the most common of these observational designs is a "cross-sectional" comparison. Pathologists do not presently use the "experimental" research designs so admired by advocates of EBM. Slightly > 50% of AP observational studies employed statistical evaluations to support their final conclusions. Comparison of the current AP literature with a selected group of papers published in 1977 shows a discernible change over that period that has affected not just technological procedures, but also research design and use of statistics. Although we feel that advocates of EBM deserve credit for bringing attention to the close link between research design and research validity, much of the EBM effort has centered on refining "experimental" methodology, and the complexities of observational research have often been treated in an inappropriately dismissive manner. For advocates of EBM, an observational study is what you are relegated to as a second choice when you are unable to do an experimental study. The latter viewpoint may be true for evaluating new chemotherapeutic agents, but is unacceptable to pathologists, whose research advances are currently completely dependent on well-conducted observational research. Rather than succumb to randomization envy and accept EBM's assertion that observational research is second best, the challenge to AP is to develop and adhere to standards for observational research that will allow our patients to benefit from the full potential of this time tested approach to developing valid insights into disease.
VIRmiRNA: a comprehensive resource for experimentally validated viral miRNAs and their targets.
Qureshi, Abid; Thakur, Nishant; Monga, Isha; Thakur, Anamika; Kumar, Manoj
2014-01-01
Viral microRNAs (miRNAs) regulate gene expression of viral and/or host genes to benefit the virus. Hence, miRNAs play a key role in host-virus interactions and the pathogenesis of viral diseases. Lately, miRNAs have also shown potential as important targets for the development of novel antiviral therapeutics. Although several miRNA and miRNA-target repositories are available for human and other organisms in the literature, a dedicated resource on viral miRNAs and their targets has been lacking. Therefore, we have developed a comprehensive viral miRNA resource harboring information on 9133 entries in three subdatabases. This includes 1308 experimentally validated miRNA sequences with their isomiRs encoded by 44 viruses in the viral miRNA subdatabase 'VIRMIRNA', and 7283 of their target genes in 'VIRMIRTAR'. Additionally, there is information on 542 antiviral miRNAs encoded by the host against 24 viruses in the antiviral miRNA subdatabase 'AVIRMIR'. The web interface was developed using the Linux-Apache-MySQL-PHP (LAMP) software bundle. User-friendly browse, search, advanced search and useful analysis tools are also provided on the web interface. VIRmiRNA is the first specialized resource of experimentally proven virus-encoded miRNAs and their associated targets. This database would enhance the understanding of viral/host gene regulation and may also prove beneficial in the development of antiviral therapeutics. Database URL: http://crdd.osdd.net/servers/virmirna. © The Author(s) 2014. Published by Oxford University Press.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Miao, Yinbin; Mo, Kun; Jamison, Laura M.
This activity is supported by the US Nuclear Energy Advanced Modeling and Simulation (NEAMS) Fuels Product Line (FPL) and aims at providing experimental data for the validation of the mesoscale simulation code MARMOT. MARMOT is a mesoscale multiphysics code that predicts the coevolution of microstructure and properties within reactor fuel during its lifetime in the reactor. It is an important component of the Moose-Bison-Marmot (MBM) code suite that has been developed by Idaho National Laboratory (INL) to enable next-generation fuel performance modeling capability as part of the NEAMS Program FPL. In order to ensure the accuracy of the microstructure-based materials models being developed within the MARMOT code, extensive validation efforts must be carried out. In this report, we summarize the experimental efforts in FY16, including the following important experiments: (1) in-situ grain growth measurement of nano-grained UO2; (2) investigation of surface morphology in micro-grained UO2; (3) nano-indentation experiments on nano- and micro-grained UO2. The highlight of this year is that we have successfully demonstrated the capability to measure grain size development in situ while maintaining the stoichiometry of nano-grained UO2; the experiment uses synchrotron X-ray diffraction, for the first time, to measure the grain growth behavior of UO2 in situ.
Das, Sankha Subhra; Saha, Pritam
2018-01-01
MicroRNAs (miRNAs) are well known as key regulators of diverse biological pathways. Numerous experimental studies have shown that abnormal miRNA expression profiles are responsible for various pathophysiological conditions by modulating genes in disease-associated pathways. In spite of the rapid increase in research data confirming such associations, scientists still do not have access to a consolidated database offering these miRNA-pathway association details for critical diseases. We have developed miRwayDB, a database providing comprehensive information on experimentally validated miRNA-pathway associations in various pathophysiological conditions, utilizing data collected from published literature. To the best of our knowledge, it is the first database that provides information about experimentally validated miRNA-mediated pathway dysregulation as seen specifically in critical human diseases and hence indicative of a cause-and-effect relationship in most cases. The current version of miRwayDB collects an exhaustive list of miRNA-pathway association entries for 76 critical disease conditions by reviewing 663 published articles. Each database entry contains complete information on the name of the pathophysiological condition, associated miRNA(s), experimental sample type(s), regulation pattern (up/down) of miRNA, pathway association(s), targeted member of the dysregulated pathway(s) and a brief description. In addition, miRwayDB provides miRNA, gene and pathway scores to evaluate the role of miRNA-regulated pathways in various pathophysiological conditions. The database can also be used for other biomedical approaches such as validation of computational analyses, integrated analysis and prediction with computational models. It also offers a submission page for novel data from recently published studies. We believe that miRwayDB will be a useful tool for the miRNA research community. Database URL: http://www.mirway.iitkgp.ac.in PMID:29688364
Roth, Sébastien; Torres, Fabien; Feuerstein, Philippe; Thoral-Pierre, Karine
2013-05-01
Finite element analysis is frequently used in several fields such as automotive simulation and biomechanics. It helps researchers and engineers understand the mechanical behaviour of complex structures. The development of computer science has brought the possibility of developing realistic computational models that behave like physical ones, avoiding the difficulties and costs of experimental tests. In the framework of biomechanics, many FE models have been developed in the last few decades, enabling the investigation of the behaviour of the human body subjected to severe loading, such as in road traffic accidents or ballistic impact. In both cases, the thorax/abdomen/pelvis system is frequently injured, and understanding the behaviour of this complex system is of extreme importance. In order to explore the dynamic response of this system to impact loading, a finite element model of the human thorax/abdomen/pelvis system has therefore been developed, including the main organs: heart, lungs, kidneys, liver, spleen, the skeleton (with vertebrae, intervertebral discs, ribs), stomach, intestines, muscles, and skin. The FE model is based on a 3D reconstruction made from medical records of anonymous patients who had medical scans unrelated to the present study. Several scans were analyzed, and specific attention was paid to the anthropometry of the reconstructed model, which can be considered a 50th percentile male model. The biometric parameters and laws were implemented in the dynamic FE code (Radioss, Altair Hyperworks 11©) used for the dynamic simulations. The 50th percentile model was then validated against experimental data available in the literature, in terms of deflection and force, whose curves must lie within the experimental corridors. For other anthropometries (small male or large male models), however, questions about the validation and the results of numerical accident replications can still be raised. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
Lima, Gustavo F; Freitas, Victor C G; Araújo, Renan P; Maitelli, André L; Salazar, Andrés O
2017-09-15
Pipeline inspection using a device called a Pipeline Inspection Gauge (PIG) is safe and reliable when the PIG moves at low speed during inspection. We built a Testing Laboratory, containing a testing loop and supervisory system, to study speed control techniques for PIGs. The objective of this work is to present and validate the Testing Laboratory, which will allow development of a speed controller for PIGs and solve an existing problem in the oil industry. The experimental methodology used throughout the project is also presented. We installed pressure transducers on the pipeline outer walls to detect the PIG's movement and, with data from the supervisory system, calculated an average speed of 0.43 m/s. At the same time, the electronic board inside the PIG received data from the odometer and calculated an average speed of 0.45 m/s. We found an error of 4.44%, which is experimentally acceptable. The results showed that it is possible to successfully build a Testing Laboratory to detect the PIG's passage and estimate its speed. The validation of the Testing Laboratory using data from the odometer and its auxiliary electronics was very successful. Lastly, we hope to develop more research in the oil industry using this Testing Laboratory.
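As a worked version of the arithmetic reported above: the transducer spacing and arrival times below are invented for illustration, and only the two average speeds and the 4.44% figure come from the abstract.

```python
import numpy as np

# Hypothetical transducer layout along the loop and pulse arrival times.
positions = [0.0, 10.0, 20.0]          # transducer positions (m), assumed
arrivals  = [0.0, 23.3, 46.5]          # pressure-pulse detection times (s), assumed

# Average speed from consecutive transducer pairs (supervisory-side estimate).
v_segments = np.diff(positions) / np.diff(arrivals)
v_transducers = v_segments.mean()

v_odometer = 0.45                      # on-board odometer estimate (m/s), from the paper
v_supervisory = 0.43                   # transducer-based estimate (m/s), from the paper

# Relative error with the odometer as reference reproduces the quoted 4.44%.
error = abs(v_odometer - v_supervisory) / v_odometer * 100
print(f"transducer avg: {v_transducers:.2f} m/s, error vs odometer: {error:.2f}%")
```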
Simon, Daniela; Kischkel, Eva; Spielberg, Rüdiger; Kathmann, Norbert
2012-06-30
Distressing symptom-related anxiety is difficult to study in obsessive-compulsive disorder (OCD) due to the disorder's heterogeneity. Our aim was to develop and validate a set of pictures and films comprising a variety of prominent OCD triggers that can be used for individually tailored symptom provocation in experimental studies. In a two-stage production procedure, a large pool of OCD triggers and neutral contents was produced and preselected by three psychotherapists specialized in OCD. A sample of 13 OCD patients and 13 controls rated their anxiety, aversiveness and arousal during exposure to OCD-relevant, aversive and neutral control stimuli. Our findings demonstrate differences between the responses of patients and controls to OCD triggers only. Symptom-related anxiety was stronger in response to dynamic than to static OCD-relevant stimuli. Due to the small number of patients (13) included in the study, only tentative conclusions can be drawn, and this study merely provides a first step of validation. These standardized sets constitute valuable tools that can be used in experimental studies on the brain correlates of OCD symptoms and for the study of therapeutic interventions, in order to contribute to future developments in the field. Copyright © 2012 Elsevier Ltd. All rights reserved.
Prediction of physical protein protein interactions
NASA Astrophysics Data System (ADS)
Szilágyi, András; Grimm, Vera; Arakaki, Adrián K.; Skolnick, Jeffrey
2005-06-01
Many essential cellular processes such as signal transduction, transport, cellular motion and most regulatory mechanisms are mediated by protein-protein interactions. In recent years, new experimental techniques have been developed to discover the protein-protein interaction networks of several organisms. However, the accuracy and coverage of these techniques have proven to be limited, and computational approaches remain essential both to assist in the design and validation of experimental studies and for the prediction of interaction partners and detailed structures of protein complexes. Here, we provide a critical overview of existing structure-independent and structure-based computational methods. Although these techniques have significantly advanced in the past few years, we find that most of them are still in their infancy. We also provide an overview of experimental techniques for the detection of protein-protein interactions. Although the developments are promising, false positive and false negative results are common, and reliable detection is possible only by taking a consensus of different experimental approaches. The shortcomings of experimental techniques affect both the further development and the fair evaluation of computational prediction methods. For an adequate comparative evaluation of prediction and high-throughput experimental methods, an appropriately large benchmark set of biophysically characterized protein complexes would be needed, but is sorely lacking.
Development of an Integrated Nozzle for a Symmetric, RBCC Launch Vehicle Configuration
NASA Technical Reports Server (NTRS)
Smith, Timothy D.; Canabal, Francisco, III; Rice, Tharen; Blaha, Bernard
2000-01-01
The development of rocket based combined cycle (RBCC) engines is highly dependent upon integrating several different modes of operation into a single system. One of the key components for achieving acceptable performance levels through each mode of operation is the nozzle. It must be highly integrated to serve the expansion processes of both rocket and air-breathing modes without undue weight, drag, or complexity. The NASA GTX configuration requires a fixed-geometry, altitude-compensating nozzle configuration. The initial configuration, used mainly to estimate weight and cooling requirements, was a 15° half-angle cone, which cuts a concave surface from a point within the flowpath to the vehicle trailing edge. Results of 3-D CFD calculations on this geometry are presented. To address the critical issues associated with integrated, fixed-geometry, multimode nozzle development, the GTX team has initiated a series of tasks to evolve the nozzle design and validate performance levels. An overview of these tasks is given. The first element is a design activity to develop tools for integration of efficient expansion surfaces with the existing flowpath and vehicle aft-body, and to develop a second-generation nozzle design. A preliminary result using a "streamline-tracing" technique is presented. As the nozzle design evolves, a combination of 3-D CFD analysis and experimental evaluation will be used to validate the design procedure and determine the installed performance for propulsion cycle modeling. The initial experimental effort will consist of cold-flow experiments designed to validate the general trends of the streamline-tracing methodology and anchor the CFD analysis. Experiments will also be conducted to simulate nozzle performance during each mode of operation. As the design matures, hot-fire tests will be conducted to refine performance estimates and anchor more sophisticated reacting-flow analysis.
Hériché, Jean-Karim; Lees, Jon G.; Morilla, Ian; Walter, Thomas; Petrova, Boryana; Roberti, M. Julia; Hossain, M. Julius; Adler, Priit; Fernández, José M.; Krallinger, Martin; Haering, Christian H.; Vilo, Jaak; Valencia, Alfonso; Ranea, Juan A.; Orengo, Christine; Ellenberg, Jan
2014-01-01
The advent of genome-wide RNA interference (RNAi)–based screens puts us in the position to identify genes for all functions human cells carry out. However, for many functions, assay complexity and cost make genome-scale knockdown experiments impossible. Methods to predict genes required for cell functions are therefore needed to focus RNAi screens from the whole genome on the most likely candidates. Although different bioinformatics tools for gene function prediction exist, they lack experimental validation and are therefore rarely used by experimentalists. To address this, we developed an effective computational gene selection strategy that represents public data about genes as graphs and then analyzes these graphs using kernels on graph nodes to predict functional relationships. To demonstrate its performance, we predicted human genes required for a poorly understood cellular function—mitotic chromosome condensation—and experimentally validated the top 100 candidates with a focused RNAi screen by automated microscopy. Quantitative analysis of the images demonstrated that the candidates were indeed strongly enriched in condensation genes, including the discovery of several new factors. By combining bioinformatics prediction with experimental validation, our study shows that kernels on graph nodes are powerful tools to integrate public biological data and predict genes involved in cellular functions of interest. PMID:24943848
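The abstract does not say which graph-node kernel the authors used; the sketch below uses a diffusion (heat) kernel, one common choice, to rank candidate genes by kernel similarity to known positives on a toy association graph.

```python
import numpy as np
from scipy.linalg import expm

# Toy gene-association graph as an adjacency matrix (genes 0-5, illustrative).
A = np.array([[0, 1, 1, 0, 0, 0],
              [1, 0, 1, 1, 0, 0],
              [1, 1, 0, 0, 1, 0],
              [0, 1, 0, 0, 1, 1],
              [0, 0, 1, 1, 0, 1],
              [0, 0, 0, 1, 1, 0]], dtype=float)

# Diffusion kernel on graph nodes: K = exp(-beta * L), L the graph Laplacian.
L = np.diag(A.sum(axis=1)) - A
K = expm(-1.0 * L)  # beta = 1.0 is a tuning choice

# Rank unlabeled genes by summed kernel similarity to known positives (genes 0, 2).
positives = [0, 2]
scores = K[:, positives].sum(axis=1)
candidates = [int(g) for g in np.argsort(-scores) if g not in positives]
print("ranked candidates:", candidates)
```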
BioNetCAD: design, simulation and experimental validation of synthetic biochemical networks
Rialle, Stéphanie; Felicori, Liza; Dias-Lopes, Camila; Pérès, Sabine; El Atia, Sanaâ; Thierry, Alain R.; Amar, Patrick; Molina, Franck
2010-01-01
Motivation: Synthetic biology studies how to design and construct biological systems with functions that do not exist in nature. Biochemical networks, although easier to control, have been used less frequently than genetic networks as a base to build a synthetic system. To date, no clear engineering principles exist to design such cell-free biochemical networks. Results: We describe a methodology for the construction of synthetic biochemical networks based on three main steps: design, simulation and experimental validation. We developed BioNetCAD to help users to go through these steps. BioNetCAD allows designing abstract networks that can be implemented thanks to CompuBioTicDB, a database of parts for synthetic biology. BioNetCAD enables also simulations with the HSim software and the classical Ordinary Differential Equations (ODE). We demonstrate with a case study that BioNetCAD can rationalize and reduce further experimental validation during the construction of a biochemical network. Availability and implementation: BioNetCAD is freely available at http://www.sysdiag.cnrs.fr/BioNetCAD. It is implemented in Java and supported on MS Windows. CompuBioTicDB is freely accessible at http://compubiotic.sysdiag.cnrs.fr/ Contact: stephanie.rialle@sysdiag.cnrs.fr; franck.molina@sysdiag.cnrs.fr Supplementary information: Supplementary data are available at Bioinformatics online. PMID:20628073
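As an illustration of the ODE-based simulation step mentioned above, here is a minimal two-step enzymatic cascade integrated with SciPy; the network topology, rate laws, and constants are placeholders, not BioNetCAD or HSim output.

```python
from scipy.integrate import solve_ivp

# Michaelis-Menten style two-step cascade, S -> I -> P, as a stand-in for a
# designed biochemical network; all rate constants are illustrative.
def rhs(t, y, vmax1=1.0, km1=0.5, vmax2=0.8, km2=0.3):
    s, i, p = y
    r1 = vmax1 * s / (km1 + s)   # enzyme 1 converts S to I
    r2 = vmax2 * i / (km2 + i)   # enzyme 2 converts I to P
    return [-r1, r1 - r2, r2]

sol = solve_ivp(rhs, (0.0, 20.0), [1.0, 0.0, 0.0], dense_output=True)
print("final product concentration:", sol.y[2, -1])
```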
NASA Astrophysics Data System (ADS)
Polverino, Pierpaolo; Esposito, Angelo; Pianese, Cesare; Ludwig, Bastian; Iwanschitz, Boris; Mai, Andreas
2016-02-01
In the current energy scenario, Solid Oxide Fuel Cells (SOFCs) exhibit appealing features which make them suitable for environmentally friendly power production, especially for stationary applications. An example is represented by micro-combined heat and power (μ-CHP) generation units based on SOFC stacks, which are able to produce electric and thermal power with high efficiency and low pollutant and greenhouse gas emissions. However, the main obstacles to their diffusion into the mass market are high maintenance and production costs and short lifetime. To improve these aspects, current research focuses on the development of robust and generalizable diagnostic techniques aimed at detecting and isolating faults within the entire system (i.e. SOFC stack and balance of plant). Coupled with appropriate recovery strategies, diagnosis can prevent undesired system shutdowns during faulty conditions, with consequent lifetime increase and maintenance cost reduction. This paper deals with the on-line experimental validation of a model-based diagnostic algorithm applied to a pre-commercial SOFC system. The proposed algorithm exploits a Fault Signature Matrix based on a Fault Tree Analysis and improved through fault simulations. The algorithm is characterized on the considered system and validated by means of experimental induction of faulty states in controlled conditions.
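A Fault Signature Matrix reduces on-line diagnosis to matching a binary symptom vector against fault columns. The sketch below shows that matching step only; the fault names, symptoms, and matrix entries are invented, whereas the paper derives its matrix from a Fault Tree Analysis refined by fault simulations.

```python
import numpy as np

# Illustrative fault signature matrix: rows are monitored symptoms, columns
# are candidate faults; entries mark which symptoms a fault is expected to fire.
faults = ["air_leak", "fuel_leak", "reformer_degradation"]
symptoms = ["stack_T_high", "stack_V_low", "blower_speed_high"]
FSM = np.array([[1, 0, 1],
                [1, 1, 0],
                [0, 0, 1]])

observed = np.array([1, 0, 1])  # binary symptom vector from on-line residuals

# Isolate the fault whose signature matches the observation exactly; fall
# back to the closest column (Hamming distance) if no exact match exists.
dist = np.abs(FSM - observed[:, None]).sum(axis=0)
print("diagnosis:", faults[int(np.argmin(dist))], "(distance", int(dist.min()), ")")
```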
Oberhardt, Matthew A.; Zarecki, Raphy; Reshef, Leah; ...
2016-01-28
Recent insights suggest that non-specific and/or promiscuous enzymes are common and active across life. Understanding the role of such enzymes is an important open question in biology. Here we develop a genome-wide method, PROPER, that uses a permissive PSI-BLAST approach to predict promiscuous activities of metabolic genes. Enzyme promiscuity is typically studied experimentally using multicopy suppression, in which over-expression of a promiscuous ‘replacer’ gene rescues lethality caused by inactivation of a ‘target’ gene. We use PROPER to predict multicopy suppression in Escherichia coli, achieving highly significant overlap with published cases (hypergeometric p = 4.4e-13). We then validate three novel predicted target-replacer gene pairs in new multicopy suppression experiments. We next go beyond PROPER and develop a network-based approach, GEM-PROPER, that integrates PROPER with genome-scale metabolic modeling to predict promiscuous replacements via alternative metabolic pathways. GEM-PROPER predicts a new indirect replacer (thiG) for an essential enzyme (pdxB) in production of pyridoxal 5’-phosphate (the active form of vitamin B6), which we validate experimentally via multicopy suppression. Here, we perform a structural analysis of thiG to determine its potential promiscuous active site, which we validate experimentally by inactivating the pertaining residues and showing a loss of replacer activity. Thus, this study is a successful example where a computational investigation leads to a network-based identification of an indirect promiscuous replacement of a key metabolic enzyme, which would have been extremely difficult to identify directly.
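The significance of the overlap with published multicopy-suppression cases is a hypergeometric tail probability. The sketch below shows that computation with placeholder counts; the paper's actual counts (which give p = 4.4e-13) are not stated in the abstract.

```python
from scipy.stats import hypergeom

# All four counts below are assumptions for illustration only.
N = 4000   # candidate gene pairs considered
K = 60     # published multicopy-suppression pairs among them
n = 120    # pairs predicted by PROPER
k = 14     # overlap between predictions and published pairs

# P(overlap >= k) under random draws, i.e. the enrichment p-value.
p = hypergeom.sf(k - 1, N, K, n)
print(f"hypergeometric p = {p:.2e}")
```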
Jet Measurements for Development of Jet Noise Prediction Tools
NASA Technical Reports Server (NTRS)
Bridges, James E.
2006-01-01
The primary focus of my presentation is the development of the jet noise prediction code JeNo, with most examples coming from the experimental work that drove the theoretical development and validation. JeNo is a statistical jet noise prediction code based upon the Lilley acoustic analogy. Our approach uses time-averaged 2-D or 3-D mean and turbulent statistics of the flow as input. The outputs are source distributions and spectral directivity.
Experimental and modelling of Arthrospira platensis cultivation in open raceway ponds.
Ranganathan, Panneerselvam; Amal, J C; Savithri, S; Haridas, Ajith
2017-10-01
In this work, the growth of Arthrospira platensis was studied in an open raceway pond, and a dynamic model for algal growth and a CFD model of the pond hydrodynamics were developed. The dynamic behaviour of the algal system was modelled by solving mass balance equations for the various components, considering light intensity and gas-liquid mass transfer. The hydrodynamics of the open raceway pond were modelled by solving mass and momentum balance equations for the liquid medium. The predicted algae concentration from the dynamic model was compared with the experimental data, and the hydrodynamic behaviour of the pond was compared with literature data for model validation. The model predictions match the experimental findings. Furthermore, the hydrodynamic behaviour and residence time distribution in our small raceway pond were predicted. These models can serve as a tool to assess pond performance criteria. Copyright © 2017 Elsevier Ltd. All rights reserved.
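A minimal sketch of the dynamic-model side, assuming Monod-type light limitation with Beer-Lambert self-shading; the rate law and every parameter value are illustrative stand-ins for the paper's mass-balance model, which also includes gas-liquid mass transfer.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Illustrative parameters: max growth and decay rates (1/s), half-saturation
# light (umol/m2/s), incident light, attenuation coefficient, pond depth (m).
mu_max, k_d = 1.2 / 86400, 0.1 / 86400
K_I, I0, k_att, depth = 100.0, 400.0, 0.1, 0.25

def rhs(t, y):
    X = y[0]                                   # biomass concentration (g/L)
    sigma = k_att * X * 1000 * depth           # optical thickness of the culture
    I_avg = I0 * (1 - np.exp(-sigma)) / sigma  # depth-averaged light (Beer-Lambert)
    mu = mu_max * I_avg / (K_I + I_avg)        # Monod-type light-limited growth
    return [(mu - k_d) * X]

sol = solve_ivp(rhs, (0, 10 * 86400), [0.05], max_step=3600.0)
print("biomass after 10 days (g/L):", sol.y[0, -1])
```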
Statistical Methodologies to Integrate Experimental and Computational Research
NASA Technical Reports Server (NTRS)
Parker, P. A.; Johnson, R. T.; Montgomery, D. C.
2008-01-01
Development of advanced algorithms for simulating engine flow paths requires the integration of fundamental experiments with the validation of enhanced mathematical models. In this paper, we provide an overview of statistical methods to strategically and efficiently conduct experiments and computational model refinement. Moreover, the integration of experimental and computational research efforts is emphasized. With a statistical engineering perspective, scientific and engineering expertise is combined with statistical sciences to gain deeper insights into experimental phenomena and code development performance, supporting the overall research objectives. The particular statistical methods discussed are design of experiments, response surface methodology, and uncertainty analysis and planning. Their application is illustrated with a coaxial free jet experiment and a turbulence model refinement investigation. Our goal is to provide an overview, focusing on concepts rather than practice, to demonstrate the benefits of using statistical methods in research and development, thereby encouraging their broader and more systematic application.
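As a small illustration of response surface methodology, the sketch below fits a second-order polynomial surface to a factorial-style data set by least squares; the design points and response values are made up.

```python
import numpy as np

# Coded factor levels of a small central-composite-style design (illustrative)
# and the measured responses at each run (made-up numbers).
x1 = np.array([-1, -1, 1, 1, 0, 0, 0, -1, 1], dtype=float)
x2 = np.array([-1, 1, -1, 1, 0, 0, 0, 0, 0], dtype=float)
y = np.array([5.2, 6.1, 6.8, 8.9, 7.0, 7.1, 6.9, 5.9, 7.6])

# Second-order model: y ~ b0 + b1*x1 + b2*x2 + b12*x1*x2 + b11*x1^2 + b22*x2^2
X = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print("coefficients b0,b1,b2,b12,b11,b22:", np.round(beta, 3))
```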
NASA Astrophysics Data System (ADS)
Stastnik, S.
2016-06-01
The development of materials for vertical outer building structures tends toward hollow clay blocks filled with an appropriate insulation material. Ceramic fittings provide high thermal resistance, but walls built from them frequently suffer from condensation of air humidity in the winter season. The paper presents the computational simulation and experimental laboratory validation of the moisture behaviour of such masonry, with insulation prepared from waste fibres, under Central European climatic conditions.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-08-31
... factors as the approved models, are validated by experimental test data, and receive the Administrator's... stage of the MEP involves applying the model against a database of experimental test cases including..., particularly the requirement for validation by experimental test data. That guidance is based on the MEP's...
Bredbenner, Todd L.; Eliason, Travis D.; Francis, W. Loren; McFarland, John M.; Merkle, Andrew C.; Nicolella, Daniel P.
2014-01-01
Cervical spinal injuries are a significant concern in all trauma injuries. Recent military conflicts have demonstrated the substantial risk of spinal injury for the modern warfighter. Finite element models used to investigate injury mechanisms often fail to examine the effects of variation in geometry or material properties on mechanical behavior. The goals of this study were to model geometric variation for a set of cervical spines, to extend this model to a parametric finite element model, and, as a first step, to validate the parametric model against experimental data for low-loading conditions. Individual finite element models were created using cervical spine (C3–T1) computed tomography data for five male cadavers. Statistical shape modeling (SSM) was used to generate a parametric finite element model incorporating variability of spine geometry, and soft-tissue material property variation was also included. The probabilistic loading response of the parametric model was determined under flexion-extension, axial rotation, and lateral bending and validated by comparison to experimental data. Based on qualitative and quantitative comparison of the experimental loading response and model simulations, we suggest that the model performs adequately under relatively low-level loading conditions in multiple loading directions. In conclusion, SSM methods coupled with finite element analyses within a probabilistic framework, along with the ability to statistically validate the overall model performance, provide innovative and important steps toward describing the differences in vertebral morphology, spinal curvature, and variation in material properties. We suggest that these methods, with additional investigation and validation under injurious loading conditions, will lead to understanding and mitigating the risks of injury in the spine and other musculoskeletal structures. PMID:25506051
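Statistical shape modeling of this kind is typically a PCA over registered landmark coordinates. The sketch below builds such a model and samples a new plausible geometry; the landmark data are random placeholders, and the paper's SSM pipeline may differ in detail.

```python
import numpy as np

# Placeholder "registered landmark" data: 5 subjects, 200 landmarks each,
# stacked as (x, y, z) coordinates per subject. Real input would come from
# segmented and registered cervical spine geometry.
rng = np.random.default_rng(0)
n_subjects, n_landmarks = 5, 200
shapes = rng.normal(size=(n_subjects, 3 * n_landmarks))

mean = shapes.mean(axis=0)
U, s, Vt = np.linalg.svd(shapes - mean, full_matrices=False)
modes = Vt                                  # principal shape modes
sigma = s / np.sqrt(n_subjects - 1)         # per-mode standard deviations

# A new plausible geometry: mean shape plus random weights on the first modes.
b = rng.normal(scale=sigma[:3])
new_shape = mean + b @ modes[:3]
print("generated landmark array shape:", new_shape.shape)
```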
Electrohydraulic Forming of Near-Net Shape Automotive Panels
DOE Office of Scientific and Technical Information (OSTI.GOV)
Golovaschenko, Sergey F.
2013-09-26
The objective of this project was to develop the electrohydraulic forming (EHF) process as a near-net shape automotive panel manufacturing technology that simultaneously reduces the energy embedded in vehicles and the energy consumed while producing automotive structures. Pulsed pressure is created via a shockwave generated by the discharge of high voltage capacitors through a pair of electrodes in a liquid-filled chamber. The shockwave in the liquid initiated by the expansion of the plasma channel formed between two electrodes propagates towards the blank and causes the blank to be deformed into a one-sided die cavity. The numerical model of the EHF process was validated experimentally and was successfully applied to the design of the electrode system and to a multi-electrode EHF chamber for full scale validation of the process. The numerical model was able to predict stresses in the dies during pulsed forming and was validated by the experimental study of the die insert failure mode for corner filling operations. The electrohydraulic forming process and its major subsystems, including durable electrodes, an EHF chamber, a water/air management system, a pulse generator and integrated process controls, were validated to be capable to operate in a fully automated, computer controlled mode for forming of a portion of a full-scale sheet metal component in laboratory conditions. Additionally, the novel processes of electrohydraulic trimming and electrohydraulic calibration were demonstrated at a reduced-scale component level. Furthermore, a hybrid process combining conventional stamping with EHF was demonstrated as a laboratory process for a full-scale automotive panel formed out of AHSS material. The economic feasibility of the developed EHF processes was defined by developing a cost model of the EHF process in comparison to the conventional stamping process.
Infantino, Angelo; Cicoria, Gianfranco; Lucconi, Giulia; Pancaldi, Davide; Vichi, Sara; Zagni, Federico; Mostacci, Domiziano; Marengo, Mario
2016-12-01
In the planning of a new cyclotron facility, an accurate knowledge of the radiation field around the accelerator is fundamental for the design of shielding and the protection of workers, the general public and the environment. Monte Carlo simulations can be very useful in this process, and their use is constantly increasing. However, few data have been published so far as regards the proper validation of Monte Carlo simulation against experimental measurements, particularly in the energy range of biomedical cyclotrons. In this work a detailed model of an existing installation of a GE PETtrace 16.5 MeV cyclotron was developed using FLUKA. An extensive measurement campaign of the neutron ambient dose equivalent H*(10) in marked positions around the cyclotron was conducted using a neutron rem-counter probe and CR39 neutron detectors. Data from a previous measurement campaign performed by our group using TLDs were also re-evaluated. The FLUKA model was then validated by comparing the results of high-statistics simulations with experimental data. In 10 out of 12 measurement locations, FLUKA simulations were in agreement within uncertainties with all three sets of experimental data; in the remaining 2 positions, the simulations agreed with two of the three sets. Our work quantitatively validates our FLUKA simulation setup and confirms that the Monte Carlo technique can produce accurate results in the energy range of biomedical cyclotrons. Copyright © 2016 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
Interferometric millimeter wave and THz wave doppler radar
Liao, Shaolin; Gopalsami, Nachappa; Bakhtiari, Sasan; Raptis, Apostolos C.; Elmer, Thomas
2015-08-11
A mixerless high frequency interferometric Doppler radar system and associated methods have been invented, numerically validated and experimentally tested. A continuous wave source, a phase modulator (e.g., a continuously oscillating reference mirror) and an intensity detector are utilized. The intensity detector measures the intensity of the combined reflected Doppler signal and the modulated reference beam. Rigorous mathematical formulas have been developed to extract both amplitude and phase from the measured intensity signal. Software in Matlab has been developed and used to extract such amplitude and phase information from the experimental data. Both amplitude and phase are calculated, and the Doppler frequency signature of the object is determined.
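One standard way to recover both amplitude and phase from an intensity-only detector with a sinusoidally modulated reference is the harmonic-ratio (Bessel) method sketched below. This is a textbook approach offered for orientation, not the patent's actual formulas, and all signal parameters are illustrative.

```python
import numpy as np
from scipy.special import jv

# Simulated intensity for I(t) = A + B*cos(phi + beta*sin(2*pi*fm*t)).
fm, beta = 1e3, 1.0                       # modulation frequency (Hz), depth (rad)
fs, T = 1e6, 0.01                         # sample rate (Hz), record length (s)
t = np.arange(0, T, 1 / fs)

A_true, B_true, phi_true = 2.0, 0.7, 1.1  # DC level, fringe amplitude, phase
I = A_true + B_true * np.cos(phi_true + beta * np.sin(2 * np.pi * fm * t))

# Lock-in extraction of the first harmonic (sin) and second harmonic (cos).
H1 = 2 * np.mean(I * np.sin(2 * np.pi * fm * t))      # = -2 B J1(beta) sin(phi)
H2 = 2 * np.mean(I * np.cos(2 * np.pi * 2 * fm * t))  # =  2 B J2(beta) cos(phi)

# Divide out the Bessel factors, then recover phase and amplitude.
s, c = -H1 / (2 * jv(1, beta)), H2 / (2 * jv(2, beta))
phi, B = np.arctan2(s, c), np.hypot(s, c)
print(f"recovered phase {phi:.3f} rad (true {phi_true}), amplitude {B:.3f} (true {B_true})")
```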
Children's Ability to Comprehend Main Ideas After Reading Expository Prose.
ERIC Educational Resources Information Center
Baumann, James F.
A study was conducted to evaluate children's ability to comprehend main ideas after reading connected discourse and to develop and validate a straightforward and intuitively simple system for identifying main ideas in prose. Three experimental passages were randomly selected from third and sixth grade social studies textbooks, and education…
Management Strategies for Promoting Teacher Collective Learning
ERIC Educational Resources Information Center
Cheng, Eric C. K.
2011-01-01
This paper aims to validate a theoretical model for developing teacher collective learning by using a quasi-experimental design, and explores the management strategies that would provide a school administrator practical steps to effectively promote collective learning in the school organization. Twenty aided secondary schools in Hong Kong were…
Key Skills Influencing Student Achievement
ERIC Educational Resources Information Center
Balch, Tonya; Gruenert, Steve
2009-01-01
A predictive, non-experimental, cross-sectional design (Johnson, 2001) was used to conduct a study to determine if elementary administrators' key counseling skills and select demographics predicted state-level student performance indicators in their respective schools. A secondary purpose of this study was to develop a valid and reliable on-line…
Design and Effects of a Concept Focused Discussion Environment in E-Learning
ERIC Educational Resources Information Center
Yilmaz, Erdi Okan; Yurdugul, Halil
2016-01-01
Problem Statement: Within the frame of learning management systems, this study develops a concept focused discussion environment and validates the effectiveness of this environment's use through an experimental study. Purpose of the Study: Online discussion forums, which are commonly used in learning management systems (LMS), can negatively…
USDA-ARS?s Scientific Manuscript database
Objective - To develop a noninvasive biomarker based Mycobacterium bovis specific detection system to track infection in domestic and wild animals. Design – Experimental longitudinal study for discovery and cross sectional design for validation Animals - Yearling white-tailed deer fawns (n=8) were ...
Development and Validation of a Monte Carlo Simulation Tool for Multi-Pinhole SPECT
Mok, Greta S. P.; Du, Yong; Wang, Yuchuan; Frey, Eric C.; Tsui, Benjamin M. W.
2011-01-01
Purpose In this work, we developed and validated a Monte Carlo simulation (MCS) tool for investigation and evaluation of multi-pinhole (MPH) SPECT imaging. Procedures This tool was based on a combination of the SimSET and MCNP codes. Photon attenuation and scatter in the object, as well as penetration and scatter through the collimator detector, are modeled in this tool. It allows accurate and efficient simulation of MPH SPECT with focused pinhole apertures and user-specified photon energy, aperture material, and imaging geometry. The MCS method was validated by comparing the point response function (PRF), detection efficiency (DE), and image profiles obtained from point sources and phantom experiments. A prototype single-pinhole collimator and focused four- and five-pinhole collimators fitted on a small animal imager were used for the experimental validations. We have also compared computational speed among various simulation tools for MPH SPECT, including SimSET-MCNP, MCNP, SimSET-GATE, and GATE for simulating projections of a hot sphere phantom. Results We found good agreement between the MCS and experimental results for PRF, DE, and image profiles, indicating the validity of the simulation method. The relative computational speeds for SimSET-MCNP, MCNP, SimSET-GATE, and GATE are 1: 2.73: 3.54: 7.34, respectively, for 120-view simulations. We also demonstrated the application of this MCS tool in small animal imaging by generating a set of low-noise MPH projection data of a 3D digital mouse whole body phantom. Conclusions The new method is useful for studying MPH collimator designs, data acquisition protocols, image reconstructions, and compensation techniques. It also has great potential to be applied for modeling the collimator-detector response with penetration and scatter effects for MPH in the quantitative reconstruction method. PMID:19779896
Experimental investigation of an RNA sequence space
NASA Technical Reports Server (NTRS)
Lee, Youn-Hyung; Dsouza, Lisa; Fox, George E.
1993-01-01
Modern rRNAs are the historic consequence of an ongoing evolutionary exploration of a sequence space. These extant sequences belong to a special subset of the sequence space that is comprised only of those primary sequences that can validly perform the biological function(s) required of the particular RNA. If it were possible to readily identify all such valid sequences, stochastic predictions could be made about the relative likelihood of various evolutionary pathways available to an RNA. Herein an experimental system which can assess whether a particular sequence is likely to have validity as a eubacterial 5S rRNA is described. A total of ten naturally occurring, and hence known to be valid, sequences and two point mutants of unknown validity were used to test the usefulness of the approach. Nine of the ten valid sequences tested positive whereas both mutants tested as clearly defective. The tenth valid sequence gave results that would be interpreted as reflecting a borderline status were the answer not known. These results demonstrate that it is possible to experimentally determine which sequences in local regions of the sequence space are potentially valid 5S rRNAs.
2013-01-01
In this paper, we develop and validate a method to identify computationally efficient site- and patient-specific models of ultrasound thermal therapies from MR thermal images. The models of the specific absorption rate of the transduced energy and the temperature response of the therapy target are identified in the reduced basis of proper orthogonal decomposition of thermal images, acquired in response to a mild thermal test excitation. The method permits dynamic reidentification of the treatment models during the therapy by recursively utilizing newly acquired images. Such adaptation is particularly important during high-temperature therapies, which are known to substantially and rapidly change tissue properties and blood perfusion. The developed theory was validated for the case of focused ultrasound heating of a tissue phantom. The experimental and computational results indicate that the developed approach produces accurate low-dimensional treatment models despite temporal and spatial noises in MR images and slow image acquisition rate. PMID:22531754
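A minimal sketch of the POD-based identification idea follows: extract a reduced basis from image snapshots, then identify dynamics in the reduced coordinates by least squares. The snapshot data are synthetic, and the paper's model structure (separate absorption and temperature-response terms, recursive re-identification) is richer than this.

```python
import numpy as np

# Synthetic stand-in for a series of MR thermal images: n_pixels per frame,
# n_frames in time, reduced to r POD modes.
rng = np.random.default_rng(1)
n_pixels, n_frames, r = 1024, 60, 4
snapshots = rng.normal(size=(n_pixels, n_frames)).cumsum(axis=1)

# POD basis: leading left singular vectors of the snapshot matrix.
U, s, Vt = np.linalg.svd(snapshots, full_matrices=False)
Phi = U[:, :r]                       # reduced basis (n_pixels x r)
a = Phi.T @ snapshots                # reduced coordinates over time (r x n_frames)

# Identify a discrete-time linear model a[k+1] ~ A a[k] in the reduced space.
A_red, *_ = np.linalg.lstsq(a[:, :-1].T, a[:, 1:].T, rcond=None)
pred = A_red.T @ a[:, :-1]
err = np.linalg.norm(pred - a[:, 1:]) / np.linalg.norm(a[:, 1:])
print(f"one-step relative error in reduced space: {err:.3f}")
```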
Quantitative validation of carbon-fiber laminate low velocity impact simulations
English, Shawn A.; Briggs, Timothy M.; Nelson, Stacy M.
2015-09-26
Simulations of low velocity impact with a flat cylindrical indenter upon a carbon fiber fabric reinforced polymer laminate are rigorously validated. Comparison of the impact energy absorption between the model and experiment is used as the validation metric. Additionally, non-destructive evaluation, including ultrasonic scans and three-dimensional computed tomography, provides qualitative validation of the models. The simulations include delamination, matrix cracks and fiber breaks. An orthotropic damage and failure constitutive model, capable of predicting progressive damage and failure, is developed in conjunction with the simulations and described. An ensemble of simulations incorporating model parameter uncertainties is used to predict a response distribution, which is then compared to the experimental output using appropriate statistical methods. Lastly, the model form errors are exposed and corrected for use in an additional blind validation analysis. The result is a quantifiable confidence in material characterization and model physics when simulating low velocity impact in structures of interest.
Validation of Heat Transfer Thermal Decomposition and Container Pressurization of Polyurethane Foam.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Scott, Sarah Nicole; Dodd, Amanda B.; Larsen, Marvin E.
Polymer foam encapsulants provide mechanical, electrical, and thermal isolation in engineered systems. In fire environments, gas pressure from thermal decomposition of polymers can cause mechanical failure of sealed systems. In this work, a detailed uncertainty quantification study of PMDI-based polyurethane foam is presented to assess the validity of the computational model. Both experimental measurement uncertainty and model prediction uncertainty are examined and compared. Both the mean value method and the Latin hypercube sampling approach are used to propagate the uncertainty through the model. In addition to comparing computational and experimental results, the importance of each input parameter on the simulation result is also investigated. These results show that further development in the physics model of the foam and appropriate associated material testing are necessary to improve model accuracy.
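For the Latin hypercube propagation step, a minimal sketch using SciPy's QMC sampler is shown below; the three uncertain inputs, their ranges, and the peak-pressure response are hypothetical stand-ins for the report's foam decomposition model.

```python
import numpy as np
from scipy.stats import qmc

# Latin hypercube sample in the unit cube, then scaled to assumed parameter
# ranges: activation energy (J/mol), heat of reaction (J/kg), vent area (m^2).
sampler = qmc.LatinHypercube(d=3, seed=0)
unit = sampler.random(n=200)
lo = np.array([1.0e5, 2.0e5, 1.0e-6])
hi = np.array([1.4e5, 4.0e5, 5.0e-6])
params = qmc.scale(unit, lo, hi)

def peak_pressure(E, dH, A_vent):
    # Placeholder response surface, NOT the report's physics model.
    return 1e5 + dH / 50.0 * np.exp(-(E - 1.0e5) / 4e4) / (1 + 1e5 * A_vent)

p = peak_pressure(*params.T)
print(f"peak pressure: mean {p.mean():.3g} Pa, 95th pct {np.percentile(p, 95):.3g} Pa")
```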
Turbine-99 unsteady simulations - Validation
NASA Astrophysics Data System (ADS)
Cervantes, M. J.; Andersson, U.; Lövgren, H. M.
2010-08-01
The Turbine-99 test case, a Kaplan draft tube model, aimed to determine the state of the art in draft tube simulation. Three workshops were organized on the matter, in 1999, 2001 and 2005, where the geometry and experimental data were provided as boundary conditions to the participants. Since the last workshop, computational power and flow modelling have advanced, and the available data have been completed with unsteady pressure measurements and phase-resolved velocity measurements in the cone. This new set of data, together with the corresponding phase-resolved velocity boundary conditions, offers new possibilities to validate unsteady numerical simulations of Kaplan draft tubes. The present work presents simulations of the Turbine-99 test case with time-dependent, angularly resolved inlet velocity boundary conditions. Different grids and time steps are investigated. The results are compared to experimental time-dependent pressure and velocity measurements.
Roy, Gilles; Roy, Nathalie
2008-03-20
A multiple-field-of-view (MFOV) lidar is used to characterize the size and optical depth of low-concentration bioaerosol clouds. The concept relies on the measurement of forward-scattered light using the background aerosols at various distances behind a subvisible cloud. It also relies on the subtraction of the background aerosol forward-scattering contribution and on the partial attenuation of the first-order backscattering. The validity of the concept, developed to retrieve the effective diameter and the optical depth of low-concentration bioaerosol clouds with good precision, is demonstrated using simulation results and experimental MFOV lidar measurements. Calculations also show that the method can be extended to the retrieval of clouds of small optical depth.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tralshawala, Nilesh; Howard, Don; Knight, Bryon
2008-02-28
In conventional infrared thermography, determination of thermal diffusivity requires thickness information. Recently GE has been experimenting with the use of lateral heat flow to determine thermal diffusivity without thickness information. This work builds on previous work at NASA Langley and Wayne State University, but we incorporate thermal time-of-flight (tof) analysis rather than curve fitting to obtain quantitative information. We have developed appropriate theoretical models and a tof-based data analysis framework to experimentally determine all components of thermal diffusivity from the time-temperature measurements. Initial validation was carried out using finite difference simulations. Experimental validation was done using anisotropic carbon fiber reinforced polymer (CFRP) composites. We found that in the CFRP samples used, the in-plane component of diffusivity is about eight times larger than the through-thickness component.
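The time-of-flight idea can be illustrated with the ideal one-dimensional result that the temperature pulse from an instantaneous source peaks at t* = x²/(2α) at lateral distance x, so α follows from a linear fit of x² against 2·t_peak. The measured arrival times below are invented, and the report's anisotropic analysis is more general than this sketch.

```python
import numpy as np

# Hypothetical lateral distances and measured peak arrival times.
x = np.array([2e-3, 4e-3, 6e-3, 8e-3])          # lateral distances (m)
t_peak = np.array([0.40, 1.62, 3.58, 6.45])     # peak arrival times (s)

# Fit x^2 = alpha * (2 * t_peak), i.e. the 1-D point-source peak-time relation.
alpha, *_ = np.linalg.lstsq(2 * t_peak[:, None], x**2, rcond=None)
print(f"in-plane diffusivity ~ {alpha[0]:.2e} m^2/s")
```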
Rodent models of insomnia: a review of experimental procedures that induce sleep disturbances.
Revel, Florent G; Gottowik, Juergen; Gatti, Sylvia; Wettstein, Joseph G; Moreau, Jean-Luc
2009-06-01
Insomnia, the most common sleep disorder, is characterized by persistent difficulty in falling or staying asleep despite adequate opportunity to sleep, leading to daytime fatigue and mental dysfunction. As sleep is a sophisticated physiological process generated by a network of neuronal systems that cannot be reproduced in-vitro, pre-clinical development of hypnotic drugs requires in-vivo investigations. Accordingly, this review critically evaluates current and putative rodent models of insomnia which could be used to screen novel hypnotics. Only few valid insomnia models are currently available, although many experimental conditions lead to disturbance of physiological sleep. We categorized these conditions as a function of the procedure used to induce perturbation of sleep, and we discuss their respective advantages and pitfalls with respect to validity, feasibility and translational value to human research.
Neutron capture on short-lived nuclei via the surrogate (d,pγ) reaction
NASA Astrophysics Data System (ADS)
Cizewski, Jolie A.; Ratkiewicz, Andrew
2018-05-01
Rapid r-process nucleosynthesis is responsible for the creation of about half of the elements heavier than iron. Neutron capture on short-lived nuclei in cold processes or during freeze-out from hot processes can have a significant impact on the final observed r-process abundances. We are validating the (d,pγ) reaction as a surrogate for neutron capture with measurements on 95Mo targets and a focus on discrete transitions. The experimental results have been analyzed within the Hauser-Feshbach approach, with non-elastic breakup of the deuteron providing a neutron to be captured. Preliminary results support the (d,pγ) reaction as a valid surrogate for neutron capture. We are poised to measure the (d,pγ) reaction in inverse kinematics with unstable beams following the development of the experimental techniques.
Validation of Laser-Induced Fluorescent Photogrammetric Targets on Membrane Structures
NASA Technical Reports Server (NTRS)
Jones, Thomas W.; Dorrington, Adrian A.; Shortis, Mark R.; Hendricks, Aron R.
2004-01-01
The need for static and dynamic characterization of a new generation of inflatable space structures requires the advancement of classical metrology techniques. A new photogrammetric-based method for non-contact ranging and surface profiling has been developed at NASA Langley Research Center (LaRC) to support modal analyses and structural validation of this class of space structures. This full field measurement method, known as Laser-Induced Fluorescence (LIF) photogrammetry, has previously yielded promising experimental results. However, data indicating the achievable measurement precision had not been published. This paper provides experimental results that indicate the LIF-photogrammetry measurement precision for three different target types used on a reflective membrane structure. The target types were: (1) non-contact targets generated using LIF, (2) surface attached retro-reflective targets, and (3) surface attached diffuse targets. Results from both static and dynamic investigations are included.
Thermodynamic modeling and experimental validation of the Fe-Al-Ni-Cr-Mo alloy system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Teng, Zhenke; Zhang, F; Miller, Michael K
2012-01-01
NiAl-type precipitate-strengthened ferritic steels have been known as potential materials for steam turbine applications. In this study, thermodynamic descriptions of the B2-NiAl type nano-scaled precipitates and the body-centered-cubic (BCC) Fe matrix phase for four alloys based on the Fe-Al-Ni-Cr-Mo system were developed as a function of the alloy composition at the aging temperature. The calculated phase structure, composition, and volume fraction were validated by experimental investigations using synchrotron X-ray diffraction and atom probe tomography. With the ability to accurately predict the key microstructural features related to the mechanical properties in a given alloy system, the established thermodynamic model in the current study may significantly accelerate the alloy design process of NiAl-strengthened ferritic steels.
Wightman, Jade; Julio, Flávia; Virués-Ortega, Javier
2014-05-01
Experimental functional analysis is an assessment methodology to identify the environmental factors that maintain problem behavior in individuals with developmental disabilities and in other populations. Functional analysis provides the basis for the development of reinforcement-based approaches to treatment. This article reviews the procedures, validity, and clinical implementation of the methodological variations of functional analysis and function-based interventions. We present six variations of functional analysis methodology in addition to the typical functional analysis: brief functional analysis, single-function tests, latency-based functional analysis, functional analysis of precursors, and trial-based functional analysis. We also present the three general categories of function-based interventions: extinction, antecedent manipulation, and differential reinforcement. Functional analysis methodology is a valid and efficient approach to the assessment of problem behavior and the selection of treatment strategies.
Experimental Validation and Combustion Modeling of a JP-8 Surrogate in a Single Cylinder Diesel Engine
Shrestha, Amit; Joshi, Umashankar; Zheng, Ziliang; Badawy, Tamer; Henein, Naeim A. (Wayne State University, Detroit, MI, USA)
2014-04-15
[Report fragment: validates a two-component JP-8 surrogate in a single cylinder diesel engine; validation parameters include ignition delay.]
The space shuttle payload planning working groups. Volume 8: Earth and ocean physics
NASA Technical Reports Server (NTRS)
1973-01-01
The findings and recommendations of the Earth and Ocean Physics working group of the space shuttle payload planning activity are presented. The requirements for the space shuttle mission are defined as: (1) precision measurement for earth and ocean physics experiments, (2) development and demonstration of new and improved sensors and analytical techniques, (3) acquisition of surface truth data for evaluation of new measurement techniques, (4) conduct of critical experiments to validate geophysical phenomena and instrumental results, and (5) development and validation of analytical/experimental models for global ocean dynamics and solid earth dynamics/earthquake prediction. Tables of data are presented to show the flight schedule, estimated costs, and the mission model.
Study of steam condensation at sub-atmospheric pressure: setting a basic research using MELCOR code
NASA Astrophysics Data System (ADS)
Manfredini, A.; Mazzini, M.
2017-11-01
One of the most serious accidents that can occur in the experimental nuclear fusion reactor ITER is the break of one of the headers of the refrigeration system of the first wall of the Tokamak. This results in water-steam mixture discharge into the vacuum vessel (VV), with consequent pressurization of this container. To prevent the pressure in the VV from exceeding 150 kPa absolute, a system discharges the steam inside a suppression pool, at an absolute pressure of 4.2 kPa. The computer codes used to analyze such an incident (e.g., RELAP5 or MELCOR) are not validated experimentally for such conditions. Therefore, we planned a basic research program in order to obtain experimental data useful for validating the heat transfer correlations used in these codes. After a thorough literature search on this topic, ACTA, in collaboration with the staff of ITER, defined the experimental matrix and performed the design of the experimental apparatus. For the thermal-hydraulic design of the experiments, we executed a series of calculations with MELCOR. This code, however, was used in an unconventional mode, with the development of models suited respectively to low and high steam flow-rate tests. The article concludes with a discussion of the placement of the experimental data within the map characterizing the phenomenon, showing the importance of the new knowledge acquired, particularly in the case of chugging.
Gantner, Melisa E; Peroni, Roxana N; Morales, Juan F; Villalba, María L; Ruiz, María E; Talevi, Alan
2017-08-28
Breast Cancer Resistance Protein (BCRP) is an ATP-dependent efflux transporter linked to the multidrug resistance phenomenon in many diseases such as epilepsy and cancer, and a potential source of drug interactions. For these reasons, the early identification of substrates and nonsubstrates of this transporter during the drug discovery stage is of great interest. We have developed a computational nonlinear model ensemble based on conformation-independent molecular descriptors using a combined strategy of genetic algorithms, J48 decision tree classifiers, and data fusion. The best model ensemble consists of averaging the rankings of the 12 decision trees that showed the best performance on the training set; it also demonstrated good performance on the test set. It was experimentally validated using the ex vivo everted rat intestinal sac model. Five anticonvulsant drugs classified as nonsubstrates for BCRP by the model ensemble were experimentally evaluated, and none of them proved to be a BCRP substrate under the experimental conditions used, thus confirming the predictive ability of the model ensemble. The model ensemble reported here is a potentially valuable tool to be used as an in silico ADME filter in computer-aided drug discovery campaigns intended to overcome BCRP-mediated multidrug resistance issues and to prevent drug-drug interactions.
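As a rough illustration of the data-fusion step described above, the sketch below averages the rankings produced by a small ensemble of decision trees. It is a minimal sketch under stated assumptions: scikit-learn's CART trees stand in for WEKA's J48 (C4.5) classifiers, a random descriptor subset stands in for the genetic-algorithm selection, and all arrays are dummy data rather than the published descriptor set.

```python
# Illustrative sketch of rank-averaging ("data fusion") over a tree ensemble.
# Assumptions: CART stands in for J48; random feature subsets stand in for
# GA-based descriptor selection; all data are hypothetical placeholders.
import numpy as np
from scipy.stats import rankdata
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(60, 20))        # dummy conformation-independent descriptors
y = rng.integers(0, 2, size=60)      # 1 = BCRP substrate, 0 = nonsubstrate (dummy labels)
X_test = rng.normal(size=(10, 20))   # dummy compounds to be ranked

n_trees, n_feats = 12, 8
rank_sum = np.zeros(X_test.shape[0])
for seed in range(n_trees):
    feats = rng.choice(X.shape[1], n_feats, replace=False)  # crude stand-in for GA selection
    tree = DecisionTreeClassifier(max_depth=4, random_state=seed).fit(X[:, feats], y)
    score = tree.predict_proba(X_test[:, feats])[:, 1]      # P(substrate) per compound
    rank_sum += rankdata(score)                             # fuse by rank, not raw score

avg_rank = rank_sum / n_trees  # higher average rank -> more substrate-like
print(avg_rank)
```

Averaging ranks rather than raw probabilities preserves each tree's ordering information, which is what lets the fused ensemble act as a ranking filter.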
Gupta, T C
2007-08-01
A 15-degrees-of-freedom lumped parameter vibratory model of the human body is developed for vertical mode vibrations using anthropometric data of the 50th percentile US male. The mass and stiffness of the various segments are determined from the elastic moduli of bones and tissues and from the anthropometric data available, assuming the shape of all segments to be ellipsoidal. The damping ratio of each segment is estimated on the basis of the physical structure of the body in a particular posture. Damping constants of the various segments are calculated from these damping ratios. The human body is modeled as a linear spring-mass-damper system. The optimal values of the damping ratios of the body segments are estimated for the 15-degrees-of-freedom model of the 50th percentile US male by comparing the response of the model with the experimental response. The modeling procedure is validated by formulating a similar vibratory model of the 50th percentile Indian male and comparing the frequency response of the model with the experimental response of the same group of subjects. A range of damping ratios has been considered to develop a vibratory model which can predict the vertical harmonic response of the human body.
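The linear spring-mass-damper formulation above can be illustrated with a much smaller chain model. The sketch below assembles mass, stiffness, and damping matrices for a hypothetical 3-DOF chain and solves the steady harmonic response; a 15-DOF model performs the same computation with larger matrices. All parameter values are invented placeholders, not the paper's anthropometric data.

```python
# Minimal sketch of an n-DOF spring-mass-damper chain under harmonic forcing.
# Convention: k[i], c[i] connect mass i to mass i-1 (mass 0 to ground).
import numpy as np

m = np.array([10.0, 30.0, 15.0])    # kg  (hypothetical segment masses)
k = np.array([4e4, 8e4, 5e4])       # N/m (hypothetical segment stiffnesses)
c = np.array([400.0, 800.0, 500.0]) # N.s/m (hypothetical damping constants)

n = len(m)
M = np.diag(m)
K = np.zeros((n, n)); C = np.zeros((n, n))
for i in range(n):                  # assemble the chain topology
    K[i, i] += k[i]; C[i, i] += c[i]
    if i + 1 < n:
        K[i, i] += k[i+1]; K[i, i+1] -= k[i+1]; K[i+1, i] -= k[i+1]
        C[i, i] += c[i+1]; C[i, i+1] -= c[i+1]; C[i+1, i] -= c[i+1]

F = np.zeros(n); F[0] = 1.0         # unit harmonic force at the lowest mass
for f_hz in (2.0, 5.0, 8.0):        # sweep a few excitation frequencies
    w = 2 * np.pi * f_hz
    X = np.linalg.solve(K + 1j * w * C - w**2 * M, F)  # harmonic balance
    print(f_hz, np.abs(X))          # displacement amplitude per segment
```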
Mechanical Behavior of Dowel-Type Joints Made of Wood Scrimber Composite
He, Minjuan; Tao, Duo; Li, Zheng; Li, Maolin
2016-01-01
As a renewable building material with low embodied energy characteristics, wood has gained more and more attention in the green and sustainable building industry. In terms of material resource and physical properties, scrimber composite not only makes full use of fast-growing wood species, but also has better mechanical performance and less inherent variability than natural wood material. In this study, the mechanical behavior of bolted beam-to-column joints built with a kind of scrimber composite was investigated both experimentally and numerically. Two groups of specimens were tested under monotonic and low frequency cyclic loading protocols. The experimental results showed that the bolted joints built with scrimber composite performed well in initial stiffness, ductility, and energy dissipation. A three-dimensional (3D) non-linear finite element model (FEM) for the bolted beam-to-column joints was then developed and validated by experimental results. The validated model was further used to investigate the failure mechanism of the bolted joints through stress analysis. This study can contribute to the application of the proposed scrimber composite in structural engineering, and the developed FEM can serve as a useful tool to evaluate the mechanical behavior of such bolted beam-to-column joints with different configurations in future research. PMID:28773703
Aksoy, Bülent Arman; Dančík, Vlado; Smith, Kenneth; Mazerik, Jessica N.; Ji, Zhou; Gross, Benjamin; Nikolova, Olga; Jaber, Nadia; Califano, Andrea; Schreiber, Stuart L.; Gerhard, Daniela S.; Hermida, Leandro C.; Jagu, Subhashini
2017-01-01
The Cancer Target Discovery and Development (CTD2) Network aims to use functional genomics to accelerate the translation of high-throughput and high-content genomic and small-molecule data towards use in precision oncology. As part of this goal, and to share its conclusions with the research community, the Network developed the ‘CTD2 Dashboard’ [https://ctd2-dashboard.nci.nih.gov/], which compiles CTD2 Network-generated conclusions, termed ‘observations’, associated with experimental entities, collected by its member groups (‘Centers’). Any researcher interested in learning about a given gene, protein, or compound (a ‘subject’) studied by the Network can come to the CTD2 Dashboard to quickly and easily find, review, and understand Network-generated experimental results. In particular, the Dashboard allows visitors to connect experiments about the same target, biomarker, etc., carried out by multiple Centers in the Network. The Dashboard’s unique knowledge representation allows information to be compiled around a subject, so as to become greater than the sum of the individual contributions. The CTD2 Network has broadly defined levels of validation for evidence (‘Tiers’) pertaining to a particular finding, and the CTD2 Dashboard uses these Tiers to indicate the extent to which results have been validated. Researchers can use the Network’s insights and tools to develop a new hypothesis or confirm existing hypotheses, in turn advancing the findings towards clinical applications. Database URL: https://ctd2-dashboard.nci.nih.gov/ PMID:29220450
Modelling of polymer photodegradation for solar cell modules
NASA Technical Reports Server (NTRS)
Somersall, A. C.; Guillet, J. E.
1981-01-01
A computer program is evaluated that was developed to model and calculate, by numerical integration, the varying concentrations of chemical species formed during photooxidation of a polymeric material over time, using as input data a chosen set of elementary reactions, corresponding rate constants, and a convenient set of starting conditions. Attempts were made to validate the proposed mechanism by experimentally monitoring the photooxidation products of small liquid alkanes, which are useful starting models for the ethylene segments of polymers like EVA. The model system proved inappropriate for the intended purposes. Another validation model is recommended.
The development of methods for predicting and measuring distribution patterns of aerial sprays
NASA Technical Reports Server (NTRS)
Ormsbee, A. I.; Bragg, M. B.; Maughmer, M. D.
1979-01-01
The capability of conducting scale-model experiments which involve the ejection of small particles into the wake of an aircraft close to the ground is developed. A set of relationships used to scale small-sized dispersion studies to full-size results is experimentally verified and, with some qualifications, basic deposition patterns are presented. In the process of validating these scaling laws, the basic experimental techniques used in conducting such studies, both with and without an operational propeller, were developed. The procedures that evolved are outlined. The envelope of test conditions that can be accommodated in the Langley Vortex Research Facility, which was developed theoretically, is verified using a series of vortex trajectory experiments that help to define the limitations due to wall interference effects for models of different sizes.
Assessing Students' Understanding of Macroevolution: Concerns regarding the validity of the MUM
NASA Astrophysics Data System (ADS)
Novick, Laura R.; Catley, Kefyn M.
2012-11-01
In a recent article, Nadelson and Southerland (2010. Development and preliminary evaluation of the Measure of Understanding of Macroevolution: Introducing the MUM. The Journal of Experimental Education, 78, 151-190) reported on their development of a multiple-choice concept inventory intended to assess college students' understanding of macroevolutionary concepts, the Measure of Understanding Macroevolution (MUM). Given that the only existing evolution inventories assess understanding of natural selection, a microevolutionary concept, a valid assessment of students' understanding of macroevolution would be a welcome and necessary addition to the field of science education. Although the conceptual framework underlying Nadelson and Southerland's test is promising, we believe the test has serious shortcomings with respect to validity evidence for the construct being tested. We argue and provide evidence that these problems are serious enough that the MUM should not be used in its current form to measure students' understanding of macroevolution.
Validation and Continued Development of Methods for Spheromak Simulation
NASA Astrophysics Data System (ADS)
Benedett, Thomas
2017-10-01
The HIT-SI experiment has demonstrated stable sustainment of spheromaks. Determining how the underlying physics extrapolate to larger, higher-temperature regimes is of prime importance in determining the viability of the inductively-driven spheromak. It is thus prudent to develop and validate a computational model that can be used to study current results and study the effect of possible design choices on plasma behavior. An extended MHD model has shown good agreement with experimental data at 14 kHz injector operation. Efforts to extend the existing validation to a range of higher frequencies (36, 53, 68 kHz) using the PSI-Tet 3D extended MHD code will be presented, along with simulations of potential combinations of flux conserver features and helicity injector configurations and their impact on current drive performance, density control, and temperature for future SIHI experiments. Work supported by USDoE.
DEVELOPMENT AND VALIDATION OF A MULTIFIELD MODEL OF CHURN-TURBULENT GAS/LIQUID FLOWS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Elena A. Tselishcheva; Steven P. Antal; Michael Z. Podowski
The accuracy of numerical predictions for gas/liquid two-phase flows using Computational Multiphase Fluid Dynamics (CMFD) methods strongly depends on the formulation of models governing the interaction between the continuous liquid field and bubbles of different sizes. The purpose of this paper is to develop, test and validate a multifield model of adiabatic gas/liquid flows at intermediate gas concentrations (e.g., churn-turbulent flow regime), in which multiple-size bubbles are divided into a specified number of groups, each representing a prescribed range of sizes. The proposed modeling concept uses transport equations for the continuous liquid field and for each bubble field. The overall model has been implemented in the NPHASE-CMFD computer code. The results of NPHASE-CMFD simulations have been validated against the experimental data from the TOPFLOW test facility. Also, a parametric analysis on the effect of various modeling assumptions has been performed.
Dynamic modelling and experimental validation of three wheeled tilting vehicles
NASA Astrophysics Data System (ADS)
Amati, Nicola; Festini, Andrea; Pelizza, Luigi; Tonoli, Andrea
2011-06-01
The present paper describes the study of the stability in straight running of a three-wheeled tilting vehicle for urban and sub-urban mobility. The analysis was carried out by developing a multibody model in the Matlab/Simulink SimMechanics environment. An Adams-Motorcycle model and an equivalent analytical model were developed for cross-validation and for highlighting the similarities with the lateral dynamics of motorcycles. Field tests were carried out to validate the model and identify some critical parameters, such as the damping on the steering system. The stability analysis demonstrates that the lateral dynamic motions are characterised by vibration modes that are similar to those of a motorcycle. Additionally, it shows that the wobble mode is significantly affected by the castor trail, whereas it is only slightly affected by the dynamics of the front suspension. For the present case study, the frame compliance also has no influence on the weave and wobble.
Finite-element analysis of NiTi wire deflection during orthodontic levelling treatment
NASA Astrophysics Data System (ADS)
Razali, M. F.; Mahmud, A. S.; Mokhtar, N.; Abdullah, J.
2016-02-01
Finite-element analysis is an important product development tool in the medical devices industry for design and failure analysis of devices. This tool helps device designers to quickly explore various design options, optimize specific designs, and gain deeper insight into how a device is actually performing. In this study, three-dimensional finite-element models of a superelastic nickel-titanium arch wire engaged in a three-bracket system were developed. The aim was to measure the effect of binding friction developed at the wire-bracket interaction on the remaining recovery force available for tooth movement. Uniaxial and three-bracket bending tests were modelled and validated against experimental work. The predictions made by the three-bracket bending models show good agreement with the experimental results.
ERIC Educational Resources Information Center
Rossi, Robert Joseph
Methods drawn from four logical theories associated with studies of inductive processes are applied to the assessment and evaluation of experimental episode construct validity. It is shown that this application provides for estimates of episode informativeness with respect to the person examined in terms of the construct and to the construct…
NASA Astrophysics Data System (ADS)
Sadi, Maryam
2018-01-01
In this study a group method of data handling (GMDH) model has been successfully developed to predict the heat capacity of ionic liquid based nanofluids by considering reduced temperature, acentric factor and molecular weight of the ionic liquids, and nanoparticle concentration as input parameters. In order to accomplish the modeling, 528 experimental data points extracted from the literature were divided into training and testing subsets. The training set was used to determine the model coefficients and the testing set was applied for model validation. The ability and accuracy of the developed model have been evaluated by comparing model predictions with experimental values using different statistical parameters such as the coefficient of determination, mean square error and mean absolute percentage error. The mean absolute percentage errors of the developed model for the training and testing sets are 1.38% and 1.66%, respectively, which indicates excellent agreement between model predictions and experimental data. Also, the results estimated by the developed GMDH model exhibit a higher accuracy when compared to the available theoretical correlations.
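For reference, the statistical parameters quoted above (coefficient of determination, mean square error, mean absolute percentage error) can be computed as in the short sketch below; the prediction and measurement arrays are dummies, not the 528-point literature data set.

```python
# Goodness-of-fit statistics used in the validation described above.
import numpy as np

def fit_stats(y_true, y_pred):
    resid = y_true - y_pred
    mse = np.mean(resid**2)                                    # mean square error
    mape = 100.0 * np.mean(np.abs(resid / y_true))             # mean absolute % error
    r2 = 1.0 - np.sum(resid**2) / np.sum((y_true - y_true.mean())**2)
    return r2, mse, mape

y_exp = np.array([395.0, 410.0, 428.0, 450.0])  # dummy heat capacities, J/(mol.K)
y_mod = np.array([398.0, 406.0, 431.0, 447.0])  # dummy model predictions
print(fit_stats(y_exp, y_mod))
```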
NASA Technical Reports Server (NTRS)
Nguyen, Quang-Viet; Kojima, Jun
2005-01-01
Researchers from NASA Glenn Research Center's Combustion Branch and the Ohio Aerospace Institute (OAI) have developed a transferable calibration standard for an optical technique called spontaneous Raman scattering (SRS) in high-pressure flames. SRS is perhaps the only technique that provides spatially and temporally resolved, simultaneous multiscalar measurements in turbulent flames. Such measurements are critical for the validation of numerical models of combustion. This study has been a combined experimental and theoretical effort to develop a spectral calibration database for multiscalar diagnostics using SRS in high-pressure flames. However, in the past such measurements have used a one-of-a-kind experimental setup and a setup-dependent calibration procedure to empirically account for spectral interferences, or crosstalk, among the major species of interest. Such calibration procedures, being non-transferable, are prohibitively expensive to duplicate. A goal of this effort is to provide an SRS calibration database using transferable standards that can be implemented widely by other researchers for both atmospheric-pressure and high-pressure (less than 30 atm) SRS studies. A secondary goal of this effort is to provide quantitative multiscalar diagnostics in high-pressure environments to validate computational combustion codes.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Burge, S.W.
Erosion has been identified as one of the significant design issues in fluid beds. A cooperative R&D venture of industry, research, and government organizations was recently formed to meet the industry need for a better understanding of erosion in fluid beds. Research focused on bed hydrodynamics, which are considered to be the primary erosion mechanism. As part of this work, ANL developed an analytical model (FLUFIX) for bed hydrodynamics. Partial validation was performed using data from experiments sponsored by the research consortium. Development of a three-dimensional fluid bed hydrodynamic model was part of Asea-Babcock's in-kind contribution to the R&D venture. This model, FORCE2, was developed by Babcock & Wilcox's Research and Development Division and was based on an existing B&W program and on the gas-solids modeling technology developed by ANL and others. FORCE2 contains many of the features needed to model plant-size beds and therefore can be used along with the erosion technology to assess metal wastage in industrial equipment. As part of the development efforts, FORCE2 was partially validated using ANL's two-dimensional model, FLUFIX, and experimental data. Time constraints as well as the lack of good hydrodynamic data, particularly at the plant scale, prohibited a complete validation of FORCE2. This report describes this initial validation of FORCE2.
Frasson, L; Neubert, J; Reina, S; Oldfield, M; Davies, B L; Rodriguez Y Baena, F
2010-01-01
The popularity of minimally invasive surgical procedures is driving the development of novel, safer and more accurate surgical tools. In this context a multi-part probe for soft tissue surgery is being developed in the Mechatronics in Medicine Laboratory at Imperial College, London. This study reports an optimization procedure using finite element methods for the identification of an interlock geometry able to limit the separation of the segments composing the multi-part probe. An optimal geometry was obtained and the corresponding three-dimensional finite element model validated experimentally. Simulation results are shown to be consistent with the physical experiments. The outcome of this study is an important step in the provision of a novel miniature steerable probe for surgery.
In Vitro Simulation and Validation of the Circulation with Congenital Heart Defects
Figliola, Richard S.; Giardini, Alessandro; Conover, Tim; Camp, Tiffany A.; Biglino, Giovanni; Chiulli, John; Hsia, Tain-Yen
2010-01-01
Despite the recent advances in computational modeling, experimental simulation of the circulation with congenital heart defects using mock flow circuits remains an important tool for device testing, and for detailing the probable flow consequences resulting from surgical and interventional corrections. Validated mock circuits can be applied to qualify the results from novel computational models. New mathematical tools, coupled with advanced clinical imaging methods, allow for improved assessment of experimental circuit performance relative to human function, as well as the potential for patient-specific adaptation. In this review, we address the development of three in vitro mock circuits specific for studies of congenital heart defects. Performance of an in vitro right heart circulation circuit through a series of verification and validation exercises is described, including correlations with animal studies, and quantifying the effects of circuit inertance on test results. We present our experience in the design of mock circuits suitable for investigations of the characteristics of the Fontan circulation. We use one such mock circuit to evaluate the accuracy of Doppler predictions in the presence of aortic coarctation. PMID:21218147
Solution of the lossy nonlinear Tricomi equation with application to sonic boom focusing
NASA Astrophysics Data System (ADS)
Salamone, Joseph A., III
Sonic boom focusing theory has been augmented with new terms that account for mean flow effects in the direction of propagation and for atmospheric absorption/dispersion due to molecular relaxation of oxygen and nitrogen. The newly derived model equation was numerically implemented in a computer code. The computer code was numerically validated using a spectral solution for nonlinear propagation of a sinusoid through a lossy homogeneous medium. An additional numerical check was performed to verify the linear diffraction component of the code calculations. The computer code was experimentally validated using measured sonic boom focusing data from the NASA-sponsored Superboom Caustic Analysis and Measurement Program (SCAMP) flight test. The computer code was in good agreement with both the numerical and experimental validation cases. The newly developed code was applied to examine the focusing of a NASA low-boom demonstration vehicle concept. The resulting pressure field was calculated for several supersonic climb profiles. The shaping efforts designed into the signatures were still somewhat evident despite the effects of sonic boom focusing.
Design and validation of the eyesafe ladar testbed (ELT) using the LadarSIM system simulator
NASA Astrophysics Data System (ADS)
Neilsen, Kevin D.; Budge, Scott E.; Pack, Robert T.; Fullmer, R. Rees; Cook, T. Dean
2009-05-01
The development of an experimental full-waveform LADAR system has been enhanced with the assistance of the LadarSIM system simulation software. The Eyesafe LADAR Test-bed (ELT) was designed as a raster scanning, single-beam, energy-detection LADAR with the capability of digitizing and recording the return pulse waveform at up to 2 GHz for 3D off-line image formation research in the laboratory. To assist in the design phase, the full-waveform LADAR simulation in LadarSIM was used to simulate the expected return waveforms for various system design parameters, target characteristics, and target ranges. Once the design was finalized and the ELT constructed, the measured specifications of the system and experimental data captured from the operational sensor were used to validate the behavior of the system as predicted during the design phase. This paper presents the methodology used, and lessons learned from this "design, build, validate" process. Simulated results from the design phase are presented, and these are compared to simulated results using measured system parameters and operational sensor data. The advantages of this simulation-based process are also presented.
Barros, Wilson; Gochberg, Daniel F.; Gore, John C.
2009-01-01
The description of the nuclear magnetic resonance magnetization dynamics in the presence of long-range dipolar interactions, which is based upon approximate solutions of Bloch–Torrey equations including the effect of a distant dipolar field, has been revisited. New experiments show that approximate analytic solutions have a broader regime of validity as well as dependencies on pulse-sequence parameters that seem to have been overlooked. In order to explain these experimental results, we developed a new method consisting of calculating the magnetization via an iterative formalism where both diffusion and distant dipolar field contributions are treated as integral operators incorporated into the Bloch–Torrey equations. The solution can be organized as a perturbative series, whereby access to higher order terms allows one to set better boundaries on validity regimes for analytic first-order approximations. Finally, the method legitimizes the use of simple analytic first-order approximations under less demanding experimental conditions, it predicts new pulse-sequence parameter dependencies for the range of validity, and clarifies weak points in previous calculations. PMID:19425789
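For orientation, a commonly written rotating-frame form of the Bloch–Torrey equation with a secular distant dipolar field is reproduced below; relaxation terms are omitted and the notation is assumed rather than taken from the paper.

```latex
% Rotating frame; relaxation omitted. B_d is the (secular) distant dipolar field.
\begin{align}
\frac{\partial \mathbf{M}}{\partial t}
  &= \gamma\, \mathbf{M} \times \mathbf{B}_d(\mathbf{r}) + D\, \nabla^2 \mathbf{M},\\
\mathbf{B}_d(\mathbf{r})
  &= \frac{\mu_0}{4\pi} \int d^3 r'\;
     \frac{1 - 3\cos^2\theta_{\mathbf{r}\mathbf{r}'}}{2\,\lvert \mathbf{r}-\mathbf{r}'\rvert^{3}}
     \left[\, 3 M_z(\mathbf{r}')\,\hat{\mathbf{z}} - \mathbf{M}(\mathbf{r}') \,\right].
\end{align}
```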
Scientific, statistical, practical, and regulatory considerations in design space development.
Debevec, Veronika; Srčič, Stanko; Horvat, Matej
2018-03-01
The quality by design (QbD) paradigm guides the pharmaceutical industry towards improved understanding of products and processes, and at the same time facilitates a high degree of manufacturing and regulatory flexibility throughout the establishment of the design space. This review article presents scientific, statistical and regulatory considerations in design space development. All key development milestones, starting with planning, selection of factors, experimental execution, data analysis, model development and assessment, verification, and validation, and ending with design space submission, are presented and discussed. The focus is especially on frequently ignored topics, like management of factors and CQAs that will not be included in experimental design, evaluation of risk of failure on design space edges, or modeling scale-up strategy. Moreover, development of a design space that is independent of manufacturing scale is proposed as the preferred approach.
Development of the Biological Experimental Design Concept Inventory (BEDCI).
Deane, Thomas; Nomme, Kathy; Jeffery, Erica; Pollock, Carol; Birol, Gülnur
2014-01-01
Interest in student conceptions of experimentation inspired the development of a fully validated 14-question inventory on experimental design in biology (BEDCI) by following established best practices in concept inventory (CI) design. This CI can be used to diagnose specific examples of non-expert-like thinking in students and to evaluate the success of teaching strategies that target conceptual changes. We used BEDCI to diagnose non-expert-like student thinking in experimental design at the pre- and posttest stages in five courses (total n = 580 students) at a large research university in western Canada. Calculated difficulty and discrimination metrics indicated that BEDCI questions are able to effectively capture learning changes at the undergraduate level. A high correlation (r = 0.84) between responses by students in similar courses and at the same stage of their academic career also suggests that the test is reliable. Students showed significant positive learning changes by the posttest stage, but some non-expert-like responses were widespread and persistent. BEDCI is a reliable and valid diagnostic tool that can be used in a variety of life sciences disciplines.
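The difficulty and discrimination metrics mentioned above are classical item statistics. The sketch below computes both from a hypothetical 0/1 response matrix: difficulty as the proportion of correct responses, and discrimination as a point-biserial correlation of each item against the rest-of-test score (one common variant, not necessarily the exact formula used for BEDCI).

```python
# Classical item difficulty and discrimination from a students-by-items matrix.
# The response matrix is invented for illustration; this is not BEDCI data.
import numpy as np

rng = np.random.default_rng(1)
responses = (rng.random((200, 14)) > 0.4).astype(float)  # 1 = correct answer

difficulty = responses.mean(axis=0)                      # per-item proportion correct
discrimination = np.empty(responses.shape[1])
for j in range(responses.shape[1]):
    rest = responses.sum(axis=1) - responses[:, j]       # total score excluding item j
    discrimination[j] = np.corrcoef(responses[:, j], rest)[0, 1]

print(np.round(difficulty, 2))
print(np.round(discrimination, 2))
```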
Pivetta, Tiziana; Isaia, Francesco; Trudu, Federica; Pani, Alessandra; Manca, Matteo; Perra, Daniela; Amato, Filippo; Havel, Josef
2013-10-15
The combination of two or more drugs using multidrug mixtures is a trend in the treatment of cancer. The goal is to search for a synergistic effect and thereby reduce the required dose and inhibit the development of resistance. An advanced model-free approach for data exploration and analysis, based on artificial neural networks (ANN) and experimental design is proposed to predict and quantify the synergism of drugs. The proposed method non-linearly correlates the concentrations of drugs with the cytotoxicity of the mixture, providing the possibility of choosing the optimal drug combination that gives the maximum synergism. The use of ANN allows for the prediction of the cytotoxicity of each combination of drugs in the chosen concentration interval. The method was validated by preparing and experimentally testing the combinations with the predicted highest synergistic effect. In all cases, the data predicted by the network were experimentally confirmed. The method was applied to several binary mixtures of cisplatin and [Cu(1,10-orthophenanthroline)2(H2O)](ClO4)2, Cu(1,10-orthophenanthroline)(H2O)2(ClO4)2 or [Cu(1,10-orthophenanthroline)2(imidazolidine-2-thione)](ClO4)2. The cytotoxicity of the two drugs, alone and in combination, was determined against human acute T-lymphoblastic leukemia cells (CCRF-CEM). For all systems, a synergistic effect was found for selected combinations.
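A heavily simplified sketch of the underlying idea follows: fit a small neural network to (concentration, concentration) → cytotoxicity data, then scan the concentration grid for the predicted optimum. A single scikit-learn MLP stands in for the paper's ANN/experimental-design machinery, and the training data are fabricated with a built-in interaction term so that a "synergistic" optimum exists.

```python
# Sketch: learn a nonlinear concentration -> cytotoxicity map, then search it.
# All data and parameters are fabricated placeholders, not the published setup.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(2)
conc = rng.uniform(0.0, 10.0, size=(80, 2))  # [drug A, drug B] concentrations, a.u.
# synthetic dose-response with a cross term mimicking synergism
cytotox = 1.0 - np.exp(-0.10 * conc[:, 0] - 0.15 * conc[:, 1]
                       - 0.02 * conc.prod(axis=1))

net = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0)
net.fit(conc, cytotox)

grid = np.array([(a, b) for a in np.linspace(0, 10, 21)
                        for b in np.linspace(0, 10, 21)])
pred = net.predict(grid)
print("predicted most cytotoxic combination:", grid[np.argmax(pred)])
```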
Azevedo de Brito, Wanessa; Gomes Dantas, Monique; Andrade Nogueira, Fernando Henrique; Ferreira da Silva-Júnior, Edeildo; Xavier de Araújo-Júnior, João; Aquino, Thiago Mendonça de; Adélia Nogueira Ribeiro, Êurica; da Silva Solon, Lilian Grace; Soares Aragão, Cícero Flávio; Barreto Gomes, Ana Paula
2017-08-30
Guanylhydrazones are molecules with great pharmacological potential in various therapeutic areas, including antitumoral activity. Factorial design is an excellent tool in the optimization of a chromatographic method, because it is possible to quickly change factors such as temperature, mobile phase composition, mobile phase pH, and column length, among others, to establish the optimal conditions of analysis. The aim of the present work was to develop and validate HPLC and UHPLC methods for the simultaneous determination of guanylhydrazones with anticancer activity employing experimental design. Precise, accurate, linear and robust HPLC and UHPLC methods were developed and validated for the simultaneous quantification of the guanylhydrazones LQM10, LQM14, and LQM17. The UHPLC method was more economical, with four times lower solvent consumption and a 20-fold smaller injection volume, which allowed better column performance. Comparing the empirical approach employed in the HPLC method development to the DoE approach employed in the UHPLC method development, we can conclude that the factorial design made the method development faster, more practical and more rational. This resulted in methods that can be employed in the analysis, evaluation and quality control of these new synthetic guanylhydrazones.
CANEapp: a user-friendly application for automated next generation transcriptomic data analysis.
Velmeshev, Dmitry; Lally, Patrick; Magistri, Marco; Faghihi, Mohammad Ali
2016-01-13
Next generation sequencing (NGS) technologies are indispensable for molecular biology research, but data analysis represents the bottleneck in their application. Users need to be familiar with computer terminal commands, the Linux environment, and various software tools and scripts. Analysis workflows have to be optimized and experimentally validated to extract biologically meaningful data. Moreover, as larger datasets are being generated, their analysis requires use of high-performance servers. To address these needs, we developed CANEapp (application for Comprehensive automated Analysis of Next-generation sequencing Experiments), a unique suite that combines a Graphical User Interface (GUI) and an automated server-side analysis pipeline that is platform-independent, making it suitable for any server architecture. The GUI runs on a PC or Mac and seamlessly connects to the server to provide full GUI control of RNA-sequencing (RNA-seq) project analysis. The server-side analysis pipeline contains a framework that is implemented on a Linux server through completely automated installation of software components and reference files. Analysis with CANEapp is also fully automated and performs differential gene expression analysis and novel noncoding RNA discovery through alternative workflows (Cuffdiff and the R packages edgeR and DESeq2). We compared CANEapp to other similar tools, and it significantly improves on previous developments. We experimentally validated CANEapp's performance by applying it to data derived from different experimental paradigms and confirming the results with quantitative real-time PCR (qRT-PCR). CANEapp adapts to any server architecture by effectively using available resources and thus handles large amounts of data efficiently. CANEapp performance has been experimentally validated on various biological datasets. CANEapp is available free of charge at http://psychiatry.med.miami.edu/research/laboratory-of-translational-rna-genomics/CANE-app. We believe that CANEapp will serve both biologists with no computational experience and bioinformaticians as a simple, timesaving but accurate and powerful tool to analyze large RNA-seq datasets, and will provide foundations for future development of integrated and automated high-throughput genomics data analysis tools. Due to its inherently standardized pipeline and combination of automated analysis and platform-independence, CANEapp is ideal for large-scale collaborative RNA-seq projects between different institutions and research groups.
Validation Database Based Thermal Analysis of an Advanced RPS Concept
NASA Technical Reports Server (NTRS)
Balint, Tibor S.; Emis, Nickolas D.
2006-01-01
Advanced RPS concepts can be conceived, designed and assessed using high-end computational analysis tools. These predictions may provide an initial insight into the potential performance of these models, but verification and validation are necessary and required steps to gain confidence in the numerical analysis results. This paper discusses the findings from a numerical validation exercise for a small advanced RPS concept, based on a thermal analysis methodology developed at JPL and on a validation database obtained from experiments performed at Oregon State University. Both the numerical and experimental configurations utilized a single GPHS module enabled design, resembling a Mod-RTG concept. The analysis focused on operating and environmental conditions during the storage phase only. This validation exercise helped to refine key thermal analysis and modeling parameters, such as heat transfer coefficients, and conductivity and radiation heat transfer values. Improved understanding of the Mod-RTG concept through validation of the thermal model allows for future improvements to this power system concept.
NASA Technical Reports Server (NTRS)
Sawyer, W. C.; Allen, J. M.; Hernandez, G.; Dillenius, M. F. E.; Hemsch, M. J.
1982-01-01
This paper presents a survey of engineering computational methods and experimental programs used for estimating the aerodynamic characteristics of missile configurations. Emphasis is placed on those methods which are suitable for the preliminary design of conventional and advanced concepts. An analysis of the technical approaches of the various methods is made in order to assess their suitability for estimating longitudinal and/or lateral-directional characteristics for different classes of missile configurations. Some comparisons between the predicted characteristics and experimental data are presented. These comparisons are made for a large variation in flow conditions and model attitude parameters. The paper also presents known experimental research programs developed for the specific purpose of validating analytical methods and extending the capability of database programs.
Metrinome: Continuous Monitoring and Security Validation of Distributed Systems
2014-03-01
...assessment as part of the software development life cycle, current approaches suffer from a number of shortcomings that limit their application in...with assessing security and correct functionality. Second, integrated and end-to-end testing and experimentation is often postponed until software...
Ramo, Nicole L.; Puttlitz, Christian M.
2018-01-01
Compelling evidence that many biological soft tissues display both strain- and time-dependent behavior has led to the development of fully non-linear viscoelastic modeling techniques to represent the tissue’s mechanical response under dynamic conditions. Since the current stress state of a viscoelastic material is dependent on all previous loading events, numerical analyses are complicated by the requirement of computing and storing the stress at each step throughout the load history. This requirement quickly becomes computationally expensive, and in some cases intractable, for finite element models. Therefore, we have developed a strain-dependent numerical integration approach for capturing non-linear viscoelasticity that enables calculation of the current stress from a strain-dependent history state variable stored from the preceding time step only, which improves both fitting efficiency and computational tractability. This methodology was validated based on its ability to recover non-linear viscoelastic coefficients from simulated stress-relaxation (six strain levels) and dynamic cyclic (three frequencies) experimental stress-strain data. The model successfully fit each data set with average errors in recovered coefficients of 0.3% for stress-relaxation fits and 0.1% for cyclic. The results support the use of the presented methodology to develop linear or non-linear viscoelastic models from stress-relaxation or cyclic experimental data of biological soft tissues. PMID:29293558
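The computational benefit described above comes from a recursive update: each new stress follows from the previous step's stored state plus the current strain increment, so the full load history never needs to be stored. The sketch below shows this structure for the simplest case, a single linear Maxwell element with an approximate exponential-integrator update; the authors' strain-dependent non-linear formulation is not reproduced here.

```python
# Recursive single-history-variable update for a linear Maxwell element.
# Approximate first-order exponential integrator; dummy parameter values.
import numpy as np

E, tau, dt = 1.0e3, 2.0, 0.01        # modulus, relaxation time, time step
t = np.arange(0, 10, dt)
strain = np.minimum(t / 1.0, 0.05)   # ramp-and-hold, i.e. a stress-relaxation test

sigma = np.zeros_like(t)
for n in range(len(t) - 1):
    d_eps = strain[n + 1] - strain[n]
    # previous stress decays over dt; the new elastic increment is added
    sigma[n + 1] = sigma[n] * np.exp(-dt / tau) + E * d_eps

print(sigma[::200])                  # stress relaxes after the ramp ends
```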
Validating the BISON fuel performance code to integral LWR experiments
Williamson, R. L.; Gamble, K. A.; Perez, D. M.; ...
2016-03-24
BISON is a modern finite element-based nuclear fuel performance code that has been under development at the Idaho National Laboratory (INL) since 2009. The code is applicable to both steady and transient fuel behavior and has been used to analyze a variety of fuel forms in 1D spherical, 2D axisymmetric, or 3D geometries. Code validation is underway and is the subject of this study. A brief overview of BISON's computational framework, governing equations, and general material and behavioral models is provided. BISON code and solution verification procedures are described, followed by a summary of the experimental data used to date for validation of Light Water Reactor (LWR) fuel. Validation comparisons focus on fuel centerline temperature, fission gas release, and rod diameter both before and following fuel-clad mechanical contact. Comparisons for 35 LWR rods are consolidated to provide an overall view of how the code is predicting physical behavior, with a few select validation cases discussed in greater detail. Our results demonstrate that 1) fuel centerline temperature comparisons through all phases of fuel life are very reasonable, with deviations between predictions and experimental data within ±10% for early life through high-burnup fuel and only slightly out of these bounds for power ramp experiments, 2) accuracy in predicting fission gas release appears to be consistent with state-of-the-art modeling and with the involved uncertainties, and 3) comparison of rod diameter results indicates a tendency to overpredict clad diameter reduction early in life, when clad creepdown dominates, and to more significantly overpredict the diameter increase late in life, when fuel expansion controls the mechanical response. The initial rod diameter comparisons were unsatisfactory and have led to consideration of additional separate-effects experiments to better understand and predict clad and fuel mechanical behavior. Results from this study are being used to define priorities for ongoing code development and validation activities.
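A comparison of the kind summarized in point 1) reduces to a simple percent-deviation check; the sketch below shows that computation on invented centerline-temperature values (not BISON output or measured data).

```python
# Percent deviation of predictions from measurements, checked against +/-10%.
import numpy as np

t_meas = np.array([1100.0, 1250.0, 1400.0, 1520.0])  # K, hypothetical measurements
t_pred = np.array([1135.0, 1210.0, 1452.0, 1490.0])  # K, hypothetical predictions

dev = 100.0 * (t_pred - t_meas) / t_meas
print(np.round(dev, 1), "all within +/-10%:", bool(np.all(np.abs(dev) <= 10.0)))
```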
NASA Astrophysics Data System (ADS)
Stotsky, Jay A.; Hammond, Jason F.; Pavlovsky, Leonid; Stewart, Elizabeth J.; Younger, John G.; Solomon, Michael J.; Bortz, David M.
2016-07-01
The goal of this work is to develop a numerical simulation that accurately captures the biomechanical response of bacterial biofilms and their associated extracellular matrix (ECM). In this, the second of a two-part effort, the primary focus is on formally presenting the heterogeneous rheology Immersed Boundary Method (hrIBM) and validating our model by comparison to experimental results. With this extension of the Immersed Boundary Method (IBM), we use the techniques originally developed in Part I ([19]) to treat biofilms as viscoelastic fluids possessing variable rheological properties anchored to a set of moving locations (i.e., the bacteria locations). In particular, we incorporate spatially continuous variable viscosity and density fields into our model. Although in [14,15], variable viscosity is used in an IBM context to model discrete viscosity changes across interfaces, to our knowledge this work and Part I are the first to apply the IBM to model a continuously variable viscosity field. We validate our modeling approach from Part I by comparing dynamic moduli and compliance moduli computed from our model to data from mechanical characterization experiments on Staphylococcus epidermidis biofilms. The experimental setup is described in [26] in which biofilms are grown and tested in a parallel plate rheometer. In order to initialize the positions of bacteria in the biofilm, experimentally obtained three dimensional coordinate data was used. One of the major conclusions of this effort is that treating the spring-like connections between bacteria as Maxwell or Zener elements provides good agreement with the mechanical characterization data. We also found that initializing the simulations with different coordinate data sets only led to small changes in the mechanical characterization results. Matlab code used to produce results in this paper will be available at https://github.com/MathBioCU/BiofilmSim.
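The suitability of Maxwell or Zener elements for fitting dynamic-moduli data can be seen from the closed-form complex modulus of a standard linear solid. The sketch below evaluates the storage and loss moduli over a frequency sweep using placeholder parameters, not fitted Staphylococcus epidermidis values.

```python
# Storage and loss moduli of a Zener (standard linear solid) element:
# G*(w) = Ge + G1 * (i w tau) / (1 + i w tau). Parameters are placeholders.
import numpy as np

Ge, G1, tau = 5.0, 20.0, 0.5        # equilibrium modulus, arm modulus, relax. time
w = np.logspace(-2, 2, 9)           # angular frequency sweep
G_star = Ge + G1 * (1j * w * tau) / (1.0 + 1j * w * tau)

print(np.round(G_star.real, 3))     # storage modulus G'(w)
print(np.round(G_star.imag, 3))     # loss modulus G''(w)
```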
NASA Astrophysics Data System (ADS)
Joiner, N.; Esser, B.; Fertig, M.; Gülhan, A.; Herdrich, G.; Massuti-Ballester, B.
2016-12-01
This paper summarises the final synthesis of an ESA technology research programme entitled "Development of an Innovative Validation Strategy of Gas Surface Interaction Modelling for Re-entry Applications". The focus of the project was to demonstrate the correct pressure dependency of catalytic surface recombination, with an emphasis on Low Earth Orbit (LEO) re-entry conditions and thermal protection system materials. A physics-based model describing the prevalent recombination mechanisms was proposed for implementation into two CFD codes, TINA and TAU. A dedicated experimental campaign was performed to calibrate and validate the CFD model on TPS materials pertinent to the EXPERT space vehicle at a wide range of temperatures and pressures relevant to LEO. A new set of catalytic recombination data was produced that was able to improve the chosen model calibration for CVD-SiC and provide the first model calibration for the Nickel-Chromium super-alloy PM1000. The experimentally observed pressure dependency of catalytic recombination can only be reproduced by the Langmuir-Hinshelwood recombination mechanism. Due to decreasing degrees of (enthalpy and hence) dissociation with facility stagnation pressure, it was not possible to obtain catalytic recombination coefficients from the measurements at high experimental stagnation pressures. Therefore, the CFD model calibration has been improved by this activity based on the low pressure results. The results of the model calibration were applied to the existing EXPERT mission profile to examine the impact of the experimentally calibrated model at flight relevant conditions. The heat flux overshoot at the CVD-SiC/PM1000 junction on EXPERT is confirmed to produce radiative equilibrium temperatures in close proximity to the PM1000 melt temperature. This was anticipated within the margins of the vehicle design; however, due to the measurements made here for the first time at relevant temperatures for the junction, an increased confidence in this finding is placed on the computations.
NASA Astrophysics Data System (ADS)
Silvernail, Nathan L.
This research was carried out in collaboration with the United Launch Alliance (ULA), to advance an innovative Centaur-based on-orbit propellant storage and transfer system that takes advantage of rotational settling to simplify Fluid Management (FM), specifically enabling settled fluid transfer between two tanks and settled pressure control. This research consists of two specific objectives: (1) technique and process validation and (2) computational model development. In order to raise the Technology Readiness Level (TRL) of this technology, the corresponding FM techniques and processes must be validated in a series of experimental tests, including: laboratory/ground testing, microgravity flight testing, suborbital flight testing, and orbital testing. Researchers from Embry-Riddle Aeronautical University (ERAU) have joined with the Massachusetts Institute of Technology (MIT) Synchronized Position Hold Engage and Reorient Experimental Satellites (SPHERES) team to develop a prototype FM system for operations aboard the International Space Station (ISS). Testing of the integrated system in a representative environment will raise the FM system to TRL 6. The tests will demonstrate the FM system and provide unique data pertaining to the vehicle's rotational dynamics while undergoing fluid transfer operations. These data sets provide insight into the behavior and physical tendencies of the on-orbit refueling system. Furthermore, they provide a baseline for comparison against the data produced by various computational models; thus verifying the accuracy of the models output and validating the modeling approach. Once these preliminary models have been validated, the parameters defined by them will provide the basis of development for accurate simulations of full scale, on-orbit systems. The completion of this project and the models being developed will accelerate the commercialization of on-orbit propellant storage and transfer technologies as well as all in-space technologies that utilize or will utilize similar FM techniques and processes.
McAuliff, Bradley D; Kovera, Margaret Bull; Nunez, Gabriel
2009-06-01
This study examined the ability of jury-eligible community members (N = 248) to detect internal validity threats in psychological science presented during a trial. Participants read a case summary in which an expert testified about a study that varied in internal validity (valid, missing control group, confound, and experimenter bias) and ecological validity (high, low). Ratings of expert evidence quality and expert credibility were higher for the valid versus missing control group versions only. Internal validity did not influence verdict or ratings of plaintiff credibility and no differences emerged as a function of ecological validity. Expert evidence quality, expert credibility, and plaintiff credibility were positively correlated with verdict. Implications for the scientific reasoning literature and for trials containing psychological science are discussed.
Validation of WIND for a Series of Inlet Flows
NASA Technical Reports Server (NTRS)
Slater, John W.; Abbott, John M.; Cavicchi, Richard H.
2002-01-01
Validation assessments compare WIND CFD simulations to experimental data for a series of inlet flows ranging in Mach number from low subsonic to hypersonic. The validation procedures follow the guidelines of the AIAA. The WIND code performs well in matching the available experimental data. The assessments demonstrate the use of WIND and provide confidence in its use for the analysis of aircraft inlets.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hills, Richard G.; Maniaci, David Charles; Naughton, Jonathan W.
2015-09-01
A Verification and Validation (V&V) framework is presented for the development and execution of coordinated modeling and experimental programs to assess the predictive capability of computational models of complex systems through focused, well-structured, and formal processes. The elements of the framework are based on established V&V methodology developed by various organizations including the Department of Energy, National Aeronautics and Space Administration, the American Institute of Aeronautics and Astronautics, and the American Society of Mechanical Engineers. Four main topics are addressed: 1) program planning based on expert elicitation of the modeling physics requirements, 2) experimental design for model assessment, 3) uncertainty quantification for experimental observations and computational model simulations, and 4) assessment of the model predictive capability. The audience for this document includes program planners, modelers, experimentalists, V&V specialists, and customers of the modeling results.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ihme, Matthias; Driscoll, James
2015-08-31
The objective of this closely coordinated experimental and computational research effort is the development of simulation techniques for the prediction of combustion processes relevant to the oxidation of syngas and high hydrogen content (HHC) fuels at gas-turbine-relevant operating conditions. Specifically, the research goals are (i) the characterization of the sensitivity of syngas ignition processes to hydrodynamic processes and perturbations in temperature and mixture composition in rapid compression machines and flow reactors and (ii) to conduct comprehensive experimental investigations in a swirl-stabilized gas turbine (GT) combustor under realistic high-pressure operating conditions in order (iii) to obtain fundamental understanding about mechanisms controlling unstable flame regimes in HHC combustion.
Evaluated cross-section libraries and kerma factors for neutrons up to 100 MeV on ¹²C
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chadwick, M.B.; Blann, M.; Cox, L.
1995-04-11
A program is being carried out at Lawrence Livermore National Laboratory to develop high-energy evaluated nuclear data libraries for use in Monte Carlo simulations of cancer radiation therapy. In this report we describe evaluated cross sections and kerma factors for neutrons with incident energies up to 100 MeV on ¹²C. The aim of this effort is to incorporate advanced nuclear physics modeling methods, with new experimental measurements, to generate cross section libraries needed for an accurate simulation of dose deposition in fast neutron therapy. The evaluated libraries are based mainly on nuclear model calculations, benchmarked to experimental measurements where they exist. We use the GNASH code system, which includes Hauser-Feshbach, preequilibrium, and direct reaction mechanisms. The libraries tabulate elastic and nonelastic cross sections, angle-energy correlated production spectra for light ejectiles with A ≤ 4, and kinetic energies given to light ejectiles and heavy recoil fragments. The major steps involved in this effort are: (1) development and validation of nuclear models for incident energies up to 100 MeV; (2) collation of experimental measurements, including new results from Louvain-la-Neuve and Los Alamos; (3) extension of the Livermore ENDL formats for representing high-energy data; (4) calculation and evaluation of nuclear data; and (5) validation of the libraries. We describe the evaluations in detail, with particular emphasis on our new high-energy modeling developments. Our evaluations agree well with experimental measurements of integrated and differential cross sections. We compare our results with the recent ENDF/B-VI evaluation, which extends up to 32 MeV.
Developing Ideas of Refraction, Lenses and Rainbow through the Use of Historical Resources
ERIC Educational Resources Information Center
Mihas, Pavlos
2008-01-01
The paper examines different ways of using historical resources in teaching refraction-related subjects. Experimental procedures can be taught by using Ptolemy's and Al Haytham's methods. Students can check the validity of the approximations or rules presented by different people. The interpretation of the relations is another…
Mathematical model of water transport in Bacon and alkaline matrix-type hydrogen-oxygen fuel cells
NASA Technical Reports Server (NTRS)
Prokopius, P. R.; Easter, R. W.
1972-01-01
Based on general mass continuity and diffusive transport equations, a mathematical model was developed that simulates the transport of water in Bacon and alkaline-matrix fuel cells. The derived model was validated by using it to analytically reproduce various Bacon and matrix-cell experimental water transport transients.
NASA Astrophysics Data System (ADS)
Ciarletti, V.; Le Gall, A.; Berthelier, J. J.; Corbel, Ch.; Dolon, F.; Ney, R.; Reineix, A.; Guiffaud, Ch.; Clifford, S.; Heggy, E.
2007-03-01
A bi-static version of the HF GPR TAPIR developed for Martian deep soundings has been operated in the Egyptian Western Desert. The study presented focuses on the retrieval of the direction of arrival of the observed echoes on both simulated and measured data.
An analytical method for designing low noise helicopter transmissions
NASA Technical Reports Server (NTRS)
Bossler, R. B., Jr.; Bowes, M. A.; Royal, A. C.
1978-01-01
The development and experimental validation of a method for analytically modeling the noise mechanisms in helicopter geared power transmission systems are described. This method can be used within the design process to predict interior noise levels and to investigate the noise-reducing potential of alternative transmission design details. Examples are discussed.
Integral nuclear data validation using experimental spent nuclear fuel compositions
Gauld, Ian C.; Williams, Mark L.; Michel-Sendis, Franco; ...
2017-07-19
Measurements of the isotopic contents of spent nuclear fuel provide experimental data that are a prerequisite for validating computer codes and nuclear data for many spent fuel applications. Under the auspices of the Organisation for Economic Co-operation and Development (OECD) Nuclear Energy Agency (NEA) and the guidance of the Expert Group on Assay Data of Spent Nuclear Fuel of the NEA Working Party on Nuclear Criticality Safety, a new database of expanded spent fuel isotopic compositions has been compiled. The database, Spent Fuel Compositions (SFCOMPO) 2.0, includes measured data for more than 750 fuel samples acquired from 44 different reactors and representing eight different reactor technologies. Measurements for more than 90 isotopes are included. This new database provides data essential for establishing the reliability of code systems for inventory predictions, but it also has broader potential application to nuclear data evaluation. The database is described together with adjoint-based sensitivity and uncertainty tools for transmutation systems, developed to quantify the importance of nuclear data to nuclide concentrations.
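As an illustration of the kind of nuclear-data sensitivity the abstract alludes to, the sketch below estimates, by direct perturbation rather than adjoint methods, how a daughter nuclide concentration responds to a 1% change in a capture cross section. The two-nuclide chain and all parameter values are invented for the example and are not drawn from SFCOMPO.

    import numpy as np

    # Toy chain: parent P captures neutrons (cross section sigma_b, barns) to
    # form daughter D, which decays with constant lam (1/s); flux phi (n/cm^2/s).
    def daughter_conc(sigma_b, phi=1e14, lam=1e-9, N0=1.0, t=3.15e7):
        k = sigma_b * 1e-24 * phi          # parent transmutation rate (1/s)
        # Bateman solution for D(t) with D(0) = 0
        return N0 * k / (lam - k) * (np.exp(-k * t) - np.exp(-lam * t))

    sigma = 50.0
    base = daughter_conc(sigma)
    pert = daughter_conc(sigma * 1.01)     # 1% cross-section perturbation
    S = ((pert - base) / base) / 0.01      # relative sensitivity (dN/N)/(ds/s)
    print(f"daughter concentration {base:.3e}, sensitivity {S:.3f}")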
IFMIF: overview of the validation activities
NASA Astrophysics Data System (ADS)
Knaster, J.; Arbeiter, F.; Cara, P.; Favuzza, P.; Furukawa, T.; Groeschel, F.; Heidinger, R.; Ibarra, A.; Matsumoto, H.; Mosnier, A.; Serizawa, H.; Sugimoto, M.; Suzuki, H.; Wakai, E.
2013-11-01
The Engineering Validation and Engineering Design Activities (EVEDA) for the International Fusion Materials Irradiation Facility (IFMIF), an international collaboration under the Broader Approach Agreement between the Government of Japan and EURATOM, aims at allowing a rapid construction phase of IFMIF in due time and with an understanding of the cost involved. The three main facilities of IFMIF, (1) the Accelerator Facility, (2) the Target Facility and (3) the Test Facility, are the subject of validation activities that include the construction of either full-scale prototypes or smartly devised scaled-down facilities that will allow a straightforward extrapolation to IFMIF needs. By July 2013, the engineering design activities of IFMIF had matured with the delivery of an Intermediate IFMIF Engineering Design Report (IIEDR) supported by experimental results. The installation of a linac of 1.125 MW (125 mA and 9 MeV) of deuterons started in March 2013 in Rokkasho (Japan). The world's largest liquid Li test loop is running in Oarai (Japan) with an ambitious experimental programme for the years ahead. A full-scale high-flux test module that will house ∼1000 small specimens, developed jointly in Europe and Japan for the fusion programme, has been constructed by KIT (Karlsruhe) together with its He gas cooling loop. A full-scale medium-flux test module to carry out on-line creep measurement has been validated by CRPP (Villigen).
confFuse: High-Confidence Fusion Gene Detection across Tumor Entities.
Huang, Zhiqin; Jones, David T W; Wu, Yonghe; Lichter, Peter; Zapatka, Marc
2017-01-01
Background: Fusion genes play an important role in the tumorigenesis of many cancers. Next-generation sequencing (NGS) technologies have been successfully applied to fusion gene detection for the last several years, and a number of NGS-based tools have been developed for identifying fusion genes during this period. Most fusion gene detection tools based on RNA-seq data report a large number of candidates (mostly false positives), making it hard to prioritize candidates for experimental validation and further analysis. Selection of reliable fusion genes for downstream analysis is therefore very important in cancer research. We developed confFuse, a scoring algorithm to reliably select high-confidence fusion genes that are likely to be biologically relevant. Results: confFuse takes multiple parameters into account in order to assign each fusion candidate a confidence score, where a score ≥ 8 indicates a high-confidence fusion gene prediction. These parameters were manually curated based on our experience and on certain structural motifs of fusion genes. Compared with alternative tools, based on 96 published RNA-seq samples from different tumor entities, our method can significantly reduce the number of fusion candidates (301 high-confidence from 8,083 total predicted fusion genes) while maintaining high detection accuracy (recovery rate 85.7%). Validation of 18 novel, high-confidence fusions detected in three breast tumor samples resulted in a 100% validation rate. Conclusions: confFuse is a novel downstream filtering method that allows selection of highly reliable fusion gene candidates for further downstream analysis and experimental validation. confFuse is available at https://github.com/Zhiqin-HUANG/confFuse.
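To make the scoring idea concrete, here is a hypothetical re-implementation of the filtering pattern: score each candidate on several evidence features and keep those at or above the threshold of 8 the abstract mentions. The feature names and weights are illustrative guesses, not confFuse's actual curated parameters.

    # Hypothetical confidence scoring for fusion candidates; every feature
    # name and weight below is invented for illustration only.
    def confidence_score(cand):
        score = 0
        score += min(cand["spanning_reads"], 5)        # read support, capped
        score += 2 if cand["in_frame"] else 0          # preserves reading frame
        score += 2 if not cand["readthrough"] else 0   # not a readthrough artifact
        score += 1 if cand["both_genes_expressed"] else 0
        return score

    candidates = [
        {"name": "GENE1-GENE2", "spanning_reads": 12, "in_frame": True,
         "readthrough": False, "both_genes_expressed": True},
        {"name": "GENE3-GENE4", "spanning_reads": 2, "in_frame": False,
         "readthrough": True, "both_genes_expressed": False},
    ]
    high_conf = [c["name"] for c in candidates if confidence_score(c) >= 8]
    print(high_conf)   # -> ['GENE1-GENE2'] (scores 10 vs 2)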
Neuroinflammatory targets and treatments for epilepsy validated in experimental models.
Aronica, Eleonora; Bauer, Sebastian; Bozzi, Yuri; Caleo, Matteo; Dingledine, Raymond; Gorter, Jan A; Henshall, David C; Kaufer, Daniela; Koh, Sookyong; Löscher, Wolfgang; Louboutin, Jean-Pierre; Mishto, Michele; Norwood, Braxton A; Palma, Eleonora; Poulter, Michael O; Terrone, Gaetano; Vezzani, Annamaria; Kaminski, Rafal M
2017-07-01
A large body of evidence that has accumulated over the past decade strongly supports the role of inflammation in the pathophysiology of human epilepsy. Specific inflammatory molecules and pathways have been identified that influence various pathologic outcomes in different experimental models of epilepsy. Most importantly, the same inflammatory pathways have also been found in surgically resected brain tissue from patients with treatment-resistant epilepsy. New antiseizure therapies may be derived from these novel potential targets. An essential and crucial question is whether targeting these molecules and pathways may result in anti-ictogenesis, antiepileptogenesis, and/or disease-modification effects. Therefore, preclinical testing in models mimicking relevant aspects of epileptogenesis is needed to guide integrated experimental and clinical trial designs. We discuss the most recent preclinical proof-of-concept studies validating a number of therapeutic approaches against inflammatory mechanisms in animal models that could represent novel avenues for drug development in epilepsy. Finally, we suggest future directions to accelerate preclinical to clinical translation of these recent discoveries. Wiley Periodicals, Inc. © 2017 International League Against Epilepsy.
Silva, F G A; de Moura, M F S F; Dourado, N; Xavier, J; Pereira, F A M; Morais, J J L; Dias, M I R; Lourenço, P J; Judas, F M
2017-08-01
Fracture characterization of human cortical bone under mode II loading was analyzed using a miniaturized version of the end-notched flexure test. A data reduction scheme based on the crack equivalent concept was employed to overcome uncertainties in crack length monitoring during the test. The crack tip shear displacement was experimentally measured using the digital image correlation technique to determine the cohesive law that mimics bone fracture behavior under mode II loading. The developed procedure was validated by finite element analysis using cohesive zone modeling, considering a trapezoidal cohesive law with bilinear softening. Experimental load-displacement curves, resistance curves and crack tip shear displacement versus applied displacement were used to validate the numerical procedure. The excellent agreement observed between the numerical and experimental results reveals the appropriateness of the proposed test and procedure to characterize human cortical bone fracture under mode II loading. The proposed methodology can be viewed as a novel valuable tool to be used in parametric and methodical clinical studies regarding features (e.g., age, diseases, drugs) influencing bone shear fracture under mode II loading.
Construction and Experimental Validation of a Petri Net Model of Wnt/β-Catenin Signaling.
Jacobsen, Annika; Heijmans, Nika; Verkaar, Folkert; Smit, Martine J; Heringa, Jaap; van Amerongen, Renée; Feenstra, K Anton
2016-01-01
The Wnt/β-catenin signaling pathway is important for multiple developmental processes and tissue maintenance in adults. Consequently, deregulated signaling is involved in a range of human diseases including cancer and developmental defects. A better understanding of the intricate regulatory mechanism and effect of physiological (active) and pathophysiological (hyperactive) WNT signaling is important for predicting treatment response and developing novel therapies. The constitutively expressed CTNNB1 (commonly and hereafter referred to as β-catenin) is degraded by a destruction complex, composed of, amongst others, AXIN1 and GSK3. The destruction complex is inhibited during active WNT signaling, leading to β-catenin stabilization and induction of β-catenin/TCF target genes. In this study we investigated the mechanism and effect of β-catenin stabilization during active and hyperactive WNT signaling in a combined in silico and in vitro approach. We constructed a Petri net model of Wnt/β-catenin signaling including main players from the plasma membrane (WNT ligands and receptors), cytoplasmic effectors and the downstream negative feedback target gene AXIN2. We validated that our model can be used to simulate both active (WNT stimulation) and hyperactive (GSK3 inhibition) signaling by comparing our simulation and experimental data. We used this experimentally validated model to gain further insights into the effect of the negative feedback regulator AXIN2 upon WNT stimulation and observed an attenuated β-catenin stabilization. We furthermore simulated the effect of APC inactivating mutations, yielding a stabilization of β-catenin levels comparable to the Wnt-pathway activities observed in colorectal and breast cancer. Our model can be used for further investigation and viable predictions of the role of Wnt/β-catenin signaling in oncogenesis and development.
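As a rough illustration of the dynamics described (not the authors' Petri net), the following ODE caricature reproduces the qualitative behavior: WNT input inhibits the destruction complex, β-catenin rises, and the AXIN2 negative feedback attenuates the stabilization. All rate constants are invented.

    # Minimal ODE caricature, assuming made-up rates: B = beta-catenin level,
    # A = AXIN2 feedback; WNT input reduces destruction-complex activity.
    def simulate(wnt, t_end=200.0, dt=0.01):
        B, A = 0.0, 0.0
        for _ in range(int(t_end / dt)):
            complex_activity = (1.0 - wnt) + 0.5 * A   # WNT inhibits, AXIN2 restores
            dB = 1.0 - complex_activity * B            # production minus degradation
            dA = 0.2 * B - 0.1 * A                     # B/TCF induces AXIN2 target gene
            B += dB * dt
            A += dA * dt
        return B

    print("basal :", round(simulate(wnt=0.0), 2))
    print("WNT on:", round(simulate(wnt=0.8), 2))      # rise attenuated by AXIN2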
Research Directions for Cyber Experimentation: Workshop Discussion Analysis.
DOE Office of Scientific and Technical Information (OSTI.GOV)
DeWaard, Elizabeth; Deccio, Casey; Fritz, David Jakob
Sandia National Laboratories hosted a workshop on August 11, 2017, entitled "Research Directions for Cyber Experimentation," which focused on identifying and addressing research gaps within the field of cyber experimentation, particularly emulation testbeds. This report mainly documents the discussion toward the end of the workshop, which included research gaps such as developing a sustainable research infrastructure, expanding cyber experimentation, and making the field more accessible to subject matter experts who may not have a background in computer science. Other gaps include methodologies for rigorous experimentation, validation, and uncertainty quantification, which, if addressed, also have the potential to bridge the gap between cyber experimentation and cyber engineering. Workshop attendees presented various ways to overcome these research gaps; however, the main conclusion for overcoming these gaps is better communication through increased workshops, conferences, email lists, and Slack channels, among other opportunities.
The flaws and human harms of animal experimentation.
Akhtar, Aysha
2015-10-01
Nonhuman animal ("animal") experimentation is typically defended by arguments that it is reliable, that animals provide sufficiently good models of human biology and diseases to yield relevant information, and that, consequently, its use provides major human health benefits. I demonstrate that a growing body of scientific literature critically assessing the validity of animal experimentation generally (and animal modeling specifically) raises important concerns about its reliability and predictive value for human outcomes and for understanding human physiology. The unreliability of animal experimentation across a wide range of areas undermines scientific arguments in favor of the practice. Additionally, I show how animal experimentation often significantly harms humans through misleading safety studies, potential abandonment of effective therapeutics, and direction of resources away from more effective testing methods. The resulting evidence suggests that the collective harms and costs to humans from animal experimentation outweigh potential benefits and that resources would be better invested in developing human-based testing methods.
Martins, Raquel R; McCracken, Andrew W; Simons, Mirre J P; Henriques, Catarina M; Rera, Michael
2018-02-05
The Smurf Assay (SA) was initially developed in the model organism Drosophila melanogaster, where a dramatic increase of intestinal permeability has been shown to occur during aging (Rera et al., 2011). We have since validated the protocol in multiple other model organisms (Dambroise et al., 2016) and have utilized the assay to further our understanding of aging (Tricoire and Rera, 2015; Rera et al., 2018). The SA has now also been used by other labs to assess intestinal barrier permeability (Clark et al., 2015; Katzenberger et al., 2015; Barekat et al., 2016; Chakrabarti et al., 2016; Gelino et al., 2016). The SA in itself is simple; however, numerous small details can have a considerable impact on its experimental validity and subsequent interpretation. Here, we provide a detailed update on the SA technique and explain how to catch a Smurf while avoiding the most common experimental fallacies.
Experimental validation of a coupled neutron-photon inverse radiation transport solver
NASA Astrophysics Data System (ADS)
Mattingly, John; Mitchell, Dean J.; Harding, Lee T.
2011-10-01
Sandia National Laboratories has developed an inverse radiation transport solver that applies nonlinear regression to coupled neutron-photon deterministic transport models. The inverse solver uses nonlinear regression to fit a radiation transport model to gamma spectrometry and neutron multiplicity counting measurements. The subject of this paper is the experimental validation of that solver. This paper describes a series of experiments conducted with a 4.5 kg sphere of α-phase, weapons-grade plutonium. The source was measured bare and reflected by high-density polyethylene (HDPE) spherical shells with total thicknesses between 1.27 and 15.24 cm. Neutron and photon emissions from the source were measured using three instruments: a gross neutron counter, a portable neutron multiplicity counter, and a high-resolution gamma spectrometer. These measurements were used as input to the inverse radiation transport solver to evaluate the solver's ability to correctly infer the configuration of the source from its measured radiation signatures.
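A minimal sketch of the inverse-solver pattern, assuming a toy forward model: nonlinear least squares adjusts an unknown reflector thickness until predicted count rates match the measurements. The attenuation law and all numbers are invented for illustration and bear no relation to Sandia's deterministic transport models.

    import numpy as np
    from scipy.optimize import least_squares

    # Invented forward model: two "signatures" that fall off with HDPE thickness.
    def forward(x):
        thickness = x[0]
        neutrons = 1.0e4 * np.exp(-0.12 * thickness)   # neutron leakage rate
        photons = 5.0e3 * np.exp(-0.30 * thickness)    # photon count rate
        return np.array([neutrons, photons])

    rng = np.random.default_rng(0)
    measured = forward([7.5]) * (1 + 0.02 * rng.standard_normal(2))  # noisy "data"

    # Nonlinear regression: minimize residuals between model and measurement.
    fit = least_squares(lambda x: forward(x) - measured, x0=[1.0], bounds=(0, 20))
    print(f"inferred reflector thickness: {fit.x[0]:.2f} cm")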
NASA Astrophysics Data System (ADS)
Underwood, Thomas; Loebner, Keith; Cappelli, Mark
2015-11-01
Detailed measurements of the thermodynamic and electrodynamic plasma state variables within the plume of a pulsed plasma accelerator are presented. A quadruple Langmuir probe operating in current-saturation mode is used to obtain time resolved measurements of the plasma density, temperature, potential, and velocity along the central axis of the accelerator. This data is used in conjunction with a fast-framing, intensified CCD camera to develop and validate a model predicting the existence of two distinct types of ionization waves corresponding to the upper and lower solution branches of the Hugoniot curve. A deviation of less than 8% is observed between the quasi-steady, one-dimensional theoretical model and the experimentally measured plume velocity. This work is supported by the U.S. Department of Energy Stewardship Science Academic Program in addition to the National Defense Science Engineering Graduate Fellowship.
NASA Astrophysics Data System (ADS)
Lee, Chang-Chun; Shih, Yan-Shin; Wu, Chih-Sheng; Tsai, Chia-Hao; Yeh, Shu-Tang; Peng, Yi-Hao; Chen, Kuang-Jung
2012-07-01
This work analyses the overall stress/strain characteristics of flexible encapsulations with organic light-emitting diode (OLED) devices. A robust methodology, composed of a mechanical model of the multi-thin-film stack under bending loads and related stress simulations based on nonlinear finite element analysis (FEA), is proposed and validated against related experimental data. For various geometrical combinations of cover plate, stacked thin films and plastic substrate, the position of the neutral axis (NA) plane, which is regarded as a key design parameter for minimizing stress impact on the OLED devices, is obtained using the present methodology. The results point out that both the thickness and the mechanical properties of the cover plate help in determining the NA location. In addition, several concave and convex radii are applied to examine the reliable mechanical tolerance and to provide an insight into the estimated reliability of foldable OLED encapsulations.
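For readers wanting the textbook version of the neutral-axis calculation such work builds on, the sketch below locates the NA of a bonded multilayer as the modulus-weighted centroid, z_NA = Σ E_i t_i z_i / Σ E_i t_i, with z_i the mid-plane height of layer i. Layer thicknesses and moduli are illustrative placeholders, not the paper's values.

    # Neutral-axis position of a bonded multilayer under pure bending.
    layers = [                     # (name, thickness um, Young's modulus GPa)
        ("plastic substrate", 100.0, 4.0),
        ("OLED film stack",     1.0, 80.0),
        ("cover plate",       100.0, 70.0),
    ]
    z, EA, EAz = 0.0, 0.0, 0.0
    for name, t, E in layers:
        z_mid = z + t / 2.0        # mid-plane height of this layer
        EA += E * t
        EAz += E * t * z_mid
        z += t
    z_na = EAz / EA
    print(f"stack height {z:.1f} um, neutral axis at {z_na:.1f} um")
    # A stiff, thick cover plate pulls the NA toward itself; tuning it can
    # place the NA at the fragile OLED layers to minimize bending strain.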
A standardized set of 3-D objects for virtual reality research and applications.
Peeters, David
2018-06-01
The use of immersive virtual reality as a research tool is rapidly increasing in numerous scientific disciplines. By combining ecological validity with strict experimental control, immersive virtual reality provides the potential to develop and test scientific theories in rich environments that closely resemble everyday settings. This article introduces the first standardized database of colored three-dimensional (3-D) objects that can be used in virtual reality and augmented reality research and applications. The 147 objects have been normed for name agreement, image agreement, familiarity, visual complexity, and corresponding lexical characteristics of the modal object names. The availability of standardized 3-D objects for virtual reality research is important, because reaching valid theoretical conclusions hinges critically on the use of well-controlled experimental stimuli. Sharing standardized 3-D objects across different virtual reality labs will allow for science to move forward more quickly.
Plasma Model V&V of Collisionless Electrostatic Shock
NASA Astrophysics Data System (ADS)
Martin, Robert; Le, Hai; Bilyeu, David; Gildea, Stephen
2014-10-01
A simple 1D electrostatic collisionless shock was selected as an initial validation and verification test case for a new plasma modeling framework under development at the Air Force Research Laboratory's In-Space Propulsion branch (AFRL/RQRS). Cross verification between PIC, Vlasov, and Fluid plasma models within the framework along with expected theoretical results will be shown. The non-equilibrium velocity distributions (VDF) captured by PIC and Vlasov will be compared to each other and the assumed VDF of the fluid model at selected points. Validation against experimental data from the University of California, Los Angeles double-plasma device will also be presented along with current work in progress at AFRL/RQRS towards reproducing the experimental results using higher fidelity diagnostics to help elucidate differences between model results and between the models and original experiment. DISTRIBUTION A: Approved for public release; unlimited distribution; PA (Public Affairs) Clearance Number 14332.
Experimental study of an adaptive elastic metamaterial controlled by electric circuits
NASA Astrophysics Data System (ADS)
Zhu, R.; Chen, Y. Y.; Barnhart, M. V.; Hu, G. K.; Sun, C. T.; Huang, G. L.
2016-01-01
The ability to control elastic wave propagation at a deep subwavelength scale makes locally resonant elastic metamaterials very attractive. A number of capabilities have been demonstrated, such as frequency filtering, wave guiding, and negative refraction. Unfortunately, few metamaterials have developed into practical devices due to their lack of tunability at specific frequencies. With the help of multi-physics numerical modeling, experimental validation of an adaptive elastic metamaterial integrated with shunted piezoelectric patches has been performed at a deep subwavelength scale. A tunable bandgap capacity as high as 45% is physically realized by using both hardening and softening shunt circuits. It is also demonstrated that the effective mass density of the metamaterial can be fully tailored by adjusting the parameters of the shunted electric circuits. Finally, to illustrate a practical application, transient wave propagation tests of the adaptive metamaterial subjected to impact loads are conducted to validate its tunable wave mitigation abilities in real time.
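A minimal sketch of the standard locally resonant effective-density relation that underlies such tailoring, assuming the textbook mass-in-mass form ρ_eff(ω) = ρ_st + ρ_r ω0² / (ω0² − ω²). Shifting the resonance ω0, as a hardening or softening shunt circuit effectively does, changes ρ_eff at a fixed probe frequency; all numbers here are invented.

    import numpy as np

    rho_st, rho_r = 2700.0, 500.0          # static and resonator densities (kg/m^3)

    def rho_eff(f, f0):
        # Effective density of a mass-in-mass resonator near its resonance f0.
        w, w0 = 2 * np.pi * f, 2 * np.pi * f0
        return rho_st + rho_r * w0**2 / (w0**2 - w**2)

    for f0 in (900.0, 1300.0):             # softened vs hardened shunt circuit
        print(f"f0 = {f0:6.0f} Hz -> rho_eff(1 kHz) = {rho_eff(1000.0, f0):9.1f}")
    # Below resonance rho_eff is enhanced; just above it, it plunges (and can
    # go negative), which is what opens and shifts the locally resonant bandgap.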
Fiber Optic Thermo-Hygrometers for Soil Moisture Monitoring.
Leone, Marco; Principe, Sofia; Consales, Marco; Parente, Roberto; Laudati, Armando; Caliro, Stefano; Cutolo, Antonello; Cusano, Andrea
2017-06-20
This work deals with the fabrication, prototyping, and experimental validation of a fiber optic thermo-hygrometer-based soil moisture sensor, useful for rainfall-induced landslide prevention applications. In particular, we recently proposed a new generation of fiber Bragg grating (FBG)-based soil moisture sensors for irrigation purposes. This device was realized by integrating, inside a customized aluminum protection package, a FBG thermo-hygrometer with a polymer micro-porous membrane. Here, we first verify the limitations, in terms of the volumetric water content (VWC) measuring range, of this first version of the soil moisture sensor for its exploitation in landslide prevention applications. Subsequently, we present the development, prototyping, and experimental validation of a novel, optimized version of the soil VWC sensor, still based on a FBG thermo-hygrometer, but able to reliably monitor, continuously and in real time, VWC values up to 37% when buried in the soil.
High Level Analysis, Design and Validation of Distributed Mobile Systems with CoreASM
NASA Astrophysics Data System (ADS)
Farahbod, R.; Glässer, U.; Jackson, P. J.; Vajihollahi, M.
System design is a creative activity calling for abstract models that facilitate reasoning about the key system attributes (desired requirements and resulting properties) so as to ensure these attributes are properly established prior to actually building a system. We explore here the practical side of using the abstract state machine (ASM) formalism in combination with the CoreASM open source tool environment for high-level design and experimental validation of complex distributed systems. Emphasizing the early phases of the design process, a guiding principle is to support freedom of experimentation by minimizing the need for encoding. CoreASM has been developed and tested building on a broad scope of applications, spanning computational criminology, maritime surveillance and situation analysis. We critically reexamine here the CoreASM project in light of three different application scenarios.
Pandey, Daya Shankar; Pan, Indranil; Das, Saptarshi; Leahy, James J; Kwapinski, Witold
2015-03-01
A multi-gene genetic programming technique is proposed as a new method to predict syngas yield and the lower heating value for municipal solid waste gasification in a fluidized bed gasifier. The study shows that the predicted outputs of the municipal solid waste gasification process are in good agreement with the experimental dataset and also generalise well to validation (untrained) data. Published experimental datasets are used for model training and validation purposes. The results show the effectiveness of the genetic programming technique for solving complex nonlinear regression problems. The multi-gene genetic programming model is also compared with a single-gene genetic programming model to show the relative merits and demerits of the technique. This study demonstrates that the genetic programming based data-driven modelling strategy can be a good candidate for developing models for other types of fuels as well. Copyright © 2014 Elsevier Ltd. All rights reserved.
Monte Carlo simulation of Ray-Scan 64 PET system and performance evaluation using GATE toolkit
NASA Astrophysics Data System (ADS)
Li, Suying; Zhang, Qiushi; Vuletic, Ivan; Xie, Zhaoheng; Yang, Kun; Ren, Qiushi
2017-02-01
In this study, we aimed to develop a GATE model for simulation of the Ray-Scan 64 PET scanner and to model its performance characteristics. A detailed implementation of the system geometry and physical processes was included in the simulation model. We then modeled the performance characteristics of the Ray-Scan 64 PET system for the first time, based on the National Electrical Manufacturers Association (NEMA) NU-2 2007 protocols, and validated the model against experimental measurements, including spatial resolution, sensitivity, counting rates and noise equivalent count rate (NECR). Moreover, an accurate dead time module was investigated to simulate the counting rate performance. Overall, the results showed reasonable agreement between simulation and experimental data. The validation results showed the reliability and feasibility of the GATE model for evaluating the major performance characteristics of the Ray-Scan 64 PET system. It provides a useful tool for a wide range of research applications.
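For reference, the NEMA noise-equivalent count rate used in such evaluations is NECR = T² / (T + S + kR) for trues T, scatter S, and randoms R (k = 1 for smoothed randoms). The sketch below evaluates it over an illustrative activity sweep; the rate models are invented, not Ray-Scan 64 data.

    import numpy as np

    def necr(trues, scatter, randoms, k=1.0):
        # NEMA NU-2 noise-equivalent count rate figure of merit.
        return trues**2 / (trues + scatter + k * randoms)

    activity = np.linspace(1, 100, 5)            # kBq/mL, illustrative
    T = 2e4 * activity / (1 + 0.01 * activity)   # trues saturate with dead time
    S = 0.4 * T                                  # scatter proportional to trues
    R = 5.0 * activity**2                        # randoms grow quadratically
    for a, t, s, r in zip(activity, T, S, R):
        print(f"{a:5.1f} kBq/mL  NECR = {necr(t, s, r):10.1f} cps")
    # The NECR curve peaks where randoms begin to dominate, which is why it is
    # reported as a function of activity concentration.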
2015-01-01
The 5-hydroxytryptamine 1A (5-HT1A) serotonin receptor has been an attractive target for treating mood and anxiety disorders such as schizophrenia. We have developed binary classification quantitative structure-activity relationship (QSAR) models of 5-HT1A receptor binding activity using data retrieved from the PDSP Ki database. The prediction accuracy of these models was estimated by external 5-fold cross-validation as well as using an additional validation set comprising 66 structurally distinct compounds from the World of Molecular Bioactivity database. These validated models were then used to mine three major types of chemical screening libraries, i.e., drug-like libraries, GPCR-targeted libraries, and diversity libraries, to identify novel computational hits. The five best hits from each class of libraries were chosen for further experimental testing in radioligand binding assays, and nine of the 15 hits were confirmed to be active experimentally with binding affinity better than 10 μM. The most active compound, Lysergol, from the diversity library showed a very high binding affinity (Ki) of 2.3 nM against the 5-HT1A receptor. The novel 5-HT1A actives identified with the QSAR-based virtual screening approach could potentially be developed as novel anxiolytics or antischizophrenic drugs. PMID:24410373
Herbort, Maike C.; Iseev, Jenny; Stolz, Christopher; Roeser, Benedict; Großkopf, Nora; Wüstenberg, Torsten; Hellweg, Rainer; Walter, Henrik; Dziobek, Isabel; Schott, Björn H.
2016-01-01
We present the ToMenovela, a stimulus set developed to provide normatively rated socio-emotional stimuli showing a varying number of characters in emotionally laden interactions, for experimental investigations of (i) cognitive and (ii) affective Theory of Mind (ToM), (iii) emotional reactivity, and (iv) complex emotion judgment with respect to Ekman's basic emotions (happiness, anger, disgust, fear, sadness, surprise; Ekman and Friesen, 1975). Stimuli were generated with a focus on ecological validity and consist of 190 scenes depicting daily-life situations. Two or more of eight main characters with distinct biographies and personalities are depicted in each scene picture. To obtain an initial evaluation of the stimulus set and to pave the way for future studies in clinical populations, normative data on each stimulus of the set were obtained from a sample of 61 neurologically and psychiatrically healthy participants (31 female, 30 male; mean age 26.74 ± 5.84), including a visual analog scale rating of Ekman's basic emotions (happiness, anger, disgust, fear, sadness, surprise) and free-text descriptions of the content of each scene. The ToMenovela is being developed to provide standardized material of social scenes available to researchers in the study of social cognition. It should facilitate experimental control while keeping ecological validity high. PMID:27994562
Experimental Validation of Normalized Uniform Load Surface Curvature Method for Damage Localization
Jung, Ho-Yeon; Sung, Seung-Hoon; Jung, Hyung-Jo
2015-01-01
In this study, we experimentally validated the normalized uniform load surface (NULS) curvature method, which has been developed recently to assess damage localization in beam-type structures. The normalization technique allows for the accurate assessment of damage localization with greater sensitivity irrespective of the damage location. In this study, damage to a simply supported beam was numerically and experimentally investigated on the basis of the changes in the NULS curvatures, which were estimated from the modal flexibility matrices obtained from the acceleration responses under an ambient excitation. Two damage scenarios were considered for the single damage case as well as the multiple damages case by reducing the bending stiffness (EI) of the affected element(s). Numerical simulations were performed using MATLAB as a preliminary step. During the validation experiments, a series of tests were performed. It was found that the damage locations could be identified successfully without any false-positive or false-negative detections using the proposed method. For comparison, the damage detection performances were compared with those of two other well-known methods based on the modal flexibility matrix, namely, the uniform load surface (ULS) method and the ULS curvature method. It was confirmed that the proposed method is more effective for investigating the damage locations of simply supported beams than the two conventional methods in terms of sensitivity to damage under measurement noise. PMID:26501286
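A minimal sketch of the modal-flexibility chain this family of methods rests on, assuming mass-normalized modes: build F ≈ Σ φ_i φ_iᵀ / ω_i², form the uniform load surface u = F·1, and take its discrete curvature, in which damage appears as a local anomaly. The beam modes and frequencies below are analytic placeholders, and the normalization step specific to NULS is omitted.

    import numpy as np

    def uls_curvature(phis, omegas, dx=1.0):
        # Modal flexibility from mass-normalized mode shapes and frequencies.
        F = sum(np.outer(p, p) / w**2 for p, w in zip(phis, omegas))
        u = F @ np.ones(F.shape[0])           # uniform load surface (unit load)
        curv = np.zeros_like(u)
        curv[1:-1] = (u[2:] - 2 * u[1:-1] + u[:-2]) / dx**2   # central differences
        return curv

    x = np.linspace(0, np.pi, 21)
    phis = [np.sin(n * x) for n in (1, 2, 3)]     # simply supported beam modes
    omegas = [n**2 * 10.0 for n in (1, 2, 3)]     # illustrative frequencies
    print(np.round(uls_curvature(phis, omegas), 5))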
Damodara, Vijaya; Chen, Daniel H; Lou, Helen H; Rasel, Kader M A; Richmond, Peyton; Wang, Anan; Li, Xianchang
2017-05-01
Emissions from flares constitute unburned hydrocarbons, carbon monoxide (CO), soot, and other partially burned and altered hydrocarbons, along with carbon dioxide (CO2) and water. Soot or visible smoke is of particular concern for flare operators and regulatory agencies. The goal of the study is to develop a computational fluid dynamics (CFD) model capable of predicting flare combustion efficiency (CE) and soot emission. Since detailed combustion mechanisms are too complicated for CFD application, a 50-species reduced mechanism, LU 3.0.1, was developed. LU 3.0.1 is capable of handling C4 hydrocarbons and soot precursor species (C2H2, C2H4, C6H6). The new reduced mechanism LU 3.0.1 was first validated against experimental performance indicators: laminar flame speed, adiabatic flame temperature, and ignition delay. Further, CFD simulations using LU 3.0.1 were run to predict soot emission and CE of air-assisted flare tests conducted in 2010 in Tulsa, Oklahoma, using ANSYS Fluent software. Results of the non-premixed probability density function (PDF) model and the eddy dissipation concept (EDC) model are discussed. It is also noteworthy that, when used in conjunction with the EDC turbulence-chemistry model, LU 3.0.1 can reasonably predict volatile organic compound (VOC) emissions as well. In summary, a reduced combustion mechanism containing 50 C1-C4 species and soot precursors has been developed and validated against experimental data. The combustion mechanism is then employed in CFD modeling of soot emission and combustion efficiency (CE) of controlled flares for which experimental soot and CE data are available. The validated CFD modeling tools are useful for oil, gas, and chemical industries to comply with the U.S. Environmental Protection Agency's (EPA) mandate to achieve smokeless flaring with a high CE.
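The carbon-balance definition of combustion efficiency commonly used in flare studies can be stated compactly: CE is the fraction of plume carbon that ends up as CO2. A sketch under invented plume mole fractions (not the Tulsa test data):

    # Carbon atoms per molecule for each measured plume species.
    species_carbon = {"CO2": 1, "CO": 1, "CH4": 1, "C2H4": 2, "C3H6": 3}
    # Illustrative plume mole fractions (background-corrected).
    plume = {"CO2": 0.050, "CO": 0.0020, "CH4": 0.0008,
             "C2H4": 0.0003, "C3H6": 0.0001}

    carbon = {s: plume[s] * species_carbon[s] for s in plume}
    ce = carbon["CO2"] / sum(carbon.values())   # CO2 carbon / total plume carbon
    print(f"combustion efficiency = {100 * ce:.2f}%")   # ~93% for these numbers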
The Research on the High-Protein Low-Calorie Food Recipe for Teenager Gymnastics Athletes.
Wei, Cong
2015-01-01
To prevent fat deposition and weight gain in teenage gymnastics athletes, a rational diet should be supplied. This paper considers the normal growth and development of athletes, body fat deposition, protein requirements, and satiety, and configures a high-protein, low-calorie food recipe. The composition and essential amino acid content of the recipe are then analyzed. Finally, 18 adolescent gymnastics athletes were chosen as subjects to verify the validity of the formula, and the experimental results were analyzed. The analysis shows that this recipe basically meets the design requirements.
NASA Technical Reports Server (NTRS)
Cramer, J. M.; Pal, S.; Marshall, W. M.; Santoro, R. J.
2003-01-01
Contents include the following: 1. Motivation: support NASA's 3rd-generation launch vehicle technology program; RBCC is a promising candidate for a 3rd-generation propulsion system. 2. Approach: focus on ejector mode performance (Mach 0-3); perform testing on an established flowpath geometry; use conventional propulsion measurement techniques; use advanced optical diagnostic techniques to measure local combustion gas properties. 3. Objectives: gain physical understanding of detailed mixing and combustion phenomena; establish an experimental data set for CFD code development and validation.
Development and validation of RAYDOSE: a Geant4-based application for molecular radiotherapy
NASA Astrophysics Data System (ADS)
Marcatili, S.; Pettinato, C.; Daniels, S.; Lewis, G.; Edwards, P.; Fanti, S.; Spezi, E.
2013-04-01
We developed and validated a Monte-Carlo-based application (RAYDOSE) to generate patient-specific 3D dose maps on the basis of pre-treatment imaging studies. A CT DICOM image is used to model patient geometry, while repeated PET scans are employed to assess radionuclide kinetics and distribution at the voxel level. In this work, we describe the structure of this application and present the tests performed to validate it against reference data and experiments. We used the spheres of a NEMA phantom to calculate S values and total doses. The comparison with reference data from OLINDA/EXM showed an agreement within 2% for a sphere size above 2.8 cm diameter. A custom heterogeneous phantom composed of several layers of Perspex and lung equivalent material was used to compare TLD measurements of gamma radiation from 131I to Monte Carlo simulations. An agreement within 5% was found. RAYDOSE has been validated against reference data and experimental measurements and can be a useful multi-modality platform for treatment planning and research in MRT.
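As a reminder of the MIRD-style arithmetic behind such dose maps, D = Ã × S, where Ã is the time-integrated activity obtained from the serial PET scans. The sketch below integrates invented time-activity points with a trapezoid rule plus an analytic exponential tail; the S value is an arbitrary placeholder, whereas RAYDOSE computes voxelized S values by Monte Carlo.

    import numpy as np

    t = np.array([1.0, 4.0, 24.0, 48.0]) * 3600.0     # imaging times (s)
    A = np.array([100.0, 80.0, 30.0, 12.0]) * 1e6     # region activity (Bq)

    # Trapezoid over the observed points, then an exponential tail using the
    # effective decay rate of the last two points.
    A_trap = float(np.sum(0.5 * (A[1:] + A[:-1]) * np.diff(t)))
    lam = np.log(A[-2] / A[-1]) / (t[-1] - t[-2])     # effective decay constant
    A_tilde = A_trap + A[-1] / lam                    # Bq*s

    S = 2.5e-14                                       # Gy per Bq*s, placeholder
    print(f"A_tilde = {A_tilde:.3e} Bq s, dose = {A_tilde * S:.3f} Gy")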
Casper, T. A.; Meyer, W. H.; Jackson, G. L.; ...
2010-12-08
We are exploring characteristics of ITER startup scenarios in similarity experiments conducted on the DIII-D Tokamak. In these experiments, we have validated scenarios for the ITER current ramp up to full current and developed methods to control the plasma parameters to achieve stability. Predictive simulations of ITER startup using 2D free-boundary equilibrium and 1D transport codes rely on accurate estimates of the electron and ion temperature profiles that determine the electrical conductivity and pressure profiles during the current rise. Here we present results of validation studies that apply the transport model used by the ITER team to DIII-D discharge evolution, and comparisons with data from our similarity experiments.
SU-E-T-664: Radiobiological Modeling of Prophylactic Cranial Irradiation in Mice
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, D; Debeb, B; Woodward, W
Purpose: Prophylactic cranial irradiation (PCI) is a clinical technique used to reduce the incidence of brain metastasis and improve overall survival in select patients with ALL and SCLC, and we have shown the potential of PCI in select breast cancer patients through a mouse model (manuscript in preparation). We developed a computational model using our experimental results to demonstrate the advantage of treating brain micro-metastases early. Methods: MATLAB was used to develop the computational model of brain metastasis and PCI in mice. The number of metastases per mouse and the volume of metastases from four- and eight-week endpoints were fit to normal and log-normal distributions, respectively. Model input parameters were optimized so that model output would match the experimental number of metastases per mouse. A limiting dilution assay was performed to validate the model. The effect of radiation at different time points was computationally evaluated through the endpoints of incidence, number of metastases, and tumor burden. Results: The correlation between experimental number of metastases per mouse and the Gaussian fit was 87% and 66% at the two endpoints. The experimental volumes and the log-normal fit had correlations of 99% and 97%. In the optimized model, the correlation between number of metastases per mouse and the Gaussian fit was 96% and 98%. The log-normal volume fit and the model agree 100%. The model was validated by a limiting dilution assay, where the correlation was 100%. The model demonstrates that cells are very sensitive to radiation at early time points, and delaying treatment introduces a threshold dose at which point the incidence and number of metastases decline. Conclusion: We have developed a computational model of brain metastasis and PCI in mice that is highly correlated to our experimental data. The model shows that early treatment of subclinical disease is highly advantageous.
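A sketch of the model's two statistical ingredients as described: per-mouse metastasis counts drawn from a (truncated) normal distribution and per-lesion volumes from a log-normal. All parameters are invented placeholders rather than the paper's fitted values.

    import numpy as np

    rng = np.random.default_rng(1)
    n_mice = 10
    # Counts ~ normal, rounded and truncated at zero (placeholder mean/SD).
    counts = np.clip(rng.normal(6, 2, n_mice).round().astype(int), 0, None)
    # Per-lesion volumes ~ log-normal (placeholder parameters), in mm^3.
    volumes = [rng.lognormal(mean=-2.0, sigma=1.0, size=k) for k in counts]
    for i, v in enumerate(volumes[:3]):
        print(f"mouse {i}: {len(v)} mets, total burden {v.sum():.2f} mm^3")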
Aircraft Engine Technology for Green Aviation to Reduce Fuel Burn
NASA Technical Reports Server (NTRS)
Hughes, Christopher E.; VanZante, Dale E.; Heidmann, James D.
2013-01-01
The NASA Fundamental Aeronautics Program Subsonic Fixed Wing Project and Integrated Systems Research Program Environmentally Responsible Aviation Project in the Aeronautics Research Mission Directorate are conducting research on advanced aircraft technology to address the environmental goals of reducing fuel burn, noise, and NOx emissions for aircraft in 2020 and beyond. Both projects, in collaborative partnerships with U.S. industry, academia, and other government agencies, have made significant progress toward reaching the N+2 (2020) and N+3 (beyond 2025) installed fuel burn goals through fundamental aircraft engine technology development, subscale component experimental investigations, full-scale integrated systems validation testing, and the development and validation of state-of-the-art computational design and analysis codes. Specific areas of propulsion technology research are discussed along with progress to date.
Holland, Chris [UC San Diego, San Diego, California, United States]
2017-12-09
The upcoming ITER experiment (www.iter.org) represents the next major milestone in realizing the promise of using nuclear fusion as a commercial energy source, by moving into the "burning plasma" regime where the dominant heat source is the internal fusion reactions. As part of its support for the ITER mission, the US fusion community is actively developing validated predictive models of the behavior of magnetically confined plasmas. In this talk, I will describe how the plasma community is using the latest high performance computing facilities to develop and refine our models of the nonlinear, multiscale plasma dynamics, and how recent advances in experimental diagnostics are allowing us to directly test and validate these models at an unprecedented level.
Technical Note: Procedure for the calibration and validation of kilo-voltage cone-beam CT models
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vilches-Freixas, Gloria; Létang, Jean Michel; Rit,
2016-09-15
Purpose: The aim of this work is to propose a general and simple procedure for the calibration and validation of kilo-voltage cone-beam CT (kV CBCT) models against experimental data. Methods: The calibration and validation of the CT model is a two-step procedure: the source model, then the detector model. The source is described by the direction-dependent photon energy spectrum at each voltage, while the detector is described by the pixel intensity value as a function of the direction and the energy of incident photons. The measurements for the source consist of a series of dose measurements in air performed at each voltage with varying filter thicknesses and materials in front of the x-ray tube. The measurements for the detector are acquisitions of projection images using the same filters and several tube voltages. The proposed procedure has been applied to calibrate and assess the accuracy of simple models of the source and the detector of three commercial kV CBCT units. If the CBCT system models had been calibrated differently, the current procedure would have been exclusively used to validate the models. Several high-purity attenuation filters of aluminum, copper, and silver, combined with a dosimeter sensitive to the range of voltages of interest, were used. A sensitivity analysis of the model has also been conducted for each parameter of the source and the detector models. Results: Average deviations between experimental and theoretical dose values are below 1.5% after calibration for the three x-ray sources. The predicted energy deposited in the detector agrees with experimental data within 4% for all imaging systems. Conclusions: The authors developed and applied an experimental procedure to calibrate and validate any model of the source and the detector of a CBCT unit. The present protocol has been successfully applied to three x-ray imaging systems. The minimum requirements in terms of material and equipment would make its implementation suitable in most clinical environments.
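The forward model being calibrated can be summarized as a Beer-Lambert sum over the source spectrum weighted by the dosimeter response. A sketch with an invented four-bin spectrum and rough aluminum attenuation coefficients (all values illustrative, not the paper's calibration data):

    import numpy as np

    E = np.array([30., 50., 70., 90.])            # energy bins (keV)
    S = np.array([0.2, 0.4, 0.3, 0.1])            # relative fluence per bin
    mu_al = np.array([3.0, 1.0, 0.55, 0.45])      # Al attenuation (1/cm), rough
    resp = np.array([0.9, 1.0, 1.0, 0.95])        # dosimeter energy response

    def predicted_dose(t_cm):
        # Spectrum-weighted transmission through t_cm of added Al filtration.
        return float(np.sum(S * np.exp(-mu_al * t_cm) * resp))

    for t in (0.0, 0.2, 0.4):                     # added Al filtration (cm)
        print(f"{t:.1f} cm Al -> relative dose {predicted_dose(t):.4f}")
    # Calibration adjusts the spectrum S until predictions match the measured
    # dose-vs-filter curve; validation repeats the comparison on other filters.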
Bridging the Gap: Linking Simulation and Testing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Krajewski, Paul E.; Carsley, John; Stoudt, Mark R.
2012-09-01
The Materials Genome Initiative (MGI), a key enabler for the Advanced Manufacturing Partnership announced in 2011 by U.S. President Barack Obama, was established to accelerate the development and deployment of advanced materials. The MGI is driven by the need to "bridge the gap" between (I) experimental results and computational analysis, to enable the rapid development and validation of new materials, and (II) the processes required to convert these materials into usable goods.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fort, James A.; Pfund, David M.; Sheen, David M.
2007-04-01
The MFDRC was formed in 1998 to advance the state-of-the-art in simulating multiphase turbulent flows by developing advanced computational models for gas-solid flows that are experimentally validated over a wide range of industrially relevant conditions. The goal was to transfer the resulting validated models to interested US commercial CFD software vendors, who would then propagate the models as part of new code versions to their customers in the US chemical industry. Since the lack of detailed data sets at industrially relevant conditions is the major roadblock to developing and validating multiphase turbulence models, a significant component of the work involved flow measurements on an industrial-scale riser contributed by Westinghouse, which was subsequently installed at SNL. Model comparisons were performed against these datasets by LANL. A parallel Office of Industrial Technology (OIT) project within the consortium made similar comparisons between riser measurements and models at NETL. Measured flow quantities of interest included volume fraction, velocity, and velocity-fluctuation profiles for both gas and solid phases at various locations in the riser. Some additional techniques were required for these measurements beyond what was currently available. PNNL's role on the project was to work with the SNL experimental team to develop and test two new measurement techniques, acoustic tomography and millimeter-wave velocimetry. Acoustic tomography is a promising technique for gas-solid flow measurements in risers, and PNNL has substantial related experience in this area. PNNL is also active in developing millimeter-wave imaging techniques, and this technology presents an additional approach to make the desired measurements. PNNL supported the advanced diagnostics development part of this project by evaluating these techniques and then by adapting and developing the selected technology for bulk gas-solids flows and implementing it for testing in the SNL riser testbed.
Content validity and reliability of test of gross motor development in Chilean children
Cano-Cappellacci, Marcelo; Leyton, Fernanda Aleitte; Carreño, Joshua Durán
2016-01-01
ABSTRACT OBJECTIVE To validate a Spanish version of the Test of Gross Motor Development (TGMD-2) for the Chilean population. METHODS Descriptive, transversal, non-experimental validity and reliability study. Four translators, three experts and 92 Chilean children, from five to 10 years of age, students from a primary school in Santiago, Chile, participated. The committee of experts carried out translation, back-translation and revision processes to determine the translinguistic equivalence and content validity of the test, using the content validity index in 2013. In addition, a pilot implementation was carried out to determine the reliability of the test in Spanish, using the intraclass correlation coefficient and the Bland-Altman method. We evaluated whether the results presented significant differences when replacing the bat with a racket, using the t-test. RESULTS We obtained a content validity index higher than 0.80 for language clarity and relevance of the TGMD-2 for children. There were significant differences in the object control subtest when comparing the results with bat and racket. The intraclass correlation coefficient for inter-rater, intra-rater and test-retest reliability was greater than 0.80 in all cases. CONCLUSIONS The TGMD-2 has appropriate content validity to be applied in the Chilean population. The reliability of this test is within the appropriate parameters and its use could be recommended in this population after the establishment of normative data, setting a further precedent for the validation in other Latin American countries. PMID:26815160
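For readers unfamiliar with the agreement statistics used here, the sketch below computes the Bland-Altman bias and 95% limits of agreement for two raters' scores; the scores are invented, and the intraclass correlation would be computed analogously from the same paired data.

    import numpy as np

    # Invented paired TGMD-2 scores from two raters for eight children.
    rater1 = np.array([34, 40, 28, 45, 38, 31, 42, 36], float)
    rater2 = np.array([33, 42, 27, 44, 40, 30, 41, 38], float)

    diff = rater1 - rater2
    bias = diff.mean()                       # systematic difference
    loa = 1.96 * diff.std(ddof=1)            # half-width of 95% limits
    print(f"bias = {bias:.2f}, limits of agreement = "
          f"[{bias - loa:.2f}, {bias + loa:.2f}]")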
Validation of numerical model for cook stove using Reynolds averaged Navier-Stokes based solver
NASA Astrophysics Data System (ADS)
Islam, Md. Moinul; Hasan, Md. Abdullah Al; Rahman, Md. Mominur; Rahaman, Md. Mashiur
2017-12-01
Biomass-fired cook stoves have, for many years, been the main cooking appliance for the rural people of developing countries. Several studies have been carried out to find efficient stove designs. In the present study, a numerical model of an improved household cook stove is developed to analyze the heat transfer and flow behavior of the gas during operation. The numerical model is validated against experimental results. Computation of the numerical model is executed using a non-premixed combustion model. The Reynolds-averaged Navier-Stokes (RANS) equations, along with the κ-ε model, govern the turbulent flow within the computational domain. The computational results are in good agreement with the experiment. The developed numerical model can be used to predict the effect of different biomasses on the efficiency of the cook stove.
NASA Astrophysics Data System (ADS)
Reza, M.; Ibrahim, M.; Rahayu, Y. S.
2018-01-01
This research aims to develop problem-based learning oriented teaching materials to improve students' mastery of concepts and critical thinking skills. The procedure was divided into two phases: a developmental phase and an experimental phase. This developmental research used the Four-D model; however, the development process did not involve the last stage, dissemination. The teaching and learning materials developed consist of a lesson plan, student handbook, student worksheet, achievement test, and critical thinking skill test. The experimental phase employs a one-group pretest-posttest design. Results show that the validity of the developed teaching materials was good and that students' activities were enhanced, with positive responses to the teaching and learning process. Furthermore, the learning materials improve the students' mastery of concepts and critical thinking skills.
Development, Validation and Integration of the ATLAS Trigger System Software in Run 2
NASA Astrophysics Data System (ADS)
Keyes, Robert; ATLAS Collaboration
2017-10-01
The trigger system of the ATLAS detector at the LHC is a combination of hardware, firmware, and software, associated with various sub-detectors, that must seamlessly cooperate in order to select one collision of interest out of every 40,000 delivered by the LHC every millisecond. These proceedings discuss the challenges, organization, and work flow of the ongoing trigger software development, validation, and deployment. The goal of this development is to ensure that the most up-to-date algorithms are used to optimize the performance of the experiment. The goal of the validation is to ensure the reliability and predictability of the software performance. Integration tests are carried out to ensure that the software deployed to the online trigger farm during data-taking runs as desired. Trigger software is validated by emulating online conditions using a benchmark run and mimicking the reconstruction that occurs during normal data-taking. This exercise is computationally demanding and thus runs on the ATLAS high performance computing grid with high priority. Performance metrics, ranging from low-level memory and CPU requirements to distributions and efficiencies of high-level physics quantities, are visualized and validated by a range of experts. This is a multifaceted critical task that ties together many aspects of the experimental effort and thus directly influences the overall performance of the ATLAS experiment.
Seo, Hyun-Ju; Kim, Soo Young; Lee, Yoon Jae; Jang, Bo-Hyoung; Park, Ji-Eun; Sheen, Seung-Soo; Hahn, Seo Kyung
2016-02-01
To develop a study Design Algorithm for Medical Literature on Intervention (DAMI) and test its interrater reliability, construct validity, and ease of use. We developed and then revised the DAMI to include detailed instructions. To test the DAMI's reliability, we used a purposive sample of 134 primary, mainly nonrandomized studies. We then compared the study designs as classified by the original authors and through the DAMI. Unweighted kappa statistics were computed to test interrater reliability and construct validity based on the level of agreement between the original and DAMI classifications. Assessment time was also recorded to evaluate ease of use. The DAMI includes 13 study designs, covering experimental and observational studies of interventions and exposure. Both the interrater reliability (unweighted kappa = 0.67; 95% CI [0.64-0.75]) and construct validity (unweighted kappa = 0.63; 95% CI [0.52-0.67]) were substantial. Mean classification time using the DAMI was 4.08 ± 2.44 minutes (range, 0.51-10.92). The DAMI showed substantial interrater reliability and construct validity. Furthermore, given its ease of use, it could be used to accurately classify medical literature for systematic reviews of interventions while minimizing disagreement between the authors of such reviews.
Rational selection of training and test sets for the development of validated QSAR models
NASA Astrophysics Data System (ADS)
Golbraikh, Alexander; Shen, Min; Xiao, Zhiyan; Xiao, Yun-De; Lee, Kuo-Hsiung; Tropsha, Alexander
2003-02-01
Quantitative Structure-Activity Relationship (QSAR) models are used increasingly to screen chemical databases and/or virtual chemical libraries for potentially bioactive molecules. These developments emphasize the importance of rigorous model validation to ensure that the models have acceptable predictive power. Using the k nearest neighbors (kNN) variable selection QSAR method for the analysis of several datasets, we have demonstrated recently that the widely accepted leave-one-out (LOO) cross-validated R² (q²) is an inadequate characteristic to assess the predictive ability of the models [Golbraikh, A., Tropsha, A. Beware of q²! J. Mol. Graphics Mod. 20, 269-276, (2002)]. Herein, we provide additional evidence that there exists no correlation between the values of q² for the training set and the accuracy of prediction (R²) for the test set, and argue that this observation is a general property of any QSAR model developed with LOO cross-validation. We suggest that external validation using rationally selected training and test sets provides a means to establish a reliable QSAR model. We propose several approaches to the division of experimental datasets into training and test sets and apply them in QSAR studies of 48 functionalized amino acid anticonvulsants and a series of 157 epipodophyllotoxin derivatives with antitumor activity. We formulate a set of general criteria for the evaluation of the predictive power of QSAR models.
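As a toy illustration of the distinction the authors draw (a generic sketch, not their kNN QSAR implementation; the descriptor matrix, ridge model, and split below are invented for illustration), LOO q² computed on a training set can be compared with R² on an external test set:

```python
# Sketch: LOO-cross-validated q^2 on a training set vs. R^2 on an external
# test set, illustrating why q^2 alone can overstate predictivity.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import LeaveOneOut, train_test_split
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)
X = rng.normal(size=(120, 10))          # hypothetical descriptor matrix
y = X[:, 0] - 0.5 * X[:, 1] + rng.normal(scale=0.5, size=120)  # toy activity

# Division into training and external test sets (random here; the paper
# proposes rational selection schemes instead)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=1)

# LOO q^2 on the training set
preds = np.empty_like(y_tr)
for tr_idx, val_idx in LeaveOneOut().split(X_tr):
    model = Ridge(alpha=1.0).fit(X_tr[tr_idx], y_tr[tr_idx])
    preds[val_idx] = model.predict(X_tr[val_idx])
q2 = r2_score(y_tr, preds)

# External R^2 on the held-out test set
r2_ext = r2_score(y_te, Ridge(alpha=1.0).fit(X_tr, y_tr).predict(X_te))
print(f"LOO q2 = {q2:.3f}, external R2 = {r2_ext:.3f}")
```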
Saxena, Anupam; Lipson, Hod; Valero-Cuevas, Francisco J.
2012-01-01
In systems and computational biology, much effort is devoted to functional identification of systems and networks at the molecular or cellular scale. However, similarly important networks exist at anatomical scales, such as the tendon network of human fingers: the complex array of collagen fibers that transmits and distributes muscle forces to finger joints. This network is critical to the versatility of the human hand, and its function has been debated since at least the 16th century. Here, we experimentally infer the structure (both topology and parameter values) of this network through sparse interrogation with force inputs. A population of models representing this structure co-evolves in simulation with a population of informative future force inputs via the predator-prey estimation-exploration algorithm. Model fitness depends on their ability to explain experimental data, while the fitness of future force inputs depends on causing maximal functional discrepancy among current models. We validate our approach by inferring two known synthetic Latex networks, and one anatomical tendon network harvested from a cadaver's middle finger. We find that functionally similar but structurally diverse models can exist within a narrow range of the training set and cross-validation errors. For the Latex networks, models with low training set error [<4%] and resembling the known network have the smallest cross-validation errors [∼5%]. The low training set [<4%] and cross-validation [<7.2%] errors for the models of the cadaveric specimen demonstrate what, to our knowledge, is the first experimental inference of the functional structure of complex anatomical networks. This work expands current bioinformatics inference approaches by demonstrating that sparse, yet informative interrogation of biological specimens holds significant computational advantages in accurate and efficient inference over random testing, or assuming model topology and only inferring parameter values. These findings also hold clues to both our evolutionary history and the development of versatile machines. PMID:23144601
NASA Astrophysics Data System (ADS)
Khouli, F.
An aeroelastic phenomenon known as blade sailing, encountered during maritime operation of helicopters, is identified as a factor that limits the tactical flexibility of helicopter operation in some sea conditions. The hazards associated with this phenomenon, and its complexity owing to the number of factors contributing to its occurrence, led previous investigators to conclude that advanced and validated simulation tools are best suited to investigate it. A research gap is identified in terms of scaled experimental investigation of this phenomenon and practical engineering solutions to alleviate its negative impact on maritime helicopter operation. The feasibility of a proposed alleviation strategy required addressing a gap in the modelling of thin-walled composite active beams/rotor blades. The modelling is performed by extending a mathematically consistent and asymptotic reduction strategy of the 3-D elastic problem to account for embedded active materials. The derived active cross-sectional theory is validated using 2-D finite element results for closed and open cross-sections. The geometrically exact intrinsic formulation of active maritime rotor systems is demonstrated to yield compact and symbolic governing equations. The intrinsic feature is shown to allow a classical and proven solution scheme to be successfully applied to obtain time history solutions. A Froude-scaled experimental rotor was designed, built, and tested in a scaled ship airwake environment and representative ship motion. Based on experimental and simulation data, conclusions are drawn regarding the influence of the maritime operating environment and the rotor operating parameters on the blade sailing phenomenon. The experimental data are also used to successfully validate the developed simulation tools. The feasibility of an open-loop control strategy based on the integral active twist concept to counter blade sailing is established in a Mach-scaled maritime operation environment. Recommendations are proposed to improve the strategy and further establish its validity in a full-scale maritime operation environment.
Reference Correlation for the Viscosity of Ammonia from the Triple Point to 725 K and up to 50 MPa
NASA Astrophysics Data System (ADS)
Monogenidou, S. A.; Assael, M. J.; Huber, M. L.
2018-06-01
This paper presents a new wide-ranging correlation for the viscosity of ammonia based on critically evaluated experimental data. The correlation is designed to be used with a recently developed equation of state, and it is valid from the triple point to 725 K at pressures up to 50 MPa. The estimated uncertainty varies depending on the temperature and pressure, from 0.6% to 5%. The correlation behaves in a physically reasonable manner when extrapolated to 100 MPa; however, care should be taken when using the correlation outside of the validated range.
Finite Element Model Development For Aircraft Fuselage Structures
NASA Technical Reports Server (NTRS)
Buehrle, Ralph D.; Fleming, Gary A.; Pappa, Richard S.; Grosveld, Ferdinand W.
2000-01-01
The ability to extend the valid frequency range for finite element based structural dynamic predictions using detailed models of the structural components and attachment interfaces is examined for several stiffened aircraft fuselage structures. This extended dynamic prediction capability is needed for the integration of mid-frequency noise control technology. Beam, plate and solid element models of the stiffener components are evaluated. Attachment models between the stiffener and panel skin range from a line along the rivets of the physical structure to a constraint over the entire contact surface. The finite element models are validated using experimental modal analysis results.
New millennium program ST6: autonomous technologies for future NASA spacecraft
NASA Technical Reports Server (NTRS)
Chmielewski, Arthur B.; Chien, Steve; Sherwood, Robert; Wyman, William; Brady, T.; Buckley, S.; Tillier, C.
2005-01-01
The purpose of NASA's New Millennium Program (NMP) is to validate advanced technologies in space and thus lower the risk for the first mission user. The focus of NMP is only on those technologies that need the space environment for proper validation. The ST6 project has developed two advanced, experimental technologies for use on spacecraft of the future: the Autonomous Sciencecraft Experiment and the Inertial Stellar Compass. These technologies will improve a spacecraft's ability to decide what information to gather and send back to the ground, and to determine its own attitude and adjust its pointing.
NASA advanced turboprop research and concept validation program
DOE Office of Scientific and Technical Information (OSTI.GOV)
Whitlow, J.B. Jr.; Sievers, G.K.
1988-01-01
NASA has determined by experimental and analytical effort that use of advanced turboprop propulsion instead of the conventional turbofans in the older narrow-body airline fleet could reduce fuel consumption for this type of aircraft by up to 50 percent. In cooperation with industry, NASA has defined and implemented an Advanced Turboprop (ATP) program to develop and validate the technology required for these new high-speed, multibladed, thin, swept propeller concepts. This paper presents an overview of the analysis, model-scale test, and large-scale flight test elements of the program together with preliminary test results, as available.
Calibrated Blade-Element/Momentum Theory Aerodynamic Model of the MARIN Stock Wind Turbine: Preprint
DOE Office of Scientific and Technical Information (OSTI.GOV)
Goupee, A.; Kimball, R.; de Ridder, E. J.
2015-04-02
In this paper, a calibrated blade-element/momentum theory aerodynamic model of the MARIN stock wind turbine is developed and documented. The model is created using open-source software and calibrated to closely emulate experimental data obtained by the DeepCwind Consortium using a genetic algorithm optimization routine. The provided model will be useful for those interested in validating floating wind turbine numerical simulators that rely on experiments utilizing the MARIN stock wind turbine, for example, the International Energy Agency Wind Task 30's Offshore Code Comparison Collaboration Continued, with Correlation project.
Experimental validation of docking and capture using space robotics testbeds
NASA Technical Reports Server (NTRS)
Spofford, John
1991-01-01
Docking concepts include capture, berthing, and docking. The definitions of these terms, consistent with AIAA, are as follows: (1) capture (grasping)--the use of a manipulator to make initial contact and attachment between transfer vehicle and a platform; (2) berthing--positioning of a transfer vehicle or payload into platform restraints using a manipulator; and (3) docking--propulsive mechanical connection between vehicle and platform. The combination of the capture and berthing operations is effectively the same as docking; i.e., capture (grasping) + berthing = docking. These concepts are discussed in terms of Martin Marietta's ability to develop validation methods using robotics testbeds.
NASA Technical Reports Server (NTRS)
Tai, H.; Wilson, J. W.; Maiden, D. L.
2003-01-01
The atmospheric ionizing radiation (AIR) ER-2 preflight analysis, one of the first attempts to obtain a relatively complete measurement set of the high-altitude radiation level environment, is described in this paper. The primary thrust is to characterize the atmospheric radiation and to define dose levels at high-altitude flight. A secondary thrust is to develop and validate dosimetric techniques and monitoring devices for protecting aircrews. With a few chosen routes, we can measure the experimental results and validate the AIR model predictions. Eventually, as more measurements are made, we gain more understanding about the hazardous radiation environment and acquire more confidence in the prediction models.
Infrared Algorithm Development for Ocean Observations with EOS/MODIS
NASA Technical Reports Server (NTRS)
Brown, Otis B.
1997-01-01
Efforts continue under this contract to develop algorithms for the computation of sea surface temperature (SST) from MODIS infrared measurements. This effort includes radiative transfer modeling, comparison of in situ and satellite observations, development and evaluation of processing and networking methodologies for algorithm computation and data accession, evaluation of surface validation approaches for IR radiances, development of experimental instrumentation, and participation in MODIS (project) related activities. Activities in this contract period have focused on radiative transfer modeling, evaluation of atmospheric correction methodologies, field campaigns, analysis of field data, and participation in MODIS meetings.
Advanced ballistic range technology
NASA Technical Reports Server (NTRS)
Yates, Leslie A.
1993-01-01
Optical images, such as experimental interferograms, schlieren, and shadowgraphs, are routinely used to identify and locate features in experimental flow fields and for validating computational fluid dynamics (CFD) codes. Interferograms can also be used for comparing experimental and computed integrated densities. By constructing these optical images from flow-field simulations, one-to-one comparisons of computation and experiment are possible. During the period from February 1, 1992, to November 30, 1992, work has continued on the development of CISS (Constructed Interferograms, Schlieren, and Shadowgraphs), a code that constructs images from ideal- and real-gas flow-field simulations. In addition, research connected with the automated film-reading system and the proposed reactivation of the radiation facility has continued.
Bed inventory overturn in a circulating fluid bed riser with pant-leg structure
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jinjing Li; Wei Wang; Hairui Yang
2009-05-15
The special phenomenon termed bed inventory overturn in a circulating fluid bed (CFB) riser with pant-leg structure was studied through model calculation and experimental work. A compound pressure drop mathematical model was developed and validated with the experimental data from a cold experimental test rig. The model calculation results agree well with the measured data. In addition, the intensity of bed inventory overturn is directly proportional to the fluidizing velocity and inversely proportional to the branch point height. The results of the present study provide significant information for the design and operation of a CFB boiler with pant-leg structure.
Particle Substructure. A Common Theme of Discovery in this Century
DOE R&D Accomplishments Database
Panofsky, W. K. H.
1984-02-01
Some examples of modern developments in particle physics are given which demonstrate that the fundamental rules of quantum mechanics, applied to all forces in nature as they became understood, have retained their validity. The well-established laws of electricity and magnetism, reformulated in terms of quantum mechanics, have exhibited a truly remarkable numerical agreement between theory and experiment over an enormous range of observation. As experimental techniques have grown from the top of a laboratory bench to the large accelerators of today, the basic components of experimentation have changed vastly in scale but only little in basic function. More important, the motivation of those engaged in this type of experimentation has hardly changed at all.
Faraday waves in a Hele-Shaw cell
NASA Astrophysics Data System (ADS)
Li, Jing; Li, Xiaochen; Chen, Kaijie; Xie, Bin; Liao, Shijun
2018-04-01
We investigate Faraday waves in a Hele-Shaw cell via experimental, numerical, and theoretical studies. Inspired by the Kelvin-Helmholtz-Darcy theory, we develop gap-averaged Navier-Stokes equations and obtain stable standing waves at half the frequency of the external forced vibration. To remove the dependence of the numerical model on the experimentally measured wavelength, we take two-phase flow into consideration and derive a novel dispersion relation. The numerical results compare well with our experimental data, which effectively validates the proposed mathematical model. This model can therefore produce robust solutions of Faraday wave patterns and resolve related physical phenomena, which demonstrates the practical importance of the present study.
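For orientation, gap averaging across a Hele-Shaw cell of gap width b typically introduces a Darcy-type drag into the momentum equation; a common schematic form (not quoted from the paper, and omitting its surface tension and two-phase terms) is

$$ \frac{\partial \mathbf{u}}{\partial t} + (\mathbf{u}\cdot\nabla)\mathbf{u} = -\frac{1}{\rho}\nabla p - \frac{12\nu}{b^{2}}\,\mathbf{u} + \nu\nabla^{2}\mathbf{u} + \mathbf{g}(t), $$

where g(t) includes gravity modulated by the vertical vibration; the standing waves at half the forcing frequency are the classic subharmonic Faraday response.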
Optimizing LX-17 Thermal Decomposition Model Parameters with Evolutionary Algorithms
NASA Astrophysics Data System (ADS)
Moore, Jason; McClelland, Matthew; Tarver, Craig; Hsu, Peter; Springer, H. Keo
2017-06-01
We investigate and model the cook-off behavior of LX-17 because this knowledge is critical to understanding system response in abnormal thermal environments. Thermal decomposition of LX-17 has been explored in conventional ODTX (One-Dimensional Time-to-eXplosion), PODTX (ODTX with pressure-measurement), TGA (thermogravimetric analysis), and DSC (differential scanning calorimetry) experiments using varied temperature profiles. These experimental data are the basis for developing multiple reaction schemes with coupled mechanics in LLNL's multi-physics hydrocode, ALE3D (Arbitrary Lagrangian-Eulerian code in 2D and 3D). We employ evolutionary algorithms to optimize reaction rate parameters on high performance computing clusters. Once experimentally validated, this model will be scalable to a number of applications involving LX-17 and can be used to develop more sophisticated experimental methods. Furthermore, the optimization methodology developed herein should be applicable to other high explosive materials. This work was performed under the auspices of the U.S. DOE by LLNL under contract DE-AC52-07NA27344. LLNS, LLC.
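A minimal sketch of the evolutionary-optimization step (a toy single-step induction-time model fit with SciPy's differential evolution; the data, objective, and bounds are invented stand-ins for the paper's ALE3D-coupled workflow on HPC clusters):

```python
# Sketch: evolutionary optimization of Arrhenius rate parameters against
# time-to-explosion style data. Illustrative only.
import numpy as np
from scipy.optimize import differential_evolution

R = 8.314  # J/(mol K)

# Hypothetical ODTX-like data: (temperature K, measured time-to-explosion s)
data = np.array([[500.0, 1.2e4], [520.0, 3.1e3], [540.0, 9.5e2], [560.0, 3.2e2]])

def time_to_explosion(params, T):
    # Toy single-step model: induction time ~ 1 / (Z * exp(-Ea / (R T)))
    lnZ, Ea = params
    return 1.0 / (np.exp(lnZ) * np.exp(-Ea / (R * T)))

def objective(params):
    # Relative least-squares misfit between model and "experiment"
    pred = time_to_explosion(params, data[:, 0])
    return np.sum(((pred - data[:, 1]) / data[:, 1]) ** 2)

result = differential_evolution(objective, bounds=[(10.0, 40.0), (1e5, 3e5)],
                                seed=0, tol=1e-8)
print("ln Z =", result.x[0], "Ea [J/mol] =", result.x[1])
```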
Guan, Fada; Johns, Jesse M; Vasudevan, Latha; Zhang, Guoqing; Tang, Xiaobin; Poston, John W; Braby, Leslie A
2015-06-01
Coincident counts can be observed in experimental radiation spectroscopy. Accurate quantification of the radiation source requires the detection efficiency of the spectrometer, which is often experimentally determined. However, Monte Carlo analysis can be used to supplement experimental approaches to determine the detection efficiency a priori. The traditional Monte Carlo method overestimates the detection efficiency as a result of omitting coincident counts caused mainly by multiple cascade source particles. In this study, a novel "multi-primary coincident counting" algorithm was developed using the Geant4 Monte Carlo simulation toolkit. A high-purity Germanium detector for ⁶⁰Co gamma-ray spectroscopy problems was accurately modeled to validate the developed algorithm. The simulated pulse height spectrum agreed well qualitatively with the measured spectrum obtained using the high-purity Germanium detector. The developed algorithm can be extended to other applications, with a particular emphasis on challenging radiation fields, such as counting multiple types of coincident radiations released from nuclear fission or used nuclear fuel.
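To make the coincidence effect concrete, a toy Monte Carlo (pure Python, not Geant4; the per-photon detection probability is an invented placeholder) showing how two cascade gammas from one decay can pile up into a single summed count:

```python
# Sketch: when a decay emits multiple cascade photons (e.g., 60Co's 1.17 and
# 1.33 MeV gammas), energy from both within one resolving time is recorded as
# a single summed pulse, which single-particle tallies miss.
import random

random.seed(0)
CASCADE = [1.173, 1.332]   # MeV, 60Co cascade gammas
EFF = 0.05                 # assumed full-energy detection probability/photon

def one_decay():
    """Return the pulse height recorded for one decay (0 if nothing detected)."""
    return sum(e for e in CASCADE if random.random() < EFF)

counts = [one_decay() for _ in range(200_000)]
sum_peak = sum(1 for c in counts if abs(c - 2.505) < 1e-6)
singles = sum(1 for c in counts if c > 0)
print(f"events with any deposit: {singles}, coincident sum-peak events: {sum_peak}")
```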
NASA Astrophysics Data System (ADS)
Ishihara, Koichi; Asai, Yusuke; Kudo, Riichi; Ichikawa, Takeo; Takatori, Yasushi; Mizoguchi, Masato
2013-12-01
Multiuser multiple-input multiple-output (MU-MIMO) has been proposed as a means to improve spectrum efficiency for various future wireless communication systems. This paper reports indoor experimental results obtained for a newly developed and implemented downlink (DL) MU-MIMO orthogonal frequency division multiplexing (OFDM) transceiver for gigabit wireless local area network systems in the microwave band. In the transceiver, the channel state information (CSI) is estimated at each user and fed back to an access point (AP) on a real-time basis. At the AP, the estimated CSI is used to calculate the transmit beamforming weight for DL MU-MIMO transmission. This paper also proposes a recursive inverse matrix computation scheme for computing the transmit weight in real time. Experiments with the developed transceiver demonstrate its feasibility in a number of indoor scenarios. The experimental results clarify that DL MU-MIMO-OFDM transmission can achieve a 972-Mbit/s transmission data rate with simple digital signal processing of single-antenna users in an indoor environment.
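The abstract does not spell out the recursive weight computation; as a generic illustration, a zero-forcing downlink precoder with a rank-one Sherman-Morrison refresh of the Gram-matrix inverse is one plausible way to avoid full re-inversion as CSI updates arrive (all dimensions and the update itself are assumptions, not the transceiver's actual scheme):

```python
# Sketch: zero-forcing MU-MIMO transmit weights, W = H^H (H H^H)^-1, with a
# Sherman-Morrison rank-one update of the inverse.
import numpy as np

rng = np.random.default_rng(0)
K, M = 4, 8                       # users (1 antenna each), AP antennas
H = (rng.normal(size=(K, M)) + 1j * rng.normal(size=(K, M))) / np.sqrt(2)

G = H @ H.conj().T                # K x K Gram matrix
G_inv = np.linalg.inv(G)
W = H.conj().T @ G_inv            # zero-forcing precoder, M x K

# If G changes by a rank-one term u v^H, refresh G_inv in O(K^2) via
# Sherman-Morrison instead of recomputing the full inverse.
u = 0.01 * rng.normal(size=(K, 1))
v = 0.01 * rng.normal(size=(K, 1))
num = G_inv @ u @ v.conj().T @ G_inv
den = 1.0 + (v.conj().T @ G_inv @ u).item()
G_inv_new = G_inv - num / den

# Check against direct inversion of the perturbed Gram matrix
assert np.allclose(G_inv_new, np.linalg.inv(G + u @ v.conj().T))
```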
Discrete tyre model application for evaluation of vehicle limit handling performance
NASA Astrophysics Data System (ADS)
Siramdasu, Y.; Taheri, S.
2016-11-01
The goal of this study is twofold, first, to understand the transient and nonlinear effects of anti-lock braking systems (ABS), road undulations and driving dynamics on lateral performance of tyre and second, to develop objective handling manoeuvres and respective metrics to characterise these effects on vehicle behaviour. For studying the transient and nonlinear handling performance of the vehicle, the variations of relaxation length of tyre and tyre inertial properties play significant roles [Pacejka HB. Tire and vehicle dynamics. 3rd ed. Butterworth-Heinemann; 2012]. To accurately simulate these nonlinear effects during high-frequency vehicle dynamic manoeuvres, requires a high-frequency dynamic tyre model (? Hz). A 6 DOF dynamic tyre model integrated with enveloping model is developed and validated using fixed axle high-speed oblique cleat experimental data. Commercially available vehicle dynamics software CarSim® is used for vehicle simulation. The vehicle model was validated by comparing simulation results with experimental sinusoidal steering tests. The validated tyre model is then integrated with vehicle model and a commercial grade rule-based ABS model to perform various objective simulations. Two test scenarios of ABS braking in turn on a smooth road and accelerating in a turn on uneven and smooth roads are considered. Both test cases reiterated that while the tyre is operating in the nonlinear region of slip or slip angle, any road disturbance or high-frequency brake torque input variations can excite the inertial belt vibrations of the tyre. It is shown that these inertial vibrations can directly affect the developed performance metrics and potentially degrade the handling performance of the vehicle.
Morse, Michael S.; Lu, Ning; Wayllace, Alexandra; Godt, Jonathan W.
2017-01-01
To experimentally validate a recently developed theory for predicting the stability of cut slopes under unsaturated conditions, the authors measured increasing strain localization in unsaturated slope cuts prior to abrupt failure. Cut slope width and moisture content were controlled and varied in a laboratory, and a sliding door that extended the height of the free face of the slope was lowered until the cut slope failed. A particle image velocimetry tool was used to quantify soil displacement in the x-y (horizontal) and x-z (vertical) planes, and strain was calculated from the displacement. Areas of maximum strain localization prior to failure were shown to coincide with the location of the eventual failure plane. Experimental failure heights agreed with the recently developed stability theory for unsaturated cut slopes (within 14.3% relative error) for a range of saturations and cut slope widths. A theoretical threshold for sidewall influence on cut slope failures was also proposed to quantify the relationship between normalized sidewall width and critical height. The proposed relationship was consistent with the cut slope experiment results, and is intended for consideration in future geotechnical experiment design. The experimental data on the evolution of strain localization presented herein provide a physical basis from which future numerical models of strain localization can be validated.
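To make the displacement-to-strain step concrete, a minimal sketch (synthetic displacement fields and grid spacing; not the authors' PIV pipeline) of computing small-strain components on a regular grid:

```python
# Sketch: small-strain components from PIV-derived displacement fields,
# as one might compute strain-localization maps.
import numpy as np

# u(x, z): horizontal displacement, w(x, z): vertical displacement (meters),
# sampled on a regular grid with spacing dx, dz (hypothetical data).
dx = dz = 1e-3
x = np.arange(0, 0.2, dx)
z = np.arange(0, 0.1, dz)
X, Z = np.meshgrid(x, z, indexing="ij")
u = 1e-4 * np.exp(-((X - 0.1) ** 2 + (Z - 0.05) ** 2) / 1e-3)
w = -0.5e-4 * np.exp(-((X - 0.1) ** 2) / 1e-3)

du_dx, du_dz = np.gradient(u, dx, dz)
dw_dx, dw_dz = np.gradient(w, dx, dz)

eps_xx = du_dx                       # normal strain in x
eps_zz = dw_dz                       # normal strain in z
eps_xz = 0.5 * (du_dz + dw_dx)       # tensorial shear strain
max_shear = np.sqrt(((eps_xx - eps_zz) / 2) ** 2 + eps_xz ** 2)
print("peak maximum shear strain:", max_shear.max())
```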
von Wilmowsky, Cornelius; Moest, Tobias; Nkenke, Emeka; Stelzle, Florian; Schlegel, Karl Andreas
2014-12-01
In order to determine whether a newly developed implant material conforms to the requirements of biocompatibility, it must undergo rigorous testing. To correctly interpret the results of studies on implant material osseointegration, it is necessary to have a sound understanding of all the testing methods. The aim of this overview is to elucidate the methods that are used for the experimental evaluation of the osseointegration of implant materials. In recent decades, there has been a constant proliferation of new materials and surface modifications in the field of dental implants. This continuous development of innovative biomaterials requires a precise and detailed evaluation in terms of biocompatibility and implant healing before clinical use. The current gold standard is in vivo animal testing on well validated animal models. However, long-term outcome studies on patients have to follow to finally validate and show patient benefit. No experimental set-up can provide answers for all possible research questions. However, a certain transferability of the results to humans might be possible if the experimental set-up is carefully chosen for the aspects and questions being investigated. To enhance the implant survival rate in the rising number of patients with chronic diseases which compromise wound healing and osseointegration, dental implant research on compromised animal models will further gain importance in future.
Prediction and Computation of Corrosion Rates of A36 Mild Steel in Oilfield Seawater
NASA Astrophysics Data System (ADS)
Paul, Subir; Mondal, Rajdeep
2018-04-01
The parameters that control the corrosion rate and life of steel structures are numerous, and they vary across different oceans and seawaters as well as with depth. While the effect of a single parameter on corrosion behavior is known, the conjoint effects of multiple parameters and the interrelationships among the variables are complex. Millions of experiments would be required to understand the mechanism of corrosion failure. Statistical modeling such as an artificial neural network (ANN) is one solution that can reduce the amount of experimentation. An ANN model was developed using 170 sets of experimental data for A36 mild steel in simulated seawater, varying the corrosion-influencing parameters SO₄²⁻, Cl⁻, HCO₃⁻, CO₃²⁻, CO₂, O₂, pH, and temperature as inputs and the corrosion current as output. About 60% of the experimental data were used to train the model, 20% for testing, and 20% for validation. The model was developed by programming in Matlab. 80% of the validation data predicted the corrosion rate correctly. Corrosion rates predicted by the ANN model are displayed in 3D graphics, which show many interesting conjoint effects of multiple variables that might suggest new ideas for mitigating corrosion by simply modifying the chemistry of the constituents. The model could also predict the corrosion rates of some real systems.
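A minimal sketch of such an ANN surrogate (synthetic data and an assumed scikit-learn MLP stand in for the paper's Matlab model; the 60/20/20 split mirrors the reported one):

```python
# Sketch: a small feed-forward ANN mapping water-chemistry inputs to a
# corrosion-current-like output. Architecture and data are illustrative.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
# Columns: SO4, Cl, HCO3, CO3, CO2, O2, pH, T -- synthetic placeholder data
X = rng.uniform(size=(170, 8))
y = 0.3 * X[:, 1] + 0.2 * X[:, 5] - 0.1 * X[:, 6] + 0.05 * rng.normal(size=170)

# ~60% train, 20% test, 20% validation
X_tr, X_rest, y_tr, y_rest = train_test_split(X, y, test_size=0.4, random_state=1)
X_te, X_val, y_te, y_val = train_test_split(X_rest, y_rest, test_size=0.5,
                                            random_state=1)

scaler = StandardScaler().fit(X_tr)
net = MLPRegressor(hidden_layer_sizes=(12,), max_iter=5000, random_state=1)
net.fit(scaler.transform(X_tr), y_tr)
print("test R^2:", net.score(scaler.transform(X_te), y_te))
print("validation R^2:", net.score(scaler.transform(X_val), y_val))
```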
Åsberg, Dennis; Chutkowski, Marcin; Leśko, Marek; Samuelsson, Jörgen; Kaczmarski, Krzysztof; Fornstedt, Torgny
2017-01-06
Large pressure gradients are generated in ultra-high-pressure liquid chromatography (UHPLC) using sub-2 μm particles, causing significant temperature gradients over the column due to viscous heating. These pressure and temperature gradients affect retention and ultimately result in important selectivity shifts. In this study, we developed an approach for predicting the retention time shifts due to these gradients. The approach is presented as a step-by-step procedure and is based on empirical linear relationships describing how retention varies as a function of temperature and pressure and how the average column temperature increases with the flow rate. It requires only four experiments on standard equipment, is based on straightforward calculations, and is therefore easy to use in method development. The approach was rigorously validated against experimental data obtained with a quality control method for the active pharmaceutical ingredient omeprazole. The accuracy of retention time predictions was very good, with relative errors always less than 1% and in many cases around 0.5% (n = 32). Selectivity shifts observed between omeprazole and the related impurities when changing the flow rate could also be accurately predicted, resulting in good estimates of the resolution between critical peak pairs. The approximations on which the presented approach is based were all justified. The retention factor as a function of pressure and temperature was studied in an experimental design, while the temperature distribution in the column was obtained by solving the fundamental heat and mass balance equations for the different experimental conditions. We strongly believe that this approach is sufficiently accurate and experimentally feasible for this separation to be a valuable tool when developing a UHPLC method. After further validation with other separation systems, it could become a useful approach in UHPLC method development, especially in the pharmaceutical industry, where demands are high for robustness and regulatory oversight.
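The flavor of such a prediction can be sketched in a few lines (all coefficients below are invented placeholders; in the paper the analogous values come from the four calibration experiments):

```python
# Sketch: empirical linear retention models in pressure and temperature, plus
# a linear rise of average column temperature with flow rate. Illustrative
# coefficients only -- not the paper's fitted values.
import math

def avg_column_temp(T_set_C, flow_ml_min, a=2.5):
    # Viscous heating: average temperature assumed to rise linearly with flow
    return T_set_C + a * flow_ml_min

def retention_factor(P_bar, T_C, k0=5.0, bP=4e-4, bT=-0.015):
    # ln k assumed linear in average pressure and average temperature
    return k0 * math.exp(bP * P_bar + bT * (T_C - 30.0))

def retention_time(flow_ml_min, P_bar, T_set_C, t0_min=0.5):
    T_avg = avg_column_temp(T_set_C, flow_ml_min)
    k = retention_factor(P_bar, T_avg)
    t0 = t0_min / flow_ml_min          # dead time scales inversely with flow
    return t0 * (1.0 + k)              # t_R = t0 (1 + k)

print(retention_time(flow_ml_min=1.0, P_bar=900.0, T_set_C=30.0))
```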
ERIC Educational Resources Information Center
Hoppe, Carl F.; Loevinger, Jane
1977-01-01
Self and peer evaluations and experimental measures of conformity were given to 107 adolescent private school boys. Student discipline records also indicated the number of demerits. The self-report measures and the demerits showed maximum conformity between the self-protective and conscientious ego stages as measured by the Sentence Completion…
Development and Experimental Validation of a Thermoelectric Test Bench for Laboratory Lessons
ERIC Educational Resources Information Center
Rodríguez García, Antonio; Astrain Ulibarrena, David; Martínez Echeverri, Álvaro; Aranguren Garacochea, Patricia; Pérez Artieda, Gurutze
2013-01-01
The refrigeration process reduces the temperature of a space or a given volume, while the power generation process employs a source of thermal energy to generate electrical power. Because of the importance of these two processes, training engineers in this area is of great interest. Engineering courses normally study the vapor…
Similarity between turbulent kinetic energy and temperature spectra in the near-wall region
NASA Technical Reports Server (NTRS)
Antonia, R. A.; Kim, J.
1991-01-01
The similarity between turbulent kinetic energy and temperature spectra, previously confirmed using experimental data in various turbulent shear flows, is validated in the near-wall region using direct numerical simulation data in a fully developed turbulent channel flow. The dependence of this similarity on the molecular Prandtl number is also examined.
The Dimensionality of Inference Making: Are Local and Global Inferences Distinguishable?
ERIC Educational Resources Information Center
Muijselaar, Marloes M. L.
2018-01-01
We investigated the dimensionality of inference making in samples of 4- to 9-year-olds (Ns = 416-783) to determine if local and global coherence inferences could be distinguished. In addition, we examined the validity of our experimenter-developed inference measure by comparing with three additional measures of listening comprehension. Multitrait,…
ERIC Educational Resources Information Center
Kim, Jeonghyun; Jo, Il-Hyun; Park, Yeonjeong
2016-01-01
The learning analytics dashboard (LAD) is a newly developed learning support tool for virtual classrooms that is believed to allow students to review their online learning behavior patterns intuitively through the provision of visual information. The purpose of this study was to empirically validate the effects of LAD. An experimental study was…
DOE Office of Scientific and Technical Information (OSTI.GOV)
English, Shawn A.; Briggs, Timothy M.; Nelson, Stacy M.
Simulations of low velocity impact with a flat cylindrical indenter upon a carbon fiber fabric reinforced polymer laminate are rigorously validated. Comparison of the impact energy absorption between the model and experiment is used as the validation metric. Additionally, non-destructive evaluation, including ultrasonic scans and three-dimensional computed tomography, provides qualitative validation of the models. The simulations include delamination, matrix cracks, and fiber breaks. An orthotropic damage and failure constitutive model, capable of predicting progressive damage and failure, is developed in conjunction with the simulations and described. An ensemble of simulations incorporating model parameter uncertainties is used to predict a response distribution, which is then compared to experimental output using appropriate statistical methods. Lastly, the model form errors are exposed and corrected for use in an additional blind validation analysis. The result is a quantifiable confidence in material characterization and model physics when simulating low velocity impact in structures of interest.
Validation of the Soil Moisture Active Passive mission using USDA-ARS experimental watersheds
USDA-ARS?s Scientific Manuscript database
The calibration and validation program of the Soil Moisture Active Passive mission (SMAP) relies upon an international cooperative of in situ networks to provide ground truth references across a variety of landscapes. The USDA Agricultural Research Service operates several experimental watersheds wh...
Oliva, Alexis; Monzón, Cecilia; Santoveña, Ana; Fariña, José B; Llabrés, Matías
2016-07-01
An ultra-high-performance liquid chromatography method was developed and validated for the quantitation of triamcinolone acetonide in an injectable ophthalmic hydrogel, in order to determine the contribution of analytical method error to the content uniformity measurement. During the development phase, a design of experiments/design space strategy was used. For this, the free R software was used as an alternative to commercial packages and proved a fast, efficient tool for data analysis. The process capability index was used to find the permitted level of variation for each factor and to define the design space. All these aspects were analyzed and discussed under different experimental conditions using the Monte Carlo simulation method. Second, a pre-study validation procedure was performed in accordance with the International Conference on Harmonisation guidelines. The validated method was applied to the determination of uniformity of dosage units, and the sources of variability (inhomogeneity and analytical method error) were analyzed based on the overall uncertainty.
NASA Technical Reports Server (NTRS)
Lindensmith, Chris A.; Briggs, H. Clark; Beregovski, Yuri; Feria, V. Alfonso; Goullioud, Renaud; Gursel, Yekta; Hahn, Inseob; Kinsella, Gary; Orzewalla, Matthew; Phillips, Charles
2006-01-01
SIM PlanetQuest (SIM) is a large optical interferometer for making microarcsecond measurements of the positions of stars and for detecting Earth-sized planets around nearby stars. To achieve this precision, SIM requires stability of optical components to tens of picometers per hour. The combination of SIM's large size (9 meter baseline) and the high stability requirement makes it difficult and costly to measure all aspects of system performance on the ground. To reduce risks and costs, and to allow for a design with fewer intermediate testing stages, the SIM project is developing an integrated thermal, mechanical, and optical modeling process that will allow predictions of the system performance to be made at the required high precision. This modeling process uses commercial, off-the-shelf tools and has been validated against experimental results at the precision of the SIM performance requirements. This paper presents a description of the model development, some of the models, and their validation in the Thermo-Opto-Mechanical (TOM3) testbed, which includes full-scale brassboard optical components and the metrology to test them at the SIM performance requirement levels.
Transport property correlations for the niobium-1% zirconium alloy
NASA Astrophysics Data System (ADS)
Senor, David J.; Thomas, J. Kelly; Peddicord, K. L.
1990-10-01
Correlations were developed for the electrical resistivity (ρ), thermal conductivity (k), and hemispherical total emittance (ε) of niobium-1% zirconium as functions of temperature. All three correlations were developed as empirical fits to experimental data:

ρ = 5.571 + 4.160×10⁻² T − 4.192×10⁻⁶ T²  [μΩ·cm]
k = 13.16 T^0.2149  [W/(m·K)]
ε = 6.39×10⁻² + 4.98×10⁻⁵ T + 3.62×10⁻⁸ T² − 7.28×10⁻¹² T³

The relative standard deviation of the electrical resistivity correlation is 1.72%, and it is valid over the temperature range 273 to 2700 K. The thermal conductivity correlation has a relative standard deviation of 3.24% and is valid over the temperature range 379 to 1421 K. The hemispherical total emittance correlation was developed for smooth-surface materials only and represents a conservative estimate of the emittance of the alloy for space reactor fuel element modeling applications. It has a relative standard deviation of 9.50% and is valid over the temperature range 755 to 2670 K.
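The correlations translate directly into code; a small sketch with the stated validity ranges enforced (temperatures in kelvin):

```python
# Direct transcription of the Nb-1%Zr correlations above.
def resistivity_uohm_cm(T):
    """Electrical resistivity in micro-ohm cm, valid 273-2700 K."""
    if not 273.0 <= T <= 2700.0:
        raise ValueError("resistivity correlation valid only for 273-2700 K")
    return 5.571 + 4.160e-2 * T - 4.192e-6 * T**2

def thermal_conductivity_w_mk(T):
    """Thermal conductivity in W/(m K), valid 379-1421 K."""
    if not 379.0 <= T <= 1421.0:
        raise ValueError("conductivity correlation valid only for 379-1421 K")
    return 13.16 * T**0.2149

def emittance(T):
    """Hemispherical total emittance (smooth surface), valid 755-2670 K."""
    if not 755.0 <= T <= 2670.0:
        raise ValueError("emittance correlation valid only for 755-2670 K")
    return 6.39e-2 + 4.98e-5 * T + 3.62e-8 * T**2 - 7.28e-12 * T**3

print(resistivity_uohm_cm(1000.0), thermal_conductivity_w_mk(1000.0),
      emittance(1000.0))
```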
Variable camber wing based on pneumatic artificial muscles
NASA Astrophysics Data System (ADS)
Yin, Weilong; Liu, Libo; Chen, Yijin; Leng, Jinsong
2009-07-01
As a novel bionic actuator, the pneumatic artificial muscle has a high power-to-weight ratio. In this paper, a variable camber wing actuated by pneumatic artificial muscles is developed. First, an experimental setup to measure the static output force of a pneumatic artificial muscle is designed, and the relationship between the static output force and the air pressure is investigated. The experimental results show that the static output force of a pneumatic artificial muscle decreases nonlinearly with increasing contraction ratio. Second, a finite element model of the variable camber wing is developed. Numerical results show that the tip displacement of the trailing edge increases linearly with increasing external load and is limited by the maximum static output force of the pneumatic artificial muscles. Finally, the variable camber wing model is manufactured to validate the variable camber concept. The experimental results show that the wing camber increases with increasing air pressure and compares very well with the FEM results.
NASA Astrophysics Data System (ADS)
Bevilacqua, R.; Lehmann, T.; Romano, M.
2011-04-01
This work introduces a novel control algorithm for close proximity multiple spacecraft autonomous maneuvers, based on hybrid linear quadratic regulator/artificial potential function (LQR/APF), for applications including autonomous docking, on-orbit assembly and spacecraft servicing. Both theoretical developments and experimental validation of the proposed approach are presented. Fuel consumption is sub-optimized in real-time through re-computation of the LQR at each sample time, while performing collision avoidance through the APF and a high level decisional logic. The underlying LQR/APF controller is integrated with a customized wall-following technique and a decisional logic, overcoming problems such as local minima. The algorithm is experimentally tested on a four spacecraft simulators test bed at the Spacecraft Robotics Laboratory of the Naval Postgraduate School. The metrics to evaluate the control algorithm are: autonomy of the system in making decisions, successful completion of the maneuver, required time, and propellant consumption.
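A minimal planar sketch of the hybrid LQR/APF idea (double-integrator relative dynamics, invented weights and obstacle geometry; the paper's wall-following logic, decisional logic, and real-time LQR re-computation are not reproduced):

```python
# Sketch: LQR tracking command blended with an artificial-potential-function
# repulsion near an obstacle, for a planar double integrator.
import numpy as np
from scipy.linalg import solve_continuous_are

# State [x, y, vx, vy], control [ax, ay]
A = np.block([[np.zeros((2, 2)), np.eye(2)],
              [np.zeros((2, 2)), np.zeros((2, 2))]])
B = np.vstack([np.zeros((2, 2)), np.eye(2)])
Q = np.diag([10.0, 10.0, 1.0, 1.0])
R = 0.1 * np.eye(2)
P = solve_continuous_are(A, B, Q, R)
K = np.linalg.solve(R, B.T @ P)     # LQR gain (recomputed online in the paper)

def apf_repulsion(pos, obstacle, k_rep=0.05, rho0=1.0):
    """Gradient-based repulsive acceleration inside influence radius rho0."""
    d = pos - obstacle
    rho = np.linalg.norm(d)
    if rho >= rho0 or rho == 0.0:
        return np.zeros(2)
    return k_rep * (1.0 / rho - 1.0 / rho0) * d / rho**3

def control(state, target, obstacle):
    u_lqr = -K @ (state - target)            # drive toward docking target
    u_apf = apf_repulsion(state[:2], obstacle)
    return u_lqr + u_apf                     # collision-avoiding command

state = np.array([5.0, 2.0, 0.0, 0.0])
target = np.array([0.0, 0.0, 0.0, 0.0])
print(control(state, target, obstacle=np.array([2.5, 1.0])))
```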
NASA Astrophysics Data System (ADS)
Mandelis, Andreas; Guo, Xinxin
2011-10-01
A differential photothermal radiometry method, wavelength-modulated differential photothermal radiometry (WM-DPTR), has been developed theoretically and experimentally for noninvasive, noncontact biological analyte detection, such as blood glucose monitoring. WM-DPTR achieves analyte specificity and sensitivity by combining laser excitation by two out-of-phase modulated beams at wavelengths near the peak and the baseline of a prominent and isolated mid-IR analyte absorption band (here the carbon-oxygen-carbon bond in the pyran ring of the glucose molecule). A theoretical photothermal model of WM-DPTR signal generation and detection has been developed. Simulation results for water-glucose phantoms with glucose concentrations in the human blood range (0-300 mg/dl) demonstrated sensitivity and resolution high enough to meet wide clinical detection requirements. The model has also been validated against experimental data for the glucose-water system obtained using WM-DPTR.
Fischer, Kenneth J; Johnson, Joshua E; Waller, Alexander J; McIff, Terence E; Toby, E Bruce; Bilgen, Mehmet
2011-10-01
The objective of this study was to validate the MRI-based joint contact modeling methodology in the radiocarpal joints by comparison of model results with invasive specimen-specific radiocarpal contact measurements from four cadaver experiments. We used a single validation criterion for multiple outcome measures to characterize the utility and overall validity of the modeling approach. For each experiment, a Pressurex film and a Tekscan sensor were sequentially placed into the radiocarpal joints during simulated grasp. Computer models were constructed based on MRI visualization of the cadaver specimens without load. Images were also acquired during the loaded configuration used with the direct experimental measurements. Geometric surface models of the radius, scaphoid and lunate (including cartilage) were constructed from the images acquired without the load. The carpal bone motions from the unloaded state to the loaded state were determined using a series of 3D image registrations. Cartilage thickness was assumed uniform at 1.0 mm with an effective compressive modulus of 4 MPa. Validation was based on experimental versus model contact area, contact force, average contact pressure and peak contact pressure for the radioscaphoid and radiolunate articulations. Contact area was also measured directly from images acquired under load and compared to the experimental and model data. Qualitatively, there was good correspondence between the MRI-based model data and experimental data, with consistent relative size, shape and location of radioscaphoid and radiolunate contact regions. Quantitative data from the model generally compared well with the experimental data for all specimens. Contact area from the MRI-based model was very similar to the contact area measured directly from the images. For all outcome measures except average and peak pressures, at least two specimen models met the validation criteria with respect to experimental measurements for both articulations. Only the model for one specimen met the validation criteria for average and peak pressure of both articulations; however the experimental measures for peak pressure also exhibited high variability. MRI-based modeling can reliably be used for evaluating the contact area and contact force with similar confidence as in currently available experimental techniques. Average contact pressure, and peak contact pressure were more variable from all measurement techniques, and these measures from MRI-based modeling should be used with some caution.
Geist, Rebecca E; DuBois, Chase H; Nichols, Timothy C; Caughey, Melissa C; Merricks, Elizabeth P; Raymer, Robin; Gallippi, Caterina M
2016-09-01
Acoustic radiation force impulse (ARFI) Surveillance of Subcutaneous Hemorrhage (ASSH) has been previously demonstrated to differentiate bleeding phenotype and responses to therapy in dogs and humans, but to date, the method has lacked experimental validation. This work explores experimental validation of ASSH in a poroelastic tissue-mimic and in vivo in dogs. The experimental design exploits calibrated flow rates and infusion durations of evaporated milk in tofu or heparinized autologous blood in dogs. The validation approach enables controlled comparisons of ASSH-derived bleeding rate (BR) and time to hemostasis (TTH) metrics. In tissue-mimicking experiments, halving the calibrated flow rate yielded ASSH-derived BRs that decreased by 44% to 48%. Furthermore, for calibrated flow durations of 5.0 minutes and 7.0 minutes, average ASSH-derived TTH was 5.2 minutes and 7.0 minutes, respectively, with ASSH predicting the correct TTH in 78% of trials. In dogs undergoing calibrated autologous blood infusion, ASSH measured a 3-minute increase in TTH, corresponding to the same increase in the calibrated flow duration. For a measured 5% decrease in autologous infusion flow rate, ASSH detected a 7% decrease in BR. These tissue-mimicking and in vivo preclinical experimental validation studies suggest the ASSH BR and TTH measures reflect bleeding dynamics.
Baldo, Matías N; Angeli, Emmanuel; Gareis, Natalia C; Hunzicker, Gabriel A; Murguía, Marcelo C; Ortega, Hugo H; Hein, Gustavo J
2018-04-01
A relative bioavailability (RBA) study of two phenytoin (PHT) formulations was conducted in rabbits in order to compare the results obtained from different matrices (plasma and blood from dried blood spot (DBS) sampling) and different experimental designs (classic and block). The method was developed by liquid chromatography tandem mass spectrometry (LC-MS/MS) in plasma and blood samples. The different sample preparation techniques, plasma protein precipitation and DBS, were validated according to international requirements. The analytical method was validated with ranges of 0.20-50.80 and 0.12-20.32 µg ml⁻¹ (r > 0.999) for plasma and blood, respectively. Accuracy and precision were within the acceptance criteria for bioanalytical assay validation (<15 for bias and CV% and <20 at the limit of quantification (LOQ)). PHT showed long-term stability, both in plasma and blood, under refrigerated and room temperature conditions. Haematocrit values were measured during the validation process and the RBA study. Finally, the pharmacokinetic parameters (Cmax, Tmax and AUC0-t) obtained from the RBA study were tested. Results were highly comparable across matrices and experimental designs. A matrix correlation higher than 0.975 and a ratio of (PHT blood) = 1.158 × (PHT plasma) were obtained. The results obtained herein show that the use of a classic experimental design and DBS sampling for animal pharmacokinetic studies should be encouraged, as they could help reduce the number of animals used and avoid animal euthanasia. Finally, the combination of DBS sampling with LC-MS/MS technology proved to be an excellent tool not only for therapeutic drug monitoring but also for RBA studies.
Development of a Scale-up Tool for Pervaporation Processes
Thiess, Holger; Strube, Jochen
2018-01-01
In this study, an engineering tool for the design and optimization of pervaporation processes is developed based on physico-chemical modelling coupled with laboratory/mini-plant experiments. The model incorporates the solution-diffusion mechanism, polarization effects (concentration and temperature), axial dispersion, pressure drop, and the temperature drop in the feed channel due to vaporization of the permeating components. The permeance, the key model parameter, was determined via dehydration experiments at mini-plant scale for the binary mixtures ethanol/water and ethyl acetate/water. A second set of experimental data was utilized to validate the model for the two chemical systems. The industrially relevant ternary mixture ethanol/ethyl acetate/water was investigated close to its azeotropic point and compared to a simulation conducted with the determined binary permeance data. Experimental and simulation data agreed very well for the investigated process conditions. To test the scalability of the developed engineering tool, large-scale data from an industrial pervaporation plant used for the dehydration of ethanol were compared to a process simulation conducted with the validated physico-chemical model. Since the membranes employed at both mini-plant and industrial scale were of the same type, the permeance data could be transferred. The comparison of the measured and simulated data proved the scalability of the derived model. PMID:29342956
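The heart of such a model is the solution-diffusion flux expression; a minimal sketch (placeholder property values, not the paper's fitted permeances):

```python
# Sketch: permeance-based solution-diffusion flux for pervaporation,
# J_i = Q_i (x_i * gamma_i * p_sat_i - y_i * p_perm).
def pv_flux(Q, x, gamma, p_sat, y, p_perm):
    """Partial flux of one component through the membrane.

    Q      permeance [mol/(m2 s Pa)]
    x, y   feed liquid / permeate vapor mole fractions
    gamma  feed-side activity coefficient
    p_sat  pure-component saturation pressure [Pa]
    p_perm total permeate pressure [Pa]
    """
    return Q * (x * gamma * p_sat - y * p_perm)

# Water through a hydrophilic membrane, dehydrating ethanol (illustrative)
J_water = pv_flux(Q=3e-7, x=0.10, gamma=2.3, p_sat=12_300.0, y=0.95,
                  p_perm=1_000.0)
print(f"water flux: {J_water:.3e} mol/(m2 s)")
```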
Xiang, Junfeng; Xie, Lijing; Gao, Feinong; Zhang, Yu; Yi, Jie; Wang, Tao; Pang, Siqin; Wang, Xibin
2018-01-01
Discrepancies in capturing the behavior of some materials, such as particulate-reinforced metal matrix composites, with the conventional ad hoc fitting strategy challenge the applicability of the Johnson-Cook constitutive model. Despite efforts in this direction, extended formalisms with more fitting parameters increase the difficulty of identifying constitutive parameters. A weighted multi-objective strategy for identifying any constitutive formalism is developed to predict mechanical behavior under static and dynamic loading conditions equally well. The varying weights are based on a Gaussian-distributed noise evaluation of the experimentally obtained stress-strain data in quasi-static or dynamic mode. This general method can be used to determine quickly and directly whether a constitutive formalism is suitable for describing the material's constitutive behavior by measuring goodness of fit. A quantitative comparison of different fitting strategies for identifying the material parameters of Al6063/SiCp is made in terms of performance evaluation, including noise elimination, correlation, and reliability. Finally, a three-dimensional (3D) FE model of small-hole drilling of Al6063/SiCp composites, using the multi-objective-identified constitutive formalism, is developed. Comparison with the experimental observations of thrust force, torque, and chip morphology provides valid evidence of the applicability of the developed multi-objective identification strategy for identifying constitutive parameters. PMID:29324688
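A minimal sketch of the weighting idea applied to a Johnson-Cook fit (synthetic quasi-static and dynamic data with inverse-noise weights; the paper's Gaussian noise evaluation and exact formalism are not reproduced):

```python
# Sketch: weighted multi-objective fit of Johnson-Cook parameters
# sigma = (A + B*eps^n) * (1 + C*ln(rate/rate0)) * (1 - T*^m)
# to pooled quasi-static and dynamic stress-strain data.
import numpy as np
from scipy.optimize import least_squares

rate0 = 1e-3  # reference strain rate, 1/s (assumed)

def jc_stress(p, eps, rate, T_star):
    A, B, n, C, m = p
    return (A + B * eps**n) * (1.0 + C * np.log(rate / rate0)) * (1.0 - T_star**m)

# Synthetic "experiments": quasi-static (1e-3 1/s) and dynamic (1e3 1/s)
eps = np.linspace(0.02, 0.3, 15)
true_p = [200, 300, 0.35, 0.02, 1.1]
quasi = jc_stress(true_p, eps, 1e-3, 0.1) + np.random.default_rng(0).normal(0, 2, 15)
dyn = jc_stress(true_p, eps, 1e3, 0.1) + np.random.default_rng(1).normal(0, 8, 15)

w_quasi, w_dyn = 1.0 / 2.0, 1.0 / 8.0   # inverse-noise weights

def residuals(p):
    r1 = w_quasi * (jc_stress(p, eps, 1e-3, 0.1) - quasi)
    r2 = w_dyn * (jc_stress(p, eps, 1e3, 0.1) - dyn)
    return np.concatenate([r1, r2])

fit = least_squares(residuals, x0=[150, 250, 0.3, 0.01, 1.0],
                    bounds=([0, 0, 0, 0, 0.5], [500, 800, 1, 0.1, 2]))
print("A, B, n, C, m =", fit.x)
```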
Synchrotron characterization of nanograined UO₂ grain growth
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mo, Kun; Miao, Yinbin; Yun, Di
2015-09-30
This activity is supported by the US Nuclear Energy Advanced Modeling and Simulation (NEAMS) Fuels Product Line (FPL) and aims at providing experimental data for the validation of the mesoscale simulation code MARMOT. MARMOT is a mesoscale multiphysics code that predicts the coevolution of microstructure and properties within reactor fuel during its lifetime in the reactor. It is an important component of the Moose-Bison-Marmot (MBM) code suite that has been developed by Idaho National Laboratory (INL) to enable next generation fuel performance modeling capability as part of the NEAMS Program FPL. In order to ensure the accuracy of the microstructure-based materials models being developed within the MARMOT code, extensive validation efforts must be carried out. In this report, we summarize our preliminary synchrotron radiation experiments at APS to determine the grain size of nanograin UO₂. The methodology and experimental setup developed in this experiment can directly apply to the proposed in-situ grain growth measurements. The investigation of the grain growth kinetics was conducted based on isothermal annealing and grain growth characterization as functions of duration and temperature. The kinetic parameters, such as the activation energy for grain growth, for UO₂ with different stoichiometry are obtained and compared with molecular dynamics (MD) simulations.
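For reference, the isothermal grain growth kinetics investigated in such annealing studies are commonly parameterized by a standard law (shown here as a generic form; the growth exponent n, prefactor, and activation energy are fit to the measured data):

$$ d^{\,n} - d_0^{\,n} = k_0\,\exp\!\left(-\frac{Q}{RT}\right) t, $$

where d is the grain size after annealing time t at temperature T, d₀ is the initial grain size, Q is the activation energy for grain growth, and R is the gas constant.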
Supplying materials needed for grain growth characterizations of nano-grained UO₂
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mo, Kun; Miao, Yinbin; Yun, Di
2015-09-30
This activity is supported by the US Nuclear Energy Advanced Modeling and Simulation (NEAMS) Fuels Product Line (FPL) and aims at providing experimental data for the validation of the mesoscale simulation code MARMOT. MARMOT is a mesoscale multiphysics code that predicts the coevolution of microstructure and properties within reactor fuel during its lifetime in the reactor. It is an important component of the Moose-Bison-Marmot (MBM) code suite that has been developed by Idaho National Laboratory (INL) to enable next generation fuel performance modeling capability as part of the NEAMS Program FPL. In order to ensure the accuracy of the microstructure-based materials models being developed within the MARMOT code, extensive validation efforts must be carried out. In this report, we summarize our preliminary synchrotron radiation experiments at APS to determine the grain size of nanograin UO₂. The methodology and experimental setup developed in this experiment can directly apply to the proposed in-situ grain growth measurements. The investigation of the grain growth kinetics was conducted based on isothermal annealing and grain growth characterization as functions of duration and temperature. The kinetic parameters, such as the activation energy for grain growth, for UO₂ with different stoichiometry are obtained and compared with molecular dynamics (MD) simulations.
Further Validation of a CFD Code for Calculating the Performance of Two-Stage Light Gas Guns
NASA Technical Reports Server (NTRS)
Bogdanoff, David W.
2017-01-01
Earlier validations of a higher-order Godunov code for modeling the performance of two-stage light gas guns are reviewed. These validation comparisons were made between code predictions and experimental data from the NASA Ames 1.5" and 0.28" guns and covered muzzle velocities of 6.5 to 7.2 km/s. In the present report, five more series of code validation comparisons, involving experimental data from the Ames 0.22" (1.28" pump tube diameter), 0.28", 0.50", 1.00", and 1.50" guns, are presented. The total muzzle velocity range of the validation data presented herein is 3 to 11.3 km/s. The agreement between the experimental data and CFD results is judged to be very good. Muzzle velocities were predicted within 0.35 km/s for 74% of the cases studied; the maximum differences were 0.5 km/s, except for 4 out of 50 cases at 0.5-0.7 km/s.
Huntington Disease: Linking Pathogenesis to the Development of Experimental Therapeutics.
Mestre, Tiago A; Sampaio, Cristina
2017-02-01
Huntington disease (HD) is an autosomal dominant neurodegenerative condition caused by a CAG trinucleotide expansion in the huntingtin gene. At present, the HD field is experiencing exciting times, with interventions aimed at core disease mechanisms being assessed in human subjects for the first time. Out of a portfolio of interventions that claim a potential disease-modifying effect in HD, the target huntingtin has the most robust validation. In this review, we discuss the spectrum of huntingtin-lowering therapies that are currently being considered. We provide a critical appraisal of the validation of huntingtin as a drug target, describing the advantages, challenges, and limitations of the proposed therapeutic interventions. The development of these new therapies relies strongly on knowledge of HD pathogenesis and the ability to translate this knowledge into validated pharmacodynamic biomarkers. Altogether, the goal is to support rational drug development that is ethical and cost-effective. Among the pharmacodynamic biomarkers under development, the quantification of mutant huntingtin in the cerebrospinal fluid and PET imaging targeting huntingtin or phosphodiesterase 10A deserve special attention. Huntingtin-lowering therapeutics are eagerly awaited as the first interventions that may be able to change the course of HD in a meaningful way.
Dynamic Simulation of Human Gait Model With Predictive Capability.
Sun, Jinming; Wu, Shaoli; Voglewede, Philip A
2018-03-01
In this paper, it is proposed that the central nervous system (CNS) controls human gait using a predictive control approach in conjunction with classical feedback control, rather than exclusively through classical feedback control, which acts only on past error. To validate this proposition, a dynamic model of human gait is developed using a novel predictive approach to investigate the principles of the CNS. The model comprises two parts: a plant model that represents the dynamics of human gait and a controller that represents the CNS. The plant model is a seven-segment, six-joint model with nine degrees-of-freedom (DOF), validated using data collected from able-bodied human subjects. The proposed controller utilizes model predictive control (MPC). MPC uses an internal model to predict the output in advance, compares the predicted output to the reference, and optimizes the control input so that the predicted error is minimized. To decrease the complexity of the model, two joints are controlled using a proportional-derivative (PD) controller. The developed predictive human gait model is validated by simulating able-bodied human gait; the simulation results show that the model reproduces kinematic outputs close to the experimental data.
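To make the predict-compare-optimize loop of MPC concrete, here is a minimal receding-horizon sketch for a double-integrator plant. It is a generic illustration of the technique, not the authors' seven-segment gait model; the dynamics, horizon, and weights are all assumed for illustration.

```python
import numpy as np

# Minimal receding-horizon MPC for a double integrator: at each step,
# predict outputs over a horizon, compare to the reference, and choose
# the input sequence minimizing predicted error, applying only u[0].
dt = 0.01
A = np.array([[1.0, dt], [0.0, 1.0]])   # state: [position, velocity]
B = np.array([[0.0], [dt]])
C = np.array([[1.0, 0.0]])              # track position only

N = 20      # prediction horizon
lam = 1e-6  # control-effort penalty (also regularizes the solve)

def mpc_step(x, ref):
    """Return the first input of the optimal sequence over the horizon."""
    # Prediction matrices: y = F x + G u over the horizon.
    F = np.vstack([C @ np.linalg.matrix_power(A, i + 1) for i in range(N)])
    G = np.zeros((N, N))
    for i in range(N):
        for j in range(i + 1):
            G[i, j] = (C @ np.linalg.matrix_power(A, i - j) @ B)[0, 0]
    # Regularized least squares for the input sequence.
    H = G.T @ G + lam * np.eye(N)
    u = np.linalg.solve(H, G.T @ (ref - F @ x))
    return u[0]  # receding horizon: apply only the first input

x = np.array([[0.0], [0.0]])
ref = np.ones((N, 1)) * 0.3  # step reference, e.g. a joint position
for _ in range(200):
    x = A @ x + B * mpc_step(x, ref)
print(f"final position: {x[0, 0]:.3f} (target 0.3)")
```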
Szerkus, Oliwia; Struck-Lewicka, Wiktoria; Kordalewska, Marta; Bartosińska, Ewa; Bujak, Renata; Borsuk, Agnieszka; Bienert, Agnieszka; Bartkowska-Śniatkowska, Alicja; Warzybok, Justyna; Wiczling, Paweł; Nasal, Antoni; Kaliszan, Roman; Markuszewski, Michał Jan; Siluk, Danuta
2017-02-01
The purpose of this work was to develop and validate a rapid and robust LC-MS/MS method for the determination of dexmedetomidine (DEX) in plasma, suitable for analysis of a large number of samples. A systematic Design of Experiments approach was applied to optimize ESI source parameters and to evaluate method robustness, yielding a rapid, stable, and cost-effective assay. The method was validated according to US FDA guidelines. The LLOQ was determined at 5 pg/ml, and the assay was linear over the examined concentration range of 5-2500 pg/ml (R² > 0.98). The accuracies and intra- and interday precisions were within 15%. The stability data confirmed reliable behavior of DEX under the tested conditions. The Design of Experiments approach allowed fast and efficient analytical method development and validation, as well as reduced usage of the chemicals otherwise required for method optimization. The proposed technique was applied to the determination of DEX pharmacokinetics in pediatric patients undergoing long-term sedation in the intensive care unit.
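The linearity criterion reported above (R² > 0.98 over 5-2500 pg/ml) amounts to an ordinary least-squares calibration check; a minimal sketch, with invented peak-area responses:

```python
import numpy as np

# Fit a calibration line over the validated range and compute R^2.
# Detector responses below are invented for illustration.
conc = np.array([5, 25, 100, 500, 1000, 2500], dtype=float)  # pg/ml
area = np.array([0.012, 0.061, 0.244, 1.21, 2.44, 6.08])     # peak area

slope, intercept = np.polyfit(conc, area, 1)
pred = slope * conc + intercept
ss_res = np.sum((area - pred) ** 2)
ss_tot = np.sum((area - area.mean()) ** 2)
r2 = 1.0 - ss_res / ss_tot
print(f"R^2 = {r2:.4f}  (acceptance: R^2 > 0.98)")

# Back-calculated accuracy at each level (acceptance: within 15%).
back = (area - intercept) / slope
print(np.round(100.0 * back / conc, 1))
```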
Predictive QSAR modeling workflow, model applicability domains, and virtual screening.
Tropsha, Alexander; Golbraikh, Alexander
2007-01-01
Quantitative Structure Activity Relationship (QSAR) modeling has traditionally been applied as an evaluative approach, i.e., with the focus on developing retrospective and explanatory models of existing data. Model extrapolation was considered, if at all, only in a hypothetical sense, in terms of potential modifications of known biologically active chemicals that could improve compounds' activity. This critical review re-examines the strategy and the output of modern QSAR modeling approaches. We provide examples and arguments suggesting that current methodologies may afford robust and validated models capable of accurate prediction of compound properties for molecules not included in the training sets. We discuss a data-analytical modeling workflow developed in our laboratory that incorporates modules for combinatorial QSAR model development (i.e., using all possible binary combinations of available descriptor sets and statistical data modeling techniques), rigorous model validation, and virtual screening of available chemical databases to identify novel biologically active compounds. Our approach places particular emphasis on model validation as well as the need to define model applicability domains in chemistry space. We present examples of studies where the application of rigorously validated QSAR models to virtual screening identified computational hits that were confirmed by subsequent experimental investigations. The emerging focus of QSAR modeling on target property forecasting brings it forward as a predictive, as opposed to evaluative, modeling approach.
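One common way to implement the applicability domain idea emphasized above is a k-nearest-neighbor distance test against the training set; the sketch below is a generic illustration with invented descriptors, not the authors' specific AD definition.

```python
import numpy as np

# A query compound is inside the applicability domain (AD) if its mean
# distance to the k nearest training compounds does not exceed a
# threshold derived from the training set itself.
rng = np.random.default_rng(0)
X_train = rng.normal(size=(200, 10))  # 200 compounds, 10 descriptors

def knn_dist(x, X, k=5):
    """Mean Euclidean distance from x to its k nearest rows of X."""
    d = np.sort(np.linalg.norm(X - x, axis=1))
    return d[:k].mean()

# Threshold: mean + 1 std of the training compounds' own kNN distances
# (skipping index 0, the zero self-distance).
d_train = np.array([np.sort(np.linalg.norm(X_train - x, axis=1))[1:6].mean()
                    for x in X_train])
threshold = d_train.mean() + d_train.std()

query = rng.normal(size=10)
print("query inside applicability domain:",
      knn_dist(query, X_train) <= threshold)
```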
Hyper-X Engine Design and Ground Test Program
NASA Technical Reports Server (NTRS)
Voland, R. T.; Rock, K. E.; Huebner, L. D.; Witte, D. W.; Fischer, K. E.; McClinton, C. R.
1998-01-01
The Hyper-X Program, NASA's focused hypersonic technology program jointly run by NASA Langley and Dryden, is designed to move hypersonic, air-breathing vehicle technology from the laboratory environment to the flight environment, the last stage preceding prototype development. The Hyper-X research vehicle will provide the first-ever opportunity to obtain data on an airframe-integrated supersonic combustion ramjet propulsion system in flight, providing the first flight validation of the wind tunnel, numerical, and analytical methods used for design of these vehicles. A substantial portion of the integrated vehicle/engine flowpath development, engine systems verification and validation, and flight test risk reduction efforts are experimentally based, including vehicle aeropropulsive force and moment database generation for flight control law development, and integrated vehicle/engine performance validation. The Mach 7 engine flowpath development tests have been completed, and effort is now shifting to engine controls, systems, and performance verification and validation tests, as well as additional flight test risk reduction tests. The engine wind tunnel tests required for these efforts range from tests of partial-width engines in both small and large scramjet test facilities, to tests of the full flight engine on a vehicle simulator and tests of a complete flight vehicle in the Langley 8-Ft. High Temperature Tunnel. These tests will begin in the summer of 1998 and continue through 1999. The first flight test is planned for early 2000.
Woolf-King, Sarah E.; Maisto, Stephen; Carey, Michael; Vanable, Peter
2013-01-01
Experimental research on sexual decision making is limited, despite the public health importance of such work. We describe formative work conducted in advance of an experimental study designed to evaluate the effects of alcohol intoxication and sexual arousal on risky sexual decision making among men who have sex with men. In Study 1, we describe the procedures for selecting and validating erotic film clips (to be used for the experimental manipulation of arousal). In Study 2, we describe the tailoring of two interactive role-play videos to be used to measure risk perception and communication skills in an analog risky sex situation. Together, these studies illustrate a method for creating experimental stimuli to investigate sexual decision making in a laboratory setting, and this approach will support experimental research that affords a stronger basis for drawing causal inferences regarding sexual decision making. PMID:19760530
Recent modelling advances for ultrasonic TOFD inspections
DOE Office of Scientific and Technical Information (OSTI.GOV)
Darmon, Michel; Ferrand, Adrien; Dorval, Vincent
The ultrasonic TOFD (Time of Flight Diffraction) technique is commonly used to detect and characterize disoriented cracks using their edge diffraction echoes. An overview of the models integrated in the CIVA software platform and devoted to TOFD simulation is presented. CIVA can predict diffraction echoes from complex 3D flaws using a model based on the PTD (Physical Theory of Diffraction). Other dedicated developments have been added to simulate lateral waves, in 3D on planar entry surfaces and in 2D on irregular surfaces, by a ray approach. Calibration echoes from Side Drilled Holes (SDHs), specimen echoes, and shadowing effects from flaws can also be modelled. Some examples of theoretical validation of the models are presented. In addition, experimental validations have been performed, both on planar blocks containing calibration holes and various notches and on a specimen with an irregular entry surface, and allow conclusions to be drawn on the validity of all the developed models.
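Although the abstract concerns full wave-physics simulation, the basic TOFD sizing relation behind the technique is simple: for a flaw tip midway between two probes whose centers are 2S apart, the tip-echo time of flight gives the depth directly. A minimal sketch with illustrative values:

```python
import math

# TOFD depth estimate for a flaw tip midway between the probes:
# the diffracted path time is t = 2*sqrt(S^2 + d^2)/c, hence
# d = sqrt((c*t/2)^2 - S^2).  Values below are illustrative only.
c = 5900.0   # longitudinal wave speed in steel, m/s
S = 0.030    # half the probe-centre separation, m
t = 14.0e-6  # measured time of flight of the tip echo, s

d = math.sqrt((c * t / 2.0) ** 2 - S ** 2)
print(f"estimated flaw-tip depth: {d * 1000:.1f} mm")
```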
A high power ion thruster for deep space missions
NASA Astrophysics Data System (ADS)
Polk, James E.; Goebel, Dan M.; Snyder, John S.; Schneider, Analyn C.; Johnson, Lee K.; Sengupta, Anita
2012-07-01
The Nuclear Electric Xenon Ion System ion thruster was developed for potential outer planet robotic missions using nuclear electric propulsion (NEP). This engine was designed to operate at power levels ranging from 13 to 28 kW at specific impulses of 6000-8500 s and for burn times of up to 10 years. State-of-the-art performance and life assessment tools were used to design the thruster, which featured 57-cm-diameter carbon-carbon composite grids operating at voltages of 3.5-6.5 kV. Preliminary validation of the thruster performance was accomplished with a laboratory model thruster, while in parallel, a flight-like development model (DM) thruster was designed and two DM thrusters were fabricated. The first thruster completed full performance testing and a 2000-h wear test. The second successfully completed vibration tests at the full protoflight levels defined for this NEP program and then passed performance validation testing. The thruster design, performance, and the experimental validation of the design tools are discussed in this paper.
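The quoted operating envelope can be sanity-checked with the standard ideal-thrust relation T = 2*eta*P/(g0*Isp); the efficiency below is assumed for illustration and is not taken from the paper.

```python
# Back-of-the-envelope thrust from power and specific impulse:
# T = 2 * eta * P / (g0 * Isp), for the two ends of the quoted range.
g0 = 9.80665  # standard gravity, m/s^2
eta = 0.7     # assumed total thruster efficiency (illustrative)

for P, Isp in [(13e3, 6000.0), (28e3, 8500.0)]:
    T = 2.0 * eta * P / (g0 * Isp)
    print(f"P = {P / 1e3:.0f} kW, Isp = {Isp:.0f} s  ->  T = {T:.2f} N")
```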