Bayesian cross-entropy methodology for optimal design of validation experiments
NASA Astrophysics Data System (ADS)
Jiang, X.; Mahadevan, S.
2006-07-01
An important concern in the design of validation experiments is how to incorporate the mathematical model in the design in order to allow conclusive comparisons of model prediction with experimental output in model assessment. Classical experimental design methods are more suitable for phenomena discovery and may result in a subjective, expensive, time-consuming and ineffective design that may adversely impact these comparisons. In this paper, an integrated Bayesian cross-entropy methodology is proposed to perform the optimal design of validation experiments incorporating the computational model. The expected cross entropy, an information-theoretic distance between the distributions of model prediction and experimental observation, is defined as a utility function to measure the similarity of the two distributions. A simulated annealing algorithm is used to find optimal values of the input variables by minimizing or maximizing the expected cross entropy. The measured data after testing with the optimum input values are used to update the distribution of the experimental output using Bayes' theorem. The procedure is repeated to adaptively design the required number of experiments for model assessment, each time ensuring that the experiment provides an effective comparison for validation. The methodology is illustrated for the optimal design of validation experiments for a three-leg bolted joint structure and a composite helicopter rotor hub component.
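The search loop this abstract describes (a cross-entropy utility over the model-prediction and observation distributions, optimized by simulated annealing over the design input) can be sketched roughly as follows. The Gaussian forms, the input-to-distribution mappings, and all numbers here are illustrative assumptions, not the paper's actual model:

```python
import math
import random

def cross_entropy_gaussian(mu_p, s_p, mu_q, s_q):
    # Cross entropy H(p, q) between two univariate Gaussians:
    # H = 0.5*ln(2*pi*s_q^2) + (s_p^2 + (mu_p - mu_q)^2) / (2*s_q^2)
    return 0.5 * math.log(2 * math.pi * s_q**2) \
        + (s_p**2 + (mu_p - mu_q)**2) / (2 * s_q**2)

def utility(x):
    # Hypothetical stand-ins: model-prediction and experimental-output
    # distributions as functions of a scalar design input x.
    mu_model, s_model = 2.0 * x, 0.5
    mu_obs,   s_obs   = 2.0 * x + 0.3 * x**2, 0.6
    return cross_entropy_gaussian(mu_model, s_model, mu_obs, s_obs)

def anneal(lo=0.0, hi=5.0, steps=2000, t0=1.0, seed=0):
    # Simulated annealing that here MAXIMIZES the expected cross entropy
    # (i.e., seeks the most discriminating experiment); flip the sign of
    # delta to minimize instead.
    rng = random.Random(seed)
    x = rng.uniform(lo, hi)
    best_x, best_u = x, utility(x)
    for k in range(steps):
        t = t0 * (1.0 - k / steps) + 1e-9              # linear cooling
        cand = min(hi, max(lo, x + rng.gauss(0, 0.2))) # clamped proposal
        delta = utility(cand) - utility(x)
        # Accept uphill moves always, downhill with Boltzmann probability.
        if delta > 0 or rng.random() < math.exp(delta / t):
            x = cand
        if utility(x) > best_u:
            best_x, best_u = x, utility(x)
    return best_x, best_u
```

In the paper's adaptive procedure, the experiment would then be run at the returned input value and the observation distribution updated via Bayes' theorem before the next design iteration.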
The Role of Structural Models in the Solar Sail Flight Validation Process
NASA Technical Reports Server (NTRS)
Johnston, John D.
2004-01-01
NASA is currently soliciting proposals via the New Millennium Program ST-9 opportunity for a potential Solar Sail Flight Validation (SSFV) experiment to develop and operate in space a deployable solar sail that can be steered and provides measurable acceleration. The approach planned for this experiment is to test and validate models and processes for solar sail design, fabrication, deployment, and flight. These models and processes would then be used to design, fabricate, and operate scalable solar sails for future space science missions. There are six validation objectives planned for the ST-9 SSFV experiment: 1) Validate solar sail design tools and fabrication methods; 2) Validate controlled deployment; 3) Validate in-space structural characteristics (focus of poster); 4) Validate solar sail attitude control; 5) Validate solar sail thrust performance; 6) Characterize the sail's electromagnetic interaction with the space environment. This poster presents a top-level assessment of the role of structural models in the validation process for in-space structural characteristics.
Threats to the Internal Validity of Experimental and Quasi-Experimental Research in Healthcare.
Flannelly, Kevin J; Flannelly, Laura T; Jankowski, Katherine R B
2018-01-01
The article defines, describes, and discusses the seven threats to the internal validity of experiments discussed by Donald T. Campbell in his classic 1957 article: history, maturation, testing, instrument decay, statistical regression, selection, and mortality. These concepts are said to be threats to the internal validity of experiments because they pose alternate explanations for the apparent causal relationship between the independent variable and dependent variable of an experiment if they are not adequately controlled. A series of simple diagrams illustrate three pre-experimental designs and three true experimental designs discussed by Campbell in 1957 and several quasi-experimental designs described in his book written with Julian C. Stanley in 1966. The current article explains why each design controls for or fails to control for these seven threats to internal validity.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jernigan, Dann A.; Blanchat, Thomas K.
It is necessary to improve understanding and develop temporally- and spatially-resolved integral scale validation data of the heat flux incident to a complex object, in addition to measuring the thermal response of said object located within the fire plume, for the validation of the SIERRA/FUEGO/SYRINX fire and SIERRA/CALORE codes. To meet this objective, a complex calorimeter with sufficient instrumentation to allow validation of the coupling between FUEGO/SYRINX/CALORE has been designed, fabricated, and tested in the Fire Laboratory for Accreditation of Models and Experiments (FLAME) facility. Validation experiments are specifically designed for direct comparison with the computational predictions. Making meaningful comparisons between the computational and experimental results requires careful characterization and control of the experimental features or parameters used as inputs into the computational model. Validation experiments must be designed to capture the essential physical phenomena, including all relevant initial and boundary conditions. This report presents the data validation steps and processes, the results of the penlight radiant heat experiments (for the purpose of validating the CALORE heat transfer modeling of the complex calorimeter), and the results of the fire tests in FLAME.
ERIC Educational Resources Information Center
St.Clair, Travis; Cook, Thomas D.; Hallberg, Kelly
2014-01-01
Although evaluators often use an interrupted time series (ITS) design to test hypotheses about program effects, there are few empirical tests of the design's validity. We take a randomized experiment on an educational topic and compare its effects to those from a comparative ITS (CITS) design that uses the same treatment group as the experiment…
CFD validation experiments for hypersonic flows
NASA Technical Reports Server (NTRS)
Marvin, Joseph G.
1992-01-01
A roadmap for CFD code validation is introduced. The elements of the roadmap are consistent with air-breathing vehicle design requirements and related to the important flow path components: forebody, inlet, combustor, and nozzle. Building block and benchmark validation experiments are identified along with their test conditions and measurements. Based on evaluation criteria, recommendations for an initial CFD validation data base are given, and gaps are identified where future experiments could provide new validation data.
A Performance Management Framework for Civil Engineering
1990-09-01
cultural change. A non-equivalent control group design was chosen to augment the case analysis. Figure 3.18 shows the form of the quasi-experiment. … The non-equivalent control group design controls the following obstacles to internal validity: history, maturation, testing, and instrumentation. … (and Stanley, 1963:48,50) Table 7. Validity of Quasi-Experiment. The non-equivalent control group experimental design controls the following obstacles to …
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ricks, Allen; Blanchat, Thomas K.; Jernigan, Dann A.
2006-06-01
It is necessary to improve understanding and develop validation data of the heat flux incident to an object located within the fire plume for the validation of SIERRA/FUEGO/SYRINX fire and SIERRA/CALORE. One key aspect of the validation data sets is the determination of the relative contribution of the radiative and convective heat fluxes. To meet this objective, a cylindrical calorimeter with sufficient instrumentation to measure total and radiative heat flux had been designed and fabricated. This calorimeter will be tested both in the controlled radiative environment of the Penlight facility and in a fire environment in the FLAME/Radiant Heat (FRH) facility. Validation experiments are specifically designed for direct comparison with the computational predictions. Making meaningful comparisons between the computational and experimental results requires careful characterization and control of the experimental features or parameters used as inputs into the computational model. Validation experiments must be designed to capture the essential physical phenomena, including all relevant initial and boundary conditions. A significant question of interest to modeling heat flux incident to an object in or near a fire is the contribution of the radiation and convection modes of heat transfer. The series of experiments documented in this test plan is designed to provide data on the radiation partitioning, defined as the fraction of the total heat flux that is due to radiation.
Study design elements for rigorous quasi-experimental comparative effectiveness research.
Maciejewski, Matthew L; Curtis, Lesley H; Dowd, Bryan
2013-03-01
Quasi-experiments are likely to be the workhorse study design used to generate evidence about the comparative effectiveness of alternative treatments, because of their feasibility, timeliness, affordability and external validity compared with randomized trials. In this review, we outline potential sources of discordance in results between quasi-experiments and experiments, review study design choices that can improve the internal validity of quasi-experiments, and outline innovative data linkage strategies that may be particularly useful in quasi-experimental comparative effectiveness research. There is an urgent need to resolve the debate about the evidentiary value of quasi-experiments since equal consideration of rigorous quasi-experiments will broaden the base of evidence that can be brought to bear in clinical decision-making and governmental policy-making.
ERIC Educational Resources Information Center
St. Clair, Travis; Hallberg, Kelly; Cook, Thomas D.
2016-01-01
We explore the conditions under which short, comparative interrupted time-series (CITS) designs represent valid alternatives to randomized experiments in educational evaluations. To do so, we conduct three within-study comparisons, each of which uses a unique data set to test the validity of the CITS design by comparing its causal estimates to…
Design and validation of instruments to measure knowledge.
Elliott, T E; Regal, R R; Elliott, B A; Renier, C M
2001-01-01
Measuring health care providers' learning after they have participated in educational interventions that use experimental designs requires valid, reliable, and practical instruments. A literature review was conducted. In addition, experience gained from designing and validating instruments for measuring the effect of an educational intervention informed this process. The eight main steps for designing, validating, and testing the reliability of instruments for measuring learning outcomes are presented. The key considerations and rationale for this process are discussed. Methods for critiquing and adapting existent instruments and creating new ones are offered. This study may help other investigators in developing valid, reliable, and practical instruments for measuring the outcomes of educational activities.
ERIC Educational Resources Information Center
Sørlie, Mari-Anne; Ogden, Terje
2014-01-01
This paper reviews literature on the rationale, challenges, and recommendations for choosing a nonequivalent comparison (NEC) group design when evaluating intervention effects. After reviewing frequently addressed threats to validity, the paper describes recommendations for strengthening the research design and how the recommendations were…
A CFD validation roadmap for hypersonic flows
NASA Technical Reports Server (NTRS)
Marvin, Joseph G.
1992-01-01
A roadmap for computational fluid dynamics (CFD) code validation is developed. The elements of the roadmap are consistent with air-breathing vehicle design requirements and related to the important flow path components: forebody, inlet, combustor, and nozzle. Building block and benchmark validation experiments are identified along with their test conditions and measurements. Based on evaluation criteria, recommendations for an initial CFD validation data base are given, and gaps are identified where future experiments would provide the needed validation data.
A CFD validation roadmap for hypersonic flows
NASA Technical Reports Server (NTRS)
Marvin, Joseph G.
1993-01-01
A roadmap for computational fluid dynamics (CFD) code validation is developed. The elements of the roadmap are consistent with air-breathing vehicle design requirements and related to the important flow path components: forebody, inlet, combustor, and nozzle. Building block and benchmark validation experiments are identified along with their test conditions and measurements. Based on evaluation criteria, recommendations for an initial CFD validation data base are given, and gaps are identified where future experiments would provide the needed validation data.
In-Trail Procedure Air Traffic Control Procedures Validation Simulation Study
NASA Technical Reports Server (NTRS)
Chartrand, Ryan C.; Hewitt, Katrin P.; Sweeney, Peter B.; Graff, Thomas J.; Jones, Kenneth M.
2012-01-01
In August 2007, Airservices Australia (Airservices) and the United States National Aeronautics and Space Administration (NASA) conducted a validation experiment of the air traffic control (ATC) procedures associated with the Automatic Dependent Surveillance-Broadcast (ADS-B) In-Trail Procedure (ITP). ITP is an Airborne Traffic Situation Awareness (ATSA) application designed for near-term use in procedural airspace in which ADS-B data are used to facilitate climb and descent maneuvers. NASA and Airservices conducted the experiment in Airservices' simulator in Melbourne, Australia. Twelve current operational air traffic controllers participated in the experiment, which identified aspects of the ITP that could be improved (mainly in the communication and controller approval process). Results showed that controllers viewed the ITP as valid and acceptable. This paper describes the experiment design and results.
RF Systems in Space. Volume I. Space Antennas Frequency (SARF) Simulation.
1983-04-01
lens SBR designs were investigated. The survivability of an SBR system was analyzed. The design of ground-based SBR validation experiments for large-aperture SBR concepts was investigated. SBR designs were investigated for ground target detection. … 4. To analyze the survivability of space radar; 5. To design ground-based validation …
An Engineering Method of Civil Jet Requirements Validation Based on Requirements Project Principle
NASA Astrophysics Data System (ADS)
Wang, Yue; Gao, Dan; Mao, Xuming
2018-03-01
A method of requirements validation is developed and defined to meet the needs of civil jet requirements validation in product development. Based on the requirements project principle, this method will not affect the conventional design elements and can effectively connect the requirements with the design. It realizes the modern civil jet development concept that "requirement is the origin, design is the basis". So far, the method has been successfully applied in civil jet aircraft development in China. Taking takeoff field length as an example, the validation process and the validation method of the requirements are introduced in detail, with the hope of providing experience for other civil jet product design efforts.
USDA-ARS?s Scientific Manuscript database
The purpose of SMAP (Soil Moisture Active Passive) Validation Experiment 2012 (SMAPVEX12) campaign was to collect data for the pre-launch development and validation of SMAP soil moisture algorithms. SMAP is a National Aeronautics and Space Administration’s (NASA) satellite mission designed for the m...
How Generalizable Is Your Experiment? An Index for Comparing Samples and Populations
ERIC Educational Resources Information Center
Tipton, Elizabeth
2013-01-01
Recent research on the design of social experiments has highlighted the effects of different design choices on research findings. Since experiments rarely collect their samples using random selection, in order to address these external validity problems and design choices, recent research has focused on two areas. The first area is on methods for…
Laboratory Experimental Design for a Glycomic Study.
Ugrina, Ivo; Campbell, Harry; Vučković, Frano
2017-01-01
Proper attention to study design beforehand, careful conduct of procedures during, and appropriate inference from results after scientific experiments are important in all scientific studies in order to ensure that valid, and sometimes definitive, conclusions can be made. The design of experiments, also called experimental design, addresses the challenge of structuring and conducting experiments to answer the questions of interest as clearly and efficiently as possible.
Further Validation of the Coach Identity Prominence Scale
ERIC Educational Resources Information Center
Pope, J. Paige; Hall, Craig R.
2014-01-01
This study was designed to examine select psychometric properties of the Coach Identity Prominence Scale (CIPS), including the reliability, factorial validity, convergent validity, discriminant validity, and predictive validity. Coaches (N = 338) who averaged 37 (SD = 12.27) years of age, had a mean of 13 (SD = 9.90) years of coaching experience,…
Validation Experiences and Persistence among Community College Students
ERIC Educational Resources Information Center
Barnett, Elisabeth A.
2011-01-01
The purpose of this correlational research was to examine the extent to which community college students' experiences with validation by faculty (Rendon, 1994, 2002) predicted: (a) their sense of integration, and (b) their intent to persist. The research was designed as an elaboration of constructs within Tinto's (1993) Longitudinal Model of…
A Surrogate Approach to the Experimental Optimization of Multielement Airfoils
NASA Technical Reports Server (NTRS)
Otto, John C.; Landman, Drew; Patera, Anthony T.
1996-01-01
The incorporation of experimental test data into the optimization process is accomplished through the use of Bayesian-validated surrogates. In the surrogate approach, a surrogate for the experiment (e.g., a response surface) serves in the optimization process. The validation step of the framework provides a qualitative assessment of the surrogate quality and bounds the surrogate-for-experiment error on designs "near" surrogate-predicted optimal designs. The utility of the framework is demonstrated through its application to the experimental selection of the trailing-edge flap position to achieve a design lift coefficient for a three-element airfoil.
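A minimal sketch of the response-surface surrogate idea: fit a quadratic to a handful of (flap position, lift coefficient) observations by least squares, then optimize on the fitted surface instead of running further experiments. The sample points and the underlying lift curve are invented for illustration and are not the paper's data:

```python
def solve3(A, b):
    # Gaussian elimination with partial pivoting for a 3x3 system.
    A = [row[:] for row in A]
    b = b[:]
    for i in range(3):
        p = max(range(i, 3), key=lambda r: abs(A[r][i]))
        A[i], A[p] = A[p], A[i]
        b[i], b[p] = b[p], b[i]
        for r in range(i + 1, 3):
            f = A[r][i] / A[i][i]
            for c in range(i, 3):
                A[r][c] -= f * A[i][c]
            b[r] -= f * b[i]
    x = [0.0] * 3
    for i in range(2, -1, -1):
        x[i] = (b[i] - sum(A[i][c] * x[c] for c in range(i + 1, 3))) / A[i][i]
    return x

def fit_quadratic(xs, ys):
    # Least-squares fit of y ~ c0 + c1*x + c2*x^2 via the normal equations.
    S = lambda k: sum(x**k for x in xs)
    Sy = lambda k: sum((x**k) * y for x, y in zip(xs, ys))
    A = [[S(0), S(1), S(2)],
         [S(1), S(2), S(3)],
         [S(2), S(3), S(4)]]
    return solve3(A, [Sy(0), Sy(1), Sy(2)])

# Hypothetical flap positions (deg) and "measured" lift coefficients,
# generated here from an assumed quadratic peaking at 12 deg.
xs = [8.0, 10.0, 12.0, 14.0, 16.0]
ys = [1.5 - 0.1 * (x - 12.0)**2 for x in xs]
c0, c1, c2 = fit_quadratic(xs, ys)
x_opt = -c1 / (2.0 * c2)   # vertex of the surrogate = predicted optimum
```

The Bayesian-validation step of the actual framework would additionally bound the surrogate-for-experiment error near `x_opt`; that machinery is beyond this sketch.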
Merlin: Computer-Aided Oligonucleotide Design for Large Scale Genome Engineering with MAGE.
Quintin, Michael; Ma, Natalie J; Ahmed, Samir; Bhatia, Swapnil; Lewis, Aaron; Isaacs, Farren J; Densmore, Douglas
2016-06-17
Genome engineering technologies now enable precise manipulation of organism genotype, but can be limited in scalability by their design requirements. Here we describe Merlin ( http://merlincad.org ), an open-source web-based tool to assist biologists in designing experiments using multiplex automated genome engineering (MAGE). Merlin provides methods to generate pools of single-stranded DNA oligonucleotides (oligos) for MAGE experiments by performing free energy calculation and BLAST scoring on a sliding window spanning the targeted site. These oligos are designed not only to improve recombination efficiency, but also to minimize off-target interactions. The application further assists experiment planning by reporting predicted allelic replacement rates after multiple MAGE cycles, and enables rapid result validation by generating primer sequences for multiplexed allele-specific colony PCR. Here we describe the Merlin oligo and primer design procedures and validate their functionality compared to OptMAGE by eliminating seven AvrII restriction sites from the Escherichia coli genome.
Reasoning, Problem Solving, and Intelligence.
1980-04-01
designed to test the validity of their model of response choice in analogical reasoning. In the first experiment, they set out to demonstrate that … second experiment were somewhat consistent with the prediction. The third experiment used a concept-formation design in which subjects were required to … designed to show interrelationships between various forms of inductive reasoning. Their model fits were highly comparable to those of Rumelhart and …
Szerkus, Oliwia; Struck-Lewicka, Wiktoria; Kordalewska, Marta; Bartosińska, Ewa; Bujak, Renata; Borsuk, Agnieszka; Bienert, Agnieszka; Bartkowska-Śniatkowska, Alicja; Warzybok, Justyna; Wiczling, Paweł; Nasal, Antoni; Kaliszan, Roman; Markuszewski, Michał Jan; Siluk, Danuta
2017-02-01
The purpose of this work was to develop and validate a rapid and robust LC-MS/MS method for the determination of dexmedetomidine (DEX) in plasma, suitable for analysis of a large number of samples. A systematic approach, Design of Experiments, was applied to optimize ESI source parameters and to evaluate method robustness; as a result, a rapid, stable and cost-effective assay was developed. The method was validated according to US FDA guidelines. The LLOQ was determined at 5 pg/ml, and the assay was linear over the examined concentration range (5-2500 pg/ml; R² > 0.98). The accuracies and intra- and interday precisions were less than 15%. The stability data confirmed reliable behavior of DEX under the tested conditions. Application of the Design of Experiments approach allowed for fast and efficient analytical method development and validation, as well as reduced usage of the chemicals necessary for regular method optimization. The proposed technique was applied to the determination of DEX pharmacokinetics in pediatric patients undergoing long-term sedation in the intensive care unit.
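The Design of Experiments screening mentioned in this abstract can be illustrated with a tiny two-level full-factorial enumeration; the factor names and levels below are hypothetical stand-ins, not the published method's actual ESI settings:

```python
from itertools import product

# Hypothetical two-level factors for an ESI source screening design.
factors = {
    "capillary_voltage_kV": (2.5, 4.0),
    "desolvation_temp_C":   (350, 550),
    "nebulizer_gas_psi":    (30, 50),
}

def full_factorial(factors):
    """Enumerate every combination of factor levels as a list of run dicts."""
    names = list(factors)
    return [dict(zip(names, combo)) for combo in product(*factors.values())]

runs = full_factorial(factors)
print(len(runs))  # prints 8 (2^3 combinations for three two-level factors)
```

In practice a fractional-factorial or response-surface design would usually replace the full enumeration once the number of factors grows, which is part of what makes the DoE approach cheaper than one-factor-at-a-time optimization.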
The marketing implications of affective product design.
Seva, Rosemary R; Duh, Henry Been-Lirn; Helander, Martin G
2007-11-01
Emotions are compelling human experiences and product designers can take advantage of this by conceptualizing emotion-engendering products that sell well in the market. This study hypothesized that product attributes influence users' emotions and that the relationship is moderated by the adherence of these product attributes to purchase criteria. It was further hypothesized that the emotional experience of the user influences purchase intention. A laboratory study was conducted to validate the hypotheses using mobile phones as test products. Sixty-two participants were asked to assess eight phones from a display of 10 phones and indicate their emotional experiences after assessment. Results suggest that some product attributes can cause intense emotional experience. The attributes relate to the phone's dimensions and the relationship between these dimensions. The study validated the notion of integrating affect in designing products that convey users' personalities.
From theory to experimental design-Quantifying a trait-based theory of predator-prey dynamics.
Laubmeier, A N; Wootton, Kate; Banks, J E; Bommarco, Riccardo; Curtsdotter, Alva; Jonsson, Tomas; Roslin, Tomas; Banks, H T
2018-01-01
Successfully applying theoretical models to natural communities and predicting ecosystem behavior under changing conditions is the backbone of predictive ecology. However, the experiments required to test these models are dictated by practical constraints, and models are often opportunistically validated against data for which they were never intended. Alternatively, we can inform and improve experimental design by an in-depth pre-experimental analysis of the model, generating experiments better targeted at testing the validity of a theory. Here, we describe this process for a specific experiment. Starting from food web ecological theory, we formulate a model and design an experiment to optimally test the validity of the theory, supplementing traditional design considerations with model analysis. The experiment itself will be run and described in a separate paper. The theory we test is that trophic population dynamics are dictated by species traits, and we study this in a community of terrestrial arthropods. We depart from the Allometric Trophic Network (ATN) model and hypothesize that including habitat use, in addition to body mass, is necessary to better model trophic interactions. We therefore formulate new terms which account for micro-habitat use as well as intra- and interspecific interference in the ATN model. We design an experiment and an effective sampling regime to test this model and the underlying assumptions about the traits dominating trophic interactions. We arrive at a detailed sampling protocol to maximize information content in the empirical data obtained from the experiment and, relying on theoretical analysis of the proposed model, explore potential shortcomings of our design. 
Consequently, since this is a "pre-experimental" exercise aimed at improving the links between hypothesis formulation, model construction, experimental design and data collection, we hasten to publish our findings before analyzing data from the actual experiment, thus setting the stage for strong inference.
Xie, Yi; Mun, Sungyong; Kim, Jinhyun; Wang, Nien-Hwa Linda
2002-01-01
A tandem simulated moving bed (SMB) process for insulin purification has been proposed and validated experimentally. The mixture to be separated consists of insulin, high molecular weight proteins, and zinc chloride. A systematic approach based on the standing wave design, rate model simulations, and experiments was used to develop this multicomponent separation process. The standing wave design was applied to specify the SMB operating conditions of a lab-scale unit with 10 columns. The design was validated with rate model simulations prior to experiments. The experimental results show 99.9% purity and 99% yield, which closely agree with the model predictions and the standing wave design targets. The agreement proves that the standing wave design can ensure high purity and high yield for the tandem SMB process. Compared to a conventional batch SEC process, the tandem SMB has 10% higher yield, 400% higher throughput, and 72% lower eluant consumption. In contrast, a design that ignores the effects of mass transfer and nonideal flow cannot meet the purity requirement and gives less than 96% yield.
NASA Technical Reports Server (NTRS)
Generazio, Edward R. (Inventor)
2012-01-01
A method of validating a probability of detection (POD) testing system using directed design of experiments (DOE) includes recording an input data set of observed hit and miss or analog data for sample components as a function of size of a flaw in the components. The method also includes processing the input data set to generate an output data set having an optimal class width, assigning a case number to the output data set, and generating validation instructions based on the assigned case number. An apparatus includes a host machine for receiving the input data set from the testing system and an algorithm for executing DOE to validate the test system. The algorithm applies DOE to the input data set to determine a data set having an optimal class width, assigns a case number to that data set, and generates validation instructions based on the case number.
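A rough sketch of the hit/miss POD modeling that underlies such a testing system: a logistic POD(a) curve over log flaw size, fitted to binary detection outcomes by gradient ascent on the Bernoulli log-likelihood. The flaw sizes and outcomes are made-up illustrative data, and this is a plain textbook logistic regression, not the patented directed-DOE procedure itself:

```python
import math

# Made-up hit/miss data: flaw size (arbitrary units) and detection outcome.
sizes = [0.2, 0.3, 0.4, 0.5, 0.7, 0.9, 1.2, 1.6, 2.0, 2.5]
hits  = [0,   0,   0,   1,   0,   1,   1,   1,   1,   1  ]

def fit_pod(sizes, hits, lr=0.1, iters=5000):
    # Fit POD(a) = 1 / (1 + exp(-(b0 + b1*ln a))) by gradient ascent
    # on the log-likelihood (concave, so a fixed small step converges).
    b0, b1 = 0.0, 1.0
    xs = [math.log(a) for a in sizes]
    for _ in range(iters):
        g0 = g1 = 0.0
        for x, y in zip(xs, hits):
            p = 1.0 / (1.0 + math.exp(-(b0 + b1 * x)))
            g0 += (y - p)          # d(log-likelihood)/d b0
            g1 += (y - p) * x      # d(log-likelihood)/d b1
        b0 += lr * g0
        b1 += lr * g1
    return b0, b1

def pod(a, b0, b1):
    return 1.0 / (1.0 + math.exp(-(b0 + b1 * math.log(a))))

b0, b1 = fit_pod(sizes, hits)
```

Quantities like a90 (the flaw size detected with 90% probability) then follow directly by inverting the fitted curve.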
Perceptions vs Reality: A Longitudinal Experiment in Influenced Judgement Performance
2003-03-25
validity were manifested equally between treatment and control groups, thereby lending further validity to the experimental research design. External … Stanley (1975) identify this as a True Experimental Design: Pretest-Posttest Control Group Design. However, due to the longitudinal aspect required to … (1975:43). Nonequivalence will be ruled out as pretest equivalence is shown between treatment and control groups (1975:47). For quasi …
Goals and Status of the NASA Juncture Flow Experiment
NASA Technical Reports Server (NTRS)
Rumsey, Christopher L.; Morrison, Joseph H.
2016-01-01
The NASA Juncture Flow experiment is a new effort whose focus is attaining validation data in the juncture region of a wing-body configuration. The experiment is designed specifically for the purpose of CFD validation. Current turbulence models routinely employed by Reynolds-averaged Navier-Stokes CFD are inconsistent in their prediction of corner flow separation in aircraft juncture regions, so experimental data in the near-wall region of such a configuration will be useful both for assessment as well as for turbulence model improvement. This paper summarizes the Juncture Flow effort to date, including preliminary risk-reduction experiments already conducted and planned future experiments. The requirements and challenges associated with conducting a quality validation test are discussed.
Virtual Model Validation of Complex Multiscale Systems: Applications to Nonlinear Elastostatics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Oden, John Tinsley; Prudencio, Ernest E.; Bauman, Paul T.
We propose a virtual statistical validation process as an aid to the design of experiments for the validation of phenomenological models of the behavior of material bodies, with focus on those cases in which knowledge of the fabrication process used to manufacture the body can provide information on the micro-molecular-scale properties underlying macroscale behavior. One example is given by models of elastomeric solids fabricated using polymerization processes. We describe a framework for model validation that involves Bayesian updates of parameters in statistical calibration and validation phases. The process enables the quantification of uncertainty in quantities of interest (QoIs) and the determination of model consistency using tools of statistical information theory. We assert that microscale information drawn from molecular models of the fabrication of the body provides a valuable source of prior information on parameters as well as a means for estimating model bias and designing virtual validation experiments to provide information gain over calibration posteriors.
In-Flight Thermal Performance of the Lidar In-Space Technology Experiment
NASA Technical Reports Server (NTRS)
Roettker, William
1995-01-01
The Lidar In-Space Technology Experiment (LITE) was developed at NASA's Langley Research Center to explore the applications of lidar operated from an orbital platform. As a technology demonstration experiment, LITE was developed to gain experience designing and building future operational orbiting lidar systems. Since LITE was the first lidar system to be flown in space, an important objective was to validate instrument design principles in such areas as thermal control, laser performance, instrument alignment and control, and autonomous operations. Thermal and structural analysis models of the instrument were developed during the design process to predict the behavior of the instrument during its mission. In order to validate those mathematical models, extensive engineering data were recorded during all phases of LITE's mission. This in-flight engineering data was compared with preflight predictions and, when required, adjustments to the thermal and structural models were made to more accurately match the instrument's actual behavior. The results of this process for the thermal analysis and design of LITE are presented in this paper.
ERIC Educational Resources Information Center
Tang, Yang; Cook, Thomas D.; Kisbu-Sakarya, Yasemin
2015-01-01
Regression discontinuity (RD) design has been widely used to produce reliable causal estimates. Researchers have validated the accuracy of the RD design using within-study comparisons (Cook, Shadish & Wong, 2008; Cook & Steiner, 2010; Shadish et al., 2011). A within-study comparison examines the validity of a quasi-experiment by comparing its…
NASA Technical Reports Server (NTRS)
Moes, Timothy R.
2009-01-01
The principal objective of the Supersonics Project is to develop and validate multidisciplinary physics-based predictive design, analysis and optimization capabilities for supersonic vehicles. For aircraft, the focus will be on eliminating the efficiency, environmental and performance barriers to practical supersonic flight. Previous flight projects found that a shaped sonic boom could propagate all the way to the ground (F-5 SSBD experiment) and validated design tools for forebody shape modifications (F-5 SSBD and Quiet Spike experiments). The current project, Lift and Nozzle Change Effects on Tail Shock (LaNCETS) seeks to obtain flight data to develop and validate design tools for low-boom tail shock modifications. Attempts will be made to alter the shock structure of NASA's NF-15B TN/837 by changing the lift distribution by biasing the canard positions, changing the plume shape by under- and over-expanding the nozzles, and changing the plume shape using thrust vectoring. Additional efforts will measure resulting shocks with a probing aircraft (F-15B TN/836) and use the results to validate and update predictive tools. Preliminary flight results are presented and are available to provide truth data for developing and validating the CFD tools required to design low-boom supersonic aircraft.
Statistical issues in the design and planning of proteomic profiling experiments.
Cairns, David A
2015-01-01
The statistical design of a clinical proteomics experiment is a critical part of a well-conducted investigation. Standard concepts from experimental design such as randomization, replication and blocking should be applied in all experiments, and this is possible when the experimental conditions are well understood by the investigator. The large number of proteins simultaneously considered in proteomic discovery experiments means that determining the number of required replicates to perform a powerful experiment is more complicated than in simple experiments. However, by using information about the nature of an experiment and making simple assumptions this is achievable for a variety of experiments useful for biomarker discovery and initial validation.
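The replicate-number calculation this abstract alludes to can be sketched with a normal approximation and a Bonferroni correction for the many proteins tested at once; the effect size and protein counts below are hypothetical:

```python
import math
from statistics import NormalDist

def replicates_per_group(effect_size, alpha=0.05, power=0.8, n_proteins=1):
    """Normal-approximation replicate count for a two-group comparison,
    Bonferroni-corrected for the number of proteins tested simultaneously."""
    z = NormalDist().inv_cdf
    alpha_adj = alpha / n_proteins                 # multiple-testing correction
    n = 2.0 * ((z(1 - alpha_adj / 2) + z(power)) / effect_size) ** 2
    return math.ceil(n)

# One protein at a standardized effect size of 1.0: the textbook answer.
print(replicates_per_group(1.0))                   # → 16
# Screening 1000 proteins at once requires substantially more replication.
print(replicates_per_group(1.0, n_proteins=1000))
```

This illustrates the abstract's point: the simultaneous testing of many proteins, not the arithmetic itself, is what complicates power planning in discovery experiments.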
Empirical Validation and Application of the Computing Attitudes Survey
ERIC Educational Resources Information Center
Dorn, Brian; Elliott Tew, Allison
2015-01-01
Student attitudes play an important role in shaping learning experiences. However, few validated instruments exist for measuring student attitude development in a discipline-specific way. In this paper, we present the design, development, and validation of the computing attitudes survey (CAS). The CAS is an extension of the Colorado Learning…
Validation Experiments for Spent-Fuel Dry-Cask In-Basket Convection
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, Barton L.
2016-08-16
This work consisted of the following major efforts: 1. Literature survey on validation of external natural convection; 2. Design the experiment; 3. Build the experiment; 4. Run the experiment; 5. Collect results; 6. Disseminate results; and 7. Perform a CFD validation study using the results. We note that while all tasks are complete, some deviations from the original plan were made. Specifically, geometrical changes in the parameter space were skipped in favor of flow condition changes, which were found to be much more practical to implement. Changing the geometry required new as-built measurements, which proved extremely costly and impractical given the time and funds available.
Designing biomedical proteomics experiments: state-of-the-art and future perspectives.
Maes, Evelyne; Kelchtermans, Pieter; Bittremieux, Wout; De Grave, Kurt; Degroeve, Sven; Hooyberghs, Jef; Mertens, Inge; Baggerman, Geert; Ramon, Jan; Laukens, Kris; Martens, Lennart; Valkenborg, Dirk
2016-05-01
With the current expanded technical capabilities to perform mass spectrometry-based biomedical proteomics experiments, an improved focus on the design of experiments is crucial. As it is clear that ignoring the importance of a good design leads to an unprecedented rate of false discoveries that would poison our results, more and more tools are being developed to help researchers design proteomic experiments. In this review, we apply statistical thinking to go through the entire proteomics workflow for biomarker discovery and validation and relate the considerations that should be made at the level of hypothesis building, technology selection, experimental design and the optimization of the experimental parameters.
NASA Technical Reports Server (NTRS)
Cayeux, P.; Raballand, F.; Borde, J.; Berges, J.-C.; Meyssignac, B.
2007-01-01
Within the framework of a partnership agreement, EADS ASTRIUM has worked since June 2006 for the CNES formation flying experiment on the PRISMA mission. EADS ASTRIUM is responsible for the anti-collision function. This responsibility covers the design and the development of the function as a Matlab/Simulink library, as well as its functional validation and performance assessment. PRISMA is a technology in-orbit testbed mission from the Swedish National Space Board, mainly devoted to formation flying demonstration. PRISMA is made of two micro-satellites that will be launched in 2009 on a quasi-circular SSO at about 700 km of altitude. The CNES FFIORD experiment embedded on PRISMA aims at flight validating an FFRF sensor designed for formation control, and assessing its performances, in preparation to future formation flying missions such as Simbol X; FFIORD aims as well at validating various typical autonomous rendezvous and formation guidance and control algorithms. This paper presents the principles of the collision avoidance function developed by EADS ASTRIUM for FFIORD; three kinds of maneuvers were implemented and are presented in this paper with their performances.
Oliva, Alexis; Monzón, Cecilia; Santoveña, Ana; Fariña, José B; Llabrés, Matías
2016-07-01
An ultra high performance liquid chromatography method was developed and validated for the quantitation of triamcinolone acetonide in an injectable ophthalmic hydrogel to determine the contribution of analytical method error in the content uniformity measurement. During the development phase, the design of experiments/design space strategy was used. For this, the free R-program was used as a commercial software alternative, a fast efficient tool for data analysis. The process capability index was used to find the permitted level of variation for each factor and to define the design space. All these aspects were analyzed and discussed under different experimental conditions by the Monte Carlo simulation method. Second, a pre-study validation procedure was performed in accordance with the International Conference on Harmonization guidelines. The validated method was applied for the determination of uniformity of dosage units and the reasons for variability (inhomogeneity and the analytical method error) were analyzed based on the overall uncertainty. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
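The process capability index and Monte Carlo simulation mentioned in this abstract can be illustrated with a short sketch; the content and analytical-error standard deviations and the 85-115% limits are hypothetical, not the paper's values:

```python
import random
import statistics

def cpk(samples, lsl, usl):
    """Process capability index from a sample and lower/upper spec limits."""
    mu = statistics.fmean(samples)
    sd = statistics.stdev(samples)
    return min(usl - mu, mu - lsl) / (3.0 * sd)

random.seed(1)
# Monte Carlo: each assayed content = process variation + analytical error.
true_content, process_sd, method_sd = 100.0, 1.5, 0.8   # % of label claim
total_sd = (process_sd**2 + method_sd**2) ** 0.5
results = [random.gauss(true_content, total_sd) for _ in range(10_000)]
print(round(cpk(results, 85.0, 115.0), 2))  # capability against 85-115% limits
```

Separating `process_sd` from `method_sd` mirrors the paper's goal of attributing content-uniformity variability to inhomogeneity versus analytical method error.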
Design, Implementation and Validation of a Europe-Wide Pedagogical Framework for E-Learning
ERIC Educational Resources Information Center
Granic, Andrina; Mifsud, Charles; Cukusic, Maja
2009-01-01
Within the context of a Europe-wide project UNITE, a number of European partners set out to design, implement and validate a pedagogical framework (PF) for e- and m-Learning in secondary schools. The process of formulating and testing the PF was an evolutionary one that reflected the experiences and skills of the various European partners and…
Mahler, H I; Kulik, J A
1995-02-01
The purpose of this study was to demonstrate the validation of videotape interventions that were designed to prepare patients for coronary artery bypass graft (CABG) surgery. First, three videotapes were developed. Two of the tapes featured the experiences of three actual CABG patients and were constructed to present either an optimistic portrayal of the recovery period (mastery tape) or a portrayal designed to inoculate patients against potential problems (coping tape). The third videotape contained the more general nurse scenes and narration used in the other two tapes, but did not include the experiences of particular patients. We then conducted a study to establish the convergent and discriminant validity of the three tapes. That is, we sought to demonstrate both that the tapes did differ along the mastery-coping dimension, and that they did not differ in other respects (such as in the degree of information provided or the perceived credibility of the narrator). The validation study, conducted with 42 males who had previously undergone CABG, demonstrated that the intended equivalences and differences between the tapes were achieved. The importance of establishing the validity of health-related interventions is discussed.
Development and Validation of the Caring Loneliness Scale.
Karhe, Liisa; Kaunonen, Marja; Koivisto, Anna-Maija
2016-12-01
The Caring Loneliness Scale (CARLOS) includes 5 categories derived from earlier qualitative research. This article assesses the reliability and construct validity of a scale designed to measure patient experiences of loneliness in a professional caring relationship. Statistical analysis with 4 different sample sizes included Cronbach's alpha and exploratory factor analysis with principal axis factoring extraction. The sample size of 250 gave the most useful and comprehensible structure, but all 4 samples yielded underlying content of loneliness experiences. The initial 5 categories were reduced to 4 factors with 24 items and Cronbach's alpha ranging from .77 to .90. The findings support the reliability and validity of CARLOS for the assessment of Finnish breast cancer and heart surgery patients' experiences but, as with all instruments, further validation is needed.
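Cronbach's alpha, the reliability statistic used in this abstract, reduces to a short calculation; the item scores below are hypothetical Likert responses, not CARLOS data:

```python
import statistics

def cronbach_alpha(items):
    """items: one list of scores per scale item, aligned across respondents."""
    k = len(items)
    totals = [sum(scores) for scores in zip(*items)]
    item_var = sum(statistics.pvariance(it) for it in items)
    return k / (k - 1) * (1 - item_var / statistics.pvariance(totals))

# Four respondents answering three items (hypothetical scores).
items = [[1, 2, 3, 4], [1, 2, 3, 4], [1, 2, 3, 3]]
print(round(cronbach_alpha(items), 2))
```

Alpha near 1 indicates the items covary strongly, i.e. high internal consistency, which is what the reported range of .77 to .90 conveys.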
Guidelines for the design and statistical analysis of experiments in papers submitted to ATLA.
Festing, M F
2001-01-01
In vitro experiments need to be well designed and correctly analysed if they are to achieve their full potential to replace the use of animals in research. An "experiment" is a procedure for collecting scientific data in order to answer a hypothesis, or to provide material for generating new hypotheses, and differs from a survey because the scientist has control over the treatments that can be applied. Most experiments can be classified into one of a few formal designs, the most common being completely randomised, and randomised block designs. These are quite common with in vitro experiments, which are often replicated in time. Some experiments involve a single independent (treatment) variable, while other "factorial" designs simultaneously vary two or more independent variables, such as drug treatment and cell line. Factorial designs often provide additional information at little extra cost. Experiments need to be carefully planned to avoid bias, be powerful yet simple, provide for a valid statistical analysis and, in some cases, have a wide range of applicability. Virtually all experiments need some sort of statistical analysis in order to take account of biological variation among the experimental subjects. Parametric methods using the t test or analysis of variance are usually more powerful than non-parametric methods, provided the underlying assumptions of normality of the residuals and equal variances are approximately valid. The statistical analyses of data from a completely randomised design, and from a randomised-block design are demonstrated in Appendices 1 and 2, and methods of determining sample size are discussed in Appendix 3. Appendix 4 gives a checklist for authors submitting papers to ATLA.
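The completely randomised design and its analysis of variance, as described above, can be sketched as follows; the three treatment groups are invented for illustration:

```python
import statistics

def one_way_anova_f(groups):
    """F statistic for a completely randomised design (one-way ANOVA)."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand = sum(sum(g) for g in groups) / n
    ss_between = sum(len(g) * (statistics.fmean(g) - grand) ** 2
                     for g in groups)
    ss_within = sum(sum((x - statistics.fmean(g)) ** 2 for x in g)
                    for g in groups)
    # Mean squares: between-groups over (k-1) df, within-groups over (n-k) df.
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Three hypothetical treatment groups of in vitro responses.
groups = [[10, 11, 12], [13, 14, 15], [20, 21, 22]]
print(round(one_way_anova_f(groups), 1))   # → 79.0
```

A large F relative to the F distribution with (k-1, n-k) degrees of freedom indicates treatment differences beyond the biological variation the abstract emphasises.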
1980-09-01
used to accomplish the necessary research. One such experiment design and its relationship to validity will be explored next. Nonequivalent Control ...interpreting the results. The nonequivalent control group design is of the quasi-experimental variety and is widely used in educational research. As...biofeedback research literature is the controlled group outcome study. This design has also been discussed in Chapter III in two forms as the
Conflict: Operational Realism versus Analytical Rigor in Defense Modeling and Simulation
2012-06-14
Campbell, Experimental and Quasi-Experimental Designs for Generalized Causal Inference, Boston: Houghton Mifflin Company, 2002. [7] R. T. Johnson, G...experimentation? In order for an experiment to be considered rigorous, and the results valid, the experiment should be designed using established...addition to the interview, the pilots were administered a written survey, designed to capture their reactions regarding the level of realism present
NASA Astrophysics Data System (ADS)
Nir, A.; Doughty, C.; Tsang, C. F.
Validation methods which developed in the context of deterministic concepts of past generations often cannot be directly applied to environmental problems, which may be characterized by limited reproducibility of results and highly complex models. Instead, validation is interpreted here as a series of activities, including both theoretical and experimental tests, designed to enhance our confidence in the capability of a proposed model to describe some aspect of reality. We examine the validation process applied to a project concerned with heat and fluid transport in porous media, in which mathematical modeling, simulation, and results of field experiments are evaluated in order to determine the feasibility of a system for seasonal thermal energy storage in shallow unsaturated soils. Technical details of the field experiments are not included, but appear in previous publications. Validation activities are divided into three stages. The first stage, carried out prior to the field experiments, is concerned with modeling the relevant physical processes, optimization of the heat-exchanger configuration and the shape of the storage volume, and multi-year simulation. Subjects requiring further theoretical and experimental study are identified at this stage. The second stage encompasses the planning and evaluation of the initial field experiment. Simulations are made to determine the experimental time scale and optimal sensor locations. Soil thermal parameters and temperature boundary conditions are estimated using an inverse method. Then results of the experiment are compared with model predictions using different parameter values and modeling approximations. In the third stage, results of an experiment performed under different boundary conditions are compared to predictions made by the models developed in the second stage. Various aspects of this theoretical and experimental field study are described as examples of the verification and validation procedure. 
There is no attempt to validate a specific model, but several models of increasing complexity are compared with experimental results. The outcome is interpreted as a demonstration of the paradigm proposed by van der Heijde [26]: different constituencies have different objectives for the validation process, and therefore their acceptance criteria also differ.
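The inverse method used in the second stage to estimate soil thermal parameters can be caricatured as a one-parameter least-squares fit; the exponential cooling model, parameter name and sensor readings below are all hypothetical, chosen only to show the mechanics:

```python
import math

# Toy inverse method: recover a lumped cooling-rate parameter `a` in
# T(t) = T_inf + (T0 - T_inf) * exp(-a * t) from noisy observations
# by minimising the sum of squared residuals over a parameter grid.
def sse(a, times, obs, T0=40.0, T_inf=15.0):
    return sum((T_inf + (T0 - T_inf) * math.exp(-a * t) - y) ** 2
               for t, y in zip(times, obs))

times = [0, 1, 2, 4, 8]                      # days since storage charging
obs = [40.0, 35.2, 31.5, 26.1, 19.5]         # synthetic sensor readings, deg C
best_a = min((i / 1000.0 for i in range(1, 1001)),
             key=lambda a: sse(a, times, obs))
print(round(best_a, 3))
```

Real inverse problems in heat transport fit several parameters against a PDE model, but the structure is the same: a forward model, a misfit function, and a search over parameter space.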
F-16XL-2 Supersonic Laminar Flow Control Flight Test Experiment
NASA Technical Reports Server (NTRS)
Anders, Scott G.; Fischer, Michael C.
1999-01-01
The F-16XL-2 Supersonic Laminar Flow Control Flight Test Experiment was part of the NASA High-Speed Research Program. The goal of the experiment was to demonstrate extensive laminar flow, to validate computational fluid dynamics (CFD) codes and design methodology, and to establish laminar flow control design criteria. Topics include the flight test hardware and design, airplane modification, the pressure and suction distributions achieved, the laminar flow achieved, and the data analysis and code correlation.
Analytical procedure validation and the quality by design paradigm.
Rozet, Eric; Lebrun, Pierre; Michiels, Jean-François; Sondag, Perceval; Scherder, Tara; Boulanger, Bruno
2015-01-01
Since the adoption of the ICH Q8 document concerning the development of pharmaceutical processes following a quality by design (QbD) approach, there have been many discussions on the opportunity for analytical procedure developments to follow a similar approach. While development and optimization of analytical procedure following QbD principles have been largely discussed and described, the place of analytical procedure validation in this framework has not been clarified. This article aims at showing that analytical procedure validation is fully integrated into the QbD paradigm and is an essential step in developing analytical procedures that are effectively fit for purpose. Adequate statistical methodologies have also their role to play: such as design of experiments, statistical modeling, and probabilistic statements. The outcome of analytical procedure validation is also an analytical procedure design space, and from it, control strategy can be set.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rhinefrank, Kenneth E.; Lenee-Bluhm, Pukha; Prudell, Joseph H.
The most prudent path to a full-scale design, build and deployment of a wave energy conversion (WEC) system involves establishment of validated numerical models using physical experiments in a methodical scaling program. This Project provides essential additional rounds of wave tank testing at 1:33 scale and ocean/bay testing at a 1:7 scale, necessary to validate numerical modeling that is essential to a utility-scale WEC design and associated certification.
Quasi-experimental study designs series-paper 6: risk of bias assessment.
Waddington, Hugh; Aloe, Ariel M; Becker, Betsy Jane; Djimeu, Eric W; Hombrados, Jorge Garcia; Tugwell, Peter; Wells, George; Reeves, Barney
2017-09-01
Rigorous and transparent bias assessment is a core component of high-quality systematic reviews. We assess modifications to existing risk of bias approaches to incorporate rigorous quasi-experimental approaches with selection on unobservables. These are nonrandomized studies using design-based approaches to control for unobservable sources of confounding such as difference studies, instrumental variables, interrupted time series, natural experiments, and regression-discontinuity designs. We review existing risk of bias tools. Drawing on these tools, we present domains of bias and suggest directions for evaluation questions. The review suggests that existing risk of bias tools provide, to different degrees, incomplete transparent criteria to assess the validity of these designs. The paper then presents an approach to evaluating the internal validity of quasi-experiments with selection on unobservables. We conclude that tools for nonrandomized studies of interventions need to be further developed to incorporate evaluation questions for quasi-experiments with selection on unobservables. Copyright © 2017 Elsevier Inc. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cook, Neville G.W.; Heuze, Francois E.; Miller, Hamish D.S.
1993-03-01
The reference design for the underground facilities at the Waste Isolation Pilot Plant was developed using the best criteria available at initiation of the detailed design effort. These design criteria are contained in the US Department of Energy document titled Design Criteria, Waste Isolation Pilot Plant (WIPP), Revised Mission Concept-IIA (RMC-IIA), Rev. 4, dated February 1984. The validation process described in the Design Validation Final Report has resulted in validation of the reference design of the underground openings based on these criteria. Future changes may necessitate modification of the Design Criteria document and/or the reference design. Validation of the reference design as presented in this report permits the consideration of future design or design criteria modifications necessitated by these changes or by experience gained at the WIPP. Any future modifications to the design criteria and/or the reference design will be governed by a DOE Standard Operating Procedure (SOP) covering underground design changes. This procedure will explain the process to be followed in describing, evaluating and approving the change.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Erickson, Paul A.; Liao, Chang-hsien
2007-11-15
A passive flow disturbance has been proven to enhance the conversion of fuel in a methanol-steam reformer. This study presents a statistical validation of the experiment based on a standard 2^k factorial experiment design and the resulting empirical model of the enhanced hydrogen producing process. A factorial experiment design was used to statistically analyze the effects and interactions of various input factors in the experiment. Three input factors, including the number of flow disturbers, catalyst size, and reactant flow rate, were investigated for their effects on the fuel conversion in the steam-reformation process. Based on the experimental results, an empirical model was developed and further evaluated with an uncertainty analysis and interior point data.
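Main-effect estimation in a 2^k factorial design, as used in this study, can be sketched as follows; the factor names echo the abstract, but the response model and its coefficients are invented for illustration:

```python
from itertools import product

# Full 2^3 factorial in coded levels -1/+1.
# Hypothetical factors: A = flow disturbers, B = catalyst size, C = flow rate.
runs = list(product((-1, 1), repeat=3))
# Synthetic, noise-free conversions: baseline + A and C effects + AC interaction.
response = {r: 60 + 8 * r[0] - 3 * r[2] + 1 * r[0] * r[2] for r in runs}

def main_effect(factor):
    """Difference between mean responses at the high and low levels."""
    hi = [response[r] for r in runs if r[factor] == 1]
    lo = [response[r] for r in runs if r[factor] == -1]
    return sum(hi) / len(hi) - sum(lo) / len(lo)

print([round(main_effect(i), 1) for i in range(3)])   # → [16.0, 0.0, -6.0]
```

Note that the AC interaction cancels out of both main effects because the design is balanced; estimating interactions uses the same contrast idea applied to the product column.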
1988-05-01
affordable manpower investment. On the basis of our current experience it seems that the basic design principles are valid. The system developed will... system is operational on various computer networks, and in both industrial and research environments. The design principles for the construction of...to a useful numerical simulation and design system for very complex configurations and flows. 7. REFERENCES 1. Bartlett G. W., "An experimental
Forensic Uncertainty Quantification of Explosive Dispersal of Particles
NASA Astrophysics Data System (ADS)
Hughes, Kyle; Park, Chanyoung; Haftka, Raphael; Kim, Nam-Ho
2017-06-01
In addition to the numerical challenges of simulating the explosive dispersal of particles, validation of the simulation is often plagued by poor knowledge of the experimental conditions. The level of experimental detail required for validation is beyond what is usually included in the literature. This presentation proposes the use of forensic uncertainty quantification (UQ) to investigate validation-quality experiments to discover possible sources of uncertainty that may have been missed in the initial design of experiments or under-reported. The authors' experience has been that making an analogy to crime scene investigation when examining validation experiments can yield valuable insights. One examines all the data and documentation provided by the validation experimentalists, corroborates evidence, and quantifies large sources of uncertainty a posteriori with empirical measurements. In addition, it is proposed that forensic UQ may benefit from an independent investigator, to help remove possible implicit biases and increase the likelihood of discovering unrecognized uncertainty. Forensic UQ concepts will be discussed and then applied to a set of validation experiments performed at Eglin Air Force Base. This work was supported in part by the U.S. Department of Energy, National Nuclear Security Administration, Advanced Simulation and Computing Program.
Neutron calibration sources in the Daya Bay experiment
Liu, J.; Carr, R.; Dwyer, D. A.; ...
2015-07-09
We describe the design and construction of the low rate neutron calibration sources used in the Daya Bay Reactor Anti-neutrino Experiment. Such sources are free of correlated gamma-neutron emission, which is essential in minimizing induced background in the anti-neutrino detector. Thus, the design characteristics have been validated in the Daya Bay anti-neutrino detector.
A New Approach to Public Speaking Course in ESL Classroom
ERIC Educational Resources Information Center
Hou, Minghua
2008-01-01
This paper is a project report on the experiment of an English public speaking and debating course with advanced level English majors in College of Arts and Science, Yangtze University. The paper analyzes the validity of the course, introduces the design rationale, the design and experiment process, and students' responses. The paper suggests that…
CFD Modeling Needs and What Makes a Good Supersonic Combustion Validation Experiment
NASA Technical Reports Server (NTRS)
Gaffney, Richard L., Jr.; Cutler, Andrew D.
2005-01-01
If a CFD code/model developer is asked what experimental data he wants to validate his code or numerical model, his answer will be: "Everything, everywhere, at all times." Since this is not possible, practical, or even reasonable, the developer must understand what can be measured within the limits imposed by the test article, the test location, the test environment and the available diagnostic equipment. At the same time, it is important for the experimentalist/diagnostician to understand what the CFD developer needs (as opposed to wants) in order to conduct a useful CFD validation experiment. If these needs are not known, it is possible to neglect easily measured quantities at locations needed by the developer, rendering the data set useless for validation purposes. It is also important for the experimentalist/diagnostician to understand what the developer is trying to validate so that the experiment can be designed to isolate (as much as possible) the effects of the particular physical phenomenon that is associated with the model to be validated. The probability of a successful validation experiment can be greatly increased if the two groups work together, each understanding the needs and limitations of the other.
NASA Technical Reports Server (NTRS)
Celaya, Jose R.; Saha, Sankalita; Goebel, Kai
2011-01-01
Accelerated aging methodologies for electrolytic components have been designed and accelerated aging experiments have been carried out. The methodology is based on imposing electrical and/or thermal overstresses via electrical power cycling in order to mimic real-world operating behavior. Data are collected in-situ and offline in order to periodically characterize the devices' electrical performance as they age. The data generated through these experiments are meant to provide capability for the validation of prognostic algorithms (both model-based and data-driven). Furthermore, the data allow validation of physics-based and empirically based degradation models for this type of capacitor. A first set of models and algorithms has been designed and tested on the data.
NASA Technical Reports Server (NTRS)
Generazio, Edward R.
2015-01-01
Directed Design of Experiments for Validating Probability of Detection Capability of NDE Systems (DOEPOD), Manual v.1.2. The capability of an inspection system is established by applying various methodologies to determine the probability of detection (POD). One accepted metric of an adequate inspection system is that there is 95% confidence that the POD is greater than 90% (90/95 POD). Design of experiments for validating probability of detection capability of nondestructive evaluation (NDE) systems (DOEPOD) is a methodology that is implemented via software to serve as a diagnostic tool providing detailed analysis of POD test data, guidance on establishing data distribution requirements, and resolution of test issues. DOEPOD relies on observed detection occurrences. The DOEPOD capability has been developed to provide an efficient and accurate methodology that yields observed POD and confidence bounds for both hit-miss and signal-amplitude testing. DOEPOD does not assume prescribed POD logarithmic or similar functions with assumed adequacy over a wide range of flaw sizes and inspection system technologies, so multi-parameter curve fitting or model optimization approaches to generate a POD curve are not required. DOEPOD applications for supporting inspector qualifications are included.
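The 90/95 POD criterion has a well-known closed form when every demonstration flaw is detected: the Clopper-Pearson lower confidence bound reduces to (1 - confidence)^(1/n), which yields the classic 29-of-29-hits requirement. A sketch of that bound (not the DOEPOD software itself):

```python
def pod_lower_bound_all_hits(n, confidence=0.95):
    """Clopper-Pearson lower confidence bound on POD when all n
    demonstration flaws are detected (x = n hits, no misses)."""
    return (1.0 - confidence) ** (1.0 / n)

# 28 consecutive hits fall just short of 90/95; 29 hits demonstrate it.
for n in (28, 29):
    print(n, round(pod_lower_bound_all_hits(n), 4))
```

Partial-hit cases require the general Clopper-Pearson bound (a beta-distribution quantile), which is where software support such as DOEPOD becomes useful.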
Nuclear Energy Knowledge and Validation Center (NEKVaC) Needs Workshop Summary Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gougar, Hans
2015-02-01
The Department of Energy (DOE) has made significant progress in developing simulation tools to predict the behavior of nuclear systems with greater accuracy and in increasing our capability to predict the behavior of these systems outside of the standard range of applications. These analytical tools require a more complex array of validation tests to accurately simulate the physics and multiple length and time scales. Results from modern simulations will allow experiment designers to narrow the range of conditions needed to bound system behavior and to optimize the deployment of instrumentation to limit the breadth and cost of the campaign. Modern validation, verification and uncertainty quantification (VVUQ) techniques enable analysts to extract information from experiments in a systematic manner and provide the users with a quantified uncertainty estimate. Unfortunately, the capability to perform experiments that would enable taking full advantage of the formalisms of these modern codes has progressed relatively little (with some notable exceptions in fuels and thermal-hydraulics); the majority of the experimental data available today is the "historic" data accumulated over the last decades of nuclear systems R&D. A validated code-model is a tool for users. An unvalidated code-model is useful for code developers to gain understanding, publish research results, attract funding, etc. As nuclear analysis codes have become more sophisticated, so have the measurement and validation methods and the challenges that confront them. A successful yet cost-effective validation effort requires expertise possessed only by a few, resources possessed only by the well-capitalized (or a willing collective), and a clear, well-defined objective (validating a code that is developed to satisfy the need(s) of an actual user).
To that end, the Idaho National Laboratory established the Nuclear Energy Knowledge and Validation Center to address the challenges of modern code validation and to manage the knowledge from past, current, and future experimental campaigns. By pulling together the best minds involved in code development, experiment design, and validation to establish and disseminate best practices and new techniques, the Nuclear Energy Knowledge and Validation Center (NEKVaC or the ‘Center’) will be a resource for the validation efforts of industry, DOE programs, and academia.
Design of experiments in medical physics: Application to the AAA beam model validation.
Dufreneix, S; Legrand, C; Di Bartolo, C; Bremaud, M; Mesgouez, J; Tiplica, T; Autret, D
2017-09-01
The purpose of this study is to evaluate the usefulness of the design of experiments in the analysis of multiparametric problems related to quality assurance in radiotherapy. The main motivation is to use this statistical method to optimize the quality assurance processes in the validation of beam models. Considering the Varian Eclipse system, eight parameters with several levels were selected: energy, MLC, depth, X, Y1 and Y2 jaw dimensions, wedge and wedge jaw. A Taguchi table was used to define 72 validation tests. Measurements were conducted in water using a CC04 on a TrueBeam STx, a TrueBeam Tx, a Trilogy and a 2300IX accelerator matched by the vendor. Dose was computed using the AAA algorithm. The same raw data were used for all accelerators during the beam modelling. The mean difference between computed and measured doses was 0.1±0.5% for all beams and all accelerators, with a maximum difference of 2.4% (under the 3% tolerance level). For all beams, the measured doses were within 0.6% for all accelerators. The energy was found to be an influencing parameter, but the deviations observed were smaller than 1% and not considered clinically significant. Designs of experiments can help define the optimal measurement set to validate a beam model. The proposed method can be used to identify the prognostic factors of dose accuracy. The beam models were validated for the 4 accelerators, which were found dosimetrically equivalent even though the accelerator characteristics differ. Copyright © 2017 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
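A level-means analysis of the kind that flags energy as an influencing parameter can be sketched as follows; the results table is entirely hypothetical and much smaller than the study's 72 tests:

```python
import statistics

# Hypothetical validation table: (energy, depth_cm, wedge, dose_diff_percent).
results = [
    ("6X", 5, False, 0.1), ("6X", 10, False, -0.2), ("6X", 5, True, 0.3),
    ("15X", 5, False, 0.6), ("15X", 10, True, 0.9), ("15X", 10, False, 0.7),
]

def level_means(table, index):
    """Mean computed-vs-measured deviation per level of one design factor."""
    levels = {}
    for row in table:
        levels.setdefault(row[index], []).append(row[-1])
    return {lvl: round(statistics.fmean(v), 2) for lvl, v in levels.items()}

print(level_means(results, 0))   # differing means flag an influencing factor
print(all(abs(row[-1]) < 3.0 for row in results))   # 3% tolerance check
```

Comparing per-level means for each of the eight parameters is what lets a designed experiment separate influencing factors from noise with far fewer runs than a full grid.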
Hubert, C; Houari, S; Rozet, E; Lebrun, P; Hubert, Ph
2015-05-22
When using an analytical method, defining an analytical target profile (ATP) focused on quantitative performance represents a key input, and this will drive the method development process. In this context, two case studies were selected in order to demonstrate the potential of a quality-by-design (QbD) strategy when applied to two specific phases of the method lifecycle: the pre-validation study and the validation step. The first case study focused on the improvement of a liquid chromatography (LC) coupled to mass spectrometry (MS) stability-indicating method by the means of the QbD concept. The design of experiments (DoE) conducted during the optimization step (i.e. determination of the qualitative design space (DS)) was performed a posteriori. Additional experiments were performed in order to simultaneously conduct the pre-validation study to assist in defining the DoE to be conducted during the formal validation step. This predicted protocol was compared to the one used during the formal validation. A second case study based on the LC/MS-MS determination of glucosamine and galactosamine in human plasma was considered in order to illustrate an innovative strategy allowing the QbD methodology to be incorporated during the validation phase. An operational space, defined by the qualitative DS, was considered during the validation process rather than a specific set of working conditions as conventionally performed. Results of all the validation parameters conventionally studied were compared to those obtained with this innovative approach for glucosamine and galactosamine. Using this strategy, qualitative and quantitative information were obtained. Consequently, an analyst using this approach would be able to select with great confidence several working conditions within the operational space rather than a given condition for the routine use of the method. This innovative strategy combines both a learning process and a thorough assessment of the risk involved. 
Copyright © 2015 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Wang, Huijun; Qu, Zheng; Tang, Shaofei; Pang, Mingqi; Zhang, Mingju
2017-08-01
In this paper, the electromagnetic design and permanent magnet shape optimization of a permanent magnet synchronous generator with hybrid excitation are investigated. Based on the generator structure and operating principle, a design outline is presented for obtaining high efficiency and low voltage fluctuation. To enable rapid design, equivalent magnetic circuits for the permanent magnet and iron poles are developed, and finite element analysis is employed alongside them. Furthermore, by means of the design of experiments (DOE) method, the permanent magnet shape is optimized to reduce voltage waveform distortion. Finally, the validity of the proposed design methods is confirmed by analytical and experimental results.
Experiences in integrating auto-translated state-chart designs for model checking
NASA Technical Reports Server (NTRS)
Pingree, P. J.; Benowitz, E. G.
2003-01-01
In the complex environment of JPL's flight missions with increasing dependency on advanced software designs, traditional software validation methods of simulation and testing are being stretched to adequately cover the needs of software development.
Moreno, Javier; Clotet, Eduard; Lupiañez, Ruben; Tresanchez, Marcel; Martínez, Dani; Pallejà, Tomàs; Casanovas, Jordi; Palacín, Jordi
2016-10-10
This paper presents the design, implementation, and validation of the three-wheel holonomic motion system of a mobile robot designed to operate in homes. The holonomic motion system is described in terms of mechanical design and electronic control. The paper analyzes the kinematics of the motion system and validates the trajectory estimation by comparing the displacement estimated from the internal odometry of the motors with the displacement estimated by a SLAM procedure based on LIDAR information. Results obtained in different experiments show a difference of less than 30 mm between the positions estimated with SLAM and with odometry, and a difference in the angular orientation of the mobile robot of less than 5° for absolute displacements of up to 1000 mm.
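As a rough illustration of the inverse kinematics such a three-wheel holonomic platform uses, the sketch below maps a desired body velocity (vx, vy, ω) to individual wheel speeds. The wheel angles, the centre-to-wheel radius R, and the sign convention are assumptions for illustration, not the paper's actual robot geometry.

```python
# Sketch of three-wheel holonomic (omnidirectional) inverse kinematics.
# Wheel placement angles and radius are illustrative assumptions, not the
# paper's actual robot geometry.
import math

WHEEL_ANGLES = [0.0, 2 * math.pi / 3, 4 * math.pi / 3]  # wheels 120 deg apart
R = 0.15  # distance from robot centre to each wheel (m), assumed

def wheel_speeds(vx, vy, omega):
    """Linear speed of each omni wheel for a body velocity (vx, vy, omega)."""
    return [-math.sin(a) * vx + math.cos(a) * vy + R * omega
            for a in WHEEL_ANGLES]

# Pure rotation: every wheel turns at the same speed R * omega.
print([round(s, 3) for s in wheel_speeds(0.0, 0.0, 1.0)])
```

Odometry then integrates the forward (inverse of this) mapping over time, which is what the paper compares against the SLAM estimate.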
Methodology Series Module 9: Designing Questionnaires and Clinical Record Forms - Part II.
Setia, Maninder Singh
2017-01-01
This article is a continuation of the previous module on designing questionnaires and clinical record forms, in which we discussed some basic points about designing questionnaires and clinical record forms. In this section, we will discuss the reliability and validity of questionnaires. The different types of validity are face validity, content validity, criterion validity, and construct validity. The different types of reliability are test-retest reliability, inter-rater reliability, and intra-rater reliability. Some of these parameters are assessed by subject area experts. However, statistical tests should be used for evaluation of other parameters. Once the questionnaire has been designed, the researcher should pilot test it. The items in the questionnaire should be revised based on the feedback from the pilot study participants and the researcher's experience. After the basic structure of the questionnaire has been finalized, the researcher should assess the validity and reliability of the questionnaire or the scale. If an existing standard questionnaire is translated into the local language, the researcher should assess the reliability and validity of the translated questionnaire, and these values should be presented in the manuscript. The decision to use a self- or interviewer-administered, paper- or computer-based questionnaire depends on the nature of the questions, literacy levels of the target population, and resources.
VDA, a Method of Choosing a Better Algorithm with Fewer Validations
Kluger, Yuval
2011-01-01
The multitude of bioinformatics algorithms designed for performing a particular computational task presents end-users with the problem of selecting the most appropriate computational tool for analyzing their biological data. The choice of the best available method is often based on expensive experimental validation of the results. We propose an approach to designing validation sets for method comparison and performance assessment that are effective in terms of cost and discrimination power. Validation Discriminant Analysis (VDA) is a method for designing a minimal validation dataset that allows reliable comparisons between the performances of different algorithms. VDA achieves this reduction in validation effort by selecting predictions that maximize the minimum Hamming distance between algorithmic predictions in the validation set. We show that VDA can be used to correctly rank algorithms according to their performances. These results are further supported by simulations and by realistic algorithmic comparisons in silico. VDA is a novel, cost-efficient method for minimizing the number of validation experiments necessary for reliable performance estimation and fair comparison between algorithms. Our VDA software is available at http://sourceforge.net/projects/klugerlab/files/VDA/ PMID:22046256
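The max-min Hamming selection idea can be sketched as a greedy loop. This toy implementation and its binary prediction matrix are illustrative assumptions, not the published VDA code, which handles larger prediction sets and richer scoring.

```python
# Sketch of the max-min Hamming idea behind VDA: greedily pick prediction
# items that keep the candidate algorithms' answer vectors as mutually
# distinguishable as possible. Toy binary predictions, not the real tool.
def hamming(a, b):
    return sum(x != y for x, y in zip(a, b))

def select_validation_set(predictions, k):
    """Greedily choose k items (columns) maximizing the minimum pairwise
    Hamming distance between algorithms restricted to the chosen items."""
    n_items = len(predictions[0])
    chosen = []
    for _ in range(k):
        best_item, best_score = None, -1
        for j in range(n_items):
            if j in chosen:
                continue
            cols = chosen + [j]
            rows = [[p[c] for c in cols] for p in predictions]
            score = min(hamming(rows[i], rows[l])
                        for i in range(len(rows))
                        for l in range(i + 1, len(rows)))
            if score > best_score:
                best_item, best_score = j, score
        chosen.append(best_item)
    return sorted(chosen)

# Three algorithms' predictions over six candidate validation items.
algs = [
    [0, 0, 1, 1, 0, 1],
    [0, 1, 1, 0, 0, 1],
    [1, 0, 0, 1, 1, 1],
]
print(select_validation_set(algs, 2))
```

Validating only the selected items is what keeps the experimental cost low while preserving discrimination between methods.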
Development and validation of the crew-station system-integration research facility
NASA Technical Reports Server (NTRS)
Nedell, B.; Hardy, G.; Lichtenstein, T.; Leong, G.; Thompson, D.
1986-01-01
The various issues associated with the use of integrated flight management systems in aircraft are discussed. To address these issues, a fixed-base integrated flight research (IFR) simulation of a helicopter was developed to support experiments that contribute to the understanding of design criteria for rotorcraft cockpits incorporating advanced integrated flight management systems. A validation experiment was conducted that demonstrates the main features of the facility and its capability to support crew/system integration research.
NASA Astrophysics Data System (ADS)
Nishida, R. T.; Beale, S. B.; Pharoah, J. G.; de Haart, L. G. J.; Blum, L.
2018-01-01
This work is among the first where the results of an extensive experimental research programme are compared to performance calculations of a comprehensive computational fluid dynamics model for a solid oxide fuel cell stack. The model, which combines electrochemical reactions with momentum, heat, and mass transport, is used to obtain results for an established industrial-scale fuel cell stack design with complex manifolds. To validate the model, comparisons with experimentally gathered voltage and temperature data are made for the Jülich Mark-F, 18-cell stack operating in a test furnace. Good agreement is obtained between the model and experiment results for cell voltages and temperature distributions, confirming the validity of the computational methodology for stack design. The transient effects during ramp up of current in the experiment may explain a lower average voltage than model predictions for the power curve.
Design and Validation of an Augmented Reality System for Laparoscopic Surgery in a Real Environment
López-Mir, F.; Naranjo, V.; Fuertes, J. J.; Alcañiz, M.; Bueno, J.; Pareja, E.
2013-01-01
Purpose. This work presents the protocol carried out in the development and validation of an augmented reality system which was installed in an operating theatre to help surgeons with trocar placement during laparoscopic surgery. The purpose of this validation is to demonstrate the improvements that this system can provide to the field of medicine, particularly surgery. Method. Two experiments that were noninvasive for both the patient and the surgeon were designed. In one of these experiments the augmented reality system was used; the other was the control experiment, in which the system was not used. The type of operation selected for all cases was a cholecystectomy, due to the low degree of complexity and complications before, during, and after the surgery. The technique used in the placement of trocars was the French technique, but the results can be extrapolated to any other technique and operation. Results and Conclusion. Four clinicians and ninety-six measurements obtained from twenty-four patients (randomly assigned to each experiment) were involved in these experiments. The final results show an improvement in accuracy and variability of 33% and 63%, respectively, in comparison with traditional methods, demonstrating that the use of an augmented reality system offers advantages for trocar placement in laparoscopic surgery. PMID:24236293
Development and psychometric testing of the rural pregnancy experience scale (RPES).
Kornelsen, Jude; Stoll, Kathrin; Grzybowski, Stefan
2011-01-01
Rural pregnant women who lack local access to maternity care due to their remote living circumstances may experience stress and anxiety related to pregnancy and parturition. The Rural Pregnancy Experience Scale (RPES) was designed to assess the unique worries and concerns reflective of the stress and anxiety of rural pregnant women related to pregnancy and parturition. The items of the scale were designed based on the results of a qualitative study of the experiences of pregnant rural women, thereby building a priori content validity into the measure. The relevancy content validity index (CVI) for this instrument was 1.0 and the clarity CVI was .91, as rated by maternity care specialists. A field test of the RPES with 187 pregnant rural women from British Columbia indicated that it had two factors, financial worries and worries/concerns about maternity care services, which were consistent with the conceptual base of the tool. Cronbach's alpha for the total RPES was .91; for the financial worries subscale and the worries/concerns about maternity care services subscale, alphas were .89 and .88, respectively. Construct validity was supported by significant correlations between the total scores of the RPES and the Depression Anxiety Stress Scales (DASS; r = .39, p < .01), and subscale scores on the RPES were significantly correlated and converged with the depression, anxiety, and stress subscales of the DASS, supporting convergent validity (correlations ranged between .20, p < .05, and .43, p < .01). Construct validity was also supported by findings that the level of access to and availability of maternity care services were significantly associated with RPES scores. It was concluded that the RPES is a reliable and valid measure of worries and concerns reflective of rural pregnant women's stress and anxiety related to pregnancy and parturition.
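As a reminder of how the internal-consistency statistic reported above is computed, here is a minimal sketch of Cronbach's alpha; the item scores are invented for illustration and are not the RPES data.

```python
# Sketch of the Cronbach's alpha computation used to report internal
# consistency; the item scores below are made up for illustration.
from statistics import pvariance

def cronbach_alpha(items):
    """items: one list of scores per questionnaire item (columns = people)."""
    k = len(items)
    item_vars = sum(pvariance(scores) for scores in items)
    totals = [sum(resp) for resp in zip(*items)]  # each respondent's total
    return k / (k - 1) * (1 - item_vars / pvariance(totals))

# Three items answered by four respondents.
items = [
    [3, 4, 2, 5],
    [3, 5, 1, 4],
    [2, 4, 2, 5],
]
print(round(cronbach_alpha(items), 2))
```

Alpha rises when items covary (respondents' totals spread more than the summed per-item variances), which is what "internal consistency" quantifies.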
Students' Self-Evaluation and Reflection (Part 1): "Measurement"
ERIC Educational Resources Information Center
Cambra-Fierro, Jesus; Cambra-Berdun, Jesus
2007-01-01
Purpose: The objective of the paper is the development and validation of scales to assess reflective learning. Design/methodology/approach: The research is based on a literature review plus in-classroom experience. For the scale validation process, exploratory and confirmatory analyses were conducted, following proposals made by Anderson and…
National Transonic Facility Wall Pressure Calibration Using Modern Design of Experiments (Invited)
NASA Technical Reports Server (NTRS)
Underwood, Pamela J.; Everhart, Joel L.; DeLoach, Richard
2001-01-01
The Modern Design of Experiments (MDOE) has been applied to wind tunnel testing at NASA Langley Research Center for several years. At Langley, MDOE has proven to be a useful and robust approach to aerodynamic testing that yields significant reductions in the cost and duration of experiments while still providing for the highest quality research results. This paper extends its application to include empty tunnel wall pressure calibrations. These calibrations are performed in support of wall interference corrections. This paper will present the experimental objectives, and the theoretical design process. To validate the tunnel-empty-calibration experiment design, preliminary response surface models calculated from previously acquired data are also presented. Finally, lessons learned and future wall interference applications of MDOE are discussed.
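A response surface model of the kind mentioned above can be sketched as an ordinary least-squares quadratic fit. The one-variable form, the sample points, and the coefficients below are illustrative assumptions, not NTF calibration data (the real MDOE models are multivariate).

```python
# Sketch of fitting a quadratic response surface y ~ b0 + b1*x + b2*x^2 by
# ordinary least squares (normal equations solved with naive Gaussian
# elimination). Data points are illustrative, not tunnel measurements.
def fit_quadratic(xs, ys):
    # Build normal equations A b = c for the design matrix [1, x, x^2].
    rows = [[1.0, x, x * x] for x in xs]
    A = [[sum(r[i] * r[j] for r in rows) for j in range(3)] for i in range(3)]
    c = [sum(r[i] * y for r, y in zip(rows, ys)) for i in range(3)]
    # Naive Gaussian elimination, then back-substitution.
    for i in range(3):
        p = A[i][i]
        for j in range(i + 1, 3):
            f = A[j][i] / p
            A[j] = [a - f * b for a, b in zip(A[j], A[i])]
            c[j] -= f * c[i]
    b = [0.0, 0.0, 0.0]
    for i in (2, 1, 0):
        b[i] = (c[i] - sum(A[i][j] * b[j] for j in range(i + 1, 3))) / A[i][i]
    return b

# Responses generated from y = 2 + 0.5*x - 0.1*x^2 (no noise) are recovered.
xs = [-2.0, -1.0, 0.0, 1.0, 2.0, 3.0]
ys = [2 + 0.5 * x - 0.1 * x * x for x in xs]
b0, b1, b2 = fit_quadratic(xs, ys)
print(round(b0, 3), round(b1, 3), round(b2, 3))
```

In an MDOE campaign the fitted surface, rather than the raw point set, becomes the calibration product, which is why far fewer runs are needed.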
U.S. perspective on technology demonstration experiments for adaptive structures
NASA Technical Reports Server (NTRS)
Aswani, Mohan; Wada, Ben K.; Garba, John A.
1991-01-01
Evaluation of design concepts for adaptive structures is being performed in support of several focused research programs. These include programs such as Precision Segmented Reflector (PSR), Control Structure Interaction (CSI), and the Advanced Space Structures Technology Research Experiment (ASTREX). Although not specifically designed for adaptive structure technology validation, relevant experiments can be performed using the Passive and Active Control of Space Structures (PACOSS) testbed, the Space Integrated Controls Experiment (SPICE), the CSI Evolutionary Model (CEM), and the Dynamic Scale Model Test (DSMT) Hybrid Scale. In addition to the ground test experiments, several space flight experiments have been planned, including a reduced gravity experiment aboard the KC-135 aircraft, shuttle middeck experiments, and the Inexpensive Flight Experiment (INFLEX).
Layout, test verification and in-orbit performance of the HELIOS temperature control system
NASA Technical Reports Server (NTRS)
Brungs, W.
1975-01-01
The HELIOS temperature control system is described. The main design features and the impact of interactions among experiment, spacecraft system, and temperature control system requirements on the design are discussed. The major limitations of the thermal design with regard to a closer sun approach are given and related to test experience and performance data obtained in orbit. Finally, the validity of the test results achieved with the prototype and flight spacecraft is evaluated by comparing test data, orbital temperature predictions, and flight data.
Design and validation of the Health Professionals' Attitudes Toward the Homeless Inventory (HPATHI).
Buck, David S; Monteiro, F Marconi; Kneuper, Suzanne; Rochon, Donna; Clark, Dana L; Melillo, Allegra; Volk, Robert J
2005-01-10
Recent literature has called for humanistic care of patients and for medical schools to begin incorporating humanism into medical education. To assess the attitudes of health-care professionals toward homeless patients and to demonstrate how those attitudes might impact optimal care, we developed and validated a new survey instrument, the Health Professional Attitudes Toward the Homeless Inventory (HPATHI). An instrument that measures providers' attitudes toward the homeless could offer meaningful information for the design and implementation of educational activities that foster more compassionate homeless health care. Our intention was to describe the process of designing and validating the new instrument and to discuss the usefulness of the instrument for assessing the impact of educational experiences that involve working directly with the homeless on the attitudes, interest, and confidence of medical students and other health-care professionals. The study consisted of three phases: identifying items for the instrument; pilot testing the initial instrument with a group of 72 third-year medical students; and modifying and administering the instrument in its revised form to 160 health-care professionals and third-year medical students. The instrument was analyzed for reliability and validity throughout the process. A 19-item version of the HPATHI had good internal consistency with a Cronbach's alpha of 0.88 and a test-retest reliability coefficient of 0.69. The HPATHI showed good concurrent validity, and respondents with more than one year of experience with homeless patients scored significantly higher than did those with less experience. Factor analysis yielded three subscales: Personal Advocacy, Social Advocacy, and Cynicism. The HPATHI demonstrated strong reliability for the total scale and satisfactory test-retest reliability. 
Extreme group comparisons suggested that experience with the homeless rather than medical training itself could affect health-care professionals' attitudes toward the homeless. This could have implications for the evaluation of medical school curricula.
NASA Technical Reports Server (NTRS)
Emmons, L. K.; Pfister, G. G.; Edwards, D. P.; Gille, J. C.; Sachse, G.; Blake, D.; Wofsy, S.; Gerbig, C.; Matross, D.; Nedelec, P.
2007-01-01
Measurements of carbon monoxide (CO) made as part of three aircraft experiments during the summer of 2004 over North America have been used for the continued validation of the CO retrievals from the Measurements of Pollution in the Troposphere (MOPITT) instrument on board the Terra satellite. Vertical profiles measured during the NASA INTEX-A campaign, designed to be coincident with MOPITT overpasses, as well as measurements made during the COBRA-2004 and MOZAIC experiments, provided valuable validation comparisons. On average, the MOPITT CO retrievals are biased slightly high for these North America locations. While the mean bias differs between the different aircraft experiments (e.g., 7.0 ppbv for MOZAIC to 18.4 ppbv for COBRA at 700 hPa), the standard deviations are quite large, so the results for the three data sets can be considered consistent. On average, it is estimated that MOPITT is 7-14% high at 700 hPa and 0-3% high at 350 hPa. These results are consistent with the validation results for the Carr, Colorado, Harvard Forest, Massachusetts, and Poker Flats, Alaska, aircraft profiles for "phase 2" presented by Emmons et al. (2004) and are generally within the design criteria of 10% accuracy.
MRPrimer: a MapReduce-based method for the thorough design of valid and ranked primers for PCR
Kim, Hyerin; Kang, NaNa; Chon, Kang-Wook; Kim, Seonho; Lee, NaHye; Koo, JaeHyung; Kim, Min-Soo
2015-01-01
Primer design is a fundamental technique that is widely used for polymerase chain reaction (PCR). Although many methods have been proposed for primer design, they require a great deal of manual effort to generate feasible and valid primers, including homology tests on off-target sequences using BLAST-like tools. That approach becomes inconvenient when the same stringent, allele-invariant constraints must be applied across many target sequences, as in quantitative PCR (qPCR). To address this issue, we propose an entirely new method called MRPrimer that can design all feasible and valid primer pairs existing in a DNA database at once, while simultaneously checking a multitude of filtering constraints and validating primer specificity. Furthermore, MRPrimer suggests the best primer pair for each target sequence, based on a ranking method. Through qPCR analysis using 343 primer pairs and the corresponding sequencing and comparative analyses, we showed that the primer pairs designed by MRPrimer are very stable and effective for qPCR. In addition, MRPrimer is computationally efficient and scalable and therefore useful for quickly constructing an entire collection of feasible and valid primers for frequently updated databases like RefSeq. Furthermore, we suggest that MRPrimer can be utilized conveniently for experiments requiring primer design, especially real-time qPCR. PMID:26109350
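The filter-then-rank workflow MRPrimer automates can be caricatured in a few lines. The GC-content and Wallace-rule melting-temperature constraints, the thresholds, and the toy sequence below are illustrative stand-ins for MRPrimer's much larger constraint set (which also includes specificity checks across the whole database).

```python
# Toy sketch of the filter-then-rank idea: enumerate candidate primers,
# drop those violating simple constraints, rank the rest. Constraints and
# scoring are simplified stand-ins for MRPrimer's full filter set.
def candidates(seq, length):
    return [seq[i:i + length] for i in range(len(seq) - length + 1)]

def gc_content(p):
    return (p.count("G") + p.count("C")) / len(p)

def wallace_tm(p):  # "2-4 rule" melting temperature, a common approximation
    return 2 * (p.count("A") + p.count("T")) + 4 * (p.count("G") + p.count("C"))

def design(seq, length=8, gc=(0.4, 0.6), tm=(20, 28)):
    ok = [p for p in candidates(seq, length)
          if gc[0] <= gc_content(p) <= gc[1] and tm[0] <= wallace_tm(p) <= tm[1]]
    # Rank by distance from 50% GC (closer is better).
    return sorted(ok, key=lambda p: abs(gc_content(p) - 0.5))

target = "ATGCGTACCTGAGGATCCTA"
print(design(target)[:1])
```

MRPrimer's contribution is doing this, plus cross-database specificity filtering, for every target at once via MapReduce rather than one sequence at a time.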
2-D Circulation Control Airfoil Benchmark Experiments Intended for CFD Code Validation
NASA Technical Reports Server (NTRS)
Englar, Robert J.; Jones, Gregory S.; Allan, Brian G.; Lin, John C.
2009-01-01
A current NASA Research Announcement (NRA) project being conducted by Georgia Tech Research Institute (GTRI) personnel and NASA collaborators includes the development of Circulation Control (CC) blown airfoils to improve subsonic aircraft high-lift and cruise performance. The emphasis of this program is the development of CC active flow control concepts for high-lift augmentation, drag control, and cruise efficiency. The project includes work by NASA research engineers; CFD validation and flow-physics experimental research are part of NASA's systematic approach to developing design and optimization tools for CC applications to fixed-wing aircraft. The design space for CESTOL-type aircraft is focusing on geometries that depend on advanced flow control technologies, including Circulation Control aerodynamics. The ability to consistently predict advanced aircraft performance requires improvements in design tools to include these advanced concepts. Validation of these tools will be based on experimental methods applied to complex flows that go beyond conventional aircraft modeling techniques. This paper focuses on recent and ongoing benchmark high-lift experiments and CFD efforts intended to provide 2-D CFD validation data sets related to NASA's Cruise Efficient Short Take Off and Landing (CESTOL) study. Both the experimental data and related CFD predictions are discussed.
Plume Measurement System (PLUMES) Calibration Experiment
1994-08-01
A calibration chamber was designed and built, and its characteristics were ...lished and documented, to apply acoustic technology to monitoring suspended sedi- ... Particles were suspended in the ... The procedures are described in Chapter 2. Repeatability of the experiments and validity of the results are described in Chapter 3. In Chapter 4, the range ... that went into their design. The first two subsections give an overview of the calibration chamber and its characteristics. The remaining subsections describe
SERENITY in Air Traffic Management
NASA Astrophysics Data System (ADS)
Felici, Massimo; Meduri, Valentino; Tedeschi, Alessandra; Riccucci, Carlo
This chapter is concerned with the validation of an implementation of the SERENITY Runtime Framework (SRF) tailored for the Air Traffic Management (ATM) domain. It reports our experience in the design and validation phases of a tool that relies on the SRF in order to support Security and Dependability (S&D) Patterns in work practices. In particular, this chapter pinpoints the activities concerning the identification of S&D Patterns, the design of an ATM prototype, and its validation. The validation activities involve qualitative as well as quantitative approaches. These activities as a whole highlight the validation process for adopting S&D Patterns within the ATM domain. Moreover, they stress how S&D Patterns enhance and relate to critical features within an industry domain. The empirical results point out that S&D Patterns relate to work practices. Furthermore, they highlight the design and validation activities needed to tailor systems relying on S&D Patterns to specific application domains. This strengthens and supports the adoption of S&D Patterns as a means to address AmI (Ambient Intelligence) requirements (e.g., awareness, proactiveness, resilience) within the ATM domain.
Reliability and Validity of a Spanish Version of the Posttraumatic Growth Inventory
ERIC Educational Resources Information Center
Weiss, Tzipi; Berger, Roni
2006-01-01
Objectives. This study was designed to adapt and validate a Spanish translation of the Posttraumatic Growth Inventory (PTGI) for the assessment of positive life changes following the stressful experiences of immigration. Method. A cross-cultural equivalence model was used to pursue semantic, content, conceptual, and technical equivalence.…
Achieving external validity in home advantage research: generalizing crowd noise effects
Myers, Tony D.
2014-01-01
Different factors have been postulated to explain the home advantage phenomenon in sport. One plausible explanation investigated has been the influence of a partisan home crowd on sports officials' decisions. Different types of studies have tested the crowd influence hypothesis, including purposefully designed experiments. However, while experimental studies investigating crowd influences have high levels of internal validity, they suffer from a lack of external validity; decision-making in a laboratory setting bears little resemblance to decision-making in live sports settings. This focused review first considers threats to external validity in applied and theoretical experimental research, then discusses how such threats can be addressed through representative design, focusing on a recently published study that arguably provides the first experimental evidence of the impact of live crowd noise on officials in sport. The findings of this controlled experiment, conducted in a real tournament setting, offer a level of confirmation of the findings of laboratory studies in the area. Finally, directions for future research and the future conduct of crowd noise studies are discussed. PMID:24917839
Ray, Chad A; Patel, Vimal; Shih, Judy; Macaraeg, Chris; Wu, Yuling; Thway, Theingi; Ma, Mark; Lee, Jean W; Desilva, Binodh
2009-02-20
Developing a process that generates robust immunoassays that can be used to support studies with tight timelines is a common challenge for bioanalytical laboratories. Design of experiments (DOE) is a tool that has been used by many industries for the purpose of optimizing processes. The approach is capable of identifying critical factors and their interactions with a minimal number of experiments. The challenge in implementing this tool in the bioanalytical laboratory is to develop a user-friendly approach that scientists can understand and apply. We have successfully addressed these challenges by eliminating the screening design, introducing automation, and applying a simple mathematical approach for the output parameter. A modified central composite design (CCD) was applied to three ligand binding assays. The intra-plate factors selected were coating concentration, detection antibody concentration, and streptavidin-HRP concentration. The inter-plate factors included incubation times for each step. The objective was to maximize the log signal-to-blank ratio (logS/B) of the low standard against the blank. The maximum desirable conditions were determined using JMP 7.0. To verify the validity of the predictions, the logS/B prediction was compared against the observed logS/B during pre-study validation experiments. The three assays were optimized using the multi-factorial DOE. The total error for all three methods was less than 20%, which indicated method robustness. DOE identified interactions in one of the methods. The model predictions for logS/B were within 25% of the observed pre-study validation values for all methods tested. The comparison between the CCD and hybrid screening design yielded comparable parameter estimates. The user-friendly design enables effective application of multi-factorial DOE to optimize ligand binding assays for therapeutic proteins.
The approach allows for identification of interactions between factors, consistency in optimal parameter determination, and reduced method development time.
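To make the optimisation target concrete, the sketch below evaluates a made-up logS/B response over a small factorial grid and picks the best condition. The factor names, levels, and response function are assumptions, not the assay's real behaviour; a real CCD would fit a quadratic model to a star-plus-factorial point set rather than enumerate a grid.

```python
# Sketch of the optimisation target: log(signal/blank) of the low standard,
# evaluated over a small grid of assay conditions. The response function is
# a made-up stand-in for real plate measurements.
import math
from itertools import product

def log_s_over_b(coating, detection, strep_hrp):
    """Illustrative smooth response: signal scales with all reagents while
    background grows with the streptavidin-HRP concentration."""
    signal = coating * detection * strep_hrp
    background = 0.05 + 0.02 * strep_hrp ** 2
    return math.log10(signal / background)

levels = {
    "coating": [0.5, 1.0, 2.0],       # ug/mL, assumed
    "detection": [0.25, 0.5, 1.0],    # ug/mL, assumed
    "strep_hrp": [0.1, 0.2, 0.4],     # dilution factor, assumed
}

grid = list(product(*levels.values()))
best = max(grid, key=lambda c: log_s_over_b(*c))
print(best)
```

The fitted-model route gives the same kind of answer (a predicted optimum plus interaction estimates) from far fewer runs than exhaustive enumeration.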
A design procedure and handling quality criteria for lateral directional flight control systems
NASA Technical Reports Server (NTRS)
Stein, G.; Henke, A. H.
1972-01-01
A practical design procedure for aircraft augmentation systems is described based on quadratic optimal control technology and handling-quality-oriented cost functionals. The procedure is applied to the design of a lateral-directional control system for the F4C aircraft. The design criteria, design procedure, and final control system are validated with a program of formal pilot evaluation experiments.
NASA Technical Reports Server (NTRS)
Schulte, Peter Z.; Moore, James W.
2011-01-01
The Crew Exploration Vehicle Parachute Assembly System (CPAS) project conducts computer simulations to verify that flight performance requirements on parachute loads and terminal rate of descent are met. Design of Experiments (DoE) provides a systematic method for variation of simulation input parameters. When implemented and interpreted correctly, a DoE study of parachute simulation tools indicates values and combinations of parameters that may cause requirement limits to be violated. This paper describes one implementation of DoE that is currently being developed by CPAS, explains how DoE results can be interpreted, and presents the results of several preliminary studies. The potential uses of DoE to validate parachute simulation models and verify requirements are also explored.
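The DoE-for-verification idea described above can be sketched as a factorial sweep that flags requirement violations. The load model, factor names, levels, and limit are invented placeholders, not CPAS's actual simulation or requirement values.

```python
# Sketch of a factorial DoE sweep over simulation inputs that flags
# combinations violating a requirement limit. The load model and limit are
# illustrative, not CPAS's actual simulation or requirements.
from itertools import product

LOAD_LIMIT = 4_000.0  # max allowable parachute load (lbf), assumed

def peak_load(drag_area, deploy_speed, air_density):
    """Toy stand-in for the simulation: dynamic-pressure-style load."""
    return 0.5 * air_density * deploy_speed ** 2 * drag_area

factors = {
    "drag_area": [50.0, 80.0],        # ft^2 levels, assumed
    "deploy_speed": [150.0, 250.0],   # ft/s levels, assumed
    "air_density": [0.0018, 0.0024],  # slug/ft^3 levels, assumed
}

violations = [combo for combo in product(*factors.values())
              if peak_load(*combo) > LOAD_LIMIT]
print(len(violations))
```

Interpreting which factor-level combinations appear in the violating set is the step that, as the abstract notes, points at parameter combinations that may cause requirement limits to be violated.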
Hydrogen-oxygen proton-exchange membrane fuel cells and electrolyzers
NASA Technical Reports Server (NTRS)
Baldwin, R.; Pham, M.; Leonida, A.; Mcelroy, J.; Nalette, T.
1990-01-01
A flight experiment is planned for the validation, in a microgravity environment, of several ground-proven simplification features relating to SPE fuel cells and SPE electrolyzers. With a successful experiment, these features can be incorporated into equipment designs for specific extraterrestrial energy storage applications.
Fundamental arthroscopic skill differentiation with virtual reality simulation.
Rose, Kelsey; Pedowitz, Robert
2015-02-01
The purpose of this study was to investigate the use and validity of virtual reality modules as part of the educational approach to mastering arthroscopy in a safe environment by assessing the ability to distinguish between experience levels. Additionally, the study aimed to evaluate whether experts have greater ambidexterity than do novices. Three virtual reality modules (Swemac/Augmented Reality Systems, Linkoping, Sweden) were created to test fundamental arthroscopic skills. Thirty participants (10 experts consisting of faculty, 10 intermediate participants consisting of orthopaedic residents, and 10 novices consisting of medical students) performed each exercise. Steady and Telescope was designed to train centering and image stability. Steady and Probe was designed to train basic triangulation. Track a Moving Target was designed to train coordinated motions of arthroscope and probe. Metrics reflecting speed, accuracy, and efficiency of motion were used to measure construct validity. Steady and Probe and Track a Moving Target both exhibited construct validity, with better performance by experts and intermediate participants than by novices (P < .05), whereas Steady and Telescope did not show validity. There was an overall trend toward better ambidexterity as a function of greater surgical experience, with experts consistently more proficient than novices throughout all 3 modules. This study represents a new way to assess basic arthroscopy skills using virtual reality modules developed through task deconstruction. Participants with the most arthroscopic experience performed better and were more consistent than novices on all 3 virtual reality modules. Greater arthroscopic experience correlates with more symmetry of ambidextrous performance. However, further adjustment of the modules may better simulate fundamental arthroscopic skills and discriminate between experience levels. Arthroscopy training is a critical element of orthopaedic surgery resident training.
Developing techniques to safely and effectively train these skills is critical for patient safety and resident education.
NASA Astrophysics Data System (ADS)
Andromeda, A.; Lufri; Festiyed; Ellizar, E.; Iryani, I.; Guspatni, G.; Fitri, L.
2018-04-01
This Research & Development study aims to produce a valid and practical experiment-integrated, guided-inquiry-based module on the topic of colloidal chemistry. The 4D instructional design model was selected for this study. A limited trial of the product was conducted at SMAN 7 Padang. The instruments used were validity and practicality questionnaires. Validity and practicality data were analyzed using the Kappa moment. Analysis of the data shows that the Kappa moment for validity was 0.88, indicating a very high degree of validity. The Kappa moments for practicality as rated by students and teachers were 0.89 and 0.95, respectively, indicating a high degree of practicality. Analysis of the modules filled in by students shows that 91.37% of students could correctly answer the critical thinking, exercise, prelab, postlab and worksheet questions asked in the module. These findings indicate that the experiment-integrated, guided-inquiry-based module on the topic of colloidal chemistry is valid and practical for chemistry learning in senior high school.
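The "Kappa moment" agreement statistic reported above is conventionally computed as Cohen's kappa, the chance-corrected agreement between two raters. A minimal sketch, using hypothetical rating counts rather than the study's data:

```python
import numpy as np

def cohens_kappa(table):
    """Cohen's kappa from a square agreement table
    (rows: rater 1's categories, columns: rater 2's)."""
    table = np.asarray(table, dtype=float)
    n = table.sum()
    p_observed = np.trace(table) / n              # proportion of exact agreement
    p_expected = (table.sum(axis=1) * table.sum(axis=0)).sum() / n ** 2
    return (p_observed - p_expected) / (1.0 - p_expected)

# Two validators rating 50 module items as valid/invalid (hypothetical counts):
table = [[20, 2],
         [3, 25]]
kappa = cohens_kappa(table)
print(round(kappa, 2))
```

Values near 1 indicate near-perfect agreement after correcting for chance, which is how thresholds like 0.88 map to "very high validity."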
Hanauer, David I.; Bauerle, Cynthia
2015-01-01
Science, technology, engineering, and mathematics education reform efforts have called for widespread adoption of evidence-based teaching in which faculty members attend to student outcomes through assessment practice. Awareness about the importance of assessment has illuminated the need to understand what faculty members know and how they engage with assessment knowledge and practice. The Faculty Self-Reported Assessment Survey (FRAS) is a new instrument for evaluating science faculty assessment knowledge and experience. Instrument validation was composed of two distinct studies: an empirical evaluation of the psychometric properties of the FRAS and a comparative known-groups validation to explore the ability of the FRAS to differentiate levels of faculty assessment experience. The FRAS was found to be highly reliable (α = 0.96). The dimensionality of the instrument enabled distinction of assessment knowledge into categories of program design, instrumentation, and validation. In the known-groups validation, the FRAS distinguished between faculty groups with differing levels of assessment experience. Faculty members with formal assessment experience self-reported higher levels of familiarity with assessment terms, higher frequencies of assessment activity, increased confidence in conducting assessment, and more positive attitudes toward assessment than faculty members who were novices in assessment. These results suggest that the FRAS can reliably and validly differentiate levels of expertise in faculty knowledge of assessment. PMID:25976653
Dasgupta, Annwesa P.; Anderson, Trevor R.
2014-01-01
It is essential to teach students about experimental design, as this facilitates their deeper understanding of how most biological knowledge was generated and gives them tools to perform their own investigations. Despite the importance of this area, surprisingly little is known about what students actually learn from designing biological experiments. In this paper, we describe a rubric for experimental design (RED) that can be used to measure knowledge of and diagnose difficulties with experimental design. The development and validation of the RED was informed by a literature review and empirical analysis of undergraduate biology students’ responses to three published assessments. Five areas of difficulty with experimental design were identified: the variable properties of an experimental subject; the manipulated variables; measurement of outcomes; accounting for variability; and the scope of inference appropriate for experimental findings. Our findings revealed that some difficulties, documented some 50 yr ago, still exist among our undergraduate students, while others remain poorly investigated. The RED shows great promise for diagnosing students’ experimental design knowledge in lecture settings, laboratory courses, research internships, and course-based undergraduate research experiences. It also shows potential for guiding the development and selection of assessment and instructional activities that foster experimental design. PMID:26086658
Kim, Hyerin; Kang, NaNa; An, KyuHyeon; Koo, JaeHyung; Kim, Min-Soo
2016-01-01
Design of high-quality primers for multiple target sequences is essential for qPCR experiments, but is challenging due to the need to consider both homology tests on off-target sequences and the same stringent filtering constraints on the primers. Existing web servers for primer design have major drawbacks, including requiring the use of BLAST-like tools for homology tests, lack of support for ranking of primers, TaqMan probes, and simultaneous design of primers against multiple targets. Due to the large-scale computational overhead, the few web servers supporting homology tests use heuristic approaches or perform homology tests within a limited scope. Here, we describe MRPrimerW, which performs complete homology testing, supports batch design of primers for multi-target qPCR experiments, supports design of TaqMan probes, and ranks the resulting primers to return the top-1 best primers to the user. To ensure high accuracy, we adopted the core algorithm of a previously reported MapReduce-based method, MRPrimer, but completely redesigned it to allow users to receive query results quickly in a web interface, without requiring a MapReduce cluster or a long computation. MRPrimerW provides primer design services and a complete set of 341 963 135 in silico validated primers covering 99% of human and mouse genes. Free access: http://MRPrimerW.com. PMID:27154272
Reconceptualising the external validity of discrete choice experiments.
Lancsar, Emily; Swait, Joffre
2014-10-01
External validity is a crucial but under-researched topic when considering using discrete choice experiment (DCE) results to inform decision making in clinical, commercial or policy contexts. We present the theory and tests traditionally used to explore external validity that focus on a comparison of final outcomes and review how this traditional definition has been empirically tested in health economics and other sectors (such as transport, environment and marketing) in which DCE methods are applied. While an important component, we argue that the investigation of external validity should be much broader than a comparison of final outcomes. In doing so, we introduce a new and more comprehensive conceptualisation of external validity, closely linked to process validity, that moves us from the simple characterisation of a model as being or not being externally valid on the basis of predictive performance, to the concept that external validity should be an objective pursued from the initial conceptualisation and design of any DCE. We discuss how such a broader definition of external validity can be fruitfully used and suggest innovative ways in which it can be explored in practice.
NASA Astrophysics Data System (ADS)
Nurjanah; Dahlan, J. A.; Wibisono, Y.
2017-02-01
This paper aims to design and develop computer-based e-learning teaching materials for improving the mathematical understanding ability and spatial sense of junior high school students. The specific aims are (1) to produce a teaching material design, an evaluation model, and instruments to measure the mathematical understanding ability and spatial sense of junior high school students; (2) to conduct trials of the computer-based e-learning teaching material model, assessment, and instruments; (3) to finalize the computer-based e-learning teaching material models, assessment, and instruments; and (4) to deliver the resulting research product, the computer-based e-learning teaching materials, in the form of an interactive learning disc. The study used a developmental research method conducted through a thought experiment and an instruction experiment. The results showed that the teaching materials could be used very well, based on validation of the computer-based e-learning teaching materials by 5 multimedia experts. The face and content validity judgments of the 5 validators agreed for each test item on mathematical understanding ability and spatial sense. The reliability coefficients of the mathematical understanding ability and spatial sense tests were 0.929 and 0.939, respectively, which is very high, while the validity of both tests met high or very high criteria.
Zeng, Liang; Proctor, Robert W; Salvendy, Gavriel
2011-06-01
This research is intended to empirically validate a general model of creative product and service development proposed in the literature. A current research gap inspired construction of a conceptual model to capture fundamental phases and pertinent facilitating metacognitive strategies in the creative design process. The model also depicts the mechanism by which design creativity affects consumer behavior. The validity and assets of this model have not yet been investigated. Four laboratory studies were conducted to demonstrate the value of the proposed cognitive phases and associated metacognitive strategies in the conceptual model. Realistic product and service design problems were used in creativity assessment to ensure ecological validity. Design creativity was enhanced by explicit problem analysis, whereby one formulates problems from different perspectives and at different levels of abstraction. Remote association in conceptual combination spawned more design creativity than did near association. Abstraction led to greater creativity in conducting conceptual expansion than did specificity, which induced mental fixation. Domain-specific knowledge and experience enhanced design creativity, indicating that design can be of a domain-specific nature. Design creativity added integrated value to products and services and positively influenced customer behavior. The validity and value of the proposed conceptual model is supported by empirical findings. The conceptual model of creative design could underpin future theory development. Propositions advanced in this article should provide insights and approaches to facilitate organizations pursuing product and service creativity to gain competitive advantage.
Disturbance Reduction Control Design for the ST7 Flight Validation Experiment
NASA Technical Reports Server (NTRS)
Maghami, P. G.; Hsu, O. C.; Markley, F. L.; Houghton, M. B.
2003-01-01
The Space Technology 7 experiment will perform an on-orbit system-level validation of two specific Disturbance Reduction System technologies: a gravitational reference sensor employing a free-floating test mass, and a set of micro-Newton colloidal thrusters. The ST7 Disturbance Reduction System is designed to maintain the spacecraft's position with respect to a free-floating test mass to less than 10 nm/√Hz over the frequency range of 1 to 30 mHz. This paper presents the design and analysis of the coupled, drag-free and attitude control systems that close the loop between the gravitational reference sensor and the micro-Newton thrusters, while incorporating star tracker data at low frequencies. A full 18 degree-of-freedom model, which incorporates rigid-body models of the spacecraft and two test masses, is used to evaluate the effects of actuation and measurement noise and disturbances on the performance of the drag-free system.
Observing System Simulation Experiments: An Overview
NASA Technical Reports Server (NTRS)
Prive, Nikki C.; Errico, Ronald M.
2016-01-01
An overview of Observing System Simulation Experiments (OSSEs) will be given, with focus on calibration and validation of OSSE frameworks. Pitfalls and practice will be discussed, including observation error characteristics, incestuousness, and experimental design. The potential use of OSSEs for investigation of the behaviour of data assimilation systems will be explored, including some results from experiments using the NASA GMAO OSSE.
Hypersonic Experimental and Computational Capability, Improvement and Validation. Volume 2
NASA Technical Reports Server (NTRS)
Muylaert, Jean (Editor); Kumar, Ajay (Editor); Dujarric, Christian (Editor)
1998-01-01
The results of the phase 2 effort conducted under AGARD Working Group 18 on Hypersonic Experimental and Computational Capability, Improvement and Validation are presented in this report. The first volume, published in May 1996, mainly focused on the design methodology, plans and some initial results of experiments that had been conducted to serve as validation benchmarks. The current volume presents the detailed experimental and computational data base developed during this effort.
Experimental and Quasi-Experimental Design.
ERIC Educational Resources Information Center
Cottrell, Edward B.
With an emphasis on the problems of control of extraneous variables and threats to internal and external validity, the arrangement or design of experiments is discussed. The purpose of experimentation in an educational institution, and the principles governing true experimentation (randomization, replication, and control) are presented, as are…
Ouyang, Liwen; Apley, Daniel W; Mehrotra, Sanjay
2016-04-01
Electronic medical record (EMR) databases offer significant potential for developing clinical hypotheses and identifying disease risk associations by fitting statistical models that capture the relationship between a binary response variable and a set of predictor variables that represent clinical, phenotypical, and demographic data for the patient. However, EMR response data may be error prone for a variety of reasons. Performing a manual chart review to validate data accuracy is time-consuming, which limits the number of chart reviews in a large database. The authors' objective is to develop a new design-of-experiments-based systematic chart validation and review (DSCVR) approach that is more powerful than the random validation sampling used in existing approaches. The DSCVR approach judiciously and efficiently selects the cases to validate (i.e., validate whether the response values are correct for those cases) for maximum information content, based only on their predictor variable values. The final predictive model will be fit using only the validation sample, ignoring the remainder of the unvalidated and unreliable error-prone data. A Fisher information based D-optimality criterion is used, and an algorithm for optimizing it is developed. The authors' method is tested in a simulation comparison that is based on a sudden cardiac arrest case study with 23 041 patients' records. This DSCVR approach, using the Fisher information based D-optimality criterion, results in a fitted model with much better predictive performance, as measured by the receiver operating characteristic curve and the accuracy in predicting whether a patient will experience the event, than a model fitted using a random validation sample. The simulation comparisons demonstrate that this DSCVR approach can produce predictive models that are significantly better than those produced from random validation sampling, especially when the event rate is low.
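The D-optimality idea behind approaches like DSCVR can be illustrated with a greedy selection heuristic. The sketch below is an approximation, not the authors' algorithm: it maximizes the log-determinant of the linear-model information matrix X'X rather than the logistic-regression Fisher information the paper uses:

```python
import numpy as np

def greedy_d_optimal(X, n_select, ridge=1e-6):
    """Greedily pick rows of X that maximize log det of the information
    matrix X_S' X_S (a linear-model surrogate for Fisher information)."""
    n, p = X.shape
    selected = []
    M = ridge * np.eye(p)  # small ridge keeps M invertible at the start
    for _ in range(n_select):
        Minv = np.linalg.inv(M)
        best, best_gain = None, -np.inf
        for i in range(n):
            if i in selected:
                continue
            x = X[i]
            # Matrix determinant lemma: log det(M + xx') - log det(M)
            gain = np.log1p(x @ Minv @ x)
            if gain > best_gain:
                best, best_gain = i, gain
        selected.append(best)
        M = M + np.outer(X[best], X[best])
    return selected

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 4))        # hypothetical predictor matrix
chosen = greedy_d_optimal(X, n_select=20)
print(len(chosen))  # 20
```

The chosen indices would correspond to the patient charts sent for manual review; only those validated cases are then used for model fitting.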
The National Ignition Facility: alignment from construction to shot operations
NASA Astrophysics Data System (ADS)
Burkhart, S. C.; Bliss, E.; Di Nicola, P.; Kalantar, D.; Lowe-Webb, R.; McCarville, T.; Nelson, D.; Salmon, T.; Schindler, T.; Villanueva, J.; Wilhelmsen, K.
2010-08-01
The National Ignition Facility in Livermore, California, completed its commissioning milestone on March 10, 2009, when it fired all 192 beams at a combined energy of 1.1 MJ at 351 nm. Subsequently, a target shot series from August through December of 2009 culminated in scale ignition target design experiments up to 1.2 MJ in the National Ignition Campaign. Preparations are underway through the first half of 2010 leading to DT ignition and gain experiments in the fall of 2010 into 2011. The top-level requirement for beam pointing to target of 50 μm rms is the culmination of 15 years of engineering design of a stable facility, commissioning of precision alignment, and precise shot operations controls. Key design documents that guided this project were published in the mid-1990s, driving systems designs. Precision survey methods were used throughout construction, commissioning and operations for precision placement. Rigorous commissioning processes were used to ensure and validate placement and alignment throughout commissioning and in present-day operations. Accurate and rapid system alignment during operations is accomplished by an impressive controls system that aligns and validates alignment readiness, assuring machine safety and productive experiments.
Laboratory outreach: student assessment of flow cytometer fluidics in zero gravity.
Crucian, B; Norman, J; Brentz, J; Pietrzyk, R; Sams, C
2000-10-01
Due to the clinical utility of the flow cytometer, the National Aeronautics and Space Administration (NASA) is interested in the design of a space flight-compatible cytometer for use on long-duration space missions. Because fluid behavior is altered dramatically during space flight, it was deemed necessary to validate the principles of hydrodynamic focusing and laminar flow (cytometer fluidics) in a true microgravity environment. An experiment to validate these properties was conducted by 12 students from Sweetwater High School (Sweetwater, TX) participating in the NASA Reduced Gravity Student Flight Opportunity, Class of 2000. This program allows high school students to gain scientific experience by conducting an experiment on the NASA KC-135 zero gravity laboratory aircraft. The KC-135 creates actual zero-gravity conditions in 30-second intervals by flying a highly inclined parabolic flight path. The experiment was designed by their mentor in the program, the Johnson Space Center's flow cytometrist Brian Crucian, PhD, MT(ASCP). The students performed the experiment, with the mentor, onboard the NASA zero-gravity research aircraft in April 2000.
Optimal fiber design for large capacity long haul coherent transmission [Invited].
Hasegawa, Takemi; Yamamoto, Yoshinori; Hirano, Masaaki
2017-01-23
Fiber figure of merit (FOM), derived from the GN-model theory and validated by several experiments, can predict improvement in OSNR or transmission distance using advanced fibers. We review the FOM theory and present design results of optimal fiber for large capacity long haul transmission, showing variation in design results according to system configuration.
Bor, Jacob; Geldsetzer, Pascal; Venkataramani, Atheendar; Bärnighausen, Till
2015-01-01
Purpose of review: Randomized, population-representative trials of clinical interventions are rare. Quasi-experiments have been used successfully to generate causal evidence on the cascade of HIV care in a broad range of real-world settings. Recent findings: Quasi-experiments exploit exogenous, or quasi-random, variation occurring naturally in the world or because of an administrative rule or policy change to estimate causal effects. Well designed quasi-experiments have greater internal validity than typical observational research designs. At the same time, quasi-experiments may also have potential for greater external validity than experiments and can be implemented when randomized clinical trials are infeasible or unethical. Quasi-experimental studies have established the causal effects of HIV testing and initiation of antiretroviral therapy on health, economic outcomes and sexual behaviors, as well as indirect effects on other community members. Recent quasi-experiments have evaluated specific interventions to improve patient performance in the cascade of care, providing causal evidence to optimize clinical management of HIV. Summary: Quasi-experiments have generated important data on the real-world impacts of HIV testing and treatment and on interventions to improve the cascade of care. With the growth in large-scale clinical and administrative data, quasi-experiments enable rigorous evaluation of policies implemented in real-world settings. PMID:26371463
Bor, Jacob; Geldsetzer, Pascal; Venkataramani, Atheendar; Bärnighausen, Till
2015-11-01
Randomized, population-representative trials of clinical interventions are rare. Quasi-experiments have been used successfully to generate causal evidence on the cascade of HIV care in a broad range of real-world settings. Quasi-experiments exploit exogenous, or quasi-random, variation occurring naturally in the world or because of an administrative rule or policy change to estimate causal effects. Well designed quasi-experiments have greater internal validity than typical observational research designs. At the same time, quasi-experiments may also have potential for greater external validity than experiments and can be implemented when randomized clinical trials are infeasible or unethical. Quasi-experimental studies have established the causal effects of HIV testing and initiation of antiretroviral therapy on health, economic outcomes and sexual behaviors, as well as indirect effects on other community members. Recent quasi-experiments have evaluated specific interventions to improve patient performance in the cascade of care, providing causal evidence to optimize clinical management of HIV. Quasi-experiments have generated important data on the real-world impacts of HIV testing and treatment and on interventions to improve the cascade of care. With the growth in large-scale clinical and administrative data, quasi-experiments enable rigorous evaluation of policies implemented in real-world settings.
NASA Technical Reports Server (NTRS)
Sinha, Neeraj; Brinckman, Kevin; Jansen, Bernard; Seiner, John
2011-01-01
A method was developed for obtaining propulsive base flow data in both hot and cold jet environments, at Mach numbers and altitudes of relevance to NASA launcher designs. The base flow data were used to perform computational fluid dynamics (CFD) turbulence model assessments of base flow predictive capabilities, in order to provide increased confidence in base thermal and pressure load predictions obtained from computational modeling efforts. Predictive CFD analyses were used in the design of the experiments, available propulsive models were used to reduce program costs and increase the likelihood of success, and a wind tunnel facility was used. The data obtained allowed assessment of CFD/turbulence models in a complex flow environment, working within a building-block procedure toward validation, in which cold, non-reacting test data were first used for validation, followed by more complex reacting base flow validation.
Targeting BRCAness in Gastric Cancer
2017-10-01
A modified CRISPR system using dCas9-KRAB-expressing variants of these cells was generated, and the cells were validated for CRISPRi screening. These reagents will next be used... [The remainder of the record is table and figure residue: cell-line characteristics for OE19 (oesophagus/gastric cardia) and OE33 (esophageal adenocarcinoma), the caption "Figure 2. Validation of CRISPR", and personnel listings covering design, execution, and interpretation of CRISPR experiments.]
NASA Technical Reports Server (NTRS)
O'Donnell, James R.; Hsu, Oscar C.; Maghami, Peirman G.; Markley, F. Landis
2006-01-01
As originally proposed, the Space Technology-7 Disturbance Reduction System (DRS) project, managed out of the Jet Propulsion Laboratory, was designed to validate technologies required for future missions such as the Laser Interferometer Space Antenna (LISA). The two technologies to be demonstrated by DRS were Gravitational Reference Sensors (GRSs) and Colloidal MicroNewton Thrusters (CMNTs). Control algorithms being designed by the Dynamic Control System (DCS) team at the Goddard Space Flight Center would control the spacecraft so that it flew about a freely-floating GRS test mass, keeping it centered within its housing. For programmatic reasons, the GRSs were descoped from DRS. The primary goals of the new mission are to validate the performance of the CMNTs and to demonstrate precise spacecraft position control. DRS will fly as a part of the European Space Agency (ESA) LISA Pathfinder (LPF) spacecraft along with a similar ESA experiment, the LISA Technology Package (LTP). With no GRS, the DCS attitude and drag-free control systems make use of the sensor being developed by ESA as a part of the LTP. The control system is designed to maintain the spacecraft's position with respect to the test mass to within 10 nm/√Hz over the DRS science frequency band of 1 to 30 mHz.
OPTIMAL EXPERIMENT DESIGN FOR MAGNETIC RESONANCE FINGERPRINTING
Zhao, Bo; Haldar, Justin P.; Setsompop, Kawin; Wald, Lawrence L.
2017-01-01
Magnetic resonance (MR) fingerprinting is an emerging quantitative MR imaging technique that simultaneously acquires multiple tissue parameters in an efficient experiment. In this work, we present an estimation-theoretic framework to evaluate and design MR fingerprinting experiments. More specifically, we derive the Cramér-Rao bound (CRB), a lower bound on the covariance of any unbiased estimator, to characterize parameter estimation for MR fingerprinting. We then formulate an optimal experiment design problem based on the CRB to choose a set of acquisition parameters (e.g., flip angles and/or repetition times) that maximizes the signal-to-noise ratio efficiency of the resulting experiment. The utility of the proposed approach is validated by numerical studies. Representative results demonstrate that the optimized experiments allow for substantial reduction in the length of an MR fingerprinting acquisition, and substantial improvement in parameter estimation performance. PMID:28268369
Optimal experiment design for magnetic resonance fingerprinting.
Bo Zhao; Haldar, Justin P; Setsompop, Kawin; Wald, Lawrence L
2016-08-01
Magnetic resonance (MR) fingerprinting is an emerging quantitative MR imaging technique that simultaneously acquires multiple tissue parameters in an efficient experiment. In this work, we present an estimation-theoretic framework to evaluate and design MR fingerprinting experiments. More specifically, we derive the Cramér-Rao bound (CRB), a lower bound on the covariance of any unbiased estimator, to characterize parameter estimation for MR fingerprinting. We then formulate an optimal experiment design problem based on the CRB to choose a set of acquisition parameters (e.g., flip angles and/or repetition times) that maximizes the signal-to-noise ratio efficiency of the resulting experiment. The utility of the proposed approach is validated by numerical studies. Representative results demonstrate that the optimized experiments allow for substantial reduction in the length of an MR fingerprinting acquisition, and substantial improvement in parameter estimation performance.
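The Cramér-Rao bound machinery described above can be illustrated on a much simpler model than the fingerprinting dictionary the paper uses. This toy sketch (single-parameter exponential decay, assumed Gaussian noise) shows how the CRB discriminates between two sampling schedules:

```python
import numpy as np

def crb_t2(times, t2, sigma=0.01):
    """Cramér-Rao bound on the variance of a T2 estimate from noisy samples
    of s(t) = exp(-t / T2) with i.i.d. Gaussian noise of std sigma."""
    times = np.asarray(times, dtype=float)
    # Sensitivity of the signal to the parameter: ds/dT2 = (t / T2^2) exp(-t / T2)
    J = (times / t2 ** 2) * np.exp(-times / t2)
    fisher = (J @ J) / sigma ** 2   # scalar Fisher information
    return 1.0 / fisher             # CRB = inverse Fisher information

dense = np.linspace(5.0, 300.0, 30)   # ms, samples spread across the decay
clustered = np.full(30, 5.0)          # all samples at one early time point
print(crb_t2(dense, t2=80.0) < crb_t2(clustered, t2=80.0))  # True
```

An experiment design procedure like the one in the paper searches over acquisition parameters to minimize such a bound (or maximize SNR efficiency), here reduced to choosing the sample times.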
Single subject controlled experiments in aphasia: The science and the state of the science
Thompson, Cynthia K.
2007-01-01
This paper discusses the use of single subject controlled experimental designs for investigating the effect of treatment for aphasia. A brief historical perspective is presented, followed by discussions of the advantages and disadvantages of single subject and group approaches, the basic requirements of single subject experimental research, and crucial considerations in design selection. In the final sections, results of reviews of published single subject controlled experiments are discussed, with emphasis on internal validity issues, the number of participants enrolled in published studies, operational specification of the dependent and independent variables, and reliability of measurement. Learning outcomes: As a result of reading this paper, the participant will: (1) understand the mechanisms required for demonstration of internal and external validity using single subject controlled experimental designs, (2) become familiar with the basic requirements of single subject controlled experimental research, (3) understand the types of single subject controlled experimental designs that are the most appropriate for studying the effects of treatment for aphasia, and (4) become familiar with trends in the published aphasia treatment literature in which single subject controlled experimental designs have been used. PMID:16635494
Development and validation of a low-cost mobile robotics testbed
NASA Astrophysics Data System (ADS)
Johnson, Michael; Hayes, Martin J.
2012-03-01
This paper considers the design, construction and validation of a low-cost experimental robotic testbed, which allows for the localisation and tracking of multiple robotic agents in real time. The testbed system is suitable for research and education in a range of different mobile robotic applications, for validating theoretical as well as practical research work in the field of digital control, mobile robotics, graphical programming and video tracking systems. It provides a reconfigurable floor space for mobile robotic agents to operate within, while tracking the position of multiple agents in real-time using the overhead vision system. The overall system provides a highly cost-effective solution to the topical problem of providing students with practical robotics experience within severe budget constraints. Several problems encountered in the design and development of the mobile robotic testbed and associated tracking system, such as radial lens distortion and the selection of robot identifier templates are clearly addressed. The testbed performance is quantified and several experiments involving LEGO Mindstorm NXT and Merlin System MiaBot robots are discussed.
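The radial lens distortion mentioned above is commonly handled with a one-parameter polynomial model. The sketch below is hypothetical (not the authors' calibration code, which would typically use a full camera-calibration toolkit); it inverts the model x_d = x_u (1 + k1 r_u^2) by fixed-point iteration:

```python
import numpy as np

def undistort_radial(points, k1, n_iter=20):
    """Invert a one-parameter radial distortion model
    x_d = x_u * (1 + k1 * r_u^2) by fixed-point iteration,
    where r_u is the undistorted radius (normalized image coords)."""
    points = np.asarray(points, dtype=float)
    undist = points.copy()  # initial guess: distorted == undistorted
    for _ in range(n_iter):
        r2 = (undist ** 2).sum(axis=1, keepdims=True)
        undist = points / (1.0 + k1 * r2)
    return undist

pts = np.array([[0.30, 0.40]])            # distorted, normalized coordinates
corrected = undistort_radial(pts, k1=-0.2)  # barrel distortion (k1 < 0)
```

Re-applying the forward model to `corrected` recovers `pts`, which is the usual sanity check before mapping overhead-camera pixels onto the testbed floor.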
Integration and Test Flight Validation Plans for the Pulsed Plasma Thruster Experiment on EO- 1
NASA Technical Reports Server (NTRS)
Zakrzwski, Charles; Benson, Scott; Sanneman, Paul; Hoskins, Andy; Bauer, Frank H. (Technical Monitor)
2002-01-01
The Pulsed Plasma Thruster (PPT) Experiment on the Earth Observing One (EO-1) spacecraft has been designed to demonstrate the capability of a new generation PPT to perform spacecraft attitude control. The PPT is a small, self-contained pulsed electromagnetic propulsion system capable of delivering high specific impulse (900-1200 s) and very small impulse bits (10-1000 uN-s) at low average power (less than 1 to 100 W). Teflon fuel is ablated and slightly ionized by means of a capacitive discharge. The discharge also generates electromagnetic fields that accelerate the plasma by means of the Lorentz force. EO-1 has a single PPT that can produce thrust in either the positive or negative pitch direction. The flight validation has been designed to demonstrate the ability of the PPT to provide precision pointing accuracy, response, and stability, and to confirm benign plume and EMI effects. This paper will document the success of the flight validation.
Supersonic Coaxial Jet Experiment for CFD Code Validation
NASA Technical Reports Server (NTRS)
Cutler, A. D.; Carty, A. A.; Doerner, S. E.; Diskin, G. S.; Drummond, J. P.
1999-01-01
A supersonic coaxial jet facility has been designed to provide experimental data suitable for the validation of CFD codes used to analyze high-speed propulsion flows. The center jet is of a light gas and the coflow jet is of air, and the mixing layer between them is compressible. Various methods have been employed in characterizing the jet flow field, including schlieren visualization, pitot, total temperature and gas sampling probe surveying, and RELIEF velocimetry. A Navier-Stokes code has been used to calculate the nozzle flow field and the results compared to the experiment.
Mass Spectrometry for the Masses
ERIC Educational Resources Information Center
Persinger, Jared D.; Hoops, Geoffrey C.; Samide, Michael J.
2004-01-01
A simple, qualitative experiment in which gas chromatography-mass spectrometry (GC-MS) plays an important role has been developed for implementation into the laboratory curriculum of a chemistry course designed for nonscience majors. This laboratory experiment is well suited for the students as it helps them to determine the validity of their…
Development, Validity, and Reliability of the Campus Residential Experience Survey
ERIC Educational Resources Information Center
Sriram, Rishi; Scales, Laine; Shushok, Frank, Jr.
2017-01-01
The importance of living on campus is well established, but extant research that examines administrator perceptions of what comprises the best educational experience for students living on campus is generally unavailable. This study reports the development of a psychometric instrument designed to uncover underlying paradigms and attitudes of…
Structurally compliant rocket engine combustion chamber: Experimental and analytical validation
NASA Technical Reports Server (NTRS)
Jankovsky, Robert S.; Arya, Vinod K.; Kazaroff, John M.; Halford, Gary R.
1994-01-01
A new, structurally compliant rocket engine combustion chamber design has been validated through analysis and experiment. Subscale, tubular channel chambers have been cyclically tested and analytically evaluated. Cyclic lives were determined to have a potential for 1000 percent increase over those of rectangular channel designs, the current state of the art. Greater structural compliance in the circumferential direction gave rise to lower thermal strains during hot firing, resulting in lower thermal strain ratcheting and longer predicted fatigue lives. Thermal, structural, and durability analyses of the combustion chamber design, involving cyclic temperatures, strains, and low-cycle fatigue lives, have corroborated the experimental observations.
MRPrimer: a MapReduce-based method for the thorough design of valid and ranked primers for PCR.
Kim, Hyerin; Kang, NaNa; Chon, Kang-Wook; Kim, Seonho; Lee, NaHye; Koo, JaeHyung; Kim, Min-Soo
2015-11-16
Primer design is a fundamental technique that is widely used for polymerase chain reaction (PCR). Although many methods have been proposed for primer design, they require a great deal of manual effort to generate feasible and valid primers, including homology tests on off-target sequences using BLAST-like tools. That approach is inconvenient for the many target sequences of quantitative PCR (qPCR), since every candidate primer must satisfy the same stringent and allele-invariant constraints. To address this issue, we propose an entirely new method called MRPrimer that can design all feasible and valid primer pairs existing in a DNA database at once, while simultaneously checking a multitude of filtering constraints and validating primer specificity. Furthermore, MRPrimer suggests the best primer pair for each target sequence, based on a ranking method. Through qPCR analysis using 343 primer pairs and the corresponding sequencing and comparative analyses, we showed that the primer pairs designed by MRPrimer are very stable and effective for qPCR. In addition, MRPrimer is computationally efficient and scalable and therefore useful for quickly constructing an entire collection of feasible and valid primers for frequently updated databases like RefSeq. Furthermore, we suggest that MRPrimer can be utilized conveniently for experiments requiring primer design, especially real-time qPCR. © The Author(s) 2015. Published by Oxford University Press on behalf of Nucleic Acids Research.
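To make the idea of "filtering constraints" concrete, here is a minimal sketch of the kind of single-primer checks such a pipeline applies before any homology testing. The thresholds, the Wallace-rule melting-temperature estimate, and the function names are illustrative assumptions, not MRPrimer's actual filters:

```python
def gc_content(seq):
    """Fraction of G/C bases in a primer sequence."""
    seq = seq.upper()
    return (seq.count("G") + seq.count("C")) / len(seq)

def wallace_tm(seq):
    """Wallace rule Tm = 2(A+T) + 4(G+C); a rough estimate valid for short oligos."""
    seq = seq.upper()
    at = seq.count("A") + seq.count("T")
    gc = seq.count("G") + seq.count("C")
    return 2 * at + 4 * gc

def has_long_run(seq, max_run=4):
    """True if any single base repeats more than max_run times in a row."""
    run, prev = 0, ""
    for base in seq.upper():
        run = run + 1 if base == prev else 1
        if run > max_run:
            return True
        prev = base
    return False

def passes_filters(seq, min_len=18, max_len=24, min_gc=0.4, max_gc=0.6,
                   min_tm=50.0, max_tm=65.0):
    """Combine length, GC-content, Tm, and homopolymer-run constraints."""
    return (min_len <= len(seq) <= max_len
            and min_gc <= gc_content(seq) <= max_gc
            and min_tm <= wallace_tm(seq) <= max_tm
            and not has_long_run(seq))
```

A real pipeline applies such filters to every candidate, then pairs survivors and performs homology tests against all off-target sequences, which is the expensive step MRPrimer distributes with MapReduce.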
In-flight results of adaptive attitude control law for a microsatellite
NASA Astrophysics Data System (ADS)
Pittet, C.; Luzi, A. R.; Peaucelle, D.; Biannic, J.-M.; Mignot, J.
2015-06-01
Because satellites usually do not experience large changes of mass, center of gravity or inertia in orbit, linear time invariant (LTI) controllers have been widely used to control their attitude. But, as the pointing requirements become more stringent and the satellite's structure more complex with large steerable and/or deployable appendages and flexible modes occurring in the control bandwidth, one unique LTI controller is no longer sufficient. One solution consists in designing several LTI controllers, one for each set point, but the switching between them is difficult to tune and validate. Another interesting solution is to use adaptive controllers, which could present at least two advantages: first, as the controller automatically and continuously adapts to the set point without changing the structure, no switching logic is needed in the software; second, performance and stability of the closed-loop system can be assessed directly on the whole flight domain. To evaluate the real benefits of adaptive control for satellites, in terms of design, validation and performances, CNES selected it as an end-of-life experiment on the PICARD microsatellite. This paper describes the design, validation and in-flight results of the new adaptive attitude control law, compared to the nominal control law.
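The core idea of a gain that adapts continuously rather than switching can be illustrated with a textbook scalar example; this is not the PICARD control law, and all plant numbers and names below are invented for illustration:

```python
def simulate_adaptive(a=0.5, b=1.0, gamma=2.0, x0=1.0, dt=1e-3, steps=20000):
    """Scalar unstable plant dx/dt = a*x + b*u with a unknown to the controller.
    Control u = -k*x with adaptation dk/dt = gamma*x**2: the gain k grows
    continuously until the closed loop is stable, with no switching logic."""
    x, k = x0, 0.0
    for _ in range(steps):
        u = -k * x
        x += dt * (a * x + b * u)   # forward-Euler plant update
        k += dt * (gamma * x * x)   # adaptation law: k only grows while |x| is large
    return x, k
```

The state is driven to zero while the gain settles at a value exceeding the destabilizing pole a/b, mirroring the abstract's point that one continuously adapting controller can replace a bank of switched LTI designs.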
Audience Design through Social Interaction during Group Discussion
Rogers, Shane L.; Fay, Nicolas; Maybery, Murray
2013-01-01
This paper contrasts two accounts of audience design during multiparty communication: audience design as a strategic individual-level message adjustment or as a non-strategic interaction-level message adjustment. Using a non-interactive communication task, Experiment 1 showed that people distinguish between messages designed for oneself and messages designed for another person; consistent with strategic message design, messages designed for another person/s were longer (number of words) than those designed for oneself. However, audience size did not affect message length (messages designed for different sized audiences were similar in length). Using an interactive communication task Experiment 2 showed that as group size increased so too did communicative effort (number of words exchanged between interlocutors). Consistent with a non-strategic account, as group members were added more social interaction was necessary to coordinate the group's collective situation model. Experiment 3 validates and extends the production measures used in Experiment 1 and 2 using a comprehension task. Taken together, our results indicate that audience design arises as a non-strategic outcome of social interaction during group discussion. PMID:23437343
Designing and validation of a yoga-based intervention for schizophrenia.
Govindaraj, Ramajayam; Varambally, Shivarama; Sharma, Manjunath; Gangadhar, Bangalore Nanjundaiah
2016-06-01
Schizophrenia is a chronic mental illness which causes significant distress and dysfunction. Yoga has been found to be effective as an add-on therapy in schizophrenia. Modules of yoga used in previous studies were based on individual researcher's experience. This study aimed to develop and validate a specific generic yoga-based intervention module for patients with schizophrenia. The study was conducted at NIMHANS Integrated Centre for Yoga (NICY). A yoga module was designed based on traditional and contemporary yoga literature as well as published studies. The yoga module along with three case vignettes of adult patients with schizophrenia was sent to 10 yoga experts for their validation. Experts (n = 10) gave their opinion on the usefulness of a yoga module for patients with schizophrenia with some modifications. In total, 87% (13 of 15 items) of the items in the initial module were retained, with modification in the remainder as suggested by the experts. A specific yoga-based module for schizophrenia was designed and validated by experts. Further studies are needed to confirm efficacy and clinical utility of the module. Additional clinical validation is suggested.
Measuring and Enhancing Creativity
ERIC Educational Resources Information Center
Mahboub, Kamyar C.; Portillo, Margaret B.; Liu, Yinhui; Chandraratna, Susantha
2004-01-01
The purpose of this study was to assess ways by which creativity may be enhanced in a design-oriented course. In order to demonstrate the validity of the approach, a statistically based study was employed. Additionally, the experiment was replicated in two design-oriented fields at the University of Kentucky. These fields were civil engineering…
Validation Database Based Thermal Analysis of an Advanced RPS Concept
NASA Technical Reports Server (NTRS)
Balint, Tibor S.; Emis, Nickolas D.
2006-01-01
Advanced RPS concepts can be conceived, designed and assessed using high-end computational analysis tools. These predictions may provide an initial insight into the potential performance of these models, but verification and validation are necessary and required steps to gain confidence in the numerical analysis results. This paper discusses the findings from a numerical validation exercise for a small advanced RPS concept, based on a thermal analysis methodology developed at JPL and on a validation database obtained from experiments performed at Oregon State University. Both the numerical and experimental configurations utilized a single GPHS module enabled design, resembling a Mod-RTG concept. The analysis focused on operating and environmental conditions during the storage phase only. This validation exercise helped to refine key thermal analysis and modeling parameters, such as heat transfer coefficients, and conductivity and radiation heat transfer values. Improved understanding of the Mod-RTG concept through validation of the thermal model allows for future improvements to this power system concept.
A statistical approach to selecting and confirming validation targets in -omics experiments
2012-01-01
Background Genomic technologies are, by their very nature, designed for hypothesis generation. In some cases, the hypotheses that are generated require that genome scientists confirm findings about specific genes or proteins. But one major advantage of high-throughput technology is that global genetic, genomic, transcriptomic, and proteomic behaviors can be observed. Manual confirmation of every statistically significant genomic result is prohibitively expensive. This has led researchers in genomics to adopt the strategy of confirming only a handful of the most statistically significant results, a small subset chosen for biological interest, or a small random subset. But there is no standard approach for selecting and quantitatively evaluating validation targets. Results Here we present a new statistical method and approach for statistically validating lists of significant results based on confirming only a small random sample. We apply our statistical method to show that the usual practice of confirming only the most statistically significant results does not statistically validate result lists. We analyze an extensively validated RNA-sequencing experiment to show that confirming a random subset can statistically validate entire lists of significant results. Finally, we analyze multiple publicly available microarray experiments to show that statistically validating random samples can both (i) provide evidence to confirm long gene lists and (ii) save thousands of dollars and hundreds of hours of labor over manual validation of each significant result. Conclusions For high-throughput -omics studies, statistical validation is a cost-effective and statistically valid approach to confirming lists of significant results. PMID:22738145
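The random-subset strategy above can be sketched numerically. The following is an illustration only, not the paper's exact procedure: confirm a small random sample of a significant-result list and bound the list-wide true-positive rate with a Wilson score interval (function names and the confirmation callback are hypothetical):

```python
import math
import random

def wilson_lower_bound(successes, n, z=1.96):
    """Lower bound of the Wilson score interval for a binomial proportion."""
    if n == 0:
        return 0.0
    p = successes / n
    denom = 1 + z * z / n
    center = p + z * z / (2 * n)
    margin = z * math.sqrt(p * (1 - p) / n + z * z / (4 * n * n))
    return (center - margin) / denom

def validate_list(hits, n_sample, confirm, seed=0):
    """Confirm a uniform random subset of `hits` and return the number of
    confirmations plus a lower confidence bound on the list-wide rate."""
    rng = random.Random(seed)
    sample = rng.sample(hits, n_sample)
    successes = sum(1 for h in sample if confirm(h))
    return successes, wilson_lower_bound(successes, n_sample)
```

Because the sample is drawn uniformly from the whole list, the bound applies to the entire list, which is exactly what confirming only the top-ranked results cannot provide.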
Hot rocket plume experiment - Survey and conceptual design. [of rhenium-iridium bipropellants
NASA Technical Reports Server (NTRS)
Millard, Jerry M.; Luan, Taylor W.; Dowdy, Mack W.
1992-01-01
Attention is given to a space-borne engine plume experiment study to fly an experiment which will both verify and quantify the reduced contamination from advanced rhenium-iridium earth-storable bipropellant rockets (hot rockets) and provide a correlation between high-fidelity, in-space measurements and theoretical plume and surface contamination models. The experiment conceptual design is based on survey results from plume and contamination technologists throughout the U.S. With respect to shuttle use, cursory investigations validate Hitchhiker availability and adaptability, adequate remote manipulator system (RMS) articulation and dynamic capability, acceptable RMS attachment capability, adequate power and telemetry capability, and adequate flight altitude and attitude/orbital capability.
Benej, Martin; Bendlova, Bela; Vaclavikova, Eliska; Poturnajova, Martina
2011-10-06
Reliable and effective primary screening of mutation carriers is the key condition for common diagnostic use. The objective of this study is to validate high resolution melting (HRM) analysis as a method for routine primary mutation screening and to accomplish its optimization, evaluation and validation. Due to their heterozygous nature, germline point mutations of the c-RET proto-oncogene, associated with multiple endocrine neoplasia type 2 (MEN2), are suitable for HRM analysis. Early identification of mutation carriers has a major impact on patients' survival due to early onset of medullary thyroid carcinoma (MTC) and resistance to conventional therapy. The authors performed a series of validation assays according to the International Conference on Harmonization of Technical Requirements for Registration of Pharmaceuticals for Human Use (ICH) guidelines for validation of analytical procedures, along with appropriate design and optimization experiments. After validated evaluation of HRM, the method was utilized for primary screening of 28 pathogenic c-RET mutations distributed among nine exons of the c-RET gene. Validation experiments confirm the repeatability, robustness, accuracy and reproducibility of HRM. All c-RET gene pathogenic variants were detected with no occurrence of false-positive/false-negative results. The data provide basic information about design, establishment and validation of HRM for primary screening of genetic variants in order to distinguish heterozygous point mutation carriers among the wild-type sequence carriers. HRM analysis is a powerful and reliable tool for rapid and cost-effective primary screening, e.g., of c-RET gene germline and/or sporadic mutations and can be used as a first line potential diagnostic tool.
Valladares-Rodriguez, Sonia; Perez-Rodriguez, Roberto; Facal, David; Fernandez-Iglesias, Manuel J; Anido-Rifon, Luis; Mouriño-Garcia, Marcos
2017-01-01
Assessment of episodic memory has been traditionally used to evaluate potential cognitive impairments in senior adults. Typically, episodic memory evaluation is based on personal interviews and pen-and-paper tests. This article presents the design, development and a preliminary validation of a novel digital game to assess episodic memory intended to overcome the limitations of traditional methods, such as the cost of its administration, its intrusive character, the lack of early detection capabilities, the lack of ecological validity, the learning effect and the existence of confounding factors. Our proposal is based on the gamification of the California Verbal Learning Test (CVLT) and it has been designed to comply with the psychometric characteristics of reliability and validity. Two qualitative focus groups and a first pilot experiment were carried out to validate the proposal. A more ecological, non-intrusive and better administrable tool to perform cognitive assessment was developed. Initial evidence from the focus groups and pilot experiment confirmed the developed game's usability and offered promising results insofar as its psychometric validity is concerned. Moreover, the potential of this game for the cognitive classification of senior adults was confirmed, and administration time is dramatically reduced with respect to pen-and-paper tests. Additional research is needed to improve the resolution of the game for the identification of specific cognitive impairments, as well as to achieve a complete validation of the psychometric properties of the digital game. Initial evidence shows that serious games can be used as an instrument to assess the cognitive status of senior adults, and even to predict the onset of mild cognitive impairments or Alzheimer's disease.
Toward Supersonic Retropropulsion CFD Validation
NASA Technical Reports Server (NTRS)
Kleb, Bil; Schauerhamer, D. Guy; Trumble, Kerry; Sozer, Emre; Barnhardt, Michael; Carlson, Jan-Renee; Edquist, Karl
2011-01-01
This paper begins the process of verifying and validating computational fluid dynamics (CFD) codes for supersonic retropropulsive flows. Four CFD codes (DPLR, FUN3D, OVERFLOW, and US3D) are used to perform various numerical and physical modeling studies toward the goal of comparing predictions with a wind tunnel experiment specifically designed to support CFD validation. Numerical studies run the gamut in rigor from code-to-code comparisons to observed order-of-accuracy tests. Results indicate that for this complex flowfield, involving time-dependent shocks and vortex shedding, design order of accuracy is not clearly evident. Also explored is the extent of physical modeling necessary to predict the salient flowfield features found in high-speed Schlieren images and surface pressure measurements taken during the validation experiment. Physical modeling studies include geometric items such as wind tunnel wall and sting mount interference, as well as turbulence modeling that ranges from a RANS (Reynolds-Averaged Navier-Stokes) 2-equation model to DES (Detached Eddy Simulation) models. These studies indicate that tunnel wall interference is minimal for the cases investigated; model mounting hardware effects are confined to the aft end of the model; and sparse grid resolution and turbulence modeling can damp or entirely dissipate the unsteadiness of this self-excited flow.
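The observed order-of-accuracy test mentioned above is conventionally computed from a scalar solution value on three systematically refined grids. The formula below is the standard Richardson-style estimate, not code taken from any of the four solvers, and the sample values are manufactured for illustration:

```python
import math

def observed_order(f_coarse, f_medium, f_fine, r):
    """Observed order of accuracy p from solutions on three grids with a
    constant refinement ratio r: p = ln((f3 - f2) / (f2 - f1)) / ln(r),
    where f1 is the finest-grid solution."""
    return math.log((f_coarse - f_medium) / (f_medium - f_fine)) / math.log(r)
```

If the computed p falls well below the scheme's design order, as the abstract reports for this unsteady flowfield, the asymptotic grid-convergence assumption behind the test is suspect.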
Kim, Hyerin; Kang, NaNa; An, KyuHyeon; Koo, JaeHyung; Kim, Min-Soo
2016-07-08
Design of high-quality primers for multiple target sequences is essential for qPCR experiments, but is challenging due to the need to consider both homology tests on off-target sequences and the same stringent filtering constraints on the primers. Existing web servers for primer design have major drawbacks, including requiring the use of BLAST-like tools for homology tests, lack of support for ranking of primers, TaqMan probes and simultaneous design of primers against multiple targets. Due to the large-scale computational overhead, the few web servers supporting homology tests use heuristic approaches or perform homology tests within a limited scope. Here, we describe the MRPrimerW, which performs complete homology testing, supports batch design of primers for multi-target qPCR experiments, supports design of TaqMan probes and ranks the resulting primers to return the top-1 best primers to the user. To ensure high accuracy, we adopted the core algorithm of a previously reported MapReduce-based method, MRPrimer, but completely redesigned it to allow users to receive query results quickly in a web interface, without requiring a MapReduce cluster or a long computation. MRPrimerW provides primer design services and a complete set of 341 963 135 in silico validated primers covering 99% of human and mouse genes. Free access: http://MRPrimerW.com. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.
NASA Technical Reports Server (NTRS)
Benzie, M. A.
1998-01-01
The objective of this research project was to examine processing and design parameters in the fabrication of composite components to obtain a better understanding of, and attempt to minimize, springback associated with composite materials. To accomplish this, both processing and design parameters were included in a Taguchi-designed experiment. Composite angled panels were fabricated by hand layup techniques, and the fabricated panels were inspected for springback effects. This experiment yielded several significant results. The confirmation experiment validated the reproducibility of the factorial effects, recognized the sources of error, and established the experiment as reliable. The material used in the design of tooling needs to be a major consideration when fabricating composite components, as expected. The factors dealing with resin flow, however, raise several potentially serious material and design questions. These questions must be dealt with up front in order to minimize springback: viscosity of the resin, vacuum bagging of the part for cure, and the curing method selected. These factors directly affect design, material selection, and processing methods.
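In a Taguchi-designed experiment, factor effects are estimated by averaging the response at each level of each factor across the orthogonal array. A minimal sketch with the smallest two-level array, L4(2^3), follows; the array is standard, but the response values are invented for illustration and do not come from this study:

```python
# L4(2^3) orthogonal array: three two-level factors assigned to four runs.
L4 = [(1, 1, 1), (1, 2, 2), (2, 1, 2), (2, 2, 1)]

def main_effects(responses):
    """For each factor column, compute (mean response at level 2) minus
    (mean response at level 1): the estimated main effect of that factor."""
    effects = []
    for col in range(3):
        level1 = [y for run, y in zip(L4, responses) if run[col] == 1]
        level2 = [y for run, y in zip(L4, responses) if run[col] == 2]
        effects.append(sum(level2) / len(level2) - sum(level1) / len(level1))
    return effects
```

Because the array is balanced, each factor's effect estimate is unconfounded with the other main effects, which is what makes a confirmation run at the predicted best settings a meaningful reliability check.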
NASA Astrophysics Data System (ADS)
Gutiérrez, Jose Manuel; Maraun, Douglas; Widmann, Martin; Huth, Radan; Hertig, Elke; Benestad, Rasmus; Roessler, Ole; Wibig, Joanna; Wilcke, Renate; Kotlarski, Sven
2016-04-01
VALUE is an open European network to validate and compare downscaling methods for climate change research (http://www.value-cost.eu). A key deliverable of VALUE is the development of a systematic validation framework to enable the assessment and comparison of both dynamical and statistical downscaling methods. This framework is based on a user-focused validation tree, guiding the selection of relevant validation indices and performance measures for different aspects of the validation (marginal, temporal, spatial, multi-variable). Moreover, several experiments have been designed to isolate specific points in the downscaling procedure where problems may occur (assessment of intrinsic performance, effect of errors inherited from the global models, effect of non-stationarity, etc.). The list of downscaling experiments includes 1) cross-validation with perfect predictors, 2) GCM predictors -aligned with EURO-CORDEX experiment- and 3) pseudo reality predictors (see Maraun et al. 2015, Earth's Future, 3, doi:10.1002/2014EF000259, for more details). The results of these experiments are gathered, validated and publicly distributed through the VALUE validation portal, allowing for a comprehensive community-open downscaling intercomparison study. In this contribution we describe the overall results from Experiment 1), consisting of a European wide 5-fold cross-validation (with consecutive 6-year periods from 1979 to 2008) using predictors from ERA-Interim to downscale precipitation and temperatures (minimum and maximum) over a set of 86 ECA&D stations representative of the main geographical and climatic regions in Europe. As a result of the open call for contribution to this experiment (closed in Dec. 2015), over 40 methods representative of the main approaches (MOS and Perfect Prognosis, PP) and techniques (linear scaling, quantile mapping, analogs, weather typing, linear and generalized regression, weather generators, etc.) 
were submitted, including both data (downscaled values) and metadata (characterizing different aspects of the downscaling methods). This constitutes the largest and most comprehensive intercomparison of statistical downscaling methods to date. Here, we present an overall validation, analyzing marginal and temporal aspects to assess the intrinsic performance and added value of statistical downscaling methods at both annual and seasonal levels. This validation takes into account the different properties/limitations of different approaches and techniques (as reported in the provided metadata) in order to perform a fair comparison. It is pointed out that this experiment alone is not sufficient to evaluate the limitations of (MOS) bias correction techniques. Moreover, it also does not fully validate PP, since we do not learn whether we have the right predictors and whether the PP assumption is valid. These problems will be analyzed in the subsequent community-open VALUE experiments 2) and 3), which will be open for participation throughout the present year.
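The fold construction for Experiment 1 (5-fold cross-validation with consecutive 6-year periods from 1979 to 2008) can be sketched as below. This is an illustration of the block-splitting logic only, not VALUE project code, and the function name is hypothetical:

```python
def consecutive_folds(start_year, end_year, n_folds):
    """Split the years [start_year, end_year] into n_folds consecutive equal
    blocks and return (test_years, train_years) pairs for block CV."""
    years = list(range(start_year, end_year + 1))
    size = len(years) // n_folds
    folds = []
    for i in range(n_folds):
        test = years[i * size:(i + 1) * size]
        train = years[:i * size] + years[(i + 1) * size:]
        folds.append((test, train))
    return folds
```

Using consecutive blocks rather than random years keeps each held-out period temporally coherent, so interannual persistence cannot leak information from training to test years.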
A Hardware Model Validation Tool for Use in Complex Space Systems
NASA Technical Reports Server (NTRS)
Davies, Misty Dawn; Gundy-Burlet, Karen L.; Limes, Gregory L.
2010-01-01
One of the many technological hurdles that must be overcome in future missions is the challenge of validating as-built systems against the models used for design. We propose a technique composed of intelligent parameter exploration in concert with automated failure analysis as a scalable method for the validation of complex space systems. The technique is impervious to discontinuities and linear dependencies in the data, and can handle dimensionalities consisting of hundreds of variables over tens of thousands of experiments.
Hybrid Soft Soil Tire Model (HSSTM). Part 1: Tire Material and Structure Modeling
2015-04-28
commercially available vehicle simulation packages. Model parameters are obtained using a validated finite element tire model, modal analysis, and other...design of experiment matrix. This data, in addition to modal analysis data, were used to validate the tire model. Furthermore, to study the validity... [Equation (78), a matrix expression giving the forces applied to the rim center as the sum of the axle forces and suspension forces, is garbled in extraction.]
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jordan, Amy B.; Stauffer, Philip H.; Reed, Donald T.
The primary objective of the experimental effort described here is to aid in understanding the complex nature of liquid, vapor, and solid transport occurring around heated nuclear waste in bedded salt. In order to gain confidence in the predictive capability of numerical models, experimental validation must be performed to ensure that (a) hydrological and physiochemical parameters and (b) processes are correctly simulated. The experiments proposed here are designed to study aspects of the system that have not been satisfactorily quantified in prior work. In addition to exploring the complex coupled physical processes in support of numerical model validation, lessons learned from these experiments will facilitate preparations for larger-scale experiments that may utilize similar instrumentation techniques.
A Validation Framework for the Long Term Preservation of High Energy Physics Data
NASA Astrophysics Data System (ADS)
Ozerov, Dmitri; South, David M.
2014-06-01
The study group on data preservation in high energy physics, DPHEP, is moving to a new collaboration structure, which will focus on the implementation of preservation projects, such as those described in the group's large-scale report published in 2012. One such project is the development of a validation framework, which checks the compatibility of evolving computing environments and technologies with the experiments' software for as long as possible, with the aim of substantially extending the lifetime of the analysis software, and hence of the usability of the data. The framework is designed to automatically test and validate the software and data of an experiment against changes and upgrades to the computing environment, as well as changes to the experiment software itself. Technically, this is realised using a framework capable of hosting a number of virtual machine images, built with different configurations of operating systems and the relevant software, including any necessary external dependencies.
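The automatic test-and-validate loop described above amounts to running every validation check against every hosted environment configuration and collecting incompatibilities. A minimal, purely illustrative harness (the environment names, fields, and check below are hypothetical, not DPHEP's actual framework) might look like:

```python
def run_validation(environments, tests):
    """Run every validation test in every environment configuration and
    collect (environment_name, test_name) pairs that fail. A test that
    raises is counted as a failure rather than aborting the sweep."""
    failures = []
    for env in environments:
        for name, test in tests.items():
            try:
                ok = test(env)
            except Exception:
                ok = False
            if not ok:
                failures.append((env["name"], name))
    return failures
```

In the real framework each "environment" would be a virtual machine image and each "test" an experiment software build or analysis job, but the sweep-and-report structure is the same.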
A High Performance Pulsatile Pump for Aortic Flow Experiments in 3-Dimensional Models.
Chaudhury, Rafeed A; Atlasman, Victor; Pathangey, Girish; Pracht, Nicholas; Adrian, Ronald J; Frakes, David H
2016-06-01
Aortic pathologies such as coarctation, dissection, and aneurysm represent a particularly emergent class of cardiovascular diseases. Computational simulations of aortic flows are growing increasingly important as tools for gaining understanding of these pathologies, as well as for planning their surgical repair. In vitro experiments are required to validate the simulations against real world data, and the experiments require a pulsatile flow pump system that can provide physiologic flow conditions characteristic of the aorta. We designed a newly capable piston-based pulsatile flow pump system that can generate high volume flow rates (850 mL/s), replicate physiologic waveforms, and pump high viscosity fluids against large impedances. The system is also compatible with a broad range of fluid types, and is operable in magnetic resonance imaging environments. Performance of the system was validated using image processing-based analysis of piston motion as well as particle image velocimetry. The new system represents a more capable pumping solution for aortic flow experiments than other available designs, and can be manufactured at a relatively low cost.
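For a piston pump, the volumetric flow rate equals the piston area times the piston velocity, so a desired flow waveform maps directly to a piston velocity command. The sketch below illustrates that kinematic relation with a toy systolic waveform; the piston diameter, waveform shape, and function names are assumptions for illustration, not the authors' design values:

```python
import math

def piston_velocity(q_ml_per_s, piston_diameter_cm):
    """Piston velocity (cm/s) needed to produce flow q (mL/s), from Q = A * v
    (1 mL == 1 cm^3)."""
    area_cm2 = math.pi * (piston_diameter_cm / 2.0) ** 2
    return q_ml_per_s / area_cm2

def flow_waveform(t, period=1.0, peak=850.0):
    """Toy aortic-like pulse: a half-sine systole over the first third of the
    cycle, zero flow for the remainder (no diastolic reverse flow modeled)."""
    phase = t % period
    systole = period / 3.0
    return peak * math.sin(math.pi * phase / systole) if phase < systole else 0.0
```

Sampling the waveform and converting each sample through `piston_velocity` yields the motion profile the piston controller must track, which is why peak flow capability (850 mL/s in the abstract) sets the required piston speed for a given bore.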
NASA Astrophysics Data System (ADS)
Fujiwara, Yukihiro; Yoshii, Masakazu; Arai, Yasuhito; Adachi, Shuichi
Advanced safety vehicle(ASV)assists drivers’ manipulation to avoid trafic accidents. A variety of researches on automatic driving systems are necessary as an element of ASV. Among them, we focus on visual feedback approach in which the automatic driving system is realized by recognizing road trajectory using image information. The purpose of this paper is to examine the validity of this approach by experiments using a radio-controlled car. First, a practical image processing algorithm to recognize white lines on the road is proposed. Second, a model of the radio-controlled car is built by system identication experiments. Third, an automatic steering control system is designed based on H∞ control theory. Finally, the effectiveness of the designed control system is examined via traveling experiments.
Fuel-Air Mixing and Combustion in Scramjets
NASA Technical Reports Server (NTRS)
Drummond, J. P.; Diskin, Glenn S.; Cutler, A. D.
2002-01-01
Activities in the area of scramjet fuel-air mixing and combustion associated with the Research and Technology Organization Working Group on Technologies for Propelled Hypersonic Flight are described. Work discussed in this paper has centered on the design of two basic experiments for studying the mixing and combustion of fuel and air in a scramjet. Simulations were conducted to aid in the design of these experiments. The experimental models were then constructed, and data were collected in the laboratory. Comparison of the data from a coaxial jet mixing experiment and a supersonic combustor experiment with a combustor code were then made and described. This work was conducted by NATO to validate combustion codes currently employed in scramjet design and to aid in the development of improved turbulence and combustion models employed by the codes.
Lost in space: design of experiments and scientific exploration in a Hogarth Universe.
Lendrem, Dennis W; Lendrem, B Clare; Woods, David; Rowland-Jones, Ruth; Burke, Matthew; Chatfield, Marion; Isaacs, John D; Owen, Martin R
2015-11-01
A Hogarth, or 'wicked', universe is an irregular environment generating data to support erroneous beliefs. Here, we argue that development scientists often work in such a universe. We demonstrate that exploring these multidimensional spaces using small experiments guided by scientific intuition alone gives rise to an illusion of validity and a misplaced confidence in that scientific intuition. By contrast, design of experiments (DOE) permits the efficient mapping of such complex, multidimensional spaces. We describe simulation tools that enable research scientists to explore these spaces in relative safety. Copyright © 2015 Elsevier Ltd. All rights reserved.
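The contrast the authors draw between intuition-guided small experiments and DOE can be illustrated with a minimal sketch (not the authors' simulation tools): a full-factorial design enumerates every factor combination, covering the interactions that one-factor-at-a-time probing misses. The factor names and levels below are invented.

```python
import itertools

def full_factorial(levels):
    """Enumerate every run of a full-factorial design.

    `levels` maps factor name -> list of settings to test.
    Returns a list of dicts, one per experimental run.
    """
    names = list(levels)
    runs = []
    for combo in itertools.product(*(levels[n] for n in names)):
        runs.append(dict(zip(names, combo)))
    return runs

# Three hypothetical process factors at two levels each -> 8 runs
# that cover every corner of the design space, including the
# interactions that one-factor-at-a-time exploration misses.
design = full_factorial({
    "temperature": [20, 40],   # degrees C (illustrative values)
    "pH": [6.5, 7.5],
    "stir_rate": [100, 300],   # rpm
})
print(len(design))  # 8 runs
```

Adding a factor multiplies the run count, which is why fractional and optimal designs exist; but even the full grid scales far better than blind probing of a wicked space.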
Process material management in the Space Station environment
NASA Technical Reports Server (NTRS)
Perry, J. L.; Humphries, W. R.
1988-01-01
The Space Station will provide a unique facility for conducting material-processing and life-science experiments under microgravity conditions. These conditions place special requirements on the U.S. Laboratory for storing and transporting chemicals and process fluids, reclaiming water from selected experiments, treating and storing experiment wastes, and providing vacuum utilities. To meet these needs and provide a safe laboratory environment, the Process Material Management System (PMMS) is being developed. Preliminary design requirements and concepts related to the PMMS are addressed, and the MSFC PMMS breadboard test facility and a preliminary plan for validating the overall system design are discussed.
Experiment module concepts study. Volume 3: Module and subsystem design
NASA Technical Reports Server (NTRS)
Hunter, J. R.; Chiarappa, D. J.
1970-01-01
The final common module set exhibiting wide commonality is described. The set consists of three types of modules: one free flying module and two modules that operate attached to the space station. The common module designs provide for the experiment program as defined. The feasibility, economy, and practicality of these modules hinges on factors that do not affect the approach or results of the commonality process, but are important to the validity of the common module concepts. Implementation of the total experiment program requires thirteen common modules: five CM-1, five CM-3, and three CM-4 modules.
Smartkuber: A Serious Game for Cognitive Health Screening of Elderly Players.
Boletsis, Costas; McCallum, Simon
2016-08-01
The goal of this study was to design and develop a serious game for cognitive health screening of the elderly, namely Smartkuber, and to evaluate its construct, criterion (concurrent and predictive), and content validity, assessing its relationship with the Montreal Cognitive Assessment (MoCA) test. Furthermore, the study aims to evaluate elderly players' game experience with Smartkuber. Thirteen older adults were enrolled in the study. The game was designed and developed by a multidisciplinary team. The study follows a mixed methodological approach, utilizing the In-Game Experience Questionnaire to assess the players' game experience and a correlational study to examine the relationship between the Smartkuber and MoCA scores. The learning effect is also examined by comparing the mean game scores of the first and last game sessions of each player (Delta scores). All 13 participants (mean age: 68.69, SD: 7.24) successfully completed the study. Smartkuber demonstrated high concurrent validity with the MoCA test (r = 0.81, P = 0.001) and satisfactory levels of predictive and content validity. The Delta scores showed no statistically significant differences in scoring, thus indicating no learning effects during the Smartkuber game sessions. The study shows that Smartkuber is a promising tool for cognitive health screening, providing an entertaining and motivating gaming experience to elderly players. Limitations of the study and future directions are discussed.
SATS HVO Concept Validation Experiment
NASA Technical Reports Server (NTRS)
Consiglio, Maria; Williams, Daniel; Murdoch, Jennifer; Adams, Catherine
2005-01-01
A human-in-the-loop simulation experiment was conducted at the NASA Langley Research Center's (LaRC) Air Traffic Operations Lab (ATOL) in an effort to comprehensively validate tools and procedures intended to enable the Small Aircraft Transportation System, Higher Volume Operations (SATS HVO) concept of operations. The SATS HVO procedures were developed to increase the rate of operations at non-towered, non-radar airports in near all-weather conditions. A key element of the design is the establishment of a volume of airspace around designated airports where pilots accept responsibility for self-separation. Flights operating at these airports are given approach sequencing information computed by a ground-based automated system. The SATS HVO validation experiment was conducted in the ATOL during the spring of 2004 in order to determine whether a pilot can safely and proficiently fly an airplane while performing SATS HVO procedures. Comparative measures of flight path error, perceived workload and situation awareness were obtained for two types of scenarios. Baseline scenarios were representative of today's system utilizing procedural separation, where air traffic control grants one approach or departure clearance at a time. SATS HVO scenarios represented approach and departure procedures as described in the SATS HVO concept of operations. Results from the experiment indicate that low-time pilots were able to fly SATS HVO procedures and maintain self-separation as safely and proficiently as flying today's procedures.
NASA Technical Reports Server (NTRS)
Generazio, Ed
2007-01-01
This viewgraph presentation reviews some of the issues that specialists in nondestructive evaluation (NDE) face in determining the statistics of the probability of detection. There is discussion of the use of the binomial distribution and of the probability of hit. The presentation then reviews the concepts of Directed Design of Experiments for Validating Probability of Detection of Inspection Systems (DOEPOD). Several cases are reviewed and discussed. The concept of false calls is also reviewed.
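Hit/miss POD statistics of this kind rest on the binomial distribution. As a hedged sketch (the standard zero-failure lower confidence bound, which underlies the familiar "29 of 29" demonstration, not DOEPOD itself):

```python
def pod_lower_bound(n_hits, confidence=0.95):
    """One-sided lower confidence bound on the probability of
    detection when all n trials are hits (zero misses).

    Solves p**n = 1 - confidence for p: if the true POD were
    below this bound, observing n straight hits would be rarer
    than the chosen significance level.
    """
    alpha = 1.0 - confidence
    return alpha ** (1.0 / n_hits)

# The classic demonstration: 29 consecutive hits support
# POD >= 0.90 at 95% confidence.
print(round(pod_lower_bound(29), 3))  # → 0.902
```

When misses occur, the bound generalizes to the Clopper-Pearson interval on the binomial proportion; the zero-failure case above is its closed-form corner.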
McConnell, Bridget L.; Urushihara, Kouji; Miller, Ralph R.
2009-01-01
Three conditioned suppression experiments with rats investigated contrasting predictions made by the extended comparator hypothesis and acquisition-focused models of learning, specifically, modified SOP and the revised Rescorla-Wagner model, concerning retrospective revaluation. Two target cues (X and Y) were partially reinforced using a stimulus relative validity design (i.e., AX-Outcome / BX-No outcome / CY-Outcome / DY-No outcome), and subsequently one of the companion cues for each target was extinguished in compound (BC-No outcome). In Experiment 1, which used spaced trials for relative validity training, greater suppression was observed to target cue Y for which the excitatory companion cue had been extinguished relative to target cue X for which the nonexcitatory companion cue had been extinguished. Experiment 2 replicated these results in a sensory preconditioning preparation. Experiment 3 massed the trials during relative validity training, and the opposite pattern of data was observed. The results are consistent with the predictions of the extended comparator hypothesis. Furthermore, this set of experiments is unique in being able to differentiate between these models without invoking higher-order comparator processes. PMID:20141324
A practical approach to automate randomized design of experiments for ligand-binding assays.
Tsoi, Jennifer; Patel, Vimal; Shih, Judy
2014-03-01
Design of experiments (DOE) is utilized in optimizing ligand-binding assay by modeling factor effects. To reduce the analyst's workload and error inherent with DOE, we propose the integration of automated liquid handlers to perform the randomized designs. A randomized design created from statistical software was imported into custom macro converting the design into a liquid-handler worklist to automate reagent delivery. An optimized assay was transferred to a contract research organization resulting in a successful validation. We developed a practical solution for assay optimization by integrating DOE and automation to increase assay robustness and enable successful method transfer. The flexibility of this process allows it to be applied to a variety of assay designs.
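The macro described above converts a statistical design into an instrument worklist. A hedged sketch of the same idea (the CSV columns, well labels, factor names, and worklist format below are all illustrative, not the authors' actual macro or any specific liquid handler's format):

```python
import csv
import io
import random

def randomized_worklist(design_runs, seed=42):
    """Randomize DOE run order and render a liquid-handler
    worklist as CSV text. A fixed seed makes the randomized
    order reproducible for audit purposes.
    """
    rng = random.Random(seed)
    runs = list(design_runs)
    rng.shuffle(runs)  # randomization guards against run-order bias
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(["well", "coat_conc_ug_ml", "detect_dilution"])
    for well, run in enumerate(runs, start=1):
        writer.writerow([f"A{well}", run["coat"], run["dilution"]])
    return buf.getvalue()

# 2x2 design for a hypothetical ligand-binding assay: coating
# concentration crossed with detection-antibody dilution.
design = [{"coat": c, "dilution": d}
          for c in (1.0, 2.0) for d in ("1:5000", "1:10000")]
print(randomized_worklist(design))
```

In practice the design table would come from the statistical software's export, and the worklist columns would follow the target instrument's import schema.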
NASA Technical Reports Server (NTRS)
Foster, John D.; Moralez, Ernesto, III; Franklin, James A.; Schroeder, Jeffery A.
1987-01-01
Results of a substantial body of ground-based simulation experiments indicate that a high degree of precision of operation for recovery aboard small ships in heavy seas and low visibility with acceptable levels of effort by the pilot can be achieved by integrating the aircraft flight and propulsion controls. The availability of digital fly-by-wire controls makes it feasible to implement an integrated control design to achieve and demonstrate in flight the operational benefits promised by the simulation experience. It remains to validate these systems concepts in flight to establish their value for advanced short takeoff vertical landing (STOVL) aircraft designs. This paper summarizes analytical studies and simulation experiments which provide a basis for the flight research program that will develop and validate critical technologies for advanced STOVL aircraft through the development and evaluation of advanced, integrated control and display concepts, and lays out the plan for the flight program that will be conducted on NASA's V/STOL Research Aircraft (VSRA).
THE VALIDITY OF HUMAN AND COMPUTERIZED WRITING ASSESSMENT
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ronald L. Boring
2005-09-01
This paper summarizes an experiment designed to assess the validity of essay grading between holistic and analytic human graders and a computerized grader based on latent semantic analysis. The validity of a grade was gauged by the extent to which the student's knowledge of the topic correlated with the grader's expert knowledge. To assess knowledge, Pathfinder networks were generated by the student essay writers, the holistic and analytic graders, and the computerized grader. It was found that the computer-generated grades more closely matched the definition of valid grading than did the human-generated grades.
Formulaic Language in Computer-Supported Communication: Theory Meets Reality.
ERIC Educational Resources Information Center
Wray, Alison
2002-01-01
Attempts to validate a psycholinguistic model of language processing. One experiment designed to provide insight into the model involves TALK, a system developed to promote conversational fluency in non-speaking individuals, designed primarily for people with cerebral palsy and motor neuron disease. TALK is demonstrated to be a viable tool for…
Panamanian women's experience of vaginal examination in labour: A questionnaire validation.
Bonilla-Escobar, Francisco J; Ortega-Lenis, Delia; Rojas-Mirquez, Johanna C; Ortega-Loubon, Christian
2016-05-01
to validate a tool that allows healthcare providers to obtain accurate information regarding Panamanian women's thoughts and feelings about vaginal examination during labour, and that can be used in other Latin-American countries. This is a validation study based on a database from a cross-sectional study carried out in two tertiary care hospitals in Panama City, Panama. Women in the immediate postpartum period who had spontaneous labour onset and uncomplicated deliveries were included in the study from April to August 2008. The researchers used a survey designed by Lewin et al. that included 20 questions related to a patient's experience during a vaginal examination. Five constructs (factors) related to a patient's experience of vaginal examination during labour were identified: Approval (Cronbach's alpha 0.72), Perception (0.67), Rejection (0.40), Consent (0.51), and Stress (0.20). The validity of the scale and its constructs was demonstrated for obtaining information related to vaginal examination during labour, including patients' experiences with examination and healthcare staff performance. Utilisation of the scale will allow institutions to identify items that need improvement and to address these areas in order to promote the best care for patients in labour. Copyright © 2016 Elsevier Ltd. All rights reserved.
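The per-construct reliability figures quoted above are Cronbach's alpha values. As a minimal illustration of how alpha is computed (the toy data below are invented, not the study's):

```python
from statistics import pvariance

def cronbach_alpha(item_scores):
    """Cronbach's alpha for internal consistency.

    `item_scores` is a list of per-item score lists, one inner
    list per questionnaire item, aligned across respondents:
    alpha = k/(k-1) * (1 - sum(item variances) / variance of totals)
    """
    k = len(item_scores)
    totals = [sum(resp) for resp in zip(*item_scores)]
    item_var = sum(pvariance(item) for item in item_scores)
    return k / (k - 1) * (1 - item_var / pvariance(totals))

# Toy data: three items answered by four respondents.
items = [
    [3, 4, 3, 5],
    [3, 5, 3, 4],
    [2, 4, 3, 5],
]
print(round(cronbach_alpha(items), 2))  # → 0.89
```

Values near 0.7 or above are conventionally read as acceptable internal consistency, which is why the Rejection (0.40) and Stress (0.20) constructs above would warrant revision.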
Scale/TSUNAMI Sensitivity Data for ICSBEP Evaluations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rearden, Bradley T; Reed, Davis Allan; Lefebvre, Robert A
2011-01-01
The Tools for Sensitivity and Uncertainty Analysis Methodology Implementation (TSUNAMI) software, developed at Oak Ridge National Laboratory (ORNL) as part of the Scale code system, provides unique methods for code validation, gap analysis, and experiment design. For TSUNAMI analysis, sensitivity data are generated for each application and each existing or proposed experiment used in the assessment. The validation of diverse sets of applications requires potentially thousands of data files to be maintained and organized by the user, and a growing number of these files are available through the International Handbook of Evaluated Criticality Safety Benchmark Experiments (IHECSBE) distributed through the International Criticality Safety Benchmark Evaluation Program (ICSBEP). To facilitate the use of the IHECSBE benchmarks in rigorous TSUNAMI validation and gap analysis techniques, ORNL generated Scale/TSUNAMI sensitivity data files (SDFs) for several hundred benchmarks for distribution with the IHECSBE. For the 2010 edition of the IHECSBE, the sensitivity data were generated using 238-group cross-section data based on ENDF/B-VII.0 for 494 benchmark experiments. Additionally, ORNL has developed a quality assurance procedure to guide the generation of Scale inputs and sensitivity data, as well as a graphical user interface to facilitate the use of sensitivity data in identifying experiments and applying them in validation studies.
Raza, Syed T
2013-06-01
A circular aortic stapler has been developed to anastomose the open end of the aorta to a size-matched Dacron tube graft in one quick motion and without having to pull sutures through the aortic wall. A prototype was developed, and its design and function were tested in bench experiments and compared with hand-sewn anastomosis. The basic design of the stapler is a central rod (anvil) surrounded by 10 stapling limbs, which can be closed over the anvil in a full circle, with staples extruded by turning a knob at the back. To test its function, a Dacron tube graft was inserted in the middle of a length of bovine aorta. One side was anastomosed with the stapler and the other hand-sewn in each of 10 experiments. Bovine blood was infused under increasing pressure. It took considerably less time to complete the stapled anastomosis than the hand-sewn side (3 minutes, 46 seconds versus 15 minutes, 42 seconds). Initial leak occurred at low pressures on the hand-sewn side (mean pressure 40 mm Hg) compared with the stapled side (mean pressure 70 mm Hg). In 7 of 10 experiments, the leak became too brisk on the hand-sewn side to sustain pressure, compared with 3 of 10 with stapled anastomoses. The stapling device performed well in all cases except when the bovine aorta was too thick for the staples (two cases) or when there was a missed branch at the anastomotic site (one case). These experiments validate the concept and the design of this aortic stapler. There are some limitations in the current design, which will need to be modified before its use in live animals or clinically.
Analytic Modeling of Pressurization and Cryogenic Propellant Conditions for Lunar Landing Vehicle
NASA Technical Reports Server (NTRS)
Corpening, Jeremy
2010-01-01
This slide presentation reviews the development, validation and application of a model for the Lunar Landing Vehicle. The model, named the Computational Propellant and Pressurization Program - One Dimensional (CPPPO), is used here to model the cryogenic propellant conditions of the Altair lunar lander. The validation of CPPPO was accomplished via comparison to an existing analytic model (i.e., ROCETS), a flight experiment and ground experiments. The model was used to perform a parametric analysis of pressurant conditions for the Lunar Landing Vehicle and to examine the results of unequal tank pressurization and draining for multiple tank designs.
NASA Technical Reports Server (NTRS)
Wales, R. O. (Editor)
1981-01-01
The overall mission and spacecraft systems, testing, and operations are summarized. The mechanical subsystems are reviewed, encompassing mechanical design requirements; separation and deployment mechanisms; design and performance evaluation; and the television camera reflector monitor. Thermal control and contamination are discussed in terms of thermal control subsystems, design validation, subsystems performance, the advanced flight experiment, and the quartz-crystal microbalance contamination monitor.
Validated simulator for space debris removal with nets and other flexible tethers applications
NASA Astrophysics Data System (ADS)
Gołębiowski, Wojciech; Michalczyk, Rafał; Dyrek, Michał; Battista, Umberto; Wormnes, Kjetil
2016-12-01
In the context of active debris removal technologies and preparation activities for the e.Deorbit mission, a simulator for the dynamics of net-shaped elastic bodies and their interactions with rigid bodies has been developed. Its main application is to aid net design and to test scenarios for space debris deorbitation. The simulator can model all the phases of the debris capturing process: net launch, flight and wrapping around the target. It handles coupled simulation of rigid and flexible body dynamics. Flexible bodies were implemented using the Cosserat rod model, which allows simulation of flexible threads or wires with elasticity and damping for stretching, bending and torsion. Threads may be combined into structures of any topology, so the software is able to simulate nets, pure tethers, tether bundles, cages, trusses, etc. Full contact dynamics was implemented. Programmatic interaction with the simulation is possible, e.g. for control implementation. The underlying model has been experimentally validated; because of the significant influence of gravity, the experiment had to be performed in microgravity conditions. The validation experiment, flown on a parabolic flight, was a downscaled version of the Envisat capturing process. The prepacked net was launched towards the satellite model, expanded, hit the model and wrapped around it. The whole process was recorded with two fast stereographic camera sets for full 3D trajectory reconstruction. The trajectories were used to compare the net dynamics to the respective simulations and thus to validate the simulation tool. The experiments were performed on board a Falcon-20 aircraft operated by the National Research Council in Ottawa, Canada. The validation results show that the model reflects the physics of the phenomenon accurately enough to be used for scenario evaluation and mission design purposes. The functionalities of the simulator are described in detail in the paper, as well as its underlying model, sample cases and the methodology behind the validation.
Results are presented and typical use cases are discussed, showing that the software may be used to design throw nets for space debris capturing, but also to simulate the deorbitation process, a chaser control system or general interactions between rigid and elastic bodies, all in a convenient and efficient way. The presented work was led by SKA Polska under an ESA contract, within the CleanSpace initiative.
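The lumped-parameter idea behind such flexible-body simulators can be shown in miniature: a tether as a chain of point masses joined by spring-damper segments, stepped with semi-implicit Euler. This is far simpler than the Cosserat rod model the paper uses (1D, stretching and damping only, no bending or torsion), and all parameter values are illustrative.

```python
def step_tether(pos, vel, dt, k=50.0, c=0.5, m=0.1, rest=1.0):
    """One semi-implicit Euler step for a lumped-mass tether in 1D.

    A much-simplified stand-in for a Cosserat rod: point masses
    joined by spring-damper segments of stiffness k, damping c,
    rest length `rest`. The first node is anchored at 0.
    """
    n = len(pos)
    forces = [0.0] * n
    for i in range(n - 1):
        stretch = (pos[i + 1] - pos[i]) - rest
        rel_vel = vel[i + 1] - vel[i]
        f = k * stretch + c * rel_vel   # segment tension
        forces[i] += f
        forces[i + 1] -= f
    new_vel = [vel[i] + dt * forces[i] / m for i in range(n)]
    new_pos = [pos[i] + dt * new_vel[i] for i in range(n)]
    new_pos[0], new_vel[0] = 0.0, 0.0   # anchored end
    return new_pos, new_vel

# Start stretched 10% beyond rest length; the chain oscillates
# and settles toward its rest configuration near [0, 1, 2].
pos = [0.0, 1.1, 2.2]
vel = [0.0, 0.0, 0.0]
for _ in range(200):
    pos, vel = step_tether(pos, vel, dt=0.005)
```

A production simulator adds bending and torsion stiffness, contact, and an implicit or energy-preserving integrator; the explicit step above is only stable for sufficiently small `dt` relative to the stiffness.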
Zimmermann, Karin; Cignacco, Eva; Eskola, Katri; Engberg, Sandra; Ramelet, Anne-Sylvie; Von der Weid, Nicolas; Bergstraesser, Eva
2015-12-01
To develop and test the Parental PELICAN Questionnaire, an instrument to retrospectively assess parental experiences and needs during their child's end-of-life care. To offer appropriate care for dying children, healthcare professionals need to understand the illness experience from the family perspective. A questionnaire specific to the end-of-life experiences and needs of parents losing a child is needed to evaluate the perceived quality of paediatric end-of-life care. This is an instrument development study applying mixed methods based on recommendations for questionnaire design and validation. The Parental PELICAN Questionnaire was developed in four phases between August 2012 and March 2014: phase 1: item generation; phase 2: validity testing; phase 3: translation; phase 4: pilot testing. Psychometric properties were assessed after applying the Parental PELICAN Questionnaire in a sample of 224 bereaved parents in April 2014. Validity testing covered the evidence based on tests of content, internal structure and relations to other variables. The Parental PELICAN Questionnaire consists of approximately 90 items in four slightly different versions accounting for particularities of the four diagnostic groups. The questionnaire's items were structured according to six quality domains described in the literature. Evidence of initial validity and reliability could be demonstrated with the involvement of healthcare professionals and bereaved parents. The Parental PELICAN Questionnaire holds promise as a measure to assess parental experiences and needs and is applicable to a broad range of paediatric specialties and settings. Future validation is needed to evaluate its suitability in different cultures. © 2015 John Wiley & Sons Ltd.
Validation of a wireless modular monitoring system for structures
NASA Astrophysics Data System (ADS)
Lynch, Jerome P.; Law, Kincho H.; Kiremidjian, Anne S.; Carryer, John E.; Kenny, Thomas W.; Partridge, Aaron; Sundararajan, Arvind
2002-06-01
A wireless sensing unit for use in a Wireless Modular Monitoring System (WiMMS) has been designed and constructed. Drawing upon advanced technological developments in the areas of wireless communications, low-power microprocessors and micro-electro mechanical system (MEMS) sensing transducers, the wireless sensing unit represents a high-performance yet low-cost solution to monitoring the short-term and long-term performance of structures. A sophisticated reduced instruction set computer (RISC) microcontroller is placed at the core of the unit to accommodate on-board computations, measurement filtering and data interrogation algorithms. The functionality of the wireless sensing unit is validated through various experiments involving multiple sensing transducers interfaced to the sensing unit. In particular, MEMS-based accelerometers are used as the primary sensing transducer in this study's validation experiments. A five degree of freedom scaled test structure mounted upon a shaking table is employed for system validation.
Uth, Nicholas; Mueller, Jens; Smucker, Byran; Yousefi, Azizeh-Mitra
2017-02-21
This study reports the development of biological/synthetic scaffolds for bone tissue engineering (TE) via 3D bioplotting. These scaffolds were composed of poly(L-lactic-co-glycolic acid) (PLGA), type I collagen, and nano-hydroxyapatite (nHA) in an attempt to mimic the extracellular matrix of bone. The solvent used for processing the scaffolds was 1,1,1,3,3,3-hexafluoro-2-propanol. The produced scaffolds were characterized by scanning electron microscopy, microcomputed tomography, thermogravimetric analysis, and unconfined compression test. This study also sought to validate the use of finite-element optimization in COMSOL Multiphysics for scaffold design. Scaffold topology was simplified to three factors: nHA content, strand diameter, and strand spacing. These factors affect the ability of the scaffold to bear mechanical loads and how porous the structure can be. Twenty four scaffolds were constructed according to an I-optimal, split-plot designed experiment (DE) in order to generate experimental models of the factor-response relationships. Within the design region, the DE and COMSOL models agreed in their recommended optimal nHA (30%) and strand diameter (460 μm). However, the two methods disagreed by more than 30% in strand spacing (908 μm for DE; 601 μm for COMSOL). Seven scaffolds were 3D-bioplotted to validate the predictions of DE and COMSOL models (4.5-9.9 MPa measured moduli). The predictions for these scaffolds showed relative agreement for scaffold porosity (mean absolute percentage error of 4% for DE and 13% for COMSOL), but were substantially poorer for scaffold modulus (51% for DE; 21% for COMSOL), partly due to some simplifying assumptions made by the models. 
Expanding the design region in future experiments (e.g., higher nHA content and strand diameter), developing an efficient solvent evaporation method, and exerting a greater control over layer overlap could allow developing PLGA-nHA-collagen scaffolds to meet the mechanical requirements for bone TE.
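The model-agreement figures quoted above are mean absolute percentage errors. A minimal sketch of the metric (the measured and predicted values below are invented, loosely spanning the reported 4.5-9.9 MPa modulus range):

```python
def mape(actual, predicted):
    """Mean absolute percentage error between measured values
    and model predictions, expressed in percent.
    """
    errors = [abs(a - p) / abs(a) for a, p in zip(actual, predicted)]
    return 100.0 * sum(errors) / len(errors)

# Hypothetical measured scaffold moduli (MPa) vs model predictions:
measured = [4.5, 6.2, 7.8, 9.9]
predicted = [4.7, 6.0, 8.1, 9.5]
print(round(mape(measured, predicted), 1))  # → 3.9
```

Because each error is normalized by the measured value, MAPE penalizes misses on small measurements more heavily, which is worth keeping in mind when moduli span a wide range.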
System Identification of a Heaving Point Absorber: Design of Experiment and Device Modeling
Bacelli, Giorgio; Coe, Ryan; Patterson, David; ...
2017-04-01
Empirically based modeling is an essential aspect of design for a wave energy converter. These models are used in structural, mechanical and control design processes, as well as for performance prediction. The design of experiments and the methods used to produce models from collected data have a strong impact on the quality of the model. This study considers the system identification and model validation process based on data collected from a wave tank test of a model-scale wave energy converter. Experimental design and data processing techniques based on general system identification procedures are discussed and compared with the practices often followed for wave tank testing. The general system identification processes are shown to have a number of advantages. The experimental data are then used to produce multiple models for the dynamics of the device. These models are validated and their performance is compared against one another. Furthermore, while most models of wave energy converters use a formulation with wave elevation as an input, this study shows that a model using a hull pressure sensor to incorporate the wave excitation phenomenon has better accuracy.
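As a toy illustration of the empirically based modeling theme (not the study's actual wave energy converter models, which are far richer), a first-order discrete model can be fitted to input/output records by least squares:

```python
def fit_first_order(u, y):
    """Least-squares fit of y[k+1] = a*y[k] + b*u[k], a minimal
    instance of fitting a dynamic model to input/output records.
    Solves the 2x2 normal equations directly.
    """
    n = len(y) - 1
    syy = sum(y[k] * y[k] for k in range(n))
    syu = sum(y[k] * u[k] for k in range(n))
    suu = sum(u[k] * u[k] for k in range(n))
    sy1y = sum(y[k + 1] * y[k] for k in range(n))
    sy1u = sum(y[k + 1] * u[k] for k in range(n))
    det = syy * suu - syu * syu
    a = (sy1y * suu - sy1u * syu) / det
    b = (sy1u * syy - sy1y * syu) / det
    return a, b

# Synthetic noiseless data from a known system (a=0.9, b=0.5);
# the fit recovers the true parameters from the records alone.
u = [1.0, 0.0, -1.0, 0.5, 1.0, -0.5, 0.0, 1.0]
y = [0.0]
for k in range(len(u) - 1):
    y.append(0.9 * y[k] + 0.5 * u[k])
a, b = fit_first_order(u, y)
print(round(a, 3), round(b, 3))  # → 0.9 0.5
```

With measurement noise the estimates become statistical, and the quality of the excitation signal `u` (its richness across frequencies) governs how well the parameters are determined, which is exactly the experimental-design concern the abstract raises.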
Flight control system design factors for applying automated testing techniques
NASA Technical Reports Server (NTRS)
Sitz, Joel R.; Vernon, Todd H.
1990-01-01
Automated validation of flight-critical embedded systems is being done at ARC Dryden Flight Research Facility. The automated testing techniques are being used to perform closed-loop validation of man-rated flight control systems. The principal design features and operational experiences of the X-29 forward-swept-wing aircraft and F-18 High Alpha Research Vehicle (HARV) automated test systems are discussed. Operationally applying automated testing techniques has accentuated flight control system features that either help or hinder the application of these techniques. The paper also discusses flight control system features which foster the use of automated testing techniques.
Ensuring the Quality of Evidence: Using the Best Design to Answer Health IT Questions.
Weir, Charlene R
2016-01-01
The quality of logic in a research design determines the value of the results and our confidence regarding the validity of the findings. The purpose of this contribution is to review the principles of research design as they apply to research and evaluation in health IT. We review the architecture of research design, the definitions of cause, sources of bias and confounds, and the importance of measurement as related to the various types of health IT questions. The goal is to provide practitioners a roadmap for making decisions for their own specific study. The contribution is organized around the Threats to Validity taxonomy and explains how different design models address these threats through the use of blocking, factorial design, control groups and time series analysis. The contribution discusses randomized experiments, and includes regression discontinuity designs and various quasi-experimental designs with a special emphasis on how to improve pre/post designs. At the end, general recommendations are provided for improving weaker designs and general research procedures.
Validating a new methodology for optical probe design and image registration in fNIRS studies
Wijeakumar, Sobanawartiny; Spencer, John P.; Bohache, Kevin; Boas, David A.; Magnotta, Vincent A.
2015-01-01
Functional near-infrared spectroscopy (fNIRS) is an imaging technique that relies on the principle of shining near-infrared light through tissue to detect changes in hemodynamic activation. An important methodological issue encountered is the creation of optimized probe geometry for fNIRS recordings. Here, across three experiments, we describe and validate a processing pipeline designed to create an optimized, yet scalable probe geometry based on selected regions of interest (ROIs) from the functional magnetic resonance imaging (fMRI) literature. In experiment 1, we created a probe geometry optimized to record changes in activation from target ROIs important for visual working memory. Positions of the sources and detectors of the probe geometry on an adult head were digitized using a motion sensor and projected onto a generic adult atlas and a segmented head obtained from the subject's MRI scan. In experiment 2, the same probe geometry was scaled down to fit a child's head and later digitized and projected onto the generic adult atlas and a segmented volume obtained from the child's MRI scan. Using visualization tools and by quantifying the amount of intersection between target ROIs and channels, we show that out of 21 ROIs, 17 and 19 ROIs intersected with fNIRS channels from the adult and child probe geometries, respectively. Further, both the adult atlas and adult subject-specific MRI approaches yielded similar results and can be used interchangeably. However, results suggest that segmented heads obtained from MRI scans be used for registering children's data. Finally, in experiment 3, we further validated our processing pipeline by creating a different probe geometry designed to record from target ROIs involved in language and motor processing. PMID:25705757
WOLF REXUS EXPERIMENT - European Planetary Science Congress
NASA Astrophysics Data System (ADS)
Buzdugan, A.
2017-09-01
The WOLF experiment is developing a reaction wheel-based control system that functions as an active nutation damper. One reaction wheel is used to reduce the undesirable lateral rates of spinning, cylindrically symmetric free-falling units ejected from a sounding rocket. Once validated in a REXUS flight, the concept and design developed during the WOLF experiment can be used for other applications that require a flat spin of the free-falling units.
Saavedra, Javier; López, Marcelino; Gonzáles, Sergio; Cubero, Rosario
2016-09-01
Employment has been highlighted as a determinant of health and as an essential milestone in the recovery process of people with serious mental illness. Different types of programs and public services have been designed to improve the employability of this population. However, there has not been much interest in the meanings attributed to these experiences and the negative aspects of work experience. In this research, we explored the meanings that participants attributed to their work experience and the impact of work on their recovery process. Research participants lived in Andalusia (Spain), a region in southern Europe with a high unemployment rate. Two versions of a semi-structured interview were designed: one for people who were working, and one for unemployed people. Participants' narratives were categorized according to grounded theory and the analyses were validated in group sessions. Apart from several positive effects for recovery, the analysis of the narratives about work experience outlined certain obstacles to recovery. For example, participants mentioned personal conflicts and stress, job insecurity and meaningless jobs. While valid, the idea that employment is beneficial for recovery must be qualified by the personal meanings attributed to these experiences, and the specific cultural and economic factors of each context.
Validation of the Small Hot Jet Acoustic Rig for Jet Noise Research
NASA Technical Reports Server (NTRS)
Bridges, James; Brown, Clifford A.
2005-01-01
The development and acoustic validation of the Small Hot Jet Aeroacoustic Rig (SHJAR) is documented. Originally conceived to support fundamental research in jet noise, the rig has been designed and developed using the best practices of the industry. While validating the rig for acoustic work, a method of characterizing all extraneous rig noise was developed. With this in hand, the researcher can know when the jet data being measured are contaminated and design the experiment around this limitation. Also considered is the question of uncertainty, where it is shown that there is a fundamental uncertainty of roughly 0.5 dB in even the best experiments, confirmed by repeatability studies. One area not generally accounted for in uncertainty analysis is the variation that can result from differences in the initial condition of the nozzle shear layer. This initial condition was modified and the differences in both flow and sound were documented. The bottom line is that extreme caution must be applied when working on small jet rigs, but that highly accurate results can be obtained independent of scale.
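The repeatability studies mentioned above can be sketched in a few lines: the run-to-run standard deviation of repeated measurements bounds the achievable uncertainty. The SPL values below are invented for illustration and are not SHJAR data.

```python
import numpy as np

# Hypothetical repeated sound-pressure-level measurements (dB) of the
# same jet operating condition on different days; values are made up.
spl_runs = np.array([110.2, 110.6, 109.9, 110.4, 110.1, 110.5])

mean_spl = spl_runs.mean()
run_to_run = spl_runs.std(ddof=1)        # sample standard deviation, dB
# 95% confidence half-width on the mean (normal approximation)
half_width = 1.96 * run_to_run / np.sqrt(len(spl_runs))
print(f"mean = {mean_spl:.2f} dB, repeatability = {run_to_run:.2f} dB")
```

A run-to-run scatter of a few tenths of a dB is consistent with the "0.5 dB or so" floor the abstract describes for the best small-rig experiments.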
NASA Technical Reports Server (NTRS)
Harris, Charles D.; Harvey, William D.; Brooks, Cuyler W., Jr.
1988-01-01
A large-chord, swept, supercritical, laminar-flow-control (LFC) airfoil was designed and constructed and is currently undergoing tests in the Langley 8 ft Transonic Pressure Tunnel. The experiment was directed toward evaluating the compatibility of LFC and supercritical airfoils, validating prediction techniques, and generating a data base for future transport airfoil design as part of NASA's ongoing research program to significantly reduce drag and increase aircraft efficiency. Unique features of the airfoil included a high design Mach number with shock free flow and boundary layer control by suction. Special requirements for the experiment included modifications to the wind tunnel to achieve the necessary flow quality and contouring of the test section walls to simulate free air flow about a swept model at transonic speeds. Design of the airfoil with a slotted suction surface, the suction system, and modifications to the tunnel to meet test requirements are discussed.
The Value of Interrupted Time-Series Experiments for Community Intervention Research
Biglan, Anthony; Ary, Dennis; Wagenaar, Alexander C.
2015-01-01
Greater use of interrupted time-series experiments is advocated for community intervention research. Time-series designs enable the development of knowledge about the effects of community interventions and policies in circumstances in which randomized controlled trials are too expensive, premature, or simply impractical. The multiple baseline time-series design typically involves two or more communities that are repeatedly assessed, with the intervention introduced into one community at a time. It is particularly well suited to initial evaluations of community interventions and the refinement of those interventions. This paper describes the main features of multiple baseline designs and related repeated-measures time-series experiments, discusses the threats to internal validity in multiple baseline designs, and outlines techniques for statistical analyses of time-series data. Examples are given of the use of multiple baseline designs in evaluating community interventions and policy changes. PMID:11507793
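Interrupted time-series data of the kind advocated above are commonly analyzed with a segmented regression that estimates a level change at the intervention point. The sketch below uses simulated, noise-free data so the recovered coefficients are exact; all names and values are illustrative only.

```python
import numpy as np

# Simulated monthly outcome for one community: baseline level 10,
# slope 0.2 per month, and a level drop of -5 after an intervention
# introduced at month 24 (noise-free for clarity).
t = np.arange(48)
post = (t >= 24).astype(float)           # intervention indicator
y = 10.0 + 0.2 * t - 5.0 * post          # simulated series

# Segmented regression: y ~ intercept + slope*t + level_change*post
X = np.column_stack([np.ones_like(post), t, post])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
intercept, slope, level_change = coef
print(intercept, slope, level_change)
```

In a multiple baseline design the same model is fit per community, with the intervention indicator switching on at a different time in each, which is what rules out a shared history effect as the explanation for the level change.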
Henderson, Valerie C; Kimmelman, Jonathan; Fergusson, Dean; Grimshaw, Jeremy M; Hackam, Dan G
2013-01-01
The vast majority of medical interventions introduced into clinical development prove unsafe or ineffective. One prominent explanation for the dismal success rate is flawed preclinical research. We conducted a systematic review of preclinical research guidelines and organized recommendations according to the type of validity threat (internal, construct, or external) or programmatic research activity they primarily address. We searched MEDLINE, Google Scholar, Google, and the EQUATOR Network website for all preclinical guideline documents published up to April 9, 2013 that addressed the design and conduct of in vivo animal experiments aimed at supporting clinical translation. To be eligible, documents had to provide guidance on the design or execution of preclinical animal experiments and represent the aggregated consensus of four or more investigators. Data from included guidelines were independently extracted by two individuals for discrete recommendations on the design and implementation of preclinical efficacy studies. These recommendations were then organized according to the type of validity threat they addressed. A total of 2,029 citations were identified through our search strategy. From these, we identified 26 guidelines that met our eligibility criteria, most of which were directed at neurological or cerebrovascular drug development. Together, these guidelines offered 55 different recommendations. Some of the most common recommendations included performance of a power calculation to determine sample size, randomized treatment allocation, and characterization of disease phenotype in the animal model prior to experimentation. By identifying the most recurrent recommendations among preclinical guidelines, we provide a starting point for developing preclinical guidelines in other disease domains. We also provide a basis for the study and evaluation of preclinical research practice.
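One of the recurrent recommendations above, performing a power calculation to determine sample size, can be sketched with the standard normal-approximation formula for comparing two group means. The function name is ours, not from the guidelines, and the quantile constants correspond to two-sided alpha = 0.05 and 80% power.

```python
from math import ceil

# Standard-normal quantiles for two-sided alpha = 0.05 and 80% power.
Z_ALPHA = 1.959964   # z_{1 - 0.05/2}
Z_POWER = 0.841621   # z_{0.80}

def per_group_n(effect_size, z_alpha=Z_ALPHA, z_power=Z_POWER):
    """Per-group sample size for a two-sample comparison of means,
    normal approximation: n = 2 * (z_alpha + z_power)**2 / d**2,
    where d is the standardized effect size (Cohen's d)."""
    return ceil(2 * (z_alpha + z_power) ** 2 / effect_size ** 2)

print(per_group_n(0.5))   # medium effect -> 63 animals per group
print(per_group_n(1.0))   # large effect  -> 16 animals per group
```

The exact t-test answer is slightly larger than the normal approximation (64 vs. 63 for d = 0.5), but the approximation makes the point: halving the expected effect size quadruples the required sample.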
Managing design excellence tools during the development of new orthopaedic implants.
Défossez, Henri J P; Serhan, Hassan
2013-11-01
Design excellence (DEX) tools have been widely used for years in some industries for their potential to facilitate new product development. The medical sector, targeted by cost pressures, has therefore started adopting them. Numerous tools are available; however, only appropriate deployment during the new product development stages can optimize the overall process. The primary study objectives were to describe generic tools, illustrate their implementation and management during the development of new orthopaedic implants, and compile a reference package. Secondary objectives were to present the DEX tool investment costs and savings, since the method can require significant resources for which companies must carefully plan. The publicly available DEX method "Define Measure Analyze Design Verify Validate" was adopted and implemented during the development of a new spinal implant. Several tools proved most successful at developing the correct product, addressing clinical needs, and increasing market penetration potential, while reducing design iterations and manufacturing validations. Cost analysis and a Pugh Matrix coupled with multi-generation planning enabled developing a strong rationale to activate the project and set the vision and goals. Improved risk management and product mapping established a robust technical verification-validation program. Design of experiments and process quantification facilitated design for manufacturing of critical features as early as the concept phase. Biomechanical testing with analysis of variance provided a validation model with a recognized statistical performance baseline.
Within those tools, only certain ones required minimum resources (i.e., business case, multi-generational plan, project value proposition, Pugh Matrix, critical-to-quality process validation techniques), while others required significant investments (i.e., voice of customer, product usage map, improved risk management, design of experiments, biomechanical testing techniques). All used techniques provided savings exceeding investment costs. Some other tools were considered and found less relevant. A matrix summarized the investment costs and estimated generated savings. Globally, all companies can benefit from using DEX by smartly selecting and estimating those tools with the best return on investment at the start of the project. For this, a good understanding of the available company resources, background, and development strategy is needed. In conclusion, it was possible to illustrate that appropriate management of design excellence tools can greatly facilitate the development of new orthopaedic implant systems.
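The biomechanical testing with analysis of variance mentioned above can be illustrated with a one-way ANOVA computed directly from sums of squares. The failure-load numbers below are invented for illustration and are not from the study.

```python
import numpy as np

# Hypothetical failure-load measurements (N) for three implant design
# variants; four specimens per variant, values illustrative only.
groups = [
    np.array([812.0, 798.0, 825.0, 805.0]),
    np.array([840.0, 851.0, 833.0, 846.0]),
    np.array([815.0, 822.0, 810.0, 819.0]),
]

# One-way ANOVA from sums of squares.
all_obs = np.concatenate(groups)
grand = all_obs.mean()
ss_between = sum(len(g) * (g.mean() - grand) ** 2 for g in groups)
ss_within = sum(((g - g.mean()) ** 2).sum() for g in groups)
df_between = len(groups) - 1
df_within = len(all_obs) - len(groups)
F = (ss_between / df_between) / (ss_within / df_within)
print(f"F({df_between}, {df_within}) = {F:.2f}")
```

A large F ratio (here about 16 on 2 and 9 degrees of freedom) indicates that between-variant differences dominate specimen-to-specimen scatter, which is what establishes a statistical performance baseline for the design.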
Development of an Integrated Nozzle for a Symmetric, RBCC Launch Vehicle Configuration
NASA Technical Reports Server (NTRS)
Smith, Timothy D.; Canabal, Francisco, III; Rice, Tharen; Blaha, Bernard
2000-01-01
The development of rocket based combined cycle (RBCC) engines is highly dependent upon integrating several different modes of operation into a single system. One of the key components in developing acceptable performance levels through each mode of operation is the nozzle. It must be highly integrated to serve the expansion processes of both rocket and air-breathing modes without undue weight, drag, or complexity. The NASA GTX configuration requires a fixed-geometry, altitude-compensating nozzle configuration. The initial configuration, used mainly to estimate weight and cooling requirements, was a 15° half-angle cone, which cuts a concave surface from a point within the flowpath to the vehicle trailing edge. Results of 3-D CFD calculations on this geometry are presented. To address the critical issues associated with integrated, fixed-geometry, multimode nozzle development, the GTX team has initiated a series of tasks to evolve the nozzle design and validate performance levels. An overview of these tasks is given. The first element is a design activity to develop tools for integration of efficient expansion surfaces with the existing flowpath and vehicle aft-body, and to develop a second-generation nozzle design. A preliminary result using a "streamline-tracing" technique is presented. As the nozzle design evolves, a combination of 3-D CFD analysis and experimental evaluation will be used to validate the design procedure and determine the installed performance for propulsion cycle modeling. The initial experimental effort will consist of cold-flow experiments designed to validate the general trends of the streamline-tracing methodology and anchor the CFD analysis. Experiments will also be conducted to simulate nozzle performance during each mode of operation. As the design matures, hot-fire tests will be conducted to refine performance estimates and anchor more sophisticated reacting-flow analysis.
Results from SMAP Validation Experiments 2015 and 2016
NASA Astrophysics Data System (ADS)
Colliander, A.; Jackson, T. J.; Cosh, M. H.; Misra, S.; Crow, W.; Powers, J.; Wood, E. F.; Mohanty, B.; Judge, J.; Drewry, D.; McNairn, H.; Bullock, P.; Berg, A. A.; Magagi, R.; O'Neill, P. E.; Yueh, S. H.
2017-12-01
NASA's Soil Moisture Active Passive (SMAP) mission was launched in January 2015. The objective of the mission is global mapping of soil moisture and freeze/thaw state. Well-characterized sites with calibrated in situ soil moisture measurements are used to determine the quality of the soil moisture data products; these sites are designated as core validation sites (CVS). To support the CVS-based validation, airborne field experiments are used to provide high-fidelity validation data and to improve the SMAP retrieval algorithms. The SMAP project and NASA coordinated airborne field experiments at three CVS locations in 2015 and 2016. SMAP Validation Experiment 2015 (SMAPVEX15) was conducted around the Walnut Gulch CVS in Arizona in August 2015. SMAPVEX16 was conducted at the South Fork CVS in Iowa and the Carman CVS in Manitoba, Canada from May to August 2016. The airborne PALS (Passive Active L-band Sensor) instrument mapped all experiment areas several times, resulting in 30 coincident measurements with SMAP. The experiments included an intensive ground sampling regime consisting of manual sampling and augmentation of the CVS soil moisture measurements with temporary networks of soil moisture sensors. Analyses using the data from these experiments have produced various results regarding the SMAP validation and related science questions. The SMAPVEX15 data set has been used for calibration of a hyper-resolution model for soil moisture product validation; development of a multi-scale parameterization approach for surface roughness; and validation of disaggregation of SMAP soil moisture with optical/thermal signals.
The SMAPVEX16 data set has already been used for studying the spatial upscaling within a pixel with highly heterogeneous soil texture distribution; for understanding the process of radiative transfer at plot scale in relation to field scale and SMAP footprint scale over highly heterogeneous vegetation distribution; for testing a data fusion based soil moisture downscaling approach; and for investigating soil moisture impact on estimation of vegetation fluorescence from airborne measurements. The presentation will describe the collected data and showcase some of the most important results achieved so far.
Results of the Vapor Compression Distillation Flight Experiment (VCD-FE)
NASA Technical Reports Server (NTRS)
Hutchens, Cindy; Graves, Rex
2004-01-01
Vapor Compression Distillation (VCD) is the chosen technology for urine processing aboard the International Space Station (ISS). Key aspects of the VCD design have been verified and significant improvements made throughout the ground-based development history. However, an important element lacking from previous subsystem development efforts was flight-testing. Consequently, the demonstration and validation of the VCD technology and the investigation of subsystem performance in micro-gravity were the primary goals of the VCD-FE. The Vapor Compression Distillation Flight Experiment (VCD-FE) was a flight experiment aboard the Space Shuttle Columbia during the STS-107 mission. The VCD-FE was a full-scale developmental version of the Space Station Urine Processor Assembly (UPA) and was designed to test some of the potential micro-gravity issues with the design. This paper summarizes the experiment results.
A Complex Systems Approach to Causal Discovery in Psychiatry.
Saxe, Glenn N; Statnikov, Alexander; Fenyo, David; Ren, Jiwen; Li, Zhiguo; Prasad, Meera; Wall, Dennis; Bergman, Nora; Briggs, Ernestine C; Aliferis, Constantin
2016-01-01
Conventional research methodologies and data analytic approaches in psychiatric research are unable to reliably infer causal relations without experimental designs, or to make inferences about the functional properties of the complex systems in which psychiatric disorders are embedded. This article describes a series of studies to validate a novel hybrid computational approach, the Complex Systems-Causal Network (CS-CN) method, designed to integrate causal discovery within a complex systems framework for psychiatric research. The CS-CN method was first applied to an existing dataset on psychopathology in 163 children hospitalized with injuries (validation study). Next, it was applied to a much larger dataset of traumatized children (replication study). Finally, the CS-CN method was applied in a controlled experiment using a 'gold standard' dataset for causal discovery and compared with other methods for accurately detecting causal variables (resimulation controlled experiment). The CS-CN method successfully detected a causal network of 111 variables and 167 bivariate relations in the initial validation study. This causal network had well-defined adaptive properties and a set of variables was found that disproportionally contributed to these properties. Modeling the removal of these variables resulted in significant loss of adaptive properties. The CS-CN method was successfully applied in the replication study and performed better than traditional statistical methods, and similarly to state-of-the-art causal discovery algorithms in the causal detection experiment. The CS-CN method was validated, replicated, and yielded both novel and previously validated findings related to risk factors and potential treatments of psychiatric disorders. The novel approach yields both fine-grain (micro) and high-level (macro) insights and thus represents a promising approach for complex systems-oriented research in psychiatry.
Subramanian, Venkatesan; Nagappan, Kannappan; Sandeep Mannemala, Sai
2015-01-01
A sensitive, accurate, precise, and rapid HPLC-PDA method was developed and validated for the simultaneous determination of torasemide and spironolactone in human plasma using design of experiments. A central composite design was used to optimize the method, with content of acetonitrile, concentration of buffer, and pH of the mobile phase as independent variables, while the retention factor of spironolactone, the resolution between torasemide and phenobarbitone, and the retention time of phenobarbitone were chosen as dependent variables. The chromatographic separation was achieved on a Phenomenex C(18) column with a mobile phase comprising 20 mM potassium dihydrogen orthophosphate buffer (pH 3.2) and acetonitrile (82.5:17.5 v/v) pumped at a flow rate of 1.0 mL min(-1). The method was validated according to USFDA guidelines in terms of selectivity, linearity, accuracy, precision, recovery, and stability. The limit of quantitation values were 80 and 50 ng mL(-1) for torasemide and spironolactone, respectively. Furthermore, the sensitivity and simplicity of the method suggest its validity for routine clinical studies.
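A central composite design like the one used above (three factors: acetonitrile content, buffer concentration, mobile-phase pH) has a fixed structure in coded units that is easy to generate. The sketch below builds a rotatable CCD; the function name and number of center points are our choices, not from the paper.

```python
import numpy as np
from itertools import product

def central_composite(k, n_center=3):
    """Rotatable central composite design in coded units:
    2**k factorial points at +/-1, 2k axial (star) points at
    +/- alpha = (2**k)**0.25, and n_center center-point replicates."""
    factorial = np.array(list(product([-1.0, 1.0], repeat=k)))
    alpha = (2.0 ** k) ** 0.25
    axial = np.zeros((2 * k, k))
    for i in range(k):
        axial[2 * i, i] = -alpha
        axial[2 * i + 1, i] = alpha
    center = np.zeros((n_center, k))
    return np.vstack([factorial, axial, center])

# Three factors: 8 factorial + 6 axial + 3 center = 17 coded runs,
# which are then mapped to real factor ranges before the experiments.
design = central_composite(3)
print(design.shape)
```

Each coded run is decoded to physical settings (e.g., -1 and +1 mapped to the low and high acetonitrile contents), and a quadratic response-surface model is then fit to the measured dependent variables.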
Mitigating Communication Delays in Remotely Connected Hardware-in-the-loop Experiments
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cale, James; Johnson, Brian; Dall'Anese, Emiliano
2018-03-30
This paper introduces an approach for mitigating the effects of communication delays between multiple closed-loop hardware-in-the-loop experiments that are virtually connected yet physically separated. The approach consists of an analytical method for the compensation of communication delays, along with the supporting computational and communication infrastructure. The control design leverages tools for the design of observers for the compensation of measurement errors in systems with time-varying delays. The proposed methodology is validated through computer simulation and hardware experimentation connecting hardware-in-the-loop experiments conducted between laboratories separated by a distance of over 100 km.
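The paper's observer-based design handles time-varying delays and measurement errors; as a much-simplified illustration of the underlying idea, the sketch below forward-predicts a delayed measurement of a scalar plant through a known model over a fixed delay. All names, models, and numbers are hypothetical, not the authors' method.

```python
# Scalar plant x[k+1] = a*x[k] + b*u[k]; the measurement received over
# the network lags the true state by d steps.
a, b, d = 0.95, 0.5, 5

def predict_current(x_delayed, recent_inputs):
    """Replay the inputs applied since the delayed sample through the
    plant model to estimate the present state."""
    x = x_delayed
    for u in recent_inputs:
        x = a * x + b * u
    return x

# Simulate the true plant.
inputs = [0.1 * i for i in range(20)]
states = [1.0]
for u in inputs:
    states.append(a * states[-1] + b * u)

# At step k the remote side only has the measurement from step k-d;
# replaying inputs[k-d:k] recovers the current state exactly here
# because the model matches the plant (a real compensator adds an
# observer to absorb model mismatch and time-varying delay).
k = 15
estimate = predict_current(states[k - d], inputs[k - d:k])
print(estimate, states[k])
```

The essential requirement is that each site buffers its recent control inputs, so a late-arriving measurement can be rolled forward before it enters the local closed loop.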
Spacelab Life Sciences 1 - The stepping stone
NASA Technical Reports Server (NTRS)
Dalton, B. P.; Leon, H.; Hogan, R.; Clarke, B.; Tollinger, D.
1988-01-01
The Spacelab Life Sciences 1 (SLS-1) mission, scheduled for launch in March 1990, will study the effects of microgravity on physiological parameters of humans and animals. The data obtained will guide equipment design, performance of activities involving the use of animals, and prediction of human physiological responses during long-term microgravity exposure. The experiments planned for the SLS-1 mission include a particulate-containment demonstration test, integrated rodent experiments, jellyfish experiments, and validation of the small-mass measuring instrument. The design and operation of the Research Animal Holding Facility, General-Purpose Work Station, General-Purpose Transfer Unit, and Animal Enclosure Module are discussed and illustrated with drawings and diagrams.
RACEWAY REACTOR FOR MICROALGAL BIODIESEL PRODUCTION
The proposed mathematical model incorporating mass transfer, hydraulics, carbonate/aquatic chemistry, biokinetics, biology and reactor design will be calibrated and validated using the data to be generated from the experiments. The practical feasibility of the proposed reactor...
Computational Design of Functional Ca-S-H and Oxide-Doped Alloy Systems
NASA Astrophysics Data System (ADS)
Yang, Shizhong; Chilla, Lokeshwar; Yang, Yan; Li, Kuo; Wicker, Scott; Zhao, Guang-Lin; Khosravi, Ebrahim; Bai, Shuju; Zhang, Boliang; Guo, Shengmin
Computer-aided functional materials design accelerates the discovery of novel materials. This presentation covers our recent research advances in property prediction for the Ca-S-H system and in simulation and experimental validation of the properties of oxide-doped high-entropy alloys. Several recently developed computational materials design methods were applied to predict the physical and chemical properties of the two systems. A comparison of simulation results with the corresponding experimental data is presented. This research is partially supported by the NSF CIMM project (OIA-15410795 and the Louisiana BoR), NSF HBCU Supplement climate change and ecosystem sustainability subproject 3, and the LONI high-performance computing time allocation loni mat bio7.
Results of Microgravity Fluid Dynamics Captured With the Spheres-Slosh Experiment
NASA Technical Reports Server (NTRS)
Lapilli, Gabriel; Kirk, Daniel; Gutierrez, Hector; Schallhorn, Paul; Marsell, Brandon; Roth, Jacob; Moder, Jeffrey
2015-01-01
This paper provides an overview of the SPHERES-Slosh Experiment (SSE) aboard the International Space Station (ISS) and presents on-orbit results with data analysis. In order to predict the location of the liquid propellant during all times of a spacecraft mission, engineers and mission analysts utilize Computational Fluid Dynamics (CFD). These state-of-the-art computer programs numerically solve the fluid flow equations to predict the location of the fluid at any point in time during different spacecraft maneuvers. The models and equations used by these programs have been extensively validated on the ground, but long duration data has never been acquired in a microgravity environment. The SSE aboard the ISS is designed to acquire this type of data, used by engineers on earth to validate and improve the CFD prediction models, improving the design of the next generation of space vehicles as well as the safety of current missions. The experiment makes use of two Synchronized Position Hold, Engage, Reorient Experimental Satellites (SPHERES) connected by a frame. In the center of the frame there is a plastic, pill shaped tank that is partially filled with green-colored water. A pair of high resolution cameras records the movement of the liquid inside the tank as the experiment maneuvers within the Japanese Experimental Module test volume. Inertial measurement units record the accelerations and rotations of the tank, making the combination of stereo imaging and inertial data the inputs for CFD model validation.
Developments in Sensitivity Methodologies and the Validation of Reactor Physics Calculations
Palmiotti, Giuseppe; Salvatores, Massimo
2012-01-01
Sensitivity methodologies have had a remarkable record of success in the reactor physics field. Sensitivity coefficients can be used for different objectives, such as uncertainty estimates, design optimization, determination of target accuracy requirements, adjustment of input parameters, and evaluation of the representativity of an experiment with respect to a reference design configuration. A review of the methods used is provided, and several examples illustrate the success of the methodology in reactor physics. A new application, the improvement of basic nuclear parameters using integral experiments, is also described.
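A sensitivity coefficient of the kind discussed above is commonly expressed in relative form, S = (p/R)(dR/dp), for a response R and input parameter p. The sketch below estimates it by central finite difference on a stand-in response model; the model and numbers are hypothetical, chosen so the exact answer is known.

```python
# Relative sensitivity coefficient S = (p/R) * dR/dp, estimated by a
# central finite difference on a relative perturbation h.
def rel_sensitivity(R, p, h=1e-6):
    dRdp = (R(p * (1 + h)) - R(p * (1 - h))) / (2 * p * h)
    return (p / R(p)) * dRdp

# Stand-in response: for R(p) = c * p**2 the relative sensitivity is
# exactly 2, independent of p and c, so the estimate can be checked.
S = rel_sensitivity(lambda p: 3.0 * p ** 2, p=1.7)
print(S)
```

In practice the same coefficient, computed for each cross section or input datum, feeds the uncertainty propagation, target-accuracy, and parameter-adjustment applications the abstract lists.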
New Reactor Physics Benchmark Data in the March 2012 Edition of the IRPhEP Handbook
DOE Office of Scientific and Technical Information (OSTI.GOV)
John D. Bess; J. Blair Briggs; Jim Gulliford
2012-11-01
The International Reactor Physics Experiment Evaluation Project (IRPhEP) was established to preserve integral reactor physics experimental data, including separate or special effects data for nuclear energy and technology applications. Numerous experiments that have been performed worldwide represent a large investment of infrastructure, expertise, and cost, and are valuable resources of data for present and future research. These valuable assets provide the basis for recording, development, and validation of methods. If the experimental data are lost, the high cost to repeat many of these measurements may be prohibitive. The purpose of the IRPhEP is to provide an extensively peer-reviewed set of reactor physics-related integral data that can be used by reactor designers and safety analysts to validate the analytical tools used to design next-generation reactors and establish the safety basis for operation of these reactors. Contributors from around the world collaborate in the evaluation and review of selected benchmark experiments for inclusion in the International Handbook of Evaluated Reactor Physics Benchmark Experiments (IRPhEP Handbook) [1]. Several new evaluations have been prepared for inclusion in the March 2012 edition of the IRPhEP Handbook.
The Objectives of NASA's Living with a Star Space Environment Testbed
NASA Technical Reports Server (NTRS)
Barth, Janet L.; LaBel, Kenneth A.; Brewer, Dana; Kauffman, Billy; Howard, Regan; Griffin, Geoff; Day, John H. (Technical Monitor)
2001-01-01
NASA is planning to fly a series of Space Environment Testbeds (SET) as part of the Living With A Star (LWS) Program. The goal of the testbeds is to improve and develop capabilities to mitigate and/or accommodate the effects of solar variability in spacecraft and avionics design and operation. This will be accomplished by performing technology validation in space to enable routine operations, characterize technology performance in space, and improve and develop models, guidelines and databases. The anticipated result of the LWS/SET program is improved spacecraft performance, design, and operation for survival of the radiation, spacecraft charging, meteoroid, orbital debris and thermosphere/ionosphere environments. The program calls for a series of NASA Research Announcements (NRAs) to be issued to solicit flight validation experiments, improvement in environment effects models and guidelines, and collateral environment measurements. The selected flight experiments may fly on the SET experiment carriers and flights of opportunity on other commercial and technology missions. This paper presents the status of the project so far, including a description of the types of experiments that are intended to fly on SET-1 and a description of the SET-1 carrier parameters.
INL Experimental Program Roadmap for Thermal Hydraulic Code Validation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Glenn McCreery; Hugh McIlroy
2007-09-01
Advanced computer modeling and simulation tools and protocols will be heavily relied on for a wide variety of system studies, engineering design activities, and other aspects of the Next Generation Nuclear Plant (NGNP) Very High Temperature Reactor (VHTR), the DOE Global Nuclear Energy Partnership (GNEP), and light-water reactors. The goal is for all modeling and simulation tools to be demonstrated accurate and reliable through a formal Verification and Validation (V&V) process, especially where such tools are to be used to establish safety margins and support regulatory compliance, or to design a system in a manner that reduces the role of expensive mockups and prototypes. Recent literature identifies specific experimental principles that must be followed in order to ensure that experimental data meet the standards required for a "benchmark" database. Even for well-conducted experiments, missing experimental details, such as geometrical definition, data reduction procedures, and manufacturing tolerances, have led to poor benchmark calculations. The INL has a long and deep history of research in thermal hydraulics, especially in the 1960s through 1980s when many programs such as LOFT and Semiscale were devoted to light-water reactor safety research, the EBR-II fast reactor was in operation, and a strong geothermal energy program was established. The past can serve as a partial guide for reinvigorating thermal hydraulic research at the laboratory. However, new research programs need to fully incorporate modern experimental methods such as measurement techniques using the latest instrumentation, computerized data reduction, and scaling methodology. The path forward for establishing experimental research for code model validation will require benchmark experiments conducted in suitable facilities located at the INL.
This document describes thermal hydraulic facility requirements and candidate buildings and presents examples of suitable validation experiments related to VHTRs, sodium-cooled fast reactors, and light-water reactors. These experiments range from relatively low-cost benchtop experiments for investigating individual phenomena to large electrically-heated integral facilities for investigating reactor accidents and transients.
Digital Fly-By-Wire Flight Control Validation Experience
NASA Technical Reports Server (NTRS)
Szalai, K. J.; Jarvis, C. R.; Krier, G. E.; Megna, V. A.; Brock, L. D.; Odonnell, R. N.
1978-01-01
The experience gained in digital fly-by-wire technology through a flight test program being conducted by the NASA Dryden Flight Research Center in an F-8C aircraft is described. The system requirements are outlined, along with the requirements for flight qualification. The system is described, including the hardware components, the aircraft installation, and the system operation. The flight qualification experience is emphasized. The qualification process included the theoretical validation of the basic design, laboratory testing of the hardware and software elements, systems level testing, and flight testing. The most productive testing was performed on an iron bird aircraft, which used the actual electronic and hydraulic hardware and a simulation of the F-8 characteristics to provide the flight environment. The iron bird was used for sensor and system redundancy management testing, failure modes and effects testing, and stress testing in many cases with the pilot in the loop. The flight test program confirmed the quality of the validation process by achieving 50 flights without a known undetected failure and with no false alarms.
A Shuttle Upper Atmosphere Mass Spectrometer /SUMS/ experiment
NASA Technical Reports Server (NTRS)
Blanchard, R. C.; Duckett, R. J.; Hinson, E. W.
1982-01-01
A magnetic mass spectrometer is currently being adapted to the Space Shuttle Orbiter to provide repeated high altitude atmosphere data to support in situ rarefied flow aerodynamics research, i.e., in the high velocity, low density flight regime. The experiment, called the Shuttle Upper Atmosphere Mass Spectrometer (SUMS), is the first attempt to design mass spectrometer equipment for flight vehicle aerodynamic data extraction. The SUMS experiment will provide total freestream atmospheric quantities, principally total mass density, above altitudes at which conventional pressure measurements are valid. Experiment concepts, the expected flight profile, tradeoffs in the design of the total system, and flight data reduction plans are discussed. Development plans are based upon a SUMS first flight after the Orbiter initial development flights.
Four experimental demonstrations of active vibration control for flexible structures
NASA Technical Reports Server (NTRS)
Phillips, Doug; Collins, Emmanuel G., Jr.
1990-01-01
Laboratory experiments designed to test prototype active-vibration-control systems under development for future flexible space structures are described, summarizing previously reported results. The control-synthesis technique employed for all four experiments was the maximum-entropy optimal-projection (MEOP) method (Bernstein and Hyland, 1988). Consideration is given to: (1) a pendulum experiment on large-amplitude LF dynamics; (2) a plate experiment on broadband vibration suppression in a two-dimensional structure; (3) a multiple-hexagon experiment combining the factors studied in (1) and (2) to simulate the complexity of a large space structure; and (4) the NASA Marshall ACES experiment on a lightweight deployable 45-foot beam. Extensive diagrams, drawings, graphs, and photographs are included. The results are shown to validate the MEOP design approach, demonstrating that good performance is achievable using relatively simple low-order decentralized controllers.
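MEOP synthesis itself is beyond a short sketch, but the underlying goal of active vibration suppression can be illustrated with a minimal rate-feedback controller on a lightly damped oscillator. This is a generic illustration with assumed parameters, not the MEOP method or any of the four experiments.

```python
import numpy as np

# Minimal active-damping sketch (NOT the MEOP controller from the paper):
# a lightly damped oscillator with velocity ("rate") feedback u = -g*v,
# integrated with explicit Euler. All parameters are illustrative.
def simulate(g, steps=20000, dt=1e-3, wn=2*np.pi, zeta=0.005):
    x, v = 1.0, 0.0                          # initial displacement, velocity
    for _ in range(steps):
        a = -wn**2 * x - 2*zeta*wn*v - g*v   # plant dynamics + feedback force
        x, v = x + dt*v, v + dt*a
    return float(np.hypot(x, v/wn))          # approximate vibration envelope

print(simulate(0.0), simulate(2.0))          # envelope without vs with control
```

The residual envelope with feedback is orders of magnitude smaller than without, which is the qualitative behavior the four experiments were built to demonstrate on real hardware with far more complex controllers.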
Janssen, Ellen M; Marshall, Deborah A; Hauber, A Brett; Bridges, John F P
2017-12-01
The recent endorsement of discrete-choice experiments (DCEs) and other stated-preference methods by regulatory and health technology assessment (HTA) agencies has placed a greater focus on demonstrating the validity and reliability of preference results. Areas covered: We present a practical overview of tests of validity and reliability that have been applied in the health DCE literature and explore other study qualities of DCEs. From the published literature, we identify a variety of methods to assess the validity and reliability of DCEs. We conceptualize these methods to create a conceptual model with four domains: measurement validity, measurement reliability, choice validity, and choice reliability. Each domain consists of three categories that can be assessed using one to four procedures (for a total of 24 tests). We present how these tests have been applied in the literature and direct readers to applications of these tests in the health DCE literature. Based on a stakeholder engagement exercise, we consider the importance of study characteristics beyond traditional concepts of validity and reliability. Expert commentary: We discuss study design considerations to assess the validity and reliability of a DCE, consider limitations to the current application of tests, and discuss future work to consider the quality of DCEs in healthcare.
Rakotonarivo, O Sarobidy; Schaafsma, Marije; Hockley, Neal
2016-12-01
While discrete choice experiments (DCEs) are increasingly used in the field of environmental valuation, they remain controversial because of their hypothetical nature and the contested reliability and validity of their results. We systematically reviewed evidence on the validity and reliability of environmental DCEs from the past thirteen years (January 2003-February 2016). 107 articles met our inclusion criteria. These studies provide limited and mixed evidence of the reliability and validity of DCEs. Valuation results were susceptible to small changes in survey design in 45% of outcomes reporting reliability measures. DCE results were generally consistent with those of other stated preference techniques (convergent validity), but hypothetical bias was common. Evidence supporting theoretical validity (consistency with assumptions of rational choice theory) was limited. In content validity tests, 2-90% of respondents protested against a feature of the survey, and a considerable proportion found DCEs to be incomprehensible or inconsequential (17-40% and 10-62%, respectively). The DCE remains useful for non-market valuation, but its results should be used with caution. Given the sparse and inconclusive evidence base, we recommend that tests of reliability and validity be more routinely integrated into DCE studies and suggest how this might be achieved. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.
Exploring the Design, Development and Use of Websites through Accessibility and Usability Studies
ERIC Educational Resources Information Center
Foley, Alan
2011-01-01
In this paper, data obtained from a university website accessibility and usability validation process are analyzed and used to demonstrate how the design process can affect the online experience for users with disabilities. Interviews, observations, and use data (e.g. where users clicked on a page or what path taken through a site) were collected.…
An Open-Structure Treadmill Gait Trainer: From Research to Application.
Li, Jian; Chen, Diansheng; Fan, Yubo
2017-01-01
Lower limb rehabilitation robots are designed to enhance gait function in individuals with motor impairments. Although numerous rehabilitation robots have been developed, only a few of these robots have been used in practical health care, particularly in China. The objective of this study is to construct a lower limb rehabilitation robot and bridge the gap between research and application. An open structure was adopted for the whole robot to facilitate practical application. Three typical movement patterns of a single leg were adopted in designing the exoskeletons; force models for patient training were established and analyzed under three different conditions; and a control system and security strategy were then introduced. After the robot was built, a preliminary experiment on the actual use of a prototype by patients was conducted to validate its functionality. The experiment showed that different patients and stages displayed different performances, and results on the trend variations across patients and across stages confirmed the validity of the robot and suggested that the design may lead to a system that could be successful in the treatment of patients with walking disorders in China. Furthermore, this study could provide a reference for similar application designs.
Optimization of EGFR high positive cell isolation procedure by design of experiments methodology.
Levi, Ofer; Tal, Baruch; Hileli, Sagi; Shapira, Assaf; Benhar, Itai; Grabov, Pavel; Eliaz, Noam
2015-01-01
Circulating tumor cells (CTCs) in blood circulation may play a role in monitoring, and even in early detection of, metastasis in patients. Due to the limited presence of CTCs in blood circulation, a viable CTC isolation technology must supply a very high recovery rate. Here, we implement design of experiments (DOE) methodology in order to optimize the Bio-Ferrography (BF) immunomagnetic isolation (IMI) procedure for EGFR high positive CTCs. All successive DOE phases, i.e., screening design, optimization experiments, and validation experiments, were used. A significant recovery rate of more than 95% was achieved while isolating 100 EGFR high positive CTCs from 1 mL human whole blood. The recovery achievement in this research positions BF technology as one of the most efficient IMI technologies, which is ready to be challenged with patients' blood samples. © 2015 International Clinical Cytometry Society.
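The screening-design phase of a DOE study can be sketched generically: enumerate two-level combinations of candidate factors and estimate each factor's main effect on the response. The factor names and the stand-in response function below are hypothetical, not the actual BF isolation parameters.

```python
from itertools import product

# Generic two-level full-factorial screening sketch. The factor names and the
# linear response function are hypothetical stand-ins, not the paper's values.
factors = {"antibody_conc": (-1, +1), "bead_ratio": (-1, +1), "mix_time": (-1, +1)}
runs = [dict(zip(factors, levels)) for levels in product(*factors.values())]

def recovery(run):  # assumed linear response surface, % of cells recovered
    return 80 + 5*run["antibody_conc"] + 3*run["bead_ratio"] + 1*run["mix_time"]

# Main effect of a factor = mean response at high level - mean at low level
effects = {name: sum(recovery(r) for r in runs if r[name] == +1) / 4
                 - sum(recovery(r) for r in runs if r[name] == -1) / 4
           for name in factors}
print(effects)
```

Screening ranks the factors by effect size; the subsequent optimization and validation phases then concentrate runs on the few factors that matter.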
Development, Validation, and Application of OSSEs at NASA/GMAO
NASA Technical Reports Server (NTRS)
Errico, Ronald; Prive, Nikki
2015-01-01
During the past several years, NASA Goddard's Global Modeling and Assimilation Office (GMAO) has been developing a framework for conducting Observing System Simulation Experiments (OSSEs). The motivation and design of that framework will be described and a sample of validation results presented. Fundamental issues will be highlighted, particularly the critical importance of appropriately simulating system errors. Some problems that have just arisen in the newest experimental system will also be mentioned.
A Supersonic Argon/Air Coaxial Jet Experiment for Computational Fluid Dynamics Code Validation
NASA Technical Reports Server (NTRS)
Clifton, Chandler W.; Cutler, Andrew D.
2007-01-01
A non-reacting experiment is described in which data has been acquired for the validation of CFD codes used to design high-speed air-breathing engines. A coaxial jet-nozzle has been designed to produce pressure-matched exit flows of Mach 1.8 at 1 atm in both a center jet of argon and a coflow jet of air, creating a supersonic, incompressible mixing layer. The flowfield was surveyed using total temperature, gas composition, and Pitot probes. The data set was compared to CFD code predictions made using Vulcan, a structured grid Navier-Stokes code, as well as to data from a previous experiment in which a He-O2 mixture was used instead of argon in the center jet of the same coaxial jet assembly. Comparison of experimental data from the argon flowfield and its computational prediction shows that the CFD produces an accurate solution for most of the measured flowfield. However, the CFD prediction deviates from the experimental data in the region downstream of x/D = 4, underpredicting the mixing-layer growth rate.
NASA DOE POD NDE Capabilities Data Book
NASA Technical Reports Server (NTRS)
Generazio, Edward R.
2015-01-01
This data book contains the Directed Design of Experiments for Validating Probability of Detection (POD) Capability of NDE Systems (DOEPOD) analyses of the nondestructive inspection data presented in the NTIAC, Nondestructive Evaluation (NDE) Capabilities Data Book, 3rd ed., NTIAC DB-97-02. DOEPOD is designed as a decision support system to validate inspection system, personnel, and protocol demonstrating 0.90 POD with 95% confidence at critical flaw sizes, a90/95. The test methodology used in DOEPOD is based on the field of statistical sequential analysis founded by Abraham Wald. Sequential analysis is a method of statistical inference whose characteristic feature is that the number of observations required by the procedure is not determined in advance of the experiment. The decision to terminate the experiment depends, at each stage, on the results of the observations previously made. A merit of the sequential method, as applied to testing statistical hypotheses, is that test procedures can be constructed which require, on average, a substantially smaller number of observations than equally reliable test procedures based on a predetermined number of observations.
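Wald's sequential test can be sketched for a POD demonstration: accumulate a log-likelihood ratio over hit/miss outcomes and stop as soon as it crosses either decision threshold. The alternative-hypothesis POD value and the error rates below are illustrative choices, not DOEPOD's actual settings.

```python
import math

# Sketch of Wald's sequential probability ratio test for a POD demonstration.
# H0: POD = 0.90 vs H1: POD = 0.70 (H1 value chosen here for illustration),
# alpha = beta = 0.05. Each observation is a hit (1) or miss (0) at flaw
# size a90. Sample size is not fixed in advance; testing stops when the
# accumulated evidence crosses a threshold.
p0, p1, alpha, beta = 0.90, 0.70, 0.05, 0.05
upper = math.log((1 - beta) / alpha)    # cross upward: reject 0.90 POD
lower = math.log(beta / (1 - alpha))    # cross downward: 0.90 POD demonstrated

def sprt(outcomes):
    llr = 0.0                           # log-likelihood ratio log(L1/L0)
    for n, y in enumerate(outcomes, 1):
        llr += math.log((p1 if y else (1 - p1)) / (p0 if y else (1 - p0)))
        if llr >= upper:
            return n, "reject 0.90 POD"
        if llr <= lower:
            return n, "0.90 POD demonstrated"
    return len(outcomes), "continue testing"

print(sprt([1] * 40))                   # an all-hit sequence terminates early
```

With these illustrative parameters an unbroken run of hits terminates after 12 observations, showing the economy of sequential testing over a fixed-sample design.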
NASA Astrophysics Data System (ADS)
Sutherland, Herbert J.
1988-08-01
Sandia National Laboratories has erected a research-oriented, 34-meter-diameter Darrieus vertical axis wind turbine near Bushland, Texas. This machine, designated the Sandia 34-m VAWT Test Bed, is equipped with a large array of strain gauges that have been placed at critical positions about the blades. This manuscript details a series of four-point bend experiments that were conducted to validate the output of the blade strain gauge circuits. The output of a particular gauge circuit is validated by comparing its output to equivalent gauge circuits (in this stress state) and to theoretical predictions. With only a few exceptions, the difference between measured and predicted strain values for a gauge circuit was found to be of the order of the estimated repeatability for the measurement system.
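The theoretical prediction a four-point bend validation compares against follows from classical beam theory: between the inner load points the bending moment is constant, and the surface strain is M*c/(E*I). The geometry and load below are assumed round numbers, not the Test Bed blade's properties.

```python
# Classical four-point bend prediction (illustrative numbers, not Sandia's
# blade geometry): between the inner rollers the bending moment is constant,
# M = (F/2)*a for total load F with outer-to-inner roller spacing a, and the
# predicted surface strain is eps = M*c / (E*I).
F = 1000.0          # total applied load, N (assumed)
a = 0.25            # outer-to-inner roller spacing, m (assumed)
E = 70e9            # Young's modulus, Pa (aluminium, assumed)
b, h = 0.05, 0.01   # rectangular cross-section width and height, m (assumed)
I = b * h**3 / 12   # second moment of area
c = h / 2           # distance from neutral axis to outer fiber
M = (F / 2) * a     # constant bending moment between inner rollers
eps = M * c / (E * I)
print(f"predicted surface strain: {eps * 1e6:.0f} microstrain")
```

A gauge circuit in the constant-moment span is then judged by how closely its reading tracks this prediction, relative to the measurement system's repeatability.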
Experimental validation of predicted cancer genes using FRET
NASA Astrophysics Data System (ADS)
Guala, Dimitri; Bernhem, Kristoffer; Ait Blal, Hammou; Jans, Daniel; Lundberg, Emma; Brismar, Hjalmar; Sonnhammer, Erik L. L.
2018-07-01
Huge amounts of data are generated in genome-wide experiments designed to investigate diseases with complex genetic causes. Follow-up of all potential leads produced by such experiments is currently cost-prohibitive and time-consuming. Gene prioritization tools alleviate these constraints by directing further experimental efforts towards the most promising candidate targets. Recently, a gene prioritization tool called MaxLink was shown to outperform other widely used state-of-the-art prioritization tools in a large-scale in silico benchmark. An experimental validation of predictions made by MaxLink has, however, been lacking. In this study we used Fluorescence Resonance Energy Transfer, an established experimental technique for detection of protein-protein interactions, to validate potential cancer genes predicted by MaxLink. Our results provide confidence in the use of MaxLink for selection of new targets in the battle with polygenic diseases.
The Development and Validation of the Game User Experience Satisfaction Scale (GUESS).
Phan, Mikki H; Keebler, Joseph R; Chaparro, Barbara S
2016-12-01
The aim of this study was to develop and psychometrically validate a new instrument that comprehensively measures video game satisfaction based on key factors. Playtesting is often conducted in the video game industry to help game developers build better games by providing insight into the players' attitudes and preferences. However, quality feedback is difficult to obtain from playtesting sessions without a quality gaming assessment tool. There is a need for a psychometrically validated and comprehensive gaming scale that is appropriate for playtesting and game evaluation purposes. The process of developing and validating this new scale followed current best practices of scale development and validation. As a result, a mixed-method design that consisted of item pool generation, expert review, questionnaire pilot study, exploratory factor analysis (N = 629), and confirmatory factor analysis (N = 729) was implemented. A new instrument measuring video game satisfaction, called the Game User Experience Satisfaction Scale (GUESS), with nine subscales emerged. The GUESS was demonstrated to have content validity, internal consistency, and convergent and discriminant validity. The GUESS was developed and validated based on the assessments of over 450 unique video game titles across many popular genres. Thus, it can be applied across many types of video games in the industry both as a way to assess what aspects of a game contribute to user satisfaction and as a tool to aid in debriefing users on their gaming experience. The GUESS can be administered to evaluate user satisfaction of different types of video games by a variety of users. © 2016, Human Factors and Ergonomics Society.
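The internal consistency reported for scales like the GUESS is typically Cronbach's alpha, computed from item variances and the variance of the total score. A minimal computation on synthetic item responses (not GUESS data) looks like this:

```python
import numpy as np

# Cronbach's alpha on made-up item responses (NOT the GUESS data):
# alpha = k/(k-1) * (1 - sum(item variances) / variance of total scores).
rng = np.random.default_rng(0)
latent = rng.normal(size=200)                        # true trait per respondent
items = np.column_stack([latent + rng.normal(scale=0.8, size=200)
                         for _ in range(5)])         # 5 noisy, correlated items
k = items.shape[1]
item_var = items.var(axis=0, ddof=1).sum()
total_var = items.sum(axis=1).var(ddof=1)
alpha = k / (k - 1) * (1 - item_var / total_var)
print(f"Cronbach's alpha: {alpha:.2f}")
```

Values around 0.7 or higher are conventionally read as acceptable internal consistency for a subscale; the synthetic items above share a common latent trait, so alpha comes out high.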
Beierlein, V; Köllner, V; Neu, R; Schulz, H
2016-12-01
Objectives: The assessment of work pressures is of particular importance in psychosomatic rehabilitation. An established questionnaire is the Occupational Stress and Coping Inventory (German abbreviation: AVEM), but it is quite long, and its scoring is time-consuming in routine clinical care. We therefore tested whether a shortened version of the AVEM could be developed that measures the three previously described second-order factors of the AVEM, namely Working Commitment, Resilience, and Emotions, with sufficient reliability and validity, and that can also be used to screen for patients with prominent work-related behavior and experience patterns. Methods: Data were collected at admission from consecutive samples of three psychosomatic rehabilitation hospitals (N = 10,635 patients). The sample was randomly divided into two subsamples (design and validation sample). Using exploratory principal component analyses in the design sample, the items with the highest factor loadings were selected for the three new scales and evaluated psychometrically in the validation sample. Cut-off values were derived from the score distributions of the scales. Relationships with sociodemographic, occupational, and diagnosis-related characteristics, as well as with patterns of work-related experiences and behaviors, were examined. Results: In the design sample, the three principal component analyses each explained between 31% and 34% of the variance on the respective first factor. The selected 20 items were assigned to the 3-factor structure in the validation sample as expected. The three new scales are sufficiently reliable, with Cronbach's α values between 0.84 and 0.88. The naming of the three new scales is based on the names of the second-order factors. Cut-off values for the identification of distinctive patient-reported data are proposed.
Conclusion: The main advantage of the proposed shortened version, the AVEM-3D, is that the three main dimensions of relevant work-related behavior and experience patterns can be measured reliably with a considerably smaller number of items. The proposed measure is simple and economical to use and interpret. Based on the present sample, we provide means and standard deviations as reference values at admission to psychosomatic rehabilitation. A limitation is that the reliability, validity, and sensitivity to change of the shortened version administered on its own still require further evaluation. The practicability and validity of the proposed cut-off values cannot yet be conclusively assessed. Finally, the validity of the AVEM-3D in indication groups other than psychosomatic patients, and in healthy persons, remains to be examined. © Georg Thieme Verlag KG Stuttgart · New York.
A 3D Human-Machine Integrated Design and Analysis Framework for Squat Exercises with a Smith Machine
Lee, Haerin; Jung, Moonki; Lee, Ki-Kwang; Lee, Sang Hun
2017-01-01
In this paper, we propose a three-dimensional design and evaluation framework and process based on a probabilistic-based motion synthesis algorithm and biomechanical analysis system for the design of the Smith machine and squat training programs. Moreover, we implemented a prototype system to validate the proposed framework. The framework consists of an integrated human–machine–environment model as well as a squat motion synthesis system and biomechanical analysis system. In the design and evaluation process, we created an integrated model in which interactions between a human body and machine or the ground are modeled as joints with constraints at contact points. Next, we generated Smith squat motion using the motion synthesis program based on a Gaussian process regression algorithm with a set of given values for independent variables. Then, using the biomechanical analysis system, we simulated joint moments and muscle activities from the input of the integrated model and squat motion. We validated the model and algorithm through physical experiments measuring the electromyography (EMG) signals, ground forces, and squat motions as well as through a biomechanical simulation of muscle forces. The proposed approach enables the incorporation of biomechanics in the design process and reduces the need for physical experiments and prototypes in the development of training programs and new Smith machines. PMID:28178184
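The Gaussian process regression step can be illustrated in miniature. This is a generic 1-D GP with an RBF kernel on toy data, not the paper's motion-synthesis model; the interpretation of inputs and outputs in the comments is assumed.

```python
import numpy as np

# Bare-bones Gaussian process regression with an RBF kernel, in the spirit of
# the motion-synthesis step (toy 1-D data, NOT the paper's squat model).
def rbf(a, b, ls=1.0):
    """Squared-exponential kernel between two 1-D input arrays."""
    return np.exp(-0.5 * (a[:, None] - b[None, :])**2 / ls**2)

X = np.array([0.0, 1.0, 2.0, 3.0])       # training inputs (e.g. a load setting)
y = np.sin(X)                            # training outputs (e.g. a joint angle)
Xs = np.array([1.5])                     # query point between observations
K = rbf(X, X) + 1e-6 * np.eye(len(X))    # kernel matrix + jitter for stability
mean = rbf(Xs, X) @ np.linalg.solve(K, y)
print(f"GP prediction at 1.5: {mean[0]:.3f} (true sin(1.5) = {np.sin(1.5):.3f})")
```

The posterior mean interpolates smoothly between observed motions, which is what lets the framework synthesize squat motion for independent-variable settings that were never physically recorded.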
The Design of PSB-VVER Experiments Relevant to Accident Management
NASA Astrophysics Data System (ADS)
Nevo, Alessandro Del; D'Auria, Francesco; Mazzini, Marino; Bykov, Michael; Elkin, Ilya V.; Suslov, Alexander
Experimental programs carried out in integral test facilities are relevant for validating the best-estimate thermal-hydraulic codes(1) that are used for accident analyses, design of accident management procedures, licensing of nuclear power plants, etc. The validation process, in fact, is based on well-designed experiments. It consists of comparing measured and calculated parameters and determining whether a computer code has an adequate capability to predict the major phenomena expected to occur in the course of transients and/or accidents. The University of Pisa was responsible for the numerical design of the 12 experiments executed in the PSB-VVER facility (2), operated at the Electrogorsk Research and Engineering Center (Russia), in the framework of the EC-financed TACIS 2.03/97 Contract 3.03.03 Part A (3). The paper describes the methodology adopted at the University of Pisa, starting from the scenarios foreseen in the final test matrix through the execution of the experiments. This process considers three key topics: a) the scaling issue and the simulation, with unavoidable distortions, of the expected performance of the reference nuclear power plants; b) the code assessment process, involving the identification of phenomena challenging the code models; c) the features of the integral test facility concerned (scaling limitations, control logics, data acquisition system, instrumentation, etc.). The activities performed in this respect are discussed, and emphasis is also given to the relevance of thermal losses to the environment. This issue particularly affects small-scale facilities and bears on the scaling approach related to the power and volume of the facility.
CFD Validation with Experiment and Verification with Physics of a Propellant Damping Device
NASA Technical Reports Server (NTRS)
Yang, H. Q.; Peugeot, John
2011-01-01
This paper will document our effort in validating a coupled fluid-structure interaction CFD tool for predicting a damping device's performance under laboratory conditions. Consistently good comparisons of "blind" CFD predictions against experimental data under various operating conditions, design parameters, and cryogenic environments will be presented. The power of the coupled CFD-structure interaction code in explaining some unexpected phenomena of the device observed during technology development will be illustrated. The evolution of the damper device design inside the LOX tank will be used to demonstrate the contribution of the tool to the understanding, optimization, and implementation of the LOX damper in the Ares I vehicle. Due to the present validation effort, the LOX damper technology has matured to TRL 5. The present effort has also contributed to the transition of the technology from an early conceptual observation to the baseline design for thrust oscillation mitigation for the Ares I within a 10-month period.
Topology optimization based design of unilateral NMR for generating a remote homogeneous field.
Wang, Qi; Gao, Renjing; Liu, Shutian
2017-06-01
This paper presents a topology optimization based design method for the design of unilateral nuclear magnetic resonance (NMR), with which a remote homogeneous field can be obtained. The topology optimization is actualized by seeking out the optimal layout of ferromagnetic materials within a given design domain. The design objective is defined as generating a sensitive magnetic field with optimal homogeneity and maximal field strength within a required region of interest (ROI). The sensitivity of the objective function with respect to the design variables is derived and the method for solving the optimization problem is presented. A design example is provided to illustrate the utility of the design method, specifically the ability to improve the quality of the magnetic field over the required ROI by determining the optimal structural topology for the ferromagnetic poles. Both in simulations and experiments, the sensitive region of the magnetic field is about two times larger than that of the reference design, validating the feasibility of the design method. Copyright © 2017. Published by Elsevier Inc.
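A strength-plus-homogeneity objective of the kind described can be sketched on synthetic field samples. The scoring function, its weight, and the field values below are all assumptions for illustration; the actual method evaluates the field from a candidate ferromagnetic pole layout.

```python
import numpy as np

# Toy version of the design objective: score a candidate field over a region
# of interest (ROI) by mean strength and fractional homogeneity. Field values
# here are synthetic; the real method computes them from the pole topology.
def score(B_roi, weight=1.0):
    strength = B_roi.mean()                              # average field in ROI
    inhomogeneity = (B_roi.max() - B_roi.min()) / strength  # fractional spread
    return strength - weight * inhomogeneity             # trade-off objective

uniform = np.full(100, 0.10)                             # 0.10 T everywhere
wobbly = 0.10 + 0.005 * np.sin(np.linspace(0, 6, 100))   # same mean, rippled
print(score(uniform), score(wobbly))                     # uniform scores higher
```

An optimizer then adjusts the material layout (the design variables) to climb this score, using the derived sensitivities of the objective with respect to those variables.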
Michels, David A; Parker, Monica; Salas-Solano, Oscar
2012-03-01
This paper describes the framework of quality by design applied to the development, optimization and validation of a sensitive capillary electrophoresis-sodium dodecyl sulfate (CE-SDS) assay for monitoring impurities that potentially impact drug efficacy or patient safety produced in the manufacture of therapeutic MAb products. Drug substance or drug product samples are derivatized with fluorogenic 3-(2-furoyl)quinoline-2-carboxaldehyde and nucleophilic cyanide before separation by CE-SDS coupled to LIF detection. Three design-of-experiments studies enabled critical labeling parameters to meet method requirements for detecting minor impurities while building precision and robustness into the assay during development. The screening design predicted optimal conditions to control labeling artifacts while two full factorial designs demonstrated method robustness through control of temperature and cyanide parameters within the normal operating range. Subsequent validation according to the guidelines of the International Conference on Harmonisation showed the CE-SDS/LIF assay was specific, accurate, and precise (RSD ≤ 0.8%) for relative peak distribution and linear (R > 0.997) between the range of 0.5-1.5 mg/mL with LOD and LOQ of 10 ng/mL and 35 ng/mL, respectively. Validation confirmed the system suitability criteria used as a level of control to ensure reliable method performance. © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
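The linearity, LOD, and LOQ figures reported follow standard ICH-style calculations, which can be sketched on synthetic calibration data. The slope, noise, and concentration levels below are assumed, not the paper's measurements.

```python
import numpy as np

# ICH-style linearity/LOD/LOQ sketch on synthetic calibration data (assumed
# slope and noise, NOT the paper's measurements). One common convention:
# LOD = 3.3*sigma/slope and LOQ = 10*sigma/slope, with sigma the residual
# standard deviation of the calibration fit.
conc = np.array([0.5, 0.75, 1.0, 1.25, 1.5])                 # mg/mL levels
resp = 120.0 * conc + np.array([0.3, -0.2, 0.1, -0.3, 0.2])  # signal + noise
slope, intercept = np.polyfit(conc, resp, 1)                 # calibration line
r = np.corrcoef(conc, resp)[0, 1]                            # linearity check
sigma = np.std(resp - (slope * conc + intercept), ddof=2)    # residual SD
lod, loq = 3.3 * sigma / slope, 10 * sigma / slope
print(f"R = {r:.4f}, LOD = {lod:.4f} mg/mL, LOQ = {loq:.4f} mg/mL")
```

The robustness part of such a validation is a separate full factorial study over the operating range of the critical parameters, analogous in structure to the screening design sketched earlier in this listing.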
Further cross-cultural validation of the theory of mental self-government.
Zhang, L F
1999-03-01
This study was designed to achieve two objectives. The 1st was to investigate the cross-cultural validity of the Thinking Styles Inventory (TSI; R. J. Sternberg & R. K. Wagner, 1992), which is based on the theory of mental self-government (R. J. Sternberg, 1988, 1990, 1997). The 2nd was to examine the relationships between thinking styles as assessed by the TSI and a number of student characteristics, including age, gender, college class level, work experience, and travel experience. One hundred fifty-one students from the University of Hong Kong participated in the study. Results indicated that the thinking styles evaluated by the TSI could be identified among the participants. Moreover, there were significant relationships between certain thinking styles, especially creativity-relevant styles, and 3 student characteristics: age, work experience, and travel experience. Implications of these findings for teaching and learning in and outside the classroom are discussed.
Takase, Miyuki; Imai, Takiko; Uemura, Chizuru
2016-06-01
This paper examines the psychometric properties of the Learning Experience Scale. A survey method was used to collect data from a total of 502 nurses. Data were analyzed by factor analysis and the known-groups technique to examine the construct validity of the scale. In addition, internal consistency was evaluated by Cronbach's alpha, and stability was examined by test-retest correlation. Factor analysis showed that the Learning Experience Scale consisted of five factors: learning from practice, others, training, feedback, and reflection. The scale also had the power to discriminate between nurses with high and low levels of nursing competence. The internal consistency and the stability of the scale were also acceptable. The Learning Experience Scale is a valid and reliable instrument, and helps organizations to effectively design learning interventions for nurses. © 2015 Wiley Publishing Asia Pty Ltd.
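The internal-consistency statistic used above, Cronbach's alpha, follows a standard formula: alpha = (k/(k-1)) * (1 - sum of item variances / variance of totals). A minimal sketch, using hypothetical item scores rather than the Learning Experience Scale's survey data:

```python
# Illustrative Cronbach's alpha computation (hypothetical Likert data,
# not the published survey's responses).
def cronbach_alpha(items):
    """items: list of equal-length score lists, one list per scale item."""
    k = len(items)
    n = len(items[0])
    def var(xs):  # sample variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
    item_vars = sum(var(item) for item in items)
    totals = [sum(item[i] for item in items) for i in range(n)]
    return (k / (k - 1)) * (1 - item_vars / var(totals))

scores = [
    [4, 5, 3, 4, 5, 4],   # item 1, six respondents
    [4, 4, 3, 5, 5, 4],   # item 2
    [5, 5, 2, 4, 4, 3],   # item 3
]
alpha = cronbach_alpha(scores)
```

Values above roughly 0.7 are conventionally read as acceptable internal consistency, which is the sense in which the abstract calls the scale's consistency "acceptable".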
Hovering Dual-Spin Vehicle Groundwork for Bias Momentum Sizing Validation Experiment
NASA Technical Reports Server (NTRS)
Rothhaar, Paul M.; Moerder, Daniel D.; Lim, Kyong B.
2008-01-01
Angular bias momentum offers significant stability augmentation for hovering flight vehicles. The reliance of the vehicle on thrust vectoring for agility and disturbance rejection is greatly reduced with significant levels of stored angular momentum in the system. A methodical procedure for bias momentum sizing has been developed in previous studies. This current study provides groundwork for experimental validation of that method using an experimental vehicle called the Dual-Spin Test Device, a thrust-levitated platform. Using measured data the vehicle's thrust vectoring units are modeled and a gust environment is designed and characterized. Control design is discussed. Preliminary experimental results of the vehicle constrained to three rotational degrees of freedom are compared to simulation for a case containing no bias momentum to validate the simulation. A simulation of a bias momentum dominant case is presented.
Inhibitor-based validation of a homology model of the active-site of tripeptidyl peptidase II.
De Winter, Hans; Breslin, Henry; Miskowski, Tamara; Kavash, Robert; Somers, Marijke
2005-04-01
A homology model of the active site region of tripeptidyl peptidase II (TPP II) was constructed based on the crystal structures of four subtilisin-like templates. The resulting model was subsequently validated by judging expectations of the model versus observed activities for a broad set of prepared TPP II inhibitors. The structure-activity relationships observed for the prepared TPP II inhibitors correlated nicely with the structural details of the TPP II active site model, supporting the validity of this model and its usefulness for structure-based drug design and pharmacophore searching experiments.
Advanced Method of Boundary-Layer Control Based on Localized Plasma Generation
2009-05-01
measurements, validation of experiments, wind-tunnel testing of the microwave/plasma generation system, preliminary assessment of energy required … and design of a microwave generator, electrodynamic and multivibrator systems for experiments in the IHM-NAU wind tunnel: MW generator and its high- … equipped with the microwave-generation and protection systems to study advanced methods of flow control (Kiev). Fig. 2.1,a: The blade …
OSI-compatible protocols for mobile-satellite communications: The AMSS experience
NASA Technical Reports Server (NTRS)
Moher, Michael
1990-01-01
The protocol structure of the international aeronautical mobile satellite service (AMSS) is reviewed with emphasis on those aspects of protocol performance, validation, and conformance which are peculiar to mobile services. This is in part an analysis of what can be learned from the AMSS experience with protocols which is relevant to the design of other mobile satellite data networks, e.g., land mobile.
Short-range inverse-square law experiment in space
NASA Technical Reports Server (NTRS)
Paik, H. J.; Moody, M. V.
2002-01-01
Newton's inverse-square law is a cornerstone of General Relativity. Its validity has been demonstrated to better than one part in a thousand at ranges greater than 1 cm. The range below 1 mm has been left largely unexplored, due to the difficulties associated with designing sensitive short-range experiments. However, the theoretical rationale for testing Newton's law at ranges below 1 mm has recently become very strong.
Investigation of Abnormal Heat Transfer and Flow in a VHTR Reactor Core
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kawaji, Masahiro; Valentin, Francisco I.; Artoun, Narbeh
2015-12-21
The main objective of this project was to identify and characterize the conditions under which abnormal heat transfer phenomena would occur in a Very High Temperature Reactor (VHTR) with a prismatic core. High pressure/high temperature experiments have been conducted to obtain data that could be used for validation of VHTR design and safety analysis codes. The focus of these experiments was on the generation of benchmark data for design and off-design heat transfer for forced, mixed and natural circulation in a VHTR core. In particular, a flow laminarization phenomenon was intensely investigated since it could give rise to hot spots in the VHTR core.
Du, Yongxing; Zhang, Lingze; Sang, Lulu; Wu, Daocheng
2016-04-29
In this paper, an Archimedean planar spiral antenna for the application of thermotherapy was designed. This type of antenna was chosen for its compact structure, flexible application and wide heating area. The temperature field generated by this two-armed spiral antenna in a muscle-equivalent phantom was simulated and subsequently validated by experimentation. First, the specific absorption rate (SAR) of the field was calculated using the Finite Element Method (FEM) in Ansoft's High Frequency Structure Simulator (HFSS). Then, the temperature elevation in the phantom was simulated by an explicit finite difference approximation of the bioheat equation (BHE). The temperature distribution was then validated by a phantom heating experiment. The results showed that this antenna had a good heating ability and a wide heating area. A comparison between the calculation and the measurement showed fair agreement in the temperature elevation. The validated model could be applied for the analysis of electromagnetic-temperature distribution in phantoms during the process of antenna design or thermotherapy experimentation.
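The explicit finite-difference approximation of the bioheat equation mentioned above can be sketched in one dimension. The Pennes-form terms (conduction, blood perfusion toward arterial temperature, and a SAR heat source) are standard, but every tissue parameter below is a generic illustrative value, not a property of the paper's phantom:

```python
# 1-D explicit finite-difference sketch of a Pennes-type bioheat
# equation with a SAR source. Parameters are illustrative assumptions.
def bioheat_step(T, dx, dt, k=0.5, rho=1000.0, c=3600.0,
                 w_b=0.5, c_b=3600.0, T_a=37.0, sar=None):
    """One explicit time step; T is a list of temperatures (deg C).
    w_b is a volumetric blood perfusion rate in kg/(m^3*s)."""
    n = len(T)
    sar = sar if sar is not None else [0.0] * n
    Tn = T[:]
    for i in range(1, n - 1):          # fixed-temperature boundaries
        diff = k * (T[i + 1] - 2 * T[i] + T[i - 1]) / dx ** 2
        perf = w_b * c_b * (T_a - T[i])    # perfusion cooling, W/m^3
        Tn[i] = T[i] + dt / (rho * c) * (diff + perf + rho * sar[i])
    return Tn

T = [37.0] * 21                        # uniform body temperature
sar = [0.0] * 21
sar[10] = 50.0                         # W/kg deposited at the centre node
for _ in range(200):
    T = bioheat_step(T, dx=0.005, dt=0.5, sar=sar)
```

With these values the explicit scheme is comfortably inside its stability limit (dt * k / (rho * c * dx^2) is far below 0.5), and the centre node warms by roughly a degree while the boundaries stay at body temperature.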
DOE Office of Scientific and Technical Information (OSTI.GOV)
Calderer, Antoni; Yang, Xiaolei; Angelidis, Dionysios
2015-10-30
The present project involves the development of modeling and analysis design tools for assessing offshore wind turbine technologies. The computational tools developed herein are able to resolve the effects of the coupled interaction of atmospheric turbulence and ocean waves on aerodynamic performance and structural stability and reliability of offshore wind turbines and farms. Laboratory scale experiments have been carried out to derive data sets for validating the computational models.
ERIC Educational Resources Information Center
Lorch, Robert F., Jr.; Lorch, Elizabeth P.; Freer, Benjamin Dunham; Dunlap, Emily E.; Hodell, Emily C.; Calderhead, William J.
2014-01-01
Students (n = 1,069) from 60 4th-grade classrooms were taught the control of variables strategy (CVS) for designing experiments. Half of the classrooms were in schools that performed well on a state-mandated test of science achievement, and half were in schools that performed relatively poorly. Three teaching interventions were compared: an…
Control Activity in Support of NASA Turbine Based Combined Cycle (TBCC) Research
NASA Technical Reports Server (NTRS)
Stueber, Thomas J.; Vrnak, Daniel R.; Le, Dzu K.; Ouzts, Peter J.
2010-01-01
Control research for a Turbine Based Combined Cycle (TBCC) propulsion system is the current focus of the Hypersonic Guidance, Navigation, and Control (GN&C) discipline team. The ongoing work at the NASA Glenn Research Center (GRC) supports the Hypersonic GN&C effort in developing tools to aid the design of control algorithms to manage a TBCC airbreathing propulsion system during a critical operating period. The critical operating period being addressed in this paper is the span when the propulsion system transitions from one cycle to another, referred to as mode transition. One such tool, a basic need for control system design activities, is computational models (henceforth referred to as models) of the propulsion system. The models of interest for designing and testing controllers are Control Development Models (CDMs) and Control Validation Models (CVMs). CDMs and CVMs are needed for each of the following propulsion system elements: inlet, turbine engine, ram/scram dual-mode combustor, and nozzle. This paper presents an overall architecture for a TBCC propulsion system model that includes all of the propulsion system elements. Efforts are under way, focusing on one of the propulsion system elements, to develop CDMs and CVMs for a TBCC propulsion system inlet. The TBCC inlet aerodynamic design being modeled is that of the Combined-Cycle Engine (CCE) Testbed. The CCE Testbed is a large-scale model of an aerodynamic design that was verified in a small-scale screening experiment. The modeling approach includes employing existing state-of-the-art simulation codes, developing new dynamic simulations, and performing system identification experiments on the hardware in the NASA GRC 10- by 10-Foot Supersonic Wind Tunnel. The developed CDMs and CVMs will be available for control studies prior to hardware buildup. The system identification experiments on the CCE Testbed will characterize the necessary dynamics to be represented in CDMs for control design.
These system identification models will also serve as the reference models for validating the CDMs and CVMs. Validated models will add value to the tools used to develop them.
Corrosivity Sensor for Exposed Pipelines Based on Wireless Energy Transfer.
Lawand, Lydia; Shiryayev, Oleg; Al Handawi, Khalil; Vahdati, Nader; Rostron, Paul
2017-05-30
External corrosion was identified as one of the main causes of pipeline failures worldwide. A solution that addresses the issue of detecting and quantifying the corrosivity of the environment for application to existing exposed pipelines has been developed. It consists of a sensing array made of an assembly of thin strips of pipeline steel and a circuit that provides a visual sensor reading to the operator. The proposed sensor is passive and does not require a constant power supply. The circuit design was validated through simulations and lab experiments. An accelerated corrosion experiment was conducted to confirm the feasibility of the proposed corrosivity sensor design.
Tamaki, S; Sakai, M; Yoshihashi, S; Manabe, M; Zushi, N; Murata, I; Hoashi, E; Kato, I; Kuri, S; Oshiro, S; Nagasaki, M; Horiike, H
2015-12-01
A mock-up experiment for the development of an accelerator-based neutron source for the Osaka University BNCT project was carried out at Birmingham University, UK. In this paper, the spatial distribution of neutron flux intensity was evaluated by the foil activation method. The validity of the design code system was confirmed by comparing measured gold foil activities with calculations. As a result, it was found that the epi-thermal neutron beam was well collimated by our neutron moderator assembly. Also, the design accuracy was evaluated to have less than 20% error. Copyright © 2015 Elsevier Ltd. All rights reserved.
Experimental design, power and sample size for animal reproduction experiments.
Chapman, Phillip L; Seidel, George E
2008-01-01
The present paper concerns statistical issues in the design of animal reproduction experiments, with emphasis on the problems of sample size determination and power calculations. We include examples and non-technical discussions aimed at helping researchers avoid serious errors that may invalidate conclusions or seriously impair their validity. Screen shots from interactive power calculation programs and basic SAS power calculation programs are presented to aid in understanding and computing statistical power in some common experimental situations. Practical issues that are common to most statistical design problems are briefly discussed. These include one-sided hypothesis tests, power level criteria, equality of within-group variances, transformations of response variables to achieve variance equality, optimal specification of treatment group sizes, 'post hoc' power analysis and arguments for the increased use of confidence intervals in place of hypothesis tests.
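As a concrete illustration of the kind of power and sample-size calculation the paper discusses, here is a normal-approximation sketch for a two-sided, two-sample comparison of means. The effect size, standard deviation, and group sizes are invented for illustration; a t-distribution-based calculation (as in SAS's power procedures) would be slightly more conservative:

```python
import math
from statistics import NormalDist

def two_sample_power(delta, sigma, n_per_group, alpha=0.05):
    """Approximate power of a two-sided two-sample test of means,
    normal approximation, with n_per_group animals per group."""
    nd = NormalDist()
    z_crit = nd.inv_cdf(1 - alpha / 2)          # e.g. 1.96 for alpha=0.05
    se = sigma * math.sqrt(2.0 / n_per_group)   # SE of the mean difference
    return nd.cdf(abs(delta) / se - z_crit)

# Power to detect a 5-unit difference with SD 8 and 30 animals per group:
power = two_sample_power(delta=5.0, sigma=8.0, n_per_group=30)

# Smallest group size reaching 80% power for the same effect:
n_req = next(n for n in range(2, 1000)
             if two_sample_power(5.0, 8.0, n) >= 0.80)
```

Searching for the smallest adequate `n` this way makes the paper's point tangible: an underpowered design (here, 30 per group gives roughly two-thirds power) quietly undermines the validity of negative conclusions.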
Troiano, Luigi; Birtolo, Cosimo; Armenise, Roberto
2016-01-01
In many circumstances, concepts, ideas and emotions are mainly conveyed by colors. Color vision disorders can heavily limit the user experience in accessing the Information Society. Therefore, color vision impairments should be taken into account in order to make information and services accessible to a broader audience. The task is not easy for designers, who generally are not affected by any color vision disorder. In any case, the design of accessible user interfaces should not lead to boring color schemes. The selection of appealing and harmonic color combinations should be preserved. In past research we investigated a generative approach, led by evolutionary computing, to supporting interface designers in making colors accessible to impaired users. This approach has also been followed by other authors. The contribution of this paper is to provide an experimental validation of the claim that this approach is actually beneficial to designers and users.
Scherrer, Stephen R; Rideout, Brendan P; Giorli, Giacomo; Nosal, Eva-Marie; Weng, Kevin C
2018-01-01
Passive acoustic telemetry using coded transmitter tags and stationary receivers is a popular method for tracking movements of aquatic animals. Understanding the performance of these systems is important in array design and in analysis. Close proximity detection interference (CPDI) is a condition where receivers fail to reliably detect tag transmissions. CPDI generally occurs when the tag and receiver are near one another in acoustically reverberant settings. Here we confirm that transmission multipaths reflected off the environment, arriving at a receiver with sufficient delay relative to the direct signal, cause CPDI. We propose a ray-propagation based model to estimate the arrival of energy via multipaths to predict CPDI occurrence, and we show how deeper deployments are particularly susceptible. A series of experiments were designed to develop and validate our model. Deep (300 m) and shallow (25 m) ranging experiments were conducted using Vemco V13 acoustic tags and VR2-W receivers. Probabilistic modeling of hourly detections was used to estimate the average distance at which a tag could be detected. A mechanistic model for predicting the arrival time of multipaths was developed using parameters from these experiments to calculate the direct and multipath path lengths. This model was retroactively applied to the previous ranging experiments to validate CPDI observations. Two additional experiments were designed to validate predictions of CPDI with respect to combinations of deployment depth and distance. Playback of recorded tags in a tank environment was used to confirm multipaths arriving after the receiver's blanking interval cause CPDI effects. Analysis of empirical data estimated the average maximum detection radius (AMDR), the farthest distance at which 95% of tag transmissions went undetected by receivers, was between 840 and 846 m for the deep ranging experiment across all factor permutations.
From these results, CPDI was estimated within a 276.5 m radius of the receiver. These empirical estimations were consistent with mechanistic model predictions. CPDI affected detection at distances closer than 259-326 m from receivers. AMDR determined from the shallow ranging experiment was between 278 and 290 m with CPDI neither predicted nor observed. Results of validation experiments were consistent with mechanistic model predictions. Finally, we were able to predict detection/nondetection with 95.7% accuracy using the mechanistic model's criterion when simulating transmissions with and without multipaths. Close proximity detection interference results from combinations of depth and distance that produce reflected signals arriving after a receiver's blanking interval has ended. Deployment scenarios resulting in CPDI can be predicted with the proposed mechanistic model. For deeper deployments, sea-surface reflections can produce CPDI conditions, resulting in transmission rejection, regardless of the reflective properties of the seafloor.
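The image-source geometry behind the mechanistic model above can be sketched directly: the sea-surface reflection travels the path from the tag's mirror image above the surface, and CPDI is predicted when its extra travel time exceeds the receiver's blanking interval. The sound speed and blanking-interval values below are nominal assumptions, not the study's measured parameters:

```python
import math

SOUND_SPEED = 1500.0  # m/s, nominal seawater value (assumption)

def surface_multipath_delay(horiz_dist, tag_depth, rec_depth):
    """Delay (s) of the sea-surface reflection relative to the direct
    path, via an image source at height tag_depth above the surface."""
    direct = math.hypot(horiz_dist, tag_depth - rec_depth)
    reflected = math.hypot(horiz_dist, tag_depth + rec_depth)
    return (reflected - direct) / SOUND_SPEED

def cpdi_expected(horiz_dist, tag_depth, rec_depth, blanking_s=0.260):
    """CPDI predicted when the reflection arrives after the blanking
    interval has ended (blanking value is illustrative)."""
    return surface_multipath_delay(horiz_dist, tag_depth, rec_depth) > blanking_s
```

The geometry reproduces the paper's qualitative finding: a tag and receiver both at 300 m depth and 50 m apart yield a long surface-reflection delay (CPDI expected), while the same separation at 25 m depth, or a much larger separation at depth, does not.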
DOE Office of Scientific and Technical Information (OSTI.GOV)
Richard R. Schultz; Paul D. Bayless; Richard W. Johnson
2010-09-01
The Oregon State University (OSU) High Temperature Test Facility (HTTF) is an integral experimental facility that will be constructed on the OSU campus in Corvallis, Oregon. The HTTF project was initiated by the U.S. Nuclear Regulatory Commission (NRC) on September 5, 2008 as Task 4 of the 5-year High Temperature Gas Reactor Cooperative Agreement via NRC Contract 04-08-138. Until August 2010, when a DOE contract was initiated to fund additional capabilities for the HTTF project, all of the funding support for the HTTF was provided by the NRC via their cooperative agreement. The U.S. Department of Energy (DOE) began its involvement with the HTTF project in late 2009 via the Next Generation Nuclear Plant (NGNP) project. Because the NRC interests in HTTF experiments were only centered on the depressurized conduction cooldown (DCC) scenario, NGNP involvement focused on expanding the experimental envelope of the HTTF to include steady-state operations and also the pressurized conduction cooldown (PCC). Since DOE has incorporated the HTTF as an ingredient in the NGNP thermal-fluids validation program, several important outcomes should be noted: 1. The reference prismatic reactor design, which serves as the basis for scaling the HTTF, became the modular high temperature gas-cooled reactor (MHTGR). The MHTGR has also been chosen as the reference design for all of the other NGNP thermal-fluid experiments. 2. The NGNP validation matrix is being planned using the same scaling strategy that has been implemented to design the HTTF, i.e., the hierarchical two-tiered scaling methodology developed by Zuber in 1991. Using this approach, a preliminary validation matrix has been designed that integrates the HTTF experiments with the other experiments planned for the NGNP thermal-fluids verification and validation project.
3. Initial analyses showed that the inherent power capability of the OSU infrastructure, which only allowed a total operational facility power capability of 0.6 MW, is inadequate to permit steady-state operation at reasonable conditions. 4. To enable the HTTF to operate at more representative steady-state conditions, DOE recently allocated funding via a DOE subcontract to HTTF to permit an OSU infrastructure upgrade such that 2.2 MW will become available for HTTF experiments. 5. Analyses have been performed to study the relationship between the HTTF and the MHTGR via the hierarchical two-tiered scaling methodology, which has been used successfully in the past, e.g., APEX facility scaling to the Westinghouse AP600 plant. These analyses have focused on the relationship between key variables that will be measured in the HTTF and the counterpart variables in the MHTGR, with a focus on natural circulation, using nitrogen as a working fluid, and core heat transfer. 6. Both RELAP5-3D and computational fluid dynamics (CD-Adapco's STAR-CCM+) numerical models of the MHTGR and the HTTF have been constructed, and analyses are underway to study the relationship between the reference reactor and the HTTF. The HTTF is presently being designed. It has a ¼-scale relationship to the MHTGR in both height and diameter. Decisions have been made to design the reactor cavity cooling system (RCCS) simulation as a boundary condition for the HTTF to ensure that (a) the boundary condition is well defined and (b) the boundary condition can be modified easily to achieve the desired heat transfer sink for HTTF experimental operations.
Looby, Mairead; Ibarra, Neysi; Pierce, James J; Buckley, Kevin; O'Donovan, Eimear; Heenan, Mary; Moran, Enda; Farid, Suzanne S; Baganz, Frank
2011-01-01
This study describes the application of quality by design (QbD) principles to the development and implementation of a major manufacturing process improvement for a commercially distributed therapeutic protein produced in Chinese hamster ovary cell culture. The intent of this article is to focus on QbD concepts, and provide guidance and understanding on how the various components combine together to deliver a robust process in keeping with the principles of QbD. A fed-batch production culture and a virus inactivation step are described as representative examples of upstream and downstream unit operations that were characterized. A systematic approach incorporating QbD principles was applied to both unit operations, involving risk assessment of potential process failure points, small-scale model qualification, design and execution of experiments, definition of operating parameter ranges and process validation acceptance criteria followed by manufacturing-scale implementation and process validation. Statistical experimental designs were applied to the execution of process characterization studies evaluating the impact of operating parameters on product quality attributes and process performance parameters. Data from process characterization experiments were used to define the proven acceptable range and classification of operating parameters for each unit operation. Analysis of variance and Monte Carlo simulation methods were used to assess the appropriateness of process design spaces. Successful implementation and validation of the process in the manufacturing facility and the subsequent manufacture of hundreds of batches of this therapeutic protein verifies the approaches taken as a suitable model for the development, scale-up and operation of any biopharmaceutical manufacturing process. Copyright © 2011 American Institute of Chemical Engineers (AIChE).
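The Monte Carlo assessment of a design space mentioned above can be made concrete with a toy example: draw operating parameters uniformly over their proven acceptable ranges and count the fraction of simulated runs whose predicted quality attribute meets spec. The response surface, parameter ranges, and spec limit below are invented for illustration, not taken from the study:

```python
import random

def quality_attribute(temp_c, ph):
    """Toy response surface for a hypothetical quality attribute (%);
    invented for illustration, not the study's characterized model."""
    return 95.0 - 0.8 * abs(temp_c - 37.0) - 2.0 * abs(ph - 7.0)

def fraction_in_spec(n_trials=10000, spec_min=94.0, seed=1):
    """Monte Carlo sampling of operating parameters over illustrative
    proven acceptable ranges; returns the fraction meeting spec."""
    rng = random.Random(seed)
    ok = 0
    for _ in range(n_trials):
        temp_c = rng.uniform(36.0, 38.0)   # illustrative range
        ph = rng.uniform(6.8, 7.2)
        if quality_attribute(temp_c, ph) >= spec_min:
            ok += 1
    return ok / n_trials

frac = fraction_in_spec()
```

A fraction near 1.0 supports the claim that the design space is appropriate; against a looser spec (say 90%) this toy surface never fails, while the tighter 94% spec exposes a small out-of-spec tail at the range edges.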
ATS-6 engineering performance report. Volume 2: Orbit and attitude controls
NASA Technical Reports Server (NTRS)
Wales, R. O. (Editor)
1981-01-01
Attitude control is reviewed, encompassing the attitude control subsystem, the spacecraft attitude precision pointing and slewing adaptive control experiment, and the RF interferometer experiment. The spacecraft propulsion system (SPS) is discussed, including the subsystem, SPS design description and validation, orbital operations and performance, in-orbit anomalies and contingency operations, and the cesium bombardment ion engine experiment. Thruster failure due to plugging of the propellant feed passages, a major cause of mission termination, is considered among the critical generic failures on the satellite.
The Multidimensional Loss Scale: validating a cross-cultural instrument for measuring loss.
Vromans, Lyn; Schweitzer, Robert D; Brough, Mark
2012-04-01
The Multidimensional Loss Scale (MLS) represents the first instrument designed specifically to index Experience of Loss Events and Loss Distress across multiple domains (cultural, social, material, and intrapersonal) relevant to refugee settlement. Recently settled Burmese adult refugees (N = 70) completed a questionnaire battery, including MLS items. Analyses explored MLS internal consistency, convergent and divergent validity, and factor structure. Cronbach alphas indicated satisfactory internal consistency for Experience of Loss Events (0.85) and Loss Distress (0.92), reflecting a unitary construct of multidimensional loss. Loss Distress did not correlate with depression or anxiety symptoms and correlated moderately with interpersonal grief and trauma symptoms, supporting divergent and convergent validity. Factor analysis provided preliminary support for a five-factor model: Loss of Symbolic Self, Loss of Interdependence, Loss of Home, Interpersonal Loss, and Loss of Intrapersonal Integrity. Received well by participants, the new scale shows promise for application in future research and practice.
Validation and Continued Development of Methods for Spheromak Simulation
NASA Astrophysics Data System (ADS)
Benedett, Thomas
2017-10-01
The HIT-SI experiment has demonstrated stable sustainment of spheromaks. Determining how the underlying physics extrapolate to larger, higher-temperature regimes is of prime importance in determining the viability of the inductively-driven spheromak. It is thus prudent to develop and validate a computational model that can be used to study current results and study the effect of possible design choices on plasma behavior. An extended MHD model has shown good agreement with experimental data at 14 kHz injector operation. Efforts to extend the existing validation to a range of higher frequencies (36, 53, 68 kHz) using the PSI-Tet 3D extended MHD code will be presented, along with simulations of potential combinations of flux conserver features and helicity injector configurations and their impact on current drive performance, density control, and temperature for future SIHI experiments. Work supported by USDoE.
Laser metrology and optic active control system for GAIA
NASA Astrophysics Data System (ADS)
D'Angelo, F.; Bonino, L.; Cesare, S.; Castorina, G.; Mottini, S.; Bertinetto, F.; Bisi, M.; Canuto, E.; Musso, F.
2017-11-01
The Laser Metrology and Optic Active Control (LM&OAC) program has been carried out under ESA contract with the purpose to design and validate a laser metrology system and an actuation mechanism to monitor and control at microarcsec level the stability of the Basic Angle (the angle between the lines of sight of the two telescopes) of the GAIA satellite. As part of the program, a breadboard (including some EQM elements) of the laser metrology and control system has been built and submitted to functional, performance and environmental tests. In what follows, we describe the mission requirements, the system architecture, the breadboard design, and finally the performed validation tests. Conclusions and appraisals from this experience are also reported.
NASA DOEPOD NDE Capabilities Data Book
NASA Technical Reports Server (NTRS)
Generazio, Edward R.
2015-01-01
This data book contains the Directed Design of Experiments for Validating Probability of Detection (POD) Capability of NDE Systems (DOEPOD) analyses of the nondestructive inspection data presented in the NTIAC Nondestructive Evaluation (NDE) Capabilities Data Book. DOEPOD is designed as a decision support system to validate inspection systems, personnel, and protocols, demonstrating 0.90 POD with 95% confidence at critical flaw sizes, a90/95. Although 0.90 POD with 95% confidence at critical flaw sizes is often stated as an inspection requirement in inspection documents, including NASA Standards, NASA critical aerospace applications have historically only accepted 0.978 POD or better with a 95% one-sided lower confidence bound exceeding 0.90 at critical flaw sizes, a90/95.
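The 0.90 POD / 95% confidence criterion has a simple zero-failure binomial reading (this is the standard "29 of 29" argument, not necessarily DOEPOD's exact procedure): after n consecutive detections with no misses, the one-sided 95% lower confidence bound on POD is 0.05 raised to the power 1/n, and n = 29 is the smallest demonstration that clears 0.90:

```python
# Zero-failure binomial lower bound on POD (standard argument; not a
# description of the DOEPOD decision-support algorithm itself).
def pod_lower_bound(consecutive_hits):
    """One-sided 95% lower confidence bound on POD after
    `consecutive_hits` detections and zero misses: the bound p
    satisfies p**n = 0.05, i.e. p = 0.05**(1/n)."""
    return 0.05 ** (1.0 / consecutive_hits)

p29 = pod_lower_bound(29)   # just clears 0.90
p28 = pod_lower_bound(28)   # falls just short of 0.90
```

This is why 29 consecutive hits at a flaw size is the classic minimum demonstration of a90/95, and why stricter acceptance (such as the 0.978 POD figure quoted above) requires substantially longer miss-free runs.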
Verification and Validation of Autonomy Software at NASA
NASA Technical Reports Server (NTRS)
Pecheur, Charles
2000-01-01
Autonomous software holds the promise of new operation possibilities, easier design and development, and lower operating costs. However, as those systems close control loops and arbitrate resources on board with specialized reasoning, the range of possible situations becomes very large and uncontrollable from the outside, making conventional scenario-based testing very inefficient. Analytic verification and validation (V&V) techniques, and model checking in particular, can provide significant help for designing autonomous systems in a more efficient and reliable manner, by providing better coverage and allowing early error detection. This article discusses the general issue of V&V of autonomy software, with an emphasis on model-based autonomy, model-checking techniques, and concrete experiments at NASA.
NASA Technical Reports Server (NTRS)
Noll, Thomas E.; Perry, Boyd, III; Tiffany, Sherwood H.; Cole, Stanley R.; Buttrill, Carey S.; Adams, William M., Jr.; Houck, Jacob A.; Srinathkumar, S.; Mukhopadhyay, Vivek; Pototzky, Anthony S.
1989-01-01
The status of the joint NASA/Rockwell Active Flexible Wing Wind-Tunnel Test Program is described. The objectives are to develop and validate the analysis, design, and test methodologies required to apply multifunction active control technology for improving aircraft performance and stability. Major tasks include designing digital multi-input/multi-output flutter-suppression and rolling-maneuver-load-alleviation concepts for a flexible full-span wind-tunnel model, obtaining an experimental database for the basic model and each control concept, and providing comparisons between experimental and analytical results to validate the methodologies. The opportunity is provided to improve real-time simulation techniques and to gain practical experience with digital control law implementation procedures.
Hanauer, David I; Bauerle, Cynthia
2015-01-01
Science, technology, engineering, and mathematics education reform efforts have called for widespread adoption of evidence-based teaching in which faculty members attend to student outcomes through assessment practice. Awareness about the importance of assessment has illuminated the need to understand what faculty members know and how they engage with assessment knowledge and practice. The Faculty Self-Reported Assessment Survey (FRAS) is a new instrument for evaluating science faculty assessment knowledge and experience. Instrument validation was composed of two distinct studies: an empirical evaluation of the psychometric properties of the FRAS and a comparative known-groups validation to explore the ability of the FRAS to differentiate levels of faculty assessment experience. The FRAS was found to be highly reliable (α = 0.96). The dimensionality of the instrument enabled distinction of assessment knowledge into categories of program design, instrumentation, and validation. In the known-groups validation, the FRAS distinguished between faculty groups with differing levels of assessment experience. Faculty members with formal assessment experience self-reported higher levels of familiarity with assessment terms, higher frequencies of assessment activity, increased confidence in conducting assessment, and more positive attitudes toward assessment than faculty members who were novices in assessment. These results suggest that the FRAS can reliably and validly differentiate levels of expertise in faculty knowledge of assessment. © 2015 D. I. Hanauer and C. Bauerle. CBE—Life Sciences Education © 2015 The American Society for Cell Biology. This article is distributed by The American Society for Cell Biology under license from the author(s). It is available to the public under an Attribution–Noncommercial–Share Alike 3.0 Unported Creative Commons License (http://creativecommons.org/licenses/by-nc-sa/3.0).
A user-targeted synthesis of the VALUE perfect predictor experiment
NASA Astrophysics Data System (ADS)
Maraun, Douglas; Widmann, Martin; Gutierrez, Jose; Kotlarski, Sven; Hertig, Elke; Wibig, Joanna; Rössler, Ole; Huth, Radan
2016-04-01
VALUE is an open European network to validate and compare downscaling methods for climate change research. A key deliverable of VALUE is the development of a systematic validation framework to enable the assessment and comparison of both dynamical and statistical downscaling methods. VALUE's main approach to validation is user-focused: starting from a specific user problem, a validation tree guides the selection of relevant validation indices and performance measures. We consider different aspects: (1) marginal aspects such as mean, variance and extremes; (2) temporal aspects such as spell length characteristics; (3) spatial aspects such as the de-correlation length of precipitation extremes; and (4) multi-variate aspects such as the interplay of temperature and precipitation or scale interactions. Several experiments have been designed to isolate specific points in the downscaling procedure where problems may occur. Experiment 1 (perfect predictors): what is the isolated downscaling skill? How do statistical and dynamical methods compare? How do methods perform at different spatial scales? Experiment 2 (global climate model predictors): how good is the overall representation of regional climate, including errors inherited from global climate models? Experiment 3 (pseudo reality): do methods fail in representing regional climate change? Here, we present a user-targeted synthesis of the results of the first VALUE experiment. In this experiment, downscaling methods are driven with ERA-Interim reanalysis data to eliminate global climate model errors, over the period 1979-2008. As reference data we use, depending on the question addressed, (1) observations from 86 meteorological stations distributed across Europe; (2) gridded observations at the corresponding 86 locations; or (3) gridded, spatially extended observations for selected European regions. With more than 40 contributing methods, this study is the most comprehensive downscaling inter-comparison project so far.
The results clearly indicate that for several aspects, the downscaling skill varies considerably between different methods. For specific purposes, some methods can therefore clearly be excluded.
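Validation indices like those in the VALUE framework reduce to simple statistics on paired model/observation series. A minimal Python sketch of one marginal and one temporal index (illustrative functions with an arbitrary 1 mm wet-day threshold, not VALUE's official index definitions):

```python
def mean_bias(model, obs):
    """Marginal aspect: difference of means (model minus observation)."""
    return sum(model) / len(model) - sum(obs) / len(obs)

def mean_wet_spell(precip, wet=1.0):
    """Temporal aspect: average length of runs of wet days (>= wet mm)."""
    spells, run = [], 0
    for p in precip:
        if p >= wet:
            run += 1
        elif run:
            spells.append(run)
            run = 0
    if run:
        spells.append(run)
    return sum(spells) / len(spells) if spells else 0.0

# Fabricated daily precipitation series (mm) for one station.
obs = [0.0, 2.0, 3.0, 0.0, 0.0, 5.0, 1.0, 0.0]
model = [0.5, 1.5, 2.0, 0.0, 1.2, 4.0, 0.0, 0.0]
print(mean_bias(model, obs), mean_wet_spell(obs), mean_wet_spell(model))
```

A full validation would compute many such indices per station and summarize them across methods.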
Validating vignette and conjoint survey experiments against real-world behavior
Hainmueller, Jens; Hangartner, Dominik; Yamamoto, Teppei
2015-01-01
Survey experiments, like vignette and conjoint analyses, are widely used in the social sciences to elicit stated preferences and study how humans make multidimensional choices. However, there is a paucity of research on the external validity of these methods that examines whether the determinants that explain hypothetical choices made by survey respondents match the determinants that explain what subjects actually do when making similar choices in real-world situations. This study compares results from conjoint and vignette analyses on which immigrant attributes generate support for naturalization with closely corresponding behavioral data from a natural experiment in Switzerland, where some municipalities used referendums to decide on the citizenship applications of foreign residents. Using a representative sample from the same population and the official descriptions of applicant characteristics that voters received before each referendum as a behavioral benchmark, we find that the effects of the applicant attributes estimated from the survey experiments perform remarkably well in recovering the effects of the same attributes in the behavioral benchmark. We also find important differences in the relative performances of the different designs. Overall, the paired conjoint design, where respondents evaluate two immigrants side by side, comes closest to the behavioral benchmark; on average, its estimates are within 2 percentage points of the effects in the behavioral benchmark. PMID:25646415
Application of 3D printing to prototype and develop novel plant tissue culture systems.
Shukla, Mukund R; Singh, Amritpal S; Piunno, Kevin; Saxena, Praveen K; Jones, A Maxwell P
2017-01-01
Due to the complex process of designing and manufacturing new plant tissue culture vessels through conventional means, there have been limited efforts to innovate improved designs. Further, the development and availability of low-cost, energy-efficient LEDs of various spectra have made them a promising light source for plant growth in controlled environments. However, direct replacement of conventional lighting sources with LEDs does not address problems with uniformity, spectral control, or the challenges in conducting statistically valid experiments to assess the effects of light. Prototyping using 3D printing and LED-based light sources could help overcome these limitations and lead to improved culture systems. A modular culture vessel in which the fluence rate and spectrum of light are independently controlled was designed, prototyped using 3D printing, and evaluated for plant growth. This design is compatible with both semi-solid and liquid-based culture systems. Observations of morphology, chlorophyll content, and chlorophyll fluorescence-based stress parameters from in vitro plants cultured under different light spectra with similar overall fluence rates indicated different responses in Nicotiana tabacum and Artemisia annua plantlets. This experiment validates the utility of 3D printing for designing and testing functional vessels and demonstrates that the optimal light spectrum for in vitro plant growth is species-specific. 3D printing was successfully used to prototype novel culture vessels with independently controlled variable fluence rate/spectra LED lighting. This system addresses several limitations associated with current lighting systems, providing more uniform lighting and allowing proper replication/randomization for experimental plant biology while increasing energy efficiency.
A complete procedure including the design and prototyping of a culture vessel using 3D printing, commercial scale injection molding of the prototype, and conducting a properly replicated experiment are discussed. This open source design has the scope for further improvement and adaptation and demonstrates the power of 3D printing to improve the design of culture systems.
Ego-Dissolution and Psychedelics: Validation of the Ego-Dissolution Inventory (EDI).
Nour, Matthew M; Evans, Lisa; Nutt, David; Carhart-Harris, Robin L
2016-01-01
The experience of a compromised sense of "self", termed ego-dissolution, is a key feature of the psychedelic experience. This study aimed to validate the Ego-Dissolution Inventory (EDI), a new 8-item self-report scale designed to measure ego-dissolution. Additionally, we aimed to investigate the specificity of the relationship between psychedelics and ego-dissolution. Sixteen items relating to altered ego-consciousness were included in an internet questionnaire; eight relating to the experience of ego-dissolution (comprising the EDI), and eight relating to the antithetical experience of increased self-assuredness, termed ego-inflation. Items were rated using a visual analog scale. Participants answered the questionnaire for experiences with classical psychedelic drugs, cocaine and/or alcohol. They also answered the seven questions from the Mystical Experiences Questionnaire (MEQ) relating to the experience of unity with one's surroundings. Six hundred and ninety-one participants completed the questionnaire, providing data for 1828 drug experiences (1043 psychedelics, 377 cocaine, 408 alcohol). Exploratory factor analysis demonstrated that the eight EDI items loaded exclusively onto a single common factor, which was orthogonal to a second factor comprised of the items relating to ego-inflation (rho = -0.110), demonstrating discriminant validity. The EDI correlated strongly with the MEQ-derived measure of unitive experience (rho = 0.735), demonstrating convergent validity. EDI internal consistency was excellent (Cronbach's alpha 0.93). Three analyses confirmed the specificity of ego-dissolution for experiences occasioned by psychedelic drugs. Firstly, EDI score correlated with drug-dose for psychedelic drugs (rho = 0.371), but not for cocaine (rho = 0.115) or alcohol (rho = -0.055). 
Secondly, the linear regression line relating the subjective intensity of the experience to ego-dissolution was significantly steeper for psychedelics (unstandardized regression coefficient = 0.701) compared with cocaine (0.135) or alcohol (0.144). Ego-inflation, by contrast, was specifically associated with cocaine experiences. Finally, a binary Support Vector Machine classifier identified experiences occasioned by psychedelic drugs vs. cocaine or alcohol with over 85% accuracy using ratings of ego-dissolution and ego-inflation alone. Our results demonstrate the psychometric structure, internal consistency and construct validity of the EDI. Moreover, we demonstrate the close relationship between ego-dissolution and the psychedelic experience. The EDI will facilitate the study of the neuronal correlates of ego-dissolution, which is relevant for psychedelic-assisted psychotherapy and our understanding of psychosis.
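The internal-consistency statistic quoted above (Cronbach's alpha = 0.93) is computed from item variances and the variance of respondents' total scores. A minimal Python sketch with fabricated visual-analog ratings (not the EDI data):

```python
def cronbach_alpha(items):
    """Cronbach's alpha from item-score columns (one list per item)."""
    k = len(items)                                  # number of items
    n = len(items[0])                               # number of respondents

    def variance(xs):                               # sample variance (n - 1)
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    item_var_sum = sum(variance(col) for col in items)
    totals = [sum(col[i] for col in items) for i in range(n)]
    return k / (k - 1) * (1 - item_var_sum / variance(totals))

# Hypothetical 0-100 visual-analog ratings: three items, five respondents.
items = [
    [80, 60, 90, 30, 70],
    [75, 55, 95, 25, 65],
    [85, 50, 88, 35, 60],
]
print(round(cronbach_alpha(items), 3))
```

When items move together across respondents, as in this toy data, alpha approaches 1; uncorrelated items drive it toward 0.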
Roles of Naturalistic Observation in Comparative Psychology
ERIC Educational Resources Information Center
Miller, David B.
1977-01-01
"Five roles are considered by which systematic, quantified field research can augment controlled laboratory experimentation in terms of increasing the validity of laboratory studies." Advocates that comparative psychologists should "take more initiative in designing, executing, and interpreting our experiments with regard to the natural history of…
Al-Mamun, Mohammad; Zhu, Zhengju; Yin, Huajie; Su, Xintai; Zhang, Haimin; Liu, Porun; Yang, Huagui; Wang, Dan; Tang, Zhiyong; Wang, Yun; Zhao, Huijun
2016-08-04
A novel surface sulfur (S) doped cobalt (Co) catalyst for the oxygen evolution reaction (OER) is theoretically designed through the optimisation of the electronic structure of highly reactive surface atoms; the design is also validated by electrocatalytic OER experiments.
CARS Thermometry in a Supersonic Combustor for CFD Code Validation
NASA Technical Reports Server (NTRS)
Cutler, A. D.; Danehy, P. M.; Springer, R. R.; DeLoach, R.; Capriotti, D. P.
2002-01-01
An experiment has been conducted to acquire data for the validation of computational fluid dynamics (CFD) codes used in the design of supersonic combustors. The primary measurement technique is coherent anti-Stokes Raman spectroscopy (CARS), although surface pressures and temperatures have also been acquired. Modern design-of-experiment techniques have been used to maximize the quality of the data set (for the given level of effort) and minimize systematic errors. The combustor consists of a diverging duct with a single downstream-angled wall injector. The nominal entrance Mach number is 2, and the enthalpy nominally corresponds to Mach 7 flight. Temperature maps are obtained at several planes in the flow for two cases: in one case the combustor is piloted by injecting fuel upstream of the main injector; in the other it is not. Boundary conditions and uncertainties are adequately characterized. Accurate CFD calculation of the flow will ultimately require accurate modeling of the chemical kinetics and turbulence-chemistry interactions as well as accurate modeling of the turbulent mixing.
Content-based VLE designs improve learning efficiency in constructivist statistics education.
Wessa, Patrick; De Rycker, Antoon; Holliday, Ian Edward
2011-01-01
We introduced a series of computer-supported workshops in our undergraduate statistics courses, in the hope that it would help students to gain a deeper understanding of statistical concepts. This raised questions about the appropriate design of the Virtual Learning Environment (VLE) in which such an approach had to be implemented. Therefore, we investigated two competing software design models for VLEs. In the first system, all learning features were a function of the classical VLE. The second system was designed from the perspective that learning features should be a function of the course's core content (statistical analyses), which required us to develop a specific-purpose Statistical Learning Environment (SLE) based on Reproducible Computing and newly developed Peer Review (PR) technology. The main research question is whether the second VLE design improved learning efficiency as compared to the standard type of VLE design that is commonly used in education. As a secondary objective we provide empirical evidence about the usefulness of PR as a constructivist learning activity which supports non-rote learning. Finally, this paper illustrates that it is possible to introduce a constructivist learning approach in large student populations, based on adequately designed educational technology, without subsuming educational content to technological convenience. Both VLE systems were tested within a two-year quasi-experiment based on a Reliable Nonequivalent Group Design. This approach allowed us to draw valid conclusions about the treatment effect of the changed VLE design, even though the systems were implemented in successive years. The methodological aspects about the experiment's internal validity are explained extensively. The effect of the design change is shown to have substantially increased the efficiency of constructivist, computer-assisted learning activities for all cohorts of the student population under investigation. 
The findings demonstrate that a content-based design outperforms the traditional VLE-based design.
Fundamentals of endoscopic surgery: creation and validation of the hands-on test.
Vassiliou, Melina C; Dunkin, Brian J; Fried, Gerald M; Mellinger, John D; Trus, Thadeus; Kaneva, Pepa; Lyons, Calvin; Korndorffer, James R; Ujiki, Michael; Velanovich, Vic; Kochman, Michael L; Tsuda, Shawn; Martinez, Jose; Scott, Daniel J; Korus, Gary; Park, Adrian; Marks, Jeffrey M
2014-03-01
The Fundamentals of Endoscopic Surgery™ (FES) program consists of online materials and didactic and skills-based tests. All components were designed to measure the skills and knowledge required to perform safe flexible endoscopy. The purpose of this multicenter study was to evaluate the reliability and validity of the hands-on component of the FES examination, and to establish the pass score. Expert endoscopists identified the critical skill set required for flexible endoscopy. These skills were then modeled in a virtual reality simulator (GI Mentor™ II, Simbionix™ Ltd., Airport City, Israel) to create five tasks and metrics. Scores were designed to measure both speed and precision. Validity evidence was assessed by correlating performance with self-reported endoscopic experience (surgeons and gastroenterologists [GIs]). Internal consistency of each test task was assessed using Cronbach's alpha. Test-retest reliability was determined by having the same participant perform the test a second time and comparing their scores. Passing scores were determined by a contrasting-groups methodology and use of receiver operating characteristic curves. A total of 160 participants (17 % GIs) performed the simulator test. Scores on the five tasks showed good internal consistency reliability, and all had significant correlations with endoscopic experience. Total FES scores correlated 0.73 with participants' level of endoscopic experience, providing evidence of their validity, and their internal consistency reliability (Cronbach's alpha) was 0.82. Test-retest reliability was assessed in 11 participants, and the intraclass correlation was 0.85. The passing score was determined and is estimated to have a sensitivity (true positive rate) of 0.81 and a 1-specificity (false positive rate) of 0.21. The FES hands-on skills test examines the basic procedural components required to perform safe flexible endoscopy.
It meets rigorous standards of reliability and validity required for high-stakes examinations, and, together with the knowledge component, may help contribute to the definition and determination of competence in endoscopy.
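The reported pass-score characteristics (sensitivity 0.81, false positive rate 0.21) follow from applying a cutoff to the score distributions of two contrasting groups. A hedged Python sketch of that computation; the `competent`/`novice` scores and candidate cutoffs below are invented, not FES data:

```python
def rates_at_cutoff(competent, novice, cutoff):
    """True/false positive rates when 'pass' means score >= cutoff.

    Sensitivity = fraction of the competent group that passes;
    false positive rate = fraction of the novice group that also passes.
    """
    sensitivity = sum(s >= cutoff for s in competent) / len(competent)
    false_positive_rate = sum(s >= cutoff for s in novice) / len(novice)
    return sensitivity, false_positive_rate

# Hypothetical simulator scores for the two contrasting groups.
competent = [72, 80, 65, 90, 77, 58, 83, 70, 88, 61]
novice = [40, 55, 62, 48, 35, 66, 52, 45, 58, 50]

for cutoff in (55, 60, 65):  # scan candidate pass scores
    sens, fpr = rates_at_cutoff(competent, novice, cutoff)
    print(cutoff, sens, fpr)
```

Scanning cutoffs this way traces out the ROC curve; the pass score is chosen where the sensitivity/specificity trade-off is acceptable.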
Pauthenier, Cyrille; Faulon, Jean-Loup
2014-07-01
PrecisePrimer is a web-based primer design software made to assist experimentalists in repetitive primer design tasks such as preparing, cloning and shuffling DNA libraries. Unlike other popular primer design tools, it is conceived to generate primer libraries with popular PCR polymerase buffers offered as pre-set options. PrecisePrimer is also meant to design primers in batches, such as for DNA library creation or DNA shuffling experiments, and to have the simplest possible interface. It integrates the most up-to-date melting temperature algorithms, validated with experimental data and cross-validated with other computational tools. We generated a library of primers for the extraction and cloning of 61 genes from a yeast genomic DNA extract using default parameters. All primer pairs efficiently amplified their target without any optimization of the PCR conditions. © The Author(s) 2014. Published by Oxford University Press on behalf of Nucleic Acids Research.
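PrecisePrimer's own melting-temperature algorithms are not reproduced here; as an illustration of the simplest such calculation, the classic Wallace rule estimates Tm from base counts (a coarse approximation valid only for short oligos, not the nearest-neighbor thermodynamic model production tools use):

```python
def wallace_tm(primer):
    """Wallace-rule melting temperature: 2 degC per A/T, 4 degC per G/C.

    Only a rough estimate for oligos up to ~14 nt; real primer-design
    software uses nearest-neighbor thermodynamics with salt corrections.
    """
    primer = primer.upper()
    at = primer.count("A") + primer.count("T")
    gc = primer.count("G") + primer.count("C")
    return 2 * at + 4 * gc

print(wallace_tm("ATGCATGCATGC"))  # 6 A/T + 6 G/C -> 36 degC
```

Batch design, as in PrecisePrimer, would apply such a calculation to every candidate primer and keep pairs whose Tm values are matched.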
Multi-Evaporator Miniature Loop Heat Pipe for Small Spacecraft Thermal Control
NASA Technical Reports Server (NTRS)
Ku, Jentung; Ottenstein, Laura; Douglas, Donya
2008-01-01
This paper presents the development of the Thermal Loop experiment under NASA's New Millennium Program Space Technology 8 (ST8) Project. The Thermal Loop experiment was originally planned for validating in space an advanced heat transport system consisting of a miniature loop heat pipe (MLHP) with multiple evaporators and multiple condensers. Details of the thermal loop concept, technical advances and benefits, Level 1 requirements, and the technology validation approach are described. An MLHP breadboard has been built and tested in laboratory and thermal vacuum environments, and has demonstrated excellent performance that met or exceeded the design requirements. The MLHP retains all features of state-of-the-art loop heat pipes and offers additional advantages to enhance the functionality, performance, versatility, and reliability of the system. In addition, an analytical model has been developed to simulate the steady-state and transient operation of the MLHP, and the model predictions agreed very well with experimental results. A protoflight MLHP has been built and is being tested in a thermal vacuum chamber to validate its performance and technical readiness for a flight experiment.
Validation and Continued Development of Methods for Spheromak Simulation
NASA Astrophysics Data System (ADS)
Benedett, Thomas
2016-10-01
The HIT-SI experiment has demonstrated stable sustainment of spheromaks. Determining how the underlying physics extrapolate to larger, higher-temperature regimes is of prime importance in determining the viability of the inductively-driven spheromak. It is thus prudent to develop and validate a computational model that can be used to study current results and study the effect of possible design choices on plasma behavior. A zero-beta Hall-MHD model has shown good agreement with experimental data at 14.5 kHz injector operation. Experimental observations at higher frequency, where the best performance is achieved, indicate pressure effects are important and likely required to attain quantitative agreement with simulations. Efforts to extend the existing validation to high frequency (36-68 kHz) using an extended MHD model implemented in the PSI-TET arbitrary-geometry 3D MHD code will be presented. An implementation of anisotropic viscosity, a feature observed to improve agreement between NIMROD simulations and experiment, will also be presented, along with investigations of flux conserver features and their impact on density control for future SIHI experiments. Work supported by DoE.
Style preference survey: a report on the psychometric properties and a cross-validation experiment.
Smith, Sherri L; Ricketts, Todd; McArdle, Rachel A; Chisolm, Theresa H; Alexander, Genevieve; Bratt, Gene
2013-02-01
Several self-report measures exist that target different aspects of outcomes for hearing aid use. Currently, no comprehensive questionnaire specifically assesses factors that may be important for differentiating outcomes pertaining to hearing aid style. The goal of this work was to develop the Style Preference Survey (SPS), a questionnaire aimed at outcomes associated with hearing aid style differences. Two experiments were conducted. After initial item development, Experiment 1 was conducted to refine the items and to determine its psychometric properties. Experiment 2 was designed to cross-validate the findings from the initial experiment. An observational design was used in both experiments. Participants who wore traditional, custom-fitted (TC) or open-canal (OC) style hearing aids from 3 mo to 3 yr completed the initial experiment. One-hundred and eighty-four binaural hearing aid users (120 of whom wore TC hearing aids and 64 of whom wore OC hearing aids) participated. A new sample of TC and OC users (n = 185) participated in the cross-validation experiment. Currently available self-report measures were reviewed to identify items that might differentiate between hearing aid styles, particularly preference for OC versus TC hearing aid styles. A total of 15 items were selected and modified from available self-report measures. An additional 55 items were developed through consensus of six audiologists for the initial version of the SPS. In the first experiment, the initial SPS version was mailed to 550 veterans who met the inclusion criteria. A total of 184 completed the SPS. Approximately three weeks later, a subset of participants (n = 83) completed the SPS a second time. Basic analyses were conducted to evaluate the psychometric properties of the SPS including subscale structure, internal consistency, test-retest reliability, and responsiveness. Based on the results of Experiment 1, the SPS was revised. 
A cross-validation experiment was then conducted using the revised version of the SPS to confirm the subscale structure, internal consistency, and responsiveness of the questionnaire in a new sample of participants. The final factor analysis led to the final version of the SPS, which had a total of 35 items encompassing five subscales: (1) Feedback; (2) Occlusion/Own Voice Effects; (3) Localization; (4) Fit, Comfort, and Cosmetics; and (5) Ease of Use. The internal consistency of the total SPS (Cronbach's α = .92) and of the subscales (each Cronbach's α > .75) was high. Intraclass correlations (ICCs) showed that the test-retest reliability of the total SPS (ICC = .93) and of the subscales (each ICC > .80) was also high. TC hearing aid users had significantly poorer outcomes than OC hearing aid users on 4 of the 5 subscales, suggesting that the SPS is largely responsive to factors related to style-specific differences. The results suggest that the SPS has good psychometric properties and is a valid and reliable measure of outcomes related to style-specific hearing aid preference. American Academy of Audiology.
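Test-retest intraclass correlations like those reported above can be illustrated with a one-way random-effects ICC, here ICC(1,1); a sketch with fabricated test/retest totals rather than SPS data (the published analysis may have used a different ICC variant):

```python
def icc_oneway(ratings):
    """One-way random-effects ICC(1,1) for repeated measurements.

    ratings: list of per-subject lists, each holding the same number k
    of repeated measurements (e.g. [test, retest] pairs).
    """
    n = len(ratings)
    k = len(ratings[0])
    grand = sum(sum(r) for r in ratings) / (n * k)
    subject_means = [sum(r) / k for r in ratings]
    # Between-subjects and within-subjects mean squares from one-way ANOVA.
    msb = k * sum((m - grand) ** 2 for m in subject_means) / (n - 1)
    msw = sum((x - m) ** 2
              for r, m in zip(ratings, subject_means)
              for x in r) / (n * (k - 1))
    return (msb - msw) / (msb + (k - 1) * msw)

# Hypothetical test/retest total scores for six respondents.
pairs = [[110, 112], [95, 99], [130, 125], [80, 84], [104, 101], [122, 126]]
print(round(icc_oneway(pairs), 3))
```

When retest scores closely track test scores, as here, the ICC is close to 1; large within-subject changes pull it toward 0.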
Shanks, Ryan A; Robertson, Chuck L; Haygood, Christian S; Herdliksa, Anna M; Herdliska, Heather R; Lloyd, Steven A
2017-01-01
Introductory biology courses provide an important opportunity to prepare students for future courses, yet existing cookbook labs, although important in their own way, fail to provide many of the advantages of semester-long research experiences. Engaging, authentic research experiences aid biology students in meeting many learning goals. Therefore, overlaying a research experience onto the existing lab structure allows faculty to overcome barriers involving curricular change. Here we propose a working model for this overlay design in an introductory biology course and detail a means to conduct this lab with minimal increases in student and faculty workloads. Furthermore, we conducted exploratory factor analysis of the Experimental Design Ability Test (EDAT) and uncovered two latent factors which provide valid means to assess this overlay model's ability to increase advanced experimental design abilities. In a pre-test/post-test design, we demonstrate significant increases in both basic and advanced experimental design abilities in an experimental and comparison group. We measured significantly higher gains in advanced experimental design understanding in students in the experimental group. We believe this overlay model and EDAT factor analysis contribute a novel means to conduct and assess the effectiveness of authentic research experiences in an introductory course without major changes to the course curriculum and with minimal increases in faculty and student workloads.
False Dichotomies and Health Policy Research Designs: Randomized Trials Are Not Always the Answer.
Soumerai, Stephen B; Ceccarelli, Rachel; Koppel, Ross
2017-02-01
Some medical scientists argue that only data from randomized controlled trials (RCTs) are trustworthy. They claim data from natural experiments and administrative data sets are always spurious and cannot be used to evaluate health policies and other population-wide phenomena in the real world. While many acknowledge biases caused by poor study designs, in this article we argue that several valid designs using administrative data can produce strong findings, particularly the interrupted time series (ITS) design. Many policy studies neither permit nor require an RCT for cause-and-effect inference. Framing our arguments using Campbell and Stanley's classic research design monograph, we show that several "quasi-experimental" designs, especially interrupted time series (ITS), can estimate valid effects (or non-effects) of health interventions and policies as diverse as public insurance coverage, speed limits, hospital safety programs, drug abuse regulation and withdrawal of drugs from the market. We further note the recent rapid uptake of ITS and argue for expanded training in quasi-experimental designs in medical and graduate schools and in post-doctoral curricula.
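A minimal worked example of the interrupted time series logic: fit the pre- and post-interruption trends separately and measure the level change at the change point against the extrapolated pre-trend. The data below are fabricated; a real ITS analysis would fit a single segmented-regression model and account for autocorrelation:

```python
def fit_line(ts, ys):
    """Ordinary least squares for y = a + b*t; returns (intercept, slope)."""
    n = len(ts)
    mt = sum(ts) / n
    my = sum(ys) / n
    b = sum((t - mt) * (y - my) for t, y in zip(ts, ys)) / \
        sum((t - mt) ** 2 for t in ts)
    return my - b * mt, b

def its_effect(y, change_point):
    """Level change at the interruption: post-period fit minus the
    counterfactual obtained by extrapolating the pre-period trend."""
    pre_t = list(range(change_point))
    post_t = list(range(change_point, len(y)))
    a0, b0 = fit_line(pre_t, y[:change_point])
    a1, b1 = fit_line(post_t, y[change_point:])
    t = change_point
    return (a1 + b1 * t) - (a0 + b0 * t)

# Fabricated monthly outcome: slow upward drift, then a policy at month 12
# that drops the level by 8 units while leaving the trend unchanged.
series = [50 + 0.5 * t for t in range(12)] + [42 + 0.5 * t for t in range(12, 24)]
print(round(its_effect(series, change_point=12), 2))  # -8.0
```

The strength of the design comes from the pre-period trend serving as its own control, which is why ITS can support causal inference without randomization.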
Soil Moisture Active Passive (SMAP) Calibration and Validation Plan and Current Activities
NASA Technical Reports Server (NTRS)
Jackson, T. J.; Cosh, M.; Bindlish, R.; Crow, W.; Colliander, A.; Njoku, E.; McDonald, K.; Kimball, J.; Belair, S.; Walker, J.;
2010-01-01
The primary objective of the SMAP calibration and validation (Cal/Val) program is to demonstrate that the science requirements (product accuracy and bias) have been met over the mission life. This begins during pre-launch with activities that contribute to high-quality products and establish the post-launch validation infrastructure, and continues through the mission life. However, the major focus is on a relatively short Cal/Val period following launch. The general approach and elements of the SMAP Cal/Val plan will be described, along with details on several ongoing or recent field experiments designed to address both near- and long-term Cal/Val.
BIM LAU-PE: Seedlings in Microgravity
NASA Astrophysics Data System (ADS)
Gass, S.; Pennese, R.; Chapuis, D.; Dainesi, P.; Nebuloni, S.; Garcia, M.; Oriol, A.
2015-09-01
The effect of gravity on plant roots is an intensive subject of research. Sounding rockets represent a cost-effective platform to study this effect under microgravity conditions. As part of the upcoming MASER 13 sounding rocket campaign, two experiments on Arabidopsis thaliana seedlings have been devised: GRAMAT and SPARC. These experiments are aimed at studying (1) the genes that are specifically switched on or off during microgravity, and (2) the position of auxin-transporting proteins during microgravity. To perform these experiments, the Nyon site of RUAG Space Switzerland, in collaboration with the Swedish Space Corporation (SSC) and the University of Freiburg, has developed the BIM LAU-PE (Biology In Microgravity Late Access Unit Plant Experiment). In the following, an overview of the BIM LAU-PE design is presented, highlighting specific module design features and the verifications performed. Particular emphasis is placed on the parabolic flight experiments, including results of the micro-g injection system validation.
Validation of Design and Analysis Techniques of Tailored Composite Structures
NASA Technical Reports Server (NTRS)
Jegley, Dawn C. (Technical Monitor); Wijayratne, Dulnath D.
2004-01-01
Aeroelasticity is the relationship between the elasticity of an aircraft structure and its aerodynamics. This relationship can cause instabilities such as flutter in a wing. Engineers have long studied aeroelasticity to ensure such instabilities do not become a problem within normal operating conditions. In recent decades structural tailoring has been used to take advantage of aeroelasticity. It is possible to tailor an aircraft structure to respond favorably to multiple different flight regimes such as takeoff, landing, cruise, 2-g pull up, etc. Structures can be designed so that these responses provide an aerodynamic advantage. This research investigates the ability to design and analyze tailored structures made from filamentary composites. Specifically, the accuracy of tailored composite analysis must be verified if this design technique is to become feasible. To pursue this idea, a validation experiment has been performed on a small-scale filamentary composite wing box. The box is tailored such that its cover panels induce a global bend-twist coupling under an applied load. Two types of analysis were chosen for the experiment. The first is a closed-form analysis based on a theoretical model of a single-cell tailored box beam, and the second is a finite element analysis. The predicted results are compared with the measured data to validate the analyses. The comparison of results shows that the finite element analysis is capable of predicting displacements and strains to within 10% on the small-scale structure. The closed-form code is consistently able to predict the wing box bending to within 25% of the measured value. This error is expected due to simplifying assumptions in the closed-form analysis. Differences between the closed-form code representation and the wing box specimen caused large errors in the twist prediction. The closed-form analysis prediction of twist has not been validated by this test.
Tawfik, Hatem A; Fouad, Yousef A; Hafez, Rashad
2015-01-01
To introduce and evaluate the safety of a novel dual-sided electrosurgery handpiece design for simultaneous tissue cutting and coagulation. We designed a prototype double-sided handpiece allowing automatic switching between two electrodes with a simple handpiece flip. The concept of the system as a surgical instrument was assessed by an animal experiment. The skin of 15 Wistar albino white rats could be successfully incised and coagulated using both ends of the handpiece, thereby confirming the prospects and clinical applications of the system. The dual-sided electrosurgery handpiece is a simple and safe alternative to the traditional electrosurgery pencil, allowing the simultaneous use of two electrodes without the hassle of frequent electrode replacement.
Development and validation of the Alcohol Myopia Scale.
Lac, Andrew; Berger, Dale E
2013-09-01
Alcohol myopia theory conceptualizes the ability of alcohol to narrow attention and how this demand on mental resources produces the impairments of self-inflation, relief, and excess. The current research was designed to develop and validate a scale based on this framework. People who were alcohol users rated items representing myopic experiences arising from drinking episodes in the past month. In Study 1 (N = 260), the preliminary 3-factor structure was supported by exploratory factor analysis. In Study 2 (N = 289), the 3-factor structure was substantiated with confirmatory factor analysis, and it was superior in fit to an empirically indefensible 1-factor structure. The final 14-item scale was evaluated with internal consistency reliability, discriminant validity, convergent validity, criterion validity, and incremental validity. The alcohol myopia scale (AMS) illuminates conceptual underpinnings of this theory and yields insights for understanding the tunnel vision that arises from intoxication.
Development of Thermal Radiation Experiments Kit Based on Data Logger for Physics Learning Media
NASA Astrophysics Data System (ADS)
Permana, H.; Iswanto, B. H.
2018-04-01
A Thermal Radiation Experiments Kit (TREK) based on a data logger was developed as a physics learning medium. TREK is intended for use on the subject of Temperature and Heat to explain the concept of the emissivity of a material in grade XI, adding variety to the experiments commonly performed, such as thermal expansion, thermal energy transfer (conduction, convection, and radiation), and specific heat capacity. A DHT11 sensor is used to measure temperature, and an Arduino Uno microcontroller serves as the data logger. The objects tested are coated glass thin films and aluminum samples of different colors. TREK comes with a user manual and a student worksheet (LKS) to make it easier for teachers and students to use. TREK was developed using the ADDIE Development Model (Analyze, Design, Development, Implementation, and Evaluation) and was validated by experts, physics teachers, and students. The validation instrument was a questionnaire with a five-point Likert response scale covering three aspects: appropriateness of content and concepts, design, and user friendliness. The results showed that TREK was rated excellent (experts 88.13%, science teachers 95.68%, and students 85.77%).
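The expert, teacher, and student percentages reported above are consistent with scoring a Likert questionnaire as a fraction of the maximum possible score. A small sketch under that assumption, with hypothetical ratings:

```python
def likert_percentage(responses, scale_max=5):
    """Aggregate Likert responses into a percentage of the maximum score.

    responses: list of item ratings (1..scale_max) from one questionnaire.
    """
    return sum(responses) / (scale_max * len(responses)) * 100.0

# Hypothetical expert ratings on a 5-point scale for 8 review items.
expert = [5, 4, 5, 4, 4, 5, 4, 5]
print(f"{likert_percentage(expert):.2f}%")   # prints 90.00%
```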
Drones for aerodynamic and structural testing /DAST/ - A status report
NASA Technical Reports Server (NTRS)
Murrow, H. N.; Eckstrom, C. V.
1978-01-01
A program for providing research data on aerodynamic loads and active control systems on wings with supercritical airfoils in the transonic speed range is described. Analytical development, wind tunnel tests, and flight tests are included. A Firebee II target drone vehicle has been modified for use as a flight test facility. The program currently includes flight experiments on two aeroelastic research wings. The primary purpose of the first flight experiment is to demonstrate an active control system for flutter suppression on a transport-type wing. Design and fabrication of the wing are complete, and after installation of research instrumentation and the flutter suppression system, flight testing is expected to begin in early 1979. The experiment on the second research wing, a fuel-conservative transport type, is to demonstrate multiple active control systems, including flutter suppression, maneuver load alleviation, gust load alleviation, and reduced static stability. Of special importance for this second experiment is the development and validation of integrated design methods which include the benefits of active controls in the structural design.
NASA Astrophysics Data System (ADS)
Villa, Enrique; Cano, Juan L.; Aja, Beatriz; Terán, J. Vicente; de la Fuente, Luisa; Mediavilla, Ángel; Artal, Eduardo
2018-03-01
This paper describes the analysis, design and characterization of a polarimetric receiver developed to cover the 35-47 GHz frequency band in the new instrument aimed at completing the ground-based Q-U-I Joint Tenerife Experiment, which is designed to measure polarization in the Cosmic Microwave Background. The high frequency instrument described is a HEMT-based array composed of 29 pixels. A thorough analysis of the behaviour of the proposed receiver, based on electronic phase switching, is presented for a noise-like linearly polarized input signal, simultaneously obtaining the I, Q and U Stokes parameters of the input signal. Wideband subsystems are designed, assembled and characterized for the polarimeter, and their performances show appropriate results within the 35-47 GHz frequency band. Functionality tests were performed at room and cryogenic temperatures, with adequate results under both temperature conditions, which validates the receiver concept and performance.
Development and Implementation of Minimum Hiring Specifications
ERIC Educational Resources Information Center
Herbstritt, Michael R.
1978-01-01
Specifications were developed to avoid possible discrimination and confusion in the hiring practices at a large southeastern university. They were developed through job analysis and a systematic file search designed to find the education and prior related work experience possessed by each incumbent. The specifications were validated as…
Off-design Performance Analysis of Multi-Stage Transonic Axial Compressors
NASA Astrophysics Data System (ADS)
Du, W. H.; Wu, H.; Zhang, L.
Because of the complex flow fields and component interactions in modern gas turbine engines, extensive experiments are required to validate performance and stability, and the experimental process can become expensive and complex. Modeling and simulation of gas turbine engines are a way to reduce experimental costs, provide fidelity, and enhance the quality of essential experiments. The flow field of a transonic compressor contains all the flow aspects that are difficult to predict: boundary layer transition and separation, shock-boundary layer interactions, and large flow unsteadiness. Accurate off-design performance prediction for transonic axial compressors is especially difficult, due in large part to three-dimensional blade design and the resulting flow field. Although recent advancements in computer capacity have brought computational fluid dynamics to the forefront of turbomachinery design and analysis, grid resolution and turbulence modeling still limit Reynolds-averaged Navier-Stokes (RANS) approximations in the multi-stage transonic axial compressor flow field. Streamline curvature methods therefore remain the dominant numerical approach and an important tool for turbomachinery analysis and design, and it is generally accepted that streamline curvature solution techniques provide satisfactory flow prediction as long as the losses, deviation, and blockage are accurately predicted.
Observations with the ROWS instrument during the Grand Banks calibration/validation experiments
NASA Technical Reports Server (NTRS)
Vandemark, D.; Chapron, B.
1994-01-01
As part of a global program to validate the ocean surface sensors on board ERS-1, a joint experiment on the Grand Banks of Newfoundland was carried out in Nov. 1991. The principal objective was to provide a field validation of ERS-1 Synthetic Aperture Radar (SAR) measurements of ocean surface structure. The NASA P-3 aircraft measurements made during this experiment provide independent measurements of the ocean surface along the validation swath. The Radar Ocean Wave Spectrometer (ROWS) is a radar sensor designed to measure the direction of the long-wave components using spectral analysis of the tilt-induced radar backscatter modulation. This technique differs greatly from SAR and thus provides a unique set of measurements for use in evaluating SAR performance. An altimeter channel in the ROWS also gives simultaneous information on the surface wave height and the radar mean-square slope parameter. The sets of geophysical parameters (wind speed, significant wave height, directional spectrum) are used to study the SAR's ability to accurately measure ocean gravity waves. The known distortion imposed on the true directional spectrum by the SAR imaging mechanism is discussed in light of direct comparisons between ERS-1 SAR, airborne Canada Centre for Remote Sensing (CCRS) SAR, and ROWS spectra, and of the use of the nonlinear ocean-SAR transform.
Proposal of an Extended Taxonomy of Serious Games for Health Rehabilitation.
Rego, Paula Alexandra; Moreira, Pedro Miguel; Reis, Luís Paulo
2018-06-29
Serious Games is a field of research that has evolved substantially, with valuable contributions to many application domains and areas. Patients often consider traditional rehabilitation approaches repetitive and boring, making it difficult for them to maintain ongoing interest and complete the treatment program. Since the publication of our first taxonomy of Serious Games for Health Rehabilitation (SGHR), many studies have been published with game prototypes in this area. Our goal is to propose an updated taxonomy that takes into account the works, updates, and innovations in game criteria researched since our first publication in 2010, and to present the validation mechanism used for the proposed extended taxonomy. Based on a literature review in the area and on an analysis of the contributions made by other researchers, we propose an extended taxonomy for SGHR. Because we identified that, beyond the mechanisms associated with the adoption of a given taxonomy, no validation mechanisms had been reported for existing proposals, we designed a mechanism to validate our proposal: a questionnaire addressed to a sample of researchers and professionals with experience and expertise in domains of knowledge interrelated with SGHR, such as Computer Graphics, Game Design, Interaction Design, Computer Programming, and Health Rehabilitation. The extended taxonomy proposal for health rehabilitation serious games provides the research community with a tool to fully characterize serious games, and the mechanism designed for validating the taxonomy proposal is another contribution of this work.
Assessing the Two-Plasmon Decay Threat Through Simulations and Experiments on the NIKE Laser System
NASA Astrophysics Data System (ADS)
Phillips, Lee; Weaver, J. L.; Oh, J.; Schmitt, A. J.; Obenschain, S.
2010-11-01
NIKE is a KrF laser system at the Naval Research Laboratory used to explore hydrodynamic stability, equation of state, and other physics problems arising in IFE research. The comparatively short KrF wavelength is expected to raise the threshold of most parametric instabilities. We report on simulations performed using the FAST3D radiation hydrocode to design TPD experiments that have allowed us to explore the validity of simple threshold formulas and to help establish the accuracy of our simulations. We have also studied proposed high-gain shock ignition designs and devised experiments that can approach the relevant scale-length/temperature regime, giving us a potential experimental method to study the LPI threat to these designs by direct observation. Through FAST3D studies of shock-ignited and conventional direct-drive designs with KrF (248 nm) and third-harmonic (351 nm) drivers, we examine the benefits of the shorter-wavelength KrF light in reducing the LPI threat.
Dasgupta, Annwesa P; Anderson, Trevor R; Pelaez, Nancy
2014-01-01
It is essential to teach students about experimental design, as this facilitates their deeper understanding of how most biological knowledge was generated and gives them tools to perform their own investigations. Despite the importance of this area, surprisingly little is known about what students actually learn from designing biological experiments. In this paper, we describe a rubric for experimental design (RED) that can be used to measure knowledge of and diagnose difficulties with experimental design. The development and validation of the RED was informed by a literature review and empirical analysis of undergraduate biology students' responses to three published assessments. Five areas of difficulty with experimental design were identified: the variable properties of an experimental subject; the manipulated variables; measurement of outcomes; accounting for variability; and the scope of inference appropriate for experimental findings. Our findings revealed that some difficulties, documented some 50 yr ago, still exist among our undergraduate students, while others remain poorly investigated. The RED shows great promise for diagnosing students' experimental design knowledge in lecture settings, laboratory courses, research internships, and course-based undergraduate research experiences. It also shows potential for guiding the development and selection of assessment and instructional activities that foster experimental design. © 2014 A. P. Dasgupta et al. CBE—Life Sciences Education © 2014 The American Society for Cell Biology.
Predictive design and interpretation of colliding pulse injected laser wakefield experiments
NASA Astrophysics Data System (ADS)
Cormier-Michel, Estelle; Ranjbar, Vahid H.; Cowan, Ben M.; Bruhwiler, David L.; Geddes, Cameron G. R.; Chen, Min; Ribera, Benjamin; Esarey, Eric; Schroeder, Carl B.; Leemans, Wim P.
2010-11-01
The use of colliding laser pulses to control the injection of plasma electrons into the wake of a laser plasma accelerator is a promising approach to obtaining stable, tunable electron bunches with reduced emittance and energy spread. Colliding Pulse Injection (CPI) experiments are being performed by groups around the world. We present recent particle-in-cell simulations, using the parallel VORPAL framework, of CPI for physical parameters relevant to ongoing experiments of the LOASIS program at LBNL. We evaluate the effect of laser and plasma tuning on the trapped electron bunch and perform parameter scans in order to optimize the quality of the bunch. The impact of non-ideal effects such as imperfect laser modes and laser self-focusing is also evaluated. Simulation data are validated against current experimental results and are used to design future experiments.
Cryogenic Design of the Setup for MARE-1 in Milan
NASA Astrophysics Data System (ADS)
Schaeffer, D.; Arnaboldi, C.; Ceruti, G.; Ferri, E.; Kilbourne, C.; Kraft-Bermuth, S.; Margesin, B.; McCammon, D.; Monfardini, A.; Nucciotti, A.; Pessina, G.; Previtali, E.; Sisti, M.
2008-05-01
A large worldwide collaboration is growing around the project of Micro-calorimeter Arrays for a Rhenium Experiment (MARE) for a direct calorimetric measurement of the neutrino mass. To validate the use of cryogenic detectors by checking for unexpected systematic errors, two initial experiments are planned using the available techniques, composed of arrays of 300 detectors to measure 10^10 events in a reasonable time of 3 years (step MARE-1) and to reach a sensitivity on the neutrino mass of ~2 eV/c^2. Our experiment in Milan is based on compensated doped silicon implanted thermistor arrays made at NASA/GSFC and on AgReO4 crystals. We present here the design of the cryogenic system that integrates all the requirements for such an experiment (electronics for high impedances, low parasitic capacitances, low microphonic noise).
Optical Closed-Loop Propulsion Control System Development
NASA Technical Reports Server (NTRS)
Poppel, Gary L.
1998-01-01
The overall objective of this program was to design and fabricate the components required for optical closed-loop control of an F404-400 turbofan engine, building on the experience of the NASA Fiber Optic Control System Integration (FOCSI) program. Evaluating the performance of fiber optic technology at the component and system levels will help validate its use on aircraft engines. This report includes descriptions of three test plans. The EOI Acceptance Test is designed to demonstrate satisfactory functionality of the EOI: primarily fail-safe throughput of the F404 sensor signals in the normal mode, and validation, switching, and output of the five analog sensor signals as generated from validated optical sensor inputs in the optical mode. The EOI System Test is designed to demonstrate acceptable F404 ECU functionality as interfaced with the EOI, making use of a production ECU test stand. The Optical Control Engine Test Request describes planned hardware installation, optical signal calibrations, data system coordination, test procedures, and data signal comparisons for an engine test demonstration of the optical closed-loop control.
Quasi-experimental study designs series-paper 7: assessing the assumptions.
Bärnighausen, Till; Oldenburg, Catherine; Tugwell, Peter; Bommer, Christian; Ebert, Cara; Barreto, Mauricio; Djimeu, Eric; Haber, Noah; Waddington, Hugh; Rockers, Peter; Sianesi, Barbara; Bor, Jacob; Fink, Günther; Valentine, Jeffrey; Tanner, Jeffrey; Stanley, Tom; Sierra, Eduardo; Tchetgen, Eric Tchetgen; Atun, Rifat; Vollmer, Sebastian
2017-09-01
Quasi-experimental designs are gaining popularity in epidemiology and health systems research, in particular for the evaluation of health care practice, programs, and policy, because they allow strong causal inferences without randomized controlled experiments. We describe the concepts underlying five important quasi-experimental designs: Instrumental Variables, Regression Discontinuity, Interrupted Time Series, Fixed Effects, and Difference-in-Differences designs. We illustrate each of the designs with an example from health research. We then describe the assumptions required for each of the designs to ensure valid causal inference and discuss the tests available to examine the assumptions. Copyright © 2017 Elsevier Inc. All rights reserved.
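Of the five designs listed, Difference-in-Differences is the simplest to sketch numerically. A minimal example with hypothetical group means; the parallel-trends assumption discussed in the paper is what justifies the subtraction:

```python
def diff_in_diff(treat_pre, treat_post, ctrl_pre, ctrl_post):
    """DiD estimate: change in the treated group minus change in controls."""
    return (treat_post - treat_pre) - (ctrl_post - ctrl_pre)

# Hypothetical mean outcomes (e.g., clinic visits per 1,000 people)
# before and after a policy that affected only the treated region.
effect = diff_in_diff(treat_pre=50.0, treat_post=42.0,
                      ctrl_pre=48.0, ctrl_post=46.0)
print(effect)   # -6.0: a 6-unit drop attributed to the policy,
                # valid only if the parallel-trends assumption holds
```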
Flight Validation of On-Demand Operations: The Deep Space One Beacon Monitor Operations Experiment
NASA Technical Reports Server (NTRS)
Wyatt, Jay; Sherwood, Rob; Sue, Miles; Szijjarto, John
2000-01-01
After a brief overview of the operational concept, this paper will provide a detailed description of the as-flown flight software components, the DS1 experiment plan, and experiment results to date. Special emphasis will be given to experiment results and lessons learned, since the basic system design has been previously reported. Mission scenarios where beacon operations are highly applicable will be described. Detailed cost-savings estimates for a sample science mission will be provided, as will the cumulative savings that are possible over the next fifteen years of NASA missions.
ERIC Educational Resources Information Center
Elkins, Kelly M.; Kadunc, Raelynn E.
2012-01-01
In this laboratory experiment, real-time polymerase chain reaction (real-time PCR) was conducted using published human TPOX single-locus DNA primers for validation and various student-designed short tandem repeat (STR) primers for Combined DNA Index System (CODIS) loci. SYBR Green was used to detect the amplification of the expected amplicons. The…
Emmons, Karen M; Doubeni, Chyke A; Fernandez, Maria E; Miglioretti, Diana L; Samet, Jonathan M
2018-06-05
On 5 and 6 December 2017, the National Institutes of Health (NIH) convened the Pathways to Prevention Workshop: Methods for Evaluating Natural Experiments in Obesity to identify the status of methods for assessing natural experiments to reduce obesity, areas in which these methods could be improved, and research needs for advancing the field. This article considers findings from a systematic evidence review on methods for evaluating natural experiments in obesity, workshop presentations by experts and stakeholders, and public comment. Research gaps are identified, and recommendations related to 4 key issues are provided. Recommendations on population-based data sources and data integration include maximizing use and sharing of existing surveillance and research databases and ensuring significant effort to integrate and link databases. Recommendations on measurement include use of standardized and validated measures of obesity-related outcomes and exposures, systematic measurement of co-benefits and unintended consequences, and expanded use of validated technologies for measurement. Study design recommendations include improving guidance, documentation, and communication about methods used; increasing use of designs that minimize bias in natural experiments; and more carefully selecting control groups. Cross-cutting recommendations target activities that the NIH and other funders might undertake to improve the rigor of natural experiments in obesity, including training and collaboration on modeling and causal inference, promoting the importance of community engagement in the conduct of natural experiments, ensuring maintenance of relevant surveillance systems, and supporting extended follow-up assessments for exemplar natural experiments. To combat the significant public health threat posed by obesity, researchers should continue to take advantage of natural experiments. The recommendations in this report aim to strengthen evidence from such studies.
Kreutz, Jason E; Munson, Todd; Huynh, Toan; Shen, Feng; Du, Wenbin; Ismagilov, Rustem F
2011-11-01
This paper presents a protocol using theoretical methods and free software to design and analyze multivolume digital PCR (MV digital PCR) devices; the theory and software are also applicable to design and analysis of dilution series in digital PCR. MV digital PCR minimizes the total number of wells required for "digital" (single molecule) measurements while maintaining high dynamic range and high resolution. In some examples, multivolume designs with fewer than 200 total wells are predicted to provide dynamic range with 5-fold resolution similar to that of single-volume designs requiring 12,000 wells. Mathematical techniques were utilized and expanded to maximize the information obtained from each experiment and to quantify performance of devices and were experimentally validated using the SlipChip platform. MV digital PCR was demonstrated to perform reliably, and results from wells of different volumes agreed with one another. No artifacts due to different surface-to-volume ratios were observed, and single molecule amplification in volumes ranging from 1 to 125 nL was self-consistent. The device presented here was designed to meet the testing requirements for measuring clinically relevant levels of HIV viral load at the point-of-care (in plasma, <500 molecules/mL to >1,000,000 molecules/mL), and the predicted resolution and dynamic range was experimentally validated using a control sequence of DNA. This approach simplifies digital PCR experiments, saves space, and thus enables multiplexing using separate areas for each sample on one chip, and facilitates the development of new high-performance diagnostic tools for resource-limited applications. The theory and software presented here are general and are applicable to designing and analyzing other digital analytical platforms including digital immunoassays and digital bacterial analysis. 
It is not limited to SlipChip and could also be useful for the design of systems on platforms including valve-based and droplet-based platforms. In a separate publication by Shen et al. (J. Am. Chem. Soc., 2011, DOI: 10.1021/ja2060116), this approach is used to design and test digital RT-PCR devices for quantifying RNA.
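The multivolume analysis described above rests on Poisson statistics: a well of volume v is positive with probability 1 - exp(-c·v), and wells of different volumes are combined in a single likelihood for the concentration c. A minimal sketch of that idea (not the authors' software; the well counts below are hypothetical):

```python
import math

def loglik(conc, wells):
    """Log-likelihood of a concentration (molecules/nL) given digital PCR data.

    wells: list of (volume_nL, n_wells, n_positive) tuples. Under Poisson
    loading, P(well positive) = 1 - exp(-conc * volume).
    """
    ll = 0.0
    for v, n, k in wells:
        p = 1.0 - math.exp(-conc * v)
        ll += k * math.log(p) + (n - k) * (-conc * v)
    return ll

def estimate_conc(wells, lo=1e-9, hi=100.0, iters=200):
    """Maximum-likelihood concentration via ternary search (loglik is unimodal)."""
    for _ in range(iters):
        m1 = lo + (hi - lo) / 3.0
        m2 = hi - (hi - lo) / 3.0
        if loglik(m1, wells) < loglik(m2, wells):
            lo = m1
        else:
            hi = m2
    return 0.5 * (lo + hi)

# Hypothetical counts on a multivolume device spanning 1 to 125 nL;
# illustrative only, not data from the paper.
wells = [(1.0, 160, 3), (5.0, 160, 15), (25.0, 160, 63), (125.0, 160, 147)]
print(estimate_conc(wells))   # concentration estimate in molecules per nL
```

For a single volume this reduces to the familiar estimate c = -ln(1 - k/n)/v; combining volumes is what lets a small number of wells cover a wide dynamic range.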
Corrosivity Sensor for Exposed Pipelines Based on Wireless Energy Transfer
Lawand, Lydia; Shiryayev, Oleg; Al Handawi, Khalil; Vahdati, Nader; Rostron, Paul
2017-01-01
External corrosion has been identified as one of the main causes of pipeline failures worldwide. A solution that addresses the issue of detecting and quantifying the corrosivity of the environment, for application to existing exposed pipelines, has been developed. It consists of a sensing array made of an assembly of thin strips of pipeline steel and a circuit that provides a visual sensor reading to the operator. The proposed sensor is passive and does not require a constant power supply. The circuit design was validated through simulations and lab experiments, and an accelerated corrosion experiment was conducted to confirm the feasibility of the proposed corrosivity sensor design. PMID:28556805
DOE Office of Scientific and Technical Information (OSTI.GOV)
Atamturktur, Sez; Unal, Cetin; Hemez, Francois
The project proposed to provide a Predictive Maturity Framework with its companion metrics that (1) introduce a formalized, quantitative means to communicate information between interested parties, (2) provide scientifically dependable means to claim completion of Validation and Uncertainty Quantification (VU) activities, and (3) guide decision makers in the allocation of Nuclear Energy's resources for code development and physical experiments. The project team proposed to develop this framework based on two complementary criteria: (1) the extent of experimental evidence available for the calibration of simulation models and (2) the sophistication of the physics incorporated in simulation models. The proposed framework is capable of quantifying the interaction between the required number of physical experiments and the degree of physics sophistication. The project team has developed this framework and implemented it with a multi-scale model for simulating creep of reactor core cladding. The multi-scale model is composed of the viscoplastic self-consistent (VPSC) code at the meso-scale, which represents the visco-plastic behavior and changing properties of a highly anisotropic material, and a Finite Element (FE) code at the macro-scale to represent the elastic behavior and apply the loading. The framework developed takes advantage of the transparency provided by partitioned analysis, where independent constituent codes are coupled in an iterative manner. This transparency allows model developers to better understand and remedy the sources of biases and uncertainties, whether they stem from the constituents or the coupling interface, by exploiting separate-effect experiments conducted within the constituent domain and integral-effect experiments conducted within the full-system domain. The project team has implemented this procedure with the multi-scale VPSC-FE model and demonstrated its ability to improve the predictive capability of the model.
Within this framework, the project team has focused on optimizing resource allocation for improving numerical models through further code development and experimentation. Related to further code development, we have developed a code prioritization index (CPI) for coupled numerical models; the CPI is used to effectively improve the predictive capability of the coupled model by increasing the sophistication of the constituent codes. In relation to designing new experiments, we investigated the information gained by the addition of each new experiment used for calibration and bias correction of a simulation model. Additionally, the variability of information gain across the design domain was investigated in order to identify the experiment settings where maximum information gain occurs, thus guiding experimenters in the selection of experiment settings. This idea was extended to evaluate how the information gain from each experiment can be improved by intelligently selecting the experiments, leading to the development of the Batch Sequential Design (BSD) technique. We also evaluated the importance of sufficiently exploring the domain of applicability in experiment-based validation of high-consequence modeling and simulation by developing a new metric to quantify coverage; this metric has also been incorporated into the design of new experiments. Finally, we have proposed a data-aware calibration approach for the calibration of numerical models. This new method considers the complexity of a numerical model (the number of parameters to be calibrated, parameter uncertainty, and the form of the model) and seeks to identify the number of experiments necessary to calibrate the model based on the level of sophistication of the physics. The final component in the project team's work to improve model calibration and validation methods is the incorporation of robustness to non-probabilistic uncertainty in the input parameters.
This is an improvement to model validation and uncertainty quantification extending beyond the originally proposed scope of the project. We have introduced a new metric for incorporating the concept of robustness into experiment-based validation of numerical models. This project has supported the graduation of two Ph.D. students (Kendra Van Buren and Josh Hegenderfer) and two M.S. students (Matthew Egeberg and Parker Shields). One of the doctoral students is now working in the nuclear engineering field, and the other is a post-doctoral fellow at Los Alamos National Laboratory. Additionally, two more Ph.D. students (Garrison Stevens and Tunc Kulaksiz) who are working towards graduation have been supported by this project.
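For a flavor of the information-gain idea used to rank candidate experiment settings, here is a toy Gaussian sketch (not the project's actual metric): when measuring an unknown mean with noise variance sn^2 under a prior variance s0^2, the expected information gain is 0.5*ln(1 + s0^2/sn^2) nats, so less noisy settings are preferred:

```python
import math

def expected_info_gain(prior_var, noise_var):
    """Expected KL divergence (nats) from prior to posterior for one Gaussian
    measurement of an unknown mean: 0.5 * ln(1 + prior_var / noise_var)."""
    return 0.5 * math.log(1.0 + prior_var / noise_var)

# Hypothetical candidate experiment settings, each with a different
# measurement-noise variance (e.g., harder operating points are noisier).
candidates = {"setting_A": 0.5, "setting_B": 0.1, "setting_C": 2.0}
prior_var = 1.0
best = max(candidates, key=lambda s: expected_info_gain(prior_var, candidates[s]))
print(best)   # setting_B: the least-noisy setting is the most informative
```

Sequentially picking the best remaining candidate and updating the prior is the spirit of batch/sequential design, though the project's metrics also weigh coverage of the design domain.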
Optimal design of a thermally stable composite optical bench
NASA Technical Reports Server (NTRS)
Gray, C. E., Jr.
1985-01-01
The Lidar Atmospheric Sensing Experiment will be performed aboard an ER-2 aircraft; the lidar system used will be mounted on a lightweight, thermally stable graphite/epoxy optical bench whose design is presently subjected to analytical study and experimental validation. Attention is given to analytical methods for the selection of such expected laminate properties as the thermal expansion coefficient, the apparent in-plane moduli, and ultimate strength. For a symmetric laminate in which one of the lamina angles remains variable, an optimal lamina angle is selected to produce a design laminate with a near-zero coefficient of thermal expansion. Finite elements are used to model the structural concept of the design, with a view to the optical bench's thermal structural response as well as the determination of the degree of success in meeting the experiment's alignment tolerances.
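The near-zero-CTE ply-angle selection described above can be sketched with classical lamination theory: assemble the in-plane stiffness matrix A and the thermal force resultant of a symmetric laminate, recover the effective CTEs, and scan the free ply angle. The material properties below are typical graphite/epoxy values, not those of the study:

```python
import numpy as np

# Assumed graphite/epoxy lamina properties (illustrative values only).
E1, E2, G12, nu12 = 140e9, 10e9, 5e9, 0.30   # Pa
a1, a2 = -0.3e-6, 28e-6                      # 1/K: fiber and transverse CTEs

nu21 = nu12 * E2 / E1
d = 1.0 - nu12 * nu21
Q = np.array([[E1 / d, nu12 * E2 / d, 0.0],
              [nu12 * E2 / d, E2 / d, 0.0],
              [0.0, 0.0, G12]])

def qbar_and_alpha(theta):
    """Transformed stiffness Qbar and CTE vector for a ply at angle theta (rad)."""
    c, s = np.cos(theta), np.sin(theta)
    Qb = np.empty((3, 3))
    Qb[0, 0] = Q[0, 0]*c**4 + 2*(Q[0, 1] + 2*Q[2, 2])*s*s*c*c + Q[1, 1]*s**4
    Qb[1, 1] = Q[0, 0]*s**4 + 2*(Q[0, 1] + 2*Q[2, 2])*s*s*c*c + Q[1, 1]*c**4
    Qb[0, 1] = Qb[1, 0] = (Q[0, 0] + Q[1, 1] - 4*Q[2, 2])*s*s*c*c + Q[0, 1]*(s**4 + c**4)
    Qb[2, 2] = (Q[0, 0] + Q[1, 1] - 2*Q[0, 1] - 2*Q[2, 2])*s*s*c*c + Q[2, 2]*(s**4 + c**4)
    Qb[0, 2] = Qb[2, 0] = (Q[0, 0] - Q[0, 1] - 2*Q[2, 2])*s*c**3 + (Q[0, 1] - Q[1, 1] + 2*Q[2, 2])*s**3*c
    Qb[1, 2] = Qb[2, 1] = (Q[0, 0] - Q[0, 1] - 2*Q[2, 2])*s**3*c + (Q[0, 1] - Q[1, 1] + 2*Q[2, 2])*s*c**3
    alpha = np.array([a1*c*c + a2*s*s, a1*s*s + a2*c*c, 2*(a1 - a2)*s*c])
    return Qb, alpha

def laminate_cte(angles_deg):
    """Effective in-plane CTEs [ax, ay, axy] for equal-thickness plies.

    A symmetric laminate's second half scales A and the thermal resultant
    equally, so listing one half of the stack is sufficient.
    """
    A = np.zeros((3, 3))
    NT = np.zeros(3)
    for ang in angles_deg:
        Qb, al = qbar_and_alpha(np.radians(ang))
        A += Qb                 # unit ply thickness
        NT += Qb @ al           # thermal force resultant per unit delta-T
    return np.linalg.solve(A, NT)

# Scan the variable ply angle t in a symmetric [0/+t/-t]s layup for the
# angle giving the axial CTE closest to zero.
best = min(range(0, 91), key=lambda t: abs(laminate_cte([0, t, -t])[0]))
print(best, laminate_cte([0, best, -best])[0])
```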
NASA Astrophysics Data System (ADS)
Pezzella, Giuseppe; Richiello, Camillo; Russo, Gennaro
2011-05-01
This paper deals with the aerodynamic and aerothermodynamic trade-off analysis carried out with the aim of designing a hypersonic flying test bed (FTB), namely USV3. The vehicle will be launched with a small expendable launcher and will re-enter the Earth's atmosphere, allowing several experiments on critical re-entry phenomena to be performed. The demonstrator under study is a re-entry space glider characterized by a relatively simple vehicle architecture, able to validate the hypersonic aerothermodynamic design database and passenger experiments, including the thermal shield and hot structures. A summary review of the aerodynamic characteristics of two FTB concepts, compliant with a phase-A design level, is provided. Several design results, based both on an engineering approach and on computational fluid dynamics, are reported and discussed in the paper.
Development of chemistry attitudes and experiences questionnaire (CAEQ)
NASA Astrophysics Data System (ADS)
Dalgety, Jacinta; Coll, Richard K.; Jones, Alister
2003-09-01
In this article we describe the development of the Chemistry Attitudes and Experiences Questionnaire (CAEQ), which measures first-year university chemistry students' attitude toward chemistry, chemistry self-efficacy, and learning experiences. The instrument was developed as part of a larger study and sought to fulfill a need for an instrument to investigate factors that influence student enrollment choice. We set out to design the instrument in a manner that would maximize construct validity. The CAEQ was piloted with a cohort of science and technology students (n = 129) at the end of their first year. Based on statistical analysis the instrument was modified and subsequently administered on two occasions at two tertiary institutions (n = 669). Statistical data, along with additional data gathered from interviews, suggest that the CAEQ possesses good construct validity and will prove a useful tool for tertiary-level educators who wish to gain an understanding of factors that influence student choice of chemistry enrollment.
NASA Technical Reports Server (NTRS)
Komendera, Erik E.; Dorsey, John T.
2017-01-01
Developing a capability for the assembly of large space structures has the potential to increase the capabilities and performance of future space missions and spacecraft while reducing their cost. One such application is a megawatt-class solar electric propulsion (SEP) tug, representing a critical transportation ability for the NASA lunar, Mars, and solar system exploration missions. A series of robotic assembly experiments were recently completed at Langley Research Center (LaRC) that demonstrate most of the assembly steps for the SEP tug concept. The assembly experiments used a core set of robotic capabilities: long-reach manipulation and dexterous manipulation. This paper describes cross-cutting capabilities and technologies for in-space assembly (ISA), applies the ISA approach to a SEP tug, describes the design and development of two assembly demonstration concepts, and summarizes results of two sets of assembly experiments that validate the SEP tug assembly steps.
CFD Modeling of Free-Piston Stirling Engines
NASA Technical Reports Server (NTRS)
Ibrahim, Mounir B.; Zhang, Zhi-Guo; Tew, Roy C., Jr.; Gedeon, David; Simon, Terrence W.
2001-01-01
NASA Glenn Research Center (GRC) is funding Cleveland State University (CSU) to develop a reliable Computational Fluid Dynamics (CFD) code that can predict engine performance with the goal of significant improvements in accuracy when compared to one-dimensional (1-D) design code predictions. The funding also includes conducting code validation experiments at both the University of Minnesota (UMN) and CSU. In this paper a brief description of the work-in-progress is provided in the two areas (CFD and Experiments). Also, previous test results are compared with computational data obtained using (1) a 2-D CFD code obtained from Dr. Georg Scheuerer and further developed at CSU and (2) a multidimensional commercial code CFD-ACE+. The test data and computational results are for (1) a gas spring and (2) a single piston/cylinder with attached annular heat exchanger. The comparisons among the codes are discussed. The paper also discusses plans for conducting code validation experiments at CSU and UMN.
Phase I Development of Neutral Beam Injector Solid-State Power System
NASA Astrophysics Data System (ADS)
Prager, James; Ziemba, Timothy; Miller, Kenneth E.; Slobodov, Ilia; Anderson, Seth
2017-10-01
Neutral beam injection (NBI) is an important tool for plasma heating, current drive, and diagnostics at fusion science experiments around the United States, including tokamaks, validation platform experiments, and privately funded fusion concepts. Currently, there are no vendors in the United States for NBI power systems. Eagle Harbor Technologies (EHT), Inc. is developing a new power system for NBI that takes advantage of the latest developments in solid-state switching. EHT has developed a resonant converter that can be scaled to the power levels required for NBI at small-scale validation platform experiments like the Lithium Tokamak Experiment. This power system can be used to modulate the NBI voltages over the course of a plasma shot, which can lead to improved control over the plasma. EHT will present the initial modeling used to design this system as well as experimental data showing operation at 15 kV and 40 A for 10 ms into a test load. This work is supported by a DOE SBIR grant.
Application of Micro-ramp Flow Control Devices to an Oblique Shock Interaction
NASA Technical Reports Server (NTRS)
Hirt, Stefanie; Anderson, Bernhard
2007-01-01
Tests are planned in the 15 cm x 15 cm supersonic wind tunnel at NASA Glenn to demonstrate the applicability of micro-ramp flow control to the management of shock wave boundary layer interactions. These tests will be used as a database for computational fluid dynamics (CFD) validation and Design of Experiments (DoE) design information. Micro-ramps show potential for mechanically simple and fail-safe boundary layer control.
Development of a Parachute System for Deceleration of Flying Vehicles in Supersonic Regimes
NASA Astrophysics Data System (ADS)
Pilyugin, N. N.; Khlebnikov, V. S.
2010-09-01
Aerodynamic problems arising during design and development of braking systems for re-entry vehicles are analyzed. Aerodynamic phenomena and laws valid in a supersonic flow around a pair of bodies having different shapes are studied. Results of this research can be used in solving application problems (arrangement and optimization of experiments; design and development of various braking systems for re-entry vehicles moving with supersonic speeds in the atmosphere).
Livingstone Model-Based Diagnosis of Earth Observing One Infusion Experiment
NASA Technical Reports Server (NTRS)
Hayden, Sandra C.; Sweet, Adam J.; Christa, Scott E.
2004-01-01
The Earth Observing One satellite, launched in November 2000, is an active earth science observation platform. This paper reports on the progress of an infusion experiment in which the Livingstone 2 Model-Based Diagnostic engine is deployed on Earth Observing One, demonstrating the capability to monitor the nominal operation of the spacecraft under command of an on-board planner, and demonstrating on-board diagnosis of spacecraft failures. Design and development of the experiment, specification and validation of diagnostic scenarios, characterization of performance results and benefits of the model- based approach are presented.
Observing System Simulation Experiments for Fun and Profit
NASA Technical Reports Server (NTRS)
Prive, Nikki C.
2015-01-01
Observing System Simulation Experiments can be powerful tools for evaluating and exploring both the behavior of data assimilation systems and the potential impacts of future observing systems. With great power comes great responsibility - given a pure modeling framework, how can we be sure our results are meaningful? The challenges and pitfalls of OSSE calibration and validation will be addressed, as well as issues of incestuousness, selection of appropriate metrics, and experiment design. The use of idealized observational networks to investigate theoretical ideas in a fully complex modeling framework will also be discussed.
Educated Guesses and Other Ways to Address the Pharmacological Uncertainty of Designer Drugs
Berning, Moritz
2016-01-01
This study examines how experimentation with designer drugs is mediated by the Internet. We selected a popular drug forum that presents reports on self-experimentation with little or even completely unexplored designer drugs to examine: (1) how participants report their “trying out” of new compounds and (2) how participants reduce the pharmacological uncertainty associated with using these substances. Our methods included passive observation online, engaging more actively with the online community using an avatar, and off-line interviews with key interlocutors to validate our online findings. This article reflects on how forum participants experiment with designer drugs, their trust in suppliers and the testimonials of others, the use of ethno-scientific techniques that involve numerical weighing, “allergy dosing,” and the use of standardized trip reports. We suggest that these techniques contribute to a sense of control in the face of the possible toxicity of unknown or little-known designer drugs. The online reporting of effects allows users to experience not only the thrill of a new kind of high but also connection with others in the self-experimenting drug community. PMID:27721526
Asati, Ankita; Satyanarayana, G N V; Patel, Devendra K
2017-04-01
An efficient and inexpensive method using vortex-assisted surfactant-enhanced emulsification microextraction (VASEME) based on solidification of a floating organic droplet, coupled with ultraperformance liquid chromatography-tandem mass spectrometry, is proposed for the analysis of glucocorticoids in water samples (river water and hospital wastewater). VASEME was optimized using a Plackett-Burman design and a central composite design, both validated experimentally. The Plackett-Burman design showed that factors such as vortex time, surfactant concentration, and pH significantly affect the extraction efficiency of the method. Method validation was characterized by an acceptable calibration range of 1-1000 ng L⁻¹, and the limit of detection was in the range of 2.20 to 8.12 ng L⁻¹ for glucocorticoids. The proposed method was applied to determine glucocorticoids in river water and hospital wastewater in Lucknow, India. It is reliable and rapid and has potential application for the analysis of glucocorticoids in environmental aqueous samples. Graphical Abstract: Low-density-based extraction of glucocorticoids using design of experiments.
Energy Conservation for Low-Income Households: The Evaporative Cooler Experience.
ERIC Educational Resources Information Center
Ridge, Richard S.
1988-01-01
An econometric analysis, using a research design based on the nonequivalent control group (NECG), assessed the effectiveness of a program offering free evaporative coolers to low-income families owning air conditioners. The NECG controls for serious threats to internal validity, except for self-selection. The program successfully reduced energy…
A Comparison of Conjoint Analysis Response Formats
Kevin J. Boyle; Thomas P. Holmes; Mario F. Teisl; Brian Roe
2001-01-01
A split-sample design is used to evaluate the convergent validity of three response formats used in conjoint analysis experiments. We investigate whether recoding rating data to ranking and choose-one formats, and recoding ranking data to choose-one, results in structural models and welfare estimates that are statistically indistinguishable from...
Optical simulations for experimental networks: lessons from MONET
NASA Astrophysics Data System (ADS)
Richards, Dwight H.; Jackel, Janet L.; Goodman, Matthew S.; Roudas, Ioannis; Wagner, Richard E.; Antoniades, Neophytos
1999-08-01
We have used optical simulations as a means of setting component requirements, assessing component compatibility, and designing experiments in the MONET (Multiwavelength Optical Networking) Project. This paper reviews the simulation method, gives some examples of the types of simulations that have been performed, and discusses the validation of the simulations.
DOT National Transportation Integrated Search
2012-04-01
The three studies included in the current report examine the transition from an infrastructure-based rural : intersection crossing assist system to one located inside a vehicle. The primary goals of the first study, conducted in : a simulator, were t...
ERIC Educational Resources Information Center
Caprara, Gian Vittorio; Alessandri, Guido; Eisenberg, Nancy; Kupfer, A.; Steca, Patrizia; Caprara, Maria Giovanna; Yamaguchi, Susumu; Fukuzawa, Ai; Abela, John
2012-01-01
Five studies document the validity of a new 8-item scale designed to measure "positivity," defined as the tendency to view life and experiences with a positive outlook. In the first study (N = 372), the psychometric properties of Positivity Scale (P Scale) were examined in accordance with classical test theory using a large number of…
Challenges and Innovations in a Community-Based Participatory Randomized Controlled Trial
ERIC Educational Resources Information Center
Goodkind, Jessica R.; Amer, Suha; Christian, Charlisa; Hess, Julia Meredith; Bybee, Deborah; Isakson, Brian L.; Baca, Brandon; Ndayisenga, Martin; Greene, R. Neil; Shantzek, Cece
2017-01-01
Randomized controlled trials (RCTs) are a long-standing and important design for conducting rigorous tests of the effectiveness of health interventions. However, many questions have been raised about the external validity of RCTs, their utility in explicating mechanisms of intervention and participants' intervention experiences, and their…
DOT National Transportation Integrated Search
2008-10-01
Two experiments (simulator and test track) were conducted to validate the concept of a system designed to warn potential victims of a likely red-light violator. The warning system uses sensors to detect vehicles that are unlikely to stop at red traff...
The Cost of Action Miscues: Hemispheric Asymmetries
ERIC Educational Resources Information Center
Shenal, Brian V.; Hinze, Stephan; Heilman, Kenneth M.
2012-01-01
Adaptive behaviors require preparation and when necessary inhibition or alteration of actions. The right hemisphere has been posited to be dominant for preparatory motor activation. This experiment was designed to learn if there are hemispheric asymmetries in the control of altered plans of actions. Cues, both valid and invalid, which indicate the…
Proceedings of Tenth Annual Software Engineering Workshop
NASA Technical Reports Server (NTRS)
1985-01-01
Papers are presented on the following topics: measurement of software technology, recent studies of the Software Engineering Lab, software management tools, expert systems, error seeding as a program validation technique, software quality assurance, software engineering environments (including knowledge-based environments), the Distributed Computing Design System, and various Ada experiments.
Overall uncertainty measurement for near infrared analysis of cryptotanshinone in tanshinone extract
NASA Astrophysics Data System (ADS)
Xue, Zhong; Xu, Bing; Shi, Xinyuan; Yang, Chan; Cui, Xianglong; Luo, Gan; Qiao, Yanjiang
2017-01-01
This study presented a new strategy of overall uncertainty measurement for near infrared (NIR) quantitative analysis of cryptotanshinone in tanshinone extract powders. The overall uncertainty of the NIR analysis was fully investigated and discussed using validation data from precision, trueness and robustness studies. Quality by design (QbD) elements, such as risk assessment and design of experiments (DOE), were utilized to organize the validation data. An "I × J × K" (series I, number of repetitions J and concentration levels K) full factorial design was used to calculate uncertainty from the precision and trueness data, and a 2^(7-4) Plackett-Burman matrix with four influence factors identified by failure mode and effect analysis (FMEA) was adopted for the robustness study. The overall uncertainty profile was introduced as a graphical decision-making tool to evaluate the validity of the NIR method over the predefined concentration range. In comparison with T. Saffaj's method (Analyst, 2013, 138, 4677) for overall uncertainty assessment, the proposed approach gave almost the same results, demonstrating that the proposed method is reasonable and valid. Moreover, the proposed method can help identify critical factors that influence the NIR prediction performance, which could be used for further optimization of the NIR analytical procedures in routine use.
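The robustness screening above rests on a two-level fractional factorial layout. As a hedged illustration (the study's actual factors and levels are not reproduced here), a 2^(7-4) resolution-III design, i.e. 7 two-level factors in 8 runs, can be generated from a full 2^3 factorial with interaction generators:

```python
from itertools import product

def fractional_factorial_2_7_4():
    """Build a 2^(7-4) resolution-III screening design: 7 two-level
    factors in 8 runs. Base factors A, B, C form a full 2^3 factorial;
    D, E, F, G are aliased with interactions (D=AB, E=AC, F=BC, G=ABC)."""
    runs = []
    for a, b, c in product((-1, 1), repeat=3):
        runs.append((a, b, c, a * b, a * c, b * c, a * b * c))
    return runs

design = fractional_factorial_2_7_4()
```

Every column is balanced and pairwise orthogonal, which is what makes screening seven factors in only eight runs possible, at the cost of aliasing main effects with two-factor interactions.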
Validation results of satellite mock-up capturing experiment using nets
NASA Astrophysics Data System (ADS)
Medina, Alberto; Cercós, Lorenzo; Stefanescu, Raluca M.; Benvenuto, Riccardo; Pesce, Vincenzo; Marcon, Marco; Lavagna, Michèle; González, Iván; Rodríguez López, Nuria; Wormnes, Kjetil
2017-05-01
The PATENDER activity (Net parametric characterization and parabolic flight), funded by the European Space Agency (ESA) via its Clean Space initiative, aimed to validate a simulation tool for designing nets for capturing space debris. This validation has been performed through a set of experiments under microgravity conditions in which a net was launched, capturing and wrapping a satellite mock-up. This paper presents the architecture of the thrown-net dynamics simulator together with the set-up of the deployment experiment and its trajectory reconstruction results on a parabolic flight (Novespace A-310, June 2015). The simulator has been implemented within the Blender framework in order to provide a highly configurable tool, able to reproduce different scenarios for Active Debris Removal missions. The experiment was performed over thirty parabolas offering around 22 s of zero-g conditions. A flexible meshed fabric structure (the net), ejected from a container and propelled by corner masses (the bullets) arranged around its circumference, was launched at different initial velocities and launching angles using a dedicated pneumatic mechanism (representing the chaser satellite) against a target mock-up (the target satellite). High-speed motion cameras recorded the experiment, allowing 3D reconstruction of the net motion. The net knots were coloured to allow post-processing of the images using colour segmentation, stereo matching and iterative closest point (ICP) for knot tracking. The final objective of the activity was the validation of the net deployment and wrapping simulator using images recorded during the parabolic flight. The high-resolution images acquired have been post-processed to determine the initial conditions accurately and to generate the reference data (position and velocity of all knots of the net along its deployment and wrapping of the target mock-up) for the simulator validation.
The simulator has been properly configured according to the parabolic flight scenario, and executed in order to generate the validation data. Both datasets have been compared according to different metrics in order to perform the validation of the PATENDER simulator.
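The knot tracking mentioned above hinges on rigid point-set registration, the inner step of each ICP iteration. The sketch below shows only that step, a Kabsch least-squares fit on synthetic data; the point values and the noise-free matching are illustrative assumptions, not PATENDER data:

```python
import numpy as np

def best_fit_transform(src, dst):
    """Least-squares rigid transform (R, t) mapping src onto dst,
    via SVD of the cross-covariance (Kabsch algorithm)."""
    mu_s, mu_d = src.mean(axis=0), dst.mean(axis=0)
    H = (src - mu_s).T @ (dst - mu_d)       # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:                # guard against reflections
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = mu_d - R @ mu_s
    return R, t

# Synthetic check: rotate and shift a small cloud, then recover the motion.
rng = np.random.default_rng(0)
src = rng.normal(size=(20, 3))
theta = 0.3
Rz = np.array([[np.cos(theta), -np.sin(theta), 0.0],
               [np.sin(theta),  np.cos(theta), 0.0],
               [0.0, 0.0, 1.0]])
dst = src @ Rz.T + np.array([0.1, -0.2, 0.5])
R, t = best_fit_transform(src, dst)
```

A full ICP loop would alternate this fit with nearest-neighbour re-matching of knots between frames until the residual stops improving.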
NASA Astrophysics Data System (ADS)
Jiao, J.; Trautz, A.; Zhang, Y.; Illangasekera, T.
2017-12-01
Subsurface flow and transport characterization under data-sparse conditions is addressed by a new and computationally efficient inverse theory that simultaneously estimates parameters, state variables, and boundary conditions. Uncertainty in static data can be accounted for, while the parameter structure can be complex due to process uncertainty. The approach has been successfully extended to inverting transient and unsaturated flows as well as to contaminant source identification under unknown initial and boundary conditions. In one example, by sampling numerical experiments simulating two-dimensional steady-state flow in which a tracer migrates, a sequential inversion scheme first estimates the flow field and permeability structure before the evolution of the tracer plume and dispersivities are jointly estimated. Compared to traditional inversion techniques, the theory does not use forward simulations to assess model-data misfits; thus, knowledge of the difficult-to-determine site boundary condition is not required. To test the general applicability of the theory, data generated during high-precision intermediate-scale experiments (i.e., at a scale intermediate between the field and column scales) in large synthetic aquifers can be used. The design of such experiments is not trivial, as laboratory conditions have to be selected to mimic natural systems in order to provide useful data, thus requiring a variety of sensors and data collection strategies. This paper presents the design of such an experiment in a synthetic, multi-layered aquifer with dimensions of 242.7 x 119.3 x 7.7 cm. Different experimental scenarios that will generate data to validate the theory are presented.
A business rules design framework for a pharmaceutical validation and alert system.
Boussadi, A; Bousquet, C; Sabatier, B; Caruba, T; Durieux, P; Degoulet, P
2011-01-01
Several alert systems have been developed to improve the patient safety aspects of clinical information systems (CIS). Most studies have focused on the evaluation of these systems, with little information provided about the methodology leading to system implementation. We propose here an 'agile' business rule design framework (BRDF) supporting both the design of alerts for the validation of drug prescriptions and the incorporation of the end user into the design process. We analyzed the unified process (UP) design life cycle and defined the activities, subactivities, actors and UML artifacts that could be used to enhance the agility of the proposed framework. We then applied the proposed framework to two different sets of data in the context of the Georges Pompidou University Hospital (HEGP) CIS. We introduced two new subactivities into UP: business rule specification and business rule instantiation. The pharmacist made an effective contribution to five of the eight BRDF design activities. Validation of the two new subactivities was effected in the context of drug dosage adaptation to the patients' clinical and biological contexts. A pilot experiment shows that business rules modeled with BRDF and implemented as an alert system triggered an alert for 5824 of the 71,413 prescriptions considered (8.16%). A business rule design framework approach meets one of the strategic objectives for decision support design by taking into account three important criteria posing a particular challenge to system designers: 1) business processes, 2) knowledge modeling of the context of application, and 3) the agility of the various design steps.
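To make the business-rule idea concrete, here is a minimal, hypothetical sketch of a dosage-adaptation rule of the kind a framework like BRDF could specify; the drug name, record fields, and thresholds are invented for illustration and do not come from the HEGP system:

```python
# Hypothetical prescription-validation rule: alert when a renally
# cleared drug (here the made-up "drugX") is dosed above a threshold
# for a patient with reduced creatinine clearance.

def creatinine_dose_rule(prescription, patient):
    """Return an alert string when the rule fires, else None."""
    if (prescription["drug"] == "drugX"
            and patient["creatinine_clearance_ml_min"] < 30
            and prescription["daily_dose_mg"] > 500):
        return "ALERT: reduce drugX dose for impaired renal function"
    return None

alert = creatinine_dose_rule(
    {"drug": "drugX", "daily_dose_mg": 750},
    {"creatinine_clearance_ml_min": 25},
)
```

In a BRDF-style workflow, the pharmacist would author the rule specification and instantiate its thresholds against the hospital's clinical context; only the firing logic is sketched here.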
NASA Astrophysics Data System (ADS)
Popov, Boris A.
2013-02-01
The HARP and NA61/SHINE hadroproduction experiments, as well as their implications for neutrino physics, are discussed. HARP measurements have already been used for predictions of neutrino beams in the K2K and MiniBooNE/SciBooNE experiments and are also being used to improve atmospheric neutrino flux predictions and to help optimize neutrino factory and super-beam designs. First measurements released recently by the NA61/SHINE experiment are of significant importance for a precise prediction of the J-PARC neutrino beam used for the T2K experiment. Both the HARP and NA61/SHINE experiments also provide a large amount of input for the validation and tuning of hadron production models in Monte Carlo generators.
NASA Astrophysics Data System (ADS)
Betti, R.
2017-10-01
The 1-D campaign on OMEGA is aimed at validating a novel approach to designing cryogenic implosion experiments and providing valuable data to improve the accuracy of 1-D physics models. This new design methodology is being tested first on low-convergence, high-adiabat (α ≈ 6 to 7) implosions and will subsequently be applied to implosions with increasing convergence up to the level required for a hydro-equivalent demonstration of ignition. This design procedure assumes that the hydrodynamic codes used in implosion designs lack the necessary physics and that measurements of implosion properties are imperfect. It also assumes that while the measurements may have significant systematic errors, the shot-to-shot variations are small and that cryogenic implosion data are reproducible, as observed on OMEGA. One of the goals of the 1-D campaign is to find a mapping of the data to the code results and use the mapping relations to design future implosions. In the 1-D campaign, this predictive methodology was used to design eight implosions using a simple two-shock pulse design, leading to pre-shot predictions of yields within 5% and ion temperatures within 4% of the experimental values. These implosions have also produced the highest neutron yield, 10^14, in OMEGA cryogenic implosion experiments, with an areal density of 100 mg/cm^2. Furthermore, the results from this campaign have been used to test the validity of the 1-D physics models used in the radiation-hydrodynamics codes. This material is based upon work supported by the Department of Energy National Nuclear Security Administration under Award Number DE-NA0001944 and LLNL under Contract DE-AC52-07NA27344. * In collaboration with J.P. Knauer, V. Gopalaswamy, D. Patel, K.M. Woo, K.S. Anderson, A. Bose, A.R. Christopherson, V.Yu. Glebov, F.J. Marshall, S.P. Regan, P.B. Radha, C. Stoeckl, and E.M. Campbell.
Design and control of a vertical takeoff and landing fixed-wing unmanned aerial vehicle
NASA Astrophysics Data System (ADS)
Malang, Yasir
With the goal of extending the capabilities of multi-rotor unmanned aerial vehicles (UAVs) for wetland conservation missions, a novel hybrid aircraft design consisting of four tilting rotors and a fixed wing is designed and built. The tilting rotors and nonlinear aerodynamic effects introduce a control challenge for autonomous flight, and the research focus is to develop and validate an autonomous transition flight controller. The overall controller structure consists of separate cascaded Proportional Integral Derivative (PID) controllers whose gains are scheduled according to the rotors' tilt angle. A control mechanism effectiveness factor is used to mix the multi-rotor and fixed-wing control actuators during transition. A nonlinear flight dynamics model is created, and transition stability is shown through MATLAB simulations, indicating that gain-scheduled control is a good fit for tilt-rotor aircraft. Experiments carried out using the prototype UAV validate the simulation results for VTOL and tilted-rotor flight.
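The gain-scheduling strategy described above can be sketched in a few lines; the gain values and the convention that 90 degrees of tilt corresponds to hover are illustrative assumptions, not the thesis's actual tuning:

```python
def scheduled_gains(tilt_deg,
                    hover=(2.0, 0.5, 0.1),      # (Kp, Ki, Kd) at 90 deg tilt
                    forward=(0.8, 0.2, 0.05)):  # (Kp, Ki, Kd) at 0 deg tilt
    """Linearly interpolate PID gains between the forward-flight set
    (0 deg tilt) and the hover set (90 deg tilt), clipping the blend
    factor so out-of-range tilt angles saturate at either endpoint."""
    w = max(0.0, min(1.0, tilt_deg / 90.0))
    return tuple(f + w * (h - f) for h, f in zip(hover, forward))

kp, ki, kd = scheduled_gains(45.0)  # mid-transition gains
```

During transition the scheduled gains would feed the cascaded PID loops, while a separate effectiveness factor (not modeled here) blends rotor and aerodynamic-surface actuation.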
Design and Experimental Validation of a USBL Underwater Acoustic Positioning System.
Reis, Joel; Morgado, Marco; Batista, Pedro; Oliveira, Paulo; Silvestre, Carlos
2016-09-14
This paper presents the steps for developing a low-cost POrtableNavigation Tool for Underwater Scenarios (PONTUS) to be used as a localization device for subsea targets. PONTUS consists of an integrated ultra-short baseline acoustic positioning system aided by an inertial navigation system. Built on a practical design, it can be mounted on an underwater robotic vehicle or be operated by a scuba diver. It also features a graphical user interface that provides information on the tracking of the designated target, in addition to some details on the physical properties inside PONTUS. A full disclosure of the architecture of the tool is first presented, followed by thorough technical descriptions of the hardware components ensemble and the software development process. A series of experiments was carried out to validate the developed prototype, and the results are presented herein, which allow assessing its overall performance.
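For intuition about the ultra-short-baseline principle that PONTUS builds on, the toy computation below converts time-delay differences across two orthogonal hydrophone baselines into an azimuth toward the target. The baseline length, sound speed, and plane-wave geometry are illustrative assumptions, not PONTUS specifications:

```python
import math

def usbl_bearing(tau_x, tau_y, d, c=1500.0):
    """Azimuth (rad) of an incoming plane wave from the delay
    differences tau_x, tau_y (s) measured across two orthogonal
    hydrophone baselines of length d (m); c is the sound speed (m/s).
    c * tau / d gives the direction cosine along each baseline."""
    ux = c * tau_x / d
    uy = c * tau_y / d
    return math.atan2(uy, ux)

# Forward-simulate a target at 0.7 rad azimuth and recover it.
d, c, az_true = 0.1, 1500.0, 0.7
tau_x = d * math.cos(az_true) / c
tau_y = d * math.sin(az_true) / c
az = usbl_bearing(tau_x, tau_y, d, c)
```

A real USBL system would estimate the delays from carrier phase differences and fuse the resulting bearings with the inertial navigation solution; only the geometric core is shown.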
Design and Experimental Validation of a USBL Underwater Acoustic Positioning System
Reis, Joel; Morgado, Marco; Batista, Pedro; Oliveira, Paulo; Silvestre, Carlos
2016-01-01
This paper presents the steps for developing a low-cost POrtableNavigation Tool for Underwater Scenarios (PONTUS) to be used as a localization device for subsea targets. PONTUS consists of an integrated ultra-short baseline acoustic positioning system aided by an inertial navigation system. Built on a practical design, it can be mounted on an underwater robotic vehicle or be operated by a scuba diver. It also features a graphical user interface that provides information on the tracking of the designated target, in addition to some details on the physical properties inside PONTUS. A full disclosure of the architecture of the tool is first presented, followed by thorough technical descriptions of the hardware components ensemble and the software development process. A series of experiments was carried out to validate the developed prototype, and the results are presented herein, which allow assessing its overall performance. PMID:27649181
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hassan, Tasnim; Lissenden, Cliff; Carroll, Laura
The proposed research will develop systematic sets of uniaxial and multiaxial experimental data at very high temperatures (850-950°C) for Alloy 617. The loading histories to be prescribed in the experiments will induce creep-fatigue and creep-ratcheting failure mechanisms. These experimental responses will be scrutinized in order to quantify the influences of temperature and creep on fatigue and ratcheting failures. A unified constitutive model (UCM) will be developed and validated against these experimental responses. The improved UCM will be incorporated into the widely used commercial finite element software package ANSYS. The modified ANSYS will be validated so that it can be used for evaluating the very high temperature ASME-NH design-by-analysis methodology for Alloy 617, thereby addressing the ASME-NH design code issues.
Calibration of X-ray spectrometers for opacity experiments at the Orion laser facility (invited).
Bentley, C; Allan, P; Brent, K; Bruce, N; Hoarty, D; Meadowcroft, A; Percival, J; Opie, C
2016-11-01
Accurately calibrated and characterised x-ray diagnostics are a key requirement in the fielding of experiments on the Orion laser where absolute measurements of x-ray emission are used to underpin the validity of models of emissivity and opacity. Diffraction crystals are used in spectrometers on Orion to record the dispersed spectral features emitted by the laser produced plasma to obtain a measurement of the plasma conditions. The ability to undertake diffraction crystal calibrations supports the successful outcome of these Orion experiments. This paper details the design and commissioning of a system to undertake these calibrations in the energy range 2.0 keV to approximately 8.5 keV. Improvements to the design are detailed which will extend the commissioned range of energies to below 1 keV.
Clark, Robin A; Shoaib, Mohammed; Hewitt, Katherine N; Stanford, S Clare; Bate, Simon T
2012-08-01
InVivoStat is a free-to-use statistical software package for analysis of data generated from animal experiments. The package is designed specifically for researchers in the behavioural sciences, where exploiting the experimental design is crucial for reliable statistical analyses. This paper compares the analysis of three experiments conducted using InVivoStat with other widely used statistical packages: SPSS (V19), PRISM (V5), UniStat (V5.6) and Statistica (V9). We show that InVivoStat provides results that are similar to those from the other packages and, in some cases, are more advanced. This investigation provides evidence of further validation of InVivoStat and should strengthen users' confidence in this new software package.
TREAT Neutronics Analysis and Design Support, Part II: Multi-SERTTA-CAL
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bess, John D.; Woolstenhulme, Nicolas E.; Hill, Connie M.
2016-08-01
Experiment vehicle design is necessary in preparation for the Transient Reactor Test (TREAT) facility restart and the resumption of transient testing to support Accident Tolerant Fuel (ATF) characterization and other future fuels testing requirements. Currently the most mature vehicle design is the Multi-SERTTA (Static Environments Rodlet Transient Test Apparatuses), which can accommodate up to four concurrent rodlet-sized specimens under separate environmental conditions. Robust test vehicle design requires neutronics analyses to support design development, optimization of the power coupling factor (PCF) to efficiently maximize energy generation in the test fuel rodlets, and experiment safety analyses. An integral aspect of prior TREAT transient testing was the incorporation of calibration experiments to experimentally evaluate and validate test conditions in preparation for the actual fuel testing. The calibration experiment package established the test parameter conditions to support fine-tuning of the computational models to deliver the required energy deposition to the fuel samples. The calibration vehicle was designed to be as nearly neutronically equivalent to the experiment vehicle as possible to minimize errors between the calibration and final tests. The Multi-SERTTA-CAL vehicle was designed to serve as the calibration vehicle supporting Multi-SERTTA experimentation. Models of the Multi-SERTTA-CAL vehicle containing typical PWR-fuel rodlets were prepared, and neutronics calculations were performed using MCNP6.1 with ENDF/B-VII.1 nuclear data libraries; these results were then compared against those performed for Multi-SERTTA to determine the similarity and possible design modifications necessary prior to construction of these experiment vehicles.
The estimated reactivity insertion worth into the TREAT core is very similar between the two vehicle designs, with the primary physical difference being a hollow Inconel tube running down the length of the calibration vehicle. Calculations of the PCF indicate that on average there is a reduction of approximately 6.3 and 12.6%, respectively, for PWR fuel rodlets irradiated under wet and dry conditions. Changes to the primary or secondary vessel structure in the calibration vehicle can be performed to offset this discrepancy and maintain neutronic equivalency. Current possible modifications to the calibration vehicle include reduction of the primary vessel wall thickness, swapping Zircaloy-4 for stainless steel 316 in the secondary containment, or slight modification of the temperature and pressure of the water environment within the primary vessel. Removal of some of the instrumentation within the calibration vehicle can also serve to slightly increase the PCF. Future efforts include further modification and optimization of the Multi-SERTTA and Multi-SERTTA-CAL designs in preparation for actual TREAT transient testing. Experimental results from both test vehicles will be compared against calculational results and methods to provide validation and support additional neutronics analyses.
Validation of the CQU-DTU-LN1 series of airfoils
NASA Astrophysics Data System (ADS)
Shen, W. Z.; Zhu, W. J.; Fischer, A.; Garcia, N. R.; Cheng, J. T.; Chen, J.; Madsen, J.
2014-12-01
The CQU-DTU-LN1 series of airfoils was designed with the objectives of high lift and low noise emission. In the design process, the aerodynamic performance was obtained using XFOIL, while noise emission was obtained with the BPM model. In this paper we present validations of the designed CQU-DTU-LN118 airfoil using wind tunnel measurements in the acoustic wind tunnel at Virginia Tech and numerical computations with the in-house Q3uic and EllipSys 2D/3D codes. To show the superiority of the new airfoils, comparisons with a NACA64618 airfoil are made. For the aerodynamic features, the designed Cl and Cl/Cd agree well with the experiments and are in general higher than those of the NACA airfoil. For the acoustic features, the noise emission of the LN118 airfoil is compared with the acoustic measurements and with that of the NACA airfoil. The comparisons show that the BPM model correctly predicts the noise changes.
NASA Astrophysics Data System (ADS)
Johnson, Maike; Hübner, Stefan; Reichmann, Carsten; Schönberger, Manfred; Fiß, Michael
2017-06-01
Energy storage systems are a key technology for developing a more sustainable energy supply system and lowering overall CO2 emissions. Among the variety of storage technologies, high temperature phase change material (PCM) storage is a promising option with a wide range of applications. PCM storage units using an extended finned-tube storage concept have been designed and techno-economically optimized for solar thermal power plant operation. These finned-tube components were experimentally tested in order to validate the optimized design and the simulation models used. Analysis of the charging and discharging characteristics of the storage at the pilot scale gives insight into the heat distribution both axially and radially in the storage material, thereby allowing for a realistic validation of the design. The design was optimized for discharging of the storage, as this is the more critical operation mode in power plant applications. The data show good agreement between the model and the experiments for discharging.
Design of a Synthetic Aperture Array to Support Experiments in Active Control of Scattering
1990-06-01
becomes necessary to validate the theory and test the control system algorithms. While experiments in open water would be most like the anticipated...mathematical development of the beamforming algorithms used as well as an estimate of their applicability to the specifics of beamforming in a reverberant...Chebyshev array have been proposed. The method used in ARRAY, a nested product algorithm, proposed by Bresler [21] is recommended by Pozar [19] and
Blast Load Simulator Experiments for Computational Model Validation Report 3
2017-07-01
establish confidence in the results produced by the simulations. This report describes a set of replicate experiments in which a small, non-responding steel...designed to simulate blast waveforms for explosive yields up to 20,000 lb of TNT equivalent at a peak reflected pressure up to 80 psi and a peak...the pressure loading on a non-responding box-type structure at varying obliquities located in the flow of the BLS simulated blast environment for
Design of a low parasitic inductance SiC power module with double-sided cooling
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yang, Fei; Liang, Zhenxian; Wang, Fei
In this paper, a low-parasitic-inductance SiC power module with double-sided cooling is designed and compared with a baseline double-sided cooled module. With a unique 3D layout utilizing vertical interconnection, the power loop inductance is effectively reduced without sacrificing thermal performance. Both simulations and experiments are carried out to validate the design. Q3D simulation results show a power loop inductance of 1.63 nH, verified by the experiment, indicating more than a 60% reduction in power loop inductance compared with the baseline module. With 0 Ω external gate resistance turn-off at 600 V, the voltage overshoot is less than 9% of the bus voltage at a load of 44.6 A.
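The overshoot figure above follows from the basic inductive relation V_overshoot ≈ L_loop · di/dt. A minimal sketch of that arithmetic, using the abstract's 1.63 nH loop, 600 V bus, and 44.6 A load; the current fall time is an illustrative assumption, as the paper does not state it:

```python
# Sketch: inductive turn-off overshoot from parasitic loop inductance.
# Loop inductance, bus voltage, and load current come from the abstract;
# the 1.5 ns current fall time is an assumed illustrative value.

def overshoot_fraction(l_loop_h, di_a, dt_s, v_bus):
    """Return the inductive overshoot V = L * di/dt as a fraction of bus voltage."""
    v_over = l_loop_h * (di_a / dt_s)
    return v_over / v_bus

# 44.6 A switched off in an assumed 1.5 ns with a 1.63 nH loop at 600 V:
frac = overshoot_fraction(1.63e-9, 44.6, 1.5e-9, 600.0)
print(f"overshoot = {frac:.1%} of bus voltage")
```

With these assumed numbers the overshoot lands just under the 9% bound the abstract reports, which illustrates why halving the loop inductance directly relaxes the switching-speed constraint.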
Lessons Learned Designing and Using an Online Discussion Forum for Care Coordinators in Primary Care
Ferrante, Jeanne M.; Friedman, Asia; Shaw, Eric K.; Howard, Jenna; Cohen, Deborah J.; Shahidi, Laleh
2016-01-01
While an increasing number of researchers are using online discussion forums for qualitative research, few authors have documented their experiences and lessons learned to demonstrate this method’s viability and validity in health services research. We comprehensively describe our experiences, from start to finish, of designing and using an asynchronous online discussion forum for collecting and analyzing information elicited from care coordinators in Patient-Centered Medical Homes across the United States. Our lessons learned from each phase, including planning, designing, implementing, using, and ending this private online discussion forum, provide some recommendations for other health services researchers considering this method. An asynchronous online discussion forum is a feasible, efficient, and effective method to conduct a qualitative study, particularly when subjects are health professionals. PMID:26481942
Preliminary Results from the AFRL-NASA W/V-Band Terrestrial Link Experiment in Albuquerque, NM
NASA Technical Reports Server (NTRS)
Zemba, Michael; Nessel, James; Houts, Jacquelynne; Tarasenko, Nicholas; Lane, Steven; Murrell, David
2016-01-01
Atmospheric propagation models and the measurements that train them are critical to the design of efficient and effective space-ground links. As communication systems advance to higher frequencies in search of higher data rates and open spectrum, a lack of data at these frequencies necessitates new measurements to properly develop, validate, and refine the models used for link budgeting and system design. In collaboration with the Air Force Research Laboratory (AFRL), NASA Glenn Research Center has deployed the W/V-band Terrestrial Link Experiment (WTLE) in Albuquerque, NM to conduct a measurement campaign at 72 and 84 GHz, among the first atmospheric propagation measurements at these frequencies. WTLE has been operational since October 1, 2015; the system design is discussed herein alongside preliminary results and performance.
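For context on the link-budgeting role such measurements play, the deterministic part of a link budget at these bands starts from the standard free-space path loss term, FSPL(dB) = 20·log10(d_km) + 20·log10(f_GHz) + 92.45; the atmospheric models WTLE informs are added on top of it. A minimal sketch at the two WTLE frequencies; the 20 km path length is a hypothetical value, as the abstract does not give the link geometry:

```python
import math

def fspl_db(freq_ghz, dist_km):
    """Free-space path loss in dB, with frequency in GHz and distance in km."""
    return 20 * math.log10(dist_km) + 20 * math.log10(freq_ghz) + 92.45

# Hypothetical 20 km terrestrial path at the two WTLE frequencies:
for f in (72.0, 84.0):
    print(f"{f} GHz: {fspl_db(f, 20.0):.1f} dB")
```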
NASA Astrophysics Data System (ADS)
Beyrich, F.; Bange, J.; Hartogensis, O.; Raasch, S.
2009-09-01
The turbulent exchanges of heat and water vapour are essential land surface - atmosphere interaction processes in the local, regional, and global energy and water cycles. Scintillometry can be considered the only technique presently available for the quasi-operational experimental determination of the area-averaged turbulent fluxes needed to validate fluxes simulated by regional atmospheric models or derived from satellite images at a horizontal scale of a few kilometres. While scintillometry has found increasing application in recent years, some fundamental issues related to its use still need further investigation. In particular, no studies are known so far that reproduce the path-averaged structure parameters measured by scintillometers through independent measurements or modelling techniques. The LITFASS-2009 field experiment was performed in the area around the Meteorological Observatory Lindenberg / Richard-Aßmann-Observatory in Germany during summer 2009. It was designed to investigate the spatial (horizontal and vertical) and temporal variability of the structure parameters (underlying the scintillometer principle) over moderately heterogeneous terrain. The experiment relied essentially on a coupling of eddy-covariance measurements, scintillometry, and airborne measurements with an unmanned autonomous aircraft able to fly precisely along the scintillometer path. Data interpretation will be supported by numerical modelling using a large-eddy simulation (LES) model. The paper describes the design of the experiment and presents first preliminary results from the measurements.
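The structure parameters underlying the scintillometer principle follow the standard two-thirds law: the temperature structure function D_T(r) = ⟨[T(x+r) − T(x)]²⟩ = C_T² r^(2/3) in the inertial range. A minimal sketch of estimating C_T² from synchronous two-point temperature series; the data and the 1 m separation are synthetic illustrative assumptions, not LITFASS-2009 measurements:

```python
import numpy as np

def structure_parameter(t_a, t_b, separation_m):
    """Estimate C_T^2 from synchronous temperature series at two points a
    distance r apart, using D_T(r) = <(T_a - T_b)^2> = C_T^2 * r**(2/3)."""
    d_t = np.mean((t_a - t_b) ** 2)
    return d_t / separation_m ** (2.0 / 3.0)

# Synthetic series whose two-point difference has variance 0.04 K^2 at
# r = 1 m, so the estimator should return about 0.04 K^2 m^(-2/3).
rng = np.random.default_rng(0)
t_a = 20.0 + rng.normal(0.0, 0.1, 100_000)
t_b = t_a + rng.normal(0.0, 0.2, 100_000)
print(structure_parameter(t_a, t_b, 1.0))
```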
NASA Technical Reports Server (NTRS)
Sinha, Neeraj
2014-01-01
This Phase II project validated a state-of-the-art LES model, coupled with a Ffowcs Williams-Hawkings (FW-H) far-field acoustic solver, to support the development of advanced engine concepts. These concepts include innovative flow control strategies to attenuate jet noise emissions. The end-to-end LES/ FW-H noise prediction model was demonstrated and validated by applying it to rectangular nozzle designs with a high aspect ratio. The model also was validated against acoustic and flow-field data from a realistic jet-pylon experiment, thereby significantly advancing the state of the art for LES.
Morsink, Maarten C; Dukers, Danny F
2009-03-01
Animal models have been widely used for studying the physiology and pharmacology of psychiatric and neurological diseases. The concepts of face, construct, and predictive validity are used as indicators to estimate the extent to which an animal model mimics the disease. Here, we used these three concepts to design a theoretical assignment to integrate the teaching of neurophysiology, neuropharmacology, and experimental design. For this purpose, seven case studies were developed in which animal models for several psychiatric and neurological diseases were described and in which neuroactive drugs used to treat or study these diseases were introduced. Groups of undergraduate students were assigned to one of these case studies and asked to give a classroom presentation in which 1) the disease and underlying pathophysiology are described, 2) the face and construct validity of the animal model are discussed, and 3) a pharmacological experiment with the associated neuroactive drug to assess predictive validity is presented. After evaluation of the presentations, we found that the students had gained considerable insight into disease phenomenology, its underlying neurophysiology, and the mechanism of action of the neuroactive drug. Moreover, the assignment was very useful in the teaching of experimental design, allowing an in-depth discussion of experimental control groups and the prediction of outcomes in these groups if the animal model were to display predictive validity. Finally, the highly positive responses in the student evaluation forms indicated that the assignment was of great interest to the students. Hence, the case studies developed here constitute a very useful tool for teaching neurophysiology, neuropharmacology, and experimental design.
Shanks, Ryan A.; Robertson, Chuck L.; Haygood, Christian S.; Herdliksa, Anna M.; Herdliska, Heather R.; Lloyd, Steven A.
2017-01-01
Introductory biology courses provide an important opportunity to prepare students for future courses, yet existing cookbook labs, although important in their own way, fail to provide many of the advantages of semester-long research experiences. Engaging, authentic research experiences aid biology students in meeting many learning goals. Therefore, overlaying a research experience onto the existing lab structure allows faculty to overcome barriers involving curricular change. Here we propose a working model for this overlay design in an introductory biology course and detail a means to conduct this lab with minimal increases in student and faculty workloads. Furthermore, we conducted exploratory factor analysis of the Experimental Design Ability Test (EDAT) and uncovered two latent factors which provide valid means to assess this overlay model’s ability to increase advanced experimental design abilities. In a pre-test/post-test design, we demonstrate significant increases in both basic and advanced experimental design abilities in an experimental and comparison group. We measured significantly higher gains in advanced experimental design understanding in students in the experimental group. We believe this overlay model and EDAT factor analysis contribute a novel means to conduct and assess the effectiveness of authentic research experiences in an introductory course without major changes to the course curriculum and with minimal increases in faculty and student workloads. PMID:28904647
NASA Astrophysics Data System (ADS)
Kamaruzaman, N. F.; Abdullah, E. J.
2017-12-01
Shape memory alloy (SMA) actuators offer a compelling solution for aerospace applications, with low weight being their most attractive feature. An SMA actuation mechanism for a flapping micro unmanned aerial vehicle (MAV) is proposed in this study, where the SMA material is the primary system providing the flapping motion to the wings. Based on several established design criteria, a prototype was fabricated to validate the design. As a proof of concept, an experiment was performed using an electrical circuit to power the SMA actuator and evaluate the flapping angle. During testing, several problems were observed, and solutions for future development are proposed. Based on the experiment, the average recorded flapping wing angle is 14.33° for upward deflection and 12.12° for downward deflection. This meets the design criteria and objective set forth for this design. The results demonstrate the feasibility of employing SMA actuators in a flapping wing MAV.
Development and Validation of the Elder Learning Barriers Scale Among Older Chinese Adults.
Wang, Renfeng; De Donder, Liesbeth; De Backer, Free; He, Tao; Van Regenmortel, Sofie; Li, Shihua; Lombaerts, Koen
2017-12-01
This study describes the development and validation of the Elder Learning Barriers (ELB) scale, which seeks to identify the obstacles that affect the level of educational participation of older adults. The process of item pool design and scale development is presented, as well as the testing and scale refinement procedure. The data were collected from a sample of 579 older Chinese adults (aged over 55) in the Xi'an region of China. After randomly splitting the sample for cross-validation purposes, the construct validity of the ELB scale was confirmed containing five dimensions: dispositional, informational, physical, situational, and institutional barriers. Furthermore, developmental differences in factor structure have been examined among older age groups. The results indicated that the scale demonstrated good reliability and validity. We conclude in general that the ELB scale appears to be a valuable instrument for examining the learning barriers that older Chinese citizens experience for participating in organized educational activities.
Ego-Dissolution and Psychedelics: Validation of the Ego-Dissolution Inventory (EDI)
Nour, Matthew M.; Evans, Lisa; Nutt, David; Carhart-Harris, Robin L.
2016-01-01
Aims: The experience of a compromised sense of “self”, termed ego-dissolution, is a key feature of the psychedelic experience. This study aimed to validate the Ego-Dissolution Inventory (EDI), a new 8-item self-report scale designed to measure ego-dissolution. Additionally, we aimed to investigate the specificity of the relationship between psychedelics and ego-dissolution. Method: Sixteen items relating to altered ego-consciousness were included in an internet questionnaire; eight relating to the experience of ego-dissolution (comprising the EDI), and eight relating to the antithetical experience of increased self-assuredness, termed ego-inflation. Items were rated using a visual analog scale. Participants answered the questionnaire for experiences with classical psychedelic drugs, cocaine and/or alcohol. They also answered the seven questions from the Mystical Experiences Questionnaire (MEQ) relating to the experience of unity with one’s surroundings. Results: Six hundred and ninety-one participants completed the questionnaire, providing data for 1828 drug experiences (1043 psychedelics, 377 cocaine, 408 alcohol). Exploratory factor analysis demonstrated that the eight EDI items loaded exclusively onto a single common factor, which was orthogonal to a second factor comprised of the items relating to ego-inflation (rho = −0.110), demonstrating discriminant validity. The EDI correlated strongly with the MEQ-derived measure of unitive experience (rho = 0.735), demonstrating convergent validity. EDI internal consistency was excellent (Cronbach’s alpha 0.93). Three analyses confirmed the specificity of ego-dissolution for experiences occasioned by psychedelic drugs. Firstly, EDI score correlated with drug-dose for psychedelic drugs (rho = 0.371), but not for cocaine (rho = 0.115) or alcohol (rho = −0.055). 
Secondly, the linear regression line relating the subjective intensity of the experience to ego-dissolution was significantly steeper for psychedelics (unstandardized regression coefficient = 0.701) compared with cocaine (0.135) or alcohol (0.144). Ego-inflation, by contrast, was specifically associated with cocaine experiences. Finally, a binary Support Vector Machine classifier identified experiences occasioned by psychedelic drugs vs. cocaine or alcohol with over 85% accuracy using ratings of ego-dissolution and ego-inflation alone. Conclusion: Our results demonstrate the psychometric structure, internal consistency and construct validity of the EDI. Moreover, we demonstrate the close relationship between ego-dissolution and the psychedelic experience. The EDI will facilitate the study of the neuronal correlates of ego-dissolution, which is relevant for psychedelic-assisted psychotherapy and our understanding of psychosis. PMID:27378878
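The internal-consistency figure quoted above is Cronbach's alpha, computed as α = k/(k−1) · (1 − Σ var(item)/var(total)). A minimal sketch of the computation on synthetic single-factor data mimicking the EDI's 8-item structure; the scores are simulated, not the study's ratings:

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, k_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1.0)) * (1.0 - item_vars.sum() / total_var)

# Synthetic 8-item scale driven by a single latent factor, loosely
# mimicking the EDI's unifactorial structure (illustrative data only).
rng = np.random.default_rng(1)
latent = rng.normal(0.0, 1.0, (500, 1))
scores = latent + rng.normal(0.0, 0.5, (500, 8))
print(f"alpha = {cronbach_alpha(scores):.2f}")
```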
NASA Astrophysics Data System (ADS)
Nakamura, Hiroo; Riccardi, B.; Loginov, N.; Ara, K.; Burgazzi, L.; Cevolani, S.; Dell'Orco, G.; Fazio, C.; Giusti, D.; Horiike, H.; Ida, M.; Ise, H.; Kakui, H.; Matsui, H.; Micciche, G.; Muroga, T.; Nakamura, Hideo; Shimizu, K.; Sugimoto, M.; Suzuki, A.; Takeuchi, H.; Tanaka, S.; Yoneoka, T.
2004-08-01
During the three-year key element technology phase of the International Fusion Materials Irradiation Facility (IFMIF) project, completed at the end of 2002, key technologies were validated. In this paper, these results are summarized. A water jet experiment simulating Li flow validated stable flow up to 20 m/s with a double reducer nozzle. In addition, a small Li loop experiment validated stable Li flow up to 14 m/s. Controlling the nitrogen content in Li below 10 wppm will require a V-Ti alloy getter surface area of 135 m². Conceptual designs of diagnostics have been carried out. Moreover, the concept of a remote handling system to replace the back wall, based on 'cut and reweld' and 'bayonet' options, has been established. Analysis by FMEA showed safe operation of the target system. Recent activities in the transition phase, started in 2003, and plans for the next phase are also described.
Ferreira, João C. P.; Fujihara, Caroline J.; Fruhvald, Erika; Trevisol, Eduardo; Destro, Flavia C.; Teixeira, Carlos R.; Pantoja, José C. F.; Schmidt, Elizabeth M. S.; Palme, Rupert
2015-01-01
Parrots kept in zoos and private households often develop psychological and behavioural disorders. Despite knowing that such disorders have a multifactorial aetiology and that chronic stress is involved, little is known about their development mainly due to a poor understanding of the parrots’ physiology and the lack of validated methods to measure stress in these species. In birds, blood corticosterone concentrations provide information about adrenocortical activity. However, blood sampling techniques are difficult, highly invasive and inappropriate to investigate stressful situations and welfare conditions. Thus, a non-invasive method to measure steroid hormones is critically needed. Aiming to perform a physiological validation of a cortisone enzyme immunoassay (EIA) to measure glucocorticoid metabolites (GCM) in droppings of 24 Blue-fronted parrots (Amazona aestiva), two experiments were designed. During the experiments all droppings were collected at 3-h intervals. Initially, birds were sampled for 24 h (experiment 1) and one week later assigned to four different treatments (experiment 2): Control (undisturbed), Saline (0.2 mL of 0.9% NaCl IM), Dexamethasone (1 mg/kg IM) and Adrenocorticotropic hormone (ACTH; 25 IU IM). Treatments (always one week apart) were applied to all animals in a cross-over study design. A daily rhythm pattern in GCM excretion was detected but there were no sex differences (first experiment). Saline and dexamethasone treatments had no effect on GCM (not different from control concentrations). Following ACTH injection, GCM concentration increased about 13.1-fold (median) at the peak (after 3–9 h), and then dropped to pre-treatment concentrations. By a successful physiological validation, we demonstrated the suitability of the cortisone EIA to non-invasively monitor increased adrenocortical activity, and thus, stress in the Blue-fronted parrot. 
This method opens up new perspectives for investigating the connection between behavioural disorders and stress in this bird species, and could also help in their captive management. PMID:26717147
Model Development and Model-Based Control Design for High Performance Nonlinear Smart Systems
2007-11-20
potentially impact a broad range of flow control problems of interest to the Air Force and Boeing. Point of contact: James Mabe, Boeing Phantom Works...rotorcraft blades. In both cases, models and control designs will be validated using data from Boeing experiments and flight tests. Point of contact: James Mabe, Boeing Phantom Works, Seattle, WA, 206-655-0091. 3. PZT Unimorphs – Boeing: Nonlinear structural models developed through AFOSR support are being
NASA Technical Reports Server (NTRS)
Martini, W. R.
1980-01-01
Four fully disclosed reference engines and five design methods are discussed. So far, the agreement between theory and experiment is about as good for the simpler calculation methods as for the more complicated ones, that is, within 20%. For the simpler methods, a single adjustable constant can be used to reduce the error in predicting power output and efficiency over the entire operating map to less than 10%.
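The one-constant adjustment described above can be sketched as a least-squares fit of a single scale factor between model predictions and measurements; all numbers below are illustrative, not the reference-engine data:

```python
# Sketch of a one-constant calibration: scale a simple model's power
# predictions by a single factor c fitted by least squares against
# measurements. The power values here are illustrative placeholders.

def fit_scale(predicted, measured):
    """Least-squares constant c minimizing sum((c*p - m)^2)."""
    return sum(p * m for p, m in zip(predicted, measured)) / sum(p * p for p in predicted)

predicted = [1.00, 1.50, 2.00, 2.60]   # kW, uncalibrated model output
measured  = [0.85, 1.30, 1.70, 2.20]   # kW, experiment
c = fit_scale(predicted, measured)
errors = [abs(c * p - m) / m for p, m in zip(predicted, measured)]
print(f"c = {c:.3f}, worst relative error = {max(errors):.1%}")
```

On this toy data the single constant brings every point well inside a 10% band, mirroring the kind of improvement the abstract reports for the simpler methods.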
LANDSAT D instrument module study
NASA Technical Reports Server (NTRS)
1976-01-01
Spacecraft instrument module configurations which support an earth resource data gathering mission using a thematic mapper sensor were examined. The differences in size of these two experiments necessitated the development of two different spacecraft configurations. Following the selection of the best-suited configurations, a validation phase of design, analysis and modelling was conducted to verify feasibility. The chosen designs were then used to formulate definition for a systems weight, a cost range for fabrication and interface requirements for the thematic mapper (TM).
NASA Technical Reports Server (NTRS)
Stefanescu, D. M.; Catalina, A. V.; Juretzko, Frank R.; Sen, Subhayu; Curreri, P. A.
2003-01-01
The objectives of the work on Particle Engulfment and Pushing by Solidifying Interfaces (PEP) include: 1) obtaining a fundamental understanding of the physics of particle pushing and engulfment, 2) developing mathematical models to describe the phenomenon, and 3) performing critical experiments in the microgravity environment of space to provide benchmark data for model validation. Successful completion of this project will yield vital information relevant to a diverse range of terrestrial applications. With PEP being a long-term research effort, this report focuses on advances in the theoretical treatment of the solid/liquid interface interaction with an approaching particle, experimental validation of some aspects of the developed models, and the experimental design aspects of future experiments to be performed on board the International Space Station.
The SCALE Verified, Archived Library of Inputs and Data - VALID
DOE Office of Scientific and Technical Information (OSTI.GOV)
Marshall, William BJ J; Rearden, Bradley T
The Verified, Archived Library of Inputs and Data (VALID) at ORNL contains high-quality, independently reviewed models and results that improve confidence in analysis. VALID is developed and maintained according to a procedure of the SCALE quality assurance (QA) plan. This paper reviews the origins of the procedure and its intended purpose, the philosophy of the procedure, some highlights of its implementation, and the future of the procedure and the associated VALID library. The original focus of the procedure was the generation of high-quality models that could be archived at ORNL and applied to many studies. The review process associated with model generation minimized the chances of errors in these archived models. Subsequently, the scope of the library and procedure was expanded to provide high-quality, reviewed sensitivity data files for deployment through the International Handbook of Evaluated Criticality Safety Benchmark Experiments (IHECSBE). Sensitivity data files for approximately 400 such models are currently available. The VALID procedure and library continue fulfilling these multiple roles. The VALID procedure is based on the quality assurance principles of ISO 9001 and nuclear safety analysis. Some of the key concepts include: independent generation and review of information, generation and review by qualified individuals, use of appropriate references for design data and documentation, and retrievability of the models, results, and documentation associated with entries in the library. Some highlights of the detailed procedure are discussed to provide background on its implementation and to indicate limitations of data extracted from VALID for use by the broader community. Specifically, external users of data generated within VALID must take responsibility for ensuring that the files are used within the QA framework of their organization and that use is appropriate.
Future plans for the VALID library include expansion to cover additional experiments from the IHECSBE, experiments from areas beyond criticality safety such as reactor physics and shielding, and application models. In the future, external SCALE users may also obtain qualification under the VALID procedure and be involved in expanding the library. The VALID library provides a pathway for the criticality safety community to leverage modeling and analysis expertise at ORNL.
Tai, Mitchell; Ly, Amanda; Leung, Inne; Nayar, Gautam
2015-01-01
The burgeoning pipeline for new biologic drugs has increased the need for high-throughput process characterization to efficiently use process development resources. Breakthroughs in highly automated and parallelized upstream process development have led to technologies such as the 250-mL automated mini bioreactor (ambr250™) system. Furthermore, developments in modern design of experiments (DoE) have promoted the use of definitive screening design (DSD) as an efficient method to combine factor screening and characterization. Here we utilize the 24-bioreactor ambr250™ system with 10-factor DSD to demonstrate a systematic experimental workflow to efficiently characterize an Escherichia coli (E. coli) fermentation process for recombinant protein production. The generated process model is further validated by laboratory-scale experiments and shows how the strategy is useful for quality by design (QbD) approaches to control strategies for late-stage characterization. © 2015 American Institute of Chemical Engineers.
Prediction of Cutting Force in Turning Process - An Experimental Approach
NASA Astrophysics Data System (ADS)
Thangarasu, S. K.; Shankar, S.; Thomas, A. Tony; Sridhar, G.
2018-02-01
This paper deals with the prediction of cutting forces in a turning process. The turning process with an advanced cutting tool has several advantages over grinding, such as short cycle time, process flexibility, comparable surface roughness, high material removal rate, and fewer environmental problems without the use of cutting fluid. In this work, a full-bridge dynamometer was used to measure the cutting forces on a mild steel workpiece with a cemented carbide insert tool for different combinations of cutting speed, feed rate, and depth of cut. The experiments were planned based on Taguchi design, and the measured cutting forces were compared with the predicted forces in order to validate the feasibility of the proposed design. The percentage contribution of each process parameter was analyzed using Analysis of Variance (ANOVA). Both the experimental results taken from the lathe tool dynamometer and those from the designed full-bridge dynamometer were analyzed using Taguchi design of experiments and ANOVA.
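The ANOVA percentage-contribution step can be sketched for a standard Taguchi L9(3^3) array: each factor's sum of squares is computed from its level means and expressed as a fraction of the total sum of squares. The design matrix below is the standard L9 array (0-indexed levels), but the force values are illustrative, not the paper's measurements:

```python
import numpy as np

# Standard L9(3^3) orthogonal array: 3 factors at 3 levels, 9 runs.
L9 = np.array([[0, 0, 0], [0, 1, 1], [0, 2, 2],
               [1, 0, 1], [1, 1, 2], [1, 2, 0],
               [2, 0, 2], [2, 1, 0], [2, 2, 1]])
# Illustrative cutting-force responses (N), one per run:
force = np.array([220., 310., 420., 250., 360., 300., 290., 330., 450.])

grand = force.mean()
ss_total = ((force - grand) ** 2).sum()
for j, name in enumerate(["speed", "feed", "depth"]):
    # Factor sum of squares from the three level means (3 runs per level).
    ss = sum(3 * (force[L9[:, j] == lev].mean() - grand) ** 2 for lev in range(3))
    print(f"{name}: {100 * ss / ss_total:.1f}% contribution")
```

Because the array is orthogonal, the factor sums of squares partition the total, and the leftover fraction is the error term.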
A Model for Designing Adaptive Laboratory Evolution Experiments.
LaCroix, Ryan A; Palsson, Bernhard O; Feist, Adam M
2017-04-15
The occurrence of mutations is a cornerstone of the evolutionary theory of adaptation, capitalizing on the rare chance that a mutation confers a fitness benefit. Natural selection is increasingly being leveraged in laboratory settings for industrial and basic science applications. Despite increasing deployment, there are no standardized procedures available for designing and performing adaptive laboratory evolution (ALE) experiments. Thus, there is a need to optimize the experimental design, specifically for determining when to consider an experiment complete and for balancing outcomes with available resources (i.e., laboratory supplies, personnel, and time). To design and to better understand ALE experiments, a simulator, ALEsim, was developed, validated, and applied to the optimization of ALE experiments. The effects of various passage sizes were experimentally determined and subsequently evaluated with ALEsim, to explain differences in experimental outcomes. Furthermore, a beneficial mutation rate of 10^-6.9 to 10^-8.4 mutations per cell division was derived. A retrospective analysis of ALE experiments revealed that passage sizes typically employed in serial passage batch culture ALE experiments led to inefficient production and fixation of beneficial mutations. ALEsim and the results described here will aid in the design of ALE experiments to fit the exact needs of a project while taking into account the resources required and will lower the barriers to entry for this experimental technique. IMPORTANCE ALE is a widely used scientific technique to increase scientific understanding, as well as to create industrially relevant organisms. The manner in which ALE experiments are conducted is highly manual and uniform, with little optimization for efficiency. Such inefficiencies result in suboptimal experiments that can take multiple months to complete.
With the availability of automation and computer simulations, we can now perform these experiments in an optimized fashion and can design experiments to generate greater fitness in an accelerated time frame, thereby pushing the limits of what adaptive laboratory evolution can achieve. Copyright © 2017 American Society for Microbiology.
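A minimal serial-passage simulation, much simpler than ALEsim but in the same spirit, illustrates why small passage sizes lose beneficial mutants at the dilution bottleneck; the mutation rate, selection coefficient and growth model below are illustrative assumptions, not the study's fitted values:

```python
import numpy as np

def simulate_ale(passage_size, final_size, mu=10**-7.5, s=0.05,
                 n_passages=300, seed=0):
    """Toy serial-passage sketch (not ALEsim): track the fraction of
    beneficial mutants through repeated growth-and-dilution cycles.
    mu, s and the growth model are illustrative assumptions."""
    rng = np.random.default_rng(seed)
    growth = final_size / passage_size
    frac = 0.0                    # mutant fraction at the start of a batch
    for _ in range(n_passages):
        # Mutants outgrow the wild type by a factor (1 + s) per batch.
        mut = frac * growth * (1 + s)
        wild = (1 - frac) * growth
        frac = mut / (mut + wild)
        # New beneficial mutants from this batch's cell divisions.
        frac += rng.poisson(mu * (final_size - passage_size)) / final_size
        frac = min(frac, 1.0)
        # Dilution bottleneck: a lineage rarer than ~1 cell in the
        # transferred volume is usually lost at passage.
        if 0.0 < frac * passage_size < 1.0 and rng.random() > frac * passage_size:
            frac = 0.0
    return frac

# Larger passage sizes retain and fix beneficial mutations more reliably.
for n0 in (5e5, 5e6, 5e7):
    print(f"passage size {n0:.0e}: final mutant fraction {simulate_ale(n0, 5e8):.3f}")
```

Even this crude model reproduces the qualitative finding above: when the transferred population is small, newly arisen beneficial mutants are repeatedly washed out before they can establish.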
Enhanced Missing Proteins Detection in NCI60 Cell Lines Using an Integrative Search Engine Approach.
Guruceaga, Elizabeth; Garin-Muga, Alba; Prieto, Gorka; Bejarano, Bartolomé; Marcilla, Miguel; Marín-Vicente, Consuelo; Perez-Riverol, Yasset; Casal, J Ignacio; Vizcaíno, Juan Antonio; Corrales, Fernando J; Segura, Victor
2017-12-01
The Human Proteome Project (HPP) aims to decipher the complete map of the human proteome. In the past few years, significant efforts of the HPP teams have been dedicated to the experimental detection of the missing proteins, which lack reliable mass spectrometry evidence of their existence. In this endeavor, an in-depth analysis of shotgun experiments might represent a valuable resource for selecting a biological matrix when designing validation experiments. In this work, we used all the proteomic experiments from the NCI60 cell lines and applied an integrative approach based on the results obtained from Comet, Mascot, OMSSA, and X!Tandem. This workflow benefits from the complementarity of these search engines to increase the proteome coverage. Five missing proteins compliant with C-HPP guidelines were identified, although further validation is needed. Moreover, 165 missing proteins were detected with only one unique peptide, and their functional analysis supported their participation in cellular pathways, as was also proposed in other studies. Finally, we performed a combined analysis of the gene expression levels and the proteomic identifications from the cell lines common to the NCI60 and CCLE projects to suggest alternatives for further validation of missing protein observations.
Anke, Audny; Manskow, Unn Sollid; Friborg, Oddgeir; Røe, Cecilie; Arntzen, Cathrine
2016-11-28
Family members are important for support and care of their close relative after severe trauma, and their experiences are vital health care quality indicators. The objective was to describe the development of the Family Experiences of in-hospital Care Questionnaire for family members of patients with severe Traumatic Brain Injury (FECQ-TBI), and to evaluate its psychometric properties and validity. The study is a Norwegian multicentre study that invited 171 family members. The questionnaire development process included a literature review, use of an existing instrument (the parent experience of paediatric care questionnaire), a focus group with close family members, and expert group judgments. Items asking about family care experiences related to acute wards and rehabilitation were included. Several items of the paediatric care questionnaire were removed, or their wording was changed, to comply with the present purpose. Questions covering experiences with the inpatient rehabilitation period, the discharge phase, the family's experiences with hospital facilities, the transfer between departments and the economic needs of the family were added. The developed questionnaire was mailed to the participants. Exploratory factor analyses were used to examine scale structure, in addition to screening for data quality and analyses of internal consistency and validity. The questionnaire was returned by 122 (71%) of the family members. Principal component analysis extracted six dimensions (eigenvalues > 1.0): acute organization and information (10 items), rehabilitation organization (13 items), rehabilitation information (6 items), discharge (4 items), hospital facilities-patients (4 items) and hospital facilities-family (2 items). Items related to the acute phase were comparable to items in the two rehabilitation dimensions: organization and information. All six subscales had high Cronbach's alpha coefficients (>0.80). The construct validity was confirmed.
The FECQ-TBI assesses important aspects of in-hospital care in the acute and rehabilitation phases, as seen from a family perspective. The psychometric properties and the construct validity of the questionnaire were good, hence supporting the use of the FECQ-TBI to assess quality of care in rehabilitation departments.
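The internal-consistency criterion reported above (Cronbach's alpha > 0.80 per subscale) can be sketched as follows; the 4-item response matrix is invented for illustration:

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents x n_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars / total_var)

# Hypothetical 5-point responses to a 4-item subscale (illustrative data).
scores = np.array([
    [4, 5, 4, 5],
    [3, 3, 4, 3],
    [5, 5, 5, 4],
    [2, 3, 2, 3],
    [4, 4, 5, 5],
])
print(round(cronbach_alpha(scores), 2))  # → 0.92
```

Alpha rises toward 1 as the items co-vary strongly relative to their individual variances, which is why it is the usual first check after the factor structure is settled.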
Beliefs Alter Holistic Face Processing…If Response Bias is not Taken into Account
Richler, Jennifer J.; Cheung, Olivia S.; Gauthier, Isabel
2012-01-01
The composite paradigm is widely used to quantify holistic processing (HP) of faces, but there is debate regarding the appropriate design (partial vs. complete) and measures in this task. Here, we argue that some operational definitions of HP are problematic because they are sensitive to top-down influences, even though the underlying concept is assumed to be cognitively impenetrable. In Experiment 1, we told one group of participants that the target face half would remain the same on 75% of trials, and another group that it would change on 75% of trials. The true proportion of same/different trials was 50%; the groups differed only in their beliefs about the target halves. In Experiment 2, we manipulated the actual proportion of same/different trials in the experiment (75% of trials were same for one group, 75% of trials were different for another group), but did not give explicit instructions about proportions. In both experiments these manipulations influenced response biases that altered partial design measures of HP while the complete design measure was unaffected. We argue that the partial design should be abandoned because it has poor construct validity. PMID:22101018
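The response-bias argument above rests on standard signal-detection measures: sensitivity (d') is separable from the criterion (c), which shifts with beliefs about trial proportions. A minimal sketch with hypothetical trial counts (not the paper's data):

```python
from statistics import NormalDist

def sdt_measures(hits, misses, false_alarms, correct_rejections):
    """d' (sensitivity) and c (response bias) from same/different trial
    counts, with a standard log-linear correction for extreme rates."""
    z = NormalDist().inv_cdf
    h = (hits + 0.5) / (hits + misses + 1)
    f = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1)
    return z(h) - z(f), -0.5 * (z(h) + z(f))

# Hypothetical counts: a liberal bias toward responding "same" makes c
# negative, while d' reflects discrimination independently of that bias.
d, c = sdt_measures(hits=45, misses=5, false_alarms=20, correct_rejections=30)
print(f"d' = {d:.2f}, c = {c:.2f}")
```

Partial-design accuracy scores conflate these two quantities, which is why a belief manipulation that moves only c can masquerade as a change in holistic processing.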
Karkar, Ravi; Schroeder, Jessica; Epstein, Daniel A; Pina, Laura R; Scofield, Jeffrey; Fogarty, James; Kientz, Julie A; Munson, Sean A; Vilardaga, Roger; Zia, Jasmine
2017-05-02
Diagnostic self-tracking, the recording of personal information to diagnose or manage a health condition, is a common practice, especially for people with chronic conditions. Unfortunately, many who attempt diagnostic self-tracking have trouble accomplishing their goals. People often lack knowledge and skills needed to design and conduct scientifically rigorous experiments, and current tools provide little support. To address these shortcomings and explore opportunities for diagnostic self-tracking, we designed, developed, and evaluated a mobile app that applies a self-experimentation framework to support patients suffering from irritable bowel syndrome (IBS) in identifying their personal food triggers. TummyTrials aids a person in designing, executing, and analyzing self-experiments to evaluate whether a specific food triggers their symptoms. We examined the feasibility of this approach in a field study with 15 IBS patients, finding that participants could use the tool to reliably undergo a self-experiment. However, we also discovered an underlying tension between scientific validity and the lived experience of self-experimentation. We discuss challenges of applying clinical research methods in everyday life, motivating a need for the design of self-experimentation systems to balance rigor with the uncertainties of everyday life.
Liu, Huolong; Galbraith, S C; Ricart, Brendon; Stanton, Courtney; Smith-Goettler, Brandye; Verdi, Luke; O'Connor, Thomas; Lee, Sau; Yoon, Seongkyu
2017-06-15
In this study, the influence of key process variables (screw speed, throughput and liquid-to-solid (L/S) ratio) of a continuous twin screw wet granulation (TSWG) was investigated using a central composite face-centered (CCF) experimental design method. Regression models were developed to predict the process responses (motor torque, granule residence time), granule properties (size distribution, volume average diameter, yield, relative width, flowability) and tablet properties (tensile strength). The effects of the three key process variables were analyzed via contour and interaction plots. The experimental results demonstrated that all the process responses, granule properties and tablet properties are influenced by changing the screw speed, throughput and L/S ratio. The TSWG process was optimized to produce granules with a target volume average diameter of 150 μm and a yield of 95% based on the developed regression models. A design space (DS) was built, based on a volume average granule diameter between 90 and 200 μm and a granule yield larger than 75%, with a failure probability analysis using Monte Carlo simulations. Validation experiments confirmed the robustness and accuracy of the DS generated using the CCF experimental design in optimizing a continuous TSWG process. Copyright © 2017 Elsevier B.V. All rights reserved.
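The Monte Carlo failure-probability analysis used to build such a design space can be sketched as follows; the quadratic response models and noise levels are illustrative stand-ins, not the study's fitted coefficients:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical response models in coded units (-1..1) for screw speed
# (x1), throughput (x2) and L/S ratio (x3); coefficients are invented.
def diameter_um(x):
    return 150 + 25 * x[0] - 30 * x[1] + 40 * x[2] + 10 * x[1] * x[2]

def yield_pct(x):
    return 88 + 4 * x[0] + 6 * x[1] - 5 * x[2] - 3 * x[2] ** 2

def failure_probability(x, noise_sd=(8.0, 3.0), n=20_000):
    """Fraction of simulated batches violating the acceptance criteria
    (90 <= diameter <= 200 um, yield > 75%) under model-plus-noise."""
    d = diameter_um(x) + rng.normal(0, noise_sd[0], n)
    y = yield_pct(x) + rng.normal(0, noise_sd[1], n)
    return float(((d < 90) | (d > 200) | (y <= 75)).mean())

# Inside the design space the failure probability is near zero; at an
# aggressive corner of the factor ranges it rises sharply.
print(failure_probability([0.0, 0.0, 0.0]))
print(failure_probability([1.0, -1.0, 1.0]))
```

Mapping this probability over a grid of operating points and thresholding it (e.g. at 1%) yields the probabilistic design-space boundary described in the abstract.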
Benchmark tests for a Formula SAE Student car prototyping
NASA Astrophysics Data System (ADS)
Mariasiu, Florin
2011-12-01
The aerodynamic characteristics of a vehicle are important elements in its design and construction. A low drag coefficient brings significant fuel savings and increased engine power efficiency. In designing and developing vehicles, dedicated CFD (Computational Fluid Dynamics) software packages are used to determine a vehicle's aerodynamic characteristics through computer simulation. However, the results obtained by this faster and cheaper method are validated by wind tunnel experiments, which are expensive and require complex testing equipment at relatively high cost. Therefore, the emergence and development of new low-cost testing methods to validate CFD simulation results would bring great economic benefits to the vehicle prototyping process. This paper presents the initial development process of a Formula SAE Student race-car prototype using CFD simulation and also presents a measurement system based on low-cost sensors through which the CFD simulation results were experimentally validated. The CFD software package used for simulation was SolidWorks with the FloXpress add-on, and the experimental measurement system was built using four piezoresistive force sensors of the FlexiForce type.
ERIC Educational Resources Information Center
Kalyuga, Slava
2008-01-01
Rapid cognitive diagnosis allows measuring current levels of learner domain-specific knowledge in online learning environments. Such measures are required for individualizing instructional support in real time, as students progress through a learning session. This article describes 2 experiments designed to validate a rapid online diagnostic…
Dynamic Assessment: One Approach and Some Initial Data. Technical Report No. 361.
ERIC Educational Resources Information Center
Campione, Joseph C.; Brown, Ann L.
In an effort to validate dynamic assessment methods influenced by Vygotsky's (1978) definition of zones of proximal development (an indicator of readiness), three sets of experiments addressed two goals: the development of diagnostic assessment methods and the use of diagnostic results to guide the design of instructional programs. The first two…
NASA Astrophysics Data System (ADS)
Ali, M.; Supriyatman; Saehana, S.
2018-03-01
Low-cost science experiment instruments were successfully designed from recycled materials. The instruments were produced to explain the concept of expansion and of hydrostatic pressure inside a liquid. The instruments were calibrated, then validated, and also implemented in science learning.
New Measure for Fathers of Children with Developmental Challenges
ERIC Educational Resources Information Center
Ly, A. R.; Goldberg, W. A.
2014-01-01
Background: There is a relative lack of measures tailored to the study of fathers of children with developmental challenges (DCs). The goal of the current study was to create and validate a brief measure designed to capture the perceptions and experiences of these fathers. The Fathers of Children with Developmental Challenges (FCDC) questionnaire…
Critical Pedagogy: EFL Teachers' Views, Experience and Academic Degrees
ERIC Educational Resources Information Center
Mahmoodarabi, Mahsa; Khodabakhsh, Mohammad Reza
2015-01-01
Although critical pedagogy has brought about positive changes in the field of education by shifting from traditional pedagogy to emancipatory pedagogy, not much attention has been paid to the factors affecting teachers' beliefs about critical pedagogy, and only a few studies have been conducted to design reliable and valid instruments to study EFL…
ERIC Educational Resources Information Center
Cobos Alvarado, Fabián; Peñaherrera León, Mónica; Ortiz Colon, Ana María
2016-01-01
Universities in Latin American countries are undergoing major changes in their institutional and academic settings. One strategy for continuous improvement of the teaching and learning process is the incorporation of methods and teaching aids that seek to develop scientific research skills in students from their undergraduate studies onward. The aim of this…
A Taxometric Study of the Adult Attachment Interview
ERIC Educational Resources Information Center
Roisman, Glenn I.; Fraley, R. Chris; Belsky, Jay
2007-01-01
This study is the first to examine the latent structure of individual differences reflected in the Adult Attachment Interview (AAI; C. George, N. Kaplan, & M. Main, 1985), a commonly used and well-validated measure designed to assess an adult's current state of mind regarding childhood experiences with caregivers. P. E. Meehl's (1995)…
ERIC Educational Resources Information Center
Floyd, Randy G.; Shands, Elizabeth I.; Alfonso, Vincent C.; Phillips, Jessica F.; Autry, Beth K.; Mosteller, Jessica A.; Skinner, Mary; Irby, Sarah
2015-01-01
Adaptive behavior scales are vital in assessing children and adolescents who experience a range of disabling conditions in school settings. This article presents the results of an evaluation of the design characteristics, norming, scale characteristics, reliability and validity evidence, and bias identification studies supporting 14…
USDA-ARS?s Scientific Manuscript database
The experiment was designed to validate the use of ultrasound to evaluate body composition in mature beef cows. Both precision and accuracy of measurement were assessed. Cull cows (n = 87) selected for highly variable fatness were used. Two experienced ultrasound technicians scanned and assigned ...
ERIC Educational Resources Information Center
Tlili, Ahmed; Essalmi, Fathi; Jemni, Mohamed; Kinshuk; Chen, Nian-Shing
2018-01-01
With the rapid growth of online education in recent years, Learning Analytics (LA) has gained increasing attention from researchers and educational institutions as an area which can improve the overall effectiveness of learning experiences. However, the lack of guidelines on what should be taken into consideration during application of LA hinders…
NASA Astrophysics Data System (ADS)
Razak, Jeefferie Abd; Ahmad, Sahrim Haji; Ratnam, Chantara Thevy; Mahamood, Mazlin Aida; Yaakub, Juliana; Mohamad, Noraiham
2014-09-01
A fractional 2^5 two-level factorial design of experiments (DOE) was applied to systematically prepare the NR/EPDM blend using a Haake internal mixer set-up. A process model of rubber blend preparation was developed that correlates the mixer process input parameters with the output response of blend compatibility. Model analysis of variance (ANOVA) and model fitting through curve evaluation yielded an R^2 of 99.60% with a proposed parametric combination of A = 30/70 NR/EPDM blend ratio, B = 70°C mixing temperature, C = 70 rpm rotor speed, D = 5 minutes mixing period and E = 1.30 phr EPDM-g-MAH compatibilizer addition, with an overall desirability of 0.966. Model validation with a small deviation of +2.09% confirmed the repeatability of the mixing strategy, with a valid maximum tensile strength output representing the blend miscibility. A theoretical calculation of NR/EPDM blend compatibility is also included and compared. In short, this study provides a brief insight into the utilization of DOE for experimental simplification and parameter inter-correlation studies, especially when dealing with multiple variables during elastomeric rubber blend preparation.
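A common half-fraction of a 2^5 design can be generated from the defining relation E = ABCD; the abstract does not state which fraction or generator was used, so the construction below is illustrative:

```python
from itertools import product

# Half-fraction 2^(5-1) design: 16 runs for the five coded factors A..E
# (blend ratio, temperature, rotor speed, mixing time, compatibilizer),
# with the generator E = ABCD confounding E with the ABCD interaction.
factors = "ABCDE"
runs = []
for a, b, c, d in product((-1, 1), repeat=4):
    e = a * b * c * d
    runs.append((a, b, c, d, e))

print(len(runs))                   # → 16
for run in runs[:3]:
    print(dict(zip(factors, run)))
```

The half-fraction keeps main effects clear of two-factor interactions (resolution V here), which is why 16 runs suffice to fit a main-effects-plus-interactions model for five variables.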
Suitable RF spectrum in ISM band for 2-way advanced metering network in India
NASA Astrophysics Data System (ADS)
Mishra, A.; Khan, M. A.; Gaur, M. S.
2013-01-01
The ISM (Industrial, Scientific and Medical) bands in the radio frequency space in India offer two alternative spectra for implementing a wireless network for advanced metering infrastructure (AMI). These bands lie in the 2.4 GHz range and at the sub-GHz frequencies of 865 to 867 MHz. This paper aims to examine the suitability of both options by designing and executing experiments in the laboratory as well as carrying out field trials on electricity meters to validate the selected option. A parameter, the communication effectiveness index (CEI2), is defined to measure the effectiveness of 2-way data communication (packet exchange) between two points under different scenarios of buildings and free space. Both 2.4 GHz and sub-GHz designs were implemented to compare the results. The experiments were conducted across 3 floors of a building. Validation of the selected option was carried out by conducting a field trial, integrating the selected radio frequency (RF) modem into single-phase electricity meters and installing these meters across three floors of the building. The methodology, implementation details, observations and resulting analytical conclusions are described in the paper.
Psychological need thwarting in the sport context: assessing the darker side of athletic experience.
Bartholomew, Kimberley J; Ntoumanis, Nikos; Ryan, Richard M; Thøgersen-Ntoumani, Cecilie
2011-02-01
Research in self-determination theory (Ryan & Deci, 2002) has shown that satisfaction of autonomy, competence, and relatedness needs in sport contexts is associated with enhanced engagement, performance, and well-being. This article outlines the initial development of a multidimensional measure designed to assess psychological need thwarting, an under-studied area of conceptual and practical importance. Study 1 generated a pool of items designed to tap the negative experiential state that occurs when athletes perceive their needs for autonomy, competence, and relatedness to be actively undermined. Study 2 tested the factorial structure of the questionnaire using confirmatory factor analysis. The supported model comprised 3 factors, which represented the hypothesized interrelated dimensions of need thwarting. The model was refined and cross-validated using an independent sample in Study 3. Overall, the psychological need thwarting scale (PNTS) demonstrated good content, factorial, and predictive validity, as well as internal consistency and invariance across gender, sport type, competitive level, and competitive experience. The conceptualization of psychological need thwarting is discussed, and suggestions are made regarding the use of the PNTS in research pertaining to the darker side of sport participation.
Design of a high-temperature experiment for evaluating advanced structural materials
NASA Technical Reports Server (NTRS)
Mockler, Theodore T.; Castro-Cedeno, Mario; Gladden, Herbert J.; Kaufman, Albert
1992-01-01
This report describes the design of an experiment for evaluating monolithic and composite material specimens in a high-temperature environment subjected to large thermal gradients. The material specimens will be exposed to aerothermal loads that correspond to thermally similar engine operating conditions. Materials evaluated in this study were monolithic nickel alloys and silicon carbide. In addition, composites such as tungsten/copper were evaluated. A facility to provide the test environment has been assembled in the Engine Research Building at the Lewis Research Center. The test section of the facility will permit both regular and Schlieren photography, thermal imaging, and laser Doppler anemometry. The test environment will be products of hydrogen-air combustion at temperatures from about 1200 °F to as high as 4000 °F. The test chamber pressure will vary up to 60 psia, and the free-stream flow velocity can reach Mach 0.9. The data collected will be used to validate thermal and stress analysis models of the specimen. This process of modeling, testing, and validation is expected to yield enhancements to existing analysis tools and techniques.
Purwar, Namrta; Tenboer, Jason; Tripathi, Shailesh; Schmidt, Marius
2013-09-13
Time-resolved spectroscopic experiments have been performed with protein in solution and in crystalline form using a newly designed microspectrophotometer. The time-resolution of these experiments can be as good as two nanoseconds (ns), which is the minimal response time of the image intensifier used. With the current setup, the effective time-resolution is about seven ns, determined mainly by the pulse duration of the nanosecond laser. The amount of protein required is small, on the order of 100 nanograms. Bleaching, which is an undesirable effect common to photoreceptor proteins, is minimized by using a millisecond shutter to avoid extensive exposure to the probing light. We investigate two model photoreceptors, photoactive yellow protein (PYP), and α-phycoerythrocyanin (α-PEC), on different time scales and at different temperatures. Relaxation times obtained from kinetic time-series of difference absorption spectra collected from PYP are consistent with previous results. The comparison with these results validates the capability of this spectrophotometer to deliver high quality time-resolved absorption spectra.
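Relaxation times like those extracted from the kinetic time series above are typically obtained by fitting an exponential decay to difference-absorption amplitudes; a sketch on synthetic data (the 2.5 ms rate and noise level are invented for illustration):

```python
import numpy as np
from scipy.optimize import curve_fit

def decay(t, amp, tau, offset):
    """Single-exponential relaxation of a difference-absorption amplitude."""
    return amp * np.exp(-t / tau) + offset

# Synthetic kinetic series sampled log-uniformly from 0.1 ms to 100 ms.
rng = np.random.default_rng(1)
t = np.logspace(-4, -1, 60)
signal = decay(t, amp=0.02, tau=2.5e-3, offset=0.001)
signal += rng.normal(0, 5e-4, t.size)      # measurement noise

popt, _ = curve_fit(decay, t, signal, p0=(0.01, 1e-3, 0.0))
print(f"fitted relaxation time: {popt[1] * 1e3:.2f} ms")
```

Multi-exponential kinetics are handled the same way with a sum of decay terms, at the cost of a more delicate choice of starting values.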
van Dongen, Koen W; Ahlberg, Gunnar; Bonavina, Luigi; Carter, Fiona J; Grantcharov, Teodor P; Hyltander, Anders; Schijven, Marlies P; Stefani, Alessandro; van der Zee, David C; Broeders, Ivo A M J
2011-01-01
Virtual reality (VR) simulators have been demonstrated to improve basic psychomotor skills in endoscopic surgery. The exercise configuration settings used for validation in studies published so far are default settings or are based on the personal choice of the tutors. The purpose of this study was to establish consensus on exercise configurations and on a validated training program for a virtual reality simulator, based on the experience of international experts to set criterion levels to construct a proficiency-based training program. A consensus meeting was held with eight European teams, all extensively experienced in using the VR simulator. Construct validity of the training program was tested by 20 experts and 60 novices. The data were analyzed by using the t test for equality of means. Consensus was achieved on training designs, exercise configuration, and examination. Almost all exercises (7/8) showed construct validity. In total, 50 of 94 parameters (53%) showed significant difference. A European, multicenter, validated, training program was constructed according to the general consensus of a large international team with extended experience in virtual reality simulation. Therefore, a proficiency-based training program can be offered to training centers that use this simulator for training in basic psychomotor skills in endoscopic surgery.
NASA Astrophysics Data System (ADS)
Saupe, Florian; Knoblach, Andreas
2015-02-01
Two different approaches for the determination of frequency response functions (FRFs) are used for the non-parametric closed loop identification of a flexible joint industrial manipulator with serial kinematics. The two applied experiment designs are based on low power multisine and high power chirp excitations. The main challenge is to eliminate disturbances of the FRF estimates caused by the numerous nonlinearities of the robot. For the experiment design based on chirp excitations, a simple iterative procedure is proposed which allows exploiting the good crest factor of chirp signals in a closed loop setup. An interesting synergy of the two approaches, beyond validation purposes, is pointed out.
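A standard non-parametric route to the FRF in such a setup is the H1 estimator, the ratio of averaged cross- to auto-spectra; the toy second-order plant and multisine below are illustrative, not the manipulator model from the study:

```python
import numpy as np
from scipy.signal import csd, welch

rng = np.random.default_rng(0)
fs = 1000.0
t = np.arange(0, 20, 1 / fs)

# Multisine excitation at odd harmonics with random phases; a Schroeder
# phase set would lower the crest factor further.
exc = np.arange(1.0, 50.0, 2.0)            # excited frequencies in Hz
u = sum(np.sin(2 * np.pi * fr * t + rng.uniform(0, 2 * np.pi)) for fr in exc)

# Toy second-order plant standing in for one flexible joint:
# y'' + 2*zeta*wn*y' + wn^2*y = wn^2*u (semi-implicit Euler integration).
wn, zeta = 2 * np.pi * 10, 0.05
y = np.zeros_like(u)
v = 0.0
for i in range(1, t.size):
    v += (wn**2 * (u[i-1] - y[i-1]) - 2 * zeta * wn * v) / fs
    y[i] = y[i-1] + v / fs
y += rng.normal(0, 0.01, y.size)           # measurement noise

# H1 estimator: FRF = S_uy / S_uu, read off at the excited bins only.
f, S_uy = csd(u, y, fs=fs, nperseg=4096)
_, S_uu = welch(u, fs=fs, nperseg=4096)
frf = S_uy / S_uu
bins = [np.argmin(np.abs(f - fr)) for fr in exc]
peak = exc[np.argmax(np.abs(frf[bins]))]
print(f"resonance found near {peak:.0f} Hz")
```

Restricting the estimate to excited bins is what makes the multisine approach robust: non-excited bins carry only noise and nonlinear distortion, which is exactly the disturbance the abstract describes trying to eliminate.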
Design of an Efficient Turbulent Micro-Mixer for Protein Folding Experiments
NASA Astrophysics Data System (ADS)
Inguva, Venkatesh; Perot, Blair
2015-11-01
Protein folding studies require the development of micro-mixers that use less sample, mix at faster rates, and still provide a high signal-to-noise ratio. Chaotic to marginally turbulent micro-mixers are promising candidates for this application. In this study, various turbulence and unsteadiness generation concepts are explored that avoid cavitation. The mixing enhancements include flow turning regions, flow splitters, and vortex shedding. The relative effectiveness of these different approaches for rapid micro-mixing is discussed. Simulations found that flow turning regions provided the best mixing profile. Experimental validation of the optimal design is verified through laser confocal microscopy experiments. This work is supported by the National Science Foundation.
Integrated software for the detection of epileptogenic zones in refractory epilepsy.
Mottini, Alejandro; Miceli, Franco; Albin, Germán; Nuñez, Margarita; Ferrándo, Rodolfo; Aguerrebere, Cecilia; Fernandez, Alicia
2010-01-01
In this paper we present integrated software designed to help nuclear medicine physicians in the detection of epileptogenic zones (EZ) by means of ictal-interictal SPECT and MR images. This tool was designed to be flexible, friendly and efficient. A novel detection method (a-contrario) was included along with the classical detection method (subtraction analysis). The software's performance was evaluated with two separate sets of validation studies: visual interpretation of 12 patient images by an experienced observer, and objective analysis of virtual brain phantom experiments by proposed numerical observers. Our results support the potential use of the proposed software to help nuclear medicine physicians in the detection of EZ in clinical practice.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Irminger, Philip; Starke, Michael R; Dimitrovski, Aleksandar D
2014-01-01
Power system equipment manufacturers and researchers continue to experiment with novel overhead electric conductor designs that support better conductor performance and address congestion issues. To address the technology gap in testing these novel designs, Oak Ridge National Laboratory constructed the Powerline Conductor Accelerated Testing (PCAT) facility to evaluate the performance of novel overhead conductors in an accelerated fashion in a field environment. Additionally, PCAT has the capability to test advanced sensors and measurement methods for assessing overhead conductor performance and condition. Equipped with extensive measurement and monitoring devices, PCAT provides a platform to improve and validate conductor computer models and assess the performance of novel conductors. The PCAT facility and its testing capabilities are described in this paper.
Development of an Axisymmetric Afterbody Test Case for Turbulent Flow Separation Validation
NASA Technical Reports Server (NTRS)
Disotell, Kevin J.; Rumsey, Christopher L.
2017-01-01
As identified in the CFD Vision 2030 Study commissioned by NASA, validation of advanced RANS models and scale-resolving methods for computing turbulent flows must be supported by improvements in high-quality experiments designed specifically for CFD implementation. A new test platform referred to as the Axisymmetric Afterbody allows for a range of flow behaviors to be studied on interchangeable afterbodies while facilitating access to higher Reynolds number facilities. A priori RANS computations are reported for a risk-reduction configuration to demonstrate critical variation among turbulence model results for a given afterbody, ranging from barely attached to mildly separated flow. The effects of body nose geometry and tunnel-wall boundary condition on the computed afterbody flow are explored to inform the design of an experimental test program.
NASA Astrophysics Data System (ADS)
Fix, A.; Ehret, G.; Flentje, H.; Poberaj, G.; Gottwald, M.; Finkenzeller, H.; Bremer, H.; Bruns, M.; Burrows, J. P.; Kleinböhl, A.; Küllmann, H.; Kuttippurath, J.; Richter, A.; Wang, P.; Heue, K.-P.; Platt, U.; Wagner, T.
2004-12-01
For the first time three different remote sensing instruments - a sub-millimeter radiometer, a differential optical absorption spectrometer in the UV-visible spectral range, and a lidar - were deployed aboard DLR's meteorological research aircraft Falcon 20 to validate a large number of SCIAMACHY level 2 and off-line data products such as O3, NO2, N2O, BrO, OClO, H2O, aerosols, and clouds. Within two main validation campaigns of the SCIA-VALUE mission (SCIAMACHY VALidation and Utilization Experiment) extended latitudinal cross-sections stretching from polar regions to the tropics as well as longitudinal cross sections at polar latitudes at about 70° N and the equator have been generated. This contribution gives an overview over the campaigns performed and reports on the observation strategy for achieving the validation goals. We also emphasize the synergetic use of the novel set of aircraft instrumentation and the usefulness of this innovative suite of remote sensing instruments for satellite validation.
NASA Astrophysics Data System (ADS)
Fix, A.; Ehret, G.; Flentje, H.; Poberaj, G.; Gottwald, M.; Finkenzeller, H.; Bremer, H.; Bruns, M.; Burrows, J. P.; Kleinböhl, A.; Küllmann, H.; Kuttippurath, J.; Richter, A.; Wang, P.; Heue, K.-P.; Platt, U.; Pundt, I.; Wagner, T.
2005-05-01
For the first time three different remote sensing instruments - a sub-millimeter radiometer, a differential optical absorption spectrometer in the UV-visible spectral range, and a lidar - were deployed aboard DLR's meteorological research aircraft Falcon 20 to validate a large number of SCIAMACHY level 2 and off-line data products such as O3, NO2, N2O, BrO, OClO, H2O, aerosols, and clouds. Within two validation campaigns of the SCIA-VALUE mission (SCIAMACHY VALidation and Utilization Experiment) extended latitudinal cross-sections stretching from polar regions to the tropics as well as longitudinal cross sections at polar latitudes at about 70° N and the equator were generated. This contribution gives an overview over the campaigns performed and reports on the observation strategy for achieving the validation goals. We also emphasize the synergetic use of the novel set of aircraft instrumentation and the usefulness of this innovative suite of remote sensing instruments for satellite validation.
Construction of Virtual-Experiment Systems for Information Science Education
NASA Astrophysics Data System (ADS)
She, Jin-Hua; Amano, Naoki
Practice is very important in education because it not only stimulates the motivation to learn but also deepens the understanding of theory. However, owing to limitations on time and experimental resources, experiments cannot simply be introduced into every lesson. To make the best use of multimedia technology, this paper designs five virtual-experiment systems, based on high-school-level physics, to improve the effectiveness of teaching data processing. The systems are designed using the cognitive theory of multimedia learning and the inner-game principle to ensure ease of use and to reduce cognitive load. The learning process is divided into two stages: the first stage teaches the basic concepts of data processing, and the second stage practices the techniques taught in the first stage and uses them to build a linear model and carry out estimation. The virtual-experiment systems have been tested in a university data processing course and have demonstrated their validity.
New measure for fathers of children with developmental challenges.
Ly, A R; Goldberg, W A
2014-05-01
There is a relative lack of measures tailored to the study of fathers of children with developmental challenges (DCs). The goal of the current study was to create and validate a brief measure designed to capture the perceptions and experiences of these fathers. The Fathers of Children with Developmental Challenges (FCDC) questionnaire was designed to assess fathers' perceptions of the supports for, and challenges to, their efforts to be involved in the rearing of their children. Participants were 101 fathers of children with DCs who completed an online survey. Scale validation included tests of reliability, validity and factor structure. Measures of parenting stress, parenting commitment, parent personality and child social-communicative skills were used to establish validity. Analyses indicated that the FCDC is reliable (α = 0.89), demonstrates content and construct validity, and acts in theoretically expected ways. Factor analysis of the 20-item measure yielded two sub-scales: (1) impact on parenting, and (2) involvement with child intervention. The FCDC fills a gap in the literature by offering an easy-to-administer self-report measure of fathers' perceptions of supports for, and barriers to, their involvement with their children with DCs. The FCDC could assist professionals in delivering support services specifically for fathers of children with DCs. © 2013 The Authors. Journal of Intellectual Disability Research © 2013 John Wiley & Sons Ltd, MENCAP & IASSIDD.
Statistical aspects of quantitative real-time PCR experiment design.
Kitchen, Robert R; Kubista, Mikael; Tichopad, Ales
2010-04-01
Experiments using quantitative real-time PCR to test hypotheses are limited by technical and biological variability; we seek to minimise sources of confounding variability through optimum use of biological and technical replicates. The quality of an experiment design is commonly assessed by calculating its prospective power. Such calculations rely on knowledge of the expected variances of the measurements of each group of samples and the magnitude of the treatment effect; the estimation of which is often uninformed and unreliable. Here we introduce a method that exploits a small pilot study to estimate the biological and technical variances in order to improve the design of a subsequent large experiment. We measure the variance contributions at several 'levels' of the experiment design and provide a means of using this information to predict both the total variance and the prospective power of the assay. A validation of the method is provided through a variance analysis of representative genes in several bovine tissue-types. We also discuss the effect of normalisation to a reference gene in terms of the measured variance components of the gene of interest. Finally, we describe a software implementation of these methods, powerNest, that gives the user the opportunity to input data from a pilot study and interactively modify the design of the assay. The software automatically calculates expected variances, statistical power, and optimal design of the larger experiment. powerNest enables the researcher to minimise the total confounding variance and maximise prospective power for a specified maximum cost for the large study. Copyright 2010 Elsevier Inc. All rights reserved.
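The nested variance bookkeeping described above can be sketched in a few lines. A minimal illustration, assuming a two-level design (biological replicates, each measured with technical replicates) and a normal approximation for the prospective power of a two-group comparison; the variance estimates, replicate counts, and effect size below are hypothetical, not values from the paper:

```python
from statistics import NormalDist

def prospective_power(var_bio, var_tech, n_bio, n_tech, effect, alpha=0.05):
    """Approximate power of a two-group comparison under a nested
    (biological x technical replicate) qPCR design, using a normal
    approximation (adequate for moderate replicate numbers)."""
    # Variance of one group's mean: biological variance averages over
    # biological replicates, technical variance over all technical reps.
    var_mean = var_bio / n_bio + var_tech / (n_bio * n_tech)
    se_diff = (2 * var_mean) ** 0.5              # two independent groups
    z = NormalDist()
    return z.cdf(effect / se_diff - z.inv_cdf(1 - alpha / 2))

# Hypothetical pilot estimates on the Cq scale: biological variance
# dominates, so extra biological replicates help more than technical ones.
power = prospective_power(var_bio=0.40, var_tech=0.05,
                          n_bio=6, n_tech=3, effect=1.0)
```

Under these made-up numbers, doubling the biological replicates raises power far more than doubling the technical replicates, which is the kind of replicate-allocation trade-off powerNest is described as optimizing.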
Cooling tower plume - model and experiment
NASA Astrophysics Data System (ADS)
Cizek, Jan; Gemperle, Jiri; Strob, Miroslav; Nozicka, Jiri
The paper presents a simple model of the so-called steam plume, which often forms during operation of the evaporative cooling systems of power plants or large technological units. The model is based on semi-empirical equations that describe the behaviour of a mixture of two gases in a free jet stream. In the conclusion of the paper, a simple experiment is presented through which the results of the designed model will be validated in subsequent work.
Space processes for extended low-G testing
NASA Technical Reports Server (NTRS)
Steurer, W. H.; Kaye, S.; Gorham, D. J.
1973-01-01
Results of an investigation into verifying the capabilities of space processes in ground-based experiments at low-g periods are presented. Limited-time experiments were conducted with the processes. A valid representation of the complete process cycle was achieved at low-g periods ranging from 40 to 390 seconds. A minimum equipment inventory is defined. A modular equipment design, adopted to assure low cost and high program flexibility, is presented, as well as procedures and data established for the synthesis and definition of dedicated and mixed rocket payloads.
Patel, Prinesh N; Karakam, Vijaya Saradhi; Samanthula, Gananadhamu; Ragampeta, Srinivas
2015-10-01
Quality-by-design-based methods hold a greater level of confidence under variations and achieve greater success in method transfer. A quality-by-design-based ultra high performance liquid chromatography method was developed for the simultaneous assay of sumatriptan and naproxen along with their related substances. The first screening was performed by a fractional factorial design comprising 44 experiments covering reversed-phase stationary phases, pH, and organic modifiers. The results of the screening design experiments suggested that a phenyl hexyl column with acetonitrile was the best combination. The method was further optimized for flow rate, temperature, and gradient time by an experimental design of 20 experiments, and the knowledge space was generated for the effect of each variable on the response (number of peaks with resolution ≥ 1.50). A proficient design space was generated from the knowledge space by applying Monte Carlo simulation, successfully integrating quantitative robustness metrics at the optimization stage itself. The final method provided robust performance, which was verified and validated. Final conditions comprised a Waters® Acquity phenyl hexyl column with gradient elution using ammonium acetate (pH 4.12, 0.02 M) buffer and acetonitrile at a 0.355 mL/min flow rate and 30°C. The developed method separates all 13 analytes within a 15 min run time with fewer experiments than the traditional quality-by-testing approach. ©2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
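The Monte Carlo step that turns a knowledge space into a design space can be sketched as follows; this is a schematic illustration only, with a hypothetical quadratic response surface standing in for the fitted DoE model, and invented operating tolerances:

```python
import random

def success_probability(setpoint, resolution_model, tol, n=10_000, seed=1):
    """Monte Carlo estimate of the probability that a chromatographic
    setpoint still meets the acceptance criterion (critical resolution
    >= 1.5) when flow, temperature, and gradient time vary within the
    given operating tolerances. `resolution_model` stands in for the
    response-surface model fitted to the DoE results."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n):
        flow = rng.gauss(setpoint["flow"], tol["flow"])
        temp = rng.gauss(setpoint["temp"], tol["temp"])
        tg = rng.gauss(setpoint["tg"], tol["tg"])
        if resolution_model(flow, temp, tg) >= 1.5:
            hits += 1
    return hits / n

# Hypothetical quadratic response surface around the final conditions
model = lambda f, t, tg: (2.0 - 8.0 * (f - 0.355) ** 2
                          - 0.002 * (t - 30) ** 2 - 0.01 * abs(tg - 15))
p = success_probability({"flow": 0.355, "temp": 30, "tg": 15},
                        model, {"flow": 0.01, "temp": 1.0, "tg": 0.5})
```

A setpoint is admitted to the design space when this estimated success probability exceeds a chosen robustness threshold; repeating the calculation over a grid of candidate setpoints carves the design space out of the knowledge space.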
Using CFD as a Rocket Injector Design Tool: Recent Progress at Marshall Space Flight Center
NASA Technical Reports Server (NTRS)
Tucker, Kevin; West, Jeff; Williams, Robert; Lin, Jeff; Canabal, Francisco; Rocker, Marvin; Robles, Bryan; Garcia, Robert; Chenoweth, James
2005-01-01
New programs are forcing American propulsion system designers into unfamiliar territory. For instance, industry's answer to the cost and reliability goals set out by the Next Generation Launch Technology Program is engine concepts based on the Oxygen-Rich Staged Combustion (ORSC) Cycle. Historical injector design tools are not well suited for this new task. The empirical correlations do not apply directly to the injector concepts associated with the ORSC cycle. These legacy tools focus primarily on performance, with environment evaluation a secondary objective. Additionally, the environmental capability of these tools is usually one-dimensional, while the actual environments are at least two- and often three-dimensional. CFD has the potential to calculate performance and multi-dimensional environments, but its use in the injector design process has been held back by long solution turnaround times and insufficient demonstrated accuracy. This paper has documented the parallel paths of program support and technology development currently employed at Marshall Space Flight Center in an effort to move CFD to the forefront of injector design. MSFC has established a long-term goal for the use of CFD for combustion devices design. The work on injector design is the heart of that vision, and the Combustion Devices CFD Simulation Capability Roadmap focuses the vision. The SRL concept, combining solution fidelity, robustness and accuracy, has been established as a quantitative gauge of current and desired capability. Three examples of current injector analysis for program support have been presented and discussed. These examples are used to establish the current capability at MSFC for these problems. Shortcomings identified from this experience are being used as inputs to the Roadmap process. The SRL evaluation identified lack of demonstrated solution accuracy as a major issue.
Accordingly, the MSFC view of code validation and current MSFC-funded validation efforts were discussed in some detail. The objectives of each effort were noted. Issues relative to code validation for injector design were discussed in some detail. The requirement for CFD support during the design of the experiment was noted and discussed in terms of instrumentation placement and experimental rig uncertainty. In conclusion, MSFC has made significant progress in the last two years in advancing CFD toward the goal of application to injector design. A parallel effort focused on program support and technology development via the SCIT Task has enabled the progress.
A z-gradient array for simultaneous multi-slice excitation with a single-band RF pulse.
Ertan, Koray; Taraghinia, Soheil; Sadeghi, Alireza; Atalar, Ergin
2018-07-01
Multi-slice radiofrequency (RF) pulses have higher specific absorption rates, more peak RF power, and longer pulse durations than single-slice RF pulses. Gradient field design techniques using a z-gradient array are investigated for exciting multiple slices with a single-band RF pulse. Two different field design methods are formulated to solve for the required current values of the gradient array elements for the given slice locations. The method requirements are specified, optimization problems are formulated for the minimum current norm, and an analytical solution is provided. A 9-channel z-gradient coil array driven by independent, custom-designed gradient amplifiers is used to validate the theory. Performance measures such as normalized slice thickness error, gradient strength per unit norm current, power dissipation, and maximum amplitude of the magnetic field are provided for various slice locations and numbers of slices. Two and three slices are excited by a single-band RF pulse in simulations and phantom experiments. The possibility of multi-slice excitation with a single-band RF pulse using a z-gradient array is validated in simulations and phantom experiments. Magn Reson Med 80:400-412, 2018. © 2017 International Society for Magnetic Resonance in Medicine.
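The minimum-current-norm formulation reduces to a small underdetermined linear system: find channel currents c with A c = b, where row k of A holds each channel's field per unit current at slice location z_k, and b requests the same field value at every slice so that one RF band excites them all. A minimal sketch with hypothetical channel positions and Gaussian per-channel field profiles (the paper's 9-channel hardware and optimization are richer):

```python
import math

def min_norm_currents(A, b):
    """Minimum-norm c solving A c = b for a wide matrix A (more
    channels than slice constraints): c = A^T (A A^T)^-1 b."""
    m, n = len(A), len(A[0])
    G = [[sum(A[i][k] * A[j][k] for k in range(n)) for j in range(m)]
         for i in range(m)]                        # Gram matrix A A^T
    # Solve G y = b by Gauss-Jordan elimination with partial pivoting
    M = [row[:] + [bi] for row, bi in zip(G, b)]
    for col in range(m):
        piv = max(range(col, m), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        M[col] = [v / M[col][col] for v in M[col]]
        for r in range(m):
            if r != col:
                M[r] = [a - M[r][col] * w for a, w in zip(M[r], M[col])]
    y = [M[i][m] for i in range(m)]
    return [sum(A[i][k] * y[i] for i in range(m)) for k in range(n)]

# Hypothetical 3-channel array; field per unit current at slice centers
pos = [-0.06, 0.0, 0.06]                           # channel z-positions (m)
field = lambda z, p: math.exp(-((z - p) / 0.04) ** 2)
slices = [-0.05, 0.05]                             # two slice centers (m)
A = [[field(z, p) for p in pos] for z in slices]
b = [1.0, 1.0]     # same field at both slices -> one RF band excites both
c = min_norm_currents(A, b)
```

Among all current vectors that satisfy the slice conditions, this one dissipates the least for a given amplifier norm, which is why the minimum-norm criterion appears in the formulation.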
Numerical framework for the modeling of electrokinetic flows
NASA Astrophysics Data System (ADS)
Deshpande, Manish; Ghaddar, Chahid; Gilbert, John R.; St. John, Pamela M.; Woudenberg, Timothy M.; Connell, Charles R.; Molho, Joshua; Herr, Amy; Mungal, Godfrey; Kenny, Thomas W.
1998-09-01
This paper presents a numerical framework for design-based analyses of electrokinetic flow in interconnects. Electrokinetic effects, which can be broadly divided into electrophoresis and electroosmosis, are of importance in providing a transport mechanism in microfluidic devices for both pumping and separation. Models for the electrokinetic effects can be derived and coupled to the fluid dynamic equations through appropriate source terms. In the design of practical microdevices, however, accurate coupling of the electrokinetic effects requires knowledge of several material and physical parameters, such as the diffusivity and the mobility of the solute in the solvent. Additionally, wall-based effects such as chemical binding sites may exist that affect the flow patterns. In this paper, we address some of these issues by describing a synergistic numerical/experimental process to extract the required parameters. Experiments were conducted to give the numerical simulations a mechanism for extracting these parameters through quantitative comparison. These parameters were then applied in predicting further experiments to validate the process. As part of this research, we have created NetFlow, a tool for microfluidic analyses. The tool can be validated and applied in existing technologies by first creating test structures to extract representations of the physical phenomena in the device, and then applying them in the design analyses to predict correct behavior.
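The parameter-extraction step can be illustrated in its simplest form: if a solute band's centroid moves as x(t) = μEt and its spatial variance grows as σ²(t) = σ₀² + 2Dt, then the mobility μ and diffusivity D fall out of two linear fits to imaged band positions. The field strength and synthetic data below are hypothetical, not from the experiments described:

```python
def linfit(xs, ys):
    """Least-squares slope and intercept of y against x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    return slope, my - slope * mx

E = 2.0e4                                # V/m, assumed applied field
ts = [1.0, 2.0, 3.0, 4.0, 5.0]           # s, imaging times
xs = [4.0e-4 * t for t in ts]            # m, synthetic band centroids
var = [1.0e-9 + 2.0e-9 * t for t in ts]  # m^2, synthetic band variances

mu = linfit(ts, xs)[0] / E               # electrophoretic mobility, m^2/(V*s)
D = linfit(ts, var)[0] / 2               # diffusivity, m^2/s
```

With real images the fitted parameters would then be fed back into the simulation and checked against a further experiment, as the abstract describes.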
Designing and validation of a yoga-based intervention for obsessive compulsive disorder.
Bhat, Shubha; Varambally, Shivarama; Karmani, Sneha; Govindaraj, Ramajayam; Gangadhar, B N
2016-06-01
Some yoga-based practices have been found to be useful for patients with obsessive compulsive disorder (OCD). The authors could not find a validated yoga therapy module available for OCD. This study attempted to formulate a generic yoga-based intervention module for OCD. A yoga module was designed based on traditional and contemporary yoga literature. The module was sent to 10 yoga experts for content validation. The experts rated the usefulness of the practices on a scale of 1-5 (5 = extremely useful). The final version of the module was pilot-tested on patients with OCD (n = 17) for both feasibility and effect on symptoms. Eighty-eight per cent (22 out of 25) of the items in the initial module were retained, with modifications suggested by the experts, patients' inputs and the authors' experience. The module was found to be feasible and showed an improvement in symptoms of OCD on the total Yale-Brown Obsessive-Compulsive Scale (YBOCS) score (p = 0.001). A generic yoga therapy module for OCD was validated by experts in the field and found feasible to practice in patients. A decrease in symptom scores was also found following 2 weeks of yoga practice. Further clinical validation is warranted to confirm efficacy.
Design and validation of a method for evaluation of interocular interaction.
Lai, Xin Jie Angela; Alexander, Jack; Ho, Arthur; Yang, Zhikuan; He, Mingguang; Suttle, Catherine
2012-02-01
To design a simple viewing system allowing dichoptic masking, and to validate this system in adults and children with normal vision. A Trial Frame Apparatus (TFA) was designed to evaluate interocular interaction. This device consists of a trial frame, a 1 mm pinhole in front of the tested eye and a full or partial occluder in front of the non-tested eye. The difference in visual function in one eye between the full- and partial-occlusion conditions was termed the Interaction Index. In experiment 1, low-contrast acuity was measured in six adults using five types of partial occluder. Interaction Index was compared between these five, and the occluder showing the highest Index was used in experiment 2. In experiment 2, low-contrast acuity, contrast sensitivity, and alignment sensitivity were measured in the non-dominant eye of 45 subjects (15 older adults, 15 young adults, and 15 children), using the TFA and an existing well-validated device (shutter goggles) with full and partial occlusion of the dominant eye. These measurements were repeated on 11 subjects of each group using TFA in the partial-occlusion condition only. Repeatability of visual function measurements using TFA was assessed using the Bland-Altman method and agreement between TFA and goggles in terms of visual functions and interactions was assessed using the Bland-Altman method and t-test. In all three subject groups, the TFA showed a high level of repeatability in all visual function measurements. Contrast sensitivity was significantly poorer when measured using TFA than using goggles (p < 0.05). However, Interaction Index of all three visual functions showed acceptable agreement between TFA and goggles (p > 0.05). The TFA may provide an acceptable method for the study of some forms of dichoptic masking in populations where more complex devices (e.g., shutter goggles) cannot be used.
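The Bland-Altman analysis relied on throughout computes the mean difference (bias) between paired measurements from the two devices and the 95% limits of agreement around it. A minimal sketch; the paired acuity-style values below are invented for illustration:

```python
from statistics import mean, stdev

def bland_altman(a, b):
    """Bland-Altman bias and 95% limits of agreement between two
    measurement methods applied to the same subjects."""
    diffs = [x - y for x, y in zip(a, b)]
    bias = mean(diffs)
    sd = stdev(diffs)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Hypothetical paired logMAR acuities: TFA vs shutter goggles
tfa     = [0.42, 0.38, 0.51, 0.47, 0.40, 0.44, 0.49, 0.36]
goggles = [0.40, 0.37, 0.48, 0.47, 0.41, 0.42, 0.47, 0.35]
bias, (lo, hi) = bland_altman(tfa, goggles)
```

Agreement is judged by whether the bias is negligible and the limits of agreement are narrow enough to be clinically unimportant, which is how the study compares the TFA against the goggles.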
DE-NE0008277_PROTEUS final technical report 2018
DOE Office of Scientific and Technical Information (OSTI.GOV)
Enqvist, Andreas
This project details re-evaluations of experiments on gas-cooled fast reactor (GCFR) core designs performed in the 1970s at the PROTEUS reactor and creates a series of International Reactor Physics Experiment Evaluation Project (IRPhEP) benchmarks. Currently there are no GCFR experiments available in the International Handbook of Evaluated Reactor Physics Benchmark Experiments (IRPhEP Handbook). These experiments are excellent candidates for reanalysis and development of multiple benchmarks because they provide high-quality integral nuclear data relevant to the validation and refinement of thorium, neptunium, uranium, plutonium, iron, and graphite cross sections. It would be cost prohibitive to reproduce such a comprehensive suite of experimental data to support any future GCFR endeavors.
Age Effects and Heuristics in Decision Making*
Besedeš, Tibor; Deck, Cary; Sarangi, Sudipta; Shor, Mikhael
2011-01-01
Using controlled experiments, we examine how individuals make choices when faced with multiple options. Choice tasks are designed to mimic the selection of health insurance, prescription drug, or retirement savings plans. In our experiment, available options can be objectively ranked allowing us to examine optimal decision making. First, the probability of a person selecting the optimal option declines as the number of options increases, with the decline being more pronounced for older subjects. Second, heuristics differ by age with older subjects relying more on suboptimal decision rules. In a heuristics validation experiment, older subjects make worse decisions than younger subjects. PMID:22544977
Roy, Tapta Kanchan; Kopysov, Vladimir; Nagornova, Natalia S; Rizzo, Thomas R; Boyarkin, Oleg V; Gerber, R Benny
2015-05-18
Calculated structures of the two most stable conformers of the protonated decapeptide gramicidin S in the gas phase have been validated by comparing vibrational spectra calculated from first principles with those measured over a wide spectral range using infrared (IR)-UV double resonance cold ion spectroscopy. All 522 vibrational modes of each conformer were calculated quantum mechanically and compared with experiment without any recourse to empirical scaling. The study demonstrates that first-principles calculations, when accounting for vibrational anharmonicity, can reproduce high-resolution experimental spectra well enough to validate structures of molecules as large as 200 atoms. The validated accurate structures of the peptide may serve as templates for in silico drug design and absolute calibration of ion mobility measurements. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Use of Taguchi design of experiments to optimize and increase robustness of preliminary designs
NASA Technical Reports Server (NTRS)
Carrasco, Hector R.
1992-01-01
The research performed this summer includes the completion of work begun last summer in support of the Air Launched Personnel Launch System parametric study, support for the development of the test matrices for the plume experiments in the Plume Model Investigation Team Project, and assistance in the conceptual design of a lunar habitat. After the conclusion of last year's Summer Program, the Systems Definition Branch continued the Air Launched Personnel Launch System (ALPLS) study by running three experiments defined by L27 Orthogonal Arrays. Although the data were evaluated during the academic year, the analysis of variance and the final project review were completed this summer. The Plume Model Investigation Team (PLUMMIT) was formed by the Engineering Directorate to develop a consensus position on plume impingement loads and to validate plume flowfield models. In order to obtain a large number of individual correlated data sets for model validation, a series of plume experiments was planned. A preliminary 'full factorial' test matrix indicated that 73,024 jet firings would be necessary to obtain all of the information requested. As this was approximately 100 times more firings than the scheduled use of Vacuum Chamber A would permit, considerable effort was needed to reduce the test matrix and optimize it with respect to the specific objectives of the program. Part of the First Lunar Outpost Project deals with the Lunar Habitat. Requirements for the habitat include radiation protection, a safe haven for occasional solar flare storms, an airlock module, and consumables to support 34 extravehicular activities during a 45-day mission. The objective of the proposed work was to collaborate with the Habitat Team on the development and reusability of the Logistics Modules.
Content-Based VLE Designs Improve Learning Efficiency in Constructivist Statistics Education
Wessa, Patrick; De Rycker, Antoon; Holliday, Ian Edward
2011-01-01
Background We introduced a series of computer-supported workshops in our undergraduate statistics courses, in the hope that it would help students to gain a deeper understanding of statistical concepts. This raised questions about the appropriate design of the Virtual Learning Environment (VLE) in which such an approach had to be implemented. Therefore, we investigated two competing software design models for VLEs. In the first system, all learning features were a function of the classical VLE. The second system was designed from the perspective that learning features should be a function of the course's core content (statistical analyses), which required us to develop a specific-purpose Statistical Learning Environment (SLE) based on Reproducible Computing and newly developed Peer Review (PR) technology. Objectives The main research question is whether the second VLE design improved learning efficiency as compared to the standard type of VLE design that is commonly used in education. As a secondary objective we provide empirical evidence about the usefulness of PR as a constructivist learning activity which supports non-rote learning. Finally, this paper illustrates that it is possible to introduce a constructivist learning approach in large student populations, based on adequately designed educational technology, without subsuming educational content to technological convenience. Methods Both VLE systems were tested within a two-year quasi-experiment based on a Reliable Nonequivalent Group Design. This approach allowed us to draw valid conclusions about the treatment effect of the changed VLE design, even though the systems were implemented in successive years. The methodological aspects about the experiment's internal validity are explained extensively.
Results The effect of the design change is shown to have substantially increased the efficiency of constructivist, computer-assisted learning activities for all cohorts of the student population under investigation. The findings demonstrate that a content-based design outperforms the traditional VLE-based design. PMID:21998652
Determination of lipophilic toxins by LC/MS/MS: single-laboratory validation.
Villar-González, Adriano; Rodríguez-Velasco, María Luisa; Gago-Martínez, Ana
2011-01-01
An LC/MS/MS method has been developed, assessed, and intralaboratory-validated for the analysis of the lipophilic toxins currently regulated by European Union legislation: okadaic acid (OA) and dinophysistoxins 1 and 2, including their ester forms; azaspiracids 1, 2, and 3; pectenotoxins 1 and 2; yessotoxin (YTX) and the analogs 45-OH-YTX, homo-YTX, and 45-OH-homo-YTX; as well as for the analysis of 13-desmethyl-spirolide C. The method consists of duplicate sample extraction with methanol and direct analysis of the crude extract without further cleanup or concentration. Ester forms of OA and the dinophysistoxins are detected as the parent ions after alkaline hydrolysis of the extract. The validation of this method was performed using both fortified and naturally contaminated samples, and experiments were designed according to International Organization for Standardization, International Union of Pure and Applied Chemistry, and AOAC guidelines. With the exception of YTX in fortified samples, RSDr values were below 15% and RSDR values below 25%. Recovery values were between 77 and 95%, and LOQs were below 60 microg/kg. These data, together with validation experiments for recovery, selectivity, robustness, traceability, and linearity, as well as uncertainty calculations, are presented in this paper.
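The headline figures of merit, recovery and relative standard deviation, reduce to one-liners over replicate results. A sketch with invented replicate values (the spiking level and numbers are hypothetical, chosen merely to fall within the reported acceptance ranges):

```python
from statistics import mean, stdev

def recovery_pct(measured, spiked):
    """Mean recovery of a fortified sample, as a percentage."""
    return 100 * mean(measured) / spiked

def rsd_pct(values):
    """Relative standard deviation (RSDr within-lab repeatability
    when the replicates come from one run)."""
    return 100 * stdev(values) / mean(values)

# Hypothetical replicate results (ug/kg) for OA spiked at 160 ug/kg
reps = [150.1, 148.7, 155.2, 146.9, 151.8, 149.3]
rec = recovery_pct(reps, 160.0)
rsd = rsd_pct(reps)
```

A single-laboratory validation report tabulates these quantities per toxin and matrix and checks them against the guideline acceptance limits (here, recovery within 77-95% and RSDr below 15%).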
Nadkarni, Lindsay D; Roskind, Cindy G; Auerbach, Marc A; Calhoun, Aaron W; Adler, Mark D; Kessler, David O
2018-04-01
The aim of this study was to assess the validity of a formative feedback instrument for leaders of simulated resuscitations. This is a prospective validation study with a fully crossed (person × scenario × rater) study design. The Concise Assessment of Leader Management (CALM) instrument was designed by pediatric emergency medicine and graduate medical education experts to be used off the shelf to evaluate and provide formative feedback to resuscitation leaders. Four experts reviewed 16 videos of in situ simulated pediatric resuscitations and scored resuscitation leader performance using the CALM instrument. The videos consisted of 4 pediatric emergency department resuscitation teams each performing in 4 pediatric resuscitation scenarios (cardiac arrest, respiratory arrest, seizure, and sepsis). We report on content and internal structure (reliability) validity of the CALM instrument. Content validity was supported by the instrument development process that involved professional experience, expert consensus, focused literature review, and pilot testing. Internal structure validity (reliability) was supported by the generalizability analysis. The main component that contributed to score variability was the person (33%), meaning that individual leaders performed differently. The rater component had almost zero (0%) contribution to variance, which implies that raters were in agreement and argues for high interrater reliability. These results provide initial evidence to support the validity of the CALM instrument as a reliable assessment instrument that can facilitate formative feedback to leaders of pediatric simulated resuscitations.
Modeling Drift Compression in an Integrated Beam Experiment for Heavy-Ion-Fusion
NASA Astrophysics Data System (ADS)
Sharp, W. M.; Barnard, J. J.; Friedman, A.; Grote, D. P.; Celata, C. M.; Yu, S. S.
2003-10-01
The Integrated Beam Experiment (IBX) is an induction accelerator being designed to further develop the science base for heavy-ion fusion. The experiment is being developed jointly by Lawrence Berkeley National Laboratory, Lawrence Livermore National Laboratory, and Princeton Plasma Physics Laboratory. One conceptual approach would first accelerate a 0.5-1 A beam of singly charged potassium ions to 5 MeV, impose a head-to-tail velocity tilt to compress the beam longitudinally, and finally focus the beam radially using a series of quadrupole lenses. The lengthwise compression is a critical step because the radial size must be controlled as the current increases, and the beam emittance must be kept minimal. The work reported here first uses the moment-based model HERMES to design the drift-compression beam line and to assess the sensitivity of the final beam profile to beam and lattice errors. The particle-in-cell code WARP is then used to validate the physics design, study the phase-space evolution, and quantify the emittance growth.
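In the simplest ballistic picture (space charge and emittance neglected, so much cruder than the HERMES and WARP treatments), a head-to-tail velocity tilt Δv imposed on a bunch of length L collapses it after a drift time L/Δv, giving a focal drift length v·L/Δv that depends only on the fractional tilt. A sketch with assumed IBX-like numbers (the 0.5 m bunch length and 5% tilt are illustrative, not design values):

```python
import math

def drift_to_focus(bunch_length, v_mean, tilt):
    """Ballistic drift distance at which a linear head-to-tail
    velocity tilt collapses the bunch (no space charge)."""
    return v_mean * bunch_length / tilt

# 5 MeV singly charged K-39 ions (non-relativistic)
E = 5.0e6 * 1.602e-19                   # kinetic energy, J
m = 39 * 1.661e-27                      # ion mass, kg
v = math.sqrt(2 * E / m)                # mean velocity, m/s
d = drift_to_focus(0.5, v, 0.05 * v)    # 0.5 m bunch, 5% fractional tilt
```

In the real design, space charge stagnates the compression before the ballistic focus, which is exactly the effect the moment and particle-in-cell models are used to quantify.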
A Robust Inner and Outer Loop Control Method for Trajectory Tracking of a Quadrotor
Xia, Dunzhu; Cheng, Limei; Yao, Yanhong
2017-01-01
In order to achieve the complicated trajectory tracking of quadrotor, a geometric inner and outer loop control scheme is presented. The outer loop generates the desired rotation matrix for the inner loop. To improve the response speed and robustness, a geometric SMC controller is designed for the inner loop. The outer loop is also designed via sliding mode control (SMC). By Lyapunov theory and cascade theory, the closed-loop system stability is guaranteed. Next, the tracking performance is validated by tracking three representative trajectories. Then, the robustness of the proposed control method is illustrated by trajectory tracking in presence of model uncertainty and disturbances. Subsequently, experiments are carried out to verify the method. In the experiment, ultra wideband (UWB) is used for indoor positioning. Extended Kalman Filter (EKF) is used for fusing inertial measurement unit (IMU) and UWB measurements. The experimental results show the feasibility of the designed controller in practice. The comparative experiments with PD and PD loop demonstrate the robustness of the proposed control method. PMID:28925984
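The inner-loop sliding mode idea can be illustrated on a single attitude axis modeled as a double integrator: define the sliding variable s = ė + λe and apply a saturated switching law so the state first reaches the surface s = 0 and then slides to the origin. The gains, boundary layer, and scalar model below are illustrative assumptions, far simpler than the geometric SO(3) formulation of the paper:

```python
def smc(e, edot, lam=2.0, k=4.0, eps=0.05):
    """Sliding-mode control for a double integrator; the boundary
    layer eps replaces sign(s) with a saturation to avoid chattering."""
    s = edot + lam * e
    return -k * max(-1.0, min(1.0, s / eps))

# Simulate x'' = u tracking x_d = 0 from x(0) = 1 (Euler integration)
x, xdot, dt = 1.0, 0.0, 0.001
for _ in range(5000):                   # 5 s of simulated time
    u = smc(x, xdot)
    xdot += u * dt
    x += xdot * dt
```

Once on the surface the error decays at the rate λ regardless of matched disturbances, which is the robustness property the paper's comparative experiments exploit.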
Model-Based Engineering Design for Trade Space Exploration throughout the Design Cycle
NASA Technical Reports Server (NTRS)
Lamassoure, Elisabeth S.; Wall, Stephen D.; Easter, Robert W.
2004-01-01
This paper presents ongoing work to standardize model-based system engineering as a complement to point design development in the conceptual design phase of deep space missions. It summarizes two first steps towards practical application of this capability within the framework of concurrent engineering design teams and their customers. The first step is the standardized generation of system sensitivity models as the output of concurrent engineering design sessions, representing the local trade space around a point design. A review of the chosen model development process, and the results of three case study examples, demonstrate that a simple update to the concurrent engineering design process can easily capture sensitivities to key requirements. It can serve as a valuable tool to analyze design drivers and uncover breakpoints in the design. The second step is the development of rough-order-of-magnitude, broad-range-of-validity design models for rapid exploration of the trade space, before selection of a point design. At least one case study demonstrated the feasibility of generating such models in a concurrent engineering session. The experiment indicated that such a capability could yield valid system-level conclusions for a trade space composed of understood elements. Ongoing efforts are assessing the practicality of developing end-to-end system-level design models for use before even convening the first concurrent engineering session, starting with modeling an end-to-end Mars architecture.
Physical models and primary design of reactor based slow positron source at CMRR
NASA Astrophysics Data System (ADS)
Wang, Guanbo; Li, Rundong; Qian, Dazhi; Yang, Xin
2018-07-01
Slow positron facilities are widely used in materials science. A high-intensity slow positron source, based on the China Mianyang Research Reactor (CMRR), is now at the design stage. This paper describes the physical models and our primary design. We use different computer programs or mathematical formulas to simulate the different physical processes, and validate them with appropriate experiments. Considering feasibility, we propose a primary design containing a cadmium shield, a honeycomb-arranged assembly of tungsten (W) tubes, electrical lenses, and a solenoid. It is planned to be vertically inserted into the Si-doping channel, and the beam intensity is expected to be 5 × 10^9
Harvard ER-2 OH laser-induced fluorescence instrument
NASA Technical Reports Server (NTRS)
Wennberg, Paul O.; Anderson, James G.
1994-01-01
The Harvard ER-2 OH instrument is scheduled to be integrated into the NASA ER-2 high altitude aircraft ozone payload in August 1992. Design and fabrication are presently underway. This experiment is a descendant of a balloon-borne instrument designed and built in the mid-1980s. The ER-2 instrument is being designed to measure OH and HO2 as part of the NASA ozone payload for the investigation of processes controlling the concentration of stratospheric ozone. Although not specifically designed to do so, it is hoped that valid measurements of OH and HO2 can be made in the remote free troposphere with this instrument.
Single element injector testing for STME injector technology
NASA Technical Reports Server (NTRS)
Hulka, J.; Schneider, J. A.; Davis, J.
1992-01-01
An oxidizer-swirled coaxial element injector is being developed for application in the liquid oxygen/gaseous hydrogen Space Transportation Main Engine (STME) for the National Launch System (NLS) vehicle. This paper reports on the first two parts of a four-part single injector element study for optimization of the STME injector design. Measurements of Rupe mixing efficiency and atomization characteristics are reported for single element versions of injection elements from two multielement injectors that have recently been hot-fire tested. Rather than attempting to measure a definitive mixing efficiency or droplet size parameters of these injector elements, the purpose of these experiments was to provide a baseline comparison for evaluating future injector element design modifications. Hence, all the experiments reported here were conducted with cold-flow simulants under nonflowing, ambient conditions. Mixing experiments were conducted with liquid/liquid simulants to provide economical trend data. Atomization experiments were conducted with liquid/gas simulants without backpressure. The results, despite significant differences from hot-fire conditions, were found to relate to mixing and atomization parameters deduced from the hot-fire testing, suggesting that these experiments are valid for trend analyses. Single element and subscale multielement hot-fire testing will verify optimized designs before committing to full-scale fabrication.
Design And Ground Testing For The Expert PL4/PL5 'Natural And Roughness Induced Transition'
NASA Astrophysics Data System (ADS)
Masutti, Davie; Chazot, Olivier; Donelli, Raffaele; de Rosa, Donato
2011-05-01
Unpredicted boundary layer transition can dramatically impact the stability of a vehicle and its aerodynamic coefficients, and reduce the efficiency of the thermal protection system. In this frame, ESA started the EXPERT (European eXPErimental Reentry Testbed) program to provide and perform in-flight experiments in order to obtain aerothermodynamic data for the validation of numerical models and of ground-to-flight extrapolation methodologies. For the boundary layer transition investigation, the EXPERT vehicle is equipped with two specific payloads, PL4 and PL5, concerning respectively the study of natural and roughness-induced transition. The paper is a survey of the design process of these two in-flight experiments and covers the major analyses and findings encountered during development of the payloads. A large number of transition criteria have been investigated and used to estimate either the criticality of the distributed roughness height arising from nose erosion, or the effectiveness of the isolated roughness element height in forcing boundary layer transition. Supporting the PL4 design, linear stability computations and CFD analyses have been performed by CIRA on the EXPERT flight vehicle to determine the amplification factor of the boundary layer instabilities at different points of the re-entry trajectory. Ground test experiments regarding PL5 are carried out in the Mach 6 VKI H3 Hypersonic Wind Tunnel at Reynolds numbers ranging from 18E6/m to 26E6/m. Infrared measurements (Stanton number) and flow visualization are used on a 1/16-scale model of the EXPERT vehicle and a flat plate to validate the Potter and Whitfield criterion as a suitable methodology for ground-to-flight extrapolation and payload design.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Martin, Madison E.
Opacity is a critical parameter in the simulation of radiation transport in systems such as inertial confinement fusion capsules and stars. The resolution of current disagreements between solar models and helioseismological observations would benefit from experimental validation of theoretical opacity models. Overall, short pulse laser heated iron experiments reaching stellar-relevant conditions have been designed with consideration of minimizing tamper emission and optical depth effects while meeting plasma condition and x-ray emission goals.
High-speed inlet research program and supporting analysis
NASA Technical Reports Server (NTRS)
Coltrin, Robert E.
1990-01-01
The technology challenges faced by the high speed inlet designer are discussed by describing the considerations that went into the design of the Mach 5 research inlet. It is shown that the emerging three dimensional viscous computational fluid dynamics (CFD) flow codes, together with small scale experiments, can be used to guide larger scale full inlet systems research. Then, in turn, the results of the large scale research, if properly instrumented, can be used to validate or at least to calibrate the CFD codes.
Development of N-version software samples for an experiment in software fault tolerance
NASA Technical Reports Server (NTRS)
Lauterbach, L.
1987-01-01
The report documents the task planning and software development phases of an effort to obtain twenty versions of code independently designed and developed from a common specification. These versions were created for use in future experiments in software fault tolerance, in continuation of the experimental series underway at the Systems Validation Methods Branch (SVMB) at NASA Langley Research Center. The 20 versions were developed under controlled conditions at four U.S. universities, by 20 teams of two researchers each. The versions process raw data from a modified Redundant Strapped Down Inertial Measurement Unit (RSDIMU). The specifications, and over 200 questions submitted by the developers concerning the specifications, are included as appendices to this report. Design documents, and design and code walkthrough reports for each version, were also obtained in this task for use in future studies.
Rubashkin, Nicholas; Szebik, Imre; Baji, Petra; Szántó, Zsuzsa; Susánszky, Éva; Vedam, Saraswathi
2017-11-16
Instruments to assess the quality of maternity care in the Central and Eastern European (CEE) region are scarce, despite reports of poor doctor-patient communication, non-evidence-based care, and informal cash payments. We validated and tested an online questionnaire to study maternity care experiences among Hungarian women. Following a literature review, we collated validated items and scales from two previous English-language surveys and adapted them to the Hungarian context. An expert panel assessed items for clarity and relevance on a 4-point ordinal scale, and we calculated item-level Content Validation Index (CVI) scores. We designed 9 new items concerning informal cash payments, as well as 7 new "model of care" categories based on mode of payment. The final questionnaire (N = 111 items) was tested in two samples of Hungarian women, representative (N = 600) and convenience (N = 657). We conducted bivariate analysis and thematic analysis of open-ended responses. Experts rated the pre-existing English-language items as clear and relevant to Hungarian women's maternity care experiences, with an average CVI for included questions of 0.97. Significant differences emerged across the model of care categories in terms of informal payments, informed consent practices, and women's perceptions of autonomy. Thematic analysis (N = 1015) of women's responses identified 13 priority areas of the maternity care experience, 9 of which were addressed by the questionnaire. We developed and validated a comprehensive questionnaire that can be used to evaluate respectful maternity care, evidence-based practice, and informal cash payments in the CEE region and beyond.
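The item-level CVI computation mentioned above is simple to state: an item's I-CVI is the proportion of experts who rate it relevant (3 or 4 on the 4-point scale), and a scale-level CVI can be taken as the average of the item-level values. A minimal sketch, with function names that are ours rather than the paper's:

```python
def item_cvi(ratings, relevant={3, 4}):
    """Item-level Content Validity Index: the share of experts who
    rated the item 3 or 4 on the 4-point relevance scale."""
    return sum(r in relevant for r in ratings) / len(ratings)

def scale_cvi_avg(items):
    """Scale-level CVI, averaging method: the mean of the I-CVIs
    across all items (one rating list per item)."""
    cvis = [item_cvi(ratings) for ratings in items]
    return sum(cvis) / len(cvis)
```

For example, an item rated 4, 4, 3, 2 by four experts has an I-CVI of 0.75; a commonly used retention threshold is an I-CVI of at least 0.78 for panels of this size, though the cutoff is a design choice.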
Inter-Disciplinary Collaboration in Support of the Post-Standby TREAT Mission
DOE Office of Scientific and Technical Information (OSTI.GOV)
DeHart, Mark; Baker, Benjamin; Ortensi, Javier
Although analysis methods have advanced significantly in the last two decades, high-fidelity multi-physics methods for reactor systems have been under development for only a few years and are not yet mature or deployed. Furthermore, very few methods provide the ability to simulate rapid transients in three dimensions. Data for validation of advanced time-dependent multi-physics is sparse; at TREAT, historical data were not collected for the purpose of validating three-dimensional methods, let alone multi-physics simulations. Existing data continues to be collected to attempt to simulate the behavior of experiments and calibration transients, but it will be insufficient for the complete validation of analysis methods used for TREAT transient simulations. Hence, a 2018 restart will most likely occur without the direct application of advanced modeling and simulation methods. At present, the INL modeling and simulation team plans to work with TREAT operations staff in performing reactor simulations with MAMMOTH, in parallel with the software packages currently being used in preparation for core restart (e.g., MCNP5, RELAP5, ABAQUS). The TREAT team has also requested specific measurements to be performed during startup testing, currently scheduled to run from February to August of 2018. These startup measurements will be crucial in validating the new analysis methods in preparation for ultimate application to TREAT operations and experiment design. This document describes the collaboration between the modeling and simulation staff and the restart, operations, instrumentation, and experiment development teams needed to interact effectively and achieve successful validation work during restart testing.
The NASA CloudSat/GPM Light Precipitation Validation Experiment (LPVEx)
NASA Technical Reports Server (NTRS)
Petersen, Walter A.; L'Ecuyer, Tristan; Moisseev, Dmitri
2011-01-01
Ground-based measurements of cool-season precipitation at mid and high latitudes (e.g., above 45 deg N/S) suggest that a significant fraction of the total precipitation volume falls in the form of light rain, i.e., at rates less than or equal to a few mm/h. These cool-season light rainfall events often originate in situations of a low-altitude (e.g., lower than 2 km) melting level and pose a significant challenge to the fidelity of all satellite-based precipitation measurements, especially those relying on the use of multifrequency passive microwave (PMW) radiometers. As a result, significant disagreements exist between satellite estimates of rainfall accumulation poleward of 45 deg. Ongoing efforts to develop, improve, and ultimately evaluate physically-based algorithms designed to detect and accurately quantify high latitude rainfall, however, suffer from a general lack of detailed, observationally-based ground validation datasets. These datasets serve as a physically consistent framework from which to test and refine algorithm assumptions, and as a means to build the library of algorithm retrieval databases in higher latitude cold-season light precipitation regimes. These databases are especially relevant to NASA's CloudSat and Global Precipitation Measurement (GPM) ground validation programs that are collecting high-latitude precipitation measurements in meteorological systems associated with frequent cool-season light precipitation events. In an effort to improve the inventory of cool-season high-latitude light precipitation databases and advance the physical process assumptions made in satellite-based precipitation retrieval algorithm development, the CloudSat and GPM mission ground validation programs collaborated with the Finnish Meteorological Institute (FMI), the University of Helsinki (UH), and Environment Canada (EC) to conduct the Light Precipitation Validation Experiment (LPVEx).
The LPVEx field campaign was designed to make detailed measurements of cool-season light precipitation by leveraging existing infrastructure in the Helsinki Precipitation Testbed. LPVEx was conducted during September-October 2010 and featured coordinated ground and airborne remote sensing components designed to observe and quantify the precipitation physics associated with light rain in low-altitude melting layer environments over the Gulf of Finland and the neighboring land mass surrounding Helsinki, Finland.
An industrial approach to design compelling VR and AR experience
NASA Astrophysics Data System (ADS)
Richir, Simon; Fuchs, Philippe; Lourdeaux, Domitile; Buche, Cédric; Querrec, Ronan
2013-03-01
The convergence of technologies currently observed in the fields of VR, AR, robotics and consumer electronics reinforces the trend of new applications appearing every day. But when transferring knowledge acquired from research to businesses, research laboratories are often at a loss because of a lack of knowledge of the design and integration processes involved in creating a product at industrial scale. In fact, the innovation approaches that take a good idea from the laboratory to a successful industrial product are often little known to researchers. The objective of this paper is to present the results of the work of several research teams that have finalized a working method for researchers and manufacturers that allows them to design virtual or augmented reality systems and enables their users to enjoy "a compelling VR experience". That approach, called "the I2I method", presents 11 phases from "Establishing technological and competitive intelligence and industrial property" to "Improvements", through the "Definition of the Behavioral Interface, Virtual Environment and Behavioral Software Assistance". As a result of the experience gained by various research teams, this design approach benefits from contributions from current VR and AR research. Our objective is to validate and continuously move such multidisciplinary design team methods forward.
NASA Technical Reports Server (NTRS)
Pavlock, Kate M.
2011-01-01
The National Aeronautics and Space Administration's Dryden Flight Research Center completed flight testing of adaptive controls research on the Full-Scale Advanced Systems Testbed (FAST) in January of 2011. The research addressed technical challenges involved with reducing risk in an increasingly complex and dynamic national airspace. Specific challenges lie with the development of validated, multidisciplinary, integrated aircraft control design tools and techniques to enable safe flight in the presence of adverse conditions such as structural damage, control surface failures, or aerodynamic upsets. The testbed is an F-18 aircraft serving as a full-scale vehicle to test and validate adaptive flight control research and lends significant confidence to the development, maturation, and acceptance process of incorporating adaptive control laws into follow-on research and the operational environment. The experimental systems integrated into FAST were designed to allow for flexible yet safe flight test evaluation and validation of modern adaptive control technologies and revolve around two major hardware upgrades: the modification of Production Support Flight Control Computers (PSFCC) and integration of two fourth-generation Airborne Research Test Systems (ARTS). Post-hardware-integration verification and validation provided the foundation for safe flight test of Nonlinear Dynamic Inversion and Model Reference Aircraft Control adaptive control law experiments. To ensure success of flight in terms of cost, schedule, and test results, emphasis on risk management was incorporated into early stages of design and flight test planning and continued through the execution of each flight test mission. Specific consideration was made to incorporate safety features within the hardware and software to alleviate user demands, as well as into test processes and training to reduce human factor impacts to safe and successful flight test.
This paper describes the research configuration, experiment functionality, overall risk mitigation, flight test approach and results, and lessons learned of adaptive controls research of the Full-Scale Advanced Systems Testbed.
Chaspari, Theodora; Soldatos, Constantin; Maragos, Petros
2015-01-01
The development of ecologically valid procedures for collecting reliable and unbiased emotional data is a prerequisite for computer interfaces with social and affective intelligence targeting patients with mental disorders. To this end, the Athens Emotional States Inventory (AESI) proposes the design, recording and validation of an audiovisual database for five emotional states: anger, fear, joy, sadness and neutral. The items of the AESI consist of sentences, each having content indicative of the corresponding emotion. Emotional content was assessed through a survey of 40 young participants with a questionnaire following a Latin square design. The emotional sentences that were correctly identified by 85% of the participants were recorded in a soundproof room with microphones and cameras. A preliminary validation of the AESI is performed through automatic emotion recognition experiments from speech. The resulting database contains 696 utterances in the Greek language recorded by 20 native speakers, with a total duration of approximately 28 min. Speech classification yields accuracy of up to 75.15% for automatically recognizing the emotions in the AESI. These results indicate the usefulness of our approach for collecting emotional data with reliable content, balanced across classes and with reduced environmental variability.
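The Latin square design used to balance the questionnaire can be sketched generically: row i of an n x n Latin square gives the item order shown to participant group i, so every item appears in every serial position exactly once across groups. A minimal cyclic construction, which is one standard way to build such a square but not necessarily the authors' exact design:

```python
def latin_square(n):
    """Cyclic n x n Latin square: row i is a shift of 0..n-1 by i,
    so each symbol occurs once per row and once per column."""
    return [[(i + j) % n for j in range(n)] for i in range(n)]

def is_latin(square):
    """Check the Latin square property: every row and every column
    is a permutation of 0..n-1."""
    n = len(square)
    target = list(range(n))
    rows_ok = all(sorted(row) == target for row in square)
    cols_ok = all(sorted(col) == target for col in zip(*square))
    return rows_ok and cols_ok
```

Counterbalancing this way removes order effects from the group-level comparison, since no item systematically benefits from appearing first or last.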
Pigache, Francois; Messine, Frédéric; Nogarede, Bertrand
2007-07-01
This paper deals with a deterministic and rational way to design piezoelectric transformers in radial mode. The proposed approach is based on the study of the inverse problem of design and on its reformulation as a mixed constrained global optimization problem. The methodology relies on the association of the analytical models for describing the corresponding optimization problem and on an exact global optimization software, named IBBA and developed by the second author to solve it. Numerical experiments are presented and compared in order to validate the proposed approach.
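The exact global optimization behind a solver like IBBA is interval branch-and-bound: compute a guaranteed enclosure of the objective over each box, discard boxes whose lower bound exceeds the best known upper bound, and bisect the survivors. The real IBBA handles mixed constrained multivariate problems; the following is only a toy 1-D sketch under assumed interfaces, with `f_bounds` standing in for interval evaluation of the objective:

```python
def branch_and_bound(f_bounds, lo, hi, tol=1e-4):
    """Toy interval branch-and-bound for 1-D global minimization.
    f_bounds(a, b) must return a guaranteed (lower, upper) enclosure
    of f over [a, b]. Boxes whose lower bound exceeds the best known
    upper bound cannot contain the global minimum and are pruned."""
    best_ub = float("inf")
    boxes = [(lo, hi)]
    while boxes:
        a, b = boxes.pop()
        lb, ub = f_bounds(a, b)
        best_ub = min(best_ub, ub)      # any box's sup bounds the minimum
        if lb > best_ub or b - a < tol:
            continue                    # pruned, or box small enough
        m = 0.5 * (a + b)
        boxes += [(a, m), (m, b)]       # bisect and recurse (depth-first)
    return best_ub
```

For instance, minimizing f(x) = (x - 1)^2 on [-2, 3] with an exact enclosure converges to the true minimum 0; the guaranteed bounds are what make the result deterministic rather than heuristic.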
VALUE - A Framework to Validate Downscaling Approaches for Climate Change Studies
NASA Astrophysics Data System (ADS)
Maraun, Douglas; Widmann, Martin; Gutiérrez, José M.; Kotlarski, Sven; Chandler, Richard E.; Hertig, Elke; Wibig, Joanna; Huth, Radan; Wilke, Renate A. I.
2015-04-01
VALUE is an open European network to validate and compare downscaling methods for climate change research. VALUE aims to foster collaboration and knowledge exchange between climatologists, impact modellers, statisticians, and stakeholders to establish an interdisciplinary downscaling community. A key deliverable of VALUE is the development of a systematic validation framework to enable the assessment and comparison of both dynamical and statistical downscaling methods. Here, we present the key ingredients of this framework. VALUE's main approach to validation is user-focused: starting from a specific user problem, a validation tree guides the selection of relevant validation indices and performance measures. Several experiments have been designed to isolate specific points in the downscaling procedure where problems may occur: what is the isolated downscaling skill? How do statistical and dynamical methods compare? How do methods perform at different spatial scales? Do methods fail in representing regional climate change? How well is regional climate represented overall, including errors inherited from global climate models? The framework will be the basis for a comprehensive community-open downscaling intercomparison study, but is intended also to provide general guidance for other validation studies.
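As an illustration of what "validation indices and performance measures" can look like in practice, here is a minimal sketch comparing a downscaled daily precipitation series against observations. The particular indices are our choice for illustration, not a prescribed VALUE list:

```python
import numpy as np

def validation_indices(downscaled, observed):
    """A few illustrative marginal and temporal validation indices:
    mean bias, ratio of standard deviations, Pearson correlation,
    and wet-day frequency bias (wet day taken as > 1 mm)."""
    d = np.asarray(downscaled, dtype=float)
    o = np.asarray(observed, dtype=float)
    return {
        "bias": d.mean() - o.mean(),
        "std_ratio": d.std() / o.std(),
        "correlation": np.corrcoef(d, o)[0, 1],
        "wet_day_freq_bias": (d > 1.0).mean() - (o > 1.0).mean(),
    }
```

A validation tree would select indices like these depending on the user problem: marginal indices for mean climate, correlation only where day-to-day correspondence is meaningful, and threshold-based indices for impact-relevant events.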
Analysis and Ground Testing for Validation of the Inflatable Sunshield in Space (ISIS) Experiment
NASA Technical Reports Server (NTRS)
Lienard, Sebastien; Johnston, John; Adams, Mike; Stanley, Diane; Alfano, Jean-Pierre; Romanacci, Paolo
2000-01-01
The Next Generation Space Telescope (NGST) design requires a large sunshield to protect the large aperture mirror and instrument module from constant solar exposure at its L2 orbit. The structural dynamics of the sunshield must be modeled in order to predict disturbances to the observatory attitude control system and gauge effects on the line of site jitter. Models of large, non-linear membrane systems are not well understood and have not been successfully demonstrated. To answer questions about sunshield dynamic behavior and demonstrate controlled deployment, the NGST project is flying a Pathfinder experiment, the Inflatable Sunshield in Space (ISIS). This paper discusses in detail the modeling and ground-testing efforts performed at the Goddard Space Flight Center to: validate analytical tools for characterizing the dynamic behavior of the deployed sunshield, qualify the experiment for the Space Shuttle, and verify the functionality of the system. Included in the discussion will be test parameters, test setups, problems encountered, and test results.
Preliminary experiments on pharmacokinetic diffuse fluorescence tomography of CT-scanning mode
NASA Astrophysics Data System (ADS)
Zhang, Yanqi; Wang, Xin; Yin, Guoyan; Li, Jiao; Zhou, Zhongxing; Zhao, Huijuan; Gao, Feng; Zhang, Limin
2016-10-01
In vivo tomographic imaging of fluorescence pharmacokinetic parameters in tissue can provide specific, quantitative physiological and pathological information beyond that of fluorescence concentration. This modality normally requires a highly sensitive diffuse fluorescence tomography (DFT) system working in a dynamic way to extract the pharmacokinetic parameters from the measured pharmacokinetics-associated, temporally varying boundary intensity. This paper is devoted to preliminary experimental validation of our proposed direct reconstruction scheme for instantaneous-sampling-based pharmacokinetic DFT: a highly sensitive DFT system of CT-scanning mode, working with four parallel photomultiplier-tube photon-counting channels, is developed to generate an instantaneous sampling dataset; a direct reconstruction scheme then extracts images of the pharmacokinetic parameters using an adaptive extended-Kalman-filter (EKF) strategy. We design a dynamic phantom that simulates agent metabolism in living tissue. The results of the dynamic phantom experiments verify the validity of the experimental system and reconstruction algorithms, and demonstrate that the system provides good resolution, high sensitivity and quantitativeness at different pump speeds.
ERIC Educational Resources Information Center
Layton, Richard A.; Loughry, Misty L.; Ohland, Matthew W.; Ricco, George D.
2010-01-01
A significant body of research identifies a large number of team composition characteristics that affect the success of individuals and teams in cooperative learning and project-based team environments. Controlling these factors when assigning students to teams should result in improved learning experiences. However, it is very difficult for…
ERIC Educational Resources Information Center
Yang, Ya-Ting C.; Gamble, Jeffrey; Tang, Shiun-Yi S.
2012-01-01
The challenge of providing authentic experiences and interactions for fostering oral proficiency and motivation in foreign languages is an opportunity for innovation in educational technology and instructional design. Although several recent innovations have received the attention of scholars, empirical investigation and validation is often…
Child Care Services IV: Activities That Teach, Home and Family Education: 6755.05.
ERIC Educational Resources Information Center
Ahrens, Thea
This course is designed for senior high school students interested in early childhood education and gives the Child Care Aide experience in planning and executing activities with children in group situations which reflect knowledge of their individual development. The course centers on the following concepts: play is valid, development of the…
ERIC Educational Resources Information Center
Floyd, Nancy D.
2012-01-01
Higher education in the United States is replete with inventories and instruments designed to help administrators to identify students who are more likely to succeed in college and to tailor the higher education experience to foster this success. One area of research involves the Holland vocational personality type (Holland 1973, 1985, 1997)…
ERIC Educational Resources Information Center
Tharayil, Davis Porinchu
2012-01-01
As the existing scales to measure loneliness are almost all Western and there is no single scale developed cross-culturally for this purpose, this study is designed to develop a reliable and valid scale to measure the experience of loneliness of individuals from individualistic or collectivistic cultures. There are three samples for this study…
ERIC Educational Resources Information Center
Acharya, Sushil; Manohar, Priyadarshan Anant; Wu, Peter; Maxim, Bruce; Hansen, Mary
2018-01-01
Active learning tools are critical in imparting real world experiences to the students within a classroom environment. This is important because graduates are expected to develop software that meets rigorous quality standards in functional and application domains with little to no training. However, there is a well-recognized need for the…
ERIC Educational Resources Information Center
Bottema-Beutel, Kristen; Mullins, Teagan S.; Harvey, Michelle N.; Gustafson, Jenny R.; Carter, Erik W.
2016-01-01
Many youth with autism spectrum disorder participate in school-based, peer-mediated intervention programs designed to improve their social experiences. However, there is little research discerning how these youth view intervention practices currently represented in the literature, information which could improve the social validity of intervention…
MURI: Impact of Oceanographic Variability on Acoustic Communications
2011-09-01
multiplexing (OFDM), multiple-input/multiple-output (MIMO) transmissions, and multi-user single-input/multiple-output (SIMO) communications. Lastly... "MIMO-OFDM communications: Receiver design for Doppler distorted underwater acoustic channels," Proc. Asilomar Conf. on Signals, Systems, and... MIMO) will be of particular interest. Validating experimental data will be obtained during the ONR acoustic communications experiment in summer 2008
Thinking out of the Exams Box: Assessment through Talk?
ERIC Educational Resources Information Center
Coultas, Valerie
2017-01-01
This article examines the abandonment of talk-based assessment in favour of written exams, even when writing results in less valid assessment. It points to substantial experience of assessment through talk in English and media studies and points to its potential use in other subjects. It is followed by an example, originally designed by the…
Block 2 SRM conceptual design studies. Volume 1, Book 1: Conceptual design package
NASA Technical Reports Server (NTRS)
Smith, Brad; Williams, Neal; Miller, John; Ralston, Joe; Richardson, Jennifer; Moore, Walt; Doll, Dan; Maughan, Jeff; Hayes, Fred
1986-01-01
The conceptual design studies of a Block 2 Solid Rocket Motor (SRM) required the elimination of asbestos-filled insulation and were open to alternate designs, such as case changes, different propellants, and modified burn rates, to improve reliability and performance. Limitations were placed on SRM changes such that the outside geometry should not impact the physical interfaces with other Space Shuttle elements and should involve minimum changes to the aerodynamic and dynamic characteristics of the Space Shuttle vehicle. Previous Space Shuttle SRM experience was assessed and new design concepts were combined to define a valid approach to assured flight success and economic operation of the STS. Trade studies, preliminary designs, analyses, plans, and cost estimates are documented.
The project ownership survey: measuring differences in scientific inquiry experiences.
Hanauer, David I; Dolan, Erin L
2014-01-01
A growing body of research documents the positive outcomes of research experiences for undergraduates, including increased persistence in science. Study of undergraduate lab learning experiences has demonstrated that the design of the experience influences the extent to which students report ownership of the project and that project ownership is one of the psychosocial factors involved in student retention in the sciences. To date, methods for measuring project ownership have not been suitable for the collection of larger data sets. The current study aims to rectify this by developing, presenting, and evaluating a new instrument for measuring project ownership. Eighteen scaled items were generated based on prior research and theory related to project ownership and combined with 30 items shown to measure respondents' emotions about an experience, resulting in the Project Ownership survey (POS). The POS was analyzed to determine its dimensionality, reliability, and validity. The POS had a coefficient alpha of 0.92 and thus has high internal consistency. Known-groups validity was analyzed through the ability of the instrument to differentiate between students who studied in traditional versus research-based laboratory courses. The POS scales differentiated between the groups, and findings paralleled previous results regarding the characteristics of project ownership.
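The coefficient alpha of 0.92 reported above is Cronbach's alpha, which can be computed directly from an item-response matrix. A minimal sketch follows; this is a generic illustration, not the authors' analysis code, and the POS item data are not reproduced here:

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's coefficient alpha for an (n_respondents, k_items) matrix.

    alpha = k/(k-1) * (1 - sum of item variances / variance of total score)
    """
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_var_sum = items.var(axis=0, ddof=1).sum()   # per-item variances
    total_var = items.sum(axis=1).var(ddof=1)        # variance of summed score
    return (k / (k - 1)) * (1.0 - item_var_sum / total_var)

# Perfectly consistent responses give alpha = 1.0.
print(cronbach_alpha([[1, 1, 1], [2, 2, 2], [3, 3, 3]]))
```

An alpha of 0.92, as reported for the POS, indicates the 18 scaled items covary strongly enough to be treated as a single scale.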
Development of Eulerian Code Modeling for ICF Experiments
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bradley, Paul A.
2014-02-27
One of the most pressing unexplained phenomena standing in the way of ICF ignition is mix and how it interacts with burn. Experiments were designed and fielded as part of the Defect-Induced Mix Experiment (DIME) project to obtain data about the extent of material mix and how this mix influenced burn. Experiments on the Omega laser and the National Ignition Facility (NIF) provided detailed data for comparison to the Eulerian code RAGE. The Omega experiments were able to resolve the mix and provide “proof of principle” support for subsequent NIF experiments, which were fielded from July 2012 through June 2013. The Omega shots were fired at least once per year between 2009 and 2012. RAGE was not originally designed to model inertial confinement fusion (ICF) implosions; it still lacks a laser package, so the code has been validated using an energy source instead. To test RAGE, the simulation output is compared to data by means of postprocessing tools that were developed for this purpose. Here, the various postprocessing tools are described with illustrative examples.
Design of a water electrolysis flight experiment
NASA Technical Reports Server (NTRS)
Lee, M. Gene; Grigger, David J.; Thompson, C. Dean; Cusick, Robert J.
1993-01-01
Supply of oxygen (O2) and hydrogen (H2) by electrolyzing water in space will play an important role in meeting the National Aeronautics and Space Administration's (NASA's) needs and goals for future space missions. Both O2 and H2 are envisioned to be used in a variety of processes including crew life support, spacecraft propulsion, extravehicular activity, and electrical power generation/storage, as well as in scientific experiments and manufacturing processes. The Electrolysis Performance Improvement Concept Study (EPICS) flight experiment described herein is sponsored by NASA Headquarters as a part of the In-Space Technology Experiment Program (IN-STEP). The objective of the EPICS is to further the improvement of SFE technology, specifically by demonstrating and validating the SFE electromechanical process in microgravity as well as investigating performance improvements projected to be possible in a microgravity environment. This paper defines the experiment objective and presents the results of the preliminary design of the EPICS. The experiment will include testing of three subscale self-contained SFE units: one containing baseline components, and two units having variations in key component materials. Tests will be conducted at varying current and thermal conditions.
A Fresnel collector process heat experiment at Capitol Concrete Products
NASA Technical Reports Server (NTRS)
Hauger, J. S.
1981-01-01
An experiment is planned, conducted and evaluated to determine the feasibility of using a Power Kinetics' Fresnel concentrator to provide process heat in an industrial environment. The plant provides process steam at 50 to 60 psig to two autoclaves for curing masonry blocks. When steam is not required, the plant preheats hot water for later use. A second system is installed at the Jet Propulsion Laboratory parabolic dish test site for hardware validation and experiment control. Experiment design allows for the extrapolation of results to varying demands for steam and hot water, and includes a consideration of some socio-technical factors such as the impact on production scheduling of diurnal variations in energy availability.
The Development of a Web-Based Urban Soundscape Evaluation System
NASA Astrophysics Data System (ADS)
Sudarsono, A. S.; Sarwono, J.
2018-05-01
Acoustic quality is one of the important aspects of urban design. It is usually evaluated based on how loud the urban environment is. However, this approach does not consider people’s perception of the urban acoustic environment. Therefore, a different method has been developed based on the perception of the acoustic environment using the concept of soundscape. Soundscape is defined as the acoustic environment perceived by people who are part of the environment. This approach considers the relationship between the sound source, the environment, and the people. The analysis of soundscape considers many aspects such as cultural aspects, people’s expectations, people’s experience of space, and social aspects. Soundscape affects many aspects of human life such as culture, health, and the quality of life. Urban soundscape management and planning must be integrated with the other aspects of urban design, both in the design and the improvement stages. The soundscape concept seeks to make the acoustic environment as pleasant as possible in a space with or without uncomfortable sound sources. Soundscape planning includes the design of physical features to achieve a positive perceptual outcome. It is vital to gather data regarding the relationship between humans and the components of a soundscape, e.g., sound sources, features of the physical environment, the functions of a space, and the expectation of the sound source. These data can be measured and gathered using several soundscape evaluation methods. Soundscape evaluation is usually conducted using in-situ surveys and laboratory experiments with a multi-speaker system. Although these methods have been validated and are widely used in soundscape analysis, there are some limitations in their application. The in-situ survey must be conducted with many people at the same time because it is hard to replicate the acoustic environment.
Conversely, the laboratory experiment does not have a problem with repetition, but it requires a room with a multi-speaker reproduction system. This project used a different method for soundscape analysis, developed around headphone reproduction via the internet. The internet infrastructure for data gathering is well established: a website can reproduce high-quality audio and provide a system for designing online questionnaires. Furthermore, the development of virtual reality systems allows the reproduction of virtual audio-visual stimuli on a website. Although websites offer an established system to gather the required data, the challenge is the validation of the reproduction system for soundscape analysis, which needs to address several factors: a suitable recording system, the effect of headphone variation, the calibration of the system, and the perceptual results obtained from internet-based reproduction of the acoustic environment. This study aims to develop and validate a web-based urban soundscape evaluation method. By using this method, the experiment can be repeated easily and data can be gathered from many respondents. Furthermore, the simplicity of the system allows application by stakeholders in urban design. The data gathered from this system are important for the design of an urban area with consideration of the acoustic aspects.
Multi Length Scale Finite Element Design Framework for Advanced Woven Fabrics
NASA Astrophysics Data System (ADS)
Erol, Galip Ozan
Woven fabrics are integral parts of many engineering applications spanning from personal protective garments to surgical scaffolds. They provide a wide range of opportunities in designing advanced structures because of their high tenacity, flexibility, high strength-to-weight ratios and versatility. These advantages result from their inherent multi scale nature where the filaments are bundled together to create yarns while the yarns are arranged into different weave architectures. Their highly versatile nature opens up potential for a wide range of mechanical properties which can be adjusted based on the application. While woven fabrics are viable options for design of various engineering systems, being able to understand the underlying mechanisms of the deformation and associated highly nonlinear mechanical response is important and necessary. However, the multiscale nature and relationships between these scales make the design process involving woven fabrics a challenging task. The objective of this work is to develop a multiscale numerical design framework using experimentally validated mesoscopic and macroscopic length scale approaches by identifying important deformation mechanisms and recognizing the nonlinear mechanical response of woven fabrics. This framework is exercised by developing mesoscopic length scale constitutive models to investigate plain weave fabric response under a wide range of loading conditions. A hyperelastic transversely isotropic yarn material model with transverse material nonlinearity is developed for woven yarns (commonly used in personal protection garments). The material properties/parameters are determined through an inverse method where unit cell finite element simulations are coupled with experiments. The developed yarn material model is validated by simulating full scale uniaxial tensile, bias extension and indentation experiments, and comparing to experimentally observed mechanical response and deformation mechanisms. 
Moreover, mesoscopic unit cell finite element models are coupled with a design-of-experiments method to systematically identify the yarn material properties that are important for the macroscale response of various weave architectures. To demonstrate the macroscopic length scale approach, two new material models for woven fabrics were developed. The Planar Material Model (PMM) utilizes two important deformation mechanisms in woven fabrics: (1) yarn elongation, and (2) relative yarn rotation due to shear loads. The yarns' uniaxial tensile response is modeled with a nonlinear spring using constitutive relations, while a nonlinear rotational spring is implemented to define the fabric's shear stiffness. The second material model, the Sawtooth Material Model (SMM), adopts a sawtooth geometry while recognizing the biaxial nature of woven fabrics by implementing the interactions between the yarns. Material properties/parameters required by both PMM and SMM can be determined directly from standard experiments. Both macroscopic material models are implemented within an explicit finite element code and validated against experiments. The developed macroscopic material models are then compared under various loading conditions to determine their accuracy. Finally, the numerical models developed at the mesoscopic and macroscopic length scales are linked, demonstrating the new systematic design framework involving linked mesoscopic and macroscopic modeling approaches. The approach is demonstrated with both the Planar and Sawtooth Material Models, and the simulation results are verified by comparing the results obtained from the meso- and macro-scale models.
Pediatric Cancer Survivorship Research: Experience of the Childhood Cancer Survivor Study
Leisenring, Wendy M.; Mertens, Ann C.; Armstrong, Gregory T.; Stovall, Marilyn A.; Neglia, Joseph P.; Lanctot, Jennifer Q.; Boice, John D.; Whitton, John A.; Yasui, Yutaka
2009-01-01
The Childhood Cancer Survivor Study (CCSS) is a comprehensive multicenter study designed to quantify and better understand the effects of pediatric cancer and its treatment on later health, including behavioral and sociodemographic outcomes. The CCSS investigators have published more than 100 articles in the scientific literature related to the study. As with any large cohort study, high standards for methodologic approaches are imperative for valid and generalizable results. In this article we describe methodological issues of study design, exposure assessment, outcome validation, and statistical analysis. Methods for handling missing data, intrafamily correlation, and competing risks analysis are addressed; each with particular relevance to pediatric cancer survivorship research. Our goal in this article is to provide a resource and reference for other researchers working in the area of long-term cancer survivorship. PMID:19364957
A mobile sensing system for structural health monitoring: design and validation
NASA Astrophysics Data System (ADS)
Zhu, Dapeng; Yi, Xiaohua; Wang, Yang; Lee, Kok-Meng; Guo, Jiajie
2010-05-01
This paper describes a new approach using mobile sensor networks for structural health monitoring. Compared with static sensors, mobile sensor networks offer flexible system architectures with adaptive spatial resolutions. The paper first describes the design of a mobile sensing node that is capable of maneuvering on structures built with ferromagnetic materials. The mobile sensing node can also attach/detach an accelerometer onto/from the structural surface. The performance of the prototype mobile sensor network has been validated through laboratory experiments. Two mobile sensing nodes are adopted for navigating on a steel portal frame and providing dense acceleration measurements. Transmissibility function analysis is conducted to identify structural damage using data collected by the mobile sensing nodes. This preliminary work is expected to spawn transformative changes in the use of mobile sensors for future structural health monitoring.
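The transmissibility function analysis mentioned above is the frequency-domain ratio between two response channels; structural damage shifts it away from a healthy baseline. A minimal single-block sketch follows; the estimator and channel layout here are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def transmissibility(x_ref, x_out, fs):
    """H1-type transmissibility estimate T(f) = Sxy/Sxx between two
    acceleration records sampled at fs (single block, Hann window)."""
    w = np.hanning(len(x_ref))
    X = np.fft.rfft(x_ref * w)
    Y = np.fft.rfft(x_out * w)
    f = np.fft.rfftfreq(len(x_ref), d=1.0 / fs)
    T = (Y * np.conj(X)) / (np.abs(X) ** 2 + 1e-12)  # small floor avoids 0/0
    return f, T

def damage_index(T_baseline, T_current):
    """Mean absolute deviation of the current transmissibility from baseline;
    zero when nothing has changed, growing with damage-induced shifts."""
    return float(np.mean(np.abs(T_current - T_baseline)))
```

As a sanity check, if one channel is exactly half the other, |T| is 0.5 at the excitation frequency, and the index against an identical baseline is zero.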
Linden, Ariel
2017-04-01
The basic single-group interrupted time series analysis (ITSA) design has been shown to be susceptible to the most common threat to validity-history-the possibility that some other event caused the observed effect in the time series. A single-group ITSA with a crossover design (in which the intervention is introduced and withdrawn 1 or more times) should be more robust. In this paper, we describe and empirically assess the susceptibility of this design to bias from history. Time series data from 2 natural experiments (the effect of multiple repeals and reinstatements of Louisiana's motorcycle helmet law on motorcycle fatalities and the association between the implementation and withdrawal of Gorbachev's antialcohol campaign with Russia's mortality crisis) are used to illustrate that history remains a threat to ITSA validity, even in a crossover design. Both empirical examples reveal that the single-group ITSA with a crossover design may be biased because of history. In the case of motorcycle fatalities, helmet laws appeared effective in reducing mortality (while repealing the law increased mortality), but when a control group was added, it was shown that this trend was similar in both groups. In the case of Gorbachev's antialcohol campaign, only when contrasting the results against those of a control group was the withdrawal of the campaign found to be the more likely culprit in explaining the Russian mortality crisis than the collapse of the Soviet Union. Even with a robust crossover design, single-group ITSA models remain susceptible to bias from history. Therefore, a comparable control group design should be included, whenever possible. © 2016 John Wiley & Sons, Ltd.
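The single-group ITSA design discussed above is ordinarily fit as a segmented regression with level-change and slope-change terms at the interruption. A toy sketch on noise-free synthetic data (illustrative only; it omits autocorrelation adjustments and the control-group contrast the author recommends):

```python
import numpy as np

def fit_itsa(y, t, t0):
    """Segmented regression  y = b0 + b1*t + b2*D + b3*(t - t0)*D,
    where D = 1 after the interruption at t0 (level and slope change)."""
    t = np.asarray(t, dtype=float)
    D = (t >= t0).astype(float)
    X = np.column_stack([np.ones_like(t), t, D, (t - t0) * D])
    beta, *_ = np.linalg.lstsq(X, np.asarray(y, dtype=float), rcond=None)
    return beta  # [baseline level, baseline trend, level change, slope change]

# Synthetic series: level jump of 3 and slope change of 0.2 at t0 = 10.
t = np.arange(20.0)
D = (t >= 10).astype(float)
y = 2.0 + 0.5 * t + 3.0 * D + 0.2 * (t - 10.0) * D
print(fit_itsa(y, t, 10.0))  # recovers [2.0, 0.5, 3.0, 0.2]
```

The paper's caution applies directly: without a control series, the b2 and b3 estimates absorb any co-occurring historical event.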
Spring 2013 Graduate Engineering Internship Summary
NASA Technical Reports Server (NTRS)
Ehrlich, Joshua
2013-01-01
In the spring of 2013, I participated in the National Aeronautics and Space Administration (NASA) Pathways Intern Employment Program at the Kennedy Space Center (KSC) in Florida. This was my final internship opportunity with NASA, a third consecutive extension from a summer 2012 internship. Since the start of my tenure here at KSC, I have gained an invaluable depth of engineering knowledge and extensive hands-on experience. These opportunities have granted me the ability to enhance my systems engineering approach in the field of payload design and testing as well as develop a strong foundation in the area of composite fabrication and testing for repair design on space vehicle structures. As a systems engineer, I supported the systems engineering and integration team with final acceptance testing of the Vegetable Production System, commonly referred to as Veggie. Verification and validation (V and V) of Veggie was carried out prior to qualification testing of the payload, which incorporated the process of confirming the system's design requirements dependent on one or more validation methods: inspection, analysis, demonstration, and testing.
Vesterinen, Hanna V; Egan, Kieren; Deister, Amelie; Schlattmann, Peter; Macleod, Malcolm R; Dirnagl, Ulrich
2011-04-01
Translating experimental findings into clinically effective therapies is one of the major bottlenecks of modern medicine. As this has been particularly true for cerebrovascular research, attention has turned to the quality and validity of experimental cerebrovascular studies. We set out to assess the study design, statistical analyses, and reporting of cerebrovascular research. We assessed all original articles published in the Journal of Cerebral Blood Flow and Metabolism during the year 2008 against a checklist designed to capture the key attributes relating to study design, statistical analyses, and reporting. A total of 156 original publications were included (animal, in vitro, human). Few studies reported a primary research hypothesis, statement of purpose, or measures to safeguard internal validity (such as randomization, blinding, exclusion or inclusion criteria). Many studies lacked sufficient information regarding methods and results to form a reasonable judgment about their validity. In nearly 20% of studies, statistical tests were either not appropriate or information to allow assessment of appropriateness was lacking. This study identifies a number of factors that should be addressed if the quality of research in basic and translational biomedicine is to be improved. We support the widespread implementation of the ARRIVE (Animal Research Reporting In Vivo Experiments) statement for the reporting of experimental studies in biomedicine, for improving training in proper study design and analysis, and that reviewers and editors adopt a more constructively critical approach in the assessment of manuscripts for publication.
Design and validation of an intelligent wheelchair towards a clinically-functional outcome.
Boucher, Patrice; Atrash, Amin; Kelouwani, Sousso; Honoré, Wormser; Nguyen, Hai; Villemure, Julien; Routhier, François; Cohen, Paul; Demers, Louise; Forget, Robert; Pineau, Joelle
2013-06-17
Many people with mobility impairments, who require the use of powered wheelchairs, have difficulty completing basic maneuvering tasks during their activities of daily living (ADL). In order to provide assistance to this population, robotic and intelligent system technologies have been used to design an intelligent powered wheelchair (IPW). This paper provides a comprehensive overview of the design and validation of the IPW. The main contributions of this work are three-fold. First, we present a software architecture for robot navigation and control in constrained spaces. Second, we describe a decision-theoretic approach for achieving robust speech-based control of the intelligent wheelchair. Third, we present an evaluation protocol motivated by a meaningful clinical outcome, in the form of the Robotic Wheelchair Skills Test (RWST). This allows us to perform a thorough characterization of the performance and safety of the system, involving 17 test subjects (8 non-PW users, 9 regular PW users), 32 complete RWST sessions, 25 total hours of testing, and 9 kilometers of total running distance. User tests with the RWST show that the navigation architecture reduced collisions by more than 60% compared to other recent intelligent wheelchair platforms. On the tasks of the RWST, we measured an average decrease of 4% in performance score and 3% in safety score (not statistically significant), compared to the scores obtained with the conventional driving mode. This analysis was performed with regular users who had over 6 years of wheelchair driving experience, compared to approximately one half-hour of training with the autonomous mode. The platform tested in these experiments is among the most experimentally validated robotic wheelchairs in realistic contexts. The results establish that proficient powered wheelchair users can achieve the same level of performance with the intelligent command mode as with the conventional command mode.
The EGS Collab Project: Stimulation Investigations for Geothermal Modeling Analysis and Validation
NASA Astrophysics Data System (ADS)
Blankenship, D.; Kneafsey, T. J.
2017-12-01
The US DOE's EGS Collab project team is establishing a suite of intermediate-scale (10-20 m) field test beds for coupled stimulation and interwell flow tests. The multiple national laboratory and university team is designing the tests to compare measured data to models to improve the measurement and modeling toolsets available for use in field sites and investigations such as DOE's Frontier Observatory for Research in Geothermal Energy (FORGE) Project. Our tests will be well-controlled, in situ experiments focused on rock fracture behavior, seismicity, and permeability enhancement. Pre- and post-test modeling will allow for model prediction and validation. High-quality, high-resolution geophysical and other fracture characterization data will be collected, analyzed, and compared with models and field observations to further elucidate the basic relationships between stress, induced seismicity, and permeability enhancement. Coring through the stimulated zone after tests will provide fracture characteristics that can be compared to monitoring data and model predictions. We will also observe and quantify other key governing parameters that impact permeability, and attempt to understand how these parameters might change throughout the development and operation of an Enhanced Geothermal System (EGS) project, with the goal of enabling commercial viability of EGS. The Collab team will perform three major experiments over the three-year project duration. Experiment 1, intended to investigate hydraulic fracturing, will be performed in the Sanford Underground Research Facility (SURF) at 4,850 feet depth and will build on kISMET Project findings. Experiment 2 will be designed to investigate hydroshearing. Experiment 3 will investigate changes in fracturing strategies and will be further specified as the project proceeds.
The tests will provide quantitative insights into the nature of stimulation (e.g., hydraulic fracturing, hydroshearing, mixed-mode fracturing, thermal fracturing) in crystalline rock under reservoir-like stress conditions and generate high-quality, high-resolution, diverse data sets to be simulated allowing model validation. Monitoring techniques will also be evaluated under controlled conditions identifying technologies appropriate for deeper full-scale EGS sites.
Variable-Speed Power-Turbine for the Large Civil Tilt Rotor
NASA Technical Reports Server (NTRS)
Suchezky, Mark; Cruzen, G. Scott
2012-01-01
Turbine design concepts were studied for application to a large civil tiltrotor transport aircraft. The concepts addressed the need for high turbine efficiency across the broad 2:1 turbine operating speed range representative of the notional mission for the aircraft. The study focused on tailoring basic turbine aerodynamic design parameters to avoid the need for complex, heavy, and expensive variable geometry features. The results of the study showed that good turbine performance can be achieved across the design speed range if the design focuses on tailoring the aerodynamics for good tolerance to large swings in incidence, as opposed to optimizing for best performance at the long-range cruise design point. A rig design configuration and program plan are suggested for a dedicated experiment to validate the proposed approach.
Beyond associations: Do implicit beliefs play a role in smoking addiction?
Tibboel, Helen; De Houwer, Jan; Dirix, Nicolas; Spruyt, Adriaan
2017-01-01
Influential dual-system models of addiction suggest that an automatic system that is associative and habitual promotes drug use, whereas a controlled system that is propositional and rational inhibits drug use. It is assumed that effects on the Implicit Association Test (IAT) reflect the automatic processes that guide drug seeking. However, results have been inconsistent, challenging: (1) the validity of addiction IATs; and (2) the assumption that the automatic system contains only simple associative information. We aimed to further test the validity of IATs that are used within this field of research using an experimental design. Second, we introduced a new procedure aimed at examining the automatic activation of complex propositional knowledge, the Relational Responding Task (RRT) and examine the validity of RRT effects in the context of smoking. In two experiments, smokers performed two different tasks: an approach/avoid IAT and a liking IAT in Experiment 1, and a smoking urges RRT and a valence IAT in Experiment 2. Smokers were tested once immediately after smoking and once after 10 hours of nicotine-deprivation. None of the IAT scores were affected by the deprivation manipulation. RRT scores revealed a stronger implicit desire for smoking in the deprivation condition compared to the satiation condition. IATs that are currently used to assess automatic processes in addiction have serious drawbacks. Furthermore, the automatic system may contain not only associations but complex drug-related beliefs as well. The RRT may be a useful and valid tool to examine these beliefs.
Yousefi, Azizeh-Mitra; Smucker, Byran; Naber, Alex; Wyrick, Cara; Shaw, Charles; Bennett, Katelyn; Szekely, Sarah; Focke, Carlie; Wood, Katherine A
2018-02-01
Tissue engineering using three-dimensional porous scaffolds has shown promise for the restoration of normal function in injured and diseased tissues and organs. Rigorous control over scaffold architecture in melt extrusion additive manufacturing is highly restricted, mainly due to pronounced variations in the deposited strand diameter upon any variations in process conditions and polymer viscoelasticity. We have designed an I-optimal, split-plot experiment to study the extrudate swell in melt extrusion additive manufacturing and to control the scaffold architecture. The designed experiment was used to generate data relating three responses (swell, density, and modulus) to a set of controllable factors (plotting needle diameter, temperature, pressure, and dispensing speed). The fitted regression relationships were used to optimize the three responses simultaneously. The swell response was constrained to be close to 1 while maximizing the modulus and minimizing the density. Constraining the extrudate swell to 1 generates design-driven scaffolds, with strand diameters equal to the plotting needle diameter, and allows greater control over scaffold pore size. Hence, the modulus of the scaffolds can be fully controlled by adjusting the in-plane distance between the deposited strands. To the extent of the model's validity, we can eliminate the effect of extrudate swell in designing these scaffolds, while targeting a range of porosity and modulus appropriate for bone tissue engineering. The result of this optimization was a predicted modulus of 14 MPa and a predicted density of 0.29 g/cm³ (porosity ≈ 75%) using polycaprolactone as the scaffold material. These predicted responses corresponded to factor levels of 0.6 mm for the plotting needle diameter, a plotting pressure of 2.5 bar, a melt temperature of 113.5 °C, and a dispensing speed of 2 mm/s.
The validation scaffold enabled us to quantify the percentage difference for the predictions, which was 9.5% for the extrudate swell, 19% for the density, and 29% for the modulus.
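The simultaneous optimization described above (swell constrained near 1, modulus maximized, density minimized) can be sketched as a constrained search over coded factors. The response surfaces below use made-up placeholder coefficients, not the study's fitted regressions:

```python
import itertools
import numpy as np

# Hypothetical linear response surfaces over coded factors x in [-1, 1]^4
# (needle diameter, temperature, pressure, dispensing speed). Placeholder
# coefficients only -- NOT the fitted models from the study.
def swell(x):   return 1.0 + 0.15 * x[0] - 0.10 * x[1] + 0.05 * x[2]
def density(x): return 0.30 + 0.04 * x[1] + 0.03 * x[3]
def modulus(x): return 14.0 + 2.0 * x[2] - 1.5 * x[1]

def optimize(n=5, swell_tol=0.05):
    """Grid search: enforce |swell - 1| <= tol, then maximize a simple
    desirability score (stiff and light => high modulus / density)."""
    grid = np.linspace(-1.0, 1.0, n)
    best, best_score = None, -np.inf
    for x in itertools.product(grid, repeat=4):
        if abs(swell(x) - 1.0) > swell_tol:
            continue  # infeasible: strand diameter would drift from needle
        score = modulus(x) / density(x)
        if score > best_score:
            best, best_score = x, score
    return best, best_score
```

In the paper this role is played by regression models fitted to the designed experiment; a desirability-function or numerical optimizer would replace the grid search at scale.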
Flexible twist for pitch control in a high altitude long endurance aircraft with nonlinear response
NASA Astrophysics Data System (ADS)
Bond, Vanessa L.
Information dominance is the key motivator for employing high-altitude long-endurance (HALE) aircraft to provide continuous coverage in theaters of operation. A joined-wing configuration of such a craft gives the advantage of a platform for higher-resolution sensors. Design challenges emerge from the structural flexibility inherent in a long-endurance aircraft design. The goal of this research was to demonstrate that scaling the nonlinear response of a full-scale finite element model is possible if the model is aeroelastically and "nonlinearly" scaled. The research within this dissertation showed that using the first three modes and the first buckling modes was not sufficient for proper scaling. In addition to analytical scaling, several experiments were accomplished to understand and overcome design challenges of HALE aircraft. One such challenge is addressed by eliminating pitch control surfaces and replacing them with an aft-wing twist concept. This design option was physically realized through wind tunnel measurement of forces, moments, and pressures on a subscale experimental model. This design and experiment demonstrated that pitch control with aft-wing twist is feasible. Another challenge is predicting the nonlinear response of long-endurance aircraft. This was addressed by experimental validation of modeling the nonlinear response on a subscale experimental model. It is important to be able to scale nonlinear behavior in this type of craft due to its highly flexible nature. The validation accomplished during this experiment on a subscale model will reduce technical risk for full-scale development of such pioneering craft. It is also important to experimentally reproduce the air loads following the wing as it deforms. Nonlinearities can be attributed to these follower forces, which might otherwise be overlooked. This was found to be a significant influence in HALE aircraft, including the case study of the FEM and experimental models herein.
Adiabatic model and design of a translating field reversed configuration
DOE Office of Scientific and Technical Information (OSTI.GOV)
Intrator, T. P.; Siemon, R. E.; Sieck, P. E.
We apply an adiabatic evolution model to predict the behavior of a field reversed configuration (FRC) during decompression and translation, as well as during boundary compression. Semi-empirical scaling laws, which were developed and benchmarked primarily for collisionless FRCs, are expected to remain valid even for the collisional regime of the FRX-L experiment. We use this approach to outline the design implications for FRX-L, the high-density translated FRC experiment at Los Alamos National Laboratory. A conical theta coil is used to accelerate the FRC to the largest practical velocity so it can enter a mirror-bounded compression region, where it must be a suitable target for a magnetized target fusion (MTF) implosion. FRX-L provides the physics basis for the integrated MTF plasma compression experiment at the Shiva-Star pulsed power facility at the Kirtland Air Force Research Laboratory, where the FRC will be compressed inside a flux-conserving cylindrical shell.
Ground Testing of a 10 K Sorption Cryocooler Flight Experiment (BETSCE)
NASA Technical Reports Server (NTRS)
Bard, S.; Wu, J.; Karlmann, P.; Cowgill, P.; Mirate, C.; Rodriguez, J.
1994-01-01
The Brilliant Eyes Ten-Kelvin Sorption Cryocooler Experiment (BETSCE) is a Space Shuttle side-wall-mounted flight experiment designed to demonstrate 10 K sorption cryocooler technology in a space environment. The BETSCE objectives are to: (1) provide a thorough end-to-end characterization and space performance validation of a complete, multistage, automated, closed-cycle hydride sorption cryocooler in the 10 to 30 K temperature range, (2) acquire the quantitative microgravity database required to provide confident engineering design, scaling, and optimization, (3) advance the enabling technologies and resolve integration issues, and (4) provide hardware qualification and safety verification heritage. BETSCE ground tests were the first-ever demonstration of a complete closed-cycle 10 K sorption cryocooler. Test results exceeded functional requirements. This paper summarizes functional and environmental ground test results, planned characterization tests, important development challenges that were overcome, and valuable lessons-learned.
Control of Flexible Structures (COFS) Flight Experiment Background and Description
NASA Technical Reports Server (NTRS)
Hanks, B. R.
1985-01-01
A fundamental problem in designing and delivering large space structures to orbit is to provide sufficient structural stiffness and static configuration precision to meet performance requirements. These requirements are directly related to control requirements and the degree of control system sophistication available to supplement the as-built structure. Background and rationale are presented for a research study in structures, structural dynamics, and controls using a relatively large, flexible beam as a focus. This experiment would address fundamental problems applicable to large, flexible space structures in general and would involve a combination of ground tests, flight behavior prediction, and instrumented orbital tests. Intended to be multidisciplinary but basic within each discipline, the experiment should provide improved understanding and confidence in making design trades between structural conservatism and control system sophistication for meeting static shape and dynamic response/stability requirements. Quantitative results should be obtained for use in improving the validity of ground tests for verifying flight performance analyses.
Pattarino, Franco; Piepel, Greg; Rinaldi, Maurizio
2018-03-03
A paper by Foglio Bonda et al. published previously in this journal (2016, Vol. 83, pp. 175–183) discussed the use of mixture experiment design and modeling methods to study how the proportions of three components in an extemporaneous oral suspension affected the mean diameter of drug particles (Z ave). The three components were itraconazole (ITZ), Tween 20 (TW20), and Methocel® E5 (E5). This commentary addresses some errors and other issues in the previous paper, and also discusses an improved model relating proportions of ITZ, TW20, and E5 to Z ave. The improved model contains six of the 10 terms in the full-cubic mixture model, which were selected using a different cross-validation procedure than used in the previous paper. In conclusion, compared to the four-term model presented in the previous paper, the improved model fit the data better, had excellent cross-validation performance, and the predicted Z ave of a validation point was within model uncertainty of the measured value.
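For three components, the full-cubic Scheffé mixture model referenced above has exactly 10 terms. A minimal sketch of that term basis, together with a leave-one-out cross-validation (PRESS) criterion, is given below; the commentary's data and its specific cross-validation procedure are not reproduced, so the helper names and any illustrative values are assumptions.

```python
import numpy as np

def full_cubic_terms(x1, x2, x3):
    """10-term Scheffe full-cubic mixture-model basis for 3 components."""
    return np.column_stack([
        x1, x2, x3,                                    # linear blending
        x1*x2, x1*x3, x2*x3,                           # quadratic binary blending
        x1*x2*(x1-x2), x1*x3*(x1-x3), x2*x3*(x2-x3),   # cubic binary blending
        x1*x2*x3,                                      # ternary blending
    ])

def loo_press(X, y):
    """Leave-one-out PRESS statistic via the hat-matrix shortcut
    e_i / (1 - h_ii) for an ordinary least-squares fit."""
    H = X @ np.linalg.pinv(X)          # hat matrix
    resid = y - H @ y
    return np.sum((resid / (1.0 - np.diag(H)))**2)
```

Subset models (such as the six-term model in the commentary) would be compared by evaluating `loo_press` on the corresponding columns of the basis.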
Pattarino, Franco; Piepel, Greg; Rinaldi, Maurizio
2018-05-30
A paper by Foglio Bonda et al. published previously in this journal (2016, Vol. 83, pp. 175-183) discussed the use of mixture experiment design and modeling methods to study how the proportions of three components in an extemporaneous oral suspension affected the mean diameter of drug particles (Z ave ). The three components were itraconazole (ITZ), Tween 20 (TW20), and Methocel® E5 (E5). This commentary addresses some errors and other issues in the previous paper, and also discusses an improved model relating proportions of ITZ, TW20, and E5 to Z ave . The improved model contains six of the 10 terms in the full-cubic mixture model, which were selected using a different cross-validation procedure than used in the previous paper. Compared to the four-term model presented in the previous paper, the improved model fit the data better, had excellent cross-validation performance, and the predicted Z ave of a validation point was within model uncertainty of the measured value. Copyright © 2018 Elsevier B.V. All rights reserved.
[Validation of two brief scales for Internet addiction and mobile phone problem use].
Beranuy Fargues, Marta; Chamarro Lusar, Andrés; Graner Jordania, Carla; Carbonell Sánchez, Xavier
2009-08-01
This study describes the construction and validation process of two questionnaires designed to assess the addictive use of Internet and mobile phones. The scales were applied to a sample of 1,879 students. Results support a two-factor model, presenting an acceptable internal consistency and indices of convergent and discriminant validity. The Questionnaire of Experiences Related to Internet was found to assess intra- and interpersonal conflicts related to Internet use. The Questionnaire of Experiences Related to the Mobile Phone was found to assess conflicts related to mobile phone abuse and to maladaptive emotional and communicational patterns. Our results indicate that the mobile phone does not produce the same degree of addictive behavior as Internet; it could rather be interpreted as problematic use. Men displayed more addictive use of Internet, whilst women seemed to use the mobile phone as a means for emotional communication. It seems that the use of both technologies is more problematic during adolescence and normalizes with age toward a more professional and less playful use, and with fewer negative consequences.
Experiment for validation of fluid-structure interaction models and algorithms.
Hessenthaler, A; Gaddum, N R; Holub, O; Sinkus, R; Röhrle, O; Nordsletten, D
2017-09-01
In this paper a fluid-structure interaction (FSI) experiment is presented. The aim of this experiment is to provide a challenging yet easy-to-setup FSI test case that addresses the need for rigorous testing of FSI algorithms and modeling frameworks. Steady-state and periodic steady-state test cases with constant and periodic inflow were established. Focus of the experiment is on biomedical engineering applications with flow being in the laminar regime with Reynolds numbers 1283 and 651. Flow and solid domains were defined using computer-aided design (CAD) tools. The experimental design aimed at providing a straightforward boundary condition definition. Material parameters and mechanical response of a moderately viscous Newtonian fluid and a nonlinear incompressible solid were experimentally determined. A comprehensive data set was acquired by using magnetic resonance imaging to record the interaction between the fluid and the solid, quantifying flow and solid motion. Copyright © 2016 The Authors. International Journal for Numerical Methods in Biomedical Engineering published by John Wiley & Sons Ltd.
Brusati, M.; Camplani, A.; Cannon, M.; ...
2017-02-20
SRAM-based Field Programmable Gate Array (FPGA) logic devices are very attractive in applications where high data throughput is needed, such as the latest generation of High Energy Physics (HEP) experiments. FPGAs have rarely been used in such experiments because of their sensitivity to radiation. The present paper proposes a mitigation approach applied to commercial FPGA devices to meet the reliability requirements for the front-end electronics of the Liquid Argon (LAr) electromagnetic calorimeter of the ATLAS experiment, located at CERN. Particular attention is devoted to defining a proper mitigation scheme for the multi-gigabit transceivers embedded in the FPGA, which are a critical part of the LAr data acquisition chain. A demonstrator board is being developed to validate the proposed methodology. Mitigation techniques such as Triple Modular Redundancy (TMR) and scrubbing will be used to increase the robustness of the design and to maximize fault tolerance against Single-Event Upsets (SEUs).
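Triple Modular Redundancy triplicates state and majority-votes the three copies, while scrubbing periodically rewrites the stored values so single-event upsets cannot accumulate between votes. A minimal behavioral sketch of the idea (not the ATLAS LAr design, whose voter and scrubbing engine operate on FPGA configuration frames in hardware) might look like:

```python
def majority3(a, b, c):
    """Bitwise 2-of-3 majority vote, as used in TMR output stages."""
    return (a & b) | (a & c) | (b & c)

class TMRRegister:
    """Three redundant copies of a value. vote() returns the bitwise
    majority; scrub() rewrites all copies from the voted value, so a
    second upset in another copy cannot combine with an earlier one."""

    def __init__(self, value=0):
        self.copies = [value, value, value]

    def upset(self, copy_index, bit):
        self.copies[copy_index] ^= (1 << bit)   # model a single-event upset

    def vote(self):
        a, b, c = self.copies
        return majority3(a, b, c)

    def scrub(self):
        v = self.vote()
        self.copies = [v, v, v]
```

A single upset in one copy is masked by the vote; scrubbing restores full redundancy before a second upset can occur.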
Fatigue Damage of Collagenous Tissues: Experiment, Modeling and Simulation Studies
Martin, Caitlin; Sun, Wei
2017-01-01
Mechanical fatigue damage is a critical issue for soft tissues and tissue-derived materials, particularly for musculoskeletal and cardiovascular applications; yet, our understanding of the fatigue damage process is incomplete. Soft tissue fatigue experiments are often difficult and time-consuming to perform, which has hindered progress in this area. However, the recent development of soft-tissue fatigue-damage constitutive models has enabled simulation-based fatigue analyses of tissues under various conditions. Computational simulations facilitate highly controlled and quantitative analyses to study the distinct effects of various loading conditions and design features on tissue durability; thus, they are advantageous over complex fatigue experiments. Although significant work to calibrate the constitutive models from fatigue experiments and to validate predictability remains, further development in these areas will add to our knowledge of soft-tissue fatigue damage and will facilitate the design of durable treatments and devices. In this review, the experimental, modeling, and simulation efforts to study collagenous tissue fatigue damage are summarized and critically assessed. PMID:25955007
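The constitutive fatigue-damage models surveyed above are not specified in the abstract. As a generic illustration of how such simulation-based fatigue analyses work, many continuum damage formulations evolve a scalar damage variable D with cycle count until failure; the power-law form and every parameter value below are illustrative assumptions, not calibrated tissue properties.

```python
def cycles_to_failure(stress_range, C=1e-6, m=2.0, k=1.0, dN=100, d_fail=0.99):
    """Integrate a generic damage-evolution law dD/dN = C*ds^m/(1-D)^k
    in blocks of dN cycles until the damage variable reaches d_fail.
    All parameters are illustrative placeholders."""
    D, N = 0.0, 0
    while D < d_fail:
        D += C * stress_range**m / (1.0 - D)**k * dN
        N += dN
    return N
```

Sweeping `stress_range` produces an S-N-like durability curve, which is the kind of output a calibrated model would compare against fatigue experiments.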
NASA Astrophysics Data System (ADS)
Galaev, S. A.; Ris, V. V.; Smirnov, E. M.; Babiev, A. N.
2018-06-01
Experience gained from designing exhaust hoods for modernized versions of K-175/180-12.8 and K-330-23.5-1 steam turbines is presented. The hood flow path is optimized based on the results of analyzing equilibrium wet-steam 3D flow fields calculated using up-to-date computational fluid dynamics techniques. The mathematical model constructed on the basis of Reynolds-averaged Navier-Stokes equations is validated by comparing the calculated kinetic energy loss with the published data on full-scale experiments for the hood used in the K-160-130 turbine produced by the Kharkiv Turbine-Generator Works. Test calculations were carried out for four turbine operation modes. The results of validating the model on the K-160-130 turbine hood were found to be as positive as the results of the previously performed calculations of the flow pattern in the K-300-240 turbine hood. It is shown that the calculated coefficients of total losses in the K-160-130 turbine hood differ from the full-scale test data by no more than 5%. As a result of optimizing the K-175/180-12.8 turbine hood flow path, the total loss coefficient has been decreased from 1.50 for the initial design to 1.05 for the best of the modified versions. The optimized hood is almost completely free from supersonic flow areas, and the flow through it has become essentially more uniform, both inside the hood and at its outlet. In the modified version of the K-330-23.5-1 turbine hood, the total loss coefficient has been decreased by more than a factor of 2: from 2.3 in the initial design to a value of 1.1 calculated for the final design version and the sizes adopted for developing the detailed design.
Substantially better performance of both hoods relative to their initial designs was achieved through multicase calculations in which the geometrical characteristics of the flow path were varied sequentially, including options involving its maximum possible expansion and the removal of guide plates that produced an adverse effect.
High-visibility time-bin entanglement for testing chained Bell inequalities
NASA Astrophysics Data System (ADS)
Tomasin, Marco; Mantoan, Elia; Jogenfors, Jonathan; Vallone, Giuseppe; Larsson, Jan-Åke; Villoresi, Paolo
2017-03-01
The violation of Bell's inequality requires a well-designed experiment to validate the result. In experiments using energy-time and time-bin entanglement, initially proposed by Franson in 1989, there is an intrinsic loophole due to the high degree of postselection. To obtain a violation in this type of experiment, a chained Bell inequality must be used. However, the local-realism bound requires a high visibility, in excess of 94.63%, in the time-bin entangled state. In this work, we show how such a high visibility can be reached in order to violate a chained Bell inequality with six, eight, and ten terms.
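For the standard chained Bell inequality with N settings per side (2N correlation terms), the local-realist bound is 2N − 2 while the quantum value is 2N cos(π/2N), so the minimum visibility needed for a violation is their ratio. The 94.63% figure quoted above is the stricter requirement that arises once the Franson postselection loophole is taken into account; the sketch below computes only the textbook thresholds.

```python
import math

def chained_bell(N):
    """Local-realist bound, quantum value, and critical visibility for the
    standard chained Bell inequality with N settings per side (2N terms)."""
    classical = 2 * N - 2
    quantum = 2 * N * math.cos(math.pi / (2 * N))
    return classical, quantum, classical / quantum

for N in (3, 4, 5):   # six-, eight-, and ten-term chained inequalities
    c, q, v = chained_bell(N)
    print(f"2N={2*N} terms: local bound {c}, quantum {q:.3f}, visibility > {v:.4f}")
```

Note that the critical visibility grows with N, which is why long chains demand the very high interferometric visibilities reported in the paper.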
DOE Office of Scientific and Technical Information (OSTI.GOV)
John D. Bess; J. Blair Briggs; Jim Gulliford
2014-10-01
The International Reactor Physics Experiment Evaluation Project (IRPhEP) is a widely recognized world-class program. The work of the IRPhEP is documented in the International Handbook of Evaluated Reactor Physics Benchmark Experiments (IRPhEP Handbook). Integral data from the IRPhEP Handbook are used by reactor safety and design, nuclear data, criticality safety, and analytical methods development specialists worldwide to perform necessary validations of their calculational techniques. The IRPhEP Handbook is among the most frequently cited references in the nuclear industry and is expected to be a valuable resource for future decades.
Pandey, Vinay Kumar; Kar, Indrani; Mahanta, Chitralekha
2017-07-01
In this paper, an adaptive control method using multiple models with second-level adaptation is proposed for a class of nonlinear multi-input multi-output (MIMO) coupled systems. Multiple estimation models are used to tune the unknown parameters at the first level. The second-level adaptation provides a single parameter vector for the controller. A feedback linearization technique is used to design a state feedback control. The efficacy of the designed controller is validated by conducting real-time experiments on a laboratory setup of a twin rotor MIMO system (TRMS). The TRMS setup is discussed in detail, and experiments were performed for regulation and tracking problems in pitch and yaw control using different reference signals. An Extended Kalman Filter (EKF) has been used to observe the unavailable states of the TRMS. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.
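The abstract mentions an EKF used to observe the unavailable TRMS states. The TRMS model itself is not given there, so the sketch below is only a generic EKF predict/update step, with user-supplied nonlinear dynamics f, measurement h, and their Jacobians F, H all assumed:

```python
import numpy as np

def ekf_step(x, P, u, z, f, F, h, H, Q, R):
    """One predict/update cycle of an extended Kalman filter.

    f(x, u), h(x) are the nonlinear transition/measurement functions;
    F(x, u), H(x) return their Jacobians at the current estimate;
    Q, R are process and measurement noise covariances."""
    # Predict
    x_pred = f(x, u)
    Fk = F(x, u)
    P_pred = Fk @ P @ Fk.T + Q
    # Update
    Hk = H(x_pred)
    S = Hk @ P_pred @ Hk.T + R
    K = P_pred @ Hk.T @ np.linalg.inv(S)      # Kalman gain
    x_new = x_pred + K @ (z - h(x_pred))
    P_new = (np.eye(len(x)) - K @ Hk) @ P_pred
    return x_new, P_new
```

For a plant like the TRMS, f would come from the rotor dynamics and h from the available sensors; the unmeasured states are recovered in `x_new`.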
Design and experiment of data-driven modeling and flutter control of a prototype wing
NASA Astrophysics Data System (ADS)
Lum, Kai-Yew; Xu, Cai-Lin; Lu, Zhenbo; Lai, Kwok-Leung; Cui, Yongdong
2017-06-01
This paper presents an approach for data-driven modeling of aeroelasticity and its application to flutter control design of a wind-tunnel wing model. Modeling is centered on system identification of unsteady aerodynamic loads using computational fluid dynamics data, and adopts a nonlinear multivariable extension of the Hammerstein-Wiener system. The formulation is in modal coordinates of the elastic structure, and yields a reduced-order model of the aeroelastic feedback loop that is parametrized by airspeed. Flutter suppression is thus cast as a robust stabilization problem over uncertain airspeed, for which a low-order H∞ controller is computed. The paper discusses in detail parameter sensitivity and observability of the model, the former to justify the chosen model structure, and the latter to provide a criterion for physical sensor placement. Wind tunnel experiments confirm the validity of the modeling approach and the effectiveness of the control design.
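A Hammerstein-Wiener model sandwiches linear dynamics between static input and output nonlinearities. The sketch below is a generic single-input simulation with made-up blocks (saturation, first-order lag, cubic), not the nonlinear multivariable extension identified in the paper:

```python
import numpy as np

def hammerstein_wiener(u, f_in, b, a, g_out):
    """Simulate y = g_out(G(q) f_in(u)), where G is an IIR filter with
    numerator coefficients b and denominator a (a[0] == 1)."""
    v = f_in(u)                                   # static input nonlinearity
    w = np.zeros_like(v)
    for k in range(len(v)):                       # linear dynamics (difference eq.)
        w[k] = sum(b[i] * v[k - i] for i in range(len(b)) if k - i >= 0)
        w[k] -= sum(a[j] * w[k - j] for j in range(1, len(a)) if k - j >= 0)
    return g_out(w)                               # static output nonlinearity

# Illustrative blocks: tanh saturation, first-order lag with DC gain 1, cubic output
u = np.linspace(-2.0, 2.0, 50)
y = hammerstein_wiener(u, np.tanh, b=[0.2], a=[1.0, -0.8], g_out=lambda w: w**3)
```

In system identification, the shapes of `f_in`/`g_out` and the coefficients `b`/`a` would be fitted to data (here, the CFD-derived unsteady aerodynamic loads).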
Keshtiari, Niloofar; Kuhlmann, Michael; Eslami, Moharram; Klann-Delius, Gisela
2015-03-01
Research on emotional speech often requires valid stimuli for assessing perceived emotion through prosody and lexical content. To date, no comprehensive emotional speech database for Persian is officially available. The present article reports the process of designing, compiling, and evaluating a comprehensive emotional speech database for colloquial Persian. The database contains a set of 90 validated novel Persian sentences classified in five basic emotional categories (anger, disgust, fear, happiness, and sadness), as well as a neutral category. These sentences were validated in two experiments by a group of 1,126 native Persian speakers. The sentences were articulated by two native Persian speakers (one male, one female) in three conditions: (1) congruent (emotional lexical content articulated in a congruent emotional voice), (2) incongruent (neutral sentences articulated in an emotional voice), and (3) baseline (all emotional and neutral sentences articulated in neutral voice). The speech materials comprise about 470 sentences. The validity of the database was evaluated by a group of 34 native speakers in a perception test. Utterances recognized better than five times chance performance (71.4 %) were regarded as valid portrayals of the target emotions. Acoustic analysis of the valid emotional utterances revealed differences in pitch, intensity, and duration, attributes that may help listeners to correctly classify the intended emotion. The database is designed to be used as a reliable material source (for both text and speech) in future cross-cultural or cross-linguistic studies of emotional speech, and it is available for academic research purposes free of charge. To access the database, please contact the first author.
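A small arithmetic note on the 71.4% criterion above: five times chance equals 5/7, which implies seven response alternatives in the perception test. That count is an inference, not stated in the abstract, which lists only five emotional categories plus a neutral one.

```python
# Assumed: 7 response alternatives in the forced-choice perception test
# (an inference from the 71.4% figure; the abstract does not list them).
alternatives = 7
chance = 1 / alternatives
criterion = 5 * chance
print(f"chance = {chance:.1%}, five times chance = {criterion:.1%}")
# -> chance = 14.3%, five times chance = 71.4%
```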
NASA Astrophysics Data System (ADS)
Maimoni, A.
1988-03-01
The literature on aluminum trihydroxide crystallization is reviewed and the implications of crystallization on the design and performance of the aluminum-air battery are illustrated. Results of research on hydrargillite crystallization under battery operating conditions at Alcoa Laboratories, Alcan Kingston Laboratories, and Lawrence Livermore National Laboratory are summarized and are applied to the design of an electrolyte management system using lamella settlers for clarification of the electrolyte and product separation. The design principles were validated in a series of experiments that, for the first time in the aluminum-air program, demonstrated continuous operation of an integrated system consisting of cells, crystallizer, and a product-removal system.
Scalable Metadata Management for a Large Multi-Source Seismic Data Repository
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gaylord, J. M.; Dodge, D. A.; Magana-Zook, S. A.
In this work, we implemented the key metadata management components of a scalable seismic data ingestion framework to address limitations in our existing system, and to position it for anticipated growth in volume and complexity. We began the effort with an assessment of open source data flow tools from the Hadoop ecosystem. We then began the construction of a layered architecture that is specifically designed to address many of the scalability and data quality issues we experience with our current pipeline. This included implementing basic functionality in each of the layers, such as establishing a data lake, designing a unified metadata schema, tracking provenance, and calculating data quality metrics. Our original intent was to test and validate the new ingestion framework with data from a large-scale field deployment in a temporary network. This delivered somewhat unsatisfying results, since the new system immediately identified fatal flaws in the data relatively early in the pipeline. Although this is a correct result, it did not allow us to sufficiently exercise the whole framework. We then widened our scope to process all available metadata from over a dozen online seismic data sources to further test the implementation and validate the design. This experiment also uncovered a higher than expected frequency of certain types of metadata issues that challenged us to further tune our data management strategy to handle them. Our result from this project is a greatly improved understanding of real world data issues, a validated design, and prototype implementations of major components of an eventual production framework. This successfully forms the basis of future development for the Geophysical Monitoring Program data pipeline, which is a critical asset supporting multiple programs.
It also positions us very well to deliver valuable metadata management expertise to our sponsors, and has already resulted in an NNSA Office of Defense Nuclear Nonproliferation commitment to a multi-year project for follow-on work.
Phase 1 Validation Testing and Simulation for the WEC-Sim Open Source Code
NASA Astrophysics Data System (ADS)
Ruehl, K.; Michelen, C.; Gunawan, B.; Bosma, B.; Simmons, A.; Lomonaco, P.
2015-12-01
WEC-Sim is an open source code to model wave energy converters performance in operational waves, developed by Sandia and NREL and funded by the US DOE. The code is a time-domain modeling tool developed in MATLAB/SIMULINK using the multibody dynamics solver SimMechanics, and solves the WEC's governing equations of motion using the Cummins time-domain impulse response formulation in 6 degrees of freedom. The WEC-Sim code has undergone verification through code-to-code comparisons; however validation of the code has been limited to publicly available experimental data sets. While these data sets provide preliminary code validation, the experimental tests were not explicitly designed for code validation, and as a result are limited in their ability to validate the full functionality of the WEC-Sim code. Therefore, dedicated physical model tests for WEC-Sim validation have been performed. This presentation provides an overview of the WEC-Sim validation experimental wave tank tests performed at the Oregon State University's Directional Wave Basin at Hinsdale Wave Research Laboratory. Phase 1 of experimental testing was focused on device characterization and completed in Fall 2015. Phase 2 is focused on WEC performance and scheduled for Winter 2015/2016. These experimental tests were designed explicitly to validate the performance of WEC-Sim code, and its new feature additions. Upon completion, the WEC-Sim validation data set will be made publicly available to the wave energy community. For the physical model test, a controllable model of a floating wave energy converter has been designed and constructed. The instrumentation includes state-of-the-art devices to measure pressure fields, motions in 6 DOF, multi-axial load cells, torque transducers, position transducers, and encoders. The model also incorporates a fully programmable Power-Take-Off system which can be used to generate or absorb wave energy. 
Numerical simulations of the experiments using WEC-Sim will be presented. These simulations highlight the code features included in the latest release of WEC-Sim (v1.2), including: wave directionality, nonlinear hydrostatics and hydrodynamics, user-defined wave elevation time series, state-space radiation, and WEC-Sim compatibility with BEMIO (an open-source AQWA/WAMIT/NEMOH coefficient parser).
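The governing formulation cited above is the Cummins time-domain impulse-response equation, (m + A∞)ẍ + ∫K(t−τ)ẋ(τ)dτ + Cx = F_exc(t). A hedged single-degree-of-freedom discretization is sketched below with explicit stepping and illustrative parameters; WEC-Sim itself solves the 6-DOF problem in MATLAB/SimMechanics, so this is only a schematic of the equation, not of the code:

```python
import numpy as np

def cummins_1dof(t, F_exc, m, A_inf, C, K):
    """Semi-implicit Euler stepping of the 1-DOF Cummins equation
    (m + A_inf)*x'' + conv(K, x') + C*x = F_exc(t), where K is the
    radiation impulse-response function sampled on the grid t."""
    dt = t[1] - t[0]
    n = len(t)
    x = np.zeros(n)
    v = np.zeros(n)
    for k in range(n - 1):
        conv = np.sum(K[:k + 1][::-1] * v[:k + 1]) * dt   # radiation memory term
        acc = (F_exc[k] - conv - C * x[k]) / (m + A_inf)
        v[k + 1] = v[k] + acc * dt
        x[k + 1] = x[k] + v[k + 1] * dt
    return x
```

With K ≡ 0 and a constant force the response reduces to an undamped oscillation about the static deflection F/C, a convenient sanity check for the stepping scheme.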
Validating Hydrodynamic Growth in National Ignition Facility Implosions
NASA Astrophysics Data System (ADS)
Peterson, J. Luc
2014-10-01
The hydrodynamic growth of capsule imperfections can threaten the success of inertial confinement fusion implosions. Therefore, it is important to design implosions that are robust to hydrodynamic instabilities. However, the numerical simulation of interacting Rayleigh-Taylor and Richtmyer-Meshkov growth in these implosions is sensitive to modeling uncertainties such as radiation drive and material equations of state, the effects of which are especially apparent at high mode number (small perturbation wavelength) and high convergence ratio (small capsule radius). A series of validation experiments were conducted at the National Ignition Facility to test the ability to model hydrodynamic growth in spherically converging ignition-relevant implosions. These experiments on the Hydro-Growth Radiography platform constituted direct measurements of the growth of pre-imposed imperfections up to Legendre mode 160 and a convergence ratio of greater than four, using two different laser drives: a "low-foot" drive used during the National Ignition Campaign and a larger-adiabat "high-foot" drive that is modeled to be relatively more robust to ablation-front hydrodynamic growth. We will discuss these experiments and how their results compare to numerical simulations and analytic theories of hydrodynamic growth, as well as their implications for the modeling of future designs. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.
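As context for the ablation-front stabilization mentioned above, linear ablative Rayleigh-Taylor growth is often summarized by a Takabe-type dispersion relation. This is a textbook form with commonly quoted coefficient ranges, not the paper's simulation model:

```latex
\gamma(k) \;=\; \alpha \sqrt{\frac{k g}{1 + k L}} \;-\; \beta\, k V_a,
\qquad \alpha \approx 0.9,\quad \beta \approx 1\text{--}3,
```

where $k$ is the perturbation wavenumber (large $k$ corresponds to high Legendre mode), $g$ the acceleration, $L$ the density-gradient scale length, and $V_a$ the ablation velocity. A higher-adiabat "high-foot" drive raises $V_a$ and $L$, reducing $\gamma$ at high mode number, consistent with the relative robustness described above.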
Large Space Systems Technology, Part 2, 1981
NASA Technical Reports Server (NTRS)
Boyer, W. J. (Compiler)
1982-01-01
Four major areas of interest are covered: technology pertinent to large antenna systems; technology related to the control of large space systems; basic technology concerning structures, materials, and analyses; and flight technology experiments. Large antenna systems and flight technology experiments are described. Design studies, structural testing results, and theoretical applications are presented with accompanying validation data. These research studies represent state-of-the art technology that is necessary for the development of large space systems. A total systems approach including structures, analyses, controls, and antennas is presented as a cohesive, programmatic plan for large space systems.
Autonomous GPS/INS navigation experiment for Space Transfer Vehicle
NASA Technical Reports Server (NTRS)
Upadhyay, Triveni N.; Cotterill, Stephen; Deaton, A. W.
1993-01-01
An experiment to validate the concept of developing an autonomous integrated spacecraft navigation system using on board Global Positioning System (GPS) and Inertial Navigation System (INS) measurements is described. The feasibility of integrating GPS measurements with INS measurements to provide a total improvement in spacecraft navigation performance, i.e. improvement in position, velocity and attitude information, was previously demonstrated. An important aspect of this research is the automatic real time reconfiguration capability of the system designed to respond to changes in a spacecraft mission under the control of an expert system.
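The GPS/INS integration concept described above, with the INS propagating the state at high rate and GPS fixes correcting the accumulated drift, can be reduced to a minimal 1-D loosely coupled Kalman filter. The state model, noise levels, and sampling scheme below are illustrative assumptions, not the flight experiment's actual filter design:

```python
import numpy as np

def gps_ins_fuse(accel, gps_pos, dt, q=0.1, r=4.0):
    """1-D loosely coupled GPS/INS fusion: integrate INS acceleration at
    every step (predict), correct position/velocity whenever a GPS fix is
    available (update). gps_pos entries may be None between fixes."""
    x = np.zeros(2)                       # [position, velocity]
    P = np.eye(2) * 100.0
    A = np.array([[1.0, dt], [0.0, 1.0]])
    B = np.array([0.5 * dt**2, dt])
    H = np.array([[1.0, 0.0]])
    Q = q * np.eye(2)
    R = np.array([[r]])
    out = []
    for a, z in zip(accel, gps_pos):
        x = A @ x + B * a                 # INS mechanization (predict)
        P = A @ P @ A.T + Q
        if z is not None:                 # GPS fix available (update)
            K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)
            x = x + K @ (np.array([z]) - H @ x)
            P = (np.eye(2) - K @ H) @ P
        out.append(x.copy())
    return np.array(out)
```

Between fixes the covariance grows, so each GPS update is weighted heavily; this is the drift-bounding behavior that motivates the integrated navigation concept.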
Autonomous GPS/INS navigation experiment for Space Transfer Vehicle (STV)
NASA Technical Reports Server (NTRS)
Upadhyay, Triveni N.; Cotterill, Stephen; Deaton, A. Wayne
1991-01-01
An experiment to validate the concept of developing an autonomous integrated spacecraft navigation system using on board Global Positioning System (GPS) and Inertial Navigation System (INS) measurements is described. The feasibility of integrating GPS measurements with INS measurements to provide a total improvement in spacecraft navigation performance, i.e. improvement in position, velocity and attitude information, was previously demonstrated. An important aspect of this research is the automatic real time reconfiguration capability of the system designed to respond to changes in a spacecraft mission under the control of an expert system.
Disbergen, Niels R.; Valente, Giancarlo; Formisano, Elia; Zatorre, Robert J.
2018-01-01
Polyphonic music listening well exemplifies processes typically involved in daily auditory scene analysis situations, relying on an interactive interplay between bottom-up and top-down processes. Most studies investigating scene analysis have used elementary auditory scenes, however real-world scene analysis is far more complex. In particular, music, contrary to most other natural auditory scenes, can be perceived by either integrating or, under attentive control, segregating sound streams, often carried by different instruments. One of the prominent bottom-up cues contributing to multi-instrument music perception is their timbre difference. In this work, we introduce and validate a novel paradigm designed to investigate, within naturalistic musical auditory scenes, attentive modulation as well as its interaction with bottom-up processes. Two psychophysical experiments are described, employing custom-composed two-voice polyphonic music pieces within a framework implementing a behavioral performance metric to validate listener instructions requiring either integration or segregation of scene elements. In Experiment 1, the listeners' locus of attention was switched between individual instruments or the aggregate (i.e., both instruments together), via a task requiring the detection of temporal modulations (i.e., triplets) incorporated within or across instruments. Subjects responded post-stimulus whether triplets were present in the to-be-attended instrument(s). Experiment 2 introduced the bottom-up manipulation by adding a three-level morphing of instrument timbre distance to the attentional framework. The task was designed to be used within neuroimaging paradigms; Experiment 2 was additionally validated behaviorally in the functional Magnetic Resonance Imaging (fMRI) environment. Experiment 1 subjects (N = 29, non-musicians) completed the task at high levels of accuracy, showing no group differences between any experimental conditions. 
Nineteen listeners also participated in Experiment 2, which showed a main effect of instrument timbre distance, although within-attention-condition timbre-distance contrasts did not demonstrate any timbre effect. Correlation of overall scores with morph-distance effects, computed by subtracting the largest from the smallest timbre-distance scores, showed an influence of general task difficulty on the timbre-distance effect. Comparison of laboratory and fMRI data showed that scanner noise had no adverse effect on task performance. These experimental paradigms enable the study of both bottom-up and top-down contributions to auditory stream segregation and integration within psychophysical and neuroimaging experiments. PMID:29563861
Thermodynamically optimal whole-genome tiling microarray design and validation.
Cho, Hyejin; Chou, Hui-Hsien
2016-06-13
Microarrays are an efficient apparatus for interrogating the whole transcriptome of a species. A microarray can be designed according to an annotated gene set, but the resulting microarray cannot be used to identify novel transcripts, and this design method is not applicable to unannotated species. Alternatively, a whole-genome tiling microarray can be designed using only genomic sequences, without gene annotations, and it can be used to detect novel RNA transcripts as well as known genes. The difficulty with tiling microarray design lies in the tradeoff between probe specificity and coverage of the genome. Sequence comparison methods based on BLAST or similar software are commonly employed in microarray design, but they cannot precisely determine the subtle thermodynamic competition between probe targets and partially matched probe nontargets during hybridization. Using the whole-genome thermodynamic analysis software PICKY to design tiling microarrays, we can achieve the maximum whole-genome coverage allowable under the thermodynamic constraints of each target genome. The resulting tiling microarrays are thermodynamically optimal in the sense that all selected probes share the same melting temperature separation range between their targets and closest nontargets, and no additional probes can be added without violating the specificity of the microarray to the target genome. This new design method was used to create two whole-genome tiling microarrays for Escherichia coli MG1655 and Agrobacterium tumefaciens C58, and the experimental results validated the design.
Design of Low Complexity Model Reference Adaptive Controllers
NASA Technical Reports Server (NTRS)
Hanson, Curt; Schaefer, Jacob; Johnson, Marcus; Nguyen, Nhan
2012-01-01
Flight research experiments have demonstrated that adaptive flight controls can be an effective technology for improving aircraft safety in the event of failures or damage. However, the nonlinear, time-varying nature of adaptive algorithms continues to challenge traditional methods for the verification and validation testing of safety-critical flight control systems. Increasingly complex adaptive control theories and designs are emerging, but they only make these testing challenges more difficult. A potential first step toward the acceptance of adaptive flight controllers by aircraft manufacturers, operators, and certification authorities is a very simple design that operates as an augmentation to a non-adaptive baseline controller. Three such controllers were developed as part of a National Aeronautics and Space Administration flight research experiment to determine the appropriate level of complexity required to restore acceptable handling qualities to an aircraft that has suffered failures or damage. The controllers consist of the same basic design but incorporate incrementally increasing levels of complexity. Derivations of the controllers and their adaptive parameter update laws are presented, along with details of the controllers' implementations.
ERIC Educational Resources Information Center
Holzmann, Vered; Mischari, Shoshana; Goldberg, Shoshana; Ziv, Amitai
2012-01-01
Purpose: This article aims to present a unique systematic and validated method for creating a linkage between past experiences and management of future occurrences in an organization. Design/methodology/approach: The study is based on actual data accumulated in a series of projects performed in a major medical center. Qualitative and quantitative…
ERIC Educational Resources Information Center
Museus, Samuel D.; Zhang, Duan; Kim, Mee Joo
2016-01-01
The purpose of the current examination was to develop a scale to measure campus environments and their impact on the experiences and outcomes of diverse student populations. The Culturally Engaging Campus Environments (CECE) Scale was designed to measure the nine elements of college environments that foster success among diverse populations.…
ERIC Educational Resources Information Center
Thomas, Gregory P; Meldrum, Al; Beamish, John
2013-01-01
First-year undergraduate physics laboratories are important physics learning environments. However, there is a lack of empirically informed literature regarding how students perceive their overall laboratory learning experiences. Recipe formats persist as the dominant form of instructional design in these sites, and these formats do not adequately…
Markov Chains For Testing Redundant Software
NASA Technical Reports Server (NTRS)
White, Allan L.; Sjogren, Jon A.
1990-01-01
Preliminary design developed for validation experiment that addresses problems unique to assuring extremely high quality of multiple-version programs in process-control software. Approach takes into account inertia of controlled system in sense it takes more than one failure of control program to cause controlled system to fail. Verification procedure consists of two steps: experimentation (numerical simulation) and computation, with Markov model for each step.
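The "inertia" idea above — that more than one control-program failure is needed before the controlled system fails — can be sketched as a small Markov-chain computation. This is a hypothetical illustration, not the report's actual model: it assumes the system fails only after two consecutive program failures, with a made-up per-step failure probability `p`.

```python
# Minimal sketch (not the report's model): a Markov chain in which the
# controlled system fails only after two consecutive control-program failures.
# States: 0 = no recent failure, 1 = one consecutive failure, 2 = system failed.

def system_failure_prob(p, n_steps):
    """Probability the system has failed within n_steps, given per-step
    program-failure probability p (illustrative value)."""
    s0, s1, s2 = 1.0, 0.0, 0.0               # start with no failures
    for _ in range(n_steps):
        # a success from state 0 or 1 resets to 0; a failure advances the state
        s0, s1, s2 = (s0 + s1) * (1 - p), s0 * p, s2 + s1 * p
    return s2

# Inertia effect: two consecutive failures are far rarer than a single one.
print(system_failure_prob(p=1e-4, n_steps=10_000))   # two-failure criterion
print(1 - (1 - 1e-4) ** 10_000)                      # single-failure criterion
```

The comparison at the end shows why the two-step failure criterion dramatically lowers the predicted system-failure probability for the same per-step program reliability.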
A Positive Model for Reducing and Preventing School Burnout in High School Students
ERIC Educational Resources Information Center
Aypay, Ayse
2017-01-01
This study aims to develop and test the validity of a model limited to attitude towards the future and subjective well-being for reducing and preventing the school burnout that high school students can experience. The study is designed as a relational screening model conducted over 389 high school students. The data in this study are analyzed…
Shavers, M R; Cucinotta, F A; Miller, J; Zeitlin, C; Heilbronn, L; Wilson, J W; Singleterry, R C
2001-01-01
Radiological assessment of the many cosmic ion species of widely distributed energies requires the use of theoretical transport models to accurately describe diverse physical processes related to nuclear reactions in spacecraft structures, planetary atmospheres and surfaces, and tissues. Heavy-ion transport models that were designed to characterize shielded radiation fields have been validated through comparison with data from thick-target irradiation experiments at particle accelerators. With the RTD Mission comes a unique opportunity to validate existing radiation transport models and guide the development of tools for shield design. For the first time, transport properties will be measured in free-space to characterize the shielding effectiveness of materials that are likely to be aboard interplanetary space missions. Target materials composed of aluminum, advanced composite spacecraft structure and other shielding materials, helium (a propellant) and tissue equivalent matrices will be evaluated. Large solid state detectors will provide kinetic energy and charge identification for incident heavy-ions and for secondary ions created in the target material. Transport calculations using the HZETRN model suggest that 8 g cm⁻² thick targets would be adequate to evaluate the shielding effectiveness during solar minimum activity conditions for a period of 30 days or more.
Schomann, Carsten; Giebel, Ole; Nachreiner, Friedhelm
2006-01-01
BASS 4, a computer program for the design and evaluation of working hours, is an example of an ergonomics-based software tool that can be used by safety practitioners on the shop floor with regard to legal, ergonomic, and economic criteria. Based on experiences with this computer program, a less sophisticated Working-Hours-Risk Index for assessing the quality of work schedules (including flexible work hours) to indicate risks to health and wellbeing has been developed to provide a quick and easily applicable tool for legally required risk assessments. The results of a validation study show that this risk index seems to be a promising indicator for predicting risks to health complaints and wellbeing. The purpose of the Risk Index is to simplify the evaluation process on the shop floor and provide some more general information about the quality of a work schedule that can be used for triggering preventive interventions. Such a risk index complies with practitioners' expectations and requests for easy, useful, and valid instruments.
NASA Astrophysics Data System (ADS)
Engel, Dave W.; Reichardt, Thomas A.; Kulp, Thomas J.; Graff, David L.; Thompson, Sandra E.
2016-05-01
Validating predictive models and quantifying uncertainties inherent in the modeling process is a critical component of the HARD Solids Venture program [1]. Our current research focuses on validating physics-based models predicting the optical properties of solid materials for arbitrary surface morphologies and characterizing the uncertainties in these models. We employ a systematic and hierarchical approach by designing physical experiments and comparing the experimental results with the outputs of computational predictive models. We illustrate this approach through an example comparing a micro-scale forward model to an idealized solid-material system and then propagating the results through a system model to the sensor level. Our efforts should enhance detection reliability of the hyper-spectral imaging technique and the confidence in model utilization and model outputs by users and stakeholders.
Agent-Based vs. Equation-based Epidemiological Models:A Model Selection Case Study
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sukumar, Sreenivas R; Nutaro, James J
This paper is motivated by the need to design model validation strategies for epidemiological disease-spread models. We consider both agent-based and equation-based models of pandemic disease spread and study the nuances and complexities one has to consider from the perspective of model validation. For this purpose, we instantiate an equation-based model and an agent-based model of the 1918 Spanish flu, and we leverage data published in the literature for our case study. We present our observations from the perspective of each implementation and discuss the application of model-selection criteria to compare the risk in choosing one modeling paradigm over another. We conclude with a discussion of our experience and document future ideas for a model validation framework.
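The equation-based side of such a comparison is typically a compartment model. Below is a minimal sketch of a classic SIR model integrated with forward Euler; the parameter values (beta, gamma, population, initial infections) are illustrative assumptions, not the paper's 1918-flu calibration.

```python
# Hedged sketch of an equation-based epidemic model: SIR compartments
# integrated with forward Euler. All parameter values are illustrative.

def sir(beta=0.4, gamma=0.2, n=1_000_000, i0=10, days=300, dt=0.1):
    """Return the (S, I, R) trajectory for a well-mixed population of size n."""
    s, i, r = n - i0, float(i0), 0.0
    history = []
    for _ in range(int(days / dt)):
        new_inf = beta * s * i / n * dt   # S -> I flow this step
        new_rec = gamma * i * dt          # I -> R flow this step
        s, i, r = s - new_inf, i + new_inf - new_rec, r + new_rec
        history.append((s, i, r))
    return history

hist = sir()
peak_infected = max(i for _, i, _ in hist)
attack_rate = hist[-1][2] / 1_000_000     # fraction ultimately infected
print(peak_infected, attack_rate)
```

An agent-based counterpart would replace the two flow terms with per-individual stochastic contact and recovery events, which is exactly where the validation nuances discussed in the abstract arise.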
NASA Astrophysics Data System (ADS)
Wagner, David R.; Holmgren, Per; Skoglund, Nils; Broström, Markus
2018-06-01
The design and validation of a newly commissioned entrained flow reactor is described in the present paper. The reactor was designed for advanced studies of fuel conversion and ash formation in powder flames, and the capabilities of the reactor were experimentally validated using two different solid biomass fuels. The drop tube geometry was equipped with a flat flame burner to heat and support the powder flame, optical access ports, a particle image velocimetry (PIV) system for in situ conversion monitoring, and probes for extraction of gases and particulate matter. A detailed description of the system is provided based on simulations and measurements, establishing the detailed temperature distribution and gas flow profiles. Mass balance closures of approximately 98% were achieved by combining gas analysis and particle extraction. Biomass fuel particles were successfully tracked using shadow-imaging PIV, and the resulting data were used to determine the size, shape, velocity, and residence time of converting particles. Successful extractive sampling of coarse and fine particles during combustion, while retaining their morphology, was demonstrated, opening the way for detailed time-resolved studies of rapid ash transformation reactions; in the validation experiments, clear and systematic fractionation trends for K, Cl, S, and Si were observed for the two fuels tested. The combination of in situ access, accurate residence time estimations, and precise particle sampling for subsequent chemical analysis allows for a wide range of future studies, with implications and possibilities discussed in the paper.
Practical input optimization for aircraft parameter estimation experiments. Ph.D. Thesis, 1990
NASA Technical Reports Server (NTRS)
Morelli, Eugene A.
1993-01-01
The object of this research was to develop an algorithm for the design of practical, optimal flight test inputs for aircraft parameter estimation experiments. A general, single pass technique was developed which allows global optimization of the flight test input design for parameter estimation using the principles of dynamic programming with the input forms limited to square waves only. Provision was made for practical constraints on the input, including amplitude constraints, control system dynamics, and selected input frequency range exclusions. In addition, the input design was accomplished while imposing output amplitude constraints required by model validity and considerations of safety during the flight test. The algorithm has multiple input design capability, with optional inclusion of a constraint that only one control move be made at a time, so that a human pilot can implement the inputs. It is shown that the technique can be used to design experiments for estimation of open loop model parameters from closed loop flight test data. The report includes a new formulation of the optimal input design problem, a description of a new approach to the solution, and a summary of the characteristics of the algorithm, followed by three example applications of the new technique which demonstrate the quality and expanded capabilities of the input designs produced by the new technique. In all cases, the new input design approach showed significant improvement over previous input design methods in terms of achievable parameter accuracies.
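The flavor of the square-wave input-design problem can be illustrated with a brute-force stand-in for the dynamic-programming search: enumerate candidate square-wave patterns for a first-order model and keep the one maximizing a crude Fisher-information measure. The model, horizon, amplitude constraint, and information criterion below are assumptions for the sketch, not Morelli's actual formulation.

```python
# Illustrative brute-force version of square-wave input design (the thesis
# uses dynamic programming): pick the +/-1 segment pattern that maximizes the
# Fisher information for parameter 'a' of the first-order model  x' = a*x + b*u.
import itertools

def fisher_info(segments, a=-1.0, b=1.0, dt=0.05, seg_len=20):
    """Sum of squared output sensitivities dx/da along the trajectory."""
    x, xa, info = 0.0, 0.0, 0.0        # xa = dx/da (sensitivity state)
    for u in segments:                 # one square-wave level per segment
        for _ in range(seg_len):
            x_dot = a * x + b * u
            xa_dot = x + a * xa        # d/da of the state equation
            x, xa = x + dt * x_dot, xa + dt * xa_dot
            info += xa * xa
    return info

K = 6                                  # number of square-wave segments
best = max(itertools.product((-1.0, 1.0), repeat=K), key=fisher_info)
print("best square-wave pattern:", best)
```

Dynamic programming makes the same search tractable for realistic horizons by exploiting the stage-wise structure instead of enumerating all 2^K patterns.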
Intraindividual differences in executive functions during childhood: the role of emotions.
Pnevmatikos, Dimitris; Trikkaliotis, Ioannis
2013-06-01
Intraindividual differences in executive functions (EFs) have rarely been investigated. In this study, we addressed the question of whether the emotional fluctuations that schoolchildren experience in their classroom settings could generate substantial intraindividual differences in their EFs and, more specifically, in the fundamental unifying component of EFs, their inhibition function. We designed an ecologically valid experimental study within the school setting, involving schoolchildren of three age groups (8-, 10-, and 12-year-olds). We conducted three experiments. In Experiment 1, using a between-participants design, we isolated a classroom episode that, compared with the other episodes, generated significant differences in inhibitory function in a consequent Go/NoGo task. This was an episode that induced frustration after the experience of anxiety due to uncertainty. Experiment 2, using a within-participants design, confirmed both the induced emotions from the episode and the intraindividual variability in schoolchildren's inhibition accuracy in the consequent Go/NoGo task. Experiment 3, again using a within-participants design, examined whether the same episode could generate intraindividual differences in a more demanding inhibition task, namely the anti-saccade task. The experiment confirmed the previous evidence; the episode generated high variability that in some age groups accounted for more than 1.5 standard deviations from the interindividual variability between the schoolchildren of the same age. Results showed that, regardless of their sex and the developmental progression in their inhibition with age, the variability induced within participants by the experienced frustration was very high compared with the interindividual variability of the same age group. Copyright © 2013 Elsevier Inc. All rights reserved.
Thermal Analysis of a Metallic Wing Glove for a Mach-8 Boundary-Layer Experiment
NASA Technical Reports Server (NTRS)
Gong, Leslie; Richards, W. Lance
1998-01-01
A metallic 'glove' structure has been built and attached to the wing of the Pegasus(trademark) space booster. An experiment on the upper surface of the glove has been designed to help validate boundary-layer stability codes in a free-flight environment. Three-dimensional thermal analyses have been performed to ensure that the glove structure design would be within allowable temperature limits in the experiment test section of the upper skin of the glove. Temperature results obtained from the design-case analysis show a peak temperature at the leading edge of 490 F. For the upper surface of the glove, approximately 3 in. back from the leading edge, temperature calculations indicate transition occurs at approximately 45 sec into the flight profile. A worst-case heating analysis has also been performed to ensure that the glove structure would not have any detrimental effects on the primary objective of the Pegasus launch. A peak temperature of 805 F has been calculated on the leading edge of the glove structure. The temperatures predicted from the design case are well within the temperature limits of the glove structure, and the worst-case heating analysis temperature results are acceptable for the mission objectives.
Fundamental Mixing and Combustion Experiments for Propelled Hypersonic Flight
NASA Technical Reports Server (NTRS)
Cutler, A. D.; Diskin, G. S.; Danehy, P. M.; Drummond, J. P.
2002-01-01
Two experiments have been conducted to acquire data for the validation of computational fluid dynamics (CFD) codes used in the design of supersonic combustors. The first experiment is a study of a supersonic coaxial jet into stagnant air in which the center jet is of a light gas, the coflow jet is of air, and the mixing layer between them is compressible. The jet flow field is characterized using schlieren imaging, surveys with Pitot, total temperature and gas sampling probes, and RELIEF velocimetry. VULCAN, a structured grid CFD code, is used to solve for the nozzle and jet flow. The second experiment is a study of a supersonic combustor consisting of a diverging duct with a single downstream-angled wall injector. Entrance Mach number is 2 and enthalpy is nominally that of Mach 7 flight. Coherent anti-Stokes Raman spectroscopy (CARS) has been used to obtain nitrogen temperature in planes of the flow, and surface pressures and temperatures have also been acquired. Modern-design-of-experiment techniques have been used to maximize the quality of the data set.
The Rapid Response Radiation Survey (R3S) Mission Using the HISat Conformal Satellite Architecture
NASA Technical Reports Server (NTRS)
Miller, Nathanael
2015-01-01
The Rapid Response Radiation Survey (R3S) experiment, designed as a quick turnaround mission to make radiation measurements in LEO, will fly as a hosted payload in partnership with NovaWurks using their Hyper-integrated Satlet (HiSat) architecture. The need for the mission arises as the Nowcast of Atmospheric Ionization Radiation for Aviation Safety (NAIRAS) model moves from a research effort into an operational radiation assessment tool. The data collected by R3S, in addition to the complementary data from a NASA Langley Research Center (LaRC) atmospheric balloon mission entitled Radiation Dosimetry Experiment (RaDX), will validate exposure prediction capabilities of NAIRAS. This paper discusses the development of the R3S experiment as made possible by use of the HiSat architecture. The system design and operational modes of the experiment are described, as well as the experiment interfaces to the HiSat satellite via the user defined adapter (UDA) provided by NovaWurks. This paper outlines the steps taken by the project to execute the R3S mission in the 4 months of design, build, and test. Finally, description of the engineering process is provided, including the use of facilitated rapid/concurrent engineering sessions, the associated documentation, and the review process employed.
Quasi-experimental study designs series-paper 4: uses and value.
Bärnighausen, Till; Tugwell, Peter; Røttingen, John-Arne; Shemilt, Ian; Rockers, Peter; Geldsetzer, Pascal; Lavis, John; Grimshaw, Jeremy; Daniels, Karen; Brown, Annette; Bor, Jacob; Tanner, Jeffery; Rashidian, Arash; Barreto, Mauricio; Vollmer, Sebastian; Atun, Rifat
2017-09-01
Quasi-experimental studies are increasingly used to establish causal relationships in epidemiology and health systems research. Quasi-experimental studies offer important opportunities to increase and improve evidence on causal effects: (1) they can generate causal evidence when randomized controlled trials are impossible; (2) they typically generate causal evidence with a high degree of external validity; (3) they avoid the threats to internal validity that arise when participants in nonblinded experiments change their behavior in response to the experimental assignment to either intervention or control arm (such as compensatory rivalry or resentful demoralization); (4) they are often well suited to generate causal evidence on long-term health outcomes of an intervention, as well as nonhealth outcomes such as economic and social consequences; and (5) they can often generate evidence faster and at lower cost than experiments and other intervention studies. Copyright © 2017 Elsevier Inc. All rights reserved.
42 CFR 71.3 - Designation of yellow fever vaccination centers; Validation stamps.
Code of Federal Regulations, 2012 CFR
2012-10-01
...; Validation stamps. 71.3 Section 71.3 Public Health PUBLIC HEALTH SERVICE, DEPARTMENT OF HEALTH AND HUMAN... Designation of yellow fever vaccination centers; Validation stamps. (a) Designation of yellow fever... health department, may revoke designation. (b) Validation stamps. International Certificates of...
Thermal design of the IMP-I and H spacecraft
NASA Technical Reports Server (NTRS)
Hoffman, R. H.
1974-01-01
A description of the thermal subsystem of the IMP-I and H spacecraft is presented. These two spacecraft were of a larger and more advanced type in the Explorer series and were successfully launched in March 1971 and September 1972. The thermal requirements, analysis, and design of each spacecraft are described including several specific designs for individual experiments. Techniques for obtaining varying degrees of thermal isolation and contact are presented. The thermal control coatings including the spaceflight performance of silver-coated FEP Teflon are discussed. Predicted performance is compared to measured flight data. The good agreement between them verifies the validity of the thermal model and the selection of coatings.
Flux-Level Transit Injection Experiments with NASA Pleiades Supercomputer
NASA Astrophysics Data System (ADS)
Li, Jie; Burke, Christopher J.; Catanzarite, Joseph; Seader, Shawn; Haas, Michael R.; Batalha, Natalie; Henze, Christopher; Christiansen, Jessie; Kepler Project, NASA Advanced Supercomputing Division
2016-06-01
Flux-Level Transit Injection (FLTI) experiments are executed with NASA's Pleiades supercomputer for the Kepler Mission. The latest release (9.3, January 2016) of the Kepler Science Operations Center Pipeline is used in the FLTI experiments. Their purpose is to validate the Analytic Completeness Model (ACM), which can be computed for all Kepler target stars, thereby enabling exoplanet occurrence rate studies. Pleiades, a facility of NASA's Advanced Supercomputing Division, is one of the world's most powerful supercomputers and represents NASA's state-of-the-art technology. We discuss the details of implementing the FLTI experiments on the Pleiades supercomputer. For example, taking into account that ~16 injections are generated by one core of the Pleiades processors in an hour, the “shallow” FLTI experiment, in which ~2000 injections are required per target star, can be done for 16% of all Kepler target stars in about 200 hours. Stripping down the transit search to bare bones, i.e. only searching adjacent high/low periods at high/low pulse durations, makes the computationally intensive FLTI experiments affordable. The design of the FLTI experiments and the analysis of the resulting data are presented in “Validating an Analytic Completeness Model for Kepler Target Stars Based on Flux-level Transit Injection Experiments” by Catanzarite et al. (#2494058). Kepler was selected as the 10th mission of the Discovery Program. Funding for the Kepler Mission has been provided by the NASA Science Mission Directorate.
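The quoted throughput figures can be checked with back-of-envelope arithmetic. The ~16 injections per core-hour, ~2000 injections per star, 16% coverage, and ~200-hour duration come from the abstract; the total Kepler target-star count assumed below (~198,000) is an assumption for the sketch.

```python
# Back-of-envelope check of the "shallow" FLTI throughput figures.
injections_per_core_hour = 16       # from the abstract
injections_per_star = 2_000         # from the abstract
core_hours_per_star = injections_per_star / injections_per_core_hour  # = 125

n_targets = 198_000                 # assumed Kepler target-star count
stars_covered = int(0.16 * n_targets)   # "16% of all Kepler target stars"
wall_clock_hours = 200              # quoted experiment duration

# cores needed to finish the covered stars in the quoted wall-clock time
cores_needed = stars_covered * core_hours_per_star / wall_clock_hours
print(f"{core_hours_per_star:.0f} core-hours/star, ~{cores_needed:,.0f} cores")
```

The resulting core count is a small fraction of Pleiades' capacity, consistent with the abstract's claim that the stripped-down search makes the experiment affordable.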
Factors that influence the tribocharging of pulverulent materials in compressed-air devices
NASA Astrophysics Data System (ADS)
Das, S.; Medles, K.; Mihalcioiu, A.; Beleca, R.; Dragan, C.; Dascalescu, L.
2008-12-01
Tribocharging of pulverulent materials in compressed-air devices is a typical multi-factorial process. This paper aims at demonstrating the value of using the design-of-experiments methodology, in association with virtual instrumentation, for quantifying the effects of various process variables and of their interactions, as a prerequisite for the development of new tribocharging devices for industrial applications. The study is focused on the tribocharging of PVC powders in compressed-air devices similar to those employed in electrostatic painting. A classical 2³ full-factorial design (three factors at two levels) was employed for conducting the experiments. The response function was the charge/mass ratio of the material collected in a modified Faraday cage at the exit of the tribocharging device. The charge/mass ratio was found to increase with the injection pressure and the vortex pressure in the tribocharging device, and to decrease with increasing feed rate. In the present study, an in-house design-of-experiments software package was employed for statistical analysis of experimental data and validation of the experimental model.
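The main-effects analysis behind such a two-level, three-factor full-factorial design can be sketched in a few lines. The response values below are fabricated charge/mass ratios chosen only to mimic the reported trends (charge rising with injection and vortex pressure, falling with feed rate); they are not the paper's data.

```python
# Minimal sketch of main-effect estimation for a 2^3 full-factorial design.
# Responses are made-up charge/mass ratios mimicking the reported trends.
import itertools

factors = ("injection_p", "vortex_p", "feed_rate")
runs = list(itertools.product((-1, +1), repeat=3))   # 8 coded runs
# fabricated responses, one per run, in itertools.product order
y = [0.8, 0.4, 1.6, 1.2, 1.6, 1.2, 2.4, 2.0]

def main_effect(j):
    """Average response at level +1 minus average at level -1 for factor j."""
    hi = [yi for run, yi in zip(runs, y) if run[j] == +1]
    lo = [yi for run, yi in zip(runs, y) if run[j] == -1]
    return sum(hi) / len(hi) - sum(lo) / len(lo)

for j, name in enumerate(factors):
    print(f"{name}: {main_effect(j):+.2f}")
```

With real data, the same contrast computation (plus interaction columns) is what a design-of-experiments package reports for each factor.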
Grant, Yitzchak; Matejtschuk, Paul; Bird, Christopher; Wadhwa, Meenu; Dalby, Paul A
2012-04-01
The lyophilization of proteins in microplates, to assess and optimise formulations rapidly, has been applied for the first time to a therapeutic protein and, in particular, one that requires a cell-based biological assay, in order to demonstrate the broader usefulness of the approach. Factorial design of experiment methods were combined with lyophilization in microplates to identify optimum formulations that stabilised granulocyte colony-stimulating factor during freeze drying. An initial screen rapidly identified key excipients and potential interactions, which was then followed by a central composite face designed optimisation experiment. Human serum albumin and Tween 20 had significant effects on maintaining protein stability. As previously, the optimum formulation was then freeze-dried in stoppered vials to verify that the microscale data is relevant to pilot scales. However, to validate the approach further, the selected formulation was also assessed for solid-state shelf-life through the use of accelerated stability studies. This approach allows for a high-throughput assessment of excipient options early on in product development, while also reducing costs in terms of time and quantity of materials required.
Development and Validation of a Supersonic Helium-Air Coannular Jet Facility
NASA Technical Reports Server (NTRS)
Carty, Atherton A.; Cutler, Andrew D.
1999-01-01
Data are acquired in a simple coannular He/air supersonic jet suitable for validation of CFD (Computational Fluid Dynamics) codes for high speed propulsion. Helium is employed as a non-reacting hydrogen fuel simulant, constituting the core of the coannular flow while the coflow is composed of air. The mixing layer interface between the two flows in the near field and the plume region which develops further downstream constitute the primary regions of interest, similar to those present in all hypersonic air breathing propulsion systems. A computational code has been implemented from the experiment's inception, serving as a tool for model design during the development phase.
NASA Technical Reports Server (NTRS)
Dominick, Wayne D. (Editor); Farooq, Mohammad U.
1986-01-01
The definition of proposed research addressing the development and validation of a methodology for the design and evaluation of user interfaces for interactive information systems is given. The major objectives of this research are: the development of a comprehensive, objective, and generalizable methodology for the design and evaluation of user interfaces for information systems; the development of equations and/or analytical models to characterize user behavior and the performance of a designed interface; the design of a prototype system for the development and administration of user interfaces; and the design and use of controlled experiments to support the research and test/validate the proposed methodology. The proposed design methodology views the user interface as a virtual machine composed of three layers: an interactive layer, a dialogue manager layer, and an application interface layer. A command language model of user system interactions is presented because of its inherent simplicity and structured approach based on interaction events. All interaction events have a common structure based on common generic elements necessary for a successful dialogue. It is shown that, using this model, various types of interfaces could be designed and implemented to accommodate various categories of users. The implementation methodology is discussed in terms of how to store and organize the information.
NASA Astrophysics Data System (ADS)
Catanzarite, Joseph; Burke, Christopher J.; Li, Jie; Seader, Shawn; Haas, Michael R.; Batalha, Natalie; Henze, Christopher; Christiansen, Jessie; Kepler Project, NASA Advanced Supercomputing Division
2016-06-01
The Kepler Mission is developing an Analytic Completeness Model (ACM) to estimate detection completeness contours as a function of exoplanet radius and period for each target star. Accurate completeness contours are necessary for robust estimation of exoplanet occurrence rates.The main components of the ACM for a target star are: detection efficiency as a function of SNR, the window function (WF) and the one-sigma depth function (OSDF). (Ref. Burke et al. 2015). The WF captures the falloff in transit detection probability at long periods that is determined by the observation window (the duration over which the target star has been observed). The OSDF is the transit depth (in parts per million) that yields SNR of unity for the full transit train. It is a function of period, and accounts for the time-varying properties of the noise and for missing or deweighted data.We are performing flux-level transit injection (FLTI) experiments on selected Kepler target stars with the goal of refining and validating the ACM. “Flux-level” injection machinery inserts exoplanet transit signatures directly into the flux time series, as opposed to “pixel-level” injection, which inserts transit signatures into the individual pixels using the pixel response function. See Jie Li's poster: ID #2493668, "Flux-level transit injection experiments with the NASA Pleiades Supercomputer" for details, including performance statistics.Since FLTI is affordable for only a small subset of the Kepler targets, the ACM is designed to apply to most Kepler target stars. We validate this model using “deep” FLTI experiments, with ~500,000 injection realizations on each of a small number of targets and “shallow” FLTI experiments with ~2000 injection realizations on each of many targets. 
From the results of these experiments, we identify anomalous targets, model their behavior, and refine the ACM accordingly. In this presentation, we discuss progress in validating and refining the ACM, and we compare our detection efficiency curves with those derived from the associated pixel-level transit injection experiments. Kepler was selected as the 10th mission of the Discovery Program. Funding for this mission is provided by NASA's Science Mission Directorate.
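The OSDF defined above admits a compact numerical sketch. Assuming white per-cadence noise and a box-shaped transit (simplifying assumptions not stated in the abstract), the depth giving a whole-train SNR of one is 1/sqrt(sum of 1/sigma_i^2) over the valid in-transit cadences. The function name and inputs below are illustrative, not the Kepler pipeline's API:

```python
import numpy as np

def one_sigma_depth_ppm(times, sigmas, valid, period, epoch, duration):
    """Transit depth (ppm) yielding SNR = 1 for the full transit train.

    Simplified sketch: white noise per cadence, box-shaped transit, and
    SNR = depth * sqrt(sum_i 1/sigma_i**2) over valid in-transit cadences
    (depth in fractional flux units). Missing or deweighted data enter
    through the `valid` mask, mirroring the OSDF's handling of data gaps.
    """
    # Phase-fold so mid-transit sits at phase 0.
    phase = ((times - epoch + 0.5 * period) % period) - 0.5 * period
    use = (np.abs(phase) < 0.5 * duration) & valid
    if not use.any():
        return np.inf  # no usable in-transit cadence at this period
    return 1e6 / np.sqrt(np.sum(sigmas[use] ** -2.0))

# Example: 10 days of 0.1-day cadence, uniform 1000-ppm per-cadence noise.
t = np.arange(100) * 0.1
osdf = one_sigma_depth_ppm(t, np.full(100, 1e-3), np.ones(100, bool),
                           period=2.0, epoch=0.0, duration=0.25)
```

Because the OSDF shrinks as more in-transit cadences accumulate, it falls with shorter periods and longer observation baselines, which is the behavior the ACM exploits.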
Enhanced Oceanic Operations Human-In-The-Loop In-Trail Procedure Validation Simulation Study
NASA Technical Reports Server (NTRS)
Murdoch, Jennifer L.; Bussink, Frank J. L.; Chamberlain, James P.; Chartrand, Ryan C.; Palmer, Michael T.; Palmer, Susan O.
2008-01-01
The Enhanced Oceanic Operations Human-In-The-Loop In-Trail Procedure (ITP) Validation Simulation Study investigated the viability of an ITP designed to enable oceanic flight level changes that would not otherwise be possible. Twelve commercial airline pilots with current oceanic experience flew a series of simulated scenarios involving either standard or ITP flight level change maneuvers and provided subjective workload ratings, assessments of ITP validity and acceptability, and objective performance measures associated with the appropriate selection, request, and execution of ITP flight level change maneuvers. In the majority of scenarios, subject pilots correctly assessed the traffic situation, selected an appropriate response (i.e., either a standard flight level change request, an ITP request, or no request), and executed their selected flight level change procedure, if any, without error. Workload ratings for ITP maneuvers were acceptable and not substantially higher than for standard flight level change maneuvers, and, for the majority of scenarios and subject pilots, subjective acceptability ratings and comments for ITP were generally high and positive. Qualitatively, the ITP was found to be valid and acceptable. However, the error rates for ITP maneuvers were higher than for standard flight level changes, and these errors may have design implications for both the ITP and the study's prototype traffic display. These errors and their implications are discussed.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lai, Canhai; Xu, Zhijie; Pan, Wenxiao
2016-01-01
To quantify the predictive confidence of a solid sorbent-based carbon capture design, a hierarchical validation methodology, consisting of basic unit problems of increasing physical complexity coupled with filtered model-based geometric upscaling, has been developed and implemented. This paper describes the computational fluid dynamics (CFD) multi-phase reactive flow simulations and the associated data flows among the different unit problems performed within this hierarchical validation approach. The bench-top experiments used in this calibration and validation effort were carefully designed to follow the desired simple-to-complex unit-problem hierarchy, with corresponding data acquisition to support model parameter calibration at each unit-problem level. A Bayesian calibration procedure is employed, and the posterior model parameter distributions obtained at one unit-problem level are used as prior distributions for the same parameters in the next-tier simulations. Overall, the results demonstrate that the multiphase reactive flow models within MFIX can capture the bed pressure, temperature, CO2 capture capacity, and kinetics with quantitative accuracy. The CFD modeling methodology and associated uncertainty quantification techniques presented herein offer a solid framework for estimating the predictive confidence in the virtual scale-up of a larger carbon capture device.
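The posterior-to-prior handoff between unit-problem tiers can be illustrated with a minimal conjugate example. The single scalar parameter, normal likelihood with known noise variance, and all numbers below are illustrative stand-ins, not the MFIX calibration itself:

```python
import numpy as np

def normal_update(prior_mu, prior_var, data, noise_var):
    """Conjugate normal-normal update; returns posterior mean and variance."""
    post_var = 1.0 / (1.0 / prior_var + len(data) / noise_var)
    post_mu = post_var * (prior_mu / prior_var + np.sum(data) / noise_var)
    return post_mu, post_var

# Hierarchy: the posterior from unit problem k becomes the prior for k+1.
mu, var = 0.0, 10.0  # vague prior at the simplest (bench-top) tier
tiers = [np.array([1.10, 0.90, 1.00]),   # tier-1 measurements
         np.array([1.20, 1.05])]         # tier-2 (more complex) measurements
for data in tiers:
    mu, var = normal_update(mu, var, data, noise_var=0.04)
# (mu, var) now summarizes the parameter after both tiers.
```

Each tier tightens the posterior variance, so later, more complex simulations start from parameter distributions already constrained by the simpler experiments, which is the point of the hierarchy.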
CFD optimization of continuous stirred-tank (CSTR) reactor for biohydrogen production.
Ding, Jie; Wang, Xu; Zhou, Xue-Fei; Ren, Nan-Qi; Guo, Wan-Qian
2010-09-01
There has been little work on the optimal configuration of biohydrogen production reactors. This paper describes three-dimensional computational fluid dynamics (CFD) simulations of gas-liquid flow in a laboratory-scale continuous stirred-tank reactor used for biohydrogen production. To evaluate the role of hydrodynamics in reactor design and to optimize the reactor configuration, an optimized impeller design was constructed and validated with CFD simulations of both the normal and the optimized impeller over a range of speeds; the numerical results were also validated by examination of the residence time distribution. By integrating the CFD simulations with an ethanol-type fermentation experiment, it was shown that impellers of different types and speeds generated different flow patterns and hence offered different efficiencies for biohydrogen production. The hydrodynamic behavior of the optimized impeller at speeds between 50 and 70 rev/min is best suited for economical biohydrogen production.
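Residence time distribution (RTD) validation of the kind mentioned above rests on a standard calculation: normalize the tracer exit concentration curve into E(t) and compare its moments against the model. A minimal sketch, using the ideal-CSTR exponential curve purely as a test case:

```python
import numpy as np

def trapezoid(y, x):
    """Trapezoidal integration (avoids NumPy 1.x/2.x naming differences)."""
    return float(np.sum((y[1:] + y[:-1]) * np.diff(x)) / 2.0)

def rtd_moments(t, c):
    """Mean residence time and variance from a tracer concentration curve.

    E(t) = C(t) / integral(C dt);  t_mean = integral(t * E dt);
    variance = integral((t - t_mean)**2 * E dt).
    """
    e = c / trapezoid(c, t)           # normalize C(t) into the RTD E(t)
    t_mean = trapezoid(t * e, t)      # first moment: mean residence time
    return t_mean, trapezoid((t - t_mean) ** 2 * e, t)

# Ideal CSTR with mean residence time tau: E(t) = exp(-t/tau)/tau,
# so t_mean -> tau and variance -> tau**2.
tau = 5.0
t = np.linspace(0.0, 100.0, 20001)
t_mean, rtd_var = rtd_moments(t, np.exp(-t / tau))
```

Comparing measured moments against those of the CFD-predicted tracer curve gives a quantitative check on the simulated mixing, which is how RTD examination validates the flow field.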
A holistic approach to SIM platform and its application to early-warning satellite system
NASA Astrophysics Data System (ADS)
Sun, Fuyu; Zhou, Jianping; Xu, Zheyao
2018-01-01
This study proposes a new simulation platform, Simulation Integrated Management (SIM), for the analysis of parallel and distributed systems. The platform eases the process of designing and testing both applications and architectures. The main characteristics of SIM are flexibility, scalability, and expandability. To improve the efficiency of project development, new models of an early-warning satellite system were designed on the SIM platform. Finally, through a series of experiments, the correctness of the SIM platform and of the early-warning satellite models was validated, and systematic analyses of the orbit-determination precision for a ballistic missile over its entire flight, as well as the deviation of the launch and landing points, are presented. The causes of these deviations and methods for preventing them are also explained. The simulation platform and the models lay the foundation for further validation of autonomy technology in space attack-defense architecture research.
Outcome evaluation of a new model of critical care orientation.
Morris, Linda L; Pfeifer, Pamela; Catalano, Rene; Fortney, Robert; Nelson, Greta; Rabito, Robb; Harap, Rebecca
2009-05-01
The shortage of critical care nurses and the service expansion of 2 intensive care units provided a unique opportunity to create a new model of critical care orientation. The goal was to design a program that assessed critical thinking, validated competence, and provided learning pathways that accommodated diverse experience. To determine the effect of a new model of critical care orientation on satisfaction, retention, turnover, vacancy, preparedness to manage patient care assignment, length of orientation, and cost of orientation. A prospective, quasi-experimental design with both quantitative and qualitative methods. The new model improved satisfaction scores, retention rates, and recruitment of critical care nurses. Length of orientation was unchanged. Cost was increased, primarily because a full-time education consultant was added. A new model for nurse orientation that was focused on critical thinking and competence validation improved retention and satisfaction and serves as a template for orientation of nurses throughout the medical center.
Marshall Space Flight Center CFD overview
NASA Technical Reports Server (NTRS)
Schutzenhofer, Luke A.
1989-01-01
Computational Fluid Dynamics (CFD) activities at Marshall Space Flight Center (MSFC) have focused on hardware-specific and research applications, with strong emphasis on benchmark validation. The purpose here is to provide insight into MSFC's CFD-related goals, objectives, current hardware-related CFD activities, propulsion CFD research efforts and validation program, future near-term hardware-related CFD programs, and CFD expectations. The current hardware programs where CFD has been successfully applied are the Space Shuttle Main Engines (SSME), Alternate Turbopump Development (ATD), and Aeroassist Flight Experiment (AFE). For future near-term hardware-related CFD activities, plans are being developed to implement CFD in the early design stages of the Space Transportation Main Engine (STME), the Space Transportation Booster Engine (STBE), and the Environmental Control and Life Support System (ECLSS) for the Space Station. Finally, CFD expectations in the design environment are delineated.
Designing PISA-Like Mathematics Tasks In Indonesia: Experiences and Challenges
NASA Astrophysics Data System (ADS)
Zulkardi, Z.; Kohar, A. W.
2018-01-01
The insignificant improvement of Indonesian students in the PISA mathematics survey has prompted researchers in Indonesia to develop PISA-like mathematics tasks. Several development studies have been conducted to produce valid and practical PISA-like problems with the potential to improve students' mathematical literacy. This article describes the experiences of Indonesian task designers in developing PISA-like mathematics tasks, as well as potential future studies on mathematical literacy, as challenges for policy makers, researchers, and practitioners seeking to improve students' mathematical literacy in Indonesia. The results indicate that the task designers treat the PISA domains of context, mathematical content, and process as the first profile of their missions. Our analysis shows that the designers mostly experienced difficulties with the authenticity of the contexts used and with language structure. Interestingly, many of them drew on a variety of Indonesian local wisdom as contexts for designing PISA-like tasks. In addition, the products developed were reported to have potential effects on students' interest and to elicit the mathematical competencies mentioned in the PISA framework. Finally, this paper discusses future studies, such as issues in bringing PISA tasks into instructional practice.