Sample records for future experimental validation

  1. Achieving external validity in home advantage research: generalizing crowd noise effects

    PubMed Central

    Myers, Tony D.

    2014-01-01

    Different factors have been postulated to explain the home advantage phenomenon in sport. One plausible explanation investigated has been the influence of a partisan home crowd on sports officials' decisions. Different types of studies have tested the crowd influence hypothesis, including purposefully designed experiments. However, while experimental studies investigating crowd influences have high levels of internal validity, they suffer from a lack of external validity: decision-making in a laboratory setting bears little resemblance to decision-making in live sports settings. This focused review initially considers threats to external validity in applied and theoretical experimental research. It then discusses how such threats can be addressed using representative design, focusing on a recently published study that arguably provides the first experimental evidence of the impact of live crowd noise on officials in sport. The findings of this controlled experiment, conducted in a real tournament setting, offer a level of confirmation of the findings of laboratory studies in the area. Finally, directions for future research and the future conduct of crowd noise studies are discussed. PMID:24917839

  2. Experimental aeroelasticity history, status and future in brief

    NASA Technical Reports Server (NTRS)

    Ricketts, Rodney H.

    1990-01-01

    NASA conducts wind tunnel experiments to determine and understand the aeroelastic characteristics of new and advanced flight vehicles, including fixed-wing, rotary-wing, and space-launch configurations. A review and assessment is made of the state of the art in experimental aeroelasticity regarding available facilities, measurement techniques, and other means and devices useful in testing. In addition, some past experimental programs are described which assisted in the development of new technology, validated new analysis codes, or provided needed information for clearing flight envelopes of unwanted aeroelastic response. Finally, needs and requirements for advances and improvements in testing capabilities for future experimental research and development programs are described.

  3. Code Validation Studies of High-Enthalpy Flows

    DTIC Science & Technology

    2006-12-01

    stage of future hypersonic vehicles. The development and design of such vehicles is aided by the use of experimentation and numerical simulation... numerical predictions and experimental measurements. 3. Summary of Previous Work We have studied extensively hypersonic double-cone flows with and in...the experimental measurements and the numerical predictions. When we accounted for that effect in numerical simulations, and also augmented the

  4. Animal Experimentation: Issues for the 1980s.

    ERIC Educational Resources Information Center

    Zola, Judith C.; And Others

    1984-01-01

    Examines the extent to which issues related to animal experimentation are in conflict and proposes choices that might least compromise them. These issues include animal well-being, human well-being, the self-interest of science, scientific validity and responsibility, progress in biomedical and behavioral science, and the future quality of medical care.…

  5. New millennium program ST6: autonomous technologies for future NASA spacecraft

    NASA Technical Reports Server (NTRS)

    Chmielewski, Arthur B.; Chien, Steve; Sherwood, Robert; Wyman, William; Brady, T.; Buckley, S.; Tillier, C.

    2005-01-01

    The purpose of NASA's New Millennium Program (NMP) is to validate advanced technologies in space and thus lower the risk for the first mission user. The focus of NMP is only on those technologies which need the space environment for proper validation. The ST6 project has developed two advanced, experimental technologies for use on spacecraft of the future: the Autonomous Sciencecraft Experiment and the Inertial Stellar Compass. These technologies will improve a spacecraft's ability to decide what information to gather and send back to the ground, and to determine and adjust its own attitude and pointing.

  6. French translation and validation of the Readiness for Interprofessional Learning Scale (RIPLS) in a Canadian undergraduate healthcare student context.

    PubMed

    Cloutier, Jacinthe; Lafrance, Josée; Michallet, Bernard; Marcoux, Lyson; Cloutier, France

    2015-03-01

    The Canadian Interprofessional Health Collaborative recommends that future professionals be prepared for collaborative practice. To do so, it is necessary for them to learn about the principles of interprofessional collaboration. Therefore, to ascertain whether students are predisposed to it, their attitude toward interprofessional learning must be assessed. In the French Canadian context such a measuring tool has not yet been published. The purpose of this study is to translate into French an adapted version of the RIPLS questionnaire and to validate it for use with undergraduate students from seven health and social care programmes in a Canadian university. Following Vallerand's methodology for translating measuring instruments: (i) the forward-backward translation indicated that six items of the experimental French version of the RIPLS needed to be more specific; (ii) the experimental French version of the RIPLS seemed clear according to the pre-test assessing item clarity; (iii) evaluation of the content validity indicated that the experimental French version of the RIPLS presents good content validity; and (iv) a very good internal consistency was obtained (α = 0.90; n = 141). Results indicate that the psychometric properties of the RIPLS in French are comparable to the English version, although a different factorial structure was found. The relevance of three of the 19 items on the RIPLS scale is questionable, resulting in a revised 16-item scale. Future research aimed at validating the translated French version of the RIPLS could also be conducted in another francophone cultural context.
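    The internal-consistency figure reported above (α = 0.90) is Cronbach's alpha. A minimal sketch of how such a coefficient is computed from item-level responses (the item scores below are illustrative, not the RIPLS data):

```python
def cronbach_alpha(items):
    """Cronbach's alpha: k/(k-1) * (1 - sum of item variances / variance of totals).

    `items` is a list of columns, one per questionnaire item; each column
    holds one score per respondent.
    """
    k = len(items)
    n = len(items[0])

    def var(xs):  # sample variance (ddof = 1)
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    item_var_sum = sum(var(col) for col in items)
    totals = [sum(col[i] for col in items) for i in range(n)]
    return k / (k - 1) * (1 - item_var_sum / var(totals))

# Illustrative 3-item, 4-respondent Likert responses (not the study's data)
items = [
    [4, 5, 3, 4],
    [4, 4, 3, 5],
    [5, 5, 2, 4],
]
alpha = cronbach_alpha(items)
```

    Values above roughly 0.7 are conventionally read as acceptable internal consistency, which is why the study's α = 0.90 with n = 141 counts as very good.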

  7. An Integrated Study on a Novel High Temperature High Entropy Alloy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yang, Shizhong

    2016-12-31

    This report summarizes our recent work on theoretical modeling, simulation, and experimental validation of the simulation results for the design of a new refractory high entropy alloy (HEA) and research on oxide-doped refractory HEAs. Simulations of the stability and thermal dynamics of potentially thermally stable candidates were performed, and related oxide-doped HEA samples were synthesized and characterized. The development of ab initio density functional theory and molecular dynamics methods for simulating HEA physical properties and of experimental texture validation techniques, the achievements reached so far, coursework development, student and postdoc training, and directions for future research are briefly introduced.

  8. Turbine Technology Team - An overview of current and planned activities relevant to the National Launch System (NLS)

    NASA Technical Reports Server (NTRS)

    Griffin, Lisa W.; Huber, Frank W.

    1992-01-01

    The current status of the activities and future plans of the Turbine Technology Team of the Consortium for Computational Fluid Dynamics is reviewed. The activities of the Turbine Team focus on developing and enhancing codes and models, obtaining data for code validation and general understanding of flows through turbines, and developing and analyzing the aerodynamic designs of turbines suitable for use in the Space Transportation Main Engine fuel and oxidizer turbopumps. Future work will include the experimental evaluation of the oxidizer turbine configuration, the development, analysis, and experimental verification of concepts to control secondary and tip losses, and the aerodynamic design, analysis, and experimental evaluation of turbine volutes.

  9. Continued Development and Validation of Methods for Spheromak Simulation

    NASA Astrophysics Data System (ADS)

    Benedett, Thomas

    2015-11-01

    The HIT-SI experiment has demonstrated stable sustainment of spheromaks; determining how the underlying physics extrapolate to larger, higher-temperature regimes is of prime importance in determining the viability of the inductively-driven spheromak. It is thus prudent to develop and validate a computational model that can be used to study current results and provide an intermediate step between theory and future experiments. A zero-beta Hall-MHD model has shown good agreement with experimental data at 14.5 kHz injector operation. Experimental observations at higher frequency, where the best performance is achieved, indicate pressure effects are important and likely required to attain quantitative agreement with simulations. Efforts to extend the existing validation to high frequency (~ 36-68 kHz) using an extended MHD model implemented in the PSI-TET arbitrary-geometry 3D MHD code will be presented. Results from verification of the PSI-TET extended MHD model using the GEM magnetic reconnection challenge will also be presented along with investigation of injector configurations for future SIHI experiments using Taylor state equilibrium calculations. Work supported by DoE.

  10. The task of validation of gas-dynamic characteristics of a multistage centrifugal compressor for a natural gas booster compressor station

    NASA Astrophysics Data System (ADS)

    Danilishin, A. M.; Kozhukhov, Y. V.; Neverov, V. V.; Malev, K. G.; Mironov, Y. R.

    2017-08-01

    The aim of this work is a validation study of the numerical modeling of the characteristics of a multistage centrifugal compressor for natural gas. The research included an analysis of the grid interfaces and software systems used. The results revealed discrepancies between the simulated and experimental characteristics, and a plan for future work is outlined.

  11. Experimental Validation of a Closed Brayton Cycle System Transient Simulation

    NASA Technical Reports Server (NTRS)

    Johnson, Paul K.; Hervol, David S.

    2006-01-01

    The Brayton Power Conversion Unit (BPCU) located at NASA Glenn Research Center (GRC) in Cleveland, Ohio was used to validate the results of a computational code known as Closed Cycle System Simulation (CCSS). Conversion system thermal transient behavior was the focus of this validation. The BPCU was operated at various steady state points and then subjected to transient changes involving shaft rotational speed and thermal energy input. These conditions were then duplicated in CCSS. Validation of the CCSS BPCU model provides confidence in developing future Brayton power system performance predictions, and helps to guide high power Brayton technology development.

  12. Particle Engulfment and Pushing By Solidifying Interfaces - Recent Theoretical and Experimental Developments

    NASA Technical Reports Server (NTRS)

    Stefanescu, D. M.; Catalina, A. V.; Juretzko, Frank R.; Sen, Subhayu; Curreri, P. A.

    2003-01-01

    The objectives of the work on Particle Engulfment and Pushing by Solidifying Interfaces (PEP) include: 1) to obtain a fundamental understanding of the physics of particle pushing and engulfment, 2) to develop mathematical models to describe the phenomenon, and 3) to perform critical experiments in the microgravity environment of space to provide benchmark data for model validation. Successful completion of this project will yield vital information relevant to a diverse range of terrestrial applications. With PEP being a long-term research effort, this report focuses on advances in the theoretical treatment of the solid/liquid interface's interaction with an approaching particle, experimental validation of some aspects of the developed models, and the design of future experiments to be performed on board the International Space Station.

  13. Experimental Database with Baseline CFD Solutions: 2-D and Axisymmetric Hypersonic Shock-Wave/Turbulent-Boundary-Layer Interactions

    NASA Technical Reports Server (NTRS)

    Marvin, Joseph G.; Brown, James L.; Gnoffo, Peter A.

    2013-01-01

    A database compilation of hypersonic shock-wave/turbulent boundary layer experiments is provided. The experiments selected for the database are either 2D or axisymmetric, and include both compression-corner and impinging-type SWTBL interactions. The strength of the interactions ranges from attached to incipient separation to fully separated flows. The experiments were chosen based on criteria to ensure the quality of the datasets, to be relevant to NASA's missions, and to be useful for validation and uncertainty assessment of CFD Navier-Stokes predictive methods, both now and in the future. The emphasis in the selected datasets is on surface pressures and surface heating throughout the interaction, but some wall shear stress distributions and flowfield profiles are included. For selected cases, example CFD grids and setup information are included, along with surface pressure and wall heating results from simulations using current NASA real-gas Navier-Stokes codes, against which future CFD investigators can compare and evaluate physics modeling improvements and validation and uncertainty assessments of future CFD code developments. The experimental database is tabulated in the Appendices describing each experiment, and is also provided in computer-readable ASCII files located on a companion DVD.

  14. Multiscale free-space optical interconnects for intrachip global communication: motivation, analysis, and experimental validation.

    PubMed

    McFadden, Michael J; Iqbal, Muzammil; Dillon, Thomas; Nair, Rohit; Gu, Tian; Prather, Dennis W; Haney, Michael W

    2006-09-01

    The use of optical interconnects for communication between points on a microchip is motivated by system-level interconnect modeling showing the saturation of metal wire capacity at the global layer. Free-space optical solutions are analyzed for intrachip communication at the global layer. A multiscale solution comprising microlenses, etched compound slope microprisms, and a curved mirror is shown to outperform a single-scale alternative. Microprisms are designed and fabricated and inserted into an optical setup apparatus to experimentally validate the concept. The multiscale free-space system is shown to have the potential to provide the bandwidth density and configuration flexibility required for global communication in future generations of microchips.

  15. Hybrid Particle-Element Simulation of Impact on Composite Orbital Debris Shields

    NASA Technical Reports Server (NTRS)

    Fahrenthold, Eric P.

    2004-01-01

    This report describes the development of new numerical methods and new constitutive models for the simulation of hypervelocity impact effects on spacecraft. The research has included parallel implementation of the numerical methods and material models developed under the project. Validation work has included both one dimensional simulations, for comparison with exact solutions, and three dimensional simulations of published hypervelocity impact experiments. The validated formulations have been applied to simulate impact effects in a velocity and kinetic energy regime outside the capabilities of current experimental methods. The research results presented here allow for the expanded use of numerical simulation, as a complement to experimental work, in future design of spacecraft for hypervelocity impact effects.

  16. Analysis, testing, and evaluation of faulted and unfaulted Wye, Delta, and open Delta connected electromechanical actuators

    NASA Technical Reports Server (NTRS)

    Nehl, T. W.; Demerdash, N. A.

    1983-01-01

    Mathematical models capable of simulating the transient, steady-state, and faulted performance characteristics of various brushless dc machine-PSA (power switching assembly) configurations were developed. These systems are intended for possible future use as prime movers in EMAs (electromechanical actuators) for flight control applications. These machine-PSA configurations include wye-, delta-, and open-delta-connected systems. The research performed under this contract was initially broken down into the following tasks: development of mathematical models for various machine-PSA configurations; experimental validation of the models for failure modes; experimental validation of the mathematical model for shorted-turn failure modes; a tradeoff study; and documentation of results and methodology.

  17. Assessing the stability of human locomotion: a review of current measures

    PubMed Central

    Bruijn, S. M.; Meijer, O. G.; Beek, P. J.; van Dieën, J. H.

    2013-01-01

    Falling poses a major threat to the steadily growing population of the elderly in modern-day society. A major challenge in the prevention of falls is the identification of individuals who are at risk of falling owing to an unstable gait. At present, several methods are available for estimating gait stability, each with its own advantages and disadvantages. In this paper, we review the currently available measures: the maximum Lyapunov exponent (λS and λL), the maximum Floquet multiplier, variability measures, long-range correlations, extrapolated centre of mass, stabilizing and destabilizing forces, foot placement estimator, gait sensitivity norm and maximum allowable perturbation. We explain what these measures represent and how they are calculated, and we assess their validity, divided up into construct validity, predictive validity in simple models, convergent validity in experimental studies, and predictive validity in observational studies. We conclude that (i) the validity of variability measures and λS is best supported across all levels, (ii) the maximum Floquet multiplier and λL have good construct validity, but negative predictive validity in models, negative convergent validity and (for λL) negative predictive validity in observational studies, (iii) long-range correlations lack construct validity and predictive validity in models and have negative convergent validity, and (iv) measures derived from perturbation experiments have good construct validity, but data are lacking on convergent validity in experimental studies and predictive validity in observational studies. In closing, directions for future research on dynamic gait stability are discussed. PMID:23516062
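    Of the measures reviewed above, the maximum Lyapunov exponent is typically estimated with a Rosenstein-style algorithm: delay-embed the gait signal, find each point's nearest neighbor, and fit the slope of the mean logarithmic divergence curve (λS and λL are slopes over short- and long-term windows of that curve). A simplified sketch; the embedding parameters and test signal are illustrative, and real gait analyses normalize time to strides:

```python
import math

def delay_embed(x, dim, tau):
    """State-space reconstruction: each row is a delay vector."""
    n = len(x) - (dim - 1) * tau
    return [[x[i + j * tau] for j in range(dim)] for i in range(n)]

def divergence_curve(x, dim, tau, horizon, theiler):
    """Rosenstein-style mean log divergence of initially nearest neighbors."""
    emb = delay_embed(x, dim, tau)
    n = len(emb)

    def dist(a, b):
        return math.sqrt(sum((p - q) ** 2 for p, q in zip(a, b)))

    sums, counts = [0.0] * horizon, [0] * horizon
    for i in range(n - horizon):
        # Nearest neighbor outside a Theiler window (excludes temporal neighbors)
        j_best, d_best = None, float("inf")
        for j in range(n - horizon):
            if abs(i - j) <= theiler:
                continue
            d = dist(emb[i], emb[j])
            if 0.0 < d < d_best:
                j_best, d_best = j, d
        if j_best is None:
            continue
        for k in range(horizon):  # track how the pair separates over time
            d = dist(emb[i + k], emb[j_best + k])
            if d > 0.0:
                sums[k] += math.log(d)
                counts[k] += 1
    return [s / c for s, c in zip(sums, counts) if c]

# Illustrative chaotic signal: a logistic-map orbit, whose true exponent is ln 2.
x = [0.3]
for _ in range(999):
    x.append(4.0 * x[-1] * (1.0 - x[-1]))
curve = divergence_curve(x, dim=2, tau=1, horizon=5, theiler=5)
lyap_estimate = (curve[-1] - curve[0]) / (len(curve) - 1)
```

    The slope of the early portion of the divergence curve plays the role of λS; a shallower later portion gives λL.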

  18. Functional Inference of Complex Anatomical Tendinous Networks at a Macroscopic Scale via Sparse Experimentation

    PubMed Central

    Saxena, Anupam; Lipson, Hod; Valero-Cuevas, Francisco J.

    2012-01-01

    In systems and computational biology, much effort is devoted to functional identification of systems and networks at the molecular or cellular scale. However, similarly important networks exist at anatomical scales such as the tendon network of human fingers: the complex array of collagen fibers that transmits and distributes muscle forces to finger joints. This network is critical to the versatility of the human hand, and its function has been debated since at least the 16th century. Here, we experimentally infer the structure (both topology and parameter values) of this network through sparse interrogation with force inputs. A population of models representing this structure co-evolves in simulation with a population of informative future force inputs via the predator-prey estimation-exploration algorithm. Model fitness depends on their ability to explain experimental data, while the fitness of future force inputs depends on causing maximal functional discrepancy among current models. We validate our approach by inferring two known synthetic Latex networks, and one anatomical tendon network harvested from a cadaver's middle finger. We find that functionally similar but structurally diverse models can exist within a narrow range of the training set and cross-validation errors. For the Latex networks, models with low training set error [<4%] and resembling the known network have the smallest cross-validation errors [∼5%]. The low training set [<4%] and cross-validation [<7.2%] errors for models for the cadaveric specimen demonstrate what, to our knowledge, is the first experimental inference of the functional structure of complex anatomical networks. This work expands current bioinformatics inference approaches by demonstrating that sparse, yet informative interrogation of biological specimens holds significant computational advantages in accurate and efficient inference over random testing, or assuming model topology and only inferring parameter values.
These findings also hold clues to both our evolutionary history and the development of versatile machines. PMID:23144601
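    The estimation-exploration loop described above can be sketched with a toy stand-in for the tendon network: a hidden linear input-output map, a co-evolving population of candidate models, and each new "experiment" chosen to maximize disagreement among the current models. All parameters, and the elitist mutation scheme, are illustrative simplifications of the published algorithm:

```python
import random

random.seed(0)

# Hypothetical stand-in for the specimen being inferred (the paper's system
# is a cadaveric tendon network; here it is just a hidden linear map).
TRUE_W = [0.7, -1.2, 0.4]

def system(x):
    """Query the 'specimen': one sparse experiment, one scalar output."""
    return sum(w * xi for w, xi in zip(TRUE_W, x))

def predict(model, x):
    return sum(w * xi for w, xi in zip(model, x))

def sse(model, data):
    """Model fitness: squared error against all experiments so far."""
    return sum((predict(model, x) - y) ** 2 for x, y in data)

def mutate(model, scale):
    return [w + random.gauss(0.0, scale) for w in model]

models = [[random.uniform(-2.0, 2.0) for _ in range(3)] for _ in range(30)]
data = []
for gen in range(40):
    # Exploration: among random candidate inputs, run the experiment on
    # which the current model population disagrees the most.
    candidates = [[random.uniform(-1.0, 1.0) for _ in range(3)]
                  for _ in range(50)]
    def disagreement(x):
        preds = [predict(m, x) for m in models]
        mean = sum(preds) / len(preds)
        return sum((p - mean) ** 2 for p in preds)
    x = max(candidates, key=disagreement)
    data.append((x, system(x)))
    # Estimation: elitist selection plus annealed Gaussian mutation.
    models.sort(key=lambda m: sse(m, data))
    parents = models[:6]
    scale = 0.3 * 0.93 ** gen
    models = parents + [mutate(p, scale) for p in parents for _ in range(4)]

best = min(models, key=lambda m: sse(m, data))
```

    Because each queried input is the one on which the surviving models disagree most, far fewer experiments are needed than with random testing, which is the paper's central computational claim.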

  19. Functional inference of complex anatomical tendinous networks at a macroscopic scale via sparse experimentation.

    PubMed

    Saxena, Anupam; Lipson, Hod; Valero-Cuevas, Francisco J

    2012-01-01

    In systems and computational biology, much effort is devoted to functional identification of systems and networks at the molecular or cellular scale. However, similarly important networks exist at anatomical scales such as the tendon network of human fingers: the complex array of collagen fibers that transmits and distributes muscle forces to finger joints. This network is critical to the versatility of the human hand, and its function has been debated since at least the 16th century. Here, we experimentally infer the structure (both topology and parameter values) of this network through sparse interrogation with force inputs. A population of models representing this structure co-evolves in simulation with a population of informative future force inputs via the predator-prey estimation-exploration algorithm. Model fitness depends on their ability to explain experimental data, while the fitness of future force inputs depends on causing maximal functional discrepancy among current models. We validate our approach by inferring two known synthetic Latex networks, and one anatomical tendon network harvested from a cadaver's middle finger. We find that functionally similar but structurally diverse models can exist within a narrow range of the training set and cross-validation errors. For the Latex networks, models with low training set error [<4%] and resembling the known network have the smallest cross-validation errors [∼5%]. The low training set [<4%] and cross-validation [<7.2%] errors for models for the cadaveric specimen demonstrate what, to our knowledge, is the first experimental inference of the functional structure of complex anatomical networks. This work expands current bioinformatics inference approaches by demonstrating that sparse, yet informative interrogation of biological specimens holds significant computational advantages in accurate and efficient inference over random testing, or assuming model topology and only inferring parameter values.
These findings also hold clues to both our evolutionary history and the development of versatile machines.

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Curceanu, C.; Bragadireanu, M.; Sirghi, D.

    The Pauli Exclusion Principle (PEP) is one of the basic principles of modern physics and, even if there are no compelling reasons to doubt its validity, it is still debated today because an intuitive, elementary explanation is still missing, and because of its unique stand among the basic symmetries of physics. We present an experimental test of the validity of the Pauli Exclusion Principle for electrons based on a straightforward idea put forward a few years ago by Ramberg and Snow (E. Ramberg and G. A. Snow 1990 Phys. Lett. B 238 438). We performed a very accurate search of X-rays from the Pauli-forbidden atomic transitions of electrons in the already filled 1S shells of copper atoms. Although the experiment has a very simple structure, it poses deep conceptual and interpretational problems. Here we describe the experimental method and recent experimental results interpreted as an upper limit for the probability to violate the Pauli Exclusion Principle. We also present future plans to upgrade the experimental apparatus.
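    The logic of the Ramberg-Snow bound can be illustrated with a back-of-the-envelope computation: "fresh" electrons injected by a current through the copper target are counted via the integrated charge, and the absence of Pauli-forbidden K X-rays above background converts into an upper limit on the violation probability. All numbers below are hypothetical placeholders, not the experiment's values:

```python
# Illustrative Ramberg-Snow-style estimate (hypothetical numbers, not real data).
E_CHARGE = 1.602176634e-19  # elementary charge, in coulombs

current_A = 40.0         # DC current through the copper conductor
run_time_s = 1.0e6       # total data-taking time
capture_prob = 0.1       # assumed chance a fresh electron is captured and cascades
detection_eff = 0.01     # assumed efficiency for detecting the forbidden K X-ray
n_xray_limit = 100.0     # assumed upper limit on forbidden-X-ray counts

n_electrons = current_A * run_time_s / E_CHARGE  # electrons passed through target
beta2_over_2 = n_xray_limit / (n_electrons * capture_prob * detection_eff)
```

    A real analysis replaces each assumed factor with a measured one; the point is that the limit scales inversely with the number of electrons probed, which is why long runs at high current yield very stringent bounds.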

  1. Using Experimental Paradigms to Examine Alcohol’s Role in Men’s Sexual Aggression: Opportunities and Challenges in Proxy Development

    PubMed Central

    Abbey, Antonia; Wegner, Rhiana

    2015-01-01

    The goals of this article are to review the major findings from alcohol administration studies that use sexual aggression proxies and to encourage additional experimental research that evaluates hypotheses about the role of alcohol in the etiology of men’s sexual aggression. Experiments allow participants to be randomly assigned to drink conditions, thereby ensuring that any differences between drinkers and nondrinkers can be attributed to their alcohol consumption. One of the biggest challenges faced by experimental researchers is the identification of valid operationalizations of key constructs. The tension between internal and external validity is particularly problematic for violence researchers because they cannot allow participants to engage in the target behavior in the laboratory. The strengths and limitations associated with written vignettes, audiotapes, videotapes, and confederate proxies for sexual aggression are described. Suggestions are made for future research to broaden the generalizability of the findings from experimental research. PMID:26048214

  2. Laboratory plasma interactions experiments: Results and implications to future space systems

    NASA Technical Reports Server (NTRS)

    Leung, Philip

    1986-01-01

    The experimental results discussed show the significance of the effects caused by spacecraft-plasma interactions, in particular the generation of electromagnetic interference. As the experimental results show, the magnitude of the adverse effects induced by plasma interactions (PI) will be more significant for spacecraft of the next century; therefore, research is needed to control possible adverse effects. Several techniques to control selected PI effects are discussed. Tests, in the form of flight experiments, are needed to validate these proposed ideas.

  3. FY2017 Pilot Project Plan for the Nuclear Energy Knowledge and Validation Center Initiative

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ren, Weiju

    To prepare for technical development of computational code validation under the Nuclear Energy Knowledge and Validation Center (NEKVAC) initiative, several meetings were held by a group of experts of the Idaho National Laboratory (INL) and the Oak Ridge National Laboratory (ORNL) to develop requirements of, and formulate a structure for, a transient fuel database through leveraging existing resources. It was concluded in discussions of these meetings that a pilot project is needed to address the most fundamental issues that can generate immediate stimulus to near-future validation developments as well as long-lasting benefits to NEKVAC operation. The present project is proposed based on the consensus of these discussions. Analysis of common scenarios in code validation indicates that the incapability of acquiring satisfactory validation data is often a showstopper that must first be tackled before any confident validation developments can be carried out. Validation data are usually found scattered in different places most likely with interrelationships among the data not well documented, incomplete with information for some parameters missing, nonexistent, or unrealistic to experimentally generate. Furthermore, with very different technical backgrounds, the modeler, the experimentalist, and the knowledgebase developer that must be involved in validation data development often cannot communicate effectively without a data package template that is representative of the data structure for the information domain of interest to the desired code validation. This pilot project is proposed to use the legendary TREAT Experiments Database to provide core elements for creating an ideal validation data package. Data gaps and missing data interrelationships will be identified from these core elements. All the identified missing elements will then be filled in with experimental data if available from other existing sources or with dummy data if nonexistent.
The resulting hybrid validation data package (composed of experimental and dummy data) will provide a clear and complete instance delineating the structure of the desired validation data and enabling effective communication among the modeler, the experimentalist, and the knowledgebase developer. With a good common understanding of the desired data structure by the three parties of subject matter experts, further existing data hunting will be effectively conducted, new experimental data generation will be realistically pursued, knowledgebase schema will be practically designed; and code validation will be confidently planned.

  4. Experimental Validation Plan for the Xolotl Plasma-Facing Component Simulator Using Tokamak Sample Exposures

    NASA Astrophysics Data System (ADS)

    Chan, V. S.; Wong, C. P. C.; McLean, A. G.; Luo, G. N.; Wirth, B. D.

    2013-10-01

    The Xolotl code under development by PSI-SciDAC will enhance predictive modeling capability of plasma-facing materials under burning plasma conditions. The availability and application of experimental data to compare to code-calculated observables are key requirements to validate the breadth and content of physics included in the model and ultimately gain confidence in its results. A dedicated effort has been in progress to collect and organize a) a database of relevant experiments and their publications as previously carried out at sample exposure facilities in US and Asian tokamaks (e.g., DIII-D DiMES, and EAST MAPES), b) diagnostic and surface analysis capabilities available at each device, and c) requirements for future experiments with code validation in mind. The content of this evolving database will serve as a significant resource for the plasma-material interaction (PMI) community. Work supported in part by the US Department of Energy under GA-DE-SC0008698, DE-AC52-07NA27344 and DE-AC05-00OR22725.

  5. Goals and Status of the NASA Juncture Flow Experiment

    NASA Technical Reports Server (NTRS)

    Rumsey, Christopher L.; Morrison, Joseph H.

    2016-01-01

    The NASA Juncture Flow experiment is a new effort whose focus is attaining validation data in the juncture region of a wing-body configuration. The experiment is designed specifically for the purpose of CFD validation. Current turbulence models routinely employed by Reynolds-averaged Navier-Stokes CFD are inconsistent in their prediction of corner flow separation in aircraft juncture regions, so experimental data in the near-wall region of such a configuration will be useful both for assessment and for turbulence model improvement. This paper summarizes the Juncture Flow effort to date, including preliminary risk-reduction experiments already conducted and planned future experiments. The requirements and challenges associated with conducting a quality validation test are discussed.

  6. Multi-Evaporator Miniature Loop Heat Pipe for Small Spacecraft Thermal Control. Part 2; Validation Results

    NASA Technical Reports Server (NTRS)

    Ku, Jentung; Ottenstein, Laura; Douglas, Donya; Hoang, Triem

    2010-01-01

    Under NASA's New Millennium Program Space Technology 8 (ST 8) Project, Goddard Space Flight Center has conducted a Thermal Loop experiment to advance the maturity of the Thermal Loop technology from proof of concept to prototype demonstration in a relevant environment, i.e., from a technology readiness level (TRL) of 3 to a level of 6. The Thermal Loop is an advanced thermal control system consisting of a miniature loop heat pipe (MLHP) with multiple evaporators and multiple condensers designed for future small-system applications requiring low mass, low power, and compactness. The MLHP retains all features of state-of-the-art loop heat pipes (LHPs) and offers additional advantages to enhance the functionality, performance, versatility, and reliability of the system. An MLHP breadboard was built and tested in the laboratory and thermal vacuum environments for the TRL 4 and TRL 5 validations, respectively, and an MLHP proto-flight unit was built and tested in a thermal vacuum chamber for the TRL 6 validation. In addition, an analytical model was developed to simulate the steady-state and transient behaviors of the MLHP during various validation tests. The MLHP demonstrated excellent performance during experimental tests, and the analytical model predictions agreed very well with experimental data. All success criteria at the various TRLs were met; hence, the Thermal Loop technology has reached a TRL of 6. This paper presents the validation results, both experimental and analytical, of this technology development effort.

  7. The impact of crowd noise on officiating in muay thai: achieving external validity in an experimental setting.

    PubMed

    Myers, Tony; Balmer, Nigel

    2012-01-01

    Numerous factors have been proposed to explain the home advantage in sport. Several authors have suggested that a partisan home crowd enhances home advantage and that this is at least in part a consequence of their influence on officiating. However, while experimental studies examining this phenomenon have high levels of internal validity (since only the "crowd noise" intervention is allowed to vary), they suffer from a lack of external validity, with decision-making in a laboratory setting typically bearing little resemblance to decision-making in live sports settings. Conversely, observational and quasi-experimental studies with high levels of external validity suffer from low levels of internal validity as countless factors besides crowd noise vary. The present study provides a unique opportunity to address these criticisms, by conducting a controlled experiment on the impact of crowd noise on officiating in a live tournament setting. Seventeen qualified judges officiated on thirty Thai boxing bouts in a live international tournament setting featuring "home" and "away" boxers. In each bout, judges were randomized into a "noise" (live sound) or "no crowd noise" (noise-canceling headphones and white noise) condition, resulting in 59 judgments in the "no crowd noise" and 61 in the "crowd noise" condition. The results provide the first experimental evidence of the impact of live crowd noise on officials in sport. A cross-classified statistical model indicated that crowd noise had a statistically significant impact, equating to just over half a point per bout (in the context of five round bouts with the "10-point must" scoring system shared with professional boxing). The practical significance of the findings, their implications for officiating and for the future conduct of crowd noise studies are discussed.

  8. The Impact of Crowd Noise on Officiating in Muay Thai: Achieving External Validity in an Experimental Setting

    PubMed Central

    Myers, Tony; Balmer, Nigel

    2012-01-01

    Numerous factors have been proposed to explain the home advantage in sport. Several authors have suggested that a partisan home crowd enhances home advantage and that this is at least in part a consequence of their influence on officiating. However, while experimental studies examining this phenomenon have high levels of internal validity (since only the “crowd noise” intervention is allowed to vary), they suffer from a lack of external validity, with decision-making in a laboratory setting typically bearing little resemblance to decision-making in live sports settings. Conversely, observational and quasi-experimental studies with high levels of external validity suffer from low levels of internal validity as countless factors besides crowd noise vary. The present study provides a unique opportunity to address these criticisms, by conducting a controlled experiment on the impact of crowd noise on officiating in a live tournament setting. Seventeen qualified judges officiated on thirty Thai boxing bouts in a live international tournament setting featuring “home” and “away” boxers. In each bout, judges were randomized into a “noise” (live sound) or “no crowd noise” (noise-canceling headphones and white noise) condition, resulting in 59 judgments in the “no crowd noise” and 61 in the “crowd noise” condition. The results provide the first experimental evidence of the impact of live crowd noise on officials in sport. A cross-classified statistical model indicated that crowd noise had a statistically significant impact, equating to just over half a point per bout (in the context of five round bouts with the “10-point must” scoring system shared with professional boxing). The practical significance of the findings, their implications for officiating and for the future conduct of crowd noise studies are discussed. PMID:23049520

  9. Experimental investigation of hypersonic aerodynamics

    NASA Technical Reports Server (NTRS)

    Heinemann, K.; Intrieri, Peter F.

    1987-01-01

    An extensive series of ballistic range tests is currently being conducted at the Ames Research Center. These tests investigate the hypersonic aerodynamic characteristics of two basic configurations: the blunt-cone Galileo probe, which is scheduled to be launched in late 1989 and will enter the atmosphere of Jupiter in 1994, and a generic slender-cone configuration intended to provide experimental aerodynamic data, including good flow-field definition, that computational aerodynamicists can use to validate their computer codes. Some of the results obtained thus far are presented and work for the near future is discussed.

  10. Evolution of strain localization in variable-width three-dimensional unsaturated laboratory-scale cut slopes

    USGS Publications Warehouse

    Morse, Michael S.; Lu, Ning; Wayllace, Alexandra; Godt, Jonathan W.

    2017-01-01

    To experimentally validate a recently developed theory for predicting the stability of cut slopes under unsaturated conditions, the authors measured increasing strain localization in unsaturated slope cuts prior to abrupt failure. Cut slope width and moisture content were controlled and varied in a laboratory, and a sliding door that extended the height of the free face of the slope was lowered until the cut slope failed. A particle image velocimetry tool was used to quantify soil displacement in the x-y (horizontal) and x-z (vertical) planes, and strain was calculated from the displacement. Areas of maximum strain localization prior to failure were shown to coincide with the location of the eventual failure plane. Experimental failure heights agreed with the recently developed stability theory for unsaturated cut slopes (within 14.3% relative error) for a range of saturation and cut slope widths. A theoretical threshold for sidewall influence on cut slope failures was also proposed to quantify the relationship between normalized sidewall width and critical height. The proposed relationship was consistent with the cut slope experiment results, and is intended for consideration in future geotechnical experiment design. The experimental data of evolution of strain localization presented herein provide a physical basis from which future numerical models of strain localization can be validated.
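    The displacement-to-strain step described in this record can be sketched in a few lines of NumPy. This is a minimal illustration under the small-strain assumption; the grid spacing, array shapes, and toy displacement fields are all invented for the example and do not reproduce the study's actual PIV output or processing:

```python
import numpy as np

# Hypothetical PIV displacement fields on a regular grid; every name and
# number here is invented for illustration, not taken from the study.
ny, nx = 50, 80
dx = dz = 1.0e-3                           # grid spacing in metres (assumed)
x = np.arange(nx) * dx
z = np.arange(ny) * dz
u = np.outer(np.ones(ny), 1e-5 * x)        # toy horizontal displacement field
w = np.outer(1e-5 * z, np.ones(nx))        # toy vertical displacement field

# Small-strain components from displacement gradients
# (axis 0 runs along z, axis 1 along x).
du_dz, du_dx = np.gradient(u, dz, dx)
dw_dz, dw_dx = np.gradient(w, dz, dx)
exx = du_dx                    # horizontal normal strain
ezz = dw_dz                    # vertical normal strain
exz = 0.5 * (du_dz + dw_dx)    # shear strain

# Maximum shear strain, a common quantity for localization maps.
gamma_max = np.sqrt(((exx - ezz) / 2.0) ** 2 + exz ** 2)
```

    In practice the localization map (here `gamma_max`) would be computed per image pair and tracked over time to watch strain concentrate along the eventual failure plane.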

  11. An Experimental Evaluation of Competing Age-Predictions of Future Time Perspective between Workplace and Retirement Domains.

    PubMed

    Kerry, Matthew J; Embretson, Susan E

    2017-01-01

    Future time perspective (FTP) is defined as "perceptions of the future as being limited or open-ended" (Lang and Carstensen, 2002; p. 125). The construct figures prominently in both workplace and retirement domains, but the age-predictions are competing: workplace research predicts decreasing FTP age-change; in contrast, retirement scholars predict increasing FTP age-change. For the first time, these competing predictions are pitted against each other in an experimental manipulation of subjective life expectancy (SLE). A sample of N = 207 older adults (age 45-60) working full-time (>30 h/week) were randomly assigned to SLE questions framed as either 'Live-to' or 'Die-by' to evaluate the competing predictions for FTP. Results indicate general support for decreasing age-change in FTP, as shown by independent-sample t-tests yielding lower FTP in the 'Die-by' framing condition. Further general linear model analyses were conducted to test for interaction effects of retirement planning with the experimental framings on FTP and intended retirement. While retirement planning buffered FTP's decrease, simple-effects analyses also revealed that retirement planning increased intentions for sooner retirement, whereas lack of planning increased intentions for later retirement. Discussion centers on the practical implications of our findings and on consequential validity evidence for future empirical research of FTP in both workplace and retirement domains.
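    The independent-samples t-test comparison reported in this record can be illustrated with simulated data. The group means, spreads, and sizes below are invented, and the pooled-variance Student's t shown here is the textbook form, not necessarily the study's exact analysis pipeline:

```python
import math
import random
import statistics

# Simulated FTP scores for two framing groups; all numbers are invented.
random.seed(42)
live_to = [random.gauss(3.5, 0.8) for _ in range(104)]
die_by = [random.gauss(3.1, 0.8) for _ in range(103)]  # lower FTP assumed

def t_independent(a, b):
    """Student's t for two independent samples with pooled variance."""
    na, nb = len(a), len(b)
    va, vb = statistics.variance(a), statistics.variance(b)
    sp2 = ((na - 1) * va + (nb - 1) * vb) / (na + nb - 2)  # pooled variance
    t = (statistics.mean(a) - statistics.mean(b)) / math.sqrt(sp2 * (1 / na + 1 / nb))
    return t, na + nb - 2  # t statistic and degrees of freedom

t, df = t_independent(live_to, die_by)
```

    A positive `t` here corresponds to the 'Die-by' group scoring lower, the direction of effect the abstract reports.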

  12. Recent Progress and Future Plans for Fusion Plasma Synthetic Diagnostics Platform

    NASA Astrophysics Data System (ADS)

    Shi, Lei; Kramer, Gerrit; Tang, William; Tobias, Benjamin; Valeo, Ernest; Churchill, Randy; Hausammann, Loic

    2015-11-01

    The Fusion Plasma Synthetic Diagnostics Platform (FPSDP) is a Python package developed at the Princeton Plasma Physics Laboratory. It is dedicated to providing an integrated programmable environment for applying a modern ensemble of synthetic diagnostics to the experimental validation of fusion plasma simulation codes. The FPSDP will allow physicists to directly compare key laboratory measurements to simulation results. This enables deeper understanding of experimental data, more realistic validation of simulation codes, quantitative assessment of existing diagnostics, and new capabilities for the design and optimization of future diagnostics. The Fusion Plasma Synthetic Diagnostics Platform now has data interfaces for the GTS and XGC-1 global particle-in-cell simulation codes with synthetic diagnostic modules including: (i) 2D and 3D Reflectometry; (ii) Beam Emission Spectroscopy; and (iii) 1D Electron Cyclotron Emission. Results will be reported on the delivery of interfaces for the global electromagnetic PIC code GTC, the extended MHD M3D-C1 code, and the electromagnetic hybrid NOVAK eigenmode code. Progress toward development of a more comprehensive 2D Electron Cyclotron Emission module will also be discussed. This work is supported by DOE contract #DEAC02-09CH11466.

  13. Validation of the revised Mystical Experience Questionnaire in experimental sessions with psilocybin.

    PubMed

    Barrett, Frederick S; Johnson, Matthew W; Griffiths, Roland R

    2015-11-01

    The 30-item revised Mystical Experience Questionnaire (MEQ30) was previously developed within an online survey of mystical-type experiences occasioned by psilocybin-containing mushrooms. The rated experiences occurred on average eight years before completion of the questionnaire. The current paper validates the MEQ30 using data from experimental studies with controlled doses of psilocybin. Data were pooled and analyzed from five laboratory experiments in which participants (n=184) received a moderate to high oral dose of psilocybin (at least 20 mg/70 kg). Results of confirmatory factor analysis demonstrate the reliability and internal validity of the MEQ30. Structural equation models demonstrate the external and convergent validity of the MEQ30 by showing that latent variable scores on the MEQ30 positively predict persisting change in attitudes, behavior, and well-being attributed to experiences with psilocybin while controlling for the contribution of the participant-rated intensity of drug effects. These findings support the use of the MEQ30 as an efficient measure of individual mystical experiences. A method to score a "complete mystical experience" that was used in previous versions of the mystical experience questionnaire is validated in the MEQ30, and a stand-alone version of the MEQ30 is provided for use in future research.
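    The record reports confirmatory factor analysis and structural equation models. As a lighter-weight illustration of the internal-consistency side of questionnaire validation, here is a Cronbach's alpha sketch on simulated ratings; the single-factor item model and all numbers are invented, and this is not the MEQ30 analysis itself:

```python
import numpy as np

def cronbach_alpha(items):
    """Internal-consistency estimate for an (n_respondents, n_items) array."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1.0)) * (1.0 - item_vars / total_var)

# Toy ratings for a hypothetical 5-item subscale (0-5 scale): each item is
# the same latent trait plus independent noise, so alpha should be high.
rng = np.random.default_rng(1)
latent = rng.normal(size=200)
ratings = np.clip(
    np.round(2.5 + latent[:, None] + 0.5 * rng.normal(size=(200, 5))), 0, 5
)
alpha = cronbach_alpha(ratings)
```

    Full CFA/SEM validation, as in the paper, additionally tests the hypothesized factor structure rather than only item consistency.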

  14. Validation of the revised Mystical Experience Questionnaire in experimental sessions with psilocybin

    PubMed Central

    Barrett, Frederick S; Johnson, Matthew W; Griffiths, Roland R

    2016-01-01

    The 30-item revised Mystical Experience Questionnaire (MEQ30) was previously developed within an online survey of mystical-type experiences occasioned by psilocybin-containing mushrooms. The rated experiences occurred on average eight years before completion of the questionnaire. The current paper validates the MEQ30 using data from experimental studies with controlled doses of psilocybin. Data were pooled and analyzed from five laboratory experiments in which participants (n=184) received a moderate to high oral dose of psilocybin (at least 20 mg/70 kg). Results of confirmatory factor analysis demonstrate the reliability and internal validity of the MEQ30. Structural equation models demonstrate the external and convergent validity of the MEQ30 by showing that latent variable scores on the MEQ30 positively predict persisting change in attitudes, behavior, and well-being attributed to experiences with psilocybin while controlling for the contribution of the participant-rated intensity of drug effects. These findings support the use of the MEQ30 as an efficient measure of individual mystical experiences. A method to score a “complete mystical experience” that was used in previous versions of the mystical experience questionnaire is validated in the MEQ30, and a stand-alone version of the MEQ30 is provided for use in future research. PMID:26442957

  15. Experimentally valid predictions of muscle force and EMG in models of motor-unit function are most sensitive to neural properties.

    PubMed

    Keenan, Kevin G; Valero-Cuevas, Francisco J

    2007-09-01

    Computational models of motor-unit populations are the objective implementations of the hypothesized mechanisms by which neural and muscle properties give rise to electromyograms (EMGs) and force. However, the variability/uncertainty of the parameters used in these models--and how they affect predictions--confounds assessing these hypothesized mechanisms. We perform a large-scale computational sensitivity analysis on the state-of-the-art computational model of surface EMG, force, and force variability by combining a comprehensive review of published experimental data with Monte Carlo simulations. To exhaustively explore model performance and robustness, we ran numerous iterative simulations each using a random set of values for nine commonly measured motor neuron and muscle parameters. Parameter values were sampled across their reported experimental ranges. Convergence after 439 simulations found that only 3 simulations met our two fitness criteria: approximating the well-established experimental relations for the scaling of EMG amplitude and force variability with mean force. An additional 424 simulations preferentially sampling the neighborhood of those 3 valid simulations converged to reveal 65 additional sets of parameter values for which the model predictions approximate the experimentally known relations. We find the model is not sensitive to muscle properties but very sensitive to several motor neuron properties--especially peak discharge rates and recruitment ranges. Therefore to advance our understanding of EMG and muscle force, it is critical to evaluate the hypothesized neural mechanisms as implemented in today's state-of-the-art models of motor unit function. We discuss experimental and analytical avenues to do so as well as new features that may be added in future implementations of motor-unit models to improve their experimental validity.
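    The Monte Carlo procedure described above, sampling parameters across reported experimental ranges and retaining only sets whose predictions satisfy fitness criteria, can be sketched as follows. The parameter names, ranges, stand-in model, and criteria are all hypothetical:

```python
import random

# Illustrative Monte Carlo parameter search: sample model parameters across
# plausible experimental ranges and keep only parameter sets whose
# predictions satisfy fitness criteria. All names and ranges are invented.
PARAM_RANGES = {
    "peak_discharge_rate_hz": (20.0, 50.0),
    "recruitment_range_fold": (10.0, 100.0),
    "muscle_fiber_count": (50_000.0, 300_000.0),
}

def toy_model(params):
    # Stand-in for the motor-unit model: returns two summary predictions.
    emg_slope = params["peak_discharge_rate_hz"] / 50.0
    force_cv = 10.0 / params["recruitment_range_fold"]
    return emg_slope, force_cv

def is_valid(emg_slope, force_cv):
    # Stand-in fitness criteria (e.g. EMG-amplitude and force-variability
    # scaling must fall within experimentally observed bounds).
    return 0.5 <= emg_slope <= 1.0 and force_cv <= 0.5

random.seed(0)
valid_sets = []
for _ in range(1000):
    params = {k: random.uniform(lo, hi) for k, (lo, hi) in PARAM_RANGES.items()}
    if is_valid(*toy_model(params)):
        valid_sets.append(params)
```

    The surviving parameter sets can then be inspected (as in the study) to see which parameters the valid region is most sensitive to.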

  16. Characterization of Depleted-Uranium Strength and Damage Behavior

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gray, III, George T.; Chen, Shuh-Rong; Bronkhorst, Curt A.

    2012-12-17

    The intent of this report is to document the status of our knowledge of the mechanical and damage behavior of Depleted Uranium (DU hereafter). This report briefly summarizes the motivation of the experimental and modeling research conducted at Los Alamos National Laboratory (LANL) on DU since the early 1980s, and thereafter presents the current experimental data quantifying the strength and damage behavior of DU as a function of a number of experimental variables, including processing, strain rate, temperature, stress state, and shock prestraining. The effect of shock prestraining on the structure-property response of DU is described and its effect on the post-shock mechanical behavior of DU is discussed. The constitutive experimental data utilized to support the derivation of two constitutive strength (plasticity) models, the Preston-Tonks-Wallace (PTW) and Mechanical Threshold Stress (MTS) models, for both annealed and shock-prestrained DU are detailed, and the Taylor cylinder validation tests and finite-element modeling (FEM) utilized to validate these strength models are discussed. The similarities and differences in the PTW and MTS model descriptions for DU are discussed for both the annealed and shock-prestrained conditions. Quasi-static tensile data as a function of triaxial constraint and spallation test data are described. An appendix additionally briefly describes low-pressure equation-of-state data for DU utilized to support the spallation experiments. The constitutive behavior of DU screw/bolt material is presented. The response of DU subjected to dynamic tensile extrusion testing as a function of temperature is also described; this integrated experimental technique is planned to provide an additional validation test in the future. The damage data as a function of triaxiality (tensile and spallation data) are thereafter utilized to support derivation of the Tensile Plasticity (TEPLA) damage model, and simulations for comparison to the DU spallation data are presented. Finally, a discussion of future needs in the area of DU strength and damage research at LANL is presented to support the development of a physically based, predictive strength and damage modeling capability.

  17. Validating Inertial Confinement Fusion (ICF) predictive capability using perturbed capsules

    NASA Astrophysics Data System (ADS)

    Schmitt, Mark; Magelssen, Glenn; Tregillis, Ian; Hsu, Scott; Bradley, Paul; Dodd, Evan; Cobble, James; Flippo, Kirk; Offerman, Dustin; Obrey, Kimberly; Wang, Yi-Ming; Watt, Robert; Wilke, Mark; Wysocki, Frederick; Batha, Steven

    2009-11-01

    Achieving ignition on NIF is a monumental step on the path toward utilizing fusion as a controlled energy source. Obtaining robust ignition requires accurate ICF models to predict the degradation of ignition caused by heterogeneities in capsule construction and irradiation. LANL has embarked on a project to induce controlled defects in capsules to validate our ability to predict their effects on fusion burn. These efforts include the validation of feature-driven hydrodynamics and mix in a convergent geometry. This capability is needed to determine the performance of capsules imploded under less-than-optimum conditions on future IFE facilities. LANL's recently initiated Defect Implosion Experiments (DIME) conducted at Rochester's Omega facility are providing input for these efforts. Recent simulation and experimental results will be shown.

  18. Approach and Instrument Placement Validation

    NASA Technical Reports Server (NTRS)

    Ator, Danielle

    2005-01-01

    The Mars Exploration Rovers (MER) from the 2003 flight mission represents the state of the art technology for target approach and instrument placement on Mars. It currently takes 3 sols (Martian days) for the rover to place an instrument on a designated rock target that is about 10 to 20 m away. The objective of this project is to provide an experimentally validated single-sol instrument placement capability to future Mars missions. After completing numerous test runs on the Rocky8 rover under various test conditions, it has been observed that lighting conditions, shadow effects, target features and the initial target distance have an effect on the performance and reliability of the tracking software. Additional software validation testing will be conducted in the months to come.

  19. Confirmation of the Basic Psychological Needs in Exercise Scale (BPNES) With a Sample of People who do Healthy Exercise

    PubMed Central

    Moreno-Murcia, Juan A; Martínez-Galindo, Celestina; Moreno-Pérez, Víctor; Marcos, Pablo J.; Borges, Fernanda

    2012-01-01

    This study aimed to cross-validate the psychometric properties of the Basic Psychological Needs in Exercise Scale (BPNES) by Vlachopoulos and Michailidou (2006) in a Spanish context. Two studies were conducted. Confirmatory factor analysis results confirmed the hypothesized three-factor solution. In addition, we documented evidence of reliability, analysed as internal consistency and temporal stability. Future studies should analyse the scale's validity and reliability with different populations and test its behaviour in experimental settings. Key points: The Basic Psychological Needs in Exercise Scale (BPNES) is valid and reliable for measuring basic psychological needs in healthy physical exercise in the Spanish context. The factor structure of three correlated factors has shown minimal invariance across gender. PMID:24149130

  20. Validation Database Based Thermal Analysis of an Advanced RPS Concept

    NASA Technical Reports Server (NTRS)

    Balint, Tibor S.; Emis, Nickolas D.

    2006-01-01

    Advanced RPS concepts can be conceived, designed and assessed using high-end computational analysis tools. These predictions may provide an initial insight into the potential performance of these models, but verification and validation are necessary and required steps to gain confidence in the numerical analysis results. This paper discusses the findings from a numerical validation exercise for a small advanced RPS concept, based on a thermal analysis methodology developed at JPL and on a validation database obtained from experiments performed at Oregon State University. Both the numerical and experimental configurations utilized a single GPHS module enabled design, resembling a Mod-RTG concept. The analysis focused on operating and environmental conditions during the storage phase only. This validation exercise helped to refine key thermal analysis and modeling parameters, such as heat transfer coefficients, and conductivity and radiation heat transfer values. Improved understanding of the Mod-RTG concept through validation of the thermal model allows for future improvements to this power system concept.
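    One small example of the kind of parameter refinement this record describes, tuning a heat-transfer coefficient so that a simple model reproduces a measured temperature, can be sketched as below. The heat load, area, and "measured" temperature are invented, and this lumped steady-state model is far simpler than the actual JPL thermal analysis:

```python
# Toy illustration: refine a heat-transfer coefficient h so that a lumped
# steady-state model matches a measured surface temperature. All numbers
# are invented for the example.
Q = 250.0           # heat dissipated by the module, W (assumed)
AREA = 0.05         # effective surface area, m^2 (assumed)
T_AMBIENT = 300.0   # ambient temperature, K (assumed)
T_MEASURED = 355.0  # "experimental" surface temperature, K (invented)

def surface_temp(h):
    # Steady-state balance: Q = h * AREA * (T_surface - T_ambient)
    return T_AMBIENT + Q / (h * AREA)

# Bisection on h until the model reproduces the measurement;
# surface_temp decreases monotonically with h.
lo, hi = 1.0, 1000.0
for _ in range(60):
    mid = 0.5 * (lo + hi)
    if surface_temp(mid) > T_MEASURED:
        lo = mid  # model runs too hot -> need a larger coefficient
    else:
        hi = mid
h_fit = 0.5 * (lo + hi)
```

    The same fit-against-measurement loop generalizes to conductivity and radiation parameters, the quantities the record says were refined during validation.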

  1. Validation and Continued Development of Methods for Spheromak Simulation

    NASA Astrophysics Data System (ADS)

    Benedett, Thomas

    2016-10-01

    The HIT-SI experiment has demonstrated stable sustainment of spheromaks. Determining how the underlying physics extrapolate to larger, higher-temperature regimes is of prime importance in determining the viability of the inductively driven spheromak. It is thus prudent to develop and validate a computational model that can be used to study current results and assess the effect of possible design choices on plasma behavior. A zero-beta Hall-MHD model has shown good agreement with experimental data at 14.5 kHz injector operation. Experimental observations at higher frequency, where the best performance is achieved, indicate pressure effects are important and likely required to attain quantitative agreement with simulations. Efforts to extend the existing validation to high frequency (36-68 kHz) using an extended MHD model implemented in the PSI-TET arbitrary-geometry 3D MHD code will be presented. An implementation of anisotropic viscosity, a feature observed to improve agreement between NIMROD simulations and experiment, will also be presented, along with investigations of flux conserver features and their impact on density control for future SIHI experiments. Work supported by DoE.

  2. An Experimental Evaluation of Competing Age-Predictions of Future Time Perspective between Workplace and Retirement Domains

    PubMed Central

    Kerry, Matthew J.; Embretson, Susan E.

    2018-01-01

    Future time perspective (FTP) is defined as “perceptions of the future as being limited or open-ended” (Lang and Carstensen, 2002; p. 125). The construct figures prominently in both workplace and retirement domains, but the age-predictions are competing: workplace research predicts decreasing FTP age-change; in contrast, retirement scholars predict increasing FTP age-change. For the first time, these competing predictions are pitted against each other in an experimental manipulation of subjective life expectancy (SLE). A sample of N = 207 older adults (age 45–60) working full-time (>30 h/week) were randomly assigned to SLE questions framed as either ‘Live-to’ or ‘Die-by’ to evaluate the competing predictions for FTP. Results indicate general support for decreasing age-change in FTP, as shown by independent-sample t-tests yielding lower FTP in the ‘Die-by’ framing condition. Further general linear model analyses were conducted to test for interaction effects of retirement planning with the experimental framings on FTP and intended retirement. While retirement planning buffered FTP’s decrease, simple-effects analyses also revealed that retirement planning increased intentions for sooner retirement, whereas lack of planning increased intentions for later retirement. Discussion centers on the practical implications of our findings and on consequential validity evidence for future empirical research of FTP in both workplace and retirement domains. PMID:29375435

  3. The Protein Data Bank in Europe (PDBe): bringing structure to biology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Velankar, Sameer; Kleywegt, Gerard J., E-mail: gerard@ebi.ac.uk

    2011-04-01

    Some future challenges for the PDB and its guardians are discussed and current and future activities in structural bioinformatics at the Protein Data Bank in Europe (PDBe) are described. The Protein Data Bank in Europe (PDBe) is the European partner in the Worldwide PDB and as such handles depositions of X-ray, NMR and EM data and structure models. PDBe also provides advanced bioinformatics services based on data from the PDB and related resources. Some of the challenges facing the PDB and its guardians are discussed, as well as some of the areas on which PDBe activities will focus in the future (advanced services, ligands, integration, validation and experimental data). Finally, some recent developments at PDBe are described.

  4. Neuroinflammatory targets and treatments for epilepsy validated in experimental models.

    PubMed

    Aronica, Eleonora; Bauer, Sebastian; Bozzi, Yuri; Caleo, Matteo; Dingledine, Raymond; Gorter, Jan A; Henshall, David C; Kaufer, Daniela; Koh, Sookyong; Löscher, Wolfgang; Louboutin, Jean-Pierre; Mishto, Michele; Norwood, Braxton A; Palma, Eleonora; Poulter, Michael O; Terrone, Gaetano; Vezzani, Annamaria; Kaminski, Rafal M

    2017-07-01

    A large body of evidence that has accumulated over the past decade strongly supports the role of inflammation in the pathophysiology of human epilepsy. Specific inflammatory molecules and pathways have been identified that influence various pathologic outcomes in different experimental models of epilepsy. Most importantly, the same inflammatory pathways have also been found in surgically resected brain tissue from patients with treatment-resistant epilepsy. New antiseizure therapies may be derived from these novel potential targets. An essential and crucial question is whether targeting these molecules and pathways may result in anti-ictogenesis, antiepileptogenesis, and/or disease-modification effects. Therefore, preclinical testing in models mimicking relevant aspects of epileptogenesis is needed to guide integrated experimental and clinical trial designs. We discuss the most recent preclinical proof-of-concept studies validating a number of therapeutic approaches against inflammatory mechanisms in animal models that could represent novel avenues for drug development in epilepsy. Finally, we suggest future directions to accelerate preclinical to clinical translation of these recent discoveries.

  5. Modelling and analysis of a direct ascorbic acid fuel cell

    NASA Astrophysics Data System (ADS)

    Zeng, Yingzhi; Fujiwara, Naoko; Yamazaki, Shin-ichi; Tanimoto, Kazumi; Wu, Ping

    L-Ascorbic acid (AA), also known as vitamin C, is an environmentally-benign and biologically-friendly compound that can be used as an alternative fuel for direct oxidation fuel cells. While direct ascorbic acid fuel cells (DAAFCs) have been studied experimentally, modelling and simulation of these devices have been overlooked. In this work, we develop a mathematical model to describe a DAAFC and validate it with experimental data. The model is formulated by integrating the mass and charge balances, and model parameters are estimated by best-fitting to experimental data of current-voltage curves. By comparing the transient voltage curves predicted by dynamic simulation and experiments, the model is further validated. Various parameters that affect the power generation are studied by simulation. The cathodic reaction is found to be the most significant determinant of power generation, followed by fuel feed concentration and the mass-transfer coefficient of ascorbic acid. These studies also reveal that the power density steadily increases with respect to the fuel feed concentration. The results may guide future development and operation of a more efficient DAAFC.
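    Estimating model parameters by best-fitting current-voltage data, as the record describes, can be sketched with a toy polarization-curve model. The model form, parameter values, synthetic data, and coarse grid-search optimizer below are all illustrative, not the paper's actual formulation:

```python
import math

# Toy polarization-curve model: open-circuit potential minus activation
# (Tafel-like) and ohmic losses. Parameters and data are invented.
def polarization(i, e0, b, r):
    return e0 - b * math.log10(i / 1e-4) - r * i

# Synthetic "experimental" current-voltage points generated from known
# parameters, so the fit can be checked against ground truth.
true = (0.9, 0.06, 0.5)
currents = [0.01, 0.05, 0.1, 0.2, 0.4]   # A (illustrative)
voltages = [polarization(i, *true) for i in currents]

def sse(params):
    # Sum of squared errors between model and "experimental" voltages.
    return sum((polarization(i, *params) - v) ** 2
               for i, v in zip(currents, voltages))

# Coarse grid search over plausible ranges (a stand-in for a real optimizer).
best = min(
    ((e0, b, r)
     for e0 in [0.8, 0.85, 0.9, 0.95]
     for b in [0.04, 0.05, 0.06, 0.07]
     for r in [0.3, 0.4, 0.5, 0.6]),
    key=sse,
)
```

    With real data one would replace the grid search with a least-squares optimizer and fit the paper's full mass- and charge-balance model instead of this three-parameter curve.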

  6. Examining ecological validity in social interaction: problems of visual fidelity, gaze, and social potential.

    PubMed

    Reader, Arran T; Holmes, Nicholas P

    2016-01-01

    Social interaction is an essential part of the human experience, and much work has been done to study it. However, several common approaches to examining social interactions in psychological research may inadvertently either unnaturally constrain the observed behaviour by causing it to deviate from naturalistic performance, or introduce unwanted sources of variance. In particular, these sources are the differences between naturalistic and experimental behaviour that occur from changes in visual fidelity (quality of the observed stimuli), gaze (whether it is controlled for in the stimuli), and social potential (potential for the stimuli to provide actual interaction). We expand on these possible sources of extraneous variance and why they may be important. We review the ways in which experimenters have developed novel designs to remove these sources of extraneous variance. New experimental designs using a 'two-person' approach are argued to be one of the most effective ways to develop more ecologically valid measures of social interaction, and we suggest that future work on social interaction should use these designs wherever possible.

  7. Advanced Supersonic Nozzle Concepts: Experimental Flow Visualization Results Paired With LES

    NASA Astrophysics Data System (ADS)

    Berry, Matthew; Magstadt, Andrew; Stack, Cory; Gaitonde, Datta; Glauser, Mark; Syracuse University Team; The Ohio State University Team

    2015-11-01

    Advanced supersonic nozzle concepts are currently under investigation, utilizing multiple bypass streams and airframe integration to bolster performance and efficiency. This work focuses on the parametric study of a supersonic, multi-stream jet with aft deck. The single plane of symmetry, rectangular nozzle, displays very complex and unique flow characteristics. Flow visualization techniques in the form of PIV and schlieren capture flow features at various deck lengths and Mach numbers. LES is compared to the experimental results to both validate the computational model and identify limitations of the simulation. By comparing experimental results to LES, this study will help create a foundation of knowledge for advanced nozzle designs in future aircraft. SBIR Phase II with Spectral Energies, LLC under direction of Barry Kiel.

  8. [Experimental orthopedic surgery: the practical aspects and management].

    PubMed

    Di Denia, P; Caligiuri, G; Guzzardella, G A; Fini, M; Giardino, R

    1996-01-01

    The funds granted for scientific research projects are of growing interest to public and private administrations. A quantitative analysis of the costs of experimental research in all its phases is mandatory for an optimization process. The aim of this paper is to define the practical and economic aspects of experimental 'in vivo' models designed for the validation of biomaterials, with particular respect to the managerial bookkeeping of consumer goods, based on the experience of our Institute. Tables were created in order to quantify the resources needed to perform experimental 'in vivo' models. These tables represent a reliable tool for continuous monitoring of managerial costs for the current year and for accurate budget planning for future years, considering the experimental projects in progress and the planned research. A business-like organization of public research facilities may lead to an optimization of costs and easier achievement of national and international funds, also increasing partnership with private clients.

  9. Alternatives to animal testing: research, trends, validation, regulatory acceptance.

    PubMed

    Huggins, Jane

    2003-01-01

    Current trends and issues in the development of alternatives to the use of animals in biomedical experimentation are discussed in this position paper. Eight topics are considered and include refinement of acute toxicity assays; eye corrosion/irritation alternatives; skin corrosion/irritation alternatives; contact sensitization alternatives; developmental/reproductive testing alternatives; genetic engineering (transgenic) assays; toxicogenomics; and validation of alternative methods. The discussion of refinement of acute toxicity assays is focused primarily on developments with regard to reduction of the number of animals used in the LD(50) assay. However, the substitution of humane endpoints such as clinical signs of toxicity for lethality in these assays is also evaluated. Alternative assays for eye corrosion/irritation as well as those for skin corrosion/irritation are described with particular attention paid to the outcomes, both successful and unsuccessful, of several validation efforts. Alternative assays for contact sensitization and developmental/reproductive toxicity are presented as examples of methods designed for the examination of interactions between toxins and somewhat more complex physiological systems. Moreover, genetic engineering and toxicogenomics are discussed with an eye toward the future of biological experimentation in general. The implications of gene manipulation for research animals, specifically, are also examined. Finally, validation methods are investigated as to their effectiveness, or lack thereof, and suggestions for their standardization and improvement, as well as implementation are reviewed.

  10. MARVEL analysis of the measured high-resolution spectra of 14NH3

    NASA Astrophysics Data System (ADS)

    Al Derzi, Afaf R.; Furtenbacher, Tibor; Tennyson, Jonathan; Yurchenko, Sergei N.; Császár, Attila G.

    2015-08-01

    Accurate, experimental rotational-vibrational energy levels and line positions, with associated labels and uncertainties, are reported for the ground electronic state of the symmetric-top 14NH3 molecule. All levels and lines are based on critically reviewed and validated high-resolution experimental spectra taken from 56 literature sources. The transition data are in the 0.7-17 000 cm-1 region, with a large gap between 7000 and 15 000 cm-1. The MARVEL (Measured Active Rotational-Vibrational Energy Levels) algorithm is used to determine the energy levels. Of the 29 450 measured transitions, 10 041 and 18 947 belong to ortho- and para-14NH3, respectively. A careful analysis of the related experimental spectroscopic network (SN) allows 28 530 of the measured transitions to be validated; 18 178 of these are unique, while 462 transitions belong to floating components. Despite the large number of spectroscopic measurements published over the last 80 years, the transitions determine only 30 vibrational band origins of 14NH3, 8 for ortho- and 22 for para-14NH3. The highest J value, where J stands for the rotational quantum number, for which an energy level is validated is 31. The number of experimental-quality ortho- and para-14NH3 rovibrational energy levels is 1724 and 3237, respectively. The MARVEL energy levels are checked against those in the BYTe first-principles database, determined previously. The lists of validated lines and levels for 14NH3 are deposited in the Supporting Information to this paper. Combination of the MARVEL energy levels with first-principles absorption intensities yields a huge number of experimental-quality rovibrational lines, which should prove useful for the understanding of future complex high-resolution spectroscopy of 14NH3; these lines are also deposited in the Supporting Information to this paper.
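
    At its core, the MARVEL procedure described above is a weighted least-squares inversion of measured line positions into level energies over the spectroscopic network. The following is a minimal sketch of that inversion on a hypothetical three-level network; the transitions, wavenumbers, and uncertainties below are invented for illustration and are not 14NH3 data.

```python
import numpy as np

# Hypothetical toy data: (upper_level, lower_level, wavenumber, uncertainty).
# A combination loop (2-0 vs 2-1 plus 1-0) provides the redundancy that
# lets the fit validate the measurements against each other.
transitions = [
    (1, 0, 950.0, 0.01),
    (2, 1, 970.0, 0.01),
    (2, 0, 1920.5, 0.02),
]

def marvel_levels(transitions, n_levels):
    """Weighted least-squares inversion of measured line positions
    (nu = E_upper - E_lower) into level energies, with the lowest
    level pinned to zero to remove the gauge freedom."""
    rows, rhs, weights = [], [], []
    for u, l, nu, sigma in transitions:
        row = np.zeros(n_levels)
        row[u], row[l] = 1.0, -1.0
        rows.append(row); rhs.append(nu); weights.append(1.0 / sigma)
    # Pin E_0 = 0 with a heavy weight
    row = np.zeros(n_levels); row[0] = 1.0
    rows.append(row); rhs.append(0.0); weights.append(1e6)
    w = np.array(weights)
    A = np.array(rows) * w[:, None]
    b = np.array(rhs) * w
    E, *_ = np.linalg.lstsq(A, b, rcond=None)
    return E

E = marvel_levels(transitions, n_levels=3)
```

    The fitted levels reconcile the slightly inconsistent direct (1920.5) and two-step (950.0 + 970.0) paths to level 2, weighting each line by its stated uncertainty.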

  11. Active vibration control activities at the LaRC - Present and future

    NASA Technical Reports Server (NTRS)

    Newsom, J. R.

    1990-01-01

    The NASA Controls-Structures-Interaction (CSI) program is presented with a description of the ground testing element objectives and approach. The goal of the CSI program is to develop and validate the technology required to design, verify and operate space systems in which the structure and the controls interact beneficially to meet the needs of future NASA missions. The operational Mini-Mast ground testbed and some sample active vibration control experimental results are discussed along with a description of the CSI Evolutionary Model testbed presently under development. Initial results indicate that embedded sensors and actuators are effective in controlling a large truss/reflector structure.

  12. Carrier statistics and quantum capacitance effects on mobility extraction in two-dimensional crystal semiconductor field-effect transistors

    NASA Astrophysics Data System (ADS)

    Ma, Nan; Jena, Debdeep

    2015-03-01

    In this work, the consequence of the high band-edge density of states on the carrier statistics and quantum capacitance in transition metal dichalcogenide two-dimensional semiconductor devices is explored. The study questions the validity of commonly used expressions for extracting carrier densities and field-effect mobilities from the transfer characteristics of transistors with such channel materials. By comparison to experimental data, a new method for the accurate extraction of carrier densities and mobilities is outlined. The work thus highlights a fundamental difference between these materials and traditional semiconductors that must be considered in future experimental measurements.
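
    The fundamental difference the study points to comes from degenerate 2D carrier statistics: with a step-like band-edge density of states, the sheet carrier density follows n = g2D·kT·ln(1 + exp(EF/kT)) rather than a simple linear capacitor relation, and the quantum capacitance Cq = q·dn/dEF saturates only well above the band edge. A minimal sketch of these textbook expressions follows; the effective mass and degeneracy are MoS2-like assumed values, not numbers from the paper.

```python
import math

# Physical constants
HBAR = 1.054571817e-34      # J*s
KB = 8.617333262e-5         # Boltzmann constant, eV/K
Q = 1.602176634e-19         # elementary charge, C
M0 = 9.1093837015e-31       # electron mass, kg
T = 300.0                   # temperature, K

# Assumed, illustrative material parameters (not from the paper)
m_eff = 0.45 * M0           # effective mass
g_sv = 4.0                  # spin x valley degeneracy
# Step-like 2D DOS above the band edge, in states per m^2 per eV
g2d = g_sv * m_eff / (2.0 * math.pi * HBAR**2) * Q

def carrier_density(ef_ev):
    """2D sheet density (m^-2) from Fermi-Dirac statistics with a
    step DOS: n = g2d * kT * ln(1 + exp(EF/kT)), EF relative to the
    conduction band edge in eV."""
    return g2d * KB * T * math.log1p(math.exp(ef_ev / (KB * T)))

def quantum_capacitance(ef_ev):
    """Cq = q * dn/dEF in F/m^2; saturates at q*g2d well above the
    band edge, so the gate charge is not simply C_ox*(Vg - VT)."""
    return Q * g2d / (1.0 + math.exp(-ef_ev / (KB * T)))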

  13. Designing biomedical proteomics experiments: state-of-the-art and future perspectives.

    PubMed

    Maes, Evelyne; Kelchtermans, Pieter; Bittremieux, Wout; De Grave, Kurt; Degroeve, Sven; Hooyberghs, Jef; Mertens, Inge; Baggerman, Geert; Ramon, Jan; Laukens, Kris; Martens, Lennart; Valkenborg, Dirk

    2016-05-01

    With the current expanded technical capabilities to perform mass spectrometry-based biomedical proteomics experiments, an improved focus on the design of experiments is crucial. As it is clear that ignoring the importance of a good design leads to an unprecedented rate of false discoveries that would poison our results, more and more tools are being developed to help researchers design proteomics experiments. In this review, we apply statistical thinking to the entire proteomics workflow for biomarker discovery and validation, and relate the considerations that should be made at the level of hypothesis building, technology selection, experimental design and the optimization of experimental parameters.

  14. Antecedents and Consequences of Supplier Performance Evaluation Efficacy

    DTIC Science & Technology

    2016-06-30

    forming groups of high and low values. These tests are contingent on the reliable and valid measure of high and low rating inflation and high and...year)? Future research could deploy a SPM system as a test case on a limited set of transactions. Using a quasi-experimental design, comparisons...single source, common method bias must be of concern. Harmon's one-factor test showed that when latent-indicator items were forced onto a single

  15. Off-Road Soft Soil Tire Model Development and Experimental Testing

    DTIC Science & Technology

    2011-06-29

    Eduardo Pinto 2, Mr. Scott Naranjo 3, Dr. Paramsothy Jayakumar 4, Dr. Archie Andonian 5, Dr. Dave Hubbell 6, Dr. Brant Ross 7 1Virginia...The effect of soil characteristics on the tire dynamics will be studied. Validation against data collected from full vehicle testing is included in...the proposed future work. Keywords: tire model, soft soil, terramechanics, vehicle dynamics, indoor testing 1 Introduction The goal of this paper is

  16. Social preferences of future physicians

    PubMed Central

    Li, Jing; Dow, William H.

    2017-01-01

    We measure the social preferences of a sample of US medical students and compare their preferences with those of the general population sampled in the American Life Panel (ALP). We also compare the medical students with a subsample of highly educated, wealthy ALP subjects as well as elite law school students and undergraduate students. We further associate the heterogeneity in social preferences within medical students to the tier ranking of their medical schools and their expected specialty choice. Our experimental design allows us to rigorously distinguish altruism from preferences regarding equality–efficiency tradeoffs and accurately measure both at the individual level rather than pooling data or assuming homogeneity across subjects. This is particularly informative, because the subjects in our sample display widely heterogeneous social preferences in terms of both their altruism and equality–efficiency tradeoffs. We find that medical students are substantially less altruistic and more efficiency focused than the average American. Furthermore, medical students attending the top-ranked medical schools are less altruistic than those attending lower-ranked schools. We further show that the social preferences of those attending top-ranked medical schools are statistically indistinguishable from the preferences of a sample of elite law school students. The key limitation of this study is that our experimental measures of social preferences have not yet been externally validated against actual physician practice behaviors. Pending this future research, we probed the predictive validity of our experimental measures of social preferences by showing that the medical students choosing higher-paying medical specialties are less altruistic than those choosing lower-paying specialties. PMID:29146826

  17. Transport phenomena in solidification processing of functionally graded materials

    NASA Astrophysics Data System (ADS)

    Gao, Juwen

    A combined numerical and experimental study of the transport phenomena during solidification processing of metal matrix composite functionally graded materials (FGMs) is conducted in this work. A multiphase transport model for the solidification of metal-matrix composite FGMs has been developed that accounts for macroscopic particle segregation due to liquid-particle flow and particle-solid interactions. An experimental study has also been conducted to gain physical insight as well as to validate the model. A novel method to measure the particle volume fraction in situ using fiber-optic probes is developed for transparent analogue solidification systems. The model is first applied to one-dimensional pure-matrix FGM solidification under gravity or a centrifugal field and is extensively validated against the experimental results. The mechanisms for the formation of the particle concentration gradient are identified. Two-dimensional solidification of pure-matrix FGM with convection is then studied using the model as well as experiments. The interaction among convection flow, the solidification process and particle transport is demonstrated. The results show the importance of convection in the formation of the particle concentration gradient. Then, simulations for alloy FGM solidification are carried out for unidirectional solidification as well as two-dimensional solidification with convection. The interplay among heat and species transport, convection and particle motion is investigated. Finally, future theoretical and experimental work is outlined.

  18. Utilizing Metalized Fabrics for Liquid and Rip Detection and Localization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Holland, Stephen; Mahan, Cody; Kuhn, Michael J

    2013-01-01

    This paper proposes a novel technique for utilizing conductive textiles as a distributed sensor for detecting and localizing liquids (e.g., blood), rips (e.g., bullet holes), and potentially biosignals. The proposed technique is verified through both simulation and experimental measurements. Circuit theory is utilized to depict conductive fabric as a bounded, near-infinite grid of resistors. Solutions to the well-known infinite resistance grid problem are used to confirm the accuracy and validity of this modeling approach. Simulations allow for discontinuities to be placed within the resistor matrix to illustrate the effects of bullet holes within the fabric. A real-time experimental system was developed that uses a multiplexed Wheatstone bridge approach to reconstruct the resistor grid across the conductive fabric and detect liquids and rips. The resistor grid model is validated through a comparison of simulated and experimental results. Results suggest accuracy proportional to the electrode spacing in determining the presence and location of discontinuities in conductive fabric samples. Future work is focused on refining the experimental system to provide more accuracy in detecting and localizing events as well as developing a complete prototype that can be deployed for field testing. Potential applications include intelligent clothing; flexible, lightweight sensing systems; and combat wound detection.
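
    The modeling approach described above, fabric as a bounded grid of unit resistors with rips as removed nodes, can be sketched with a graph Laplacian: the effective resistance between two electrode nodes is obtained from the Laplacian pseudoinverse, and cutting a region out of the grid raises it. The grid size and hole location below are hypothetical.

```python
import numpy as np

def grid_laplacian(n, removed=frozenset()):
    """Laplacian of an n x n grid of unit resistors; nodes listed in
    `removed` model a rip or bullet hole (all their edges are cut)."""
    N = n * n
    L = np.zeros((N, N))
    for r in range(n):
        for c in range(n):
            i = r * n + c
            if i in removed:
                continue
            for dr, dc in ((0, 1), (1, 0)):   # right and down neighbours
                rr, cc = r + dr, c + dc
                if rr < n and cc < n:
                    j = rr * n + cc
                    if j in removed:
                        continue
                    L[i, i] += 1; L[j, j] += 1
                    L[i, j] -= 1; L[j, i] -= 1
    return L

def effective_resistance(L, a, b):
    """R_ab = Lp[a,a] + Lp[b,b] - 2*Lp[a,b] via the pseudoinverse."""
    Lp = np.linalg.pinv(L)
    return Lp[a, a] + Lp[b, b] - 2 * Lp[a, b]

n = 8
L_intact = grid_laplacian(n)
hole = {3 * n + 3, 3 * n + 4, 4 * n + 3, 4 * n + 4}  # hypothetical rip
L_damaged = grid_laplacian(n, removed=hole)
r0 = effective_resistance(L_intact, 0, n * n - 1)
r1 = effective_resistance(L_damaged, 0, n * n - 1)
# Removing current paths raises the corner-to-corner resistance (r1 > r0)
```

    Localizing the rip then amounts to comparing measured resistances across many electrode pairs against this model, which is conceptually what the multiplexed Wheatstone bridge measurement does.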

  19. Validation of radiocarpal joint contact models based on images from a clinical MRI scanner.

    PubMed

    Johnson, Joshua E; McIff, Terence E; Lee, Phil; Toby, E Bruce; Fischer, Kenneth J

    2014-01-01

    This study was undertaken to assess magnetic resonance imaging (MRI)-based radiocarpal surface contact models of functional loading in a clinical MRI scanner for future in vivo studies, by comparison with experimental measures from three cadaver forearm specimens. Experimental data were acquired using a Tekscan sensor during simulated light grasp. Magnetic resonance (MR) images were used to obtain model geometry and kinematics (image registration). Peak contact pressures (PPs) and average contact pressures (APs), contact forces and contact areas were determined in the radiolunate and radioscaphoid joints. Contact area was also measured directly from MR images acquired with load and compared with model data. Based on the validation criteria (within 25% of experimental data), out of the six articulations (three specimens with two articulations each), two met the criterion for AP (0%, 14%); one for peak pressure (20%); one for contact force (5%); four for contact area with respect to experiment (8%, 13%, 19% and 23%); and three contact areas met the criterion with respect to direct measurements (14%, 21% and 21%). Absolute differences between model and experimental PPs were reasonably low (within 2.5 MPa). Overall, the results indicate that MRI-based models generated from a 3T clinical MR scanner appear sufficient to obtain clinically relevant data.

  20. Diverse convergent evidence in the genetic analysis of complex disease: coordinating omic, informatic, and experimental evidence to better identify and validate risk factors

    PubMed Central

    2014-01-01

    In omic research, such as genome wide association studies, researchers seek to repeat their results in other datasets to reduce false positive findings and thus provide evidence for the existence of true associations. Unfortunately this standard validation approach cannot completely eliminate false positive conclusions, and it can also mask many true associations that might otherwise advance our understanding of pathology. These issues beg the question: How can we increase the amount of knowledge gained from high throughput genetic data? To address this challenge, we present an approach that complements standard statistical validation methods by drawing attention to both potential false negative and false positive conclusions, as well as providing broad information for directing future research. The Diverse Convergent Evidence approach (DiCE) we propose integrates information from multiple sources (omics, informatics, and laboratory experiments) to estimate the strength of the available corroborating evidence supporting a given association. This process is designed to yield an evidence metric that has utility when etiologic heterogeneity, variable risk factor frequencies, and a variety of observational data imperfections might lead to false conclusions. We provide proof of principle examples in which DiCE identified strong evidence for associations that have established biological importance, when standard validation methods alone did not provide support. If used as an adjunct to standard validation methods this approach can leverage multiple distinct data types to improve genetic risk factor discovery/validation, promote effective science communication, and guide future research directions. PMID:25071867

  1. Controls-structures interaction guest investigator program: Overview and phase 1 experimental results and future plans

    NASA Technical Reports Server (NTRS)

    Smith-Taylor, Rudeen; Tanner, Sharon E.

    1993-01-01

    The NASA Controls-Structures Interaction (CSI) Guest Investigator program is described in terms of its support of the development of CSI technologies. The program is based on the introduction of CSI researchers from industry and academia to available test facilities for experimental validation of technologies and methods. Phase 1 experimental results are reviewed with attention given to their use of the Mini-MAST test facility and the facility for the Advanced Control Evaluation of Structures. Experiments were conducted regarding the following topics: collocated/noncollocated controllers, nonlinear math modeling, controller design, passive/active suspension systems design, and system identification and fault isolation. The results demonstrate that significantly enhanced performance from the control techniques can be achieved by integrating knowledge of the structural dynamics under consideration into the approaches.

  2. Experimental validation of an analytical kinetic model for edge-localized modes in JET-ITER-like wall

    NASA Astrophysics Data System (ADS)

    Guillemaut, C.; Metzger, C.; Moulton, D.; Heinola, K.; O’Mullane, M.; Balboa, I.; Boom, J.; Matthews, G. F.; Silburn, S.; Solano, E. R.; JET contributors

    2018-06-01

    The design and operation of future fusion devices relying on H-mode plasmas requires reliable modelling of edge-localized modes (ELMs) for precise prediction of divertor target conditions. An extensive experimental validation of simple analytical predictions of the time evolution of target plasma loads during ELMs has been carried out here in more than 70 JET-ITER-like wall H-mode experiments with a wide range of conditions. Comparisons of these analytical predictions with diagnostic measurements of target ion flux density, power density, impact energy and electron temperature during ELMs are presented in this paper and show excellent agreement. The analytical predictions tested here are made with the ‘free-streaming’ kinetic model (FSM) which describes ELMs as a quasi-neutral plasma bunch expanding along the magnetic field lines into the Scrape-Off Layer without collisions. Consequences of the FSM on energy reflection and deposition on divertor targets during ELMs are also discussed.

  3. Treatment efficiency and stoichiometry of a high-strength graywater.

    PubMed

    Morse, Audra; Khatri, Sukrut; Jackson, W Andrew

    2007-12-01

    The transit mission wastewater may represent a future graywater, in which toilet waste is separated from other household waste streams and dilution water is minimal. A loading-rate study indicated that denitrification was stoichiometrically limited and nitrification was kinetically limited. Denitrification stoichiometry was developed by deriving hypothetical molecular formulas of the organic carbon inputs, represented by the relative proportions of carbon, hydrogen, oxygen, and nitrogen. The derived stoichiometry was validated against experimental data by adjusting the values of fe and fs, multiplying the total dissolved organic carbon loss across the system by the overall R equation, and then comparing the total nitrogen removed in the reaction to the experimentally observed total nitrogen removal. The nitrification stoichiometry was similarly validated by multiplying the R equation by the ammonium-nitrogen removed and then comparing the NO(x)-N formed in the equation to actual NO(x)-N production values. The fs values for the denitrifying and nitrifying bacteria were 0.33 and 0.15, respectively.

  4. Predicting cancerlectins by the optimal g-gap dipeptides

    NASA Astrophysics Data System (ADS)

    Lin, Hao; Liu, Wei-Xin; He, Jiao; Liu, Xin-Hui; Ding, Hui; Chen, Wei

    2015-12-01

    The cancerlectin plays a key role in the process of tumor cell differentiation. Thus, fully understanding the function of cancerlectins is significant because it sheds light on future directions for cancer therapy. However, traditional wet-experimental methods are money- and time-consuming. It is highly desirable to develop an effective and efficient computational tool to identify cancerlectins. In this study, we developed a sequence-based method to discriminate between cancerlectins and non-cancerlectins. The analysis of variance (ANOVA) was used to choose the optimal feature set derived from the g-gap dipeptide composition. The jackknife cross-validation results showed that the proposed method achieved an accuracy of 75.19%, which is superior to other published methods. For the convenience of other researchers, an online web server, CaLecPred, was established and can be freely accessed at http://lin.uestc.edu.cn/server/CalecPred. We believe that CaLecPred is a powerful tool to study cancerlectins and to guide related experimental validations.
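
    The g-gap dipeptide composition used as the feature set above has a simple definition: the frequencies of the 400 ordered residue pairs whose two members are separated by g intervening residues (g = 0 gives ordinary adjacent dipeptides). A minimal sketch follows; the example sequence is arbitrary, and the published method additionally ranks these 400 features by ANOVA F-scores to select the optimal subset.

```python
from collections import Counter
from itertools import product

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"

def g_gap_dipeptide_composition(seq, g=0):
    """Return the 400-dimensional frequency vector of ordered residue
    pairs X...Z where Z occurs g residues after X in the sequence."""
    pairs = [seq[i] + seq[i + g + 1] for i in range(len(seq) - g - 1)]
    counts = Counter(pairs)
    total = len(pairs)
    # Fixed feature order: AA, AC, ..., YY
    return [counts[a + b] / total for a, b in product(AMINO_ACIDS, repeat=2)]

vec = g_gap_dipeptide_composition("MKVLAAGLLALA", g=2)
# 400 non-negative frequencies that sum to 1
```

    Each protein is thus mapped to a fixed-length vector regardless of its sequence length, which is what makes standard classifiers applicable downstream.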

  5. Numerical Modeling of Active Flow Control in a Boundary Layer Ingesting Offset Inlet

    NASA Technical Reports Server (NTRS)

    Allan, Brian G.; Owens, Lewis R.; Berrier, Bobby L.

    2004-01-01

    This investigation evaluates the numerical prediction of flow distortion and pressure recovery for a boundary layer ingesting offset inlet with active flow control devices. The numerical simulations are computed using a Reynolds averaged Navier-Stokes code developed at NASA. The numerical results are validated by comparison to experimental wind tunnel tests conducted at NASA Langley Research Center at both low and high Mach numbers. Baseline comparisons showed good agreement between numerical and experimental results. Numerical simulations for the inlet with passive and active flow control also showed good agreement at low Mach numbers where experimental data has already been acquired. Numerical simulations of the inlet at high Mach numbers with flow control jets showed an improvement of the flow distortion. Studies on the location of the jet actuators, for the high Mach number case, were conducted to provide guidance for the design of a future experimental wind tunnel test.

  6. Fatigue Life Prediction Based on Crack Closure and Equivalent Initial Flaw Size

    PubMed Central

    Wang, Qiang; Zhang, Wei; Jiang, Shan

    2015-01-01

    Failure analysis and fatigue life prediction are necessary and critical for engineering structural materials. In this paper, a general methodology is proposed to predict the fatigue life of smooth and circular-hole specimens, in which the crack closure model and the equivalent initial flaw size (EIFS) concept are employed. Different effects of crack closure on the small crack growth region and the long crack growth region are considered in the proposed method. The EIFS is determined by the fatigue limit and the fatigue threshold stress intensity factor ΔKth. The fatigue limit is directly obtained from experimental data, and ΔKth is calculated by using a back-extrapolation method. Experimental data for smooth and circular-hole specimens in three different alloys (Al2024-T3, Al7075-T6 and Ti-6Al-4V) under multiple stress ratios are used to validate the method. In the validation section, a semi-circular surface crack and a quarter-circular corner crack are assumed to be the initial crack shapes for the smooth and circular-hole specimens, respectively. A good agreement is observed between model predictions and experimental data. Detailed analysis and discussion are performed on the proposed model. Some conclusions and future work are given. PMID:28793625
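
    Determining the EIFS from the fatigue limit and ΔKth, as described above, can be written in closed form by setting the threshold condition ΔKth = Y·Δσf·√(πa) at the fatigue limit and solving for the crack size. A minimal sketch follows; the geometry factor Y and the numerical inputs are assumed illustrative values, not data from the paper.

```python
import math

def eifs(delta_k_th, fatigue_limit, Y):
    """Equivalent initial flaw size from the threshold condition
        dK_th = Y * ds_f * sqrt(pi * a)   =>   a = (dK_th / (Y * ds_f))**2 / pi
    delta_k_th in MPa*sqrt(m), fatigue_limit in MPa, result in m.
    Y is the crack-geometry factor (assumed, e.g. for a semi-circular
    surface crack)."""
    return (delta_k_th / (Y * fatigue_limit)) ** 2 / math.pi

# Illustrative aluminium-alloy-like numbers (assumptions, not the paper's data):
a0 = eifs(delta_k_th=2.5, fatigue_limit=140.0, Y=0.73)  # on the order of 1e-4 m
```

    The resulting flaw size then serves as the starting crack length for the crack growth integration, with the crack closure model modifying the effective driving force in the small and long crack regimes.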

  7. The Elastic Behaviour of Sintered Metallic Fibre Networks: A Finite Element Study by Beam Theory

    PubMed Central

    Bosbach, Wolfram A.

    2015-01-01

    Background The finite element method has complemented research in the field of network mechanics in recent years in numerous studies of various materials. Numerical predictions and the planning efficiency of experimental procedures are two of the motivations for these numerical studies. The widespread availability of high-performance computing facilities has enabled the simulation of sufficiently large systems. Objectives and Motivation In the present study, finite element models were built for sintered, metallic fibre networks and validated by previously published experimental stiffness measurements. The validated models were the basis for predictions about so far unknown properties. Materials and Methods The finite element models were built by transferring previously published skeletons of fibre networks into finite element models. Beam theory was applied as a simplification method. Results and Conclusions The obtained material stiffness is not a constant but rather a function of variables such as sample size and boundary conditions. Beam theory offers an efficient finite element method for the simulated fibre networks. The experimental results can be approximated by the simulated systems. Two worthwhile aspects for future work will be the influence of size and shape and the mechanical interaction with matrix materials. PMID:26569603

  8. Experimental Validation of Displacement Underestimation in ARFI Ultrasound

    PubMed Central

    Czernuszewicz, Tomasz J.; Streeter, Jason E.; Dayton, Paul A.; Gallippi, Caterina M.

    2014-01-01

    Acoustic radiation force impulse (ARFI) imaging is an elastography technique that uses ultrasonic pulses to both displace and track tissue motion. Previous modeling studies have shown that ARFI displacements are susceptible to underestimation due to lateral and elevational shearing that occurs within the tracking resolution cell. In this study, optical tracking was utilized to experimentally measure the displacement underestimation achieved by acoustic tracking using a clinical ultrasound system. Three optically translucent phantoms of varying stiffness were created, embedded with sub-wavelength diameter microspheres, and ARFI excitation pulses with F/1.5 or F/3 lateral focal configurations were transmitted from a standard linear array to induce phantom motion. Displacements were tracked using confocal optical and acoustic methods. As predicted by earlier FEM studies, significant acoustic displacement underestimation was observed for both excitation focal configurations; the maximum underestimation error was 35% of the optically measured displacement for the F/1.5 excitation pulse in the softest phantom. Using higher F/#, less tightly focused beams in the lateral dimension improved accuracy of displacements by approximately 10 percentage points. This work experimentally demonstrates limitations of ARFI implemented on a clinical scanner using a standard linear array and sets up a framework for future displacement tracking validation studies. PMID:23858054

  9. Image-based multi-scale simulation and experimental validation of thermal conductivity of lanthanum zirconate

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Guo, Xingye; Hu, Bin; Wei, Changdong

    Lanthanum zirconate (La2Zr2O7) is a promising candidate material for thermal barrier coating (TBC) applications due to its low thermal conductivity and high-temperature phase stability. In this work, a novel image-based multi-scale simulation framework combining molecular dynamics (MD) and finite element (FE) calculations is proposed to study the thermal conductivity of La2Zr2O7 coatings. Since there is no experimental data of single crystal La2Zr2O7 thermal conductivity, a reverse non-equilibrium molecular dynamics (reverse NEMD) approach is first employed to compute the temperature-dependent thermal conductivity of single crystal La2Zr2O7. The single crystal data is then passed to a FE model which takes into account realistic thermal barrier coating microstructures. The predicted thermal conductivities from the FE model are in good agreement with experimental validations using both the flash laser technique and pulsed thermal imaging-multilayer analysis. The framework proposed in this work provides a powerful tool for future design of advanced coating systems.

  10. Overview of HIT-SI3 experiment: Simulations, Diagnostics, and Summary of Current Results

    NASA Astrophysics Data System (ADS)

    Penna, James; Jarboe, Thomas; Nelson, Brian; Hossack, Aaron; Sutherland, Derek; Morgan, Kyle; Hansen, Chris; Benedett, Thomas; Everson, Chris; Victor, Brian

    2016-10-01

    The Helicity Injected Torus - Steady Inductive 3 (HIT-SI3) experiment forms and maintains spheromaks via Steady Inductive Helicity Injection (SIHI). Three injector units allow for continuous injection of helicity into a copper flux conserver in order to sustain a spheromak. Firing the injectors with a phase difference allows finite rotation of the plasma to provide a stabilizing effect. Simulations in the MHD code NIMROD and the fluid-model code PSI-TET provide validation and a basis for interpretation of the observed experimental data. Thomson scattering (TS) and far-infrared (FIR) interferometer systems allow temperature and line-averaged density measurements to be taken. An ion Doppler spectroscopy (IDS) system allows measurement of the plasma rotation and velocity. HIT-SI3 data have been used for validation of IDCD predictions, in particular the projected impedance of helicity injectors according to the theory. The experimental impedances have been calculated here for the first time for different HIT-SI3 regimes. Such experimental evidence will contribute to the design of future experiments employing IDCD as a current-drive mechanism. Work supported by the D.O.E., Office of Science, Office of Fusion Science.

  11. Maytenus heterophylla and Maytenus senegalensis, two traditional herbal medicines

    PubMed Central

    da Silva, G.; Serrano, R.; Silva, O.

    2011-01-01

    Maytenus heterophylla (Eckl. and Zeyh.) N.K.B. Robson and Maytenus senegalensis (Lam.) Exell are two African shrubs or trees that go under the common name of spike thorn and belong to the Celastraceae family. Different parts of these plants are widely used in traditional medicine for the treatment of infectious and inflammatory diseases. Several studies have been reported for both species, but there are no recent review articles focusing on microscopic, phytochemical and pharmacological studies. The aim of this review is to summarize the information about these two African traditional medicines. Such data can be applied in future experimental work and may guide future studies, namely in the field of validation of traditional medicine. PMID:22470236

  12. Modelling structural and plasma facing materials for fusion power plants: Recent advances and outstanding issues in the EURATOM fusion materials programme

    NASA Astrophysics Data System (ADS)

    Boutard, Jean-Louis; Dudarev, Sergei; Rieth, Michael

    2011-10-01

    EFDA Fusion Materials Topical Group was established at the end of 2007 to coordinate the EU effort on the development of structural and protection materials able to withstand the very demanding operating conditions of a future DEMO power plant. Focusing on a selection of well identified materials issues, including the behaviour of Reduced Activation Ferritic-Martensitic steels, and W-alloys under the foreseen operation conditions in a future DEMO, this paper describes recent advances in physical modelling and experimental validation, contributing to the definition of chemical composition and microstructure of materials with improved in-service stability at high temperature, high neutron flux and intense ion bombardment.

  13. Maytenus heterophylla and Maytenus senegalensis, two traditional herbal medicines.

    PubMed

    da Silva, G; Serrano, R; Silva, O

    2011-01-01

    Maytenus heterophylla (Eckl. and Zeyh.) N.K.B. Robson and Maytenus senegalensis (Lam.) Exell are two African shrubs or trees that go under the common name of spike thorn and belong to the Celastraceae family. Different plant parts of these species are widely used in traditional medicine for the treatment of infectious and inflammatory diseases. Several studies have been reported for both species, but there are no recent review articles focusing on their microscopic, phytochemical and pharmacological characterization. The aim of this review is to summarize the available information about these two African traditional medicines. Such data can be applied in future experimental work and may guide future studies, namely in the field of validation of traditional medicine.

  14. A critical evaluation of the validity of episodic future thinking: A clinical neuropsychology perspective.

    PubMed

    Ward, Amanda M

    2016-11-01

    Episodic future thinking is defined as the ability to mentally simulate a future event. Although episodic future thinking has been studied extensively in neuroscience, this construct has not been explored in depth from the perspective of clinical neuropsychology. The aim of this critical narrative review is to assess the validity and clinical implications of episodic future thinking. A systematic review of episodic future thinking literature was conducted. PubMed and PsycInfo were searched through July 2015 for review and empirical articles with the following search terms: "episodic future thinking," "future mental simulation," "imagining the future," "imagining new experiences," "future mental time travel," "future autobiographical experience," and "prospection." The review discusses evidence that episodic future thinking is important for adaptive functioning, which has implications for neurological populations. To determine the validity of episodic future thinking, the construct is evaluated with respect to related constructs, such as imagination, episodic memory, autobiographical memory, prospective memory, narrative construction, and working memory. Although it has been minimally investigated, there is evidence of convergent and discriminant validity for episodic future thinking. Research has not addressed the incremental validity of episodic future thinking. Practical considerations of episodic future thinking tasks and related constructs in a clinical neuropsychological setting are considered. The utility of episodic future thinking is currently unknown due to the lack of research investigating the validity of episodic future thinking. Future work is discussed, which could determine whether episodic future thinking is an important missing piece in standard clinical neuropsychological assessment.

  15. Validation and Continued Development of Methods for Spheromak Simulation

    NASA Astrophysics Data System (ADS)

    Benedett, Thomas

    2017-10-01

    The HIT-SI experiment has demonstrated stable sustainment of spheromaks. Determining how the underlying physics extrapolate to larger, higher-temperature regimes is of prime importance in determining the viability of the inductively-driven spheromak. It is thus prudent to develop and validate a computational model that can be used to study current results and study the effect of possible design choices on plasma behavior. An extended MHD model has shown good agreement with experimental data at 14 kHz injector operation. Efforts to extend the existing validation to a range of higher frequencies (36, 53, 68 kHz) using the PSI-Tet 3D extended MHD code will be presented, along with simulations of potential combinations of flux conserver features and helicity injector configurations and their impact on current drive performance, density control, and temperature for future SIHI experiments. Work supported by USDoE.

  16. Validation of a quantized-current source with 0.2 ppm uncertainty

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stein, Friederike; Fricke, Lukas, E-mail: lukas.fricke@ptb.de; Scherer, Hansjörg

    2015-09-07

    We report on high-accuracy measurements of quantized current, sourced by a tunable-barrier single-electron pump at frequencies f up to 1 GHz. The measurements were performed with an ultrastable picoammeter instrument, traceable to the Josephson and quantum Hall effects. Current quantization according to I = ef with e being the elementary charge was confirmed at f = 545 MHz with a total relative uncertainty of 0.2 ppm, improving the state of the art by about a factor of 5. The accuracy of a possible future quantum current standard based on single-electron transport was experimentally validated to be better than the best (indirect) realization of the ampere within the present SI.
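The quantization relation I = ef quoted above is easy to check numerically; the sketch below (using the exact 2019 SI value of the elementary charge, with the frequencies and uncertainty taken from the abstract) computes the expected pump currents and the absolute uncertainty implied by the reported 0.2 ppm figure:

```python
# Ideal single-electron pump current I = e*f at the reported frequencies.
E_CHARGE = 1.602176634e-19  # elementary charge in coulombs (exact, 2019 SI)

def pump_current(frequency_hz: float) -> float:
    """Ideal quantized current of a single-electron pump, I = e*f."""
    return E_CHARGE * frequency_hz

i_545mhz = pump_current(545e6)  # validation point of the experiment, ~87.3 pA
i_1ghz = pump_current(1e9)      # upper end of the pumping range, ~160.2 pA

# The reported 0.2 ppm relative uncertainty at 545 MHz corresponds to an
# absolute current uncertainty of only ~1.7e-17 A (tens of attoamperes).
u_abs = 0.2e-6 * i_545mhz
```

At 545 MHz the ideal current is only about 87 pA, which illustrates why an ultrastable, quantum-traceable picoammeter is needed for a sub-ppm validation.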

  17. Guidelines for Acoustical Measurements Inside Historical Opera Houses: Procedures and Validation

    NASA Astrophysics Data System (ADS)

    POMPOLI, ROBERTO; PRODI, NICOLA

    2000-04-01

    The acoustics of Italian historical theatres is to be regarded as a cultural heritage to be preserved and studied; these actions are imperative for handing the heritage down to future generations and for avoiding its loss. In this paper, the technical means for scientific quantification of the acoustical heritage are presented in the form of operative guidelines for acoustical measurements inside historical theatres. The document includes the advice of international experts and is being employed during an extended measurement campaign inside renaissance and baroque historical theatres. A relevant part of the paper deals with the experimental validation of the recommendations given in the guidelines, achieved by a dedicated test session inside the Municipal Theatre of Ferrara.

  18. Excess success for three related papers on racial bias.

    PubMed

    Francis, Gregory

    2015-01-01

    Three related articles reported that racial bias altered perceptual experience and influenced decision-making. These findings have been applied to training programs for law enforcement, and elsewhere, to mitigate racial bias. However, a statistical analysis of each of the three articles finds that the reported experimental results should be rare, even if the theoretical ideas were correct. The analysis estimates that the probability of the reported experimental success for the articles is 0.003, 0.048, and 0.070, respectively. These low probabilities suggest that similar future work is unlikely to produce equally successful outcomes and indicate that readers should be skeptical about the validity of the reported findings and their theoretical implications. The reported findings should not be used to guide policies related to racial bias, and new experimental work is needed to judge the merit of the theoretical ideas.
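The statistical argument can be illustrated with a toy calculation. If an article reports n independent experiments that all reached significance, and each experiment had statistical power p_i, the probability of observing that full set of successes is the product of the p_i. The power values below are purely hypothetical, not those estimated in the analysis:

```python
import math

def all_success_probability(powers):
    """Probability that every experiment in a multi-experiment article
    reaches significance, assuming independent experiments, each with
    the given statistical power."""
    return math.prod(powers)

# Hypothetical article: four experiments, each with an estimated power
# of 0.45 -- individually plausible, jointly improbable (~0.041).
p = all_success_probability([0.45, 0.45, 0.45, 0.45])
```

Even with individually respectable power, a string of four successes is already improbable, which is the sense in which uniformly successful multi-experiment articles can signal excess success.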

  19. Structural Dynamics Experimental Activities in Ultra-Lightweight and Inflatable Space Structures

    NASA Technical Reports Server (NTRS)

    Pappa, Richard S.; Lassiter, John O.; Ross, Brian P.

    2001-01-01

    This paper reports recently completed structural dynamics experimental activities with new ultralightweight and inflatable space structures (a.k.a., "Gossamer" spacecraft) at NASA Langley Research Center, NASA Marshall Space Flight Center, and NASA Goddard Space Flight Center. Nine aspects of this work are covered, as follows: 1) inflated, rigidized tubes, 2) active control experiments, 3) photogrammetry, 4) laser vibrometry, 5) modal tests of inflatable structures, 6) in-vacuum modal tests, 7) tensioned membranes, 8) deployment tests, and 9) flight experiment support. Structural dynamics will play a major role in the design and eventual in-space deployment and performance of Gossamer spacecraft, and experimental R&D work such as this is required now to validate new analytical prediction methods. The activities discussed in the paper are pathfinder accomplishments, conducted on unique components and prototypes of future spacecraft systems.

  20. Biomarkers of Exposure to Toxic Substances. Volume 5: Biomarker Pre-validation Studies Prevalidation of Urine and Serum Biomarkers Indicative of Subclinical Kidney Damage in a Nephrotoxin Model

    DTIC Science & Technology

    2009-05-01

    Two experimental nephrotoxins were interrogated, D-serine and puromycin, each previously demonstrated to degrade a specific kidney segment (proximal tubule and glomerulus, respectively). In this study a total of seventeen protein biomarkers were...exposure. ...to degradation during isolation from sample render it unlikely to develop into a fieldable, self-contained assay system within the near future

  1. Validity and inter-observer reliability of subjective hand-arm vibration assessments.

    PubMed

    Coenen, Pieter; Formanoy, Margriet; Douwes, Marjolein; Bosch, Tim; de Kraker, Heleen

    2014-07-01

    Exposure to mechanical vibrations at work (e.g., due to handling powered tools) is a potential occupational risk, as it may cause upper extremity complaints. However, reliable and valid assessment methods for vibration exposure at work are lacking. Measuring hand-arm vibration objectively is often difficult and expensive, while the manufacturer-provided information that is often used instead lacks detail. Therefore, a subjective hand-arm vibration assessment method was tested for validity and inter-observer reliability. In an experimental protocol, sixteen tasks handling powered tools were executed by two workers. Hand-arm vibration was assessed subjectively by 16 observers according to the proposed subjective assessment method. As a gold-standard reference, hand-arm vibration was measured objectively using a vibration measurement device. Weighted κ's were calculated to assess validity; intra-class correlation coefficients (ICCs) were calculated to assess inter-observer reliability. Inter-observer reliability of the subjective assessments, depicting the agreement among observers, can be expressed by an ICC of 0.708 (0.511-0.873). The validity of the subjective assessments as compared to the gold-standard reference can be expressed by a weighted κ of 0.535 (0.285-0.785). Moreover, the percentage of exact agreement of the subjective assessment compared to the objective measurement was relatively low (i.e., 52% of all tasks). This study shows that subjectively assessed hand-arm vibrations are fairly reliable among observers and moderately valid. This assessment method is a first attempt to use subjective risk assessments of hand-arm vibration. Although it could benefit from future improvement, it can be of use in future studies and in field-based ergonomic assessments.
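As a concrete illustration of the agreement statistic reported here, a linearly weighted Cohen's κ for two raters on an ordinal scale can be computed from a confusion matrix of counts; the matrix below is an invented example, not the study's data:

```python
def weighted_kappa(confusion):
    """Linearly weighted Cohen's kappa for two raters on an ordinal
    scale, computed from a k x k confusion matrix of counts."""
    k = len(confusion)
    n = sum(sum(row) for row in confusion)
    row_tot = [sum(row) for row in confusion]
    col_tot = [sum(confusion[i][j] for i in range(k)) for j in range(k)]
    observed = expected = 0.0
    for i in range(k):
        for j in range(k):
            w = abs(i - j) / (k - 1)                 # linear disagreement weight
            observed += w * confusion[i][j]           # observed weighted disagreement
            expected += w * row_tot[i] * col_tot[j] / n  # expected under chance
    return 1.0 - observed / expected

# Hypothetical 3-level exposure ratings (low/medium/high) by two observers;
# rows are observer A's ratings, columns observer B's.
example = [[10, 3, 0],
           [2, 8, 2],
           [0, 3, 12]]
kappa = weighted_kappa(example)  # ~0.72, i.e. substantial agreement
```

Off-diagonal cells are penalized in proportion to how far apart the two ratings lie, which is why weighted κ suits ordinal exposure scales better than simple percent agreement.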

  2. Nuclear Energy Knowledge and Validation Center (NEKVaC) Needs Workshop Summary Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gougar, Hans

    2015-02-01

    The Department of Energy (DOE) has made significant progress developing simulation tools to predict the behavior of nuclear systems with greater accuracy and of increasing our capability to predict the behavior of these systems outside of the standard range of applications. These analytical tools require a more complex array of validation tests to accurately simulate the physics and multiple length and time scales. Results from modern simulations will allow experiment designers to narrow the range of conditions needed to bound system behavior and to optimize the deployment of instrumentation to limit the breadth and cost of the campaign. Modern validation, verification and uncertainty quantification (VVUQ) techniques enable analysts to extract information from experiments in a systematic manner and provide the users with a quantified uncertainty estimate. Unfortunately, the capability to perform experiments that would enable taking full advantage of the formalisms of these modern codes has progressed relatively little (with some notable exceptions in fuels and thermal-hydraulics); the majority of the experimental data available today is the "historic" data accumulated over the last decades of nuclear systems R&D. A validated code-model is a tool for users. An unvalidated code-model is useful for code developers to gain understanding, publish research results, attract funding, etc. As nuclear analysis codes have become more sophisticated, so have the measurement and validation methods and the challenges that confront them. A successful yet cost-effective validation effort requires expertise possessed only by a few, resources possessed only by the well-capitalized (or a willing collective), and a clear, well-defined objective (validating a code that is developed to satisfy the need(s) of an actual user).
To that end, the Idaho National Laboratory established the Nuclear Energy Knowledge and Validation Center to address the challenges of modern code validation and to manage the knowledge from past, current, and future experimental campaigns. By pulling together the best minds involved in code development, experiment design, and validation to establish and disseminate best practices and new techniques, the Nuclear Energy Knowledge and Validation Center (NEKVaC or the ‘Center’) will be a resource for industry, DOE Programs, and academia validation efforts.

  3. Bayesian Adaptive Trial Design for a Newly Validated Surrogate Endpoint

    PubMed Central

    Renfro, Lindsay A.; Carlin, Bradley P.; Sargent, Daniel J.

    2011-01-01

    Summary The evaluation of surrogate endpoints for primary use in future clinical trials is an increasingly important research area, due to demands for more efficient trials coupled with recent regulatory acceptance of some surrogates as ‘valid.’ However, little consideration has been given to how a trial which utilizes a newly-validated surrogate endpoint as its primary endpoint might be appropriately designed. We propose a novel Bayesian adaptive trial design that allows the new surrogate endpoint to play a dominant role in assessing the effect of an intervention, while remaining realistically cautious about its use. By incorporating multi-trial historical information on the validated relationship between the surrogate and clinical endpoints, then subsequently evaluating accumulating data against this relationship as the new trial progresses, we adaptively guard against an erroneous assessment of treatment based upon a truly invalid surrogate. When the joint outcomes in the new trial seem plausible given similar historical trials, we proceed with the surrogate endpoint as the primary endpoint, and do so adaptively–perhaps stopping the trial for early success or inferiority of the experimental treatment, or for futility. Otherwise, we discard the surrogate and switch adaptive determinations to the original primary endpoint. We use simulation to test the operating characteristics of this new design compared to a standard O’Brien-Fleming approach, as well as the ability of our design to discriminate trustworthy from untrustworthy surrogates in hypothetical future trials. Furthermore, we investigate possible benefits using patient-level data from 18 adjuvant therapy trials in colon cancer, where disease-free survival is considered a newly-validated surrogate endpoint for overall survival. PMID:21838811

  4. Computational fluid dynamic modeling of a medium-sized surface mine blasthole drill shroud

    PubMed Central

    Zheng, Y.; Reed, W.R.; Zhou, L.; Rider, J.P.

    2016-01-01

    The Pittsburgh Mining Research Division of the U.S. National Institute for Occupational Safety and Health (NIOSH) recently developed a series of models using computational fluid dynamics (CFD) to study airflows and respirable dust distribution associated with a medium-sized surface blasthole drill shroud with a dry dust collector system. Experiments previously conducted in NIOSH’s full-scale drill shroud laboratory were used to validate the models. The setup values in the CFD models were calculated from experimental data obtained from the drill shroud laboratory and measurements of test material particle size. Subsequent simulation results were compared with the experimental data for several test scenarios, including 0.14 m3/s (300 cfm) and 0.24 m3/s (500 cfm) bailing airflow with 2:1, 3:1 and 4:1 dust collector-to-bailing airflow ratios. For the 2:1 and 3:1 ratios, the calculated dust concentrations from the CFD models were within the 95 percent confidence intervals of the experimental data. This paper describes the methodology used to develop the CFD models, to calculate the model input and to validate the models based on the experimental data. Problem regions were identified and revealed by the study. The simulation results could be used for future development of dust control methods for a surface mine blasthole drill shroud. PMID:27932851

  5. Methodological review of the quality of reach out and read: does it "work"?

    PubMed

    Yeager Pelatti, Christina; Pentimonti, Jill M; Justice, Laura M

    2014-04-01

    A considerable percentage of American children and adults fail to learn adequate literacy skills and read below a third grade level. Shared book reading is perhaps the single most important activity to prepare young children for success in reading. The primary objective of this manuscript was to critically review the methodological quality of Reach Out and Read (ROR), a clinically based literacy program/intervention that teaches parents strategies to incorporate while sharing books with children as a method of preventing reading difficulties and academic struggles. A PubMed search was conducted. Articles that met three criteria were considered. First, the study must be clinically based and include parent contact with a pediatrician. Second, parental counseling ("anticipatory guidance") about the importance of parent-child book reading must be included. Third, only experimental or quasi-experimental studies were included; no additional criteria were used. Published articles from any year and peer-reviewed journal were considered. Study quality was determined using a modified version of the Downs and Black (1998) checklist assessing four categories: (1) Reporting, (2) External Validity, (3) Internal Validity-Bias, and (4) Internal Validity-Confounding. We were also interested in whether quality differed based on study design, children's age, sample size, and study outcome. Eleven studies met the inclusion criteria. The overall quality of evidence was variable across all studies; Reporting and External Validity categories were relatively strong while methodological concerns were found in the area of internal validity. Quality scores differed on the four study characteristics. Implications related to clinical practice and future studies are discussed.

  6. A new fast two-color interferometer at Alcator C-Mod for turbulence measurements and comparison with phase contrast imaging

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kasten, C. P., E-mail: ckasten@alum.mit.edu; White, A. E.; Irby, J. H.

    2014-04-15

    Accurately predicting the turbulent transport properties of magnetically confined plasmas is a major challenge of fusion energy research. Validation of transport models is typically done by applying so-called “synthetic diagnostics” to the output of nonlinear gyrokinetic simulations, and the results are compared to experimental data. As part of the validation process, comparing two independent turbulence measurements to each other provides the opportunity to test the synthetic diagnostics themselves; a step which is rarely possible due to limited availability of redundant fluctuation measurements on magnetic confinement experiments. At Alcator C-Mod, phase-contrast imaging (PCI) is a commonly used turbulence diagnostic. PCI measures line-integrated electron density fluctuations with high sensitivity and wavenumber resolution (1.6 cm⁻¹ ≲ |k_R| ≲ 11 cm⁻¹). A new fast two-color interferometry (FTCI) diagnostic on the Alcator C-Mod tokamak measures long-wavelength (|k_R| ≲ 3.0 cm⁻¹) line-integrated electron density fluctuations. Measurements of coherent and broadband fluctuations made by PCI and FTCI are compared here for the first time. Good quantitative agreement is found between the two measurements. This provides experimental validation of the low-wavenumber region of the PCI calibration, and also helps validate the low-wavenumber portions of the synthetic PCI diagnostic that has been used in gyrokinetic model validation work in the past. We discuss possibilities to upgrade FTCI, so that a similar comparison could be done at higher wavenumbers in the future.

  7. 40 CFR 761.386 - Required experimental conditions for the validation study and subsequent use during decontamination.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... the validation study and subsequent use during decontamination. 761.386 Section 761.386 Protection of... experimental conditions for the validation study and subsequent use during decontamination. The following experimental conditions apply for any solvent: (a) Temperature and pressure. Conduct the validation study and...

  8. 40 CFR 761.386 - Required experimental conditions for the validation study and subsequent use during decontamination.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... the validation study and subsequent use during decontamination. 761.386 Section 761.386 Protection of... experimental conditions for the validation study and subsequent use during decontamination. The following experimental conditions apply for any solvent: (a) Temperature and pressure. Conduct the validation study and...

  9. 40 CFR 761.386 - Required experimental conditions for the validation study and subsequent use during decontamination.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... the validation study and subsequent use during decontamination. 761.386 Section 761.386 Protection of... experimental conditions for the validation study and subsequent use during decontamination. The following experimental conditions apply for any solvent: (a) Temperature and pressure. Conduct the validation study and...

  10. A pilot study on the validity of using pictures and videos for individualized symptom provocation in obsessive-compulsive disorder.

    PubMed

    Simon, Daniela; Kischkel, Eva; Spielberg, Rüdiger; Kathmann, Norbert

    2012-06-30

    Distressing symptom-related anxiety is difficult to study in obsessive-compulsive disorder (OCD) due to the disorder's heterogeneity. Our aim was to develop and validate a set of pictures and films comprising a variety of prominent OCD triggers that can be used for individually tailored symptom provocation in experimental studies. In a two-staged production procedure a large pool of OCD triggers and neutral contents was produced and preselected by three psychotherapists specialized in OCD. A sample of 13 OCD patients and 13 controls rated their anxiety, aversiveness and arousal during exposure to OCD-relevant, aversive and neutral control stimuli. Our findings demonstrate differences between the responses of patients and controls to OCD triggers only. Symptom-related anxiety was stronger in response to dynamic compared with static OCD-relevant stimuli. Due to the small number of 13 patients included in the study, only tentative conclusions can be drawn and this study merely provides a first step of validation. These standardized sets constitute valuable tools that can be used in experimental studies on the brain correlates of OCD symptoms and for the study of therapeutic interventions in order to contribute to future developments in the field. Copyright © 2012 Elsevier Ltd. All rights reserved.

  11. Development and testing of a double length PETS for the CLIC experimental area

    NASA Astrophysics Data System (ADS)

    Sánchez, L.; Carrillo, D.; Gavela, D.; Lara, A.; Rodríguez, E.; Gutiérrez, J. L.; Calero, J.; Toral, F.; Samoshkin, A.; Gudkov, D.; Riddone, G.

    2014-05-01

    CLIC (Compact Linear Collider) is a future e+e- collider based on normal-conducting technology, currently under study at CERN. Its design is based on a novel two-beam acceleration scheme. The main beam gets RF power extracted from a drive beam through power extraction and transfer structures (PETS). The technical feasibility of CLIC is currently being proved by the CLIC Test Facility 3 (CTF3), which includes the CLIC experimental area (CLEX). Two Double Length CLIC PETS will be installed in CLEX to validate their performance with beam. This paper is focused on the engineering design, fabrication and validation of this PETS first prototype. The design consists of eight identical bars, separated by radial slots in which damping material is located to absorb transverse wakefields, and two compact couplers placed at both ends of the bars to extract the generated power. The PETS bars are housed inside a vacuum tank designed to make the PETS as compact as possible. Several joint techniques such as vacuum brazing, electron beam and arc welding were used to complete the assembly. Finally, several tests such as dimensional control and leak testing were carried out to validate design and fabrication methods. In addition, RF measurements at low power were made to study frequency tuning.

  12. Characterization of the neutron irradiation system for use in the Low-Dose-Rate Irradiation Facility at Sandia National Laboratories.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Franco, Manuel

    The objective of this work was to characterize the neutron irradiation system consisting of americium-241 beryllium (241AmBe) neutron sources placed in a polyethylene shielding for use at Sandia National Laboratories (SNL) Low Dose Rate Irradiation Facility (LDRIF). With a total activity of 0.3 TBq (9 Ci), the source consisted of three recycled 241AmBe sources of different activities that had been combined into a single source. The source in its polyethylene shielding will be used in neutron irradiation testing of components. The characterization of the source-shielding system was necessary to evaluate the radiation environment for future experiments. Characterization of the source was also necessary because the documentation for the three component sources and their relative alignment within the Special Form Capsule (SFC) was inadequate. The system consisting of the source and shielding was modeled using the Monte Carlo N-Particle transport code (MCNP). The model was validated by benchmarking it against measurements using multiple techniques. To characterize the radiation fields over the full spatial geometry of the irradiation system, it was necessary to use a number of instruments of varying sensitivities. First, computed photon radiography assisted in determining the orientation of the component sources. With the capsule properly oriented inside the shielding, the neutron spectra were measured using a variety of techniques. An N-probe Microspec and a neutron Bubble Dosimeter Spectrometer (BDS) set were used to characterize the neutron spectra/field in several locations. In the third technique, neutron foil activation was used to ascertain the neutron spectra. A high purity germanium (HPGe) detector was used to characterize the photon spectrum. The experimentally measured spectra and the MCNP results compared well.
Once the MCNP model was validated to an adequate level of confidence, parametric analyses were performed on the model to optimize for potential experimental configurations and neutron spectra for component irradiation. The final product of this work is an MCNP model validated by measurements, an overall understanding of the neutron irradiation system including photon/neutron transport and effective dose rates throughout the system, and possible experimental configurations for future irradiation of components.

  13. Nuclease Target Site Selection for Maximizing On-target Activity and Minimizing Off-target Effects in Genome Editing

    PubMed Central

    Lee, Ciaran M; Cradick, Thomas J; Fine, Eli J; Bao, Gang

    2016-01-01

    The rapid advancement in targeted genome editing using engineered nucleases such as ZFNs, TALENs, and CRISPR/Cas9 systems has resulted in a suite of powerful methods that allows researchers to target any genomic locus of interest. A complementary set of design tools has been developed to aid researchers with nuclease design, target site selection, and experimental validation. Here, we review the various tools available for target selection in designing engineered nucleases, and for quantifying nuclease activity and specificity, including web-based search tools and experimental methods. We also elucidate challenges in target selection, especially in predicting off-target effects, and discuss future directions in precision genome editing and its applications. PMID:26750397

  14. Experimental Design in Clinical 'Omics Biomarker Discovery.

    PubMed

    Forshed, Jenny

    2017-11-03

    This tutorial highlights some issues in the experimental design of clinical 'omics biomarker discovery: how to avoid bias and obtain quantities as close to the true values as possible from biochemical analyses, and how to select samples to improve the chance of answering the clinical question at issue. This includes the importance of defining the clinical aim and end point, knowing the variability in the results, randomization of samples, sample size, statistical power, and how to avoid confounding factors by including clinical data in the sample selection, that is, how to avoid unpleasant surprises at the point of statistical analysis. The aim of this tutorial is to help translational clinical and preclinical biomarker candidate research and to improve the validity and potential of future biomarker candidate findings.
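The sample-size and power considerations mentioned above can be made concrete with the standard normal-approximation formula for comparing two group means; the effect size, α and power values below are illustrative defaults, not recommendations from the tutorial:

```python
import math
from statistics import NormalDist

def n_per_group(effect_size: float, alpha: float = 0.05, power: float = 0.8) -> int:
    """Approximate sample size per group for a two-sided, two-sample
    comparison of means, using the normal approximation:
    n = 2 * (z_{1-alpha/2} + z_{power})^2 / d^2,
    where d is the standardized effect size (Cohen's d)."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # e.g. 1.96 for alpha = 0.05
    z_beta = NormalDist().inv_cdf(power)           # e.g. 0.84 for power = 0.80
    return math.ceil(2 * (z_alpha + z_beta) ** 2 / effect_size ** 2)

# A 'medium' standardized effect (d = 0.5) at 80% power needs roughly
# 63 subjects per group under this approximation.
n = n_per_group(0.5)
```

Halving the detectable effect size roughly quadruples the required sample, which is why the clinical end point and the expected variability need to be pinned down before samples are collected.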

  15. Numerical and Experimental Study of Wake Redirection Techniques in a Boundary Layer Wind Tunnel

    NASA Astrophysics Data System (ADS)

    Wang, J.; Foley, S.; Nanos, E. M.; Yu, T.; Campagnolo, F.; Bottasso, C. L.; Zanotti, A.; Croce, A.

    2017-05-01

    The aim of the present paper is to validate a wind farm LES framework in the context of two distinct wake redirection techniques: yaw misalignment and individual cyclic pitch control. A test campaign was conducted using scaled wind turbine models in a boundary layer wind tunnel, where both particle image velocimetry and hot-wire thermo anemometers were used to obtain high quality measurements of the downstream flow. A LiDAR system was also employed to determine the non-uniformity of the inflow velocity field. A high-fidelity large-eddy simulation lifting-line model was used to simulate the aerodynamic behavior of the system, including the geometry of the wind turbine nacelle and tower. A tuning-free Lagrangian scale-dependent dynamic approach was adopted to improve the sub-grid scale modeling. Comparisons with experimental measurements are used to systematically validate the simulations. The LES results are in good agreement with the PIV and hot-wire data in terms of time-averaged wake profiles, turbulence intensity and Reynolds shear stresses. Discrepancies are also highlighted, to guide future improvements.

  16. 6 DOF articulated-arm robot and mobile platform: Dynamic modelling as Multibody System and its validation via Experimental Modal Analysis.

    NASA Astrophysics Data System (ADS)

    Toledo Fuentes, A.; Kipfmueller, M.; José Prieto, M. A.

    2017-10-01

    Mobile manipulators are becoming a key instrument to increase the flexibility in industrial processes. Some of their requirements include handling objects of different weights and sizes and transporting them quickly, without jeopardizing production workers and machines. Compensation of the forces affecting the system dynamics is therefore needed to avoid unwanted oscillations and tilting caused by sudden accelerations and decelerations. One general solution may be the implementation of external positioning elements to actively stabilize the system. To accomplish this approach, the dynamic behavior of a robotic arm and a mobile platform was investigated in order to develop the stabilization mechanism using multibody simulations. The methodology used was divided into two phases for each subsystem: their natural frequencies and modal shapes were obtained using experimental modal analyses. Then, based on these experimental results, multibody simulation (MBS) models were set up and their dynamical parameters adjusted. Their modal shapes together with their obtained natural frequencies allowed a quantitative and qualitative analysis. In summary, the MBS models were successfully validated against the real subsystems, with a maximal percentage error of 15%. These models will serve as the basis for future steps in the design of the external actuators and their control strategy using a co-simulation tool.
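The validation criterion used above (a maximal error of 15% between measured and simulated natural frequencies) amounts to a simple per-mode relative-error check; the frequency values below are invented for illustration, not taken from the study:

```python
def max_relative_error(measured, simulated):
    """Largest per-mode relative deviation of simulated natural
    frequencies from the experimentally measured ones."""
    return max(abs(s - m) / m for m, s in zip(measured, simulated))

# Hypothetical first three natural frequencies in Hz:
# experimental modal analysis (EMA) vs. the tuned MBS model.
measured_hz  = [4.2, 11.5, 27.0]
simulated_hz = [4.5, 10.4, 29.5]

err = max_relative_error(measured_hz, simulated_hz)
model_validated = err <= 0.15  # acceptance threshold reported in the study
```

Mode pairing matters in practice: each simulated frequency must be compared against the measured mode with the matching modal shape, which is why the modal shapes are extracted alongside the frequencies.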

  17. Mechanical Behavior of Dowel-Type Joints Made of Wood Scrimber Composite

    PubMed Central

    He, Minjuan; Tao, Duo; Li, Zheng; Li, Maolin

    2016-01-01

    As a renewable building material with low embodied energy, wood has gained increasing attention in the green and sustainable building industry. In terms of material resources and physical properties, scrimber composite not only makes full use of fast-growing wood species, but also has better mechanical performance and less inherent variability than natural wood. In this study, the mechanical behavior of bolted beam-to-column joints built with a scrimber composite was investigated both experimentally and numerically. Two groups of specimens were tested under monotonic and low-frequency cyclic loading protocols. The experimental results showed that the bolted joints built with scrimber composite performed well in initial stiffness, ductility, and energy dissipation. A three-dimensional (3D) non-linear finite element model (FEM) of the bolted beam-to-column joints was then developed and validated against the experimental results. The validated model was further used to investigate the failure mechanism of the bolted joints through stress analysis. This study can contribute to the application of the proposed scrimber composite in structural engineering, and the developed FEM can serve as a useful tool to evaluate the mechanical behavior of such bolted beam-to-column joints with different configurations in future research. PMID:28773703

  19. Optimal cooperative control synthesis of active displays

    NASA Technical Reports Server (NTRS)

    Garg, S.; Schmidt, D. K.

    1985-01-01

    A technique is developed to provide a systematic approach to synthesizing display augmentation for optimal manual control in complex, closed-loop tasks. A cooperative control synthesis technique, previously developed to design pilot-optimal control augmentation for the plant, is extended to incorporate the simultaneous design of performance-enhancing displays. The technique utilizes an optimal control model of the man in the loop. It is applied to the design of a quickening control law for a display and a simple K/s² plant, and then to an F-15 type aircraft in a multi-channel task. Utilizing the closed-loop modeling and analysis procedures, the results from the display design algorithm are evaluated and an analytical validation is performed. Experimental validation is recommended for future efforts.

  20. A Framework for Testing Scientific Software: A Case Study of Testing Amsterdam Discrete Dipole Approximation Software

    NASA Astrophysics Data System (ADS)

    Shao, Hongbing

    Software testing of scientific software systems often suffers from the test oracle problem, i.e., the lack of a test oracle. The Amsterdam discrete dipole approximation code (ADDA) is a scientific software system that simulates light scattering by scatterers of various types, and its testing suffers from this problem. In this thesis work, I established a framework for testing scientific software systems and evaluated it using ADDA as a case study. To test ADDA, I first used the CMMIE code as a pseudo-oracle for simulating light scattering by a homogeneous sphere scatterer; comparable results were obtained between ADDA and the CMMIE code, validating ADDA for homogeneous sphere scatterers. I then compared ADDA against experimentally measured light scattering from a homogeneous sphere; ADDA produced simulations comparable to the measurements, further validating its use for sphere scatterers. Next, I used metamorphic testing to generate test cases covering scatterers of various geometries, orientations, and degrees of homogeneity. ADDA was tested under each of these cases and all tests passed. The use of statistical analysis together with metamorphic testing is discussed as a future direction. In short, using ADDA as a case study, I established a testing framework combining pseudo-oracles, experimental results, and metamorphic testing to test scientific software systems that suffer from the test oracle problem. Each of these techniques is necessary and contributes to the testing of the software under test.
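
    Metamorphic testing sidesteps the missing oracle by checking relations between runs rather than absolute output values. A minimal sketch under stated assumptions: instead of ADDA itself, a hypothetical Rayleigh-scattering stand-in is used, and the metamorphic relation is that scaling every length (radius and wavelength) by a factor s must scale a cross-section, which is an area, by s².

```python
def rayleigh_cross_section(radius, wavelength):
    """Hypothetical stand-in for a scattering computation (not ADDA):
    Rayleigh-type total cross-section scaling, r^6 / lambda^4."""
    return radius ** 6 / wavelength ** 4

def check_scaling_relation(sim, radius, wavelength, factor, tol=1e-9):
    """Metamorphic relation: scaling all lengths by `factor` must scale
    a cross-section (an area) by factor**2 -- no oracle value needed."""
    base = sim(radius, wavelength)
    follow_up = sim(radius * factor, wavelength * factor)
    return abs(follow_up - base * factor ** 2) <= tol * max(1.0, abs(follow_up))
```

    The relation passes for the correct scaling law and fails for a deliberately wrong one, which is exactly the kind of fault-detection signal metamorphic testing provides without a ground-truth answer.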

  1. Challenges of NDE simulation tool validation, optimization, and utilization for composites

    NASA Astrophysics Data System (ADS)

    Leckey, Cara A. C.; Seebo, Jeffrey P.; Juarez, Peter

    2016-02-01

    Rapid, realistic nondestructive evaluation (NDE) simulation tools can aid in inspection optimization and prediction of inspectability for advanced aerospace materials and designs. NDE simulation tools may someday aid in the design and certification of aerospace components; potentially shortening the time from material development to implementation by industry and government. Furthermore, ultrasound modeling and simulation are expected to play a significant future role in validating the capabilities and limitations of guided wave based structural health monitoring (SHM) systems. The current state-of-the-art in ultrasonic NDE/SHM simulation is still far from the goal of rapidly simulating damage detection techniques for large scale, complex geometry composite components/vehicles containing realistic damage types. Ongoing work at NASA Langley Research Center is focused on advanced ultrasonic simulation tool development. This paper discusses challenges of simulation tool validation, optimization, and utilization for composites. Ongoing simulation tool development work is described along with examples of simulation validation and optimization challenges that are more broadly applicable to all NDE simulation tools. The paper will also discuss examples of simulation tool utilization at NASA to develop new damage characterization methods for composites, and associated challenges in experimentally validating those methods.
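
    At its core, an ultrasonic wavefield simulation marches a discretized wave equation forward in time. A toy 1-D finite-difference sketch (not the NASA simulation tools discussed above), run at the "magic" time step c·dt = dx where the standard explicit scheme translates a right-travelling pulse exactly one grid cell per step:

```python
import math

def propagate_pulse(n=200, steps=100, center=30, width=6.0):
    """Toy 1-D finite-difference wave solver at the magic time step
    (c*dt = dx), where u_next[i] = u[i+1] + u[i-1] - u_prev[i] moves a
    right-travelling pulse exactly one cell per step.  Returns the grid
    index of the pulse peak after `steps` steps."""
    f = lambda x: math.exp(-((x - center) / width) ** 2)
    u_prev = [f(i + 1) for i in range(n)]   # field one step in the past
    u = [f(i) for i in range(n)]            # right-travelling Gaussian pulse
    for _ in range(steps):
        u_next = [0.0] * n                  # fixed (zero) boundary ends
        for i in range(1, n - 1):
            u_next[i] = u[i + 1] + u[i - 1] - u_prev[i]
        u_prev, u = u, u_next
    return max(range(n), key=lambda i: u[i])
```

    Production NDE codes extend exactly this marching idea to 3-D elastodynamics in anisotropic composites, which is where the computational cost and the validation challenges described above arise.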

  2. Multi-scale predictions of coniferous forest mortality in the northern hemisphere

    NASA Astrophysics Data System (ADS)

    McDowell, N. G.

    2015-12-01

    Global temperature rise and the drought extremes accompanying it threaten forests and their associated climatic feedbacks. Our incomplete understanding of the fundamental physiological thresholds of vegetation mortality during drought limits our ability to accurately simulate future vegetation distributions and associated climate feedbacks. Here we integrate experimental evidence with models to show potential widespread loss of needleleaf evergreen trees (NET; ~ conifers) within the Southwest USA by 2100, with rising temperature being the primary cause of mortality. Experimentally, dominant Southwest USA NET species died when they fell below predawn water potential (Ψpd) thresholds (April-August mean) beyond which photosynthesis, stomatal and hydraulic conductance, and carbohydrate availability approached zero. Empirical and mechanistic models accurately predicted NET Ψpd, and 91% of predictions (10/11) exceeded mortality thresholds within the 21st century due to temperature rise. Completely independent global models predicted >50% loss of northern hemisphere NET by 2100, consistent with the findings for the Southwest USA. The global models, however, disagreed with the ecosystem process models regarding future mortality in the Southwest USA, highlighting a potential underestimate of future NET mortality in the global simulations and signifying the importance of improving regional predictions. Taken together, the validated regional predictions and the global simulations predict global-scale conifer loss in coming decades under projected global warming.

  3. Electromagnetic Compatibility Testing Studies

    NASA Technical Reports Server (NTRS)

    Trost, Thomas F.; Mitra, Atindra K.

    1996-01-01

    This report discusses results on analytical models and on measurement and simulation of statistical properties from a study of microwave reverberation (mode-stirred) chambers performed at Texas Tech University. Two analytical models of power transfer vs. frequency in a chamber, one for antenna-to-antenna transfer and the other for antenna to D-dot sensor, were experimentally validated in our chamber. Two examples are presented of the measurement and calculation of chamber Q, one for each of the models. Measurements of EM power density validate a theoretical probability distribution on and away from the chamber walls, and also yield a distribution with larger standard deviation at frequencies below the range of validity of the theory. Measurements of EM power density at pairs of points validate a theoretical spatial correlation function on the chamber walls, and also yield a correlation function with larger correlation length, R(sub corr), at frequencies below the range of validity of the theory. A numerical simulation, employing a rectangular cavity with a moving wall, shows agreement with the measurements. We determined that the lowest frequency at which the theoretical spatial correlation function is valid in our chamber is considerably higher than the lowest frequency recommended by current guidelines for utilizing reverberation chambers in EMC testing. Two suggestions are made for future studies related to EMC testing.
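
    The theoretical field statistics referenced in this record follow from modeling a well-stirred chamber as a superposition of many plane waves with independent random phases: the resulting field is approximately complex Gaussian, so the received power is exponentially distributed and its standard deviation approaches its mean. A hedged numerical sketch of that ideal-chamber statistic (an idealization, not the Texas Tech measurement setup):

```python
import math
import random

def chamber_power_samples(n_samples=4000, n_waves=60, seed=1):
    """Idealized mode-stirred chamber: at each stir position the field is
    a sum of `n_waves` unit-amplitude random-phase phasors; the received
    power |E|^2 (normalized by n_waves) is then ~exponentially distributed."""
    rng = random.Random(seed)
    powers = []
    for _ in range(n_samples):
        re = im = 0.0
        for _ in range(n_waves):
            phase = rng.uniform(0.0, 2.0 * math.pi)
            re += math.cos(phase)
            im += math.sin(phase)
        powers.append((re * re + im * im) / n_waves)
    return powers

def std_over_mean(xs):
    """Exponential statistics imply std/mean -> 1 in a well-stirred chamber."""
    m = sum(xs) / len(xs)
    var = sum((x - m) ** 2 for x in xs) / len(xs)
    return math.sqrt(var) / m
```

    The measurements described above that show a *larger* standard deviation at low frequencies are precisely a departure from this std/mean ≈ 1 ideal, signalling too few contributing modes.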

  4. Electroweak radiative corrections for polarized Moller scattering at the future 11 GeV JLab experiment

    DOE PAGES

    Aleksejevs, Aleksandrs; Barkanova, Svetlana; Ilyichev, Alexander; ...

    2010-11-19

    We perform updated and detailed calculations of the complete NLO set of electroweak radiative corrections to parity-violating e⁻e⁻ → e⁻e⁻(γ) scattering asymmetries at energies relevant for the ultra-precise Moller experiment coming soon at JLab. Our numerical results are presented for a range of experimental cuts, and the relative importance of various contributions is analyzed. In addition, we provide very compact analytic expressions free from non-physical parameters and show them to be valid for fast yet accurate estimates.

  5. Validation of a three-dimensional viscous analysis of axisymmetric supersonic inlet flow fields

    NASA Technical Reports Server (NTRS)

    Benson, T. J.; Anderson, B. H.

    1983-01-01

    A three-dimensional viscous marching analysis for supersonic inlets was developed. To verify this analysis, several benchmark axisymmetric test configurations were studied and are compared to experimental data. Detailed two-dimensional results for shock-boundary layer interactions are presented for flows with and without boundary layer bleed. Three-dimensional calculations of a cone at angle of attack and a full inlet at angle of attack are also discussed and evaluated. Results of the calculations demonstrate the code's ability to predict complex flow fields and establish guidelines for future calculations using similar codes.

  6. Integration of High-Resolution Laser Displacement Sensors and 3D Printing for Structural Health Monitoring

    PubMed Central

    Chang, Shu-Wei; Lin, Tzu-Kang; Kuo, Shih-Yu; Huang, Ting-Hsuan

    2017-01-01

    This paper presents a novel experimental design for complex structural health monitoring (SHM) studies, achieved by integrating 3D printing technologies, high-resolution laser displacement sensors, and multiscale entropy SHM theory. A seven-story structure with a variety of composite bracing systems was constructed using a dual-material 3D printer. A wireless Bluetooth vibration speaker was used to excite the ground floor of the structure, and high-resolution laser displacement sensors (1-μm resolution) were used to monitor the displacement history on different floors. Our results showed that the multiscale entropy SHM method could detect damage on the 3D-printed structures. They demonstrate that integrating 3D printing technologies and high-resolution laser displacement sensors enables the design of inexpensive, quickly fabricated, complex small-scale civil structures for future SHM studies. The proposed experimental design provides a suitable platform for investigating the validity and sensitivity of SHM for different composite structures and damage conditions in future real-life applications. PMID:29271937

  8. Aerobiology: Experimental Considerations, Observations, and Future Tools

    PubMed Central

    Haddrell, Allen E.

    2017-01-01

    Understanding airborne survival and decay of microorganisms is important for a range of public health and biodefense applications, including epidemiological and risk analysis modeling. Techniques for experimental aerosol generation, retention in the aerosol phase, and sampling require careful consideration and understanding so that they are representative of the conditions the bioaerosol would experience in the environment. This review explores the current understanding of atmospheric transport in relation to advances and limitations of aerosol generation, maintenance in the aerosol phase, and sampling techniques. Potential tools for the future are examined at the interface between atmospheric chemistry, aerosol physics, and molecular microbiology, where the heterogeneity and variability of aerosols can be explored at the single-droplet and single-microorganism levels within a bioaerosol. The review highlights the importance of method comparison and validation in bioaerosol research and the benefits that the application of novel techniques could bring to increasing the understanding of aerobiological phenomena in diverse research fields, particularly during the progression of atmospheric transport, where complex interdependent physicochemical and biological processes occur within bioaerosol particles. PMID:28667111

  9. The TEF modeling and analysis approach to advance thermionic space power technology

    NASA Astrophysics Data System (ADS)

    Marshall, Albert C.

    1997-01-01

    Thermionic space power systems have been proposed as advanced power sources for future space missions that require electrical power levels significantly above the capabilities of current space power systems. The Defense Special Weapons Agency's (DSWA) Thermionic Evaluation Facility (TEF) is carrying out both experimental and analytical research to advance thermionic space power technology to meet this expected need. A Modeling and Analysis (M&A) project has been created at the TEF to develop analysis tools, evaluate concepts, and guide research. M&A activities are closely linked to the TEF experimental program, providing experiment support and using experimental data to validate models. A planning exercise has been completed for the M&A project, and a strategy for implementation was developed. All M&A activities will build on a framework provided by a system performance model for a baseline Thermionic Fuel Element (TFE) concept. The system model is composed of sub-models for each of the system components and sub-systems. Additional thermionic component options and model improvements will continue to be incorporated in the basic system model during the course of the program. All tasks are organized into four focus areas: 1) system models, 2) thermionic research, 3) alternative concepts, and 4) documentation and integration. The M&A project will provide a solid framework for future thermionic system development.

  10. An assessment of differential reinforcement procedures for learners with autism spectrum disorder.

    PubMed

    Johnson, Kate A; Vladescu, Jason C; Kodak, Tiffany; Sidener, Tina M

    2017-04-01

    Differential reinforcement procedures may promote unprompted correct responding, resulting in a quicker transfer of stimulus control than nondifferential reinforcement. Recent studies that have compared reinforcement arrangements have found that the most effective arrangement may differ across participants. The current study conducted an assessment of differential reinforcement arrangements (i.e., quality, schedule, and magnitude) and nondifferential reinforcement to identify the most effective arrangement for each participant. The assessment phase showed that the quality arrangement was the most efficient for all participants during auditory-visual matching. Next, a validation phase was conducted to evaluate whether the assessment would predict the most effective arrangement across multiple skills. The results from the assessment phase were validated for all participants for the same skill. However, the results were only validated for one participant during the other skills (i.e., tact and intraverbal). The results are discussed in light of previous research and future areas of research. © 2017 Society for the Experimental Analysis of Behavior.

  11. The ToMenovela – A Photograph-Based Stimulus Set for the Study of Social Cognition with High Ecological Validity

    PubMed Central

    Herbort, Maike C.; Iseev, Jenny; Stolz, Christopher; Roeser, Benedict; Großkopf, Nora; Wüstenberg, Torsten; Hellweg, Rainer; Walter, Henrik; Dziobek, Isabel; Schott, Björn H.

    2016-01-01

    We present the ToMenovela, a stimulus set developed to provide normatively rated socio-emotional stimuli showing a varying number of characters in emotionally laden interactions, for experimental investigations of (i) cognitive and (ii) affective Theory of Mind (ToM), (iii) emotional reactivity, and (iv) complex emotion judgment with respect to Ekman's basic emotions (happiness, anger, disgust, fear, sadness, and surprise; Ekman and Friesen, 1975). Stimuli were generated with a focus on ecological validity and consist of 190 scenes depicting daily-life situations. Two or more of eight main characters with distinct biographies and personalities are depicted in each scene picture. To obtain an initial evaluation of the stimulus set and to pave the way for future studies in clinical populations, normative data on each stimulus was obtained from a sample of 61 neurologically and psychiatrically healthy participants (31 female, 30 male; mean age 26.74 ± 5.84 years), including a visual analog scale rating of Ekman's basic emotions and free-text descriptions of the content of each scene. The ToMenovela is being developed to provide standardized material of social scenes available to researchers in the study of social cognition. It should facilitate experimental control while keeping ecological validity high. PMID:27994562

  12. Outcome of the First wwPDB Hybrid/Integrative Methods Task Force Workshop

    PubMed Central

    Sali, Andrej; Berman, Helen M.; Schwede, Torsten; Trewhella, Jill; Kleywegt, Gerard; Burley, Stephen K.; Markley, John; Nakamura, Haruki; Adams, Paul; Bonvin, Alexandre M.J.J.; Chiu, Wah; Dal Peraro, Matteo; Di Maio, Frank; Ferrin, Thomas E.; Grünewald, Kay; Gutmanas, Aleksandras; Henderson, Richard; Hummer, Gerhard; Iwasaki, Kenji; Johnson, Graham; Lawson, Catherine L.; Meiler, Jens; Marti-Renom, Marc A.; Montelione, Gaetano T.; Nilges, Michael; Nussinov, Ruth; Patwardhan, Ardan; Rappsilber, Juri; Read, Randy J.; Saibil, Helen; Schröder, Gunnar F.; Schwieters, Charles D.; Seidel, Claus A. M.; Svergun, Dmitri; Topf, Maya; Ulrich, Eldon L.; Velankar, Sameer; Westbrook, John D.

    2016-01-01

    Structures of biomolecular systems are increasingly computed by integrative modeling that relies on varied types of experimental data and theoretical information. We describe here the proceedings and conclusions from the first wwPDB Hybrid/Integrative Methods Task Force Workshop held at the European Bioinformatics Institute in Hinxton, UK, October 6 and 7, 2014. At the workshop, experts in various experimental fields of structural biology, experts in integrative modeling and visualization, and experts in data archiving addressed a series of questions central to the future of structural biology. How should integrative models be represented? How should the data and integrative models be validated? What data should be archived? How should the data and models be archived? What information should accompany the publication of integrative models? PMID:26095030

  13. Experimental investigations of turbulent temperature fluctuations and phase angles in ASDEX Upgrade

    NASA Astrophysics Data System (ADS)

    Freethy, Simon

    2017-10-01

    A complete experimental understanding of the turbulent fluctuations in tokamak plasmas is essential for providing confidence in the extrapolation of heat transport models to future experimental devices and reactors. Guided by "predict first" nonlinear gyrokinetic simulations with the GENE code, two new turbulence diagnostics were designed and have been installed on ASDEX Upgrade (AUG) to probe the fundamentals of ion-scale turbulent electron heat transport. The first, a 30-channel correlation ECE (CECE) radiometer, measures radial profiles (0.5

  14. On use of ZPR research reactors and associated instrumentation and measurement methods for reactor physics studies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chauvin, J.P.; Blaise, P.; Lyoussi, A.

    2015-07-01

    The French Alternative Energies and Atomic Energy Commission (CEA) is strongly involved in research and development programs on the use of nuclear energy as a clean and reliable source of energy, and is consequently working on present and future generations of reactors on various topics such as ageing plant management, optimization of the plutonium stockpile, waste management, and the exploration of innovative systems. Core physics studies are an essential part of this comprehensive R&D effort. In particular, the Zero Power Reactors (ZPR) of CEA (EOLE, MINERVE and MASURCA) play an important role in the validation of neutron (as well as photon) physics calculation tools (codes and nuclear data). The experimental programs defined in the CEA's ZPR facilities aim at improving the calculation routes by reducing the uncertainties of the experimental databases. They also provide accurate data on innovative systems in terms of new materials (moderating and decoupling materials) and new concepts (ADS, ABWR, new MTRs (e.g. JHR), GEN-IV) involving new fuels, absorbers and coolant materials. Conducting such experimental R&D programs relies on determining and measuring the main parameters of the phenomena of interest in order to qualify calculation tools and nuclear data libraries. Determining these parameters relies on numerous different experimental techniques using specific and appropriate instrumentation and detection tools. The main ZPR experimental programs at CEA, their objectives and challenges are presented and discussed. Future developments and perspectives regarding ZPR reactors and associated programs are also presented. (authors)

  15. Preliminary Analysis of the BASALA-H Experimental Programme

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Blaise, Patrick; Fougeras, Philippe; Philibert, Herve

    2002-07-01

    This paper focuses on the preliminary analysis of results obtained on the first cores of the first phase of the BASALA (Boiling water reactor Advanced core physics Study Aimed at mox fuel Lattice) programme, aimed at studying neutronic parameters of an ABWR core in hot conditions, currently under investigation in the French EOLE critical facility within the framework of a cooperation between NUPEC, CEA and Cogema. A first 'on-line' analysis of the results has been made using a new preliminary design and safety scheme based on the French APOLLO-2 code in its qualified version 2.4 and the associated CEA-93 V4 (JEF-2.2) library, which will enable the Experimental Physics Division (SPEx) to perform future core designs. The paper describes the scheme adopted and the results obtained in various cases, ranging from the critical size determination to the reactivity worth of the perturbed configurations (voided, over-moderated, and poisoned with Gd₂O₃-UO₂ pins). A preliminary study of the experimental results on MISTRAL-4 is summarized, and a comparison of APOLLO-2 versus MCNP-4C calculations on these cores is made. The results show very good agreement between the two codes and with the experiment. This work opens the way to the future full analysis of the experimental results by the qualification teams with completely validated schemes based on the new 2.5 version of the APOLLO-2 code. (authors)

  16. Experimental validation of structural optimization methods

    NASA Technical Reports Server (NTRS)

    Adelman, Howard M.

    1992-01-01

    The topic of validating structural optimization methods by use of experimental results is addressed. The need for validating the methods, as a way of effecting a greater and accelerated acceptance of formal optimization methods by practicing engineering designers, is described. The range of validation strategies is defined, including comparison of optimization results with more traditional design approaches, establishing the accuracy of the analyses used, and finally experimental validation of the optimization results. Examples of the use of experimental results to validate optimization techniques are described, including experimental validation of the following: optimum design of a trussed beam; combined control-structure design of a cable-supported beam simulating an actively controlled space structure; minimum weight design of a beam with frequency constraints; minimization of the vibration response of a helicopter rotor blade; minimum weight design of a turbine blade disk; aeroelastic optimization of an aircraft vertical fin; airfoil shape optimization for drag minimization; optimization of the shape of a hole in a plate for stress minimization; optimization to minimize beam dynamic response; and structural optimization of a low-vibration helicopter rotor.

  17. Implants in bone: part II. Research on implant osseointegration: material testing, mechanical testing, imaging and histoanalytical methods.

    PubMed

    von Wilmowsky, Cornelius; Moest, Tobias; Nkenke, Emeka; Stelzle, Florian; Schlegel, Karl Andreas

    2014-12-01

    In order to determine whether a newly developed implant material conforms to the requirements of biocompatibility, it must undergo rigorous testing. To correctly interpret the results of studies on implant material osseointegration, it is necessary to have a sound understanding of all the testing methods. The aim of this overview is to elucidate the methods used for the experimental evaluation of the osseointegration of implant materials. In recent decades, there has been a constant proliferation of new materials and surface modifications in the field of dental implants. This continuous development of innovative biomaterials requires a precise and detailed evaluation of biocompatibility and implant healing before clinical use. The current gold standard is in vivo testing on well-validated animal models; however, long-term outcome studies on patients have to follow to finally validate the results and demonstrate patient benefit. No experimental set-up can provide answers for all possible research questions, but a certain transferability of the results to humans is possible if the experimental set-up is carefully chosen for the aspects and questions being investigated. To enhance the implant survival rate in the rising number of patients with chronic diseases that compromise wound healing and osseointegration, dental implant research on compromised animal models will gain further importance in the future.

  18. Scaling Studies for Advanced High Temperature Reactor Concepts, Final Technical Report: October 2014—December 2017

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Woods, Brian; Gutowska, Izabela; Chiger, Howard

    Computer simulations of nuclear reactor thermal-hydraulic phenomena are often used in the design and licensing of nuclear reactor systems. In order to assess the accuracy of these computer simulations, computer codes and methods are often validated against experimental data. This experimental data must be of sufficiently high quality in order to conduct a robust validation exercise. In addition, this experimental data is generally collected at experimental facilities that are of a smaller scale than the reactor systems that are being simulated due to cost considerations. Therefore, smaller scale test facilities must be designed and constructed in such a fashion tomore » ensure that the prototypical behavior of a particular nuclear reactor system is preserved. The work completed through this project has resulted in scaling analyses and conceptual design development for a test facility capable of collecting code validation data for the following high temperature gas reactor systems and events— 1. Passive natural circulation core cooling system, 2. pebble bed gas reactor concept, 3. General Atomics Energy Multiplier Module reactor, and 4. prismatic block design steam-water ingress event. In the event that code validation data for these systems or events is needed in the future, significant progress in the design of an appropriate integral-type test facility has already been completed as a result of this project. Where applicable, the next step would be to begin the detailed design development and material procurement. As part of this project applicable scaling analyses were completed and test facility design requirements developed. Conceptual designs were developed for the implementation of these design requirements at the Oregon State University (OSU) High Temperature Test Facility (HTTF). 
The original HTTF is based on a ¼-scale model of a high temperature gas reactor concept with the capability for both forced and natural circulation flow through a prismatic core with an electrical heat source. The peak core region temperature capability is 1400°C. As part of this project, an inventory of test facilities that could be used for these experimental programs was completed. Several of these facilities showed some promise; however, upon further investigation it became clear that only the OSU HTTF had the power and/or peak temperature limits that would allow for the experimental programs envisioned herein. Thus, the conceptual design and feasibility study development focused on examining the feasibility of configuring the current HTTF to collect validation data for these experimental programs. In addition to the scaling analyses and conceptual design development, a test plan was developed for the envisioned modified test facility. This test plan included a discussion of an appropriate shakedown test program as well as the specific matrix tests. Finally, a feasibility study was completed to determine the cost and schedule considerations that would be important to any test program developed to investigate these designs and events.
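    The scaling logic described above can be illustrated with textbook Ishii-style similarity relations for a reduced-scale natural-circulation facility; this is a minimal sketch assuming quarter-scale geometry and single-phase Richardson-number matching, not the project's actual HTTF scaling analysis:

```python
# Illustrative Ishii-style similarity relations for a reduced-scale
# natural-circulation test facility. These are textbook single-phase
# scaling laws, not the project's actual HTTF scaling analysis.

def scale_ratios(length_ratio):
    """Model-to-prototype ratios implied by preserving the Richardson
    number in single-phase natural circulation with full geometric
    similarity (flow area ~ l^2, volume ~ l^3)."""
    u_r = length_ratio ** 0.5            # velocity ratio
    t_r = length_ratio ** 0.5            # transient-time ratio
    qv_r = length_ratio ** -0.5          # power-density ratio
    power_r = qv_r * length_ratio ** 3   # total core power ratio
    return {"velocity": u_r, "time": t_r,
            "power_density": qv_r, "total_power": power_r}

ratios = scale_ratios(0.25)  # quarter scale, like the HTTF
```

    Under these assumptions a quarter-scale facility needs only about 3% of the prototype core power, which is why reduced-scale integral test facilities are so much cheaper to build and operate.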

  19. Multi-component testing using HZ-PAN and AgZ-PAN Sorbents for OSPREY Model validation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Garn, Troy G.; Greenhalgh, Mitchell; Lyon, Kevin L.

    2015-04-01

    In efforts to further develop the capability of the Off-gas SeParation and RecoverY (OSPREY) model, multi-component tests were completed using both HZ-PAN and AgZ-PAN sorbents. The primary purpose of this effort was to obtain multi-component xenon and krypton capacities for comparison to future OSPREY-predicted multi-component capacities using previously acquired Langmuir equilibrium parameters determined from single-component isotherms. Experimental capacities were determined for each sorbent using two feed gas compositions of 1000 ppmv xenon and 150 ppmv krypton in either a helium or air balance. Test temperatures were consistently held at 220 K and the gas flowrate was 50 sccm. Capacities were calculated from breakthrough curves using TableCurve® 2D software by Jandel Scientific. The HZ-PAN sorbent was tested in the custom-designed cryostat while the AgZ-PAN was tested in a newly installed cooling apparatus. Previous modeling validation efforts indicated the OSPREY model can be used to effectively predict single-component xenon and krypton capacities for both engineered-form sorbents. Results indicated good agreement between the experimental and predicted capacity values for both krypton and xenon on the sorbents. Overall, the model predicted slightly elevated capacities for both gases, which can be partially attributed to the estimation of the parameters and the uncertainty associated with the experimental measurements. Currently, OSPREY is configured such that one species adsorbs and one does not (i.e., krypton in helium). Modification of the OSPREY code is currently being performed to incorporate multiple adsorbing species and non-ideal interactions of gas-phase species with the sorbent and adsorbed phases. Once these modifications are complete, the sorbent capacities determined in the present work will be used to validate OSPREY multicomponent adsorption predictions.
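    The comparison described above rests on extending single-component Langmuir fits to a mixture. A minimal sketch of that extended-Langmuir prediction step follows; all parameter values are invented for illustration and are not the measured HZ-PAN/AgZ-PAN constants:

```python
import numpy as np

# Extended-Langmuir prediction of multi-component capacities from
# single-component parameters. All parameter values are invented for
# illustration; they are not the measured HZ-PAN/AgZ-PAN constants.

def extended_langmuir(q_max, b, p):
    """q_i = q_max_i * b_i * p_i / (1 + sum_j b_j * p_j)."""
    q_max, b, p = map(np.asarray, (q_max, b, p))
    return q_max * b * p / (1.0 + np.sum(b * p))

# Feed gas: 1000 ppmv Xe and 150 ppmv Kr at ~101.3 kPa total pressure.
p = np.array([1000e-6, 150e-6]) * 101.3  # partial pressures, kPa
q_max = np.array([2.0, 1.5])             # saturation capacities, mmol/g
b = np.array([5.0, 0.8])                 # Langmuir constants, 1/kPa

q_multi = extended_langmuir(q_max, b, p)    # mixture capacities [Xe, Kr]
q_single = q_max * b * p / (1.0 + b * p)    # pure-gas capacities
# competition for sites makes each mixture capacity <= its pure-gas value
```

    The coupling through the shared denominator is what makes the multi-component capacities sensitive to both species at once, which is why dedicated mixture tests are needed to validate the prediction.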

  20. EVLncRNAs: a manually curated database for long non-coding RNAs validated by low-throughput experiments.

    PubMed

    Zhou, Bailing; Zhao, Huiying; Yu, Jiafeng; Guo, Chengang; Dou, Xianghua; Song, Feng; Hu, Guodong; Cao, Zanxia; Qu, Yuanxu; Yang, Yuedong; Zhou, Yaoqi; Wang, Jihua

    2018-01-04

    Long non-coding RNAs (lncRNAs) play important functional roles in various biological processes. Early databases were utilized to deposit all lncRNA candidates produced by high-throughput experimental and/or computational techniques to facilitate classification, assessment and validation. As more lncRNAs are validated by low-throughput experiments, several databases have been established for experimentally validated lncRNAs. However, these databases are small in scale (with a few hundred lncRNAs only) and specific in their focus (plants, diseases or interactions). Thus, it is highly desirable to have a comprehensive dataset for experimentally validated lncRNAs as a central repository for all of their structures, functions and phenotypes. Here, we established EVLncRNAs by curating lncRNAs validated by low-throughput experiments (up to 1 May 2016) and integrating specific databases (lncRNAdb, LncRNADisease, Lnc2Cancer and PLNlncRbase) with additional functional and disease-specific information not covered previously. The current version of EVLncRNAs contains 1543 lncRNAs from 77 species, which is 2.9 times larger than the current largest database for experimentally validated lncRNAs. Seventy-four percent of the lncRNA entries are partially or completely new compared to all existing databases of experimentally validated lncRNAs. The established database allows users to browse, search and download as well as to submit experimentally validated lncRNAs. The database is available at http://biophy.dzu.edu.cn/EVLncRNAs. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.

  1. EVLncRNAs: a manually curated database for long non-coding RNAs validated by low-throughput experiments

    PubMed Central

    Zhao, Huiying; Yu, Jiafeng; Guo, Chengang; Dou, Xianghua; Song, Feng; Hu, Guodong; Cao, Zanxia; Qu, Yuanxu

    2018-01-01

    Abstract Long non-coding RNAs (lncRNAs) play important functional roles in various biological processes. Early databases were utilized to deposit all lncRNA candidates produced by high-throughput experimental and/or computational techniques to facilitate classification, assessment and validation. As more lncRNAs are validated by low-throughput experiments, several databases have been established for experimentally validated lncRNAs. However, these databases are small in scale (with a few hundred lncRNAs only) and specific in their focus (plants, diseases or interactions). Thus, it is highly desirable to have a comprehensive dataset for experimentally validated lncRNAs as a central repository for all of their structures, functions and phenotypes. Here, we established EVLncRNAs by curating lncRNAs validated by low-throughput experiments (up to 1 May 2016) and integrating specific databases (lncRNAdb, LncRNADisease, Lnc2Cancer and PLNlncRbase) with additional functional and disease-specific information not covered previously. The current version of EVLncRNAs contains 1543 lncRNAs from 77 species, which is 2.9 times larger than the current largest database for experimentally validated lncRNAs. Seventy-four percent of the lncRNA entries are partially or completely new compared to all existing databases of experimentally validated lncRNAs. The established database allows users to browse, search and download as well as to submit experimentally validated lncRNAs. The database is available at http://biophy.dzu.edu.cn/EVLncRNAs. PMID:28985416

  2. New Reactor Physics Benchmark Data in the March 2012 Edition of the IRPhEP Handbook

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    John D. Bess; J. Blair Briggs; Jim Gulliford

    2012-11-01

    The International Reactor Physics Experiment Evaluation Project (IRPhEP) was established to preserve integral reactor physics experimental data, including separate or special effects data, for nuclear energy and technology applications. The numerous experiments that have been performed worldwide represent a large investment of infrastructure, expertise, and cost, and are valuable resources of data for present and future research. These valuable assets provide the basis for recording, development, and validation of methods. If the experimental data are lost, the high cost to repeat many of these measurements may be prohibitive. The purpose of the IRPhEP is to provide an extensively peer-reviewed set of reactor physics-related integral data that can be used by reactor designers and safety analysts to validate the analytical tools used to design next-generation reactors and establish the safety basis for operation of these reactors. Contributors from around the world collaborate in the evaluation and review of selected benchmark experiments for inclusion in the International Handbook of Evaluated Reactor Physics Benchmark Experiments (IRPhEP Handbook) [1]. Several new evaluations have been prepared for inclusion in the March 2012 edition of the IRPhEP Handbook.

  3. Multicoordination Control Strategy Performance in Hybrid Power Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pezzini, Paolo; Bryden, Kenneth M.; Tucker, David

    This paper evaluates a state-space methodology for a multi-input multi-output (MIMO) control strategy using a 2 × 2 tightly coupled scenario applied to a physical gas turbine fuel cell hybrid power system. A centralized MIMO controller was preferred over a decentralized control approach because previous simulation studies showed that it better minimized the coupling effect identified during the simultaneous control of the turbine speed and cathode airflow. The MIMO controller was developed using a state-space dynamic model of the system that was derived using first-order transfer functions empirically obtained through experimental tests. The controller performance was evaluated in terms of disturbance rejection through perturbations in the gas turbine operation, and setpoint tracking maneuvers through turbine speed and cathode airflow steps. The experimental results illustrate that a multicoordination control strategy was able to mitigate the coupling of each actuator to each output during the simultaneous control of the system, and improved the overall system performance during transient conditions. On the other hand, the controller showed different performance during validation in the simulation environment than during validation in the physical facility, which will require better dynamic modeling of the system for the implementation of future multivariable control strategies.

  4. Multicoordination Control Strategy Performance in Hybrid Power Systems

    DOE PAGES

    Pezzini, Paolo; Bryden, Kenneth M.; Tucker, David

    2018-04-11

    This paper evaluates a state-space methodology for a multi-input multi-output (MIMO) control strategy using a 2 × 2 tightly coupled scenario applied to a physical gas turbine fuel cell hybrid power system. A centralized MIMO controller was preferred over a decentralized control approach because previous simulation studies showed that it better minimized the coupling effect identified during the simultaneous control of the turbine speed and cathode airflow. The MIMO controller was developed using a state-space dynamic model of the system that was derived using first-order transfer functions empirically obtained through experimental tests. The controller performance was evaluated in terms of disturbance rejection through perturbations in the gas turbine operation, and setpoint tracking maneuvers through turbine speed and cathode airflow steps. The experimental results illustrate that a multicoordination control strategy was able to mitigate the coupling of each actuator to each output during the simultaneous control of the system, and improved the overall system performance during transient conditions. On the other hand, the controller showed different performance during validation in the simulation environment than during validation in the physical facility, which will require better dynamic modeling of the system for the implementation of future multivariable control strategies.
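    The modeling step described above, building a coupled plant model from empirically fitted first-order transfer functions, can be sketched as follows; the gains and time constants are placeholders, not values identified on the hybrid gas turbine fuel cell rig:

```python
import numpy as np

# Each element of the 2x2 plant is modeled as an empirically fitted
# first-order transfer function K_ij / (tau_ij * s + 1), realized here
# with one state per element. Gains and time constants are placeholders,
# not values identified on the hybrid gas turbine fuel cell rig.

K = np.array([[1.2, -0.4],
              [0.3,  0.9]])    # steady-state gain matrix (illustrative)
tau = np.array([[2.0, 5.0],
                [4.0, 1.5]])   # time constants in seconds (illustrative)

def simulate_step(u, t_end=60.0, dt=0.001):
    """Euler-integrate x_ij' = (-x_ij + K_ij u_j) / tau_ij with
    outputs y_i = sum_j x_ij (e.g. turbine speed and cathode airflow)."""
    x = np.zeros((2, 2))
    for _ in range(int(t_end / dt)):
        x += dt * (-x + K * u[None, :]) / tau
    return x.sum(axis=1)

u = np.array([1.0, 0.5])   # simultaneous step on both inputs
y_ss = simulate_step(u)    # settles to K @ u; the off-diagonal gains
                           # are the coupling a MIMO controller must fight
```

    The nonzero off-diagonal entries of K are exactly what makes a decentralized (loop-by-loop) design struggle: each actuator moves both outputs, so the centralized controller works against the full gain matrix at once.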

  5. Transcriptome-wide selection of a reliable set of reference genes for gene expression studies in potato cyst nematodes (Globodera spp.).

    PubMed

    Sabeh, Michael; Duceppe, Marc-Olivier; St-Arnaud, Marc; Mimee, Benjamin

    2018-01-01

    Relative gene expression analyses by qRT-PCR (quantitative reverse transcription PCR) require an internal control to normalize the expression data of genes of interest and eliminate the unwanted variation introduced by sample preparation. A perfect reference gene should have a constant expression level under all the experimental conditions. However, the same few housekeeping genes selected from the literature or successfully used in previous unrelated experiments are often routinely used in new conditions without proper validation of their stability across treatments. The advent of RNA-Seq and the availability of public datasets for numerous organisms are opening the way to finding better reference genes for expression studies. Globodera rostochiensis is a plant-parasitic nematode that is particularly yield-limiting for potato. The aim of our study was to identify a reliable set of reference genes to study G. rostochiensis gene expression. Gene expression levels from an RNA-Seq database were used to identify putative reference genes and were validated with qRT-PCR analysis. Three genes, GR, PMP-3, and aaRS, were found to be very stable within the experimental conditions of this study and are proposed as reference genes for future work.
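    The screening idea behind such studies can be sketched with a simple stability metric, the coefficient of variation across conditions; published pipelines typically use RNA-Seq variance filters plus geNorm or NormFinder, and the gene names and expression values below are invented:

```python
import numpy as np

# Rank candidate reference genes by the coefficient of variation (CV)
# of their expression across conditions: the most stable gene varies
# least relative to its mean. Gene names and expression values are
# invented; real pipelines typically use geNorm or NormFinder.

expression = {
    "stable_candidate":   np.array([100, 101,  99, 100, 102,  98, 100, 100.0]),
    "moderate_candidate": np.array([ 80,  90,  70,  85,  75,  95,  65,  80.0]),
    "unstable_candidate": np.array([ 50,  90,  20,  70,  30,  95,  15,  60.0]),
}

def cv(x):
    return np.std(x) / np.mean(x)

ranking = sorted(expression, key=lambda g: cv(expression[g]))
# ranking[0] is the most stable candidate, i.e. the best reference gene
```

    Normalizing to a gene that itself responds to the treatment would fold that response into every measured gene, which is why stability must be re-validated for each new experimental condition.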

  6. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mallow, Anne; Abdelaziz, Omar; Graham, Jr., Samuel

    The thermal charging performance of paraffin wax combined with compressed expanded natural graphite foam was studied for different graphite bulk densities. Constant heat fluxes between 0.39 W/cm² and 1.55 W/cm² were applied, as well as a constant boundary temperature of 60 °C. Thermal charging experiments indicate that, in the design of thermal batteries, thermal conductivity of the composite alone is an insufficient metric to determine the influence of the graphite foam on the thermal energy storage. By dividing the latent heat of the composite by the time to end of melt for each applied boundary condition, the energy storage performance was calculated to show the effects of composite thermal conductivity, graphite bulk density, and latent heat capacity. For the experimental volume, the addition of graphite beyond a graphite bulk density of 100 kg/m³ showed limited benefit on the energy storage performance due to the decrease in latent heat storage capacity. These experimental results are used to validate a numerical model that predicts the time to melt, for future use in the design of heat exchangers with graphite-foam based phase change material composites. As a result, size scale effects are explored parametrically with the validated model.
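    The performance metric defined above, latent heat divided by time to end of melt, can be computed directly; the density, latent-heat, and melt-time values below are illustrative placeholders, not the measured paraffin/graphite-foam data:

```python
# Energy storage performance metric from the record above: latent heat
# of the composite divided by time to end of melt. The density /
# latent-heat / melt-time values are illustrative placeholders only.

composites = {
    # graphite bulk density kg/m^3: (latent heat J/g, time to end of melt s)
    50:  (180.0, 1400.0),
    100: (150.0, 900.0),
    150: (120.0, 760.0),
}

def storage_rate(latent_heat, t_melt):
    """Energy storage performance in W/g (J/g divided by s)."""
    return latent_heat / t_melt

rates = {rho: storage_rate(*vals) for rho, vals in composites.items()}
# past ~100 kg/m^3 the faster melt no longer compensates for the lost
# latent heat, mirroring the limited benefit reported above
```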

  7. A joint numerical and experimental study of the jet of an aircraft engine installation with advanced techniques

    NASA Astrophysics Data System (ADS)

    Brunet, V.; Molton, P.; Bézard, H.; Deck, S.; Jacquin, L.

    2012-01-01

    This paper describes the results obtained during the European Union JEDI (JEt Development Investigations) project carried out in cooperation between ONERA and Airbus. The aim of these studies was first to acquire a complete database of a modern-type engine jet installation set under a wall-to-wall swept wing in various transonic flow conditions. Interactions between the engine jet, the pylon, and the wing were studied thanks to advanced measurement techniques. In parallel, accurate Reynolds-averaged Navier-Stokes (RANS) simulations were carried out, from simple ones with the Spalart-Allmaras model to more complex ones like the DRSM-SSG (Differential Reynolds Stress Model of Speziale-Sarkar-Gatski) turbulence model. Finally, Zonal Detached Eddy Simulations (Z-DES) were also performed to compare different simulation techniques. All numerical results are accurately validated against the experimental database acquired in parallel. This complete and complex study of a modern civil aircraft engine installation yielded many improvements in understanding and in simulation methods. Furthermore, a setup for engine jet installation studies has been validated for possible future work in the S3Ch transonic research wind tunnel. The main conclusions are summed up in this paper.

  8. Measuring and Advancing Experimental Design Ability in an Introductory Course without Altering Existing Lab Curriculum.

    PubMed

    Shanks, Ryan A; Robertson, Chuck L; Haygood, Christian S; Herdliksa, Anna M; Herdliska, Heather R; Lloyd, Steven A

    2017-01-01

    Introductory biology courses provide an important opportunity to prepare students for future courses, yet existing cookbook labs, although important in their own way, fail to provide many of the advantages of semester-long research experiences. Engaging, authentic research experiences aid biology students in meeting many learning goals. Therefore, overlaying a research experience onto the existing lab structure allows faculty to overcome barriers involving curricular change. Here we propose a working model for this overlay design in an introductory biology course and detail a means to conduct this lab with minimal increases in student and faculty workloads. Furthermore, we conducted an exploratory factor analysis of the Experimental Design Ability Test (EDAT) and uncovered two latent factors that provide a valid means to assess this overlay model's ability to increase advanced experimental design abilities. In a pre-test/post-test design, we demonstrate significant increases in both basic and advanced experimental design abilities in both an experimental and a comparison group. We measured significantly higher gains in advanced experimental design understanding in students in the experimental group. We believe this overlay model and the EDAT factor analysis contribute a novel means to conduct and assess the effectiveness of authentic research experiences in an introductory course without major changes to the course curriculum and with minimal increases in faculty and student workloads.

  9. Virtual Diagnostic Interface: Aerospace Experimentation in the Synthetic Environment

    NASA Technical Reports Server (NTRS)

    Schwartz, Richard J.; McCrea, Andrew C.

    2009-01-01

    The Virtual Diagnostics Interface (ViDI) methodology combines two-dimensional image processing and three-dimensional computer modeling to provide comprehensive in-situ visualizations commonly utilized for in-depth planning of wind tunnel and flight testing, real time data visualization of experimental data, and unique merging of experimental and computational data sets in both real-time and post-test analysis. The preparation of such visualizations encompasses the realm of interactive three-dimensional environments, traditional and state of the art image processing techniques, database management and development of toolsets with user friendly graphical user interfaces. ViDI has been under development at the NASA Langley Research Center for over 15 years, and has a long track record of providing unique and insightful solutions to a wide variety of experimental testing techniques and validation of computational simulations. This report will address the various aspects of ViDI and how it has been applied to test programs as varied as NASCAR race car testing in NASA wind tunnels to real-time operations concerning Space Shuttle aerodynamic flight testing. In addition, future trends and applications will be outlined in the paper.

  10. Virtual Diagnostic Interface: Aerospace Experimentation in the Synthetic Environment

    NASA Technical Reports Server (NTRS)

    Schwartz, Richard J.; McCrea, Andrew C.

    2010-01-01

    The Virtual Diagnostics Interface (ViDI) methodology combines two-dimensional image processing and three-dimensional computer modeling to provide comprehensive in-situ visualizations commonly utilized for in-depth planning of wind tunnel and flight testing, real time data visualization of experimental data, and unique merging of experimental and computational data sets in both real-time and post-test analysis. The preparation of such visualizations encompasses the realm of interactive three-dimensional environments, traditional and state of the art image processing techniques, database management and development of toolsets with user friendly graphical user interfaces. ViDI has been under development at the NASA Langley Research Center for over 15 years, and has a long track record of providing unique and insightful solutions to a wide variety of experimental testing techniques and validation of computational simulations. This report will address the various aspects of ViDI and how it has been applied to test programs as varied as NASCAR race car testing in NASA wind tunnels to real-time operations concerning Space Shuttle aerodynamic flight testing. In addition, future trends and applications will be outlined in the paper.

  11. Whole Device Modeling of Compact Tori: Stability and Transport Modeling of C-2W

    NASA Astrophysics Data System (ADS)

    Dettrick, Sean; Fulton, Daniel; Lau, Calvin; Lin, Zhihong; Ceccherini, Francesco; Galeotti, Laura; Gupta, Sangeeta; Onofri, Marco; Tajima, Toshiki; TAE Team

    2017-10-01

    Recent experimental evidence from the C-2U FRC experiment shows that the confinement of energy improves with inverse collisionality, similar to other high beta toroidal devices, NSTX and MAST. This motivated the construction of a new FRC experiment, C-2W, to study the energy confinement scaling at higher electron temperature. Tri Alpha Energy is working towards catalysing a community-wide collaboration to develop a Whole Device Model (WDM) of Compact Tori. One application of the WDM is the study of stability and transport properties of C-2W using two particle-in-cell codes, ANC and FPIC. These codes can be used to find new stable operating points, and to make predictions of the turbulent transport at those points. They will be used in collaboration with the C-2W experimental program to validate the codes against C-2W, mitigate experimental risk inherent in the exploration of new parameter regimes, accelerate the optimization of experimental operating scenarios, and to find operating points for future FRC reactor designs.

  12. Validation of the NIMH-ChEFS adolescent face stimulus set in an adolescent, parent, and health professional sample

    PubMed Central

    COFFMAN, MARIKA C.; TRUBANOVA, ANDREA; RICHEY, J. ANTHONY; WHITE, SUSAN W.; KIM-SPOON, JUNGMEEN; OLLENDICK, THOMAS H.; PINE, DANIEL S.

    2016-01-01

    Attention to faces is a fundamental psychological process in humans, with atypical attention to faces noted across several clinical disorders. Although many clinical disorders onset in adolescence, there is a lack of well-validated stimulus sets containing adolescent faces available for experimental use. Further, the images comprising most available sets are not controlled for high- and low-level visual properties. Here, we present a cross-site validation of the National Institute of Mental Health Child Emotional Faces Picture Set (NIMH-ChEFS), comprising 257 photographs of adolescent faces displaying angry, fearful, happy, sad, and neutral expressions. All of the direct facial images from the NIMH-ChEFS set were adjusted in terms of the location of facial features and standardized for luminance, size, and smoothness. Although overall agreement between raters in this study and the original development-site raters was high (89.52%), this differed by group such that agreement was lower for adolescents relative to mental health professionals in the current study. These results suggest that future research using this face set, or others of adolescent/child faces, should base comparisons on similarly aged validation data. PMID:26359940
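    An overall figure such as the 89.52% agreement reported above is a simple percent-agreement computation over categorical labels; a minimal sketch with invented label sequences:

```python
# Percent agreement between two raters over categorical emotion labels,
# the computation behind overall figures such as the 89.52% reported
# above. The label sequences here are invented.

def percent_agreement(rater_a, rater_b):
    assert len(rater_a) == len(rater_b)
    matches = sum(a == b for a, b in zip(rater_a, rater_b))
    return 100.0 * matches / len(rater_a)

site_a = ["angry", "fearful", "happy", "sad", "neutral",
          "happy", "sad", "angry", "neutral", "fearful"]
site_b = ["angry", "fearful", "happy", "sad", "neutral",
          "happy", "fearful", "angry", "neutral", "fearful"]

agreement = percent_agreement(site_a, site_b)  # one mismatch in ten -> 90.0
```

    Raw percent agreement does not correct for chance; studies that need a chance-corrected statistic typically report Cohen's kappa alongside it.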

  13. Development and validation of a treatment planning model for magnetic nanoparticle hyperthermia cancer therapy

    NASA Astrophysics Data System (ADS)

    Stigliano, Robert Vincent

    The use of magnetic nanoparticles (mNPs) to induce local hyperthermia has been emerging in recent years as a promising cancer therapy, in both stand-alone and combination treatment settings, including surgery, radiation, and chemotherapy. The mNP solution can be injected either directly into the tumor or administered intravenously. Studies have shown that some cancer cells associate with, internalize, and aggregate mNPs more preferentially than normal cells, with and without antibody targeting. Once the mNPs are delivered inside the cells, a low-frequency (30-300 kHz) alternating electromagnetic field is used to activate the mNPs. The nanoparticles absorb the applied field and provide localized heat generation at nanometer-to-micron scales. Treatment planning models have been shown to improve treatment efficacy in radiation therapy by limiting normal tissue damage while maximizing dose to the tumor. To date, there does not exist a clinical treatment planning model for magnetic nanoparticle hyperthermia that is robust, validated, and commercially available. The focus of this research is on the development and experimental validation of a treatment planning model, consisting of a coupled electromagnetic and thermal model that predicts dynamic thermal distributions during treatment. When allowed to incubate, the mNPs are often sequestered by cancer cells and packed into endosomes. The proximity of the mNPs has a strong influence on their ability to heat due to interparticle magnetic interaction effects. A model of mNP heating which takes into account the effects of magnetic interaction was developed and validated against experimental data. An animal study in mice was conducted to determine the effects of mNP solution injection duration and PEGylation on macroscale mNP distribution within the tumor, in order to further inform the treatment planning model and future experimental technique.
In clinical applications, a critical limiting factor for the maximum applied field is the heating caused by eddy currents induced in the noncancerous tissue. Phantom studies were conducted to validate the ability of the model to accurately predict eddy current heating in the case of zero blood perfusion, and preliminary data were collected to show the validity of the model, with blood perfusion incorporated, in live mice.
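    The kind of coupled thermal model described above can be sketched as a 1D explicit finite-difference solve of the Pennes bioheat equation with a perfusion sink and a localized nanoparticle heat source; all property values are generic literature-style placeholders, not the validated treatment-planning model:

```python
import numpy as np

# 1D explicit finite-difference solve of the Pennes bioheat equation
# with a localized nanoparticle heat source. Tissue properties, the
# perfusion term, and the source strength are generic literature-style
# placeholders, not the validated treatment-planning model.

nx, dx, dt = 101, 1e-3, 0.05           # 10 cm domain, 1 mm grid, 50 ms step
k, rho, c = 0.5, 1050.0, 3600.0        # W/(m K), kg/m^3, J/(kg K)
w_b, rho_b, c_b = 0.5e-3, 1050.0, 3800.0  # perfusion 1/s, blood properties
T_a = 37.0                             # arterial temperature, deg C
Q = np.zeros(nx)
Q[45:56] = 5e4                         # mNP heating in the "tumor", W/m^3

alpha = k / (rho * c)
assert alpha * dt / dx**2 < 0.5        # explicit-scheme stability limit

T = np.full(nx, 37.0)
for _ in range(int(600 / dt)):         # ten minutes of field-on heating
    lap = (np.roll(T, 1) - 2 * T + np.roll(T, -1)) / dx**2
    T = T + dt * (alpha * lap
                  + w_b * rho_b * c_b * (T_a - T) / (rho * c)
                  + Q / (rho * c))
    T[0] = T[-1] = 37.0                # body-core boundary condition
peak = T.max()                         # hottest point, inside the tumor
```

    Even this toy version shows the clinical tradeoff in the record: the perfusion term caps the achievable temperature rise, so source strength and exposure time must be planned jointly.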

  14. Development of N-version software samples for an experiment in software fault tolerance

    NASA Technical Reports Server (NTRS)

    Lauterbach, L.

    1987-01-01

    The report documents the task planning and software development phases of an effort to obtain twenty versions of code independently designed and developed from a common specification. These versions were created for use in future experiments in software fault tolerance, in continuation of the experimental series underway at the Systems Validation Methods Branch (SVMB) at NASA Langley Research Center. The 20 versions were developed under controlled conditions at four U.S. universities, by 20 teams of two researchers each. The versions process raw data from a modified Redundant Strapped Down Inertial Measurement Unit (RSDIMU). The specifications, and over 200 questions submitted by the developers concerning the specifications, are included as appendices to this report. Design documents, and design and code walkthrough reports for each version, were also obtained in this task for use in future studies.

  15. A novel recession rate physics methodology for space applications at CIRA by means of CIRCE radioactive beam tracers

    NASA Astrophysics Data System (ADS)

    De Cesare, M.; Di Leva, A.; Del Vecchio, A.; Gialanella, L.

    2018-03-01

    Thermal protection systems (TPSs) of spacecraft, either for single use or reusable, experience wear by ablation and erosion due to the high heat fluxes during the re-entry phase in the atmosphere. The determination of the wear rate is a crucial point, which at present is mainly addressed in aerospace on-ground measurements by means of invasive diagnostics. The purpose of this paper is to present novel contactless, online, high-sensitivity and non-intrusive diagnostics for wear measurements based on radioactive tracers. We propose the technique for future on-ground experiments that might later be developed to perform in-flight TPS monitoring, thus significantly increasing the safety of aerospace vehicles. The basic ideas of the method, its sensitivity investigated by GEANT4 simulations, and the future experimental validation are outlined.
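    The tracer-based wear measurement can be sketched as a thin-layer-activation style calculation: the activity remaining after ablation is compared against a calibration curve computed from the implantation depth profile. The Gaussian profile below is an assumption for illustration, not measured CIRCE beam data:

```python
import numpy as np

# Thin-layer-activation sketch: infer recession (wear) depth from the
# drop in activity of an implanted radioactive tracer. The Gaussian
# implantation profile is an assumption for illustration, not measured
# CIRCE beam data.

depth = np.linspace(0.0, 200e-6, 2001)                   # 0-200 um grid
profile = np.exp(-0.5 * ((depth - 60e-6) / 15e-6) ** 2)  # tracer density

def remaining_fraction(recession):
    """Fraction of the initial activity left after 'recession' m of wear."""
    return profile[depth >= recession].sum() / profile.sum()

# Calibration curve: activity fraction as a function of recession depth.
recessions = np.linspace(0.0, 150e-6, 301)
calib = np.array([remaining_fraction(r) for r in recessions])

measured = remaining_fraction(55e-6)   # stand-in for a measured count rate
estimate = np.interp(measured, calib[::-1], recessions[::-1])
```

    The sensitivity of the method is highest where the implantation profile is steep, which is why the implantation depth would be tuned to the expected wear range.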

  16. Simulation of energy buildups in solid-state regenerative amplifiers for 2-μm emitting lasers

    NASA Astrophysics Data System (ADS)

    Springer, Ramon; Alexeev, Ilya; Heberle, Johannes; Pflaum, Christoph

    2018-02-01

    A numerical model for solid-state regenerative amplifiers is presented, which is able to precisely simulate the quantitative energy buildup of stretched femtosecond pulses over successive roundtrips in the cavity. In detail, this model is experimentally validated with a Ti:Sapphire regenerative amplifier. Additionally, a simulation of a Ho:YAG-based regenerative amplifier is conducted and compared to experimental data from the literature. Furthermore, a bifurcation study of the investigated Ho:YAG system is performed, which leads to the identification of stable and unstable operation regimes. The presented numerical model shows good agreement with the experimental results from the Ti:Sapphire regenerative amplifier. The pulse energy gained from the Ho:YAG system could also be approximated closely, with the remaining mismatch explained by the monochromatic treatment of pulse amplification. Since the model is applicable to other solid-state gain media, it allows for the efficient design of future amplification systems based on regenerative amplification.
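    The roundtrip energy buildup such a model simulates can be sketched with the classic Frantz-Nodvik relation plus gain depletion; the saturation energy, initial gain, seed energy, and loss below are generic placeholders, not the Ti:Sapphire or Ho:YAG system values:

```python
import math

# Roundtrip energy buildup in a regenerative amplifier via the
# Frantz-Nodvik relation with gain depletion. Saturation energy, initial
# gain, seed energy, and loss are generic placeholders, not the
# Ti:Sapphire or Ho:YAG values.

E_sat = 1.0e-3   # saturation energy of the gain medium, J (illustrative)
g = 2.0          # initial logarithmic small-signal roundtrip gain
loss = 0.9       # passive roundtrip transmission (illustrative)
E = 1.0e-9       # injected seed pulse energy, J

energies = []
for _ in range(60):
    G0 = math.exp(g)
    E_out = E_sat * math.log(1.0 + G0 * (math.exp(E / E_sat) - 1.0))
    g -= (E_out - E) / E_sat   # stored energy depleted by the extraction
    E = loss * E_out           # passive cavity losses per roundtrip
    energies.append(E)
# buildup is first ~exponential, then rolls over as the gain depletes
```

    Re-adding a pumping term between roundtrips turns this map into the kind of iterated system whose fixed points and period-doubling give the bifurcation behavior studied in the record above.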

  17. Simulation studies of nucleation of ferroelectric polarization reversal.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brennecka, Geoffrey L.; Winchester, Benjamin Michael

    2014-08-01

    Electric field-induced reversal of spontaneous polarization is the defining characteristic of a ferroelectric material, but the process(es) and mechanism(s) associated with the initial nucleation of reverse-polarity domains are poorly understood. This report describes studies carried out using phase field modeling of LiTaO3, a relatively simple prototype ferroelectric material, in order to explore the effects of either mechanical deformation or optically-induced free charges on nucleation and the resulting domain configuration during field-induced polarization reversal. Conditions were selected to approximate as closely as feasible those of accompanying experimental work, in order not only to provide support for the experimental work but also to ensure that additional experimental validation of the simulations could be carried out in the future. Phase field simulations strongly support surface mechanical damage/deformation as effective for dramatically reducing the overall coercive field (Ec) via local field enhancements. Further, optically-nucleated polarization reversal appears to occur via stabilization of latent nuclei through the charge screening effects of free charges.
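    The phase field approach referred to above can be sketched in one dimension as time-dependent Ginzburg-Landau relaxation of the polarization in a field-tilted double-well potential; the coefficients are dimensionless toy values, not the LiTaO3 Landau parameters of the actual study:

```python
import numpy as np

# 1D time-dependent Ginzburg-Landau (phase field) sketch of field-driven
# polarization reversal. Dimensionless toy coefficients, not the LiTaO3
# Landau parameters used in the actual study.

nx, dx, dt, steps = 200, 0.5, 0.01, 5000
alpha, beta, kappa, gamma = -1.0, 1.0, 1.0, 1.0  # double well + gradient
E = 0.5   # applied field; above the homogeneous coercive field
          # 2*|alpha|/(3*sqrt(3)) ~ 0.385, so full reversal is expected

P = -np.ones(nx)      # start uniformly polarized "down"
P[95:105] = 1.0       # seed a small reverse-polarity nucleus
for _ in range(steps):
    lap = (np.roll(P, 1) - 2 * P + np.roll(P, -1)) / dx**2
    dF_dP = alpha * P + beta * P**3 - kappa * lap - E
    P = P - dt * gamma * dF_dP   # relaxational (TDGL) dynamics
switched_fraction = float(np.mean(P > 0.0))
# below the coercive field, reversal would instead proceed (if at all)
# by growth of the seeded nucleus, which is the regime of interest above
```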

  18. Experimental Stage Separation Tool Development in NASA Langley's Aerothermodynamics Laboratory

    NASA Technical Reports Server (NTRS)

    Murphy, Kelly J.; Scallion, William I.

    2005-01-01

    As part of the research effort at NASA in support of the stage separation and ascent aerothermodynamics research program, proximity testing of a generic bimese wing-body configuration was conducted in NASA Langley's Aerothermodynamics Laboratory in the 20-Inch Mach 6 Air Tunnel. The objective of this work is the development of experimental tools and testing methodologies to apply to hypersonic stage separation problems for future multi-stage launch vehicle systems. Aerodynamic force and moment proximity data were generated at a nominal Mach number of 6 over a small range of angles of attack. The generic bimese configuration was tested in a belly-to-belly and back-to-belly orientation at 86 relative proximity locations. Over 800 aerodynamic proximity data points were taken to serve as a database for code validation. Longitudinal aerodynamic data generated in this test program show very good agreement with viscous computational predictions. Thus a framework has been established to study separation problems in the hypersonic regime using coordinated experimental and computational tools.

  19. Development and experimental validation of downlink multiuser MIMO-OFDM in gigabit wireless LAN systems

    NASA Astrophysics Data System (ADS)

    Ishihara, Koichi; Asai, Yusuke; Kudo, Riichi; Ichikawa, Takeo; Takatori, Yasushi; Mizoguchi, Masato

    2013-12-01

    Multiuser multiple-input multiple-output (MU-MIMO) has been proposed as a means to improve spectrum efficiency for various future wireless communication systems. This paper reports indoor experimental results obtained for a newly developed and implemented downlink (DL) MU-MIMO orthogonal frequency division multiplexing (OFDM) transceiver for gigabit wireless local area network systems in the microwave band. In the transceiver, the channel state information (CSI) is estimated at each user and fed back to an access point (AP) on a real-time basis. At the AP, the estimated CSI is used to calculate the transmit beamforming weight for DL MU-MIMO transmission. This paper also proposes a recursive inverse matrix computation scheme for computing the transmit weight in real time. Experiments with the developed transceiver demonstrate its feasibility in a number of indoor scenarios. The experimental results clarify that DL MU-MIMO-OFDM transmission can achieve a 972-Mbit/s transmission data rate with simple digital signal processing of single-antenna users in an indoor environment.
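    The paper's recursive inverse-matrix scheme is not spelled out in the abstract; the standard building block for this kind of real-time weight update is the Sherman-Morrison identity, which refreshes an inverse after a rank-one change without a full re-inversion. The sketch below (pure Python, hypothetical function names) illustrates that identity, not the authors' exact algorithm.

```python
def mat_vec(A, x):
    """Matrix-vector product for a list-of-rows matrix."""
    return [sum(a * b for a, b in zip(row, x)) for row in A]

def sherman_morrison_update(A_inv, u, v):
    """Given A^-1, return (A + u v^T)^-1 via
    (A + u v^T)^-1 = A^-1 - (A^-1 u)(v^T A^-1) / (1 + v^T A^-1 u)."""
    Au = mat_vec(A_inv, u)                          # A^-1 u (column)
    vA = mat_vec([list(col) for col in zip(*A_inv)], v)  # v^T A^-1 (row)
    denom = 1.0 + sum(a * b for a, b in zip(v, Au))
    n = len(A_inv)
    return [[A_inv[i][j] - Au[i] * vA[j] / denom for j in range(n)]
            for i in range(n)]
```

Each update costs O(n^2) instead of the O(n^3) of a fresh inversion, which is the usual motivation for recursive schemes in real-time beamforming.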

  20. Evaluation of tocopherol recovery through simulation of molecular distillation process.

    PubMed

    Moraes, E B; Batistella, C B; Alvarez, M E Torres; Filho, Rubens Maciel; Maciel, M R Wolf

    2004-01-01

    The DISMOL simulator was used to determine the best possible operating conditions to guide future experimental work. The simulator requires several physical-chemical properties, which are often difficult to determine because of the complexity of the components involved; they must therefore be obtained through correlations and/or predictions in order to characterize the system and perform the calculations. The first step is to obtain simulation results for a system that can later be validated with experimental data. Implementing the necessary parameters of complex systems in the simulator is a difficult task. In this work, we aimed to determine these properties in order to evaluate tocopherol (vitamin E) recovery using the DISMOL simulator. The raw material used was the crude deodorizer distillate of soya oil. With this procedure, it is possible to determine the best operating conditions for experimental work and to evaluate the process for the separation of new systems by analyzing the profiles obtained from these simulations for the falling-film molecular distillator.

  1. Numerical and Experimental Study on Hydrodynamic Performance of A Novel Semi-Submersible Concept

    NASA Astrophysics Data System (ADS)

    Gao, Song; Tao, Long-bin; Kou, Yu-feng; Lu, Chao; Sun, Jiang-long

    2018-04-01

    The Multiple Column Platform (MCP) semi-submersible is a newly proposed concept that differs from conventional semi-submersibles in featuring a centre column and a middle pontoon. It is paramount to ensure its structural reliability and safe operation at sea, and a rigorous investigation is conducted to examine the hydrodynamic and structural performance of this novel structural concept. In this paper, numerical and experimental studies on the hydrodynamic performance of the MCP are performed. Numerical simulations are conducted in both the frequency and time domains based on 3D potential theory. The numerical models are validated by experimental measurements obtained from extensive sets of model tests under both regular and irregular wave conditions. Moreover, a comparative study of the MCP and two conventional semi-submersibles is carried out using numerical simulation. Specifically, the hydrodynamic characteristics, including hydrodynamic coefficients, natural periods, motion response amplitude operators (RAOs), and mooring line tensions, are fully examined. The present study proves the feasibility of the novel MCP and demonstrates its potential for optimization in future studies.

  2. A MPPT Algorithm Based PV System Connected to Single Phase Voltage Controlled Grid

    NASA Astrophysics Data System (ADS)

    Sreekanth, G.; Narender Reddy, N.; Durga Prasad, A.; Nagendrababu, V.

    2012-10-01

    Future ancillary services provided by photovoltaic (PV) systems could facilitate their penetration in power systems. In addition, low-power PV systems can be designed to improve power quality. This paper presents a single-phase PV system that provides grid voltage support and compensation of harmonic distortion at the point of common coupling thanks to a repetitive controller. The power provided by the PV panels is controlled by a Maximum Power Point Tracking (MPPT) algorithm based on the incremental conductance method, specifically modified to control the phase of the PV inverter voltage. Simulation and experimental results validate the presented solution.
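    The incremental conductance method rests on the observation that at the maximum power point dP/dV = 0, i.e. dI/dV = -I/V. A minimal sketch of one tracking iteration follows; note that the paper's variant steers the phase of the inverter voltage rather than a simple voltage reference, so this is a generic illustration, not the modified algorithm.

```python
def inc_cond_step(v, i, v_prev, i_prev, v_ref, step=0.5):
    """One incremental-conductance MPPT iteration: compare the incremental
    conductance dI/dV with the instantaneous conductance -I/V and move the
    operating-voltage reference toward the maximum power point."""
    dv, di = v - v_prev, i - i_prev
    if dv == 0.0:
        if di > 0.0:
            v_ref += step        # irradiance rose: track upward
        elif di < 0.0:
            v_ref -= step
    else:
        g_inc, g = di / dv, -i / v
        if g_inc > g:            # left of the MPP (dP/dV > 0): raise voltage
            v_ref += step
        elif g_inc < g:          # right of the MPP (dP/dV < 0): lower voltage
            v_ref -= step
    return v_ref
```

When g_inc equals -I/V the reference is left unchanged, which is the property that makes incremental conductance steadier at the MPP than perturb-and-observe.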

  3. The mathematics of cancer: integrating quantitative models.

    PubMed

    Altrock, Philipp M; Liu, Lin L; Michor, Franziska

    2015-12-01

    Mathematical modelling approaches have become increasingly abundant in cancer research. The complexity of cancer is well suited to quantitative approaches as it provides challenges and opportunities for new developments. In turn, mathematical modelling contributes to cancer research by helping to elucidate mechanisms and by providing quantitative predictions that can be validated. The recent expansion of quantitative models addresses many questions regarding tumour initiation, progression and metastases as well as intra-tumour heterogeneity, treatment responses and resistance. Mathematical models can complement experimental and clinical studies, but also challenge current paradigms, redefine our understanding of mechanisms driving tumorigenesis and shape future research in cancer biology.

  4. Validation of fault-free behavior of a reliable multiprocessor system - FTMP: A case study. [Fault-Tolerant Multi-Processor avionics

    NASA Technical Reports Server (NTRS)

    Clune, E.; Segall, Z.; Siewiorek, D.

    1984-01-01

    A program of experiments has been conducted at NASA-Langley to test the fault-free performance of a Fault-Tolerant Multiprocessor (FTMP) avionics system for next-generation aircraft. Baseline measurements of an operating FTMP system were obtained with respect to the following parameters: instruction execution time, frame size, and the variation of clock ticks. The mechanisms of frame stretching were also investigated. The experimental results are summarized in a table. Areas of interest for future tests are identified, with emphasis given to the implementation of a synthetic workload generation mechanism on FTMP.

  5. Validation of Potential Models for Li2O in Classical Molecular Dynamics Simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Oda, Takuji; Oya, Yasuhisa; Tanaka, Satoru

    2007-08-01

    Four Buckingham-type pairwise potential models for Li2O were assessed by molecular statics and dynamics simulations. In the static simulations, all models gave acceptable agreement with experimental values and ab initio calculation results for the crystalline properties. Moreover, the superionic phase transition was reproduced in the dynamics simulations. However, no model adequately reproduced the Li diffusivity and the lattice expansion at the same time. When using these models in future radiation simulations, these features should be taken into account in order to reduce the model dependency of the results.
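    For reference, a Buckingham-type pair potential has the form V(r) = A*exp(-r/rho) - C/r^6: a steep exponential repulsion plus a van der Waals attraction. The sketch below simply evaluates this form; the parameter values are illustrative placeholders, not one of the four published Li2O sets assessed in the paper.

```python
import math

def buckingham(r, A, rho, C):
    """Buckingham pair potential V(r) = A*exp(-r/rho) - C/r**6.
    Units follow whatever A, rho, C are given in (commonly eV and
    angstroms); values passed here are placeholders, not a fitted set."""
    return A * math.exp(-r / rho) - C / r ** 6
```

Unlike Lennard-Jones, this form collapses at very small r (the exponential stays finite while -C/r^6 diverges), so MD codes typically guard the short-range region.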

  6. Documentation of Two- and Three-Dimensional Hypersonic Shock Wave/Turbulent Boundary Layer Interaction Flows

    NASA Technical Reports Server (NTRS)

    Kussoy, Marvin I.; Horstman, Clifford C.

    1989-01-01

    Experimental data for a series of two- and three-dimensional shock wave/turbulent boundary layer interaction flows at Mach 7 are presented. Test bodies, composed of simple geometric shapes, were designed to generate flows with varying degrees of pressure gradient, boundary-layer separation, and turning angle. The data include surface-pressure and heat-transfer distributions as well as limited mean-flow-field surveys in both the undisturbed and the interaction regimes. The data are presented in a convenient form for use in validating existing or future computational models of these generic hypersonic flows.

  7. Experimental Demonstration of Technologies for Autonomous On-Orbit Robotic Assembly

    NASA Technical Reports Server (NTRS)

    LeMaster, Edward A.; Schaechter, David B.; Carrington, Connie K.

    2006-01-01

    The Modular Reconfigurable High Energy (MRHE) program aimed to develop technologies for the automated assembly and deployment of large-scale space structures and aggregate spacecraft. Part of the project involved creation of a terrestrial robotic testbed for validation and demonstration of these technologies and for the support of future development activities. This testbed was completed in 2005, and was thereafter used to demonstrate automated rendezvous, docking, and self-assembly tasks between a group of three modular robotic spacecraft emulators. This paper discusses the rationale for the MRHE project, describes the testbed capabilities, and presents the MRHE assembly demonstration sequence.

  8. Model based multivariable controller for large scale compression stations. Design and experimental validation on the LHC 18KW cryorefrigerator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bonne, François; Bonnay, Patrick; Alamir, Mazen

    2014-01-29

    In this paper, a multivariable model-based non-linear controller for Warm Compression Stations (WCS) is proposed. The strategy is to replace all the PID loops controlling the WCS with an optimally designed model-based multivariable loop. This new strategy leads to high stability and fast disturbance rejection such as those induced by a turbine or a compressor stop, a key-aspect in the case of large scale cryogenic refrigeration. The proposed control scheme can be used to have precise control of every pressure in normal operation or to stabilize and control the cryoplant under high variation of thermal loads (such as a pulsed heat load expected to take place in future fusion reactors such as those expected in the cryogenic cooling systems of the International Thermonuclear Experimental Reactor ITER or the Japan Torus-60 Super Advanced fusion experiment JT-60SA). The paper details how to set the WCS model up to synthesize the Linear Quadratic Optimal feedback gain and how to use it. After preliminary tuning at CEA-Grenoble on the 400W@1.8K helium test facility, the controller has been implemented on a Schneider PLC and fully tested first on the CERN's real-time simulator. Then, it was experimentally validated on a real CERN cryoplant. The efficiency of the solution is experimentally assessed using a reasonable operating scenario of start and stop of compressors and cryogenic turbines. This work is partially supported through the European Fusion Development Agreement (EFDA) Goal Oriented Training Program, task agreement WP10-GOT-GIRO.

  9. Recent gyrokinetic turbulence insights with GENE and direct comparison with experimental measurements

    NASA Astrophysics Data System (ADS)

    Goerler, Tobias

    2017-10-01

    Over the last several years, direct comparisons between gyrokinetic turbulence simulations and experimental measurements have intensified substantially. Such studies are largely motivated by the urgent need for reliable transport predictions for future burning plasma devices and the associated necessity of validating the numerical tools. On the other hand, they can help assess the way a particular diagnostic experiences turbulence, providing ideas for further optimization and access to physics not yet measurable. Here, synthetic diagnostics, i.e. models that mimic the spatial and sometimes temporal response of the experimental diagnostic, play an important role. In this contribution, we focus on recent gyrokinetic GENE simulations dedicated to ASDEX Upgrade L-mode plasmas and comparisons with various turbulence measurements. Particular emphasis is given to density fluctuation spectra, which are experimentally accessible via Doppler reflectometry. A sophisticated synthetic diagnostic involving a full-wave code has recently been established; it resolves the long-standing question of the different spectral roll-overs in gyrokinetic and measured spectra, as well as the potentially different power laws in the O- and X-mode signals. The demonstrated agreement furthermore extends the validation database deep into spectral space and confirms a proper coverage of the turbulence cascade physics. The flux-matched GENE simulations are then used to study the sensitivity of the latter to the main microinstability drive and to investigate the energetics at the various scales. Additionally, electron-scale-turbulence-based modifications of the high-k power law spectra in such plasmas will be presented, and their visibility in measurable signals discussed.

  10. Model based multivariable controller for large scale compression stations. Design and experimental validation on the LHC 18KW cryorefrigerator

    NASA Astrophysics Data System (ADS)

    Bonne, François; Alamir, Mazen; Bonnay, Patrick; Bradu, Benjamin

    2014-01-01

    In this paper, a multivariable model-based non-linear controller for Warm Compression Stations (WCS) is proposed. The strategy is to replace all the PID loops controlling the WCS with an optimally designed model-based multivariable loop. This new strategy leads to high stability and fast disturbance rejection such as those induced by a turbine or a compressor stop, a key-aspect in the case of large scale cryogenic refrigeration. The proposed control scheme can be used to have precise control of every pressure in normal operation or to stabilize and control the cryoplant under high variation of thermal loads (such as a pulsed heat load expected to take place in future fusion reactors such as those expected in the cryogenic cooling systems of the International Thermonuclear Experimental Reactor ITER or the Japan Torus-60 Super Advanced fusion experiment JT-60SA). The paper details how to set the WCS model up to synthesize the Linear Quadratic Optimal feedback gain and how to use it. After preliminary tuning at CEA-Grenoble on the 400W@1.8K helium test facility, the controller has been implemented on a Schneider PLC and fully tested first on the CERN's real-time simulator. Then, it was experimentally validated on a real CERN cryoplant. The efficiency of the solution is experimentally assessed using a reasonable operating scenario of start and stop of compressors and cryogenic turbines. This work is partially supported through the European Fusion Development Agreement (EFDA) Goal Oriented Training Program, task agreement WP10-GOT-GIRO.
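    The Linear Quadratic Optimal feedback gain referred to above is obtained from the discrete-time Riccati equation. A scalar toy version (not the paper's multivariable WCS synthesis, and with made-up weights) shows the essential recursion:

```python
def dlqr_gain(a, b, q, r, iters=200):
    """Scalar discrete-time LQ regulator for x[k+1] = a*x[k] + b*u[k]:
    iterate the Riccati recursion
        P <- q + a*a*P - (a*b*P)**2 / (r + b*b*P)
    to a fixed point, then return the feedback gain
        K = a*b*P / (r + b*b*P),  so that u = -K*x.
    A toy stand-in for the multivariable synthesis described above."""
    p = q
    for _ in range(iters):
        p = q + a * a * p - (a * b * p) ** 2 / (r + b * b * p)
    return a * b * p / (r + b * b * p)
```

For a = b = q = r = 1 the Riccati fixed point is the golden ratio and K = (sqrt(5) - 1)/2 ≈ 0.618, giving the stable closed loop a - b*K ≈ 0.382.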

  11. Man in space: The use of animal models

    NASA Astrophysics Data System (ADS)

    Ballard, Rodney W.; Souza, Kenneth A.

    Animals have traditionally preceded man into space. During animal and human travels in space over the past almost 30 years, numerous anatomical, physiological, and biochemical changes have been observed. In order to safely qualify humans for extended duration space missions, scientific research needs to be performed. It may be possible to achieve many of these research goals with flight crews serving as experimental subjects; however, to do this with human subjects alone is impractical. Therefore, the use of animal surrogates as experimental subjects is essential to provide the missing information on the effects of spaceflights, to validate countermeasures, and to test medical treatment techniques which will be necessary for long duration missions. This research to assure human health, safety, and productivity in future extended duration space flights will include flights on NASA's Space Shuttle, unmanned biosatellites, and the Space Station Freedom.

  12. Predicting the stochastic guiding of kinesin-driven microtubules in microfabricated tracks: a statistical-mechanics-based modeling approach.

    PubMed

    Lin, Chih-Tin; Meyhofer, Edgar; Kurabayashi, Katsuo

    2010-01-01

    Directional control of microtubule shuttles via microfabricated tracks is key to the development of controlled nanoscale mass transport by kinesin motor molecules. Here we develop and test a model to quantitatively predict the stochastic behavior of microtubule guiding when they mechanically collide with the sidewalls of lithographically patterned tracks. By taking into account appropriate probability distributions of microscopic states of the microtubule system, the model allows us to theoretically analyze the roles of collision conditions and kinesin surface densities in determining how the motion of microtubule shuttles is controlled. In addition, we experimentally observe the statistics of microtubule collision events and compare our theoretical prediction with experimental data to validate our model. The model will direct the design of future hybrid nanotechnology devices that integrate nanoscale transport systems powered by kinesin-driven molecular shuttles.

  13. Man in space: the use of animal models.

    PubMed

    Ballard, R W; Souza, K A

    1991-01-01

    Animals have traditionally preceded man into space. During animal and human travels in space over the past almost 30 years, numerous anatomical, physiological, and biochemical changes have been observed. In order to safely qualify humans for extended duration space missions, scientific research needs to be performed. It may be possible to achieve many of these research goals with flight crews serving as experimental subjects; however, to do this with human subjects alone is impractical. Therefore, the use of animal surrogates as experimental subjects is essential to provide the missing information on the effects of spaceflights, to validate countermeasures, and to test medical treatment techniques which will be necessary for long duration missions. This research to assure human health, safety, and productivity in future extended duration space flights will include flights on NASA's Space Shuttle, unmanned biosatellites, and the Space Station Freedom.

  14. Resource and environmental surveys from space with the thematic mapper in the 1980's

    NASA Technical Reports Server (NTRS)

    1976-01-01

    The selection of observation of vegetation is the primary optimization objective of the thematic mapper. The following are aspects of plans for the thematic mapper: (1) to include an appropriately modified first generation MSS in the thematic mapper mission; (2) to provide assured coverage for a minimum of six years to give agencies and other users an opportunity to justify the necessary commitment of resources for the transition into a completely valid operational phase; (3) to provide for global, direct data read-out, without the necessity for on-board data storage or dependence on foreign receiving stations; (4) to recognize the operational character of the thematic mapper after successful completion of its experimental evaluation; and (5) to combine future experimental packages with compatible orbits as part of the operational LANDSAT follow-on payloads.

  15. Outcome of the First wwPDB Hybrid/Integrative Methods Task Force Workshop.

    PubMed

    Sali, Andrej; Berman, Helen M; Schwede, Torsten; Trewhella, Jill; Kleywegt, Gerard; Burley, Stephen K; Markley, John; Nakamura, Haruki; Adams, Paul; Bonvin, Alexandre M J J; Chiu, Wah; Peraro, Matteo Dal; Di Maio, Frank; Ferrin, Thomas E; Grünewald, Kay; Gutmanas, Aleksandras; Henderson, Richard; Hummer, Gerhard; Iwasaki, Kenji; Johnson, Graham; Lawson, Catherine L; Meiler, Jens; Marti-Renom, Marc A; Montelione, Gaetano T; Nilges, Michael; Nussinov, Ruth; Patwardhan, Ardan; Rappsilber, Juri; Read, Randy J; Saibil, Helen; Schröder, Gunnar F; Schwieters, Charles D; Seidel, Claus A M; Svergun, Dmitri; Topf, Maya; Ulrich, Eldon L; Velankar, Sameer; Westbrook, John D

    2015-07-07

    Structures of biomolecular systems are increasingly computed by integrative modeling that relies on varied types of experimental data and theoretical information. We describe here the proceedings and conclusions from the first wwPDB Hybrid/Integrative Methods Task Force Workshop held at the European Bioinformatics Institute in Hinxton, UK, on October 6 and 7, 2014. At the workshop, experts in various experimental fields of structural biology, experts in integrative modeling and visualization, and experts in data archiving addressed a series of questions central to the future of structural biology. How should integrative models be represented? How should the data and integrative models be validated? What data should be archived? How should the data and models be archived? What information should accompany the publication of integrative models?

  16. Preparations for Global Precipitation Measurement(GPM)Ground Validation

    NASA Technical Reports Server (NTRS)

    Bidwell, S. W.; Bibyk, I. K.; Duming, J. F.; Everett, D. F.; Smith, E. A.; Wolff, D. B.

    2004-01-01

    The Global Precipitation Measurement (GPM) program is an international partnership led by the National Aeronautics and Space Administration (NASA) and the Japan Aerospace Exploration Agency (JAXA). GPM will improve climate, weather, and hydro-meteorological forecasts through more frequent and more accurate measurement of precipitation across the globe. This paper describes the concept and the preparations for Ground Validation within the GPM program. Ground Validation (GV) plays a critical role in the program by investigating and quantitatively assessing the errors within the satellite retrievals. These quantitative estimates of retrieval errors will assist the scientific community by bounding the errors within their research products. The two fundamental requirements of the GPM Ground Validation program are: (1) error characterization of the precipitation retrievals and (2) continual improvement of the satellite retrieval algorithms. These two driving requirements determine the measurements, instrumentation, and location for ground observations. This paper describes GV plans for estimating the systematic and random components of retrieval error and for characterizing the spatial and temporal structure of the error. This paper describes the GPM program for algorithm improvement in which error models are developed and experimentally explored to uncover the physical causes of errors within the retrievals. GPM will ensure that information gained through Ground Validation is applied to future improvements in the spaceborne retrieval algorithms. This paper discusses the potential locations for validation measurement and research, the anticipated contributions of GPM's international partners, and the interaction of Ground Validation with other GPM program elements.

  17. Threats to the Internal Validity of Experimental and Quasi-Experimental Research in Healthcare.

    PubMed

    Flannelly, Kevin J; Flannelly, Laura T; Jankowski, Katherine R B

    2018-01-01

    The article defines, describes, and discusses the seven threats to the internal validity of experiments discussed by Donald T. Campbell in his classic 1957 article: history, maturation, testing, instrument decay, statistical regression, selection, and mortality. These concepts are said to be threats to the internal validity of experiments because they pose alternate explanations for the apparent causal relationship between the independent variable and dependent variable of an experiment if they are not adequately controlled. A series of simple diagrams illustrate three pre-experimental designs and three true experimental designs discussed by Campbell in 1957 and several quasi-experimental designs described in his book written with Julian C. Stanley in 1966. The current article explains why each design controls for or fails to control for these seven threats to internal validity.

  18. INTENTION TO EXPERIMENT WITH E-CIGARETTES IN A CROSS-SECTIONAL SURVEY OF UNDERGRADUATE UNIVERSITY STUDENTS IN HUNGARY

    PubMed Central

    Pénzes, Melinda; Foley, Kristie L.; Balázs, Péter; Urbán, Róbert

    2016-01-01

    Background Electronic cigarettes are often used to promote cessation. Only a few studies have explored the motivations for e-cigarette experimentation among young adults. Objectives The goals of this study were to assess the intention to try e-cigarettes among Hungarian university students and to develop a motivational scale to measure vulnerability to e-cigarette experimentation. Methods 826 Hungarian university students completed an internet-based survey in 2013 to measure motives for trying e-cigarettes. We conducted exploratory factor analyses and identified factors that promote and deter experimentation. Logistic regression analysis was performed to test the concurrent predictive validity of the identified motivational factors, and we used these factors to predict e-cigarette experimentation, controlling for other known correlates of e-cigarette use. Results 24.9% of the participants had ever tried an e-cigarette and 17.2% of current nonsmokers had experimented with the product. Almost 11% of respondents intended to try an e-cigarette in the future, yet only 0.6% were current e-cigarette users. Six factors were identified in the motivational scale for experimentation, four that promote usage (health benefits/smoking cessation; curiosity/taste variety; perceived social norms; convenience when smoking is prohibited) and two that deter usage (chemical hazard; danger of dependence). In a logistic regression analysis, the curiosity/taste factor was the only motivational factor significantly associated with the intention to try e-cigarettes in the future. Conclusions This is the first study to test a motivational scale for e-cigarette experimentation among university students. Additional research is needed to better understand these factors and their influence on e-cigarette uptake. PMID:27159776

  19. Development and validation of LC-MS/MS method for the quantification of oxcarbazepine in human plasma using an experimental design.

    PubMed

    Srinubabu, Gedela; Ratnam, Bandaru Veera Venkata; Rao, Allam Appa; Rao, Medicherla Narasimha

    2008-01-01

    A rapid tandem mass spectrometric (MS/MS) method for the quantification of oxcarbazepine (OXB) in human plasma, using imipramine as an internal standard (IS), has been developed and validated. Chromatographic separation was achieved isocratically on a C18 reversed-phase column within 3.0 min, using a mobile phase of acetonitrile-10 mM ammonium formate (90:10 v/v) at a flow rate of 0.3 ml/min. Quantitation was achieved using multiple reaction monitoring (MRM) at the transitions m/z 253>208 and m/z 281>86 for OXB and the IS, respectively. Calibration curves were linear over the concentration range of 0.2-16 μg/ml (r>0.999), with a limit of quantification of 0.2 μg/ml. Analytical recoveries of OXB from spiked human plasma were in the range of 74.9 to 76.3%. A Plackett-Burman design was applied for screening of chromatographic and mass spectrometric factors; a factorial design was applied for optimization of the essential factors in the robustness study. A linear model was postulated and a 2³ full factorial design was employed to estimate the model coefficients for intermediate precision. More specifically, experimental design helps the researcher verify whether changes in factor values produce a statistically significant variation of the observed response. The strategy is most effective if statistical design is used in most or all stages of the screening and optimization process for future method validation of pharmacokinetic and bioequivalence studies.
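    A 2³ full factorial design of the kind used for the robustness and intermediate-precision study enumerates every combination of three factors at two coded levels (-1/+1), giving 8 runs. A minimal, generic sketch of generating such a design matrix (not the paper's specific factor assignments):

```python
from itertools import product

def full_factorial(k):
    """Coded design matrix of a 2**k full factorial: every combination
    of k factors at levels -1 and +1, one row per experimental run."""
    return [list(run) for run in product((-1, 1), repeat=k)]

runs = full_factorial(3)  # 8 runs; each row sets the coded level of 3 factors
```

Main effects are then estimated by contrasting the mean response at +1 against the mean response at -1 for each column of the design matrix.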

  20. Structural biomechanics of the craniomaxillofacial skeleton under maximal masticatory loading: Inferences and critical analysis based on a validated computational model.

    PubMed

    Pakdel, Amir R; Whyne, Cari M; Fialkov, Jeffrey A

    2017-06-01

    The trend towards optimizing stabilization of the craniomaxillofacial skeleton (CMFS) with the minimum amount of fixation required to achieve union, and away from maximizing rigidity, requires a quantitative understanding of craniomaxillofacial biomechanics. This study uses computational modeling to quantify the structural biomechanics of the CMFS under maximal physiologic masticatory loading. Using an experimentally validated subject-specific finite element (FE) model of the CMFS, the patterns of stress and strain distribution as a result of physiological masticatory loading were calculated. The trajectories of the stresses were plotted to delineate compressive and tensile regimes over the entire CMFS volume. The lateral maxilla was found to be the primary vertical buttress under maximal bite force loading, with much smaller involvement of the naso-maxillary buttress. There was no evidence that the pterygo-maxillary region is a buttressing structure, counter to classical buttress theory. The stresses at the zygomatic sutures suggest that two-point fixation of zygomatic complex fractures may be sufficient for fixation under bite force loading. The current experimentally validated biomechanical FE model of the CMFS is a practical tool for in silico optimization of current practice techniques and may be used as a foundation for the development of design criteria for future technologies for the treatment of CMFS injury and disease.

  1. Students' views about the nature of experimental physics

    NASA Astrophysics Data System (ADS)

    Wilcox, Bethany R.; Lewandowski, H. J.

    2017-12-01

    The physics community explores and explains the physical world through a blend of theoretical and experimental studies. The future of physics as a discipline depends on training students in both the theoretical and experimental aspects of the field. However, while student learning within lecture courses has been the subject of extensive research, lab courses remain relatively under-studied. In particular, there is little, if any, data available that address the effectiveness of physics lab courses at encouraging students to recognize the nature and importance of experimental physics within the discipline as a whole. To address this gap, we present the first large-scale, national study (N_institutions = 75 and N_students = 7167) of undergraduate physics lab courses, through analysis of students' responses to a research-validated assessment designed to investigate students' beliefs about the nature of experimental physics. We find that students often enter and leave physics lab courses with ideas about experimental physics, as practiced in their courses, that are inconsistent with the views of practicing experimental physicists, and this trend holds at both the introductory and upper-division levels. Despite this inconsistency, we find that both introductory and upper-division students are able to accurately predict the expertlike response even in cases where their views about experimentation in their lab courses disagree. These findings have implications for the recruitment, retention, and adequate preparation of students in physics.

  2. An Integrated Approach Identifies Mediators of Local Recurrence in Head and Neck Squamous Carcinoma.

    PubMed

    Citron, Francesca; Armenia, Joshua; Franchin, Giovanni; Polesel, Jerry; Talamini, Renato; D'Andrea, Sara; Sulfaro, Sandro; Croce, Carlo M; Klement, William; Otasek, David; Pastrello, Chiara; Tokar, Tomas; Jurisica, Igor; French, Deborah; Bomben, Riccardo; Vaccher, Emanuela; Serraino, Diego; Belletti, Barbara; Vecchione, Andrea; Barzan, Luigi; Baldassarre, Gustavo

    2017-07-15

    Purpose: Head and neck squamous cell carcinomas (HNSCCs) cause more than 300,000 deaths worldwide each year. Locoregional and distant recurrences represent adverse prognostic events and are accepted surrogate markers of patients' overall survival. No validated biomarkers or salvage therapies exist to identify and treat patients at high risk of recurrence. We aimed to verify whether selected miRNAs could be used as biomarkers of recurrence in HNSCC. Experimental Design: A NanoString array was used to identify miRNAs associated with locoregional recurrence in 44 patients with HNSCC. Bioinformatic approaches validated the signature and identified potential miRNA targets. Validation experiments were performed using an independent cohort of primary HNSCC samples and a panel of HNSCC cell lines. In vivo experiments validated the in vitro results. Results: Our data identified a four-miRNA signature that classified HNSCC patients at high or low risk of recurrence. These miRNAs collectively impinge on the epithelial-mesenchymal transition process. In silico and wet-lab approaches showed that miR-9, expressed at high levels in recurrent HNSCC, targets SASH1 and KRT13, whereas miR-1, miR-133, and miR-150, expressed at low levels in recurrent HNSCC, collectively target SP1 and TGFβ pathways. A six-gene signature comprising these targets likewise identified patients at high risk of recurrence. Combined pharmacological inhibition of SP1 and TGFβ pathways induced HNSCC cell death and, when administered in a timely manner, prevented recurrence formation in a preclinical model of HNSCC recurrence. Conclusions: By integrating different experimental approaches and competencies, we identified critical mediators of recurrence formation in HNSCC that may merit consideration for future clinical development. Clin Cancer Res; 23(14); 3769-80. ©2017 AACR.

  3. A Comprehensive Validation Methodology for Sparse Experimental Data

    NASA Technical Reports Server (NTRS)

    Norman, Ryan B.; Blattnig, Steve R.

    2010-01-01

    A comprehensive program of verification and validation has been undertaken to assess the applicability of models to space radiation shielding applications and to track progress as models are developed over time. The models are placed under configuration control, and automated validation tests are used so that comparisons can readily be made as models are improved. Though direct comparisons between theoretical results and experimental data are desired for validation purposes, such comparisons are not always possible due to lack of data. In this work, two uncertainty metrics are introduced that are suitable for validating theoretical models against sparse experimental databases. The nuclear physics models, NUCFRG2 and QMSFRG, are compared to an experimental database consisting of over 3600 experimental cross sections to demonstrate the applicability of the metrics. A cumulative uncertainty metric is applied to the question of overall model accuracy, while a metric based on the median uncertainty is used to analyze the models from the perspective of model development by analyzing subsets of the model parameter space.
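    The two metrics above are only named, not defined, in this record; the following is a minimal sketch of how a cumulative and a median uncertainty metric over a sparse cross-section database might look. Function names and thresholds are illustrative assumptions, not the paper's definitions.

```python
import numpy as np

def relative_difference(model, experiment):
    # elementwise |model - data| / |data| over a cross-section database
    model = np.asarray(model, float)
    experiment = np.asarray(experiment, float)
    return np.abs(model - experiment) / np.abs(experiment)

def cumulative_metric(rel_diff, thresholds):
    # fraction of the database reproduced to within each threshold:
    # a proxy for overall model accuracy
    return [float((rel_diff <= t).mean()) for t in thresholds]

def median_metric(rel_diff):
    # median relative difference: robust on sparse, outlier-prone data,
    # so it suits per-subset analysis of the model parameter space
    return float(np.median(rel_diff))
```

A cumulative curve answers "what fraction of the data does the model reproduce to within x%?", while the median gives a single robust figure per subset of the parameter space.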

  4. Experimental Characterization and Validation of Simultaneous Gust Alleviation and Energy Harvesting for Multifunctional Wing Spars

    DTIC Science & Technology

    2012-08-01

    Recoverable fragments from the report: cloud-wind and clear-sky gust simulation using the Dryden PSD (U0 = 15 m/s, Lv = 350 m); harvested energy from normal vibration; a harvested-energy control law based on limited energy constraints; and experimental validation of simultaneous energy harvesting and vibration control with multifunctional wing spars (AFOSR).

  5. Ocean acidification may aggravate social-ecological trade-offs in coastal fisheries.

    PubMed

    Voss, Rudi; Quaas, Martin F; Schmidt, Jörn O; Kapaun, Ute

    2015-01-01

    Ocean Acidification (OA) will influence marine ecosystems by changing species abundance and composition. Major effects are described for calcifying organisms, which are significantly impacted by decreasing pH values. Direct effects on commercially important fish are less well studied. The early life stages of fish populations often lack internal regulatory mechanisms to withstand the effects of abnormal pH, so negative effects can be expected on growth, survival, and recruitment success. Here we study Norwegian coastal cod, one of the few stocks for which such a negative effect has been experimentally quantified, and develop a framework for coupling experimental data on OA effects to ecological-economic fisheries models. We scale the observed physiological responses to the population level by using the experimentally determined mortality rates as part of the stock-recruitment relationship. We then use an ecological-economic optimization model to explore the potential effect of rising CO2 concentration on ecological (stock size), economic (profits), consumer-related (harvest), and social (employment) indicators, with scenarios ranging from present-day conditions up to extreme acidification. Under the assumptions of our model, yields and profits could largely be maintained under moderate OA by adapting future fishing mortality (and related effort) to the changes caused by altered pH. However, this adaptation comes at the cost of reduced stock size and employment. Explicitly visualizing these ecological, economic, and social trade-offs will help in defining realistic future objectives. Our results can be generalized to any stressor (or combination of stressors) that decreases recruitment success; the main finding, an aggravation of trade-offs, will remain valid. This seems especially relevant for coastal stocks with limited options to migrate away from unfavorable future conditions, and hence for coastal fisheries, which are often small-scale, local fisheries with limited operational ranges.
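    The coupling step described above, folding an experimentally determined early-life mortality into the stock-recruitment relationship, can be sketched as follows. The Beverton-Holt form and all parameter values here are illustrative assumptions; the record does not state the paper's actual functional forms.

```python
import numpy as np

def beverton_holt(ssb, alpha, beta):
    # Beverton-Holt stock-recruitment: R = alpha*SSB / (1 + beta*SSB)
    return alpha * ssb / (1.0 + beta * ssb)

def recruitment_under_oa(ssb, alpha, beta, oa_mortality):
    # fold an OA-induced additional instantaneous early-life mortality
    # (from lab experiments) into recruitment as an extra survival loss
    survival = np.exp(-oa_mortality)
    return beverton_holt(ssb, alpha, beta) * survival
```

Higher CO2 scenarios map to a larger `oa_mortality`, lowering recruitment at the same stock size; the ecological-economic model then translates that shift into the stock, profit, harvest, and employment trade-offs discussed above.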

  6. Chemical approaches to targeted protein degradation through modulation of the ubiquitin-proteasome pathway.

    PubMed

    Collins, Ian; Wang, Hannah; Caldwell, John J; Chopra, Raj

    2017-03-15

    Manipulation of the ubiquitin-proteasome system to achieve targeted degradation of proteins within cells using chemical tools and drugs has the potential to transform pharmacological and therapeutic approaches in cancer and other diseases. An increased understanding of the molecular mechanism of thalidomide and its analogues following their clinical use has unlocked small-molecule modulation of the substrate specificity of the E3 ligase cereblon (CRBN), which in turn has resulted in the advancement of new immunomodulatory drugs (IMiDs) into the clinic. The degradation of multiple context-specific proteins by these pleiotropic small molecules provides a means to uncover new cell biology and to generate future drug molecules against currently undruggable targets. In parallel, the development of larger bifunctional molecules that bring together highly specific protein targets in complexes with CRBN, von Hippel-Lindau, or other E3 ligases to promote ubiquitin-dependent degradation has progressed to generate selective chemical compounds with potent effects in cells and in vivo models, providing valuable tools for biological target validation and with future potential for therapeutic use. In this review, we survey recent breakthroughs achieved in these two complementary methods and the discovery of new modes of direct and indirect engagement of target proteins with the proteasome. We discuss the experimental characterisation that validates the use of molecules that promote protein degradation as chemical tools, the preclinical and clinical examples disclosed to date, and the future prospects for this exciting area of chemical biology. © 2017 The Author(s).

  7. Experimental Validation of Model Updating and Damage Detection via Eigenvalue Sensitivity Methods with Artificial Boundary Conditions

    DTIC Science & Technology

    2017-09-01

    Thesis by Matthew D. Bouwense: Experimental Validation of Model Updating and Damage Detection via Eigenvalue Sensitivity Methods with Artificial Boundary Conditions. Approved for public release; distribution unlimited.

  8. VQSEC Home Page

    Science.gov Websites

    Site fragments: the Validation and Qualification Sciences Experimental Complex (VQSEC) at Sandia; complex water impact; visitor information.

  9. Hypersonic Experimental and Computational Capability, Improvement and Validation. Volume 2

    NASA Technical Reports Server (NTRS)

    Muylaert, Jean (Editor); Kumar, Ajay (Editor); Dujarric, Christian (Editor)

    1998-01-01

    The results of the phase 2 effort conducted under AGARD Working Group 18 on Hypersonic Experimental and Computational Capability, Improvement and Validation are presented in this report. The first volume, published in May 1996, mainly focused on the design methodology, plans, and some initial results of experiments that had been conducted to serve as validation benchmarks. The current volume presents the detailed experimental and computational database developed during this effort.

  10. Experimental Design and Some Threats to Experimental Validity: A Primer

    ERIC Educational Resources Information Center

    Skidmore, Susan

    2008-01-01

    Experimental designs are distinguished as the best method to respond to questions involving causality. The purpose of the present paper is to explicate the logic of experimental design and why it is so vital to questions that demand causal conclusions. In addition, types of internal and external validity threats are discussed. To emphasize the…

  11. Mental Health Smartphone Apps: Review and Evidence-Based Recommendations for Future Developments.

    PubMed

    Bakker, David; Kazantzis, Nikolaos; Rickwood, Debra; Rickard, Nikki

    2016-03-01

    The number of mental health apps (MHapps) developed and now available to smartphone users has increased in recent years. MHapps and other technology-based solutions have the potential to play an important part in the future of mental health care; however, there is no single guide for the development of evidence-based MHapps. Many currently available MHapps lack features that would greatly improve their functionality, or include features that are not optimized. Furthermore, MHapp developers rarely conduct or publish trial-based experimental validation of their apps. Indeed, a previous systematic review revealed a complete lack of trial-based evidence for many of the hundreds of MHapps available. To guide future MHapp development, a set of clear, practical, evidence-based recommendations is presented for MHapp developers to create better, more rigorous apps. A literature review was conducted, scrutinizing research across diverse fields, including mental health interventions, preventative health, mobile health, and mobile app design. Sixteen recommendations were formulated. Evidence for each recommendation is discussed, and guidance on how these recommendations might be integrated into the overall design of an MHapp is offered. Each recommendation is rated on the basis of the strength of associated evidence. It is important to design an MHapp using a behavioral plan and interactive framework that encourages the user to engage with the app; thus, it may not be possible to incorporate all 16 recommendations into a single MHapp. Randomized controlled trials are required to validate future MHapps and the principles upon which they are designed, and to further investigate the recommendations presented in this review. Effective MHapps are required to help prevent mental health problems and to ease the burden on health systems.

  12. Mental Health Smartphone Apps: Review and Evidence-Based Recommendations for Future Developments

    PubMed Central

    Kazantzis, Nikolaos; Rickwood, Debra; Rickard, Nikki

    2016-01-01

    Background The number of mental health apps (MHapps) developed and now available to smartphone users has increased in recent years. MHapps and other technology-based solutions have the potential to play an important part in the future of mental health care; however, there is no single guide for the development of evidence-based MHapps. Many currently available MHapps lack features that would greatly improve their functionality, or include features that are not optimized. Furthermore, MHapp developers rarely conduct or publish trial-based experimental validation of their apps. Indeed, a previous systematic review revealed a complete lack of trial-based evidence for many of the hundreds of MHapps available. Objective To guide future MHapp development, a set of clear, practical, evidence-based recommendations is presented for MHapp developers to create better, more rigorous apps. Methods A literature review was conducted, scrutinizing research across diverse fields, including mental health interventions, preventative health, mobile health, and mobile app design. Results Sixteen recommendations were formulated. Evidence for each recommendation is discussed, and guidance on how these recommendations might be integrated into the overall design of an MHapp is offered. Each recommendation is rated on the basis of the strength of associated evidence. It is important to design an MHapp using a behavioral plan and interactive framework that encourages the user to engage with the app; thus, it may not be possible to incorporate all 16 recommendations into a single MHapp. Conclusions Randomized controlled trials are required to validate future MHapps and the principles upon which they are designed, and to further investigate the recommendations presented in this review. Effective MHapps are required to help prevent mental health problems and to ease the burden on health systems. PMID:26932350

  13. An eleven-year validation of a physically-based distributed dynamic ecohydrological model tRIBS+VEGGIE: Walnut Gulch Experimental Watershed

    NASA Astrophysics Data System (ADS)

    Sivandran, G.; Bisht, G.; Ivanov, V. Y.; Bras, R. L.

    2008-12-01

    A coupled, dynamic vegetation and hydrologic model, tRIBS+VEGGIE, was applied to the semiarid Walnut Gulch Experimental Watershed in Arizona. The physically-based, distributed nature of the coupled model allows for parameterization and simulation of watershed vegetation-water-energy dynamics on timescales varying from hourly to interannual. The model also allows for explicit spatial representation of processes that vary due to complex topography, such as lateral redistribution of moisture and partitioning of radiation with respect to aspect and slope. Model parameterization and forcing were conducted using readily available databases for topography, soil types, and land use cover, as well as data from the network of meteorological stations located within the Walnut Gulch watershed. To test the performance of the model, three sets of simulations were conducted over an 11-year period from 1997 to 2007. Two focused on heavily instrumented nested watersheds within the Walnut Gulch basin: (i) the Kendall watershed, dominated by annual grasses, and (ii) the Lucky Hills watershed, dominated by a mixture of deciduous and evergreen shrubs. The third set covered the entire Walnut Gulch Watershed. Model validation and performance were evaluated in three broad categories: (i) energy balance components, using the network of meteorological stations to validate the key energy fluxes; (ii) water balance components, using the network of flumes, rain gauges, and soil moisture stations installed within the watershed to validate how the model partitions moisture; and (iii) vegetation dynamics, using remote sensing products from MODIS to validate spatial and temporal vegetation dynamics. Model results demonstrate satisfactory spatial and temporal agreement with observed data, giving confidence that key ecohydrological processes can be adequately represented for future applications of tRIBS+VEGGIE in regional modeling of land-atmosphere interactions.
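    The record does not name the agreement statistics used; a common choice for this kind of hydrologic validation is the Nash-Sutcliffe efficiency alongside RMSE. The sketch below is an assumption about typical practice, not the authors' actual procedure.

```python
import numpy as np

def nash_sutcliffe(observed, simulated):
    # NSE = 1 - SSE / variance of observations:
    # 1 is a perfect fit; <= 0 is no better than predicting the mean
    obs = np.asarray(observed, float)
    sim = np.asarray(simulated, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def rmse(observed, simulated):
    # root-mean-square error in the units of the observed variable
    obs = np.asarray(observed, float)
    sim = np.asarray(simulated, float)
    return float(np.sqrt(np.mean((obs - sim) ** 2)))
```

Such metrics can be applied separately to each category above (energy fluxes, runoff and soil moisture, vegetation indices) to quantify "satisfactory agreement".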

  14. Students' views about the nature of experimental physics

    NASA Astrophysics Data System (ADS)

    Wilcox, Bethany

    2017-04-01

    The physics community explores and explains the physical world through a blend of theoretical and experimental studies. The future of physics as a discipline depends on the training of students in both the theoretical and experimental aspects of the field. However, while student learning within lecture courses has been the subject of extensive research, lab courses remain relatively under-studied. In particular, there is little, if any, data available that addresses the effectiveness of physics lab courses at encouraging students to recognize the nature and importance of experimental physics within the discipline as a whole. To address this gap, we present the first large-scale, national study (N_institutions = 71 and N_students = 7167) of undergraduate physics lab courses through analysis of students' responses to a research-validated assessment designed to investigate students' beliefs about the nature of experimental physics. We find that students often enter and leave physics lab courses with ideas about experimental physics that are inconsistent with the views of practicing experimental physicists, and this trend holds at both the introductory and upper-division levels. Despite this inconsistency, we find that both introductory and upper-division students are able to accurately predict the expert-like response even in cases where their personal views disagree. These findings have implications for the recruitment, retention, and adequate preparation of students in physics. This work was funded by the NSF-IUSE Grant No. DUE-1432204 and NSF Grant No. PHY-1125844.

  15. Measuring and Advancing Experimental Design Ability in an Introductory Course without Altering Existing Lab Curriculum†

    PubMed Central

    Shanks, Ryan A.; Robertson, Chuck L.; Haygood, Christian S.; Herdliksa, Anna M.; Herdliska, Heather R.; Lloyd, Steven A.

    2017-01-01

    Introductory biology courses provide an important opportunity to prepare students for future courses, yet existing cookbook labs, although important in their own way, fail to provide many of the advantages of semester-long research experiences. Engaging, authentic research experiences aid biology students in meeting many learning goals. Therefore, overlaying a research experience onto the existing lab structure allows faculty to overcome barriers involving curricular change. Here we propose a working model for this overlay design in an introductory biology course and detail a means to conduct this lab with minimal increases in student and faculty workloads. Furthermore, we conducted exploratory factor analysis of the Experimental Design Ability Test (EDAT) and uncovered two latent factors which provide valid means to assess this overlay model’s ability to increase advanced experimental design abilities. In a pre-test/post-test design, we demonstrate significant increases in both basic and advanced experimental design abilities in an experimental and comparison group. We measured significantly higher gains in advanced experimental design understanding in students in the experimental group. We believe this overlay model and EDAT factor analysis contribute a novel means to conduct and assess the effectiveness of authentic research experiences in an introductory course without major changes to the course curriculum and with minimal increases in faculty and student workloads. PMID:28904647
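    Pre/post gains of the kind reported above are often summarized with a normalized-gain statistic, and latent factors are extracted with exploratory factor analysis. The sketch below is a rough stand-in under those assumptions; the EDAT item data and the authors' exact EFA procedure are not reproduced here.

```python
import numpy as np

def normalized_gain(pre, post, max_score):
    # Hake-style normalized gain for pre-test/post-test designs:
    # fraction of the possible improvement actually achieved
    return (post - pre) / (max_score - pre)

def principal_factor_loadings(item_scores, n_factors=2):
    # crude stand-in for EFA: eigendecomposition of the item
    # correlation matrix, keeping the top n_factors components
    corr = np.corrcoef(np.asarray(item_scores, float), rowvar=False)
    vals, vecs = np.linalg.eigh(corr)          # ascending eigenvalues
    order = np.argsort(vals)[::-1][:n_factors]  # take the largest
    return vecs[:, order] * np.sqrt(vals[order])  # loadings (items x factors)
```

A full EFA would add rotation and a retention criterion (e.g. parallel analysis); this sketch only shows where two latent factors could come from in item-level score data.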

  16. Performance of the first Japanese large-scale facility for radon inhalation experiments with small animals.

    PubMed

    Ishimori, Yuu; Mitsunobu, Fumihiro; Yamaoka, Kiyonori; Tanaka, Hiroshi; Kataoka, Takahiro; Sakoda, Akihiro

    2011-07-01

    A radon test facility for small animals was developed in order to increase the statistical validity of differences in biological response across various radon environments. This paper describes the performance of that facility, the first large-scale facility of its kind in Japan. The facility can conduct approximately 150 mouse-scale tests at the same time. The apparatus for exposing small animals to radon has six animal chamber groups with five independent cages each, and each chamber group can be held at a different radon concentration. Because the first target of this study is to examine the in vivo behaviour of radon and its effects, the major functions for controlling radon and eliminating thoron were examined experimentally. Additionally, radon progeny concentrations and their particle size distributions in the cages were examined experimentally for consideration in future projects.

  17. Sliding contact fracture of dental ceramics: Principles and validation

    PubMed Central

    Ren, Linlin; Zhang, Yu

    2014-01-01

    Ceramic prostheses are subject to sliding contact under normal and tangential loads. Accurate prediction of the onset of fracture at two contacting surfaces holds the key to greater long-term performance of these prostheses. In this study, building on stress analysis of Hertzian contact and considering fracture criteria for linear elastic materials, a constitutive fracture mechanics relation was developed to incorporate the critical fracture load with the contact geometry, coefficient of friction and material fracture toughness. Critical loads necessary to cause fracture under a sliding indenter were calculated from the constitutive equation, and compared with the loads predicted from elastic stress analysis in conjunction with measured critical load for frictionless normal contact—a semi-empirical approach. The major predictions of the models were calibrated with experimentally determined critical loads of current and future dental ceramics after contact with a rigid spherical slider. Experimental results conform with the trends predicted by the models. PMID:24632538
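    The constitutive relation itself is not reproduced in this record; as a hedged illustration of the underlying stress analysis, the classical Hamilton-Goodman approximation for a sliding spherical contact shows how friction raises the trailing-edge tensile stress that drives cracking. The numbers and the simple form below are illustrative, not the paper's model.

```python
import math

def peak_hertz_pressure(load_n, contact_radius_m):
    # p0 = 3P / (2*pi*a^2) for a spherical (Hertzian) contact
    return 3.0 * load_n / (2.0 * math.pi * contact_radius_m ** 2)

def max_tensile_stress(load_n, contact_radius_m, mu, poisson):
    # Hamilton-Goodman trailing-edge tensile stress under sliding:
    # sigma_max = p0 * [ (1 - 2*nu)/3 + pi*mu*(4 + nu)/8 ]
    p0 = peak_hertz_pressure(load_n, contact_radius_m)
    return p0 * ((1.0 - 2.0 * poisson) / 3.0
                 + math.pi * mu * (4.0 + poisson) / 8.0)
```

Because the friction term adds to the frictionless term, the load needed to reach a given fracture stress drops as the coefficient of friction grows, consistent with the trends the models predict.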

  18. Design and Performance Evaluation of an Electro-Hydraulic Camless Engine Valve Actuator for Future Vehicle Applications

    PubMed Central

    Nam, Kanghyun; Cho, Kwanghyun; Park, Sang-Shin; Choi, Seibum B.

    2017-01-01

    This paper details the new design and dynamic simulation of an electro-hydraulic camless engine valve actuator (EH-CEVA) and its experimental verification with lift position sensors. In general, camless engine technologies are known to improve fuel efficiency, enhance power output, and reduce emissions of internal combustion engines. Electro-hydraulic valve actuators eliminate the camshaft of an existing internal combustion engine and control valve timing and valve duration independently. This paper presents a novel electro-hydraulic actuator design, dynamic simulations, and analysis based on design specifications required to satisfy the operating performance. The EH-CEVA was initially designed and modeled by means of AMESim, a hydraulic simulation package well suited to the dynamic simulation and analysis of hydraulic systems. Fundamental functions and performances of the EH-CEVA have been validated through comparisons with experimental results obtained in a prototype test bench. PMID:29258270

  19. Design and Performance Evaluation of an Electro-Hydraulic Camless Engine Valve Actuator for Future Vehicle Applications.

    PubMed

    Nam, Kanghyun; Cho, Kwanghyun; Park, Sang-Shin; Choi, Seibum B

    2017-12-18

    This paper details the new design and dynamic simulation of an electro-hydraulic camless engine valve actuator (EH-CEVA) and its experimental verification with lift position sensors. In general, camless engine technologies are known to improve fuel efficiency, enhance power output, and reduce emissions of internal combustion engines. Electro-hydraulic valve actuators eliminate the camshaft of an existing internal combustion engine and control valve timing and valve duration independently. This paper presents a novel electro-hydraulic actuator design, dynamic simulations, and analysis based on design specifications required to satisfy the operating performance. The EH-CEVA was initially designed and modeled by means of AMESim, a hydraulic simulation package well suited to the dynamic simulation and analysis of hydraulic systems. Fundamental functions and performances of the EH-CEVA have been validated through comparisons with experimental results obtained in a prototype test bench.

  20. Assessment of Experimental Uncertainty for a Floating Wind Semisubmersible under Hydrodynamic Loading: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Robertson, Amy N; Wendt, Fabian F; Jonkman, Jason

    The objective of this paper is to assess the sources of experimental uncertainty in an offshore wind validation campaign focused on better understanding the nonlinear hydrodynamic response behavior of a floating semisubmersible. The test specimen and conditions were simplified compared to other floating wind test campaigns to reduce potential sources of uncertainty and better focus on the hydrodynamic load attributes. Repeat tests were used to understand the repeatability of the test conditions and to assess the level of random uncertainty in the measurements. Attention was also given to understanding bias in all components of the test. The end goal of this work is to set uncertainty bounds on the response metrics of interest, which will be used in future work to evaluate the success of modeling tools in accurately calculating hydrodynamic loads and the associated motion responses of the system.
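    The repeat-test/bias decomposition described above follows standard (GUM-style) uncertainty practice; a minimal sketch with illustrative numbers rather than the campaign's actual data:

```python
import numpy as np

def type_a_uncertainty(repeat_measurements):
    # random (Type A) standard uncertainty of the mean from repeat tests
    r = np.asarray(repeat_measurements, float)
    return float(r.std(ddof=1) / np.sqrt(r.size))

def combined_standard_uncertainty(u_random, bias_components):
    # root-sum-square combination of random and bias (systematic) terms
    return float(np.sqrt(u_random ** 2 + np.sum(np.square(bias_components))))

def expanded_uncertainty(u_combined, coverage_factor=2.0):
    # k = 2 gives roughly 95% coverage for near-normal errors
    return coverage_factor * u_combined
```

The expanded uncertainty is what would set the "uncertainty bounds on the response metrics" against which model predictions are later judged.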

  1. Neural-network quantum state tomography

    NASA Astrophysics Data System (ADS)

    Torlai, Giacomo; Mazzola, Guglielmo; Carrasquilla, Juan; Troyer, Matthias; Melko, Roger; Carleo, Giuseppe

    2018-05-01

    The experimental realization of increasingly complex synthetic quantum systems calls for the development of general theoretical methods to validate and fully exploit quantum resources. Quantum state tomography (QST) aims to reconstruct the full quantum state from simple measurements, and therefore provides a key tool to obtain reliable analytics [1-3]. However, exact brute-force approaches to QST place a high demand on computational resources, making them unfeasible for anything except small systems [4,5]. Here we show how machine learning techniques can be used to perform QST of highly entangled states with more than a hundred qubits, to a high degree of accuracy. We demonstrate that machine learning allows one to reconstruct traditionally challenging many-body quantities, such as the entanglement entropy, from simple, experimentally accessible measurements. This approach can benefit existing and future generations of devices ranging from quantum computers to ultracold-atom quantum simulators [6-8].
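    For contrast with the machine-learning approach, the brute-force reconstruction the authors refer to can be written down exactly for a single qubit: linear inversion from Pauli expectation values. This baseline is a standard textbook construction, not the paper's method, and its parameter count grows exponentially with system size.

```python
import numpy as np

# single-qubit Pauli operators
I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

def linear_inversion(ex, ey, ez):
    # rho = (I + <X>X + <Y>Y + <Z>Z) / 2; exact for one qubit, but the
    # number of required expectation values explodes for many qubits
    return 0.5 * (I2 + ex * X + ey * Y + ez * Z)

rho = linear_inversion(0.0, 0.0, 1.0)  # measurements consistent with |0><0|
```

Neural-network QST replaces this explicit density-matrix parameterization with a compact learned representation, which is what makes hundred-qubit reconstruction tractable.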

  2. Experimental verification of a radiofrequency power model for Wi-Fi technology.

    PubMed

    Fang, Minyu; Malone, David

    2010-04-01

    When assessing the power emitted from a Wi-Fi network, it has been observed that these networks operate at a relatively low duty cycle. In this paper, we extend a recently introduced model of emitted power in Wi-Fi networks to cover conditions where devices do not always have packets to transmit. We present experimental results to validate the original model and its extension by developing approximate, but practical, testbed measurement techniques. The accuracy of the models is confirmed, with small relative errors: less than 5-10%. Moreover, we confirm that the greatest power is emitted when the network is saturated with traffic. Using this, we give a simple technique to quickly estimate power output based on traffic levels and give examples showing how this might be used in practice to predict current or future power output from a Wi-Fi network.
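    The "simple technique" for estimating power from traffic levels is not spelled out in the record; a hedged sketch of a duty-cycle-based estimate follows. The parameter names and the overhead factor are assumptions for illustration, not the paper's model.

```python
def airtime_duty_cycle(offered_load_bps, phy_rate_bps, overhead_factor=1.2):
    # fraction of time the radio transmits: payload airtime inflated by
    # a per-frame overhead factor (headers, ACKs), capped at saturation
    return min(1.0, overhead_factor * offered_load_bps / phy_rate_bps)

def mean_emitted_power_mw(tx_power_mw, duty_cycle):
    # time-averaged emitted power; a low duty cycle means the average
    # power is far below the nominal transmit power
    return tx_power_mw * duty_cycle
```

This reflects the paper's qualitative findings: average emitted power scales with airtime, and the maximum is reached only when the network saturates (duty cycle near 1).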

  3. State of the art on alternative methods to animal testing from an industrial point of view: ready for regulation?

    PubMed

    Ashton, Rachel; De Wever, Bart; Fuchs, Horst W; Gaca, Marianna; Hill, Erin; Krul, Cyrille; Poth, Albrecht; Roggen, Erwin L

    2014-01-01

    Despite changing attitudes towards animal testing and current legislation to protect experimental animals, the rate of animal experiments seems to have changed little in recent years. On May 15-16, 2013, the In Vitro Testing Industrial Platform (IVTIP) held an open meeting to discuss the state of the art in alternative methods, how companies have, can, and will need to adapt and what drives and hinders regulatory acceptance and use. Several key messages arose from the meeting. First, industry and regulatory bodies should not wait for complete suites of alternative tests to become available, but should begin working with methods available right now (e.g., mining of existing animal data to direct future studies, implementation of alternative tests wherever scientifically valid rather than continuing to rely on animal tests) in non-animal and animal integrated strategies to reduce the numbers of animals tested. Sharing of information (communication), harmonization and standardization (coordination), commitment and collaboration are all required to improve the quality and speed of validation, acceptance, and implementation of tests. Finally, we consider how alternative methods can be used in research and development before formal implementation in regulations. Here we present the conclusions on what can be done already and suggest some solutions and strategies for the future.

  4. 40 CFR 761.386 - Required experimental conditions for the validation study and subsequent use during decontamination.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... the validation study and subsequent use during decontamination. 761.386 Section 761.386 Protection of... (PCBs) MANUFACTURING, PROCESSING, DISTRIBUTION IN COMMERCE, AND USE PROHIBITIONS Comparison Study for... experimental conditions for the validation study and subsequent use during decontamination. The following...

  5. 40 CFR 761.386 - Required experimental conditions for the validation study and subsequent use during decontamination.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... the validation study and subsequent use during decontamination. 761.386 Section 761.386 Protection of... (PCBs) MANUFACTURING, PROCESSING, DISTRIBUTION IN COMMERCE, AND USE PROHIBITIONS Comparison Study for... experimental conditions for the validation study and subsequent use during decontamination. The following...

  6. Methodological convergence of program evaluation designs.

    PubMed

    Chacón-Moscoso, Salvador; Anguera, M Teresa; Sanduvete-Chaves, Susana; Sánchez-Martín, Milagrosa

    2014-01-01

    Nowadays, the confronting dichotomous view between experimental/quasi-experimental and non-experimental/ethnographic studies still exists but, despite the extensive use of non-experimental/ethnographic studies, the most systematic work on methodological quality has been developed based on experimental and quasi-experimental studies. This hinders evaluators and planners' practice of empirical program evaluation, a sphere in which the distinction between types of study is changing continually and is less clear. Based on the classical validity framework of experimental/quasi-experimental studies, we carry out a review of the literature in order to analyze the convergence of design elements in methodological quality in primary studies in systematic reviews and ethnographic research. We specify the relevant design elements that should be taken into account in order to improve validity and generalization in program evaluation practice in different methodologies from a practical methodological and complementary view. We recommend ways to improve design elements so as to enhance validity and generalization in program evaluation practice.

  7. Experimental Validation Techniques for the Heleeos Off-Axis Laser Propagation Model

    DTIC Science & Technology

    2010-03-01

    Thesis, AFIT/GAP/ENP/10-M07: "Experimental Validation Techniques for the HELEEOS Off-Axis Laser Propagation Model," John Haiducek, 1st Lt, USAF (BS, Physics), March 2010. Approved for public release; distribution unlimited. The views expressed are those of the author and do not reflect the official policy of the Department of Defense or the United States Government. Abstract: The High Energy Laser End-to-End ...

  8. Monte Carlo calculations of positron emitter yields in proton radiotherapy.

    PubMed

    Seravalli, E; Robert, C; Bauer, J; Stichelbaut, F; Kurz, C; Smeets, J; Van Ngoc Ty, C; Schaart, D R; Buvat, I; Parodi, K; Verhaegen, F

    2012-03-21

    Positron emission tomography (PET) is a promising tool for monitoring the three-dimensional dose distribution in charged particle radiotherapy. PET imaging during or shortly after proton treatment is based on the detection of annihilation photons following the β(+)-decay of radionuclides resulting from nuclear reactions in the irradiated tissue. Therapy monitoring is achieved by comparing the measured spatial distribution of irradiation-induced β(+)-activity with the predicted distribution based on the treatment plan. The accuracy of the calculated distribution depends on the correctness of the computational models, implemented in the employed Monte Carlo (MC) codes, that describe the interactions of the charged particle beam with matter and the production of radionuclides and secondary particles. However, no well-established theoretical models exist for predicting the nuclear interactions, so phenomenological models are typically used, based on parameters derived from experimental data. Unfortunately, the experimental data presently available are insufficient to validate such phenomenological hadronic interaction models. Hence, a comparison among the models used by the different MC packages is desirable. In this work, starting from a common geometry, we compare the performance of the MCNPX, GATE and PHITS MC codes in predicting the amount and spatial distribution of proton-induced activity, at therapeutic energies, to the already experimentally validated PET modelling based on the FLUKA MC code. In particular, we show how the amount of β(+)-emitters produced in tissue-like media depends on the physics model and cross-section data used to describe the proton nuclear interactions, thus calling for future experimental campaigns aimed at supporting improvements of MC modelling for clinical application of PET monitoring. © 2012 Institute of Physics and Engineering in Medicine.
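The comparison above hinges on predicted β+-emitter yields. As background, the activity of an isotope produced at a constant rate R during irradiation follows the standard buildup law A(t) = R(1 - exp(-λt)). A minimal sketch, with an invented production rate and irradiation time; the only physical constant used is the roughly 20.4-minute half-life of 11C:

```python
import math

def activity(production_rate, half_life_s, t_s):
    """Activity (decays/s) after irradiating for t_s seconds at a constant
    production rate (nuclei/s): A(t) = R * (1 - exp(-lambda * t))."""
    lam = math.log(2) / half_life_s
    return production_rate * (1.0 - math.exp(-lam * t_s))

C11_HALF_LIFE_S = 20.4 * 60  # 11C half-life, about 20.4 minutes
print(activity(1e6, C11_HALF_LIFE_S, 120.0))  # activity after a 2-minute irradiation
```

At irradiation times short compared to the half-life, only a small fraction of the eventual saturation activity has built up, which is one reason in-beam and shortly-post-beam PET measurements differ.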

  9. Role of metabolism and viruses in aflatoxin-induced liver cancer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Groopman, John D.; Kensler, Thomas W.

    The use of biomarkers in molecular epidemiology studies for identifying stages in the progression of development of the health effects of environmental agents has the potential for providing important information for critical regulatory, clinical and public health problems. Investigations of aflatoxins probably represent one of the most extensive data sets in the field and this work may serve as a template for future studies of other environmental agents. The aflatoxins are naturally occurring mycotoxins found on foods such as corn, peanuts, various other nuts and cottonseed, and they have been demonstrated to be carcinogenic in many experimental models. As a result of nearly 30 years of study, experimental data and epidemiological studies in human populations, aflatoxin B1 was classified as carcinogenic to humans by the International Agency for Research on Cancer. The long-term goal of the research described herein is the application of biomarkers to the development of preventative interventions for use in human populations at high risk for cancer. Several of the aflatoxin-specific biomarkers have been validated in epidemiological studies and are now being used as intermediate biomarkers in prevention studies. The development of these aflatoxin biomarkers has been based upon the knowledge of the biochemistry and toxicology of aflatoxins gleaned from both experimental and human studies. These biomarkers have subsequently been utilized in experimental models to provide data on the modulation of these markers under different situations of disease risk. This systematic approach provides encouragement for preventive interventions and should serve as a template for the development, validation and application of other chemical-specific biomarkers to cancer or other chronic diseases.

  10. Computational and experimental analysis of short peptide motifs for enzyme inhibition.

    PubMed

    Fu, Jinglin; Larini, Luca; Cooper, Anthony J; Whittaker, John W; Ahmed, Azka; Dong, Junhao; Lee, Minyoung; Zhang, Ting

    2017-01-01

    The metabolism of living systems involves many enzymes that play key roles as catalysts and are essential to biological function. Searching for ligands with the ability to modulate enzyme activities is central to diagnosis and therapeutics. Peptides represent a promising class of potential enzyme modulators due to their large chemical diversity and well-established methods for library synthesis. Peptides and their derivatives are found to play critical roles in modulating enzymes and mediating cellular uptake, which are increasingly valuable in therapeutics. We present a methodology that uses molecular dynamics (MD) and point-variant screening to identify short peptide motifs that are critical for inhibiting β-galactosidase (β-Gal). MD was used to simulate the conformations of peptides and to suggest short motifs that were most populated in simulated conformations. The function of the simulated motifs was further validated by experimental point-variant screening as critical segments for inhibiting the enzyme. Based on the validated motifs, we eventually identified a 7-mer short peptide that inhibits the enzyme with a low-μM IC50. The advantage of our methodology is the relatively simplified simulation, which is informative enough to identify the critical sequence of a peptide inhibitor with a precision comparable to truncation and alanine scanning experiments. Our combined experimental and computational approach does not rely on a detailed understanding of mechanistic and structural details. The MD simulation suggests the populated motifs that are consistent with the results of the experimental alanine and truncation scanning. This approach appears to be applicable to both natural and artificial peptides. As more short motifs are discovered in the future, they could be exploited for modulating biocatalysis and developing new medicines.

  11. Computational dynamics of soft machines

    NASA Astrophysics Data System (ADS)

    Hu, Haiyan; Tian, Qiang; Liu, Cheng

    2017-06-01

    A soft machine is a mechanical system made of soft materials that can complete sophisticated missions, such as handling a fragile object or crawling around a narrow tunnel corner, under low-cost control and actuation. Hence, soft machines raise great challenges for computational dynamics. In this review article, recent studies of the authors on the dynamic modeling, numerical simulation, and experimental validation of soft machines are summarized in the framework of multibody system dynamics. The dynamic modeling approaches are presented first for the geometric nonlinearities of coupled overall motions and large deformations of a soft component, the physical nonlinearities of a soft component made of hyperelastic or elastoplastic materials, and the frictional contacts/impacts of soft components, respectively. Then the computation approach is outlined for the dynamic simulation of soft machines governed by a set of differential-algebraic equations of very high dimension, with an emphasis on the efficient computation of the nonlinear elastic force vector of finite elements. The validations of the proposed approaches are given via three case studies, including the locomotion of a soft quadrupedal robot, the spinning deployment of a solar sail of a spacecraft, and the deployment of a mesh reflector of a satellite antenna, as well as the corresponding experimental studies. Finally, some remarks are made for future studies.

  12. Thermal charging study of compressed expanded natural graphite/phase change material composites

    DOE PAGES

    Mallow, Anne; Abdelaziz, Omar; Graham, Jr., Samuel

    2016-08-12

    The thermal charging performance of paraffin wax combined with compressed expanded natural graphite foam was studied for different graphite bulk densities. Constant heat fluxes between 0.39 W/cm² and 1.55 W/cm² were applied, as well as a constant boundary temperature of 60 °C. Thermal charging experiments indicate that, in the design of thermal batteries, thermal conductivity of the composite alone is an insufficient metric to determine the influence of the graphite foam on the thermal energy storage. By dividing the latent heat of the composite by the time to end of melt for each applied boundary condition, the energy storage performance was calculated to show the effects of composite thermal conductivity, graphite bulk density, and latent heat capacity. For the experimental volume, the addition of graphite beyond a graphite bulk density of 100 kg/m³ showed limited benefit on the energy storage performance due to the decrease in latent heat storage capacity. These experimental results are used to validate a numerical model to predict the time to melt and for future use in the design of heat exchangers with graphite-foam based phase change material composites. As a result, size scale effects are explored parametrically with the validated model.
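The energy-storage performance metric described above (latent heat of the composite divided by the time to end of melt) is easy to sketch. All numbers below are invented for illustration and are not the paper's measurements:

```python
def storage_performance(latent_heat_j_per_g, time_to_melt_s):
    """Energy-storage performance: latent heat divided by time to end of melt, J/(g*s)."""
    return latent_heat_j_per_g / time_to_melt_s

# Adding graphite shortens the melt time (higher conductivity) but displaces
# wax (lower latent heat), so the metric can stop improving with bulk density.
composite_low_density = storage_performance(180.0, 900.0)    # hypothetical values
composite_high_density = storage_performance(120.0, 650.0)   # hypothetical values
print(composite_low_density, composite_high_density)
```

The ratio captures the trade-off the record describes: a denser foam can melt faster yet store less, so conductivity alone does not determine the winner.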

  13. Application of Jacobian-free Newton–Krylov method in implicitly solving two-fluid six-equation two-phase flow problems: Implementation, validation and benchmark

    DOE PAGES

    Zou, Ling; Zhao, Haihua; Zhang, Hongbin

    2016-03-09

    This work represents a first-of-its-kind successful application to employ advanced numerical methods in solving realistic two-phase flow problems with the two-fluid six-equation two-phase flow model. These advanced numerical methods include a high-resolution spatial discretization scheme with staggered grids, high-order fully implicit time integration schemes, and the Jacobian-free Newton–Krylov (JFNK) method as the nonlinear solver. The computer code developed in this work has been extensively validated with existing experimental flow boiling data in vertical pipes and rod bundles, which cover wide ranges of experimental conditions, such as pressure, inlet mass flux, wall heat flux and exit void fraction. Additional code-to-code benchmark with the RELAP5-3D code further verifies the correct code implementation. The combined methods employed in this work exhibit strong robustness in solving two-phase flow problems even when phase appearance (boiling) and realistic discrete flow regimes are considered. Transitional flow regimes used in existing system analysis codes, normally introduced to overcome numerical difficulty, were completely removed in this work. As a result, this in turn provides the possibility to utilize more sophisticated flow regime maps in the future to further improve simulation accuracy.
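The JFNK method named in this record drives Newton iterations using only residual evaluations, approximating Jacobian-vector products by finite differences. The sketch below is not the paper's two-fluid solver: it solves a toy 2x2 nonlinear system and, rather than running a Krylov solver such as GMRES as production JFNK codes do, probes the Jacobian's columns with matrix-free products:

```python
import math

def F(x):
    # toy nonlinear system: circle of radius 2 intersected with the line y = x
    return [x[0] ** 2 + x[1] ** 2 - 4.0, x[0] - x[1]]

def jv(func, x, v, eps=1e-7):
    # matrix-free Jacobian-vector product: J v ~ (F(x + eps*v) - F(x)) / eps
    fx = func(x)
    fp = func([xi + eps * vi for xi, vi in zip(x, v)])
    return [(a - b) / eps for a, b in zip(fp, fx)]

def newton_jfnk(func, x, iters=25):
    for _ in range(iters):
        fx = func(x)
        # probe the Jacobian's two columns with matrix-free products
        c0 = jv(func, x, [1.0, 0.0])
        c1 = jv(func, x, [0.0, 1.0])
        det = c0[0] * c1[1] - c1[0] * c0[1]
        # solve J d = -F(x) by Cramer's rule (fine at this size)
        d0 = (-fx[0] * c1[1] + fx[1] * c1[0]) / det
        d1 = (-c0[0] * fx[1] + c0[1] * fx[0]) / det
        x = [x[0] + d0, x[1] + d1]
    return x

root = newton_jfnk(F, [1.0, 0.5])
print(root)  # both components approach sqrt(2) ~ 1.41421
```

The fixed point of the iteration is F(x) = 0 regardless of the finite-difference error, which affects only the convergence rate, not the answer.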

  14. Experimental Evaluation of Verification and Validation Tools on Martian Rover Software

    NASA Technical Reports Server (NTRS)

    Brat, Guillaume; Giannakopoulou, Dimitra; Goldberg, Allen; Havelund, Klaus; Lowry, Mike; Pasareanu, Corina; Venet, Arnaud; Visser, Willem; Washington, Rich

    2003-01-01

    We report on a study to determine the maturity of different verification and validation (V&V) technologies on a representative example of NASA flight software. The study consisted of a controlled experiment where three technologies (static analysis, runtime analysis and model checking) were compared to traditional testing with respect to their ability to find seeded errors in a prototype Mars Rover. What makes this study unique is that it is the first (to the best of our knowledge) to do a controlled experiment to compare formal-methods-based tools to testing on a realistic, industrial-size example, where the emphasis was on collecting as much data as possible on the performance of the tools and the participants. The paper includes a description of the Rover code that was analyzed and the tools used, as well as a detailed description of the experimental setup and the results. Due to the complexity of setting up the experiment, our results cannot be generalized, but we believe the study can still serve as a valuable point of reference for future studies of this kind. It did confirm our belief that advanced tools can outperform testing when trying to locate concurrency errors. Furthermore, the results of the experiment inspired a novel framework for testing the next generation of the Rover.

  15. Reverse engineering biomolecular systems using -omic data: challenges, progress and opportunities.

    PubMed

    Quo, Chang F; Kaddi, Chanchala; Phan, John H; Zollanvari, Amin; Xu, Mingqing; Wang, May D; Alterovitz, Gil

    2012-07-01

    Recent advances in high-throughput biotechnologies have led to rapidly growing research interest in reverse engineering of biomolecular systems (REBMS). 'Data-driven' approaches, i.e. data mining, can be used to extract patterns from large volumes of biochemical data at molecular-level resolution, while 'design-driven' approaches, i.e. systems modeling, can be used to simulate emergent system properties. Consequently, both data- and design-driven approaches applied to -omic data may lead to novel insights in reverse engineering biological systems that could not be expected before using low-throughput platforms. However, there exist several challenges in this fast-growing field of reverse engineering biomolecular systems: (i) to integrate heterogeneous biochemical data for data mining, (ii) to combine top-down and bottom-up approaches for systems modeling and (iii) to validate system models experimentally. In addition to reviewing progress made by the community and opportunities encountered in addressing these challenges, we explore the emerging field of synthetic biology, which is an exciting approach to validate and analyze theoretical system models directly through experimental synthesis, i.e. analysis-by-synthesis. The ultimate goal is to address the present and future challenges in reverse engineering biomolecular systems using integrated workflows of data mining, systems modeling and synthetic biology.

  16. Ongoing Fixed Wing Research within the NASA Langley Aeroelasticity Branch

    NASA Technical Reports Server (NTRS)

    Bartels, Robert; Chwalowski, Pawel; Funk, Christie; Heeg, Jennifer; Hur, Jiyoung; Sanetrik, Mark; Scott, Robert; Silva, Walter; Stanford, Bret; Wiseman, Carol

    2015-01-01

    The NASA Langley Aeroelasticity Branch is involved in a number of research programs related to fixed wing aeroelasticity and aeroservoelasticity. These ongoing efforts are summarized here, and include aeroelastic tailoring of subsonic transport wing structures, experimental and numerical assessment of truss-braced wing flutter and limit cycle oscillations, and numerical modeling of high speed civil transport configurations. Efforts devoted to verification, validation, and uncertainty quantification of aeroelastic physics in a workshop setting are also discussed. The feasibility of certain future civil transport configurations will depend on the ability to understand and control complex aeroelastic phenomena, a goal to which the Aeroelasticity Branch is well positioned to contribute through these programs.

  17. A perspective on modeling the multiscale response of energetic materials

    NASA Astrophysics Data System (ADS)

    Rice, Betsy M.

    2017-01-01

    The response of an energetic material to insult is perhaps one of the most difficult processes to model due to concurrent chemical and physical phenomena occurring over scales ranging from atomistic to continuum. Unraveling the interdependencies of these complex processes across the scales through modeling can only be done within a multiscale framework. In this paper, I will describe progress in the development of a predictive, experimentally validated multiscale reactive modeling capability for energetic materials at the Army Research Laboratory. I will also describe new challenges and research opportunities that have arisen in the course of our development which should be pursued in the future.

  18. Validation of the NIMH-ChEFS adolescent face stimulus set in an adolescent, parent, and health professional sample.

    PubMed

    Coffman, Marika C; Trubanova, Andrea; Richey, J Anthony; White, Susan W; Kim-Spoon, Jungmeen; Ollendick, Thomas H; Pine, Daniel S

    2015-12-01

    Attention to faces is a fundamental psychological process in humans, with atypical attention to faces noted across several clinical disorders. Although many clinical disorders onset in adolescence, there is a lack of well-validated stimulus sets containing adolescent faces available for experimental use. Further, the images comprising most available sets are not controlled for high- and low-level visual properties. Here, we present a cross-site validation of the National Institute of Mental Health Child Emotional Faces Picture Set (NIMH-ChEFS), comprised of 257 photographs of adolescent faces displaying angry, fearful, happy, sad, and neutral expressions. All of the direct facial images from the NIMH-ChEFS set were adjusted in terms of location of facial features and standardized for luminance, size, and smoothness. Although overall agreement between raters in this study and the original development-site raters was high (89.52%), this differed by group such that agreement was lower for adolescents relative to mental health professionals in the current study. These results suggest that future research using this face set or others of adolescent/child faces should base comparisons on similarly-aged validation data. Copyright © 2015 John Wiley & Sons, Ltd.

  19. Frequency Response Function Based Damage Identification for Aerospace Structures

    NASA Astrophysics Data System (ADS)

    Oliver, Joseph Acton

    Structural health monitoring technologies continue to be pursued for aerospace structures in the interests of increased safety and, when combined with health prognosis, efficiency in life-cycle management. The current dissertation develops and validates damage identification technology as a critical component for structural health monitoring of aerospace structures and, in particular, composite unmanned aerial vehicles. The primary innovation is a statistical least-squares damage identification algorithm based on concepts of parameter estimation and model update. The algorithm uses frequency-response-function-based residual force vectors derived from distributed vibration measurements to update a structural finite element model through statistically weighted least-squares minimization, producing location and quantification of the damage, estimation uncertainty, and an updated model. Advantages compared to other approaches include robust applicability to systems which are heavily damped, large, and noisy, with a relatively low number of distributed measurement points compared to the number of analytical degrees-of-freedom of an associated analytical structural model (e.g., modal finite element model). Motivation, research objectives, and a dissertation summary are discussed in Chapter 1, followed by a literature review in Chapter 2. Chapter 3 gives background theory and the damage identification algorithm derivation, followed by a study of fundamental algorithm behavior on a two degree-of-freedom mass-spring system with generalized damping. Chapter 4 investigates the impact of noise, then successfully proves the algorithm against competing methods using an analytical eight degree-of-freedom mass-spring system with non-proportional structural damping. 
Chapter 5 extends use of the algorithm to finite element models, including solutions for numerical issues, approaches for modeling damping approximately in reduced coordinates, and analytical validation using a composite sandwich plate model. Chapter 6 presents the final extension to experimental systems, including methods for initial baseline correlation and data reduction, and validates the algorithm on an experimental composite plate with impact damage. The final chapter deviates from development and validation of the primary algorithm to discuss development of an experimental scaled-wing test bed as part of a collaborative effort for developing structural health monitoring and prognosis technology. The dissertation concludes with an overview of technical conclusions and recommendations for future work.
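The statistically weighted least-squares update at the heart of such an algorithm reduces, for a linear sensitivity model, to solving the normal equations (A^T W A) theta = A^T W r. The sketch below is a generic two-parameter illustration with invented numbers, not the dissertation's FRF residual-force formulation:

```python
def weighted_lsq(A, r, w):
    """Solve the two-parameter weighted normal equations (A^T W A) theta = A^T W r."""
    n00 = n01 = n11 = b0 = b1 = 0.0
    for (a0, a1), ri, wi in zip(A, r, w):
        n00 += wi * a0 * a0
        n01 += wi * a0 * a1
        n11 += wi * a1 * a1
        b0 += wi * a0 * ri
        b1 += wi * a1 * ri
    det = n00 * n11 - n01 * n01
    return [(n11 * b0 - n01 * b1) / det, (n00 * b1 - n01 * b0) / det]

# three hypothetical measurement points, two candidate damage parameters
A = [(1.0, 0.0), (0.0, 1.0), (1.0, 1.0)]  # sensitivity rows
r = [0.10, 0.00, 0.10]                    # residual-force entries
w = [1.0, 1.0, 4.0]                       # inverse noise variances (weights)
print(weighted_lsq(A, r, w))  # ~[0.1, 0.0]: damage localized to the first parameter
```

Weighting by inverse noise variance is what makes the estimate "statistical": noisy measurement points contribute less to the damage location and quantification.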

  20. An experimentally validated network of nine haematopoietic transcription factors reveals mechanisms of cell state stability

    PubMed Central

    Schütte, Judith; Wang, Huange; Antoniou, Stella; Jarratt, Andrew; Wilson, Nicola K; Riepsaame, Joey; Calero-Nieto, Fernando J; Moignard, Victoria; Basilico, Silvia; Kinston, Sarah J; Hannah, Rebecca L; Chan, Mun Chiang; Nürnberg, Sylvia T; Ouwehand, Willem H; Bonzanni, Nicola; de Bruijn, Marella FTR; Göttgens, Berthold

    2016-01-01

    Transcription factor (TF) networks determine cell-type identity by establishing and maintaining lineage-specific expression profiles, yet reconstruction of mammalian regulatory network models has been hampered by a lack of comprehensive functional validation of regulatory interactions. Here, we report comprehensive ChIP-Seq, transgenic and reporter gene experimental data that have allowed us to construct an experimentally validated regulatory network model for haematopoietic stem/progenitor cells (HSPCs). Model simulation coupled with subsequent experimental validation using single cell expression profiling revealed potential mechanisms for cell state stabilisation, and also how a leukaemogenic TF fusion protein perturbs key HSPC regulators. The approach presented here should help to improve our understanding of both normal physiological and disease processes. DOI: http://dx.doi.org/10.7554/eLife.11469.001 PMID:26901438

  1. Examining the relationship between psychosocial and behavioral proxies for future consumption behavior: self-reported impact and bidding behavior in an experimental auction study on cigarette labeling

    PubMed Central

    Rousu, Matthew C.; Thrasher, James F.

    2014-01-01

    Experimental and observational research often involves asking consumers to self-report the impact of some proposed option. Because self-reported responses involve no consequence to the respondent for falsely revealing how he or she feels about an issue, self-reports may be subject to social desirability and other influences that bias responses in important ways. In this article, we analyzed data from an experiment on the impact of cigarette packaging and pack warnings, comparing smokers’ self-reported impact (four-item scale) and the bids they placed in experimental auctions to estimate differences in demand. The results were consistent across methods; however, the estimated effect size associated with different warning labels was two times greater for the four-item self-reported response scale when compared to the change in demand as indicated by auction bids. Our study provides evidence that self-reported psychosocial responses provide a valid proxy for behavioral change as reflected by experimental auction bidding behavior. More research is needed to better understand the advantages and disadvantages of behavioral economic methods and traditional self-report approaches to evaluating health behavior change interventions. PMID:24399267
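A standardized effect size of the kind compared in this study can be computed as Cohen's d, the difference in group means divided by the pooled standard deviation. The groups and ratings below are invented solely to illustrate the computation:

```python
import statistics

def cohens_d(group_a, group_b):
    """Difference in means divided by the pooled (sample) standard deviation."""
    na, nb = len(group_a), len(group_b)
    pooled_var = (((na - 1) * statistics.variance(group_a)
                   + (nb - 1) * statistics.variance(group_b))
                  / (na + nb - 2))
    return (statistics.mean(group_a) - statistics.mean(group_b)) / pooled_var ** 0.5

# hypothetical four-item-scale ratings under two warning-label conditions
self_report_graphic = [3.0, 3.5, 4.0, 4.5]
self_report_text = [2.0, 2.5, 3.0, 3.5]
print(cohens_d(self_report_graphic, self_report_text))  # ~1.55
```

Computing the same standardized measure for self-report scores and for auction bids is what allows the study's "two times greater" comparison across otherwise incommensurable outcome units.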

  2. Cutting the Wires: Modularization of Cellular Networks for Experimental Design

    PubMed Central

    Lang, Moritz; Summers, Sean; Stelling, Jörg

    2014-01-01

    Understanding naturally evolved cellular networks requires the consecutive identification and revision of the interactions between relevant molecular species. In this process, initially often simplified and incomplete networks are extended by integrating new reactions or whole subnetworks to increase consistency between model predictions and new measurement data. However, increased consistency with experimental data alone is not sufficient to show the existence of biomolecular interactions, because the interplay of different potential extensions might lead to overall similar dynamics. Here, we present a graph-based modularization approach to facilitate the design of experiments targeted at independently validating the existence of several potential network extensions. Our method is based on selecting the outputs to measure during an experiment, such that each potential network extension becomes virtually insulated from all others during data analysis. Each output defines a module that only depends on one hypothetical network extension, and all other outputs act as virtual inputs to achieve insulation. Given appropriate experimental time-series measurements of the outputs, our modules can be analyzed, simulated, and compared to the experimental data separately. Our approach exemplifies the close relationship between structural systems identification and modularization, an interplay that promises development of related approaches in the future. PMID:24411264

  3. Design, Development, and Testing of a UAV Hardware-in-the-Loop Testbed for Aviation and Airspace Prognostics Research

    NASA Technical Reports Server (NTRS)

    Kulkarni, Chetan; Teubert, Chris; Gorospe, George; Burgett, Drew; Quach, Cuong C.; Hogge, Edward

    2016-01-01

    The airspace is becoming more and more complicated, and will continue to do so in the future with the integration of Unmanned Aerial Vehicles (UAVs), autonomy, spacecraft, and other forms of aviation technology into the airspace. The new technology and complexity increase the importance and difficulty of safety assurance. Additionally, testing new technologies on complex aviation systems and systems of systems can be very difficult, expensive, and sometimes unsafe in real life scenarios. Prognostic methodology provides an estimate of the health and risks of a component, vehicle, or airspace, and knowledge of how that will change over time. That measure is especially useful in safety determination, mission planning, and maintenance scheduling. The developed testbed will be used to validate prediction algorithms for the real-time safety monitoring of the National Airspace System (NAS) and the prediction of unsafe events. The framework injects flight-related anomalies related to ground systems, routing, airport congestion, etc. to test and verify algorithms for NAS safety. In our research work, we develop a live, distributed, hardware-in-the-loop testbed for aviation and airspace prognostics, and explore further research possibilities to verify and validate future algorithms for NAS safety. The testbed integrates virtual aircraft using the X-Plane simulator and X-PlaneConnect toolbox, UAVs using onboard sensors and cellular communications, and hardware-in-the-loop components. In addition, the testbed includes an additional research framework to support and simplify future research activities. It enables safe, accurate, and inexpensive experimentation and research into airspace and vehicle prognosis that would not have been possible otherwise. This paper describes the design, development, and testing of this system. Software reliability, safety and latency are some of the critical design considerations in development of the testbed. 
Integration of HITL elements in the development phases and verification/validation are key elements of this report.

  4. CANEapp: a user-friendly application for automated next generation transcriptomic data analysis.

    PubMed

    Velmeshev, Dmitry; Lally, Patrick; Magistri, Marco; Faghihi, Mohammad Ali

    2016-01-13

    Next generation sequencing (NGS) technologies are indispensable for molecular biology research, but data analysis represents the bottleneck in their application. Users need to be familiar with computer terminal commands, the Linux environment, and various software tools and scripts. Analysis workflows have to be optimized and experimentally validated to extract biologically meaningful data. Moreover, as larger datasets are being generated, their analysis requires use of high-performance servers. To address these needs, we developed CANEapp (application for Comprehensive automated Analysis of Next-generation sequencing Experiments), a unique suite that combines a Graphical User Interface (GUI) and an automated server-side analysis pipeline that is platform-independent, making it suitable for any server architecture. The GUI runs on a PC or Mac and seamlessly connects to the server to provide full GUI control of RNA-sequencing (RNA-seq) project analysis. The server-side analysis pipeline contains a framework that is implemented on a Linux server through completely automated installation of software components and reference files. Analysis with CANEapp is also fully automated and performs differential gene expression analysis and novel noncoding RNA discovery through alternative workflows (Cuffdiff and R packages edgeR and DESeq2). We compared CANEapp to other similar tools, and it significantly improves on previous developments. We experimentally validated CANEapp's performance by applying it to data derived from different experimental paradigms and confirming the results with quantitative real-time PCR (qRT-PCR). CANEapp adapts to any server architecture by effectively using available resources and thus handles large amounts of data efficiently. CANEapp performance has been experimentally validated on various biological datasets. CANEapp is available free of charge at http://psychiatry.med.miami.edu/research/laboratory-of-translational-rna-genomics/CANE-app . 
We believe that CANEapp will serve both biologists with no computational experience and bioinformaticians as a simple, time-saving, yet accurate and powerful tool to analyze large RNA-seq datasets, and will provide foundations for future development of integrated and automated high-throughput genomics data analysis tools. Due to its inherently standardized pipeline and its combination of automated analysis and platform independence, CANEapp is ideal for large-scale collaborative RNA-seq projects between different institutions and research groups.

  5. Experimental and theoretical study of magnetohydrodynamic ship models.

    PubMed

    Cébron, David; Viroulet, Sylvain; Vidal, Jérémie; Masson, Jean-Paul; Viroulet, Philippe

    2017-01-01

    Magnetohydrodynamic (MHD) ships represent a clear demonstration of the Lorentz force in fluids, which explains the number of student practicals and exercises described on the web. However, the related literature is rather specific and no complete comparison between theory and typical small-scale experiments is currently available. This work provides, in a self-consistent framework, a detailed presentation of the relevant theoretical equations for small MHD ships and experimental measurements for future benchmarks. Theoretical results from the literature are adapted to these simple battery/magnet-powered ships moving on salt water. Comparisons between theory and experiment are performed to validate each theoretical step, such as the Tafel and Kohlrausch laws and the predicted ship speed. A successful agreement is obtained without any adjustable parameter. Finally, based on these results, an optimal design is deduced from the theory. This work therefore provides a solid theoretical and experimental ground for small-scale MHD ships, by presenting in detail several approximations and how they affect the boat's efficiency. Moreover, the theory is general enough to be adapted to other contexts, such as large-scale ships or industrial flow measurement techniques.
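The force balance behind a predicted ship speed can be sketched: Lorentz thrust F = I L B (current I crossing a channel of width L in a magnetic field B) balancing quadratic hull drag 0.5 ρ Cd A v². The constants below are illustrative guesses, not the paper's measured values:

```python
import math

def mhd_ship_speed(current_a, channel_width_m, field_t,
                   rho=1025.0, drag_coeff=1.0, wetted_area_m2=5e-3):
    """Terminal speed where Lorentz thrust I*L*B balances 0.5*rho*Cd*A*v**2."""
    thrust_n = current_a * channel_width_m * field_t
    return math.sqrt(2.0 * thrust_n / (rho * drag_coeff * wetted_area_m2))

# a small battery/magnet boat: 2 A across a 5 cm channel in a 0.1 T field
print(mhd_ship_speed(2.0, 0.05, 0.1))  # ~0.06 m/s terminal speed
```

Centimeter-per-second speeds from ampere-scale currents illustrate why the complete theory (electrochemistry via the Tafel and Kohlrausch laws, plus hydrodynamics) matters for getting quantitative agreement.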

  6. BETA (Bitter Electromagnet Testing Apparatus) Design and Testing

    NASA Astrophysics Data System (ADS)

    Bates, Evan; Birmingham, William; Rivera, William; Romero-Talamas, Carlos

    2016-10-01

    BETA is a 1 T water-cooled Bitter-type magnet system that has been designed and constructed at the Dusty Plasma Laboratory of the University of Maryland, Baltimore County, to serve as a prototype for a scaled 10 T version. The system is currently undergoing magnetic, thermal, and mechanical testing to ensure safe operating conditions and to verify the analytical design optimizations. These magnets will function as experimental tools for future dusty-plasma and collaborative experiments. An overview of the design methods used to build a custom Bitter magnet under user-defined experimental constraints is presented. The three main design methods consist of minimizing ohmic power, peak conductor temperatures, and stresses induced by Lorentz forces. We also discuss the design of BETA, which includes the magnet core, pressure vessel, cooling system, power storage bank, high-power switching system, diagnostics with safety-cutoff feedback, and data acquisition (DAQ)/magnet control Matlab code. Furthermore, we present experimental data from these diagnostics to validate our preliminary analytical design methodologies and finite element analysis calculations. BETA will contribute the knowledge necessary to finalize the 10 T magnet design.

  7. Experimental Plan for EDF Energy Creep Rabbit Graphite Irradiations- Rev. 2 (replaces Rev. 0 ORNL/TM/2013/49).

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Burchell, Timothy D

    2014-07-01

    The experimental results obtained here will assist in the development and validation of future models of irradiation-induced creep of graphite by providing the following data:
    - inert creep strain data from low to lifetime AGR fluence;
    - inert creep-property data (especially CTE) from low to lifetime AGR fluence;
    - the effect of oxidation on creep modulus (by indirect comparison with experiment 1 and direct comparison with experiment 3; NB experiments 1 and 3 are not covered here);
    - data to develop a mechanistic understanding, including: the appropriate creep modulus (including pinning and high-dose effects on structure); investigation of CTE-creep strain behavior under inert conditions; information on the effect of applied stress/creep strain on crystallite orientation (requires XRD); and the effect of creep strain on micro-porosity (requires tomography and microscopy).
    This document describes the experimental work planned to meet the requirements of the project technical specification [1] and EDF Energy requests for additional Pre-IE work. The PIE work is described in detail in this revision (Sections 8 and 9).

  8. Numerical and experimental investigations of human swimming motions

    PubMed Central

    Takagi, Hideki; Nakashima, Motomu; Sato, Yohei; Matsuuchi, Kazuo; Sanders, Ross H.

    2016-01-01

    This paper reviews unsteady flow conditions in human swimming and identifies the limitations and future potential of the current methods of analysing unsteady flow. The capability of computational fluid dynamics (CFD) has been extended from approaches assuming steady-state conditions to consideration of unsteady/transient conditions associated with the body motion of a swimmer. However, to predict hydrodynamic forces and the swimmer’s potential speeds accurately, more robust and efficient numerical methods are necessary, coupled with validation procedures, requiring detailed experimental data reflecting local flow. Experimental data obtained by particle image velocimetry (PIV) in this area are limited, because at present observations are restricted to a two-dimensional 1.0 m² area, though this could be improved if the output range of the associated laser sheet increased. Simulations of human swimming are expected to improve competitive swimming, and our review has identified two important advances relating to understanding the flow conditions affecting performance in front crawl swimming: one is a mechanism for generating unsteady fluid forces, and the other is a theory relating to increased speed and efficiency. PMID:26699925

  9. Numerical and experimental investigations of human swimming motions.

    PubMed

    Takagi, Hideki; Nakashima, Motomu; Sato, Yohei; Matsuuchi, Kazuo; Sanders, Ross H

    2016-08-01

    This paper reviews unsteady flow conditions in human swimming and identifies the limitations and future potential of the current methods of analysing unsteady flow. The capability of computational fluid dynamics (CFD) has been extended from approaches assuming steady-state conditions to consideration of unsteady/transient conditions associated with the body motion of a swimmer. However, to predict hydrodynamic forces and the swimmer's potential speeds accurately, more robust and efficient numerical methods are necessary, coupled with validation procedures, requiring detailed experimental data reflecting local flow. Experimental data obtained by particle image velocimetry (PIV) in this area are limited, because at present observations are restricted to a two-dimensional 1.0 m(2) area, though this could be improved if the output range of the associated laser sheet increased. Simulations of human swimming are expected to improve competitive swimming, and our review has identified two important advances relating to understanding the flow conditions affecting performance in front crawl swimming: one is a mechanism for generating unsteady fluid forces, and the other is a theory relating to increased speed and efficiency.

  10. Experimental and theoretical study of magnetohydrodynamic ship models

    PubMed Central

    Viroulet, Sylvain; Vidal, Jérémie; Masson, Jean-Paul; Viroulet, Philippe

    2017-01-01

    Magnetohydrodynamic (MHD) ships represent a clear demonstration of the Lorentz force in fluids, which explains the number of student practicals and exercises described on the web. However, the related literature is rather specialized, and no complete comparison between theory and typical small-scale experiments is currently available. This work provides, in a self-consistent framework, a detailed presentation of the relevant theoretical equations for small MHD ships and experimental measurements for future benchmarks. Theoretical results from the literature are adapted to these simple battery- and magnet-powered ships moving on salt water. Comparisons between theory and experiment are performed to validate each theoretical step, such as the Tafel and Kohlrausch laws and the predicted ship speed. Successful agreement is obtained without any adjustable parameter. Finally, based on these results, an optimal design is deduced from the theory. This work therefore provides a solid theoretical and experimental grounding for small-scale MHD ships, presenting in detail several approximations and how they affect the boat's efficiency. Moreover, the theory is general enough to be adapted to other contexts, such as large-scale ships or industrial flow-measurement techniques. PMID:28665941

  11. Enhancing cancer patient well-being with a nonpharmacological, heritage-focused intervention.

    PubMed

    Thomson, Linda J; Ander, Erica E; Menon, Usha; Lanceley, Anne; Chatterjee, Helen J

    2012-11-01

    Nonpharmacological, arts-focused interventions in health care have demonstrated considerable improvements in cancer patient well-being, although there is little clinically robust, empirical evidence to demonstrate the value of heritage-focused practices. This study examined the effectiveness of a novel, nonpharmacological, heritage-focused intervention with adult female inpatients receiving cancer treatment in oncology wards of a large, central London hospital. In the tactile experimental condition, participants handled and discussed a selection of museum objects with a facilitator, whereas in the visual control condition, participants discussed photographs of the same objects. Sessions were conducted on a one-to-one basis at patients' bedsides and lasted about half an hour. Quantitative measures of psychological well-being with proven reliability and validity were used in a pretest/post-test control group, quasi-experimental design. Levels of positive emotion, well-being, and happiness were significantly enhanced in the experimental condition compared with the control condition for both oncology and nononcology patients. Findings indicate a future role for heritage-focused practices in enhancing health care environments. Copyright © 2012 U.S. Cancer Pain Relief Committee. Published by Elsevier Inc. All rights reserved.

  12. STARDUST-U experiments on fluid-dynamic conditions affecting dust mobilization during LOVAs

    NASA Astrophysics Data System (ADS)

    Poggi, L. A.; Malizia, A.; Ciparisse, J. F.; Tieri, F.; Gelfusa, M.; Murari, A.; Del Papa, C.; Giovannangeli, I.; Gaudio, P.

    2016-07-01

    Since 2006 the Quantum Electronics and Plasma Physics (QEP) Research Group, together with ENEA FusTech of Frascati, has been working on dust re-suspension inside tokamaks and its potential to jeopardize the integrity of future fusion nuclear plants (i.e. ITER or DEMO) and to pose a risk to the health of operators. Currently, the team is working with the improved version of the "STARDUST" facility, "STARDUST-Upgrade" (STARDUST-U). The STARDUST-U facility has four new air-inlet ports that allow the experimental replication of Loss of Vacuum Accidents (LOVAs). An experimental campaign to measure the pressurization rates, local air velocities, and temperatures has been carried out from all the ports under different accident conditions, and the principal results are analyzed and compared with numerical simulations obtained with a CFD (Computational Fluid Dynamics) code. This preliminary thermo-fluid-dynamic analysis of the accident is crucial for numerical model development and validation, and for the upcoming experimental campaign on dust resuspension inside STARDUST-U under the well-defined accident conditions presented in this paper.

  13. Reply to Comment on ‘The motion of an arbitrarily rotating spherical projectile and its application to ball games’

    NASA Astrophysics Data System (ADS)

    Robinson, Garry; Robinson, Ian

    2014-06-01

    Jensen (2014 Phys. Scr. 89 067001) presents arguments that the expressions that we have used in our recent paper (Robinson and Robinson 2013 Phys. Scr. 88 018101) for the lift force and possibly the drag force acting on a rotating spherical projectile are dimensionally incorrect and therefore cannot be valid. We acknowledge that the alternative equations suggested by Jensen are dimensionally correct, and may well be borne out by future experimental results. However, we demonstrate that our equations are in fact also dimensionally correct, the key concept being that of having the appropriate dimensions for the multiplying constants, an extensively used practice with experimentally determined laws. After a detailed discussion of the situation, a simple illustrative example of Hooke's law for the restoring force, F, due to a mass attached to a spring displaced by a distance x from its equilibrium position is presented, where the spring constant, k, has such units as to render the equation dimensionally correct. Finally we discuss the implications of some relevant existing experimental results for the lift force.

  14. New Millenium Program Serving Earth and Space Sciences

    NASA Technical Reports Server (NTRS)

    Li, Fuk

    1999-01-01

    A cross-Enterprise program to identify and validate breakthrough flight technologies that will significantly benefit future space science and earth science missions. The breakthrough technologies enable new capabilities to meet earth and space science needs and reduce the costs of future missions. Flight validation mitigates risks to first users and enables rapid technology infusion into future missions.

  15. CFD Validation Experiment of a Mach 2.5 Axisymmetric Shock-Wave/Boundary-Layer Interaction

    NASA Technical Reports Server (NTRS)

    Davis, David O.

    2015-01-01

    Experimental investigations of specific flow phenomena, e.g., shock-wave/boundary-layer interactions (SWBLI), provide great insight into flow behavior but often lack the detail necessary to be useful as CFD validation experiments. Reasons include: (1) undefined boundary conditions, (2) inconsistent results, (3) undocumented 3D effects (CL-only measurements), and (4) lack of uncertainty analysis. While there are a number of good subsonic experimental investigations that are sufficiently documented to be considered test cases for CFD and turbulence-model validation, the number of supersonic and hypersonic cases is much smaller. This was highlighted by Settles and Dodson's [1] comprehensive review of available supersonic and hypersonic experimental studies. In all, several hundred studies were considered for their database. Of these, over a hundred were subjected to rigorous acceptance criteria. Based on their criteria, only 19 (12 supersonic, 7 hypersonic) were considered of sufficient quality to be used for validation purposes. Aeschliman and Oberkampf [2] recognized the need to develop a specific methodology for experimental studies intended specifically for validation purposes.

  16. 75 FR 53371 - Liquefied Natural Gas Facilities: Obtaining Approval of Alternative Vapor-Gas Dispersion Models

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-08-31

    ... factors as the approved models, are validated by experimental test data, and receive the Administrator's... stage of the MEP involves applying the model against a database of experimental test cases including..., particularly the requirement for validation by experimental test data. That guidance is based on the MEP's...

  17. Validating the energy transport modeling of the DIII-D and EAST ramp up experiments using TSC

    NASA Astrophysics Data System (ADS)

    Liu, Li; Guo, Yong; Chan, Vincent; Mao, Shifeng; Wang, Yifeng; Pan, Chengkang; Luo, Zhengping; Zhao, Hailin; Ye, Minyou

    2017-06-01

    The confidence in ramp up scenario design of the China Fusion Engineering Test Reactor (CFETR) can be significantly enhanced by using validated transport models to predict the current and temperature profiles. In the tokamak simulation code (TSC), two semi-empirical energy transport models (the Coppi-Tang (CT) and BGB models) and three theory-based models (the GLF23, MMM95 and CDBM models) are investigated on CFETR-relevant ramp up discharges, including three DIII-D ITER-like ramp up discharges and one EAST ohmic discharge. For the DIII-D discharges, all the transport models yield the dynamic internal inductance ℓi within ±0.15, except at some time points where the experimental fluctuation is very strong. All the models agree with the experimental βp, except that the CT model strongly overestimates βp in the first half of the ramp up phase. When the CT, CDBM and GLF23 models are applied to estimate the internal flux, they show maximum deviations of more than 10% because of inaccuracies in the temperature profile predictions, while the BGB model performs best on the internal flux. Although all the models fall short in reproducing the dynamic ℓi evolution for the EAST tokamak, the result of the BGB model is the closest to the experimental ℓi. Based on these comparisons, we conclude that the BGB model is the most consistent among these models for simulating CFETR ohmic ramp up. The CT model, with improvements for better simulation of the temperature profiles in the first half of the ramp up phase, will also be attractive. For the MMM95, GLF23 and CDBM models, better prediction of the edge temperature will improve the confidence for CFETR L-mode simulation. Conclusive validation of any transport model will require extensive future investigation covering a larger variety of discharges.

  18. Modeling the role of environmental variables on the population dynamics of the malaria vector Anopheles gambiae sensu stricto

    PubMed Central

    2012-01-01

    Background The impact of weather and climate on malaria transmission has attracted considerable attention in recent years, yet uncertainties around future disease trends under climate change remain. Mathematical models provide powerful tools for addressing such questions and understanding the implications for interventions and eradication strategies, but these require realistic modeling of the vector population dynamics and its response to environmental variables. Methods Published and unpublished field and experimental data are used to develop new formulations for modeling the relationships between key aspects of vector ecology and environmental variables. These relationships are integrated within a validated deterministic model of Anopheles gambiae s.s. population dynamics to provide a valuable tool for understanding vector response to biotic and abiotic variables. Results A novel, parsimonious framework for assessing the effects of rainfall, cloudiness, wind speed, desiccation, temperature, relative humidity and density-dependence on vector abundance is developed, allowing ease of construction, analysis, and integration into malaria transmission models. Model validation shows good agreement with longitudinal vector abundance data from Tanzania, suggesting that recent malaria reductions in certain areas of Africa could be due to changing environmental conditions affecting vector populations. Conclusions Mathematical models provide a powerful, explanatory means of understanding the role of environmental variables on mosquito populations and hence for predicting future malaria transmission under global change. The framework developed provides a valuable advance in this respect, but also highlights key research gaps that need to be resolved if we are to better understand future malaria risk in vulnerable communities. PMID:22877154

  19. Numerical Modeling of Propellant Boil-Off in a Cryogenic Storage Tank

    NASA Technical Reports Server (NTRS)

    Majumdar, A. K.; Steadman, T. E.; Maroney, J. L.; Sass, J. P.; Fesmire, J. E.

    2007-01-01

    A numerical model to predict boil-off of stored propellant in large spherical cryogenic tanks has been developed. Accurate prediction of tank boil-off rates for different thermal insulation systems was the goal of this collaborative effort. The Generalized Fluid System Simulation Program, integrating flow analysis and conjugate heat transfer for solving complex fluid system problems, was used to create the model. Calculation of the tank boil-off rate requires simultaneous simulation of heat transfer processes among the liquid propellant, the vapor ullage space, and the tank structure. The reference tank for the boil-off model was the 850,000 gallon liquid hydrogen tank at Launch Complex 39B (LC-39B) at Kennedy Space Center, which is under study for future infrastructure improvements to support the Constellation program. The methodology employed in the numerical model was validated using a sub-scale model and tank. Experimental test data from a 1/15th scale version of the LC-39B tank using both liquid hydrogen and liquid nitrogen were used to anchor the analytical predictions of the sub-scale model. Favorable correlations between the sub-scale model and experimental test data have provided confidence in full-scale tank boil-off predictions. These methods are now being used in the preliminary design for other cases, including future launch vehicles.
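    At its core, any boil-off estimate is an energy balance: at steady state, the heat leaking through the insulation vaporizes liquid at a rate set by the latent heat. The following is a minimal sketch, not the GFSSP model itself; it assumes a constant heat leak and an approximate latent heat of vaporization for liquid hydrogen of about 446 kJ/kg near atmospheric pressure.

```python
def boiloff_rate_kg_per_day(heat_leak_W, h_fg_J_per_kg=446e3):
    """Steady-state boil-off: every joule leaking into the tank vaporizes
    saturated liquid, so the mass loss rate is m_dot = Q / h_fg.
    h_fg ~ 446 kJ/kg is an assumed (not measured) latent heat for
    liquid hydrogen near atmospheric pressure."""
    return heat_leak_W / h_fg_J_per_kg * 86400.0  # convert kg/s to kg/day

# Hypothetical 1 kW total heat leak through the insulation system:
rate = boiloff_rate_kg_per_day(1000.0)
```

    The full model adds what this sketch omits: conjugate heat transfer through the tank structure and the coupling between the liquid and the vapor ullage space.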

  20. Long term corrosion on T91 and AISI 316L steel in flowing lead alloy and corrosion protection barrier development: Experiments and models

    NASA Astrophysics Data System (ADS)

    Weisenburger, A.; Schroer, C.; Jianu, A.; Heinzel, A.; Konys, J.; Steiner, H.; Müller, G.; Fazio, C.; Gessi, A.; Babayan, S.; Kobzova, A.; Martinelli, L.; Ginestar, K.; Balbaud-Célerier, F.; Martín-Muñoz, F. J.; Soler Crespo, L.

    2011-08-01

    Considering the status of knowledge on corrosion and corrosion protection, and especially the need for long-term compatibility data for structural materials in HLM, a set of experiments to generate reliable long-term data was defined and performed. The long-term corrosion behaviour of the two structural materials foreseen in ADS, 316L and T91, was investigated over the design-relevant temperature range, i.e. from 300 to 550 °C. The operational window of the two steels in this temperature range was identified, and all oxidation data were used to develop and validate models of oxide-scale growth in PbBi. A mechanistic model capable of predicting the oxidation rate, using some experimentally fitted parameters, has been developed. This model assumes parabolic oxidation and might be used for design- and safety-relevant investigations in the future. Studies on corrosion-barrier development made it possible to define the Al content required for the formation of thin alumina scales in LBE. These results, as well as future steps and required improvements, are discussed. Variation of the experimental conditions clearly showed that specific care has to be taken with respect to local flow conditions and oxygen concentrations.
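    The parabolic oxidation assumption mentioned above can be stated compactly: the oxide thickness grows with the square root of exposure time. A minimal sketch follows, with a hypothetical rate constant kp; the experimentally fitted parameters of the actual model are not given in this record.

```python
from math import sqrt

def oxide_thickness(time_h, kp):
    """Parabolic oxidation law: delta^2 = kp * t, so delta = sqrt(kp * t).
    kp is an experimentally fitted rate constant (hypothetical here);
    doubling the exposure time grows the scale by only sqrt(2)."""
    return sqrt(kp * time_h)

# Hypothetical kp = 0.01 um^2/h over 10,000 h of exposure:
delta_um = oxide_thickness(10000.0, 0.01)
```

    The square-root slowdown is why parabolic kinetics are favorable for long-term service: most of the scale growth happens early in life.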

  1. Experimental investigation of an RNA sequence space

    NASA Technical Reports Server (NTRS)

    Lee, Youn-Hyung; Dsouza, Lisa; Fox, George E.

    1993-01-01

    Modern rRNAs are the historic consequence of an ongoing evolutionary exploration of a sequence space. These extant sequences belong to a special subset of the sequence space that is comprised only of those primary sequences that can validly perform the biological function(s) required of the particular RNA. If it were possible to readily identify all such valid sequences, stochastic predictions could be made about the relative likelihood of various evolutionary pathways available to an RNA. Herein an experimental system which can assess whether a particular sequence is likely to have validity as a eubacterial 5S rRNA is described. A total of ten naturally occurring, and hence known to be valid, sequences and two point mutants of unknown validity were used to test the usefulness of the approach. Nine of the ten valid sequences tested positive whereas both mutants tested as clearly defective. The tenth valid sequence gave results that would be interpreted as reflecting a borderline status were the answer not known. These results demonstrate that it is possible to experimentally determine which sequences in local regions of the sequence space are potentially valid 5S rRNAs.

  2. Experimental Validation and Combustion Modeling of a JP-8 Surrogate in a Single Cylinder Diesel Engine

    DTIC Science & Technology

    2014-04-15

    Shrestha, Amit; Joshi, Umashankar; Zheng, Ziliang; Badawy, Tamer; Henein, Naeim A. (Wayne State University, Detroit, MI, USA). Report dated 13-03-2014, titled "Experimental Validation and Combustion Modeling of a JP-8 Surrogate in a Single Cylinder Diesel Engine." The objective is to validate a two-component JP-8 surrogate in a single-cylinder diesel engine; validation parameters include ignition delay.

  3. Ocean power technology design optimization

    DOE PAGES

    van Rij, Jennifer; Yu, Yi -Hsiang; Edwards, Kathleen; ...

    2017-07-18

    For this study, the National Renewable Energy Laboratory (NREL) and Ocean Power Technologies (OPT) conducted a collaborative code validation and design optimization study for OPT's PowerBuoy wave energy converter (WEC). NREL utilized WEC-Sim, an open-source WEC simulator, to compare four design variations of OPT's PowerBuoy. As an input to the WEC-Sim models, viscous drag coefficients for the PowerBuoy floats were first evaluated using computational fluid dynamics. The resulting WEC-Sim PowerBuoy models were then validated with experimental power output and fatigue load data provided by OPT. The validated WEC-Sim models were then used to simulate the power performance and loads for operational conditions, extreme conditions, and directional waves, for each of the four PowerBuoy design variations, assuming the wave environment of Humboldt Bay, California. Finally, ratios of power-to-weight, power-to-fatigue-load, power-to-maximum-extreme-load, power-to-water-plane-area, and power-to-wetted-surface-area were used to make a final comparison of the potential PowerBuoy WEC designs. The design comparison methodologies developed and presented in this study are applicable to other WEC devices and may be useful as a framework for future WEC design development projects.

  4. Modelling dimercaptosuccinic acid (DMSA) plasma kinetics in humans.

    PubMed

    van Eijkeren, Jan C H; Olie, J Daniël N; Bradberry, Sally M; Vale, J Allister; de Vries, Irma; Meulenbelt, Jan; Hunault, Claudine C

    2016-11-01

    No kinetic models presently exist that simulate the effect of chelation therapy on blood lead concentrations in lead poisoning. Our aim was to develop a kinetic model describing the kinetics of dimercaptosuccinic acid (DMSA; succimer), a commonly used chelating agent, that could later be used in developing a lead chelation model. This was a kinetic modelling study. We used a two-compartment model, with a non-systemic gastrointestinal compartment (gut lumen) and the whole body as one systemic compartment. The only data available from the literature were used to calibrate the unknown model parameters. The calibrated model was then validated by comparing its predictions with measured data from three different experimental human studies. The model predicted the total DMSA plasma and urine concentrations measured in three healthy volunteers after ingestion of DMSA 10 mg/kg. The model was then validated using data from three other published studies; it predicted concentrations within a factor of two, representing inter-human variability. A simple kinetic model simulating the kinetics of DMSA in humans has thus been developed and validated. The interest of this model lies in its future potential to predict blood lead concentrations in lead-poisoned patients treated with DMSA.
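    A two-compartment structure of the kind described (a gut-lumen compartment feeding a single systemic compartment) can be sketched as a pair of first-order ODEs. This is an illustrative sketch only; the rate constants ka and ke below are hypothetical, not the calibrated values from the study.

```python
def simulate_dmsa(dose_mg, ka_per_h, ke_per_h, t_end_h=24.0, dt_h=0.001):
    """Two-compartment sketch of DMSA kinetics: gut lumen -> body -> out.

        dG/dt = -ka * G          (first-order absorption from the gut)
        dB/dt =  ka * G - ke * B (first-order elimination from the body)

    Integrated with explicit Euler; returns the peak body amount (mg)
    and the time (h) at which it occurs.
    """
    gut, body = dose_mg, 0.0
    t, peak, t_peak = 0.0, 0.0, 0.0
    while t < t_end_h:
        absorbed = ka_per_h * gut * dt_h
        eliminated = ke_per_h * body * dt_h
        gut -= absorbed
        body += absorbed - eliminated
        t += dt_h
        if body > peak:
            peak, t_peak = body, t
    return peak, t_peak

# Hypothetical rate constants (NOT the calibrated study values),
# for a 10 mg/kg dose in a 70 kg adult:
peak_mg, t_peak_h = simulate_dmsa(700.0, ka_per_h=1.0, ke_per_h=0.2)
```

    For first-order absorption and elimination the peak time has the closed form t_peak = ln(ka/ke)/(ka − ke), which the numerical sketch reproduces.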

  5. Ocean power technology design optimization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    van Rij, Jennifer; Yu, Yi -Hsiang; Edwards, Kathleen

    For this study, the National Renewable Energy Laboratory and Ocean Power Technologies (OPT) conducted a collaborative code validation and design optimization study for OPT's PowerBuoy wave energy converter (WEC). NREL utilized WEC-Sim, an open-source WEC simulator, to compare four design variations of OPT's PowerBuoy. As an input to the WEC-Sim models, viscous drag coefficients for the PowerBuoy floats were first evaluated using computational fluid dynamics. The resulting WEC-Sim PowerBuoy models were then validated with experimental power output and fatigue load data provided by OPT. The validated WEC-Sim models were then used to simulate the power performance and loads for operationalmore » conditions, extreme conditions, and directional waves, for each of the four PowerBuoy design variations, assuming the wave environment of Humboldt Bay, California. And finally, ratios of power-to-weight, power-to-fatigue-load, power-to-maximum-extreme-load, power-to-water-plane-area, and power-to-wetted-surface-area were used to make a final comparison of the potential PowerBuoy WEC designs. Lastly, the design comparison methodologies developed and presented in this study are applicable to other WEC devices and may be useful as a framework for future WEC design development projects.« less

  6. Validation of the MCNP computational model for neutron flux distribution with the neutron activation analysis measurement

    NASA Astrophysics Data System (ADS)

    Tiyapun, K.; Chimtin, M.; Munsorn, S.; Somchit, S.

    2015-05-01

    The objective of this work is to demonstrate a method for validating the predictions of the calculation methods for neutron flux distribution in the irradiation tubes of the TRIGA research reactor (TRR-1/M1) using the MCNP computer code model. The reaction rates used in the experiment include the 27Al(n, α)24Na and 197Au(n, γ)198Au reactions. Aluminium (99.9 wt%) and gold (0.1 wt%) foils, and gold foils covered with cadmium, were irradiated at 9 locations in the core, referred to as CT, C8, C12, F3, F12, F22, F29, G5, and G33. The experimental results were compared to calculations performed using MCNP with a detailed geometrical model of the reactor core. The experimental and calculated normalized reaction rates in the reactor core are in good agreement for both reactions, showing that the material and geometrical properties of the reactor core are modelled very well. The results indicated that the difference between the experimental measurements and the calculations using the MCNP geometrical model of the reactor core was below 10%. In conclusion, the MCNP computational model used to calculate the neutron flux and reaction-rate distribution in the reactor core can be used for other reactor-core parameters, including neutron-spectrum calculations, dose-rate calculations, power-peaking-factor calculations, and optimization of research reactor utilization in the future, with confidence in the accuracy and reliability of the calculation.
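    Comparing normalized calculated and experimental reaction rates, as done here, amounts to forming calculated-to-experimental (C/E) ratios and checking their deviation from unity. A minimal sketch follows, using made-up reaction-rate values rather than the measured TRR-1/M1 data.

```python
def c_over_e(calculated, experimental):
    """Normalized calculated-to-experimental (C/E) reaction-rate ratios.
    Each set is normalized to its own sum first, so only the *shape*
    of the flux distribution is compared, not its absolute magnitude."""
    sc = sum(calculated)
    se = sum(experimental)
    return [(c / sc) / (e / se) for c, e in zip(calculated, experimental)]

def within_tolerance(ratios, tol=0.10):
    """True if every C/E ratio deviates from unity by at most tol (10%)."""
    return all(abs(r - 1.0) <= tol for r in ratios)

# Made-up reaction rates at three core positions (not the TRR-1/M1 data):
ratios = c_over_e([1.00, 2.00, 3.00], [1.05, 1.95, 3.00])
ok = within_tolerance(ratios)
```

    Normalizing both sets before dividing is the design choice that lets the comparison test the geometry and material model independently of the absolute source normalization.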

  7. Using Numerical Modeling to Simulate Space Capsule Ground Landings

    NASA Technical Reports Server (NTRS)

    Heymsfield, Ernie; Fasanella, Edwin L.

    2009-01-01

    Experimental work is being conducted at the National Aeronautics and Space Administration's (NASA) Langley Research Center (LaRC) to investigate the ground landing capabilities of the Orion crew exploration vehicle (CEV). The Orion capsule is NASA's replacement for the Space Shuttle; it will service the International Space Station and be used for future space missions to the Moon and to Mars. To evaluate the feasibility of Orion ground landings, a series of capsule impact tests is being performed at the NASA Langley Landing and Impact Research Facility (LandIR). The experimental results derived at LandIR provide a means to validate and calibrate nonlinear dynamic finite element models, which are also being developed during this study. Because of the high cost and time intrinsic to full-scale testing, numerical simulations are favored over experimental work. Once a numerical model has been validated against actual test responses, impact simulations will be conducted to study multiple impact scenarios not practical to test. Twenty-one swing tests using the LandIR gantry were conducted from June 2007 through October 2007 to evaluate Orion's impact response. Results for two capsule initial pitch angles, 0° and −15°, along with their computer simulations using LS-DYNA, are presented in this article. A soil-vehicle friction coefficient of 0.45 was determined by comparing the test stopping distance with computer simulations. In addition, soil-modeling accuracy is assessed by comparing vertical penetrometer impact tests with computer simulations for the soil model used during the swing tests.

  8. James Webb Space Telescope optical simulation testbed IV: linear control alignment of the primary segmented mirror

    NASA Astrophysics Data System (ADS)

    Egron, Sylvain; Soummer, Rémi; Lajoie, Charles-Philippe; Bonnefois, Aurélie; Long, Joseph; Michau, Vincent; Choquet, Elodie; Ferrari, Marc; Leboulleux, Lucie; Levecq, Olivier; Mazoyer, Johan; N'Diaye, Mamadou; Perrin, Marshall; Petrone, Peter; Pueyo, Laurent; Sivaramakrishnan, Anand

    2017-09-01

    The James Webb Space Telescope (JWST) Optical Simulation Testbed (JOST) is a tabletop experiment designed to study wavefront sensing and control for a segmented space telescope, such as JWST. With the JWST Science and Operations Center co-located at STScI, JOST was developed to provide both a platform for staff training and a means to test alternate wavefront sensing and control strategies for independent validation or future improvements beyond the baseline operations. The design of JOST reproduces the physics of JWST's three-mirror anastigmat (TMA) using three custom aspheric lenses. It provides image quality similar to JWST's (80% Strehl ratio) over a field equivalent to a NIRCam module, but at 633 nm. An Iris AO segmented mirror stands in for the segmented primary mirror of JWST. Actuators allow us to control (1) the 18 segments of the segmented mirror in piston, tip, and tilt and (2) the second lens, which stands in for the secondary mirror, in tip, tilt, and x, y, z position. We present the most recent experimental results for the segmented mirror alignment. Our implementation of the wavefront sensing (WFS) algorithms using phase diversity is tested in simulation and experimentally. The wavefront control (WFC) algorithms, which rely on a linear model for the optical aberrations induced by misalignment of the secondary lens and the segmented mirror, are tested and validated both in simulation and experimentally. In this proceeding, we present the performance of the full active-optics control loop in the presence of perturbations on the segmented mirror, and we detail the quality of the alignment correction.

  9. Ionization chamber correction factors for MR-linacs

    NASA Astrophysics Data System (ADS)

    Pojtinger, Stefan; Dohm, Oliver Steffen; Kapsch, Ralf-Peter; Thorwarth, Daniela

    2018-06-01

    Previously, readings of air-filled ionization chambers have been described as being influenced by magnetic fields. To use these chambers for dosimetry in magnetic resonance guided radiotherapy (MRgRT), this effect must be taken into account by introducing a correction factor k_B. The purpose of this study is to systematically investigate k_B for a typical reference setup for commercially available ionization chambers with different magnetic field strengths. The Monte Carlo simulation tool EGSnrc was used to simulate eight commercially available ionization chambers in magnetic fields whose magnetic flux density was in the range of 0–2.5 T. To validate the simulation, the influence of the magnetic field was experimentally determined for a PTW30013 Farmer-type chamber for magnetic flux densities between 0 and 1.425 T. Changes in the detector response of up to 8% depending on the magnetic flux density, on the chamber geometry and on the chamber orientation were obtained. In the experimental setup, a maximum deviation of less than 2% was observed when comparing measured values with simulated values. Dedicated values for two MR-linac systems (ViewRay MRIdian, ViewRay Inc, Cleveland, United States, 0.35 T / 6 MV and Elekta Unity, Elekta AB, Stockholm, Sweden, 1.5 T / 7 MV) were determined for future use in reference dosimetry. Simulated values for thimble-type chambers are in good agreement with experiments as well as with the results of previous publications. After further experimental validation, the results can be considered for the definition of standard protocols for purposes of reference dosimetry in MRgRT.
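    The correction factor above is conventionally defined as the ratio of the chamber reading without a magnetic field to the reading at flux density B, for the same delivered dose. A minimal sketch of that bookkeeping (the readings below are invented for illustration, not values from the study):

```python
def k_b(reading_no_field, reading_with_field):
    """Magnetic field correction factor: ratio of the chamber reading
    at B = 0 to the reading at flux density B, for the same dose."""
    return reading_no_field / reading_with_field

# Hypothetical readings (nC) for a Farmer-type chamber -- illustrative only:
m_0 = 1.000    # reading at B = 0 T
m_b = 0.965    # reading at B = 1.5 T (a ~3.5% response change)
print(round(k_b(m_0, m_b), 3))  # 1.036
```

    A detector whose response drops in the field thus receives a correction factor above unity, consistent with the up-to-8% response changes reported above.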

  10. Ionization chamber correction factors for MR-linacs.

    PubMed

    Pojtinger, Stefan; Dohm, Oliver Steffen; Kapsch, Ralf-Peter; Thorwarth, Daniela

    2018-06-07

    Previously, readings of air-filled ionization chambers have been described as being influenced by magnetic fields. To use these chambers for dosimetry in magnetic resonance guided radiotherapy (MRgRT), this effect must be taken into account by introducing a correction factor k_B. The purpose of this study is to systematically investigate k_B for a typical reference setup for commercially available ionization chambers with different magnetic field strengths. The Monte Carlo simulation tool EGSnrc was used to simulate eight commercially available ionization chambers in magnetic fields whose magnetic flux density was in the range of 0-2.5 T. To validate the simulation, the influence of the magnetic field was experimentally determined for a PTW30013 Farmer-type chamber for magnetic flux densities between 0 and 1.425 T. Changes in the detector response of up to 8% depending on the magnetic flux density, on the chamber geometry and on the chamber orientation were obtained. In the experimental setup, a maximum deviation of less than 2% was observed when comparing measured values with simulated values. Dedicated values for two MR-linac systems (ViewRay MRIdian, ViewRay Inc, Cleveland, United States, 0.35 T / 6 MV and Elekta Unity, Elekta AB, Stockholm, Sweden, 1.5 T / 7 MV) were determined for future use in reference dosimetry. Simulated values for thimble-type chambers are in good agreement with experiments as well as with the results of previous publications. After further experimental validation, the results can be considered for the definition of standard protocols for purposes of reference dosimetry in MRgRT.

  11. RF Performance of Membrane Aperture Shells

    NASA Technical Reports Server (NTRS)

    Flint, Eric M.; Lindler, Jason E.; Thomas, David L.; Romanofsky, Robert

    2007-01-01

    This paper provides an overview of recent results establishing the suitability of Membrane Aperture Shell Technology (MAST) for Radio Frequency (RF) applications. These single-surface shells are capable of maintaining their figure with no preload or pressurization and minimal boundary support, yet can be compactly roll-stowed and passively self-deploy. As such, they are a promising technology for enabling a future generation of RF apertures. In this paper, we review recent experimental and numerical results quantifying suitable RF performance. It is shown that candidate materials possess metallic coatings with sufficiently low surface roughness and that these materials can be efficiently fabricated into RF-relevant doubly curved shapes. A numerical justification for using a reflectivity metric, as opposed to the more standard RF designer metric of skin depth, is presented, and the resulting ability to use relatively thin coating thicknesses is experimentally validated with material sample tests. The validity of these independent film sample measurements is then confirmed through experimental results measuring RF performance for reasonably sized doubly curved apertures. Currently available best results are 22 dBi gain at 3 GHz (S-Band) for a 0.5 m aperture tested in prime focus mode, 28 dBi gain for the same antenna in the C-Band (4 to 6 GHz), and 36.8 dBi for a smaller 0.25 m antenna tested at 32 GHz in the Ka-Band. RF range test results for a segmented aperture (one possible scaling approach) are shown as well. Measured antenna system efficiencies (relative to the unachievable ideal) for these on-axis tests are generally quite good, typically ranging from 50 to 90%.
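    The quoted efficiencies can be cross-checked from the reported gains using the standard relation for a circular aperture: the ideal (100%-efficient) gain is (πD/λ)², and the aperture efficiency is the measured gain divided by this ideal. A quick check of the 0.5 m, 22 dBi, 3 GHz result (textbook antenna relations, not code from the paper):

```python
import math

def aperture_efficiency(gain_dbi, diameter_m, freq_hz):
    """Measured gain relative to the ideal gain (pi*D/lambda)^2
    of a uniformly illuminated circular aperture."""
    wavelength = 3.0e8 / freq_hz
    ideal_gain = (math.pi * diameter_m / wavelength) ** 2
    return 10 ** (gain_dbi / 10.0) / ideal_gain

# 0.5 m aperture, 22 dBi measured at 3 GHz (S-Band):
print(round(aperture_efficiency(22.0, 0.5, 3.0e9), 2))  # 0.64
```

    The resulting ~64% falls inside the 50 to 90% efficiency range quoted above.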

  12. Airborne lidar mapping of vertical ozone distributions in support of the 1990 Clean Air Act Amendments

    NASA Technical Reports Server (NTRS)

    Uthe, Edward E.; Nielsen, Norman B.; Livingston, John M.

    1992-01-01

    The 1990 Clean Air Act Amendments mandated attainment of the ozone standard established by the U.S. Environmental Protection Agency. Improved photochemical models validated by experimental data are needed to develop strategies for reducing near-surface ozone concentrations downwind of urban and industrial centers. For more than 10 years, lidar has been used on large aircraft to provide unique information on ozone distributions in the atmosphere. However, compact airborne lidar systems are needed for operation on the small aircraft typically used in regional air quality investigations to collect data with which to develop and validate air quality models. Data presented in this paper consist of a comparison between airborne differential absorption lidar (DIAL) and airborne in-situ ozone measurements. Also discussed are future plans to improve the airborne ultraviolet DIAL for ozone and other gas observations and the addition of a Fourier Transform Infrared (FTIR) emission spectrometer to investigate the effects of other gas species on vertical ozone distribution.

  13. [The isolated perfused porcine kidney model for investigations concerning surgical therapy procedures].

    PubMed

    Peters, Kristina; Michel, Maurice Stephan; Matis, Ulrike; Häcker, Axel

    2006-01-01

    Experiments to develop innovative surgical therapy procedures are conventionally conducted on animals, as crucial aspects like tissue removal and bleeding disposition cannot be investigated in vitro. Extracorporeal organ models, however, reflect these aspects and could thus substantially reduce the use of animals for this purpose in the future. The aim of this work was to validate the isolated perfused porcine kidney model with regard to its use for surgical purposes on the basis of histological and radiological procedures. The results show that neither storage nor artificial perfusion led to any structural or functional damage that would affect the quality of the organ. The kidney model is highly suitable for simulating the main aspects of renal physiology and allows constant regulation of perfusion pressure and tissue temperature. Thus, with only a moderate amount of work involved, the kidney model provides a cheap and readily available alternative to conventional animal experiments; it allows standardised experimental settings and provides valid results.

  14. Metrological analysis of a virtual flowmeter-based transducer for cryogenic helium

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Arpaia, P., E-mail: pasquale.arpaia@unina.it; Technology Department, European Organization for Nuclear Research; Girone, M., E-mail: mario.girone@cern.ch

    2015-12-15

    The metrological performance of a virtual flowmeter-based transducer for monitoring helium under cryogenic conditions is assessed. To this end, an uncertainty model of the transducer is presented, based mainly on a valve model exploiting a finite-element approach and on a virtual flowmeter model based on the Sereg-Schlumberger method. The models are validated experimentally on a case study for helium monitoring in cryogenic systems at the European Organization for Nuclear Research (CERN). The impact of uncertainty sources on the transducer's metrological performance is assessed by a sensitivity analysis, based on statistical experiment design and analysis of variance. In this way, the uncertainty sources most influencing the metrological performance of the transducer are singled out over the input range as a whole, at varying operating and setting conditions. This analysis turns out to be important for CERN cryogenics operation because the metrological design of the transducer is validated, and its components and working conditions with critical specifications for future improvements are identified.

  15. A Philosophical Perspective on Construct Validation: Application of Inductive Logic to the Analysis of Experimental Episode Construct Validity.

    ERIC Educational Resources Information Center

    Rossi, Robert Joseph

    Methods drawn from four logical theories associated with studies of inductive processes are applied to the assessment and evaluation of experimental episode construct validity. It is shown that this application provides for estimates of episode informativeness with respect to the person examined in terms of the construct and to the construct…

  16. Development and validation of an instrument to assess future orientation and resilience in adolescence.

    PubMed

    Di Maggio, Ilaria; Ginevra, Maria Cristina; Nota, Laura; Soresi, Salvatore

    2016-08-01

    The study is aimed at the development and initial validation of the Design My Future (DMF) instrument, which may be administered in career counseling and research activities to assess adolescents' future orientation and resilience. Two studies with two independent samples of Italian adolescents were conducted to examine the psychometric requisites of the DMF. Specifically, in the first study, after the items were developed and their content validity examined, the factorial structure, reliability and discriminant validity of the DMF were tested. In the second study, measurement invariance across gender was evaluated by conducting a sequence of nested CFA models. Results showed good psychometric support for the instrument with Italian adolescents. Copyright © 2016 The Foundation for Professionals in Services for Adolescents. Published by Elsevier Ltd. All rights reserved.

  17. A systematic review of the asymmetric inheritance of cellular organelles in eukaryotes: A critique of basic science validity and imprecision

    PubMed Central

    Collins, Anne; Ross, Janine

    2017-01-01

    We performed a systematic review to identify all original publications describing the asymmetric inheritance of cellular organelles in normal animal eukaryotic cells and to critique the validity and imprecision of the evidence. Searches were performed in Embase, MEDLINE and PubMed up to November 2015. Screening of titles, abstracts and full papers was performed by two independent reviewers. Data extraction and validity assessment were performed by one reviewer and checked by a second reviewer. Study quality was assessed using the SYRCLE risk of bias tool for animal studies, and by developing validity tools for the experimental model, organelle markers and imprecision. A narrative data synthesis was performed. We identified 31 studies (34 publications) of the asymmetric inheritance of organelles after mitotic or meiotic division. Studies of the asymmetric inheritance of centrosomes (n = 9), endosomes (n = 6), P granules (n = 4), the midbody (n = 3), mitochondria (n = 3), proteasomes (n = 2), spectrosomes (n = 2), cilia (n = 2) and endoplasmic reticulum (n = 2) were identified. Asymmetry was defined and quantified by variable methods. Assessment of the statistical reliability of the results indicated that only two studies (7%) were judged to be of low concern; the majority of studies (77%) were 'unclear' and five (16%) were judged to have 'high concerns', the main reason being a low number of technical repeats (<10). Assessment of model validity indicated that the majority of studies (61%) were judged to be valid, ten studies (32%) were unclear and two studies (7%) were judged to have 'high concerns'; both described 'stem cells' without providing experimental evidence (of pluripotency and self-renewal) to confirm this. Assessment of marker validity indicated that no studies were of low concern; most studies were unclear (96.5%), indicating there were insufficient details to judge if the markers were appropriate. One study had high concern for marker validity due to the contradictory results of two markers for the same organelle. For most studies the validity and imprecision of results could not be confirmed. In particular, data were limited due to a lack of reporting of interassay variability, sample size calculations, controls and functional validation of organelle markers. An evaluation of 16 systematic reviews containing cell assays found that only 50% reported adherence to PRISMA or ARRIVE reporting guidelines and only 38% reported a formal risk of bias assessment. 44% of the reviews did not consider how relevant or valid the models were to the research question, 75% did not consider how valid the markers were, and 69% did not consider the impact of the statistical reliability of the results. Future systematic reviews in basic or preclinical research should ensure the rigorous reporting of the statistical reliability of the results in addition to the validity of the methods. Increased awareness of the importance of reporting guidelines and validation tools is needed for the scientific community. PMID:28562636

  18. Digital hardware implementation of a stochastic two-dimensional neuron model.

    PubMed

    Grassia, F; Kohno, T; Levi, T

    2016-11-01

    This study explores the feasibility of stochastic neuron simulation in digital systems (FPGAs), realized as an implementation of a two-dimensional neuron model. Stochasticity is added by a source of current noise in the silicon neuron using an Ornstein-Uhlenbeck process. This approach uses digital computation to emulate individual neuron behavior using fixed-point arithmetic operations. The neuron model's computations are performed in arithmetic pipelines. It was designed in the VHDL language and simulated prior to mapping in the FPGA. The experimental results confirmed the validity of the developed stochastic FPGA implementation, which makes the silicon neuron implementation more biologically plausible for future hybrid experiments. Copyright © 2017 Elsevier Ltd. All rights reserved.
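    The Ornstein-Uhlenbeck noise current mentioned above is commonly discretized with an Euler-Maruyama step. A floating-point sketch of that process (parameter values are illustrative, not those of the cited FPGA design, which uses fixed-point arithmetic):

```python
import math
import random

def ou_step(i_t, dt, tau, mu, sigma, rng):
    """One Euler-Maruyama step of an Ornstein-Uhlenbeck process:
    dI = ((mu - I)/tau) dt + sigma * sqrt(dt) * dW."""
    return i_t + (mu - i_t) / tau * dt + sigma * math.sqrt(dt) * rng.gauss(0.0, 1.0)

rng = random.Random(0)
i, dt, tau, mu, sigma = 0.0, 1e-3, 10e-3, 0.5, 0.2
trace = []
for _ in range(20000):
    i = ou_step(i, dt, tau, mu, sigma, rng)
    trace.append(i)

# The noise current fluctuates around its long-run mean mu after a
# transient of a few time constants tau:
mean = sum(trace[5000:]) / len(trace[5000:])
```

    The mean-reverting drift plus white-noise diffusion is what distinguishes OU noise from plain Gaussian noise injection: successive samples are correlated over the timescale tau.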

  19. Progress in the RAMI analysis of a conceptual LHCD system for DEMO

    NASA Astrophysics Data System (ADS)

    Mirizzi, F.

    2014-02-01

    Reliability, Availability, Maintainability and Inspectability (RAMI) concepts and techniques, which acquired great importance during the first manned space missions, have been progressively extended to industrial, scientific and consumer equipment to assure satisfactory performance and lifetimes. In the design of experimental facilities such as tokamaks, mainly aimed at demonstrating the validity and feasibility of scientific theories, RAMI analysis has often been left aside. DEMO, the future prototype fusion reactor, will instead be designed to steadily deliver electrical energy to commercial grids, so that RAMI aspects will assume absolute relevance from its initial design phases. A preliminary RAMI analysis of the LHCD system for the conceptual EU DEMO reactor is given in the paper.

  20. A Multimodal Database for a Home Remote Medical Care Application

    NASA Astrophysics Data System (ADS)

    Medjahed, Hamid; Istrate, Dan; Boudy, Jerome; Steenkeste, François; Baldinger, Jean-Louis; Dorizzi, Bernadette

    Home remote monitoring systems aim to make a protective contribution to the well-being of individuals (patients, elderly persons) who require moderate support for independent living, and to improve their everyday lives. Existing research on these systems suffers from a lack of experimental data and of a standard medical database intended for their validation and improvement. This paper presents a multi-sensor environment for acquiring and recording a multimodal medical database, which includes physiological data (cardiac frequency, activity or agitation, posture, fall), environmental sounds and localization data. It provides graphical interface functions to manage, process and index these data. The paper focuses on the system implementation and its usage, and points out possibilities for future work.

  1. CMB constraints on running non-Gaussianity

    NASA Astrophysics Data System (ADS)

    Oppizzi, F.; Liguori, M.; Renzi, A.; Arroja, F.; Bartolo, N.

    2018-05-01

    We develop a complete set of tools for CMB forecasting, simulation and estimation of primordial running bispectra, arising from a variety of curvaton and single-field (DBI) models of Inflation. We validate our pipeline using mock CMB running non-Gaussianity realizations and test it on real data by obtaining experimental constraints on the fNL running spectral index, nNG, using WMAP 9-year data. Our final bounds (68% C.L.) read ‑0.6 < nNG < 1.4, ‑0.3 < nNG < 1.2, ‑1.1
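    The running spectral index nNG constrained here enters through a power-law scale dependence of the non-Gaussian amplitude, commonly written fNL(k) = fNL(k_piv) (k/k_piv)^nNG. A numerical illustration of that parameterization (the pivot scale and amplitude below are invented, not the paper's values):

```python
def fnl_running(k, fnl_pivot, n_ng, k_pivot=0.05):
    """Power-law running of the non-Gaussianity amplitude:
    fNL(k) = fNL(k_piv) * (k / k_piv)**n_NG."""
    return fnl_pivot * (k / k_pivot) ** n_ng

# Illustrative numbers only: with n_NG = 1, fNL doubles when k doubles,
# while n_NG = 0 recovers the usual scale-independent fNL.
print(fnl_running(0.10, 5.0, 1.0))  # 10.0
print(fnl_running(0.10, 5.0, 0.0))  # 5.0
```

    A positive nNG thus boosts non-Gaussianity on small scales, a negative one on large scales, which is what the bispectrum estimator is sensitive to.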

  2. IRD dropout study

    NASA Technical Reports Server (NTRS)

    Yalowitz, Jeffrey S.; Schroer, Michael A.; Dickson, John E., Jr.

    1992-01-01

    This final report describes work performed by SRS Technologies for the NASA Marshall Space Flight Center under Contract NAS8-39077, entitled 'Integrated Receiver-Decoder Dropout Study'. The purpose of the study was to determine causes of signal fading effects on ultra-high-frequency (UHF) range safety transmissions to the Space Shuttle during flyout. Of particular interest were deep fades observed at the External Tank (ET) Integrated Receiver-Decoder (IRD) during the flyout interval between solid rocket booster separation and ET separation. Analytical and simulation methods were employed in this study to assess observations captured in flight telemetry data records. Conclusions based on the study are presented in this report, and recommendations are given for future experimental validation of the results.

  3. Control Design and Performance Analysis for Autonomous Formation Flight Experiments

    NASA Astrophysics Data System (ADS)

    Rice, Caleb Michael

    Autonomous Formation Flight is a key approach for reducing greenhouse gas emissions and managing traffic in future high-density airspace. Unmanned Aerial Vehicles (UAVs) have made it possible to physically demonstrate and validate autonomous formation flight concepts inexpensively while eliminating flight risk to human pilots. This thesis discusses the design, implementation, and flight testing of three different formation flight control methods, Proportional-Integral-Derivative (PID), Fuzzy Logic (FL), and Nonlinear Dynamic Inversion (NLDI), and their respective performance behavior. Experimental results show achievable autonomous formation flight and performance quality with a pair of low-cost unmanned fixed-wing research aircraft and also with a solo vertical takeoff and landing (VTOL) quadrotor.
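    Of the three control methods compared, the PID law is the simplest to sketch. Below is a generic discrete-time PID regulating a commanded separation distance for a double-integrator follower; the gains, setpoint, and toy plant are illustrative assumptions, not the thesis's actual flight design:

```python
class PID:
    """Discrete PID controller: u = Kp*e + Ki*integral(e) + Kd*de/dt."""
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = None

    def update(self, error, dt):
        self.integral += error * dt
        deriv = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * deriv

# Toy plant: follower commanded to close from 25 m to a 10 m separation.
pid = PID(kp=1.2, ki=0.1, kd=0.8)
sep, vel, dt = 25.0, 0.0, 0.05
for _ in range(4000):                    # 200 s of simulated flight
    accel = pid.update(10.0 - sep, dt)   # error = commanded - actual separation
    vel += accel * dt                    # double-integrator follower dynamics
    sep += vel * dt
# sep settles near the 10 m setpoint
```

    The FL and NLDI controllers studied in the thesis replace this fixed linear law with rule-based and model-inverting control, respectively, which is where the performance comparison becomes interesting.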

  4. Can jurors recognize missing control groups, confounds, and experimenter bias in psychological science?

    PubMed

    McAuliff, Bradley D; Kovera, Margaret Bull; Nunez, Gabriel

    2009-06-01

    This study examined the ability of jury-eligible community members (N = 248) to detect internal validity threats in psychological science presented during a trial. Participants read a case summary in which an expert testified about a study that varied in internal validity (valid, missing control group, confound, and experimenter bias) and ecological validity (high, low). Ratings of expert evidence quality and expert credibility were higher for the valid versus missing control group versions only. Internal validity did not influence verdict or ratings of plaintiff credibility and no differences emerged as a function of ecological validity. Expert evidence quality, expert credibility, and plaintiff credibility were positively correlated with verdict. Implications for the scientific reasoning literature and for trials containing psychological science are discussed.

  5. [Effects of tai chi in postmenopausal women with osteoporosis: a systematic review].

    PubMed

    Chang, Ting-Jung; Ting, Yu-Ting; Sheu, Shei-Lan; Chang, Hsiao-Yun

    2014-10-01

    Tai chi has been increasingly applied in osteoporosis patients. However, systematic reviews of the efficacy of this practice have been few and of limited scope. This study reviews previous experimental research using tai chi as an intervention in postmenopausal women with osteoporosis and appraises the reported research designs, tai chi methods, and outcomes. A systematic review method was used to search 14 databases for articles published between January 1980 and July 2013. Searched keywords included: "tai chi," "osteoporosis," and "postmenopausal women". The 2,458 articles initially identified were reduced to 4 valid articles after applying the selection criteria and removing duplicates. The 4 valid articles used either a randomized clinical trial (RCT) or a controlled clinical trial (CCT) design. They were further analyzed and synthesized in terms of common variables such as balance, muscle strength, and quality of life. Three of the 4 studies identified significant pretest/posttest differences in physiological aspects of quality of life in participants but did not obtain consistent results in terms of the psychological aspects. While reports identified a significant and positive tai chi effect on balance, they all used different measurements to do so. Only one of the four studies identified significant improvement in muscle strength. Therefore, this review could not identify clear support for the effectiveness of tai chi on balance or muscle strength. This review did not definitively support the positive effects of tai chi on balance, muscle strength, and quality of life in postmenopausal women with osteoporosis. The designs used in the tai chi interventions may be referenced for future studies. We suggest that future studies use data triangulation rather than a single-item tool in order to cross-verify the same information; this may strengthen the research and increase the credibility and validity of related findings.

  6. Exploring Alternative Parameterizations for Snowfall with Validation from Satellite and Terrestrial Radars

    NASA Technical Reports Server (NTRS)

    Molthan, Andrew L.; Petersen, Walter A.; Case, Jonathan L.; Dembek, Scott R.

    2009-01-01

    Increases in computational resources have allowed operational forecast centers to pursue experimental, high-resolution simulations that resolve the microphysical characteristics of clouds and precipitation. These experiments are motivated by a desire to improve the representation of weather and climate, but will also benefit current and future satellite campaigns, which often use forecast model output to guide the retrieval process. The combination of reliable cloud microphysics and radar reflectivity may constrain radiative transfer models used in satellite simulators during future missions, including EarthCARE and the NASA Global Precipitation Measurement. Aircraft, surface and radar data from the Canadian CloudSat/CALIPSO Validation Project are used to check the validity of size distribution and density characteristics for snowfall simulated by the NASA Goddard six-class, single-moment bulk water microphysics scheme, currently available within the Weather Research and Forecasting (WRF) model. Widespread snowfall developed across the region on January 22, 2007, forced by the passage of a midlatitude cyclone, and was observed by the dual-polarimetric C-band radar at King City, Ontario, as well as the NASA 94 GHz CloudSat Cloud Profiling Radar. Combined, these data sets provide key metrics for validating model output: estimates of size distribution parameters fit to the inverse-exponential equations prescribed within the model, bulk density and crystal habit characteristics sampled by the aircraft, and representation of size characteristics as inferred from the radar reflectivity at C- and W-band. Specified constants for distribution intercept and density differ significantly from observations throughout much of the cloud depth. Alternate parameterizations are explored, using column-integrated values of vapor excess to avoid problems encountered with temperature-based parameterizations in an environment where inversions and isothermal layers are present. Simulation of CloudSat reflectivity is performed by adopting the discrete-dipole parameterizations and databases provided in the literature, and demonstrates an improved capability in simulating radar reflectivity at W-band versus Mie scattering assumptions.
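    The inverse-exponential size distributions referred to above have the form N(D) = N0 exp(−λD), so bulk quantities follow from closed-form moments of the distribution. A small sketch of the zeroth and third moments, which give total number concentration and a mass proxy (the parameter values are illustrative, not the scheme's constants):

```python
import math

def exp_dist_moment(n0, lam, k):
    """k-th moment of the inverse-exponential distribution
    N(D) = N0 * exp(-lambda * D):
    integral of D^k * N(D) dD over [0, inf) = N0 * k! / lambda**(k+1)."""
    return n0 * math.factorial(k) / lam ** (k + 1)

# Illustrative values only: N0 (intercept) in m^-4, lambda (slope) in m^-1.
n0, lam = 1.0e7, 2.0e3
total_number = exp_dist_moment(n0, lam, 0)   # particles per m^3
third_moment = exp_dist_moment(n0, lam, 3)   # ~ proportional to mass content
print(round(total_number, 1))  # 5000.0
```

    Because both the intercept N0 and the assumed bulk density feed directly into these moments, the constant-intercept assumption criticized above propagates straight into simulated number, mass, and reflectivity.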

  7. Validation of WIND for a Series of Inlet Flows

    NASA Technical Reports Server (NTRS)

    Slater, John W.; Abbott, John M.; Cavicchi, Richard H.

    2002-01-01

    Validation assessments compare WIND CFD simulations to experimental data for a series of inlet flows ranging in Mach number from low subsonic to hypersonic. The validation procedures follow the guidelines of the AIAA. The WIND code performs well in matching the available experimental data. The assessments demonstrate the use of WIND and provide confidence in its use for the analysis of aircraft inlets.

  8. Experimental Validation of a Thermoelastic Model for SMA Hybrid Composites

    NASA Technical Reports Server (NTRS)

    Turner, Travis L.

    2001-01-01

    This study presents results from experimental validation of a recently developed model for predicting the thermomechanical behavior of shape memory alloy hybrid composite (SMAHC) structures, composite structures with an embedded SMA constituent. The model captures the material nonlinearity of the material system with temperature and is capable of modeling constrained, restrained, or free recovery behavior from experimental measurement of fundamental engineering properties. A brief description of the model and analysis procedures is given, followed by an overview of a parallel effort to fabricate and characterize the material system of SMAHC specimens. Static and dynamic experimental configurations for the SMAHC specimens are described and experimental results for thermal post-buckling and random response are presented. Excellent agreement is achieved between the measured and predicted results, fully validating the theoretical model for constrained recovery behavior of SMAHC structures.

  9. Incremental Validity of Useful Field of View Subtests for the Prediction of Instrumental Activities of Daily Living

    PubMed Central

    Aust, Frederik; Edwards, Jerri D.

    2015-01-01

    Introduction: The Useful Field of View Test (UFOV®) is a cognitive measure that predicts older adults’ ability to perform a range of everyday activities. However, little is known about the individual contribution of each subtest to these predictions, and the underlying constructs of UFOV performance remain a topic of debate. Method: We investigated the incremental validity of UFOV subtests for the prediction of Instrumental Activities of Daily Living (IADL) performance in two independent datasets, the SKILL (n = 828) and ACTIVE (n = 2426) studies. We then explored the cognitive and visual abilities assessed by UFOV using a range of neuropsychological and vision tests administered in the SKILL study. Results: In the four-subtest variant of UFOV, only subtests 2 and 3 consistently made independent contributions to the prediction of IADL performance across three different behavioral measures. In all cases, the incremental validity of UFOV subtests 1 and 4 was negligible. Furthermore, we found that UFOV was related to processing speed, general non-speeded cognition, and visual function; the omission of subtests 1 and 4 from the test score did not affect these associations. Conclusions: UFOV subtests 1 and 4 appear to be of limited use to predict IADL and possibly other everyday activities. Future experimental research should investigate if shortening the UFOV by omitting these subtests is a reliable and valid assessment approach. PMID:26782018

  10. Validation of CTF Droplet Entrainment and Annular/Mist Closure Models using Riso Steam/Water Experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wysocki, Aaron J.; Salko, Robert K.

    This report summarizes the work done to validate the droplet entrainment and de-entrainment models as well as two-phase closure models in the CTF code by comparison with experimental data obtained at Riso National Laboratory. The Riso data included a series of over 250 steam/water experiments that were performed in both tube and annulus geometries over a range of pressures and outlet qualities. Experimental conditions were set so that the majority of cases were in the annular/mist flow regime. Measurements included liquid film flow rate, droplet flow rate, film thickness, and two-phase pressure drop. CTF was used to model 180 of the tubular geometry cases, matching experimental geometry, outlet pressure, and outlet flow quality to experimental values. CTF results were compared to the experimental data at the outlet of the test section in terms of vapor and entrained liquid flow fractions, pressure drop per unit length, and liquid film thickness. The entire process of generating CTF input decks, running cases, extracting data, and generating comparison plots was scripted using Python and Matplotlib for a completely automated validation process. All test cases and scripting tools have been committed to the COBRA-TF master repository and selected cases have been added to the continuous testing system to serve as regression tests. The differences between the CTF- and experimentally-calculated flow fraction values were consistent with previous calculations by Wurtz, who applied the same entrainment correlation to the same data. It has been found that CTF's entrainment/de-entrainment predictive capability in the annular/mist flow regime for this particular facility is comparable to the licensed industry code, COBRAG. While film and droplet predictions are generally good, it has been found that accuracy is diminished at lower flow qualities. This finding is consistent with the noted deficiencies in the Wurtz entrainment model employed by CTF. The CTF-predicted two-phase pressure drop in the annular/mist flow regime has been found to be highly inaccurate, exhibiting a clear bias with respect to the experimental data. This inaccuracy led to an investigation that revealed deficiencies in the implementation of the annular/mist interfacial friction model, which should be investigated further in the future. Published COBRAG results for this same facility exhibit no such bias with respect to the experimental pressure drop data. In addition to the problems with pressure drop prediction, the film thickness was also significantly under-predicted by CTF compared to both experimental data and Wurtz's analytical calculations. Film thickness is calculated in CTF using a simple geometric relationship and the film void fraction, which is dependent on slip ratio and interfacial friction. It is possible that the issues affecting the pressure drop and film void predictions are related.

  11. Design validation and performance of closed loop gas recirculation system

    NASA Astrophysics Data System (ADS)

    Kalmani, S. D.; Joshi, A. V.; Majumder, G.; Mondal, N. K.; Shinde, R. R.

    2016-11-01

    A pilot experimental setup of the India Based Neutrino Observatory's ICAL detector has been operational for the last 4 years at TIFR, Mumbai. Twelve glass RPC detectors of size 2 × 2 m², with a gas gap of 2 mm, are under test in a closed loop gas recirculation system. These RPCs are continuously purged individually with a gas mixture of R134a (C2H2F4), isobutane (iC4H10) and sulphur hexafluoride (SF6) at a steady rate of 360 ml/h to maintain about one volume change a day. To economize gas mixture consumption and to reduce the effluents released into the atmosphere, a closed loop system has been designed, fabricated and installed at TIFR. The pressure and flow rate in the loop are controlled by mass flow controllers and pressure transmitters. The performance and integrity of the RPCs in the pilot experimental setup are being monitored to assess the effect of periodic fluctuations and transients in atmospheric pressure and temperature, room pressure variation, flow pulsations, uniformity of gas distribution and power failures. The capability of the closed loop gas recirculation system to respond to these changes is also studied. The conclusions from the above experiment are presented. The validation of the initial design decisions and subsequent modifications has provided improved guidelines for the future design of the engineering module gas system.

  12. An Automated, Experimenter-Free Method for the Standardised, Operant Cognitive Testing of Rats

    PubMed Central

    Rivalan, Marion; Munawar, Humaira; Fuchs, Anna; Winter, York

    2017-01-01

    Animal models of human pathology are essential for biomedical research. However, a recurring issue in the use of animal models is the poor reproducibility of behavioural and physiological findings within and between laboratories, and the most critical factor influencing this issue remains the experimenter. One solution is the use of procedures devoid of human intervention. We present a novel approach to the experimenter-free testing of cognitive abilities in rats, combining undisturbed group housing with automated, standardized and individual operant testing. This experimenter-free system consisted of an automated operant system (Bussey-Saksida rat touch screen) connected via an automated animal sorter (PhenoSys) to a home cage containing group-living rats. The automated animal sorter, which is based on radio-frequency identification (RFID) technology, functioned as a mechanical replacement of the experimenter. Rats learnt to regularly and individually enter the operant chamber and remained there only for the duration of the experimental session. Self-motivated rats acquired the complex touch screen task of trial-unique non-matching to location (TUNL) in half the time reported for animals that were manually placed into the operant chamber. Rat performance was similar between the two groups within our laboratory, and comparable to previously published results obtained elsewhere. This reproducibility, both within and between laboratories, confirms the validity of this approach. In addition, automation reduced daily experimental time by 80%, eliminated animal handling, and reduced equipment cost. This automated, experimenter-free setup is a promising tool for testing a large variety of functions with full automation in future studies. PMID:28060883

  13. Three-dimensional (3D) evaluation of liquid distribution in shake flask using an optical fluorescence technique.

    PubMed

    Azizan, Amizon; Büchs, Jochen

    2017-01-01

    Biotechnological development in shake flasks requires knowledge of vital engineering parameters, e.g., volumetric power input, mixing time, gas-liquid mass transfer coefficient, hydromechanical stress and effective shear rate. Determination and optimization of these parameters through experiments are labor-intensive and time-consuming. Computational Fluid Dynamics (CFD) provides the ability to predict and validate these parameters in bioprocess engineering. This work provides ample experimental data, easily accessible for future validations, representing the hydrodynamics of the fluid flow in the shake flask. A non-invasive measuring technique using an optical fluorescence method was developed for shake flasks containing a fluorescent solution with a water-like viscosity at varying filling volume (V_L = 15 to 40 mL) and shaking frequency (n = 150 to 450 rpm) at a constant shaking diameter (d_o = 25 mm). The method detected the leading edge (LB) and tail of the rotating bulk liquid (TB) relative to the direction of the centrifugal acceleration at varying circumferential heights from the base of the shake flask. The determined LB and TB points were translated into three-dimensional (3D) circumferential liquid distribution plots. The maximum liquid height (H_max) of the bulk liquid increased with increasing filling volume and shaking frequency, as expected. The toroidal shapes of LB and TB are clearly asymmetrical, and the measured TB differed by the elongation of the liquid, particularly towards the torus part of the shake flask. The 3D liquid distribution data collected at varying filling volume and shaking frequency, comprising LB and TB values relative to the direction of the centrifugal acceleration, are essential for validating future numerical solutions using CFD to predict vital engineering parameters in shake flasks.
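    The translation of detected edge angles into 3D coordinates can be sketched as below, under the simplifying (and hypothetical) assumption of a cylindrical flask wall of constant radius; the function name and data values are illustrative only, not the study's measurement pipeline:

```python
import math

def liquid_edge_points(radius, edge_angles_deg, heights):
    """Convert detected bulk-liquid edge angles (measured relative to the
    direction of the centrifugal acceleration) at a set of circumferential
    heights into 3D Cartesian points on the flask wall, modeled here as a
    vertical cylinder of the given radius (meters)."""
    pts = []
    for theta_deg, z in zip(edge_angles_deg, heights):
        theta = math.radians(theta_deg)
        pts.append((radius * math.cos(theta), radius * math.sin(theta), z))
    return pts

# Leading-edge (LB) angles detected at three heights (hypothetical data)
pts = liquid_edge_points(0.04, [30.0, 45.0, 60.0], [0.00, 0.01, 0.02])
```

    Collecting such points for both LB and TB over many heights yields the kind of 3D circumferential liquid distribution plot described in the abstract.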

  14. Cutting the wires: modularization of cellular networks for experimental design.

    PubMed

    Lang, Moritz; Summers, Sean; Stelling, Jörg

    2014-01-07

    Understanding naturally evolved cellular networks requires the consecutive identification and revision of the interactions between relevant molecular species. In this process, initially often simplified and incomplete networks are extended by integrating new reactions or whole subnetworks to increase consistency between model predictions and new measurement data. However, increased consistency with experimental data alone is not sufficient to show the existence of biomolecular interactions, because the interplay of different potential extensions might lead to overall similar dynamics. Here, we present a graph-based modularization approach to facilitate the design of experiments targeted at independently validating the existence of several potential network extensions. Our method is based on selecting the outputs to measure during an experiment, such that each potential network extension becomes virtually insulated from all others during data analysis. Each output defines a module that only depends on one hypothetical network extension, and all other outputs act as virtual inputs to achieve insulation. Given appropriate experimental time-series measurements of the outputs, our modules can be analyzed, simulated, and compared to the experimental data separately. Our approach exemplifies the close relationship between structural systems identification and modularization, an interplay that promises development of related approaches in the future. Copyright © 2014 Biophysical Society. Published by Elsevier Inc. All rights reserved.

  15. Physiological and Psychological Predictors of Short-Term Disability in Workers with a History of Low Back Pain: A Longitudinal Study

    PubMed Central

    Dubois, Jean-Daniel; Cantin, Vincent; Piché, Mathieu; Descarreaux, Martin

    2016-01-01

    Despite an elusive pathophysiology, common characteristics are often observed in individuals with chronic low back pain (LBP). These include psychological symptoms, altered pain perception, altered pain modulation and altered muscle activation. These factors have been explored as possible determinants of disability, either separately or in cross-sectional studies, but never assessed in a single longitudinal study. The objective was therefore to determine the relative contribution of psychological and neurophysiological factors to future disability in individuals with past LBP. The study included two experimental sessions (baseline and six months later) to assess cutaneous heat pain and pain tolerance thresholds, pain inhibition, and trunk muscle activation. Both sessions included the completion of validated questionnaires to determine clinical pain, disability, pain catastrophizing, fear-avoidance beliefs and pain vigilance. One hundred workers with a history of LBP and 19 healthy individuals took part in the first experimental session. The second experimental session was conducted exclusively on workers with a history of LBP (77/100). Correlation analyses between initial measures and disability at six months were conducted, and measures significantly associated with disability were used in multiple regression analyses. A first regression analysis showed that psychological symptoms contributed unique variance to future disability (R² = 0.093, p = .009). To control for the fluctuating nature of LBP, a hierarchical regression was conducted while controlling for clinical pain at six months (R² = 0.213, p < .001), in which pain inhibition contributed unique variance in the second step of the regression (R² change = 0.094, p = .005). These results indicate that pain inhibition processes may constitute potential targets for treatment to alleviate future disability in individuals with past or present LBP.
However, the link between psychological symptoms and pain inhibition needs to be clarified, as these factors are interrelated and each influences disability in its own way. PMID:27783666
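    The hierarchical-regression logic described above (unique variance contributed by a predictor after controlling for clinical pain, measured as the R² change between steps) can be sketched with synthetic data; all variable names, effect sizes, and the sample itself are hypothetical, not the study's data:

```python
import numpy as np

def r_squared(X, y):
    """R^2 of an ordinary least-squares fit with an intercept."""
    X = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1.0 - (resid @ resid) / ((y - y.mean()) @ (y - y.mean()))

rng = np.random.default_rng(0)
n = 77  # matches the follow-up sample size in the abstract
clinical_pain = rng.normal(size=n)
pain_inhibition = rng.normal(size=n)
# Synthetic disability scores driven by both predictors plus noise
disability = (0.8 * clinical_pain + 0.5 * pain_inhibition
              + rng.normal(scale=0.7, size=n))

# Step 1: control variable only; step 2: add the predictor of interest
r2_step1 = r_squared(clinical_pain[:, None], disability)
r2_step2 = r_squared(np.column_stack([clinical_pain, pain_inhibition]),
                     disability)
r2_change = r2_step2 - r2_step1  # unique variance added in step 2
```

    A meaningful `r2_change` (tested against an F distribution in a full analysis, which is omitted here) is what the abstract reports as the unique contribution of pain inhibition.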

  16. [Theoretical and methodological bases for the formation of future doctors' readiness to apply physical rehabilitation technologies].

    PubMed

    Yemets, Anatoliy V; Donchenko, Viktoriya I; Scrinick, Eugenia O

    2018-01-01

    Introduction: The experimental work is aimed at introducing theoretical and methodological foundations into the professional training of the future doctor. The aim: To identify the dynamics of quantitative and qualitative indicators of the readiness of specialists in medicine. Materials and methods: The article presents the course and results of experimental work on the conditions for forming the readiness of future specialists in medicine. Results: Methodical bases were worked out for studying the disciplines of the general-practice and specialized professional stages of the experimental training of future physicians. Conclusions: The training materials were developed taking into account the peculiarities of future physician training at the various stages of experimental implementation in the educational process of higher medical educational institutions.

  17. Methodology used in comparative studies assessing programmes of transition from paediatrics to adult care programmes: a systematic review

    PubMed Central

    Le Roux, E; Mellerio, H; Guilmin-Crépon, S; Gottot, S; Jacquin, P; Boulkedid, R; Alberti, C

    2017-01-01

    Objective: To explore the methodologies employed in studies assessing transition-of-care interventions, with the aim of defining goals for the improvement of future studies. Design: Systematic review of comparative studies assessing transition-to-adult-care interventions for young people with chronic conditions. Data sources: MEDLINE, EMBASE, ClinicalTrials.gov. Eligibility criteria for selecting studies: Two reviewers screened comparative studies with experimental and quasi-experimental designs, published or registered before July 2015. Eligible studies evaluated transition interventions at least in part after transfer to adult care of young people with chronic conditions, with at least one outcome assessed quantitatively. Results: 39 studies were reviewed; 26/39 (67%) had published their final results and 13/39 (33%) were in progress. In 9 studies (9/39, 23%), comparisons were made between pre-intervention and post-intervention in a single group. Randomised control groups were used in 9/39 (23%) studies. Two (2/39, 5%) reported blinding strategies. Use of validated questionnaires was reported in 28% (11/39) of studies. In terms of reporting, 15/26 (58%) of published studies did not report age at transfer, and 6/26 (23%) did not report the time of collection of each outcome. Conclusions: Few evaluative studies exist, and their methodological quality is variable. The complexity of interventions, the multiplicity of outcomes, the difficulty of blinding and the small patient groups make it difficult to draw conclusions about the effectiveness of interventions. The evaluation of transition interventions requires an appropriate and common methodology that will provide a better level of evidence. We identified areas for improvement in terms of randomisation, recruitment and external validity, blinding, measurement validity, standardised assessment and reporting. Improvements will increase our capacity to determine effective interventions for transition care. PMID:28131998

  18. Biocomputational identification and validation of novel microRNAs predicted from bubaline whole genome shotgun sequences.

    PubMed

    Manku, H K; Dhanoa, J K; Kaur, S; Arora, J S; Mukhopadhyay, C S

    2017-10-01

    MicroRNAs (miRNAs) are small (19-25 bases long), non-coding RNAs that regulate post-transcriptional gene expression by cleaving targeted mRNAs in several eukaryotes. The miRNAs play vital roles in multiple biological and metabolic processes, including developmental timing, signal transduction, cell maintenance and differentiation, diseases and cancers. Experimental identification of microRNAs is expensive and lab-intensive. Alternatively, computational approaches for predicting putative miRNAs from genomic or exomic sequences rely on features of miRNAs, viz. secondary structures, sequence conservation, minimum free energy index (MFEI), etc. To date, not a single miRNA has been identified in the water buffalo (Bubalus bubalis), an economically important livestock species. The present study aims at predicting putative buffalo miRNAs using a comparative computational approach from buffalo whole genome shotgun sequencing data (INSDC: AWWX00000000.1). The sequences were blasted against the known mammalian miRNAs. The obtained miRNAs were then passed through a series of filtration criteria to obtain the set of predicted (putative and novel) bubaline miRNAs. Eight miRNAs were selected based on lowest E-value and validated by real-time PCR (SYBR Green chemistry) using RNU6 as the endogenous control. The results from different trials of real-time PCR show that, of the selected 8 miRNAs, only 2 (hsa-miR-1277-5p; bta-miR-2285b) are not expressed in bubaline PBMCs. The potential target genes were then predicted, based on sequence complementarity, using miRanda. This work is the first report on prediction of bubaline miRNAs from whole genome sequencing data followed by experimental validation. The findings could pave the way for future studies of economically important traits in buffalo. Copyright © 2017 Elsevier Ltd. All rights reserved.
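    One of the standard filtration criteria named above, the minimum free energy index (MFEI), can be sketched assuming the common definition MFEI = AMFE / GC%, with AMFE = (MFE / sequence length) × 100; the example sequence and MFE value below are hypothetical:

```python
def gc_content(seq):
    """Percentage of G and C bases in a sequence."""
    seq = seq.upper()
    return 100.0 * sum(seq.count(b) for b in "GC") / len(seq)

def mfei(mfe_kcal_per_mol, seq):
    """Minimum free energy index: AMFE / GC%,
    where AMFE = (MFE / sequence length) * 100."""
    amfe = (mfe_kcal_per_mol / len(seq)) * 100.0
    return amfe / gc_content(seq)

# Hypothetical pre-miRNA candidate: MFE of -40 kcal/mol, 80 nt, 50% GC
candidate = "AUGC" * 20
value = mfei(-40.0, candidate)  # AMFE = -50, GC% = 50 -> MFEI = -1.0
```

    Candidates with a strongly negative MFEI (a commonly cited cutoff is around -0.85) are more likely to be genuine pre-miRNA hairpins than random transcripts.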

  19. Finite element study of the thermo-chemo-mechanical behavior of monolithic ramming paste

    NASA Astrophysics Data System (ADS)

    Girard, Pierre-Luc

    The aluminum industry is engaged in fierce international competition, requiring constant improvement of electrolysis cell effectiveness and longevity. The selection of the cell's component materials is therefore an important factor in increasing cell life. The ramming paste, used to seal the cathode lining, is compacted into the joints between the cathode and the side wall of the cell. It is a complex thermo-chemo-reactive material whose properties change as its baking level evolves. The objective of this project is thus to propose a thermo-chemo-mechanical constitutive law for the ramming paste and implement it in the finite element software ANSYS. A constitutive model was first chosen from the available literature on the subject. It is a pressure-dependent model that uses hardening, softening and baking mechanisms in its definition to mimic the behavior of carbon-based materials. Subsequently, the numerical tool was validated using the finite element toolbox FESh++, which contains the most representative thermo-chemo-mechanical constitutive law for carbon-based materials at this time. Finally, a validation of the experimental setup BERTA (Banc d'essai de resistance thermomecanique ALCAN) was made in anticipation of a larger-scale experimental validation of the constitutive law in the near future. However, the analysis of the results shows that BERTA is not suited to adequately measuring the mechanical deformation of this kind of material. Following this project, the numerical tool will be used in numerical simulations to introduce the various effects of the baking of the ramming paste during cell startup. This new tool will help the industrial partner improve the understanding of Hall-Héroult cell start-up and optimize this critical step.

  20. Drug Repositioning by Kernel-Based Integration of Molecular Structure, Molecular Activity, and Phenotype Data

    PubMed Central

    Wang, Yongcui; Chen, Shilong; Deng, Naiyang; Wang, Yong

    2013-01-01

    Computational inference of novel therapeutic values for existing drugs, i.e., drug repositioning, offers great prospects for faster, lower-risk drug development. Previous research has indicated that chemical structures, target proteins, and side-effects can provide rich information for drug similarity assessment and, further, disease similarity. However, each single data source is important in its own way, and data integration holds great promise for repositioning drugs more accurately. Here, we propose a new method for drug repositioning, PreDR (Predict Drug Repositioning), that integrates molecular structure, molecular activity, and phenotype data. Specifically, we characterize each drug by its profile in chemical structure, target protein, and side-effect space, and define a kernel function to correlate drugs with diseases. Then we train a support vector machine (SVM) to computationally predict novel drug-disease interactions. PreDR is validated on a well-established drug-disease network with 1,933 interactions among 593 drugs and 313 diseases. By cross-validation, we find that chemical structure, drug target, and side-effect information are all predictive of drug-disease relationships, and that more experimentally observed drug-disease interactions can be revealed by integrating these three data sources. Comparison with existing methods demonstrates that PreDR is competitive in both accuracy and coverage. Follow-up database searches and pathway analysis indicate that our new predictions are worthy of further experimental validation. In particular, several novel predictions are supported by clinical trial databases, which shows the significant prospects of PreDR in future drug treatment. In conclusion, our new method, PreDR, can serve as a useful tool in drug discovery to efficiently identify novel drug-disease interactions. In addition, our heterogeneous data integration framework can be applied to other problems. PMID:24244318
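    The kernel-integration idea can be sketched as a weighted sum of per-source Gram matrices, which remains a valid kernel when the weights are non-negative; the feature dimensions, weights, and random data below are hypothetical, and the actual PreDR kernel and SVM training are not reproduced here:

```python
import numpy as np

def gaussian_kernel(X, gamma=1.0):
    """Gram matrix of an RBF kernel over row-vector samples."""
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

def integrate_kernels(kernels, weights):
    """Weighted sum of per-source kernels; non-negative weights
    preserve positive semi-definiteness of the result."""
    return sum(w * K for w, K in zip(weights, kernels))

rng = np.random.default_rng(1)
n = 6  # toy number of drugs
structure = rng.normal(size=(n, 4))     # chemical-structure features
targets = rng.normal(size=(n, 3))       # target-protein features
side_effects = rng.normal(size=(n, 5))  # side-effect profile features

K = integrate_kernels(
    [gaussian_kernel(structure),
     gaussian_kernel(targets),
     gaussian_kernel(side_effects)],
    weights=[0.4, 0.3, 0.3],
)
```

    The combined Gram matrix `K` could then be passed to any kernel method (e.g., a precomputed-kernel SVM) to score candidate drug-disease pairs.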

  1. Integrated tokamak modeling: when physics informs engineering and research planning

    NASA Astrophysics Data System (ADS)

    Poli, Francesca

    2017-10-01

    Simulations that integrate virtually all the relevant engineering and physics aspects of a real tokamak experiment are a powerful tool for experimental interpretation, model validation and planning for both present and future devices. This tutorial will guide the audience through the building blocks of an "integrated" tokamak simulation, such as magnetic flux diffusion; thermal, momentum and particle transport; external heating and current drive sources; and wall particle sources and sinks. Emphasis is given to the connection and interplay between external actuators and plasma response, between the slow time scales of current diffusion and the fast time scales of transport, and to how reduced and high-fidelity models can contribute to simulating a whole device. To illustrate the potential and limitations of integrated tokamak modeling for discharge prediction, a helium plasma scenario for the ITER pre-nuclear phase is taken as an example. This scenario presents challenges because it requires core-edge integration and advanced models for the interaction between waves and fast ions, which are subject to a limited experimental database for validation and guidance. Starting from a scenario obtained by re-scaling parameters from the demonstration inductive "ITER baseline", it is shown how self-consistent simulations that encompass both core and edge plasma regions, as well as high-fidelity heating and current drive source models, are needed to set constraints on the density, magnetic field and heating scheme. This tutorial aims at demonstrating how integrated modeling, when used with an adequate level of criticism, can not only support the design of operational scenarios but also help assess the limitations and gaps in the available models, thus indicating where improved modeling tools are required and how present experiments can help validate them and inform research planning. Work supported by DOE under DE-AC02-09CH1146.

  2. Interaction model between capsule robot and intestine based on nonlinear viscoelasticity.

    PubMed

    Zhang, Cheng; Liu, Hao; Tan, Renjia; Li, Hongyi

    2014-03-01

    The active capsule endoscope, also called a capsule robot, has developed from laboratory research to clinical application. However, the system still has defects, such as poor controllability and the inability to perform automatic examinations. The imperfection of the interaction model between the capsule robot and the intestine is one of the dominant causes of these problems. This article aims to establish a model to support the control of the capsule robot. The model is based on nonlinear viscoelasticity, and its interaction force consists of environmental resistance, viscous resistance and Coulomb friction. The parameters of the model are identified by experimental investigation; different methods are used in the experiment to obtain values of the same parameter at different velocities. The model is shown to be valid by experimental verification. The contribution of this article is a refined interaction model, which is hoped to optimize the control method of the capsule robot in the future.
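    The three-term interaction force described above (environmental resistance, viscous resistance, Coulomb friction) can be sketched as a velocity-dependent function; all parameter values below are hypothetical placeholders for the experimentally identified ones, and the power-law viscous term is one plausible nonlinear form, not the paper's exact expression:

```python
import math

def capsule_resistance(v, f_env=0.05, c=0.8, n=1.2, f_coulomb=0.12):
    """Total resistance force (N) opposing capsule motion at velocity
    v (m/s): environmental resistance + nonlinear viscous term
    + Coulomb friction, all acting against the direction of motion."""
    if v == 0.0:
        # Static case: friction balances the applied force up to its limit
        return 0.0
    direction = math.copysign(1.0, v)
    return direction * (f_env + c * abs(v) ** n + f_coulomb)

f_slow = capsule_resistance(0.005)  # resistance at a slow crawl
f_fast = capsule_resistance(0.02)   # resistance at a faster velocity
```

    The velocity dependence of the viscous term is why the abstract's identification procedure measures the same parameter at several velocities.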

  3. Mice, myths, and men

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fry, R.J.M.

    The author discusses some examples of how different experimental animal systems have helped to answer questions about the effects of radiation, in particular carcinogenesis, and indicates how the new experimental model systems promise an even more exciting future. Entwined in these themes will be observations about susceptibility and extrapolation across species. The hope of developing acceptable methods of extrapolating estimates of the risk of radiogenic cancer increases as molecular biology reveals the trail of remarkable similarities in the genetic control of many functions common to many species. A major concern about even attempting to extrapolate estimates of risks of radiation-induced cancer across species has been that the mechanisms of carcinogenesis were so different among species that it would negate the validity of extrapolation. The more that has become known about the genes involved in cancer, especially those related to the initial events in carcinogenesis, the more the reasons for considering methods of extrapolation across species have increased.

  4. Theoretical and Experimental Photoelectron Spectroscopy Characterization of the Ground State of Thymine Cation.

    PubMed

    Majdi, Youssef; Hochlaf, Majdi; Pan, Yi; Lau, Kai-Chung; Poisson, Lionel; Garcia, Gustavo A; Nahon, Laurent; Al-Mogren, Muneerah Mogren; Schwell, Martin

    2015-06-11

    We report on the vibronic structure of the ground state X̃(2)A″ of the thymine cation, which has been measured using a threshold photoelectron photoion coincidence technique and vacuum ultraviolet synchrotron radiation. The threshold photoelectron spectrum, recorded over ∼0.7 eV above the ionization potential (i.e., covering the whole ground state of the cation) shows rich vibrational structure that has been assigned with the help of calculated anharmonic modes of the ground electronic cation state at the PBE0/aug-cc-pVDZ level of theory. The adiabatic ionization energy has been experimentally determined as AIE = 8.913 ± 0.005 eV, in very good agreement with previous high resolution results. The corresponding theoretical value of AIE = 8.917 eV has been calculated in this work with the explicitly correlated method/basis set (R)CCSD(T)-F12/cc-pVTZ-F12, which validates the theoretical approach and benchmarks its accuracy for future studies of medium-sized biological molecules.

  5. Tensile Properties of Polymeric Matrix Composites Subjected to Cryogenic Environments

    NASA Technical Reports Server (NTRS)

    Whitley, Karen S.; Gates, Thomas S.

    2004-01-01

    Polymer matrix composites (PMCs) have seen limited use as structural materials in cryogenic environments. One reason for the limited use of PMCs in cryogenic structures is a design philosophy that typically requires a large, validated database of material properties in order to ensure a reliable and defect-free structure. It is the intent of this paper to provide an initial set of mechanical properties developed from experimental data of an advanced PMC (IM7/PETI-5) exposed to cryogenic temperatures and mechanical loading. The application of this data is to assist in the materials down-select and design of cryogenic fuel tanks for future reusable space vehicles. The details of the material system, test program, and experimental methods are outlined. Tension modulus and strength were measured at room temperature, -196 °C, and -269 °C on five different laminates. These properties were also tested after aging at -186 °C with and without applied loading. Microcracking was observed in one laminate.

  6. A Hyperbolic Ontology Visualization Tool for Model Application Programming Interface Documentation

    NASA Technical Reports Server (NTRS)

    Hyman, Cody

    2011-01-01

    Spacecraft modeling, a critically important part of validating planned spacecraft activities, is currently carried out using a time-consuming method of mission-to-mission model implementations and integration. A current project in early development, Integrated Spacecraft Analysis (ISCA), aims to remedy this hindrance by providing reusable architectures and reducing the time spent integrating models with planning and sequencing tools. The principal objective of this internship was to develop a user interface for an experimental ontology-based structure visualization of navigation and attitude control system modeling software. To this end, a number of tree and graph visualization tools were researched, and a Java-based hyperbolic graph viewer was selected for experimental adaptation. Early results show promise in the ability to organize and display large amounts of spacecraft model documentation efficiently and effectively through a web browser. This viewer serves as a conceptual implementation for future development, but trials with both ISCA developers and end users should be performed to truly evaluate the effectiveness of continued development of such visualizations.

  7. Carbon dioxide dynamics in an artificial ecosystem

    NASA Astrophysics Data System (ADS)

    Hu, Enzhu; Hu, Dawei; Tong, Ling; Li, Ming; Fu, Yuming; He, Wenting; Liu, Hong

    An experimental artificial ecosystem was established as a tool to understand the behavior of closed ecosystems and to develop the technology for a future bioregenerative life support system for lunar or planetary exploration. The total effective volume of the system is 0.7 m³. It consists of a higher-plant chamber, an animal chamber and a photo-bioreactor, which cultivated lettuce (Lactuca sativa L.), silkworm (Bombyx mori L.) and microalgae (Chlorella), respectively. For uniform and sustained observations, lettuce and silkworms were cultivated using a sequential cultivation method, and microalgae using continuous culture. Four researchers took turns breathing the system air through a tube for brief periods every few hours. A mathematical model simulating the carbon dioxide dynamics was developed. The main biological parameters concerning photosynthesis of lettuce and microalgae and respiration of silkworms and humans were validated against the experimental data. The model described the respiratory relationship between autotrophic and heterotrophic compartments. A control strategy was proposed as a tool for the atmosphere management of the artificial ecosystem.
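    A lumped carbon dioxide balance of the kind described can be sketched as a forward-Euler integration of sources (animal and human respiration) minus sinks (plant and algal photosynthesis); all rate constants below are hypothetical, not the validated model's parameters, and uptake is made to saturate as the concentration approaches zero:

```python
def simulate_co2(c0, hours, dt=0.01,
                 plant_uptake=0.9, algae_uptake=0.6,
                 animal_emission=0.5, human_emission=0.8):
    """Forward-Euler integration of a lumped CO2 balance:
    d(CO2)/dt = sources - sinks, with photosynthetic uptake
    saturating (Michaelis-Menten-like) at low concentration.
    Rates are in arbitrary mass units per hour."""
    c = c0
    for _ in range(int(hours / dt)):
        sinks = (plant_uptake + algae_uptake) * c / (c + 1.0)
        sources = animal_emission + human_emission
        c = max(c + dt * (sources - sinks), 0.0)
    return c

# CO2 inventory after a day, starting from a hypothetical initial value
c_final = simulate_co2(c0=2.0, hours=24)
```

    With these constants the system relaxes toward the equilibrium where sources balance sinks (here c = 6.5), illustrating the kind of autotroph-heterotroph coupling the model describes.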

  8. A study of the effects of an experimental spiral physics curriculum taught to sixth grade girls and boys

    NASA Astrophysics Data System (ADS)

    Davis, Edith G.

    The pilot study compared the effectiveness of using an experimental spiral physics curriculum to a traditional linear physics curriculum for sixth through eighth grades. The study also surveyed students' parents and principals about students' academic history and background as well as identified resilient children's attributes for academic success. The pilot study was used to help validate the testing instrument as well as help refine the complete study. The purpose of the complete study was to compare the effectiveness of using an experimental spiral physics curriculum and a traditional linear curriculum with sixth graders only; seventh and eighth graders were dropped in the complete study. The study also surveyed students' parents, teachers, and principals about students' academic history and background as well as identified resilient children's attributes for academic success. Both the experimental spiral physics curriculum and the traditional linear physics curriculum increased physics achievement; however, there was no statistically significant difference in effectiveness of teaching experimental spiral physics curriculum in the aggregated sixth grade group compared to the traditional linear physics curriculum. It is important to note that the majority of the subgroups studied did show statistically significant differences in effectiveness for the experimental spiral physics curriculum compared to the traditional linear physics curriculum. The Grounded Theory analysis of resilient student characteristics resulted in categories for future studies including the empathy factor ("E" factor), the tenacity factor ("T" factor), the relational factor ("R" factor), and the spiritual factor ("S" factor).

  9. Availability of Neutronics Benchmarks in the ICSBEP and IRPhEP Handbooks for Computational Tools Testing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bess, John D.; Briggs, J. Blair; Ivanova, Tatiana

    2017-02-01

    In the past several decades, numerous experiments have been performed worldwide to support reactor operations, measurements, design, and nuclear safety. Those experiments represent an extensive international investment in infrastructure, expertise, and cost, and constitute significantly valuable resources of data supporting past, current, and future research activities. Those valuable assets form the basis for recording, development, and validation of our nuclear methods and integral nuclear data [1]. The loss of these experimental data, which has occurred all too often in recent years, is tragic. The high cost to repeat many of these measurements can be prohibitive, if not impossible, to surmount. Two international projects were developed, under the direction of the Organisation for Economic Co-operation and Development Nuclear Energy Agency (OECD NEA), to address the challenges of not just data preservation but evaluation of the data to determine its merit for modern and future use. The International Criticality Safety Benchmark Evaluation Project (ICSBEP) was established to identify and verify comprehensive critical benchmark data sets; evaluate the data, including quantification of biases and uncertainties; compile the data and calculations in a standardized format; and formally document the effort into a single source of verified benchmark data [2]. Similarly, the International Reactor Physics Experiment Evaluation Project (IRPhEP) was established to preserve integral reactor physics experimental data, including separate or special effects data for nuclear energy and technology applications [3]. Annually, contributors from around the world continue to collaborate in the evaluation and review of select benchmark experiments for preservation and dissemination.
The extensively peer-reviewed integral benchmark data can then be utilized by nuclear design and safety analysts to validate the analytical tools, methods, and data needed for next-generation reactor design, safety analysis requirements, and all other front- and back-end activities contributing to the overall nuclear fuel cycle where quality neutronics calculations are paramount.

  10. Validation of the thermal challenge problem using Bayesian Belief Networks.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McFarland, John; Swiler, Laura Painton

The thermal challenge problem has been developed at Sandia National Laboratories as a testbed for demonstrating various types of validation approaches and prediction methods. This report discusses one particular methodology for assessing the validity of a computational model given experimental data. The methodology is based on Bayesian Belief Networks (BBNs) and can incorporate uncertainty in experimental measurements, uncertainty in physical quantities, and model uncertainties. The approach uses the prior and posterior distributions of model output to compute a validation metric based on Bayesian hypothesis testing (a Bayes factor). This report discusses various aspects of the BBN, specifically in the context of the thermal challenge problem. A BBN is developed for a given set of experimental data in a particular experimental configuration. The development of the BBN, and the method for "solving" the BBN to obtain the posterior distribution of model output through Markov chain Monte Carlo sampling, is discussed in detail. The use of the BBN to compute a Bayes factor is demonstrated.
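The Bayes-factor validation metric described in this record can be illustrated with a minimal sketch, assuming a toy conjugate normal model for the discrepancy between model output and experiment; the prior, noise level, hypothesis interval, and data below are invented for illustration, not quantities from the thermal challenge problem itself.

```python
# Hedged sketch of a Bayes-factor validation metric on a hypothetical toy model.
import math, random

random.seed(0)

# Hypothesis H0: the model discrepancy delta lies within +/- EPS of zero.
EPS = 0.5

# Prior on the discrepancy between model output and experiment: N(0, 2^2)
PRIOR_MU, PRIOR_SD = 0.0, 2.0

# Simulated experimental residuals (data - model), measurement noise sd = 1
data = [0.3, -0.1, 0.4, 0.2, -0.2]
NOISE_SD = 1.0

# Conjugate normal-normal update gives the posterior on delta in closed form
n = len(data)
post_var = 1.0 / (1.0 / PRIOR_SD**2 + n / NOISE_SD**2)
post_mu = post_var * (PRIOR_MU / PRIOR_SD**2 + sum(data) / NOISE_SD**2)
post_sd = math.sqrt(post_var)

def prob_in_interval(mu, sd, lo, hi, n_samples=200_000):
    """Monte Carlo estimate of P(lo < delta < hi) under N(mu, sd^2)."""
    hits = sum(lo < random.gauss(mu, sd) < hi for _ in range(n_samples))
    return hits / n_samples

p_prior = prob_in_interval(PRIOR_MU, PRIOR_SD, -EPS, EPS)
p_post = prob_in_interval(post_mu, post_sd, -EPS, EPS)

# Bayes factor: posterior odds of H0 divided by prior odds of H0
bayes_factor = (p_post / (1 - p_post)) / (p_prior / (1 - p_prior))
print(f"posterior mean discrepancy: {post_mu:.3f}")
print(f"Bayes factor in favour of H0: {bayes_factor:.1f}")
```

Because the simulated residuals concentrate near zero, the posterior sharpens around a small discrepancy and the Bayes factor comes out well above one, i.e. the data increase support for the "model is valid" hypothesis.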

  11. Development of a Culturally Valid Counselor Burnout Inventory for Korean Counselors

    ERIC Educational Resources Information Center

    Yu, Kumlan; Lee, Sang Min; Nesbit, Elisabeth A.

    2008-01-01

This article describes the development of the culturally valid Counselor Burnout Inventory. A multistage approach including item translation; item refinement; and evaluation of factorial validity, reliability, and score validity was used to test the instrument's constructs and validity. Implications for practice and future research are discussed. (Contains 3…

  12. Development and validation of a model of bio-barriers for remediation of Cr(VI) contaminated aquifers using laboratory column experiments.

    PubMed

    Shashidhar, T; Bhallamudi, S Murty; Philip, Ligy

    2007-07-16

Bench scale transport and biotransformation experiments and mathematical model simulations were carried out to study the effectiveness of bio-barriers for the containment of hexavalent chromium in contaminated confined aquifers. Experimental results showed that a 10 cm thick bio-barrier with an initial biomass concentration of 0.205 mg/g of soil was able to completely contain a Cr(VI) plume of 25 mg/L concentration. It was also observed that pore water velocity and initial biomass concentration are the most significant parameters in the containment of Cr(VI). The mathematical model developed is based on one-dimensional advection-dispersion reaction equations for Cr(VI) and molasses in saturated, homogeneous porous medium. The transport of Cr(VI) and molasses is coupled with adsorption and Monod's inhibition kinetics for immobile bacteria. It was found that, in general, the model was able to simulate the experimental results satisfactorily. However, there was disparity between the numerically simulated and experimental breakthrough curves for Cr(VI) and molasses in cases where there was high clay content and high microbial activity. The mathematical model could contribute towards improved designs of future bio-barriers for the remediation of Cr(VI) contaminated aquifers.
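The transport model described in this record can be caricatured with a minimal explicit finite-difference sketch of 1-D advection-dispersion with a Monod-type sink; the grid, rates, and biomass factor below are illustrative placeholders, not the calibrated parameters of the published column model, and adsorption and molasses transport are omitted.

```python
# Hedged sketch: explicit upwind finite differences for 1-D advection-
# dispersion of Cr(VI) with a Monod-type biotransformation sink.
NX, DX, DT = 50, 0.01, 50.0        # 50 cm column, 1 cm cells, 50 s time steps
V = 1.0e-5                         # pore-water velocity (m/s), assumed
D = 1.0e-7                         # dispersion coefficient (m^2/s), assumed
MU_MAX, KS = 1.0e-4, 5.0           # Monod max rate (1/s) and half-saturation (mg/L)
X_BIO = 0.2                        # biomass factor (dimensionless, assumed)

c = [0.0] * NX                     # Cr(VI) concentration profile (mg/L)
C_IN = 25.0                        # inlet concentration (mg/L), as in the study

for step in range(2000):
    new = c[:]
    for i in range(1, NX - 1):
        adv = -V * (c[i] - c[i - 1]) / DX                # upwind advection
        disp = D * (c[i + 1] - 2 * c[i] + c[i - 1]) / DX**2
        monod = -MU_MAX * X_BIO * c[i] / (KS + c[i])     # biotransformation sink
        new[i] = c[i] + DT * (adv + disp + monod)
    new[0] = C_IN                   # fixed-concentration inlet
    new[-1] = new[-2]               # zero-gradient outlet
    c = new

print(f"outlet concentration after {2000 * DT / 3600:.1f} h: {c[-1]:.2f} mg/L")
```

The explicit scheme is stable here because both the Courant number (V·DT/DX = 0.05) and the diffusion number (D·DT/DX² = 0.05) are well below their limits; the full model in the record additionally couples adsorption and biomass growth.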

  13. Numerical simulations of the hard X-ray pulse intensity distribution at the Linac Coherent Light Source

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pardini, Tom; Aquila, Andrew; Boutet, Sebastien

Numerical simulations of the current and future pulse intensity distributions at selected locations along the Far Experimental Hall, the hard X-ray section of the Linac Coherent Light Source (LCLS), are provided. Estimates are given for the pulse fluence, energy and size in and out of focus, taking into account effects due to the experimentally measured divergence of the X-ray beam and the measured figure errors of all X-ray optics in the beam path. Out-of-focus results are validated by comparison with experimental data. Previous work is expanded on, providing quantitatively correct predictions of the pulse intensity distribution. Numerical estimates in focus are particularly important given that the latter cannot be measured with direct imaging techniques due to detector damage. Finally, novel numerical estimates of improvements to the pulse intensity distribution expected as part of the on-going upgrade of the LCLS X-ray transport system are provided. As a result, we suggest how the new generation of X-ray optics to be installed would outperform the old one, satisfying the tight requirements imposed by X-ray free-electron laser facilities.

  14. A self-sensing magnetorheological damper with power generation

    NASA Astrophysics Data System (ADS)

    Chen, Chao; Liao, Wei-Hsin

    2012-02-01

Magnetorheological (MR) dampers are promising for semi-active vibration control of various dynamic systems. In current MR damper systems, a separate power supply and a dynamic sensor are required. To enable the MR damper to be self-powered and self-sensing in the future, in this paper we propose and investigate a self-sensing MR damper with power generation, which integrates energy harvesting, dynamic sensing and MR damping technologies into one device. This MR damper has self-contained power generation and velocity sensing capabilities, and is applicable to various dynamic systems. It combines the advantages of energy harvesting (reusing wasted energy), MR damping (controllable damping force), and sensing (providing dynamic information for controlling system dynamics). This multifunctional integration would bring great benefits such as energy saving, size and weight reduction, lower cost, high reliability, and less maintenance for MR damper systems. In this paper, a prototype of the self-sensing MR damper with power generation was designed, fabricated, and tested. Theoretical analyses and experimental studies on power generation were performed. A velocity-sensing method was proposed and experimentally validated. Magnetic-field interference among the three functions was prevented by a combined magnetic-field isolation method. Modeling, analysis, and experimental results on damping forces are also presented.

  15. Numerical simulations of the hard X-ray pulse intensity distribution at the Linac Coherent Light Source

    DOE PAGES

    Pardini, Tom; Aquila, Andrew; Boutet, Sebastien; ...

    2017-06-15

Numerical simulations of the current and future pulse intensity distributions at selected locations along the Far Experimental Hall, the hard X-ray section of the Linac Coherent Light Source (LCLS), are provided. Estimates are given for the pulse fluence, energy and size in and out of focus, taking into account effects due to the experimentally measured divergence of the X-ray beam and the measured figure errors of all X-ray optics in the beam path. Out-of-focus results are validated by comparison with experimental data. Previous work is expanded on, providing quantitatively correct predictions of the pulse intensity distribution. Numerical estimates in focus are particularly important given that the latter cannot be measured with direct imaging techniques due to detector damage. Finally, novel numerical estimates of improvements to the pulse intensity distribution expected as part of the on-going upgrade of the LCLS X-ray transport system are provided. As a result, we suggest how the new generation of X-ray optics to be installed would outperform the old one, satisfying the tight requirements imposed by X-ray free-electron laser facilities.

  16. Practical use of a framework for network science experimentation

    NASA Astrophysics Data System (ADS)

    Toth, Andrew; Bergamaschi, Flavio

    2014-06-01

In 2006, the US Army Research Laboratory (ARL) and the UK Ministry of Defence (MoD) established a collaborative research alliance with academia and industry, called the International Technology Alliance (ITA) in Network and Information Sciences, to address fundamental issues in network and information sciences that will enhance decision making for coalition operations, enable the rapid, secure formation of ad hoc teams in coalition environments, and enhance US and UK capabilities to conduct coalition warfare. Research conducted under the ITA was extended through collaboration between ARL and IBM UK to characterize and define a software stack and tooling that has become the reference framework for network science experimentation in support of validation of theoretical research. This paper discusses the composition of the reference framework for experimentation resulting from the ARL/IBM UK collaboration and its use, by the Network Science Collaborative Technology Alliance (NS CTA), in a recent network science experiment conducted at ARL. It also discusses how the experiment was modeled using the reference framework, the integration of two new components (the Apollo Fact-Finder tool and the Medusa Crowd Sensing application), the limitations identified, and how they will be addressed in future work.

  17. A comprehensive computational model of sound transmission through the porcine lung

    PubMed Central

    Dai, Zoujun; Peng, Ying; Henry, Brian M.; Mansy, Hansen A.; Sandler, Richard H.; Royston, Thomas J.

    2014-01-01

A comprehensive computational simulation model of sound transmission through the porcine lung is introduced and experimentally evaluated. This “subject-specific” model utilizes parenchymal and major airway geometry derived from x-ray CT images. The lung parenchyma is modeled as a poroviscoelastic material using Biot theory. A finite element (FE) mesh of the lung that includes airway detail is created and used in COMSOL FE software to simulate the vibroacoustic response of the lung to sound input at the trachea. The FE simulation model is validated by comparing simulation results to experimental measurements using scanning laser Doppler vibrometry on the surface of an excised, preserved lung. The FE model can also be used to calculate and visualize vibroacoustic pressure and motion inside the lung and its airways caused by the acoustic input. The effect of diffuse lung fibrosis and of a local tumor on the lung acoustic response is simulated and visualized using the FE model. In the future, this type of visualization can be compared and matched with experimentally obtained elastographic images to better quantify regional lung material properties to noninvasively diagnose and stage disease and response to treatment. PMID:25190415

  18. A comprehensive computational model of sound transmission through the porcine lung.

    PubMed

    Dai, Zoujun; Peng, Ying; Henry, Brian M; Mansy, Hansen A; Sandler, Richard H; Royston, Thomas J

    2014-09-01

A comprehensive computational simulation model of sound transmission through the porcine lung is introduced and experimentally evaluated. This "subject-specific" model utilizes parenchymal and major airway geometry derived from x-ray CT images. The lung parenchyma is modeled as a poroviscoelastic material using Biot theory. A finite element (FE) mesh of the lung that includes airway detail is created and used in COMSOL FE software to simulate the vibroacoustic response of the lung to sound input at the trachea. The FE simulation model is validated by comparing simulation results to experimental measurements using scanning laser Doppler vibrometry on the surface of an excised, preserved lung. The FE model can also be used to calculate and visualize vibroacoustic pressure and motion inside the lung and its airways caused by the acoustic input. The effect of diffuse lung fibrosis and of a local tumor on the lung acoustic response is simulated and visualized using the FE model. In the future, this type of visualization can be compared and matched with experimentally obtained elastographic images to better quantify regional lung material properties to noninvasively diagnose and stage disease and response to treatment.

  19. ISPyB: an information management system for synchrotron macromolecular crystallography.

    PubMed

    Delagenière, Solange; Brenchereau, Patrice; Launer, Ludovic; Ashton, Alun W; Leal, Ricardo; Veyrier, Stéphanie; Gabadinho, José; Gordon, Elspeth J; Jones, Samuel D; Levik, Karl Erik; McSweeney, Seán M; Monaco, Stéphanie; Nanao, Max; Spruce, Darren; Svensson, Olof; Walsh, Martin A; Leonard, Gordon A

    2011-11-15

Individual research groups now analyze thousands of samples per year at synchrotron macromolecular crystallography (MX) resources. The efficient management of experimental data is thus essential if the best possible experiments are to be performed and the best possible data used in downstream processes in structure determination pipelines. To facilitate such data management, the Information System for Protein crystallography Beamlines (ISPyB), a Laboratory Information Management System (LIMS) with an underlying data model allowing for the integration of analyses downstream of the data collection experiment, was developed. ISPyB is now a multisite, generic LIMS for synchrotron-based MX experiments. Its initial functionality has been enhanced to include improved sample tracking and reporting of experimental protocols, the direct ranking of the diffraction characteristics of individual samples, and the archiving of raw data and results from ancillary experiments and post-experiment data processing protocols. This latter feature paves the way for ISPyB to play a central role in future macromolecular structure solution pipelines and validates the application of the approach used in ISPyB to other experimental techniques, such as biological solution Small Angle X-ray Scattering and spectroscopy, which have similar sample tracking and data handling requirements.

  20. Validation of the Soil Moisture Active Passive mission using USDA-ARS experimental watersheds

    USDA-ARS?s Scientific Manuscript database

    The calibration and validation program of the Soil Moisture Active Passive mission (SMAP) relies upon an international cooperative of in situ networks to provide ground truth references across a variety of landscapes. The USDA Agricultural Research Service operates several experimental watersheds wh...

  1. MRI-based modeling for radiocarpal joint mechanics: validation criteria and results for four specimen-specific models.

    PubMed

    Fischer, Kenneth J; Johnson, Joshua E; Waller, Alexander J; McIff, Terence E; Toby, E Bruce; Bilgen, Mehmet

    2011-10-01

    The objective of this study was to validate the MRI-based joint contact modeling methodology in the radiocarpal joints by comparison of model results with invasive specimen-specific radiocarpal contact measurements from four cadaver experiments. We used a single validation criterion for multiple outcome measures to characterize the utility and overall validity of the modeling approach. For each experiment, a Pressurex film and a Tekscan sensor were sequentially placed into the radiocarpal joints during simulated grasp. Computer models were constructed based on MRI visualization of the cadaver specimens without load. Images were also acquired during the loaded configuration used with the direct experimental measurements. Geometric surface models of the radius, scaphoid and lunate (including cartilage) were constructed from the images acquired without the load. The carpal bone motions from the unloaded state to the loaded state were determined using a series of 3D image registrations. Cartilage thickness was assumed uniform at 1.0 mm with an effective compressive modulus of 4 MPa. Validation was based on experimental versus model contact area, contact force, average contact pressure and peak contact pressure for the radioscaphoid and radiolunate articulations. Contact area was also measured directly from images acquired under load and compared to the experimental and model data. Qualitatively, there was good correspondence between the MRI-based model data and experimental data, with consistent relative size, shape and location of radioscaphoid and radiolunate contact regions. Quantitative data from the model generally compared well with the experimental data for all specimens. Contact area from the MRI-based model was very similar to the contact area measured directly from the images. 
For all outcome measures except average and peak pressures, at least two specimen models met the validation criteria with respect to experimental measurements for both articulations. Only the model for one specimen met the validation criteria for average and peak pressure of both articulations; however, the experimental measures for peak pressure also exhibited high variability. MRI-based modeling can reliably be used for evaluating contact area and contact force, with confidence similar to that of currently available experimental techniques. Average contact pressure and peak contact pressure were more variable across all measurement techniques, and these measures from MRI-based modeling should be used with some caution.
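The idea of applying a single validation criterion across multiple outcome measures, as this record describes, can be sketched as a simple tolerance check; the 50% fractional tolerance and the numbers below are hypothetical illustrations, not the study's actual criterion or data.

```python
# Hedged sketch: one validation criterion applied to several outcome measures.
def validates(model, experiment, tol=0.5):
    """True only if every outcome measure agrees within a fractional tolerance."""
    return all(
        abs(model[k] - experiment[k]) <= tol * abs(experiment[k])
        for k in experiment
    )

# Hypothetical model vs. experimental values for one specimen/articulation
model = {"contact_area_mm2": 48.0, "contact_force_N": 11.0, "peak_MPa": 6.5}
exper = {"contact_area_mm2": 55.0, "contact_force_N": 12.5, "peak_MPa": 3.1}

# Area and force agree within tolerance, but peak pressure does not,
# so the model fails the combined criterion.
print(validates(model, exper))
```

This mirrors the record's finding that area and force validated more readily than peak pressure: a single all-measures criterion fails as soon as the most variable measure falls outside tolerance.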

  2. Experimental Validation of ARFI Surveillance of Subcutaneous Hemorrhage (ASSH) Using Calibrated Infusions in a Tissue-Mimicking Model and Dogs.

    PubMed

    Geist, Rebecca E; DuBois, Chase H; Nichols, Timothy C; Caughey, Melissa C; Merricks, Elizabeth P; Raymer, Robin; Gallippi, Caterina M

    2016-09-01

    Acoustic radiation force impulse (ARFI) Surveillance of Subcutaneous Hemorrhage (ASSH) has been previously demonstrated to differentiate bleeding phenotype and responses to therapy in dogs and humans, but to date, the method has lacked experimental validation. This work explores experimental validation of ASSH in a poroelastic tissue-mimic and in vivo in dogs. The experimental design exploits calibrated flow rates and infusion durations of evaporated milk in tofu or heparinized autologous blood in dogs. The validation approach enables controlled comparisons of ASSH-derived bleeding rate (BR) and time to hemostasis (TTH) metrics. In tissue-mimicking experiments, halving the calibrated flow rate yielded ASSH-derived BRs that decreased by 44% to 48%. Furthermore, for calibrated flow durations of 5.0 minutes and 7.0 minutes, average ASSH-derived TTH was 5.2 minutes and 7.0 minutes, respectively, with ASSH predicting the correct TTH in 78% of trials. In dogs undergoing calibrated autologous blood infusion, ASSH measured a 3-minute increase in TTH, corresponding to the same increase in the calibrated flow duration. For a measured 5% decrease in autologous infusion flow rate, ASSH detected a 7% decrease in BR. These tissue-mimicking and in vivo preclinical experimental validation studies suggest the ASSH BR and TTH measures reflect bleeding dynamics. © The Author(s) 2015.
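A hedged sketch of how bleeding rate (BR) and time to hemostasis (TTH) metrics of the kind validated above might be derived from a time series of estimated hemorrhage volumes; the series, threshold, and definitions below are invented for illustration and are not ASSH's actual processing pipeline.

```python
# Hedged sketch: BR and TTH from a hypothetical hemorrhage-volume time series.
def bleeding_metrics(times_min, volumes_ml, rate_threshold=0.05):
    """Return (BR, TTH): BR is the peak interval rate (mL/min); TTH is the
    first time the interval rate falls below rate_threshold (mL/min)."""
    rates = [
        (volumes_ml[i + 1] - volumes_ml[i]) / (times_min[i + 1] - times_min[i])
        for i in range(len(times_min) - 1)
    ]
    br = max(rates)
    tth = next(
        (times_min[i + 1] for i, r in enumerate(rates) if r < rate_threshold),
        times_min[-1],  # hemostasis not reached within the record
    )
    return br, tth

# Synthetic calibrated infusion: steady filling for 5 min, then flow stops
t = [0.0, 1.0, 2.0, 3.0, 5.0, 7.0]
v = [0.0, 0.4, 0.8, 1.2, 2.0, 2.02]
br, tth = bleeding_metrics(t, v)
print(f"BR = {br:.2f} mL/min, TTH = {tth:.1f} min")
```

With a calibrated flow as in the validation design, halving the infusion rate would halve BR, and lengthening the infusion would push TTH out by the same amount, which is exactly the behavior the study checks ASSH against.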

  3. Scaling Analysis Techniques to Establish Experimental Infrastructure for Component, Subsystem, and Integrated System Testing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sabharwall, Piyush; O'Brien, James E.; McKellar, Michael G.

    2015-03-01

Hybrid energy system research has the potential to expand the application of nuclear reactor technology beyond electricity. The purpose of this research is to reduce both the technical and economic risks associated with energy systems of the future. Nuclear hybrid energy systems (NHES) mitigate the variability of renewable energy sources, provide opportunities to produce revenue from different product streams, and avoid capital inefficiencies by matching electrical output to demand and by using excess generation capacity for other purposes when it is available. An essential step in the commercialization and deployment of this advanced technology is scaled testing to demonstrate the integrated dynamic performance of advanced systems and components when risks cannot be mitigated adequately by analysis or simulation. Further testing in a prototypical environment is needed for validation and higher confidence. This research supports the development of advanced nuclear reactor technology and NHES, and their adaptation to commercial industrial applications that will potentially advance U.S. energy security, economy, and reliability and further reduce carbon emissions. Experimental infrastructure development for testing and feasibility studies of coupled systems can similarly support other projects with similar developmental needs and can generate the data required for validation of models in thermal energy storage and transport and energy conversion process development. Experiments performed in the Systems Integration Laboratory will acquire performance data, identify scalability issues, and quantify technology gaps and needs for various hybrid and other energy systems. This report discusses detailed scaling (component and integrated system) and heat transfer figures of merit that will establish the experimental infrastructure for component, subsystem, and integrated system testing to advance the technology readiness of components and systems to the level required for commercial application and demonstration under NHES.

  4. Validation of a coupled core-transport, pedestal-structure, current-profile and equilibrium model

    NASA Astrophysics Data System (ADS)

    Meneghini, O.

    2015-11-01

    The first workflow capable of predicting the self-consistent solution to the coupled core-transport, pedestal structure, and equilibrium problems from first-principles and its experimental tests are presented. Validation with DIII-D discharges in high confinement regimes shows that the workflow is capable of robustly predicting the kinetic profiles from on axis to the separatrix and matching the experimental measurements to within their uncertainty, with no prior knowledge of the pedestal height nor of any measurement of the temperature or pressure. Self-consistent coupling has proven to be essential to match the experimental results, and capture the non-linear physics that governs the core and pedestal solutions. In particular, clear stabilization of the pedestal peeling ballooning instabilities by the global Shafranov shift and destabilization by additional edge bootstrap current, and subsequent effect on the core plasma profiles, have been clearly observed and documented. In our model, self-consistency is achieved by iterating between the TGYRO core transport solver (with NEO and TGLF for neoclassical and turbulent flux), and the pedestal structure predicted by the EPED model. A self-consistent equilibrium is calculated by EFIT, while the ONETWO transport package evolves the current profile and calculates the particle and energy sources. The capabilities of such workflow are shown to be critical for the design of future experiments such as ITER and FNSF, which operate in a regime where the equilibrium, the pedestal, and the core transport problems are strongly coupled, and for which none of these quantities can be assumed to be known. Self-consistent core-pedestal predictions for ITER, as well as initial optimizations, will be presented. Supported by the US Department of Energy under DE-FC02-04ER54698, DE-SC0012652.

  5. Experimental validation of coil phase parametrisation on ASDEX Upgrade, and extension to ITER

    NASA Astrophysics Data System (ADS)

    Ryan, D. A.; Liu, Y. Q.; Kirk, A.; Suttrop, W.; Dudson, B.; Dunne, M.; Willensdorfer, M.; the ASDEX Upgrade team; the EUROfusion MST1 team

    2018-06-01

It has been previously demonstrated in Li et al (2016 Nucl. Fusion 56 126007) that the optimum upper/lower coil phase shift ΔΦopt for alignment of RMP coils for ELM mitigation depends sensitively on q95 and other equilibrium plasma parameters. Therefore, ΔΦopt is expected to vary widely during the current ramp of ITER plasmas, with negative implications for ELM mitigation during this period. A previously derived and numerically benchmarked parametrisation of the coil phase for optimal ELM mitigation on ASDEX Upgrade (Ryan et al 2017 Plasma Phys. Control. Fusion 59 024005) is validated against experimental measurements of ΔΦopt, made by observing the changes to the ELM frequency as the coil phase is scanned. It is shown that the parametrisation may predict the optimal coil phase to within 32° of the experimental measurement for n = 2 applied perturbations. This agreement is sufficient to ensure that ELM mitigation is not compromised by poor coil alignment. It is also found that the phase which maximises ELM mitigation is shifted from the phase which maximises density pump-out, in contrast to theoretical expectations that ELM mitigation and density pump-out have the same ΔΦul dependence. A time lag between the ELM frequency response and the density response to the RMP is suggested as the cause. The method for numerically deriving the parametrisation is repeated for the ITER coil set, using the baseline scenario as a reference equilibrium, and the parametrisation coefficients are given for future use in a feedback coil alignment system. The relative merits of square and sinusoidal toroidal current waveforms for ELM mitigation are briefly discussed.

  6. The development and validation of the Youth Actuarial Care Needs Assessment Tool for Non-Offenders (Y-ACNAT-NO).

    PubMed

    Assink, Mark; van der Put, Claudia E; Oort, Frans J; Stams, Geert Jan J M

    2015-03-04

In The Netherlands, police officers not only come into contact with juvenile offenders, but also with a large number of juveniles who were involved in a criminal offense, but not in the role of a suspect (i.e., juvenile non-offenders). Until now, no valid and reliable instrument was available that could be used by Dutch police officers for estimating the risk for future care needs of juvenile non-offenders. In the present study, the Youth Actuarial Care Needs Assessment Tool for Non-Offenders (Y-ACNAT-NO) was developed for predicting the risk for future care needs, consisting of (1) a future supervision order as imposed by a juvenile court judge and (2) future worrisome incidents involving child abuse, domestic violence/strife, and/or sexually offensive behavior at the juvenile's living address (i.e., problems in the child-rearing environment). Police records of 3,200 juveniles were retrieved from the Dutch police registration system, after which the sample was randomly split into a construction sample (n = 1,549) and a validation sample (n = 1,651). The Y-ACNAT-NO was developed by performing an Exhaustive CHAID analysis on the construction sample. The predictive validity of the instrument was examined in the validation sample by calculating several performance indicators that assess discrimination and calibration. The CHAID output yielded an instrument that consisted of six variables and eleven different risk groups. The risk for future care needs ranged from 0.06 in the lowest risk group to 0.83 in the highest risk group. The AUC value in the validation sample was .764 (95% CI [.743, .784]) and Sander's calibration score indicated an average assessment error of 3.74% in risk estimates per risk category. The Y-ACNAT-NO is the first instrument that can be used by Dutch police officers for estimating the risk for future care needs of juvenile non-offenders.
The predictive validity of the Y-ACNAT-NO in terms of discrimination and calibration was sufficient to justify its use as an initial screening instrument when a decision is needed about referring a juvenile for further assessment of care needs.
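The discrimination check reported for the Y-ACNAT-NO (an AUC of .764 in the validation sample) can be illustrated with a minimal Mann-Whitney implementation of the AUC; the risk scores and outcome labels below are made-up examples in the spirit of the instrument's risk groups, not study data.

```python
# Hedged sketch: AUC as the Mann-Whitney probability that a randomly chosen
# positive case receives a higher risk score than a randomly chosen negative.
def auc(scores, labels):
    """AUC = P(score of a positive > score of a negative); ties count 0.5."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum(
        1.0 if p > n else 0.5 if p == n else 0.0
        for p in pos for n in neg
    )
    return wins / (len(pos) * len(neg))

# Hypothetical risk-group scores (in the 0.06 .. 0.83 range quoted above)
# paired with observed outcomes (1 = care needs materialized)
scores = [0.06, 0.10, 0.25, 0.40, 0.40, 0.60, 0.83]
labels = [0,    0,    0,    1,    0,    1,    1]
print(f"AUC = {auc(scores, labels):.3f}")
```

Calibration, the study's second performance aspect, would additionally compare each risk group's predicted probability against the observed outcome rate in that group, which a ranking metric like the AUC does not capture.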

  7. Infrared Spectral Radiance Intercomparisons With Satellite and Aircraft Sensors

    NASA Technical Reports Server (NTRS)

    Larar, Allen M.; Zhou, Daniel K.; Liu, Xu; Smith, William L.

    2014-01-01

Measurement system validation is critical for advanced satellite sounders to reach their full potential of improving observations of the Earth's atmosphere, clouds, and surface for enabling enhancements in weather prediction, climate monitoring capability, and environmental change detection. Experimental field campaigns, focusing on satellite under-flights with well-calibrated FTS sensors aboard high-altitude aircraft, are an essential part of the validation task. Airborne FTS systems can enable an independent, SI-traceable measurement system validation by directly measuring the same level-1 parameters spatially and temporally coincident with the satellite sensor of interest. Continuation of aircraft under-flights for multiple satellites during multiple field campaigns enables long-term monitoring of system performance and inter-satellite cross-validation. The NASA/NPOESS Airborne Sounder Testbed-Interferometer (NAST-I) has been a significant contributor in this area by providing coincident high spectral/spatial resolution observations of infrared spectral radiances along with independently-retrieved geophysical products for comparison with like products from satellite sensors being validated. This presentation gives an overview of benefits achieved using airborne sensors such as NAST-I, utilizing examples from recent field campaigns. The methodology implemented is not only beneficial to new sensors such as the Cross-track Infrared Sounder (CrIS) flying aboard the Suomi NPP and future JPSS satellites but also of significant benefit to sensors of longer flight heritage, such as the Atmospheric InfraRed Sounder (AIRS) and the Infrared Atmospheric Sounding Interferometer (IASI) on the AQUA and METOP-A platforms, respectively, to ensure the data quality continuity important for climate and other applications. Infrared spectral radiance inter-comparisons are discussed with a particular focus on usage of NAST-I data for enabling inter-platform cross-validation.

  8. Aeroelasticity Benchmark Assessment: Subsonic Fixed Wing Program

    NASA Technical Reports Server (NTRS)

    Florance, Jennifer P.; Chwalowski, Pawel; Wieseman, Carol D.

    2010-01-01

The fundamental technical challenge in computational aeroelasticity is the accurate prediction of unsteady aerodynamic phenomena and their effect on the aeroelastic response of a vehicle. Currently, a benchmarking standard for use in validating the accuracy of computational aeroelasticity codes does not exist. Many aeroelastic data sets have been obtained in wind-tunnel and flight testing throughout the world; however, none have been globally presented or accepted as an ideal data set. There are numerous reasons for this. One reason is that such aeroelastic data sets often focus on the aeroelastic phenomena alone (flutter, for example) and do not contain associated information such as unsteady pressures and time-correlated structural dynamic deflections. Other available data sets focus solely on the unsteady pressures and do not address the aeroelastic phenomena. Other discrepancies can include omission of relevant data, such as flutter frequency, and/or the acquisition of only qualitative deflection data. In addition to these content deficiencies, all of the available data sets present both experimental and computational technical challenges. Experimental issues include facility influences, nonlinearities beyond those being modeled, and data processing. From the computational perspective, technical challenges include modeling geometric complexities, coupling between the flow and the structure, grid issues, and boundary conditions. The Aeroelasticity Benchmark Assessment task seeks to examine the existing potential experimental data sets and ultimately choose the one that is viewed as the most suitable for computational benchmarking. An initial computational evaluation of that configuration will then be performed using the Langley-developed computational fluid dynamics (CFD) software FUN3D as part of its code validation process. In addition to the benchmarking activity, this task also includes an examination of future research directions.
Researchers within the Aeroelasticity Branch will examine other experimental efforts within the Subsonic Fixed Wing (SFW) program (such as testing of the NASA Common Research Model (CRM)) and other NASA programs and assess aeroelasticity issues and research topics.

  9. Towards self-consistent plasma modelisation in presence of neoclassical tearing mode and sawteeth: effects on transport coefficients

    NASA Astrophysics Data System (ADS)

    Basiuk, V.; Huynh, P.; Merle, A.; Nowak, S.; Sauter, O.; Contributors, JET; the EUROfusion-IM Team

    2017-12-01

    The neoclassical tearing modes (NTM) increase the effective heat and particle radial transport inside the plasma, leading to a flattening of the electron and ion temperature and density profiles at a given location depending on the safety factor q rational surface (Hegna and Callen 1997 Phys. Plasmas 4 2940). In burning plasma such as in ITER, this NTM-induced increased transport could reduce significantly the fusion performance and even lead to a disruption. Validating models describing the NTM-induced transport in present experiments is thus important to help quantify this effect on future devices. In this work, we apply an NTM model to an integrated simulation of current, heat and particle transport on JET discharges using the European transport simulator. In this model, the heat and particle radial transport coefficients are modified by a Gaussian function locally centered at the NTM position and characterized by a full width proportional to the island size through a constant parameter adapted to obtain the best simulations of experimental profiles. In the simulation, the NTM model is turned on at the same time as the mode is triggered in the experiment. The island evolution is itself determined by the modified Rutherford equation, using self-consistent plasma parameters determined by the transport evolution. The achieved simulation reproduces the experimental measurements within the error bars, before and during the NTM. A small discrepancy is observed on the radial location of the island due to a shift of the position of the computed q = 3/2 surface compared to the experimental one. To explain this small shift (up to about 12% with respect to the position observed from the experimental electron temperature profiles), sensitivity studies of the NTM location as a function of the initialization parameters are presented. First results validate both the transport model and the transport modification calculated by the NTM model.
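The Gaussian transport modification described above can be sketched in a few lines. The function name, the enhancement amplitude, and the FWHM convention below are illustrative assumptions, not the European transport simulator's implementation:

```python
import math

def modified_chi(rho, chi0, rho_ntm, island_width, c_w=1.0, amp=5.0):
    """Radial heat diffusivity with an NTM-induced Gaussian enhancement.

    chi is increased by a Gaussian centred at the island position rho_ntm,
    with full width c_w * island_width (c_w is the constant parameter
    adapted to match experimental profiles; amp is an assumed amplitude).
    """
    full_width = c_w * island_width
    sigma = full_width / (2.0 * math.sqrt(2.0 * math.log(2.0)))  # FWHM -> sigma
    bump = amp * math.exp(-0.5 * ((rho - rho_ntm) / sigma) ** 2)
    return chi0 * (1.0 + bump)

# Enhancement is maximal at the island position and negligible far away.
print(modified_chi(0.6, 1.0, 0.6, 0.05))   # at the island
print(modified_chi(0.9, 1.0, 0.6, 0.05))   # far from the island
```

In a transport solver this modified diffusivity would simply replace the background chi inside the flux-surface-averaged heat equation while the island is present.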

  10. Solar Sail Models and Test Measurements Correspondence for Validation Requirements Definition

    NASA Technical Reports Server (NTRS)

    Ewing, Anthony; Adams, Charles

    2004-01-01

    Solar sails are being developed as a mission-enabling technology in support of future NASA science missions. Current efforts have advanced solar sail technology sufficient to justify a flight validation program. A primary objective of this activity is to test and validate solar sail models that are currently under development so that they may be used with confidence in future science mission development (e.g., scalable to larger sails). Both system and model validation requirements must be defined early in the program to guide design cycles and to ensure that relevant and sufficient test data will be obtained to conduct model validation to the level required. A process of model identification, model input/output documentation, model sensitivity analyses, and test measurement correspondence is required so that decisions can be made to satisfy validation requirements within program constraints.
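The model sensitivity analyses called for above often begin with one-at-a-time finite-difference derivatives of a model output with respect to each input. A minimal sketch, with a toy sail "model" and placeholder parameter names (not an actual solar sail code):

```python
def sensitivity(model, params, name, rel_step=1e-3):
    """Central finite-difference sensitivity of a scalar model output
    with respect to one input parameter.

    `model` maps a dict of parameters to a scalar; parameter names are
    illustrative placeholders.
    """
    h = abs(params[name]) * rel_step or rel_step  # fall back if param is 0
    up, dn = dict(params), dict(params)
    up[name] = params[name] + h
    dn[name] = params[name] - h
    return (model(up) - model(dn)) / (2.0 * h)

# Toy "model": characteristic acceleration ~ thrust_coeff * area / mass.
def sail_accel(p):
    return p["thrust_coeff"] * p["area"] / p["mass"]

p = {"thrust_coeff": 9.1e-6, "area": 1600.0, "mass": 150.0}
print(sensitivity(sail_accel, p, "area"))  # d(accel)/d(area)
```

Ranking these derivatives across all inputs indicates which test measurements must correspond most tightly to model outputs, which is the decision the validation-requirements process above supports.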

  11. Further Validation of a CFD Code for Calculating the Performance of Two-Stage Light Gas Guns

    NASA Technical Reports Server (NTRS)

    Bogdanoff, David W.

    2017-01-01

    Earlier validations of a higher-order Godunov code for modeling the performance of two-stage light gas guns are reviewed. These validation comparisons were made between code predictions and experimental data from the NASA Ames 1.5" and 0.28" guns and covered muzzle velocities of 6.5 to 7.2 km/s. In the present report, five more series of code validation comparisons involving experimental data from the Ames 0.22" (1.28" pump tube diameter), 0.28", 0.50", 1.00" and 1.50" guns are presented. The total muzzle velocity range of the validation data presented herein is 3 to 11.3 km/s. The agreement between the experimental data and CFD results is judged to be very good. Muzzle velocities were predicted within 0.35 km/s for 74% of the cases studied, with maximum differences of 0.5 km/s and, for 4 out of 50 cases, 0.5 - 0.7 km/s.
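Agreement statistics of the kind quoted above (fraction of shots within a velocity tolerance, worst-case difference) reduce to a few lines; the sample velocities below are illustrative, not the report's shot data:

```python
def agreement_stats(predicted, measured, tol=0.35):
    """Fraction of shots whose predicted muzzle velocity falls within
    `tol` km/s of the measurement, plus the worst-case difference.
    Velocities in km/s.
    """
    diffs = [abs(p - m) for p, m in zip(predicted, measured)]
    within = sum(d <= tol for d in diffs) / len(diffs)
    return within, max(diffs)

pred = [6.5, 7.2, 9.0, 11.0, 3.1]   # illustrative CFD predictions
meas = [6.4, 7.5, 9.2, 11.3, 3.0]   # illustrative measurements
frac, worst = agreement_stats(pred, meas)
print(frac, worst)
```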

  12. Semi-Empirical Validation of the Cross-Band Relative Absorption Technique for the Measurement of Molecular Mixing Ratios

    NASA Technical Reports Server (NTRS)

    Pliutau, Denis; Prasad, Narasimha S

    2013-01-01

    Studies were performed to carry out semi-empirical validation of a new measurement approach we propose for the determination of molecular mixing ratios. The approach is based on relative measurements in bands of O2 and other molecules and as such may be best described as cross-band relative absorption (CoBRA). The current validation studies rely upon well-verified and established theoretical and experimental databases, satellite data assimilations and modeling codes such as HITRAN, the line-by-line radiative transfer model (LBLRTM), and the modern-era retrospective analysis for research and applications (MERRA). The approach holds promise for atmospheric mixing ratio measurements of CO2 and a variety of other molecules currently under investigation for several future satellite lidar missions. One of the advantages of the method is a significant reduction of the temperature sensitivity uncertainties, which is illustrated with application to the ASCENDS mission for the measurement of CO2 mixing ratios (XCO2). Additional advantages of the method include the possibility to closely match cross-band weighting function combinations, which is harder to achieve using conventional differential absorption techniques, and the potential for additional corrections for water vapor and other interferences without using the data from numerical weather prediction (NWP) models.
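The cross-band idea of referencing a CO2 band against an O2 band can be illustrated with a strongly simplified single-layer sketch. This is a sketch of the concept only, not the CoBRA algorithm, and every number below (optical depths, cross sections) is illustrative:

```python
def mixing_ratio_from_bands(tau_co2, tau_o2, sigma_co2, sigma_o2, x_o2=0.2095):
    """Estimate a dry-air CO2 mixing ratio from relative absorption in a
    CO2 band and an O2 reference band.

    Single-layer simplification: each optical depth tau ~ sigma * N, so
    N_co2/N_o2 = (tau_co2/sigma_co2) / (tau_o2/sigma_o2), and multiplying
    by the known O2 dry-air mixing ratio (0.2095) gives X_CO2.
    """
    n_ratio = (tau_co2 / sigma_co2) / (tau_o2 / sigma_o2)
    return n_ratio * x_o2

# Illustrative numbers chosen so the result lands near present-day XCO2.
x = mixing_ratio_from_bands(tau_co2=0.5, tau_o2=0.7,
                            sigma_co2=1.0e-22, sigma_o2=2.67e-25)
print(x * 1e6, "ppm")
```

Because both optical depths are measured through the same air mass, many common-mode errors cancel in the ratio, which is the motivation for referencing against O2.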

  13. Human genetics as a model for target validation: finding new therapies for diabetes.

    PubMed

    Thomsen, Soren K; Gloyn, Anna L

    2017-06-01

    Type 2 diabetes is a global epidemic with major effects on healthcare expenditure and quality of life. Currently available treatments are inadequate for the prevention of comorbidities, yet progress towards new therapies remains slow. A major barrier is the insufficiency of traditional preclinical models for predicting drug efficacy and safety. Human genetics offers a complementary model to assess causal mechanisms for target validation. Genetic perturbations are 'experiments of nature' that provide a uniquely relevant window into the long-term effects of modulating specific targets. Here, we show that genetic discoveries over the past decades have accurately predicted (now known) therapeutic mechanisms for type 2 diabetes. These findings highlight the potential for use of human genetic variation for prospective target validation, and establish a framework for future applications. Studies into rare, monogenic forms of diabetes have also provided proof-of-principle for precision medicine, and the applicability of this paradigm to complex disease is discussed. Finally, we highlight some of the limitations that are relevant to the use of genome-wide association studies (GWAS) in the search for new therapies for diabetes. A key outstanding challenge is the translation of GWAS signals into disease biology and we outline possible solutions for tackling this experimental bottleneck.

  14. Validation of MHD Models using MST RFP Plasmas

    NASA Astrophysics Data System (ADS)

    Jacobson, C. M.; Chapman, B. E.; den Hartog, D. J.; McCollam, K. J.; Sarff, J. S.; Sovinec, C. R.

    2017-10-01

    Rigorous validation of computational models used in fusion energy sciences over a large parameter space and across multiple magnetic configurations can increase confidence in their ability to predict the performance of future devices. MST is a well diagnosed reversed-field pinch (RFP) capable of operation with plasma current ranging from 60 kA to 500 kA. The resulting Lundquist number S, a key parameter in resistive magnetohydrodynamics (MHD), ranges from 4×10^4 to 8×10^6 for standard RFP plasmas and provides substantial overlap with MHD RFP simulations. MST RFP plasmas are simulated using both DEBS, a nonlinear single-fluid visco-resistive MHD code, and NIMROD, a nonlinear extended MHD code, with S ranging from 10^4 to 10^5 for single-fluid runs, and the magnetic Prandtl number Pm = 1. Validation metric comparisons are presented, focusing on how normalized magnetic fluctuations b at the edge scale with S. Preliminary results for the dominant n = 6 mode are b ∝ S^(-0.20 ± 0.02) for single-fluid NIMROD, b ∝ S^(-0.25 ± 0.05) for DEBS, and b ∝ S^(-0.20 ± 0.02) for experimental measurements; however, there is a significant discrepancy in mode amplitudes. Preliminary two-fluid NIMROD results are also presented. Work supported by US DOE.
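A scaling exponent of the form b ∝ S^α, like the edge-fluctuation results above, is typically extracted by linear regression in log-log space. A generic sketch on synthetic data (not the MST measurements):

```python
import math

def fit_power_law(s_values, b_values):
    """Least-squares fit of b = C * S**alpha via linear regression of
    log(b) on log(S); returns (alpha, C).
    """
    xs = [math.log(s) for s in s_values]
    ys = [math.log(b) for b in b_values]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    alpha = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
            sum((x - mx) ** 2 for x in xs)
    c = math.exp(my - alpha * mx)
    return alpha, c

# Synthetic data generated with alpha = -0.20 is recovered exactly.
S = [1e4, 3e4, 1e5, 3e5, 1e6]
b = [0.05 * s ** -0.20 for s in S]
alpha, c = fit_power_law(S, b)
print(alpha, c)
```

With real fluctuation data, the residual scatter about the log-log fit supplies the quoted uncertainty on the exponent.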

  15. Review and assessment of turbulence models for hypersonic flows

    NASA Astrophysics Data System (ADS)

    Roy, Christopher J.; Blottner, Frederick G.

    2006-10-01

    Accurate aerodynamic prediction is critical for the design and optimization of hypersonic vehicles. Turbulence modeling remains a major source of uncertainty in the computational prediction of aerodynamic forces and heating for these systems. The first goal of this article is to update the previous comprehensive review of hypersonic shock/turbulent boundary-layer interaction experiments published in 1991 by Settles and Dodson (Hypersonic shock/boundary-layer interaction database. NASA CR 177577, 1991). In their review, Settles and Dodson developed a methodology for assessing experiments appropriate for turbulence model validation and critically surveyed the existing hypersonic experiments. We limit the scope of our current effort by considering only two-dimensional (2D)/axisymmetric flows in the hypersonic flow regime where calorically perfect gas models are appropriate. We extend the prior database of recommended hypersonic experiments (on four 2D and two 3D shock-interaction geometries) by adding three new geometries. The first two geometries, the flat plate/cylinder and the sharp cone, are canonical, zero-pressure gradient flows which are amenable to theory-based correlations, and these correlations are discussed in detail. The third geometry added is the 2D shock impinging on a turbulent flat plate boundary layer. The current 2D hypersonic database for shock-interaction flows thus consists of nine experiments on five different geometries. The second goal of this study is to review and assess the validation usage of various turbulence models on the existing experimental database. Here we limit the scope to one- and two-equation turbulence models where integration to the wall is used (i.e., we omit studies involving wall functions). A methodology for validating turbulence models is given, followed by an extensive evaluation of the turbulence models on the current hypersonic experimental database. 
A total of 18 one- and two-equation turbulence models are reviewed, and results of turbulence model assessments for the six models that have been extensively applied to the hypersonic validation database are compiled and presented in graphical form. While some of the turbulence models do provide reasonable predictions for the surface pressure, the predictions for surface heat flux are generally poor, and often in error by a factor of four or more. In the vast majority of the turbulence model validation studies we review, the authors fail to adequately address the numerical accuracy of the simulations (i.e., discretization and iterative error) and the sensitivities of the model predictions to freestream turbulence quantities or near-wall y+ mesh spacing. We recommend new hypersonic experiments be conducted which (1) measure not only surface quantities but also mean and fluctuating quantities in the interaction region and (2) provide careful estimates of both random experimental uncertainties and correlated bias errors for the measured quantities and freestream conditions. For the turbulence models, we recommend that a wide range of turbulence models (including newer models) be re-examined on the current hypersonic experimental database, including the more recent experiments. Any future turbulence model validation efforts should carefully assess the numerical accuracy and model sensitivities. In addition, model corrections (e.g., compressibility corrections) should be carefully examined for their effects on a standard, low-speed validation database. Finally, as new experiments or direct numerical simulation data become available with information on mean and fluctuating quantities, they should be used to improve the turbulence models and thus increase their predictive capability.
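One standard way to address the discretization-error assessment recommended above is Richardson extrapolation on three systematically refined grids, as in grid-convergence procedures of the ASME V&V family. A sketch on synthetic, manufactured heat-flux values:

```python
import math

def observed_order(f_coarse, f_medium, f_fine, r):
    """Observed order of accuracy from solutions on three systematically
    refined grids with constant refinement ratio r:
        p = ln((f_coarse - f_medium) / (f_medium - f_fine)) / ln(r)
    """
    return math.log((f_coarse - f_medium) / (f_medium - f_fine)) / math.log(r)

def richardson_estimate(f_medium, f_fine, r, p):
    """Richardson-extrapolated estimate of the grid-converged value."""
    return f_fine + (f_fine - f_medium) / (r ** p - 1.0)

# Synthetic heat-flux values with exact value 10 and second-order error:
# f(h) = 10 + 4*h^2 on grids h = 0.4, 0.2, 0.1 (r = 2).
f3, f2, f1 = 10 + 4 * 0.4**2, 10 + 4 * 0.2**2, 10 + 4 * 0.1**2
p = observed_order(f3, f2, f1, r=2.0)
print(p, richardson_estimate(f2, f1, 2.0, p))
```

Reporting the observed order alongside the extrapolated value lets readers judge whether a heat-flux prediction is grid-converged before any turbulence-model conclusions are drawn.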

  16. Knowledge Based Systems (KBS) Verification, Validation, Evaluation, and Testing (VVE&T) Bibliography: Topical Categorization

    DTIC Science & Technology

    2003-03-01

    Different?," Jour. of Experimental & Theoretical Artificial Intelligence, Special Issue on AI for Systems Validation and Verification, 12(4), 2000, pp...Hamilton, D., "Experiences in Improving the State of Practice in Verification and Validation of Knowledge-Based Systems," Workshop Notes of the AAAI...Unsuspected Power of the Standard Turing Test," Jour. of Experimental & Theoretical Artificial Intelligence, 12, 2000, pp. 331-340. [30] Gaschnig

  17. Nonlinear model analysis of all-optical flip-flop and inverter operations of microring laser

    NASA Astrophysics Data System (ADS)

    Kobayashi, Naoki; Kawamura, Yusaku; Aoki, Ryosuke; Kokubun, Yasuo

    2018-03-01

    We explore a theoretical model of bistability at two adjacent lasing wavelengths from an InGaAs/InGaAsP multiple quantum well (MQW) microring laser. We show that nonlinear effects on the phase and amplitude play significant roles in the lasing operations of the microring laser. Numerical simulations indicate that all-optical flip-flop operations and inverter operations can be observed within the same device by controlling the injection current. The validity of our analysis is confirmed by a comparison of the results for numerical simulations with experimental results of the lasing spectrum. We believe that the analysis presented in this paper will be useful for the future design of all-optical signal processing devices.

  18. National Transonic Facility Wall Pressure Calibration Using Modern Design of Experiments (Invited)

    NASA Technical Reports Server (NTRS)

    Underwood, Pamela J.; Everhart, Joel L.; DeLoach, Richard

    2001-01-01

    The Modern Design of Experiments (MDOE) has been applied to wind tunnel testing at NASA Langley Research Center for several years. At Langley, MDOE has proven to be a useful and robust approach to aerodynamic testing that yields significant reductions in the cost and duration of experiments while still providing for the highest quality research results. This paper extends its application to include empty-tunnel wall pressure calibrations. These calibrations are performed in support of wall interference corrections. This paper presents the experimental objectives and the theoretical design process. To validate the tunnel-empty-calibration experiment design, preliminary response surface models calculated from previously acquired data are also presented. Finally, lessons learned and future wall interference applications of MDOE are discussed.
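A response surface model of the kind used in MDOE is, at its simplest, a low-order polynomial fit by least squares. A single-factor quadratic sketch on synthetic data (not the NTF calibration measurements):

```python
def fit_quadratic(x, y):
    """Least-squares quadratic response surface y ~ b0 + b1*x + b2*x^2,
    solved via the 3x3 normal equations X^T X b = X^T y.
    """
    s = [sum(xi ** k for xi in x) for k in range(5)]          # moments of x
    t = [sum(yi * xi ** k for xi, yi in zip(x, y)) for k in range(3)]
    a = [[s[0], s[1], s[2], t[0]],
         [s[1], s[2], s[3], t[1]],
         [s[2], s[3], s[4], t[2]]]
    # Gauss-Jordan elimination on the augmented system.
    for i in range(3):
        piv = a[i][i]
        a[i] = [v / piv for v in a[i]]
        for j in range(3):
            if j != i:
                f = a[j][i]
                a[j] = [vj - f * vi for vj, vi in zip(a[j], a[i])]
    return a[0][3], a[1][3], a[2][3]

# Data generated from y = 1 + 2x + 3x^2 is recovered exactly.
xs = [-2.0, -1.0, 0.0, 1.0, 2.0]
ys = [1 + 2 * xi + 3 * xi ** 2 for xi in xs]
print(fit_quadratic(xs, ys))
```

In an MDOE calibration the same machinery extends to several factors (Mach number, pressure port location, etc.) with cross terms, and replicated points provide the error estimates used to test model adequacy.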

  19. DE-NE0008277_PROTEUS final technical report 2018

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Enqvist, Andreas

    This project details re-evaluations of experiments of gas-cooled fast reactor (GCFR) core designs performed in the 1970s at the PROTEUS reactor and creates a series of International Reactor Physics Experiment Evaluation Project (IRPhEP) benchmarks. Currently there are no gas-cooled fast reactor (GCFR) experiments available in the International Handbook of Evaluated Reactor Physics Benchmark Experiments (IRPhEP Handbook). These experiments are excellent candidates for reanalysis and development of multiple benchmarks because they provide high-quality integral nuclear data relevant to the validation and refinement of thorium, neptunium, uranium, plutonium, iron, and graphite cross sections. It would be cost prohibitive to reproduce such a comprehensive suite of experimental data to support any future GCFR endeavors.

  20. New Robust Design Guideline for Imperfection-Sensitive Composite Launcher Structures - The DESICOS Project

    NASA Astrophysics Data System (ADS)

    Degenhardt, Richard

    2014-06-01

    The space industry demands reduced development and operating costs. Structural weight reduction by exploitation of structural reserves in composite space and aerospace structures contributes to this aim; however, it requires accurate and experimentally validated stability analysis. Currently, the potential of composite lightweight structures, which are prone to buckling, is not fully exploited, as appropriate guidelines in the field of space applications do not exist. This paper deals with the state-of-the-art advances and challenges related to coupled stability analysis of composite structures which show very complex stability behaviour. Improved design guidelines for composite structures are still under development. This paper gives a short state-of-the-art overview and presents a proposal for a future design guideline.

  1. On-chip spin-controlled orbital angular momentum directional coupling

    NASA Astrophysics Data System (ADS)

    Xie, Zhenwei; Lei, Ting; Si, Guangyuan; Du, Luping; Lin, Jiao; Min, Changjun; Yuan, Xiaocong

    2018-01-01

    Optical vortex beams have many potential applications in particle trapping, quantum encoding, optical orbital angular momentum (OAM) communications and interconnects. However, on-chip compact OAM detection is still a big challenge. Based on a holographic configuration and a spin-dependent structure design, we propose and demonstrate an on-chip spin-controlled OAM-mode directional coupler, which can couple the OAM signal to different directions according to its topological charge, while the directional coupling function can be switched on/off by altering the spin of the incident beam. Both simulation and experimental measurements verify the validity of the proposed approach. This work would benefit on-chip OAM devices for optical communications and high-dimensional quantum coding/decoding in the future.

  2. Vivaldi: visualization and validation of biomacromolecular NMR structures from the PDB.

    PubMed

    Hendrickx, Pieter M S; Gutmanas, Aleksandras; Kleywegt, Gerard J

    2013-04-01

    We describe Vivaldi (VIsualization and VALidation DIsplay; http://pdbe.org/vivaldi), a web-based service for the analysis, visualization, and validation of NMR structures in the Protein Data Bank (PDB). Vivaldi provides access to model coordinates and several types of experimental NMR data using interactive visualization tools, augmented with structural annotations and model-validation information. The service presents information about the modeled NMR ensemble, validation of experimental chemical shifts, residual dipolar couplings, distance and dihedral angle constraints, as well as validation scores based on empirical knowledge and databases. Vivaldi was designed for both expert NMR spectroscopists and casual non-expert users who wish to obtain a better grasp of the information content and quality of NMR structures in the public archive. Copyright © 2013 Wiley Periodicals, Inc.

  3. Multiscale and Multiphysics Modeling of Additive Manufacturing of Advanced Materials

    NASA Technical Reports Server (NTRS)

    Liou, Frank; Newkirk, Joseph; Fan, Zhiqiang; Sparks, Todd; Chen, Xueyang; Fletcher, Kenneth; Zhang, Jingwei; Zhang, Yunlu; Kumar, Kannan Suresh; Karnati, Sreekar

    2015-01-01

    The objective of this proposed project is to research and develop a prediction tool for advanced additive manufacturing (AAM) processes for advanced materials and develop experimental methods to provide fundamental properties and establish validation data. Aircraft structures and engines demand materials that are stronger, useable at much higher temperatures, provide less acoustic transmission, and enable more aeroelastic tailoring than those currently used. Significant improvements in properties can only be achieved by processing the materials under nonequilibrium conditions, such as AAM processes. AAM processes encompass a class of processes that use a focused heat source to create a melt pool on a substrate. Examples include Electron Beam Freeform Fabrication and Direct Metal Deposition. These types of additive processes enable fabrication of parts directly from CAD drawings. To achieve the desired material properties and geometries of the final structure, it is necessary to assess the impact of process parameters and to predict optimized conditions using numerical modeling as an effective prediction tool. The processing targets are multiple and span different spatial scales, and the associated physical phenomena are coupled across multiple physics and scales. In this project, the research work has been developed to model AAM processes in a multiscale and multiphysics approach. A macroscale model was developed to investigate the residual stresses and distortion in AAM processes. A sequentially coupled, thermomechanical, finite element model was developed and validated experimentally. The results showed the temperature distribution, residual stress, and deformation within the formed deposits and substrates. A mesoscale model was developed to include heat transfer, phase change with mushy zone, incompressible free surface flow, solute redistribution, and surface tension. Because of the excessive computing time required, a parallel computing approach was also tested. 
In addition, after investigating various methods, a Smoothed Particle Hydrodynamics Model (SPH Model) was developed to model the wire feeding process. Its computational efficiency and simple architecture make it more robust and flexible than other models. More research on material properties may be needed to realistically model the AAM processes. A microscale model was developed to investigate heterogeneous nucleation, dendritic grain growth, epitaxial growth of columnar grains, columnar-to-equiaxed transition, grain transport in the melt, and other phenomena. The orientations of the columnar grains were almost perpendicular to the direction of laser motion. Compared to similar studies in the literature, the multiple-grain morphology modeling result is of the same order of magnitude as the optical morphologies observed in the experiment. Experimental work was conducted to validate the different models. An infrared camera was incorporated as a process monitoring and validating tool to identify the solidus and mushy zones during deposition. The images were successfully processed to identify these regions. This research project has investigated the multiscale and multiphysics nature of the complex AAM processes, leading to advanced understanding of these processes. The project has also developed several modeling tools and experimental validation tools that will be very critical in the future of AAM process qualification and certification.
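As a toy illustration of the macroscale thermal side of such models, a 1D explicit finite-difference heat-diffusion step fits in a few lines. This is a deliberately minimal sketch, not the project's coupled thermomechanical finite element model, and all material numbers are illustrative:

```python
def heat_step(T, alpha, dx, dt):
    """One explicit finite-difference step of the 1D heat equation
    dT/dt = alpha * d2T/dx2, with fixed-temperature ends.
    Stable for alpha*dt/dx**2 <= 0.5.
    """
    r = alpha * dt / dx ** 2
    assert r <= 0.5, "explicit scheme unstable"
    return [T[0]] + [T[i] + r * (T[i + 1] - 2 * T[i] + T[i - 1])
                     for i in range(1, len(T) - 1)] + [T[-1]]

# A hot spot (melt-pool-like) diffusing into a cold bar.
T = [300.0] * 21
T[10] = 2000.0
for _ in range(100):
    T = heat_step(T, alpha=1e-5, dx=1e-3, dt=0.04)
print(round(T[10], 1), round(T[0], 1))
```

Production deposition models add a moving heat source, temperature-dependent properties, phase change, and mechanical coupling on top of exactly this kind of diffusion update.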

  4. Experimental validation of predicted cancer genes using FRET

    NASA Astrophysics Data System (ADS)

    Guala, Dimitri; Bernhem, Kristoffer; Ait Blal, Hammou; Jans, Daniel; Lundberg, Emma; Brismar, Hjalmar; Sonnhammer, Erik L. L.

    2018-07-01

    Huge amounts of data are generated in genome-wide experiments designed to investigate diseases with complex genetic causes. Follow-up of all potential leads produced by such experiments is currently cost-prohibitive and time consuming. Gene prioritization tools alleviate these constraints by directing further experimental efforts towards the most promising candidate targets. Recently, a gene prioritization tool called MaxLink was shown to outperform other widely used state-of-the-art prioritization tools in a large-scale in silico benchmark. An experimental validation of predictions made by MaxLink has, however, been lacking. In this study we used Fluorescence Resonance Energy Transfer, an established experimental technique for detection of protein-protein interactions, to validate potential cancer genes predicted by MaxLink. Our results provide confidence in the use of MaxLink for selection of new targets in the battle with polygenic diseases.
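The core of link-based prioritization tools such as MaxLink is ranking candidates by their direct connectivity to a seed set of known disease genes. A sketch of that idea (omitting MaxLink's significance filtering; the toy network below is illustrative):

```python
def rank_candidates(network, seeds):
    """Rank non-seed genes by their number of direct links to the seed set.

    `network` is a list of undirected (gene, gene) edges; `seeds` are the
    known disease genes.
    """
    seeds = set(seeds)
    counts = {}
    for a, b in network:
        if a in seeds and b not in seeds:
            counts[b] = counts.get(b, 0) + 1
        if b in seeds and a not in seeds:
            counts[a] = counts.get(a, 0) + 1
    return sorted(counts.items(), key=lambda kv: -kv[1])

edges = [("TP53", "g1"), ("BRCA1", "g1"), ("TP53", "g2"),
         ("g2", "g3"), ("BRCA1", "g3")]
print(rank_candidates(edges, ["TP53", "BRCA1"]))
```

Top-ranked candidates from such a scheme are exactly the genes that merit follow-up with an experimental assay like the FRET validation described above.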

  5. Solar-Diesel Hybrid Power System Optimization and Experimental Validation

    NASA Astrophysics Data System (ADS)

    Jacobus, Headley Stewart

    As of 2008, 1.46 billion people, or 22 percent of the world's population, were without electricity. Many of these people live in remote areas where decentralized generation is the only method of electrification. Most mini-grids are powered by diesel generators, but new hybrid power systems are becoming a reliable method to incorporate renewable energy while also reducing total system cost. This thesis quantifies the measurable operational costs for an experimental hybrid power system in Sierra Leone. Two software programs, Hybrid2 and HOMER, are used during the system design and subsequent analysis. Experimental data from the installed system are used to validate the two programs and to quantify the savings created by each component within the hybrid system. This thesis bridges the gap between design optimization studies that frequently lack subsequent validation and experimental hybrid system performance studies.

  6. Use of the FDA nozzle model to illustrate validation techniques in computational fluid dynamics (CFD) simulations

    PubMed Central

    Hariharan, Prasanna; D’Souza, Gavin A.; Horner, Marc; Morrison, Tina M.; Malinauskas, Richard A.; Myers, Matthew R.

    2017-01-01

    A “credible” computational fluid dynamics (CFD) model has the potential to provide a meaningful evaluation of safety in medical devices. One major challenge in establishing “model credibility” is to determine the required degree of similarity between the model and experimental results for the model to be considered sufficiently validated. This study proposes a “threshold-based” validation approach that provides a well-defined acceptance criterion, which is a function of how close the simulation and experimental results are to the safety threshold, for establishing the model validity. The validation criterion developed following the threshold approach is not only a function of Comparison Error, E (which is the difference between experiments and simulations) but also takes into account the risk to patient safety because of E. The method is applicable for scenarios in which a safety threshold can be clearly defined (e.g., the viscous shear-stress threshold for hemolysis in blood contacting devices). The applicability of the new validation approach was tested on the FDA nozzle geometry. The context of use (COU) was to evaluate if the instantaneous viscous shear stress in the nozzle geometry at Reynolds numbers (Re) of 3500 and 6500 was below the commonly accepted threshold for hemolysis. The CFD results (“S”) of velocity and viscous shear stress were compared with inter-laboratory experimental measurements (“D”). The uncertainties in the CFD and experimental results due to input parameter uncertainties were quantified following the ASME V&V 20 standard. The CFD models for both Re = 3500 and 6500 could not be sufficiently validated by performing a direct comparison between CFD and experimental results using the Student’s t-test. 
However, following the threshold-based approach, a Student’s t-test comparing |S-D| and |Threshold-S| showed that relative to the threshold, the CFD and experimental datasets for Re = 3500 were statistically similar and the model could be considered sufficiently validated for the COU. However, for Re = 6500, at certain locations where the shear stress is close to the hemolysis threshold, the CFD model could not be considered sufficiently validated for the COU. Our analysis showed that the model could be sufficiently validated either by reducing the uncertainties in experiments, simulations, and the threshold or by increasing the sample size for the experiments and simulations. The threshold approach can be applied to all types of computational models and provides an objective way of determining model credibility and for evaluating medical devices. PMID:28594889
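The threshold-based comparison can be sketched with a hand-rolled Welch t statistic on |S-D| versus |Threshold-S|. This is an illustration of the idea with synthetic shear-stress samples, not the paper's exact procedure:

```python
import math
import statistics

def threshold_validation(sim, exp, threshold):
    """Welch's t statistic comparing the comparison error |S - D| with the
    margin to the safety threshold |Threshold - S|.

    A large positive t indicates the error is statistically smaller than
    the margin, i.e. the model can be considered adequate for the context
    of use under this sketch of the approach.
    """
    e = [abs(s - d) for s, d in zip(sim, exp)]       # comparison error
    m = [abs(threshold - s) for s in sim]            # margin to threshold
    ve, vm = statistics.variance(e), statistics.variance(m)
    n = len(e)
    return (statistics.mean(m) - statistics.mean(e)) / math.sqrt(ve / n + vm / n)

sim = [120.0, 150.0, 180.0, 200.0, 160.0]   # simulated shear stress, Pa
exp = [125.0, 145.0, 190.0, 195.0, 170.0]   # measured shear stress, Pa
print(threshold_validation(sim, exp, threshold=600.0))
```

With the operating point far below the threshold, the statistic is large and validation passes; as simulated stresses approach the threshold, the margin shrinks relative to the error and the test fails, mirroring the Re = 6500 outcome described above.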

  7. Use of the FDA nozzle model to illustrate validation techniques in computational fluid dynamics (CFD) simulations.

    PubMed

    Hariharan, Prasanna; D'Souza, Gavin A; Horner, Marc; Morrison, Tina M; Malinauskas, Richard A; Myers, Matthew R

    2017-01-01

    A "credible" computational fluid dynamics (CFD) model has the potential to provide a meaningful evaluation of safety in medical devices. One major challenge in establishing "model credibility" is to determine the required degree of similarity between the model and experimental results for the model to be considered sufficiently validated. This study proposes a "threshold-based" validation approach that provides a well-defined acceptance criterion, which is a function of how close the simulation and experimental results are to the safety threshold, for establishing the model validity. The validation criterion developed following the threshold approach is not only a function of Comparison Error, E (which is the difference between experiments and simulations) but also takes into account the risk to patient safety because of E. The method is applicable for scenarios in which a safety threshold can be clearly defined (e.g., the viscous shear-stress threshold for hemolysis in blood contacting devices). The applicability of the new validation approach was tested on the FDA nozzle geometry. The context of use (COU) was to evaluate if the instantaneous viscous shear stress in the nozzle geometry at Reynolds numbers (Re) of 3500 and 6500 was below the commonly accepted threshold for hemolysis. The CFD results ("S") of velocity and viscous shear stress were compared with inter-laboratory experimental measurements ("D"). The uncertainties in the CFD and experimental results due to input parameter uncertainties were quantified following the ASME V&V 20 standard. The CFD models for both Re = 3500 and 6500 could not be sufficiently validated by performing a direct comparison between CFD and experimental results using the Student's t-test. 
However, following the threshold-based approach, a Student's t-test comparing |S-D| and |Threshold-S| showed that relative to the threshold, the CFD and experimental datasets for Re = 3500 were statistically similar and the model could be considered sufficiently validated for the COU. However, for Re = 6500, at certain locations where the shear stress is close to the hemolysis threshold, the CFD model could not be considered sufficiently validated for the COU. Our analysis showed that the model could be sufficiently validated either by reducing the uncertainties in experiments, simulations, and the threshold or by increasing the sample size for the experiments and simulations. The threshold approach can be applied to all types of computational models and provides an objective way of determining model credibility and for evaluating medical devices.

  8. Phase 1 Validation Testing and Simulation for the WEC-Sim Open Source Code

    NASA Astrophysics Data System (ADS)

    Ruehl, K.; Michelen, C.; Gunawan, B.; Bosma, B.; Simmons, A.; Lomonaco, P.

    2015-12-01

    WEC-Sim is an open source code for modeling wave energy converter (WEC) performance in operational waves, developed by Sandia and NREL and funded by the US DOE. The code is a time-domain modeling tool developed in MATLAB/SIMULINK using the multibody dynamics solver SimMechanics, and solves the WEC's governing equations of motion using the Cummins time-domain impulse response formulation in 6 degrees of freedom. The WEC-Sim code has undergone verification through code-to-code comparisons; however, validation of the code has been limited to publicly available experimental data sets. While these data sets provide preliminary code validation, the experimental tests were not explicitly designed for code validation, and as a result are limited in their ability to validate the full functionality of the WEC-Sim code. Therefore, dedicated physical model tests for WEC-Sim validation have been performed. This presentation provides an overview of the WEC-Sim validation experimental wave tank tests performed at Oregon State University's Directional Wave Basin at the Hinsdale Wave Research Laboratory. Phase 1 of experimental testing was focused on device characterization and completed in Fall 2015. Phase 2 is focused on WEC performance and scheduled for Winter 2015/2016. These experimental tests were designed explicitly to validate the performance of the WEC-Sim code and its new feature additions. Upon completion, the WEC-Sim validation data set will be made publicly available to the wave energy community. For the physical model test, a controllable model of a floating wave energy converter has been designed and constructed. The instrumentation includes state-of-the-art devices to measure pressure fields and motions in 6 DOF, as well as multi-axial load cells, torque transducers, position transducers, and encoders. The model also incorporates a fully programmable Power-Take-Off system that can be used to generate or absorb wave energy.
Numerical simulations of the experiments using WEC-Sim will be presented. These simulations highlight the code features included in the latest release of WEC-Sim (v1.2), including wave directionality, nonlinear hydrostatics and hydrodynamics, user-defined wave-elevation time series, state-space radiation, and WEC-Sim compatibility with BEMIO (an open source AQWA/WAMIT/NEMOH coefficient parser).
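For reference, the Cummins time-domain impulse-response formulation that WEC-Sim solves can be written, for a single degree of freedom, in its standard textbook form (symbols are conventional, not taken from the WEC-Sim documentation):

```latex
(m + A_{\infty})\,\ddot{X}(t)
  = -\int_{0}^{t} K(t-\tau)\,\dot{X}(\tau)\,\mathrm{d}\tau
    - K_{hs}\,X(t) + F_{exc}(t) + F_{PTO}(t)
```

where $m$ is the body mass, $A_{\infty}$ the added mass at infinite frequency, $K$ the radiation impulse-response kernel, $K_{hs}$ the hydrostatic stiffness, and $F_{exc}$ and $F_{PTO}$ the wave-excitation and power-take-off forces; WEC-Sim solves the coupled version of this equation in all 6 degrees of freedom.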

  9. Serious Gaming for Test & Evaluation of Clean-Slate (Ab Initio) National Airspace System (NAS) Designs

    NASA Technical Reports Server (NTRS)

    Allen, B. Danette; Alexandrov, Natalia

    2016-01-01

    Incremental approaches to air transportation system development inherit current architectural constraints, which, in turn, place hard bounds on system capacity, efficiency of performance, and complexity. To enable airspace operations of the future, one or more clean-slate (ab initio) airspace designs must be considered. This ab initio National Airspace System (NAS) must be capable of accommodating increased traffic density, a broader diversity of aircraft, and on-demand mobility. System and subsystem designs should scale to accommodate the inevitable demand for airspace services that include large numbers of autonomous Unmanned Aerial Vehicles and a paradigm shift in general aviation (e.g., personal air vehicles) in addition to more traditional aerial vehicles such as commercial jetliners and weather balloons. The complex and adaptive nature of ab initio designs for the future NAS requires new approaches to validation, adding a significant physical experimentation component to analytical and simulation tools. In addition to software modeling and simulation, the ability to exercise system solutions in a flight environment will be an essential aspect of validation. The NASA Langley Research Center (LaRC) Autonomy Incubator seeks to develop a flight simulation infrastructure for ab initio modeling and simulation that assumes no specific NAS architecture and models vehicle-to-vehicle behavior to examine interactions and emergent behaviors among hundreds of intelligent aerial agents exhibiting collaborative, cooperative, coordinative, selfish, and malicious behaviors. The air transportation system of the future will be a complex adaptive system (CAS) characterized by complex and sometimes unpredictable (or unpredicted) behaviors that result from temporal and spatial interactions among large numbers of participants. A CAS not only evolves with a changing environment and adapts to it; it is also closely coupled to all systems that constitute the environment.
Thus, the ecosystem that contains the system and other systems evolves with the CAS as well. The effects of the emerging adaptation and co-evolution are difficult to capture with only combined mathematical and computational experimentation. Therefore, an ab initio flight simulation environment must accommodate individual vehicles, groups of self-organizing vehicles, and large-scale infrastructure behavior. Inspired by Massively Multiplayer Online Role Playing Games (MMORPG) and Serious Gaming, the proposed ab initio simulation environment is similar to online gaming environments in which player participants interact with each other, affect their environment, and expect the simulation to persist and change regardless of any individual player's active participation.

  10. Computational-experimental approach to drug-target interaction mapping: A case study on kinase inhibitors

    PubMed Central

    Ravikumar, Balaguru; Parri, Elina; Timonen, Sanna; Airola, Antti; Wennerberg, Krister

    2017-01-01

    Due to the relatively high costs and labor required for experimental profiling of the full target space of chemical compounds, various machine learning models have been proposed as cost-effective means to advance this process by predicting the most potent compound-target interactions for subsequent verification. However, most of the model predictions lack direct experimental validation in the laboratory, making their practical benefits for drug discovery or repurposing applications largely unknown. Here, we therefore introduce and carefully test a systematic computational-experimental framework for the prediction and pre-clinical verification of drug-target interactions using a well-established kernel-based regression algorithm as the prediction model. To evaluate its performance, we first predicted unmeasured binding affinities in a large-scale kinase inhibitor profiling study, and then experimentally tested 100 compound-kinase pairs. The relatively high correlation of 0.77 (p < 0.0001) between the predicted and measured bioactivities supports the potential of the model for filling the experimental gaps in existing compound-target interaction maps. Further, we subjected the model to the more challenging task of predicting target interactions for a new candidate drug compound that lacks prior binding-profile information. As a specific case study, we used tivozanib, an investigational VEGF receptor inhibitor with a currently unknown off-target profile. Among 7 kinases with high predicted affinity, we experimentally validated 4 new off-targets of tivozanib, namely the Src-family kinases FRK and FYN A, the non-receptor tyrosine kinase ABL1, and the serine/threonine kinase SLK. Our subsequent experimental validation protocol effectively avoids any possible information leakage between the training and validation data, and therefore enables rigorous model validation for practical applications.
These results demonstrate that the kernel-based modeling approach offers practical benefits for probing novel insights into the mode of action of investigational compounds, and for the identification of new target selectivities for drug repurposing applications. PMID:28787438
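The filling-in of unmeasured affinities described above can be sketched with a generic kernel regression. The study used its own kernel-based algorithm over compound and kinase kernels; the snippet below substitutes an off-the-shelf RBF kernel ridge regression on synthetic data, so every feature, size, and parameter is an illustrative assumption.

```python
import numpy as np
from scipy.stats import pearsonr
from sklearn.kernel_ridge import KernelRidge

rng = np.random.default_rng(1)

# Synthetic stand-in for compound-kinase pair descriptors and binding
# affinities (hypothetical; not data from the study).
X = rng.normal(size=(300, 8))                  # pair feature vectors
w = rng.normal(size=8)
y = X @ w + rng.normal(scale=0.5, size=300)    # noisy "affinities"

# Train on the "measured" pairs, predict the "unmeasured" ones.
model = KernelRidge(kernel="rbf", alpha=1.0, gamma=0.05)
model.fit(X[:240], y[:240])
pred = model.predict(X[240:])

# Agreement between predictions and held-out measurements, analogous to
# the predicted-vs-measured bioactivity correlation reported in the paper.
r, p = pearsonr(pred, y[240:])
print(f"predicted vs. measured affinity: r = {r:.2f}")
```

The same train-on-measured, test-on-held-out split is what keeps the subsequent experimental validation free of information leakage.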

  11. Validation of NASA Thermal Ice Protection Computer Codes. Part 1; Program Overview

    NASA Technical Reports Server (NTRS)

    Miller, Dean; Bond, Thomas; Sheldon, David; Wright, William; Langhals, Tammy; Al-Khalil, Kamel; Broughton, Howard

    1996-01-01

    The Icing Technology Branch at NASA Lewis has been involved in an effort to validate two thermal ice protection codes developed at the NASA Lewis Research Center: LEWICE/Thermal (electrothermal de-icing & anti-icing) and ANTICE (hot-gas & electrothermal anti-icing). The Thermal Code Validation effort was designated as a priority during a 1994 'peer review' of the NASA Lewis Icing program, and was implemented as a cooperative effort with industry. During April 1996, the first of a series of experimental validation tests was conducted in the NASA Lewis Icing Research Tunnel (IRT). The purpose of the April 1996 test was to validate the electrothermal predictive capabilities of both LEWICE/Thermal and ANTICE. A heavily instrumented test article was designed and fabricated for this test, with the capability of simulating electrothermal de-icing and anti-icing modes of operation. Thermal measurements were then obtained over a range of test conditions, for comparison with analytical predictions. This paper will present an overview of the test, including a detailed description of: (1) the validation process; (2) test article design; (3) test matrix development; and (4) test procedures. Selected experimental results will be presented for de-icing and anti-icing modes of operation. Finally, the status of the validation effort at this point will be summarized. Detailed comparisons between analytical predictions and experimental results are contained in the following two papers: 'Validation of NASA Thermal Ice Protection Computer Codes: Part 2 - The Validation of LEWICE/Thermal' and 'Validation of NASA Thermal Ice Protection Computer Codes: Part 3 - The Validation of ANTICE'.

  12. De Novo Design and Experimental Characterization of Ultrashort Self-Associating Peptides

    PubMed Central

    Xue, Bo; Robinson, Robert C.; Hauser, Charlotte A. E.; Floudas, Christodoulos A.

    2014-01-01

    Self-association is a common phenomenon in biology and one that can have positive and negative impacts, from the construction of the architectural cytoskeleton of cells to the formation of fibrils in amyloid diseases. Understanding the nature and mechanisms of self-association is important for modulating these systems and for creating biologically-inspired materials. Here, we present a two-stage de novo peptide design framework that can generate novel self-associating peptide systems. The first stage uses a simulated multimeric template structure as input into the optimization-based Sequence Selection to generate low potential energy sequences. The second stage is a computational validation procedure that calculates Fold Specificity and/or Approximate Association Affinity (K*association) based on metrics that we have devised for multimeric systems. This framework was applied to the design of self-associating tripeptides using the known self-associating tripeptide, Ac-IVD, as a structural template. Six computationally predicted tripeptides (Ac-LVE, Ac-YYD, Ac-LLE, Ac-YLD, Ac-MYD, Ac-VIE) were chosen for experimental validation in order to illustrate the self-association outcomes predicted by the three metrics. Self-association and electron microscopy studies revealed that Ac-LLE formed bead-like microstructures, Ac-LVE and Ac-YYD formed fibrillar aggregates, Ac-VIE and Ac-MYD formed hydrogels, and Ac-YLD crystallized under ambient conditions. An X-ray crystallographic study was carried out on a single crystal of Ac-YLD, which revealed that each molecule adopts a β-strand conformation; these strands stack together to form parallel β-sheets. As an additional validation of the approach, the hydrogel-forming sequences of Ac-MYD and Ac-VIE were shuffled. The shuffled sequences were computationally predicted to have lower K*association values and were experimentally verified not to form hydrogels.
This illustrates the robustness of the framework in predicting self-associating tripeptides. We expect that this enhanced multimeric de novo peptide design framework will find future application in creating novel self-associating peptides based on unnatural amino acids, and inhibitor peptides of detrimental self-aggregating biological proteins. PMID:25010703

  13. Advanced Numerical Model for Irradiated Concrete

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Giorla, Alain B.

    In this report, we establish a numerical model for concrete exposed to irradiation to address these three critical points. The model accounts for creep in the cement paste and its coupling with damage, temperature and relative humidity. The shift in failure mode with the loading rate is also properly represented. The numerical model for creep has been validated and calibrated against different experiments in the literature [Wittmann, 1970, Le Roy, 1995]. Results from a simplified model are shown to showcase the ability of numerical homogenization to simulate irradiation effects in concrete. In future work, the complete model will be applied to the analysis of the irradiation experiments of Elleuch et al. [1972] and Kelly et al. [1969]. This requires a careful examination of the experimental environmental conditions, as in both cases certain critical information is missing, including the relative humidity history. A sensitivity analysis will be conducted to provide lower and upper bounds of the concrete expansion under irradiation, and to check whether the scatter in the simulated results matches that found in experiments. The numerical and experimental results will be compared in terms of expansion and loss of mechanical stiffness and strength. Both effects should be captured accordingly by the model to validate it. Once the model has been validated on these two experiments, it can be applied to simulate concrete from nuclear power plants. To do so, the materials used in these concretes must be as well characterized as possible. The main parameters required are the mechanical properties of each constituent in the concrete (aggregates, cement paste), namely the elastic modulus, the creep properties, the tensile and compressive strength, the thermal expansion coefficient, and the drying shrinkage. These can be either measured experimentally, estimated from the initial composition in the case of cement paste, or back-calculated from mechanical tests on concrete.
If some are unknown, a sensitivity analysis must be carried out to provide lower and upper bounds of the material behaviour. Finally, the model can be used as a basis to formulate a macroscopic material model for concrete subject to irradiation, which later can be used in structural analyses to estimate the structural impact of irradiation on nuclear power plants.

  14. The global status of freshwater fish age validation studies and a prioritization framework for future research

    USGS Publications Warehouse

    Pope, Kevin L.; Hamel, Martin J.; Pegg, Mark A.; Spurgeon, Jonathan J.

    2016-01-01

    Age information derived from calcified structures is commonly used to estimate recruitment, growth, and mortality for fish populations. Validation of daily or annual marks on ageing structures is often assumed, presumably due to a lack of general knowledge concerning the status of age validation studies. Therefore, the current status of freshwater fish age validation studies was summarized to show where additional effort is needed and to increase the accessibility of validation studies to researchers. In total, 1351 original peer-reviewed articles studying age in fish from freshwater systems were reviewed. Periodicity and age validation studies were found for 88 freshwater species comprising 21 fish families. The number of age validation studies has increased over the last 30 years following previous calls for more research; however, few species have validated structures spanning all life stages. In addition, few fishes of conservation concern have validated ageing structures. A prioritization framework, using a combination of eight characteristics, is offered to direct future age validation studies and close the validation information gap. Additional study using the offered prioritization framework, and increased availability of published studies that incorporate uncertainty when presenting age-based research results, are needed.

  15. High Fidelity Measurement and Modeling of Interactions between Acoustics and Heat Release in Highly-Compact, High-Pressure Flames

    DTIC Science & Technology

    2016-05-24

    experimental data. However, the time and length scales, and energy deposition rates in the canonical laboratory flames that have been studied over the...is to obtain high-fidelity experimental data critically needed to validate research codes at relevant conditions, and to develop systematic and...validated with experimental data. However, the time and length scales, and energy deposition rates in the canonical laboratory flames that have been

  16. Retrieval of Droplet size Density Distribution from Multiple field of view Cross polarized Lidar Signals: Theory and Experimental Validation

    DTIC Science & Technology

    2016-06-02

    Retrieval of droplet-size density distribution from multiple-field-of-view cross-polarized lidar signals: theory and experimental validation...theoretical and experimental studies of multiple scattering and multiple-field-of-view (MFOV) lidar detection have made possible the retrieval of cloud...droplet cloud are typical of Rayleigh scattering, with a signature close to a dipole (phase function quasi-flat and a zero-depolarization ratio

  17. Parametric Study of Advanced Mixing of Fuel/Oxidant System in High Speed Gaseous Flows and Experimental Validation Planning

    DTIC Science & Technology

    2001-08-30

    Body with Thermo-Chemical distribution of Heat-Protected System. In: Physical and Gasdynamic Phenomena in Supersonic Flows Over Bodies. Edit. By...Final Report on ISTC Contract # 1809p Parametric Study of Advanced Mixing of Fuel/Oxidant System in High Speed Gaseous Flows and Experimental...of Advanced Mixing of Fuel/Oxidant System in High Speed Gaseous Flows and Experimental Validation Planning 5c. PROGRAM ELEMENT NUMBER 5d. PROJECT

  18. Experimental validation of ultrasonic guided modes in electrical cables by optical interferometry.

    PubMed

    Mateo, Carlos; de Espinosa, Francisco Montero; Gómez-Ullate, Yago; Talavera, Juan A

    2008-03-01

    In this work, the dispersion curves of elastic waves propagating in electrical cables and in bare copper wires are obtained theoretically and validated experimentally. The theoretical model, based on the Gazis equations formulated according to the global matrix methodology, is solved numerically. Viscoelasticity and attenuation are modeled using the Kelvin-Voigt model. Experimental tests are carried out using interferometry. There is good agreement between the simulations and the experiments despite the peculiarities of electrical cables.
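The Kelvin-Voigt model mentioned above represents the material as a spring and dashpot in parallel; in its standard form (not specific to this paper), the stress-strain law and the resulting complex modulus are:

```latex
\sigma(t) = E\,\varepsilon(t) + \eta\,\frac{\mathrm{d}\varepsilon}{\mathrm{d}t}
\qquad\Longrightarrow\qquad
E^{*}(\omega) = E + i\,\omega\,\eta
```

The imaginary part $\omega\eta$ grows with frequency, which is how the model captures frequency-dependent attenuation of the guided modes.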

  19. Validation of Heavy Ion Transport Capabilities in PHITS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ronningen, Reginald M.

    The performance of the Monte Carlo code system PHITS is validated for heavy ion transport capabilities by performing simulations and comparing results against experimental data from heavy ion reactions of benchmark quality. These data are from measurements of secondary neutron production cross sections in reactions of Xe at 400 MeV/u with lithium and lead targets, measurements of neutrons outside of thick concrete and iron shields, and measurements of isotope yields produced in the fragmentation of a 140 MeV/u 48Ca beam on a beryllium target and on a tantalum target. A practical example that tests magnetic field capabilities is shown for a simulated 48Ca beam at 500 MeV/u striking a lithium target to produce the rare isotope 44Si, with ion transport through a fragmentation-reaction magnetic pre-separator. The results of this study show that PHITS performs reliably for the simulation of the radiation fields necessary for designing safe, reliable and cost-effective future high-powered heavy-ion accelerators in rare isotope beam facilities.

  20. Technical and conceptual considerations for using animated stimuli in studies of animal behavior.

    PubMed

    Chouinard-Thuly, Laura; Gierszewski, Stefanie; Rosenthal, Gil G; Reader, Simon M; Rieucau, Guillaume; Woo, Kevin L; Gerlai, Robert; Tedore, Cynthia; Ingley, Spencer J; Stowers, John R; Frommen, Joachim G; Dolins, Francine L; Witte, Klaudia

    2017-02-01

    Rapid technical advances in the field of computer animation (CA) and virtual reality (VR) have opened new avenues in animal behavior research. Animated stimuli are powerful tools as they offer standardization, repeatability, and complete control over the stimulus presented, thereby "reducing" and "replacing" the animals used, and "refining" the experimental design in line with the 3Rs. However, appropriate use of these technologies raises conceptual and technical questions. In this review, we offer guidelines for common technical and conceptual considerations related to the use of animated stimuli in animal behavior research. Following the steps required to create an animated stimulus, we discuss (I) the creation, (II) the presentation, and (III) the validation of CAs and VRs. Although our review is geared toward computer-graphically designed stimuli, considerations on presentation and validation also apply to video playbacks. CA and VR allow both new behavioral questions to be addressed and existing questions to be addressed in new ways, thus we expect a rich future for these methods in both ultimate and proximate studies of animal behavior.

  1. Technical and conceptual considerations for using animated stimuli in studies of animal behavior

    PubMed Central

    Rosenthal, Gil G.; Reader, Simon M.; Rieucau, Guillaume; Woo, Kevin L.; Gerlai, Robert; Tedore, Cynthia; Ingley, Spencer J.; Stowers, John R.; Frommen, Joachim G.; Dolins, Francine L.; Witte, Klaudia

    2017-01-01

    Rapid technical advances in the field of computer animation (CA) and virtual reality (VR) have opened new avenues in animal behavior research. Animated stimuli are powerful tools as they offer standardization, repeatability, and complete control over the stimulus presented, thereby “reducing” and “replacing” the animals used, and “refining” the experimental design in line with the 3Rs. However, appropriate use of these technologies raises conceptual and technical questions. In this review, we offer guidelines for common technical and conceptual considerations related to the use of animated stimuli in animal behavior research. Following the steps required to create an animated stimulus, we discuss (I) the creation, (II) the presentation, and (III) the validation of CAs and VRs. Although our review is geared toward computer-graphically designed stimuli, considerations on presentation and validation also apply to video playbacks. CA and VR allow both new behavioral questions to be addressed and existing questions to be addressed in new ways, thus we expect a rich future for these methods in both ultimate and proximate studies of animal behavior. PMID:29491958

  2. Critical Low-Noise Technologies Being Developed for Engine Noise Reduction Systems Subproject

    NASA Technical Reports Server (NTRS)

    Grady, Joseph E.; Civinskas, Kestutis C.

    2004-01-01

    NASA's previous Advanced Subsonic Technology (AST) Noise Reduction Program delivered the initial technologies for meeting a 10-year goal of a 10-dB reduction in total aircraft system noise. Technology Readiness Levels achieved for the engine-noise-reduction technologies ranged from 4 (rig scale) to 6 (engine demonstration). The current Quiet Aircraft Technology (QAT) project is building on those AST accomplishments to achieve the additional noise reduction needed to meet the Aerospace Technology Enterprise's 10-year goal, again validated through a combination of laboratory rig and engine demonstration tests. In order to meet the Aerospace Technology Enterprise goal for future aircraft of a 50-percent reduction in the perceived noise level, reductions of 4 dB are needed in both fan and jet noise. The primary objectives of the Engine Noise Reduction Systems (ENRS) subproject are, therefore, to develop technologies to reduce both fan and jet noise by 4 dB, to demonstrate these technologies in engine tests, and to develop and experimentally validate Computational Aero Acoustics (CAA) computer codes that will improve our ability to predict engine noise.

  3. 77 FR 485 - Wind Plant Performance-Public Meeting on Modeling and Testing Needs for Complex Air Flow...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-01-05

    ... modeling needs and experimental validation techniques for complex flow phenomena in and around off- shore... experimental validation. Ultimately, research in this area may lead to significant improvements in wind plant... meeting will consist of an initial plenary session in which invited speakers will survey available...

  4. Plasma Potential and Langmuir Probe Measurements in the Near-field Plume of the NASA-457Mv2 Hall Thruster

    NASA Technical Reports Server (NTRS)

    Shastry, Rohit; Huang, Wensheng; Herman, Daniel A.; Soulas, George C.; Kamhawi, Hani

    2012-01-01

    In order to further the design of future high-power Hall thrusters and provide experimental validation for ongoing modeling efforts, plasma potential and Langmuir probe measurements were performed on the 50-kW NASA-457Mv2. An electrostatic probe array comprised of a near-field Faraday probe, single Langmuir probe, and emissive probe was used to interrogate the near-field plume from approximately 0.1 - 2.0 mean thruster diameters downstream of the thruster exit plane at the following operating conditions: 300 V, 400 V and 500 V at 30 kW and 500 V at 50 kW. Results have shown that the acceleration zone is limited to within 0.4 mean thruster diameters of the exit plane while the high-temperature region is limited to 0.25 mean thruster diameters from the exit plane at all four operating conditions. Maximum plasma potentials in the near-field at 300 and 400 V were approximately 50 V with respect to cathode potential, while maximum electron temperatures varied from 24 - 32 eV, depending on operating condition. Isothermal lines at all operating conditions were found to strongly resemble the magnetic field topology in the high-temperature regions. This distribution was found to create regions of high temperature and low density near the magnetic poles, indicating strong, thick sheath formation along these surfaces. The data taken from this study are considered valuable for future design as well as modeling validation.

  5. Validation of a Monte Carlo simulation of the Inveon PET scanner using GATE

    NASA Astrophysics Data System (ADS)

    Lu, Lijun; Zhang, Houjin; Bian, Zhaoying; Ma, Jianhua; Feng, Qiangjin; Chen, Wufan

    2016-08-01

    The purpose of this study is to validate the application of the GATE (Geant4 Application for Tomographic Emission) Monte Carlo simulation toolkit for modeling the performance characteristics of the Siemens Inveon small-animal PET system. The simulation results were validated against experimental/published data in accordance with the NEMA NU-4 2008 protocol for standardized evaluation of the spatial resolution, sensitivity, scatter fraction (SF) and noise equivalent count rate (NECR) of a preclinical PET system. Agreement within 18% was obtained between the radial, tangential and axial spatial resolutions of the simulated and experimental results. The simulated peak NECR of the mouse-size phantom agreed with the experimental result, while for the rat-size phantom the simulated value was higher than the experimental result. The simulated and experimental SFs of the mouse- and rat-size phantoms agreed within 2%. These results demonstrate the feasibility of our GATE model to accurately simulate, within certain limits, all major performance characteristics of the Inveon PET system.

  6. Simulating effects of changing climate and CO2 emissions on soil carbon pools at the Hubbard Brook experimental forest.

    PubMed

    Dib, Alain E; Johnson, Chris E; Driscoll, Charles T; Fahey, Timothy J; Hayhoe, Katharine

    2014-05-01

    Carbon (C) sequestration in forest biomass and soils may help decrease regional C footprints and mitigate future climate change. The efficacy of these practices must be verified by monitoring and by approved calculation methods (i.e., models) to be credible in C markets. Two widely used soil organic matter models - CENTURY and RothC - were used to project changes in SOC pools after clear-cutting disturbance, as well as under a range of future climate and atmospheric carbon dioxide (CO2) scenarios. Data from the temperate, predominantly deciduous Hubbard Brook Experimental Forest (HBEF) in New Hampshire, USA, were used to parameterize and validate the models. Clear-cutting simulations demonstrated that both models can effectively simulate soil C dynamics in the northern hardwood forest when adequately parameterized. The minimum postharvest SOC predicted by RothC occurred in postharvest year 14 and was within 1.5% of the observed minimum, which occurred in year 8. CENTURY predicted the postharvest minimum SOC to occur in year 45, at a value 6.9% greater than the observed minimum; the slow response of both models to disturbance suggests that they may overestimate the time required to reach new steady-state conditions. Four climate change scenarios were used to simulate future changes in SOC pools. Climate-change simulations predicted increases in SOC by as much as 7% at the end of this century, partially offsetting future CO2 emissions. This sequestration was the product of enhanced forest productivity, and associated litter input to the soil, due to increased temperature, precipitation and CO2. The simulations also suggested that considerable losses of SOC (8-30%) could occur if forest vegetation at HBEF does not respond to changes in climate and CO2 levels. Therefore, the source/sink behavior of temperate forest soils likely depends on the degree to which forest growth is stimulated by new climate and CO2 conditions. © 2013 John Wiley & Sons Ltd.

  7. A new simple local muscle recovery model and its theoretical and experimental validation.

    PubMed

    Ma, Liang; Zhang, Wei; Wu, Su; Zhang, Zhanwu

    2015-01-01

    This study was conducted to provide theoretical and experimental validation of a local muscle recovery model. Muscle recovery has been modeled in different empirical and theoretical approaches to determine work-rest allowances for musculoskeletal disorder (MSD) prevention. However, time-related parameters and individual attributes have not been sufficiently considered in conventional approaches. A new muscle recovery model was proposed by integrating time-related task parameters and individual attributes. Theoretically, this muscle recovery model was compared to other theoretical models mathematically. Experimentally, a total of 20 subjects participated in the experimental validation. Hand grip force recovery and shoulder joint strength recovery were measured after a fatiguing operation. The recovery profile was fitted using the recovery model, and individual recovery rates were calculated after fitting. Good fits (r^2 > .8) were found for all subjects. Significant differences in recovery rates were found among different muscle groups (p < .05). The theoretical muscle recovery model was primarily validated by characterization of the recovery process after fatiguing operation. The determined recovery rate may be useful for representing individual recovery attributes.
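The fitting of an individual recovery rate from post-fatigue strength measurements can be sketched as follows; the exponential recovery form, the data points, and the parameter names are illustrative assumptions, not the paper's actual model or data.

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical recovery data: grip force (% of rested maximum) measured
# at several rest durations after a fatiguing operation.
t = np.array([0.0, 1.0, 2.0, 4.0, 8.0, 16.0])        # minutes of rest
f = np.array([60.0, 70.0, 77.0, 86.0, 94.0, 99.0])   # % of max strength

def recovery(t, f0, R):
    # f0: strength immediately after fatigue; R: individual recovery rate.
    # Assumed exponential return toward the rested maximum (100%).
    return 100.0 - (100.0 - f0) * np.exp(-R * t)

(f0_fit, R_fit), _ = curve_fit(recovery, t, f, p0=(60.0, 0.2))

# Goodness of fit, analogous to the paper's per-subject r^2 criterion.
residuals = f - recovery(t, f0_fit, R_fit)
r_squared = 1.0 - residuals.var() / f.var()
print(f"recovery rate R = {R_fit:.2f} /min, r^2 = {r_squared:.2f}")
```

Fitting the same form per subject and per muscle group yields the individual recovery rates that the study compares statistically.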

  8. Competency measurements: testing convergent validity for two measures.

    PubMed

    Cowin, Leanne S; Hengstberger-Sims, Cecily; Eagar, Sandy C; Gregory, Linda; Andrew, Sharon; Rolley, John

    2008-11-01

    This paper is a report of a study to investigate whether the Australian National Competency Standards for Registered Nurses demonstrate correlations with the Finnish Nurse Competency Scale. Competency assessment has become popular as a key regulatory requirement and performance indicator. The term competency, however, does not have a globally accepted definition, and this has the potential to create controversy, ambiguity and confusion. Variations in meaning and definitions adopted in workplaces and educational settings will affect the interpretation of research findings and have implications for the nursing profession. A non-experimental cross-sectional survey design was used with a convenience sample of 116 new graduate nurses in 2005. The second version of the Australian National Competency Standards and the Nurse Competency Scale were used to elicit responses to self-assessed competency in the transitional year (first year as a Registered Nurse). Correlational analysis of self-assessed levels of competence revealed a relationship between the Australian National Competency Standards (ANCI) and the Nurse Competency Scale (NCS). The correlation between ANCI domains and NCS factors suggests that these scales measure related dimensions. A statistically significant relationship (r = 0.75) was found between the two competency measures. Although the finding of convergent validity is insufficient to establish construct validity for competency as used in both measures in this study, it is an important step towards this goal. Future studies on relationships between competencies must take into account the validity and reliability of the tools.
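Convergent validity of this kind reduces to a Pearson correlation between paired totals on the two instruments. A minimal sketch with hypothetical per-nurse totals (`ancs` and `ncs` are illustrative numbers, not the published data):

```python
import numpy as np

def convergent_validity(scale_a, scale_b):
    """Pearson r between paired self-assessment totals on two scales."""
    return float(np.corrcoef(scale_a, scale_b)[0, 1])

# Hypothetical totals for eight nurses on the two competency measures
ancs = [72, 80, 65, 90, 77, 84, 69, 88]
ncs = [70, 83, 60, 92, 75, 80, 72, 85]
r = convergent_validity(ancs, ncs)
```

An r near the study's 0.75 would, as the authors note, support convergence but not by itself establish construct validity.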

  9. Design and validation of an advanced entrained flow reactor system for studies of rapid solid biomass fuel particle conversion and ash formation reactions

    NASA Astrophysics Data System (ADS)

    Wagner, David R.; Holmgren, Per; Skoglund, Nils; Broström, Markus

    2018-06-01

    The design and validation of a newly commissioned entrained flow reactor is described in the present paper. The reactor was designed for advanced studies of fuel conversion and ash formation in powder flames, and the capabilities of the reactor were experimentally validated using two different solid biomass fuels. The drop tube geometry was equipped with a flat flame burner to heat and support the powder flame, optical access ports, a particle image velocimetry (PIV) system for in situ conversion monitoring, and probes for extraction of gases and particulate matter. A detailed description of the system is provided based on simulations and measurements, establishing the detailed temperature distribution and gas flow profiles. Mass balance closures of approximately 98% were achieved by combining gas analysis and particle extraction. Biomass fuel particles were successfully tracked using shadow imaging PIV, and the resulting data were used to determine the size, shape, velocity, and residence time of converting particles. Successful extractive sampling of coarse and fine particles during combustion while retaining their morphology was demonstrated, opening the way for detailed, time-resolved studies of rapid ash transformation reactions; in the validation experiments, clear and systematic fractionation trends for K, Cl, S, and Si were observed for the two fuels tested. The combination of in situ access, accurate residence time estimations, and precise particle sampling for subsequent chemical analysis allows for a wide range of future studies, with implications and possibilities discussed in the paper.

  10. Implementing an X-ray validation pipeline for the Protein Data Bank

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gore, Swanand; Velankar, Sameer; Kleywegt, Gerard J., E-mail: gerard@ebi.ac.uk

    2012-04-01

    The implementation of a validation pipeline, based on community recommendations, for future depositions of X-ray crystal structures in the Protein Data Bank is described. There is an increasing realisation that the quality of the biomacromolecular structures deposited in the Protein Data Bank (PDB) archive needs to be assessed critically using established and powerful validation methods. The Worldwide Protein Data Bank (wwPDB) organization has convened several Validation Task Forces (VTFs) to advise on the methods and standards that should be used to validate all of the entries already in the PDB as well as all structures that will be deposited in the future. The recommendations of the X-ray VTF are currently being implemented in a software pipeline. Here, ongoing work on this pipeline is briefly described, as well as ways in which validation-related information could be presented to users of structural data.

  11. A method for validation of finite element forming simulation on basis of a pointwise comparison of distance and curvature

    NASA Astrophysics Data System (ADS)

    Dörr, Dominik; Joppich, Tobias; Schirmaier, Fabian; Mosthaf, Tobias; Kärger, Luise; Henning, Frank

    2016-10-01

    Thermoforming of continuously fiber reinforced thermoplastics (CFRTP) is ideally suited to thin-walled and complex-shaped products. By means of forming simulation, an initial assessment of the producibility of a specific geometry, an optimization of the forming process and the prediction of fiber reorientation due to forming are possible. Nevertheless, the applied methods need to be validated. Therefore, a method is presented that enables the calculation of error measures for the mismatch between simulation results and experimental tests, based on measurements with a conventional coordinate measuring device. As a quantitative measure describing curvature is provided, the presented method is also suitable for numerical or experimental sensitivity studies on wrinkling behavior. The applied methods for forming simulation, implemented in Abaqus/Explicit, are presented and applied to a generic geometry. The same geometry is tested experimentally, and simulation and test results are compared by the proposed validation method.

  12. Validation of an automated mite counter for Dermanyssus gallinae in experimental laying hen cages.

    PubMed

    Mul, Monique F; van Riel, Johan W; Meerburg, Bastiaan G; Dicke, Marcel; George, David R; Groot Koerkamp, Peter W G

    2015-08-01

    For integrated pest management (IPM) programs to be maximally effective, monitoring of the growth and decline of the pest populations is essential. Here, we present the validation results of a new automated monitoring device for the poultry red mite (Dermanyssus gallinae), a serious pest in laying hen facilities world-wide. This monitoring device (called an "automated mite counter") was validated in experimental laying hen cages with live birds and a growing population of D. gallinae. This validation study resulted in 17 data points of 'number of mites counted' by the automated mite counter and the 'number of mites present' in the experimental laying hen cages. The study demonstrated that the automated mite counter was able to track the D. gallinae population effectively. A wider evaluation showed that this automated mite counter can become a useful tool in IPM of D. gallinae in laying hen facilities.

  13. Validation of experimental molecular crystal structures with dispersion-corrected density functional theory calculations.

    PubMed

    van de Streek, Jacco; Neumann, Marcus A

    2010-10-01

    This paper describes the validation of a dispersion-corrected density functional theory (d-DFT) method for the purpose of assessing the correctness of experimental organic crystal structures and enhancing the information content of purely experimental data. 241 experimental organic crystal structures from the August 2008 issue of Acta Cryst. Section E were energy-minimized in full, including unit-cell parameters. The differences between the experimental and the minimized crystal structures were subjected to statistical analysis. The r.m.s. Cartesian displacement excluding H atoms upon energy minimization with flexible unit-cell parameters is selected as a pertinent indicator of the correctness of a crystal structure. All 241 experimental crystal structures are reproduced very well: the average r.m.s. Cartesian displacement for the 241 crystal structures, including 16 disordered structures, is only 0.095 Å (0.084 Å for the 225 ordered structures). R.m.s. Cartesian displacements above 0.25 Å either indicate incorrect experimental crystal structures or reveal interesting structural features such as exceptionally large temperature effects, incorrectly modelled disorder or symmetry-breaking H atoms. After validation, the method is applied to nine examples that are known to be ambiguous or subtly incorrect.
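The indicator used here is the r.m.s. Cartesian displacement of non-H atoms between the experimental and energy-minimized structures. A minimal sketch of that computation (toy coordinates in Å; real use would also handle unit-cell changes and symmetry, which are omitted):

```python
import numpy as np

def rmsd_non_h(elements, xyz_exp, xyz_min):
    """R.m.s. Cartesian displacement between experimental and
    energy-minimized coordinates, hydrogen atoms excluded (Å)."""
    mask = np.array([e != "H" for e in elements])
    d = np.asarray(xyz_exp, float)[mask] - np.asarray(xyz_min, float)[mask]
    return float(np.sqrt((d ** 2).sum(axis=1).mean()))

# Toy fragment: heavy atoms shift 0.1 Å in x; the H shifts far more
# but is ignored, as H positions are poorly determined by X-rays.
elems = ["C", "O", "H"]
xyz_exp = [[0.0, 0.0, 0.0], [1.4, 0.0, 0.0], [2.0, 0.9, 0.0]]
xyz_min = [[0.1, 0.0, 0.0], [1.5, 0.0, 0.0], [2.5, 1.5, 0.0]]
disp = rmsd_non_h(elems, xyz_exp, xyz_min)   # → 0.1
```

A structure scoring well below the paper's 0.25 Å threshold would be consistent with a correct experimental determination.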

  14. Computational and experimental analysis of the flow in an annular centrifugal contactor

    NASA Astrophysics Data System (ADS)

    Wardle, Kent E.

    The annular centrifugal contactor has been developed for solvent extraction processes for recycling used nuclear fuel. The compact size and high efficiency of these contactors have made them the choice for advanced reprocessing schemes and key equipment for a proposed future advanced fuel cycle facility. While a sufficient base of experience exists to facilitate successful operation of current contactor technology, a more complete understanding of the fluid flow within the contactor would enable further advancements in design and operation of future units and greater confidence for use of such contactors in a variety of other solvent extraction applications. This research effort has coupled computational fluid dynamics modeling with a variety of experimental measurements and observations to provide a validated, detailed analysis of the flow within the centrifugal contactor. CFD modeling of the free surface flow in the annular mixing zone using the Volume of Fluid (VOF) volume tracking method combined with Large Eddy Simulation (LES) of turbulence was found to have very good agreement with the experimental measurements and observations. A detailed study of the flow and mixing for different housing vane geometries was performed and it was found that the four straight mixing vane geometry had greater mixing for the flow rate simulated and more predictable operation over a range of low to moderate flow rates. The separation zone was also modeled, providing a useful description of the flow in this region and identifying critical design features. It is anticipated that this work will form a foundation for additional efforts at improving the design and operation of centrifugal contactors and provide a framework for progress towards simulation of solvent extraction processes.

  15. Experimental validation benchmark data for CFD of transient convection from forced to natural with flow reversal on a vertical flat plate

    DOE PAGES

    Lance, Blake W.; Smith, Barton L.

    2016-06-23

    Transient convection has been investigated experimentally for the purpose of providing Computational Fluid Dynamics (CFD) validation benchmark data. A specialized facility for validation benchmark experiments called the Rotatable Buoyancy Tunnel was used to acquire thermal and velocity measurements of flow over a smooth, vertical heated plate. The initial condition was forced convection downward, with subsequent transition to mixed convection, ending with natural convection upward after a flow reversal. Data acquisition through the transient was repeated for ensemble-averaged results. With simple flow geometry, validation data were acquired at the benchmark level. All boundary conditions (BCs) were measured and their uncertainties quantified. Temperature profiles on all four walls and the inlet were measured, as well as as-built test section geometry. Inlet velocity profiles and turbulence levels were quantified using Particle Image Velocimetry. System Response Quantities (SRQs) were measured for comparison with CFD outputs and include velocity profiles, wall heat flux, and wall shear stress. Extra effort was invested in documenting and preserving the validation data. Details about the experimental facility, instrumentation, experimental procedure, materials, BCs, and SRQs are made available through this paper; the BCs and SRQs are available for download, and the other details are included in this work.

  16. Bayesian cross-entropy methodology for optimal design of validation experiments

    NASA Astrophysics Data System (ADS)

    Jiang, X.; Mahadevan, S.

    2006-07-01

    An important concern in the design of validation experiments is how to incorporate the mathematical model in the design in order to allow conclusive comparisons of model prediction with experimental output in model assessment. The classical experimental design methods are more suitable for phenomena discovery and may result in a subjective, expensive, time-consuming and ineffective design that may adversely impact these comparisons. In this paper, an integrated Bayesian cross-entropy methodology is proposed to perform the optimal design of validation experiments incorporating the computational model. The expected cross entropy, an information-theoretic distance between the distributions of model prediction and experimental observation, is defined as a utility function to measure the similarity of two distributions. A simulated annealing algorithm is used to find optimal values of input variables through minimizing or maximizing the expected cross entropy. The measured data after testing with the optimum input values are used to update the distribution of the experimental output using Bayes theorem. The procedure is repeated to adaptively design the required number of experiments for model assessment, each time ensuring that the experiment provides effective comparison for validation. The methodology is illustrated for the optimal design of validation experiments for a three-leg bolted joint structure and a composite helicopter rotor hub component.
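The optimization loop described above can be sketched as simulated annealing over a candidate input variable, scoring each candidate by the cross-entropy between the distributions of experimental observation and model prediction. Everything below is an illustrative stand-in, not the paper's formulation: both distributions are taken as unit-variance Gaussians, and the model/experiment functions are invented so that they disagree most near x = 5.

```python
import math
import random

def gauss_cross_entropy(mu_p, sig_p, mu_q, sig_q):
    """Cross entropy H(p, q) of two Gaussians: p ~ observation, q ~ prediction."""
    return (math.log(sig_q * math.sqrt(2 * math.pi))
            + (sig_p ** 2 + (mu_p - mu_q) ** 2) / (2 * sig_q ** 2))

def anneal(utility, lo, hi, steps=5000, t0=1.0, seed=0):
    """Minimal simulated annealing minimizing utility(x) over [lo, hi]."""
    rng = random.Random(seed)
    x = best = lo
    fx = fbest = utility(x)
    for i in range(steps):
        temp = t0 * (1 - i / steps) + 1e-9
        cand = min(hi, max(lo, x + rng.gauss(0, 0.5)))
        fc = utility(cand)
        if fc < fx or rng.random() < math.exp((fx - fc) / temp):
            x, fx = cand, fc
            if fx < fbest:
                best, fbest = x, fx
    return best

# Hypothetical model vs experiment: they agree at x = 0 and x = 10 and
# disagree most at x = 5, so maximizing H (minimizing -H) picks x near 5.
model = lambda x: 2.0 * x
exper = lambda x: 2.0 * x + 25.0 - (x - 5.0) ** 2
x_star = anneal(lambda x: -gauss_cross_entropy(exper(x), 1.0, model(x), 1.0),
                0.0, 10.0)
```

In the paper's adaptive procedure, the experiment would then be run at the selected input, the observation distribution updated via Bayes' theorem, and the design step repeated.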

  17. Computational fluid dynamics modeling of laboratory flames and an industrial flare.

    PubMed

    Singh, Kanwar Devesh; Gangadharan, Preeti; Chen, Daniel H; Lou, Helen H; Li, Xianchang; Richmond, Peyton

    2014-11-01

    A computational fluid dynamics (CFD) methodology for simulating the combustion process has been validated with experimental results. Three different experimental setups were used to validate the CFD model: an industrial-scale flare and two lab-scale flames. The CFD study also involved three different fuels: C3H6/CH/Air/N2, C2H4/O2/Ar and CH4/Air. In the first setup, flare efficiency data from the Texas Commission on Environmental Quality (TCEQ) 2010 field tests were used to validate the CFD model. In the second setup, a McKenna burner with flat flames was simulated. Temperature and mass fractions of important species were compared with the experimental data. Finally, results of an experimental study done at Sandia National Laboratories to generate a lifted jet flame were used for the purpose of validation. The reduced 50-species mechanism LU 1.1, the realizable k-epsilon turbulence model, and the EDC turbulence-chemistry interaction model were used for this work. Flare efficiency, axial profiles of temperature, and mass fractions of various intermediate species obtained in the simulation were compared with experimental data, and good agreement between the profiles was clearly observed. In particular, the simulation match with the TCEQ 2010 flare tests has been significantly improved (within 5% of the data) compared to the results reported by Singh et al. in 2012. Validation of the speciated flat flame data supports the view that flares can be a primary source of formaldehyde emission.

  18. Experimental validation of a numerical model predicting the charging characteristics of Teflon and Kapton under electron beam irradiation

    NASA Technical Reports Server (NTRS)

    Hazelton, R. C.; Yadlowsky, E. J.; Churchill, R. J.; Parker, L. W.; Sellers, B.

    1981-01-01

    The effect of differential charging of spacecraft thermal control surfaces is assessed by studying the dynamics of the charging process. A program to experimentally validate a computer model of the charging process was established. Time-resolved measurements of the surface potential were obtained for samples of Kapton and Teflon irradiated with a monoenergetic electron beam. Results indicate that the computer model and experimental measurements agree well and that, for Teflon, secondary emission is the governing factor. Experimental data indicate that bulk conductivities play a significant role in the charging of Kapton.

  19. Empirical correlations of the performance of vapor-anode PX-series AMTEC cells

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huang, L.; Merrill, J.M.; Mayberry, C.

    Power systems based on AMTEC technology will be used for future NASA missions, including a Pluto-Express (PX) or Europa mission planned for approximately year 2004. AMTEC technology may also be used as an alternative to photovoltaic-based power systems for future Air Force missions. An extensive development program of Alkali-Metal Thermal-to-Electric Conversion (AMTEC) technology has been underway at the Vehicle Technologies Branch of the Air Force Research Laboratory (AFRL) in Albuquerque, New Mexico, since 1992. Under this program, numerical modeling and experimental investigations of the performance of the various multi-BASE tube, vapor-anode AMTEC cells have been and are being performed. Vacuum testing of AMTEC cells at AFRL determines the effects of changing the hot and cold end temperatures, T_hot and T_cold, and applied external load, R_ext, on the cell electric power output, current-voltage characteristics, and conversion efficiency. Test results have traditionally been used to provide feedback to cell designers, and to validate numerical models. The current work utilizes the test data to develop empirical correlations for cell output performance under various working conditions. Because the empirical correlations are developed directly from the experimental data, uncertainties arising from material properties that must be used in numerical modeling can be avoided. Empirical correlations of recent vapor-anode PX-series AMTEC cells have been developed. Based on AMTEC theory and the experimental data, the cell output power (as well as voltage and current) was correlated as a function of three parameters (T_hot, T_cold, and R_ext) for a given cell. Correlations were developed for different cells (PX-3C, PX-3A, PX-G3, and PX-5A), and were in good agreement with experimental data for these cells. Use of these correlations can greatly reduce the testing required to determine electrical performance of a given type of AMTEC cell over a wide range of operating conditions.
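The functional form of the published correlations is not given in the abstract; the fitting step itself can be illustrated with a linear least-squares surrogate in the three parameters. All coefficients and data below are synthetic, not PX-series values:

```python
import numpy as np

def fit_power_correlation(T_hot, T_cold, R_ext, P):
    """Least-squares fit of P ≈ a0 + a1*T_hot + a2*T_cold + a3*R_ext.
    A linear surrogate for illustration; the actual correlations may be
    nonlinear in these parameters."""
    X = np.column_stack([np.ones_like(T_hot), T_hot, T_cold, R_ext])
    coef, *_ = np.linalg.lstsq(X, P, rcond=None)
    return coef

# Synthetic test matrix generated from known coefficients
rng = np.random.default_rng(1)
Th = rng.uniform(900, 1100, 40)    # hot-end temperature, K
Tc = rng.uniform(500, 650, 40)     # cold-end temperature, K
Re = rng.uniform(0.5, 2.0, 40)     # external load, ohm
P = -5.0 + 0.02 * Th - 0.01 * Tc + 1.5 * Re
coef = fit_power_correlation(Th, Tc, Re, P)
```

Once fitted to measured data, such a correlation predicts output at untested (T_hot, T_cold, R_ext) combinations, which is what reduces the required testing.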

  20. Improving the psychometric properties of dot-probe attention measures using response-based computation.

    PubMed

    Evans, Travis C; Britton, Jennifer C

    2018-09-01

    Abnormal threat-related attention in anxiety disorders is most commonly assessed and modified using the dot-probe paradigm; however, poor psychometric properties of reaction-time measures may contribute to inconsistencies across studies. Typically, standard attention measures are derived from average reaction times obtained in experimentally-defined conditions; however, approaches based on experimentally-defined conditions are limited. In this study, the psychometric properties of a novel response-based computation approach to analyzing dot-probe data are compared to those of standard measures of attention. 148 adults (19.19 ± 1.42 years, 84 women) completed a standardized dot-probe task including threatening and neutral faces. We generated both standard and response-based measures of attention bias, attentional orientation, and attentional disengagement. We compared the overall internal consistency, number of trials necessary to reach internal consistency, test-retest reliability (n = 72), and criterion validity obtained using each approach. Compared to standard attention measures, response-based measures demonstrated uniformly high levels of internal consistency with relatively few trials and varying improvements in test-retest reliability. Additionally, response-based measures demonstrated specific evidence of anxiety-related associations above and beyond both standard attention measures and other confounds. Future studies are necessary to validate this approach in clinical samples. Response-based attention measures demonstrate superior psychometric properties compared to standard attention measures, which may improve the detection of anxiety-related associations and treatment-related changes in clinical samples. Copyright © 2018 Elsevier Ltd. All rights reserved.
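For contrast with the response-based approach, the standard experimentally-defined bias score is simply a difference of condition means. A sketch with hypothetical reaction times (the trial values are invented):

```python
import statistics

def attention_bias(rt_incongruent, rt_congruent):
    """Standard dot-probe bias score: mean RT when the probe replaces the
    neutral face minus mean RT when it replaces the threat face (ms).
    Positive values indicate vigilance toward threat."""
    return statistics.mean(rt_incongruent) - statistics.mean(rt_congruent)

# Hypothetical trial RTs in ms
incong = [512, 498, 530, 505]   # probe opposite the threat face
cong = [480, 470, 495, 475]     # probe behind the threat face
bias = attention_bias(incong, cong)   # → 31.25
```

Because this score aggregates over all trials in each condition, trial-level noise propagates directly into it, which is one reason such difference scores tend to show the poor reliability the study targets.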

  1. Experimental Characterization of Close-Emitter Interference in an Optical Camera Communication System

    PubMed Central

    Chavez-Burbano, Patricia; Rabadan, Jose; Perez-Jimenez, Rafael

    2017-01-01

    Due to the massive insertion of embedded cameras in a wide variety of devices and the generalized use of LED lamps, Optical Camera Communication (OCC) has been proposed as a practical solution for future Internet of Things (IoT) and smart cities applications. The influences of mobility, weather conditions, solar radiation interference, and external light sources on Visible Light Communication (VLC) schemes have been addressed in previous works. Some authors have studied the spatial intersymbol interference from close emitters within an OCC system; however, it has not been characterized or measured as a function of the different transmitted wavelengths. In this work, this interference has been experimentally characterized, and the Normalized Power Signal to Interference Ratio (NPSIR), for easily determining the interference in other implementations independently of the selected system devices, has also been proposed. A set of experiments in a darkroom, working with RGB multi-LED transmitters and a general purpose camera, was performed in order to obtain the NPSIR values and to validate the deduced equations for 2D pixel representation of real distances. These parameters were used in the simulation of a wireless sensor network scenario in a small office, where the Bit Error Rate (BER) of the communication link was calculated. The experiments show that the interference of other close emitters in terms of the distance and the used wavelength can be easily determined with the NPSIR. Finally, the simulation validates the applicability of the deduced equations for scaling the initial results into real scenarios. PMID:28677613

  2. ThermoData Engine (TDE): software implementation of the dynamic data evaluation concept. 9. Extensible thermodynamic constraints for pure compounds and new model developments.

    PubMed

    Diky, Vladimir; Chirico, Robert D; Muzny, Chris D; Kazakov, Andrei F; Kroenlein, Kenneth; Magee, Joseph W; Abdulagatov, Ilmutdin; Frenkel, Michael

    2013-12-23

    ThermoData Engine (TDE) is the first full-scale software implementation of the dynamic data evaluation concept, as reported in this journal. The present article describes the background and implementation for new additions in the latest release of TDE. Advances are in the areas of program architecture and quality improvement for automatic property evaluations, particularly for pure compounds. It is shown that selection of appropriate program architecture supports improvement of the quality of the on-demand property evaluations through application of a readily extensible collection of constraints. The basis and implementation for other enhancements to TDE are described briefly. Other enhancements include the following: (1) implementation of model-validity enforcement for specific equations that can provide unphysical results if unconstrained, (2) newly refined group-contribution parameters for estimation of enthalpies of formation for pure compounds containing carbon, hydrogen, and oxygen, (3) implementation of an enhanced group-contribution method (NIST-Modified UNIFAC) in TDE for improved estimation of phase-equilibrium properties for binary mixtures, (4) tools for mutual validation of ideal-gas properties derived through statistical calculations and those derived independently through combination of experimental thermodynamic results, (5) improvements in program reliability and function that stem directly from the recent redesign of the TRC-SOURCE Data Archival System for experimental property values, and (6) implementation of the Peng-Robinson equation of state for binary mixtures, which allows for critical evaluation of mixtures involving supercritical components. Planned future developments are summarized.

  3. Experimental Characterization of Close-Emitter Interference in an Optical Camera Communication System.

    PubMed

    Chavez-Burbano, Patricia; Guerra, Victor; Rabadan, Jose; Rodríguez-Esparragón, Dionisio; Perez-Jimenez, Rafael

    2017-07-04

    Due to the massive insertion of embedded cameras in a wide variety of devices and the generalized use of LED lamps, Optical Camera Communication (OCC) has been proposed as a practical solution for future Internet of Things (IoT) and smart cities applications. The influences of mobility, weather conditions, solar radiation interference, and external light sources on Visible Light Communication (VLC) schemes have been addressed in previous works. Some authors have studied the spatial intersymbol interference from close emitters within an OCC system; however, it has not been characterized or measured as a function of the different transmitted wavelengths. In this work, this interference has been experimentally characterized, and the Normalized Power Signal to Interference Ratio (NPSIR), for easily determining the interference in other implementations independently of the selected system devices, has also been proposed. A set of experiments in a darkroom, working with RGB multi-LED transmitters and a general purpose camera, was performed in order to obtain the NPSIR values and to validate the deduced equations for 2D pixel representation of real distances. These parameters were used in the simulation of a wireless sensor network scenario in a small office, where the Bit Error Rate (BER) of the communication link was calculated. The experiments show that the interference of other close emitters in terms of the distance and the used wavelength can be easily determined with the NPSIR. Finally, the simulation validates the applicability of the deduced equations for scaling the initial results into real scenarios.
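The paper's NPSIR definition is not reproduced in the abstract; conceptually it is a normalized signal-to-interference power ratio. A minimal stand-in that captures the idea (the normalization step and the power values are illustrative assumptions, not the published metric):

```python
import math

def sir_db(signal_power, interference_power):
    """Signal-to-interference ratio in dB. The paper's NPSIR additionally
    normalizes the powers so the ratio is device-independent; treat this
    plain SIR as an illustrative stand-in."""
    return 10 * math.log10(signal_power / interference_power)

# Hypothetical pixel-region powers from the target emitter and one
# close interfering emitter on the camera sensor
ratio = sir_db(4e-6, 1e-6)   # → ~6.02 dB
```

Tabulating such ratios against emitter separation and wavelength is what lets the interference in a new deployment be predicted without re-measuring each device combination.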

  4. Experimental aeroelasticity in wind tunnels - History, status, and future in brief

    NASA Technical Reports Server (NTRS)

    Ricketts, Rodney H.

    1993-01-01

    The state of the art of experimental aeroelasticity in the United States is assessed. A brief history of the development of ground test facilities, apparatus, and testing methods is presented. Several experimental programs are described that were previously conducted and helped to improve the state of the art. Some specific future directions for improving and enhancing experimental aeroelasticity are suggested.

  5. Progress Towards a Microgravity CFD Validation Study Using the ISS SPHERES-SLOSH Experiment

    NASA Technical Reports Server (NTRS)

    Storey, Jedediah M.; Kirk, Daniel; Marsell, Brandon (Editor); Schallhorn, Paul (Editor)

    2017-01-01

    Understanding, predicting, and controlling fluid slosh dynamics is critical to safety and improving performance of space missions when a significant percentage of the spacecraft's mass is a liquid. Computational fluid dynamics simulations can be used to predict the dynamics of slosh, but these programs require extensive validation. Many CFD programs have been validated by slosh experiments using various fluids in earth gravity, but prior to the ISS SPHERES-Slosh experiment, little experimental data for long-duration, zero-gravity slosh existed. This paper presents the current status of an ongoing CFD validation study using the ISS SPHERES-Slosh experimental data.

  6. Progress Towards a Microgravity CFD Validation Study Using the ISS SPHERES-SLOSH Experiment

    NASA Technical Reports Server (NTRS)

    Storey, Jed; Kirk, Daniel (Editor); Marsell, Brandon (Editor); Schallhorn, Paul (Editor)

    2017-01-01

    Understanding, predicting, and controlling fluid slosh dynamics is critical to safety and improving performance of space missions when a significant percentage of the spacecraft's mass is a liquid. Computational fluid dynamics simulations can be used to predict the dynamics of slosh, but these programs require extensive validation. Many CFD programs have been validated by slosh experiments using various fluids in earth gravity, but prior to the ISS SPHERES-Slosh experiment, little experimental data for long-duration, zero-gravity slosh existed. This paper presents the current status of an ongoing CFD validation study using the ISS SPHERES-Slosh experimental data.

  7. The morality of abortion and the deprivation of futures

    PubMed Central

    Brown, M.

    2000-01-01

    In an influential essay entitled "Why abortion is wrong", Donald Marquis argues that killing actual persons is wrong because it unjustly deprives victims of their future; that the fetus has a future similar in morally relevant respects to the future lost by competent adult homicide victims; and that, as a consequence, abortion is justifiable only in the same circumstances in which killing competent adult human beings is justifiable. The metaphysical claim implicit in the first premise, that actual persons have a future of value, is ambiguous. The Future Like Ours argument (FLO) would be valid if "future of value" were used consistently to mean either "potential future of value" or "self-represented future of value", and FLO would be sound if one or the other interpretation supported both the moral claim and the metaphysical claim, but if, as I argue, any interpretation which makes the argument valid renders it unsound, then FLO must be rejected. Its apparent strength derives from equivocation on the concept of "a future of value". Key Words: Abortion • Future Like Ours • Donald Marquis • potentiality • pro-choice PMID:10786320

  8. Detailed experimental investigations on flow behaviors and velocity field properties of a supersonic mixing layer

    NASA Astrophysics Data System (ADS)

    Tan, Jianguo; Zhang, Dongdong; Li, Hao; Hou, Juwei

    2018-03-01

    The flow behaviors and mixing characteristics of a supersonic mixing layer with a convective Mach number of 0.2 have been experimentally investigated utilizing nanoparticle-based planar laser scattering and particle image velocimetry techniques. The full development and evolution process, including the formation of Kelvin-Helmholtz vortices, the breakdown of large-scale structures and the establishment of self-similar turbulence, is exhibited clearly in the experiments, providing a qualitative graphical benchmark for DNS and LES results. Shocklets are captured for the first time at this low convective Mach number, and their generation mechanisms are elaborated and analyzed. The convective velocity derived from two images with space-time correlations is well consistent with the theoretical result. The pairing and merging process of large-scale vortices in the transition region is clearly revealed in the velocity vector field. The analysis of turbulent statistics indicates that in weakly compressible mixing layers, with the increase of convective Mach number, the peak values of streamwise turbulence intensity and Reynolds shear stress experience a sharp decrease, while the anisotropy ratio remains quasi-unchanged. The normalized growth rate of the present experiments shows good agreement with former experimental and DNS data. Validation of the present experimental results is important because this work can serve as a future reference for assessing the accuracy of numerical data.
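The convective-velocity estimate from space-time correlation can be sketched in one dimension: find the pixel shift that maximizes the cross-correlation of two successive intensity profiles, then convert it to a velocity with the pixel size and inter-frame time. The profile, pixel scale, and frame interval below are illustrative values, not the experiment's calibration:

```python
import numpy as np

def convective_velocity(frame_a, frame_b, dx, dt):
    """Estimate convective velocity from the lag maximizing the
    cross-correlation of two successive 1-D intensity profiles."""
    a = frame_a - np.mean(frame_a)
    b = frame_b - np.mean(frame_b)
    corr = np.correlate(b, a, mode="full")
    shift = np.argmax(corr) - (len(a) - 1)   # lag in pixels
    return shift * dx / dt

# Synthetic Gaussian structure advected 3 pixels between frames
x = np.arange(64, dtype=float)
f0 = np.exp(-0.5 * ((x - 20) / 3) ** 2)
f1 = np.exp(-0.5 * ((x - 23) / 3) ** 2)
uc = convective_velocity(f0, f1, dx=1e-4, dt=1e-6)   # 3 px shift → 300 m/s
```

Sub-pixel accuracy in practice requires interpolating around the correlation peak, which this integer-lag sketch omits.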

  9. miRSponge: a manually curated database for experimentally supported miRNA sponges and ceRNAs.

    PubMed

    Wang, Peng; Zhi, Hui; Zhang, Yunpeng; Liu, Yue; Zhang, Jizhou; Gao, Yue; Guo, Maoni; Ning, Shangwei; Li, Xia

    2015-01-01

    In this study, we describe miRSponge, a manually curated database that aims to provide an experimentally supported resource for microRNA (miRNA) sponges. Recent evidence suggests that miRNAs are themselves regulated by competing endogenous RNAs (ceRNAs), or 'miRNA sponges', that contain miRNA binding sites. These competitive molecules can sequester miRNAs to prevent them from interacting with their natural targets, and thereby play critical roles in various biological and pathological processes. It has become increasingly important to develop a high-quality database to record and store ceRNA data to support future studies. To this end, we have established the experimentally supported miRSponge database, which contains data on 599 miRNA-sponge interactions and 463 ceRNA relationships from 11 species, manually curated from nearly 1200 published articles. Database classes include endogenously generated molecules (coding genes, pseudogenes, long non-coding RNAs and circular RNAs) along with exogenously introduced molecules (viral RNAs and artificially engineered sponges). Approximately 70% of the interactions were identified experimentally in disease states. miRSponge provides a user-friendly interface for convenient browsing, retrieval and downloading of datasets. A submission page is also included to allow researchers to submit newly validated miRNA sponge data. Database URL: http://www.bio-bigdata.net/miRSponge. © The Author(s) 2015. Published by Oxford University Press.

  10. miRSponge: a manually curated database for experimentally supported miRNA sponges and ceRNAs

    PubMed Central

    Wang, Peng; Zhi, Hui; Zhang, Yunpeng; Liu, Yue; Zhang, Jizhou; Gao, Yue; Guo, Maoni; Ning, Shangwei; Li, Xia

    2015-01-01

    In this study, we describe miRSponge, a manually curated database that aims to provide an experimentally supported resource for microRNA (miRNA) sponges. Recent evidence suggests that miRNAs are themselves regulated by competing endogenous RNAs (ceRNAs), or ‘miRNA sponges’, that contain miRNA binding sites. These competitive molecules can sequester miRNAs to prevent them from interacting with their natural targets, and thereby play critical roles in various biological and pathological processes. It has become increasingly important to develop a high-quality database to record and store ceRNA data to support future studies. To this end, we have established the experimentally supported miRSponge database, which contains data on 599 miRNA-sponge interactions and 463 ceRNA relationships from 11 species, manually curated from nearly 1200 published articles. Database classes include endogenously generated molecules (coding genes, pseudogenes, long non-coding RNAs and circular RNAs) along with exogenously introduced molecules (viral RNAs and artificially engineered sponges). Approximately 70% of the interactions were identified experimentally in disease states. miRSponge provides a user-friendly interface for convenient browsing, retrieval and downloading of datasets. A submission page is also included to allow researchers to submit newly validated miRNA sponge data. Database URL: http://www.bio-bigdata.net/miRSponge. PMID:26424084

  11. When Theater Comes to Engineering Design: Oh How Creative They Can Be.

    PubMed

    Pfeiffer, Ferris M; Bauer, Rachel E; Borgelt, Steve; Burgoyne, Suzanne; Grant, Sheila; Hunt, Heather K; Pardoe, Jennie J; Schmidt, David C

    2017-07-01

    The creative process is fun, complex, and sometimes frustrating, but it is critical to the future of our nation and to progress in science, technology, engineering, and mathematics (STEM), as well as other fields. Thus, we set out to see if implementing methods of active learning typical of the theater department could impact the creativity of senior capstone design students in the bioengineering (BE) department. Senior bioengineering capstone design students were allowed to self-select into groups. Prior to the beginning of coursework, all students completed a validated survey measuring engineering design self-efficacy. The control and experimental groups both received standard instruction, but in addition the experimental group received 1 h per week of creativity training developed by a theater professor. Following the semester, the students again completed the self-efficacy survey. The surveys were examined to identify differences in the initial and final self-efficacy of the experimental and control groups over the course of the semester. An analysis of variance was used to compare the experimental and control groups, with p < 0.05 considered significant. Students in the experimental group reported a more than twofold increase in confidence (4.8 (control) versus 10.9 (experimental)). Additionally, students in the experimental group were more motivated and less anxious when engaging in engineering design following the semester of creativity instruction. The results of this pilot study indicate that there is significant potential to improve engineering students' creative self-efficacy through the implementation of a "curriculum of creativity" developed using theater methods.

  12. The Green Eating Project: web-based intervention to promote environmentally conscious eating behaviours in US university students.

    PubMed

    Monroe, Jessica T; Lofgren, Ingrid E; Sartini, Becky L; Greene, Geoffrey W

    2015-09-01

    To investigate the effectiveness of an online, interactive intervention, referred to as the Green Eating (GE) Project, to motivate university students to adopt GE behaviours. The study was quasi-experimental and integrated into courses for credit/extra credit. Courses were randomly stratified into experimental or non-treatment control. The 5-week intervention consisted of four modules based on different GE topics. Participants completed the GE survey at baseline (experimental, n 241; control, n 367) and post (experimental, n 187; control, n 304). The GE survey has been previously validated and consists of Transtheoretical Model constructs including stage of change (SOC), decisional balance (DB: Pros and Cons) and self-efficacy (SE: School and Home) as well as behaviours for GE. Modules contained basic information regarding each topic and knowledge items to assess content learning. The GE Project took place at a public university in the north-eastern USA. Participants were full-time students between the ages of 18 and 24 years. The GE Project was effective in significantly increasing GE behaviours, DB Pros, SE School and knowledge in experimental compared with control, but did not reduce DB Cons or increase SE Home. Experimental participants were also more likely to be in later SOC for GE at post testing. The GE Project was effective in increasing GE behaviours in university students. Motivating consumers towards adopting GE could help mitigate negative consequences of the food system on the environment. Future research could tailor the intervention to participant SOC to further increase the effects, or design the modules for other participants.

  13. Multi-Year Leaf-Level Response to Sub-Ambient and Elevated Experimental CO2 in Betula nana

    PubMed Central

    Broere, Tom; Kürschner, Wolfram M.; Donders, Timme H.; Wagner-Cremer, Friederike

    2016-01-01

    The strong link between stomatal frequency and CO2 in woody plants is key for understanding past CO2 dynamics, predicting future change, and evaluating the significant role of vegetation in the hydrological cycle. Experimental validation is required to evaluate the long-term adaptive leaf response of C3 plants to CO2 conditions; however, studies to date have only focused on short-term single-season experiments and may not capture (1) the full ontogeny of leaves to experimental CO2 exposure or (2) the true adjustment of structural stomatal properties to CO2, which we postulate is likely to occur over several growing seasons. We conducted controlled growth chamber experiments at 150 ppmv, 450 ppmv and 800 ppmv CO2 with woody C3 shrub Betula nana (dwarf birch) over two successive annual growing seasons and evaluated the structural stomatal response to atmospheric CO2 conditions. We find that while some adjustment of leaf morphological and stomatal parameters occurred in the first growing season where plants are exposed to experimental CO2 conditions, amplified adjustment of non-plastic stomatal properties such as stomatal conductance occurred in the second year of experimental CO2 exposure. We postulate that the species response limit to CO2 of B. nana may occur around 400–450 ppmv. Our findings strongly support the necessity for multi-annual experiments in C3 perennials in order to evaluate the effects of environmental conditions and provide a likely explanation of the contradictory results between historical and palaeobotanical records and experimental data. PMID:27285314

  14. Marvel Analysis of the Measured High-resolution Rovibronic Spectra of TiO

    NASA Astrophysics Data System (ADS)

    McKemmish, Laura K.; Masseron, Thomas; Sheppard, Samuel; Sandeman, Elizabeth; Schofield, Zak; Furtenbacher, Tibor; Császár, Attila G.; Tennyson, Jonathan; Sousa-Silva, Clara

    2017-02-01

    Accurate, experimental rovibronic energy levels, with associated labels and uncertainties, are reported for 11 low-lying electronic states of the diatomic ⁴⁸Ti¹⁶O molecule, determined using the Marvel (Measured Active Rotational-Vibrational Energy Levels) algorithm. All levels are based on lines corresponding to critically reviewed and validated high-resolution experimental spectra taken from 24 literature sources. The transition data are in the 2-22,160 cm⁻¹ region. Out of the 49,679 measured transitions, 43,885 are triplet-triplet, 5710 are singlet-singlet, and 84 are triplet-singlet transitions. A careful analysis of the resulting experimental spectroscopic network (SN) allows 48,590 transitions to be validated. The transitions determine 93 vibrational band origins of ⁴⁸Ti¹⁶O, including 71 triplet and 22 singlet ones. There are 276 (73) triplet-triplet (singlet-singlet) band-heads derived from Marvel experimental energies, 123 (38) of which have never been assigned in low- or high-resolution experiments. The highest J value, where J stands for the total angular momentum, for which an energy level is validated is 163. The number of experimentally derived triplet and singlet ⁴⁸Ti¹⁶O rovibrational energy levels is 8682 and 1882, respectively. The lists of validated lines and levels for ⁴⁸Ti¹⁶O are deposited in the supporting information to this paper.
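The core step of a Marvel-type analysis can be illustrated on a toy spectroscopic network: each measured transition fixes a difference between two energy levels, so the levels follow from a least-squares solution over all lines, with the ground level pinned at zero. The three-level system and transition wavenumbers below are invented for illustration and are not taken from the TiO data.

```python
import numpy as np

# Toy spectroscopic network: levels A (ground), B, C with unknown energies.
# Each measured transition gives one equation: E_upper - E_lower = wavenumber.
transitions = [
    ("B", "A", 100.01),  # B <- A
    ("C", "A", 249.98),  # C <- A
    ("C", "B", 150.03),  # C <- B
]

levels = ["B", "C"]                       # ground level A fixed at 0 by convention
col = {name: i for i, name in enumerate(levels)}

# Design matrix: +1 for the upper level, -1 for the lower level of each line.
A = np.zeros((len(transitions), len(levels)))
b = np.zeros(len(transitions))
for row, (up, lo, wn) in enumerate(transitions):
    if up in col:
        A[row, col[up]] = 1.0
    if lo in col:
        A[row, col[lo]] = -1.0
    b[row] = wn

# Least-squares estimate of the level energies from all lines at once.
energies, *_ = np.linalg.lstsq(A, b, rcond=None)
print({name: round(float(energies[col[name]]), 3) for name in levels})
```

With consistent measurements the residuals of this fit are small; conflicting or misassigned lines show up as large residuals, which is essentially how transitions in a spectroscopic network are validated or rejected.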

  15. A Perspective on Research on Dishonesty: Limited External Validity Due to the Lack of Possibility of Self-Selection in Experimental Designs.

    PubMed

    Houdek, Petr

    2017-01-01

    The aim of this perspective article is to show that current experimental evidence on factors influencing dishonesty has limited external validity. Most experimental studies are built on random assignment, in which control/experimental groups of subjects face varied sizes of the expected reward for behaving dishonestly, opportunities for cheating, means of rationalizing dishonest behavior, etc., and mean group reactions are observed. These studies have internal validity in assessing the causal influence of these and other factors, but they lack external validity in organizational, market and other environments. If people can opt into or out of diverse real-world environments, an experiment aimed at studying factors influencing the real-life degree of dishonesty should permit such an option. The behavior of such self-selected groups of marginal subjects would probably contain a larger level of (non)deception than the behavior of average people. The article warns that there are not many studies that enable self-selection or sorting of participants into varying environments, and that this limits current knowledge of the extent and dynamics of dishonest and fraudulent behavior. The article closes with suggestions on how to improve dishonesty research, especially how to avoid experimenter demand bias.

  16. A Perspective on Research on Dishonesty: Limited External Validity Due to the Lack of Possibility of Self-Selection in Experimental Designs

    PubMed Central

    Houdek, Petr

    2017-01-01

    The aim of this perspective article is to show that current experimental evidence on factors influencing dishonesty has limited external validity. Most experimental studies are built on random assignment, in which control/experimental groups of subjects face varied sizes of the expected reward for behaving dishonestly, opportunities for cheating, means of rationalizing dishonest behavior, etc., and mean group reactions are observed. These studies have internal validity in assessing the causal influence of these and other factors, but they lack external validity in organizational, market and other environments. If people can opt into or out of diverse real-world environments, an experiment aimed at studying factors influencing the real-life degree of dishonesty should permit such an option. The behavior of such self-selected groups of marginal subjects would probably contain a larger level of (non)deception than the behavior of average people. The article warns that there are not many studies that enable self-selection or sorting of participants into varying environments, and that this limits current knowledge of the extent and dynamics of dishonest and fraudulent behavior. The article closes with suggestions on how to improve dishonesty research, especially how to avoid experimenter demand bias. PMID:28955279

  17. Fatigue Failure of Space Shuttle Main Engine Turbine Blades

    NASA Technical Reports Server (NTRS)

    Swanson, Gregrory R.; Arakere, Nagaraj K.

    2000-01-01

    Experimental validation of finite element modeling of single crystal turbine blades is presented. Experimental results from uniaxial high cycle fatigue (HCF) test specimens and full-scale Space Shuttle Main Engine test firings with the High Pressure Fuel Turbopump/Alternate Turbopump (HPFTP/AT) provide the data used for the validation. The conclusions show the significant contribution of the crystal orientation within the blade to the resulting life of the component, that the analysis can predict this variation, and that experimental testing demonstrates it.

  18. Three phase heat and mass transfer model for unsaturated soil freezing process: Part 2 - model validation

    NASA Astrophysics Data System (ADS)

    Zhang, Yaning; Xu, Fei; Li, Bingxi; Kim, Yong-Song; Zhao, Wenke; Xie, Gongnan; Fu, Zhongbin

    2018-04-01

    This study aims to validate the three-phase heat and mass transfer model developed in the first part (Three phase heat and mass transfer model for unsaturated soil freezing process: Part 1 - model development). Experimental results from previous studies were used for the validation. The results showed that the correlation coefficients between the simulated and experimental water contents at different soil depths were between 0.83 and 0.92, and the correlation coefficients between the simulated and experimental liquid water contents at different soil temperatures were between 0.95 and 0.99. Given these high accuracies, the developed model can reliably be used to predict the water contents at different soil depths and temperatures.
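The agreement metric reported here, a correlation coefficient between simulated and measured profiles, can be computed directly. The water-content values below are made-up placeholders for illustration, not the paper's data.

```python
import numpy as np

# Hypothetical water contents (m^3/m^3) at five soil depths.
measured  = np.array([0.31, 0.28, 0.24, 0.21, 0.19])
simulated = np.array([0.30, 0.27, 0.25, 0.22, 0.18])

# Pearson correlation coefficient between experiment and model.
r = np.corrcoef(measured, simulated)[0, 1]
print(f"r = {r:.3f}")
```

Note that a high r only certifies linear co-variation; a complete validation would also check bias, e.g. via mean error or a coverage test against measurement uncertainties.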

  19. Experimental design methodologies in the optimization of chiral CE or CEC separations: an overview.

    PubMed

    Dejaegher, Bieke; Mangelings, Debby; Vander Heyden, Yvan

    2013-01-01

    In this chapter, an overview of experimental designs to develop chiral capillary electrophoresis (CE) and capillary electrochromatographic (CEC) methods is presented. Method development is generally divided into technique selection, method optimization, and method validation. In the method optimization part, two phases can often be distinguished, i.e., a screening and an optimization phase. In method validation, the method is evaluated for its fitness for purpose. One validation item that also applies experimental designs is robustness testing. In the screening phase and in robustness testing, screening designs are applied; during the optimization phase, response surface designs are used. The different design types and their application steps are discussed in this chapter and illustrated with examples of chiral CE and CEC methods.
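A two-level full factorial screening design of the kind used in the screening phase can be generated in a few lines. The three factor names are generic placeholders, not factors from any specific CE or CEC method.

```python
from itertools import product

# Factors screened at two coded levels (-1 = low, +1 = high).
factors = ["pH", "chiral selector conc.", "voltage"]

# Full two-level factorial design: 2**3 = 8 experimental runs.
design = list(product([-1, +1], repeat=len(factors)))
for run, levels in enumerate(design, start=1):
    print(run, dict(zip(factors, levels)))
```

When many factors must be screened, fractional factorial or Plackett-Burman designs cut the run count; response surface designs for the optimization phase add center and axial points so curvature can be modeled.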

  20. Validating proposed migration equation and parameters' values as a tool to reproduce and predict 137Cs vertical migration activity in Spanish soils.

    PubMed

    Olondo, C; Legarda, F; Herranz, M; Idoeta, R

    2017-04-01

    This paper shows the procedure performed to validate the migration equation and the migration parameters' values presented in a previous paper (Legarda et al., 2011) regarding the migration of 137Cs in Spanish mainland soils. This model validation has been carried out by checking experimentally obtained activity concentration values against those predicted by the model. The experimental data come from the measured vertical activity profiles of 8 new sampling points located in northern Spain. Before testing the predicted values of the model, their uncertainty was assessed with an appropriate uncertainty analysis. Once the uncertainty of the model was established, the activity concentration values, experimental versus model-predicted, were compared. Model validation was performed by analyzing the model's accuracy, both as a whole and at different depth intervals. As a result, this model has been validated as a tool to predict 137Cs behaviour in a Mediterranean environment. Copyright © 2017 Elsevier Ltd. All rights reserved.
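The comparison step described here, checking whether experimental activity concentrations agree with model predictions within stated uncertainties, amounts to a compatibility (coverage) test. The values below are invented placeholders, not data from the Spanish soil profiles.

```python
import numpy as np

# Hypothetical 137Cs activity concentrations (Bq/kg) at five depth intervals.
measured   = np.array([52.0, 31.0, 14.0, 6.0, 2.5])
predicted  = np.array([50.0, 33.0, 15.0, 5.5, 2.0])
u_measured = np.array([4.0, 3.0, 1.5, 0.8, 0.4])   # expanded uncertainties
u_model    = np.array([5.0, 3.5, 1.8, 0.9, 0.5])

# A prediction is compatible with experiment when the difference lies
# within the combined expanded uncertainty of the two values.
combined = np.hypot(u_measured, u_model)
compatible = np.abs(measured - predicted) <= combined
print(f"{compatible.sum()}/{len(compatible)} depth intervals compatible")
```

Applying the same check per depth interval and over the whole profile mirrors the paper's two levels of accuracy analysis.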

  1. Experimental Errors in QSAR Modeling Sets: What We Can Do and What We Cannot Do.

    PubMed

    Zhao, Linlin; Wang, Wenyi; Sedykh, Alexander; Zhu, Hao

    2017-06-30

    Numerous chemical data sets have become available for quantitative structure-activity relationship (QSAR) modeling studies. However, the quality of different data sources may be different based on the nature of experimental protocols. Therefore, potential experimental errors in the modeling sets may lead to the development of poor QSAR models and further affect the predictions of new compounds. In this study, we explored the relationship between the ratio of questionable data in the modeling sets, which was obtained by simulating experimental errors, and the QSAR modeling performance. To this end, we used eight data sets (four continuous endpoints and four categorical endpoints) that have been extensively curated both in-house and by our collaborators to create over 1800 various QSAR models. Each data set was duplicated to create several new modeling sets with different ratios of simulated experimental errors (i.e., randomizing the activities of part of the compounds) in the modeling process. A fivefold cross-validation process was used to evaluate the modeling performance, which deteriorates when the ratio of experimental errors increases. All of the resulting models were also used to predict external sets of new compounds, which were excluded at the beginning of the modeling process. The modeling results showed that the compounds with relatively large prediction errors in cross-validation processes are likely to be those with simulated experimental errors. However, after removing a certain number of compounds with large prediction errors in the cross-validation process, the external predictions of new compounds did not show improvement. Our conclusion is that the QSAR predictions, especially consensus predictions, can identify compounds with potential experimental errors. But removing those compounds by the cross-validation procedure is not a reasonable means to improve model predictivity due to overfitting.
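The error-simulation protocol described here, duplicating a modeling set, randomizing the activities of a fraction of compounds, and watching cross-validated performance degrade, can be sketched with synthetic data. A 1-nearest-neighbor classifier stands in for the study's full QSAR workflows, and all data are synthetic.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic two-class "modeling set": two well-separated clusters.
n = 100
X = np.vstack([rng.normal(0, 1, (n, 2)), rng.normal(5, 1, (n, 2))])
y = np.repeat([0, 1], n)

def simulate_errors(y, ratio, rng):
    """Simulate experimental errors by flipping a random fraction of labels."""
    y = y.copy()
    idx = rng.choice(len(y), size=int(ratio * len(y)), replace=False)
    y[idx] = 1 - y[idx]
    return y

def one_nn(X_train, y_train, X_test):
    """Predict each test point's class from its nearest training neighbor."""
    d = ((X_test[:, None, :] - X_train[None, :, :]) ** 2).sum(-1)
    return y_train[d.argmin(axis=1)]

def cv_accuracy(X, y, k, rng):
    """k-fold cross-validation accuracy."""
    folds = np.array_split(rng.permutation(len(y)), k)
    accs = []
    for test_idx in folds:
        mask = np.ones(len(y), bool)
        mask[test_idx] = False
        pred = one_nn(X[mask], y[mask], X[~mask])
        accs.append((pred == y[~mask]).mean())
    return float(np.mean(accs))

# Fivefold CV on the clean set versus a copy with 30% simulated errors.
acc_clean = cv_accuracy(X, y, 5, np.random.default_rng(1))
acc_noisy = cv_accuracy(X, simulate_errors(y, 0.3, rng), 5, np.random.default_rng(1))
print(acc_clean, acc_noisy)
```

The corrupted copy scores worse because the randomized labels poison both the training folds and the held-out truth, which is the deterioration the study quantifies across error ratios.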

  2. Experimental Errors in QSAR Modeling Sets: What We Can Do and What We Cannot Do

    PubMed Central

    2017-01-01

    Numerous chemical data sets have become available for quantitative structure–activity relationship (QSAR) modeling studies. However, the quality of different data sources may be different based on the nature of experimental protocols. Therefore, potential experimental errors in the modeling sets may lead to the development of poor QSAR models and further affect the predictions of new compounds. In this study, we explored the relationship between the ratio of questionable data in the modeling sets, which was obtained by simulating experimental errors, and the QSAR modeling performance. To this end, we used eight data sets (four continuous endpoints and four categorical endpoints) that have been extensively curated both in-house and by our collaborators to create over 1800 various QSAR models. Each data set was duplicated to create several new modeling sets with different ratios of simulated experimental errors (i.e., randomizing the activities of part of the compounds) in the modeling process. A fivefold cross-validation process was used to evaluate the modeling performance, which deteriorates when the ratio of experimental errors increases. All of the resulting models were also used to predict external sets of new compounds, which were excluded at the beginning of the modeling process. The modeling results showed that the compounds with relatively large prediction errors in cross-validation processes are likely to be those with simulated experimental errors. However, after removing a certain number of compounds with large prediction errors in the cross-validation process, the external predictions of new compounds did not show improvement. Our conclusion is that the QSAR predictions, especially consensus predictions, can identify compounds with potential experimental errors. But removing those compounds by the cross-validation procedure is not a reasonable means to improve model predictivity due to overfitting. PMID:28691113

  3. Validation of design procedure and performance modeling of a heat and fluid transport field experiment in the unsaturated zone

    NASA Astrophysics Data System (ADS)

    Nir, A.; Doughty, C.; Tsang, C. F.

    Validation methods that were developed in the context of the deterministic concepts of past generations often cannot be directly applied to environmental problems, which may be characterized by limited reproducibility of results and highly complex models. Instead, validation is interpreted here as a series of activities, including both theoretical and experimental tests, designed to enhance our confidence in the capability of a proposed model to describe some aspect of reality. We examine the validation process applied to a project concerned with heat and fluid transport in porous media, in which mathematical modeling, simulation, and results of field experiments are evaluated in order to determine the feasibility of a system for seasonal thermal energy storage in shallow unsaturated soils. Technical details of the field experiments are not included, but appear in previous publications. Validation activities are divided into three stages. The first stage, carried out prior to the field experiments, is concerned with modeling the relevant physical processes, optimization of the heat-exchanger configuration and the shape of the storage volume, and multi-year simulation. Subjects requiring further theoretical and experimental study are identified at this stage. The second stage encompasses the planning and evaluation of the initial field experiment. Simulations are made to determine the experimental time scale and optimal sensor locations. Soil thermal parameters and temperature boundary conditions are estimated using an inverse method. Then results of the experiment are compared with model predictions using different parameter values and modeling approximations. In the third stage, results of an experiment performed under different boundary conditions are compared to predictions made by the models developed in the second stage. Various aspects of this theoretical and experimental field study are described as examples of the verification and validation procedure. There is no attempt to validate a specific model; rather, several models of increasing complexity are compared with experimental results. The outcome is interpreted as a demonstration of the paradigm proposed by van der Heijde [26] that different constituencies have different objectives for the validation process, and therefore their acceptance criteria differ as well.

  4. Experimental gravitation in space - Is there a future?

    NASA Astrophysics Data System (ADS)

    Wharton, R. A.; McKay, C. P.; Mancinelli, R. L.; Simmons, G. M.

    Experimental gravitation enters the 1990s with a past full of successes, but with a future full of uncertainties. Intellectually, the field is as vigorous as ever, with major thrusts in three main areas: the search for gravitational radiation, the study of post-Newtonian and post-post-Newtonian effects, and the detection of hypothetical feeble new interactions. It is the only branch of space research involved in fundamental physics. But politically and financially, the future is uncertain. Competition for funding and for flight opportunities will be stiff for the foreseeable future, both with other disciplines such as astrophysics, planetary science and the military, and within experimental gravitation itself. Difficult choices lie ahead. This paper reviews the current state of the field and attempts to peer into the future.

  5. On the Simulation of Sea States with High Significant Wave Height for the Validation of Parameter Retrieval Algorithms for Future Altimetry Missions

    NASA Astrophysics Data System (ADS)

    Kuschenerus, Mieke; Cullen, Robert

    2016-08-01

    To ensure the reliability and precision of wave height estimates for future satellite altimetry missions such as Sentinel-6, reliable parameter retrieval algorithms that can extract significant wave heights up to 20 m have to be established, and these retrieval methods need to be validated extensively on a wide range of possible significant wave heights. Although current missions require wave height retrievals up to 20 m, there is little evidence of systematic validation of parameter retrieval methods for sea states with wave heights above 10 m. This paper provides a definition of a set of simulated sea states with significant wave heights up to 20 m that allow simulation of radar altimeter response echoes for extreme sea states in SAR and low-resolution mode. The simulated radar responses are used to derive significant wave height estimates, which can be compared with the initial models, allowing precision estimates of the applied parameter retrieval methods. We thus establish a validation method for significant wave height retrieval in extreme sea states, allowing improved understanding and planning of future satellite altimetry mission validation.

  6. An Efficient Deterministic Approach to Model-based Prediction Uncertainty Estimation

    NASA Technical Reports Server (NTRS)

    Daigle, Matthew J.; Saxena, Abhinav; Goebel, Kai

    2012-01-01

    Prognostics deals with the prediction of the end of life (EOL) of a system. EOL is a random variable, due to the presence of process noise and uncertainty in the future inputs to the system. Prognostics algorithms must account for this inherent uncertainty. In addition, these algorithms never know exactly the state of the system at the desired time of prediction, or the exact model describing the future evolution of the system, accumulating additional uncertainty into the predicted EOL. Prediction algorithms that do not account for these sources of uncertainty misrepresent the EOL and can lead to poor decisions based on their results. In this paper, we explore the impact of uncertainty in the prediction problem. We develop a general model-based prediction algorithm that incorporates these sources of uncertainty, and propose a novel approach to efficiently handle uncertainty in the future input trajectories of a system by using the unscented transformation. Using this approach, we are not only able to reduce the computational load but also to estimate the bounds of uncertainty in a deterministic manner, which can be useful during decision-making. Using a lithium-ion battery as a case study, we perform several simulation-based experiments to explore these issues, and validate the overall approach using experimental data from a battery testbed.
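The unscented-transformation idea used here, propagating a small, deterministically chosen set of sigma points through a nonlinearity instead of Monte Carlo sampling, can be illustrated on a generic 2-state model. The state, covariance, and degradation function below are invented for illustration and are not the paper's battery model.

```python
import numpy as np

def unscented_transform(mu, P, f, kappa=1.0):
    """Propagate mean mu and covariance P through nonlinear f
    using 2n+1 deterministic sigma points (kappa parameterization)."""
    n = len(mu)
    S = np.linalg.cholesky((n + kappa) * P)           # matrix square root of scaled P
    sigma = [mu] + [mu + S[:, i] for i in range(n)] + [mu - S[:, i] for i in range(n)]
    w = np.r_[kappa / (n + kappa), np.full(2 * n, 0.5 / (n + kappa))]
    ys = np.array([f(x) for x in sigma])
    mean = float(w @ ys)
    var = float(w @ (ys - mean) ** 2)
    return mean, var

# Hypothetical 2-state health model and a nonlinear degradation map.
mu = np.array([1.0, 0.5])
P = np.diag([0.01, 0.04])
f = lambda x: x[0] ** 2 + np.sin(x[1])

ut_mean, ut_var = unscented_transform(mu, P, f)

# Monte Carlo reference: what the UT approximates with 5 instead of 100000 evaluations.
rng = np.random.default_rng(0)
mc = np.array([f(x) for x in rng.multivariate_normal(mu, P, 100000)])
print(ut_mean, mc.mean())
```

The deterministic sigma points also give usable uncertainty bounds (e.g. mean plus or minus a few standard deviations from the propagated variance), which is what makes the approach attractive for decision-making.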

  7. Data Quality Assurance for Supersonic Jet Noise Measurements

    NASA Technical Reports Server (NTRS)

    Brown, Clifford A.; Henderson, Brenda S.; Bridges, James E.

    2010-01-01

    The noise created by a supersonic aircraft is a primary concern in the design of future high-speed planes. The jet noise reduction technologies required on these aircraft will be developed using scale models mounted to experimental jet rigs designed to simulate the exhaust gases from a full-scale jet engine. The jet noise data collected in these experiments must accurately predict the noise levels produced by the full-scale hardware in order to be a useful development tool. A methodology has been adopted at the NASA Glenn Research Center's Aero-Acoustic Propulsion Laboratory to ensure the quality of the supersonic jet noise data acquired from the facility's High Flow Jet Exit Rig so that it can be used to develop future nozzle technologies that reduce supersonic jet noise. The methodology relies on mitigating extraneous noise sources, examining the impact of measurement location on the acoustic results, and investigating the facility independence of the measurements. The methodology is documented here as a basis for validating future improvements, and its limitations are noted so that they do not affect the data analysis. Maintaining a high-quality jet noise laboratory is an ongoing process. By carefully examining the data produced and continually following this methodology, data quality can be maintained and improved over time.

  8. Selecting and Improving Quasi-Experimental Designs in Effectiveness and Implementation Research.

    PubMed

    Handley, Margaret A; Lyles, Courtney R; McCulloch, Charles; Cattamanchi, Adithya

    2018-04-01

    Interventional researchers face many design challenges when assessing intervention implementation in real-world settings. Intervention implementation requires holding fast on internal validity needs while incorporating external validity considerations (such as uptake by diverse subpopulations, acceptability, cost, and sustainability). Quasi-experimental designs (QEDs) are increasingly employed to achieve a balance between internal and external validity. Although these designs are often referred to and summarized in terms of logistical benefits, there is still uncertainty about (a) selecting from among various QEDs and (b) developing strategies to strengthen the internal and external validity of QEDs. We focus here on commonly used QEDs (prepost designs with nonequivalent control groups, interrupted time series, and stepped-wedge designs) and discuss several variants that maximize internal and external validity at the design, execution and implementation, and analysis stages.

  9. A Systematic Method for Verification and Validation of Gyrokinetic Microstability Codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bravenec, Ronald

    My original proposal for the period Feb. 15, 2014 through Feb. 14, 2017 called for an integrated validation and verification effort carried out by myself with collaborators. The validation component would require experimental profile and power-balance analysis. In addition, it would require running the gyrokinetic codes varying the input profiles within experimental uncertainties to seek agreement with experiment before discounting a code as invalidated. Therefore, validation would require a major increase of effort over my previous grant periods, which covered only code verification (code benchmarking). Consequently, I had requested full-time funding. Instead, I am being funded at somewhat less than half time (5 calendar months per year). As a consequence, I decided to forego the validation component and to continue only the verification efforts.

  10. Biomarkers in systemic lupus erythematosus: challenges and prospects for the future

    PubMed Central

    Kao, Amy H.; Manzi, Susan; Ahearn, Joseph M.

    2013-01-01

    The search for lupus biomarkers to diagnose, monitor, stratify, and predict individual response to therapy is currently more intense than ever before. This effort is essential for several reasons. First, epidemic overdiagnosis and underdiagnosis of lupus, even by certified rheumatologists, lead to errors in therapy with concomitant side effects that may be more serious than the disease itself. Second, identification of lupus flares remains as much an art as it is a science. Third, the capacity to stratify patients so as to predict those who will develop specific patterns of organ involvement is not currently possible but would potentially lead to preventive therapeutic strategies. Fourth, only one new drug for the treatment of lupus has been approved by the US Food and Drug Administration in over 50 years. A major obstacle in this pipeline is the dearth of biomarkers available to prove a patient has responded to an experimental therapeutic intervention. This review will summarize the challenges faced in the discovery and validation of lupus biomarkers, the most promising lupus biomarkers identified to date, and the promise of future directions. PMID:23904865

  11. Application of historical mobility testing to sensor-based robotic performance

    NASA Astrophysics Data System (ADS)

    Willoughby, William E.; Jones, Randolph A.; Mason, George L.; Shoop, Sally A.; Lever, James H.

    2006-05-01

    The U.S. Army Engineer Research and Development Center (ERDC) has conducted on-/off-road experimental field testing with full-sized and scale-model military vehicles for more than fifty years. Some 4000 acres of local terrain are available for tailored field evaluations or verification/validation of future robotic designs in a variety of climatic regimes. Field testing and data collection procedures, as well as techniques for quantifying terrain in engineering terms, have been developed and refined into algorithms and models for predicting vehicle-terrain interactions and resulting forces or speeds of military-sized vehicles. Based on recent experiments with Matilda, Talon, and Pacbot, these predictive capabilities appear to be relevant to most robotic systems currently in development. Utilization of current testing capabilities with sensor-based vehicle drivers, or use of the procedures for terrain quantification from sensor data, would immediately apply some fifty years of historical knowledge to the development, refinement, and implementation of future robotic systems. Additionally, translation of sensor-collected terrain data into engineering terms would allow assessment of robotic performance prior to deployment of the actual system and ensure maximum system performance in the theater of operation.

  12. Recent Advances in Simulation of Eddy Current Testing of Tubes and Experimental Validations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Reboud, C.; Premel, D.; Lesselier, D.

    2007-03-21

    Eddy current testing (ECT) is widely used in the iron and steel industry for the inspection of tubes during manufacturing. A collaboration between CEA and the Vallourec Research Center led to the development of new numerical functionalities dedicated to the simulation of ECT of non-magnetic tubes by external probes. Successful experimental validations led to the integration of these models into the CIVA platform. The modeling approach and validation results are discussed here. A new numerical scheme is also proposed in order to improve the accuracy of the model.

  13. Recent Advances in Simulation of Eddy Current Testing of Tubes and Experimental Validations

    NASA Astrophysics Data System (ADS)

    Reboud, C.; Prémel, D.; Lesselier, D.; Bisiaux, B.

    2007-03-01

    Eddy current testing (ECT) is widely used in the iron and steel industry for the inspection of tubes during manufacturing. A collaboration between CEA and the Vallourec Research Center led to the development of new numerical functionalities dedicated to the simulation of ECT of non-magnetic tubes by external probes. Successful experimental validations led to the integration of these models into the CIVA platform. The modeling approach and validation results are discussed here. A new numerical scheme is also proposed in order to improve the accuracy of the model.

  14. Validation of a physically based catchment model for application in post-closure radiological safety assessments of deep geological repositories for solid radioactive wastes.

    PubMed

    Thorne, M C; Degnan, P; Ewen, J; Parkin, G

    2000-12-01

    The physically based river catchment modelling system SHETRAN incorporates components representing water flow, sediment transport and radionuclide transport both in solution and bound to sediments. The system has been applied to simulate hypothetical future catchments in the context of post-closure radiological safety assessments of a potential site for a deep geological disposal facility for intermediate and certain low-level radioactive wastes at Sellafield, west Cumbria. In order to have confidence in the application of SHETRAN for this purpose, various blind validation studies have been undertaken. In earlier studies, the validation was undertaken against uncertainty bounds in model output predictions set by the modelling team on the basis of how well they expected the model to perform. However, validation can also be carried out with bounds set on the basis of how well the model is required to perform in order to constitute a useful assessment tool. Herein, such an assessment-based validation exercise is reported. This exercise related to a field plot experiment conducted at Calder Hollow, west Cumbria, in which the migration of strontium and lanthanum in subsurface Quaternary deposits was studied on a length scale of a few metres. Blind predictions of tracer migration were compared with experimental results using bounds set by a small group of assessment experts independent of the modelling team. Overall, the SHETRAN system performed well, failing only two out of seven of the imposed tests. Furthermore, of the five tests that were not failed, three were positively passed even when a pessimistic view was taken as to how measurement errors should be taken into account. It is concluded that the SHETRAN system, which is still being developed further, is a powerful tool for application in post-closure radiological safety assessments.
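
    The assessment-based validation logic described above (blind predictions judged against acceptance bounds set independently of the modelling team) can be sketched as a simple pass/fail check. The metric names, values, and bounds below are invented for illustration; they are not the Calder Hollow results.

```python
# Sketch of assessment-based validation: blind model predictions are
# compared against acceptance bounds fixed by independent assessors.
# All names, values, and bounds here are hypothetical.

def assess(predictions, bounds):
    """Return (passed, failed) test names given {name: value} and {name: (lo, hi)}."""
    passed, failed = [], []
    for name, (lo, hi) in bounds.items():
        (passed if lo <= predictions[name] <= hi else failed).append(name)
    return passed, failed

# Hypothetical tracer-migration metrics (e.g. peak concentration, arrival time).
predictions = {"Sr_peak": 0.82, "Sr_arrival": 14.0, "La_peak": 0.35}
bounds = {"Sr_peak": (0.5, 1.0), "Sr_arrival": (10.0, 20.0), "La_peak": (0.6, 0.9)}

passed, failed = assess(predictions, bounds)
print(len(passed), len(failed))  # → 2 1
```

    The point of the design is that the bounds encode how well the model *needs* to perform to be a useful assessment tool, rather than how well the modellers expect it to perform.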

  15. Analytic Validation of RNA In Situ Hybridization (RISH) for AR and AR-V7 Expression in Human Prostate Cancer

    PubMed Central

    Guedes, Liana B.; Morais, Carlos L.; Almutairi, Fawaz; Haffner, Michael C.; Zheng, Qizhi; Isaacs, John T.; Antonarakis, Emmanuel S.; Lu, Changxue; Tsai, Harrison; Luo, Jun; De Marzo, Angelo M.; Lotan, Tamara L.

    2016-01-01

    Purpose RNA expression of androgen receptor splice variants may be a biomarker of resistance to novel androgen deprivation therapies in castrate resistant prostate cancer (CRPC). We analytically validated an RNA in situ hybridization (RISH) assay for total AR and AR-V7 for use in formalin fixed paraffin embedded (FFPE) prostate tumors. Experimental Design We used prostate cell lines and xenografts to validate chromogenic RISH to detect RNA containing AR exon 1 (AR-E1, surrogate for total AR RNA species) and cryptic exon 3 (AR-CE3, surrogate for AR-V7 expression). RISH signals were quantified in FFPE primary tumors and CRPC specimens, comparing to known AR and AR-V7 status by immunohistochemistry and RT-PCR. Results The quantified RISH results correlated significantly with total AR and AR-V7 levels by RT-PCR in cell lines, xenografts and autopsy metastases. Both AR-E1 and AR-CE3 RISH signals were localized in nuclear punctae in addition to the expected cytoplasmic speckles. Compared to admixed benign glands, AR-E1 expression was significantly higher in primary tumor cells with a median fold increase of 3.0 and 1.4 in two independent cohorts (p<0.0001 and p=0.04, respectively). While AR-CE3 expression was detectable in primary prostatic tumors, levels were substantially higher in a subset of CRPC metastases and cell lines, and were correlated with AR-E1 expression. Conclusions RISH for AR-E1 and AR-CE3 is an analytically valid method to examine total AR and AR-V7 RNA levels in FFPE tissues. Future clinical validation studies are required to determine whether AR RISH is a prognostic or predictive biomarker in specific clinical contexts. PMID:27166397

  16. Control of stacking loads in final waste disposal according to the borehole technique

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Feuser, W.; Barnert, E.; Vijgen, H.

    1996-12-01

    The semihydrostatic model has been developed in order to assess the mechanical loads acting on heat-generating ILW(Q) and HTGR fuel element waste packages to be emplaced in vertical boreholes according to the borehole technique in underground rock salt formations. For the experimental validation of the theory, laboratory test stands reduced in scale are set up to simulate the bottom section of a repository borehole. A comparison of the measurement results with the data computed by the model, a correlation between the test stand results, and a systematic determination of material-typical crushed salt parameters in a separate research project will serve to derive a set of characteristic equations enabling a description of real conditions in a future repository.

  17. Encoding, training and retrieval in ferroelectric tunnel junctions

    NASA Astrophysics Data System (ADS)

    Xu, Hanni; Xia, Yidong; Xu, Bo; Yin, Jiang; Yuan, Guoliang; Liu, Zhiguo

    2016-05-01

    Ferroelectric tunnel junctions (FTJs) are quantum nanostructures with great potential as a hardware basis for future neuromorphic applications. Among recently proposed possibilities, artificial cognition holds particular promise, where encoding, training, memory solidification and retrieval constitute an inseparable chain. However, this chain has been envisioned but not yet experimentally confirmed. The poor retention, or short-term storage, of tunneling electroresistance, in particular of the intermediate states, remains a key challenge in FTJs. Here we report encoding, training and retrieval in BaTiO3 FTJs, emulating the key features of information processing in terms of cognitive neuroscience. This is implemented and exemplified through processing characters. Using training inputs that are validated by the evolution of both barrier profile and domain configuration, accurate recall of encoded characters in the retrieval stage is demonstrated.

  18. An improved Burgers cellular automaton model for bicycle flow

    NASA Astrophysics Data System (ADS)

    Xue, Shuqi; Jia, Bin; Jiang, Rui; Li, Xingang; Shan, Jingjing

    2017-12-01

    As an energy-efficient and healthy transport mode, bicycling has recently attracted the attention of governments, transport planners, and researchers. The dynamic characteristics of bicycle flow must be investigated to improve facility design and traffic operation for bicycling. We model bicycle flow using an improved Burgers cellular automaton model. Through a following-move mechanism, the modified model enables bicycles to move smoothly and increases the critical density to a more rational level than the original model. The model is calibrated and validated using experimental data and field data. The results show that the improved model can effectively simulate bicycle flow. The performance of the model under different parameters is investigated and discussed, and strengths and limitations of the improved model are identified for future work.
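
    For orientation, the baseline Burgers cellular automaton that such models build on can be sketched in a few lines. The rule below is one common form of the original Burgers CA (cell occupancy u_j with capacity L, each cell forwarding min(u_j, L - u_{j+1}) per step), not the paper's following-move modification; the lane count and initial occupancies are invented.

```python
# Sketch of the original Burgers cellular automaton update rule
# (occupancy-based, capacity L per cell). This is the baseline model,
# not the improved following-move variant described in the abstract.

def burgers_ca_step(u, L):
    """One parallel update: cell j sends min(u_j, L - u_{j+1}) bicycles forward."""
    n = len(u)
    flow = [min(u[j], L - u[(j + 1) % n]) for j in range(n)]  # periodic road
    return [u[j] - flow[j] + flow[(j - 1) % n] for j in range(n)]

L = 3                          # cell capacity (e.g. number of lanes), hypothetical
u = [3, 2, 0, 1, 0, 3, 0, 2]   # initial occupancies on a ring road, hypothetical
total = sum(u)
for _ in range(50):
    u = burgers_ca_step(u, L)
assert sum(u) == total          # the rule conserves the number of bicycles
print(u)
```

    Because every transfer appears once as an outflow and once as an inflow, the rule conserves the total number of bicycles and keeps every cell within its capacity.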

  19. Thick photosensitive polyimide film side wall angle variability and scum improvement for IC packaging stress control

    NASA Astrophysics Data System (ADS)

    Mehta, Sohan Singh; Yeung, Marco; Mirza, Fahad; Raman, Thiagarajan; Longenbach, Travis; Morgan, Justin; Duggan, Mark; Soedibyo, Rio A.; Reidy, Sean; Rabie, Mohamed; Cho, Jae Kyu; Premachandran, C. S.; Faruqui, Danish

    2018-03-01

    In this paper, we demonstrate photosensitive polyimide (PSPI) profile optimization to effectively reduce stress concentrations and enable PSPI as protection against package-induced stress. Through detailed package simulation, we demonstrate a 45% reduction in stress as the sidewall angle (SWA) of PSPI is increased from 45 to 80 degrees in Cu pillar package types. Through modulation of coating and develop bake temperatures and times over multiple steps, as well as dose energy and post-litho surface treatments, we demonstrate a method for reliably obtaining a PSPI sidewall angle >75 degrees. Additionally, we experimentally validate the simulation findings that PSPI sidewall angle impacts chip package interaction (CPI). Finally, we conclude this paper with PSPI material and tool qualification requirements for future technology nodes based on current challenges.

  20. Modeling a Thermoelectric HVAC System for Automobiles

    NASA Astrophysics Data System (ADS)

    Junior, C. S.; Strupp, N. C.; Lemke, N. C.; Koehler, J.

    2009-07-01

    In automobiles thermal energy is used at various energy scales. With regard to reduction of CO2 emissions, efficient generation of hot and cold temperatures and wise use of waste heat are of paramount importance for car manufacturers worldwide. Thermoelectrics could be a vital component in automobiles of the future. To evaluate the applicability of thermoelectric modules in automobiles, a Modelica model of a thermoelectric liquid-gas heat exchanger was developed for transient simulations. The model uses component models from the object-oriented Modelica library TIL. It was validated based on experimental data of a prototype heat exchanger and used to simulate transient and steady-state behavior. The use of the model within the energy management of an automobile is successfully shown for the air-conditioning system of a car.
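
    The lumped thermoelectric-module equations underlying a model like the one described (Seebeck, Peltier, Joule, and conduction terms) can be sketched directly. The module parameters below are invented round numbers, not values from the cited prototype heat exchanger.

```python
# Standard lumped model of a thermoelectric (Peltier) module:
#   Q_cold = alpha*T_cold*I - I^2*R/2 - K*dT   (heat pumped from cold side)
#   Q_hot  = alpha*T_hot*I  + I^2*R/2 - K*dT   (heat rejected at hot side)
# Parameter values are hypothetical.

def peltier_module(alpha, R, K, I, T_hot, T_cold):
    """Return (Q_cold, Q_hot, P_electric) for one module, SI units."""
    dT = T_hot - T_cold
    q_cold = alpha * T_cold * I - 0.5 * I**2 * R - K * dT
    q_hot = alpha * T_hot * I + 0.5 * I**2 * R - K * dT
    return q_cold, q_hot, q_hot - q_cold

alpha, R, K = 0.05, 1.5, 0.8   # V/K, ohm, W/K (hypothetical module constants)
q_c, q_h, p = peltier_module(alpha, R, K, I=3.0, T_hot=320.0, T_cold=290.0)
cop = q_c / p                   # cooling coefficient of performance
print(round(q_c, 2), round(p, 2))  # → 12.75 18.0
```

    The electrical power balance P = Q_hot - Q_cold = alpha*dT*I + I^2*R is a useful sanity check when wiring such a component model into a larger HVAC simulation.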

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dumitrescu, Eugene; Humble, Travis S.

    The accurate and reliable characterization of quantum dynamical processes underlies efforts to validate quantum technologies, where discrimination between competing models of observed behaviors informs efforts to fabricate and operate qubit devices. We present a protocol for quantum channel discrimination that leverages advances in direct characterization of quantum dynamics (DCQD) codes. We demonstrate that DCQD codes enable selective process tomography to improve discrimination between entangling and correlated quantum dynamics. Numerical simulations show selective process tomography requires only a few measurement configurations to achieve a low false alarm rate and that the DCQD encoding improves the resilience of the protocol to hidden sources of noise. Lastly, our results show that selective process tomography with DCQD codes is useful for efficiently distinguishing sources of correlated crosstalk from uncorrelated noise in current and future experimental platforms.

  2. Using the Relevance Vector Machine Model Combined with Local Phase Quantization to Predict Protein-Protein Interactions from Protein Sequences.

    PubMed

    An, Ji-Yong; Meng, Fan-Rong; You, Zhu-Hong; Fang, Yu-Hong; Zhao, Yu-Jun; Zhang, Ming

    2016-01-01

    We propose a novel computational method known as RVM-LPQ that combines the Relevance Vector Machine (RVM) model and Local Phase Quantization (LPQ) to predict PPIs from protein sequences. The main improvements are the results of representing protein sequences using the LPQ feature representation on a Position Specific Scoring Matrix (PSSM), reducing the influence of noise using Principal Component Analysis (PCA), and using a Relevance Vector Machine (RVM) based classifier. We perform 5-fold cross-validation experiments on Yeast and Human datasets, achieving very high accuracies of 92.65% and 97.62%, respectively, which are significantly better than previous work. To further evaluate the proposed method, we compare it with the state-of-the-art support vector machine (SVM) classifier on the Yeast dataset. The experimental results demonstrate that our RVM-LPQ method is clearly better than the SVM-based method. The promising experimental results show the efficiency and simplicity of the proposed method, which can serve as an automatic decision support tool for future proteomics research.
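
    The 5-fold cross-validation protocol used for evaluation can be sketched independently of the classifier. In the sketch below a nearest-centroid stand-in replaces the Relevance Vector Machine, and toy 2-D points replace LPQ/PSSM feature vectors; both substitutions are ours, purely to illustrate the evaluation loop.

```python
# 5-fold cross-validation sketch. The nearest-centroid classifier is a
# stand-in for the RVM, and the toy 2-D data replace LPQ/PSSM features.

def nearest_centroid_fit(X, y):
    cents = {}
    for label in set(y):
        pts = [x for x, lab in zip(X, y) if lab == label]
        cents[label] = [sum(col) / len(pts) for col in zip(*pts)]
    return cents

def predict(cents, x):
    d2 = lambda a, b: sum((ai - bi) ** 2 for ai, bi in zip(a, b))
    return min(cents, key=lambda label: d2(cents[label], x))

def cv_accuracy(X, y, k=5):
    folds = [list(range(i, len(X), k)) for i in range(k)]  # interleaved folds
    accs = []
    for fold in folds:
        train = [i for i in range(len(X)) if i not in fold]
        cents = nearest_centroid_fit([X[i] for i in train], [y[i] for i in train])
        hits = sum(predict(cents, X[i]) == y[i] for i in fold)
        accs.append(hits / len(fold))
    return sum(accs) / k

# Two well-separated toy classes, so the protocol should score perfectly.
X = [(0.0, 0.1), (0.2, 0.0), (0.1, 0.2), (0.0, 0.3), (0.3, 0.1),
     (5.0, 5.1), (5.2, 5.0), (5.1, 5.2), (5.0, 5.3), (5.3, 5.1)]
y = [0, 0, 0, 0, 0, 1, 1, 1, 1, 1]
print(cv_accuracy(X, y))  # → 1.0
```

    In practice the folds would be shuffled or stratified; the interleaved split here keeps the example deterministic.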

  3. Accurate model annotation of a near-atomic resolution cryo-EM map

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hryc, Corey F.; Chen, Dong-Hua; Afonine, Pavel V.

    Electron cryomicroscopy (cryo-EM) has been used to determine the atomic coordinates (models) from density maps of biological assemblies. These models can be assessed by their overall fit to the experimental data and stereochemical information. However, these models do not annotate the actual density values of the atoms nor their positional uncertainty. Here, we introduce a computational procedure to derive an atomic model from a cryo-EM map with annotated metadata. The accuracy of such a model is validated by a faithful replication of the experimental cryo-EM map computed using the coordinates and associated metadata. The functional interpretation of any structural features in the model and its utilization for future studies can be made in the context of its measure of uncertainty. We applied this protocol to the 3.3-Å map of the mature P22 bacteriophage capsid, a large and complex macromolecular assembly. With this protocol, we identify and annotate previously undescribed molecular interactions between capsid subunits that are crucial to maintain stability in the absence of cementing proteins or cross-linking, as occur in other bacteriophages.

  4. Accurate model annotation of a near-atomic resolution cryo-EM map.

    PubMed

    Hryc, Corey F; Chen, Dong-Hua; Afonine, Pavel V; Jakana, Joanita; Wang, Zhao; Haase-Pettingell, Cameron; Jiang, Wen; Adams, Paul D; King, Jonathan A; Schmid, Michael F; Chiu, Wah

    2017-03-21

    Electron cryomicroscopy (cryo-EM) has been used to determine the atomic coordinates (models) from density maps of biological assemblies. These models can be assessed by their overall fit to the experimental data and stereochemical information. However, these models do not annotate the actual density values of the atoms nor their positional uncertainty. Here, we introduce a computational procedure to derive an atomic model from a cryo-EM map with annotated metadata. The accuracy of such a model is validated by a faithful replication of the experimental cryo-EM map computed using the coordinates and associated metadata. The functional interpretation of any structural features in the model and its utilization for future studies can be made in the context of its measure of uncertainty. We applied this protocol to the 3.3-Å map of the mature P22 bacteriophage capsid, a large and complex macromolecular assembly. With this protocol, we identify and annotate previously undescribed molecular interactions between capsid subunits that are crucial to maintain stability in the absence of cementing proteins or cross-linking, as occur in other bacteriophages.

  5. Model-based redesign of global transcription regulation

    PubMed Central

    Carrera, Javier; Rodrigo, Guillermo; Jaramillo, Alfonso

    2009-01-01

    Synthetic biology aims at the design or redesign of biological systems. In particular, one possible goal could be the rewiring of the transcription regulation network by exchanging the endogenous promoters. To achieve this objective, we have adapted current methods to the inference of a model based on ordinary differential equations that is able to predict the network response after a major change in its topology. Our procedure utilizes microarray data for training. We have experimentally validated our inferred global regulatory model in Escherichia coli by predicting transcriptomic profiles under new perturbations. We have also tested our methodology in silico by providing accurate predictions of the underlying networks from expression data generated with artificial genomes. In addition, we have shown the predictive power of our methodology by obtaining the gene profile in experimental redesigns of the E. coli genome, in which the transcriptional network was rewired by means of knockouts of master regulators or by upregulating transcription factors controlled by different promoters. Our approach is compatible with most network inference methods, allowing computational exploration of future genome-wide redesign experiments in synthetic biology. PMID:19188257
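
    A toy version of the ODE framework described makes the promoter-swap idea concrete: each gene's expression follows dx/dt = a*u - d*x, and "rewiring" is modelled by changing the promoter strength a. The constants and the forward-Euler scheme below are our illustration, not the inferred E. coli model.

```python
# Toy ODE gene-expression sketch: dx/dt = a*u - d*x, where a is promoter
# strength, u a (constant) regulator input, and d the degradation rate.
# A promoter swap is modelled by changing a. All constants are invented.

def simulate(a, d, u, x0=0.0, dt=0.01, steps=2000):
    x = x0
    for _ in range(steps):
        x += dt * (a * u - d * x)   # forward-Euler integration
    return x

a_wild, a_swapped, d, u = 2.0, 6.0, 0.5, 1.0
x_wild = simulate(a_wild, d, u)    # steady state ~ a*u/d = 4
x_new = simulate(a_swapped, d, u)  # predicted response after promoter swap ~ 12
print(round(x_wild, 2), round(x_new, 2))  # → 4.0 12.0
```

    The steady state a*u/d is what such a model predicts for the rewired network; the real procedure fits a and d (and the regulatory inputs u) from microarray data across many genes.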

  6. Investigation on bending failure to characterize crashworthiness of 6xxx-series aluminium sheet alloys with bending-tension test procedure

    NASA Astrophysics Data System (ADS)

    Henn, Philipp; Liewald, Mathias; Sindel, Manfred

    2018-05-01

    As lightweight design as well as crash performance are crucial to future car body design, exact material characterisation is important to use materials at their full potential and reach maximum efficiency. Within the scope of this paper, the potential of the newly established bending-tension test procedure to characterise material crashworthiness is investigated. In this test setup for the determination of material failure, a buckling-bending test is coupled with a subsequent tensile test. If the prior bending load is critical, tensile strength and elongation in the subsequent tensile test are dramatically reduced. The new test procedure therefore offers an applicable definition of failure, as the incapacity for energy consumption in subsequent phases of the crash represents failure of a component. In addition, the correlation of the loading condition with actual crash scenarios (buckling and free bending) is improved compared to the three-point bending test. The procedure is applied in this experimental study to two aluminium sheet alloys. Experimental results are validated against existing ductility characterisation from the edge compression test.

  7. Accurate model annotation of a near-atomic resolution cryo-EM map

    PubMed Central

    Hryc, Corey F.; Chen, Dong-Hua; Afonine, Pavel V.; Jakana, Joanita; Wang, Zhao; Haase-Pettingell, Cameron; Jiang, Wen; Adams, Paul D.; King, Jonathan A.; Schmid, Michael F.; Chiu, Wah

    2017-01-01

    Electron cryomicroscopy (cryo-EM) has been used to determine the atomic coordinates (models) from density maps of biological assemblies. These models can be assessed by their overall fit to the experimental data and stereochemical information. However, these models do not annotate the actual density values of the atoms nor their positional uncertainty. Here, we introduce a computational procedure to derive an atomic model from a cryo-EM map with annotated metadata. The accuracy of such a model is validated by a faithful replication of the experimental cryo-EM map computed using the coordinates and associated metadata. The functional interpretation of any structural features in the model and its utilization for future studies can be made in the context of its measure of uncertainty. We applied this protocol to the 3.3-Å map of the mature P22 bacteriophage capsid, a large and complex macromolecular assembly. With this protocol, we identify and annotate previously undescribed molecular interactions between capsid subunits that are crucial to maintain stability in the absence of cementing proteins or cross-linking, as occur in other bacteriophages. PMID:28270620

  8. Global Quantitative Modeling of Chromatin Factor Interactions

    PubMed Central

    Zhou, Jian; Troyanskaya, Olga G.

    2014-01-01

    Chromatin is the driver of gene regulation, yet understanding the molecular interactions underlying chromatin factor combinatorial patterns (or the “chromatin codes”) remains a fundamental challenge in chromatin biology. Here we developed a global modeling framework that leverages chromatin profiling data to produce a systems-level view of the macromolecular complex of chromatin. Our model utilizes maximum entropy modeling with regularization-based structure learning to statistically dissect dependencies between chromatin factors and produce an accurate probability distribution of the chromatin code. Our unsupervised quantitative model, trained on genome-wide chromatin profiles of 73 histone marks and chromatin proteins from modENCODE, enabled making various data-driven inferences about chromatin profiles and interactions. We provided a highly accurate predictor of chromatin factor pairwise interactions validated by known experimental evidence, and for the first time enabled higher-order interaction prediction. Our predictions can thus help guide future experimental studies. The model can also serve as an inference engine for predicting unknown chromatin profiles; we demonstrated that with this approach we can leverage data from well-characterized cell types to help understand less-studied cell types or conditions. PMID:24675896

  9. Inlets, ducts, and nozzles

    NASA Technical Reports Server (NTRS)

    Abbott, John M.; Anderson, Bernhard H.; Rice, Edward J.

    1990-01-01

    The internal fluid mechanics research program in inlets, ducts, and nozzles consists of a balanced effort between the development of computational tools (both parabolized Navier-Stokes and full Navier-Stokes) and the conduct of experimental research. The experiments are designed to better understand the fluid flow physics, to develop new or improved flow models, and to provide benchmark-quality data sets for validation of the computational methods. The inlet, duct, and nozzle research program is described according to three major classifications of flow phenomena: (1) highly 3-D flow fields; (2) shock-boundary-layer interactions; and (3) shear layer control. Specific examples of current and future elements of the research program are described for each of these phenomena. In particular, the highly 3-D flow field phenomenon is highlighted by describing the computational and experimental research program in transition ducts having a round-to-rectangular area variation. In the case of shock-boundary-layer interactions, the specific details of research for normal shock-boundary-layer interactions are described. For shear layer control, research in vortex generators and the use of aerodynamic excitation for enhancement of the jet mixing process are described.

  10. Accurate model annotation of a near-atomic resolution cryo-EM map

    DOE PAGES

    Hryc, Corey F.; Chen, Dong-Hua; Afonine, Pavel V.; ...

    2017-03-07

    Electron cryomicroscopy (cryo-EM) has been used to determine the atomic coordinates (models) from density maps of biological assemblies. These models can be assessed by their overall fit to the experimental data and stereochemical information. However, these models do not annotate the actual density values of the atoms nor their positional uncertainty. Here, we introduce a computational procedure to derive an atomic model from a cryo-EM map with annotated metadata. The accuracy of such a model is validated by a faithful replication of the experimental cryo-EM map computed using the coordinates and associated metadata. The functional interpretation of any structural features in the model and its utilization for future studies can be made in the context of its measure of uncertainty. We applied this protocol to the 3.3-Å map of the mature P22 bacteriophage capsid, a large and complex macromolecular assembly. With this protocol, we identify and annotate previously undescribed molecular interactions between capsid subunits that are crucial to maintain stability in the absence of cementing proteins or cross-linking, as occur in other bacteriophages.

  11. Verification of Electromagnetic Physics Models for Parallel Computing Architectures in the GeantV Project

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Amadio, G.; et al.

    An intensive R&D and programming effort is required to accomplish the new challenges posed by future experimental high-energy particle physics (HEP) programs. The GeantV project aims to narrow the gap between the performance of the existing HEP detector simulation software and the ideal performance achievable, exploiting the latest advances in computing technology. The project has developed a particle detector simulation prototype capable of transporting particles in parallel through complex geometries, exploiting instruction-level microparallelism (SIMD and SIMT), task-level parallelism (multithreading) and high-level parallelism (MPI), leveraging both multi-core and many-core opportunities. We present preliminary verification results concerning the electromagnetic (EM) physics models developed for parallel computing architectures within the GeantV project. In order to exploit the potential of vectorization and accelerators and to make the physics models effectively parallelizable, advanced sampling techniques have been implemented and tested. In this paper we introduce a set of automated statistical tests in order to verify the vectorized models by checking their consistency with the corresponding Geant4 models and to validate them against experimental data.

  12. X-ray spectroscopic diagnostics and modeling of polar-drive implosion experiments on the National Ignition Facility

    NASA Astrophysics Data System (ADS)

    Hakel, P.; Kyrala, G. A.; Bradley, P. A.; Krasheninnikova, N. S.; Murphy, T. J.; Schmitt, M. J.; Tregillis, I. L.; Kanzleieter, R. J.; Batha, S. H.; Fontes, C. J.; Sherrill, M. E.; Kilcrease, D. P.; Regan, S. P.

    2014-06-01

    A series of experiments featuring laser-imploded plastic-shell targets filled with hydrogen or deuterium were performed on the National Ignition Facility. The shells (some deuterated) were doped in selected locations with Cu, Ga, and Ge, whose spectroscopic signals (indicative of local plasma conditions) were collected with a time-integrated, 1-D imaging, spectrally resolved, and absolute-intensity calibrated instrument. The experimental spectra compare well with radiation hydrodynamics simulations post-processed with a non-local thermal equilibrium atomic kinetics and spectroscopic-quality radiation-transport model. The obtained degree of agreement between the modeling and experimental data supports the application of spectroscopic techniques for the determination of plasma conditions, which can ultimately lead to the validation of theoretical models for thermonuclear burn in the presence of mix. Furthermore, the use of a lower-Z dopant element (e.g., Fe) is suggested for future experiments, since the ˜2 keV electron temperatures reached in mixed regions are not high enough to drive sufficient H-like Ge and Cu line emissions needed for spectroscopic plasma diagnostics.

  13. MicroRNA-mediated regulatory circuits: outlook and perspectives

    NASA Astrophysics Data System (ADS)

    Cora', Davide; Re, Angela; Caselle, Michele; Bussolino, Federico

    2017-08-01

    MicroRNAs have been found to be necessary for regulating genes implicated in almost all signaling pathways, and consequently their dysfunction influences many diseases, including cancer. Understanding of the complexity of the microRNA-mediated regulatory network has grown in terms of size, connectivity and dynamics with the development of computational and, more recently, experimental high-throughput approaches for microRNA target identification. Newly developed studies on recurrent microRNA-mediated circuits in regulatory networks, also known as network motifs, have substantially contributed to addressing this complexity, and therefore to helping understand the ways by which microRNAs achieve their regulatory role. This review provides a summarizing view of the state-of-the-art, and perspectives of research efforts on microRNA-mediated regulatory motifs. In this review, we discuss the topological properties characterizing different types of circuits, and the regulatory features theoretically enabled by such properties, with a special emphasis on examples of circuits typifying their biological significance in experimentally validated contexts. Finally, we will consider possible future developments, in particular regarding microRNA-mediated circuits involving long non-coding RNAs and epigenetic regulators.

  14. Non-response to sad mood induction: implications for emotion research.

    PubMed

    Rottenberg, Jonathan; Kovacs, Maria; Yaroslavsky, Ilya

    2018-05-01

    Experimental induction of sad mood states is a mainstay of laboratory research on affect and cognition, mood regulation, and mood disorders. Typically, the success of such mood manipulations is reported as a statistically significant pre- to post-induction change in the self-rated intensity of the target affect. The present commentary was motivated by an unexpected finding in one of our studies concerning the response rate to a well-validated sad mood induction. Using the customary statistical approach, we found a significant mean increase in self-rated sadness intensity with a moderate effect size, verifying the "success" of the mood induction. However, that "success" masked the fact that between one-fifth and about one-third of our samples (adolescents who had histories of childhood-onset major depressive disorder and healthy controls) reported absolutely no sadness in response to the mood induction procedure. We consider implications of our experience for emotion research by (1) commenting upon the typically overlooked phenomenon of nonresponse, (2) suggesting changes in reporting practices regarding mood induction success, and (3) outlining future directions to help scientists determine why some subjects do not respond to experimental mood induction.

  15. Modeling of Stone-impact Resistance of Monolithic Glass Ply Using Continuum Damage Mechanics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sun, Xin; Khaleel, Mohammad A.; Davies, Richard W.

    2005-04-01

    We study the stone-impact resistance of a monolithic glass ply using a combined experimental and computational approach. Instrumented stone-impact tests were first carried out in a controlled environment. Explicit finite element analyses were then used to simulate the interactions of the indentor and the glass layer during the impact event, and a continuum damage mechanics (CDM) model was used to describe the constitutive behavior of the glass. The experimentally measured strain histories for low-velocity impact served as validation of the modeling procedures. Next, stair-stepping impact experiments were performed with two indentor sizes on two glass-ply thicknesses, and the test results were used to calibrate the critical stress parameters used in the CDM constitutive model. The purpose of this study is to establish the modeling procedures and the CDM critical stress parameters under impact loading conditions. The modeling procedures and the CDM model will be used in our future studies to predict through-thickness damage evolution patterns for different laminated windshield designs in automotive applications.
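
    The core CDM idea, a scalar damage variable degrading the effective stiffness, can be sketched in a few lines. The linear damage-evolution rule and the numbers below are illustrative placeholders, not the calibrated model or critical stress parameters from this study.

```python
# Minimal illustration of continuum damage mechanics (CDM): a scalar damage
# variable D in [0, 1] degrades the effective stiffness, so the uniaxial
# stress is sigma = (1 - D) * E * eps. The linear evolution of D between a
# threshold strain eps0 and a failure strain eps_f is illustrative only.
def damaged_stress(eps, E, eps0, eps_f):
    """Uniaxial stress with a linear scalar damage law (illustrative)."""
    d = min(1.0, max(0.0, (eps - eps0) / (eps_f - eps0)))
    return (1.0 - d) * E * eps

# Below the damage threshold the response is purely elastic: ~3.5e7 Pa here
print(damaged_stress(0.0005, 70e9, 0.001, 0.002))
# At full damage (eps >= eps_f) the load-carrying capacity vanishes
print(damaged_stress(0.003, 70e9, 0.001, 0.002))  # -> 0.0
```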

  16. Does the disorder matter? Investigating a moderating effect on coached noncredible overreporting using the MMPI-2 and PAI.

    PubMed

    Veltri, Carlo O C; Williams, John E

    2013-04-01

    The use of psychological tests to help identify the noncredible overreporting of psychiatric disorders is a long-standing practice that has received considerable attention from researchers. The purpose of this study was to experimentally determine whether feigning specific psychiatric disorders moderated the influence of coaching on the detection of noncredible overreporting using the Minnesota Multiphasic Personality Inventory-2 (MMPI-2) and the Personality Assessment Inventory (PAI). Using a 2 × 3 experimental analogue design, 265 undergraduates were asked to feign schizophrenia, posttraumatic stress disorder, or generalized anxiety disorder and were either coached about validity scales and disorders or not. The results of this study indicated that the specific psychiatric disorder being feigned did moderate the impact coaching had on the detection of overreported psychopathology using several scales on the MMPI-2 and PAI. Future research examining noncredible overreporting should take into account the impact caused by the interaction of psychiatric disorder with coaching on the detection of symptom overreporting and also identify other important moderating/mediating variables in order to develop more effective means of identifying response bias.

  17. The Martian surface radiation environment - a comparison of models and MSL/RAD measurements

    NASA Astrophysics Data System (ADS)

    Matthiä, Daniel; Ehresmann, Bent; Lohf, Henning; Köhler, Jan; Zeitlin, Cary; Appel, Jan; Sato, Tatsuhiko; Slaba, Tony; Martin, Cesar; Berger, Thomas; Boehm, Eckart; Boettcher, Stephan; Brinza, David E.; Burmeister, Soenke; Guo, Jingnan; Hassler, Donald M.; Posner, Arik; Rafkin, Scot C. R.; Reitz, Günther; Wilson, John W.; Wimmer-Schweingruber, Robert F.

    2016-03-01

    Context: The Radiation Assessment Detector (RAD) on the Mars Science Laboratory (MSL) has been measuring the radiation environment on the surface of Mars since August 6th 2012. MSL-RAD is the first instrument to provide detailed information about charged and neutral particle spectra and dose rates on the Martian surface, and one of the primary objectives of the RAD investigation is to help improve and validate current radiation transport models. Aims: Applying different numerical transport models with boundary conditions derived from the MSL-RAD environment, the goals of this work were to provide predictions for the particle spectra and the radiation exposure on the Martian surface, complementing the RAD-sensitive range, and, at the same time, to validate the results against the experimental data where applicable. Such validated models can be used to predict dose rates for future manned missions as well as for performing shield optimization studies. Methods: Several particle transport models (GEANT4, PHITS, HZETRN/OLTARIS) were used to predict the particle flux and the corresponding radiation environment caused by galactic cosmic radiation on Mars. From the calculated particle spectra the dose rates on the surface are estimated. Results: Calculations of particle spectra and dose rates induced by galactic cosmic radiation on the Martian surface are presented. Although good agreement is found in many cases for the different transport codes, GEANT4, PHITS, and HZETRN/OLTARIS, some models still show large, sometimes order-of-magnitude, discrepancies in certain particle spectra. We have found that RAD data are helping to make better choices of input parameters and physical models. Elements of these validated models can be applied to more detailed studies of how the radiation environment is influenced by solar modulation, the Martian atmosphere and soil, and changes due to the Martian seasonal pressure cycle. By extending the range of the calculated particle spectra with respect to the experimental data, additional information about the radiation environment is gained, and the contribution of different particle species to the dose is estimated.

  18. Validation of High-Fidelity Reactor Physics Models for Support of the KJRR Experimental Campaign in the Advanced Test Reactor

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nigg, David W.; Nielsen, Joseph W.; Norman, Daren R.

    The Korea Atomic Energy Research Institute is currently in the process of qualifying a Low-Enriched Uranium fuel element design for the new Ki-Jang Research Reactor (KJRR). As part of this effort, a prototype KJRR fuel element was irradiated for several operating cycles in the Northeast Flux Trap of the Advanced Test Reactor (ATR) at the Idaho National Laboratory. The KJRR fuel element contained a very large quantity of fissile material (618 g 235U) in comparison with historical ATR experiment standards (<1 g 235U), and its presence in the ATR flux trap was expected to create a neutronic configuration well outside the approved validation envelope for the reactor physics analysis methods used to support ATR operations. Accordingly it was necessary, prior to high-power irradiation of the KJRR fuel element in the ATR, to conduct an extensive set of new low-power physics measurements with the KJRR fuel element installed in the ATR Critical Facility (ATRC), a companion facility to the ATR that is located in an immediately adjacent building, sharing the same fuel handling and storage canal. The new measurements had the objective of expanding the validation envelope for the computational reactor physics tools used to support ATR operations and safety analysis to include the planned KJRR irradiation in the ATR and similar experiments that are anticipated in the future. The computational and experimental results demonstrated that the neutronic behavior of the KJRR fuel element in the ATRC is well understood, in terms of its general effects on core excess reactivity and fission power distributions, its effects on the calibration of the core lobe power measurement system, and its own internal fission rate distribution and total fission power per unit ATRC core power. Taken as a whole, these results have significantly extended the ATR physics validation envelope, thereby enabling an entirely new class of irradiation experiments.

  19. Experimental validation of the DARWIN2.3 package for fuel cycle applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    San-Felice, L.; Eschbach, R.; Bourdot, P.

    2012-07-01

    The DARWIN package, developed by the CEA and its French partners (AREVA and EDF), provides the required parameters for fuel cycle applications: fuel inventory, decay heat, activity, neutron, γ, α and β sources and spectra, and radiotoxicity. This paper presents the DARWIN2.3 experimental validation for fuel inventory and decay heat calculations on Pressurized Water Reactors (PWR). In order to validate this code system for spent fuel inventory, a large program based on spent fuel chemical assays has been undertaken. The paper deals with the experimental validation of DARWIN2.3 for PWR Uranium Oxide (UOX) and Mixed Oxide (MOX) fuel inventory calculations, focused on the isotopes involved in Burn-Up Credit (BUC) applications and decay heat computations. The calculation-to-experiment (C/E-1) discrepancies are calculated with the latest European evaluation file, JEFF-3.1.1, associated with the SHEM energy mesh. An overview of the tendencies is obtained over a complete burn-up range from 10 to 85 GWd/t (10 to 60 GWd/t for MOX fuel). The experimental validation of the DARWIN2.3 package for decay heat calculation is performed using calorimetric measurements carried out at the Swedish Interim Spent Fuel Storage Facility for PWR assemblies, covering a large burn-up (20 to 50 GWd/t) and cooling time range (10 to 30 years). (authors)
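
    The validation metric quoted above is straightforward to reproduce. The sketch below computes the calculation-to-experiment (C/E-1) discrepancy per nuclide; the nuclide names and concentrations are invented for illustration and are not DARWIN2.3 results.

```python
# Illustrative sketch of the C/E - 1 discrepancy used in depletion-code
# validation. All numbers below are made-up placeholders.
def ce_minus_one(calculated, measured):
    """Return C/E - 1 (as a percentage) for each nuclide."""
    return {nuc: 100.0 * (calculated[nuc] / measured[nuc] - 1.0)
            for nuc in calculated}

calc = {"U235": 1.02e-3, "Pu239": 5.10e-3}   # calculated concentrations
meas = {"U235": 1.00e-3, "Pu239": 5.00e-3}   # measured (chemical assay)
print(ce_minus_one(calc, meas))  # each entry is about +2 %
```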

  20. Simulation of Liquid Droplet in Air and on a Solid Surface

    NASA Astrophysics Data System (ADS)

    Launglucknavalai, Kevin

    Although multiphase gas and liquid phenomena occur widely in engineering problems, many aspects of multiphase interaction, such as droplet dynamics, are still not quantified. This study aims to qualify the Lattice Boltzmann method (LBM) with the Interparticle Potential multiphase model in order to build a foundation for future multiphase research. The study consists of two overall sections. The first section, in Chapter 2, focuses on understanding the LBM and the Interparticle Potential model. It outlines the LBM and how it relates to macroscopic fluid dynamics. The standard form of the LBM is obtained, and the perturbation solution that recovers the Navier-Stokes equations from the LBM equation is presented. Finally, the Interparticle Potential model is incorporated into the numerical LBM. The second section, in Chapter 3, presents the verification and validation cases used to confirm the behavior of the single-phase and multiphase LBM models. Experimental and analytical results are compared with numerical results where possible, using Poiseuille channel flow and flow over a cylinder. While presenting the numerical results, practical considerations such as converting LBM-scale variables to physical-scale variables are addressed. Multiphase results are verified using Laplace's law, and artificial behaviors of the model are explored. In this study, a better understanding of the LBM and the Interparticle Potential model is gained. This allows the numerical method to be used for comparison with experimental results in the future and provides a better understanding of multiphase physics overall.
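
    The Laplace-law verification mentioned above reduces to a one-parameter fit. The sketch below (not the thesis code; the values are synthetic) recovers the surface tension as the slope of the pressure jump versus inverse droplet radius, using the 2-D form dP = sigma / R.

```python
# Minimal sketch of the Laplace-law check used to verify multiphase LBM
# results: for a 2-D droplet, dP = sigma / R, so a least-squares fit of
# dP against 1/R (through the origin) yields the surface tension sigma.
def fit_surface_tension(radii, pressure_jumps):
    """Least-squares slope of dP vs 1/R through the origin."""
    xs = [1.0 / r for r in radii]
    num = sum(x * p for x, p in zip(xs, pressure_jumps))
    den = sum(x * x for x in xs)
    return num / den

# Synthetic data with sigma = 0.05 in lattice units (illustrative values)
radii = [10.0, 15.0, 20.0, 30.0]
dps = [0.05 / r for r in radii]
print(fit_surface_tension(radii, dps))  # -> ~0.05 (recovers sigma)
```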

  1. Viability of Cross-Flow Fan with Helical Blades for Vertical Take-off and Landing Aircraft

    DTIC Science & Technology

    2012-09-01

    fluid dynamics (CFD) software, ANSYS-CFX, a three-dimensional (3-D) straight-bladed model was validated against previous study’s experimental results...

  2. Protocol for Reliability Assessment of Structural Health Monitoring Systems Incorporating Model-assisted Probability of Detection (MAPOD) Approach

    DTIC Science & Technology

    2011-09-01

    a quality evaluation with limited data, a model-based assessment must be...that affect system performance, a multistage approach to system validation, a modeling and experimental methodology for efficiently addressing a wide range

  3. Work plan for improving the DARWIN2.3 depleted material balance calculation of nuclides of interest for the fuel cycle

    NASA Astrophysics Data System (ADS)

    Rizzo, Axel; Vaglio-Gaudard, Claire; Martin, Julie-Fiona; Noguère, Gilles; Eschbach, Romain

    2017-09-01

    DARWIN2.3 is the reference package used for fuel cycle applications in France. It solves the Boltzmann and Bateman equations in a coupled manner, with the European JEFF-3.1.1 nuclear data library, to compute the fuel cycle values of interest. It includes the deterministic transport codes APOLLO2 (for light water reactors) and ERANOS2 (for fast reactors) and the DARWIN/PEPIN2 depletion code, each developed by CEA/DEN with the support of its industrial partners. The DARWIN2.3 package has been experimentally validated for pressurized and boiling water reactors, as well as for sodium fast reactors; this experimental validation relies on the analysis of post-irradiation experiments (PIE). The DARWIN2.3 experimental validation work points out some isotopes for which the depleted concentration calculation can be improved. Some other nuclides have no available experimental validation, and their concentration calculation uncertainty is provided by the propagation of a priori nuclear data uncertainties. This paper describes the work plan of studies initiated this year to improve the accuracy of the DARWIN2.3 depleted material balance calculation for nuclides of interest for the fuel cycle.
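
    The Bateman side of the coupled problem can be illustrated with the analytic two-member chain that full depletion solvers such as DARWIN/PEPIN2 generalize to arbitrary chains with transmutation. The decay constants below are arbitrary illustrative values, not data from the paper.

```python
# Sketch of the two-member Bateman solution for a decay chain 1 -> 2,
# with N2(0) = 0 and distinct decay constants lam1 != lam2.
import math

def bateman_two(n1_0, lam1, lam2, t):
    """Analytic N1(t), N2(t) for the chain 1 -> 2 with N2(0) = 0."""
    n1 = n1_0 * math.exp(-lam1 * t)
    n2 = (n1_0 * lam1 / (lam2 - lam1)
          * (math.exp(-lam1 * t) - math.exp(-lam2 * t)))
    return n1, n2

# Illustrative decay constants: the daughter builds up, then decays away
n1, n2 = bateman_two(1.0, 0.1, 0.2, 5.0)
print(n1, n2)
```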

  4. Experimental and Numerical Analysis of Triaxially Braided Composites Utilizing a Modified Subcell Modeling Approach

    NASA Technical Reports Server (NTRS)

    Cater, Christopher; Xiao, Xinran; Goldberg, Robert K.; Kohlman, Lee W.

    2015-01-01

    A combined experimental and analytical approach was performed for characterizing and modeling triaxially braided composites with a modified subcell modeling strategy. Tensile coupon tests were conducted on a [0deg/60deg/-60deg] braided composite at angles of 0deg, 30deg, 45deg, 60deg and 90deg relative to the axial tow of the braid. It was found that measured coupon strength varied significantly with the angle of the applied load and each coupon direction exhibited unique final failures. The subcell modeling approach implemented into the finite element software LS-DYNA was used to simulate the various tensile coupon test angles. The modeling approach was successful in predicting both the coupon strength and reported failure mode for the 0deg, 30deg and 60deg loading directions. The model over-predicted the strength in the 90deg direction; however, the experimental results show a strong influence of free edge effects on damage initiation and failure. In the absence of these local free edge effects, the subcell modeling approach showed promise as a viable and computationally efficient analysis tool for triaxially braided composite structures. Future work will focus on validation of the approach for predicting the impact response of the braided composite against flat panel impact tests.

  6. Applying Knowledge of Enzyme Biochemistry to the Prediction of Functional Sites for Aiding Drug Discovery.

    PubMed

    Pai, Priyadarshini P; Mondal, Sukanta

    2017-01-01

    Enzymes are biological catalysts that play an important role in determining the patterns of chemical transformations pertaining to life. Many milestones have been achieved in unraveling the mechanisms by which enzymes orchestrate various cellular processes using experimental and computational approaches. Experimental studies generating nearly all possible mutations of target enzymes have been aided by rapid computational approaches aimed at enzyme functional classification, understanding domain organization, and identifying functional sites. This functional architecture mediates binding or interaction with ligands, including substrates, products, cofactors and inhibitors, and thereby underlies functions such as catalysis, ligand-mediated cell signaling, allosteric regulation and post-translational modification. With the increasing availability of enzyme information and advances in algorithm development, computational approaches have become more capable of providing precise inputs for enzyme engineering, making the process more efficient. This has led to interesting findings, especially concerning aberrant enzyme interactions, such as host-pathogen interactions in infection, neurodegenerative diseases, cancer and diabetes. This review summarizes, in retrospect, the knowledge mined, the perspectives gained and the strides made in using experimentally validated enzyme information for characterization. An analytical outlook on future directions is presented. Copyright© Bentham Science Publishers.

  7. Experimental task-based optimization of a four-camera variable-pinhole small-animal SPECT system

    NASA Astrophysics Data System (ADS)

    Hesterman, Jacob Y.; Kupinski, Matthew A.; Furenlid, Lars R.; Wilson, Donald W.

    2005-04-01

    We have previously utilized lumpy object models and simulated imaging systems in conjunction with the ideal observer to compute figures of merit for hardware optimization. In this paper, we describe the development of methods and phantoms necessary to validate or experimentally carry out these optimizations. Our study was conducted on a four-camera small-animal SPECT system that employs interchangeable pinhole plates to operate under a variety of pinhole configurations and magnifications (representing optimizable system parameters). We developed a small-animal phantom capable of producing random backgrounds for each image sequence. The task chosen for the study was the detection of a 2 mm diameter sphere within the phantom-generated random background. A total of 138 projection images were used, half of which included the signal. As our observer, we employed the channelized Hotelling observer (CHO) with Laguerre-Gauss channels. The signal-to-noise ratio (SNR) of this observer was used to compare different system configurations. Results indicate agreement between experimental and simulated data, with higher detectability rates found for multiple-camera, multiple-pinhole, and high-magnification systems, although it was found that mixtures of magnifications often outperform systems employing a single magnification. This work will serve as a basis for future studies pertaining to system hardware optimization.
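
    The CHO figure of merit used here has a compact form: images are reduced to channel outputs v = T^T g, and SNR^2 = dv^T S^{-1} dv, where dv is the difference of mean channel outputs between signal-present and signal-absent images and S is the average channel covariance. The sketch below implements this with placeholder Gaussian data, not the SPECT measurements from the study.

```python
# Hedged sketch of the channelized Hotelling observer (CHO) SNR.
# The random channel outputs below are synthetic placeholders.
import numpy as np

def cho_snr(v_signal, v_noise):
    """CHO SNR from (n_images, n_channels) channel-output arrays."""
    dv = v_signal.mean(axis=0) - v_noise.mean(axis=0)
    s = 0.5 * (np.cov(v_signal, rowvar=False) + np.cov(v_noise, rowvar=False))
    # SNR^2 = dv^T S^{-1} dv, computed via a linear solve for stability
    return float(np.sqrt(dv @ np.linalg.solve(s, dv)))

rng = np.random.default_rng(0)
v_absent = rng.normal(0.0, 1.0, size=(200, 5))   # signal-absent outputs
v_present = rng.normal(0.5, 1.0, size=(200, 5))  # shifted mean = "signal"
print(cho_snr(v_present, v_absent))
```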

  8. Concept Development for Future Domains: A New Method of Knowledge Elicitation

    DTIC Science & Technology

    2005-06-01

    Procedure: U.S. Army Research Institute for the Behavioral and Social Sciences (ARI) examined methods to generate, refine, test, and validate new...generate, elaborate, refine, describe, test, and validate new Future Force concepts relating to doctrine, tactics, techniques, procedures, unit and team...System (Harvey, 1993), and the Job Element Method (Primoff & Eyde, 1988). Figure 1 provides a more comprehensive list of task analytic methods. Please see

  9. Remembering the past and planning for the future in rats

    PubMed Central

    Crystal, Jonathon D.

    2012-01-01

    A growing body of research suggests that rats represent and remember specific earlier events from the past. An important criterion for validating a rodent model of episodic memory is to establish that the content of the representation is about a specific event in the past rather than vague information about remoteness. Recent evidence suggests that rats may also represent events that are anticipated to occur in the future. An important capacity afforded by a representation of the future is the ability to plan for the occurrence of a future event. However, relatively little is known about the content of represented future events and the cognitive mechanisms that may support planning. This article reviews evidence that rats remember specific earlier events from the past and represent events that are anticipated to occur in the future, and develops criteria for validating a rodent model of future planning. These criteria include representing a specific time in the future, the ability to temporarily disengage from a plan and reactivate the plan at an appropriate time in the future, and flexibility to deploy a plan in novel conditions. PMID:23219951

  10. The Ca(2+)-EDTA chelation as standard reaction to validate Isothermal Titration Calorimeter measurements (ITC).

    PubMed

    Ràfols, Clara; Bosch, Elisabeth; Barbas, Rafael; Prohens, Rafel

    2016-07-01

    A study of the suitability of the chelation reaction of Ca(2+) with ethylenediaminetetraacetic acid (EDTA) as a validation standard for isothermal titration calorimeter measurements has been performed, exploring the common experimental variables (buffer, pH, ionic strength and temperature). Results obtained in a variety of experimental conditions have been corrected for the side reactions involved in the main process and for the experimental ionic strength and, finally, validated by comparison with the potentiometric reference values. It is demonstrated that the chelation reaction performed in acetate buffer 0.1 M at 25°C gives accurate and precise results and is robust enough to be adopted as a standard calibration process. Copyright © 2016 Elsevier B.V. All rights reserved.

  11. Experimental validation of an ultrasonic flowmeter for unsteady flows

    NASA Astrophysics Data System (ADS)

    Leontidis, V.; Cuvier, C.; Caignaert, G.; Dupont, P.; Roussette, O.; Fammery, S.; Nivet, P.; Dazin, A.

    2018-04-01

    An ultrasonic flowmeter was developed for further applications in cryogenic conditions and for measuring flow rate fluctuations in the range of 0 to 70 Hz. The prototype was installed in a flow test rig, and was validated experimentally both in steady and unsteady water flow conditions. A Coriolis flowmeter was used for the calibration under steady state conditions, whereas in the unsteady case the validation was done simultaneously against two methods: particle image velocimetry (PIV), and pressure transducers installed flush on the wall of the pipe. The results show that the developed flowmeter and the proposed methodology can accurately measure the frequency and amplitude of unsteady fluctuations in the experimental range of 0-9 l/s of the mean main flow rate and 0-70 Hz of the imposed disturbances.
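
    The paper does not reproduce the instrument internals, but the transit-time principle behind ultrasonic flowmetering is easy to state: with acoustic path length L, beam angle theta and sound speed c, the downstream and upstream transit times are t_down = L/(c + v cos theta) and t_up = L/(c - v cos theta), so the axial velocity follows from the two measured times alone. The sketch below uses assumed values (water, c = 1480 m/s) for a round-trip check; it is a generic illustration, not the authors' instrument.

```python
# Transit-time ultrasonic flowmetering: recover the axial velocity from
# the measured downstream and upstream transit times.
#   v = L / (2*cos(theta)) * (1/t_down - 1/t_up)
import math

def velocity_from_transit_times(t_down, t_up, L, theta):
    """Axial flow velocity from transit times along a path at angle theta."""
    return L / (2.0 * math.cos(theta)) * (1.0 / t_down - 1.0 / t_up)

# Round-trip check with assumed values: water (c = 1480 m/s), v = 2 m/s
L, theta, c, v = 0.1, math.radians(45.0), 1480.0, 2.0
t_down = L / (c + v * math.cos(theta))
t_up = L / (c - v * math.cos(theta))
print(velocity_from_transit_times(t_down, t_up, L, theta))  # -> ~2.0
```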

  12. Flight Research and Validation Formerly Experimental Capabilities Supersonic Project

    NASA Technical Reports Server (NTRS)

    Banks, Daniel

    2009-01-01

    This slide presentation reviews the work of the Experimental Capabilities Supersonic project, which is being reorganized into Flight Research and Validation. The work of the Experimental Capabilities Project in FY '09 is reviewed, and the specific centers assigned to the work are given. The portfolio of the newly formed Flight Research and Validation (FRV) group is also reviewed, and the group's FY '10 projects are detailed. These projects include: Eagle Probe, Channeled Centerbody Inlet Experiment (CCIE), Supersonic Boundary Layer Transition test (SBLT), Aero-elastic Test Wing-2 (ATW-2), G-V External Vision Systems (G5 XVS), Air-to-Air Schlieren (A2A), In-Flight Background Oriented Schlieren (BOS), Dynamic Inertia Measurement Technique (DIM), and Advanced In-Flight IR Thermography (AIR-T).

  13. Development and validation of the coping with terror scale.

    PubMed

    Stein, Nathan R; Schorr, Yonit; Litz, Brett T; King, Lynda A; King, Daniel W; Solomon, Zahava; Horesh, Danny

    2013-10-01

    Terrorism creates lingering anxiety about future attacks. In prior terror research, the conceptualization and measurement of coping behaviors were constrained by the use of existing coping scales that index reactions to daily hassles and demands. The authors created and validated the Coping with Terror Scale to fill the measurement gap. The authors emphasized content validity, leveraging the knowledge of terror experts and groups of Israelis. A multistep approach involved construct definition and item generation, trimming and refining the measure, exploring the factor structure underlying item responses, and garnering evidence for reliability and validity. The final scale comprised six factors that were generally consistent with the authors' original construct specifications. Scores on items linked to these factors demonstrate good reliability and validity. Future studies using the Coping with Terror Scale with other populations facing terrorist threats are needed to test its ability to predict resilience, functional impairment, and psychological distress.

  14. Aerodynamic Database Development for Mars Smart Lander Vehicle Configurations

    NASA Technical Reports Server (NTRS)

    Bobskill, Glenn J.; Parikh, Paresh C.; Prabhu, Ramadas K.; Tyler, Erik D.

    2002-01-01

    An aerodynamic database has been generated for the Mars Smart Lander Shelf-All configuration using computational fluid dynamics (CFD) simulations. Three different CFD codes were used: USM3D and FELISA, based on unstructured grid technology, and LAURA, an established and validated structured-grid code. As part of this database development, the results for the Mars continuum were validated with experimental data and comparisons made where applicable. The validation of USM3D and LAURA against the Unitary experimental data, the use of intermediate LAURA check analyses, and the validation of FELISA against the Mach 6 CF4 experimental data provided higher confidence in the ability of CFD to supply aerodynamic data for determining the static trim characteristics for longitudinal stability. The analyses of the noncontinuum regime showed the existence of multiple trim angles of attack that can be unstable or stable trim points. This information is needed to design the guidance controller throughout the trajectory.

  15. Issues and approach to develop validated analysis tools for hypersonic flows: One perspective

    NASA Technical Reports Server (NTRS)

    Deiwert, George S.

    1993-01-01

    Critical issues concerning the modeling of low density hypervelocity flows where thermochemical nonequilibrium effects are pronounced are discussed. Emphasis is on the development of validated analysis tools, and the activity in the NASA Ames Research Center's Aerothermodynamics Branch is described. Inherent in the process is a strong synergism between ground test and real gas computational fluid dynamics (CFD). Approaches to develop and/or enhance phenomenological models and incorporate them into computational flowfield simulation codes are discussed. These models were partially validated with experimental data for flows where the gas temperature is raised (compressive flows). Expanding flows, where temperatures drop, however, exhibit somewhat different behavior. Experimental data for these expanding flow conditions is sparse and reliance must be made on intuition and guidance from computational chemistry to model transport processes under these conditions. Ground based experimental studies used to provide necessary data for model development and validation are described. Included are the performance characteristics of high enthalpy flow facilities, such as shock tubes and ballistic ranges.

  16. Radiant Energy Measurements from a Scaled Jet Engine Axisymmetric Exhaust Nozzle for a Baseline Code Validation Case

    NASA Technical Reports Server (NTRS)

    Baumeister, Joseph F.

    1994-01-01

    A non-flowing, electrically heated test rig was developed to verify computer codes that calculate radiant energy propagation from nozzle geometries that represent aircraft propulsion nozzle systems. Since there are a variety of analysis tools used to evaluate thermal radiation propagation from partially enclosed nozzle surfaces, an experimental benchmark test case was developed for code comparison. This paper briefly describes the nozzle test rig and the developed analytical nozzle geometry used to compare the experimental and predicted thermal radiation results. A major objective of this effort was to make available the experimental results and the analytical model in a format to facilitate conversion to existing computer code formats. For code validation purposes this nozzle geometry represents one validation case for one set of analysis conditions. Since each computer code has advantages and disadvantages based on scope, requirements, and desired accuracy, the usefulness of this single nozzle baseline validation case can be limited for some code comparisons.

  17. Issues and approach to develop validated analysis tools for hypersonic flows: One perspective

    NASA Technical Reports Server (NTRS)

    Deiwert, George S.

    1992-01-01

    Critical issues concerning the modeling of low-density hypervelocity flows where thermochemical nonequilibrium effects are pronounced are discussed. Emphasis is on the development of validated analysis tools. A description of the activity in the Ames Research Center's Aerothermodynamics Branch is also given. Inherent in the process is a strong synergism between ground test and real-gas computational fluid dynamics (CFD). Approaches to develop and/or enhance phenomenological models and incorporate them into computational flow-field simulation codes are discussed. These models have been partially validated with experimental data for flows where the gas temperature is raised (compressive flows). Expanding flows, where temperatures drop, however, exhibit somewhat different behavior. Experimental data for these expanding flow conditions are sparse; reliance must be made on intuition and guidance from computational chemistry to model transport processes under these conditions. Ground-based experimental studies used to provide necessary data for model development and validation are described. Included are the performance characteristics of high-enthalpy flow facilities, such as shock tubes and ballistic ranges.

  18. Fractional viscoelasticity in fractal and non-fractal media: Theory, experimental validation, and uncertainty analysis

    NASA Astrophysics Data System (ADS)

    Mashayekhi, Somayeh; Miles, Paul; Hussaini, M. Yousuff; Oates, William S.

    2018-02-01

    In this paper, fractional and non-fractional viscoelastic models for elastomeric materials are derived and analyzed in comparison to experimental results. The viscoelastic models are derived by expanding thermodynamic balance equations for both fractal and non-fractal media. The order of the fractional time derivative is shown to strongly affect the accuracy of the viscoelastic constitutive predictions. Model validation uses experimental data describing viscoelasticity of the dielectric elastomer Very High Bond (VHB) 4910. Since these materials are known for their broad applications in smart structures, it is important to characterize and accurately predict their behavior across a large range of time scales. Whereas integer order viscoelastic models can yield reasonable agreement with data, the model parameters often lack robustness in prediction at different deformation rates. Fractional order models of viscoelasticity instead provide a framework that more accurately quantifies complex rate-dependent behavior. Prior research that has considered fractional order viscoelasticity lacks experimental validation and contains limited links between viscoelastic theory and fractional order derivatives. To address these issues, we use fractional order operators to experimentally validate fractional and non-fractional viscoelastic models in elastomeric solids using Bayesian uncertainty quantification. The fractional order model is found to be advantageous, as its predictions are significantly more accurate than those of integer order viscoelastic models for deformation rates spanning four orders of magnitude.
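    The order of the fractional time derivative is the key parameter in such models. As a hedged illustration of how a fractional derivative is evaluated numerically, the sketch below uses the generic Grünwald-Letnikov textbook discretization (not the authors' implementation); the test function and step size are assumptions chosen so the result can be checked against a known closed form.

```python
import math

# Grünwald-Letnikov discretization of a fractional derivative of order alpha.
# Generic textbook scheme, not the paper's implementation; f(t) = t and the
# step size h are illustrative assumptions.

def gl_weights(alpha, n):
    # Binomial weights w_k = (-1)^k * C(alpha, k), built by recurrence
    w = [1.0]
    for k in range(1, n + 1):
        w.append(w[-1] * (k - 1 - alpha) / k)
    return w

def gl_derivative(f, t, alpha, h=1e-3):
    # Approximate the order-alpha fractional derivative of f at time t
    n = int(round(t / h))
    w = gl_weights(alpha, n)
    return sum(w[k] * f(t - k * h) for k in range(n + 1)) / h**alpha

# For f(t) = t with f(0) = 0, the exact result is t^(1-alpha) / Gamma(2-alpha)
alpha, t = 0.5, 1.0
numeric = gl_derivative(lambda s: s, t, alpha)
exact = t**(1 - alpha) / math.gamma(2 - alpha)  # approx. 1.128 here
```

    As alpha approaches 1 the operator recovers the ordinary first derivative, which is one way such models interpolate between integer-order viscoelastic limits.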

  19. Perceived visual informativeness (PVI): construct and scale development to assess visual information in printed materials.

    PubMed

    King, Andy J; Jensen, Jakob D; Davis, LaShara A; Carcioppolo, Nick

    2014-01-01

    There is a paucity of research on the visual images used in health communication messages and campaign materials. Even though many studies suggest further investigation of these visual messages and their features, few studies provide specific constructs or assessment tools for evaluating the characteristics of visual messages in health communication contexts. The authors conducted 2 studies to validate a measure of perceived visual informativeness (PVI), a message construct assessing visual messages presenting statistical or indexical information. In Study 1, a 7-item scale was created that demonstrated good internal reliability (α = .91), as well as convergent and divergent validity with related message constructs such as perceived message quality, perceived informativeness, and perceived attractiveness. PVI also converged with a preference for visual learning but was unrelated to a person's actual vision ability. In addition, PVI exhibited concurrent validity with a number of important constructs including perceived message effectiveness, decisional satisfaction, and three key behavior predictors from public health theory: perceived benefits, perceived barriers, and self-efficacy. Study 2 provided more evidence that PVI is an internally reliable measure and demonstrated that PVI is a modifiable message feature that can be tested in future experimental work. PVI provides an initial step to assist in the evaluation and testing of visual messages in campaign and intervention materials promoting informed decision making and behavior change.

  20. Improving cultural diversity awareness of physical therapy educators.

    PubMed

    Lazaro, Rolando T; Umphred, Darcy A

    2007-01-01

    In a climate of increasing diversity in the population of patients requiring physical therapy (PT) services, PT educators must prepare students and future clinicians to work competently in culturally diverse environments. To be able to achieve this goal, PT educators must be culturally competent as well. The purposes of the study were to develop a valid and reliable instrument to assess cultural diversity awareness and to develop an educational workshop to improve cultural diversity awareness of PT academic and clinical educators. Phase 1 of the study involved the development of an instrument to assess cultural diversity awareness. The Cultural Diversity Awareness Questionnaire (CDAQ) was developed, validated for content, analyzed for reliability, and field and pilot tested. Results indicated that the CDAQ has favorable psychometric properties. Phase 2 of the study involved the development and implementation of the Cultural Diversity Workshop (CDW). The seminar contents and class materials were developed, validated, and implemented as a one-day cultural diversity awareness seminar. A one-group, pretest-posttest experimental design was used, with participants who completed the CDAQ before and after the workshop. Results indicated that the workshop was effective in improving cultural diversity awareness of the participants. Results of the workshop evaluation affirmed the achievement of objectives and effectiveness of the facilitator. This study provided a solid initial foundation upon which a comprehensive cultural competence program can be developed.

  1. Numerical Simulations of Noise Generated by High Aspect Ratio Supersonic Rectangular Jets - Validation

    NASA Astrophysics Data System (ADS)

    Viswanath, Kamal; Johnson, Ryan; Kailasanath, Kailas; Malla, Bhupatindra; Gutmark, Ephraim

    2017-11-01

    The noise from high performance jet engines of both civilian and military aircraft is an area of active concern. Asymmetric exhaust nozzle configurations, in particular rectangular ones, potentially offer a passive way of modulating the farfield noise and are likely to become more important in the future. High aspect ratio nozzles offer the further benefit of easier airframe integration. In this study we validate the far field noise for ideally expanded and overexpanded supersonic jets issuing from a high aspect ratio rectangular nozzle geometry. Validation of the acoustic data is performed against experimentally recorded sound pressure level (SPL) spectra for a host of observer locations around the asymmetric nozzle. Data are presented for a slightly heated jet case at both nozzle pressure ratios. The contrast in the noise profile between low aspect ratio rectangular and circular nozzle jets is highlighted, especially the variation in the azimuthal direction that shows ``quiet'' and ``loud'' planes in the farfield in the peak noise direction. This variation is analyzed in the context of the effect of mixing at the sharp corners, the sense of the vortex pairs set up in the exit plane, and the evolution of the high aspect ratio exit cross-section as it propagates downstream, including possible axis switching. Supported by Office of Naval Research (ONR) through the Computational Physics Task Area under the NRL 6.1 Base Program.
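    The SPL spectra used for validation are referenced to the standard 20 µPa. As a minimal sketch of the underlying decibel bookkeeping (the signal below is an invented pure tone, not the study's data pipeline):

```python
import math

P_REF = 20e-6  # standard reference pressure in air, Pa

def overall_spl(pressure_samples):
    """Overall sound pressure level (dB re 20 uPa) of a pressure time series."""
    p_rms = math.sqrt(sum(p * p for p in pressure_samples) / len(pressure_samples))
    return 20.0 * math.log10(p_rms / P_REF)

# Example: a pure tone with RMS pressure 0.02 Pa (amplitude 0.02*sqrt(2) Pa)
amp = 0.02 * math.sqrt(2.0)
tone = [amp * math.sin(2 * math.pi * 10 * k / 1000) for k in range(1000)]
spl = overall_spl(tone)  # 0.02 Pa RMS corresponds to 60 dB re 20 uPa
```

    Narrowband spectra such as those reported here apply the same reference after splitting the signal into frequency bins.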

  2. Validation Methods for Fault-Tolerant Avionics and Control Systems, Working Group Meeting 1

    NASA Technical Reports Server (NTRS)

    1979-01-01

    The proceedings of the first working group meeting on validation methods for fault tolerant computer design are presented. The state of the art in fault tolerant computer validation was examined in order to provide a framework for future discussions concerning research issues for the validation of fault tolerant avionics and flight control systems. Positions concerning critical aspects of the validation process were also developed.

  3. If We Don’t, Who Will? The Employment of the United States Army to Combat Potential Pandemic Outbreaks in West Africa

    DTIC Science & Technology

    2015-06-12

    Threats to validity and bias affect the accuracy of the research and the soundness of its conclusions. Several issues, including threats to external validity, pose a threat to validity and bias in this research; these are examined before drawing conclusions and making recommendations for future research.

  4. Land Product Validation (LPV)

    NASA Technical Reports Server (NTRS)

    Schaepman, Gabriela; Roman, Miguel O.

    2013-01-01

    This presentation will discuss Land Product Validation (LPV) objectives and goals, an LPV structure update, interactions with other initiatives during the report period, outreach to the science community, future meetings, and next steps.

  5. Systematic Review of Measures Used in Pictorial Cigarette Pack Warning Experiments.

    PubMed

    Francis, Diane B; Hall, Marissa G; Noar, Seth M; Ribisl, Kurt M; Brewer, Noel T

    2017-10-01

    We sought to describe characteristics and psychometric properties of measures used in pictorial cigarette pack warning experiments and provide recommendations for future studies. Our systematic review identified 68 pictorial cigarette pack warning experiments conducted between 2000 and 2016 in 22 countries. Two independent coders coded all studies on study features, including sample characteristics, theoretical framework, and constructs assessed. We also coded measurement characteristics, including construct, number of items, source, reliability, and validity. We identified 278 measures representing 61 constructs. The most commonly assessed construct categories were warning reactions (62% of studies) and perceived effectiveness (60%). The most commonly used outcomes were affective reactions (35%), perceived likelihood of harm (22%), intention to quit smoking (22%), perceptions that warnings motivate people to quit smoking (18%), and credibility (16%). Only 4 studies assessed smoking behavior. More than half (54%) of all measures were single items. For multi-item measures, studies reported reliability data 68% of the time (mean α = 0.88, range α = 0.68-0.98). Studies reported sources of measures only 33% of the time and rarely reported validity data. Of 68 studies, 37 (54%) mentioned a theory as informing the study. Our review found great variability in constructs and measures used to evaluate the impact of cigarette pack pictorial warnings. Many measures were single items with unknown psychometric properties. Recommendations for future studies include a greater emphasis on theoretical models that inform measurement, use of reliable and validated (preferably multi-item) measures, and better reporting of measure sources. Robust and consistent measurement is important for building a strong, cumulative evidence base to support pictorial cigarette pack warning policies. 
This systematic review of experimental studies of pictorial cigarette warnings demonstrates the need for standardized, theory-based measures.
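    The reliability figures reported above (mean α = 0.88, range 0.68-0.98) are Cronbach's alpha values for multi-item measures. A small self-contained sketch of that computation, using invented response data rather than any study's scores:

```python
def variance(xs):
    # Sample variance (n - 1 denominator)
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

def cronbach_alpha(scores):
    """scores: one row per respondent, one column per scale item."""
    k = len(scores[0])
    item_vars = [variance([row[i] for row in scores]) for i in range(k)]
    total_var = variance([sum(row) for row in scores])
    return k / (k - 1) * (1 - sum(item_vars) / total_var)

# Hypothetical 5-respondent, 3-item scale with fairly consistent responses
scores = [[4, 5, 4], [2, 2, 3], [5, 5, 5], [1, 2, 1], [3, 3, 3]]
alpha = cronbach_alpha(scores)  # high internal consistency for this data
```

    Single-item measures, which the review found to dominate, cannot be assessed this way, which is one reason the authors recommend multi-item measures.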

  6. 7 CFR 27.43 - Validity of cotton class certificates.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 7 Agriculture 2 2014-01-01 2014-01-01 false Validity of cotton class certificates. 27.43 Section... CONTAINER REGULATIONS COTTON CLASSIFICATION UNDER COTTON FUTURES LEGISLATION Regulations Cotton Class Certificates § 27.43 Validity of cotton class certificates. Each cotton class certificate for cotton classified...

  7. 7 CFR 27.43 - Validity of cotton class certificates.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 7 Agriculture 2 2011-01-01 2011-01-01 false Validity of cotton class certificates. 27.43 Section... CONTAINER REGULATIONS COTTON CLASSIFICATION UNDER COTTON FUTURES LEGISLATION Regulations Cotton Class Certificates § 27.43 Validity of cotton class certificates. Each cotton class certificate for cotton classified...

  8. 7 CFR 27.43 - Validity of cotton class certificates.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 7 Agriculture 2 2012-01-01 2012-01-01 false Validity of cotton class certificates. 27.43 Section... CONTAINER REGULATIONS COTTON CLASSIFICATION UNDER COTTON FUTURES LEGISLATION Regulations Cotton Class Certificates § 27.43 Validity of cotton class certificates. Each cotton class certificate for cotton classified...

  9. 7 CFR 27.43 - Validity of cotton class certificates.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 2 2010-01-01 2010-01-01 false Validity of cotton class certificates. 27.43 Section... CONTAINER REGULATIONS COTTON CLASSIFICATION UNDER COTTON FUTURES LEGISLATION Regulations Cotton Class Certificates § 27.43 Validity of cotton class certificates. Each cotton class certificate for cotton classified...

  10. 7 CFR 27.43 - Validity of cotton class certificates.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 7 Agriculture 2 2013-01-01 2013-01-01 false Validity of cotton class certificates. 27.43 Section... CONTAINER REGULATIONS COTTON CLASSIFICATION UNDER COTTON FUTURES LEGISLATION Regulations Cotton Class Certificates § 27.43 Validity of cotton class certificates. Each cotton class certificate for cotton classified...

  11. Computer Simulations of Coronary Blood Flow Through a Constriction

    DTIC Science & Technology

    2014-03-01

    Building off previous models that have been partially validated with experimental data, this thesis continues to develop computer simulations of coronary blood flow through a constriction, with application to interventional procedures (e.g., stent deployment). Generally a stent, or a mesh wire tube, is permanently inserted in order to scaffold open the artery wall and increase blood flow.

  12. Perceptions vs Reality: A Longitudinal Experiment in Influenced Judgement Performance

    DTIC Science & Technology

    2003-03-25

    Threats to validity were manifested equally between treatment and control groups, thereby lending further validity to the experimental research design. Campbell and Stanley (1975) identify this as a true experimental design: the pretest-posttest control group design, adapted here for the required longitudinal aspect. Nonequivalence is ruled out when pretest equivalence is shown between treatment and control groups (1975:47).

  13. Design, development and method validation of a novel multi-resonance microwave sensor for moisture measurement.

    PubMed

    Peters, Johanna; Taute, Wolfgang; Bartscher, Kathrin; Döscher, Claas; Höft, Michael; Knöchel, Reinhard; Breitkreutz, Jörg

    2017-04-08

    Microwave sensor systems using resonance technology at a single resonance in the range of 2-3 GHz have been shown to be a rapid and reliable tool for moisture determination in solid materials, including pharmaceutical granules. So far, their application has been limited to lower moisture ranges, or limitations above certain moisture contents had to be accepted. The aim of the present study was to develop a novel multi-resonance sensor system in order to expand the measurement range. Therefore, a novel sensor using additional resonances over a wide frequency band was designed and used to investigate inherent limitations of first-generation sensor systems and material-related limits. Using granule samples with different moisture contents, an experimental protocol for calibration and validation of the method was established. Pursuant to this protocol, a multiple linear regression (MLR) prediction model, built by correlating microwave moisture values to the moisture determined by Karl Fischer titration, was chosen and rated using conventional criteria such as the coefficient of determination (R²) and the root mean square error of calibration (RMSEC). Using different operators, different analysis dates and different ambient conditions, the method was fully validated following the guidance of ICH Q2(R1). The study clearly showed explanations for the measurement uncertainties of first-generation sensor systems, which confirmed the approach of overcoming these by using additional resonances. The established prediction model could be validated in the range of 7.6-19.6% moisture, demonstrating its fitness for its future purpose, the determination of moisture content during wet granulations.
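    The calibration approach described, an MLR model rated by R² and RMSEC, can be sketched as follows. The two microwave readings per sample and the Karl Fischer reference values are invented for illustration and are not the study's data:

```python
import numpy as np

# Hypothetical calibration set: two microwave readings per sample (e.g. shifts
# of two resonances) against Karl Fischer reference moisture (%).
X = np.array([[0.10, 0.30], [0.20, 0.55], [0.35, 0.90],
              [0.50, 1.30], [0.65, 1.70], [0.80, 2.10]])
y = np.array([7.6, 9.5, 12.0, 14.5, 17.0, 19.6])

A = np.column_stack([np.ones(len(X)), X])     # intercept + predictors
coef, *_ = np.linalg.lstsq(A, y, rcond=None)  # least-squares MLR fit
pred = A @ coef

rmsec = float(np.sqrt(np.mean((y - pred) ** 2)))  # calibration error, % moisture
r2 = float(1 - np.sum((y - pred) ** 2) / np.sum((y - y.mean()) ** 2))
```

    Validation per ICH Q2(R1) then checks the model on independent samples, operators, days, and ambient conditions rather than on the calibration set alone.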

  14. Methodology used in comparative studies assessing programmes of transition from paediatrics to adult care programmes: a systematic review.

    PubMed

    Le Roux, E; Mellerio, H; Guilmin-Crépon, S; Gottot, S; Jacquin, P; Boulkedid, R; Alberti, C

    2017-01-27

    To explore the methodologies employed in studies assessing transition of care interventions, with the aim of defining goals for the improvement of future studies. Systematic review of comparative studies assessing transition to adult care interventions for young people with chronic conditions. MEDLINE, EMBASE, ClinicalTrials.gov. 2 reviewers screened comparative studies with experimental and quasi-experimental designs, published or registered before July 2015. Eligible studies evaluate transition interventions at least in part after transfer to adult care of young people with chronic conditions, with at least one outcome assessed quantitatively. 39 studies were reviewed; 26/39 (67%) had published their final results and 13/39 (33%) were in progress. In 9 studies (9/39, 23%), comparisons were made between preintervention and postintervention in a single group. Randomised control groups were used in 9/39 (23%) studies. 2 (2/39, 5%) reported blinding strategies. Use of validated questionnaires was reported in 28% (11/39) of studies. In terms of reporting in published studies, 15/26 (58%) did not report age at transfer, and 6/26 (23%) did not report the time of collection of each outcome. Few evaluative studies exist and their methodological quality is variable. The complexity of interventions, the multiplicity of outcomes, the difficulty of blinding and the small groups of patients make it difficult to draw conclusions about the effectiveness of interventions. The evaluation of transition interventions requires an appropriate and common methodology which will provide access to a better level of evidence. We identified areas for improvement in terms of randomisation, recruitment and external validity, blinding, measurement validity, standardised assessment and reporting. Improvements will increase our capacity to determine effective interventions for transition care.

  15. Rapid Countermeasure Discovery against Francisella tularensis Based on a Metabolic Network Reconstruction

    PubMed Central

    Chaudhury, Sidhartha; Abdulhameed, Mohamed Diwan M.; Singh, Narender; Tawa, Gregory J.; D’haeseleer, Patrik M.; Zemla, Adam T.; Navid, Ali; Zhou, Carol E.; Franklin, Matthew C.; Cheung, Jonah; Rudolph, Michael J.; Love, James; Graf, John F.; Rozak, David A.; Dankmeyer, Jennifer L.; Amemiya, Kei; Daefler, Simon; Wallqvist, Anders

    2013-01-01

    In the future, we may be faced with the need to provide treatment for an emergent biological threat against which existing vaccines and drugs have limited efficacy or availability. To prepare for this eventuality, our objective was to use a metabolic network-based approach to rapidly identify potential drug targets and prospectively screen and validate novel small-molecule antimicrobials. Our target organism was the fully virulent Francisella tularensis subspecies tularensis Schu S4 strain, a highly infectious intracellular pathogen that is the causative agent of tularemia and is classified as a category A biological agent by the Centers for Disease Control and Prevention. We proceeded with a staggered computational and experimental workflow that used a strain-specific metabolic network model, homology modeling and X-ray crystallography of protein targets, and ligand- and structure-based drug design. Candidate compounds were subsequently filtered using physiologically based pharmacokinetic modeling, and we selected a final set of 40 compounds for experimental validation of antimicrobial activity. We began screening these compounds in whole bacterial cell-based assays in biosafety level 3 facilities in the 20th week of the study and completed the screens within 12 weeks. Six compounds showed significant growth inhibition of F. tularensis, and we determined their respective minimum inhibitory concentrations and mammalian cell cytotoxicities. The most promising compound had a low molecular weight, was non-toxic, and abolished bacterial growth at 13 µM, with putative activity against pantetheine-phosphate adenylyltransferase, an enzyme involved in the biosynthesis of coenzyme A, encoded by gene coaD. The novel antimicrobial compounds identified in this study serve as starting points for lead optimization, animal testing, and drug development against tularemia. 
Our integrated in silico/in vitro approach had an overall 15% success rate in terms of active versus tested compounds over an elapsed time period of 32 weeks, from pathogen strain identification to selection and validation of novel antimicrobial compounds. PMID:23704901

  16. A systematic RE-AIM review to assess sugar-sweetened beverage interventions for children and adolescents across the socio-ecological model

    PubMed Central

    Porter, Kathleen; Estabrooks, Paul; Zoellner, Jamie

    2016-01-01

    Background Sugar-sweetened beverage (SSB) consumption among children and adolescents is a determinant of childhood obesity. Many programs to reduce consumption across the socio-ecological model report significant positive results; however, the generalizability of the results, including whether reporting differences exist among socio-ecological strategy levels, is unknown. Objectives This systematic review aims to (1) examine the extent to which studies reported internal and external validity indicators defined by RE-AIM (reach, effectiveness, adoption, implementation, maintenance) and (2) assess reporting differences by socio-ecological level: intrapersonal/interpersonal (Level 1), environmental/policy (Level 2), multi-level (Combined Level). Methods A systematic literature review of six major databases (PubMed, Web of Science, CINAHL, CAB Abstracts, ERIC, and AGRICOLA) was conducted to identify studies from 2004-2015 meeting inclusion criteria (targeting children aged 3-12, adolescents 13-17, and young adults 18 years; experimental/quasi-experimental; substantial SSB component). Interventions were categorized by socio-ecological level, and data were extracted using a validated RE-AIM protocol. A one-way ANOVA assessed differences between levels. Results There were 55 eligible studies accepted, including 21 Level 1, 18 Level 2, and 16 Combined Level studies. Thirty-six (65%) were conducted in the USA, 19 (35%) internationally, and 39 (71%) were implemented in schools. Across levels, reporting averages were low for all RE-AIM dimensions (reach=29%, efficacy/effectiveness=45%, adoption=26%, implementation=27%, maintenance=14%). Level 2 studies had significantly lower reporting on reach and effectiveness (10% and 26%, respectively) compared to Level 1 (44%, 57%) or Combined Level studies (31%, 52%) (p<0.001). Adoption, implementation, and maintenance reporting did not vary among levels. 
Conclusion Interventions to reduce SSB in children and adolescents across the socio-ecological spectrum do not provide the necessary information for dissemination and implementation in community nutrition settings. Future interventions should address both internal and external validity to maximize population impact. PMID:27262383

  17. The EUROSEISTEST Experimental Test Site in Greece

    NASA Astrophysics Data System (ADS)

    Pitilakis, K.; Manos, G.; Raptakis, D.; Anastasiadis, A.; Makra, K.; Manakou, M.

    2009-04-01

    The European experimental site EUROSEISTEST has been established since 1993 in the epicentral area of the June 20th 1978 earthquake (40.8˚ N, 23.2˚ E, Ms 6.5, Imax VIII+ MSK, Papazachos et al., 1979), located in the active tectonic Mygdonian basin, 30 km NNE of Thessaloniki, Greece. Euroseistest has been funded by the European Commission - Directorate General for Research and Development under the framework of consecutive EC research projects (EuroseisTest, EuroseisMod and EuroseisRisk). It is specially designed and dedicated to conducting experimental and theoretical studies on site effects, soil and site characterization, and soil-foundation-structure interaction phenomena. The geological, geophysical and geotechnical conditions of the Euroseistest valley (Mygdonian graben) are very well constrained through numerous in situ campaigns and laboratory tests. The permanent accelerometric network comprises 21 digital 3D stations, including vertical arrays down to 200 m (schist bedrock), covering an area of about 100 sq. km. The site is also covered by a permanent seismological network. A number of high quality recordings, from temporary and permanent arrays, have made it possible to perform advanced experimental and theoretical studies on site effects (e.g. Raptakis et al., 1998; Pitilakis et al., 1999; Raptakis et al., 2000; Chávez-García et al., 2000; Makra, 2000; Makra et al., 2001 & 2005). The main advantages of Euroseistest are the detailed knowledge of the 3D geological-geotechnical structure of the basin (Manakou, 2007) and its dense permanent accelerometric network. For this reason the site has recently been selected by CEA to validate and check the advanced numerical codes to be used in the Cadarache ITER project. Besides the study of site effects, Euroseistest offers interesting possibilities to study SSI problems through two model structures (scaled 1:3). 
    These are a 6-storey building and a bridge pier, which have been constructed and instrumented in the centre of the valley, close to the main vertical array. The Euroseistest experimental site provides a rigorous, high-quality database comprising geological, geotechnical, geophysical and seismological data, as well as a valuable set of experimental facilities for studying, both experimentally and theoretically, complex site effects and soil-foundation-structure problems. Numerous publications have already been released (see the web page). It is foreseen to strengthen in the near future the possibility of providing wide access to the European and international scientific community to perform joint studies, to validate models, and to improve or develop new ones.

  18. Cross-sections of residual nuclei from deuteron irradiation of thin thorium target at energy 7 GeV

    NASA Astrophysics Data System (ADS)

    Vespalec, Radek; Adam, Jindrich; Baldin, Anton Alexandrovich; Khushvaktov, Jurabek; Solnyshkin, Alexander Alexandrovich; Tsoupko-Sitnikov, Vsevolod Mikhailovich; Tyutyunikov, Sergey Ivanovich; Vrzalova, Jitka; Zavorka, Lukas; Zeman, Miroslav

    2017-09-01

    The residual nuclei yields are of great importance for the estimation of basic radiation-technology characteristics (such as total target activity, production of long-lived nuclides, etc.) of accelerator-driven systems planned for the transmutation of spent nuclear fuel, and for the design of radioisotope production facilities. Experimental data are also essential for the validation of nuclear codes describing the various stages of a spallation reaction. Therefore, the main aim of this work is to add new experimental data in the energy region of relativistic deuterons, as similar data are missing from nuclear databases. A sample made of thin natural thorium foil was irradiated at the JINR Nuclotron accelerator with a deuteron beam of total kinetic energy 7 GeV. The integral number of deuterons was determined with the use of aluminum activation detectors. Products of the deuteron-induced spallation reaction were identified and quantified by means of gamma-ray spectroscopy. Several important spectroscopic corrections were applied to obtain results of high accuracy. Experimental cumulative and independent cross-sections were determined for more than 80 isotopes, including meta-stable isomers. The total uncertainty of the results rarely exceeded 9%. Experimental results were compared with MCNP6.1 Monte-Carlo code predictions. Generally, experimental and calculated cross-sections are in reasonably good agreement, with the exception of a few light isotopes in the fragmentation region, where the calculations substantially underestimate the measurements. The measured data will be useful for the future development of high-energy nuclear codes. After completion, the final data will be added to the EXFOR database.
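    In such activation measurements, cross-sections follow from gamma-peak areas corrected for detection efficiency, branching ratio, and decay during irradiation, cooling, and counting. The sketch below shows that standard bookkeeping with invented values; it is not the paper's analysis, which includes further spectroscopic corrections:

```python
import math

def produced_nuclei(peak_counts, eff, branching, lam, t_irr, t_cool, t_meas):
    """Invert a gamma-peak area to the total number of nuclei produced,
    assuming a constant production rate during irradiation."""
    # Nuclei present at end of bombardment, undoing cooling and counting decay
    n_end = peak_counts * math.exp(lam * t_cool) / (
        eff * branching * (1.0 - math.exp(-lam * t_meas)))
    # Undo decay during irradiation (saturation correction)
    return n_end * lam * t_irr / (1.0 - math.exp(-lam * t_irr))

def cross_section_barns(n_produced, n_beam, areal_density_cm2):
    """sigma = yield / (beam particles * target nuclei per cm^2), in barns."""
    return n_produced / (n_beam * areal_density_cm2) / 1e-24

# Round-trip check with invented numbers: forward-model counts, then invert
lam = math.log(2) / 3600.0                       # 1 h half-life
t_irr, t_cool, t_meas = 7200.0, 1800.0, 3600.0   # seconds
eff, br = 0.05, 0.9
n_true = 1.0e8
n_end = n_true * (1 - math.exp(-lam * t_irr)) / (lam * t_irr)
counts = eff * br * n_end * math.exp(-lam * t_cool) * (1 - math.exp(-lam * t_meas))
n_rec = produced_nuclei(counts, eff, br, lam, t_irr, t_cool, t_meas)
```

    Cumulative versus independent cross-sections then differ in whether feeding from short-lived parents is folded into the yield.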

  19. Experimentally validated finite element model of electrocaloric multilayer ceramic structures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, N. A. S. (nadia.smith@npl.co.uk); Correia, T. M. (tatiana.correia@npl.co.uk); Rokosz, M. K. (maciej.rokosz@npl.co.uk)

    2014-07-28

    A novel finite element model to simulate the electrocaloric response of a multilayer ceramic capacitor (MLCC) under real environment and operational conditions has been developed. The two-dimensional transient conductive heat transfer model presented includes the electrocaloric effect as a source term, as well as accounting for radiative and convective effects. The model has been validated with experimental data obtained from the direct imaging of MLCC transient temperature variation under application of an electric field. The good agreement between simulated and experimental data suggests that the novel experimental direct measurement methodology and the finite element model could be used to support the design of optimised electrocaloric units and operating conditions.
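    The model described is a two-dimensional transient conduction solver with the electrocaloric effect entering as a source term. As a much-reduced sketch of that idea, here is a one-dimensional explicit finite-difference analogue with an assumed constant source; all material values are illustrative assumptions, and the radiative and convective terms of the paper's model are omitted:

```python
import numpy as np

# 1D explicit finite-difference analogue of transient conduction with a
# volumetric source standing in for electrocaloric heating (illustrative only).
L, nx = 1e-3, 21                     # slab thickness (m), grid points
k, rho, cp = 3.0, 5000.0, 500.0      # conductivity, density, specific heat (assumed)
alpha = k / (rho * cp)               # thermal diffusivity
dx = L / (nx - 1)
dt = 0.4 * dx**2 / alpha             # stable explicit time step (CFL factor < 0.5)

q = 1e7                              # assumed volumetric source (W/m^3)
T = np.zeros(nx)                     # temperature rise above ambient (K)
for _ in range(200):
    Tn = T.copy()
    T[1:-1] = (Tn[1:-1]
               + alpha * dt / dx**2 * (Tn[2:] - 2 * Tn[1:-1] + Tn[:-2])
               + dt * q / (rho * cp))
    T[0] = T[-1] = 0.0               # slab faces held at ambient temperature
```

    With a symmetric source and fixed faces the temperature profile peaks at mid-thickness; the paper's 2-D model additionally resolves the MLCC layer geometry and its surface losses.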

  20. Experimental and Quasi-Experimental Design.

    ERIC Educational Resources Information Center

    Cottrell, Edward B.

    With an emphasis on the problems of control of extraneous variables and threats to internal and external validity, the arrangement or design of experiments is discussed. The purpose of experimentation in an educational institution, and the principles governing true experimentation (randomization, replication, and control) are presented, as are…

  1. M and S supporting unmanned autonomous systems (UAxS) concept development and experimentation

    NASA Astrophysics Data System (ADS)

    Biagini, Marco; Scaccianoce, Alfio; Corona, Fabio; Forconi, Sonia; Byrum, Frank; Fowler, Olivia; Sidoran, James L.

    2017-05-01

    The development of the next generation of multi-domain unmanned semi- and fully autonomous C4ISR systems involves a multitude of security concerns and interoperability challenges. Conceptual solutions to capability shortfalls and gaps can be identified through Concept Development and Experimentation (CD and E) cycles. Modelling and Simulation (M and S) is a key tool in supporting unmanned autonomous systems (UAxS) CD and E activities and in addressing the associated security challenges. This paper illustrates the application of M and S to UAxS development and highlights initiatives by the North Atlantic Treaty Organization (NATO) M and S Centre of Excellence (CoE) to facilitate interoperability. The NATO M and S CoE collaborates with other NATO bodies and national bodies to develop UAxS projects, such as the Allied Command Transformation Counter Unmanned Autonomous Systems (CUAxS) project and the work of Science and Technology Organization (STO) panels. Some initiatives, such as the Simulated Interactive Robotics Initiative (SIRI), formed the baseline for further developments and for studying emerging technologies in the M and S and robotics fields. Artificial Intelligence algorithm modelling, Robot Operating Systems (ROS), network operations, cyber security, interoperable languages, and related data models are among the main aspects considered in this paper. In particular, the implementation of interoperable languages such as C-BML and NIEM MilOps is discussed in relation to a Command and Control - Simulation Interoperability (C2SIM) paradigm. All these technologies are used to build a conceptual architecture to support UAxS CD and E. In addition, other projects in which the NATO M and S CoE is involved, such as the NATO Urbanization Project, could provide credible future operational environments and benefit UAxS project development through the dual application of UAxS technology in large urbanized areas. In conclusion, this paper contains a detailed overview of how applying Modelling and Simulation to support CD and E activities is a valid approach to developing and validating future capability requirements in general and next-generation UAxS in particular.

  2. They See a Rat, We Seek a Cure for Diseases: The Current Status of Animal Experimentation in Medical Practice

    PubMed Central

    Kehinde, Elijah O.

    2013-01-01

    The objective of this review article was to examine current and prospective developments in the scientific use of laboratory animals, and to find out whether or not there are still valid scientific benefits of and justification for animal experimentation. The PubMed and Web of Science databases were searched using the following key words: animal models, basic research, pharmaceutical research, toxicity testing, experimental surgery, surgical simulation, ethics, animal welfare, benign, malignant diseases. Important relevant reviews, original articles and references from 1970 to 2012 were reviewed for data on the use of experimental animals in the study of diseases. The use of laboratory animals in scientific research continues to generate intense public debate. Their use can be justified today in the following areas of research: basic scientific research, use of animals as models for human diseases, pharmaceutical research and development, toxicity testing and teaching of new surgical techniques. This is because there are inherent limitations in the use of alternatives such as in vitro studies, human clinical trials or computer simulation. However, there are problems of transferability of results obtained from animal research to humans. Efforts are on-going to find suitable alternatives to animal experimentation like cell and tissue culture and computer simulation. For the foreseeable future, it would appear that to enable scientists to have a more precise understanding of human disease, including its diagnosis, prognosis and therapeutic intervention, there will still be enough grounds to advocate animal experimentation. However, efforts must continue to minimize or eliminate the need for animal testing in scientific research as soon as possible. PMID:24217224

  3. They see a rat, we seek a cure for diseases: the current status of animal experimentation in medical practice.

    PubMed

    Kehinde, Elijah O

    2013-01-01

    The objective of this review article was to examine current and prospective developments in the scientific use of laboratory animals, and to find out whether or not there are still valid scientific benefits of and justification for animal experimentation. The PubMed and Web of Science databases were searched using the following key words: animal models, basic research, pharmaceutical research, toxicity testing, experimental surgery, surgical simulation, ethics, animal welfare, benign, malignant diseases. Important relevant reviews, original articles and references from 1970 to 2012 were reviewed for data on the use of experimental animals in the study of diseases. The use of laboratory animals in scientific research continues to generate intense public debate. Their use can be justified today in the following areas of research: basic scientific research, use of animals as models for human diseases, pharmaceutical research and development, toxicity testing and teaching of new surgical techniques. This is because there are inherent limitations in the use of alternatives such as in vitro studies, human clinical trials or computer simulation. However, there are problems of transferability of results obtained from animal research to humans. Efforts are on-going to find suitable alternatives to animal experimentation like cell and tissue culture and computer simulation. For the foreseeable future, it would appear that to enable scientists to have a more precise understanding of human disease, including its diagnosis, prognosis and therapeutic intervention, there will still be enough grounds to advocate animal experimentation. However, efforts must continue to minimize or eliminate the need for animal testing in scientific research as soon as possible. © 2013 S. Karger AG, Basel.

  4. A Human Alcohol Self-Administration Paradigm to Model Individual Differences in Impaired Control over Alcohol Use

    PubMed Central

    Leeman, Robert F.; Corbin, William R.; Nogueira, Christine; Krishnan-Sarin, Suchitra; Potenza, Marc N.; O’Malley, Stephanie S.

    2014-01-01

    We developed an alcohol self-administration paradigm to model individual differences in impaired control. The paradigm includes moderate drinking guidelines meant to model limits on alcohol consumption, which are typically exceeded by people with impaired control. Possible payment reductions provided a disincentive for excessive drinking. Alcohol use above the guideline, despite possible pay reductions, was considered to be indicative of impaired control. Heavy-drinking 21–25 year-olds (N = 39) were randomized to an experimental condition including the elements of the impaired control paradigm or to a free-drinking condition without these elements. Alcohol self-administration was compared between these two conditions to establish the internal validity of the experimental paradigm. In both conditions, participants self-administered beer and non-alcoholic beverages for 3 hours in a bar setting with 1–3 other participants. Experimental condition participants self-administered significantly fewer beers and drank to lower blood-alcohol concentrations (BACs) on average than those in the free-drinking condition. Experimental condition participants were more likely than free-drinking condition participants to intersperse non-alcoholic beverages with beer and to drink at a slower pace. Although experimental condition participants drank more moderately than those in the free-drinking condition overall, their range of drinking was considerable (BAC range = .024–.097) with several participants drinking excessively. A lower initial subjective response to alcohol and earlier age of alcohol use onset were associated with greater alcohol self-administration in the experimental condition. Given the variability in response, the impaired control laboratory paradigm may have utility for preliminary tests of novel interventions in future studies and for identifying individual differences in problem-drinking risk. PMID:23937598

  5. CFD validation experiments for hypersonic flows

    NASA Technical Reports Server (NTRS)

    Marvin, Joseph G.

    1992-01-01

    A roadmap for CFD code validation is introduced. The elements of the roadmap are consistent with air-breathing vehicle design requirements and are related to the important flow path components: forebody, inlet, combustor, and nozzle. Building-block and benchmark validation experiments are identified along with their test conditions and measurements. Based on evaluation criteria, recommendations for an initial CFD validation database are given, and gaps are identified where future experiments could provide new validation data.

  6. Valid Knowledge: The Economy and the Academy

    ERIC Educational Resources Information Center

    Williams, Peter John

    2007-01-01

    The future of Western universities as public institutions is the subject of extensive continuing debate, underpinned by the issue of what constitutes "valid knowledge". Where in the past only propositional knowledge codified by academics was considered valid, in the new economy enabled by information and communications technology, the procedural…

  7. Exact Analysis of Squared Cross-Validity Coefficient in Predictive Regression Models

    ERIC Educational Resources Information Center

    Shieh, Gwowen

    2009-01-01

    In regression analysis, the notion of population validity is of theoretical interest for describing the usefulness of the underlying regression model, whereas the presumably more important concept of population cross-validity represents the predictive effectiveness for the regression equation in future research. It appears that the inference…

  8. Uncertainties and understanding of experimental and theoretical results regarding reactions forming heavy and superheavy nuclei

    NASA Astrophysics Data System (ADS)

    Giardina, G.; Mandaglio, G.; Nasirov, A. K.; Anastasi, A.; Curciarello, F.; Fazio, G.

    2018-02-01

    Experimental and theoretical results for the fusion probability P_CN of reactants in the entrance channel and the survival probability W_sur against fission during deexcitation of the compound nucleus formed in heavy-ion collisions are discussed. The theoretical results for a set of nuclear reactions leading to the formation of compound nuclei (CNs) with charge number Z = 102-122 reveal a strong sensitivity of P_CN to the characteristics of the colliding nuclei in the entrance channel, the dynamics of the reaction mechanism, and the excitation energy of the system. We discuss the validity of assumptions and procedures for the analysis of experimental data, as well as the limits of validity of theoretical results obtained with phenomenological models. The comparison of results obtained in many investigated reactions reveals serious limits of validity of the data analysis and calculation procedures.

  9. Hovering Dual-Spin Vehicle Groundwork for Bias Momentum Sizing Validation Experiment

    NASA Technical Reports Server (NTRS)

    Rothhaar, Paul M.; Moerder, Daniel D.; Lim, Kyong B.

    2008-01-01

    Angular bias momentum offers significant stability augmentation for hovering flight vehicles. The reliance of the vehicle on thrust vectoring for agility and disturbance rejection is greatly reduced with significant levels of stored angular momentum in the system. A methodical procedure for bias momentum sizing has been developed in previous studies. This current study provides groundwork for experimental validation of that method using an experimental vehicle called the Dual-Spin Test Device, a thrust-levitated platform. Using measured data the vehicle's thrust vectoring units are modeled and a gust environment is designed and characterized. Control design is discussed. Preliminary experimental results of the vehicle constrained to three rotational degrees of freedom are compared to simulation for a case containing no bias momentum to validate the simulation. A simulation of a bias momentum dominant case is presented.

  10. WEC-SIM Phase 1 Validation Testing -- Numerical Modeling of Experiments: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ruehl, Kelley; Michelen, Carlos; Bosma, Bret

    2016-08-01

    The Wave Energy Converter Simulator (WEC-Sim) is an open-source code jointly developed by Sandia National Laboratories and the National Renewable Energy Laboratory. It is used to model wave energy converters subjected to operational and extreme waves. In order for the WEC-Sim code to be beneficial to the wave energy community, code verification and physical model validation are necessary. This paper describes numerical modeling of the wave tank testing for the 1:33-scale experimental testing of the floating oscillating surge wave energy converter. The comparison between WEC-Sim and the Phase 1 experimental data set serves as code validation. This paper is a follow-up to the WEC-Sim paper on experimental testing, and describes the WEC-Sim numerical simulations for the floating oscillating surge wave energy converter.
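    Code-validation comparisons of this kind are commonly summarized with an error norm between simulated and measured time series. A generic sketch of one such metric (not necessarily the one used in the WEC-Sim study):

```python
import math

def nrmse(sim, meas):
    """Root-mean-square error between simulated and measured samples,
    normalized by the measured range; 0 means perfect agreement."""
    if len(sim) != len(meas) or not sim:
        raise ValueError("series must be non-empty and of equal length")
    rmse = math.sqrt(sum((s - m) ** 2 for s, m in zip(sim, meas)) / len(sim))
    span = max(meas) - min(meas)
    return rmse / span if span else rmse
```

Normalizing by the measured range makes runs at different wave amplitudes comparable on one scale.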

  11. Examining students' views about validity of experiments: From introductory to Ph.D. students

    NASA Astrophysics Data System (ADS)

    Hu, Dehui; Zwickl, Benjamin M.

    2018-06-01

    We investigated physics students' epistemological views on measurements and validity of experimental results. The roles of experiments in physics have been underemphasized in previous research on students' personal epistemology, and there is a need for a broader view of personal epistemology that incorporates experiments. An epistemological framework incorporating the structure, methodology, and validity of scientific knowledge guided the development of an open-ended survey. The survey was administered to students in algebra-based and calculus-based introductory physics courses, upper-division physics labs, and physics Ph.D. students. Within our sample, we identified several differences in students' ideas about validity and uncertainty in measurement. The majority of introductory students justified the validity of results through agreement with theory or with results from others. Alternatively, Ph.D. students frequently justified the validity of results based on the quality of the experimental process and repeatability of results. When asked about the role of uncertainty analysis, introductory students tended to focus on the representational roles (e.g., describing imperfections, data variability, and human mistakes). However, advanced students focused on the inferential roles of uncertainty analysis (e.g., quantifying reliability, making comparisons, and guiding refinements). The findings suggest that lab courses could emphasize a variety of approaches to establish validity, such as by valuing documentation of the experimental process when evaluating the quality of student work. In order to emphasize the role of uncertainty in an authentic way, labs could provide opportunities to iterate, make repeated comparisons, and make decisions based on those comparisons.

  12. Analysis of Fade Detection and Compensation Experimental Results in a Ka-Band Satellite System. Degree awarded by Akron Univ., May 2000

    NASA Technical Reports Server (NTRS)

    Johnson, Sandra

    2001-01-01

    The frequency bands being used for new satellite communication systems are constantly increasing to accommodate the requirements for additional capacity. At these higher frequencies, propagation impairments that did not significantly affect the signal at lower frequencies begin to have considerable impact. In Ka-band, the next logical commercial frequency band to be used for satellite communication, attenuation of the signal due to rain is a primary concern. An experimental satellite built by NASA, the Advanced Communication Technology Satellite (ACTS), launched in September 1993, is the first US communication satellite operating in the Ka-band. In addition to higher carrier frequencies, a number of other new technologies, including onboard baseband processing, multiple beam antennas, and rain fade detection and compensation techniques, were designed into the ACTS. Verification experiments have been conducted since the launch to characterize the new technologies. The focus of this thesis is to describe and validate the method used by the ACTS Very Small Aperture Terminal (VSAT) ground stations in detecting the presence of fade in the communication signal and to adaptively compensate for it by the addition of burst rate reduction and forward error correction. Measured data obtained from the ACTS program is used to validate the compensation technique. In this thesis, models in MATLAB are developed to statistically characterize the increased availability achieved by the compensation techniques in terms of the bit error rate time enhancement factor. Several improvements to the ACTS technique are discussed and possible implementations for future Ka-band systems are also presented.
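    The fade detection-and-compensation logic described (switching in burst-rate reduction and forward error correction when rain fade is detected) reduces to a threshold test, usually with hysteresis so the link does not toggle rapidly near the threshold. A sketch under that assumption; the function name and threshold values are illustrative, not the ACTS parameters:

```python
def compensation_active(fade_db, currently_active,
                        on_threshold_db=5.0, hysteresis_db=1.0):
    """Return True if fade compensation (burst-rate reduction + FEC)
    should be engaged for the current estimated fade depth (dB)."""
    if currently_active:
        # stay on until the fade clears the lower (off) threshold
        return fade_db > on_threshold_db - hysteresis_db
    # engage only once the fade exceeds the on threshold
    return fade_db > on_threshold_db
```

The hysteresis band means the on and off decisions use different thresholds, trading a little extra compensation time for far fewer mode switches.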

  13. Physical control oriented model of large scale refrigerators to synthesize advanced control schemes. Design, validation, and first control results

    NASA Astrophysics Data System (ADS)

    Bonne, François; Alamir, Mazen; Bonnay, Patrick

    2014-01-01

    In this paper, a physical method to obtain control-oriented dynamical models of large-scale cryogenic refrigerators is proposed, in order to synthesize model-based advanced control schemes. These schemes aim to replace classical approaches designed from user experience, usually based on many independent PI controllers. This is particularly useful when cryoplants are subjected to large pulsed thermal loads, expected in the cryogenic cooling systems of future fusion reactors such as the International Thermonuclear Experimental Reactor (ITER) or the Japan Torus-60 Super Advanced Fusion Experiment (JT-60SA). Advanced control schemes offer better perturbation immunity and rejection, and hence safer utilization of cryoplants. The paper details how basic components used in the field of large-scale helium refrigeration (especially those present on the 400W @1.8K helium test facility at CEA-Grenoble) are modeled and assembled to obtain the complete dynamic description of the controllable subsystems of the refrigerator (namely the Joule-Thomson cycle, the Brayton cycle, the liquid nitrogen precooling unit, and the warm compression station). The complete 400W @1.8K (in the 400W @4.4K configuration) helium test facility model is then validated against experimental data, and optimal control of both the Joule-Thomson valve and the turbine valve is proposed to stabilize the plant under highly variable thermal loads. This work is partially supported through the European Fusion Development Agreement (EFDA) Goal Oriented Training Program, task agreement WP10-GOT-GIRO.
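    The classical baseline the authors aim to replace, many independent PI loops, can be sketched as a discrete PI controller with output saturation and simple anti-windup. Gains and limits below are illustrative, not values from the cryoplant:

```python
class PIController:
    """Discrete PI controller with output clamping and anti-windup
    (the integrator is frozen while the output is saturated)."""

    def __init__(self, kp, ki, dt, u_min, u_max):
        self.kp, self.ki, self.dt = kp, ki, dt
        self.u_min, self.u_max = u_min, u_max
        self.integral = 0.0

    def update(self, setpoint, measurement):
        error = setpoint - measurement
        u = self.kp * error + self.ki * self.integral
        if self.u_min < u < self.u_max:       # anti-windup: integrate only
            self.integral += error * self.dt  # while the output is unsaturated
        return min(max(u, self.u_min), self.u_max)
```

Model-based schemes of the kind proposed in the paper coordinate the actuators jointly instead of tuning each such loop in isolation.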

  14. Towards a viscoelastic model for the unfused midpalatal suture: development and validation using the midsagittal suture in New Zealand white rabbits.

    PubMed

    Romanyk, D L; Liu, S S; Lipsett, M G; Toogood, R W; Lagravère, M O; Major, P W; Carey, J P

    2013-06-21

    Maxillary expansion is a procedure commonly used by orthodontists to widen a patient's upper jaw. As it is typically performed in adolescent patients, the midpalatal suture, the connective tissue joining the two maxilla halves, remains unfused. Studies that have investigated patient response to expansion treatment, generally through finite element analysis, have either considered this suture to behave in a linear elastic manner or left it vacant. The purpose of the study presented here was to develop a model that could represent the midpalatal suture's viscoelastic behavior. Quasilinear viscoelastic, modified superposition, Schapery's, and Burgers modeling approaches were all considered. Raw data from a previously published study using New Zealand White rabbits were utilized for model parameter estimation and validation. In that study, Sentalloy(®) coil springs at load levels of 0.49N (50g), 0.98N (100g), and 1.96N (200g) were used to widen the midsagittal suture of live rabbits over a period of 6 weeks. Evaluation was based on a model's ability to represent the experimental data well over all three load sets; ideally, a single set of model constants could represent the data at all tested loads. Upon completion of the analysis, it was found that the modified superposition method was able to replicate the experimental data within one standard deviation of the means using a single set of constants for all loads. Future work should focus on model improvement as well as prediction of treatment outcomes. Copyright © 2013 Elsevier Ltd. All rights reserved.
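    Of the candidate models named above, the four-parameter Burgers model has a closed-form creep response under constant stress, which makes it a convenient reference point. A sketch with hypothetical parameters (not the fitted rabbit-suture constants):

```python
import math

def burgers_creep_strain(sigma, t, E1, eta1, E2, eta2):
    """Strain at time t of a Burgers model (Maxwell and Kelvin-Voigt
    elements in series) under constant stress sigma."""
    return sigma * (1.0 / E1                                # instantaneous elastic (Maxwell spring)
                    + t / eta1                              # steady viscous flow (Maxwell dashpot)
                    + (1.0 - math.exp(-E2 * t / eta2)) / E2)  # delayed elastic (Kelvin-Voigt)
```

At t = 0 only the instantaneous elastic term survives, and the strain grows monotonically thereafter, which is the qualitative behavior a suture creep test must reproduce.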

  15. Physical control oriented model of large scale refrigerators to synthesize advanced control schemes. Design, validation, and first control results

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bonne, François; Bonnay, Patrick; Alamir, Mazen

    2014-01-29

    In this paper, a physical method to obtain control-oriented dynamical models of large-scale cryogenic refrigerators is proposed, in order to synthesize model-based advanced control schemes. These schemes aim to replace classical approaches designed from user experience, usually based on many independent PI controllers. This is particularly useful when cryoplants are subjected to large pulsed thermal loads, expected in the cryogenic cooling systems of future fusion reactors such as the International Thermonuclear Experimental Reactor (ITER) or the Japan Torus-60 Super Advanced Fusion Experiment (JT-60SA). Advanced control schemes offer better perturbation immunity and rejection, and hence safer utilization of cryoplants. The paper details how basic components used in the field of large-scale helium refrigeration (especially those present on the 400W @1.8K helium test facility at CEA-Grenoble) are modeled and assembled to obtain the complete dynamic description of the controllable subsystems of the refrigerator (namely the Joule-Thomson cycle, the Brayton cycle, the liquid nitrogen precooling unit, and the warm compression station). The complete 400W @1.8K (in the 400W @4.4K configuration) helium test facility model is then validated against experimental data, and optimal control of both the Joule-Thomson valve and the turbine valve is proposed to stabilize the plant under highly variable thermal loads. This work is partially supported through the European Fusion Development Agreement (EFDA) Goal Oriented Training Program, task agreement WP10-GOT-GIRO.

  16. Comparison Between Numerically Simulated and Experimentally Measured Flowfield Quantities Behind a Pulsejet

    NASA Technical Reports Server (NTRS)

    Geng, Tao; Paxson, Daniel E.; Zheng, Fei; Kuznetsov, Andrey V.; Roberts, William L.

    2008-01-01

    Pulsed combustion is receiving renewed interest as a potential route to higher performance in air breathing propulsion systems. Pulsejets offer a simple experimental device with which to study unsteady combustion phenomena and validate simulations. Previous computational fluid dynamic (CFD) simulation work focused primarily on the pulsejet combustion and exhaust processes. This paper describes a new inlet sub-model which simulates the fluidic and mechanical operation of a valved pulsejet head. The governing equations for this sub-model are described. Sub-model validation is provided through comparisons of simulated and experimentally measured reed valve motion, and time averaged inlet mass flow rate. The updated pulsejet simulation, with the inlet sub-model implemented, is validated through comparison with experimentally measured combustion chamber pressure, inlet mass flow rate, operational frequency, and thrust. Additionally, the simulated pulsejet exhaust flowfield, which is dominated by a starting vortex ring, is compared with particle imaging velocimetry (PIV) measurements on the bases of velocity, vorticity, and vortex location. The results show good agreement between simulated and experimental data. The inlet sub-model is shown to be critical for the successful modeling of pulsejet operation. This sub-model correctly predicts both the inlet mass flow rate and its phase relationship with the combustion chamber pressure. As a result, the predicted pulsejet thrust agrees very well with experimental data.

  17. The Future of Hadrons: The Nexus of Subatomic Physics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Quigg, Chris

    2011-09-01

    The author offers brief observations on matters discussed at the XIV International Conference on Hadron Spectroscopy and explores prospects for hadron physics. Quantum chromodynamics (QCD) has been validated as a new law of nature. It is internally consistent up to very high energies, and so could be a complete theory of the strong interactions. Whether QCD is the final answer for the strong interactions is a subject for continuing experimental tests, which are being extended in experimentation at the Large Hadron Collider. Beyond the comparison of perturbative calculations with experiment, it remains critically important to test the confinement hypothesis by searching for free quarks, or for signatures of unconfined color. Sensitive negative searches for quarks continue to be interesting, and the definitive observation of free quarks would be revolutionary. Breakdowns of factorization would compromise the utility of perturbative QCD. Other discoveries that would require small or large revisions to QCD include the observation of new kinds of colored matter beyond quarks and gluons, the discovery that quarks are composite, or evidence that SU(3)_c gauge symmetry is the vestige of a larger, spontaneously broken, color symmetry. While probing our underlying theory for weakness or new openings, we have plenty to do to apply QCD to myriad experimental settings, to learn its implications for matter under unusual conditions, and to become more adept at calculating its consequences. New experimental tools provide the means for progress on a very broad front.

  18. Manipulating glucocorticoids in wild animals: basic and applied perspectives

    PubMed Central

    Sopinka, Natalie M.; Patterson, Lucy D.; Redfern, Julia C.; Pleizier, Naomi K.; Belanger, Cassia B.; Midwood, Jon D.; Crossin, Glenn T.; Cooke, Steven J.

    2015-01-01

    One of the most comprehensively studied responses to stressors in vertebrates is the endogenous production and regulation of glucocorticoids (GCs). Extensive laboratory research using experimental elevation of GCs in model species is instrumental in learning about stressor-induced physiological and behavioural mechanisms; however, such studies fail to inform our understanding of ecological and evolutionary processes in the wild. We reviewed emerging research that has used GC manipulations in wild vertebrates to assess GC-mediated effects on survival, physiology, behaviour, reproduction and offspring quality. Within and across taxa, exogenous manipulation of GCs increased, decreased or had no effect on traits examined in the reviewed studies. The notable diversity in responses to GC manipulation could be associated with variation in experimental methods, inherent differences among species, morphs, sexes and age classes, and the ecological conditions in which responses were measured. In their current form, results from experimental studies may be applied to animal conservation on a case-by-case basis in contexts such as threshold-based management. We discuss ways to integrate mechanistic explanations for changes in animal abundance in altered environments with functional applications that inform conservation practitioners of which species and traits may be most responsive to environmental change or human disturbance. Experimental GC manipulation holds promise for determining mechanisms underlying fitness impairment and population declines. Future work in this area should examine multiple life-history traits, with consideration of individual variation and, most importantly, validation of GC manipulations within naturally occurring and physiologically relevant ranges. PMID:27293716

  19. Innovation and problem solving: a review of common mechanisms.

    PubMed

    Griffin, Andrea S; Guez, David

    2014-11-01

    Behavioural innovations have become central to our thinking about how animals adjust to changing environments. It is now well established that animals vary in their ability to innovate, but understanding why remains a challenge. This is because innovations are rare, so studying innovation requires alternative experimental assays that create opportunities for animals to express their ability to invent new behaviours, or use pre-existing ones in new contexts. Problem solving of extractive foraging tasks has been put forward as a suitable experimental assay. We review the rapidly expanding literature on problem solving of extractive foraging tasks in order to better understand to what extent the processes underpinning problem solving, and the factors influencing problem solving, are in line with those predicted, and found, to underpin and influence innovation in the wild. Our aim is to determine whether problem solving can be used as an experimental proxy of innovation. We find that in most respects, problem solving is determined by the same underpinning mechanisms, and is influenced by the same factors, as those predicted to underpin, and to influence, innovation. We conclude that problem solving is a valid experimental assay for studying innovation, propose a conceptual model of problem solving in which motor diversity plays a more central role than has been considered to date, and provide recommendations for future research using problem solving to investigate innovation. This article is part of a Special Issue entitled: Cognition in the wild. Copyright © 2014 Elsevier B.V. All rights reserved.

  20. Experimental Validation of an Integrated Controls-Structures Design Methodology

    NASA Technical Reports Server (NTRS)

    Maghami, Peiman G.; Gupta, Sandeep; Elliot, Kenny B.; Walz, Joseph E.

    1996-01-01

    The first experimental validation of an integrated controls-structures design methodology for a class of large-order, flexible space structures is described. Integrated redesign of the controls-structures-interaction evolutionary model, a laboratory testbed at NASA Langley, was described earlier. The redesigned structure was fabricated, assembled in the laboratory, and experimentally tested against the original structure. Experimental results indicate that the structure redesigned using the integrated design methodology requires significantly less average control power than the nominal structure with control-optimized designs, while maintaining the required line-of-sight pointing performance. Thus, the superiority of the integrated design methodology over the conventional design approach is experimentally demonstrated. Furthermore, the amenability of the integrated design structure to other control strategies is evaluated, both analytically and experimentally. Using Linear-Quadratic-Gaussian optimal dissipative controllers, it is observed that the redesigned structure leads to significantly improved performance with alternate controllers as well.
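    The average-control-power comparison reported can be summarized as the time-averaged sum of squared actuator commands. A generic sketch of that scalar effort measure (the definition and weighting actually used in the study may differ):

```python
def average_control_power(u_history):
    """Mean sum-of-squares of the actuator command vector over time --
    a common unweighted scalar proxy for control effort."""
    if not u_history:
        raise ValueError("empty command history")
    return sum(sum(ui * ui for ui in u) for u in u_history) / len(u_history)
```

Comparing this quantity between the nominal and redesigned structures, under controllers tuned for each, is how "significantly less average control power" becomes a concrete number.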

  1. Detection of overreported psychopathology with the MMPI-2-RF [corrected] validity scales.

    PubMed

    Sellbom, Martin; Bagby, R Michael

    2010-12-01

    We examined the utility of the validity scales on the recently released Minnesota Multiphasic Personality Inventory-2 Restructured Form (MMPI-2 RF; Ben-Porath & Tellegen, 2008) to detect overreported psychopathology. This set of validity scales includes a newly developed scale and revised versions of the original MMPI-2 validity scales. We used an analogue, experimental simulation in which MMPI-2 RF responses (derived from archived MMPI-2 protocols) of undergraduate students instructed to overreport psychopathology (in either a coached or noncoached condition) were compared with those of psychiatric inpatients who completed the MMPI-2 under standardized instructions. The MMPI-2 RF validity scale Infrequent Psychopathology Responses best differentiated the simulation groups from the sample of patients, regardless of experimental condition. No other validity scale added consistent incremental predictive utility to Infrequent Psychopathology Responses in distinguishing the simulation groups from the sample of patients. Classification accuracy statistics confirmed the recommended cut scores in the MMPI-2 RF manual (Ben-Porath & Tellegen, 2008).
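    The classification accuracy statistics mentioned above reduce to simple counts at a given cut score. A minimal Python sketch, using hypothetical T-scores rather than data from the study:

```python
def classification_stats(patients, simulators, cut):
    # Simulators scoring at or above the cut are (correctly) flagged as
    # overreporting; genuine patients at or above it are false positives.
    tp = sum(s >= cut for s in simulators)
    fp = sum(s >= cut for s in patients)
    sensitivity = tp / len(simulators)
    specificity = 1 - fp / len(patients)
    return sensitivity, specificity

# Hypothetical validity-scale T-scores, not data from the study
sens, spec = classification_stats([50, 60, 70, 80], [90, 100, 110, 65], cut=85)
```

Sweeping `cut` over the observed score range traces out the sensitivity/specificity trade-off used to evaluate recommended cut scores.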

  2. Flow in prosthetic heart valves: state-of-the-art and future directions.

    PubMed

    Yoganathan, Ajit P; Chandran, K B; Sotiropoulos, Fotis

    2005-12-01

    Since the first successful implantation of a prosthetic heart valve four decades ago, over 50 different designs have been developed, including both mechanical and bioprosthetic valves. Today, the most widely implanted design is the mechanical bileaflet, with over 170,000 implants worldwide each year. Several different mechanical valves are currently available and many of them have good bulk forward flow hemodynamics, with lower transvalvular pressure drops, larger effective orifice areas, and fewer regions of forward flow stasis than their earlier-generation counterparts such as the ball-and-cage and tilting-disc valves. However, mechanical valve implants suffer from complications resulting from thrombus deposition, and patients implanted with these valves need to be under long-term anti-coagulant therapy. In general, blood thinners are not needed with bioprosthetic implants, but tissue valves suffer from structural failure, with an average lifetime of 10-12 years before replacement is needed. Flow-induced stresses on the formed elements in blood have been implicated in thrombus initiation within mechanical valve prostheses. Regions of stress concentration during the complex motion of the leaflets have been implicated in the structural failure of bioprosthetic valve leaflets. In vivo and in vitro experimental studies have yielded valuable information on the relationship between hemodynamic stresses and the problems associated with the implants. More recently, Computational Fluid Dynamics (CFD) has emerged as a promising tool, which, alongside experimentation, can yield insights of unprecedented detail into the hemodynamics of prosthetic heart valves. For CFD to realize its full potential, however, it must rely on numerical techniques that can handle the enormous geometrical complexities of prosthetic devices with spatial and temporal resolution sufficiently high to accurately capture all hemodynamically relevant scales of motion. Such algorithms do not exist today and their development should be a major research priority. For CFD to further gain the confidence of valve designers and medical practitioners it must also undergo comprehensive validation with experimental data. Such validation requires the use of high-resolution flow measuring tools and techniques and the integration of experimental studies with CFD modeling.

  3. Experimental validation of Mueller-Stokes theory and investigation of the influence of the Cotton-Mouton effect on polarimetry in a magnetized fusion plasma

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, J.; Peebles, W. A.; Crocker, N. A.

    Mueller-Stokes theory can be used to calculate the polarization evolution of an electromagnetic (EM) wave as it propagates through a magnetized plasma. Historically, the theory has been used to interpret polarimeter signals from systems operating on fusion plasmas. These interpretations have mostly employed approximations of Mueller-Stokes theory in regimes where either the Faraday rotation (FR) or the Cotton-Mouton (CM) effect is dominant. The current paper presents the first systematic comparison of polarimeter measurements with the predictions of full Mueller-Stokes theory where conditions transition smoothly from a FR-dominant (i.e., weak CM effect) plasma to one where the CM effect plays a more significant role. A synthetic diagnostic code based on Mueller-Stokes theory accurately reproduces the trends evident in the experimentally measured polarimeter phase over this entire operating range, thereby validating Mueller-Stokes theory. The synthetic diagnostic code is then used to investigate the influence of the CM effect on polarimetry measurements. As expected, the measurements are well approximated by the FR effect when the CM effect is predicted to be weak. However, the code shows that as the CM effect increases, it can compete with the FR effect in rotating the polarization of the EM wave. This results in a reduced polarimeter response to the FR effect, just as observed in the experiment. The code also shows that, if sufficiently large, the CM effect can even reverse the handedness of a wave launched with circular polarization. This helps to explain the surprising experimental observation that the sensitivity to the FR effect can be nearly eliminated at a high enough toroidal field B_T (2.0 T). The results also suggest that the CM effect at the plasma midplane can be exploited to potentially measure magnetic shear in tokamak plasmas. These results establish increased confidence in the use of such a synthetic diagnostic code to guide future polarimetry design and interpret the resultant experimental data.
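    The polarization evolution described above can be illustrated with Mueller calculus: each small propagation step multiplies the Stokes vector by a rotator matrix (Faraday rotation) and a retarder matrix (a stand-in for the Cotton-Mouton effect). A minimal Python sketch, not the authors' synthetic diagnostic code:

```python
import math

def faraday_mueller(theta):
    # Mueller matrix of a polarization rotator (Faraday rotation by theta);
    # Q and U rotate by 2*theta.
    c, s = math.cos(2 * theta), math.sin(2 * theta)
    return [[1, 0, 0, 0], [0, c, -s, 0], [0, s, c, 0], [0, 0, 0, 1]]

def retarder_mueller(delta):
    # Mueller matrix of a linear retarder with fast axis at 0 degrees,
    # an illustrative stand-in for the Cotton-Mouton effect.
    c, s = math.cos(delta), math.sin(delta)
    return [[1, 0, 0, 0], [0, 1, 0, 0], [0, 0, c, -s], [0, 0, s, c]]

def apply(m, stokes):
    # Matrix-vector product on the Stokes vector (I, Q, U, V)
    return [sum(m[i][j] * stokes[j] for j in range(4)) for i in range(4)]

def propagate(stokes, steps, theta_step, delta_step):
    # Alternate small FR and CM steps along the beam path
    for _ in range(steps):
        stokes = apply(faraday_mueller(theta_step), stokes)
        stokes = apply(retarder_mueller(delta_step), stokes)
    return stokes

# Horizontally polarized input; with zero retardance the polarization
# plane rotates by 100 * 0.2 = 20 degrees (Q, U rotate by 40 degrees).
s_out = propagate([1.0, 1.0, 0.0, 0.0], steps=100,
                  theta_step=math.radians(0.2), delta_step=0.0)
```

With a nonzero `delta_step`, Q/U mass leaks into V, which mimics how a growing CM effect reduces the polarimeter's apparent response to FR.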

  4. Back to the Consideration of Future Consequences Scale: time to reconsider?

    PubMed

    Rappange, David R; Brouwer, Werner B F; van Exel, N Job A

    2009-10-01

    The Consideration of Future Consequences (CFC) Scale is a measure of the extent to which individuals consider and are influenced by the distant outcomes of current behavior. In this study, the authors conducted factor analysis to investigate the factor structure of the 12-item CFC Scale. The authors found evidence for a multiple factor solution including one completely present-oriented factor consisting of all 7 present-oriented items, and one or two future-oriented factors consisting of the remaining future-oriented items. Further evidence indicated that the present-oriented factor and the 12-item CFC Scale perform similarly in terms of internal consistency and convergent validity. The structure and content of the future-oriented factor(s) is unclear. From the findings, the authors raise questions regarding the construct validity of the CFC Scale, the interpretation of its results, and the usefulness of the CFC Scale in its current form in applied research.

  5. Fault-tolerant clock synchronization validation methodology. [in computer systems

    NASA Technical Reports Server (NTRS)

    Butler, Ricky W.; Palumbo, Daniel L.; Johnson, Sally C.

    1987-01-01

    A validation method for the synchronization subsystem of a fault-tolerant computer system is presented. The high reliability requirement of flight-crucial systems precludes the use of most traditional validation methods. The method presented utilizes formal design proof to uncover design and coding errors and experimentation to validate the assumptions of the design proof. The experimental method is described and illustrated by validating the clock synchronization system of the Software Implemented Fault Tolerance computer. The design proof of the algorithm includes a theorem that defines the maximum skew between any two nonfaulty clocks in the system in terms of specific system parameters. Most of these parameters are deterministic. One crucial parameter is the upper bound on the clock read error, which is stochastic. The probability that this upper bound is exceeded is calculated from data obtained by the measurement of system parameters. This probability is then included in a detailed reliability analysis of the system.
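    The validation step above estimates, from measured data, the probability that the stochastic clock-read error exceeds the bound assumed by the design proof. A Python sketch of that idea; the skew formula here is a hypothetical stand-in that only illustrates the parameter roles, not the actual SIFT theorem:

```python
def exceedance_probability(read_errors, bound):
    # Empirical probability that the measured clock-read error exceeds
    # the upper bound assumed by the design proof
    return sum(1 for e in read_errors if abs(e) > bound) / len(read_errors)

def max_skew(initial_skew, drift_rate, resync_interval, read_error_bound):
    # Hypothetical skew bound: initial skew, plus relative drift over one
    # resynchronization interval, plus the read-error bound. The actual
    # theorem in the SIFT proof has a different form; this only shows
    # how deterministic and stochastic parameters combine.
    return initial_skew + 2 * drift_rate * resync_interval + read_error_bound

errors_us = [0.1, 0.2, 0.5, 1.2, 0.3]      # measured read errors (microseconds)
p_exceed = exceedance_probability(errors_us, bound=1.0)
skew = max_skew(1.0, 1e-6, 1000.0, 0.5)    # all parameter values hypothetical
```

The estimated `p_exceed` is then what feeds into the system-level reliability analysis.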

  6. An Evaluation of Computerized Tests as Predictors of Job Performance: II. Differential Validity for Global and Job Element Criteria. Final Report.

    ERIC Educational Resources Information Center

    Cory, Charles H.

    This report presents data concerning the validity of a set of experimental computerized and paper-and-pencil tests for measures of on-job performance on global and job elements. It reports on the usefulness of 30 experimental and operational variables for predicting marks on 42 job elements and on a global criterion for Electrician's Mate,…

  7. Viscoelasticity of Axisymmetric Composite Structures: Analysis and Experimental Validation

    DTIC Science & Technology

    2013-02-01

    compressive stress at the interface between the composite and steel prior to the sheath’s cut-off. Accordingly, the viscoelastic analysis is used...The hoop-stress profile in figure 6 shows the steel region is in compression , resulting from the winding tension of composite overwrap. The stress...mechanical and thermal loads. Experimental validation of the model is conducted using a high- tensioned composite overwrapped on a steel cylinder. The creep

  8. Control of a Vanadium Redox Battery and supercapacitor using a Three-Level Neutral Point Clamped converter

    NASA Astrophysics Data System (ADS)

    Etxeberria, A.; Vechiu, I.; Baudoin, S.; Camblong, H.; Kreckelbergh, S.

    2014-02-01

    The increasing use of distributed generators, which are mainly based on renewable sources, can create several issues in the operation of the electric grid. The microgrid is being analysed as a solution for integrating renewable sources into the grid at a high penetration level in a controlled way. Storage systems play a vital role in keeping the energy and power balance of the microgrid. Due to the technical limitations of currently available storage systems, it is necessary to use more than one storage technology to satisfy the requirements of the microgrid application. This work validates, both in simulation and experimentally, the use of a Three-Level Neutral Point Clamped converter to control the power flow of a hybrid storage system formed by a supercapacitor and a Vanadium Redox Battery. The operation of the system is validated in two case studies on the experimental platform installed at ESTIA. The experimental results prove the validity of the proposed system as well as the designed control algorithm. The good agreement between experimental and simulation results also validates the simulation model, which can therefore be used to analyse the operation of the system in different case studies.

  9. Relationship of otolith strontium-to-calcium ratios and salinity: Experimental validation for juvenile salmonids

    USGS Publications Warehouse

    Zimmerman, C.E.

    2005-01-01

    Analysis of otolith strontium (Sr) or strontium-to-calcium (Sr:Ca) ratios provides a powerful tool to reconstruct the chronology of migration among salinity environments for diadromous salmonids. Although use of this method has been validated by examination of known individuals and translocation experiments, it has never been validated under controlled experimental conditions. In this study, incorporation of otolith Sr was tested across a range of salinities and resulting levels of ambient Sr and Ca concentrations in juvenile chinook salmon (Oncorhynchus tshawytscha), coho salmon (Oncorhynchus kisutch), sockeye salmon (Oncorhynchus nerka), rainbow trout (Oncorhynchus mykiss), and Arctic char (Salvelinus alpinus). Experimental water was mixed, using stream water and seawater as end members, to create experimental salinities of 0.1, 6.3, 12.7, 18.6, 25.5, and 33.0 psu. Otolith Sr and Sr:Ca ratios were significantly related to salinity for all species (r2 range: 0.80-0.91) but provide only enough predictive resolution to discriminate among freshwater, brackish-water, and saltwater residency. These results validate the use of otolith Sr:Ca ratios to broadly discriminate salinity histories encountered by salmonids but highlight the need for further research concerning the influence of osmoregulation and physiological changes associated with smolting on otolith microchemistry.
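    The reported Sr:Ca-salinity relationship is a regression whose predictive resolution supports only coarse residency classes. A minimal sketch; only the salinity levels are taken from the study design, while the ratio values and cut-offs are hypothetical:

```python
def linear_fit(xs, ys):
    # Ordinary least-squares fit y ~ a + b*x
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return my - b * mx, b

def classify_salinity(sr_ca, fresh_cut, salt_cut):
    # Coarse residency classes: the ratios only resolve fresh/brackish/salt
    if sr_ca < fresh_cut:
        return "freshwater"
    if sr_ca > salt_cut:
        return "saltwater"
    return "brackish"

salinities = [0.1, 6.3, 12.7, 18.6, 25.5, 33.0]   # psu, from the study design
ratios = [0.5 + 0.1 * s for s in salinities]      # hypothetical Sr:Ca values
a, b = linear_fit(salinities, ratios)
```

In practice the fitted relation would be inverted per otolith transect point and the predicted salinity bucketed with `classify_salinity`.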

  10. Computer-aided design of liposomal drugs: In silico prediction and experimental validation of drug candidates for liposomal remote loading.

    PubMed

    Cern, Ahuva; Barenholz, Yechezkel; Tropsha, Alexander; Goldblum, Amiram

    2014-01-10

    Previously we have developed and statistically validated Quantitative Structure Property Relationship (QSPR) models that correlate drugs' structural, physical and chemical properties as well as experimental conditions with the relative efficiency of remote loading of drugs into liposomes (Cern et al., J. Control. Release 160 (2012) 147-157). Herein, these models have been used to virtually screen a large drug database to identify novel candidate molecules for liposomal drug delivery. Computational hits were considered for experimental validation based on their predicted remote loading efficiency as well as additional considerations such as availability, recommended dose and relevance to the disease. Three compounds were selected for experimental testing which were confirmed to be correctly classified by our previously reported QSPR models developed with Iterative Stochastic Elimination (ISE) and k-Nearest Neighbors (kNN) approaches. In addition, 10 new molecules with known liposome remote loading efficiency that were not used by us in QSPR model development were identified in the published literature and employed as an additional model validation set. The external accuracy of the models was found to be as high as 82% or 92%, depending on the model. This study presents the first successful application of QSPR models for the computer-model-driven design of liposomal drugs. © 2013.

  11. Computer-aided design of liposomal drugs: in silico prediction and experimental validation of drug candidates for liposomal remote loading

    PubMed Central

    Cern, Ahuva; Barenholz, Yechezkel; Tropsha, Alexander; Goldblum, Amiram

    2014-01-01

    Previously we have developed and statistically validated Quantitative Structure Property Relationship (QSPR) models that correlate drugs’ structural, physical and chemical properties as well as experimental conditions with the relative efficiency of remote loading of drugs into liposomes (Cern et al., Journal of Controlled Release, 160 (2012) 147–157). Herein, these models have been used to virtually screen a large drug database to identify novel candidate molecules for liposomal drug delivery. Computational hits were considered for experimental validation based on their predicted remote loading efficiency as well as additional considerations such as availability, recommended dose and relevance to the disease. Three compounds were selected for experimental testing which were confirmed to be correctly classified by our previously reported QSPR models developed with Iterative Stochastic Elimination (ISE) and k-nearest neighbors (kNN) approaches. In addition, 10 new molecules with known liposome remote loading efficiency that were not used in QSPR model development were identified in the published literature and employed as an additional model validation set. The external accuracy of the models was found to be as high as 82% or 92%, depending on the model. This study presents the first successful application of QSPR models for the computer-model-driven design of liposomal drugs. PMID:24184343

  12. FY 2016 Status Report on the Modeling of the M8 Calibration Series using MAMMOTH

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baker, Benjamin Allen; Ortensi, Javier; DeHart, Mark David

    2016-09-01

    This report provides a summary of the progress made towards validating the multi-physics reactor analysis application MAMMOTH using data from measurements performed at the Transient Reactor Test facility, TREAT. The work completed consists of a series of comparisons of TREAT element types (standard and control rod assemblies) in small geometries as well as slotted mini-cores to reference Monte Carlo simulations to ascertain the accuracy of cross section preparation techniques. After the successful completion of these smaller problems, a full core model of the half slotted core used in the M8 Calibration series was assembled. Full core MAMMOTH simulations were compared to Serpent reference calculations to assess the cross section preparation process for this larger configuration. As part of the validation process, the M8 Calibration series included a steady state wire irradiation experiment and coupling factors for the experiment region. The shape of the power distribution obtained from the MAMMOTH simulation shows excellent agreement with the experiment. Larger differences were encountered in the calculation of the coupling factors, but there is considerable uncertainty about how the experimental values were obtained. Future work will focus on resolving some of these differences.

  13. Overview of Recent DIII-D Experimental Results

    NASA Astrophysics Data System (ADS)

    Fenstermacher, Max; DIII-D Team

    2017-10-01

    Recent DIII-D experiments contributed to the ITER physics basis and to physics understanding for extrapolation to future devices. A predict-first analysis showed how shape can enhance access to RMP ELM suppression. 3D equilibrium changes from ELM control RMPs were linked to density pumpout. Ion velocity imaging in the SOL showed 3D C2+ flow perturbations near RMP-induced n = 1 islands. Correlation ECE reveals a 40% increase in Te turbulence during QH-mode and 70% during RMP ELM suppression vs. ELMing H-mode. A long-lived predator-prey oscillation replaces edge MHD in recent low-torque QH-mode plasmas. Spatio-temporally resolved runaway electron measurements validate the importance of synchrotron and collisional damping on RE dissipation. A new small-angle slot divertor achieves strong plasma cooling and facilitates detachment access. Fast ion confinement was improved in high q_min scenarios using variable beam energy optimization. First reproducible, stable ITER baseline scenarios were established. Studies have validated a model for edge momentum transport that predicts the value and direction of the pedestal main-ion intrinsic velocity. Work supported by the US DOE under DE-FC02-04ER54698 and DE-AC52-07NA27344.

  14. Targeting IL-2: an unexpected effect in treating immunological diseases.

    PubMed

    Ye, Congxiu; Brand, David; Zheng, Song G

    2018-01-01

    Regulatory T cells (Treg) play a crucial role in maintaining immune homeostasis since Treg dysfunction in both animals and humans is associated with multi-organ autoimmune and inflammatory disease. While IL-2 is generally considered to promote T-cell proliferation and enhance effector T-cell function, recent studies have demonstrated that treatments that utilize low-dose IL-2 unexpectedly induce immune tolerance and promote Treg development resulting in the suppression of unwanted immune responses and eventually leading to treatment of some autoimmune disorders. In the present review, we discuss the biology of IL-2 and its signaling to help define the key role played by IL-2 in the development and function of Treg cells. We also summarize proof-of-concept clinical trials which have shown that low-dose IL-2 can control autoimmune diseases safely and effectively by specifically expanding and activating Treg. However, future studies will be needed to validate a better and safer dosing strategy for low-dose IL-2 treatments utilizing well-controlled clinical trials. More studies will also be needed to validate the appropriate dose of IL-2/anti-cytokine or IL-2/anti-IL-2 complex in the experimental animal models before moving to the clinic.

  15. Validation of the new code package APOLLO2.8 for accurate PWR neutronics calculations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Santamarina, A.; Bernard, D.; Blaise, P.

    2013-07-01

    This paper summarizes the qualification work performed to demonstrate the accuracy of the new APOLLO2.8/SHEM-MOC package based on the JEFF3.1.1 nuclear data file for the prediction of PWR neutronics parameters. This experimental validation is based on PWR mock-up critical experiments performed in the EOLE/MINERVE zero-power reactors and on post-irradiation examinations (PIEs) of spent fuel assemblies from the French PWRs. The calculation-experiment comparison for the main design parameters is presented: reactivity of UOX and MOX lattices, depletion calculation and fuel inventory, reactivity loss with burnup, pin-by-pin power maps, Doppler coefficient, moderator temperature coefficient, void coefficient, UO2-Gd2O3 poisoning worth, efficiency of Ag-In-Cd and B4C control rods, and reflector saving for both the standard 2-cm baffle and the GEN3 advanced thick stainless-steel reflector. From this qualification process, calculation biases and associated uncertainties are derived. The code package APOLLO2.8 is already implemented in ARCADIA, the new AREVA calculation chain for core physics, and is currently under implementation in the future neutronics package of the French utility Electricite de France. (authors)

  16. A Test of the Validity of Inviscid Wall-Modeled LES

    NASA Astrophysics Data System (ADS)

    Redman, Andrew; Craft, Kyle; Aikens, Kurt

    2015-11-01

    Computational expense is one of the main deterrents to more widespread use of large eddy simulations (LES). As such, it is important to reduce computational costs whenever possible. In this vein, it may be reasonable to assume that high Reynolds number flows with turbulent boundary layers are inviscid when using a wall model. This assumption relies on the grid being too coarse to resolve either the viscous length scales in the outer flow or those near walls. We are not aware of other studies that have suggested or examined the validity of this approach. The inviscid wall-modeled LES assumption is tested here for supersonic flow over a flat plate on three different grids. Inviscid and viscous results are compared to those of another wall-modeled LES as well as experimental data - the results appear promising. Furthermore, the inviscid assumption reduces simulation costs by about 25% and 39% for supersonic and subsonic flows, respectively, with the current LES application. Recommendations are presented as are future areas of research. This work used the Extreme Science and Engineering Discovery Environment (XSEDE), which is supported by National Science Foundation grant number ACI-1053575. Computational resources on TACC Stampede were provided under XSEDE allocation ENG150001.

  17. Containment Sodium Chemistry Models in MELCOR.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Louie, David; Humphries, Larry L.; Denman, Matthew R

    To meet regulatory needs for the future development of sodium fast reactors, including licensing requirements, Sandia National Laboratories is modernizing MELCOR, a severe accident analysis computer code developed for the U.S. Nuclear Regulatory Commission (NRC). Sandia's modernization effort primarily focuses on the containment response aspects of sodium reactor accidents. Sandia began modernizing MELCOR in 2013 to allow a sodium coolant in place of the water of conventional light water reactors. In the past three years, Sandia has been implementing the sodium chemistry containment models from CONTAIN-LMR, a legacy NRC code, into MELCOR. These chemistry models include spray fire, pool fire and atmosphere chemistry models. Only the first two have been implemented so far, though the intent is to implement all of these models in MELCOR. A new package called “NAC” has been created to manage the sodium chemistry models more efficiently. In 2017 Sandia began validating the implemented models in MELCOR by simulating available experiments. The CONTAIN-LMR sodium models also include sodium atmosphere chemistry and sodium-concrete interaction models. This paper presents the sodium property models, the implemented models, implementation issues, and a path towards validation against existing experimental data.

  18. Estimating power capability of aged lithium-ion batteries in presence of communication delays

    NASA Astrophysics Data System (ADS)

    Fridholm, Björn; Wik, Torsten; Kuusisto, Hannes; Klintberg, Anton

    2018-04-01

    Efficient control of electrified powertrains requires accurate estimation of the power capability of the battery for the next few seconds into the future. When implemented in a vehicle, the power estimation is part of a control loop that may contain several networked controllers, which introduces time delays that may jeopardize stability. In this article, we present and evaluate an adaptive power estimation method that can robustly handle uncertain health status and time delays. A theoretical analysis shows that stability of the closed-loop system can be lost if the resistance of the model is under-estimated. Stability can, however, be restored by filtering the estimated power at the expense of slightly reduced bandwidth of the signal. The adaptive algorithm is experimentally validated in lab tests using an aged lithium-ion cell subject to a high-power load profile at temperatures from -20 to +25 °C. The upper voltage limit was set to 4.15 V and the lower voltage limit to 2.6 V, where significant non-linearities occur and the validity of the model is limited. After an initial transient, when the model parameters are adapted, the prediction accuracy is within ±2% of the actually available power.
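    The interplay of a power-capability estimate and a stabilizing filter can be sketched with a textbook ohmic model and a first-order low-pass filter; this is an illustration, not the paper's adaptive algorithm, and the cell parameters below are hypothetical:

```python
def ohmic_power_capability(ocv, v_min, resistance):
    # Textbook ohmic sketch: with terminal voltage V = OCV - I*R, the
    # largest discharge current before V hits v_min is (OCV - v_min)/R,
    # delivered at v_min, so P = v_min * (OCV - v_min) / R.
    i_max = (ocv - v_min) / resistance
    return v_min * i_max

def lowpass(previous, estimate, alpha):
    # First-order filter on the power estimate: smaller alpha trades
    # bandwidth for robustness when the model resistance is uncertain
    return previous + alpha * (estimate - previous)

p = ohmic_power_capability(ocv=3.7, v_min=2.6, resistance=0.01)  # hypothetical cell
p_filt = lowpass(previous=200.0, estimate=p, alpha=0.5)          # one filter step
```

Note how an under-estimated `resistance` inflates `p`, which is exactly the failure mode the filter is meant to damp in the closed loop.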

  19. A large-scale benchmark of gene prioritization methods.

    PubMed

    Guala, Dimitri; Sonnhammer, Erik L L

    2017-04-21

    In order to maximize the use of results from high-throughput experimental studies, e.g. GWAS, for the identification and diagnostics of new disease-associated genes, it is important to have properly analyzed and benchmarked gene prioritization tools. While prospective benchmarks are underpowered to provide statistically significant results in their attempt to differentiate the performance of gene prioritization tools, a strategy for retrospective benchmarking has been missing, and new tools usually only provide internal validations. The Gene Ontology (GO) contains genes clustered around annotation terms. This intrinsic property of GO can be utilized in the construction of robust benchmarks that are objective with respect to the problem domain. We demonstrate how this can be achieved for network-based gene prioritization tools, utilizing the FunCoup network. We use cross-validation and a set of appropriate performance measures to compare state-of-the-art gene prioritization algorithms: three based on network diffusion (NetRank and two implementations of Random Walk with Restart), and MaxLink, which utilizes the network neighborhood. Our benchmark suite provides a systematic and objective way to compare the multitude of available and future gene prioritization tools, enabling researchers to select the best gene prioritization tool for the task at hand, and helping to guide the development of more accurate methods.
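    Random Walk with Restart, two implementations of which are benchmarked above, can be sketched in a few lines: probability mass diffuses over the network but teleports back to the seed genes with a fixed restart probability. The toy network and seed below are illustrative, not FunCoup data:

```python
def rwr(adj, seeds, restart=0.3, iters=200):
    # Random Walk with Restart on an undirected weighted graph given as a
    # dict of dicts; candidate genes are ranked by their stationary score.
    nodes = sorted(adj)
    p0 = {n: (1.0 / len(seeds) if n in seeds else 0.0) for n in nodes}
    p = dict(p0)
    deg = {n: sum(adj[n].values()) for n in nodes}
    for _ in range(iters):
        # Push each node's mass to its neighbors, then restart at the seeds
        p = {n: (1 - restart) * sum(p[m] * adj[m][n] / deg[m]
                                    for m in adj if n in adj[m])
                + restart * p0[n]
             for n in nodes}
    return p

# Toy chain network A-B-C-D with seed disease gene A
net = {'A': {'B': 1.0}, 'B': {'A': 1.0, 'C': 1.0},
       'C': {'B': 1.0, 'D': 1.0}, 'D': {'C': 1.0}}
scores = rwr(net, seeds={'A'})
```

Scores decay with network distance from the seed, which is the ranking signal these prioritization tools exploit.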

  20. A Surrogate Approach to the Experimental Optimization of Multielement Airfoils

    NASA Technical Reports Server (NTRS)

    Otto, John C.; Landman, Drew; Patera, Anthony T.

    1996-01-01

    The incorporation of experimental test data into the optimization process is accomplished through the use of Bayesian-validated surrogates. In the surrogate approach, a surrogate for the experiment (e.g., a response surface) serves in the optimization process. The validation step of the framework provides a qualitative assessment of the surrogate quality, and bounds the surrogate-for-experiment error on designs "near" surrogate-predicted optimal designs. The utility of the framework is demonstrated through its application to the experimental selection of the trailing-edge flap position to achieve a design lift coefficient for a three-element airfoil.
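    A response-surface surrogate of the kind described can be sketched as a least-squares quadratic fit to experimental samples, whose stationary point is the surrogate-predicted optimal design. The data below are synthetic, not the airfoil measurements:

```python
def fit_quadratic(xs, ys):
    # Least-squares response surface y ~ a + b*x + c*x**2 via the 3x3
    # normal equations, solved by Gaussian elimination with pivoting.
    S = [sum(x ** k for x in xs) for k in range(5)]
    T = [sum(y * x ** k for x, y in zip(xs, ys)) for k in range(3)]
    A = [[S[0], S[1], S[2], T[0]],
         [S[1], S[2], S[3], T[1]],
         [S[2], S[3], S[4], T[2]]]
    for i in range(3):
        piv = max(range(i, 3), key=lambda r: abs(A[r][i]))
        A[i], A[piv] = A[piv], A[i]
        for r in range(i + 1, 3):
            f = A[r][i] / A[i][i]
            for c in range(i, 4):
                A[r][c] -= f * A[i][c]
    coef = [0.0, 0.0, 0.0]
    for i in (2, 1, 0):  # back substitution
        coef[i] = (A[i][3] - sum(A[i][j] * coef[j]
                                 for j in range(i + 1, 3))) / A[i][i]
    return coef  # (a, b, c)

def surrogate_optimum(coef):
    # Stationary point of the quadratic surrogate (a maximum when c < 0)
    a, b, c = coef
    return -b / (2 * c)

# Synthetic "flap position vs lift" samples from y = 1 + 2x - x**2
x_opt = surrogate_optimum(fit_quadratic([0, 1, 2, 3, 4], [1, 2, 1, -2, -7]))
```

In the actual framework, the validation step would additionally bound the surrogate-for-experiment error near `x_opt` before the design is accepted.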

  1. The Use of Virtual Reality in the Study of People's Responses to Violent Incidents.

    PubMed

    Rovira, Aitor; Swapp, David; Spanlang, Bernhard; Slater, Mel

    2009-01-01

    This paper reviews experimental methods for the study of the responses of people to violence in digital media, and in particular considers the issues of internal validity and ecological validity or generalisability of results to events in the real world. Experimental methods typically involve a significant level of abstraction from reality, with participants required to carry out tasks that are far removed from violence in real life, and hence their ecological validity is questionable. On the other hand studies based on field data, while having ecological validity, cannot control multiple confounding variables that may have an impact on observed results, so that their internal validity is questionable. It is argued that immersive virtual reality may provide a unification of these two approaches. Since people tend to respond realistically to situations and events that occur in virtual reality, and since virtual reality simulations can be completely controlled for experimental purposes, studies of responses to violence within virtual reality are likely to have both ecological and internal validity. This depends on a property that we call 'plausibility' - including the fidelity of the depicted situation with prior knowledge and expectations. We illustrate this with data from a previously published experiment, a virtual reprise of Stanley Milgram's 1960s obedience experiment, and also with pilot data from a new study being developed that looks at bystander responses to violent incidents.

  2. The Use of Virtual Reality in the Study of People's Responses to Violent Incidents

    PubMed Central

    Rovira, Aitor; Swapp, David; Spanlang, Bernhard; Slater, Mel

    2009-01-01

    This paper reviews experimental methods for the study of the responses of people to violence in digital media, and in particular considers the issues of internal validity and ecological validity or generalisability of results to events in the real world. Experimental methods typically involve a significant level of abstraction from reality, with participants required to carry out tasks that are far removed from violence in real life, and hence their ecological validity is questionable. On the other hand studies based on field data, while having ecological validity, cannot control multiple confounding variables that may have an impact on observed results, so that their internal validity is questionable. It is argued that immersive virtual reality may provide a unification of these two approaches. Since people tend to respond realistically to situations and events that occur in virtual reality, and since virtual reality simulations can be completely controlled for experimental purposes, studies of responses to violence within virtual reality are likely to have both ecological and internal validity. This depends on a property that we call ‘plausibility’ – including the fidelity of the depicted situation with prior knowledge and expectations. We illustrate this with data from a previously published experiment, a virtual reprise of Stanley Milgram's 1960s obedience experiment, and also with pilot data from a new study being developed that looks at bystander responses to violent incidents. PMID:20076762

  3. Validation of Magnetic Resonance Thermometry by Computational Fluid Dynamics

    NASA Astrophysics Data System (ADS)

    Rydquist, Grant; Owkes, Mark; Verhulst, Claire M.; Benson, Michael J.; Vanpoppel, Bret P.; Burton, Sascha; Eaton, John K.; Elkins, Christopher P.

    2016-11-01

    Magnetic Resonance Thermometry (MRT) is a new experimental technique that can produce fully three-dimensional temperature fields in a noninvasive manner. However, validation is still required to determine the accuracy of measured results. One method of examination is to compare data gathered experimentally to data computed with computational fluid dynamics (CFD). In this study, large-eddy simulations have been performed with the NGA computational platform to generate data for a comparison with previously run MRT experiments. The experimental setup consisted of a heated jet inclined at 30° injected into a larger channel. In the simulations, viscosity and density were scaled according to the local temperature to account for differences in buoyant and viscous forces. A mesh-independence study was performed with 5 million-, 15 million-, and 45 million-cell meshes. The program Star-CCM+ was used to simulate the complete experimental geometry, and its results were compared to data generated from NGA. Overall, both programs show good agreement with the experimental data gathered with MRT. With these data, the validity of MRT as a diagnostic tool has been shown, and the tool can be used to further our understanding of a range of flows with non-trivial temperature distributions.
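
    The temperature-dependent property scaling described in this record can be sketched briefly. The abstract does not name the correlations actually used in the simulations, so the choice below (Sutherland's law for viscosity and the ideal-gas law for density, with standard air constants) is an illustrative assumption:

```python
def sutherland_viscosity(T, mu_ref=1.716e-5, T_ref=273.15, S=110.4):
    """Dynamic viscosity of air [Pa·s] at temperature T [K] via
    Sutherland's law, using standard air constants."""
    return mu_ref * (T / T_ref) ** 1.5 * (T_ref + S) / (T + S)

def ideal_gas_density(T, p=101325.0, R=287.05):
    """Air density [kg/m^3] at temperature T [K] from the ideal-gas law
    at pressure p [Pa] with specific gas constant R [J/(kg·K)]."""
    return p / (R * T)

# A heated jet at 350 K entering cooler surroundings: viscosity rises and
# density falls with temperature, altering local viscous and buoyant forces.
print(sutherland_viscosity(350.0), ideal_gas_density(350.0))
```

    Scaling both properties with the local temperature, as the record describes, is what lets the simulation capture the buoyancy and Reynolds-number differences between the hot jet and the cooler channel flow.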

  4. Turbofan Engine Post-Instability Behavior - Computer Simulations, Test Validation, and Application of Simulations,

    DTIC Science & Technology

    Keywords: compressors, air flow, turbofan engines, transients, surges, stability, computerized simulation, experimental data, validation, digital simulation, inlet guide vanes, rotation, stalling, recovery, hysteresis

  5. The Future of Virtual Reality in Education: A Future Oriented Meta Analysis of the Literature

    ERIC Educational Resources Information Center

    Passig, David

    2009-01-01

    Many have elaborated on the potential of virtual reality (VR) in learning. This article attempts to organize the literature on this issue in order to better identify indicators that can account for future valid trends, and seeks to bring to attention how authors who wrote about the future of VR in education confused futures' terms and produced…

  6. Secondary school physics teachers' conceptions of scientific evidence: A collective case study

    NASA Astrophysics Data System (ADS)

    Taylor, Joseph A.

    Engaging secondary school students in inquiry-oriented tasks that more closely simulate the scholarly activities of scientists has been recommended as a way to improve scientific literacy. Two tasks that are frequently recommended include students' design of original experiments, and students' evaluation of scientific evidence and conclusions. Yet, little is known about teachers' conceptions of experimentation. The principal aim of this study, therefore, was to describe the nature of prospective and practicing physics teachers' conceptions of scientific evidence. More specifically, the following research questions guided this study: (1) What types of issues related to the measurement reliability and experimental validity of scientific evidence do practicing and prospective physics teachers think about when designing experiments? (2) When presented with hypothetical scenarios that describe unsound experimental procedures or poorly supported conclusions (or both), what concerns will prospective and practicing physics teachers raise? And (3) When the participants' responses to parallel research prompts are compared across protocols, what similarities and differences exist? The nature of the teacher-participants' conceptions was described from an analysis of data collected from research prompts such as interviews and handwritten artifacts. In these research prompts, the teachers "thought aloud" while designing experiments and critically evaluated student-collected evidence presented in hypothetical classroom scenarios. The data from this study suggested that the three teachers, while contemplating the reliability and validity of scientific evidence, frequently used their conceptions of evidence in conjunction with specific subject matter conceptions. The data also indicated that the relationship between subject matter knowledge and conceptions of evidence was more pronounced for some conceptions of evidence than for others. Suggestions for future research included conducting similar studies in other physics content areas as well as other scientific disciplines. Implications for science teacher education suggested that science and science methods courses encourage the construction of evidence-based arguments, as well as engagement in peer review and critique.

  7. A CFD validation roadmap for hypersonic flows

    NASA Technical Reports Server (NTRS)

    Marvin, Joseph G.

    1992-01-01

    A roadmap for computational fluid dynamics (CFD) code validation is developed. The elements of the roadmap are consistent with air-breathing vehicle design requirements and related to the important flow path components: forebody, inlet, combustor, and nozzle. Building block and benchmark validation experiments are identified along with their test conditions and measurements. Based on evaluation criteria, recommendations for an initial CFD validation database are given and gaps identified where future experiments would provide the needed validation data.

  8. A CFD validation roadmap for hypersonic flows

    NASA Technical Reports Server (NTRS)

    Marvin, Joseph G.

    1993-01-01

    A roadmap for computational fluid dynamics (CFD) code validation is developed. The elements of the roadmap are consistent with air-breathing vehicle design requirements and related to the important flow path components: forebody, inlet, combustor, and nozzle. Building block and benchmark validation experiments are identified along with their test conditions and measurements. Based on evaluation criteria, recommendations for an initial CFD validation database are given and gaps identified where future experiments would provide the needed validation data.

  9. How valid are future generations' arguments for preserving wilderness?

    Treesearch

    Thomas A. More; James R. Averill; Thomas H. Stevens

    2000-01-01

    We are often urged to preserve wilderness for the sake of future generations. Future generations consist of potential persons who are mute stakeholders in the decisions of today. Many claims about the rights of future generations or our present obligations to them have been vigorously advanced and just as vigorously denied. Recent theorists, however, have argued for a...

  10. The Question of Education Science: "Experiment"ism Versus "Experimental"ism

    ERIC Educational Resources Information Center

    Howe, Kenneth R.

    2005-01-01

    The ascendant view in the current debate about education science -- experimentism -- is a reassertion of the randomized experiment as the methodological gold standard. Advocates of this view have ignored, not answered, long-standing criticisms of the randomized experiment: its frequent impracticality, its lack of external validity, its confinement…

  11. Internal Validity: A Must in Research Designs

    ERIC Educational Resources Information Center

    Cahit, Kaya

    2015-01-01

    In experimental research, internal validity refers to the extent to which researchers can conclude that changes in the dependent variable (i.e., outcome) are caused by manipulations of the independent variable. This causal inference permits researchers to meaningfully interpret research results. This article discusses (a) internal validity threats in social and…

  12. Jefferson Lab Science: Present and Future

    DOE PAGES

    McKeown, Robert D.

    2015-02-12

    The Continuous Electron Beam Accelerator Facility (CEBAF) and associated experimental equipment at Jefferson Lab comprise a unique facility for experimental nuclear physics. Furthermore, this facility is presently being upgraded, which will enable a new experimental program with substantial discovery potential to address important topics in nuclear, hadronic, and electroweak physics. Further in the future, it is envisioned that the Laboratory will evolve into an electron-ion colliding beam facility.

  13. Experimental validation of Monte Carlo (MANTIS) simulated x-ray response of columnar CsI scintillator screens

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Freed, Melanie; Miller, Stuart; Tang, Katherine

    Purpose: MANTIS is a Monte Carlo code developed for the detailed simulation of columnar CsI scintillator screens in x-ray imaging systems. Validation of this code is needed to provide a reliable and valuable tool for system optimization and accurate reconstructions for a variety of x-ray applications. Whereas previous validation efforts have focused on matching of summary statistics, in this work the authors examine the complete point response function (PRF) of the detector system in addition to relative light output values. Methods: Relative light output values and high-resolution PRFs have been experimentally measured with a custom setup. A corresponding set of simulated light output values and PRFs has also been produced, where detailed knowledge of the experimental setup and CsI:Tl screen structures is accounted for in the simulations. Four different screens were investigated with different thicknesses, column tilt angles, and substrate types. A quantitative comparison between the experimental and simulated PRFs was performed for four different incidence angles (0°, 15°, 30°, and 45°) and two different x-ray spectra (40 and 70 kVp). The figure of merit (FOM) used measures the normalized differences between the simulated and experimental data averaged over a region of interest. Results: Experimental relative light output values ranged from 1.456 to 1.650 and were in approximate agreement for aluminum substrates, but poor agreement for graphite substrates. The FOMs for all screen types, incidence angles, and energies ranged from 0.1929 to 0.4775. To put these FOMs in context, the same FOM was computed for 2D symmetric Gaussians fit to the same experimental data. These FOMs ranged from 0.2068 to 0.8029. Our analysis demonstrates that MANTIS reproduces experimental PRFs with higher accuracy than a symmetric 2D Gaussian fit to the experimental data in the majority of cases. Examination of the spatial distribution of differences between the PRFs shows that the main reason for errors between MANTIS and the experimental data is that MANTIS-generated PRFs are sharper than the experimental PRFs. Conclusions: The experimental validation of MANTIS performed in this study demonstrates that MANTIS is able to reliably predict experimental PRFs, especially for thinner screens, and can reproduce the highly asymmetric shape seen in the experimental data. As a result, optimizations and reconstructions carried out using MANTIS should yield results indicative of actual detector performance. Better characterization of screen properties is necessary to reconcile the simulated light output values with experimental data.
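
    A figure of merit of the kind this record describes — normalized differences between simulated and experimental PRFs averaged over a region of interest — can be sketched as follows. The abstract does not give the exact normalization, so this particular form (mean absolute difference of unit-normalized PRFs, scaled by the mean experimental signal in the region of interest) is an illustrative assumption, not the published definition:

```python
import numpy as np

def prf_fom(sim, exp, roi=None):
    """Illustrative figure of merit: mean absolute difference between
    unit-normalized simulated and experimental PRFs over a region of
    interest, scaled by the mean experimental signal there.
    (The FOM in the MANTIS validation paper may be defined differently.)"""
    sim = sim / sim.sum()
    exp = exp / exp.sum()
    if roi is None:
        roi = np.ones(exp.shape, dtype=bool)
    return float(np.abs(sim[roi] - exp[roi]).mean() / exp[roi].mean())

# Synthetic example: a "simulated" PRF slightly sharper than the
# "experimental" one, mimicking the bias the authors report.
x = np.linspace(-3.0, 3.0, 61)
r2 = x[:, None] ** 2 + x[None, :] ** 2
experimental = np.exp(-r2 / 2.0)   # broader PRF
simulated = np.exp(-r2 / 1.6)      # sharper PRF
print(prf_fom(simulated, experimental))
```

    A metric of this shape is zero for identical PRFs and grows with the mismatch, which is why the authors can rank it against the FOM of a symmetric 2D Gaussian fit to the same data.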

  14. Complete Statistical Survey Results of 1982 Texas Competency Validation Project.

    ERIC Educational Resources Information Center

    Rogers, Sandra K.; Dahlberg, Maurine F.

    This report documents a project to develop current statewide validated competencies for auto mechanics, diesel mechanics, welding, office occupations, and printing. Section 1 describes the four steps used in the current competency validation project and provides a standardized process for conducting future studies at the local or statewide level.…

  15. Validation of Agricultural Mechanics Curriculum Manual.

    ERIC Educational Resources Information Center

    Hatcher, Elizabeth; And Others

    This study was concerned with the validation of the Oklahoma Curriculum and Instructional Materials Center's agricultural mechanics curriculum manual and the development of a model whereby future manuals can be validated. Five units in the manual were randomly selected from a list of units to be taught during the second semester of the 1977-78…

  16. Temperature and heat flux datasets of a complex object in a fire plume for the validation of fire and thermal response codes.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jernigan, Dann A.; Blanchat, Thomas K.

    It is necessary to improve understanding and develop temporally- and spatially-resolved integral scale validation data of the heat flux incident to a complex object in addition to measuring the thermal response of said object located within the fire plume for the validation of the SIERRA/FUEGO/SYRINX fire and SIERRA/CALORE codes. To meet this objective, a complex calorimeter with sufficient instrumentation to allow validation of the coupling between FUEGO/SYRINX/CALORE has been designed, fabricated, and tested in the Fire Laboratory for Accreditation of Models and Experiments (FLAME) facility. Validation experiments are specifically designed for direct comparison with the computational predictions. Making meaningful comparison between the computational and experimental results requires careful characterization and control of the experimental features or parameters used as inputs into the computational model. Validation experiments must be designed to capture the essential physical phenomena, including all relevant initial and boundary conditions. This report presents the data validation steps and processes, the results of the penlight radiant heat experiments (for the purpose of validating the CALORE heat transfer modeling of the complex calorimeter), and the results of the fire tests in FLAME.

  17. A critical assessment of wind tunnel results for the NACA 0012 airfoil

    NASA Technical Reports Server (NTRS)

    Mccroskey, W. J.

    1987-01-01

    A large body of experimental results, obtained in more than 40 wind tunnels on a single, well-known two-dimensional configuration, has been critically examined and correlated. An assessment of some of the possible sources of error has been made for each facility, and data which are suspect have been identified. It was found that no single experiment provided a complete set of reliable data, although one investigation stands out as superior in many respects. However, from the aggregate of data the representative properties of the NACA 0012 airfoil can be identified with reasonable confidence over wide ranges of Mach number, Reynolds number, and angles of attack. This synthesized information can now be used to assess and validate existing and future wind tunnel results and to evaluate advanced Computational Fluid Dynamics codes.

  18. Discrimination of correlated and entangling quantum channels with selective process tomography

    DOE PAGES

    Dumitrescu, Eugene; Humble, Travis S.

    2016-10-10

    The accurate and reliable characterization of quantum dynamical processes underlies efforts to validate quantum technologies, where discrimination between competing models of observed behaviors inform efforts to fabricate and operate qubit devices. We present a protocol for quantum channel discrimination that leverages advances in direct characterization of quantum dynamics (DCQD) codes. We demonstrate that DCQD codes enable selective process tomography to improve discrimination between entangling and correlated quantum dynamics. Numerical simulations show selective process tomography requires only a few measurement configurations to achieve a low false alarm rate and that the DCQD encoding improves the resilience of the protocol to hidden sources of noise. Lastly, our results show that selective process tomography with DCQD codes is useful for efficiently distinguishing sources of correlated crosstalk from uncorrelated noise in current and future experimental platforms.

  19. Review of Transient Testing of Fast Reactor Fuels in the Transient REActor Test Facility (TREAT)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jensen, C.; Wachs, D.; Carmack, J.

    The restart of the Transient REActor Test (TREAT) facility provides a unique opportunity to engage the fast reactor fuels community to reinitiate in-pile experimental safety studies. Historically, the TREAT facility played a critical role in characterizing the behavior of both metal and oxide fast reactor fuels under off-normal conditions, irradiating hundreds of fuel pins to support fast reactor fuel development programs. The resulting test data has provided validation for a multitude of fuel performance and severe accident analysis computer codes. This paper will provide a review of the historical database of TREAT experiments including experiment design, instrumentation, test objectives, and salient findings. Additionally, the paper will provide an introduction to the current and future experiment plans of the U.S. transient testing program at TREAT.

  20. Subscale Flight Testing for Aircraft Loss of Control: Accomplishments and Future Directions

    NASA Technical Reports Server (NTRS)

    Cox, David E.; Cunningham, Kevin; Jordan, Thomas L.

    2012-01-01

    Subscale flight-testing provides a means to validate both dynamic models and mitigation technologies in the high-risk flight conditions associated with aircraft loss of control. The Airborne Subscale Transport Aircraft Research (AirSTAR) facility was designed to be a flexible and efficient research facility to address this type of flight-testing. Over the last several years (2009-2011) it has been used to perform 58 research flights with an unmanned, remotely-piloted, dynamically-scaled airplane. This paper will present an overview of the facility and its architecture and summarize the experimental data collected. All flights to date have been conducted within visual range of a safety observer. Current plans for the facility include expanding the test volume to altitudes and distances well beyond visual range. The architecture and instrumentation changes associated with this upgrade will also be presented.
