Sample records for extensive validation experiments

  1. Temperature measurement reliability and validity with thermocouple extension leads or changing lead temperature.

    PubMed

    Jutte, Lisa S; Long, Blaine C; Knight, Kenneth L

    2010-01-01

    Thermocouples' leads are often too short, necessitating the use of an extension lead. Objective: To determine whether temperature measures were influenced by extension-lead use or by changes in lead temperature. Design: Descriptive laboratory study. Setting: Laboratory. Instruments: Experiment 1, 10 IT-21 thermocouples and 5 extension leads; experiment 2, 5 IT-21 and PT-6 thermocouples. In experiment 1, temperature data were collected on 10 IT-21 thermocouples in a stable water bath with and without extension leads. In experiment 2, temperature data were collected on 5 IT-21 and PT-6 thermocouples in a stable water bath before, during, and after ice-pack application to the extension leads. In experiment 1, extension leads did not influence IT-21 validity (P = .45) or reliability (P = .10). In experiment 2, postapplication IT-21 temperatures were greater than preapplication and application measures (P < .05). Extension leads had no influence on temperature measures. Ice application to leads may increase measurement error.

  2. Empirical Validation and Application of the Computing Attitudes Survey

    ERIC Educational Resources Information Center

    Dorn, Brian; Elliott Tew, Allison

    2015-01-01

    Student attitudes play an important role in shaping learning experiences. However, few validated instruments exist for measuring student attitude development in a discipline-specific way. In this paper, we present the design, development, and validation of the computing attitudes survey (CAS). The CAS is an extension of the Colorado Learning…

  3. Progress Towards a Microgravity CFD Validation Study Using the ISS SPHERES-SLOSH Experiment

    NASA Technical Reports Server (NTRS)

    Storey, Jedediah M.; Kirk, Daniel; Marsell, Brandon (Editor); Schallhorn, Paul (Editor)

    2017-01-01

    Understanding, predicting, and controlling fluid slosh dynamics is critical to the safety and performance of space missions when a significant percentage of the spacecraft's mass is liquid. Computational fluid dynamics (CFD) simulations can be used to predict the dynamics of slosh, but these programs require extensive validation. Many CFD programs have been validated against slosh experiments using various fluids in Earth gravity, but prior to the ISS SPHERES-Slosh experiment, little experimental data for long-duration, zero-gravity slosh existed. This paper presents the current status of an ongoing CFD validation study using the ISS SPHERES-Slosh experimental data.

  4. Progress Towards a Microgravity CFD Validation Study Using the ISS SPHERES-SLOSH Experiment

    NASA Technical Reports Server (NTRS)

    Storey, Jed; Kirk, Daniel (Editor); Marsell, Brandon (Editor); Schallhorn, Paul (Editor)

    2017-01-01

    Understanding, predicting, and controlling fluid slosh dynamics is critical to the safety and performance of space missions when a significant percentage of the spacecraft's mass is liquid. Computational fluid dynamics (CFD) simulations can be used to predict the dynamics of slosh, but these programs require extensive validation. Many CFD programs have been validated against slosh experiments using various fluids in Earth gravity, but prior to the ISS SPHERES-Slosh experiment, little experimental data for long-duration, zero-gravity slosh existed. This paper presents the current status of an ongoing CFD validation study using the ISS SPHERES-Slosh experimental data.

  5. Extension and Validation of a Hybrid Particle-Finite Element Method for Hypervelocity Impact Simulation. Chapter 2

    NASA Technical Reports Server (NTRS)

    Fahrenthold, Eric P.; Shivarama, Ravishankar

    2004-01-01

    The hybrid particle-finite element method of Fahrenthold and Horban, developed for the simulation of hypervelocity impact problems, has been extended to include new formulations of the particle-element kinematics, additional constitutive models, and an improved numerical implementation. The extended formulation has been validated in three dimensional simulations of published impact experiments. The test cases demonstrate good agreement with experiment, good parallel speedup, and numerical convergence of the simulation results.

  6. Environmental Stimulation, Parental Nurturance and Cognitive Development in Humans

    ERIC Educational Resources Information Center

    Farah, Martha J.; Betancourt, Laura; Shera, David M.; Savage, Jessica H.; Giannetta, Joan M.; Brodsky, Nancy L.; Malmud, Elsa K.; Hurt, Hallam

    2008-01-01

    The effects of environmental stimulation and parental nurturance on brain development have been studied extensively in animals. Much less is known about the relations between childhood experience and cognitive development in humans. Using a longitudinally collected data set with ecologically valid in-home measures of childhood experience and later…

  7. Validating Personal Well-Being Experiences at School: A Quantitative Examination of Secondary School Students

    ERIC Educational Resources Information Center

    Phan, Huy P.; Ngu, Bing H.

    2015-01-01

    Progress in education has involved, to a large extent, a focus on individuals' well-being experiences at school (ACU and Erebus International, 2008; Fraillon, 2004). This line of inquiry has produced extensive findings, highlighting the diverse coverage and scope of this psychosocial theoretical orientation. We recently developed a theoretical…

  8. Additional extensions to the NASCAP computer code, volume 2

    NASA Technical Reports Server (NTRS)

    Stannard, P. R.; Katz, I.; Mandell, M. J.

    1982-01-01

    Particular attention is given to comparison of the actual response of the SCATHA (Spacecraft Charging AT High Altitudes) P78-2 satellite with theoretical (NASCAP) predictions. Extensive comparisons for a variety of environmental conditions confirm the validity of the NASCAP model. A summary of the capabilities and range of validity of NASCAP is presented, with extensive reference to previously published applications. It is shown that NASCAP is capable of providing quantitatively accurate results when the object and environment are adequately represented and fall within the range of conditions for which NASCAP was intended. Three-dimensional electric field effects play an important role in determining the potential of dielectric surfaces and electrically isolated conducting surfaces, particularly in the presence of artificially imposed high voltages. A theory for such phenomena is presented and applied to the active control experiments carried out on SCATHA, as well as to other space and laboratory experiments. Finally, some preliminary work toward modeling large spacecraft in polar Earth orbit is presented. An initial physical model, including charge emission, is presented, and a simple code based upon the model is described along with code test results.

  9. The Social Psychology of Perception Experiments: Hills, Backpacks, Glucose, and the Problem of Generalizability

    ERIC Educational Resources Information Center

    Durgin, Frank H.; Klein, Brennan; Spiegel, Ariana; Strawser, Cassandra J.; Williams, Morgan

    2012-01-01

    Experiments take place in a physical environment but also a social environment. Generalizability from experimental manipulations to more typical contexts may be limited by violations of ecological validity with respect to either the physical or the social environment. A replication and extension of a recent study (a blood glucose manipulation) was…

  10. An Investigation of Aerosol Measurements from the Halogen Occultation Experiment: Validation, Size Distributions, Composition, and Relation to Other Chemical Species

    NASA Technical Reports Server (NTRS)

    Deshler, Terry; Hervig, Mark E.

    1998-01-01

    The efforts envisioned within the original proposal (accepted February 1994) and the extension of this proposal (accepted February 1997) included measurement validations, the retrieval of aerosol size distributions and distribution moments, aerosol correction studies, and investigations of polar stratospheric clouds. A majority of the results from this grant have been published. The principal results from this grant are discussed.

  11. Extension of the firefly algorithm and preference rules for solving MINLP problems

    NASA Astrophysics Data System (ADS)

    Costa, M. Fernanda P.; Francisco, Rogério B.; Rocha, Ana Maria A. C.; Fernandes, Edite M. G. P.

    2017-07-01

    An extension of the firefly algorithm (FA) for solving mixed-integer nonlinear programming (MINLP) problems is presented. Although penalty functions are frequently used to handle integrality conditions and inequality and equality constraints, this paper proposes implementing within the FA a simple rounding-based heuristic and four preference rules to find and converge to MINLP-feasible solutions. Preliminary numerical experiments are carried out to validate the proposed methodology.
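    The rounding heuristic mentioned in this abstract can be sketched in a few lines. This is an illustrative reconstruction, not the authors' code: the constraint, candidate values, and integer index set below are hypothetical, and the FA search loop itself is omitted.

```python
import math

def round_heuristic(x, int_idx):
    """Round the integer-constrained components of a continuous FA candidate.
    A minimal sketch of the rounding idea described in the abstract."""
    y = list(x)
    for i in int_idx:
        y[i] = round(y[i])
    return y

def is_feasible(y, constraints, tol=1e-9):
    """Check inequality constraints written as g(y) <= 0."""
    return all(g(y) <= tol for g in constraints)

# Hypothetical MINLP constraint: y0 + y1 <= 5, with y0 integer.
constraints = [lambda y: y[0] + y[1] - 5.0]
candidate = [2.3, 1.7]                       # continuous candidate from the FA
rounded = round_heuristic(candidate, int_idx=[0])
print(rounded)                               # [2, 1.7]
print(is_feasible(rounded, constraints))     # True
```

    In the full method, candidates that remain infeasible after rounding would be ranked by preference rules; here only the rounding step is shown.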

  12. F-16XL-2 Supersonic Laminar Flow Control Flight Test Experiment

    NASA Technical Reports Server (NTRS)

    Anders, Scott G.; Fischer, Michael C.

    1999-01-01

    The F-16XL-2 Supersonic Laminar Flow Control Flight Test Experiment was part of the NASA High-Speed Research Program. The goal of the experiment was to demonstrate extensive laminar flow, to validate computational fluid dynamics (CFD) codes and design methodology, and to establish laminar flow control design criteria. Topics include the flight test hardware and design, airplane modification, the pressure and suction distributions achieved, the laminar flow achieved, and the data analysis and code correlation.

  13. Determining Content Validity for the Transition Awareness and Possibilities Scale (TAPS)

    ERIC Educational Resources Information Center

    Ross, Melynda Burck

    2011-01-01

    The Transition Awareness & Possibilities Scale (TAPS) was crafted after an extensive review of literature was conducted to find research that examined and described specific aspects of transition programming: inputs, including supports and skill instruction; processes, including parent and support provider perceptions of the transition experience;…

  14. A statistical approach to selecting and confirming validation targets in -omics experiments

    PubMed Central

    2012-01-01

    Background: Genomic technologies are, by their very nature, designed for hypothesis generation. In some cases, the hypotheses that are generated require that genome scientists confirm findings about specific genes or proteins. But one major advantage of high-throughput technology is that global genetic, genomic, transcriptomic, and proteomic behaviors can be observed. Manual confirmation of every statistically significant genomic result is prohibitively expensive. This has led researchers in genomics to adopt the strategy of confirming only a handful of the most statistically significant results, a small subset chosen for biological interest, or a small random subset. But there is no standard approach for selecting and quantitatively evaluating validation targets. Results: Here we present a new statistical method and approach for statistically validating lists of significant results based on confirming only a small random sample. We apply our statistical method to show that the usual practice of confirming only the most statistically significant results does not statistically validate result lists. We analyze an extensively validated RNA-sequencing experiment to show that confirming a random subset can statistically validate entire lists of significant results. Finally, we analyze multiple publicly available microarray experiments to show that statistically validating random samples can both (i) provide evidence to confirm long gene lists and (ii) save thousands of dollars and hundreds of hours of labor over manual validation of each significant result. Conclusions: For high-throughput -omics studies, statistical validation is a cost-effective and statistically valid approach to confirming lists of significant results. PMID:22738145
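    The core idea of validating an entire result list from a random subset can be illustrated with a simple binomial bound on the list-wide confirmation rate. This is a sketch of the general statistical reasoning only, not the paper's exact method; the counts below are hypothetical.

```python
import math

def lower_conf_bound(successes, n, z=1.645):
    """One-sided normal-approximation lower confidence bound on the
    confirmation rate of the full list, estimated from a random
    validation sample of size n with `successes` confirmations."""
    p = successes / n
    se = math.sqrt(p * (1 - p) / n)
    return max(0.0, p - z * se)

# Hypothetical: 20 targets drawn at random from 400 significant hits;
# 18 confirm in follow-up experiments.
lb = lower_conf_bound(18, 20)
print(f"estimated confirmation rate >= {lb:.2f} (90% one-sided)")
```

    Because the sample is random rather than cherry-picked from the top of the list, the bound applies to the whole list, which is the key contrast the paper draws with confirming only the most significant results.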

  15. SMOS L1C and L2 Validation in Australia

    NASA Technical Reports Server (NTRS)

    Rudiger, Christoph; Walker, Jeffrey P.; Kerr, Yann H.; Mialon, Arnaud; Merlin, Olivier; Kim, Edward J.

    2012-01-01

    Extensive airborne field campaigns (Australian Airborne Cal/Val Experiments for SMOS - AACES) were undertaken during the 2010 summer and winter seasons of the southern hemisphere. The purpose of those campaigns was the validation of the Level 1c (brightness temperature) and Level 2 (soil moisture) products of the ESA-led Soil Moisture and Ocean Salinity (SMOS) mission. As SMOS is the first satellite to globally map L-band (1.4 GHz) emissions from the Earth's surface, and the first 2-dimensional interferometric microwave radiometer used for Earth observation, large-scale and long-term validation campaigns have been conducted world-wide, of which AACES is the most extensive. AACES combined large-scale, medium-resolution airborne L-band and spectral observations with high-resolution in-situ measurements of soil moisture across a 50,000 km2 area of the Murrumbidgee River catchment, located in south-eastern Australia. This paper presents a qualitative assessment of the SMOS brightness temperature and soil moisture products.

  16. Validating presupposed versus focused text information.

    PubMed

    Singer, Murray; Solar, Kevin G; Spear, Jackie

    2017-04-01

    There is extensive evidence that readers continually validate discourse accuracy and congruence, but that they may also overlook conspicuous text contradictions. Validation may be thwarted when the inaccurate ideas are embedded sentence presuppositions. In four experiments, we examined readers' validation of presupposed ("given") versus new text information. Throughout, a critical concept, such as a truck versus a bus, was introduced early in a narrative. Later, a character stated or thought something about the truck, which therefore matched or mismatched its antecedent. Furthermore, truck was presented as either given or new information. Mismatch target reading times uniformly exceeded the matching ones by similar magnitudes for given and new concepts. We obtained this outcome using different grammatical constructions and with different antecedent-target distances. In Experiment 4, we examined only given critical ideas, but varied both their matching and the main verb's factivity (e.g., factive know vs. nonfactive think). The Match × Factivity interaction closely resembled that previously observed for new target information (Singer, 2006). Thus, readers can successfully validate given target information. Although contemporary theories tend to emphasize either deficient or successful validation, both types of theory can accommodate the discourse and reader variables that may regulate validation.

  17. TU-D-201-05: Validation of Treatment Planning Dose Calculations: Experience Working with MPPG 5.a

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xue, J; Park, J; Kim, L

    2016-06-15

    Purpose: The newly published medical physics practice guideline (MPPG 5.a.) has set the minimum requirements for commissioning and QA of treatment planning dose calculations. We present our experience in the validation of a commercial treatment planning system based on MPPG 5.a. Methods: In addition to tests traditionally performed to commission a model-based dose calculation algorithm, extensive tests were carried out at short and extended SSDs, various depths, oblique gantry angles, and off-axis conditions to verify the robustness and limitations of the dose calculation algorithm. A comparison between measured and calculated dose was performed based on the validation tests and evaluation criteria recommended by MPPG 5.a. An ion chamber was used for the measurement of dose at points of interest, and diodes were used for photon IMRT/VMAT validations. Dose profiles were measured with a three-dimensional scanning system and calculated in the TPS using a virtual water phantom. Results: Calculated and measured absolute dose profiles were compared at each specified SSD and depth for open fields. Disagreement is easily identifiable with the difference curve. Subtle discrepancies revealed the limitations of the measurement, e.g., a spike in the high-dose region and an asymmetrical penumbra observed in the tests with an oblique MLC beam. The excellent results (> 98% pass rate on a 3%/3 mm gamma index) on the end-to-end tests for both IMRT and VMAT are attributed to the quality of the beam data and a good understanding of the modeling. The limitations of the model and the uncertainty of measurement were considered when comparing the results. Conclusion: The extensive tests recommended by the MPPG encourage us to understand the accuracy and limitations of a dose algorithm as well as the uncertainty of measurement. Our experience has shown how the suggested tests can be performed effectively to validate dose calculation models.
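    The 3%/3 mm gamma pass rate cited in this abstract can be illustrated with a minimal one-dimensional implementation. Clinical gamma tools interpolate between points and handle normalization far more carefully; the dose profiles below are hypothetical.

```python
import math

def gamma_1d(ref, meas, dx, dose_tol=0.03, dta=3.0):
    """Minimal 1-D global gamma index (3%/3 mm style).
    ref/meas are dose samples at spacing dx (mm); dose_tol is a
    fraction of the reference maximum; dta is the distance-to-
    agreement criterion in mm. A sketch of the metric only."""
    d_max = max(ref)
    gammas = []
    for i, dm in enumerate(meas):
        best = math.inf
        for j, dr in enumerate(ref):
            dist = (i - j) * dx                       # spatial offset (mm)
            dd = (dm - dr) / (dose_tol * d_max)       # normalized dose diff
            best = min(best, math.sqrt((dist / dta) ** 2 + dd ** 2))
        gammas.append(best)
    return gammas

ref  = [0.2, 0.50, 1.00, 0.5, 0.20]   # hypothetical reference profile
meas = [0.2, 0.52, 0.99, 0.5, 0.21]   # hypothetical measured profile
g = gamma_1d(ref, meas, dx=1.0)
pass_rate = sum(x <= 1.0 for x in g) / len(g)
print(f"gamma pass rate: {pass_rate:.0%}")
```

    A point passes when its gamma value is at most 1, i.e., some nearby reference point agrees within the combined dose and distance tolerances.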

  18. Three-dimensional computational fluid dynamics modelling and experimental validation of the Jülich Mark-F solid oxide fuel cell stack

    NASA Astrophysics Data System (ADS)

    Nishida, R. T.; Beale, S. B.; Pharoah, J. G.; de Haart, L. G. J.; Blum, L.

    2018-01-01

    This work is among the first where the results of an extensive experimental research programme are compared to performance calculations of a comprehensive computational fluid dynamics model for a solid oxide fuel cell stack. The model, which combines electrochemical reactions with momentum, heat, and mass transport, is used to obtain results for an established industrial-scale fuel cell stack design with complex manifolds. To validate the model, comparisons with experimentally gathered voltage and temperature data are made for the Jülich Mark-F, 18-cell stack operating in a test furnace. Good agreement is obtained between the model and experiment results for cell voltages and temperature distributions, confirming the validity of the computational methodology for stack design. The transient effects during ramp up of current in the experiment may explain a lower average voltage than model predictions for the power curve.

  19. Reliability and validity of the Performance Recorder 1 for measuring isometric knee flexor and extensor strength.

    PubMed

    Neil, Sarah E; Myring, Alec; Peeters, Mon Jef; Pirie, Ian; Jacobs, Rachel; Hunt, Michael A; Garland, S Jayne; Campbell, Kristin L

    2013-11-01

    Muscular strength is a key parameter of rehabilitation programs and a strong predictor of functional capacity. Traditional methods to measure strength, such as manual muscle testing (MMT) and hand-held dynamometry (HHD), are limited by the strength and experience of the tester. The Performance Recorder 1 (PR1) is a strength assessment tool attached to resistance training equipment and may be a time- and cost-effective tool to measure strength in clinical practice that overcomes some limitations of MMT and HHD. However, the reliability and validity of the PR1 have not been reported. Test-retest and inter-rater reliability was assessed using the PR1 in healthy adults (n = 15) during isometric knee flexion and extension. Criterion-related validity was assessed through comparison of values obtained from the PR1 and a Biodex® isokinetic dynamometer. Test-retest reliability was excellent for peak knee flexion (intra-class correlation coefficient [ICC] of 0.96, 95% CI: 0.85, 0.99) and knee extension (ICC = 0.96, 95% CI: 0.87, 0.99). Inter-rater reliability was also excellent for peak knee flexion (ICC = 0.95, 95% CI: 0.85, 0.99) and peak knee extension (ICC = 0.97, 95% CI: 0.91, 0.99). Validity was moderate for peak knee flexion (ICC = 0.75, 95% CI: 0.38, 0.92) but poor for peak knee extension (ICC = 0.37, 95% CI: 0, 0.73). The PR1 provides a reliable measure of isometric knee flexor and extensor strength in healthy adults that could be used in the clinical setting, but absolute values may not be comparable to strength assessment by gold-standard measures.
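    The intra-class correlation coefficients reported above can be computed directly from raw test-retest data. The sketch below implements the common two-way random-effects, absolute-agreement, single-measure form, ICC(2,1); the abstract does not state which ICC model was used, and the strength values below are hypothetical.

```python
def icc_2_1(data):
    """Two-way random-effects, absolute-agreement, single-measure ICC(2,1).
    `data` is a list of subjects, each a list of k repeated measurements."""
    n, k = len(data), len(data[0])
    grand = sum(sum(row) for row in data) / (n * k)
    row_means = [sum(row) / k for row in data]
    col_means = [sum(data[i][j] for i in range(n)) / n for j in range(k)]
    ss_rows = k * sum((m - grand) ** 2 for m in row_means)   # between subjects
    ss_cols = n * sum((m - grand) ** 2 for m in col_means)   # between sessions
    ss_total = sum((x - grand) ** 2 for row in data for x in row)
    ss_err = ss_total - ss_rows - ss_cols
    msr = ss_rows / (n - 1)
    msc = ss_cols / (k - 1)
    mse = ss_err / ((n - 1) * (k - 1))
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

# Hypothetical test-retest data: 4 subjects, 2 sessions of peak strength
data = [[10, 11], [20, 19], [30, 32], [40, 39]]
print(f"ICC(2,1) = {icc_2_1(data):.3f}")
```

    Values near 1 indicate that between-subject differences dominate session-to-session noise, which is what "excellent test-retest reliability" summarizes.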

  20. Validation of a photography-based goniometry method for measuring joint range of motion.

    PubMed

    Blonna, Davide; Zarkadas, Peter C; Fitzsimmons, James S; O'Driscoll, Shawn W

    2012-01-01

    A critical component of evaluating the outcomes after surgery to restore lost elbow motion is the range of motion (ROM) of the elbow. This study examined if digital photography-based goniometry is as accurate and reliable as clinical goniometry for measuring elbow ROM. Instrument validity and reliability for photography-based goniometry were evaluated for a consecutive series of 50 elbow contractures by 4 observers with different levels of elbow experience. Goniometric ROM measurements were taken with the elbows in full extension and full flexion directly in the clinic (once) and from digital photographs (twice in a blinded random manner). Instrument validity for photography-based goniometry was extremely high (intraclass correlation coefficient: extension = 0.98, flexion = 0.96). For extension and flexion measurements by the expert surgeon, systematic error was negligible (0° and 1°, respectively). Limits of agreement were 7° (95% confidence interval [CI], 5° to 9°) and -7° (95% CI, -5° to -9°) for extension and 8° (95% CI, 6° to 10°) and -7° (95% CI, -5° to -9°) for flexion. Interobserver reliability for photography-based goniometry was better than that for clinical goniometry. The least experienced observer's photographic goniometry measurements were closer to the reference measurements than the clinical goniometry measurements. Photography-based goniometry is accurate and reliable for measuring elbow ROM. The photography-based method relied less on observer expertise than clinical goniometry. This validates an objective measure of patient outcome without requiring doctor-patient contact at a tertiary care center, where most contracture surgeries are done. Copyright © 2012 Journal of Shoulder and Elbow Surgery Board of Trustees. Published by Mosby, Inc. All rights reserved.
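    The limits of agreement reported in this abstract are the Bland-Altman statistic: the mean difference between the two methods plus or minus 1.96 standard deviations of the differences. A minimal sketch with hypothetical goniometry readings:

```python
import math

def limits_of_agreement(a, b):
    """Bland-Altman bias and 95% limits of agreement between two
    measurement methods (paired readings a and b)."""
    diffs = [x - y for x, y in zip(a, b)]
    n = len(diffs)
    bias = sum(diffs) / n
    sd = math.sqrt(sum((d - bias) ** 2 for d in diffs) / (n - 1))
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

clinic = [130, 145, 120, 150, 135]   # clinical goniometer (degrees, hypothetical)
photo  = [128, 147, 121, 148, 136]   # photography-based (degrees, hypothetical)
bias, lo, hi = limits_of_agreement(clinic, photo)
print(f"bias {bias:+.1f} deg, LoA [{lo:.1f}, {hi:.1f}] deg")
```

    A negligible bias with narrow limits, as in the study's expert-surgeon measurements, indicates the two methods can be used interchangeably.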

  1. Anticipatory and consummatory components of the experience of pleasure in schizophrenia: cross-cultural validation and extension.

    PubMed

    Chan, Raymond C K; Wang, Ya; Huang, Jia; Shi, Yanfang; Wang, Yuna; Hong, Xiaohong; Ma, Zheng; Li, Zhanjian; Lai, M K; Kring, Ann M

    2010-01-30

    This study examined anticipatory and consummatory pleasure in schizophrenia patients with and without negative symptoms. Negative symptom patients experienced less anticipatory pleasure than non-negative symptom patients; only one facet of consummatory pleasure was unaffected in negative schizophrenia. Greater pleasure deficits were correlated with more severe positive and negative symptoms.

  2. Satellite-based soil moisture validation and field experiments; Skylab to SMAP

    USDA-ARS?s Scientific Manuscript database

    Soil moisture remote sensing has reached a level of maturity that is now limited primarily by technology and funding. This is a result of extensive research and development that began in earnest in the 1970s and by the late 1990s had provided the basis and direction needed to support two dedicated s...

  3. Criterion-Related Validity of Sit-and-Reach Tests for Estimating Hamstring and Lumbar Extensibility: a Meta-Analysis

    PubMed Central

    Mayorga-Vega, Daniel; Merino-Marban, Rafael; Viciana, Jesús

    2014-01-01

    The main purpose of the present meta-analysis was to examine the scientific literature on the criterion-related validity of sit-and-reach tests for estimating hamstring and lumbar extensibility. For this purpose relevant studies were searched from seven electronic databases dated up through December 2012. Primary outcomes of criterion-related validity were Pearson's zero-order correlation coefficients (r) between sit-and-reach tests and hamstring and/or lumbar extensibility criterion measures. Then, from the included studies, the Hunter-Schmidt psychometric meta-analysis approach was conducted to estimate the population criterion-related validity of sit-and-reach tests. First, the corrected correlation mean (rp), unaffected by statistical artefacts (i.e., sampling error and measurement error), was calculated separately for each sit-and-reach test. Subsequently, three potential moderator variables (sex of participants, age of participants, and level of hamstring extensibility) were examined by a partially hierarchical analysis. Of the 34 studies included in the present meta-analysis, 99 correlation values across eight sit-and-reach tests and 51 across seven sit-and-reach tests were retrieved for hamstring and lumbar extensibility, respectively. The overall results showed that all sit-and-reach tests had a moderate mean criterion-related validity for estimating hamstring extensibility (rp = 0.46-0.67), but a low mean for estimating lumbar extensibility (rp = 0.16-0.35). Generally, females, adults and participants with high levels of hamstring extensibility tended to have greater mean values of criterion-related validity for estimating hamstring extensibility. When the use of angular tests is limited, such as in a school setting or in large-scale studies, scientists and practitioners could use the sit-and-reach tests as a useful alternative for hamstring extensibility estimation, but not for estimating lumbar extensibility.
    Key Points: Overall, sit-and-reach tests have a moderate mean criterion-related validity for estimating hamstring extensibility, but a low mean validity for estimating lumbar extensibility. Among all the sit-and-reach test protocols, the Classic sit-and-reach test seems to be the best option to estimate hamstring extensibility. End scores (e.g., the Classic sit-and-reach test) are a better indicator of hamstring extensibility than the modifications that incorporate fingers-to-box distance (e.g., the Modified sit-and-reach test). When angular tests such as straight leg raise or knee extension tests cannot be used, sit-and-reach tests seem to be a useful field-test alternative to estimate hamstring extensibility, but not to estimate lumbar extensibility. PMID:24570599
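    The Hunter-Schmidt approach used in this meta-analysis can be sketched in its bare-bones form: a sample-size-weighted mean correlation, with observed variance reduced by the variance expected from sampling error alone. The correlations and sample sizes below are hypothetical, and the full approach also corrects for measurement error, which is omitted here.

```python
def hunter_schmidt(rs, ns):
    """Bare-bones Hunter-Schmidt psychometric meta-analysis:
    sample-size-weighted mean correlation and residual variance
    after subtracting expected sampling-error variance."""
    total_n = sum(ns)
    r_bar = sum(n * r for r, n in zip(rs, ns)) / total_n
    var_obs = sum(n * (r - r_bar) ** 2 for r, n in zip(rs, ns)) / total_n
    var_err = (1 - r_bar ** 2) ** 2 * len(rs) / total_n  # expected from sampling
    return r_bar, max(0.0, var_obs - var_err)

rs = [0.55, 0.62, 0.48, 0.70]   # study correlations (hypothetical)
ns = [40, 60, 50, 30]           # study sample sizes (hypothetical)
r_bar, var_rho = hunter_schmidt(rs, ns)
print(f"weighted mean r = {r_bar:.3f}, residual variance = {var_rho:.4f}")
```

    When the residual variance is near zero, sampling error accounts for essentially all between-study spread, so a single population validity estimate is defensible; otherwise moderators (such as sex or age here) are examined.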

  4. A novel cell culture model as a tool for forensic biology experiments and validations.

    PubMed

    Feine, Ilan; Shpitzen, Moshe; Roth, Jonathan; Gafny, Ron

    2016-09-01

    To improve and advance DNA forensic casework investigation outcomes, extensive field and laboratory experiments are carried out in a broad range of relevant branches, such as touch and trace DNA, secondary DNA transfer and contamination confinement. Moreover, the development of new forensic tools, for example new sampling appliances, by commercial companies requires ongoing validation and assessment by forensic scientists. A frequent challenge in these kinds of experiments and validations is the lack of a stable, reproducible and flexible biological reference material. As a possible solution, we present here a cell culture model based on skin-derived human dermal fibroblasts. Cultured cells were harvested, quantified and dried on glass slides. These slides were used in adhesive tape-lifting experiments and tests of DNA crossover confinement by UV irradiation. The use of this model enabled a simple and concise comparison between four adhesive tapes, as well as a straightforward demonstration of the effect of UV irradiation intensities on DNA quantity and degradation. In conclusion, we believe this model has great potential to serve as an efficient research tool in forensic biology. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  5. Dissociation From a Cross-Cultural Perspective: Implications of Studies in Brazil.

    PubMed

    Maraldi, Everton de Oliveira; Krippner, Stanley; Barros, Maria Cristina Monteiro; Cunha, Alexandre

    2017-07-01

    A major issue in the study of dissociation concerns the cross-cultural validity of definitions and measurements used to identify and classify dissociative disorders. There is also extensive debate on the etiological factors underlying dissociative experiences. Cross-cultural research is essential to elucidate these issues, particularly regarding evidence obtained from countries in which the study of dissociation is still in its infancy. The aim of this article was to discuss Brazilian research on the topic of dissociation, highlighting its contributions for the understanding of dissociative experiences in nonclinical populations and for the validity and relevance of dissociative disorders in the contexts of psychiatry, psychology, and psychotherapy. We also consider the ways in which dissociative experiences are assimilated by Brazilian culture and religious expressions, and the implications of Brazilian studies for the sociocultural investigation of dissociation. We conclude by addressing the limitations of these studies and potential areas for future research.

  6. Zig-zag tape influence in NREL Phase VI wind turbine

    NASA Astrophysics Data System (ADS)

    Gomez-Iradi, Sugoi; Munduate, Xabier

    2014-06-01

    A two-bladed, 10-metre-diameter wind turbine was tested in the 24.4 m × 36.6 m NASA-Ames wind tunnel (Phase VI). These experiments have been used extensively for validation of CFD and other engineering tools. The free-transition case (S) has been, and remains, the one most employed for validation purposes; it consists of a 3° pitch case at a rotational speed of 72 rpm in an upwind configuration, with and without yaw misalignment. However, there is another, less visited case (M) in which an identical configuration was tested but with the inclusion of a zig-zag tape. This was called the fixed-transition sequence. This paper shows the differences between the free- and fixed-transition cases; the latter should be more appropriate for comparison with fully turbulent simulations. Steady k-ω SST fully turbulent computations performed with the WMB CFD method are compared with the experiments, showing better predictions in the attached-flow region when compared with the fixed-transition experiments. This work aims to prove the utility of the M case (fixed transition) and to show its differences with respect to the S case (free transition) for validation purposes.

  7. In-Flight Thermal Performance of the Lidar In-Space Technology Experiment

    NASA Technical Reports Server (NTRS)

    Roettker, William

    1995-01-01

    The Lidar In-Space Technology Experiment (LITE) was developed at NASA's Langley Research Center to explore the applications of lidar operated from an orbital platform. As a technology demonstration experiment, LITE was developed to gain experience designing and building future operational orbiting lidar systems. Since LITE was the first lidar system to be flown in space, an important objective was to validate instrument design principles in such areas as thermal control, laser performance, instrument alignment and control, and autonomous operations. Thermal and structural analysis models of the instrument were developed during the design process to predict the behavior of the instrument during its mission. In order to validate those mathematical models, extensive engineering data were recorded during all phases of LITE's mission. These in-flight engineering data were compared with preflight predictions and, when required, adjustments to the thermal and structural models were made to more accurately match the instrument's actual behavior. The results of this process for the thermal analysis and design of LITE are presented in this paper.

  8. EOS Terra Validation Program

    NASA Technical Reports Server (NTRS)

    Starr, David

    1999-01-01

    The EOS Terra mission will be launched in July 1999. This mission has great relevance to the atmospheric radiation community and global change issues. Terra instruments include ASTER, CERES, MISR, MODIS and MOPITT. In addition to the fundamental radiance data sets, numerous global science data products will be generated, including various Earth radiation budget, cloud and aerosol parameters, as well as land surface, terrestrial ecology, ocean color, and atmospheric chemistry parameters. Significant investments have been made in on-board calibration to ensure the quality of the radiance observations. A key component of the Terra mission is the validation of the science data products. This is essential for a mission focused on global change issues and the underlying processes. The Terra algorithms have been subject to extensive pre-launch testing with field data whenever possible. Intensive efforts will be made to validate the Terra data products after launch. These include validation of instrument calibration (vicarious calibration) experiments, instrument and cross-platform comparisons, routine collection of high quality correlative data from ground-based networks, such as AERONET, and intensive sites, such as the SGP ARM site, as well as a variety of field experiments, cruises, etc. Airborne simulator instruments have been developed for the field experiment and underflight activities including the MODIS Airborne Simulator (MAS), AirMISR, MASTER (MODIS-ASTER), and MOPITT-A. All are integrated on the NASA ER-2, though low altitude platforms are more typically used for MASTER. MATR is an additional sensor used for MOPITT algorithm development and validation. The intensive validation activities planned for the first year of the Terra mission will be described with emphasis on derived geophysical parameters of most relevance to the atmospheric radiation community.
Detailed information about the EOS Terra Validation Program can be found on the EOS Validation Program homepage (http://ospso.gsfc.nasa.gov/validation/valpage.html).

  9. Cutting the Wires: Modularization of Cellular Networks for Experimental Design

    PubMed Central

    Lang, Moritz; Summers, Sean; Stelling, Jörg

    2014-01-01

    Understanding naturally evolved cellular networks requires the consecutive identification and revision of the interactions between relevant molecular species. In this process, initially often simplified and incomplete networks are extended by integrating new reactions or whole subnetworks to increase consistency between model predictions and new measurement data. However, increased consistency with experimental data alone is not sufficient to show the existence of biomolecular interactions, because the interplay of different potential extensions might lead to overall similar dynamics. Here, we present a graph-based modularization approach to facilitate the design of experiments targeted at independently validating the existence of several potential network extensions. Our method is based on selecting the outputs to measure during an experiment, such that each potential network extension becomes virtually insulated from all others during data analysis. Each output defines a module that only depends on one hypothetical network extension, and all other outputs act as virtual inputs to achieve insulation. Given appropriate experimental time-series measurements of the outputs, our modules can be analyzed, simulated, and compared to the experimental data separately. Our approach exemplifies the close relationship between structural systems identification and modularization, an interplay that promises development of related approaches in the future. PMID:24411264

  10. Suborbital Science Program

    NASA Technical Reports Server (NTRS)

    Vachon, Jacques; Curry, Robert E.

    2010-01-01

    Program Objectives: 1) Satellite Calibration and Validation: Provide methods to perform the cal/val requirements for Earth Observing System satellites. 2) New Sensor Development: Provide methods to reduce risk for new sensor concepts and algorithm development prior to committing sensors to operations. 3) Process Studies: Facilitate the acquisition of high spatial/temporal resolution focused measurements that are required to understand small atmospheric and surface structures which generate powerful Earth system effects. 4) Airborne Networking: Develop disruption-tolerant networking to enable integrated multiple scale measurements of critical environmental features. Dryden Capabilities include: a) Aeronautics history of aircraft developments and milestones. b) Extensive history and experience in instrument integration. c) Extensive history and experience in aircraft modifications. d) Strong background in international deployments. e) Long history of reliable and dependable execution of projects. f) Varied aircraft types providing different capabilities, performance and duration.

  11. TMATS/ IHAL/ DDML Schema Validation

    DTIC Science & Technology

    2017-02-01

    The task was to create a method for performing IRIG eXtensible Markup Language (XML) schema validation, as opposed to XML instance document validation. (TMATS/IHAL/DDML Schema Validation, RCC 126-17, February 2017. Acronyms include DDML, Data Display Markup Language, and HUD, heads-up display.)

  12. Review and assessment of turbulence models for hypersonic flows

    NASA Astrophysics Data System (ADS)

    Roy, Christopher J.; Blottner, Frederick G.

    2006-10-01

    Accurate aerodynamic prediction is critical for the design and optimization of hypersonic vehicles. Turbulence modeling remains a major source of uncertainty in the computational prediction of aerodynamic forces and heating for these systems. The first goal of this article is to update the previous comprehensive review of hypersonic shock/turbulent boundary-layer interaction experiments published in 1991 by Settles and Dodson (Hypersonic shock/boundary-layer interaction database. NASA CR 177577, 1991). In their review, Settles and Dodson developed a methodology for assessing experiments appropriate for turbulence model validation and critically surveyed the existing hypersonic experiments. We limit the scope of our current effort by considering only two-dimensional (2D)/axisymmetric flows in the hypersonic flow regime where calorically perfect gas models are appropriate. We extend the prior database of recommended hypersonic experiments (on four 2D and two 3D shock-interaction geometries) by adding three new geometries. The first two geometries, the flat plate/cylinder and the sharp cone, are canonical, zero-pressure gradient flows which are amenable to theory-based correlations, and these correlations are discussed in detail. The third geometry added is the 2D shock impinging on a turbulent flat plate boundary layer. The current 2D hypersonic database for shock-interaction flows thus consists of nine experiments on five different geometries. The second goal of this study is to review and assess the validation usage of various turbulence models on the existing experimental database. Here we limit the scope to one- and two-equation turbulence models where integration to the wall is used (i.e., we omit studies involving wall functions). A methodology for validating turbulence models is given, followed by an extensive evaluation of the turbulence models on the current hypersonic experimental database. 
A total of 18 one- and two-equation turbulence models are reviewed, and results of turbulence model assessments for the six models that have been extensively applied to the hypersonic validation database are compiled and presented in graphical form. While some of the turbulence models do provide reasonable predictions for the surface pressure, the predictions for surface heat flux are generally poor, and often in error by a factor of four or more. In the vast majority of the turbulence model validation studies we review, the authors fail to adequately address the numerical accuracy of the simulations (i.e., discretization and iterative error) and the sensitivities of the model predictions to freestream turbulence quantities or near-wall y+ mesh spacing. We recommend new hypersonic experiments be conducted which (1) measure not only surface quantities but also mean and fluctuating quantities in the interaction region and (2) provide careful estimates of both random experimental uncertainties and correlated bias errors for the measured quantities and freestream conditions. For the turbulence models, we recommend that a wide range of turbulence models (including newer models) be re-examined on the current hypersonic experimental database, including the more recent experiments. Any future turbulence model validation efforts should carefully assess the numerical accuracy and model sensitivities. In addition, model corrections (e.g., compressibility corrections) should be carefully examined for their effects on a standard, low-speed validation database. Finally, as new experiments or direct numerical simulation data become available with information on mean and fluctuating quantities, they should be used to improve the turbulence models and thus increase their predictive capability.

  13. Spring 2013 Graduate Engineering Internship Summary

    NASA Technical Reports Server (NTRS)

    Ehrlich, Joshua

    2013-01-01

    In the spring of 2013, I participated in the National Aeronautics and Space Administration (NASA) Pathways Intern Employment Program at the Kennedy Space Center (KSC) in Florida. This was my final internship opportunity with NASA, a third consecutive extension from a summer 2012 internship. Since the start of my tenure here at KSC, I have gained an invaluable depth of engineering knowledge and extensive hands-on experience. These opportunities have granted me the ability to enhance my systems engineering approach in the field of payload design and testing as well as develop a strong foundation in the area of composite fabrication and testing for repair design on space vehicle structures. As a systems engineer, I supported the systems engineering and integration team with final acceptance testing of the Vegetable Production System, commonly referred to as Veggie. Verification and validation (V and V) of Veggie was carried out prior to qualification testing of the payload, which incorporated the process of confirming the system's design requirements dependent on one or more validation methods: inspection, analysis, demonstration, and testing.

  14. Theory and simulations of covariance mapping in multiple dimensions for data analysis in high-event-rate experiments

    NASA Astrophysics Data System (ADS)

    Zhaunerchyk, V.; Frasinski, L. J.; Eland, J. H. D.; Feifel, R.

    2014-05-01

    Multidimensional covariance analysis and its validity for correlation of processes leading to multiple products are investigated from a theoretical point of view. The need to correct for false correlations induced by experimental parameters which fluctuate from shot to shot, such as the intensity of self-amplified spontaneous emission x-ray free-electron laser pulses, is emphasized. Threefold covariance analysis based on simple extension of the two-variable formulation is shown to be valid for variables exhibiting Poisson statistics. In this case, false correlations arising from fluctuations in an unstable experimental parameter that scale linearly with signals can be eliminated by threefold partial covariance analysis, as defined here. Fourfold covariance based on the same simple extension is found to be invalid in general. Where fluctuations in an unstable parameter induce nonlinear signal variations, a technique of contingent covariance analysis is proposed here to suppress false correlations. In this paper we also show a method to eliminate false correlations associated with fluctuations of several unstable experimental parameters.
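The twofold partial covariance correction underlying this analysis can be sketched in a few lines; the following is an illustrative implementation only, assuming per-shot spectra stored as NumPy arrays (the function and variable names are hypothetical, not taken from the paper):

```python
import numpy as np

def partial_covariance_map(X, Y, I):
    """Two-variable partial covariance map, correcting for a fluctuating
    experimental parameter I (e.g. shot-to-shot FEL pulse intensity).

    X, Y : per-shot spectra, shapes (n_shots, n_x) and (n_shots, n_y)
    I    : the unstable parameter, shape (n_shots,)

    Returns the (n_x, n_y) map
        pcov(X, Y; I) = cov(X, Y) - cov(X, I) * cov(I, Y) / var(I),
    which removes false correlations that scale linearly with I.
    """
    n = X.shape[0]
    Xc = X - X.mean(axis=0)
    Yc = Y - Y.mean(axis=0)
    Ic = I - I.mean()
    cov_xy = Xc.T @ Yc / n            # plain covariance map, (n_x, n_y)
    cov_xi = Xc.T @ Ic / n            # correlation of each X channel with I
    cov_iy = Ic @ Yc / n              # correlation of I with each Y channel
    var_i = np.mean(Ic ** 2)
    return cov_xy - np.outer(cov_xi, cov_iy) / var_i
```

In a simulated dataset where both signals scale linearly with I but are otherwise independent, the plain covariance map shows strong false correlations while the partial covariance map is consistent with zero.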

  15. European consensus on a competency-based virtual reality training program for basic endoscopic surgical psychomotor skills.

    PubMed

    van Dongen, Koen W; Ahlberg, Gunnar; Bonavina, Luigi; Carter, Fiona J; Grantcharov, Teodor P; Hyltander, Anders; Schijven, Marlies P; Stefani, Alessandro; van der Zee, David C; Broeders, Ivo A M J

    2011-01-01

    Virtual reality (VR) simulators have been demonstrated to improve basic psychomotor skills in endoscopic surgery. The exercise configuration settings used for validation in studies published so far are default settings or are based on the personal choice of the tutors. The purpose of this study was to establish consensus on exercise configurations and on a validated training program for a virtual reality simulator, based on the experience of international experts, in order to set criterion levels and construct a proficiency-based training program. A consensus meeting was held with eight European teams, all extensively experienced in using the VR simulator. Construct validity of the training program was tested by 20 experts and 60 novices. The data were analyzed by using the t test for equality of means. Consensus was achieved on training designs, exercise configuration, and examination. Almost all exercises (7/8) showed construct validity. In total, 50 of 94 parameters (53%) showed significant difference. A European, multicenter, validated training program was constructed according to the general consensus of a large international team with extensive experience in virtual reality simulation. Therefore, a proficiency-based training program can be offered to training centers that use this simulator for training in basic psychomotor skills in endoscopic surgery.
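The expert-versus-novice comparison described here rests on a two-sample t test for equality of means. A minimal sketch, with hypothetical variable names rather than the study's actual data:

```python
import numpy as np

def pooled_t_statistic(a, b):
    """Two-sample t statistic assuming equal variances, as in a
    standard t test for equality of means."""
    a = np.asarray(a, dtype=float)
    b = np.asarray(b, dtype=float)
    na, nb = len(a), len(b)
    # Pooled variance across the two groups
    sp2 = ((na - 1) * a.var(ddof=1) + (nb - 1) * b.var(ddof=1)) / (na + nb - 2)
    return (a.mean() - b.mean()) / np.sqrt(sp2 * (1.0 / na + 1.0 / nb))
```

With 20 experts and 60 novices the statistic has 78 degrees of freedom, so |t| above roughly 2 corresponds to a significant difference at the 5% level; parameters passing this test are the ones said to show construct validity.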

  16. JaCVAM-organized international validation study of the in vivo rodent alkaline comet assay for the detection of genotoxic carcinogens: I. Summary of pre-validation study results.

    PubMed

    Uno, Yoshifumi; Kojima, Hajime; Omori, Takashi; Corvi, Raffaella; Honma, Masamistu; Schechtman, Leonard M; Tice, Raymond R; Burlinson, Brian; Escobar, Patricia A; Kraynak, Andrew R; Nakagawa, Yuzuki; Nakajima, Madoka; Pant, Kamala; Asano, Norihide; Lovell, David; Morita, Takeshi; Ohno, Yasuo; Hayashi, Makoto

    2015-07-01

    The in vivo rodent alkaline comet assay (comet assay) is used internationally to investigate the in vivo genotoxic potential of test chemicals. This assay, however, has not previously been formally validated. The Japanese Center for the Validation of Alternative Methods (JaCVAM), with the cooperation of the U.S. NTP Interagency Center for the Evaluation of Alternative Toxicological Methods (NICEATM)/the Interagency Coordinating Committee on the Validation of Alternative Methods (ICCVAM), the European Centre for the Validation of Alternative Methods (ECVAM), and the Japanese Environmental Mutagen Society/Mammalian Mutagenesis Study Group (JEMS/MMS), organized an international validation study to evaluate the reliability and relevance of the assay for identifying genotoxic carcinogens, using liver and stomach as target organs. The ultimate goal of this validation effort was to establish an Organisation for Economic Co-operation and Development (OECD) test guideline. The purpose of the pre-validation studies (i.e., Phase 1 through 3), conducted in four or five laboratories with extensive comet assay experience, was to optimize the protocol to be used during the definitive validation study. Copyright © 2015 Elsevier B.V. All rights reserved.

  17. Cutting the wires: modularization of cellular networks for experimental design.

    PubMed

    Lang, Moritz; Summers, Sean; Stelling, Jörg

    2014-01-07

    Understanding naturally evolved cellular networks requires the consecutive identification and revision of the interactions between relevant molecular species. In this process, initially often simplified and incomplete networks are extended by integrating new reactions or whole subnetworks to increase consistency between model predictions and new measurement data. However, increased consistency with experimental data alone is not sufficient to show the existence of biomolecular interactions, because the interplay of different potential extensions might lead to overall similar dynamics. Here, we present a graph-based modularization approach to facilitate the design of experiments targeted at independently validating the existence of several potential network extensions. Our method is based on selecting the outputs to measure during an experiment, such that each potential network extension becomes virtually insulated from all others during data analysis. Each output defines a module that only depends on one hypothetical network extension, and all other outputs act as virtual inputs to achieve insulation. Given appropriate experimental time-series measurements of the outputs, our modules can be analyzed, simulated, and compared to the experimental data separately. Our approach exemplifies the close relationship between structural systems identification and modularization, an interplay that promises development of related approaches in the future. Copyright © 2014 Biophysical Society. Published by Elsevier Inc. All rights reserved.

  18. Transport methods and interactions for space radiations

    NASA Technical Reports Server (NTRS)

    Wilson, John W.; Townsend, Lawrence W.; Schimmerling, Walter S.; Khandelwal, Govind S.; Khan, Ferdous S.; Nealy, John E.; Cucinotta, Francis A.; Simonsen, Lisa C.; Shinn, Judy L.; Norbury, John W.

    1991-01-01

    A review of the program in space radiation protection at the Langley Research Center is given. The relevant Boltzmann equations are given with a discussion of approximation procedures for space applications. The interaction coefficients are related to solution of the many-body Schroedinger equation with nuclear and electromagnetic forces. Various solution techniques are discussed to obtain relevant interaction cross sections with extensive comparison with experiments. Solution techniques for the Boltzmann equations are discussed in detail. Transport computer code validation is discussed through analytical benchmarking, comparison with other codes, comparison with laboratory experiments and measurements in space. Applications to lunar and Mars missions are discussed.

  19. Validating Quantitative Measurement Using Qualitative Data: Combining Rasch Scaling and Latent Semantic Analysis in Psychiatry

    NASA Astrophysics Data System (ADS)

    Lange, Rense

    2015-02-01

    An extension of concurrent validity is proposed that uses qualitative data for the purpose of validating quantitative measures. The approach relies on Latent Semantic Analysis (LSA), which places verbal (written) statements in a high-dimensional semantic space. Using data from a medical/psychiatric domain as a case study, Near-Death Experiences (NDE), we established concurrent validity by connecting NDErs' qualitative (written) experiential accounts with their locations on a Rasch-scalable measure of NDE intensity. Concurrent validity received strong empirical support, since the variance in the Rasch measures could be predicted reliably from the coordinates of the accounts in the LSA-derived semantic space (R2 = 0.33). These coordinates also predicted NDErs' age with considerable precision (R2 = 0.25). Both estimates are probably artificially low due to the small available data samples (n = 588). It appears that Rasch scalability of NDE intensity is a prerequisite for these findings, as each intensity level is associated (at least probabilistically) with a well-defined pattern of item endorsements.
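The LSA step amounts to a truncated SVD of a document-term matrix, after which the quantitative measure is regressed on the resulting coordinates and the fit summarized by R2. A minimal sketch under those assumptions (names are hypothetical, and a bag-of-words matrix stands in for the real text preprocessing):

```python
import numpy as np

def lsa_coordinates(doc_term, k):
    """Map each document (row of the document-term matrix) into a
    k-dimensional latent semantic space via truncated SVD."""
    U, s, _ = np.linalg.svd(doc_term, full_matrices=False)
    return U[:, :k] * s[:k]

def r_squared(coords, target):
    """R^2 of an ordinary least-squares fit of target on the coordinates."""
    X = np.column_stack([np.ones(len(coords)), coords])
    beta, *_ = np.linalg.lstsq(X, target, rcond=None)
    resid = target - X @ beta
    return 1.0 - resid.var() / np.var(target)
```

On synthetic data in which a hidden "intensity" drives both the word counts and the target measure, the LSA coordinates recover most of the predictable variance.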

  20. Investigation of the effect of the ejector on the performance of the pulse detonation engine nozzle extension

    NASA Astrophysics Data System (ADS)

    Korobov, A. E.; Golovastov, S. V.

    2015-11-01

    The influence of an ejector nozzle extension on the gas flow of a pulse detonation engine was investigated numerically and experimentally. Detonation was initiated in a stoichiometric hydrogen-oxygen mixture in a cylindrical detonation tube. A cylindrical ejector was constructed and mounted at the open end of the tube. Thrust, air consumption, and detonation parameters were measured in single and multiple regimes of operation. An axisymmetric model was used in the numerical investigation. The Navier-Stokes equations were solved using a Roe finite-difference scheme of second-order accuracy. Initial conditions were estimated on the basis of experimental data. Numerical results were validated against the experimental data.

  1. New mechanistically based model for predicting reduction of biosolids waste by ozonation of return activated sludge.

    PubMed

    Isazadeh, Siavash; Feng, Min; Urbina Rivas, Luis Enrique; Frigon, Dominic

    2014-04-15

    Two pilot-scale activated sludge reactors were operated for 98 days to provide the necessary data to develop and validate a new mathematical model predicting the reduction of biosolids production by ozonation of the return activated sludge (RAS). Three ozone doses were tested during the study. In addition to the pilot-scale study, laboratory-scale experiments were conducted with mixed liquor suspended solids and with pure cultures to parameterize the biomass inactivation process during exposure to ozone. The experiments revealed that biomass inactivation occurred even at the lowest doses, but that it was not associated with extensive COD solubilization. For validation, the model was used to simulate the temporal dynamics of the pilot-scale operational data. Increasing the description accuracy of the inactivation process improved the precision of the model in predicting the operational data. Copyright © 2014 Elsevier B.V. All rights reserved.

  2. Four experimental demonstrations of active vibration control for flexible structures

    NASA Technical Reports Server (NTRS)

    Phillips, Doug; Collins, Emmanuel G., Jr.

    1990-01-01

    Laboratory experiments designed to test prototype active-vibration-control systems under development for future flexible space structures are described, summarizing previously reported results. The control-synthesis technique employed for all four experiments was the maximum-entropy optimal-projection (MEOP) method (Bernstein and Hyland, 1988). Consideration is given to: (1) a pendulum experiment on large-amplitude LF dynamics; (2) a plate experiment on broadband vibration suppression in a two-dimensional structure; (3) a multiple-hexagon experiment combining the factors studied in (1) and (2) to simulate the complexity of a large space structure; and (4) the NASA Marshall ACES experiment on a lightweight deployable 45-foot beam. Extensive diagrams, drawings, graphs, and photographs are included. The results are shown to validate the MEOP design approach, demonstrating that good performance is achievable using relatively simple low-order decentralized controllers.

  3. A Comprehensive Validation Approach Using The RAVEN Code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Alfonsi, Andrea; Rabiti, Cristian; Cogliati, Joshua J

    2015-06-01

    The RAVEN computer code, developed at the Idaho National Laboratory, is a generic software framework for performing parametric and probabilistic analysis based on the response of complex system codes. RAVEN is a multi-purpose probabilistic and uncertainty quantification platform, capable of communicating with any system code. A natural extension of the RAVEN capabilities is the implementation of an integrated validation methodology, involving several different metrics, that represents an evolution of the methods currently used in the field. State-of-the-art validation approaches use neither exploration of the input space through sampling strategies, nor a comprehensive variety of metrics needed to interpret the code responses with respect to experimental data. The RAVEN code addresses both of these gaps. In the following sections, the employed methodology and its application to the newly developed thermal-hydraulic code RELAP-7 are reported. The validation approach has been applied to an integral effect experiment representing natural circulation, based on the activities performed by EG&G Idaho. Four different experiment configurations have been considered and nodalized.

  4. Analysis of in situ resources for the Soil Moisture Active Passive Validation Experiments in 2015 and 2016

    NASA Astrophysics Data System (ADS)

    Cosh, M. H.; Jackson, T. J.; Colliander, A.; Bindlish, R.; McKee, L.; Goodrich, D. C.; Prueger, J. H.; Hornbuckle, B. K.; Coopersmith, E. J.; Holifield Collins, C.; Smith, J.

    2016-12-01

    With the launch of the Soil Moisture Active Passive (SMAP) mission in 2015, a new era of soil moisture monitoring began. Soil moisture is available on a near-daily basis at a 36 km resolution for the globe. But this dataset is only valuable if its products are accurate and reliable. Therefore, in order to demonstrate the accuracy of the soil moisture product, NASA enacted an extensive calibration and validation program, with many in situ soil moisture networks contributing data across a variety of landscape regimes. However, not all questions can be answered by these networks. As a result, two intensive field experiments were executed to provide more detailed reference points for calibration and validation. Multi-week field campaigns were conducted in Arizona and Iowa at the USDA Agricultural Research Service Walnut Gulch and South Fork Experimental Watersheds, respectively. Aircraft observations were made to provide a high-resolution data product. Soil moisture, soil roughness, and vegetation data were collected at high resolution to provide a downscaled dataset to compare against aircraft and satellite estimates.

  5. FuGEFlow: data model and markup language for flow cytometry.

    PubMed

    Qian, Yu; Tchuvatkina, Olga; Spidlen, Josef; Wilkinson, Peter; Gasparetto, Maura; Jones, Andrew R; Manion, Frank J; Scheuermann, Richard H; Sekaly, Rafick-Pierre; Brinkman, Ryan R

    2009-06-16

    Flow cytometry technology is widely used in both health care and research. The rapid expansion of flow cytometry applications has outpaced the development of data storage and analysis tools. Collaborative efforts being taken to eliminate this gap include building common vocabularies and ontologies, designing generic data models, and defining data exchange formats. The Minimum Information about a Flow Cytometry Experiment (MIFlowCyt) standard was recently adopted by the International Society for Advancement of Cytometry. This standard guides researchers on the information that should be included in peer reviewed publications, but it is insufficient for data exchange and integration between computational systems. The Functional Genomics Experiment (FuGE) formalizes common aspects of comprehensive and high throughput experiments across different biological technologies. We have extended the FuGE object model to accommodate flow cytometry data and metadata. We used the MagicDraw modelling tool to design a UML model (Flow-OM) according to the FuGE extension guidelines and the AndroMDA toolkit to transform the model to a markup language (Flow-ML). We mapped each MIFlowCyt term to either an existing FuGE class or to a new FuGEFlow class. The development environment was validated by comparing the official FuGE XSD to the schema we generated from the FuGE object model using our configuration. After the Flow-OM model was completed, the final version of the Flow-ML was generated and validated against an example MIFlowCyt-compliant experiment description. The extension of FuGE for flow cytometry has resulted in a generic FuGE-compliant data model (FuGEFlow), which accommodates and links together all information required by MIFlowCyt. The FuGEFlow model can be used to build software and databases using FuGE software toolkits to facilitate automated exchange and manipulation of potentially large flow cytometry experimental data sets. 
Additional project documentation, including reusable design patterns and a guide for setting up a development environment, was contributed back to the FuGE project. We have shown that an extension of FuGE can be used to transform minimum information requirements in natural language to markup language in XML. Extending FuGE required significant effort, but in our experiences the benefits outweighed the costs. The FuGEFlow is expected to play a central role in describing flow cytometry experiments and ultimately facilitating data exchange including public flow cytometry repositories currently under development.

  6. EOS Terra Validation Program

    NASA Technical Reports Server (NTRS)

    Starr, David

    2000-01-01

    The EOS Terra mission will be launched in July 1999. This mission has great relevance to the atmospheric radiation community and global change issues. Terra instruments include Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER), Clouds and Earth's Radiant Energy System (CERES), Multi-Angle Imaging Spectroradiometer (MISR), Moderate Resolution Imaging Spectroradiometer (MODIS) and Measurements of Pollution in the Troposphere (MOPITT). In addition to the fundamental radiance data sets, numerous global science data products will be generated, including various Earth radiation budget, cloud and aerosol parameters, as well as land surface, terrestrial ecology, ocean color, and atmospheric chemistry parameters. Significant investments have been made in on-board calibration to ensure the quality of the radiance observations. A key component of the Terra mission is the validation of the science data products. This is essential for a mission focused on global change issues and the underlying processes. The Terra algorithms have been subject to extensive pre-launch testing with field data whenever possible. Intensive efforts will be made to validate the Terra data products after launch. These include validation of instrument calibration (vicarious calibration) experiments, instrument and cross-platform comparisons, routine collection of high quality correlative data from ground-based networks, such as AERONET, and intensive sites, such as the SGP ARM site, as well as a variety of field experiments, cruises, etc. Airborne simulator instruments have been developed for the field experiment and underflight activities including the MODIS Airborne Simulator (MAS), AirMISR, MASTER (MODIS-ASTER), and MOPITT-A. All are integrated on the NASA ER-2, though low altitude platforms are more typically used for MASTER. MATR is an additional sensor used for MOPITT algorithm development and validation. 
The intensive validation activities planned for the first year of the Terra mission will be described with emphasis on derived geophysical parameters of most relevance to the atmospheric radiation community.

  7. Calibration of the LHAASO-KM2A electromagnetic particle detectors using charged particles within the extensive air showers

    NASA Astrophysics Data System (ADS)

    Lv, Hongkui; He, Huihai; Sheng, Xiangdong; Liu, Jia; Chen, Songzhan; Liu, Ye; Hou, Chao; Zhao, Jing; Zhang, Zhongquan; Wu, Sha; Wang, Yaping; Lhaaso Collaboration

    2018-07-01

    In the Large High Altitude Air Shower Observatory (LHAASO), a one-square-kilometer array (KM2A), with 5242 electromagnetic particle detectors (EDs) and 1171 muon detectors (MDs), is designed to study ultra-high-energy gamma-ray astronomy and cosmic-ray physics. The remoteness of the site and the large number of detectors demand a robust and automatic calibration procedure. In this paper, a self-calibration method that relies on the measurement of charged particles within extensive air showers is proposed. The method is fully validated by Monte Carlo simulation and successfully applied in a KM2A prototype array experiment. Experimental results show that the self-calibration method can determine the detector time-offset constants at the sub-nanosecond level and the number density of particles collected by each ED with an accuracy of a few percent, which is adequate to meet the physical requirements of the LHAASO experiment. This software calibration also offers an ideal way to monitor detector performance in real time for next-generation ground-based EAS experiments covering areas above the square-kilometer scale.
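The time-offset part of such a self-calibration can be illustrated with a toy model: if each shower front is approximately planar across the array, a static per-detector offset shows up as the mean residual of a plane fit to the hit times, averaged over many events. The sketch below is only an idealization with hypothetical names, not the KM2A procedure; note that offsets are determined only up to a planar component, which the front fit absorbs.

```python
import numpy as np

def estimate_time_offsets(x, y, t_hits):
    """Estimate per-detector time offsets from many shower events.

    x, y   : detector coordinates, shape (n_det,)
    t_hits : measured hit times, shape (n_events, n_det); each event's
             true shower front is modeled as a plane t = a*x + b*y + c.

    For each event the least-squares plane-fit residual is (I - P) t,
    where P projects onto span{x, y, 1}; averaging residuals over many
    events isolates the static per-detector offsets (modulo any planar
    component, which is indistinguishable from the front itself).
    """
    A = np.column_stack([x, y, np.ones_like(x)])
    P = A @ np.linalg.pinv(A)                  # projector onto plane model
    resid = t_hits @ (np.eye(len(x)) - P).T    # per-event fit residuals
    return resid.mean(axis=0)
```

Averaging over events suppresses per-hit timing noise by a factor of sqrt(n_events), which is how sub-nanosecond offset constants can be extracted from nanosecond-scale single-hit resolution.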

  8. Development of an ethanol model using social insects: IV. Influence of ethanol on the aggression of Africanized honey bees (Apis mellifera L.).

    PubMed

    Abramson, Charles I; Place, Aaron J; Aquino, Italo S; Fernandez, Andrea

    2004-06-01

    Experiments were designed to determine whether ethanol influenced aggression in honey bees. Two experiments are reported. In Exp. 1, harnessed honey bees were fed a 1%, 5%, 10%, or 20% ethanol solution. Two control groups received either a sucrose solution only or no pretreatment. The dependent variable was the number of sting extensions over 10 min. Analysis showed that aggression in harnessed bees was not influenced by prior ethanol consumption. Because there was some suspicion that extension of the sting apparatus may be hindered by harnessing, and the authors wanted a design with increased ecological validity, Exp. 2 was conducted with free-flying bees. Sucrose or 20% ethanol solutions were placed in front of beehives, and the number of stings on a leather patch dangled in front of the hive served as the dependent variable. The experiment was terminated after 5 hr because bees exposed to ethanol became dangerously aggressive. A unique aspect of the study was that Africanized honey bees were used.

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    English, Shawn Allen; Nelson, Stacy Michelle; Briggs, Timothy

    Presented is a model verification and validation effort using low-velocity impact (LVI) of carbon fiber reinforced polymer laminate experiments. A flat cylindrical indenter impacts the laminate with enough energy to produce delamination, matrix cracks, and fiber breaks. Included in the experimental efforts are ultrasonic scans of the damage for qualitative validation of the models. However, the primary quantitative metrics of validation are the force time history measured through the instrumented indenter and the initial and final velocities. The simulations, which are run on Sandia's Sierra finite element codes, consist of all physics and material parameters of importance as determined by a sensitivity analysis conducted on the LVI simulation. A novel orthotropic damage and failure constitutive model that is capable of predicting progressive composite damage and failure is described in detail, and material properties are measured, estimated from micromechanics, or optimized through calibration. A thorough verification and calibration to the accompanying experiments are presented. Special emphasis is given to the four-point bend experiment. For all simulations of interest, the mesh and material behavior are verified through extensive convergence studies. An ensemble of simulations incorporating model parameter uncertainties is used to predict a response distribution, which is then compared to experimental output. The result is a quantifiable confidence in material characterization and model physics when simulating this phenomenon in structures of interest.

  10. Verification and Validation Strategy for LWRS Tools

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carl M. Stoots; Richard R. Schultz; Hans D. Gougar

    2012-09-01

    One intention of the Department of Energy (DOE) Light Water Reactor Sustainability (LWRS) program is to create advanced computational tools for safety assessment that enable more accurate representation of a nuclear power plant's safety margin. These tools are to be used to study the unique issues posed by lifetime extension and relicensing of the existing operating fleet of nuclear power plants well beyond their first license extension period. The extent to which new computational models / codes such as RELAP-7 can be used for reactor licensing / relicensing activities depends mainly upon the thoroughness with which they have been verified and validated (V&V). This document outlines the LWRS program strategy by which RELAP-7 code V&V planning is to be accomplished. From the perspective of developing and applying thermal-hydraulic and reactivity-specific models to reactor systems, the US Nuclear Regulatory Commission (NRC) Regulatory Guide 1.203 gives key guidance to numeric model developers and those tasked with the validation of numeric models. By creating Regulatory Guide 1.203, the NRC defined a framework for development, assessment, and approval of transient and accident analysis methods. As a result, this methodology is very relevant and is recommended as the path forward for RELAP-7 V&V. However, the unique issues posed by lifetime extension will require considerations in addition to those addressed in Regulatory Guide 1.203. Some of these include prioritization of which plants / designs should be studied first, coupling modern supporting experiments to the stringent needs of new high-fidelity models / codes, and scaling of aging effects.

  11. Vibration control of beams using stand-off layer damping: finite element modeling and experiments

    NASA Astrophysics Data System (ADS)

    Chaudry, A.; Baz, A.

    2006-03-01

    Damping treatments with stand-off layer (SOL) have been widely accepted as an attractive alternative to conventional constrained layer damping (CLD) treatments. Such an acceptance stems from the fact that the SOL, which is simply a slotted spacer layer sandwiched between the viscoelastic layer and the base structure, acts as a strain magnifier that considerably amplifies the shear strain and hence the energy dissipation characteristics of the viscoelastic layer. Accordingly, more effective vibration suppression can be achieved by using SOL as compared to employing CLD. In this paper, a comprehensive finite element model of the stand-off layer constrained damping treatment is developed. The model accounts for the geometrical and physical parameters of the slotted SOL, the viscoelastic layer, the constraining layer, and the base structure. The predictions of the model are validated against the predictions of a distributed transfer function model and a model built using a commercial finite element code (ANSYS). Furthermore, the theoretical predictions are validated experimentally for passive SOL treatments of different configurations. The obtained results indicate a close agreement between theory and experiments. Furthermore, the obtained results demonstrate the effectiveness of the CLD with SOL in enhancing the energy dissipation as compared to the conventional CLD. Extending the proposed one-dimensional CLD with SOL to more complex structures is a natural continuation of the present study.

  12. An extensive analysis of various texture feature extractors to detect Diabetes Mellitus using facial specific regions.

    PubMed

    Shu, Ting; Zhang, Bob; Yan Tang, Yuan

    2017-04-01

    Researchers have recently discovered that Diabetes Mellitus can be detected through a non-invasive computerized method. However, the focus has been on facial block color features. In this paper, we extensively study the effects of texture features extracted from facial specific regions at detecting Diabetes Mellitus using eight texture extractors. The eight methods are from four texture feature families: (1) statistical texture feature family: Image Gray-scale Histogram, Gray-level Co-occurrence Matrix, and Local Binary Pattern, (2) structural texture feature family: Voronoi Tessellation, (3) signal processing based texture feature family: Gaussian, Steerable, and Gabor filters, and (4) model based texture feature family: Markov Random Field. To determine the most appropriate extractor and its optimal parameter(s), various parameter settings of each extractor were tested. For each extractor, the same dataset (284 Diabetes Mellitus and 231 Healthy samples), classifiers (k-Nearest Neighbors and Support Vector Machines), and validation method (10-fold cross validation) are used. According to the experiments, the first and third families achieved a better outcome at detecting Diabetes Mellitus than the other two. The best texture feature extractor for Diabetes Mellitus detection is the Image Gray-scale Histogram with bin number=256, obtaining an accuracy of 99.02%, a sensitivity of 99.64%, and a specificity of 98.26% by using SVM. Copyright © 2017 Elsevier Ltd. All rights reserved.
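    The best-performing extractor, the image gray-scale histogram with a tunable bin number, is simple to sketch in isolation. This is a minimal illustration; the function name and normalisation are assumptions, not the paper's code:

```python
def grayscale_histogram(pixels, bins=256, max_val=255):
    """Normalised gray-level histogram feature vector for a region of
    interest; `bins` mirrors the tunable bin-number parameter."""
    pixels = list(pixels)
    counts = [0] * bins
    width = (max_val + 1) / bins          # gray-level span per bin
    for p in pixels:
        counts[min(int(p / width), bins - 1)] += 1
    n = len(pixels)
    return [c / n for c in counts]        # fractions summing to 1
```

The resulting vector would then be fed to a classifier (k-NN or SVM in the paper) inside a 10-fold cross-validation loop.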

  13. Protocol for the validation of microbiological control of cellular products according to German regulators recommendations--Boon and Bane for the manufacturer.

    PubMed

    Störmer, M; Radojska, S; Hos, N J; Gathof, B S

    2015-04-01

    In order to generate standardized conditions for the microbiological control of HPCs, the PEI recommended defined steps for validation; following them leads to the extensive validation shown in this study, in which a possible validation principle for the microbiological control of allogeneic SCPs is presented. Although it could be demonstrated that automated culture improves the microbial safety of cellular products, the requirement for extensive validation studies needs to be considered. © 2014 International Society of Blood Transfusion.

  14. 8 CFR 1214.1 - Review of requirements for admission, extension, and maintenance of status.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ...) of the Act. Upon application for admission, the alien shall present a valid passport and valid visa... present a passport only if requested to do so by the Service. The passport of an alien applying for... conditions of his or her admission. The passport of an alien applying for extension of stay shall be valid at...

  15. 8 CFR 1214.1 - Review of requirements for admission, extension, and maintenance of status.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ...) of the Act. Upon application for admission, the alien shall present a valid passport and valid visa... present a passport only if requested to do so by the Service. The passport of an alien applying for... conditions of his or her admission. The passport of an alien applying for extension of stay shall be valid at...

  16. 8 CFR 1214.1 - Review of requirements for admission, extension, and maintenance of status.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ...) of the Act. Upon application for admission, the alien shall present a valid passport and valid visa... present a passport only if requested to do so by the Service. The passport of an alien applying for... conditions of his or her admission. The passport of an alien applying for extension of stay shall be valid at...

  17. 8 CFR 1214.1 - Review of requirements for admission, extension, and maintenance of status.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ...) of the Act. Upon application for admission, the alien shall present a valid passport and valid visa... present a passport only if requested to do so by the Service. The passport of an alien applying for... conditions of his or her admission. The passport of an alien applying for extension of stay shall be valid at...

  18. 8 CFR 1214.1 - Review of requirements for admission, extension, and maintenance of status.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ...) of the Act. Upon application for admission, the alien shall present a valid passport and valid visa... present a passport only if requested to do so by the Service. The passport of an alien applying for... conditions of his or her admission. The passport of an alien applying for extension of stay shall be valid at...

  19. Validation of a 30-year-old process for the manufacture of L-asparaginase from Erwinia chrysanthemi.

    PubMed

    Gervais, David; Allison, Nigel; Jennings, Alan; Jones, Shane; Marks, Trevor

    2013-04-01

    A 30-year-old manufacturing process for the biologic product L-asparaginase from the plant pathogen Erwinia chrysanthemi was rigorously qualified and validated, with a high level of agreement between validation data and the 6-year process database. L-Asparaginase exists in its native state as a tetrameric protein and is used as a chemotherapeutic agent in the treatment regimen for Acute Lymphoblastic Leukaemia (ALL). The manufacturing process involves fermentation of the production organism, extraction and purification of the L-asparaginase to make drug substance (DS), and finally formulation and lyophilisation to generate drug product (DP). The extensive manufacturing experience with the product was used to establish ranges for all process parameters and product quality attributes. The product and in-process intermediates were rigorously characterised, and new assays, such as size-exclusion and reversed-phase UPLC, were developed, validated, and used to analyse several pre-validation batches. Finally, three prospective process validation batches were manufactured and product quality data generated using both the existing and the new analytical methods. These data demonstrated the process to be robust, highly reproducible and consistent, and the validation was successful, contributing to the granting of an FDA product license in November, 2011.

  20. Soybean canopy reflectance modeling data sets

    NASA Technical Reports Server (NTRS)

    Ranson, K. J.; Biehl, L. L.; Daughtry, C. S. T.

    1984-01-01

    Numerous mathematical models of the interaction of radiation with vegetation canopies have been developed over the last two decades. However, data with which to exercise and validate these models are scarce. During three days in the summer of 1980, experiments were conducted with the objective of gaining insight about the effects of solar illumination and view angles on soybean canopy reflectance. In concert with these experiments, extensive measurements of the soybean canopies were obtained. This document is a compilation of the bidirectional reflectance factors, agronomic characteristics, canopy geometry, and leaf, stem, and pod optical properties of the soybean canopies. These data sets should be suitable for use with most vegetation canopy reflectance models.
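    A bidirectional reflectance factor of the kind compiled here is, in essence, the ratio of canopy radiance to the radiance of a reference panel viewed under the same illumination and view geometry. A minimal sketch, in which the panel-calibration handling is an assumption:

```python
def reflectance_factor(target_radiance, panel_radiance, panel_cal=1.0):
    """Bidirectional reflectance factor: canopy radiance over reference-
    panel radiance, scaled by the panel's calibration factor for its
    non-ideal (non-Lambertian) reflectance."""
    return panel_cal * target_radiance / panel_radiance
```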

  1. New Reactor Physics Benchmark Data in the March 2012 Edition of the IRPhEP Handbook

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    John D. Bess; J. Blair Briggs; Jim Gulliford

    2012-11-01

    The International Reactor Physics Experiment Evaluation Project (IRPhEP) was established to preserve integral reactor physics experimental data, including separate or special effects data, for nuclear energy and technology applications. The numerous experiments that have been performed worldwide represent a large investment of infrastructure, expertise, and cost, and are valuable resources of data for present and future research. These valuable assets provide the basis for recording, development, and validation of methods. If the experimental data are lost, the high cost to repeat many of these measurements may be prohibitive. The purpose of the IRPhEP is to provide an extensively peer-reviewed set of reactor physics-related integral data that can be used by reactor designers and safety analysts to validate the analytical tools used to design next-generation reactors and establish the safety basis for operation of these reactors. Contributors from around the world collaborate in the evaluation and review of selected benchmark experiments for inclusion in the International Handbook of Evaluated Reactor Physics Benchmark Experiments (IRPhEP Handbook) [1]. Several new evaluations have been prepared for inclusion in the March 2012 edition of the IRPhEP Handbook.

  2. Polarization of Cosmic Microwave Background

    NASA Astrophysics Data System (ADS)

    Buzzelli, A.; Cabella, P.; de Gasperis, G.; Vittorio, N.

    2016-02-01

    In this work we present an extension of the ROMA map-making code for data analysis of Cosmic Microwave Background polarization, with particular attention given to the inflationary polarization B-modes. The new algorithm takes into account a possible cross-correlated noise component among the different detectors of a CMB experiment. We tested the code on the observational data of the BOOMERanG (2003) experiment and show that it provides a better estimate of the power spectra; in particular, the error bars of the BB spectrum are smaller by up to 20% at low multipoles. We point out the general validity of the new method. A possible future application is the LSPE balloon experiment, devoted to the observation of polarization at large angular scales.
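    The effect of a cross-correlated noise component on map-making can be illustrated with a single map pixel observed by two detectors and estimated by generalized least squares. This is an illustrative sketch of the weighting principle, not the ROMA algorithm itself:

```python
def gls_pixel(d1, d2, s1, s2, c):
    """GLS estimate of one map pixel seen by two detectors whose noise
    has variances s1, s2 and cross-covariance c. With pointing matrix
    A = [1, 1]^T, the closed form of (A^T N^-1 A)^-1 A^T N^-1 d is:"""
    det = s1 * s2 - c * c          # determinant of the 2x2 noise matrix N
    w = s1 + s2 - 2 * c            # equals (A^T N^-1 A) * det
    m = (s2 * d1 + s1 * d2 - c * (d1 + d2)) / w   # pixel estimate
    var = det / w                  # its variance
    return m, var
```

Setting c = 0 recovers the ordinary inverse-variance average; a nonzero cross term changes both the weighting and the reported error bar, which is why ignoring it biases the spectrum uncertainties.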

  3. Results of Microgravity Fluid Dynamics Captured With the Spheres-Slosh Experiment

    NASA Technical Reports Server (NTRS)

    Lapilli, Gabriel; Kirk, Daniel; Gutierrez, Hector; Schallhorn, Paul; Marsell, Brandon; Roth, Jacob; Moder, Jeffrey

    2015-01-01

    This paper provides an overview of the SPHERES-Slosh Experiment (SSE) aboard the International Space Station (ISS) and presents on-orbit results with data analysis. In order to predict the location of the liquid propellant during all times of a spacecraft mission, engineers and mission analysts utilize Computational Fluid Dynamics (CFD). These state-of-the-art computer programs numerically solve the fluid flow equations to predict the location of the fluid at any point in time during different spacecraft maneuvers. The models and equations used by these programs have been extensively validated on the ground, but long duration data has never been acquired in a microgravity environment. The SSE aboard the ISS is designed to acquire this type of data, used by engineers on earth to validate and improve the CFD prediction models, improving the design of the next generation of space vehicles as well as the safety of current missions. The experiment makes use of two Synchronized Position Hold, Engage, Reorient Experimental Satellites (SPHERES) connected by a frame. In the center of the frame there is a plastic, pill shaped tank that is partially filled with green-colored water. A pair of high resolution cameras records the movement of the liquid inside the tank as the experiment maneuvers within the Japanese Experimental Module test volume. Inertial measurement units record the accelerations and rotations of the tank, making the combination of stereo imaging and inertial data the inputs for CFD model validation.

  4. Result of Microgravity Fluid Dynamics Captured with the SPHERES-Slosh Experiment

    NASA Technical Reports Server (NTRS)

    Lapilli, Gabriel; Kirk, Daniel; Gutierrez, Hector; Schallhorn, Paul; Marsell, Brandon; Roth, Jacob; Moder, Jeffrey

    2015-01-01

    This paper provides an overview of the SPHERES-Slosh Experiment (SSE) aboard the International Space Station (ISS) and presents on-orbit results with data analysis. In order to predict the location of the liquid propellant during all times of a spacecraft mission, engineers and mission analysts utilize Computational Fluid Dynamics (CFD). These state-of-the-art computer programs numerically solve the fluid flow equations to predict the location of the fluid at any point in time during different spacecraft maneuvers. The models and equations used by these programs have been extensively validated on the ground, but long duration data has never been acquired in a microgravity environment. The SSE aboard the ISS is designed to acquire this type of data, used by engineers on earth to validate and improve the CFD prediction models, improving the design of the next generation of space vehicles as well as the safety of current missions. The experiment makes use of two Synchronized Position Hold, Engage, Reorient Experimental Satellites (SPHERES) connected by a frame. In the center of the frame there is a plastic, pill shaped tank that is partially filled with green-colored water. A pair of high resolution cameras records the movement of the liquid inside the tank as the experiment maneuvers within the Japanese Experimental Module test volume. Inertial measurement units record the accelerations and rotations of the tank, making the combination of stereo imaging and inertial data the inputs for CFD model validation.

  5. Results of Microgravity Fluid Dynamics Captured with the Spheres-Slosh Experiment

    NASA Technical Reports Server (NTRS)

    Lapilli, Gabriel; Kirk, Daniel Robert; Gutierrez, Hector; Schallhorn, Paul; Marsell, Brandon; Roth, Jacob; Moder, Jeffrey

    2015-01-01

    This paper provides an overview of the SPHERES-Slosh Experiment (SSE) aboard the International Space Station (ISS) and presents on-orbit results with data analysis. In order to predict the location of the liquid propellant during all times of a spacecraft mission, engineers and mission analysts utilize Computational Fluid Dynamics (CFD). These state-of-the-art computer programs numerically solve the fluid flow equations to predict the location of the fluid at any point in time during different spacecraft maneuvers. The models and equations used by these programs have been extensively validated on the ground, but long duration data has never been acquired in a microgravity environment. The SSE aboard the ISS is designed to acquire this type of data, used by engineers on earth to validate and improve the CFD prediction models, improving the design of the next generation of space vehicles as well as the safety of current missions. The experiment makes use of two Synchronized Position Hold, Engage, Reorient Experimental Satellites (SPHERES) connected by a frame. In the center of the frame there is a plastic, pill shaped tank that is partially filled with green-colored water. A pair of high resolution cameras records the movement of the liquid inside the tank as the experiment maneuvers within the Japanese Experimental Module test volume. Inertial measurement units record the accelerations and rotations of the tank, making the combination of stereo imaging and inertial data the inputs for CFD model validation.

  6. Calculating the Bending Modulus for Multicomponent Lipid Membranes in Different Thermodynamic Phases

    PubMed Central

    2013-01-01

    We establish a computational approach to extract the bending modulus, KC, for lipid membranes from relatively small-scale molecular simulations. Fluctuations in the splay of individual pairs of lipids faithfully inform on KC in multicomponent membranes over a large range of rigidities in different thermodynamic phases. Predictions are validated by experiments even where the standard spectral analysis-based methods fail. The local nature of this method potentially allows its extension to calculations of KC in protein-laden membranes. PMID:24039553
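    An equipartition-style estimate of this kind can be sketched as follows, assuming the pairwise splay alpha is Gaussian about zero with a quadratic energy per lipid pair; the prefactor and normalisation below are assumptions for illustration, not the paper's exact expressions:

```python
from statistics import pvariance

def bending_modulus(splays, area_per_pair, kBT=1.0):
    """Sketch: if splay alpha has energy (KC/2) * a0 * alpha**2, then
    equipartition gives KC ~ kBT / (a0 * <alpha**2>). `splays` are
    per-pair splay samples from the simulation; `area_per_pair` is a0.
    Conventions differ between papers; treat this as illustrative."""
    return kBT / (area_per_pair * pvariance(splays, mu=0.0))
```

The appeal of the pairwise-splay route, as the abstract notes, is locality: only small patches of membrane need to be sampled, rather than the long-wavelength undulations required by spectral methods.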

  7. Methods for Reachability-based Hybrid Controller Design

    DTIC Science & Technology

    2012-05-10

    approaches for airport runways (Teo and Tomlin, 2003). The results of the reachability calculations were validated in extensive simulations as well as...UAV flight experiments (Jang and Tomlin, 2005; Teo, 2005). While the focus of these previous applications lies largely in safety verification, the work...B([15, 0], a0) × [−π, π]) \ V, ∀qi ∈ Q, where a0 = 30 m is the protected radius (chosen based upon published data of the wingspan of a Boeing KC-135

  8. Validation of stratospheric aerosol and gas experiments 1 and 2 satellite aerosol optical depth measurements using surface radiometer data

    NASA Technical Reports Server (NTRS)

    Kent, G. S.; Mccormick, M. P.; Wang, P.-H.

    1994-01-01

    The Stratospheric Aerosol Measurement 2, Stratospheric Aerosol and Gas Experiment (SAGE) 1, and SAGE 2 series of solar occultation satellite instruments were designed for the study of stratospheric aerosols and gases and have been extensively validated in the stratosphere. They are also capable, under cloud-free conditions, of measuring the extinction due to aerosols in the troposphere. Such tropospheric extinction measurements have yet to be validated by appropriate lidar and in situ techniques. In this paper, published atmospheric aerosol optical depth measurements, made from high-altitude observatories during volcanically quiet periods, have been compared with optical depths calculated from local SAGE 1 and SAGE 2 extinction profiles. Surface measurements from three such observatories have been used, one located in Hawaii and two within the continental United States. Data have been intercompared on a seasonal basis at wavelengths between 0.5 and 1.0 micron and found to agree within the range of measurement errors and expected atmospheric variation. The mean rms difference between the optical depths for corresponding satellite and surface measured data sets is 29%, and the mean ratio of the optical depths is 1.09.
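    The satellite-side optical depths compared here follow from vertically integrating an extinction profile. A minimal trapezoidal version (function name and units are illustrative):

```python
def optical_depth(alt_km, ext_per_km):
    """Column aerosol optical depth (dimensionless) from an extinction
    profile (per km) by trapezoidal integration over altitude (km)."""
    tau = 0.0
    for i in range(len(alt_km) - 1):
        tau += 0.5 * (ext_per_km[i] + ext_per_km[i + 1]) \
               * (alt_km[i + 1] - alt_km[i])
    return tau
```

Comparing such integrals against sun-photometer optical depths at matching wavelengths is the basis of the 29% rms-difference statistic quoted above.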

  9. Enabling Self-Monitoring Data Exchange in Participatory Medicine.

    PubMed

    Lopez-Campos, Guillermo; Ofoghi, Bahadorreza; Martin-Sanchez, Fernando

    2015-01-01

    The development of new methods, devices and apps for self-monitoring has enabled these approaches to be extended to consumer health and research purposes. The increase in the number and variety of devices has generated a complex scenario in which reporting guidelines and data exchange formats will be needed to ensure the quality of the information and the reproducibility of experimental results. Based on the Minimal Information for Self Monitoring Experiments (MISME) reporting guideline, we have developed an XML format (MISME-ML) to facilitate data exchange for self-monitoring experiments. We have also developed a sample instance to illustrate the concept and a Java MISME-ML validation tool. The implementation and adoption of these tools should contribute to the consolidation of a set of methods that ensure the reproducibility of self-monitoring experiments for research purposes.
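    A validation tool of the kind described can be sketched with stdlib XML parsing. The required element names below are hypothetical, since the actual MISME-ML schema is not reproduced here:

```python
import xml.etree.ElementTree as ET

# Hypothetical required MISME fields -- the real guideline's element
# names may differ.
REQUIRED = ("device", "subject", "measurement")

def validate_misme_ml(xml_text):
    """Check well-formedness and presence of required elements in a
    (hypothetical) MISME-ML document; returns (well_formed, missing)."""
    try:
        root = ET.fromstring(xml_text)
    except ET.ParseError:
        return False, list(REQUIRED)
    present = {el.tag for el in root.iter()}
    return True, [t for t in REQUIRED if t not in present]
```

A schema-based validator (XSD) would be the stricter production choice; this element-presence check just conveys the idea of machine-checking a reporting guideline.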

  10. The List of Threatening Experiences: the reliability and validity of a brief life events questionnaire.

    PubMed

    Brugha, T S; Cragg, D

    1990-07-01

    During the 23 years since the original work of Holmes & Rahe, research into stressful life events on human subjects has tended towards the development of longer and more complex inventories. The List of Threatening Experiences (LTE) of Brugha et al., by virtue of its brevity, overcomes difficulties of clinical application. In a study of 50 psychiatric patients and informants, the questionnaire version of the list (LTE-Q) was shown to have high test-retest reliability, and good agreement with informant information. Concurrent validity, based on the criterion of independently rated adversity derived from a semistructured life events interview, making use of the Life Events and Difficulties Scales (LEDS) method developed by Brown & Harris, showed both high specificity and sensitivity. The LTE-Q is particularly recommended for use in psychiatric, psychological and social studies in which other intervening variables such as social support, coping, and cognitive variables are of interest, and resources do not allow for the use of extensive interview measures of stress.

  11. A critical evaluation of the validity of episodic future thinking: A clinical neuropsychology perspective.

    PubMed

    Ward, Amanda M

    2016-11-01

    Episodic future thinking is defined as the ability to mentally simulate a future event. Although episodic future thinking has been studied extensively in neuroscience, this construct has not been explored in depth from the perspective of clinical neuropsychology. The aim of this critical narrative review is to assess the validity and clinical implications of episodic future thinking. A systematic review of episodic future thinking literature was conducted. PubMed and PsycInfo were searched through July 2015 for review and empirical articles with the following search terms: "episodic future thinking," "future mental simulation," "imagining the future," "imagining new experiences," "future mental time travel," "future autobiographical experience," and "prospection." The review discusses evidence that episodic future thinking is important for adaptive functioning, which has implications for neurological populations. To determine the validity of episodic future thinking, the construct is evaluated with respect to related constructs, such as imagination, episodic memory, autobiographical memory, prospective memory, narrative construction, and working memory. Although it has been minimally investigated, there is evidence of convergent and discriminant validity for episodic future thinking. Research has not addressed the incremental validity of episodic future thinking. Practical considerations of episodic future thinking tasks and related constructs in a clinical neuropsychological setting are considered. The utility of episodic future thinking is currently unknown due to the lack of research investigating the validity of episodic future thinking. Future work is discussed, which could determine whether episodic future thinking is an important missing piece in standard clinical neuropsychological assessment. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  12. FuGEFlow: data model and markup language for flow cytometry

    PubMed Central

    Qian, Yu; Tchuvatkina, Olga; Spidlen, Josef; Wilkinson, Peter; Gasparetto, Maura; Jones, Andrew R; Manion, Frank J; Scheuermann, Richard H; Sekaly, Rafick-Pierre; Brinkman, Ryan R

    2009-01-01

    Background: Flow cytometry technology is widely used in both health care and research. The rapid expansion of flow cytometry applications has outpaced the development of data storage and analysis tools. Collaborative efforts being taken to eliminate this gap include building common vocabularies and ontologies, designing generic data models, and defining data exchange formats. The Minimum Information about a Flow Cytometry Experiment (MIFlowCyt) standard was recently adopted by the International Society for Advancement of Cytometry. This standard guides researchers on the information that should be included in peer-reviewed publications, but it is insufficient for data exchange and integration between computational systems. The Functional Genomics Experiment (FuGE) formalizes common aspects of comprehensive and high-throughput experiments across different biological technologies. We have extended the FuGE object model to accommodate flow cytometry data and metadata. Methods: We used the MagicDraw modelling tool to design a UML model (Flow-OM) according to the FuGE extension guidelines and the AndroMDA toolkit to transform the model to a markup language (Flow-ML). We mapped each MIFlowCyt term to either an existing FuGE class or to a new FuGEFlow class. The development environment was validated by comparing the official FuGE XSD to the schema we generated from the FuGE object model using our configuration. After the Flow-OM model was completed, the final version of the Flow-ML was generated and validated against an example MIFlowCyt-compliant experiment description. Results: The extension of FuGE for flow cytometry has resulted in a generic FuGE-compliant data model (FuGEFlow), which accommodates and links together all information required by MIFlowCyt. The FuGEFlow model can be used to build software and databases using FuGE software toolkits to facilitate automated exchange and manipulation of potentially large flow cytometry experimental data sets.
Additional project documentation, including reusable design patterns and a guide for setting up a development environment, was contributed back to the FuGE project. Conclusion: We have shown that an extension of FuGE can be used to transform minimum information requirements in natural language to a markup language in XML. Extending FuGE required significant effort, but in our experience the benefits outweighed the costs. The FuGEFlow is expected to play a central role in describing flow cytometry experiments and ultimately in facilitating data exchange, including with public flow cytometry repositories currently under development. PMID:19531228

  13. Epithelial cancers and photon migration: Monte Carlo simulations and diffuse reflectance measurements

    NASA Astrophysics Data System (ADS)

    Tubiana, Jerome; Kass, Alex J.; Newman, Maya Y.; Levitz, David

    2015-07-01

    Detecting pre-cancer in epithelial tissues such as the cervix is a challenging task in low-resource settings. In an effort to achieve a low-cost cervical cancer screening and diagnostic method for use in such settings, mobile colposcopes that use a smartphone as their engine have been developed. Designing image analysis software suited for this task requires proper modeling of light propagation from abnormalities inside the tissue to the smartphone camera. Different simulation methods have been developed in the past, either by solving light diffusion equations or by running Monte Carlo simulations. Several algorithms exist for the latter, including MCML and the recently developed MCX. For imaging purposes, the observable of interest is the reflectance profile of a tissue under a specific illumination pattern and optical setup. Extensions of the MCX algorithm were developed to simulate this observable under these conditions. These extensions were validated against MCML and diffusion theory for the simple case of contact measurements, and reflectance profiles under colposcopy imaging geometry were also simulated. To validate this model, the diffuse reflectance profiles of homogeneous tissue phantoms were measured with a spectrometer under several illumination and optical settings. The measured reflectance profiles showed a non-trivial deviation across the spectrum. Measurements of an added-absorber experiment on a series of phantoms showed that the absorption of the dye scales linearly when fit to both the MCX and diffusion models. More work is needed to integrate a pupil into the experiment.
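    The Monte Carlo approach mentioned above can be illustrated with a deliberately simplified sketch (this is not MCX or MCML; it is a hypothetical 1-D random walk in a semi-infinite slab, with absorption applied as a weight reduction per scattering event):

```python
import math
import random

def diffuse_reflectance(albedo, n_photons=20000, seed=1):
    """Toy 1-D Monte Carlo: fraction of launched photon weight that escapes
    back through the surface of a semi-infinite slab."""
    rng = random.Random(seed)
    reflected = 0.0
    for _ in range(n_photons):
        z, mu, w = 0.0, 1.0, 1.0               # depth, direction, photon weight
        while w > 1e-4:                        # terminate negligible packets
            z += mu * -math.log(1.0 - rng.random())  # exponential free path
            if z <= 0.0:                       # photon re-crosses the surface
                reflected += w
                break
            w *= albedo                        # partial absorption per event
            mu = 1.0 if rng.random() < 0.5 else -1.0  # isotropic scatter (1-D)
    return reflected / n_photons
```

    As expected physically, a more weakly absorbing phantom (higher single-scattering albedo) returns more diffuse reflectance.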

  14. Development and validation of a 10-year-old child ligamentous cervical spine finite element model.

    PubMed

    Dong, Liqiang; Li, Guangyao; Mao, Haojie; Marek, Stanley; Yang, King H

    2013-12-01

    Although a number of finite element (FE) adult cervical spine models have been developed to understand the injury mechanisms of the neck in automotive-related crash scenarios, there have been fewer efforts to develop a child neck model. In this study, a 10-year-old ligamentous cervical spine FE model was developed for application in the improvement of pediatric safety related to motor vehicle crashes. The model geometry was obtained from medical scans and meshed using a multi-block approach. Appropriate properties, based on a review of the literature in conjunction with scaling, were assigned to different parts of the model. Child tensile force-deformation data in three segments, Occipital-C2 (C0-C2), C4-C5 and C6-C7, were used to validate the cervical spine model and predict failure forces and displacements. Design of computer experiments was performed to determine the failure properties for intervertebral discs and ligaments needed to set up the FE model. The model-predicted ultimate displacements and forces were within the experimental range. The cervical spine FE model was also validated in flexion and extension against the child experimental data in the three segments C0-C2, C4-C5 and C6-C7, and the whole cervical spine model was validated in tension, flexion and extension against the child experimental data. Other model predictions were found to be consistent with the experimental responses scaled from adult data. This study provided methods for developing a child ligamentous cervical spine FE model and predicting soft tissue failures in tension.

  15. Improving validation methods for molecular diagnostics: application of Bland-Altman, Deming and simple linear regression analyses in assay comparison and evaluation for next-generation sequencing

    PubMed Central

    Misyura, Maksym; Sukhai, Mahadeo A; Kulasignam, Vathany; Zhang, Tong; Kamel-Reid, Suzanne; Stockley, Tracy L

    2018-01-01

    Aims A standard approach in test evaluation is to compare results of the assay in validation to results from previously validated methods. For quantitative molecular diagnostic assays, comparison of test values is often performed using simple linear regression and the coefficient of determination (R2), using R2 as the primary metric of assay agreement. However, the use of R2 alone does not adequately quantify the constant or proportional errors required for optimal test evaluation. More extensive statistical approaches, such as Bland-Altman and expanded interpretation of linear regression methods, can be used to more thoroughly compare data from quantitative molecular assays. Methods We present the application of Bland-Altman and linear regression statistical methods to evaluate quantitative outputs from next-generation sequencing (NGS) assays. NGS-derived data sets from assay validation experiments were used to demonstrate the utility of the statistical methods. Results Both Bland-Altman and linear regression were able to detect the presence and magnitude of constant and proportional error in quantitative values of NGS data. Deming linear regression was used in the context of assay comparison studies, while simple linear regression was used to analyse serial dilution data. The Bland-Altman approach was also adapted to quantify assay accuracy, including constant and proportional errors, and precision where theoretical and empirical values were known. Conclusions The complementary application of the statistical methods described in this manuscript enables more extensive evaluation of the performance characteristics of quantitative molecular assays, prior to implementation in the clinical molecular laboratory. PMID:28747393
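    The Bland-Altman and Deming calculations named above reduce to short formulas; a minimal sketch in plain Python (hypothetical data; the Deming error-variance ratio `lam` is assumed to be 1 by default, which the paper does not specify):

```python
import math

def bland_altman(x, y):
    """Bias (mean difference) and 95% limits of agreement between two methods."""
    d = [b - a for a, b in zip(x, y)]
    n = len(d)
    bias = sum(d) / n
    sd = math.sqrt(sum((di - bias) ** 2 for di in d) / (n - 1))
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

def deming(x, y, lam=1.0):
    """Deming regression slope/intercept; lam is the ratio of error variances.
    Unlike ordinary least squares, it allows error in both methods."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x) / (n - 1)
    syy = sum((yi - my) ** 2 for yi in y) / (n - 1)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / (n - 1)
    slope = (syy - lam * sxx
             + math.sqrt((syy - lam * sxx) ** 2 + 4 * lam * sxy ** 2)) / (2 * sxy)
    return slope, my - slope * mx
```

    A constant offset between two assays shows up as a nonzero Bland-Altman bias and Deming intercept, while a proportional error shows up as a slope away from 1 — exactly the two error types that R2 alone cannot separate.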

  16. Measuring the Pros and Cons of What It Means to Be a Black Man: Development and Validation of the Black Men’s Experiences Scale (BMES)

    PubMed Central

    Bowleg, Lisa; English, Devin; del Rio-Gonzalez, Ana Maria; Burkholder, Gary J.; Teti, Michelle; Tschann, Jeanne M.

    2015-01-01

    Although extensive research documents that Black people in the U.S. frequently experience social discrimination, most of this research aggregates these experiences primarily or exclusively by race. Consequently, empirical gaps exist about the psychosocial costs and benefits of Black men’s experiences at the intersection of race and gender. Informed by intersectionality, a theoretical framework that highlights how multiple social identities intersect to reflect interlocking social-structural inequality, this study addresses these gaps with the qualitative development and quantitative testing of the Black Men’s Experiences Scale (BMES). The BMES assesses Black men’s negative experiences with overt discrimination and microaggressions, as well as their positive evaluations of what it means to be Black men. First, we conducted focus groups and individual interviews with Black men to develop the BMES. Next, we tested the BMES with 578 predominantly low-income urban Black men between the ages of 18 and 44. Exploratory factor analysis suggested a 12-item, 3-factor solution that explained 63.7% of the variance. We labeled the subscales Overt Discrimination, Microaggressions, and Positives: Black Men. Confirmatory factor analysis supported the three-factor solution. As hypothesized, the BMES’s subscales correlated with measures of racial discrimination, depression, resilience, and neighborhood-level social class. Preliminary evidence suggests that the BMES is a reliable and valid measure of Black men’s experiences at the intersection of race and gender. PMID:27087786

  17. Measuring the Pros and Cons of What It Means to Be a Black Man: Development and Validation of the Black Men's Experiences Scale (BMES).

    PubMed

    Bowleg, Lisa; English, Devin; Del Rio-Gonzalez, Ana Maria; Burkholder, Gary J; Teti, Michelle; Tschann, Jeanne M

    2016-04-01

    Although extensive research documents that Black people in the U.S. frequently experience social discrimination, most of this research aggregates these experiences primarily or exclusively by race. Consequently, empirical gaps exist about the psychosocial costs and benefits of Black men's experiences at the intersection of race and gender. Informed by intersectionality, a theoretical framework that highlights how multiple social identities intersect to reflect interlocking social-structural inequality, this study addresses these gaps with the qualitative development and quantitative testing of the Black Men's Experiences Scale (BMES). The BMES assesses Black men's negative experiences with overt discrimination and microaggressions, as well as their positive evaluations of what it means to be Black men. First, we conducted focus groups and individual interviews with Black men to develop the BMES. Next, we tested the BMES with 578 predominantly low-income urban Black men between the ages of 18 and 44. Exploratory factor analysis suggested a 12-item, 3-factor solution that explained 63.7% of the variance. We labeled the subscales Overt Discrimination, Microaggressions, and Positives: Black Men. Confirmatory factor analysis supported the three-factor solution. As hypothesized, the BMES's subscales correlated with measures of racial discrimination, depression, resilience, and neighborhood-level social class. Preliminary evidence suggests that the BMES is a reliable and valid measure of Black men's experiences at the intersection of race and gender.

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shi, Jade; Nobrega, R. Paul; Schwantes, Christian

    The dynamics of globular proteins can be described in terms of transitions between a folded native state and less-populated intermediates, or excited states, which can play critical roles in both protein folding and function. Excited states are by definition transient species, and therefore are difficult to characterize using current experimental techniques. We report an atomistic model of the excited state ensemble of a stabilized mutant of the extensively studied flavodoxin fold protein CheY. We employed a hybrid simulation and experimental approach in which an aggregate 42 milliseconds of all-atom molecular dynamics were used as an informative prior for the structure of the excited state ensemble. The resulting prior was then refined against small-angle X-ray scattering (SAXS) data employing an established method (EROS). The most striking feature of the resulting excited state ensemble was an unstructured N-terminus stabilized by non-native contacts in a conformation that is topologically simpler than the native state. Using these results, we then predict incisive single-molecule FRET experiments as a means of model validation. Our study demonstrates the paradigm of uniting simulation and experiment in a statistical model to study the structure of protein excited states and rationally design validating experiments.

  19. Detecting cheaters without thinking: testing the automaticity of the cheater detection module.

    PubMed

    Van Lier, Jens; Revlin, Russell; De Neys, Wim

    2013-01-01

    Evolutionary psychologists have suggested that our brain is composed of evolved mechanisms. One extensively studied mechanism is the cheater detection module. This module would make people very good at detecting cheaters in a social exchange. A vast amount of research has illustrated performance facilitation on social contract selection tasks. This facilitation is attributed to the alleged automatic and isolated operation of the module (i.e., independent of general cognitive capacity). This study, using the selection task, tested the critical automaticity assumption in three experiments. Experiments 1 and 2 established that performance on social contract versions did not depend on cognitive capacity or age. Experiment 3 showed that experimentally burdening cognitive resources with a secondary task had no impact on performance on the social contract version. However, in all experiments, performance on a non-social contract version did depend on available cognitive capacity. Overall, findings validate the automatic and effortless nature of social exchange reasoning.

  20. A Testing Platform for Validation of Overhead Conductor Aging Models and Understanding Thermal Limits

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Irminger, Philip; Starke, Michael R; Dimitrovski, Aleksandar D

    2014-01-01

    Power system equipment manufacturers and researchers continue to experiment with novel overhead electric conductor designs that support better conductor performance and address congestion issues. To address the technology gap in testing these novel designs, Oak Ridge National Laboratory constructed the Powerline Conductor Accelerated Testing (PCAT) facility to evaluate the performance of novel overhead conductors in an accelerated fashion in a field environment. Additionally, PCAT has the capability to test advanced sensors and measurement methods for assessing overhead conductor performance and condition. Equipped with extensive measurement and monitoring devices, PCAT provides a platform to improve and validate conductor computer models and assess the performance of novel conductors. The PCAT facility and its testing capabilities are described in this paper.

  1. Self-rating inventory for posttraumatic stress disorder: review of the psychometric properties of a new brief Dutch screening instrument.

    PubMed

    Hovens, J E; Bramsen, I; van der Ploeg, H M

    2002-06-01

    The 22-item Self-rating Inventory for Posttraumatic Stress Disorder was developed for use with populations without identified traumatic experiences. The inventory has been used extensively in survey research in The Netherlands. This paper examines its psychometric properties. In four different groups (trauma and psychiatric patients, elderly Dutch subjects, former peacekeepers, and medical students), internal consistency, test-retest reliability, concurrent and discriminant validity, and sensitivity and specificity are analyzed. The inventory showed good internal consistency, test-retest reliability, and concurrent and discriminant validity, as well as high sensitivity and specificity. It appears to be valuable for survey research on posttraumatic stress in nonselected populations. As a screening device, it shows high sensitivity for PTSD symptoms even when the traumatic event has not been defined.
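    The sensitivity and specificity reported for screening instruments like this one are simple ratios over a confusion matrix; a minimal sketch with hypothetical counts (not the study's actual data):

```python
def sensitivity_specificity(tp, fn, tn, fp):
    """Sensitivity: proportion of true cases the screener flags.
    Specificity: proportion of non-cases it correctly passes."""
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical screening outcome: 45 of 50 cases flagged,
# 90 of 100 non-cases correctly passed.
sens, spec = sensitivity_specificity(tp=45, fn=5, tn=90, fp=10)
```

    For a screening device, high sensitivity is usually prioritized, since missed cases (false negatives) are costlier than follow-up of false positives.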

  2. Availability of Neutronics Benchmarks in the ICSBEP and IRPhEP Handbooks for Computational Tools Testing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bess, John D.; Briggs, J. Blair; Ivanova, Tatiana

    2017-02-01

    In the past several decades, numerous experiments have been performed worldwide to support reactor operations, measurements, design, and nuclear safety. Those experiments represent an extensive international investment in infrastructure, expertise, and cost, and they constitute significantly valuable resources of data supporting past, current, and future research activities. Those valuable assets form the basis for recording, developing, and validating our nuclear methods and integral nuclear data [1]. The loss of these experimental data, which has occurred all too often in recent years, is tragic. The high cost to repeat many of these measurements can be prohibitive, if not impossible, to surmount. Two international projects were developed, and are under the direction of the Organisation for Economic Co-operation and Development Nuclear Energy Agency (OECD NEA), to address the challenges of not just data preservation but evaluation of the data to determine its merit for modern and future use. The International Criticality Safety Benchmark Evaluation Project (ICSBEP) was established to identify and verify comprehensive critical benchmark data sets; evaluate the data, including quantification of biases and uncertainties; compile the data and calculations in a standardized format; and formally document the effort into a single source of verified benchmark data [2]. Similarly, the International Reactor Physics Experiment Evaluation Project (IRPhEP) was established to preserve integral reactor physics experimental data, including separate or special effects data for nuclear energy and technology applications [3]. Annually, contributors from around the world continue to collaborate in the evaluation and review of select benchmark experiments for preservation and dissemination. The extensively peer-reviewed integral benchmark data can then be utilized by nuclear design and safety analysts to validate the analytical tools, methods, and data needed for next-generation reactor design, safety analysis requirements, and all other front- and back-end activities contributing to the overall nuclear fuel cycle where quality neutronics calculations are paramount.

  3. Broad attention to multiple individual objects may facilitate change detection with complex auditory scenes.

    PubMed

    Irsik, Vanessa C; Vanden Bosch der Nederlanden, Christina M; Snyder, Joel S

    2016-11-01

    Attention and other processing constraints limit the perception of objects in complex scenes, which has been studied extensively in the visual sense. We used a change deafness paradigm to examine how attention to particular objects helps and hurts the ability to notice changes within complex auditory scenes. In a counterbalanced design, we examined how cueing attention to particular objects affected performance in an auditory change-detection task through the use of valid or invalid cues and trials without cues (Experiment 1). We further examined how successful encoding predicted change-detection performance using an object-encoding task and we addressed whether performing the object-encoding task along with the change-detection task affected performance overall (Experiment 2). Participants had more error for invalid compared to valid and uncued trials, but this effect was reduced in Experiment 2 compared to Experiment 1. When the object-encoding task was present, listeners who completed the uncued condition first had less overall error than those who completed the cued condition first. All participants showed less change deafness when they successfully encoded change-relevant compared to irrelevant objects during valid and uncued trials. However, only participants who completed the uncued condition first also showed this effect during invalid cue trials, suggesting a broader scope of attention. These findings provide converging evidence that attention to change-relevant objects is crucial for successful detection of acoustic changes and that encouraging broad attention to multiple objects is the best way to reduce change deafness.

  4. Opportunistic Mobility Support for Resource Constrained Sensor Devices in Smart Cities

    PubMed Central

    Granlund, Daniel; Holmlund, Patrik; Åhlund, Christer

    2015-01-01

    A multitude of wireless sensor devices and technologies are being developed and deployed in cities all over the world. Sensor applications in city environments may include highly mobile installations that span large areas, which necessitates sensor mobility support. This paper presents and validates two mechanisms for supporting sensor mobility between different administrative domains. First, EAP-Swift, an Extensible Authentication Protocol (EAP)-based sensor authentication protocol, is proposed that enables lightweight sensor authentication and key generation. Second, a mechanism for handoffs between wireless sensor gateways is proposed. We validate both mechanisms in a real-life study conducted in a smart city environment with several fixed sensors and moving gateways, and we conduct similar experiments in an industry-grade anechoic Long Term Evolution (LTE) chamber with an ideal radio environment. We then validate the results collected from the smart city environment against those produced under ideal conditions to establish best-case and real-life scenarios. Our results clearly show that the proposed mechanisms can facilitate efficient sensor authentication and handoffs while sensors roam in a smart city environment. PMID:25738767
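    The EAP-Swift protocol itself is not reproduced in this abstract; as a loose, hypothetical illustration of the general idea behind lightweight key generation for handoffs, both a sensor and a gateway that share a secret can derive the same session key from a fresh nonce, avoiding a full re-authentication at each domain crossing (all names below are invented for the sketch):

```python
import hashlib
import hmac

def derive_session_key(shared_secret: bytes, sensor_id: bytes, nonce: bytes) -> bytes:
    """Both sides hold shared_secret; mixing in the visited gateway's nonce
    yields a fresh per-handoff session key without a full auth exchange."""
    return hmac.new(shared_secret, sensor_id + b"|" + nonce, hashlib.sha256).digest()
```

    Because the derivation is deterministic in its inputs, sensor and gateway independently arrive at the same key, while a new nonce at the next gateway yields a different key.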

  5. Monte Carlo-based validation of neutronic methodology for EBR-II analyses

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liaw, J.R.; Finck, P.J.

    1993-01-01

    The continuous-energy Monte Carlo code VIM (Ref. 1) has been validated extensively over the years against fast critical experiments and other neutronic analysis codes. A high degree of confidence in VIM for predicting reactor physics parameters has been firmly established. This paper presents a numerical validation of two conventional multigroup neutronic analysis codes, DIF3D (Ref. 4) and VARIANT (Ref. 5), against VIM for two Experimental Breeder Reactor II (EBR-II) core loadings in detailed three-dimensional hexagonal-z geometry. The DIF3D code is based on nodal diffusion theory, and it is used in calculations for day-to-day reactor operations, whereas the VARIANT code is based on nodal transport theory and is used with increasing frequency for specific applications. Both DIF3D and VARIANT rely on multigroup cross sections generated from ENDF/B-V by the ETOE-2/MC²-II/SDX (Ref. 6) code package. Hence, this study also validates the multigroup cross-section processing methodology against the continuous-energy approach used in VIM.

  6. Opportunistic mobility support for resource constrained sensor devices in smart cities.

    PubMed

    Granlund, Daniel; Holmlund, Patrik; Åhlund, Christer

    2015-03-02

    A multitude of wireless sensor devices and technologies are being developed and deployed in cities all over the world. Sensor applications in city environments may include highly mobile installations that span large areas, which necessitates sensor mobility support. This paper presents and validates two mechanisms for supporting sensor mobility between different administrative domains. First, EAP-Swift, an Extensible Authentication Protocol (EAP)-based sensor authentication protocol, is proposed that enables lightweight sensor authentication and key generation. Second, a mechanism for handoffs between wireless sensor gateways is proposed. We validate both mechanisms in a real-life study conducted in a smart city environment with several fixed sensors and moving gateways, and we conduct similar experiments in an industry-grade anechoic Long Term Evolution (LTE) chamber with an ideal radio environment. We then validate the results collected from the smart city environment against those produced under ideal conditions to establish best-case and real-life scenarios. Our results clearly show that the proposed mechanisms can facilitate efficient sensor authentication and handoffs while sensors roam in a smart city environment.

  7. Fall 2012 Graduate Engineering Internship Summary

    NASA Technical Reports Server (NTRS)

    Ehrlich, Joshua

    2013-01-01

    In the fall of 2012, I participated in the National Aeronautics and Space Administration (NASA) Pathways Intern Employment Program at the Kennedy Space Center (KSC) in Florida. This was my second internship opportunity with NASA, a consecutive extension from a summer 2012 internship. During my four-month tenure, I gained valuable knowledge and extensive hands-on experience with payload design and testing, as well as composite fabrication for repair design on future space vehicle structures. As a systems engineer, I supported the systems engineering and integration team with the testing of scientific payloads such as the Vegetable Production System (Veggie). Verification and validation (V&V) of Veggie was carried out prior to qualification testing of the payload, which involved a lengthy process of confirming design requirements through one or more validation methods: inspection, analysis, demonstration, and testing. Additionally, I assisted in verifying that the design requirements outlined in the V&V plan matched the requirements outlined by the scientists in the Science Requirements Envelope Document (SRED). The purpose of the SRED was to define the experiment requirements that the payload was intended to meet and carry out.

  8. Scaling Relations for Intercalation Induced Damage in Electrodes

    DOE PAGES

    Chen, Chien-Fan; Barai, Pallab; Smith, Kandler; ...

    2016-04-02

    Mechanical degradation, owing to intercalation-induced stress and microcrack formation, is a key contributor to electrode performance decay in lithium-ion batteries (LIBs). The stress generation and formation of microcracks are caused by the solid-state diffusion of lithium in the active particles. In this work, scaling relations are constructed for diffusion-induced damage in intercalation electrodes based on an extensive set of numerical experiments with a particle-level description of microcrack formation under disparate operating and cycling conditions, such as temperature, particle size, C-rate, and drive cycle. The microcrack formation and evolution in active particles is simulated based on a stochastic methodology. A reduced-order scaling law is constructed based on an extensive set of data from the numerical experiments. The scaling relations include combinatorial constructs of concentration gradient, cumulative strain energy, and microcrack formation. Lastly, the reduced-order relations are further employed to study the influence of mechanical degradation on cell performance and are validated against the high-order model for the case of damage evolution during variable-current vehicle drive cycle profiles.
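    Reduced-order scaling laws of this kind are often fitted as power laws to data from numerical experiments. A minimal sketch of the fitting step, assuming a hypothetical single-variable form y = k·x^a (not the authors' actual multivariate relation), via least squares in log-log space:

```python
import math

def fit_power_law(xs, ys):
    """Least-squares fit of y = k * x**a, linearized as log y = log k + a log x."""
    lx = [math.log(x) for x in xs]
    ly = [math.log(y) for y in ys]
    n = len(lx)
    mx, my = sum(lx) / n, sum(ly) / n
    a = (sum((u - mx) * (v - my) for u, v in zip(lx, ly))
         / sum((u - mx) ** 2 for u in lx))        # slope = exponent
    k = math.exp(my - a * mx)                     # intercept = log prefactor
    return k, a
```

    Once the exponent and prefactor are fitted against the high-order model's output, the scaling relation can stand in for the expensive simulation inside cell-level performance studies.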

  9. The kinetic origin of delayed yielding in metallic glasses

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ye, Y. F.; Liu, X. D.; Wang, S.

    2016-06-20

    Recent experiments showed that irreversible structural change or plasticity could occur in metallic glasses (MGs) even within the apparent elastic limit after a sufficiently long waiting time. To explain this phenomenon, a stochastic shear transformation model is developed based on a unified rate theory to predict delayed yielding in MGs, which is validated afterwards through extensive atomistic simulations carried out on different MGs. On a fundamental level, an analytic framework is established in this work that links time, stress, and temperature altogether into a general yielding criterion for MGs.

  10. Toward reliable biomarker signatures in the age of liquid biopsies - how to standardize the small RNA-Seq workflow

    PubMed Central

    Buschmann, Dominik; Haberberger, Anna; Kirchner, Benedikt; Spornraft, Melanie; Riedmaier, Irmgard; Schelling, Gustav; Pfaffl, Michael W.

    2016-01-01

    Small RNA-Seq has emerged as a powerful tool in transcriptomics, gene expression profiling and biomarker discovery. Sequencing cell-free nucleic acids, particularly microRNA (miRNA), from liquid biopsies additionally provides exciting possibilities for molecular diagnostics, and might help establish disease-specific biomarker signatures. The complexity of the small RNA-Seq workflow, however, bears challenges and biases that researchers need to be aware of in order to generate high-quality data. Rigorous standardization and extensive validation are required to guarantee reliability, reproducibility and comparability of research findings. Hypotheses based on flawed experimental conditions can be inconsistent and even misleading. Comparable to the well-established MIQE guidelines for qPCR experiments, this work aims at establishing guidelines for experimental design and pre-analytical sample processing, standardization of library preparation and sequencing reactions, as well as facilitating data analysis. We highlight bottlenecks in small RNA-Seq experiments, point out the importance of stringent quality control and validation, and provide a primer for differential expression analysis and biomarker discovery. Following our recommendations will encourage better sequencing practice, increase experimental transparency and lead to more reproducible small RNA-Seq results. This will ultimately enhance the validity of biomarker signatures, and allow reliable and robust clinical predictions. PMID:27317696

  11. Validity and reliability of a low-cost digital dynamometer for measuring isometric strength of lower limb.

    PubMed

    Romero-Franco, Natalia; Jiménez-Reyes, Pedro; Montaño-Munuera, Juan A

    2017-11-01

    Lower limb isometric strength is a key parameter for monitoring the training process or recognizing muscle weakness and injury risk. However, valid and reliable methods to evaluate it often require high-cost tools. The aim of this study was to analyse the concurrent validity and reliability of a low-cost digital dynamometer for measuring isometric strength in the lower limb. Eleven physically active and healthy participants performed maximal isometric strength tests for flexion and extension of the ankle, flexion and extension of the knee, and flexion, extension, adduction, abduction, and internal and external rotation of the hip. Data obtained by the digital dynamometer were compared with data from an isokinetic dynamometer to examine concurrent validity. Data obtained by the digital dynamometer from 2 different evaluators and 2 different sessions were compared to examine inter-rater and intra-rater reliability. The intra-class correlation coefficient (ICC) for validity was excellent in every movement (ICC > 0.9). Intra- and inter-tester reliability was excellent for all the movements assessed (ICC > 0.75). The low-cost digital dynamometer demonstrated strong concurrent validity and excellent intra- and inter-tester reliability for assessing isometric strength in the main lower limb movements.
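    The ICC values cited can be computed directly from a subjects-by-raters table. A minimal sketch of ICC(2,1) (two-way random effects, absolute agreement, single measure — a common choice for inter-rater reliability; the paper's exact ICC form is not stated in this abstract):

```python
def icc_2_1(data):
    """ICC(2,1): two-way random effects, absolute agreement, single rater.
    data: list of n subjects, each a list of k ratings."""
    n, k = len(data), len(data[0])
    grand = sum(sum(row) for row in data) / (n * k)
    subj_means = [sum(row) / k for row in data]
    rater_means = [sum(row[j] for row in data) / n for j in range(k)]
    msr = k * sum((m - grand) ** 2 for m in subj_means) / (n - 1)   # subjects
    msc = n * sum((m - grand) ** 2 for m in rater_means) / (k - 1)  # raters
    sse = sum((data[i][j] - subj_means[i] - rater_means[j] + grand) ** 2
              for i in range(n) for j in range(k))
    mse = sse / ((n - 1) * (k - 1))                                 # residual
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)
```

    Note that absolute-agreement ICC penalizes a systematic offset between raters: two raters who differ by a constant score less than 1 even though they rank subjects identically.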

  12. Algorithm and code development for unsteady three-dimensional Navier-Stokes equations

    NASA Technical Reports Server (NTRS)

    Obayashi, Shigeru

    1994-01-01

    Aeroelastic tests involve extensive cost and risk. An aeroelastic wind-tunnel experiment is an order of magnitude more expensive than a parallel experiment involving only aerodynamics. By complementing wind-tunnel experiments with numerical simulations, the overall cost of aircraft development can be considerably reduced. In order to accurately compute aeroelastic phenomena, it is necessary to solve the unsteady Euler/Navier-Stokes equations simultaneously with the structural equations of motion. These equations accurately describe the flow phenomena for aeroelastic applications. At ARC, the ENSAERO code is being developed for computing the unsteady aerodynamics and aeroelasticity of aircraft by solving the Euler/Navier-Stokes equations. The purpose of this cooperative agreement was to enhance ENSAERO in both algorithmic and geometric capabilities. During the last five years, the algorithms of the code have been enhanced extensively by using high-resolution upwind algorithms and efficient implicit solvers. The zonal capability of the code has been extended from a one-to-one grid interface to a mismatching unsteady zonal interface. The geometric capability of the code has been extended from a single oscillating wing case to a full-span wing-body configuration with oscillating control surfaces. Each time a new capability was added, a proper validation case was simulated, and the capability of the code was demonstrated.

  13. Spectroscopic studies of model photo-receptors: validation of a nanosecond time-resolved micro-spectrophotometer design using photoactive yellow protein and α-phycoerythrocyanin.

    PubMed

    Purwar, Namrta; Tenboer, Jason; Tripathi, Shailesh; Schmidt, Marius

    2013-09-13

    Time-resolved spectroscopic experiments have been performed with protein in solution and in crystalline form using a newly designed microspectrophotometer. The time-resolution of these experiments can be as good as two nanoseconds (ns), which is the minimal response time of the image intensifier used. With the current setup, the effective time-resolution is about seven ns, determined mainly by the pulse duration of the nanosecond laser. The amount of protein required is small, on the order of 100 nanograms. Bleaching, which is an undesirable effect common to photoreceptor proteins, is minimized by using a millisecond shutter to avoid extensive exposure to the probing light. We investigate two model photoreceptors, photoactive yellow protein (PYP), and α-phycoerythrocyanin (α-PEC), on different time scales and at different temperatures. Relaxation times obtained from kinetic time-series of difference absorption spectra collected from PYP are consistent with previous results. The comparison with these results validates the capability of this spectrophotometer to deliver high quality time-resolved absorption spectra.

  14. Improving validation methods for molecular diagnostics: application of Bland-Altman, Deming and simple linear regression analyses in assay comparison and evaluation for next-generation sequencing.

    PubMed

    Misyura, Maksym; Sukhai, Mahadeo A; Kulasignam, Vathany; Zhang, Tong; Kamel-Reid, Suzanne; Stockley, Tracy L

    2018-02-01

    A standard approach in test evaluation is to compare results of the assay under validation to results from previously validated methods. For quantitative molecular diagnostic assays, comparison of test values is often performed using simple linear regression and the coefficient of determination (R²), with R² as the primary metric of assay agreement. However, R² alone does not adequately quantify the constant or proportional errors required for optimal test evaluation. More extensive statistical approaches, such as Bland-Altman analysis and expanded interpretation of linear regression methods, can be used to compare data from quantitative molecular assays more thoroughly. We present the application of Bland-Altman and linear regression statistical methods to evaluate quantitative outputs from next-generation sequencing (NGS) assays. NGS-derived data sets from assay validation experiments were used to demonstrate the utility of the statistical methods. Both Bland-Altman and linear regression were able to detect the presence and magnitude of constant and proportional error in quantitative values of NGS data. Deming linear regression was used in the context of assay comparison studies, while simple linear regression was used to analyse serial dilution data. The Bland-Altman approach was also adapted to quantify assay accuracy, including constant and proportional errors, and precision where theoretical and empirical values were known. The complementary application of the statistical methods described in this manuscript enables more extensive evaluation of the performance characteristics of quantitative molecular assays prior to implementation in the clinical molecular laboratory. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
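
The core Bland-Altman computation described in this record (bias as the mean paired difference, with 95% limits of agreement at bias ± 1.96 SD) can be sketched in a few lines. The function and data below are illustrative, not taken from the study.

```python
import numpy as np

def bland_altman(ref, test):
    """Bland-Altman agreement statistics for paired measurements.

    ref, test: paired results from the reference and candidate assays.
    Returns the bias (mean difference, capturing constant error) and
    the 95% limits of agreement.
    """
    ref = np.asarray(ref, dtype=float)
    test = np.asarray(test, dtype=float)
    diffs = test - ref
    bias = diffs.mean()
    sd = diffs.std(ddof=1)  # sample standard deviation of the differences
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Invented example: candidate assay reads 2 units high across the board
bias, (lower, upper) = bland_altman([1, 2, 3, 4], [3, 4, 5, 6])
print(bias, lower, upper)
```

A trend of the differences against the pairwise means would indicate proportional error; Deming regression goes further than simple linear regression by allowing measurement error in both methods rather than only in the dependent variable.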

  15. Reliability and Construct Validity of the Psychopathic Personality Inventory-Revised in a Swedish Non-Criminal Sample - A Multimethod Approach including Psychophysiological Correlates of Empathy for Pain.

    PubMed

    Sörman, Karolina; Nilsonne, Gustav; Howner, Katarina; Tamm, Sandra; Caman, Shilan; Wang, Hui-Xin; Ingvar, Martin; Edens, John F; Gustavsson, Petter; Lilienfeld, Scott O; Petrovic, Predrag; Fischer, Håkan; Kristiansson, Marianne

    2016-01-01

    Cross-cultural investigation of psychopathy measures is important for clarifying the nomological network surrounding the psychopathy construct. The Psychopathic Personality Inventory-Revised (PPI-R) is one of the most extensively researched self-report measures of psychopathic traits in adults. To date however, it has been examined primarily in North American criminal or student samples. To address this gap in the literature, we examined PPI-R's reliability, construct validity and factor structure in non-criminal individuals (N = 227) in Sweden, using a multimethod approach including psychophysiological correlates of empathy for pain. PPI-R construct validity was investigated in subgroups of participants by exploring its degree of overlap with (i) the Psychopathy Checklist: Screening Version (PCL:SV), (ii) self-rated empathy and behavioral and physiological responses in an experiment on empathy for pain, and (iii) additional self-report measures of alexithymia and trait anxiety. The PPI-R total score was significantly associated with PCL:SV total and factor scores. The PPI-R Coldheartedness scale demonstrated significant negative associations with all empathy subscales and with rated unpleasantness and skin conductance responses in the empathy experiment. The PPI-R higher order Self-Centered Impulsivity and Fearless Dominance dimensions were associated with trait anxiety in opposite directions (positively and negatively, respectively). Overall, the results demonstrated solid reliability (test-retest and internal consistency) and promising but somewhat mixed construct validity for the Swedish translation of the PPI-R.

  17. Adam M. Grant: Award for Distinguished Scientific Early Career Contributions to Psychology.

    PubMed

    2011-11-01

    Presents Adam M. Grant, the 2011 winner of the American Psychological Association Award for Distinguished Scientific Early Career Contributions to Psychology. "For extensive, elegant, and programmatic research on the power of relational job design in enhancing employee motivation, productivity, and satisfaction; for creative and rigorous studies documenting the profound and surprising effects of connecting employees to their impact on others; for highlighting prosocial motivation, not only extrinsic and intrinsic motivations, as a key force behind employee behavior; and for demonstrating by example the feasibility and benefits of conducting field experiments, yielding studies rich in internal validity, external validity, and practical impact. In addition to his accomplishments, Adam M. Grant is known for his generosity as a scholar, teacher, and colleague." (PsycINFO Database Record (c) 2011 APA, all rights reserved).

  18. Genetic programming-based mathematical modeling of influence of weather parameters in BOD5 removal by Lemna minor.

    PubMed

    Chandrasekaran, Sivapragasam; Sankararajan, Vanitha; Neelakandhan, Nampoothiri; Ram Kumar, Mahalakshmi

    2017-11-04

    This study, through extensive experiments and mathematical modeling, reveals that, in addition to retention time and wastewater temperature (Tw), atmospheric parameters play an important role in the effective functioning of an aquatic macrophyte-based treatment system. The duckweed species Lemna minor is considered in this study. It is observed that the combined effect of atmospheric temperature (Tatm), wind speed (Uw), and relative humidity (RH) can be captured by a single parameter, the "apparent temperature" (Ta). A total of eight different models are considered based on combinations of input parameters, and the best mathematical model is identified and validated against a new experimental set-up outside the modeling period. The validation results are highly encouraging. Genetic programming (GP)-based models are found to reveal deeper understanding of the wetland process.

  19. Multiscale GPS tomography during COPS: validation and applications

    NASA Astrophysics Data System (ADS)

    Champollion, Cédric; Flamant, Cyrille; Masson, Frédéric; Gégout, Pascal; Boniface, Karen; Richard, Evelyne

    2010-05-01

    Accurate 3D description of the water vapour field is of interest for process studies such as convection initiation. None of the current techniques (LIDAR, satellite, radio soundings, GPS) can provide an all-weather, continuous 3D field of moisture. The combination of GPS tomography with radio soundings (and/or LIDAR) has been used for such process studies, exploiting both the vertical resolution of the soundings and the high temporal density of GPS measurements. GPS tomography has been used at short scale (10 km horizontal resolution, over a 50 km² area) for process studies such as the ESCOMPTE experiment (Bastin et al., 2005) and at larger scale (50 km horizontal resolution) during IHOP_2002, but no extensive statistical validation has been done so far. The overarching goal of the COPS field experiment is to advance the quality of forecasts of orographically induced convective precipitation by four-dimensional observations and modeling of its life cycle, identifying the physical and chemical processes responsible for deficiencies in quantitative precipitation forecasting (QPF) over low-mountain regions. During the COPS field experiment, a network of about 100 GPS stations operated continuously for three months in an area of 500 km² in the east of France (Vosges Mountains) and the west of Germany (Black Forest). While the mean spacing between GPS stations is about 50 km, an east-west GPS profile with a density of about 10 km is dedicated to high-resolution tomography. One major goal of the GPS COPS experiment is to validate GPS tomography at different spatial resolutions. Validation is based on additional radio soundings and airborne/ground-based LIDAR measurements. The number and high quality of vertically resolved water vapor observations provide a unique data set for GPS tomography validation. Numerous tests have been performed on real data to show the types of water vapor structures that can be imaged by GPS tomography, depending on the assimilation of additional data (radio soundings), the resolution of the tomography grid, and the density of the GPS network. Finally, applications to different case studies will be presented briefly.

  20. An Analysis of Construct Validity of Motivation As It Relates to North Carolina County Agricultural Extension Service Agents.

    ERIC Educational Resources Information Center

    Calloway, Pauline Frances

    This study investigated the construct validity of the Herzberg (1964) theory of motivation as it relates to county Extension agents; and developed an inventory to measure the job satisfaction of county agents in North Carolina. The inventory was administered to 419 agents in 79 counties. Factor analysis was used to determine the number of job…

  1. Description of Mexican Cleft Surgeons' Experience With Foreign Surgical Volunteer Missions in Mexico.

    PubMed

    Schoenbrunner, Anna R; Kelley, Kristen D; Buckstaff, Taylor; McIntyre, Joyce K; Sigler, Alicia; Gosman, Amanda A

    2018-05-01

    Mexican cleft surgeons provide multidisciplinary comprehensive cleft lip and palate care to children in Mexico. Many Mexican cleft surgeons have extensive experience with foreign, visiting surgeons. The purpose of this study was to characterize Mexican cleft surgeons' domestic and volunteer practice and to learn more about Mexican cleft surgeons' experience with visiting surgeons. A cross-sectional validated e-mail survey tool was sent to Mexican cleft surgeons through 2 Mexican plastic surgery societies and the Asociación Mexicana de Labio y Paladar Hendido y Anomalías Craneofaciales, the national cleft palate society that includes plastic and maxillofacial surgeons who specialize in cleft surgery. We utilized validated survey methodology, including neutral fact-based questions and repeated e-mails to survey nonresponders to maximize validity of statistical data; response rate was 30.6% (n = 81). Mexican cleft surgeons performed, on average, 37.7 primary palate repairs per year with an overall complication rate of 2.5%; 34.6% (n = 28) of respondents had direct experience with patients operated on by visiting surgeons; 53.6% of these respondents performed corrective surgery because of complications from visiting surgeons. Respondents rated 48% of the functional outcomes of visiting surgeons as "acceptable," whereas 43% rated aesthetic outcomes of visiting surgeons as "poor"; 73.3% of respondents were never paid for the corrective surgeries they performed. Thirty-three percent of Mexican cleft surgeons believe that there is a role for educational collaboration with visiting surgeons. Mexican cleft surgeons have a high volume of primary cleft palate repairs in their domestic practice with good outcomes. Visiting surgeons may play an important role in Mexican cleft care through educational collaborations that complement the strengths of Mexican cleft surgeons.

  2. Extension of the Hugoniot and analytical release model of α-quartz to 0.2–3 TPa

    DOE PAGES

    Desjarlais, M. P.; Knudson, M. D.; Cochrane, K. R.

    2017-07-21

    In recent years, α-quartz has been used prolifically as an impedance-matching standard in shock wave experiments in the multi-Mbar regime (1 Mbar = 100 GPa = 0.1 TPa). This is due to the fact that above ~90–100 GPa along the principal Hugoniot α-quartz becomes reflective, and thus shock velocities can be measured to high precision using velocity interferometry. The Hugoniot and release of α-quartz have been studied extensively, enabling the development of an analytical release model for use in impedance matching. However, this analytical release model has only been validated over a range of 300–1200 GPa (0.3–1.2 TPa). Furthermore, we extend this analytical model to 200–3000 GPa (0.2–3 TPa) through additional α-quartz Hugoniot and release measurements, as well as first-principles molecular dynamics calculations.
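
For readers unfamiliar with impedance matching, the analysis rests on the standard Rankine-Hugoniot jump conditions, which relate the measured shock velocity to the pressure and particle velocity behind the shock. This is a textbook-level sketch, not the paper's specific release model:

```latex
% Rankine-Hugoniot jump conditions across a steady shock front
% rho_0, P_0, E_0: initial density, pressure, specific internal energy
% U_s: shock velocity (measured interferometrically in quartz)
% u_p: particle velocity behind the shock; v = 1/rho is specific volume
\begin{align}
  \rho_0 \, U_s &= \rho \,(U_s - u_p) && \text{(conservation of mass)} \\
  P - P_0 &= \rho_0 \, U_s \, u_p && \text{(conservation of momentum)} \\
  E - E_0 &= \tfrac{1}{2}\,(P + P_0)\,(v_0 - v) && \text{(conservation of energy)}
\end{align}
```

In an impedance-matching measurement, the precisely measured quartz shock velocity fixes a point on the quartz Hugoniot via these relations; the release model then carries that state down to the lower impedance of the sample under study.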

  3. The Potential of RFID Technology in the Textile and Clothing Industry: Opportunities, Requirements and Challenges

    NASA Astrophysics Data System (ADS)

    Legnani, Elena; Cavalieri, Sergio; Pinto, Roberto; Dotti, Stefano

    In the current competitive environment, companies need to exploit advanced technologies extensively in order to develop a sustainable advantage, enhance their operational efficiency and better serve customers. In this context, RFID technology has emerged as a valid support for company progress, and its value is becoming more and more apparent. In particular, the textile and clothing industry, characterised by short life-cycles, quick-response production, fast distribution, erratic customer preferences and impulsive purchasing, is one of the sectors that can benefit extensively from RFID technology. However, actual applications are still very limited, especially on the upstream side of the supply network. This chapter provides an insight into the main benefits and potentials of this technology and highlights the main issues currently inhibiting its large-scale development in the textile and clothing industry. The experience of two industry-academia projects and their outcomes are reported.

  4. Containerless Processing on ISS: Ground Support Program for EML

    NASA Technical Reports Server (NTRS)

    Diefenbach, Angelika; Schneider, Stephan; Willnecker, Rainer

    2012-01-01

    EML is an electromagnetic levitation facility planned for the ISS, aimed at processing and investigating liquid metals and semiconductors using the electromagnetic levitation technique under microgravity, with reduced electromagnetic fields and convection. Its diagnostics and processing methods allow thermophysical properties to be measured in the liquid state over an extended temperature range and solidification phenomena to be investigated in undercooled melts. The EML project is a joint effort of the European Space Agency (ESA) and the German Space Agency (DLR). The Microgravity User Support Centre (MUSC) in Cologne, Germany, has been assigned responsibility for EML operations. For EML experiment preparation, an extensive scientific ground support program has been established at MUSC, providing scientific and technical services in the preparation, performance and evaluation of the experiments. Its final output is the translation of the scientific goals and requirements into validated facility control parameters for experiment execution onboard the ISS.

  5. Three-dimensional Dendritic Needle Network model with application to Al-Cu directional solidification experiments

    NASA Astrophysics Data System (ADS)

    Tourret, D.; Karma, A.; Clarke, A. J.; Gibbs, P. J.; Imhoff, S. D.

    2015-06-01

    We present a three-dimensional (3D) extension of a previously proposed multi-scale Dendritic Needle Network (DNN) approach for the growth of complex dendritic microstructures. Using a new formulation of the DNN dynamics equations for dendritic paraboloid-branches of a given thickness, one can directly extend the DNN approach to 3D modeling. We validate this new formulation against known scaling laws and analytical solutions that describe the early transient and steady-state growth regimes, respectively. Finally, we compare the predictions of the model to in situ X-ray imaging of Al-Cu alloy solidification experiments. The comparison shows a very good quantitative agreement between 3D simulations and thin sample experiments. It also highlights the importance of full 3D modeling to accurately predict the primary dendrite arm spacing that is significantly over-estimated by 2D simulations.

  6. Individual differences in emotional complexity: their psychological implications.

    PubMed

    Kang, Sun-Mee; Shaver, Phillip R

    2004-08-01

    Two studies explored the nature and psychological implications of individual differences in emotional complexity, defined as having emotional experiences that are broad in range and well differentiated. Emotional complexity was predicted to be associated with private self-consciousness, openness to experience, empathic tendencies, cognitive complexity, ability to differentiate among named emotions, range of emotions experienced daily, and interpersonal adaptability. The Range and Differentiation of Emotional Experience Scale (RDEES) was developed to test these hypotheses. In Study 1 (N=1,129) students completed questionnaire packets containing the RDEES and various outcome measures. Study 2 (N=95) included the RDEES and non-self-report measures such as peer reports, complexity of representations of the emotion domain, and level of ego development measured by a sentence completion test. Results supported all of the hypotheses, providing extensive evidence for the RDEES's construct validity. Findings were discussed in terms of the role of emotional complexity in ego maturity and interpersonal adaptability.

  8. Validity and intra-rater reliability of an android phone application to measure cervical range-of-motion.

    PubMed

    Quek, June; Brauer, Sandra G; Treleaven, Julia; Pua, Yong-Hao; Mentiplay, Benjamin; Clark, Ross Allan

    2014-04-17

    Concurrent validity and intra-rater reliability of a customized Android phone application for measuring cervical-spine range-of-motion (ROM) have not previously been validated against a gold-standard three-dimensional motion analysis (3DMA) system. Twenty-one healthy individuals (age: 31 ± 9.1 years; 11 male) participated, with 16 re-examined for intra-rater reliability 1-7 days later. An Android phone was fixed to a helmet, which was then securely fastened on the participant's head. Cervical-spine ROM in flexion, extension, lateral flexion and rotation was performed in sitting, with concurrent measurements obtained from both the 3DMA system and the phone. The phone demonstrated moderate to excellent concurrent validity (ICC = 0.53-0.98, Spearman ρ = 0.52-0.98) for ROM measurements in cervical flexion, extension, lateral flexion and rotation. However, cervical rotation demonstrated both proportional and fixed bias. Intra-rater reliability was excellent for cervical flexion, extension and lateral flexion (ICC = 0.82-0.90), but poor for right and left rotation (ICC = 0.05-0.33). A likely explanation is that flexion, extension and lateral-flexion measurements are detected by gravity-dependent accelerometers, while rotation measurements are detected by the magnetometer, which can be adversely affected by surrounding magnetic fields. The results of this study demonstrate that the tested Android phone application is valid and reliable for measuring cervical-spine ROM in flexion, extension and lateral flexion, but not in rotation, likely due to magnetic interference. The clinical implication is that therapists should be mindful of the plane of measurement when using an Android phone to measure cervical-spine ROM.
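
The accelerometer-vs-magnetometer explanation in this record can be illustrated with the standard static-tilt equations: a stationary accelerometer senses only the gravity vector, so it can recover pitch and roll (flexion/extension, lateral flexion) but is blind to rotation about the vertical axis. This is a generic sketch with a conventional sign choice, not the app's actual implementation.

```python
import math

def pitch_roll_from_accel(ax, ay, az):
    """Estimate pitch and roll (degrees) from a static accelerometer reading.

    ax, ay, az: acceleration components in m/s^2 with the device at rest,
    so the only measured acceleration is gravity. Sign convention is an
    assumption for illustration. Rotation about the gravity vector (yaw,
    i.e. cervical rotation in upright sitting) leaves these readings
    unchanged, which is why a magnetometer or gyroscope is needed for it.
    """
    pitch = math.degrees(math.atan2(-ax, math.hypot(ay, az)))
    roll = math.degrees(math.atan2(ay, az))
    return pitch, roll

# Device lying flat (gravity entirely on z): no tilt in either plane
print(pitch_roll_from_accel(0.0, 0.0, 9.81))
```

Because the formula depends only on the direction of gravity, any yaw rotation produces the same (pitch, roll) pair; the magnetometer resolves that ambiguity but is susceptible to nearby magnetic fields, consistent with the poor rotation reliability reported above.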

  10. Atomistic structural ensemble refinement reveals non-native structure stabilizes a sub-millisecond folding intermediate of CheY

    DOE PAGES

    Shi, Jade; Nobrega, R. Paul; Schwantes, Christian; ...

    2017-03-08

    The dynamics of globular proteins can be described in terms of transitions between a folded native state and less-populated intermediates, or excited states, which can play critical roles in both protein folding and function. Excited states are by definition transient species, and therefore are difficult to characterize using current experimental techniques. We report an atomistic model of the excited state ensemble of a stabilized mutant of an extensively studied flavodoxin fold protein CheY. We employed a hybrid simulation and experimental approach in which an aggregate 42 milliseconds of all-atom molecular dynamics were used as an informative prior for the structure of the excited state ensemble. The resulting prior was then refined against small-angle X-ray scattering (SAXS) data employing an established method (EROS). The most striking feature of the resulting excited state ensemble was an unstructured N-terminus stabilized by non-native contacts in a conformation that is topologically simpler than the native state. We then predict incisive single molecule FRET experiments, using these results, as a means of model validation. Our study demonstrates the paradigm of uniting simulation and experiment in a statistical model to study the structure of protein excited states and rationally design validating experiments.

  12. Evaluation of complementary-alternative medicine (CAM) questionnaire development for Indonesian clinical psychologists: A pilot study.

    PubMed

    Liem, Andrian; Newcombe, Peter A; Pohlman, Annie

    2017-08-01

    This study aimed to evaluate the development of a questionnaire measuring the knowledge of Complementary-Alternative Medicine (CAM), attitudes towards CAM, CAM experiences, and CAM educational needs of clinical psychologists in Indonesia. A 26-item questionnaire was developed through an extensive literature search. Data were obtained from provisional psychologists in the Master of Professional Clinical Psychology programs at two established public universities in urban areas of Indonesia. To validate the questionnaire, panel reviews by executive members of the Indonesian Clinical Psychology Association (ICPA), experts in health psychology, and experts in public health and CAM provided professional judgements. The self-reporting questionnaire consisted of four scales: knowledge of CAM (6 items), attitudes towards CAM (10 items), CAM experiences (4 items), and CAM educational needs (6 items). All scales, except CAM Experiences, were assessed on a 7-point Likert scale. Sixty provisional psychologists were eligible to complete the questionnaire, with a response rate of 73% (N=44). The results showed that the CAM questionnaire was reliable (Cronbach's coefficient alpha range=0.62-0.96; item-total correlation range=0.14-0.92) and demonstrated content validity. Following further psychometric evaluation, the CAM questionnaire may provide evidence-based information to inform the education and practice of Indonesian clinical psychologists. Copyright © 2017 Elsevier Ltd. All rights reserved.
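
The reliability figures quoted in this record are Cronbach's coefficient alpha values. As a minimal sketch of the standard formula, alpha = k/(k-1) * (1 - sum of item variances / variance of total scores); the function and data below are illustrative, not the study's.

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    X = np.asarray(items, dtype=float)
    k = X.shape[1]                              # number of items in the scale
    item_vars = X.var(axis=0, ddof=1).sum()     # sum of per-item variances
    total_var = X.sum(axis=1).var(ddof=1)       # variance of respondents' totals
    return (k / (k - 1)) * (1 - item_vars / total_var)

# Invented example: two perfectly correlated items give alpha = 1
print(cronbach_alpha([[1, 1], [2, 2], [3, 3]]))
```

Alpha rises with both inter-item correlation and the number of items, which is one reason short scales (like the 4- to 10-item scales here) can show a wide range such as 0.62-0.96.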

  13. Airborne Observations and Satellite Validation: INTEX-A Experience and INTEX-B Plans

    NASA Technical Reports Server (NTRS)

    Crawford, James H.; Singh, Hanwant B.; Brune, William H.; Jacob, Daniel J.

    2005-01-01

    Intercontinental Chemical Transport Experiment (INTEX; http://cloudl.arc.nasa.gov) is an ongoing two-phase integrated atmospheric field experiment being performed over North America (NA). Its first phase (INTEX-A) was performed in the summer of 2004, and the second phase (INTEX-B) is planned for the early spring of 2006. The main goal of INTEX-NA is to understand the transport and transformation of gases and aerosols on transcontinental/intercontinental scales and to assess their impact on air quality and climate. Central to achieving this goal is the need to relate space-based observations with those from airborne and surface platforms. During INTEX-A, NASA's DC-8 was joined by about a dozen other aircraft from a large number of European and North American partners to focus on the outflow of pollution from NA to the Atlantic. Several instances of Asian pollution over NA were also encountered. INTEX-A flight planning relied extensively on satellite observations, and in turn satellite validation (Terra, Aqua, and Envisat) was given high priority; over 20 validation profiles were successfully carried out. DC-8 sampling of smoke from Alaskan fires and of formaldehyde over forested regions, together with simultaneous satellite observations, provided excellent opportunities for the interplay of these platforms. The planning for INTEX-B is currently underway, and a vast majority of "standard" and "research" products to be retrieved from Aura instruments will be measured during INTEX-B throughout the troposphere. INTEX-B will focus on the inflow of pollution from Asia to North America and on validation of satellite observations, with emphasis on Aura. Several national and international partners are expected to coordinate activities with INTEX-B, and we expect its scope to expand in the coming months. An important new development involves partnership with an NSF-sponsored campaign called MIRAGE (Megacity Impacts on Regional and Global Environments - Mexico City Pollution Outflow Field Campaign; http://mirage-mex.acd.ucar.edu/index.html). This partnership will utilize both the NASA DC-8 and NCAR C-130 aircraft and greatly expand the temporal and spatial coverage of these experiments. We will briefly describe our INTEX-A experience and discuss plans for INTEX-B activities, especially as they relate to validation and interpretation of Aura observations.

  14. Analysis of cocaine and metabolites in hair: validation and application of measurement of hydroxycocaine metabolites as evidence of cocaine ingestion.

    PubMed

    Schaffer, Michael; Cheng, Chen-Chih; Chao, Oscar; Hill, Virginia; Matsui, Paul

    2016-03-01

    An LC/MS/MS method to identify and quantitate in hair the minor metabolites of cocaine (meta-, para-, and ortho-hydroxycocaine) was developed and validated. Analysis was performed on a triple quadrupole ABSciex API 3000 MS equipped with an atmospheric pressure ionization source via an IonSpray electrospray ionization (ESI) interface. For LC, a Series 200 micro binary pump with a Perkin Elmer Model 200 autosampler was used. The limit of detection (LOD) and limit of quantification (LOQ) were 0.02 ng/10 mg hair, with linearity from 0.02 to 10 ng/10 mg hair. Concentrations of the para isomer in extensively washed hair samples were in the range of 1-2% of the cocaine in the sample, while concentrations of the ortho form were considerably lower. The method was used to analyze large numbers of samples from two populations: workplace and criminal justice. In vitro experiments to determine whether deodorants or peroxide-containing cosmetic treatments could produce these metabolites in hair showed that this does not occur with extensively washed hair. The presence of hydroxycocaines, when detected after aggressive washing of the hair samples, provides a valuable additional indicator of ingestion of cocaine rather than mere environmental exposure.

  15. A non-orthogonal material model of woven composites in the preforming process

    DOE PAGES

    Zhang, Weizhao; Ren, Huaqing; Liang, Biao; ...

    2017-05-04

    Woven composites are considered a promising material choice for lightweight applications. An improved non-orthogonal material model that can decouple the strong tension and weak shear behaviour of a woven composite under large shear deformation is proposed for simulating the preforming of woven composites. The tension, shear and compression moduli in the model are calibrated using tension, bias-extension and bending experiments, respectively. The interaction between the composite layers is characterized by a sliding test. The newly developed material model is implemented in the commercial finite element software LS-DYNA® and validated by a double dome study.

  16. Need for artificial gravity on a manned Mars mission?

    NASA Technical Reports Server (NTRS)

    Sharp, Joseph C.

    1986-01-01

    Drawing upon the extensive Soviet and Skylab medical observations, the need for artificial gravity (g) on a manned Mars mission is discussed. Little hard data derived from well-designed experiments exist. This dearth of information is primarily due to two factors: first, the inability to collect tissues from astronauts for ethical or operational reasons; second, the lack of opportunities to fly animals in space to systematically evaluate the extent of the problem and to develop, and then prove, the effectiveness of countermeasures. The Skylab and space station will provide the opportunity to study these questions and validate suggested solutions.

  17. Parent Reports of Young Spanish-English Bilingual Children's Productive Vocabulary: A Development and Validation Study.

    PubMed

    Mancilla-Martinez, Jeannette; Gámez, Perla B; Vagh, Shaher Banu; Lesaux, Nonie K

    2016-01-01

    This 2-phase study aims to extend research on parent report measures of children's productive vocabulary by investigating the development (n = 38) of the Spanish Vocabulary Extension and validity (n = 194) of the 100-item Spanish and English MacArthur-Bates Communicative Development Inventories Toddler Short Forms and Upward Extension (Fenson et al., 2000, 2007; Jackson-Maldonado, Marchman, & Fernald, 2013) and the Spanish Vocabulary Extension for use with parents from low-income homes and their 24- to 48-month-old Spanish-English bilingual children. Study participants were drawn from Early Head Start and Head Start collaborative programs in the Northeastern United States in which English was the primary language used in the classroom. All families reported Spanish or Spanish-English as their home language(s). The MacArthur Communicative Development Inventories as well as the researcher-designed Spanish Vocabulary Extension were used as measures of children's English and Spanish productive vocabularies. Findings revealed the forms' concurrent and discriminant validity, on the basis of standardized measures of vocabulary, as measures of productive vocabulary for this growing bilingual population. These findings suggest that parent reports, including our researcher-designed form, represent a valid, cost-effective mechanism for vocabulary monitoring purposes in early childhood education settings.

  18. Test of hadronic interaction models with the KASCADE-Grande muon data

    NASA Astrophysics Data System (ADS)

    Arteaga-Velázquez, J. C.; Apel, W. D.; Bekk, K.; Bertaina, M.; Blümer, J.; Bozdog, H.; Brancus, I. M.; Cantoni, E.; Chiavassa, A.; Cossavella, F.; Daumiller, K.; de Souza, V.; Di Pierro, F.; Doll, P.; Engel, R.; Engler, J.; Finger, M.; Fuchs, B.; Fuhrmann, D.; Gils, H. J.; Glasstetter, R.; Grupen, C.; Haungs, A.; Heck, D.; Hörandel, J. R.; Huber, D.; Huege, T.; Kampert, K.-H.; Kang, D.; Klages, H. O.; Link, K.; Łuczak, P.; Ludwig, M.; Mathes, H. J.; Mayer, H. J.; Melissas, M.; Milke, J.; Mitrica, B.; Morello, C.; Oehlschläger, J.; Ostapchenko, S.; Palmieri, N.; Petcu, M.; Pierog, T.; Rebel, H.; Roth, M.; Schieler, H.; Schoo, S.; Schröder, F. G.; Sima, O.; Toma, G.; Trinchero, G. C.; Ulrich, H.; Weindl, A.; Wochele, J.; Wommer, M.; Zabierowski, J.

    2013-06-01

    KASCADE-Grande is an air-shower observatory devoted to the detection of cosmic rays with energies in the interval of 10^14 - 10^18 eV, where the Grande array is responsible for the higher energy range. The experiment comprises different detection systems which allow precise measurements of the charged-particle, electron and muon numbers of extensive air showers (EAS). These data are employed not only to reconstruct the properties of the primary cosmic-ray particle but also to test hadronic interaction models at high energies. In this contribution, predictions of the muon content of EAS from QGSJET II-2, SIBYLL 2.1 and EPOS 1.99 are confronted with the experimental measurements performed with the KASCADE-Grande experiment in order to test the validity of these hadronic models commonly used in EAS simulations.

  19. Detecting Cheaters without Thinking: Testing the Automaticity of the Cheater Detection Module

    PubMed Central

    Van Lier, Jens; Revlin, Russell; De Neys, Wim

    2013-01-01

    Evolutionary psychologists have suggested that our brain is composed of evolved mechanisms. One extensively studied mechanism is the cheater detection module. This module would make people very good at detecting cheaters in a social exchange. A vast amount of research has illustrated performance facilitation on social contract selection tasks. This facilitation is attributed to the alleged automatic and isolated operation of the module (i.e., independent of general cognitive capacity). This study, using the selection task, tested the critical automaticity assumption in three experiments. Experiments 1 and 2 established that performance on social contract versions did not depend on cognitive capacity or age. Experiment 3 showed that experimentally burdening cognitive resources with a secondary task had no impact on performance on the social contract version. However, in all experiments, performance on a non-social contract version did depend on available cognitive capacity. Overall, findings validate the automatic and effortless nature of social exchange reasoning. PMID:23342012

  20. Kinematics of the thoracic T10-T11 motion segment: locus of instantaneous axes of rotation in flexion and extension.

    PubMed

    Qiu, Tian-Xia; Teo, Ee-Chon; Lee, Kim-Kheng; Ng, Hong-Wan; Yang, Kai

    2004-04-01

    The purpose of this study was to determine the locations and loci of instantaneous axes of rotation (IARs) of the T10-T11 motion segment in flexion and extension. An anatomically accurate three-dimensional model of thoracic T10-T11 functional spinal unit (FSU) was developed and validated against published experimental data under flexion, extension, lateral bending, and axial rotation loading configurations. The validated model was exercised under six load configurations that produced motions only in the sagittal plane to characterize the loci of IARs for flexion and extension. The IARs for both flexion and extension under these six load types were directly below the geometric center of the moving vertebra, and all the loci of IARs were tracked superoanteriorly for flexion and inferoposteriorly for extension with rotation. These findings may offer an insight to better understanding of the kinematics of the human thoracic spine and provide clinically relevant information for the evaluation of spinal stability and implant device functionality.

  1. Convergent validity between a discrete choice experiment and a direct, open-ended method: comparison of preferred attribute levels and willingness to pay estimates.

    PubMed

    van der Pol, Marjon; Shiell, Alan; Au, Flora; Johnston, David; Tough, Suzanne

    2008-12-01

    The Discrete Choice Experiment (DCE) has become increasingly popular as a method for eliciting patient or population preferences. If DCE estimates are to inform health policy, it is crucial that the answers they provide are valid. Convergent validity is tested in this paper by comparing the results of a DCE exercise with the answers obtained from direct, open-ended questions. The two methods are compared in terms of preferred attribute levels and willingness to pay (WTP) values. Face-to-face interviews were held with 292 women in Calgary, Canada. Similar values were found between the two methods with respect to preferred levels for two of the three attributes examined. The DCE predicted less well for levels outside the range than for levels inside the range, reaffirming the importance of extensive piloting to ensure an appropriate level range in DCEs. The mean WTP derived from the open-ended question was substantially lower than the mean derived from the DCE. However, the two sets of willingness to pay estimates were consistent with each other in that individuals who were willing to pay more in the open-ended question were also willing to pay more in the DCE. The difference in mean WTP values between the two approaches (direct versus DCE) demonstrates the importance of continuing research into the different biases present across elicitation methods.

  2. Performance prediction of a ducted rocket combustor

    NASA Astrophysics Data System (ADS)

    Stowe, Robert

    2001-07-01

    The ducted rocket is a supersonic flight propulsion system that takes the exhaust from a solid fuel gas generator, mixes it with air, and burns it to produce thrust. To develop such systems, the use of numerical models based on Computational Fluid Dynamics (CFD) is increasingly popular, but their application to reacting flow requires specific attention and validation. Through a careful examination of the governing equations and experimental measurements, a CFD-based method was developed to predict the performance of a ducted rocket combustor. It uses an equilibrium-chemistry Probability Density Function (PDF) combustion model, with a gaseous stream and a separate stream of 75 nm diameter carbon spheres representing the fuel. After extensive validation against water tunnel and direct-connect combustion experiments over a wide range of geometries and test conditions, this CFD-based method was able to predict, with a good degree of accuracy, the combustion efficiency of a ducted rocket combustor.

  3. Multiband product rule and consonant identification.

    PubMed

    Li, Feipeng; Allen, Jont B

    2009-07-01

    The multiband product rule, also known as band independence, is a basic assumption of the articulation index and its extension, the speech intelligibility index. Previously, Fletcher showed its validity for a balanced mix of 20% consonant-vowel (CV), 20% vowel-consonant (VC), and 60% consonant-vowel-consonant (CVC) sounds. This study repeats Miller and Nicely's version of the high-/low-pass experiment with minor changes to study band independence for the 16 Miller-Nicely consonants. The cut-off frequencies are chosen such that the basilar membrane is evenly divided into 12 segments from 250 to 8000 Hz, with the high-pass and low-pass filters sharing the same six cut-off frequencies in the middle. Results show that the multiband product rule is statistically valid for consonants on average. It also applies to subgroups of consonants, such as stops and fricatives, which are characterized by a flat distribution of speech cues along the frequency axis. It fails for individual consonants.
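    The product rule tested in this record can be sketched in a few lines: under band independence, the probability of misrecognizing a consonant with all bands present equals the product of the error probabilities measured with each band alone. All numeric values below are hypothetical illustrations, not data from the study.

```python
# Illustrative sketch of the multiband product rule (band independence):
# with all bands audible, the consonant error probability is the product
# of the error probabilities measured with each band presented alone.

def total_error(band_errors):
    """Predicted error probability with all bands present, assuming independence."""
    prod = 1.0
    for e in band_errors:
        prod *= e
    return prod

# Hypothetical error rates for complementary low-/high-pass conditions
# sharing the same cut-off frequency (not values from the study):
e_lowpass = 0.20   # error with only the low-pass band audible
e_highpass = 0.35  # error with only the high-pass band audible

e_full = total_error([e_lowpass, e_highpass])
print(round(e_full, 2))  # 0.07
```

    Testing the rule then amounts to comparing the measured full-band error rate against this predicted product across the shared cut-off frequencies.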

  4. Experimental, Numerical, and Analytical Slosh Dynamics of Water and Liquid Nitrogen in a Spherical Tank

    NASA Technical Reports Server (NTRS)

    Storey, Jedediah Morse

    2016-01-01

    Understanding, predicting, and controlling fluid slosh dynamics is critical to safety and improving performance of space missions when a significant percentage of the spacecraft's mass is a liquid. Computational fluid dynamics simulations can be used to predict the dynamics of slosh, but these programs require extensive validation. Many experimental and numerical studies of water slosh have been conducted. However, slosh data for cryogenic liquids is lacking. Water and cryogenic liquid nitrogen are used in various ground-based tests with a spherical tank to characterize damping, slosh mode frequencies, and slosh forces. A single ring baffle is installed in the tank for some of the tests. Analytical models for slosh modes, slosh forces, and baffle damping are constructed based on prior work. Select experiments are simulated using a commercial CFD software, and the numerical results are compared to the analytical and experimental results for the purposes of validation and methodology-improvement.

  5. A Phenomenological Model and Validation of Shortening Induced Force Depression during Muscle Contractions

    PubMed Central

    McGowan, C.P.; Neptune, R.R.; Herzog, W.

    2009-01-01

    History dependent effects on muscle force development following active changes in length have been measured in a number of experimental studies. However, few muscle models have included these properties or examined their impact on force and power output in dynamic cyclic movements. The goal of this study was to develop and validate a modified Hill-type muscle model that includes shortening induced force depression and assess its influence on locomotor performance. The magnitude of force depression was defined by empirical relationships based on muscle mechanical work. To validate the model, simulations incorporating force depression were developed to emulate single muscle in situ and whole muscle group leg extension experiments. There was excellent agreement between simulation and experimental values, with in situ force patterns closely matching the experimental data (average RMS error < 1.5 N) and force depression in the simulated leg extension exercise being similar in magnitude to experimental values (6.0% vs 6.5%, respectively). To examine the influence of force depression on locomotor performance, simulations of maximum power pedaling with and without force depression were generated. Force depression decreased maximum crank power by 20% – 40%, depending on the relationship between force depression and muscle work used. These results indicate that force depression has the potential to substantially influence muscle power output in dynamic cyclic movements. However, to fully understand the impact of this phenomenon on human movement, more research is needed to characterize the relationship between force depression and mechanical work in large muscles with different morphologies. PMID:19879585
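    The work-based empirical relationship described in this abstract can be illustrated with a minimal sketch (this is not the authors' implementation; the function names and the gain `k_fd` are hypothetical and would be fit per muscle): force depression is taken as proportional to the mechanical work done during active shortening, and the nominal Hill-type force is scaled down accordingly.

```python
# Hedged sketch of shortening-induced force depression in a Hill-type model:
# depression magnitude is assumed proportional to shortening work, per the
# empirical relationships the abstract refers to. Values are illustrative.

def shortening_work(force_trace, length_trace):
    """Mechanical work (in consistent units) done while actively shortening."""
    work = 0.0
    for i in range(1, len(length_trace)):
        dl = length_trace[i] - length_trace[i - 1]
        if dl < 0:  # count only shortening intervals
            work += -force_trace[i] * dl
    return work

def depressed_force(hill_force, work, k_fd=0.06):
    """Scale the nominal Hill-type force by a work-dependent depression factor."""
    return hill_force * max(0.0, 1.0 - k_fd * work)

# Hypothetical example: constant 10 N force while shortening 2 cm in two steps.
w = shortening_work([10.0, 10.0, 10.0], [0.10, 0.09, 0.08])  # 0.2 J
print(depressed_force(100.0, w))  # nominal 100 N force, mildly depressed
```

    In the actual model the depression persists after shortening ends, which is what produces the history dependence measured in the in situ and leg extension experiments.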

  6. 41 CFR 60-3.14 - Technical standards for validity studies.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... likely to affect validity differences; or that these factors are included in the design of the study and... construct validity is both an extensive and arduous effort involving a series of research studies, which... validity studies. 60-3.14 Section 60-3.14 Public Contracts and Property Management Other Provisions...

  7. 41 CFR 60-3.14 - Technical standards for validity studies.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... likely to affect validity differences; or that these factors are included in the design of the study and... construct validity is both an extensive and arduous effort involving a series of research studies, which... validity studies. 60-3.14 Section 60-3.14 Public Contracts and Property Management Other Provisions...

  8. 41 CFR 60-3.14 - Technical standards for validity studies.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... likely to affect validity differences; or that these factors are included in the design of the study and... construct validity is both an extensive and arduous effort involving a series of research studies, which... validity studies. 60-3.14 Section 60-3.14 Public Contracts and Property Management Other Provisions...

  9. 41 CFR 60-3.14 - Technical standards for validity studies.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... likely to affect validity differences; or that these factors are included in the design of the study and... construct validity is both an extensive and arduous effort involving a series of research studies, which... validity studies. 60-3.14 Section 60-3.14 Public Contracts and Property Management Other Provisions...

  10. 41 CFR 60-3.14 - Technical standards for validity studies.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... likely to affect validity differences; or that these factors are included in the design of the study and... construct validity is both an extensive and arduous effort involving a series of research studies, which... validity studies. 60-3.14 Section 60-3.14 Public Contracts and Property Management Other Provisions...

  11. Narrative Constructions for the Organization of Self Experience: Proof of Concept via Embodied Robotics.

    PubMed

    Mealier, Anne-Laure; Pointeau, Gregoire; Mirliaz, Solène; Ogawa, Kenji; Finlayson, Mark; Dominey, Peter F

    2017-01-01

    It has been proposed that, starting from meaning that the child derives directly from shared experience with others, adult narrative enriches this meaning and its structure, providing causal links between unseen intentional states and actions. This would require a means for representing meaning from experience (a situation model) and a mechanism that allows information to be extracted from sentences and mapped onto the situation model that has been derived from experience, thus enriching that representation. We present a hypothesis and theory concerning how the language processing infrastructure for grammatical constructions can naturally be extended to narrative constructions to provide a mechanism for using language to enrich meaning derived from physical experience. Toward this aim, the grammatical construction models are augmented with additional structures for representing relations between events across sentences. Simulation results demonstrate proof of concept for how the narrative construction model supports multiple successive levels of meaning creation, which allows the system to learn about the intentionality of mental states, and argument substitution, which allows extensions to metaphorical language and analogical problem solving. Cross-linguistic validity of the system is demonstrated in Japanese. The narrative construction model is then integrated into the cognitive system of a humanoid robot that provides the memory systems and world interaction required for representing meaning in a situation model. In this context, proof of concept is demonstrated for how the system enriches meaning in the situation model that has been directly derived from experience. In terms of links to empirical data, the model predicts strong usage-based effects: that is, the narrative constructions used by children will be highly correlated with those that they experience. It also relies on the notion of narrative or discourse function words. Both of these are validated in the experimental literature.

  12. Estimation of Unsteady Aerodynamic Models from Dynamic Wind Tunnel Data

    NASA Technical Reports Server (NTRS)

    Murphy, Patrick; Klein, Vladislav

    2011-01-01

    Demanding aerodynamic modelling requirements for military and civilian aircraft have motivated researchers to improve computational and experimental techniques and to pursue closer collaboration in these areas. Model identification and validation techniques are key components for this research. This paper presents mathematical model structures and identification techniques that have been used successfully to model more general aerodynamic behaviours in single-degree-of-freedom dynamic testing. Model parameters, characterizing aerodynamic properties, are estimated using linear and nonlinear regression methods in both time and frequency domains. Steps in identification including model structure determination, parameter estimation, and model validation, are addressed in this paper with examples using data from one-degree-of-freedom dynamic wind tunnel and water tunnel experiments. These techniques offer a methodology for expanding the utility of computational methods in application to flight dynamics, stability, and control problems. Since flight test is not always an option for early model validation, time history comparisons are commonly made between computational and experimental results and model adequacy is inferred by corroborating results. An extension is offered to this conventional approach where more general model parameter estimates and their standard errors are compared.

  13. The Development of a Pediatric Inpatient Experience of Care Measure: Child HCAHPS®

    PubMed Central

    Toomey, Sara L.; Zaslavsky, Alan M.; Elliott, Marc N.; Gallagher, Patricia M.; Fowler, Floyd J.; Klein, David J.; Shulman, Shanna; Ratner, Jessica; McGovern, Caitriona; LeBlanc, Jessica L.; Schuster, Mark A.

    2016-01-01

    CMS uses Adult HCAHPS® scores for public reporting and pay-for-performance for most U.S. hospitals, but no publicly available standardized survey of inpatient experience of care exists for pediatrics. To fill the gap, CMS/AHRQ commissioned the development of the Consumer Assessment of Healthcare Providers and Systems Hospital Survey – Child Version (Child HCAHPS), a survey of parents/guardians of pediatric patients (<18 years old) who were recently hospitalized. This Special Article describes the development of Child HCAHPS, which included an extensive review of the literature and quality measures, expert interviews, focus groups, cognitive testing, pilot testing of the draft survey, a national field test with 69 hospitals in 34 states, psychometric analysis, and end-user testing of the final survey. We conducted extensive validity and reliability testing to determine which items would be included in the final survey instrument and to develop composite measures. We analyzed national field test data from 17,727 surveys collected from 11/12-1/14 from parents of recently hospitalized children. The final Child HCAHPS instrument has 62 items, including 39 patient experience items, 10 screeners, 12 demographic/descriptive items, and 1 open-ended item. The 39 experience items are categorized based on testing into 18 composite and single-item measures. Our composite and single-item measures demonstrated good to excellent hospital-level reliability at 300 responses per hospital. Child HCAHPS was developed to be a publicly available standardized survey of pediatric inpatient experience of care. It can be used to benchmark pediatric inpatient experience across hospitals and assist in efforts to improve the quality of inpatient care. PMID:26195542

  14. Self-evaluation of assessment programs: a cross-case analysis.

    PubMed

    Baartman, Liesbeth K J; Prins, Frans J; Kirschner, Paul A; van der Vleuten, Cees P M

    2011-08-01

    The goal of this article is to contribute to the validation of a self-evaluation method, which can be used by schools to evaluate the quality of their Competence Assessment Program (CAP). The outcomes of the self-evaluations of two schools are systematically compared: a novice school with little experience in competence-based education and assessment, and an innovative school with extensive experience. The self-evaluation was based on 12 quality criteria for CAPs, including both validity and reliability, and criteria stressing the importance of the formative function of assessment, such as meaningfulness and educational consequences. In each school, teachers, management and examination board participated. Results show that the two schools use different approaches to assure assessment quality. The innovative school seems to be more aware of its own strengths and weaknesses, to have a more positive attitude towards teachers, students, and educational innovations, and to explicitly involve stakeholders (i.e., teachers, students, and the work field) in their assessments. This school also had a more explicit vision of the goal of competence-based education and could design its assessments in accordance with these goals. Copyright © 2011 Elsevier Ltd. All rights reserved.

  15. Spatial-temporal discriminant analysis for ERP-based brain-computer interface.

    PubMed

    Zhang, Yu; Zhou, Guoxu; Zhao, Qibin; Jin, Jing; Wang, Xingyu; Cichocki, Andrzej

    2013-03-01

    Linear discriminant analysis (LDA) has been widely adopted to classify event-related potentials (ERPs) in brain-computer interfaces (BCIs). Good classification performance of an ERP-based BCI usually requires sufficient data recordings for effective training of the LDA classifier, and hence a long system calibration time, which may reduce the system's practicality and cause user resistance to the BCI system. In this study, we introduce spatial-temporal discriminant analysis (STDA) for ERP classification. As a multiway extension of LDA, the STDA method maximizes the discriminant information between target and nontarget classes by collaboratively finding two projection matrices, one each for the spatial and temporal dimensions. This effectively reduces the feature dimensionality in the discriminant analysis and hence significantly decreases the number of required training samples. The proposed STDA method was validated on dataset II of the BCI Competition III and on a dataset recorded in our own experiments, and compared to state-of-the-art algorithms for ERP classification. Online experiments were additionally implemented for validation. The superior classification performance with few training samples shows that STDA is effective in reducing system calibration time and improving classification accuracy, thereby enhancing the practicality of ERP-based BCIs.

  16. Validation and augmentation of Inrix arterial travel time data using independent sources : [research summary].

    DOT National Transportation Integrated Search

    2015-02-01

    Although freeway travel time data have been validated extensively in recent years, the quality of arterial travel time data is not well known. This project presents a comprehensive validation scheme for arterial travel time data based on GPS...

  17. Validity and Reliability of a New Device (WIMU®) for Measuring Hamstring Muscle Extensibility.

    PubMed

    Muyor, José M

    2017-09-01

    The aims of the current study were 1) to evaluate the validity of the WIMU® system for measuring hamstring muscle extensibility in the passive straight leg raise (PSLR) test, using an inclinometer as the criterion, and 2) to determine the test-retest reliability of the WIMU® system for measuring hamstring muscle extensibility during the PSLR test. Fifty-five subjects were evaluated on 2 separate occasions. Data from a Unilever inclinometer and the WIMU® system were collected simultaneously. Intraclass correlation coefficients (ICCs) for validity were very high (0.983-1); a very low systematic bias (-0.21° / -0.42°), random error (0.05° / 0.04°) and standard error of the estimate (0.43° / 0.34°) were observed between the 2 devices (inclinometer and WIMU® system) for the left and right legs, respectively. The R² between the devices was 0.999 (p<0.001) in both the left and right legs. The test-retest reliability of the WIMU® system was excellent, with ICCs ranging from 0.972-0.995, low coefficients of variation (0.01%), and a low standard error of the estimate (0.19°-0.31°). The WIMU® system showed strong concurrent validity and excellent test-retest reliability for the evaluation of hamstring muscle extensibility in the PSLR test. © Georg Thieme Verlag KG Stuttgart · New York.
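    As a note on the statistic reported here, one common way to quantify device agreement on the same subjects is a two-way, single-measure consistency ICC (ICC(3,1) in Shrout and Fleiss notation). The sketch below is illustrative only; the paired angle data are hypothetical, not from the study, and the study's exact ICC model is not specified in this record.

```python
# Hedged sketch of ICC(3,1): two-way mixed-effects, single-measure,
# consistency form. Rows are subjects, columns are devices/raters.

def icc_3_1(ratings):
    """ratings: list of per-subject rows, e.g. [device_a, device_b]."""
    n = len(ratings)          # number of subjects
    k = len(ratings[0])       # number of devices/raters
    grand = sum(sum(row) for row in ratings) / (n * k)
    row_means = [sum(row) / k for row in ratings]
    col_means = [sum(row[j] for row in ratings) / n for j in range(k)]
    ss_rows = k * sum((m - grand) ** 2 for m in row_means)
    ss_cols = n * sum((m - grand) ** 2 for m in col_means)
    ss_total = sum((x - grand) ** 2 for row in ratings for x in row)
    ms_rows = ss_rows / (n - 1)                          # between-subjects MS
    ms_err = (ss_total - ss_rows - ss_cols) / ((n - 1) * (k - 1))  # residual MS
    return (ms_rows - ms_err) / (ms_rows + (k - 1) * ms_err)

# Hypothetical paired PSLR angles (degrees): inclinometer vs. second device.
angles = [[60.1, 60.3], [72.5, 72.4], [81.0, 81.3], [55.2, 55.1], [68.7, 68.9]]
print(round(icc_3_1(angles), 3))
```

    With tightly agreeing pairs like these, the ICC approaches 1, which is the pattern the abstract reports for the two devices.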

  18. Perception of Sexual Orientation from Facial Structure: A Study with Artificial Face Models.

    PubMed

    González-Álvarez, Julio

    2017-07-01

    Research has shown that lay people can perceive sexual orientation better than chance from face stimuli. However, the relation between facial structure and sexual orientation has been scarcely examined. Recently, an extensive morphometric study on a large sample of Canadian people (Skorska, Geniole, Vrysen, McCormick, & Bogaert, 2015) identified three (in men) and four (in women) facial features as unique multivariate predictors of sexual orientation in each sex group. The present study tested the perceptual validity of these facial traits with two experiments based on realistic artificial 3D face models created by manipulating the key parameters and presented to Spanish participants. Experiment 1 included 200 White and Black face models of both sexes. The results showed an overall accuracy (0.74) clearly above chance in a binary hetero/homosexual judgment task and significant differences depending on the race and sex of the face models. Experiment 2 produced five versions of 24 artificial faces of both sexes varying the key parameters in equal steps, and participants had to rate on a 1-7 scale how likely they thought that the depicted person had a homosexual sexual orientation. Rating scores displayed an almost perfect linear regression as a function of the parameter steps. In summary, both experiments demonstrated the perceptual validity of the seven multivariate predictors identified by Skorska et al. and open up new avenues for further research on this issue with artificial face models.

  19. Parametric study of different contributors to tumor thermal profile

    NASA Astrophysics Data System (ADS)

    Tepper, Michal; Gannot, Israel

    2014-03-01

    Treating cancer is one of the major challenges of modern medicine. There is great interest in assessing tumor development in in vivo animal and human models, as well as in in vitro experiments. Existing methods are limited either by cost and availability or by low accuracy and reproducibility. Thermography holds the potential of being a noninvasive, low-cost, nonirradiating, and easy-to-use method for tumor monitoring. Tumors can be detected in thermal images due to their relatively higher or lower temperature compared to that of the healthy skin surrounding them. Extensive research has been performed to show the validity of thermography as an efficient method for tumor detection and the possibility of extracting tumor properties from thermal images, with promising results. However, generalizing from one type of experiment to others is difficult due to differences in tumor properties, especially between different types of tumors or different species. There is a need for research linking different types of tumor experiments. In this research, a parametric analysis of possible contributors to tumor thermal profiles was performed. The effect of tumor geometric, physical, and thermal properties was studied, both independently and together, in phantom model experiments and computer simulations. Theoretical and experimental results were cross-correlated to validate the models used and increase the accuracy of simulated complex tumor models. The contribution of different parameters in various tumor scenarios was estimated, and the implication of these differences for the observed thermal profiles was studied. The correlation between animal and human models is discussed.

  20. Numerical Investigation of Plasma Detachment in Magnetic Nozzle Experiments

    NASA Technical Reports Server (NTRS)

    Sankaran, Kamesh; Polzin, Kurt A.

    2008-01-01

    At present there exists no generally accepted theoretical model that provides a consistent physical explanation of plasma detachment from an externally-imposed magnetic nozzle. To make progress towards that end, simulation of plasma flow in the magnetic nozzle of an arcjet experiment is performed using a multidimensional numerical simulation tool that includes theoretical models of the various dispersive and dissipative processes present in the plasma. This is an extension of the simulation tool employed in previous work by Sankaran et al. The aim is to compare the computational results with various proposed magnetic nozzle detachment theories to develop an understanding of the physical mechanisms that cause detachment. An applied magnetic field topology is obtained using a magnetostatic field solver (see Fig. I), and this field is superimposed on the time-dependent magnetic field induced in the plasma to provide a self-consistent field description. The applied magnetic field and model geometry match those found in experiments by Kuriki and Okada. This geometry is modeled because there is a substantial amount of experimental data that can be compared to the computational results, allowing for validation of the model. In addition, comparison of the simulation results with the experimentally obtained plasma parameters will provide insight into the mechanisms that lead to plasma detachment, revealing how they scale with different input parameters. Further studies will focus on modeling literature experiments both for the purpose of additional code validation and to extract physical insight regarding the mechanisms driving detachment.

  1. In Defense of an Instrument-Based Approach to Validity

    ERIC Educational Resources Information Center

    Hood, S. Brian

    2012-01-01

    Paul E. Newton argues in favor of a conception of validity, viz, "the consensus definition of validity," according to which the extension of the predicate "is valid" is a subset of "assessment-based decision-making procedure[s], which [are] underwritten by an argument that the assessment procedure can be used to measure the attribute entailed by…

  2. Contrast-enhanced spectral mammography in recalls from the Dutch breast cancer screening program: validation of results in a large multireader, multicase study.

    PubMed

    Lalji, U C; Houben, I P L; Prevos, R; Gommers, S; van Goethem, M; Vanwetswinkel, S; Pijnappel, R; Steeman, R; Frotscher, C; Mok, W; Nelemans, P; Smidt, M L; Beets-Tan, R G; Wildberger, J E; Lobbes, M B I

    2016-12-01

    Contrast-enhanced spectral mammography (CESM) is a promising problem-solving tool in women referred from a breast cancer screening program. We aimed to study the validity of preliminary results of CESM using a larger panel of radiologists with different levels of CESM experience. All women referred from the Dutch breast cancer screening program were eligible for CESM. 199 consecutive cases were viewed by ten radiologists. Four had extensive CESM experience, three had no CESM experience but were experienced breast radiologists, and three were residents. All readers provided a BI-RADS score for the low-energy CESM images first, after which the score could be adjusted when viewing the entire CESM exam. BI-RADS 1-3 were considered benign and BI-RADS 4-5 malignant. With this cutoff, we calculated sensitivity, specificity and area under the ROC curve. CESM increased diagnostic accuracy in all readers. The performance for all readers using CESM was: sensitivity 96.9 % (+3.9 %), specificity 69.7 % (+33.8 %) and area under the ROC curve 0.833 (+0.188). CESM is superior to conventional mammography, with excellent problem-solving capabilities in women referred from the breast cancer screening program. Previous results were confirmed even in a larger panel of readers with varying CESM experience. • CESM is consistently superior to conventional mammography • CESM increases diagnostic accuracy regardless of a reader's experience • CESM is an excellent problem-solving tool in recalls from screening programs.
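    The BI-RADS 4-5 cutoff described above reduces each reading to a binary classification, from which sensitivity and specificity follow directly. A minimal sketch of that computation, using made-up reader scores rather than the study's data:

```python
def sensitivity_specificity(truths, birads_scores, cutoff=4):
    """Dichotomize BI-RADS scores (>= cutoff -> malignant) and compare
    against ground truth (True = malignant)."""
    preds = [s >= cutoff for s in birads_scores]
    tp = sum(p and t for p, t in zip(preds, truths))
    tn = sum((not p) and (not t) for p, t in zip(preds, truths))
    fp = sum(p and (not t) for p, t in zip(preds, truths))
    fn = sum((not p) and t for p, t in zip(preds, truths))
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    return sensitivity, specificity

# Hypothetical scores for 8 cases (first 4 malignant, last 4 benign)
truth = [True, True, True, True, False, False, False, False]
scores = [5, 4, 4, 3, 2, 1, 4, 2]
sens, spec = sensitivity_specificity(truth, scores)  # 0.75, 0.75
```

The same cutoff logic, applied per reader before and after the contrast-enhanced images are viewed, yields the paired accuracy comparison the abstract reports.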

  3. Louisiana Extension Educators' Perceptions of the Benefit and Relevance of Participating in an International Extension Experience toward Their Career

    ERIC Educational Resources Information Center

    McClure, Carli; Danjean, Shelli; Bunch, J. C.; Machtmes, Krisanna; Kotrlik, Joe W.

    2014-01-01

    The purpose of this study was to assess Extension educators' perceptions of the benefit and relevance of an international Extension experience (IEE) toward their career. It was concluded that almost two-thirds of Extension educators perceive that participation in an IEE is beneficial and relevant to their careers. Further, Extension educators…

  4. A Snapshot of Organizational Climate: Perceptions of Extension Faculty

    ERIC Educational Resources Information Center

    Tower, Leslie E.; Bowen, Elaine; Alkadry, Mohamad G.

    2011-01-01

    This article provides a snapshot of the perceptions of workplace climate of Extension faculty at a land-grant, research-high activity university, compared with the perceptions of non-Extension faculty at the same university. An online survey was conducted with a validated instrument. The response rate for university faculty was 44% (968); the…

  5. Flight test experience with high-alpha control system techniques on the F-14 airplane

    NASA Technical Reports Server (NTRS)

    Gera, J.; Wilson, R. J.; Enevoldson, E. K.; Nguyen, L. T.

    1981-01-01

    Improved handling qualities of fighter aircraft at high angles of attack can be provided by various stability and control augmentation techniques. NASA and the U.S. Navy are conducting a joint flight demonstration of these techniques on an F-14 airplane. This paper reports on the flight test experience with a newly designed lateral-directional control system which suppresses such high angle of attack handling qualities problems as roll reversal, wing rock, and directional divergence while simultaneously improving departure/spin resistance. The technique of integrating a piloted simulation into the flight program was used extensively in this program. This technique had not been applied previously to high angle of attack testing and required the development of a valid model to simulate the test airplane at extremely high angles of attack.

  6. Joint multifractal analysis based on wavelet leaders

    NASA Astrophysics Data System (ADS)

    Jiang, Zhi-Qiang; Yang, Yan-Hong; Wang, Gang-Jin; Zhou, Wei-Xing

    2017-12-01

    Mutually interacting components form complex systems and these components usually have long-range cross-correlated outputs. Using wavelet leaders, we propose a method for characterizing the joint multifractal nature of these long-range cross correlations; we call this method joint multifractal analysis based on wavelet leaders (MF-X-WL). We test the validity of the MF-X-WL method by performing extensive numerical experiments on dual binomial measures with multifractal cross correlations and bivariate fractional Brownian motions (bFBMs) with monofractal cross correlations. Both experiments indicate that MF-X-WL is capable of detecting cross correlations in synthetic data with acceptable estimating errors. We also apply the MF-X-WL method to pairs of series from financial markets (returns and volatilities) and online worlds (online numbers of different genders and different societies) and determine intriguing joint multifractal behavior.

  7. Architecture Design and Experimental Platform Demonstration of Optical Network based on OpenFlow Protocol

    NASA Astrophysics Data System (ADS)

    Xing, Fangyuan; Wang, Honghuan; Yin, Hongxi; Li, Ming; Luo, Shenzi; Wu, Chenguang

    2016-02-01

    With the extensive application of cloud computing and data centres, as well as constantly emerging services, big data with its bursty characteristics has brought huge challenges to optical networks. Consequently, the software defined optical network (SDON), which combines optical networks with software defined networking (SDN), has attracted much attention. In this paper, an OpenFlow-enabled optical node for use in optical cross-connects (OXC) and reconfigurable optical add/drop multiplexers (ROADM) is proposed. An open-source OpenFlow controller is extended with routing strategies. In addition, an experimental platform based on the OpenFlow protocol for software defined optical networks is designed. The feasibility and availability of the OpenFlow-enabled optical nodes and the extended OpenFlow controller are validated by connectivity tests, protection switching and load balancing experiments on this test platform.

  8. Crack propagation analysis using acoustic emission sensors for structural health monitoring systems.

    PubMed

    Kral, Zachary; Horn, Walter; Steck, James

    2013-01-01

    Aerospace systems are expected to remain in service well beyond their designed life. Consequently, maintenance is an important issue. A novel method of implementing artificial neural networks and acoustic emission sensors to form a structural health monitoring (SHM) system for aerospace inspection routines was the focus of this research. Simple structural elements, consisting of flat aluminum plates of AL 2024-T3, were subjected to increasing static tensile loading. As the loading increased, designed cracks extended in length, releasing strain waves in the process. Strain wave signals, measured by acoustic emission sensors, were further analyzed in post-processing by artificial neural networks (ANN). Several experiments were performed to determine the severity and location of the crack extensions in the structure. ANNs were trained on a portion of the data acquired by the sensors and the ANNs were then validated with the remaining data. The combination of a system of acoustic emission sensors, and an ANN could determine crack extension accurately. The difference between predicted and actual crack extensions was determined to be between 0.004 in. and 0.015 in. with 95% confidence. These ANNs, coupled with acoustic emission sensors, showed promise for the creation of an SHM system for aerospace systems.

  9. Structural aspects of Lorentz-violating quantum field theory

    NASA Astrophysics Data System (ADS)

    Cambiaso, M.; Lehnert, R.; Potting, R.

    2018-01-01

    In the last couple of decades the Standard Model Extension has emerged as a fruitful framework to analyze the empirical and theoretical extent of the validity of cornerstones of modern particle physics, namely, of Special Relativity and of the discrete symmetries C, P and T (or some combinations of these). The Standard Model Extension allows to contrast high-precision experimental tests with posited alterations representing minute Lorentz and/or CPT violations. To date no violation of these symmetry principles has been observed in experiments, mostly prompted by the Standard-Model Extension. From the latter, bounds on the extent of departures from Lorentz and CPT symmetries can be obtained with ever increasing accuracy. These analyses have been mostly focused on tree-level processes. In this presentation I would like to comment on structural aspects of perturbative Lorentz violating quantum field theory. I will show that some insight coming from radiative corrections demands a careful reassessment of perturbation theory. Specifically I will argue that both the standard renormalization procedure as well as the Lehmann-Symanzik-Zimmermann reduction formalism need to be adapted given that the asymptotic single-particle states can receive quantum corrections from Lorentz-violating operators that are not present in the original Lagrangian.

  10. The Brazilian Experience with Agroecological Extension: A Critical Analysis of Reform in a Pluralistic Extension System

    ERIC Educational Resources Information Center

    Diesel, Vivien; Miná Dias, Marcelo

    2016-01-01

    Purpose: To analyze the Brazilian experience in designing and implementing a recent extension policy reform based on agroecology, and reflect on its wider theoretical implications for extension reform literature. Design/methodology/approach: Using a critical public analysis we characterize the evolution of Brazilian federal extension policy…

  11. Light-quark, heavy-quark systems: An update

    NASA Astrophysics Data System (ADS)

    Grinstein, B.

    1993-06-01

    We review many of the recently developed applications of Heavy Quark Effective Theory techniques. After a brief update on Luke's theorem, we describe striking relations between heavy baryon form factors, and how to use them to estimate the accuracy of the extraction of |Vcb|. We discuss factorization and compare with experiment. An elementary presentation, with sample applications, of reparametrization invariance comes next. The final and most extensive chapter in this review deals with phenomenological lagrangians that incorporate heavy-quark spin-flavor as well as light quark chiral symmetries. We compile many interesting results and discuss the validity of the calculations.

  12. Parallelization of Unsteady Adaptive Mesh Refinement for Unstructured Navier-Stokes Solvers

    NASA Technical Reports Server (NTRS)

    Schwing, Alan M.; Nompelis, Ioannis; Candler, Graham V.

    2014-01-01

    This paper explores the implementation of MPI parallelization in a Navier-Stokes solver using adaptive mesh refinement. Viscous and inviscid test problems are considered for the purpose of benchmarking, as are implicit and explicit time advancement methods. The main test problem for comparison includes effects from boundary layers and other viscous features and requires a large number of grid points for accurate computation. Experimental validation against double cone experiments in hypersonic flow is shown. The adaptive mesh refinement shows promise for a staple test problem in the hypersonic community. Extension to more advanced techniques for more complicated flows is described.

  13. Report of the panel on international programs

    NASA Technical Reports Server (NTRS)

    Anderson, Allen Joel; Fuchs, Karl W.; Ganeka, Yasuhiro; Gaur, Vinod; Green, Andrew A.; Siegfried, W.; Lambert, Anthony; Rais, Jacub; Reighber, Christopher; Seeger, Herman

    1991-01-01

    The panel recommends that NASA participate and take an active role in the continuous monitoring of existing regional networks, the realization of high resolution geopotential and topographic missions, the establishment of interconnection of the reference frames as defined by different space techniques, the development and implementation of automation for all ground-to-space observing systems, calibration and validation experiments for measuring techniques and data, the establishment of international space-based networks for real-time transmission of high density space data in standardized formats, tracking and support for non-NASA missions, and the extension of state-of-the art observing and analysis techniques to developing nations.

  14. Dynamic gene expression response to altered gravity in human T cells.

    PubMed

    Thiel, Cora S; Hauschild, Swantje; Huge, Andreas; Tauber, Svantje; Lauber, Beatrice A; Polzer, Jennifer; Paulsen, Katrin; Lier, Hartwin; Engelmann, Frank; Schmitz, Burkhard; Schütte, Andreas; Layer, Liliana E; Ullrich, Oliver

    2017-07-12

    We investigated the dynamics of immediate and initial gene expression response to different gravitational environments in human Jurkat T lymphocytic cells and compared expression profiles to identify potential gravity-regulated genes and adaptation processes. We used the Affymetrix GeneChip® Human Transcriptome Array 2.0 containing 44,699 protein coding genes and 22,829 non-protein coding genes and performed the experiments during a parabolic flight and a suborbital ballistic rocket mission to cross-validate gravity-regulated gene expression through independent research platforms and different sets of control experiments to exclude other factors than alteration of gravity. We found that gene expression in human T cells rapidly responded to altered gravity in the time frame of 20 s and 5 min. The initial response to microgravity involved mostly regulatory RNAs. We identified three gravity-regulated genes which could be cross-validated in both completely independent experiment missions: ATP6V1A/D, a vacuolar H+-ATPase (V-ATPase) responsible for acidification during bone resorption, IGHD3-3/IGHD3-10, diversity genes of the immunoglobulin heavy-chain locus participating in V(D)J recombination, and LINC00837, a long intergenic non-protein coding RNA. Due to the extensive and rapid alteration of gene expression associated with regulatory RNAs, we conclude that human cells are equipped with a robust and efficient adaptation potential when challenged with altered gravitational environments.

  15. Experimental studies of Micro- and Nano-grained UO2: Grain Growth Behavior, Surface Morphology, and Fracture Toughness

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miao, Yinbin; Mo, Kun; Jamison, Laura M.

    This activity is supported by the US Nuclear Energy Advanced Modeling and Simulation (NEAMS) Fuels Product Line (FPL) and aims at providing experimental data for the validation of the mesoscale simulation code MARMOT. MARMOT is a mesoscale multiphysics code that predicts the coevolution of microstructure and properties within reactor fuel during its lifetime in the reactor. It is an important component of the Moose-Bison-Marmot (MBM) code suite that has been developed by Idaho National Laboratory (INL) to enable next-generation fuel performance modeling capability as part of the NEAMS Program FPL. In order to ensure the accuracy of the microstructure-based materials models being developed within the MARMOT code, extensive validation efforts must be carried out. In this report, we summarize the experimental efforts in FY16, including the following important experiments: (1) in-situ grain growth measurement of nano-grained UO2; (2) investigation of surface morphology in micro-grained UO2; (3) nano-indentation experiments on nano- and micro-grained UO2. The highlight of this year is that we have successfully demonstrated our capability to measure grain size development in situ while maintaining the stoichiometry of nano-grained UO2 materials; this experiment is the first to use synchrotron X-ray diffraction to measure the grain growth behavior of UO2 in situ.

  16. Narrative Constructions for the Organization of Self Experience: Proof of Concept via Embodied Robotics

    PubMed Central

    Mealier, Anne-Laure; Pointeau, Gregoire; Mirliaz, Solène; Ogawa, Kenji; Finlayson, Mark; Dominey, Peter F.

    2017-01-01

    It has been proposed that starting from meaning that the child derives directly from shared experience with others, adult narrative enriches this meaning and its structure, providing causal links between unseen intentional states and actions. This would require a means for representing meaning from experience—a situation model—and a mechanism that allows information to be extracted from sentences and mapped onto the situation model that has been derived from experience, thus enriching that representation. We present a hypothesis and theory concerning how the language processing infrastructure for grammatical constructions can naturally be extended to narrative constructions to provide a mechanism for using language to enrich meaning derived from physical experience. Toward this aim, the grammatical construction models are augmented with additional structures for representing relations between events across sentences. Simulation results demonstrate proof of concept for how the narrative construction model supports multiple successive levels of meaning creation which allows the system to learn about the intentionality of mental states, and argument substitution which allows extensions to metaphorical language and analogical problem solving. Cross-linguistic validity of the system is demonstrated in Japanese. The narrative construction model is then integrated into the cognitive system of a humanoid robot that provides the memory systems and world-interaction required for representing meaning in a situation model. In this context proof of concept is demonstrated for how the system enriches meaning in the situation model that has been directly derived from experience. In terms of links to empirical data, the model predicts strong usage based effects: that is, that the narrative constructions used by children will be highly correlated with those that they experience. It also relies on the notion of narrative or discourse function words. 
Both of these are validated in the experimental literature. PMID:28861011

  17. Valid Knowledge: The Economy and the Academy

    ERIC Educational Resources Information Center

    Williams, Peter John

    2007-01-01

    The future of Western universities as public institutions is the subject of extensive continuing debate, underpinned by the issue of what constitutes "valid knowledge". Where in the past only propositional knowledge codified by academics was considered valid, in the new economy enabled by information and communications technology, the procedural…

  18. Initial Development and Validation of the Global Citizenship Scale

    ERIC Educational Resources Information Center

    Morais, Duarte B.; Ogden, Anthony C.

    2011-01-01

    The purpose of this article is to report on the initial development of a theoretically grounded and empirically validated scale to measure global citizenship. The methodology employed is multi-faceted, including two expert face validity trials, extensive exploratory and confirmatory factor analyses with multiple datasets, and a series of three…

  19. Initial Teacher Licensure Testing in Tennessee: Test Validation.

    ERIC Educational Resources Information Center

    Bowman, Harry L.; Petry, John R.

    In 1988 a study was conducted to determine the validity of candidate teacher licensure examinations for use in Tennessee under the 1984 Comprehensive Education Reform Act. The Department of Education conducted a study to determine the validity of 11 previously unvalidated or extensively revised tests for certification and to make recommendations…

  20. Analysis and Validation of a Predictive Model for Growth and Death of Aeromonas hydrophila under Modified Atmospheres at Refrigeration Temperatures

    PubMed Central

    Pin, Carmen; Velasco de Diego, Raquel; George, Susan; García de Fernando, Gonzalo D.; Baranyi, József

    2004-01-01

    Specific growth and death rates of Aeromonas hydrophila were measured in laboratory media under various combinations of temperature, pH, and percent CO2 and O2 in the atmosphere. Predictive models were developed from the data and validated by means of observations obtained from (i) seafood experiments set up for this purpose and (ii) the ComBase database (http://www.combase.cc; http://wyndmoor.arserrc.gov/combase/). Two main reasons were identified for the differences between the predicted and observed growth in food: they were the variability of the growth rates in food and the bias of the model predictions when applied to food environments. A statistical method is presented to quantitatively analyze these differences. The method was also used to extend the interpolation region of the model. In this extension, the concept of generalized Z values (C. Pin, G. García de Fernando, J. A. Ordóñez, and J. Baranyi, Food Microbiol. 18:539-545, 2001) played an important role. The extension depended partly on the density of the model-generating observations and partly on the accuracy of extrapolated predictions close to the boundary of the interpolation region. The boundary of the growth region of the organism was also estimated by means of experimental results for growth and death rates. PMID:15240265
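    In predictive microbiology, the bias between predicted and observed growth rates mentioned above is conventionally summarized with bias and accuracy factors (geometric-mean ratios of prediction to observation). A hedged sketch of that convention; the function names and example values are illustrative and not taken from this paper:

```python
import math

def bias_factor(predicted, observed):
    """Bf = 10^(mean log10(pred/obs)); 1.0 indicates no systematic bias,
    >1 means the model over-predicts growth rate on average."""
    logs = [math.log10(p / o) for p, o in zip(predicted, observed)]
    return 10 ** (sum(logs) / len(logs))

def accuracy_factor(predicted, observed):
    """Af = 10^(mean |log10(pred/obs)|); the average fold-difference
    between prediction and observation (always >= 1)."""
    logs = [abs(math.log10(p / o)) for p, o in zip(predicted, observed)]
    return 10 ** (sum(logs) / len(logs))

# Example: one 2x over-prediction and one 2x under-prediction
# cancel out in Bf but both count toward Af.
bf = bias_factor([2.0, 2.0], [1.0, 4.0])      # -> 1.0
af = accuracy_factor([2.0, 2.0], [1.0, 4.0])  # -> 2.0
```

Working in log space is what makes ratios above and below 1 symmetric, which is why these factors are the standard way to compare model predictions against food-matrix observations.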

  1. Criterion validity study of the cervical range of motion (CROM) device for rotational range of motion on healthy adults.

    PubMed

    Tousignant, Michel; Smeesters, Cécil; Breton, Anne-Marie; Breton, Emilie; Corriveau, Hélène

    2006-04-01

    This study compared range of motion (ROM) measurements using a cervical range of motion device (CROM) and an optoelectronic system (OPTOTRAK). To examine the criterion validity of the CROM for the measurement of cervical ROM on healthy adults. Whereas measurements of cervical ROM are recognized as part of the assessment of patients with neck pain, few devices are available in clinical settings. Two papers published previously showed excellent criterion validity for measurements of cervical flexion/extension and lateral flexion using the CROM. Subjects performed neck rotation, flexion/extension, and lateral flexion while sitting on a wooden chair. The ROM values were measured by the CROM as well as the OPTOTRAK. The cervical rotational ROM values using the CROM demonstrated a good to excellent linear relationship with those using the OPTOTRAK: right rotation, r = 0.89 (95% confidence interval, 0.81-0.94), and left rotation, r = 0.94 (95% confidence interval, 0.90-0.97). Similar results were also obtained for flexion/extension and lateral flexion ROM values. The CROM showed excellent criterion validity for measurements of cervical rotation. We propose using ROM values measured by the CROM as outcome measures for patients with neck pain.
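    The criterion validity reported above rests on Pearson correlations with 95% confidence intervals (e.g., r = 0.89 for right rotation). A minimal sketch of how such an interval is obtained via the Fisher z-transform; the sample size used below is a hypothetical stand-in, not the study's:

```python
import math

def pearson_r(x, y):
    """Pearson product-moment correlation of two equal-length samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return sxy / (sx * sy)

def r_confidence_interval(r, n, z_crit=1.96):
    """95% CI for r: Fisher z-transform, normal interval, back-transform."""
    z = math.atanh(r)
    se = 1.0 / math.sqrt(n - 3)
    return math.tanh(z - z_crit * se), math.tanh(z + z_crit * se)

# Hypothetical n = 31 paired CROM/OPTOTRAK readings at r = 0.89
lo, hi = r_confidence_interval(0.89, 31)
```

The back-transformed interval is asymmetric around r, which is why reported bounds like (0.81, 0.94) sit closer to 1 on the upper side.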

  2. The effect of geriatric rehabilitation on physical performance and pain in men and women.

    PubMed

    Niemelä, Kristiina; Leinonen, Raija; Laukkanen, Pia

    2011-01-01

    In developed countries, people are living longer and the number of aged persons is growing. Knowledge of the effectiveness of rehabilitative procedures is needed, and information on differences in physical performance between men and women is scarce. An intervention study was carried out in two war veterans' rehabilitation centers in Finland to examine the effects of geriatric inpatient rehabilitation on physical performance and pain in elderly men and women. The study included 441 community-dwelling persons with a mean age of 83 years. A clinical assessment and a structured interview were carried out. Cognitive capacity was evaluated with the mini-mental state examination (MMSE). Physical performance was measured through several validated tests. Pain was measured with the visual analogue scale (VAS). The rehabilitation was carried out with the standard rehabilitation protocol. Both men and women showed a statistically significant improvement in physical performance tests. The experience of pain and disease symptoms diminished significantly in both sexes (p<0.001). Women improved more than men, with significant group-by-time interactions in knee extension strength (p=0.033), the experience of pain reduction (p=0.002) and disease symptoms (p=0.040). Inpatient geriatric rehabilitation appeared to have a positive effect on physical performance and the experience of pain in elderly people. The differences between the sexes in the experience of pain, disease symptoms and knee extension strength could provide a new perspective in the planning of more individual rehabilitation interventions.

  3. Comparison of simulator fidelity model predictions with in-simulator evaluation data

    NASA Technical Reports Server (NTRS)

    Parrish, R. V.; Mckissick, B. T.; Ashworth, B. R.

    1983-01-01

    A full-factorial, in-simulator experiment on a single-axis, multiloop, compensatory pitch tracking task is described. The experiment was conducted to provide data to validate extensions to an analytic, closed-loop model of a real-time digital simulation facility. The results of the experiment, encompassing various simulation fidelity factors such as visual delay, digital integration algorithms, computer iteration rates, control loading bandwidths and proprioceptive cues, and g-seat kinesthetic cues, are compared with predictions obtained from the analytic model incorporating an optimal control model of the human pilot. The in-simulator results demonstrate more sensitivity to the g-seat and to the control loader conditions than was predicted by the model. However, the model predictions are generally upheld, although the predicted magnitudes of the states and of the error terms are sometimes off considerably. Of particular concern are the large sensitivity difference for one control loader condition, as well as the model/in-simulator mismatch in the magnitude of the plant states when the other states match.

  4. A new, low-cost sun photometer for student use

    NASA Astrophysics Data System (ADS)

    Espinoza, A.; Pérez-Álvarez, H.; Parra-Vilchis, J. I.; Fauchey-López, E.; Fernando-González, L.; Faus-Landeros, G. E.; Celarier, E. A.; Robinson, D. Q.; Zepeda-Galbez, R.

    2011-12-01

    We have designed a sun photometer for the measurement of aerosol optical thickness (AOT) at 505 nm and 620 nm, using custom-made glass filters (9.5 nm bandpass, FWHM) and photodiodes. The recommended price-point (US$150-US$200) allowed us to incorporate technologies such as microcontrollers, a sun target, a USB port for data uploading, nonvolatile memory to contain tables of up to 127 geolocation profiles, extensive calibration data, and a log of up to 2,000 measurements. The instrument is designed to be easy to use, and to provide instant display of AOT estimates. A diffuser in the fore-optics limits the sensitivity to pointing error. We have developed postprocessing software to refine the AOT estimates, format a spreadsheet file, and upload the data to the GLOBE website. We are currently finalizing hardware and firmware, and conducting extensive calibration/validation experiments. These instruments will soon be in production and available to the K-12 education community, including and especially the GLOBE program.

  5. Modified linear predictive coding approach for moving target tracking by Doppler radar

    NASA Astrophysics Data System (ADS)

    Ding, Yipeng; Lin, Xiaoyi; Sun, Ke-Hui; Xu, Xue-Mei; Liu, Xi-Yao

    2016-07-01

    Doppler radar is a cost-effective tool for moving target tracking, which can support a large range of civilian and military applications. A modified linear predictive coding (LPC) approach is proposed to increase the target localization accuracy of Doppler radar. Based on time-frequency analysis of the received echo, the proposed approach first estimates the noise statistics in real time and constructs an adaptive filter to suppress the noise interference. Then, a linear predictive model is applied to extend the available data, which helps improve the resolution of the target localization result. Whereas the traditional LPC method decides the extension data length empirically, the proposed approach builds an error array to evaluate the prediction accuracy and thus adjusts the extension data length adaptively. Finally, the prediction error array is superimposed on the predictor output to correct the prediction error. A series of experiments is conducted to illustrate the validity and performance of the proposed techniques.
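    The core LPC idea the abstract builds on is extrapolating a signal from coefficients fitted to its autocorrelation. A minimal order-2 sketch of that step (this illustrates plain LPC extension only, not the paper's adaptive length selection or error-array correction):

```python
import math

def autocorr(x, lag):
    """Autocorrelation sum r_k = sum_i x[i] * x[i-k]."""
    return sum(x[i] * x[i - lag] for i in range(lag, len(x)))

def lpc_order2(x):
    """Solve the order-2 Yule-Walker equations [[r0,r1],[r1,r0]]a = [r1,r2]
    for the prediction coefficients a1, a2."""
    r0, r1, r2 = autocorr(x, 0), autocorr(x, 1), autocorr(x, 2)
    det = r0 * r0 - r1 * r1
    a1 = (r1 * r0 - r2 * r1) / det
    a2 = (r2 * r0 - r1 * r1) / det
    return a1, a2

def extend(x, a1, a2, n):
    """Append n predicted samples: x[k] = a1*x[k-1] + a2*x[k-2]."""
    y = list(x)
    for _ in range(n):
        y.append(a1 * y[-1] + a2 * y[-2])
    return y

# A pure tone obeys x[k] = 2cos(w)x[k-1] - x[k-2], so the fitted
# coefficients should approach a1 = 2cos(w), a2 = -1.
x = [math.cos(0.5 * i) for i in range(2000)]
a1, a2 = lpc_order2(x)
extended = extend(x, a1, a2, 100)
```

Higher orders replace the 2x2 solve with the Levinson-Durbin recursion; the extension step is identical.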

  6. Flight test experience and controlled impact of a large, four-engine, remotely piloted airplane

    NASA Technical Reports Server (NTRS)

    Kempel, R. W.; Horton, T. W.

    1985-01-01

    A controlled impact demonstration (CID) program using a large, four-engine, remotely piloted transport airplane was conducted. Closed-loop primary flight control was performed from a ground-based cockpit and digital computer in conjunction with an up/down telemetry link. Uplink commands were received aboard the airplane and transferred through uplink interface systems to a highly modified Bendix PB-20D autopilot. Both proportional and discrete commands were generated by the ground pilot. Prior to flight tests, extensive simulation was conducted during the development of ground-based digital control laws. The control laws included primary control, secondary control, and racetrack and final approach guidance. Extensive ground checks were performed on all remotely piloted systems. However, manned flight tests were the primary method of verification and validation of control law concepts developed from simulation. The design, development, and flight testing of the control laws and the systems required to accomplish the remotely piloted mission are discussed.

  7. Performance analysis and experimental study on rainfall water purification with an extensive green roof matrix layer in Shanghai, China.

    PubMed

    Guo, Jiankang; Zhang, Yanting; Che, Shengquan

    2018-02-01

    Current research has validated, to some extent, the purification of rainwater by the substrate layer of green roofs, though the effects of the substrate layer on rainwater purification have not been adequately quantified. The present study set up nine extensive green roof experimental combinations based on the precipitation characteristics observed in Shanghai, China. Rainfall events with different pollutant loads were simulated, and an orthogonal-design L9 (3^3) test was conducted to measure purification performance. The purification influence of the extensive green roof substrate layer was quantitatively analyzed for Shanghai to optimize the thickness, substrate proportions, and sodium polyacrylate content. The experiments achieved ammonium nitrogen (NH4+-N), lead (Pb), and zinc (Zn) removal of up to 93.87%, 98.81%, and 94.55% from the artificial rainfall, respectively, and the NH4+-N, Pb, and Zn event mean concentrations (EMC) were depressed to 0.263 mg/L, 0.002 mg/L, and 0.018 mg/L, respectively, all well below the pollutant concentrations of the artificial rainfall. With reference to the rainfall chemical characteristics of Shanghai, a combination of 200 mm thickness, loam:perlite:cocopeat proportions of 1:1:2, and 2 g/L sodium polyacrylate content is suggested for the design of an extensive green roof substrate to purify NH4+-N, Pb, and Zn.
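
    The removal figures follow the usual concentration-based definition of removal efficiency. As a sanity check, the influent concentration implied by the reported NH4+-N numbers (93.87% removal, effluent EMC 0.263 mg/L) can be backed out; treating the EMC as the effluent concentration is our simplifying assumption here.

    ```python
    def percent_removal(c_in, c_out):
        """Removal efficiency (%) of a pollutant across the substrate layer."""
        return 100.0 * (c_in - c_out) / c_in

    # Influent NH4+-N concentration implied by 93.87 % removal and an
    # effluent EMC of 0.263 mg/L (illustrative back-calculation):
    c_in_nh4 = 0.263 / (1.0 - 0.9387)   # about 4.3 mg/L
    ```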

  8. Age-related reduction of trunk muscle torque and prevalence of trunk sarcopenia in community-dwelling elderly: Validity of a portable trunk muscle torque measurement instrument and its application to a large sample cohort study

    PubMed Central

    Sasaki, Shizuka; Chiba, Daisuke; Yamamoto, Yuji; Nawata, Atsushi; Tsuda, Eiichi; Nakaji, Shigeyuki; Ishibashi, Yasuyuki

    2018-01-01

    Trunk muscle weakness and imbalance are risk factors for postural instability, low back pain, and poor postoperative outcomes. The association between trunk muscle strength and aging is poorly understood, and establishing normal reference values is difficult. We aimed to establish the validity of a novel portable trunk muscle torque measurement instrument (PTMI). We then estimated reference data for healthy young adults and elucidated age-related weakness in trunk muscle strength. Twenty-four university students were enrolled to validate values for PTMI, and 816 volunteers from the general population who were recruited to the Iwaki Health Promotion Project were included to estimate reference data for trunk muscle strength. Trunk flexion and extension torque were measured with PTMI and KinCom, and intraclass correlation coefficients (ICC) were estimated to evaluate the reliability of PTMI values. Furthermore, from the young adult reference, the age-related reduction in trunk muscle torque and the prevalence of sarcopenia among age-sex groups were estimated. The ICC in flexion and extension torque were 0.807 (p<0.001) and 0.789 (p<0.001), respectively. The prevalence of sarcopenia increased with age, and the prevalence due to flexion torque was double that of extension torque. Flexion torque decreased significantly after 60 years of age, and extension torque decreased after 70 years of age. In males over age 80, trunk muscle torque decreased to 49.1% in flexion and 63.5% in extension. In females over age 80, trunk muscle torque decreased to 60.7% in flexion and 68.4% in extension. The validity of PTMI was confirmed by correlation with KinCom. PTMI produced reference data for healthy young adults, and demonstrated age-related reduction in trunk muscle torque. Trunk sarcopenia progressed with aging, and the loss of flexion torque began earlier than extension torque. At age 80, trunk muscle torque had decreased 60% compared with healthy young adults. PMID:29471310
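
    The reported reliability figures are intraclass correlation coefficients. For a one-way random-effects design, ICC(1,1) follows from the between-subject and within-subject mean squares; a minimal NumPy sketch of that generic formula (not the study's statistics software) is:

    ```python
    import numpy as np

    def icc_oneway(scores):
        """One-way random-effects ICC(1,1).
        scores: (n_subjects, k_measurements) array-like."""
        scores = np.asarray(scores, dtype=float)
        n, k = scores.shape
        grand = scores.mean()
        row_means = scores.mean(axis=1)
        msb = k * ((row_means - grand) ** 2).sum() / (n - 1)              # between subjects
        msw = ((scores - row_means[:, None]) ** 2).sum() / (n * (k - 1))  # within subjects
        return (msb - msw) / (msb + (k - 1) * msw)
    ```

    With perfectly repeatable measurements the within-subject mean square vanishes and the ICC is 1; measurement noise inflates it and drives the ICC toward 0, which is why the flexion ICC of 0.49 in experiment-style retests signals only moderate reliability.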

  9. Age-related reduction of trunk muscle torque and prevalence of trunk sarcopenia in community-dwelling elderly: Validity of a portable trunk muscle torque measurement instrument and its application to a large sample cohort study.

    PubMed

    Sasaki, Eiji; Sasaki, Shizuka; Chiba, Daisuke; Yamamoto, Yuji; Nawata, Atsushi; Tsuda, Eiichi; Nakaji, Shigeyuki; Ishibashi, Yasuyuki

    2018-01-01

    Trunk muscle weakness and imbalance are risk factors for postural instability, low back pain, and poor postoperative outcomes. The association between trunk muscle strength and aging is poorly understood, and establishing normal reference values is difficult. We aimed to establish the validity of a novel portable trunk muscle torque measurement instrument (PTMI). We then estimated reference data for healthy young adults and elucidated age-related weakness in trunk muscle strength. Twenty-four university students were enrolled to validate values for PTMI, and 816 volunteers from the general population who were recruited to the Iwaki Health Promotion Project were included to estimate reference data for trunk muscle strength. Trunk flexion and extension torque were measured with PTMI and KinCom, and intraclass correlation coefficients (ICC) were estimated to evaluate the reliability of PTMI values. Furthermore, from the young adult reference, the age-related reduction in trunk muscle torque and the prevalence of sarcopenia among age-sex groups were estimated. The ICC in flexion and extension torque were 0.807 (p<0.001) and 0.789 (p<0.001), respectively. The prevalence of sarcopenia increased with age, and the prevalence due to flexion torque was double that of extension torque. Flexion torque decreased significantly after 60 years of age, and extension torque decreased after 70 years of age. In males over age 80, trunk muscle torque decreased to 49.1% in flexion and 63.5% in extension. In females over age 80, trunk muscle torque decreased to 60.7% in flexion and 68.4% in extension. The validity of PTMI was confirmed by correlation with KinCom. PTMI produced reference data for healthy young adults, and demonstrated age-related reduction in trunk muscle torque. Trunk sarcopenia progressed with aging, and the loss of flexion torque began earlier than extension torque. At age 80, trunk muscle torque had decreased 60% compared with healthy young adults.

  10. Fitmunk: improving protein structures by accurate, automatic modeling of side-chain conformations.

    PubMed

    Porebski, Przemyslaw Jerzy; Cymborowski, Marcin; Pasenkiewicz-Gierula, Marta; Minor, Wladek

    2016-02-01

    Improvements in crystallographic hardware and software have allowed automated structure-solution pipelines to approach a near-'one-click' experience for the initial determination of macromolecular structures. However, in many cases the resulting initial model requires a laborious, iterative process of refinement and validation. A new method has been developed for the automatic modeling of side-chain conformations that takes advantage of rotamer-prediction methods in a crystallographic context. The algorithm, which is based on deterministic dead-end elimination (DEE) theory, uses new dense conformer libraries and a hybrid energy function derived from experimental data and prior information about rotamer frequencies to find the optimal conformation of each side chain. In contrast to existing methods, which incorporate the electron-density term into protein-modeling frameworks, the proposed algorithm is designed to take advantage of the highly discriminatory nature of electron-density maps. This method has been implemented in the program Fitmunk, which uses extensive conformational sampling. This improves the accuracy of the modeling and makes it a versatile tool for crystallographic model building, refinement and validation. Fitmunk was extensively tested on over 115 new structures, as well as a subset of 1100 structures from the PDB. It is demonstrated that the ability of Fitmunk to model more than 95% of side chains accurately is beneficial for improving the quality of crystallographic protein models, especially at medium and low resolutions. Fitmunk can be used for model validation of existing structures and as a tool to assess whether side chains are modeled optimally or could be better fitted into electron density. Fitmunk is available as a web service at http://kniahini.med.virginia.edu/fitmunk/server/ or at http://fitmunk.bitbucket.org/.
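
    The dead-end elimination idea underlying Fitmunk can be illustrated with the classic Desmet criterion: a rotamer is discarded when its best achievable energy is still worse than some competitor's worst case. The toy sketch below uses only self and pairwise energies; Fitmunk's hybrid energy function, electron-density terms, and dense rotamer libraries are far richer.

    ```python
    def dee_eliminate(e_self, e_pair):
        """Classic dead-end elimination over rotamer choices.
        e_self[i][r]        : self-energy of rotamer r at position i.
        e_pair[(i, j)][r][s]: pair energy of rotamer r at i with s at j.
        Returns the set of surviving rotamers at each position."""
        n = len(e_self)
        alive = [set(range(len(rs))) for rs in e_self]
        changed = True
        while changed:
            changed = False
            for i in range(n):
                for r in list(alive[i]):
                    for t in alive[i] - {r}:
                        # r is a dead end if even its best case is worse
                        # than t's worst case.
                        best_r = e_self[i][r] + sum(
                            min(e_pair[(i, j)][r][s] for s in alive[j])
                            for j in range(n) if j != i)
                        worst_t = e_self[i][t] + sum(
                            max(e_pair[(i, j)][t][s] for s in alive[j])
                            for j in range(n) if j != i)
                        if best_r > worst_t:
                            alive[i].remove(r)
                            changed = True
                            break
        return alive
    ```

    Repeating the sweep until no rotamer can be eliminated shrinks the combinatorial search space, often down to a single conformation per side chain.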

  11. Vibration control of beams using constrained layer damping with functionally graded viscoelastic cores: theory and experiments

    NASA Astrophysics Data System (ADS)

    El-Sabbagh, A.; Baz, A.

    2006-03-01

    Conventionally, the viscoelastic cores of Constrained Layer Damping (CLD) treatments are made of materials that have uniform shear modulus. Under such conditions, it is well-recognized that these treatments are only effective near their edges where the shear strains attain their highest values. In order to enhance the damping characteristics of the CLD treatments, we propose to manufacture the cores from Functionally Graded ViscoElastic Materials (FGVEM) that have an optimally selected gradient of the shear modulus over the length of the treatments. With such an optimized distribution of the shear modulus, the shear strain can be enhanced, and the energy dissipation can be maximized. The theory governing the vibration of beams treated with CLD that has functionally graded viscoelastic cores is presented using the finite element method (FEM). The predictions of the FEM are validated experimentally for plain beams, beams treated with conventional CLD, and beams with CLD/FGVEM of different configurations. The obtained results indicate a close agreement between theory and experiments. Furthermore, the obtained results demonstrate the effectiveness of the new class of CLD with functionally graded cores in enhancing the energy dissipation over the conventional CLD over a broad frequency band. Extension of the proposed one-dimensional beam/CLD/FGVEM system to more complex structures is a natural extension of the present study.

  12. LIVVkit: An extensible, python-based, land ice verification and validation toolkit for ice sheet models

    NASA Astrophysics Data System (ADS)

    Kennedy, Joseph H.; Bennett, Andrew R.; Evans, Katherine J.; Price, Stephen; Hoffman, Matthew; Lipscomb, William H.; Fyke, Jeremy; Vargo, Lauren; Boghozian, Adrianna; Norman, Matthew; Worley, Patrick H.

    2017-06-01

    To address the pressing need to better understand the behavior and complex interaction of ice sheets within the global Earth system, significant development of continental-scale, dynamical ice sheet models is underway. Concurrent to the development of the Community Ice Sheet Model (CISM), the corresponding verification and validation (V&V) process is being coordinated through a new, robust, Python-based extensible software package, the Land Ice Verification and Validation toolkit (LIVVkit). Incorporated into the typical ice sheet model development cycle, it provides robust and automated numerical verification, software verification, performance validation, and physical validation analyses on a variety of platforms, from personal laptops to the largest supercomputers. LIVVkit operates on sets of regression test and reference data sets, and provides comparisons for a suite of community prioritized tests, including configuration and parameter variations, bit-for-bit evaluation, and plots of model variables to indicate where differences occur. LIVVkit also provides an easily extensible framework to incorporate and analyze results of new intercomparison projects, new observation data, and new computing platforms. LIVVkit is designed for quick adaptation to additional ice sheet models via abstraction of model specific code, functions, and configurations into an ice sheet model description bundle outside the main LIVVkit structure. Ultimately, through shareable and accessible analysis output, LIVVkit is intended to help developers build confidence in their models and enhance the credibility of ice sheet models overall.

  13. Specification and Preliminary Validation of IAT (Integrated Analysis Techniques) Methods: Executive Summary.

    DTIC Science & Technology

    1985-03-01

    conceptual framework, and preliminary validation of IAT concepts. Planned work for FY85, including more extensive validation, is also described. The stated approach: (1) identify needs and requirements for IAT; (2) develop the IAT conceptual framework; (3) validate IAT methods; (4) develop applications materials.

  14. 77 FR 46750 - Agency Information Collection Extension

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-06

    ... Questionnaire Testing, Evaluation, and Research.'' The proposed collection will utilize qualitative and quantitative methodologies to pretest questionnaires and validate EIA survey forms data quality, including..., Evaluation, and Research; (3) Type of Request: Extension, Without Change, of a Previously Approved Collection...

  15. Development of the Packed Bed Reactor ISS Flight Experiment

    NASA Technical Reports Server (NTRS)

    Patton, Martin O.; Bruzas, Anthony E.; Rame, Enrique; Motil, Brian J.

    2012-01-01

    Packed bed reactors are compact, require minimum power and maintenance to operate, and are highly reliable. These features make this technology a leading candidate as a potential unit operation in support of long duration human space exploration. On earth, this type of reactor accounts for approximately 80% of all the reactors used in the chemical process industry today. Development of this technology for space exploration is truly crosscutting with many other potential applications (e.g., in-situ chemical processing of planetary materials and transport of nutrients through soil). NASA is developing an ISS experiment to address this technology with particular focus on water reclamation and air revitalization. Earlier research and development efforts funded by NASA have resulted in two hydrodynamic models which require validation with appropriate instrumentation in an extended microgravity environment. The first model, developed by Motil et al. (2003), is based on a modified Ergun equation. This model was demonstrated at moderate gas and liquid flow rates, but extension to the lower flow rates expected in many advanced life support systems must be validated. The other model, developed by Guo et al. (2004), is based on Darcy's (1856) law for two-phase flow. This model has been validated for a narrow range of flow parameters indirectly (without full instrumentation) and included test points where the flow was not fully developed. The flight experiment presented will be designed with removable test sections to test the hydrodynamic models. The experiment will provide flexibility to test additional beds with different types of packing in the future. One initial test bed is based on the VRA (Volatile Removal Assembly), a packed bed reactor currently on ISS whose behavior in microgravity is not fully understood. Improving the performance of this system through an accurate model will increase our ability to purify water in the space environment.

  16. Validity in Mixed Methods Research in Education: The Application of Habermas' Critical Theory

    ERIC Educational Resources Information Center

    Long, Haiying

    2017-01-01

    Mixed methods approach has developed into the third methodological movement in educational research. Validity in mixed methods research as an important issue, however, has not been examined as extensively as that of quantitative and qualitative research. Additionally, the previous discussions of validity in mixed methods research focus on research…

  17. Developing and Validating a Survey of Korean Early Childhood English Teachers' Knowledge

    ERIC Educational Resources Information Center

    Kim, Jung In

    2015-01-01

    The main purpose of this study is to develop and validate a valid measure of the early childhood (EC) English teacher knowledge. Through extensive literature review on second/foreign language (L2/FL) teacher knowledge, early childhood teacher knowledge and early childhood language teacher knowledge, and semi-structured interviews from current…

  18. Intrasubject multimodal groupwise registration with the conditional template entropy.

    PubMed

    Polfliet, Mathias; Klein, Stefan; Huizinga, Wyke; Paulides, Margarethus M; Niessen, Wiro J; Vandemeulebroucke, Jef

    2018-05-01

    Image registration is an important task in medical image analysis. Whereas most methods are designed for the registration of two images (pairwise registration), there is an increasing interest in simultaneously aligning more than two images using groupwise registration. Multimodal registration in a groupwise setting remains difficult, due to the lack of generally applicable similarity metrics. In this work, a novel similarity metric for such groupwise registration problems is proposed. The metric calculates the sum of the conditional entropy between each image in the group and a representative template image constructed iteratively using principal component analysis. The proposed metric is validated in extensive experiments on synthetic and intrasubject clinical image data. These experiments showed equivalent or improved registration accuracy compared to other state-of-the-art (dis)similarity metrics and improved transformation consistency compared to pairwise mutual information. Copyright © 2018 The Authors. Published by Elsevier B.V. All rights reserved.
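
    The proposed metric sums, over the group, the conditional entropy of each image given a template. A minimal sketch of that computation from a joint intensity histogram follows; for simplicity the template here is the mean image, whereas the paper constructs it iteratively with principal component analysis.

    ```python
    import numpy as np

    def conditional_entropy(img, tmpl, bins=32):
        """H(img | tmpl), in nats, from a joint intensity histogram."""
        joint, _, _ = np.histogram2d(img.ravel(), tmpl.ravel(), bins=bins)
        p = joint / joint.sum()
        p_tmpl = p.sum(axis=0)                    # marginal over template bins
        nz = p > 0
        h_joint = -(p[nz] * np.log(p[nz])).sum()  # H(I, T)
        nzt = p_tmpl > 0
        h_tmpl = -(p_tmpl[nzt] * np.log(p_tmpl[nzt])).sum()
        return h_joint - h_tmpl                   # H(I | T) = H(I, T) - H(T)

    def template_entropy_metric(images):
        """Sum of per-image conditional entropies against the template
        (mean image here; the paper uses a PCA-based template)."""
        tmpl = np.mean(images, axis=0)
        return sum(conditional_entropy(im, tmpl) for im in images)
    ```

    A registration optimizer would minimize this sum over the group's transformation parameters: the better each image is predicted by the shared template, the lower the metric.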

  19. Bayesian models based on test statistics for multiple hypothesis testing problems.

    PubMed

    Ji, Yuan; Lu, Yiling; Mills, Gordon B

    2008-04-01

    We propose a Bayesian method for the problem of multiple hypothesis testing that is routinely encountered in bioinformatics research, such as the differential gene expression analysis. Our algorithm is based on modeling the distributions of test statistics under both null and alternative hypotheses. We substantially reduce the complexity of the process of defining posterior model probabilities by modeling the test statistics directly instead of modeling the full data. Computationally, we apply a Bayesian FDR approach to control the number of rejections of null hypotheses. To check if our model assumptions for the test statistics are valid for various bioinformatics experiments, we also propose a simple graphical model-assessment tool. Using extensive simulations, we demonstrate the performance of our models and the utility of the model-assessment tool. In the end, we apply the proposed methodology to an siRNA screening and a gene expression experiment.
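
    The control step can be sketched with a two-component Gaussian mixture on z-type test statistics: compute each statistic's posterior null probability, then reject the most significant hypotheses while the running average of those probabilities over the rejection set (its Bayesian FDR) stays below the target. The mixture parameters are assumed known in this illustration; the paper estimates the test-statistic distributions from the data.

    ```python
    import numpy as np

    def _npdf(x, mu=0.0, sd=1.0):
        """Normal probability density."""
        return np.exp(-0.5 * ((x - mu) / sd) ** 2) / (sd * np.sqrt(2.0 * np.pi))

    def bayesian_fdr_reject(z, pi0, mu1, sd1, alpha=0.05):
        """Reject null hypotheses subject to Bayesian FDR <= alpha,
        assuming z ~ pi0*N(0,1) + (1-pi0)*N(mu1, sd1^2)."""
        z = np.asarray(z, dtype=float)
        f0 = pi0 * _npdf(z)                      # null component density
        f1 = (1.0 - pi0) * _npdf(z, mu1, sd1)    # alternative component density
        post_null = f0 / (f0 + f1)               # P(null | z_i)
        order = np.argsort(post_null)            # strongest evidence first
        running_fdr = np.cumsum(post_null[order]) / np.arange(1, z.size + 1)
        n_rej = int(np.searchsorted(running_fdr, alpha, side="right"))
        rejected = np.zeros(z.size, dtype=bool)
        rejected[order[:n_rej]] = True
        return rejected, post_null
    ```

    Because the running average of sorted posterior null probabilities is nondecreasing, the largest rejection set meeting the FDR budget can be found with a single cut.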

  20. Experimental statistical signature of many-body quantum interference

    NASA Astrophysics Data System (ADS)

    Giordani, Taira; Flamini, Fulvio; Pompili, Matteo; Viggianiello, Niko; Spagnolo, Nicolò; Crespi, Andrea; Osellame, Roberto; Wiebe, Nathan; Walschaers, Mattia; Buchleitner, Andreas; Sciarrino, Fabio

    2018-03-01

    Multi-particle interference is an essential ingredient for fundamental quantum mechanics phenomena and for quantum information processing to provide a computational advantage, as recently emphasized by boson sampling experiments. Hence, developing a reliable and efficient technique to witness its presence is pivotal in achieving the practical implementation of quantum technologies. Here, we experimentally identify genuine many-body quantum interference via a recent efficient protocol, which exploits statistical signatures at the output of a multimode quantum device. We successfully apply the test to validate three-photon experiments in an integrated photonic circuit, providing an extensive analysis on the resources required to perform it. Moreover, drawing upon established techniques of machine learning, we show how such tools help to identify the—a priori unknown—optimal features to witness these signatures. Our results provide evidence on the efficacy and feasibility of the method, paving the way for its adoption in large-scale implementations.

  1. Design and experiment of data-driven modeling and flutter control of a prototype wing

    NASA Astrophysics Data System (ADS)

    Lum, Kai-Yew; Xu, Cai-Lin; Lu, Zhenbo; Lai, Kwok-Leung; Cui, Yongdong

    2017-06-01

    This paper presents an approach for data-driven modeling of aeroelasticity and its application to flutter control design of a wind-tunnel wing model. Modeling is centered on system identification of unsteady aerodynamic loads using computational fluid dynamics data, and adopts a nonlinear multivariable extension of the Hammerstein-Wiener system. The formulation is in modal coordinates of the elastic structure, and yields a reduced-order model of the aeroelastic feedback loop that is parametrized by airspeed. Flutter suppression is thus cast as a robust stabilization problem over uncertain airspeed, for which a low-order H∞ controller is computed. The paper discusses in detail parameter sensitivity and observability of the model, the former to justify the chosen model structure, and the latter to provide a criterion for physical sensor placement. Wind tunnel experiments confirm the validity of the modeling approach and the effectiveness of the control design.

  2. Stellar Interferometer Technology Experiment (SITE)

    NASA Technical Reports Server (NTRS)

    Crawley, Edward F.; Miller, David; Laskin, Robert; Shao, Michael

    1995-01-01

    The MIT Space Engineering Research Center and the Jet Propulsion Laboratory stand ready to advance science sensor technology for discrete-aperture astronomical instruments such as space-based optical interferometers. The objective of the Stellar Interferometer Technology Experiment (SITE) is to demonstrate system-level functionality of a space-based stellar interferometer through the use of enabling and enhancing Controlled-Structures Technologies (CST). SITE mounts to the Mission Peculiar Experiment Support System inside the Shuttle payload bay. Starlight, entering through two apertures, is steered to a combining plate where it is interfered. Interference requires 27 nanometer pathlength (phasing) and 0.29 arcsecond wavefront-tilt (pointing) control. The resulting 15 milliarcsecond angular resolution exceeds that of current earth-orbiting telescopes while maintaining low cost by exploiting active optics and structural control technologies. With these technologies, unforeseen and time-varying disturbances can be rejected while relaxing reliance on ground alignment and calibration. SITE will reduce the risk and cost of advanced optical space systems by validating critical technologies in their operational environment. Moreover, these technologies are directly applicable to commercially driven applications such as precision matching, optical scanning, and vibration and noise control systems for the aerospace, medical, and automotive sectors. The SITE team consists of experienced university, government, and industry researchers, scientists, and engineers with extensive expertise in optical interferometry, nano-precision opto-mechanical control and spaceflight experimentation. The experience exists and the technology is mature. SITE will validate these technologies on a functioning interferometer science sensor in order to confirm definitively their readiness to be baselined for future science missions.

  3. Image analysis method for the measurement of water saturation in a two-dimensional experimental flow tank

    NASA Astrophysics Data System (ADS)

    Belfort, Benjamin; Weill, Sylvain; Lehmann, François

    2017-07-01

    A novel, non-invasive imaging technique is proposed that determines 2D maps of water content in unsaturated porous media. This method directly relates digitally measured intensities to the water content of the porous medium. This method requires the classical image analysis steps, i.e., normalization, filtering, background subtraction, scaling and calibration. The main advantages of this approach are that no calibration experiment is needed, because the calibration curve relating water content and reflected light intensities is established during the main monitoring phase of each experiment, and that no tracer or dye is injected into the flow tank. The procedure enables effective processing of a large number of photographs and thus produces 2D water content maps at high temporal resolution. A drainage/imbibition experiment in a 2D flow tank with inner dimensions of 40 cm × 14 cm × 6 cm (L × W × D) is carried out to validate the methodology. The accuracy of the proposed approach is assessed using a statistical framework to perform an error analysis and numerical simulations with a state-of-the-art computational code that solves the Richards' equation. Comparison of the cumulative mass leaving and entering the flow tank and of the water content maps produced by the photographic measurement technique and the numerical simulations demonstrates the efficiency and high accuracy of the proposed method for investigating vadose zone flow processes. Finally, the photometric procedure has been developed expressly for its extension to heterogeneous media. Other processes may be investigated through different laboratory experiments, which will serve as benchmarks for the validation of numerical codes.
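
    In spirit, the intensity-to-water-content mapping rescales each pixel between reference states. The snippet below uses a linear rescaling between dry and water-saturated reference images; the study builds its calibration curve from the water mass measured during each experiment, so the linear form and all names here are illustrative assumptions.

    ```python
    import numpy as np

    def water_content_map(img, img_dry, img_sat, theta_r, theta_s):
        """Convert reflected-light intensities to volumetric water content,
        linearly interpolated between residual (theta_r) and saturated
        (theta_s) water contents using two reference images."""
        s = (img - img_dry) / (img_sat - img_dry)   # effective saturation, 0..1
        s = np.clip(s, 0.0, 1.0)
        return theta_r + s * (theta_s - theta_r)
    ```

    Applied frame by frame after normalization and background subtraction, this yields the time series of 2D water content maps the method is built around.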

  4. Anonymization of electronic medical records for validating genome-wide association studies

    PubMed Central

    Loukides, Grigorios; Gkoulalas-Divanis, Aris; Malin, Bradley

    2010-01-01

    Genome-wide association studies (GWAS) facilitate the discovery of genotype–phenotype relations from population-based sequence databases, which is an integral facet of personalized medicine. The increasing adoption of electronic medical records allows large amounts of patients’ standardized clinical features to be combined with the genomic sequences of these patients and shared to support validation of GWAS findings and to enable novel discoveries. However, disseminating these data “as is” may lead to patient reidentification when genomic sequences are linked to resources that contain the corresponding patients’ identity information based on standardized clinical features. This work proposes an approach that provably prevents this type of data linkage and furnishes a result that helps support GWAS. Our approach automatically extracts potentially linkable clinical features and modifies them in a way that they can no longer be used to link a genomic sequence to a small number of patients, while preserving the associations between genomic sequences and specific sets of clinical features corresponding to GWAS-related diseases. Extensive experiments with real patient data derived from the Vanderbilt's University Medical Center verify that our approach generates data that eliminate the threat of individual reidentification, while supporting GWAS validation and clinical case analysis tasks. PMID:20385806
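
    The linkage-prevention idea can be caricatured as follows: repeatedly drop non-essential clinical codes from any record whose released feature combination is shared by fewer than k patients, while never touching the GWAS-related disease codes whose associations must be preserved. This greedy toy is our own illustration of the principle, not the authors' utility-aware algorithm.

    ```python
    from collections import Counter

    def anonymize(records, k, protected):
        """Generalize released feature sets until every combination occurs
        at least k times, keeping `protected` (GWAS-related) codes intact.
        Records with no droppable codes left simply retain their
        protected codes."""
        release = [set(r) for r in records]
        changed = True
        while changed:
            changed = False
            counts = Counter(frozenset(r) for r in release)
            for r in release:
                extra = sorted(r - protected)
                if counts[frozenset(r)] < k and extra:
                    r.discard(extra[-1])     # drop one linkable code
                    changed = True
        return release
    ```

    After convergence, no released combination of clinical features can single out fewer than k genomic sequences via the droppable codes, which is the property that blocks the reidentification linkage described above.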

  5. Camera-tracking gaming control device for evaluation of active wrist flexion and extension.

    PubMed

    Shefer Eini, Dalit; Ratzon, Navah Z; Rizzo, Albert A; Yeh, Shih-Ching; Lange, Belinda; Yaffe, Batia; Daich, Alexander; Weiss, Patrice L; Kizony, Rachel

    Cross sectional. Measuring wrist range of motion (ROM) is an essential procedure in hand therapy clinics. To test the reliability and validity of a dynamic ROM assessment, the Camera Wrist Tracker (CWT). Wrist flexion and extension ROM of 15 patients with distal radius fractures and 15 matched controls were assessed with the CWT and with a universal goniometer. One-way model intraclass correlation coefficient analysis indicated high test-retest reliability for extension (ICC = 0.92) and moderate reliability for flexion (ICC = 0.49). Standard error for extension was 2.45° and for flexion was 4.07°. Repeated-measures analysis revealed a significant main effect for group; ROM was greater in the control group (F[1, 28] = 47.35; P < .001). The concurrent validity of the CWT was partially supported. The results indicate that the CWT may provide highly reliable scores for dynamic wrist extension ROM, and moderately reliable scores for flexion, in people recovering from a distal radius fracture. N/A. Copyright © 2016 Hanley & Belfus. Published by Elsevier Inc. All rights reserved.

  6. Experimental validation of an analytical kinetic model for edge-localized modes in JET-ITER-like wall

    NASA Astrophysics Data System (ADS)

    Guillemaut, C.; Metzger, C.; Moulton, D.; Heinola, K.; O’Mullane, M.; Balboa, I.; Boom, J.; Matthews, G. F.; Silburn, S.; Solano, E. R.; contributors, JET

    2018-06-01

    The design and operation of future fusion devices relying on H-mode plasmas requires reliable modelling of edge-localized modes (ELMs) for precise prediction of divertor target conditions. An extensive experimental validation of simple analytical predictions of the time evolution of target plasma loads during ELMs has been carried out here in more than 70 JET-ITER-like wall H-mode experiments with a wide range of conditions. Comparisons of these analytical predictions with diagnostic measurements of target ion flux density, power density, impact energy and electron temperature during ELMs are presented in this paper and show excellent agreement. The analytical predictions tested here are made with the ‘free-streaming’ kinetic model (FSM) which describes ELMs as a quasi-neutral plasma bunch expanding along the magnetic field lines into the Scrape-Off Layer without collisions. Consequences of the FSM on energy reflection and deposition on divertor targets during ELMs are also discussed.

  7. The Use of the State-Trait Anger Expression Inventory-II With Forensic Populations: A Psychometric Critique.

    PubMed

    Schamborg, Sara; Tully, Ruth J; Browne, Kevin D

    2016-08-01

    The State-Trait Anger Expression Inventory-II (STAXI-II) is a psychometric assessment that measures the experience, expression, and control of anger in research and clinical settings. Although the STAXI-II is extensively used and its psychometric properties supported, no psychometric critique has yet specifically assessed its utility with forensic populations. The aim of this critique was to explore the validity and reliability of the STAXI-II when used with forensic samples. It was found that the psychometric properties of the STAXI-II, when used with forensic populations, are satisfactory. However, gaps in research and issues that need to be addressed in practice have been highlighted. Although STAXI-II provides a comprehensive measure of anger, it does not capture all aspects of the construct. In addition, the tool does not contain an inherent validity scale, indicating the need to control for social desirability responding when administering the STAXI-II. Practical implications, limitations, and future research will be discussed. © The Author(s) 2015.

  8. Experimental validation of phase-only pre-compensation over 494 m free-space propagation.

    PubMed

    Brady, Aoife; Berlich, René; Leonhard, Nina; Kopf, Teresa; Böttner, Paul; Eberhardt, Ramona; Reinlein, Claudia

    2017-07-15

    It is anticipated that ground-to-geostationary orbit (GEO) laser communication will benefit from pre-compensation of atmospheric turbulence for laser beam propagation through the atmosphere. Theoretical simulations and laboratory experiments have determined its feasibility; extensive free-space experimental validation has, however, yet to be fulfilled. Therefore, we designed and implemented an adaptive optical (AO)-box which pre-compensates an outgoing laser beam (uplink) using the measurements of an incoming beam (downlink). The setup was designed to approximate the baseline scenario over a horizontal test range of 0.5 km and consisted of a ground terminal with the AO-box and a simplified approximation of a satellite terminal. Our results confirmed that we could focus the uplink beam on the satellite terminal using AO under a point-ahead angle of 28 μrad. Furthermore, we demonstrated a considerable increase in the intensity received at the satellite. These results are further testimony to AO pre-compensation being a viable technique to enhance Earth-to-GEO optical communication.

  9. The excess choice effect: The role of outcome valence and counterfactual thinking.

    PubMed

    Hafner, Rebecca J; White, Mathew P; Handley, Simon J

    2016-02-01

    Contrary to economic theory, psychological research has demonstrated increased choice can undermine satisfaction. When and why this 'excess choice effect' (ECE) occurs remains unclear. Building on theories of counterfactual thinking we argue the ECE is more likely to occur when people experience counterfactual thought or emotion and that a key trigger is a negative versus positive task outcome. Participants either selected a drink (Experiment 1) or chocolate (Experiment 2) from a limited (6) versus extensive (24) selection (Experiment 1) or were given no choice versus extensive (24) choice (Experiment 2). In both experiments, however, the choice was illusory: Half the participants tasted a 'good' flavour, half a 'bad' flavour. As predicted, extensive choice was only detrimental to satisfaction when participants tasted the 'bad' drink or chocolate, and this was mediated by the experience of counterfactual thought (Experiment 1) or emotion (Experiment 2). When outcomes were positive, participants were similarly satisfied with limited versus extensive and no choice versus extensive choice. Implications for our theoretical understanding of the ECE and for the construction of choice architectures aimed at promoting individual satisfaction and well-being are discussed. © 2015 The British Psychological Society.

  10. Early parental care is important for hippocampal maturation: evidence from brain morphology in humans.

    PubMed

    Rao, Hengyi; Betancourt, Laura; Giannetta, Joan M; Brodsky, Nancy L; Korczykowski, Marc; Avants, Brian B; Gee, James C; Wang, Jiongjiong; Hurt, Hallam; Detre, John A; Farah, Martha J

    2010-01-01

    The effects of early life experience on later brain structure and function have been studied extensively in animals, yet the relationship between childhood experience and normal brain development in humans remains largely unknown. Using a unique longitudinal data set including ecologically valid in-home measures of early experience during childhood (at age 4 and 8 years) and high-resolution structural brain imaging during adolescence (mean age 14 years), we examined the effects on later brain morphology of two dimensions of early experience: parental nurturance and environmental stimulation. Parental nurturance at age 4 predicts the volume of the left hippocampus in adolescence, with better nurturance associated with smaller hippocampal volume. In contrast, environmental stimulation did not correlate with hippocampal volume. Moreover, the association between hippocampal volume and parental nurturance disappears at age 8, supporting the existence of a sensitive developmental period for brain maturation. These findings indicate that variation in normal childhood experience is associated with differences in brain morphology, and hippocampal volume is specifically associated with early parental nurturance. Our results provide neuroimaging evidence supporting the important role of warm parental care during early childhood for brain maturation.

  11. On the Validity of Student Evaluation of Teaching: The State of the Art

    ERIC Educational Resources Information Center

    Spooren, Pieter; Brockx, Bert; Mortelmans, Dimitri

    2013-01-01

    This article provides an extensive overview of the recent literature on student evaluation of teaching (SET) in higher education. The review is based on the SET meta-validation model, drawing upon research reports published in peer-reviewed journals since 2000. Through the lens of validity, we consider both the more traditional research themes in…

  12. Driving-forces model on individual behavior in scenarios considering moving threat agents

    NASA Astrophysics Data System (ADS)

    Li, Shuying; Zhuang, Jun; Shen, Shifei; Wang, Jia

    2017-09-01

    The individual behavior model is a contributory factor to improve the accuracy of agent-based simulation in different scenarios. However, few studies have considered moving threat agents, which often occur in terrorist attacks caused by attackers with close-range weapons (e.g., sword, stick). At the same time, many existing behavior models lack validation from cases or experiments. This paper builds a new individual behavior model based on seven behavioral hypotheses. The driving-forces model is an extension of the classical social force model considering scenarios including moving threat agents. An experiment was conducted to validate the key components of the model. Then the model is compared with an advanced Elliptical Specification II social force model, by calculating the fitting errors between the simulated and experimental trajectories, and being applied to simulate a specific circumstance. Our results show that the driving-forces model reduced the fitting error by an average of 33.9% and the standard deviation by an average of 44.5%, which indicates the accuracy and stability of the model in the studied situation. The new driving-forces model could be used to simulate individual behavior when analyzing the risk of specific scenarios using agent-based simulation methods, such as risk analysis of close-range terrorist attacks in public places.
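The abstract describes the driving-forces model as an extension of the classical social force model with moving threat agents. As a point of reference only, here is a minimal sketch of a social-force-style update with an added repulsion term for a moving threat agent; the exponential repulsion form and all parameter values are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def social_force_step(pos, vel, goal, threat_pos, dt=0.05,
                      v0=1.3, tau=0.5, A=2.0, B=0.8):
    """One explicit-Euler step of a minimal social-force-style model.

    pos, vel, goal, threat_pos: 2D numpy arrays for a single agent.
    v0 desired speed (m/s), tau relaxation time (s); A, B set the
    strength/range of repulsion from the moving threat agent
    (illustrative values, not the paper's calibrated parameters).
    """
    # Driving force relaxing velocity toward the goal at desired speed v0
    e = (goal - pos) / (np.linalg.norm(goal - pos) + 1e-9)
    f_drive = (v0 * e - vel) / tau
    # Exponential repulsion pointing away from the threat agent
    d = pos - threat_pos
    dist = np.linalg.norm(d) + 1e-9
    f_threat = A * np.exp(-dist / B) * (d / dist)
    vel = vel + dt * (f_drive + f_threat)
    pos = pos + dt * vel
    return pos, vel
```

Running a few steps shows the expected qualitative behavior: the agent advances toward its goal while being deflected away from the threat.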

  13. Synchrotron characterization of nanograined UO 2 grain growth

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mo, Kun; Miao, Yinbin; Yun, Di

    2015-09-30

    This activity is supported by the US Nuclear Energy Advanced Modeling and Simulation (NEAMS) Fuels Product Line (FPL) and aims at providing experimental data for the validation of the mesoscale simulation code MARMOT. MARMOT is a mesoscale multiphysics code that predicts the coevolution of microstructure and properties within reactor fuel during its lifetime in the reactor. It is an important component of the Moose-Bison-Marmot (MBM) code suite that has been developed by Idaho National Laboratory (INL) to enable next-generation fuel performance modeling capability as part of the NEAMS Program FPL. In order to ensure the accuracy of the microstructure-based materials models being developed within the MARMOT code, extensive validation efforts must be carried out. In this report, we summarize our preliminary synchrotron radiation experiments at APS to determine the grain size of nanograined UO 2. The methodology and experimental setup developed in this experiment can directly apply to the proposed in-situ grain growth measurements. The investigation of the grain growth kinetics was conducted based on isothermal annealing and grain growth characterization as functions of duration and temperature. Kinetic parameters such as the activation energy for grain growth in UO 2 with different stoichiometry were obtained and compared with molecular dynamics (MD) simulations.

  14. Supplying materials needed for grain growth characterizations of nano-grained UO 2

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mo, Kun; Miao, Yinbin; Yun, Di

    2015-09-30

    This activity is supported by the US Nuclear Energy Advanced Modeling and Simulation (NEAMS) Fuels Product Line (FPL) and aims at providing experimental data for the validation of the mesoscale simulation code MARMOT. MARMOT is a mesoscale multiphysics code that predicts the coevolution of microstructure and properties within reactor fuel during its lifetime in the reactor. It is an important component of the Moose-Bison-Marmot (MBM) code suite that has been developed by Idaho National Laboratory (INL) to enable next-generation fuel performance modeling capability as part of the NEAMS Program FPL. In order to ensure the accuracy of the microstructure-based materials models being developed within the MARMOT code, extensive validation efforts must be carried out. In this report, we summarize our preliminary synchrotron radiation experiments at APS to determine the grain size of nanograined UO 2. The methodology and experimental setup developed in this experiment can directly apply to the proposed in-situ grain growth measurements. The investigation of the grain growth kinetics was conducted based on isothermal annealing and grain growth characterization as functions of duration and temperature. Kinetic parameters such as the activation energy for grain growth in UO 2 with different stoichiometry were obtained and compared with molecular dynamics (MD) simulations.

  15. Extension and Validation of Research on Acquisition and Retention of Cognitive versus Perceptually Oriented Training Materials

    DTIC Science & Technology

    1980-06-01

    Abstract not legible in the scanned source (variable-coding table residue). Recoverable variable labels: Q2 Group-dependent/Self-sufficient; Q3 Undisciplined Self-conflict/Controlled; Q4 Relaxed/Tense; GZPS Guilford-Zimmerman Perceptual Speed.

  16. Convolutional Deep Belief Networks for Single-Cell/Object Tracking in Computational Biology and Computer Vision.

    PubMed

    Zhong, Bineng; Pan, Shengnan; Zhang, Hongbo; Wang, Tian; Du, Jixiang; Chen, Duansheng; Cao, Liujuan

    2016-01-01

    In this paper, we propose a deep architecture to dynamically learn the most discriminative features from data for both single-cell and object tracking in computational biology and computer vision. Firstly, the discriminative features are automatically learned via a convolutional deep belief network (CDBN). Secondly, we design a simple yet effective method to transfer features learned from CDBNs on source tasks for generic purposes to the object tracking tasks using only a limited amount of training data. Finally, to alleviate the tracker drifting problem caused by model updating, we jointly consider three different types of positive samples. Extensive experiments validate the robustness and effectiveness of the proposed method.

  17. Convolutional Deep Belief Networks for Single-Cell/Object Tracking in Computational Biology and Computer Vision

    PubMed Central

    Pan, Shengnan; Zhang, Hongbo; Wang, Tian; Du, Jixiang; Chen, Duansheng; Cao, Liujuan

    2016-01-01

    In this paper, we propose a deep architecture to dynamically learn the most discriminative features from data for both single-cell and object tracking in computational biology and computer vision. Firstly, the discriminative features are automatically learned via a convolutional deep belief network (CDBN). Secondly, we design a simple yet effective method to transfer features learned from CDBNs on source tasks for generic purposes to the object tracking tasks using only a limited amount of training data. Finally, to alleviate the tracker drifting problem caused by model updating, we jointly consider three different types of positive samples. Extensive experiments validate the robustness and effectiveness of the proposed method. PMID:27847827

  18. A framework to enhance security of physically unclonable functions using chaotic circuits

    NASA Astrophysics Data System (ADS)

    Chen, Lanxiang

    2018-05-01

    As a new technique for authentication and key generation, the physically unclonable function (PUF) has attracted considerable attention, and extensive research results have already been achieved. To resist the popular machine-learning modeling attacks, a framework to enhance the security of PUFs is proposed. The basic idea is to combine PUFs with a chaotic system whose response is highly sensitive to initial conditions. For this framework, a specific construction that combines a common arbiter PUF circuit, a converter, and Chua's circuit is given to implement a more secure PUF. Simulation experiments are presented to further validate the framework. Finally, some practical suggestions for the framework and the specific construction are also discussed.
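The security argument here rests on sensitive dependence on initial conditions, the defining property of chaotic systems such as Chua's circuit. As a stand-in for the circuit dynamics, the following sketch demonstrates that property with the logistic map at r = 4 (a textbook chaotic map); the choice of map and all values are illustrative assumptions, not the paper's construction.

```python
def logistic_orbit(x, n, r=4.0):
    """Iterate the chaotic logistic map x -> r*x*(1-x) for n steps."""
    orbit = []
    for _ in range(n):
        x = r * x * (1.0 - x)
        orbit.append(x)
    return orbit

# Two responses from initial states differing by one part in a billion
a = logistic_orbit(0.3, 80)
b = logistic_orbit(0.3 + 1e-9, 80)
max_gap = max(abs(p - q) for p, q in zip(a[50:], b[50:]))
# early iterates agree closely; later iterates differ macroscopically
```

The relevance to PUF hardening: if a response is post-processed by such dynamics, tiny response differences are amplified exponentially, which is what makes the combined input-output relation hard for a machine-learning model to approximate.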

  19. Human-robot interaction tests on a novel robot for gait assistance.

    PubMed

    Tagliamonte, Nevio Luigi; Sergi, Fabrizio; Carpino, Giorgio; Accoto, Dino; Guglielmelli, Eugenio

    2013-06-01

    This paper presents tests on a treadmill-based non-anthropomorphic wearable robot assisting hip and knee flexion/extension movements through compliant actuation. Validation experiments were performed on the actuators and on the robot, with specific focus on evaluating intrinsic backdrivability and assistance capability. Tests were conducted on a young healthy subject. With the robot completely unpowered, maximum backdriving torques were found to be on the order of 10 Nm, owing to the robot's design features (reduced swinging masses; low intrinsic mechanical impedance and high-efficiency reduction gears for the actuators). Assistance tests demonstrated that the robot can deliver torques attracting the subject towards a predicted kinematic status.

  20. Time-of-Flight Measurements as a Possible Method to Observe Anyonic Statistics

    NASA Astrophysics Data System (ADS)

    Umucalılar, R. O.; Macaluso, E.; Comparin, T.; Carusotto, I.

    2018-06-01

    We propose a standard time-of-flight experiment as a method for observing the anyonic statistics of quasiholes in a fractional quantum Hall state of ultracold atoms. The quasihole states can be stably prepared by pinning the quasiholes with localized potentials, and a measurement of the mean square radius of the freely expanding cloud, which is related to the average total angular momentum of the initial state, offers direct signatures of the statistical phase. Our proposed method is validated by Monte Carlo calculations for ν = 1/2 and 1/3 fractional quantum Hall liquids containing a realistic number of particles. Extensions to quantum Hall liquids of light and to non-Abelian anyons are briefly discussed.

  1. [CRITERION-RELATED VALIDITY OF SIT-AND-REACH TEST AS A MEASURE OF HAMSTRING EXTENSIBILITY IN OLDER WOMEN].

    PubMed

    López-Miñarro, Pedro Ángel; Vaquero-Cristóbal, Raquel; Muyor, José María; Espejo-Antúnez, Luis

    2015-07-01

    Lumbo-sacral posture and the sit-and-reach score have been proposed as measures of hamstring extensibility; however, their validity is influenced by sample characteristics. The aim was to determine the validity of the lumbo-horizontal angle and the score in the sit-and-reach test as measures of hamstring extensibility in older women. A hundred and twenty older women performed the straight leg raise test with both legs and the sit-and-reach test (SR) in random order. For the sit-and-reach test, the score and the lumbo-sacral posture in bending (lumbo-horizontal angle, L-Hfx) were measured. The mean values of straight leg raise in the left and right legs were 81.70 ± 13.83º and 82.10 ± 14.36º, respectively; the mean value across both legs was 81.90 ± 12.70º. The mean SR score and L-Hfx were -1.54 ± 8.09 cm and 91.08 ± 9.32º, respectively. Correlations between the mean straight leg raise and the lumbo-sacral posture and SR score were moderate (L-Hfx: r = -0.72, p < .01; SR: r = 0.70, p < .01). Each variable independently explained about 50% of the variance (L-Hfx: R2 = 0.52, p < .001; SR: R2 = 0.49, p < .001). The validity of lumbo-sacral posture in bending as a measure of hamstring extensibility in older women is moderate, with values similar to the SR score. Copyright AULA MEDICA EDICIONES 2014. Published by AULA MEDICA. All rights reserved.
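The variance figures in this abstract follow directly from squaring the reported correlations (R² = r²); a quick consistency check:

```python
# Correlations reported in the abstract; R^2 = r^2 is the variance explained
r_lhfx = -0.72   # lumbo-horizontal angle vs. mean straight leg raise
r_sr = 0.70      # sit-and-reach score vs. mean straight leg raise
print(round(r_lhfx ** 2, 2))  # 0.52
print(round(r_sr ** 2, 2))    # 0.49
```

Both squared values match the R² figures given, confirming the two statistics describe the same bivariate relationships.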

  2. Bioanalytical method development and validation for the determination of glycine in human cerebrospinal fluid by ion-pair reversed-phase liquid chromatography-tandem mass spectrometry.

    PubMed

    Jiang, Jian; James, Christopher A; Wong, Philip

    2016-09-05

    An LC-MS/MS method has been developed and validated for the determination of glycine in human cerebrospinal fluid (CSF). The validated method used artificial cerebrospinal fluid as a surrogate matrix for calibration standards. The calibration curve range for the assay was 100-10,000 ng/mL, and (13)C2,(15)N-glycine was used as an internal standard (IS). Pre-validation experiments were performed to demonstrate parallelism with surrogate matrix and standard addition methods. The mean endogenous glycine concentration in pooled human CSF, determined on three days using artificial CSF as a surrogate matrix and by the method of standard addition, was 748 ± 30.6 and 768 ± 18.1 ng/mL, respectively. A percentage difference of -2.6% indicated that artificial CSF could be used as a surrogate calibration matrix for the determination of glycine in human CSF. Quality control (QC) samples, except the lower limit of quantitation (LLOQ) QC and low QC samples, were prepared by spiking glycine into aliquots of a pooled human CSF sample. The low QC sample was prepared from a separate pooled human CSF sample containing low endogenous glycine concentrations, while the LLOQ QC sample was prepared in artificial CSF. Standard addition was used extensively to evaluate matrix effects during validation. The validated method was used to determine endogenous glycine concentrations in human CSF samples. Incurred sample reanalysis demonstrated the reproducibility of the method. Copyright © 2016 Elsevier B.V. All rights reserved.
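The -2.6% figure is the relative difference between the surrogate-matrix and standard-addition means, computed as follows:

```python
surrogate_mean = 748.0      # ng/mL, mean via artificial-CSF surrogate matrix
std_addition_mean = 768.0   # ng/mL, mean via standard addition
pct_diff = (surrogate_mean - std_addition_mean) / std_addition_mean * 100
print(round(pct_diff, 1))   # -2.6
```

A difference this small between the two calibration approaches is what justifies using artificial CSF as the surrogate matrix.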

  3. The development and validation of a questionnaire to measure the clinical learning environment for undergraduate dental students (DECLEI).

    PubMed

    Kossioni, A E; Lyrakos, G; Ntinalexi, I; Varela, R; Economu, I

    2014-05-01

    The aim of this study was to develop and validate according to psychometric standards a self-administered instrument to measure the students' self-perceptions of the undergraduate clinical dental environment (DECLEI). The initial questionnaire was developed using feedback from dental students, experts' opinion and an extensive literature review. Critical incident technique (CIT) analysis was used to generate items and identify domains. Thirty clinical dental students participated in a pilot validation that generated a 67-item questionnaire. To develop a shorter and more practical version of the instrument, DECLEI-67 was distributed to 153 clinical students at the University of Athens and its English version to 51 students from various dental schools, attending the 2012 European Dental Students Association meeting. This final procedure aimed to select items, identify subscales and measure internal consistency and discriminant validity. A total of 202 students returned the questionnaires (response rate 99%). The final instrument included 24 items divided into three subscales: (i) organisation and learning opportunities, (ii) professionalism and communication and (iii) satisfaction and commitment to the dental studies. Cronbach's α for the total questionnaire was 0.89. The interscale correlations ranged from 0.39 to 0.48. The instrument identified differences related to school of origin, age and duration of clinical experience. An interpretation of the scores (range 0–100) has been proposed. The 24-item DECLEI seemed to be a practical and valid instrument to measure a dental school's undergraduate clinical learning environment.
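Internal consistency is summarized here with Cronbach's α (0.89 for the full DECLEI). A minimal sketch of the standard formula, α = k/(k-1) · (1 - Σ σ²_item / σ²_total), applied to made-up item scores; the generated data are illustrative only, not the DECLEI responses:

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, k_items) score matrix."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)
    total_variance = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1.0 - item_variances.sum() / total_variance)

# Illustrative, randomly generated responses (NOT the DECLEI data):
# six items sharing one common factor, so alpha comes out high.
rng = np.random.default_rng(0)
common = rng.normal(size=(200, 1))
items = common + 0.5 * rng.normal(size=(200, 6))
print(round(cronbach_alpha(items), 2))
```

Items that covary strongly (here, through the shared common factor) drive the sum of item variances well below the variance of the total score, pushing α toward 1.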

  4. Unsteady boundary layer development on a wind turbine blade: an experimental study of a surrogate problem

    NASA Astrophysics Data System (ADS)

    Cadel, Daniel R.; Zhang, Di; Lowe, K. Todd; Paterson, Eric G.

    2018-04-01

    Wind turbines with thick blade profiles experience turbulent, periodic approach flow, leading to unsteady blade loading and large torque fluctuations on the turbine drive shaft. Presented here is an experimental study of a surrogate problem representing some key aspects of the wind turbine unsteady fluid mechanics. This experiment has been designed through joint consideration of experiment and computation, with the ultimate goal of numerical model development for aerodynamics in unsteady and turbulent flows. A cylinder at a diameter Reynolds number of 65,000 and Strouhal number of 0.184 is placed 10.67 diameters upstream of a NACA 63215b airfoil with a chord Reynolds number of 170,000 and a chord-reduced frequency of k = 2πf(c/2)/V = 1.5. Extensive flow field measurements using particle image velocimetry provide a number of insights about this flow, as well as data for model validation and development. Velocity contours on the airfoil suction side in the presence of the upstream cylinder indicate a redistribution of turbulent normal stresses from transverse to streamwise, consistent with rapid distortion theory predictions. A study of the boundary layer over the suction side of the airfoil reveals very low Reynolds number turbulent mean streamwise velocity profiles. The dominance of the high-amplitude large eddy passages results in a phase lag in streamwise velocity as a function of distance from the wall. The results and accompanying description provide a new test case incorporating moderate reduced-frequency inflow for computational model validation and development.

  5. LIVVkit: An extensible, python-based, land ice verification and validation toolkit for ice sheet models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kennedy, Joseph H.; Bennett, Andrew R.; Evans, Katherine J.

    To address the pressing need to better understand the behavior and complex interaction of ice sheets within the global Earth system, significant development of continental-scale, dynamical ice sheet models is underway. Concurrent to the development of the Community Ice Sheet Model (CISM), the corresponding verification and validation (V&V) process is being coordinated through a new, robust, Python-based extensible software package, the Land Ice Verification and Validation toolkit (LIVVkit). Incorporated into the typical ice sheet model development cycle, it provides robust and automated numerical verification, software verification, performance validation, and physical validation analyses on a variety of platforms, from personal laptops to the largest supercomputers. LIVVkit operates on sets of regression test and reference data sets, and provides comparisons for a suite of community prioritized tests, including configuration and parameter variations, bit-for-bit evaluation, and plots of model variables to indicate where differences occur. LIVVkit also provides an easily extensible framework to incorporate and analyze results of new intercomparison projects, new observation data, and new computing platforms. LIVVkit is designed for quick adaptation to additional ice sheet models via abstraction of model specific code, functions, and configurations into an ice sheet model description bundle outside the main LIVVkit structure. Furthermore, through shareable and accessible analysis output, LIVVkit is intended to help developers build confidence in their models and enhance the credibility of ice sheet models overall.

  6. LIVVkit: An extensible, python-based, land ice verification and validation toolkit for ice sheet models

    DOE PAGES

    Kennedy, Joseph H.; Bennett, Andrew R.; Evans, Katherine J.; ...

    2017-03-23

    To address the pressing need to better understand the behavior and complex interaction of ice sheets within the global Earth system, significant development of continental-scale, dynamical ice sheet models is underway. Concurrent to the development of the Community Ice Sheet Model (CISM), the corresponding verification and validation (V&V) process is being coordinated through a new, robust, Python-based extensible software package, the Land Ice Verification and Validation toolkit (LIVVkit). Incorporated into the typical ice sheet model development cycle, it provides robust and automated numerical verification, software verification, performance validation, and physical validation analyses on a variety of platforms, from personal laptops to the largest supercomputers. LIVVkit operates on sets of regression test and reference data sets, and provides comparisons for a suite of community prioritized tests, including configuration and parameter variations, bit-for-bit evaluation, and plots of model variables to indicate where differences occur. LIVVkit also provides an easily extensible framework to incorporate and analyze results of new intercomparison projects, new observation data, and new computing platforms. LIVVkit is designed for quick adaptation to additional ice sheet models via abstraction of model specific code, functions, and configurations into an ice sheet model description bundle outside the main LIVVkit structure. Furthermore, through shareable and accessible analysis output, LIVVkit is intended to help developers build confidence in their models and enhance the credibility of ice sheet models overall.

  7. Crack Propagation Analysis Using Acoustic Emission Sensors for Structural Health Monitoring Systems

    DOE PAGES

    Kral, Zachary; Horn, Walter; Steck, James

    2013-01-01

    Aerospace systems are expected to remain in service well beyond their designed life. Consequently, maintenance is an important issue. A novel method of implementing artificial neural networks and acoustic emission sensors to form a structural health monitoring (SHM) system for aerospace inspection routines was the focus of this research. Simple structural elements, consisting of flat aluminum plates of AL 2024-T3, were subjected to increasing static tensile loading. As the loading increased, designed cracks extended in length, releasing strain waves in the process. Strain wave signals, measured by acoustic emission sensors, were further analyzed in post-processing by artificial neural networks (ANN). Several experiments were performed to determine the severity and location of the crack extensions in the structure. ANNs were trained on a portion of the data acquired by the sensors and the ANNs were then validated with the remaining data. The combination of a system of acoustic emission sensors and an ANN could determine crack extension accurately. The difference between predicted and actual crack extensions was determined to be between 0.004 in. and 0.015 in. with 95% confidence. These ANNs, coupled with acoustic emission sensors, showed promise for the creation of an SHM system for aerospace systems.

  8. Crack Propagation Analysis Using Acoustic Emission Sensors for Structural Health Monitoring Systems

    PubMed Central

    Horn, Walter; Steck, James

    2013-01-01

    Aerospace systems are expected to remain in service well beyond their designed life. Consequently, maintenance is an important issue. A novel method of implementing artificial neural networks and acoustic emission sensors to form a structural health monitoring (SHM) system for aerospace inspection routines was the focus of this research. Simple structural elements, consisting of flat aluminum plates of AL 2024-T3, were subjected to increasing static tensile loading. As the loading increased, designed cracks extended in length, releasing strain waves in the process. Strain wave signals, measured by acoustic emission sensors, were further analyzed in post-processing by artificial neural networks (ANN). Several experiments were performed to determine the severity and location of the crack extensions in the structure. ANNs were trained on a portion of the data acquired by the sensors and the ANNs were then validated with the remaining data. The combination of a system of acoustic emission sensors, and an ANN could determine crack extension accurately. The difference between predicted and actual crack extensions was determined to be between 0.004 in. and 0.015 in. with 95% confidence. These ANNs, coupled with acoustic emission sensors, showed promise for the creation of an SHM system for aerospace systems. PMID:24023536

  9. Genome editing of bread wheat using biolistic delivery of CRISPR/Cas9 in vitro transcripts or ribonucleoproteins.

    PubMed

    Liang, Zhen; Chen, Kunling; Zhang, Yi; Liu, Jinxing; Yin, Kangquan; Qiu, Jin-Long; Gao, Caixia

    2018-03-01

    This protocol is an extension to: Nat. Protoc. 9, 2395-2410 (2014); doi:10.1038/nprot.2014.157; published online 18 September 2014. In recent years, CRISPR/Cas9 has emerged as a powerful tool for improving crop traits. Conventional plant genome editing mainly relies on plasmid-carrying cassettes delivered by Agrobacterium or particle bombardment. Here, we describe DNA-free editing of bread wheat by delivering in vitro transcripts (IVTs) or ribonucleoprotein complexes (RNPs) of CRISPR/Cas9 by particle bombardment. This protocol serves as an extension of our previously published protocol on genome editing in bread wheat using CRISPR/Cas9 plasmids delivered by particle bombardment. The methods we describe not only eliminate random integration of CRISPR/Cas9 into genomic DNA, but also reduce off-target effects. In this protocol extension article, we present detailed protocols for preparation of IVTs and RNPs; validation by PCR/restriction enzyme (RE) and next-generation sequencing; delivery by biolistics; and recovery of mutants and identification of mutants by pooling methods and Sanger sequencing. To use these protocols, researchers should have basic skills and experience in molecular biology and biolistic transformation. By using these protocols, plants edited without the use of any foreign DNA can be generated and identified within 9-11 weeks.

  10. DDML Schema Validation

    DTIC Science & Technology

    2016-02-08

    Acronyms: DDML Data Display Markup Language; HUD heads-up display; IRIG Inter-Range Instrumentation Group; RCC Range Commanders Council; SVG Scalable Vector Graphics; T&E test and evaluation; TMATS Telemetry Attributes Transfer Standard; XML eXtensible Markup Language. (DDML Schema Validation, RCC 126-16, February 2016.)

  11. The Resilience Scale for Adults: Construct Validity and Measurement in a Belgian Sample

    ERIC Educational Resources Information Center

    Hjemdal, Odin; Friborg, Oddgeir; Braun, Stephanie; Kempenaers, Chantal; Linkowski, Paul; Fossion, Pierre

    2011-01-01

    The Resilience Scale for Adults (RSA) was developed and has been extensively validated in Norwegian samples. The purpose of this study was to explore the construct validity of the Resilience Scale for Adults in a French-speaking Belgian sample and test measurement invariance between the Belgian and a Norwegian sample. A Belgian student sample (N =…

  12. Transconjunctival Incision with Lateral Paracanthal Extension for Corrective Osteotomy of Malunioned Zygoma

    PubMed Central

    Chung, Jae-Ho; You, Hi-Jin; Hwang, Na-Hyun; Yoon, Eul-Sik

    2016-01-01

    Background Conventional correction of malunioned zygoma requires complete regional exposure through a bicoronal flap combined with a lower eyelid incision and an upper buccal sulcus incision. However, there are many potential complications following bicoronal incisions, such as infection, hematoma, alopecia, scarring and nerve injury. We have adopted a zygomaticofrontal suture osteotomy technique using transconjunctival incision with lateral paracanthal extension. We performed a retrospective review of clinical cases underwent correction of malunioned zygoma with the approach to evaluate outcomes following this method. Methods Between June 2009 and September 2015, corrective osteotomies were performed in 14 patients with malunioned zygoma by a single surgeon. All 14 patients received both upper gingivobuccal and transconjunctival incisions with lateral paracanthal extension. The mean interval from injury to operation was 16 months (range, 12 months to 4 years), and the mean follow-up was 1 year (range, 4 months to 3 years). Results Our surgical approach technique allowed excellent access to the infraorbital rim, orbital floor, zygomaticofrontal suture and anterior surface of the maxilla. Of the 14 patients, only 1 patient suffered a complication—oral wound dehiscence. Among the 6 patients who received infraorbital nerve decompression, numbness was gradually relieved in 4 patients. Two patients continued to experience persistent numbness. Conclusion Transconjunctival incision with lateral paracanthal extension combined with upper gingivobuccal sulcus incision offers excellent exposure of the zygoma-orbit complex, and could be a valid alternative to the bicoronal approach for osteotomy of malunioned zygoma. PMID:28913268

  13. Extension, validation and application of the NASCAP code

    NASA Technical Reports Server (NTRS)

    Katz, I.; Cassidy, J. J., III; Mandell, M. J.; Schnuelle, G. W.; Steen, P. G.; Parks, D. E.; Rotenberg, M.; Alexander, J. H.

    1979-01-01

    Numerous extensions were made in the NASCAP code. They fall into three categories: a greater range of definable objects, a more sophisticated computational model, and simplified code structure and usage. An important validation of NASCAP was performed using a new two-dimensional computer code (TWOD). An interactive code (MATCHG) was written to compare material parameter inputs with charging results. The first major application of NASCAP was performed on the SCATHA satellite. Shadowing and charging calculations were completed. NASCAP was installed at the Air Force Geophysics Laboratory, where researchers plan to use it to interpret SCATHA data.

  14. Mathematical Model Formulation And Validation Of Water And Solute Transport In Whole Hamster Pancreatic Islets

    PubMed Central

    Benson, Charles T.; Critser, John K.

    2014-01-01

    Optimization of cryopreservation protocols for cells and tissues requires accurate models of heat and mass transport. Model selection often depends on the configuration of the tissue. Here, a mathematical and conceptual model of water and solute transport for whole hamster pancreatic islets has been developed and experimentally validated incorporating fundamental biophysical data from previous studies on individual hamster islet cells while retaining whole-islet structural information. It describes coupled transport of water and solutes through the islet by three methods: intracellularly, intercellularly, and in combination. In particular we use domain decomposition techniques to couple a transmembrane flux model with an interstitial mass transfer model. The only significant undetermined variable is the cellular surface area which is in contact with the intercellularly transported solutes, A_is. The model was validated and A_is determined using a 3 × 3 factorial experimental design blocked for experimental day. Whole islet physical experiments were compared with model predictions at three temperatures, three perfusing solutions, and three islet size groups. A mean of 4.4 islets were compared at each of the 27 experimental conditions and found to correlate with a coefficient of determination of 0.87 ± 0.06 (mean ± S.D.). Only the treatment variable of perfusing solution was found to be significant (p < 0.05). We have devised a model that retains much of the intrinsic geometric configuration of the system, so fewer laboratory experiments are needed to determine model parameters and to develop new optimized cryopreservation protocols. Additionally, extensions to ovarian follicles and other concentric tissue structures may be made. PMID:24950195
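The fit statistic reported above, a coefficient of determination of 0.87 ± 0.06, compares model predictions against measured responses. A minimal sketch of how such an R² is computed; the arrays below are hypothetical normalized volume values for illustration only, not data from the study:

```python
# Coefficient of determination (R^2) between measured values and model
# predictions. The example series below are hypothetical.

def r_squared(observed, predicted):
    """R^2 = 1 - SS_res / SS_tot for paired observations/predictions."""
    mean_obs = sum(observed) / len(observed)
    ss_tot = sum((y - mean_obs) ** 2 for y in observed)
    ss_res = sum((y, p) == () or (y - p) ** 2 for y, p in zip(observed, predicted))
    return 1 - ss_res / ss_tot

# Hypothetical normalized islet-volume time series (measured vs. model)
measured  = [1.00, 0.82, 0.71, 0.65, 0.62, 0.61]
predicted = [1.00, 0.84, 0.72, 0.64, 0.61, 0.60]
print(round(r_squared(measured, predicted), 3))  # → 0.993
```

An R² near 1 indicates the model explains almost all of the variance in the measurements; the study's mean of 0.87 corresponds to a looser but still strong fit.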

  15. Validating Farmers' Indigenous Social Networks for Local Seed Supply in Central Rift Valley of Ethiopia.

    ERIC Educational Resources Information Center

    Seboka, B.; Deressa, A.

    2000-01-01

    Indigenous social networks of Ethiopian farmers participate in seed exchange based on mutual interdependence and trust. A government-imposed extension program must validate the role of local seed systems in developing a national seed industry. (SK)

  16. Farmer Experience of Pluralistic Agricultural Extension, Malawi

    ERIC Educational Resources Information Center

    Chowa, Clodina; Garforth, Chris; Cardey, Sarah

    2013-01-01

    Purpose: Malawi's current extension policy supports pluralism and advocates responsiveness to farmer demand. We investigate whether smallholder farmers' experience supports the assumption that access to multiple service providers leads to extension and advisory services that respond to the needs of farmers. Design/methodology/approach: Within a…

  17. Validation of High-Fidelity Reactor Physics Models for Support of the KJRR Experimental Campaign in the Advanced Test Reactor

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nigg, David W.; Nielsen, Joseph W.; Norman, Daren R.

    The Korea Atomic Energy Research Institute is currently in the process of qualifying a Low-Enriched Uranium fuel element design for the new Ki-Jang Research Reactor (KJRR). As part of this effort, a prototype KJRR fuel element was irradiated for several operating cycles in the Northeast Flux Trap of the Advanced Test Reactor (ATR) at the Idaho National Laboratory. The KJRR fuel element contained a very large quantity of fissile material (618 g 235U) in comparison with historical ATR experiment standards (<1 g 235U), and its presence in the ATR flux trap was expected to create a neutronic configuration well outside the approved validation envelope for the reactor physics analysis methods used to support ATR operations. Accordingly, it was necessary, prior to high-power irradiation of the KJRR fuel element in the ATR, to conduct an extensive set of new low-power physics measurements with the KJRR fuel element installed in the ATR Critical Facility (ATRC), a companion facility to the ATR located in an immediately adjacent building and sharing the same fuel handling and storage canal. The new measurements had the objective of expanding the validation envelope for the computational reactor physics tools used to support ATR operations and safety analysis to include the planned KJRR irradiation in the ATR and similar experiments that are anticipated in the future. The computational and experimental results demonstrated that the neutronic behavior of the KJRR fuel element in the ATRC is well understood, both in terms of its general effects on core excess reactivity, fission power distributions, and the calibration of the core lobe power measurement system, and in terms of its own internal fission rate distribution and total fission power per unit ATRC core power. Taken as a whole, these results have significantly extended the ATR physics validation envelope, thereby enabling an entire new class of irradiation experiments.

  18. The Structured Assessment of Violence Risk in Adults with Intellectual Disability: A Systematic Review.

    PubMed

    Hounsome, J; Whittington, R; Brown, A; Greenhill, B; McGuire, J

    2018-01-01

    While structured professional judgement approaches to assessing and managing the risk of violence have been extensively examined in mental health/forensic settings, their application to people with an intellectual disability is less extensively researched and reviewed. This review aimed to assess whether risk assessment tools have adequate predictive validity for violence in adults with an intellectual disability. Standard systematic review methodology was used to identify and synthesize appropriate studies. A total of 14 studies were identified as meeting the inclusion criteria. These studies assessed the predictive validity of 18 different risk assessment tools, mainly in forensic settings. All studies concluded that the tools assessed were successful in predicting violence, and studies were generally of high quality. There is good quality evidence that risk assessment tools are valid for people with intellectual disability who offend, but further research is required to validate individual tools for use with this population. © 2016 John Wiley & Sons Ltd.

  19. Expanding the Nomological Net of the Pathological Narcissism Inventory: German Validation and Extension in a Clinical Inpatient Sample.

    PubMed

    Morf, Carolyn C; Schürch, Eva; Küfner, Albrecht; Siegrist, Philip; Vater, Aline; Back, Mitja; Mestel, Robert; Schröder-Abé, Michela

    2017-06-01

    The Pathological Narcissism Inventory (PNI) is a multidimensional measure for assessing grandiose and vulnerable features in narcissistic pathology. The aim of the present research was to construct and validate a German translation of the PNI and to provide further information on the PNI's nomological net. Findings from a first study confirm the psychometric soundness of the PNI and replicate its seven-factor first-order structure. A second-order structure was also supported but with several equivalent models. A second study investigating associations with a broad range of measures (DSM Axis I and II constructs, emotions, personality traits, interpersonal and dysfunctional behaviors, and well-being) supported the concurrent validity of the PNI. Discriminant validity with the Narcissistic Personality Inventory was also shown. Finally, in a third study an extension in a clinical inpatient sample provided further evidence that the PNI is a useful tool to assess the more pathological end of narcissism.

  20. A Decision Tree for Nonmetric Sex Assessment from the Skull.

    PubMed

    Langley, Natalie R; Dudzik, Beatrix; Cloutier, Alesia

    2018-01-01

    This study uses five well-documented cranial nonmetric traits (glabella, mastoid process, mental eminence, supraorbital margin, and nuchal crest) and one additional trait (zygomatic extension) to develop a validated decision tree for sex assessment. The decision tree was built and cross-validated on a sample of 293 U.S. White individuals from the William M. Bass Donated Skeletal Collection. Ordinal scores from the six traits were analyzed using the partition modeling option in JMP Pro 12. A holdout sample of 50 skulls was used to test the model. The most accurate decision tree includes three variables: glabella, zygomatic extension, and mastoid process. This decision tree yielded 93.5% accuracy on the training sample, 94% on the cross-validated sample, and 96% on a holdout validation sample. Linear weighted kappa statistics indicate acceptable agreement among observers for these variables. Mental eminence should be avoided, and definitions and figures should be referenced carefully to score nonmetric traits. © 2017 American Academy of Forensic Sciences.
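Partition models of the kind described above reduce to a small set of threshold rules on ordinal trait scores. A minimal sketch of such a three-trait tree, assuming 1-5 ordinal scoring; the split points and the helper `assess_sex` are hypothetical illustrations, not the published JMP Pro partition:

```python
# Illustrative three-node decision tree over ordinal cranial-trait scores
# (1 = gracile/female-like ... 5 = robust/male-like). Split points are
# hypothetical placeholders, not the study's fitted thresholds.

def assess_sex(glabella, zygomatic_extension, mastoid_process):
    """Return 'male' or 'female' from three ordinal scores (1-5)."""
    if glabella >= 4:                # root split on the strongest trait
        return "male"
    if zygomatic_extension >= 3:     # second split
        return "male" if mastoid_process >= 3 else "female"
    return "female"

print(assess_sex(glabella=5, zygomatic_extension=2, mastoid_process=2))  # male
print(assess_sex(glabella=2, zygomatic_extension=3, mastoid_process=2))  # female
```

The appeal of this structure for casework is that each classification is an auditable chain of at most three scored observations rather than an opaque weighted sum.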

  1. Ensuring Data Quality in Extension Research and Evaluation Studies

    ERIC Educational Resources Information Center

    Radhakrishna, Rama; Tobin, Daniel; Brennan, Mark; Thomson, Joan

    2012-01-01

    This article presents a checklist as a guide for Extension professionals to use in research and evaluation studies they carry out. A total of 40 statements grouped under eight data quality components--relevance, objectivity, validity, reliability, integrity, generalizability, completeness, and utility--are identified to ensure that research…

  2. Leadership Development Seminar: Developing Human Capital through Extension Leadership Programs. Proceedings (Manhattan, Kansas, August 6, 1989).

    ERIC Educational Resources Information Center

    Bolton, Elizabeth B.; White, Lynn

    Nineteen papers are included in this document: "Potential and Impact: Assessment and Validation in Leadership Development" (Boatman); "Using an Organizational Diagnostic Instrument to Analyze Perceptions of the Virginia Extension Homemakers Council" (Newhouse, Chandler, Tuckwiller); "Image: Who Needs It?" (Hendricks,…

  3. If Anything Can Go Wrong, Maybe It Will.

    ERIC Educational Resources Information Center

    Wager, Jane C.; Rayner, Gail T.

    Thirty personnel involved in various stages of the Training Extension Course (TEC) design, development, and distribution process were interviewed by telephone to determine the major problems perceived within each stage of the program, which provides validated extension training wherever U.S. soldiers are stationed. Those interviewed included TEC…

  4. An improved method to determine neuromuscular properties using force laws - From single muscle to applications in human movements.

    PubMed

    Siebert, T; Sust, M; Thaller, S; Tilp, M; Wagner, H

    2007-04-01

    We evaluate an improved method for individually determining neuromuscular properties in vivo. The method is based on Hill's equation used as a force law combined with Newton's equation of motion. To ensure the range of validity of Hill's equation, we first perform detailed investigations on in vitro single muscles. The force-velocity relation determined with the model coincides well with results obtained by standard methods (r=.99) above 20% of the isometric force. In addition, the model-predicted force curves during work loop contractions agree very well with measurements (mean difference: 2-3%). Subsequently, we deduce theoretically under which conditions it is possible to combine several muscles of the human body into model muscles. This leads to a model equation for human leg extension movements containing parameters for the muscle properties and for the activation. To numerically determine these invariant neuromuscular properties, we devise an experimental method based on concentric and isometric leg extensions. With this method we determine individual muscle parameters from experiments such that the simulated curves agree well with experiments (r=.99). A reliability test with 12 participants revealed correlations of r=.72-.91 for the neuromuscular parameters (p<.01). Predictions of similar movements under different conditions show mean errors of about 5%. In addition, we present applications in sports practice and theory.
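Hill's equation, the force law underlying the method above, relates concentric muscle force F to shortening velocity v via (F + a)(v + b) = (F0 + a)·b, where F0 is the isometric force and a, b are Hill's constants. A minimal sketch solving it for force; the parameter values are hypothetical, for illustration only:

```python
# Hill's force-velocity relation, (F + a)(v + b) = (F0 + a) * b,
# solved for force at a given shortening velocity. Parameter values
# below are hypothetical.

def hill_force(v, F0, a, b):
    """Concentric force (N) at shortening velocity v (m/s)."""
    return (F0 + a) * b / (v + b) - a

F0, a, b = 1000.0, 250.0, 0.3   # isometric force and Hill constants (hypothetical)
print(round(hill_force(0.0, F0, a, b), 6))  # isometric case → 1000.0
```

The ratio a/F0 sets the curvature of the hyperbola: force falls off steeply with velocity for small a/F0 and more gradually for large a/F0.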

  5. Additional Evidence for the Reliability and Validity of the Student Risk Screening Scale at the High School Level: A Replication and Extension

    ERIC Educational Resources Information Center

    Lane, Kathleen Lynne; Oakes, Wendy P.; Ennis, Robin Parks; Cox, Meredith Lucille; Schatschneider, Christopher; Lambert, Warren

    2013-01-01

    This study reports findings from a validation study of the Student Risk Screening Scale for use with 9th- through 12th-grade students (N = 1854) attending a rural fringe school. Results indicated high internal consistency, test-retest stability, and inter-rater reliability. Predictive validity was established across two academic years, with Spring…

  6. Validity of Factors of the Psychopathy Checklist-Revised in Female Prisoners: Discriminant Relations with Antisocial Behavior, Substance Abuse, and Personality

    ERIC Educational Resources Information Center

    Kennealy, Patrick J.; Hicks, Brian M.; Patrick, Christopher J.

    2007-01-01

    The validity of the Psychopathy Checklist-Revised (PCL-R) has been examined extensively in men, but its validity for women remains understudied. Specifically, the correlates of the general construct of psychopathy and its components as assessed by PCL-R total, factor, and facet scores have yet to be examined in depth. Based on previous research…

  7. An Extension Convergent Validity Study of the "Systematic Screening for Behavior Disorders" and the Achenbach "Teacher's Report Form" with Middle and High School Students with Emotional Disturbances

    ERIC Educational Resources Information Center

    Benner, Gregory J.; Uhing, Brad M.; Pierce, Corey D.; Beaudoin, Kathleen M.; Ralston, Nicole C.; Mooney, Paul

    2009-01-01

    We sought to extend instrument validation research for the Systematic Screening for Behavior Disorders (SSBD) (Walker & Severson, 1990) using convergent validation techniques. Associations between Critical Events, Adaptive Behavior, and Maladaptive Behavior indices of the SSBD were examined in relation to syndrome, broadband, and total scores…

  8. Validation of the Leap Motion Controller using markered motion capture technology.

    PubMed

    Smeragliuolo, Anna H; Hill, N Jeremy; Disla, Luis; Putrino, David

    2016-06-14

    The Leap Motion Controller (LMC) is a low-cost, markerless motion capture device that tracks hand, wrist and forearm position. Integration of this technology into healthcare applications has begun to occur rapidly, making validation of the LMC's data output an important research goal. Here, we perform a detailed evaluation of the kinematic data output from the LMC, and validate this output against gold-standard, markered motion capture technology. We instructed subjects to perform three clinically-relevant wrist (flexion/extension, radial/ulnar deviation) and forearm (pronation/supination) movements. The movements were simultaneously tracked using both the LMC and a marker-based motion capture system from Motion Analysis Corporation (MAC). Adjusting for known inconsistencies in the LMC sampling frequency, we compared simultaneously acquired LMC and MAC data by performing Pearson's correlation (r) and root mean square error (RMSE). Wrist flexion/extension and radial/ulnar deviation showed good overall agreement (r=0.95; RMSE=11.6°, and r=0.92; RMSE=12.4°, respectively) with the MAC system. However, when tracking forearm pronation/supination, there were serious inconsistencies in reported joint angles (r=0.79; RMSE=38.4°). Hand posture significantly influenced the quality of wrist deviation (P<0.005) and forearm supination/pronation (P<0.001), but not wrist flexion/extension (P=0.29). We conclude that the LMC is capable of providing data that are clinically meaningful for wrist flexion/extension, and perhaps wrist deviation. It cannot yet return clinically meaningful data for measuring forearm pronation/supination. Future studies should continue to validate the LMC as updated versions of their software are developed. Copyright © 2016 Elsevier Ltd. All rights reserved.
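The agreement statistics used in the study above, Pearson's r and RMSE, can be computed directly from paired angle samples. A minimal sketch; the `lmc`/`mac` joint-angle traces are hypothetical illustrations, not the study's data:

```python
# Pearson's correlation and RMSE between two simultaneously sampled
# joint-angle traces (values in degrees; example data are hypothetical).
import math

def pearson_r(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def rmse(x, y):
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(x, y)) / len(x))

lmc = [0.0, 15.2, 31.0, 44.8, 60.5]   # markerless-device wrist angles
mac = [1.1, 14.0, 29.5, 46.2, 59.0]   # marker-based reference angles
print(round(pearson_r(lmc, mac), 3), round(rmse(lmc, mac), 2))  # → 0.998 1.35
```

Reporting both metrics matters because r captures only linear co-variation: two traces can correlate highly while differing by a large constant or scaled offset, which RMSE exposes.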

  9. Soil Moisture Retrieval Through Changing Corn Using Active/Passive Microwave Remote Sensing

    NASA Technical Reports Server (NTRS)

    ONeill, P. E.; Joseph, A.; DeLannoy, G.; Lang, R.; Utku, C.; Kim, E.; Houser, P.; Gish, T.

    2003-01-01

    An extensive field experiment was conducted from May-early October, 2002 at the heavily instrumented USDA-ARS (U.S. Dept. of Agriculture-Agricultural Research Service) OPE3 (Optimizing Production Inputs for Economic and Environmental Enhancement) test site in Beltsville, MD to acquire data needed to address active/passive microwave algorithm, modeling, and ground validation issues for accurate soil moisture retrieval. During the experiment, a tower-mounted 1.4 GHz radiometer (Lrad) and a truck-mounted dual-frequency (1.6 and 4.75 GHz) radar system were deployed on the northern edge of the site. The soil in this portion of the field is a sandy loam (silt 23.5%, sand 60.3%, clay 16.1%) with a measured bulk density of 1.253 g/cu cm. Vegetation cover in the experiment consisted of a corn crop which was measured from just after planting on April 17, 2002 through senescence and harvesting on October 2. Although drought conditions prevailed during the summer, the corn yield was near average, with peak biomass reached in late July.

  10. Mousetrap: An integrated, open-source mouse-tracking package.

    PubMed

    Kieslich, Pascal J; Henninger, Felix

    2017-10-01

    Mouse-tracking - the analysis of mouse movements in computerized experiments - is becoming increasingly popular in the cognitive sciences. Mouse movements are taken as an indicator of commitment to or conflict between choice options during the decision process. Using mouse-tracking, researchers have gained insight into the temporal development of cognitive processes across a growing number of psychological domains. In the current article, we present software that offers easy and convenient means of recording and analyzing mouse movements in computerized laboratory experiments. In particular, we introduce and demonstrate the mousetrap plugin that adds mouse-tracking to OpenSesame, a popular general-purpose graphical experiment builder. By integrating with this existing experimental software, mousetrap allows for the creation of mouse-tracking studies through a graphical interface, without requiring programming skills. Thus, researchers can benefit from the core features of a validated software package and the many extensions available for it (e.g., the integration with auxiliary hardware such as eye-tracking, or the support of interactive experiments). In addition, the recorded data can be imported directly into the statistical programming language R using the mousetrap package, which greatly facilitates analysis. Mousetrap is cross-platform, open-source and available free of charge from https://github.com/pascalkieslich/mousetrap-os.

  11. The Paucity Problem: Where Have All the Space Reactor Experiments Gone?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bess, John D.; Marshall, Margaret A.

    2016-10-01

    The Handbooks of the International Criticality Safety Benchmark Evaluation Project (ICSBEP) and the International Reactor Physics Experiment Evaluation Project (IRPhEP) together contain a plethora of documented and evaluated experiments essential to the validation of nuclear data, neutronics codes, and modeling of various nuclear systems. Unfortunately, only a minute selection of handbook data (twelve evaluations) is of actual experimental facilities and mockups designed specifically for space nuclear research. There is a paucity problem: the multitude of space nuclear experimental activities performed in the past several decades have yet to be recovered and made available in such detail that the international community could benefit from these valuable historical research efforts. Those experiments represent extensive investments in infrastructure, expertise, and cost, and constitute significantly valuable resources of data supporting past, present, and future research activities. The ICSBEP and IRPhEP were established to identify and verify comprehensive sets of benchmark data; evaluate the data, including quantification of biases and uncertainties; compile the data and calculations in a standardized format; and formally document the effort into a single source of verified benchmark data.

  12. Off-design Performance Analysis of Multi-Stage Transonic Axial Compressors

    NASA Astrophysics Data System (ADS)

    Du, W. H.; Wu, H.; Zhang, L.

    Because of the complex flow fields and component interaction in modern gas turbine engines, extensive experiments are required to validate performance and stability, and the experimental process can become expensive and complex. Modeling and simulation of gas turbine engines are a way to reduce experimental costs, provide fidelity and enhance the quality of essential experiments. The flow field of a transonic compressor contains all the flow aspects that are difficult to predict: boundary-layer transition and separation, shock/boundary-layer interactions, and large flow unsteadiness. Accurate off-design performance prediction for transonic axial compressors is especially difficult, due in large part to three-dimensional blade design and the resulting flow field. Although recent advancements in computer capacity have brought computational fluid dynamics to the forefront of turbomachinery design and analysis, grid resolution and turbulence models still limit Reynolds-averaged Navier-Stokes (RANS) approximations in the multi-stage transonic axial compressor flow field. Streamline curvature methods therefore remain a dominant numerical approach and an important tool for turbomachinery analysis and design, and it is generally accepted that streamline curvature solution techniques will provide satisfactory flow prediction as long as the losses, deviation and blockage are accurately predicted.

  13. Preliminary design of the redundant software experiment

    NASA Technical Reports Server (NTRS)

    Campbell, Roy; Deimel, Lionel; Eckhardt, Dave, Jr.; Kelly, John; Knight, John; Lauterbach, Linda; Lee, Larry; Mcallister, Dave; Mchugh, John

    1985-01-01

    The goal of the present experiment is to characterize the fault distributions of highly reliable software replicates, constructed using techniques and environments similar to those used in contemporary industrial software facilities. The fault distributions and their effect on the reliability of fault-tolerant configurations of the software will be determined through extensive life testing of the replicates against carefully constructed, randomly generated test data. Each detected error will be carefully analyzed to provide insight into its nature and cause. A direct objective is to develop techniques for reducing the intensity of coincident errors, thus increasing the reliability gain which can be achieved with fault tolerance. Data on the reliability gains realized and the cost of the fault-tolerant configurations can be used to design a companion experiment to determine the cost effectiveness of the fault-tolerant strategy. Finally, the data and analysis produced by this experiment will be valuable to the software engineering community as a whole because they will provide useful insight into the nature and cause of hard-to-find, subtle faults which escape standard software engineering validation techniques and thus persist far into the software life cycle.

  14. New instrument for measuring student beliefs about physics and learning physics: The Colorado Learning Attitudes about Science Survey

    NASA Astrophysics Data System (ADS)

    Adams, W. K.; Perkins, K. K.; Podolefsky, N. S.; Dubson, M.; Finkelstein, N. D.; Wieman, C. E.

    2006-06-01

    The Colorado Learning Attitudes about Science Survey (CLASS) is a new instrument designed to measure student beliefs about physics and about learning physics. This instrument extends previous work by probing additional aspects of student beliefs and by using wording suitable for students in a wide variety of physics courses. The CLASS has been validated using interviews, reliability studies, and extensive statistical analyses of responses from over 5000 students. In addition, a new methodology for determining useful and statistically robust categories of student beliefs has been developed. This paper serves as the foundation for an extensive study of how student beliefs impact and are impacted by their educational experiences. For example, this survey measures the following: that most teaching practices cause substantial drops in student scores; that a student’s likelihood of becoming a physics major correlates with their “Personal Interest” score; and that, for a majority of student populations, women’s scores in some categories, including “Personal Interest” and “Real World Connections,” are significantly different from men’s scores.

  15. Learning optimal embedded cascades.

    PubMed

    Saberian, Mohammad Javad; Vasconcelos, Nuno

    2012-10-01

    The problem of automatic and optimal design of embedded object detector cascades is considered. Two main challenges are identified: optimization of the cascade configuration and optimization of individual cascade stages, so as to achieve the best tradeoff between classification accuracy and speed, under a detection rate constraint. Two novel boosting algorithms are proposed to address these problems. The first, RCBoost, formulates boosting as a constrained optimization problem which is solved with a barrier penalty method. The constraint is the target detection rate, which is met at all iterations of the boosting process. This enables the design of embedded cascades of known configuration without extensive cross validation or heuristics. The second, ECBoost, searches over cascade configurations to achieve the optimal tradeoff between classification risk and speed. The two algorithms are combined into an overall boosting procedure, RCECBoost, which optimizes both the cascade configuration and its stages under a detection rate constraint, in a fully automated manner. Extensive experiments in face, car, pedestrian, and panda detection show that the resulting detectors achieve an accuracy versus speed tradeoff superior to those of previous methods.

  16. Adaptive Multi-Agent Systems for Constrained Optimization

    NASA Technical Reports Server (NTRS)

    Macready, William; Bieniawski, Stefan; Wolpert, David H.

    2004-01-01

    Product Distribution (PD) theory is a new framework for analyzing and controlling distributed systems. Here we demonstrate its use for distributed stochastic optimization. First we review one motivation of PD theory, as the information-theoretic extension of conventional full-rationality game theory to the case of bounded rational agents. In this extension the equilibrium of the game is the optimizer of a Lagrangian of the (probability distribution of) the joint state of the agents. When the game in question is a team game with constraints, that equilibrium optimizes the expected value of the team game utility, subject to those constraints. The updating of the Lagrange parameters in the Lagrangian can be viewed as a form of automated annealing that focuses the MAS more and more on the optimal pure strategy. This provides a simple way to map the solution of any constrained optimization problem onto the equilibrium of a Multi-Agent System (MAS). We present computer experiments involving both the Queens problem and K-SAT validating the predictions of PD theory and its use for off-the-shelf distributed adaptive optimization.

  17. Flight test experience and controlled impact of a remotely piloted jet transport aircraft

    NASA Technical Reports Server (NTRS)

    Horton, Timothy W.; Kempel, Robert W.

    1988-01-01

    The Dryden Flight Research Center Facility of NASA Ames Research Center (Ames-Dryden) and the FAA conducted the controlled impact demonstration (CID) program using a large, four-engine, remotely piloted jet transport airplane. Closed-loop primary flight was controlled through the existing onboard PB-20D autopilot, which had been modified for the CID program. Uplink commands were sent from a ground-based cockpit and digital computer in conjunction with an up-down telemetry link. These uplink commands were received aboard the airplane and transferred through uplink interface systems to the modified PB-20D autopilot. Both proportional and discrete commands were produced by the ground system. Prior to flight tests, extensive simulation was conducted during the development of ground-based digital control laws. The control laws included primary control, secondary control, and racetrack and final approach guidance. Extensive ground checks were performed on all remotely piloted systems; however, piloted flight tests were the primary method of validating control law concepts developed from simulation. The design, development, and flight testing of control laws and systems required to accomplish the remotely piloted mission are discussed.

  18. Impact of model development, calibration and validation decisions on hydrological simulations in West Lake Erie Basin

    USDA-ARS?s Scientific Manuscript database

    Watershed simulation models are used extensively to investigate hydrologic processes, landuse and climate change impacts, pollutant load assessments and best management practices (BMPs). Developing, calibrating and validating these models require a number of critical decisions that will influence t...

  19. 77 FR 25469 - Applications for New Awards; Investing in Innovation Fund, Validation

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-04-30

    ... DEPARTMENT OF EDUCATION Applications for New Awards; Investing in Innovation Fund, Validation... Innovation and Improvement, Department of Education. ACTION: Notice; extension of deadline date and correction. SUMMARY: On March 27, 2012, the Office of Innovation and Improvement in the U.S. Department of...

  20. Powerful model for the point source sky: Far-ultraviolet and enhanced midinfrared performance

    NASA Technical Reports Server (NTRS)

    Cohen, Martin

    1994-01-01

    I report further developments of the Wainscoat et al. (1992) model originally created for the point source infrared sky. The already detailed and realistic representation of the Galaxy (disk, spiral arms and local spur, molecular ring, bulge, spheroid) has been improved, guided by CO surveys of local molecular clouds, and by the inclusion of a component to represent Gould's Belt. The newest version of the model is very well validated by Infrared Astronomy Satellite (IRAS) source counts. A major new aspect is the extension of the same model down to the far ultraviolet. I compare predicted and observed far-ultraviolet source counts from the Apollo 16 'S201' experiment (1400 A) and the TD1 satellite (for the 1565 A band).

  1. Recent developments in Geant4

    DOE PAGES

    Allison, J.; Amako, K.; Apostolakis, J.; ...

    2016-07-01

Geant4 is a software toolkit for the simulation of the passage of particles through matter. It is used by a large number of experiments and projects in a variety of application domains, including high energy physics, astrophysics and space science, medical physics and radiation protection. Over the past several years, major changes have been made to the toolkit in order to accommodate the needs of these user communities, and to efficiently exploit the growth of computing power made available by advances in technology. The adaptation of Geant4 to multithreading, advances in physics, detector modeling and visualization, extensions to the toolkit, including biasing and reverse Monte Carlo, and tools for physics and release validation are discussed here.

  2. Why are Formal Methods Not Used More Widely?

    NASA Technical Reports Server (NTRS)

    Knight, John C.; DeJong, Colleen L.; Gibble, Matthew S.; Nakano, Luis G.

    1997-01-01

Despite extensive development over many years and significant demonstrated benefits, formal methods remain poorly accepted by industrial practitioners. Many reasons have been suggested for this situation, such as claims that they extend the development cycle, that they require difficult mathematics, that inadequate tools exist, and that they are incompatible with other software packages. There is little empirical evidence that any of these reasons is valid. The research presented here addresses the question of why formal methods are not used more widely. The approach used was to develop a formal specification for a safety-critical application using several specification notations and to assess the results in a comprehensive evaluation framework. The results of the experiment suggest that many impediments to the routine use of formal methods remain.

  3. Revision of the criterion to avoid electron heating during laser aided plasma diagnostics (LAPD)

    NASA Astrophysics Data System (ADS)

    Carbone, E. A. D.; Palomares, J. M.; Hübner, S.; Iordanova, E.; van der Mullen, J. J. A. M.

    2012-01-01

A criterion is given for the laser fluence (in J/m2) such that, when it is satisfied, disturbance of the plasma by the laser is avoided. This criterion accounts for laser heating of the electron gas mediated by electron-ion (ei) and electron-atom (ea) interactions. The first heating mechanism is well known and was extensively dealt with in the past. The second is often overlooked but is important for plasmas with a low degree of ionization. It is especially important for cold atmospheric plasmas, which nowadays stand at the focus of attention. The new criterion, based on the concerted action of both ei and ea interactions, is validated by Thomson scattering experiments performed on four different plasmas.

  4. Acceleration feedback of a current-following synchronized control algorithm for telescope elevation axis

    NASA Astrophysics Data System (ADS)

    Tang, Tao; Zhang, Tong; Du, Jun-Feng; Ren, Ge; Tian, Jing

    2016-11-01

    This paper proposes a dual-motor configuration to enhance closed-loop performance of a telescope control system. Two identical motors are mounted on each side of a U-type frame to drive the telescope elevation axis instead of a single motor drive, which is usually used in a classical design. This new configuration and mechanism can reduce the motor to half the size used in the former design, and it also provides some other advantages. A master-slave current control mode is employed to synchronize the two motors. Acceleration feedback control is utilized to further enhance the servo performance. Extensive experiments are used to validate the effectiveness of the proposed control algorithm in synchronization, disturbance attenuation and low-velocity tracking.

  5. A novel approach to describing and detecting performance anti-patterns

    NASA Astrophysics Data System (ADS)

    Sheng, Jinfang; Wang, Yihan; Hu, Peipei; Wang, Bin

    2017-08-01

Anti-pattern, as an extension of the pattern concept, describes a widely used poor solution that can negatively influence application systems. To address the shortcomings of existing anti-pattern descriptions, a description method based on first-order predicates is proposed. This method synthesizes anti-pattern forms and symptoms, which makes the description more accurate while retaining good scalability and versatility. To improve the accuracy of anti-pattern detection, a Bayesian classification method is applied to validate detection results, reducing both false negatives and false positives. Finally, the proposed approach is applied to a small e-commerce system, and its feasibility and effectiveness are demonstrated through experiments.
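The Bayesian validation step described in the abstract can be illustrated with a toy sketch: a naive Bayes classifier over boolean symptom features that reclassifies raw detections as true or false positives. The feature names and training data here are hypothetical, not taken from the paper:

```python
import math
from collections import defaultdict

class DetectionFilter:
    """Toy naive Bayes filter over boolean anti-pattern symptoms."""

    def __init__(self):
        self.counts = {True: defaultdict(int), False: defaultdict(int)}
        self.totals = {True: 0, False: 0}

    def fit(self, samples):
        # samples: iterable of (symptoms: dict[str, bool], is_true_positive: bool)
        for symptoms, label in samples:
            self.totals[label] += 1
            for name, present in symptoms.items():
                if present:
                    self.counts[label][name] += 1

    def is_true_positive(self, symptoms):
        def log_posterior(label):
            n = self.totals[label]
            # Laplace-smoothed class prior and per-feature likelihoods
            score = math.log((n + 1) / (sum(self.totals.values()) + 2))
            for name, present in symptoms.items():
                p = (self.counts[label][name] + 1) / (n + 2)
                score += math.log(p if present else 1.0 - p)
            return score
        return log_posterior(True) > log_posterior(False)
```

A detection whose symptoms resemble previously confirmed anti-patterns is kept; the rest are discarded as likely false positives, which is how such a filter reduces false positives without re-running the detector.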

  6. A robust vision-based sensor fusion approach for real-time pose estimation.

    PubMed

    Assa, Akbar; Janabi-Sharifi, Farrokh

    2014-02-01

    Object pose estimation is of great importance to many applications, such as augmented reality, localization and mapping, motion capture, and visual servoing. Although many approaches based on a monocular camera have been proposed, only a few works have concentrated on applying multicamera sensor fusion techniques to pose estimation. Higher accuracy and enhanced robustness toward sensor defects or failures are some of the advantages of these schemes. This paper presents a new Kalman-based sensor fusion approach for pose estimation that offers higher accuracy and precision, and is robust to camera motion and image occlusion, compared to its predecessors. Extensive experiments are conducted to validate the superiority of this fusion method over currently employed vision-based pose estimation algorithms.
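The core idea of Kalman-based measurement fusion can be sketched in a few lines: each camera's measurement update shrinks the pose uncertainty, so two cameras yield a better estimate than either alone. This is a minimal scalar illustration with assumed noise variances, not the paper's full multicamera filter:

```python
def kalman_update(x, P, z, R):
    """One scalar Kalman measurement update.
    x, P: prior estimate and its variance; z, R: measurement and its variance."""
    K = P / (P + R)          # Kalman gain: trust the less uncertain source more
    x_new = x + K * (z - x)  # corrected estimate
    P_new = (1.0 - K) * P    # fused uncertainty is always reduced
    return x_new, P_new

# Prior pose estimate, then two cameras processed sequentially.
x, P = 0.0, 1.0
x, P = kalman_update(x, P, 1.0, 0.5)   # camera 1: z = 1.0, noise var 0.5
x, P = kalman_update(x, P, 0.8, 0.25)  # camera 2: more accurate sensor
```

Because the posterior variance shrinks after every update, a defective camera (large R) contributes little, which is the robustness-to-sensor-failure property the abstract highlights.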

  7. Validation Metrics for Improving Our Understanding of Turbulent Transport - Moving Beyond Proof by Pretty Picture and Loud Assertion

    NASA Astrophysics Data System (ADS)

    Holland, C.

    2013-10-01

Developing validated models of plasma dynamics is essential for confident predictive modeling of current and future fusion devices. This tutorial will present an overview of the key guiding principles and practices for state-of-the-art validation studies, illustrated using examples from investigations of turbulent transport in magnetically confined plasmas. The primary focus of the talk will be the development of quantitative validation metrics, which are essential for moving beyond qualitative and subjective assessments of model performance and fidelity. Particular emphasis and discussion are given to (i) the need for utilizing synthetic diagnostics to enable quantitatively meaningful comparisons between simulation and experiment, and (ii) the importance of robust uncertainty quantification and its inclusion within the metrics. To illustrate these concepts, we first review the structure and key insights gained from commonly used "global" transport model metrics (e.g. predictions of incremental stored energy or radially-averaged temperature), as well as their limitations. Building upon these results, a new form of turbulent transport metrics is then proposed, which focuses upon comparisons of predicted local gradients and fluctuation characteristics against observation. We demonstrate the utility of these metrics by applying them to simulations and modeling of a newly developed "validation database" derived from the results of a systematic, multi-year turbulent transport validation campaign on the DIII-D tokamak, in which comprehensive profile and fluctuation measurements have been obtained from a wide variety of heating and confinement scenarios. Finally, we discuss extensions of these metrics and their underlying design concepts to other areas of plasma confinement research, including both magnetohydrodynamic stability and integrated scenario modeling. Supported by the US DOE under DE-FG02-07ER54917 and DE-FC02-08ER54977.
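As a concrete illustration of the kind of quantitative metric the tutorial advocates, one simple form is the deviation between prediction and observation normalized by the combined uncertainty, averaged over points and mapped to [0, 1]. The functional form below is illustrative only, not the specific metric proposed in the talk:

```python
import math

def normalized_deviation(pred, obs, sigma_pred, sigma_obs):
    """Deviation of one predicted point from observation, in units of
    the combined (quadrature-summed) uncertainty."""
    return abs(pred - obs) / math.sqrt(sigma_pred**2 + sigma_obs**2)

def agreement_metric(preds, obs, sig_preds, sig_obs):
    """Average normalized deviation mapped to [0, 1]; 1.0 = perfect agreement."""
    d = sum(normalized_deviation(p, o, sp, so)
            for p, o, sp, so in zip(preds, obs, sig_preds, sig_obs)) / len(preds)
    return 1.0 / (1.0 + d)
```

Including both the simulation and measurement uncertainties in the denominator is what keeps the score from rewarding agreement that is only apparent because the error bars are large or penalizing disagreement that lies within them.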

  8. Non-destructive inspection in industrial equipment using robotic mobile manipulation

    NASA Astrophysics Data System (ADS)

    Maurtua, Iñaki; Susperregi, Loreto; Ansuategui, Ander; Fernández, Ane; Ibarguren, Aitor; Molina, Jorge; Tubio, Carlos; Villasante, Cristobal; Felsch, Torsten; Pérez, Carmen; Rodriguez, Jorge R.; Ghrissi, Meftah

    2016-05-01

The MAINBOT project has developed service-robot-based applications to autonomously execute inspection tasks in extensive industrial plants, on equipment arranged horizontally (using ground robots) or vertically (climbing robots). The industrial objective has been to provide a means of measuring several physical parameters at multiple points with autonomous robots able to navigate and climb structures while handling non-destructive testing sensors. MAINBOT has validated the solutions in two solar thermal plants (cylindrical-parabolic collectors and central tower), which are very demanding from a mobile manipulation point of view, mainly due to their extent (e.g. a 50 MW thermal solar plant with 400 hectares, 400,000 mirrors, 180 km of absorber tubes, and a 140 m high tower), the variability of conditions (outdoor, day-night), safety requirements, etc. Once the technology was validated in simulation, the system was deployed in real setups and different validation tests were carried out. In this paper, two of the achievements related to the ground mobile inspection system are presented: (1) autonomous navigation, localization, and planning algorithms to manage navigation across huge areas, and (2) non-destructive inspection operations: thermography-based detection algorithms that provide automatic inspection abilities to the robots.

  9. Small Launch Vehicle Trade Space Definition: Development of a Zero Level Mass Estimation Tool with Trajectory Validation

    NASA Technical Reports Server (NTRS)

    Waters, Eric D.

    2013-01-01

Recent high-level interest in the capability of small launch vehicles has placed significant demand on determining the trade space these vehicles occupy. This has led to the development of a zero-level analysis tool that can quickly determine the minimum expected vehicle gross liftoff weight (GLOW) in terms of vehicle stage specific impulse (Isp) and propellant mass fraction (pmf) for any given payload value. Drawing on extensive Earth-to-orbit trajectory experience, the total delta-v the vehicle must achieve can be estimated, including relevant loss terms. This foresight into expected losses allows for more specific assumptions when initially estimating thrust-to-weight values for each stage. The tool was further validated against a trajectory model, in this case the Program to Optimize Simulated Trajectories (POST), to determine whether the initial sizing delta-v was adequate to meet payload expectations. Presented here is a description of how the tool is set up and the approach the analyst must take when using it. Expected outputs, which depend on the type of small launch vehicle being sized, are also displayed. The method of validation is discussed, as well as where the sizing tool fits into the vehicle design process.
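The zero-level sizing logic described above reduces, in its simplest single-stage form, to inverting the rocket equation: given a required delta-v (losses included), stage Isp, and pmf, the propellant mass and hence GLOW follow in closed form for a given payload. This is a hedged sketch of that arithmetic, not the NASA tool itself:

```python
import math

G0 = 9.80665  # standard gravity, m/s^2

def stage_glow(payload_kg, dv, isp, pmf):
    """Single-stage GLOW from the rocket equation.
    dv: total required delta-v incl. losses (m/s); isp: specific impulse (s);
    pmf: propellant mass fraction = propellant / (propellant + stage dry mass)."""
    mr = math.exp(dv / (isp * G0))       # required mass ratio m0 / mf
    denom = 1.0 - mr * (1.0 - pmf)
    if denom <= 0:
        raise ValueError("stage inert mass too high to reach this delta-v")
    mp = payload_kg * (mr - 1.0) * pmf / denom   # propellant mass (kg)
    return payload_kg + mp / pmf                  # payload + propellant + dry mass
```

For example, a 100 kg payload to a 4500 m/s effective delta-v with Isp = 300 s and pmf = 0.9 sizes to a GLOW on the order of 770 kg; small improvements in Isp or pmf shrink that figure rapidly, which is what makes the (Isp, pmf) trade space so sensitive.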

  10. Validation and scaling of soil moisture in a semi-arid environment: SMAP Validation Experiment 2015 (SMAPVEX15)

    USDA-ARS?s Scientific Manuscript database

The NASA SMAP (Soil Moisture Active Passive) mission conducted the SMAP Validation Experiment 2015 (SMAPVEX15) in order to support the calibration and validation activities of the SMAP soil moisture data product. The main goals of the experiment were to address issues regarding the spatial disaggregation...

  11. 42 CFR 137.426 - May an Indian Tribe get an extension of time to file a notice of appeal?

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ...-GOVERNANCE Appeals Pre-Award Disputes § 137.426 May an Indian Tribe get an extension of time to file a notice... time period. If the Indian Tribe has a valid reason for not filing its notice of appeal on time, it may...

  12. Development and Validation of the Career Competencies Indicator (CCI)

    ERIC Educational Resources Information Center

    Francis-Smythe, Jan; Haase, Sandra; Thomas, Erica; Steele, Catherine

    2013-01-01

    This article describes the development and validation of the Career Competencies Indicator (CCI); a 43-item measure to assess career competencies (CCs). Following an extensive literature review, a comprehensive item generation process involving consultation with subject matter experts, a pilot study and a factor analytic study on a large sample…

  13. Using Evaluation to Guide and Validate Improvements to the Utah Master Naturalist Program

    ERIC Educational Resources Information Center

    Larese-Casanova, Mark

    2015-01-01

    Integrating evaluation into an Extension program offers multiple opportunities to understand program success through achieving program goals and objectives, delivering programming using the most effective techniques, and refining program audiences. It is less common that evaluation is used to guide and validate the effectiveness of program…

  14. The Open Curriculum and Selection of Qualified Staff: Instrument Validation.

    ERIC Educational Resources Information Center

    Greene, John F.; And Others

    The impact of open education on today's curriculum has been extensive. Of the many requests for research in this area, none is more important than instrument validation. This study examines the internal structure of Barth's Assumptions about Learning and Knowledge scale and explores its relationship to established "progressivism" and…

  15. Confined cattle feeding trial to validate fecal DNA metabarcoding to inform rangeland free-roaming diet applications

    USDA-ARS?s Scientific Manuscript database

Diet composition of free-roaming livestock and wildlife in extensive rangelands is difficult to quantify. Recent technological advances now allow us to reconstruct plant species-specific dietary protein composition using fecal samples. However, it has been suggested that validation of the method i...

  16. AOAC Official MethodSM Matrix Extension Validation Study of Assurance GDSTM for the Detection of Salmonella in Selected Spices.

    PubMed

    Feldsine, Philip; Kaur, Mandeep; Shah, Khyati; Immerman, Amy; Jucker, Markus; Lienau, Andrew

    2015-01-01

Assurance GDSTM for Salmonella Tq has been validated according to the AOAC INTERNATIONAL Methods Committee Guidelines for Validation of Microbiological Methods for Food and Environmental Surfaces for the detection of Salmonella in selected foods and on environmental surfaces (Official Method of AnalysisSM 2009.03, Performance Tested MethodSM No. 050602). The method also completed AFNOR validation (following the ISO 16140 standard) compared to the reference method EN ISO 6579. For AFNOR, GDS was given a scope covering all human food, animal feed stuff, and environmental surfaces (Certificate No. TRA02/12-01/09). Results showed that Assurance GDS for Salmonella (GDS) has high sensitivity and is equivalent to the reference culture methods for the detection of motile and non-motile Salmonella. As part of the aforementioned validations, inclusivity and exclusivity studies, stability studies, and ruggedness studies were also conducted. Assurance GDS showed 100% inclusivity and exclusivity among the 100 Salmonella serovars and 35 non-Salmonella organisms analyzed. To add to the scope of the Assurance GDS for Salmonella method, a matrix extension study was conducted, following the AOAC guidelines, to validate the application of the method to selected spices, specifically curry powder, cumin powder, and chili powder, for the detection of Salmonella.

  17. Development, validity and reliability of a new pressure air biofeedback device (PAB) for measuring isometric extension strength of the lumbar spine.

    PubMed

    Pienaar, Andries W; Barnard, Justhinus G

    2017-04-01

This study describes the development of a new portable muscle testing device that uses air pressure as a biofeedback and strength testing tool. For this purpose, a pressure air biofeedback device (PAB®) was developed to measure and record the isometric extension strength of the lumbar multifidus muscle in asymptomatic and low back pain (LBP) persons. A total of 42 subjects (age 47.58 years, ±18.58) participated in this study. The validity of PAB® was assessed by comparing a selected measure, air pressure force in millibar (mb), to a standard criterion, calibrated weights in kilograms (kg), during day-to-day tests. Furthermore, clinical trial-to-trial and day-to-day tests of maximum voluntary isometric contraction (MVIC) of the L5 lumbar multifidus were done to compare air pressure force (mb) to electromyography (EMG) in microvolts (μV) and to measure the reliability of PAB®. A highly significant relationship was found between air pressure output (mb) and calibrated weights (kg). In addition, Pearson correlation calculations showed a significant relationship between PAB® force (mb) and EMG activity (μV) for all subjects (n = 42) examined, as well as for the asymptomatic group (n = 24). No relationship was detected for the LBP group (n = 18). In terms of lumbar extension strength, we found that asymptomatic subjects were significantly stronger than LBP subjects. The PAB® test differentiated between the lumbar isometric extension strength of LBP and asymptomatic subjects without any risk to the subjects, and the results also indicate that the lumbar isometric extension test with the new PAB® device is reliable and valid.
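The validity analysis above rests on Pearson correlation between paired readings (air pressure in mb versus calibrated weights in kg, or versus EMG in μV). For reference, the coefficient can be computed directly from the paired samples:

```python
import math

def pearson_r(xs, ys):
    """Pearson product-moment correlation between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)
```

A value of r near +1, as reported between air pressure output and calibrated weights, indicates a near-linear relationship; values near 0, as found for the LBP group, indicate no linear association between the two signals.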

  18. Electrical Capacitance Volume Tomography for the Packed Bed Reactor ISS Flight Experiment

    NASA Technical Reports Server (NTRS)

    Marashdeh, Qussai; Motil, Brian; Wang, Aining; Liang-Shih, Fan

    2013-01-01

Fixed packed bed reactors are compact, require minimum power and maintenance to operate, and are highly reliable. These features make this technology a highly desirable unit operation for long-duration life support systems in space. NASA is developing an ISS experiment to address this technology, with particular focus on water reclamation and air revitalization. Earlier research and development efforts funded by NASA have resulted in two hydrodynamic models which require validation with appropriate instrumentation in an extended microgravity environment. To validate these models, the instantaneous distribution of the gas and liquid phases must be measured. Electrical Capacitance Volume Tomography (ECVT) is a non-invasive imaging technology recently developed for multi-phase flow applications. It is based on distributing flexible capacitance plates on the periphery of a flow column and collecting real-time measurements of inter-electrode capacitances. Capacitance measurements are directly related to the dielectric constant distribution, a physical property that is in turn related to material distribution in the imaging domain. Reconstruction algorithms are employed to map volume images of the dielectric distribution in the imaging domain, which is related to phase distribution. ECVT is suitable for imaging interacting materials of different dielectric constants, typical of multi-phase flow systems. ECVT is being used extensively for measuring flow variables in various gas-liquid and gas-solid flow systems. Recent applications of ECVT include flows in risers and exit regions of circulating fluidized beds, gas-liquid and gas-solid bubble columns, trickle beds, and slurry bubble columns. ECVT is also used to validate flow models and CFD simulations. The technology is uniquely qualified for imaging phase concentrations in packed bed reactors for the ISS flight experiments, as it exhibits the favorable features of compact size, low-profile sensors, high imaging speed, and flexibility to fit around columns of various shapes and sizes. ECVT is also safer than other commonly used imaging modalities, as it operates at low frequencies (around 1 MHz) and does not emit ionizing radiation. In this effort, ECVT is being used to image flow parameters in a packed bed reactor for an ISS flight experiment.

  19. Dynamic compression of water to 700 GPa: single- and double shock experiments on Sandia's Z machine, first principles simulations, and structure of water planets

    NASA Astrophysics Data System (ADS)

    Mattsson, Thomas R.

    2011-11-01

Significant progress has been made over the last few years in high energy density physics (HEDP) by executing high-precision multi-Mbar experiments and performing first-principles simulations for elements ranging from carbon [1] to xenon [2]. The properties of water under HEDP conditions are of particular importance in planetary science due to the existence of ice giants like Neptune and Uranus. Modeling the two planets, as well as water-rich exoplanets, requires knowing the equation of state (EOS), the pressure as a function of density and temperature, of water with high accuracy. Although extensive density functional theory (DFT) simulations have been performed for water under planetary conditions [3], experimental validation has been lacking. Accessing thermodynamic states along planetary isentropes in dynamic compression experiments is challenging because the principal Hugoniot follows a significantly different path in the phase diagram. In this talk, we present experimental data for dynamic compression of water up to 700 GPa, including in a regime of the phase diagram intersected by the Neptune isentrope and water-rich models for the exoplanet GJ436b. The data were obtained on the Z-accelerator at Sandia National Laboratories by performing magnetically accelerated flyer plate impact experiments measuring both the shock and re-shock in the sample. The high accuracy makes it possible for the data to be used for detailed model validation: the results validate first-principles-based thermodynamics as a reliable foundation for planetary modeling and confirm the fine effect of including nuclear quantum effects on the shock pressure. Sandia National Laboratories is a multiprogram laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's National Nuclear Security Administration under Contract No. DE-AC04-94AL85000. [1] M.D. Knudson, D.H. Dolan, and M.P. Desjarlais, Science 322, 1822 (2008). [2] S. Root, et al., Phys. Rev. Lett. 105, 085501 (2010). [3] M. French, et al., Phys. Rev. B 79, 054107 (2009).

  20. A rotation-translation invariant molecular descriptor of partial charges and its use in ligand-based virtual screening

    PubMed Central

    2014-01-01

Background Measures of similarity for chemical molecules have been developed since the dawn of chemoinformatics. Molecular similarity has been measured by a variety of methods including molecular descriptor based similarity, common molecular fragments, graph matching and 3D methods such as shape matching. Similarity measures are widespread in practice and have proven to be useful in drug discovery. Because of our interest in electrostatics and high throughput ligand-based virtual screening, we sought to exploit the information contained in the atomic coordinates and partial charges of a molecule. Results A new molecular descriptor based on partial charges is proposed. It uses the autocorrelation function and linear binning to encode all atoms of a molecule into two rotation-translation invariant vectors. Combined with a scoring function, the descriptor allows a database of compounds to be rank-ordered versus a query molecule. The proposed implementation is called ACPC (AutoCorrelation of Partial Charges) and is released in open source. Extensive retrospective ligand-based virtual screening experiments were performed and compared against other methods in order to validate the method and associated protocol. Conclusions While it is a simple method, it performed remarkably well in experiments. At an average speed of 1649 molecules per second, it reached an average median area under the curve of 0.81 on 40 different targets, validating the proposed protocol and implementation. PMID:24887178
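The descriptor idea (autocorrelation of partial charges with linear distance binning) can be sketched as follows: pairwise charge products are accumulated into distance bins, so the resulting vector depends only on interatomic distances and is therefore invariant to rotation and translation. The bin width and count below are assumed values, and this simplified sketch omits the actual details of ACPC:

```python
import math

def charge_autocorrelation(atoms, dx=1.0, nbins=10):
    """atoms: list of ((x, y, z), partial_charge) tuples.
    Accumulates q_i * q_j into bins indexed by interatomic distance."""
    vec = [0.0] * nbins
    for i in range(len(atoms)):
        for j in range(i + 1, len(atoms)):
            (pi, qi), (pj, qj) = atoms[i], atoms[j]
            k = int(math.dist(pi, pj) / dx)   # linear binning by distance
            if k < nbins:
                vec[k] += qi * qj
    return vec
```

Translating or rotating the molecule leaves all pairwise distances, and hence the vector, unchanged, which is exactly the rotation-translation invariance the abstract claims for the descriptor.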

  1. Use of a hardware token for Grid authentication by the MICE data distribution framework

    NASA Astrophysics Data System (ADS)

    Nebrensky, JJ; Martyniak, J.

    2017-10-01

The international Muon Ionization Cooling Experiment (MICE) is designed to demonstrate the principle of muon ionisation cooling for the first time. Data distribution and archiving, batch reprocessing, and simulation are all carried out using the EGI Grid infrastructure, in particular the facilities provided by GridPP in the UK. To prevent interference, especially accidental data deletion, these activities are separated by different VOMS roles. Data acquisition, in particular, can involve 24/7 operation for a number of weeks, so for moving the data out of the MICE Local Control Room at the experiment a valid, VOMS-enabled Grid proxy must be made available continuously over that time. The MICE "Data Mover" agent is now using a robot certificate stored on a hardware token (Feitian ePass2003), from which a cron job generates a "plain" proxy to which the VOMS authorisation extensions are added in a separate transaction. A valid short-lifetime proxy is thus continuously available to the Data Mover process. The Feitian ePass2003 was chosen because it was both significantly cheaper and easier to actually purchase than the token commonly referred to in the community at that time; however, there was no software support for the hardware. This paper describes the software packages, process, and commands used to deploy the token into production.

  2. Towards interoperable and reproducible QSAR analyses: Exchange of datasets.

    PubMed

    Spjuth, Ola; Willighagen, Egon L; Guha, Rajarshi; Eklund, Martin; Wikberg, Jarl Es

    2010-06-30

    QSAR is a widely used method to relate chemical structures to responses or properties based on experimental observations. Much effort has been made to evaluate and validate the statistical modeling in QSAR, but these analyses treat the dataset as fixed. An overlooked but highly important issue is the validation of the setup of the dataset, which comprises addition of chemical structures as well as selection of descriptors and software implementations prior to calculations. This process is hampered by the lack of standards and exchange formats in the field, making it virtually impossible to reproduce and validate analyses and drastically constrain collaborations and re-use of data. We present a step towards standardizing QSAR analyses by defining interoperable and reproducible QSAR datasets, consisting of an open XML format (QSAR-ML) which builds on an open and extensible descriptor ontology. The ontology provides an extensible way of uniquely defining descriptors for use in QSAR experiments, and the exchange format supports multiple versioned implementations of these descriptors. Hence, a dataset described by QSAR-ML makes its setup completely reproducible. We also provide a reference implementation as a set of plugins for Bioclipse which simplifies setup of QSAR datasets, and allows for exporting in QSAR-ML as well as old-fashioned CSV formats. The implementation facilitates addition of new descriptor implementations from locally installed software and remote Web services; the latter is demonstrated with REST and XMPP Web services. Standardized QSAR datasets open up new ways to store, query, and exchange data for subsequent analyses. QSAR-ML supports completely reproducible creation of datasets, solving the problems of defining which software components were used and their versions, and the descriptor ontology eliminates confusions regarding descriptors by defining them crisply. 
This makes it easy to join, extend, and combine datasets and hence to work collectively, but it also allows for analyzing the effect descriptors have on the statistical model's performance. The presented Bioclipse plugins equip scientists with graphical tools that make QSAR-ML easily accessible to the community.

  3. Towards interoperable and reproducible QSAR analyses: Exchange of datasets

    PubMed Central

    2010-01-01

    Background QSAR is a widely used method to relate chemical structures to responses or properties based on experimental observations. Much effort has been made to evaluate and validate the statistical modeling in QSAR, but these analyses treat the dataset as fixed. An overlooked but highly important issue is the validation of the setup of the dataset, which comprises addition of chemical structures as well as selection of descriptors and software implementations prior to calculations. This process is hampered by the lack of standards and exchange formats in the field, making it virtually impossible to reproduce and validate analyses and drastically constrain collaborations and re-use of data. Results We present a step towards standardizing QSAR analyses by defining interoperable and reproducible QSAR datasets, consisting of an open XML format (QSAR-ML) which builds on an open and extensible descriptor ontology. The ontology provides an extensible way of uniquely defining descriptors for use in QSAR experiments, and the exchange format supports multiple versioned implementations of these descriptors. Hence, a dataset described by QSAR-ML makes its setup completely reproducible. We also provide a reference implementation as a set of plugins for Bioclipse which simplifies setup of QSAR datasets, and allows for exporting in QSAR-ML as well as old-fashioned CSV formats. The implementation facilitates addition of new descriptor implementations from locally installed software and remote Web services; the latter is demonstrated with REST and XMPP Web services. Conclusions Standardized QSAR datasets open up new ways to store, query, and exchange data for subsequent analyses. QSAR-ML supports completely reproducible creation of datasets, solving the problems of defining which software components were used and their versions, and the descriptor ontology eliminates confusions regarding descriptors by defining them crisply. 
This makes it easy to join, extend, and combine datasets and hence to work collectively, but it also allows for analyzing the effect descriptors have on the statistical model's performance. The presented Bioclipse plugins equip scientists with graphical tools that make QSAR-ML easily accessible to the community. PMID:20591161

  4. "Lacking warmth": Alexithymia trait is related to warm-specific thermal somatosensory processing.

    PubMed

    Borhani, Khatereh; Làdavas, Elisabetta; Fotopoulou, Aikaterini; Haggard, Patrick

    2017-09-01

    Alexithymia is a personality trait involving deficits in emotional processing. The personality construct has been extensively validated, but the underlying neural and physiological systems remain controversial. One theory suggests that low-level somatosensory mechanisms act as somatic markers of emotion, underpinning cognitive and affective impairments in alexithymia. In two separate samples (total N=100), we used an established Quantitative Sensory Testing (QST) battery to probe multiple neurophysiological submodalities of somatosensation, and investigated their associations with the widely-used Toronto Alexithymia Scale (TAS-20). Experiment one found reduced sensitivity to warmth in people with higher alexithymia scores, compared to individuals with lower scores, without deficits in other somatosensory submodalities. Experiment two replicated this result in a new group of participants using a full-sample correlation between threshold for warm detection and TAS-20 scores. We discuss the relations between low-level thermoceptive function and cognitive processing of emotion. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.

  5. One-Shot Learning of Human Activity With an MAP Adapted GMM and Simplex-HMM.

    PubMed

    Rodriguez, Mario; Orrite, Carlos; Medrano, Carlos; Makris, Dimitrios

    2016-05-10

    This paper presents a novel activity class representation using a single sequence for training. The contribution of this representation lies in the ability to train a one-shot learning recognition system, useful in new scenarios where capturing and labeling sequences is expensive or impractical. The method uses a universal background model of local descriptors obtained from source databases available on-line and adapts it to a new sequence in the target scenario through maximum a posteriori adaptation. Each activity sample is encoded as a sequence of normalized bags of features and modeled by a new hidden Markov model formulation, in which the expectation-maximization algorithm for training is modified to deal with observations consisting of vectors on the unit simplex. Extensive recognition experiments have been performed using one-shot learning over the public datasets Weizmann, KTH, and IXMAS. These experiments demonstrate the discriminative properties of the representation and its validity for application in recognition systems, achieving state-of-the-art results.
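
    The "normalized bag of features" observation mentioned above is a histogram of quantized local descriptors rescaled to sum to one, so each observation lies on the unit simplex. A minimal sketch (nearest-centroid quantization against a toy codebook; all shapes and data are illustrative, not from the paper):

```python
import numpy as np

def bag_of_features(descriptors, codebook):
    """Encode a set of local descriptors as a normalized histogram.

    Each descriptor is assigned to its nearest codebook centre; the
    count vector is normalized to sum to 1, i.e. it lies on the unit
    simplex -- the observation type a simplex-HMM expects."""
    # Pairwise distances: (n_descriptors, n_words).
    d = np.linalg.norm(descriptors[:, None, :] - codebook[None, :, :], axis=2)
    words = d.argmin(axis=1)
    hist = np.bincount(words, minlength=len(codebook)).astype(float)
    return hist / hist.sum()

rng = np.random.default_rng(0)
codebook = rng.normal(size=(8, 16))   # 8 visual words, 16-D descriptors
frame = rng.normal(size=(50, 16))     # 50 local descriptors from one frame
obs = bag_of_features(frame, codebook)
```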

  6. Study of Tool Wear Mechanisms and Mathematical Modeling of Flank Wear During Machining of Ti Alloy (Ti6Al4V)

    NASA Astrophysics Data System (ADS)

    Chetan; Narasimhulu, A.; Ghosh, S.; Rao, P. V.

    2015-07-01

    Machinability of titanium is poor due to its low thermal conductivity and high chemical affinity. The alloy's low thermal conductivity concentrates heat at the cutting tool, causing extensive tool wear. The main task of this work is to identify the various wear mechanisms involved in machining of Ti alloy (Ti6Al4V) and to formulate an analytical mathematical model of tool wear. Experiments show that adhesive and diffusion wear are the dominant wear mechanisms during machining of the Ti alloy with a PVD-coated tungsten carbide tool, and that tool wear increases with increases in cutting parameters such as speed, feed and depth of cut. The wear model was validated by dry machining of the Ti alloy at suitable cutting conditions and was found to predict flank wear well under gentle cutting conditions.

  7. Proton affinities of maingroup-element hydrides and noble gases: trends across the periodic table, structural effects, and DFT validation.

    PubMed

    Swart, Marcel; Rösler, Ernst; Bickelhaupt, F Matthias

    2006-10-01

    We have carried out an extensive exploration of the gas-phase basicity of archetypal neutral bases across the periodic system using the generalized gradient approximation (GGA) of the density functional theory (DFT) at BP86/QZ4P//BP86/TZ2P. First, we validate DFT as a reliable tool for computing proton affinities and related thermochemical quantities: BP86/QZ4P//BP86/TZ2P is shown to yield a mean absolute deviation of 2.0 kcal/mol for the proton affinity at 298 K with respect to experiment, and 1.2 kcal/mol with high-level ab initio benchmark data. The main purpose of this work is to provide the proton affinities (and corresponding entropies) at 298 K of the neutral bases constituted by all maingroup-element hydrides of groups 15-17 and the noble gases, that is, group 18, and periods 1-6. We have also studied the effect of step-wise methylation of the protophilic center of the second- and third-period bases. Copyright 2006 Wiley Periodicals, Inc.

  8. Improved patch-based learning for image deblurring

    NASA Astrophysics Data System (ADS)

    Dong, Bo; Jiang, Zhiguo; Zhang, Haopeng

    2015-05-01

    Most recent image deblurring methods use only the valid information found in the input image as the clue for restoring the blurred region. Such methods usually suffer from insufficient prior information and relatively poor adaptiveness. The patch-based method uses not only the valid information of the input image itself but also prior information from sample images to improve adaptiveness. However, the cost function of this method is quite time-consuming to evaluate, and the method may also produce ringing artifacts. In this paper, we propose an improved non-blind deblurring algorithm based on learning patch likelihoods. On one hand, we consider the effect of Gaussian mixture model components with different weights and normalize the weight values, which optimizes the cost function and reduces running time. On the other hand, a post-processing method is proposed to remove the ringing artifacts produced by the traditional patch-based method. Extensive experiments are performed. Experimental results verify that our method effectively reduces execution time, suppresses ringing artifacts, and preserves the quality of the deblurred image.

  9. A design methodology for neutral buoyancy simulation of space operations

    NASA Technical Reports Server (NTRS)

    Akin, David L.

    1988-01-01

    Neutral buoyancy has often been used in the past for EVA development activities, but little has been done to provide an analytical understanding of the environment and its correlation with space. This paper covers a set of related research topics at the MIT Space Systems Laboratory, dealing with the modeling of the space and underwater environments, validation of the models through testing in neutral buoyancy, parabolic flight, and space flight experiments, and applications of the models to gain a better design methodology for creating meaningful neutral buoyancy simulations. Examples covered include simulation validation criteria for human body dynamics, and for applied torques in a beam rotation task, which is the pacing crew operation for EVA structural assembly. Extensions of the dynamics models are presented for powered vehicles in the underwater environment, and examples given from the MIT Space Telerobotics Research Program, including the Beam Assembly Teleoperator and the Multimode Proximity Operations Device. Future expansions of the modeling theory are also presented, leading to remote vehicles which behave in neutral buoyancy exactly as the modeled system would in space.

  10. Experimental validation of the Achromatic Telescopic Squeezing (ATS) scheme at the LHC

    NASA Astrophysics Data System (ADS)

    Fartoukh, S.; Bruce, R.; Carlier, F.; Coello De Portugal, J.; Garcia-Tabares, A.; Maclean, E.; Malina, L.; Mereghetti, A.; Mirarchi, D.; Persson, T.; Pojer, M.; Ponce, L.; Redaelli, S.; Salvachua, B.; Skowronski, P.; Solfaroli, M.; Tomas, R.; Valuch, D.; Wegscheider, A.; Wenninger, J.

    2017-07-01

    The Achromatic Telescopic Squeezing scheme offers new techniques to deliver unprecedentedly small beam spot size at the interaction points of the ATLAS and CMS experiments of the LHC, while perfectly controlling the chromatic properties of the corresponding optics (linear and non-linear chromaticities, off-momentum beta-beating, spurious dispersion induced by the crossing bumps). The first series of beam tests with ATS optics was achieved during the LHC Run I (2011/2012) for a first validation of the basics of the scheme at low intensity. In 2016, a new generation of higher-performance ATS optics was developed and more extensively tested in the machine, still with probe beams for optics measurement and correction at β* = 10 cm, but also with a few nominal bunches to establish first collisions at nominal β* (40 cm) and beyond (33 cm), and to analyse the robustness of these optics in terms of collimation and machine protection. The paper highlights the most relevant and conclusive results obtained during this second series of ATS tests.

  11. CFD validation experiments for hypersonic flows

    NASA Technical Reports Server (NTRS)

    Marvin, Joseph G.

    1992-01-01

    A roadmap for CFD code validation is introduced. The elements of the roadmap are consistent with air-breathing vehicle design requirements and related to the important flow path components: forebody, inlet, combustor, and nozzle. Building-block and benchmark validation experiments are identified along with their test conditions and measurements. Based on evaluation criteria, recommendations for an initial CFD validation database are given, and gaps are identified where future experiments could provide new validation data.

  12. Mathematical model formulation and validation of water and solute transport in whole hamster pancreatic islets.

    PubMed

    Benson, James D; Benson, Charles T; Critser, John K

    2014-08-01

    Optimization of cryopreservation protocols for cells and tissues requires accurate models of heat and mass transport. Model selection often depends on the configuration of the tissue. Here, a mathematical and conceptual model of water and solute transport for whole hamster pancreatic islets has been developed and experimentally validated incorporating fundamental biophysical data from previous studies on individual hamster islet cells while retaining whole-islet structural information. It describes coupled transport of water and solutes through the islet by three methods: intracellularly, intercellularly, and in combination. In particular we use domain decomposition techniques to couple a transmembrane flux model with an interstitial mass transfer model. The only significant undetermined variable is the cellular surface area which is in contact with the intercellularly transported solutes, Ais. The model was validated and Ais determined using a 3×3 factorial experimental design blocked for experimental day. Whole islet physical experiments were compared with model predictions at three temperatures, three perfusing solutions, and three islet size groups. A mean of 4.4 islets were compared at each of the 27 experimental conditions and found to correlate with a coefficient of determination of 0.87±0.06 (mean ± SD). Only the treatment variable of perfusing solution was found to be significant (p<0.05). We have devised a model that retains much of the intrinsic geometric configuration of the system, and thus fewer laboratory experiments are needed to determine model parameters and thus to develop new optimized cryopreservation protocols. Additionally, extensions to ovarian follicles and other concentric tissue structures may be made. Copyright © 2014 Elsevier Inc. All rights reserved.
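
    The coefficient of determination quoted above (0.87 ± 0.06) measures how well model predictions track the measured islet responses. The generic formula, not the authors' code, is:

```python
def r_squared(observed, predicted):
    """Coefficient of determination R^2 between measurements and model
    predictions: 1 - SS_res / SS_tot (generic formula, shown only to
    make the quoted statistic concrete)."""
    mean = sum(observed) / len(observed)
    ss_res = sum((o - p) ** 2 for o, p in zip(observed, predicted))
    ss_tot = sum((o - mean) ** 2 for o in observed)
    return 1.0 - ss_res / ss_tot

# Illustrative numbers only, not data from the islet experiments.
r2 = r_squared([1.0, 2.0, 3.0, 4.0], [1.1, 1.9, 3.2, 3.8])
```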

  13. Measuring activities and participation in persons with haemophilia: A systematic review of commonly used instruments.

    PubMed

    Timmer, M A; Gouw, S C; Feldman, B M; Zwagemaker, A; de Kleijn, P; Pisters, M F; Schutgens, R E G; Blanchette, V; Srivastava, A; David, J A; Fischer, K; van der Net, J

    2018-03-01

    Monitoring clinical outcome in persons with haemophilia (PWH) is essential in order to provide optimal treatment for individual patients and compare effectiveness of treatment strategies. Experience with measurement of activities and participation in haemophilia is limited and consensus on preferred tools is lacking. The aim of this study was to give a comprehensive overview of the measurement properties of a selection of commonly used tools developed to assess activities and participation in PWH. Electronic databases were searched for articles that reported on reliability, validity or responsiveness of predetermined measurement tools (5 self-reported and 4 performance-based measurement tools). Methodological quality of the studies was assessed according to the COSMIN checklist. Best evidence synthesis was used to summarize evidence on the measurement properties. The search resulted in 3453 unique hits. Forty-two articles were included. The self-reported Haemophilia Activity List (HAL), Pediatric HAL (PedHAL) and the performance-based Functional Independence Score in Haemophilia (FISH) were studied most extensively. Methodological quality of the studies was limited. Measurement error, cross-cultural validity and responsiveness have been insufficiently evaluated. Albeit based on limited evidence, the measurement properties of the PedHAL, HAL and FISH are currently considered most satisfactory. Further research needs to focus on measurement error, responsiveness, interpretability and cross-cultural validity of the self-reported tools, and on validity of performance-based tools that are able to assess limitations in sports and leisure activities. © 2018 The Authors. Haemophilia Published by John Wiley & Sons Ltd.

  14. Model extension, calibration and validation of partial nitritation-anammox process in moving bed biofilm reactor (MBBR) for reject and mainstream wastewater.

    PubMed

    Trojanowicz, K; Plaza, E; Trela, J

    2017-11-09

    In the paper, the extension of a mathematical model of the partial nitritation-anammox process in a moving bed biofilm reactor (MBBR) is presented. The model was calibrated with a set of kinetic, stoichiometric and biofilm parameters, whose values were taken from the literature and batch tests. The model was validated with data obtained from: laboratory batch experiments, a pilot-scale MBBR for reject water deammonification operated at the Himmerfjärden wastewater treatment plant, and a pilot-scale MBBR for mainstream wastewater deammonification at the Hammarby Sjöstadsverk research facility, Sweden. Simulations were conducted in AQUASIM software. The proposed, extended model proved to be useful for simulating the partial nitritation/anammox process in a biofilm reactor both for reject water and mainstream wastewater at variable substrate concentrations (influent total ammonium-nitrogen concentration of 530 ± 68; 45 ± 2.6 and 38 ± 3 g N/m³ - for reject water and two cases of mainstream wastewater treatment, respectively), temperature (24 ± 2.8; 15 ± 1.1 and 18 ± 0.5°C), pH (7.8 ± 0.2; 7.3 ± 0.1 and 7.4 ± 0.1) and aeration patterns (continuous aeration and intermittent aeration with variable dissolved oxygen concentrations and lengths of aerated and anoxic phases). The model can be utilized for optimizing and testing different operational strategies of the deammonification process in biofilm systems.

  15. Factors Influencing Career Experiences of Selected Chinese Faculty Employed at an American Research Extensive University

    ERIC Educational Resources Information Center

    Cooksey, Yan Zhang; Cole, Bryan R.

    2012-01-01

    Whereas research related to the experiences of faculty of color is increasing, little attention has been focused on Chinese faculty's career experiences in the US. This study examined the career experiences of 16 Chinese faculty members across different disciplines, ranks and genders at a research extensive university in Texas, US. The study used…

  16. Stratospheric Aerosol and Gas Experiment III on the International Space Station (SAGE III/ISS)

    NASA Technical Reports Server (NTRS)

    Gasbarre, Joseph; Walker, Richard; Cisewski, Michael; Zawodny, Joseph; Cheek, Dianne; Thornton, Brooke

    2015-01-01

    The Stratospheric Aerosol and Gas Experiment III on the International Space Station (SAGE III/ISS) mission will extend the SAGE data record from the ideal vantage point of the International Space Station (ISS). The ISS orbital inclination is ideal for SAGE measurements providing coverage between 70 deg north and 70 deg south latitude. The SAGE data record includes an extensively validated data set including aerosol optical depth data dating to the Stratospheric Aerosol Measurement (SAM) experiments in 1975 and 1978 and stratospheric ozone profile data dating to the Stratospheric Aerosol and Gas Experiment (SAGE) in 1979. These and subsequent data records, notably from the SAGE II experiment launched on the Earth Radiation Budget Satellite in 1984 and the SAGE III experiment launched on the Russian Meteor-3M satellite in 2001, have supported a robust, long-term assessment of key atmospheric constituents. These scientific measurements provide the basis for the analysis of five of the nine critical constituents (aerosols, ozone (O3), nitrogen dioxide (NO2), water vapor (H2O), and air density using O2) identified in the U.S. National Plan for Stratospheric Monitoring. SAGE III on ISS was originally scheduled to fly on the ISS in the same timeframe as the Meteor-3M mission, but was postponed due to delays in ISS construction. The project was re-established in 2009.

  17. COMBINING LIDAR ESTIMATES OF BIOMASS AND LANDSAT ESTIMATES OF STAND AGE FOR SPATIALLY EXTENSIVE VALIDATION OF MODELED FOREST PRODUCTIVITY. (R828309)

    EPA Science Inventory

    Extensive estimates of forest productivity are required to understand the
    relationships between shifting land use, changing climate and carbon storage
    and fluxes. Aboveground net primary production of wood (NPPAw) is a major component
    of total NPP and...

  18. An Exploration of Participative Motivations in a Community-Based Online English Extensive Reading Contest with Respect to Gender Difference

    ERIC Educational Resources Information Center

    Liu, I-Fan; Young, Shelley S. -C.

    2017-01-01

    The purpose of this study is to describe an online community-based English extensive reading contest to investigate whether the participants' intrinsic, extrinsic, and interpersonal motivations and learning results show significant gender differences. A total of 501 valid questionnaires (285 females and 216 males) from Taiwanese high school…

  19. Evaluating the Complementary Roles of an SJT and Academic Assessment for Entry into Clinical Practice

    ERIC Educational Resources Information Center

    Cousans, Fran; Patterson, Fiona; Edwards, Helena; Walker, Kim; McLachlan, John C.; Good, David

    2017-01-01

    Although there is extensive evidence confirming the predictive validity of situational judgement tests (SJTs) in medical education, there remains a shortage of evidence for their predictive validity for performance of postgraduate trainees in their first role in clinical practice. Moreover, to date few researchers have empirically examined the…

  20. Two-Method Planned Missing Designs for Longitudinal Research

    ERIC Educational Resources Information Center

    Garnier-Villarreal, Mauricio; Rhemtulla, Mijke; Little, Todd D.

    2014-01-01

    We examine longitudinal extensions of the two-method measurement design, which uses planned missingness to optimize cost-efficiency and validity of hard-to-measure constructs. These designs use a combination of two measures: a "gold standard" that is highly valid but expensive to administer, and an inexpensive (e.g., survey-based)…

  1. Perception of Competence in Middle School Physical Education: Instrument Development and Validation

    ERIC Educational Resources Information Center

    Scrabis-Fletcher, Kristin; Silverman, Stephen

    2010-01-01

    Perception of Competence (POC) has been studied extensively in physical activity (PA) research with similar instruments adapted for physical education (PE) research. Such instruments do not account for the unique PE learning environment. Therefore, an instrument was developed and the scores validated to measure POC in middle school PE. A…

  2. Mechanical testing of bones: the positive synergy of finite-element models and in vitro experiments.

    PubMed

    Cristofolini, Luca; Schileo, Enrico; Juszczyk, Mateusz; Taddei, Fulvia; Martelli, Saulo; Viceconti, Marco

    2010-06-13

    Bone biomechanics have been extensively investigated in the past both with in vitro experiments and numerical models. In most cases either approach is chosen, without exploiting synergies. Both experiments and numerical models suffer from limitations relative to their accuracy and their respective fields of application. In vitro experiments can improve numerical models by: (i) preliminarily identifying the most relevant failure scenarios; (ii) improving the model identification with experimentally measured material properties; (iii) improving the model identification with accurately measured actual boundary conditions; and (iv) providing quantitative validation based on mechanical properties (strain, displacements) directly measured from physical specimens being tested in parallel with the modelling activity. Likewise, numerical models can improve in vitro experiments by: (i) identifying the most relevant loading configurations among a number of motor tasks that cannot be replicated in vitro; (ii) identifying acceptable simplifications for the in vitro simulation; (iii) optimizing the use of transducers to minimize errors and provide measurements at the most relevant locations; and (iv) exploring a variety of different conditions (material properties, interface, etc.) that would require enormous experimental effort. By reporting an example of successful investigation of the femur, we show how a combination of numerical modelling and controlled experiments within the same research team can be designed to create a virtuous circle where models are used to improve experiments, experiments are used to improve models and their combination synergistically provides more detailed and more reliable results than can be achieved with either approach singularly.

  3. Simulation-based model checking approach to cell fate specification during Caenorhabditis elegans vulval development by hybrid functional Petri net with extension.

    PubMed

    Li, Chen; Nagasaki, Masao; Ueno, Kazuko; Miyano, Satoru

    2009-04-27

    Model checking approaches were applied to biological pathway validations around 2003. Recently, Fisher et al. have demonstrated the importance of the model checking approach by inferring new regulation of signaling crosstalk in C. elegans and confirming the regulation with biological experiments. They took a discrete and state-based approach to explore all possible states of the system underlying vulval precursor cell (VPC) fate specification for desired properties. However, since both discrete and continuous features appear to be an indispensable part of biological processes, it is more appropriate to use quantitative models to capture the dynamics of biological systems. Our key motivation in this paper is to establish a quantitative methodology for modeling and analyzing in silico models that incorporates the model checking approach. A novel method of modeling and simulating biological systems with a model checking approach is proposed, based on hybrid functional Petri net with extension (HFPNe) as a framework dealing with both discrete and continuous events. Firstly, we construct a quantitative VPC fate model with 1761 components by using HFPNe. Secondly, we apply two major biological fate determination rules - Rule I and Rule II - to the VPC fate model. We then conduct 10,000 simulations for each of 48 sets of different genotypes, investigate variations of cell fate patterns under each genotype, and validate the two rules by comparing three simulation targets consisting of fate patterns obtained from in silico and in vivo experiments. In particular, an evaluation was successfully done by using our VPC fate model to investigate one target derived from biological experiments involving hybrid lineage observations. Hybrid lineages are hard to capture with a discrete model because a hybrid lineage occurs when the system comes close to certain thresholds, as discussed by Sternberg and Horvitz in 1986.
Our simulation results suggest that Rule I, which cannot be tested with qualitative model checking, is more reasonable than Rule II owing to its high coverage of predicted fate patterns (except for the genotype of lin-15ko; lin-12ko double mutants). More insights are also suggested. The quantitative simulation-based model checking approach is a useful means of providing valuable biological insights and a better understanding of biological systems and observation data that may be hard to capture with the qualitative approach.
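
    The validation loop described above — run many stochastic simulations per genotype and ask what fraction of the experimentally observed fate patterns the simulations reproduce — can be sketched generically. The toy model and its probabilities below are purely illustrative stand-ins for the HFPNe simulator:

```python
import random

def coverage(simulate, n_runs, observed_patterns, seed=0):
    """Fraction of experimentally observed fate patterns that appear at
    least once in n_runs stochastic simulations. `simulate` is a
    stand-in for a quantitative model such as the HFPNe VPC model."""
    rng = random.Random(seed)
    seen = {simulate(rng) for _ in range(n_runs)}
    hits = sum(1 for p in observed_patterns if p in seen)
    return hits / len(observed_patterns)

def toy_model(rng):
    # Toy three-cell fate pattern: each cell adopts fate 1, 2, or 3
    # with fixed probabilities (illustrative numbers only).
    return tuple(rng.choices([1, 2, 3], weights=[0.5, 0.3, 0.2])[0]
                 for _ in range(3))

cov = coverage(toy_model, 10000, {(1, 2, 3), (3, 3, 3)})
```

    A high coverage for one rule and a low coverage for a competing rule is then evidence of the kind the abstract uses to favour Rule I over Rule II.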

  4. Development of the NIH PROMIS ® Sexual Function and Satisfaction measures in patients with cancer.

    PubMed

    Flynn, Kathryn E; Lin, Li; Cyranowski, Jill M; Reeve, Bryce B; Reese, Jennifer Barsky; Jeffery, Diana D; Smith, Ashley Wilder; Porter, Laura S; Dombeck, Carrie B; Bruner, Deborah Watkins; Keefe, Francis J; Weinfurt, Kevin P

    2013-02-01

    We describe the development and validation of the Patient-Reported Outcomes Measurement Information System® Sexual Function and Satisfaction (PROMIS® SexFS; National Institutes of Health) measures, version 1.0, for cancer populations. To develop a customizable self-report measure of sexual function and satisfaction as part of the U.S. National Institutes of Health PROMIS Network. Our multidisciplinary working group followed a comprehensive protocol for developing psychometrically robust patient-reported outcome measures including qualitative (scale development) and quantitative (psychometric evaluation) development. We performed an extensive literature review, conducted 16 focus groups with cancer patients and multiple discussions with clinicians, and evaluated candidate items in cognitive testing with patients. We administered items to 819 cancer patients. Items were calibrated using item-response theory and evaluated for reliability and validity. The PROMIS SexFS measures, version 1.0, include 81 items in 11 domains: Interest in Sexual Activity, Lubrication, Vaginal Discomfort, Erectile Function, Global Satisfaction with Sex Life, Orgasm, Anal Discomfort, Therapeutic Aids, Sexual Activities, Interfering Factors, and Screener Questions. In addition to content validity (patients indicate that items cover important aspects of their experiences) and face validity (patients indicate that items measure sexual function and satisfaction), the measure shows evidence for discriminant validity (domains discriminate between groups expected to be different) and convergent validity (strong correlations between scores on PROMIS and scores on conceptually similar older measures of sexual function), as well as favorable test-retest reliability among people not expected to change (intraclass correlations from two administrations of the instrument, 1 month apart).
The PROMIS SexFS offers researchers a reliable and valid set of tools to measure self-reported sexual function and satisfaction among diverse men and women. The measures are customizable; researchers can select the relevant domains and items comprising those domains for their study. © 2013 International Society for Sexual Medicine.

  5. Development of the NIH PROMIS® Sexual Function and Satisfaction Measures in Patients with Cancer

    PubMed Central

    Flynn, Kathryn E.; Lin, Li; Cyranowski, Jill M.; Reeve, Bryce B.; Reese, Jennifer Barsky; Jeffery, Diana D.; Smith, Ashley Wilder; Porter, Laura S.; Dombeck, Carrie B.; Bruner, Deborah Watkins; Keefe, Francis J.; Weinfurt, Kevin P.

    2013-01-01

    Introduction We describe the development and validation of the PROMIS Sexual Function and Satisfaction (PROMIS SexFS) measures version 1.0 for cancer populations. Aim To develop a customizable self-report measure of sexual function and satisfaction as part of the U.S. National Institutes of Health PROMIS® Network. Methods Our multidisciplinary working group followed a comprehensive protocol for developing psychometrically robust patient-reported outcome (PRO) measures including qualitative (scale development) and quantitative (psychometric evaluation) development. We performed an extensive literature review, conducted 16 focus groups with cancer patients and multiple discussions with clinicians, and evaluated candidate items in cognitive testing with patients. We administered items to 819 cancer patients. Items were calibrated using item response theory and evaluated for reliability and validity. Main Outcome Measures The PROMIS Sexual Function and Satisfaction (PROMIS SexFS) measures version 1.0 include 79 items in 11 domains: interest in sexual activity, lubrication, vaginal discomfort, erectile function, global satisfaction with sex life, orgasm, anal discomfort, therapeutic aids, sexual activities, interfering factors, and screener questions. Results In addition to content validity (patients indicate that items cover important aspects of their experiences) and face validity (patients indicate that items measure sexual function and satisfaction), the measure shows evidence for discriminant validity (domains discriminate between groups expected to be different), convergent validity (strong correlations between scores on PROMIS and scores on conceptually similar older measures of sexual function), as well as favorable test-retest reliability among people not expected to change (intraclass correlations from 2 administrations of the instrument, 1 month apart).
Conclusions The PROMIS SexFS offers researchers a reliable and valid set of tools to measure self-reported sexual function and satisfaction among diverse men and women. The measures are customizable; researchers can select the relevant domains and items comprising those domains for their study. PMID:23387911
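
    Both records above mention calibrating items with item response theory. As a minimal illustration of the IRT building block (PROMIS items are polytomous and calibrated with a graded-response model; the two-parameter logistic shown here is only the simplest case):

```python
import math

def irt_2pl(theta, a, b):
    """Two-parameter logistic IRT model: probability of endorsing an
    item given latent trait theta, item discrimination a, and item
    difficulty b. Shown only as the simplest IRT building block, not
    the graded-response model PROMIS actually uses."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

# At theta equal to the item difficulty, the endorsement probability is 0.5.
p = irt_2pl(theta=0.0, a=1.5, b=0.0)
```

    Calibration estimates a and b for every item from the 819 patients' responses; scores are then reported on the latent theta scale rather than as raw sums.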

  6. On the modelling of scalar and mass transport in combustor flows

    NASA Technical Reports Server (NTRS)

    Nikjooy, M.; So, R. M. C.

    1989-01-01

    Results are presented of a numerical study of swirling and nonswirling combustor flows with and without density variations. Constant-density arguments are used to justify closure assumptions invoked for the transport equations for turbulent momentum and scalar fluxes, which are written in terms of density-weighted variables. Comparisons are carried out with measurements obtained from three different axisymmetric model combustor experiments covering recirculating flow, swirling flow, and variable-density swirling flow inside the model combustors. Results show that the Reynolds stress/flux models do a credible job of predicting constant-density swirling and nonswirling combustor flows with passive scalar transport. However, their improvements over algebraic stress/flux models are marginal. The extension of the constant-density models to variable-density flow calculations shows that the models are equally valid for such flows.

  7. R. A. Fisher and his advocacy of randomization.

    PubMed

    Hall, Nancy S

    2007-01-01

    The requirement of randomization in experimental design was first stated by R. A. Fisher, statistician and geneticist, in 1925 in his book Statistical Methods for Research Workers. Earlier designs were systematic and involved the judgment of the experimenter; this led to possible bias and inaccurate interpretation of the data. Fisher's dictum was that randomization eliminates bias and permits a valid test of significance. Randomization in experimenting had been used by Charles Sanders Peirce in 1885 but the practice was not continued. Fisher developed his concepts of randomizing as he considered the mathematics of small samples, in discussions with "Student," William Sealy Gosset. Fisher published extensively. His principles of experimental design were spread worldwide by the many "voluntary workers" who came from other institutions to Rothamsted Agricultural Station in England to learn Fisher's methods.

  8. Mobile robotic sensors for perimeter detection and tracking.

    PubMed

    Clark, Justin; Fierro, Rafael

    2007-02-01

    Mobile robot/sensor networks have emerged as tools for environmental monitoring, search and rescue, exploration and mapping, evaluation of civil infrastructure, and military operations. These networks consist of many sensors each equipped with embedded processors, wireless communication, and motion capabilities. This paper describes a cooperative mobile robot network capable of detecting and tracking a perimeter defined by a certain substance (e.g., a chemical spill) in the environment. Specifically, the contributions of this paper are twofold: (i) a library of simple reactive motion control algorithms and (ii) a coordination mechanism for effectively carrying out perimeter-sensing missions. The decentralized nature of the methodology implemented could potentially allow the network to scale to many sensors and to reconfigure when adding/deleting sensors. Extensive simulation results and experiments verify the validity of the proposed cooperative control scheme.
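
    One member of such a library of simple reactive motion control rules can be sketched as a bang-bang heading correction: steer toward the substance when the sensed concentration is below the perimeter threshold and away when it is above, so the robot slides along the boundary. This is a hypothetical illustration, not the paper's controller:

```python
def reactive_heading(concentration, threshold, tangent_heading, gain=0.5):
    """One reactive rule from a hypothetical rule library: bias the
    robot's heading (radians) inward when the reading is below the
    perimeter threshold and outward when above, so that it tracks the
    boundary. Positive offset means turning toward the substance."""
    error = threshold - concentration
    # Clamp the correction so a single noisy reading cannot spin the robot.
    correction = max(-1.0, min(1.0, gain * error))
    return tangent_heading + correction

h_inside = reactive_heading(concentration=0.9, threshold=0.5, tangent_heading=0.0)
h_outside = reactive_heading(concentration=0.1, threshold=0.5, tangent_heading=0.0)
```

    Because each robot needs only its own reading and heading, a rule of this kind is decentralized, which is what lets the network scale and reconfigure when sensors are added or removed.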

  9. Model-Based Reinforcement of Kinect Depth Data for Human Motion Capture Applications

    PubMed Central

    Calderita, Luis Vicente; Bandera, Juan Pedro; Bustos, Pablo; Skiadopoulos, Andreas

    2013-01-01

    Motion capture systems have recently experienced a strong evolution. New cheap depth sensors and open source frameworks, such as OpenNI, allow for perceiving human motion on-line without using invasive systems. However, these proposals do not evaluate the validity of the obtained poses. This paper addresses this issue using a model-based pose generator to complement the OpenNI human tracker. The proposed system enforces kinematics constraints, eliminates odd poses and filters sensor noise, while learning the real dimensions of the performer's body. The system is composed of a PrimeSense sensor, an OpenNI tracker and a kinematics-based filter and has been extensively tested. Experiments show that the proposed system improves pure OpenNI results at a very low computational cost. PMID:23845933

  10. A Study of Cloud Radiative Forcing and Feedback

    NASA Technical Reports Server (NTRS)

    Ramanathan, Veerabhadran

    2000-01-01

    The main objective of the grant proposal was to participate in the CERES (Cloud and Earth's Radiant Energy System) satellite experiment and perform interdisciplinary investigation of NASA's Earth Observing System (EOS). During the grant period, massive amounts of scientific data from diverse platforms were accessed, processed and archived for continuing use; several software packages were developed for integrating different data streams for scientific evaluation; and the extensive validation studies planned were completed, culminating in the development of important algorithms that are presently used in the operational production of data from CERES. Contributions to the interdisciplinary science investigations have been significantly greater than originally envisioned. The results of these studies have appeared in several refereed journals and conference proceedings and are listed at the end of this report.

  11. Spatial-time-state fusion algorithm for defect detection through eddy current pulsed thermography

    NASA Astrophysics Data System (ADS)

    Xiao, Xiang; Gao, Bin; Woo, Wai Lok; Tian, Gui Yun; Xiao, Xiao Ting

    2018-05-01

    Eddy Current Pulsed Thermography (ECPT) has received extensive attention due to its high sensitivity in detecting surface and subsurface cracks. However, identifying defects without any prior knowledge remains a difficult challenge for unsupervised detection. This paper presents a spatial-time-state feature-fusion algorithm to obtain a full profile of the defects by directional scanning. The proposed method conducts feature extraction using independent component analysis (ICA) and automatic feature selection embedding a genetic algorithm. Finally, the optimal features of each step are fused to reconstruct the defects by applying the common orthogonal basis extraction (COBE) method. Experiments have been conducted to validate the study and verify the efficacy of the proposed method for blind defect detection.

  12. NASA/RAE collaboration on nonlinear control using the F-8C digital fly-by-wire aircraft

    NASA Technical Reports Server (NTRS)

    Butler, G. F.; Corbin, M. J.; Mepham, S.; Stewart, J. F.; Larson, R. R.

    1983-01-01

    Design procedures are reviewed for variable integral control to optimize response (VICTOR) algorithms and results of preliminary flight tests are presented. The F-8C aircraft is operated in the remotely augmented vehicle (RAV) mode, with the control laws implemented as FORTRAN programs on a ground-based computer. Pilot commands and sensor information are telemetered to the ground, where the data are processed to form surface commands which are then telemetered back to the aircraft. The RAV mode represents a single-string (simplex) system and is therefore vulnerable to a hardover, since comparison monitoring is not possible. Hence, extensive error checking is conducted on both the ground and airborne computers to prevent the development of potentially hazardous situations. Experience with the RAV monitoring and validation procedures is described.

  13. Context-dependent logo matching and recognition.

    PubMed

    Sahbi, Hichem; Ballan, Lamberto; Serra, Giuseppe; Del Bimbo, Alberto

    2013-03-01

    We contribute, through this paper, to the design of a novel variational framework able to match and recognize multiple instances of multiple reference logos in image archives. Reference logos and test images are seen as constellations of local features (interest points, regions, etc.) and matched by minimizing an energy function mixing: 1) a fidelity term that measures the quality of feature matching, 2) a neighborhood criterion that captures feature co-occurrence/geometry, and 3) a regularization term that controls the smoothness of the matching solution. We also introduce a detection/recognition procedure and study its theoretical consistency. Finally, we show the validity of our method through extensive experiments on the challenging MICC-Logos dataset. Our method outperforms baseline as well as state-of-the-art matching/recognition procedures by 20%.

  14. Combining LIDAR estimates of aboveground biomass and Landsat estimates of stand age for spatially extensive validation of modeled forest productivity.

    Treesearch

    M.A. Lefsky; D.P. Turner; M. Guzy; W.B. Cohen

    2005-01-01

    Extensive estimates of forest productivity are required to understand the relationships between shifting land use, changing climate and carbon storage and fluxes. Aboveground net primary production of wood (NPPAw) is a major component of total NPP and of net ecosystem production (NEP). Remote sensing of NPP and NPPAw is...

  15. Effect of level difference between left and right vocal folds on phonation: Physical experiment and theoretical study.

    PubMed

    Tokuda, Isao T; Shimamura, Ryo

    2017-08-01

    As an alternative factor to produce asymmetry between left and right vocal folds, the present study focuses on level difference, which is defined as the distance between the upper surfaces of the bilateral vocal folds in the inferior-superior direction. Physical models of the vocal folds were utilized to study the effect of the level difference on the phonation threshold pressure. A vocal tract model was also attached to the vocal fold model. For two types of different models, experiments revealed that the phonation threshold pressure tended to increase as the level difference was extended. Based upon a small amplitude approximation of the vocal fold oscillations, a theoretical formula was derived for the phonation threshold pressure. This theory agrees with the experiments, especially when the phase difference between the left and right vocal folds is not extensive. Furthermore, an asymmetric two-mass model was simulated with a level difference to validate the experiments as well as the theory. The primary conclusion is that the level difference has a potential effect on voice production especially for patients with an extended level of vertical difference in the vocal folds, which might be taken into account for the diagnosis of voice disorders.

  16. Ruggedness testing and validation of a practical analytical method for > 100 veterinary drug residues in bovine muscle by ultrahigh performance liquid chromatography – tandem mass spectrometry

    USDA-ARS?s Scientific Manuscript database

    In this study, optimization, extension, and validation of a streamlined, qualitative and quantitative multiclass, multiresidue method was conducted to monitor greater than 100 veterinary drug residues in meat using ultrahigh-performance liquid chromatography – tandem mass spectrometry (UHPLC-MS/MS). I...

  17. Status and plans for the ANOPP/HSR prediction system

    NASA Technical Reports Server (NTRS)

    Nolan, Sandra K.

    1992-01-01

    ANOPP is a comprehensive prediction system that was developed and validated by NASA. Because ANOPP is a system prediction program, it allows aerospace industry researchers to conduct trade-off studies on a variety of aircraft noise problems. The extensive validation of ANOPP allows the program results to be used as a benchmark for testing other prediction codes.

  18. Measuring the Effect of Tourism Services on Travelers' Quality of Life: Further Validation.

    ERIC Educational Resources Information Center

    Neal, Janet D.; Sirgy, M. Joseph; Uysal, Muzaffer

    2004-01-01

    This replication and extension study provided additional validational support of the original tourism services satisfaction measure in relation to QOL-related measures. Neal, Sirgy and Uysal (1999) developed a model and a measure to capture the effect of tourism services on travelers' quality of life (QOL). They hypothesized that travelers' overall life…

  19. Considering the Needs of English Language Learner Populations: An Examination of the Population Validity of Reading Intervention Research

    ERIC Educational Resources Information Center

    Moore, Brooke A.; Klingner, Janette K.

    2014-01-01

    This article synthesizes reading intervention research studies intended for use with struggling or at-risk students to determine which studies adequately address population validity, particularly in regard to the diverse reading needs of English language learners. An extensive search of the professional literature between 2001 and 2010 yielded a…

  20. Parent Reports of Young Spanish-English Bilingual Children's Productive Vocabulary: A Development and Validation Study

    ERIC Educational Resources Information Center

    Mancilla-Martinez, Jeannette; Gámez, Perla B.; Vagh, Shaher Banu; Lesaux, Nonie K.

    2016-01-01

    Purpose: This 2-phase study aims to extend research on parent report measures of children's productive vocabulary by investigating the development (n = 38) of the Spanish Vocabulary Extension and validity (n = 194) of the 100-item Spanish and English MacArthur-Bates Communicative Development Inventories Toddler Short Forms and Upward Extension…

  1. Preparation of the implementation plan of AASHTO Mechanistic-Empirical Pavement Design Guide (M-EPDG) in Connecticut : Phase II : expanded sensitivity analysis and validation with pavement management data.

    DOT National Transportation Integrated Search

    2017-02-08

    The study re-evaluates distress prediction models using the Mechanistic-Empirical Pavement Design Guide (MEPDG) and expands the sensitivity analysis to a wide range of pavement structures and soils. In addition, an extensive validation analysis of th...

  2. Commissioning and initial experience with the ALICE on-line

    NASA Astrophysics Data System (ADS)

    Altini, V.; Anticic, T.; Carena, F.; Carena, W.; Chapeland, S.; Chibante Barroso, V.; Costa, F.; Dénes, E.; Divià, R.; Fuchs, U.; Kiss, T.; Makhlyueva, I.; Roukoutakis, F.; Schossmaier, K.; Soós, C.; Vande Vyvre, P.; von Haller, B.; ALICE Collaboration

    2010-04-01

    ALICE (A Large Ion Collider Experiment) is the heavy-ion detector designed to study the physics of strongly interacting matter and the quark-gluon plasma at the CERN Large Hadron Collider (LHC). A large-bandwidth, flexible Data Acquisition System (DAQ) has been designed and deployed to collect sufficient statistics in the short running time available per year for heavy ions and to accommodate very different requirements originating from the 18 sub-detectors. This paper will present the large scale tests conducted to assess the standalone DAQ performance, the interfaces with the other online systems and the extensive commissioning performed in order to be fully prepared for physics data taking. It will review the experience accumulated since May 2007 during the standalone commissioning of the main detectors and the global cosmic runs and the lessons learned from this exposure on the "battle field". It will also discuss the test protocol followed to integrate and validate each sub-detector with the online systems and it will conclude with the first results of the LHC injection tests and startup in September 2008. Several papers of the same conference present in more detail some elements of the ALICE DAQ system.

  3. Investigation of transient melting of tungsten by ELMs in ASDEX Upgrade

    NASA Astrophysics Data System (ADS)

    Krieger, K.; Sieglin, B.; Balden, M.; Coenen, J. W.; Göths, B.; Laggner, F.; de Marne, P.; Matthews, G. F.; Nille, D.; Rohde, V.; Dejarnac, R.; Faitsch, M.; Giannone, L.; Herrmann, A.; Horacek, J.; Komm, M.; Pitts, R. A.; Ratynskaia, S.; Thoren, E.; Tolias, P.; ASDEX-Upgrade Team; EUROfusion MST1 Team

    2017-12-01

    Repetitive melting of tungsten by power transients originating from edge localized modes (ELMs) has been studied in the tokamak experiment ASDEX Upgrade. Tungsten samples were exposed to H-mode discharges at the outer divertor target plate using the Divertor Manipulator II system. The exposed sample was designed with an elevated sloped surface inclined against the incident magnetic field to increase the projected parallel power flux to a level where transient melting by ELMs would occur. Sample exposure was controlled by moving the outer strike point to the sample location. As an extension to previous melt studies, in the new experiment both the current flow from the sample to vessel potential and the local surface temperature were measured with sufficient time resolution to resolve individual ELMs. The experiment provided for the first time a direct link between current flow and surface temperature during transient ELM events. This makes it possible to further constrain the MEMOS melt motion code predictions and to improve the validation of its underlying model assumptions. Post-exposure ex situ analysis of the retrieved samples confirms the decreased melt motion observed at shallower magnetic field line to surface angles compared to that at leading edges exposed to the parallel power flux.

  4. 7 CFR 3430.33 - Selection of reviewers.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... Agriculture Regulations of the Department of Agriculture (Continued) COOPERATIVE STATE RESEARCH, EDUCATION... and experience in relevant scientific, extension, or education fields taking into account the following factors: (1) Level of relevant formal scientific, technical education, and extension experience of...

  5. Towards shared patient records: an architecture for using routine data for nationwide research.

    PubMed

    Knaup, Petra; Garde, Sebastian; Merzweiler, Angela; Graf, Norbert; Schilling, Freimut; Weber, Ralf; Haux, Reinhold

    2006-01-01

    Ubiquitous information is currently one of the most challenging slogans in medical informatics research. An adequate architecture for shared electronic patient records is needed which can use data for multiple purposes and which is extensible for new research questions. We introduce eardap as an architecture for using routine data for nationwide clinical research in a multihospital environment. eardap can be characterized as terminology-based. The main advantage of our approach is its extensibility to new items and new research questions. Once the definition of items for a research question is finished, a consistent, corresponding database can be created without any informatics skills. Our experiences in pediatric oncology in Germany have shown the applicability of eardap. The functions of our core system were in routine clinical use in several hospitals. We validated the terminology management system (TMS) and the module generation tool with the basic data set of pediatric oncology. Multiple usability depends mainly on the quality of item planning in the TMS; high-quality harmonization will lead to a greater amount of multiply used data. When using eardap, special emphasis should be placed on interfaces to local hospital information systems and on data security issues.

  6. Decoding Dynamic Brain Patterns from Evoked Responses: A Tutorial on Multivariate Pattern Analysis Applied to Time Series Neuroimaging Data.

    PubMed

    Grootswagers, Tijl; Wardle, Susan G; Carlson, Thomas A

    2017-04-01

    Multivariate pattern analysis (MVPA) or brain decoding methods have become standard practice in analyzing fMRI data. Although decoding methods have been extensively applied in brain-computer interfaces, these methods have only recently been applied to time series neuroimaging data such as MEG and EEG to address experimental questions in cognitive neuroscience. In a tutorial style review, we describe a broad set of options to inform future time series decoding studies from a cognitive neuroscience perspective. Using example MEG data, we illustrate the effects that different options in the decoding analysis pipeline can have on experimental results where the aim is to "decode" different perceptual stimuli or cognitive states over time from dynamic brain activation patterns. We show that decisions made at both preprocessing (e.g., dimensionality reduction, subsampling, trial averaging) and decoding (e.g., classifier selection, cross-validation design) stages of the analysis can significantly affect the results. In addition to standard decoding, we describe extensions to MVPA for time-varying neuroimaging data including representational similarity analysis, temporal generalization, and the interpretation of classifier weight maps. Finally, we outline important caveats in the design and interpretation of time series decoding experiments.
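    The core of such a time-resolved decoding pipeline is compact enough to sketch. The following is a minimal illustration on synthetic data, not the tutorial's own code: the nearest-centroid classifier, data dimensions, effect size, and fold scheme are all assumptions chosen for brevity.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Synthetic "MEG" data: trials x channels x time, two stimulus classes.
    # A class-dependent evoked difference switches on halfway through the epoch.
    n_trials, n_channels, n_time = 80, 8, 100
    labels = np.repeat([0, 1], n_trials // 2)
    data = rng.normal(0.0, 1.0, size=(n_trials, n_channels, n_time))
    data[labels == 1, :, 50:] += 1.5  # hypothetical effect size

    def decode_timecourse(data, labels, n_folds=5):
        """Decode the stimulus class at every time point with a
        nearest-centroid classifier and k-fold cross-validation."""
        n_trials, _, n_time = data.shape
        order = rng.permutation(n_trials)          # shuffled trial order
        folds = np.array_split(order, n_folds)     # held-out trial sets
        acc = np.zeros(n_time)
        for t in range(n_time):
            x = data[:, :, t]
            correct = 0
            for fold in folds:
                train = np.setdiff1d(order, fold)
                c0 = x[train][labels[train] == 0].mean(axis=0)
                c1 = x[train][labels[train] == 1].mean(axis=0)
                d0 = np.linalg.norm(x[fold] - c0, axis=1)
                d1 = np.linalg.norm(x[fold] - c1, axis=1)
                correct += np.sum((d1 < d0) == (labels[fold] == 1))
            acc[t] = correct / n_trials
        return acc

    acc = decode_timecourse(data, labels)
    ```

    Plotting acc against time yields the familiar decoding time course: chance-level accuracy before the simulated evoked difference switches on, and well above chance afterwards. Swapping in a regularized linear classifier or a different cross-validation design, two of the choices the review examines, changes only a few lines.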

  7. Cell-Surface Bound Nonreceptors and Signaling Morphogen Gradients

    PubMed Central

    Wan, Frederic Y.M.

    2013-01-01

    The patterning of many developing tissues is orchestrated by gradients of signaling morphogens. Included among the molecular events that drive the formation of morphogen gradients are a variety of elaborate regulatory interactions. Such interactions are thought to make gradients robust, i.e. insensitive to change in the face of genetic or environmental perturbations. But just how this is accomplished is a major unanswered question. Recently, extensive numerical simulations have suggested that robustness of signaling gradients can be achieved through morphogen degradation mediated by cell-surface bound non-signaling receptor molecules (or nonreceptors for short) such as heparan sulfate proteoglycans (HSPG). The present paper provides a mathematical validation of the results from the aforementioned numerical experiments. Extension of a basic extracellular model to include reversible binding with nonreceptors synthesized at a prescribed rate and mediated morphogen degradation shows that the signaling gradient diminishes with increasing concentration of cell-surface nonreceptors. Perturbation and asymptotic solutions obtained for i) low (receptor and nonreceptor) occupancy, and ii) high nonreceptor concentration permit more explicit delineation of the effects of nonreceptors on signaling gradients and facilitate the identification of scenarios in which the presence of nonreceptors may or may not be effective in promoting robustness. PMID:25232201

  8. Nonlinear stability of oscillatory core-annular flow: A generalized Kuramoto-Sivashinsky equation with time periodic coefficients

    NASA Technical Reports Server (NTRS)

    Coward, Adrian V.; Papageorgiou, Demetrios T.; Smyrlis, Yiorgos S.

    1994-01-01

    In this paper the nonlinear stability of two-phase core-annular flow in a pipe is examined when the acting pressure gradient is modulated by time harmonic oscillations and viscosity stratification and interfacial tension is present. An exact solution of the Navier-Stokes equations is used as the background state to develop an asymptotic theory valid for thin annular layers, which leads to a novel nonlinear evolution describing the spatio-temporal evolution of the interface. The evolution equation is an extension of the equation found for constant pressure gradients and generalizes the Kuramoto-Sivashinsky equation with dispersive effects found by Papageorgiou, Maldarelli & Rumschitzki, Phys. Fluids A 2(3), 1990, pp. 340-352, to a similar system with time periodic coefficients. The distinct regimes of slow and moderate flow are considered and the corresponding evolution is derived. Certain solutions are described analytically in the neighborhood of the first bifurcation point by use of multiple scales asymptotics. Extensive numerical experiments, using dynamical systems ideas, are carried out in order to evaluate the effect of the oscillatory pressure gradient on the solutions in the presence of a constant pressure gradient.

  9. Ocean acidification affects coral growth by reducing skeletal density.

    PubMed

    Mollica, Nathaniel R; Guo, Weifu; Cohen, Anne L; Huang, Kuo-Fang; Foster, Gavin L; Donald, Hannah K; Solow, Andrew R

    2018-02-20

    Ocean acidification (OA) is considered an important threat to coral reef ecosystems, because it reduces the availability of carbonate ions that reef-building corals need to produce their skeletons. However, while theory predicts that coral calcification rates decline as carbonate ion concentrations decrease, this prediction is not consistently borne out in laboratory manipulation experiments or in studies of corals inhabiting naturally low-pH reefs today. The skeletal growth of corals consists of two distinct processes: extension (upward growth) and densification (lateral thickening). Here, we show that skeletal density is directly sensitive to changes in seawater carbonate ion concentration and thus, to OA, whereas extension is not. We present a numerical model of Porites skeletal growth that links skeletal density with the external seawater environment via its influence on the chemistry of coral calcifying fluid. We validate the model using existing coral skeletal datasets from six Porites species collected across five reef sites and use this framework to project the impact of 21st century OA on Porites skeletal density across the global tropics. Our model predicts that OA alone will drive up to 20.3 ± 5.4% decline in the skeletal density of reef-building Porites corals.

  10. Product Distribution Theory for Control of Multi-Agent Systems

    NASA Technical Reports Server (NTRS)

    Lee, Chia Fan; Wolpert, David H.

    2004-01-01

    Product Distribution (PD) theory is a new framework for controlling Multi-Agent Systems (MAS's). First we review one motivation of PD theory, as the information-theoretic extension of conventional full-rationality game theory to the case of bounded rational agents. In this extension the equilibrium of the game is the optimizer of a Lagrangian of the probability distribution of the joint state of the agents. Accordingly we can consider a team game in which the shared utility is a performance measure of the behavior of the MAS. For such a scenario the game is at equilibrium - the Lagrangian is optimized - when the joint distribution of the agents optimizes the system's expected performance. One common way to find that equilibrium is to have each agent run a reinforcement learning algorithm. Here we investigate the alternative of exploiting PD theory to run gradient descent on the Lagrangian. We present computer experiments validating some of the predictions of PD theory for how best to do that gradient descent. We also demonstrate how PD theory can improve performance even when we are not allowed to rerun the MAS from different initial conditions, a requirement implicit in some previous work.

  11. Two-Stage Categorization in Brand Extension Evaluation: Electrophysiological Time Course Evidence

    PubMed Central

    Wang, Xiaoyi

    2014-01-01

    A brand name can be considered a mental category. Similarity-based categorization theory has been used to explain how consumers judge a new product as a member of a known brand, a process called brand extension evaluation. This event-related potential study was conducted in two experiments. The study found a two-stage categorization process reflected by the P2 and N400 components in brand extension evaluation. In experiment 1, a prime–probe paradigm was presented in a pair consisting of a brand name and a product name in three conditions, i.e., in-category extension, similar-category extension, and out-of-category extension. Although the task was unrelated to brand extension evaluation, P2 distinguished out-of-category extensions from similar-category and in-category ones, and N400 distinguished similar-category extensions from in-category ones. In experiment 2, a prime–probe paradigm with a related task was used, in which product names included subcategory and major-category product names. The N400 elicited by subcategory products was significantly more negative than that elicited by major-category products, with no salient difference in P2. We speculated that P2 could reflect the early low-level and similarity-based processing in the first stage, whereas N400 could reflect the late analytic and category-based processing in the second stage. PMID:25438152

  12. Strong monogamy inequalities for four qubits

    NASA Astrophysics Data System (ADS)

    Regula, Bartosz; Osterloh, Andreas; Adesso, Gerardo

    2016-05-01

    We investigate possible generalizations of the Coffman-Kundu-Wootters monogamy inequality to four qubits, accounting for multipartite entanglement in addition to the bipartite terms. We show that the most natural extension of the inequality does not hold in general, and we describe the violations of this inequality in detail. We investigate alternative ways to extend the monogamy inequality to express a constraint on entanglement sharing valid for all four-qubit states, and perform an extensive numerical analysis of randomly generated four-qubit states to explore the properties of such extensions.
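    The baseline being generalized here is the CKW-type monogamy relation for the one-tangle, τ_1|234 ≥ C²_12 + C²_13 + C²_14, proven for any number of qubits by Osborne and Verstraete. A numpy sketch of the kind of random-state check described above (the sample count and seed are arbitrary choices made for this illustration):

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # sigma_y (x) sigma_y, used in Wootters' spin-flip operation.
    SY = np.array([[0.0, -1.0j], [1.0j, 0.0]])
    YY = np.kron(SY, SY)

    def concurrence(rho):
        """Wootters concurrence of a two-qubit density matrix."""
        rho_tilde = YY @ rho.conj() @ YY
        lam = np.sqrt(np.clip(np.linalg.eigvals(rho @ rho_tilde).real, 0.0, None))
        lam = np.sort(lam)[::-1]
        return max(0.0, lam[0] - lam[1] - lam[2] - lam[3])

    def random_four_qubit_pure():
        v = rng.normal(size=16) + 1.0j * rng.normal(size=16)
        return (v / np.linalg.norm(v)).reshape(2, 2, 2, 2)

    gaps = []
    for _ in range(20):
        psi = random_four_qubit_pure()
        # One-qubit reduced state; for a pure state the one-tangle is
        # C^2_{1|234} = 2 (1 - tr rho_1^2).
        rho1 = np.einsum('abcd,ebcd->ae', psi, psi.conj())
        tau1 = 2.0 * (1.0 - np.trace(rho1 @ rho1).real)
        # Two-qubit reduced states of qubit 1 with each partner.
        rho12 = np.einsum('abcd,efcd->abef', psi, psi.conj()).reshape(4, 4)
        rho13 = np.einsum('abcd,ebgd->aceg', psi, psi.conj()).reshape(4, 4)
        rho14 = np.einsum('abcd,ebch->adeh', psi, psi.conj()).reshape(4, 4)
        c2_sum = sum(concurrence(r) ** 2 for r in (rho12, rho13, rho14))
        gaps.append(tau1 - c2_sum)  # CKW predicts every gap is nonnegative
    ```

    The extensions studied in the paper add multipartite entanglement terms to the right-hand side; this sketch checks only the bipartite baseline, for which no violations should ever appear.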

  13. Comparison of Nonlinear Random Response Using Equivalent Linearization and Numerical Simulation

    NASA Technical Reports Server (NTRS)

    Rizzi, Stephen A.; Muravyov, Alexander A.

    2000-01-01

    A recently developed finite-element-based equivalent linearization approach for the analysis of random vibrations of geometrically nonlinear multiple degree-of-freedom structures is validated. The validation is based on comparisons with results from a finite element based numerical simulation analysis using a numerical integration technique in physical coordinates. In particular, results for the case of a clamped-clamped beam are considered for an extensive load range to establish the limits of validity of the equivalent linearization approach.
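    The equivalent-linearization idea validated above can be illustrated in its textbook single-degree-of-freedom form. The sketch below is not the paper's finite-element formulation for multiple degree-of-freedom structures: it linearizes a Duffing oscillator under Gaussian white noise, with all parameter values invented for illustration.

    ```python
    import math

    def equivalent_linearization_duffing(omega=1.0, zeta=0.05, eps=0.5, S0=0.01,
                                         tol=1e-12, max_iter=200):
        """Statistical (equivalent) linearization of the Duffing oscillator
        x'' + 2*zeta*omega*x' + omega^2*(x + eps*x^3) = w(t),
        with w(t) white noise of two-sided spectral density S0.

        The cubic term is replaced by an equivalent stiffness under the
        Gaussian closure E[x^4] = 3 E[x^2]^2, then iterated to a fixed point.
        Returns (k_eq, mean-square response of the linearized system).
        """
        c = 2.0 * zeta * omega
        k_eq = omega ** 2                      # start from the linear stiffness
        for _ in range(max_iter):
            var = math.pi * S0 / (c * k_eq)    # E[x^2] of the linearized system
            k_new = omega ** 2 * (1.0 + 3.0 * eps * var)
            if abs(k_new - k_eq) < tol:
                return k_new, math.pi * S0 / (c * k_new)
            k_eq = k_new
        return k_eq, math.pi * S0 / (c * k_eq)

    var_lin = math.pi * 0.01 / (2 * 0.05 * 1.0 * 1.0)  # eps = 0 reference
    k_eq, var_nl = equivalent_linearization_duffing()
    ```

    The hardening nonlinearity raises the equivalent stiffness and lowers the mean-square response relative to the linear system. Where the Gaussian closure degrades, comparison against numerical simulation in physical coordinates, as in the study above, is exactly the kind of check needed to establish the limits of validity.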

  14. Ship Air Wake Detection Using a Small Fixed Wing Unmanned Aerial Vehicle

    NASA Astrophysics Data System (ADS)

    Phelps, David M.

    A ship's air wake is dynamically detected using an airborne inertial measurement unit (IMU) and global positioning system (GPS) attached to a fixed wing unmanned aerial system. A fixed wing unmanned aerial system (UAS) was flown through the air wake created by an underway 108 ft (32.9 m) long research vessel in predesignated flight paths. The instrumented aircraft was used to validate computational fluid dynamic (CFD) simulations of naval ship air wakes. Computer models of the research ship and the fixed wing UAS were generated and gridded using NASA's TetrUSS software. Simulations were run using Kestrel, a Department of Defense CFD code, to validate the physical experimental data collection method. Air wake simulations were run at various relative wind angles and speeds. The fixed wing UAS was subjected to extensive wind tunnel testing to generate a table of aerodynamic coefficients as a function of control surface deflections, angle of attack and sideslip. The wind tunnel experimental data were compared against similarly structured CFD experiments to validate the grid and model of the fixed wing UAS. Finally, a CFD simulation of the fixed wing UAS flying through the generated wake was completed. Forces on the instrumented aircraft were calculated from the data collected by the IMU. Comparison of experimental and simulation data showed that the fixed wing UAS could detect interactions with the ship air wake.

  15. Prediction of miRNA targets.

    PubMed

    Oulas, Anastasis; Karathanasis, Nestoras; Louloupi, Annita; Pavlopoulos, Georgios A; Poirazi, Panayiota; Kalantidis, Kriton; Iliopoulos, Ioannis

    2015-01-01

    Computational methods for miRNA target prediction are currently undergoing extensive review and evaluation. There is still a great need for improvement of these tools and bioinformatics approaches are looking towards high-throughput experiments in order to validate predictions. The combination of large-scale techniques with computational tools will not only provide greater credence to computational predictions but also lead to the better understanding of specific biological questions. Current miRNA target prediction tools utilize probabilistic learning algorithms, machine learning methods and even empirical biologically defined rules in order to build models based on experimentally verified miRNA targets. Large-scale protein downregulation assays and next-generation sequencing (NGS) are now being used to validate methodologies and compare the performance of existing tools. Tools that exhibit greater correlation between computational predictions and protein downregulation or RNA downregulation are considered the state of the art. Moreover, efficiency in prediction of miRNA targets that are concurrently verified experimentally provides additional validity to computational predictions and further highlights the competitive advantage of specific tools and their efficacy in extracting biologically significant results. In this review paper, we discuss the computational methods for miRNA target prediction and provide a detailed comparison of methodologies and features utilized by each specific tool. Moreover, we provide an overview of current state-of-the-art high-throughput methods used in miRNA target prediction.

  16. Validation of Cross Sections for Monte Carlo Simulation of the Photoelectric Effect

    NASA Astrophysics Data System (ADS)

    Han, Min Cheol; Kim, Han Sung; Pia, Maria Grazia; Basaglia, Tullio; Batič, Matej; Hoff, Gabriela; Kim, Chan Hyeong; Saracco, Paolo

    2016-04-01

    Several total and partial photoionization cross section calculations, based on both theoretical and empirical approaches, are quantitatively evaluated with statistical analyses using a large collection of experimental data retrieved from the literature to identify the state of the art for modeling the photoelectric effect in Monte Carlo particle transport. Some of the examined cross section models are available in general purpose Monte Carlo systems, while others have been implemented and subjected to validation tests for the first time to estimate whether they could improve the accuracy of particle transport codes. The validation process identifies Scofield's 1973 non-relativistic calculations, tabulated in the Evaluated Photon Data Library (EPDL), as the one best reproducing experimental measurements of total cross sections. Specialized total cross section models, some of which derive from more recent calculations, do not provide significant improvements. Scofield's non-relativistic calculations are not surpassed regarding the compatibility with experiment of K and L shell photoionization cross sections either, although in a few test cases Ebel's parameterization produces more accurate results close to absorption edges. Modifications to Biggs and Lighthill's parameterization implemented in Geant4 significantly reduce the accuracy of total cross sections at low energies with respect to its original formulation. The scarcity of suitable experimental data hinders a similar extensive analysis for the simulation of the photoelectron angular distribution, which is limited to a qualitative appraisal.

  17. Reliability and validity of an iPhone® application for the measurement of lumbar spine flexion and extension range of motion.

    PubMed

    Pourahmadi, Mohammad Reza; Taghipour, Morteza; Jannati, Elham; Mohseni-Bandpei, Mohammad Ali; Ebrahimi Takamjani, Ismail; Rajabzadeh, Fatemeh

    2016-01-01

    Measurement of lumbar spine range of motion (ROM) is often considered an essential component of lumbar spine physiotherapy and orthopedic assessment. The measurement can be carried out with various instruments, such as inclinometers and goniometers. Recent smartphones are equipped with accelerometers and magnetometers that, through specific software applications (apps), can serve inclinometric functions. The main purpose was to investigate the reliability and validity of an iPhone® app (TiltMeter© - advanced level and inclinometer) for measuring standing lumbar spine flexion-extension ROM in asymptomatic subjects. A cross-sectional study was carried out in a physiotherapy clinic located at the School of Rehabilitation Sciences, Iran University of Medical Science and Health Services, Tehran, Iran. A convenience sample of 30 asymptomatic adults (15 males; 15 females; age range = 18-55 years) was recruited between August 2015 and December 2015. Following a 2-minute warm-up, the subjects were asked to stand in a relaxed position and their skin was marked at the T12-L1 and S1-S2 spinal levels. From this position, they were asked to perform maximum lumbar flexion followed by maximum lumbar extension with their knees straight. Two blinded raters each used an inclinometer and the iPhone® app to measure lumbar spine flexion-extension ROM. A third rater read the measured angles. To calculate total lumbar spine flexion-extension ROM, the measurement at S1-S2 was subtracted from that at T12-L1. The second (2 hours later) and third (48 hours later) sessions were carried out in the same manner as the first session. All of the measurements were conducted 3 times, and the mean value of the 3 repetitions for each measurement was used for analysis. Intraclass correlation coefficient (ICC) models (3, k) and (2, k) were used to determine intra-rater and inter-rater reliability, respectively. 
The Pearson correlation coefficients were used to establish the concurrent validity of the iPhone® app. Furthermore, the minimum detectable change at the 95% confidence level (MDC95) was computed as 1.96 × standard error of measurement × [Formula: see text]. Good to excellent intra-rater and inter-rater reliability was demonstrated for both the gravity-based inclinometer, with ICC values of ≥0.84 and ≥0.77, and the iPhone® app, with ICC values of ≥0.85 and ≥0.85, respectively. The MDC95 ranged from 5.82° to 8.18° for the intra-rater analysis and from 7.38° to 8.66° for the inter-rater analysis. The concurrent validity for flexion and extension between the 2 instruments was 0.85 and 0.91, respectively. The iPhone® app possesses good to excellent intra-rater and inter-rater reliability and concurrent validity, and it appears suitable for the measurement of lumbar spine flexion-extension ROM. Evidence level: IIb.
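    The "[Formula: see text]" placeholder above is a rendering artifact of the source record; in the usual form of this statistic the elided factor is √2, reflecting the two measurements involved in a test-retest difference. A minimal sketch under that assumption (the SEM value is hypothetical, not taken from the study):

```python
import math

def mdc95(sem_deg: float) -> float:
    """Minimum detectable change at the 95% confidence level.

    Standard form: MDC95 = 1.96 * SEM * sqrt(2); the sqrt(2) reflects
    the two measurements involved in a test-retest difference.
    """
    return 1.96 * sem_deg * math.sqrt(2)

# Illustrative SEM of 3.0 degrees (hypothetical, not from the study):
print(f"MDC95 = {mdc95(3.0):.2f} deg")
```

Changes smaller than the MDC95 cannot be distinguished from measurement error at the 95% confidence level.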

  18. Turning an Extension Aide into an Extension Agent

    ERIC Educational Resources Information Center

    Seevers, Brenda; Dormody, Thomas J.

    2010-01-01

    For any organization to remain sustainable, a renewable source of faculty and staff needs to be available. The Extension Internship Program for Juniors and Seniors in High School is a new tool for recruiting and developing new Extension agents. Students get "hands on" experience working in an Extension office and earn college credit…

  19. Development and Validation of an Internet Use Attitude Scale

    ERIC Educational Resources Information Center

    Zhang, Yixin

    2007-01-01

    This paper describes the development and validation of a new 40-item Internet Attitude Scale (IAS), a one-dimensional inventory for measuring Internet attitudes. The first experiment initiated a generic Internet attitude questionnaire, ensured construct validity, and examined factorial validity and reliability. The second experiment further…

  20. Teaching "Instant Experience" with Graphical Model Validation Techniques

    ERIC Educational Resources Information Center

    Ekstrøm, Claus Thorn

    2014-01-01

    Graphical model validation techniques for linear normal models are often used to check the assumptions underlying a statistical model. We describe an approach that provides "instant experience" in reading a graphical model validation plot, so it becomes easier to determine whether any of the underlying assumptions are violated.

  1. Extension of a Kinetic Approach to Chemical Reactions to Electronic Energy Levels and Reactions Involving Charged Species with Application to DSMC Simulations

    NASA Technical Reports Server (NTRS)

    Liechty, Derek S.

    2014-01-01

    The ability to compute rarefied, ionized hypersonic flows is becoming more important as missions such as Earth reentry, landing high-mass payloads on Mars, and the exploration of the outer planets and their satellites are being considered. Recently introduced molecular-level chemistry models that predict equilibrium and nonequilibrium reaction rates using only kinetic theory and fundamental molecular properties are extended in the current work to include electronic energy level transitions and reactions involving charged particles. These extensions are shown to agree favorably with reported transition and reaction rates from the literature for near-equilibrium conditions. The extensions are also applied to the second flight of the Project FIRE flight experiment at 1634 seconds, with a Knudsen number of 0.001 at an altitude of 76.4 km. In order to accomplish this, NASA's direct simulation Monte Carlo code DAC was rewritten to include the ability to simulate charge-neutral ionized flows, to take advantage of the recently introduced chemistry model, and to include the extensions presented in this work. The 1634-second data point was chosen so that comparisons could be made to a CFD solution. The Knudsen number at this point in time is such that the DSMC simulations are still tractable and the CFD computations are at the edge of what is considered valid because, although near-transitional, the flow is still considered continuum. It is shown that the inclusion of electronic energy levels in the DSMC simulation is necessary for flows of this nature and is required for comparison to the CFD solution. The flow field solutions are also post-processed by the nonequilibrium radiation code HARA to compute the radiative component.

  2. NASA F-16XL supersonic laminar flow control program overview

    NASA Technical Reports Server (NTRS)

    Fischer, Michael C.

    1992-01-01

    The viewgraphs and discussion of the NASA supersonic laminar flow control program are provided. Successful application of laminar flow control to a High Speed Civil Transport (HSCT) offers significant benefits in reductions of take-off gross weight, mission fuel burn, cruise drag, structural temperatures, engine size, emissions, and sonic boom. The ultimate economic success of the proposed HSCT may depend on the successful adaptation of laminar flow control, which offers the single most significant potential improvement in lift-drag ratio (L/D) of all the aerodynamic technologies under consideration. The F-16XL Supersonic Laminar Flow Control (SLFC) Experiment was conceived based on the encouraging results of in-house and NASA-supported industry studies to determine whether laminar flow control is feasible for the HSCT. The primary objective is to achieve extensive laminar flow (50-60 percent chord) on a highly swept supersonic wing. Data obtained from the flight test will be used to validate existing Euler and Navier-Stokes aerodynamic codes and transition-prediction boundary-layer stability codes. These validated codes and the developed design methodology will be delivered to industry for use in designing supersonic laminar flow control wings. Results from this experiment will establish preliminary suction system design criteria, enabling industry to better size the suction system and develop improved estimates of system weight, fuel volume loss due to wing ducting, turbocompressor power requirements, etc., so that benefits and penalties can be more accurately assessed.

  3. Convergent Validity with the BERS-2 Teacher Rating Scale and the Achenbach Teacher's Report Form: A Replication and Extension

    ERIC Educational Resources Information Center

    Benner, Gregory J.; Beaudoin, Kathleen; Mooney, Paul; Uhing, Brad M.; Pierce, Corey D.

    2008-01-01

    In the present study, we sought to extend instrument validation research for a strength-based emotional and behavior rating scale, the "Teacher Rating Scale of the Behavior and Emotional Rating Scale-Second Edition" (BERS-2; Epstein, M. H. (2004). "Behavioral and emotional rating scale" (2nd ed.). Austin, TX: PRO-ED) through…

  4. Psychometric Support of the School Climate Measure in a Large, Diverse Sample of Adolescents: A Replication and Extension

    ERIC Educational Resources Information Center

    Zullig, Keith J.; Collins, Rani; Ghani, Nadia; Patton, Jon M.; Huebner, E. Scott; Ajamie, Jean

    2014-01-01

    Background: The School Climate Measure (SCM) was developed and validated in 2010 in response to a dearth of psychometrically sound school climate instruments. This study sought to further validate the SCM on a large, diverse sample of Arizona public school adolescents (N = 20,953). Methods: Four SCM domains (positive student-teacher relationships,…

  5. Validating the Technology Acceptance Model in the Context of the Laboratory Information System-Electronic Health Record Interface System

    ERIC Educational Resources Information Center

    Aquino, Cesar A.

    2014-01-01

    This study represents a research validating the efficacy of Davis' Technology Acceptance Model (TAM) by pairing it with the Organizational Change Readiness Theory (OCRT) to develop another extension to the TAM, using the medical Laboratory Information Systems (LIS)--Electronic Health Records (EHR) interface as the medium. The TAM posits that it is…

  6. Validity and reliability of questionnaires measuring physical activity self-efficacy, enjoyment, social support among Hong Kong Chinese children

    USDA-ARS?s Scientific Manuscript database

    Physical activity (PA) correlates have not been extensively studied in Hong Kong children. The aim of this study is to assess the validity and reliability of translated scales to measure PA related self-efficacy, enjoyment and social support in Hong Kong Chinese children. Sample 1 (n=273, aged 8–12 ...

  7. Validation of the Olympic Games Attitude Scale (OGAS): Evidence from Exploratory and Confirmatory Factor Analyses

    ERIC Educational Resources Information Center

    Mak, Jennifer Y.; Cheung, Siu-Yin; King, Carina C.; Lam, Eddie T. C.

    2016-01-01

    There have been extensive studies of local residents' perception and reaction to the impacts of mega events. However, there is limited empirical research on the social impacts that shape foreign attitudes toward the host country. The purpose of this study was to develop and validate the Olympic Games Attitude Scale (OGAS) to examine viewers'…

  8. Quality of College Life (QCL) of Students: Further Validation of a Measure of Well-Being

    ERIC Educational Resources Information Center

    Sirgy, M. Joseph; Lee, Dong-Jin; Grzeskowiak, Stephan; Yu, Grace B.; Webb, Dave; El-Hasan, Karma; Vega, Jose Jesus Garcia; Ekici, Ahmet; Johar, J. S.; Krishen, Anjala; Kangal, Ayca; Swoboda, Bernhard; Claiborne, C. B.; Maggino, Filomena; Rahtz, Don; Canton, Alicia; Kuruuzum, Ayse

    2010-01-01

    This paper reports a study designed to further validate a measure of quality of college life (QCL) of university students (Sirgy, Grzeskowiak, Rahtz, "Soc Indic Res" 80(2), 343-360, 2007). Two studies were conducted: a replication study and an extension study. The replication study involved surveys of 10 different college campuses in different…

  9. The PKRC's Value as a Professional Development Model Validated

    ERIC Educational Resources Information Center

    Larson, Dale

    2013-01-01

    After a brief review of the 4-H professional development standards, a new model for determining the value of continuing professional development is introduced and applied to the 4-H standards. The validity of the 4-H standards is affirmed. 4-H Extension professionals are encouraged to celebrate the strength of their standards and to engage the…

  10. Strong monogamy conjecture for multiqubit entanglement: the four-qubit case.

    PubMed

    Regula, Bartosz; Di Martino, Sara; Lee, Soojoon; Adesso, Gerardo

    2014-09-12

    We investigate the distribution of bipartite and multipartite entanglement in multiqubit states. In particular, we define a set of monogamy inequalities sharpening the conventional Coffman-Kundu-Wootters constraints, and we provide analytical proofs of their validity for relevant classes of states. We present extensive numerical evidence validating the conjectured strong monogamy inequalities for arbitrary pure states of four qubits.
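    For context (a standard result, not restated in the abstract), the conventional Coffman-Kundu-Wootters constraint that these strong monogamy inequalities sharpen bounds the tangle τ of qubit A with the pair BC from below by the two-qubit tangles:

$$\tau_{A|BC} \;\ge\; \tau_{AB} + \tau_{AC}.$$

The strong monogamy conjecture extends the right-hand side for larger systems, e.g. for four qubits $\tau_{A|BCD} \ge \tau_{AB} + \tau_{AC} + \tau_{AD} + (\text{genuinely multipartite terms})$; the precise form of the additional terms is the subject of the paper.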

  11. Search for Spatially Extended Fermi-LAT Sources Using Two Years of Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lande, Joshua; Ackermann, Markus; Allafort, Alice

    2012-07-13

    Spatial extension is an important characteristic for correctly associating γ-ray-emitting sources with their counterparts at other wavelengths and for obtaining an unbiased model of their spectra. We present a new method for quantifying the spatial extension of sources detected by the Large Area Telescope (LAT), the primary science instrument on the Fermi Gamma-ray Space Telescope (Fermi). We perform a series of Monte Carlo simulations to validate this tool and calculate the LAT threshold for detecting the spatial extension of sources. We then test all sources in the second Fermi-LAT catalog (2FGL) for extension. We report the detection of seven new spatially extended sources.

  12. Oil Slick Characterization Using Synthetic Aperture Radar

    NASA Astrophysics Data System (ADS)

    Jones, C. E.; Breivik, O.; Brekke, C.; Skrunes, S.; Holt, B.

    2015-12-01

    Oil spills are a worldwide hazard with the potential to cause high-impact disasters, and they require an active oil spill response capability to protect personnel, the ecosystem, and the energy supply. As the amount of oil in traditionally accessible reserves declines, there will be increasing oil extraction from the Arctic and deep-water wells, both new sources with high risk and high cost for monitoring and response. Although radar has long been used for mapping the spatial extent of oil slicks, it is only since the Deepwater Horizon spill that synthetic aperture radar (SAR) has been shown capable of characterizing oil properties within a slick, and therefore useful for directing response to the recoverable thicker slicks or emulsions. Here we discuss a 2015 Norwegian oil-on-water spill experiment in which emulsions of known quantity and water-to-oil ratio, along with a look-alike slick of plant oil, were released in the North Sea and imaged with polarimetric SAR (PolSAR) by NASA's UAVSAR instrument for several hours following release. During the experiment, extensive in situ measurements were made from ship or aircraft with meteorological instruments, released drift buoys, and optical/IR imagers. The experiment was designed to provide validation data for the development of a physical model relating polarization-dependent electromagnetic scattering to the dielectric properties of oil mixed with ocean water, which is the basis for oil characterization with SAR. Data were acquired with X-, C-, and L-band satellite-based SARs to enable multi-frequency comparison of characterization capabilities. In addition, the data are used to develop methods to differentiate mineral slicks from biogenic look-alikes, and to better understand slick weathering and dispersion. The results will provide a basis for modeling oil-in-ice spills, currently a high priority for nations involved in Arctic oil exploration. 
Here we discuss the Norwegian experiment, the validation data, and the results of analysis of the SAR data acquired during the experiment. This work was carried out in part at the Jet Propulsion Laboratory, California Institute of Technology, under a contract with NASA. The Norwegian experiment was partly financed by CIRFA - Centre for integrated remote sensing and forecasting for arctic operations.

  13. Expert system verification and validation study

    NASA Technical Reports Server (NTRS)

    French, Scott W.; Hamilton, David

    1992-01-01

    Five workshops on verification and validation (V&V) of expert systems (ES) were taught during this recent period of performance. Two key activities, previously performed under this contract, supported these recent workshops: (1) a survey of the state of the practice of V&V of ES and (2) development of the workshop material and the first class. The first activity involved an extensive survey of ES developers to answer several questions regarding the state of the practice in V&V of ES. These questions related to the amount and type of V&V done and how successful this V&V was. The next key activity involved developing an intensive hands-on workshop in V&V of ES. This activity involved surveying a large number of V&V techniques, conventional as well as ES-specific ones. In addition to explaining the techniques, we showed how each technique could be applied to a sample problem. References were included in the workshop material and cross-referenced to techniques so that students would know where to find additional information about each technique. In addition to teaching specific techniques, we included an extensive amount of material on V&V concepts and on how to develop a V&V plan for an ES project. We felt this material was necessary so that developers would be prepared to develop an orderly and structured approach to V&V; that is, they would have a process that supported the use of the specific techniques. Finally, to provide hands-on experience, we developed a set of case-study exercises. These exercises provided an opportunity for the students to apply all the material (concepts, techniques, and planning material) to a realistic problem.

  14. Mirnacle: machine learning with SMOTE and random forest for improving selectivity in pre-miRNA ab initio prediction.

    PubMed

    Marques, Yuri Bento; de Paiva Oliveira, Alcione; Ribeiro Vasconcelos, Ana Tereza; Cerqueira, Fabio Ribeiro

    2016-12-15

    MicroRNAs (miRNAs) are key gene expression regulators in plants and animals. miRNAs are therefore involved in several biological processes, making the study of these molecules one of the most relevant topics of molecular biology nowadays. However, characterizing miRNAs in vivo is still a complex task. As a consequence, in silico methods have been developed to predict miRNA loci. A common ab initio strategy to find miRNAs in genomic data is to search for sequences that can fold into the typical hairpin structure of miRNA precursors (pre-miRNAs). The current ab initio approaches, however, have selectivity issues, i.e., a high number of false positives is reported, which can lead to laborious and costly attempts at biological validation. This study presents an extension of the ab initio method miRNAFold, with the aim of improving selectivity through machine learning techniques, namely, random forest combined with the SMOTE procedure, which copes with imbalanced datasets. By comparing our method, termed Mirnacle, with other important approaches in the literature, we demonstrate that Mirnacle substantially improves selectivity without compromising sensitivity. For the three datasets used in our experiments, our method achieved at least 97% sensitivity and delivered a two-fold, 20-fold, and 6-fold increase in selectivity, respectively, compared with the best results of current computational tools. The extension of miRNAFold through the introduction of machine learning techniques significantly increases selectivity in pre-miRNA ab initio prediction, which contributes to advanced studies on miRNAs, as the need for biological validation is diminished. Hopefully, new research, such as studies of severe diseases caused by miRNA malfunction, will benefit from the proposed computational tool.
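    The SMOTE step mentioned above balances the training set by synthesizing new minority-class examples through interpolation between a minority sample and one of its nearest minority neighbors. A minimal, stdlib-only sketch of that idea (illustrative points, not the Mirnacle implementation):

```python
import math
import random

def smote_like(minority, n_new, k=2, seed=42):
    """Generate synthetic minority-class points by interpolation (SMOTE idea).

    For each new point: pick a minority sample, find its k nearest minority
    neighbors, and interpolate a random fraction of the way toward one of them.
    """
    rng = random.Random(seed)
    synthetic = []
    for _ in range(n_new):
        x = rng.choice(minority)
        neighbors = sorted(
            (p for p in minority if p is not x),
            key=lambda p: math.dist(x, p),
        )[:k]
        nb = rng.choice(neighbors)
        t = rng.random()  # interpolation fraction in [0, 1)
        synthetic.append(tuple(xi + t * (ni - xi) for xi, ni in zip(x, nb)))
    return synthetic

minority = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (1.0, 1.0)]
new_points = smote_like(minority, n_new=4)
print(new_points)
```

The balanced set is then used to train the classifier, in Mirnacle's case a random forest.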

  15. Numerical investigation of deep-crust behavior under lithospheric extension

    NASA Astrophysics Data System (ADS)

    Korchinski, Megan; Rey, Patrice F.; Mondy, Luke; Teyssier, Christian; Whitney, Donna L.

    2018-02-01

    What are the conditions under which lithospheric extension drives exhumation of the deep orogenic crust during the formation of gneiss domes? The mechanical link between extension of the shallow crust and flow of the deep crust is investigated using two-dimensional numerical experiments of lithospheric extension in which the crust is 60 km thick and the deep-crust viscosity and density parameter space is explored. Results indicate that the style of extension of the shallow crust and the path, magnitude, and rate of flow of the deep crust are dynamically linked through the deep-crust viscosity, with density playing an important role in experiments with a high-viscosity deep crust. Three main groups of domes are defined based on their mechanisms of exhumation across the viscosity-density parameter space. In the first group (low-viscosity, low-density deep crust), domes develop by lateral and upward flow of the deep crust at velocities of km/m.y. (i.e., comparable to the rate at which the experiment boundary extends). In this case, extension in the shallow crust is localized on a single interface, and the deep crust traverses the entire thickness of the crust to the Earth's near-surface in 5 m.y. This high exhuming power relies on the dynamic feedback between the flow of the deep crust and the localization of extension in the shallow crust. The second group (intermediate-viscosity, low-density deep crust) has less exhuming power because the stronger deep crust flows less readily and instead accommodates more uniform extension, which imparts distributed extension to the shallow crust. The third group represents the upper limits of viscosity and density for the deep crust; in this case the low buoyancy of the deep crust results in localized thinning of the crust, with large upward motion of the Moho and the lithosphere-asthenosphere boundary. These numerical experiments test the exhuming power of the deep crust in the formation of extensional gneiss domes.

  16. Impacts of Personal Experience: Informing Water Conservation Extension Education

    ERIC Educational Resources Information Center

    Huang, Pei-wen; Lamm, Alexa J.

    2017-01-01

    Extension educators have diligently educated the general public about water conservation. Incorporating audiences' personal experience into educational programming is recommended as an approach to effectively enhance audiences' adoption of water conservation practices. To ensure the impact on the audiences and environment, understanding the…

  17. The Role of Structural Models in the Solar Sail Flight Validation Process

    NASA Technical Reports Server (NTRS)

    Johnston, John D.

    2004-01-01

    NASA is currently soliciting proposals via the New Millennium Program ST-9 opportunity for a potential Solar Sail Flight Validation (SSFV) experiment to develop and operate in space a deployable solar sail that can be steered and provides measurable acceleration. The approach planned for this experiment is to test and validate models and processes for solar sail design, fabrication, deployment, and flight. These models and processes would then be used to design, fabricate, and operate scalable solar sails for future space science missions. There are six validation objectives planned for the ST9 SSFV experiment: 1) Validate solar sail design tools and fabrication methods; 2) Validate controlled deployment; 3) Validate in-space structural characteristics (the focus of this poster); 4) Validate solar sail attitude control; 5) Validate solar sail thrust performance; 6) Characterize the sail's electromagnetic interaction with the space environment. This poster presents a top-level assessment of the role of structural models in the validation process for in-space structural characteristics.

  18. Absolute Reliability and Concurrent Validity of Hand Held Dynamometry and Isokinetic Dynamometry in the Hip, Knee and Ankle Joint: Systematic Review and Meta-analysis

    PubMed Central

    Chamorro, Claudio; Armijo-Olivo, Susan; De la Fuente, Carlos; Fuentes, Javiera; Javier Chirosa, Luis

    2017-01-01

    Abstract The purpose of the study is to establish the absolute reliability and concurrent validity of hand-held dynamometers (HHDs) and isokinetic dynamometers (IDs) in lower extremity peak torque assessment. The Medline, Embase, and CINAHL databases were searched for studies related to psychometric properties in muscle dynamometry. Studies reporting the standard error of measurement (SEM%) or limits of agreement (LOA%), expressed as a percentage of the mean, were used to establish absolute reliability, while studies using the intra-class correlation coefficient (ICC) were used to establish concurrent validity between dynamometers. In total, 17 studies were included in the meta-analysis. The COSMIN checklist rated them between fair and poor. Using HHDs, knee extension LOA% was 33.59%, 95% confidence interval (CI) 23.91 to 43.26, and ankle plantar flexion LOA% was 48.87%, CI 35.19 to 62.56. Using IDs, hip adduction and extension, knee flexion and extension, and ankle dorsiflexion showed LOA% under 15%. Lower hip, knee, and ankle LOA% were obtained using an ID compared to an HHD. ICCs between devices ranged from 0.62, CI (0.37 to 0.87), for ankle dorsiflexion to 0.94, CI (0.91 to 0.98), for hip adduction. Very high correlations were found for hip adductors and hip flexors, and moderate correlations for knee flexors/extensors and ankle plantar/dorsiflexors. PMID:29071305
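    The LOA (%) figures above are Bland-Altman 95% limits of agreement expressed as a percentage of the mean. A minimal sketch of that computation on hypothetical paired peak-torque readings (the values are illustrative, not from the review):

```python
import statistics

def loa_percent(device_a, device_b):
    """95% limit of agreement (1.96 * SD of the paired differences),
    expressed as a percentage of the grand mean of both devices."""
    diffs = [a - b for a, b in zip(device_a, device_b)]
    sd_diff = statistics.stdev(diffs)  # sample standard deviation
    grand_mean = statistics.mean(device_a + device_b)
    return 100.0 * 1.96 * sd_diff / grand_mean

# Hypothetical peak-torque readings (Nm) from an HHD and an ID:
hhd = [102.0, 95.0, 110.0, 99.0, 105.0]
iso = [100.0, 98.0, 104.0, 101.0, 103.0]
print(f"LOA = {loa_percent(hhd, iso):.1f}%")
```

A smaller LOA% means the two devices (or repeated sessions) agree more closely relative to the size of the quantity being measured.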

  19. Fast and Accurate Simulation Technique for Large Irregular Arrays

    NASA Astrophysics Data System (ADS)

    Bui-Van, Ha; Abraham, Jens; Arts, Michel; Gueuning, Quentin; Raucy, Christopher; Gonzalez-Ovejero, David; de Lera Acedo, Eloy; Craeye, Christophe

    2018-04-01

    A fast full-wave simulation technique is presented for the analysis of large irregular planar arrays of identical 3-D metallic antennas. The solution method relies on the Macro Basis Functions (MBF) approach and an interpolatory technique to compute the interactions between MBFs. The Harmonic-polynomial (HARP) model is established for the near-field interactions in a modified system of coordinates. For extremely large arrays made of complex antennas, two approaches assuming a limited radius of influence for mutual coupling are considered: one is based on a sparse-matrix LU decomposition and the other on a tessellation of the array in the form of overlapping sub-arrays. The computation of all embedded element patterns is sped up with the help of the non-uniform FFT algorithm. Extensive validations are shown for arrays of log-periodic antennas envisaged for the low-frequency SKA (Square Kilometer Array) radio telescope. The analysis of SKA stations with such a large number of elements has not yet been treated in the literature. Validations include comparisons with results obtained with commercial software and with experiments. The proposed method is particularly well suited to array synthesis, in which several orders of magnitude can be saved in terms of computation time.

  20. A laboratory validation study of the time-lapse oscillatory pumping test for leakage detection in geological repositories

    NASA Astrophysics Data System (ADS)

    Sun, Alexander Y.; Lu, Jiemin; Islam, Akand

    2017-05-01

    Geologic repositories are extensively used for disposing of byproducts in the mineral and energy industries. The safety and reliability of these repositories are a primary concern to environmental regulators and the public. The time-lapse oscillatory pumping test (OPT) has been introduced recently as a pressure-based technique for detecting potential leakage in geologic repositories. By routinely conducting OPT at a number of pulsing frequencies, an operator may identify potential repository anomalies in the frequency domain, alleviating the ambiguity caused by reservoir noise and improving the signal-to-noise ratio. Building on previous theoretical and field studies, this work performed a series of laboratory experiments to validate the concept of time-lapse OPT using a custom-made stainless-steel tank under relatively high pressures. The experimental configuration simulates a miniature geologic storage repository consisting of three layers (i.e., injection zone, caprock, and above-zone aquifer). Results show that leakage in the injection zone led to deviations in the power spectrum of the observed pressure data, the amplitude of which increases with decreasing pulsing frequency. The experimental results are further analyzed with a 3D flow model, whose parameters are estimated through frequency-domain inversion.
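    The frequency-domain detection idea described above, i.e. pumping at a known pulsing frequency and watching the pressure response at that frequency for leak-induced deviations, can be sketched with a single-frequency discrete Fourier transform on synthetic data (the signal and the leak's attenuation factor are illustrative assumptions, not the study's model):

```python
import cmath
import math

def amplitude_at(signal, dt, freq):
    """Single-frequency DFT amplitude of a uniformly sampled signal."""
    n = len(signal)
    s = sum(x * cmath.exp(-2j * math.pi * freq * k * dt)
            for k, x in enumerate(signal))
    return 2.0 * abs(s) / n

# Synthetic pressure response to oscillatory pumping at 0.1 Hz,
# sampled every 0.1 s over 100 s (an integer number of cycles):
dt, n, f_pulse = 0.1, 1000, 0.1
baseline = [5.0 * math.sin(2 * math.pi * f_pulse * k * dt) for k in range(n)]
# Assume, for illustration, that a leak attenuates the response by 30%:
leaky = [0.7 * x for x in baseline]

a0 = amplitude_at(baseline, dt, f_pulse)
a1 = amplitude_at(leaky, dt, f_pulse)
print(f"amplitude without leak: {a0:.2f}, with leak: {a1:.2f}")
```

A drop in the spectral amplitude at the pulsing frequency between routine tests is the kind of deviation an operator would flag.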

  1. Simulating direct shear tests with the Bullet physics library: A validation study.

    PubMed

    Izadi, Ehsan; Bezuijen, Adam

    2018-01-01

    This study focuses on the possible uses of physics engines, and more specifically the Bullet physics library, to simulate granular systems. Physics engines are employed extensively in the video gaming, animation, and movie industries to create physically plausible scenes. They are designed to deliver a fast, stable, and optimal simulation of certain systems such as rigid bodies, soft bodies, and fluids. This study focuses exclusively on simulating granular media in the context of rigid body dynamics with the Bullet physics library. The first step was to validate the results of simulations of direct shear testing on uniform-sized metal beads against laboratory experiments. The difference in the average angle of mobilized friction was found to be only 1.0°. In addition, a very close match was found between dilatancy in the laboratory samples and in the simulations. A comprehensive study was then conducted to determine the failure and post-failure mechanism. We conclude with the presentation of a simulation of a direct shear test on real soil, which demonstrates that Bullet has all the capabilities needed to be used as software for simulating granular systems.

  2. 3-D Characterization of Seismic Properties at the Smart Weapons Test Range, YPG

    NASA Astrophysics Data System (ADS)

    Miller, Richard D.; Anderson, Thomas S.; Davis, John C.; Steeples, Don W.; Moran, Mark L.

    2001-10-01

    The Smart Weapons Test Range (SWTR) lies within the Yuma Proving Ground (YPG), Arizona. SWTR is a new facility constructed specifically for the development and testing of futuristic intelligent battlefield sensor networks. In this paper, results are presented for an extensive high-resolution geophysical characterization study at the SWTR site along with validation using 3-D modeling. In this study, several shallow seismic methods and novel processing techniques were used to generate a 3-D grid of earth seismic properties, including compressional (P) and shear (S) body-wave speeds (Vp and Vs), and their associated body-wave attenuation parameters (Qp, and Qs). These experiments covered a volume of earth measuring 1500 m by 300 m by 25 m deep (11 million cubic meters), centered on the vehicle test track at the SWTR site. The study has resulted in detailed characterizations of key geophysical properties. To our knowledge, results of this kind have not been previously achieved, nor have the innovative methods developed for this effort been reported elsewhere. In addition to supporting materiel developers with important geophysical information at this test range, the data from this study will be used to validate sophisticated 3-D seismic signature models for moving vehicles.

  3. Heater Validation for the NEXT-C Hollow Cathodes

    NASA Technical Reports Server (NTRS)

    Verhey, Timothy R.; Soulas, George C.; Mackey, Jonathan Ar.

    2017-01-01

    Swaged cathode heaters whose design was successfully demonstrated under a prior flight project are to be provided by the NASA Glenn Research Center for the NEXT-C ion thruster being fabricated by Aerojet Rocketdyne. Extensive requalification activities were performed to validate process controls that had to be re-established or revised because systemic changes prevented reuse of the past approaches. A development batch of heaters was successfully fabricated based on the new process controls. Acceptance and cyclic life testing of multiple discharge- and neutralizer-sized heaters extracted from the development batch was initiated in August 2016, with the last heater completing testing in April 2017. Cyclic life testing results substantially exceeded the NEXT-C thruster requirement as well as all past experience for GRC-fabricated units. The heaters demonstrated ultimate cyclic life capability of 19,050 to 33,500 cycles. A qualification batch of heaters is now being fabricated using the finalized process controls. A set of six heaters will be acceptance and cyclic tested to verify conformance to the behavior observed with the development heaters. The heaters for flight use will then be provided to the contractor. This paper summarizes the fabrication process control activities and the acceptance and life testing of the development heater units.

  4. DPOD2014: a new DORIS extension of ITRF2014 for Precise Orbit Determination

    NASA Astrophysics Data System (ADS)

    Moreaux, G.; Willis, P.; Lemoine, F. G.; Zelensky, N. P.

    2016-12-01

    As one of the tracking systems used to determine the orbits of altimeter mission satellites (such as TOPEX/Poseidon, Envisat, Jason-1/2/3 and CryoSat-2), the DORIS tracking station network provides a fundamental reference for the estimation of precise orbits and so, by extension, for the quality of the altimeter data and derived products. Therefore, the time evolution of the positions of both the existing and the newest DORIS stations must be precisely modeled and regularly updated. To satisfy operational requirements for precise orbit determination and routine delivery of geodetic products, the International DORIS Service (IDS) maintains the so-called DPOD solutions, which can be seen as extensions of the latest available ITRF solution from the International Earth Rotation and Reference Systems Service (IERS). In mid-2016, the IDS agreed to change the processing strategy of the DPOD solution. The new solution from the IDS Combination Center (CC) consists of a DORIS cumulative position and velocity solution using the latest IDS combined weekly solutions. The first objective of this study is to describe the new DPOD elaboration scheme and to show the IDS CC internal validation steps. The second is to present the external validation process performed by an independent team before the new DPOD is made available to all users. The elaboration and validation procedures are illustrated with the first version of DPOD2014 (the ITRF2014 DORIS extension), with a focus on the updated positions and velocities of two DORIS sites: Everest (after the M7.8 Gorkha earthquake in April 2015) and Thule (Greenland).

  5. Ability of preoperative 3.0-Tesla magnetic resonance imaging to predict the absence of side-specific extracapsular extension of prostate cancer.

    PubMed

    Hara, Tomohiko; Nakanishi, Hiroyuki; Nakagawa, Tohru; Komiyama, Motokiyo; Kawahara, Takashi; Manabe, Tomoko; Miyake, Mototaka; Arai, Eri; Kanai, Yae; Fujimoto, Hiroyuki

    2013-10-01

    Recent studies have shown an improvement in prostate cancer diagnosis with the use of 3.0-Tesla magnetic resonance imaging. We retrospectively assessed the ability of this imaging technique to predict side-specific extracapsular extension of prostate cancer. From October 2007 to August 2011, prostatectomy was carried out in 396 patients after preoperative 3.0-Tesla magnetic resonance imaging. Among these, 132 patients (primary sample) and 134 patients (validation sample) underwent 12-core prostate biopsy at the National Cancer Center Hospital of Tokyo, Japan, and at other institutions, respectively. In the primary dataset, univariate and multivariate analyses were carried out to predict side-specific extracapsular extension using variables determined preoperatively, including 3.0-Tesla magnetic resonance imaging findings (T2-weighted and diffusion-weighted imaging). A prediction model was then constructed and applied to the validation sample. Multivariate analysis identified four significant independent predictors (P < 0.05): a biopsy Gleason score of ≥8, positive 3.0-Tesla diffusion-weighted magnetic resonance imaging findings, ≥2 positive biopsy cores on each side and a maximum percentage of positive cores ≥31% on each side. The negative predictive value was 93.9% in the combination model with these four predictors, whereas the positive predictive value was 33.8%. Good reproducibility of these four predictors and of the combination model was observed in the validation sample. Side-specific extracapsular extension prediction by the biopsy Gleason score and factors associated with tumor location, including a positive 3.0-Tesla diffusion-weighted magnetic resonance imaging finding, has a high negative predictive value but a low positive predictive value. © 2013 The Japanese Urological Association.

  6. An intercomparison of a large ensemble of statistical downscaling methods for Europe: Overall results from the VALUE perfect predictor cross-validation experiment

    NASA Astrophysics Data System (ADS)

    Gutiérrez, Jose Manuel; Maraun, Douglas; Widmann, Martin; Huth, Radan; Hertig, Elke; Benestad, Rasmus; Roessler, Ole; Wibig, Joanna; Wilcke, Renate; Kotlarski, Sven

    2016-04-01

    VALUE is an open European network to validate and compare downscaling methods for climate change research (http://www.value-cost.eu). A key deliverable of VALUE is the development of a systematic validation framework to enable the assessment and comparison of both dynamical and statistical downscaling methods. This framework is based on a user-focused validation tree, guiding the selection of relevant validation indices and performance measures for different aspects of the validation (marginal, temporal, spatial, multi-variable). Moreover, several experiments have been designed to isolate specific points in the downscaling procedure where problems may occur (assessment of intrinsic performance, effect of errors inherited from the global models, effect of non-stationarity, etc.). The list of downscaling experiments includes 1) cross-validation with perfect predictors, 2) GCM predictors, aligned with the EURO-CORDEX experiment, and 3) pseudo-reality predictors (see Maraun et al. 2015, Earth's Future, 3, doi:10.1002/2014EF000259, for more details). The results of these experiments are gathered, validated and publicly distributed through the VALUE validation portal, allowing for a comprehensive community-open downscaling intercomparison study. In this contribution we describe the overall results from experiment 1), consisting of a Europe-wide 5-fold cross-validation (with consecutive 6-year periods from 1979 to 2008) using predictors from ERA-Interim to downscale precipitation and temperatures (minimum and maximum) over a set of 86 ECA&D stations representative of the main geographical and climatic regions in Europe. In response to the open call for contributions to this experiment (closed in Dec. 2015), over 40 methods representative of the main approaches (MOS and Perfect Prognosis, PP) and techniques (linear scaling, quantile mapping, analogs, weather typing, linear and generalized regression, weather generators, etc.) were submitted, including both data (downscaled values) and metadata (characterizing different aspects of the downscaling methods). This constitutes the largest and most comprehensive intercomparison of statistical downscaling methods to date. Here, we present an overall validation, analyzing marginal and temporal aspects to assess the intrinsic performance and added value of statistical downscaling methods at both annual and seasonal levels. This validation takes into account the different properties and limitations of the various approaches and techniques (as reported in the provided metadata) in order to perform a fair comparison. We note that this experiment alone is not sufficient to evaluate the limitations of (MOS) bias correction techniques. Moreover, it does not fully validate PP either, since it does not reveal whether the right predictors were used or whether the PP assumption is valid. These problems will be analyzed in the subsequent community-open VALUE experiments 2) and 3), which will be open for participation over the course of the present year.
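    The 5-fold design described here (consecutive 6-year blocks over 1979-2008, each held out in turn) can be sketched in a few lines; this is an illustrative reconstruction of the splitting scheme, not VALUE's actual code:

```python
def consecutive_folds(start_year, end_year, n_folds):
    """Split [start_year, end_year] into n_folds consecutive blocks;
    each block is held out for validation while the rest train."""
    years = list(range(start_year, end_year + 1))
    size = len(years) // n_folds
    blocks = [years[i * size:(i + 1) * size] for i in range(n_folds)]
    return [(block, [y for y in years if y not in block]) for block in blocks]

# Five consecutive 6-year validation periods, as in VALUE experiment 1:
for test_years, train_years in consecutive_folds(1979, 2008, 5):
    print(test_years[0], "-", test_years[-1])
```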

  7. Dielectrophoretic immobilisation of nanoparticles as isolated singles in regular arrays

    NASA Astrophysics Data System (ADS)

    Knigge, Xenia; Wenger, Christian; Bier, Frank F.; Hölzel, Ralph

    2018-02-01

    We demonstrate the immobilisation of polystyrene nanoparticles on vertical nano-electrodes by means of dielectrophoresis. The electrodes have diameters of 500 nm or 50 nm, respectively, and are arranged in arrays of several thousand electrodes, allowing many thousands of experiments to run in parallel. At a frequency of 15 kHz, which is found favourable for polystyrene, several occupation patterns are observed, and both temporary and permanent immobilisation is achieved. In addition, a histogram method is applied that allows the number of particles occupying each electrode to be determined. These results are validated with scanning electron microscopy images. Immobilising exactly one particle at each electrode tip is achieved for electrode tip diameters of half the particle size. Extension of this system down to the level of single molecules is envisaged, which will avoid ensemble averaging while retaining statistically large sample sizes.

  8. Turbulence study in the vicinity of piano key weir: relevance, instrumentation, parameters and methods

    NASA Astrophysics Data System (ADS)

    Tiwari, Harinarayan; Sharma, Nayan

    2017-05-01

    This research paper addresses the need for turbulence study, the instruments capable of reliably capturing turbulence, the relevant turbulence parameters, and advanced methods that can decompose turbulence structures at different levels near hydraulic structures. Small-scale turbulence research has promising prospects in open channel flow. The relevance of the study grows when a hydraulic structure is introduced into the channel, since the structure disturbs the natural flow and creates a discontinuity. To recover from this discontinuity, the piano key weir (PKW) with sloped keys might be used. The scarcity of empirical results in the vicinity of the PKW necessitates extensive laboratory experiments with fair and reliable instrumentation techniques. Using principal component analysis, the acoustic Doppler velocimeter was established to be best suited, within certain limitations. Wavelet analysis is proposed to better decompose the underlying turbulence structure.

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nasser, Felipe; Rocha, Rafael Dahmer, E-mail: rafaeldrocha@gmail.com; Falsarella, Priscila Mina

    Purpose: To report a novel modified occlusion balloon technique to treat biliary leaks. Methods: A 22-year-old female patient underwent liver transplantation with biliary-enteric anastomosis. She developed thrombosis of the common hepatic artery and extensive ischemia in the left hepatic lobe. Resection of segments II and III was performed and a biliary-cutaneous leak originating at the resection plane was identified in the early postoperative period. Initial treatment with percutaneous transhepatic drainage was unsuccessful. Therefore, an angioplasty balloon was coaxially inserted within the biliary drain and positioned close to the leak. Results: The fistula output abruptly decreased after the procedure and stopped on the 7th day. At the 3-week follow-up, cholangiography revealed complete resolution of the leakage. Conclusion: This novel modified occlusion balloon technique was effective and safe. However, greater experience and more cases are necessary to validate the technique.

  10. Hohlraum-driven mid-Z (SiO2) double-shell implosions on the omega laser facility and their scaling to NIF.

    PubMed

    Robey, H F; Amendt, P A; Milovich, J L; Park, H-S; Hamza, A V; Bono, M J

    2009-10-02

    High-convergence, hohlraum-driven implosions of double-shell capsules using mid-Z (SiO2) inner shells have been performed on the OMEGA laser facility [T. R. Boehly, Opt. Commun. 133, 495 (1997)]. These experiments provide an essential extension of the results of previous low-Z (CH) double-shell implosions [P. A. Amendt, Phys. Rev. Lett. 94, 065004 (2005)] to materials of higher density and atomic number. Analytic modeling, supported by highly resolved 2D numerical simulations, is used to account for the yield degradation due to interfacial atomic mixing. This extended experimental database from OMEGA enables a validation of the mix model, and provides a means for quantitatively assessing the prospects for high-Z double-shell implosions on the National Ignition Facility [Paisner, Laser Focus World 30, 75 (1994)].

  11. ZeroCal: Automatic MAC Protocol Calibration

    NASA Astrophysics Data System (ADS)

    Meier, Andreas; Woehrle, Matthias; Zimmerling, Marco; Thiele, Lothar

    Sensor network MAC protocols are typically configured for an intended deployment scenario once and for all at compile time. This approach, however, leads to suboptimal performance if the network conditions deviate from the expectations. We present ZeroCal, a distributed algorithm that allows nodes to dynamically adapt to variations in traffic volume. Using ZeroCal, each node autonomously configures its MAC protocol at runtime, thereby trying to reduce the maximum energy consumption among all nodes. While the algorithm is readily usable for any asynchronous low-power listening or low-power probing protocol, we validate and demonstrate the effectiveness of ZeroCal on X-MAC. Extensive testbed experiments and simulations indicate that ZeroCal quickly adapts to traffic variations. We further show that ZeroCal extends network lifetime by 50% compared to an optimal configuration with identical and static MAC parameters at all nodes.

  12. DG-IMEX Stochastic Galerkin Schemes for Linear Transport Equation with Random Inputs and Diffusive Scalings

    DOE PAGES

    Chen, Zheng; Liu, Liu; Mu, Lin

    2017-05-03

    In this paper, we consider the linear transport equation under diffusive scaling and with random inputs. The method is based on the generalized polynomial chaos approach in the stochastic Galerkin framework. Several theoretical aspects are addressed: uniform numerical stability with respect to the Knudsen number ϵ, and a uniform-in-ϵ error estimate. For the temporal and spatial discretizations, we apply the implicit-explicit scheme under the micro-macro decomposition framework and the discontinuous Galerkin method, as proposed in Jang et al. (SIAM J Numer Anal 52:2048-2072, 2014) for the deterministic problem. Lastly, we provide a rigorous proof of the stochastic asymptotic-preserving (sAP) property. Extensive numerical experiments that validate the accuracy and sAP property of the method are conducted.

  13. Simulation of ion-temperature-gradient turbulence in tokamaks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cohen, B I; Dimits, A M; Kim, C

    Results are presented from nonlinear gyrokinetic simulations of toroidal ion temperature gradient (ITG) turbulence and transport. The gyrokinetic simulations are found to yield values of the thermal diffusivity significantly lower than gyrofluid or IFS-PPPL-model predictions. A new phenomenon of nonlinear effective critical gradients larger than the linear instability threshold gradients is observed and is associated with undamped flux-surface-averaged shear flows. The nonlinear gyrokinetic codes have passed extensive validity tests, which include comparison against independent linear calculations, a series of nonlinear convergence tests, and a comparison between two independent nonlinear gyrokinetic codes. Our most realistic simulations to date use actual reconstructed equilibria from experiments and a model for dilution by impurity and beam ions. These simulations highlight the need for still more physics to be included in the simulations.

  14. Tumor recognition in wireless capsule endoscopy images using textural features and SVM-based feature selection.

    PubMed

    Li, Baopu; Meng, Max Q-H

    2012-05-01

    Tumors of the digestive tract are a common disease, and wireless capsule endoscopy (WCE) is a relatively new technology for examining the digestive tract, especially the small intestine. This paper addresses the problem of automatic tumor recognition in WCE images. A candidate color texture feature that integrates the uniform local binary pattern and the wavelet transform is proposed to characterize WCE images. The proposed features are invariant to illumination change and describe the multiresolution characteristics of WCE images. Two feature selection approaches based on the support vector machine, sequential forward floating selection and recursive feature elimination, are further employed to refine the proposed features and improve detection accuracy. Extensive experiments validate that the proposed computer-aided diagnosis system achieves a promising tumor recognition accuracy of 92.4% on WCE images in our collected data.
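    Recursive feature elimination, one of the two selection schemes named in this record, repeatedly fits a linear model and discards the feature with the smallest absolute weight. A toy sketch of the idea follows, using a least-squares fit as a stand-in for the linear SVM of the paper (the data, feature count, and model choice here are all illustrative assumptions):

```python
import numpy as np

def rfe(X, y, n_keep):
    """Minimal recursive feature elimination: fit a linear model
    (least-squares stand-in for a linear SVM), drop the feature with
    the smallest absolute weight, repeat until n_keep remain."""
    active = list(range(X.shape[1]))
    while len(active) > n_keep:
        w, *_ = np.linalg.lstsq(X[:, active], y, rcond=None)
        active.pop(int(np.argmin(np.abs(w))))
    return active

# Synthetic data: only features 1 and 4 actually drive the target.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 6))
y = 3.0 * X[:, 1] - 2.0 * X[:, 4] + 0.1 * rng.normal(size=200)
print(sorted(rfe(X, y, 2)))  # indices of the surviving features
```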

  15. Experimental determination of pore shapes using phase retrieval from q -space NMR diffraction

    NASA Astrophysics Data System (ADS)

    Demberg, Kerstin; Laun, Frederik Bernd; Bertleff, Marco; Bachert, Peter; Kuder, Tristan Anselm

    2018-05-01

    This paper presents an approach to solving the phase problem in nuclear magnetic resonance (NMR) diffusion pore imaging, a method that allows imaging the shape of arbitrary closed pores filled with an NMR-detectable medium for investigation of the microstructure of biological tissue and porous materials. Classical q -space imaging composed of two short diffusion-encoding gradient pulses yields, analogously to diffraction experiments, the modulus squared of the Fourier transform of the pore image which entails an inversion problem: An unambiguous reconstruction of the pore image requires both magnitude and phase. Here the phase information is recovered from the Fourier modulus by applying a phase retrieval algorithm. This allows omitting experimentally challenging phase measurements using specialized temporal gradient profiles. A combination of the hybrid input-output algorithm and the error reduction algorithm was used with dynamically adapting support (shrinkwrap extension). No a priori knowledge on the pore shape was fed to the algorithm except for a finite pore extent. The phase retrieval approach proved successful for simulated data with and without noise and was validated in phantom experiments with well-defined pores using hyperpolarized xenon gas.
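    The core loop described here, alternating a Fourier-modulus constraint with real-space constraints, can be sketched with the plain error-reduction algorithm on a 1D toy signal. For simplicity this sketch assumes a known, fixed support (the paper's shrinkwrap variant adapts the support dynamically, and it additionally uses hybrid input-output steps):

```python
import numpy as np

rng = np.random.default_rng(1)

# Ground-truth "pore image": non-negative, compactly supported 1D signal.
n = 64
support = np.zeros(n, dtype=bool)
support[20:40] = True
truth = np.where(support, rng.random(n), 0.0)
modulus = np.abs(np.fft.fft(truth))  # the measured q-space magnitude

def err(x):
    """Distance between the current estimate's Fourier modulus and the data."""
    return np.linalg.norm(np.abs(np.fft.fft(x)) - modulus)

# Error-reduction iterations from a random start.
x = rng.random(n)
e0 = err(x)
for _ in range(200):
    F = np.fft.fft(x)
    F = modulus * np.exp(1j * np.angle(F))   # enforce measured modulus
    x = np.fft.ifft(F).real
    x[~support] = 0.0                        # enforce support
    x = np.clip(x, 0.0, None)                # enforce non-negativity
print(err(x) < e0)
```

The Fourier-domain error is non-increasing under error reduction, so the final estimate fits the measured modulus at least as well as the random start.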

  16. Modelling of Divertor Detachment in MAST Upgrade

    NASA Astrophysics Data System (ADS)

    Moulton, David; Carr, Matthew; Harrison, James; Meakins, Alex

    2017-10-01

    MAST Upgrade will have extensive capabilities to explore the benefits of alternative divertor configurations such as the conventional, Super-X, x divertor, snowflake and variants in a single device with closed divertors. Initial experiments will concentrate on exploring the Super-X and conventional configurations, in terms of power and particle loads to divertor surfaces, access to detachment and its control. Simulations have been carried out with the SOLPS5.0 code validated against MAST experiments. The simulations predict that the Super-X configuration has significant advantages over the conventional, such as lower detachment threshold (2-3x lower in terms of upstream density and 4x higher in terms of PSOL). Synthetic spectroscopy diagnostics from these simulations have been created using the Raysect ray tracing code to produce synthetic filtered camera images, spectra and foil bolometer data. Forward modelling of the current set of divertor diagnostics will be presented, together with a discussion of future diagnostics and analysis to improve estimates of the plasma conditions. Work supported by the RCUK Energy Programme [Grant Number EP/P012450/1] and EURATOM.

  17. Biomarkers and their dependence on well-reported antibodies.

    PubMed

    Voskuil, Jan

    2015-11-01

    Jan Voskuil is the Chief Scientific Officer at antibody manufacturer Everest Biotech in Oxfordshire, UK. After specializing in prokaryotic cell biology through his PhD program in Amsterdam, The Netherlands and a postdoctoral position at Stanford (CA, USA), he switched to the science of neurodegenerative diseases at Oxford, UK, through postdoctoral positions at the Dunn School of Pathology and at the MRC and through a leading position at the Alzheimer drug discovery company Synaptica. He subsequently gained experience in a Good Laboratory Practice regulatory environment at contract research organizations in both Oxfordshire and Cambridgeshire, validating assays on flow cytometry and ELISA platforms and writing standard operating procedures. His extensive experience with generating and characterizing monoclonal and polyclonal antibodies, combined with accrued knowledge of most immunoassays in academic and commercial environments, made him the ideal candidate to put Everest Biotech on the global map by continually raising the quality and size of its catalog and by delivering adequate technical support. As a result, Everest antibodies are currently part of most globally well-known catalogs, and its products are increasingly recognized as useful alternatives to unfit monoclonal antibodies.

  18. An integrated solution for remote data access

    NASA Astrophysics Data System (ADS)

    Sapunenko, Vladimir; D'Urso, Domenico; dell'Agnello, Luca; Vagnoni, Vincenzo; Duranti, Matteo

    2015-12-01

    Data management constitutes one of the major challenges that a geographically-distributed e-Infrastructure has to face, especially when remote data access is involved. We discuss an integrated solution which enables transparent and efficient access to on-line and near-line data through high-latency networks. The solution is based on the joint use of the General Parallel File System (GPFS) and the Tivoli Storage Manager (TSM). Both products, developed by IBM, are well known and extensively used in the HEP computing community. Owing to a new feature introduced in GPFS 3.5, so-called Active File Management (AFM), the definition of a single, geographically-distributed namespace, characterised by automated data flow management between different locations, becomes possible. As a practical example, we present the implementation of AFM-based remote data access between two data centres located in Bologna and Rome, demonstrating the validity of the solution for the use case of the AMS experiment, an astro-particle experiment supported by the INFN CNAF data centre with large disk space requirements (more than 1.5 PB).

  19. Satellite-Based Stratospheric and Tropospheric Measurements: Determination of Global Ozone and Other Trace Species

    NASA Technical Reports Server (NTRS)

    Chance, Kelly

    2003-01-01

    This grant is an extension to our previous NASA Grant NAG5-3461, providing incremental funding to continue GOME (Global Ozone Monitoring Experiment) and SCIAMACHY (SCanning Imaging Absorption SpectroMeter for Atmospheric CHartographY) studies. This report summarizes research done under these grants through December 31, 2002. The research performed during this reporting period includes development and maintenance of scientific software for the GOME retrieval algorithms, consultation on operational software development for GOME, consultation and development for SCIAMACHY near-real-time (NRT) and off-line (OL) data products, and participation in initial SCIAMACHY validation studies. The Global Ozone Monitoring Experiment was successfully launched on the ERS-2 satellite on April 20, 1995, and remains working in normal fashion. SCIAMACHY was launched March 1, 2002 on the ESA Envisat satellite. Three GOME-2 instruments are now scheduled to fly on the Metop series of operational meteorological satellites (Eumetsat). K. Chance is a member of the reconstituted GOME Scientific Advisory Group, which will guide the GOME-2 program as well as the continuing ERS-2 GOME program.

  20. Experimental determination of pore shapes using phase retrieval from q-space NMR diffraction.

    PubMed

    Demberg, Kerstin; Laun, Frederik Bernd; Bertleff, Marco; Bachert, Peter; Kuder, Tristan Anselm

    2018-05-01

    This paper presents an approach to solving the phase problem in nuclear magnetic resonance (NMR) diffusion pore imaging, a method that allows imaging the shape of arbitrary closed pores filled with an NMR-detectable medium for investigation of the microstructure of biological tissue and porous materials. Classical q-space imaging composed of two short diffusion-encoding gradient pulses yields, analogously to diffraction experiments, the modulus squared of the Fourier transform of the pore image which entails an inversion problem: An unambiguous reconstruction of the pore image requires both magnitude and phase. Here the phase information is recovered from the Fourier modulus by applying a phase retrieval algorithm. This allows omitting experimentally challenging phase measurements using specialized temporal gradient profiles. A combination of the hybrid input-output algorithm and the error reduction algorithm was used with dynamically adapting support (shrinkwrap extension). No a priori knowledge on the pore shape was fed to the algorithm except for a finite pore extent. The phase retrieval approach proved successful for simulated data with and without noise and was validated in phantom experiments with well-defined pores using hyperpolarized xenon gas.

  1. A Diagnostic Approach for Electro-Mechanical Actuators in Aerospace Systems

    NASA Technical Reports Server (NTRS)

    Balaban, Edward; Saxena, Abhinav; Bansal, Prasun; Goebel, Kai Frank; Stoelting, Paul; Curran, Simon

    2009-01-01

    Electro-mechanical actuators (EMA) are finding increasing use in aerospace applications, especially with the trend towards all-electric aircraft and spacecraft designs. However, electro-mechanical actuators still lack the knowledge base accumulated for other fielded actuator types, particularly with regard to fault detection and characterization. This paper presents a thorough analysis of some of the critical failure modes documented for EMAs and describes experiments conducted on detecting and isolating a subset of them. The list of failures has been prepared through an extensive Failure Modes, Effects, and Criticality Analysis (FMECA) reference, a literature review, and accessible industry experience. Methods for data acquisition and for validation of algorithms on EMA test stands are described. A variety of condition indicators were developed that enabled detection, identification, and isolation among the various fault modes. A diagnostic algorithm based on an artificial neural network is shown to operate successfully using these condition indicators; furthermore, the robustness of these diagnostic routines to sensor faults is demonstrated by showing their ability to distinguish sensor faults from component failures. The paper concludes with a roadmap leading from this effort towards developing successful prognostic algorithms for electro-mechanical actuators.

  2. On Parametric Sensitivity of Reynolds-Averaged Navier-Stokes SST Turbulence Model: 2D Hypersonic Shock-Wave Boundary Layer Interactions

    NASA Technical Reports Server (NTRS)

    Brown, James L.

    2014-01-01

    Examined is the sensitivity of separation extent, wall pressure, and heating to variation of primary input flow parameters, such as Mach and Reynolds numbers and shock strength, for 2D and axisymmetric hypersonic shock-wave/turbulent boundary layer interactions obtained by Navier-Stokes methods using the SST turbulence model. Baseline parametric sensitivity response is provided in part by comparison with vetted experiments, and in part through updated correlations based on free interaction theory concepts. A recent database compilation of hypersonic 2D shock-wave/turbulent boundary layer experiments, used extensively in a prior related uncertainty analysis, provides the foundation for this updated correlation approach, as well as for more conventional validation. The primary CFD method for this work is DPLR, one of NASA's real-gas aerothermodynamic production RANS codes. Comparisons are also made with CFL3D, one of NASA's mature perfect-gas RANS codes. Deficiencies in the predicted separation response of RANS/SST solutions to parametric variations of test conditions are summarized, along with recommendations regarding future turbulence modeling approaches.

  3. Satellite-Based Stratospheric and Tropospheric Measurements: Determination of Global Ozone and Other Trace Species

    NASA Astrophysics Data System (ADS)

    Chance, Kelly

    2003-02-01

    This grant is an extension to our previous NASA Grant NAG5-3461, providing incremental funding to continue GOME (Global Ozone Monitoring Experiment) and SCIAMACHY (SCanning Imaging Absorption SpectroMeter for Atmospheric CHartographY) studies. This report summarizes research done under these grants through December 31, 2002. The research performed during this reporting period includes development and maintenance of scientific software for the GOME retrieval algorithms, consultation on operational software development for GOME, consultation and development for SCIAMACHY near-real-time (NRT) and off-line (OL) data products, and participation in initial SCIAMACHY validation studies. The Global Ozone Monitoring Experiment was successfully launched on the ERS-2 satellite on April 20, 1995, and remains working in normal fashion. SCIAMACHY was launched March 1, 2002 on the ESA Envisat satellite. Three GOME-2 instruments are now scheduled to fly on the Metop series of operational meteorological satellites (Eumetsat). K. Chance is a member of the reconstituted GOME Scientific Advisory Group, which will guide the GOME-2 program as well as the continuing ERS-2 GOME program.

  4. The Role of Internships in Raising Undergraduates' Awareness and Perception of Extension

    ERIC Educational Resources Information Center

    Grotta, Amy; McGrath, Daniel

    2013-01-01

    Extension does not often reach out to undergraduates at their home institutions. Doing so might help Extension reach new audiences; leverage scarce resources; provide meaningful, community-based work experience; and perhaps recruit another generation of Extension professionals. We surveyed students who had completed internships with Extension…

  5. The Teacher Sense of Efficacy Scale: Validation Evidence and Behavioral Prediction. WCER Working Paper No. 2006-7

    ERIC Educational Resources Information Center

    Heneman, Herbert G., III; Kimball, Steven; Milanowski, Anthony

    2006-01-01

    The present study contributes to knowledge of the construct validity of the short form of the Teacher Sense of Efficacy Scale (and by extension, given their similar content and psychometric properties, to the long form). The authors' research involves: (1) examining the psychometric properties of the TSES on a large sample of elementary, middle,…

  6. Validation of the revised Mystical Experience Questionnaire in experimental sessions with psilocybin.

    PubMed

    Barrett, Frederick S; Johnson, Matthew W; Griffiths, Roland R

    2015-11-01

    The 30-item revised Mystical Experience Questionnaire (MEQ30) was previously developed within an online survey of mystical-type experiences occasioned by psilocybin-containing mushrooms. The rated experiences occurred on average eight years before completion of the questionnaire. The current paper validates the MEQ30 using data from experimental studies with controlled doses of psilocybin. Data were pooled and analyzed from five laboratory experiments in which participants (n=184) received a moderate to high oral dose of psilocybin (at least 20 mg/70 kg). Results of confirmatory factor analysis demonstrate the reliability and internal validity of the MEQ30. Structural equation models demonstrate the external and convergent validity of the MEQ30 by showing that latent variable scores on the MEQ30 positively predict persisting change in attitudes, behavior, and well-being attributed to experiences with psilocybin while controlling for the contribution of the participant-rated intensity of drug effects. These findings support the use of the MEQ30 as an efficient measure of individual mystical experiences. A method to score a "complete mystical experience" that was used in previous versions of the mystical experience questionnaire is validated in the MEQ30, and a stand-alone version of the MEQ30 is provided for use in future research. © The Author(s) 2015.

  7. The Extensive Air Shower Experiment Kascade-Grande

    NASA Astrophysics Data System (ADS)

    Kang, Donghwa; Apel, W. D.; Arteaga, J. C.; Badea, F.; Bekk, K.; Bertaina, M.; Blümer, J.; Bozdog, H.; Brancus, I. M.; Brüggemann, M.; Buchholz, P.; Cantoni, E.; Chiavassa, A.; Cossavella, F.; Daumiller, K.; de Souza, V.; di Pierro, F.; Doll, P.; Engel, R.; Engler, J.; Finger, M.; Fuhrmann, D.; Ghia, P. L.; Gils, H. J.; Glasstetter, R.; Grupen, C.; Haungs, A.; Heck, D.; Hörandel, J. R.; Huege, T.; Isar, P. G.; Kampert, K.-H.; Kickelbick, D.; Klages, H. O.; Kolotaev, Y.; Łuczak, P.; Mathes, H. J.; Mayer, H. J.; Milke, J.; Mitrica, B.; Morello, C.; Navarra, G.; Nehls, S.; Oehlschläger, J.; Ostapchenko, S.; Over, S.; Petcu, M.; Pierog, T.; Rebel, H.; Roth, M.; Schatz, G.; Schieler, H.; Schröder, F.; Sima, O.; Stümpert, M.; Toma, G.; Trinchero, G. C.; Ulrich, H.; van Buren, J.; Walkowiak, W.; Weindl, A.; Wochele, J.; Wommer, M.; Zabierowski, J.

    The extensive air shower experiment KASCADE-Grande (KArlsruhe Shower Core and Array DEtector and Grande array) is located on the site of the Forschungszentrum Karlsruhe in Germany. The original KASCADE experiment consisted of a densely packed scintillator array with unshielded and shielded detectors for measuring the electromagnetic and muonic shower components independently, as well as muon tracking devices and a hadron calorimeter. The Grande array, an extension of KASCADE, consists of 37 scintillation detector stations covering an area of 700 × 700 m². The main goal of the combined measurements of KASCADE and Grande is the investigation of the energy spectrum and composition of primary cosmic rays in the energy range of 10^16 to 10^18 eV. In this paper an overview of the KASCADE-Grande experiment and recent results will be presented.

  8. Ada (Tradename) Compiler Validation Summary Report. International Business Machines Corporation. IBM Development System for the Ada Language for VM/CMS, Version 1.0. IBM 4381 (IBM System/370) under VM/CMS.

    DTIC Science & Technology

    1986-04-29

    COMPILER VALIDATION SUMMARY REPORT: International Business Machines Corporation IBM Development System for the Ada Language for VM/CMS, Version 1.0 IBM 4381...tested using command scripts provided by International Business Machines Corporation. These scripts were reviewed by the validation team. Tests were run...s): IBM 4381 (System/370) Operating System: VM/CMS, release 3.6 International Business Machines Corporation has made no deliberate extensions to the

  9. Effects of Hot Streak and Phantom Cooling on Heat Transfer in a Cooled Turbine Stage Including Particulate Deposition

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bons, Jeffrey; Ameri, Ali

    2016-01-08

    The objective of this research effort was to develop a validated computational modeling capability for the characterization of the effects of hot streaks and particulate deposition on the heat load of modern gas turbines. This was accomplished with a multi-faceted approach including analytical, experimental, and computational components. A 1-year no cost extension request was approved for this effort, so the total duration was 4 years. The research effort succeeded in its ultimate objective by leveraging extensive experimental deposition studies complemented by computational modeling. Experiments were conducted with hot streaks, vane cooling, and combinations of hot streaks with vane cooling. These studies contributed to a significant body of corporate knowledge of deposition, in combination with particle rebound and deposition studies funded by other agencies, to provide suitable conditions for the development of a new model. The model includes the following physical phenomena: elastic deformation, plastic deformation, adhesion, and shear removal. It also incorporates material property sensitivity to temperature and tangential-normal velocity rebound cross-dependencies observed in experiments. The model is well-suited for incorporation in CFD simulations of complex gas turbine flows due to its algebraic (explicit) formulation. This report contains model predictions compared to coefficient of restitution data available in the open literature as well as deposition results from two different high temperature turbine deposition facilities. While the model comparisons with experiments are in many cases promising, several key aspects of particle deposition remain elusive. The simple phenomenological nature of the model allows for parametric dependencies to be evaluated in a straightforward manner. This effort also included the first-ever full turbine stage deposition model published in the open literature.
The simulations included hot streaks and simulated vane cooling. The new deposition model was implemented into the CFD model as a wall boundary condition, with various particle sizes investigated in the simulation. Simulations utilizing a steady mixing plane formulation and an unsteady sliding mesh were conducted and the flow solution of each was validated against experimental data. Results from each of these simulations, including impact and capture distributions and efficiencies, were compared and potential reasons for differences discussed in detail. The inclusion of a large range of particle sizes allowed investigation of trends with particle size, such as increased radial migration and reduced sticking efficiency at the larger particle sizes. The unsteady simulation predicted lower sticking efficiencies on the rotor blades than the mixing plane simulation for the majority of particle sizes. This is postulated to be due to the preservation of the hot streak and cool vane wake through the vane-rotor interface (which are smeared out circumferentially in the mixing-plane simulation). The results reported here represent the successful implementation of a novel deposition model into validated vane-rotor flow solutions that include a non-uniform inlet temperature profile and simulated vane cooling.

  10. Developing Effective Extension Agents: Experience Concerns.

    ERIC Educational Resources Information Center

    Goddu, Roland

    This paper is a description of the requirements placed on persons selected to fill the role of extension agents for the purpose of penetrating an educational environment, installing change in an educational organization, and completing tasks as a resource outside of the education establishment. These experience concerns are summarized by…

  11. Dark matter universe.

    PubMed

    Bahcall, Neta A

    2015-10-06

    Most of the mass in the universe is in the form of dark matter--a new type of nonbaryonic particle not yet detected in the laboratory or in other detection experiments. The evidence for the existence of dark matter through its gravitational impact is clear in astronomical observations--from the early observations of the large motions of galaxies in clusters and the motions of stars and gas in galaxies, to observations of the large-scale structure in the universe, gravitational lensing, and the cosmic microwave background. The extensive data consistently show the dominance of dark matter and quantify its amount and distribution, assuming general relativity is valid. The data inform us that the dark matter is nonbaryonic, is "cold" (i.e., moves nonrelativistically in the early universe), and interacts only weakly with matter other than by gravity. The current Lambda cold dark matter cosmology--a simple (but strange) flat cold dark matter model dominated by a cosmological constant Lambda, with only six basic parameters (including the density of matter and of baryons, the initial mass fluctuations amplitude and its scale dependence, and the age of the universe and of the first stars)--fits remarkably well all the accumulated data. However, what is the dark matter? This is one of the most fundamental open questions in cosmology and particle physics. Its existence requires an extension of our current understanding of particle physics or otherwise points to a modification of gravity on cosmological scales. The exploration and ultimate detection of dark matter are led by experiments for direct and indirect detection of this yet mysterious particle.

  12. Dark matter universe

    PubMed Central

    Bahcall, Neta A.

    2015-01-01

    Most of the mass in the universe is in the form of dark matter—a new type of nonbaryonic particle not yet detected in the laboratory or in other detection experiments. The evidence for the existence of dark matter through its gravitational impact is clear in astronomical observations—from the early observations of the large motions of galaxies in clusters and the motions of stars and gas in galaxies, to observations of the large-scale structure in the universe, gravitational lensing, and the cosmic microwave background. The extensive data consistently show the dominance of dark matter and quantify its amount and distribution, assuming general relativity is valid. The data inform us that the dark matter is nonbaryonic, is “cold” (i.e., moves nonrelativistically in the early universe), and interacts only weakly with matter other than by gravity. The current Lambda cold dark matter cosmology—a simple (but strange) flat cold dark matter model dominated by a cosmological constant Lambda, with only six basic parameters (including the density of matter and of baryons, the initial mass fluctuations amplitude and its scale dependence, and the age of the universe and of the first stars)—fits remarkably well all the accumulated data. However, what is the dark matter? This is one of the most fundamental open questions in cosmology and particle physics. Its existence requires an extension of our current understanding of particle physics or otherwise points to a modification of gravity on cosmological scales. The exploration and ultimate detection of dark matter are led by experiments for direct and indirect detection of this yet mysterious particle. PMID:26417091

  13. Innovative Extension Models and Smallholders: How ICT platforms can Deliver Timely Information to Farmers in India.

    NASA Astrophysics Data System (ADS)

    Nagothu, U. S.

    2016-12-01

    Agricultural extension services, among others, contribute to improving rural livelihoods and enhancing economic development. Knowledge development and transfer, from the cognitive science point of view, is about how farmers use and apply their experiential knowledge as well as newly acquired knowledge to solve new problems. This depends on the models adopted and the way knowledge is generated and delivered. New extension models based on ICT platforms and smartphones are promising. Results from a 5-year project (www.climaadapt.org) in India show that farmer-led, on-farm validations of technologies and knowledge exchange through ICT-based platforms outperformed state-operated linear extension programs. Innovation here depends on the connectivity and networking between the stakeholders involved in generating, transferring and using the knowledge. Key words: Smallholders, Knowledge, Extension, Innovation, India

  14. Broadband Scattering from Sand and Sand/Mud Sediments with Extensive Environmental Characterization

    DTIC Science & Technology

    2017-01-30

    experiment, extensive environmental characterization was also performed to support data/model comparisons for both experimental efforts. The site...mechanisms, potentially addressing questions left unresolved from the previous sediment acoustics experiments, SAX99 and SAX04. This work was also to provide...environmental characterization to support the analysis of data collected during the Target and Reverberation Experiment in 2013 (TREX13) as well as

  15. Validation & Safety Constraints: What We Want to Do… What We Can Do

    NASA Astrophysics Data System (ADS)

    Yepez, Amaya Atenicia; Peiro, Belen Martin; Bory, Stephane

    2010-09-01

    Autonomous safety-critical systems require exhaustive validation in order to guarantee robustness from different perspectives (SW, HW, and algorithm design). In this paper we present a performance validation approach dealing with an extensive list of difficulties, drawing on lessons learnt from the space projects developed by GMV (e.g., within the EGNOS and Galileo Programs). We strongly recommend that the validation strategy be decided in the early stages of system definition and carried out with input from all parties. In fact, to agree on the final solution, a trade-off will be needed in order to validate the requirements with the available means, in terms of amount of data and resources.

  16. Valid and Reliable Science Content Assessments for Science Teachers

    NASA Astrophysics Data System (ADS)

    Tretter, Thomas R.; Brown, Sherri L.; Bush, William S.; Saderholm, Jon C.; Holmes, Vicki-Lynn

    2013-03-01

    Science teachers' content knowledge is an important influence on student learning, highlighting an ongoing need for programs, and assessments of those programs, designed to support teacher learning of science. Valid and reliable assessments of teacher science knowledge are needed for direct measurement of this crucial variable. This paper describes multiple sources of validity and reliability (Cronbach's alpha greater than 0.8) evidence for physical, life, and earth/space science assessments—part of the Diagnostic Teacher Assessments of Mathematics and Science (DTAMS) project. Validity was strengthened by systematic synthesis of relevant documents, extensive use of external reviewers, and field tests with 900 teachers during assessment development process. Subsequent results from 4,400 teachers, analyzed with Rasch IRT modeling techniques, offer construct and concurrent validity evidence.
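The internal-consistency criterion cited above (Cronbach's alpha greater than 0.8) has a simple closed form: alpha = k/(k-1) * (1 - sum of item variances / variance of total scores). As an editorial illustration (the scores below are invented, not DTAMS data), a stdlib-only sketch:

```python
from statistics import pvariance

def cronbach_alpha(items):
    """Cronbach's alpha; items is a list of k per-item score lists over the same respondents."""
    k = len(items)
    item_var = sum(pvariance(scores) for scores in items)
    totals = [sum(resp) for resp in zip(*items)]  # total score per respondent
    return (k / (k - 1)) * (1 - item_var / pvariance(totals))

# Three items with perfectly consistent responses give alpha = 1.0.
print(cronbach_alpha([[1, 2, 3], [1, 2, 3], [1, 2, 3]]))  # 1.0
```

The population variance is used throughout; since the same normalization appears in both the numerator and denominator of the ratio, the choice cancels out.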

  17. Methodology for turbulence code validation: Quantification of simulation-experiment agreement and application to the TORPEX experiment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ricci, Paolo; Theiler, C.; Fasoli, A.

    A methodology for plasma turbulence code validation is discussed, focusing on quantitative assessment of the agreement between experiments and simulations. The present work extends the analysis carried out in a previous paper [P. Ricci et al., Phys. Plasmas 16, 055703 (2009)] where the validation observables were introduced. Here, it is discussed how to quantify the agreement between experiments and simulations with respect to each observable, how to define a metric to evaluate this agreement globally, and - finally - how to assess the quality of a validation procedure. The methodology is then applied to the simulation of the basic plasma physics experiment TORPEX [A. Fasoli et al., Phys. Plasmas 13, 055902 (2006)], considering both two-dimensional and three-dimensional simulation models.

  18. Reliability and validity of an iPhone® application for the measurement of lumbar spine flexion and extension range of motion

    PubMed Central

    Pourahmadi, Mohammad Reza; Jannati, Elham; Mohseni-Bandpei, Mohammad Ali; Ebrahimi Takamjani, Ismail; Rajabzadeh, Fatemeh

    2016-01-01

    Background Measurement of lumbar spine range of motion (ROM) is often considered to be an essential component of lumbar spine physiotherapy and orthopedic assessment. The measurement can be carried out with various instruments such as inclinometers and goniometers. Recent smartphones have been equipped with accelerometers and magnetometers, which, through specific software applications (apps), can be used for inclinometric functions. Purpose The main purpose was to investigate the reliability and validity of an iPhone® app (TiltMeter© -advanced level and inclinometer) for measuring standing lumbar spine flexion–extension ROM in asymptomatic subjects. Design A cross-sectional study was carried out. Setting This study was conducted in a physiotherapy clinic located at the School of Rehabilitation Sciences, Iran University of Medical Science and Health Services, Tehran, Iran. Subjects A convenience sample of 30 asymptomatic adults (15 males; 15 females; age range = 18–55 years) was recruited between August 2015 and December 2015. Methods Following a 2-minute warm-up, the subjects were asked to stand in a relaxed position and their skin was marked at the T12–L1 and S1–S2 spinal levels. From this position, they were asked to perform maximum lumbar flexion followed by maximum lumbar extension with their knees straight. Two blinded raters each used an inclinometer and the iPhone® app to measure lumbar spine flexion–extension ROM. A third rater read the measured angles. To calculate total lumbar spine flexion–extension ROM, the measurement at S1–S2 was subtracted from that at T12–L1. The second (2 hours later) and third (48 hours later) sessions were carried out in the same manner as the first session. All of the measurements were conducted 3 times and the mean value of the 3 repetitions for each measurement was used for analysis.
Intraclass correlation coefficient (ICC) models (3, k) and (2, k) were used to determine the intra-rater and inter-rater reliability, respectively. The Pearson correlation coefficients were used to establish concurrent validity of the iPhone® app. Furthermore, minimum detectable change at the 95% confidence level (MDC95) was computed as 1.96 × standard error of measurement × √2. Results Good to excellent intra-rater and inter-rater reliability were demonstrated for both the gravity-based inclinometer, with ICC values of ≥0.84 and ≥0.77, and the iPhone® app, with ICC values of ≥0.85 and ≥0.85, respectively. The MDC95 ranged from 5.82° to 8.18° for the intra-rater analysis and from 7.38° to 8.66° for the inter-rater analysis. The concurrent validity for flexion and extension between the 2 instruments was 0.85 and 0.91, respectively. Conclusions The iPhone® app possesses good to excellent intra-rater and inter-rater reliability and concurrent validity. It seems that the iPhone® app can be used for the measurement of lumbar spine flexion–extension ROM. Level of evidence IIb. PMID:27635328
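The minimum-detectable-change formula quoted in this record (MDC95 = 1.96 × SEM × √2) is straightforward to sketch. Deriving the SEM itself from a standard deviation and an ICC (SEM = SD × √(1 − ICC)) is a common convention assumed here, not stated in the paper:

```python
import math

def sem(sd, icc):
    """Standard error of measurement from the sample SD and a reliability coefficient (ICC)."""
    return sd * math.sqrt(1 - icc)

def mdc95(sem_value):
    """Minimum detectable change at 95% confidence: 1.96 * SEM * sqrt(2)."""
    return 1.96 * sem_value * math.sqrt(2)

# With SD = 10 degrees and ICC = 0.91 (illustrative numbers), SEM = 3 degrees.
print(round(sem(10, 0.91), 2))    # 3.0
print(round(mdc95(3.0), 2))       # 8.32
```

The √2 factor accounts for the measurement error in both of the two scores being compared between sessions.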

  19. Modal identification experiment

    NASA Technical Reports Server (NTRS)

    Kvaternik, Raymond G.

    1992-01-01

    The Modal Identification Experiment (MIE) is a proposed on-orbit experiment being developed by NASA's Office of Aeronautics and Space Technology wherein a series of vibration measurements would be made on various configurations of Space Station Freedom (SSF) during its on-orbit assembly phase. The experiment is to be conducted in conjunction with station reboost operations and consists of measuring the dynamic responses of the spacecraft produced by the station-based attitude control system and reboost thrusters, recording and transmitting the data, and processing the data on the ground to identify the natural frequencies, damping factors, and shapes of significant vibratory modes. The experiment would likely be a part of the Space Station on-orbit verification. Basic research objectives of MIE are to evaluate and improve methods for analytically modeling large space structures, to develop techniques for performing in-space modal testing, and to validate candidate techniques for in-space modal identification. From an engineering point of view, MIE will provide the first opportunity to obtain vibration data for the fully-assembled structure because SSF is too large and too flexible to be tested as a single unit on the ground. Such full-system data are essential for validating the analytical model of SSF which would be used in any engineering efforts associated with structural or control system changes that might be made to the station as missions evolve over time. Extensive analytical simulations of on-orbit tests, as well as exploratory laboratory simulations using small-scale models, have been conducted in-house and under contract to develop a measurement plan and evaluate its potential performance.
In particular, performance trade and parametric studies conducted as part of these simulations were used to resolve issues related to the number and location of the measurements, the type of excitation, data acquisition and data processing, effects of noise and nonlinearities, selection of target vibration modes, and the appropriate type of data analysis scheme. The purpose of this talk is to provide an executive-summary-type overview of the modal identification experiment which has emerged from the conceptual design studies conducted to date. Emphasis throughout is on those aspects of the experiment which should be of interest to those attending the subject utilization conference. The presentation begins with some preparatory remarks to provide background and motivation for the experiment, describe the experiment in general terms, and cite the specific technical objectives. This is followed by a summary of the major results of the conceptual design studies conducted to define the baseline experiment. The baseline experiment which has resulted from the studies is then described.

  20. Experience with Aero- and Fluid-Dynamic Testing for Engineering and CFD Validation

    NASA Technical Reports Server (NTRS)

    Ross, James C.

    2016-01-01

    Ever since computations have been used to simulate aerodynamics, the need to ensure that the computations adequately represent real life has followed. Many experiments have been performed specifically for validation, and as computational methods have improved, so have the validation experiments. Validation is also a moving target: as computational methods improve, they require validation for the new aspects of flow physics that they aim to capture. Concurrently, new measurement techniques are being developed that can help capture more detailed flow features; pressure-sensitive paint (PSP) and particle image velocimetry (PIV) come to mind. This paper will present various wind-tunnel tests the author has been involved with and how they were used for validation of various kinds of CFD. A particular focus is the application of advanced measurement techniques to flow fields (and geometries) that had proven difficult to predict computationally. Many of these difficult flow problems arose from engineering and development problems that needed to be solved for a particular vehicle or research program. In some cases the experiments required to solve the engineering problems were refined to provide valuable CFD validation data in addition to the primary engineering data. All of these experiments have provided physical insight and validation data for a wide range of aerodynamic and acoustic phenomena for vehicles ranging from tractor-trailers to crewed spacecraft.

  1. Temperature and heat flux datasets of a complex object in a fire plume for the validation of fire and thermal response codes.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jernigan, Dann A.; Blanchat, Thomas K.

    It is necessary to improve understanding and develop temporally- and spatially-resolved integral scale validation data of the heat flux incident to a complex object in addition to measuring the thermal response of said object located within the fire plume for the validation of the SIERRA/FUEGO/SYRINX fire and SIERRA/CALORE codes. To meet this objective, a complex calorimeter with sufficient instrumentation to allow validation of the coupling between FUEGO/SYRINX/CALORE has been designed, fabricated, and tested in the Fire Laboratory for Accreditation of Models and Experiments (FLAME) facility. Validation experiments are specifically designed for direct comparison with the computational predictions. Making meaningful comparison between the computational and experimental results requires careful characterization and control of the experimental features or parameters used as inputs into the computational model. Validation experiments must be designed to capture the essential physical phenomena, including all relevant initial and boundary conditions. This report presents the data validation steps and processes, the results of the penlight radiant heat experiments (for the purpose of validating the CALORE heat transfer modeling of the complex calorimeter), and the results of the fire tests in FLAME.

  2. Use of gene-expression programming to estimate Manning’s roughness coefficient for high gradient streams

    USGS Publications Warehouse

    Azamathulla, H. Md.; Jarrett, Robert D.

    2013-01-01

    Manning’s roughness coefficient (n) has been widely used in the estimation of flood discharges or depths of flow in natural channels. Therefore, the selection of appropriate Manning’s n values is of paramount importance for hydraulic engineers and hydrologists and requires considerable experience, although extensive guidelines are available. Generally, the largest source of error in post-flood estimates (termed indirect measurements) is due to estimates of Manning’s n values, particularly when there has been minimal field verification of flow resistance. This emphasizes the need to improve methods for estimating n values. The objective of this study was to develop a soft computing model for the estimation of Manning’s n values using 75 discharge measurements on 21 high gradient streams in Colorado, USA. The data are from high gradient (S > 0.002 m/m), cobble- and boulder-bed streams for within-bank flows. This study presents Gene-Expression Programming (GEP), an extension of Genetic Programming (GP), as an improved approach to estimate Manning’s roughness coefficient for high gradient streams, and uses field data to assess its potential. GEP is a search technique that automatically simplifies genetic programs during an evolutionary process (or evolves) to obtain the most robust computer program (e.g., simplify mathematical expressions, decision trees, polynomial constructs, and logical expressions). Field measurements collected by Jarrett (J Hydraulic Eng ASCE 110: 1519–1539, 1984) were used to train the GEP network and evolve programs. The developed network and evolved programs were validated using observations that were not involved in training.
GEP and ANN-RBF (artificial neural network-radial basis function) models were found to be substantially more effective (e.g., R2 for testing/validation of GEP and RBF-ANN is 0.745 and 0.65, respectively) than Jarrett’s (J Hydraulic Eng ASCE 110: 1519–1539, 1984) equation (R2 for testing/validation equals 0.58) in predicting Manning’s n.
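The R2 figures quoted above follow the standard coefficient-of-determination definition, R² = 1 − SS_res/SS_tot. As an editorial illustration (the data below are invented, not the study's measurements), a stdlib-only sketch:

```python
def r_squared(observed, predicted):
    """Coefficient of determination: 1 - SS_res / SS_tot."""
    mean_obs = sum(observed) / len(observed)
    ss_res = sum((o - p) ** 2 for o, p in zip(observed, predicted))
    ss_tot = sum((o - mean_obs) ** 2 for o in observed)
    return 1 - ss_res / ss_tot

# Perfect predictions give R^2 = 1; predicting the mean everywhere gives 0.
print(r_squared([1.0, 2.0, 3.0], [1.0, 2.0, 3.0]))  # 1.0
print(r_squared([1.0, 2.0, 3.0], [2.0, 2.0, 2.0]))  # 0.0
```

Note that R² can be negative on validation data when a model predicts worse than the mean of the observations, which is why it is reported separately for testing/validation here.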

  3. The tissue microarray data exchange specification: A document type definition to validate and enhance XML data

    PubMed Central

    Nohle, David G; Ayers, Leona W

    2005-01-01

    Background The Association for Pathology Informatics (API) Extensible Mark-up Language (XML) TMA Data Exchange Specification (TMA DES) proposed in April 2003 provides a community-based, open source tool for sharing tissue microarray (TMA) data in a common format. Each tissue core within an array has separate data including digital images; therefore an organized, common approach to produce, navigate and publish such data facilitates viewing, sharing and merging TMA data from different laboratories. The AIDS and Cancer Specimen Resource (ACSR) is an HIV/AIDS tissue bank consortium sponsored by the National Cancer Institute (NCI) Division of Cancer Treatment and Diagnosis (DCTD). The ACSR offers HIV-related malignancies and uninfected control tissues in microarrays (TMA) accompanied by de-identified clinical data to approved researchers. Exporting our TMA data into the proposed API specified format offers an opportunity to evaluate the API specification in an applied setting and to explore its usefulness. Results A document type definition (DTD) that governs the allowed common data elements (CDE) in TMA DES export XML files was written, tested, and evolved, and is in routine use by the ACSR. This DTD defines TMA DES CDEs which are implemented in an external file that can be supplemented by internal DTD extensions for locally defined TMA data elements (LDE). Conclusion ACSR implementation of the TMA DES demonstrated the utility of the specification and allowed application of a DTD to validate the language of the API specified XML elements and to identify possible enhancements within our TMA data management application. Improvements to the specification have additionally been suggested by our experience in importing other institutions' exported TMA data. Enhancements to TMA DES to remove ambiguous situations and clarify the data should be considered. Better specified identifiers and hierarchical relationships will make automatic use of the data possible.
Our tool can be used to reorder data and add identifiers; upgrading data for changes in the specification can be automatically accomplished. Using a DTD (optionally reflecting our proposed enhancements) can provide stronger validation of exported TMA data. PMID:15871741
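As a side note, Python's standard library cannot validate an XML document against a DTD (that requires an external validator such as lxml), but the spirit of the check described above, namely whether required common data elements are present in an export, can be sketched with `xml.etree`. The element names below are hypothetical placeholders for illustration, not the actual TMA DES vocabulary:

```python
import xml.etree.ElementTree as ET

# Hypothetical required CDE tags, standing in for the real TMA DES elements.
REQUIRED_CDES = ["block", "slide", "core"]

def missing_cdes(xml_text):
    """Return the required CDE tags that do not appear anywhere in the export."""
    root = ET.fromstring(xml_text)
    return [tag for tag in REQUIRED_CDES if root.find(f".//{tag}") is None]

print(missing_cdes("<tma><block/><slide/></tma>"))          # ['core']
print(missing_cdes("<tma><block/><slide/><core/></tma>"))   # []
```

A real validator would also enforce element order, cardinality, and attribute constraints, which is exactly what the DTD-based approach in the paper provides.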

  4. An evaluation tool for Myofascial Adhesions in Patients after Breast Cancer (MAP-BC evaluation tool): Concurrent, face and content validity.

    PubMed

    De Groef, An; Van Kampen, Marijke; Moortgat, Peter; Anthonissen, Mieke; Van den Kerckhove, Eric; Christiaens, Marie-Rose; Neven, Patrick; Geraerts, Inge; Devoogdt, Nele

    2018-01-01

    To investigate the concurrent, face, and content validity of an evaluation tool for Myofascial Adhesions in Patients after Breast Cancer (MAP-BC evaluation tool). 1) Concurrent validity of the MAP-BC evaluation tool was investigated by exploring correlations (Spearman's rank correlation coefficient) between the subjective scores (0 = no adhesions to 3 = very strong adhesions) of the skin level using the MAP-BC evaluation tool and objective elasticity parameters (maximal skin extension and gross elasticity) generated by the Cutometer Dual MPA 580. Nine different examination points on and around the mastectomy scar were evaluated. 2) Face and content validity were explored by questioning therapists experienced with myofascial therapy in breast cancer patients about the comprehensibility and comprehensiveness of the MAP-BC evaluation tool. 1) Only three meaningful correlations were found on the mastectomy scar. For the most lateral examination point on the mastectomy scar, a moderate negative correlation (-0.44, p = 0.01) with the maximal skin extension and a moderate positive correlation with the resistance versus ability of returning, or 'gross elasticity' (0.42, p = 0.02), were found. For the middle point on the mastectomy scar, an almost moderate positive correlation with gross elasticity was found as well (0.38, p = 0.04). 2) Content and face validity were found to be good. Eighty-nine percent of the respondents found the instructions understandable and 98% found the scoring system obvious. Thirty-seven percent of the therapists suggested adding the possibility to evaluate additional anatomical locations in case of reconstructive and/or bilateral surgery. The MAP-BC evaluation tool for myofascial adhesions in breast cancer patients has good face and content validity. Evidence for good concurrent validity of the skin level was found only on the mastectomy scar itself.
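The Spearman rank correlations reported in this record can be illustrated with a small stdlib sketch using the classic no-ties formula, rho = 1 - 6*sum(d^2)/(n*(n^2 - 1)). Real ordinal adhesion scores (0 to 3) would contain ties and need average ranks, so this is a simplification for illustration only:

```python
def spearman_rho(x, y):
    """Spearman's rank correlation, no-ties formula: 1 - 6*sum(d^2)/(n*(n^2-1))."""
    def ranks(values):
        order = sorted(range(len(values)), key=lambda i: values[i])
        r = [0] * len(values)
        for rank, i in enumerate(order, start=1):
            r[i] = rank  # smallest value gets rank 1
        return r

    rx, ry = ranks(x), ranks(y)
    n = len(x)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n * n - 1))

# Perfectly monotone association -> +1; perfectly reversed -> -1.
print(spearman_rho([1, 2, 3, 4], [10, 20, 30, 40]))  # 1.0
print(spearman_rho([1, 2, 3, 4], [40, 30, 20, 10]))  # -1.0
```

With tied scores, libraries such as SciPy assign average ranks before correlating, which this sketch omits for brevity.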

  5. Content Validity Index and Intra- and Inter-Rater Reliability of a New Muscle Strength/Endurance Test Battery for Swedish Soldiers

    PubMed Central

    Larsson, Helena; Tegern, Matthias; Monnier, Andreas; Skoglund, Jörgen; Helander, Charlotte; Persson, Emelie; Malm, Christer; Broman, Lisbet; Aasa, Ulrika

    2015-01-01

    The objective of this study was to examine the content validity of commonly used muscle performance tests in military personnel and to investigate the reliability of a proposed test battery. For the content validity investigation, thirty tests were selected from those described in the literature and/or commonly used in the Nordic and North Atlantic Treaty Organization (NATO) countries. Nine selected experts rated, on a four-point Likert scale, the relevance of these tests in relation to five different work tasks: lifting, carrying equipment on the body, carrying equipment in the hands, climbing, and digging. Thereafter, a content validity index (CVI) was calculated for each work task. The result showed excellent CVI (≥0.78) for sixteen tests, each covering one or more of the military work tasks. Three of the tests, the functional lower-limb loading test (the Ranger test), dead-lift with kettlebells, and back extension, showed excellent content validity for four of the work tasks. For the development of a new muscle strength/endurance test battery, these three tests were further supplemented with two others, the chins and side-bridge tests. The inter-rater reliability was high (intraclass correlation coefficient, ICC2,1 = 0.99) for all five tests. The intra-rater reliability was good to high (ICC3,1 = 0.82–0.96) with an acceptable standard error of measurement (SEM), except for the side-bridge test (SEM% > 15). Thus, the final suggested test battery for a valid and reliable evaluation of soldiers’ muscle performance comprised the following four tests: the Ranger test, dead-lift with kettlebells, chins, and back extension. The criterion-related validity of the test battery should be further evaluated for soldiers exposed to varying physical workloads. PMID:26177030
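
    An item-level CVI of the kind described here is simply the proportion of experts rating an item 3 or 4 on the four-point relevance scale; with nine experts, values of at least 0.78 are conventionally rated excellent. A minimal sketch with made-up ratings:

```python
# Item-level content validity index (I-CVI): fraction of experts who rate
# an item as relevant (3 or 4) on a four-point Likert scale.

def item_cvi(ratings):
    """ratings: one 1-4 relevance score per expert."""
    relevant = sum(1 for r in ratings if r >= 3)
    return relevant / len(ratings)

# Hypothetical ratings from nine experts for one test/work-task pair.
ratings = [4, 4, 3, 4, 3, 4, 4, 3, 4]
print(item_cvi(ratings))  # 1.0 -- all nine experts rated the test relevant
```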

  6. The Extension Storyteller: Using Stories to Enhance Meaning and Catalyze Change

    ERIC Educational Resources Information Center

    Franz, Nancy

    2016-01-01

    Many cultures share and pass on norms through storytelling. Extension as a culture also creates and shares stories to pass on history, provide information about Extension work and experiences, and develop the organization. However, Extension as a culture less frequently uses storytelling to enhance meaning and catalyze related change. This article…

  7. Analysis of in-flight boundary-layer state measurements on a subsonic transport wing in high-lift configuration

    NASA Technical Reports Server (NTRS)

    vanDam, C. P.; Los, S. M.; Miley, S. J.; Yip, L. P.; Banks, D. W.; Roback, V. E.; Bertelrud, A.

    1995-01-01

    Flight experiments on NASA Langley's B737-100 (TSRV) airplane have been conducted to document flow characteristics in order to further the understanding of high-lift flow physics, and to correlate and validate computational predictions and wind-tunnel measurements. The project is a cooperative effort involving NASA, industry, and universities. In addition to focusing on in-flight measurements, the project includes extensive application of various computational techniques, and correlation of flight data with computational results and wind-tunnel measurements. Results obtained in the most recent phase of flight experiments are analyzed and presented in this paper. In-flight measurements include surface pressure distributions, measured using flush pressure taps and pressure belts on the slats, main element, and flap elements; surface shear stresses, measured using Preston tubes; off-surface velocity distributions, measured using shear-layer rakes; aeroelastic deformations of the flap elements, measured using an optical positioning system; and boundary-layer transition phenomena, measured using hot-film anemometers and an infrared imaging system. The analysis in this paper primarily focuses on changes in the boundary-layer state that occurred on the slats, main element, and fore flap as a result of changes in flap setting and/or flight condition. Following a detailed description of the experiment, the boundary-layer state phenomenon will be discussed based on data measured during these recent flight experiments.

  8. Full Simulation for the Qweak Experiment at 1.16 and 0.877 GeV and their Impact on Extracting the PV Asymmetry in the N → Δ Transition.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nuhait, Hend

    The Qweak project seeks new physics beyond the Standard Model. It aims to measure the weak charge of the proton, which has never been measured, to 4% precision at low momentum transfer. The experiment is performed by scattering electrons from protons and exploiting parity violation in the weak interaction at low four-momentum transfer. Two measurements were considered: elastic and inelastic. The elastic measurement determines the proton's weak charge; the inelastic asymmetry measurement extracts the low-energy constant d and probes the neutral-current sector of the weak interaction. Qweak measures the asymmetry in the N → Δ transition. Because the elastic radiative tail gives the dominant contribution to the uncertainty in the N → Δ asymmetries, this thesis discusses the radiative corrections. In addition, this thesis describes in detail the extensive simulations performed to determine the impact of all simulated background processes on extracting the PV N → Δ asymmetries. In the process of verifying the validity of these background fractions, we determined the best value of a quantity measured during the Qweak experiment: the beam normal single spin asymmetry, Bn, in the N → Δ transition.

  9. The SCD - Stem Cell Differentiation ESA Project: Preparatory Work for the Spaceflight Mission

    NASA Astrophysics Data System (ADS)

    Versari, Silvia; Barenghi, Livia; van Loon, Jack; Bradamante, Silvia

    2016-04-01

    Due to spaceflight, astronauts experience serious weightlessness-induced bone loss because of an unbalanced process of bone remodeling that involves bone marrow mesenchymal stem cells (BMSCs), as well as osteoblasts, osteocytes, and osteoclasts. The effects of microgravity on bone cells have been extensively studied, but only recently has consideration been given to the role of BMSCs. Previous research indicated that human BMSCs cultured in simulated microgravity (sim-μg) alter their proliferation and differentiation. Spaceflight opportunities for biomedical experiments are rare and suffer from a number of operative constraints that could bias the validity of the experiment itself, but they remain a unique opportunity to confirm and explain the effects of microgravity, which are only partially activated/detectable in simulated conditions. For this reason, we carefully prepared the SCD - STEM CELLS DIFFERENTIATION experiment, selected by the European Space Agency (ESA) and now on the International Space Station (ISS). Here we present the preparatory studies performed on the ground to adapt the project to spaceflight constraints in terms of culture conditions, fixation, and storage of human BMSCs in space, satisfying the biological requirements needed to retrieve samples suitable for post-flight analyses. We expect to gain a better understanding of the molecular mechanisms governing human BMSC growth and differentiation, with the aim of outlining new countermeasures against astronaut bone loss.

  10. MatTAP: A MATLAB toolbox for the control and analysis of movement synchronisation experiments.

    PubMed

    Elliott, Mark T; Welchman, Andrew E; Wing, Alan M

    2009-02-15

    Investigating movement timing and synchronisation at the sub-second range relies on an experimental setup that has high temporal fidelity, is able to deliver output cues and can capture corresponding responses. Modern, multi-tasking operating systems make this increasingly challenging when using standard PC hardware and programming languages. This paper describes a new free suite of tools (available from http://www.snipurl.com/mattap) for use within the MATLAB programming environment, compatible with Microsoft Windows and a range of data acquisition hardware. The toolbox allows flexible generation of timing cues with high temporal accuracy, the capture and automatic storage of corresponding participant responses and an integrated analysis module for the rapid processing of results. A simple graphical user interface is used to navigate the toolbox and so can be operated easily by users not familiar with programming languages. However, it is also fully extensible and customisable, allowing adaptation for individual experiments and facilitating the addition of new modules in future releases. Here we discuss the relevance of the MatTAP (MATLAB Timing Analysis Package) toolbox to current timing experiments and compare its use to alternative methods. We validate the accuracy of the analysis module through comparison to manual observation methods and replicate a previous sensorimotor synchronisation experiment to demonstrate the versatility of the toolbox features demanded by such movement synchronisation paradigms.

  11. Electrolysis Performance Improvement and Validation Experiment

    NASA Technical Reports Server (NTRS)

    Schubert, Franz H.

    1992-01-01

    Viewgraphs on electrolysis performance improvement and validation experiment are presented. Topics covered include: water electrolysis: an ever increasing need/role for space missions; static feed electrolysis (SFE) technology: a concept developed for space applications; experiment objectives: why test in microgravity environment; and experiment description: approach, hardware description, test sequence and schedule.

  12. Concept analysis and validation of the nursing diagnosis, delayed surgical recovery.

    PubMed

    Appoloni, Aline Helena; Herdman, T Heather; Napoleão, Anamaria Alves; Campos de Carvalho, Emilia; Hortense, Priscilla

    2013-10-01

    To analyze the human response of delayed surgical recovery, approved by NANDA-I, and to validate its defining characteristics (DCs) and related factors (RFs). This was a two-part study using a concept analysis based on the method of Walker and Avant, and diagnostic content validation based on Fehring's model. Three of the original DCs, and three proposed DCs identified from the concept analysis, were validated in this study; five of the original RFs and four proposed RFs were validated. A revision of the concept studied is suggested, incorporating the validation of some of the DCs and RFs presented by NANDA-I, and the insertion of new, validated DCs and RFs. This study may enable the extension of the use of this diagnosis and contribute to quality surgical care of clients. © 2013, The Authors. International Journal of Nursing Knowledge © 2013, NANDA International.

  13. A CFD validation roadmap for hypersonic flows

    NASA Technical Reports Server (NTRS)

    Marvin, Joseph G.

    1992-01-01

    A roadmap for computational fluid dynamics (CFD) code validation is developed. The elements of the roadmap are consistent with air-breathing vehicle design requirements and related to the important flow path components: forebody, inlet, combustor, and nozzle. Building block and benchmark validation experiments are identified along with their test conditions and measurements. Based on evaluation criteria, recommendations for an initial CFD validation database are given and gaps identified where future experiments would provide the needed validation data.

  14. A CFD validation roadmap for hypersonic flows

    NASA Technical Reports Server (NTRS)

    Marvin, Joseph G.

    1993-01-01

    A roadmap for computational fluid dynamics (CFD) code validation is developed. The elements of the roadmap are consistent with air-breathing vehicle design requirements and related to the important flow path components: forebody, inlet, combustor, and nozzle. Building block and benchmark validation experiments are identified along with their test conditions and measurements. Based on evaluation criteria, recommendations for an initial CFD validation database are given and gaps identified where future experiments would provide the needed validation data.

  15. Implementation of Real-Time Feedback Flow Control Algorithms on a Canonical Testbed

    NASA Technical Reports Server (NTRS)

    Tian, Ye; Song, Qi; Cattafesta, Louis

    2005-01-01

    This report summarizes the activities on "Implementation of Real-Time Feedback Flow Control Algorithms on a Canonical Testbed." The work consists primarily of two parts. The first part summarizes our previous work and the extensions to adaptive identification and control algorithms. The second part concentrates on the validation of the adaptive algorithms by applying them to a vibrating beam testbed. Extensions to flow control problems are discussed.

  16. Validation of the revised Mystical Experience Questionnaire in experimental sessions with psilocybin

    PubMed Central

    Barrett, Frederick S; Johnson, Matthew W; Griffiths, Roland R

    2016-01-01

    The 30-item revised Mystical Experience Questionnaire (MEQ30) was previously developed within an online survey of mystical-type experiences occasioned by psilocybin-containing mushrooms. The rated experiences occurred on average eight years before completion of the questionnaire. The current paper validates the MEQ30 using data from experimental studies with controlled doses of psilocybin. Data were pooled and analyzed from five laboratory experiments in which participants (n=184) received a moderate to high oral dose of psilocybin (at least 20 mg/70 kg). Results of confirmatory factor analysis demonstrate the reliability and internal validity of the MEQ30. Structural equation models demonstrate the external and convergent validity of the MEQ30 by showing that latent variable scores on the MEQ30 positively predict persisting change in attitudes, behavior, and well-being attributed to experiences with psilocybin while controlling for the contribution of the participant-rated intensity of drug effects. These findings support the use of the MEQ30 as an efficient measure of individual mystical experiences. A method to score a “complete mystical experience” that was used in previous versions of the mystical experience questionnaire is validated in the MEQ30, and a stand-alone version of the MEQ30 is provided for use in future research. PMID:26442957

  17. The Protective Value of Hardiness on Military Posttraumatic Stress Symptoms

    DTIC Science & Technology

    2013-01-01

    such as the death of service member colleagues and combat experiences. Extensive military experience may play a larger role in the development of...related stressors, such as number of deployments, combat experience, and exposure to death or serious injury of military colleagues and nonmilitary...predictor of PTSD. Although it is important to acknowledge that extensive military service may play a role in the development of PTSD, it is

  18. Validation and understanding of Moderate Resolution Imaging Spectroradiometer aerosol products (C5) using ground-based measurements from the handheld Sun photometer network in China

    Treesearch

    Zhanqing Li; Feng Niu; Kwon-Ho Lee; Jinyuan Xin; Wei Min Hao; Bryce L. Nordgren; Yuesi Wang; Pucai Wang

    2007-01-01

    The Moderate Resolution Imaging Spectroradiometer (MODIS) currently provides the most extensive aerosol retrievals on a global basis, but validation is limited to a small number of ground stations. This study presents a comprehensive evaluation of Collection 4 and 5 MODIS aerosol products using ground measurements from the Chinese Sun Hazemeter Network (CSHNET). The...

  19. A Cost Analysis Model for Army Sponsored Graduate Dental Education Programs.

    DTIC Science & Technology

    1997-04-01

    characteristics of a good measurement tool? Cooper and Emory in their textbook, Business Research Methods, state there are three major criteria for evaluating...a measurement tool: validity, reliability, and practicality (Cooper and Emory 1995). Validity can be compartmentalized into internal and external...tremendous expense? The AEGD-1 year program is used extensively as a recruiting tool to encourage senior dental students to join the Army Dental Corps. The

  20. Impact of imaging measurements on response assessment in glioblastoma clinical trials

    PubMed Central

    Reardon, David A.; Ballman, Karla V.; Buckner, Jan C.; Chang, Susan M.; Ellingson, Benjamin M.

    2014-01-01

    We provide historical and scientific guidance on imaging response assessment for incorporation into clinical trials to stimulate effective and expedited drug development for recurrent glioblastoma by addressing 3 fundamental questions: (i) What is the current validation status of imaging response assessment, and when are we confident assessing response using today's technology? (ii) What imaging technology and/or response assessment paradigms can be validated and implemented soon, and how will these technologies provide benefit? (iii) Which imaging technologies need extensive testing, and how can they be prospectively validated? Assessment of T1 +/− contrast, T2/FLAIR, diffusion, and perfusion imaging sequences is routine and provides important insight into underlying tumor activity. Nonetheless, the utility of these data within and across patients, as well as across institutions, is limited by challenges in quantifying measurements accurately and by the lack of consistent and standardized image acquisition parameters. There is currently a critical need to generate guidelines optimizing and standardizing MRI sequences for neuro-oncology patients. Additionally, more accurate differentiation of confounding factors (pseudoprogression or pseudoresponse) may be valuable. Although promising, diffusion MRI, perfusion MRI, MR spectroscopy, and amino acid PET require extensive standardization and validation. Finally, additional techniques to enhance response assessment, such as digital T1 subtraction maps, warrant further investigation. PMID:25313236

  1. Patient Experience and Satisfaction with Inpatient Service: Development of Short Form Survey Instrument Measuring the Core Aspect of Inpatient Experience

    PubMed Central

    Wong, Eliza L. Y.; Coulter, Angela; Hewitson, Paul; Cheung, Annie W. L.; Yam, Carrie H. K.; Lui, Siu fai; Tam, Wilson W. S.; Yeoh, Eng-kiong

    2015-01-01

    Patient experience reflects quality of care from the patients’ perspective; therefore, patients’ experiences are important data in the evaluation of the quality of health services. The development of an abbreviated, reliable and valid instrument for measuring inpatients’ experience would reflect the key aspect of inpatient care from patients’ perspective as well as facilitate quality improvement by cultivating patient engagement and allow the trends in patient satisfaction and experience to be measured regularly. The study developed a short-form inpatient instrument and tested its ability to capture a core set of inpatients’ experiences. The Hong Kong Inpatient Experience Questionnaire (HKIEQ) was established in 2010; it is an adaptation of the General Inpatient Questionnaire of the Care Quality Commission created by the Picker Institute in United Kingdom. This study used a consensus conference and a cross-sectional validation survey to create and validate a short-form of the Hong Kong Inpatient Experience Questionnaire (SF-HKIEQ). The short-form, the SF-HKIEQ, consisted of 18 items derived from the HKIEQ. The 18 items mainly covered relational aspects of care under four dimensions of the patient’s journey: hospital staff, patient care and treatment, information on leaving the hospital, and overall impression. The SF-HKIEQ had a high degree of face validity, construct validity and internal reliability. The validated SF-HKIEQ reflects the relevant core aspects of inpatients’ experience in a hospital setting. It provides a quick reference tool for quality improvement purposes and a platform that allows both healthcare staff and patients to monitor the quality of hospital care over time. PMID:25860775

  2. The Complicate Observations and Multi-Parameter Land Information Constructions on Allied Telemetry Experiment (COMPLICATE)

    PubMed Central

    Tian, Xin; Li, Zengyuan; Chen, Erxue; Liu, Qinhuo; Yan, Guangjian; Wang, Jindi; Niu, Zheng; Zhao, Shaojie; Li, Xin; Pang, Yong; Su, Zhongbo; van der Tol, Christiaan; Liu, Qingwang; Wu, Chaoyang; Xiao, Qing; Yang, Le; Mu, Xihan; Bo, Yanchen; Qu, Yonghua; Zhou, Hongmin; Gao, Shuai; Chai, Linna; Huang, Huaguo; Fan, Wenjie; Li, Shihua; Bai, Junhua; Jiang, Lingmei; Zhou, Ji

    2015-01-01

    The Complicate Observations and Multi-Parameter Land Information Constructions on Allied Telemetry Experiment (COMPLICATE) comprises a network of remote sensing experiments designed to enhance the dynamic analysis and modeling of remotely sensed information for complex land surfaces. Two types of experimental campaigns were established under the framework of COMPLICATE. The first was designed for continuous and elaborate experiments. The experimental strategy helps enhance our understanding of the radiative and scattering mechanisms of soil and vegetation and the modeling of remotely sensed information for complex land surfaces. To validate the methodologies and models for dynamic analyses of remote sensing for complex land surfaces, the second campaign consisted of simultaneous satellite-borne, airborne, and ground-based experiments. During field campaigns, several continuous and intensive observations were obtained. Measurements were undertaken to address key scientific issues, as follows: 1) Determine the characteristics of spatial heterogeneity and the radiative and scattering mechanisms of remote sensing on complex land surfaces. 2) Determine the mechanisms of spatial and temporal scale extensions for remote sensing on complex land surfaces. 3) Determine synergistic inversion mechanisms for soil and vegetation parameters using multi-mode remote sensing on complex land surfaces. Here, we introduce the background, the objectives, the experimental designs, the observations and measurements, and the overall advances of COMPLICATE. Through the implementation of COMPLICATE over the next several years, we expect to contribute to quantitative remote sensing science and Earth observation techniques. PMID:26332035

  3. Is there inter-procedural transfer of skills in intraocular surgery? A randomized controlled trial.

    PubMed

    Thomsen, Ann Sofia Skou; Kiilgaard, Jens Folke; la Cour, Morten; Brydges, Ryan; Konge, Lars

    2017-12-01

    To investigate whether experience in simulated cataract surgery transfers to the learning curves of novices in vitreoretinal surgery. Twelve ophthalmology residents without previous experience in intraocular surgery were randomized to (1) intensive training in cataract surgery on a virtual-reality simulator until passing a test with predefined validity evidence (cataract trainees) or to (2) no cataract surgery training (novices). Possible skill transfer was assessed using a test consisting of all 11 vitreoretinal modules on the EyeSi virtual-reality simulator. All participants repeated the test of vitreoretinal surgical skills until their performance curve plateaued. Three experienced vitreoretinal surgeons also performed the test to establish validity evidence. Analysis with independent samples t-tests was performed. The vitreoretinal test on the EyeSi simulator demonstrated evidence of validity, given statistically significant differences in mean test scores for the first repetition; experienced surgeons scored higher than novices (p = 0.023) and cataract trainees (p = 0.003). Internal consistency for the 11 modules of the test was acceptable (Cronbach's α = 0.73). Our findings did not indicate a transfer effect: there were no significant differences between cataract trainees and novices in starting scores (mean ± SD 381 ± 129 points versus 455 ± 82 points, p = 0.262), time to reach maximum performance level (10.7 ± 3.0 hr versus 8.7 ± 2.8 hr, p = 0.265), or maximum scores (785 ± 162 points versus 805 ± 73 points, p = 0.791). Pretraining in cataract surgery did not demonstrate any measurable effect on vitreoretinal procedural performance. The results of this study indicate that we should not anticipate extensive transfer of surgical skills when planning training programmes in intraocular surgery. © 2017 Acta Ophthalmologica Scandinavica Foundation. Published by John Wiley & Sons Ltd.
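
    The internal-consistency figure quoted above (Cronbach's α) can be computed directly from per-participant module scores. A minimal sketch follows, with invented scores for four participants on four modules (the study used 11 modules):

```python
# Cronbach's alpha: k/(k-1) * (1 - sum(item variances) / variance of totals),
# using sample variances. Higher alpha means the items covary strongly.

def cronbach_alpha(rows):
    """rows: one list of item (module) scores per participant."""
    k = len(rows[0])
    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
    item_vars = [var([row[i] for row in rows]) for i in range(k)]
    total_var = var([sum(row) for row in rows])
    return k / (k - 1) * (1 - sum(item_vars) / total_var)

scores = [
    [3, 4, 3, 5],
    [2, 2, 3, 3],
    [4, 5, 4, 5],
    [1, 2, 2, 2],
]
print(cronbach_alpha(scores))  # high for these strongly covarying toy scores
```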

  4. Diffusion of Magnetized Binary Ionic Mixtures at Ultracold Plasma Conditions

    NASA Astrophysics Data System (ADS)

    Vidal, Keith R.; Baalrud, Scott D.

    2017-10-01

    Ultracold plasma experiments offer an accessible means to test transport theories for strongly coupled systems. Application of an external magnetic field might further increase their utility by inhibiting heating mechanisms of ions and electrons and increasing the temperature at which strong coupling effects are observed. We present results focused on developing and validating a transport theory to describe binary ionic mixtures across a wide range of coupling and magnetization strengths relevant to ultracold plasma experiments. The transport theory is an extension of the Effective Potential Theory (EPT), which has been shown to accurately model correlation effects at these conditions, to include magnetization. We focus on diffusion as it can be measured in ultracold plasma experiments. Using EPT within the framework of the Chapman-Enskog expansion, the parallel and perpendicular self- and interdiffusion coefficients for binary ionic mixtures with varying mass ratios are calculated and compared to molecular dynamics simulations. The theory is found to accurately extend Braginskii-like transport to stronger coupling, but to break down when the magnetization strength becomes large enough that the typical gyroradius is smaller than the interaction scale length. This material is based upon work supported by the Air Force Office of Scientific Research under Award Number FA9550-16-1-0221.

  5. Fluid dynamic mechanisms and interactions within separated flows and their effects on missile aerodynamics

    NASA Astrophysics Data System (ADS)

    Addy, A. L.; Chow, W. L.; Korst, H. H.; White, R. A.

    1983-05-01

    Significant data and detailed results of a joint research effort investigating the fluid dynamic mechanisms and interactions within separated flows are presented. The results were obtained through analytical, experimental, and computational investigations of base flow related configurations. The research objectives focus on understanding the component mechanisms and interactions which establish and maintain separated flow regions. Flow models and theoretical analyses were developed to describe the base flowfield. The research approach has been to conduct extensive small-scale experiments on base flow configurations and to analyze these flows by component models and finite-difference techniques. The modeling of base flows of missiles (both powered and unpowered) for transonic and supersonic freestreams has been successful by component models. Research on plume effects and plume modeling indicated the need to match initial plume slope and plume surface curvature for valid wind tunnel simulation of an actual rocket plume. The assembly and development of a state-of-the-art laser Doppler velocimeter (LDV) system for experiments with two-dimensional small-scale models has been completed and detailed velocity and turbulence measurements are underway. The LDV experiments include the entire range of base flowfield mechanisms - shear layer development, recompression/reattachment, shock-induced separation, and plume-induced separation.

  6. Validating MODIS above-cloud aerosol optical depth retrieved from "color ratio" algorithm using direct measurements made by NASA's airborne AATS and 4STAR sensors

    NASA Astrophysics Data System (ADS)

    Jethva, Hiren; Torres, Omar; Remer, Lorraine; Redemann, Jens; Livingston, John; Dunagan, Stephen; Shinozuka, Yohei; Kacenelenbogen, Meloe; Segal Rosenheimer, Michal; Spurr, Rob

    2016-10-01

    We present the validation analysis of above-cloud aerosol optical depth (ACAOD) retrieved from the "color ratio" method applied to MODIS cloudy-sky reflectance measurements, using the limited direct measurements made by NASA's airborne Ames Airborne Tracking Sunphotometer (AATS) and Spectrometer for Sky-Scanning, Sun-Tracking Atmospheric Research (4STAR) sensors. A thorough search of the airborne database collection revealed a total of five significant events in which an airborne sun photometer, coincident with the MODIS overpass, observed partially absorbing aerosols emitted from agricultural biomass burning, dust, and wildfires over a low-level cloud deck during the SAFARI-2000, ACE-ASIA 2001, and SEAC4RS 2013 campaigns, respectively. The co-located satellite-airborne matchups revealed good agreement (root-mean-square difference < 0.1), with most matchups falling within the estimated uncertainties associated with the MODIS retrievals (about -10 to +50%). The co-retrieved cloud optical depth was comparable to that of the MODIS operational cloud product for ACE-ASIA and SEAC4RS, but higher by 30-50% for the SAFARI-2000 case study. The reason for this discrepancy could be attributed to the distinct aerosol optical properties encountered during the respective campaigns. A brief discussion of the sources of uncertainty in the satellite-based ACAOD retrieval and the co-location procedure is presented. Field experiments dedicated to making direct measurements of aerosols above cloud are needed for the extensive validation of satellite-based retrievals.

  7. Validating MODIS Above-Cloud Aerosol Optical Depth Retrieved from Color Ratio Algorithm Using Direct Measurements Made by NASA's Airborne AATS and 4STAR Sensors

    NASA Technical Reports Server (NTRS)

    Jethva, Hiren; Torres, Omar; Remer, Lorraine; Redemann, Jens; Livingston, John; Dunagan, Stephen; Shinozuka, Yohei; Kacenelenbogen, Meloe; Segal Rozenhaimer, Michal; Spurr, Rob

    2016-01-01

    We present the validation analysis of above-cloud aerosol optical depth (ACAOD) retrieved from the color ratio method applied to MODIS cloudy-sky reflectance measurements, using the limited direct measurements made by NASA's airborne Ames Airborne Tracking Sunphotometer (AATS) and Spectrometer for Sky-Scanning, Sun-Tracking Atmospheric Research (4STAR) sensors. A thorough search of the airborne database collection revealed a total of five significant events in which an airborne sun photometer, coincident with the MODIS overpass, observed partially absorbing aerosols emitted from agricultural biomass burning, dust, and wildfires over a low-level cloud deck during the SAFARI-2000, ACE-ASIA 2001, and SEAC4RS 2013 campaigns, respectively. The co-located satellite-airborne matchups revealed good agreement (root-mean-square difference less than 0.1), with most matchups falling within the estimated uncertainties associated with the MODIS retrievals (about -10 to +50%). The co-retrieved cloud optical depth was comparable to that of the MODIS operational cloud product for ACE-ASIA and SEAC4RS, but higher by 30-50% for the SAFARI-2000 case study. The reason for this discrepancy could be attributed to the distinct aerosol optical properties encountered during the respective campaigns. A brief discussion of the sources of uncertainty in the satellite-based ACAOD retrieval and the co-location procedure is presented. Field experiments dedicated to making direct measurements of aerosols above cloud are needed for the extensive validation of satellite-based retrievals.
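
    The root-mean-square difference statistic used in this validation can be sketched as follows; the matchup values below are invented for illustration, not the campaign data.

```python
import math

# RMSD between satellite ACAOD retrievals and co-located airborne
# sun-photometer AODs; the validation above reports RMSD < 0.1.

def rmsd(satellite, airborne):
    pairs = list(zip(satellite, airborne))
    return math.sqrt(sum((s - a) ** 2 for s, a in pairs) / len(pairs))

sat = [0.42, 0.35, 0.58, 0.21, 0.47]  # hypothetical MODIS ACAOD matchups
air = [0.40, 0.30, 0.52, 0.25, 0.50]  # hypothetical AATS/4STAR AODs
print(round(rmsd(sat, air), 3))  # 0.042 -- well under the 0.1 threshold
```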

  8. Forensic Uncertainty Quantification of Explosive Dispersal of Particles

    NASA Astrophysics Data System (ADS)

    Hughes, Kyle; Park, Chanyoung; Haftka, Raphael; Kim, Nam-Ho

    2017-06-01

In addition to the numerical challenges of simulating the explosive dispersal of particles, validation of the simulation is often plagued with poor knowledge of the experimental conditions. The level of experimental detail required for validation is beyond what is usually included in the literature. This presentation proposes the use of forensic uncertainty quantification (UQ) to investigate validation-quality experiments to discover possible sources of uncertainty that may have been missed in the initial design of experiments or under-reported. The authors' experience has been that valuable insights may be gained by treating a validation experiment like a crime scene investigation: one examines all the data and documentation provided by the validation experimentalists, corroborates evidence, and quantifies large sources of uncertainty a posteriori with empirical measurements. In addition, it is proposed that forensic UQ may benefit from an independent investigator to help remove possible implicit biases and increase the likelihood of discovering unrecognized uncertainty. Forensic UQ concepts will be discussed and then applied to a set of validation experiments performed at Eglin Air Force Base. This work was supported in part by the U.S. Department of Energy, National Nuclear Security Administration, Advanced Simulation and Computing Program.

  9. Using Colored Stochastic Petri Net (CS-PN) software for protocol specification, validation, and evaluation

    NASA Technical Reports Server (NTRS)

    Zenie, Alexandre; Luguern, Jean-Pierre

    1987-01-01

The specification, verification, validation, and evaluation steps that make up the CS-PN software are outlined. The colored stochastic Petri net software is applied to a Wound/Wait protocol decomposable into two principal modules: a request or couple (transaction, granule) treatment module and a wound treatment module. Each module is specified, verified, validated, and then evaluated separately, to deduce a verification, validation, and evaluation of the complete protocol. The colored stochastic Petri net tool is shown to be a natural extension of the stochastic tool, adapted to distributed systems and protocols, because the color conveniently takes into account the numerous sites, transactions, granules, and messages.

  10. Development and psychometric validation of the general practice nurse satisfaction scale.

    PubMed

    Halcomb, Elizabeth J; Caldwell, Belinda; Salamonson, Yenna; Davidson, Patricia M

    2011-09-01

To develop an instrument to assess consumer satisfaction with nursing in general practice and to provide feedback to nurses about consumers' perceptions of their performance. Prospective psychometric instrument validation study. A literature review was conducted to generate items for an instrument to measure consumer satisfaction with nursing in general practice. Face and content validity were evaluated by an expert panel with extensive experience in general practice nursing and research. Included in the questionnaire battery was the 27-item General Practice Nurse Satisfaction (GPNS) scale, as well as demographic and health status items. This survey was distributed to 739 consumers following an intervention administered by a practice nurse in 16 general practices across metropolitan, rural, and regional Australia. Participants had the option of completing the survey online or receiving a hard copy of the survey form at the time of their visit. These data were collected between June and August 2009. Satisfaction data from 739 consumers were collected following their consultation with a general practice nurse. From the initial 27-item GPNS scale, a 21-item instrument was developed. Two factors, "confidence and credibility" and "interpersonal and communication," were extracted using principal axis factoring and varimax rotation. These two factors explained 71.9% of the variance. Cronbach's α was 0.97. The GPNS scale has demonstrated acceptable psychometric properties and can be used both in research and clinical practice for evaluating consumer satisfaction with general practice nurses. Assessing consumer satisfaction is important for developing and evaluating nursing roles. The GPNS scale is a valid and reliable tool that can be utilized to assess consumer satisfaction with general practice nurses and can assist in performance management and improving the quality of nursing services. © 2011 Sigma Theta Tau International.
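The internal-consistency statistic reported in this record, Cronbach's α, is computed from item variances and the variance of the total score: α = k/(k−1) · (1 − Σσᵢ²/σ_total²). A minimal sketch with invented ratings (the GPNS items and data are not reproduced here):

```python
from statistics import pvariance

def cronbach_alpha(items):
    """Cronbach's alpha. `items` is a list of per-item score lists,
    aligned across respondents (one inner list per item)."""
    k = len(items)
    totals = [sum(scores) for scores in zip(*items)]  # total score per respondent
    item_var = sum(pvariance(scores) for scores in items)
    return k / (k - 1) * (1 - item_var / pvariance(totals))

# Hypothetical 1-5 ratings from 6 respondents on 3 items
items = [
    [4, 5, 3, 4, 2, 5],
    [4, 4, 3, 5, 2, 4],
    [5, 5, 2, 4, 3, 5],
]
print(round(cronbach_alpha(items), 2))  # 0.89
```

Values approaching 1, such as the 0.97 reported above, indicate highly consistent items.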

  11. Consensus Recommendations on Initiating Prescription Therapies for Opioid‐Induced Constipation

    PubMed Central

    Argoff, Charles E.; Brennan, Michael J.; Camilleri, Michael; Davies, Andrew; Fudin, Jeffrey; Galluzzi, Katherine E.; Gudin, Jeffrey; Lembo, Anthony; Stanos, Steven P.

    2015-01-01

Objective: Aims of this consensus panel were to determine (1) an optimal symptom‐based method for assessing opioid‐induced constipation in clinical practice and (2) a threshold of symptom severity to prompt consideration of prescription therapy. Methods: A multidisciplinary panel of 10 experts with extensive knowledge/experience with opioid‐associated adverse events convened to discuss the literature on assessment methods used for opioid‐induced constipation and reach consensus on each objective using the nominal group technique. Results: Five validated assessment tools were evaluated: the Patient Assessment of Constipation–Symptoms (PAC‐SYM), Patient Assessment of Constipation–Quality of Life (PAC‐QOL), Stool Symptom Screener (SSS), Bowel Function Index (BFI), and Bowel Function Diary (BF‐Diary). The 3‐item BFI and 4‐item SSS, both clinician administered, are the shortest tools. In published trials, the BFI and 12‐item PAC‐SYM are most commonly used. The 11‐item BF‐Diary is highly relevant in opioid‐induced constipation and was developed and validated in accordance with US Food and Drug Administration guidelines. However, the panel believes that the complex scoring for this tool and the SSS, PAC‐SYM, and 28‐item PAC‐QOL may be unfeasible for clinical practice. The BFI is psychometrically validated and responsive to changes in symptom severity; scores range from 0 to 100, with higher scores indicating greater severity and scores >28.8 points indicating constipation. Conclusions: The BFI is a simple assessment tool with a validated threshold of clinically significant constipation. Prescription treatments for opioid‐induced constipation should be considered for patients who have a BFI score of ≥30 points and an inadequate response to first‐line interventions. PMID:26582720
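The decision rule in this record can be made concrete. A rough sketch, assuming the common convention that the BFI score is the mean of its three 0-100 item ratings (the item names below are paraphrases, not the validated instrument wording):

```python
def bfi_score(ease, incomplete, judgment):
    """Mean of the three 0-100 BFI item ratings (higher = more severe).
    Item names are paraphrased for illustration."""
    items = (ease, incomplete, judgment)
    assert all(0 <= v <= 100 for v in items)
    return sum(items) / 3

def consider_prescription(score, first_line_failed):
    """Panel recommendation: BFI >= 30 plus inadequate first-line response."""
    return score >= 30 and first_line_failed

s = bfi_score(40, 35, 30)
print(s, consider_prescription(s, first_line_failed=True))  # 35.0 True
```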

  12. The iMTA Productivity Cost Questionnaire: A Standardized Instrument for Measuring and Valuing Health-Related Productivity Losses.

    PubMed

    Bouwmans, Clazien; Krol, Marieke; Severens, Hans; Koopmanschap, Marc; Brouwer, Werner; Hakkaart-van Roijen, Leona

    2015-09-01

Productivity losses often contribute significantly to the total costs in economic evaluations adopting a societal perspective. Currently, no consensus exists on the measurement and valuation of productivity losses. We aimed to develop a standardized instrument for measuring and valuing productivity losses. A group of researchers with extensive experience in measuring and valuing productivity losses designed an instrument suitable for self-completion, building on prior knowledge and evidence on validity. The instrument was designed to cover all domains of productivity losses, thus allowing quantification and valuation of all productivity losses. A feasibility study was performed to check the questionnaire's consistency and intelligibility. The iMTA Productivity Cost Questionnaire (iPCQ) includes three modules measuring productivity losses of paid work due to 1) absenteeism and 2) presenteeism and productivity losses related to 3) unpaid work. Questions for measuring absenteeism and presenteeism were derived from existing validated questionnaires. Because validated measures of losses of unpaid work are scarce, the questions of this module were newly developed. To enhance the instrument's feasibility, simple language was used. The feasibility study included 195 respondents (response rate 80%) older than 18 years. Seven percent (n = 13) identified problems while filling in the iPCQ, including problems with the questionnaire's instructions and routing (n = 6) and wording (n = 2). Five respondents experienced difficulties in estimating the time that would be needed for other people to make up for lost unpaid work. Most modules of the iPCQ are based on validated questions derived from previously available instruments. The instrument is understandable for most of the general public. Copyright © 2015 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.

  13. The Development and Validation of a Life Experience Inventory for the Identification of Creative Electrical Engineers.

    ERIC Educational Resources Information Center

    Michael, William B.; Colson, Kenneth R.

    1979-01-01

    The construction and validation of the Life Experience Inventory (LEI) for the identification of creative electrical engineers are described. Using the number of patents held or pending as a criterion measure, the LEI was found to have high concurrent validity. (JKS)

  14. UTCI-Fiala multi-node model of human heat transfer and temperature regulation

    NASA Astrophysics Data System (ADS)

    Fiala, Dusan; Havenith, George; Bröde, Peter; Kampmann, Bernhard; Jendritzky, Gerd

    2012-05-01

The UTCI-Fiala mathematical model of human temperature regulation forms the basis of the new Universal Thermal Climate Index (UTCI). Following extensive validation tests, adaptations, and extensions, such as the inclusion of an adaptive clothing model, the model was used to predict human temperature and regulatory responses for combinations of the prevailing outdoor climate conditions. This paper provides an overview of the underlying algorithms and methods that constitute the multi-node dynamic UTCI-Fiala model of human thermal physiology and comfort. Treated topics include modelling heat and mass transfer within the body, numerical techniques, modelling environmental heat exchanges, thermoregulatory reactions of the central nervous system, and perceptual responses. Other contributions of this special issue describe the validation of the UTCI-Fiala model against measured data and the development of the adaptive clothing model for outdoor climates.

  15. Effective and extensible feature extraction method using genetic algorithm-based frequency-domain feature search for epileptic EEG multiclassification

    PubMed Central

    Wen, Tingxi; Zhang, Zhongnan

    2017-01-01

In this paper, a genetic algorithm-based frequency-domain feature search (GAFDS) method is proposed for the electroencephalogram (EEG) analysis of epilepsy. In this method, frequency-domain features are first searched and then combined with nonlinear features. Subsequently, these features are selected and optimized to classify EEG signals. The extracted features are analyzed experimentally. The features extracted by GAFDS show remarkable independence, and they are superior to the nonlinear features in terms of the ratio of interclass distance and intraclass distance. Moreover, the proposed feature search method can search for features of instantaneous frequency in a signal after Hilbert transformation. The classification results achieved using these features are reasonable; thus, GAFDS exhibits good extensibility. Multiple classical classifiers (i.e., k-nearest neighbor, linear discriminant analysis, decision tree, AdaBoost, multilayer perceptron, and Naïve Bayes) achieve satisfactory classification accuracies by using the features generated by the GAFDS method and the optimized feature selection. The accuracies for 2-classification and 3-classification problems may reach up to 99% and 97%, respectively. Results of several cross-validation experiments illustrate that GAFDS is effective in the extraction of effective features for EEG classification. Therefore, the proposed feature selection and optimization model can improve classification accuracy. PMID:28489789
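The cross-validation experiments mentioned in this record follow the standard k-fold pattern: partition the data, train on k−1 folds, test on the held-out fold, and average. A generic sketch using a toy nearest-class-mean classifier on a single feature (purely illustrative, not the GAFDS pipeline):

```python
def kfold_accuracy(xs, ys, k=5):
    """Estimate classifier accuracy with k-fold cross-validation.
    Classifier: nearest class mean on one feature (illustration only)."""
    n = len(xs)
    folds = [list(range(i, n, k)) for i in range(k)]  # interleaved folds
    correct = 0
    for test_idx in folds:
        train_idx = [i for i in range(n) if i not in test_idx]
        # class means on the training portion
        means = {}
        for label in set(ys[i] for i in train_idx):
            vals = [xs[i] for i in train_idx if ys[i] == label]
            means[label] = sum(vals) / len(vals)
        # score the held-out fold
        for i in test_idx:
            pred = min(means, key=lambda m: abs(xs[i] - means[m]))
            correct += pred == ys[i]
    return correct / n

# Hypothetical 1-D feature that separates two classes well
xs = [0.1, 0.2, 0.15, 0.9, 1.0, 0.95, 0.05, 0.85, 0.12, 0.92]
ys = [0, 0, 0, 1, 1, 1, 0, 1, 0, 1]
print(kfold_accuracy(xs, ys, k=5))  # 1.0
```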

  16. High-throughput hyperpolarized 13C metabolic investigations using a multi-channel acquisition system

    NASA Astrophysics Data System (ADS)

    Lee, Jaehyuk; Ramirez, Marc S.; Walker, Christopher M.; Chen, Yunyun; Yi, Stacey; Sandulache, Vlad C.; Lai, Stephen Y.; Bankson, James A.

    2015-11-01

    Magnetic resonance imaging and spectroscopy of hyperpolarized (HP) compounds such as [1-13C]-pyruvate have shown tremendous potential for offering new insight into disease and response to therapy. New applications of this technology in clinical research and care will require extensive validation in cells and animal models, a process that may be limited by the high cost and modest throughput associated with dynamic nuclear polarization. Relatively wide spectral separation between [1-13C]-pyruvate and its chemical endpoints in vivo are conducive to simultaneous multi-sample measurements, even in the presence of a suboptimal global shim. Multi-channel acquisitions could conserve costs and accelerate experiments by allowing acquisition from multiple independent samples following a single dissolution. Unfortunately, many existing preclinical MRI systems are equipped with only a single channel for broadband acquisitions. In this work, we examine the feasibility of this concept using a broadband multi-channel digital receiver extension and detector arrays that allow concurrent measurement of dynamic spectroscopic data from ex vivo enzyme phantoms, in vitro anaplastic thyroid carcinoma cells, and in vivo in tumor-bearing mice. Throughput and the cost of consumables were improved by up to a factor of four. These preliminary results demonstrate the potential for efficient multi-sample studies employing hyperpolarized agents.

  17. Simulation of orientational coherent effects via Geant4

    NASA Astrophysics Data System (ADS)

    Bagli, E.; Asai, M.; Brandt, D.; Dotti, A.; Guidi, V.; Verderi, M.; Wright, D.

    2017-10-01

Beam manipulation of high- and very-high-energy particle beams is a hot topic in accelerator physics. Coherent effects of ultra-relativistic particles in bent crystals allow the steering of particle trajectories thanks to the strong electrical field generated between atomic planes. Recently, a collimation experiment with bent crystals was carried out at the CERN-LHC, paving the way to the usage of such technology in current and future accelerators. Geant4 is a widely used object-oriented toolkit for the Monte Carlo simulation of the interaction of particles with matter in high-energy physics. Moreover, its areas of application also include nuclear and accelerator physics, as well as studies in medical and space science. We present the first Geant4 extension for the simulation of orientational effects in straight and bent crystals for high-energy charged particles. The model allows the manipulation of particle trajectories by means of straight and bent crystals and the scaling of the cross sections of hadronic and electromagnetic processes for channeled particles. Based on such a model, an extension of the Geant4 toolkit has been developed. The code and the model have been validated by comparison with published experimental data regarding the deflection efficiency via channeling and the variation of the rate of inelastic nuclear interactions.
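For a sense of scale, planar channeling of the kind this record simulates is governed by the Lindhard critical angle, θc ≈ √(2U₀/pv), where U₀ is the planar potential well depth and pv ≈ E for ultra-relativistic particles. A quick estimate using a textbook well depth for silicon (the numbers are illustrative, not taken from the Geant4 model):

```python
import math

def critical_angle_urad(u0_ev, pv_gev):
    """Lindhard planar critical angle sqrt(2*U0/pv) in microradians,
    for ultra-relativistic particles (pv ~ E)."""
    return math.sqrt(2 * u0_ev / (pv_gev * 1e9)) * 1e6

# Si (110) planar well depth ~16 eV (textbook value), 400 GeV protons
print(round(critical_angle_urad(16, 400), 1))  # ~8.9 microradians
```

Angles of order 10 µrad at CERN-SPS energies are why bent-crystal alignment tolerances are so demanding.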

  18. Observed Local Impacts of Global Irrigation on Surface Temperature

    NASA Astrophysics Data System (ADS)

    Chen, L.; Dirmeyer, P.

    2017-12-01

Agricultural irrigation has significant potential for altering local climate by reducing soil albedo, increasing evapotranspiration, and enabling greater leaf area. Numerous studies using regional or global climate models have demonstrated the cooling effects of irrigation on mean and extreme temperature, especially over regions where irrigation is extensive. However, these model-based results have not been validated due to the limitations of observational datasets. In this study, multiple satellite-based products, including the Moderate Resolution Imaging Spectroradiometer (MODIS) and Soil Moisture Active Passive (SMAP) data sets, are used to isolate and quantify the local impacts of irrigation on surface climate over the irrigated regions, which are derived from the Global Map of Irrigation Areas (GMIA). The relationships among soil moisture, albedo, evapotranspiration, and surface temperature are explored. A strong evaporative cooling effect of irrigation on daytime surface temperature is found over arid and semi-arid regions, such as California's Central Valley, the Great Plains, and central Asia. However, the cooling effects are less evident in most areas of eastern China, India, and the Lower Mississippi River Basin in spite of extensive irrigation over these regions. Results are also compared with irrigation experiments using the Community Earth System Model (CESM) to assess the model's ability to represent land-atmosphere interactions with regard to irrigation.

  19. Effective and extensible feature extraction method using genetic algorithm-based frequency-domain feature search for epileptic EEG multiclassification.

    PubMed

    Wen, Tingxi; Zhang, Zhongnan

    2017-05-01

In this paper, a genetic algorithm-based frequency-domain feature search (GAFDS) method is proposed for the electroencephalogram (EEG) analysis of epilepsy. In this method, frequency-domain features are first searched and then combined with nonlinear features. Subsequently, these features are selected and optimized to classify EEG signals. The extracted features are analyzed experimentally. The features extracted by GAFDS show remarkable independence, and they are superior to the nonlinear features in terms of the ratio of interclass distance and intraclass distance. Moreover, the proposed feature search method can search for features of instantaneous frequency in a signal after Hilbert transformation. The classification results achieved using these features are reasonable; thus, GAFDS exhibits good extensibility. Multiple classical classifiers (i.e., k-nearest neighbor, linear discriminant analysis, decision tree, AdaBoost, multilayer perceptron, and Naïve Bayes) achieve satisfactory classification accuracies by using the features generated by the GAFDS method and the optimized feature selection. The accuracies for 2-classification and 3-classification problems may reach up to 99% and 97%, respectively. Results of several cross-validation experiments illustrate that GAFDS is effective in the extraction of effective features for EEG classification. Therefore, the proposed feature selection and optimization model can improve classification accuracy.

  20. Kinesthetic Feedback During 2DOF Wrist Movements via a Novel MR-Compatible Robot.

    PubMed

    Erwin, Andrew; O'Malley, Marcia K; Ress, David; Sergi, Fabrizio

    2017-09-01

We demonstrate the interaction control capabilities of the MR-SoftWrist, a novel MR-compatible robot capable of applying accurate kinesthetic feedback to wrist pointing movements executed during fMRI. The MR-SoftWrist, based on a novel design that combines parallel piezoelectric actuation with compliant force feedback, is capable of delivering 1.5 N·m of torque to the wrist of an interacting subject about the flexion/extension and radial/ulnar deviation axes. The robot workspace, defined by admissible wrist rotation angles, fully includes a circle with a 20 deg radius. Via dynamic characterization, we demonstrate capability for transparent operation with low (10% of maximum torque output) backdrivability torques at nominal speeds. Moreover, we demonstrate a 5.5 Hz stiffness control bandwidth for a 14 dB range of virtual stiffness values, corresponding to 25%-125% of the device's physical reflected stiffness in the nominal configuration. We finally validate the possibility of operation during fMRI via a case study involving one healthy subject. Our validation experiment demonstrates the capability of the device to apply kinesthetic feedback to elicit distinguishable kinetic and neural responses without significant degradation of image quality or task-induced head movements. With this study, we demonstrate the feasibility of MR-compatible devices like the MR-SoftWrist to be used in support of motor control experiments investigating wrist pointing under robot-applied force fields. Such future studies may elucidate fundamental neural mechanisms enabling robot-assisted motor skill learning, which is crucial for robot-aided neurorehabilitation.

  1. An atmospheric pressure high-temperature laminar flow reactor for investigation of combustion and related gas phase reaction systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Oßwald, Patrick; Köhler, Markus

A new high-temperature flow reactor experiment utilizing the powerful molecular beam mass spectrometry (MBMS) technique for detailed observation of gas phase kinetics in reacting flows is presented. The reactor design provides a consequent extension of the experimental portfolio of validation experiments for combustion reaction kinetics. Temperatures up to 1800 K are applicable by three individually controlled temperature zones with this atmospheric pressure flow reactor. Detailed speciation data are obtained using the sensitive MBMS technique, providing in situ access to almost all chemical species involved in the combustion process, including highly reactive species such as radicals. Strategies for quantifying the experimental data are presented alongside a careful analysis of the characterization of the experimental boundary conditions to enable precise numeric reproduction of the experimental results. The general capabilities of this new analytical tool for the investigation of reacting flows are demonstrated for a selected range of conditions, fuels, and applications. A detailed dataset for the well-known gaseous fuels, methane and ethylene, is provided and used to verify the experimental approach. Furthermore, application for liquid fuels and fuel components important for technical combustors like gas turbines and engines is demonstrated. Besides the detailed investigation of novel fuels and fuel components, the wide range of operation conditions gives access to extended combustion topics, such as super rich conditions at high temperature important for gasification processes, or the peroxy chemistry governing the low temperature oxidation regime. These demonstrations are accompanied by a first kinetic modeling approach, examining the opportunities for model validation purposes.

  2. CFD Modeling Needs and What Makes a Good Supersonic Combustion Validation Experiment

    NASA Technical Reports Server (NTRS)

    Gaffney, Richard L., Jr.; Cutler, Andrew D.

    2005-01-01

If a CFD code/model developer is asked what experimental data he wants to validate his code or numerical model, his answer will be: "Everything, everywhere, at all times." Since this is not possible, practical, or even reasonable, the developer must understand what can be measured within the limits imposed by the test article, the test location, the test environment, and the available diagnostic equipment. At the same time, it is important for the experimentalist/diagnostician to understand what the CFD developer needs (as opposed to wants) in order to conduct a useful CFD validation experiment. If these needs are not known, it is possible to neglect easily measured quantities at locations needed by the developer, rendering the data set useless for validation purposes. It is also important for the experimentalist/diagnostician to understand what the developer is trying to validate so that the experiment can be designed to isolate (as much as possible) the effects of the particular physical phenomenon that is associated with the model to be validated. The probability of a successful validation experiment can be greatly increased if the two groups work together, each understanding the needs and limitations of the other.

  3. Validity of maximal isometric knee extension strength measurements obtained via belt-stabilized hand-held dynamometry in healthy adults.

    PubMed

    Ushiyama, Naoko; Kurobe, Yasushi; Momose, Kimito

    2017-11-01

[Purpose] To determine the validity of knee extension muscle strength measurements using belt-stabilized hand-held dynamometry with and without body stabilization compared with the gold standard isokinetic dynamometry in healthy adults. [Subjects and Methods] Twenty-nine healthy adults (mean age, 21.3 years) were included. Study parameters involved right side measurements of maximal isometric knee extension strength obtained using belt-stabilized hand-held dynamometry with and without body stabilization and the gold standard. Measurements were performed in all subjects. [Results] A moderate correlation and a fixed bias were found between measurements obtained using belt-stabilized hand-held dynamometry with body stabilization and the gold standard. No significant correlation was found, and a proportional bias was present, between measurements obtained using belt-stabilized hand-held dynamometry without body stabilization and the gold standard. The strength identified using belt-stabilized hand-held dynamometry with body stabilization may not be commensurate with the maximum strength individuals can generate; however, it reflects such strength. In contrast, the strength identified using belt-stabilized hand-held dynamometry without body stabilization does not reflect the maximum strength. Therefore, a chair should be used to stabilize the body when performing measurements of maximal isometric knee extension strength using belt-stabilized hand-held dynamometry in healthy adults. [Conclusion] Belt-stabilized hand-held dynamometry with body stabilization is more convenient than the gold standard in clinical settings.

  4. Colors of attraction: Modeling insect flight to light behavior.

    PubMed

    Donners, Maurice; van Grunsven, Roy H A; Groenendijk, Dick; van Langevelde, Frank; Bikker, Jan Willem; Longcore, Travis; Veenendaal, Elmar

    2018-06-26

    Light sources attract nocturnal flying insects, but some lamps attract more insects than others. The relation between the properties of a light source and the number of attracted insects is, however, poorly understood. We developed a model to quantify the attractiveness of light sources based on the spectral output. This model is fitted using data from field experiments that compare a large number of different light sources. We validated this model using two additional datasets, one for all insects and one excluding the numerous Diptera. Our model facilitates the development and application of light sources that attract fewer insects without the need for extensive field tests and it can be used to correct for spectral composition when formulating hypotheses on the ecological impact of artificial light. In addition, we present a tool allowing the conversion of the spectral output of light sources to their relative insect attraction based on this model. © 2018 Wiley Periodicals, Inc.
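A spectral model of this kind typically weights a lamp's spectral power distribution by an insect response curve and normalizes against a reference source. A minimal sketch with an entirely hypothetical response curve and toy spectra (not the fitted model from the paper):

```python
def relative_attraction(lamp_spd, response, reference_spd):
    """Response-weighted output of a lamp relative to a reference lamp.
    All spectra must be sampled on the same wavelength grid."""
    weighted = lambda spd: sum(p * r for p, r in zip(spd, response))
    return weighted(lamp_spd) / weighted(reference_spd)

# 5-band toy grid (UV..red); hypothetical response falling toward long wavelengths
response = [1.0, 0.8, 0.4, 0.2, 0.1]   # invented action spectrum, not the paper's
white    = [0.1, 0.2, 0.3, 0.2, 0.2]
amber    = [0.0, 0.0, 0.1, 0.5, 0.4]
print(round(relative_attraction(amber, response, white), 2))  # 0.41
```

In this toy setup the UV-poor amber lamp scores well below the white reference, the qualitative behavior that motivates insect-friendly lighting.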

  5. Estuarine Human Activities Modulate the Fate of Changjiang-derived Materials in Adjacent Seas

    NASA Astrophysics Data System (ADS)

    WU, H.

    2017-12-01

Mega constructions have been built in many river estuaries, but their environmental consequences in the adjacent coastal oceans are often overlooked. This issue was addressed with the example of the Changjiang River Estuary, where massive navigation and reclamation structures have been built in recent years. Based on model validations against cruise data and numerical scenario experiments, it is shown that the estuarine constructions profoundly affected the fates of riverine materials over a large offshore area. This is because estuarine dynamics are highly sensitive to bathymetry. Previously, the Three Gorges Dam (TGD) was thought to be responsible for some offshore environmental changes through modulating the river plume extension, but here we show that its influences are secondary. Since the TGD and the mega estuarine constructions were built during the same period, their influences can easily be conflated.

  6. Identification of sea ice types in spaceborne synthetic aperture radar data

    NASA Technical Reports Server (NTRS)

    Kwok, Ronald; Rignot, Eric; Holt, Benjamin; Onstott, R.

    1992-01-01

    This study presents an approach for identification of sea ice types in spaceborne SAR image data. The unsupervised classification approach involves cluster analysis for segmentation of the image data followed by cluster labeling based on previously defined look-up tables containing the expected backscatter signatures of different ice types measured by a land-based scatterometer. Extensive scatterometer observations and experience accumulated in field campaigns during the last 10 yr were used to construct these look-up tables. The classification approach, its expected performance, the dependence of this performance on radar system performance, and expected ice scattering characteristics are discussed. Results using both aircraft and simulated ERS-1 SAR data are presented and compared to limited field ice property measurements and coincident passive microwave imagery. The importance of an integrated postlaunch program for the validation and improvement of this approach is discussed.
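Once cluster means are computed, the cluster-labeling step described in this record reduces to a nearest-signature lookup against the scatterometer-derived tables. A minimal sketch with hypothetical backscatter values (the real look-up tables also depend on frequency, polarization, incidence angle, and season):

```python
def label_clusters(cluster_means_db, lookup_db):
    """Assign each cluster the ice type whose expected backscatter
    signature (dB) is closest to the cluster mean."""
    return [min(lookup_db, key=lambda t: abs(lookup_db[t] - m))
            for m in cluster_means_db]

# Hypothetical backscatter look-up table (dB); values are illustrative only
lookup = {"open water": -18.0, "first-year ice": -12.0, "multiyear ice": -8.0}
print(label_clusters([-17.2, -11.5, -8.9], lookup))
# ['open water', 'first-year ice', 'multiyear ice']
```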

  7. Deformation Response and Life of Metallic Composites

    NASA Technical Reports Server (NTRS)

    Lissenden, Cliff J.

    2005-01-01

    The project was initially funded for one year (for $100,764) to investigate the potential of particulate reinforced metals for aeropropulsion applications and to generate fatigue results that quantify the mean stress effect for a titanium alloy matrix material (TIMETAL 21S). The project was continued for a second year (for $85,000) to more closely investigate cyclic deformation, especially ratcheting, of the titanium alloy matrix at elevated temperature. Equipment was purchased (for $19,000) to make the experimental program feasible; this equipment included an extensometer calibrator and a multi-channel signal conditioning amplifier. The project was continued for a third year ($50,000) to conduct cyclic relaxation experiments aimed at validating the elastic-viscoelastic-viscoplastic model that NASA GRC had developed for the titanium alloy. Finally, a one-year no cost extension was granted to enable continued analysis of the experimental results and model comparisons.

  8. Two-phase flow in the cooling circuit of a cryogenic rocket engine

    NASA Astrophysics Data System (ADS)

    Preclik, D.

    1992-07-01

Transient two-phase flow was investigated for the hydrogen cooling circuit of the HM7 rocket engine. The nuclear reactor code ATHLET/THESEUS was adapted to cryogenics and applied to both principal and prototype experiments for validation and simulation purposes. The cooling-circuit two-phase flow simulation focused on the hydrogen prechilling and pump-transient phase prior to ignition. Both a single- and a multichannel model were designed and employed for a valve leakage flow, a nominal prechilling flow, and a prechilling with a subsequent pump-transient flow. The latter case was performed in order to evaluate the difference between a nominal and a delayed turbo-pump start-up. It was found that an extension of the nominal prechilling sequence on the order of 1 second is sufficient to finally provide liquid injection conditions for the hydrogen, the absence of which, as is commonly known, is undesirable for smooth ignition and engine starting transients.

  9. Reduced-order model based active disturbance rejection control of hydraulic servo system with singular value perturbation theory.

    PubMed

    Wang, Chengwen; Quan, Long; Zhang, Shijie; Meng, Hongjun; Lan, Yuan

    2017-03-01

Hydraulic servomechanisms are typical mechanical/hydraulic double-dynamics coupling systems with high-stiffness control and mismatched-uncertainty input problems, which hinder the direct application of many advanced control approaches in the hydraulic servo field. In this paper, by introducing singular perturbation theory, the original double-dynamics coupling model of the hydraulic servomechanism is reduced to an integral-chain system, so that the popular ADRC (active disturbance rejection control) technique can be applied directly to the reduced system, avoiding the high-stiffness control and mismatched-uncertainty input problems. The validity of the simplified model is analyzed and proven theoretically. A standard linear ADRC algorithm is then developed based on the obtained reduced-order model. Extensive comparative co-simulations and experiments illustrate the effectiveness of the proposed method. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.
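The reduced integral-chain plant with linear ADRC described in the abstract can be sketched on a toy double integrator. Everything numeric below (bandwidths, the disturbance d, the gain b0) is an illustrative assumption, not taken from the paper:

```python
# Minimal linear ADRC on a double-integrator plant x'' = b0*u + d,
# standing in for the reduced-order integral-chain model. The extended
# state observer (ESO) estimates position, velocity, and the "total
# disturbance", which the control law then cancels. Gains use the common
# bandwidth parameterization; all values here are illustrative choices.

def simulate_adrc(t_end=5.0, dt=1e-3, r=1.0, d=2.0, b0=1.0):
    wo, wc = 40.0, 8.0                   # observer / controller bandwidths
    l1, l2, l3 = 3*wo, 3*wo**2, wo**3    # ESO gains, poles all at -wo
    kp, kd = wc**2, 2*wc                 # critically damped PD gains
    x1 = x2 = 0.0                        # plant states
    z1 = z2 = z3 = 0.0                   # ESO states (pos, vel, total disturbance)
    for _ in range(int(t_end / dt)):
        u0 = kp*(r - z1) - kd*z2         # nominal control on estimated states
        u = (u0 - z3) / b0               # cancel the estimated disturbance
        e = x1 - z1                      # observer innovation
        z1 += dt*(z2 + l1*e)             # ESO update (explicit Euler)
        z2 += dt*(z3 + b0*u + l2*e)
        z3 += dt*(l3*e)
        x1 += dt*x2                      # plant update with unknown constant d
        x2 += dt*(b0*u + d)
    return x1, z3

x_final, dhat = simulate_adrc()
```

Despite the unmodeled constant disturbance, the output settles near the setpoint because the ESO's third state converges to the disturbance and the control law subtracts it.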

  10. Perspective: Quantum Hamiltonians for optical interactions

    NASA Astrophysics Data System (ADS)

    Andrews, David L.; Jones, Garth A.; Salam, A.; Woolley, R. Guy

    2018-01-01

The multipolar Hamiltonian of quantum electrodynamics is extensively employed in chemical and optical physics to treat rigorously the interaction of electromagnetic fields with matter. It is also widely used to evaluate intermolecular interactions. The multipolar version of the Hamiltonian is commonly obtained by carrying out a unitary transformation of the Coulomb gauge Hamiltonian that goes by the name of Power-Zienau-Woolley (PZW). Not only does the formulation provide excellent agreement with experiment and versatility in its predictive ability, but it also offers superior physical insight. Recently, the foundations and validity of the PZW Hamiltonian have been questioned, raising a concern over issues of gauge transformation and invariance, and whether observable quantities obtained from unitarily equivalent Hamiltonians are identical. Here, an in-depth analysis of theoretical foundations clarifies the issues and enables misconceptions to be identified. Claims of non-physicality are refuted: the PZW transformation and ensuing Hamiltonian are shown to rest on solid physical principles and secure theoretical ground.
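For orientation, in the electric-dipole approximation the multipolar coupling takes the standard molecular-QED textbook form (notation generic, not quoted from this paper):

```latex
H_{\mathrm{int}} \;=\; -\,\varepsilon_{0}^{-1}\sum_{\xi}\boldsymbol{\mu}(\xi)\cdot\mathbf{d}^{\perp}(\mathbf{R}_{\xi}),
```

where \(\boldsymbol{\mu}(\xi)\) is the electric dipole operator of molecule \(\xi\) and \(\mathbf{d}^{\perp}\) is the transverse electric displacement field at its position; higher multipoles and the diamagnetic term are omitted from this sketch.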

  11. Direct observation of Young’s double-slit interferences in vibrationally resolved photoionization of diatomic molecules

    PubMed Central

    Canton, Sophie E.; Plésiat, Etienne; Bozek, John D.; Rude, Bruce S.; Decleva, Piero; Martín, Fernando

    2011-01-01

    Vibrationally resolved valence-shell photoionization spectra of H2, N2 and CO have been measured in the photon energy range 20–300 eV using third-generation synchrotron radiation. Young’s double-slit interferences lead to oscillations in the corresponding vibrational ratios, showing that the molecules behave as two-center electron-wave emitters and that the associated interferences leave their trace in the angle-integrated photoionization cross section. In contrast to previous work, the oscillations are directly observable in the experiment, thereby removing any possible ambiguity related to the introduction of external parameters or fitting functions. A straightforward extension of an original idea proposed by Cohen and Fano [Cohen HD, Fano U (1966) Phys Rev 150:30] confirms this interpretation and shows that it is also valid for diatomic heteronuclear molecules. Results of accurate theoretical calculations are in excellent agreement with the experimental findings.
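The Cohen-Fano interference picture referred to above predicts, in its simplest homonuclear form, an oscillatory modulation of the atomic photoionization cross section (schematic form, not the paper's full calculation):

```latex
\sigma_{\mathrm{mol}}(k) \;\propto\; \sigma_{\mathrm{at}}(k)\,\Big[\,1+\frac{\sin kR}{kR}\,\Big],
```

where \(k\) is the photoelectron momentum and \(R\) the internuclear distance; the two-center emission acts like a double slit of separation \(R\).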

  12. Analysis of the Mean Absolute Error (MAE) and the Root Mean Square Error (RMSE) in Assessing Rounding Model

    NASA Astrophysics Data System (ADS)

    Wang, Weijie; Lu, Yanmin

    2018-03-01

Most existing collaborative filtering (CF) algorithms predict a rating, typically a decimal fraction, as the preference of an active user toward a given item, whereas the actual ratings in most data sets are integers. In this paper, we discuss and demonstrate why rounding affects these two metrics differently, and we show that rounding the predicted ratings in post-processing is necessary: it eliminates model prediction bias and improves prediction accuracy. In addition, we propose two new rounding approaches based on the predicted rating probability distribution, which round the predicted rating to an optimal integer rating and achieve better prediction accuracy than the basic rounding approach. Extensive experiments on different data sets validate the correctness of our analysis and the effectiveness of the proposed rounding approaches.
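A toy computation makes the abstract's central point concrete: rounding the same predictions can move MAE and RMSE in opposite directions. The ratings below are invented for illustration, not taken from the paper's data sets:

```python
# MAE and RMSE of raw decimal CF predictions vs. basic-rounded predictions.
# All numbers here are made up to illustrate the metric behavior.

def mae(true, pred):
    return sum(abs(t - p) for t, p in zip(true, pred)) / len(true)

def rmse(true, pred):
    return (sum((t - p) ** 2 for t, p in zip(true, pred)) / len(true)) ** 0.5

true_ratings = [4, 3, 5, 2, 4]
raw_preds    = [3.6, 3.4, 4.4, 2.4, 4.2]      # decimal CF predictions
rounded      = [round(p) for p in raw_preds]  # basic rounding to integers
```

On this example, rounding lowers MAE (small errors vanish) but raises RMSE (the one rounding miss is squared), showing why the two metrics respond differently to the same post-processing.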

  13. Technical Review of the CENWP Computational Fluid Dynamics Model of the John Day Dam Forebay

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rakowski, Cynthia L.; Serkowski, John A.; Richmond, Marshall C.

The US Army Corps of Engineers Portland District (CENWP) has developed a computational fluid dynamics (CFD) model of the John Day forebay on the Columbia River to aid in the development and design of alternatives to improve juvenile salmon passage at the John Day Project. At the request of CENWP, the Pacific Northwest National Laboratory (PNNL) Hydrology Group has conducted a technical review of CENWP's CFD model, run in the CFD solver software STAR-CD. PNNL has extensive experience developing and applying 3D CFD models run in STAR-CD for Columbia River hydroelectric projects. The John Day forebay model developed by CENWP is adequately configured and validated. The model is ready for use simulating forebay hydraulics for structural and operational alternatives. The approach and method are sound; however, CENWP has identified some improvements that need to be made for future models and for modifications to this existing model.

  14. Search for 1st Generation Leptoquarks in the eejj channel with the DZero experiment (in French)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barfuss, Anne-Fleur

    2008-09-12

Evidence of the existence of leptoquarks (LQ) would prove the validity of various extensions of the Standard Model of particle physics (SM). The search for first-generation leptoquarks presented in this dissertation was performed by analyzing a 1.02 fb⁻¹ sample of data collected by the D0 detector, selecting events with a final state comprising two light jets and two electrons. The absence of an excess of events relative to SM expectations leads to the exclusion of scalar LQ masses up to 292 GeV and vector LQ masses from 350 to 458 GeV, depending on the LQ-l-q coupling type. The great importance of a good jet energy measurement motivated the study of the instrumental backgrounds associated with the calorimeter, as well as studies of the hadronic shower energy resolution in γ + jets events.

  15. Rational design of capillary-driven flows for paper-based microfluidics.

    PubMed

    Elizalde, Emanuel; Urteaga, Raúl; Berli, Claudio L A

    2015-05-21

    The design of paper-based assays that integrate passive pumping requires a precise programming of the fluid transport, which has to be encoded in the geometrical shape of the substrate. This requirement becomes critical in multiple-step processes, where fluid handling must be accurate and reproducible for each operation. The present work theoretically investigates the capillary imbibition in paper-like substrates to better understand fluid transport in terms of the macroscopic geometry of the flow domain. A fluid dynamic model was derived for homogeneous porous substrates with arbitrary cross-sectional shapes, which allows one to determine the cross-sectional profile required for a prescribed fluid velocity or mass transport rate. An extension of the model to slit microchannels is also demonstrated. Calculations were validated by experiments with prototypes fabricated in our lab. The proposed method constitutes a valuable tool for the rational design of paper-based assays.
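Any such model must recover, in the uniform cross-section limit, the classical Lucas-Washburn law for capillary imbibition (quoted here as standard background, not as the paper's general result):

```latex
\ell(t) \;=\; \sqrt{\frac{\gamma\, r\,\cos\theta}{2\mu}\,t},
```

where \(\ell\) is the imbibition length, \(\gamma\) the surface tension, \(\theta\) the contact angle, \(\mu\) the viscosity, and \(r\) an effective pore radius; shaping the cross-section modifies this square-root-in-time behavior to program a prescribed flow rate.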

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jensen, Michael; Kollias, Pavlos; Giangrande, Scott

The Mid-latitude Continental Convective Clouds Experiment (MC3E) took place from April 22 through June 6, 2011, centered at the ARM Southern Great Plains site (http://www.arm.gov/sites/sgp) in northcentral Oklahoma. MC3E was a collaborative effort between the ARM Climate Research Facility and the National Aeronautics and Space Administration’s (NASA’s) Global Precipitation Measurement (GPM) mission Ground Validation (GV) program. The campaign leveraged the largest ground-based observing infrastructure available in the central United States, including recent upgrades through the American Recovery and Reinvestment Act of 2009, combined with an extensive sounding array, remote sensing and in situ aircraft observations, and additional radar and in situ precipitation instrumentation. The overarching goal of the campaign was to provide a three-dimensional characterization of convective clouds and precipitation for the purpose of improving the representation of convective lifecycle in atmospheric models and the reliability of satellite-based retrievals of precipitation.

  17. Sparse representation-based image restoration via nonlocal supervised coding

    NASA Astrophysics Data System (ADS)

    Li, Ao; Chen, Deyun; Sun, Guanglu; Lin, Kezheng

    2016-10-01

Sparse representation (SR) and the nonlocal technique (NLT) have shown great potential in low-level image processing. However, due to the degradation of the observed image, SR and NLT may not be accurate enough to obtain a faithful restoration result when used independently. To improve performance, a nonlocal supervised coding strategy-based NLT for image restoration is proposed in this paper. The novel method has three main contributions. First, to exploit useful nonlocal patches, a nonnegative sparse representation is introduced, whose coefficients can be utilized as supervised weights among patches. Second, a novel objective function is proposed, which integrates supervised weight learning and nonlocal sparse coding to guarantee a more promising solution. Finally, to make the minimization tractable and convergent, a numerical scheme based on iterative shrinkage thresholding is developed to solve the underdetermined inverse problem. Extensive experiments validate the effectiveness of the proposed method.
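The iterative shrinkage-thresholding scheme mentioned in the abstract can be sketched on the generic l1 sparse-coding subproblem min_x 0.5·||Ax − b||² + λ||x||₁. The dictionary A, signal b, and λ below are synthetic stand-ins, not the paper's supervised-weighted objective:

```python
# Plain ISTA for l1-regularized least squares: gradient step on the
# smooth term, then the soft-thresholding proximal step on the l1 term.
import numpy as np

def soft_threshold(v, t):
    # proximal operator of t * ||.||_1
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ista(A, b, lam, n_iter):
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        x = soft_threshold(x - A.T @ (A @ x - b) / L, lam / L)
    return x

rng = np.random.default_rng(0)
A = rng.standard_normal((40, 100)) / np.sqrt(40)   # normalized sensing matrix
x_true = np.zeros(100)
x_true[[5, 20, 77]] = [2.0, -1.5, 1.0]             # sparse ground truth
b = A @ x_true                                     # noiseless measurements
x_hat = ista(A, b, lam=0.01, n_iter=3000)
```

With an underdetermined A (40 equations, 100 unknowns), the l1 prox drives most coefficients to exactly zero, recovering the sparse support.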

  18. Applying a weed risk assessment approach to GM crops.

    PubMed

    Keese, Paul K; Robold, Andrea V; Myers, Ruth C; Weisman, Sarah; Smith, Joe

    2014-12-01

    Current approaches to environmental risk assessment of genetically modified (GM) plants are modelled on chemical risk assessment methods, which have a strong focus on toxicity. There are additional types of harms posed by plants that have been extensively studied by weed scientists and incorporated into weed risk assessment methods. Weed risk assessment uses robust, validated methods that are widely applied to regulatory decision-making about potentially problematic plants. They are designed to encompass a broad variety of plant forms and traits in different environments, and can provide reliable conclusions even with limited data. The knowledge and experience that underpin weed risk assessment can be harnessed for environmental risk assessment of GM plants. A case study illustrates the application of the Australian post-border weed risk assessment approach to a representative GM plant. This approach is a valuable tool to identify potential risks from GM plants.

  19. A numerical algorithm for MHD of free surface flows at low magnetic Reynolds numbers

    NASA Astrophysics Data System (ADS)

    Samulyak, Roman; Du, Jian; Glimm, James; Xu, Zhiliang

    2007-10-01

We have developed a numerical algorithm and computational software for the study of magnetohydrodynamics (MHD) of free surface flows at low magnetic Reynolds numbers. The governing system of equations is a coupled hyperbolic-elliptic system in moving and geometrically complex domains. The numerical algorithm employs the method of front tracking and the Riemann problem for material interfaces, second order Godunov-type hyperbolic solvers, and the embedded boundary method for the elliptic problem in complex domains. The numerical algorithm has been implemented as an MHD extension of FronTier, a hydrodynamic code with free interface support. The code is applicable for numerical simulations of free surface flows of conductive liquids or weakly ionized plasmas. The code has been validated by comparing numerical simulations of a liquid metal jet in a non-uniform magnetic field with experiments and theory. Simulations of the Muon Collider/Neutrino Factory target are also discussed.
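The hyperbolic-elliptic structure mentioned above has a standard form in the low-magnetic-Reynolds-number approximation, where the magnetic field is imposed and the electric potential satisfies a Poisson equation (generic notation, not necessarily the paper's):

```latex
\rho\Big(\frac{\partial \mathbf{u}}{\partial t}+\mathbf{u}\cdot\nabla\mathbf{u}\Big)
  = -\nabla p + \mathbf{J}\times\mathbf{B}, \qquad
\mathbf{J} = \sigma\,(-\nabla\varphi + \mathbf{u}\times\mathbf{B}),
```

```latex
\nabla\cdot\mathbf{J} = 0 \;\Rightarrow\;
\nabla\cdot(\sigma\nabla\varphi) = \nabla\cdot\big(\sigma\,\mathbf{u}\times\mathbf{B}\big).
```

The fluid equations supply the hyperbolic part handled by the Godunov solvers; the charge-conservation Poisson equation for \(\varphi\) is the elliptic part handled by the embedded boundary method.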

  20. The Ionosphere's Pocket Litter: Exploiting Crowd-Sourced Observations

    NASA Astrophysics Data System (ADS)

    Miller, E. S.; Frissell, N. A.; Kaeppler, S. R.; Demajistre, R.; Knuth, A. A.

    2015-12-01

One of the biggest challenges in developing and testing our understanding of the ionosphere is acquiring data that characterize its latitudinal and longitudinal variability. While there are extensive networks of ground sites that sample the vertical distribution, coverage over the oceans and in parts of the southern hemisphere is rather poor. Our ability to validate ionospheric models is limited by the lack of point measurements, particularly measurements that characterize horizontal gradients. In this talk, we discuss and demonstrate the use of various types of crowd-sourced information that enable us to extend coverage over these regions. We will discuss new sources of these data, concepts for new experiments, and the use of these data in assimilative models. We note that there are new, low-cost options for obtaining data that broaden participation beyond the aeronomy/ionospheric community.

  1. Single-accelerometer-based daily physical activity classification.

    PubMed

    Long, Xi; Yin, Bin; Aarts, Ronald M

    2009-01-01

In this study, a single tri-axial accelerometer placed on the waist was used to record acceleration data for human physical activity classification. The data collection involved 24 subjects performing daily real-life activities in a naturalistic environment without researchers' intervention. For the purpose of assessing customers' daily energy expenditure, walking, running, cycling, driving, and sports were chosen as target activities for classification. This study compared a Bayesian classification approach with a decision-tree-based approach. A Bayes classifier has the advantage of being more extensible, requiring little effort in classifier retraining and software updates upon further expansion or modification of the target activities. Principal component analysis was applied to remove the correlation among features and to reduce the feature vector dimension. Experiments using leave-one-subject-out and 10-fold cross-validation protocols revealed a classification accuracy of approximately 80%, comparable with that obtained by a decision-tree classifier.
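The leave-one-subject-out protocol used in the study can be sketched in a few lines: every sample from one subject forms the test fold while the remaining subjects train the classifier. The tiny 1-D "feature" and nearest-mean classifier below are placeholders for illustration, not the paper's PCA/Bayes pipeline:

```python
# Leave-one-subject-out cross-validation on invented toy data.
# Rows are (subject_id, feature, activity_label).

data = [
    (1, 0.20, "walk"), (1, 1.90, "run"), (1, 0.30, "walk"),
    (2, 0.25, "walk"), (2, 2.10, "run"), (2, 1.80, "run"),
    (3, 0.15, "walk"), (3, 2.00, "run"),
]

def nearest_mean_predict(train, x):
    # classify x by the closest per-class mean of the training features
    means = {}
    for _, feat, label in train:
        means.setdefault(label, []).append(feat)
    return min(means, key=lambda lbl: abs(sum(means[lbl]) / len(means[lbl]) - x))

correct = total = 0
for subject in {s for s, _, _ in data}:
    train = [row for row in data if row[0] != subject]   # all other subjects
    test = [row for row in data if row[0] == subject]    # held-out subject
    for _, feat, label in test:
        correct += nearest_mean_predict(train, feat) == label
        total += 1

accuracy = correct / total
```

Splitting by subject rather than by sample prevents a subject's own data from leaking into training, which is why the study reports it alongside plain 10-fold cross-validation.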

  2. Prediction of enzymatic pathways by integrative pathway mapping

    PubMed Central

    Wichelecki, Daniel J; San Francisco, Brian; Zhao, Suwen; Rodionov, Dmitry A; Vetting, Matthew W; Al-Obaidi, Nawar F; Lin, Henry; O'Meara, Matthew J; Scott, David A; Morris, John H; Russel, Daniel; Almo, Steven C; Osterman, Andrei L

    2018-01-01

The functions of most proteins are yet to be determined. The function of an enzyme is often defined by its interacting partners, including its substrate and product, and its role in larger metabolic networks. Here, we describe a computational method that predicts the functions of orphan enzymes by organizing them into a linear metabolic pathway. Given candidate enzyme and metabolite pathway members, this aim is achieved by finding those pathways that satisfy structural and network restraints implied by varied input information, including that from virtual screening, chemoinformatics, genomic context analysis, and ligand-binding experiments. We demonstrate this integrative pathway mapping method by predicting the L-gulonate catabolic pathway in Haemophilus influenzae Rd KW20. The prediction was subsequently validated experimentally by enzymology, crystallography, and metabolomics. Integrative pathway mapping by satisfaction of structural and network restraints is extensible to molecular networks in general and thus formally bridges the gap between structural biology and systems biology. PMID:29377793

  3. A complete solution classification and unified algorithmic treatment for the one- and two-step asymmetric S-transverse mass event scale statistic

    NASA Astrophysics Data System (ADS)

    Walker, Joel W.

    2014-08-01

The MT2, or "s-transverse mass," statistic was developed to associate a parent mass scale to a missing transverse energy signature, given that escaping particles are generally expected in pairs, while collider experiments are sensitive to just a single transverse momentum vector sum. This document focuses on the generalized extension of that statistic to asymmetric one- and two-step decay chains, with arbitrary child particle masses and upstream missing transverse momentum. It provides a unified theoretical formulation, complete solution classification, taxonomy of critical points, and technical algorithmic prescription for treatment of the event scale. An implementation of the described algorithm is available for download and is also a deployable component of the author's selection cut software package AEACuS (Algorithmic Event Arbiter and Cut Selector). Appendices address combinatoric event assembly, algorithm validation, and complete pseudocode.
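For reference, the symmetric special case of the statistic is the standard s-transverse mass of the original literature; the notation below is generic, not the paper's own:

```latex
M_{T2}^{2} \;=\;
\min_{\vec q_{T}^{\,(1)}+\vec q_{T}^{\,(2)}\,=\,\vec p_{T}^{\,\mathrm{miss}}}
\max\Big\{ M_{T}^{2}\big(\vec p_{T}^{\,(1)},\vec q_{T}^{\,(1)}\big),\;
           M_{T}^{2}\big(\vec p_{T}^{\,(2)},\vec q_{T}^{\,(2)}\big) \Big\},
```

```latex
M_{T}^{2}(\vec p_{T},\vec q_{T}) \;=\; m_{v}^{2}+m_{\chi}^{2}
  + 2\big(E_{T}^{\,p}\,E_{T}^{\,q}-\vec p_{T}\cdot\vec q_{T}\big),
\qquad E_{T}=\sqrt{m^{2}+|\vec p_{T}|^{2}},
```

where the minimization runs over all splittings of the missing transverse momentum between the two invisible children; the asymmetric generalization treated in the paper allows the two branches to have different masses and decay-chain lengths.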

  4. Towards agile large-scale predictive modelling in drug discovery with flow-based programming design principles.

    PubMed

    Lampa, Samuel; Alvarsson, Jonathan; Spjuth, Ola

    2016-01-01

Predictive modelling in drug discovery is challenging to automate as it often contains multiple analysis steps and might involve cross-validation and parameter tuning that create complex dependencies between tasks. With large-scale data or when using computationally demanding modelling methods, e-infrastructures such as high-performance or cloud computing are required, adding to the existing challenges of fault-tolerant automation. Workflow management systems can aid in many of these challenges, but the currently available systems are lacking in the functionality needed to enable agile and flexible predictive modelling. We here present an approach inspired by elements of the flow-based programming paradigm, implemented as an extension of the Luigi system, which we name SciLuigi. We also discuss the experiences from using the approach when modelling a large set of biochemical interactions using a shared computer cluster.

  5. Mu2e transport solenoid prototype tests results

    DOE PAGES

    Lopes, Mauricio L.; G. Ambrosio; DiMarco, J.; ...

    2016-02-08

The Fermilab Mu2e experiment has been developed to search for evidence of charged lepton flavor violation through the direct conversion of muons into electrons. The transport solenoid is an s-shaped magnet which guides the muons from the source to the stopping target. It consists of fifty-two superconducting coils arranged in twenty-seven coil modules. A full-size prototype coil module, with all the features of a typical module of the full assembly, was successfully manufactured by a collaboration between INFN-Genoa and Fermilab. The prototype contains two coils that can be powered independently. In order to validate the design, the magnet went through an extensive test campaign. Warm tests included magnetic measurements with a vibrating stretched wire, along with electrical and dimensional checks. The cold performance was then evaluated by a series of power tests as well as temperature-dependence and minimum-quench-energy studies.

  6. Photo nuclear energy loss term for muon-nucleus interactions based on xi scaling model of QCD

    NASA Technical Reports Server (NTRS)

    Roychoudhury, R.

    1985-01-01

EMC (European Muon Collaboration) experiments discovered a significant deviation from unity of the ratio of the structure functions of iron and the deuteron. It was established that the quark-parton distributions in nuclei differ from the corresponding distributions in the free nucleon. It was examined whether these results have an effect on the calculation of the photonuclear energy loss term for muon-nucleus interactions. Though the EMC and SLAC data were restricted to a rather large q² region, it is expected that the deviation would persist even in the low-q² domain. For the ratio of the iron and deuteron structure functions, a rather naive least-squares fit of the form R(x) = a + bx was taken, and the formula is assumed to be valid for the whole q² region in the absence of any knowledge of R(x) at small q².

  7. The Fibromyalgia Impact Questionnaire (FIQ): a review of its development, current version, operating characteristics and uses.

    PubMed

    Bennett, R

    2005-01-01

    The Fibromyalgia Impact Questionnaire (FIQ) was developed in the late 1980s by clinicians at Oregon Health & Science University in an attempt to capture the total spectrum of problems related to fibromyalgia and the responses to therapy. It was first published in 1991 and since that time has been extensively used as an index of therapeutic efficacy. Overall, it has been shown to have a credible construct validity, reliable test-retest characteristics and a good sensitivity in demonstrating therapeutic change. The original questionnaire was modified in 1997 and 2002, to reflect ongoing experience with the instrument and to clarify the scoring system. The latest version of the FIQ can be found at the web site of the Oregon Fibromyalgia Foundation (www.myalgia.com/FIQ/FIQ). The FIQ has now been translated into eight languages, and the translated versions have shown operating characteristics similar to the English version.

  8. Validity and reliability of an instrumented leg-extension machine for measuring isometric muscle strength of the knee extensors.

    PubMed

    Ruschel, Caroline; Haupenthal, Alessandro; Jacomel, Gabriel Fernandes; Fontana, Heiliane de Brito; Santos, Daniela Pacheco dos; Scoz, Robson Dias; Roesler, Helio

    2015-05-20

Isometric muscle strength of the knee extensors has been assessed for estimating performance, evaluating progress during physical training, and investigating the relationship between isometric and dynamic/functional performance. To assess the validity and reliability of an adapted leg-extension machine for measuring isometric knee-extensor force. Validity (concurrent approach) and reliability (test-retest approach) study. University laboratory. 70 healthy men and women aged between 20 and 30 y (39 in the validity study and 31 in the reliability study). Intraclass correlation coefficient (ICC) values were calculated for the maximum voluntary isometric torque of the knee extensors at 30°, 60°, and 90°, measured with the prototype and with an isokinetic dynamometer (ICC2,1, validity study) and measured with the prototype in test and retest sessions scheduled 48 h to 72 h apart (ICC1,1, reliability study). In the validity analysis, the prototype showed good agreement for measurements at 30° (ICC2,1 = .75, SEM = 18.2 Nm) and excellent agreement for measurements at 60° (ICC2,1 = .93, SEM = 9.6 Nm) and at 90° (ICC2,1 = .94, SEM = 8.9 Nm). In the reliability analysis, between-day ICC1,1 values were good to excellent, ranging from .88 to .93. The standard error of measurement and the minimal detectable difference based on test-retest ranged from 11.7 Nm to 18.1 Nm and from 32.5 Nm to 50.1 Nm, respectively, for the 3 analyzed knee angles. The validity and repeatability of the prototype for measuring isometric muscle strength were shown to be good or excellent, depending on the knee joint angle analyzed. The new instrument, which is relatively low in cost and easy to transport compared with an isokinetic dynamometer, is valid and provides consistent data on the isometric strength of the knee extensors, and can therefore be used for practical, clinical, and research purposes.
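The ICC2,1 agreement index used in the validity analysis can be computed from two-way ANOVA mean squares. A minimal sketch with invented toy ratings (not the study's torque data):

```python
# ICC(2,1): two-way random effects, absolute agreement, single measurement.
# scores is a list of subject rows, each containing k raters' ratings.

def icc_2_1(scores):
    n, k = len(scores), len(scores[0])
    grand = sum(sum(row) for row in scores) / (n * k)
    row_means = [sum(row) / k for row in scores]
    col_means = [sum(scores[i][j] for i in range(n)) / n for j in range(k)]
    msr = k * sum((m - grand) ** 2 for m in row_means) / (n - 1)   # subjects
    msc = n * sum((m - grand) ** 2 for m in col_means) / (k - 1)   # raters
    sse = sum((scores[i][j] - row_means[i] - col_means[j] + grand) ** 2
              for i in range(n) for j in range(k))
    mse = sse / ((n - 1) * (k - 1))                                # residual
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

perfect_agreement = [[3, 3], [5, 5], [2, 2], [4, 4]]     # raters identical
systematic_offset = [[1, 2], [2, 3], [3, 4], [4, 5]]     # rater 2 scores +1
icc_perfect = icc_2_1(perfect_agreement)
icc_offset = icc_2_1(systematic_offset)
```

Because ICC(2,1) measures absolute agreement, a consistent +1 offset between devices lowers the coefficient even though the rankings match perfectly.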

  9. An Overlooked Population in Community College: International Students' (In)Validation Experiences With Academic Advising

    ERIC Educational Resources Information Center

    Zhang, Yi

    2016-01-01

    Objective: Guided by validation theory, this study aims to better understand the role that academic advising plays in international community college students' adjustment. More specifically, this study investigated how academic advising validates or invalidates their academic and social experiences in a community college context. Method: This…

  10. Validation Experiences and Persistence among Urban Community College Students

    ERIC Educational Resources Information Center

    Barnett, Elisabeth A.

    2007-01-01

    The purpose of this research was to examine the extent to which urban community college students' experiences with validation by faculty contributed to their sense of integration in college and whether this, in turn, contributed to their intent to persist in college. This study focused on urban community college students' validating experiences…

  11. You don't have to believe everything you read: background knowledge permits fast and efficient validation of information.

    PubMed

    Richter, Tobias; Schroeder, Sascha; Wöhrmann, Britta

    2009-03-01

    In social cognition, knowledge-based validation of information is usually regarded as relying on strategic and resource-demanding processes. Research on language comprehension, in contrast, suggests that validation processes are involved in the construction of a referential representation of the communicated information. This view implies that individuals can use their knowledge to validate incoming information in a routine and efficient manner. Consistent with this idea, Experiments 1 and 2 demonstrated that individuals are able to reject false assertions efficiently when they have validity-relevant beliefs. Validation processes were carried out routinely even when individuals were put under additional cognitive load during comprehension. Experiment 3 demonstrated that the rejection of false information occurs automatically and interferes with affirmative responses in a nonsemantic task (epistemic Stroop effect). Experiment 4 also revealed complementary interference effects of true information with negative responses in a nonsemantic task. These results suggest the existence of fast and efficient validation processes that protect mental representations from being contaminated by false and inaccurate information.

  12. Integral Full Core Multi-Physics PWR Benchmark with Measured Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Forget, Benoit; Smith, Kord; Kumar, Shikhar

In recent years, the importance of modeling and simulation has been highlighted extensively in the DOE research portfolio, with concrete examples in nuclear engineering in the CASL and NEAMS programs. These research efforts and similar efforts worldwide aim at the development of high-fidelity multi-physics analysis tools for the simulation of current and next-generation nuclear power reactors. Like all analysis tools, verification and validation are essential to guarantee proper functioning of the software and methods employed. The current approach relies mainly on the validation of single-physics phenomena (e.g., critical experiments, flow loops), and there is a lack of relevant multi-physics benchmark measurements that are necessary to validate the high-fidelity methods being developed today. This work introduces a new multi-cycle full-core Pressurized Water Reactor (PWR) depletion benchmark based on two operational cycles of a commercial nuclear power plant that provides a detailed description of fuel assemblies, burnable absorbers, in-core fission detectors, core loading and re-loading patterns. This benchmark enables analysts to develop extremely detailed reactor core models that can be used for testing and validation of coupled neutron transport, thermal-hydraulics, and fuel isotopic depletion. The benchmark also provides measured reactor data for Hot Zero Power (HZP) physics tests, boron letdown curves, and three-dimensional in-core flux maps from 58 instrumented assemblies. The benchmark description is now available online and has been used by many groups. However, much work remains to be done on the quantification of uncertainties and modeling sensitivities. This work aims to address these deficiencies and make this benchmark a true non-proprietary international benchmark for the validation of high-fidelity tools.
This report details the BEAVRS uncertainty quantification for the first two cycles of operation and serves as the final report of the project.

  13. Validation of a dye stain assay for vaginally inserted HEC-filled microbicide applicators

    PubMed Central

    Katzen, Lauren L.; Fernández-Romero, José A.; Sarna, Avina; Murugavel, Kailapuri G.; Gawarecki, Daniel; Zydowsky, Thomas M.; Mensch, Barbara S.

    2011-01-01

    Background The reliability and validity of self-reports of vaginal microbicide use are questionable given the explicit understanding that participants are expected to comply with study protocols. Our objective was to optimize the Population Council's previously validated dye stain assay (DSA) and related procedures, and establish predictive values for the DSA's ability to identify vaginally inserted single-use, low-density polyethylene microbicide applicators filled with hydroxyethylcellulose gel. Methods Applicators, inserted by 252 female sex workers enrolled in a microbicide feasibility study in Southern India, served as positive controls for optimization and validation experiments. Prior to validation, optimal dye concentration and staining time were ascertained. Three validation experiments were conducted to determine sensitivity, specificity, negative predictive values and positive predictive values. Results The dye concentration of 0.05% (w/v) FD&C Blue No. 1 Granular Food Dye and staining time of five seconds were determined to be optimal and were used for the three validation experiments. There were a total of 1,848 possible applicator readings across validation experiments; 1,703 (92.2%) applicator readings were correct. On average, the DSA performed with 90.6% sensitivity, 93.9% specificity, and had a negative predictive value of 93.8% and a positive predictive value of 91.0%. No statistically significant differences between experiments were noted. Conclusions The DSA was optimized and successfully validated for use with single-use, low-density polyethylene applicators filled with hydroxyethylcellulose (HEC) gel. We recommend including the DSA in future microbicide trials involving vaginal gels in order to identify participants who have low adherence to dosing regimens. In doing so, we can develop strategies to improve adherence as well as investigate the association between product use and efficacy. PMID:21992983
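The four performance measures reported for the DSA follow directly from a 2x2 confusion table. The counts below are invented for illustration; the study's averages (90.6% sensitivity, 93.9% specificity, 91.0% PPV, 93.8% NPV) come from its own validation experiments:

```python
# Standard diagnostic-test metrics from confusion counts:
# tp = inserted applicators read as inserted, fn = inserted read as not,
# tn = non-inserted read as not inserted, fp = non-inserted read as inserted.

def assay_metrics(tp, fp, tn, fn):
    return {
        "sensitivity": tp / (tp + fn),  # fraction of inserted correctly flagged
        "specificity": tn / (tn + fp),  # fraction of non-inserted correctly cleared
        "ppv": tp / (tp + fp),          # positive readings that are true positives
        "npv": tn / (tn + fn),          # negative readings that are true negatives
    }

m = assay_metrics(tp=90, fp=10, tn=95, fn=5)
```

Note that PPV and NPV, unlike sensitivity and specificity, depend on how many truly inserted applicators are in the sample, which is why validation experiments fix the ratio of positive to negative controls.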

  14. Extension of nanoconfined DNA: Quantitative comparison between experiment and theory

    NASA Astrophysics Data System (ADS)

    Iarko, V.; Werner, E.; Nyberg, L. K.; Müller, V.; Fritzsche, J.; Ambjörnsson, T.; Beech, J. P.; Tegenfeldt, J. O.; Mehlig, K.; Westerlund, F.; Mehlig, B.

    2015-12-01

    The extension of DNA confined to nanochannels has been studied intensively and in detail. However, quantitative comparisons between experiments and model calculations are difficult because most theoretical predictions involve undetermined prefactors, and because the model parameters (contour length, Kuhn length, effective width) are difficult to compute reliably, leading to substantial uncertainties. Here we use a recent asymptotically exact theory for the DNA extension in the "extended de Gennes regime" that allows us to compare experimental results with theory. For this purpose, we performed experiments measuring the mean DNA extension and its standard deviation while varying the channel geometry, dye intercalation ratio, and ionic strength of the buffer. The experimental results agree very well with theory at high ionic strengths, indicating that the model parameters are reliable. At low ionic strengths the agreement is poorer; we discuss possible reasons for this discrepancy. In principle, our approach allows us to measure the Kuhn length and the effective width of a single DNA molecule and more generally of semiflexible polymers in solution.

  15. Three-dimensional deformation response of a NiTi shape memory helical-coil actuator during thermomechanical cycling: experimentally validated numerical model

    NASA Astrophysics Data System (ADS)

    Dhakal, B.; Nicholson, D. E.; Saleeb, A. F.; Padula, S. A., II; Vaidyanathan, R.

    2016-09-01

    Shape memory alloy (SMA) actuators often operate under a complex state of stress for an extended number of thermomechanical cycles in many aerospace and engineering applications. Hence, it becomes important to account for multi-axial stress states and deformation characteristics (which evolve with thermomechanical cycling) when calibrating any SMA model for implementation in large-scale simulation of actuators. To this end, the present work is focused on the experimental validation of an SMA model calibrated for the transient and cyclic evolutionary behavior of shape memory Ni49.9Ti50.1, for the actuation of axially loaded helical-coil springs. The approach requires both experimental and computational aspects to appropriately assess the thermomechanical response of these multi-dimensional structures. As such, an instrumented and controlled experimental setup was assembled to obtain temperature, torque, degree of twist and extension, while controlling end constraints during heating and cooling of an SMA spring under a constant externally applied axial load. The computational component assesses the capabilities of a general, multi-axial, SMA material-modeling framework, calibrated for Ni49.9Ti50.1 with regard to its usefulness in the simulation of SMA helical-coil spring actuators. Axial extension, being the primary response, was examined on an axially-loaded spring with multiple active coils. Two different conditions of end boundary constraint were investigated in both the numerical simulations as well as the validation experiments: Case (1) where the loading end is restrained against twist (and the resulting torque measured as the secondary response) and Case (2) where the loading end is free to twist (and the degree of twist measured as the secondary response). The present study focuses on the transient and evolutionary response associated with the initial isothermal loading and the subsequent thermal cycles under applied constant axial load. 
The experimental results for the helical-coil actuator under the two boundary conditions agree, within experimental error, with their counterparts in the numerical simulations. Both simulation and experiment demonstrate similar transient and evolutionary behavior in the deformation response under the complex, inhomogeneous, multi-axial stress state and large deformations of the helical-coil actuator. This response, although substantially different in magnitude, exhibited evolutionary characteristics similar to those of the simple, uniaxial, homogeneous stress state of the isobaric tensile test results used for model calibration. No significant difference in axial displacement (primary response) magnitude was observed between Cases (1) and (2) for the number of cycles investigated here. The simulated secondary responses of the two cases evolved in a manner similar to the experimental validation of the respective cases.

  16. Goals and Status of the NASA Juncture Flow Experiment

    NASA Technical Reports Server (NTRS)

    Rumsey, Christopher L.; Morrison, Joseph H.

    2016-01-01

    The NASA Juncture Flow experiment is a new effort whose focus is attaining validation data in the juncture region of a wing-body configuration. The experiment is designed specifically for the purpose of CFD validation. Current turbulence models routinely employed by Reynolds-averaged Navier-Stokes CFD are inconsistent in their prediction of corner flow separation in aircraft juncture regions, so experimental data in the near-wall region of such a configuration will be useful both for assessment and for turbulence model improvement. This paper summarizes the Juncture Flow effort to date, including preliminary risk-reduction experiments already conducted and planned future experiments. The requirements and challenges associated with conducting a quality validation test are discussed.

  17. Use of Demonstration Gardens in Extension: Challenges and Benefits

    ERIC Educational Resources Information Center

    Glen, Charlotte D.; Moore, Gary E.; Jayaratne, K. S. U.; Bradley, Lucy K.

    2014-01-01

    Extension agents' use of demonstration gardens was studied to determine how gardens are employed in horticultural programming, perceived benefits and challenges of using gardens for Extension programming, and desired competencies. Gardens are primarily used to enhance educational efforts by providing hands-on learning experiences. Greatest…

  18. The Virtual Extension Annual Conference: Addressing Contemporary Professional Development Needs

    ERIC Educational Resources Information Center

    Franz, Nancy K.; Brekke, Robin; Coates, Deb; Kress, Cathann; Hlas, Julie

    2014-01-01

    Extension systems are experimenting with new models for conducting professional development to enhance staff competence and other returns on professional development investments. The ISUEO virtual annual conference provides a successful flipped classroom model of asynchronous and synchronous learning events for conducting an Extension annual…

  19. CFD validation experiments at the Lockheed-Georgia Company

    NASA Technical Reports Server (NTRS)

    Malone, John B.; Thomas, Andrew S. W.

    1987-01-01

    Information is given in viewgraph form on computational fluid dynamics (CFD) validation experiments at the Lockheed-Georgia Company. Topics covered include validation experiments on a generic fighter configuration, a transport configuration, and a generic hypersonic vehicle configuration; computational procedures; surface and pressure measurements on wings; laser velocimeter measurements of a multi-element airfoil system; the flowfield around a stiffened airfoil; laser velocimeter surveys of a circulation control wing; circulation control for high lift; and high angle of attack aerodynamic evaluations.

  20. Observations on CFD Verification and Validation from the AIAA Drag Prediction Workshops

    NASA Technical Reports Server (NTRS)

    Morrison, Joseph H.; Kleb, Bil; Vassberg, John C.

    2014-01-01

    The authors provide observations from the AIAA Drag Prediction Workshops that have spanned over a decade and from a recent validation experiment at NASA Langley. These workshops provide an assessment of the predictive capability of forces and moments, focused on drag, for transonic transports. It is very difficult to manage the consistency of results in a workshop setting to perform verification and validation at the scientific level, but it may be sufficient to assess it at the level of practice. Observations thus far: 1) due to simplifications in the workshop test cases, wind tunnel data are not necessarily the "correct" results that CFD should match, 2) an average of core CFD data is not necessarily a better estimate of the true solution, as it is merely an average of other solutions and has many coupled sources of variation, 3) outlier solutions should be investigated and understood, and 4) the DPW series does not have the systematic build-up and definition on both the computational and experimental side that is required for detailed verification and validation. Several observations regarding the importance of the grid, effects of physical modeling, benefits of open forums, and guidance for validation experiments are discussed. The increased variation in results when predicting regions of flow separation and increased variation due to interaction effects, e.g., fuselage and horizontal tail, point out the need for validation data sets for these important flow phenomena. Experiences with a recent validation experiment at NASA Langley are included to provide guidance on validation experiments.

  1. Reliability and validity of clinical tests to assess the anatomical integrity of the cervical spine in adults with neck pain and its associated disorders: Part 1-A systematic review from the Cervical Assessment and Diagnosis Research Evaluation (CADRE) Collaboration.

    PubMed

    Lemeunier, Nadège; da Silva-Oolup, S; Chow, N; Southerst, D; Carroll, L; Wong, J J; Shearer, H; Mastragostino, P; Cox, J; Côté, E; Murnaghan, K; Sutton, D; Côté, P

    2017-09-01

    To determine the reliability and validity of clinical tests to assess the anatomical integrity of the cervical spine in adults with neck pain and its associated disorders. We updated the systematic review of the 2000-2010 Bone and Joint Decade Task Force on Neck Pain and its Associated Disorders. We also searched the literature to identify studies on the reliability and validity of Doppler velocimetry for the evaluation of cervical arteries. Two independent reviewers screened and critically appraised studies. We conducted a best evidence synthesis of low risk of bias studies and ranked the phases of investigations using the classification proposed by Sackett and Haynes. We screened 9022 articles and critically appraised 8 studies; all 8 studies had low risk of bias (three reliability and five validity Phase II-III studies). Preliminary evidence suggests that the extension-rotation test may be reliable and has adequate validity to rule out pain arising from facet joints. The evidence suggests variable reliability and preliminary validity for the evaluation of cervical radiculopathy including neurological examination (manual motor testing, dermatomal sensory testing, deep tendon reflexes, and pathological reflex testing), Spurling's and the upper limb neurodynamic tests. No evidence was found for Doppler velocimetry. Little evidence exists to support the use of clinical tests to evaluate the anatomical integrity of the cervical spine in adults with neck pain and its associated disorders. We found preliminary evidence to support the use of the extension-rotation test, neurological examination, Spurling's and the upper limb neurodynamic tests.

  2. Fracture mechanics validity limits

    NASA Technical Reports Server (NTRS)

    Lambert, Dennis M.; Ernst, Hugo A.

    1994-01-01

    Fracture behavior is characterized by a dramatic loss of strength compared to elastic deformation behavior. Fracture parameters have been developed, and each exhibits a range within which it is valid for predicting growth. Each is limited by the assumptions made in its development: all are defined within a specific context. For example, the stress intensity parameter, K, and the crack driving force, G, are derived using an assumption of linear elasticity. To use K or G, the zone of plasticity must be small compared to the physical dimensions of the object being loaded. This ensures an elastic response, and in this context K and G work well. Rice's J-integral has been used beyond the limits imposed on K and G. J requires an assumption of nonlinear elasticity, which is not characteristic of real material behavior but is thought to be a reasonable approximation if unloading is kept to a minimum. In addition, the constraint cannot change dramatically (typically, crack extension is limited to ten percent of the initial remaining ligament length). Rice et al. investigated the properties required of J-type parameters, J(sub x), and showed that the time rate, dJ(sub x)/dt, must not be a function of the crack extension rate, da/dt. Ernst devised the modified-J parameter, J(sub M), which meets this criterion. J(sub M) correlates fracture data to much higher crack growth than does J. Ultimately, a limit to the validity of J(sub M) is anticipated, and this has been estimated to occur at a crack extension of about 40 percent of the initial remaining ligament length. None of the various parameters can be expected to describe fracture in an environment of gross plasticity, in which case the process is better described by deformation parameters, e.g., stress and strain. In the current study, various schemes to identify the onset of plasticity-dominated behavior, i.e., the end of fracture mechanics validity, are presented. Each validity limit parameter is developed in detail; data are then presented and the various schemes for establishing the validity limit are compared. The selected limiting parameter is applied to a set of fracture data, showing the improvement in correlation gained.

  3. Measuring suicidality using the personality assessment inventory: a convergent validity study with federal inmates.

    PubMed

    Patry, Marc W; Magaletta, Philip R

    2015-02-01

    Although numerous studies have examined the psychometric properties and clinical utility of the Personality Assessment Inventory in correctional contexts, only two studies to date have specifically focused on suicide ideation. This article examines the convergent validity of the Suicide Ideation Scale and the Suicide Potential Index on the Personality Assessment Inventory in a large, nontreatment sample of male and female federal inmates (N = 1,120). The data indicated robust validity support for both the Suicide Ideation Scale and Suicide Potential Index, which were each correlated with a broad group of validity indices representing multiple assessment modalities. Recommendations for future research to build upon these findings through replication and extension are made. © The Author(s) 2014.

  4. DBS-LC-MS/MS assay for caffeine: validation and neonatal application.

    PubMed

    Bruschettini, Matteo; Barco, Sebastiano; Romantsik, Olga; Risso, Francesco; Gennai, Iulian; Chinea, Benito; Ramenghi, Luca A; Tripodi, Gino; Cangemi, Giuliana

    2016-09-01

    DBS might be an appropriate microsampling technique for therapeutic drug monitoring of caffeine in infants. Nevertheless, its application presents several issues that still limit its use. This paper describes a validated DBS-LC-MS/MS method for caffeine. The results of the method validation showed a hematocrit dependence. In the analysis of 96 paired plasma and DBS clinical samples, caffeine levels measured in DBS were statistically significantly lower than in plasma, but the observed differences were independent of hematocrit. These results clearly showed the need for extensive validation with real-life samples for DBS-based methods. DBS-LC-MS/MS can be considered a good alternative to traditional methods for therapeutic drug monitoring or PK studies in preterm infants.

  5. Validation Test Results for Orthogonal Probe Eddy Current Thruster Inspection System

    NASA Technical Reports Server (NTRS)

    Wincheski, Russell A.

    2007-01-01

    Recent nondestructive evaluation efforts within NASA have focused on an inspection system for the detection of intergranular cracking originating in the relief radius of Primary Reaction Control System (PRCS) thrusters. Of particular concern is deep cracking in this area, which could lead to combustion leakage in the event of through-wall cracking from the relief radius into an acoustic cavity of the combustion chamber. In order to reliably detect such defects while ensuring minimal false positives during inspection, the Orthogonal Probe Eddy Current (OPEC) system has been developed and an extensive validation study performed. This report describes the validation procedure, sample set, and inspection results, as well as comparing validation flaws with the response from naturally occurring damage.

  6. Validation of Skills, Knowledge and Experience in Lifelong Learning in Europe

    ERIC Educational Resources Information Center

    Ogunleye, James

    2012-01-01

    The paper examines systems of validation of skills and experience as well as the main methods/tools currently used for validating skills and knowledge in lifelong learning. The paper uses mixed methods--a case study research and content analysis of European Union policy documents and frameworks--as a basis for this research. The selection of the…

  7. Changes and Issues in the Validation of Experience

    ERIC Educational Resources Information Center

    Triby, Emmanuel

    2005-01-01

    This article analyses the main changes in the rules for validating experience in France and of what they mean for society. It goes on to consider university validation practices. The way in which this system is evolving offers a chance to identify the issues involved for the economy and for society, with particular attention to the expected…

  8. Involving lay community researchers in epidemiological research: experiences from a seroprevalence study among sub-Saharan African migrants.

    PubMed

    Nöstlinger, Christiana; Loos, Jasna

    2016-01-01

    Community-based participatory research (CBPR) has received considerable attention during past decades as a method to increase community ownership in research and prevention. We discuss its application to epidemiological research using the case of second-generation surveillance conducted among sub-Saharan African (SSA) migrants in Antwerp city. To inform evidence-based prevention planning for this target group, this HIV-prevalence study used two-stage time-location sampling preceded by formative research. Extensive collaborative partnerships were built with community organizations, a Community Advisory Board provided input throughout the project, and community researchers were trained to participate in all phases of the seroprevalence study. Valid oral fluid samples for HIV testing were collected among 717 SSA migrants and linked to behavioural data assessed through an anonymous survey between December 2013 and August 2014. A qualitative content analysis of various data sources (extensive field notes, minutes of intervision, and training protocols) collected at 77 data collection visits in 51 settings was carried out to describe experiences with challenges and opportunities inherent to the CBPR approach at three crucial stages of the research process: building collaborative partnerships; implementing the study; dissemination of findings including prevention planning. The results show that CBPR is feasible in conducting scientifically sound epidemiological research, but certain requirements need to be in place. These include among others sufficient resources to train, coordinate, and supervise community researchers; continuity in the implementation; transparency about decision-taking and administrative procedures, and willingness to share power and control over the full research process. CBPR contributed to empowering community researchers on a personal level, and to create greater HIV prevention demand in the SSA communities.

  9. Transport phenomena in solidification processing of functionally graded materials

    NASA Astrophysics Data System (ADS)

    Gao, Juwen

    A combined numerical and experimental study of the transport phenomena during solidification processing of metal matrix composite functionally graded materials (FGMs) is conducted in this work. A multiphase transport model for the solidification of metal-matrix composite FGMs has been developed that accounts for macroscopic particle segregation due to liquid-particle flow and particle-solid interactions. An experimental study has also been conducted to gain physical insight as well as to validate the model. A novel method to measure the particle volume fraction in situ using fiber optic probes is developed for transparent analogue solidification systems. The model is first applied to one-dimensional pure matrix FGM solidification under gravity or a centrifugal field and is extensively validated against the experimental results. The mechanisms for the formation of the particle concentration gradient are identified. Two-dimensional solidification of pure matrix FGM with convection is then studied using the model as well as experiments. The interaction among convection flow, solidification process and particle transport is demonstrated. The results show the importance of convection in the formation of the particle concentration gradient. Then, simulations for alloy FGM solidification are carried out for unidirectional solidification as well as two-dimensional solidification with convection. The interplay among heat and species transport, convection and particle motion is investigated. Finally, future theoretical and experimental work is outlined.

  10. Changes in monosaccharides, organic acids and amino acids during Cabernet Sauvignon wine ageing based on a simultaneous analysis using gas chromatography-mass spectrometry.

    PubMed

    Zhang, Xin-Ke; Lan, Yi-Bin; Zhu, Bao-Qing; Xiang, Xiao-Feng; Duan, Chang-Qing; Shi, Ying

    2018-01-01

    Monosaccharides, organic acids and amino acids are important flavour-related components in wines. The aim of this article is to develop and validate a method that could simultaneously analyse these compounds in wine based on silylation derivatisation and gas chromatography-mass spectrometry (GC-MS), to apply this method to the investigation of the changes in these compounds, and to speculate on their influence on Cabernet Sauvignon wine flavour during wine ageing. This work presented a new approach for wine analysis and provided more information concerning red wine ageing. This method could simultaneously and quantitatively analyse 2 monosaccharides, 8 organic acids and 13 amino acids in wine. A validation experiment showed good linearity, sensitivity, reproducibility and recovery. Multiple derivatives of five amino acids were found, but their effects on quantitative analysis were negligible, except for methionine. The evolution pattern of each category was different, and we speculated that the corresponding mechanisms involving microorganism activities, physical interactions and chemical reactions had a great correlation with red wine flavours during ageing. Simultaneous quantitative analysis of monosaccharides, organic acids and amino acids in wine was feasible and reliable, and this method has extensive application prospects. © 2017 Society of Chemical Industry.

  11. Dense 3D Face Alignment from 2D Video for Real-Time Use

    PubMed Central

    Jeni, László A.; Cohn, Jeffrey F.; Kanade, Takeo

    2018-01-01

    To enable real-time, person-independent 3D registration from 2D video, we developed a 3D cascade regression approach in which facial landmarks remain invariant across pose over a range of approximately 60 degrees. From a single 2D image of a person’s face, a dense 3D shape is registered in real time for each frame. The algorithm utilizes a fast cascade regression framework trained on high-resolution 3D face-scans of posed and spontaneous emotion expression. The algorithm first estimates the location of a dense set of landmarks and their visibility, then reconstructs face shapes by fitting a part-based 3D model. Because no assumptions are required about illumination or surface properties, the method can be applied to a wide range of imaging conditions that include 2D video and uncalibrated multi-view video. The method has been validated in a battery of experiments that evaluate its precision of 3D reconstruction, extension to multi-view reconstruction, temporal integration for videos and 3D head-pose estimation. Experimental findings strongly support the validity of real-time, 3D registration and reconstruction from 2D video. The software is available online at http://zface.org. PMID:29731533

  12. Heater Validation for the NEXT-C Hollow Cathodes

    NASA Technical Reports Server (NTRS)

    Verhey, Timothy R.; Soulas, George C.; Mackey, Jonathan A.

    2018-01-01

    Swaged cathode heaters whose design was successfully demonstrated under a prior flight project are to be provided by the NASA Glenn Research Center for the NEXT-C ion thruster being fabricated by Aerojet Rocketdyne. Extensive requalification activities were performed to validate process controls that had to be re-established or revised because systemic changes prevented reuse of the past approaches. A development batch of heaters was successfully fabricated based on the new process controls. Acceptance and cyclic life testing of multiple discharge- and neutralizer-sized heaters extracted from the development batch was initiated in August 2016, with the last heater completing testing in April 2017. Cyclic life testing results substantially exceeded the NEXT-C thruster requirement as well as all past experience for GRC-fabricated units. The heaters demonstrated an ultimate cyclic life capability of 19,050 to 33,500 cycles. A qualification batch of heaters is now being fabricated using the finalized process controls. A set of six heaters will undergo acceptance and cyclic testing to verify conformance to the behavior observed with the development heaters. The heaters for flight use will then be provided to the contractor from the remainder of the qualification batch. This paper summarizes the fabrication process control activities and the acceptance and life testing of the development heater units.

  13. Detection of greenhouse-gas-induced climatic change. Progress report, July 1, 1994--July 31, 1995

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jones, P.D.; Wigley, T.M.L.

    1995-07-21

    The objective of this research is to assemble and analyze instrumental climate data and to develop and apply climate models as a basis for detecting greenhouse-gas-induced climatic change and for validating General Circulation Models. In addition to changes due to variations in anthropogenic forcing, including greenhouse gas and aerosol concentration changes, the global climate system exhibits a high degree of internally-generated and externally-forced natural variability. To detect the anthropogenic effect, its signal must be isolated from the "noise" of this natural climatic variability. A high-quality, spatially extensive data base is required to define the noise and its spatial characteristics. To facilitate this, available land and marine data bases will be updated and expanded. The data will be analyzed to determine the potential effects on climate of greenhouse gas and aerosol concentration changes and other factors. Analyses will be guided by a variety of models, from simple energy balance climate models to coupled atmosphere-ocean General Circulation Models. These analyses are oriented towards obtaining early evidence of anthropogenic climatic change that would lead to confirmation, rejection, or modification of model projections, and towards the statistical validation of General Circulation Model control runs and perturbation experiments.

  14. Facial expression: An under-utilised tool for the assessment of welfare in mammals.

    PubMed

    Descovich, Kris A; Wathan, Jennifer; Leach, Matthew C; Buchanan-Smith, Hannah M; Flecknell, Paul; Farningham, David; Vick, Sarah-Jane

    2017-01-01

    Animal welfare is a key issue for industries that use or impact upon animals. The accurate identification of welfare states is particularly relevant to the field of bioscience, where the 3Rs framework encourages refinement of experimental procedures involving animal models. The assessment and improvement of welfare states in animals depend on reliable and valid measurement tools. Behavioral measures (activity, attention, posture and vocalization) are frequently used because they are immediate and non-invasive; however, no single indicator can yield a complete picture of the internal state of an animal. Facial expressions are extensively studied in humans as a measure of psychological and emotional experiences but are infrequently used in animal studies, with the exception of emerging research on pain behavior. In this review, we discuss current evidence for facial representations of underlying affective states, and how communicative or functional expressions can be useful within welfare assessments. Validated tools for measuring facial movement are outlined, and the potential of expressions as honest signals is discussed, alongside other challenges and limitations to facial expression measurement within the context of animal welfare. We conclude that facial expression determination in animals is a useful but underutilized measure that complements existing tools in the assessment of welfare.

  15. Partial ASL extensions for stochastic programming.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gay, David

    2010-03-31

    Partially completed extensions for stochastic programming to the AMPL/solver interface library (ASL), for modeling and experimenting with stochastic recourse problems. This software is not primarily for military applications.

  16. 77 FR 77069 - Commission Information Collection Activities (FERC-730); Comment Request; Extension

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-12-31

    ... reliability and to reduce the cost of delivered power by reducing transmission congestion. Order No. 679 also... information, including the validity of the methodology and assumptions used; (3) ways to enhance the quality...

  17. Asteroids as Calibration Standards in the Thermal Infrared -- Applications and Results from ISO

    NASA Astrophysics Data System (ADS)

    Müller, T. G.; Lagerros, J. S. V.

    Asteroids have been used extensively as calibration sources for ISO. We summarise the asteroid observational parameters in the thermal infrared and explain the important modelling aspects. Ten selected asteroids were extensively used for the absolute photometric calibration of ISOPHOT in the far-IR. Additionally, the point-like and bright asteroids turned out to be of great interest for many technical tests and calibration aspects. They have been used for testing the calibration for SWS and LWS, the validation of relative spectral response functions of different bands, for colour correction and filter leak tests. Currently, there is a strong emphasis on ISO cross-calibration, where the asteroids contribute in many fields. Well known asteroids have also been seen serendipitously in the CAM Parallel Mode and the PHT Serendipity Mode, allowing for validation and improvement of the photometric calibration of these special observing modes.

  18. Development of a Three-Dimensional, Unstructured Material Response Design Tool

    NASA Technical Reports Server (NTRS)

    Schulz, Joseph C.; Stern, Eric C.; Muppidi, Suman; Palmer, Grant E.; Schroeder, Olivia

    2017-01-01

A preliminary verification and validation of a new material response model is presented. This model, Icarus, is intended to serve as a design tool for the thermal protection systems of re-entry vehicles. Currently, the capability of the model is limited to simulating the pyrolysis of a material as a result of the radiative and convective surface heating imposed on the material from the surrounding high enthalpy gas. Since the major focus behind the development of Icarus has been model extensibility, the hope is that additional physics can be quickly added. This extensibility is critical since thermal protection systems are becoming increasingly complex, e.g., woven carbon polymers. Additionally, as a three-dimensional, unstructured, finite-volume model, Icarus is capable of modeling complex geometries. In this paper, the mathematical and numerical formulation is presented, followed by a discussion of the software architecture and some preliminary verification and validation studies.

  19. Preliminary Report on Oak Ridge National Laboratory Testing of Drake/ACSS/MA2/E3X

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Irminger, Philip; King, Daniel J.; Herron, Andrew N.

    2016-01-01

A key to industry acceptance of a new technology is extensive validation in field trials. The Powerline Conductor Accelerated Test facility (PCAT) at Oak Ridge National Laboratory (ORNL) is specifically designed to evaluate the performance and reliability of a new conductor technology under real-world conditions. The facility is set up to capture large amounts of data during testing. General Cable used the ORNL PCAT facility to validate the performance of TransPowr with E3X Technology, a standard overhead conductor with an inorganic, high-emissivity, low-absorptivity surface coating. Extensive testing has demonstrated a significant improvement in conductor performance across a wide range of operating temperatures, indicating that E3X Technology can provide a reduction in temperature, a reduction in sag, and an increase in ampacity when applied to the surface of any overhead conductor. This report provides initial results of that testing.

  20. Assessment of Hybrid High-Order methods on curved meshes and comparison with discontinuous Galerkin methods

    NASA Astrophysics Data System (ADS)

    Botti, Lorenzo; Di Pietro, Daniele A.

    2018-10-01

    We propose and validate a novel extension of Hybrid High-Order (HHO) methods to meshes featuring curved elements. HHO methods are based on discrete unknowns that are broken polynomials on the mesh and its skeleton. We propose here the use of physical frame polynomials over mesh elements and reference frame polynomials over mesh faces. With this choice, the degree of face unknowns must be suitably selected in order to recover on curved meshes the same convergence rates as on straight meshes. We provide an estimate of the optimal face polynomial degree depending on the element polynomial degree and on the so-called effective mapping order. The estimate is numerically validated through specifically crafted numerical tests. All test cases are conducted considering two- and three-dimensional pure diffusion problems, and include comparisons with discontinuous Galerkin discretizations. The extension to agglomerated meshes with curved boundaries is also considered.

  1. Run control techniques for the Fermilab DART data acquisition system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Oleynik, G.; Engelfried, J.; Mengel, L.

    1995-10-01

DART is the high-speed, Unix-based data acquisition system being developed by the Fermilab Computing Division in collaboration with eight High Energy Physics experiments. This paper describes DART run-control, which implements flexible, distributed, extensible and portable paradigms for the control and monitoring of data acquisition systems. We discuss the unique and interesting aspects of the run-control - why we chose the concepts we did, the benefits we have seen from the choices we made, as well as our experiences in deploying and supporting it for experiments during their commissioning and sub-system testing phases. We emphasize the software and techniques we believe are extensible to future use, and potential future modifications and extensions for those we feel are not.

  2. Performance Evaluation of Extension Education Centers in Universities Based on the Balanced Scorecard

    ERIC Educational Resources Information Center

    Wu, Hung-Yi; Lin, Yi-Kuei; Chang, Chi-Hsiang

    2011-01-01

    This study aims at developing a set of appropriate performance evaluation indices mainly based on balanced scorecard (BSC) for extension education centers in universities by utilizing multiple criteria decision making (MCDM). Through literature reviews and experts who have real practical experiences in extension education, adequate performance…

  3. Extension in Planned Social Change, the Indian Experience.

    ERIC Educational Resources Information Center

    Rudramoorthy, B.

    Extension, the process of extending the knowledge of recent advances in science and technology to the people who need it, has been emphasized in India since the introduction of the Community Development Programme in 1952. Community development involves two distinct processes--extension education and community organization--and has had four…

  4. Agricultural Extension Services and the Issue of Equity in Agricultural Development.

    ERIC Educational Resources Information Center

    Monu, Erasmus D.

    1981-01-01

    Reviews experiments in Kenya and Nigeria attempting to modify the progressive-farmer strategy. Success requires that extension services recognize small farmers' ability to make their own rational decisions and involve farmers in planning and implementing extension programs. Available from: Rural Sociological Society, 325 Morgan Hall, University of…

  5. University to Community and Back: Creating a Customer Focused Process.

    ERIC Educational Resources Information Center

    Martin-Milius, Tara

    This paper examines ways in which university extension programs can become more customer-focused in the courses and services that they deliver, focusing on the experiences of the University of California Extension, Santa Cruz. Extension programs can increase their effectiveness by: (1) establishing partnerships with other service organizations,…

  6. Using Non-Extension Volunteering as an Experiential Learning Activity for Extension Professionals

    ERIC Educational Resources Information Center

    Andrews, Kevin B.; Lockett, Landry L.

    2013-01-01

    Extension professionals can gain much-needed competencies in volunteer administration through experiential learning by participating in volunteer activities. Experiential learning is a means of behavior change that allows the individual learner to reflect on, abstract, and apply their experiences to new situations. This article expands on…

  7. Tools for Formative Evaluation: Gathering the Information Necessary for Program Improvement

    ERIC Educational Resources Information Center

    Jayaratne, K. S. U.

    2016-01-01

    New Extension educators experience a steep learning curve when attempting to develop effective Extension programs. Formative evaluation is helpful to new, and experienced, Extension educators in determining the changes necessary for making programs more effective. Formative evaluation is an essential part of program evaluation. However, its use…

  8. Competence Challenges of Demand-Led Agricultural Research and Extension in Uganda

    ERIC Educational Resources Information Center

    Kibwika, P.; Wals, A. E. J.; Nassuna-Musoke, M. G.

    2009-01-01

Governments and development agencies in Sub-Saharan Africa are experimenting with alternative approaches within the innovation systems paradigm to enhance the relevance of agricultural research and extension to the poverty eradication agenda. Uganda, for example, has recently shifted from supply-driven to demand-led agricultural research and extension.…

  9. Modeling and Simulation Behavior Validation Methodology and Extension Model Validation for the Individual Soldier

    DTIC Science & Technology

    2015-03-01

domains. Major model functions include: • Ground combat: light and heavy forces. • Air mobile forces. • Future forces. • Fixed-wing and rotary-wing... Constraints: • Study must be completed no later than 31 December 2014. • Entity behavior limited to select COMBATXXI Mobility, Unmanned Aerial System... and SQL backend, as well as any open application programming interface (API). • Allows data transparency and data-driven navigation through the model

  10. Bioindicators of contaminant exposure and effect in aquatic and terrestrial monitoring

    USGS Publications Warehouse

    Melancon, Mark J.; Hoffman, David J.; Rattner, Barnett A.; Burton, G. Allen; Cairns, John

    2003-01-01

Bioindicators of contaminant exposure presently used in environmental monitoring are discussed. Some have been extensively field-validated and are already in routine application. Included are (1) inhibition of brain or blood cholinesterase by anticholinesterase pesticides, (2) induction of hepatic microsomal cytochromes P450 by chemicals such as PAHs and PCBs, (3) reproductive problems such as terata and eggshell thinning, and (4) aberrations of hemoglobin synthesis, including the effects of lead and of certain chlorinated hydrocarbons. Many studies on DNA damage and of histopathological effects, particularly in the form of tumors, have already been completed. There are presently numerous other opportunities for field validation. Bile metabolites of contaminants in fish reveal exposure to contaminants that might otherwise be difficult to detect or quantify. Bile analysis is beginning to be extended to species other than fishes. Assessment of oxidative damage and immune competence appear to be valuable biomarkers, needing only additional field validation for wider use. The use of metallothioneins as biomarkers depends on the development of convenient, inexpensive methodology that provides information not available from measurements of metal ions. The use of stress proteins as biomarkers depends on development of convenient, inexpensive methodology and field validation. Gene arrays and proteomics hold promise as bioindicators for contaminant exposure or effect, particularly because of the large amount of data that could be generated, but they still need extensive development and testing.

  11. Reliability and validity of the neurorehabilitation experience questionnaire for inpatients.

    PubMed

    Kneebone, Ian I; Hull, Samantha L; McGurk, Rhona; Cropley, Mark

    2012-09-01

Patient-centered measures of the inpatient neurorehabilitation experience are needed to assess services. The objective of this study was to develop a valid and reliable Neurorehabilitation Experience Questionnaire (NREQ) to assess whether neurorehabilitation inpatients experience service elements important to them. Based on the themes established in prior qualitative research, adopting questions from established inventories and using a literature review, a draft version of the NREQ was generated. Focus groups and interviews were conducted with 9 patients and 26 staff from neurological rehabilitation units to establish face validity. Then, 70 patients were recruited to complete the NREQ to ascertain reliability (internal and test-retest) and concurrent validity. On the basis of the face validity testing, several modifications were made to the draft version of the NREQ. Subsequently, internal reliability (time 1 α = .76, time 2 α = .80), test-retest reliability (r = 0.70), and concurrent validity (r = 0.32 and r = 0.56) were established for the revised version. Whereas responses were associated with positive mood (r = 0.30), they appeared not to be influenced by negative mood, age, education, length of stay, sex, functional independence, or whether a participant had been a patient on a unit previously. Preliminary validation of the NREQ suggests promise for use with its target population.

  12. Bayesian cross-entropy methodology for optimal design of validation experiments

    NASA Astrophysics Data System (ADS)

    Jiang, X.; Mahadevan, S.

    2006-07-01

    An important concern in the design of validation experiments is how to incorporate the mathematical model in the design in order to allow conclusive comparisons of model prediction with experimental output in model assessment. The classical experimental design methods are more suitable for phenomena discovery and may result in a subjective, expensive, time-consuming and ineffective design that may adversely impact these comparisons. In this paper, an integrated Bayesian cross-entropy methodology is proposed to perform the optimal design of validation experiments incorporating the computational model. The expected cross entropy, an information-theoretic distance between the distributions of model prediction and experimental observation, is defined as a utility function to measure the similarity of two distributions. A simulated annealing algorithm is used to find optimal values of input variables through minimizing or maximizing the expected cross entropy. The measured data after testing with the optimum input values are used to update the distribution of the experimental output using Bayes theorem. The procedure is repeated to adaptively design the required number of experiments for model assessment, each time ensuring that the experiment provides effective comparison for validation. The methodology is illustrated for the optimal design of validation experiments for a three-leg bolted joint structure and a composite helicopter rotor hub component.
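The adaptive loop described in this record - propose candidate input values, score each by the expected cross entropy between the model-prediction and experimental-observation distributions, and accept or reject moves via simulated annealing - can be sketched as follows. This is an illustrative toy, not the authors' implementation: the Gaussian distributions, the linear response functions `model_prediction` and `experiment_distribution`, the input bounds, and the cooling schedule are all assumptions, and the closed-form cross entropy between two univariate normals stands in for the paper's expected cross-entropy utility.

```python
import math
import random

def cross_entropy_normal(mu_p, sig_p, mu_q, sig_q):
    # Closed-form cross entropy H(p, q) of two univariate normals:
    # H = 0.5*ln(2*pi*sig_q^2) + (sig_p^2 + (mu_p - mu_q)^2) / (2*sig_q^2)
    return (0.5 * math.log(2 * math.pi * sig_q ** 2)
            + (sig_p ** 2 + (mu_p - mu_q) ** 2) / (2 * sig_q ** 2))

def model_prediction(x):
    # Hypothetical computational model: predicted mean and std dev at input x.
    return 2.0 * x + 1.0, 0.5

def experiment_distribution(x):
    # Hypothetical distribution of the experimental output at input x.
    return 2.2 * x + 0.8, 0.7

def design_experiment(bounds=(0.0, 10.0), iters=2000, seed=0):
    """Simulated annealing over the input variable, minimizing the cross
    entropy between model and experiment distributions (a stand-in for the
    expected cross-entropy utility in the paper)."""
    rng = random.Random(seed)
    lo, hi = bounds

    def objective(x):
        mp, sp = model_prediction(x)
        me, se = experiment_distribution(x)
        return cross_entropy_normal(mp, sp, me, se)

    x = rng.uniform(lo, hi)
    f = objective(x)
    best_x, best_f = x, f
    for k in range(iters):
        temp = max(1e-3, 1.0 - k / iters)          # linear cooling schedule
        cand = min(hi, max(lo, x + rng.gauss(0, 0.5)))  # clamped random step
        fc = objective(cand)
        # Metropolis acceptance: always take improvements, sometimes worse moves.
        if fc < f or rng.random() < math.exp((f - fc) / temp):
            x, f = cand, fc
            if f < best_f:
                best_x, best_f = x, f
    return best_x, best_f
```

In the adaptive procedure the record describes, the data measured at the returned input would then update the experimental-output distribution via Bayes' theorem before the next design iteration.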

  13. The Grand Banks ERS-1 SAR wave spectra validation experiment

    NASA Technical Reports Server (NTRS)

    Vachon, P. W.; Dobson, F. W.; Smith, S. D.; Anderson, R. J.; Buckley, J. R.; Allingham, M.; Vandemark, D.; Walsh, E. J.; Khandekar, M.; Lalbeharry, R.

    1993-01-01

    As part of the ERS-1 validation program, the ERS-1 Synthetic Aperture Radar (SAR) wave spectra validation experiment was carried out over the Grand Banks of Newfoundland (Canada) in Nov. 1991. The principal objective of the experiment was to obtain complete sets of wind and wave data from a variety of calibrated instruments to validate SAR measurements of ocean wave spectra. The field program activities are described and the rather complex wind and wave conditions which were observed are summarized. Spectral comparisons with ERS-1 SAR image spectra are provided. The ERS-1 SAR is shown to have measured swell and range traveling wind seas, but did not measure azimuth traveling wind seas at any time during the experiment. Results of velocity bunching forward mapping and new measurements of the relationship between wind stress and sea state are also shown.

  14. Extension of a Kinetic Approach to Chemical Reactions to Electronic Energy Levels and Reactions Involving Charged Species With Application to DSMC Simulations

    NASA Technical Reports Server (NTRS)

    Liechty, Derek S.

    2013-01-01

The ability to compute rarefied, ionized hypersonic flows is becoming more important as missions such as Earth reentry, landing high mass payloads on Mars, and the exploration of the outer planets and their satellites are being considered. Recently introduced molecular-level chemistry models that predict equilibrium and nonequilibrium reaction rates using only kinetic theory and fundamental molecular properties are extended in the current work to include electronic energy level transitions and reactions involving charged particles. These extensions are shown to agree favorably with reported transition and reaction rates from the literature for near-equilibrium conditions. Also, the extensions are applied to the second flight of the Project FIRE flight experiment at 1634 seconds with a Knudsen number of 0.001 at an altitude of 76.4 km. In order to accomplish this, NASA's direct simulation Monte Carlo code DAC was rewritten to include the ability to simulate charge-neutral ionized flows, take advantage of the recently introduced chemistry model, and include the extensions presented in this work. The 1634 second data point was chosen so that comparisons could be made with a CFD solution. The Knudsen number at this point in time is such that the DSMC simulations are still tractable and the CFD computations are at the edge of what is considered valid because, although near-transitional, the flow is still considered to be continuum. It is shown that the inclusion of electronic energy levels in the DSMC simulation is necessary for flows of this nature and is required for comparison to the CFD solution. The flow field solutions are also post-processed by the nonequilibrium radiation code HARA to compute the radiative portion of the heating, which is then compared to the total heating measured in flight.

  15. New true-triaxial rock strength criteria considering intrinsic material characteristics

    NASA Astrophysics Data System (ADS)

    Zhang, Qiang; Li, Cheng; Quan, Xiaowei; Wang, Yanning; Yu, Liyuan; Jiang, Binsong

    2018-02-01

A reasonable strength criterion should reflect the hydrostatic pressure effect, the minimum principal stress effect, and the intermediate principal stress effect. The former two effects can be described by the meridian curves, and the last mainly depends on the Lode angle dependence function. Among the three conventional strength criteria, i.e. the Mohr-Coulomb (MC), Hoek-Brown (HB), and Exponent (EP) criteria, the difference between the generalized compression and extension strengths of the EP criterion first increases and then decreases, tending to zero when the hydrostatic pressure is large enough. This is in accordance with intrinsic rock strength characteristics. Moreover, the critical hydrostatic pressure I_c corresponding to the maximum difference between generalized compression and extension strength can be easily adjusted by the minimum principal stress influence parameter K. The exponent function is therefore a more reasonable meridian curve, which reflects the hydrostatic pressure effect well and is employed to describe the generalized compression and extension strengths. Meanwhile, three Lode angle dependence functions, L_MN, L_WW, and L_YMH, which unconditionally satisfy the convexity and differentiability requirements, are employed to represent the intermediate principal stress effect. Recognizing that the actual strength surface should be located between the generalized compression and extension surfaces, new true-triaxial criteria are proposed by combining the two states of the EP criterion through a Lode angle dependence function at the same Lode angle. The proposed new true-triaxial criteria have the same strength parameters as the EP criterion. Finally, 14 groups of triaxial test data are employed to validate the proposed criteria. The results show that the three new true-triaxial exponent criteria, especially the Exponent Willam-Warnke (EPWW) criterion, give much lower misfits, which illustrates that the EP criterion and L_WW have more reasonable meridian and deviatoric function forms, respectively. The proposed new true-triaxial strength criteria can provide a theoretical foundation for stability analysis and optimization of support design in rock engineering.

  16. Modeling fission product vapor transport in the Falcon facility

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shepherd, I.M.; Drossinos, Y.; Benson, C.G.

    1995-05-01

An extensive database of aerosol experiments exists and has been used for checking aerosol transport codes. Data for fission product vapor transport are harder to find. Some qualitative data are available, but the Falcon thermal gradient tube tests carried out at AEA Technology's laboratories in Winfrith, England, mark the first serious attempt to provide a set of experiments suitable for the validation of codes that predict the transport and condensation of realistic mixtures of fission product vapors. Four of these have been analyzed to check how well the computer code VICTORIA can predict the most important phenomena. Of the four experiments studied, two are reference cases (FAL-17 and FAL-19), one is a case without boric acid (FAL-18), and the other is run in a reducing atmosphere (FAL-20). The results show that once the vapors condense onto aerosols, VICTORIA can predict their deposition rather well. The dominant mechanism is thermophoresis, and each element deposits with more or less the same deposition velocity. The behavior of the vapors is harder to interpret. Essentially, it is important to know the temperature at which each element condenses. It is clear from the measurements that this temperature changed from test to test, caused mostly by the different speciation as the composition of the carrier gas and the relative concentrations of other fission products changed. Only in the test with a steam atmosphere and without boric acid was the assumption valid that most of the iodine is cesium iodide and most of the cesium is cesium hydroxide. In general, VICTORIA predicts that, with the exception of cesium, there will be less variation in the speciation (and, hence, in the deposition) between tests than is in fact observed. VICTORIA underpredicts the volatility of most elements, partly as a consequence of the ideal solution assumption and partly because of an overestimation of vapor/aerosol interactions.

  17. Applying visual attention theory to transportation safety research and design: evaluation of alternative automobile rear lighting systems.

    PubMed

    McIntyre, Scott E; Gugerty, Leo

    2014-06-01

This field experiment takes a novel approach in applying methodologies and theories of visual search to the subject of conspicuity in automobile rear lighting. Traditional rear lighting research has not used the visual search paradigm in experimental design. It is our claim that the visual search design uniquely uncovers visual attention processes operating when drivers search the visual field that current designs fail to capture. This experiment is a validation and extension of previous simulator research on this same topic and demonstrates that detection of red automobile brake lamps will be improved if tail lamps are another color (in this test, amber) rather than the currently mandated red. Results indicate that when drivers miss brake lamp onset in low ambient light, reaction time and error are reduced in detecting the presence and absence of red brake lamps with multiple lead vehicles when tail lamps are not red compared to current rear lighting, which mandates red tail lamps. This performance improvement is attributed to efficient visual processing that automatically segregates tail (amber) and brake (red) lamp colors into distractors and targets, respectively. Copyright © 2014 Elsevier Ltd. All rights reserved.

  18. Theory and experiments in model-based space system anomaly management

    NASA Astrophysics Data System (ADS)

    Kitts, Christopher Adam

This research program consists of an experimental study of model-based reasoning methods for detecting, diagnosing and resolving anomalies that occur when operating a comprehensive space system. Using a first principles approach, several extensions were made to the existing field of model-based fault detection and diagnosis in order to develop a general theory of model-based anomaly management. Based on this theory, a suite of algorithms was developed and computationally implemented in order to detect, diagnose and identify resolutions for anomalous conditions occurring within an engineering system. The theory and software suite were experimentally verified and validated in the context of a simple but comprehensive, student-developed, end-to-end space system, which was developed specifically to support such demonstrations. This space system consisted of the Sapphire microsatellite, which was launched in 2001, several geographically distributed and Internet-enabled communication ground stations, and a centralized mission control complex located in the Space Technology Center in the NASA Ames Research Park. Results of both ground-based and on-board experiments demonstrate the speed, accuracy, and value of the algorithms compared to human operators, and they highlight future improvements required to mature this technology.

  19. Is Best-Worst Scaling Suitable for Health State Valuation? A Comparison with Discrete Choice Experiments.

    PubMed

    Krucien, Nicolas; Watson, Verity; Ryan, Mandy

    2017-12-01

Health utility indices (HUIs) are widely used in economic evaluation. The best-worst scaling (BWS) method is being used to value dimensions of HUIs. However, little is known about the properties of this method. This paper investigates the validity of the BWS method to develop HUIs, comparing it to another ordinal valuation method, the discrete choice experiment (DCE). Using a parametric approach, we find a low level of concordance between the two methods, with evidence of preference reversals. BWS responses are subject to decision biases, with significant effects on individuals' preferences. Non-parametric tests indicate that BWS data have lower stability, monotonicity and continuity compared to DCE data, suggesting that BWS provides lower quality data. As a consequence, for both theoretical and technical reasons, practitioners should be cautious both about using the BWS method to measure health-related preferences and about using HUIs based on BWS data. Given existing evidence, it seems that the DCE method is a better method, at least because its limitations (and measurement properties) have been extensively researched. Copyright © 2016 John Wiley & Sons, Ltd.

  20. Empirical testing of criteria for dissociative schizophrenia.

    PubMed

    Laferrière-Simard, Marie-Christine; Lecomte, Tania; Ahoundova, Lola

    2014-01-01

    This study examined the validity of dissociative schizophrenia diagnostic criteria. In the first phase, 50 participants with a psychotic disorder were administered the Dissociative Experiences Scale and the Childhood Trauma Questionnaire to identify those with dissociative characteristics. In the second phase, we selected those who had a score of 15 or above on the Dissociative Experiences Scale. Fifteen of these participants were evaluated thoroughly with the Structured Clinical Interview for DSM-IV Axis I, Structured Clinical Interview for DSM-IV Axis II, and Structured Clinical Interview for DSM-IV Dissociative Disorders to determine whether they met the criteria for dissociative schizophrenia and to generate a clinical description. Our results indicated that 24% of the individuals we tested met these criteria. We propose making mandatory 1 of the 3 dissociative symptoms of the criteria to eliminate people with only nonspecific symptoms (e.g., extensive comorbidity). According to this modified criterion, 14% of our sample would receive a diagnosis of dissociative schizophrenia. However, a more comprehensive look at the clinical picture begs the question of whether dissociative schizophrenia is truly present in every person meeting the criteria. We discuss the relevance of creating a new schizophrenia subtype and offer recommendations for clinicians.
