Validation and upgrading of physically based mathematical models
NASA Technical Reports Server (NTRS)
Duval, Ronald
1992-01-01
The validation of the results of physically-based mathematical models against experimental results was discussed. Systematic techniques are used for: (1) isolating subsets of the simulator mathematical model and comparing the response of each subset to its experimental response for the same input conditions; (2) evaluating the response error to determine whether it is the result of incorrect parameter values, incorrect structure of the model subset, or unmodeled external effects of cross coupling; and (3) modifying and upgrading the model and its parameter values to determine the most physically appropriate combination of changes.
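The isolate-compare-attribute loop described above can be illustrated with a minimal numerical sketch. The first-order subsystem, parameter values, and signals below are hypothetical stand-ins for an actual simulator subset and its test data; only the workflow (compare the nominal response, re-fit parameters, inspect the remaining residual) mirrors the technique.

```python
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(0)

# Hypothetical experimental response of one isolated model subset
# (a first-order lag driven by a recorded step input u).
t = np.linspace(0.0, 10.0, 501)
u = np.ones_like(t)                      # recorded input applied to the subset
tau_true, gain_true = 1.8, 2.1           # "real" plant (unknown to the modeler)
y_exp = gain_true * (1.0 - np.exp(-t / tau_true)) + rng.normal(0, 0.02, t.size)

def subset_model(params, t, u):
    """Candidate model of the subset: first-order lag with gain and time constant."""
    gain, tau = params
    return gain * (1.0 - np.exp(-t / tau)) * u

# Step 1: compare the current model (nominal parameters) to the experiment.
nominal = np.array([2.0, 1.0])
resid_nominal = y_exp - subset_model(nominal, t, u)
print("nominal RMS error:", np.sqrt(np.mean(resid_nominal ** 2)))

# Steps 2-3: re-fit the parameters; if the residual stays large and structured,
# suspect model structure or unmodeled cross coupling rather than parameter values.
fit = least_squares(lambda p: y_exp - subset_model(p, t, u), nominal)
resid_fit = y_exp - subset_model(fit.x, t, u)
print("fitted params:", fit.x, "fitted RMS error:", np.sqrt(np.mean(resid_fit ** 2)))
```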
A Model-Based Approach to Support Validation of Medical Cyber-Physical Systems.
Silva, Lenardo C; Almeida, Hyggo O; Perkusich, Angelo; Perkusich, Mirko
2015-10-30
Medical Cyber-Physical Systems (MCPS) are context-aware, life-critical systems with patient safety as the main concern, demanding rigorous processes for validation to guarantee user requirement compliance and specification-oriented correctness. In this article, we propose a model-based approach for early validation of MCPS, focusing on promoting reusability and productivity. It enables system developers to build MCPS formal models based on a library of patient and medical device models, and simulate the MCPS to identify undesirable behaviors at design time. Our approach has been applied to three different clinical scenarios to evaluate its reusability potential for different contexts. We have also validated our approach through an empirical evaluation with developers to assess productivity and reusability. Finally, our models have been formally verified considering functional and safety requirements and model coverage.
NASA GPM GV Science Implementation
NASA Technical Reports Server (NTRS)
Petersen, W. A.
2009-01-01
Pre-launch algorithm development & post-launch product evaluation: The GPM GV paradigm moves beyond traditional direct validation/comparison activities by incorporating improved algorithm physics & model applications (end-to-end validation) in the validation process. Three approaches: 1) National Network (surface): Operational networks to identify and resolve first order discrepancies (e.g., bias) between satellite and ground-based precipitation estimates. 2) Physical Process (vertical column): Cloud system and microphysical studies geared toward testing and refinement of physically-based retrieval algorithms. 3) Integrated (4-dimensional): Integration of satellite precipitation products into coupled prediction models to evaluate strengths/limitations of the satellite precipitation products.
Hu, Ming-Hsia; Yeh, Chih-Jun; Chen, Tou-Rong; Wang, Ching-Yi
2014-01-01
A valid, time-efficient and easy-to-use instrument is important for busy clinical settings, large-scale surveys, or community screening use. The purpose of this study was to validate the mobility hierarchical disability categorization model (an abbreviated model) by investigating its concurrent validity with the multidimensional hierarchical disability categorization model (a comprehensive model) and triangulating both models with physical performance measures in older adults. 604 community-dwelling older adults of at least 60 years in age volunteered to participate. Self-reported function on the mobility, instrumental activities of daily living (IADL) and activities of daily living (ADL) domains was recorded, and disability status was then determined based on both the multidimensional hierarchical categorization model and the mobility hierarchical categorization model. The physical performance measures, consisting of grip strength and usual and fastest gait speeds (UGS, FGS), were collected on the same day. The two categorization models showed high correlation (γs = 0.92, p < 0.001) and agreement (kappa = 0.61, p < 0.0001). Physical performance measures demonstrated significantly different group means among the disability subgroups based on both categorization models. The results of multiple regression analysis indicated that each model individually explains a similar amount of variance in all physical performance measures, with adjustments for age, sex, and number of comorbidities. Our results indicate that the mobility hierarchical disability categorization model is a valid and time-efficient tool for large surveys or screening use.
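A minimal sketch of the statistics this kind of concurrent-validity study relies on, using synthetic category assignments and gait speeds in place of the real data (the categories, sample size, and noise levels are assumptions, and the study may have used a weighted kappa rather than the unweighted one shown here):

```python
import numpy as np
from scipy.stats import spearmanr, f_oneway
from sklearn.metrics import cohen_kappa_score

rng = np.random.default_rng(0)

# Hypothetical disability categories from the two models (0 = none ... 3 = severe).
comprehensive = rng.integers(0, 4, size=200)
mobility = np.clip(comprehensive + rng.integers(-1, 2, size=200), 0, 3)
gait_speed = 1.3 - 0.15 * comprehensive + rng.normal(0, 0.1, size=200)  # m/s

rho, p_rho = spearmanr(comprehensive, mobility)       # correlation between the two models
kappa = cohen_kappa_score(comprehensive, mobility)    # agreement between the two models

# Known-groups validity: do performance measures differ across disability levels?
groups = [gait_speed[mobility == k] for k in np.unique(mobility)]
F, p_anova = f_oneway(*groups)

print(f"Spearman rho={rho:.2f} (p={p_rho:.3g}), kappa={kappa:.2f}, "
      f"ANOVA F={F:.1f} (p={p_anova:.3g})")
```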
A physics based method for combining multiple anatomy models with application to medical simulation.
Zhu, Yanong; Magee, Derek; Ratnalingam, Rishya; Kessel, David
2009-01-01
We present a physics based approach to the construction of anatomy models by combining components from different sources; different image modalities, protocols, and patients. Given an initial anatomy, a mass-spring model is generated which mimics the physical properties of the solid anatomy components. This helps maintain valid spatial relationships between the components, as well as the validity of their shapes. Combination can be either replacing/modifying an existing component, or inserting a new component. The external forces that deform the model components to fit the new shape are estimated from Gradient Vector Flow and Distance Transform maps. We demonstrate the applicability and validity of the described approach in the area of medical simulation, by showing the processes of non-rigid surface alignment, component replacement, and component insertion.
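A toy 2D illustration of the core mechanism described above: a small mass-spring chain whose nodes are pulled toward a target shape by an external force derived from a distance-transform map (standing in for the paper's Gradient Vector Flow/Distance Transform fields). The shape, mesh, and constants are hypothetical; this is a sketch of the idea, not the authors' implementation.

```python
import numpy as np
from scipy.ndimage import distance_transform_edt

# Hypothetical 2D target shape: nodes should settle on/inside a disc.
grid = np.ones((100, 100), dtype=bool)
yy, xx = np.mgrid[0:100, 0:100]
grid[(yy - 50) ** 2 + (xx - 50) ** 2 <= 30 ** 2] = False   # False inside the target
dist = distance_transform_edt(grid)                        # distance to the target
gy, gx = np.gradient(dist)                                 # increases away from the target

# Small chain of mass points connected by springs (a toy stand-in for the mesh).
nodes = np.array([[50.0, 10.0], [50.0, 20.0], [50.0, 30.0]])
edges = [(0, 1), (1, 2)]
rest = [10.0, 10.0]
k_spring, k_ext, damping, dt = 5.0, 2.0, 0.8, 0.05
vel = np.zeros_like(nodes)

for _ in range(400):
    force = np.zeros_like(nodes)
    for (i, j), L0 in zip(edges, rest):                    # internal spring forces
        d = nodes[j] - nodes[i]
        L = np.linalg.norm(d) + 1e-9
        f = k_spring * (L - L0) * d / L
        force[i] += f
        force[j] -= f
    for n, (py, px) in enumerate(nodes):                   # external shape-fitting force
        iy = int(np.clip(np.round(py), 0, 99))
        ix = int(np.clip(np.round(px), 0, 99))
        force[n] -= k_ext * np.array([gy[iy, ix], gx[iy, ix]])
    vel = damping * (vel + dt * force)                     # damped explicit integration
    nodes += dt * vel

print(np.round(nodes, 1))
```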
ERIC Educational Resources Information Center
Jackson, Allen W.; Morrow, James R., Jr.; Bowles, Heather R.; FitzGerald, Shannon J.; Blair, Steven N.
2007-01-01
Valid measurement of physical activity is important for studying the risks for morbidity and mortality. The purpose of this study was to examine evidence of construct validity of two similar single-response items assessing physical activity via self-report. Both items are based on the stages of change model. The sample was 687 participants (men =…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Guo, Y.; Parsons, T.; King, R.
This report summarizes the theory, verification, and validation of a new sizing tool for wind turbine drivetrain components, the Drivetrain Systems Engineering (DriveSE) tool. DriveSE calculates the dimensions and mass properties of the hub, main shaft, main bearing(s), gearbox, bedplate, transformer (if up-tower), and yaw system. The level of fidelity for each component varies depending on whether semiempirical parametric or physics-based models are used. The physics-based models have internal iteration schemes based on system constraints and design criteria. Every model is validated against available industry data or finite-element analysis. The verification and validation results show that the models reasonably capture primary drivers for the sizing and design of major drivetrain components.
Gruber-Baldini, Ann L.; Hicks, Gregory; Ostir, Glen; Klinedinst, N. Jennifer; Orwig, Denise; Magaziner, Jay
2015-01-01
Background Measurement of physical function post hip fracture has been conceptualized using multiple different measures. Purpose This study tested a comprehensive measurement model of physical function. Design This was a descriptive secondary data analysis including 168 men and 171 women post hip fracture. Methods Using structural equation modeling, a measurement model of physical function, which included grip strength, activities of daily living, instrumental activities of daily living and performance, was tested for fit at 2 and 12 months post hip fracture and among male and female participants, and the validity of the measurement model of physical function was evaluated based on how well the model explained physical activity, exercise and social activities post hip fracture. Findings The measurement model of physical function fit the data. The amount of variance the model or individual factors of the model explained varied depending on the activity. Conclusion Decisions about the ideal way in which to measure physical function should be based on the outcomes considered and the participants. Clinical Implications The measurement model of physical function is a reliable and valid method to comprehensively measure physical function across the hip fracture recovery trajectory. Practical but useful assessment of function should be considered and monitored over the recovery trajectory post hip fracture. PMID:26492866
ERIC Educational Resources Information Center
Zhang, Tan; Chen, Ang
2017-01-01
Based on the job demands-resources model, the study developed and validated an instrument that measures physical education teachers' job demands-resources perception. Expert review established content validity with the average item rating of 3.6/5.0. Construct validity and reliability were determined with a teacher sample (n = 397). Exploratory…
Validation of the TTM processes of change measure for physical activity in an adult French sample.
Bernard, Paquito; Romain, Ahmed-Jérôme; Trouillet, Raphael; Gernigon, Christophe; Nigg, Claudio; Ninot, Gregory
2014-04-01
Processes of change (POC) are constructs from the transtheoretical model that propose to examine how people engage in a behavior. However, there is no consensus about a leading model explaining POC, and there is no validated French POC scale for physical activity. This study aimed to compare the different existing models in order to validate a French POC scale. Three studies, with 748 subjects included, were carried out to translate the items and evaluate their clarity (study 1, n = 77), to assess the factorial validity (n = 200) and invariance/equivalence (study 2, n = 471), and to analyze the concurrent validity by stage × process analyses (study 3, n = 671). Two models displayed adequate fit to the data; however, based on the Akaike information criterion, the fully correlated five-factor model appeared to be the most appropriate to measure POC in physical activity. The invariance/equivalence was also confirmed across genders and student status. Four of the five existing factors discriminated pre-action and post-action stages. These data support the validation of the POC questionnaire in physical activity among a French sample. More research is needed to explore the longitudinal properties of this scale.
Spatial calibration and temporal validation of flow for regional scale hydrologic modeling
USDA-ARS?s Scientific Manuscript database
Physically based regional scale hydrologic modeling is gaining importance for planning and management of water resources. Calibration and validation of such a regional scale model are necessary before applying it for scenario assessment. However, in most regional scale hydrologic modeling, flow validat...
Adaptive Modeling of Details for Physically-Based Sound Synthesis and Propagation
2015-03-21
…the interface that ensures the consistency and validity of the solution given by the two methods. Transfer functions are used to model two-way… Keywords: applied sciences, adaptive modeling, physically-based, sound synthesis, propagation, virtual world.
A Diagnostic Model for Impending Death in Cancer Patients: Preliminary Report
Hui, David; Hess, Kenneth; dos Santos, Renata; Chisholm, Gary; Bruera, Eduardo
2015-01-01
Background We recently identified several highly specific bedside physical signs associated with impending death within 3 days among patients with advanced cancer. In this study, we developed and assessed a diagnostic model for impending death based on these physical signs. Methods We systematically documented 62 physical signs every 12 hours from admission to death or discharge in 357 patients with advanced cancer admitted to acute palliative care units (APCUs) at two tertiary care cancer centers. We used recursive partitioning analysis (RPA) to develop a prediction model for impending death in 3 days using admission data. We validated the model with 5 iterations of 10-fold cross-validation, and also applied the model to APCU days 2/3/4/5/6. Results Among 322/357 (90%) patients with complete data for all signs, the 3-day mortality was 24% on admission. The final model was based on 2 variables (palliative performance scale [PPS] and drooping of nasolabial fold) and had 4 terminal leaves: PPS≤20% and drooping of nasolabial fold present, PPS≤20% and drooping of nasolabial fold absent, PPS 30–60% and PPS ≥ 70%, with 3-day mortality of 94%, 42%, 16% and 3%, respectively. The diagnostic accuracy was 81% for the original tree, 80% for cross-validation, and 79%–84% for subsequent APCU days. Conclusion(s) We developed a diagnostic model for impending death within 3 days based on 2 objective bedside physical signs. This model was applicable to both APCU admission and subsequent days. Upon further external validation, this model may help clinicians to formulate the diagnosis of impending death. PMID:26218612
NASA Astrophysics Data System (ADS)
Hidayati, A.; Rahmi, A.; Yohandri; Ratnawulan
2018-04-01
The importance of teaching materials in accordance with the characteristics of students became the main reason for the development of a basic electronics I module integrating character values based on the conceptual change teaching model. The module development in this research follows the development procedure of Plomp, which includes the preliminary research, prototyping and assessment phases. In the first year of this research, the module was validated. Content validity is seen from the conformity of the module with development theory, in accordance with the demands of the learning model characteristics. Construct validity is seen from the linkage and consistency of each module component developed with the characteristics of the learning model integrating character values, obtained through validator assessment. The average validation value assessed by the validators belongs to the very valid category. Based on the validator assessment, the basic electronics I module integrating character values based on the conceptual change teaching model was then revised.
ERIC Educational Resources Information Center
Derlina; Sabani; Mihardi, Satria
2015-01-01
Education research in Indonesia has begun to lead to the development of character education and is no longer fixated on the outcomes of cognitive learning. This study aimed to produce a character education based general physics learning model (CEBGP Learning Model), with valid, effective and practical peripheral devices, to improve character…
A diagnostic model for impending death in cancer patients: Preliminary report.
Hui, David; Hess, Kenneth; dos Santos, Renata; Chisholm, Gary; Bruera, Eduardo
2015-11-01
Several highly specific bedside physical signs associated with impending death within 3 days for patients with advanced cancer were recently identified. A diagnostic model for impending death based on these physical signs was developed and assessed. Sixty-two physical signs were systematically documented every 12 hours from admission to death or discharge for 357 patients with advanced cancer who were admitted to acute palliative care units (APCUs) at 2 tertiary care cancer centers. Recursive partitioning analysis was used to develop a prediction model for impending death within 3 days with admission data. The model was validated with 5 iterations of 10-fold cross-validation, and the model was also applied to APCU days 2 to 6. For the 322 of 357 patients (90%) with complete data for all signs, the 3-day mortality rate was 24% on admission. The final model was based on 2 variables (Palliative Performance Scale [PPS] and drooping of nasolabial folds) and had 4 terminal leaves: PPS score ≤ 20% and drooping of nasolabial folds present, PPS score ≤ 20% and drooping of nasolabial folds absent, PPS score of 30% to 60%, and PPS score ≥ 70%. The 3-day mortality rates were 94%, 42%, 16%, and 3%, respectively. The diagnostic accuracy was 81% for the original tree, 80% for cross-validation, and 79% to 84% for subsequent APCU days. Based on 2 objective bedside physical signs, a diagnostic model was developed for impending death within 3 days. This model was applicable to both APCU admission and subsequent days. Upon further external validation, this model may help clinicians to formulate the diagnosis of impending death. © 2015 American Cancer Society.
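A minimal sketch of the modeling workflow reported above, with a small classification tree standing in for recursive partitioning analysis and synthetic admission data (the two predictors and the risk pattern are loosely modeled on the abstract; all numbers and the data-generating rule are assumptions):

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import RepeatedStratifiedKFold, cross_val_score

rng = np.random.default_rng(1)
n = 322

# Hypothetical admission data: PPS (10-100%) and drooping of nasolabial folds (0/1).
pps = rng.choice(np.arange(10, 101, 10), size=n)
fold_droop = rng.integers(0, 2, size=n)
X = np.column_stack([pps, fold_droop])

# Synthetic 3-day death label whose risk rises as PPS falls and when the sign is present.
risk = (0.9 * (pps <= 20) * fold_droop
        + 0.4 * (pps <= 20) * (1 - fold_droop)
        + 0.15 * ((pps >= 30) & (pps <= 60))
        + 0.03 * (pps >= 70))
y = rng.random(n) < risk

# A small tree (4 terminal leaves, as in the reported model) with 5 x 10-fold CV.
tree = DecisionTreeClassifier(max_leaf_nodes=4, random_state=0)
cv = RepeatedStratifiedKFold(n_splits=10, n_repeats=5, random_state=0)
scores = cross_val_score(tree, X, y, cv=cv, scoring="accuracy")
print(f"cross-validated accuracy: {scores.mean():.2f} +/- {scores.std():.2f}")
```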
Sebire, Simon J; Jago, Russell; Fox, Kenneth R; Edwards, Mark J; Thompson, Janice L
2013-09-26
Understanding children's physical activity motivation, its antecedents and associations with behavior is important and can be advanced by using self-determination theory. However, research among youth is largely restricted to adolescents and studies of motivation within certain contexts (e.g., physical education). There are no measures of self-determination theory constructs (physical activity motivation or psychological need satisfaction) for use among children and no previous studies have tested a self-determination theory-based model of children's physical activity motivation. The purpose of this study was to test the reliability and validity of scores derived from scales adapted to measure self-determination theory constructs among children and test a motivational model predicting accelerometer-derived physical activity. Cross-sectional data from 462 children aged 7 to 11 years from 20 primary schools in Bristol, UK were analysed. Confirmatory factor analysis was used to examine the construct validity of adapted behavioral regulation and psychological need satisfaction scales. Structural equation modelling was used to test cross-sectional associations between psychological need satisfaction, motivation types and physical activity assessed by accelerometer. The construct validity and reliability of the motivation and psychological need satisfaction measures were supported. Structural equation modelling provided evidence for a motivational model in which psychological need satisfaction was positively associated with intrinsic and identified motivation types and intrinsic motivation was positively associated with children's minutes in moderate-to-vigorous physical activity. The study provides evidence for the psychometric properties of measures of motivation aligned with self-determination theory among children. Children's motivation that is based on enjoyment and inherent satisfaction of physical activity is associated with their objectively-assessed physical activity and such motivation is positively associated with perceptions of psychological need satisfaction. These psychological factors represent potential malleable targets for interventions to increase children's physical activity.
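The structural part of such a motivational model can be sketched as a simple path analysis on observed composite scores; the real study estimated latent variables with confirmatory factor analysis and full structural equation modeling, so the ordinary-least-squares version below, with synthetic data, is only an illustrative simplification:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 462

# Hypothetical composite scores (the study modeled these as latent variables).
need_satisfaction = rng.normal(0, 1, n)
intrinsic = 0.6 * need_satisfaction + rng.normal(0, 0.8, n)
identified = 0.5 * need_satisfaction + rng.normal(0, 0.9, n)
mvpa = 40 + 6.0 * intrinsic + 1.0 * identified + rng.normal(0, 10, n)  # min/day

def ols_paths(y, *predictors):
    """Return standardized path coefficients from an OLS regression."""
    Z = np.column_stack([(p - p.mean()) / p.std() for p in predictors])
    Z = np.column_stack([np.ones(len(y)), Z])
    yz = (y - y.mean()) / y.std()
    beta, *_ = np.linalg.lstsq(Z, yz, rcond=None)
    return beta[1:]

print("need satisfaction -> intrinsic:", ols_paths(intrinsic, need_satisfaction))
print("need satisfaction -> identified:", ols_paths(identified, need_satisfaction))
print("intrinsic, identified -> MVPA:", ols_paths(mvpa, intrinsic, identified))
```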
NASA Astrophysics Data System (ADS)
Maghareh, Amin; Silva, Christian E.; Dyke, Shirley J.
2018-05-01
Hydraulic actuators play a key role in experimental structural dynamics. In a previous study, a physics-based model for a servo-hydraulic actuator coupled with a nonlinear physical system was developed. Later, this dynamical model was transformed into controllable canonical form for position tracking control purposes. For this study, a nonlinear device is designed and fabricated to exhibit various nonlinear force-displacement profiles depending on the initial condition and the type of materials used as replaceable coupons. Using this nonlinear system, the controllable canonical dynamical model is experimentally validated for a servo-hydraulic actuator coupled with a nonlinear physical system.
ERIC Educational Resources Information Center
Plotnikoff, Ronald C.; Lippke, Sonia; Reinbold-Matthews, Melissa; Courneya, Kerry S.; Karunamuni, Nandini; Sigal, Ronald J.; Birkett, Nicholas
2007-01-01
This study was designed to test the validity of a transtheoretical model's physical activity (PA) stage measure with intention and different intensities of behavior in a large population-based sample of adults living with diabetes (Type 1 diabetes, n = 697; Type 2 diabetes, n = 1,614) and examine different age groups. The overall…
NASA Astrophysics Data System (ADS)
Engel, Dave W.; Reichardt, Thomas A.; Kulp, Thomas J.; Graff, David L.; Thompson, Sandra E.
2016-05-01
Validating predictive models and quantifying uncertainties inherent in the modeling process is a critical component of the HARD Solids Venture program [1]. Our current research focuses on validating physics-based models predicting the optical properties of solid materials for arbitrary surface morphologies and characterizing the uncertainties in these models. We employ a systematic and hierarchical approach by designing physical experiments and comparing the experimental results with the outputs of computational predictive models. We illustrate this approach through an example comparing a micro-scale forward model to an idealized solid-material system and then propagating the results through a system model to the sensor level. Our efforts should enhance detection reliability of the hyper-spectral imaging technique and the confidence in model utilization and model outputs by users and stakeholders.
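A schematic sketch of the hierarchical propagation idea: parameter uncertainty quantified at the micro-scale is pushed through a system/sensor model by Monte Carlo sampling and compared against a measurement. Both forward models and all numbers below are hypothetical toys, not the HARD Solids models:

```python
import numpy as np

rng = np.random.default_rng(3)

def micro_model(wavelength_um, roughness_um):
    """Toy micro-scale forward model: reflectance vs. surface roughness (hypothetical)."""
    return 0.6 * np.exp(-(2 * np.pi * roughness_um / wavelength_um) ** 2) + 0.1

def sensor_model(reflectance, gain, noise_sigma):
    """Toy system/sensor model: radiometric gain plus additive detector noise."""
    return gain * reflectance + rng.normal(0.0, noise_sigma, size=np.shape(reflectance))

# Parameter uncertainty assumed to come from micro-scale validation experiments.
roughness_samples = rng.normal(1.2, 0.15, size=5000)   # um
gain_samples = rng.normal(250.0, 5.0, size=5000)       # counts per unit reflectance

# Propagate through the hierarchy to the sensor level at one wavelength.
refl = micro_model(wavelength_um=4.0, roughness_um=roughness_samples)
counts = sensor_model(refl, gain_samples, noise_sigma=2.0)

lo, hi = np.percentile(counts, [2.5, 97.5])
measured = 30.0                                        # hypothetical sensor measurement
print(f"predicted counts: {counts.mean():.1f} [{lo:.1f}, {hi:.1f}]; measured: {measured}")
print("measurement inside 95% prediction interval:", lo <= measured <= hi)
```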
Validating an operational physical method to compute surface radiation from geostationary satellites
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sengupta, Manajit; Dhere, Neelkanth G.; Wohlgemuth, John H.
Models to compute global horizontal irradiance (GHI) and direct normal irradiance (DNI) have been developed over the last three decades. These models can be classified as empirical or physical based on the approach. Empirical models relate ground-based observations with satellite measurements and use these relations to compute surface radiation. Physical models consider the physics behind the radiation received at the satellite and create retrievals to estimate surface radiation. Furthermore, while empirical methods have traditionally been used for computing surface radiation for the solar energy industry, the advent of faster computing has made operational physical models viable. The Global Solar Insolation Project (GSIP) is a physical model that computes DNI and GHI using the visible and infrared channel measurements from a weather satellite. GSIP uses a two-stage scheme that first retrieves cloud properties and uses those properties in a radiative transfer model to calculate GHI and DNI. Developed for polar orbiting satellites, GSIP has been adapted to NOAA's Geostationary Operational Environmental Satellite series and can run operationally at high spatial resolutions. Our method holds the possibility of creating high quality datasets of GHI and DNI for use by the solar energy industry. We present an outline of the methodology and results from running the model as well as a validation study using ground-based instruments.
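Validation against ground-based instruments typically reduces to bias and scatter statistics on coincident samples; a minimal sketch with made-up irradiance values:

```python
import numpy as np

# Hypothetical coincident hourly samples (W/m^2): satellite retrieval vs. ground truth.
ghi_ground = np.array([120.0, 310.0, 540.0, 720.0, 810.0, 650.0, 430.0, 180.0])
ghi_satellite = np.array([135.0, 295.0, 565.0, 700.0, 840.0, 620.0, 455.0, 160.0])

bias = np.mean(ghi_satellite - ghi_ground)                  # mean bias error (MBE)
rmse = np.sqrt(np.mean((ghi_satellite - ghi_ground) ** 2))  # root mean square error
rel_bias = 100.0 * bias / np.mean(ghi_ground)
rel_rmse = 100.0 * rmse / np.mean(ghi_ground)

print(f"MBE = {bias:.1f} W/m^2 ({rel_bias:.1f}%), RMSE = {rmse:.1f} W/m^2 ({rel_rmse:.1f}%)")
```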
DOE Office of Scientific and Technical Information (OSTI.GOV)
Choi, Yong Joon; Yoo, Jun Soo; Smith, Curtis Lee
2015-09-01
This INL plan comprehensively describes the Requirements Traceability Matrix (RTM) for the main physics and numerical methods of RELAP-7. The plan also describes the testing-based software verification and validation (SV&V) process—a set of specially designed software models used to test RELAP-7.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, Chao; Xu, Zhijie; Lai, Canhai
A hierarchical model calibration and validation is proposed for quantifying the confidence level of mass transfer prediction using a computational fluid dynamics (CFD) model, where solvent-based carbon dioxide (CO2) capture is simulated and simulation results are compared to parallel bench-scale experimental data. Two unit problems with increasing levels of complexity are proposed to break down the complex physical/chemical processes of solvent-based CO2 capture into relatively simpler problems that separate the effects of physical transport and chemical reaction. This paper focuses on the calibration and validation of the first unit problem, i.e., CO2 mass transfer across a falling monoethanolamine (MEA) film in the absence of chemical reaction. This problem is investigated both experimentally and numerically using nitrous oxide (N2O) as a surrogate for CO2. To capture the motion of the gas-liquid interface, a volume of fluid method is employed together with a one-fluid formulation to compute the mass transfer between the two phases. Bench-scale parallel experiments are designed and conducted to validate and calibrate the CFD models using a general Bayesian calibration. Two important transport parameters, Henry's constant and gas diffusivity, are calibrated to produce the posterior distributions, which will be used as the input for the second unit problem to address the chemical absorption of CO2 across the MEA falling film, where both mass transfer and chemical reaction are involved.
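A compact sketch of the Bayesian calibration step, with a penetration-theory expression standing in for the VOF/CFD forward model and synthetic flux measurements. The priors, step sizes, and all values are assumptions; note that with this surrogate, Henry's constant and diffusivity are only jointly constrained through sqrt(D)/H, which is itself a realistic feature of such calibrations.

```python
import numpy as np

rng = np.random.default_rng(4)

def surrogate_flux(H, D, t_contact, p_gas=101325.0):
    """Penetration-theory stand-in for the CFD model: physical absorption flux into a
    falling film. H [Pa*m^3/mol], D [m^2/s], t_contact [s]; returns mol/(m^2*s)."""
    c_interface = p_gas / H                      # Henry's law at the gas-liquid interface
    return c_interface * np.sqrt(4.0 * D / (np.pi * t_contact))

# Hypothetical bench-scale measurements at several contact times.
t_obs = np.array([0.05, 0.10, 0.20, 0.40])
H_true, D_true, sigma = 4000.0, 1.5e-9, 2e-4
flux_obs = surrogate_flux(H_true, D_true, t_obs) + rng.normal(0, sigma, t_obs.size)

def log_post(theta):
    H, D = theta
    if not (1000.0 < H < 10000.0 and 5e-10 < D < 5e-9):   # uniform priors (assumed)
        return -np.inf
    resid = flux_obs - surrogate_flux(H, D, t_obs)
    return -0.5 * np.sum((resid / sigma) ** 2)

# Random-walk Metropolis over (H, D).
theta = np.array([3000.0, 1.0e-9])
step = np.array([150.0, 5e-11])
lp = log_post(theta)
samples = []
for _ in range(20000):
    prop = theta + step * rng.normal(size=2)
    lp_prop = log_post(prop)
    if np.log(rng.random()) < lp_prop - lp:
        theta, lp = prop, lp_prop
    samples.append(theta.copy())

post = np.array(samples[5000:])                  # discard burn-in
print("posterior mean (H, D):", post.mean(axis=0))
```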
A Comparison of Energy Expenditure Estimation of Several Physical Activity Monitors
Dannecker, Kathryn L.; Sazonova, Nadezhda A.; Melanson, Edward L.; Sazonov, Edward S.; Browning, Raymond C.
2013-01-01
Accurately and precisely estimating free-living energy expenditure (EE) is important for monitoring energy balance and quantifying physical activity. Recently, single and multi-sensor devices have been developed that can classify physical activities, potentially resulting in improved estimates of EE. PURPOSE To determine the validity of EE estimation of a footwear-based physical activity monitor and to compare this validity against a variety of research and consumer physical activity monitors. METHODS Nineteen healthy young adults (10 male, 9 female) completed a four-hour stay in a room calorimeter. Participants wore a footwear-based physical activity monitor, as well as Actical, ActiGraph, IDEEA, DirectLife and Fitbit devices. Each individual performed a series of postures/activities. We developed models to estimate EE from the footwear-based device, and we used the manufacturer's software to estimate EE for all other devices. RESULTS Estimated EE using the shoe-based device was not significantly different from measured EE (mean (SE): 476 (20) vs. 478 (18) kcal) and had a root mean square error (RMSE) of 29.6 kcal (6.2%). The IDEEA and DirectLife estimates of EE were not significantly different from the measured EE, but the ActiGraph and Fitbit devices significantly underestimated EE. Root mean square errors were 93.5 (19%), 62.1 kcal (14%), 88.2 kcal (18%), 136.6 kcal (27%), 130.1 kcal (26%), and 143.2 kcal (28%) for Actical, DirectLife, IDEEA, ActiGraph and Fitbit respectively. CONCLUSIONS The shoe-based physical activity monitor provides a valid estimate of EE, while the other physical activity monitors tested have a wide range of validity when estimating EE. Our results also demonstrate that estimating EE based on classification of physical activities can be more accurate and precise than estimating EE based on total physical activity. PMID:23669877
NASA Astrophysics Data System (ADS)
Nasution, Derlina; Syahreni Harahap, Putri; Harahap, Marabangun
2018-03-01
This research aims to: (1) develop physics learning instruments (lesson plan, worksheet, student's book, teacher's guide book, and test instrument) for physics learning through a scientific inquiry learning model based on Batak culture, in order to improve students' science process skills and curiosity; and (2) describe the quality of the resulting learning instruments for senior high school. This is development research, which developed the physics learning instruments using a development model adapted from Thiagarajan, Semmel, and Semmel. The stages were followed until valid, practical, and effective physics learning instruments were obtained, and include: (1) the definition phase, (2) the planning phase, and (3) the development phase. Testing included expert review/validation, small-group trials, and a limited class trial. The limited class trial was conducted at SMAN 1 Padang Bolak on a grade X MIA class. This research resulted in: (1) physics learning instruments on static fluid material for senior high school grade 10, consisting of a lesson plan, worksheet, student's book, teacher's guide book, and test instrument, of a quality worthy of use in the learning process; and (2) each component of the learning instruments meeting the criteria of being valid, practical, and effective for improving students' science process skills and curiosity.
The effectiveness of physics learning material based on South Kalimantan local wisdom
NASA Astrophysics Data System (ADS)
Hartini, Sri; Misbah, Helda, Dewantara, Dewi
2017-08-01
Local wisdom is an essential element to incorporate into the learning process. However, there were no physics learning materials containing South Kalimantan local wisdom. Therefore, it was necessary to develop physics learning material based on South Kalimantan local wisdom. The objective of this research is to produce a product in the form of learning material based on South Kalimantan local wisdom that is feasible and effective, judged from the validity, practicality and effectiveness of the learning material and the achievement of the waja sampai kaputing (wasaka) character. This research is a research and development study which refers to the ADDIE model. Data were obtained through the learning material validation sheet, a questionnaire, a test of learning outcomes and a character assessment sheet. The research results showed that (1) the validity of the learning material was in the very valid category, (2) the practicality of the learning material was in the very practical category, (3) the effectiveness of the learning material was in the very effective category, and (4) the achievement of the wasaka character was very good. In conclusion, the physics learning materials based on South Kalimantan local wisdom are feasible and effective to be used in learning activities.
Chum, Antony; Skosireva, Anna; Tobon, Juliana; Hwang, Stephen
2016-01-01
Background Self-reported health measures are important indicators used by clinicians and researchers for the evaluation of health interventions, outcome assessment of clinical studies, and identification of health needs to improve resource allocation. However, the application of self-reported health measures relies on developing reliable and valid instruments that are suitable across diverse populations. The main objective of this study is to evaluate the construct validity of the SF-12v.2, an instrument for measuring self-rated physical and mental health, for homeless adults with mental illness. Various interventions have been aimed at improving the health of homeless people with mental illness, and the development of valid instruments to evaluate these interventions is imperative. Study Design We measured self-rated mental and physical health from a quota sample of 575 homeless people with mental illness using the SF-12v2, EQ-5D, Colorado Symptoms Index, and physical/mental health visual analogue scales. We examined the construct validity of the SF-12v2 through confirmatory factor analyses (CFA), and using ANOVA/correlation analyses to compare the SF-12v2 to the other instruments to ascertain discriminant/convergent validity. Results Our CFA showed that the measurement properties of the original SF-12v2 model had a mediocre fit with our empirical data (χ2 = 193.6, df = 43, p < .0001, CFI = 0.85, NFI = 0.83, RMSEA = 0.08). We demonstrate that changes based on theoretical rationale and previous studies can significantly improve the model, achieving an excellent fit in our final model (χ2 = 160.6, df = 48, p < .0001, CFI = 0.95, NFI = 0.95, RMSEA = 0.06). Our CFA results suggest that an alternative scoring method based on the new model may optimize health status measurement of a homeless population. Despite these issues, convergent and discriminant validity of the SF-12v2 (scored based on the original model) was supported through multiple comparisons with other instruments. Conclusion Our study demonstrates for the first time that the SF-12v2 is generally appropriate as a measure of physical and mental health status for a homeless population with mental illness. PMID:26938990
NASA Astrophysics Data System (ADS)
Riandry, M. A.; Ismet, I.; Akhsan, H.
2017-09-01
This study aims to produce a valid and practical statistical physics course handout on distribution function materials based on STEM. The Rowntree development model is used to produce this handout. The model consists of three stages: planning, development and evaluation. In this study, the evaluation stage used Tessmer formative evaluation, which consists of five stages: self-evaluation, expert review, one-to-one evaluation, small group evaluation and field test. However, the handout was only tested on the validity and practicality aspects, so the field test stage was not implemented. The data collection techniques used were walkthroughs and questionnaires. The subjects of this study were students of the 6th and 8th semesters of academic year 2016/2017 of the Physics Education Study Program of Sriwijaya University. The average result of the expert review is 87.31% (very valid category). The one-to-one evaluation obtained an average result of 89.42%, and the small group evaluation 85.92%. From the one-to-one and small group evaluation stages, the average student response to this handout is 87.67% (very practical category). Based on the results of the study, it can be concluded that the handout is valid and practical.
The Challenge of Grounding Planning in Simulation with an Interactive Model Development Environment
NASA Technical Reports Server (NTRS)
Clement, Bradley J.; Frank, Jeremy D.; Chachere, John M.; Smith, Tristan B.; Swanson, Keith J.
2011-01-01
A principal obstacle to fielding automated planning systems is the difficulty of modeling. Physical systems are conventionally modeled based on specification documents and the modeler's understanding of the system. Thus, the model is developed in a way that is disconnected from the system's actual behavior and is vulnerable to manual error. Another obstacle to fielding planners is testing and validation. For a space mission, generated plans must often be validated by translating them into command sequences that are run in a simulation testbed. Testing in this way is complex and onerous because of the large number of possible plans and states of the spacecraft. However, if used as a source of domain knowledge, the simulator can ease validation. This paper poses a challenge: to ground planning models in the system physics represented by simulation. A proposed interactive model development environment illustrates the integration of planning and simulation to meet the challenge. This integration reveals research paths for automated model construction and validation.
Short-Term Forecasts Using NU-WRF for the Winter Olympics 2018
NASA Technical Reports Server (NTRS)
Srikishen, Jayanthi; Case, Jonathan L.; Petersen, Walter A.; Iguchi, Takamichi; Tao, Wei-Kuo; Zavodsky, Bradley T.; Molthan, Andrew
2017-01-01
The NASA Unified-Weather Research and Forecasting model (NU-WRF) will be included for testing and evaluation in the forecast demonstration project (FDP) of the International Collaborative Experiment - PyeongChang 2018 Olympic and Paralympic (ICE-POP) Winter Games. An international array of radar and supporting ground-based observations, together with various forecast and nowcast models, will be operational during ICE-POP. In conjunction with personnel from NASA's Goddard Space Flight Center, the NASA Short-term Prediction Research and Transition (SPoRT) Center is developing benchmark simulations for a real-time NU-WRF configuration to run during the FDP. ICE-POP observational datasets will be used to validate model simulations and investigate improved model physics and performance for the prediction of snow events during the research phase (RDP) of the project. The NU-WRF model simulations will also support NASA Global Precipitation Measurement (GPM) Mission ground-validation physical and direct validation activities in relation to verifying, testing and improving satellite-based snowfall retrieval algorithms over complex terrain.
2018-01-01
[Front-matter excerpt: attachments covering a profile database, NRMM data input requirements, and general physics-based model data input requirements; figures on unique surface types, correlating physical testing with simulation, and a simplified tire model; tables on scoring values and on the accuracy of physics-based models and of validation through measurement.]
A unified dislocation density-dependent physical-based constitutive model for cold metal forming
NASA Astrophysics Data System (ADS)
Schacht, K.; Motaman, A. H.; Prahl, U.; Bleck, W.
2017-10-01
Dislocation-density-dependent, physically based constitutive models of metal plasticity, while computationally efficient and history-dependent, can accurately account for varying process parameters such as strain, strain rate and temperature; different loading modes such as continuous deformation, creep and relaxation; microscopic metallurgical processes; and varying chemical composition within an alloy family. Since these models are founded on the essential phenomena dominating the deformation, they have a larger range of usability and validity. They are also suitable for manufacturing chain simulations, since they can efficiently compute the cumulative effect of the various manufacturing processes by following the material state through the entire manufacturing chain, including interpass periods, and give a realistic prediction of the material behavior and final product properties. In the physically based constitutive model of cold metal plasticity introduced in this study, the physical processes influencing cold and warm plastic deformation in polycrystalline metals are described using physical/metallurgical internal variables such as dislocation density and effective grain size. The evolution of these internal variables is calculated using equations that describe the physical processes dominating the material behavior during cold plastic deformation. For validation, the model is numerically implemented in a general implicit isotropic elasto-viscoplasticity algorithm as a user-defined material subroutine (UMAT) in ABAQUS/Standard and used for finite element simulation of upsetting tests and a complete cold forging cycle of the case-hardenable MnCr steel family.
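As an illustration of a dislocation-density-based internal-variable model, the sketch below integrates a Kocks-Mecking-type storage/recovery law with Taylor hardening; this is a generic textbook form with hypothetical parameter values, not the unified model of the paper.

```python
import numpy as np

# Illustrative Kocks-Mecking-type evolution of dislocation density under cold forming:
#   d(rho)/d(eps) = k1*sqrt(rho) - k2*rho          (storage minus dynamic recovery)
#   sigma = sigma0 + alpha*M*G*b*sqrt(rho)         (Taylor hardening)
# All parameter values below are hypothetical order-of-magnitude choices.
k1, k2 = 1.0e8, 10.0            # 1/m, dimensionless
sigma0 = 150.0e6                # Pa, friction stress
alpha, M, G, b = 0.3, 3.06, 80.0e9, 2.5e-10

rho = 1.0e12                    # 1/m^2, initial dislocation density
deps = 1.0e-4
strains = np.arange(0.0, 0.5 + deps, deps)
stress = np.empty_like(strains)

for i, eps in enumerate(strains):
    stress[i] = sigma0 + alpha * M * G * b * np.sqrt(rho)
    rho += deps * (k1 * np.sqrt(rho) - k2 * rho)    # forward-Euler update in strain

print(f"flow stress at eps=0.1: {np.interp(0.1, strains, stress) / 1e6:.0f} MPa")
print(f"flow stress at eps=0.5: {stress[-1] / 1e6:.0f} MPa, rho = {rho:.2e} 1/m^2")
```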
Simulation-Based Training for Colonoscopy
Preisler, Louise; Svendsen, Morten Bo Søndergaard; Nerup, Nikolaj; Svendsen, Lars Bo; Konge, Lars
2015-01-01
The aim of this study was to create simulation-based tests with credible pass/fail standards for 2 different fidelities of colonoscopy models. Only competent practitioners should perform colonoscopy. Reliable and valid simulation-based tests could be used to establish basic competency in colonoscopy before practicing on patients. Twenty-five physicians (10 consultants with endoscopic experience and 15 fellows with very little endoscopic experience) were tested on 2 different simulator models: a virtual-reality simulator and a physical model. Tests were repeated twice on each simulator model. Metrics with discriminatory ability were identified for both modalities and reliability was determined. The contrasting-groups method was used to create pass/fail standards and the consequences of these were explored. The consultants performed significantly faster and scored higher than the fellows on both models (P < 0.001). Reliability analysis showed Cronbach α = 0.80 and 0.87 for the virtual-reality and the physical model, respectively. The established pass/fail standards failed one of the consultants (virtual-reality simulator) and allowed one fellow to pass (physical model). The 2 tested simulation-based modalities provided reliable and valid assessments of competence in colonoscopy, and credible pass/fail standards were established for both tests. We propose to use these standards in simulation-based training programs before proceeding to supervised training on patients. PMID:25634177
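The contrasting-groups method sets the pass/fail score where the score distributions of the experienced and novice groups cross; a minimal sketch with hypothetical scores and fitted normal densities:

```python
import numpy as np
from scipy.stats import norm
from scipy.optimize import brentq

# Hypothetical simulator scores for the two contrasting groups.
consultants = np.array([82, 88, 75, 91, 86, 79, 90, 84, 87, 80], dtype=float)
fellows = np.array([55, 62, 48, 70, 58, 66, 52, 60, 64, 57, 61, 50, 68, 59, 63],
                   dtype=float)

mu_c, sd_c = consultants.mean(), consultants.std(ddof=1)
mu_f, sd_f = fellows.mean(), fellows.std(ddof=1)

# Pass/fail standard = score where the two fitted normal densities intersect
# between the group means (the contrasting-groups method).
diff = lambda x: norm.pdf(x, mu_c, sd_c) - norm.pdf(x, mu_f, sd_f)
cutoff = brentq(diff, mu_f, mu_c)

print(f"pass/fail cutoff: {cutoff:.1f}")
print("consultants failing:", int(np.sum(consultants < cutoff)),
      "| fellows passing:", int(np.sum(fellows >= cutoff)))
```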
Bolandzadeh, Niousha; Kording, Konrad; Salowitz, Nicole; Davis, Jennifer C; Hsu, Liang; Chan, Alison; Sharma, Devika; Blohm, Gunnar; Liu-Ambrose, Teresa
2015-01-01
Current research suggests that the neuropathology of dementia, including brain changes leading to memory impairment and cognitive decline, is evident years before the onset of this disease. Older adults with cognitive decline have reduced functional independence and quality of life, and are at greater risk for developing dementia. Therefore, identifying biomarkers that can be easily assessed within the clinical setting and predict cognitive decline is important. Early recognition of cognitive decline could promote timely implementation of preventive strategies. We included 89 community-dwelling adults aged 70 years and older in our study, and collected 32 measures of physical function, health status and cognitive function at baseline. We utilized an L1-L2 regularized regression model (elastic net) to identify which of the 32 baseline measures were strongly predictive of cognitive function after one year. We built three linear regression models: 1) based on baseline cognitive function, 2) based on variables consistently selected in every cross-validation loop, and 3) a full model based on all 32 variables. Each of these models was carefully tested with nested cross-validation. Our model with the six variables consistently selected in every cross-validation loop had a mean squared prediction error of 7.47. This number was smaller than that of the full model (115.33) and the model with baseline cognitive function (7.98). Our model explained 47% of the variance in cognitive function after one year. We built a parsimonious model based on a selected set of six physical function and health status measures strongly predictive of cognitive function after one year. In addition to reducing the complexity of the model without changing it significantly, our model with the top variables improved the mean prediction error and R-squared. These six physical function and health status measures can be easily implemented in a clinical setting.
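A minimal sketch of the elastic-net-with-nested-cross-validation workflow on synthetic data shaped like the study (89 subjects, 32 baseline measures); the variable effects and hyperparameter grid are assumptions:

```python
import numpy as np
from sklearn.linear_model import ElasticNetCV
from sklearn.model_selection import KFold, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(5)
n, p = 89, 32

# Hypothetical baseline measures (32 physical function / health status variables).
X = rng.normal(size=(n, p))
# Synthetic 1-year cognitive score driven by a handful of the baseline variables.
y = X[:, :6] @ np.array([2.0, -1.5, 1.0, 0.8, -0.6, 0.5]) + rng.normal(0, 1.0, n)

# Inner loop: ElasticNetCV picks the penalty; outer loop: unbiased error estimate.
model = make_pipeline(
    StandardScaler(),
    ElasticNetCV(l1_ratio=[0.1, 0.5, 0.9], cv=5, random_state=0),
)
outer_cv = KFold(n_splits=5, shuffle=True, random_state=0)
mse = -cross_val_score(model, X, y, cv=outer_cv, scoring="neg_mean_squared_error")
print(f"nested-CV mean squared prediction error: {mse.mean():.2f}")

# Fit on the full data to see which variables the elastic net retains.
model.fit(X, y)
selected = np.flatnonzero(model.named_steps["elasticnetcv"].coef_ != 0)
print("selected variable indices:", selected)
```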
Zhang, Tan; Chen, Ang
2017-01-01
Based on the job demands-resources model, the study developed and validated an instrument that measures physical education teachers' job demands-resources perception. Expert review established content validity with the average item rating of 3.6/5.0. Construct validity and reliability were determined with a teacher sample ( n = 397). Exploratory factor analysis established a five-dimension construct structure matching the theoretical construct deliberated in the literature. The composite reliability scores for the five dimensions range from .68 to .83. Validity coefficients (intraclass correlational coefficients) are .69 for job resources items and .82 for job demands items. Inter-scale correlational coefficients range from -.32 to .47. Confirmatory factor analysis confirmed the construct validity with high dimensional factor loadings (ranging from .47 to .84 for job resources scale and from .50 to .85 for job demands scale) and adequate model fit indexes (root mean square error of approximation = .06). The instrument provides a tool to measure physical education teachers' perception of their working environment.
NASA Astrophysics Data System (ADS)
Ammouri, Aymen; Ben Salah, Walid; Khachroumi, Sofiane; Ben Salah, Tarek; Kourda, Ferid; Morel, Hervé
2014-05-01
Design of integrated power converters needs prototype-less approaches. Specific simulations are required for the investigation and validation process. Simulation relies on active and passive device models. Models of planar devices, for instance, are still not available in power simulator tools; there is thus a specific limitation in the simulation process of integrated power systems. The paper focuses on the development of a physically-based planar inductor model and its validation inside a power converter during transient switching. The planar inductor remains a complex device to model, particularly when skin, proximity and parasitic capacitance effects are taken into account. A heterogeneous simulation scheme, including circuit and device models, is successfully implemented in the VHDL-AMS language and simulated on the Simplorer platform. The mixed simulation results have been favorably compared with practical measurements. It is found that the multi-domain simulation results and measurement data are in close agreement.
Simplified predictive models for CO2 sequestration performance assessment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mishra, Srikanta; Ganesh, Priya; Schuetter, Jared
CO2 sequestration in deep saline formations is increasingly being considered as a viable strategy for the mitigation of greenhouse gas emissions from anthropogenic sources. In this context, detailed numerical-simulation-based models are routinely used to understand key processes and parameters affecting pressure propagation and buoyant plume migration following CO2 injection into the subsurface. As these models are data and computation intensive, the development of computationally efficient alternatives to conventional numerical simulators has become an active area of research. Such simplified models can be valuable assets during preliminary CO2 injection project screening, serve as a key element of probabilistic system assessment modeling tools, and assist regulators in quickly evaluating geological storage projects. We present three strategies for the development and validation of simplified modeling approaches for CO2 sequestration in deep saline formations: (1) simplified physics-based modeling, (2) statistical-learning-based modeling, and (3) reduced-order-method-based modeling. In the first category, a set of full-physics compositional simulations is used to develop correlations for dimensionless injectivity as a function of the slope of the CO2 fractional-flow curve, the variance of layer permeability values, and the nature of the vertical permeability arrangement. The same variables, along with a modified gravity number, can be used to develop a correlation for the total storage efficiency within the CO2 plume footprint. Furthermore, the dimensionless average pressure buildup after the onset of boundary effects can be correlated to dimensionless time, CO2 plume footprint, and storativity contrast between the reservoir and caprock. In the second category, statistical "proxy models" are developed using the simulation domain described previously with two approaches: (a) a classical Box-Behnken experimental design with a quadratic response surface, and (b) a maximin Latin hypercube sampling (LHS) based design with a multidimensional kriging metamodel fit. For roughly the same number of simulations, the LHS-based metamodel yields a more robust predictive model, as verified by a k-fold cross-validation approach (with data split into training and test sets) as well as by validation with an independent dataset. In the third category, a reduced-order modeling procedure is utilized that combines proper orthogonal decomposition (POD) for reducing problem dimensionality with trajectory piecewise linearization (TPWL) in order to represent system response at new control settings from a limited number of training runs. Significant savings in computational time are observed with reasonable accuracy from the POD-TPWL reduced-order model for both vertical and horizontal well problems, which could be important in the context of history matching, uncertainty quantification and optimization problems. The simplified physics and statistical-learning-based models are also validated using an uncertainty analysis framework. Reference cumulative distribution functions of key model outcomes (i.e., plume radius and reservoir pressure buildup) generated using a 97-run full-physics simulation are successfully validated against the CDFs from 10,000-sample probabilistic simulations using the simplified models.
The main contribution of this research project is the development and validation of a portfolio of simplified modeling approaches that will enable rapid feasibility and risk assessment for CO2 sequestration in deep saline formations.
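A small sketch of the statistical-learning strategy (category 2): a Latin hypercube design over uncertain inputs, a kriging-type Gaussian process metamodel of a toy reservoir response, and k-fold cross-validation of its predictive skill. The response function, input ranges, and design size are hypothetical, and the plain scipy Latin hypercube here is not the maximin design used in the study.

```python
import numpy as np
from scipy.stats import qmc
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern
from sklearn.model_selection import cross_val_score

def reservoir_response(x):
    """Toy stand-in for the full-physics simulator: pressure buildup as a function of
    (permeability, porosity, injection rate) -- purely illustrative."""
    perm, poro, rate = x[:, 0], x[:, 1], x[:, 2]
    return 10.0 * rate / (perm * poro) + 0.5 * np.log(rate)

# Latin hypercube design over the three uncertain inputs (hypothetical ranges).
sampler = qmc.LatinHypercube(d=3, seed=0)
unit = sampler.random(n=60)
lower, upper = [50.0, 0.1, 0.5], [500.0, 0.3, 2.0]
X = qmc.scale(unit, lower, upper)
y = reservoir_response(X)

# Kriging-type metamodel with k-fold cross-validation of predictive skill.
gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
r2 = cross_val_score(gp, X, y, cv=5, scoring="r2")
print(f"5-fold cross-validated R^2 of the metamodel: {r2.mean():.3f}")

gp.fit(X, y)
print("prediction at a new design point:", gp.predict([[200.0, 0.2, 1.0]]))
```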
Mendez, Roberto Della Rosa; Rodrigues, Roberta Cunha Matheus; Spana, Thaís Moreira; Cornélio, Marília Estevam; Gallani, Maria Cecília Bueno Jayme; Pérez-Nebra, Amalia Raquel
2012-01-01
The aim was to validate the content of persuasive messages for promoting walking among patients with coronary heart disease (CHD). The messages were constructed to strengthen or change patients' attitudes to walking. The selection of persuasive arguments was based on behavioral beliefs (determinants of attitude) related to walking. The messages were constructed based on the Elaboration Likelihood Model and were submitted to content validation. The data were analyzed with the content validity index and by the importance which the patients attributed to the messages' persuasive arguments. Positive behavioral beliefs (i.e., positive and negative reinforcement) and self-efficacy were the appeals which the patients considered important. The messages with validation evidence will be tested in an intervention study for the promotion of physical activity among patients with CHD.
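Content validity index computation is straightforward once expert relevance ratings are tabulated; a minimal sketch with hypothetical ratings (the 0.78 retention criterion is a commonly used rule of thumb, not necessarily the study's threshold):

```python
import numpy as np

# Hypothetical expert relevance ratings (1-4) for 6 persuasive messages by 5 judges.
ratings = np.array([
    [4, 4, 3, 4, 4],
    [3, 4, 4, 4, 3],
    [4, 3, 4, 4, 4],
    [2, 3, 4, 3, 3],
    [4, 4, 4, 4, 4],
    [3, 4, 3, 4, 4],
])

# Item-level CVI: proportion of judges rating the message 3 or 4.
i_cvi = (ratings >= 3).mean(axis=1)
# Scale-level CVI (average approach).
s_cvi_ave = i_cvi.mean()

for k, v in enumerate(i_cvi, start=1):
    print(f"message {k}: I-CVI = {v:.2f}", "(retain)" if v >= 0.78 else "(revise)")
print(f"S-CVI/Ave = {s_cvi_ave:.2f}")
```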
A comparison of energy expenditure estimation of several physical activity monitors.
Dannecker, Kathryn L; Sazonova, Nadezhda A; Melanson, Edward L; Sazonov, Edward S; Browning, Raymond C
2013-11-01
Accurately and precisely estimating free-living energy expenditure (EE) is important for monitoring energy balance and quantifying physical activity. Recently, single and multisensor devices have been developed that can classify physical activities, potentially resulting in improved estimates of EE. This study aimed to determine the validity of EE estimation of a footwear-based physical activity monitor and to compare this validity against a variety of research and consumer physical activity monitors. Nineteen healthy young adults (10 men, 9 women) completed a 4-h stay in a room calorimeter. Participants wore a footwear-based physical activity monitor as well as Actical, ActiGraph, IDEEA, DirectLife, and Fitbit devices. Each individual performed a series of postures/activities. We developed models to estimate EE from the footwear-based device, and we used the manufacturer's software to estimate EE for all other devices. Estimated EE using the shoe-based device was not significantly different than measured EE (mean ± SE; 476 ± 20 vs 478 ± 18 kcal, respectively) and had a root-mean-square error of 29.6 kcal (6.2%). The IDEEA and the DirectLife estimates of EE were not significantly different than the measured EE, but the ActiGraph and the Fitbit devices significantly underestimated EE. Root-mean-square errors were 93.5 kcal (19%), 62.1 kcal (14%), 88.2 kcal (18%), 136.6 kcal (27%), 130.1 kcal (26%), and 143.2 kcal (28%) for Actical, DirectLife, IDEEA, ActiGraph, and Fitbit, respectively. The shoe-based physical activity monitor provides a valid estimate of EE, whereas the other physical activity monitors tested have a wide range of validity when estimating EE. Our results also demonstrate that estimating EE based on classification of physical activities can be more accurate and precise than estimating EE based on total physical activity.
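The error figures quoted above follow directly from the definition of the root-mean-square error and its ratio to the mean measured EE. A minimal sketch with made-up numbers (not the study data):

```python
# Minimal sketch: RMSE and relative RMSE of device EE estimates against room-calorimeter EE.
import numpy as np

measured = np.array([480.0, 455.0, 510.0, 470.0])   # kcal, calorimeter (illustrative)
estimated = np.array([500.0, 430.0, 540.0, 450.0])  # kcal, device (illustrative)

rmse = np.sqrt(np.mean((estimated - measured) ** 2))
print(f"RMSE = {rmse:.1f} kcal ({100 * rmse / measured.mean():.1f}% of mean measured EE)")
```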
NASA Astrophysics Data System (ADS)
Wang, W. L.; Zhou, Z. R.; Yu, D. S.; Qin, Q. H.; Iwnicki, S.
2017-10-01
A full nonlinear physical 'in-service' model was built for a rail vehicle secondary suspension hydraulic damper with shim-pack-type valves. In the modelling process, a shim pack deflection theory with an equivalent-pressure correction factor was proposed, and a Finite Element Analysis (FEA) approach was applied. Bench test results validated the damper model over its full velocity range and thus also proved that the proposed shim pack deflection theory and the FEA-based parameter identification approach are effective. The validated full damper model was subsequently incorporated into a detailed vehicle dynamics simulation to study how its key in-service parameter variations influence the secondary-suspension-related vehicle system dynamics. The obtained nonlinear physical in-service damper model and the vehicle dynamic response characteristics in this study could be used in the product design optimization and nonlinear optimal specifications of high-speed rail hydraulic dampers.
NASA Technical Reports Server (NTRS)
Celaya, Jose R.; Saha, Sankalita; Goebel, Kai
2011-01-01
Accelerated aging methodologies for electrolytic components have been designed and accelerated aging experiments have been carried out. The methodology is based on imposing electrical and/or thermal overstresses via electrical power cycling in order to mimic real-world operating behavior. Data are collected in situ and offline in order to periodically characterize the devices' electrical performance as they age. The data generated through these experiments are meant to provide capability for the validation of prognostic algorithms (both model-based and data-driven). Furthermore, the data allow validation of physics-based and empirically based degradation models for this type of capacitor. A first set of models and algorithms has been designed and tested on the data.
Web Based Semi-automatic Scientific Validation of Models of the Corona and Inner Heliosphere
NASA Astrophysics Data System (ADS)
MacNeice, P. J.; Chulaki, A.; Taktakishvili, A.; Kuznetsova, M. M.
2013-12-01
Validation is a critical step in preparing models of the corona and inner heliosphere for future roles supporting either or both the scientific research community and the operational space weather forecasting community. Validation of forecasting quality tends to focus on a short list of key features in the model solutions, with an unchanging order of priority. Scientific validation exposes a much larger range of physical processes and features, and as the models evolve to better represent features of interest, the research community tends to shift its focus to other areas which are less well understood and modeled. Given the more comprehensive and dynamic nature of scientific validation, and the limited resources available to the community to pursue this, it is imperative that the community establish a semi-automated process which engages the model developers directly in an ongoing and evolving validation process. In this presentation we describe the ongoing design and development of a web-based facility to enable this type of validation of models of the corona and inner heliosphere, the growing list of model results being generated, and the strategies we have been developing to account for model results that incorporate adaptively refined numerical grids.
De Leersnyder, Fien; Peeters, Elisabeth; Djalabi, Hasna; Vanhoorne, Valérie; Van Snick, Bernd; Hong, Ke; Hammond, Stephen; Liu, Angela Yang; Ziemons, Eric; Vervaet, Chris; De Beer, Thomas
2018-03-20
A calibration model for in-line API quantification based on near infrared (NIR) spectra collection during tableting in the tablet press feed frame was developed and validated. First, the measurement set-up was optimised and the effect of the filling degree of the feed frame on the NIR spectra was investigated. Secondly, a predictive API quantification model was developed and validated by calculating the accuracy profile based on the analysis results of validation experiments. Furthermore, based on the data of the accuracy profile, the measurement uncertainty was determined. Finally, the robustness of the API quantification model was evaluated. An NIR probe (SentroPAT FO) was implemented into the feed frame of a rotary tablet press (Modul™ P) to monitor physical mixtures of a model API (sodium saccharine) and excipients with two different API target concentrations: 5 and 20% (w/w). Cutting notches into the paddle wheel fingers avoided disturbances of the NIR signal caused by the rotating paddle wheel fingers and hence allowed better and more complete feed frame monitoring. The effect of the design of the notched paddle wheel fingers was also investigated: straight paddle wheel fingers caused less variation in the NIR signal than curved paddle wheel fingers. The filling degree of the feed frame was reflected in the raw NIR spectra. Several different calibration models for the prediction of the API content were developed, based on the use of single spectra or averaged spectra, and using partial least squares (PLS) regression or ratio models. These predictive models were then evaluated and validated by processing physical mixtures with different API concentrations not used in the calibration models (validation set). The β-expectation tolerance intervals were calculated for each model and for each of the validated API concentration levels (β was set at 95%). PLS models showed the best predictive performance. For each examined saccharine concentration range (i.e., between 4.5 and 6.5% and between 15 and 25%), at least 95% of future measurements will not deviate more than 15% from the true value. Copyright © 2018 Elsevier B.V. All rights reserved.
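As a rough illustration of the PLS calibration step described above (not the authors' code or data), the sketch below regresses API concentration on synthetic NIR-like spectra and predicts an independent validation set; the spectral shapes and concentration levels are placeholders.

```python
# Sketch of a PLS calibration between spectra and API concentration (synthetic data only).
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(1)
wavelengths = 200
conc_cal = np.repeat([4.5, 5.0, 5.5, 6.0, 6.5], 10)   # calibration levels, % w/w
conc_val = np.repeat([4.75, 5.25, 5.75, 6.25], 10)    # independent validation levels

def synth_spectra(conc):
    base = np.linspace(0, 1, wavelengths)
    # API band plus excipient background plus noise (purely illustrative)
    return (conc[:, None] * np.exp(-((base - 0.3) ** 2) / 0.002)
            + 2.0 * np.exp(-((base - 0.7) ** 2) / 0.01)
            + 0.02 * rng.normal(size=(len(conc), wavelengths)))

pls = PLSRegression(n_components=3)
pls.fit(synth_spectra(conc_cal), conc_cal)
pred = pls.predict(synth_spectra(conc_val)).ravel()
rel_err = 100 * np.abs(pred - conc_val) / conc_val
print("max relative prediction error: %.1f %%" % rel_err.max())
```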
Prediction of energy expenditure and physical activity in preschoolers
USDA-ARS?s Scientific Manuscript database
Accurate, nonintrusive, and feasible methods are needed to predict energy expenditure (EE) and physical activity (PA) levels in preschoolers. Herein, we validated cross-sectional time series (CSTS) and multivariate adaptive regression splines (MARS) models based on accelerometry and heart rate (HR) ...
NASA Astrophysics Data System (ADS)
Shou, Y.; Combi, M.; Toth, G.; Tenishev, V.; Fougere, N.; Jia, X.; Rubin, M.; Huang, Z.; Hansen, K.; Gombosi, T.; Bieler, A.
2016-12-01
Physics-based numerical coma models are desirable whether to interpret the spacecraft observations of the inner coma or to compare with the ground-based observations of the outer coma. In this work, we develop a multi-neutral-fluid model based on the BATS-R-US code of the University of Michigan, which is capable of computing both the inner and outer coma and simulating time-variable phenomena. It treats H2O, OH, H2, O, and H as separate fluids and each fluid has its own velocity and temperature, with collisions coupling all fluids together. The self-consistent collisional interactions decrease the velocity differences, re-distribute the excess energy deposited by chemical reactions among all species, and account for the varying heating efficiency under various physical conditions. Recognizing that the fluid approach has limitations in capturing all of the correct physics for certain applications, especially for very low density environments, we applied our multi-fluid coma model to comet 67P/Churyumov-Gerasimenko at various heliocentric distances and demonstrated that it yields comparable results to the Direct Simulation Monte Carlo (DSMC) model, which is based on a kinetic approach that is valid under these conditions. Therefore, our model may be a powerful alternative to the particle-based model, especially for some computationally intensive simulations. In addition, by running the model with several combinations of production rates and heliocentric distances, we characterize the cometary H2O expansion speeds and demonstrate their nonlinear dependence on production rate and heliocentric distance. Our results are also compared to previous modeling work and remote observations, which serve as further validation of our model.
Predicting subsurface contaminant transport and transformation requires mathematical models based on a variety of physical, chemical, and biological processes. The mathematical model is an attempt to quantitatively describe observed processes in order to permit systematic forecas...
An acoustic glottal source for vocal tract physical models
NASA Astrophysics Data System (ADS)
Hannukainen, Antti; Kuortti, Juha; Malinen, Jarmo; Ojalammi, Antti
2017-11-01
A sound source is proposed for the acoustic measurement of physical models of the human vocal tract. The physical models are produced by fast prototyping, based on magnetic resonance imaging during prolonged vowel production. The sound source, accompanied by custom signal processing algorithms, is used for two kinds of measurements from physical models of the vocal tract: (i) amplitude frequency response and resonant frequency measurements, and (ii) signal reconstructions at the source output according to a target pressure waveform with measurements at the mouth position. The proposed source and the software are validated by computational acoustics experiments and measurements on a physical model of the vocal tract corresponding to the vowels [] of a male speaker.
Nurmi, Johanna; Hagger, Martin S; Haukkala, Ari; Araújo-Soares, Vera; Hankonen, Nelli
2016-04-01
This study tested the predictive validity of a multitheory process model in which the effect of autonomous motivation from self-determination theory on physical activity participation is mediated by the adoption of self-regulatory techniques based on control theory. Finnish adolescents (N = 411, aged 17-19) completed a prospective survey including validated measures of the predictors and physical activity, at baseline and after one month (N = 177). A subsample used an accelerometer to objectively measure physical activity and further validate the physical activity self-report assessment tool (n = 44). Autonomous motivation statistically significantly predicted action planning, coping planning, and self-monitoring. Coping planning and self-monitoring mediated the effect of autonomous motivation on physical activity, although self-monitoring was the most prominent. Controlled motivation had no effect on self-regulation techniques or physical activity. Developing interventions that support autonomous motivation for physical activity may foster increased engagement in self-regulation techniques and positively affect physical activity behavior.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, Kandler A; Santhanagopalan, Shriram; Yang, Chuanbo
Computer models are helping to accelerate the design and validation of next generation batteries and provide valuable insights not possible through experimental testing alone. Validated 3-D physics-based models exist for predicting electrochemical performance, thermal and mechanical response of cells and packs under normal and abuse scenarios. The talk describes present efforts to make the models better suited for engineering design, including improving their computation speed, developing faster processes for model parameter identification including under aging, and predicting the performance of a proposed electrode material recipe a priori using microstructure models.
Etien, Erik
2013-05-01
This paper deals with the design of a speed soft sensor for an induction motor. The sensor is based on the physical model of the motor. Because the validation step highlights that the sensor cannot be validated for all operating points, the model is modified in order to obtain a fully validated sensor over the whole speed range. An original feature of the proposed approach is that the modified model is derived from stability analysis using automatic control theory. Copyright © 2012 ISA. Published by Elsevier Ltd. All rights reserved.
A methodology for the rigorous verification of plasma simulation codes
NASA Astrophysics Data System (ADS)
Riva, Fabio
2016-10-01
The methodology used to assess the reliability of numerical simulation codes constitutes the Verification and Validation (V&V) procedure. V&V is composed of two separate tasks: the verification, which is a mathematical issue targeted to assess that the physical model is correctly solved, and the validation, which determines the consistency of the code results, and therefore of the physical model, with experimental data. In the present talk we focus our attention on the verification, which in turn is composed of the code verification, targeted to assess that a physical model is correctly implemented in a simulation code, and the solution verification, which quantifies the numerical error affecting a simulation. Bridging the gap between plasma physics and other scientific domains, we introduced for the first time in our domain a rigorous methodology for code verification, based on the method of manufactured solutions, as well as a solution verification based on the Richardson extrapolation. This methodology was applied to GBS, a three-dimensional fluid code based on a finite difference scheme, used to investigate plasma turbulence in basic plasma physics experiments and in the tokamak scrape-off layer. Overcoming the difficulty of dealing with a numerical method intrinsically affected by statistical noise, we have now generalized the rigorous verification methodology to simulation codes based on the particle-in-cell algorithm, which are employed to solve the Vlasov equation in the investigation of a number of plasma physics phenomena.
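The solution-verification half of that methodology can be illustrated with Richardson extrapolation: given a scalar output computed on three systematically refined grids, the observed order of accuracy and an extrapolated value follow from the formulas below. The sample values are illustrative, not GBS output.

```python
# Sketch of solution verification via Richardson extrapolation: estimate the observed
# order of accuracy p and an extrapolated value from three grid levels (refinement ratio r).
import math

f_coarse, f_medium, f_fine = 1.095, 1.024, 1.006   # scalar output on coarse/medium/fine grids
r = 2.0                                            # grid refinement ratio

p = math.log(abs(f_coarse - f_medium) / abs(f_medium - f_fine)) / math.log(r)
f_exact_est = f_fine + (f_fine - f_medium) / (r ** p - 1)   # Richardson-extrapolated estimate
print(f"observed order p = {p:.2f}, extrapolated value = {f_exact_est:.4f}")
```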
Turusheva, Anna; Frolova, Elena; Bert, Vaes; Hegendoerfer, Eralda; Degryse, Jean-Marie
2017-07-01
Prediction models help to make decisions about further management in clinical practice. This study aims to develop a mortality risk score based on previously identified risk predictors and to perform internal and external validation. In a population-based prospective cohort study of 611 community-dwelling individuals aged 65+ in St. Petersburg (Russia), all-cause mortality risks over 2.5 years of follow-up were determined based on the results obtained from anthropometry, medical history, physical performance tests, spirometry and laboratory tests. C-statistic, risk reclassification analysis, integrated discrimination improvement analysis, decision curve analysis, internal validation and external validation were performed. Older adults were at higher risk for mortality [HR (95%CI) = 4.54 (3.73-5.52)] when two or more of the following components were present: poor physical performance, low muscle mass, poor lung function, and anemia. When anemia was combined with high C-reactive protein (CRP) and high B-type natriuretic peptide (BNP) was added, the HR (95%CI) was slightly higher [5.81 (4.73-7.14)], even after adjusting for age, sex and comorbidities. Our models were validated in an external population of adults 80+. The extended model had a better predictive capacity for cardiovascular mortality [HR (95%CI) = 5.05 (2.23-11.44)] compared to the baseline model [HR (95%CI) = 2.17 (1.18-4.00)] in the external population. We developed and validated a new risk prediction score that may be used to identify older adults at higher risk for mortality in Russia. Additional studies need to determine which targeted interventions improve the outcomes of these at-risk individuals. Copyright © 2017 Elsevier B.V. All rights reserved.
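As an illustration of the C-statistic reported above, the sketch below computes Harrell's concordance index for a risk score against censored survival data; the small synthetic dataset stands in for the cohort data.

```python
# Sketch: Harrell's concordance index (C-statistic) for a risk score against
# censored survival data. Data below are synthetic, not the St. Petersburg cohort.
import numpy as np

def concordance_index(time, event, risk):
    """Fraction of usable pairs in which the higher-risk subject fails earlier."""
    concordant = tied = usable = 0
    n = len(time)
    for i in range(n):
        for j in range(i + 1, n):
            if time[i] == time[j]:
                continue  # tied times skipped in this simplified version
            first, second = (i, j) if time[i] < time[j] else (j, i)
            if not event[first]:
                continue  # a pair is usable only if the earlier time is an observed event
            usable += 1
            if risk[first] > risk[second]:
                concordant += 1
            elif risk[first] == risk[second]:
                tied += 1
    return (concordant + 0.5 * tied) / usable

time = np.array([2.5, 1.0, 2.0, 0.8, 2.5, 1.5])   # years of follow-up
event = np.array([0, 1, 1, 1, 0, 1], dtype=bool)  # 1 = died, 0 = censored
risk = np.array([1, 4, 2, 5, 0, 3])               # e.g., number of deficit components present
print("C-index:", round(concordance_index(time, event, risk), 2))
```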
Thorne, M C; Degnan, P; Ewen, J; Parkin, G
2000-12-01
The physically based river catchment modelling system SHETRAN incorporates components representing water flow, sediment transport and radionuclide transport both in solution and bound to sediments. The system has been applied to simulate hypothetical future catchments in the context of post-closure radiological safety assessments of a potential site for a deep geological disposal facility for intermediate and certain low-level radioactive wastes at Sellafield, west Cumbria. In order to have confidence in the application of SHETRAN for this purpose, various blind validation studies have been undertaken. In earlier studies, the validation was undertaken against uncertainty bounds in model output predictions set by the modelling team on the basis of how well they expected the model to perform. However, validation can also be carried out with bounds set on the basis of how well the model is required to perform in order to constitute a useful assessment tool. Herein, such an assessment-based validation exercise is reported. This exercise related to a field plot experiment conducted at Calder Hollow, west Cumbria, in which the migration of strontium and lanthanum in subsurface Quaternary deposits was studied on a length scale of a few metres. Blind predictions of tracer migration were compared with experimental results using bounds set by a small group of assessment experts independent of the modelling team. Overall, the SHETRAN system performed well, failing only two out of seven of the imposed tests. Furthermore, of the five tests that were not failed, three were positively passed even when a pessimistic view was taken as to how measurement errors should be taken into account. It is concluded that the SHETRAN system, which is still being developed further, is a powerful tool for application in post-closure radiological safety assessments.
NASA Astrophysics Data System (ADS)
Roushangar, Kiyoumars; Mehrabani, Fatemeh Vojoudi; Shiri, Jalal
2014-06-01
This study presents Artificial Intelligence (AI)-based modeling of total bed material load, aiming to improve the accuracy of the predictions of traditional models. Gene expression programming (GEP) and adaptive neuro-fuzzy inference system (ANFIS)-based models were developed and validated for the estimations. Sediment data from the Qotur River (northwestern Iran) were used for development and validation of the applied techniques. In order to assess the applied techniques against traditional models, stream-power-based and shear-stress-based physical models were also applied to the studied case. The obtained results reveal that the developed AI-based models, using a minimum number of dominant factors, give more accurate results than the other applied models. It was also found that the k-fold test is a practical but computationally costly technique for complete scanning of the applied data and for avoiding over-fitting.
A Stratified Acoustic Model Accounting for Phase Shifts for Underwater Acoustic Networks
Wang, Ping; Zhang, Lin; Li, Victor O. K.
2013-01-01
Accurate acoustic channel models are critical for the study of underwater acoustic networks. Existing models include physics-based models and empirical approximation models. The former enjoy good accuracy, but incur heavy computational load, rendering them impractical in large networks. On the other hand, the latter are computationally inexpensive but inaccurate since they do not account for the complex effects of boundary reflection losses, the multi-path phenomenon and ray bending in the stratified ocean medium. In this paper, we propose a Stratified Acoustic Model (SAM) based on frequency-independent geometrical ray tracing, accounting for each ray's phase shift during the propagation. It is a feasible channel model for large scale underwater acoustic network simulation, allowing us to predict the transmission loss with much lower computational complexity than the traditional physics-based models. The accuracy of the model is validated via comparisons with the experimental measurements in two different oceans. Satisfactory agreements with the measurements and with other computationally intensive classical physics-based models are demonstrated. PMID:23669708
A stratified acoustic model accounting for phase shifts for underwater acoustic networks.
Wang, Ping; Zhang, Lin; Li, Victor O K
2013-05-13
Accurate acoustic channel models are critical for the study of underwater acoustic networks. Existing models include physics-based models and empirical approximation models. The former enjoy good accuracy, but incur heavy computational load, rendering them impractical in large networks. On the other hand, the latter are computationally inexpensive but inaccurate since they do not account for the complex effects of boundary reflection losses, the multi-path phenomenon and ray bending in the stratified ocean medium. In this paper, we propose a Stratified Acoustic Model (SAM) based on frequency-independent geometrical ray tracing, accounting for each ray's phase shift during the propagation. It is a feasible channel model for large scale underwater acoustic network simulation, allowing us to predict the transmission loss with much lower computational complexity than the traditional physics-based models. The accuracy of the model is validated via comparisons with the experimental measurements in two different oceans. Satisfactory agreements with the measurements and with other computationally intensive classical physics-based models are demonstrated.
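SAM itself is not reproduced here, but a toy two-path example (a direct ray plus a surface-reflected ray with a -1 reflection coefficient) shows why tracking each ray's phase shift matters: coherent and incoherent summation give different transmission losses. The geometry and frequency below are hypothetical.

```python
# Toy illustration (not the SAM implementation): coherent vs incoherent summation of a
# direct ray and a surface-reflected ray, showing the effect of per-ray phase shifts.
import numpy as np

c = 1500.0          # nominal sound speed, m/s
f = 10e3            # frequency, Hz
k = 2 * np.pi * f / c
zs, zr = 20.0, 30.0                       # source and receiver depths, m (hypothetical)
ranges = np.linspace(100.0, 2000.0, 5)

for r in ranges:
    r_direct = np.hypot(r, zr - zs)
    r_surface = np.hypot(r, zr + zs)      # image source mirrored above the sea surface
    # pressure-release surface: reflection coefficient -1
    p_coherent = np.exp(1j * k * r_direct) / r_direct - np.exp(1j * k * r_surface) / r_surface
    p_incoherent = np.sqrt(1.0 / r_direct**2 + 1.0 / r_surface**2)
    tl_coh = -20 * np.log10(np.abs(p_coherent))
    tl_inc = -20 * np.log10(p_incoherent)
    print(f"range {r:6.0f} m: TL coherent {tl_coh:5.1f} dB, incoherent {tl_inc:5.1f} dB")
```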
Learning Physics-based Models in Hydrology under the Framework of Generative Adversarial Networks
NASA Astrophysics Data System (ADS)
Karpatne, A.; Kumar, V.
2017-12-01
Generative adversarial networks (GANs), which have been highly successful in a number of applications involving large volumes of labeled and unlabeled data such as computer vision, offer huge potential for modeling the dynamics of physical processes that have been traditionally studied using simulations of physics-based models. While conventional physics-based models use labeled samples of input/output variables for model calibration (estimating the right parametric forms of relationships between variables) or data assimilation (identifying the most likely sequence of system states in dynamical systems), there is a greater opportunity to explore the full power of machine learning (ML) methods (e.g., GANs) for studying physical processes currently suffering from large knowledge gaps, e.g. ground-water flow. However, success in this endeavor requires a principled way of combining the strengths of ML methods with physics-based numerical models that are founded on a wealth of scientific knowledge. This is especially important in scientific domains like hydrology where the number of data samples is small (relative to Internet-scale applications such as image recognition where machine learning methods have found great success), and the physical relationships are complex (high-dimensional) and non-stationary. We will present a series of methods for guiding the learning of GANs using physics-based models, e.g., by using the outputs of physics-based models as input data to the generator-learner framework, and by using physics-based models as generators trained using validation data in the adversarial learning framework. These methods are being developed under the broad paradigm of theory-guided data science that we are developing to integrate scientific knowledge with data science methods for accelerating scientific discovery.
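One of the ideas sketched above, feeding physics-based model outputs to the generator as conditioning input, can be illustrated with a deliberately minimal conditional GAN. This is an assumption-laden toy in PyTorch, not the authors' framework; the "physics output", network sizes and training settings are placeholders.

```python
# Minimal sketch (not the authors' framework): a conditional GAN whose generator takes a
# physics-based model output as part of its input, so the learned generator refines the
# physics prediction toward observed data. All data here are synthetic.
import torch
import torch.nn as nn

torch.manual_seed(0)
n, d = 512, 1

# Synthetic stand-ins: physics-model prediction x_phys and "observed" response y_obs
x_phys = torch.linspace(-1, 1, n).unsqueeze(1)
y_obs = x_phys ** 2 + 0.3 * torch.sin(4 * x_phys) + 0.05 * torch.randn(n, d)

G = nn.Sequential(nn.Linear(d + 4, 32), nn.ReLU(), nn.Linear(32, d))   # physics output + noise -> refined output
D = nn.Sequential(nn.Linear(2 * d, 32), nn.ReLU(), nn.Linear(32, 1))   # (physics output, response) -> real/fake logit
opt_g = torch.optim.Adam(G.parameters(), lr=1e-3)
opt_d = torch.optim.Adam(D.parameters(), lr=1e-3)
bce = nn.BCEWithLogitsLoss()

for step in range(2000):
    z = torch.randn(n, 4)
    y_fake = G(torch.cat([x_phys, z], dim=1))

    # discriminator: distinguish (physics input, observed) from (physics input, generated)
    d_loss = bce(D(torch.cat([x_phys, y_obs], 1)), torch.ones(n, 1)) + \
             bce(D(torch.cat([x_phys, y_fake.detach()], 1)), torch.zeros(n, 1))
    opt_d.zero_grad(); d_loss.backward(); opt_d.step()

    # generator: fool the discriminator
    g_loss = bce(D(torch.cat([x_phys, y_fake], 1)), torch.ones(n, 1))
    opt_g.zero_grad(); g_loss.backward(); opt_g.step()

print("final D loss %.3f, G loss %.3f" % (d_loss.item(), g_loss.item()))
```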
Development and evaluation of social cognitive measures related to adolescent physical activity.
Dewar, Deborah L; Lubans, David Revalds; Morgan, Philip James; Plotnikoff, Ronald C
2013-05-01
This study aimed to develop and evaluate the construct validity and reliability of modernized social cognitive measures relating to physical activity behaviors in adolescents. An instrument was developed based on constructs from Bandura's Social Cognitive Theory and included the following scales: self-efficacy, situation (perceived physical environment), social support, behavioral strategies, and outcome expectations and expectancies. The questionnaire was administered in a sample of 171 adolescents (age = 13.6 ± 1.2 years, females = 61%). Confirmatory factor analysis was employed to examine model-fit for each scale using multiple indices, including chi-square index, comparative-fit index (CFI), goodness-of-fit index (GFI), and the root mean square error of approximation (RMSEA). Reliability properties were also examined (ICC and Cronbach's alpha). Each scale represented a statistically sound measure: fit indices indicated each model to be an adequate-to-exact fit to the data; internal consistency was acceptable to good (α = 0.63-0.79); rank order repeatability was strong (ICC = 0.82-0.91). Results support the validity and reliability of social cognitive scales relating to physical activity among adolescents. As such, the developed scales have utility for the identification of potential social cognitive correlates of youth physical activity, mediators of physical activity behavior changes and the testing of theoretical models based on Social Cognitive Theory.
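The internal-consistency figures quoted above come from Cronbach's alpha, which is straightforward to compute from an items-by-respondents matrix; the random responses below are placeholders for the questionnaire data.

```python
# Sketch: Cronbach's alpha for one scale from a (respondents x items) score matrix.
import numpy as np

def cronbach_alpha(scores):
    """scores: (n_respondents, n_items) matrix."""
    k = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1)
    total_var = scores.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

rng = np.random.default_rng(2)
latent = rng.normal(size=(171, 1))                    # common factor per respondent
items = latent + 0.8 * rng.normal(size=(171, 6))      # 6 correlated items (synthetic)
print("alpha = %.2f" % cronbach_alpha(items))
```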
Passive Optical Technique to Measure Physical Properties of a Vibrating Surface
2014-01-01
It is not necessary to understand the details of a non-Lambertian BRDF to detect surface vibration phenomena, although an accurate model incorporating physics ... To summarize the discussion of BRDF: while a physics-based BRDF model is not necessary to use scattered light as a surface vibration diagnostic, it may ...
Crins, Martine H P; Terwee, Caroline B; Klausch, Thomas; Smits, Niels; de Vet, Henrica C W; Westhovens, Rene; Cella, David; Cook, Karon F; Revicki, Dennis A; van Leeuwen, Jaap; Boers, Maarten; Dekker, Joost; Roorda, Leo D
2017-07-01
The objective of this study was to assess the psychometric properties of the Dutch-Flemish Patient-Reported Outcomes Measurement Information System (PROMIS) Physical Function item bank in Dutch patients with chronic pain. A bank of 121 items was administered to 1,247 Dutch patients with chronic pain. Unidimensionality was assessed by fitting a one-factor confirmatory factor analysis and evaluating the resulting fit statistics. Items were calibrated with the graded response model and its fit was evaluated. Cross-cultural validity was assessed by testing items for differential item functioning (DIF) based on language (Dutch vs. English). Construct validity was evaluated by calculating correlations between scores on the Dutch-Flemish PROMIS Physical Function measure and scores on generic and disease-specific measures. Results supported the Dutch-Flemish PROMIS Physical Function item bank's unidimensionality (Comparative Fit Index = 0.976, Tucker Lewis Index = 0.976) and model fit. Item thresholds targeted a wide range of the physical function construct (threshold-parameters range: -4.2 to 5.6). Cross-cultural validity was good, as only four items showed DIF for language and their impact on item scores was minimal. Physical Function scores were strongly associated with scores on all other measures (all correlations ≤ -0.60 as expected). The Dutch-Flemish PROMIS Physical Function item bank exhibited good psychometric properties. Development of a computer adaptive test based on the large bank is warranted. Copyright © 2017 Elsevier Inc. All rights reserved.
Validation of the Physical Activity Questionnaire for Older Children (PAQ-C) among Chinese Children.
Wang, Jing Jing; Baranowski, Tom; Lau, Wc Patrick; Chen, Tzu An; Pitkethly, Amanda Jane
2016-03-01
This study initially validates the Chinese version of the Physical Activity Questionnaire for Older Children (PAQ-C), which has been identified as a potentially valid instrument to assess moderate-to-vigorous physical activity (MVPA) in children among diverse racial groups. The psychometric properties of the PAQ-C with 742 Hong Kong Chinese children were assessed with the scale's internal consistency reliability, test-retest reliability, confirmatory factor analysis (CFA) in the overall sample, and multistep invariance tests across gender groups, as well as convergent validity with body mass index (BMI) and accelerometry-based MVPA. The Cronbach alpha coefficient (α=0.79), composite reliability value (ρ=0.81), and the intraclass correlation coefficient (α=0.82) indicate the satisfactory reliability of the PAQ-C score. The CFA indicated data fit a single factor model, suggesting that the PAQ-C measures only one construct, namely MVPA over the previous 7 days. The multiple-group CFAs suggested that the factor loadings and variances and covariances of the PAQ-C measurement model were invariant across gender groups. The PAQ-C score was related to accelerometry-based MVPA (r=0.33) and inversely related to BMI (r=-0.18). This study demonstrates the reliability and validity of the PAQ-C in Chinese children. Copyright © 2016 The Editorial Board of Biomedical and Environmental Sciences. Published by China CDC. All rights reserved.
[New questionnaire to assess self-efficacy toward physical activity in children].
Aedo, Angeles; Avila, Héctor
2009-10-01
To design a questionnaire for assessment of self-efficacy toward physical activity in school children, as well as to measure its construct validity, test-retest reliability, and internal consistency. A four-stage multimethod approach was used: (1) bibliographic research followed by exploratory study and the formulation of questions and responses based on a dichotomous scale of 14 items; (2) validation of the content by a panel of experts; (3) application of the preliminary version of the questionnaire to a sample of 900 school-aged children in Mexico City; and (4) determination of the construct validity, test-retest reliability, and internal consistency (Cronbach's alpha). Three factors were identified that explain 64.15% of the variance: the search for positive alternatives to physical activity, ability to deal with possible barriers to exercising, and expectations of skill or competence. The model was validated using the goodness of fit, and the result of 65% less than 0.05 indicated that the estimated factor model fit the data. Cronbach's consistency alpha was 0.733; test-retest reliability was 0.867. The scale designed has adequate reliability and validity. These results are a good indicator of self-efficacy toward physical activity in school children, which is important when developing programs intended to promote such behavior in this age group.
Source Physics Experiments at the Nevada Test Site
2010-09-01
... seismograms through three-dimensional models of the earth will move monitoring science into a physics-based era. This capability should enable ... the advanced ability to model synthetic seismograms in three-dimensional earth models should also lead to advances in the ability to locate and ...
Prediction of physical workload in reduced gravity environments
NASA Technical Reports Server (NTRS)
Goldberg, Joseph H.
1987-01-01
The background, development, and application of a methodology to predict human energy expenditure and physical workload in low gravity environments, such as a Lunar or Martian base, are described. Based on a validated model for predicting energy expenditures in Earth-based industrial jobs, the model relies on an elemental analysis of the proposed job. Because the job itself need not physically exist, many alternative job designs may be compared in terms of their physical workload. The feasibility of using the model for prediction of low gravity work was evaluated by lowering body and load weights while maintaining basal energy expenditure. Comparison of model results was made both with simulated low gravity energy expenditure studies and with reported Apollo 14 Lunar EVA expenditure. Prediction accuracy was very good for walking and for cart pulling on slopes less than 15 deg, but the model underpredicted the most difficult work conditions. This model was applied to example core sampling and facility construction jobs, as presently conceptualized for a Lunar or Martian base. Resultant energy expenditures and suggested work-rest cycles were well within the range of moderate work difficulty. Future model development requirements were also discussed.
Fricke, Moritz B; Rolfes, Raimund
2015-03-01
An approach for the prediction of underwater noise caused by impact pile driving is described and validated based on in situ measurements. The model is divided into three sub-models. The first sub-model, based on the finite element method, is used to describe the vibration of the pile and the resulting acoustic radiation into the surrounding water and soil column. The mechanical excitation of the pile by the piling hammer is estimated by the second sub-model using an analytical approach which takes the large vertical dimension of the ram into account. The third sub-model is based on the split-step Padé solution of the parabolic equation and targets the long-range propagation up to 20 km. In order to presume realistic environmental properties for the validation, a geoacoustic model is derived from spatially averaged geological information about the investigation area. Although it can be concluded from the validation that the model and the underlying assumptions are appropriate, there are some deviations between modeled and measured results. Possible explanations for the observed errors are discussed.
A Baseline Patient Model to Support Testing of Medical Cyber-Physical Systems.
Silva, Lenardo C; Perkusich, Mirko; Almeida, Hyggo O; Perkusich, Angelo; Lima, Mateus A M; Gorgônio, Kyller C
2015-01-01
Medical Cyber-Physical Systems (MCPS) are currently a trending topic of research. The main challenges are related to the integration and interoperability of connected medical devices, patient safety, physiologic closed-loop control, and the verification and validation of these systems. In this paper, we focus on patient safety and MCPS validation. We present a formal patient model to be used in health care systems validation without jeopardizing the patient's health. To determine the basic patient conditions, our model considers the four main vital signs: heart rate, respiratory rate, blood pressure and body temperature. To generate the vital signs we used regression models based on statistical analysis of a clinical database. Our solution should be used as a starting point for a behavioral patient model and adapted to specific clinical scenarios. We present the modeling process of the baseline patient model and show its evaluation. The conception process may be used to build different patient models. The results show the feasibility of the proposed model as an alternative to the immediate need for clinical trials to test these medical systems.
NASA Astrophysics Data System (ADS)
Yusliana Ekawati, Elvin
2017-01-01
This study aimed to produce a model for assessing scientific attitudes through observation in physics learning based on the scientific approach (a case study of the dynamic fluid topic in high school). The development of the instruments in this study adapted the Plomp model; the procedure includes the initial investigation, design, construction, testing, evaluation and revision. The test was carried out in Surakarta, and the data obtained were analyzed using the Aiken formula to determine the content validity of the instrument, Cronbach's alpha to determine the reliability of the instrument, and construct validity using confirmatory factor analysis with the LISREL 8.50 program. The results of this research were conceptual models, instruments and guidelines for scientific attitude assessment by observation. The assessed constructs include components of curiosity, objectivity, suspended judgment, open-mindedness, honesty and perseverance. The construct validity of the instruments was satisfactory (factor loadings > 0.3). The reliability of the model is good, with an alpha value of 0.899 (> 0.7). The test showed that the theoretical model fits the empirical data, with p-value 0.315 (≥ 0.05) and RMSEA 0.027 (≤ 0.08).
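The Aiken formula used for the content-validity step has a simple closed form, V = Σ(r_i − l) / (n(c − 1)), where r_i are the expert ratings, l is the lowest possible rating, n the number of experts and c the number of rating categories. A small sketch with hypothetical ratings:

```python
# Sketch of Aiken's V for the content validity of one item. Ratings below are hypothetical.
def aiken_v(ratings, low=1, high=5):
    n = len(ratings)
    c = high - low + 1
    s = sum(r - low for r in ratings)
    return s / (n * (c - 1))

expert_ratings = [5, 4, 5, 4, 5]   # five experts, 1-5 relevance scale
print("Aiken's V = %.2f" % aiken_v(expert_ratings))   # values near 1 indicate high content validity
```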
Overview of physical models of liquid entrainment in annular gas-liquid flow
NASA Astrophysics Data System (ADS)
Cherdantsev, Andrey V.
2018-03-01
A number of recent papers devoted to development of physically-based models for prediction of liquid entrainment in annular regime of two-phase flow are analyzed. In these models shearing-off the crests of disturbance waves by the gas drag force is supposed to be the physical mechanism of entrainment phenomenon. The models are based on a number of assumptions on wavy structure, including inception of disturbance waves due to Kelvin-Helmholtz instability, linear velocity profile inside liquid film and high degree of three-dimensionality of disturbance waves. Validity of the assumptions is analyzed by comparison to modern experimental observations. It was shown that nearly every assumption is in strong qualitative and quantitative disagreement with experiments, which leads to massive discrepancies between the modeled and real properties of the disturbance waves. As a result, such models over-predict the entrained fraction by several orders of magnitude. The discrepancy is usually reduced using various kinds of empirical corrections. This, combined with empiricism already included in the models, turns the models into another kind of empirical correlations rather than physically-based models.
Velocity Model Using the Large-N Seismic Array from the Source Physics Experiment (SPE)
NASA Astrophysics Data System (ADS)
Chen, T.; Snelson, C. M.
2016-12-01
The Source Physics Experiment (SPE) is a multi-institutional, multi-disciplinary project that consists of a series of chemical explosions conducted at the Nevada National Security Site (NNSS). The goal of SPE is to understand the complicated effect of geological structures on seismic wave propagation and source energy partitioning, develop and validate physics-based modeling, and ultimately better monitor low-yield nuclear explosions. A Large-N seismic array was deployed at the SPE site to image the full 3D wavefield from the most recent SPE-5 explosion on April 26, 2016. The Large-N seismic array consists of 996 geophones (half three-component and half vertical-component sensors), and operated for one month, recording the SPE-5 shot, ambient noise, and additional controlled-sources (a large hammer). This study uses Large-N array recordings of the SPE-5 chemical explosion to develop high resolution images of local geologic structures. We analyze different phases of recorded seismic data and construct a velocity model based on arrival times. The results of this study will be incorporated into the large modeling and simulation efforts as ground-truth further validating the models.
Sun, Xingshu; Silverman, Timothy; Garris, Rebekah; ...
2016-07-18
In this study, we present a physics-based analytical model for copper indium gallium diselenide (CIGS) solar cells that describes the illumination- and temperature-dependent current-voltage (I-V) characteristics and accounts for the statistical shunt variation of each cell. The model is derived by solving the drift-diffusion transport equation so that its parameters are physical and, therefore, can be obtained from independent characterization experiments. The model is validated against CIGS I-V characteristics as a function of temperature and illumination intensity. This physics-based model can be integrated into a large-scale simulation framework to optimize the performance of solar modules, as well as predict the long-term output yields of photovoltaic farms under different environmental conditions.
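The published model is derived from drift-diffusion transport; as a much simpler stand-in, the sketch below uses the standard single-diode cell equation with a lognormally distributed shunt resistance to show how statistical shunt variation shifts the I-V curve. All parameter values are hypothetical.

```python
# Illustrative stand-in (not the published drift-diffusion model): single-diode equation
# I = Iph - I0*(exp(q(V+I*Rs)/(n*k*T)) - 1) - (V+I*Rs)/Rsh, solved per voltage point,
# with Rsh drawn from a lognormal distribution to mimic cell-to-cell shunt variation.
import numpy as np
from scipy.optimize import brentq

q, kB = 1.602e-19, 1.381e-23
T, n_ideal = 300.0, 1.5
Iph, I0, Rs = 0.035, 1e-9, 0.5          # A, A, ohm (hypothetical cell parameters)

def current(V, Rsh):
    f = lambda I: Iph - I0 * (np.exp(q * (V + I * Rs) / (n_ideal * kB * T)) - 1) \
                  - (V + I * Rs) / Rsh - I
    return brentq(f, -1.0, 1.0)

rng = np.random.default_rng(3)
for Rsh in rng.lognormal(mean=np.log(300.0), sigma=0.8, size=3):   # statistical shunt variation
    V = np.linspace(0.0, 0.65, 14)
    I = np.array([current(v, Rsh) for v in V])
    print(f"Rsh = {Rsh:6.0f} ohm, short-circuit current = {I[0]*1e3:.1f} mA, "
          f"max power = {np.max(I * V)*1e3:.1f} mW")
```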
NASA Astrophysics Data System (ADS)
Shou, Yinsi; Combi, Michael R.; Toth, Gabor; Huang, Zhenguang; Jia, Xianzhe; Fougere, Nicolas; Tenishev, Valeriy; Gombosi, T. I.; Hansen, Kenneth C.; Bieler, Andre
2016-10-01
Physics-based numerical coma models are desirable whether to interpret the spacecraft observations of the inner coma or to compare with the ground-based observations of the outer coma. In this work, we develop a multi-neutral-fluid model based on BATS-R-US in the University of Michigan's SWMF (Space Weather Modeling Framework), which is capable of computing both the inner and the outer coma and simulating time-variable phenomena. It treats H2O, OH, H2, O, and H as separate fluids and each fluid has its own velocity and temperature, with collisions coupling all fluids together. The self-consistent collisional interactions decrease the velocity differences, re-distribute the excess energy deposited by chemical reactions among all species, and account for the varying heating efficiency under various physical conditions. Recognizing that the fluid approach has limitations in capturing all of the correct physics for certain applications, especially for very low density environments, we applied our multi-fluid coma model to comet 67P/Churyumov-Gerasimenko (CG) at various heliocentric distances and demonstrated that it is able to yield results comparable to those of the Direct Simulation Monte Carlo (DSMC) model, which is based on a kinetic approach that is valid under these conditions. Therefore, our model may be a powerful alternative to the particle-based model, especially for some computationally intensive simulations. In addition, by running the model with several combinations of production rates and heliocentric distances, we can characterize the cometary H2O expansion speeds and demonstrate the nonlinear effect of production rates or photochemical heating. Our results are also compared to previous modeling work (e.g., Bockelee-Morvan & Crovisier 1987) and remote observations (e.g., Tseng et al. 2007), which serve as further validation of our model. This work has been partially supported by grant NNX14AG84G from the NASA Planetary Atmospheres Program, and US Rosetta contracts JPL #1266313, JPL #1266314 and JPL #1286489.
Object-Oriented Modeling of an Energy Harvesting System Based on Thermoelectric Generators
NASA Astrophysics Data System (ADS)
Nesarajah, Marco; Frey, Georg
This paper deals with the modeling of an energy harvesting system based on thermoelectric generators (TEG) and the validation of the model by means of a test bench. TEGs are capable of improving the overall energy efficiency of energy systems, e.g. combustion engines or heating systems, by using the remaining waste heat to generate electrical power. Previously, a component-oriented model of the TEG itself was developed in the Modelica® language. With this model any TEG can be described and simulated given the material properties and the physical dimensions. This model has now been extended with the surrounding components to form a complete model of a thermoelectric energy harvesting system. In addition to the TEG, the model contains the cooling system, the heat source, and the power electronics. To validate the simulation model, a test bench was built and installed on an oil-fired household heating system. The paper reports results of the measurements and discusses the validity of the developed simulation models. Furthermore, the efficiency of the proposed energy harvesting system is derived and possible improvements based on design variations tested in the simulation model are proposed.
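Not the Modelica® component model itself, but a minimal lumped sketch of what such a TEG component computes: the Seebeck open-circuit voltage and the electrical power delivered to a load, which peaks at the matched load. Parameter values are hypothetical.

```python
# Minimal lumped TEG sketch: open-circuit voltage from the Seebeck effect and output power
# into a load for a module with N thermocouples. Values are hypothetical.
alpha = 200e-6      # V/K, effective Seebeck coefficient per couple
N = 127             # number of thermocouples
R_int = 3.0         # ohm, internal electrical resistance of the module
T_hot, T_cold = 420.0, 320.0   # K, hot/cold side temperatures

dT = T_hot - T_cold
V_oc = alpha * N * dT                    # open-circuit voltage
for R_load in (1.0, R_int, 10.0):        # power is maximal at the matched load R_load = R_int
    P = (V_oc / (R_int + R_load)) ** 2 * R_load
    print(f"R_load = {R_load:4.1f} ohm -> P = {P:.3f} W")
```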
Physical models and primary design of reactor based slow positron source at CMRR
NASA Astrophysics Data System (ADS)
Wang, Guanbo; Li, Rundong; Qian, Dazhi; Yang, Xin
2018-07-01
Slow positron facilities are widely used in material science. A high intensity slow positron source is now at the design stage based on the China Mianyang Research Reactor (CMRR). This paper describes the physical models and our primary design. We use different computer programs or mathematical formulas to simulate the different physical processes, and validate them with appropriate experiments. Considering feasibility, we propose a primary design containing a cadmium shield, a honeycomb-arranged assembly of W tubes, electrical lenses, and a solenoid. It is planned to be vertically inserted in the Si-doping channel, and the beam intensity is expected to be 5 × 10⁹
Ullrich-French, Sarah; González Hernández, Juan; Hidalgo Montesinos, María D
2017-02-01
Mindfulness is an increasingly popular construct with promise in enhancing multiple positive health outcomes. Physical activity is an important behavior for enhancing overall health, but no Spanish language scale exists to test how mindfulness during physical activity may facilitate physical activity motivation or behavior. This study examined the validity of a Spanish adaptation of a new scale, the State Mindfulness Scale for Physical Activity, to assess mindfulness during a specific experience of physical activity. Spanish youths (N = 502) completed a cross-sectional survey of state mindfulness during physical activity and physical activity motivation regulations based on Self-Determination Theory. A high-order model fit the data well and supports the use of one general state mindfulness factor or the use of separate subscales of mindfulness of mental (e.g., thoughts, emotions) and body (physical movement, muscles) aspects of the experience. Internal consistency reliability was good for the general scale and both subscales. The pattern of correlations with motivation regulations provides further support for construct validity, with significant and positive correlations with self-determined forms of motivation and significant and negative correlations with external regulation and amotivation. Initial validity evidence is promising for the use of the adapted measure.
A Comprehensive Validation Methodology for Sparse Experimental Data
NASA Technical Reports Server (NTRS)
Norman, Ryan B.; Blattnig, Steve R.
2010-01-01
A comprehensive program of verification and validation has been undertaken to assess the applicability of models to space radiation shielding applications and to track progress as models are developed over time. The models are placed under configuration control, and automated validation tests are used so that comparisons can readily be made as models are improved. Though direct comparisons between theoretical results and experimental data are desired for validation purposes, such comparisons are not always possible due to lack of data. In this work, two uncertainty metrics are introduced that are suitable for validating theoretical models against sparse experimental databases. The nuclear physics models, NUCFRG2 and QMSFRG, are compared to an experimental database consisting of over 3600 experimental cross sections to demonstrate the applicability of the metrics. A cumulative uncertainty metric is applied to the question of overall model accuracy, while a metric based on the median uncertainty is used to analyze the models from the perspective of model development by analyzing subsets of the model parameter space.
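The exact metric definitions are given in the report; as a plausible reading (an assumption, not a quotation), a database-wide cumulative relative uncertainty and a median relative uncertainty could be computed as follows, with made-up cross sections.

```python
# Sketch (assumed forms, not necessarily the report's exact definitions): a cumulative
# relative uncertainty over the whole database and a median relative uncertainty,
# comparing model cross sections to experimental ones. Values are made up.
import numpy as np

sigma_exp = np.array([120.0, 85.0, 40.0, 260.0, 15.0])   # mb, experimental cross sections
sigma_mod = np.array([132.0, 80.0, 52.0, 241.0, 19.0])   # mb, model predictions

rel = np.abs(sigma_mod - sigma_exp) / sigma_exp
cumulative = np.abs(sigma_mod - sigma_exp).sum() / sigma_exp.sum()
print("cumulative uncertainty: %.1f %%" % (100 * cumulative))
print("median uncertainty:     %.1f %%" % (100 * np.median(rel)))
```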
NASA Astrophysics Data System (ADS)
Ubben, Malte; Heusler, Stefan
2018-07-01
Vibration modes in spherical geometry can be classified based on the number and position of nodal planes. However, the geometry of these planes is non-trivial and cannot be easily displayed in two dimensions. We present 3D-printed models of those vibration modes, enabling a haptic approach for understanding essential features of bound states in quantum physics and beyond. In particular, when applied to atomic physics, atomic orbitals are obtained in a natural manner. Applied to nuclear physics, the same patterns of vibration modes emerge as a cornerstone of the nuclear shell model. These applications of the very same model over a range of more than 5 orders of magnitude in length scale lead to a general discussion of the applicability and limits of validity of physical models in general.
Parra-Robles, J; Ajraoui, S; Deppe, M H; Parnell, S R; Wild, J M
2010-06-01
Models of lung acinar geometry have been proposed to analytically describe the diffusion of ³He in the lung (as measured with pulsed gradient spin echo (PGSE) methods) as a possible means of characterizing lung microstructure from measurement of the ³He ADC. In this work, major limitations in these analytical models are highlighted in simple diffusion weighted experiments with ³He in cylindrical models of known geometry. The findings are substantiated with numerical simulations based on the same geometry using a finite difference representation of the Bloch-Torrey equation. The validity of the existing "cylinder model" is discussed in terms of the physical diffusion regimes experienced, and the basic reliance of the cylinder model and other ADC-based approaches on a Gaussian diffusion behaviour is highlighted. The results presented here demonstrate that the physical assumptions of the cylinder model are not valid for large diffusion gradient strengths (above approximately 15 mT/m), which are commonly used for ³He ADC measurements in human lungs. Copyright © 2010 Elsevier Inc. All rights reserved.
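The Gaussian-diffusion assumption the cylinder model relies on is the mono-exponential Stejskal-Tanner relation, ln S = ln S0 − b·ADC; the sketch below fits an ADC to synthetic signals, and it is precisely the departure from this straight line at high gradient strength that the abstract flags.

```python
# Sketch of the Gaussian (mono-exponential) assumption underlying ADC mapping:
# fit ln S = ln S0 - b * ADC to diffusion-weighted signals. Signals are synthetic.
import numpy as np

b_values = np.array([0.0, 1.6, 3.2, 4.8, 6.4])      # s/cm^2 (typical 3He scale, illustrative)
true_adc = 0.20                                      # cm^2/s
signals = np.exp(-b_values * true_adc) * (1 + 0.01 * np.random.default_rng(4).normal(size=b_values.size))

slope, intercept = np.polyfit(b_values, np.log(signals), 1)
print("fitted ADC = %.3f cm^2/s (S0 = %.3f)" % (-slope, np.exp(intercept)))
```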
Impact of Learning Model Based on Cognitive Conflict toward Student’s Conceptual Understanding
NASA Astrophysics Data System (ADS)
Mufit, F.; Festiyed, F.; Fauzan, A.; Lufri, L.
2018-04-01
A problem that often occurs in the learning of physics is misconception and low understanding of concepts. Misconceptions happen not only to school students, but also to college students and teachers. Existing learning models have not had much impact on improving conceptual understanding or on remediating student misconceptions. This study aims to examine the impact of a cognitive-conflict-based learning model in improving conceptual understanding and remediating student misconceptions. The research method used is Design/Development Research. The product developed is a cognitive-conflict-based learning model along with its components. This article reports the product design results, validity tests, and practicality tests. The study resulted in the design of a cognitive-conflict-based learning model with 4 learning syntaxes, namely (1) preconception activation, (2) presentation of cognitive conflict, (3) discovery of concepts & equations, and (4) reflection. The results of validity tests by experts on the aspects of content, didactics, and appearance or language indicate very valid criteria. The product trial results also show that the product is very practical to use. Based on pretest and posttest results, the cognitive-conflict-based learning model has a good impact on improving conceptual understanding and remediating misconceptions, especially in high-ability students.
S. Wang; Z. Zhang; G. Sun; P. Strauss; J. Guo; Y. Tang; A. Yao
2012-01-01
Model calibration is essential for hydrologic modeling of large watersheds in a heterogeneous mountain environment. Little guidance is available for model calibration protocols for distributed models that aim at capturing the spatial variability of hydrologic processes. This study used the physically-based distributed hydrologic model, MIKE SHE, to contrast a lumped...
Ionic polymer-metal composite torsional sensor: physics-based modeling and experimental validation
NASA Astrophysics Data System (ADS)
Aidi Sharif, Montassar; Lei, Hong; Khalid Al-Rubaiai, Mohammed; Tan, Xiaobo
2018-07-01
Ionic polymer-metal composites (IPMCs) have intrinsic sensing and actuation properties. Typical IPMC sensors are in the shape of beams and only respond to stimuli acting along beam-bending directions. Rod or tube-shaped IPMCs have been explored as omnidirectional bending actuators or sensors. In this paper, physics-based modeling is studied for a tubular IPMC sensor under pure torsional stimulus. The Poisson–Nernst–Planck model is used to describe the fundamental physics within the IPMC, where it is hypothesized that the anion concentration is coupled to the sum of shear strains induced by the torsional stimulus. Finite element simulation is conducted to solve for the torsional sensing response, where some of the key parameters are identified based on experimental measurements using an artificial neural network. Additional experimental results suggest that the proposed model is able to capture the torsional sensing dynamics for different amplitudes and rates of the torsional stimulus.
NASA Astrophysics Data System (ADS)
Yao, Bing; Yang, Hui
2016-12-01
This paper presents a novel physics-driven spatiotemporal regularization (STRE) method for high-dimensional predictive modeling in complex healthcare systems. This model not only captures the physics-based interrelationship between time-varying explanatory and response variables that are distributed in the space, but also addresses the spatial and temporal regularizations to improve the prediction performance. The STRE model is implemented to predict the time-varying distribution of electric potentials on the heart surface based on the electrocardiogram (ECG) data from the distributed sensor network placed on the body surface. The model performance is evaluated and validated in both a simulated two-sphere geometry and a realistic torso-heart geometry. Experimental results show that the STRE model significantly outperforms other regularization models that are widely used in current practice such as Tikhonov zero-order, Tikhonov first-order and L1 first-order regularization methods.
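For context, the zero-order Tikhonov baseline mentioned above (not the STRE model itself) has the closed-form solution x = (AᵀA + λI)⁻¹Aᵀb for min ‖Ax − b‖² + λ‖x‖²; a small synthetic ill-posed example:

```python
# Sketch of the zero-order Tikhonov baseline: solve min ||A x - b||^2 + lam ||x||^2 via its
# closed form. A stands in for the body-surface-to-heart transfer matrix; data are synthetic.
import numpy as np

rng = np.random.default_rng(5)
m, n = 40, 60                         # fewer measurements than heart-surface nodes (ill-posed)
A = rng.normal(size=(m, n))
x_true = np.sin(np.linspace(0, 3 * np.pi, n))
b = A @ x_true + 0.05 * rng.normal(size=m)

lam = 1.0
x_hat = np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ b)
print("relative reconstruction error: %.2f" % (np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true)))
```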
DOE Office of Scientific and Technical Information (OSTI.GOV)
Larmat, Carene; Rougier, Esteban; Lei, Zhou
This project is in support of the Source Physics Experiment (SPE; Snelson et al. 2013), which aims to develop new seismic source models of explosions. One priority of this program is first-principles numerical modeling to validate and extend current empirical models.
Finite Element Model Development For Aircraft Fuselage Structures
NASA Technical Reports Server (NTRS)
Buehrle, Ralph D.; Fleming, Gary A.; Pappa, Richard S.; Grosveld, Ferdinand W.
2000-01-01
The ability to extend the valid frequency range for finite element based structural dynamic predictions using detailed models of the structural components and attachment interfaces is examined for several stiffened aircraft fuselage structures. This extended dynamic prediction capability is needed for the integration of mid-frequency noise control technology. Beam, plate and solid element models of the stiffener components are evaluated. Attachment models between the stiffener and panel skin range from a line along the rivets of the physical structure to a constraint over the entire contact surface. The finite element models are validated using experimental modal analysis results.
Empirical Measurement and Model Validation of Infrared Spectra of Contaminated Surfaces
NASA Astrophysics Data System (ADS)
Archer, Sean
The goal of this thesis was to validate predicted infrared spectra of liquid-contaminated surfaces from a micro-scale bi-directional reflectance distribution function (BRDF) model through the use of empirical measurement. Liquid-contaminated surfaces generally require more sophisticated radiometric modeling to numerically describe surface properties. The Digital Image and Remote Sensing Image Generation (DIRSIG) model utilizes radiative transfer modeling to generate synthetic imagery for a variety of applications. Aside from DIRSIG, a micro-scale model known as microDIRSIG has been developed as a rigorous ray tracing physics-based model that can predict the BRDF of geometric surfaces defined as micron- to millimeter-resolution facets. The model offers an extension from conventional BRDF models by allowing contaminants to be added as geometric objects to a micro-facet surface. This model was validated through the use of Fourier transform infrared spectrometer measurements. A total of 18 different substrate and contaminant combinations were measured and compared against modeled outputs. The substrates used in this experiment were wood and aluminum, each with three different paint finishes: no paint, Krylon ultra-flat black, and Krylon glossy black. A silicone-based oil (SF96) was measured out and applied to each surface to create three different contamination cases for each surface. Radiance in the longwave infrared region of the electromagnetic spectrum was measured by a Design and Prototypes (D&P) Fourier transform infrared spectrometer and a Physical Sciences Inc. Adaptive Infrared Imaging Spectroradiometer (AIRIS). The model outputs were compared against the measurements quantitatively in both the emissivity and radiance domains. A temperature emissivity separation (TES) algorithm had to be applied to the measured radiance spectra for comparison with the microDIRSIG-predicted emissivity spectra. The model-predicted emissivity spectra were also forward modeled through a DIRSIG simulation for comparison with the radiance measurements. The results showed promising agreement for homogeneous surfaces with liquid contamination that could be well characterized geometrically. Limitations arose in substrates that were modeled as homogeneous surfaces but had spatially varying artifacts due to uncertainties in contaminant and surface interactions. There is strong demand for accurate physics-based modeling of liquid-contaminated surfaces, and this validation framework may be extended to include a wider array of samples representing the more realistic natural surfaces found in real-world scenarios.
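The radiometric relation behind the emissivity/radiance comparison is L_meas = ε·B(λ,T) + (1 − ε)·L_down, which can be inverted for ε when the surface temperature and downwelling radiance are taken as known; the sketch below uses illustrative inputs, not the thesis measurements.

```python
# Minimal sketch of the radiance/emissivity relation: L_meas = eps*B(lambda,T) + (1-eps)*L_down,
# inverted for eps assuming the surface temperature and downwelling radiance are known.
import numpy as np

h, c, kB = 6.626e-34, 2.998e8, 1.381e-23

def planck(wl_m, T):
    """Spectral radiance B(lambda, T) in W / (m^2 sr m)."""
    return 2 * h * c**2 / wl_m**5 / np.expm1(h * c / (wl_m * kB * T))

wl = 10e-6          # 10 micrometers, in the longwave infrared window
T_surf = 300.0      # K, assumed known surface temperature
B = planck(wl, T_surf)
L_down = 0.2 * B                     # illustrative downwelling radiance
L_meas = 0.95 * B + 0.05 * L_down    # radiance a sensor would see for eps = 0.95

eps = (L_meas - L_down) / (B - L_down)
print("recovered emissivity: %.3f" % eps)
```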
Validation of the thermal challenge problem using Bayesian Belief Networks.
DOE Office of Scientific and Technical Information (OSTI.GOV)
McFarland, John; Swiler, Laura Painton
The thermal challenge problem has been developed at Sandia National Laboratories as a testbed for demonstrating various types of validation approaches and prediction methods. This report discusses one particular methodology to assess the validity of a computational model given experimental data. This methodology is based on Bayesian Belief Networks (BBNs) and can incorporate uncertainty in experimental measurements, in physical quantities, and in the model itself. The approach uses the prior and posterior distributions of model output to compute a validation metric based on Bayesian hypothesis testing (a Bayes' factor). This report discusses various aspects of the BBN, specifically in the context of the thermal challenge problem. A BBN is developed for a given set of experimental data in a particular experimental configuration. The development of the BBN and the method for "solving" the BBN to develop the posterior distribution of model output through Markov chain Monte Carlo sampling are discussed in detail. The use of the BBN to compute a Bayes' factor is demonstrated.
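As an illustration of the kind of validation metric described above, the short sketch below computes a Bayes-factor-style ratio from prior and posterior samples of a scalar model output; the sample distributions, the observed value, and the kernel-density construction are illustrative assumptions, not the report's actual BBN implementation.

```python
# Hypothetical sketch: Bayes-factor-style validation metric from prior and
# posterior samples of a scalar model output (names and data are illustrative).
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(0)

# Assumed samples of the model output (e.g., a predicted temperature) drawn
# under the prior and the posterior parameter distributions.
prior_output = rng.normal(loc=80.0, scale=10.0, size=5000)
posterior_output = rng.normal(loc=75.0, scale=4.0, size=5000)

observed = 74.0  # experimentally measured value (assumed)

# Kernel density estimates of the two predictive distributions.
prior_pdf = gaussian_kde(prior_output)
posterior_pdf = gaussian_kde(posterior_output)

# Bayes-factor-style metric: how much more probable the observation is under
# the posterior predictive than under the prior predictive.
bayes_factor = posterior_pdf(observed)[0] / prior_pdf(observed)[0]
print(f"Validation metric (Bayes factor): {bayes_factor:.2f}")
```

A ratio well above one indicates that the observation is far more probable under the calibrated (posterior) model than under the prior, which is the sense in which the Bayes' factor serves as a validation metric.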
Continued Development and Validation of the USU GAIM Models
2010-08-01
The Gauss-Markov data assimilation model (GAIM-GM) uses a physics-based model of the ionosphere, the Ionosphere Forecast Model (IFM; Schunk, 1988; Sojka, 1989; Schunk et al., 1997), and a Kalman filter as a basis for assimilating a diverse set of observations.
Skill assessment of the coupled physical-biogeochemical operational Mediterranean Forecasting System
NASA Astrophysics Data System (ADS)
Cossarini, Gianpiero; Clementi, Emanuela; Salon, Stefano; Grandi, Alessandro; Bolzon, Giorgio; Solidoro, Cosimo
2016-04-01
The Mediterranean Monitoring and Forecasting Centre (Med-MFC) is one of the regional production centres of the European Marine Environment Monitoring Service (CMEMS-Copernicus). Med-MFC operatively manages a suite of numerical model systems (3DVAR-NEMO-WW3 and 3DVAR-OGSTM-BFM) that provides gridded datasets of physical and biogeochemical variables for the Mediterranean marine environment with a horizontal resolution of about 6.5 km. At the present stage, the operational Med-MFC produces ten-day forecasts: daily for physical parameters and bi-weekly for biogeochemical variables. The validation of the coupled model system and the estimate of the accuracy of model products are key issues to ensure reliable information for users and downstream services. Product quality activities at Med-MFC consist of two levels of validation and skill analysis procedures. Pre-operational qualification activities focus on testing the improvement in quality of a new release of the model system and rely on past simulations and historical data. Near-real-time (NRT) validation activities then aim at routine, on-line skill assessment of the model forecast and rely on the NRT available observations. The Med-MFC validation framework uses both independent data (i.e., Bio-Argo float data; in-situ mooring and vessel data of oxygen, nutrients and chlorophyll; moored buoys, tide-gauges and ADCP measurements of temperature, salinity, sea level and velocity) and semi-independent data (i.e., data already used for assimilation, such as satellite chlorophyll, satellite SLA and SST, and in-situ vertical profiles of temperature and salinity from XBT, Argo and gliders). We give evidence that different variables (e.g., CMEMS products) can be validated at different levels (i.e., at the forecast level or at the level of model consistency) and at different spatial and temporal scales. The fundamental physical parameters temperature, salinity and sea level are routinely validated on a daily, weekly and quarterly basis at regional and sub-regional scale and along specific vertical layers (temperature and salinity), while velocity fields are validated daily against in-situ coastal moorings. Since the velocity skill cannot be accurately assessed through coastal measurements given the current model horizontal resolution (~6.5 km), new validation metrics and procedures are under investigation. Chlorophyll is the only biogeochemical variable that can be validated routinely at the temporal and spatial scale of the weekly forecast, while nutrient and oxygen predictions can be validated locally or at sub-basin and seasonal scales. For the other biogeochemical variables (i.e., primary production, carbonate system variables) only the accuracy of the average dynamics and model consistency can be evaluated. Finally, we discuss the limiting factors of the present validation framework, and the quality and extension of the observing system that would be needed to improve the reliability of the physical and biogeochemical Mediterranean forecast services.
Accelerated Aging in Electrolytic Capacitors for Prognostics
NASA Technical Reports Server (NTRS)
Celaya, Jose R.; Kulkarni, Chetan; Saha, Sankalita; Biswas, Gautam; Goebel, Kai Frank
2012-01-01
The focus of this work is the analysis of different degradation phenomena based on thermal overstress and electrical overstress accelerated aging systems and the use of accelerated aging techniques for prognostics algorithm development. Results on thermal overstress and electrical overstress experiments are presented. In addition, preliminary results toward the development of physics-based degradation models are presented, focusing on the electrolyte evaporation failure mechanism. An empirical degradation model based on percentage capacitance loss under electrical overstress is presented and used in: (i) a Bayesian-based implementation of model-based prognostics using a discrete Kalman filter for health state estimation, and (ii) a dynamic system representation of the degradation model for forecasting and remaining useful life (RUL) estimation. A leave-one-out validation methodology is used to assess the validity of the methodology under the small-sample-size constraint. The results observed on the RUL estimation are consistent across the validation tests when comparing relative accuracy and prediction error. It has been observed that the inaccuracy of the model in representing the change in degradation behavior observed at the end of the test data is consistent throughout the validation tests, indicating the need for a more detailed degradation model or for an algorithm that could estimate model parameters on-line. Based on the observed degradation process under different stress intensities with rest periods, the need for more sophisticated degradation models is further supported. The current degradation model does not represent the capacitance recovery over rest periods following an accelerated aging stress period.
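To make the prognostics workflow concrete, here is a minimal sketch, with entirely synthetic aging data and assumed noise levels and failure threshold, of a discrete Kalman filter tracking percentage capacitance loss and its degradation rate, followed by extrapolation of the filtered rate to a remaining-useful-life (RUL) estimate; it is a schematic stand-in, not the authors' implementation.

```python
# Illustrative sketch (not the authors' implementation): discrete Kalman filter
# with a two-state (loss, rate) constant-rate degradation model, plus RUL
# estimated by extrapolating the filtered rate to an assumed failure threshold.
import numpy as np

rng = np.random.default_rng(1)

dt = 1.0          # aging-time step (e.g., hours), assumed
true_rate = 0.05  # true % capacitance loss per step, assumed
threshold = 20.0  # % loss defining end of life, assumed

# Synthetic noisy measurements of capacitance loss.
t = np.arange(0, 200, dt)
meas = true_rate * t + rng.normal(0.0, 0.5, size=t.size)

# State: [loss, rate]; constant-rate degradation model.
F = np.array([[1.0, dt], [0.0, 1.0]])
H = np.array([[1.0, 0.0]])
Q = np.diag([1e-4, 1e-6])   # process noise covariance (assumed)
R = np.array([[0.25]])      # measurement noise variance (assumed)

x = np.array([0.0, 0.0])
P = np.eye(2)

for z in meas:
    # Predict
    x = F @ x
    P = F @ P @ F.T + Q
    # Update
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (np.array([z]) - H @ x)
    P = (np.eye(2) - K @ H) @ P

loss_hat, rate_hat = x
rul = max((threshold - loss_hat) / rate_hat, 0.0) if rate_hat > 0 else np.inf
print(f"Estimated loss {loss_hat:.2f}%, rate {rate_hat:.3f}%/step, RUL ~ {rul:.0f} steps")
```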
A Model of Physical Performance for Occupational Tasks.
ERIC Educational Resources Information Center
Hogan, Joyce
This report acknowledges the problems faced by industrial/organizational psychologists who must make personnel decisions involving physically demanding jobs. The scarcity of criterion-related validation studies and the difficulty of generalizing validity are considered, and a model of physical performance that builds on Fleishman's (1984)…
Soulis, Konstantinos X; Valiantzas, John D; Ntoulas, Nikolaos; Kargas, George; Nektarios, Panayiotis A
2017-09-15
In spite of the well-known green roof benefits, their widespread adoption in the management practices of urban drainage systems requires the use of adequate analytical and modeling tools. In the current study, green roof runoff modeling was accomplished by developing, testing, and jointly using a simple conceptual model and a physically based numerical simulation model utilizing the HYDRUS-1D software. Such an approach combines the advantages of the conceptual model, namely simplicity, low computational requirements, and the ability to be easily integrated into decision support tools, with the capacity of the physically based simulation model to be easily transferred to conditions and locations other than those used for calibrating and validating it. The proposed approach was evaluated with an experimental dataset that included various green roof covers (either succulent plants - Sedum sediforme, or xerophytic plants - Origanum onites, or bare substrate without any vegetation) and two substrate depths (either 8 cm or 16 cm). Both the physically based and the conceptual models matched the observed hydrographs very closely. In general, the conceptual model performed better than the physically based simulation model, but the overall performance of both models was sufficient in most cases, as revealed by the Nash-Sutcliffe Efficiency index, which was generally greater than 0.70. Finally, it was showcased how a physically based model and a simple conceptual model can be used jointly to apply the simple conceptual model to a wider set of conditions than the available experimental data and to support green roof design. Copyright © 2017 Elsevier Ltd. All rights reserved.
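Since model adequacy above is judged by the Nash-Sutcliffe Efficiency (NSE), a minimal sketch of the index is given below; the observed and simulated hydrograph values are synthetic placeholders.

```python
# Minimal sketch of the Nash-Sutcliffe Efficiency (NSE) used to judge the fit
# between simulated and observed green-roof runoff; the hydrograph values are
# synthetic placeholders, not the study's data.
import numpy as np

def nash_sutcliffe(observed: np.ndarray, simulated: np.ndarray) -> float:
    """NSE = 1 - sum((obs - sim)^2) / sum((obs - mean(obs))^2)."""
    observed = np.asarray(observed, dtype=float)
    simulated = np.asarray(simulated, dtype=float)
    return 1.0 - np.sum((observed - simulated) ** 2) / np.sum(
        (observed - observed.mean()) ** 2
    )

obs = np.array([0.0, 0.2, 1.1, 2.4, 1.6, 0.7, 0.3, 0.1])   # runoff depth per step, assumed
sim = np.array([0.0, 0.3, 1.0, 2.2, 1.8, 0.8, 0.2, 0.1])

print(f"NSE = {nash_sutcliffe(obs, sim):.2f}")
```

An NSE of 1 indicates a perfect match, while values above roughly 0.70, the level cited above, are commonly taken as satisfactory.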
NASA Astrophysics Data System (ADS)
Scherliess, L.; Schunk, R. W.; Sojka, J. J.; Thompson, D. C.; Zhu, L.
2006-11-01
The Utah State University Gauss-Markov Kalman Filter (GMKF) was developed as part of the Global Assimilation of Ionospheric Measurements (GAIM) program. The GMKF uses a physics-based model of the ionosphere and a Gauss-Markov Kalman filter as a basis for assimilating a diverse set of real-time (or near real-time) observations. The physics-based model is the Ionospheric Forecast Model (IFM), which accounts for five ion species and covers the E region, F region, and the topside from 90 to 1400 km altitude. Within the GMKF, the IFM-derived ionospheric densities constitute a background density field on which perturbations are superimposed based on the available data and their errors. In the current configuration, the GMKF assimilates slant total electron content (TEC) from a variable number of global positioning satellite (GPS) ground sites, bottomside electron density (Ne) profiles from a variable number of ionosondes, in situ Ne from four Defense Meteorological Satellite Program (DMSP) satellites, and nighttime line-of-sight ultraviolet (UV) radiances measured by satellites. To test the GMKF for real-time operations and to validate its ionospheric density specifications, we have tested the model performance for a variety of geophysical conditions. During these model runs, various combinations of data types and data quantities were assimilated. To simulate real-time operations, the model ran continuously and automatically and produced three-dimensional global electron density distributions in 15 min increments. In this paper we describe the Gauss-Markov Kalman filter model and present results of our validation study, with an emphasis on comparisons with independent observations.
Wang, Hongyuan; Zhang, Wei; Dong, Aotuo
2012-11-10
A method for modeling and validating the photometric characteristics of space targets was presented in order to track and identify different satellites effectively. The background radiation characteristics models of the target were built based on blackbody radiation theory. The geometry of the target was described by surface equations in its body coordinate system. The material characteristics of the target surface were described by a bidirectional reflectance distribution function model, which accounts for the Gaussian statistics of the surface and microscale self-shadowing and was obtained by measurement and modeling in advance. The surfaces of the target contributing to the observation system were determined by coordinate transformation according to the relative positions of the space-based target, the background radiation sources, and the observation platform. A mathematical model of the photometric characteristics of the space target was then built by summing the reflection components of all contributing surfaces. Photometric characteristics of the space-based target were simulated from its given geometrical dimensions, physical parameters, and orbital parameters. Experimental validation was performed on a scale model of the satellite. The calculated results fit well with the measured results, which indicates that the modeling method for the photometric characteristics of the space target is correct.
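The facet-summation idea can be sketched as follows; the facet geometry, the simple Lambertian-plus-specular reflectance, and all numerical values are illustrative assumptions standing in for the measured BRDF model described above.

```python
# Hedged sketch of the facet-summation idea: reflected intensity toward an
# observer is accumulated over the facets that face both the illumination
# source and the observer. The placeholder BRDF and geometry are illustrative,
# not the measured BRDF model of the paper.
import numpy as np

rng = np.random.default_rng(2)

def normalize(v):
    return v / np.linalg.norm(v, axis=-1, keepdims=True)

# Assumed facet normals and areas of a coarse target mesh.
normals = normalize(rng.normal(size=(500, 3)))
areas = np.full(500, 1e-2)  # m^2, assumed

sun_dir = normalize(np.array([1.0, 0.2, 0.3]))  # toward the illumination source
obs_dir = normalize(np.array([0.0, 0.5, 0.9]))  # toward the observer

cos_i = normals @ sun_dir             # illumination cosine per facet
cos_r = normals @ obs_dir             # observation cosine per facet
visible = (cos_i > 0) & (cos_r > 0)   # contributing facets only

# Placeholder BRDF: diffuse term plus a crude specular lobe.
half = normalize(sun_dir + obs_dir)
spec = np.clip(normals @ half, 0.0, None) ** 50
brdf = 0.3 / np.pi + 0.5 * spec

intensity = np.sum(areas[visible] * brdf[visible] * cos_i[visible] * cos_r[visible])
print(f"Relative reflected intensity toward observer: {intensity:.4f}")
```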
Psychometric Properties of the “Sport Motivation Scale (SMS)” Adapted to Physical Education
Granero-Gallegos, Antonio; Baena-Extremera, Antonio; Gómez-López, Manuel; Sánchez-Fuentes, José Antonio; Abraldes, J. Arturo
2014-01-01
The aim of this study was to investigate the factor structure of a Spanish version of the Sport Motivation Scale adapted to physical education. A second aim was to test which one of three hypothesized models (three, five and seven-factor) provided the best model fit. 758 Spanish high school students completed the Sport Motivation Scale adapted for Physical Education and also completed the Learning and Performance Orientation in Physical Education Classes Questionnaire. We examined the factor structure of each model using confirmatory factor analysis and also assessed internal consistency and convergent validity. The results showed that all three models in Spanish produce good fit indices, but we suggest using the seven-factor model (χ2/gl = 2.73; ECVI = 1.38) as it produces better values when adapted to physical education than the five-factor model (χ2/gl = 2.82; ECVI = 1.44) and the three-factor model (χ2/gl = 3.02; ECVI = 1.53). Key Points: Physical education research conducted in Spain has used the version of the SMS designed to assess motivation in sport, but reliability and validity results in physical education have not been reported. Results of the present study lend support to the factorial validity and internal reliability of three alternative factor structures (3, 5, and 7 factors) of the SMS adapted to Physical Education in Spanish. Although all three models in Spanish produce good fit indices, we suggest using the seven-factor model. PMID:25435772
WEPP model implementation project with the USDA-Natural Resources Conservation Service
USDA-ARS?s Scientific Manuscript database
The Water Erosion Prediction Project (WEPP) is a physical process-based soil erosion model that can be used to estimate runoff, soil loss, and sediment yield from hillslope profiles, fields, and small watersheds. Initially developed from 1985-1995, WEPP has been applied and validated across a wide r...
Mechanical relaxation in a Zr-based bulk metallic glass: Analysis based on physical models
NASA Astrophysics Data System (ADS)
Qiao, J. C.; Pelletier, J. M.
2012-08-01
The mechanical relaxation behavior in a Zr55Cu30Ni5Al10 bulk metallic glass is investigated by dynamic mechanical analysis in both the temperature and frequency domains. Master curves can be obtained for the storage modulus G' and for the loss modulus G'', confirming the validity of the time-temperature superposition principle. Different models are discussed to describe the main (α) relaxation, e.g., the Debye model, the Havriliak-Negami (HN) model, the Kohlrausch-Williams-Watts (KWW) model, and the quasi-point defects (QPD) model. The main relaxation in bulk metallic glass cannot be described using a single relaxation time. The HN model, the KWW model, and the QPD theory can be used to fit the data of mechanical spectroscopy experiments. However, unlike the HN and KWW models, physical parameters such as atomic mobility and a correlation factor are introduced in the QPD model, giving, therefore, a new physical approach to understanding the mechanical relaxation in bulk metallic glasses.
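As an illustration of one of the fitting functions named above, the sketch below evaluates the loss modulus G'' implied by a Havriliak-Negami form; the parameter values are placeholders, not fits to the Zr55Cu30Ni5Al10 data.

```python
# Illustrative sketch of a Havriliak-Negami (HN) form for the loss modulus G''
# of the main (alpha) relaxation; parameter values are placeholders, not fits.
import numpy as np

def hn_loss_modulus(omega, G_unrelaxed, G_relaxed, tau, alpha, beta):
    """Loss part of G*(w) = G_u - (G_u - G_r) / (1 + (i w tau)^alpha)^beta."""
    hn = (G_unrelaxed - G_relaxed) / (1.0 + (1j * omega * tau) ** alpha) ** beta
    return -hn.imag  # G'' is positive for 0 < alpha, beta <= 1

omega = np.logspace(-3, 3, 7)  # angular frequency, rad/s
G2 = hn_loss_modulus(omega, G_unrelaxed=30e9, G_relaxed=1e9,
                     tau=1.0, alpha=0.8, beta=0.5)
for w, g in zip(omega, G2):
    print(f"omega = {w:8.3f} rad/s  ->  G'' = {g / 1e9:6.3f} GPa")
```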
Lattice hydrodynamic model based traffic control: A transportation cyber-physical system approach
NASA Astrophysics Data System (ADS)
Liu, Hui; Sun, Dihua; Liu, Weining
2016-11-01
The lattice hydrodynamic model is a typical continuum traffic flow model, which properly describes the jamming transition of traffic flow. Previous studies of the lattice hydrodynamic model have shown that the use of control methods has the potential to improve traffic conditions. In this paper, a new control method is applied to the lattice hydrodynamic model from a transportation cyber-physical system approach, in which only one lattice site needs to be controlled. The simulation verifies the feasibility and validity of this method, which can ensure the efficient and smooth operation of the traffic flow.
RELAP-7 Software Verification and Validation Plan
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, Curtis L.; Choi, Yong-Joon; Zou, Ling
This INL plan comprehensively describes the software for RELAP-7 and documents the software, interface, and software design requirements for the application. The plan also describes the testing-based software verification and validation (SV&V) process—a set of specially designed software models used to test RELAP-7. The RELAP-7 (Reactor Excursion and Leak Analysis Program) code is a nuclear reactor system safety analysis code being developed at Idaho National Laboratory (INL). The code is based on the INL’s modern scientific software development framework – MOOSE (Multi-Physics Object-Oriented Simulation Environment). The overall design goal of RELAP-7 is to take advantage of the previous thirty years of advancements in computer architecture, software design, numerical integration methods, and physical models. The end result will be a reactor systems analysis capability that retains and improves upon RELAP5’s capability and extends the analysis capability for all reactor system simulation scenarios.
Engineering uses of physics-based ground motion simulations
Baker, Jack W.; Luco, Nicolas; Abrahamson, Norman A.; Graves, Robert W.; Maechling, Phillip J.; Olsen, Kim B.
2014-01-01
This paper summarizes validation methodologies focused on enabling ground motion simulations to be used with confidence in engineering applications such as seismic hazard analysis and dynamic analysis of structural and geotechnical systems. Numerical simulation of ground motion from large earthquakes, utilizing physics-based models of earthquake rupture and wave propagation, is an area of active research in the earth science community. Refinement and validation of these models require collaboration between earthquake scientists and engineering users, and testing/rating methodologies for simulated ground motions to be used with confidence in engineering applications. This paper provides an introduction to this field and an overview of current research activities being coordinated by the Southern California Earthquake Center (SCEC). These activities are related both to advancing the science and computational infrastructure needed to produce ground motion simulations, as well as to engineering validation procedures. Current research areas and anticipated future achievements are also discussed.
NASA Astrophysics Data System (ADS)
Dalguer, Luis A.; Fukushima, Yoshimitsu; Irikura, Kojiro; Wu, Changjiang
2017-09-01
Inspired by the first workshop on Best Practices in Physics-Based Fault Rupture Models for Seismic Hazard Assessment of Nuclear Installations (BestPSHANI), conducted by the International Atomic Energy Agency (IAEA) on 18-20 November 2015 in Vienna (http://www-pub.iaea.org/iaeameetings/50896/BestPSHANI), this PAGEOPH topical volume collects several extended articles from this workshop as well as several new contributions. A total of 17 papers have been selected on topics ranging from the seismological aspects of earthquake cycle simulations for source-scaling evaluation, seismic source characterization, source inversion and ground motion modeling (based on finite fault rupture using dynamic, kinematic, stochastic and empirical Green's functions approaches) to the engineering application of simulated ground motion for the analysis of the seismic response of structures. These contributions include applications to real earthquakes and descriptions of current practice for assessing seismic hazard in terms of nuclear safety in low-seismicity areas, as well as proposals for physics-based hazard assessment for critical structures near large earthquakes. Collectively, the papers of this volume highlight the usefulness of physics-based models for evaluating and understanding the physical causes of observed and empirical data, as well as for predicting ground motion beyond the range of recorded data. Particular importance is given to the validation and verification of the models by comparing synthetic results with observed data and empirical models.
NASA Astrophysics Data System (ADS)
Putra, A.; Masril, M.; Yurnetti, Y.
2018-04-01
One of the causes of students' low achievement of competence in high school physics learning is an instructional process that has not developed students' creativity in problem solving. This is shown by teachers' learning plans that are not in accordance with the National Education Standard. This study aims to produce a reconstruction model of physics learning that fulfills the competency standards, content standards, and assessment standards in accordance with the applicable curriculum standards. The development process follows: needs analysis, product design, product development, implementation, and product evaluation. The research process involves two peer reviewers, four expert judges, and two study groups of high school students in Padang. The data obtained are qualitative and quantitative, collected through documentation, observation, questionnaires, and tests. The results of this research, up to the product development stage, yielded a physics learning plan model that meets content validity and construct validity in terms of the fulfillment of Basic Competences, Content Standards, Process Standards, and Assessment Standards.
NASA Astrophysics Data System (ADS)
Lu, Meilian; Yang, Dong; Zhou, Xing
2013-03-01
Based on an analysis of the requirements for conversation history storage in the CPM (Converged IP Messaging) system, a multi-view storage model and access methods for conversation history are proposed. The storage model separates logical views from physical storage and divides the storage into a system-managed region and a user-managed region. It simultaneously supports a conversation view, a system pre-defined view, and a user-defined view of the storage. The rationality and feasibility of the multi-view presentation, the physical storage model, and the access methods are validated through an implemented prototype, which shows that the proposal has good scalability and helps to optimize the physical data storage structure and improve storage performance.
Statistical analysis of target acquisition sensor modeling experiments
NASA Astrophysics Data System (ADS)
Deaver, Dawne M.; Moyer, Steve
2015-05-01
The U.S. Army RDECOM CERDEC NVESD Modeling and Simulation Division is charged with the development and advancement of military target acquisition models to estimate expected soldier performance when using all types of imaging sensors. Two elements of sensor modeling are (1) laboratory-based psychophysical experiments used to measure task performance and calibrate the various models and (2) field-based experiments used to verify the model estimates for specific sensors. In both types of experiments, it is common practice to control or measure environmental, sensor, and target physical parameters in order to minimize the uncertainty of the physics-based modeling. Predicting the minimum number of test subjects required to calibrate or validate the model should be, but is not always, done during test planning. The objective of this analysis is to develop guidelines for test planners which recommend the number and types of test samples required to yield a statistically significant result.
Simplified Physics Based Models Research Topical Report on Task #2
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mishra, Srikanta; Ganesh, Priya
We present a simplified-physics based approach, where only the most important physical processes are modeled, to develop and validate simplified predictive models of CO2 sequestration in deep saline formations. The system of interest is a single vertical well injecting supercritical CO2 into a 2-D layered reservoir-caprock system with variable layer permeabilities. We use a set of well-designed full-physics compositional simulations to understand key processes and parameters affecting pressure propagation and buoyant plume migration. Based on these simulations, we have developed correlations for dimensionless injectivity as a function of the slope of the fractional-flow curve, the variance of the layer permeability values, and the nature of the vertical permeability arrangement. The same variables, along with a modified gravity number, can be used to develop a correlation for the total storage efficiency within the CO2 plume footprint. Similar correlations are also developed to predict the average pressure within the injection reservoir, and the pressure buildup within the caprock.
Further Studies into Synthetic Image Generation using CameoSim
2011-08-01
In preparation for the validation effort, a study of BRDF models has been completed, which includes the physical plausibility of the models and how measured data are used, over the visible to shortwave infrared.
NASA Astrophysics Data System (ADS)
Sivandran, G.; Bisht, G.; Ivanov, V. Y.; Bras, R. L.
2008-12-01
A coupled, dynamic vegetation and hydrologic model, tRIBS+VEGGIE, was applied to the semiarid Walnut Gulch Experimental Watershed in Arizona. The physically-based, distributed nature of the coupled model allows for parameterization and simulation of watershed vegetation-water-energy dynamics on timescales varying from hourly to interannual. The model also allows for explicit spatial representation of processes that vary due to complex topography, such as lateral redistribution of moisture and partitioning of radiation with respect to aspect and slope. Model parameterization and forcing were conducted using readily available databases for topography, soil types, and land use cover, as well as data from the network of meteorological stations located within the Walnut Gulch watershed. In order to test the performance of the model, three sets of simulations were conducted over an 11-year period from 1997 to 2007. Two simulations focus on heavily instrumented nested watersheds within the Walnut Gulch basin: (i) the Kendall watershed, which is dominated by annual grasses; and (ii) the Lucky Hills watershed, which is dominated by a mixture of deciduous and evergreen shrubs. The third set of simulations covers the entire Walnut Gulch watershed. Model validation and performance were evaluated in relation to three broad categories: (i) energy balance components, for which the network of meteorological stations was used to validate the key energy fluxes; (ii) water balance components, for which the network of flumes, rain gauges, and soil moisture stations installed within the watershed was utilized to validate the manner in which the model partitions moisture; and (iii) vegetation dynamics, for which remote sensing products from MODIS were used to validate spatial and temporal vegetation dynamics. Model results demonstrate satisfactory spatial and temporal agreement with observed data, giving confidence that key ecohydrological processes can be adequately represented for future applications of tRIBS+VEGGIE in regional modeling of land-atmosphere interactions.
NASA Astrophysics Data System (ADS)
He, Xiao Dong
This thesis studies light scattering processes off rough surfaces. Analytic models for reflection, transmission, and subsurface scattering of light are developed. The results are applicable to realistic image generation in computer graphics. The investigation focuses on the basic issue of how light is scattered locally by general surfaces which are neither diffuse nor specular; physical optics is employed to account for diffraction and interference, which play a crucial role in the scattering of light for most surfaces. The thesis presents: (1) a new reflectance model; (2) a new transmittance model; (3) a new subsurface scattering model. All of these models are physically based, depend only on physical parameters, apply to a wide range of materials and surface finishes, and, more importantly, provide a smooth transition from diffuse-like to specular reflection as the wavelength and incidence angle are increased or the surface roughness is decreased. The reflectance and transmittance models are based on Kirchhoff theory, and the subsurface scattering model is based on energy transport theory. They are valid only for surfaces with shallow slopes. The thesis shows that predicted reflectance distributions given by the reflectance model compare favorably with experiment. The thesis also investigates and implements fast ways of computing the reflectance and transmittance models. Furthermore, the thesis demonstrates that a high level of realistic image generation can be achieved due to the physically correct treatment of the scattering processes by the reflectance model.
A closed-loop hybrid physiological model relating to subjects under physical stress.
El-Samahy, Emad; Mahfouf, Mahdi; Linkens, Derek A
2006-11-01
The objective of this research study is to derive a comprehensive physiological model relating to subjects under physical stress conditions. The model should describe the behaviour of the cardiovascular system, respiratory system, thermoregulation, and brain activity in response to physical workload. An experimental testing rig was built which consists of a recumbent high-performance bicycle for inducing the physical load and a data acquisition system comprising monitors and PCs. The signals acquired and used within this study are the blood pressure, heart rate, respiration, body temperature, and EEG signals. The proposed model is based on a grey-box modelling approach, which was used because of the sufficient level of detail it provides. Cardiovascular and EEG data relating to 16 healthy subject volunteers (data from 12 subjects were used for training/validation and data from 4 subjects were used for model testing) were collected using the Finapres and the ProComp+ monitors. For model validation, residual analysis via the computation of confidence intervals as well as related histograms was performed. Closed-loop simulations for different subjects showed that the model can provide reliable predictions for heart rate, blood pressure, body temperature, respiration, and the EEG signals. These findings were also reinforced by the residual analyses, which suggested that the residuals were within the 90% confidence bands and that the corresponding histograms followed a normal distribution. A higher intelligence level, based on neural networks, was added to the model to extend its capability to predict over a wide range of subject dynamics. The elicited physiological model describing the effect of physiological stress on several physiological variables can be used to predict performance breakdown of operators in critical environments. Such a model architecture lends itself naturally to exploitation via feedback control in a 'reverse-engineering' fashion to control stress via the specification of a safe operating range for the psycho-physiological variables.
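The residual-analysis step can be illustrated with a short sketch that checks what fraction of prediction residuals fall inside a 90% confidence band and whether their histogram is consistent with a normal distribution; the heart-rate data here are synthetic and the band construction is an assumption, not the authors' exact procedure.

```python
# Hedged sketch of residual analysis: fraction of residuals inside a 90%
# confidence band plus a normality check. Data are synthetic stand-ins for
# observed vs. predicted heart rate.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)

time = np.linspace(0, 3 * np.pi, 300)
observed_hr = 80 + 20 * np.sin(time) + rng.normal(0, 3, 300)
predicted_hr = 80 + 20 * np.sin(time)

residuals = observed_hr - predicted_hr

# 90% confidence band assuming zero-mean, roughly normal residuals.
half_width = 1.645 * residuals.std(ddof=1)
inside = np.mean(np.abs(residuals) <= half_width)

# Normality check of the residual distribution.
stat, p_value = stats.normaltest(residuals)

print(f"Fraction of residuals inside the 90% band: {inside:.2%}")
print(f"Normality test p-value: {p_value:.3f} (large p is consistent with normality)")
```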
Validating a Model for Welding Induced Residual Stress Using High-Energy X-ray Diffraction
NASA Astrophysics Data System (ADS)
Mach, J. C.; Budrow, C. J.; Pagan, D. C.; Ruff, J. P. C.; Park, J.-S.; Okasinski, J.; Beaudoin, A. J.; Miller, M. P.
2017-05-01
Integrated computational materials engineering (ICME) provides a pathway to advance performance in structures through the use of physically-based models to better understand how manufacturing processes influence product performance. As one particular challenge, consider that residual stresses induced in fabrication are pervasive and directly impact the life of structures. For ICME to be an effective strategy, it is essential that predictive capability be developed in conjunction with critical experiments. In the present work, simulation results from a multi-physics model for gas metal arc welding are evaluated through x-ray diffraction using synchrotron radiation. A test component was designed with the intent to develop significant gradients in residual stress, be representative of real-world engineering applications, yet remain tractable for finely spaced strain measurements with the positioning equipment available at synchrotron facilities. The experimental validation lends confidence to model predictions, facilitating the explicit consideration of residual stress distributions in the prediction of fatigue life.
Shielded-Twisted-Pair Cable Model for Chafe Fault Detection via Time-Domain Reflectometry
NASA Technical Reports Server (NTRS)
Schuet, Stefan R.; Timucin, Dogan A.; Wheeler, Kevin R.
2012-01-01
This report details the development, verification, and validation of an innovative physics-based model of electrical signal propagation through shielded-twisted-pair cable, which is commonly found on aircraft and offers an ideal proving ground for detection of small holes in a shield well before catastrophic damage occurs. The accuracy of this model is verified through numerical electromagnetic simulations using a commercially available software tool. The model is shown to be representative of more realistic (analytically intractable) cable configurations as well. A probabilistic framework is developed for validating the model accuracy with reflectometry data obtained from real aircraft-grade cables chafed in the laboratory.
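For orientation, the basic time-domain reflectometry relations behind such a model can be sketched as follows; the impedance values, propagation velocity, and fault location are assumed numbers, and the sketch is not the report's physics-based cable model.

```python
# Illustrative sketch (not the report's model) of the basic time-domain
# reflectometry relations: a local impedance change at a chafe produces a small
# reflection whose arrival time locates the fault along the cable.
Z0 = 120.0             # ohm, assumed characteristic impedance of the shielded pair
Z_fault = 128.0        # ohm, assumed local impedance at a small shield defect
v_p = 0.7 * 3e8        # m/s, assumed propagation velocity (70% of c)
fault_distance = 12.5  # m, assumed location of the chafe

# Reflection coefficient of the impedance discontinuity.
gamma = (Z_fault - Z0) / (Z_fault + Z0)

# Round-trip delay from the injection point to the fault and back.
delay_ns = 2 * fault_distance / v_p * 1e9

print(f"Reflection coefficient: {gamma:+.4f}")
print(f"Expected echo delay:    {delay_ns:.1f} ns")
```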
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lai, Canhai; Xu, Zhijie; Pan, Wenxiao
2016-01-01
To quantify the predictive confidence of a solid sorbent-based carbon capture design, a hierarchical validation methodology—consisting of basic unit problems with increasing physical complexity coupled with filtered model-based geometric upscaling—has been developed and implemented. This paper describes the computational fluid dynamics (CFD) multi-phase reactive flow simulations and the associated data flows among different unit problems performed within the said hierarchical validation approach. The bench-top experiments used in this calibration and validation effort were carefully designed to follow the desired simple-to-complex unit problem hierarchy, with corresponding data acquisition to support model parameter calibrations at each unit problem level. A Bayesian calibration procedure is employed and the posterior model parameter distributions obtained at one unit-problem level are used as prior distributions for the same parameters in the next-tier simulations. Overall, the results have demonstrated that the multiphase reactive flow models within MFIX can be used to capture the bed pressure, temperature, CO2 capture capacity, and kinetics with quantitative accuracy. The CFD modeling methodology and associated uncertainty quantification techniques presented herein offer a solid framework for estimating the predictive confidence in the virtual scale-up of a larger carbon capture device.
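The posterior-to-prior hand-off between unit-problem tiers can be illustrated with a deliberately simple conjugate update; the normal-normal form, the parameter being calibrated, and the synthetic data are assumptions chosen only to show the data flow, not the MFIX or Bayesian machinery used in the paper.

```python
# Hedged sketch of the hierarchical idea: the posterior of a model parameter
# calibrated at one unit-problem level is carried forward as the prior at the
# next level. A conjugate normal-normal update is used purely for illustration.
import numpy as np

def normal_update(prior_mean, prior_var, data, noise_var):
    """Posterior of a normal mean with known observation noise variance."""
    n = len(data)
    post_var = 1.0 / (1.0 / prior_var + n / noise_var)
    post_mean = post_var * (prior_mean / prior_var + np.sum(data) / noise_var)
    return post_mean, post_var

rng = np.random.default_rng(4)

# Tier 1: bench-top unit problem (synthetic observations of the parameter).
tier1_data = rng.normal(0.85, 0.05, size=10)
m1, v1 = normal_update(prior_mean=1.0, prior_var=0.25,
                       data=tier1_data, noise_var=0.05**2)

# Tier 2: more complex unit problem, using the tier-1 posterior as its prior.
tier2_data = rng.normal(0.82, 0.08, size=6)
m2, v2 = normal_update(prior_mean=m1, prior_var=v1,
                       data=tier2_data, noise_var=0.08**2)

print(f"Tier 1 posterior: mean={m1:.3f}, sd={v1**0.5:.3f}")
print(f"Tier 2 posterior: mean={m2:.3f}, sd={v2**0.5:.3f}")
```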
Global Precipitation Measurement (GPM) Ground Validation (GV) Science Implementation Plan
NASA Technical Reports Server (NTRS)
Petersen, Walter A.; Hou, Arthur Y.
2008-01-01
For pre-launch algorithm development and post-launch product evaluation, Global Precipitation Measurement (GPM) Ground Validation (GV) goes beyond direct comparisons of surface rain rates between ground and satellite measurements to provide the means for improving retrieval algorithms and model applications. Three approaches to GPM GV include direct statistical validation (at the surface), precipitation physics validation (in a vertical column), and integrated science validation (4-dimensional). These three approaches support five themes: core satellite error characterization; constellation satellite validation; development of physical models of snow, cloud water, and mixed phase; development of cloud-resolving models (CRM) and land-surface models to bridge observations and algorithms; and development of coupled CRM-land surface modeling for basin-scale water budget studies and natural hazard prediction. This presentation describes the implementation of these approaches.
Economos, Christina D; Sacheck, Jennifer M; Kwan Ho Chui, Kenneth; Irizarry, Laura; Irizzary, Laura; Guillemont, Juliette; Collins, Jessica J; Hyatt, Raymond R
2008-04-01
Interventions aiming to modify the dietary and physical activity behaviors of young children require precise and accurate measurement tools. As part of a larger community-based project, three school-based questionnaires were developed to assess (a) fruit and vegetable intake, (b) physical activity and television (TV) viewing, and (c) perceived parental support for diet and physical activity. Test-retest reliability was assessed for all questionnaires, and validity was measured for fruit and vegetable intake, physical activity, and TV viewing. Eighty-four school children (8.3+/-1.1 years) were studied. Test-retest reliability was assessed by administering questionnaires twice, 1 to 2 hours apart. Validity of the fruit and vegetable questionnaire was measured by direct observation, while the physical activity and TV questionnaire was validated by a parent phone interview. All three questionnaires yielded excellent test-retest reliability (P<0.001). The majority of the fruit and vegetable questions and the questions regarding specific physical activities and TV viewing were valid. Low validity scores were found for questions on watching TV during breakfast or dinner. These questionnaires are reliable and valid tools to assess fruit and vegetable intake, physical activity, and TV viewing behaviors in early elementary school-aged children. Methods for assessing children's TV viewing during meals should be further investigated because of parent-child discrepancies.
Assessing participation in community-based physical activity programs in Brazil.
Reis, Rodrigo S; Yan, Yan; Parra, Diana C; Brownson, Ross C
2014-01-01
This study aimed to develop and validate a risk prediction model to examine the characteristics that are associated with participation in community-based physical activity programs in Brazil. We used pooled data from three surveys conducted from 2007 to 2009 in state capitals of Brazil with 6166 adults. A risk prediction model was built considering program participation as an outcome. The predictive accuracy of the model was quantified through discrimination (C statistic) and calibration (Brier score) properties. Bootstrapping methods were used to validate the predictive accuracy of the final model. The final model showed sex (women: odds ratio [OR] = 3.18, 95% confidence interval [CI] = 2.14-4.71), having less than high school degree (OR = 1.71, 95% CI = 1.16-2.53), reporting a good health (OR = 1.58, 95% CI = 1.02-2.24) or very good/excellent health (OR = 1.62, 95% CI = 1.05-2.51), having any comorbidity (OR = 1.74, 95% CI = 1.26-2.39), and perceiving the environment as safe to walk at night (OR = 1.59, 95% CI = 1.18-2.15) as predictors of participation in physical activity programs. Accuracy indices were adequate (C index = 0.778, Brier score = 0.031) and similar to those obtained from bootstrapping (C index = 0.792, Brier score = 0.030). Sociodemographic and health characteristics as well as perceptions of the environment are strong predictors of participation in community-based programs in selected cities of Brazil.
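A minimal sketch of this kind of validation workflow is shown below, using synthetic data and hypothetical predictor names loosely mirroring those reported: fit a logistic model for participation, then quantify discrimination (C statistic) and calibration (Brier score) with a simple bootstrap re-estimation. The bootstrap here only re-fits on resampled data; the optimism-corrected procedure used in the study is more involved.

```python
# Hedged sketch, with synthetic data, of a risk prediction workflow: logistic
# model for program participation, discrimination (C statistic), calibration
# (Brier score), and a simple bootstrap re-estimation of both indices.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score, brier_score_loss

rng = np.random.default_rng(5)
n = 2000

# Synthetic binary predictors loosely mirroring those in the abstract.
X = np.column_stack([
    rng.integers(0, 2, n),  # female
    rng.integers(0, 2, n),  # less than high school
    rng.integers(0, 2, n),  # good or better self-rated health
    rng.integers(0, 2, n),  # any comorbidity
    rng.integers(0, 2, n),  # perceives environment as safe at night
])
logit = -3.0 + X @ np.array([1.15, 0.54, 0.48, 0.55, 0.46])
y = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit)))

model = LogisticRegression().fit(X, y)
p = model.predict_proba(X)[:, 1]
print(f"Apparent C index: {roc_auc_score(y, p):.3f}, Brier: {brier_score_loss(y, p):.3f}")

# Bootstrap re-estimation of the two accuracy indices.
c_boot, b_boot = [], []
for _ in range(200):
    idx = rng.integers(0, n, n)
    m = LogisticRegression().fit(X[idx], y[idx])
    pb = m.predict_proba(X)[:, 1]
    c_boot.append(roc_auc_score(y, pb))
    b_boot.append(brier_score_loss(y, pb))
print(f"Bootstrap C index: {np.mean(c_boot):.3f}, Brier: {np.mean(b_boot):.3f}")
```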
Model-Based Verification and Validation of Spacecraft Avionics
NASA Technical Reports Server (NTRS)
Khan, M. Omair; Sievers, Michael; Standley, Shaun
2012-01-01
Verification and Validation (V&V) at JPL is traditionally performed on flight or flight-like hardware running flight software. For some time, the complexity of avionics has increased exponentially while the time allocated for system integration and associated V&V testing has remained fixed. There is an increasing need to perform comprehensive system-level V&V using modeling and simulation, and to use scarce hardware testing time to validate models, as has been the norm for thermal and structural V&V for some time. Our approach extends model-based V&V to electronics and software through functional and structural models implemented in SysML. We develop component models of electronics and software that are validated by comparison with test results from actual equipment. The models are then simulated, enabling a more complete set of test cases than is possible on flight hardware. SysML simulations provide access to and control of internal nodes that may not be available in physical systems. This is particularly helpful in testing fault protection behaviors when injecting faults is either not possible or potentially damaging to the hardware. We can also model both hardware and software behaviors in SysML, which allows us to simulate hardware and software interactions. With an integrated model and simulation capability we can evaluate the hardware and software interactions and identify problems sooner. The primary missing piece is validating SysML model correctness against hardware; this experiment demonstrated such an approach is possible.
NASA Astrophysics Data System (ADS)
Kennedy, Joseph H.; Bennett, Andrew R.; Evans, Katherine J.; Price, Stephen; Hoffman, Matthew; Lipscomb, William H.; Fyke, Jeremy; Vargo, Lauren; Boghozian, Adrianna; Norman, Matthew; Worley, Patrick H.
2017-06-01
To address the pressing need to better understand the behavior and complex interaction of ice sheets within the global Earth system, significant development of continental-scale, dynamical ice sheet models is underway. Concurrent to the development of the Community Ice Sheet Model (CISM), the corresponding verification and validation (V&V) process is being coordinated through a new, robust, Python-based extensible software package, the Land Ice Verification and Validation toolkit (LIVVkit). Incorporated into the typical ice sheet model development cycle, it provides robust and automated numerical verification, software verification, performance validation, and physical validation analyses on a variety of platforms, from personal laptops to the largest supercomputers. LIVVkit operates on sets of regression test and reference data sets, and provides comparisons for a suite of community prioritized tests, including configuration and parameter variations, bit-for-bit evaluation, and plots of model variables to indicate where differences occur. LIVVkit also provides an easily extensible framework to incorporate and analyze results of new intercomparison projects, new observation data, and new computing platforms. LIVVkit is designed for quick adaptation to additional ice sheet models via abstraction of model specific code, functions, and configurations into an ice sheet model description bundle outside the main LIVVkit structure. Ultimately, through shareable and accessible analysis output, LIVVkit is intended to help developers build confidence in their models and enhance the credibility of ice sheet models overall.
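One of the checks LIVVkit automates, bit-for-bit evaluation of model output against a reference run, can be sketched as follows; the function name, the tolerance-based fallback report, and the synthetic arrays are illustrative and not LIVVkit's actual API.

```python
# Hedged sketch of a LIVVkit-style check: a bit-for-bit comparison of a model
# variable between a regression run and a reference run, with a tolerance-based
# report when the arrays differ. Names and data are hypothetical.
import numpy as np

def bit_for_bit(test: np.ndarray, reference: np.ndarray) -> dict:
    """Report whether two arrays match exactly and, if not, by how much."""
    exact = test.shape == reference.shape and np.array_equal(test, reference)
    report = {"bit_for_bit": exact}
    if not exact and test.shape == reference.shape:
        diff = np.abs(test - reference)
        report.update(max_abs_diff=float(diff.max()),
                      rms_diff=float(np.sqrt(np.mean(diff ** 2))))
    return report

# Synthetic stand-ins for, e.g., an ice-thickness field from two runs.
reference = np.linspace(0.0, 1000.0, 101)
test = reference + np.where(np.arange(101) == 50, 1e-7, 0.0)  # one perturbed value

print(bit_for_bit(reference.copy(), reference))  # {'bit_for_bit': True}
print(bit_for_bit(test, reference))              # differences reported
```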
Spacecraft Internal Acoustic Environment Modeling
NASA Technical Reports Server (NTRS)
Chu, S. Reynold; Allen, Chris
2009-01-01
The objective of the project is to develop an acoustic modeling capability, based on commercial off-the-shelf software, to be used as a tool for oversight of the future manned Constellation vehicles. The use of such a model will help ensure compliance with acoustic requirements. Also, this project includes modeling validation and development feedback via building physical mockups and conducting acoustic measurements to compare with the predictions.
Switching moving boundary models for two-phase flow evaporators and condensers
NASA Astrophysics Data System (ADS)
Bonilla, Javier; Dormido, Sebastián; Cellier, François E.
2015-03-01
The moving boundary method is an appealing approach for the design, testing and validation of advanced control schemes for evaporators and condensers. When it comes to advanced control strategies, not only accurate but fast dynamic models are required. Moving boundary models are fast low-order dynamic models, and they can describe the dynamic behavior with high accuracy. This paper presents a mathematical formulation based on physical principles for two-phase flow moving boundary evaporator and condenser models which support dynamic switching between all possible flow configurations. The models were implemented in a library using the equation-based object-oriented Modelica language. Several integrity tests in steady-state and transient predictions together with stability tests verified the models. Experimental data from a direct steam generation parabolic-trough solar thermal power plant is used to validate and compare the developed moving boundary models against finite volume models.
Spacecraft Internal Acoustic Environment Modeling
NASA Technical Reports Server (NTRS)
Allen, Christopher; Chu, S. Reynold
2008-01-01
The objective of the project is to develop an acoustic modeling capability, based on commercial off-the-shelf software, to be used as a tool for oversight of the future manned Constellation vehicles to ensure compliance with acoustic requirements and thus provide a safe and habitable acoustic environment for the crews, and to validate developed models via building physical mockups and conducting acoustic measurements.
Tooze, Janet A; Troiano, Richard P; Carroll, Raymond J; Moshfegh, Alanna J; Freedman, Laurence S
2013-06-01
Systematic investigations into the structure of measurement error of physical activity questionnaires are lacking. We propose a measurement error model for a physical activity questionnaire that uses physical activity level (the ratio of total energy expenditure to basal energy expenditure) to relate questionnaire-based reports of physical activity level to true physical activity levels. The 1999-2006 National Health and Nutrition Examination Survey physical activity questionnaire was administered to 433 participants aged 40-69 years in the Observing Protein and Energy Nutrition (OPEN) Study (Maryland, 1999-2000). Valid estimates of participants' total energy expenditure were also available from doubly labeled water, and basal energy expenditure was estimated from an equation; the ratio of those measures estimated true physical activity level ("truth"). We present a measurement error model that accommodates the mixture of errors that arise from assuming a classical measurement error model for doubly labeled water and a Berkson error model for the equation used to estimate basal energy expenditure. The method was then applied to the OPEN Study. Correlations between the questionnaire-based physical activity level and truth were modest (r = 0.32-0.41); attenuation factors (0.43-0.73) indicate that the use of questionnaire-based physical activity level would lead to attenuated estimates of effect size. Results suggest that sample sizes for estimating relationships between physical activity level and disease should be inflated, and that regression calibration can be used to provide measurement error-adjusted estimates of relationships between physical activity and disease.
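The attenuation and regression-calibration ideas can be illustrated with a synthetic sketch; the error structure, variances, and outcome model below are assumptions for illustration only, not the OPEN Study values.

```python
# Hedged, synthetic illustration of the attenuation factor and regression
# calibration: questionnaire-based physical activity level (Q) measures true
# level (T) with error, which attenuates an exposure-outcome slope.
import numpy as np

rng = np.random.default_rng(6)
n = 5000

truth = rng.normal(1.75, 0.25, n)                   # true physical activity level (assumed)
questionnaire = truth + rng.normal(0.0, 0.35, n)    # classical error model (assumed)

# Attenuation factor: slope of the regression of truth on the questionnaire.
lam = np.cov(truth, questionnaire)[0, 1] / np.var(questionnaire, ddof=1)
print(f"Correlation(Q, T): {np.corrcoef(truth, questionnaire)[0, 1]:.2f}")
print(f"Attenuation factor: {lam:.2f}")

# A health outcome generated from truth; regressing on Q attenuates the slope.
outcome = 2.0 * truth + rng.normal(0.0, 1.0, n)
naive_slope = np.cov(outcome, questionnaire)[0, 1] / np.var(questionnaire, ddof=1)
calibrated_slope = naive_slope / lam                # regression-calibration correction
print(f"True slope 2.00, naive {naive_slope:.2f}, calibrated {calibrated_slope:.2f}")
```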
Décary, Simon; Ouellet, Philippe; Vendittoli, Pascal-André; Roy, Jean-Sébastien; Desmeules, François
2017-01-01
More evidence on the diagnostic validity of physical examination tests for knee disorders is needed to reduce reliance on frequently used and costly imaging tests. The objective was to conduct a systematic review of systematic reviews (SR) and meta-analyses (MA) evaluating the diagnostic validity of physical examination tests for knee disorders. A structured literature search was conducted in five databases up to January 2016. Methodological quality was assessed using the AMSTAR. Seventeen reviews were included, with a mean AMSTAR score of 5.5 ± 2.3. Based on six SR, only the Lachman test for ACL injuries is diagnostically valid when individually performed (positive likelihood ratio (LR+): 10.2; negative likelihood ratio (LR-): 0.2). Based on two SR, the Ottawa Knee Rule is a valid screening tool for knee fractures (LR-: 0.05). Based on one SR, the EULAR criteria had a post-test probability of 99% for the diagnosis of knee osteoarthritis. Based on two SR, a complete physical examination performed by a trained health provider was found to be diagnostically valid for ACL, PCL and meniscal injuries as well as for cartilage lesions. When individually performed, common physical tests are rarely able to rule in or rule out a specific knee disorder, except the Lachman test for ACL injuries. There is low-quality evidence concerning the validity of combining history elements and physical tests. Copyright © 2016 Elsevier Ltd. All rights reserved.
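To show how the reported likelihood ratios behave in practice, the following sketch converts an assumed pre-test probability into post-test probabilities via the odds form of Bayes' theorem; the pre-test probability is illustrative, not taken from the review.

```python
# Minimal sketch of how the reported likelihood ratios translate into post-test
# probabilities; the pre-test probability below is an assumed example value.
def post_test_probability(pre_test_prob: float, likelihood_ratio: float) -> float:
    pre_odds = pre_test_prob / (1.0 - pre_test_prob)
    post_odds = pre_odds * likelihood_ratio
    return post_odds / (1.0 + post_odds)

pre = 0.40  # assumed pre-test probability of an ACL injury

print(f"Positive Lachman (LR+ = 10.2): {post_test_probability(pre, 10.2):.0%}")
print(f"Negative Lachman (LR- = 0.2):  {post_test_probability(pre, 0.2):.0%}")
```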
A charge-based model of Junction Barrier Schottky rectifiers
NASA Astrophysics Data System (ADS)
Latorre-Rey, Alvaro D.; Mudholkar, Mihir; Quddus, Mohammed T.; Salih, Ali
2018-06-01
A new charge-based model of the electric field distribution in Junction Barrier Schottky (JBS) diodes is presented, based on a description of the charge-sharing effect between the vertical Schottky junction and the lateral pn-junctions that constitute the active cell of the device. In our model, the inherently 2-D problem is transformed into a simple but accurate 1-D problem with a closed analytical solution that captures the reshaping and reduction of the electric field profile responsible for the improved electrical performance of these devices, while preserving physically meaningful expressions that depend on relevant device parameters. The model is validated by comparing calculated electric field profiles with drift-diffusion simulations of a JBS device, showing good agreement. Even though other fully 2-D models already available provide higher accuracy, they lack physical insight, making the proposed model a useful tool for device design.
A paradigm for modeling and computation of gas dynamics
NASA Astrophysics Data System (ADS)
Xu, Kun; Liu, Chang
2017-02-01
In the continuum flow regime, the Navier-Stokes (NS) equations are usually used for the description of gas dynamics. On the other hand, the Boltzmann equation is applied for rarefied flow. These two equations are based on distinct modeling scales for flow physics. Fortunately, due to the scale separation, i.e., the hydrodynamic and kinetic ones, both the Navier-Stokes equations and the Boltzmann equation are applicable in their respective domains. However, in real science and engineering applications, there may not be such a distinctive scale separation. For example, around a hypersonic flying vehicle, the flow physics in different regions may correspond to different regimes, where the local Knudsen number can change significantly over several orders of magnitude. With such a variation of flow physics, a governing equation that transitions continuously from the kinetic Boltzmann modeling to the hydrodynamic Navier-Stokes dynamics should, in principle, be used for an efficient description. However, due to the difficulty of directly modeling flow physics at scales between the kinetic and hydrodynamic ones, there is basically no reliable theory or valid governing equation covering the whole transition regime, except by always resolving flow physics down to the mean free path scale, as in the direct Boltzmann solver and the Direct Simulation Monte Carlo (DSMC) method. In fact, the exact scale at which the NS equations remain valid is an unresolved problem, especially for small Reynolds number cases. Computational fluid dynamics (CFD) is usually based on the numerical solution of partial differential equations (PDEs), and it targets the recovery of the exact solution of the PDEs as the mesh size and time step converge to zero. This methodology can hardly be applied to solve the multiple-scale problem efficiently because there is no complete PDE for flow physics through a continuous variation of scales. For the study of non-equilibrium flow, direct modeling methods, such as DSMC, particle in cell, and smoothed particle hydrodynamics, play a dominant role by incorporating the flow physics directly into the algorithm construction. It is fully legitimate to combine modeling and computation without going through the process of constructing PDEs. In other words, CFD research is not only about obtaining numerical solutions of governing equations but also about modeling flow dynamics. This methodology leads to the unified gas-kinetic scheme (UGKS) for flow simulation in all flow regimes. Based on UGKS, the boundary for the validity of the Navier-Stokes equations can be quantitatively evaluated. The combination of modeling and computation provides a paradigm for the description of multiscale transport processes.
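The scale argument above can be made concrete with a small sketch that computes a hard-sphere mean free path and classifies the local flow regime by Knudsen number; the gas properties, characteristic lengths, and regime thresholds are common textbook values used here for illustration.

```python
# Hedged sketch of the scale argument: the local Knudsen number (mean free
# path over a characteristic length) indicates which flow description applies.
# Gas properties, lengths, and regime thresholds are illustrative.
import numpy as np

k_B = 1.380649e-23  # Boltzmann constant, J/K

def mean_free_path(temperature, pressure, diameter):
    """Hard-sphere mean free path."""
    return k_B * temperature / (np.sqrt(2.0) * np.pi * diameter**2 * pressure)

def regime(knudsen):
    if knudsen < 0.001:
        return "continuum (Navier-Stokes)"
    if knudsen < 0.1:
        return "slip"
    if knudsen < 10.0:
        return "transition"
    return "free molecular"

lam = mean_free_path(temperature=300.0, pressure=101325.0, diameter=3.7e-10)  # air-like gas
for L in (1.0, 1e-3, 1e-6, 1e-7):  # characteristic lengths in metres
    Kn = lam / L
    print(f"L = {L:8.1e} m  ->  Kn = {Kn:8.2e}  ({regime(Kn)})")
```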
Review of TRMM/GPM Rainfall Algorithm Validation
NASA Technical Reports Server (NTRS)
Smith, Eric A.
2004-01-01
A review is presented concerning current progress on evaluation and validation of standard Tropical Rainfall Measuring Mission (TRMM) precipitation retrieval algorithms and the prospects for implementing an improved validation research program for the next-generation Global Precipitation Measurement (GPM) Mission. All standard TRMM algorithms are physical in design, and are thus based on fundamental principles of microwave radiative transfer and its interaction with semi-detailed cloud microphysical constituents. They are evaluated for consistency and degree of equivalence with one another, as well as intercompared to radar-retrieved rainfall at TRMM's four main ground validation sites. Similarities and differences are interpreted in the context of the radiative and microphysical assumptions underpinning the algorithms. Results indicate that the current accuracies of the TRMM Version 6 algorithms are approximately 15% at zonally averaged/monthly scales, with precisions of approximately 25% for full-resolution/instantaneous rain rate estimates (i.e., level 2 retrievals). Strengths and weaknesses of the TRMM validation approach are summarized. Because the degree of convergence of level 2 TRMM algorithms is being used as a guide for setting validation requirements for the GPM mission, it is important that the GPM algorithm validation program be improved to ensure concomitant improvement in the standard GPM retrieval algorithms. An overview of the GPM Mission's validation plan is provided, including a description of a new type of physical validation model using an analytic 3-dimensional radiative transfer model.
Catchment-scale Validation of a Physically-based, Post-fire Runoff and Erosion Model
NASA Astrophysics Data System (ADS)
Quinn, D.; Brooks, E. S.; Robichaud, P. R.; Dobre, M.; Brown, R. E.; Wagenbrenner, J.
2017-12-01
The cascading consequences of fire-induced ecological changes have profound impacts on both natural and managed forest ecosystems. Forest managers tasked with implementing post-fire mitigation strategies need robust tools to evaluate the effectiveness of their decisions, particularly those affecting hydrological recovery. Various hillslope-scale interfaces of the physically-based Water Erosion Prediction Project (WEPP) model have been successfully validated for this purpose using fire-affected plot experiments; however, these interfaces are explicitly designed to simulate single hillslopes. Spatially distributed, catchment-scale WEPP interfaces have been developed over the past decade, but none have been validated for post-fire simulations, posing a barrier to adoption by forest managers. In this validation study, we compare WEPP simulations with pre- and post-fire hydrological records for three forested catchments (W. Willow, N. Thomas, and S. Thomas) that burned in the 2011 Wallow Fire in northeastern Arizona, USA. Simulations were conducted using two approaches: the first using automatically created inputs from an online, spatial, post-fire WEPP interface, and the second using manually created inputs that incorporate the spatial variability of fire effects observed in the field. Both approaches were compared to five years of observed post-fire sediment and flow data to assess goodness of fit.
KINEROS2-AGWA: Model Use, Calibration, and Validation
NASA Technical Reports Server (NTRS)
Goodrich, D. C.; Burns, I. S.; Unkrich, C. L.; Semmens, D. J.; Guertin, D. P.; Hernandez, M.; Yatheendradas, S.; Kennedy, J. R.; Levick, L. R.
2013-01-01
KINEROS (KINematic runoff and EROSion) originated in the 1960s as a distributed event-based model that conceptualizes a watershed as a cascade of overland flow model elements that flow into trapezoidal channel model elements. KINEROS was one of the first widely available watershed models that interactively coupled a finite difference approximation of the kinematic overland flow equations to a physically based infiltration model. Development and improvement of KINEROS continued from the 1960s on a variety of projects for a range of purposes, which has resulted in a suite of KINEROS-based modeling tools. This article focuses on KINEROS2 (K2), a spatially distributed, event-based watershed rainfall-runoff and erosion model, and the companion ArcGIS-based Automated Geospatial Watershed Assessment (AGWA) tool. AGWA automates the time-consuming tasks of watershed delineation into distributed model elements and initial parameterization of these elements using commonly available, national GIS data layers. A variety of approaches have been used to calibrate and validate K2 successfully across a relatively broad range of applications (e.g., urbanization, pre- and post-fire, hillslope erosion, erosion from roads, runoff and recharge, and manure transport). The case studies presented in this article (1) compare lumped to stepwise calibration and validation of runoff and sediment at plot, hillslope, and small watershed scales; and (2) demonstrate an uncalibrated application to address relative change in watershed response to wildfire.
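For orientation, a generic textbook form of the kinematic overland flow approximation referred to above is sketched below; it is illustrative rather than transcribed from the KINEROS documentation. Here h is flow depth, Q discharge per unit width, r rainfall rate, f infiltration rate (supplied by the infiltration model), S slope, and n a Manning-type roughness coefficient.

```latex
\frac{\partial h}{\partial t} + \frac{\partial Q}{\partial x} = r(x,t) - f(x,t),
\qquad Q = \alpha\, h^{m}, \quad \alpha = \frac{\sqrt{S}}{n}, \; m = \tfrac{5}{3}
```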
KINEROS2/AGWA: Model use, calibration and validation
Goodrich, D.C.; Burns, I.S.; Unkrich, C.L.; Semmens, Darius J.; Guertin, D.P.; Hernandez, M.; Yatheendradas, S.; Kennedy, Jeffrey R.; Levick, Lainie R.
2012-01-01
KINEROS (KINematic runoff and EROSion) originated in the 1960s as a distributed event-based model that conceptualizes a watershed as a cascade of overland flow model elements that flow into trapezoidal channel model elements. KINEROS was one of the first widely available watershed models that interactively coupled a finite difference approximation of the kinematic overland flow equations to a physically based infiltration model. Development and improvement of KINEROS continued from the 1960s on a variety of projects for a range of purposes, which has resulted in a suite of KINEROS-based modeling tools. This article focuses on KINEROS2 (K2), a spatially distributed, event-based watershed rainfall-runoff and erosion model, and the companion ArcGIS-based Automated Geospatial Watershed Assessment (AGWA) tool. AGWA automates the time-consuming tasks of watershed delineation into distributed model elements and initial parameterization of these elements using commonly available, national GIS data layers. A variety of approaches have been used to calibrate and validate K2 successfully across a relatively broad range of applications (e.g., urbanization, pre- and post-fire, hillslope erosion, erosion from roads, runoff and recharge, and manure transport). The case studies presented in this article (1) compare lumped to stepwise calibration and validation of runoff and sediment at plot, hillslope, and small watershed scales; and (2) demonstrate an uncalibrated application to address relative change in watershed response to wildfire.
Lee, Haerin; Jung, Moonki; Lee, Ki-Kwang; Lee, Sang Hun
2017-02-06
In this paper, we propose a three-dimensional design and evaluation framework and process based on a probabilistic-based motion synthesis algorithm and biomechanical analysis system for the design of the Smith machine and squat training programs. Moreover, we implemented a prototype system to validate the proposed framework. The framework consists of an integrated human-machine-environment model as well as a squat motion synthesis system and biomechanical analysis system. In the design and evaluation process, we created an integrated model in which interactions between a human body and machine or the ground are modeled as joints with constraints at contact points. Next, we generated Smith squat motion using the motion synthesis program based on a Gaussian process regression algorithm with a set of given values for independent variables. Then, using the biomechanical analysis system, we simulated joint moments and muscle activities from the input of the integrated model and squat motion. We validated the model and algorithm through physical experiments measuring the electromyography (EMG) signals, ground forces, and squat motions as well as through a biomechanical simulation of muscle forces. The proposed approach enables the incorporation of biomechanics in the design process and reduces the need for physical experiments and prototypes in the development of training programs and new Smith machines.
A 3D Human-Machine Integrated Design and Analysis Framework for Squat Exercises with a Smith Machine
Lee, Haerin; Jung, Moonki; Lee, Ki-Kwang; Lee, Sang Hun
2017-01-01
In this paper, we propose a three-dimensional design and evaluation framework and process based on a probabilistic-based motion synthesis algorithm and biomechanical analysis system for the design of the Smith machine and squat training programs. Moreover, we implemented a prototype system to validate the proposed framework. The framework consists of an integrated human–machine–environment model as well as a squat motion synthesis system and biomechanical analysis system. In the design and evaluation process, we created an integrated model in which interactions between a human body and machine or the ground are modeled as joints with constraints at contact points. Next, we generated Smith squat motion using the motion synthesis program based on a Gaussian process regression algorithm with a set of given values for independent variables. Then, using the biomechanical analysis system, we simulated joint moments and muscle activities from the input of the integrated model and squat motion. We validated the model and algorithm through physical experiments measuring the electromyography (EMG) signals, ground forces, and squat motions as well as through a biomechanical simulation of muscle forces. The proposed approach enables the incorporation of biomechanics in the design process and reduces the need for physical experiments and prototypes in the development of training programs and new Smith machines. PMID:28178184
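A minimal, hypothetical sketch of Gaussian process regression used to map squat-condition variables to a motion feature, in the spirit of the probabilistic motion-synthesis step described above, is given below. The variable names and data are illustrative only; the authors' actual pipeline is more involved.

```python
# Hypothetical GPR sketch: predict a motion feature from condition variables.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

# Training inputs: e.g. [subject height (m), barbell load (kg)]
X = np.array([[1.60, 20.0], [1.70, 40.0], [1.80, 60.0], [1.90, 80.0]])
# Training output: e.g. peak knee flexion angle (deg) from motion capture
y = np.array([95.0, 100.0, 104.0, 109.0])

gpr = GaussianProcessRegressor(kernel=RBF() + WhiteKernel(), normalize_y=True)
gpr.fit(X, y)

# Predict the motion feature (with uncertainty) for a new subject/load
mean, std = gpr.predict(np.array([[1.75, 50.0]]), return_std=True)
print(f"predicted angle: {mean[0]:.1f} deg +/- {std[0]:.1f}")
```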
Saunders, Ruth P.; McIver, Kerry L.; Dowda, Marsha; Pate, Russell R.
2013-01-01
Objective Scales used to measure selected social-cognitive beliefs and motives for physical activity were tested among boys and girls. Methods Covariance modeling was applied to responses obtained from large multi-ethnic samples of students in the fifth and sixth grades. Results Theoretically and statistically sound models were developed, supporting the factorial validity of the scales in all groups. Multi-group longitudinal invariance was confirmed between boys and girls, overweight and normal weight students, and non-Hispanic black and white children. The construct validity of the scales was supported by hypothesized convergent and discriminant relationships within a measurement model that included correlations with physical activity (MET • min/day) measured by an accelerometer. Conclusions Scores from the scales provide valid assessments of selected beliefs and motives that are putative mediators of change in physical activity among boys and girls, as they begin the understudied transition from the fifth grade into middle school, when physical activity naturally declines. PMID:23459310
Dishman, Rod K; Saunders, Ruth P; McIver, Kerry L; Dowda, Marsha; Pate, Russell R
2013-06-01
Scales used to measure selected social-cognitive beliefs and motives for physical activity were tested among boys and girls. Covariance modeling was applied to responses obtained from large multi-ethnic samples of students in the fifth and sixth grades. Theoretically and statistically sound models were developed, supporting the factorial validity of the scales in all groups. Multi-group longitudinal invariance was confirmed between boys and girls, overweight and normal weight students, and non-Hispanic black and white children. The construct validity of the scales was supported by hypothesized convergent and discriminant relationships within a measurement model that included correlations with physical activity (MET • min/day) measured by an accelerometer. Scores from the scales provide valid assessments of selected beliefs and motives that are putative mediators of change in physical activity among boys and girls, as they begin the understudied transition from the fifth grade into middle school, when physical activity naturally declines.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Simunovic, Srdjan
2015-02-16
CASL's modeling and simulation technology, the Virtual Environment for Reactor Applications (VERA), incorporates coupled physics and science-based models, state-of-the-art numerical methods, modern computational science, integrated uncertainty quantification (UQ) and validation against data from operating pressurized water reactors (PWRs), single-effect experiments, and integral tests. The computational simulation component of VERA is the VERA Core Simulator (VERA-CS). The core simulator is the specific collection of multi-physics computer codes used to model and deplete a LWR core over multiple cycles. The core simulator has a single common input file that drives all of the different physics codes. The parser code, VERAIn, converts VERA Input into an XML file that is used as input to different VERA codes.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tregillis, I. L.
The Los Alamos Physics and Engineering Models (PEM) program has developed a model for Richtmyer-Meshkov instability (RMI) based ejecta production from shock-melted surfaces, along with a prescription for a self-similar velocity distribution (SSVD) of the resulting ejecta particles. We have undertaken an effort to validate this source model using data from explosively driven tin coupon experiments. The model’s current formulation lacks a crucial piece of physics: a method for determining the duration of the ejecta production interval. Without a mechanism for terminating ejecta production, the model is not predictive. Furthermore, when the production interval is hand-tuned to match time-integrated mass data, the predicted time-dependent mass accumulation on a downstream sensor rises too sharply at early times and too slowly at late times because the SSVD overestimates the amount of mass stored in the fastest particles and underestimates the mass stored in the slowest particles. The functional form of the resulting m(t) is inconsistent with the available time-dependent data; numerical simulations and analytic studies agree on this point. Simulated mass tallies are highly sensitive to radial expansion of the ejecta cloud. It is not clear if the same effect is present in the experimental data but if so, depending on the degree, this may challenge the model’s compatibility with tin coupon data. The current implementation of the model in FLAG is sensitive to the detailed interaction between kinematics (hydrodynamic methods) and thermodynamics (material models); this sensitivity prohibits certain physics modeling choices. The appendices contain an extensive analytic study of piezoelectric ejecta mass measurements, along with test problems, excerpted from a longer work (LA-UR-17-21218).
DOE Office of Scientific and Technical Information (OSTI.GOV)
Anh Bui; Nam Dinh; Brian Williams
In addition to the validation data plan, development of advanced techniques for calibration and validation of complex multiscale, multiphysics nuclear reactor simulation codes is a main objective of the CASL VUQ plan. Advanced modeling of LWR systems normally involves a range of physico-chemical models describing multiple interacting phenomena, such as thermal hydraulics, reactor physics, coolant chemistry, etc., which occur over a wide range of spatial and temporal scales. To a large extent, the accuracy of (and uncertainty in) overall model predictions is determined by the correctness of various sub-models, which are not conservation-laws based, but empirically derived from measurement data. Such sub-models normally require extensive calibration before the models can be applied to analysis of real reactor problems. This work demonstrates a case study of calibration of a common model of subcooled flow boiling, which is an important multiscale, multiphysics phenomenon in LWR thermal hydraulics. The calibration process is based on a new strategy of model-data integration, in which all sub-models are simultaneously analyzed and calibrated using multiple sets of data of different types. Specifically, both data on large-scale distributions of void fraction and fluid temperature and data on small-scale physics of wall evaporation were simultaneously used in this work’s calibration. In a departure from traditional (or common-sense) practice of tuning/calibrating complex models, a modern calibration technique based on statistical modeling and Bayesian inference was employed, which allowed simultaneous calibration of multiple sub-models (and related parameters) using different datasets. Quality of data (relevancy, scalability, and uncertainty) could be taken into consideration in the calibration process. This work presents a step forward in the development and realization of the “CIPS Validation Data Plan” at the Consortium for Advanced Simulation of LWRs to enable quantitative assessment of the CASL modeling of the Crud-Induced Power Shift (CIPS) phenomenon, in particular, and the CASL advanced predictive capabilities, in general. This report is prepared for the Department of Energy’s Consortium for Advanced Simulation of LWRs program’s VUQ Focus Area.
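The following is a deliberately simplified sketch of the simultaneous, multi-dataset calibration idea described above: one shared parameter, two hypothetical sub-models standing in for the void-fraction and wall-evaporation data types, and a grid-evaluated Bayesian posterior. It illustrates the concept only and is not the CASL implementation.

```python
# Schematic multi-dataset Bayesian calibration of a single shared parameter.
import numpy as np

def model_void_fraction(theta, x):     # hypothetical sub-model 1
    return theta * x

def model_wall_evaporation(theta, x):  # hypothetical sub-model 2
    return np.sqrt(theta) * x

rng = np.random.default_rng(0)
true_theta = 0.4
x1, s1 = np.linspace(0.1, 1.0, 10), 0.02   # data type 1: locations, noise level
x2, s2 = np.linspace(0.1, 1.0, 5), 0.05    # data type 2
y1 = model_void_fraction(true_theta, x1) + rng.normal(0.0, s1, x1.size)
y2 = model_wall_evaporation(true_theta, x2) + rng.normal(0.0, s2, x2.size)

# Flat prior on theta; posterior is proportional to the product of the
# Gaussian likelihoods of BOTH data types, evaluated on a parameter grid.
thetas = np.linspace(0.01, 1.0, 500)
log_post = np.array([
    -0.5 * np.sum((y1 - model_void_fraction(t, x1)) ** 2 / s1 ** 2)
    - 0.5 * np.sum((y2 - model_wall_evaporation(t, x2)) ** 2 / s2 ** 2)
    for t in thetas
])
post = np.exp(log_post - log_post.max())
post /= post.sum()
print("posterior mean of theta:", float(np.sum(thetas * post)))
```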
DOE Office of Scientific and Technical Information (OSTI.GOV)
Genser, Krzysztof; Hatcher, Robert; Kelsey, Michael
The Geant4 simulation toolkit is used to model interactions between particles and matter. Geant4 employs a set of validated physics models that span a wide range of interaction energies. These models rely on measured cross-sections and phenomenological models with the physically motivated parameters that are tuned to cover many application domains. To study what uncertainties are associated with the Geant4 physics models we have designed and implemented a comprehensive, modular, user-friendly software toolkit that allows the variation of one or more parameters of one or more Geant4 physics models involved in simulation studies. It also enables analysis of multiple variants of the resulting physics observables of interest in order to estimate the uncertainties associated with the simulation model choices. Based on modern event-processing infrastructure software, the toolkit offers a variety of attractive features, e.g. a flexible run-time configurable workflow, comprehensive bookkeeping, and an easy-to-expand collection of analytical components. Design, implementation technology, and key functionalities of the toolkit are presented in this paper and illustrated with selected results.
NASA Technical Reports Server (NTRS)
Kim, E.; Tedesco, M.; Reichle, R.; Choudhury, B.; Peters-Lidard, C.; Foster, J.; Hall, D.; Riggs, G.
2006-01-01
Microwave-based retrievals of snow parameters from satellite observations have a long heritage and have so far been generated primarily by regression-based empirical "inversion" methods based on snapshots in time. Direct assimilation of microwave radiance into physical land surface models can be used to avoid errors associated with such retrieval/inversion methods, instead utilizing more straightforward forward models and temporal information. This approach has been used for years for atmospheric parameters by the operational weather forecasting community with great success. Recent developments in forward radiative transfer modeling, physical land surface modeling, and land data assimilation are converging to allow the assembly of an integrated framework for snow/cold lands modeling and radiance assimilation. The objective of the Goddard snow radiance assimilation project is to develop such a framework and explore its capabilities. The key elements of this framework include: a forward radiative transfer model (FRTM) for snow, a snowpack physical model, a land surface water/energy cycle model, and a data assimilation scheme. In fact, multiple models are available for each element enabling optimization to match the needs of a particular study. Together these form a modular and flexible framework for self-consistent, physically-based remote sensing and water/energy cycle studies. In this paper we will describe the elements and the integration plan. All modules will operate within the framework of the Land Information System (LIS), a land surface modeling framework with data assimilation capabilities running on a parallel-node computing cluster. Capabilities for assimilation of snow retrieval products are already under development for LIS. We will describe plans to add radiance-based assimilation capabilities. Plans for validation activities using field measurements will also be discussed.
Physics-based distributed snow models in the operational arena: Current and future challenges
NASA Astrophysics Data System (ADS)
Winstral, A. H.; Jonas, T.; Schirmer, M.; Helbig, N.
2017-12-01
The demand for modeling tools robust to climate change and weather extremes, along with coincident increases in computational capabilities, has led to increased use of physics-based snow models in operational applications. Current operational applications include those of the WSL-SLF across Switzerland, the ASO in California, and the USDA-ARS in Idaho. While the physics-based approaches offer many advantages, limitations and modeling challenges remain. The most evident limitation remains computation times that often limit forecasters to a single, deterministic model run. Other limitations, however, remain less conspicuous amid the assumption that, being founded on physical principles, these models require little to no calibration. Yet all energy balance snow models seemingly contain parameterizations or simplifications of processes where validation data are scarce or present understanding is limited. At the research-basin scale where many of these models were developed, these modeling elements may prove adequate. However, when applied over large areas, spatially invariant parameterizations of snow albedo, roughness lengths, and atmospheric exchange coefficients - all vital to determining the snowcover energy balance - become problematic. Moreover, as we apply models over larger grid cells, the representation of sub-grid variability such as the snow-covered fraction adds to the challenges. Here, we demonstrate some of the major sensitivities of distributed energy balance snow models to particular model constructs, highlight the need for advanced and spatially flexible methods and parameterizations, and invite the community to open dialogue and future collaboration to further modeling capabilities.
Bornstein, Daniel B; Pate, Russell R; Beets, Michael W; Saunders, Ruth P; Blair, Steven N
2015-06-01
Coalitions are often composed of member organizations. Member involvement is thought to be associated with coalition success. No instrument currently exists for evaluating organizational member involvement in physical activity coalitions. This study aimed to develop a survey instrument for evaluating organizational member involvement in physical activity coalitions. The study was carried out in three phases: (a) developing a draft survey, (b) assessing the content validity of the draft survey, and (c) assessing the underlying factor structure, reliability, and validity of the survey. A cross-sectional design was employed. In Phase 1, a team of experts in survey development produced a draft survey. In Phase 2, the content validity of the draft survey was evaluated by a panel of individuals with expertise in physical activity coalitions. In Phase 3, the survey was administered to 120 individuals on local-, state-, and national-level physical activity coalitions. Responses were subjected to an exploratory factor analysis in order to determine the survey's underlying factor structure, reliability, and validity. Phases 1 and 2 yielded a survey instrument with demonstrated content validity. Phase 3 yielded a three-factor model with three subscales: Strategic Alignment, Organizational Alignment, and Providing Input. Each subscale demonstrated high internal consistency reliability and construct validity. The survey instrument developed here demonstrated sound psychometric properties and provides new insight into organizational member involvement in physical activity coalitions. This instrument may be an important tool in developing a more complete picture of coalition functioning in physical activity coalitions specifically and health-based coalitions overall. © 2014 Society for Public Health Education.
Simulation of Atmospheric-Entry Capsules in the Subsonic Regime
NASA Technical Reports Server (NTRS)
Murman, Scott M.; Childs, Robert E.; Garcia, Joseph A.
2015-01-01
The accuracy of Computational Fluid Dynamics predictions of subsonic capsule aerodynamics is examined by comparison against recent NASA wind-tunnel data at high-Reynolds-number flight conditions. Several aspects of numerical and physical modeling are considered, including inviscid numerical scheme, mesh adaptation, rough-wall modeling, rotation and curvature corrections for eddy-viscosity models, and Detached-Eddy Simulations of the unsteady wake. All of these are considered in isolation against relevant data where possible. The results indicate that an improved predictive capability is developed by considering physics-based approaches and validating the results against flight-relevant experimental data.
Assessing first-order emulator inference for physical parameters in nonlinear mechanistic models
Hooten, Mevin B.; Leeds, William B.; Fiechter, Jerome; Wikle, Christopher K.
2011-01-01
We present an approach for estimating physical parameters in nonlinear models that relies on an approximation to the mechanistic model itself for computational efficiency. The proposed methodology is validated and applied in two different modeling scenarios: (a) Simulation and (b) lower trophic level ocean ecosystem model. The approach we develop relies on the ability to predict right singular vectors (resulting from a decomposition of computer model experimental output) based on the computer model input and an experimental set of parameters. Critically, we model the right singular vectors in terms of the model parameters via a nonlinear statistical model. Specifically, we focus our attention on first-order models of these right singular vectors rather than the second-order (covariance) structure.
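One possible reading of the emulator construction described above is sketched below with synthetic data: decompose an ensemble of mechanistic-model outputs with an SVD, regress the weights of the leading modes on the input parameter, and reconstruct (emulate) the output for a new parameter value. The basis truncation and the linear (first-order) statistical model are assumptions made for illustration.

```python
# Simplified SVD-based emulator sketch with a toy mechanistic model.
import numpy as np

rng = np.random.default_rng(1)
params = rng.uniform(0.5, 2.0, size=(30, 1))           # 30 runs, 1 parameter
t = np.linspace(0, 10, 100)
outputs = np.array([p * np.sin(t) + 0.1 * p**2 * t for p in params[:, 0]])

# SVD of the run-by-output matrix; keep the leading k modes
U, S, Vt = np.linalg.svd(outputs, full_matrices=False)
k = 2
weights = U[:, :k] * S[:k]           # per-run weights on the leading modes

# First-order (linear) statistical model: weights as a function of parameters
X = np.hstack([np.ones_like(params), params])
coef, *_ = np.linalg.lstsq(X, weights, rcond=None)

# Emulate the output for a new parameter value without rerunning the model
p_new = np.array([[1.0, 1.3]])       # [intercept term, parameter value]
emulated = (p_new @ coef) @ Vt[:k]
print("emulated output shape:", emulated.shape)
```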
Gupta, Nidhi; Heiden, Marina; Mathiassen, Svend Erik; Holtermann, Andreas
2016-05-01
We aimed at developing and evaluating statistical models predicting objectively measured occupational time spent sedentary or in physical activity from self-reported information available in large epidemiological studies and surveys. Two-hundred-and-fourteen blue-collar workers responded to a questionnaire containing information about personal and work related variables, available in most large epidemiological studies and surveys. Workers also wore accelerometers for 1-4 days measuring time spent sedentary and in physical activity, defined as non-sedentary time. Least-squares linear regression models were developed, predicting objectively measured exposures from selected predictors in the questionnaire. A full prediction model based on age, gender, body mass index, job group, self-reported occupational physical activity (OPA), and self-reported occupational sedentary time (OST) explained 63% (adjusted R²) of the variance of both objectively measured time spent sedentary and in physical activity since these two exposures were complementary. Single-predictor models based only on self-reported information about either OPA or OST explained 21% and 38%, respectively, of the variance of the objectively measured exposures. Internal validation using bootstrapping suggested that the full and single-predictor models would show almost the same performance in new datasets as in that used for modelling. Both full and single-predictor models based on self-reported information typically available in most large epidemiological studies and surveys were able to predict objectively measured occupational time spent sedentary or in physical activity, with explained variances ranging from 21% to 63%.
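A minimal, synthetic-data illustration of the kind of least-squares prediction model described above is given below: regress an objectively measured exposure on self-report covariates and report the adjusted R². Predictor names and coefficients are placeholders, not the study's data.

```python
# Least-squares prediction of an objectively measured exposure (synthetic data).
import numpy as np

rng = np.random.default_rng(42)
n = 214
age = rng.uniform(20, 60, n)
bmi = rng.uniform(20, 35, n)
self_reported_ost = rng.uniform(0, 8, n)          # self-reported sedentary hours
measured_sedentary = 0.6 * self_reported_ost + 0.02 * age + rng.normal(0, 1, n)

X = np.column_stack([np.ones(n), age, bmi, self_reported_ost])
beta, *_ = np.linalg.lstsq(X, measured_sedentary, rcond=None)

resid = measured_sedentary - X @ beta
ss_res = np.sum(resid ** 2)
ss_tot = np.sum((measured_sedentary - measured_sedentary.mean()) ** 2)
r2 = 1 - ss_res / ss_tot
p = X.shape[1] - 1                                 # number of predictors
r2_adj = 1 - (1 - r2) * (n - 1) / (n - p - 1)
print(f"R2 = {r2:.2f}, adjusted R2 = {r2_adj:.2f}")
```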
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sun, Xingshu; Alam, Muhammad Ashraful; Raguse, John
2015-10-15
In this paper, we develop a physics-based compact model for copper indium gallium diselenide (CIGS) and cadmium telluride (CdTe) heterojunction solar cells that attributes the failure of superposition to voltage-dependent carrier collection in the absorber layer, and interprets light-enhanced reverse breakdown as a consequence of tunneling-assisted Poole-Frenkel conduction. The temperature dependence of the model is validated against both simulation and experimental data for the entire range of bias conditions. The model can be used to characterize device parameters, optimize new designs, and most importantly, predict performance and reliability of solar panels including the effects of self-heating and reverse breakdown due to partial-shading degradation.
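For orientation, the classical Poole-Frenkel field-enhanced emission law underlying the tunneling-assisted conduction mechanism mentioned above has the standard form shown below. This is the textbook expression, not necessarily the paper's exact formulation; here φ_B is the trap barrier height, E the electric field, ε the dynamic permittivity, and k_B T the thermal energy.

```latex
J \;\propto\; E \exp\!\left[\frac{-q\left(\phi_B - \sqrt{qE/(\pi\varepsilon)}\right)}{k_B T}\right]
```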
The development and validation of Science Learning Inventory (SLI): A conceptual change framework
NASA Astrophysics Data System (ADS)
Seyedmonir, Mehdi
2000-12-01
A multidimensional theoretical model, Conceptual Change Science Learning (CCSL), was developed based on Standard Model of Conceptual Change and Cognitive Reconstruction of Knowledge Model. The model addresses three main components of science learning, namely the learner's conceptual ecology, the message along with its social context, and the cognitive engagement. A learner's conceptual ecology is organized around three clusters, including epistemological beliefs, existing conceptions, and motivation. Learner's cognitive engagement is represented by a continuum from peripheral processing involving shallow cognitive engagement to central processing involving deep cognitive engagement. Through reciprocal, non-sequential interactions of such constructs, the learners' conceptual change is achieved. Using a quantitative empirical approach, three studies were conducted to investigate the theoretical constructs based on the CCSL Model. The first study reports the development and validation of the hypothesized and factor-analytic scales comprising the instrument, Science Learning Inventory (SLI) intended for college students. The self-report instrument was designed in two parts, SLI-A (conceptual ecology and cognitive engagement) with 48 initial items, and SLI-B (science epistemology) with 49 initial items. The items for SLI-B were based on the tenets of Nature of Science as reflected in the recent reform documents, Science for All Americans (Project 2061) and National Science Education Standards. The results of factor analysis indicated seven factors for SLI-A and four factors for SLI-B. The second study investigated the criterion-related (conceptual change) predictive validity of the SLI in an instructional setting (a college-level physics course). The findings suggested the possibility of different interplay of factors and dynamics depending on the nature of the criterion (gain scores from a three-week intervention versus final course grade). Gain scores were predicted by "self-reflective study behavior" and "science self-efficacy" scales of SLI, whereas the course grade was predicted by "metacognitive engagement" and "dynamic scientific truth," (a factor from science epistemology). The third study investigated the effects of text-based conceptual-change strategy (Enhanced Refutational Text; ERT) on Newtonian Laws of Motion, and the efficacy of the SLI scales in a controlled setting. Also, initial divergent and convergent validity procedures are reported in the study. The results provided partial support for the superiority of ERT over expository text. The ERT was an effective intervention for students with no prior physics background but not for students with prior physics background.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kennedy, Joseph H.; Bennett, Andrew R.; Evans, Katherine J.
To address the pressing need to better understand the behavior and complex interaction of ice sheets within the global Earth system, significant development of continental-scale, dynamical ice sheet models is underway. Concurrent to the development of the Community Ice Sheet Model (CISM), the corresponding verification and validation (V&V) process is being coordinated through a new, robust, Python-based extensible software package, the Land Ice Verification and Validation toolkit (LIVVkit). Incorporated into the typical ice sheet model development cycle, it provides robust and automated numerical verification, software verification, performance validation, and physical validation analyses on a variety of platforms, from personal laptops to the largest supercomputers. LIVVkit operates on sets of regression test and reference data sets, and provides comparisons for a suite of community prioritized tests, including configuration and parameter variations, bit-for-bit evaluation, and plots of model variables to indicate where differences occur. LIVVkit also provides an easily extensible framework to incorporate and analyze results of new intercomparison projects, new observation data, and new computing platforms. LIVVkit is designed for quick adaptation to additional ice sheet models via abstraction of model specific code, functions, and configurations into an ice sheet model description bundle outside the main LIVVkit structure. Furthermore, through shareable and accessible analysis output, LIVVkit is intended to help developers build confidence in their models and enhance the credibility of ice sheet models overall.
Kennedy, Joseph H.; Bennett, Andrew R.; Evans, Katherine J.; ...
2017-03-23
To address the pressing need to better understand the behavior and complex interaction of ice sheets within the global Earth system, significant development of continental-scale, dynamical ice sheet models is underway. Concurrent to the development of the Community Ice Sheet Model (CISM), the corresponding verification and validation (V&V) process is being coordinated through a new, robust, Python-based extensible software package, the Land Ice Verification and Validation toolkit (LIVVkit). Incorporated into the typical ice sheet model development cycle, it provides robust and automated numerical verification, software verification, performance validation, and physical validation analyses on a variety of platforms, from personal laptops to the largest supercomputers. LIVVkit operates on sets of regression test and reference data sets, and provides comparisons for a suite of community prioritized tests, including configuration and parameter variations, bit-for-bit evaluation, and plots of model variables to indicate where differences occur. LIVVkit also provides an easily extensible framework to incorporate and analyze results of new intercomparison projects, new observation data, and new computing platforms. LIVVkit is designed for quick adaptation to additional ice sheet models via abstraction of model specific code, functions, and configurations into an ice sheet model description bundle outside the main LIVVkit structure. Furthermore, through shareable and accessible analysis output, LIVVkit is intended to help developers build confidence in their models and enhance the credibility of ice sheet models overall.
Assessing Participation in Community-Based Physical Activity Programs in Brazil
REIS, RODRIGO S.; YAN, YAN; PARRA, DIANA C.; BROWNSON, ROSS C.
2015-01-01
Purpose This study aimed to develop and validate a risk prediction model to examine the characteristics that are associated with participation in community-based physical activity programs in Brazil. Methods We used pooled data from three surveys conducted from 2007 to 2009 in state capitals of Brazil with 6166 adults. A risk prediction model was built considering program participation as an outcome. The predictive accuracy of the model was quantified through discrimination (C statistic) and calibration (Brier score) properties. Bootstrapping methods were used to validate the predictive accuracy of the final model. Results The final model showed sex (women: odds ratio [OR] = 3.18, 95% confidence interval [CI] = 2.14–4.71), having less than high school degree (OR = 1.71, 95% CI = 1.16–2.53), reporting a good health (OR = 1.58, 95% CI = 1.02–2.24) or very good/excellent health (OR = 1.62, 95% CI = 1.05–2.51), having any comorbidity (OR = 1.74, 95% CI = 1.26–2.39), and perceiving the environment as safe to walk at night (OR = 1.59, 95% CI = 1.18–2.15) as predictors of participation in physical activity programs. Accuracy indices were adequate (C index = 0.778, Brier score = 0.031) and similar to those obtained from bootstrapping (C index = 0.792, Brier score = 0.030). Conclusions Sociodemographic and health characteristics as well as perceptions of the environment are strong predictors of participation in community-based programs in selected cities of Brazil. PMID:23846162
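The two accuracy properties reported above, discrimination (C statistic) and calibration (Brier score), can be computed as sketched below for made-up predicted probabilities and outcomes; this is a generic illustration, not the study's data.

```python
# Discrimination (C statistic / AUC) and calibration (Brier score) for a
# binary risk prediction model, computed on illustrative values.
from sklearn.metrics import roc_auc_score, brier_score_loss

y_true = [0, 0, 1, 0, 1, 1, 0, 1, 0, 0]          # observed participation (0/1)
y_prob = [0.05, 0.10, 0.60, 0.20, 0.70, 0.40, 0.15, 0.80, 0.30, 0.05]

print("C statistic (AUC):", roc_auc_score(y_true, y_prob))
print("Brier score:      ", brier_score_loss(y_true, y_prob))
```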
Ardestani, M S; Niknami, S; Hidarnia, A; Hajizadeh, E
2016-08-18
This research examined the validity and reliability of a researcher-developed questionnaire based on Social Cognitive Theory (SCT) to assess the physical activity behaviour of Iranian adolescent girls (SCT-PAIAGS). Psychometric properties of the SCT-PAIAGS were assessed by determining its face validity, content and construct validity as well as its reliability. In order to evaluate factor structure, cross-sectional research was conducted on 400 high-school girls in Tehran. The content validity index, content validity ratio and impact score for the SCT-PAIAGS ranged from 0.97 to 1, from 0.91 to 1, and from 4.6 to 4.9, respectively. Confirmatory factor analysis confirmed a six-factor structure comprising self-efficacy, self-regulation, family support, friend support, outcome expectancy and self-efficacy for overcoming impediments. Factor loadings, t-values and fit indices showed that the SCT model fit the data. Cronbach's α-coefficient ranged from 0.78 to 0.85 and intraclass correlation coefficient from 0.73 to 0.90.
Physical mechanism and numerical simulation of the inception of the lightning upward leader
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li Qingmin; Lu Xinchang; Shi Wei
2012-12-15
The upward leader is a key physical process of the leader progression model of lightning shielding. The inception mechanism and criterion of the upward leader need further understanding and clarification. Based on leader discharge theory, this paper proposes the critical electric field intensity of the stable upward leader (CEFISUL) and characterizes it by the valve electric field intensity on the conductor surface, E_L, which is the basis of a new inception criterion for the upward leader. Through numerical simulation under various physical conditions, we verified that E_L is mainly related to the conductor radius, and data fitting yields the mathematical expression of E_L. We further establish a computational model for lightning shielding performance of the transmission lines based on the proposed CEFISUL criterion, which reproduces the shielding failure rate of typical UHV transmission lines. The model-based calculation results agree well with the statistical data from on-site operations, which show the effectiveness and validity of the CEFISUL criterion.
Agent-based modeling of noncommunicable diseases: a systematic review.
Nianogo, Roch A; Arah, Onyebuchi A
2015-03-01
We reviewed the use of agent-based modeling (ABM), a systems science method, in understanding noncommunicable diseases (NCDs) and their public health risk factors. We systematically reviewed studies in PubMed, ScienceDirect, and Web of Sciences published from January 2003 to July 2014. We retrieved 22 relevant articles; each had an observational or interventional design. Physical activity and diet were the most-studied outcomes. Often, single agent types were modeled, and the environment was usually irrelevant to the studied outcome. Predictive validation and sensitivity analyses were most used to validate models. Although increasingly used to study NCDs, ABM remains underutilized and, where used, is suboptimally reported in public health studies. Its use in studying NCDs will benefit from clarified best practices and improved rigor to establish its usefulness and facilitate replication, interpretation, and application.
Agent-Based Modeling of Noncommunicable Diseases: A Systematic Review
Arah, Onyebuchi A.
2015-01-01
We reviewed the use of agent-based modeling (ABM), a systems science method, in understanding noncommunicable diseases (NCDs) and their public health risk factors. We systematically reviewed studies in PubMed, ScienceDirect, and Web of Sciences published from January 2003 to July 2014. We retrieved 22 relevant articles; each had an observational or interventional design. Physical activity and diet were the most-studied outcomes. Often, single agent types were modeled, and the environment was usually irrelevant to the studied outcome. Predictive validation and sensitivity analyses were most used to validate models. Although increasingly used to study NCDs, ABM remains underutilized and, where used, is suboptimally reported in public health studies. Its use in studying NCDs will benefit from clarified best practices and improved rigor to establish its usefulness and facilitate replication, interpretation, and application. PMID:25602871
Chung, Eva Yin-Han; Lam, Gigi
2018-05-29
The World Health Organization has asserted the importance of enhancing participation of people with disabilities within the International Classification of Functioning, Disability and Health framework. Participation is regarded as a vital outcome in community-based rehabilitation. The actualization of the right to participate is limited by social stigma and discrimination. To date, there is no validated instrument for use in Chinese communities to measure participation restriction or self-perceived stigma. This study aimed to translate and validate the Participation Scale and the Explanatory Model Interview Catalogue (EMIC) Stigma Scale for use in Chinese communities with people with physical disabilities. The Chinese versions of the Participation Scale and the EMIC stigma scale were administered to 264 adults with physical disabilities. The two scales were examined separately. The reliability analysis was studied in conjunction with the construct validity. Reliability analysis was conducted to assess the internal consistency and item-total correlation. Exploratory factor analysis was conducted to investigate the latent patterns of relationships among variables. A Rasch model analysis was conducted to test the dimensionality, internal validity, item hierarchy, and scoring category structure of the two scales. Both the Participation Scale and the EMIC stigma scale were confirmed to have good internal consistency and high item-total correlation. Exploratory factor analysis revealed the factor structure of the two scales, which demonstrated the fitting of a pattern of variables within the studied construct. The Participation Scale was found to be multidimensional, whereas the EMIC stigma scale was confirmed to be unidimensional. The item hierarchies of the Participation Scale and the EMIC stigma scale were discussed and were regarded as compatible with the cultural characteristics of Chinese communities. The Chinese versions of the Participation Scale and the EMIC stigma scale were thoroughly tested in this study to demonstrate their robustness and feasibility in measuring the participation restriction and perceived stigma of people with physical disabilities in Chinese communities. This is crucial as it provides valid measurements to enable comprehensive understanding and assessment of the participation and stigma among people with physical disabilities in Chinese communities.
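For readers unfamiliar with the Rasch analysis mentioned above, the dichotomous Rasch model relates the probability that person n endorses item i to the person's latent trait θ_n and the item difficulty b_i; the standard form is shown below (a textbook expression, not quoted from the article).

```latex
P(X_{ni} = 1 \mid \theta_n, b_i) \;=\; \frac{e^{\,\theta_n - b_i}}{1 + e^{\,\theta_n - b_i}}
```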
Wind-US Code Physical Modeling Improvements to Complement Hypersonic Testing and Evaluation
NASA Technical Reports Server (NTRS)
Georgiadis, Nicholas J.; Yoder, Dennis A.; Towne, Charles S.; Engblom, William A.; Bhagwandin, Vishal A.; Power, Greg D.; Lankford, Dennis W.; Nelson, Christopher C.
2009-01-01
This report gives an overview of physical modeling enhancements to the Wind-US flow solver which were made to improve the capabilities for simulation of hypersonic flows and the reliability of computations to complement hypersonic testing. The improvements include advanced turbulence models, a bypass transition model, a conjugate (or closely coupled to vehicle structure) conduction-convection heat transfer capability, and an upgraded high-speed combustion solver. A Mach 5 shock-wave boundary layer interaction problem is used to investigate the benefits of k-ε and k-ω based explicit algebraic stress turbulence models relative to linear two-equation models. The bypass transition model is validated using data from experiments for incompressible boundary layers and a Mach 7.9 cone flow. The conjugate heat transfer method is validated for a test case involving reacting H2-O2 rocket exhaust over cooled calorimeter panels. A dual-mode scramjet configuration is investigated using both a simplified 1-step kinetics mechanism and an 8-step mechanism. Additionally, variations in the turbulent Prandtl and Schmidt numbers are considered for this scramjet configuration.
Physically-based in silico light sheet microscopy for visualizing fluorescent brain models
2015-01-01
Background We present a physically-based computational model of the light sheet fluorescence microscope (LSFM). Based on Monte Carlo ray tracing and geometric optics, our method simulates the operational aspects and image formation process of the LSFM. This simulated, in silico LSFM creates synthetic images of digital fluorescent specimens that can resemble those generated by a real LSFM, as opposed to established visualization methods producing visually-plausible images. We also propose an accurate fluorescence rendering model which takes into account the intrinsic characteristics of fluorescent dyes to simulate the light interaction with fluorescent biological specimen. Results We demonstrate first results of our visualization pipeline to a simplified brain tissue model reconstructed from the somatosensory cortex of a young rat. The modeling aspects of the LSFM units are qualitatively analysed, and the results of the fluorescence model were quantitatively validated against the fluorescence brightness equation and characteristic emission spectra of different fluorescent dyes. AMS subject classification Modelling and simulation PMID:26329404
Subramanyam, Rajeev; Yeramaneni, Samrat; Hossain, Mohamed Monir; Anneken, Amy M; Varughese, Anna M
2016-05-01
Perioperative respiratory adverse events (PRAEs) are the most common cause of serious adverse events in children receiving anesthesia. The primary aim of this study was to develop and validate a risk prediction tool for the occurrence of PRAE from the onset of anesthesia induction until discharge from the postanesthesia care unit in children younger than 18 years undergoing elective ambulatory anesthesia for surgery and radiology. The incidence of PRAE was studied. We analyzed data from 19,059 patients from our department's quality improvement database. The predictor variables were age, sex, ASA physical status, morbid obesity, preexisting pulmonary disorder, preexisting neurologic disorder, and location of ambulatory anesthesia (surgery or radiology). Composite PRAE was defined as the presence of any 1 of the following events: intraoperative bronchospasm, intraoperative laryngospasm, postoperative apnea, postoperative laryngospasm, postoperative bronchospasm, or postoperative prolonged oxygen requirement. Development and validation of the risk prediction tool for PRAE were performed using a split sampling technique to split the database into 2 independent cohorts based on the year when the patient received ambulatory anesthesia for surgery and radiology, using logistic regression. A risk score was developed based on the regression coefficients from the validation tool. The performance of the risk prediction tool was assessed by using tests of discrimination and calibration. The overall incidence of composite PRAE was 2.8%. The derivation cohort included 8904 patients, and the validation cohort included 10,155 patients. The risk of PRAE was 3.9% in the development cohort and 1.8% in the validation cohort. Age ≤ 3 years (versus >3 years), ASA physical status II or III (versus ASA physical status I), morbid obesity, preexisting pulmonary disorder, and surgery (versus radiology) significantly predicted the occurrence of PRAE in a multivariable logistic regression model. A risk score in the range of 0 to 3 was assigned to each significant variable in the logistic regression model, and the final score for all risk factors ranged from 0 to 11. A cutoff score of 4 was derived from a receiver operating characteristic curve to determine the high-risk category. The model C-statistic and the corresponding SE for the derivation and validation cohorts were 0.64 ± 0.01 and 0.63 ± 0.02, respectively. Sensitivity and SE of the risk prediction tool to identify children at risk for PRAE were 77.6 ± 0.02 in the derivation cohort and 76.2 ± 0.03 in the validation cohort. The risk tool developed and validated from our study cohort identified 5 risk factors: age ≤ 3 years (versus >3 years), ASA physical status II and III (versus ASA physical status I), morbid obesity, preexisting pulmonary disorder, and surgery (versus radiology) for PRAE. This tool can be used to provide an individual risk score for each patient to predict the risk of PRAE in the preoperative period.
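A simplified, hypothetical sketch of how such a tool can be applied is given below: each risk factor contributes integer points and the total is compared against a cutoff. The per-factor points are invented to match the reported ranges (scores 0 to 3 per factor, total 0 to 11, cutoff 4) and are not the published scoring table.

```python
# Hypothetical additive risk score for PRAE with a cutoff-based flag.
def prae_risk_score(age_le_3: bool, asa_2_or_3: bool, morbid_obesity: bool,
                    pulmonary_disorder: bool, surgery_not_radiology: bool) -> int:
    # Illustrative per-factor points; a real tool derives these from the
    # logistic-regression coefficients.
    score = 0
    score += 3 if age_le_3 else 0
    score += 2 if asa_2_or_3 else 0
    score += 2 if morbid_obesity else 0
    score += 2 if pulmonary_disorder else 0
    score += 2 if surgery_not_radiology else 0
    return score                      # maximum possible total: 11

def high_risk(score: int, cutoff: int = 4) -> bool:
    return score >= cutoff

example = prae_risk_score(True, True, False, False, True)
print(f"score = {example}, high risk = {high_risk(example)}")
```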
Validating a Model for Welding Induced Residual Stress Using High-Energy X-ray Diffraction
Mach, J. C.; Budrow, C. J.; Pagan, D. C.; ...
2017-03-15
Integrated computational materials engineering (ICME) provides a pathway to advance performance in structures through the use of physically-based models to better understand how manufacturing processes influence product performance. As one particular challenge, consider that residual stresses induced in fabrication are pervasive and directly impact the life of structures. For ICME to be an effective strategy, it is essential that predictive capability be developed in conjunction with critical experiments. In the present paper, simulation results from a multi-physics model for gas metal arc welding are evaluated through x-ray diffraction using synchrotron radiation. A test component was designed with intent to develop significant gradients in residual stress, be representative of real-world engineering application, yet remain tractable for finely spaced strain measurements with positioning equipment available at synchrotron facilities. Finally, the experimental validation lends confidence to model predictions, facilitating the explicit consideration of residual stress distribution in prediction of fatigue life.
Lozano, Oscar M; Rojas, Antonio J; Pérez, Cristino; González-Sáiz, Francisco; Ballesta, Rosario; Izaskun, Bilbao
2008-05-01
The aim of this work is to show evidence of the validity of the Health-Related Quality of Life for Drug Abusers Test (HRQoLDA Test). This test was developed to measure specific HRQoL for drugs abusers, within the theoretical addiction framework of the biaxial model. The sample comprised 138 patients diagnosed with opiate drug dependence. In this study, the following constructs and variables of the biaxial model were measured: severity of dependence, physical health status, psychological adjustment and substance consumption. Results indicate that the HRQoLDA Test scores are related to dependency and consumption-related problems. Multiple regression analysis reveals that HRQoL can be predicted from drug dependence, physical health status and psychological adjustment. These results contribute empirical evidence of the theoretical relationships established between HRQoL and the biaxial model, and they support the interpretation of the HRQoLDA Test to measure HRQoL in drug abusers, thus providing a test to measure this specific construct in this population.
Validating a Model for Welding Induced Residual Stress Using High-Energy X-ray Diffraction
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mach, J. C.; Budrow, C. J.; Pagan, D. C.
Integrated computational materials engineering (ICME) provides a pathway to advance performance in structures through the use of physically-based models to better understand how manufacturing processes influence product performance. As one particular challenge, consider that residual stresses induced in fabrication are pervasive and directly impact the life of structures. For ICME to be an effective strategy, it is essential that predictive capability be developed in conjunction with critical experiments. In the present paper, simulation results from a multi-physics model for gas metal arc welding are evaluated through x-ray diffraction using synchrotron radiation. A test component was designed with intent to develop significant gradients in residual stress, be representative of real-world engineering application, yet remain tractable for finely spaced strain measurements with positioning equipment available at synchrotron facilities. Finally, the experimental validation lends confidence to model predictions, facilitating the explicit consideration of residual stress distribution in prediction of fatigue life.
Sensitivity analysis and calibration of a dynamic physically based slope stability model
NASA Astrophysics Data System (ADS)
Zieher, Thomas; Rutzinger, Martin; Schneider-Muntau, Barbara; Perzl, Frank; Leidinger, David; Formayer, Herbert; Geitner, Clemens
2017-06-01
Physically based modelling of slope stability on a catchment scale is still a challenging task. When applying a physically based model on such a scale (1 : 10 000 to 1 : 50 000), parameters with a high impact on the model result should be calibrated to account for (i) the spatial variability of parameter values, (ii) shortcomings of the selected model, (iii) uncertainties of laboratory tests and field measurements or (iv) parameters that cannot be derived experimentally or measured in the field (e.g. calibration constants). While systematic parameter calibration is a common task in hydrological modelling, this is rarely done using physically based slope stability models. In the present study a dynamic, physically based, coupled hydrological-geomechanical slope stability model is calibrated based on a limited number of laboratory tests and a detailed multitemporal shallow landslide inventory covering two landslide-triggering rainfall events in the Laternser valley, Vorarlberg (Austria). Sensitive parameters are identified based on a local one-at-a-time sensitivity analysis. These parameters (hydraulic conductivity, specific storage, angle of internal friction for effective stress, cohesion for effective stress) are systematically sampled and calibrated for a landslide-triggering rainfall event in August 2005. The identified model ensemble, including 25 behavioural model runs with the highest portion of correctly predicted landslides and non-landslides, is then validated with another landslide-triggering rainfall event in May 1999. The identified model ensemble correctly predicts the location and the supposed triggering timing of 73.0 % of the observed landslides triggered in August 2005 and 91.5 % of the observed landslides triggered in May 1999. Results of the model ensemble driven with raised precipitation input reveal a slight increase in areas potentially affected by slope failure. At the same time, the peak run-off increases more markedly, suggesting that precipitation intensities during the investigated landslide-triggering rainfall events were already close to or above the soil's infiltration capacity.
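A minimal sketch of a local one-at-a-time sensitivity analysis of the kind described above follows; the toy response function and parameter baseline are placeholders, not the coupled hydrological-geomechanical model.

```python
# Local one-at-a-time (OAT) sensitivity analysis: perturb each parameter
# individually about a baseline and record the relative change in a scalar
# response (here a placeholder standing in for a stability measure).
import numpy as np

def toy_response(params):
    k, ss, phi, c = params["k"], params["storage"], params["phi"], params["cohesion"]
    return 0.5 * np.tan(np.radians(phi)) + 0.02 * c - 5.0 * k - 0.1 * ss

baseline = {"k": 1e-5, "storage": 1e-4, "phi": 30.0, "cohesion": 5.0}
f0 = toy_response(baseline)

for name in baseline:
    for rel in (-0.10, +0.10):                  # perturb one parameter by +/-10%
        perturbed = dict(baseline)
        perturbed[name] *= (1.0 + rel)
        df = (toy_response(perturbed) - f0) / f0
        print(f"{name:9s} {rel:+.0%} -> relative change in response {df:+.3%}")
```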
NASA Technical Reports Server (NTRS)
Arnold, Steven M.; Goldberg, Robert K.; Lerch, Bradley A.; Saleeb, Atef F.
2009-01-01
Herein a general, multimechanism, physics-based viscoelastoplastic model is presented in the context of an integrated diagnosis and prognosis methodology which is proposed for structural health monitoring, with particular applicability to gas turbine engine structures. In this methodology, diagnostics and prognostics will be linked through state awareness variable(s). Key technologies which comprise the proposed integrated approach include (1) diagnostic/detection methodology, (2) prognosis/lifing methodology, (3) diagnostic/prognosis linkage, (4) experimental validation, and (5) material data information management system. A specific prognosis lifing methodology, experimental characterization and validation and data information management are the focal point of current activities being pursued within this integrated approach. The prognostic lifing methodology is based on an advanced multimechanism viscoelastoplastic model which accounts for both stiffness and/or strength reduction damage variables. Methods to characterize both the reversible and irreversible portions of the model are discussed. Once the multiscale model is validated the intent is to link it to appropriate diagnostic methods to provide a full-featured structural health monitoring system.
NASA Technical Reports Server (NTRS)
Arnold, Steven M.; Goldberg, Robert K.; Lerch, Bradley A.; Saleeb, Atef F.
2009-01-01
Herein a general, multimechanism, physics-based viscoelastoplastic model is presented in the context of an integrated diagnosis and prognosis methodology which is proposed for structural health monitoring, with particular applicability to gas turbine engine structures. In this methodology, diagnostics and prognostics will be linked through state awareness variable(s). Key technologies which comprise the proposed integrated approach include 1) diagnostic/detection methodology, 2) prognosis/lifing methodology, 3) diagnostic/prognosis linkage, 4) experimental validation and 5) material data information management system. A specific prognosis lifing methodology, experimental characterization and validation and data information management are the focal point of current activities being pursued within this integrated approach. The prognostic lifing methodology is based on an advanced multi-mechanism viscoelastoplastic model which accounts for both stiffness and/or strength reduction damage variables. Methods to characterize both the reversible and irreversible portions of the model are discussed. Once the multiscale model is validated the intent is to link it to appropriate diagnostic methods to provide a full-featured structural health monitoring system.
Greenwald, Jeffrey L; Cronin, Patrick R; Carballo, Victoria; Danaei, Goodarz; Choy, Garry
2017-03-01
With the increasing focus on reducing hospital readmissions in the United States, numerous readmissions risk prediction models have been proposed, mostly developed through analyses of structured data fields in electronic medical records and administrative databases. Three areas that may have an impact on readmission but are poorly captured using structured data sources are patients' physical function, cognitive status, and psychosocial environment and support. The objective of the study was to build a discriminative model using information germane to these 3 areas to identify hospitalized patients' risk for 30-day all cause readmissions. We conducted clinician focus groups to identify language used in the clinical record regarding these 3 areas. We then created a dataset including 30,000 inpatients, 10,000 from each of 3 hospitals, and searched those records for the focus group-derived language using natural language processing. A 30-day readmission prediction model was developed on 75% of the dataset and validated on the other 25% and also on hospital specific subsets. Focus group language was aggregated into 35 variables. The final model had 16 variables, a validated C-statistic of 0.74, and was well calibrated. Subset validation of the model by hospital yielded C-statistics of 0.70-0.75. Deriving a 30-day readmission risk prediction model through identification of physical, cognitive, and psychosocial issues using natural language processing yielded a model that performs similarly to the better performing models previously published with the added advantage of being based on clinically relevant factors and also automated and scalable. Because of the clinical relevance of the variables in the model, future research may be able to test if targeting interventions to identified risks results in reductions in readmissions.
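A schematic sketch of the described pipeline follows: flag focus-group-derived language in free-text notes, turn the flags into binary features, fit a logistic regression, and summarize discrimination with a C statistic. The terms, notes, and labels are invented, and a production system would use natural language processing with negation and context handling rather than plain substring matching.

```python
# Keyword-flag features from clinical notes feeding a logistic regression.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

TERMS = ["lives alone", "unable to ambulate", "confused", "no caregiver"]

notes = [
    "patient lives alone, unable to ambulate without walker",
    "alert and oriented, strong family support",
    "intermittently confused overnight, no caregiver identified",
    "independent with ADLs, ambulating in hallway",
]
readmitted = np.array([1, 0, 1, 0])

# One binary feature per focus-group-derived term
X = np.array([[int(term in note) for term in TERMS] for note in notes])

model = LogisticRegression().fit(X, readmitted)
pred = model.predict_proba(X)[:, 1]
print("in-sample C statistic:", roc_auc_score(readmitted, pred))
```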
NASA Technical Reports Server (NTRS)
Peddle, Derek R.; Huemmrich, K. Fred; Hall, Forrest G.; Masek, Jeffrey G.; Soenen, Scott A.; Jackson, Chris D.
2011-01-01
Canopy reflectance model inversion using look-up table approaches provides powerful and flexible options for deriving improved forest biophysical structural information (BSI) compared with traditional statistical empirical methods. The BIOPHYS algorithm is an improved, physically-based inversion approach for deriving BSI for independent use and validation, for monitoring, inventorying and quantifying forest disturbance, and as input to ecosystem, climate and carbon models. Based on the multiple-forward mode (MFM) inversion approach, BIOPHYS results were summarized from different studies (Minnesota/NASA COVER; Virginia/LEDAPS; Saskatchewan/BOREAS), sensors (airborne MMR; Landsat; MODIS) and models (GeoSail; GOMS). Application outputs included forest density, height, crown dimension, branch and green leaf area, canopy cover, disturbance estimates based on multi-temporal chronosequences, and structural change following recovery from forest fires over the last century. Good correspondence with validation field data was obtained. Integrated analyses of multiple solar and view angle imagery further improved retrievals compared with single-pass data. Quantifying ecosystem dynamics such as the area and percent of forest disturbance, early regrowth and succession provides essential inputs to process-driven models of carbon flux. BIOPHYS is well suited for large-area, multi-temporal applications involving multiple image sets and mosaics for assessing vegetation disturbance and quantifying biophysical structural dynamics and change. It is also suitable for integration with forest inventory, monitoring, updating, and other programs.
Structured Uncertainty Bound Determination From Data for Control and Performance Validation
NASA Technical Reports Server (NTRS)
Lim, Kyong B.
2003-01-01
This report attempts to document the broad scope of issues that must be satisfactorily resolved before one can expect to methodically obtain, with reasonable confidence, near-optimal robust closed-loop performance in physical applications. These include elements of signal processing, noise identification, system identification, model validation, and uncertainty modeling. Based on a recently developed methodology involving a parameterization of all model validating uncertainty sets for a given linear fractional transformation (LFT) structure and noise allowance, a new software package, the Uncertainty Bound Identification (UBID) toolbox, which conveniently executes model validation tests and determines uncertainty bounds from data, has been designed and is currently available. This toolbox also serves to benchmark the current state of the art in uncertainty bound determination and in turn facilitates benchmarking of robust control technology. To help clarify the methodology and the use of the new software, two tutorial examples are provided. The first involves the uncertainty characterization of flexible structure dynamics, and the second example involves a closed-loop performance validation of a ducted fan based on an uncertainty bound from data. These examples, along with other simulation and experimental results, also help describe the many factors and assumptions that determine the degree of success in applying robust control theory to practical problems.
NASA Astrophysics Data System (ADS)
Lute, A. C.; Luce, Charles H.
2017-11-01
The related challenges of predictions in ungauged basins and predictions in ungauged climates point to the need to develop environmental models that are transferable across both space and time. Hydrologic modeling has historically focused on modeling one or only a few basins using highly parameterized conceptual or physically based models. However, model parameters and structures have been shown to change significantly when calibrated to new basins or time periods, suggesting that model complexity and model transferability may be antithetical. Empirical space-for-time models provide a framework within which to assess model transferability and any tradeoff with model complexity. Using 497 SNOTEL sites in the western U.S., we develop space-for-time models of April 1 SWE and Snow Residence Time based on mean winter temperature and cumulative winter precipitation. The transferability of the models to new conditions (in both space and time) is assessed using non-random cross-validation tests with consideration of the influence of model complexity on transferability. As others have noted, the algorithmic empirical models transfer best when minimal extrapolation in input variables is required. Temporal split-sample validations use pseudoreplicated samples, resulting in the selection of overly complex models, which has implications for the design of hydrologic model validation tests. Finally, we show that low to moderate complexity models transfer most successfully to new conditions in space and time, providing empirical confirmation of the parsimony principle.
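The non-random cross-validation tests mentioned above hold out contiguous blocks of space or time rather than shuffled samples. A hedged Python sketch under an assumed data layout (the column names region, year, twinter, pwinter and swe_apr1 are hypothetical, not the authors'):

# Illustrative spatial and temporal hold-out tests for a space-for-time model.
import pandas as pd
from sklearn.linear_model import LinearRegression
from sklearn.metrics import r2_score

PREDICTORS = ["twinter", "pwinter"]  # mean winter temperature, cumulative winter precipitation

def spatial_holdout(df: pd.DataFrame, test_region: str) -> float:
    """Train on SNOTEL sites outside one region, evaluate inside it."""
    train, test = df[df.region != test_region], df[df.region == test_region]
    model = LinearRegression().fit(train[PREDICTORS], train["swe_apr1"])
    return r2_score(test["swe_apr1"], model.predict(test[PREDICTORS]))

def temporal_holdout(df: pd.DataFrame, test_years) -> float:
    """Train on the remaining years, evaluate on held-out years (no shuffling)."""
    train, test = df[~df.year.isin(test_years)], df[df.year.isin(test_years)]
    model = LinearRegression().fit(train[PREDICTORS], train["swe_apr1"])
    return r2_score(test["swe_apr1"], model.predict(test[PREDICTORS]))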
ERIC Educational Resources Information Center
Mayol, Mindy Hartman; Scott, Brianna M.; Schreiber, James B.
2017-01-01
Background: In some professions, "wellness" has become shorthand for physical fitness and nutrition but dimensions outside the physical are equally important. As wellness models continue to materialize, a validated instrument is needed to substantiate the characteristics of a multidimensional wellness model. Purpose: This 2-pronged study…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dartevelle, Sebastian
2007-10-01
Large-scale volcanic eruptions are hazardous events that cannot be described by detailed and accurate in situ measurement: hence, little to no real-time data exists to rigorously validate current computer models of these events. In addition, such phenomenology involves highly complex, nonlinear, and unsteady physical behaviors upon many spatial and time scales. As a result, volcanic explosive phenomenology is poorly understood in terms of its physics, and inadequately constrained in terms of initial, boundary, and inflow conditions. Nevertheless, code verification and validation become even more critical because more and more volcanologists use numerical data for assessment and mitigation of volcanic hazards. In this report, we evaluate the process of model and code development in the context of geophysical multiphase flows. We describe: (1) the conception of a theoretical, multiphase, Navier-Stokes model, (2) its implementation into a numerical code, (3) the verification of the code, and (4) the validation of such a model within the context of turbulent and underexpanded jet physics. Within the validation framework, we suggest focusing on the key physics that control the volcanic clouds, namely the momentum-driven supersonic jet and the buoyancy-driven turbulent plume. For instance, we propose to compare numerical results against a set of simple and well-constrained analog experiments, which uniquely and unambiguously represent each of the key phenomena.
ERIC Educational Resources Information Center
Gao, Zan; Lee, Amelia M.; Solmon, Melinda A.; Kosma, Maria; Carson, Russell L.; Zhang, Tao; Domangue, Elizabeth; Moore, Delilah
2010-01-01
The purpose of this study was to validate physical activity time in middle school physical education as measured by pedometers in relation to a criterion measure, namely, students' accelerometer determined moderate to vigorous physical activity (MVPA). Participants were 155 sixth to eighth graders participating in regularly scheduled physical…
Simulation of the UT inspection of planar defects using a generic GTD-Kirchhoff approach
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dorval, Vincent, E-mail: vincent.dorval@cea.fr; Darmon, Michel; Chatillon, Sylvain
2015-03-31
The modeling of ultrasonic Non-Destructive Evaluation often plays an important part in the assessment of detection capabilities or as an aid to interpreting experiments. The ultrasonic modeling tool of the CIVA platform uses semi-analytical approximations for fast computations. Kirchhoff and GTD are two classical approximations for the modeling of echoes from plane-like defects such as cracks, and they aim at taking into account two different types of physical phenomena. The Kirchhoff approximation is mainly suitable for predicting specular reflections from the flaw surface, whereas GTD is dedicated to the modeling of edge diffraction. As a consequence, these two approximations have distinct and complementary validity domains. Choosing between them requires expertise and is problematic in some inspection configurations. The Physical Theory of Diffraction (PTD) was developed based on both Kirchhoff and GTD in order to combine their advantages and overcome their limitations. The theoretical basis for PTD and its integration in the CIVA modeling approach are discussed in this communication. Several results that validate this newly developed model and illustrate its advantages are presented.
NASA Astrophysics Data System (ADS)
Makungo, Rachel; Odiyo, John O.
2017-08-01
This study was focused on testing the ability of a coupled linear and non-linear system identification model to estimate groundwater levels. System identification provides an alternative approach for estimating groundwater levels in areas that lack data required by physically-based models. It also overcomes the limitations of physically-based models due to approximations, assumptions and simplifications. Daily groundwater levels for 4 boreholes, rainfall and evaporation data covering the period 2005-2014 were used in the study. Seventy and thirty percent of the data were used to calibrate and validate the model, respectively. Correlation coefficient (R), coefficient of determination (R2), root mean square error (RMSE), percent bias (PBIAS), Nash-Sutcliffe coefficient of efficiency (NSE) and graphical fits were used to evaluate the model performance. Values for R, R2, RMSE, PBIAS and NSE ranged from 0.8 to 0.99, 0.63 to 0.99, 0.01 to 2.06 m, -7.18 to 1.16, and 0.68 to 0.99, respectively. Comparisons of observed and simulated groundwater levels for calibration and validation runs showed close agreement. The model performance mostly varied among satisfactory, good, very good and excellent. Thus, the model is able to estimate groundwater levels. The calibrated models can reasonably capture the relationship between input and output variables and can thus be used to estimate long-term groundwater levels.
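For reference, the goodness-of-fit statistics listed above have standard definitions. A short Python sketch (illustrative only; the PBIAS sign convention varies between authors) that computes them for paired observed and simulated groundwater levels:

import numpy as np

def fit_statistics(obs, sim):
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    residual = obs - sim
    r = np.corrcoef(obs, sim)[0, 1]                     # correlation coefficient
    rmse = np.sqrt(np.mean(residual ** 2))              # root mean square error
    pbias = 100.0 * residual.sum() / obs.sum()          # percent bias (obs - sim convention)
    nse = 1.0 - np.sum(residual ** 2) / np.sum((obs - obs.mean()) ** 2)  # Nash-Sutcliffe efficiency
    return {"R": r, "R2": r ** 2, "RMSE": rmse, "PBIAS": pbias, "NSE": nse}

print(fit_statistics([10.2, 10.5, 11.0, 10.8], [10.1, 10.6, 10.9, 10.7]))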
Integral Full Core Multi-Physics PWR Benchmark with Measured Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Forget, Benoit; Smith, Kord; Kumar, Shikhar
In recent years, the importance of modeling and simulation has been highlighted extensively in the DOE research portfolio, with concrete examples in nuclear engineering in the CASL and NEAMS programs. These research efforts and similar efforts worldwide aim at the development of high-fidelity multi-physics analysis tools for the simulation of current and next-generation nuclear power reactors. As with all analysis tools, verification and validation are essential to guarantee proper functioning of the software and methods employed. The current approach relies mainly on the validation of single-physics phenomena (e.g., critical experiments, flow loops, etc.), and there is a lack of relevant multiphysics benchmark measurements that are necessary to validate the high-fidelity methods being developed today. This work introduces a new multi-cycle full-core Pressurized Water Reactor (PWR) depletion benchmark based on two operational cycles of a commercial nuclear power plant that provides a detailed description of fuel assemblies, burnable absorbers, in-core fission detectors, and core loading and re-loading patterns. This benchmark enables analysts to develop extremely detailed reactor core models that can be used for testing and validation of coupled neutron transport, thermal-hydraulics, and fuel isotopic depletion. The benchmark also provides measured reactor data for Hot Zero Power (HZP) physics tests, boron letdown curves, and three-dimensional in-core flux maps from 58 instrumented assemblies. The benchmark description is now available online and has been used by many groups. However, much work remains to be done on the quantification of uncertainties and modeling sensitivities. This work aims to address these deficiencies and make this benchmark a true non-proprietary international benchmark for the validation of high-fidelity tools. This report details the BEAVRS uncertainty quantification for the first two cycles of operation and serves as the final report of the project.
Irma 5.2 multi-sensor signature prediction model
NASA Astrophysics Data System (ADS)
Savage, James; Coker, Charles; Thai, Bea; Aboutalib, Omar; Chow, Anthony; Yamaoka, Neil; Kim, Charles
2007-04-01
The Irma synthetic signature prediction code is being developed by the Munitions Directorate of the Air Force Research Laboratory (AFRL/MN) to facilitate the research and development of multi-sensor systems. There are over 130 users within the Department of Defense, NASA, Department of Transportation, academia, and industry. Irma began as a high-resolution, physics-based Infrared (IR) target and background signature model for tactical weapon applications and has grown to include: a laser (or active) channel (1990), improved scene generator to support correlated frame-to-frame imagery (1992), and passive IR/millimeter wave (MMW) channel for a co-registered active/passive IR/MMW model (1994). Irma version 5.0 was released in 2000 and encompassed several upgrades to both the physical models and software; host support was expanded to Windows, Linux, Solaris, and SGI Irix platforms. In 2005, version 5.1 was released after an extensive verification and validation of an upgraded and reengineered active channel. Since 2005, the reengineering effort has focused on the Irma passive channel. Field measurements for the validation effort include the unpolarized data collection. Irma 5.2 is scheduled for release in the summer of 2007. This paper will report the validation test results of the Irma passive models and discuss the new features in Irma 5.2.
Development of a contrast phantom for active millimeter-wave imaging systems
NASA Astrophysics Data System (ADS)
Barber, Jeffrey; Weatherall, James C.; Brauer, Carolyn S.; Smith, Barry T.
2011-06-01
As the development of active millimeter wave imaging systems continues, it is necessary to validate materials that simulate the expected response of explosives. While physics-based models have been used to develop simulants, it is desirable to image both the explosive and simulant together in a controlled fashion in order to demonstrate success. To this end, a millimeter wave contrast phantom has been created to calibrate image grayscale while controlling the configuration of the explosive and simulant such that direct comparison of their respective returns can be performed. The physics of the phantom are described, with millimeter wave images presented to show successful development of the phantom and simulant validation at GHz frequencies.
Examining students' views about validity of experiments: From introductory to Ph.D. students
NASA Astrophysics Data System (ADS)
Hu, Dehui; Zwickl, Benjamin M.
2018-06-01
We investigated physics students' epistemological views on measurements and validity of experimental results. The roles of experiments in physics have been underemphasized in previous research on students' personal epistemology, and there is a need for a broader view of personal epistemology that incorporates experiments. An epistemological framework incorporating the structure, methodology, and validity of scientific knowledge guided the development of an open-ended survey. The survey was administered to students in algebra-based and calculus-based introductory physics courses, upper-division physics labs, and physics Ph.D. students. Within our sample, we identified several differences in students' ideas about validity and uncertainty in measurement. The majority of introductory students justified the validity of results through agreement with theory or with results from others. Alternatively, Ph.D. students frequently justified the validity of results based on the quality of the experimental process and repeatability of results. When asked about the role of uncertainty analysis, introductory students tended to focus on the representational roles (e.g., describing imperfections, data variability, and human mistakes). However, advanced students focused on the inferential roles of uncertainty analysis (e.g., quantifying reliability, making comparisons, and guiding refinements). The findings suggest that lab courses could emphasize a variety of approaches to establish validity, such as by valuing documentation of the experimental process when evaluating the quality of student work. In order to emphasize the role of uncertainty in an authentic way, labs could provide opportunities to iterate, make repeated comparisons, and make decisions based on those comparisons.
High-resolution coupled physics solvers for analysing fine-scale nuclear reactor design problems.
Mahadevan, Vijay S; Merzari, Elia; Tautges, Timothy; Jain, Rajeev; Obabko, Aleksandr; Smith, Michael; Fischer, Paul
2014-08-06
An integrated multi-physics simulation capability for the design and analysis of current and future nuclear reactor models is being investigated, to tightly couple neutron transport and thermal-hydraulics physics under the SHARP framework. Over several years, high-fidelity, validated mono-physics solvers with proven scalability on petascale architectures have been developed independently. Based on a unified component-based architecture, these existing codes can be coupled with a mesh-data backplane and a flexible coupling-strategy-based driver suite to produce a viable tool for analysts. The goal of the SHARP framework is to perform fully resolved coupled physics analysis of a reactor on heterogeneous geometry, in order to reduce the overall numerical uncertainty while leveraging available computational resources. The coupling methodology and software interfaces of the framework are presented, along with verification studies on two representative fast sodium-cooled reactor demonstration problems to prove the usability of the SHARP framework.
High-resolution coupled physics solvers for analysing fine-scale nuclear reactor design problems
Mahadevan, Vijay S.; Merzari, Elia; Tautges, Timothy; Jain, Rajeev; Obabko, Aleksandr; Smith, Michael; Fischer, Paul
2014-01-01
An integrated multi-physics simulation capability for the design and analysis of current and future nuclear reactor models is being investigated, to tightly couple neutron transport and thermal-hydraulics physics under the SHARP framework. Over several years, high-fidelity, validated mono-physics solvers with proven scalability on petascale architectures have been developed independently. Based on a unified component-based architecture, these existing codes can be coupled with a mesh-data backplane and a flexible coupling-strategy-based driver suite to produce a viable tool for analysts. The goal of the SHARP framework is to perform fully resolved coupled physics analysis of a reactor on heterogeneous geometry, in order to reduce the overall numerical uncertainty while leveraging available computational resources. The coupling methodology and software interfaces of the framework are presented, along with verification studies on two representative fast sodium-cooled reactor demonstration problems to prove the usability of the SHARP framework. PMID:24982250
Low Complexity Models to improve Incomplete Sensitivities for Shape Optimization
NASA Astrophysics Data System (ADS)
Stanciu, Mugurel; Mohammadi, Bijan; Moreau, Stéphane
2003-01-01
The present global platform for simulation and design of multi-model configurations treats shape optimization problems in aerodynamics. Flow solvers are coupled with optimization algorithms based on CAD-free and CAD-connected frameworks. Newton methods together with incomplete expressions of gradients are used. Such incomplete sensitivities are improved using reduced models based on physical assumptions. The validity and the application of this approach to real-life problems are presented. The numerical examples concern shape optimization for an airfoil, a business jet and a car engine cooling axial fan.
An architecture for the development of real-time fault diagnosis systems using model-based reasoning
NASA Technical Reports Server (NTRS)
Hall, Gardiner A.; Schuetzle, James; Lavallee, David; Gupta, Uday
1992-01-01
Presented here is an architecture for implementing real-time telemetry based diagnostic systems using model-based reasoning. First, we describe Paragon, a knowledge acquisition tool for offline entry and validation of physical system models. Paragon provides domain experts with a structured editing capability to capture the physical component's structure, behavior, and causal relationships. We next describe the architecture of the run time diagnostic system. The diagnostic system, written entirely in Ada, uses the behavioral model developed offline by Paragon to simulate expected component states as reflected in the telemetry stream. The diagnostic algorithm traces causal relationships contained within the model to isolate system faults. Since the diagnostic process relies exclusively on the behavioral model and is implemented without the use of heuristic rules, it can be used to isolate unpredicted faults in a wide variety of systems. Finally, we discuss the implementation of a prototype system constructed using this technique for diagnosing faults in a science instrument. The prototype demonstrates the use of model-based reasoning to develop maintainable systems with greater diagnostic capabilities at a lower cost.
Perception of competence in middle school physical education: instrument development and validation.
Scrabis-Fletcher, Kristin; Silverman, Stephen
2010-03-01
Perception of Competence (POC) has been studied extensively in physical activity (PA) research with similar instruments adapted for physical education (PE) research. Such instruments do not account for the unique PE learning environment. Therefore, an instrument was developed and the scores validated to measure POC in middle school PE. A multiphase design was used consisting of an intensive theoretical review, elicitation study, prepilot study, pilot study, content validation study, and final validation study (N=1281). Data analysis included a multistep iterative process to identify the best model fit. A three-factor model for POC was tested and resulted in root mean square error of approximation = .09, root mean square residual = .07, goodness of fit index = .90, and adjusted goodness of fit index = .86, values in the acceptable range (Hu & Bentler, 1999). A two-factor model was also tested and resulted in a good fit (two-factor fit index values = .05, .03, .98, .97, respectively). The results of this study suggest that an instrument using a three- or two-factor model provides reliable and valid scores of POC measurement in middle school PE.
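For readers less familiar with the fit indices quoted above, the root mean square error of approximation has the conventional definition (standard formula, not taken from this study)

\[
\mathrm{RMSEA} \;=\; \sqrt{\frac{\max\left(\chi^{2}-df,\;0\right)}{df\,(N-1)}},
\]

where \(\chi^{2}\) is the model chi-square statistic, \(df\) its degrees of freedom, and \(N\) the sample size; Hu and Bentler (1999) suggest cutoffs of roughly .06 for RMSEA and .08 for the standardized root mean square residual.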
Modeling and Validation of Lithium-ion Automotive Battery Packs (SAE 2013-01-1539)
The Advanced Light-Duty Powertrain and Hybrid Analysis (ALPHA) tool was created by EPA to evaluate the Greenhouse Gas (GHG) emissions of Light-Duty (LD) vehicles. It is a physics-based, forward-looking, full vehicle computer simulator capable of analyzing various vehicle types c...
Model for intensity calculation in electron guns
NASA Astrophysics Data System (ADS)
Doyen, O.; De Conto, J. M.; Garnier, J. P.; Lefort, M.; Richard, N.
2007-04-01
The calculation of the current in an electron gun structure is one of the main problems in understanding electron gun physics. Various simulation codes exist but often show important discrepancies with experiments, and those differences cannot be reduced because of the lack of physical information in these codes. We present a simple physical three-dimensional model, valid for all kinds of gun geometries. This model offers better precision than the other simulation codes and models encountered and allows a real understanding of electron gun physics. It is based only on the calculation of the Laplace electric field at the cathode, the use of the classical Child-Langmuir current density, and a geometrical correction to this law. Finally, the intensity-versus-voltage characteristic curve can be precisely described with only a few physical parameters. Indeed, we have shown that the electron gun current generation is governed mainly by the shape of the electric field at the cathode without beam and by the gap distance of an equivalent infinite planar diode.
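The abstract does not reproduce the model itself. For context, the classical Child-Langmuir space-charge-limited current density on which it builds is

\[
J_{\mathrm{CL}} \;=\; \frac{4\,\varepsilon_{0}}{9}\sqrt{\frac{2e}{m_{e}}}\;\frac{V^{3/2}}{d^{2}},
\]

where \(V\) is the applied voltage, \(d\) the planar diode gap, \(e\) and \(m_{e}\) the electron charge and mass, and \(\varepsilon_{0}\) the vacuum permittivity. The model described above evaluates the beam-free (Laplace) field shape at the cathode and replaces \(d\) with an equivalent infinite-planar-diode gap plus a geometrical correction; the exact form of that correction is not given in the abstract.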
Toward Supersonic Retropropulsion CFD Validation
NASA Technical Reports Server (NTRS)
Kleb, Bil; Schauerhamer, D. Guy; Trumble, Kerry; Sozer, Emre; Barnhardt, Michael; Carlson, Jan-Renee; Edquist, Karl
2011-01-01
This paper begins the process of verifying and validating computational fluid dynamics (CFD) codes for supersonic retropropulsive flows. Four CFD codes (DPLR, FUN3D, OVERFLOW, and US3D) are used to perform various numerical and physical modeling studies toward the goal of comparing predictions with a wind tunnel experiment specifically designed to support CFD validation. Numerical studies run the gamut in rigor from code-to-code comparisons to observed order-of-accuracy tests. Results indicate that for this complex flowfield, involving time-dependent shocks and vortex shedding, design order of accuracy is not clearly evident. Also explored is the extent of physical modeling necessary to predict the salient flowfield features found in high-speed Schlieren images and surface pressure measurements taken during the validation experiment. Physical modeling studies include geometric items such as wind tunnel wall and sting mount interference, as well as turbulence modeling that ranges from a RANS (Reynolds-Averaged Navier-Stokes) 2-equation model to DES (Detached Eddy Simulation) models. These studies indicate that tunnel wall interference is minimal for the cases investigated; model mounting hardware effects are confined to the aft end of the model; and sparse grid resolution and turbulence modeling can damp or entirely dissipate the unsteadiness of this self-excited flow.
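One standard form of the observed order-of-accuracy test mentioned above (general verification practice, not a procedure specific to DPLR, FUN3D, OVERFLOW, or US3D): given a scalar result \(f_{1}, f_{2}, f_{3}\) computed on fine, medium, and coarse grids related by a constant refinement ratio \(r\),

\[
p_{\mathrm{obs}} \;=\; \frac{\ln\!\left[\,(f_{3}-f_{2})/(f_{2}-f_{1})\,\right]}{\ln r},
\]

and the test asks whether \(p_{\mathrm{obs}}\) approaches the design (formal) order of the discretization as the grids are refined.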
Toward Development of a Stochastic Wake Model: Validation Using LES and Turbine Loads
Moon, Jae; Manuel, Lance; Churchfield, Matthew; ...
2017-12-28
Wind turbines within an array do not experience free-stream undisturbed flow fields. Rather, the flow fields on internal turbines are influenced by wakes generated by upwind units and exhibit different dynamic characteristics relative to the free stream. The International Electrotechnical Commission (IEC) standard 61400-1 for the design of wind turbines only considers a deterministic wake model for the design of a wind plant. This study is focused on the development of a stochastic model for waked wind fields. First, high-fidelity physics-based waked wind velocity fields are generated using Large-Eddy Simulation (LES). Stochastic characteristics of these LES waked wind velocity fields, including mean and turbulence components, are analyzed. Wake-related mean and turbulence field-related parameters are then estimated for use with a stochastic model, using Multivariate Multiple Linear Regression (MMLR) with the LES data. To validate the simulated wind fields based on the stochastic model, wind turbine tower and blade loads are generated using aeroelastic simulation for utility-scale wind turbine models and compared with those based directly on the LES inflow. The study's overall objective is to offer efficient and validated stochastic approaches that are computationally tractable for assessing the performance and loads of turbines operating in wakes.
Toward Development of a Stochastic Wake Model: Validation Using LES and Turbine Loads
DOE Office of Scientific and Technical Information (OSTI.GOV)
Moon, Jae; Manuel, Lance; Churchfield, Matthew
Wind turbines within an array do not experience free-stream undisturbed flow fields. Rather, the flow fields on internal turbines are influenced by wakes generated by upwind units and exhibit different dynamic characteristics relative to the free stream. The International Electrotechnical Commission (IEC) standard 61400-1 for the design of wind turbines only considers a deterministic wake model for the design of a wind plant. This study is focused on the development of a stochastic model for waked wind fields. First, high-fidelity physics-based waked wind velocity fields are generated using Large-Eddy Simulation (LES). Stochastic characteristics of these LES waked wind velocity fields, including mean and turbulence components, are analyzed. Wake-related mean and turbulence field-related parameters are then estimated for use with a stochastic model, using Multivariate Multiple Linear Regression (MMLR) with the LES data. To validate the simulated wind fields based on the stochastic model, wind turbine tower and blade loads are generated using aeroelastic simulation for utility-scale wind turbine models and compared with those based directly on the LES inflow. The study's overall objective is to offer efficient and validated stochastic approaches that are computationally tractable for assessing the performance and loads of turbines operating in wakes.
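Multivariate Multiple Linear Regression, as used in the two records above, fits several response variables at once on a shared set of predictors. A hedged Python illustration with synthetic data (variable names and dimensions are hypothetical, not taken from the study):

import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))           # e.g. hub-height wind speed, turbulence intensity, spacing
B_true = rng.normal(size=(4, 3))        # 4 wake parameters x 3 predictors
Y = X @ B_true.T + 0.05 * rng.normal(size=(200, 4))

X1 = np.column_stack([np.ones(len(X)), X])        # add an intercept column
B_hat, *_ = np.linalg.lstsq(X1, Y, rcond=None)    # one least-squares solve fits all responses
print(B_hat.shape)                                # (4, 4): intercept + 3 slopes for each of 4 responses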
2009-12-01
INHALATION TOXICOLOGY RESEARCH 2.1.1 Development of a Fatigue Model & Blood Oxygen-based Parameter Correlates. Liu et al. (2002) introduced a muscle ... and Stuhmiller, J.H., "Generalization of a 'phenomenological' muscle fatigue model," Technical report J0287-10-382 (in preparation). Product 3. Sih ... physiologic response to exercise and a model of muscle fatigue, which have been developed and validated separately, are integrated. Integration occurs through
NASA Technical Reports Server (NTRS)
Melis, Matthew E.; Brand, Jeremy H.; Pereira, J. Michael; Revilock, Duane M.
2007-01-01
Following the tragedy of the Space Shuttle Columbia on February 1, 2003, a major effort commenced to develop a better understanding of debris impacts and their effect on the Space Shuttle subsystems. An initiative to develop and validate physics-based computer models to predict damage from such impacts was a fundamental component of this effort. To develop the models it was necessary to physically characterize Reinforced Carbon-Carbon (RCC) and various debris materials which could potentially shed on ascent and impact the Orbiter RCC leading edges. The validated models enabled the launch system community to use the impact analysis software LS DYNA to predict damage by potential and actual impact events on the Orbiter leading edge and nose cap thermal protection systems. Validation of the material models was done through a three-level approach: fundamental tests to obtain independent static and dynamic material model properties of materials of interest, sub-component impact tests to provide highly controlled impact test data for the correlation and validation of the models, and full-scale impact tests to establish the final level of confidence for the analysis methodology. This paper discusses the second level subcomponent test program in detail and its application to the LS DYNA model validation process. The level two testing consisted of over one hundred impact tests in the NASA Glenn Research Center Ballistic Impact Lab on 6 by 6 in. and 6 by 12 in. flat plates of RCC and evaluated three types of debris projectiles: BX 265 External Tank foam, ice, and PDL 1034 External Tank foam. These impact tests helped determine the level of damage generated in the RCC flat plates by each projectile. The information obtained from this testing validated the LS DYNA damage prediction models and provided a certain level of confidence to begin performing analysis for full-size RCC test articles for returning NASA to flight with STS 114 and beyond.
Injector Design Tool Improvements: User's manual for FDNS V.4.5
NASA Technical Reports Server (NTRS)
Chen, Yen-Sen; Shang, Huan-Min; Wei, Hong; Liu, Jiwen
1998-01-01
The major emphasis of the current effort is on the development and validation of an efficient parallel-machine computational model, based on the FDNS code, to analyze the fluid dynamics of a wide variety of liquid jet configurations for general liquid rocket engine injection system applications. This model includes physical models for droplet atomization, breakup/coalescence, evaporation, turbulence mixing and gas-phase combustion. Benchmark validation cases for liquid rocket engine chamber combustion conditions will be performed for model validation purposes. Test cases may include shear coaxial, swirl coaxial and impinging injection systems with combinations of LOX/H2 or LOX/RP-1 propellant injector elements used in rocket engine designs. As a final goal of this project, a well-tested parallel CFD performance methodology, together with a description of its operation for users, will be documented in a final technical report at the end of the proposed research effort.
Haptic simulation framework for determining virtual dental occlusion.
Wu, Wen; Chen, Hui; Cen, Yuhai; Hong, Yang; Khambay, Balvinder; Heng, Pheng Ann
2017-04-01
The surgical treatment of many dentofacial deformities is often complex due to its three-dimensional nature. Determining the dental occlusion in its most stable position is essential for the success of the treatment. Computer-aided virtual planning on an individualized, patient-specific 3D model can help formulate the surgical plan and predict the surgical change. However, in current computer-aided planning systems, it is not possible to determine the dental occlusion of the digital models in an intuitive way during virtual surgical planning because of the absence of haptic feedback. In this paper, a physically based haptic simulation framework is proposed that can provide surgeons with intuitive haptic feedback to determine the dental occlusion of the digital models in their most stable position. To provide physically realistic force feedback when the dental models contact each other during the search process, a contact model is proposed to describe the dynamic and collision properties of the dental models during the alignment. The simulated impulse/contact-based forces are integrated into the unified simulation framework. A validation study has been conducted on fifteen sets of virtual dental models chosen at random and covering a wide range of the dental relationships found clinically. The dental occlusions obtained by an expert were employed as a benchmark against which to compare the virtual occlusion results. The mean translational and angular deviations of the virtual occlusion results from the benchmark were small. The experimental results show the validity of our method. The simulated forces can provide valuable insights for determining the virtual dental occlusion. The findings of this work and the validation of the proposed concept lead the way for full virtual surgical planning on patient-specific virtual models, allowing fully customized treatment plans for the surgical correction of dentofacial deformities.
NASA Astrophysics Data System (ADS)
Li, Xiaoyu; Pan, Ke; Fan, Guodong; Lu, Rengui; Zhu, Chunbo; Rizzoni, Giorgio; Canova, Marcello
2017-11-01
State of energy (SOE) is an important index for electrochemical energy storage systems in electric vehicles. In this paper, a robust state of energy estimation method, in combination with a physical model parameter identification method, is proposed to achieve accurate battery state estimation at different operating conditions and different aging stages. A physics-based fractional order model with variable solid-state diffusivity (FOM-VSSD) is used to characterize the dynamic performance of a LiFePO4/graphite battery. In order to update the model parameters automatically at different aging stages, a multi-step model parameter identification method based on lexicographic optimization is specially designed for electric vehicle operating conditions. As the available battery energy changes with the applied load current profile, the relationship between the remaining energy loss and the state of charge, the average current, and the average squared current is modeled. The SOE at different operating conditions and different aging stages is estimated based on an adaptive fractional order extended Kalman filter (AFEKF). Validation results show that the overall SOE estimation error is within ±5%. The proposed method is suitable for online electric vehicle applications.
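For context, the state of energy that such filters track is commonly defined by energy counting (a standard definition, not a reproduction of the paper's AFEKF estimator):

\[
\mathrm{SOE}(t) \;=\; \mathrm{SOE}(t_{0}) \;-\; \frac{1}{E_{N}}\int_{t_{0}}^{t} v(\tau)\, i(\tau)\,\mathrm{d}\tau,
\]

where \(v\) and \(i\) are the terminal voltage and current (positive for discharge) and \(E_{N}\) is the nominal battery energy; the model-based filter corrects this open-loop integral using the measured terminal voltage.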
Lindberg, Ann-Sofie; Oksa, Juha; Antti, Henrik; Malm, Christer
2015-01-01
Physical capacity has previously been deemed important for firefighters' physical work capacity, and aerobic fitness, muscular strength, and muscular endurance are the most frequently investigated parameters of importance. Traditionally, bivariate and multivariate linear regression statistics have been used to study relationships between physical capacities and work capacities among firefighters. An alternative way to handle datasets consisting of numerous correlated variables is to use multivariate projection analyses, such as Orthogonal Projection to Latent Structures. The first aim of the present study was to evaluate the prediction and predictive power of field and laboratory tests, respectively, on firefighters' physical work capacity on selected work tasks, and to study whether valid predictions could be achieved without anthropometric data. The second aim was to externally validate selected models. The third aim was to validate selected models on firefighters and on civilians. A total of 38 (26 men and 12 women) + 90 (38 men and 52 women) subjects were included in the models and the external validation, respectively. The best prediction (R2) and predictive power (Q2) for the Stairs, Pulling, Demolition, Terrain, and Rescue work capacities were obtained with models that included field tests (R2 = 0.73 to 0.84, Q2 = 0.68 to 0.82). The best external validation was for Stairs work capacity (R2 = 0.80) and the worst for Demolition work capacity (R2 = 0.40). In conclusion, field and laboratory tests could equally well predict physical work capacities for firefighting work tasks, and models excluding anthropometric data were valid. The predictive power was satisfactory for all included work tasks except Demolition.
NASA Astrophysics Data System (ADS)
Fiorenti, Simone; Guanetti, Jacopo; Guezennec, Yann; Onori, Simona
2013-11-01
This paper presents the development and experimental validation of a dynamic model of a Hybridized Energy Storage System (HESS) consisting of a parallel connection of a lead acid (PbA) battery and double layer capacitors (DLCs), for automotive applications. The dynamic modeling of both the PbA battery and the DLC has been tackled via an equivalent-electric-circuit-based approach. Experimental tests are designed for identification purposes. Parameters of the PbA battery model are identified as a function of state of charge and current direction, whereas parameters of the DLC model are identified for different temperatures. A physical HESS has been assembled at the Center for Automotive Research at The Ohio State University and used as a test bench to validate the model against a typical current profile generated for Start&Stop applications. The HESS model is then integrated into a vehicle simulator to assess the effects of battery hybridization on vehicle fuel economy and on mitigation of battery stress.
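As an illustration of the equivalent-electric-circuit approach mentioned above, a generic first-order Thevenin sketch in Python (the identified PbA and DLC parameters from the paper are not reproduced; the numbers below are placeholders):

import numpy as np

def simulate_terminal_voltage(i_load, dt, ocv, r0, r1, c1):
    """Terminal voltage of an OCV - R0 - (R1 || C1) branch under a current profile.
    Sign convention: positive current = discharge."""
    v_rc = 0.0
    v_out = np.empty(len(i_load))
    a = np.exp(-dt / (r1 * c1))          # exact discretization of the RC branch over one step
    for k, i in enumerate(i_load):
        v_rc = a * v_rc + r1 * (1.0 - a) * i
        v_out[k] = ocv - r0 * i - v_rc
    return v_out

# 10 s at 1 Hz with a 50 A discharge pulse, loosely resembling a Start&Stop event
profile = np.r_[np.zeros(3), 50.0 * np.ones(4), np.zeros(3)]
print(simulate_terminal_voltage(profile, 1.0, ocv=12.6, r0=5e-3, r1=10e-3, c1=2000.0))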
Alternative model for administration and analysis of research-based assessments
NASA Astrophysics Data System (ADS)
Wilcox, Bethany R.; Zwickl, Benjamin M.; Hobbs, Robert D.; Aiken, John M.; Welch, Nathan M.; Lewandowski, H. J.
2016-06-01
Research-based assessments represent a valuable tool for both instructors and researchers interested in improving undergraduate physics education. However, the historical model for disseminating and propagating conceptual and attitudinal assessments developed by the physics education research (PER) community has not resulted in widespread adoption of these assessments within the broader community of physics instructors. Within this historical model, assessment developers create high quality, validated assessments, make them available for a wide range of instructors to use, and provide minimal (if any) support to assist with administration or analysis of the results. Here, we present and discuss an alternative model for assessment dissemination, which is characterized by centralized data collection and analysis. This model provides a greater degree of support for both researchers and instructors in order to more explicitly support adoption of research-based assessments. Specifically, we describe our experiences developing a centralized, automated system for an attitudinal assessment we previously created to examine students' epistemologies and expectations about experimental physics. This system provides a proof of concept that we use to discuss the advantages associated with centralized administration and data collection for research-based assessments in PER. We also discuss the challenges that we encountered while developing, maintaining, and automating this system. Ultimately, we argue that centralized administration and data collection for standardized assessments is a viable and potentially advantageous alternative to the default model characterized by decentralized administration and analysis. Moreover, with the help of online administration and automation, this model can support the long-term sustainability of centralized assessment systems.
Zhaohua Dai; Carl Trettin; Changsheng Li; Devendra M. Amatya; Ge Sun; Harbin Li
2010-01-01
A physically based distributed hydrological model, MIKE SHE, was used to evaluate the effects of altered temperature and precipitation regimes on the streamflow and water table in a forested watershed on the southeastern Atlantic coastal plain. The model calibration and validation against both streamflow and water table depth showed that the MIKE SHE was applicable for...
Characterization and Physics-Based Modeling of Electrochemical Memristors
2015-11-16
conducting films that result from electrical or optical stress. Model parameters and electrical characteristics were obtained from and validated ... x-ray scattering, Conductive Bridge Random Access Memory. Figures in the report include the calculated density of states (DOS) for GeSe2 in the valence and conduction bands and the DFT band structure for crystalline GeSe2.
An Example-Based Brain MRI Simulation Framework.
He, Qing; Roy, Snehashis; Jog, Amod; Pham, Dzung L
2015-02-21
The simulation of magnetic resonance (MR) images plays an important role in the validation of image analysis algorithms such as image segmentation, due to the lack of sufficient ground truth in real MR images. Previous work on MRI simulation has focused on explicitly modeling the MR image formation process. However, because of the overwhelming complexity of MR acquisition, these simulations must involve simplifications and approximations that can result in visually unrealistic simulated images. In this work, we describe an example-based simulation framework, which uses an "atlas" consisting of an MR image and its anatomical models derived from the hard segmentation. The relationships between the MR image intensities and its anatomical models are learned using a patch-based regression that implicitly models the physics of the MR image formation. Given the anatomical models of a new brain, a new MR image can be simulated using the learned regression. This approach has been extended to also simulate intensity inhomogeneity artifacts based on the statistical model of training data. Results show that the example-based MRI simulation method is capable of simulating different image contrasts and is robust to different choices of atlas. The simulated images resemble real MR images more than simulations produced by a physics-based model.
Morin, Alexandre J S; Maïano, Christophe
2011-09-01
In a recent review of various physical self-concept instruments, Marsh and Cheng (in press) noted that the very short 12-item version of the French Physical Self-Inventory (PSI-VS) represents an important contribution to applied research but that further research was needed to investigate the robustness of its psychometric properties in new and diversified samples. The present study was designed to answer these questions based on a sample of 1103 normally achieving French adolescents. The results show that the PSI-VS measurement model is quite robust and fully invariant across subgroups of students formed according to gender, weight, age and ethnicity. The results also confirm the convergent validity and scale score reliability of the PSI-VS subscales. Copyright © 2011 Elsevier Ltd. All rights reserved.
Lindemann, Ulrich; Zijlstra, Wiebren; Aminian, Kamiar; Chastin, Sebastien F M; de Bruin, Eling D; Helbostad, Jorunn L; Bussmann, Johannes B J
2014-01-10
Physical activity is an important determinant of health and well-being in older persons and contributes to their social participation and quality of life. Hence, assessment tools are needed to study this physical activity in free-living conditions. Wearable motion sensing technology is used to assess physical activity. However, there is a lack of harmonisation of validation protocols and applied statistics, which makes it hard to compare available and future studies. Therefore, the aim of this paper is to formulate recommendations for assessing the validity of sensor-based activity monitoring in older persons, with a focus on the measurement of body postures and movements. Validation studies of body-worn devices providing parameters on body postures and movements were identified and summarized, and an extensive interactive process between the authors resulted in recommendations about information on the assessed persons, the technical system, and the analysis of relevant parameters of physical activity, based on a standardized and semi-structured protocol. The recommended protocols can be regarded as a first attempt to standardize validity studies in the area of monitoring physical activity.
Lessons learned from recent geomagnetic disturbance model validation activities
NASA Astrophysics Data System (ADS)
Pulkkinen, A. A.; Welling, D. T.
2017-12-01
Due to concerns pertaining to geomagnetically induced current impact on ground-based infrastructure, there has been significantly elevated interest in applying models for local geomagnetic disturbance or "delta-B" predictions. Correspondingly there has been an elevated need for testing the quality of the delta-B predictions generated by modern empirical and physics-based models. To address this need, community-wide activities were launched under the GEM Challenge framework, and one culmination of the activities was the validation and selection of models that were transitioned into operations at NOAA SWPC. The community-wide delta-B action continues under the CCMC-facilitated International Forum for Space Weather Capabilities Assessment and its "Ground Magnetic Perturbations: dBdt, delta-B, GICs, FACs" working group. The new delta-B working group builds on past experience and expands the collaborations to cover the entire international space weather community. In this paper, we discuss the key lessons learned from the past delta-B validation exercises and lay out the path forward for building on that experience under the new delta-B working group.
Evaluation of the National Solar Radiation Database (NSRDB) Using Ground-Based Measurements
NASA Astrophysics Data System (ADS)
Xie, Y.; Sengupta, M.; Habte, A.; Lopez, A.
2017-12-01
Accurate solar resource information is essential for a wide spectrum of applications including renewable energy, climate studies, and solar forecasting. It can be obtained from ground-based measurement stations and/or from modeled data sets. While measurements provide data for the development and validation of solar resource models and other applications, modeled data expand the ability to address the needs for increased accuracy and spatial and temporal resolution. The National Renewable Energy Laboratory (NREL) has developed and regularly updates modeled solar resource data through the National Solar Radiation Database (NSRDB). The recent NSRDB dataset was developed using the physics-based Physical Solar Model (PSM) and provides gridded solar irradiance (global horizontal irradiance (GHI), direct normal irradiance (DNI), and diffuse horizontal irradiance) at a 4-km by 4-km spatial and half-hourly temporal resolution covering 18 years from 1998-2015. A comprehensive validation of the performance of the NSRDB (1998-2015) was conducted to quantify the accuracy of the spatial and temporal variability of the solar radiation data. Further, the study assessed the ability of the NSRDB (1998-2015) to accurately capture inter-annual variability, which is essential information for solar energy conversion projects and grid integration studies. Comparisons of the NSRDB (1998-2015) with measurements from nine selected ground stations were conducted under both clear- and cloudy-sky conditions. These locations provide high quality data covering a variety of geographical locations and climates. The comparison of the NSRDB to the ground-based data demonstrated that biases were within +/- 5% for GHI and +/- 10% for DNI. A comprehensive uncertainty estimation methodology was established to analyze the performance of the gridded NSRDB; it includes all sources of uncertainty at various time-averaged periods, a method that is not often used in model evaluation. Further, the study analyzed the inter-annual variability and mean anomalies of the 18 years of solar radiation data. This presentation will outline the validation methodology and provide detailed results of the comparison.
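A hedged Python sketch of the kind of relative bias behind the +/-5% (GHI) and +/-10% (DNI) statements above (mean bias expressed as a percentage of the measured mean; the NSRDB validation also uses additional metrics and time averaging):

import numpy as np

def relative_bias_percent(modeled, measured):
    modeled, measured = np.asarray(modeled, float), np.asarray(measured, float)
    return 100.0 * np.mean(modeled - measured) / np.mean(measured)

# Toy half-hourly GHI values in W/m^2 (hypothetical numbers, not NSRDB data)
print(relative_bias_percent([510.0, 640.0, 700.0], [500.0, 650.0, 690.0]))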
Review of surface steam sterilization for validation purposes.
van Doornmalen, Joost; Kopinga, Klaas
2008-03-01
Sterilization is an essential step in the process of producing sterile medical devices. To guarantee sterility, the process of sterilization must be validated. Because there is no direct way to measure sterility, the techniques applied to validate the sterilization process are based on statistical principles. Steam sterilization is the most frequently applied sterilization method worldwide and can be validated either by indicators (chemical or biological) or physical measurements. The steam sterilization conditions are described in the literature. Starting from these conditions, criteria for the validation of steam sterilization are derived and can be described in terms of physical parameters. Physical validation of steam sterilization appears to be an adequate and efficient validation method that could be considered as an alternative for indicator validation. Moreover, physical validation can be used for effective troubleshooting in steam sterilizing processes.
ERIC Educational Resources Information Center
Manthey, Seth; Brewe, Eric
2013-01-01
University Modeling Instruction (UMI) is an approach to curriculum and pedagogy that focuses instruction on engaging students in building, validating, and deploying scientific models. Modeling Instruction has been successfully implemented in both high school and university physics courses. Studies within the physics education research (PER)…
Pilot Wave Model for Impulsive Thrust from RF Test Device Measured in Vacuum
NASA Technical Reports Server (NTRS)
White, Harold; Lawrence, James; Sylvester, Andre; Vera, Jerry; Chap, Andrew; George, Jeff
2017-01-01
A physics model is developed in detail and its place in the taxonomy of ideas about the nature of the quantum vacuum is discussed. The experimental results from the recently completed vacuum test campaign evaluating the impulsive thrust performance of a tapered RF test article excited in the TM212 mode at 1,937 megahertz (MHz) are summarized. The empirical data from this campaign is compared to the predictions from the physics model tools. A discussion is provided to further elaborate on the possible implications of the proposed model if it is physically valid. Based on the correlation of analysis prediction with experimental data collected, it is proposed that the observed anomalous thrust forces are real, not due to experimental error, and are due to a new type of interaction with quantum vacuum fluctuations.
NASA Astrophysics Data System (ADS)
Li, Jing; Singh, Chandralekha
2017-03-01
Development of validated physics surveys on various topics is important for investigating the extent to which students master those concepts after traditional instruction and for assessing innovative curricula and pedagogies that can improve student understanding significantly. Here, we discuss the development and validation of a conceptual multiple-choice survey related to magnetism suitable for introductory physics courses. The survey was developed taking into account common student difficulties with magnetism concepts covered in introductory physics courses found in our investigation, and the incorrect choices to the multiple-choice questions were designed based upon those common student difficulties. After the development and validation of the survey, it was administered to introductory physics students in various classes in paper-pencil format before and after traditional lecture-based instruction in relevant concepts. We compared the performance of students on the survey in the algebra-based and calculus-based introductory physics courses before and after traditional lecture-based instruction in relevant magnetism concepts. We discuss the common difficulties of introductory physics students with magnetism concepts we found via the survey. We also administered the survey to upper-level undergraduates majoring in physics and PhD students to benchmark the survey and compared their performance with that of traditionally taught introductory physics students for whom the survey is intended. A comparison with the baseline data on the validated magnetism survey from traditionally taught introductory physics courses and upper-level undergraduate and PhD students discussed in this paper can help instructors assess the effectiveness of curricula and pedagogies which are especially designed to help students integrate conceptual and quantitative understanding and develop a good grasp of the concepts. In particular, if introductory physics students’ average performance in a class is significantly better than that of students in traditionally taught courses described here (and particularly when it is comparable to that of physics PhD students’ average performance discussed here), the curriculum or pedagogy used in that introductory class can be deemed effective. Moreover, we discuss the use of the survey to investigate gender differences in student performance.
Structure of the tropical lower stratosphere as revealed by three reanalysis data sets
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pawson, S.; Fiorino, M.
1996-05-01
While the skill of climate simulation models has advanced over the last decade, mainly through improvements in modeling, further progress will depend on the availability and the quality of comprehensive validation data sets covering long time periods. A new source of such validation data is atmospheric "reanalysis", where a fixed, state-of-the-art global atmospheric model/data assimilation system is run through archived and recovered observations to produce a consistent set of atmospheric analyses. Although reanalysis will be free of non-physical variability caused by changes in the models and/or the assimilation procedure, it is necessary to assess its quality. A region for stringent testing of the quality of reanalysis is the tropical lower stratosphere. This portion of the atmosphere is sparse in observations but displays the prominent quasi-biennial oscillation (QBO) and an annual cycle, neither of which is fully understood, but which are likely coupled dynamically. We first consider the performance of three reanalyses, from NCEP/NCAR, NASA and ECMWF, against rawinsonde data in depicting the QBO and then examine the structure of the tropical lower stratosphere in NCEP and ECMWF data sets in detail. While the annual cycle and the QBO in wind and temperature are quite successfully represented, the mean meridional circulations in NCEP and ECMWF data sets contain unusual features which may be due to the assimilation process rather than being physically based. Further, the models capture the long-term temperature fluctuations associated with volcanic eruptions, even though the physical mechanisms are not included, thus implying that the model does not mask prominent stratospheric signals in the observational data. We conclude that reanalysis offers a unique opportunity to better understand the dynamics of the QBO and can be applied to climate model validation.
Development of multimedia learning based inquiry on vibration and wave material
NASA Astrophysics Data System (ADS)
Madeali, H.; Prahani, B. K.
2018-03-01
This study aims to develop inquiry-based learning multimedia that is interesting, easy for students to understand, and streamlines teachers' time in delivering the teaching materials, as well as being feasible for use in learning the physics subject matter of vibration and waves. This is a Research and Development study following the ADDIE model: Analysis, Design, Development, Implementation, and Evaluation. The inquiry-based learning multimedia is packaged in hypertext form using Adobe Flash CS6 software. The inquiry aspect is constructed by showing animations of the concepts the students are meant to grasp, followed by questions asking the students what they observe. The multimedia was then validated by 2 learning experts, 3 material experts and 3 media experts, and tested on 3 junior high school teachers and 23 students of state junior high school 5 of Kendari. The results of the study include: (1) validation results by the learning, material and media experts fall into the valid category; (2) the results of trials by teachers and students fall into the practical category. These results show that the inquiry-based learning multimedia on vibration and wave material that has been developed is feasible for use in physics learning by grade VIII junior high school students.
NASA Astrophysics Data System (ADS)
Zheng, Xu; Hao, Zhiyong; Wang, Xu; Mao, Jie
2016-06-01
High-speed railway train (HST) interior noise at low, medium, and high frequencies can be simulated by finite element analysis (FEA) or boundary element analysis (BEA), hybrid finite element analysis-statistical energy analysis (FEA-SEA), and statistical energy analysis (SEA), respectively. First, a new method named statistical acoustic energy flow (SAEF) is proposed, which can be applied to full-spectrum HST interior noise simulation (including low, medium, and high frequencies) with only one model. In an SAEF model, the corresponding multi-physical-field coupling excitations are first fully considered and coupled to excite the interior noise. The interior noise attenuated by the sound insulation panels of the carriage is simulated by modeling the inflow of acoustic energy from the exterior excitations into the interior acoustic cavities. Rigid multi-body dynamics, fast multi-pole BEA, and large-eddy simulation with indirect boundary element analysis are employed to extract the multi-physical-field excitations, which include the wheel-rail interaction forces/secondary suspension forces, the wheel-rail rolling noise, and the aerodynamic noise, respectively. All the peak values and their frequency bands of the simulated acoustic excitations are validated against those from a noise source identification test. In addition, the measured equipment noise inside the equipment compartment is used as one of the excitation sources contributing to the interior noise. Second, a fully trimmed FE carriage model is constructed, and the simulated modal shapes and frequencies agree well with the measured ones, which validates the global FE carriage model as well as the local FE models of the aluminum alloy-trim composite panel; the sound transmission loss model of any composite panel is thus indirectly validated. Finally, the SAEF model of the carriage is constructed based on the accurate FE model and excited by the multi-physical-field excitations. The results show that the trend of the simulated 1/3 octave band sound pressure spectrum agrees well with that of the on-site measurement. The deviation between the simulated and measured overall sound pressure level (SPL) is 2.6 dB(A), well within the engineering tolerance limit, which validates the SAEF model for full-spectrum analysis of high-speed train interior noise.
Recent modelling advances for ultrasonic TOFD inspections
DOE Office of Scientific and Technical Information (OSTI.GOV)
Darmon, Michel; Ferrand, Adrien; Dorval, Vincent
The ultrasonic TOFD (Time of Flight Diffraction) technique is commonly used to detect and characterize disoriented cracks using their edge diffraction echoes. An overview of the models integrated in the CIVA software platform and devoted to TOFD simulation is presented. CIVA can predict diffraction echoes from complex 3D flaws using a model based on the PTD (Physical Theory of Diffraction). Other dedicated developments have been added to simulate lateral waves in 3D on planar entry surfaces and in 2D on irregular surfaces by a ray approach. Calibration echoes from Side Drilled Holes (SDHs), specimen echoes and shadowing effects from flaws can also be modelled. Some examples of theoretical validation of the models are presented. In addition, experimental validations have been performed both on planar blocks containing calibration holes and various notches and on a specimen with an irregular entry surface, allowing conclusions to be drawn on the validity of all the developed models.
Sell, Timothy C; Abt, John P; Crawford, Kim; Lovalekar, Mita; Nagai, Takashi; Deluzio, Jennifer B; Smalley, Brain W; McGrail, Mark A; Rowe, Russell S; Cardin, Sylvain; Lephart, Scott M
2010-01-01
Physical training for United States military personnel requires a combination of injury prevention and performance optimization to counter unintentional musculoskeletal injuries and maximize warrior capabilities. Determining the most effective activities and tasks to meet these goals requires a systematic, research-based approach that is population specific, based on the tasks and demands of the Warrior. The authors have modified the traditional approach to injury prevention to implement a comprehensive injury prevention and performance optimization research program with the 101st Airborne Division (Air Assault) at Fort Campbell, KY. This is the second of two companion papers and presents the last three steps of the research model: Design and Validation of the Interventions, Program Integration and Implementation, and Monitor and Determine the Effectiveness of the Program. An 8-week trial was performed to validate the Eagle Tactical Athlete Program (ETAP) to improve modifiable suboptimal characteristics identified in Part I. The experimental group participated in ETAP under the direction of an ETAP Strength and Conditioning Specialist, while the control group performed the current physical training at Fort Campbell under the direction of a Physical Training Leader, as governed by FM 21-20, for the 8-week study period. Soldiers performing ETAP demonstrated improvements in several tests of strength, flexibility, performance, physiology, and the APFT compared to the current physical training performed at Fort Campbell. ETAP was proven valid for improving certain suboptimal characteristics within the 8-week trial as compared to the current training performed at Fort Campbell. ETAP has long-term implications, with greater improvements expected when implemented across a Division pre-deployment cycle of 10-12 months, which will result in further systemic adaptations for each variable.
Velpuri, N.M.; Senay, G.B.; Asante, K.O.
2011-01-01
Managing limited surface water resources is a great challenge in areas where ground-based data are either limited or unavailable. Direct or indirect measurements of surface water resources through remote sensing offer several advantages for monitoring ungauged basins. A physically based hydrologic technique to monitor lake water levels in ungauged basins using multi-source satellite data, such as satellite-based rainfall estimates, modelled runoff, evapotranspiration, a digital elevation model, and other data, is presented. This approach is applied to model Lake Turkana water levels from 1998 to 2009. Modelling results showed that the model can reasonably capture all the patterns and seasonal variations of the lake water level fluctuations. A composite lake level product of TOPEX/Poseidon, Jason-1, and ENVISAT satellite altimetry data is used for model calibration (1998-2000) and model validation (2001-2009). Validation results showed that model-based lake levels are in good agreement with observed satellite altimetry data. Compared to satellite altimetry data, the Pearson's correlation coefficient was found to be 0.81 during the validation period. The model efficiency estimated using the NSCE is found to be 0.93, 0.55 and 0.66 for the calibration, validation and combined periods, respectively. Further, the model-based estimates showed a root mean square error of 0.62 m and a mean absolute error of 0.46 m, with a positive mean bias error of 0.36 m, for the validation period (2001-2009). These error estimates were found to be less than 15% of the natural variability of the lake, thus giving high confidence in the modelled lake level estimates. The approach presented in this paper can be used to (a) simulate patterns of lake water level variations in data-scarce regions, (b) operationally monitor lake water levels in ungauged basins, (c) derive historical lake level information using satellite rainfall and evapotranspiration data, and (d) augment the information provided by satellite altimetry systems on changes in lake water levels. © Author(s) 2011.
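The abstract does not spell out the governing balance, but the described technique implies a simple lake storage update driven by over-lake rainfall, open-water evaporation and modelled catchment inflow. The sketch below is only an illustration of that bookkeeping; the function, the parameter values and the assumption of a constant lake area are hypothetical, not the authors' implementation.

```python
# Minimal, illustrative lake water-balance update (not the authors' code).
def update_lake_level(level_m, rain_m, evap_m, inflow_m3, lake_area_m2):
    """Advance the lake level by one time step using a simple storage balance.

    level_m      : current lake level (m)
    rain_m       : over-lake rainfall during the step (m)
    evap_m       : open-water evaporation during the step (m)
    inflow_m3    : modelled runoff volume from the catchment (m^3)
    lake_area_m2 : lake surface area (m^2), treated as constant here
    """
    # Over-lake terms act directly on the level; catchment inflow is
    # converted from a volume to an equivalent depth over the lake.
    return level_m + rain_m - evap_m + inflow_m3 / lake_area_m2

# Example with made-up monthly values
new_level = update_lake_level(362.0, 0.05, 0.18, 1.5e9, 6.75e9)
print(f"new level: {new_level:.2f} m")
```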
Dependency of the Reynolds number on the water flow through the perforated tube
DOE Office of Scientific and Technical Information (OSTI.GOV)
Závodný, Zdenko, E-mail: zdenko.zavodny@stuba.sk; Bereznai, Jozef, E-mail: jozef.bereznai@stuba.sk; Urban, František
Safe and effective loading of nuclear reactor fuel assemblies demands qualitative and quantitative analysis of the relationship between the coolant temperature at the fuel assembly outlet, measured by the thermocouple, and the mean coolant temperature profile in the thermocouple plane position. It is not possible to perform the analysis directly in the reactor, so it is carried out using measurements on a physical model and CFD models of the fuel assembly coolant flow. The CFD models have to be verified and validated against the temperature and velocity profiles obtained from measurements of the cooling water flowing in the physical model of the fuel assembly. A simplified physical model with a perforated central tube and its validated CFD model serve as the basis for the design of the second physical model of the fuel assembly of the VVER 440 nuclear reactor. The physical model will be manufactured and installed in the laboratory of the Institute of Energy Machines, Faculty of Mechanical Engineering of the Slovak University of Technology in Bratislava.
Validation of the Physical Activity Scale for individuals with physical disabilities.
van den Berg-Emons, Rita J; L'Ortye, Annemiek A; Buffart, Laurien M; Nieuwenhuijsen, Channah; Nooijen, Carla F; Bergen, Michael P; Stam, Henk J; Bussmann, Johannes B
2011-06-01
To determine the criterion validity of the Physical Activity Scale for Individuals With Physical Disabilities (PASIPD) by means of daily physical activity levels measured by using a validated accelerometry-based activity monitor in a large group of persons with a physical disability. Cross-sectional. Participants' home environment. Ambulatory and nonambulatory persons with cerebral palsy, meningomyelocele, or spinal cord injury (N=124). Not applicable. Self-reported physical activity level measured by using the PASIPD, a 2-day recall questionnaire, was correlated to objectively measured physical activity level measured by using a validated accelerometry-based activity monitor. Significant Spearman correlation coefficients between the PASIPD and activity monitor outcome measures ranged from .22 to .37. The PASIPD overestimated the duration of physical activity measured by using the activity monitor (mean ± SD, 3.9±2.9 vs 1.5±0.9h/d; P<.01). Significant correlation (ρ=-.74; P<.01) was found between average number of hours of physical activity per day measured by using the 2 methods and difference in hours between methods. This indicates larger overestimation for persons with higher activity levels. The PASIPD correlated poorly with objective measurements using an accelerometry-based activity monitor in people with a physical disability. However, similar low correlations between objective and subjective activity measurements have been found in the general population. Users of the PASIPD should be cautious about overestimating physical activity levels. Copyright © 2011 American Congress of Rehabilitation Medicine. Published by Elsevier Inc. All rights reserved.
A PetriNet-Based Approach for Supporting Traceability in Cyber-Physical Manufacturing Systems
Huang, Jiwei; Zhu, Yeping; Cheng, Bo; Lin, Chuang; Chen, Junliang
2016-01-01
With the growing popularity of complex dynamic activities in manufacturing processes, traceability of the entire life of every product has drawn significant attention especially for food, clinical materials, and similar items. This paper studies the traceability issue in cyber-physical manufacturing systems from a theoretical viewpoint. Petri net models are generalized for formulating dynamic manufacturing processes, based on which a detailed approach for enabling traceability analysis is presented. Models as well as algorithms are carefully designed, which can trace back the lifecycle of a possibly contaminated item. A practical prototype system for supporting traceability is designed, and a real-life case study of a quality control system for bee products is presented to validate the effectiveness of the approach. PMID:26999141
A PetriNet-Based Approach for Supporting Traceability in Cyber-Physical Manufacturing Systems.
Huang, Jiwei; Zhu, Yeping; Cheng, Bo; Lin, Chuang; Chen, Junliang
2016-03-17
With the growing popularity of complex dynamic activities in manufacturing processes, traceability of the entire life of every product has drawn significant attention especially for food, clinical materials, and similar items. This paper studies the traceability issue in cyber-physical manufacturing systems from a theoretical viewpoint. Petri net models are generalized for formulating dynamic manufacturing processes, based on which a detailed approach for enabling traceability analysis is presented. Models as well as algorithms are carefully designed, which can trace back the lifecycle of a possibly contaminated item. A practical prototype system for supporting traceability is designed, and a real-life case study of a quality control system for bee products is presented to validate the effectiveness of the approach.
An uncertainty model of acoustic metamaterials with random parameters
NASA Astrophysics Data System (ADS)
He, Z. C.; Hu, J. Y.; Li, Eric
2018-01-01
Acoustic metamaterials (AMs) are man-made composite materials. However, random uncertainties are unavoidable in the application of AMs due to manufacturing and material errors, which lead to variance in the physical responses of AMs. In this paper, an uncertainty model based on the change-of-variable perturbation stochastic finite element method (CVPS-FEM) is formulated to predict the probability density functions of the physical responses of AMs with random parameters. Three types of physical responses, including the band structure, mode shapes and frequency response function of AMs, are studied in the uncertainty model, all of which are of great interest in the design of AMs. In this computation, the physical responses of stochastic AMs are expressed as linear functions of the pre-defined random parameters by using a first-order Taylor series expansion and perturbation technique. Then, based on the linear relationships between parameters and responses, the probability density functions of the responses can be calculated by the change-of-variable technique. Three numerical examples are employed to demonstrate the effectiveness of the CVPS-FEM for stochastic AMs, and the results are successfully validated against the Monte Carlo method.
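Written out, the two steps the abstract names (a first-order perturbation of the response about the parameter mean, then a change of variables on the resulting linear map) take roughly the following form; this is a schematic reconstruction for a single random parameter x with density p_X, not the paper's full multi-parameter formulation.

```latex
y(x) \;\approx\; y(\mu) \;+\; \left.\frac{\partial y}{\partial x}\right|_{x=\mu}\!(x-\mu)
\;=\; a + b\,x ,
\qquad
p_Y(y) \;=\; \frac{1}{|b|}\, p_X\!\left(\frac{y-a}{b}\right), \quad b \neq 0 .
```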
Turusheva, Anna; Frolova, Elena; Korystina, Elena; Zelenukha, Dmitry; Tadjibaev, Pulodjon; Gurina, Natalia; Turkeshi, Eralda; Degryse, Jean-Marie
2016-05-09
Frailty prevalence differs across countries depending on the models used to assess it, which are based on various conceptual and operational definitions. This study aims to assess the clinical validity of three frailty models among community-dwelling older adults in north-western Russia, where there is a higher incidence of cardiovascular disease and lower life expectancy than in European countries. The Crystal study is a population-based prospective cohort study in Kolpino, St. Petersburg, Russia. A random sample of the population living in the district was stratified into two age groups: 65-75 (n = 305) and 75+ (n = 306), and had a baseline comprehensive health assessment followed by a second one after 33.4 +/- 3 months. The total observation time was 47 +/- 14.6 months. Frailty was assessed according to the models of Fried, Puts and Steverink-Slaets. Its association with mortality at 5 years of follow-up, as well as dependency, mental and physical decline at around 2.5 years of follow-up, was explored by multivariable and time-to-event analyses. Mortality was predicted independently of age, sex and comorbidities only by frail status in the Fried model in those over 75 years old [HR (95 % CI) = 2.50 (1.20-5.20)]. Mental decline was independently predicted only by pre-frail [OR (95 % CI) = 0.24 (0.10-0.55)] and frail [OR (95 % CI) = 0.196 (0.06-0.67)] status in the Fried model in those 65-75 years old. The prediction of dependency and physical decline by pre-frail and frail status in any of the three frailty models was not statistically significant in this cohort of older adults. None of the three frailty models was valid for predicting 5-year mortality or disability, mental and physical decline at 2.5 years in a cohort of older adults in north-west Russia. Frailty by the Fried model had only limited value for mortality in those over 75 years old and mental decline in those 65-75 years old. Further research is needed to identify valid frailty markers for older adults in this population.
Effect of Lime Stabilization on Vertical Deformation of Laterite Halmahera Soil
NASA Astrophysics Data System (ADS)
Saing, Zubair; Djainal, Herry
2018-04-01
In this paper, a study was conducted to determine the effect of lime on the vertical deformation of a road base physical model of laterite Halmahera soil. The laterite soil samples were obtained from Halmahera Island, North Maluku Province, Indonesia. Soil characteristics were obtained from laboratory testing according to the American Standard for Testing and Materials (ASTM), consisting of physical, mechanical, mineralogical, and chemical properties. The base layer of the physical model test had dimensions of 2 m length, 2 m width, and 1.5 m height. Lime was added at 3, 5, 7, and 10%, based on the maximum dry density from standard Proctor test results, and cured for 28 days. The layer of lime-treated laterite Halmahera soil, 0.1 m thick, was placed on a subgrade layer 1.5 m thick. The physical model was then subjected to static vertical loading. Dial gauges were placed on the lime-treated soil surface at 20 cm intervals to read the vertical deformation that occurred during loading. The experimental data were analyzed and validated against numerical analysis using the finite element method. The results showed that the vertical deformation was reduced significantly at 10% lime content (three times less than for untreated soil), and the maximum deflection requirement (standard limit L/240) was met at 7-10% lime content.
Schilirò, Luca; Montrasio, Lorella; Scarascia Mugnozza, Gabriele
2016-11-01
In recent years, physically based numerical models have frequently been used in the framework of early-warning systems devoted to rainfall-induced landslide hazard monitoring and mitigation. For this reason, in this work we describe the potential of SLIP (Shallow Landslides Instability Prediction), a simplified physically based model for the analysis of shallow landslide occurrence. In order to test the reliability of this model, a back analysis of recent landslide events that occurred in the study area (located SW of Messina, northeastern Sicily, Italy) on October 1st, 2009 was performed. The simulation results have been compared with those obtained for the same event using TRIGRS, another well-established model for shallow landslide prediction. Afterwards, a simulation over a 2-year period was performed for the same area, with the aim of evaluating the performance of SLIP as an early warning tool. The results confirm the good predictive capability of the model, both in terms of spatial and temporal prediction of the instability phenomena. For this reason, we recommend an operating procedure for the real-time definition of shallow landslide triggering scenarios at the catchment scale, based on the use of SLIP calibrated through a specific multi-methodological approach. Copyright © 2016 Elsevier B.V. All rights reserved.
A useful demonstration of calculus in a physics high school laboratory
NASA Astrophysics Data System (ADS)
Alvarez, Gustavo; Schulte, Jurgen; Stockton, Geoffrey; Wheeler, David
2018-01-01
The real power of calculus is revealed when it is applied to actual physical problems. In this paper, we present a calculus-inspired physics experiment suitable for high school and undergraduate programs. A model for the terminal velocity of a falling body subject to a resistive force is developed, and its validity is tested in an experiment with a magnet falling through a column in which it induces eddy currents. The presented method combines multiple physics concepts such as 1D kinematics, classical mechanics, electromagnetism and non-trivial mathematics. It offers the opportunity for lateral as well as project-based learning.
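For reference, the standard linear-drag model that such an experiment is built around (assuming a retarding force proportional to speed, which the eddy-current braking in the paper's setup approximates) integrates to an exponential approach to the terminal velocity:

```latex
m\frac{dv}{dt} \;=\; mg - kv
\quad\Longrightarrow\quad
v(t) \;=\; v_{t}\!\left(1 - e^{-t/\tau}\right),
\qquad
v_{t} = \frac{mg}{k}, \quad \tau = \frac{m}{k}.
```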
OECD-NEA Expert Group on Multi-Physics Experimental Data, Benchmarks and Validation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Valentine, Timothy; Rohatgi, Upendra S.
High-fidelity, multi-physics modeling and simulation (M&S) tools are being developed and utilized for a variety of applications in nuclear science and technology and show great promise in their abilities to reproduce observed phenomena for many applications. Even with the increasing fidelity and sophistication of coupled multi-physics M&S tools, the underpinning models and data still need to be validated against experiments that may require a more complex array of validation data because of the great breadth of the time, energy and spatial domains of the physical phenomena that are being simulated. The Expert Group on Multi-Physics Experimental Data, Benchmarks and Validation (MPEBV) of the Nuclear Energy Agency (NEA) of the Organization for Economic Cooperation and Development (OECD) was formed to address the challenges with the validation of such tools. The work of the MPEBV expert group is shared among three task forces to fulfill its mandate, and specific exercises are being developed to demonstrate validation principles for common industrial challenges. This paper describes the overall mission of the group, the specific objectives of the task forces, the linkages among the task forces, and the development of a validation exercise that focuses on a specific reactor challenge problem.
Modeling and Validation of Power-split and P2 Parallel Hybrid Electric Vehicles (SAE 2013-01-1470)
The Advanced Light-Duty Powertrain and Hybrid Analysis tool was created by EPA to evaluate the Greenhouse Gas (GHG) emissions of Light-Duty (LD) vehicles. It is a physics-based, forward-looking, full vehicle computer simulator capable of analyzing various vehicle types combined ...
NASA Astrophysics Data System (ADS)
Anaperta, M.; Helendra, H.; Zulva, R.
2018-04-01
This study aims to describe the validity of a physics module with character-oriented values using a process skills approach for dynamic electricity material in high school (SMA/MA) and vocational school (SMK) physics. This is development research. The module development follows the model proposed by Plomp, which consists of (1) the preliminary research phase, (2) the prototyping phase, and (3) the assessment phase. The work reported here covers the preliminary investigation and design phases. Data on validity were collected through observation and questionnaires. In the preliminary investigation phase, curriculum analysis, student analysis, and concept analysis were conducted. In the design and realization phase, the module was designed for SMA/MA and SMK subjects on dynamic electricity material. This was followed by formative evaluation, which includes self-evaluation and prototyping (expert reviews, one-to-one, and small group evaluation); validity was assessed at this stage. The research data were obtained through the module validation sheet, which resulted in a valid module.
Monte Carlo calculations of positron emitter yields in proton radiotherapy.
Seravalli, E; Robert, C; Bauer, J; Stichelbaut, F; Kurz, C; Smeets, J; Van Ngoc Ty, C; Schaart, D R; Buvat, I; Parodi, K; Verhaegen, F
2012-03-21
Positron emission tomography (PET) is a promising tool for monitoring the three-dimensional dose distribution in charged particle radiotherapy. PET imaging during or shortly after proton treatment is based on the detection of annihilation photons following the β(+)-decay of radionuclides resulting from nuclear reactions in the irradiated tissue. Therapy monitoring is achieved by comparing the measured spatial distribution of irradiation-induced β(+)-activity with the predicted distribution based on the treatment plan. The accuracy of the calculated distribution depends on the correctness of the computational models, implemented in the employed Monte Carlo (MC) codes, that describe the interactions of the charged particle beam with matter and the production of radionuclides and secondary particles. However, no well-established theoretical models exist for predicting the nuclear interactions, and so phenomenological models are typically used based on parameters derived from experimental data. Unfortunately, the experimental data presently available are insufficient to validate such phenomenological hadronic interaction models. Hence, a comparison among the models used by the different MC packages is desirable. In this work, starting from a common geometry, we compare the performance of the MCNPX, GATE and PHITS MC codes in predicting the amount and spatial distribution of proton-induced activity, at therapeutic energies, against the already experimentally validated PET modelling based on the FLUKA MC code. In particular, we show how the amount of β(+)-emitters produced in tissue-like media depends on the physics model and cross-section data used to describe the proton nuclear interactions, thus calling for future experimental campaigns aimed at supporting improvements of MC modelling for clinical application of PET monitoring. © 2012 Institute of Physics and Engineering in Medicine
Takada, Kenta; Sato, Tatsuhiko; Kumada, Hiroaki; Koketsu, Junichi; Takei, Hideyuki; Sakurai, Hideyuki; Sakae, Takeji
2018-01-01
The microdosimetric kinetic model (MKM) is widely used for estimating relative biological effectiveness (RBE)-weighted doses for various radiotherapies because it can determine the surviving fraction of irradiated cells based on only the lineal energy distribution, and it is independent of the radiation type and ion species. However, the applicability of the method to proton therapy has not yet been investigated thoroughly. In this study, we validated the RBE-weighted dose calculated by the MKM in tandem with the Monte Carlo code PHITS for proton therapy by considering the complete simulation geometry of the clinical proton beam line. The physical dose, lineal energy distribution, and RBE-weighted dose for a 155 MeV mono-energetic and spread-out Bragg peak (SOBP) beam of 60 mm width were evaluated. In estimating the physical dose, the calculated depth dose distribution by irradiating the mono-energetic beam using PHITS was consistent with the data measured by a diode detector. A maximum difference of 3.1% in the depth distribution was observed for the SOBP beam. In the RBE-weighted dose validation, the calculated lineal energy distributions generally agreed well with the published measurement data. The calculated and measured RBE-weighted doses were in excellent agreement, except at the Bragg peak region of the mono-energetic beam, where the calculation overestimated the measured data by ~15%. This research has provided a computational microdosimetric approach based on a combination of PHITS and MKM for typical clinical proton beams. The developed RBE-estimator function has potential application in the treatment planning system for various radiotherapies. © The Author 2017. Published by Oxford University Press on behalf of The Japan Radiation Research Society and Japanese Society for Radiation Oncology.
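The abstract does not reproduce the MKM relations themselves. One commonly used formulation of the modified MKM, given here purely for orientation (the exact form and parameter values used in the paper may differ), expresses the surviving fraction through the saturation-corrected dose-mean lineal energy y*, the domain radius r_d and the tissue density ρ:

```latex
-\ln S \;=\; \left(\alpha_{0} + \frac{\beta}{\rho\,\pi r_{d}^{2}}\, y^{*}\right) D \;+\; \beta D^{2},
```

with the RBE-weighted dose then obtained as the product of the physical dose D and the RBE defined at equal survival relative to the reference radiation.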
Sato, Tatsuhiko; Kumada, Hiroaki; Koketsu, Junichi; Takei, Hideyuki; Sakurai, Hideyuki; Sakae, Takeji
2018-01-01
Abstract The microdosimetric kinetic model (MKM) is widely used for estimating relative biological effectiveness (RBE)-weighted doses for various radiotherapies because it can determine the surviving fraction of irradiated cells based on only the lineal energy distribution, and it is independent of the radiation type and ion species. However, the applicability of the method to proton therapy has not yet been investigated thoroughly. In this study, we validated the RBE-weighted dose calculated by the MKM in tandem with the Monte Carlo code PHITS for proton therapy by considering the complete simulation geometry of the clinical proton beam line. The physical dose, lineal energy distribution, and RBE-weighted dose for a 155 MeV mono-energetic and spread-out Bragg peak (SOBP) beam of 60 mm width were evaluated. In estimating the physical dose, the calculated depth dose distribution by irradiating the mono-energetic beam using PHITS was consistent with the data measured by a diode detector. A maximum difference of 3.1% in the depth distribution was observed for the SOBP beam. In the RBE-weighted dose validation, the calculated lineal energy distributions generally agreed well with the published measurement data. The calculated and measured RBE-weighted doses were in excellent agreement, except at the Bragg peak region of the mono-energetic beam, where the calculation overestimated the measured data by ~15%. This research has provided a computational microdosimetric approach based on a combination of PHITS and MKM for typical clinical proton beams. The developed RBE-estimator function has potential application in the treatment planning system for various radiotherapies. PMID:29087492
DOE Office of Scientific and Technical Information (OSTI.GOV)
Atamturktur, Sez; Unal, Cetin; Hemez, Francois
The project proposed to provide a Predictive Maturity Framework with its companion metrics that (1) introduce a formalized, quantitative means to communicate information between interested parties, (2) provide scientifically dependable means to claim completion of Validation and Uncertainty Quantification (VU) activities, and (3) guide the decision makers in the allocation of Nuclear Energy's resources for code development and physical experiments. The project team proposed to develop this framework based on two complementary criteria: (1) the extent of experimental evidence available for the calibration of simulation models and (2) the sophistication of the physics incorporated in simulation models. The proposed framework is capable of quantifying the interaction between the required number of physical experiments and the degree of physics sophistication. The project team has developed this framework and implemented it with a multi-scale model for simulating creep of a core reactor cladding. The multi-scale model is composed of the viscoplastic self-consistent (VPSC) code at the meso-scale, which represents the visco-plastic behavior and changing properties of a highly anisotropic material, and a Finite Element (FE) code at the macro-scale to represent the elastic behavior and apply the loading. The framework developed takes advantage of the transparency provided by partitioned analysis, where independent constituent codes are coupled in an iterative manner. This transparency allows model developers to better understand and remedy the source of biases and uncertainties, whether they stem from the constituents or the coupling interface, by exploiting separate-effect experiments conducted within the constituent domain and integral-effect experiments conducted within the full-system domain. The project team has implemented this procedure with the multi-scale VPSC-FE model and demonstrated its ability to improve the predictive capability of the model. Within this framework, the project team has focused on optimizing resource allocation for improving numerical models through further code development and experimentation. Related to further code development, we have developed a code prioritization index (CPI) for coupled numerical models. CPI is implemented to effectively improve the predictive capability of the coupled model by increasing the sophistication of constituent codes. In relation to designing new experiments, we investigated the information gained by the addition of each new experiment used for calibration and bias correction of a simulation model. Additionally, the variability of 'information gain' through the design domain has been investigated in order to identify the experiment settings where maximum information gain occurs and thus guide the experimenters in the selection of the experiment settings. This idea was extended to evaluate how the information gain from each experiment can be improved by intelligently selecting the experiments, leading to the development of the Batch Sequential Design (BSD) technique. Additionally, we evaluated the importance of sufficiently exploring the domain of applicability in experiment-based validation of high-consequence modeling and simulation by developing a new metric to quantify coverage. This metric has also been incorporated into the design of new experiments. Finally, we have proposed a data-aware calibration approach for the calibration of numerical models.
This new method considers the complexity of a numerical model (the number of parameters to be calibrated, parameter uncertainty, and form of the model) and seeks to identify the number of experiments necessary to calibrate the model based on the level of sophistication of the physics. The final component in the project team's work to improve model calibration and validation methods is the incorporation of robustness to non-probabilistic uncertainty in the input parameters. This is an improvement to model validation and uncertainty quantification extending beyond the originally proposed scope of the project. We have introduced a new metric for incorporating the concept of robustness into experiment-based validation of numerical models. This project has accounted for the graduation of two Ph.D. students (Kendra Van Buren and Josh Hegenderfer) and two M.S. students (Matthew Egeberg and Parker Shields). One of the doctoral students is now working in the nuclear engineering field and the other is a post-doctoral fellow at the Los Alamos National Laboratory. Additionally, two more Ph.D. students (Garrison Stevens and Tunc Kulaksiz) who are working towards graduation have been supported by this project.
2009-12-01
the validity of approximating poroelastic media with acoustic or acoustic/elastic models, and to characterize how scattering physics will differ for... an elastic buried object (yellow rectangle in the figure) in three types of environments:
• (1) Model 1: acoustic layer on top of a poroelastic medium with a... porosity gradient and no viscous damping.
• (2) Model 2: acoustic layer on top of a poroelastic medium with a porosity gradient and viscous damping.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vance, J.N.; Holderness, J.H.; James, D.W.
1992-12-01
Waste stream scaling factors based on sampling programs are vulnerable to one or more of the following factors: sample representativeness, analytic accuracy, and measurement sensitivity. As an alternative to sample analyses or as a verification of the sampling results, this project proposes the use of the RADSOURCE code, which accounts for the release of fuel-source radionuclides. Once the release rates of these nuclides from fuel are known, the code develops scaling factors for waste streams based on easily measured Cobalt-60 (Co-60) and Cesium-137 (Cs-137). The project team developed mathematical models to account for the appearance rate of 10CFR61 radionuclides in reactor coolant. They based these models on the chemistry and nuclear physics of the radionuclides involved. Next, they incorporated the models into a computer code that calculates plant waste stream scaling factors based on reactor coolant gamma-isotopic data. Finally, the team performed special sampling at 17 reactors to validate the models in the RADSOURCE code.
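In its simplest form, the scaling-factor bookkeeping the abstract describes amounts to ratioing each difficult-to-measure nuclide against an easily measured key nuclide and reusing that ratio on waste-stream measurements. The snippet below is only an illustration of that arithmetic; the activities, nuclide pairings and names are invented, not RADSOURCE output.

```python
# Illustrative scaling-factor calculation (hypothetical numbers, Bq/g).
coolant_activity = {"Co-60": 4.0e2, "Cs-137": 1.5e2, "Ni-63": 8.0e1, "Sr-90": 3.0e0}
key_nuclide = {"Ni-63": "Co-60", "Sr-90": "Cs-137"}   # assumed pairings

# Scaling factor = difficult-to-measure activity / key-nuclide activity
scaling_factors = {dtm: coolant_activity[dtm] / coolant_activity[key]
                   for dtm, key in key_nuclide.items()}

# A waste stream's hard-to-measure activity is then estimated from its
# easily measured key nuclide (e.g. Co-60 by gamma spectroscopy).
waste_co60 = 1.2e3
estimated_ni63 = scaling_factors["Ni-63"] * waste_co60
print(scaling_factors, estimated_ni63)
```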
Developing evaluation scales for horticultural therapy.
Im, Eun-Ae; Park, Sin-Ae; Son, Ki-Cheol
2018-04-01
This study developed evaluation scales for measuring the effects of horticultural therapy in practical settings. Qualitative and quantitative research, including three preliminary studies and a main study, were conducted. In the first study, a total of 779 horticultural therapists answered an open-ended questionnaire based on 58 items about elements of occupational therapy and seven factors about the singularity of horticultural therapy. In the second study, 20 horticultural therapists participated in in-depth interviews. In the third study, a Delphi method was conducted with 24 horticultural therapists to build a model of assessment indexes and ensure its validity. In the final study, the reserve scales were tested by 121 horticultural therapists in their practical settings with 1045 clients, to verify their reliability and validity. Preliminary questions in the effects area of horticultural therapy were developed in the first study, and validity of the components was established in the second study. In the third study, an expert Delphi survey was conducted as part of the content validity verification of the preliminary horticultural therapy tool for the physical, cognitive, psychological-emotional, and social areas. In the final study, the construct, convergent, discriminant, and predictive validity and the reliability of the tool were verified to finalise the evaluation tool. The effects of horticultural therapy were classified into four aspects, namely, physical, cognitive, psycho-emotional, and social, based on previous studies on the effects of horticultural therapy. 98 questions across the four aspects were selected as reserve scales. The reliability of each scale was calculated as 0.982 for the physical, 0.980 for the cognitive, 0.965 for the psycho-emotional, and 0.972 for the social aspect, based on Cronbach's test of intra-item internal consistency and the Spearman-Brown split-half reliability. This study was the first to demonstrate validity and reliability by simultaneously developing four measures of horticultural therapy effectiveness, namely, physical, cognitive, psychological-emotional, and social, both locally and externally. It is especially worthwhile in that the measures can be applied in common across clients. Copyright © 2018 Elsevier Ltd. All rights reserved.
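As a point of reference for the reliability figures quoted above, Cronbach's alpha for a k-item scale is computed from the item variances and the variance of the total score. A minimal sketch with fabricated ratings (not study data) follows.

```python
import numpy as np

def cronbach_alpha(item_scores):
    """Cronbach's alpha for an (n_respondents x n_items) score matrix."""
    X = np.asarray(item_scores, dtype=float)
    k = X.shape[1]
    sum_item_var = X.var(axis=0, ddof=1).sum()   # sum of per-item variances
    total_var = X.sum(axis=1).var(ddof=1)        # variance of the total score
    return k / (k - 1) * (1.0 - sum_item_var / total_var)

# Example: 6 respondents rating 4 items on a 5-point scale (made-up data)
ratings = [[4, 5, 4, 5], [3, 4, 4, 4], [5, 5, 5, 4],
           [2, 3, 2, 3], [4, 4, 5, 5], [3, 3, 3, 4]]
print(round(cronbach_alpha(ratings), 3))
```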
Lim, Hojun; Battaile, Corbett C.; Brown, Justin L.; ...
2016-06-14
In this work, we develop a tantalum strength model that incorporates effects of temperature, strain rate and pressure. Dislocation kink-pair theory is used to incorporate temperature and strain rate effects, while the pressure dependent yield is obtained through the pressure dependent shear modulus. Material constants used in the model are parameterized from tantalum single crystal tests and polycrystalline ramp compression experiments. It is shown that the proposed strength model agrees well with the temperature and strain rate dependent yield obtained from polycrystalline tantalum experiments. Furthermore, the model accurately reproduces the pressure dependent yield stresses up to 250 GPa. The proposed strength model is then used to conduct simulations of a Taylor cylinder impact test and validated with experiments. This approach provides a physically-based multi-scale strength model that is able to predict the plastic deformation of polycrystalline tantalum through a wide range of temperature, strain and pressure regimes.
Finite Element Model Development and Validation for Aircraft Fuselage Structures
NASA Technical Reports Server (NTRS)
Buehrle, Ralph D.; Fleming, Gary A.; Pappa, Richard S.; Grosveld, Ferdinand W.
2000-01-01
The ability to extend the valid frequency range for finite element based structural dynamic predictions using detailed models of the structural components and attachment interfaces is examined for several stiffened aircraft fuselage structures. This extended dynamic prediction capability is needed for the integration of mid-frequency noise control technology. Beam, plate and solid element models of the stiffener components are evaluated. Attachment models between the stiffener and panel skin range from a line along the rivets of the physical structure to a constraint over the entire contact surface. The finite element models are validated using experimental modal analysis results. The increased frequency range results in a corresponding increase in the number of modes, modal density and spatial resolution requirements. In this study, conventional modal tests using accelerometers are complemented with Scanning Laser Doppler Velocimetry and Electro-Optic Holography measurements to further resolve the spatial response characteristics. Whenever possible, component and subassembly modal tests are used to validate the finite element models at lower levels of assembly. Normal mode predictions for different finite element representations of components and assemblies are compared with experimental results to assess the most accurate techniques for modeling aircraft fuselage type structures.
NASA Astrophysics Data System (ADS)
Lorente, Pablo; Sotillo, Marcos G.; Gutknecht, Elodie; Dabrowski, Tomasz; Aouf, Lotfi; Toledano, Cristina; Amo-Baladron, Arancha; Aznar, Roland; De Pascual, Alvaro; Levier, Bruno; Bowyer, Peter; Rainaud, Romain; Alvarez-Fanjul, Enrique
2017-04-01
The IBI-MFC (Iberia-Biscay-Ireland Monitoring & Forecasting Centre) has been providing daily ocean model estimates and forecasts of diverse physical parameters for the IBI regional seas since 2011, first in the frame of the MyOcean projects and later as part of the Copernicus Marine Environment Monitoring Service (CMEMS). By April 2017, coincident with the V3 CMEMS Service Release, the IBI-MFC will extend its near real time (NRT) forecast capabilities. Two new operational IBI forecast systems will be run to generate high resolution biogeochemical (BIO) and wave (WAV) products for the IBI area. The IBI-NRT-BIO forecast system, based on a 1/36° NEMO-PISCES model application, is run once a week, coupled with the IBI physical forecast solution and nested in the CMEMS GLOBAL-BIO solution. The IBI-NRT-WAV system, based on a MeteoFrance-WAM 10 km resolution model application, runs twice a day using ECMWF wind forcing. Among other novelties related to the evolution of the IBI physical (PHY) solution, it is worth mentioning the provision, as part of the daily updated IBI-NRT-PHY product, of three-dimensional hourly data for specific areas within the IBI domain. The delivery of these new hourly data along the whole water column follows requests from IBI users, in order to foster downscaling approaches by providing coherent open boundary conditions to any potential high-resolution coastal model nested in the IBI regional solution. An extensive skill assessment of the IBI-NRT forecast products has been conducted through the NARVAL (North Atlantic Regional VALidation) web tool, by means of the automatic computation of statistical metrics and quality indicators. To date, this tool has focused on the validation of the IBI-NRT-PHY system. NARVAL is now undergoing a significant upgrade to validate the aforementioned new biogeochemical and wave IBI products. To this aim, satellite-derived observations of chlorophyll and significant wave height will be used, together with in-situ wave parameters measured by mooring buoys. Within this validation framework, special emphasis has been placed on the intercomparison of different forecast model solutions in overlapping areas in order to evaluate the models' performance and prognostic capabilities. This common uncertainty estimation for IBI and other model solutions is currently performed by NARVAL using both CMEMS forecast model sources (i.e. GLOBAL-MFC, MED-MFC and NWS-MFC) and non-CMEMS operational forecast solutions (mostly downstream applications nested in the IBI solution). With respect to the IBI multi-year (MY) products, it is worth mentioning that the current biogeochemical and physical reanalysis products will be re-run during 2017, extending their time coverage backwards to 1992. Based on these IBI-MY products, a variety of climate indicators related to essential oceanographic processes (i.e. western coastal upwelling or the Mediterranean Outflow Water) are currently being computed.
Hulteen, Ryan M; Lander, Natalie J; Morgan, Philip J; Barnett, Lisa M; Robertson, Samuel J; Lubans, David R
2015-10-01
It has been suggested that young people should develop competence in a variety of 'lifelong physical activities' to ensure that they can be active across the lifespan. The primary aim of this systematic review is to report the methodological properties, validity, reliability, and test duration of field-based measures that assess movement skill competency in lifelong physical activities. A secondary aim was to clearly define those characteristics unique to lifelong physical activities. A search of four electronic databases (Scopus, SPORTDiscus, ProQuest, and PubMed) was conducted between June 2014 and April 2015 with no date restrictions. Studies addressing the validity and/or reliability of lifelong physical activity tests were reviewed. Included articles were required to assess lifelong physical activities using process-oriented measures, as well as report either one type of validity or reliability. Assessment criteria for methodological quality were adapted from a checklist used in a previous review of sport skill outcome assessments. Movement skill assessments for eight different lifelong physical activities (badminton, cycling, dance, golf, racquetball, resistance training, swimming, and tennis) in 17 studies were identified for inclusion. Methodological quality, validity, reliability, and test duration (time to assess a single participant), for each article were assessed. Moderate to excellent reliability results were found in 16 of 17 studies, with 71% reporting inter-rater reliability and 41% reporting intra-rater reliability. Only four studies in this review reported test-retest reliability. Ten studies reported validity results; content validity was cited in 41% of these studies. Construct validity was reported in 24% of studies, while criterion validity was only reported in 12% of studies. Numerous assessments for lifelong physical activities may exist, yet only assessments for eight lifelong physical activities were included in this review. Generalizability of results may be more applicable if more heterogeneous samples are used in future research. Moderate to excellent levels of inter- and intra-rater reliability were reported in the majority of studies. However, future work should look to establish test-retest reliability. Validity was less commonly reported than reliability, and further types of validity other than content validity need to be established in future research. Specifically, predictive validity of 'lifelong physical activity' movement skill competency is needed to support the assertion that such activities provide the foundation for a lifetime of activity.
NASA Astrophysics Data System (ADS)
Lee, Bo Mi; Loh, Kenneth J.
2017-04-01
Carbon nanotubes can be randomly deposited in polymer thin film matrices to form nanocomposite strain sensors. However, a computational framework that enables the direct design of these nanocomposite thin films is still lacking. The objective of this study is to derive an experimentally validated, two-dimensional numerical model of carbon nanotube-based thin film strain sensors. This study consisted of two parts. First, multi-walled carbon nanotube (MWCNT)-Pluronic strain sensors were fabricated using vacuum filtration, and their physical, electrical, and electromechanical properties were evaluated. Second, scanning electron microscope images of the films were used for identifying topological features of the percolated MWCNT network, and the information obtained was then utilized for developing the numerical model. Validation of the numerical model was achieved by ensuring that the area ratios (of MWCNTs relative to the polymer matrix) were equivalent for the experimental and modeled cases. The strain sensing behavior of the percolation-based model was simulated and then compared to experimental test results.
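The area-ratio check used to tie the model to the fabricated films can be approximated from a binarized micrograph: pixels above an intensity threshold are counted as nanotube and the rest as matrix. The fragment below sketches that computation on a synthetic image; the threshold and the random image are placeholders, not the study's SEM data.

```python
import numpy as np

# Synthetic stand-in for a grayscale SEM image of the film (0-255).
rng = np.random.default_rng(0)
sem_image = rng.integers(0, 256, size=(512, 512))

# Pixels brighter than the chosen threshold are treated as MWCNT; the rest
# as polymer matrix. The area ratio is the nanotube pixel fraction.
threshold = 180
cnt_mask = sem_image >= threshold
area_ratio = cnt_mask.mean()
print(f"MWCNT area fraction: {area_ratio:.3f}")
```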
NASA Technical Reports Server (NTRS)
Thompson, David E.
2005-01-01
Procedures and methods for verification of coding algebra and for validations of models and calculations used in the aerospace computational fluid dynamics (CFD) community would be efficacious if used by the glacier dynamics modeling community. This paper presents some of those methods, and how they might be applied to uncertainty management supporting code verification and model validation for glacier dynamics. The similarities and differences between their use in CFD analysis and the proposed application of these methods to glacier modeling are discussed. After establishing sources of uncertainty and methods for code verification, the paper looks at a representative sampling of verification and validation efforts that are underway in the glacier modeling community, and establishes a context for these within an overall solution quality assessment. Finally, a vision of a new information architecture and interactive scientific interface is introduced and advocated.
An integrated physiology model to study regional lung damage effects and the physiologic response
2014-01-01
Background This work expands upon a previously developed exercise dynamic physiology model (DPM) with the addition of an anatomic pulmonary system in order to quantify the impact of lung damage on oxygen transport and physical performance decrement. Methods A pulmonary model is derived with an anatomic structure based on morphometric measurements, accounting for heterogeneous ventilation and perfusion observed experimentally. The model is incorporated into an existing exercise physiology model; the combined system is validated using human exercise data. Pulmonary damage from blast, blunt trauma, and chemical injury is quantified in the model based on lung fluid infiltration (edema) which reduces oxygen delivery to the blood. The pulmonary damage component is derived and calibrated based on published animal experiments; scaling laws are used to predict the human response to lung injury in terms of physical performance decrement. Results The augmented dynamic physiology model (DPM) accurately predicted the human response to hypoxia, altitude, and exercise observed experimentally. The pulmonary damage parameters (shunt and diffusing capacity reduction) were fit to experimental animal data obtained in blast, blunt trauma, and chemical damage studies which link lung damage to lung weight change; the model is able to predict the reduced oxygen delivery in damage conditions. The model accurately estimates physical performance reduction with pulmonary damage. Conclusions We have developed a physiologically-based mathematical model to predict performance decrement endpoints in the presence of thoracic damage; simulations can be extended to estimate human performance and escape in extreme situations. PMID:25044032
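Although the paper's physiology model is far more detailed, the way a shunt fraction degrades arterial oxygen content can be illustrated with the standard venous-admixture mixing relation; the function and the numbers below are illustrative only, not parameters from the model.

```python
def arterial_o2_content(cc_o2, cv_o2, shunt_frac):
    """Mix end-capillary and mixed-venous O2 content through a shunt.

    cc_o2      : end-capillary O2 content (mL O2 / dL blood)
    cv_o2      : mixed-venous O2 content  (mL O2 / dL blood)
    shunt_frac : fraction of cardiac output bypassing gas exchange (0-1)
    """
    return (1.0 - shunt_frac) * cc_o2 + shunt_frac * cv_o2

# Roughly: a lightly damaged lung (~5% shunt) vs. an edematous lung (~30% shunt)
print(arterial_o2_content(20.0, 15.0, 0.05))   # ~19.75 mL/dL
print(arterial_o2_content(20.0, 15.0, 0.30))   # ~18.5  mL/dL
```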
Evidence base and future research directions in the management of low back pain.
Abbott, Allan
2016-03-18
Low back pain (LBP) is a prevalent and costly condition. Awareness of valid and reliable patient history taking, physical examination and clinical testing is important for diagnostic accuracy. Stratified care which targets treatment to patient subgroups based on key characteristics is reliant upon accurate diagnostics. Models of stratified care that can potentially improve treatment effects include prognostic risk profiling for persistent LBP, likely response to specific treatment based on clinical prediction models or suspected underlying causal mechanisms. The focus of this editorial is to highlight current research status and future directions for LBP diagnostics and stratified care.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Habte, A.; Sengupta, M.; Wilcox, S.
Models to compute Global Horizontal Irradiance (GHI) and Direct Normal Irradiance (DNI) have been in development over the last 3 decades. These models can be classified as empirical or physical, based on the approach. Empirical models relate ground-based observations with satellite measurements and use these relations to compute surface radiation. Physical models consider the radiation received from the earth at the satellite and create retrievals to estimate surface radiation. While empirical methods have traditionally been used for computing surface radiation for the solar energy industry, the advent of faster computing has made operational physical models viable. The Global Solar Insolation Project (GSIP) is an operational physical model from NOAA that computes GHI using the visible and infrared channel measurements from the GOES satellites. GSIP uses a two-stage scheme that first retrieves cloud properties and then uses those properties in a radiative transfer model to calculate surface radiation. NREL, the University of Wisconsin and NOAA have recently collaborated to adapt GSIP to create a 4 km GHI and DNI product every 30 minutes. This paper presents an outline of the methodology and a comprehensive validation using high quality ground-based solar data from the National Oceanic and Atmospheric Administration (NOAA) Surface Radiation (SURFRAD) (http://www.srrb.noaa.gov/surfrad/sitepage.html) and Integrated Surface Insolation Study (ISIS) (http://www.srrb.noaa.gov/isis/isissites.html) networks, the Solar Radiation Research Laboratory (SRRL) at the National Renewable Energy Laboratory (NREL), and Sun Spot One (SS1) stations.
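For readers outside the solar-resource field, the two validated quantities are tied together through the diffuse horizontal irradiance (DHI) and the solar zenith angle θ_z by the standard closure relation, which is why models and ground stations are typically checked against both components:

```latex
\mathrm{GHI} \;=\; \mathrm{DHI} \;+\; \mathrm{DNI}\,\cos\theta_{z} .
```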
ERIC Educational Resources Information Center
Moore, Delilah S.; Ellis, Rebecca; Allen, Priscilla D.; Cherry, Katie E.; Monroe, Pamela A.; O'Neil, Carol E.; Wood, Robert H.
2008-01-01
The purpose of this study was to establish validity evidence of four physical activity (PA) questionnaires in culturally diverse older adults by comparing self-report PA with performance-based physical function. Participants were 54 older adults who completed the Continuous Scale Physical Functional Performance 10-item Test (CS-PFP10), Physical…
IMPLEMENTATION AND VALIDATION OF A FULLY IMPLICIT ACCUMULATOR MODEL IN RELAP-7
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhao, Haihua; Zou, Ling; Zhang, Hongbin
2016-01-01
This paper presents the implementation and validation of an accumulator model in RELAP-7 under the framework of the preconditioned Jacobian-free Newton Krylov (JFNK) method, based on a similar model used in RELAP5. RELAP-7 is a new nuclear reactor system safety analysis code being developed at the Idaho National Laboratory (INL). RELAP-7 is a fully implicit system code. The JFNK and preconditioning methods used in RELAP-7 are briefly discussed. The slightly modified accumulator model is summarized for completeness. The implemented model was validated with the LOFT L3-1 test and benchmarked against RELAP5 results. RELAP-7 and RELAP5 had almost identical results for the accumulator gas pressure and water level, although there were some minor differences in other parameters such as accumulator gas temperature and tank wall temperature. One advantage of the JFNK method is the ease of maintaining and modifying models due to the full separation of numerical methods from physical models. It would be straightforward to extend the current RELAP-7 accumulator model to simulate the advanced accumulator design.
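The abstract relies on the JFNK machinery without showing it; the essential idea is that the Newton linear system is solved with a Krylov method that never forms the Jacobian, only Jacobian-vector products approximated by finite differences of the residual. The stripped-down sketch below (plain Python/SciPy, no preconditioning) illustrates that idea and is in no way RELAP-7 code.

```python
import numpy as np
from scipy.sparse.linalg import LinearOperator, gmres

def jfnk_solve(residual, u0, tol=1e-8, max_newton=20, fd_eps=1e-7):
    """Minimal Jacobian-free Newton-Krylov solver for residual(u) = 0."""
    u = np.asarray(u0, dtype=float)
    for _ in range(max_newton):
        F = residual(u)
        if np.linalg.norm(F) < tol:
            break
        # Jacobian-vector product via a finite difference of the residual:
        #   J(u) v  ~  [F(u + eps*v) - F(u)] / eps
        def jv(v, u=u, F=F):
            eps = fd_eps * (1.0 + np.linalg.norm(u)) / max(np.linalg.norm(v), 1e-30)
            return (residual(u + eps * v) - F) / eps
        J = LinearOperator((u.size, u.size), matvec=jv)
        du, _ = gmres(J, -F)      # Krylov solve of J du = -F (no preconditioner here)
        u = u + du
    return u

# Example: solve x^2 + y - 3 = 0 and x + y^2 - 5 = 0 (root near x=1, y=2)
res = lambda z: np.array([z[0]**2 + z[1] - 3.0, z[0] + z[1]**2 - 5.0])
print(jfnk_solve(res, [1.0, 1.0]))
```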
Modelling human skull growth: a validated computational model
Marghoub, Arsalan; Johnson, David; Khonsari, Roman H.; Fagan, Michael J.; Moazen, Mehran
2017-01-01
During the first year of life, the brain grows rapidly and the neurocranium increases to about 65% of its adult size. Our understanding of the relationship between the biomechanical forces, especially from the growing brain, the craniofacial soft tissue structures and the individual bone plates of the skull vault is still limited. This basic knowledge could help in the future planning of craniofacial surgical operations. The aim of this study was to develop a validated computational model of skull growth, based on the finite-element (FE) method, to help understand the biomechanics of skull growth. To do this, a two-step validation study was carried out. First, an in vitro physical three-dimensional printed model and an in silico FE model were created from the same micro-CT scan of an infant skull and loaded with forces from the growing brain from zero to two months of age. The results from the in vitro model validated the FE model before it was further developed to expand from 0 to 12 months of age. This second FE model was compared directly with in vivo clinical CT scans of infants without craniofacial conditions (n = 56). The various models were compared in terms of predicted skull width, length and circumference, while the overall shape was quantified using three-dimensional distance plots. Statistical analysis yielded no significant differences between the male skull models. All size measurements from the FE model versus the in vitro physical model were within 5%, with one exception showing a 7.6% difference. The FE model and in vivo data also correlated well, with the largest percentage difference in size being 8.3%. Overall, the FE model results matched well with both the in vitro and in vivo data. With further development and model refinement, this modelling method could be used to assist in preoperative planning of craniofacial surgery procedures and could help to reduce reoperation rates. PMID:28566514
Modelling human skull growth: a validated computational model.
Libby, Joseph; Marghoub, Arsalan; Johnson, David; Khonsari, Roman H; Fagan, Michael J; Moazen, Mehran
2017-05-01
During the first year of life, the brain grows rapidly and the neurocranium increases to about 65% of its adult size. Our understanding of the relationship between the biomechanical forces, especially from the growing brain, the craniofacial soft tissue structures and the individual bone plates of the skull vault is still limited. This basic knowledge could help in the future planning of craniofacial surgical operations. The aim of this study was to develop a validated computational model of skull growth, based on the finite-element (FE) method, to help understand the biomechanics of skull growth. To do this, a two-step validation study was carried out. First, an in vitro physical three-dimensional printed model and an in silico FE model were created from the same micro-CT scan of an infant skull and loaded with forces from the growing brain from zero to two months of age. The results from the in vitro model validated the FE model before it was further developed to expand from 0 to 12 months of age. This second FE model was compared directly with in vivo clinical CT scans of infants without craniofacial conditions ( n = 56). The various models were compared in terms of predicted skull width, length and circumference, while the overall shape was quantified using three-dimensional distance plots. Statistical analysis yielded no significant differences between the male skull models. All size measurements from the FE model versus the in vitro physical model were within 5%, with one exception showing a 7.6% difference. The FE model and in vivo data also correlated well, with the largest percentage difference in size being 8.3%. Overall, the FE model results matched well with both the in vitro and in vivo data. With further development and model refinement, this modelling method could be used to assist in preoperative planning of craniofacial surgery procedures and could help to reduce reoperation rates. © 2017 The Author(s).
Data-Driven Residential Load Modeling and Validation in GridLAB-D
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gotseff, Peter; Lundstrom, Blake
Accurately characterizing the impacts of high penetrations of distributed energy resources (DER) on the electric distribution system has driven modeling methods from traditional static snapshots, often representing a critical point in time (e.g., summer peak load), to quasi-static time series (QSTS) simulations capturing all the effects of variable DER, associated controls and, hence, impacts on the distribution system over a given time period. Unfortunately, the high-time-resolution DER source and load data required for model inputs are often scarce or non-existent. This paper presents work performed within the GridLAB-D model environment to synthesize, calibrate, and validate 1-second residential load models based on measured transformer loads and physics-based models suitable for QSTS electric distribution system modeling. The modeling and validation approach taken was to create a typical GridLAB-D model home that, when replicated to represent multiple diverse houses on a single transformer, creates a statistically similar load to a measured load for a given weather input. The model homes are constructed to represent the range of actual homes on an instrumented transformer: square footage, thermal integrity, heating and cooling system definition as well as realistic occupancy schedules. House model calibration and validation was performed using the distribution transformer load data and corresponding weather. The modeled loads were found to be similar to the measured loads for four evaluation metrics: 1) daily average energy, 2) daily average and standard deviation of power, 3) power spectral density, and 4) load shape.
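A minimal sketch of the four evaluation metrics listed above, applied to a measured and a modeled transformer load series. The array names, units and sampling rate are illustrative assumptions, not GridLAB-D outputs, and the series is assumed to span an integer number of days.

```python
# Sketch: compare measured vs. modeled load on the paper's four metrics.
import numpy as np
from scipy.signal import welch

def compare_loads(p_meas, p_model, dt_s=1.0):
    """p_meas, p_model: 1-D power arrays (kW) sampled every dt_s seconds."""
    samples_per_day = int(86400 / dt_s)
    daily_energy = lambda p: p.reshape(-1, samples_per_day).sum(axis=1) * dt_s / 3600.0
    f_m, psd_m = welch(p_meas, fs=1.0 / dt_s)     # power spectral density, measured
    f_s, psd_s = welch(p_model, fs=1.0 / dt_s)    # power spectral density, modeled
    return {
        "daily_energy_kWh": (daily_energy(p_meas).mean(), daily_energy(p_model).mean()),
        "power_mean_kW": (p_meas.mean(), p_model.mean()),
        "power_std_kW": (p_meas.std(), p_model.std()),
        "psd_corr": np.corrcoef(np.log(psd_m), np.log(psd_s))[0, 1],
        "load_shape_corr": np.corrcoef(p_meas, p_model)[0, 1],
    }
```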
Modeling and Validation of Microwave Ablations with Internal Vaporization
Chiang, Jason; Birla, Sohan; Bedoya, Mariajose; Jones, David; Subbiah, Jeyam; Brace, Christopher L.
2014-01-01
Numerical simulation is increasingly being utilized for computer-aided design of treatment devices, analysis of ablation growth, and clinical treatment planning. Simulation models to date have incorporated electromagnetic wave propagation and heat conduction, but not other relevant physics such as water vaporization and mass transfer. Such physical changes are particularly noteworthy during the intense heat generation associated with microwave heating. In this work, a numerical model was created that integrates microwave heating with water vapor generation and transport by using porous media assumptions in the tissue domain. The heating physics of the water vapor model was validated through temperature measurements taken at locations 5, 10 and 20 mm away from the heating zone of the microwave antenna in homogenized ex vivo bovine liver setup. Cross-sectional area of water vapor transport was validated through intra-procedural computed tomography (CT) during microwave ablations in homogenized ex vivo bovine liver. Iso-density contours from CT images were compared to vapor concentration contours from the numerical model at intermittent time points using the Jaccard Index. In general, there was an improving correlation in ablation size dimensions as the ablation procedure proceeded, with a Jaccard Index of 0.27, 0.49, 0.61, 0.67 and 0.69 at 1, 2, 3, 4, and 5 minutes. This study demonstrates the feasibility and validity of incorporating water vapor concentration into thermal ablation simulations and validating such models experimentally. PMID:25330481
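A minimal sketch of the Jaccard Index comparison used above: the overlap between a CT-derived iso-density mask and the simulated vapor-concentration mask at one time point. Both inputs are assumed to be boolean arrays on the same grid; the threshold names are illustrative.

```python
# Sketch: Jaccard Index between an observed and a simulated ablation region.
import numpy as np

def jaccard_index(ct_mask: np.ndarray, model_mask: np.ndarray) -> float:
    """Intersection over union of two boolean masks on the same grid."""
    intersection = np.logical_and(ct_mask, model_mask).sum()
    union = np.logical_or(ct_mask, model_mask).sum()
    return float(intersection) / float(union) if union else 1.0

# e.g. thresholding produces the masks (hypothetical variable names):
# ct_mask = ct_hounsfield < iso_density_threshold
# model_mask = vapor_concentration > concentration_threshold
```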
Development of Cell Models as a Basis for Bioreactor Design for Genetically Modified Bacteria
1986-10-30
(Fragmentary abstract, recovered from OCR snippets.) The report discusses prediction of future behavior based on specifying the current state vector (essentially values for all variables in the model); deterministic models become increasingly valid as the population size grows, and a total population greater than 10,000 is generally sufficient. Later fragments define parasitism (one organism converting the host's biomaterial or activities into its own) and symbiosis (organisms A and B in physical contact).
High-resolution coupled physics solvers for analysing fine-scale nuclear reactor design problems
Mahadevan, Vijay S.; Merzari, Elia; Tautges, Timothy; ...
2014-06-30
An integrated multi-physics simulation capability for the design and analysis of current and future nuclear reactor models is being investigated, to tightly couple neutron transport and thermal-hydraulics physics under the SHARP framework. Over several years, high-fidelity, validated mono-physics solvers with proven scalability on petascale architectures have been developed independently. Based on a unified component-based architecture, these existing codes can be coupled with a mesh-data backplane and a flexible coupling-strategy-based driver suite to produce a viable tool for analysts. The goal of the SHARP framework is to perform fully resolved coupled physics analysis of a reactor on heterogeneous geometry, in order to reduce the overall numerical uncertainty while leveraging available computational resources. Finally, the coupling methodology and software interfaces of the framework are presented, along with verification studies on two representative fast sodium-cooled reactor demonstration problems to prove the usability of the SHARP framework.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cohen, O.
The development of the Zeeman–Doppler Imaging (ZDI) technique has provided synoptic observations of surface magnetic fields of low-mass stars. This led the stellar astrophysics community to adopt modeling techniques that have been used in solar physics using solar magnetograms. However, many of these techniques have been neglected by the solar community due to their failure to reproduce solar observations. Nevertheless, some of these techniques are still used to simulate the coronae and winds of solar analogs. Here we present a comparative study between two MHD models for the solar corona and solar wind. The first type of model is a polytropic wind model, and the second is the physics-based AWSOM model. We show that while the AWSOM model consistently reproduces many solar observations, the polytropic model fails to reproduce many of them, and in the cases where it does, its solutions are unphysical. Our recommendation is that polytropic models, which are used to estimate mass-loss rates and other parameters of solar analogs, must first be calibrated with solar observations. Alternatively, these models can be calibrated with models that capture more detailed physics of the solar corona (such as the AWSOM model) and that can reproduce solar observations in a consistent manner. Without such a calibration, the results of the polytropic models cannot be validated, but they can be wrongly used by others.
Zammit, Andrea R; Hall, Charles B; Lipton, Richard B; Katz, Mindy J; Muniz-Terrera, Graciela
2018-05-01
The aim of this study was to identify natural subgroups of older adults based on cognitive performance, and to establish each subgroup's characteristics based on demographic factors, physical function, psychosocial well-being, and comorbidity. We applied latent class (LC) modeling to identify subgroups in baseline assessments of 1345 Einstein Aging Study (EAS) participants free of dementia. The EAS is a community-dwelling cohort study of 70+ year-old adults living in the Bronx, NY. We used 10 neurocognitive tests and 3 covariates (age, sex, education) to identify latent subgroups. We used goodness-of-fit statistics to identify the optimal class solution and assess model adequacy. We also validated our model using two-fold split-half cross-validation. The sample had a mean age of 78.0 (SD=5.4) and a mean of 13.6 years of education (SD=3.5). A 9-class solution based on cognitive performance at baseline was the best-fitting model. We characterized the 9 identified classes as (i) disadvantaged, (ii) poor language, (iii) poor episodic memory and fluency, (iv) poor processing speed and executive function, (v) low average, (vi) high average, (vii) average, (viii) poor executive and poor working memory, (ix) elite. The cross validation indicated stable class assignment with the exception of the average and high average classes. LC modeling in a community sample of older adults revealed 9 cognitive subgroups. Assignment of subgroups was reliable and associated with external validators. Future work will test the predictive validity of these groups for outcomes such as Alzheimer's disease, vascular dementia and death, as well as markers of biological pathways that contribute to cognitive decline. (JINS, 2018, 24, 511-523).
ERIC Educational Resources Information Center
Rivet, Ann E.; Kastens, Kim A.
2012-01-01
In recent years, science education has placed increasing importance on learners' mastery of scientific reasoning. This growing emphasis presents a challenge for both developers and users of assessments. We report on our effort around the conceptualization, development, and testing the validity of an assessment of students' ability to reason around…
NASA Astrophysics Data System (ADS)
Alexander, M. Joan; Stephan, Claudia
2015-04-01
In climate models, gravity waves remain too poorly resolved to be directly modelled. Instead, simplified parameterizations are used to include gravity wave effects on model winds. A few climate models link some of the parameterized waves to convective sources, providing a mechanism for feedback between changes in convection and gravity wave-driven changes in circulation in the tropics and above high-latitude storms. These convective wave parameterizations are based on limited case studies with cloud-resolving models, but they are poorly constrained by observational validation, and tuning parameters have large uncertainties. Our new work distills results from complex, full-physics cloud-resolving model studies to essential variables for gravity wave generation. We use the Weather Research and Forecasting (WRF) model to study relationships between precipitation, latent heating/cooling, and other cloud properties and the spectrum of gravity wave momentum flux above midlatitude storm systems. Results show the gravity wave spectrum is surprisingly insensitive to the representation of microphysics in WRF. This is good news for use of these models for gravity wave parameterization development since microphysical properties are a key uncertainty. We further use the full-physics cloud-resolving model as a tool to directly link observed precipitation variability to gravity wave generation. We show that waves in an idealized model forced with radar-observed precipitation can quantitatively reproduce instantaneous satellite-observed features of the gravity wave field above storms, which is a powerful validation of our understanding of waves generated by convection. The idealized model directly links observations of surface precipitation to observed waves in the stratosphere, and the simplicity of the model permits deep/large-area domains for studies of wave-mean flow interactions. This unique validated model tool permits quantitative studies of gravity wave driving of regional circulation and provides a new method for future development of realistic convective gravity wave parameterizations.
Nigg, Claudio R; Motl, Robert W; Horwath, Caroline; Dishman, Rod K
2012-01-01
Objectives: Physical activity (PA) research applying the Transtheoretical Model (TTM) to examine group differences and/or change over time requires preliminary evidence of factorial validity and invariance. The current study examined the factorial validity and longitudinal invariance of TTM constructs recently revised for PA. Method: Participants from an ethnically diverse sample in Hawaii (N=700) completed questionnaires capturing each TTM construct. Results: Factorial validity was confirmed for each construct using confirmatory factor analysis with full-information maximum likelihood. Longitudinal invariance was evidenced across a shorter (3-month) and longer (6-month) time period via nested model comparisons. Conclusions: The questionnaires for each validated TTM construct are provided, and can now be generalized across similar subgroups and time points. Further validation of the provided measures is suggested in additional populations and across extended time points. PMID:22778669
Wada, Tomoki; Yasunaga, Hideo; Yamana, Hayato; Matsui, Hiroki; Fushimi, Kiyohide; Morimura, Naoto
2018-03-01
There was no established disability predictive measurement for patients with trauma that could be used in administrative claims databases. The aim of the present study was to develop and validate a diagnosis-based disability predictive index for severe physical disability at discharge using the International Classification of Diseases, 10th revision (ICD-10) coding. This retrospective observational study used the Diagnosis Procedure Combination database in Japan. Patients who were admitted to hospitals with trauma and discharged alive from 01 April 2010 to 31 March 2015 were included. Pediatric patients under 15 years old were excluded. Data for patients admitted to hospitals from 01 April 2010 to 31 March 2013 was used for development of a disability predictive index (derivation cohort), while data for patients admitted to hospitals from 01 April 2013 to 31 March 2015 was used for the internal validation (validation cohort). The outcome of interest was severe physical disability defined as the Barthel Index score of <60 at discharge. Trauma-related ICD-10 codes were categorized into 36 injury groups with reference to the categorization used in the Global Burden of Diseases study 2013. A multivariable logistic regression analysis was performed for the outcome using the injury groups and patient baseline characteristics including patient age, sex, and Charlson Comorbidity Index (CCI) score in the derivation cohort. A score corresponding to a regression coefficient was assigned to each injury group. The disability predictive index for each patient was defined as the sum of the scores. The predictive performance of the index was validated using the receiver operating characteristic curve analysis in the validation cohort. The derivation cohort included 1,475,158 patients, while the validation cohort included 939,659 patients. Of the 939,659 patients, 235,382 (25.0%) were discharged with severe physical disability. The c-statistics of the disability predictive index was 0.795 (95% confidence interval [CI] 0.794-0.795), while that of a model using the disability predictive index and patient baseline characteristics was 0.856 (95% CI 0.855-0.857). Severe physical disability at discharge may be well predicted with patient age, sex, CCI score, and the diagnosis-based disability predictive index in patients admitted to hospitals with trauma. Copyright © 2018 Elsevier Ltd. All rights reserved.
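A minimal sketch of the index construction and validation workflow described above: fit a logistic model on injury-group indicators in the derivation cohort, convert coefficients into integer scores, sum the scores per patient, and check the c-statistic (ROC AUC) in the validation cohort. Variable names, and rounding coefficients directly to scores, are assumptions for illustration rather than the published procedure.

```python
# Sketch: claims-based disability predictive index and its c-statistic.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

def build_and_validate_index(X_derive, y_derive, X_valid, y_valid):
    """X_*: binary arrays (patients x injury groups); y_*: severe-disability flags."""
    model = LogisticRegression(max_iter=1000).fit(X_derive, y_derive)
    scores = np.round(model.coef_.ravel()).astype(int)   # one score per injury group
    index_valid = X_valid @ scores                        # sum of scores per patient
    return scores, roc_auc_score(y_valid, index_valid)    # c-statistic on validation cohort
```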
Validation of Heat Transfer Thermal Decomposition and Container Pressurization of Polyurethane Foam.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Scott, Sarah Nicole; Dodd, Amanda B.; Larsen, Marvin E.
Polymer foam encapsulants provide mechanical, electrical, and thermal isolation in engineered systems. In fire environments, gas pressure from thermal decomposition of polymers can cause mechanical failure of sealed systems. In this work, a detailed uncertainty quantification study of PMDI-based polyurethane foam is presented to assess the validity of the computational model. Both experimental measurement uncertainty and model prediction uncertainty are examined and compared. Both the mean value method and the Latin hypercube sampling approach are used to propagate the uncertainty through the model. In addition to comparing computational and experimental results, the importance of each input parameter on the simulation result is also investigated. These results show that further development in the physics model of the foam and appropriate associated material testing are necessary to improve model accuracy.
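A minimal sketch of propagating input uncertainty with Latin hypercube sampling, in the spirit of the study above. The foam model itself is replaced by a placeholder callable, and the choice of normal input distributions is an assumption for illustration.

```python
# Sketch: Latin hypercube propagation of parameter uncertainty through a model.
import numpy as np
from scipy.stats import qmc, norm

def propagate_lhs(model, means, stds, n_samples=256, seed=0):
    """model: callable mapping a parameter vector to a scalar output (e.g., peak pressure)."""
    d = len(means)
    sampler = qmc.LatinHypercube(d=d, seed=seed)
    u = sampler.random(n_samples)               # stratified uniform samples in [0, 1)^d
    x = norm.ppf(u, loc=means, scale=stds)      # map to assumed normal input distributions
    y = np.array([model(row) for row in x])
    return y.mean(), y.std(), np.percentile(y, [2.5, 97.5])
```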
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schimpe, Michael; von Kuepach, M. E.; Naumann, M.
For reliable lifetime predictions of lithium-ion batteries, models for cell degradation are required. A comprehensive semi-empirical model based on a reduced set of internal cell parameters and physically justified degradation functions for the capacity loss is developed and presented for a commercial lithium iron phosphate/graphite cell. One calendar and several cycle aging effects are modeled separately. Emphasis is placed on the varying degradation at different temperatures. Degradation mechanisms for cycle aging at high and low temperatures as well as the increased cycling degradation at high state of charge are calculated separately. For parameterization, a lifetime test study is conducted, including storage and cycle tests. Additionally, the model is validated through a dynamic current profile based on real-world application in a stationary energy storage system, revealing the accuracy. Tests for validation are continued for up to 114 days after the longest parameterization tests. In conclusion, the model error for the cell capacity loss in the application-based tests is at the end of testing below 1% of the original cell capacity and the maximum relative model error is below 21%.
Schimpe, Michael; von Kuepach, M. E.; Naumann, M.; ...
2018-01-12
For reliable lifetime predictions of lithium-ion batteries, models for cell degradation are required. A comprehensive semi-empirical model based on a reduced set of internal cell parameters and physically justified degradation functions for the capacity loss is developed and presented for a commercial lithium iron phosphate/graphite cell. One calendar and several cycle aging effects are modeled separately. Emphasis is placed on the varying degradation at different temperatures. Degradation mechanisms for cycle aging at high and low temperatures as well as the increased cycling degradation at high state of charge are calculated separately. For parameterization, a lifetime test study is conducted, including storage and cycle tests. Additionally, the model is validated through a dynamic current profile based on real-world application in a stationary energy storage system, revealing the accuracy. Tests for validation are continued for up to 114 days after the longest parameterization tests. In conclusion, the model error for the cell capacity loss in the application-based tests is at the end of testing below 1% of the original cell capacity and the maximum relative model error is below 21%.
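A minimal sketch of a semi-empirical capacity-loss model in the general spirit of the one described above: a calendar term growing with the square root of time and a cycling term growing with charge throughput, each with an Arrhenius-type temperature dependence. All coefficients and functional forms here are illustrative assumptions, not the published parameterization.

```python
# Sketch: generic semi-empirical capacity fade = calendar term + cycling term.
import numpy as np

R_GAS = 8.314  # J/(mol K)

def capacity_loss(t_days, fec, temp_K,
                  k_cal=2e-3, Ea_cal=25_000.0,
                  k_cyc=1e-4, Ea_cyc=30_000.0, T_ref=298.15):
    """Fractional capacity loss after t_days of storage and fec full equivalent cycles."""
    arrh = lambda Ea: np.exp(-Ea / R_GAS * (1.0 / temp_K - 1.0 / T_ref))  # Arrhenius factor
    q_cal = k_cal * arrh(Ea_cal) * np.sqrt(t_days)   # calendar aging, sqrt-of-time law
    q_cyc = k_cyc * arrh(Ea_cyc) * fec               # cycle aging, proportional to throughput
    return q_cal + q_cyc

# Example usage: capacity_loss(t_days=365, fec=300, temp_K=313.15)
```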
NASA Astrophysics Data System (ADS)
Li, Mingming; Li, Lin; Li, Qiang; Zou, Zongshu
2018-05-01
A filter-based Euler-Lagrange multiphase flow model is used to study the mixing behavior in a combined blowing steelmaking converter. The Euler-based volume of fluid approach is employed to simulate the top blowing, while the Lagrange-based discrete phase model, which embeds the local volume change of rising bubbles, is used for the bottom blowing. A filter-based turbulence method based on the local meshing resolution is proposed, aiming to improve the modeling of turbulent eddy viscosities. The model validity is verified through comparison with physical experiments in terms of mixing curves and mixing times. The effects of the bottom gas flow rate on bath flow and mixing behavior are investigated, and the inherent reasons for the mixing result are clarified in terms of the characteristics of bottom-blowing plumes, the interaction between plumes and top-blowing jets, and the change of bath flow structure.
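A minimal sketch of extracting a mixing time from a mixing curve, the quantity used above to compare the model with the physical experiments: the time after which the monitored tracer concentration stays within a band around its final value. The 95% criterion and the array names are common conventions assumed here, not necessarily those of the paper.

```python
# Sketch: mixing time from a tracer concentration curve (95% criterion).
import numpy as np

def mixing_time(t, c, band=0.05):
    """t: time array; c: tracer concentration at a monitoring point (same length)."""
    c_inf = c[-1]                                      # final, fully mixed value
    outside = np.abs(c - c_inf) > band * abs(c_inf)    # points still outside the band
    if not outside.any():
        return t[0]
    return t[np.flatnonzero(outside)[-1] + 1]          # first time it stays inside the band
```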
Parasitic Parameters Extraction for InP DHBT Based on EM Method and Validation up to H-Band
NASA Astrophysics Data System (ADS)
Li, Oupeng; Zhang, Yong; Wang, Lei; Xu, Ruimin; Cheng, Wei; Wang, Yuan; Lu, Haiyan
2017-05-01
This paper presents a small-signal model for an InGaAs/InP double heterojunction bipolar transistor (DHBT). Parasitic parameters of the access vias and electrode fingers are extracted by 3-D electromagnetic (EM) simulation. By analyzing the equivalent circuits of seven special structures and using the EM simulation results, the parasitic parameters are extracted systematically. Compared with a multi-port s-parameter EM model, the equivalent circuit model has clear physical meaning and avoids complex internal port settings. The model is validated on a 0.5 × 7 μm² InP DHBT up to 325 GHz. The model provides a good fit between measured and simulated multi-bias s-parameters across the full band. Finally, an H-band amplifier is designed and fabricated for further verification. The measured amplifier performance agrees closely with the model prediction, indicating that the model has good accuracy in the submillimeter-wave band.
Cern, Ahuva; Barenholz, Yechezkel; Tropsha, Alexander; Goldblum, Amiram
2014-01-10
Previously we have developed and statistically validated Quantitative Structure Property Relationship (QSPR) models that correlate drugs' structural, physical and chemical properties as well as experimental conditions with the relative efficiency of remote loading of drugs into liposomes (Cern et al., J. Control. Release 160 (2012) 147-157). Herein, these models have been used to virtually screen a large drug database to identify novel candidate molecules for liposomal drug delivery. Computational hits were considered for experimental validation based on their predicted remote loading efficiency as well as additional considerations such as availability, recommended dose and relevance to the disease. Three compounds were selected for experimental testing which were confirmed to be correctly classified by our previously reported QSPR models developed with Iterative Stochastic Elimination (ISE) and k-Nearest Neighbors (kNN) approaches. In addition, 10 new molecules with known liposome remote loading efficiency that were not used by us in QSPR model development were identified in the published literature and employed as an additional model validation set. The external accuracy of the models was found to be as high as 82% or 92%, depending on the model. This study presents the first successful application of QSPR models for the computer-model-driven design of liposomal drugs. © 2013.
Cern, Ahuva; Barenholz, Yechezkel; Tropsha, Alexander; Goldblum, Amiram
2014-01-01
Previously we have developed and statistically validated Quantitative Structure Property Relationship (QSPR) models that correlate drugs’ structural, physical and chemical properties as well as experimental conditions with the relative efficiency of remote loading of drugs into liposomes (Cern et al., Journal of Controlled Release, 160 (2012) 147–157). Herein, these models have been used to virtually screen a large drug database to identify novel candidate molecules for liposomal drug delivery. Computational hits were considered for experimental validation based on their predicted remote loading efficiency as well as additional considerations such as availability, recommended dose and relevance to the disease. Three compounds were selected for experimental testing which were confirmed to be correctly classified by our previously reported QSPR models developed with Iterative Stochastic Elimination (ISE) and k-nearest neighbors (kNN) approaches. In addition, 10 new molecules with known liposome remote loading efficiency that were not used in QSPR model development were identified in the published literature and employed as an additional model validation set. The external accuracy of the models was found to be as high as 82% or 92%, depending on the model. This study presents the first successful application of QSPR models for the computer-model-driven design of liposomal drugs. PMID:24184343
NASA Technical Reports Server (NTRS)
Melis, Matthew E.; Revilock, Duane M.; Pereira, Michael J.; Lyle, Karen H.
2009-01-01
Following the tragedy of the Orbiter Columbia (STS-107) on February 1, 2003, a major effort commenced to develop a better understanding of debris impacts and their effect on the space shuttle subsystems. An initiative to develop and validate physics-based computer models to predict damage from such impacts was a fundamental component of this effort. To develop the models it was necessary to physically characterize reinforced carbon-carbon (RCC) along with ice and foam debris materials, which could shed on ascent and impact the orbiter RCC leading edges. The validated models enabled the launch system community to use the impact analysis software LS-DYNA (Livermore Software Technology Corp.) to predict damage by potential and actual impact events on the orbiter leading edge and nose cap thermal protection systems. Validation of the material models was done through a three-level approach: Level 1--fundamental tests to obtain independent static and dynamic constitutive model properties of materials of interest, Level 2--subcomponent impact tests to provide highly controlled impact test data for the correlation and validation of the models, and Level 3--full-scale orbiter leading-edge impact tests to establish the final level of confidence for the analysis methodology. This report discusses the Level 2 test program conducted in the NASA Glenn Research Center (GRC) Ballistic Impact Laboratory with ice projectile impact tests on flat RCC panels, and presents the data observed. The Level 2 testing consisted of 54 impact tests in the NASA GRC Ballistic Impact Laboratory on 6- by 6-in. and 6- by 12-in. flat plates of RCC and evaluated three types of debris projectiles: Single-crystal, polycrystal, and "soft" ice. These impact tests helped determine the level of damage generated in the RCC flat plates by each projectile and validated the use of the ice and RCC models for use in LS-DYNA.
NASA Technical Reports Server (NTRS)
Melis, Matthew E.; Revilock, Duane M.; Pereira, Michael J.; Lyle, Karen H.
2009-01-01
Following the tragedy of the Orbiter Columbia (STS-107) on February 1, 2003, a major effort commenced to develop a better understanding of debris impacts and their effect on the space shuttle subsystems. An initiative to develop and validate physics-based computer models to predict damage from such impacts was a fundamental component of this effort. To develop the models it was necessary to physically characterize reinforced carbon-carbon (RCC) along with ice and foam debris materials, which could shed on ascent and impact the orbiter RCC leading edges. The validated models enabled the launch system community to use the impact analysis software LS-DYNA (Livermore Software Technology Corp.) to predict damage by potential and actual impact events on the orbiter leading edge and nose cap thermal protection systems. Validation of the material models was done through a three-level approach: Level 1-fundamental tests to obtain independent static and dynamic constitutive model properties of materials of interest, Level 2-subcomponent impact tests to provide highly controlled impact test data for the correlation and validation of the models, and Level 3-full-scale orbiter leading-edge impact tests to establish the final level of confidence for the analysis methodology. This report discusses the Level 2 test program conducted in the NASA Glenn Research Center (GRC) Ballistic Impact Laboratory with external tank foam impact tests on flat RCC panels, and presents the data observed. The Level 2 testing consisted of 54 impact tests in the NASA GRC Ballistic Impact Laboratory on 6- by 6-in. and 6- by 12-in. flat plates of RCC and evaluated two types of debris projectiles: BX-265 and PDL-1034 external tank foam. These impact tests helped determine the level of damage generated in the RCC flat plates by each projectile and validated the use of the foam and RCC models for use in LS-DYNA.
Tavares, Letícia Ferreira; Castro, Inês Rugani Ribeiro de; Cardoso, Letícia Oliveira; Levy, Renata Bertazzi; Claro, Rafael Moreira; Oliveira, Andreia Ferreira de
2014-09-01
This study evaluated the relative validity of physical activity indicators from the questionnaire used in the Brazilian National School-Based Health Survey (PeNSE) in the city of Rio de Janeiro, Brazil, based on a sample of 174 students. The following indicators of weekly physical activity were evaluated: ACTIVE-300MIN (≥ 300 minutes/week); ACTIVE-150MIN (≥ 150 minutes), INACTIVE (no physical activity). Additionally, indicators of sedentary behavior were also assessed, as daily screen time (TV, videogames, and computer). The results from the questionnaire were compared with three 24-hour recalls. The results of ACTIVE-300MIN, ACTIVE-150MIN, and INACTIVE generated by PeNSE showed high accuracy. These indicators performed better than those of sedentary behavior in relation to frequency estimates as well as sensitivity, specificity, and correct classification rate. The indicators of physical activity from PeNSE showed satisfactory relative validity.
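A minimal sketch of the relative-validity statistics reported above: sensitivity, specificity and the correct classification rate of a questionnaire-based binary indicator (e.g., ACTIVE-150MIN) against the reference classification built from the 24-hour recalls. Inputs are boolean arrays; names are illustrative.

```python
# Sketch: relative validity of a questionnaire indicator against a recall reference.
import numpy as np

def relative_validity(questionnaire: np.ndarray, recall_reference: np.ndarray):
    """Both arguments: boolean arrays, one entry per student."""
    tp = np.sum(questionnaire & recall_reference)
    tn = np.sum(~questionnaire & ~recall_reference)
    fp = np.sum(questionnaire & ~recall_reference)
    fn = np.sum(~questionnaire & recall_reference)
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "correct_classification": (tp + tn) / len(questionnaire),
    }
```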
NASA Astrophysics Data System (ADS)
Parkin, G.; O'Donnell, G.; Ewen, J.; Bathurst, J. C.; O'Connell, P. E.; Lavabre, J.
1996-02-01
Validation methods commonly used to test catchment models are not capable of demonstrating a model's fitness for making predictions for catchments where the catchment response is not known (including hypothetical catchments, and future conditions of existing catchments which are subject to land-use or climate change). This paper describes the first use of a new method of validation (Ewen and Parkin, 1996. J. Hydrol., 175: 583-594) designed to address these types of application; the method involves making 'blind' predictions of selected hydrological responses which are considered important for a particular application. SHETRAN (a physically based, distributed catchment modelling system) is tested on a small Mediterranean catchment. The test involves quantification of the uncertainty in four predicted features of the catchment response (continuous hydrograph, peak discharge rates, monthly runoff, and total runoff), and comparison of observations with the predicted ranges for these features. The results of this test are considered encouraging.
Retrieval of Atmospheric Particulate Matter Using Satellite Data Over Central and Eastern China
NASA Astrophysics Data System (ADS)
Chen, G. L.; Guang, J.; Li, Y.; Che, Y. H.; Gong, S. Q.
2018-04-01
Fine particulate matter (PM2.5) comprises particles with diameters less than or equal to 2.5 μm. Over the past few decades, regional air pollution composed of PM2.5 has frequently occurred over Central and Eastern China. In order to estimate the concentration, distribution and other properties of PM2.5, retrieval models built by establishing the relationship between aerosol optical depth (AOD) and PM2.5 have been widely used in many studies, including empirical models based on statistical analysis and physical models with a specified physical mechanism. Statistical empirical models cannot be extended to other areas or historical periods because of their dependence on ground-based observations and the necessary auxiliary data, which limits their wider application. In this paper, a physically based model is applied to estimate the concentration of PM2.5 over Central and Eastern China from 2007 to 2016. Ground-based PM2.5 measurements were used as reference data to validate the retrieval results. The annual variation and distribution of PM2.5 concentration over Central and Eastern China were then analysed. Results show that the annual average PM2.5 gradually increased and then decreased during 2007-2016, with the highest value in 2011.
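A minimal sketch of a physically based AOD-to-PM2.5 conversion of the general form used in such studies: surface PM2.5 is estimated from columnar AOD by dividing by an effective aerosol layer height and a hygroscopic growth factor. The growth-factor form and the scaling constant below are illustrative assumptions, not the model used in the paper.

```python
# Sketch: generic physical AOD -> PM2.5 conversion.
import numpy as np

def pm25_from_aod(aod, pbl_height_m, rh_percent, eta=120.0):
    """Rough PM2.5 estimate (ug/m^3).

    aod: aerosol optical depth (dimensionless)
    pbl_height_m: boundary-layer (aerosol layer) height in metres
    rh_percent: near-surface relative humidity, 0-99
    eta: assumed empirical scaling constant relating dry extinction to mass
    """
    f_rh = 1.0 / (1.0 - np.clip(rh_percent, 0, 99) / 100.0)   # simple hygroscopic growth factor
    return eta * aod / ((pbl_height_m / 1000.0) * f_rh)
```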
DOE Office of Scientific and Technical Information (OSTI.GOV)
Xu, Zhijie; Lai, Canhai; Marcy, Peter William
2017-05-01
A challenging problem in designing pilot-scale carbon capture systems is to predict, with uncertainty, the adsorber performance and capture efficiency under various operating conditions where no direct experimental data exist. Motivated by this challenge, we previously proposed a hierarchical framework in which relevant parameters of physical models were sequentially calibrated from different laboratory-scale carbon capture unit (C2U) experiments. Specifically, three models of increasing complexity were identified based on the fundamental physical and chemical processes of the sorbent-based carbon capture technology. Results from the corresponding laboratory experiments were used to statistically calibrate the physical model parameters while quantifying some of their inherent uncertainty. The parameter distributions obtained from laboratory-scale C2U calibration runs are used in this study to facilitate prediction at a larger scale where no corresponding experimental results are available. In this paper, we first describe the multiphase reactive flow model for a sorbent-based 1-MW carbon capture system then analyze results from an ensemble of simulations with the upscaled model. The simulation results are used to quantify uncertainty regarding the design’s predicted efficiency in carbon capture. In particular, we determine the minimum gas flow rate necessary to achieve 90% capture efficiency with 95% confidence.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ricci, P., E-mail: paolo.ricci@epfl.ch; Riva, F.; Theiler, C.
In the present work, a Verification and Validation procedure is presented and applied showing, through a practical example, how it can contribute to advancing our physics understanding of plasma turbulence. Bridging the gap between plasma physics and other scientific domains, in particular, the computational fluid dynamics community, a rigorous methodology for the verification of a plasma simulation code is presented, based on the method of manufactured solutions. This methodology assesses that the model equations are correctly solved, within the order of accuracy of the numerical scheme. The technique to carry out a solution verification is described to provide a rigorous estimate of the uncertainty affecting the numerical results. A methodology for plasma turbulence code validation is also discussed, focusing on quantitative assessment of the agreement between experiments and simulations. The Verification and Validation methodology is then applied to the study of plasma turbulence in the basic plasma physics experiment TORPEX [Fasoli et al., Phys. Plasmas 13, 055902 (2006)], considering both two-dimensional and three-dimensional simulations carried out with the GBS code [Ricci et al., Plasma Phys. Controlled Fusion 54, 124047 (2012)]. The validation procedure allows progress in the understanding of the turbulent dynamics in TORPEX, by pinpointing the presence of a turbulent regime transition, due to the competition between the resistive and ideal interchange instabilities.
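A minimal sketch of the method of manufactured solutions mentioned above, applied to a simple 1-D diffusion solver rather than to the GBS plasma model: choose an analytic solution, derive the source term it implies, run the solver on a sequence of grids, and check that the error decays at the scheme's nominal order.

```python
# Sketch: code verification by manufactured solutions and order-of-accuracy check.
import numpy as np

def solve_diffusion(n, source, u_exact):
    """Second-order finite differences for -u'' = s on [0, 1] with Dirichlet BCs."""
    x = np.linspace(0.0, 1.0, n + 1)
    h = x[1] - x[0]
    A = (np.diag(2.0 * np.ones(n - 1)) - np.diag(np.ones(n - 2), 1)
         - np.diag(np.ones(n - 2), -1)) / h**2
    b = source(x[1:-1])
    b[0] += u_exact(x[0]) / h**2      # fold boundary values into the right-hand side
    b[-1] += u_exact(x[-1]) / h**2
    u = np.linalg.solve(A, b)
    return np.max(np.abs(u - u_exact(x[1:-1])))   # discretization error

u_exact = lambda x: np.sin(np.pi * x)             # manufactured solution
source = lambda x: np.pi**2 * np.sin(np.pi * x)   # source term implied by -u'' = s

errors = {n: solve_diffusion(n, source, u_exact) for n in (16, 32, 64, 128)}
orders = [np.log2(errors[n] / errors[2 * n]) for n in (16, 32, 64)]
# 'orders' should approach 2, the nominal accuracy of the scheme.
```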
Lv, Chen; Liu, Yahui; Hu, Xiaosong; Guo, Hongyan; Cao, Dongpu; Wang, Fei-Yue
2017-08-22
As a typical cyber-physical system (CPS), the electrified vehicle has become a hot research topic due to its high efficiency and low emissions. In order to develop advanced electric powertrains, accurate estimation of the unmeasurable hybrid states, including the discrete backlash nonlinearity and the continuous half-shaft torque, is of great importance. In this paper, a novel estimation algorithm for simultaneously identifying the backlash position and half-shaft torque of an electric powertrain is proposed using a hybrid system approach. System models, including the electric powertrain and vehicle dynamics models, are established considering the drivetrain backlash and flexibility, and are calibrated and validated using vehicle road testing data. Based on the developed system models, the powertrain behavior is represented using hybrid automata according to the piecewise affine property of the backlash dynamics. A hybrid-state observer, which comprises a discrete-state observer and a continuous-state observer, is designed for the simultaneous estimation of the backlash position and half-shaft torque. In order to guarantee stability and reachability, the convergence property of the proposed observer is investigated. The proposed observer is validated under highly dynamic transitions of vehicle states. The validation results demonstrate the feasibility and effectiveness of the proposed hybrid-state observer.
Sweet, Shane N.; Fortier, Michelle S.; Strachan, Shaelyn M.; Blanchard, Chris M.; Boulay, Pierre
2014-01-01
Self-determination theory and self-efficacy theory are prominent theories in the physical activity literature, and studies have begun integrating their concepts. Sweet, Fortier, Strachan and Blanchard (2012) have integrated these two theories in a cross-sectional study. Therefore, this study sought to test a longitudinal integrated model to predict physical activity at the end of a 4-month cardiac rehabilitation program based on theory, research and Sweet et al.’s cross-sectional model. Participants from two cardiac rehabilitation programs (N=109) answered validated self-report questionnaires at baseline, two and four months. Data were analyzed using Amos to assess the path analysis and model fit. Prior to integration, perceived competence and self-efficacy were combined, and labeled as confidence. After controlling for 2-month physical activity and cardiac rehabilitation site, no motivational variables significantly predicted residual change in 4-month physical activity. Although confidence at two months did not predict residual change in 4-month physical activity, it had a strong positive relationship with 2-month physical activity (β=0.30, P<0.001). The overall model retained good fit indices. In conclusion, results diverged from theoretical predictions of physical activity, but self-determination and self-efficacy theory were still partially supported. Because the model had good fit, this study demonstrated that theoretical integration is feasible. PMID:26973926
Desai, Rishi J; Solomon, Daniel H; Weinblatt, Michael E; Shadick, Nancy; Kim, Seoyoung C
2015-04-13
We conducted an external validation study to examine the correlation of a previously published claims-based index for rheumatoid arthritis severity (CIRAS) with disease activity score in 28 joints calculated by using C-reactive protein (DAS28-CRP) and the multi-dimensional health assessment questionnaire (MD-HAQ) physical function score. Patients enrolled in the Brigham and Women's Hospital Rheumatoid Arthritis Sequential Study (BRASS) and Medicare were identified and their data from these two sources were linked. For each patient, DAS28-CRP measurement and MD-HAQ physical function scores were extracted from BRASS, and CIRAS was calculated from Medicare claims for the period of 365 days prior to the DAS28-CRP measurement. Pearson correlation coefficient between CIRAS and DAS28-CRP as well as MD-HAQ physical function scores were calculated. Furthermore, we considered several additional pharmacy and medical claims-derived variables as predictors for DAS28-CRP in a multivariable linear regression model in order to assess improvement in the performance of the original CIRAS algorithm. In total, 315 patients with enrollment in both BRASS and Medicare were included in this study. The majority (81%) of the cohort was female, and the mean age was 70 years. The correlation between CIRAS and DAS28-CRP was low (Pearson correlation coefficient = 0.07, P = 0.24). The correlation between the calculated CIRAS and MD-HAQ physical function scores was also found to be low (Pearson correlation coefficient = 0.08, P = 0.17). The linear regression model containing additional claims-derived variables yielded model R(2) of 0.23, suggesting limited ability of this model to explain variation in DAS28-CRP. In a cohort of Medicare-enrolled patients with established RA, CIRAS showed low correlation with DAS28-CRP as well as MD-HAQ physical function scores. Claims-based algorithms for disease activity should be rigorously tested in distinct populations in order to establish their generalizability before widespread adoption.
Liu, Hong; Zhu, Jingping; Wang, Kai
2015-08-24
The geometrical attenuation model given by Blinn was widely used in the geometrical optics bidirectional reflectance distribution function (BRDF) models. Blinn's geometrical attenuation model based on symmetrical V-groove assumption and ray scalar theory causes obvious inaccuracies in BRDF curves and negatives the effects of polarization. Aiming at these questions, a modified polarized geometrical attenuation model based on random surface microfacet theory is presented by combining of masking and shadowing effects and polarized effect. The p-polarized, s-polarized and unpolarized geometrical attenuation functions are given in their separate expressions and are validated with experimental data of two samples. It shows that the modified polarized geometrical attenuation function reaches better physical rationality, improves the precision of BRDF model, and widens the applications for different polarization.
Physical activity measurement instruments for children with cerebral palsy: a systematic review.
Capio, Catherine M; Sit, Cindy H P; Abernethy, Bruce; Rotor, Esmerita R
2010-10-01
This paper is a systematic review of physical activity measurement instruments for field-based studies involving children with cerebral palsy (CP). Database searches using PubMed Central, MEDLINE, CINAHL Plus, PsycINFO, EMBASE, Cochrane Library, and PEDro located 12 research papers, identifying seven instruments that met the inclusion criteria of (1) having been developed for children aged 0 to 18 years, (2) having been used to evaluate a physical activity dimension, and (3) having been used in a field-based study involving children with CP. The instruments reviewed were the Activities Scale for Kids - Performance version (ASKp), the Canada Fitness Survey, the Children's Assessment of Participation and Enjoyment/Preferences for Activities of Children (CAPE/PAC), the Compendium of Physical Activities, the Physical Activity Questionnaire - Adolescents (PAQ-A), StepWatch, and the Uptimer. Second-round searches yielded 11 more papers, providing reliability and validity evidence for the instruments. The instruments measure physical activity frequency, mode, domain, and duration. Although most instruments demonstrated adequate reliability and validity, only the ASKp and CAPE/PAC have established reliability and validity for children with physical disabilities; the Uptimer has established concurrent validity. No instrument measuring intensity in free-living has been validated or found reliable for children with CP. The findings suggest that further studies are needed to examine the methodological properties of physical activity measurement in children with CP. Combining subjective and objective instruments is recommended to achieve better understanding of physical activity participation.
Anderson, J.R.; Ackerman, J.J.H.; Garbow, J.R.
2015-01-01
Two semipermeable, hollow fiber phantoms for the validation of perfusion-sensitive magnetic resonance methods and signal models are described. Semipermeable hollow fibers harvested from a standard commercial hemodialysis cartridge serve to mimic tissue capillary function. Flow of aqueous media through the fiber lumen is achieved with a laboratory-grade peristaltic pump. Diffusion of water and solute species (e.g., Gd-based contrast agent) occurs across the fiber wall, allowing exchange between the lumen and the extralumenal space. Phantom design attributes include: i) small physical size, ii) easy and low-cost construction, iii) definable compartment volumes, and iv) experimental control over media content and flow rate. PMID:26167136
On validating remote sensing simulations using coincident real data
NASA Astrophysics Data System (ADS)
Wang, Mingming; Yao, Wei; Brown, Scott; Goodenough, Adam; van Aardt, Jan
2016-05-01
The remote sensing community often requires data simulation, either via spectral/spatial downsampling or through virtual, physics-based models, to assess systems and algorithms. The Digital Imaging and Remote Sensing Image Generation (DIRSIG) model is one such first-principles, physics-based model for simulating imagery for a range of modalities. Complex simulation of vegetation environments subsequently has become possible, as scene rendering technology and software advanced. This in turn has created questions related to the validity of such complex models, with potential multiple scattering, bidirectional distribution function (BRDF), etc. phenomena that could impact results in the case of complex vegetation scenes. We selected three sites, located in the Pacific Southwest domain (Fresno, CA) of the National Ecological Observatory Network (NEON). These sites represent oak savanna, hardwood forests, and conifer-manzanita-mixed forests. We constructed corresponding virtual scenes, using airborne LiDAR and imaging spectroscopy data from NEON, ground-based LiDAR data, and field-collected spectra to characterize the scenes. Imaging spectroscopy data for these virtual sites then were generated using the DIRSIG simulation environment. This simulated imagery was compared to real AVIRIS imagery (15m spatial resolution; 12 pixels/scene) and NEON Airborne Observation Platform (AOP) data (1m spatial resolution; 180 pixels/scene). These tests were performed using a distribution-comparison approach for select spectral statistics, e.g., established the spectra's shape, for each simulated versus real distribution pair. The initial comparison results of the spectral distributions indicated that the shapes of spectra between the virtual and real sites were closely matched.
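A minimal sketch of the distribution-comparison step described above: for a chosen spectral statistic computed per pixel, compare the simulated and real per-pixel distributions. The two-sample Kolmogorov-Smirnov test is used here as one reasonable choice; the paper's exact statistics and thresholds are not assumed.

```python
# Sketch: compare per-pixel spectral statistics between simulated and real imagery.
import numpy as np
from scipy.stats import ks_2samp

def compare_spectral_statistic(sim_pixels, real_pixels, stat):
    """sim_pixels, real_pixels: arrays of per-pixel spectra (n_pixels x n_bands)."""
    sim_values = np.apply_along_axis(stat, 1, sim_pixels)
    real_values = np.apply_along_axis(stat, 1, real_pixels)
    d, p = ks_2samp(sim_values, real_values)
    return {"KS_D": d, "p_value": p}

# Example statistic: an NDVI-like ratio built from two assumed band indices.
# compare_spectral_statistic(sim, real, lambda s: (s[90] - s[60]) / (s[90] + s[60]))
```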
Quasi-steady aerodynamic model of clap-and-fling flapping MAV and validation using free-flight data.
Armanini, S F; Caetano, J V; Croon, G C H E de; Visser, C C de; Mulder, M
2016-06-30
Flapping-wing aerodynamic models that are accurate, computationally efficient and physically meaningful, are challenging to obtain. Such models are essential to design flapping-wing micro air vehicles and to develop advanced controllers enhancing the autonomy of such vehicles. In this work, a phenomenological model is developed for the time-resolved aerodynamic forces on clap-and-fling ornithopters. The model is based on quasi-steady theory and accounts for inertial, circulatory, added mass and viscous forces. It extends existing quasi-steady approaches by: including a fling circulation factor to account for unsteady wing-wing interaction, considering real platform-specific wing kinematics and different flight regimes. The model parameters are estimated from wind tunnel measurements conducted on a real test platform. Comparison to wind tunnel data shows that the model predicts the lift forces on the test platform accurately, and accounts for wing-wing interaction effectively. Additionally, validation tests with real free-flight data show that lift forces can be predicted with considerable accuracy in different flight regimes. The complete parameter-varying model represents a wide range of flight conditions, is computationally simple, physically meaningful and requires few measurements. It is therefore potentially useful for both control design and preliminary conceptual studies for developing new platforms.
Flipo, Nicolas; Jeannée, Nicolas; Poulin, Michel; Even, Stéphanie; Ledoux, Emmanuel
2007-03-01
The objective of this work is to combine several approaches to better understand nitrate fate in the Grand Morin aquifers (2700 km(2)), part of the Seine basin. cawaqs results from the coupling of the hydrogeological model newsam with the hydrodynamic and biogeochemical model of river ProSe. cawaqs is coupled with the agronomic model Stics in order to simulate nitrate migration in basins. First, kriging provides a satisfactory representation of aquifer nitrate contamination from local observations, to set initial conditions for the physically based model. Then associated confidence intervals, derived from data using geostatistics, are used to validate cawaqs results. Results and evaluation obtained from the combination of these approaches are given (period 1977-1988). Then cawaqs is used to simulate nitrate fate for a 20-year period (1977-1996). The mean nitrate concentrations increase in aquifers is 0.09 mgN L(-1)yr(-1), resulting from an average infiltration flux of 3500 kgN.km(-2)yr(-1).
Evaluation of physical activity web sites for use of behavior change theories.
Doshi, Amol; Patrick, Kevin; Sallis, James F; Calfas, Karen
2003-01-01
Physical activity (PA) Web sites were assessed for their use of behavior change theories, including constructs of the health belief model, Transtheoretical Model, social cognitive theory, and the theory of reasoned action and planned behavior. An evaluation template for assessing PA Web sites was developed, and content validity and interrater reliability were demonstrated. Two independent raters evaluated 24 PA Web sites. Web sites varied widely in application of theory-based constructs, ranging from 5 to 48 on a 100-point scale. The most common intervention strategies were general information, social support, and realistic goal areas. Coverage of theory-based strategies was low, varying from 26% for social cognitive theory to 39% for health belief model. Overall, PA Web sites provided little assessment, feedback, or individually tailored assistance for users. They were unable to substantially tailor the on-line experience for users at different stages of change or different demographic characteristics.
Direct modeling for computational fluid dynamics
NASA Astrophysics Data System (ADS)
Xu, Kun
2015-06-01
All fluid dynamic equations are valid under their modeling scales, such as the particle mean free path and mean collision time scale of the Boltzmann equation and the hydrodynamic scale of the Navier-Stokes (NS) equations. The current computational fluid dynamics (CFD) focuses on the numerical solution of partial differential equations (PDEs), and its aim is to get the accurate solution of these governing equations. Under such a CFD practice, it is hard to develop a unified scheme that covers flow physics from kinetic to hydrodynamic scales continuously because there is no such governing equation which could make a smooth transition from the Boltzmann to the NS modeling. The study of fluid dynamics needs to go beyond the traditional numerical partial differential equations. The emerging engineering applications, such as air-vehicle design for near-space flight and flow and heat transfer in micro-devices, do require further expansion of the concept of gas dynamics to a larger domain of physical reality, rather than the traditional distinguishable governing equations. At the current stage, the non-equilibrium flow physics has not yet been well explored or clearly understood due to the lack of appropriate tools. Unfortunately, under the current numerical PDE approach, it is hard to develop such a meaningful tool due to the absence of valid PDEs. In order to construct multiscale and multiphysics simulation methods similar to the modeling process of constructing the Boltzmann or the NS governing equations, the development of a numerical algorithm should be based on the first principle of physical modeling. In this paper, instead of following the traditional numerical PDE path, we introduce direct modeling as a principle for CFD algorithm development. Since all computations are conducted in a discretized space with limited cell resolution, the flow physics to be modeled has to be done in the mesh size and time step scales. Here, the CFD is more or less a direct construction of discrete numerical evolution equations, where the mesh size and time step will play dynamic roles in the modeling process. With the variation of the ratio between mesh size and local particle mean free path, the scheme will capture flow physics from the kinetic particle transport and collision to the hydrodynamic wave propagation. Based on the direct modeling, a continuous dynamics of flow motion will be captured in the unified gas-kinetic scheme. This scheme can be faithfully used to study the unexplored non-equilibrium flow physics in the transition regime.
Development and Validation of a Job Exposure Matrix for Physical Risk Factors in Low Back Pain
Solovieva, Svetlana; Pehkonen, Irmeli; Kausto, Johanna; Miranda, Helena; Shiri, Rahman; Kauppinen, Timo; Heliövaara, Markku; Burdorf, Alex; Husgafvel-Pursiainen, Kirsti; Viikari-Juntura, Eira
2012-01-01
Objectives The aim was to construct and validate a gender-specific job exposure matrix (JEM) for physical exposures to be used in epidemiological studies of low back pain (LBP). Materials and Methods We utilized two large Finnish population surveys, one to construct the JEM and another to test matrix validity. The exposure axis of the matrix included exposures relevant to LBP (heavy physical work, heavy lifting, awkward trunk posture and whole body vibration) and exposures that increase the biomechanical load on the low back (arm elevation) or those that in combination with other known risk factors could be related to LBP (kneeling or squatting). Job titles with similar work tasks and exposures were grouped. Exposure information was based on face-to-face interviews. Validity of the matrix was explored by comparing the JEM (group-based) binary measures with individual-based measures. The predictive validity of the matrix against LBP was evaluated by comparing the associations of the group-based (JEM) exposures with those of individual-based exposures. Results The matrix includes 348 job titles, representing 81% of all Finnish job titles in the early 2000s. The specificity of the constructed matrix was good, especially in women. The validity measured with kappa-statistic ranged from good to poor, being fair for most exposures. In men, all group-based (JEM) exposures were statistically significantly associated with one-month prevalence of LBP. In women, four out of six group-based exposures showed an association with LBP. Conclusions The gender-specific JEM for physical exposures showed relatively high specificity without compromising sensitivity. The matrix can therefore be considered as a valid instrument for exposure assessment in large-scale epidemiological studies, when more precise but more labour-intensive methods are not feasible. Although the matrix was based on Finnish data we foresee that it could be applicable, with some modifications, in other countries with a similar level of technology. PMID:23152793
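The JEM validation above is summarized with the kappa statistic; as a minimal, self-contained sketch of Cohen's kappa for two binary exposure classifications (group-based JEM assignment versus individual report), using purely hypothetical 0/1 data:

```python
import numpy as np

def cohens_kappa(x, y):
    """Cohen's kappa for two binary (0/1) classifications of the same subjects."""
    x, y = np.asarray(x), np.asarray(y)
    po = np.mean(x == y)                          # observed agreement
    p_yes = np.mean(x) * np.mean(y)               # chance agreement on 'exposed'
    p_no = (1 - np.mean(x)) * (1 - np.mean(y))    # chance agreement on 'not exposed'
    pe = p_yes + p_no
    return (po - pe) / (1 - pe)

# Hypothetical example: JEM-assigned vs. individually reported heavy lifting
jem = [1, 1, 0, 0, 1, 0, 0, 1, 0, 0]
ind = [1, 0, 0, 0, 1, 0, 1, 1, 0, 0]
print(round(cohens_kappa(jem, ind), 3))
```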
Modelling 1-minute directional observations of the global irradiance.
NASA Astrophysics Data System (ADS)
Thejll, Peter; Pagh Nielsen, Kristian; Andersen, Elsa; Furbo, Simon
2016-04-01
Direct and diffuse irradiances from the sky have been collected at 1-minute intervals for about a year at the experimental station at the Technical University of Denmark for the IEA project "Solar Resource Assessment and Forecasting". These data were gathered by pyrheliometers tracking the Sun, as well as with apertured pyranometers gathering 1/8th and 1/16th of the light from the sky in 45 degree azimuthal ranges pointed around the compass. The data are gathered in order to develop detailed models of the potentially available solar energy and its variations at high temporal resolution, and to gain a more detailed understanding of the solar resource. This is important for a better understanding of the sub-grid scale cloud variation that cannot be resolved with climate and weather models. It is also important for optimizing the operation of active solar energy systems such as photovoltaic plants and thermal solar collector arrays, and for passive solar energy and lighting in buildings. We present regression-based modelling of the observed data and focus here on the statistical properties of the model fits. Using models based, on the one hand, on what is found in the literature and on physical expectations and, on the other hand, on purely statistical models, we find solutions that can explain up to 90% of the variance in global radiation. The models leaning on physical insights include terms for the direct solar radiation, a term for the circum-solar radiation, a diffuse term and a term for the horizon brightening/darkening. The purely statistical model is found using data- and formula-validation approaches picking model expressions from a general catalogue of possible formulae. The method allows nesting of expressions, and the results found are dependent on and heavily constrained by the cross-validation carried out on statistically independent testing and training data-sets. Slightly better fits -- in terms of variance explained -- are found using the purely statistical fitting/searching approach. We describe the methods applied and the results found, and discuss the different potentials of the physics- and statistics-only based model searches.
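The study's model terms and data are not reproduced here; the sketch below only illustrates the general workflow described above (physically motivated regressors, least-squares fitting, and evaluation on a statistically independent test set), with synthetic data and made-up regressors.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic 1-minute data: solar zenith angle (rad) and a clearness-like index
n = 2000
zenith = rng.uniform(0.2, 1.4, n)
kt = rng.uniform(0.2, 0.8, n)

# Hypothetical "observed" global irradiance built from a direct term, a diffuse
# term and a small horizon term, plus noise (not the paper's data)
G = 900 * kt * np.cos(zenith) + 120 * kt + 30 * np.sin(zenith) + rng.normal(0, 20, n)

# Design matrix with physically motivated regressors (an assumption, not the paper's model)
X = np.column_stack([kt * np.cos(zenith), kt, np.sin(zenith), np.ones(n)])

# Train/test split as a simple stand-in for cross-validation
train, test = slice(0, 1500), slice(1500, None)
coef, *_ = np.linalg.lstsq(X[train], G[train], rcond=None)

pred = X[test] @ coef
r2 = 1 - np.sum((G[test] - pred) ** 2) / np.sum((G[test] - G[test].mean()) ** 2)
print("held-out variance explained:", round(r2, 3))
```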
Monte Carlo simulation of Ray-Scan 64 PET system and performance evaluation using GATE toolkit
NASA Astrophysics Data System (ADS)
Li, Suying; Zhang, Qiushi; Vuletic, Ivan; Xie, Zhaoheng; Yang, Kun; Ren, Qiushi
2017-02-01
In this study, we aimed to develop a GATE model for the simulation of the Ray-Scan 64 PET scanner and to model its performance characteristics. A detailed implementation of the system geometry and physical processes was included in the simulation model. We then modeled the performance characteristics of the Ray-Scan 64 PET system for the first time, based on the National Electrical Manufacturers Association (NEMA) NU-2 2007 protocols, and validated the model against experimental measurements, including spatial resolution, sensitivity, counting rates and noise equivalent count rate (NECR). Moreover, an accurate dead time module was investigated to simulate the counting rate performance. Overall, the results showed reasonable agreement between simulation and experimental data. The validation results showed the reliability and feasibility of the GATE model for evaluating the major performance characteristics of the Ray-Scan 64 PET system. It provides a useful tool for a wide range of research applications.
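The NECR quoted above is commonly computed in NEMA-style analyses from the true, scattered and random coincidence rates; below is a minimal sketch under the usual convention NECR = T^2 / (T + S + k*R), where k is 1 or 2 depending on the randoms-correction method. The count rates are hypothetical, not measurements from this scanner.

```python
def necr(trues, scatter, randoms, k=2.0):
    """Noise equivalent count rate, NECR = T^2 / (T + S + k*R), all rates in counts/s."""
    return trues ** 2 / (trues + scatter + k * randoms)

# Hypothetical coincidence rates (counts per second)
print(round(necr(trues=120e3, scatter=40e3, randoms=60e3), 1))
```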
Enhancement of vehicle dynamics via an innovative magnetorheological fluid limited slip differential
NASA Astrophysics Data System (ADS)
Russo, Riccardo; Strano, Salvatore; Terzo, Mario
2016-03-01
A new automotive controllable differential is proposed and tested, first in a software environment and subsequently following a hardware-in-the-loop procedure based on the physical prototype. The device employs a magnetorheological fluid, whose magnetization generates the locking torque and, consequently, the corrective yaw moment. A vehicle model has been derived and adopted for the design of a yaw moment controller based on the sliding mode approach. Some of the feedback signals required by the controller are estimated by means of an extended Kalman filter. The obtained results show the effectiveness of the device in terms of vehicle dynamics improvement. Indeed, the results achieved by the vehicle in the presence of the new differential confirm the improved performance for both steady-state and unsteady-state manoeuvres. Moreover, the hardware-in-the-loop testing makes it possible to overcome the limits of modelling the differential, fully validating the physical prototype.
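The paper's controller and observer are not given in the abstract; as a rough, hypothetical sketch of a sliding-mode-style yaw controller of the kind described, a yaw-rate error drives a saturated switching term that outputs a corrective yaw moment. The single-track reference model, gains and boundary-layer width below are all assumed for illustration.

```python
import numpy as np

def reference_yaw_rate(v, delta, wheelbase=2.6, understeer_grad=0.0025):
    """Steady-state reference yaw rate from a linear single-track model (assumed parameters)."""
    return v * delta / (wheelbase + understeer_grad * v ** 2)

def sliding_mode_yaw_moment(r, r_ref, k=4000.0, phi=0.05):
    """Corrective yaw moment M_z = -K * sat(s/phi), with sliding variable s = r - r_ref."""
    s = r - r_ref
    sat = np.clip(s / phi, -1.0, 1.0)   # boundary layer instead of pure sign() to limit chattering
    return -k * sat

# Example: vehicle at 20 m/s with 2 deg steering, measured yaw rate 0.30 rad/s
r_ref = reference_yaw_rate(v=20.0, delta=np.radians(2.0))
print("corrective yaw moment [N m]:", round(sliding_mode_yaw_moment(0.30, r_ref), 1))
```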
Ribeiro, Marizélia Rodrigues Costa; Silva, Antônio Augusto Moura da; Alves, Maria Teresa Seabra Soares de Britto E; Batista, Rosângela Fernandes Lucena; Ribeiro, Cecília Cláudia Costa; Schraiber, Lilia Blima; Bettiol, Heloisa; Barbieri, Marco Antônio
2017-01-01
Few studies have used structural equation modeling to analyze the effects of variables on violence against women. The present study analyzed the effects of socioeconomic status and social support on violence against pregnant women who used prenatal services. This was a cross-sectional study based on data from the Brazilian Ribeirão Preto and São Luís birth cohort studies (BRISA). The sample of the municipality of São Luís (Maranhão/Brazil) consisted of 1,446 pregnant women interviewed in 2010 and 2011. In the proposed model, socioeconomic status was the most distal predictor, followed by social support that determined general violence, psychological violence or physical/sexual violence, which were analyzed as latent variables. Violence was measured by the World Health Organization Violence against Women (WHO VAW) instrument. The São Luis model was estimated using structural equation modeling and validated with 1,378 pregnant women from Ribeirão Preto (São Paulo/Brazil). The proposed model showed good fit for general, psychological and physical/sexual violence for the São Luís sample. Socioeconomic status had no effect on general or psychological violence (p>0.05), but pregnant women with lower socioeconomic status reported more episodes of physical/sexual violence (standardized coefficient, SC = -0.136; p = 0.021). This effect of socioeconomic status was indirect and mediated by low social support (SC = -0.075; p<0.001). Low social support was associated with more episodes of general, psychological and physical/sexual violence (p<0.001). General and psychological violence indistinctly affected pregnant women of different socioeconomic status. Physical/sexual violence was more common for pregnant women with lower socioeconomic status and lower social support. Better social support contributed to reduction of all types of violence. Results were nearly the same for the validation sample of Ribeirão Preto except that SES was not associated with physical/sexual violence.
Projected climate change impacts on winter recreation in the ...
A physically-based water and energy balance model is used to simulate natural snow accumulation at 247 winter recreation locations across the continental United States. We combine this model with projections of snowmaking conditions to determine downhill skiing, cross-country skiing, and snowmobiling season lengths under baseline and future climates, using data from five climate models and two emissions scenarios. The present-day simulations from the snow model without snowmaking are validated against observations of snow water equivalent from snow monitoring sites. Projected season lengths are combined with baseline estimates of winter recreation activity to monetize impacts to the selected winter recreation activity categories for the years 2050 and 2090. The goal is to estimate the physical and economic impact of climate change on winter recreation in the contiguous U.S.
Spacecraft Internal Acoustic Environment Modeling
NASA Technical Reports Server (NTRS)
Chu, Shao-Sheng R.; Allen Christopher S.
2010-01-01
Acoustic modeling can be used to identify key noise sources, determine/analyze sub-allocated requirements, keep track of the accumulation of minor noise sources, and predict vehicle noise levels at various stages in vehicle development, first with estimates of noise sources, later with experimental data. This paper describes the implementation of acoustic modeling for design purposes by incrementally increasing model fidelity and validating the accuracy of the model while predicting the noise of sources under various conditions. During FY 07, a simple-geometry Statistical Energy Analysis (SEA) model was developed and validated using a physical mockup and acoustic measurements. A process for modeling the effects of absorptive wall treatments and the resulting reverberation environment was developed. During FY 08, a model with more complex and representative geometry of the Orion Crew Module (CM) interior was built, and noise predictions based on input noise sources were made. A corresponding physical mockup was also built. Measurements were made inside this mockup, and comparisons with the model showed excellent agreement. During FY 09, the fidelity of the mockup and corresponding model were increased incrementally by including a simple ventilation system. The airborne noise contribution of the fans was measured using a sound intensity technique, since the sound power levels were not known beforehand; this is in contrast to earlier studies, where Reference Sound Sources (RSS) with known sound power levels were used. Comparisons of the modeling results with the measurements in the mockup again showed excellent agreement. During FY 10, the fidelity of the mockup and the model were further increased by including an ECLSS (Environmental Control and Life Support System) wall, associated closeout panels, and the gap between the ECLSS wall and the mockup wall. The effects of sealing the gap and adding sound-absorptive treatment to the ECLSS wall were also modeled and validated.
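The SEA model itself is not shown; as a back-of-the-envelope illustration of why absorptive wall treatment changes the reverberant environment, here is a Sabine-style estimate of reverberation time and reverberant-field sound pressure level for a small enclosure. Volume, surface area, absorption coefficients and source power are made-up values, not properties of the Orion mockup.

```python
import math

def sabine_rt60(volume_m3, surface_m2, alpha_avg):
    """Sabine reverberation time T60 = 0.161 * V / (S * alpha_avg)."""
    return 0.161 * volume_m3 / (surface_m2 * alpha_avg)

def reverberant_spl(sound_power_level_db, surface_m2, alpha_avg):
    """Reverberant-field SPL: Lp = Lw + 10*log10(4/R), with room constant R = S*a/(1-a)."""
    room_constant = surface_m2 * alpha_avg / (1.0 - alpha_avg)
    return sound_power_level_db + 10.0 * math.log10(4.0 / room_constant)

# Hypothetical small enclosure: 15 m^3 volume, 40 m^2 of interior surface, 80 dB source
for alpha in (0.05, 0.20):   # bare wall vs. treated wall
    print(f"alpha={alpha:.2f}  T60={sabine_rt60(15, 40, alpha):.2f} s  "
          f"Lp={reverberant_spl(80.0, 40, alpha):.1f} dB")
```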
2012-01-01
Background The purpose of this study was to examine the internal consistency, test-retest reliability, construct validity and predictive validity of a new German self-report instrument to assess the influence of social support and the physical environment on physical activity in adolescents. Methods Based on theoretical considerations, the short scales on social support and physical environment were developed and cross-validated in two independent study samples of 9 to 17 year-old girls and boys. The longitudinal sample of Study I (n = 196) was recruited from a German comprehensive school, and subjects in this study completed the questionnaire twice with a between-test interval of seven days. Cronbach’s alphas were computed to determine the internal consistency of the factors. Test-retest reliability of the latent factors was assessed using intra-class correlation coefficients. Factorial validity of the scales was assessed using principal components analysis. Construct validity was determined using a cross-validation technique by performing confirmatory factor analysis with the independent nationwide cross-sectional sample of Study II (n = 430). Correlations between factors and three measures of physical activity (objectively measured moderate-to-vigorous physical activity (MVPA), self-reported habitual MVPA and self-reported recent MVPA) were calculated to determine the predictive validity of the instrument. Results Construct validity of the social support scale (two factors: parental support and peer support) and the physical environment scale (four factors: convenience, public recreation facilities, safety and private sport providers) was shown. Both scales had moderate test-retest reliability. The factors of the social support scale also had good internal consistency and predictive validity. Internal consistency and predictive validity of the physical environment scale were low to acceptable. Conclusions The results of this study indicate moderate to good reliability and construct validity of the social support scale and physical environment scale. Predictive validity was confirmed only for the social support scale, not for the physical environment scale. Hence, it remains unclear if a person’s physical environment has a direct or an indirect effect on physical activity behavior or a moderation function. PMID:22928865
Reimers, Anne K; Jekauc, Darko; Mess, Filip; Mewes, Nadine; Woll, Alexander
2012-08-29
The purpose of this study was to examine the internal consistency, test-retest reliability, construct validity and predictive validity of a new German self-report instrument to assess the influence of social support and the physical environment on physical activity in adolescents. Based on theoretical considerations, the short scales on social support and physical environment were developed and cross-validated in two independent study samples of 9 to 17 year-old girls and boys. The longitudinal sample of Study I (n = 196) was recruited from a German comprehensive school, and subjects in this study completed the questionnaire twice with a between-test interval of seven days. Cronbach's alphas were computed to determine the internal consistency of the factors. Test-retest reliability of the latent factors was assessed using intra-class correlation coefficients. Factorial validity of the scales was assessed using principal components analysis. Construct validity was determined using a cross-validation technique by performing confirmatory factor analysis with the independent nationwide cross-sectional sample of Study II (n = 430). Correlations between factors and three measures of physical activity (objectively measured moderate-to-vigorous physical activity (MVPA), self-reported habitual MVPA and self-reported recent MVPA) were calculated to determine the predictive validity of the instrument. Construct validity of the social support scale (two factors: parental support and peer support) and the physical environment scale (four factors: convenience, public recreation facilities, safety and private sport providers) was shown. Both scales had moderate test-retest reliability. The factors of the social support scale also had good internal consistency and predictive validity. Internal consistency and predictive validity of the physical environment scale were low to acceptable. The results of this study indicate moderate to good reliability and construct validity of the social support scale and physical environment scale. Predictive validity was confirmed only for the social support scale, not for the physical environment scale. Hence, it remains unclear if a person's physical environment has a direct or an indirect effect on physical activity behavior or a moderation function.
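Both versions of the abstract above lean on Cronbach's alpha for internal consistency; a minimal sketch of the statistic for a small, hypothetical item-response matrix (rows are respondents, columns are scale items):

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha: (k/(k-1)) * (1 - sum of item variances / variance of total score)."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1.0 - item_var / total_var)

# Hypothetical responses to a 4-item social-support subscale (1-4 Likert)
responses = [
    [3, 4, 3, 4],
    [2, 2, 3, 2],
    [4, 4, 4, 3],
    [1, 2, 1, 2],
    [3, 3, 4, 4],
]
print(round(cronbach_alpha(responses), 2))
```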
Numerical framework for the modeling of electrokinetic flows
NASA Astrophysics Data System (ADS)
Deshpande, Manish; Ghaddar, Chahid; Gilbert, John R.; St. John, Pamela M.; Woudenberg, Timothy M.; Connell, Charles R.; Molho, Joshua; Herr, Amy; Mungal, Godfrey; Kenny, Thomas W.
1998-09-01
This paper presents a numerical framework for design-based analyses of electrokinetic flow in interconnects. Electrokinetic effects, which can be broadly divided into electrophoresis and electroosmosis, are of importance in providing a transport mechanism in microfluidic devices for both pumping and separation. Models for the electrokinetic effects can be derived and coupled to the fluid dynamic equations through appropriate source terms. In the design of practical microdevices, however, accurate coupling of the electrokinetic effects requires the knowledge of several material and physical parameters, such as the diffusivity and the mobility of the solute in the solvent. Additionally wall-based effects such as chemical binding sites might exist that affect the flow patterns. In this paper, we address some of these issues by describing a synergistic numerical/experimental process to extract the parameters required. Experiments were conducted to provide the numerical simulations with a mechanism to extract these parameters based on quantitative comparisons with each other. These parameters were then applied in predicting further experiments to validate the process. As part of this research, we have created NetFlow, a tool for micro-fluid analyses. The tool can be validated and applied in existing technologies by first creating test structures to extract representations of the physical phenomena in the device, and then applying them in the design analyses to predict correct behavior.
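NetFlow and the extracted parameter values are not available from the abstract; as a generic illustration of the coupled transport being described (diffusion plus electrophoretic drift at velocity u = mu_e * E), here is a minimal 1D explicit finite-difference sketch with made-up diffusivity, mobility and field values.

```python
import numpy as np

# Hypothetical parameters (the paper extracts such values from experiments)
D = 1.0e-9        # diffusivity, m^2/s
mu_e = 4.0e-8     # electrophoretic mobility, m^2/(V s)
E = 1.0e4         # applied field, V/m
u = mu_e * E      # drift velocity, m/s

nx, L = 200, 1.0e-2
dx = L / nx
dt = 0.2 * min(dx ** 2 / (2 * D), dx / abs(u))   # stability-limited time step

c = np.zeros(nx)
c[nx // 10] = 1.0                                # initial solute plug near the inlet

for _ in range(500):
    # Upwind advection + central diffusion; np.roll gives periodic boundaries,
    # which is fine for this short illustrative run
    adv = -u * (c - np.roll(c, 1)) / dx
    dif = D * (np.roll(c, -1) - 2 * c + np.roll(c, 1)) / dx ** 2
    c = c + dt * (adv + dif)

print("plug centre moved to x =", round(float(np.argmax(c)) * dx, 4), "m")
```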
NASA Astrophysics Data System (ADS)
Esposti Ongaro, T.; Barsotti, S.; de'Michieli Vitturi, M.; Favalli, M.; Longo, A.; Nannipieri, L.; Neri, A.; Papale, P.; Saccorotti, G.
2009-12-01
Physical and numerical modelling is becoming of increasing importance in volcanology and volcanic hazard assessment. However, new interdisciplinary problems arise when dealing with complex mathematical formulations, numerical algorithms and their implementations on modern computer architectures. Therefore new frameworks are needed for sharing knowledge, software codes, and datasets among scientists. Here we present the Volcano Modelling and Simulation gateway (VMSg, accessible at http://vmsg.pi.ingv.it), a new electronic infrastructure for promoting knowledge growth and transfer in the field of volcanological modelling and numerical simulation. The new web portal, developed in the framework of former and ongoing national and European projects, is based on a dynamic Content Management System (CMS) and was developed to host and present numerical models of the main volcanic processes and relationships, including magma properties, magma chamber dynamics, conduit flow, plume dynamics, pyroclastic flows, lava flows, etc. Model applications, numerical code documentation, simulation datasets as well as model validation and calibration test-cases are also part of the gateway material.
AIAA Aerospace America Magazine - Year in Review Article, 2010
NASA Technical Reports Server (NTRS)
Figueroa, Fernando
2010-01-01
NASA Stennis Space Center has implemented a pilot operational Integrated System Health Management (ISHM) capability. The implementation was done for the E-2 Rocket Engine Test Stand and a Chemical Steam Generator (CSG) test article, and was validated during operational testing. The CSG test program is a risk mitigation activity to support building of the new A-3 Test Stand, which will be a highly complex facility for testing of engines in high altitude conditions. The foundation of the ISHM capability is a set of knowledge-based integrated domain models for the test stand and CSG, with physical and model-based elements represented by objects; the domain models enable modular and evolutionary ISHM functionality.
Validation of Afterbody Aeroheating Predictions for Planetary Probes: Status and Future Work
NASA Technical Reports Server (NTRS)
Wright, Michael J.; Brown, James L.; Sinha, Krishnendu; Candler, Graham V.; Milos, Frank S.; Prabhu, DInesh K.
2005-01-01
A review of the relevant flight conditions and physical models for planetary probe afterbody aeroheating calculations is given. Readily available sources of afterbody flight data and published attempts to computationally simulate those flights are summarized. The current status of the application of turbulence models to afterbody flows is presented. Finally, recommendations are given for additional analysis and testing that would reduce the uncertainties in our ability to accurately predict base heating levels.
A Goddard Multi-Scale Modeling System with Unified Physics
NASA Technical Reports Server (NTRS)
Tao, W.K.; Anderson, D.; Atlas, R.; Chern, J.; Houser, P.; Hou, A.; Lang, S.; Lau, W.; Peters-Lidard, C.; Kakar, R.;
2008-01-01
Numerical cloud resolving models (CRMs), which are based on the non-hydrostatic equations of motion, have been extensively applied to cloud-scale and mesoscale processes during the past four decades. Recent GEWEX Cloud System Study (GCSS) model comparison projects have indicated that CRMs agree with observations in simulating various types of clouds and cloud systems from different geographic locations. Cloud resolving models now provide statistical information useful for developing more realistic, physically based parameterizations for climate models and numerical weather prediction models. It is also expected that Numerical Weather Prediction (NWP) and regional-scale models can be run at grid sizes similar to those of cloud resolving models through nesting techniques. Current and future NASA satellite programs can provide cloud, precipitation, aerosol and other data at very fine spatial and temporal scales. Using these satellite data to improve the understanding of the physical processes that are responsible for the variation in global and regional climate and hydrological systems requires a coupled global circulation model (GCM) and cloud-scale model (termed a super-parameterization or multi-scale modeling framework, MMF). The use of a GCM will enable global coverage, and the use of a CRM will allow for better and more sophisticated physical parameterization. NASA satellites and field campaigns can provide initial conditions as well as validation through the use of Earth satellite simulators. At Goddard, we have developed a multi-scale modeling system with unified physics. The modeling system consists of a coupled GCM-CRM (or MMF), a state-of-the-art weather research and forecasting model (WRF), and a cloud-resolving model (the Goddard Cumulus Ensemble model). In these models, the same microphysical schemes (2ICE, several 3ICE), radiation (including explicitly calculated cloud optical properties), and surface models are applied. In addition, a comprehensive unified Earth satellite simulator has been developed at GSFC, which is designed to fully utilize the multi-scale modeling system. A brief review of the multi-scale modeling system with unified physics/simulator and examples is presented in this article.
Polycrystalline CVD diamond device level modeling for particle detection applications
NASA Astrophysics Data System (ADS)
Morozzi, A.; Passeri, D.; Kanxheri, K.; Servoli, L.; Lagomarsino, S.; Sciortino, S.
2016-12-01
Diamond is a promising material whose excellent physical properties foster its use for radiation detection applications, in particular in hostile operating environments where the behavior of silicon-based detectors is limited by the high radiation fluence. Within this framework, the application of Technology Computer Aided Design (TCAD) simulation tools is highly desirable for the study, optimization and predictive analysis of sensing devices. Because of the novelty of using diamond in electronics, this material is not included in the library of commercial, state-of-the-art TCAD software tools. In this work, we propose the development, application and validation of numerical models to simulate the electrical behavior of polycrystalline (pc)CVD diamond conceived for diamond sensors for particle detection. The model focuses on the characterization of a physically-based pcCVD diamond bandgap, taking into account deep-level defects acting as recombination centers and/or trap states. While a definite picture of the polycrystalline diamond band-gap is still debated, the effect of the main parameters (e.g. trap densities, capture cross-sections, etc.) can be deeply investigated thanks to the simulation approach. The charge collection efficiency due to β-particle irradiation of diamond materials provided by different vendors and with different electrode configurations has been selected as the figure of merit for the model validation. The good agreement between measurements and simulation findings, with the trap density as the only fitting parameter, confirms the suitability of the TCAD modeling approach as a predictive tool for the design and optimization of diamond-based radiation detectors.
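The deep-level defect model referred to above is typically expressed through Shockley-Read-Hall statistics; the sketch below evaluates the standard single-level SRH recombination rate. The carrier densities, lifetimes and trap level are placeholders, not the diamond parameters from the paper.

```python
import math

def srh_rate(n, p, ni, tau_n, tau_p, et_minus_ei_eV, kT_eV=0.02585):
    """Shockley-Read-Hall recombination rate through one trap level:
    R = (n*p - ni^2) / (tau_p*(n + n1) + tau_n*(p + p1)),
    with n1 = ni*exp((Et-Ei)/kT) and p1 = ni*exp(-(Et-Ei)/kT)."""
    n1 = ni * math.exp(et_minus_ei_eV / kT_eV)
    p1 = ni * math.exp(-et_minus_ei_eV / kT_eV)
    return (n * p - ni ** 2) / (tau_p * (n + n1) + tau_n * (p + p1))

# Placeholder carrier densities (cm^-3) and lifetimes (s)
print("{:.3e} cm^-3 s^-1".format(
    srh_rate(n=1e14, p=1e14, ni=1e10, tau_n=1e-8, tau_p=1e-8, et_minus_ei_eV=0.1)))
```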
Cerin, Ester; Suen, Yi Nam; Barnett, Anthony; Huang, Wendy Y J; Mellecker, Robin R
2017-12-01
Childhood physical activity (PA) is important for health across the lifespan. Time pre-schoolers spend outdoors, which has been associated with more PA, is likely influenced by parents' perception of neighbourhood informal social control relevant to pre-schoolers' PA, defined as the willingness of neighbours to intervene to ensure social order and a safe community environment for young children's active play. To advance measurement of this construct, we assessed factorial and construct validities of the PA-related neighbourhood informal social control scale for parents of pre-schoolers (PANISC-PP). In 2013-2014, Hong Kong primary caregivers (n=394) of 3-5 year-old children completed a socio-demographic questionnaire, the preliminary version of the PANISC-PP, and self-report measures of theoretical neighbourhood correlates of PA-related neighbourhood informal social control (perceived signs of physical and social disorder, community cohesion, perceived stranger danger, risk of unintentional injury and traffic safety). The fit of the data to an a priori measurement model of the PANISC-PP was examined using confirmatory factor analyses. As the a priori model showed inadequate fit to the data, the factor structure was re-specified based on theoretical considerations. The final measurement models of the PANISC-PP showed acceptable fit to the data and consisted of three correlated latent factors: "General informal supervision", "Civic engagement for the creation of a better neighbourhood environment" and "Educating and assisting neighbourhood children". The internal reliability of the subscales was good (Cronbach's α values 0.82-0.89). Generalised additive mixed models indicated that all subscales were positively associated with community cohesion and scores on the subscale "Educating and assisting neighbourhood children" were related in the expected direction to all indicators of traffic and personal safety, supporting construct validity of the PANISC-PP. This study suggests that the PANISC-PP is a reliable and valid instrument for assessing parents' perceived neighbourhood informal social control related to pre-schoolers' PA.
Maddison, Ralph; Jiang, Yannan; Dalleck, Lance; Löf, Marie
2013-01-01
Background Questionnaires are commonly used to assess physical activity in large population-based studies because of their low cost and convenience. Many self-report physical activity questionnaires have been shown to be valid and reliable measures, but they are subject to measurement errors and misreporting, often due to lengthy recall periods. Mobile phones offer a novel approach to measure self-reported physical activity on a daily basis and offer real-time data collection with the potential to enhance recall. Objective The aims of this study were to determine the convergent validity of a mobile phone physical activity (MobilePAL) questionnaire against accelerometry in people with cardiovascular disease (CVD), and to compare how the MobilePAL questionnaire performed compared with the commonly used self-recall International Physical Activity Questionnaire (IPAQ). Methods Thirty adults aged 49 to 85 years with CVD were recruited from a local exercise-based cardiac rehabilitation clinic in Auckland, New Zealand. All participants completed a demographics questionnaire and underwent a 6-minute walk test at the first visit. Subsequently, participants were temporarily provided a smartphone (with the MobilePAL questionnaire preloaded that asked 2 questions daily) and an accelerometer, which was to be worn for 7 days. After 1 week, a follow-up visit was completed during which the smartphone and accelerometer were returned, and participants completed the IPAQ. Results Average daily physical activity level measured using the MobilePAL questionnaire showed moderate correlation (r=.45; P=.01) with daily activity counts per minute (Acc_CPM) and estimated metabolic equivalents (MET) (r=.45; P=.01) measured using the accelerometer. Both MobilePAL (beta=.42; P=.008) and age (beta=–.48, P=.002) were significantly associated with Acc_CPM (adjusted R2=.40). When IPAQ-derived energy expenditure, measured in MET-minutes per week (IPAQ_met), was considered in the prediction model, both IPAQ_met (beta=.51; P=.001) and age (beta=–.36; P=.016) made unique contributions (adjusted R2=.47, F(2,27)=13.58; P<.001). There was also a significant association between the MobilePAL and IPAQ measures (r=.49, beta=.51; P=.007). Conclusions A mobile phone–delivered questionnaire is a relatively reliable and valid measure of physical activity in a CVD cohort. Reliability and validity measures in the present study are comparable to existing self-report measures. Given their ubiquitous use, mobile phones may be an effective method for physical activity surveillance data collection. PMID:23524251
Reliability and Validity of an Internet-based Questionnaire Measuring Lifetime Physical Activity
De Vera, Mary A.; Ratzlaff, Charles; Doerfling, Paul; Kopec, Jacek
2010-01-01
Lifetime exposure to physical activity is an important construct for evaluating associations between physical activity and disease outcomes, given the long induction periods in many chronic diseases. The authors' objective in this study was to evaluate the measurement properties of the Lifetime Physical Activity Questionnaire (L-PAQ), a novel Internet-based, self-administered instrument measuring lifetime physical activity, among Canadian men and women in 2005–2006. Reliability was examined using a test-retest study. Validity was examined in a 2-part study consisting of 1) comparisons with previously validated instruments measuring similar constructs, the Lifetime Total Physical Activity Questionnaire (LT-PAQ) and the Chasan-Taber Physical Activity Questionnaire (CT-PAQ), and 2) a priori hypothesis tests of constructs measured by the L-PAQ. The L-PAQ demonstrated good reliability, with intraclass correlation coefficients ranging from 0.67 (household activity) to 0.89 (sports/recreation). Comparison between the L-PAQ and the LT-PAQ resulted in Spearman correlation coefficients ranging from 0.41 (total activity) to 0.71 (household activity); comparison between the L-PAQ and the CT-PAQ yielded coefficients of 0.58 (sports/recreation), 0.56 (household activity), and 0.50 (total activity). L-PAQ validity was further supported by observed relations between the L-PAQ and sociodemographic variables, consistent with a priori hypotheses. Overall, the L-PAQ is a useful instrument for assessing multiple domains of lifetime physical activity with acceptable reliability and validity. PMID:20876666
Reliability and validity of an internet-based questionnaire measuring lifetime physical activity.
De Vera, Mary A; Ratzlaff, Charles; Doerfling, Paul; Kopec, Jacek
2010-11-15
Lifetime exposure to physical activity is an important construct for evaluating associations between physical activity and disease outcomes, given the long induction periods in many chronic diseases. The authors' objective in this study was to evaluate the measurement properties of the Lifetime Physical Activity Questionnaire (L-PAQ), a novel Internet-based, self-administered instrument measuring lifetime physical activity, among Canadian men and women in 2005-2006. Reliability was examined using a test-retest study. Validity was examined in a 2-part study consisting of 1) comparisons with previously validated instruments measuring similar constructs, the Lifetime Total Physical Activity Questionnaire (LT-PAQ) and the Chasan-Taber Physical Activity Questionnaire (CT-PAQ), and 2) a priori hypothesis tests of constructs measured by the L-PAQ. The L-PAQ demonstrated good reliability, with intraclass correlation coefficients ranging from 0.67 (household activity) to 0.89 (sports/recreation). Comparison between the L-PAQ and the LT-PAQ resulted in Spearman correlation coefficients ranging from 0.41 (total activity) to 0.71 (household activity); comparison between the L-PAQ and the CT-PAQ yielded coefficients of 0.58 (sports/recreation), 0.56 (household activity), and 0.50 (total activity). L-PAQ validity was further supported by observed relations between the L-PAQ and sociodemographic variables, consistent with a priori hypotheses. Overall, the L-PAQ is a useful instrument for assessing multiple domains of lifetime physical activity with acceptable reliability and validity.
Pedersen, Scott J; Kitic, Cecilia M; Bird, Marie-Louise; Mainsbridge, Casey P; Cooley, P Dean
2016-08-19
With the advent of workplace health and wellbeing programs designed to address prolonged occupational sitting, tools to measure behaviour change within this environment should derive from empirical evidence. In this study we measured aspects of validity and reliability for the Occupational Sitting and Physical Activity Questionnaire that asks employees to recount the percentage of work time they spend in the seated, standing, and walking postures during a typical workday. Three separate cohort samples (N = 236) were drawn from a population of government desk-based employees across several departmental agencies. These volunteers were part of a larger state-wide intervention study. Workplace sitting and physical activity behaviour was measured both subjectively against the International Physical Activity Questionnaire, and objectively against ActivPal accelerometers before the intervention began. Criterion validity and concurrent validity for each of the three posture categories were assessed using Spearman's rank correlation coefficients, and a bias comparison with 95 % limits of agreement. Test-retest reliability of the survey was reported with intraclass correlation coefficients. Criterion validity for this survey was strong for sitting and standing estimates, but weak for walking. Participants significantly overestimated the amount of walking they did at work. Concurrent validity was moderate for sitting and standing, but low for walking. Test-retest reliability of this survey proved to be questionable for our sample. Based on our findings we must caution occupational health and safety professionals about the use of employee self-report data to estimate workplace physical activity. While the survey produced accurate measurements for time spent sitting at work it was more difficult for employees to estimate their workplace physical activity.
Bifactor Approach to Modeling Multidimensionality of Physical Self-Perception Profile
ERIC Educational Resources Information Center
Chung, ChihMing; Liao, Xiaolan; Song, Hairong; Lee, Taehun
2016-01-01
The multi-dimensionality of Physical Self-Perception Profile (PSPP) has been acknowledged by the use of correlated-factor model and second-order model. In this study, the authors critically endorse the bifactor model, as a substitute to address the multi-dimensionality of PSPP. To cross-validate the models, analyses are conducted first in…
Estimation of Transport and Kinetic Parameters of Vanadium Redox Batteries Using Static Cells
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, Seong Beom; Pratt, III, Harry D.; Anderson, Travis M.
Mathematical models of Redox Flow Batteries (RFBs) can be used to analyze cell performance, optimize battery operation, and control the energy storage system efficiently. Among many other models, physics-based electrochemical models are capable of predicting internal states of the battery, such as temperature, state-of-charge, and state-of-health. In these models, parameter estimation is an important step in studying, analyzing, and validating the models against experimental data. A common practice is to determine these parameters either by conducting experiments or from information available in the literature. However, it is not easy to obtain all the required parameters in this way, and important information, such as the diffusion coefficients and rate constants of ions, has sometimes not been studied. Also, the parameters needed for modeling charge-discharge are not always available. In this paper, an efficient way to estimate the parameters of physics-based redox battery models is proposed. Furthermore, this paper also demonstrates that the proposed approach can be used to study and analyze aspects of capacity loss/fade, kinetics, and transport phenomena of the RFB system.
Estimation of Transport and Kinetic Parameters of Vanadium Redox Batteries Using Static Cells
Lee, Seong Beom; Pratt, III, Harry D.; Anderson, Travis M.; ...
2018-03-27
Mathematical models of Redox Flow Batteries (RFBs) can be used to analyze cell performance, optimize battery operation, and control the energy storage system efficiently. Among many other models, physics-based electrochemical models are capable of predicting internal states of the battery, such as temperature, state-of-charge, and state-of-health. In these models, parameter estimation is an important step in studying, analyzing, and validating the models against experimental data. A common practice is to determine these parameters either by conducting experiments or from information available in the literature. However, it is not easy to obtain all the required parameters in this way, and important information, such as the diffusion coefficients and rate constants of ions, has sometimes not been studied. Also, the parameters needed for modeling charge-discharge are not always available. In this paper, an efficient way to estimate the parameters of physics-based redox battery models is proposed. Furthermore, this paper also demonstrates that the proposed approach can be used to study and analyze aspects of capacity loss/fade, kinetics, and transport phenomena of the RFB system.
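The paper's estimation procedure is not reproduced in either record above; the following is a generic sketch of the basic idea, fitting a couple of parameters of a toy discharge-voltage model to synthetic data by nonlinear least squares, standing in for the diffusion coefficients and rate constants discussed. The model form and numbers are assumptions for illustration only.

```python
import numpy as np
from scipy.optimize import curve_fit

def toy_discharge_voltage(t, r_internal, k_loss, ocv=1.4, current=1.0):
    """Toy cell-voltage model: OCV minus an ohmic drop minus a sqrt(t) polarization term.
    An illustrative stand-in, not the physics-based RFB model from the paper."""
    return ocv - current * r_internal - k_loss * np.sqrt(t)

# Synthetic "experimental" discharge curve with noise
t = np.linspace(1, 3600, 200)
true_params = (0.05, 0.002)
v_meas = toy_discharge_voltage(t, *true_params) + np.random.default_rng(1).normal(0, 0.002, t.size)

# Fit the two free parameters; defaults for ocv and current are kept fixed
popt, pcov = curve_fit(toy_discharge_voltage, t, v_meas, p0=(0.01, 0.001))
print("estimated R_int = %.4f ohm, k_loss = %.5f V/sqrt(s)" % tuple(popt))
```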
NASA Astrophysics Data System (ADS)
Straub, Jeremy
2016-05-01
Quality control is critical to manufacturing. Frequently, techniques are used to define object conformity bounds based on historical quality data. This paper considers techniques for bespoke and small-batch jobs that are not based on statistical models. These techniques also serve jobs where 100% validation is needed due to the mission- or safety-critical nature of particular parts. One issue with this type of system is alignment discrepancies between the generated model and the physical part. This paper discusses and evaluates techniques for characterizing and correcting alignment issues between the projected and perceived data sets to prevent errors attributable to misalignment.
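The abstract does not specify the alignment-correction technique; one standard way to characterize and remove a rigid misalignment between a generated model and perceived point data is a Kabsch/SVD best fit, sketched below with hypothetical point sets.

```python
import numpy as np

def rigid_align(source, target):
    """Least-squares rotation R and translation t mapping corresponding 3D points
    source -> target, via the Kabsch/SVD method."""
    src_c, tgt_c = source.mean(axis=0), target.mean(axis=0)
    H = (source - src_c).T @ (target - tgt_c)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))       # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = tgt_c - R @ src_c
    return R, t

# Hypothetical model points and their slightly misaligned "perceived" counterparts
rng = np.random.default_rng(0)
model = rng.uniform(-1, 1, (50, 3))
angle = np.radians(5.0)
R_true = np.array([[np.cos(angle), -np.sin(angle), 0],
                   [np.sin(angle),  np.cos(angle), 0],
                   [0, 0, 1]])
perceived = model @ R_true.T + np.array([0.02, -0.01, 0.03]) + rng.normal(0, 1e-3, (50, 3))

R, t = rigid_align(perceived, model)             # correction mapping perceived back onto model
residual = np.linalg.norm(perceived @ R.T + t - model, axis=1).mean()
print("mean residual after correction:", round(float(residual), 5))
```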
Evidence base and future research directions in the management of low back pain
Abbott, Allan
2016-01-01
Low back pain (LBP) is a prevalent and costly condition. Awareness of valid and reliable patient history taking, physical examination and clinical testing is important for diagnostic accuracy. Stratified care which targets treatment to patient subgroups based on key characteristics is reliant upon accurate diagnostics. Models of stratified care that can potentially improve treatment effects include prognostic risk profiling for persistent LBP, likely response to specific treatment based on clinical prediction models or suspected underlying causal mechanisms. The focus of this editorial is to highlight current research status and future directions for LBP diagnostics and stratified care. PMID:27004162
Choi, Mona; Ahn, Sangwoo; Jung, Dukyoo
2015-01-01
We evaluated the psychometric properties of the Korean version of the Self-Efficacy for Exercise Scale (SEE-K). The SEE-K consists of nine items and was translated into Korean using the forward-backward translation method. We administered it to 212 community-dwelling older adults along with measures of outcome expectation for exercise, quality of life, and physical activity. The validity was determined using confirmatory factor analysis and Rasch analysis with INFIT and OUTFIT statistics, which showed acceptable model fit. The concurrent validity was confirmed according to positive correlations between the SEE-K, outcome expectation for exercise, and quality of life. Furthermore, the high physical activity group had higher SEE-K scores. Finally, the reliability of the SEE-K was deemed acceptable based on Cronbach's alpha, coefficients of determination, and person and item separation indices with reliability. Thus, the SEE-K appears to have satisfactory validity and reliability among older adults in South Korea.
Integration of data-driven and physically-based methods to assess shallow landslides susceptibility
NASA Astrophysics Data System (ADS)
Lajas, Sara; Oliveira, Sérgio C.; Zêzere, José Luis
2016-04-01
Approaches used to assess shallow landslide susceptibility at the basin scale are conceptually different depending on the use of statistical or deterministic methods. The data-driven methods rest on the assumption that the same causes are likely to produce the same effects, and for that reason a present/past landslide inventory and a dataset of factors assumed as predisposing factors are crucial for the landslide susceptibility assessment. The physically-based methods are based on a system controlled by physical laws and soil mechanics, where the forces which tend to promote movement are compared with the forces that tend to resist movement. In this case, the evaluation of susceptibility is supported by the calculation of the Factor of Safety (FoS) and depends on the availability of detailed data on the slope geometry and the hydrological and geotechnical properties of the soils and rocks. Within this framework, this work aims to test two hypotheses: (i) although conceptually distinct and based on contrasting procedures, statistical and deterministic methods generate similar shallow landslide susceptibility results regarding predictive capacity and spatial agreement; and (ii) the integration of the shallow landslide susceptibility maps obtained with data-driven and physically-based methods, for the same study area, generates a more reliable susceptibility model for shallow landslide occurrence. To evaluate these two hypotheses, we select the Information Value data-driven method and the physically-based Infinite Slope model to evaluate shallow landslides in the study area of the Monfalim and Louriceira basins (13.9 km2), located north of the Lisbon region (Portugal). The landslide inventory is composed of 111 shallow landslides and was divided into two independent groups based on temporal criteria (age ≤ 1983 and age > 1983): (i) the modelling group (51 cases) was used to define the weights for each predisposing factor (lithology, land use, slope, aspect, curvature, topographic position index and the slope over area ratio) with the Information Value method, and was also used to calibrate the strength parameters (cohesion and friction angle) of the different lithological units considered in the Infinite Slope model; and (ii) the validation group (60 cases) was used to independently validate and define the predictive capacity of the shallow landslide susceptibility maps produced with the Information Value method and the Infinite Slope method. The comparison of both landslide susceptibility maps was supported by: (i) the computation of Receiver Operating Characteristic (ROC) curves; (ii) the calculation of the Area Under the Curve (AUC); and (iii) the evaluation of the spatial agreement between the landslide susceptibility classes. Finally, the susceptibility maps produced with the Information Value and the Infinite Slope methods are integrated into a single landslide susceptibility map based on a set of integration rules defined by cross-validation of the susceptibility classes of both maps and analysis of the corresponding contingency table. This work was supported by the FCT - Portuguese Foundation for Science and Technology and is within the framework of the FORLAND Project. Sérgio Oliveira was funded by a postdoctoral grant (SFRH/BPD/85827/2012) from the Portuguese Foundation for Science and Technology (FCT).
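As a minimal sketch of the physically-based branch of the comparison, the infinite-slope factor of safety can be evaluated per cell as below. The cohesion, friction angle, slope angle, soil depth and saturation ratio are hypothetical values; the study itself calibrates strength parameters per lithological unit.

```python
import math

def infinite_slope_fos(c_kpa, phi_deg, slope_deg, soil_depth_m,
                       gamma_kn_m3=18.0, gamma_w_kn_m3=9.81, m=0.5):
    """Factor of safety for an infinite slope with relative water level m = h_w / z:
    FoS = [c' + (gamma - m*gamma_w) * z * cos^2(beta) * tan(phi')] /
          [gamma * z * sin(beta) * cos(beta)]"""
    beta = math.radians(slope_deg)
    phi = math.radians(phi_deg)
    resisting = c_kpa + (gamma_kn_m3 - m * gamma_w_kn_m3) * soil_depth_m \
        * math.cos(beta) ** 2 * math.tan(phi)
    driving = gamma_kn_m3 * soil_depth_m * math.sin(beta) * math.cos(beta)
    return resisting / driving

# Hypothetical cell: 2 m of soil on a 25 degree slope, half-saturated
print(round(infinite_slope_fos(c_kpa=5.0, phi_deg=30.0, slope_deg=25.0, soil_depth_m=2.0), 2))
```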
Using the split Hopkinson pressure bar to validate material models.
Church, Philip; Cornish, Rory; Cullis, Ian; Gould, Peter; Lewtas, Ian
2014-08-28
This paper gives a discussion of the use of the split-Hopkinson bar with particular reference to the requirements of materials modelling at QinetiQ. The aim is to deploy validated material models for numerical simulations that are physically based and have as little characterization overhead as possible. In order to have confidence that the models have a wide range of applicability, this means, at most, characterizing the models at low rate and then validating them at high rate. The split Hopkinson pressure bar (SHPB) is ideal for this purpose. It is also a very useful tool for analysing material behaviour under non-shock wave loading. This means understanding the output of the test and developing techniques for reliable comparison of simulations with SHPB data. For materials other than metals, comparison with an output stress versus strain curve is not sufficient, as the assumptions built into the classical analysis are generally violated. The method described in this paper compares the simulations with as much validation data as can be derived from the deployed instrumentation, including the raw strain gauge data on the input and output bars, which avoids any assumptions about stress equilibrium. One has to take into account Pochhammer-Chree oscillations and their effect on the specimen, and recognize that this is itself also a valuable validation test of the material model.
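For orientation, the classical one-wave SHPB reduction that the paper argues is insufficient on its own for non-metals looks roughly like the sketch below, applied here to synthetic reflected and transmitted gauge signals; bar properties, specimen dimensions and the pulses are all assumed.

```python
import numpy as np

def classical_shpb(eps_reflected, eps_transmitted, dt,
                   c0=5000.0, E_bar=200e9, d_bar=0.02, d_spec=0.01, L_spec=0.005):
    """Classical one-wave SHPB analysis (assumes specimen stress equilibrium):
    strain rate = -2*c0/L_s * eps_r(t); strain = time integral of the strain rate;
    stress = E_bar * (A_bar/A_spec) * eps_t(t)."""
    A_bar, A_spec = np.pi * d_bar ** 2 / 4, np.pi * d_spec ** 2 / 4
    strain_rate = -2.0 * c0 / L_spec * eps_reflected
    strain = np.cumsum(strain_rate) * dt
    stress = E_bar * (A_bar / A_spec) * eps_transmitted
    return strain, stress

# Synthetic gauge signals standing in for measured pulses (not real data)
t = np.arange(0, 100e-6, 1e-7)
eps_r = -2e-4 * np.sin(np.pi * t / 100e-6) ** 2   # reflected pulse
eps_t = 1e-4 * np.sin(np.pi * t / 100e-6) ** 2    # transmitted pulse
strain, stress = classical_shpb(eps_r, eps_t, dt=1e-7)
print("final specimen strain: %.4f, peak stress: %.1f MPa" % (strain[-1], stress.max() / 1e6))
```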
NASA Astrophysics Data System (ADS)
Williams, Karen Ann
One section of college students (N = 25) enrolled in an algebra-based physics course was selected for a Piagetian-based learning cycle (LC) treatment while a second section (N = 25) studied in an Ausubelian-based meaningful verbal reception learning (MVRL) treatment. This study examined the students' overall (concept + problem solving + mental model) meaningful understanding of force, density/Archimedes Principle, and heat. Also examined were students' meaningful understanding as measured by conceptual questions, problems, and mental models. In addition, students' learning orientations were examined. There were no significant posttest differences between the LC and MVRL groups for students' meaningful understanding or learning orientation. Piagetian and Ausubelian theories explain meaningful understanding for each treatment. Students from each treatment increased their meaningful understanding. However, neither group altered their learning orientation. The results of meaningful understanding as measured by conceptual questions, problem solving, and mental models were mixed. Differences were attributed to the weaknesses and strengths of each treatment. This research also examined four variables (treatment, reasoning ability, learning orientation, and prior knowledge) to find which best predicted students' overall meaningful understanding of physics concepts. None of these variables were significant predictors at the .05 level. However, when the same variables were used to predict students' specific understanding (i.e. concept, problem solving, or mental model understanding), the results were mixed. For forces and density/Archimedes Principle, prior knowledge and reasoning ability significantly predicted students' conceptual understanding. For heat, however, reasoning ability was the only significant predictor of concept understanding. Reasoning ability and treatment were significant predictors of students' problem solving for heat and forces. For density/Archimedes Principle, treatment was the only significant predictor of students' problem solving. None of the variables were significant predictors of mental model understanding. This research suggested that Piaget and Ausubel used different terminology to describe learning, yet these theories are similar. Further research is needed to validate this premise and validate the blending of the two theories.
Ion flux through membrane channels--an enhanced algorithm for the Poisson-Nernst-Planck model.
Dyrka, Witold; Augousti, Andy T; Kotulska, Malgorzata
2008-09-01
A novel algorithmic scheme for numerical solution of the 3D Poisson-Nernst-Planck model is proposed. The algorithmic improvements are universal and independent of the detailed physical model. They include three major steps: an adjustable gradient-based step value, an adjustable relaxation coefficient, and an optimized segmentation of the modeled space. The enhanced algorithm significantly accelerates the speed of computation and reduces the computational demands. The theoretical model was tested on a regular artificial channel and validated on a real protein channel, alpha-hemolysin, proving its efficiency.
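The enhanced PNP algorithm is not given in the abstract; as a small illustration of one idea it names, an adjustable relaxation coefficient, here is a 1D Poisson solve by successive over-relaxation in which the relaxation factor is backed off when the residual stops shrinking. The grid, charge profile and adaptation rule are hypothetical.

```python
import numpy as np

def poisson_sor_adaptive(rho, dx, omega=1.5, tol=1e-8, max_iter=20000):
    """Solve d^2 phi/dx^2 = -rho on a 1D grid with phi = 0 at both ends,
    using SOR with a crudely adaptive relaxation coefficient."""
    phi = np.zeros_like(rho)
    prev_res = np.inf
    for it in range(max_iter):
        res = 0.0
        for i in range(1, len(phi) - 1):
            new = 0.5 * (phi[i - 1] + phi[i + 1] + dx * dx * rho[i])
            diff = new - phi[i]
            phi[i] += omega * diff
            res = max(res, abs(diff))
        # adjustable relaxation: back off toward Gauss-Seidel if the residual grew
        if res > prev_res:
            omega = max(1.0, 0.95 * omega)
        prev_res = res
        if res < tol:
            return phi, it, omega
    return phi, max_iter, omega

x = np.linspace(0, 1, 101)
rho = np.exp(-((x - 0.5) / 0.05) ** 2)          # a Gaussian charge blob
phi, iters, omega = poisson_sor_adaptive(rho, dx=x[1] - x[0])
print(f"converged in {iters} iterations, final omega = {omega:.3f}")
```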
A validation study of a stochastic model of human interaction
NASA Astrophysics Data System (ADS)
Burchfield, Mitchel Talmadge
The purpose of this dissertation is to validate a stochastic model of human interactions which is part of a developmentalism paradigm. Incorporating elements of ancient and contemporary philosophy and science, developmentalism defines human development as a progression of increasing competence and utilizes compatible theories of developmental psychology, cognitive psychology, educational psychology, social psychology, curriculum development, neurology, psychophysics, and physics. To validate a stochastic model of human interactions, the study addressed four research questions: (a) Does attitude vary over time? (b) What are the distributional assumptions underlying attitudes? (c) Does the stochastic model, $N\int_{-\infty}^{\infty}\varphi(\chi,\tau)\,\Psi(\tau)\,d\tau$, have utility for the study of attitudinal distributions and dynamics? (d) Are the Maxwell-Boltzmann, Fermi-Dirac, and Bose-Einstein theories applicable to human groups? Approximately 25,000 attitude observations were made using the Semantic Differential Scale. Positions of individuals varied over time and the logistic model predicted observed distributions with correlations between 0.98 and 1.0, with estimated standard errors significantly less than the magnitudes of the parameters. The results bring into question the applicability of Fisherian research designs (Fisher, 1922, 1928, 1938) for behavioral research, based on the apparent failure of two fundamental assumptions: the noninteractive nature of the objects being studied and the normal distribution of attributes. The findings indicate that individual belief structures are representable in terms of a psychological space which has the same or similar properties as physical space. The psychological space not only has dimension, but individuals interact by force equations similar to those described in theoretical physics models. Nonlinear regression techniques were used to estimate Fermi-Dirac parameters from the data. The model explained a high degree of the variance in each probability distribution. The correlation between predicted and observed probabilities ranged from a low of 0.955 to a high value of 0.998, indicating that humans behave in psychological space as fermions behave in momentum space.
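The dissertation's estimation details are not reproduced; the sketch below only illustrates the kind of nonlinear regression described, fitting the two parameters of a Fermi-Dirac occupancy curve to synthetic observed proportions.

```python
import numpy as np
from scipy.optimize import curve_fit

def fermi_dirac(x, mu, temperature):
    """Fermi-Dirac form f(x) = 1 / (exp((x - mu)/temperature) + 1)."""
    return 1.0 / (np.exp((x - mu) / temperature) + 1.0)

# Synthetic "observed" occupancy proportions over an attitude-like scale
x = np.linspace(-3, 3, 25)
rng = np.random.default_rng(2)
observed = fermi_dirac(x, mu=0.4, temperature=0.6) + rng.normal(0, 0.02, x.size)

popt, pcov = curve_fit(fermi_dirac, x, observed, p0=(0.0, 1.0))
pred = fermi_dirac(x, *popt)
corr = np.corrcoef(observed, pred)[0, 1]
print("mu=%.3f  T=%.3f  corr(pred, obs)=%.3f" % (popt[0], popt[1], corr))
```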
DOE Office of Scientific and Technical Information (OSTI.GOV)
Long, Daniel J.; Lee, Choonsik; Tien, Christopher
2013-01-15
Purpose: To validate the accuracy of a Monte Carlo source model of the Siemens SOMATOM Sensation 16 CT scanner using organ doses measured in physical anthropomorphic phantoms. Methods: The x-ray output of the Siemens SOMATOM Sensation 16 multidetector CT scanner was simulated within the Monte Carlo radiation transport code, MCNPX version 2.6. The resulting source model was able to perform various simulated axial and helical computed tomographic (CT) scans of varying scan parameters, including beam energy, filtration, pitch, and beam collimation. Two custom-built anthropomorphic phantoms were used to take dose measurements on the CT scanner: an adult male and a 9-month-old. The adult male is a physical replica of the University of Florida reference adult male hybrid computational phantom, while the 9-month-old is a replica of the University of Florida Series B 9-month-old voxel computational phantom. Each phantom underwent a series of axial and helical CT scans, during which organ doses were measured using fiber-optic coupled plastic scintillator dosimeters developed at the University of Florida. The physical setup was reproduced and simulated in MCNPX using the CT source model and the computational phantoms upon which the anthropomorphic phantoms were constructed. Average organ doses were then calculated based upon these MCNPX results. Results: For all CT scans, good agreement was seen between measured and simulated organ doses. For the adult male, the percent differences were within 16% for axial scans, and within 18% for helical scans. For the 9-month-old, the percent differences were all within 15% for both the axial and helical scans. These results are comparable to previously published validation studies using GE scanners and commercially available anthropomorphic phantoms. Conclusions: Overall results of this study show that the Monte Carlo source model can be used to accurately and reliably calculate organ doses for patients undergoing a variety of axial or helical CT examinations on the Siemens SOMATOM Sensation 16 scanner.
Jensen, Roxanne E.; Potosky, Arnold L.; Reeve, Bryce B.; Hahn, Elizabeth; Cella, David; Fries, James; Smith, Ashley Wilder; Keegan, Theresa H.M.; Wu, Xiao-Cheng; Paddock, Lisa; Moinpour, Carol M.
2016-01-01
Purpose To evaluate the validity of the Patient-Reported Outcomes Measurement Information System (PROMIS) Physical Function measures in a diverse, population-based cancer sample. Methods Cancer patients 6–13 months post diagnosis (n=4,840) were recruited for the Measuring Your Health (MY-Health) study. Participants were diagnosed between 2010–2013 with non-Hodgkin lymphoma or cancers of the colorectum, lung, breast, uterus, cervix, or prostate. Four PROMIS Physical Function short forms (4a, 6b, 10a, and 16) were evaluated for validity and reliability across age and race-ethnicity groups. Covariates included gender, marital status, education level, cancer site and stage, comorbidities, and functional status. Results PROMIS Physical Function short forms showed high internal consistency (Cronbach’s α =0.92 – 0.96), convergent validity (Fatigue, Pain Interference, FACT Physical Well-Being all r≥0.68) and discriminant validity (unrelated domains all r≤0.3) across survey short forms, age, and race-ethnicity. Known group differences by demographic, clinical, and functional characteristics performed as hypothesized. Ceiling effects for higher-functioning individuals were identified on most forms. Conclusions This study provides strong evidence that PROMIS Physical Function measures are valid and reliable in multiple race-ethnicity and age groups. Researchers selecting specific PROMIS short forms should consider the degree of functional disability in their patient population to ensure that length and content are tailored to limit response burden. PMID:25935353
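For readers unfamiliar with the internal-consistency statistic reported above, the sketch below shows the standard Cronbach's alpha computation for an item-response matrix. The item count, scoring range, and data are hypothetical illustrations, not the MY-Health study data.

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, k_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)          # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)      # variance of the summed scale
    return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

# Hypothetical responses to a 4-item physical function short form (1-5 scores).
rng = np.random.default_rng(1)
base = rng.integers(1, 6, size=(200, 1))
scores = np.clip(base + rng.integers(-1, 2, size=(200, 4)), 1, 5)
print(f"alpha = {cronbach_alpha(scores):.2f}")
```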
Zhou, Xiangmin; Zhang, Nan; Sha, Desong; Shen, Yunhe; Tamma, Kumar K; Sweet, Robert
2009-01-01
The inability to render realistic soft-tissue behavior in real time has remained a barrier to the face and content validity of many virtual reality surgical training systems. Biophysically based models are not only suitable for training purposes but also for patient-specific clinical applications, physiological modeling and surgical planning. When considering the existing approaches for modeling soft tissue for virtual reality surgical simulation, the computer graphics-based approach lacks predictive capability; the mass-spring model (MSM) based approach lacks biophysically realistic soft-tissue dynamic behavior; and the finite element method (FEM) approaches fail to meet the real-time requirement. The present development stems from the first law of thermodynamics: for a space-discrete dynamic system, it directly formulates the space-discrete but time-continuous governing equation with an embedded material constitutive relation, resulting in a discrete mechanics framework that strikes a unique balance between computational effort and physically realistic soft-tissue dynamic behavior. We describe the development of the discrete mechanics framework with focused attention towards a virtual laparoscopic nephrectomy application.
Assessing the impact of modeling limits on intelligent systems
NASA Technical Reports Server (NTRS)
Rouse, William B.; Hammer, John M.
1990-01-01
The knowledge bases underlying intelligent systems are validated. A general conceptual framework is provided for considering the roles in intelligent systems of models of physical, behavioral, and operational phenomena. A methodology is described for identifying limits in particular intelligent systems, and the use of the methodology is illustrated via an experimental evaluation of the pilot-vehicle interface within the Pilot's Associate. The requirements and functionality are outlined for a computer-based knowledge engineering environment which would embody the approach advocated and illustrated in earlier discussions. Issues considered include the specific benefits of this functionality, the potential breadth of applicability, and technical feasibility.
Steenson, Sharalyn; Özcebe, Hilal; Arslan, Umut; Konşuk Ünlü, Hande; Araz, Özgür M; Yardim, Mahmut; Üner, Sarp; Bilir, Nazmi; Huang, Terry T-K
2018-01-01
Childhood obesity rates have been rising rapidly in developing countries. A better understanding of the risk factors and social context is necessary to inform public health interventions and policies. This paper describes the validation of several measurement scales for use in Turkey, which relate to child and parent perceptions of physical activity (PA) and enablers and barriers of physical activity in the home environment. The aim of this study was to assess the validity and reliability of several measurement scales in Turkey using a population sample across three socio-economic strata in the Turkish capital, Ankara. Surveys were conducted in Grade 4 children (mean age = 9.7 years for boys; 9.9 years for girls), and their parents, across 6 randomly selected schools, stratified by SES (n = 641 students, 483 parents). Construct validity of the scales was evaluated through exploratory and confirmatory factor analysis. Internal consistency of scales and test-retest reliability were assessed by Cronbach's alpha and intra-class correlation. The scales as a whole were found to have acceptable-to-good model fit statistics (PA Barriers: RMSEA = 0.076, SRMR = 0.0577, AGFI = 0.901; PA Outcome Expectancies: RMSEA = 0.054, SRMR = 0.0545, AGFI = 0.916, and PA Home Environment: RMSEA = 0.038, SRMR = 0.0233, AGFI = 0.976). The PA Barriers subscales showed good internal consistency and poor to fair test-retest reliability (personal α = 0.79, ICC = 0.29, environmental α = 0.73, ICC = 0.59). The PA Outcome Expectancies subscales showed good internal consistency and test-retest reliability (negative α = 0.77, ICC = 0.56; positive α = 0.74, ICC = 0.49). Only the PA Home Environment subscale on support for PA was validated in the final confirmatory model; it showed moderate internal consistency and test-retest reliability (α = 0.61, ICC = 0.48). This study is the first to validate measures of perceptions of physical activity and the physical activity home environment in Turkey. Our results support the originally hypothesized two-factor structures for Physical Activity Barriers and Physical Activity Outcome Expectancies. However, we found the one-factor rather than two-factor structure for Physical Activity Home Environment had the best model fit. This study provides general support for the use of these scales in Turkey in terms of validity, but test-retest reliability warrants further research.
NASA Astrophysics Data System (ADS)
Mangiarotti, Sylvain; Drapeau, Laurent
2013-04-01
The global modeling approach aims to obtain parsimonious models of observed dynamics from few or single time series (Letellier et al. 2009). Specific algorithms were developed and validated for this purpose (Mangiarotti et al. 2012a). This approach was applied to the dynamics of cereal crops in a semi-arid region using the vegetation index derived from satellite data as a proxy of the dynamics. A low-dimensional autonomous model could be obtained. The corresponding attractor is characteristic of weakly dissipative chaos and exhibits a toroidal-like structure. At present, only a few theoretical cases of such chaos are known, and none was obtained from real world observations. Under smooth conditions, a robust validation of three-dimensional chaotic models can usually be performed based on the topological approach (Gilmore 1998). Such an approach becomes more difficult for weakly dissipative systems, and almost impossible under noisy observational conditions. For this reason, another validation approach is developed which consists of comparing the forecasting skill of the model to other forecasts for which no dynamical model is required. A data assimilation process is associated with the model to estimate the model's skill; several schemes are tested (simple re-initialization, Extended and Ensemble Kalman Filters and Back and Forth Nudging). Forecasts without a model are performed based on the search for analogous states in the phase space (Mangiarotti et al. 2012b). The comparison reveals the quality of the model's forecasts at short to moderate horizons and contributes to validating the model. These results suggest that the dynamics of cereal crops can be reasonably approximated by low-dimensional chaotic models, and also bring out powerful arguments for chaos. Chaotic models have often been used as benchmarks to test data assimilation schemes; the present work shows that such tests may not only have a theoretical interest, but also an almost direct applicative potential. Moreover, other global models could be obtained for other regions. The model considered here is not a particular case, which highlights the usefulness of investigating and widening this field of modeling and research. References: Letellier, C., Aguirre, L.A., Freitas, U.S., 2009. Frequently asked questions about global modeling. Chaos, 19, doi:10.1063/1.3125705. Gilmore R., 1998. Topological analysis of chaotic dynamical systems. Reviews of Modern Physics, 70, 1455-1530. Mangiarotti, S., Coudret, R., Drapeau, L., Jarlan, L., 2012a. Polynomial Search and Global Modeling - two algorithms for modelling chaos. Physical Review E, 86(4), 046205. Mangiarotti, S., Mazzega, P., Mougin, E., Hiernaux, P., 2012b. Predictability of vegetation cycles over the semi-arid region of Gourma (Mali) from forecasts of AVHRR-NDVI signals. Remote Sensing of Environment, 123, 246-257.
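The "forecast without a model" baseline described above, based on searching for analogous states in phase space, can be sketched as a nearest-neighbour prediction. The delay-embedding parameters and the synthetic series below are hypothetical stand-ins for the NDVI proxy, not the Gourma data or the authors' algorithm.

```python
import numpy as np

def analog_forecast(series, dim=3, lag=1, horizon=5):
    """Forecast 'horizon' steps ahead by finding the closest past delay-embedded
    state and returning the trajectory that followed it (no dynamical model)."""
    n = len(series)
    # Delay-embedded states x(t) = [s(t), s(t-lag), ..., s(t-(dim-1)*lag)].
    idx = np.arange((dim - 1) * lag, n - horizon)
    states = np.stack([series[i - np.arange(dim) * lag] for i in idx])
    current = series[n - 1 - np.arange(dim) * lag]
    nearest = idx[np.argmin(np.linalg.norm(states - current, axis=1))]
    return series[nearest + 1: nearest + 1 + horizon]

# Synthetic seasonal vegetation-index-like signal with noise.
t = np.arange(400)
ndvi_like = 0.4 + 0.25 * np.sin(2 * np.pi * t / 36) + 0.02 * np.random.default_rng(2).standard_normal(400)
print(analog_forecast(ndvi_like, dim=4, lag=2, horizon=6))
```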
Gupta, C K; Mishra, G; Mehta, S C; Prasad, J
1993-01-01
Lung volumes, capacities, diffusion and alveolar volumes with physical characteristics (age, height and weight) were recorded for 186 healthy school children (96 boys and 90 girls) of the 10-17 years age group. The objective was to study the relative importance of physical characteristics as regressor variables in regression models to estimate lung functions. We observed that height is best correlated with all the lung functions. Inclusion of all physical characteristics in the models has little gain compared to the ones having just height as the regressor variable. We also find that exponential models were not only statistically valid but also fared better than the linear ones. We conclude that lung functions covary with height and other physical characteristics but do not depend upon them. The rate of increase in the functions depends upon the initial lung functions. Further, we propose models and provide ready reckoners to give estimates of lung functions with 95 per cent confidence limits based on heights from 125 to 170 cm for the age group of 10 to 17 years.
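An exponential height-based model of the kind compared above can be fitted as follows. The data, the fitted coefficients, and the 95% limits printed are purely illustrative assumptions, not the paper's ready reckoners, and FVC is used only as an example lung function.

```python
import numpy as np
from scipy.optimize import curve_fit

def exp_model(height_cm, a, b):
    """Exponential lung-function model of the form value = a * exp(b * height)."""
    return a * np.exp(b * height_cm)

# Hypothetical (height, FVC) pairs for children aged 10-17 years.
rng = np.random.default_rng(3)
height = rng.uniform(125, 170, 150)
fvc = 0.25 * np.exp(0.015 * height) * rng.lognormal(0, 0.08, 150)

params, cov = curve_fit(exp_model, height, fvc, p0=(0.2, 0.01))
resid_sd = np.std(fvc - exp_model(height, *params), ddof=2)
for h in np.arange(125, 171, 5):
    p = exp_model(h, *params)
    print(f"height {h} cm: FVC ~ {p:.2f} L (approx. 95% limits {p - 1.96*resid_sd:.2f}-{p + 1.96*resid_sd:.2f})")
```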
A RE-AIM evaluation of theory-based physical activity interventions.
Antikainen, Iina; Ellis, Rebecca
2011-04-01
Although physical activity interventions have been shown to effectively modify behavior, little research has examined the potential of these interventions for adoption in real-world settings. The purpose of this literature review was to evaluate the external validity of 57 theory-based physical activity interventions using the RE-AIM framework. The physical activity interventions included were more likely to report on issues of internal, rather than external validity and on individual, rather than organizational components of the RE-AIM framework, making the translation of many interventions into practice difficult. Furthermore, most studies included motivated, healthy participants, thus reducing the generalizability of the interventions to real-world settings that provide services to more diverse populations. To determine if a given intervention is feasible and effective in translational research, more information should be reported about the factors that affect external validity.
Automated Analysis of Stateflow Models
NASA Technical Reports Server (NTRS)
Bourbouh, Hamza; Garoche, Pierre-Loic; Garion, Christophe; Gurfinkel, Arie; Kahsaia, Temesghen; Thirioux, Xavier
2017-01-01
Stateflow is a widely used modeling framework for embedded and cyber physical systems where control software interacts with physical processes. In this work, we present a fully automated safety verification framework for Stateflow models. Our approach is twofold: (i) we faithfully compile Stateflow models into hierarchical state machines, and (ii) we use an automated logic-based verification engine to decide the validity of safety properties. The starting point of our approach is a denotational semantics of Stateflow. We propose a compilation process using continuation-passing style (CPS) denotational semantics. Our compilation technique preserves the structural and modal behavior of the system. The overall approach is implemented as an open source toolbox that can be integrated into the existing Mathworks Simulink Stateflow modeling framework. We present preliminary experimental evaluations that illustrate the effectiveness of our approach in code generation and safety verification of industrial scale Stateflow models.
The physical therapy profile questionnaire (PTPQ): development, validation and pilot testing.
Dizon, Janine Margarita R; Grimmer-Somers, Karen; Kumar, Saravana
2011-09-19
Country by country similarities and differences in physical therapy practice exist. Therefore, before updates in practice can be provided, such as training in evidence-based practice, it is necessary to identify the profile and nature of practice in a given country or setting. Following a search of the international literature, no appropriate tool was identified to collect and establish data to create the profile of physical therapy practice in the Philippines. We therefore developed, validated and pilot tested a survey instrument which would comprehensively describe the practice of physical therapy in the Philippines. We used a mixed-methods design to answer our study aims. A focus group interview was conducted among a group of physical therapists to establish the content and contexts of items to be included in the survey instrument. Findings were amalgamated with information from the literature on developing survey instruments/questionnaires. A survey instrument was drafted and named the Physical Therapy Profile Questionnaire (PTPQ). The PTPQ was then validated and pilot tested with a different group of physical therapists. The final version consisted of five separate parts, namely (A) General information and demographics, (B) Practice Profile, (C) Treatment Preferences, (D) Bases for clinical work and (E) Bases for educational/research work. At present the PTPQ is relevant to the Philippines and could be used by any country which has a similar nature of practice to the Philippines. The Physical Therapy Profile Questionnaire (PTPQ) was shown to have good face and content validity among Filipino physical therapists and their context of practice. It has also been found to be a useful, easy-to-administer tool in a format appealing to respondents. The PTPQ is expected to assist comprehensive data collection to create a profile of physical therapy practice in the Philippines.
NASA Astrophysics Data System (ADS)
Formosa, F.; Fréchette, L. G.
2015-12-01
An electrical circuit equivalent (ECE) approach has been set up allowing elementary oscillatory microengine components to be modelled. They cover gas channel/chamber thermodynamics, viscosity and thermal effects, mechanical structure and electromechanical transducers. The proposed tool has been validated on a centimeter-scale Free Piston membrane Stirling engine [1]. We propose here new developments taking into account scaling effects to establish models suitable for any microengine. They are based on simplifications derived from comparing the hydraulic radius with the viscous and thermal penetration depths, respectively.
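The scaling comparison mentioned above rests on the standard viscous and thermal penetration depths for oscillating flow, δ_ν = √(2ν/ω) and δ_κ = √(2κ/ω). The sketch below evaluates them for an illustrative gas, frequency, and hydraulic radius; these values are assumptions for illustration, not the parameters of the engine in [1].

```python
import numpy as np

def penetration_depths(nu, kappa, freq_hz):
    """Viscous and thermal penetration depths for oscillating flow at frequency f."""
    omega = 2.0 * np.pi * freq_hz
    return np.sqrt(2.0 * nu / omega), np.sqrt(2.0 * kappa / omega)

# Illustrative values: air near ambient conditions, 50 Hz operation, 100 um hydraulic radius.
nu = 1.5e-5       # kinematic viscosity, m^2/s
kappa = 2.2e-5    # thermal diffusivity, m^2/s
r_h = 100e-6      # hydraulic radius, m

d_visc, d_therm = penetration_depths(nu, kappa, 50.0)
print(f"delta_nu = {d_visc*1e6:.0f} um, delta_kappa = {d_therm*1e6:.0f} um, r_h = {r_h*1e6:.0f} um")
print("viscous/thermal effects fill the channel" if r_h < min(d_visc, d_therm)
      else "boundary-layer-limited regime")
```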
A simplified model for tritium permeation transient predictions when trapping is active
NASA Astrophysics Data System (ADS)
Longhurst, G. R.
1994-09-01
This report describes a simplified one-dimensional tritium permeation and retention model. The model makes use of the same physical mechanisms as more sophisticated, time-transient codes such as implantation, recombination, diffusion, trapping and thermal gradient effects. It takes advantage of a number of simplifications and approximations to solve the steady-state problem and then provides interpolating functions to make estimates of intermediate states based on the steady-state solution. Comparison calculations with the verified and validated TMAP4 transient code show good agreement.
NASA Astrophysics Data System (ADS)
Hernandez, K. F.; Shah-Fairbank, S.
2016-12-01
The San Dimas Experimental Forest has been designated as a research area by the United States Forest Service for use as a hydrologic testing facility since 1933 to investigate the watershed hydrology of the 27-square-mile area. Incorporation of a computer model provides validity to the testing of the physical model. This study focuses on the San Dimas Experimental Forest's Bell Canyon, one of the triad of watersheds contained within the Big Dalton watershed of the San Dimas Experimental Forest. A scaled physical model of Bell Canyon was constructed to highlight watershed characteristics and their individual effects on runoff. The physical model offers a comprehensive visualization of a natural watershed and can vary the characteristics of rainfall intensity, slope, and roughness through interchangeable parts and adjustments to the system. The scaled physical model is validated and calibrated through a HEC-HMS model to assure similitude of the system. Preliminary results of the physical model suggest that a 50-year storm event can be represented by a peak discharge of 2.2 × 10⁻³ cfs. When comparing the results to HEC-HMS, this equates to a flow relationship of approximately 1:160,000, which can be used to model other return periods. The completion of the Bell Canyon physical model can be used for educational instruction in the classroom, outreach in the community, and further research using the model as an accurate representation of the watershed present in the San Dimas Experimental Forest.
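A minimal sketch of how the reported ~1:160,000 flow relationship would be applied to translate a measured model discharge to a prototype-scale estimate. The ratio and discharge simply reuse the numbers quoted in the abstract, and the function name is a hypothetical helper, not part of the study.

```python
# Convert a discharge measured on the scaled Bell Canyon model to prototype scale
# using the flow relationship reported from the HEC-HMS comparison (~1:160,000).
MODEL_TO_PROTOTYPE = 160_000.0

def prototype_discharge(model_q_cfs: float) -> float:
    """Estimate prototype discharge (cfs) from a model discharge (cfs)."""
    return model_q_cfs * MODEL_TO_PROTOTYPE

model_q = 2.2e-3  # peak model discharge for the 50-year event, cfs
print(f"Estimated prototype peak discharge: {prototype_discharge(model_q):.0f} cfs")
```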
Multicomponent ensemble models to forecast induced seismicity
NASA Astrophysics Data System (ADS)
Király-Proag, E.; Gischig, V.; Zechar, J. D.; Wiemer, S.
2018-01-01
In recent years, human-induced seismicity has become a more and more relevant topic due to its economic and social implications. Several models and approaches have been developed to explain the underlying physical processes or to forecast induced seismicity. They range from simple statistical models to coupled numerical models incorporating complex physics. We advocate the need for forecast testing as currently the best method for ascertaining whether models are capable of reasonably accounting for the key governing physical processes or not. Moreover, operational forecast models are of great interest to help on-site decision-making in projects entailing induced earthquakes. We previously introduced a standardized framework following the guidelines of the Collaboratory for the Study of Earthquake Predictability, the Induced Seismicity Test Bench, to test, validate, and rank induced seismicity models. In this study, we describe how to construct multicomponent ensemble models based on Bayesian weightings that deliver more accurate forecasts than individual models in the case of the Basel 2006 and Soultz-sous-Forêts 2004 enhanced geothermal stimulation projects. For this, we examine five calibrated variants of two significantly different model groups: (1) Shapiro and Smoothed Seismicity, based on the seismogenic index, simple modified Omori-law-type seismicity decay, and temporally weighted smoothed seismicity; (2) Hydraulics and Seismicity, based on numerically modelled pore pressure evolution that triggers seismicity using the Mohr-Coulomb failure criterion. We also demonstrate how the individual and ensemble models would perform as part of an operational Adaptive Traffic Light System. Investigating seismicity forecasts based on a range of potential injection scenarios, we use forecast periods of different durations to compute the occurrence probabilities of seismic events M ≥ 3. We show that in the case of the Basel 2006 geothermal stimulation the models forecast hazardous levels of seismicity days before the occurrence of felt events.
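A minimal sketch of the ensemble idea: combine individual model forecasts of the event rate with Bayesian-style weights, then convert the weighted rate into an occurrence probability for M ≥ 3 events over the forecast period, assuming a Poisson process. The weights and rates below are invented for illustration, not the Basel or Soultz values or the authors' weighting scheme.

```python
import numpy as np

def ensemble_rate(rates, weights):
    """Weighted-average forecast rate (expected count of M>=3 events) for the forecast period."""
    weights = np.asarray(weights, dtype=float)
    return float(np.dot(rates, weights / weights.sum()))

def prob_at_least_one(rate):
    """Poisson probability of observing at least one event given the expected count."""
    return 1.0 - np.exp(-rate)

# Hypothetical forecasts from two model families (expected M>=3 counts per 6-hour period).
rates = [0.15, 0.40]    # e.g. seismogenic-index type vs hydro-mechanical type
weights = [0.7, 0.3]    # weights reflecting past forecast performance
lam = ensemble_rate(rates, weights)
print(f"ensemble rate = {lam:.3f}, P(at least one M>=3) = {prob_at_least_one(lam):.2f}")
```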
NASA Astrophysics Data System (ADS)
Pigot, Corentin; Gilibert, Fabien; Reyboz, Marina; Bocquet, Marc; Zuliani, Paola; Portal, Jean-Michel
2018-04-01
Phase-change memory (PCM) compact modeling of the threshold switching based on a thermal runaway in Poole–Frenkel conduction is proposed. Although this approach is often used in physical models, this is the first time it is implemented in a compact model. The model accuracy is validated by a good correlation between simulations and experimental data collected on a PCM cell embedded in a 90 nm technology. A wide range of intermediate states is measured and accurately modeled with a single set of parameters, allowing multilevel programming. A good convergence is exhibited even in snapback simulation owing to this fully continuous approach. Moreover, threshold properties extraction indicates a thermally enhanced switching, which validates the basic hypothesis of the model. Finally, it is shown that this model is compliant with a new drift-resilient cell-state metric. Once enriched with a phase transition module, this compact model is ready to be implemented in circuit simulators.
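The thermal-runaway picture builds on Poole-Frenkel conduction, in which the applied field lowers the emission barrier by √(qE/(πε)). The sketch below evaluates the standard Poole-Frenkel current-density form for illustrative parameter values; it is not the compact model itself, and the barrier height, permittivity, and prefactor are assumptions.

```python
import numpy as np

Q = 1.602e-19      # elementary charge, C
KB = 1.381e-23     # Boltzmann constant, J/K
EPS0 = 8.854e-12   # vacuum permittivity, F/m

def poole_frenkel_j(e_field, temp_k, phi_b_ev, eps_r, prefactor=1.0):
    """Poole-Frenkel current density: J = C * E * exp(-q*(phi_B - sqrt(qE/(pi*eps))) / (kB*T))."""
    barrier_lowering = np.sqrt(Q * e_field / (np.pi * eps_r * EPS0))   # in volts
    return prefactor * e_field * np.exp(-Q * (phi_b_ev - barrier_lowering) / (KB * temp_k))

# Illustrative values for an amorphous chalcogenide layer (assumed, not extracted parameters).
for e in (1e7, 3e7, 5e7):   # V/m
    print(f"E = {e:.0e} V/m  ->  J ~ {poole_frenkel_j(e, 300.0, 0.35, 16.0):.3e} (arb. units)")
```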
NASA Astrophysics Data System (ADS)
Sandfeld, Stefan; Budrikis, Zoe; Zapperi, Stefano; Fernandez Castellanos, David
2015-02-01
Crystalline plasticity is strongly interlinked with dislocation mechanics and nowadays is relatively well understood. Concepts and physical models of plastic deformation in amorphous materials on the other hand—where the concept of linear lattice defects is not applicable—still are lagging behind. We introduce an eigenstrain-based finite element lattice model for simulations of shear band formation and strain avalanches. Our model allows us to study the influence of surfaces and finite size effects on the statistics of avalanches. We find that even with relatively complex loading conditions and open boundary conditions, critical exponents describing avalanche statistics are unchanged, which validates the use of simpler scalar lattice-based models to study these phenomena.
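The avalanche-statistics claim can be checked on simulation output with the standard maximum-likelihood estimator for a continuous power-law exponent, α̂ = 1 + n / Σ ln(s_i / s_min). The sketch below applies it to synthetic avalanche sizes; the exponent and cut-off used are illustrative, not the paper's values.

```python
import numpy as np

def powerlaw_mle(sizes, s_min):
    """Maximum-likelihood exponent for P(s) ~ s^(-alpha), s >= s_min (continuous approximation)."""
    s = np.asarray(sizes, dtype=float)
    s = s[s >= s_min]
    return 1.0 + s.size / np.log(s / s_min).sum()

# Synthetic avalanche sizes drawn from a power law with alpha = 1.5 via inverse-transform sampling.
rng = np.random.default_rng(8)
alpha_true, s_min = 1.5, 1.0
sizes = s_min * (1.0 - rng.random(50_000)) ** (-1.0 / (alpha_true - 1.0))
print(f"estimated alpha = {powerlaw_mle(sizes, s_min):.3f}")
```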
Current status of validating operational model forecasts at the DWD site Lindenberg
NASA Astrophysics Data System (ADS)
Beyrich, F.; Heret, C.; Vogel, G.
2009-09-01
Based on long experience in the measurement of atmospheric boundary layer parameters, the Meteorological Observatory Lindenberg / Richard-Aßmann-Observatory is well qualified to validate operational NWP results for this location. The validation activities cover a large range of time periods from single days or months up to several years and include many more quantities than generally used in areal verification techniques. They mainly focus on land surface and boundary layer processes which play an important role in the atmospheric forcing from the surface. Versatility and continuity of the database enable a comprehensive evaluation of the model behaviour under different meteorological conditions in order to estimate the accuracy of the physical parameterisations and to detect possible deficiencies in the predicted processes. The measurements from the boundary layer field site Falkenberg serve as reference data for various types of validation studies: 1. The operational boundary-layer measurements are used to identify and to document weather situations with large forecast errors which can then be analysed in more detail. Results from a case study will be presented where model deficiencies in the correct simulation of the diurnal evolution of near-surface temperature under winter conditions over a closed snow cover were diagnosed. 2. Due to the synopsis of the boundary layer quantities based on monthly averaged diurnal cycles, systematic model deficiencies can be detected more clearly. Some distinctive features found in the annual cycle (e.g. near-surface temperatures, turbulent heat fluxes and soil moisture) will be outlined. Further aspects are their different appearance in the COSMO-EU and COSMO-DE models as well as the effects of starting time (00 or 12 UTC) on the prediction accuracy. 3. The evaluation of the model behaviour over several years provides additional insight into the impact of changes in the physical parameterisations, data assimilation or numerics on the meteorological quantities. The temporal development of the error characteristics of some near-surface weather parameters (temperature, dewpoint temperature, wind velocity) and of the energy fluxes at the surface will be discussed.
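The monthly averaged diurnal-cycle comparison described in point 2 can be reduced to a simple composite of forecast-minus-observation differences by hour of day. The arrays below are synthetic placeholders for the Falkenberg observations and the COSMO forecasts, used only to show the bookkeeping.

```python
import numpy as np

def mean_diurnal_bias(forecast, observed, hours):
    """Average (forecast - observed) for each hour of day over a month of hourly data."""
    diff = np.asarray(forecast) - np.asarray(observed)
    return {h: float(diff[hours == h].mean()) for h in range(24)}

# Synthetic month of hourly 2 m temperature (K): forecast assumed slightly too cold at night.
rng = np.random.default_rng(4)
hours = np.tile(np.arange(24), 30)
obs = 278 + 4 * np.sin(2 * np.pi * (hours - 9) / 24) + rng.normal(0, 0.5, hours.size)
fct = obs - np.where((hours < 6) | (hours > 20), 1.2, 0.1) + rng.normal(0, 0.5, hours.size)

bias = mean_diurnal_bias(fct, obs, hours)
print({h: round(b, 2) for h, b in list(bias.items())[:6]})   # composite biases for 00-05 UTC
```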
NASA Astrophysics Data System (ADS)
Zheng, Jiajia; Li, Yancheng; Li, Zhaochun; Wang, Jiong
2015-10-01
This paper presents multi-physics modeling of an MR absorber considering the magnetic hysteresis to capture the nonlinear relationship between the applied current and the generated force under impact loading. The magnetic field, temperature field, and fluid dynamics are represented by the Maxwell equations, conjugate heat transfer equations, and Navier-Stokes equations. These fields are coupled through the apparent viscosity and the magnetic force, both of which in turn depend on the magnetic flux density and the temperature. Based on a parametric study, an inverse Jiles-Atherton hysteresis model is used and implemented for the magnetic field simulation. The temperature rise of the MR fluid in the annular gap caused by core loss (i.e. eddy current loss and hysteresis loss) and fluid motion is computed to investigate the current-force behavior. A group of impulsive tests was performed for the manufactured MR absorber with step exciting currents. The numerical and experimental results showed good agreement, which validates the effectiveness of the proposed multi-physics FEA model.
Development of Thermal Radiation Experiments Kit Based on Data Logger for Physics Learning Media
NASA Astrophysics Data System (ADS)
Permana, H.; Iswanto, B. H.
2018-04-01
A Thermal Radiation Experiments Kit (TREK) based on a data logger was developed as a physics learning medium. TREK will be used as a learning medium on the subject of Temperature and Heat to explain the concept of the emissivity of a material in grade XI, adding variety to the experiments that are commonly done, such as thermal expansion, transfer of thermal energy (conduction, convection, and radiation), and specific heat capacity. A DHT11 sensor is used to measure temperature, and an Arduino Uno microcontroller is used as the data logger. The objects tested are coated glass thin films and aluminum samples with different colors. TREK comes with a user manual and student worksheet (LKS) to make it easier for teachers and students to use. TREK was developed using the ADDIE Development Model (Analyze, Design, Development, Implementation, and Evaluation) and was validated by experts, physics teachers, and students. The validation instrument was a questionnaire with a five-point Likert response scale covering the following aspects: appropriateness of content and concepts, design, and user-friendliness. The results showed that TREK was excellent (experts 88.13%, science teachers 95.68%, and students 85.77%).
NASA Astrophysics Data System (ADS)
Scozzari, Andrea; Doveri, Marco
2015-04-01
Knowledge of the physical/chemical processes involved in the exploitation of water bodies for human consumption is an essential tool for the optimisation of the monitoring infrastructure. Due to their increasing importance in the context of human consumption (at least in the EU), this work focuses on groundwater resources. In the framework of drinking water networks, the physical and data-driven modelling of transport phenomena in groundwater can help optimise the sensor network and validate the acquired data. This work proposes the combined usage of physical and data-driven modelling as a support to the design and maximisation of results from a network of distributed sensors. In particular, the validation of physico-chemical measurements and the detection of possible anomalies by a set of continuous measurements benefit from knowledge of the domain from which water is abstracted and of its expected characteristics. Change-detection techniques based on non-specific sensors (presented in quite a large literature during the last two decades) have to deal with the classical issues of maximising correct detections and minimising false alarms, the latter being the more typical problem faced when designing truly applicable monitoring systems. In this context, the definition of "anomaly" in terms of distance from an expected value or feature characterising the quality of water implies the definition of a suitable metric and knowledge of the physical and chemical peculiarities of the natural domain from which water is exploited, with its implications in terms of the characteristics of the water resource.
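A minimal change-detection baseline of the kind discussed above flags a measurement as anomalous when its distance from the expected value exceeds a threshold expressed in units of the recent variability. The window, threshold, and conductivity-like series below are illustrative assumptions, not the authors' method or data.

```python
import numpy as np

def zscore_anomalies(values, window=48, threshold=4.0):
    """Flag samples whose deviation from a rolling mean exceeds `threshold` rolling standard deviations."""
    values = np.asarray(values, dtype=float)
    flags = np.zeros(values.size, dtype=bool)
    for i in range(window, values.size):
        ref = values[i - window:i]
        sigma = ref.std()
        if sigma > 0 and abs(values[i] - ref.mean()) > threshold * sigma:
            flags[i] = True
    return flags

# Synthetic conductivity-like record with a step change (e.g. mixing with a different water body).
rng = np.random.default_rng(5)
series = np.concatenate([rng.normal(520, 3, 300), rng.normal(560, 3, 100)])
print("first flagged sample:", int(np.argmax(zscore_anomalies(series))))
```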
Validation of a short measure of effort-reward imbalance in the workplace: evidence from China.
Li, Jian; Loerbroks, Adrian; Shang, Li; Wege, Natalia; Wahrendorf, Morten; Siegrist, Johannes
2012-01-01
Work stress is an emergent risk in occupational health in China, and its measurement is still a critical issue. The aim of this study was to examine the reliability and validity of a short version of the effort-reward imbalance (ERI) questionnaire in a sample of Chinese workers. A community-based survey was conducted in 1,916 subjects aged 30-65 years with paid employment (971 men and 945 women). Acceptable internal consistencies of the three scales, effort, reward and overcommitment, were obtained. Confirmatory factor analysis showed a good model fit of the data with the theoretical structure (goodness-of-fit index = 0.95). Evidence of criterion validity was demonstrated, as all three scales were independently associated with elevated odds ratios of both poor physical and mental health. Based on the findings of our study, this short version of the ERI questionnaire is considered to be a reliable and valid tool for measuring psychosocial work environment in Chinese working populations.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Horstemeyer, Mark R.; Chaudhuri, Santanu
2015-09-30
A multiscale modeling Internal State Variable (ISV) constitutive model was developed that captures the fundamental structure-property relationships. The macroscale ISV model used lower length scale simulations (Butler-Volmer and electronic structure results) in order to inform the ISVs at the macroscale. The chemomechanical ISV model was calibrated and validated from experiments with magnesium (Mg) alloys that were investigated under corrosive environments coupled with experimental electrochemical studies. Because the ISV chemomechanical model is physically based, it can be used for other material systems to predict corrosion behavior. As such, others can use the chemomechanical model for analyzing corrosion effects on their designs.
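The lower-length-scale electrochemistry feeding the macroscale ISVs is of Butler-Volmer type. The sketch below evaluates the standard Butler-Volmer current density as a function of overpotential with illustrative kinetic parameters for a corroding Mg surface; the exchange current density and transfer coefficients are assumptions, not the calibrated values from this work.

```python
import numpy as np

F = 96485.0   # Faraday constant, C/mol
R = 8.314     # gas constant, J/(mol K)

def butler_volmer(eta, i0, alpha_a, alpha_c, temp_k=298.15):
    """Butler-Volmer current density: i = i0 * (exp(alpha_a*F*eta/RT) - exp(-alpha_c*F*eta/RT))."""
    f = F / (R * temp_k)
    return i0 * (np.exp(alpha_a * f * eta) - np.exp(-alpha_c * f * eta))

# Illustrative anodic polarization of a magnesium surface (A/m^2 vs overpotential in volts).
for eta in (0.01, 0.05, 0.10):
    print(f"eta = {eta:.2f} V -> i = {butler_volmer(eta, i0=0.5, alpha_a=0.5, alpha_c=0.5):.3f} A/m^2")
```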
An experimental study of the Rayleigh-Taylor instability critical wave length
NASA Astrophysics Data System (ADS)
Kong, Xujing; Wang, Youchun; Zhang, Shufei; Xu, Hongkun
1992-06-01
A physical model has been constructed to represent the condensate film pattern on a horizontal downward-facing surface with fins, based on visual observations in experiments. The results of analysis using this model confirm the validity of the critical wave length formula obtained from Rayleigh-Taylor stability analysis. This formula may be used as a criterion to design horizontal downward-facing surfaces with fins that can best destabilize the condensate film, thus enhancing condensation heat transfer.
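For reference, the Rayleigh-Taylor critical (cut-off) wavelength used as the design criterion follows from balancing surface tension against buoyancy, λ_c = 2π√(σ/(g Δρ)). The sketch below evaluates it for a water condensate film under air using generic textbook fluid properties, not the paper's test conditions.

```python
import math

def rt_critical_wavelength(sigma, rho_heavy, rho_light, g=9.81):
    """Rayleigh-Taylor cut-off wavelength: lambda_c = 2*pi*sqrt(sigma / (g*(rho_heavy - rho_light)))."""
    return 2.0 * math.pi * math.sqrt(sigma / (g * (rho_heavy - rho_light)))

# Water film hanging below a horizontal surface in air (approx. 20 degC properties, assumed).
lam_c = rt_critical_wavelength(sigma=0.072, rho_heavy=998.0, rho_light=1.2)
print(f"critical wavelength ~ {lam_c*1e3:.1f} mm")  # fin spacing below this helps destabilize the film
```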
Nahar, Vinayak K; Sharma, Manoj; Catalano, Hannah Priest; Ickes, Melinda J; Johnson, Paul; Ford, M Allison
2016-01-01
Most college students do not adequately participate in enough physical activity (PA) to attain health benefits. A theory-based approach is critical in developing effective interventions to promote PA. The purpose of this study was to examine the utility of the newly proposed multi-theory model (MTM) of health behavior change in predicting initiation and sustenance of PA among college students. Using a cross-sectional design, a valid and reliable survey was administered in October 2015 electronically to students enrolled at a large Southern US University. The internal consistency Cronbach alphas of the subscales were acceptable (0.65-0.92). Only those who did not engage in more than 150 minutes of moderate to vigorous intensity aerobic PA during the past week were included in this study. Of the 495 respondents, 190 met the inclusion criteria of which 141 completed the survey. The majority of participants were females (72.3%) and Caucasians (70.9%). Findings of the confirmatory factor analysis (CFA) confirmed construct validity of subscales (initiation model: χ2 = 253.92 [df = 143], P < 0.001, CFI = 0.91, RMSEA = 0.07, SRMR = 0.07; sustenance model: χ2= 19.40 [df = 22], P < 0.001, CFI = 1.00, RMSEA = 0.00, SRMR = 0.03). Multivariate regression analysis showed that 26% of the variance in the PA initiation was explained by advantages outweighing disadvantages, behavioral confidence, work status, and changes in physical environment. Additionally, 29.7% of the variance in PA sustenance was explained by emotional transformation, practice for change, and changes in social environment. Based on this study's findings, MTM appears to be a robust theoretical framework for predicting PA behavior change. Future research directions and development of suitable intervention strategies are discussed.
Nahar, Vinayak K.; Sharma, Manoj; Catalano, Hannah Priest; Ickes, Melinda J.; Johnson, Paul; Ford, M. Allison
2016-01-01
Background: Most college students do not adequately participate in enough physical activity (PA) to attain health benefits. A theory-based approach is critical in developing effective interventions to promote PA. The purpose of this study was to examine the utility of the newly proposed multi-theory model (MTM) of health behavior change in predicting initiation and sustenance of PA among college students. Methods: Using a cross-sectional design, a valid and reliable survey was administered in October 2015 electronically to students enrolled at a large Southern US University. The internal consistency Cronbach alphas of the subscales were acceptable (0.65-0.92). Only those who did not engage in more than 150 minutes of moderate to vigorous intensity aerobic PA during the past week were included in this study. Results: Of the 495 respondents, 190 met the inclusion criteria of which 141 completed the survey. The majority of participants were females (72.3%) and Caucasians (70.9%). Findings of the confirmatory factor analysis (CFA) confirmed construct validity of subscales (initiation model: χ2 = 253.92 [df = 143], P < 0.001, CFI = 0.91, RMSEA = 0.07, SRMR = 0.07; sustenance model: χ2= 19.40 [df = 22], P < 0.001, CFI = 1.00, RMSEA = 0.00, SRMR = 0.03). Multivariate regression analysis showed that 26% of the variance in the PA initiation was explained by advantages outweighing disadvantages, behavioral confidence, work status, and changes in physical environment. Additionally, 29.7% of the variance in PA sustenance was explained by emotional transformation, practice for change, and changes in social environment. Conclusion: Based on this study’s findings, MTM appears to be a robust theoretical framework for predicting PA behavior change. Future research directions and development of suitable intervention strategies are discussed. PMID:27386419
Reconstructing spatial organizations of chromosomes through manifold learning
Deng, Wenxuan; Hu, Hailin; Ma, Rui; Zhang, Sai; Yang, Jinglin; Peng, Jian; Kaplan, Tommy; Zeng, Jianyang
2018-01-01
Decoding the spatial organizations of chromosomes has crucial implications for studying eukaryotic gene regulation. Recently, chromosomal conformation capture based technologies, such as Hi-C, have been widely used to uncover the interaction frequencies of genomic loci in a high-throughput and genome-wide manner and provide new insights into the folding of three-dimensional (3D) genome structure. In this paper, we develop a novel manifold learning based framework, called GEM (Genomic organization reconstructor based on conformational Energy and Manifold learning), to reconstruct the three-dimensional organizations of chromosomes by integrating Hi-C data with biophysical feasibility. Unlike previous methods, which explicitly assume specific relationships between Hi-C interaction frequencies and spatial distances, our model directly embeds the neighboring affinities from Hi-C space into 3D Euclidean space. Extensive validations demonstrated that GEM not only greatly outperformed other state-of-the-art modeling methods but also provided physically and physiologically valid 3D representations of the organizations of chromosomes. Furthermore, we for the first time apply the modeled chromatin structures to recover long-range genomic interactions missing from the original Hi-C data. PMID:29408992
Reconstructing spatial organizations of chromosomes through manifold learning.
Zhu, Guangxiang; Deng, Wenxuan; Hu, Hailin; Ma, Rui; Zhang, Sai; Yang, Jinglin; Peng, Jian; Kaplan, Tommy; Zeng, Jianyang
2018-05-04
Decoding the spatial organizations of chromosomes has crucial implications for studying eukaryotic gene regulation. Recently, chromosomal conformation capture based technologies, such as Hi-C, have been widely used to uncover the interaction frequencies of genomic loci in a high-throughput and genome-wide manner and provide new insights into the folding of three-dimensional (3D) genome structure. In this paper, we develop a novel manifold learning based framework, called GEM (Genomic organization reconstructor based on conformational Energy and Manifold learning), to reconstruct the three-dimensional organizations of chromosomes by integrating Hi-C data with biophysical feasibility. Unlike previous methods, which explicitly assume specific relationships between Hi-C interaction frequencies and spatial distances, our model directly embeds the neighboring affinities from Hi-C space into 3D Euclidean space. Extensive validations demonstrated that GEM not only greatly outperformed other state-of-the-art modeling methods but also provided physically and physiologically valid 3D representations of the organizations of chromosomes. Furthermore, we for the first time apply the modeled chromatin structures to recover long-range genomic interactions missing from the original Hi-C data.
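GEM itself couples conformational energy with manifold learning, but the core idea of embedding Hi-C neighbouring affinities into 3D Euclidean space can be illustrated with an off-the-shelf embedding. The sketch below converts a toy contact-frequency matrix into dissimilarities and embeds it with metric MDS; it is a simplified stand-in for the idea, not the GEM algorithm.

```python
import numpy as np
from sklearn.manifold import MDS

def embed_contacts(contact_matrix, eps=1e-6):
    """Embed genomic loci in 3D from a symmetric Hi-C contact-frequency matrix."""
    c = np.asarray(contact_matrix, dtype=float)
    dissimilarity = 1.0 - c / c.max() + eps       # high contact frequency -> small distance
    np.fill_diagonal(dissimilarity, 0.0)
    mds = MDS(n_components=3, dissimilarity="precomputed", random_state=0)
    return mds.fit_transform(dissimilarity)

# Toy contact matrix: loci along a ring contact their sequence neighbours most often.
n = 20
idx = np.arange(n)
d_seq = np.minimum(np.abs(idx[:, None] - idx[None, :]), n - np.abs(idx[:, None] - idx[None, :]))
contacts = np.exp(-d_seq / 2.0)
coords = embed_contacts(contacts)
print(coords.shape)   # (20, 3) reconstructed coordinates
```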
A first generation BAC-based physical map of the rainbow trout genome
Palti, Yniv; Luo, Ming-Cheng; Hu, Yuqin; Genet, Carine; You, Frank M; Vallejo, Roger L; Thorgaard, Gary H; Wheeler, Paul A; Rexroad, Caird E
2009-01-01
Background Rainbow trout (Oncorhynchus mykiss) are the most-widely cultivated cold freshwater fish in the world and an important model species for many research areas. Coupling great interest in this species as a research model with the need for genetic improvement of aquaculture production efficiency traits justifies the continued development of genomics research resources. Many quantitative trait loci (QTL) have been identified for production and life-history traits in rainbow trout. A bacterial artificial chromosome (BAC) physical map is needed to facilitate fine mapping of QTL and the selection of positional candidate genes for incorporation in marker-assisted selection (MAS) for improving rainbow trout aquaculture production. This resource will also facilitate efforts to obtain and assemble a whole-genome reference sequence for this species. Results The physical map was constructed from DNA fingerprinting of 192,096 BAC clones using the 4-color high-information content fingerprinting (HICF) method. The clones were assembled into physical map contigs using the finger-printing contig (FPC) program. The map is composed of 4,173 contigs and 9,379 singletons. The total number of unique fingerprinting fragments (consensus bands) in contigs is 1,185,157, which corresponds to an estimated physical length of 2.0 Gb. The map assembly was validated by 1) comparison with probe hybridization results and agarose gel fingerprinting contigs; and 2) anchoring large contigs to the microsatellite-based genetic linkage map. Conclusion The production and validation of the first BAC physical map of the rainbow trout genome is described in this paper. We are currently integrating this map with the NCCCWA genetic map using more than 200 microsatellites isolated from BAC end sequences and by identifying BACs that harbor more than 300 previously mapped markers. The availability of an integrated physical and genetic map will enable detailed comparative genome analyses, fine mapping of QTL, positional cloning, selection of positional candidate genes for economically important traits and the incorporation of MAS into rainbow trout breeding programs. PMID:19814815
Comparison of Physical Activity Adult Questionnaire results with accelerometer data.
Garriguet, Didier; Tremblay, Sylvain; Colley, Rachel C
2015-07-01
Discrepancies between self-reported and objectively measured physical activity are well-known. For the purpose of validation, this study compares a new self-reported physical activity questionnaire with an existing one and with accelerometer data. Data collected at one site of the Canadian Health Measures Survey in 2013 were used for this validation study. The International Physical Activity Questionnaire (IPAQ) was administered to respondents during the household interview, and the new Physical Activity for Adults Questionnaire (PAAQ) was administered during a subsequent visit to a mobile examination centre (MEC). At the MEC, respondents were given an accelerometer to wear for seven days. The analysis pertains to 112 respondents aged 18 to 79 who wore the accelerometer for 10 or more hours on at least four days. Moderate-to-vigorous physical activity (MVPA) measured by accelerometer had higher correlation with data from the PAAQ (r = 0.44) than with data from the IPAQ (r = 0.20). The differences between accelerometer and PAAQ data were greater based on accelerometer-measured physical activity accumulated in 10-minute bouts (30-minute difference in MVPA) than on all minutes (9-minute difference). The percentages of respondents meeting the Canadian Physical Activity Guidelines were 90% based on self-reported IPAQ minutes, 70% based on all accelerometer MVPA minutes, 29% based on accelerometer MVPA minutes accumulated in 10-minute bouts, and 61% based on self-reported PAAQ minutes. The PAAQ demonstrated reasonable validity against the accelerometer criterion. Based on correlations and absolute differences between daily minutes of MVPA and the percentages of respondents meeting the Canadian Physical Activity Guidelines, PAAQ results were closer to accelerometer data than were the IPAQ results for the study sample and previous Statistics Canada self-reported questionnaire findings.
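The bout-based versus all-minutes distinction above can be made concrete with a small helper that counts MVPA minutes either way from a minute-by-minute intensity classification. The count cut-point and the synthetic day below are assumptions for illustration, not the survey's processing rules.

```python
import numpy as np

def mvpa_minutes(counts_per_min, cutpoint=1952, bout_len=10):
    """Return (all MVPA minutes, MVPA minutes accumulated in runs of >= bout_len consecutive minutes)."""
    is_mvpa = np.asarray(counts_per_min) >= cutpoint
    total = int(is_mvpa.sum())
    bouted, run = 0, 0
    for flag in np.append(is_mvpa, False):       # sentinel closes a trailing run
        if flag:
            run += 1
        else:
            if run >= bout_len:
                bouted += run
            run = 0
    return total, bouted

# Synthetic day of accelerometer counts: scattered short bursts plus one 25-minute brisk walk.
rng = np.random.default_rng(6)
day = rng.integers(0, 800, 1440)
day[420:445] = 2500                                  # sustained bout
day[rng.choice(1440, 15, replace=False)] = 2200      # isolated active minutes
print(mvpa_minutes(day))
```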
NASA Astrophysics Data System (ADS)
Kuznetsova, Maria
The Community Coordinated Modeling Center (CCMC, http://ccmc.gsfc.nasa.gov) was established at the dawn of the new millennium as a long-term flexible solution to the problem of transitioning progress in space environment modeling to operational space weather forecasting. CCMC hosts an expanding collection of state-of-the-art space weather models developed by the international space science community. Over the years the CCMC has acquired unique experience in preparing complex models and model chains for operational environments and in developing and maintaining custom displays and powerful web-based systems and tools ready to be used by researchers, space weather service providers and decision makers. In support of the space weather needs of NASA users, CCMC is developing highly tailored applications and services that target specific orbits or locations in space and partnering with NASA mission specialists on linking CCMC space environment modeling with impacts on biological and technological systems in space. Confidence assessment of model predictions is an essential element of space environment modeling. CCMC facilitates interaction between model owners and users in defining physical parameters and metrics formats relevant to specific applications and leads community efforts to quantify models' ability to simulate and predict space environment events. Interactive on-line model validation systems developed at CCMC make validation a seamless part of the model development cycle. The talk will showcase innovative solutions for space weather research, validation, anomaly analysis and forecasting and review on-going community-wide model validation initiatives enabled by CCMC applications.
Microstructure-based approach for predicting crack initiation and early growth in metals.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cox, James V.; Emery, John M.; Brewer, Luke N.
2009-09-01
Fatigue cracking in metals has long been, and remains, an area of great importance to the science and technology of structural materials. The earliest stages of fatigue crack nucleation and growth are dominated by the microstructure, and yet few models are able to predict the fatigue behavior during these stages because of a lack of microstructural physics in the models. This program has developed several new simulation tools to increase the microstructural physics available for fatigue prediction. In addition, this program has extended and developed microscale experimental methods to allow the validation of new microstructural models for deformation in metals. We have applied these developments to fatigue experiments in metals where the microstructure has been intentionally varied.
2011-01-01
Background Guidance documents for the development and validation of patient-reported outcomes (PROs) advise the use of conceptual frameworks, which outline the structure of the concept that a PRO aims to measure. It is unknown whether currently available PROs are based on conceptual frameworks. This study, which was limited to a specific case, had the following aims: (i) to identify conceptual frameworks of physical activity in chronic respiratory patients or similar populations (chronic heart disease patients or the elderly) and (ii) to assess whether the development and validation of PROs to measure physical activity in these populations were based on a conceptual framework of physical activity. Methods Two systematic reviews were conducted through searches of the Medline, Embase, PsycINFO, and Cinahl databases prior to January 2010. Results In the first review, only 2 out of 581 references pertaining to physical activity in the defined populations provided a conceptual framework of physical activity in COPD patients. In the second review, out of 103 studies developing PROs to measure physical activity or related constructs, none were based on a conceptual framework of physical activity. Conclusions These findings raise concerns about how the large body of evidence from studies that use physical activity PRO instruments should be evaluated by health care providers, guideline developers, and regulatory agencies. PMID:21967887
Computer-based personality judgments are more accurate than those made by humans
Youyou, Wu; Kosinski, Michal; Stillwell, David
2015-01-01
Judging others’ personalities is an essential skill in successful social living, as personality is a key driver behind people’s interactions, behaviors, and emotions. Although accurate personality judgments stem from social-cognitive skills, developments in machine learning show that computer models can also make valid judgments. This study compares the accuracy of human and computer-based personality judgments, using a sample of 86,220 volunteers who completed a 100-item personality questionnaire. We show that (i) computer predictions based on a generic digital footprint (Facebook Likes) are more accurate (r = 0.56) than those made by the participants’ Facebook friends using a personality questionnaire (r = 0.49); (ii) computer models show higher interjudge agreement; and (iii) computer personality judgments have higher external validity when predicting life outcomes such as substance use, political attitudes, and physical health; for some outcomes, they even outperform the self-rated personality scores. Computers outpacing humans in personality judgment presents significant opportunities and challenges in the areas of psychological assessment, marketing, and privacy. PMID:25583507
Computer-based personality judgments are more accurate than those made by humans.
Youyou, Wu; Kosinski, Michal; Stillwell, David
2015-01-27
Judging others' personalities is an essential skill in successful social living, as personality is a key driver behind people's interactions, behaviors, and emotions. Although accurate personality judgments stem from social-cognitive skills, developments in machine learning show that computer models can also make valid judgments. This study compares the accuracy of human and computer-based personality judgments, using a sample of 86,220 volunteers who completed a 100-item personality questionnaire. We show that (i) computer predictions based on a generic digital footprint (Facebook Likes) are more accurate (r = 0.56) than those made by the participants' Facebook friends using a personality questionnaire (r = 0.49); (ii) computer models show higher interjudge agreement; and (iii) computer personality judgments have higher external validity when predicting life outcomes such as substance use, political attitudes, and physical health; for some outcomes, they even outperform the self-rated personality scores. Computers outpacing humans in personality judgment presents significant opportunities and challenges in the areas of psychological assessment, marketing, and privacy.
NASA Astrophysics Data System (ADS)
Zhang, Xinzhong; Haidvogel, Dale; Munroe, Daphne; Powell, Eric N.; Klinck, John; Mann, Roger; Castruccio, Frederic S.
2015-02-01
To study the primary larval transport pathways and inter-population connectivity patterns of the Atlantic surfclam, Spisula solidissima, a coupled modeling system combining a physical circulation model of the Middle Atlantic Bight (MAB), Georges Bank (GBK) and the Gulf of Maine (GoM), and an individual-based surfclam larval model was implemented, validated and applied. Model validation shows that the model can reproduce the observed physical circulation patterns and surface and bottom water temperature, and recreates the observed distributions of surfclam larvae during upwelling and downwelling events. The model results show a typical along-shore connectivity pattern from the northeast to the southwest among the surfclam populations distributed from Georges Bank west and south along the MAB shelf. Continuous surfclam larval input into regions off Delmarva (DMV) and New Jersey (NJ) suggests that insufficient larval supply is unlikely to be the factor causing the failure of the population to recover after the observed decline of the surfclam populations in DMV and NJ from 1997 to 2005. The GBK surfclam population is relatively more isolated than populations to the west and south in the MAB; model results suggest substantial inter-population connectivity from southern New England to the Delmarva region. Simulated surfclam larvae generally drift for over one hundred kilometers along the shelf, but the distance traveled is highly variable in space and over time. Surfclam larval growth and transport are strongly impacted by the physical environment. This suggests the need to further examine how the interaction between environment, behavior, and physiology affects inter-population connectivity. Larval vertical swimming and sinking behaviors have a significant net effect of increasing larval drifting distances when compared with a purely passive model, confirming the need to include larval behavior.
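A heavily simplified sketch of the individual-based larval component: particles are advected by prescribed shelf currents while a vertical swimming/sinking rule modulates the depth at which they are transported. The velocities, behaviour rule, and function name are placeholders for illustration, not the MAB circulation model or the surfclam behaviour submodel.

```python
import numpy as np

def track_larvae(n_days=30, dt_hours=6, n_particles=200, seed=7):
    """Advect larvae in (x: along-shore km, y: cross-shore km, z: depth m) with a toy behaviour rule."""
    rng = np.random.default_rng(seed)
    x = np.zeros(n_particles)
    y = np.zeros(n_particles)
    z = rng.uniform(5, 15, n_particles)
    for step in range(int(n_days * 24 / dt_hours)):
        # Placeholder shear: along-shore drift toward the southwest, stronger near the surface.
        u = -0.15 + 0.10 * (z / 30.0)          # km per hour (negative = southwestward)
        v = 0.01 * rng.standard_normal(n_particles)
        x += u * dt_hours
        y += v * dt_hours
        # Toy behaviour: swim up during upwelling-like steps, sink otherwise.
        z += np.where(step % 8 < 4, -0.5, 0.8) * rng.random(n_particles)
        z = np.clip(z, 1.0, 30.0)
    return x, y, z

x, y, z = track_larvae()
print(f"median along-shore displacement after 30 days: {np.median(x):.0f} km")
```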
Dwivedi, Dipankar; Mohanty, Binayak P.; Lesikar, Bruce J.
2013-01-01
Microbes have been identified as a major contaminant of water resources. Escherichia coli (E. coli) is a commonly used indicator organism. It is well recognized that the fate of E. coli in surface water systems is governed by multiple physical, chemical, and biological factors. The aim of this work is to provide insight into the physical, chemical, and biological factors, along with their interactions, that are critical in the estimation of E. coli loads in surface streams. There are various models to predict E. coli loads in streams, but they tend to be system or site specific or overly complex without enhancing our understanding of these factors. Hence, based on available data, a Bayesian Neural Network (BNN) is presented for estimating E. coli loads based on physical, chemical, and biological factors in streams. The BNN has the dual advantage of overcoming the absence of quality data (with regard to consistency in data) and of determining mechanistic model parameters by employing a probabilistic framework. This study evaluates whether the BNN model can be an effective alternative tool to mechanistic models for E. coli load estimation in streams. For this purpose, a comparison with a traditional model (LOADEST, USGS) is conducted. The models are compared for estimated E. coli loads based on available water quality data in Plum Creek, Texas. All the model efficiency measures suggest that overall E. coli load estimations by the BNN model are better than the E. coli load estimations by the LOADEST model on all three occasions (three-fold cross validation). Thirteen factors were used for estimating E. coli loads with the exhaustive feature selection technique, which indicated that six of the thirteen factors are important for estimating E. coli loads. Physical factors include temperature and dissolved oxygen; chemical factors include phosphate and ammonia; biological factors include suspended solids and chlorophyll. The results highlight that the LOADEST model estimates E. coli loads better in the smaller ranges, whereas the BNN model estimates E. coli loads better in the higher ranges. Hence, the BNN model can be used to design targeted monitoring programs and implement regulatory standards through TMDL programs. PMID:24511166
NASA Astrophysics Data System (ADS)
Arevalo, L.; Wu, D.; Jacobson, B.
2013-08-01
The main purpose of this paper is to present a physical model of long air gap electrical discharges under positive switching impulses. The development and progression of discharges in long air gaps are attributable to two intertwined physical phenomena, namely, the leader channel and the streamer zone. Experimental studies have been used to develop empirical and physical models capable of representing the streamer zone and the leader channel. The empirical ones have led to improvements in the electrical design of high voltage apparatus and insulation distances, but they cannot take into account factors associated with fundamental physics and/or the behavior of materials. The physical models have been used to describe and understand the discharge phenomena of laboratory and lightning discharges. However, because of the complex simulations necessary to reproduce real cases, they are not in widespread use in the engineering of practical applications. Hence, the aim of the work presented here is to develop a model based on the physics of the discharge, capable of validating and complementing the existing engineering models. The model presented here proposes a new geometrical approximation for the representation of the streamer and the calculation of the accumulated electrical charge. The model considers a variable streamer region that changes with the temporal and spatial variations of the electric field. The leader channel is modeled using the non-local thermo-equilibrium equations. Furthermore, statistical delays before the inception of the first corona, and random distributions to represent the tortuous nature of the path taken by the leader channel, were included based on the behavior observed in experimental tests, with the intention of ensuring the discharge behaved in a realistic manner. For comparison purposes, two different gap configurations were simulated. A reasonable agreement was found between the physical model and the experimental test results.
NASA Astrophysics Data System (ADS)
Kartalov, Emil P.; Scherer, Axel; Quake, Stephen R.; Taylor, Clive R.; Anderson, W. French
2007-03-01
A systematic experimental study and theoretical modeling of the device physics of polydimethylsiloxane "pushdown" microfluidic valves are presented. The phase space is charted by 1587 dimension combinations and encompasses 45-295μm lateral dimensions, 16-39μm membrane thickness, and 1-28psi closing pressure. Three linear models are developed and tested against the empirical data, and then combined into a fourth-power-polynomial superposition. The experimentally validated final model offers a useful quantitative prediction for a valve's properties as a function of its dimensions. Typical valves (80-150μm width) are shown to behave like thin springs.
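As an illustration of the kind of dimensional fit described above (a sketch only, not the authors' fourth-power-polynomial superposition), closing pressure can be regressed on a polynomial basis in the thickness-to-width and thickness-to-length ratios by ordinary least squares; variable names are assumed.

```python
import numpy as np

def fit_closing_pressure(width_um, length_um, thickness_um, p_close_psi, degree=4):
    """Least-squares fit of closing pressure to a polynomial in t/w and t/l."""
    w, l, t, p = map(np.asarray, (width_um, length_um, thickness_um, p_close_psi))
    x1, x2 = t / w, t / l   # thin-membrane scaling ratios (illustrative choice of basis)
    X = np.column_stack([x1**d for d in range(1, degree + 1)]
                        + [x2**d for d in range(1, degree + 1)]
                        + [np.ones_like(x1)])
    coeffs, *_ = np.linalg.lstsq(X, p, rcond=None)
    return coeffs  # predict new valves with the same basis: X_new @ coeffs
```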
Anderson, P. S. L.; Rayfield, E. J.
2012-01-01
Computational models such as finite-element analysis offer biologists a means of exploring the structural mechanics of biological systems that cannot be directly observed. Validated against experimental data, a model can be manipulated to perform virtual experiments, testing variables that are hard to control in physical experiments. The relationship between tooth form and the ability to break down prey is key to understanding the evolution of dentition. Recent experimental work has quantified how tooth shape promotes fracture in biological materials. We present a validated finite-element model derived from physical compression experiments. The model shows close agreement with strain patterns observed in photoelastic test materials and reaction forces measured during these experiments. We use the model to measure strain energy within the test material when different tooth shapes are used. Results show that notched blades deform materials for less strain energy cost than straight blades, giving insights into the energetic relationship between tooth form and prey materials. We identify a hypothetical ‘optimal’ blade angle that minimizes strain energy costs and test alternative prey materials via virtual experiments. Using experimental data and computational models offers an integrative approach to understand the mechanics of tooth morphology. PMID:22399789
DOE Office of Scientific and Technical Information (OSTI.GOV)
Klein, Levente
Interpreting sensor data requires knowledge about sensor placement and the surrounding environment. For a single sensor measurement, it is easy to document the context by visual observation; however, for millions of sensors reporting data back to a server, the contextual information needs to be extracted automatically, either from data analysis or by leveraging complementary data sources. Data layers that overlap spatially or temporally with sensor locations can be used to extract the context and to validate the measurement. To minimize the amount of data transmitted through the internet, while preserving signal information content, two methods are explored: computation at the edge and compressed sensing. We validate the above methods on wind and chemical sensor data to (1) eliminate redundant measurements from wind sensors and (2) extract the peak value of a chemical sensor measuring a methane plume. We present a general cloud-based framework to validate sensor data based on statistical and physical modeling and contextual data extracted from geospatial data.
Low Order Modeling Tools for Preliminary Pressure Gain Combustion Benefits Analyses
NASA Technical Reports Server (NTRS)
Paxson, Daniel E.
2012-01-01
Pressure gain combustion (PGC) offers the promise of higher thermodynamic cycle efficiency and greater specific power in propulsion and power systems. This presentation describes a model, developed under a cooperative agreement between NASA and AFRL, for preliminarily assessing the performance enhancement and preliminary size requirements of PGC components either as stand-alone thrust producers or coupled with surrounding turbomachinery. The model is implemented in the Numerical Propulsion Simulation System (NPSS) environment allowing various configurations to be examined at numerous operating points. The validated model is simple, yet physics-based. It executes quickly in NPSS, yet produces realistic results.
2015-06-24
physically. While not distinct from IH models, they require inner boundary magnetic field and plasma property values, the latter not currently measured ... initialization for the computational grid. Model integration continues until a physically consistent steady-state is attained. Because of the more ... physical basis and greater likelihood of realistic solutions, only MHD-type coronal models were considered in the review. There are two major types of ...
Validation of Tendril TrueHome Using Software-to-Software Comparison
DOE Office of Scientific and Technical Information (OSTI.GOV)
Maguire, Jeffrey B; Horowitz, Scott G; Moore, Nathan
This study performed comparative evaluation of EnergyPlus version 8.6 and Tendril TrueHome, two physics-based home energy simulation models, to identify differences in energy consumption predictions between the two programs and resolve discrepancies between them. EnergyPlus is considered a benchmark, best-in-class software tool for building energy simulation. This exercise sought to improve both software tools through additional evaluation/scrutiny.
Evaluation of TOPLATS on three Mediterranean catchments
NASA Astrophysics Data System (ADS)
Loizu, Javier; Álvarez-Mozos, Jesús; Casalí, Javier; Goñi, Mikel
2016-08-01
Physically based hydrological models are complex tools that provide a complete description of the different processes occurring on a catchment. The TOPMODEL-based Land-Atmosphere Transfer Scheme (TOPLATS) simulates water and energy balances at different time steps, in both lumped and distributed modes. In order to gain insight on the behavior of TOPLATS and its applicability in different conditions a detailed evaluation needs to be carried out. This study aimed to develop a complete evaluation of TOPLATS including: (1) a detailed review of previous research works using this model; (2) a sensitivity analysis (SA) of the model with two contrasted methods (Morris and Sobol) of different complexity; (3) a 4-step calibration strategy based on a multi-start Powell optimization algorithm; and (4) an analysis of the influence of simulation time step (hourly vs. daily). The model was applied on three catchments of varying size (La Tejeria, Cidacos and Arga), located in Navarre (Northern Spain), and characterized by different levels of Mediterranean climate influence. Both Morris and Sobol methods showed very similar results that identified Brooks-Corey Pore Size distribution Index (B), Bubbling pressure (ψc) and Hydraulic conductivity decay (f) as the three overall most influential parameters in TOPLATS. After calibration and validation, adequate streamflow simulations were obtained in the two wettest catchments, but the driest (Cidacos) gave poor results in validation, due to the large climatic variability between calibration and validation periods. To overcome this issue, an alternative random and discontinuous method of cal/val period selection was implemented, improving model results.
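A variance-based sensitivity analysis of the kind reported above can be sketched with the SALib package as below; `run_toplats` is a hypothetical stand-in for a wrapper that runs the hydrological model for one parameter set and returns an efficiency score, and the parameter bounds are illustrative, not calibrated values.

```python
import numpy as np
from SALib.sample import saltelli
from SALib.analyze import sobol

problem = {
    "num_vars": 3,
    "names": ["B", "psi_c", "f"],   # Brooks-Corey index, bubbling pressure, K decay
    "bounds": [[0.1, 0.8], [0.01, 0.5], [0.1, 5.0]],  # illustrative ranges only
}

def run_toplats(params):
    # placeholder for a real TOPLATS run returning, e.g., Nash-Sutcliffe efficiency
    B, psi_c, f = params
    return 1.0 - (B - 0.4) ** 2 - 0.1 * f

X = saltelli.sample(problem, 512)            # Saltelli sampling scheme
Y = np.array([run_toplats(x) for x in X])    # model response for each sample
Si = sobol.analyze(problem, Y)
print(Si["S1"], Si["ST"])                    # first-order and total-order indices
```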
Helium ions for radiotherapy? Physical and biological verifications of a novel treatment modality
DOE Office of Scientific and Technical Information (OSTI.GOV)
Krämer, Michael, E-mail: m.kraemer@gsi.de; Scifoni, Emanuele; Schuy, Christoph
Purpose: Modern facilities for actively scanned ion beam radiotherapy allow in principle the use of helium beams, which could present specific advantages, especially for pediatric tumors. In order to assess the potential use of these beams for radiotherapy, i.e., to create realistic treatment plans, the authors set up a dedicated 4He beam model, providing base data for their treatment planning system TRiP98, which is reported in this work together with its physical and biological validation. Methods: A semiempirical beam model for the physical depth dose deposition and the production of nuclear fragments was developed and introduced in TRiP98. For the biological effect calculations the latest version of the local effect model was used. The model predictions were experimentally verified at the HIT facility. The primary beam attenuation and the characteristics of secondary charged particles at various depths in water were investigated using 4He ion beams of 200 MeV/u. The nuclear charge of secondary fragments was identified using a ΔE/E telescope. 3D absorbed dose distributions were measured with pin point ionization chambers, and the biological dosimetry experiments were realized by irradiating a stack of Chinese hamster ovary cells arranged in an extended target. Results: The few experimental data available on basic physical processes are reproduced by their beam model. The experimental verification of absorbed dose distributions in extended target volumes yields an overall agreement, with a slight underestimation of the lateral spread. Cell survival along a 4 cm extended target is reproduced with remarkable accuracy. Conclusions: The authors presented a simple simulation model for therapeutic 4He beams which they introduced in TRiP98, and which is validated experimentally by means of physical and biological dosimetry. Thus, it is now possible to perform detailed treatment planning studies with 4He beams, either exclusively or in combination with other ion modalities.
NASA Astrophysics Data System (ADS)
Huda, C.; Hudha, M. N.; Ain, N.; Nandiyanto, A. B. D.; Abdullah, A. G.; Widiaty, I.
2018-01-01
Computer programming courses are largely theoretical, so sufficient practice is necessary to facilitate conceptual understanding and to encourage creativity in designing computer programs/animations. The development of tutorial videos in Android-based blended learning is needed to guide students. Using Android-based instructional material, students can learn independently anywhere and anytime. The tutorial video can facilitate students' understanding of the concepts, materials, and procedures of programming/animation making in detail. This study employed a Research and Development method adapting Thiagarajan's 4D model. The developed Android-based instructional material and tutorial video were validated by experts in instructional media and experts in physics education. The expert validation results showed that the Android-based material was comprehensive and very feasible. The tutorial video was deemed feasible as it received an average score of 92.9%. It was also revealed that students' conceptual understanding, skills, and creativity in designing computer programs/animations improved significantly.
Predictors of validity and reliability of a physical activity record in adolescents
2013-01-01
Background Poor to moderate validity of self-reported physical activity instruments is commonly observed in young people in low- and middle-income countries. However, the reasons for such low validity have not been examined in detail. We tested the validity of a self-administered daily physical activity record in adolescents and assessed if personal characteristics or the convenience level of reporting physical activity modified the validity estimates. Methods The study comprised a total of 302 adolescents from an urban and rural area in Ecuador. Validity was evaluated by comparing the record with accelerometer recordings for seven consecutive days. Test-retest reliability was examined by comparing registrations from two records administered three weeks apart. Time spent on sedentary (SED), low (LPA), moderate (MPA) and vigorous (VPA) intensity physical activity was estimated. Bland Altman plots were used to evaluate measurement agreement. We assessed if age, sex, urban or rural setting, anthropometry and convenience of completing the record explained differences in validity estimates using a linear mixed model. Results Although the record provided higher estimates for SED and VPA and lower estimates for LPA and MPA compared to the accelerometer, it showed an overall fair measurement agreement for validity. There was modest reliability for assessing physical activity in each intensity level. Validity was associated with adolescents’ personal characteristics: sex (SED: P = 0.007; LPA: P = 0.001; VPA: P = 0.009) and setting (LPA: P = 0.000; MPA: P = 0.047). Reliability was associated with the convenience of completing the physical activity record for LPA (low convenience: P = 0.014; high convenience: P = 0.045). Conclusions The physical activity record provided acceptable estimates for reliability and validity on a group level. Sex and setting were associated with validity estimates, whereas convenience to fill out the record was associated with better reliability estimates for LPA. This tendency of improved reliability estimates for adolescents reporting higher convenience merits further consideration. PMID:24289296
Hung, Man; Baumhauer, Judith F; Latt, L Daniel; Saltzman, Charles L; SooHoo, Nelson F; Hunt, Kenneth J
2013-11-01
In 2012, the American Orthopaedic Foot & Ankle Society® established a national network for collecting and sharing data on treatment outcomes and improving patient care. One of the network's initiatives is to explore the use of computerized adaptive tests (CATs) for patient-level outcome reporting. We determined whether the CAT from the NIH Patient Reported Outcome Measurement Information System® (PROMIS®) Physical Function (PF) item bank provides efficient, reliable, valid, precise, and adequately covered point estimates of patients' physical function. After informed consent, 288 patients with a mean age of 51 years (range, 18-81 years) undergoing surgery for common foot and ankle problems completed a web-based questionnaire. Efficiency was determined by time for test administration. Reliability was assessed with person and item reliability estimates. Validity evaluation included content validity from expert review and construct validity measured against the PROMIS® Pain CAT and patient responses based on tradeoff perceptions. Precision was assessed by standard error of measurement (SEM) across patients' physical function levels. Instrument coverage was based on a person-item map. Average time of test administration was 47 seconds. Reliability was 0.96 for person and 0.99 for item. Construct validity against the Pain CAT had an r value of -0.657 (p < 0.001). Precision had an SEM of less than 3.3 (equivalent to a Cronbach's alpha of ≥ 0.90) across a broad range of function. Concerning coverage, the ceiling effect was 0.32% and there was no floor effect. The PROMIS® PF CAT appears to be an excellent method for measuring outcomes for patients with foot and ankle surgery. Further validation of the PROMIS® item banks may ultimately provide a valid and reliable tool for measuring patient-reported outcomes after injuries and treatment.
Automatic paper sliceform design from 3D solid models.
Le-Nguyen, Tuong-Vu; Low, Kok-Lim; Ruiz, Conrado; Le, Sang N
2013-11-01
A paper sliceform or lattice-style pop-up is a form of papercraft that uses two sets of parallel paper patches slotted together to make a foldable structure. The structure can be folded flat, as well as fully opened (popped-up) to make the two sets of patches orthogonal to each other. Automatic design of paper sliceforms is still not supported by existing computational models and remains a challenge. We propose novel geometric formulations of valid paper sliceform designs that consider the stability, flat-foldability and physical realizability of the designs. Based on a set of sufficient construction conditions, we also present an automatic algorithm for generating valid sliceform designs that closely depict the given 3D solid models. By approximating the input models using a set of generalized cylinders, our method significantly reduces the search space for stable and flat-foldable sliceforms. To ensure the physical realizability of the designs, the algorithm automatically generates slots or slits on the patches such that no two cycles embedded in two different patches are interlocking each other. This guarantees local pairwise assemblability between patches, which is empirically shown to lead to global assemblability. Our method has been demonstrated on a number of example models, and the output designs have been successfully made into real paper sliceforms.
Test, revision, and cross-validation of the Physical Activity Self-Definition Model.
Kendzierski, Deborah; Morganstein, Mara S
2009-08-01
Structural equation modeling was used to test an extended version of the Kendzierski, Furr, and Schiavoni (1998) Physical Activity Self-Definition Model. A revised model using data from 622 runners fit the data well. Cross-validation indices supported the revised model, and this model also provided a good fit to data from 397 cyclists. Partial invariance was found across activities. In both samples, perceived commitment and perceived ability had direct effects on self-definition, and perceived wanting, perceived trying, and enjoyment had indirect effects. The contribution of perceived ability to self-definition did not differ across activities. Implications concerning the original model, indirect effects, skill salience, and the role of context in self-definition are discussed.
NASA Astrophysics Data System (ADS)
Panda, D. K.; Lenka, T. R.
2017-06-01
An enhancement mode p-GaN gate AlGaN/GaN HEMT is proposed and a physics-based virtual source charge model with the Landauer approach for electron transport has been developed using Verilog-A and simulated using Cadence Spectre, in order to predict device characteristics such as threshold voltage, drain current and gate capacitance. The drain current model incorporates important physical effects such as velocity saturation, short channel effects like DIBL (drain induced barrier lowering), channel length modulation (CLM), and mobility degradation due to self-heating. The predicted Id-Vds, Id-Vgs, and C-V characteristics show an excellent agreement with the experimental data for both drain current and capacitance, which validates the model. The developed model was then utilized to design and simulate a single-pole single-throw (SPST) RF switch.
NASA Astrophysics Data System (ADS)
Song, X.; Frey, E. C.; Wang, W. T.; Du, Y.; Tsui, B. M. W.
2004-02-01
Simultaneous acquisition of 99mTc stress and 201Tl rest myocardial perfusion SPECT has several potential advantages, but the image quality is degraded by crosstalk between the Tc and Tl data. We have previously developed a crosstalk model that includes estimates of the downscatter and Pb X-ray for use in crosstalk compensation. In this work, we validated the model by comparing the crosstalk from 99mTc into the Tl window with that calculated using a combination of the SimSET-MCNP Monte Carlo simulation codes. We also evaluated the model-based crosstalk compensation method using both simulated data from the 3-D MCAT phantom and experimental data from a physical phantom with a myocardial defect. In these studies, the Tl distributions were reconstructed from crosstalk-contaminated data without crosstalk compensation, with compensation using the model-based crosstalk estimate, and with compensation using the known true crosstalk, and were compared with the Tl distribution reconstructed from uncontaminated Tl data. Results show that the model gave good estimates of both the downscatter photons and Pb X-rays in simultaneous dual-isotope myocardial perfusion SPECT. The model-based compensation method provided image quality that was significantly improved compared to no compensation and was very close to that from the separate acquisition.
NASA Astrophysics Data System (ADS)
Cui, Yi-an; Liu, Lanbo; Zhu, Xiaoxiong
2017-08-01
Monitoring the extent and evolution of contaminant plumes in local and regional groundwater systems from existing landfills is critical in contamination control and remediation. The self-potential survey is an efficient and economical nondestructive geophysical technique that can be used to investigate underground contaminant plumes. Based on the unscented transform, we built a Kalman filtering cycle to conduct time-lapse data assimilation for monitoring solute transport, using a solute transport experiment on a bench-scale physical model. The data assimilation combines an evolution step based on a random-walk model with an observation-correction step based on the self-potential forward model. Monitoring self-potential data can thus be inverted by the data assimilation technique. As a result, we can reconstruct the dynamic process of the contaminant plume instead of using traditional frame-to-frame static inversion, which may cause inversion artifacts. The data assimilation inversion algorithm was evaluated with noise-added synthetic time-lapse self-potential data. The result of the numerical experiment demonstrates the validity, accuracy and noise tolerance of the dynamic inversion. To validate the proposed algorithm, we conducted a scaled-down sandbox self-potential observation experiment to generate time-lapse data that closely mimics the real-world contaminant monitoring setup. The results of the physical experiments support the idea that data assimilation is a potentially useful approach for characterizing the transport of contaminant plumes using the unscented Kalman filter (UKF) applied to field time-lapse self-potential data.
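The predict/correct cycle described above can be sketched with an off-the-shelf unscented Kalman filter (here the filterpy package); the state vector, electrode layout and forward model below are simplified placeholders, not the authors' self-potential forward solver.

```python
import numpy as np
from filterpy.kalman import MerweScaledSigmaPoints, UnscentedKalmanFilter

DIM_X, DIM_Z = 3, 8   # e.g. (x-position, depth, strength) of a plume source; 8 electrodes

def fx(x, dt):
    return x          # random-walk evolution: the process noise Q carries the dynamics

def hx(x):
    # placeholder forward model: potential decays with distance to each electrode
    electrodes = np.linspace(0.0, 1.0, DIM_Z)
    return x[2] / (1.0 + np.abs(electrodes - x[0]) + np.abs(x[1]))

points = MerweScaledSigmaPoints(n=DIM_X, alpha=1e-2, beta=2.0, kappa=0.0)
ukf = UnscentedKalmanFilter(dim_x=DIM_X, dim_z=DIM_Z, dt=1.0, hx=hx, fx=fx, points=points)
ukf.x = np.array([0.5, 0.2, 1.0])     # initial guess of the plume parameters
ukf.Q = np.eye(DIM_X) * 1e-3          # random-walk step size
ukf.R = np.eye(DIM_Z) * 1e-2          # self-potential measurement noise

history = []
for z in np.random.default_rng(0).normal(0.5, 0.1, size=(20, DIM_Z)):  # stand-in data
    ukf.predict()
    ukf.update(z)
    history.append(ukf.x.copy())      # time-lapse estimate of the plume state
```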
The Canadian Assessment of Physical Literacy: methods for children in grades 4 to 6 (8 to 12 years).
Longmuir, Patricia E; Boyer, Charles; Lloyd, Meghann; Yang, Yan; Boiarskaia, Elena; Zhu, Weimo; Tremblay, Mark S
2015-08-11
Physical literacy is described as the motivation, confidence, physical competence, knowledge and understanding to value and engage in a physically active lifestyle. As such, it is expected that those who have greater physical literacy would be more likely to obtain the health benefits offered by habitual physical activity. A theoretical model and assessment battery, the Canadian Assessment of Physical Literacy (CAPL), for the assessment of childhood physical literacy had been proposed in theory but validity data were lacking. The purpose of this study was to explore validity evidence for the CAPL among children in grades 4 to 6. CAPL validity was evaluated through three analyses that utilized cross-sectional data obtained through local schools in Eastern Ontario, Canada. A confirmatory factor analysis compared the data to the theoretical model. Patterns of association between self-reported age and gender and the CAPL total and domain scores were examined using regression models. Teacher ratings of participants' knowledge, attitude and physical activity competence were compared to assessment results. The CAPL was completed by 963 children (55 % female) in grades 4, 5 and 6. Children were 8 to 12 years of age (mean 10.1 years), with 85 % of children approached agreeing to participate. A confirmatory factor analysis using data from 489 children with complete raw scores supported a model with four domains: engagement in physical activity (active and sedentary), physical competence (fitness and motor skill), motivation and confidence, and knowledge and understanding. Raw domain scores followed expected patterns for age and gender, providing evidence for their validity. Interpretive categories, developed from age and gender adjusted normative data, were not associated with age indicating that the CAPL is suitable for use across this age range. Children's gender was associated with the physical competence, motivation and engagement in physical activity domain scores, indicating that further research is required regarding the gender adjustment of the raw CAPL scores. CAPL domain and total scores were statistically significantly associated with teacher ratings of the child's motivation, attitudes, fitness, skill and overall physical activity. CAPL offers a comprehensive assessment of engagement in physical activity, physical competence, motivation and confidence, and knowledge and understanding as components of childhood (grades 4 to 6, 8 to 12 years) physical literacy. Monitoring of these measures enhances our understanding of children's physical literacy, and assists with the identification of areas where additional supports are required.
Abasi, Mohammad Hadi; Eslami, Ahmad Ali; Rakhshani, Fatemeh; Shiri, Mansoor
2016-01-01
Attention to different aspects of self-efficacy leads to actual evaluation of self-efficacy about physical activity. This study was carried out in order to design and determine the psychometric characteristics of a questionnaire for the evaluation of self-efficacy about leisure time physical activity (SELPA) among Iranian adolescent boys, with an emphasis on regulatory self-efficacy. This descriptive-analytic study was conducted in 734 male adolescents aged 15-19 years in Isfahan. After item generation and item selection based on a review of the literature and other questionnaires, the content validity index (CVI) and content validity ratio (CVR) were determined and items were modified employing the opinions of an expert panel (N = 10). Comprehensibility of the questionnaire was determined by members of the target group (N = 35). Exploratory factor analysis (EFA) was performed on sample 1 (N1 = 325) and confirmatory factor analysis (CFA) on sample 2 (N2 = 347). Reliability of SELPA was estimated via the internal consistency method. According to the EFA, barrier self-efficacy and scheduling self-efficacy are the two main aspects of SELPA, together explaining 65% of the total variance. The suggested model was confirmed by CFA and all fit indices of the corrected model were good. Cronbach's alpha was estimated as 0.89 overall, and as 0.86 and 0.81 for barrier and scheduling self-efficacy, respectively. The results provide some evidence for acceptable validity and reliability of SELPA in Iranian adolescent boys. However, further investigations, especially for evaluation of the predictive power of the questionnaire, are necessary.
Nonlinear scaling of the Unit Hydrograph Peaking Factor for dam safety
NASA Astrophysics Data System (ADS)
Pradhan, N. R.; Loney, D.
2017-12-01
Existing U.S. Army Corps of Engineers (USACE) policy suggests that the unit hydrograph peaking factor (UHPF), the ratio of an observed to a modeled event unit hydrograph peak, should range between 1.25 and 1.50 to ensure dam safety. It is pertinent to investigate the impact of extreme flood events on the validity of this range through physically based rainfall-runoff models that were not available during the planning and design of most USACE dams. The UHPF range was analyzed by deploying the Gridded Surface Subsurface Hydrologic Analysis (GSSHA) model in the Goose Creek, VA, watershed to develop a UHPF relationship with excess rainfall across various return-period events. An effective rainfall factor (ERF) is introduced to validate existing UHPF guidance as well as to provide a nonlinear UHPF scaling relation when the effective rainfall does not match that of the UH design event.
Hernando, Barbara; Ibañez, Maria Victoria; Deserio-Cuesta, Julio Alberto; Soria-Navarro, Raquel; Vilar-Sastre, Inca; Martinez-Cadenas, Conrado
2018-03-01
Prediction of human pigmentation traits, one of the most differentiable externally visible characteristics among individuals, from biological samples represents a useful tool in the field of forensic DNA phenotyping. In spite of freckling being a relatively common pigmentation characteristic in Europeans, little is known about the genetic basis of this largely genetically determined phenotype in southern European populations. In this work, we explored the predictive capacity of eight freckle and sunlight sensitivity-related genes in 458 individuals (266 non-freckled controls and 192 freckled cases) from Spain. Four loci were associated with freckling (MC1R, IRF4, ASIP and BNC2), and female sex was also found to be a predictive factor for having a freckling phenotype in our population. After identifying the most informative genetic variants responsible for human ephelides occurrence in our sample set, we developed a DNA-based freckle prediction model using a multivariate regression approach. Once developed, the capabilities of the prediction model were tested by a repeated 10-fold cross-validation approach. The proportion of correctly predicted individuals using the DNA-based freckle prediction model was 74.13%. The implementation of sex into the DNA-based freckle prediction model slightly improved the overall prediction accuracy by 2.19% (76.32%). Further evaluation of the newly-generated prediction model was performed by assessing the model's performance in a new cohort of 212 Spanish individuals, reaching a classification success rate of 74.61%. Validation of this prediction model may be carried out in larger populations, including samples from different European populations. Further research to validate and improve this newly-generated freckle prediction model will be needed before its forensic application. Together with DNA tests already validated for eye and hair colour prediction, this freckle prediction model may lead to a substantially more detailed physical description of unknown individuals from DNA found at the crime scene. Copyright © 2017 Elsevier B.V. All rights reserved.
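A minimal sketch of this style of DNA-based prediction model, assuming a table with genotype scores for the four associated loci plus sex and a binary freckling outcome (all column names hypothetical), using logistic regression with repeated 10-fold cross-validation:

```python
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import RepeatedStratifiedKFold, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Hypothetical predictor columns standing in for the four associated loci plus sex.
PREDICTORS = ["MC1R_variant", "IRF4_variant", "ASIP_variant", "BNC2_variant", "sex"]

def freckle_prediction_accuracy(df: pd.DataFrame) -> float:
    """Mean classification accuracy over repeated 10-fold cross-validation."""
    X, y = df[PREDICTORS], df["freckles"]          # 1 = freckled case, 0 = control
    model = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
    cv = RepeatedStratifiedKFold(n_splits=10, n_repeats=10, random_state=0)
    return cross_val_score(model, X, y, cv=cv, scoring="accuracy").mean()
```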
NASA Astrophysics Data System (ADS)
Jain, Prateek; Yadav, Chandan; Agarwal, Amit; Chauhan, Yogesh Singh
2017-08-01
We present a surface potential based analytical model for double gate tunnel field effect transistor (DGTFET) for the current, terminal charges, and terminal capacitances. The model accounts for the effect of the mobile charge in the channel and captures the device physics in depletion as well as in the strong inversion regime. The narrowing of the tunnel barrier in the presence of mobile charges in the channel is incorporated via modeling of the inverse decay length, which is constant under channel depletion condition and bias dependent under inversion condition. To capture the ambipolar current behavior in the model, tunneling at the drain junction is also included. The proposed model is validated against TCAD simulation data and it shows close match with the simulation data.
Shen, Xing-Rong; Chai, Jing; Feng, Rui; Liu, Tong-Zhu; Tong, Gui-Xian; Cheng, Jing; Li, Kai-Chun; Xie, Shao-Yu; Shi, Yong; Wang, De-Bin
2014-01-01
The big gap between efficacy of population level prevention and expectations due to heterogeneity and complexity of cancer etiologic factors calls for selective yet personalized interventions based on effective risk assessment. This paper documents our research protocol aimed at refining and validating a two-stage and web-based cancer risk assessment tool, from a tentative one in use by an ongoing project, capable of identifying individuals at elevated risk for one or more types of the 80% leading cancers in rural China with adequate sensitivity and specificity and featuring low cost, easy application and cultural and technical sensitivity for farmers and village doctors. The protocol adopted a modified population-based case control design using 72,000 non-patients as controls, 2,200 cancer patients as cases, and another 600 patients as cases for external validation. Factors taken into account comprised 8 domains including diet and nutrition, risk behaviors, family history, precancerous diseases, related medical procedures, exposure to environment hazards, mood and feelings, physical activities and anthropologic and biologic factors. Modeling stresses explored various methodologies like empirical analysis, logistic regression, neuro-network analysis, decision theory and both internal and external validation using concordance statistics, predictive values, etc.
The Space Weather Modeling Framework (SWMF): Models and Validation
NASA Astrophysics Data System (ADS)
Gombosi, Tamas; Toth, Gabor; Sokolov, Igor; de Zeeuw, Darren; van der Holst, Bart; Ridley, Aaron; Manchester, Ward, IV
In the last decade our group at the Center for Space Environment Modeling (CSEM) has developed the Space Weather Modeling Framework (SWMF) that efficiently couples together different models describing the interacting regions of the space environment. Many of these domain models (such as the global solar corona, the inner heliosphere or the global magnetosphere) are based on MHD and are represented by our multiphysics code, BATS-R-US. SWMF is a powerful tool for coupling regional models describing the space environment from the solar photosphere to the bottom of the ionosphere. Presently, SWMF contains over a dozen components: the solar corona (SC), eruptive event generator (EE), inner heliosphere (IH), outer heliosphere (OH), solar energetic particles (SE), global magnetosphere (GM), inner magnetosphere (IM), radiation belts (RB), plasmasphere (PS), ionospheric electrodynamics (IE), polar wind (PW), upper atmosphere (UA) and lower atmosphere (LA). This talk will present an overview of SWMF, new results obtained with improved physics as well as some validation studies.
A brief overview of compartmental modeling for intake of plutonium via wounds
Poudel, Deepesh; Klumpp, John Allan; Waters, Tom L.; ...
2017-06-07
Here, the aim of this study is to present several approaches that have been used to model the behavior of radioactive materials (specifically Pu) in contaminated wounds. We also review some attempts by the health physics community to validate and revise the National Council on Radiation Protection and Measurements (NCRP) 156 biokinetic model for wounds, and present some general recommendations based on the review. Modeling of intake via the wound pathway is complicated because of a large array of wound characteristics (e.g. solubility and chemistry of the material, type and depth of the tissue injury, anatomical location of injury). Moreover, because a majority of the documented wound cases in humans are medically treated (excised or treated with chelation), the data to develop biokinetic models for unperturbed wound exposures are limited. Since the NCRP wound model was largely developed from animal data, it is important to continue to validate and improve the model using human data whenever plausible.
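The compartmental approach being reviewed can be illustrated with a toy two-transfer model solved as a linear ODE system; the rate constants below are arbitrary placeholders, not NCRP 156 parameter values.

```python
import numpy as np
from scipy.integrate import solve_ivp

K_WOUND_TO_BLOOD = 0.05   # 1/day, illustrative only
K_BLOOD_TO_URINE = 0.20   # 1/day, illustrative only

def rates(t, y):
    # Linear first-order transfers: wound -> blood -> urine.
    wound, blood, urine = y
    return [-K_WOUND_TO_BLOOD * wound,
            K_WOUND_TO_BLOOD * wound - K_BLOOD_TO_URINE * blood,
            K_BLOOD_TO_URINE * blood]

# Unit deposit at the wound site at t = 0; integrate for 60 days.
sol = solve_ivp(rates, (0.0, 60.0), [1.0, 0.0, 0.0], dense_output=True)
t = np.linspace(0.0, 60.0, 61)
wound_retention = sol.sol(t)[0]        # fraction of the initial deposit still at the wound
daily_urine = np.diff(sol.sol(t)[2])   # simulated daily urinary excretion (bioassay-like)
```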
A compact physical model for the simulation of pNML-based architectures
NASA Astrophysics Data System (ADS)
Turvani, G.; Riente, F.; Plozner, E.; Schmitt-Landsiedel, D.; Breitkreutz-v. Gamm, S.
2017-05-01
Among emerging technologies, perpendicular Nanomagnetic Logic (pNML) seems to be very promising because of its capability of combining logic and memory on the same device, its scalability, 3D integration and low power consumption. Recently, Full Adder (FA) structures clocked by a global magnetic field have been experimentally demonstrated and detailed characterizations of the switching process governing the domain wall (DW) nucleation probability Pnuc and time tnuc have been performed. However, the design of pNML architectures represents a crucial point in the study of this technology; this can have a remarkable impact on the reliability of pNML structures. Here, we present a compact model developed in VHDL which enables the simulation of complex pNML architectures while taking into account critical physical parameters. Such parameters have been extracted from the experiments, fitted by the corresponding physical equations and encapsulated into the proposed model. Within this, magnetic structures are decomposed into a few basic elements (nucleation centers, nanowires, inverters etc.) represented by the corresponding physical description. To validate the model, we redesigned a FA and compared our simulation results to the experiment. With this compact model of pNML devices we have envisioned a new methodology which makes it possible to simulate and test the physical behavior of complex architectures with very low computational costs.
How to assess the impact of a physical parameterization in simulations of moist convection?
NASA Astrophysics Data System (ADS)
Grabowski, Wojciech
2017-04-01
A numerical model capable of simulating moist convection (e.g., a cloud-resolving model or large-eddy simulation model) consists of a fluid flow solver combined with required representations (i.e., parameterizations) of physical processes. The latter typically include cloud microphysics, radiative transfer, and unresolved turbulent transport. Traditional approaches to investigate impacts of such parameterizations on convective dynamics involve parallel simulations with different parameterization schemes or with different scheme parameters. Such methodologies are not reliable because of the natural variability of a cloud field that is affected by the feedback between the physics and dynamics. For instance, changing the cloud microphysics typically leads to a different realization of the cloud-scale flow, and separating dynamical and microphysical impacts is difficult. This presentation will introduce a novel modeling methodology, piggybacking, that allows the impact of a physical parameterization on cloud dynamics to be studied with confidence. The focus will be on the impact of cloud microphysics parameterization. Specific examples of the piggybacking approach will include simulations concerning the hypothesized deep convection invigoration in polluted environments, the validity of the saturation adjustment in modeling condensation in moist convection, and separation of physical impacts from statistical uncertainty in simulations applying particle-based Lagrangian microphysics, the super-droplet method.
Toro, Brigitte; Nester, Christopher J; Farren, Pauline C
2007-03-01
To develop the construct, content, and criterion validity of the Salford Gait Tool (SF-GT) and to evaluate agreement between gait observations using the SF-GT and kinematic gait data. Tool development and comparative evaluation. University in the United Kingdom. For designing construct and content validity, convenience samples of 10 children with hemiplegic, diplegic, and quadriplegic cerebral palsy (CP) and 152 physical therapy students and 4 physical therapists were recruited. For developing criterion validity, kinematic gait data of 13 gait clusters containing 56 children with hemiplegic, diplegic, and quadriplegic CP and 11 neurologically intact children was used. For clinical evaluation, a convenience sample of 23 pediatric physical therapists participated. We developed a sagittal plane observational gait assessment tool through a series of design, test, and redesign iterations. The tool's grading system was calibrated using kinematic gait data of 13 gait clusters and was evaluated by comparing the agreement of gait observations using the SF-GT with kinematic gait data. Criterion standard kinematic gait data. There was 58% mean agreement based on grading categories and 80% mean agreement based on degree estimations evaluated with the least significant difference method. The new SF-GT has good concurrent criterion validity.
NASA Technical Reports Server (NTRS)
Balanis, Constantine A.; Polka, Lesley A.; Polycarpou, Anastasis C.
1994-01-01
Formulations for scattering from the coated plate and the coated dihedral corner reflector are included. A coated plate model based upon the Uniform Theory of Diffraction (UTD) for impedance wedges was presented in the last report. In order to resolve inaccuracies and discontinuities in the predicted patterns using the UTD-based model, an improved model that uses more accurate diffraction coefficients is presented. A Physical Optics (PO) model for the coated dihedral corner reflector is presented as an intermediary step in developing a high-frequency model for this structure. The PO model is based upon the reflection coefficients for a metal-backed lossy material. Preliminary PO results for the dihedral corner reflector suggest that, in addition to being much faster computationally, this model may be more accurate than existing moment method (MM) models. An improved Physical Optics (PO)/Equivalent Currents model for modeling the Radar Cross Section (RCS) of both square and triangular, perfectly conducting, trihedral corner reflectors is presented. The new model uses the PO approximation at each reflection for the first- and second-order reflection terms. For the third-order reflection terms, a Geometrical Optics (GO) approximation is used for the first reflection; and PO approximations are used for the remaining reflections. The previously reported model used GO for all reflections except the terminating reflection. Using PO for most of the reflections results in a computationally slower model because many integrations must be performed numerically, but the advantage is that the predicted RCS using the new model is much more accurate. Comparisons between the two PO models, Finite-Difference Time-Domain (FDTD) and experimental data are presented for validation of the new model.
Artifact-based reflective interviews for identifying pragmatic epistemological resources
NASA Astrophysics Data System (ADS)
Shubert, Christopher Walden
Physics Education Research studies the science of teaching and learning physics. The process of student learning is complex, and the factors that affect it are numerous. Describing students' understanding of physics knowledge and reasoning is the basis for much productive research; however, such research fails to account for certain types of student learning difficulties. In this dissertation, I explore one source of student difficulty: personal epistemology, students' ideas about knowledge and knowing. Epistemology traditionally answers three questions: What is knowledge? How is knowledge created? And, how do we know what we know? An individual's responses to these questions can affect learning in terms of how they approach tasks involving the construction and application of knowledge. The key issue addressed in this dissertation is the effect of methodological choices on the validity and reliability of claims concerning personal epistemology. My central concern is contextual validity, how what is said about one's epistemology is not identical to how one behaves epistemologically. In response to these issues, I present here a new methodology for research on student epistemology: video artifact-based reflective interview protocols. These protocols begin with video taping students in their natural classroom activities, and then asking the participants epistemological questions immediately after watching selected scenes from their activity, contextually anchoring them in their actual learning experience. The data from these interviews is viewed in the framework of Epistemological Resource Theory, a framework of small bits of knowledge whose coordination in a given context is used to describe personal epistemology. I claim that the privileged data from these interviews allows detailed epistemological resources to be identified, and that these resources can provide greater insight into how student epistemologies are applied in learning activities. This research, situated within an algebra-based physics for life scientists course reform project, focuses on student work in Modeling Informed Instruction (MII) laboratory activities, which are an adaptation of Modeling Instruction. The development of these activities is based on the epistemological foundations of Modeling Instruction, and these foundations are used to describe a potential assessment for the epistemological effectiveness of a curriculum.
Physical modelling of LNG rollover in a depressurized container filled with water
NASA Astrophysics Data System (ADS)
Maksim, Dadonau; Denissenko, Petr; Hubert, Antoine; Dembele, Siaka; Wen, Jennifer
2015-11-01
Stable density stratification of multi-component Liquefied Natural Gas causes it to form distinct layers, with the upper layer having a higher fraction of the lighter components. Heat flux through the walls and base of the container results in buoyancy-driven convection accompanied by heat and mass transfer between the layers. The equilibration of densities of the top and bottom layers, normally caused by the preferential evaporation of Nitrogen, may induce an imbalance in the system and trigger a rapid mixing process, the so-called rollover. Numerical simulation of the rollover is complicated and the codes require validation. Physical modelling of the phenomenon has been performed in a water-filled depressurized vessel. Reducing the gas pressure in the container to levels comparable to the hydrostatic pressure in the water column allows modelling of industrial reservoirs tens of meters tall using a 20 cm laboratory setup. Additionally, it allows modelling of superheating of the base fluid layer at temperatures close to room temperature. Flow visualizations and parametric studies are presented. Results are related to the outcomes of numerical modelling.
Didarloo, Alireza; Shojaeizadeh, Davoud; Ardebili, Hassan Eftekhar; Niknami, Shamsaddin; Hajizadeh, Ebrahim; Alizadeh, Mohammad
2011-10-01
Findings of most studies indicate that the only way to control diabetes and prevent its debilitating effects is through the continuous performance of self-care behaviors. Physical activity is a non-pharmacological method of diabetes treatment and, because of its positive effects on diabetic patients, it is being increasingly considered by researchers and practitioners. This study aimed at determining factors influencing physical activity among diabetic women in Iran, using the extended theory of reasoned action. A sample of 352 women with type 2 diabetes, referring to a Diabetes Clinic in Khoy, Iran, participated in the study. Appropriate instruments were designed to measure the desired variables (knowledge of diabetes, personal beliefs, subjective norms, perceived self-efficacy, behavioral intention and physical activity behavior). The reliability and validity of the instruments were examined and confirmed. Statistical analyses were conducted by inferential statistical techniques (independent t-test, correlations and regressions) using the SPSS package. The findings of this investigation indicated that among the constructs of the model, self-efficacy was the strongest predictor of intentions among women with type 2 diabetes and both directly and indirectly affected physical activity. In addition to self-efficacy, diabetic patients' physical activity was also influenced by other variables of the model and sociodemographic factors. Our findings suggest that the high ability of the theory of reasoned action, extended by self-efficacy, in forecasting and explaining physical activity can serve as a basis for educational intervention. Educational interventions based on the proposed model are necessary for improving diabetics' physical activity behavior and controlling the disease.
NASA Astrophysics Data System (ADS)
Inam, Azhar; Adamowski, Jan; Prasher, Shiv; Halbe, Johannes; Malard, Julien; Albano, Raffaele
2017-08-01
Effective policies, leading to sustainable management solutions for land and water resources, require a full understanding of interactions between socio-economic and physical processes. However, the complex nature of these interactions, combined with limited stakeholder engagement, hinders the incorporation of socio-economic components into physical models. The present study addresses this challenge by integrating the physical Spatial Agro Hydro Salinity Model (SAHYSMOD) with a participatory group-built system dynamics model (GBSDM) that includes socio-economic factors. A stepwise process to quantify the GBSDM is presented, along with governing equations and model assumptions. Sub-modules of the GBSDM, describing agricultural, economic, water and farm management factors, are linked together with feedbacks and finally coupled with the physically based SAHYSMOD model through commonly used tools (i.e., MS Excel and a Python script). The overall integrated model (GBSDM-SAHYSMOD) can be used to help facilitate the role of stakeholders with limited expertise and resources in model and policy development and implementation. Following the development of the integrated model, a testing methodology was used to validate the structure and behavior of the integrated model. Model robustness under different operating conditions was also assessed. The model structure was able to produce anticipated real behaviours under the tested scenarios, from which it can be concluded that the formulated structures generate the right behaviour for the right reasons.
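The exchange pattern between the group-built system dynamics model and the physical model can be sketched as a simple yearly loop; every function and state variable below is a hypothetical placeholder for the GBSDM and SAHYSMOD interfaces, shown only to illustrate the coupling structure.

```python
def run_gbsdm_step(state_sd, state_phys):
    # socio-economic step: farmers cut irrigated area when root-zone salinity is high
    area = state_sd["area_ha"] * (0.9 if state_phys["salinity_dS_m"] > 4.0 else 1.0)
    return {"area_ha": area, "irrigation_mm": 600.0}

def run_sahysmod_season(decisions, state_phys):
    # physical step: placeholder water/salt balance standing in for SAHYSMOD
    salinity = state_phys["salinity_dS_m"] + 0.005 * decisions["area_ha"]
    water_table = state_phys["water_table_m"] - 0.05
    return {"salinity_dS_m": salinity, "water_table_m": water_table}

def run_coupled(years=10):
    state_sd = {"area_ha": 100.0}
    state_phys = {"salinity_dS_m": 3.5, "water_table_m": 2.0}
    results = []
    for year in range(years):
        decisions = run_gbsdm_step(state_sd, state_phys)          # GBSDM -> decisions
        state_phys = run_sahysmod_season(decisions, state_phys)   # SAHYSMOD -> new state
        state_sd = {"area_ha": decisions["area_ha"]}              # feedback into the SD model
        results.append((year, decisions["area_ha"], state_phys["salinity_dS_m"]))
    return results
```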
ERIC Educational Resources Information Center
Lodewyk, Ken R.; Mandigo, James L.
2017-01-01
Physical and Health Education Canada has developed and implemented a formative, criterion-referenced, and practitioner-based national (Canadian) online educational assessment and support resource called Passport for Life (PFL). It was developed to support the awareness and advancement of physical literacy among PE students and teachers. PFL…
USDA-ARS's Scientific Manuscript database
The Food Intake and Physical Activity of School Children (CAAFE) comprises an online questionnaire to self-report diet and physical activity of Brazilian schoolchildren. The present study aimed to assess the validity (matches, omissions, and intrusions) and moderating factors of the CAAFE. Direct ob...
Questioning the Validity of Inquiry Assessment in a High Stakes Physical Sciences Examination
ERIC Educational Resources Information Center
Ramnarain, Umesh
2014-01-01
The South African science curriculum advocates an inquiry-based approach to practical work. Inquiry is a complex and multifaceted activity involving both cognitive and physical activity; thus, paper-and-pencil items do not provide the authentic context for this assessment. This study investigates the construct validity of inquiry-related questions…
Chen, Dongmei; Zhu, Shouping; Cao, Xu; Zhao, Fengjun; Liang, Jimin
2015-01-01
X-ray luminescence computed tomography (XLCT) has become a promising imaging technology for biological applications based on phosphor nanoparticles. There are mainly three kinds of XLCT imaging systems: pencil beam XLCT, narrow beam XLCT and cone beam XLCT. Narrow beam XLCT can be regarded as a balance between the pencil beam mode and the cone-beam mode in terms of imaging efficiency and image quality. The collimated X-ray beams are assumed to be parallel ones in traditional narrow beam XLCT. However, we observed that the cone beam X-rays are collimated into X-ray beams with fan-shaped broadening instead of parallel ones in our prototype narrow beam XLCT. Hence, we incorporated the fan-shaped distribution of the X-ray beams into the physical model and collected the optical data from only two perpendicular directions to further shorten the scanning time. We also proposed a depth-related adaptive regularized split Bregman (DARSB) method for reconstruction. The simulation experiments show that the proposed physical model and method achieve better results in location error, Dice coefficient, mean square error and intensity error than the traditional split Bregman method and validate the feasibility of the method. The phantom experiment achieved a location error of less than 1.1 mm and validated that incorporating fan-shaped X-ray beams in our model achieves better results than assuming parallel X-rays. PMID:26203388
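For context, the family of solvers the DARSB method builds on can be illustrated with a generic split Bregman iteration for an L1-regularised linear inverse problem; the depth-adaptive weighting of the actual method is omitted, so this is a sketch of the underlying technique only.

```python
import numpy as np

def shrink(v, t):
    """Soft-thresholding operator used in the Bregman d-update."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def split_bregman_l1(A, b, lam=1e-2, mu=1.0, n_iter=200):
    """Minimise ||A x - b||_2^2 + lam * ||x||_1 by split Bregman iteration."""
    n = A.shape[1]
    x = np.zeros(n); d = np.zeros(n); bb = np.zeros(n)
    lhs = 2.0 * A.T @ A + mu * np.eye(n)
    Atb2 = 2.0 * A.T @ b
    for _ in range(n_iter):
        x = np.linalg.solve(lhs, Atb2 + mu * (d - bb))   # quadratic x-subproblem
        d = shrink(x + bb, lam / mu)                     # L1 subproblem in closed form
        bb = bb + x - d                                  # Bregman variable update
    return x

# example: recover a sparse source from a random projection
rng = np.random.default_rng(0)
A = rng.normal(size=(60, 100))
x_true = np.zeros(100); x_true[[5, 37, 80]] = [1.0, -2.0, 0.5]
x_rec = split_bregman_l1(A, A @ x_true, lam=0.05, mu=1.0)
```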
de Castro, Alberto; Rosales, Patricia; Marcos, Susana
2007-03-01
To measure tilt and decentration of intraocular lenses (IOLs) with Scheimpflug and Purkinje imaging systems in physical model eyes with known amounts of tilt and decentration and patients. Instituto de Optica Daza de Valdés, Consejo Superior de Investigaciones Científicas, Madrid, Spain. Measurements of IOL tilt and decentration were obtained using a commercial Scheimpflug system (Pentacam, Oculus), custom algorithms, and a custom-built Purkinje imaging apparatus. Twenty-five Scheimpflug images of the anterior segment of the eye were obtained at different meridians. Custom algorithms were used to process the images (correction of geometrical distortion, edge detection, and curve fittings). Intraocular lens tilt and decentration were estimated by fitting sinusoidal functions to the projections of the pupillary axis and IOL axis in each image. The Purkinje imaging system captures pupil images showing reflections of light from the anterior corneal surface and anterior and posterior lens surfaces. Custom algorithms were used to detect the Purkinje image locations and estimate IOL tilt and decentration based on a linear system equation and computer eye models with individual biometry. Both methods were validated with a physical model eye in which IOL tilt and decentration can be set nominally. Twenty-one eyes of 12 patients with IOLs were measured with both systems. Measurements of the physical model eye showed an absolute discrepancy between nominal and measured values of 0.279 degree (Purkinje) and 0.243 degree (Scheimpflug) for tilt and 0.094 mm (Purkinje) and 0.228 mm (Scheimpflug) for decentration. In patients, the mean tilt was less than 2.6 degrees and the mean decentration less than 0.4 mm. Both techniques showed mirror symmetry between right eyes and left eyes for tilt around the vertical axis and for decentration in the horizontal axis. Both systems showed high reproducibility. Validation experiments on physical model eyes showed slightly higher accuracy with the Purkinje method than the Scheimpflug imaging method. Horizontal measurements of patients with both techniques were highly correlated. The IOLs tended to be tilted and decentered nasally in most patients.
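The per-meridian sinusoidal fit mentioned above can be posed as a linear least-squares problem; the sketch below is an illustration with assumed variable names, not the authors' processing code.

```python
import numpy as np

def fit_axis_projection(meridian_deg, projection):
    """Fit y(theta) = A*sin(theta) + B*cos(theta) + C to per-meridian axis projections."""
    th = np.radians(np.asarray(meridian_deg, dtype=float))
    y = np.asarray(projection, dtype=float)
    X = np.column_stack([np.sin(th), np.cos(th), np.ones_like(th)])
    (A, B, C), *_ = np.linalg.lstsq(X, y, rcond=None)
    amplitude = np.hypot(A, B)                 # size of the projected tilt component
    phase_deg = np.degrees(np.arctan2(B, A))   # phase offset of the fitted sinusoid
    return amplitude, phase_deg, C             # C: constant offset of the projection
```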
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gokaltun, Seckin; Munroe, Norman; Subramaniam, Shankar
2014-12-31
This study presents a new drag model, based on cohesive inter-particle forces, implemented in the MFIX code. This new drag model combines an existing standard model in MFIX with a particle-based drag model based on a switching principle. Switches between the models in the computational domain occur where a strong particle-to-particle cohesion potential is detected. Three versions of the new model were obtained by using one standard drag model in each version. Later, the performance of each version was compared against available experimental data for a fluidized bed, published in the literature and used extensively by other researchers for validation purposes. In our analysis of the results, we first observed that the standard models used in this research were incapable of producing closely matching results. We then showed for a simple case that a threshold needs to be set on the solid volume fraction. This modification was applied to avoid non-physical results in the clustering predictions when the governing equation of the solid granular temperature was solved. Later, we used our hybrid technique and observed the capability of our approach to improve the numerical results significantly; however, the improvement of the results depended on the threshold of the cohesive index used in the switching procedure. Our results showed that small values of the threshold for the cohesive index could result in a significant reduction of the computational error for all versions of the proposed drag model. In addition, we redesigned an existing circulating fluidized bed (CFB) test facility in order to create validation cases for the clustering regime of Geldart A type particles.
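The switching idea can be sketched as follows; both drag correlations here are trivial stand-ins (not the actual MFIX models), and the threshold values are arbitrary illustrations of the tunable parameters discussed above.

```python
import numpy as np

EPS_S_MAX = 0.55           # cap on solids volume fraction to avoid non-physical clustering, illustrative
COHESION_THRESHOLD = 0.1   # switching threshold on the cohesive index, illustrative

def standard_drag(eps_s, re_p):
    # placeholder stand-in for a standard MFIX drag correlation
    return (1.0 - eps_s) * (1.0 + 0.15 * re_p**0.687)

def cohesive_drag(eps_s, re_p):
    # placeholder stand-in that reduces drag where particles agglomerate
    return 0.5 * standard_drag(eps_s, re_p)

def hybrid_drag(eps_s, re_p, cohesive_index):
    """Cell-wise switch between the two drag laws based on the cohesive index."""
    eps_s = np.minimum(eps_s, EPS_S_MAX)
    return np.where(cohesive_index > COHESION_THRESHOLD,
                    cohesive_drag(eps_s, re_p),
                    standard_drag(eps_s, re_p))
```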
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hills, Richard G.; Maniaci, David Charles; Naughton, Jonathan W.
2015-09-01
A Verification and Validation (V&V) framework is presented for the development and execution of coordinated modeling and experimental programs to assess the predictive capability of computational models of complex systems through focused, well structured, and formal processes. The elements of the framework are based on established V&V methodology developed by various organizations including the Department of Energy, National Aeronautics and Space Administration, the American Institute of Aeronautics and Astronautics, and the American Society of Mechanical Engineers. Four main topics are addressed: 1) program planning based on expert elicitation of the modeling physics requirements, 2) experimental design for model assessment, 3) uncertainty quantification for experimental observations and computational model simulations, and 4) assessment of the model predictive capability. The audience for this document includes program planners, modelers, experimentalists, V&V specialists, and customers of the modeling results.
An efficient surrogate-based simulation-optimization method for calibrating a regional MODFLOW model
NASA Astrophysics Data System (ADS)
Chen, Mingjie; Izady, Azizallah; Abdalla, Osman A.
2017-01-01
Simulation-optimization methods entail a large number of model simulations, which is computationally intensive or even prohibitive if each model simulation is extremely time-consuming. Statistical models have been examined as surrogates of the high-fidelity physical model during the simulation-optimization process to tackle this problem. Among them, Multivariate Adaptive Regression Splines (MARS), a non-parametric adaptive regression method, is superior in overcoming problems of high dimensions and discontinuities of the data. Furthermore, the stability and accuracy of the MARS model can be improved by bootstrap aggregating methods, namely, bagging. In this paper, the Bagging MARS (BMARS) method is integrated into a surrogate-based simulation-optimization framework to calibrate a three-dimensional MODFLOW model, which is developed to simulate the groundwater flow in an arid hardrock-alluvium region in northwestern Oman. The physical MODFLOW model is surrogated by the statistical model developed using the BMARS algorithm. The surrogate model, which is fitted and validated using a training dataset generated by the physical model, can approximate solutions rapidly. An efficient Sobol' method is employed to calculate global sensitivities of head outputs to input parameters, which are used to analyze their importance for the model outputs spatiotemporally. Only sensitive parameters are included in the calibration process to further improve the computational efficiency. The normalized root mean square error (NRMSE) between measured and simulated heads at observation wells is used as the objective function to be minimized during optimization. The reasonable history match between the simulated and observed heads demonstrated the feasibility of this highly efficient calibration framework.
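A hedged sketch of the surrogate-based calibration loop, under stated assumptions: a bagged tree ensemble stands in for BMARS (a MARS implementation such as py-earth could be swapped in as the base learner), `modflow_stub` is a cheap analytic placeholder for the MODFLOW model, and the "observed" heads are synthetic.

```python
import numpy as np
from sklearn.ensemble import BaggingRegressor
from sklearn.tree import DecisionTreeRegressor
from scipy.optimize import differential_evolution

rng = np.random.default_rng(1)
n_wells = 4

def modflow_stub(p):
    # Placeholder for one (expensive) MODFLOW run returning heads at 4 wells.
    return np.array([2.0 * p[0] - p[1] ** 2 + 0.3 * w * p[2] for w in range(n_wells)])

params = rng.uniform(0.0, 1.0, size=(200, 3))              # sampled sensitive parameters
heads = np.array([modflow_stub(p) for p in params])        # training runs

# One bagged surrogate per observation well (stand-in for BMARS).
surrogates = [BaggingRegressor(DecisionTreeRegressor(max_depth=6),
                               n_estimators=50, random_state=w).fit(params, heads[:, w])
              for w in range(n_wells)]

head_obs = modflow_stub(np.array([0.6, 0.3, 0.8]))         # synthetic "measurements"

def nrmse(theta):
    # Objective: normalized RMSE between surrogate heads and observed heads.
    pred = np.array([s.predict(np.atleast_2d(theta))[0] for s in surrogates])
    return np.sqrt(np.mean((pred - head_obs) ** 2)) / (head_obs.max() - head_obs.min())

result = differential_evolution(nrmse, bounds=[(0.0, 1.0)] * 3, seed=0, maxiter=40)
print("calibrated parameters:", np.round(result.x, 3))
```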
NASA Astrophysics Data System (ADS)
Park, E.; Jeong, J.
2017-12-01
Precise estimation of groundwater fluctuation is studied by considering delayed recharge flux (DRF) and unsaturated zone drainage (UZD). Both DRF and UZD arise from gravitational flow impeded in the unsaturated zone, which may non-negligibly affect groundwater level changes. For validation, a previous model that does not consider unsaturated flow is used as a benchmark, with the observed groundwater level and precipitation data divided into three periods based on climatic conditions. The estimation capability of the new model is superior to that of the benchmark model, as indicated by a significantly improved representation of the groundwater level with physically interpretable model parameters.
Vanwolleghem, Griet; Van Dyck, Delfien; Ducheyne, Fabian; De Bourdeaudhuij, Ilse; Cardon, Greet
2014-06-10
Google Street View provides a valuable and efficient alternative to observe the physical environment compared to on-site fieldwork. However, studies on the use, reliability and validity of Google Street View in a cycling-to-school context are lacking. We aimed to study the intra- and inter-rater reliability and criterion validity of EGA-Cycling (Environmental Google Street View Based Audit - Cycling to school), a newly developed audit using Google Street View to assess the physical environment along cycling routes to school. Parents (n = 52) of 11- to 12-year-old Flemish children, who mostly cycled to school, completed a questionnaire and identified their child's cycling route to school on a street map. Fifty cycling routes of 11- to 12-year-olds were identified and physical environmental characteristics along the identified routes were rated with EGA-Cycling (5 subscales; 37 items), based on Google Street View. To assess reliability, two researchers performed the audit. Criterion validity of the audit was examined by comparing the ratings based on Google Street View with ratings through on-site assessments. Intra-rater reliability was high (kappa range 0.47-1.00). Large variations in the inter-rater reliability (kappa range -0.03-1.00) and criterion validity scores (kappa range -0.06-1.00) were reported, with acceptable inter-rater reliability values for 43% of all items and acceptable criterion validity for 54% of all items. EGA-Cycling can be used to assess physical environmental characteristics along cycling routes to school. However, to assess the micro-environment specifically related to cycling, on-site assessments have to be added.
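A minimal sketch of the kappa statistic used above to summarize inter-rater reliability and criterion validity; the two binary rating vectors (Street View audit versus on-site audit for one item) are hypothetical.

```python
from sklearn.metrics import cohen_kappa_score

street_view = [1, 0, 1, 1, 0, 1, 0, 1, 1, 0]   # ratings from the Google Street View audit
on_site     = [1, 0, 1, 0, 0, 1, 0, 1, 1, 1]   # ratings from the on-site assessment

kappa = cohen_kappa_score(street_view, on_site)
print(f"kappa = {kappa:.2f}")  # agreement beyond chance for this item
```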
NASA Astrophysics Data System (ADS)
Sinha, Neeraj; Zambon, Andrea; Ott, James; Demagistris, Michael
2015-06-01
Driven by the continuing rapid advances in high-performance computing, multi-dimensional high-fidelity modeling is an increasingly reliable predictive tool capable of providing valuable physical insight into complex post-detonation reacting flow fields. Utilizing a series of test cases featuring blast waves interacting with combustible dispersed clouds in a small-scale test setup under well-controlled conditions, the predictive capabilities of a state-of-the-art code are demonstrated and validated. Leveraging physics-based, first-principles models and solving large systems of equations on highly resolved grids, the combined effects of finite-rate/multi-phase chemical processes (including thermal ignition), turbulent mixing and shock interactions are captured across the spectrum of relevant time scales and length scales. Since many scales of motion are generated in a post-detonation environment, even if the initial ambient conditions are quiescent, turbulent mixing plays a major role in the fireball afterburning as well as in dispersion, mixing, ignition and burn-out of combustible clouds in its vicinity. Validating these capabilities at the small scale is critical to establish a reliable predictive tool applicable to more complex and large-scale geometries of practical interest.
Detection, discrimination, and real-time tracking of cracks in rotating disks
NASA Astrophysics Data System (ADS)
Haase, Wayne C.; Drumm, Michael J.
2002-06-01
The purpose of this effort was to develop a system to detect, discriminate and track fatigue cracks in rotating disks. Aimed primarily at jet engines in flight applications, the system also has value for detecting cracks in a spin pit during low cycle fatigue testing, and for monitoring the health of steam turbines and land-based gas turbine engines for maintenance purposes. The results of this effort produced: a physics-based model that describes the change in the center of mass of a rotating disk using damping ratio, initial unbalance and crack size as parameters; the development of a data acquisition and analysis system that can detect and discriminate a crack using a single cycle of data; and initial validation of the model through testing in a spin pit. The development of the physics-based model also pointed to the most likely regimes for crack detection; identified specific powers of ω to search for in specific regimes; dictated a particular type of data acquisition for crack discrimination; and demonstrated a need for a higher signal-to-noise ratio in the measurement of the basic vibration signal.
NASA Astrophysics Data System (ADS)
Buzulukova, Natalia; Fok, Mei-Ching; Glocer, Alex; Moore, Thomas E.
2013-04-01
We report studies of the storm time ring current and its influence on the radiation belts, plasmasphere and global magnetospheric dynamics. The near-Earth space environment is described by multiscale physics that reflects a variety of processes and conditions that occur in magnetospheric plasma. For a successful description of such a plasma, a complex solution is needed which allows multiple physics domains to be described using multiple physical models. A key population of the inner magnetosphere is the ring current plasma. Ring current dynamics affects magnetic and electric fields in the entire magnetosphere, the distribution of cold ionospheric plasma (plasmasphere), and radiation belt particles. To study the electrodynamics of the inner magnetosphere, we present an MHD model (BATSRUS code) coupled with an ionospheric electric-field solver and with a ring current-radiation belt model (CIMI code). The model will be used as a tool to reveal details of the coupling between different regions of the Earth's magnetosphere. Model validation will also be presented, based on comparisons with data from the THEMIS, POLAR, GOES, and TWINS missions.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Aly, A.; Avramova, Maria; Ivanov, Kostadin
To correctly describe and predict this hydrogen distribution, there is a need for multi-physics coupling to provide accurate three-dimensional azimuthal, radial, and axial temperature distributions in the cladding. High-fidelity reactor-physics codes coupled with a sub-channel code as well as with a computational fluid dynamics (CFD) tool have been used to calculate detailed temperature distributions. These high-fidelity coupled neutronics/thermal-hydraulics code systems are coupled further with the fuel-performance BISON code, which includes a kernel (module) for hydrogen. Both hydrogen migration and precipitation/dissolution are included in the model. Results from this multi-physics analysis are validated using calculations of hydrogen distribution from models informed by data from hydrogen experiments and PIE data.
Buck, Christoph; Kneib, Thomas; Tkaczick, Tobias; Konstabel, Kenn; Pigeot, Iris
2015-12-22
Built environment studies provide broad evidence that urban characteristics influence physical activity (PA). However, findings are still difficult to compare, due to inconsistent measures assessing urban point characteristics and varying definitions of spatial scale. Both were found to influence the strength of the association between the built environment and PA. We simultaneously evaluated the effect of kernel approaches and network-distances to investigate the association between urban characteristics and physical activity depending on spatial scale and intensity measure. We assessed urban measures of point characteristics such as intersections, public transit stations, and public open spaces in ego-centered network-dependent neighborhoods based on geographical data of one German study region of the IDEFICS study. We calculated point intensities using the simple intensity and kernel approaches based on fixed bandwidths, cross-validated bandwidths including isotropic and anisotropic kernel functions and considering adaptive bandwidths that adjust for residential density. We distinguished six network-distances from 500 m up to 2 km to calculate each intensity measure. A log-gamma regression model was used to investigate the effect of each urban measure on moderate-to-vigorous physical activity (MVPA) of 400 2- to 9.9-year old children who participated in the IDEFICS study. Models were stratified by sex and age groups, i.e. pre-school children (2 to <6 years) and school children (6-9.9 years), and were adjusted for age, body mass index (BMI), education and safety concerns of parents, season and valid weartime of accelerometers. Association between intensity measures and MVPA strongly differed by network-distance, with stronger effects found for larger network-distances. Simple intensity revealed smaller effect estimates and smaller goodness-of-fit compared to kernel approaches. Smallest variation in effect estimates over network-distances was found for kernel intensity measures based on isotropic and anisotropic cross-validated bandwidth selection. We found a strong variation in the association between the built environment and PA of children based on the choice of intensity measure and network-distance. Kernel intensity measures provided stable results over various scales and improved the assessment compared to the simple intensity measure. Considering different spatial scales and kernel intensity methods might reduce methodological limitations in assessing opportunities for PA in the built environment.
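A simplified sketch of the two intensity measures compared above, with Euclidean distance standing in for the street-network distances used in the study: the simple intensity counts features within a given scale, while the kernel intensity weights each feature with a Gaussian kernel of the chosen bandwidth. All coordinates are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(2)
home = np.array([0.0, 0.0])                         # child's residence
features = rng.uniform(-2000, 2000, size=(300, 2))  # e.g. intersections or transit stops [m]
scale = 1000.0                                      # neighborhood scale ("network" distance) [m]
bandwidth = 500.0                                   # kernel bandwidth [m]

d = np.linalg.norm(features - home, axis=1)

simple_intensity = np.sum(d <= scale) / (np.pi * scale ** 2)                      # counts per m^2
kernel_intensity = np.sum(np.exp(-0.5 * (d / bandwidth) ** 2)) / (2 * np.pi * bandwidth ** 2)

print(f"simple: {simple_intensity:.2e} per m^2, kernel: {kernel_intensity:.2e} per m^2")
```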
NASA Technical Reports Server (NTRS)
Guenther, D. B.
1994-01-01
The nonadiabatic frequencies of a standard solar model and a solar model that includes helium diffusion are discussed. The nonadiabatic pulsation calculation includes physics that describes the losses and gains due to radiation. Radiative gains and losses are modeled in both the diffusion approximation, which is only valid in optically thick regions, and the Eddington approximation, which is valid in both optically thin and thick regions. The calculated pulsation frequencies for modes with l less than or equal to 1320 are compared to the observed spectrum of the Sun. Compared to a strictly adiabatic calculation, the nonadiabatic calculation of p-mode frequencies improves the agreement between model and observation. When helium diffusion is included in the model the frequencies of the modes that are sensitive to regions near the base of the convection zone are improved (i.e., brought into closer agreement with observation), but the agreement is made worse for other modes. Cyclic variations in the frequency spacings of the Sun as a function of frequency of n are presented as evidence for a discontinuity in the structure of the Sun, possibly located near the base of the convection zone.
System identification and the modeling of sailing yachts
NASA Astrophysics Data System (ADS)
Legursky, Katrina
This research represents an exploration of sailing yacht dynamics with full-scale sailing motion data, physics-based models, and system identification techniques. The goal is to provide a method of obtaining and validating suitable physics-based dynamics models for use in control system design on autonomous sailing platforms, which have the capacity to serve as mobile, long range, high endurance autonomous ocean sensing platforms. The primary contributions of this study to the state-of-the-art are the formulation of a five degree-of-freedom (DOF) linear multi-input multi-output (MIMO) state space model of sailing yacht dynamics, the process for identification of this model from full-scale data, a description of the maneuvers performed during on-water tests, and an analysis method to validate estimated models. The techniques and results described herein can be directly applied to and tested on existing autonomous sailing platforms. A full-scale experiment on a 23ft monohull sailing yacht is developed to collect motion data for physics-based model identification. Measurements include 3 axes of accelerations, velocities, angular rates, and attitude angles in addition to apparent wind speed and direction. The sailing yacht herein is treated as a dynamic system with two control inputs, the rudder angle, deltaR, and the mainsail angle, delta B, which are also measured. Over 20 hours of full scale sailing motion data is collected, representing three sail configurations corresponding to a range of wind speeds: the Full Main and Genoa (abbrev. Genoa) for lower wind speeds, the Full Main and Jib (abbrev. Jib) for mid-range wind speeds, and the Reefed Main and Jib (abbrev. Reef) for the highest wind speeds. The data also covers true wind angles from upwind through a beam reach. A physics-based non-linear model to describe sailing yacht motion is outlined, including descriptions of methods to model the aerodynamics and hydrodynamics of a sailing yacht in surge, sway, roll, and yaw. Existing aerodynamic models for sailing yachts are unsuitable for control system design as they do not include a physical description of the sails' dynamic effect on the system. A new aerodynamic model is developed and validated using the full-scale sailing data which includes sail deflection as a control input to the system. The Maximum Likelihood Estimation (MLE) algorithm is used with non-linear simulation data to successfully estimate a set of hydrodynamic derivatives for a sailing yacht. It is shown that all sailing yacht models will contain a second order mode (referred to herein as Mode 1A.S or 4B.S) which is dependent upon trimmed roll angle. For the test yacht it is concluded that for this mode when the trimmed roll angle is, roll rate and roll angle are the dominant motion variables, and for surge velocity and yaw rate dominate. This second order mode is dynamically stable for . It transitions from stability in the higher values of to instability in the region defined by. These conclusions align with other work which has also found roll angle to be a driving factor in the dynamic behavior of a tall-ship (Johnson, Miles, Lasher, & Womack, 2009). It is also shown that all linear models also contain a first order mode, (referred to herein as Mode 3A.F or 1B.F), which lies very close to the origin of the complex plane indicating a long time constant. Measured models have indicated this mode can be stable or unstable. 
The eigenvector analysis reveals that the mode is stable if the surge contribution is < 40% and the sway contribution is > 20%. The small set of maneuvers necessary for model identification, quick OSLS estimation method, and detailed modal analysis of estimated models outlined in this work are immediately applicable to existing autonomous mono-hull sailing yachts, and could readily be adapted for use with other wind-powered vessel configurations such as wing-sails, catamarans, and tri-marans. (Abstract shortened by UMI.)
NASA Astrophysics Data System (ADS)
Battistini, Alessandro; Rosi, Ascanio; Segoni, Samuele; Catani, Filippo; Casagli, Nicola
2017-04-01
Landslide inventories are basic data for large scale landslide modelling, e.g. they are needed to calibrate and validate rainfall thresholds, physically based models and early warning systems. The setting up of landslide inventories with traditional methods (e.g. remote sensing, field surveys and manual retrieval of data from technical reports and local newspapers) is time consuming. The objective of this work is to automatically set up a landslide inventory using a state-of-the-art semantic engine based on data mining of online news (Battistini et al., 2013) and to evaluate if the automatically generated inventory can be used to validate a regional scale landslide warning system based on rainfall thresholds. The semantic engine scanned internet news in real time over a 50-month test period. At the end of the process, an inventory of approximately 900 landslides was set up for the Tuscany region (23,000 km2, Italy). The inventory was compared with the outputs of the regional landslide early warning system based on rainfall thresholds, and a good correspondence was found: e.g. 84% of the events reported in the news are correctly identified by the model. In addition, the cases of non-correspondence were forwarded to the rainfall threshold developers, who used these inputs to update some of the thresholds. On the basis of the results obtained, we conclude that automatic validation of landslide models using feedback from geolocalized landslide events is possible. The source of data for validation can be obtained directly from the internet channel using an appropriate semantic engine. We also automated the validation procedure, which is based on a comparison between forecasts and reported events. We verified that our approach can be automatically used for a near real time validation of the warning system and for a semi-automatic update of the rainfall thresholds, which could lead to an improvement of the forecasting effectiveness of the warning system. In the near future, the proposed procedure could operate in continuous time and could allow for a periodic update of landslide hazard models and landslide early warning systems with minimum human intervention. References: Battistini, A., Segoni, S., Manzo, G., Catani, F., Casagli, N. (2013). Web data mining for automatic inventory of geohazards at national scale. Applied Geography, 43, 147-158.
Flood loss modelling with FLF-IT: a new flood loss function for Italian residential structures
NASA Astrophysics Data System (ADS)
Hasanzadeh Nafari, Roozbeh; Amadio, Mattia; Ngo, Tuan; Mysiak, Jaroslav
2017-07-01
The damage triggered by different flood events costs the Italian economy millions of euros each year. This cost is likely to increase in the future due to climate variability and economic development. In order to avoid or reduce such significant financial losses, risk management requires tools which can provide a reliable estimate of potential flood impacts across the country. Flood loss functions are an internationally accepted method for estimating physical flood damage in urban areas. In this study, we derived a new flood loss function for Italian residential structures (FLF-IT), on the basis of empirical damage data collected from a recent flood event in the region of Emilia-Romagna. The function was developed based on a new Australian approach (FLFA), which represents the confidence limits that exist around the parameterized functional depth-damage relationship. After model calibration, the performance of the model was validated for the prediction of loss ratios and absolute damage values. It was also contrasted with an uncalibrated relative model that is frequently used in Europe. In this regard, a three-fold cross-validation procedure was carried out over the empirical sample to measure the range of uncertainty from the actual damage data. The predictive capability has also been studied for some sub-classes of water depth. The validation procedure shows that the newly derived function performs well (no bias and only 10% mean absolute error), especially when the water depth is high. Results of these validation tests illustrate the importance of model calibration. The advantages of the FLF-IT model over other Italian models include calibration with empirical data, consideration of the epistemic uncertainty of data, and the ability to change parameters based on building practices across Italy.
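A hedged sketch of the three-fold cross-validation idea: a simple square-root depth-damage curve (not the FLF-IT parameterization) is refitted on each training fold and scored on the held-out fold with mean absolute error and bias; the (depth, loss ratio) records are synthetic.

```python
import numpy as np
from scipy.optimize import curve_fit
from sklearn.model_selection import KFold

def loss_ratio(depth_m, a, b):
    # Illustrative depth-damage curve, bounded to the [0, 1] loss-ratio range.
    return np.clip(a * np.sqrt(depth_m) + b, 0.0, 1.0)

rng = np.random.default_rng(3)
depth = rng.uniform(0.1, 3.0, 150)
observed = np.clip(0.3 * np.sqrt(depth) + 0.05 + rng.normal(0, 0.05, 150), 0.0, 1.0)

mae, bias = [], []
for train, test in KFold(n_splits=3, shuffle=True, random_state=0).split(depth):
    popt, _ = curve_fit(loss_ratio, depth[train], observed[train], p0=[0.2, 0.0])
    resid = loss_ratio(depth[test], *popt) - observed[test]
    mae.append(np.mean(np.abs(resid)))
    bias.append(np.mean(resid))

print(f"3-fold MAE = {np.mean(mae):.3f}, bias = {np.mean(bias):.3f}")
```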
Prognostics of Power Electronics, Methods and Validation Experiments
NASA Technical Reports Server (NTRS)
Kulkarni, Chetan S.; Celaya, Jose R.; Biswas, Gautam; Goebel, Kai
2012-01-01
Failure of electronic devices is a concern for future electric aircraft that will see an increase of electronics to drive and control safety-critical equipment throughout the aircraft. As a result, investigation of precursors to failure in electronics and prediction of remaining life of electronic components is of key importance. DC-DC power converters are power electronics systems employed typically as sourcing elements for avionics equipment. Current research efforts in prognostics for these power systems focus on the identification of failure mechanisms and the development of accelerated aging methodologies and systems to accelerate the aging process of test devices, while continuously measuring key electrical and thermal parameters. Preliminary model-based prognostics algorithms have been developed making use of empirical degradation models and physics-inspired degradation models, with a focus on key components like electrolytic capacitors and power MOSFETs (metal-oxide-semiconductor field-effect transistors). This paper presents current results on the development of validation methods for prognostics algorithms of power electrolytic capacitors, particularly the use of accelerated aging systems for algorithm validation. Validation of prognostics algorithms presents difficulties in practice due to the lack of run-to-failure experiments in deployed systems. By using accelerated experiments, we circumvent this problem in order to define initial validation activities.
Standards for Environmental Measurement Using GIS: Toward a Protocol for Protocols.
Forsyth, Ann; Schmitz, Kathryn H; Oakes, Michael; Zimmerman, Jason; Koepp, Joel
2006-02-01
Interdisciplinary research regarding how the built environment influences physical activity has recently increased. Many research projects conducted jointly by public health and environmental design professionals are using geographic information systems (GIS) to objectively measure the built environment. Numerous methodological issues remain, however, and environmental measurements have not been well documented with accepted, common definitions of valid, reliable variables. This paper proposes how to create and document standardized definitions for measures of environmental variables using GIS with the ultimate goal of developing reliable, valid measures. Inherent problems with software and data that hamper environmental measurement can be offset by protocols combining clear conceptual bases with detailed measurement instructions. Examples demonstrate how protocols can more clearly translate concepts into specific measurement. This paper provides a model for developing protocols to allow high quality comparative research on relationships between the environment and physical activity and other outcomes of public health interest.
NASA Astrophysics Data System (ADS)
GABA, C. O. U.; Alamou, E.; Afouda, A.; Diekkrüger, B.
2016-12-01
Assessing water resources is still an important challenge, especially in the context of climatic changes. Although numerous hydrological models exist, new approaches are still under investigation. In this context, we investigate a new modelling approach based on the Physics Principle of Least Action, which was first applied to the Bétérou catchment in Benin and gave very good results. The study presents new hypotheses to go further in the model development with a view to widening its application. The improved version of the model MODHYPMA was applied to sixteen (16) subcatchments in Bénin, West Africa. Its performance was compared to two well-known lumped conceptual models, the GR4J and HBV models. The model was successfully calibrated and validated and showed a good performance in most catchments. The analysis revealed that the three models have similar performance and timing errors. In contrast to the other models, however, MODHYPMA shows a smaller loss of performance from calibration to validation. In order to evaluate the usefulness of our model for the prediction of runoff in ungauged basins, model parameters were estimated from the physical catchment characteristics. We relied on statistical methods applied to calibrated model parameters to deduce relationships between parameters and physical catchment characteristics. These relationships were further tested and validated on gauged basins that were treated as ungauged. This regionalization was also performed for the GR4J model. We obtained NSE values greater than 0.7 for MODHYPMA, while the NSE values for GR4J were below 0.5. In the presented study, the effects of climate change on water resources in the Ouémé catchment at the outlet of Savè (about 23 500 km2) are quantified. The output of a regional climate model was used as input to the hydrological models. Computed within the GLOWA-IMPETUS project, the future climate projections (describing a rainfall reduction of up to 15%) are derived from the regional climate model REMO driven by the global ECHAM model. The results reveal a significant decrease in future water resources (of -66% to -53% for MODHYPMA and of -59% to -46% for GR4J) for the IPCC climate scenarios A1B and B1.
NASA Astrophysics Data System (ADS)
Beamer, J. P.; Hill, D. F.; Liston, G. E.; Arendt, A. A.; Hood, E. W.
2013-12-01
In Prince William Sound (PWS), Alaska, there is a pressing need for accurate estimates of the spatial and temporal variations in coastal freshwater discharge (FWD). FWD into PWS originates from streamflow due to rainfall, annual snowmelt, and changes in stored glacier mass and is important because it helps establish spatial and temporal patterns in ocean salinity and temperature, and is a time-varying boundary condition for oceanographic circulation models. Previous efforts to model FWD into PWS have been heavily empirical, with many physical processes absorbed into calibration coefficients that, in many cases, were calibrated to streams and rivers not hydrologically similar to those discharging into PWS. In this work we adapted and validated a suite of high-resolution (in space and time), physically-based, distributed weather, snowmelt, and runoff-routing models designed specifically for snow melt- and glacier melt-dominated watersheds like PWS in order to: 1) provide high-resolution, real-time simulations of snowpack and FWD, and 2) provide a record of historical variations of FWD. SnowModel, driven with gridded topography, land cover, and 32 years (1979-2011) of 3-hourly North American Regional Reanalysis (NARR) atmospheric forcing data, was used to simulate snowpack accumulation and melt across a PWS model domain. SnowModel outputs of daily snow water equivalent (SWE) depth and grid-cell runoff volumes were then coupled with HydroFlow, a runoff-routing model which routed snowmelt, glacier-melt, and rainfall to each watershed outlet (PWS coastline) in the simulation domain. The end product was a continuous 32-year simulation of daily FWD into PWS. In order to validate the models, SWE and snow depths from SnowModel were compared with observed SWE and snow depths from SnoTel and snow survey data, and discharge from HydroFlow was compared with observed streamflow measurements. As a second phase of this research effort, the coupled models will be set-up to run in real-time, where daily measurements from weather stations in the PWS will be used to drive simulations of snow cover and streamflow. In addition, we will deploy a strategic array of instrumentation aimed at validating the simulated weather estimates and the calculations of freshwater discharge. Upon successful implementation and validation of the modeling system, it will join established and ongoing computational and observational efforts that have a common goal of establishing a comprehensive understanding of the physical behavior of PWS.
The space shuttle payload planning working groups. Volume 8: Earth and ocean physics
NASA Technical Reports Server (NTRS)
1973-01-01
The findings and recommendations of the Earth and Ocean Physics working group of the space shuttle payload planning activity are presented. The requirements for the space shuttle mission are defined as: (1) precision measurement for earth and ocean physics experiments, (2) development and demonstration of new and improved sensors and analytical techniques, (3) acquisition of surface truth data for evaluation of new measurement techniques, (4) conduct of critical experiments to validate geophysical phenomena and instrumental results, and (5) development and validation of analytical/experimental models for global ocean dynamics and solid earth dynamics/earthquake prediction. Tables of data are presented to show the flight schedule, estimated costs, and the mission model.
Hassett, Leanne; Moseley, Anne; Harmer, Alison; van der Ploeg, Hidde P
2015-01-01
To determine the reliability and validity of the Physical Activity Scale for Individuals with a Physical Disability (PASIPD) in adults with severe traumatic brain injury (TBI) and estimate the proportion of the sample participants who fail to meet the World Health Organization guidelines for physical activity. A single-center observational study recruited a convenience sample of 30 community-based ambulant adults with severe TBI. Participants completed the PASIPD on 2 occasions, 1 week apart, and wore an accelerometer (ActiGraph GT3X; ActiGraph LLC, Pensacola, Florida) for the 7 days between these 2 assessments. The PASIPD test-retest reliability was substantial (intraclass correlation coefficient = 0.85; 95% confidence interval, 0.70-0.92), and the correlation with the accelerometer ranged from too low to be meaningful (R = 0.09) to moderate (R = 0.57). From device-based measurement of physical activity, 56% of participants failed to meet the World Health Organization physical activity guidelines. The PASIPD is a reliable measure of the type of physical activity people with severe TBI participate in, but it is not a valid measure of the amount of moderate to vigorous physical activity in which they engage. Accelerometers should be used to quantify moderate to vigorous physical activity in people with TBI.
The SCALE Verified, Archived Library of Inputs and Data - VALID
DOE Office of Scientific and Technical Information (OSTI.GOV)
Marshall, William BJ J; Rearden, Bradley T
The Verified, Archived Library of Inputs and Data (VALID) at ORNL contains high quality, independently reviewed models and results that improve confidence in analysis. VALID is developed and maintained according to a procedure of the SCALE quality assurance (QA) plan. This paper reviews the origins of the procedure and its intended purpose, the philosophy of the procedure, some highlights of its implementation, and the future of the procedure and associated VALID library. The original focus of the procedure was the generation of high-quality models that could be archived at ORNL and applied to many studies. The review process associated with model generation minimized the chances of errors in these archived models. Subsequently, the scope of the library and procedure was expanded to provide high quality, reviewed sensitivity data files for deployment through the International Handbook of Evaluated Criticality Safety Benchmark Experiments (IHECSBE). Sensitivity data files for approximately 400 such models are currently available. The VALID procedure and library continue fulfilling these multiple roles. The VALID procedure is based on the quality assurance principles of ISO 9001 and nuclear safety analysis. Some of these key concepts include: independent generation and review of information, generation and review by qualified individuals, use of appropriate references for design data and documentation, and retrievability of the models, results, and documentation associated with entries in the library. Some highlights of the detailed procedure are discussed to provide background on its implementation and to indicate limitations of data extracted from VALID for use by the broader community. Specifically, external users of data generated within VALID must take responsibility for ensuring that the files are used within the QA framework of their organization and that use is appropriate. The future plans for the VALID library include expansion to include additional experiments from the IHECSBE, to include experiments from areas beyond criticality safety, such as reactor physics and shielding, and to include application models. In the future, external SCALE users may also obtain qualification under the VALID procedure and be involved in expanding the library. The VALID library provides a pathway for the criticality safety community to leverage modeling and analysis expertise at ORNL.
Validation of a small-animal PET simulation using GAMOS: a GEANT4-based framework
NASA Astrophysics Data System (ADS)
Cañadas, M.; Arce, P.; Rato Mendes, P.
2011-01-01
Monte Carlo-based modelling is a powerful tool to help in the design and optimization of positron emission tomography (PET) systems. The performance of these systems depends on several parameters, such as detector physical characteristics, shielding or electronics, whose effects can be studied on the basis of realistic simulated data. The aim of this paper is to validate a comprehensive study of the Raytest ClearPET small-animal PET scanner using a new Monte Carlo simulation platform which has been developed at CIEMAT (Madrid, Spain), called GAMOS (GEANT4-based Architecture for Medicine-Oriented Simulations). This toolkit, based on the GEANT4 code, was originally designed to cover multiple applications in the field of medical physics from radiotherapy to nuclear medicine, but has since been applied by some of its users in other fields of physics, such as neutron shielding, space physics, high energy physics, etc. Our simulation model includes the relevant characteristics of the ClearPET system, namely, the double layer of scintillator crystals in phoswich configuration, the rotating gantry, the presence of intrinsic radioactivity in the crystals or the storage of single events for an off-line coincidence sorting. Simulated results are contrasted with experimental acquisitions including studies of spatial resolution, sensitivity, scatter fraction and count rates in accordance with the National Electrical Manufacturers Association (NEMA) NU 4-2008 protocol. Spatial resolution results showed a discrepancy between simulated and measured values equal to 8.4% (with a maximum FWHM difference over all measurement directions of 0.5 mm). Sensitivity results differ less than 1% for a 250-750 keV energy window. Simulated and measured count rates agree well within a wide range of activities, including under electronic saturation of the system (the measured peak of total coincidences, for the mouse-sized phantom, was 250.8 kcps reached at 0.95 MBq mL-1 and the simulated peak was 247.1 kcps at 0.87 MBq mL-1). Agreement better than 3% was obtained in the scatter fraction comparison study. We also measured and simulated a mini-Derenzo phantom obtaining images with similar quality using iterative reconstruction methods. We concluded that the overall performance of the simulation showed good agreement with the measured results and validates the GAMOS package for PET applications. Furthermore, its ease of use and flexibility recommends it as an excellent tool to optimize design features or image reconstruction techniques.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Saurav, Kumar; Chandan, Vikas
District-heating-and-cooling (DHC) systems are a proven energy solution that has been deployed for many years in a growing number of urban areas worldwide. They comprise a variety of technologies that seek to develop synergies between the production and supply of heat, cooling, domestic hot water and electricity. Although the benefits of DHC systems are significant and have been widely acclaimed, the full potential of modern DHC systems remains largely untapped. There are several opportunities for development of energy efficient DHC systems, which will enable the effective exploitation of alternative renewable resources, waste heat recovery, etc., in order to increase the overall efficiency and facilitate the transition towards the next generation of DHC systems. This motivated the need for modelling these complex systems. Large-scale modelling of DHC networks is challenging, as it has several components such as buildings, pipes, valves, heating sources, etc., interacting with each other. In this paper, we focus on building modelling. In particular, we present a gray-box methodology for thermal modelling of buildings. Gray-box modelling is a hybrid of data-driven and physics-based models where coefficients of the equations from physics-based models are learned using data. This approach allows us to capture the dynamics of the buildings more effectively compared to a pure data-driven approach. Additionally, this approach results in simpler models compared to pure physics-based models. We first develop the individual components of the building, such as temperature evolution, flow controller, etc. These individual models are then integrated into the complete gray-box model for the building. The model is validated using data collected from one of the buildings at Luleå, a city on the coast of northern Sweden.
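A hedged sketch of the gray-box idea for one building: a first-order resistance-capacitance (RC) structure comes from the physics, and its coefficients are learned from data by least squares. The temperature and heating-power series below are synthetic placeholders, not the Luleå measurements.

```python
import numpy as np

dt, n = 900.0, 500                                   # 15-min steps
rng = np.random.default_rng(4)
T_out = 5.0 * np.sin(2 * np.pi * np.arange(n) * dt / 86400.0)   # outdoor temperature [degC]
Q = 2000.0 * (rng.random(n) > 0.5)                   # heating power [W]

R_true, C_true = 0.005, 5e6                          # thermal resistance [K/W], capacitance [J/K]
T = np.empty(n); T[0] = 20.0
for k in range(n - 1):                               # "measured" indoor temperature
    T[k + 1] = T[k] + dt / C_true * (Q[k] + (T_out[k] - T[k]) / R_true)

# Gray-box fit: (T[k+1] - T[k]) / dt = Q[k]/C + (T_out[k] - T[k]) / (R*C)
y = (T[1:] - T[:-1]) / dt
A = np.column_stack([Q[:-1], T_out[:-1] - T[:-1]])
theta, *_ = np.linalg.lstsq(A, y, rcond=None)        # theta = [1/C, 1/(R*C)]
C_est, R_est = 1.0 / theta[0], theta[0] / theta[1]
print(f"R ~ {R_est:.4f} K/W, C ~ {C_est:.2e} J/K")
```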
Simplified Predictive Models for CO2 Sequestration Performance Assessment
NASA Astrophysics Data System (ADS)
Mishra, Srikanta; RaviGanesh, Priya; Schuetter, Jared; Mooney, Douglas; He, Jincong; Durlofsky, Louis
2014-05-01
We present results from an ongoing research project that seeks to develop and validate a portfolio of simplified modeling approaches that will enable rapid feasibility and risk assessment for CO2 sequestration in deep saline formation. The overall research goal is to provide tools for predicting: (a) injection well and formation pressure buildup, and (b) lateral and vertical CO2 plume migration. Simplified modeling approaches that are being developed in this research fall under three categories: (1) Simplified physics-based modeling (SPM), where only the most relevant physical processes are modeled, (2) Statistical-learning based modeling (SLM), where the simulator is replaced with a "response surface", and (3) Reduced-order method based modeling (RMM), where mathematical approximations reduce the computational burden. The system of interest is a single vertical well injecting supercritical CO2 into a 2-D layered reservoir-caprock system with variable layer permeabilities. In the first category (SPM), we use a set of well-designed full-physics compositional simulations to understand key processes and parameters affecting pressure propagation and buoyant plume migration. Based on these simulations, we have developed correlations for dimensionless injectivity as a function of the slope of fractional-flow curve, variance of layer permeability values, and the nature of vertical permeability arrangement. The same variables, along with a modified gravity number, can be used to develop a correlation for the total storage efficiency within the CO2 plume footprint. In the second category (SLM), we develop statistical "proxy models" using the simulation domain described previously with two different approaches: (a) classical Box-Behnken experimental design with a quadratic response surface fit, and (b) maximin Latin Hypercube sampling (LHS) based design with a Kriging metamodel fit using a quadratic trend and Gaussian correlation structure. For roughly the same number of simulations, the LHS-based meta-model yields a more robust predictive model, as verified by a k-fold cross-validation approach. In the third category (RMM), we use a reduced-order modeling procedure that combines proper orthogonal decomposition (POD) for reducing problem dimensionality with trajectory-piecewise linearization (TPWL) for extrapolating system response at new control points from a limited number of trial runs ("snapshots"). We observe significant savings in computational time with very good accuracy from the POD-TPWL reduced order model - which could be important in the context of history matching, uncertainty quantification and optimization problems. The paper will present results from our ongoing investigations, and also discuss future research directions and likely outcomes. This work was supported by U.S. Department of Energy National Energy Technology Laboratory award DE-FE0009051 and Ohio Department of Development grant D-13-02.
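A sketch of the design-and-metamodel step under stated assumptions: scipy's Latin Hypercube sampler (its "random-cd" space-filling refinement standing in for a maximin criterion) generates the design, a Gaussian-process regressor plays the role of the Kriging metamodel, and `simulator` is a cheap analytic placeholder for the full-physics runs; a k-fold cross-validation score then checks the proxy, as in the SLM category above.

```python
import numpy as np
from scipy.stats import qmc
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel
from sklearn.model_selection import cross_val_score

def simulator(x):
    # Placeholder for a full-physics compositional simulation.
    return np.sin(3.0 * x[:, 0]) + x[:, 1] ** 2 + 0.5 * x[:, 2]

sampler = qmc.LatinHypercube(d=3, optimization="random-cd", seed=0)
X = sampler.random(n=60)                     # 60 "simulations" in the unit cube
y = simulator(X)

kriging = GaussianProcessRegressor(kernel=ConstantKernel() * RBF(length_scale=[0.2] * 3),
                                   normalize_y=True)
scores = cross_val_score(kriging, X, y, cv=5, scoring="r2")
print(f"cross-validated R^2 = {scores.mean():.3f}")
```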
DOE Office of Scientific and Technical Information (OSTI.GOV)
Damiani, Rick
This manual summarizes the theory and preliminary verifications of the JacketSE module, which is an offshore jacket sizing tool that is part of the Wind-Plant Integrated System Design & Engineering Model toolbox. JacketSE is based on a finite-element formulation and on user-prescribed inputs and design standards' criteria (constraints). The physics are highly simplified, with a primary focus on satisfying ultimate limit states and modal performance requirements. Preliminary validation work included comparing industry data and verification against ANSYS, a commercial finite-element analysis package. The results are encouraging, and future improvements to the code are recommended in this manual.
NASA Astrophysics Data System (ADS)
Bonne, François; Alamir, Mazen; Bonnay, Patrick
2014-01-01
In this paper, a physical method to obtain control-oriented dynamical models of large scale cryogenic refrigerators is proposed, in order to synthesize model-based advanced control schemes. These schemes aim to replace classical user-experience-designed approaches usually based on many independent PI controllers. This is particularly useful in the case where cryoplants are submitted to large pulsed thermal loads, expected to take place in the cryogenic cooling systems of future fusion reactors such as the International Thermonuclear Experimental Reactor (ITER) or the Japan Torus-60 Super Advanced Fusion Experiment (JT-60SA). Advanced control schemes lead to a better perturbation immunity and rejection, to offer a safer utilization of cryoplants. The paper gives details on how basic components used in the field of large scale helium refrigeration (especially those present on the 400W @1.8K helium test facility at CEA-Grenoble) are modeled and assembled to obtain the complete dynamic description of controllable subsystems of the refrigerator (controllable subsystems are namely the Joule-Thompson Cycle, the Brayton Cycle, the Liquid Nitrogen Precooling Unit and the Warm Compression Station). The complete 400W @1.8K (in the 400W @4.4K configuration) helium test facility model is then validated against experimental data and the optimal control of both the Joule-Thompson valve and the turbine valve is proposed, to stabilize the plant under highly variable thermal loads. This work is partially supported through the European Fusion Development Agreement (EFDA) Goal Oriented Training Program, task agreement WP10-GOT-GIRO.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bonne, François; Bonnay, Patrick; Alamir, Mazen
2014-01-29
In this paper, a physical method to obtain control-oriented dynamical models of large scale cryogenic refrigerators is proposed, in order to synthesize model-based advanced control schemes. These schemes aim to replace classical user-experience-designed approaches usually based on many independent PI controllers. This is particularly useful in the case where cryoplants are submitted to large pulsed thermal loads, expected to take place in the cryogenic cooling systems of future fusion reactors such as the International Thermonuclear Experimental Reactor (ITER) or the Japan Torus-60 Super Advanced Fusion Experiment (JT-60SA). Advanced control schemes lead to a better perturbation immunity and rejection, to offer a safer utilization of cryoplants. The paper gives details on how basic components used in the field of large scale helium refrigeration (especially those present on the 400W @1.8K helium test facility at CEA-Grenoble) are modeled and assembled to obtain the complete dynamic description of controllable subsystems of the refrigerator (controllable subsystems are namely the Joule-Thompson Cycle, the Brayton Cycle, the Liquid Nitrogen Precooling Unit and the Warm Compression Station). The complete 400W @1.8K (in the 400W @4.4K configuration) helium test facility model is then validated against experimental data and the optimal control of both the Joule-Thompson valve and the turbine valve is proposed, to stabilize the plant under highly variable thermal loads. This work is partially supported through the European Fusion Development Agreement (EFDA) Goal Oriented Training Program, task agreement WP10-GOT-GIRO.
Simulation Based Earthquake Forecasting with RSQSim
NASA Astrophysics Data System (ADS)
Gilchrist, J. J.; Jordan, T. H.; Dieterich, J. H.; Richards-Dinger, K. B.
2016-12-01
We are developing a physics-based forecasting model for earthquake ruptures in California. We employ the 3D boundary element code RSQSim to generate synthetic catalogs with millions of events that span up to a million years. The simulations incorporate rate-state fault constitutive properties in complex, fully interacting fault systems. The Unified California Earthquake Rupture Forecast Version 3 (UCERF3) model and data sets are used for calibration of the catalogs and specification of fault geometry. Fault slip rates match the UCERF3 geologic slip rates and catalogs are tuned such that earthquake recurrence matches the UCERF3 model. Utilizing the Blue Waters Supercomputer, we produce a suite of million-year catalogs to investigate the epistemic uncertainty in the physical parameters used in the simulations. In particular, values of the rate- and state-friction parameters a and b, the initial shear and normal stress, as well as the earthquake slip speed, are varied over several simulations. In addition to testing multiple models with homogeneous values of the physical parameters, the parameters a, b, and the normal stress are varied with depth as well as in heterogeneous patterns across the faults. Cross validation of UCERF3 and RSQSim is performed within the SCEC Collaboratory for Interseismic Simulation and Modeling (CISM) to determine the effect of the uncertainties in physical parameters, observed in the field and measured in the lab, on the uncertainties in probabilistic forecasting. We are particularly interested in the short-term hazards of multi-event sequences due to complex faulting and multi-fault ruptures.
Amin, N A; Quek, K F; Oxley, J A; Noah, R M; Nordin, R
2015-10-01
The Job Content Questionnaire (M-JCQ) is an established self-report instrument used across the world to measure work dimensions based on Karasek's demand-control-support model. To evaluate the psychometric properties of the Malay version of the M-JCQ among nurses in Malaysia. This cross-sectional study was carried out on nurses working in 4 public hospitals in the Klang Valley area, Malaysia. The M-JCQ was used to assess the perceived psychosocial stressors and physical demands of nurses at their workplaces. Construct validity of the questionnaire was examined using exploratory factor analysis (EFA). Cronbach's α values were used to estimate the reliability (internal consistency) of the M-JCQ. EFA showed that 34 selected items were loaded on 4 factors. Except for psychological job demand (Cronbach's α 0.51), the remaining 3 α values for the 3 subscales (job control, social support, and physical demand) were greater than 0.70, indicating acceptable internal consistency. However, an item was excluded due to poor item-total correlation (r < 0.3). The final M-JCQ consisted of 33 items. The M-JCQ is a reliable and valid instrument to measure psychosocial and physical stressors in the workplace of public hospital nurses in Malaysia.
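A minimal sketch of the internal-consistency check reported above: Cronbach's alpha computed from a respondents-by-items matrix; the small response matrix is hypothetical.

```python
import numpy as np

def cronbach_alpha(items):
    # items: 2-D array, rows = respondents, columns = scale items.
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1.0 - item_vars / total_var)

responses = np.array([[3, 4, 3, 4],
                      [2, 2, 3, 2],
                      [4, 5, 4, 4],
                      [3, 3, 3, 4],
                      [5, 4, 5, 5],
                      [2, 3, 2, 2]])
print(f"Cronbach's alpha = {cronbach_alpha(responses):.2f}")
```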
Busija, L; Buchbinder, R; Osborne, R H
2016-08-01
This study reports the development of the OsteoArthritis Questionnaire (OA-Quest) - a new measure designed to comprehensively capture the potentially modifiable burden of osteoarthritis. Item development was guided by the a priori conceptual framework of the Personal Burden of Osteoarthritis (PBO), which captures 8 dimensions of osteoarthritis burden (Physical distress, Fatigue, Physical limitations, Psychosocial distress, Physical de-conditioning, Financial hardship, Sleep disturbances, Lost productivity). One hundred and twenty-three candidate items were pretested in a clinical sample of 18 osteoarthritis patients. The measurement properties of the OA-Quest were assessed with exploratory factor analysis (EFA), Rasch modelling, and confirmatory factor analysis (CFA) in a community-based sample (n = 792). EFA replicated 7 of the 8 PBO domains. An exception was the PBO Fatigue domain, with its items merging into the Physical distress subscale of the OA-Quest. Following item analysis, a 42-item 7-subscale questionnaire was constructed, measuring Physical distress (seven items, Cronbach's α = 0.93), Physical limitations (11 items, α = 0.95), Psychosocial distress (seven items, α = 0.93), Physical de-conditioning (four items, α = 0.87), Financial hardship (four items, α = 0.93), Sleep disturbances (five items, α = 0.96), and Lost productivity (four items, α = 0.90). A highly restricted 7-factor CFA model had excellent fit with the data (χ2(113) = 316.36, P < 0.001; chi-square/degrees of freedom = 2.8; comparative fit index [CFI] = 0.97; root mean square error of approximation [RMSEA] = 0.07), supporting construct validity of the new measure. The OA-Quest is a new measure of osteoarthritis burden that is founded on a comprehensive conceptual model. It has strong evidence of construct validity and provides reliable measurement across a broad range of osteoarthritis burden.
The Social Physique Anxiety Scale: construct validity in adolescent females.
McAuley, E; Burman, G
1993-09-01
Hart, Leary, and Rejeski have developed the Social Physique Anxiety Scale (SPA), a measure of the anxiety experienced in response to having one's physique evaluated by other people. The present study cross-validated the psychometric properties of this measure in a sample (N = 236) of adolescent competitive female gymnasts. Employing structural equation modeling, the proposed unidimensional factor structure of the SPA was supported, although some questions regarding the robustness of the fit are raised. Construct validity was demonstrated by significant inverse relationships between aspects of physical efficacy (perceived physical ability and physical self-presentation confidence) and degree of social physique anxiety. These findings are discussed in terms of possible alternative factor structures and integration of social anxiety and other psychosocial constructs to better understand physical activity behavior.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fields, Laura; Genser, Krzysztof; Hatcher, Robert
Geant4 is the leading detector simulation toolkit used in high energy physics to design detectors and to optimize calibration and reconstruction software. It employs a set of carefully validated physics models to simulate interactions of particles with matter across a wide range of interaction energies. These models, especially the hadronic ones, rely largely on directly measured cross-sections and phenomenological predictions with physically motivated parameters estimated by theoretical calculation or measurement. Because these models are tuned to cover a very wide range of possible simulation tasks, they may not always be optimized for a given process or a given material. This raises several critical questions, e.g. how sensitive Geant4 predictions are to the variations of the model parameters, or what uncertainties are associated with a particular tune of a Geant4 physics model, or a group of models, or how to consistently derive guidance for Geant4 model development and improvement from a wide range of available experimental data. We have designed and implemented a comprehensive, modular, user-friendly software toolkit to study and address such questions. It allows one to easily modify parameters of one or several Geant4 physics models involved in the simulation, and to perform collective analysis of multiple variants of the resulting physics observables of interest and comparison against a variety of corresponding experimental data. Based on modern event-processing infrastructure software, the toolkit offers a variety of attractive features, e.g. flexible run-time configurable workflow, comprehensive bookkeeping, and an easy-to-expand collection of analytical components. Design, implementation technology, and key functionalities of the toolkit are presented and illustrated with results obtained with Geant4 key hadronic models.
Particulate matter concentration mapping from MODIS satellite data: a Vietnamese case study
NASA Astrophysics Data System (ADS)
Nguyen, Thanh T. N.; Bui, Hung Q.; Pham, Ha V.; Luu, Hung V.; Man, Chuc D.; Pham, Hai N.; Le, Ha T.; Nguyen, Thuy T.
2015-09-01
Particulate Matter (PM) pollution is one of the most important air quality concerns in Vietnam. In this study, we integrate ground-based measurements, meteorological and satellite data to map temporal PM concentrations on a 10 × 10 km grid for the whole of Vietnam. We specifically used MODIS Aqua and Terra data and developed statistically significant regression models to map and extend the ground-based PM concentrations. We validated our models over diverse geographic provinces, i.e., North East, Red River Delta, North Central Coast and South Central Coast in Vietnam. Validation suggested good results for satellite-derived PM2.5 data compared to ground-based PM2.5 (n = 285, r2 = 0.411, RMSE = 20.299 μg m^-3 and RE = 39.789%). Further, validation of satellite-derived PM2.5 on two independent datasets for North East and South Central Coast suggested similar results (n = 40, r2 = 0.455, RMSE = 21.512 μg m^-3, RE = 45.236% and n = 45, r2 = 0.444, RMSE = 8.551 μg m^-3, RE = 46.446%, respectively). Also, our satellite-derived PM2.5 maps were able to replicate seasonal and spatial trends of ground-based measurements in four different regions. Our results highlight the potential use of MODIS datasets for PM estimation at a regional scale in Vietnam. However, the model's limitations in capturing maximal or minimal PM2.5 peaks need further investigation of ground data, atmospheric conditions and physical aspects.
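A short sketch of the validation statistics quoted above (r2, RMSE and relative error) for satellite-derived versus ground-based PM2.5; the two small arrays are hypothetical concentrations in μg m^-3, and the relative-error definition (mean absolute error relative to the ground value) is an assumption.

```python
import numpy as np

def validation_stats(ground, satellite):
    ground, satellite = np.asarray(ground, float), np.asarray(satellite, float)
    r = np.corrcoef(ground, satellite)[0, 1]
    rmse = np.sqrt(np.mean((satellite - ground) ** 2))
    re = 100.0 * np.mean(np.abs(satellite - ground) / ground)   # relative error [%], assumed definition
    return r ** 2, rmse, re

ground = [35.0, 60.0, 22.0, 48.0, 75.0]
satellite = [30.0, 52.0, 28.0, 55.0, 61.0]
r2, rmse, re = validation_stats(ground, satellite)
print(f"r2 = {r2:.3f}, RMSE = {rmse:.1f} ug/m3, RE = {re:.1f}%")
```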
NASA Astrophysics Data System (ADS)
Ferguson-Hessler, Monica G. M.; de Jong, Ton
This study aims at giving a systematic description of the cognitive activities involved in teaching physics. Such a description of instruction in physics requires a basis in two models, that is, the cognitive activities involved in learning physics and the knowledge base that is the foundation of expertise in that subject. These models have been provided by earlier research. The model of instruction distinguishes three main categories of instruction process: presenting new information, integrating (i.e., bringing structure into) new knowledge, and connecting elements of new knowledge to prior knowledge. Each of the main categories has been divided into a number of specific instruction processes. In this way, any limited and specific cognitive teacher activity can be described along the two dimensions of process and type of knowledge. The model was validated by application to lectures and problem-solving classes of first-year university courses. These were recorded and analyzed as to instruction process and type of knowledge. Results indicate that teachers are indeed involved in the various types of instruction processes defined. The importance of this study lies in the creation of a terminology that makes it possible to discuss instruction in an explicit and specific way.
Rohani, Hosein; Eslami, Ahmad Ali; Ghaderi, Arsalan; Jafari-Koshki, Tohid; Sadeghi, Erfan; Bidkhori, Mohammad; Raei, Mehdi
2016-01-01
A moderate increase in physical activity (PA) may be helpful in preventing or postponing the complications of type 2 diabetes mellitus (T2DM). The aim of this study was to assess the psychometric properties of a health action process approach (HAPA)-based PA inventory among T2DM patients. In 2015, this cross-sectional study was carried out on 203 participants recruited by convenience sampling in Isfahan, Iran. Content and face validity was confirmed by a panel of experts. The comments noted by 9 outpatients on the inventory were also investigated. Then, the items were administered to 203 T2DM patients. Construct validity was assessed using exploratory factor analysis and structural equation modeling-based confirmatory factor analysis. Reliability was assessed with Cronbach alpha and the intraclass correlation coefficient (ICC). Content validity was acceptable (CVR = 0.62, CVI = 0.89). Exploratory factor analysis extracted seven factors (risk perception, action self-efficacy, outcome expectancies, maintenance self-efficacy, action and coping planning, behavioral intention, and recovery self-efficacy) explaining 82.23% of the variation. The HAPA model had an acceptable fit to the observations (χ2 = 3.21, df = 3, P = 0.38; RMSEA = 0.06; AGFI = 0.90; PGFI = 0.12). Cronbach alpha values for the scales ranged from about 0.63 to 0.97, and ICC values from 0.862 to 0.988. The findings of the present study provide initial support for the reliability and validity of the HAPA-based PA inventory among patients with T2DM.
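As a reminder of the reliability statistic used above, Cronbach's alpha can be computed directly from an items-by-respondents score matrix; the responses below are illustrative, not the study's data.

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    items = np.asarray(items, float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)        # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)    # variance of the summed scale
    return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

# Illustrative responses: 6 respondents x 4 items on a 5-point scale.
scores = [[4, 5, 4, 4],
          [3, 3, 4, 3],
          [5, 5, 5, 4],
          [2, 3, 2, 2],
          [4, 4, 5, 4],
          [3, 2, 3, 3]]
print(f"alpha = {cronbach_alpha(scores):.3f}")
```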
Construct Validity of the Societal Outreach Scale (SOS).
Fike, David S; Denton, Jason; Walk, Matt; Kish, Jennifer; Gorman, Ira
2018-04-01
The American Physical Therapy Association (APTA) has been working toward a vision of increasing professional focus on societal-level health. However, the performance of social responsibility and related behaviors by physical therapists remains relatively poorly integrated into practice. Promoting a focus on societal outreach is necessary for all health care professionals to impact the health of their communities. The objective was to document the validity of the 14-item Societal Outreach Scale (SOS) for use with practicing physical therapists. This study used a cross-sectional survey. The SOS was transmitted via email to all therapists who were licensed and practicing in 10 states in the United States that were purposefully selected to assure a broad representation. A sample of 2612 usable responses was received. Factor analysis was applied to assess the construct validity of the instrument. Of the alternate models, a 3-factor model best demonstrated goodness of fit with the sample data according to conventional indices (standardized root mean squared residual = .03, comparative fit index = .96, root mean square error of approximation = .06). The 3 factors measured by the SOS were labeled Societal-Level Health Advocacy, Community Engagement/Social Integration, and Political Engagement. Internal consistency reliability was 0.7 for all factors. The 3-factor SOS demonstrated acceptable validity and reliability. Though the sample included a broad representation of physical therapists, this was a single cross-sectional study. Additional confirmatory factor analysis, reliability testing, and wording refinement of the tool are warranted. Given the construct validity and reliability of the 3-factor SOS, it is recommended for use as a validated instrument to measure physical therapists' performance of social responsibility and related behaviors.
ERIC Educational Resources Information Center
Sulz, Lauren; Temple, Viviene; Gibbons, Sandra
2016-01-01
The aim of this research was to develop measures to provide valid and reliable representation of the motivational states and psychological needs proposed by the self-determination theory (Deci & Ryan, 1985, 2000) within a physical education context. Based on theoretical underpinnings of self-determination theory, two questionnaires were…
Creating Physical 3D Stereolithograph Models of Brain and Skull
Kelley, Daniel J.; Farhoud, Mohammed; Meyerand, M. Elizabeth; Nelson, David L.; Ramirez, Lincoln F.; Dempsey, Robert J.; Wolf, Alan J.; Alexander, Andrew L.; Davidson, Richard J.
2007-01-01
The human brain and skull are three dimensional (3D) anatomical structures with complex surfaces. However, medical images are often two dimensional (2D) and provide incomplete visualization of structural morphology. To overcome this loss in dimension, we developed and validated a freely available, semi-automated pathway to build 3D virtual reality (VR) and hand-held, stereolithograph models. To evaluate whether surface visualization in 3D was more informative than in 2D, undergraduate students (n = 50) used the Gillespie scale to rate 3D VR and physical models of both a living patient-volunteer's brain and the skull of Phineas Gage, a historically famous railroad worker whose misfortune with a projectile tamping iron provided the first evidence of a structure-function relationship in brain. Using our processing pathway, we successfully fabricated human brain and skull replicas and validated that the stereolithograph model preserved the scale of the VR model. Based on the Gillespie ratings, students indicated that the biological utility and quality of visual information at the surface of VR and stereolithograph models were greater than the 2D images from which they were derived. The method we developed is useful to create VR and stereolithograph 3D models from medical images and can be used to model hard or soft tissue in living or preserved specimens. Compared to 2D images, VR and stereolithograph models provide an extra dimension that enhances both the quality of visual information and utility of surface visualization in neuroscience and medicine. PMID:17971879
Christiansen, Daniel E.
2012-01-01
The U.S. Geological Survey, in cooperation with the Iowa Department of Natural Resources, conducted a study to examine techniques for estimation of daily streamflows using hydrological models and statistical methods. This report focuses on the use of a hydrologic model, the U.S. Geological Survey's Precipitation-Runoff Modeling System, to estimate daily streamflows at gaged and ungaged locations. The Precipitation-Runoff Modeling System is a modular, physically based, distributed-parameter modeling system developed to evaluate the impacts of various combinations of precipitation, climate, and land use on surface-water runoff and general basin hydrology. The Cedar River Basin was selected to construct a Precipitation-Runoff Modeling System model that simulates the period from January 1, 2000, to December 31, 2010. The calibration period was from January 1, 2000, to December 31, 2004, and the validation periods were from January 1, 2005, to December 31, 2010, and from January 1, 2000, to December 31, 2010. A Geographic Information System tool was used to delineate the Cedar River Basin and subbasins for the Precipitation-Runoff Modeling System model and to derive parameters based on the physical geographical features. Calibration of the Precipitation-Runoff Modeling System model was completed using a U.S. Geological Survey calibration software tool. The main objective of the calibration was to match the daily streamflow simulated by the Precipitation-Runoff Modeling System model with streamflow measured at U.S. Geological Survey streamflow gages. The Cedar River Basin daily streamflow model performed with Nash-Sutcliffe efficiencies ranging from 0.82 to 0.33 during the calibration period and from 0.77 to -0.04 during the validation period. The Cedar River Basin model meets the criterion of a Nash-Sutcliffe efficiency greater than 0.50 and provides a good fit to streamflow conditions for the calibration period at all but one location, Austin, Minnesota. The Precipitation-Runoff Modeling System model accurately simulated streamflow at four of six uncalibrated sites within the basin. Overall, there was good agreement between simulated and measured seasonal and annual volumes throughout the basin for calibration and validation sites: differences ranged from 0.2 to 20.8 percent for the calibration period and from 0.0 to 19.5 percent for the validation period across all seasons and total annual runoff. The Precipitation-Runoff Modeling System model tended to underestimate lower streamflows compared to the observed streamflow values, an indication that the model needs more detailed groundwater and storage information to properly simulate low-flow conditions in the Cedar River Basin.
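The Nash-Sutcliffe efficiency used above as the goodness-of-fit criterion is a single statistic comparing simulated and observed series; a short sketch with placeholder streamflows:

```python
import numpy as np

def nash_sutcliffe(observed, simulated):
    """Nash-Sutcliffe efficiency: 1 is a perfect fit, 0 matches the observed mean,
    negative values mean the observed mean outperforms the model."""
    obs = np.asarray(observed, float)
    sim = np.asarray(simulated, float)
    return 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

# Placeholder daily streamflows (cubic feet per second), not USGS gage data.
observed = [120.0, 150.0, 300.0, 280.0, 200.0, 160.0]
simulated = [110.0, 160.0, 260.0, 300.0, 210.0, 150.0]
print(f"NSE = {nash_sutcliffe(observed, simulated):.2f}")
```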
NASA Astrophysics Data System (ADS)
Jieying, HE; Shengwei, ZHANG; Na, LI
2017-02-01
A passive submillimeter precipitation retrieval algorithm is presented based on the Microwave Humidity and Temperature Sounder (MWHTS) onboard the Chinese Feng Yun 3C (FY-3C) satellite. Using the validated global reference physical model (NCEP/WRF/VDISORT), NCEP data at 6-hour intervals are downloaded to run the Weather Research and Forecasting (WRF) model and derive typical precipitation data for the whole world. The precipitation retrieval algorithm can operate over both land and seawater globally. To simplify the calculation procedure and save training time, principal component analysis (PCA) was adopted to filter out the redundancy caused by scanning angle and surface effects, as well as system noise. Comparison and validation against other precipitation sources demonstrate that the retrievals are reliable for surface precipitation rates higher than 0.1 mm/h at 15 km resolution.
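A minimal sketch of the PCA reduction step described above, projecting multichannel brightness temperatures onto a few leading components before training a retrieval; it uses a plain SVD on toy data and is not the authors' implementation.

```python
import numpy as np

def pca_reduce(X, n_components):
    """Project rows of X (samples x channels) onto the leading principal components."""
    Xc = X - X.mean(axis=0)                          # centre each channel
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    explained = s ** 2 / np.sum(s ** 2)              # fraction of variance per component
    return Xc @ Vt[:n_components].T, explained[:n_components]

rng = np.random.default_rng(0)
# Toy stand-in for simulated sounder brightness temperatures (samples x channels).
X = rng.normal(250.0, 5.0, size=(200, 15))
scores, explained = pca_reduce(X, n_components=3)
print(scores.shape, np.round(explained, 3))
```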
Extension of HCDstruct for Transonic Aeroservoelastic Analysis of Unconventional Aircraft Concepts
NASA Technical Reports Server (NTRS)
Quinlan, Jesse R.; Gern, Frank H.
2017-01-01
A substantial effort has been made to implement an enhanced aerodynamic modeling capability in the Higher-fidelity Conceptual Design and structural optimization tool. This additional capability is needed for a rapid, physics-based method of modeling advanced aircraft concepts at risk of structural failure due to dynamic aeroelastic instabilities. To adequately predict these instabilities, in particular for transonic applications, a generalized aerodynamic matching algorithm was implemented to correct the doublet-lattice model available in Nastran using solution data from a priori computational fluid dynamics analysis. This new capability is demonstrated for two tube-and-wing aircraft configurations, including a Boeing 737-200 for implementation validation and the NASA D8 as a first use case. Results validate the current implementation of the aerodynamic matching utility and demonstrate the importance of using such a method for aircraft configurations featuring fuselage-wing aerodynamic interaction.
Ablation and Thermal Response Property Model Validation for Phenolic Impregnated Carbon Ablator
NASA Technical Reports Server (NTRS)
Milos, F. S.; Chen, Y.-K.
2009-01-01
Phenolic Impregnated Carbon Ablator was the heatshield material for the Stardust probe and is also a candidate heatshield material for the Orion Crew Module. As part of the heatshield qualification for Orion, physical and thermal properties were measured for newly manufactured material, including emissivity, heat capacity, thermal conductivity, elemental composition, and thermal decomposition rates. Based on these properties, an ablation and thermal-response model was developed for temperatures up to 3500 K and pressures up to 100 kPa. The model includes orthotropic and pressure-dependent thermal conductivity. In this work, model validation is accomplished by comparison of predictions with data from many arcjet tests conducted over a range of stagnation heat flux and pressure from 107 Watts per square centimeter at 2.3 kPa to 1100 Watts per square centimeter at 84 kPa. Over the entire range of test conditions, model predictions compare well with measured recession, maximum surface temperatures, and in-depth temperatures.
Development of a measure of student self-evaluation of physics exam performance
NASA Astrophysics Data System (ADS)
Hagedorn, Eric Anthony
The central purpose of this study was to provide preliminary evidence of the reliability and validity of the SEVSI-P (Self-evaluation scaled instrument - physics). This instrument, designed to measure student self-evaluation of physics exam performance, was developed in congruence with social cognitive theory. Self-evaluation in this study is defined to consist of two of the three subprocesses of self-regulation: self-observation and judgmental process. As such, the SEVSI-P consists of two subscales, one measuring the frequency and types of self-observations made during a physics exam and one measuring the frequency and types of judgmental comparisons made after an exam. Data from 621 completed surveys, voluntarily taken by first-semester algebra/trigonometry-based physics students at six Midwestern universities and one Southern university, were analyzed for reliability and factorial validity. Cronbach alphas of .71 and .83 for the self-observation and judgment subscales, respectively, indicate acceptable reliability for the instrument. Confirmatory factor analysis indicates the acceptability of the hypothesis that the data analyzed could indeed have been obtained from the proposed two-factor model (self-observation and judgment). The results of this confirmatory factor analysis provide preliminary construct validity for this instrument. A number of theoretically related items were included on the SEVSI-P form to elicit information about the use of goals and pre-planned strategies, actions taken in response to previous poor performances, and emotional responses to performance. A correlational analysis of these items along with the self-observation and judgment subscale scores provided a limited degree of convergent validity for the two subscales. Analyses of variance were done to determine the presence of differences in scoring patterns based on gender or reported ethnic origin. These results indicate slightly higher judgment subscale scores for women and members of minority groups. The implications of these differences are suggested as warranting future research. Future uses of the SEVSI-P include classroom use to help students self-evaluate their exam performance in order to increase their achievement. Future research using the SEVSI-P to determine the causal relationships between self-evaluation, actual achievement, and other social cognitive constructs such as self-efficacy is suggested.
NASA Astrophysics Data System (ADS)
Sun, Guodong; Mu, Mu
2017-05-01
An important source of uncertainty, which causes further uncertainty in numerical simulations, resides in the parameters describing physical processes in numerical models. Therefore, identifying, among the numerous physical parameters in atmospheric and oceanic models, the subset of relatively more sensitive and important parameters, and reducing the errors in that subset, is a far more efficient way to reduce the uncertainties involved in simulations. In this context, we present a new approach based on the conditional nonlinear optimal perturbation related to parameter (CNOP-P) method. The approach provides a framework to ascertain the subset of relatively more sensitive and important physical parameters. The Lund-Potsdam-Jena (LPJ) dynamical global vegetation model was utilized to test the validity of the new approach in China. The results imply that nonlinear interactions among parameters play a key role in the identification of sensitive parameters in arid and semi-arid regions of China compared to those in northern, northeastern, and southern China. The uncertainties in the numerical simulations were reduced considerably by reducing the errors of the subset of relatively more sensitive and important parameters. The results demonstrate that our approach not only offers a new route to identify relatively more sensitive and important physical parameters but also shows that it is viable to apply "target observations" to reduce the uncertainties in model parameters.
A new approach to the extraction of single exponential diode model parameters
NASA Astrophysics Data System (ADS)
Ortiz-Conde, Adelmo; García-Sánchez, Francisco J.
2018-06-01
A new integration method is presented for the extraction of the parameters of a single exponential diode model with series resistance from the measured forward I-V characteristics. The extraction is performed using auxiliary functions based on the integration of the data, which allow the effects of each of the model parameters to be isolated. A differentiation method is also presented for data with a low level of experimental noise. Measured and simulated data are used to verify the applicability of both proposed methods. Physical insight into the validity of the model is also obtained by using the proposed graphical determinations of the parameters.
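For context, the single-exponential diode model with series resistance referred to above is I = Is[exp((V - I·Rs)/(n·Vt)) - 1]. The sketch below fits Is, n and Rs to synthetic forward I-V data by ordinary nonlinear least squares (writing V explicitly as a function of I), which is a generic alternative illustration and not the paper's integration-based extraction.

```python
import numpy as np
from scipy.optimize import curve_fit

VT = 0.02585  # thermal voltage at room temperature (V)

def diode_voltage(I, Is, n, Rs):
    """Forward voltage of a single-exponential diode with series resistance:
    V = n*Vt*ln(I/Is + 1) + I*Rs, explicit in I so no implicit solve is needed."""
    return n * VT * np.log(I / Is + 1.0) + I * Rs

# Synthetic forward I-V data generated from known parameters plus small noise.
I = np.logspace(-6, -2, 40)                      # ampere
true = dict(Is=1e-9, n=1.8, Rs=12.0)
rng = np.random.default_rng(1)
V = diode_voltage(I, **true) + rng.normal(0.0, 1e-3, I.size)

popt, _ = curve_fit(diode_voltage, I, V, p0=(1e-8, 2.0, 5.0))
print("Is = %.2e A, n = %.2f, Rs = %.1f ohm" % tuple(popt))
```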
NASA Astrophysics Data System (ADS)
Lamorski, Krzysztof; Šimūnek, Jiří; Sławiński, Cezary; Lamorska, Joanna
2017-02-01
In this paper, we used a machine learning methodology to estimate the main wetting branch of the soil water retention curve (SWRC) based on knowledge of the main drying branch and other, optional, basic soil characteristics (particle size distribution, bulk density, organic matter content, or soil specific surface). The support vector machine algorithm was used for the models' development. The data needed by this algorithm for model training and validation consisted of 104 different undisturbed soil core samples collected from the topsoil layer (A horizon) of different soil profiles in Poland. The main wetting and drying branches of the SWRC, as well as other basic soil physical characteristics, were determined for all soil samples. Models relying on different sets of input parameters were developed and validated. The analysis showed that taking into account input parameters other than information about the drying branch of the SWRC (i.e., particle size distribution, bulk density, organic matter content, or soil specific surface) has essentially no impact on the models' estimations. The developed models are validated and compared with well-known models that can be used for the same purpose, such as the Mualem (1977) (M77) and Kool and Parker (1987) (KP87) models. The developed models estimate the main wetting SWRC branch with estimation errors (RMSE = 0.018 m3/m3) that are significantly lower than those for the M77 (RMSE = 0.025 m3/m3) or KP87 (RMSE = 0.047 m3/m3) models.
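A minimal sketch of the support-vector-regression setup described above, mapping drying-branch information to a wetting-branch water content and reporting an RMSE; the scikit-learn SVR settings and the synthetic data stand in for the authors' models and the 104 soil samples.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)
# Toy stand-in: each sample is the drying-branch water content at several suctions,
# and the target is the wetting-branch water content at one suction.
X = rng.uniform(0.05, 0.45, size=(104, 6))
y = 0.85 * X[:, 2] + 0.02 + rng.normal(0.0, 0.01, 104)   # synthetic hysteresis

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = SVR(kernel="rbf", C=10.0, epsilon=0.005).fit(X_train, y_train)

rmse = np.sqrt(mean_squared_error(y_test, model.predict(X_test)))
print(f"validation RMSE = {rmse:.3f} m3/m3")
```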
Meinck, Franziska; Cosma, Alina Paula; Mikton, Christopher; Baban, Adriana
2017-10-01
Child abuse is a major public health problem. In order to establish the prevalence of abuse exposure among children, measures need to be age-appropriate, sensitive, reliable and valid. This study aimed to investigate the psychometric properties of the Adverse Childhood Experiences Questionnaire Abuse Short Form (ACE-ASF). The ACE-ASF is an 8-item, retrospective self-report questionnaire measuring lifetime physical, emotional and sexual abuse. Data from a nationally representative sample of 15-year-old, school-going adolescents (n=1733, 55.5% female) from the Romanian Health Behavior in School-Based Children Study 2014 (HBSC) were analyzed. The factorial structure of the ACE-ASF was tested with Exploratory Factor Analysis (EFA) and confirmed using Confirmatory Factor Analysis (CFA). Measurement invariance was examined across sex, and internal reliability and concurrent criterion validity were established. Violence exposure was high: 39.7% physical, 32.2% emotional and 13.1% sexual abuse. EFA established a two-factor structure: physical/emotional abuse and sexual abuse. CFA confirmed this model fitted the data well [χ2(df)=60.526(19); RMSEA=0.036; CFI/TLI=0.990/0.986]. Metric invariance was supported across sexes. Internal consistency was good (0.83) for the sexual abuse scale and poor (0.57) for the physical/emotional abuse scale. Concurrent criterion validity confirmed hypothesized relationships between childhood abuse and health-related quality of life, life satisfaction, self-perceived health, bullying victimization and perpetration, externalizing and internalizing behaviors, and multiple health complaints. Results support the ACE-ASF as a valid measure of physical, emotional and sexual abuse in school-aged adolescents. However, the ACE-ASF combines spanking with other types of physical abuse, whereas spanking should instead be assessed separately. Future research is needed to replicate findings in different youth populations and across age groups. Copyright © 2017 The Author(s). Published by Elsevier Ltd. All rights reserved.
A Physical Model to Determine Snowfall over Land by Microwave Radiometry
NASA Technical Reports Server (NTRS)
Skofronick-Jackson, G.; Kim, M.-J.; Weinman, J. A.; Chang, D.-E.
2003-01-01
Because microwave brightness temperatures emitted by snow covered surfaces are highly variable, snowfall above such surfaces is difficult to observe using window channels that occur at low frequencies (v less than 100 GHz). Furthermore, at frequencies v less than or equal to 37 GHz, sensitivity to liquid hydrometeors is dominant. These problems are mitigated at high frequencies (v greater than 100 GHz) where water vapor screens the surface emission and sensitivity to frozen hydrometeors is significant. However the scattering effect of snowfall in the atmosphere at those higher frequencies is also impacted by water vapor in the upper atmosphere. This work describes the methodology and results of physically-based retrievals of snow falling over land surfaces. The theory of scattering by randomly oriented dry snow particles at high microwave frequencies appears to be better described by regarding snow as a concatenation of equivalent ice spheres rather than as a sphere with the effective dielectric constant of an air-ice mixture. An equivalent sphere snow scattering model was validated against high frequency attenuation measurements. Satellite-based high frequency observations from an Advanced Microwave Sounding Unit (AMSU-B) instrument during the March 5-6, 2001 New England blizzard were used to retrieve snowfall over land. Vertical distributions of snow, temperature and relative humidity profiles were derived from the Pennsylvania State University-National Center for Atmospheric Research (PSU-NCAR) fifth-generation Mesoscale Model (MM5). Those data were applied and modified in a radiative transfer model that derived brightness temperatures consistent with the AMSU-B observations. The retrieved snowfall distribution was validated with radar reflectivity measurements obtained from the National Oceanic and Atmospheric Administration (NOAA) National Weather Service (NWS) ground-based radar network.
A Risk-Based Approach for Aerothermal/TPS Analysis and Testing
NASA Technical Reports Server (NTRS)
Wright, Michael J.; Grinstead, Jay H.; Bose, Deepak
2007-01-01
The current status of aerothermal and thermal protection system modeling for civilian entry missions is reviewed. For most such missions, the accuracy of our simulations is limited not by the tools and processes currently employed, but rather by reducible deficiencies in the underlying physical models. Improving the accuracy of and reducing the uncertainties in these models will enable a greater understanding of the system level impacts of a particular thermal protection system and of the system operation and risk over the operational life of the system. A strategic plan will be laid out by which key modeling deficiencies can be identified via mission-specific gap analysis. Once these gaps have been identified, the driving component uncertainties are determined via sensitivity analyses. A Monte-Carlo based methodology is presented for physics-based probabilistic uncertainty analysis of aerothermodynamics and thermal protection system material response modeling. These data are then used to advocate for and plan focused testing aimed at reducing key uncertainties. The results of these tests are used to validate or modify existing physical models. Concurrently, a testing methodology is outlined for thermal protection materials. The proposed approach is based on using the results of uncertainty/sensitivity analyses discussed above to tailor ground testing so as to best identify and quantify system performance and risk drivers. A key component of this testing is understanding the relationship between the test and flight environments. No existing ground test facility can simultaneously replicate all aspects of the flight environment, and therefore good models for traceability to flight are critical to ensure a low risk, high reliability thermal protection system design. Finally, the role of flight testing in the overall thermal protection system development strategy is discussed.
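A schematic of the Monte Carlo uncertainty-propagation idea described above: uncertain model inputs are sampled from assumed distributions, a response model is evaluated for each sample, and the output spread and a crude sensitivity ranking are reported. The heating response below is a smooth placeholder function, not an aerothermal or material-response model, and the distributions are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(42)
N = 20000

# Sample uncertain inputs from assumed distributions (placeholders).
catalycity = rng.normal(0.7, 0.1, N)                   # surface catalytic efficiency
conductivity = rng.lognormal(np.log(0.4), 0.15, N)     # TPS conductivity, W/m-K
heat_flux = rng.normal(100.0, 10.0, N)                 # cold-wall heat flux, W/cm^2

def peak_temperature(q, gamma, k):
    """Placeholder response: a smooth monotone function of the uncertain inputs."""
    return 300.0 + 25.0 * q * (0.5 + gamma) / (1.0 + 5.0 * k)

T = peak_temperature(heat_flux, catalycity, conductivity)
print(f"mean = {T.mean():.0f} K, 95th percentile = {np.percentile(T, 95):.0f} K")

# Crude sensitivity ranking via correlation of each input with the output.
for name, x in [("catalycity", catalycity), ("conductivity", conductivity),
                ("heat_flux", heat_flux)]:
    print(f"{name:12s} corr = {np.corrcoef(x, T)[0, 1]:+.2f}")
```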
Maarsingh, O R; Heymans, M W; Verhaak, P F; Penninx, B W J H; Comijs, H C
2018-08-01
Given the poor prognosis of late-life depression, it is crucial to identify those at risk. Our objective was to construct and validate a prediction rule for an unfavourable course of late-life depression. For development and internal validation of the model, we used The Netherlands Study of Depression in Older Persons (NESDO) data. We included participants with a major depressive disorder (MDD) at baseline (n = 270; 60-90 years), assessed with the Composite International Diagnostic Interview (CIDI). For external validation of the model, we used The Netherlands Study of Depression and Anxiety (NESDA) data (n = 197; 50-66 years). The outcome was MDD after 2 years of follow-up, assessed with the CIDI. Candidate predictors concerned sociodemographics, psychopathology, physical symptoms, medication, psychological determinants, and healthcare setting. Model performance was assessed by calculating calibration and discrimination. 111 subjects (41.1%) had MDD after 2 years of follow-up. Independent predictors of MDD after 2 years were (older) age, (early) onset of depression, severity of depression, anxiety symptoms, comorbid anxiety disorder, fatigue, and loneliness. The final model showed good calibration and reasonable discrimination (AUC of 0.75; 0.70 after external validation). The strongest individual predictor was severity of depression (AUC of 0.69; 0.68 after external validation). The model was developed and validated in The Netherlands, which could affect the cross-country generalizability. Based on rather simple clinical indicators, it is possible to predict the 2-year course of MDD. The prediction rule can be used for monitoring MDD patients and identifying those at risk of an unfavourable outcome. Copyright © 2018 Elsevier B.V. All rights reserved.
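A schematic of how a prediction rule of this kind is fitted and checked for discrimination (AUC); logistic regression on synthetic predictors loosely named after those in the abstract stands in for the NESDO/NESDA data and the published model.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(7)
n = 270
# Synthetic standardized predictors (illustrative only).
severity = rng.normal(0.0, 1.0, n)
anxiety = rng.normal(0.0, 1.0, n)
loneliness = rng.normal(0.0, 1.0, n)
X = np.column_stack([severity, anxiety, loneliness])

# Synthetic outcome: probability of persistent MDD rises with the predictors.
logit = -0.5 + 0.9 * severity + 0.5 * anxiety + 0.4 * loneliness
y = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit)))

model = LogisticRegression().fit(X, y)
auc = roc_auc_score(y, model.predict_proba(X)[:, 1])
print(f"apparent AUC = {auc:.2f}")   # external validation would use a separate cohort
```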
Bourlieu, C; Guillard, V; Vallès-Pamiès, B; Guilbert, S; Gontard, N
2009-05-01
Control of moisture transfer inside composite food products or between food and its environment remains today a major challenge in food preservation. A wide rage of film-forming compounds is now available and facilitates tailoring moisture barriers with optimized functional properties. Despite these huge potentials, a realistic assessment of the film or coating efficacy is still critical. Due to nonlinear water sorption isotherms, water-dependent diffusivities, and variations of physical state, modelling transport phenomena through edible barriers is complex. Water vapor permeability can hardly be considered as an inherent property of films and only gives a relative indication of the barrier efficacy. The formal or mechanistic models reported in literature that describe the influence of testing conditions on the barrier properties of edible films are reviewed and discussed. Most of these models have been validated on a narrow range of conditions. Conversely, few original predictive models based on Fick's Second Law have been developed to assess shelf-life extension of food products including barriers. These models, assuming complex and realistic hypothesis, have been validated in various model foods. The development of nondestructive methods of moisture content measurement should speed up model validation and allow a better comprehension of moisture transfer through edible films.
NASA Astrophysics Data System (ADS)
Daneshjou, Kamran; Alibakhshi, Reza
2018-01-01
In the current manuscript, the process of spacecraft docking, as one of the main risky operations in an on-orbit servicing mission, is modeled based on unconstrained multibody dynamics. A spring-damper buffering device is utilized in the docking probe-cone system for micro-satellites. Because impact inevitably occurs during the docking process and markedly affects the motion characteristics of multibody systems, a continuous contact force model needs to be considered. The spring-damper buffering device, which keeps the spacecraft stable in orbit when impact occurs, connects a base (cylinder) inserted in the chaser satellite to the end of the docking probe. Furthermore, by considering a revolute joint equipped with a torsional shock absorber between the base and the chaser satellite, the docking probe can experience both translational and rotational motions simultaneously. Although the spacecraft docking process with buffering mechanisms may be modeled by constrained multibody dynamics, this paper presents a simple and efficient formulation that eliminates the surplus generalized coordinates and solves the impact docking problem based on unconstrained Lagrangian mechanics. In an example problem, the model is first verified by comparing the computed results with those recently reported in the literature. Second, the accuracy of the presented model is also evaluated with a new alternative validation approach based on the constrained multibody formulation; this verification approach can be applied to solve constrained multibody problems indirectly with minimum effort. The time history of the impact force, the influence of system flexibility, and the interaction between the shock absorber and the impact-induced penetration depth are the issues examined in this paper. Third, a MATLAB/SIMULINK multibody dynamic analysis model of the impact docking process is built to validate the computed results and to investigate the trajectories of both satellites during a successful capture.
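A small numerical sketch of a continuous spring-damper (Kelvin-Voigt-type) contact force of the general kind used in such docking models, applied to a point mass striking a buffer; the mass, stiffness and damping values are illustrative and are not the paper's parameters or formulation.

```python
def contact_force(delta, delta_dot, k, c):
    """Continuous spring-damper contact law; force acts only during penetration and
    is clipped at zero so the buffer cannot pull the probe back (no tension)."""
    if delta <= 0.0:
        return 0.0
    return max(0.0, k * delta + c * delta_dot)

# Point mass approaching a buffer located at x = 0 (illustrative values).
m, k, c = 50.0, 2.0e5, 400.0          # kg, N/m, N*s/m
x, v, dt = -0.05, 0.3, 1e-5           # m, m/s, s
max_pen, peak_force = 0.0, 0.0
for _ in range(200000):               # 2 s of simulated time
    f = contact_force(x, v, k, c)     # penetration depth equals x when x > 0
    v += (-f / m) * dt                # semi-implicit Euler; buffer pushes back
    x += v * dt
    max_pen = max(max_pen, x)
    peak_force = max(peak_force, f)

print(f"max penetration = {max_pen * 1e3:.2f} mm, peak contact force = {peak_force:.0f} N")
```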
Fonseca, T C Ferreira; Bogaerts, R; Lebacq, A L; Mihailescu, C L; Vanhavere, F
2014-04-01
A realistic computational 3D human body library, called MaMP and FeMP (Male and Female Mesh Phantoms), based on polygonal mesh surface geometry, has been created to be used for numerical calibration of the whole body counter (WBC) system of the nuclear power plant (NPP) in Doel, Belgium. The main objective was to create flexible computational models varying in gender, body height, and mass for studying the morphology-induced variation of the detector counting efficiency (CE) and reducing the measurement uncertainties. First, the counting room and an HPGe detector were modeled using MCNPX (Monte Carlo radiation transport code). The validation of the model was carried out for different sample-detector geometries with point sources and a physical phantom. Second, CE values were calculated for a total of 36 different mesh phantoms in a seated position using the validated Monte Carlo model. This paper reports on the validation process of the in vivo whole body system and the CE calculated for different body heights and weights. The results reveal that the CE is strongly dependent on the individual body shape, size, and gender and may vary by a factor of 1.5 to 3 depending on the morphology aspects of the individual to be measured.
NASA Astrophysics Data System (ADS)
Jeong, Jina; Park, Eungyu; Han, Weon Shik; Kim, Kue-Young; Jun, Seong-Chun; Choung, Sungwook; Yun, Seong-Taek; Oh, Junho; Kim, Hyun-Jun
2017-11-01
In this study, a data-driven method for predicting CO2 leaks and associated concentrations from geological CO2 sequestration is developed. Several candidate models are compared based on their reproducibility and predictive capability for CO2 concentration measurements from the Environment Impact Evaluation Test (EIT) site in Korea. Based on the data mining results, a one-dimensional solution of the advective-dispersive equation for steady flow (i.e., Ogata-Banks solution) is found to be most representative for the test data, and this model is adopted as the data model for the developed method. In the validation step, the method is applied to estimate future CO2 concentrations with the reference estimation by the Ogata-Banks solution, where a part of earlier data is used as the training dataset. From the analysis, it is found that the ensemble mean of multiple estimations based on the developed method shows high prediction accuracy relative to the reference estimation. In addition, the majority of the data to be predicted are included in the proposed quantile interval, which suggests adequate representation of the uncertainty by the developed method. Therefore, the incorporation of a reasonable physically-based data model enhances the prediction capability of the data-driven model. The proposed method is not confined to estimations of CO2 concentration and may be applied to various real-time monitoring data from subsurface sites to develop automated control, management or decision-making systems.
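For reference, the Ogata-Banks solution used above as the physically-based data model is the one-dimensional advection-dispersion solution for a continuous inlet concentration under steady flow; a sketch with illustrative parameter values:

```python
import numpy as np
from scipy.special import erfc

def ogata_banks(x, t, v, D, c0=1.0):
    """Ogata-Banks solution of 1-D advection-dispersion for a continuous inlet
    concentration c0, steady velocity v and dispersion coefficient D."""
    a = (x - v * t) / (2.0 * np.sqrt(D * t))
    b = (x + v * t) / (2.0 * np.sqrt(D * t))
    return 0.5 * c0 * (erfc(a) + np.exp(v * x / D) * erfc(b))

# Relative concentration 5 m downstream over time (illustrative parameters).
x, v, D = 5.0, 0.1, 0.05        # m, m/day, m^2/day
for t in (10.0, 30.0, 60.0, 120.0):
    print(f"t = {t:5.0f} d   C/C0 = {ogata_banks(x, t, v, D):.3f}")
```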
TU-D-201-05: Validation of Treatment Planning Dose Calculations: Experience Working with MPPG 5.a
DOE Office of Scientific and Technical Information (OSTI.GOV)
Xue, J; Park, J; Kim, L
2016-06-15
Purpose: The newly published medical physics practice guideline (MPPG 5.a) has set the minimum requirements for commissioning and QA of treatment planning dose calculations. We present our experience in the validation of a commercial treatment planning system based on MPPG 5.a. Methods: In addition to tests traditionally performed to commission a model-based dose calculation algorithm, extensive tests were carried out at short and extended SSDs, various depths, oblique gantry angles and off-axis conditions to verify the robustness and limitations of a dose calculation algorithm. A comparison between measured and calculated dose was performed based on validation tests and evaluation criteria recommended by MPPG 5.a. An ion chamber was used for the measurement of dose at points of interest, and diodes were used for photon IMRT/VMAT validations. Dose profiles were measured with a three-dimensional scanning system and calculated in the TPS using a virtual water phantom. Results: Calculated and measured absolute dose profiles were compared at each specified SSD and depth for open fields. The disagreement is easily identifiable with the difference curve. Subtle discrepancies revealed the limitations of the measurement, e.g., a spike in the high-dose region and an asymmetrical penumbra observed in the tests with an oblique MLC beam. The excellent results (>98% pass rate on the 3%/3mm gamma index) on the end-to-end tests for both IMRT and VMAT are attributed to the quality of the beam data and a good understanding of the modeling. The limitations of the model and the uncertainty of measurement were considered when comparing the results. Conclusion: The extensive tests recommended by the MPPG encourage us to understand the accuracy and limitations of a dose algorithm as well as the uncertainty of measurement. Our experience has shown how the suggested tests can be performed effectively to validate dose calculation models.
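A simplified one-dimensional sketch of the gamma-index comparison quoted above (3%/3 mm-style criteria), using a brute-force search over the evaluated profile; clinical gamma tools work in 2-D/3-D with interpolation and dose thresholds, and the profiles below are toy data.

```python
import numpy as np

def gamma_1d(x_ref, d_ref, x_eval, d_eval, dd=0.03, dta=3.0):
    """Global gamma index for 1-D dose profiles.
    dd: dose-difference criterion (fraction of the reference maximum),
    dta: distance-to-agreement criterion in mm."""
    d_norm = dd * d_ref.max()
    gammas = np.empty_like(d_ref)
    for i, (xr, dr) in enumerate(zip(x_ref, d_ref)):
        dist2 = ((x_eval - xr) / dta) ** 2
        dose2 = ((d_eval - dr) / d_norm) ** 2
        gammas[i] = np.sqrt(np.min(dist2 + dose2))
    return gammas

# Illustrative measured (reference) and calculated (evaluated) profiles.
x = np.linspace(-50.0, 50.0, 201)                      # mm
measured = np.exp(-(x / 30.0) ** 4)                    # toy flat-ish field
calculated = 1.01 * np.exp(-((x - 0.5) / 30.0) ** 4)   # small shift and scaling

g = gamma_1d(x, measured, x, calculated)
print(f"gamma pass rate = {100.0 * np.mean(g <= 1.0):.1f}%")
```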
NASA Technical Reports Server (NTRS)
Moussavi, Mahsa S.; Abdalati, Waleed; Pope, Allen; Scambos, Ted; Tedesco, Marco; MacFerrin, Michael; Grigsby, Shane
2016-01-01
Supraglacial meltwater lakes on the western Greenland Ice Sheet (GrIS) are critical components of its surface hydrology and surface mass balance, and they also affect its ice dynamics. Estimates of lake volume, however, are limited by the availability of in situ measurements of water depth, which in turn also limits the assessment of remotely sensed lake depths. Given the logistical difficulty of collecting physical bathymetric measurements, methods relying upon in situ data are generally restricted to small areas and thus their application to large-scale studies is difficult to validate. Here, we produce and validate spaceborne estimates of supraglacial lake volumes across a relatively large area (1250 km^2) of west Greenland's ablation region using data acquired by the WorldView-2 (WV-2) sensor, making use of both its stereo-imaging capability and its meter-scale resolution. We employ spectrally-derived depth retrieval models, which are either based on absolute reflectance (single-channel model) or a ratio of spectral reflectances in two bands (dual-channel model). These models are calibrated using WV-2 multispectral imagery acquired early in the melt season and depth measurements from a high-resolution WV-2 DEM over the same lake basins when devoid of water. The calibrated models are then validated with different lakes in the area, for which we determined depths. Lake depth estimates based on measurements recorded in WV-2's blue (450-510 nm), green (510-580 nm), and red (630-690 nm) bands and dual-channel modes (blue/green, blue/red, and green/red band combinations) had near-zero bias, an average root-mean-squared deviation of 0.4 m (relative to post-drainage DEMs), and an average volumetric error of <1%. The approach outlined in this study - image-based calibration of depth-retrieval models - significantly improves spaceborne supraglacial bathymetry retrievals, which are completely independent from in situ measurements.
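A sketch of a dual-channel (band-ratio) depth model of the general kind described above, with depth regressed against the log ratio of two reflectance bands; this functional form is a common choice in the bathymetry literature and is fitted here to synthetic calibration data, not the paper's exact formulation or imagery.

```python
import numpy as np

# Synthetic calibration data: lake depths (m) from a pre-melt DEM and the
# corresponding blue/green surface reflectances once the lakes fill (toy values).
rng = np.random.default_rng(3)
depth = rng.uniform(0.5, 8.0, 60)
blue = 0.30 * np.exp(-0.12 * depth) + rng.normal(0, 0.003, 60)
green = 0.28 * np.exp(-0.25 * depth) + rng.normal(0, 0.003, 60)

# Dual-channel (ratio) model: depth ~ a * ln(R_blue / R_green) + b.
ratio = np.log(blue / green)
a, b = np.polyfit(ratio, depth, 1)

predicted = a * ratio + b
rmse = np.sqrt(np.mean((predicted - depth) ** 2))
print(f"a = {a:.2f}, b = {b:.2f}, calibration RMSE = {rmse:.2f} m")
```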
Didarloo, Alireza; Ardebili, Hassan Eftekhar; Niknami, Shamsaddin; Hajizadeh, Ebrahim; Alizadeh, Mohammad
2011-01-01
Background Findings of most studies indicate that the only way to control diabetes and prevent its debilitating effects is through the continuous performance of self-care behaviors. Physical activity is a non-pharmacological method of diabetes treatment and because of its positive effects on diabetic patients, it is being increasingly considered by researchers and practitioners. This study aimed at determining factors influencing physical activity among diabetic women in Iran, using the extended theory of reasoned action. Methods A sample of 352 women with type 2 diabetes, referring to a Diabetes Clinic in Khoy, Iran, participated in the study. Appropriate instruments were designed to measure the desired variables (knowledge of diabetes, personal beliefs, subjective norms, perceived self-efficacy, behavioral intention and physical activity behavior). The reliability and validity of the instruments were examined and approved. Statistical analyses of the study were conducted by inferential statistical techniques (independent t-test, correlations and regressions) using the SPSS package. Results The findings of this investigation indicated that among the constructs of the model, self-efficacy was the strongest predictor of intentions among women with type 2 diabetes and both directly and indirectly affected physical activity. In addition to self-efficacy, diabetic patients' physical activity also was influenced by other variables of the model and sociodemographic factors. Conclusion Our findings suggest that the high ability of the theory of reasoned action extended by self-efficacy in forecasting and explaining physical activity can be a base for educational intervention. Educational interventions based on the proposed model are necessary for improving diabetics' physical activity behavior and controlling disease. PMID:22111043
NASA Astrophysics Data System (ADS)
McNally, A.; Yatheendradas, S.; Jayanthi, H.; Funk, C. C.; Peters-Lidard, C. D.
2011-12-01
The declaration of famine in Somalia on July 21, 2011 highlights the need for regional hydroclimate analysis at a scale that is relevant for agropastoral drought monitoring. A particularly critical and robust component of such a drought monitoring system is a land surface model (LSM). We are currently enhancing the Famine Early Warning Systems Network (FEWS NET) monitoring activities by configuring a custom instance of NASA's Land Information System (LIS) called the FEWS NET Land Data Assimilation System (FLDAS). Using the LIS Noah LSM, in-situ measurements, and remotely sensed data, we focus on the following question: How can Noah be best parameterized to accurately simulate hydroclimate variables associated with crop performance? Parameter value testing and validation is done by comparing modeled soil moisture against fortuitously available in-situ soil moisture observations in West Africa. Direct testing and application of the FLDAS over African agropastoral locations is subject to some issues: [1] in many regions that are vulnerable to food insecurity, ground-based measurements of precipitation, evapotranspiration and soil moisture are sparse or non-existent; [2] standard landcover classes (e.g., the University of Maryland 5 km dataset) do not include representations of specific agricultural crops with relevant parameter values and phenologies representing their growth stages from the planting date; and [3] physically based land surface models and remote sensing rain data might still need to be calibrated or bias-corrected for the regions of interest. This research aims to address these issues by focusing on sites in the West African countries of Mali, Niger, and Benin where in-situ rainfall and soil moisture measurements are available from the African Monsoon Multidisciplinary Analysis (AMMA). Preliminary results from model experiments over Southern Malawi, validated with Normalized Difference Vegetation Index (NDVI) and maize yield data, show that the ability to detect a drought signal in modeled soil moisture and actual evapotranspiration was sensitive to parameters like minimum stomatal resistance, green vegetation fraction, and the minimum threshold for transpiration stress. In addition to improving our understanding and representation of the land surface physics in agropastoral drought, this study moves us closer to confidently validating LSM estimates with remotely sensed data (e.g. MODIS NDVI), essential in regions that lack ground-based measurements. Ultimately, these improved information products serve to better inform decision makers about seasonal food production and anticipate the need for relief, as well as guide climate change adaptation strategies, potentially saving millions of lives.
Development and validation of real-time simulation of X-ray imaging with respiratory motion.
Vidal, Franck P; Villard, Pierre-Frédéric
2016-04-01
We present a framework that combines evolutionary optimisation, soft tissue modelling and ray tracing on GPU to simultaneously compute the respiratory motion and X-ray imaging in real-time. Our aim is to provide validated building blocks with high fidelity to closely match both the human physiology and the physics of X-rays. A CPU-based set of algorithms is presented to model organ behaviours during respiration. Soft tissue deformation is computed with an extension of the Chain Mail method. Rigid elements move according to kinematic laws. A GPU-based surface rendering method is proposed to compute the X-ray image using the Beer-Lambert law. It is provided as an open-source library. A quantitative validation study is provided to objectively assess the accuracy of both components: (i) the respiration against anatomical data, and (ii) the X-ray against the Beer-Lambert law and the results of Monte Carlo simulations. Our implementation can be used in various applications, such as an interactive medical virtual environment for training percutaneous transhepatic cholangiography in interventional radiology, 2D/3D registration, computation of digitally reconstructed radiographs, and simulation of 4D sinograms to test tomography reconstruction tools. Copyright © 2015 Elsevier Ltd. All rights reserved.
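A minimal sketch of the Beer-Lambert attenuation step used in such X-ray simulators: the transmitted intensity along one ray is the incident intensity attenuated by the sum of mu times thickness over the materials traversed. The attenuation coefficients and path lengths below are placeholders, not values from the paper.

```python
import numpy as np

def transmitted_intensity(i0, segments):
    """Beer-Lambert law along one ray: I = I0 * exp(-sum(mu_i * d_i)).
    segments: list of (linear attenuation coefficient in 1/cm, path length in cm)."""
    optical_depth = sum(mu * d for mu, d in segments)
    return i0 * np.exp(-optical_depth)

# One ray through soft tissue, bone and soft tissue (placeholder coefficients).
ray = [(0.20, 8.0),   # soft tissue
       (0.50, 2.0),   # bone
       (0.20, 6.0)]   # soft tissue
print(f"transmitted fraction = {transmitted_intensity(1.0, ray):.4f}")
```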
Chen, Xueli; Gao, Xinbo; Qu, Xiaochao; Chen, Duofang; Ma, Xiaopeng; Liang, Jimin; Tian, Jie
2010-10-10
The camera lens diaphragm is an important component in a noncontact optical imaging system and has a crucial influence on the images registered on the CCD camera. However, this influence has not been taken into account in the existing free-space photon transport models. To model the photon transport process more accurately, a generalized free-space photon transport model is proposed. It combines Lambertian source theory with an analysis of the influence of the camera lens diaphragm to simulate the photon transport process in free space. In addition, the radiance theorem is also adopted to establish the energy relationship between the virtual detector and the CCD camera. The accuracy and feasibility of the proposed model is validated with a Monte-Carlo-based free-space photon transport model and a physical phantom experiment. A comparison study with our previous hybrid radiosity-radiance theorem based model demonstrates the improved performance and potential of the proposed model for simulating the photon transport process in free space.
Economic analysis of model validation for a challenge problem
Paez, Paul J.; Paez, Thomas L.; Hasselman, Timothy K.
2016-02-19
It is now commonplace for engineers to build mathematical models of the systems they are designing, building, or testing. And, it is nearly universally accepted that phenomenological models of physical systems must be validated prior to use for prediction in consequential scenarios. Yet, there are certain situations in which testing only or no testing and no modeling may be economically viable alternatives to modeling and its associated testing. This paper develops an economic framework within which benefit–cost can be evaluated for modeling and model validation relative to other options. The development is presented in terms of a challenge problem. As a result, we provide a numerical example that quantifies when modeling, calibration, and validation yield higher benefit–cost than a testing only or no modeling and no testing option.
Gimeno-Santos, Elena; Raste, Yogini; Demeyer, Heleen; Louvaris, Zafeiris; de Jong, Corina; Rabinovich, Roberto A.; Hopkinson, Nicholas S.; Polkey, Michael I.; Vogiatzis, Ioannis; Tabberer, Maggie; Dobbels, Fabienne; Ivanoff, Nathalie; de Boer, Willem I.; van der Molen, Thys; Kulich, Karoly; Serra, Ignasi; Basagaña, Xavier; Troosters, Thierry; Puhan, Milo A.; Karlsson, Niklas
2015-01-01
No current patient-centred instrument captures all dimensions of physical activity in chronic obstructive pulmonary disease (COPD). Our objective was item reduction and initial validation of two instruments to measure physical activity in COPD. Physical activity was assessed in a 6-week, randomised, two-way cross-over, multicentre study using PROactive draft questionnaires (daily and clinical visit versions) and two activity monitors. Item reduction followed an iterative process including classical and Rasch model analyses, and input from patients and clinical experts. 236 COPD patients from five European centres were included. Results indicated the concept of physical activity in COPD had two domains, labelled “amount” and “difficulty”. After item reduction, the daily PROactive instrument comprised nine items and the clinical visit contained 14. Both demonstrated good model fit (person separation index >0.7). Confirmatory factor analysis supported the bidimensional structure. Both instruments had good internal consistency (Cronbach's α>0.8), test–retest reliability (intraclass correlation coefficient ≥0.9) and exhibited moderate-to-high correlations (r>0.6) with related constructs and very low correlations (r<0.3) with unrelated constructs, providing evidence for construct validity. Daily and clinical visit “PROactive physical activity in COPD” instruments are hybrid tools combining a short patient-reported outcome questionnaire and two activity monitor variables which provide simple, valid and reliable measures of physical activity in COPD patients. PMID:26022965
Gimeno-Santos, Elena; Raste, Yogini; Demeyer, Heleen; Louvaris, Zafeiris; de Jong, Corina; Rabinovich, Roberto A; Hopkinson, Nicholas S; Polkey, Michael I; Vogiatzis, Ioannis; Tabberer, Maggie; Dobbels, Fabienne; Ivanoff, Nathalie; de Boer, Willem I; van der Molen, Thys; Kulich, Karoly; Serra, Ignasi; Basagaña, Xavier; Troosters, Thierry; Puhan, Milo A; Karlsson, Niklas; Garcia-Aymerich, Judith
2015-10-01
No current patient-centred instrument captures all dimensions of physical activity in chronic obstructive pulmonary disease (COPD). Our objective was item reduction and initial validation of two instruments to measure physical activity in COPD. Physical activity was assessed in a 6-week, randomised, two-way cross-over, multicentre study using PROactive draft questionnaires (daily and clinical visit versions) and two activity monitors. Item reduction followed an iterative process including classical and Rasch model analyses, and input from patients and clinical experts. 236 COPD patients from five European centres were included. Results indicated the concept of physical activity in COPD had two domains, labelled "amount" and "difficulty". After item reduction, the daily PROactive instrument comprised nine items and the clinical visit contained 14. Both demonstrated good model fit (person separation index >0.7). Confirmatory factor analysis supported the bidimensional structure. Both instruments had good internal consistency (Cronbach's α>0.8), test-retest reliability (intraclass correlation coefficient ≥0.9) and exhibited moderate-to-high correlations (r>0.6) with related constructs and very low correlations (r<0.3) with unrelated constructs, providing evidence for construct validity. Daily and clinical visit "PROactive physical activity in COPD" instruments are hybrid tools combining a short patient-reported outcome questionnaire and two activity monitor variables which provide simple, valid and reliable measures of physical activity in COPD patients. Copyright ©ERS 2015.
A calibration protocol for population-specific accelerometer cut-points in children.
Mackintosh, Kelly A; Fairclough, Stuart J; Stratton, Gareth; Ridgers, Nicola D
2012-01-01
To test a field-based protocol using intermittent activities representative of children's physical activity behaviours, to generate behaviourally valid, population-specific accelerometer cut-points for sedentary behaviour, moderate, and vigorous physical activity. Twenty-eight children (46% boys) aged 10-11 years wore a hip-mounted uniaxial GT1M ActiGraph and engaged in 6 activities representative of children's play. A validated direct observation protocol was used as the criterion measure of physical activity. Receiver Operating Characteristics (ROC) curve analyses were conducted with four semi-structured activities to determine the accelerometer cut-points. To examine classification differences, cut-points were cross-validated with free-play and DVD viewing activities. Cut-points of ≤ 372, >2160 and >4806 counts • min(-1) representing sedentary, moderate and vigorous intensity thresholds, respectively, provided the optimal balance between the related needs for sensitivity (accurately detecting activity) and specificity (limiting misclassification of the activity). Cross-validation data demonstrated that these values yielded the best overall kappa scores (0.97; 0.71; 0.62), and a high classification agreement (98.6%; 89.0%; 87.2%), respectively. Specificity values of 96-97% showed that the developed cut-points accurately detected physical activity, and sensitivity values (89-99%) indicated that minutes of activity were seldom incorrectly classified as inactivity. The development of an inexpensive and replicable field-based protocol to generate behaviourally valid and population-specific accelerometer cut-points may improve the classification of physical activity levels in children, which could enhance subsequent intervention and observational studies.
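A small sketch of deriving an accelerometer cut-point from ROC analysis; here the threshold maximising Youden's J (sensitivity + specificity - 1) is chosen on synthetic counts labelled by a criterion measure, which parallels but does not reproduce the study's protocol, activities, or reported cut-points.

```python
import numpy as np
from sklearn.metrics import roc_curve

rng = np.random.default_rng(5)
# Synthetic epochs: counts per minute, labelled 1 if the criterion measure
# (direct observation) classified the epoch as at least moderate intensity.
counts_light = rng.gamma(shape=2.0, scale=150.0, size=400)     # lighter activity
counts_mvpa = rng.gamma(shape=6.0, scale=700.0, size=400)      # moderate-to-vigorous
counts = np.concatenate([counts_light, counts_mvpa])
labels = np.concatenate([np.zeros(400), np.ones(400)])

fpr, tpr, thresholds = roc_curve(labels, counts)
j = tpr - fpr                                     # Youden's J at each threshold
best = np.argmax(j)
print(f"cut-point = {thresholds[best]:.0f} counts/min, "
      f"sensitivity = {tpr[best]:.2f}, specificity = {1 - fpr[best]:.2f}")
```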
NASA Astrophysics Data System (ADS)
Kunz, Robert; Haworth, Daniel; Dogan, Gulkiz; Kriete, Andres
2006-11-01
Three-dimensional, unsteady simulations of multiphase flow, gas exchange, and particle/aerosol deposition in the human lung are reported. Surface data for human tracheo-bronchial trees are derived from CT scans, and are used to generate three-dimensional CFD meshes for the first several generations of branching. One-dimensional meshes for the remaining generations down to the respiratory units are generated using branching algorithms based on those that have been proposed in the literature, and a zero-dimensional respiratory unit (pulmonary acinus) model is attached at the end of each terminal bronchiole. The process is automated to facilitate rapid model generation. The model is exercised through multiple breathing cycles to compute the spatial and temporal variations in flow, gas exchange, and particle/aerosol deposition. The depth of the 3D/1D transition (at branching generation n) is a key parameter, and can be varied. High-fidelity models (large n) are run on massively parallel distributed-memory clusters, and are used to generate physical insight and to calibrate/validate the 1D and 0D models. Suitably validated lower-order models (small n) can be run on single-processor PCs with run times that allow model-based clinical intervention for individual patients.
NASA Technical Reports Server (NTRS)
Salas, Manuel D.
2007-01-01
The research program of the aerodynamics, aerothermodynamics and plasmadynamics discipline of NASA's Hypersonic Project is reviewed. Details are provided for each of its three components: 1) development of physics-based models of non-equilibrium chemistry, surface catalytic effects, turbulence, transition and radiation; 2) development of advanced simulation tools to enable increased spatial and time accuracy, increased geometrical complexity, grid adaptation, increased physical-processes complexity, uncertainty quantification and error control; and 3) establishment of experimental databases from ground and flight experiments to develop better understanding of high-speed flows and to provide data to validate and guide the development of simulation tools.
Does Hooke's law work in helical nanosprings?
Ben, Sudong; Zhao, Junhua; Rabczuk, Timon
2015-08-28
Hooke's law is a principle of physics that states that the force needed to extend a spring by some distance is proportional to that distance. The law is always valid for an initial portion of the elastic range for nearly all helical macrosprings. Here we report the sharp nonlinear force-displacement relation of tightly wound helical carbon nanotubes at even small displacement via a molecular mechanics model. We demonstrate that the van der Waals (vdW) interaction between the intertube walls dominates the nonlinear relation based on our analytical expressions. This study provides physical insights into the origin of huge nonlinearity of the helical nanosprings.
An atomistic-based chemophysical environment for evaluating asphalt oxidation and antioxidants.
Pan, Tongyan; Sun, Lu; Yu, Qifeng
2012-12-01
Asphalt binders in service conditions are subject to oxidative aging that involves the reactions between oxygen molecules and the component species of bulk asphalt. As a result, significant alterations can occur to the desired physical and/or mechanical properties of asphalt. A common practice to alleviate asphalt aging has been to employ different chemical additives or modifiers as antioxidants. The current state of knowledge in asphalt oxidation and antioxidant evaluation is centered on determining the degradation of asphalt physical properties, mainly the viscosity and ductility. Such practices, although meeting direct engineering needs, do not contribute to the fundamental understanding of the aging and anti-oxidation mechanisms, and thereby to developing anti-aging strategies. From this standpoint, this study was initiated to examine the chemical and physical bases of asphalt oxidation, as well as the anti-oxidation mechanisms of bio-based antioxidants, using coniferyl-alcohol lignin as an example. A quantum chemistry (QC) based chemophysical environment is developed, in which the various chemical reactions between asphalt component species and oxygen, as well as the incurred physical changes, are studied. X-ray photoelectron spectroscopy (XPS) was used to validate the modified and unmodified asphalt models.
NASA Technical Reports Server (NTRS)
Pholsiri, Chalongrath; English, James; Seberino, Charles; Lim, Yi-Je
2010-01-01
The Excavator Design Validation tool verifies excavator designs by automatically generating control systems and modeling their performance in an accurate simulation of their expected environment. Part of this software design includes interfacing with human operations that can be included in simulation-based studies and validation. This is essential for assessing productivity, versatility, and reliability. This software combines automatic control system generation from CAD (computer-aided design) models, rapid validation of complex mechanism designs, and detailed models of the environment including soil, dust, temperature, remote supervision, and communication latency to create a system of high value. Unique algorithms have been created for controlling and simulating complex robotic mechanisms automatically from just a CAD description. These algorithms are implemented as a commercial cross-platform C++ software toolkit that is configurable using the Extensible Markup Language (XML). The algorithms work with virtually any mobile robotic mechanisms using module descriptions that adhere to the XML standard. In addition, high-fidelity, real-time physics-based simulation algorithms have also been developed that include models of internal forces and the forces produced when a mechanism interacts with the outside world. This capability is combined with an innovative organization for simulation algorithms, new regolith simulation methods, and a unique control and study architecture to make powerful tools with the potential to transform the way NASA verifies and compares excavator designs. Energid's Actin software has been leveraged for this design validation. The architecture includes parametric and Monte Carlo studies tailored for validation of excavator designs and their control by remote human operators. It also includes the ability to interface with third-party software and human-input devices. Two types of simulation models have been adapted: high-fidelity discrete element models and fast analytical models. By using the first to establish parameters for the second, a system has been created that can be executed in real time, or faster than real time, on a desktop PC. This allows Monte Carlo simulations to be performed on a computer platform available to all researchers, and it allows human interaction to be included in a real-time simulation process. Metrics on excavator performance are established that work with the simulation architecture. Both static and dynamic metrics are included.
Audigier, Chloé; Mansi, Tommaso; Delingette, Hervé; Rapaka, Saikiran; Passerini, Tiziano; Mihalef, Viorel; Jolly, Marie-Pierre; Pop, Raoul; Diana, Michele; Soler, Luc; Kamen, Ali; Comaniciu, Dorin; Ayache, Nicholas
2017-09-01
We aim at developing a framework for the validation of a subject-specific multi-physics model of liver tumor radiofrequency ablation (RFA). The RFA computation becomes subject specific after several levels of personalization: geometrical and biophysical (hemodynamics, heat transfer and an extended cellular necrosis model). We present a comprehensive experimental setup combining multimodal, pre- and postoperative anatomical and functional images, as well as the interventional monitoring of intra-operative signals: the temperature and delivered power. To exploit this dataset, an efficient processing pipeline is introduced, which copes with image noise, variable resolution and anisotropy. The validation study includes twelve ablations from five healthy pig livers: a mean point-to-mesh error between predicted and actual ablation extent of 5.3 ± 3.6 mm is achieved. This enables an end-to-end preclinical validation framework that considers the available dataset.
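The reported point-to-mesh error can be approximated, for illustration, by the mean distance from points sampled on the predicted ablation surface to the nearest vertex of the segmented (actual) surface. The sketch below uses that nearest-vertex simplification (a true point-to-mesh metric projects onto triangles, so this version slightly overestimates); the point clouds are random stand-ins, not the preclinical data.
    # Hedged sketch of a simplified validation metric: mean distance from points
    # on a predicted ablation surface to the nearest vertex of the actual surface.
    import numpy as np
    from scipy.spatial import cKDTree

    def mean_surface_error(pred_points, actual_points):
        tree = cKDTree(actual_points)
        dists, _ = tree.query(pred_points)   # nearest-neighbour distances
        return dists.mean(), dists.std()

    # Toy example with random point clouds standing in for ablation surfaces.
    rng = np.random.default_rng(0)
    pred = rng.normal(size=(500, 3))
    actual = pred + rng.normal(scale=0.5, size=(500, 3))
    mean_err, std_err = mean_surface_error(pred, actual)
    print(f"point-to-surface error: {mean_err:.2f} +/- {std_err:.2f} (arbitrary units)")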
System equivalent model mixing
NASA Astrophysics Data System (ADS)
Klaassen, Steven W. B.; van der Seijs, Maarten V.; de Klerk, Dennis
2018-05-01
This paper introduces SEMM: a method based on Frequency Based Substructuring (FBS) techniques that enables the construction of hybrid dynamic models. With System Equivalent Model Mixing (SEMM), frequency-based models, either of numerical or experimental nature, can be mixed to form a hybrid model. This model follows the dynamic behaviour of a predefined weighted master model. A large variety of applications can be thought of, such as the DoF-space expansion of relatively small experimental models using numerical models, or the blending of different models in the frequency spectrum. SEMM is outlined, both mathematically and conceptually, based on a notation commonly used in FBS. A critical physical interpretation of the theory is provided next, along with a comparison to similar techniques, namely DoF-expansion techniques. SEMM's concept is further illustrated by means of a numerical example. It will become apparent that the basic method of SEMM has some shortcomings which warrant a few extensions to the method. One of the main applications is tested in a practical case, performed on a validated benchmark structure; it will emphasize the practicality of the method.
Validity and reliability of a video questionnaire to assess physical function in older adults.
Balachandran, Anoop; N Verduin, Chelsea; Potiaumpai, Melanie; Ni, Meng; Signorile, Joseph F
2016-08-01
Self-report questionnaires are widely used to assess physical function in older adults. However, they often lack a clear frame of reference and hence interpreting and rating task difficulty levels can be problematic for the responder. Consequently, the usefulness of traditional self-report questionnaires for assessing higher-level functioning is limited. Video-based questionnaires can overcome some of these limitations by offering a clear and objective visual reference for the performance level against which the subject is to compare his or her perceived capacity. Hence the purpose of the study was to develop and validate a novel, video-based questionnaire to assess physical function in older adults independently living in the community. A total of 61 community-living adults, 60 years or older, were recruited. To examine validity, 35 of the subjects completed the video questionnaire and two types of physical performance tests: a test of instrumental activity of daily living (IADL) included in the Short Physical Functional Performance battery (PFP-10), and a composite of 3 performance tests (30s chair stand, single-leg balance and usual gait speed). To ascertain reliability, two-week test-retest reliability was assessed in the remaining 26 subjects who did not participate in validity testing. The video questionnaire showed a moderate correlation with the IADLs (Spearman rho=0.64, p<0.001; 95% CI (0.4, 0.8)), and a lower correlation with the composite score of physical performance tests (Spearman rho=0.49, p<0.01; 95% CI (0.18, 0.7)). The test-retest assessment yielded an intra-class correlation (ICC) of 0.87 (p<0.001; 95% CI (0.70, 0.94)) and a Cronbach's alpha of 0.89, demonstrating good reliability and internal consistency. Our results show that the video questionnaire developed to evaluate physical function in community-living older adults is a valid and reliable assessment tool; however, further validation is needed for definitive conclusions. Copyright © 2016 Elsevier Inc. All rights reserved.
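Two of the reliability/validity statistics reported above, the Spearman correlation and Cronbach's alpha, are straightforward to compute. The sketch below does so on hypothetical questionnaire and performance scores; it is illustrative only and omits the intra-class correlation.
    # Minimal sketch of two of the statistics reported: Spearman correlation for
    # validity and Cronbach's alpha for internal consistency. Data are hypothetical.
    import numpy as np
    from scipy.stats import spearmanr

    def cronbach_alpha(items):
        """items: (n_subjects, n_items) array of item scores."""
        items = np.asarray(items, dtype=float)
        k = items.shape[1]
        item_var = items.var(axis=0, ddof=1).sum()
        total_var = items.sum(axis=1).var(ddof=1)
        return k / (k - 1) * (1 - item_var / total_var)

    rng = np.random.default_rng(1)
    latent = rng.normal(size=(35, 1))                          # subjects' true function level
    questionnaire = latent + 0.5 * rng.normal(size=(35, 10))   # 10 correlated item scores
    performance = latent[:, 0] + 0.3 * rng.normal(size=35)     # performance-test score

    rho, p = spearmanr(questionnaire.sum(axis=1), performance)
    print(f"Spearman rho = {rho:.2f} (p = {p:.3g})")
    print(f"Cronbach's alpha = {cronbach_alpha(questionnaire):.2f}")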
Robertson, Amy N.; Wendt, Fabian; Jonkman, Jason M.; ...
2017-10-01
This paper summarizes the findings from Phase II of the Offshore Code Comparison, Collaboration, Continued, with Correlation project. The project is run under the International Energy Agency Wind Research Task 30, and is focused on validating the tools used for modeling offshore wind systems through the comparison of simulated responses of select system designs to physical test data. Validation activities such as these lead to improvement of offshore wind modeling tools, which will enable the development of more innovative and cost-effective offshore wind designs. For Phase II of the project, numerical models of the DeepCwind floating semisubmersible wind system were validated using measurement data from a 1/50th-scale validation campaign performed at the Maritime Research Institute Netherlands offshore wave basin. Validation of the models was performed by comparing the calculated ultimate and fatigue loads for eight different wave-only and combined wind/wave test cases against the measured data, after calibration was performed using free-decay, wind-only, and wave-only tests. The results show a decent estimation of both the ultimate and fatigue loads for the simulated results, but with a fairly consistent underestimation in the tower and upwind mooring line loads that can be attributed to an underestimation of wave-excitation forces outside the linear wave-excitation region, and the presence of broadband frequency excitation in the experimental measurements from wind. Participant results showed varied agreement with the experimental measurements based on the modeling approach used. Modeling attributes that enabled better agreement included: the use of a dynamic mooring model; wave stretching, or some other hydrodynamic modeling approach that excites frequencies outside the linear wave region; nonlinear wave kinematics models; and unsteady aerodynamics models. Also, it was observed that a Morison-only hydrodynamic modeling approach could create excessive pitch excitation and resulting tower loads in some frequency bands.
Lin, Yu-Hsiu; McLain, Alexander C; Probst, Janice C; Bennett, Kevin J; Qureshi, Zaina P; Eberth, Jan M
2017-01-01
The purpose of this study was to develop county-level estimates of poor health-related quality of life (HRQOL) among U.S. adults aged 65 years and older and to identify spatial clusters of poor HRQOL using a multilevel, poststratification approach. Multilevel, random-intercept models were fit to HRQOL data (two domains: physical health and mental health) from the 2011-2012 Behavioral Risk Factor Surveillance System. Using a poststratification, small area estimation approach, we generated county-level probabilities of having poor HRQOL for each domain in U.S. adults aged 65 and older, and validated our model-based estimates against state and county direct estimates. County-level estimates of poor HRQOL in the United States ranged from 18.07% to 44.81% for physical health and 14.77% to 37.86% for mental health. Correlations between model-based and direct estimates were higher for physical than mental HRQOL. Counties located in Arkansas, Kentucky, and Mississippi exhibited the worst physical HRQOL scores, but this pattern did not hold for mental HRQOL, which had the highest probability of mentally unhealthy days in Illinois, Indiana, and Vermont. Substantial geographic variation in physical and mental HRQOL scores exists among older U.S. adults. State and local policy makers should consider these local conditions in targeting interventions and policies to counties with high levels of poor HRQOL scores. Copyright © 2016 Elsevier Inc. All rights reserved.
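The poststratification step described above amounts to weighting model-predicted cell probabilities by each county's census cell counts. A minimal sketch follows; the cell definitions, counts, and probabilities are hypothetical.
    # Hedged sketch of the poststratification step of small area estimation:
    # model-predicted probabilities of poor HRQOL for each demographic cell are
    # weighted by that county's cell population counts. All numbers are hypothetical.
    import numpy as np

    # Probabilities of poor physical HRQOL predicted by a multilevel model for
    # four illustrative cells (e.g., sex x age-group) in one county.
    cell_prob = np.array([0.22, 0.31, 0.27, 0.38])

    # Census counts of adults 65+ in the same cells for that county.
    cell_count = np.array([1200, 900, 800, 600])

    county_estimate = np.sum(cell_prob * cell_count) / cell_count.sum()
    print(f"county-level poor-HRQOL estimate: {100 * county_estimate:.1f}%")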
Surrogate screening models for the low physical activity criterion of frailty.
Eckel, Sandrah P; Bandeen-Roche, Karen; Chaves, Paulo H M; Fried, Linda P; Louis, Thomas A
2011-06-01
Low physical activity, one of five criteria in a validated clinical phenotype of frailty, is assessed by a standardized, semiquantitative questionnaire on up to 20 leisure time activities. Because of the time demanded to collect the interview data, it has been challenging to translate to studies other than the Cardiovascular Health Study (CHS), for which it was developed. Considering subsets of activities, we identified and evaluated streamlined surrogate assessment methods and compared them to one implemented in the Women's Health and Aging Study (WHAS). Using data on men and women ages 65 and older from the CHS, we applied logistic regression models to rank activities by "relative influence" in predicting low physical activity. We considered subsets of the most influential activities as inputs to potential surrogate models (logistic regressions). We evaluated predictive accuracy and predictive validity using the area under receiver operating characteristic curves and assessed criterion validity using proportional hazards models relating frailty status (defined using the surrogate) to mortality. Walking for exercise and moderately strenuous household chores were highly influential for both genders. Women required fewer activities than men for accurate classification. The WHAS model (8 CHS activities) was an effective surrogate, but a surrogate using 6 activities (walking, chores, gardening, general exercise, mowing and golfing) was also highly predictive. We recommend a 6-activity questionnaire to assess physical activity for men and women. If efficiency is essential and the study involves only women, fewer activities can be included.
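The surrogate models described are ordinary logistic regressions evaluated with the area under the ROC curve. A hedged sketch of that pattern on synthetic data follows; the six-activity subset is borrowed from the abstract only as an illustration, and the simulated relationship between activities and the low-activity criterion is an assumption.
    # Hedged sketch of the surrogate-screening pattern: a logistic regression of
    # low physical activity on a reduced activity set, scored by ROC AUC.
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(2)
    n = 1000
    activities = rng.poisson(2.0, size=(n, 6))          # scores for 6 activities (toy)
    logit = 0.8 - 0.4 * activities.sum(axis=1)          # assumed relationship
    low_activity = rng.random(n) < 1 / (1 + np.exp(-logit))

    X_tr, X_te, y_tr, y_te = train_test_split(activities, low_activity,
                                              test_size=0.3, random_state=0)
    surrogate = LogisticRegression().fit(X_tr, y_tr)
    auc = roc_auc_score(y_te, surrogate.predict_proba(X_te)[:, 1])
    print(f"surrogate model AUC = {auc:.2f}")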
Physics based performance model of a UV missile seeker
NASA Astrophysics Data System (ADS)
James, I.
2017-10-01
Electro-optically (EO) guided surface-to-air missiles (SAMs) have developed to use ultraviolet (UV) wavebands supplementary to the more common infrared (IR) wavebands. Missiles such as the US Stinger have been around for some time; these have been joined recently by the Chinese FN-16 and Russian SA-29 (Verba), and there is a much higher potential proliferation risk. The purpose of this paper is to introduce a first-principles, physics-based model of a typical seeker arrangement. The model is constructed from various calculations that aim to characterise the physical effects that will affect the performance of the system. Data have been gathered from a number of sources to provide realism to the variables within the model. It will be demonstrated that many of the variables have the power to dramatically alter the performance of the system as a whole. Further, data will be shown to illustrate the expected performance of a typical UV detector within a SAM in terms of detection range against a variety of target sizes. The trend for detection range against aircraft size and skin reflectivity will be shown to be non-linear; this should be expected owing to the exponential decay of a signal through the atmosphere. Future work will validate the performance of the model against real-world performance data for cameras (when this is available) to ensure that it operates within acceptable errors.
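The non-linear range trend attributed above to exponential atmospheric decay can be illustrated with a toy detection-range calculation: a received signal proportional to target area and reflectivity, attenuated as exp(-beta*R)/R^2, is compared against a detector threshold. All coefficients in the sketch below are assumptions, not values from the seeker model in the paper.
    # Minimal sketch of why detection range varies non-linearly with target size
    # and reflectivity: Beer-Lambert attenuation plus 1/R^2 geometric spreading,
    # with detection range defined by a threshold crossing. Illustrative numbers only.
    import numpy as np
    from scipy.optimize import brentq

    def received_signal(R_km, area_m2, reflectivity, beta_per_km=0.3, source=1.0):
        return source * area_m2 * reflectivity * np.exp(-beta_per_km * R_km) / R_km**2

    def detection_range(area_m2, reflectivity, threshold=1e-3):
        f = lambda R: received_signal(R, area_m2, reflectivity) - threshold
        return brentq(f, 0.05, 200.0)   # bracket assumed to contain the crossing

    for area in (5.0, 20.0, 50.0):
        print(f"target area {area:5.1f} m^2 -> "
              f"detection range ~ {detection_range(area, reflectivity=0.1):.1f} km")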
Hayes, Risa P; Nelson, David R; Meldahl, Michael L; Curtis, Bradley H
2011-07-01
The aims of this study were (1) to demonstrate the reliability and validity of the Impact of Weight on Activities of Daily Living Questionnaire (IWADL), a measure of ability to perform daily physical activities, in individuals with type 2 diabetes who are moderately obese and (2) to characterize those individuals with low self-reported ability. Data from a web-based survey of individuals with type 2 diabetes and body mass index (BMI) of 30-40 kg/m(2) were used to calculate Cronbach's α and demonstrate both IWADL factorial and construct validity. These data were entered into a multivariable multinomial logistic regression model with survey variables (demographics, health status, weight- and diabetes-related) as the independent variables and three IWADL scoring groups (Low, Medium, and High) as the dependent variables. Study participants were 349 individuals with type 2 diabetes (mean age = 59 years, 44% male, 91% white, mean BMI = 35 kg/m(2)). Factor analysis indicated a one-factor solution for a seven-item IWADL with Cronbach's α = 0.94. Significant (P < 0.05) relationships were identified between IWADL and variables previously shown to be related to level of physical activity. Ten variables remained independently (P < 0.05) related to IWADL scores, including age, gender, health status, current exercise, using exercise programs or hypnosis as a step to lose weight, and self-reported weight history. The IWADL is a reliable and valid measure of the ability of individuals with type 2 diabetes and moderate obesity to perform daily physical activities. This ability may be an important patient-reported end point to include in clinical trials of antihyperglycemic medications that produce weight loss.
NASA Astrophysics Data System (ADS)
Kalaroni, Sofia; Tsiaras, Kostas; Economou-Amilli, Athena; Petihakis, George; Politikos, Dimitrios; Triantafyllou, George
2013-04-01
Within the framework of the European project OPEC (Operational Ecology), a data assimilation system was implemented to describe chlorophyll-a concentrations of the North Aegean, as well as the impact on the European anchovy (Engraulis encrasicolus) biomass distribution provided by a bioenergetics model, related to the density of three low trophic level functional groups of zooplankton (heterotrophic flagellates, microzooplankton and mesozooplankton). The three-dimensional hydrodynamic-biogeochemical model comprises two on-line coupled sub-models: the Princeton Ocean Model (POM) and the European Regional Seas Ecosystem Model (ERSEM). The assimilation scheme is based on the Singular Evolutive Extended Kalman (SEEK) filter and its variant that uses a fixed correction base (SFEK). For the initialization, the SEEK filter uses a reduced order error covariance matrix provided by the dominant Empirical Orthogonal Functions (EOF) of the model. The assimilation experiments were performed for the year 2003 using SeaWiFS chlorophyll-a data, during which the physical model uses the atmospheric forcing obtained from the regional climate model HIRHAM5. The assimilation system is validated by assessing the relevance of the system in fitting the data, the impact of the assimilation on non-observed biochemical parameters and the overall quality of the forecasts.
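The analysis step of a SEEK-type reduced-rank Kalman filter can be sketched by approximating the forecast error covariance with a few retained EOF modes, P ≈ L Lᵀ, and applying the standard Kalman gain. The toy example below uses random matrices and arbitrary dimensions; it is not the OPEC/ERSEM configuration.
    # Hedged sketch of a reduced-rank Kalman analysis step: the forecast error
    # covariance is approximated by a few dominant EOF modes (P ~ L @ L.T) and the
    # standard Kalman gain corrects the state with observations. Toy values only.
    import numpy as np

    rng = np.random.default_rng(3)
    n_state, n_modes, n_obs = 200, 5, 20

    x_f = rng.normal(size=n_state)                  # forecast state (e.g., chl-a field)
    L = 0.1 * rng.normal(size=(n_state, n_modes))   # retained EOF modes (scaled)
    H = np.zeros((n_obs, n_state))                  # observation operator: sample points
    H[np.arange(n_obs), rng.choice(n_state, n_obs, replace=False)] = 1.0
    R = 0.05 * np.eye(n_obs)                        # observation error covariance
    y = H @ x_f + rng.normal(scale=0.2, size=n_obs) # synthetic observations

    HL = H @ L
    K = L @ HL.T @ np.linalg.inv(HL @ HL.T + R)     # gain with P approximated by L L^T
    x_a = x_f + K @ (y - H @ x_f)                   # analysis state
    print("mean absolute increment:", np.abs(x_a - x_f).mean())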
Experimental Validation Techniques for the Heleeos Off-Axis Laser Propagation Model
2010-03-01
John Haiducek, BS, Physics, 1st Lt, USAF. Thesis, report AFIT/GAP/ENP/10-M07, March 2010. Approved for public release; distribution unlimited. Abstract: The High Energy Laser End-to-End
EDGE COMPUTING AND CONTEXTUAL INFORMATION FOR THE INTERNET OF THINGS SENSORS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Klein, Levente
Interpreting sensor data requires knowledge about sensor placement and the surrounding environment. For a single sensor measurement, it is easy to document the context by visual observation; however, for millions of sensors reporting data back to a server, the contextual information needs to be automatically extracted either from data analysis or by leveraging complementary data sources. Data layers that overlap spatially or temporally with sensor locations can be used to extract the context and to validate the measurement. To minimize the amount of data transmitted through the internet, while preserving signal information content, two methods are explored: computation at the edge and compressed sensing. We validate the above methods on wind and chemical sensor data: (1) eliminating redundant measurements from wind sensors and (2) extracting the peak value of a chemical sensor measuring a methane plume. We present a general cloud-based framework to validate sensor data based on statistical and physical modeling and contextual data extracted from geospatial data.
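The two edge computations mentioned, suppressing redundant wind readings and reporting only the peak of a methane plume, can be sketched in a few lines; the thresholds and signals below are hypothetical.
    # Minimal sketch of the two edge computations described: (1) transmit a wind
    # reading only when it changes by more than a threshold, and (2) report only
    # the peak of a methane-sensor time series. Data are hypothetical.
    import numpy as np

    def deduplicate(readings, threshold=0.5):
        """Keep a reading only if it differs from the last transmitted value."""
        sent = [readings[0]]
        for r in readings[1:]:
            if abs(r - sent[-1]) > threshold:
                sent.append(r)
        return sent

    rng = np.random.default_rng(4)
    wind = 5.0 + 0.2 * rng.normal(size=1000)                        # nearly constant wind speed
    methane = np.exp(-0.5 * ((np.arange(600) - 300) / 40.0) ** 2)   # plume passing by

    print(f"wind: {len(deduplicate(wind))} of {wind.size} samples transmitted")
    print(f"methane peak value {methane.max():.2f} at sample {methane.argmax()}")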
Towards improved capability and confidence in coupled atmospheric and wildland fire modeling
NASA Astrophysics Data System (ADS)
Sauer, Jeremy A.
This dissertation work is aimed at improving the capability and confidence in a modernized and improved version of Los Alamos National Laboratory's coupled atmospheric and wildland fire dynamics model, Higrad-Firetec. Higrad is the hydrodynamics component of this large eddy simulation model that solves the three dimensional, fully compressible Navier-Stokes equations, incorporating a dynamic eddy viscosity formulation through a two-scale turbulence closure scheme. Firetec is the vegetation, drag forcing, and combustion physics portion that is integrated with Higrad. The modern version of Higrad-Firetec incorporates multiple numerical methodologies and high performance computing aspects which combine to yield a unique tool capable of augmenting theoretical and observational investigations in order to better understand the multi-scale, multi-phase, and multi-physics phenomena involved in coupled atmospheric and environmental dynamics. More specifically, the current work includes extended functionality and validation efforts targeting component processes in coupled atmospheric and wildland fire scenarios. Since observational data of sufficient quality and resolution to validate the fully coupled atmosphere-wildfire scenario simply do not exist, we instead seek to validate components of the full, prohibitively convoluted process. This manuscript provides, first, an introduction and background to the application space of Higrad-Firetec. Second, we document the model formulation, solution procedure, and a simple scalar transport verification exercise. Third, we validate model results against observational data for time-averaged flow field metrics in and above four idealized forest canopies. Fourth, we carry out a validation effort for the non-buoyant jet in a crossflow scenario (to which an analogy can be made for atmosphere-wildfire interactions), comparing model results to laboratory data of both steady-in-time and unsteady-in-time metrics. Finally, an extension of the model's multi-phase physics is implemented, allowing for the representation of multiple collocated fuels as separately evolving constituents, leading to differences in the resulting rate of spread and total burned area. In combination, these efforts demonstrate improved capability, increased validation of component functionality, and the unique applicability of the Higrad-Firetec modeling framework. As a result, this work provides a substantially more robust foundation for new, more widely accepted investigations into the complexities of coupled atmospheric and wildland fire behavior.
Quantitative validation of carbon-fiber laminate low velocity impact simulations
English, Shawn A.; Briggs, Timothy M.; Nelson, Stacy M.
2015-09-26
Simulations of low velocity impact with a flat cylindrical indenter upon a carbon fiber fabric reinforced polymer laminate are rigorously validated. Comparison of the impact energy absorption between the model and experiment is used as the validation metric. Additionally, non-destructive evaluation, including ultrasonic scans and three-dimensional computed tomography, provides qualitative validation of the models. The simulations include delamination, matrix cracks and fiber breaks. An orthotropic damage and failure constitutive model, capable of predicting progressive damage and failure, is developed in conjunction with the simulations and described. An ensemble of simulations incorporating model parameter uncertainties is used to predict a response distribution which is then compared to experimental output using appropriate statistical methods. Lastly, the model form errors are exposed and corrected for use in an additional blind validation analysis. The result is a quantifiable confidence in material characterization and model physics when simulating low velocity impact in structures of interest.
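The ensemble comparison described can be sketched as: sample the uncertain parameters, evaluate the predicted energy absorption for each sample, and locate the measurement within the predicted distribution. The linear stand-in for the finite element model and all numbers below are placeholders, not the constitutive model or test data above.
    # Hedged sketch of the ensemble-validation pattern: sample uncertain model
    # parameters, evaluate predicted impact energy absorption for each sample, and
    # locate the experimental measurement within the predicted distribution.
    import numpy as np

    rng = np.random.default_rng(5)

    def simulated_energy_absorbed(stiffness, strength):
        # Stand-in for a finite element run returning absorbed energy (J).
        return 12.0 + 0.5 * (strength - 300.0) / 10.0 - 0.02 * (stiffness - 60.0)

    stiffness = rng.normal(60.0, 3.0, size=500)     # GPa, assumed uncertainty
    strength = rng.normal(300.0, 15.0, size=500)    # MPa, assumed uncertainty
    ensemble = simulated_energy_absorbed(stiffness, strength)

    measured = 12.8                                  # hypothetical test value (J)
    percentile = (ensemble < measured).mean() * 100
    print(f"prediction: {ensemble.mean():.2f} +/- {ensemble.std():.2f} J; "
          f"measurement sits at the {percentile:.0f}th percentile of the ensemble")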
Rolland, Yves; Dupuy, Charlotte; Abellan Van Kan, Gabor; Cesari, Matteo; Vellas, Bruno; Faruch, Marie; Dray, Cedric; de Souto Barreto, Philipe
2017-10-01
Screening for sarcopenia in daily practice can be challenging. Our objective was to explore whether the SARC-F questionnaire is a valid screening tool for sarcopenia (defined by the Foundation for the National Institutes of Health [FNIH] criteria). Moreover, we evaluated the physical performance of older women according to the SARC-F questionnaire. Cross-sectional study. Data from the Toulouse and Lyon EPIDémiologie de l'OStéoporose study (EPIDOS) on 3025 women living in the community (mean age: 80.5 ± 3.9 years), without a previous history of hip fracture, were assessed. The SARC-F self-report questionnaire score ranges from 0 to 10: a score ≥4 defines sarcopenia. The FNIH criteria use handgrip strength (GS) and appendicular lean mass (ALM; assessed by DXA) divided by body mass index (BMI) to define sarcopenia. Outcome measures were the following performance-based tests: knee-extension strength, 6-m gait speed, and a repeated chair-stand test. The associations of sarcopenia with performance-based tests were examined using bootstrap multiple linear-regression models; adjusted R2 determined the percentage variation for each outcome explained by the model. Prevalence of sarcopenia was 16.7% (n = 504) according to the SARC-F questionnaire and 1.8% (n = 49) using the FNIH criteria. Sensitivity and specificity of the SARC-F to diagnose sarcopenia (defined by FNIH criteria) were 34% and 85%, respectively. Sarcopenic women defined by SARC-F had significantly lower physical performance than nonsarcopenic women. The SARC-F improved the ability to predict poor physical performance. The validity of the SARC-F questionnaire to screen for sarcopenia, when compared with the FNIH criteria, was limited. However, sarcopenia defined by the SARC-F questionnaire substantially improved the predictive value of clinical characteristics of patients to predict poor physical performance. Copyright © 2017 AMDA – The Society for Post-Acute and Long-Term Care Medicine. Published by Elsevier Inc. All rights reserved.
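For reference, the reported sensitivity and specificity follow from a 2x2 table of SARC-F versus FNIH classifications. The counts in the sketch below are hypothetical, chosen only so the arithmetic lands near 34% and 85%; they are not the study's actual table.
    # Worked illustration of the screening statistics reported above, with
    # hypothetical 2x2 counts (not the study data).
    tp, fn = 17, 33        # FNIH-sarcopenic women flagged / missed by SARC-F >= 4
    tn, fp = 255, 45       # non-sarcopenic women correctly cleared / falsely flagged

    sensitivity = tp / (tp + fn)       # 17 / 50  = 0.34
    specificity = tn / (tn + fp)       # 255 / 300 = 0.85
    print(f"sensitivity = {sensitivity:.0%}, specificity = {specificity:.0%}")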
Development and application of diurnal thermal modeling for camouflage, concealment, and deception
NASA Astrophysics Data System (ADS)
Rodgers, Mark L. B.
2000-07-01
The art of camouflage is to make a military asset appear to be part of the natural environment: its background. In order to predict the likely performance of countermeasures in attaining this goal it is necessary to model the signatures of targets, backgrounds and the effect of countermeasures. A library of diurnal thermal models has been constructed covering a range of backgrounds from vegetated and non-vegetated surfaces to snow cover. These models, originally developed for Western Europe, have been validated successfully for theatres of operation from the arctic to the desert. This paper will show the basis for and development of physically based models for the diurnal thermal behavior both of these backgrounds and of major passive countermeasures: camouflage nets and continuous textile materials. The countermeasures set up significant challenges for the thermal modeler with their low but non-zero thermal inertia and the extent to which they influence local aerodynamic behavior. These challenges have been met and the necessary extensive validation has shown the ability of the models to predict successfully the behavior of in-service countermeasures.
Verschueren, Sabine M. P.; Degens, Hans; Morse, Christopher I.; Onambélé, Gladys L.
2017-01-01
Accurate monitoring of sedentary behaviour and physical activity is key to investigate their exact role in healthy ageing. To date, accelerometers using cut-off point models are most preferred for this, however, machine learning seems a highly promising future alternative. Hence, the current study compared between cut-off point and machine learning algorithms, for optimal quantification of sedentary behaviour and physical activity intensities in the elderly. Thus, in a heterogeneous sample of forty participants (aged ≥60 years, 50% female) energy expenditure during laboratory-based activities (ranging from sedentary behaviour through to moderate-to-vigorous physical activity) was estimated by indirect calorimetry, whilst wearing triaxial thigh-mounted accelerometers. Three cut-off point algorithms and a Random Forest machine learning model were developed and cross-validated using the collected data. Detailed analyses were performed to check algorithm robustness, and examine and benchmark both overall and participant-specific balanced accuracies. This revealed that the four models can at least be used to confidently monitor sedentary behaviour and moderate-to-vigorous physical activity. Nevertheless, the machine learning algorithm outperformed the cut-off point models by being robust to all individuals' physiological and non-physiological characteristics and showing acceptable performance more consistently over the whole range of physical activity intensities. Therefore, we propose that Random Forest machine learning may be optimal for objective assessment of sedentary behaviour and physical activity in older adults using thigh-mounted triaxial accelerometry. PMID:29155839
Wullems, Jorgen A; Verschueren, Sabine M P; Degens, Hans; Morse, Christopher I; Onambélé, Gladys L
2017-01-01
Accurate monitoring of sedentary behaviour and physical activity is key to investigate their exact role in healthy ageing. To date, accelerometers using cut-off point models are most preferred for this, however, machine learning seems a highly promising future alternative. Hence, the current study compared between cut-off point and machine learning algorithms, for optimal quantification of sedentary behaviour and physical activity intensities in the elderly. Thus, in a heterogeneous sample of forty participants (aged ≥60 years, 50% female) energy expenditure during laboratory-based activities (ranging from sedentary behaviour through to moderate-to-vigorous physical activity) was estimated by indirect calorimetry, whilst wearing triaxial thigh-mounted accelerometers. Three cut-off point algorithms and a Random Forest machine learning model were developed and cross-validated using the collected data. Detailed analyses were performed to check algorithm robustness, and examine and benchmark both overall and participant-specific balanced accuracies. This revealed that the four models can at least be used to confidently monitor sedentary behaviour and moderate-to-vigorous physical activity. Nevertheless, the machine learning algorithm outperformed the cut-off point models by being robust to all individuals' physiological and non-physiological characteristics and showing acceptable performance more consistently over the whole range of physical activity intensities. Therefore, we propose that Random Forest machine learning may be optimal for objective assessment of sedentary behaviour and physical activity in older adults using thigh-mounted triaxial accelerometry.
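The machine-learning arm of this comparison can be sketched as a Random Forest trained on accelerometer features and cross-validated subject-wise so no participant appears in both training and test folds. The features, labels, and window counts below are synthetic placeholders, not the study data.
    # Hedged sketch: Random Forest classification of activity intensity from
    # thigh-accelerometer features, with subject-wise (grouped) cross-validation.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import GroupKFold, cross_val_score

    rng = np.random.default_rng(6)
    n_subjects, windows_per_subject = 40, 50
    n = n_subjects * windows_per_subject

    features = rng.normal(size=(n, 8))          # e.g., mean/SD of 3 axes, counts (toy)
    intensity = (features[:, 0] + 0.5 * rng.normal(size=n) > 0).astype(int)  # SB vs MVPA (toy)
    subject_id = np.repeat(np.arange(n_subjects), windows_per_subject)

    model = RandomForestClassifier(n_estimators=200, random_state=0)
    scores = cross_val_score(model, features, intensity,
                             groups=subject_id, cv=GroupKFold(n_splits=5),
                             scoring="balanced_accuracy")
    print(f"balanced accuracy: {scores.mean():.2f} +/- {scores.std():.2f}")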
Uncertainty quantification and validation of combined hydrological and macroeconomic analyses.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hernandez, Jacquelynne; Parks, Mancel Jordan; Jennings, Barbara Joan
2010-09-01
Changes in climate can lead to instabilities in physical and economic systems, particularly in regions with marginal resources. Global climate models indicate increasing global mean temperatures over the decades to come and uncertainty in the local to national impacts means perceived risks will drive planning decisions. Agent-based models provide one of the few ways to evaluate the potential changes in behavior in coupled social-physical systems and to quantify and compare risks. The current generation of climate impact analyses provides estimates of the economic cost of climate change for a limited set of climate scenarios that account for a small subset of the dynamics and uncertainties. To better understand the risk to national security, the next generation of risk assessment models must represent global stresses, population vulnerability to those stresses, and the uncertainty in population responses and outcomes that could have a significant impact on U.S. national security.
Test and Analysis of Foam Impacting a 6x6 Inch RCC Flat Panel
NASA Technical Reports Server (NTRS)
Lessard, Wendy B.
2006-01-01
This report presents the testing and analyses of a foam projectile impacting onto thirteen 6x6 inch flat panels at a 90-degree incidence angle. The panels tested in this investigation were fabricated of Reinforced-Carbon-Carbon material and were used to aid in the validation of an existing material model, MAT58. The computational analyses were performed using LS-DYNA, which is a physics-based, nonlinear, transient, finite element code used for analyzing material responses subjected to high impact forces and other dynamic conditions. The test results were used to validate LS-DYNA predictions and to determine the threshold of damage generated by the MAT58 cumulative damage material model. The threshold of damage parameter represents any external or internal visible RCC damage detectable by nondestructive evaluation techniques.
Investigation of radiative interaction in laminar flows using Monte Carlo simulation
NASA Technical Reports Server (NTRS)
Liu, Jiwen; Tiwari, S. N.
1993-01-01
The Monte Carlo method (MCM) is employed to study the radiative interactions in fully developed laminar flow between two parallel plates. Taking advantage of the characteristics of easy mathematical treatment of the MCM, a general numerical procedure is developed for nongray radiative interaction. The nongray model is based on the statistical narrow band model with an exponential-tailed inverse intensity distribution. To validate the Monte Carlo simulation for nongray radiation problems, the results of radiative dissipation from the MCM are compared with two available solutions for a given temperature profile between two plates. After this validation, the MCM is employed to solve the present physical problem and results for the bulk temperature are compared with available solutions. In general, good agreement is noted and reasons for some discrepancies in certain ranges of parameters are explained.
NASA Technical Reports Server (NTRS)
Johnson, Kevin D.; Entekhabi, Dara; Eagleson, Peter S.
1991-01-01
Landsurface hydrological parameterizations are implemented in the NASA Goddard Institute for Space Studies (GISS) General Circulation Model (GCM). These parameterizations are: (1) runoff and evapotranspiration functions that include the effects of subgrid scale spatial variability and use physically based equations of hydrologic flux at the soil surface, and (2) a realistic soil moisture diffusion scheme for the movement of water in the soil column. A one dimensional climate model with a complete hydrologic cycle is used to screen the basic sensitivities of the hydrological parameterizations before implementation into the full three dimensional GCM. Results of the final simulation with the GISS GCM and the new landsurface hydrology indicate that the runoff rate, especially in the tropics, is significantly improved. As a result, the remaining components of the heat and moisture balance show comparable improvements when compared to observations. The validation of model results is carried out from the large global (ocean and landsurface) scale to the zonal, continental, and finally the finer river basin scales.
Serel Arslan, S; Demir, N; Karaduman, A A
2017-02-01
This study aimed to develop a scale called Tongue Thrust Rating Scale (TTRS), which categorised tongue thrust in children in terms of its severity during swallowing, and to investigate its validity and reliability. The study describes the developmental phase of the TTRS and presented its content and criterion-based validity and interobserver and intra-observer reliability. For content validation, seven experts assessed the steps in the scale over two Delphi rounds. Two physical therapists evaluated videos of 50 children with cerebral palsy (mean age, 57·9 ± 16·8 months), using the TTRS to test criterion-based validity, interobserver and intra-observer reliability. The Karaduman Chewing Performance Scale (KCPS) and Drooling Severity and Frequency Scale (DSFS) were used for criterion-based validity. All the TTRS steps were deemed necessary. The content validity index was 0·857. A very strong positive correlation was found between two examinations by one physical therapist, which indicated intra-observer reliability (r = 0·938, P < 0·001). A very strong positive correlation was also found between the TTRS scores of two physical therapists, indicating interobserver reliability (r = 0·892, P < 0·001). There was also a strong positive correlation between the TTRS and KCPS (r = 0·724, P < 0·001) and a very strong positive correlation between the TTRS scores and DSFS (r = 0·822 and r = 0·755; P < 0·001). These results demonstrated the criterion-based validity of the TTRS. The TTRS is a valid, reliable and clinically easy-to-use functional instrument to document the severity of tongue thrust in children. © 2016 John Wiley & Sons Ltd.
NASA Astrophysics Data System (ADS)
Holburn, E. R.; Bledsoe, B. P.; Poff, N. L.; Cuhaciyan, C. O.
2005-05-01
Using over 300 R/EMAP sites in OR and WA, we examine the relative explanatory power of watershed, valley, and reach scale descriptors in modeling variation in benthic macroinvertebrate indices. Innovative metrics describing flow regime, geomorphic processes, and hydrologic-distance weighted watershed and valley characteristics are used in multiple regression and regression tree modeling to predict EPT richness, % EPT, EPT/C, and % Plecoptera. A nested design using seven ecoregions is employed to evaluate the influence of geographic scale and environmental heterogeneity on the explanatory power of individual and combined scales. Regression tree models are constructed to explain variability while identifying threshold responses and interactions. Cross-validated models demonstrate differences in the explanatory power associated with single-scale and multi-scale models as environmental heterogeneity is varied. Models explaining the greatest variability in biological indices result from multi-scale combinations of physical descriptors. Results also indicate that substantial variation in benthic macroinvertebrate response can be explained with process-based watershed and valley scale metrics derived exclusively from common geospatial data. This study outlines a general framework for identifying key processes driving macroinvertebrate assemblages across a range of scales and establishing the geographic extent at which various levels of physical description best explain biological variability. Such information can guide process-based stratification to avoid spurious comparison of dissimilar stream types in bioassessments and ensure that key environmental gradients are adequately represented in sampling designs.
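The tree-based part of this workflow can be sketched as a regression tree predicting a benthic index from multi-scale physical descriptors, scored by cross-validation. The predictors and response in the sketch below are synthetic placeholders, not the R/EMAP data.
    # Hedged sketch of the regression-tree modeling described: predict a benthic
    # index (e.g., EPT richness) from watershed-, valley-, and reach-scale metrics,
    # with cross-validation to gauge explanatory power. Synthetic data only.
    import numpy as np
    from sklearn.tree import DecisionTreeRegressor
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(7)
    n_sites = 300
    X = rng.normal(size=(n_sites, 3))      # columns: watershed, valley, reach metric
    ept_richness = 10 + 3 * X[:, 0] - 2 * X[:, 2] + rng.normal(size=n_sites)

    tree = DecisionTreeRegressor(max_depth=4, min_samples_leaf=10)
    r2 = cross_val_score(tree, X, ept_richness, cv=5, scoring="r2")
    print(f"cross-validated R^2: {r2.mean():.2f} +/- {r2.std():.2f}")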
NASA Astrophysics Data System (ADS)
Stanford, Adam Christopher
Canopy reflectance models (CRMs) can accurately estimate vegetation canopy biophysical-structural information such as Leaf Area Index (LAI) inexpensively using satellite imagery. The strict physical basis which geometric-optical CRMs employ to mathematically link canopy bidirectional reflectance and structure allows for the tangible replication of a CRM's geometric abstraction of a canopy in the laboratory, enabling robust CRM validation studies. To this end, the ULGS-2 goniometer was used to obtain multiangle, hyperspectral (Spectrodirectional) measurements of a specially-designed tangible physical model forest, developed based upon the Geometric-Optical Mutual Shadowing (GOMS) CRM, at three different canopy cover densities. GOMS forward-modelled reflectance values had high levels of agreement with ULGS-2 measurements, with obtained reflectance RMSE values ranging from 0.03% to 0.1%. Canopy structure modelled via GOMS Multiple-Forward-Mode (MFM) inversion had varying levels of success. The methods developed in this thesis can potentially be extended to more complex CRMs through the implementation of 3D printing.
A contact angle hysteresis model based on the fractal structure of contact line.
Wu, Shuai; Ma, Ming
2017-11-01
Contact angle is one of the most popular concepts used in fields such as wetting, transport and microfluidics. In practice, different contact angles such as equilibrium, receding and advancing contact angles are observed due to hysteresis. The connection among these contact angles is important in revealing the chemical and physical properties of surfaces related to wetting. Inspired by the fractal structure of the contact line, we propose a single-parameter model depicting the connection of the three angles. This parameter is decided by the fractal structure of the contact line. The results of this model agree with experimental observations. In certain cases, it can be reduced to other existing models. It also provides a new point of view in understanding the physical nature of contact angle hysteresis. Interestingly, some counter-intuitive phenomena, such as binary receding angles, are indicated in this model and await validation by experiments. Copyright © 2017 Elsevier Inc. All rights reserved.
Data Assimilation - Advances and Applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Williams, Brian J.
2014-07-30
This presentation provides an overview of data assimilation (model calibration) for complex computer experiments. Calibration refers to the process of probabilistically constraining uncertain physics/engineering model inputs to be consistent with observed experimental data. An initial probability distribution for these parameters is updated using the experimental information. Utilization of surrogate models and empirical adjustment for model form error in code calibration form the basis for the statistical methodology considered. The role of probabilistic code calibration in supporting code validation is discussed. Incorporation of model form uncertainty in rigorous uncertainty quantification (UQ) analyses is also addressed. Design criteria used within a batch sequential design algorithm are introduced for efficiently achieving predictive maturity and improved code calibration. Predictive maturity refers to obtaining stable predictive inference with calibrated computer codes. These approaches allow for augmentation of initial experiment designs for collecting new physical data. A standard framework for data assimilation is presented and techniques for updating the posterior distribution of the state variables based on particle filtering and the ensemble Kalman filter are introduced.
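One of the update schemes mentioned, the bootstrap particle filter, reduces to reweighting prior samples by the observation likelihood and resampling. A toy one-dimensional sketch follows, with an assumed Gaussian likelihood; it is not the presentation's framework, only an illustration of the update step.
    # Hedged sketch of one bootstrap particle-filter update: multiply particle
    # weights by the observation likelihood, normalize, and resample.
    import numpy as np

    rng = np.random.default_rng(8)
    particles = rng.normal(loc=0.0, scale=1.0, size=1000)   # prior samples of a state
    weights = np.full(particles.size, 1.0 / particles.size)

    y_obs, obs_sigma = 0.7, 0.3                              # toy observation and error
    likelihood = np.exp(-0.5 * ((y_obs - particles) / obs_sigma) ** 2)
    weights *= likelihood
    weights /= weights.sum()

    # Multinomial resampling to obtain an equally weighted posterior ensemble.
    posterior = particles[rng.choice(particles.size, size=particles.size, p=weights)]
    print(f"posterior mean {posterior.mean():.2f}, prior mean {particles.mean():.2f}")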
Model based Computerized Ionospheric Tomography in space and time
NASA Astrophysics Data System (ADS)
Tuna, Hakan; Arikan, Orhan; Arikan, Feza
2018-04-01
Reconstruction of the ionospheric electron density distribution in space and time not only provides a basis for better understanding the physical nature of the ionosphere, but also provides improvements in various applications including HF communication. Recently developed IONOLAB-CIT technique provides physically admissible 3D model of the ionosphere by using both Slant Total Electron Content (STEC) measurements obtained from a GPS satellite - receiver network and IRI-Plas model. IONOLAB-CIT technique optimizes IRI-Plas model parameters in the region of interest such that the synthetic STEC computations obtained from the IRI-Plas model are in accordance with the actual STEC measurements. In this work, the IONOLAB-CIT technique is extended to provide reconstructions both in space and time. This extension exploits the temporal continuity of the ionosphere to provide more reliable reconstructions with a reduced computational load. The proposed 4D-IONOLAB-CIT technique is validated on real measurement data obtained from TNPGN-Active GPS receiver network in Turkey.
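The parameter-adjustment idea can be sketched as a nonlinear least-squares fit that tunes ionospheric model parameters until synthetic STEC along each satellite-receiver path matches the measurements. The two-parameter "ionosphere" and mapping factors below are toy stand-ins for IRI-Plas and real path geometry.
    # Hedged sketch of model-based CIT parameter tuning: adjust parameters so that
    # synthetic STEC matches measured STEC in a least-squares sense. Toy model only.
    import numpy as np
    from scipy.optimize import least_squares

    rng = np.random.default_rng(9)
    n_paths = 40
    lat = rng.uniform(36.0, 42.0, n_paths)          # receiver latitudes (deg)
    mapping = rng.uniform(1.0, 2.5, n_paths)        # slant-to-vertical mapping factors

    def synthetic_stec(params, lat, mapping):
        vtec0, grad = params                        # mean VTEC and latitudinal gradient
        return (vtec0 + grad * (lat - 39.0)) * mapping

    true_params = np.array([25.0, -1.2])            # assumed "truth" (TECU, TECU/deg)
    measured = synthetic_stec(true_params, lat, mapping) + rng.normal(0, 0.5, n_paths)

    fit = least_squares(lambda p: synthetic_stec(p, lat, mapping) - measured,
                        x0=np.array([20.0, 0.0]))
    print("recovered parameters:", np.round(fit.x, 2))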
NASA Astrophysics Data System (ADS)
Larmat, C. S.; Delorey, A.; Rougier, E.; Knight, E. E.; Steedman, D. W.; Bradley, C. R.
2017-12-01
This presentation reports numerical modeling efforts to improve knowledge of the processes that affect seismic wave generation and propagation from underground explosions, with a focus on Rg waves. The numerical model is based on the coupling of hydrodynamic simulation codes (Abaqus, CASH and HOSS) with a 3D full waveform propagation code, SPECFEM3D. Validation datasets are provided by the Source Physics Experiment (SPE), which is a series of highly instrumented chemical explosions at the Nevada National Security Site with yields from 100 kg to 5000 kg. A first series of explosions in a granite emplacement has just been completed and a second series in an alluvium emplacement is planned for 2018. The long-term goal of this research is to review and improve existing seismic source models (e.g. Mueller & Murphy, 1971; Denny & Johnson, 1991) by providing first-principles calculations enabled by the coupled codes capability. The hydrodynamic codes, Abaqus, CASH and HOSS, model the shocked, hydrodynamic region via equations of state for the explosive, borehole stemming and jointed/weathered granite. A new material model for unconsolidated alluvium materials has been developed and validated with past nuclear explosions, including the 10 kT 1965 Merlin event (Perret, 1971; Perret and Bass, 1975). We use the efficient Spectral Element Method code, SPECFEM3D (e.g. Komatitsch, 1998; 2002), and Geologic Framework Models to model the evolution of the wavefield as it propagates across 3D complex structures. The coupling interface is a series of grid points of the SEM mesh situated at the edge of the hydrodynamic code domain. We will present validation tests and waveforms modeled for several SPE tests which provide evidence that the damage processes happening in the vicinity of the explosions create secondary seismic sources. These sources interfere with the original explosion moment and reduce the apparent seismic moment at the origin of Rg waves by up to 20%.
Robertson, Amy N.; Wendt, Fabian; Jonkman, Jason M.; ...
2016-10-13
This paper summarizes the findings from Phase Ib of the Offshore Code Comparison, Collaboration, Continued with Correlation (OC5) project. OC5 is a project run under the International Energy Agency (IEA) Wind Research Task 30, and is focused on validating the tools used for modelling offshore wind systems through the comparison of simulated responses of select offshore wind systems (and components) to physical test data. For Phase Ib of the project, simulated hydrodynamic loads on a flexible cylinder fixed to a sloped bed were validated against test measurements made in the shallow water basin at the Danish Hydraulic Institute (DHI) with support from the Technical University of Denmark (DTU). The first phase of OC5 examined two simple cylinder structures (Phase Ia and Ib) to focus on validation of hydrodynamic models used in the various tools before moving on to more complex offshore wind systems and the associated coupled physics. As a result, verification and validation activities such as these lead to improvement of offshore wind modelling tools, which will enable the development of more innovative and cost-effective offshore wind designs.
NASA Astrophysics Data System (ADS)
Künne, A.; Fink, M.; Kipka, H.; Krause, P.; Flügel, W.-A.
2012-06-01
In this paper, a method is presented to estimate excess nitrogen on large scales considering single field processes. The approach was implemented by using the physically based model J2000-S to simulate the nitrogen balance as well as the hydrological dynamics within meso-scale test catchments. The model input data, the parameterization, the results and a detailed system understanding were used to generate the regression tree models with GUIDE (Loh, 2002). For each landscape type in the federal state of Thuringia a regression tree was calibrated and validated using the model data and results of excess nitrogen from the test catchments. Hydrological parameters such as precipitation and evapotranspiration were also used to predict excess nitrogen by the regression tree model. Hence they had to be calculated and regionalized as well for the state of Thuringia. Here the model J2000g was used to simulate the water balance on the macro scale. With the regression trees the excess nitrogen was regionalized for each landscape type of Thuringia. The approach allows calculating the potential nitrogen input into the streams of the drainage area. The results show that the applied methodology was able to transfer the detailed model results of the meso-scale catchments to the entire state of Thuringia with low computing time, without losing the detailed knowledge from the nitrogen transport modeling. This was validated with modeling results from Fink (2004) in a catchment lying in the regionalization area. The regionalized and modeled excess nitrogen show a correspondence of 94%. The study was conducted within the framework of a project in collaboration with the Thuringian Environmental Ministry, whose overall aim was to assess the effect of agro-environmental measures regarding load reduction in the water bodies of Thuringia to fulfill the requirements of the European Water Framework Directive (Bäse et al., 2007; Fink, 2006; Fink et al., 2007).
Scott, Sarah Nicole; Templeton, Jeremy Alan; Hough, Patricia Diane; ...
2014-01-01
This study details a methodology for quantification of errors and uncertainties of a finite element heat transfer model applied to a Ruggedized Instrumentation Package (RIP). The proposed verification and validation (V&V) process includes solution verification to examine errors associated with the code's solution techniques, and model validation to assess the model's predictive capability for quantities of interest. The model was subjected to mesh resolution and numerical parameters sensitivity studies to determine reasonable parameter values and to understand how they change the overall model response and performance criteria. To facilitate quantification of the uncertainty associated with the mesh, automatic meshing and mesh refining/coarsening algorithms were created and implemented on the complex geometry of the RIP. Automated software to vary model inputs was also developed to determine the solution's sensitivity to numerical and physical parameters. The model was compared with an experiment to demonstrate its accuracy and determine the importance of both modelled and unmodelled physics in quantifying the results' uncertainty. An emphasis is placed on automating the V&V process to enable uncertainty quantification within tight development schedules.
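A standard solution-verification calculation consistent with the mesh study described is the observed order of convergence and Richardson extrapolation from results on three uniformly refined meshes. The sketch below uses hypothetical peak temperatures and a refinement ratio of 2; it illustrates the general technique, not this study's specific numbers.
    # Hedged sketch: observed order of convergence and Richardson extrapolation
    # from three uniformly refined meshes (hypothetical values).
    import math

    r = 2.0                                              # uniform refinement ratio
    f_coarse, f_medium, f_fine = 413.0, 405.0, 402.0     # peak temperature (K), hypothetical

    p = math.log((f_coarse - f_medium) / (f_medium - f_fine)) / math.log(r)
    f_extrap = f_fine + (f_fine - f_medium) / (r**p - 1.0)

    print(f"observed order of convergence p = {p:.2f}")
    print(f"Richardson-extrapolated value  = {f_extrap:.1f} K")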
NASA Astrophysics Data System (ADS)
Lepore, C.; Arnone, E.; Noto, L. V.; Sivandran, G.; Bras, R. L.
2013-09-01
This paper presents the development of a rainfall-triggered landslide module within an existing physically based spatially distributed ecohydrologic model. The model, tRIBS-VEGGIE (Triangulated Irregular Networks-based Real-time Integrated Basin Simulator and Vegetation Generator for Interactive Evolution), is capable of a sophisticated description of many hydrological processes; in particular, the soil moisture dynamics are resolved at a temporal and spatial resolution required to examine the triggering mechanisms of rainfall-induced landslides. The validity of the tRIBS-VEGGIE model to a tropical environment is shown with an evaluation of its performance against direct observations made within the study area of Luquillo Forest. The newly developed landslide module builds upon the previous version of the tRIBS landslide component. This new module utilizes a numerical solution to the Richards' equation (present in tRIBS-VEGGIE but not in tRIBS), which better represents the time evolution of soil moisture transport through the soil column. Moreover, the new landslide module utilizes an extended formulation of the factor of safety (FS) to correctly quantify the role of matric suction in slope stability and to account for unsaturated conditions in the evaluation of FS. The new modeling framework couples the capabilities of the detailed hydrologic model to describe soil moisture dynamics with the infinite slope model, creating a powerful tool for the assessment of rainfall-triggered landslide risk.
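The extended factor-of-safety (FS) formulation referred to above can be written, for an infinite slope of slope angle \(\beta\) and slip depth \(z\), in a common unsaturated (extended Mohr-Coulomb) form; the expression below is one standard version and is not necessarily the exact formulation implemented in the module:
    FS = \frac{c' + (\gamma z \cos^2\beta - u_a)\tan\phi' + (u_a - u_w)\tan\phi^b}{\gamma z \sin\beta \cos\beta}
Here \(c'\) and \(\phi'\) are the effective cohesion and friction angle, \(\gamma\) is the soil unit weight, \(u_a\) and \(u_w\) are the pore-air and pore-water pressures, and the matric suction term \((u_a - u_w)\tan\phi^b\) is what allows the unsaturated conditions resolved by the Richards' equation solution to raise FS above its saturated value.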
2012-01-01
Background Given the documented physical activity disparities that exist among low-income minority communities and the increased focus on socio-ecological approaches to address physical inactivity, efforts aimed at understanding the built environment to support physical activity are needed. This community-based participatory research (CBPR) project investigates walking trail perceptions in a high minority southern community and objectively examines walking trails. The primary aim is to explore if perceived and objective audit variables predict meeting recommendations for walking and physical activity, MET/minutes/week of physical activity, and frequency of trail use. Methods A proportional sampling plan was used to survey community residents in this cross-sectional study. Previously validated instruments were pilot tested and appropriately adapted and included the short version of the validated International Physical Activity Questionnaire, trail use, and perceptions of walking trails. Walking trails were assessed using the valid and reliable Path Environmental Audit Tool which assesses four content areas including: design features, amenities, maintenance, and pedestrian safety from traffic. Analyses included Chi-square, one-way ANOVAs, multiple linear regression, and multiple logistic models. Results Numerous (n = 21) high quality walking trails were available. Across trails, there were very few indicators of incivilities and safety features were rated relatively high. Among the 372 respondents, trail use significantly predicted meeting recommendations for walking and physical activity, and MET/minutes/week. While controlling for other variables, significant predictors of trail use included proximity to trails, as well as perceptions of walking trail safety, trail amenities, and neighborhood pedestrian safety. Furthermore, while controlling for education, gender, and income, for every one time per week increase in using walking trails, the odds of meeting walking recommendations increased 1.27 times, and the odds of meeting the PA recommendation increased 3.54 times. Perceived and objective audit variables did not predict meeting physical activity recommendations. Conclusions To improve physical activity levels, intervention efforts are needed to maximize the use of existing trails, as well as improve residents' perceptions related to incivilities, safety, trail conditions, and amenities of the walking trails. This study provides important insights for informing development of the CBPR walking intervention and informing local recreational and environmental policies in this southern community. PMID:22289653
NASA Astrophysics Data System (ADS)
Reder, Alfredo; Rianna, Guido; Pagano, Luca
2018-02-01
In the field of rainfall-induced landslides on sloping covers, models for early warning predictions require an adequate trade-off between two aspects: prediction accuracy and timeliness. When a cover's initial hydrological state is a determining factor in triggering landslides, taking evaporative losses into account (or not) can significantly affect both aspects. This study evaluates the performance of three physically based predictive models that convert precipitation and evaporative fluxes into hydrological variables useful for assessing slope safety conditions. Two of the models incorporate evaporation, one representing it as both a boundary and an internal phenomenon and the other as only a boundary phenomenon; the third model disregards evaporation entirely. Model performance is assessed by analysing a well-documented case study involving a 2 m thick sloping volcanic cover. The large amount of monitoring data collected for the soil involved in the case study, reconstituted in a suitably equipped lysimeter, makes it possible to propose procedures for calibrating and validating the parameters of the models. All predictions indicate a hydrological singularity at the landslide time (alarm). A comparison of the models' predictions also indicates that the greater the complexity and completeness of the model, the lower the number of predicted hydrological singularities when no landslides occur (false alarms).
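To illustrate the point about evaporative losses (not the authors' models), the sketch below runs a single-bucket water balance for a thin cover with and without evaporation; all parameters and forcing are hypothetical.

```python
# Minimal single-bucket water balance, run with and without evaporative losses.
# Hypothetical parameters; the paper's models solve the problem far more completely.
import numpy as np

def run_bucket(rain_mm, pet_mm, storage0=100.0, capacity=400.0, include_evap=True):
    """Return daily storage (mm) for given rainfall and potential evaporation."""
    storage = storage0
    out = []
    for r, e in zip(rain_mm, pet_mm):
        storage += r                       # rainfall adds water
        if include_evap:
            storage -= min(e, storage)     # evaporation removes water, limited by storage
        storage = min(storage, capacity)   # excess drains away
        out.append(storage)
    return np.array(out)

rng = np.random.default_rng(0)
rain = rng.gamma(0.4, 10.0, size=365)      # synthetic daily rainfall (mm)
pet = np.full(365, 2.0)                    # constant potential evaporation (mm/day)

with_evap = run_bucket(rain, pet, include_evap=True)
without_evap = run_bucket(rain, pet, include_evap=False)
# Ignoring evaporation keeps the cover systematically wetter, which tends to
# produce more exceedances of a storage-based alarm threshold (false alarms).
print((without_evap - with_evap).mean())
```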
Reliability and Validity of the Evidence-Based Practice Confidence (EPIC) Scale
ERIC Educational Resources Information Center
Salbach, Nancy M.; Jaglal, Susan B.; Williams, Jack I.
2013-01-01
Introduction: The reliability, minimal detectable change (MDC), and construct validity of the evidence-based practice confidence (EPIC) scale were evaluated among physical therapists (PTs) in clinical practice. Methods: A longitudinal mail survey was conducted. Internal consistency and test-retest reliability were estimated using Cronbach's alpha…
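As a small illustration of the internal-consistency statistic named above, the sketch below computes Cronbach's alpha for a hypothetical matrix of item responses; it does not use the study's data or the EPIC scale's actual item set.

```python
# Cronbach's alpha for a hypothetical item-response matrix
# (rows = respondents, columns = scale items).
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: 2-D array, shape (n_respondents, n_items)."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)          # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)      # variance of the summed score
    return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

rng = np.random.default_rng(1)
true_confidence = rng.normal(size=(200, 1))
responses = true_confidence + 0.7 * rng.normal(size=(200, 10))  # 10 correlated items
print(round(cronbach_alpha(responses), 2))
```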
Spray combustion model improvement study, 1
NASA Technical Reports Server (NTRS)
Chen, C. P.; Kim, Y. M.; Shang, H. M.
1993-01-01
This study involves the development of numerical and physical models for spray combustion. These modeling efforts are mainly motivated by the need to improve the physical submodels of turbulence, combustion, atomization, dense spray effects, and group vaporization. The present mathematical formulation can be easily implemented in any time-marching multiple-pressure-correction methodology, such as the MAST code. The sequence of validation cases includes nonevaporating, evaporating, and burning dense sprays.
ERIC Educational Resources Information Center
Keegan, John P.; Chan, Fong; Ditchman, Nicole; Chiu, Chung-Yi
2012-01-01
The main objective of this study was to validate Pender's Health Promotion Model (HPM) as a motivational model for exercise/physical activity self-management among people with spinal cord injuries (SCIs). A quantitative descriptive research design using hierarchical regression analysis (HRA) was employed. A total of 126 individuals with SCI were recruited…
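An illustrative sketch of hierarchical regression analysis as named above: predictors are entered in blocks and the change in R-squared is examined. The file and variable names are hypothetical, not the study's measures.

```python
# Hierarchical regression sketch: demographic block first, then HPM constructs.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("sci_hpm_survey.csv")  # hypothetical data file

block1 = smf.ols("physical_activity ~ age + years_since_injury", data=df).fit()
block2 = smf.ols(
    "physical_activity ~ age + years_since_injury"
    " + perceived_benefits + perceived_barriers + self_efficacy",
    data=df,
).fit()

# Increment in R-squared attributable to the HPM constructs after
# controlling for demographic/injury variables:
print(block1.rsquared, block2.rsquared, block2.rsquared - block1.rsquared)
```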
NASA Astrophysics Data System (ADS)
Pascuet, M. I.; Castin, N.; Becquart, C. S.; Malerba, L.
2011-05-01
An atomistic kinetic Monte Carlo (AKMC) method has been applied to study the stability and mobility of copper-vacancy clusters in Fe. This information, which cannot be obtained directly from experimental measurements, is needed to parameterise models describing the nanostructure evolution under irradiation of Fe alloys (e.g. model alloys for reactor pressure vessel steels). The physical reliability of the AKMC method has been improved by employing artificial intelligence techniques for the regression of the activation energies required as input to the model. These energies are calculated with the nudged elastic band method, using an interatomic potential fitted to reproduce them as accurately as possible and allowing for the effects of local chemistry and atomic relaxation. The model was validated by comparison with available ab initio calculations, to verify the cohesive model used, as well as with other models and theories.
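A minimal sketch of the residence-time step at the core of a generic AKMC simulation: events are chosen with probability proportional to their Arrhenius rates and the clock advances stochastically. The barriers and attempt frequency below are placeholders; the model described above obtains its activation energies from NEB calculations and a trained regressor.

```python
# Residence-time (BKL) kinetic Monte Carlo step with placeholder barriers.
import math
import random

K_B = 8.617e-5   # Boltzmann constant, eV/K
NU0 = 6.0e12     # attempt frequency (1/s), typical order of magnitude

def akmc_step(activation_energies_eV, temperature_K, rng=random):
    """Pick one event and return (event_index, time_increment_s)."""
    rates = [NU0 * math.exp(-ea / (K_B * temperature_K)) for ea in activation_energies_eV]
    total = sum(rates)
    threshold = rng.random() * total
    chosen = len(rates) - 1
    acc = 0.0
    for i, r in enumerate(rates):          # choose event i with probability rate_i / total
        acc += r
        if acc >= threshold:
            chosen = i
            break
    dt = -math.log(rng.random()) / total   # stochastic residence time
    return chosen, dt

# Example: three possible vacancy jumps with placeholder barriers (eV) at 600 K
print(akmc_step([0.62, 0.65, 0.70], 600.0))
```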
A framework for the design and development of physical employment tests and standards.
Payne, W; Harvey, J
2010-07-01
Because operational tasks in the uniformed services (military, police, fire and emergency services) are physically demanding and incur a risk of injury, employment policy in these services is usually competency based and predicated on objective physical employment standards (PESs) derived from physical employment tests (PETs). In this paper, a comprehensive framework for the design of PETs and PESs is presented. Three broad approaches to physical employment testing are described and compared: generic predictive testing, task-related predictive testing, and task simulation testing. Techniques for selecting a set of tests with good coverage of job requirements, including job task analysis, physical demands analysis, and correlation analysis, are discussed. Regarding individual PETs, theoretical considerations, including measurability, discriminating power, reliability and validity, are examined alongside practical considerations, including development of protocols, resource requirements, administrative issues, and safety. With regard to the setting of PESs, criterion referencing and norm referencing are discussed. STATEMENT OF RELEVANCE: This paper presents an integrated and coherent framework for the development of PESs and hence provides a much-needed, theoretically based but practically oriented guide for organisations seeking to establish valid and defensible PESs.
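One small, hypothetical illustration of the correlation-analysis step in such a framework: candidate PETs are shortlisted by their correlation with a criterion task-simulation score. Test names, file, and threshold are assumptions for illustration only.

```python
# Shortlist candidate tests by correlation with a criterion task measure.
import pandas as pd

df = pd.read_csv("pet_trial_data.csv")  # hypothetical trial dataset
candidate_tests = ["beep_test", "grip_strength", "loaded_march", "lift_to_platform"]

correlations = df[candidate_tests].corrwith(df["criterion_task_time"]).abs()
shortlist = correlations[correlations >= 0.5].sort_values(ascending=False)
print(shortlist)  # tests with the strongest relationship to job performance
```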
Prototype of NASA's Global Precipitation Measurement Mission Ground Validation System
NASA Technical Reports Server (NTRS)
Schwaller, M. R.; Morris, K. R.; Petersen, W. A.
2007-01-01
NASA is developing a Ground Validation System (GVS) as one of its contributions to the Global Precipitation Mission (GPM). The GPM GVS provides an independent means for evaluation, diagnosis, and ultimately improvement of GPM spaceborne measurements and precipitation products. NASA's GPM GVS consists of three elements: field campaigns/physical validation, direct network validation, and modeling and simulation. The GVS prototype of direct network validation compares Tropical Rainfall Measuring Mission (TRMM) satellite-borne radar data to similar measurements from the U.S. national network of operational weather radars. A prototype field campaign has also been conducted; modeling and simulation prototypes are under consideration.
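A sketch of the direct network validation idea described above: matched satellite and ground-radar rain-rate pairs are compared to estimate bias and agreement. The file name and columns are hypothetical, not the GVS data format.

```python
# Compare matched satellite and ground-radar rain rates: bias, correlation, RMSE.
import numpy as np
import pandas as pd

pairs = pd.read_csv("matched_rainrates.csv")   # hypothetical columns: satellite_mm_h, ground_mm_h
sat = pairs["satellite_mm_h"].to_numpy()
gnd = pairs["ground_mm_h"].to_numpy()

bias = np.mean(sat - gnd)                      # mean difference (mm/h)
rel_bias = 100.0 * bias / np.mean(gnd)         # relative bias (%)
corr = np.corrcoef(sat, gnd)[0, 1]             # linear correlation
rmse = np.sqrt(np.mean((sat - gnd) ** 2))      # root-mean-square error
print(f"bias={bias:.2f} mm/h ({rel_bias:.1f}%), r={corr:.2f}, rmse={rmse:.2f} mm/h")
```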
2D/3D fetal cardiac dataset segmentation using a deformable model.
Dindoyal, Irving; Lambrou, Tryphon; Deng, Jing; Todd-Pokropek, Andrew
2011-07-01
To segment the fetal heart in order to facilitate 3D assessment of cardiac function and structure. Ultrasound acquisition typically results in drop-out artifacts of the chamber walls. The authors outline a level set deformable model to automatically delineate the small fetal cardiac chambers. The level set is penalized from growing into an adjacent cardiac compartment using a novel collision detection term. The region-based model allows simultaneous segmentation of all four cardiac chambers from a user-defined seed point placed in each chamber. The segmented boundaries are automatically penalized from intersecting at walls with signal dropout. Root mean square errors of the perpendicular distances between the algorithm's delineation and manual tracings are within 2 mm, which is less than 10% of the length of a typical fetal heart. Ejection fractions were determined from the 3D datasets. The algorithm was validated using a physical phantom, yielding volumes comparable to the physically determined values, with an error within 13%. This original work in fetal cardiac segmentation compares automatic and manual tracings with a physical phantom and also measures inter-observer variation.
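A sketch of the two quantitative checks mentioned above: an RMS distance between automatic and manual boundary points (approximated here by nearest-neighbour distances rather than true perpendiculars), and ejection fraction from segmented volumes. All inputs are hypothetical.

```python
# RMS boundary distance and ejection fraction from segmented volumes.
import numpy as np

def rms_boundary_distance(auto_pts: np.ndarray, manual_pts: np.ndarray) -> float:
    """auto_pts, manual_pts: (N, 3) arrays of boundary coordinates in mm."""
    # distance from each automatic point to its closest manual point
    d = np.linalg.norm(auto_pts[:, None, :] - manual_pts[None, :, :], axis=2)
    return float(np.sqrt(np.mean(d.min(axis=1) ** 2)))

def ejection_fraction(edv_ml: float, esv_ml: float) -> float:
    """End-diastolic and end-systolic chamber volumes -> ejection fraction (%)."""
    return 100.0 * (edv_ml - esv_ml) / edv_ml

rng = np.random.default_rng(2)
auto = rng.uniform(0.0, 20.0, size=(100, 3))               # synthetic boundary points (mm)
manual = auto + rng.normal(scale=0.5, size=auto.shape)     # slightly perturbed "manual" tracing
print(rms_boundary_distance(auto, manual))
print(ejection_fraction(2.4, 1.1))                         # placeholder chamber volumes (ml)
```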
Durán-Agüero, Samuel; Valdes-Badilla, Pablo; Godoy Cumillaf, Andrés; Herrera-Valenzuela, Tomás
2015-05-01
Chile is among the countries with the highest levels of overweight and obesity worldwide (66.7% of the Chilean population), and college students, considered a nutritionally vulnerable group, tend to swell these numbers. The objective was to associate fruit consumption with the nutritional status of Chilean university students in physical education. The study population consisted of all students of the School of Pedagogy in Physical Education at the Autonomous University of Chile, Temuco campus (n = 420). The sample included 239 students (56.9%), mostly men (76.5%), with a mean age of 21.5 ± 2.1 years. Each student's nutritional status was determined, and a validated eating-habits survey was applied. An association with fruit consumption (≥2 servings/day) was observed in the crude model (OR = 0.528; 0.288-0.965), adjusted Model 1 (OR = 0.496; 0.268-0.916), and adjusted Model 2 (OR = 0.495; 0.265-0.924). CONCLUSION: Consumption of ≥2 servings of fruit per day is a protective factor for a healthy BMI among Chilean university students in physical education. Copyright AULA MEDICA EDICIONES 2014. Published by AULA MEDICA. All rights reserved.