de Boer, Pieter T; Frederix, Geert W J; Feenstra, Talitha L; Vemer, Pepijn
2016-09-01
Transparent reporting of validation efforts of health economic models gives stakeholders better insight into the credibility of model outcomes. In this study we reviewed recently published studies on seasonal influenza and early breast cancer in order to gain insight into the reporting of model validation efforts in the overall health economic literature. A literature search was performed in Pubmed and Embase to retrieve health economic modelling studies published between 2008 and 2014. Reporting on model validation was evaluated by checking for the word 'validation', and by using AdViSHE (Assessment of the Validation Status of Health Economic decision models), a tool containing a structured list of relevant items for validation. Additionally, we contacted corresponding authors to ask whether more validation efforts were performed beyond those reported in the manuscripts. A total of 53 studies on seasonal influenza and 41 studies on early breast cancer were included in our review. The word 'validation' was used in 16 studies (30%) on seasonal influenza and 23 studies (56%) on early breast cancer; however, in a minority of studies, this referred to a model validation technique. Fifty-seven percent of seasonal influenza studies and 71% of early breast cancer studies reported one or more validation techniques. Cross-validation of study outcomes was found most often. A limited number of studies reported on model validation efforts, although good examples were identified. Author comments indicated that more validation techniques were performed than those reported in the manuscripts. Although validation is deemed important by many researchers, this is not reflected in the reporting habits of health economic modelling studies. Systematic reporting of validation efforts would be desirable to further enhance decision makers' confidence in health economic models and their outcomes.
[Psychometric properties of the French version of the Effort-Reward Imbalance model].
Niedhammer, I; Siegrist, J; Landre, M F; Goldberg, M; Leclerc, A
2000-10-01
Two main models are currently used to evaluate psychosocial factors at work: the Job Strain model developed by Karasek and the Effort-Reward Imbalance model. A French version of the first model has been validated for the dimensions of psychological demands and decision latitude. As regards the second one, which evaluates three dimensions (extrinsic effort, reward, and intrinsic effort), there are several versions in different languages, but until recently there was no validated French version. The objective of this study was to explore the psychometric properties of the French version of the Effort-Reward Imbalance model in terms of internal consistency, factorial validity, and discriminant validity. The present study was based on the GAZEL cohort and included the 10 174 subjects who were working at the French national electric and gas company (EDF-GDF) and answered the questionnaire in 1998. A French version of Effort-Reward Imbalance was included in this questionnaire. This version was obtained by a standard forward/backward translation procedure. Internal consistency was satisfactory for the three scales of extrinsic effort, reward, and intrinsic effort: Cronbach's alpha coefficients higher than 0.7 were observed. A one-factor solution was retained for the factor analysis of the scale of extrinsic effort. A three-factor solution was retained for the factor analysis of reward, and these dimensions were interpreted; the factor analysis of intrinsic effort, however, did not support the expected four-dimension structure. The analysis of discriminant validity displayed significant associations between measures of Effort-Reward Imbalance and the variables of sex, age, education level, and occupational grade. This study is the first one supporting satisfactory psychometric properties of the French version of the Effort-Reward Imbalance model. However, the factorial validity of intrinsic effort could be questioned. Furthermore, as most previous studies were based on male samples working in specific occupations, the present one is also one of the first to show strong associations between measures of this model and social class variables in a population of men and women employed in various occupations.
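For readers who want to see the internal-consistency check concretely, the snippet below computes Cronbach's alpha from an item-by-respondent score matrix. The scores are invented for illustration; the GAZEL questionnaire items themselves are not reproduced here.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for a (n_respondents, n_items) matrix of scores."""
    k = items.shape[1]                         # number of items in the scale
    item_vars = items.var(axis=0, ddof=1)      # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)  # variance of the scale total
    return k / (k - 1) * (1.0 - item_vars.sum() / total_var)

# Hypothetical example: 5 respondents answering a 4-item "extrinsic effort" scale.
scores = np.array([
    [3, 4, 3, 4],
    [2, 2, 3, 2],
    [4, 4, 4, 5],
    [1, 2, 1, 2],
    [3, 3, 4, 3],
])
print(f"alpha = {cronbach_alpha(scores):.2f}")  # values above 0.7 are usually deemed satisfactory
```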
Overview of Heat Addition and Efficiency Predictions for an Advanced Stirling Convertor
NASA Technical Reports Server (NTRS)
Wilson, Scott D.; Reid, Terry; Schifer, Nicholas; Briggs, Maxwell
2011-01-01
Past methods of predicting net heat input needed to be validated. The validation effort pursued several paths, including improving model inputs, using test hardware to provide validation data, and validating high-fidelity models. Validation test hardware provided a direct measurement of net heat input for comparison to predicted values. The predicted value of net heat input was 1.7 percent less than the measured value, and initial calculations of measurement uncertainty were 2.1 percent (under review). Lessons learned during the validation effort were incorporated into the convertor modeling approach, which improved predictions of convertor efficiency.
Predicting Pilot Error in NextGen: Pilot Performance Modeling and Validation Efforts
NASA Technical Reports Server (NTRS)
Wickens, Christopher; Sebok, Angelia; Gore, Brian; Hooey, Becky
2012-01-01
We review 25 articles presenting 5 general classes of computational models to predict pilot error. This more targeted review is placed within the context of the broader review of computational models of pilot cognition and performance, including such aspects as models of situation awareness or pilot-automation interaction. Particular emphasis is placed on the degree of validation of such models against empirical pilot data, and the relevance of the modeling and validation efforts to NextGen technology and procedures.
Further Studies into Synthetic Image Generation using CameoSim
2011-08-01
In preparation for the validation effort, a study of BRDF models has been completed, which includes the physical plausibility of the models and how measured data are used, covering the visible to shortwave infrared.
Highlights of Transient Plume Impingement Model Validation and Applications
NASA Technical Reports Server (NTRS)
Woronowicz, Michael
2011-01-01
This paper describes highlights of an ongoing validation effort conducted to assess the viability of applying a set of analytic point source transient free molecule equations to model behavior ranging from molecular effusion to rocket plumes. The validation effort includes encouraging comparisons to both steady and transient studies involving experimental data and direct simulation Monte Carlo results. Finally, this model is applied to describe features of two exotic transient scenarios involving NASA Goddard Space Flight Center satellite programs.
Model Checking Verification and Validation at JPL and the NASA Fairmont IV&V Facility
NASA Technical Reports Server (NTRS)
Schneider, Frank; Easterbrook, Steve; Callahan, Jack; Montgomery, Todd
1999-01-01
We show how a technology transfer effort was carried out. The successful use of model checking on a pilot JPL flight project demonstrates the usefulness and the efficacy of the approach. The pilot project was used to model a complex spacecraft controller. Software design and implementation validation were carried out successfully. To suggest future applications we also show how the implementation validation step can be automated. The effort was followed by the formal introduction of the modeling technique as a part of the JPL Quality Assurance process.
Validation of Model Forecasts of the Ambient Solar Wind
NASA Technical Reports Server (NTRS)
Macneice, P. J.; Hesse, M.; Kuznetsova, M. M.; Rastaetter, L.; Taktakishvili, A.
2009-01-01
Independent and automated validation is a vital step in the progression of models from the research community into operational forecasting use. In this paper we describe a program in development at the CCMC to provide just such a comprehensive validation for models of the ambient solar wind in the inner heliosphere. We have built upon previous efforts published in the community, sharpened their definitions, and completed a baseline study. We also provide first results from this program of the comparative performance of the MHD models available at the CCMC against that of the Wang-Sheeley-Arge (WSA) model. An important goal of this effort is to provide a consistent validation to all available models. Clearly exposing the relative strengths and weaknesses of the different models will enable forecasters to craft more reliable ensemble forecasting strategies. Models of the ambient solar wind are developing rapidly as a result of improvements in data supply, numerical techniques, and computing resources. It is anticipated that in the next five to ten years, the MHD based models will supplant semi-empirical potential based models such as the WSA model, as the best available forecast models. We anticipate that this validation effort will track this evolution and so assist policy makers in gauging the value of past and future investment in modeling support.
NASA Astrophysics Data System (ADS)
Glocer, A.; Rastätter, L.; Kuznetsova, M.; Pulkkinen, A.; Singer, H. J.; Balch, C.; Weimer, D.; Welling, D.; Wiltberger, M.; Raeder, J.; Weigel, R. S.; McCollough, J.; Wing, S.
2016-07-01
We present the latest result of a community-wide space weather model validation effort coordinated among the Community Coordinated Modeling Center (CCMC), NOAA Space Weather Prediction Center (SWPC), model developers, and the broader science community. Validation of geospace models is a critical activity for both building confidence in the science results produced by the models and in assessing the suitability of the models for transition to operations. Indeed, a primary motivation of this work is supporting NOAA/SWPC's effort to select a model or models to be transitioned into operations. Our validation efforts focus on the ability of the models to reproduce a regional index of geomagnetic disturbance, the local K-index. Our analysis includes six events representing a range of geomagnetic activity conditions and six geomagnetic observatories representing midlatitude and high-latitude locations. Contingency tables, skill scores, and distribution metrics are used for the quantitative analysis of model performance. We consider model performance on an event-by-event basis, aggregated over events, at specific station locations, and separated into high-latitude and midlatitude domains. A summary of results is presented in this report, and an online tool for detailed analysis is available at the CCMC.
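As an illustration of the contingency-table metrics mentioned above, the sketch below computes hits, misses, false alarms, correct nulls, and a Heidke skill score for threshold exceedance of a predicted versus observed local K-index. The values and threshold are invented; the CCMC analysis uses its own event set and online tooling.

```python
import numpy as np

# Hypothetical 3-hour local K-index values (observed vs. a model prediction).
observed  = np.array([2, 3, 5, 6, 4, 7, 3, 2, 5, 6])
predicted = np.array([3, 3, 4, 6, 5, 6, 2, 3, 5, 4])
threshold = 5                       # "disturbed" if K >= threshold

obs_event  = observed  >= threshold
pred_event = predicted >= threshold

# 2x2 contingency table entries.
hits          = np.sum(pred_event & obs_event)
false_alarms  = np.sum(pred_event & ~obs_event)
misses        = np.sum(~pred_event & obs_event)
correct_nulls = np.sum(~pred_event & ~obs_event)
n = hits + false_alarms + misses + correct_nulls

# Common skill metrics.
pod = hits / (hits + misses)                       # probability of detection
far = false_alarms / max(hits + false_alarms, 1)   # false alarm ratio
expected_correct = ((hits + misses) * (hits + false_alarms)
                    + (correct_nulls + misses) * (correct_nulls + false_alarms)) / n
hss = (hits + correct_nulls - expected_correct) / (n - expected_correct)  # Heidke skill score

print(f"POD={pod:.2f}  FAR={far:.2f}  HSS={hss:.2f}")
```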
AdViSHE: A Validation-Assessment Tool of Health-Economic Models for Decision Makers and Model Users.
Vemer, P; Corro Ramos, I; van Voorn, G A K; Al, M J; Feenstra, T L
2016-04-01
A trade-off exists between building confidence in health-economic (HE) decision models and the use of scarce resources. We aimed to create a practical tool providing model users with a structured view into the validation status of HE decision models, to address this trade-off. A Delphi panel was organized, and was completed by a workshop during an international conference. The proposed tool was constructed iteratively based on comments from, and the discussion amongst, panellists. During the Delphi process, comments were solicited on the importance and feasibility of possible validation techniques for modellers, their relevance for decision makers, and the overall structure and formulation in the tool. The panel consisted of 47 experts in HE modelling and HE decision making from various professional and international backgrounds. In addition, 50 discussants actively engaged in the discussion at the conference workshop and returned 19 questionnaires with additional comments. The final version consists of 13 items covering all relevant aspects of HE decision models: the conceptual model, the input data, the implemented software program, and the model outcomes. Assessment of the Validation Status of Health-Economic decision models (AdViSHE) is a validation-assessment tool in which model developers report in a systematic way both on validation efforts performed and on their outcomes. Subsequently, model users can establish whether confidence in the model is justified or whether additional validation efforts should be undertaken. In this way, AdViSHE enhances transparency of the validation status of HE models and supports efficient model validation.
Model-based verification and validation of the SMAP uplink processes
NASA Astrophysics Data System (ADS)
Khan, M. O.; Dubos, G. F.; Tirona, J.; Standley, S.
Model-Based Systems Engineering (MBSE) is being used increasingly within the spacecraft design community because of its benefits when compared to document-based approaches. As the complexity of projects expands dramatically with continually increasing computational power and technology infusion, the time and effort needed for verification and validation (V&V) increases geometrically. Using simulation to perform design validation with system-level models earlier in the life cycle stands to bridge the gap between design of the system (based on system-level requirements) and verifying those requirements/validating the system as a whole. This case study stands as an example of how a project can validate a system-level design earlier in the project life cycle than traditional V&V processes by using simulation on a system model. Specifically, this paper describes how simulation was added to a system model of the Soil Moisture Active-Passive (SMAP) mission's uplink process. Also discussed are the advantages and disadvantages of the methods employed and the lessons learned, which are intended to benefit future model-based and simulation-based development efforts.
Pretest information for a test to validate plume simulation procedures (FA-17)
NASA Technical Reports Server (NTRS)
Hair, L. M.
1978-01-01
The results of an effort to plan a final verification wind tunnel test to validate the recommended correlation parameters and application techniques were presented. The test planning effort was complete except for test site finalization and the associated coordination. Two suitable test sites were identified. Desired test conditions were shown. Subsequent sections of this report present the selected model and test site, instrumentation of this model, planned test operations, and some concluding remarks.
NASA Technical Reports Server (NTRS)
Chaput, Armand; Johns, Zachary; Hodges, Todd; Selfridge, Justin; Bevirt, Joeben; Ahuja, Vivek
2015-01-01
Advanced Concepts Modeling software validation, analysis, and design. This was a National Institute of Aerospace contract with many distinct components. Efforts ranged from software development and validation for structures and aerodynamics, through flight control development and aeropropulsive analysis, to UAV piloting services.
Bell, Cheryl; Johnston, Derek; Allan, Julia; Pollard, Beth; Johnston, Marie
2017-05-01
The Demand-Control (DC) and Effort-Reward Imbalance (ERI) models predict health in a work context. Self-report measures of the four key constructs (demand, control, effort, and reward) have been developed and it is important that these measures have good content validity uncontaminated by content from other constructs. We assessed relevance (whether items reflect the constructs) and representativeness (whether all aspects of the construct are assessed, and all items contribute to that assessment) across the instruments and items. Two studies examined fourteen demand/control items from the Job Content Questionnaire and seventeen effort/reward items from the Effort-Reward Imbalance measure using discriminant content validation and a third study developed new methods to assess instrument representativeness. Both methods use judges' ratings and construct definitions to get transparent quantitative estimates of construct validity. Study 1 used dictionary definitions while studies 2 and 3 used published phrases to define constructs. Overall, 3/5 demand items, 4/9 control items, 1/6 effort items, and 7/11 reward items were uniquely classified to the appropriate theoretical construct and were therefore 'pure' items with discriminant content validity (DCV). All pure items measured a defining phrase. However, both the DC and ERI assessment instruments failed to assess all defining aspects. Finding good discriminant content validity for demand and reward measures means these measures are usable and our quantitative results can guide item selection. By contrast, effort and control measures had limitations (in relevance and representativeness) presenting a challenge to the implementation of the theories. Statement of contribution What is already known on this subject? While the reliability and construct validity of Demand-Control and Effort-Reward-Imbalance (DC and ERI) work stress measures are routinely reported, there has not been adequate investigation of their content validity. This paper investigates their content validity in terms of both relevance and representativeness and provides a model for the investigation of content validity of measures in health psychology more generally. What does this study add? A new application of an existing method, discriminant content validity, and a new method of assessing instrument representativeness. 'Pure' DC and ERI items are identified, as are constructs that are not fully represented by their assessment instruments. The findings are important for studies attempting to distinguish between the main DC and ERI work stress constructs. The quantitative results can be used to guide item selection for future studies. © 2017 The British Psychological Society.
Efficient strategies for leave-one-out cross validation for genomic best linear unbiased prediction.
Cheng, Hao; Garrick, Dorian J; Fernando, Rohan L
2017-01-01
A random multiple-regression model that simultaneously fits all allele substitution effects for additive markers or haplotypes as uncorrelated random effects was proposed for Best Linear Unbiased Prediction, using whole-genome data. Leave-one-out cross validation can be used to quantify the predictive ability of a statistical model. Naive application of leave-one-out cross validation is computationally intensive because the training and validation analyses need to be repeated n times, once for each observation. Efficient leave-one-out cross validation strategies are presented here, requiring little more effort than a single analysis. The efficient strategy is 786 times faster than the naive application for a simulated dataset with 1,000 observations and 10,000 markers, and 99 times faster with 1,000 observations and 100 markers. These efficiencies relative to the naive approach using the same model will increase with increases in the number of observations. Efficient leave-one-out cross validation strategies are presented here, requiring little more effort than a single analysis.
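To make the "little more effort than a single analysis" idea concrete, the sketch below uses the standard linear-smoother identity (leave-one-out residual = ordinary residual divided by 1 - h_ii) for a ridge/GBLUP-style predictor, so all n leave-one-out predictions come from one fit. The data are simulated, and this is an illustrative analogue rather than the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(42)
n, p = 200, 1000                              # observations, markers
X = rng.choice([0.0, 1.0, 2.0], size=(n, p))  # simulated marker genotypes
y = X[:, :20] @ rng.normal(size=20) + rng.normal(size=n)
lam = 10.0                                    # shrinkage (residual/marker variance ratio)

# Single analysis: solve an n x n system instead of a p x p one, as in GBLUP.
G = X @ X.T
H = G @ np.linalg.inv(G + lam * np.eye(n))    # smoother matrix: y_hat = H y
y_hat = H @ y

# Leave-one-out residuals without refitting, via e_i / (1 - H_ii).
h = np.diag(H)
loo_residuals = (y - y_hat) / (1.0 - h)
y_loo = y - loo_residuals                     # LOO-CV predictions for each observation

print("LOO-CV predictive correlation:", round(np.corrcoef(y, y_loo)[0, 1], 3))
```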
The Determinants of Student Effort at Learning ERP: A Cultural Perspective
ERIC Educational Resources Information Center
Alshare, Khaled A.; El-Masri, Mazen; Lane, Peggy L.
2015-01-01
This paper develops a research model based on the Unified Theory of Acceptance and Use of Technology model (UTAUT) and Hofstede's cultural dimensions to explore factors that influence student effort at learning Enterprise Resource Planning (ERP) systems. A Structural Equation Model (SEM) using LISREL was utilized to validate the proposed research…
Li, Jian; Herr, Raphael M; Allen, Joanne; Stephens, Christine; Alpass, Fiona
2017-11-25
The objective of this study was to validate a short version of the Effort-Reward-Imbalance (ERI) questionnaire in the context of New Zealand among older full-time and part-time employees. Data were collected from 1694 adults aged 48-83 years (mean 60 years, 53% female) who reported being in full- or part-time paid employment in the 2010 wave of the New Zealand Health, Work and Retirement study. Scale reliability was evaluated by item-total correlations and Cronbach's alpha. Factorial validity was assessed using multi-group confirmatory factor analyses assessing nested models of configural, metric, scalar and strict invariance across full- and part-time employment groups. Logistic regressions estimated associations of effort-reward ratio and over-commitment with poor physical/mental health, and depressive symptoms. Internal consistency of ERI scales was high across employment groups: effort 0.78-0.76; reward 0.81-0.77, and over-commitment 0.83-0.80. The three-factor model displayed acceptable fit in the overall sample (χ²/df = 10.31; CFI = 0.95; TLI = 0.94; RMSEA = 0.075), and decrements in model fit indices provided evidence for strict invariance of the three-factor ERI model across full-time and part-time employment groups. High effort-reward ratio scores were consistently associated with poor mental health and depressive symptoms for both employment groups. High over-commitment was associated with poor mental health and depressive symptoms in both groups and also with poor physical health in the full-time employment group. The short ERI questionnaire appears to be a valid instrument to assess adverse psychosocial work characteristics in older full-time and part-time employees in New Zealand.
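For orientation, the effort-reward ratio referred to above is conventionally computed as the effort score divided by the reward score times a correction factor for the unequal number of items on the two scales. The sketch below uses invented scores and a hypothetical 3-effort/7-reward item split, plus an odds ratio from an invented 2x2 table, purely to show the arithmetic.

```python
import numpy as np

# Hypothetical item scores for one respondent (short ERI version assumed here:
# 3 effort items and 7 reward items; adjust to the instrument actually used).
effort_items = np.array([3, 4, 2])
reward_items = np.array([2, 3, 2, 1, 2, 3, 2])

c = len(effort_items) / len(reward_items)       # correction for unequal item counts
er_ratio = effort_items.sum() / (c * reward_items.sum())
print(f"effort-reward ratio = {er_ratio:.2f}")  # values > 1 indicate imbalance (high effort, low reward)

# Odds ratio for poor mental health by high ERI, from a hypothetical 2x2 table.
exposed_cases, exposed_noncases = 60, 140       # high effort-reward ratio
unexposed_cases, unexposed_noncases = 45, 455   # low effort-reward ratio
odds_ratio = (exposed_cases / exposed_noncases) / (unexposed_cases / unexposed_noncases)
print(f"odds ratio = {odds_ratio:.2f}")
```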
MODELS FOR SUBMARINE OUTFALL - VALIDATION AND PREDICTION UNCERTAINTIES
This address reports on some efforts to verify and validate dilution models, including those found in Visual Plumes. This is done in the context of problem experience: a range of problems, including different pollutants such as bacteria; scales, including near-field and far-field...
Goals and Status of the NASA Juncture Flow Experiment
NASA Technical Reports Server (NTRS)
Rumsey, Christopher L.; Morrison, Joseph H.
2016-01-01
The NASA Juncture Flow experiment is a new effort whose focus is attaining validation data in the juncture region of a wing-body configuration. The experiment is designed specifically for the purpose of CFD validation. Current turbulence models routinely employed by Reynolds-averaged Navier-Stokes CFD are inconsistent in their prediction of corner flow separation in aircraft juncture regions, so experimental data in the near-wall region of such a configuration will be useful both for assessment as well as for turbulence model improvement. This paper summarizes the Juncture Flow effort to date, including preliminary risk-reduction experiments already conducted and planned future experiments. The requirements and challenges associated with conducting a quality validation test are discussed.
Embedded measures of performance validity using verbal fluency tests in a clinical sample.
Sugarman, Michael A; Axelrod, Bradley N
2015-01-01
The objective of this study was to determine to what extent verbal fluency measures can be used as performance validity indicators during neuropsychological evaluation. Participants were clinically referred for neuropsychological evaluation in an urban Veterans Affairs hospital. Participants were placed into 2 groups based on their objectively evaluated effort on performance validity tests (PVTs). Individuals who exhibited credible performance (n = 431) failed 0 PVTs, and those with poor effort (n = 192) failed 2 or more PVTs. All participants completed the Controlled Oral Word Association Test (COWAT) and Animals verbal fluency measures. We evaluated how well verbal fluency scores could discriminate between the 2 groups. Raw scores and T scores for Animals discriminated between the credible-performance and poor-effort groups with 90% specificity and greater than 40% sensitivity. COWAT scores had lower sensitivity for detecting poor effort. Combining FAS and Animals scores in logistic regression models yielded acceptable group classification, with 90% specificity and greater than 44% sensitivity. Verbal fluency measures can yield adequate detection of poor effort during neuropsychological evaluation. We provide suggested cut points and logistic regression models for predicting the probability of poor effort in our clinical setting and offer suggested cutoff scores to optimize sensitivity and specificity.
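The cutoff logic described here (fix specificity near 90% in the credible-performance group, then report the sensitivity achieved in the poor-effort group) can be sketched as follows. The score distributions are simulated placeholders; the actual cut points come from the authors' clinical sample.

```python
import numpy as np

rng = np.random.default_rng(7)

# Simulated Animals fluency raw scores for the two groups (invented distributions).
credible = rng.normal(loc=18, scale=5, size=431)   # passed all PVTs
poor     = rng.normal(loc=11, scale=5, size=192)   # failed 2 or more PVTs

# Choose the cutoff as the 10th percentile of the credible group -> roughly 90% specificity.
cutoff = np.percentile(credible, 10)
specificity = np.mean(credible > cutoff)   # proportion of credible scores above the cutoff
sensitivity = np.mean(poor <= cutoff)      # proportion of poor-effort scores at or below the cutoff

print(f"cutoff={cutoff:.1f}  specificity={specificity:.2f}  sensitivity={sensitivity:.2f}")
```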
NASA Technical Reports Server (NTRS)
Tijidjian, Raffi P.
2010-01-01
The TEAMS model analyzer is a supporting tool developed to work with models created with TEAMS (Testability, Engineering, and Maintenance System), which was developed by QSI. In an effort to reduce the time spent in the manual process that each TEAMS modeler must perform when preparing reports for model reviews, a new tool has been developed as an aid for models developed in TEAMS. The software allows for the viewing, reporting, and checking of TEAMS models that are checked into the TEAMS model database. The software allows the user to selectively view the model in a hierarchical tree outline that displays the components, failure modes, and ports. The reporting features allow the user to quickly gather statistics about the model and generate an input/output report covering all of the components. Rules can be automatically validated against the model, with a report generated containing any resulting inconsistencies. In addition to reducing manual effort, this software also provides an automated process framework for the Verification and Validation (V&V) effort that will follow development of these models. The aid of such an automated tool would have a significant impact on the V&V process.
Injector Design Tool Improvements: User's manual for FDNS V.4.5
NASA Technical Reports Server (NTRS)
Chen, Yen-Sen; Shang, Huan-Min; Wei, Hong; Liu, Jiwen
1998-01-01
The major emphasis of the current effort is the development and validation of an efficient parallel-machine computational model, based on the FDNS code, to analyze the fluid dynamics of a wide variety of liquid jet configurations for general liquid rocket engine injection system applications. This model includes physical models for droplet atomization, breakup/coalescence, evaporation, turbulence mixing, and gas-phase combustion. Benchmark validation cases for liquid rocket engine chamber combustion conditions will be performed for model validation purposes. Test cases may include shear coaxial, swirl coaxial, and impinging injection systems with combinations of LOX/H2 or LOX/RP-1 propellant injector elements used in rocket engine designs. As a final goal of this project, a well-tested parallel CFD performance methodology, together with a user's operation description, will be documented in a final technical report at the end of the proposed research effort.
Validation of PV-RPM Code in the System Advisor Model.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Klise, Geoffrey Taylor; Lavrova, Olga; Freeman, Janine
2017-04-01
This paper describes efforts made by Sandia National Laboratories (SNL) and the National Renewable Energy Laboratory (NREL) to validate the SNL-developed PV Reliability Performance Model (PV-RPM) algorithm as implemented in the NREL System Advisor Model (SAM). The PV-RPM model is a library of functions that estimates component failure and repair in a photovoltaic system over a desired simulation period. The failure and repair distributions in this paper are probabilistic representations of component failure and repair based on data collected by SNL for a PV power plant operating in Arizona. The validation effort focuses on whether the failure and repair distributions used in the SAM implementation result in estimated failures that match the expected failures developed in the proof-of-concept implementation. Results indicate that the SAM implementation of PV-RPM provides the same results as the proof-of-concept implementation, indicating the algorithms were reproduced successfully.
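Conceptually, a reliability-performance model of this kind alternates sampled times-to-failure and repair durations over the simulation period. The sketch below is a generic Monte Carlo illustration with placeholder Weibull and exponential distributions, not the SNL-fitted PV-RPM distributions.

```python
import numpy as np

rng = np.random.default_rng(1)
sim_years = 25.0
n_components = 1000                    # e.g., inverters or combiner boxes in the plant

def simulate_component(rng):
    """Count failures for one component over the simulation period."""
    t, failures = 0.0, 0
    while True:
        time_to_failure = rng.weibull(1.5) * 12.0    # placeholder Weibull failure model (years)
        t += time_to_failure
        if t >= sim_years:
            return failures
        failures += 1
        repair_time = rng.exponential(14.0) / 365.0  # placeholder: mean 14-day repair, in years
        t += repair_time

counts = np.array([simulate_component(rng) for _ in range(n_components)])
print(f"mean failures per component over {sim_years:.0f} years: {counts.mean():.2f}")
```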
Integrated Resilient Aircraft Control Project Full Scale Flight Validation
NASA Technical Reports Server (NTRS)
Bosworth, John T.
2009-01-01
Objective: Provide validation of adaptive control law concepts through full-scale flight evaluation. Technical approach: (a) engage a failure mode (destabilizing or frozen surface); (b) perform formation flight and air-to-air tracking tasks. Evaluate the adaptive algorithm using (a) stability metrics and (b) model-following metrics. Full-scale flight testing provides an ability to validate different adaptive flight control approaches. Full-scale flight testing adds credence to NASA's research efforts. A sustained research effort is required to remove the road blocks and provide adaptive control as a viable design solution for increased aircraft resilience.
Model Validation Against The Modelers’ Data Archive
2014-08-01
completion of the planned Jack Rabbit 2 field trials. The relevant task for the effort addressed here is Task 4 of the current Interagency Agreement, as...readily simulates the Prairie Grass sulfur dioxide plumes. Also, Jack Rabbit II field trials are set to be completed during FY16. Once these data are...available, they will also be used to validate the combined models. This validation may prove to be more useful, as the Jack Rabbit II will release
NASA Technical Reports Server (NTRS)
Ferzali, Wassim; Zacharakis, Vassilis; Upadhyay, Triveni; Weed, Dennis; Burke, Gregory
1995-01-01
The ICAO Aeronautical Mobile Communications Panel (AMCP) completed the drafting of the Aeronautical Mobile Satellite Service (AMSS) Standards and Recommended Practices (SARPs) and the associated Guidance Material and submitted these documents to the ICAO Air Navigation Commission (ANC) for ratification in May 1994. This effort encompassed an extensive, multi-national SARPs validation. As part of this activity, the US Federal Aviation Administration (FAA) sponsored an effort to validate the SARPs via computer simulation. This paper provides a description of this effort. Specifically, it describes: (1) the approach selected for the creation of a high-fidelity AMSS computer model; (2) the test traffic generation scenarios; and (3) the resultant AMSS performance assessment. More recently, the AMSS computer model was also used to provide AMSS performance statistics in support of the RTCA standardization activities. This paper describes this effort as well.
Solar Sail Models and Test Measurements Correspondence for Validation Requirements Definition
NASA Technical Reports Server (NTRS)
Ewing, Anthony; Adams, Charles
2004-01-01
Solar sails are being developed as a mission-enabling technology in support of future NASA science missions. Current efforts have advanced solar sail technology sufficiently to justify a flight validation program. A primary objective of this activity is to test and validate solar sail models that are currently under development so that they may be used with confidence in future science mission development (e.g., scalable to larger sails). Both system and model validation requirements must be defined early in the program to guide design cycles and to ensure that relevant and sufficient test data will be obtained to conduct model validation to the level required. A process of model identification, model input/output documentation, model sensitivity analyses, and test measurement correspondence is required so that decisions can be made to satisfy validation requirements within program constraints.
NASA Technical Reports Server (NTRS)
Westra, Doug G.; West, Jeffrey S.; Richardson, Brian R.
2015-01-01
Historically, the analysis and design of liquid rocket engines (LREs) has relied on full-scale testing and one-dimensional empirical tools. The testing is extremely expensive, and the one-dimensional tools are not designed to capture the highly complex and multi-dimensional features that are inherent to LREs. Recent advances in computational fluid dynamics (CFD) tools have made it possible to predict liquid rocket engine performance and stability, to assess the effect of complex flow features, and to evaluate injector-driven thermal environments, thereby mitigating the cost of testing. Extensive efforts to verify and validate these CFD tools have been conducted, to provide confidence for using them during the design cycle. Previous validation efforts have documented comparisons of predicted heat flux thermal environments with test data for a single-element gaseous oxygen (GO2) and gaseous hydrogen (GH2) injector. The most notable was a comprehensive validation effort conducted by Tucker et al. [1], in which a number of different groups modeled a GO2/GH2 single-element configuration by Pal et al. [2]. The tools used for this validation comparison employed a range of algorithms, from both steady and unsteady Reynolds-averaged Navier-Stokes (U/RANS) calculations, large-eddy simulations (LES), detached eddy simulations (DES), and various combinations. A more recent effort by Thakur et al. [3] focused on using a state-of-the-art CFD simulation tool, Loci/STREAM, on a two-dimensional grid. Loci/STREAM was chosen because it has a unique, very efficient flamelet parameterization of combustion reactions that are too computationally expensive to simulate with conventional finite-rate chemistry calculations. The current effort focuses on further advancement of validation efforts, again using the Loci/STREAM tool with the flamelet parameterization, but this time with a three-dimensional grid. Comparisons to the Pal et al. heat flux data will be made for both RANS and hybrid RANS-LES/detached eddy simulations (DES). Computational costs will be reported, along with a comparison of accuracy and cost to much less expensive two-dimensional RANS simulations of the same geometry.
A Year-Long Comparison of GPS TEC and Global Ionosphere-Thermosphere Models
NASA Astrophysics Data System (ADS)
Perlongo, N. J.; Ridley, A. J.; Cnossen, I.; Wu, C.
2018-02-01
The prevalence of GPS total electron content (TEC) observations has provided an opportunity for extensive global ionosphere-thermosphere model validation efforts. This study presents a year-long data-model comparison using the Global Ionosphere-Thermosphere Model (GITM) and the Thermosphere-Ionosphere-Electrodynamics General Circulation Model (TIE-GCM). For the entire year of 2010, each model was run and compared to GPS TEC observations. The results were binned according to season, latitude, local time, and magnetic local time. GITM was found to overestimate the TEC everywhere, except on the midlatitude nightside, due to high O/N2 ratios. TIE-GCM produced much less TEC and had lower O/N2 ratios and neutral wind speeds. Seasonal and regional biases in the models are discussed along with ideas for model improvements and further validation efforts.
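The bookkeeping behind such a binned data-model comparison is essentially a grouped bias/RMSE calculation. A minimal pandas sketch with made-up TEC values (not GITM or TIE-GCM output) follows.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(3)
n = 10_000

# Hypothetical paired samples: observed GPS TEC and a model's TEC at the same time/place.
df = pd.DataFrame({
    "month": rng.integers(1, 13, n),
    "lat": rng.uniform(-90, 90, n),
    "local_time": rng.uniform(0, 24, n),
    "tec_obs": rng.gamma(4.0, 5.0, n),
})
df["tec_model"] = df["tec_obs"] * 1.2 + rng.normal(0, 3, n)   # a model with a high bias

# Bin by season, latitude band, and local time, then compute bias and RMSE per bin.
df["season"] = pd.cut(df["month"], bins=[0, 3, 6, 9, 12], labels=["JFM", "AMJ", "JAS", "OND"])
df["lat_band"] = pd.cut(df["lat"], bins=[-90, -60, -30, 30, 60, 90])
df["lt_bin"] = pd.cut(df["local_time"], bins=np.arange(0, 25, 6))
df["error"] = df["tec_model"] - df["tec_obs"]
df["sq_error"] = df["error"] ** 2

stats = df.groupby(["season", "lat_band", "lt_bin"], observed=True).agg(
    bias=("error", "mean"), mse=("sq_error", "mean"), n=("error", "size"))
stats["rmse"] = np.sqrt(stats["mse"])
print(stats.drop(columns="mse").head())
```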
NASA Technical Reports Server (NTRS)
Bond, Barbara J.; Peterson, David L.
1999-01-01
This project was a collaborative effort by researchers at ARC, OSU and the University of Arizona. The goal was to use a dataset obtained from a previous study to "empirically validate a new canopy radiative-transfer model (SART) which incorporates a recently-developed leaf-level model (LEAFMOD)". The document includes a short research summary.
A Structural Equation Modelling of the Academic Self-Concept Scale
ERIC Educational Resources Information Center
Matovu, Musa
2014-01-01
The study aimed at validating the academic self-concept scale by Liu and Wang (2005) in measuring academic self-concept among university students. Structural equation modelling was used to validate the scale which was composed of two subscales; academic confidence and academic effort. The study was conducted on university students; males and…
Experiences Using Formal Methods for Requirements Modeling
NASA Technical Reports Server (NTRS)
Easterbrook, Steve; Lutz, Robyn; Covington, Rick; Kelly, John; Ampo, Yoko; Hamilton, David
1996-01-01
This paper describes three cases studies in the lightweight application of formal methods to requirements modeling for spacecraft fault protection systems. The case studies differ from previously reported applications of formal methods in that formal methods were applied very early in the requirements engineering process, to validate the evolving requirements. The results were fed back into the projects, to improve the informal specifications. For each case study, we describe what methods were applied, how they were applied, how much effort was involved, and what the findings were. In all three cases, the formal modeling provided a cost effective enhancement of the existing verification and validation processes. We conclude that the benefits gained from early modeling of unstable requirements more than outweigh the effort needed to maintain multiple representations.
Supersonic Combustion Research at NASA
NASA Technical Reports Server (NTRS)
Drummond, J. P.; Danehy, Paul M.; Gaffney, Richard L., Jr.; Tedder, Sarah A.; Cutler, Andrew D.; Bivolaru, Daniel
2007-01-01
This paper discusses the progress of work to model high-speed supersonic reacting flow. The purpose of the work is to improve the state of the art of CFD capabilities for predicting the flow in high-speed propulsion systems, particularly combustor flowpaths. The program has several components including the development of advanced algorithms and models for simulating engine flowpaths as well as a fundamental experimental and diagnostic development effort to support the formulation and validation of the mathematical models. The paper will provide details of current work on experiments that will provide data for the modeling efforts along with the associated nonintrusive diagnostics used to collect the data from the experimental flowfield. Simulation of a recent experiment to partially validate the accuracy of a combustion code is also described.
Modeling Combustion in Supersonic Flows
NASA Technical Reports Server (NTRS)
Drummond, J. Philip; Danehy, Paul M.; Bivolaru, Daniel; Gaffney, Richard L.; Tedder, Sarah A.; Cutler, Andrew D.
2007-01-01
This paper discusses the progress of work to model high-speed supersonic reacting flow. The purpose of the work is to improve the state of the art of CFD capabilities for predicting the flow in high-speed propulsion systems, particularly combustor flow-paths. The program has several components including the development of advanced algorithms and models for simulating engine flowpaths as well as a fundamental experimental and diagnostic development effort to support the formulation and validation of the mathematical models. The paper will provide details of current work on experiments that will provide data for the modeling efforts along with the associated nonintrusive diagnostics used to collect the data from the experimental flowfield. Simulation of a recent experiment to partially validate the accuracy of a combustion code is also described.
Helicopter simulation validation using flight data
NASA Technical Reports Server (NTRS)
Key, D. L.; Hansen, R. S.; Cleveland, W. B.; Abbott, W. Y.
1982-01-01
A joint NASA/Army effort to perform a systematic ground-based piloted simulation validation assessment is described. The best available mathematical model for the subject helicopter (UH-60A Black Hawk) was programmed for real-time operation. Flight data were obtained to validate the math model, and to develop models for the pilot control strategy while performing mission-type tasks. The validated math model is to be combined with motion and visual systems to perform ground based simulation. Comparisons of the control strategy obtained in flight with that obtained on the simulator are to be used as the basis for assessing the fidelity of the results obtained in the simulator.
William H. Cooke; Andrew J. Hartsell
2000-01-01
Wall-to-wall Landsat TM classification efforts in Georgia require field validation. Validation using FIA data was tested by developing a new crown modeling procedure. A methodology is under development at the Southern Research Station to model crown diameter using Forest Health Monitoring data. These models are used to simulate the proportion of tree crowns that...
Model Calibration Efforts for the International Space Station's Solar Array Mast
NASA Technical Reports Server (NTRS)
Elliott, Kenny B.; Horta, Lucas G.; Templeton, Justin D.; Knight, Norman F., Jr.
2012-01-01
The International Space Station (ISS) relies on sixteen solar-voltaic blankets to provide electrical power to the station. Each pair of blankets is supported by a deployable boom called the Folding Articulated Square Truss Mast (FAST Mast). At certain ISS attitudes, the solar arrays can be positioned in such a way that shadowing of either one or three longerons causes an unexpected asymmetric thermal loading that, if unchecked, can exceed the operational stability limits of the mast. Work in this paper documents part of an independent NASA Engineering and Safety Center effort to assess the existing operational limits. Because of the complexity of the system, the problem is being worked using a building-block progression from components (longerons), to units (single or multiple bays), to assembly (full mast). The paper presents results from efforts to calibrate the longeron components. The work includes experimental testing of two types of longerons (straight and tapered), development of Finite Element (FE) models, development of parameter uncertainty models, and the establishment of a calibration and validation process to demonstrate adequacy of the models. Models in the context of this paper refer to both FE models and probabilistic parameter models. Results from model calibration of the straight longerons show that the model is capable of predicting the mean load, axial strain, and bending strain. For validation, parameter values obtained from calibration of straight longerons are used to validate experimental results for the tapered longerons.
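Calibration in this building-block sense means adjusting uncertain model parameters until predicted responses match measured ones. The sketch below fits a single stiffness-like parameter of a toy load-strain model to synthetic test data with scipy; it is a schematic of the calibration step only, not the NESC finite element workflow.

```python
import numpy as np
from scipy.optimize import least_squares

# Synthetic "test" data: axial load (N) vs. measured strain (microstrain) for a longeron.
load = np.linspace(0.0, 4000.0, 20)
true_stiffness = 2.6e6   # hypothetical EA-like parameter used only to generate the data
strain_meas = load / true_stiffness * 1e6 + np.random.default_rng(0).normal(0, 5, load.size)

def model_strain(stiffness, load):
    """Toy linear load-strain model standing in for the FE prediction."""
    return load / stiffness * 1e6

def residuals(theta):
    # Difference between model prediction and measurement for the current parameter guess.
    return model_strain(theta[0], load) - strain_meas

fit = least_squares(residuals, x0=[1.0e6])   # start from a deliberately poor guess
print(f"calibrated stiffness: {fit.x[0]:.3e} (value used to generate data: {true_stiffness:.3e})")
```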
The Construct of the Learning Organization: Dimensions, Measurement, and Validation
ERIC Educational Resources Information Center
Yang, Baiyin; Watkins, Karen E.; Marsick, Victoria J.
2004-01-01
This research describes efforts to develop and validate a multidimensional measure of the learning organization. An instrument was developed based on a critical review of both the conceptualization and practice of this construct. Supporting validity evidence for the instrument was obtained from several sources, including best model-data fit among…
Li, Jian; Loerbroks, Adrian; Jarczok, Marc N; Schöllgen, Ina; Bosch, Jos A; Mauss, Daniel; Siegrist, Johannes; Fischer, Joachim E
2012-09-01
We test the psychometric properties of a short version of the Effort-Reward Imbalance (ERI) questionnaire in addition to testing an interaction term of this model's main components on health functioning. A self-administered survey was conducted in a sample of 2,738 industrial workers (77% men, mean age 41.6 years) from a large manufacturing company in Southern Germany. The internal consistency reliability, structural validity, and criterion validity were analyzed. Satisfactory internal consistencies of the three scales, "effort", "reward", and "overcommitment", were obtained (Cronbach's alpha coefficients 0.77, 0.82, and 0.83, respectively). Confirmatory factor analysis showed a good model fit of the data with the theoretical structure (AGFI = 0.94, RMSEA = 0.060). Evidence of criterion validity was demonstrated. Importantly, a significant synergistic interaction effect of ERI and overcommitment on poor mental health functioning was observed (odds ratio 6.74 (95% CI 5.32-8.52); synergy index 1.78 (95% CI 1.25-2.55)). This short version of the ERI questionnaire is a reliable and valid tool for epidemiological research on occupational health. Copyright © 2012 Wiley Periodicals, Inc.
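The synergy index quoted here is the standard additive-interaction measure S = (OR_joint - 1) / [(OR_A - 1) + (OR_B - 1)]. In the sketch below, only the joint odds ratio of 6.74 is taken from the abstract; the two single-exposure odds ratios are hypothetical values chosen so that the computed index reproduces the reported 1.78.

```python
def synergy_index(or_joint: float, or_a: float, or_b: float) -> float:
    """Rothman's synergy index for additive interaction of two exposures."""
    return (or_joint - 1.0) / ((or_a - 1.0) + (or_b - 1.0))

or_joint = 6.74   # ERI and overcommitment combined (from the abstract)
or_eri = 2.80     # hypothetical: ERI alone
or_oc = 2.42      # hypothetical: overcommitment alone
print(f"synergy index = {synergy_index(or_joint, or_eri, or_oc):.2f}")  # -> 1.78
```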
Empirical validation of an agent-based model of wood markets in Switzerland
Hilty, Lorenz M.; Lemm, Renato; Thees, Oliver
2018-01-01
We present an agent-based model of wood markets and show our efforts to validate this model using empirical data from different sources, including interviews, workshops, experiments, and official statistics. Our own surveys closed gaps where data were not available. Our approach to model validation used a variety of techniques, including the replication of historical production amounts, prices, and survey results, as well as a historical case study of a large sawmill entering the market and becoming insolvent only a few years later. Validating the model using this case provided additional insights, showing how the model can be used to simulate scenarios of resource availability and resource allocation. We conclude that the outcome of the rigorous validation qualifies the model to simulate scenarios concerning resource availability and allocation in our study region. PMID:29351300
ERIC Educational Resources Information Center
Awang-Hashim, Rosa; O'Neil, Harold F., Jr.; Hocevar, Dennis
2002-01-01
The relations between motivational constructs, effort, self-efficacy and worry, and statistics achievement were investigated in a sample of 360 undergraduates in Malaysia. Both trait (cross-situational) and state (task-specific) measures of each construct were used to test a mediational trait → state → performance (TSP) model. As hypothesized,…
Collection of Calibration and Validation Data for An Airport Landside Dynamic Simulation Model
DOT National Transportation Integrated Search
1980-04-01
The report summarizes the airport data collection procedures employed to obtain the necessary calibration and validation information. The preparation for the data collection effort is explained. A description is presented of the initial work tasks, w...
Towards improved capability and confidence in coupled atmospheric and wildland fire modeling
NASA Astrophysics Data System (ADS)
Sauer, Jeremy A.
This dissertation work is aimed at improving the capability and confidence in a modernized and improved version of Los Alamos National Laboratory's coupled atmospheric and wildland fire dynamics model, Higrad-Firetec. Higrad is the hydrodynamics component of this large eddy simulation model that solves the three-dimensional, fully compressible Navier-Stokes equations, incorporating a dynamic eddy viscosity formulation through a two-scale turbulence closure scheme. Firetec is the vegetation, drag forcing, and combustion physics portion that is integrated with Higrad. The modern version of Higrad-Firetec incorporates multiple numerical methodologies and high performance computing aspects which combine to yield a unique tool capable of augmenting theoretical and observational investigations in order to better understand the multi-scale, multi-phase, and multi-physics phenomena involved in coupled atmospheric and environmental dynamics. More specifically, the current work includes extended functionality and validation efforts targeting component processes in coupled atmospheric and wildland fire scenarios. Since observational data of sufficient quality and resolution to validate the fully coupled atmosphere-wildfire scenario simply do not exist, we instead seek to validate components of the full, prohibitively convoluted process. This manuscript provides, first, an introduction and background into the application space of Higrad-Firetec. Second, we document the model formulation, solution procedure, and a simple scalar transport verification exercise. Third, we validate model results against observational data for time-averaged flow field metrics in and above four idealized forest canopies. Fourth, we carry out a validation effort for the non-buoyant jet in a crossflow scenario (to which an analogy can be made for atmosphere-wildfire interactions), comparing model results to laboratory data for both steady-in-time and unsteady-in-time metrics. Finally, an extension of the model's multi-phase physics is implemented, allowing for the representation of multiple collocated fuels as separately evolving constituents, leading to differences in the resulting rate of spread and total burned area. In combination, these efforts demonstrate improved capability, increased validation of component functionality, and unique applicability of the Higrad-Firetec modeling framework. As a result, this work provides a substantially more robust foundation for future, more widely acceptable investigations into the complexities of coupled atmospheric and wildland fire behavior.
Modeling the Effects of Argument Length and Validity on Inductive and Deductive Reasoning
ERIC Educational Resources Information Center
Rotello, Caren M.; Heit, Evan
2009-01-01
In an effort to assess models of inductive reasoning and deductive reasoning, the authors, in 3 experiments, examined the effects of argument length and logical validity on evaluation of arguments. In Experiments 1a and 1b, participants were given either induction or deduction instructions for a common set of stimuli. Two distinct effects were…
Overview of Heat Addition and Efficiency Predictions for an Advanced Stirling Convertor
NASA Technical Reports Server (NTRS)
Wilson, Scott D.; Reid, Terry V.; Schifer, Nicholas A.; Briggs, Maxwell H.
2012-01-01
The U.S. Department of Energy (DOE) and Lockheed Martin Space Systems Company (LMSSC) have been developing the Advanced Stirling Radioisotope Generator (ASRG) for use as a power system for space science missions. This generator would use two high-efficiency Advanced Stirling Convertors (ASCs), developed by Sunpower Inc. and NASA Glenn Research Center (GRC). The ASCs convert thermal energy from a radioisotope heat source into electricity. As part of ground testing of these ASCs, different operating conditions are used to simulate expected mission conditions. These conditions require achieving a particular operating frequency, hot end and cold end temperatures, and specified electrical power output for a given net heat input. Microporous bulk insulation is used in the ground support test hardware to minimize the loss of thermal energy from the electric heat source to the environment. The insulation package is characterized before operation to predict how much heat will be absorbed by the convertor and how much will be lost to the environment during operation. In an effort to validate these predictions, numerous tasks have been performed, which provided a more accurate value for net heat input into the ASCs. This test and modeling effort included: (a) making thermophysical property measurements of test setup materials to provide inputs to the numerical models, (b) acquiring additional test data that was collected during convertor tests to provide numerical models with temperature profiles of the test setup via thermocouple and infrared measurements, (c) using multidimensional numerical models (computational fluid dynamics code) to predict net heat input of an operating convertor, and (d) using validation test hardware to provide direct comparison of numerical results and validate the multidimensional numerical models used to predict convertor net heat input. This effort produced high fidelity ASC net heat input predictions, which were successfully validated using specially designed test hardware enabling measurement of heat transferred through a simulated Stirling cycle. The overall effort and results are discussed.
Some guidance on preparing validation plans for the DART Full System Models.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gray, Genetha Anne; Hough, Patricia Diane; Hills, Richard Guy
2009-03-01
Planning is an important part of computational model verification and validation (V&V) and the requisite planning document is vital for effectively executing the plan. The document provides a means of communicating intent to the typically large group of people, from program management to analysts to test engineers, who must work together to complete the validation activities. This report provides guidelines for writing a validation plan. It describes the components of such a plan and includes important references and resources. While the initial target audience is the DART Full System Model teams in the nuclear weapons program, the guidelines are generally applicable to other modeling efforts. Our goal in writing this document is to provide a framework for consistency in validation plans across weapon systems, different types of models, and different scenarios. Specific details contained in any given validation plan will vary according to application requirements and available resources.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, Kandler A; Santhanagopalan, Shriram; Yang, Chuanbo
Computer models are helping to accelerate the design and validation of next-generation batteries and provide valuable insights not possible through experimental testing alone. Validated 3-D physics-based models exist for predicting the electrochemical performance and the thermal and mechanical response of cells and packs under normal and abuse scenarios. The talk describes present efforts to make the models better suited for engineering design, including improving their computational speed, developing faster processes for model parameter identification (including under aging), and predicting the performance of a proposed electrode material recipe a priori using microstructure models.
Toward Paradoxical Inconsistency in Electrostatics of Metallic Conductors
Current developments show a clear trend toward more serious efforts in validation and verification (V&V) of physical and engineering models... Naturally, when dealing with fundamental problems, the V&V effort should include careful exploration and, if necessary, revision of the fundamentals underlying the physics. With this understanding in mind, we review some fundamentals of the models of crystalline electric conductors and find a...
Validation of a short measure of effort-reward imbalance in the workplace: evidence from China.
Li, Jian; Loerbroks, Adrian; Shang, Li; Wege, Natalia; Wahrendorf, Morten; Siegrist, Johannes
2012-01-01
Work stress is an emergent risk in occupational health in China, and its measurement is still a critical issue. The aim of this study was to examine the reliability and validity of a short version of the effort-reward imbalance (ERI) questionnaire in a sample of Chinese workers. A community-based survey was conducted in 1,916 subjects aged 30-65 years with paid employment (971 men and 945 women). Acceptable internal consistencies of the three scales, effort, reward and overcommitment, were obtained. Confirmatory factor analysis showed a good model fit of the data with the theoretical structure (goodness-of-fit index = 0.95). Evidence of criterion validity was demonstrated, as all three scales were independently associated with elevated odds ratios of both poor physical and mental health. Based on the findings of our study, this short version of the ERI questionnaire is considered to be a reliable and valid tool for measuring psychosocial work environment in Chinese working populations.
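For readers unfamiliar with the internal-consistency check reported above, the following minimal sketch computes Cronbach's alpha for one scale from an item-response matrix; the six-item data are synthetic, not the ERI survey responses.

```python
# Minimal sketch of an internal-consistency check (Cronbach's alpha) for a
# questionnaire scale; the item responses here are synthetic, not ERI data.
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: respondents x items matrix of scores."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1.0 - item_vars / total_var)

rng = np.random.default_rng(0)
latent = rng.normal(size=(500, 1))                  # shared factor drives correlated items
items = latent + 0.8 * rng.normal(size=(500, 6))    # six items on one hypothetical scale
print(f"alpha = {cronbach_alpha(items):.2f}")       # values above 0.7 are usually called acceptable
```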
Executable Architecture Research at Old Dominion University
NASA Technical Reports Server (NTRS)
Tolk, Andreas; Shuman, Edwin A.; Garcia, Johnny J.
2011-01-01
Executable Architectures allow the evaluation of system architectures not only regarding their static, but also their dynamic behavior. However, the systems engineering community does not agree on a common formal specification of executable architectures. Closing this gap by identifying the necessary elements of an executable architecture, a modeling language, and a modeling formalism is the topic of ongoing PhD research. In addition, systems are generally defined and applied in an operational context to provide capabilities and enable missions. To maximize the benefits of executable architectures, a second PhD effort introduces the idea of creating an executable context in addition to the executable architecture. The results move the validation of architectures from the current information domain into the knowledge domain and improve the reliability of such validation efforts. The paper presents research and results of both doctoral research efforts and puts them into the common context of state-of-the-art systems engineering methods supporting more agility.
ERIC Educational Resources Information Center
Gwaltney, Kevin Dale
2012-01-01
This effort: 1) establishes an autonomy definition uniquely tailored for teaching, 2) validates a nationally generalizable teacher autonomy construct, 3) demonstrates that the model describes and explains the autonomy levels of particular teacher groups, and 4) verifies the construct can represent teacher autonomy in other empirical models. The…
Experiences Using Lightweight Formal Methods for Requirements Modeling
NASA Technical Reports Server (NTRS)
Easterbrook, Steve; Lutz, Robyn; Covington, Rick; Kelly, John; Ampo, Yoko; Hamilton, David
1997-01-01
This paper describes three case studies in the lightweight application of formal methods to requirements modeling for spacecraft fault protection systems. The case studies differ from previously reported applications of formal methods in that formal methods were applied very early in the requirements engineering process, to validate the evolving requirements. The results were fed back into the projects, to improve the informal specifications. For each case study, we describe what methods were applied, how they were applied, how much effort was involved, and what the findings were. In all three cases, formal methods enhanced the existing verification and validation processes, by testing key properties of the evolving requirements, and helping to identify weaknesses. We conclude that the benefits gained from early modeling of unstable requirements more than outweigh the effort needed to maintain multiple representations.
Mulhearn, Tyler J; Watts, Logan L; Todd, E Michelle; Medeiros, Kelsey E; Connelly, Shane; Mumford, Michael D
2017-01-01
Although recent evidence suggests ethics education can be effective, the nature of specific training programs, and their effectiveness, varies considerably. Building on a recent path modeling effort, the present study developed and validated a predictive modeling tool for responsible conduct of research education. The predictive modeling tool allows users to enter ratings in relation to a given ethics training program and receive instantaneous evaluative information for course refinement. Validation work suggests the tool's predicted outcomes correlate strongly (r = 0.46) with objective course outcomes. Implications for training program development and refinement are discussed.
DOT National Transportation Integrated Search
2015-02-01
The Maryland State Highway Administration (SHA) has initiated major planning efforts to improve transportation : efficiency, safety, and sustainability on critical highway corridors through its Comprehensive Highway Corridor : (CHC) program. This pro...
Population status and habitat associations of the King Rail in the midwestern United States
Bolenbaugh, Jason R.; Cooper, Tom; Brady, Ryan S.; Willard, Karen L.; Krementz, David G.
2012-01-01
The migratory population of the King Rail (Rallus elegans) has declined dramatically during the past 50 years, emphasizing the need to document the distribution and status of this species to help guide conservation efforts. In an effort to guide King Rail breeding habitat protection and restoration, a landscape suitability index (LSI) model was developed for the Upper Mississippi River and Great Lakes Region Joint Venture (JV). To validate this model, 264 sites were surveyed across the JV region in 2008 and 2009 using the National Marshbird Monitoring protocol. Two other similarly collected data sets from Wisconsin (250 sites) and Ohio (259 sites) as well as data from the Cornell Laboratory of Ornithology's eBird database were added to our data set. Sampling effort was not uniform across the study area. King Rails were detected at 29 sites with the greatest concentration in southeastern Wisconsin and northeastern Illinois. Too few detections were made to validate the LSI model. King Rail detection sites tended to have microtopographic heterogeneity, more emergent herbaceous wetland vegetation and less woody vegetation. The migrant population of the King Rail is rare and warrants additional conservation efforts to achieve stated conservation population targets.
NASA Technical Reports Server (NTRS)
Sinha, Neeraj; Brinckman, Kevin; Jansen, Bernard; Seiner, John
2011-01-01
A method was developed of obtaining propulsive base flow data in both hot and cold jet environments, at Mach numbers and altitude of relevance to NASA launcher designs. The base flow data was used to perform computational fluid dynamics (CFD) turbulence model assessments of base flow predictive capabilities in order to provide increased confidence in base thermal and pressure load predictions obtained from computational modeling efforts. Predictive CFD analyses were used in the design of the experiments, available propulsive models were used to reduce program costs and increase success, and a wind tunnel facility was used. The data obtained allowed assessment of CFD/turbulence models in a complex flow environment, working within a building-block procedure to validation, where cold, non-reacting test data was first used for validation, followed by more complex reacting base flow validation.
Genetic Programming as Alternative for Predicting Development Effort of Individual Software Projects
Chavoya, Arturo; Lopez-Martin, Cuauhtemoc; Andalon-Garcia, Irma R.; Meda-Campaña, M. E.
2012-01-01
Statistical and genetic programming techniques have been used to predict the software development effort of large software projects. In this paper, a genetic programming model was used for predicting the effort required in individually developed projects. Accuracy obtained from a genetic programming model was compared against one generated from the application of a statistical regression model. A sample of 219 projects developed by 71 practitioners was used for generating the two models, whereas another sample of 130 projects developed by 38 practitioners was used for validating them. The models used two kinds of lines of code as well as programming language experience as independent variables. Accuracy results from the model obtained with genetic programming suggest that it could be used to predict the software development effort of individual projects when these projects have been developed in a disciplined manner within a development-controlled environment. PMID:23226305
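As a rough sketch of the statistical-regression baseline that the genetic programming model was compared against, the code below regresses development effort on lines of code and language experience in a development sample and checks prediction error on a separate validation sample. All data are synthetic, and the genetic-programming counterpart is not reproduced here.

```python
# Sketch of a statistical-regression baseline for effort prediction: development
# effort regressed on lines of code and programming-language experience,
# fitted on one sample and checked on a separate validation sample.
# The data are synthetic stand-ins.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_absolute_percentage_error

rng = np.random.default_rng(1)
n_dev, n_val = 219, 130                     # sample sizes mirroring the abstract

def make_sample(n):
    loc = rng.uniform(50, 800, n)           # lines of code (assumed range)
    exp = rng.uniform(1, 10, n)             # years of language experience (assumed)
    effort = 0.08 * loc - 2.0 * exp + 40 + rng.normal(0, 5, n)  # person-hours (synthetic)
    return np.column_stack([loc, exp]), effort

X_dev, y_dev = make_sample(n_dev)
X_val, y_val = make_sample(n_val)
model = LinearRegression().fit(X_dev, y_dev)
mmre = mean_absolute_percentage_error(y_val, model.predict(X_val))
print(f"validation mean relative error: {mmre:.2f}")
```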
Detailed Validation of the Bidirectional Effect in Various Case 1 and Case 2 Waters
2012-03-26
of the viewing direction, i.e., they assumed a completely diffuse BRDF. Previous efforts to model/understand the actual BRDF [4-10] have produced... places. Second, the MAG2002 BRDF tables were developed from a radiative transfer (RT) model that used scattering particle phase functions that... situ measurements from just 3 locations to validate their model; here we used a much larger data set across a wide variety of inherent optical
Chao, Wang; Shuang, Li; Tao, Li; Shanfa, Yu; Junming, Dai
2017-01-01
This study aimed to detect the mediation effect of over-commitment between occupational stress, insomnia, and well-being, and to analyze the moderating roles of gender, age and job position. One thousand six hundred eighteen valid samples were recruited from the electronic manufacturing service industry in Hunan Province, China. All the data were collected by self-rated questionnaires after written consent. This paper introduced an Effort-Reward-Insomnia-Well-being model, which was fitted and validated through structural equation model analysis. The results of single-factor correlation analysis indicated that the coefficients between most of the items and dimensions were statistically significant. The final fitted model had satisfactory global goodness of fit (CMIN/DF=3.99, AGFI=0.926, NNFI=0.950, IFI=0.956, RMSEA=0.043). Both the measurement model and the structural model had acceptable path loadings. Effort was associated with insomnia indirectly and with well-being both directly and indirectly; reward was associated with insomnia and well-being either directly or indirectly through over-commitment. Covariates such as gender, age and job position modified the associations between occupational stress and health outcomes. Over-commitment mediated the relationships between effort, reward, and health outcomes, and the mediation effect varied across working conditions and outcomes under different covariates.
NASA Astrophysics Data System (ADS)
Engel, Dave W.; Reichardt, Thomas A.; Kulp, Thomas J.; Graff, David L.; Thompson, Sandra E.
2016-05-01
Validating predictive models and quantifying uncertainties inherent in the modeling process is a critical component of the HARD Solids Venture program [1]. Our current research focuses on validating physics-based models predicting the optical properties of solid materials for arbitrary surface morphologies and characterizing the uncertainties in these models. We employ a systematic and hierarchical approach by designing physical experiments and comparing the experimental results with the outputs of computational predictive models. We illustrate this approach through an example comparing a micro-scale forward model to an idealized solid-material system and then propagating the results through a system model to the sensor level. Our efforts should enhance detection reliability of the hyper-spectral imaging technique and the confidence in model utilization and model outputs by users and stakeholders.
Improving the Validity of Activity of Daily Living Dependency Risk Assessment
Clark, Daniel O.; Stump, Timothy E.; Tu, Wanzhu; Miller, Douglas K.
2015-01-01
Objectives: Efforts to prevent activity of daily living (ADL) dependency may be improved through models that assess older adults' dependency risk. We evaluated whether cognition and gait speed measures improve the predictive validity of interview-based models. Method: Participants were 8,095 self-respondents in the 2006 Health and Retirement Survey who were aged 65 years or over and independent in five ADLs. Incident ADL dependency was determined from the 2008 interview. Models were developed using a random 2/3rd cohort and validated in the remaining 1/3rd. Results: Compared to a c-statistic of 0.79 in the best interview model, the model including cognitive measures had c-statistics of 0.82 and 0.80, while the best fitting gait speed model had c-statistics of 0.83 and 0.79 in the development and validation cohorts, respectively. Conclusion: Two relatively brief models, one that requires an in-person assessment and one that does not, had excellent validity for predicting incident ADL dependency but did not significantly improve the predictive validity of the best fitting interview-based models. PMID:24652867
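The development/validation split and c-statistic comparison described above can be illustrated with a short sketch; the predictors, coefficients and outcomes below are synthetic stand-ins, not Health and Retirement Survey data.

```python
# Sketch of a development/validation split and c-statistic (ROC AUC) comparison
# for a dependency-risk model, using synthetic predictors.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(2)
n = 8095
age = rng.uniform(65, 90, n)
gait_speed = rng.normal(0.9, 0.25, n)                  # m/s, assumed
cognition = rng.normal(0, 1, n)                        # standardized score, assumed
logit = -6 + 0.06 * age - 1.5 * gait_speed - 0.4 * cognition
y = rng.binomial(1, 1 / (1 + np.exp(-logit)))           # incident ADL dependency (synthetic)

X = np.column_stack([age, gait_speed, cognition])
dev = rng.random(n) < 2 / 3                             # random 2/3 development cohort
model = LogisticRegression().fit(X[dev], y[dev])
for name, idx in [("development", dev), ("validation", ~dev)]:
    auc = roc_auc_score(y[idx], model.predict_proba(X[idx])[:, 1])
    print(f"{name} c-statistic = {auc:.2f}")
```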
DOE Office of Scientific and Technical Information (OSTI.GOV)
Huang, Zhenyu; Du, Pengwei; Kosterev, Dmitry
2013-05-01
Disturbance data recorded by phasor measurement units (PMU) offer opportunities to improve the integrity of dynamic models. However, manually tuning parameters through play-back events demands significant effort and engineering experience. In this paper, a calibration method using the extended Kalman filter (EKF) technique is proposed. The formulation of the EKF with parameter calibration is discussed. Case studies are presented to demonstrate its validity. The proposed calibration method is cost-effective and complementary to traditional equipment testing for improving dynamic model quality.
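A minimal sketch of the underlying idea, under simplifying assumptions: augment the state of a toy first-order dynamic model with an unknown parameter and let an extended Kalman filter calibrate that parameter from noisy measurements. The model, noise levels and parameter values are illustrative, not a power-system dynamic model.

```python
# Minimal EKF sketch: joint state-and-parameter estimation for a toy first-order
# system, calibrated from noisy measurements. All values are illustrative.
import numpy as np

rng = np.random.default_rng(3)
dt, a_true = 0.05, 1.8
x_true, meas = 2.0, []
for _ in range(400):                          # simulate the true system + noisy measurements
    x_true += dt * (-a_true * x_true) + rng.normal(0, 0.005)
    meas.append(x_true + rng.normal(0, 0.02))

s = np.array([1.0, 0.5])                      # initial guesses: state x and parameter a
P = np.diag([1.0, 1.0])
Q = np.diag([1e-5, 1e-6])                     # process noise (parameter treated as near-constant)
R = 0.02 ** 2
for z in meas:
    x, a = s
    s_pred = np.array([x + dt * (-a * x), a])              # predict
    F = np.array([[1 - dt * a, -dt * x], [0.0, 1.0]])      # Jacobian of the transition
    P = F @ P @ F.T + Q
    H = np.array([[1.0, 0.0]])                              # only the state x is measured
    K = P @ H.T / (H @ P @ H.T + R)                         # Kalman gain (scalar innovation)
    s = s_pred + (K * (z - s_pred[0])).ravel()
    P = (np.eye(2) - K @ H) @ P
print(f"calibrated parameter a = {s[1]:.2f} (true value {a_true})")
```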
Hydrological processes and model representation: impact of soft data on calibration
J.G. Arnold; M.A. Youssef; H. Yen; M.J. White; A.Y. Sheshukov; A.M. Sadeghi; D.N. Moriasi; J.L. Steiner; Devendra Amatya; R.W. Skaggs; E.B. Haney; J. Jeong; M. Arabi; P.H. Gowda
2015-01-01
Hydrologic and water quality models are increasingly used to determine the environmental impacts of climate variability and land management. Due to differing model objectives and differences in monitored data, there are currently no universally accepted procedures for model calibration and validation in the literature. In an effort to develop accepted model calibration...
Model-Based Verification and Validation of the SMAP Uplink Processes
NASA Technical Reports Server (NTRS)
Khan, M. Omair; Dubos, Gregory F.; Tirona, Joseph; Standley, Shaun
2013-01-01
This case study stands as an example of how a project can validate a system-level design earlier in the project life cycle than traditional V&V processes by using simulation on a system model. Specifically, this paper describes how simulation was added to a system model of the Soil Moisture Active-Passive (SMAP) mission's uplink process. Also discussed are the advantages and disadvantages of the methods employed and the lessons learned, which are intended to benefit future model-based and simulation-based V&V development efforts.
ASTP ranging system mathematical model
NASA Technical Reports Server (NTRS)
Ellis, M. R.; Robinson, L. H.
1973-01-01
A mathematical model is presented of the VHF ranging system to analyze the performance of the Apollo-Soyuz test project (ASTP). The system was adapted for use in the ASTP. The ranging system mathematical model is presented in block diagram form, and a brief description of the overall model is also included. A procedure for implementing the math model is presented along with a discussion of the validation of the math model and the overall summary and conclusions of the study effort. Detailed appendices of the five study tasks are presented: early late gate model development, unlock probability development, system error model development, probability of acquisition and model development, and math model validation testing.
ERIC Educational Resources Information Center
Nunnery, John A.; Ross, Steven M.; Bol, Linda
2008-01-01
This study reports the results of a validation study of the Comprehensive School Restructuring Teacher Questionnaire (CSRTQ) and the School Observation Measure (SOM), which are intended for use in evaluating comprehensive school reform efforts. The CSRTQ, which putatively measures five factors related to school restructuring (internal focus,…
Validating Inertial Confinement Fusion (ICF) predictive capability using perturbed capsules
NASA Astrophysics Data System (ADS)
Schmitt, Mark; Magelssen, Glenn; Tregillis, Ian; Hsu, Scott; Bradley, Paul; Dodd, Evan; Cobble, James; Flippo, Kirk; Offerman, Dustin; Obrey, Kimberly; Wang, Yi-Ming; Watt, Robert; Wilke, Mark; Wysocki, Frederick; Batha, Steven
2009-11-01
Achieving ignition on NIF is a monumental step on the path toward utilizing fusion as a controlled energy source. Obtaining robust ignition requires accurate ICF models to predict the degradation of ignition caused by heterogeneities in capsule construction and irradiation. LANL has embarked on a project to induce controlled defects in capsules to validate our ability to predict their effects on fusion burn. These efforts include the validation of feature-driven hydrodynamics and mix in a convergent geometry. This capability is needed to determine the performance of capsules imploded under less-than-optimum conditions on future IFE facilities. LANL's recently initiated Defect Implosion Experiments (DIME) conducted at Rochester's Omega facility are providing input for these efforts. Recent simulation and experimental results will be shown.
NASA Technical Reports Server (NTRS)
Rumsey, Christopher L.
2009-01-01
In current practice, it is often difficult to draw firm conclusions about turbulence model accuracy when performing multi-code CFD studies ostensibly using the same model because of inconsistencies in model formulation or implementation in different codes. This paper describes an effort to improve the consistency, verification, and validation of turbulence models within the aerospace community through a website database of verification and validation cases. Some of the variants of two widely-used turbulence models are described, and two independent computer codes (one structured and one unstructured) are used in conjunction with two specific versions of these models to demonstrate consistency with grid refinement for several representative problems. Naming conventions, implementation consistency, and thorough grid resolution studies are key factors necessary for success.
Occupant Protection during Orion Crew Exploration Vehicle Landings
NASA Technical Reports Server (NTRS)
Gernhardt, Michael L.; Jones, J. A.; Granderson, B. K.; Somers, J. T.
2009-01-01
The Constellation Program is evaluating current vehicle design capabilities for nominal water landings and contingency land landings of the Orion Crew Exploration Vehicle. The Orion Landing Strategy tiger team was formed to lead the technical effort; associated activities include evaluating the current vehicle design and its susceptibility to roll control and tip-over, reviewing methods for assessing occupant injury during ascent/aborts/landings, developing an alternate seat/attenuation design solution that improves occupant protection and operability, and testing the seat/attenuation system designs to ensure valid results. The EVA Physiology, Systems and Performance (EPSP) project is leading the effort, under the authority of the Tiger Team Steering Committee, to develop, verify, validate and accredit biodynamics models using a variety of crash and injury databases including NASCAR, Indy Car and military aircraft. The validated biodynamics models will be used by the Constellation Program to evaluate a variety of vehicle, seat and restraint designs in the context of multiple nominal and off-nominal landing scenarios. The models will be used in conjunction with Acceptable Injury Risk definitions to provide new occupant protection requirements for the Constellation Program.
NASA Technical Reports Server (NTRS)
Melis, Matthew E.; Brand, Jeremy H.; Pereira, J. Michael; Revilock, Duane M.
2007-01-01
Following the tragedy of the Space Shuttle Columbia on February 1, 2003, a major effort commenced to develop a better understanding of debris impacts and their effect on the Space Shuttle subsystems. An initiative to develop and validate physics-based computer models to predict damage from such impacts was a fundamental component of this effort. To develop the models it was necessary to physically characterize Reinforced Carbon-Carbon (RCC) and various debris materials which could potentially shed on ascent and impact the Orbiter RCC leading edges. The validated models enabled the launch system community to use the impact analysis software LS DYNA to predict damage by potential and actual impact events on the Orbiter leading edge and nose cap thermal protection systems. Validation of the material models was done through a three-level approach: fundamental tests to obtain independent static and dynamic material model properties of materials of interest, sub-component impact tests to provide highly controlled impact test data for the correlation and validation of the models, and full-scale impact tests to establish the final level of confidence for the analysis methodology. This paper discusses the second level subcomponent test program in detail and its application to the LS DYNA model validation process. The level two testing consisted of over one hundred impact tests in the NASA Glenn Research Center Ballistic Impact Lab on 6 by 6 in. and 6 by 12 in. flat plates of RCC and evaluated three types of debris projectiles: BX 265 External Tank foam, ice, and PDL 1034 External Tank foam. These impact tests helped determine the level of damage generated in the RCC flat plates by each projectile. The information obtained from this testing validated the LS DYNA damage prediction models and provided a certain level of confidence to begin performing analysis for full-size RCC test articles for returning NASA to flight with STS 114 and beyond.
NASA Technical Reports Server (NTRS)
Sebok, Angelia; Wickens, Christopher; Sargent, Robert
2015-01-01
One human factors challenge is predicting operator performance in novel situations. Approaches such as drawing on relevant previous experience, and developing computational models to predict operator performance in complex situations, offer potential methods to address this challenge. A few concerns with modeling operator performance are that models need to be realistic, and they need to be tested empirically and validated. In addition, many existing human performance modeling tools are complex and require that an analyst gain significant experience to be able to develop models for meaningful data collection. This paper describes an effort to address these challenges by developing an easy-to-use model-based tool, using models that were developed from a review of existing human performance literature and targeted experimental studies, and performing an empirical validation of key model predictions.
Yahya, Noorazrul; Ebert, Martin A; Bulsara, Max; Kennedy, Angel; Joseph, David J; Denham, James W
2016-08-01
Most predictive models are not sufficiently validated for prospective use. We performed independent external validation of published predictive models for urinary dysfunctions following radiotherapy of the prostate. Multivariable models developed to predict atomised and generalised urinary symptoms, both acute and late, were considered for validation using a dataset representing 754 participants from the TROG 03.04-RADAR trial. Endpoints and features were harmonised to match the predictive models. The overall performance, calibration and discrimination were assessed. 14 models from four publications were validated. The discrimination of the predictive models in an independent external validation cohort, measured using the area under the receiver operating characteristic (ROC) curve, ranged from 0.473 to 0.695, generally lower than in internal validation. 4 models had ROC >0.6. Shrinkage was required for all predictive models' coefficients ranging from -0.309 (prediction probability was inverse to observed proportion) to 0.823. Predictive models which include baseline symptoms as a feature produced the highest discrimination. Two models produced a predicted probability of 0 and 1 for all patients. Predictive models vary in performance and transferability illustrating the need for improvements in model development and reporting. Several models showed reasonable potential but efforts should be increased to improve performance. Baseline symptoms should always be considered as potential features for predictive models. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
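Two of the external-validation checks mentioned above, discrimination and shrinkage (calibration slope), can be sketched as follows; the "published" coefficients, features and outcomes are synthetic assumptions, not the RADAR trial models or data.

```python
# Sketch of two external-validation checks: discrimination (ROC AUC) of
# previously published coefficients on a new cohort, and the calibration slope
# ("shrinkage") from regressing the outcome on the model's linear predictor.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(4)
n = 754
baseline_symptom = rng.binomial(1, 0.3, n)
dose = rng.normal(70, 5, n)                        # Gy, assumed dosimetric feature
true_logit = -9 + 1.2 * baseline_symptom + 0.1 * dose
y = rng.binomial(1, 1 / (1 + np.exp(-true_logit)))

# "Published" coefficients, assumed to be optimistic (too steep) for this cohort
published = {"intercept": -13.0, "baseline_symptom": 2.0, "dose": 0.16}
lp = (published["intercept"] + published["baseline_symptom"] * baseline_symptom
      + published["dose"] * dose)                   # linear predictor on the new cohort

auc = roc_auc_score(y, lp)
slope_model = LogisticRegression().fit(lp.reshape(-1, 1), y)
print(f"external AUC = {auc:.2f}, calibration slope = {slope_model.coef_[0][0]:.2f}")
# A slope well below 1 indicates the published coefficients need shrinkage.
```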
NASA Technical Reports Server (NTRS)
Thompson, David E.
2005-01-01
Procedures and methods for verification of coding algebra and for validations of models and calculations used in the aerospace computational fluid dynamics (CFD) community would be efficacious if used by the glacier dynamics modeling community. This paper presents some of those methods, and how they might be applied to uncertainty management supporting code verification and model validation for glacier dynamics. The similarities and differences between their use in CFD analysis and the proposed application of these methods to glacier modeling are discussed. After establishing sources of uncertainty and methods for code verification, the paper looks at a representative sampling of verification and validation efforts that are underway in the glacier modeling community, and establishes a context for these within an overall solution quality assessment. Finally, a vision of a new information architecture and interactive scientific interface is introduced and advocated.
NASA Technical Reports Server (NTRS)
Werner, C. R.; Humphreys, B. T.; Mulugeta, L.
2014-01-01
The Advanced Resistive Exercise Device (ARED) is the resistive exercise device used by astronauts on the International Space Station (ISS) to mitigate bone loss and muscle atrophy due to extended exposure to microgravity (micro g). The Digital Astronaut Project (DAP) has developed a multi-body dynamics biomechanics model of the ARED for use in spaceflight exercise physiology research and operations. In an effort to advance the maturity and credibility of the ARED model, the DAP performed a verification, validation and credibility (VV&C) assessment of the model and its analyses in accordance with NASA-STD-7009, 'Standards for Models and Simulations'.
NASA Technical Reports Server (NTRS)
Balakrishna, S.; Goglia, G. L.
1979-01-01
The details of the effort to synthesize a control-compatible multivariable model of a liquid nitrogen cooled, gaseous nitrogen operated, closed-circuit, cryogenic pressure tunnel are presented. The synthesized model was transformed into a real-time cryogenic tunnel simulator, and the model was validated by comparing its responses to the actual tunnel responses of the 0.3 m transonic cryogenic tunnel, using the quasi-steady-state and transient responses of the model and the tunnel. The global nature of the simple, explicit, lumped multivariable model of a closed-circuit cryogenic tunnel is demonstrated.
Modeling the effects of argument length and validity on inductive and deductive reasoning.
Rotello, Caren M; Heit, Evan
2009-09-01
In an effort to assess models of inductive reasoning and deductive reasoning, the authors, in 3 experiments, examined the effects of argument length and logical validity on evaluation of arguments. In Experiments 1a and 1b, participants were given either induction or deduction instructions for a common set of stimuli. Two distinct effects were observed: Induction judgments were more affected by argument length, and deduction judgments were more affected by validity. In Experiment 2, fluency was manipulated by displaying the materials in a low-contrast font, leading to increased sensitivity to logical validity. Several variants of 1-process and 2-process models of reasoning were assessed against the results. A 1-process model that assumed the same scale of argument strength underlies induction and deduction was not successful. A 2-process model that assumed separate, continuous informational dimensions of apparent deductive validity and associative strength gave the more successful account. (c) 2009 APA, all rights reserved.
Cypko, Mario A; Stoehr, Matthaeus; Kozniewski, Marcin; Druzdzel, Marek J; Dietz, Andreas; Berliner, Leonard; Lemke, Heinz U
2017-11-01
Oncological treatment is becoming increasingly complex, and therefore decision making in multidisciplinary teams is becoming the key activity in clinical pathways. The increased complexity is related to the number and variability of possible treatment decisions that may be relevant to a patient. In this paper, we describe validation of a multidisciplinary cancer treatment decision model in the clinical domain of head and neck oncology. Probabilistic graphical models and corresponding inference algorithms, in the form of Bayesian networks (BNs), can support complex decision-making processes by providing mathematically reproducible and transparent advice. The quality of BN-based advice depends on the quality of the model. Therefore, it is vital to validate the model before it is applied in practice. For an example BN subnetwork of laryngeal cancer with 303 variables, we evaluated 66 patient records. To validate the model on this dataset, a validation workflow was applied in combination with quantitative and qualitative analyses. In the subsequent analyses, we observed four sources of imprecise predictions: incorrect data, incomplete patient data, outvoting relevant observations, and an incorrect model. Finally, the four problems were solved by modifying the data and the model. The effort required for validation is related to the model complexity. For simpler models, the validation workflow is the same, although it may require fewer validation methods. The validation success is related to the model's well-founded knowledge base. The remaining laryngeal cancer model may disclose additional sources of imprecise predictions.
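To make the inference step concrete, here is a deliberately tiny, hand-rolled illustration of how a Bayesian network turns patient findings into a reproducible probability by enumeration; the three binary variables and all probabilities are hypothetical and bear no relation to the 303-variable laryngeal cancer model.

```python
# Tiny hand-rolled illustration of Bayesian-network inference by enumeration.
# The three binary variables and all probabilities are hypothetical.
from itertools import product

p_advanced = {True: 0.4, False: 0.6}                 # P(advanced tumour stage)
p_node_given_adv = {True: 0.7, False: 0.2}           # P(nodal involvement | stage)
p_hoarse_given_adv = {True: 0.8, False: 0.3}         # P(hoarseness | stage)

def joint(adv, node, hoarse):
    p = p_advanced[adv]
    p *= p_node_given_adv[adv] if node else 1 - p_node_given_adv[adv]
    p *= p_hoarse_given_adv[adv] if hoarse else 1 - p_hoarse_given_adv[adv]
    return p

# Posterior P(advanced | hoarseness observed, nodal status unknown)
num = sum(joint(True, node, True) for node in (True, False))
den = sum(joint(adv, node, True) for adv, node in product((True, False), repeat=2))
print(f"P(advanced stage | hoarseness) = {num / den:.2f}")
```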
Rationality Validation of a Layered Decision Model for Network Defense
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wei, Huaqiang; Alves-Foss, James; Zhang, Du
2007-08-31
We propose a cost-effective network defense strategy built on three decision layers: security policies, defense strategies, and real-time defense tactics for countering immediate threats. A layered decision model (LDM) can be used to capture this decision process. The LDM helps decision-makers gain insight into the hierarchical relationships among inter-connected entities and decision types, and supports the selection of cost-effective defense mechanisms to safeguard computer networks. To be effective as a business tool, it is first necessary to validate the rationality of the model before applying it to real-world business cases. This paper describes our efforts in validating the LDM rationality through simulation.
DOE Office of Scientific and Technical Information (OSTI.GOV)
English, Shawn Allen; Nelson, Stacy Michelle; Briggs, Timothy
Presented is a model verification and validation effort using low-velocity impact (LVI) of carbon fiber reinforced polymer laminate experiments. A flat cylindrical indenter impacts the laminate with enough energy to produce delamination, matrix cracks and fiber breaks. Included in the experimental efforts are ultrasonic scans of the damage for qualitative validation of the models. However, the primary quantitative metrics of validation are the force time history measured through the instrumented indenter and initial and final velocities. The simulations, which are run on Sandia's Sierra finite element codes, consist of all physics and material parameters of importance as determined by a sensitivity analysis conducted on the LVI simulation. A novel orthotropic damage and failure constitutive model that is capable of predicting progressive composite damage and failure is described in detail, and material properties are measured, estimated from micromechanics, or optimized through calibration. A thorough verification and calibration to the accompanying experiments are presented. Special emphasis is given to the four-point bend experiment. For all simulations of interest, the mesh and material behavior is verified through extensive convergence studies. An ensemble of simulations incorporating model parameter uncertainties is used to predict a response distribution which is then compared to experimental output. The result is a quantifiable confidence in material characterization and model physics when simulating this phenomenon in structures of interest.
Irma 5.2 multi-sensor signature prediction model
NASA Astrophysics Data System (ADS)
Savage, James; Coker, Charles; Thai, Bea; Aboutalib, Omar; Chow, Anthony; Yamaoka, Neil; Kim, Charles
2007-04-01
The Irma synthetic signature prediction code is being developed by the Munitions Directorate of the Air Force Research Laboratory (AFRL/MN) to facilitate the research and development of multi-sensor systems. There are over 130 users within the Department of Defense, NASA, Department of Transportation, academia, and industry. Irma began as a high-resolution, physics-based Infrared (IR) target and background signature model for tactical weapon applications and has grown to include: a laser (or active) channel (1990), improved scene generator to support correlated frame-to-frame imagery (1992), and passive IR/millimeter wave (MMW) channel for a co-registered active/passive IR/MMW model (1994). Irma version 5.0 was released in 2000 and encompassed several upgrades to both the physical models and software; host support was expanded to Windows, Linux, Solaris, and SGI Irix platforms. In 2005, version 5.1 was released after an extensive verification and validation of an upgraded and reengineered active channel. Since 2005, the reengineering effort has focused on the Irma passive channel. Field measurements for the validation effort include the unpolarized data collection. Irma 5.2 is scheduled for release in the summer of 2007. This paper will report the validation test results of the Irma passive models and discuss the new features in Irma 5.2.
NASA Technical Reports Server (NTRS)
Wickens, Christopher; Sebok, Angelia; Keller, John; Peters, Steve; Small, Ronald; Hutchins, Shaun; Algarin, Liana; Gore, Brian Francis; Hooey, Becky Lee; Foyle, David C.
2013-01-01
NextGen operations are associated with a variety of changes to the national airspace system (NAS) including changes to the allocation of roles and responsibilities among operators and automation, the use of new technologies and automation, additional information presented on the flight deck, and the entire concept of operations (ConOps). In the transition to NextGen airspace, aviation and air operations designers need to consider the implications of design or system changes on human performance and the potential for error. To ensure continued safety of the NAS, it will be necessary for researchers to evaluate design concepts and potential NextGen scenarios well before implementation. One approach for such evaluations is through human performance modeling. Human performance models (HPMs) provide effective tools for predicting and evaluating operator performance in systems. HPMs offer significant advantages over empirical, human-in-the-loop testing in that (1) they allow detailed analyses of systems that have not yet been built, (2) they offer great flexibility for extensive data collection, (3) they do not require experimental participants, and thus can offer cost and time savings. HPMs differ in their ability to predict performance and safety with NextGen procedures, equipment and ConOps. Models also vary in terms of how they approach human performance (e.g., some focus on cognitive processing, others focus on discrete tasks performed by a human, while others consider perceptual processes), and in terms of their associated validation efforts. The objectives of this research effort were to support the Federal Aviation Administration (FAA) in identifying HPMs that are appropriate for predicting pilot performance in NextGen operations, to provide guidance on how to evaluate the quality of different models, and to identify gaps in pilot performance modeling research, that could guide future research opportunities. This research effort is intended to help the FAA evaluate pilot modeling efforts and select the appropriate tools for future modeling efforts to predict pilot performance in NextGen operations.
Rosenberger, Amanda E.; Dunham, Jason B.
2005-01-01
Estimation of fish abundance in streams using the removal model or the Lincoln-Peterson mark-recapture model is a common practice in fisheries. These models produce misleading results if their assumptions are violated. We evaluated the assumptions of these two models via electrofishing of rainbow trout Oncorhynchus mykiss in central Idaho streams. For one-, two-, three-, and four-pass sampling effort in closed sites, we evaluated the influences of fish size and habitat characteristics on sampling efficiency and the accuracy of removal abundance estimates. We also examined the use of models to generate unbiased estimates of fish abundance through adjustment of total catch or biased removal estimates. Our results suggested that the assumptions of the mark-recapture model were satisfied and that abundance estimates based on this approach were unbiased. In contrast, the removal model assumptions were not met. Decreasing sampling efficiencies over removal passes resulted in underestimated population sizes and overestimates of sampling efficiency. This bias decreased, but was not eliminated, with increased sampling effort. Biased removal estimates based on different levels of effort were highly correlated with each other but were less correlated with unbiased mark-recapture estimates. Stream size decreased sampling efficiency, and stream size and instream wood increased the negative bias of removal estimates. We found that reliable estimates of population abundance could be obtained from models of sampling efficiency for different levels of effort. Validation of abundance estimates requires extra attention to routine sampling considerations but can help fisheries biologists avoid pitfalls associated with biased data and facilitate standardized comparisons among studies that employ different sampling methods.
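The two closed-population estimators compared above have simple closed forms; the sketch below implements Chapman's bias-corrected Lincoln-Peterson estimator and the two-pass removal (Seber-Le Cren) estimator on made-up catch numbers.

```python
# Sketch of the two abundance estimators compared above, applied to made-up
# catch data: Chapman's Lincoln-Peterson estimator and two-pass removal.

def lincoln_petersen_chapman(marked, captured, recaptured):
    """Chapman's bias-corrected Lincoln-Peterson estimate of abundance."""
    return (marked + 1) * (captured + 1) / (recaptured + 1) - 1

def two_pass_removal(c1, c2):
    """Two-pass removal estimate; assumes equal capture probability per pass."""
    if c1 <= c2:
        raise ValueError("removal estimator undefined when second catch >= first")
    n_hat = c1 ** 2 / (c1 - c2)
    p_hat = (c1 - c2) / c1
    return n_hat, p_hat

print(lincoln_petersen_chapman(marked=60, captured=55, recaptured=20))   # about 162 fish
n_hat, p_hat = two_pass_removal(c1=48, c2=18)
print(f"removal estimate N = {n_hat:.0f}, capture probability p = {p_hat:.2f}")
```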
NASA Hybrid Reflectometer Project
NASA Technical Reports Server (NTRS)
Lynch, Dana; Mancini, Ron (Technical Monitor)
2002-01-01
Time-domain and frequency-domain reflectometry have been used for about forty years to locate opens and shorts in cables. Interpretation of reflectometry data is as much art as science. Is there information in the data that is being missed? Can the reflectometers be improved to allow us to detect and locate defects in cables that are not outright shorts or opens? The Hybrid Reflectometer Project was begun this year at NASA Ames Research Center, initially to model wire physics, simulating time-domain reflectometry (TDR) signals in those models and validating the models against actual TDR data taken on testbed cables. Theoretical models of reflectometry in wires will give us an understanding of the merits and limits of these techniques and will guide the application of a proposed hybrid reflectometer with the aim of enhancing reflectometer sensitivity to the point that wire defects can be detected. We will point out efforts by some other researchers to apply wire physics models to the problem of defect detection in wires and we will describe our own initial efforts to create wire physics models and report on testbed validation of the TDR simulations.
Experimental Validation of a Thermoelastic Model for SMA Hybrid Composites
NASA Technical Reports Server (NTRS)
Turner, Travis L.
2001-01-01
This study presents results from experimental validation of a recently developed model for predicting the thermomechanical behavior of shape memory alloy hybrid composite (SMAHC) structures, composite structures with an embedded SMA constituent. The model captures the material nonlinearity of the material system with temperature and is capable of modeling constrained, restrained, or free recovery behavior from experimental measurement of fundamental engineering properties. A brief description of the model and analysis procedures is given, followed by an overview of a parallel effort to fabricate and characterize the material system of SMAHC specimens. Static and dynamic experimental configurations for the SMAHC specimens are described and experimental results for thermal post-buckling and random response are presented. Excellent agreement is achieved between the measured and predicted results, fully validating the theoretical model for constrained recovery behavior of SMAHC structures.
Vlachopoulos, Symeon P; Gigoudi, Maria A
2008-07-01
This article reports on the development and initial validation of the Amotivation Toward Exercise Scale (ATES), which reflects a taxonomy of older adults' reasons to refrain from exercise. Drawing on work by Pelletier, Dion, Tuson, and Green-Demers (1999) and Legault, Green-Demers, and Pelletier (2006), these dimensions were outcome beliefs, capacity beliefs, effort beliefs, and value amotivation beliefs toward exercise. The results supported a 4-factor correlated model that fit the data better than a unidimensional model, a 4-factor uncorrelated model, or a hierarchical model, with strong internal reliability for all the subscales. Evidence also emerged for the discriminant validity of the subscale scores. Furthermore, the predictive validity of the subscale scores was supported, and satisfactory measurement invariance was demonstrated across the calibration and validation samples, supporting the generalizability of the scale's measurement properties.
Validating the soil vulnerability index for a claypan watershed
USDA-ARS?s Scientific Manuscript database
Assessment studies of conservation efforts have shown that best management practices were not always implemented in the most vulnerable areas where they are most needed. While complex computer simulation models can be used to identify these areas, resources needed for using such models are beyond re...
High-resolution modeling assessment of tidal stream resource in Western Passage of Maine, USA
NASA Astrophysics Data System (ADS)
Yang, Zhaoqing; Wang, Taiping; Feng, Xi; Xue, Huijie; Kilcher, Levi
2017-04-01
Although significant efforts have been taken to assess the maximum potential of tidal stream energy at the system-wide scale, accurate assessment of tidal stream energy resource at the project design scale requires detailed hydrodynamic simulations using high-resolution three-dimensional (3-D) numerical models. Extended model validation against high-quality measured data is essential to minimize the uncertainties of the resource assessment. Western Passage in the State of Maine, USA, has been identified as one of the top-ranking sites for tidal stream energy development in U.S. coastal waters, based on a number of criteria including tidal power density, market value and transmission distance. This study presents an ongoing modeling effort for simulating the tidal hydrodynamics in Western Passage using the 3-D unstructured-grid Finite Volume Community Ocean Model (FVCOM). The model domain covers a large region including the entire Bay of Fundy, with grid resolution varying from 20 m in the Western Passage to approximately 1000 m along the open boundary near the mouth of the Bay of Fundy. Preliminary model validation was conducted using existing NOAA measurements within the model domain. Spatial distributions of tidal power density were calculated and extractable tidal energy was estimated using a tidal turbine module embedded in FVCOM under different tidal farm scenarios. Additional field measurements to characterize the resource and support model validation are discussed. This study provides an example of high-resolution resource assessment based on the guidance recommended by the International Electrotechnical Commission Technical Specification.
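The quantity being mapped, tidal stream power density, follows directly from the simulated current speed; the sketch below evaluates P = 0.5 ρ |U|³ over one synthetic M2 tidal cycle. The 2 m/s peak speed is an assumed value, not FVCOM output for Western Passage.

```python
# Sketch of the tidal power density calculation: P = 0.5 * rho * |U|^3 per unit
# swept area, evaluated over one synthetic M2 tidal cycle.
import numpy as np

rho = 1025.0                                    # seawater density, kg/m^3
t = np.linspace(0, 12.42 * 3600, 500)           # one M2 tidal cycle, seconds
speed = 2.0 * np.abs(np.sin(2 * np.pi * t / (12.42 * 3600)))   # m/s, assumed 2 m/s peak

power_density = 0.5 * rho * speed ** 3          # W/m^2 at each instant
print(f"peak power density  {power_density.max() / 1e3:.1f} kW/m^2")
print(f"cycle-mean density  {power_density.mean() / 1e3:.2f} kW/m^2")
```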
van Gestel, Aukje; Severens, Johan L; Webers, Carroll A B; Beckers, Henny J M; Jansonius, Nomdo M; Schouten, Jan S A G
2010-01-01
Discrete event simulation (DES) modeling has several advantages over simpler modeling techniques in health economics, such as increased flexibility and the ability to model complex systems. Nevertheless, these benefits may come at the cost of reduced transparency, which may compromise the model's face validity and credibility. We aimed to produce a transparent report on the construction and validation of a DES model using a recently developed model of ocular hypertension and glaucoma. Current evidence of associations between prognostic factors and disease progression in ocular hypertension and glaucoma was translated into DES model elements. The model was extended to simulate treatment decisions and effects. Utility and costs were linked to disease status and treatment, and clinical and health economic outcomes were defined. The model was validated at several levels. The soundness of design and the plausibility of input estimates were evaluated in interdisciplinary meetings (face validity). Individual patients were traced throughout the simulation under a multitude of model settings to debug the model, and the model was run with a variety of extreme scenarios to compare the outcomes with prior expectations (internal validity). Finally, several intermediate (clinical) outcomes of the model were compared with those observed in experimental or observational studies (external validity), and the feasibility of evaluating hypothetical treatment strategies was tested. The model performed well in all validity tests. Analyses of hypothetical treatment strategies took about 30 minutes per cohort and led to plausible health-economic outcomes. There is added value of DES models in complex conditions such as glaucoma. Achieving transparency in model structure and outcomes may require some effort in reporting and validating the model, but it is feasible.
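For readers unfamiliar with DES, the sketch below is a deliberately minimal, hand-rolled event-queue simulation of patients converting from ocular hypertension to glaucoma; the two-state structure, rates and "treatment" effect are illustrative assumptions, not elements of the published model.

```python
# Minimal hand-rolled discrete event simulation sketch: each patient converts
# after an exponentially distributed delay; a hypothetical treatment scenario
# slows the conversion rate. Rates and structure are illustrative only.
import heapq
import numpy as np

def simulate(n_patients, annual_conversion_rate, horizon_years, seed=0):
    rng = np.random.default_rng(seed)
    events = []                                   # (time, patient_id) event queue
    for pid in range(n_patients):
        t_convert = rng.exponential(1 / annual_conversion_rate)
        heapq.heappush(events, (t_convert, pid))
    conversions = 0
    while events and events[0][0] <= horizon_years:
        heapq.heappop(events)                     # process the next conversion event
        conversions += 1
    return conversions / n_patients

for label, rate in [("no treatment", 0.05), ("treated (assumed 40% slower)", 0.03)]:
    frac = simulate(5000, rate, horizon_years=10)
    print(f"{label}: {frac:.1%} convert within 10 years")
```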
Multi-component testing using HZ-PAN and AgZ-PAN Sorbents for OSPREY Model validation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Garn, Troy G.; Greenhalgh, Mitchell; Lyon, Kevin L.
2015-04-01
In efforts to further develop the capability of the Off-gas SeParation and RecoverY (OSPREY) model, multi-component tests were completed using both HZ-PAN and AgZ-PAN sorbents. The primary purpose of this effort was to obtain multi-component xenon and krypton capacities for comparison to future OSPREY predicted multi-component capacities using previously acquired Langmuir equilibrium parameters determined from single component isotherms. Experimental capacities were determined for each sorbent using two feed gas compositions of 1000 ppmv xenon and 150 ppmv krypton in either a helium or air balance. Test temperatures were consistently held at 220 K and the gas flowrate was 50 sccm. Capacities were calculated from breakthrough curves using TableCurve® 2D software by Jandel Scientific. The HZ-PAN sorbent was tested in the custom designed cryostat while the AgZ-PAN was tested in a newly installed cooling apparatus. Previous modeling validation efforts indicated the OSPREY model can be used to effectively predict single component xenon and krypton capacities for both engineered form sorbents. Results indicated good agreement with the experimental and predicted capacity values for both krypton and xenon on the sorbents. Overall, the model predicted slightly elevated capacities for both gases which can be partially attributed to the estimation of the parameters and the uncertainty associated with the experimental measurements. Currently, OSPREY is configured such that one species adsorbs and one does not (i.e. krypton in helium). Modification of OSPREY code is currently being performed to incorporate multiple adsorbing species and non-ideal interactions of gas phase species with the sorbent and adsorbed phases. Once these modifications are complete, the sorbent capacities determined in the present work will be used to validate OSPREY multicomponent adsorption predictions.
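The capacity-from-breakthrough calculation mentioned above (performed in the study with TableCurve 2D) can be sketched numerically: integrate the difference between feed and effluent concentration over time and normalize by sorbent mass. The breakthrough shape, bed mass and resulting capacity below are assumed values, not HZ-PAN or AgZ-PAN measurements.

```python
# Sketch of computing a sorbent capacity from a breakthrough curve by
# integrating (feed - effluent) concentration over time. Values are assumed.
import numpy as np

flow_sccm = 50.0                                   # feed flow, cm^3(STP)/min
c_feed_ppmv = 1000.0                               # xenon feed concentration
bed_mass_g = 5.0                                   # assumed sorbent mass

t_min = np.linspace(0, 300, 601)                   # time, minutes
c_out_ppmv = c_feed_ppmv / (1 + np.exp(-(t_min - 150) / 12))   # synthetic S-shaped breakthrough

# retained gas volume = integral of F * (C_in - C_out) dt, rectangle rule
dt = t_min[1] - t_min[0]
retained_cm3 = float(np.sum(flow_sccm * (c_feed_ppmv - c_out_ppmv) * 1e-6) * dt)
capacity_mmol_per_g = retained_cm3 / 22_414 * 1000 / bed_mass_g   # 22,414 cm^3/mol at STP
print(f"estimated Xe capacity: {capacity_mmol_per_g:.3f} mmol/g")
```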
Surrogates for numerical simulations; optimization of eddy-promoter heat exchangers
NASA Technical Reports Server (NTRS)
Patera, Anthony T.
1993-01-01
Although the advent of fast and inexpensive parallel computers has rendered numerous previously intractable calculations feasible, many numerical simulations remain too resource-intensive to be directly inserted in engineering optimization efforts. An attractive alternative to direct insertion considers models for computational systems: the expensive simulation is evoked only to construct and validate a simplified, input-output model; this simplified input-output model then serves as a simulation surrogate in subsequent engineering optimization studies. A simple 'Bayesian-validated' statistical framework for the construction, validation, and purposive application of static computer simulation surrogates is presented. As an example, dissipation-transport optimization of laminar-flow eddy-promoter heat exchangers is considered: parallel spectral element Navier-Stokes calculations serve to construct and validate surrogates for the flowrate and Nusselt number; these surrogates then represent the originating Navier-Stokes equations in the ensuing design process.
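A minimal sketch of the surrogate workflow, with a cheap analytic stand-in for the expensive simulation: fit an input-output model to a few simulation runs, validate it on held-out runs, then optimize over the surrogate. The Gaussian-process choice and all functions here are illustrative assumptions, not the Bayesian-validated framework of the paper.

```python
# Sketch of a surrogate workflow: fit a cheap input-output model to a handful
# of "expensive simulation" evaluations, validate on held-out runs, then
# optimize over the surrogate instead of the simulation.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def expensive_simulation(x):
    # Placeholder for a costly CFD run: maps a design parameter to an objective.
    return np.sin(3 * x) + 0.5 * x

rng = np.random.default_rng(5)
x_train = rng.uniform(0, 2, 12).reshape(-1, 1)     # a small budget of simulation runs
y_train = expensive_simulation(x_train).ravel()
x_valid = rng.uniform(0, 2, 5).reshape(-1, 1)      # held-out runs for surrogate validation
y_valid = expensive_simulation(x_valid).ravel()

surrogate = GaussianProcessRegressor(kernel=RBF(length_scale=0.5)).fit(x_train, y_train)
err = np.max(np.abs(surrogate.predict(x_valid) - y_valid))
print(f"max surrogate error on validation runs: {err:.3f}")

# Cheap optimization over the surrogate instead of the simulation
grid = np.linspace(0, 2, 401).reshape(-1, 1)
best = grid[np.argmin(surrogate.predict(grid))]
print(f"surrogate-predicted optimum near x = {best[0]:.2f}")
```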
A process improvement model for software verification and validation
NASA Technical Reports Server (NTRS)
Callahan, John; Sabolish, George
1994-01-01
We describe ongoing work at the NASA Independent Verification and Validation (IV&V) Facility to establish a process improvement model for software verification and validation (V&V) organizations. This model, similar to those used by some software development organizations, uses measurement-based techniques to identify problem areas and introduce incremental improvements. We seek to replicate this model for organizations involved in V&V on large-scale software development projects such as EOS and space station. At the IV&V Facility, a university research group and V&V contractors are working together to collect metrics across projects in order to determine the effectiveness of V&V and improve its application. Since V&V processes are intimately tied to development processes, this paper also examines the repercussions for development organizations in large-scale efforts.
[Authentic leadership. Concept and validation of the ALQ in Spain].
Moriano, Juan A; Molero, Fernando; Lévy Mangin, Jean-Pierre
2011-04-01
This study presents the validation of the Authentic Leadership Questionnaire (ALQ) in a sample of more than 600 Spanish employees. This questionnaire measures four distinct but related substantive components of authentic leadership. These components are: self-awareness, relational transparency, balanced processing, and internalized moral perspective. Structural equation modeling confirmed that the Spanish version of ALQ has high reliability and predictive validity for important leadership outputs such as perceived effectiveness of leadership, followers' extra effort and satisfaction with the leader.
Development of full regeneration establishment models for the forest vegetation simulator
John D. Shaw
2015-01-01
For most simulation modeling efforts, the goal of model developers is to produce simulations that represent reality as closely as possible. Achieving this goal commonly requires a considerable amount of data to set the initial parameters, followed by validation and model improvement, both of which require even more data. The Forest Vegetation Simulator (FVS...
Verification and Validation of Monte Carlo N-Particle 6 for Computing Gamma Protection Factors
2015-03-26
methods for evaluating RPFs, which it used for the subsequent 30 years. These approaches included computational modeling, radioisotopes, and a high... [table-of-contents fragments: 1.2.1 Past Methods of Experimental Evaluation; 1.2.2 Modeling Efforts; Other Considerations; 2.4 Monte Carlo Methods]
Link, W.A.; Sauer, J.R.; Helbig, Andreas J.; Flade, Martin
1999-01-01
Count survey data are commonly used for estimating temporal and spatial patterns of population change. Since count surveys are not censuses, counts can be influenced by 'nuisance factors' related to the probability of detecting animals but unrelated to the actual population size. The effects of systematic changes in these factors can be confounded with patterns of population change. Thus, valid analysis of count survey data requires the identification of nuisance factors and flexible models for their effects. We illustrate using data from the Christmas Bird Count (CBC), a midwinter survey of bird populations in North America. CBC survey effort has substantially increased in recent years, suggesting that unadjusted counts may overstate population growth (or understate declines). We describe a flexible family of models for the effect of effort, that includes models in which increasing effort leads to diminishing returns in terms of the number of birds counted.
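One simple member of such a flexible family is a Poisson regression in which expected counts scale as a power of effort, log E[count] = b0 + b1*year + b*log(effort); the sketch below fits this to synthetic CBC-like data so the estimated year trend is adjusted for rising effort. All numbers are assumptions for illustration.

```python
# Sketch of an effort-adjusted count model: Poisson regression with
# log E[count] = intercept + trend*year + b*log(effort). Synthetic CBC-like data.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(6)
years = np.arange(30)
effort = 10 * np.exp(0.04 * years) * rng.uniform(0.9, 1.1, 30)   # party-hours rising over time
true_trend, b_effort = -0.02, 0.6                                # declining population, diminishing returns
mu = np.exp(3 + true_trend * years + b_effort * np.log(effort))
counts = rng.poisson(mu)

X = sm.add_constant(np.column_stack([years, np.log(effort)]))
fit = sm.GLM(counts, X, family=sm.families.Poisson()).fit()
print(fit.params)   # the effort-adjusted year trend should be near -0.02 despite rising effort
```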
HALOE Algorithm Improvements for Upper Tropospheric Sounding
NASA Technical Reports Server (NTRS)
McHugh, Martin J.; Gordley, Larry L.; Russell, James M., III; Hervig, Mark E.
1999-01-01
This report details the ongoing efforts by GATS, Inc., in conjunction with Hampton University and University of Wyoming, in NASA's Mission to Planet Earth UARS Science Investigator Program entitled "HALOE Algorithm Improvements for Upper Tropospheric Soundings." The goal of this effort is to develop and implement major inversion and processing improvements that will extend HALOE measurements further into the troposphere. In particular, O3, H2O, and CH4 retrievals may be extended into the middle troposphere, and NO, HCl and possibly HF into the upper troposphere. Key areas of research being carried out to accomplish this include: pointing/tracking analysis; cloud identification and modeling; simultaneous multichannel retrieval capability; forward model improvements; high vertical-resolution gas filter channel retrievals; a refined temperature retrieval; robust error analyses; long-term trend reliability studies; and data validation. The current (first-year) effort concentrates on the pointer/tracker correction algorithms, cloud filtering and validation, and multi-channel retrieval development. However, these areas are all highly coupled, so progress in one area benefits from and sometimes depends on work in others.
HALOE Algorithm Improvements for Upper Tropospheric Sounding
NASA Technical Reports Server (NTRS)
Thompson, Robert Earl; McHugh, Martin J.; Gordley, Larry L.; Hervig, Mark E.; Russell, James M., III; Douglass, Anne (Technical Monitor)
2001-01-01
This report details the ongoing efforts by GATS, Inc., in conjunction with Hampton University and University of Wyoming, in NASA's Mission to Planet Earth Upper Atmospheric Research Satellite (UARS) Science Investigator Program entitled 'HALOE Algorithm Improvements for Upper Tropospheric Sounding.' The goal of this effort is to develop and implement major inversion and processing improvements that will extend Halogen Occultation Experiment (HALOE) measurements further into the troposphere. In particular, O3, H2O, and CH4 retrievals may be extended into the middle troposphere, and NO, HCl and possibly HF into the upper troposphere. Key areas of research being carried out to accomplish this include: pointing/tracking analysis; cloud identification and modeling; simultaneous multichannel retrieval capability; forward model improvements; high vertical-resolution gas filter channel retrievals; a refined temperature retrieval; robust error analyses; long-term trend reliability studies; and data validation. The current (first year) effort concentrates on the pointer/tracker correction algorithms, cloud filtering and validation, and multichannel retrieval development. However, these areas are all highly coupled, so progress in one area benefits from and sometimes depends on work in others.
Cryogenic Orbital Test Bed 3 (CRYOTE3) Overview and Status
NASA Technical Reports Server (NTRS)
Stephens, Jonathan; Martin, Jim; Smith, James; Sisco, Jim; Marsell, Brandon; Roth, Jacob; Schallhorn, Paul; Wanzie, Nathaniel; Piryk, David; Bauer, Jeffrey;
2015-01-01
CRYOTE3 is a grassroots cryogenic fluid management (CFM) test effort, with contributing government and industry partners, focused on developing and testing hardware to produce the data needed for model validation and implementation into flight systems.
Infrared Algorithm Development for Ocean Observations with EOS/MODIS
NASA Technical Reports Server (NTRS)
Brown, Otis B.
1997-01-01
Efforts continue under this contract to develop algorithms for the computation of sea surface temperature (SST) from MODIS infrared measurements. This effort includes radiative transfer modeling, comparison of in situ and satellite observations, development and evaluation of processing and networking methodologies for algorithm computation and data accession, evaluation of surface validation approaches for IR radiances, development of experimental instrumentation, and participation in MODIS (project) related activities. Activities in this contract period have focused on radiative transfer modeling, evaluation of atmospheric correction methodologies, field campaigns, analysis of field data, and participation in MODIS meetings.
Development and validation of a blade-element mathematical model for the AH-64A Apache helicopter
NASA Technical Reports Server (NTRS)
Mansur, M. Hossein
1995-01-01
A high-fidelity blade-element mathematical model for the AH-64A Apache Advanced Attack Helicopter has been developed by the Aeroflightdynamics Directorate of the U.S. Army's Aviation and Troop Command (ATCOM) at Ames Research Center. The model is based on the McDonnell Douglas Helicopter Systems' (MDHS) Fly Real Time (FLYRT) model of the AH-64A (acquired under contract) which was modified in-house and augmented with a blade-element-type main-rotor module. This report describes, in detail, the development of the rotor module, and presents some results of an extensive validation effort.
NASA Astrophysics Data System (ADS)
Basith, Abdul; Prakoso, Yudhono; Kongko, Widjo
2017-07-01
A tsunami model using high-resolution geometric data is indispensable for tsunami mitigation efforts, especially in tsunami-prone areas; the resolution of these data is one of the factors that affect the accuracy of tsunami numerical modeling results. Sadeng Port is a new infrastructure on the southern coast of Java that could potentially be hit by a massive tsunami originating from the seismic gap. This paper discusses validation and error estimation of a tsunami model built from high-resolution geometric data at Sadeng Port. The model is validated against the wave height of the 2006 Pangandaran tsunami recorded by the Sadeng tide gauge, and will subsequently be used for tsunami numerical modeling with earthquake-tsunami parameters derived from the seismic gap. The validation results using a Student's t-test show that the modeled and observed tsunami heights at the Sadeng tide gauge are statistically equal at the 95% confidence level, with RMSE and NRMSE values of 0.428 m and 22.12%, while the difference in tsunami wave travel time is 12 minutes.
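The sketch below illustrates the validation statistics named in this abstract (RMSE, NRMSE, and a t-test), computed in Python on placeholder arrays rather than the actual Sadeng tide-gauge record; the paired form of the t-test and the range-based normalization are assumptions, since the abstract does not specify them.

```python
# Minimal sketch of RMSE, normalized RMSE, and a paired t-test between
# modelled and observed wave heights. Arrays are illustrative placeholders.
import numpy as np
from scipy import stats

observed = np.array([0.9, 1.4, 1.8, 1.2, 0.7])   # observed heights (m)
modelled = np.array([1.1, 1.2, 2.1, 1.0, 0.9])   # simulated heights (m)

rmse = np.sqrt(np.mean((modelled - observed) ** 2))
nrmse = rmse / (observed.max() - observed.min())   # one common normalization
t_stat, p_value = stats.ttest_rel(modelled, observed)

print(f"RMSE={rmse:.3f} m, NRMSE={nrmse:.1%}, t={t_stat:.2f}, p={p_value:.3f}")
# A p-value above 0.05 is read as "statistically equal at the 95% level".
```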
Developing and Testing a Model to Predict Outcomes of Organizational Change
Gustafson, David H; Sainfort, François; Eichler, Mary; Adams, Laura; Bisognano, Maureen; Steudel, Harold
2003-01-01
Objective: To test the effectiveness of a Bayesian model employing subjective probability estimates for predicting success and failure of health care improvement projects. Data Sources: Experts' subjective assessment data for model development and independent retrospective data on 221 healthcare improvement projects in the United States, Canada, and the Netherlands collected between 1996 and 2000 for validation. Methods: A panel of theoretical and practical experts and literature in organizational change were used to identify factors predicting the outcome of improvement efforts. A Bayesian model was developed to estimate probability of successful change using subjective estimates of likelihood ratios and prior odds elicited from the panel of experts. A subsequent retrospective empirical analysis of change efforts in 198 health care organizations was performed to validate the model. Logistic regression and ROC analysis were used to evaluate the model's performance using three alternative definitions of success. Data Collection: For the model development, experts' subjective assessments were elicited using an integrative group process. For the validation study, a staff person intimately involved in each improvement project responded to a written survey asking questions about model factors and project outcomes. Results: Logistic regression chi-square statistics and areas under the ROC curve demonstrated a high level of model performance in predicting success. Chi-square statistics were significant at the 0.001 level and areas under the ROC curve were greater than 0.84. Conclusions: A subjective Bayesian model was effective in predicting the outcome of actual improvement projects. Additional prospective evaluations as well as testing the impact of this model as an intervention are warranted. PMID:12785571
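A minimal sketch of an odds-form Bayesian update of the kind the abstract describes is given below; the factor names, likelihood ratios, and prior odds are invented for illustration and are not the elicited values from the study.

```python
# Minimal sketch: prior odds of project success multiplied by expert-elicited
# likelihood ratios, one per observed organizational factor, then converted
# back to a probability. All numbers are hypothetical.
def posterior_probability(prior_odds, likelihood_ratios):
    """Combine prior odds with likelihood ratios and return P(success)."""
    odds = prior_odds
    for lr in likelihood_ratios:
        odds *= lr
    return odds / (1.0 + odds)

lrs = {
    "strong leadership support": 2.5,
    "dedicated project resources": 1.8,
    "staff resistance present": 0.5,
}
p = posterior_probability(prior_odds=1.0, likelihood_ratios=lrs.values())
print(f"Predicted probability of success: {p:.2f}")
```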
NASA Technical Reports Server (NTRS)
Melis, Matthew E.; Revilock, Duane M.; Pereira, Michael J.; Lyle, Karen H.
2009-01-01
Following the tragedy of the Orbiter Columbia (STS-107) on February 1, 2003, a major effort commenced to develop a better understanding of debris impacts and their effect on the space shuttle subsystems. An initiative to develop and validate physics-based computer models to predict damage from such impacts was a fundamental component of this effort. To develop the models it was necessary to physically characterize reinforced carbon-carbon (RCC) along with ice and foam debris materials, which could shed on ascent and impact the orbiter RCC leading edges. The validated models enabled the launch system community to use the impact analysis software LS-DYNA (Livermore Software Technology Corp.) to predict damage by potential and actual impact events on the orbiter leading edge and nose cap thermal protection systems. Validation of the material models was done through a three-level approach: Level 1--fundamental tests to obtain independent static and dynamic constitutive model properties of materials of interest, Level 2--subcomponent impact tests to provide highly controlled impact test data for the correlation and validation of the models, and Level 3--full-scale orbiter leading-edge impact tests to establish the final level of confidence for the analysis methodology. This report discusses the Level 2 test program conducted in the NASA Glenn Research Center (GRC) Ballistic Impact Laboratory with ice projectile impact tests on flat RCC panels, and presents the data observed. The Level 2 testing consisted of 54 impact tests in the NASA GRC Ballistic Impact Laboratory on 6- by 6-in. and 6- by 12-in. flat plates of RCC and evaluated three types of debris projectiles: Single-crystal, polycrystal, and "soft" ice. These impact tests helped determine the level of damage generated in the RCC flat plates by each projectile and validated the use of the ice and RCC models for use in LS-DYNA.
NASA Technical Reports Server (NTRS)
Melis, Matthew E.; Revilock, Duane M.; Pereira, Michael J.; Lyle, Karen H.
2009-01-01
Following the tragedy of the Orbiter Columbia (STS-107) on February 1, 2003, a major effort commenced to develop a better understanding of debris impacts and their effect on the space shuttle subsystems. An initiative to develop and validate physics-based computer models to predict damage from such impacts was a fundamental component of this effort. To develop the models it was necessary to physically characterize reinforced carbon-carbon (RCC) along with ice and foam debris materials, which could shed on ascent and impact the orbiter RCC leading edges. The validated models enabled the launch system community to use the impact analysis software LS-DYNA (Livermore Software Technology Corp.) to predict damage by potential and actual impact events on the orbiter leading edge and nose cap thermal protection systems. Validation of the material models was done through a three-level approach: Level 1-fundamental tests to obtain independent static and dynamic constitutive model properties of materials of interest, Level 2-subcomponent impact tests to provide highly controlled impact test data for the correlation and validation of the models, and Level 3-full-scale orbiter leading-edge impact tests to establish the final level of confidence for the analysis methodology. This report discusses the Level 2 test program conducted in the NASA Glenn Research Center (GRC) Ballistic Impact Laboratory with external tank foam impact tests on flat RCC panels, and presents the data observed. The Level 2 testing consisted of 54 impact tests in the NASA GRC Ballistic Impact Laboratory on 6- by 6-in. and 6- by 12-in. flat plates of RCC and evaluated two types of debris projectiles: BX-265 and PDL-1034 external tank foam. These impact tests helped determine the level of damage generated in the RCC flat plates by each projectile and validated the use of the foam and RCC models for use in LS-DYNA.
Validated Predictions of Metabolic Energy Consumption for Submaximal Effort Movement
Tsianos, George A.; MacFadden, Lisa N.
2016-01-01
Physical performance emerges from complex interactions among many physiological systems that are largely driven by the metabolic energy demanded. Quantifying metabolic demand is an essential step for revealing the many mechanisms of physical performance decrement, but accurate predictive models do not exist. The goal of this study was to investigate if a recently developed model of muscle energetics and force could be extended to reproduce the kinematics, kinetics, and metabolic demand of submaximal effort movement. Upright dynamic knee extension against various levels of ergometer load was simulated. Task energetics were estimated by combining the model of muscle contraction with validated models of lower limb musculotendon paths and segment dynamics. A genetic algorithm was used to compute the muscle excitations that reproduced the movement with the lowest energetic cost, which was determined to be an appropriate criterion for this task. Model predictions of oxygen uptake rate (VO2) were well within experimental variability for the range over which the model parameters were confidently known. The model's accurate estimates of metabolic demand make it useful for assessing the likelihood and severity of physical performance decrement for a given task as well as investigating underlying physiologic mechanisms. PMID:27248429
NASA Technical Reports Server (NTRS)
Defelice, David M.; Aydelott, John C.
1987-01-01
The resupply of cryogenic propellants is an enabling technology for space-based orbit transfer vehicles. As part of NASA Lewis' ongoing efforts in microgravity fluid management, thermodynamic analysis and subscale modeling techniques were developed to support an on-orbit test bed for cryogenic fluid management technologies. Analytical results have shown that subscale experimental modeling of liquid resupply can be used to validate analytical models when the appropriate target temperature is selected to relate the model to its prototype system. Further analyses were used to develop a thermodynamic model of the tank chilldown process, which is required prior to the no-vent fill operation. These efforts were incorporated into two FORTRAN programs which were used to present preliminary analytical results.
Models, validation, and applied geochemistry: Issues in science, communication, and philosophy
Nordstrom, D. Kirk
2012-01-01
Models have become so fashionable that many scientists and engineers cannot imagine working without them. The predominant use of computer codes to execute model calculations has blurred the distinction between code and model. The recent controversy regarding model validation has brought into question what we mean by a ‘model’ and by ‘validation.’ It has become apparent that the usual meaning of validation may be common in engineering practice and seems useful in legal practice but it is contrary to scientific practice and brings into question our understanding of science and how it can best be applied to such problems as hazardous waste characterization, remediation, and aqueous geochemistry in general. This review summarizes arguments against using the phrase model validation and examines efforts to validate models for high-level radioactive waste management and for permitting and monitoring open-pit mines. Part of the controversy comes from a misunderstanding of ‘prediction’ and the need to distinguish logical from temporal prediction. Another problem stems from the difference in the engineering approach contrasted with the scientific approach. The reductionist influence on the way we approach environmental investigations also limits our ability to model the interconnected nature of reality. Guidelines are proposed to improve our perceptions and proper utilization of models. Use of the word ‘validation’ is strongly discouraged when discussing model reliability.
Automation life-cycle cost model
NASA Technical Reports Server (NTRS)
Gathmann, Thomas P.; Reeves, Arlinda J.; Cline, Rick; Henrion, Max; Ruokangas, Corinne
1992-01-01
The problem domain being addressed by this contractual effort can be summarized by the following list: Automation and Robotics (A&R) technologies appear to be viable alternatives to current, manual operations; life-cycle cost models are typically judged with suspicion due to implicit assumptions and little associated documentation; and uncertainty is a reality for increasingly complex problems, yet few models explicitly account for its effect on the solution space. The objectives for this effort range from the near-term (1-2 years) to far-term (3-5 years). In the near-term, the envisioned capabilities of the modeling tool are annotated. In addition, a framework is defined and developed in the Decision Modelling System (DEMOS) environment. Our approach is summarized as follows: assess desirable capabilities (structured into near- and far-term); identify useful existing models/data; identify parameters for utility analysis; define tool framework; encode scenario thread for model validation; and provide transition path for tool development. This report contains all relevant, technical progress made on this contractual effort.
Qualitative Validation of the IMM Model for ISS and STS Programs
NASA Technical Reports Server (NTRS)
Kerstman, E.; Walton, M.; Reyes, D.; Boley, L.; Saile, L.; Young, M.; Arellano, J.; Garcia, Y.; Myers, J. G.
2016-01-01
To validate and further improve the Integrated Medical Model (IMM), medical event data were obtained from 32 ISS and 122 STS person-missions. Using the crew characteristics from these observed missions, IMM v4.0 was used to forecast medical events and medical resource utilization. The IMM medical condition incidence values were compared to the actual observed medical event incidence values, and the IMM forecasted medical resource utilization was compared to actual observed medical resource utilization. Qualitative comparisons of these parameters were conducted for both the ISS and STS programs. The results of these analyses will provide validation of IMM v4.0 and reveal areas of the model requiring adjustments to improve the overall accuracy of IMM outputs. This validation effort should result in enhanced credibility of the IMM and improved confidence in the use of IMM as a decision support tool for human space flight.
A Framework for Text Mining in Scientometric Study: A Case Study in Biomedicine Publications
NASA Astrophysics Data System (ADS)
Silalahi, V. M. M.; Hardiyati, R.; Nadhiroh, I. M.; Handayani, T.; Rahmaida, R.; Amelia, M.
2018-04-01
Data on Indonesian research publications in the domain of biomedicine were collected and text mined for the purpose of a scientometric study. The goal is to build a predictive model that classifies research publications by their potential for downstream development. The model is based on drug development processes adapted from the literature. An effort is described to build the conceptual model and to develop a corpus of research publications in the domain of Indonesian biomedicine. An investigation is then conducted into the problems associated with building the corpus and validating the model. Based on our experience, a framework is proposed to manage a scientometric study based on text mining. Our method shows the effectiveness of conducting a scientometric study based on text mining in order to obtain a valid classification model. This valid model is mainly supported by iterative and close interactions with the domain experts, starting from identifying the issues and building a conceptual model, through to labelling, validation, and interpretation of results.
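For illustration only, the sketch below shows a generic text-classification baseline (TF-IDF features plus a linear classifier evaluated by cross-validation against expert labels) of the sort such a framework might wrap; the toy documents, labels, and scikit-learn pipeline are assumptions, not the corpus or model used in the study.

```python
# Minimal sketch of a publication-classification baseline under assumed,
# invented data. Label 1 = judged to have downstream potential.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

docs = [
    "preclinical screening of a plant extract for antimalarial activity",
    "phase II clinical trial of a candidate dengue vaccine",
    "taxonomic survey of medicinal plants in Java",
    "formulation and stability testing of a herbal tablet",
    "epidemiological mapping of dengue incidence by province",
    "bibliometric analysis of national health research output",
]
labels = [1, 1, 0, 1, 0, 0]   # hypothetical expert labels

model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)),
                      LogisticRegression(max_iter=1000))
scores = cross_val_score(model, docs, labels, cv=2)
print("cross-validated accuracy:", scores.mean())
```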
Ares I-X Flight Test Validation of Control Design Tools in the Frequency-Domain
NASA Technical Reports Server (NTRS)
Johnson, Matthew; Hannan, Mike; Brandon, Jay; Derry, Stephen
2011-01-01
A major motivation of the Ares I-X flight test program was to Design for Data, in order to maximize the usefulness of the data recorded in support of Ares I modeling and validation of design and analysis tools. The Design for Data effort was intended to enable good post-flight characterizations of the flight control system, the vehicle structural dynamics, and also the aerodynamic characteristics of the vehicle. To extract the necessary data from the system during flight, a set of small predetermined Programmed Test Inputs (PTIs) was injected directly into the TVC signal. These PTIs were designed to excite the necessary vehicle dynamics while exhibiting a minimal impact on loads. The method is similar to common approaches in aircraft flight test programs, but with unique launch vehicle challenges due to rapidly changing states, short duration of flight, a tight flight envelope, and an inability to repeat any test. This paper documents the validation effort of the stability analysis tools against the flight data, which was performed by comparing the post-flight calculated frequency response of the vehicle to the frequency response calculated by the stability analysis tools used to design and analyze the preflight models during the control design effort. The comparison between flight day frequency response and stability tool analysis for flight of the simulated vehicle shows good agreement and provides a high level of confidence in the stability analysis tools for use in any future program. This is true for both a nominal model as well as for dispersed analysis, which shows that the flight day frequency response is enveloped by the vehicle's preflight uncertainty models.
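As a hedged illustration of extracting a frequency response from recorded input/output time series (such as injected PTIs and the measured vehicle response), the sketch below uses the textbook cross-spectral estimate H(f) = Pxy(f)/Pxx(f); the signals, filter, and parameters are invented, and this is not the specific stability analysis tool used on Ares I-X.

```python
# Minimal sketch: empirical frequency response from input u and output y.
import numpy as np
from scipy import signal

fs = 100.0                                   # sample rate (Hz), illustrative
t = np.arange(0, 30, 1 / fs)
u = signal.chirp(t, f0=0.1, f1=5.0, t1=30)   # stand-in for a PTI sweep
# Stand-in "vehicle": a second-order low-pass response plus measurement noise
b, a = signal.butter(2, 2.0, fs=fs)
y = signal.lfilter(b, a, u) + 0.01 * np.random.randn(t.size)

f, Pxy = signal.csd(u, y, fs=fs, nperseg=1024)   # cross-spectral density
_, Pxx = signal.welch(u, fs=fs, nperseg=1024)    # input power spectral density
H = Pxy / Pxx                                    # estimated frequency response
gain_db = 20 * np.log10(np.abs(H))
phase_deg = np.angle(H, deg=True)
print(gain_db[:5], phase_deg[:5])
```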
Efforts in Preparation for Jack Validation.
1997-12-01
clothing, equipment attached to the body, age, or physical health. The skeleton’s size, structure, and proportions are affected by age, exercise... things such as genetics, exercise, and dietary habit (Bailey, Malina, & Rasmussen, 1978). VIRTUAL HUMAN MODELS: A virtual human models only a subset of... artistically modeled) surfaces. Somatotype modeling is not considered. To understand what this implies, consider scaling the body using an average
Modeling nanomaterial environmental fate in aquatic systems.
Dale, Amy L; Casman, Elizabeth A; Lowry, Gregory V; Lead, Jamie R; Viparelli, Enrica; Baalousha, Mohammed
2015-03-03
Mathematical models improve our fundamental understanding of the environmental behavior, fate, and transport of engineered nanomaterials (NMs, chemical substances or materials roughly 1-100 nm in size) and facilitate risk assessment and management activities. Although today's large-scale environmental fate models for NMs are a considerable improvement over early efforts, a gap still remains between the experimental research performed to date on the environmental fate of NMs and its incorporation into models. This article provides an introduction to the current state of the science in modeling the fate and behavior of NMs in aquatic environments. We address the strengths and weaknesses of existing fate models, identify the challenges facing researchers in developing and validating these models, and offer a perspective on how these challenges can be addressed through the combined efforts of modelers and experimentalists.
Dealing with Diversity in Computational Cancer Modeling
Johnson, David; McKeever, Steve; Stamatakos, Georgios; Dionysiou, Dimitra; Graf, Norbert; Sakkalis, Vangelis; Marias, Konstantinos; Wang, Zhihui; Deisboeck, Thomas S.
2013-01-01
This paper discusses the need for interconnecting computational cancer models from different sources and scales within clinically relevant scenarios to increase the accuracy of the models and speed up their clinical adaptation, validation, and eventual translation. We briefly review current interoperability efforts drawing upon our experiences with the development of in silico models for predictive oncology within a number of European Commission Virtual Physiological Human initiative projects on cancer. A clinically relevant scenario, addressing brain tumor modeling that illustrates the need for coupling models from different sources and levels of complexity, is described. General approaches to enabling interoperability using XML-based markup languages for biological modeling are reviewed, concluding with a discussion on efforts towards developing cancer-specific XML markup to couple multiple component models for predictive in silico oncology. PMID:23700360
WEC3: Wave Energy Converter Code Comparison Project: Preprint
DOE Office of Scientific and Technical Information (OSTI.GOV)
Combourieu, Adrien; Lawson, Michael; Babarit, Aurelien
This paper describes the recently launched Wave Energy Converter Code Comparison (WEC3) project and presents preliminary results from this effort. The objectives of WEC3 are to verify and validate numerical modelling tools that have been developed specifically to simulate wave energy conversion devices and to inform the upcoming IEA OES Annex VI Ocean Energy Modelling Verification and Validation project. WEC3 is divided into two phases. Phase I consists of a code-to-code verification and Phase II entails code-to-experiment validation. WEC3 focuses on mid-fidelity codes that simulate WECs using time-domain multibody dynamics methods to model device motions and hydrodynamic coefficients to model hydrodynamic forces. Consequently, high-fidelity numerical modelling tools, such as Navier-Stokes computational fluid dynamics simulation, and simple frequency domain modelling tools were not included in the WEC3 project.
Model Transformation for a System of Systems Dependability Safety Case
NASA Technical Reports Server (NTRS)
Murphy, Judy; Driskell, Steve
2011-01-01
The presentation reviews the dependability and safety effort of NASA's Independent Verification and Validation Facility. Topics include: safety engineering process, applications to non-space environment, Phase I overview, process creation, sample SRM artifact, Phase I end result, Phase II model transformation, fault management, and applying Phase II to individual projects.
As part of our efforts to develop a public platform to provide access to predictive models, we have attempted to disentangle the influence of the quality versus quantity of data available to develop and validate QSAR models. Using a thorough manual review of the data underlying t...
Zurlo, Maria Clelia; Pes, Daniela; Siegrist, Johannes
2010-08-01
This study explores the explicative potential of the effort-reward imbalance (ERI) model to unveil the dimensions involved in the teacher stress process and analyses the psychometric characteristics of the Italian version of the ERI Questionnaire (Siegrist, J Occup Health Psychol 1:27-43, 1996) in a homogeneous occupational group: Italian school teachers. The Italian version of the ERI Questionnaire was administered to 673 teachers randomly drawn from a cross-section of school types. Internal consistency, reliability, discriminative validity, and factorial structure were evaluated. Predictive validity was explored with respect to a measure of perceived strain, the Crown-Crisp Experiential Index. Discriminative validity was explored with respect to age, gender, education, type of school, the presence/absence of physical pain in the 12 months before the survey, and teachers' intention to leave the profession. Item-total correlations for all items lie between 0.30 and 0.80 (p < 0.01). The mean inter-item correlation is 0.26. Cronbach's alpha for the whole questionnaire reaches 0.89. The factor analysis identified four reliable factors that accounted for 44.8 per cent of the total variance and confirmed the basic structure that emerged from previous studies, while highlighting two rather than three distinct components of reward. Higher efforts (T = -3.82, p < 0.001) and both lower material (T = 3.23, p < 0.001) and immaterial rewards (T = 3.17, p < 0.005) characterised the group of teachers who reported suffering from physical pain. Higher efforts (T = -5.26, p < 0.001), higher overcommitment (T = -3.15, p < 0.005), and both lower material (T = 4.63, p < 0.001) and immaterial rewards (T = 4.00, p < 0.001) were observed in the group of teachers inclined to give up the job. Multiple regression analyses highlighted that higher efforts, higher overcommitment, and lower rewards significantly predict higher levels of free-floating and somatic anxiety as well as depression and global psychological strain. This preliminary analysis of the reliability and validity of the Italian version of the ERI Questionnaire reveals that it constitutes a useful and reliable measure for analysing work-related stress in the school setting. The validity of the ERI model in describing the dimensions involved in teachers' stress and in highlighting those associated with leaving intentions and with several physical and psychological strain outcomes in Italian school teachers has been confirmed.
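The two scale statistics reported above can be computed as in the sketch below (Python, with random placeholder responses); the corrected form of the item-total correlation is one common variant and is an assumption, since the abstract does not state which form was used.

```python
# Minimal sketch: Cronbach's alpha and corrected item-total correlations
# for an (respondents x items) response matrix. Data are simulated.
import numpy as np

rng = np.random.default_rng(1)
items = rng.integers(1, 6, size=(200, 10)).astype(float)   # 200 respondents x 10 items

def cronbach_alpha(x):
    k = x.shape[1]
    item_vars = x.var(axis=0, ddof=1)
    total_var = x.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

def item_total_correlations(x):
    # correlation of each item with the sum of the remaining items
    return np.array([
        np.corrcoef(x[:, j], np.delete(x, j, axis=1).sum(axis=1))[0, 1]
        for j in range(x.shape[1])
    ])

print("alpha =", round(cronbach_alpha(items), 3))
print("item-total r:", np.round(item_total_correlations(items), 2))
```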
The Study of High-Speed Surface Dynamics Using a Pulsed Proton Beam
NASA Astrophysics Data System (ADS)
Buttler, William; Stone, Benjamin; Oro, David; Dimonte, Guy; Preston, Dean; Cherne, Frank; Germann, Timothy; Terrones, Guillermo; Tupa, Dale
2011-06-01
Los Alamos National Laboratory is presently engaged in the development and implementation of ejecta source term and transport models for integration into LANL hydrodynamic computer codes. Experimental support for the effort spans a broad array of activities, including ejecta source term measurements from machine-roughened Sn surfaces shocked by HE or flyer plates. Because the underlying postulate for ejecta formation is that ejecta are characterized by Richtmyer-Meshkov instability (RMI) phenomena, a key element of the theory and modeling effort centers on RMI validation and verification experiments at the LANSCE Proton Radiography Facility (pRad) to compare with modeled ejecta measurements. Here we present experimental results used to define and validate a physics-based ejecta model, together with remarkable, unexpected results of Sn instability growth in vacuum and gases, and Sn and Cu RM growth that reveals the sensitivity of the RM instability to the yield strength of the material (Cu). The motivation of this last subject, RM growth linked to material strength, is to probe the shock-pressure regions over which ejecta begin to form.
NASA Technical Reports Server (NTRS)
Pulkkinen, A.; Rastaetter, L.; Kuznetsova, M.; Singer, H.; Balch, C.; Weimer, D.; Toth, G.; Ridley, A.; Gombosi, T.; Wiltberger, M.;
2013-01-01
In this paper we continue the community-wide rigorous modern space weather model validation efforts carried out within the GEM, CEDAR and SHINE programs. In this particular effort, in coordination among the Community Coordinated Modeling Center (CCMC), NOAA Space Weather Prediction Center (SWPC), modelers, and the science community, we focus on studying the models' capability to reproduce observed ground magnetic field fluctuations, which are closely related to the geomagnetically induced current phenomenon. One of the primary motivations of the work is to support NOAA SWPC in their selection of the next numerical model that will be transitioned into operations. Six geomagnetic events and 12 geomagnetic observatories were selected for validation. While modeled and observed magnetic field time series are available for all 12 stations, the primary metrics analysis is based on six stations that were selected to represent the high-latitude and mid-latitude locations. Events-based analysis and the corresponding contingency tables were built for each event and each station. The elements in the contingency table were then used to calculate Probability of Detection (POD), Probability of False Detection (POFD) and Heidke Skill Score (HSS) for rigorous quantification of the models' performance. In this paper the summary results of the metrics analyses are reported in terms of POD, POFD and HSS. More detailed analyses can be carried out using the event-by-event contingency tables provided as an online appendix. An online interface built at CCMC and described in the supporting information is also available for more detailed time series analyses.
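The metrics named above follow standard forecast-verification definitions; the sketch below computes them from a 2x2 contingency table of hits, misses, false alarms, and correct negatives, with illustrative counts rather than values from the study.

```python
# Minimal sketch of POD, POFD, and HSS from a 2x2 contingency table.
def pod(h, m):            # Probability of Detection
    return h / (h + m)

def pofd(f, n):           # Probability of False Detection
    return f / (f + n)

def hss(h, m, f, n):      # Heidke Skill Score
    num = 2 * (h * n - m * f)
    den = (h + m) * (m + n) + (h + f) * (f + n)
    return num / den

# H = hits, M = misses, F = false alarms, N = correct negatives (illustrative)
H, M, F, N = 42, 11, 9, 120
print(f"POD={pod(H, M):.2f}  POFD={pofd(F, N):.2f}  HSS={hss(H, M, F, N):.2f}")
```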
Godefroy, V; Trinchera, L; Romo, L; Rigal, N
2016-04-01
Appetitive traits and general temperament traits have both been correlated with adiposity and obesity in children. However, very few studies have tested structural models to identify the links between temperament, appetitive traits and adiposity in children. A validated structural model would help suggesting mechanisms to explain the impact of temperament on body mass index (BMI). In this study, we used Rothbart's heuristic definition of temperament as a starting point to define four appetitive traits, including two appetite reactivity dimensions (Appetite Arousal and Appetite Persistence) and two dimensions of self-regulation in eating (Self-regulation In Eating Without Hunger and Self-regulation in Eating Speed). We conducted a cross-sectional study in young adolescents to validate a structural model including these four appetitive traits, Effortful Control (a general temperament trait) and adiposity. A questionnaire assessing the four appetitive trait dimensions and Effortful Control was completed by adolescents from 10 to 14 years old (n=475), and their BMI-for-age was calculated (n=441). In total, 74% of the study participants were normal weight, 26% were overweight and 8% were obese. We then used structural equation modelling to test the structural model. We identified a well-fitting structural model (Comparative Fit Index=0.91; Root Mean Square Error of Approximation=0.04) that supports the hypothesis that Effortful Control impacts both dimensions of self-regulation in eating, which in turn are linked with both appetite reactivity dimensions. Moreover, Appetite Persistence is the only appetitive trait that was significantly related to adiposity (B=0.12; P<0.05). Our model shows that Effortful Control is related to adiposity through the mediation of an individual's 'eating temperament' (appetite reactivity and self-regulation in eating). Results suggest that young adolescents who exhibit high appetite reactivity but a low level of self-regulation in eating are at higher risk for excess adiposity.
DOT National Transportation Integrated Search
2017-10-23
In support of the Federal Aviation Administration's Office of Environment and Energy, the Volpe Center Environmental Measurement and Modeling Division (Volpe) has completed validation of the digital recording and 1/3 octave band analysis components...
Validation of the RAGE Hydrocode for Impacts into Volatile-Rich Targets
NASA Astrophysics Data System (ADS)
Plesko, C. S.; Asphaug, E.; Coker, R. F.; Wohletz, K. H.; Korycansky, D. G.; Gisler, G. R.
2007-12-01
In preparation for a detailed study of large-scale impacts into the Martian surface, we have validated the RAGE hydrocode (Gittings et al., in press, CSD) against a suite of experiments and statistical models. We present comparisons of hydrocode models to centimeter-scale gas gun impacts (Nakazawa et al. 2002), an underground nuclear test (Perret, 1971), and crater scaling laws (Holsapple 1993, O'Keefe and Ahrens 1993). We have also conducted model convergence and uncertainty analyses, which will be presented. Results to date are encouraging for our current model goals and indicate areas where the hydrocode may be extended in the future. This validation work is focused on questions related to the specific problem of large impacts into volatile-rich targets. The overall goal of this effort is to be able to realistically model large-scale Noachian, and possibly post-Noachian, impacts on Mars, not so much to model the crater morphology as to understand the evolution of target volatiles in the post-impact regime, and to explore how large craters might set the stage for post-impact hydro-geologic evolution both locally (in the crater subsurface) and globally, due to the redistribution of volatiles from the surface and subsurface into the atmosphere. This work is performed under the auspices of IGPP and the DOE at LANL under contracts W-7405-ENG-36 and DE-AC52-06NA25396. Effort by DK and EA is sponsored by NASA's Mars Fundamental Research Program.
NASA Astrophysics Data System (ADS)
Zheng, Y.; Ganushkina, N. Y.; Guild, T. B.; Jiggens, P.; Jun, I.; Mazur, J. E.; Meier, M. M.; Minow, J. I.; Pitchford, D. A.; O'Brien, T. P., III; Shprits, Y.; Tobiska, W. K.; Xapsos, M.; Rastaetter, L.; Jordanova, V. K.; Kellerman, A. C.; Fok, M. C. H.
2017-12-01
The Community Coordinated Modeling Center (CCMC) has been leading community-wide model validation projects for many years. This effort has been broadened and extended via the newly launched International Forum for Space Weather Modeling Capabilities Assessment (https://ccmc.gsfc.nasa.gov/assessment/), whose objective is to track space weather models' progress and performance over time, which is critically needed in space weather operations. The Radiation and Plasma Effects Working Team is working on one of the many focused evaluation topics and deals with five different subtopics: surface charging from 10s of eV to 40 keV electrons; internal charging due to energetic electrons from hundreds of keV to several MeV; single event effects from solar energetic particles (SEPs) and galactic cosmic rays (GCRs) (several MeV to TeV); total dose due to the accumulation of doses from electrons (>100 keV) and protons (>1 MeV) over a broad energy range; and radiation effects from SEPs and GCRs at aviation altitudes. A unique aspect of the Radiation and Plasma Effects focus area is that it bridges the space environments, engineering, and user communities. This presentation will summarize the working team's progress in metrics discussion/definition and the CCMC web interface/tools built to facilitate the validation efforts. As an example, tools in the areas of surface and internal charging will be demonstrated.
NASA Astrophysics Data System (ADS)
Joyce, C. J.; Tobiska, W. K.; Copeland, K.; Smart, D. F.; Shea, M. A.; Nowicki, S.; Atwell, W.; Benton, E. R.; Wilkins, R.; Hands, A.; Gronoff, G.; Meier, M. M.; Schwadron, N.
2017-12-01
Despite its potential for causing a wide range of harmful effects, including health hazards to airline passengers and damage to aircraft and satellite electronics, atmospheric radiation remains a relatively poorly defined risk, lacking sufficient measurements and modelling to fully evaluate the dangers posed. While our reliance on airline travel has increased dramatically over time, there remains an absence of international guidance and standards to protect aircraft passengers from potential health impacts due to radiation exposure. This subject has been gaining traction within the scientific community in recent years, with an expanding number of models with increasing capabilities being made available to evaluate atmospheric radiation hazards. We provide a general description of these modelling efforts, including the physics and methods used by the models, as well as their data inputs and outputs. We also discuss the current capacity for model validation via measurements and discuss the needs for the next generation of models, both in terms of their capabilities and the measurements required to validate them. This review of the status of atmospheric radiation modelling is part of a larger series of studies made as part of the SAFESKY program, with other efforts focusing on the underlying physics and implications, measurements and regulations/standards of atmospheric radiation.
2009-12-01
correctly Risk before validation step: 41-60% - Is this too high/low? Why? Risk 8: Operational or data latency impacts based on relationship between... too high, too low, or correct. We also asked them to comment on why they felt this way. Finally, we left additional space on the survey for any... cost of each validation effort was too high, too low, or acceptable. They then gave us rationale for their beliefs. The second cost associated with
NASA Technical Reports Server (NTRS)
Mcdougal, David S. (Editor)
1990-01-01
FIRE (First ISCCP Regional Experiment) is a U.S. cloud-radiation research program formed in 1984 to increase the basic understanding of cirrus and marine stratocumulus cloud systems, to develop realistic parameterizations for these systems, and to validate and improve ISCCP cloud product retrievals. Presentations of results from the first 5 years of FIRE research activities were highlighted. The 1986 Cirrus Intensive Field Observations (IFO), the 1987 Marine Stratocumulus IFO, the Extended Time Observations (ETO), and modeling activities are described. Collaborative efforts involving the comparison of multiple data sets, incorporation of data measurements into modeling activities, validation of ISCCP cloud parameters, and development of parameterization schemes for General Circulation Models (GCMs) are described.
Modeling interfacial fracture in Sierra.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brown, Arthur A.; Ohashi, Yuki; Lu, Wei-Yang
2013-09-01
This report summarizes computational efforts to model interfacial fracture using cohesive zone models in the SIERRA/SolidMechanics (SIERRA/SM) finite element code. Cohesive surface elements were used to model crack initiation and propagation along predefined paths. Mesh convergence was observed with SIERRA/SM for numerous geometries. As the funding for this project came from the Advanced Simulation and Computing Verification and Validation (ASC V&V) focus area, considerable effort was spent performing verification and validation. Code verification was performed to compare code predictions to analytical solutions for simple three-element simulations as well as a higher-fidelity simulation of a double-cantilever beam. Parameter identification was conducted with Dakota using experimental results on asymmetric double-cantilever beam (ADCB) and end-notched-flexure (ENF) experiments conducted under Campaign-6 funding. Discretization convergence studies were also performed with respect to mesh size and time step, and an optimization study was completed for mode II delamination using the ENF geometry. Throughout this verification process, numerous SIERRA/SM bugs were found and reported, all of which have been fixed, leading to over a 10-fold increase in convergence rates. Finally, mixed-mode flexure experiments were performed for validation. One of the unexplained issues encountered was material property variability for ostensibly the same composite material. Since the variability is not fully understood, it is difficult to accurately assess uncertainty when performing predictions.
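As a hedged illustration of the cohesive-zone idea, the sketch below implements a generic bilinear traction-separation law; the functional form and parameter values are assumptions and are not the specific cohesive model or properties used in the SIERRA/SM study.

```python
# Minimal sketch of a bilinear traction-separation law: traction ramps up
# linearly to a peak strength at delta_0, then softens linearly to zero at
# delta_f; the area under the curve is the fracture energy.
def bilinear_traction(delta, sigma_max, delta_0, delta_f):
    if delta <= 0.0:
        return 0.0
    if delta <= delta_0:
        return sigma_max * delta / delta_0                           # elastic ramp-up
    if delta < delta_f:
        return sigma_max * (delta_f - delta) / (delta_f - delta_0)   # softening branch
    return 0.0                                                       # fully separated

sigma_max = 60.0e6          # peak traction (Pa), illustrative
delta_0, delta_f = 1e-6, 1e-4
G_c = 0.5 * sigma_max * delta_f   # fracture energy (J/m^2) for the bilinear law
print("G_c =", G_c, "J/m^2")
print("traction at 5e-5 m opening:",
      bilinear_traction(5e-5, sigma_max, delta_0, delta_f), "Pa")
```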
NASA Astrophysics Data System (ADS)
Beamer, J. P.; Hill, D. F.; Liston, G. E.; Arendt, A. A.; Hood, E. W.
2013-12-01
In Prince William Sound (PWS), Alaska, there is a pressing need for accurate estimates of the spatial and temporal variations in coastal freshwater discharge (FWD). FWD into PWS originates from streamflow due to rainfall, annual snowmelt, and changes in stored glacier mass and is important because it helps establish spatial and temporal patterns in ocean salinity and temperature, and is a time-varying boundary condition for oceanographic circulation models. Previous efforts to model FWD into PWS have been heavily empirical, with many physical processes absorbed into calibration coefficients that, in many cases, were calibrated to streams and rivers not hydrologically similar to those discharging into PWS. In this work we adapted and validated a suite of high-resolution (in space and time), physically-based, distributed weather, snowmelt, and runoff-routing models designed specifically for snow melt- and glacier melt-dominated watersheds like PWS in order to: 1) provide high-resolution, real-time simulations of snowpack and FWD, and 2) provide a record of historical variations of FWD. SnowModel, driven with gridded topography, land cover, and 32 years (1979-2011) of 3-hourly North American Regional Reanalysis (NARR) atmospheric forcing data, was used to simulate snowpack accumulation and melt across a PWS model domain. SnowModel outputs of daily snow water equivalent (SWE) depth and grid-cell runoff volumes were then coupled with HydroFlow, a runoff-routing model which routed snowmelt, glacier-melt, and rainfall to each watershed outlet (PWS coastline) in the simulation domain. The end product was a continuous 32-year simulation of daily FWD into PWS. In order to validate the models, SWE and snow depths from SnowModel were compared with observed SWE and snow depths from SnoTel and snow survey data, and discharge from HydroFlow was compared with observed streamflow measurements. As a second phase of this research effort, the coupled models will be set-up to run in real-time, where daily measurements from weather stations in the PWS will be used to drive simulations of snow cover and streamflow. In addition, we will deploy a strategic array of instrumentation aimed at validating the simulated weather estimates and the calculations of freshwater discharge. Upon successful implementation and validation of the modeling system, it will join established and ongoing computational and observational efforts that have a common goal of establishing a comprehensive understanding of the physical behavior of PWS.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Harrison, Thomas J.
2014-03-01
This report documents the efforts to perform dynamic model validation on the Eastern Interconnection (EI) by modeling governor deadband. An on-peak EI dynamic model is modified to represent governor deadband characteristics. Simulation results are compared with synchrophasor measurements collected by the Frequency Monitoring Network (FNET/GridEye). The comparison shows that by modeling governor deadband the simulated frequency response can closely align with the actual system response.
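A minimal sketch of a governor deadband of the kind being modeled is shown below; the deadband width, droop gain, and sign convention are illustrative assumptions, not the values used in the EI dynamic model.

```python
# Minimal sketch: frequency deviations inside the deadband produce no governor
# response; larger deviations are offset by the deadband width before the
# droop gain is applied. Generic textbook representation only.
def governor_response(freq_dev_hz, deadband_hz=0.036, droop_gain_pu=20.0):
    if abs(freq_dev_hz) <= deadband_hz:
        return 0.0
    effective_dev = abs(freq_dev_hz) - deadband_hz
    sign = 1.0 if freq_dev_hz > 0 else -1.0
    return -droop_gain_pu * effective_dev * sign   # over-frequency -> reduce power

for dev in (0.01, 0.03, 0.05, -0.08):
    print(f"deviation {dev:+.3f} Hz -> power change {governor_response(dev):+.3f} pu")
```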
DOT National Transportation Integrated Search
2016-06-27
This research effort examined the corridor impacts of various signal timing and geometric strategies to address the operational challenges observed at DDIs. A microsimulation analysis was conducted using a calibrated and validated DDI modeled aft...
Overcommitment as a predictor of effort-reward imbalance: evidence from an 8-year follow-up study.
Feldt, Taru; Hyvönen, Katriina; Mäkikangas, Anne; Rantanen, Johanna; Huhtala, Mari; Kinnunen, Ulla
2016-07-01
The effort-reward imbalance (ERI) model includes the personal characteristic of overcommitment (OC) and the job-related characteristics of effort, reward, and ERI, all of which are assumed to play a role in an employee's health and well-being at work. The aim of the present longitudinal study was to shed more light on the dynamics of the ERI model by investigating the basic hypotheses related to the role of OC in the model, ie, to establish whether an employee's OC could be a risk factor for an increased experience of high effort, low reward, and high ERI at work. The study was based on 5-wave, 8-year follow-up data collected among Finnish professionals in 2006 (T1, N=747), 2008 (T2, N=422), 2010 (T3, N=368), 2012 (T4, N=325), and 2014 (T5, N=273). The participants were mostly male (85% at T1) and the majority of them worked in technical fields. OC, effort, reward, and ERI were measured at each time point with the 23-item ERI scale. Three cross-lagged structural equation models (SEM) were estimated and compared by using full information maximum likelihood method: (i) OC predicted later experiences of effort, reward, and ERI (normal causation model), (ii) effort, reward, and ERI predicted later OC (reversed causation model), and (iii) associations in normal causal and reversed causal models were simultaneously valid (reciprocal causation model). The results supported the normal causation model: strong OC predicted later experiences of high effort, low reward and high ERI. High OC is a risk factor for an increased experience of job strain factors; that is, high effort, low reward, and high ERI. Thus, OC is a risk factor not only for an employee's well-being and health but also for an increasing risk for perceiving adverse job strain factors in the working environment.
Spray combustion model improvement study, 1
NASA Technical Reports Server (NTRS)
Chen, C. P.; Kim, Y. M.; Shang, H. M.
1993-01-01
This study involves the development of numerical and physical modeling in spray combustion. These modeling efforts are mainly motivated by the need to improve the physical submodels of turbulence, combustion, atomization, dense spray effects, and group vaporization. The present mathematical formulation can be easily implemented in any time-marching multiple pressure correction methodology such as the MAST code. A sequence of validation cases includes nonevaporating, evaporating, and burning dense sprays.
Validation of a Scalable Solar Sailcraft
NASA Technical Reports Server (NTRS)
Murphy, D. M.
2006-01-01
The NASA In-Space Propulsion (ISP) program sponsored intensive solar sail technology and systems design, development, and hardware demonstration activities over the past 3 years. Efforts to validate a scalable solar sail system by functional demonstration in relevant environments, together with test-analysis correlation activities on a scalable solar sail system, have recently been successfully completed. A review of the program is presented, with descriptions of the design, results of testing, and analytical model validation of component and assembly functional, strength, stiffness, shape, and dynamic behavior. The scaled performance of the validated system is projected to demonstrate the applicability to flight demonstration and important NASA road-map missions.
Landsat TM Classifications For SAFIS Using FIA Field Plots
William H. Cooke; Andrew J. Hartsell
2001-01-01
Wall-to-wall Landsat Thematic Mapper (TM) classification efforts in Georgia require field validation. We developed a new crown modeling procedure based on Forest Health Monitoring (FHM) data to test Forest Inventory and Analysis (FIA) data. These models simulate the proportion of tree crowns that reflect light on a FIA subplot basis. We averaged subplot crown...
Measuring infrastructure: A key step in program evaluation and planning.
Schmitt, Carol L; Glasgow, LaShawn; Lavinghouze, S Rene; Rieker, Patricia P; Fulmer, Erika; McAleer, Kelly; Rogers, Todd
2016-06-01
State tobacco prevention and control programs (TCPs) require a fully functioning infrastructure to respond effectively to the Surgeon General's call for accelerating the national reduction in tobacco use. The literature describes common elements of infrastructure; however, a lack of valid and reliable measures has made it difficult for program planners to monitor relevant infrastructure indicators and address observed deficiencies, or for evaluators to determine the association among infrastructure, program efforts, and program outcomes. The Component Model of Infrastructure (CMI) is a comprehensive, evidence-based framework that facilitates TCP program planning efforts to develop and maintain their infrastructure. Measures of CMI components were needed to evaluate the model's utility and predictive capability for assessing infrastructure. This paper describes the development of CMI measures and results of a pilot test with nine state TCP managers. Pilot test findings indicate that the tool has good face validity and is clear and easy to follow. The CMI tool yields data that can enhance public health efforts in a funding-constrained environment and provides insight into program sustainability. Ultimately, the CMI measurement tool could facilitate better evaluation and program planning across public health programs.
ERIC Educational Resources Information Center
Jacobs, James A.
In an effort to develop a course in materials and processes of industry at Norfolk State College using Barton Herrscher's model of systematic instruction, a group of 12 NASA-Langley Research Center's (NASA-LRC) research engineers and technicians were recruited. The group acted as consultants in validating the content of the course and aided in…
NASA Technical Reports Server (NTRS)
West, Jeff; Westra, Doug; Lin, Jeff; Tucker, Kevin
2006-01-01
A robust rocket engine combustor design and development process must include tools which can accurately predict the multi-dimensional thermal environments imposed on solid surfaces by the hot combustion products. Currently, empirical methods used in the design process are typically one dimensional and do not adequately account for the heat flux rise rate in the near-injector region of the chamber. Computational Fluid Dynamics holds promise to meet the design tool requirement, but requires accuracy quantification, or validation, before it can be confidently applied in the design process. This effort presents the beginning of such a validation process for the Loci-CHEM CFD code. The model problem examined here is a gaseous oxygen (GO2)/gaseous hydrogen (GH2) shear coaxial single element injector operating at a chamber pressure of 5.42 MPa. The GO2/GH2 propellant combination in this geometry represents one of the simplest rocket model problems and is thus foundational to subsequent validation efforts for more complex injectors. Multiple steady state solutions have been produced with Loci-CHEM employing different hybrid grids and two-equation turbulence models. Iterative convergence for each solution is demonstrated via mass conservation, flow variable monitoring at discrete flow field locations as a function of solution iteration, and overall residual performance. A baseline hybrid grid was used and then locally refined to demonstrate grid convergence. Solutions were obtained with three variations of the k-omega turbulence model.
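Grid convergence of the kind demonstrated here is often quantified with Richardson extrapolation and Roache's Grid Convergence Index; the sketch below shows that standard calculation on invented fine/medium/coarse values and is not taken from the Loci-CHEM study.

```python
# Minimal sketch of a grid-convergence check. f1, f2, f3 are a monitored
# quantity (e.g., wall heat flux at one location) on fine, medium, and coarse
# grids; r is the grid refinement ratio. Numbers are illustrative.
import math

def observed_order(f1, f2, f3, r):
    return math.log((f3 - f2) / (f2 - f1)) / math.log(r)

def gci_fine(f1, f2, r, p, fs=1.25):
    rel_err = abs((f2 - f1) / f1)
    return fs * rel_err / (r**p - 1)

f1, f2, f3 = 10.12, 10.41, 11.30   # fine, medium, coarse values
r = 2.0                            # refinement ratio
p = observed_order(f1, f2, f3, r)
print(f"observed order p = {p:.2f}")
print(f"fine-grid GCI = {gci_fine(f1, f2, r, p):.3%}")
```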
NASA Technical Reports Server (NTRS)
West, Jeff; Westra, Doug; Lin, Jeff; Tucker, Kevin
2006-01-01
A robust rocket engine combustor design and development process must include tools which can accurately predict the multi-dimensional thermal environments imposed on solid surfaces by the hot combustion products. Currently, empirical methods used in the design process are typically one dimensional and do not adequately account for the heat flux rise rate in the near-injector region of the chamber. Computational Fluid Dynamics holds promise to meet the design tool requirement, but requires accuracy quantification, or validation, before it can be confidently applied in the design process. This effort presents the beginning of such a validation process for the Loci-CHEM CFD code. The model problem examined here is a gaseous oxygen (GO2)/gaseous hydrogen (GH2) shear coaxial single element injector operating at a chamber pressure of 5.42 MPa. The GO2/GH2 propellant combination in this geometry represents one of the simplest rocket model problems and is thus foundational to subsequent validation efforts for more complex injectors. Multiple steady state solutions have been produced with Loci-CHEM employing different hybrid grids and two-equation turbulence models. Iterative convergence for each solution is demonstrated via mass conservation, flow variable monitoring at discrete flow field locations as a function of solution iteration, and overall residual performance. A baseline hybrid grid was used and then locally refined to demonstrate grid convergence. Solutions were also obtained with three variations of the k-omega turbulence model.
Development and Validation of Personality Disorder Spectra Scales for the MMPI-2-RF.
Sellbom, Martin; Waugh, Mark H; Hopwood, Christopher J
2018-01-01
The purpose of this study was to develop and validate a set of MMPI-2-RF (Ben-Porath & Tellegen, 2008/2011) personality disorder (PD) spectra scales. These scales could serve the purpose of assisting with DSM-5 PD diagnosis and help link categorical and dimensional conceptions of personality pathology within the MMPI-2-RF. We developed and provided initial validity results for scales corresponding to the 10 PD constructs listed in the DSM-5 using data from student, community, clinical, and correctional samples. Initial validation efforts indicated good support for criterion validity with an external PD measure as well as with dimensional personality traits included in the DSM-5 alternative model for PDs. Construct validity results using psychosocial history and therapists' ratings in a large clinical sample were generally supportive as well. Overall, these brief scales provide clinicians using MMPI-2-RF data with estimates of DSM-5 PD constructs that can support cross-model connections between categorical and dimensional assessment approaches.
NASA Technical Reports Server (NTRS)
Sarathy, Sriprakash
2005-01-01
Solar sailcraft, the stuff of dreams of the H.G. Wells generation, is now a rapidly maturing reality. The promise of unlimited propulsive power by harnessing stellar radiation is close to realization. Currently, efforts are underway to build, prototype, and test two configurations. These sails are designed to meet a 20 m sail requirement, under guidance of the In-Space Propulsion (ISP) technology program office at MSFC. While these sails will not fly, they are the first steps in improving our understanding of the processes and phenomena at work. As part of the New Millennium Program (NMP), the ST9 technology validation mission hopes to launch and fly a solar sail by 2010 or sooner. Though the solar sail community has been studying and validating various concepts over two decades, it was not until recent breakthroughs in structural and material technology that it became possible to build sails that could be launched. With real sails that can be tested (albeit under earth conditions), the real task of engineering a viable spacecraft has finally commenced. Since it is not possible to accurately or practically recreate the actual operating conditions of the sailcraft (zero-G, vacuum, and extremely low temperatures), much of the work has focused on developing accurate models that can be used to predict behavior in space, and for sails that are 6-10 times the size of currently existing sails. Since these models can be validated only with real test data under "earth" conditions, the process of modeling and the identification of uncertainty due to model assumptions and scope need to be closely considered. Sailcraft models that exist currently are primarily focused on detailed physical representations at the component level; these are intended to support prototyping efforts. System-level models that cut across different sail configurations and control concepts while maintaining a consistent approach are non-existent. Much effort has been focused on the areas of thrust performance, solar radiation prediction, and sail membrane behavior vis-a-vis their reflective geometry, such as wrinkling/folding/furling as it pertains to thrust prediction. A parallel effort has been conducted on developing usable models for attitude control systems (ACS) for different sail configurations in different regimes. There has been very little by way of a system-wide exploration of the impact of the various control schemes and thrust prediction models for the different sail configurations being considered.
Supersonics Project - Airport Noise Tech Challenge
NASA Technical Reports Server (NTRS)
Bridges, James
2010-01-01
The Airport Noise Tech Challenge research effort under the Supersonics Project is reviewed. While the goal of "Improved supersonic jet noise models validated on innovative nozzle concepts" remains the same, the success of the research effort has caused the thrust of the research to be modified going forward in time. The main activities from FY06-10 focused on development and validation of jet noise prediction codes. This required innovative diagnostic techniques to be developed and deployed, extensive jet noise and flow databases to be created, and computational tools to be developed and validated. Furthermore, in FY09-10 systems studies commissioned by the Supersonics Project showed that viable supersonic aircraft were within reach using variable cycle engine architectures if exhaust nozzle technology could provide 3-5 dB of suppression. The Project then began to focus on integrating the technologies being developed in its Tech Challenge areas to bring about successful system designs. Consequently, the Airport Noise Tech Challenge area has shifted efforts from developing jet noise prediction codes to using them to develop low-noise nozzle concepts for integration into supersonic aircraft. The new plan of research is briefly presented by technology and timelines.
Definition and Demonstration of a Methodology for Validating Aircraft Trajectory Predictors
NASA Technical Reports Server (NTRS)
Vivona, Robert A.; Paglione, Mike M.; Cate, Karen T.; Enea, Gabriele
2010-01-01
This paper presents a new methodology for validating an aircraft trajectory predictor, inspired by the lessons learned from a number of field trials, flight tests and simulation experiments for the development of trajectory-predictor-based automation. The methodology introduces new techniques and a new multi-staged approach to reduce the effort in identifying and resolving validation failures, avoiding the potentially large costs associated with failures during a single-stage, pass/fail approach. As a case study, the validation effort performed by the Federal Aviation Administration for its En Route Automation Modernization (ERAM) system is analyzed to illustrate the real-world applicability of this methodology. During this validation effort, ERAM initially failed to achieve six of its eight requirements associated with trajectory prediction and conflict probe. The ERAM validation issues have since been addressed, but to illustrate how the methodology could have benefited the FAA effort, additional techniques are presented that could have been used to resolve some of these issues. Using data from the ERAM validation effort, it is demonstrated that these new techniques could have identified trajectory prediction error sources that contributed to several of the unmet ERAM requirements.
Youngjohn, James R; Wershba, Rebecca; Stevenson, Matthew; Sturgeon, John; Thomas, Michael L
2011-04-01
The MMPI-2 Restructured Form (MMPI-2-RF; Ben-Porath & Tellegen, 2008) is replacing the MMPI-2 as the most widely used personality test in neuropsychological assessment, but additional validation studies are needed. Our study examines MMPI-2-RF Validity scales and the newly created Somatic/Cognitive scales in a recently reported sample of 82 traumatic brain injury (TBI) litigants who either passed or failed effort tests (Thomas & Youngjohn, 2009). The restructured Validity scales FBS-r (restructured symptom validity), F-r (restructured infrequent responses), and the newly created Fs (infrequent somatic responses) were not significant predictors of TBI severity. FBS-r was significantly related to passing or failing effort tests, and Fs and F-r showed non-significant trends in the same direction. Elevations on the Somatic/Cognitive scales profile (MLS-malaise, GIC-gastrointestinal complaints, HPC-head pain complaints, NUC-neurological complaints, and COG-cognitive complaints) were significant predictors of effort test failure. Additionally, HPC had the anticipated paradoxical inverse relationship with head injury severity. The Somatic/Cognitive scales as a group were better predictors of effort test failure than the RF Validity scales, which was an unexpected finding. MLS arose as the single best predictor of effort test failure of all RF Validity and Somatic/Cognitive scales. Item overlap analysis revealed that all MLS items are included in the original MMPI-2 Hy scale, making MLS essentially a subscale of Hy. This study validates the MMPI-2-RF as an effective tool for use in neuropsychological assessment of TBI litigants.
Xiao, Li; Cai, Qin; Li, Zhilin; Zhao, Hongkai; Luo, Ray
2014-11-25
A multi-scale framework is proposed for more realistic molecular dynamics simulations in continuum solvent models by coupling a molecular mechanics treatment of solute with a fluid mechanics treatment of solvent. This article reports our initial efforts to formulate the physical concepts necessary for coupling the two mechanics and develop a 3D numerical algorithm to simulate the solvent fluid via the Navier-Stokes equation. The numerical algorithm was validated with multiple test cases. The validation shows that the algorithm is effective and stable, with observed accuracy consistent with our design.
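The abstract does not describe the test cases used. As a generic, hedged illustration of validating a numerical scheme against a case with a known analytic solution, the sketch below checks an explicit finite-difference diffusion step; this is a much simpler stand-in for the authors' 3D Navier-Stokes algorithm, and every parameter value is an assumption for illustration.

    import numpy as np

    # Minimal sketch (not the authors' code): validate an explicit finite-difference
    # diffusion solver against the analytic decay of a single sine mode, the kind of
    # known-solution test case used to confirm a scheme is stable and accurate.
    nx, L, D, t_end = 101, 1.0, 0.1, 0.5          # grid points, domain length, diffusivity, end time
    x = np.linspace(0.0, L, nx)
    dx = x[1] - x[0]
    dt = 0.4 * dx**2 / D                          # respect the explicit stability limit
    u = np.sin(np.pi * x / L)                     # initial condition, u = 0 at both ends

    t = 0.0
    while t < t_end:
        u[1:-1] += D * dt / dx**2 * (u[2:] - 2.0 * u[1:-1] + u[:-2])
        t += dt

    u_exact = np.exp(-D * (np.pi / L) ** 2 * t) * np.sin(np.pi * x / L)
    print("max abs error:", np.max(np.abs(u - u_exact)))   # should be small for this resolution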
Studies in Software Cost Model Behavior: Do We Really Understand Cost Model Performance?
NASA Technical Reports Server (NTRS)
Lum, Karen; Hihn, Jairus; Menzies, Tim
2006-01-01
While there exists extensive literature on software cost estimation techniques, industry practice continues to rely upon standard regression-based algorithms. These software effort models are typically calibrated or tuned to local conditions using local data. This paper cautions that current approaches to model calibration often produce sub-optimal models because of the large variance problem inherent in cost data and because they include far more effort multipliers than the data supports. Building optimal models requires that a wider range of models be considered, while correctly calibrating these models requires rejection rules that prune variables and records and use multiple criteria for evaluating model performance. The main contribution of this paper is to document a standard method that integrates formal model identification, estimation, and validation. It also documents what we call the large variance problem that is a leading cause of cost model brittleness or instability.
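As a hedged sketch of the calibration ideas described above (a log-linear effort model, pruning multipliers the data does not support, and judging candidates on more than one accuracy criterion), the following uses an invented project dataset, not the authors' cost data; MMRE and PRED(30) are common effort-estimation metrics used here for illustration.

    import numpy as np
    from itertools import combinations
    from sklearn.linear_model import LinearRegression
    from sklearn.model_selection import KFold

    rng = np.random.default_rng(0)

    # Invented stand-in data: project size plus a few candidate effort multipliers.
    n = 60
    size = rng.uniform(10, 500, n)                       # KSLOC
    m1, m2, m3 = rng.uniform(0.8, 1.4, (3, n))           # candidate multipliers
    effort = 2.5 * size**1.05 * m1 * np.exp(rng.normal(0, 0.3, n))  # m2, m3 carry no signal

    X_all = np.column_stack([np.log(size), np.log(m1), np.log(m2), np.log(m3)])
    y = np.log(effort)

    def score(cols):
        """Cross-validated MMRE and PRED(30) for a log-linear effort model."""
        rel_errs = []
        for tr, te in KFold(5, shuffle=True, random_state=0).split(X_all):
            model = LinearRegression().fit(X_all[np.ix_(tr, cols)], y[tr])
            pred = np.exp(model.predict(X_all[np.ix_(te, cols)]))
            rel_errs.extend(np.abs(pred - effort[te]) / effort[te])
        rel_errs = np.array(rel_errs)
        return rel_errs.mean(), np.mean(rel_errs <= 0.30)

    # Try every subset that keeps size; prune multipliers the data does not support.
    # (A real study would weigh both criteria; this ranks by MMRE for brevity.)
    best = min((score(list((0,) + c)), (0,) + c)
               for k in range(4) for c in combinations(range(1, 4), k))
    (mmre, pred30), cols = best
    print("selected columns:", cols, "MMRE:", round(mmre, 2), "PRED(30):", round(pred30, 2))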
Choudhry, Shahid A.; Li, Jing; Davis, Darcy; Erdmann, Cole; Sikka, Rishi; Sutariya, Bharat
2013-01-01
Introduction: Preventing the occurrence of hospital readmissions is needed to improve quality of care and foster population health across the care continuum. Hospitals are being held accountable for improving transitions of care to avert unnecessary readmissions. Advocate Health Care in Chicago and Cerner (ACC) collaborated to develop all-cause, 30-day hospital readmission risk prediction models to identify patients that need interventional resources. Ideally, prediction models should encompass several qualities: they should have high predictive ability; use reliable and clinically relevant data; use vigorous performance metrics to assess the models; be validated in populations where they are applied; and be scalable in heterogeneous populations. However, a systematic review of prediction models for hospital readmission risk determined that most performed poorly (average C-statistic of 0.66) and efforts to improve their performance are needed for widespread usage. Methods: The ACC team incorporated electronic health record data, utilized a mixed-method approach to evaluate risk factors, and externally validated their prediction models for generalizability. Inclusion and exclusion criteria were applied on the patient cohort and then split for derivation and internal validation. Stepwise logistic regression was performed to develop two predictive models: one for admission and one for discharge. The prediction models were assessed for discrimination ability, calibration, overall performance, and then externally validated. Results: The ACC Admission and Discharge Models demonstrated modest discrimination ability during derivation, internal and external validation post-recalibration (C-statistic of 0.76 and 0.78, respectively), and reasonable model fit during external validation for utility in heterogeneous populations. Conclusions: The ACC Admission and Discharge Models embody the design qualities of ideal prediction models. The ACC plans to continue its partnership to further improve and develop valuable clinical models. PMID:24224068
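A minimal sketch of the derivation/validation workflow the abstract describes (logistic regression plus the C-statistic as the discrimination measure). The predictors, coefficients, and cohort below are synthetic stand-ins, not the ACC data, and the stepwise variable-selection step is omitted.

    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(1)

    # Synthetic stand-in for an EHR-derived cohort: a few risk factors and a
    # 30-day readmission flag (illustrative only).
    n = 5000
    X = np.column_stack([
        rng.integers(18, 95, n),            # age
        rng.poisson(1.2, n),                # prior admissions in the past year
        rng.integers(1, 15, n),             # length of stay (days)
        rng.integers(0, 2, n),              # discharged against medical advice
    ])
    logit = -4.0 + 0.02 * X[:, 0] + 0.5 * X[:, 1] + 0.05 * X[:, 2] + 0.8 * X[:, 3]
    y = rng.random(n) < 1 / (1 + np.exp(-logit))

    # Derivation / internal-validation split, as in the abstract.
    X_dev, X_val, y_dev, y_val = train_test_split(X, y, test_size=0.3, random_state=1)
    model = LogisticRegression(max_iter=1000).fit(X_dev, y_dev)

    # Discrimination: the C-statistic is the area under the ROC curve.
    c_stat = roc_auc_score(y_val, model.predict_proba(X_val)[:, 1])
    print(f"internal-validation C-statistic: {c_stat:.2f}")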
Integrated simulations for fusion research in the 2030's time frame (white paper outline)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Friedman, Alex; LoDestro, Lynda L.; Parker, Jeffrey B.
This white paper presents the rationale for developing a community-wide capability for whole-device modeling, and advocates for an effort with the expectation of persistence: a long-term programmatic commitment, and support for community efforts. Statement of 2030 goal (two suggestions): (a) Robust integrated simulation tools to aid real-time experimental discharges and reactor designs by employing a hierarchy in fidelity of physics models. (b) To produce by the early 2030s a capability for validated, predictive simulation via integration of a suite of physics models from moderate through high fidelity, to understand and plan full plasma discharges, aid in data interpretation, carry out discovery science, and optimize future machine designs. We can achieve this goal via a focused effort to extend current scientific capabilities and rigorously integrate simulations of disparate physics into a comprehensive set of workflows.
Optical observables in stars with non-stationary atmospheres. [fireballs and cepheid models
NASA Technical Reports Server (NTRS)
Hillendahl, R. W.
1980-01-01
Experience gained by use of Cepheid modeling codes to predict the dimensional and photometric behavior of nuclear fireballs is used as a means of validating various computational techniques used in the Cepheid codes. Predicted results from Cepheid models are compared with observations of the continuum and lines in an effort to demonstrate that the atmospheric phenomena in Cepheids are quite complex but that they can be quantitatively modeled.
HIV reservoirs: the new frontier.
Iglesias-Ussel, Maria D; Romerio, Fabio
2011-01-01
Current antiretroviral therapies suppress viremia to very low levels, but are ineffective in eliminating reservoirs of persistent HIV infection. Efforts toward the development of therapies aimed at HIV reservoirs are complicated by the evidence that HIV establishes persistent productive and nonproductive infection in a number of cell types and through a variety of mechanisms. Moreover, immunologically privileged sites such as the brain also act as HIV sanctuaries. To facilitate the advancement of our knowledge in this new area of research, in vitro models of HIV persistence in different cellular reservoirs have been developed, particularly in CD4+ T-cells that represent the largest pool of persistently infected cells in the body. Whereas each model presents clear advantages, they all share one common limitation: they are systems attempting to recapitulate extremely complex virus-cell interactions occurring in vivo, which we know very little about. Potentially conflicting results arising from different models may be difficult to interpret without validation with clinical samples. Addressing these issues, among others, merits careful consideration for the identification of valid targets and the design of effective strategies for therapy, which may increase the success of efforts toward HIV eradication.
NASA National Combustion Code Simulations
NASA Technical Reports Server (NTRS)
Iannetti, Anthony; Davoudzadeh, Farhad
2001-01-01
A systematic effort is in progress to further validate the National Combustion Code (NCC) that has been developed at NASA Glenn Research Center (GRC) for comprehensive modeling and simulation of aerospace combustion systems. The validation efforts include numerical simulation of the gas-phase combustor experiments conducted at the Center for Turbulence Research (CTR), Stanford University, followed by comparison and evaluation of the computed results with the experimental data. Presently, at GRC, a numerical model of the experimental gaseous combustor is built to simulate the experimental model. The constructed numerical geometry includes the flow development sections for air annulus and fuel pipe, 24 channel air and fuel swirlers, hub, combustor, and tail pipe. Furthermore, a three-dimensional, multi-block, multi-grid mesh (1.6 million grid points, 3 levels of multi-grid) is generated. Computational simulation of the gaseous combustor flow field operating on methane fuel has started. The computational domain includes the whole flow regime starting from the fuel pipe and the air annulus, through the 12 air and 12 fuel channels, into the combustion region and through the tail pipe.
Progress in Validation of Wind-US for Ramjet/Scramjet Combustion
NASA Technical Reports Server (NTRS)
Engblom, William A.; Frate, Franco C.; Nelson, Chris C.
2005-01-01
Validation of the Wind-US flow solver against two sets of experimental data involving high-speed combustion is attempted. First, the well-known Burrows-Kurkov supersonic hydrogen-air combustion test case is simulated, and the sensitivity of ignition location and combustion performance to key parameters is explored. Second, a numerical model is developed for simulation of an X-43B candidate, full-scale, JP-7-fueled, internal flowpath operating in ramjet mode. Numerical results using an ethylene-air chemical kinetics model are directly compared against previously existing pressure-distribution data along the entire flowpath, obtained in direct-connect testing conducted at NASA Langley Research Center. Comparisons to derived quantities such as burn efficiency and thermal throat location are also made. Reasonable to excellent agreement with experimental data is demonstrated for key parameters in both simulation efforts. Additional Wind-US features needed to improve simulation efforts are described herein, including maintaining stagnation conditions at inflow boundaries for multi-species flow. An open issue regarding the sensitivity of isolator unstart to key model parameters is briefly discussed.
Overview of NASA Multi-dimensional Stirling Convertor Code Development and Validation Effort
NASA Technical Reports Server (NTRS)
Tew, Roy C.; Cairelli, James E.; Ibrahim, Mounir B.; Simon, Terrence W.; Gedeon, David
2002-01-01
A NASA grant has been awarded to Cleveland State University (CSU) to develop a multi-dimensional (multi-D) Stirling computer code with the goals of improving loss predictions and identifying component areas for improvements. The University of Minnesota (UMN) and Gedeon Associates are teamed with CSU. Development of test rigs at UMN and CSU and validation of the code against test data are part of the effort. The one-dimensional (1-D) Stirling codes used for design and performance prediction do not rigorously model regions of the working space where abrupt changes in flow area occur (such as manifolds and other transitions between components). Certain hardware experiences have demonstrated large performance gains by varying manifolds and heat exchanger designs to improve flow distributions in the heat exchangers. 1-D codes were not able to predict these performance gains. An accurate multi-D code should improve understanding of the effects of area changes along the main flow axis, sensitivity of performance to slight changes in internal geometry, and, in general, the understanding of various internal thermodynamic losses. The commercial CFD-ACE code has been chosen for development of the multi-D code. This 2-D/3-D code has highly developed pre- and post-processors, and moving boundary capability. Preliminary attempts at validation of CFD-ACE models of MIT gas spring and "two space" test rigs were encouraging. Also, CSU's simulations of the UMN oscillating-flow rig compare well with flow visualization results from UMN. A complementary Department of Energy (DOE) Regenerator Research effort is aiding in development of regenerator matrix models that will be used in the multi-D Stirling code. This paper reports on the progress and challenges of this effort.
Synergies Between Grace and Regional Atmospheric Modeling Efforts
NASA Astrophysics Data System (ADS)
Kusche, J.; Springer, A.; Ohlwein, C.; Hartung, K.; Longuevergne, L.; Kollet, S. J.; Keune, J.; Dobslaw, H.; Forootan, E.; Eicker, A.
2014-12-01
In the meteorological community, efforts converge towards implementation of high-resolution (< 12 km) data-assimilating regional climate modelling/monitoring systems based on numerical weather prediction (NWP) cores. This is driven by requirements of improving process understanding, better representation of land surface interactions, atmospheric convection, orographic effects, and better forecasting on shorter timescales. This is relevant for the GRACE community since (1) these models may provide improved atmospheric mass separation / de-aliasing and smaller topography-induced errors, compared to global (ECMWF-Op, ERA-Interim) data, (2) they inherit high temporal resolution from NWP models, (3) parallel efforts towards improving the land surface component and coupling groundwater models may provide realistic hydrological mass estimates with sub-diurnal resolution, (4) parallel efforts towards re-analyses aim at providing consistent time series. (5) On the other hand, GRACE can help validate models and aid in the identification of processes needing improvement. A coupled atmosphere - land surface - groundwater modelling system is currently being implemented for the European CORDEX region at 12.5 km resolution, based on the TerrSysMP platform (COSMO-EU NWP, CLM land surface and ParFlow groundwater models). We report results from Springer et al. (J. Hydromet., accepted) on validating the water cycle in COSMO-EU using GRACE and precipitation, evapotranspiration and runoff data, confirming that the model performs favorably at representing observations. We show that after GRACE-derived bias correction, basin-average hydrological conditions prior to 2002 can be reconstructed better than before. Next, comparing GRACE with CLM forced by EURO-CORDEX simulations allows identifying processes needing improvement in the model. Finally, we compare COSMO-EU atmospheric pressure, a proxy for mass corrections in satellite gravimetry, with ERA-Interim over Europe at timescales shorter/longer than 1 month, and spatial scales below/above ERA resolution. We find differences between the regional and global models to be more pronounced at high frequencies, with magnitude at sub-grid scale and larger scale corresponding to 1-3 hPa (1-3 cm EWH); relevant for the assessment of post-GRACE concepts.
The Model Analyst’s Toolkit: Scientific Model Development, Analysis, and Validation
2015-02-20
...being integrated within MAT, including Granger causality. Granger causality tests whether a data series helps when predicting future values of another... relations by econometric models and cross-spectral methods. Econometrica: Journal of the Econometric Society, 424-438. Granger, C. W. (1980). Testing... testing dataset. This effort is described in Section 3.2. 3.1 Improvements in Granger Causality User Interface: Various metrics of causality are...
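The fragmentary record above mentions Granger causality, which tests whether past values of one series improve prediction of another. A minimal sketch of that test using statsmodels (not MAT's own implementation); the two series below are invented so that x should Granger-cause y.

    import numpy as np
    import pandas as pd
    from statsmodels.tsa.stattools import grangercausalitytests

    rng = np.random.default_rng(2)

    # Two synthetic series: y is driven by past values of x, so lags of x
    # should help predict y (illustrating the test itself, not MAT).
    n = 500
    x = rng.normal(size=n)
    y = np.zeros(n)
    for t in range(2, n):
        y[t] = 0.6 * x[t - 1] + 0.2 * y[t - 1] + rng.normal(scale=0.5)

    data = pd.DataFrame({"y": y, "x": x})
    # Tests whether lags of the second column help predict the first column.
    results = grangercausalitytests(data[["y", "x"]], maxlag=2)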
Multivariate Modelling of the Career Intent of Air Force Personnel.
1980-09-01
...index (HOPP) was used as a measure of current job satisfaction. As with the Vroom and Fishbein/Graen models, two separate validations were accom... Organizational Behavior and Human Performance, 23: 251-267, 1979. Lewis, Logan M. "Expectancy Theory as a Predictive Model of Career Intent, Job Satisfaction..." ...W. Albright. "Expectancy Theory Predictions of the Satisfaction, Effort, Performance, and Retention of Naval Aviation Officers," Organizational...
A Model for Pharmacological Research-Treatment of Cocaine Dependence
Montoya, Ivan D.; Hess, Judith M.; Preston, Kenzie L.; Gorelick, David A.
2008-01-01
Major problems for research on pharmacological treatments for cocaine dependence are lack of comparability of results from different treatment research programs and poor validity and/or reliability of results. Double-blind, placebo-controlled, random assignment, experimental designs, using standard intake and assessment procedures help to reduce these problems. Cessation or reduction of drug use and/or craving, retention in treatment, and medical and psychosocial improvement are some of the outcome variables collected in treatment research programs. A model to be followed across different outpatient clinical trials for pharmacological treatment of cocaine dependence is presented here. This model represents an effort to standardize data collection to make results more valid and comparable. PMID:8749725
Fixed gain and adaptive techniques for rotorcraft vibration control
NASA Technical Reports Server (NTRS)
Roy, R. H.; Saberi, H. A.; Walker, R. A.
1985-01-01
The results of an analysis effort performed to demonstrate the feasibility of employing approximate dynamical models and frequency-shaped cost functional control law design techniques for helicopter vibration suppression are presented. Both fixed gain and adaptive control designs based on linear second order dynamical models were implemented in a detailed Rotor Systems Research Aircraft (RSRA) simulation to validate these active vibration suppression control laws. Approximate models of fuselage flexibility were included in the RSRA simulation in order to more accurately characterize the structural dynamics. The results for both the fixed gain and adaptive approaches are promising and provide a foundation for pursuing further validation in more extensive simulation studies and in wind tunnel and/or flight tests.
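As a hedged illustration of the kind of linear-quadratic design such an effort builds on, the sketch below computes a plain LQR gain for one assumed second-order fuselage mode using scipy. The modal frequency, damping, and weights are assumptions for illustration, and the frequency-shaped cost functional used in the paper is a further refinement not shown here.

    import numpy as np
    from scipy.linalg import solve_continuous_are

    # Minimal sketch: a standard LQR gain for a single lightly damped second-order
    # mode (a stand-in for one fuselage flexible mode).
    wn, zeta = 2 * np.pi * 4.0, 0.02           # assumed 4 Hz mode, 2% damping
    A = np.array([[0.0, 1.0], [-wn**2, -2 * zeta * wn]])
    B = np.array([[0.0], [1.0]])
    Q = np.diag([wn**2, 1.0])                  # penalize displacement and rate
    R = np.array([[1e-3]])                     # control effort weight

    P = solve_continuous_are(A, B, Q, R)
    K = np.linalg.solve(R, B.T @ P)            # optimal state feedback u = -K x
    closed_loop_poles = np.linalg.eigvals(A - B @ K)
    print("LQR gain:", K, "\nclosed-loop poles:", closed_loop_poles)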
Gearbox Reliability Collaborative Investigation of Gearbox Motion and High-Speed-Shaft Loads
DOE Office of Scientific and Technical Information (OSTI.GOV)
Keller, Jon; Guo, Yi; Sethuraman, Latha
2016-03-18
This paper extends a model-to-test validation effort to examine the effect of different constant rotor torque and moment conditions and intentional generator misalignment on the gearbox motion and high-speed-shaft loads. Fully validating gearbox motion and high-speed-shaft loads across a range of test conditions is a critical precursor to examining the bearing loads, as the gearbox motion and high-speed-shaft loads are the drivers of these bearing loads.
Validation of Computational Models in Biomechanics
Henninger, Heath B.; Reese, Shawn P.; Anderson, Andrew E.; Weiss, Jeffrey A.
2010-01-01
The topics of verification and validation (V&V) have increasingly been discussed in the field of computational biomechanics, and many recent articles have applied these concepts in an attempt to build credibility for models of complex biological systems. V&V are evolving techniques that, if used improperly, can lead to false conclusions about a system under study. In basic science these erroneous conclusions may lead to failure of a subsequent hypothesis, but they can have more profound effects if the model is designed to predict patient outcomes. While several authors have reviewed V&V as they pertain to traditional solid and fluid mechanics, it is the intent of this manuscript to present them in the context of computational biomechanics. Specifically, the task of model validation will be discussed with a focus on current techniques. It is hoped that this review will encourage investigators to engage and adopt the V&V process in an effort to increase peer acceptance of computational biomechanics models. PMID:20839648
Validation of Blockage Interference Corrections in the National Transonic Facility
NASA Technical Reports Server (NTRS)
Walker, Eric L.
2007-01-01
A validation test has recently been constructed for wall interference methods as applied to the National Transonic Facility (NTF). The goal of this study was to begin to address the uncertainty of wall-induced-blockage interference corrections, which will make it possible to address the overall quality of data generated by the facility. The validation test itself is not specific to any particular modeling. For this present effort, the Transonic Wall Interference Correction System (TWICS) as implemented at the NTF is the mathematical model being tested. TWICS uses linear, potential boundary conditions that must first be calibrated. These boundary conditions include three different classical, linear, homogeneous forms that have been historically used to approximate the physical behavior of longitudinally slotted test section walls. Results of the application of the calibrated wall boundary conditions are discussed in the context of the validation test.
Propeller aircraft interior noise model utilization study and validation
NASA Technical Reports Server (NTRS)
Pope, L. D.
1984-01-01
Utilization and validation of a computer program designed for aircraft interior noise prediction is considered. The program, entitled PAIN (an acronym for Propeller Aircraft Interior Noise), permits (in theory) predictions of sound levels inside propeller driven aircraft arising from sidewall transmission. The objective of the work reported was to determine the practicality of making predictions for various airplanes and the extent of the program's capabilities. The ultimate purpose was to discern the quality of predictions for tonal levels inside an aircraft occurring at the propeller blade passage frequency and its harmonics. The effort involved three tasks: (1) program validation through comparisons of predictions with scale-model test results; (2) development of utilization schemes for large (full scale) fuselages; and (3) validation through comparisons of predictions with measurements taken in flight tests on a turboprop aircraft. Findings should enable future users of the program to efficiently undertake and correctly interpret predictions.
SPR Hydrostatic Column Model Verification and Validation.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bettin, Giorgia; Lord, David; Rudeen, David Keith
2015-10-01
A Hydrostatic Column Model (HCM) was developed to help differentiate between normal "tight" well behavior and small-leak behavior under nitrogen for testing the pressure integrity of crude oil storage wells at the U.S. Strategic Petroleum Reserve. This effort was motivated by steady, yet distinct, pressure behavior of a series of Big Hill caverns that have been placed under nitrogen for an extended period of time. This report describes the HCM model, its functional requirements, the model structure and the verification and validation process. Different modes of operation are also described, which illustrate how the software can be used to model extended nitrogen monitoring and Mechanical Integrity Tests by predicting wellhead pressures along with nitrogen interface movements. Model verification has shown that the program runs correctly and it is implemented as intended. The cavern BH101 long term nitrogen test was used to validate the model, which showed very good agreement with measured data. This supports the claim that the model is, in fact, capturing the relevant physical phenomena and can be used to make accurate predictions of both wellhead pressure and interface movements.
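A minimal sketch of the hydrostatic arithmetic such a column model rests on: pressure at depth as the pressure at the top of the column plus a sum of rho*g*h terms over stacked fluid segments. The densities, thicknesses, and wellhead pressure below are illustrative assumptions, not SPR well values, and the sketch ignores the gas compressibility a real HCM must handle.

    G = 9.81  # m/s^2

    def pressure_at_depth(p_top_pa, segments):
        """segments: list of (density kg/m^3, thickness m), ordered from the top down."""
        p = p_top_pa
        for rho, h in segments:
            p += rho * G * h
        return p

    # Example column: nitrogen over crude oil over brine (all values assumed).
    column = [(180.0, 300.0),    # compressed nitrogen, treated as constant density here
              (850.0, 500.0),    # crude oil
              (1200.0, 200.0)]   # saturated brine
    print("pressure at casing depth (MPa):",
          pressure_at_depth(2.0e6, column) / 1e6)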
Nuclear Energy Knowledge and Validation Center (NEKVaC) Needs Workshop Summary Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gougar, Hans
2015-02-01
The Department of Energy (DOE) has made significant progress developing simulation tools to predict the behavior of nuclear systems with greater accuracy and of increasing our capability to predict the behavior of these systems outside of the standard range of applications. These analytical tools require a more complex array of validation tests to accurately simulate the physics and multiple length and time scales. Results from modern simulations will allow experiment designers to narrow the range of conditions needed to bound system behavior and to optimize the deployment of instrumentation to limit the breadth and cost of the campaign. Modern validation, verification and uncertainty quantification (VVUQ) techniques enable analysts to extract information from experiments in a systematic manner and provide the users with a quantified uncertainty estimate. Unfortunately, the capability to perform experiments that would enable taking full advantage of the formalisms of these modern codes has progressed relatively little (with some notable exceptions in fuels and thermal-hydraulics); the majority of the experimental data available today is the "historic" data accumulated over the last decades of nuclear systems R&D. A validated code-model is a tool for users. An unvalidated code-model is useful for code developers to gain understanding, publish research results, attract funding, etc. As nuclear analysis codes have become more sophisticated, so have the measurement and validation methods and the challenges that confront them. A successful yet cost-effective validation effort requires expertise possessed only by a few, resources possessed only by the well-capitalized (or a willing collective), and a clear, well-defined objective (validating a code that is developed to satisfy the need(s) of an actual user). To that end, the Idaho National Laboratory established the Nuclear Energy Knowledge and Validation Center to address the challenges of modern code validation and to manage the knowledge from past, current, and future experimental campaigns. By pulling together the best minds involved in code development, experiment design, and validation to establish and disseminate best practices and new techniques, the Nuclear Energy Knowledge and Validation Center (NEKVaC or the ‘Center’) will be a resource for industry, DOE Programs, and academia validation efforts.
The single-zone numerical model of homogeneous charge compression ignition engine performance
NASA Astrophysics Data System (ADS)
Fedyanov, E. A.; Itkis, E. M.; Kuzmin, V. N.; Shumskiy, S. N.
2017-02-01
A single-zone model of methane-air mixture combustion in a Homogeneous Charge Compression Ignition (HCCI) engine was developed. First modeling efforts resulted in the selection of the detailed kinetic reaction mechanism most appropriate for the conditions of the HCCI process. The model was then completed so as to simulate the performance of the four-stroke engine and was coupled with physically reasonable adjusting functions. Validation of the calculations against experimental data showed acceptable agreement.
Why Autism Must Be Taken Apart
ERIC Educational Resources Information Center
Waterhouse, Lynn; Gillberg, Christopher
2014-01-01
Although accumulated evidence has demonstrated that autism is found with many varied brain dysfunctions, researchers have tried to find a single brain dysfunction that would provide neurobiological validity for autism. However, unitary models of autism brain dysfunction have not adequately addressed conflicting evidence, and efforts to find a…
Validation of Model Simulations of Anvil Cirrus Properties During TWP-ICE: Final Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zipser, Edward J.
2013-05-20
This 3-year grant, with two extensions, resulted in a successful 5-year effort, led by Ph.D. student Adam Varble, to compare cloud resolving model (CRM) simulations with the excellent database obtained during the TWP-ICE field campaign. The objective, largely achieved, is to undertake these comparisons comprehensively and quantitatively, informing the community in ways that go beyond pointing out errors in the models and point out ways to improve both cloud dynamics and microphysics parameterizations in future modeling efforts. Under DOE support, Adam Varble, with considerable assistance from Dr. Ann Fridlind and others, entrained scientists who ran some 10 different CRMs and 4 different limited area models (LAMs) using a variety of microphysics parameterizations, to ensure that the conclusions of the study will have considerable generality.
Nateghi, Roshanak; Guikema, Seth D; Quiring, Steven M
2011-12-01
This article compares statistical methods for modeling power outage durations during hurricanes and examines the predictive accuracy of these methods. Being able to make accurate predictions of power outage durations is valuable because the information can be used by utility companies to plan their restoration efforts more efficiently. This information can also help inform customers and public agencies of the expected outage times, enabling better collective response planning, and coordination of restoration efforts for other critical infrastructures that depend on electricity. In the long run, outage duration estimates for future storm scenarios may help utilities and public agencies better allocate risk management resources to balance the disruption from hurricanes with the cost of hardening power systems. We compare the out-of-sample predictive accuracy of five distinct statistical models for estimating power outage duration times caused by Hurricane Ivan in 2004. The methods compared include both regression models (accelerated failure time (AFT) and Cox proportional hazard models (Cox PH)) and data mining techniques (regression trees, Bayesian additive regression trees (BART), and multivariate additive regression splines). We then validate our models against two other hurricanes. Our results indicate that BART yields the best prediction accuracy and that it is possible to predict outage durations with reasonable accuracy. © 2011 Society for Risk Analysis.
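As a hedged illustration of the out-of-sample comparison described above, the sketch below contrasts a single regression tree with a boosted tree ensemble on invented outage records. BART is not available in scikit-learn, so gradient boosting is used only as a rough stand-in for a flexible tree-based learner; the predictors, coefficients, and error metric are assumptions for illustration.

    import numpy as np
    from sklearn.ensemble import GradientBoostingRegressor
    from sklearn.model_selection import train_test_split
    from sklearn.tree import DecisionTreeRegressor

    rng = np.random.default_rng(3)

    # Synthetic stand-in for outage records: wind speed, tree density, customers served.
    n = 2000
    X = np.column_stack([rng.uniform(20, 70, n), rng.uniform(0, 1, n), rng.uniform(50, 5000, n)])
    duration = np.exp(0.04 * X[:, 0] + 1.2 * X[:, 1]) + rng.gamma(2.0, 2.0, n)  # hours

    X_tr, X_te, y_tr, y_te = train_test_split(X, duration, test_size=0.3, random_state=3)

    models = {
        "regression tree": DecisionTreeRegressor(max_depth=5, random_state=3),
        # Gradient boosting serves here purely as a rough stand-in for BART.
        "boosted trees (BART stand-in)": GradientBoostingRegressor(random_state=3),
    }
    for name, m in models.items():
        m.fit(X_tr, y_tr)
        mae = np.mean(np.abs(m.predict(X_te) - y_te))
        print(f"{name}: out-of-sample MAE = {mae:.2f} h")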
High Fidelity System Simulation of Multiple Components in Support of the UEET Program
NASA Technical Reports Server (NTRS)
Plybon, Ronald C.; VanDeWall, Allan; Sampath, Rajiv; Balasubramaniam, Mahadevan; Mallina, Ramakrishna; Irani, Rohinton
2006-01-01
The High Fidelity System Simulation effort has addressed various important objectives to enable additional capability within the NPSS framework. The scope emphasized High Pressure Turbine and High Pressure Compressor components. Initial effort was directed at developing and validating an intermediate-fidelity NPSS model using PD geometry and was extended to a high-fidelity NPSS model by overlaying detailed geometry to validate CFD against rig data. Both "feed-forward" and "feedback" approaches to analysis zooming were employed to enable system simulation capability in NPSS. These approaches have certain benefits and applicability in terms of specific applications: "feedback" zooming allows the flow-up of information from high-fidelity analysis to be used to update the NPSS model results by forcing the NPSS solver to converge to high-fidelity analysis predictions. This approach is effective in improving the accuracy of the NPSS model; however, it can only be used in circumstances where there is a clear physics-based strategy to flow up the high-fidelity analysis results to update the NPSS system model. The "feed-forward" zooming approach is more broadly useful in terms of enabling detailed analysis at early stages of design for a specified set of critical operating points and using these analysis results to drive design decisions early in the development process.
Overview of heat transfer and fluid flow problem areas encountered in Stirling engine modeling
NASA Technical Reports Server (NTRS)
Tew, Roy C., Jr.
1988-01-01
NASA Lewis Research Center has been managing Stirling engine development programs for over a decade. In addition to contractual programs, this work has included in-house engine testing and development of engine computer models. Attempts to validate Stirling engine computer models with test data have demonstrated that engine thermodynamic losses need better characterization. Various Stirling engine thermodynamic losses and efforts that are underway to characterize these losses are discussed.
SLS Navigation Model-Based Design Approach
NASA Technical Reports Server (NTRS)
Oliver, T. Emerson; Anzalone, Evan; Geohagan, Kevin; Bernard, Bill; Park, Thomas
2018-01-01
The SLS Program chose to implement a Model-based Design and Model-based Requirements approach for managing component design information and system requirements. This approach differs from previous large-scale design efforts at Marshall Space Flight Center where design documentation alone conveyed information required for vehicle design and analysis and where extensive requirements sets were used to scope and constrain the design. The SLS Navigation Team has been responsible for the Program-controlled Design Math Models (DMMs) which describe and represent the performance of the Inertial Navigation System (INS) and the Rate Gyro Assemblies (RGAs) used by Guidance, Navigation, and Controls (GN&C). The SLS Navigation Team is also responsible for the navigation algorithms. The navigation algorithms are delivered for implementation on the flight hardware as a DMM. For the SLS Block 1-B design, the additional GPS Receiver hardware is managed as a DMM at the vehicle design level. This paper provides a discussion of the processes and methods used to engineer, design, and coordinate engineering trades and performance assessments using SLS practices as applied to the GN&C system, with a particular focus on the Navigation components. These include composing system requirements, requirements verification, model development, model verification and validation, and modeling and analysis approaches. The Model-based Design and Requirements approach does not reduce the effort associated with the design process versus previous processes used at Marshall Space Flight Center. Instead, the approach takes advantage of overlap between the requirements development and management process, and the design and analysis process by efficiently combining the control (i.e. the requirement) and the design mechanisms. The design mechanism is the representation of the component behavior and performance in design and analysis tools. The focus in the early design process shifts from the development and management of design requirements to the development of usable models, model requirements, and model verification and validation efforts. The models themselves are represented in C/C++ code and accompanying data files. Under the idealized process, potential ambiguity in specification is reduced because the model must be implementable versus a requirement which is not necessarily subject to this constraint. Further, the models are shown to emulate the hardware during validation. For models developed by the Navigation Team, a common interface/standalone environment was developed. The common environment allows for easy implementation in design and analysis tools. Mechanisms such as unit test cases ensure implementation as the developer intended. The model verification and validation process provides a very high level of component design insight. The origin and implementation of the SLS variant of Model-based Design is described from the perspective of the SLS Navigation Team. The format of the models and the requirements are described. The Model-based Design approach has many benefits but is not without potential complications. Key lessons learned associated with the implementation of the Model Based Design approach and process from infancy to verification and certification are discussed
WFIRST: Coronagraph Systems Engineering and Performance Budgets
NASA Astrophysics Data System (ADS)
Poberezhskiy, Ilya; cady, eric; Frerking, Margaret A.; Kern, Brian; Nemati, Bijan; Noecker, Martin; Seo, Byoung-Joon; Zhao, Feng; Zhou, Hanying
2018-01-01
The WFIRST coronagraph instrument (CGI) will be the first in-space coronagraph using active wavefront control to directly image and characterize mature exoplanets and zodiacal disks in reflected starlight. For CGI systems engineering, including requirements development, CGI performance is predicted using a hierarchy of performance budgets to estimate various noise components — spatial and temporal flux variations — that obscure exoplanet signals in direct imaging and spectroscopy configurations. These performance budgets are validated through robust integrated modeling and testbed model validation efforts. We present the performance budgeting framework used by WFIRST for the flow-down of coronagraph science requirements, mission constraints, and observatory interfaces to measurable instrument engineering parameters.
Validation and Improvement of Reliability Methods for Air Force Building Systems
...focusing primarily on HVAC systems. This research used contingency analysis to assess the performance of each model for HVAC systems at six Air Force... probabilistic model produced inflated reliability calculations for HVAC systems. In light of these findings, this research employed a stochastic method, a... Nonhomogeneous Poisson Process (NHPP), in an attempt to produce accurate HVAC system reliability calculations. This effort ultimately concluded that...
The impact of effort-reward imbalance on quality of life among Japanese working men.
Watanabe, Mayumi; Tanaka, Katsutoshi; Aratake, Yutaka; Kato, Noritada; Sakata, Yumi
2008-07-01
Health-related quality of life (HRQL) is an important measure of health outcome in working and healthy populations. Here, we investigated the impact of effort-reward imbalance (ERI), a representative work-stress model, on HRQL of Japanese working men. The study targeted 1,096 employees from a manufacturing plant in Japan. To assess HRQL and ERI, participants were surveyed using the Japanese version of the Short-Form 8 Health Survey (SF-8) and effort-reward imbalance model. Of the 1,096 employees, 1,057 provided valid responses to the questionnaire. For physical summary scores, the adjusted effort-reward imbalance odds ratios of middle vs. bottom and top vs. bottom tertiles were 0.24 (95% confidence interval, 0.08-0.70) and 0.09 (95% confidence interval, 0.03-0.28), respectively. For mental summary scores, ratios were 0.21 (95% confidence interval, 0.07-0.63) and 0.07 (95% confidence interval, 0.02-0.25), respectively. These findings demonstrate that effort-reward imbalance is independently associated with HRQL among Japanese employees.
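A hedged sketch of how tertile-based odds ratios like those above can be computed with logistic regression. The data, the single covariate adjustment (age only), the direction of the outcome coding, and all coefficient values are illustrative assumptions, not the study's analysis.

    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    rng = np.random.default_rng(4)

    # Synthetic stand-in: effort-reward ratio, age, and a binary "good physical HRQL" outcome.
    n = 1000
    eri = rng.lognormal(mean=-0.2, sigma=0.4, size=n)
    age = rng.integers(20, 60, n)
    p_good = 1 / (1 + np.exp(-(2.0 - 2.5 * np.log(eri) - 0.01 * age)))
    good_hrql = rng.random(n) < p_good

    df = pd.DataFrame({"eri": eri, "age": age, "good_hrql": good_hrql.astype(int)})
    df["tertile"] = pd.qcut(df["eri"], 3, labels=["bottom", "middle", "top"])

    # Indicator coding with the bottom tertile as reference, adjusted for age only here.
    X = pd.get_dummies(df["tertile"], drop_first=True).astype(float)
    X["age"] = df["age"]
    X = sm.add_constant(X)

    fit = sm.Logit(df["good_hrql"], X).fit(disp=0)
    odds_ratios = np.exp(fit.params)
    conf_int = np.exp(fit.conf_int())      # 95% confidence intervals on the OR scale
    print(pd.concat([odds_ratios.rename("OR"), conf_int], axis=1))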
A survey of Applied Psychological Services' models of the human operator
NASA Technical Reports Server (NTRS)
Siegel, A. I.; Wolf, J. J.
1979-01-01
A historical perspective is presented in terms of the major features and status of two families of computer simulation models in which the human operator plays the primary role. Both task oriented and message oriented models are included. Two other recent efforts are summarized which deal with visual information processing. They involve not whole model development but a family of subroutines customized to add the human aspects to existing models. A global diagram of the generalized model development/validation process is presented and related to 15 criteria for model evaluation.
Enhancing Transfer Effectiveness: A Model for the 1990s.
ERIC Educational Resources Information Center
Berman, Paul; And Others
In an effort to identify effective transfer practices appropriate to different community college circumstances, and to establish a quantitative database that would enable valid comparisons of transfer between their 28 member institutions, the National Effective Transfer Consortium (NETC) sponsored a survey of more than 30,000 students attending…
ERIC Educational Resources Information Center
Chai, Ching Shing; Ng, Eugenia M. W.; Li, Wenhao; Hong, Huang-Yao; Koh, Joyce H. L.
2013-01-01
The Technological Pedagogical Content Knowledge (TPCK) framework has been adopted by many educational technologists and teacher educators for the research and development of knowledge about the pedagogical uses of Information and Communication Technologies (ICT) in classrooms. While the framework is potentially very important, efforts to survey…
Improving Quality in Education: Dynamic Approaches to School Improvement
ERIC Educational Resources Information Center
Creemers, Bert P. M.; Kyriakides, Leonidas
2011-01-01
This book explores an approach to school improvement that merges the traditions of educational effectiveness research and school improvement efforts. It displays how the dynamic model, which is theoretical and empirically validated, can be used in both traditions. Each chapter integrates evidence from international and national studies, showing…
ERIC Educational Resources Information Center
Gluschkoff, Kia; Elovainio, Marko; Keltikangas-Järvinen, Liisa; Hintsanen, Mirka; Mullola, Sari; Hintsa, Taina
2016-01-01
Introduction: We examined the associations and proportionate contributions of three well-validated models of stressful psychosocial work environment (job strain, effort-reward imbalance, and organizational injustice) in explaining depressive symptoms among primary school teachers. In addition, we tested the mediating role of different types of…
Ethnic Differences in Decisional Balance and Stages of Mammography Adoption
ERIC Educational Resources Information Center
Otero-Sabogal, Regina; Stewart, Susan; Shema, Sarah J.; Pasick, Rena J.
2007-01-01
Behavioral theories developed through research with mainstream, English-speaking populations have been applied to ethnically diverse and underserved communities in the effort to eliminate disparities in early breast cancer detection. This study tests the validity of the transtheoretical model (TTM) decisional balance measure and the application of…
Determination of indoor exposure levels commonly involves assumptions of fully mixed ventilation conditions. In the effort to determine contaminant levels with efficiency, the nodal approach is common in modeling of the indoor environment. To quantify the transport phenomenon or ...
Track/train dynamics test procedure transfer function test
NASA Technical Reports Server (NTRS)
Vigil, R. A.
1975-01-01
A transfer function vibration test was made on an 80 ton open hopper freight car in an effort to obtain validation data on the car's nonlinear elastic model. Test configuration, handling, test facilities, test operations, and data acquisition/reduction activities necessary to meet the conditions of test requirements are given.
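As a hedged illustration of the data reduction behind a transfer function test, the sketch below estimates a frequency response from simulated excitation and response records using the H1 estimator (cross-spectrum divided by input auto-spectrum). The modal frequency, damping, sampling rate, and noise level are assumed values for illustration, not properties of the hopper car or the original test procedure.

    import numpy as np
    from scipy.signal import csd, lsim, welch

    rng = np.random.default_rng(5)

    # Synthetic stand-in for a transfer-function test: broadband shaker input x(t)
    # exciting a single lightly damped mode, with measured response y(t) plus noise.
    fs, T = 512.0, 60.0
    t = np.arange(0, T, 1 / fs)
    x = rng.normal(size=t.size)

    fn, zeta = 12.0, 0.03                      # assumed modal frequency (Hz) and damping ratio
    wn = 2 * np.pi * fn
    num, den = [wn**2], [1.0, 2 * zeta * wn, wn**2]
    _, y, _ = lsim((num, den), U=x, T=t)
    y = y + 0.01 * rng.normal(size=t.size)

    # H1 estimator: H(f) = Pxy(f) / Pxx(f), from cross- and auto-spectra.
    f, Pxy = csd(x, y, fs=fs, nperseg=2048)
    _, Pxx = welch(x, fs=fs, nperseg=2048)
    H = Pxy / Pxx
    print("estimated resonance near (Hz):", f[np.argmax(np.abs(H))])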
An examination of data quality on QSAR Modeling in regards ...
The development of QSAR models is critically dependent on the quality of available data. As part of our efforts to develop public platforms to provide access to predictive models, we have attempted to discriminate the influence of the quality versus quantity of data available to develop and validate QSAR models. We have focused our efforts on the widely used EPISuite software that was initially developed over two decades ago and, specifically, on the PHYSPROP dataset used to train the EPISuite prediction models. This presentation will review our approaches to examining key datasets, the delivery of curated data and the development of machine-learning models for thirteen separate property endpoints of interest to environmental science. We will also review how these data will be made freely accessible to the community via a new “chemistry dashboard”. This abstract does not reflect U.S. EPA policy. Presentation at UNC-CH.
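As a hedged illustration of the quality-versus-quantity question raised above, the sketch below trains the same model on curated versus deliberately corrupted labels and compares external error. The descriptor matrix, property endpoint, and corruption rate are invented, not the PHYSPROP data or the EPISuite workflow.

    import numpy as np
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(7)

    # Synthetic stand-in descriptors and a property endpoint (illustrative only).
    n, d = 1500, 20
    X = rng.normal(size=(n, d))
    y_true = X[:, :5].sum(axis=1) + rng.normal(0, 0.3, n)

    # "Uncurated" copy: 10% of records get a grossly wrong value (unit/sign slips).
    y_noisy = y_true.copy()
    bad = rng.choice(n, n // 10, replace=False)
    y_noisy[bad] = y_noisy[bad] * -3.0

    X_tr, X_te, i_tr, i_te = train_test_split(X, np.arange(n), test_size=0.3, random_state=7)
    for label, y in [("curated", y_true), ("uncurated", y_noisy)]:
        model = RandomForestRegressor(n_estimators=200, random_state=7).fit(X_tr, y[i_tr])
        rmse = np.sqrt(np.mean((model.predict(X_te) - y_true[i_te]) ** 2))  # judged on clean labels
        print(f"{label} training labels: external RMSE = {rmse:.2f}")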
Software development predictors, error analysis, reliability models and software metric analysis
NASA Technical Reports Server (NTRS)
Basili, Victor
1983-01-01
The use of dynamic characteristics as predictors for software development was studied. It was found that there are some significant factors that could be useful as predictors. From a study on software errors and complexity, it was shown that meaningful results can be obtained which allow insight into software traits and the environment in which it is developed. Reliability models were studied. The research included the field of program testing because the validity of some reliability models depends on the answers to some unanswered questions about testing. In studying software metrics, data collected from seven software engineering laboratory (FORTRAN) projects were examined and three effort reporting accuracy checks were applied to demonstrate the need to validate a data base. Results are discussed.
NASA Technical Reports Server (NTRS)
Mcknight, R. L.
1985-01-01
Accomplishments are described for the second-year effort of a 3-year program to develop methodology for component-specific modeling of aircraft engine hot section components (turbine blades, turbine vanes, and burner liners). These accomplishments include: (1) engine thermodynamic and mission models; (2) geometry model generators; (3) remeshing; (4) specialty 3-D inelastic structural analysis; (5) computationally efficient solvers; (6) adaptive solution strategies; (7) engine performance parameters/component response variables decomposition and synthesis; (8) integrated software architecture and development; and (9) validation cases for software developed.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Koh, J. H.; Ng, E. Y. K.; Robertson, Amy
As part of a collaboration of the National Renewable Energy Laboratory (NREL) and SWAY AS, NREL installed scientific wind, wave, and motion measurement equipment on the spar-type 1/6.5th-scale prototype SWAY floating offshore wind system. The equipment enhanced SWAY's data collection and allowed SWAY to verify the concept and NREL to validate a FAST model of the SWAY design in an open-water condition. Nanyang Technological University (NTU), in collaboration with NREL, assisted with the validation. This final report gives an overview of the SWAY prototype and NREL and NTU's efforts to validate a model of the system. The report provides a summary of the different software tools used in the study, the modeling strategies, and the development of a FAST model of the SWAY prototype wind turbine, including justification of the modeling assumptions. Because of uncertainty in system parameters and modeling assumptions due to the complexity of the design, several system properties were tuned to better represent the system and improve the accuracy of the simulations. Calibration was performed using data from a static equilibrium test and free-decay tests.
Lee, Chien-Ching; Lin, Shih-Pin; Yang, Shu-Ling; Tsou, Mei-Yung; Chang, Kuang-Yi
2013-03-01
Medical institutions are eager to introduce new information technology to improve patient safety and clinical efficiency. However, the acceptance of new information technology by medical personnel plays a key role in its adoption and application. This study aims to investigate whether perceived organizational learning capability (OLC) is associated with user acceptance of information technology among operating room nurse staff. Nurse anesthetists and operating room nurses were recruited in this questionnaire survey. A pilot study was performed to ensure the reliability and validity of the translated questionnaire, which consisted of 14 items from the four dimensions of OLC, and 16 items from the four constructs of user acceptance of information technology, including performance expectancy, effort expectancy, social influence, and behavioral intention. Confirmatory factor analysis was applied in the main survey to evaluate the construct validity of the questionnaire. Structural equation modeling was used to test the hypothetical relationships between the four dimensions of user acceptance of information technology and the second-ordered OLC. Goodness of fit of the hypothetic model was also assessed. Performance expectancy, effort expectancy, and social influence positively influenced behavioral intention of users of the clinical information system (all p < 0.001) and accounted for 75% of its variation. The second-ordered OLC was positively associated with performance expectancy, effort expectancy, and social influence (all p < 0.001). However, the hypothetic relationship between perceived OLC and behavioral intention was not significant (p = 0.87). The fit statistical analysis indicated reasonable model fit to data (root mean square error of approximation = 0.07 and comparative fit index = 0.91). Perceived OLC indirectly affects user behavioral intention through the mediation of performance expectancy, effort expectancy, and social influence in the operating room setting. Copyright © 2013. Published by Elsevier B.V.
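As a simplified, hedged illustration of the structural paths reported above (performance expectancy, effort expectancy, and social influence predicting behavioral intention), the sketch below uses ordinary least squares on invented scale scores. It is a crude stand-in for, not a reproduction of, the study's latent-variable structural equation model, which also treated OLC as a second-order factor.

    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    rng = np.random.default_rng(6)

    # Synthetic Likert-style scale means for the three predictors and behavioral intention.
    n = 400
    pe = rng.normal(3.5, 0.6, n)                     # performance expectancy
    ee = rng.normal(3.3, 0.6, n)                     # effort expectancy
    si = rng.normal(3.0, 0.7, n)                     # social influence
    bi = 0.4 * pe + 0.3 * ee + 0.2 * si + rng.normal(0, 0.4, n)

    df = pd.DataFrame({"pe": pe, "ee": ee, "si": si, "bi": bi})
    X = sm.add_constant(df[["pe", "ee", "si"]])
    fit = sm.OLS(df["bi"], X).fit()
    print(fit.params)        # path-like coefficients for the three predictors
    print(fit.rsquared)      # share of variance explained in behavioral intention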
Longitudinal Validation of General and Specific Structural Features of Personality Pathology
Wright, Aidan G.C.; Hopwood, Christopher J.; Skodol, Andrew E.; Morey, Leslie C.
2016-01-01
Theorists have long argued that personality disorder (PD) is best understood in terms of general impairments shared across the disorders as well as more specific instantiations of pathology. A model based on this theoretical structure was proposed as part of the DSM-5 revision process. However, only recently has this structure been subjected to formal quantitative evaluation, with little in the way of validation efforts via external correlates or prospective longitudinal prediction. We used the Collaborative Longitudinal Study of Personality Disorders dataset to: (1) estimate structural models that parse general from specific variance in personality disorder features, (2) examine patterns of growth in general and specific features over the course of 10 years, and (3) establish concurrent and dynamic longitudinal associations in PD features and a host of external validators including basic personality traits and psychosocial functioning scales. We found that general PD exhibited much lower absolute stability and was most strongly related to broad markers of psychosocial functioning, concurrently and longitudinally, whereas specific features had much higher mean stability and exhibited more circumscribed associations with functioning. However, both general and specific factors showed recognizable associations with normative and pathological traits. These results can inform efforts to refine the conceptualization and diagnosis of personality pathology. PMID:27819472
Dunmyre, Justin R
2011-06-01
The pre-Bötzinger complex of the mammalian brainstem is a heterogeneous neuronal network, and individual neurons within the network have varying strengths of the persistent sodium and calcium-activated nonspecific cationic currents. Individually, these currents have been the focus of modeling efforts. Previously, Dunmyre et al. (J Comput Neurosci 1-24, 2011) proposed a model and studied the interactions of these currents within one self-coupled neuron. In this work, I consider two identical, reciprocally coupled model neurons and validate the reduction to the self-coupled case. I find that all of the dynamics of the two model neuron network and the regions of parameter space where these distinct dynamics are found are qualitatively preserved in the reduction to the self-coupled case.
Sebok, Angelia; Wickens, Christopher D
2017-03-01
The objectives were to (a) implement theoretical perspectives regarding human-automation interaction (HAI) into model-based tools to assist designers in developing systems that support effective performance and (b) conduct validations to assess the ability of the models to predict operator performance. Two key concepts in HAI, the lumberjack analogy and black swan events, have been studied extensively. The lumberjack analogy describes the effects of imperfect automation on operator performance. In routine operations, an increased degree of automation supports performance, but in failure conditions, increased automation results in more significantly impaired performance. Black swans are the rare and unexpected failures of imperfect automation. The lumberjack analogy and black swan concepts have been implemented into three model-based tools that predict operator performance in different systems. These tools include a flight management system, a remotely controlled robotic arm, and an environmental process control system. Each modeling effort included a corresponding validation. In one validation, the software tool was used to compare three flight management system designs, which were ranked in the same order as predicted by subject matter experts. The second validation compared model-predicted operator complacency with empirical performance in the same conditions. The third validation compared model-predicted and empirically determined time to detect and repair faults in four automation conditions. The three model-based tools offer useful ways to predict operator performance in complex systems. The three tools offer ways to predict the effects of different automation designs on operator performance.
Dislocation dynamics: simulation of plastic flow of bcc metals
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lassila, D H
This is the final report for the LDRD strategic initiative entitled "Dislocation Dynamics: Simulation of Plastic Flow of bcc Metals" (tracking code: 00-SI-011). This report is comprised of 6 individual sections. The first is an executive summary of the project and describes the overall project goal, which is to establish an experimentally validated 3D dislocation dynamics simulation. This first section also gives some information on LLNL's multi-scale modeling efforts associated with the plasticity of bcc metals, and the role of this LDRD project in the multiscale modeling program. The last five sections of this report are journal articles that were produced during the course of the FY-2000 efforts.
Design and validation of diffusion MRI models of white matter
NASA Astrophysics Data System (ADS)
Jelescu, Ileana O.; Budde, Matthew D.
2017-11-01
Diffusion MRI is arguably the method of choice for characterizing white matter microstructure in vivo. Over the typical duration of diffusion encoding, the displacement of water molecules is conveniently on a length scale similar to that of the underlying cellular structures. Moreover, water molecules in white matter are largely compartmentalized which enables biologically-inspired compartmental diffusion models to characterize and quantify the true biological microstructure. A plethora of white matter models have been proposed. However, overparameterization and mathematical fitting complications encourage the introduction of simplifying assumptions that vary between different approaches. These choices impact the quantitative estimation of model parameters with potential detriments to their biological accuracy and promised specificity. First, we review biophysical white matter models in use and recapitulate their underlying assumptions and realms of applicability. Second, we present up-to-date efforts to validate parameters estimated from biophysical models. Simulations and dedicated phantoms are useful in assessing the performance of models when the ground truth is known. However, the biggest challenge remains the validation of the “biological accuracy” of estimated parameters. Complementary techniques such as microscopy of fixed tissue specimens have facilitated direct comparisons of estimates of white matter fiber orientation and densities. However, validation of compartmental diffusivities remains challenging, and complementary MRI-based techniques such as alternative diffusion encodings, compartment-specific contrast agents and metabolites have been used to validate diffusion models. Finally, white matter injury and disease pose additional challenges to modeling, which are also discussed. This review aims to provide an overview of the current state of models and their validation and to stimulate further research in the field to solve the remaining open questions and converge towards consensus.
Magnavita, N
2007-01-01
Occupational stress is currently studied by the Job Demand/Control model of Karasek and the Effort/Reward Imbalance model of Siegrist. In this study we translated into Italian and validated the short forms of the Job Content Questionnaire (JCQ) and of the Effort Reward Imbalance Questionnaire (ERI). The questionnaires were applied to 531 health care workers during periodical medical examinations. Estimates of internal consistency, based on the correlation among the variables comprising each set (Cronbach's alpha), were satisfactory in each case (alpha ranging from 0.76 to 0.89), with the exception of the "control" scale of the JCQ (alpha = 0.57). Exploratory factor analysis showed that the "control" scale of the JCQ and the "reward" scale of the ERI could be divided into two and three sub-scales, respectively. Karasek's and Siegrist's models made distinct contributions to explaining perceived work stress. Both the JCQ and the ERI questionnaire may be useful in occupational health.
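For readers unfamiliar with the internal-consistency statistic cited above, the sketch below (Python, not from the study) shows how Cronbach's alpha is computed from an item-response matrix; the scale name and response values are hypothetical.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents x k_items) matrix of item scores."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)          # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)      # variance of the summed scale
    return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

# Hypothetical responses: 6 respondents x 4 Likert items of an "effort" scale
effort_items = np.array([
    [3, 4, 3, 4],
    [2, 2, 3, 2],
    [5, 4, 5, 4],
    [1, 2, 1, 2],
    [4, 4, 3, 4],
    [3, 3, 4, 3],
])
print(f"alpha = {cronbach_alpha(effort_items):.2f}")
```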
Cryogenic Fluid Storage Technology Development: Recent and Planned Efforts at NASA
NASA Technical Reports Server (NTRS)
Moran, Matthew E.
2009-01-01
Recent technology development work conducted at NASA in the area of Cryogenic Fluid Management (CFM) storage is highlighted, including summary results, key impacts, and ongoing efforts. Thermodynamic vent system (TVS) ground test results are shown for hydrogen, methane, and oxygen. Joule-Thomson (J-T) device tests related to clogging in hydrogen are summarized, along with the absence of clogging in oxygen and methane tests. Confirmation of analytical relations and bonding techniques for broad area cooling (BAC) concepts based on tube-to-tank tests are presented. Results of two-phase lumped-parameter computational fluid dynamic (CFD) models are highlighted, including validation of the model with hydrogen self pressurization test data. These models were used to simulate Altair representative methane and oxygen tanks subjected to 210 days of lunar surface storage. Engineering analysis tools being developed to support system level trades and vehicle propulsion system designs are also cited. Finally, prioritized technology development risks identified for Constellation cryogenic propulsion systems are presented, and future efforts to address those risks are discussed.
Modeling of Texture Evolution During Hot Forging of Alpha/Beta Titanium Alloys (Preprint)
2007-06-01
The approach was validated via an industrial-scale trial comprising hot pancake forging of Ti-6Al-4V. Keywords: Titanium, Texture, Modeling, Strain Partitioning, Variant Selection.
NASA Space Radiation Risk Project: Overview and Recent Results
NASA Technical Reports Server (NTRS)
Blattnig, Steve R.; Chappell, Lori J.; George, Kerry A.; Hada, Megumi; Hu, Shaowen; Kidane, Yared H.; Kim, Myung-Hee Y.; Kovyrshina, Tatiana; Norman, Ryan B.; Nounu, Hatem N.;
2015-01-01
The NASA Space Radiation Risk project is responsible for integrating new experimental and computational results into models to predict risk of cancer and acute radiation syndrome (ARS) for use in mission planning and systems design, as well as current space operations. The project has several parallel efforts focused on proving NASA's radiation risk projection capability in both the near and long term. This presentation will give an overview, with select results from these efforts including the following topics: verification, validation, and streamlining the transition of models to use in decision making; relative biological effectiveness and dose rate effect estimation using a combination of stochastic track structure simulations, DNA damage model calculations and experimental data; ARS model improvements; pathway analysis from gene expression data sets; solar particle event probabilistic exposure calculation including correlated uncertainties for use in design optimization.
2012-01-01
Background: This paper reports on results of a newly developed questionnaire for the assessment of effort-reward imbalance (ERI) in unpaid household and family work. Methods: Using a cross-sectional population-based survey of German mothers (n = 3129), the dimensional structure of the theoretical ERI model was validated by means of Confirmatory Factor Analysis (CFA). Analyses of Variance were computed to examine relationships between ERI and social factors and health outcomes. Results: CFA revealed good psychometric properties indicating that the subscale 'effort' is based on one latent factor and the subscale 'reward' is composed of four dimensions: 'intrinsic value of family and household work', 'societal esteem', 'recognition from the partner', and 'affection from the child(ren)'. About 19.3% of mothers perceived a lack of reciprocity and 23.8% showed high rates of overcommitment in terms of inability to withdraw from household and family obligations. Socially disadvantaged mothers were at higher risk of ERI, in particular with respect to the perception of low societal esteem. Gender inequality in the division of household and family work and work-family conflict accounted most for ERI in household and family work. Analogous to ERI in paid work, we could demonstrate that ERI affects self-rated health, somatic complaints, mental health and, to some extent, hypertension. Conclusions: The newly developed questionnaire demonstrates satisfactory validity and promising results for extending the ERI model to household and family work. PMID:22221851
NASA Astrophysics Data System (ADS)
Kuznetsova, Maria
The Community Coordinated Modeling Center (CCMC, http://ccmc.gsfc.nasa.gov) was established at the dawn of the new millennium as a long-term flexible solution to the problem of transitioning progress in space environment modeling to operational space weather forecasting. CCMC hosts an expanding collection of state-of-the-art space weather models developed by the international space science community. Over the years the CCMC has acquired unique experience in preparing complex models and model chains for operational environments and in developing and maintaining custom displays and powerful web-based systems and tools ready to be used by researchers, space weather service providers and decision makers. In support of the space weather needs of NASA users, CCMC is developing highly tailored applications and services that target specific orbits or locations in space and is partnering with NASA mission specialists on linking CCMC space environment modeling with impacts on biological and technological systems in space. Confidence assessment of model predictions is an essential element of space environment modeling. CCMC facilitates interaction between model owners and users in defining physical parameters and metrics formats relevant to specific applications and leads community efforts to quantify models' ability to simulate and predict space environment events. Interactive on-line model validation systems developed at CCMC make validation a seamless part of the model development cycle. The talk will showcase innovative solutions for space weather research, validation, anomaly analysis and forecasting and review ongoing community-wide model validation initiatives enabled by CCMC applications.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lai, Canhai; Xu, Zhijie; Pan, Wenxiao
2016-01-01
To quantify the predictive confidence of a solid sorbent-based carbon capture design, a hierarchical validation methodology, consisting of basic unit problems with increasing physical complexity coupled with filtered model-based geometric upscaling, has been developed and implemented. This paper describes the computational fluid dynamics (CFD) multi-phase reactive flow simulations and the associated data flows among different unit problems performed within the said hierarchical validation approach. The bench-top experiments used in this calibration and validation effort were carefully designed to follow the desired simple-to-complex unit problem hierarchy, with corresponding data acquisition to support model parameter calibrations at each unit problem level. A Bayesian calibration procedure is employed and the posterior model parameter distributions obtained at one unit-problem level are used as prior distributions for the same parameters in the next-tier simulations. Overall, the results have demonstrated that the multiphase reactive flow models within MFIX can be used to capture the bed pressure, temperature, CO2 capture capacity, and kinetics with quantitative accuracy. The CFD modeling methodology and associated uncertainty quantification techniques presented herein offer a solid framework for estimating the predictive confidence in the virtual scale up of a larger carbon capture device.
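The posterior-becomes-prior workflow described above can be illustrated with a deliberately simplified sketch. The code below uses a conjugate normal update for a single scalar parameter; the tier names, data, and variances are hypothetical and stand in for the actual MFIX/CFD calibration.

```python
import numpy as np

def normal_update(prior_mean, prior_var, obs, obs_var):
    """Conjugate normal update: posterior mean and variance for a scalar parameter."""
    obs = np.asarray(obs, dtype=float)
    n = obs.size
    post_var = 1.0 / (1.0 / prior_var + n / obs_var)
    post_mean = post_var * (prior_mean / prior_var + obs.sum() / obs_var)
    return post_mean, post_var

# Unit-problem hierarchy: each tier's posterior becomes the next tier's prior.
prior_mean, prior_var = 0.0, 10.0             # weakly informative prior on a kinetic parameter
tiers = {
    "bench_kinetics":   [1.2, 0.9, 1.1],      # hypothetical calibration data per tier
    "single_stage_bed": [1.0, 1.3],
    "integrated_unit":  [1.15],
}
for name, data in tiers.items():
    prior_mean, prior_var = normal_update(prior_mean, prior_var, data, obs_var=0.25)
    print(f"{name}: posterior mean={prior_mean:.3f}, var={prior_var:.3f}")
```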
TOPEX Microwave Radiometer - Thermal design verification test and analytical model validation
NASA Technical Reports Server (NTRS)
Lin, Edward I.
1992-01-01
The testing of the TOPEX Microwave Radiometer (TMR) is described in terms of hardware development based on the modeling and thermal vacuum testing conducted. The TMR and the vacuum-test facility are described, and the thermal verification test includes a hot steady-state segment, a cold steady-state segment, and a cold survival mode segment totalling 65 hours. A graphic description is given of the test history as related to temperature tracking, and two multinode TMR test-chamber models are compared to the test results. Large discrepancies between the test data and the model predictions are attributed to contact conductance, effective emittance from the multilayer insulation, and heat leaks related to deviations from the flight configuration. The TMR thermal testing/modeling effort is shown to provide technical corrections for the procedure outlined, and the need for validating predictive models is underscored.
Development of Methods to Predict the Effects of Test Media in Ground-Based Propulsion Testing
NASA Technical Reports Server (NTRS)
Drummond, J. Philip; Danehy, Paul M.; Gaffney, Richard L., Jr.; Parker, Peter A.; Tedder, Sarah A.; Chelliah, Harsha K.; Cutler, Andrew D.; Bivolaru, Daniel; Givi, Peyman; Hassan, Hassan A.
2009-01-01
This report discusses work that began in mid-2004 sponsored by the Office of the Secretary of Defense (OSD) Test & Evaluation/Science & Technology (T&E/S&T) Program. The work was undertaken to improve the state of the art of CFD capabilities for predicting the effects of the test media on the flameholding characteristics in scramjet engines. The program had several components including the development of advanced algorithms and models for simulating engine flowpaths as well as a fundamental experimental and diagnostic development effort to support the formulation and validation of the mathematical models. This report provides details of the completed work, involving the development of phenomenological models for Reynolds averaged Navier-Stokes codes, large-eddy simulation techniques and reduced-kinetics models. Experiments that provided data for the modeling efforts are also described, along with the associated nonintrusive diagnostics used to collect the data.
Predicting the Effects of Test Media in Ground-Based Propulsion Testing
NASA Technical Reports Server (NTRS)
Drummond, J. Philip; Danehy, Paul M.; Bivolaru, Daniel; Gaffney, Richard L.; Parker, Peter A.; Chelliah, Harsha K.; Cutler, Andrew D.; Givi, Peyman; Hassan, Hassan, A.
2006-01-01
This paper discusses the progress of work which began in mid-2004 sponsored by the Office of the Secretary of Defense (OSD) Test & Evaluation/Science & Technology (T&E/S&T) Program. The purpose of the work is to improve the state of the art of CFD capabilities for predicting the effects of the test media on the flameholding characteristics in scramjet engines. The program has several components including the development of advanced algorithms and models for simulating engine flowpaths as well as a fundamental experimental and diagnostic development effort to support the formulation and validation of the mathematical models. The paper will provide details of current work involving the development of phenomenological models for Reynolds averaged Navier-Stokes codes, large-eddy simulation techniques and reduced-kinetics models. Experiments that will provide data for the modeling efforts will also be described, along with the associated nonintrusive diagnostics used to collect the data.
van Rooij, Antonius J; Van Looy, Jan; Billieux, Joël
2017-07-01
Some people have serious problems controlling their Internet and video game use. The DSM-5 now includes a proposal for 'Internet Gaming Disorder' (IGD) as a condition in need of further study. Various studies aim to validate the proposed diagnostic criteria for IGD and multiple new scales have been introduced that cover the suggested criteria. Using a structured approach, we demonstrate that IGD might be better interpreted as a formative construct, as opposed to the current practice of conceptualizing it as a reflective construct. Incorrectly approaching a formative construct as a reflective one causes serious problems in scale development, including: (i) incorrect reliance on item-to-total scale correlation to exclude items and incorrectly relying on indices of inter-item reliability that do not fit the measurement model (e.g., Cronbach's α); (ii) incorrect interpretation of composite or mean scores that assume all items are equal in contributing value to a sum score; and (iii) biased estimation of model parameters in statistical models. We show that these issues are impacting current validation efforts through two recent examples. A reinterpretation of IGD as a formative construct has broad consequences for current validation efforts and provides opportunities to reanalyze existing data. We discuss three broad implications for current research: (i) composite latent constructs should be defined and used in models; (ii) item exclusion and selection should not rely on item-to-total scale correlations; and (iii) existing definitions of IGD should be enriched further. © 2016 The Authors. Psychiatry and Clinical Neurosciences © 2016 Japanese Society of Psychiatry and Neurology.
Predictive Capability Maturity Model for computational modeling and simulation.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Oberkampf, William Louis; Trucano, Timothy Guy; Pilch, Martin M.
2007-10-01
The Predictive Capability Maturity Model (PCMM) is a new model that can be used to assess the level of maturity of computational modeling and simulation (M&S) efforts. The development of the model is based on both the authors' experience and their analysis of similar investigations in the past. The perspective taken in this report is one of judging the usefulness of a predictive capability that relies on the numerical solution to partial differential equations to better inform and improve decision making. The review of past investigations, such as the Software Engineering Institute's Capability Maturity Model Integration and the National Aeronautics and Space Administration and Department of Defense Technology Readiness Levels, indicates that a more restricted, more interpretable method is needed to assess the maturity of an M&S effort. The PCMM addresses six contributing elements to M&S: (1) representation and geometric fidelity, (2) physics and material model fidelity, (3) code verification, (4) solution verification, (5) model validation, and (6) uncertainty quantification and sensitivity analysis. For each of these elements, attributes are identified that characterize four increasing levels of maturity. Importantly, the PCMM is a structured method for assessing the maturity of an M&S effort that is directed toward an engineering application of interest. The PCMM does not assess whether the M&S effort, the accuracy of the predictions, or the performance of the engineering system satisfies or does not satisfy specified application requirements.
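As a rough illustration of how the six PCMM elements and four maturity levels described above might be tabulated during an assessment, the sketch below scores each element on a 0-3 scale; the element list follows the abstract, while the scores and the weakest-link summary are illustrative choices, not part of the PCMM report.

```python
from dataclasses import dataclass

# The six PCMM contributing elements named in the report, scored 0-3
# (four increasing maturity levels). Scores below are purely illustrative.
ELEMENTS = [
    "representation and geometric fidelity",
    "physics and material model fidelity",
    "code verification",
    "solution verification",
    "model validation",
    "uncertainty quantification and sensitivity analysis",
]

@dataclass
class PCMMAssessment:
    scores: dict  # element name -> maturity level 0..3

    def table(self) -> str:
        lines = [f"{e:<55s} level {self.scores[e]}" for e in ELEMENTS]
        # Summarizing by the weakest element is one possible convention, not the report's.
        lines.append(f"{'minimum (weakest element)':<55s} level {min(self.scores.values())}")
        return "\n".join(lines)

example = PCMMAssessment(scores=dict(zip(ELEMENTS, [2, 1, 3, 2, 1, 1])))
print(example.table())
```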
A Systematic Method for Verification and Validation of Gyrokinetic Microstability Codes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bravenec, Ronald
My original proposal for the period Feb. 15, 2014 through Feb. 14, 2017 called for an integrated validation and verification effort carried out by myself with collaborators. The validation component would require experimental profile and power-balance analysis. In addition, it would require running the gyrokinetic codes varying the input profiles within experimental uncertainties to seek agreement with experiment before discounting a code as invalidated. Therefore, validation would require a major increase of effort over my previous grant periods, which covered only code verification (code benchmarking). Consequently, I had requested full-time funding. Instead, I am being funded at somewhat less than half time (5 calendar months per year). As a consequence, I decided to forego the validation component and to only continue the verification efforts.
Time Domain Tool Validation Using ARES I-X Flight Data
NASA Technical Reports Server (NTRS)
Hough, Steven; Compton, James; Hannan, Mike; Brandon, Jay
2011-01-01
The ARES I-X vehicle was launched from NASA's Kennedy Space Center (KSC) on October 28, 2009 at approximately 11:30 EDT. ARES I-X was the first test flight for NASA's ARES I launch vehicle, and it was the first non-Shuttle launch vehicle designed and flown by NASA since Saturn. The ARES I-X had a 4-segment solid rocket booster (SRB) first stage and a dummy upper stage (US) to emulate the properties of the ARES I US. During ARES I-X pre-flight modeling and analysis, six (6) independent time domain simulation tools were developed and cross validated. Each tool represents an independent implementation of a common set of models and parameters in a different simulation framework and architecture. Post flight data and reconstructed models provide the means to validate a subset of the simulations against actual flight data and to assess the accuracy of pre-flight dispersion analysis. Post flight data consists of telemetered Operational Flight Instrumentation (OFI) data primarily focused on flight computer outputs and sensor measurements as well as Best Estimated Trajectory (BET) data that estimates vehicle state information from all available measurement sources. While pre-flight models were found to provide a reasonable prediction of the vehicle flight, reconstructed models were generated to better represent and simulate the ARES I-X flight. Post flight reconstructed models include: SRB propulsion model, thrust vector bias models, mass properties, base aerodynamics, and Meteorological Estimated Trajectory (wind and atmospheric data). The result of the effort is a set of independently developed, high fidelity, time-domain simulation tools that have been cross validated and validated against flight data. This paper presents the process and results of high fidelity aerospace modeling, simulation, analysis and tool validation in the time domain.
Exploration of Uncertainty in Glacier Modelling
NASA Technical Reports Server (NTRS)
Thompson, David E.
1999-01-01
There are procedures and methods for verification of coding algebra and for validations of models and calculations that are in use in the aerospace computational fluid dynamics (CFD) community. These methods would be efficacious if used by the glacier dynamics modelling community. This paper is a presentation of some of those methods, and how they might be applied to uncertainty management supporting code verification and model validation for glacier dynamics. The similarities and differences between their use in CFD analysis and the proposed application of these methods to glacier modelling are discussed. After establishing sources of uncertainty and methods for code verification, the paper looks at a representative sampling of verification and validation efforts that are underway in the glacier modelling community, and establishes a context for these within overall solution quality assessment. Finally, an information architecture and interactive interface is introduced and advocated. This Integrated Cryospheric Exploration (ICE) Environment is proposed for exploring and managing sources of uncertainty in glacier modelling codes and methods, and for supporting scientific numerical exploration and verification. The details and functionality of this Environment are described based on modifications of a system already developed for CFD modelling and analysis.
NASA Technical Reports Server (NTRS)
Taber, William; Port, Dan
2014-01-01
At the Mission Design and Navigation Software Group at the Jet Propulsion Laboratory, we make use of finite exponential-based defect models to aid in maintenance planning and management for our widely used critical systems. However, a number of pragmatic issues arise when applying defect models for a post-release system in continuous use. These include: how to utilize information from problem reports rather than testing to drive defect discovery and removal effort, practical model calibration, and alignment of model assumptions with our environment.
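A minimal sketch of the kind of finite exponential defect model mentioned above, fitted here to cumulative problem-report counts rather than test data; the monthly counts and starting parameter values are hypothetical.

```python
import numpy as np
from scipy.optimize import curve_fit

def finite_exponential(t, n_total, rate):
    """Cumulative defects discovered by time t, assuming a finite pool of n_total
    defects found at an exponentially decaying rate."""
    return n_total * (1.0 - np.exp(-rate * t))

# Hypothetical cumulative problem-report counts by month since release
months = np.arange(1, 13)
reports = np.array([8, 15, 20, 24, 27, 29, 31, 32, 33, 34, 34, 35])

(n_total, rate), _ = curve_fit(finite_exponential, months, reports, p0=[40.0, 0.2])
remaining = n_total - reports[-1]
print(f"estimated latent defects: {n_total:.1f}, discovery rate: {rate:.2f}/month, "
      f"~{remaining:.1f} still undiscovered")
```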
Isolated Open Rotor Noise Prediction Assessment Using the F31A31 Historical Blade Set
NASA Technical Reports Server (NTRS)
Nark, Douglas M.; Jones, William T.; Boyd, D. Douglas, Jr.; Zawodny, Nikolas S.
2016-01-01
In an effort to mitigate next-generation fuel efficiency and environmental impact concerns for aviation, open rotor propulsion systems have received renewed interest. However, maintaining the high propulsive efficiency while simultaneously meeting noise goals has been one of the challenges in making open rotor propulsion a viable option. Improvements in prediction tools and design methodologies have opened the design space for next generation open rotor designs that satisfy these challenging objectives. As such, validation of aerodynamic and acoustic prediction tools has been an important aspect of open rotor research efforts. This paper describes validation efforts of a combined computational fluid dynamics and Ffowcs Williams and Hawkings equation methodology for open rotor aeroacoustic modeling. Performance and acoustic predictions were made for a benchmark open rotor blade set and compared with measurements over a range of rotor speeds and observer angles. Overall, the results indicate that the computational approach is acceptable for assessing low-noise open rotor designs. Additionally, this approach may be used to provide realistic incident source fields for acoustic shielding/scattering studies on various aircraft configurations.
Preliminary Impacts of North Carolina's Rural Innovative Schools Project
ERIC Educational Resources Information Center
Naumenko, Oksana; Henson, Robert; Hutchins, Bryan
2016-01-01
Funded by an Investing in Innovation (i3) Validation grant, the Rural Innovative Schools (RIS) Project is the first widespread effort to scale up the early college model by implementing it in comprehensive high schools. This paper will present preliminary findings from the evaluation of this project. The impact study uses a quasi-experimental…
ERIC Educational Resources Information Center
Poekert, Philip; Alexandrou, Alex; Shannon, Darbianne
2016-01-01
Teacher leadership is increasingly being touted as a practical response to guide teacher learning in school improvement and policy reform efforts. However, the field of research on teacher leadership in relation to post-compulsory educational development has been and remains largely atheoretical to date. This empirical study proposes a grounded…
Validation Testing of a Peridynamic Impact Damage Model Using NASA's Micro-Particle Gun
NASA Technical Reports Server (NTRS)
Baber, Forrest E.; Zelinski, Brian J.; Guven, Ibrahim; Gray, Perry
2017-01-01
Through a collaborative effort between the Virginia Commonwealth University and Raytheon, a peridynamic model for sand impact damage has been developed [1-3]. Model development has focused on simulating impacts of sand particles on ZnS traveling at velocities consistent with aircraft take-off and landing speeds. The model reproduces common features of impact damage including pit and radial cracks, and, under some conditions, lateral cracks. This study focuses on a preliminary validation exercise in which simulation results from the peridynamic model are compared to a limited experimental data set generated by NASA's recently developed micro-particle gun (MPG). The MPG facility measures the dimensions and incoming and rebound velocities of the impact particles. It also links each particle to a specific impact site and its associated damage. In this validation exercise parameters of the peridynamic model are adjusted to fit the experimentally observed pit diameter, average length of radial cracks and rebound velocities for 4 impacts of 300 µm glass beads on ZnS. Results indicate that a reasonable fit of these impact characteristics can be obtained by suitable adjustment of the peridynamic input parameters, demonstrating that the MPG can be used effectively as a validation tool for impact modeling and that the peridynamic sand impact model described herein possesses not only a qualitative but also a quantitative ability to simulate sand impact events.
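The parameter-adjustment step described above can be sketched generically as a least-squares fit over the observed impact characteristics. The code below is illustrative only: the simulator call is a toy placeholder, and the parameter names and measured values are hypothetical, not taken from the peridynamic model or the MPG data.

```python
import numpy as np
from scipy.optimize import minimize

# Measured impact characteristics (hypothetical values for illustration)
observed = {"pit_diameter_um": 120.0, "radial_crack_um": 260.0, "rebound_mps": 18.0}

def run_impact_model(params):
    """Placeholder for the impact simulation: maps model parameters to predicted
    impact characteristics. A toy smooth function stands in for the real solver."""
    bond_strength, damping = params
    return {
        "pit_diameter_um": 150.0 - 0.4 * bond_strength,
        "radial_crack_um": 300.0 - 0.5 * bond_strength + 2.0 * damping,
        "rebound_mps": 10.0 + 0.8 * damping,
    }

def objective(params):
    pred = run_impact_model(params)
    # Relative least-squares misfit across the three observed quantities
    return sum(((pred[k] - observed[k]) / observed[k]) ** 2 for k in observed)

result = minimize(objective, x0=np.array([50.0, 5.0]), method="Nelder-Mead")
print("fitted parameters:", result.x, "misfit:", result.fun)
```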
Revision of empirical electric field modeling in the inner magnetosphere using Cluster data
NASA Astrophysics Data System (ADS)
Matsui, H.; Torbert, R. B.; Spence, H. E.; Khotyaintsev, Yu. V.; Lindqvist, P.-A.
2013-07-01
Using Cluster data from the Electron Drift (EDI) and the Electric Field and Wave (EFW) instruments, we revise our empirically-based, inner-magnetospheric electric field (UNH-IMEF) model at 2
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jordan, Amy B.; Stauffer, Philip H.; Reed, Donald T.
The primary objective of the experimental effort described here is to aid in understanding the complex nature of liquid, vapor, and solid transport occurring around heated nuclear waste in bedded salt. In order to gain confidence in the predictive capability of numerical models, experimental validation must be performed to ensure that (a) hydrological and physiochemical parameters and (b) processes are correctly simulated. The experiments proposed here are designed to study aspects of the system that have not been satisfactorily quantified in prior work. In addition to exploring the complex coupled physical processes in support of numerical model validation, lessons learned from these experiments will facilitate preparations for larger-scale experiments that may utilize similar instrumentation techniques.
Development of a Turbofan Engine Simulation in a Graphical Simulation Environment
NASA Technical Reports Server (NTRS)
Parker, Khary I.; Guo, Ten-Heui
2003-01-01
This paper presents the development of a generic component level model of a turbofan engine simulation with a digital controller, in an advanced graphical simulation environment. The goal of this effort is to develop and demonstrate a flexible simulation platform for future research in propulsion system control and diagnostic technology. A previously validated FORTRAN-based model of a modern, high-performance, military-type turbofan engine is being used to validate the platform development. The implementation process required the development of various innovative procedures, which are discussed in the paper. Open-loop and closed-loop comparisons are made between the two simulations. Future enhancements that are to be made to the modular engine simulation are summarized.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bons, Jeffrey; Ameri, Ali
2016-01-08
The objective of this research effort was to develop a validated computational modeling capability for the characterization of the effects of hot streaks and particulate deposition on the heat load of modern gas turbines. This was accomplished with a multi-faceted approach including analytical, experimental, and computational components. A 1-year no cost extension request was approved for this effort, so the total duration was 4 years. The research effort succeeded in its ultimate objective by leveraging extensive experimental deposition studies complemented by computational modeling. Experiments were conducted with hot streaks, vane cooling, and combinations of hot streaks with vane cooling. These studies contributed to a significant body of corporate knowledge of deposition, in combination with particle rebound and deposition studies funded by other agencies, to provide suitable conditions for the development of a new model. The model includes the following physical phenomena: elastic deformation, plastic deformation, adhesion, and shear removal. It also incorporates material property sensitivity to temperature and tangential-normal velocity rebound cross-dependencies observed in experiments. The model is well-suited for incorporation in CFD simulations of complex gas turbine flows due to its algebraic (explicit) formulation. This report contains model predictions compared to coefficient of restitution data available in the open literature as well as deposition results from two different high temperature turbine deposition facilities. While the model comparisons with experiments are in many cases promising, several key aspects of particle deposition remain elusive. The simple phenomenological nature of the model allows for parametric dependencies to be evaluated in a straightforward manner. This effort also included the first-ever full turbine stage deposition model published in the open literature. The simulations included hot streaks and simulated vane cooling. The new deposition model was implemented into the CFD model as a wall boundary condition, with various particle sizes investigated in the simulation. Simulations utilizing a steady mixing plane formulation and an unsteady sliding mesh were conducted and the flow solution of each was validated against experimental data. Results from each of these simulations, including impact and capture distributions and efficiencies, were compared and potential reasons for differences discussed in detail. The inclusion of a large range of particle sizes allowed investigation of trends with particle size, such as increased radial migration and reduced sticking efficiency at the larger particle sizes. The unsteady simulation predicted lower sticking efficiencies on the rotor blades than the mixing plane simulation for the majority of particle sizes. This is postulated to be due to the preservation of the hot streak and cool vane wake through the vane-rotor interface (which are smeared out circumferentially in the mixing-plane simulation). The results reported here represent the successful implementation of a novel deposition model into validated vane-rotor flow solutions that include a non-uniform inlet temperature profile and simulated vane cooling.
NASA Technical Reports Server (NTRS)
Kypuros, Javier A.; Colson, Rodrigo; Munoz, Afredo
2004-01-01
This paper describes efforts conducted to improve dynamic temperature estimations of a turbine tip clearance system to facilitate design of a generalized tip clearance controller. This work builds upon research previously conducted and presented in and focuses primarily on improving dynamic temperature estimations of the primary components affecting tip clearance (i.e. the rotor, blades, and casing/shroud). The temperature profiles estimated by the previous model iteration, specifically for the rotor and blades, were found to be inaccurate and, more importantly, insufficient to facilitate controller design. Some assumptions made to facilitate the previous results were not valid, and thus improvements are presented here to better match the physical reality. As will be shown, the improved temperature sub-models match a commercially validated model and are sufficiently simplified to aid in controller design.
Validation of NASA Thermal Ice Protection Computer Codes. Part 1; Program Overview
NASA Technical Reports Server (NTRS)
Miller, Dean; Bond, Thomas; Sheldon, David; Wright, William; Langhals, Tammy; Al-Khalil, Kamel; Broughton, Howard
1996-01-01
The Icing Technology Branch at NASA Lewis has been involved in an effort to validate two thermal ice protection codes developed at the NASA Lewis Research Center: LEWICE/Thermal (electrothermal deicing & anti-icing) and ANTICE (hot-gas & electrothermal anti-icing). The Thermal Code Validation effort was designated as a priority during a 1994 'peer review' of the NASA Lewis Icing program, and was implemented as a cooperative effort with industry. During April 1996, the first of a series of experimental validation tests was conducted in the NASA Lewis Icing Research Tunnel (IRT). The purpose of the April 1996 test was to validate the electrothermal predictive capabilities of both LEWICE/Thermal and ANTICE. A heavily instrumented test article was designed and fabricated for this test, with the capability of simulating electrothermal de-icing and anti-icing modes of operation. Thermal measurements were then obtained over a range of test conditions, for comparison with analytical predictions. This paper will present an overview of the test, including a detailed description of: (1) the validation process; (2) test article design; (3) test matrix development; and (4) test procedures. Selected experimental results will be presented for de-icing and anti-icing modes of operation. Finally, the status of the validation effort at this point will be summarized. Detailed comparisons between analytical predictions and experimental results are contained in the following two papers: 'Validation of NASA Thermal Ice Protection Computer Codes: Part 2 - The Validation of LEWICE/Thermal' and 'Validation of NASA Thermal Ice Protection Computer Codes: Part 3 - The Validation of ANTICE'.
NASA Astrophysics Data System (ADS)
Romine, William Lee; Walter, Emily Marie
2014-11-01
Efficacy of the Measure of Understanding of Macroevolution (MUM) as a measurement tool has been a point of contention among scholars needing a valid measure for knowledge of macroevolution. We explored the structure and construct validity of the MUM using Rasch methodologies in the context of a general education biology course designed with an emphasis on macroevolution content. The Rasch model was utilized to quantify item- and test-level characteristics, including dimensionality, reliability, and fit with the Rasch model. Contrary to previous work, we found that the MUM provides a valid, reliable, and unidimensional scale for measuring knowledge of macroevolution in introductory non-science majors, and that its psychometric behavior does not exhibit large changes across time. While we found that all items provide productive measurement information, several depart substantially from ideal behavior, warranting a collective effort to improve these items. Suggestions for improving the measurement characteristics of the MUM at the item and test levels are put forward and discussed.
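A compact sketch of the dichotomous Rasch (one-parameter logistic) model that underlies the analysis above, using joint maximum-likelihood estimation on simulated responses; a real Rasch analysis of the MUM would add fit statistics, reliability indices, and dimensionality checks beyond this illustration.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import expit

rng = np.random.default_rng(0)
n_persons, n_items = 200, 10
true_theta = rng.normal(0, 1, n_persons)           # person abilities
true_b = np.linspace(-1.5, 1.5, n_items)           # item difficulties
# Simulated dichotomous responses under the Rasch model P(correct) = expit(theta - b)
responses = (rng.random((n_persons, n_items))
             < expit(true_theta[:, None] - true_b[None, :])).astype(float)

def neg_log_lik(params):
    theta, b = params[:n_persons], params[n_persons:]
    p = np.clip(expit(theta[:, None] - b[None, :]), 1e-9, 1 - 1e-9)
    return -np.sum(responses * np.log(p) + (1 - responses) * np.log(1 - p))

x0 = np.zeros(n_persons + n_items)
bounds = [(-6.0, 6.0)] * (n_persons + n_items)     # keep extreme scores from diverging
fit = minimize(neg_log_lik, x0, method="L-BFGS-B", bounds=bounds)
est_b = fit.x[n_persons:]
est_b -= est_b.mean()                              # fix the scale: difficulties sum to zero
print("estimated item difficulties:", np.round(est_b, 2))
```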
How motivation affects academic performance: a structural equation modelling analysis.
Kusurkar, R A; Ten Cate, Th J; Vos, C M P; Westers, P; Croiset, G
2013-03-01
Few studies in medical education have examined the effect of quality of motivation on performance. Self-Determination Theory, based on quality of motivation, differentiates between Autonomous Motivation (AM) that originates within an individual and Controlled Motivation (CM) that originates from external sources. The aims were to determine whether Relative Autonomous Motivation (RAM, a measure of the balance between AM and CM) affects academic performance through good study strategy and higher study effort, and to compare this model between subgroups: males and females, and students selected via two different systems, namely qualitative and weighted lottery selection. Data on motivation, study strategy and effort were collected from 383 medical students of VU University Medical Center Amsterdam, and their academic performance results were obtained from the student administration. Structural Equation Modelling analysis was used to test a hypothesized model in which high RAM would positively affect Good Study Strategy (GSS) and study effort, which in turn would positively affect academic performance in the form of grade point averages. This model fit the data well, chi-square = 1.095, df = 3, p = 0.778, RMSEA = 0.000. The model also fitted well for all tested subgroups of students. Differences were found in the strength of relationships between the variables for the different subgroups, as expected. In conclusion, RAM positively correlated with academic performance through a deep strategy towards study and higher study effort. This model seems valid in medical education in subgroups such as males, females, and students selected by qualitative and weighted lottery selection.
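For context on the fit index reported above, the snippet below applies the standard RMSEA formula to the reported chi-square and degrees of freedom (with N = 383 from the abstract); it reproduces the reported value of 0.000 because the chi-square is smaller than the degrees of freedom.

```python
import math

def rmsea(chi_square: float, df: int, n: int) -> float:
    """Root mean square error of approximation from a model chi-square test."""
    return math.sqrt(max(chi_square - df, 0.0) / (df * (n - 1)))

# Values reported in the abstract (N = 383 students)
print(f"RMSEA = {rmsea(1.095, 3, 383):.3f}")   # -> 0.000, consistent with the reported fit
```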
Integral Full Core Multi-Physics PWR Benchmark with Measured Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Forget, Benoit; Smith, Kord; Kumar, Shikhar
In recent years, the importance of modeling and simulation has been highlighted extensively in the DOE research portfolio with concrete examples in nuclear engineering with the CASL and NEAMS programs. These research efforts and similar efforts worldwide aim at the development of high-fidelity multi-physics analysis tools for the simulation of current and next-generation nuclear power reactors. Like all analysis tools, verification and validation is essential to guarantee proper functioning of the software and methods employed. The current approach relies mainly on the validation of single-physics phenomena (e.g. critical experiments, flow loops, etc.) and there is a lack of relevant multiphysics benchmark measurements that are necessary to validate high-fidelity methods being developed today. This work introduces a new multi-cycle full-core Pressurized Water Reactor (PWR) depletion benchmark based on two operational cycles of a commercial nuclear power plant that provides a detailed description of fuel assemblies, burnable absorbers, in-core fission detectors, core loading and re-loading patterns. This benchmark enables analysts to develop extremely detailed reactor core models that can be used for testing and validation of coupled neutron transport, thermal-hydraulics, and fuel isotopic depletion. The benchmark also provides measured reactor data for Hot Zero Power (HZP) physics tests, boron letdown curves, and three-dimensional in-core flux maps from 58 instrumented assemblies. The benchmark description is now available online and has been used by many groups. However, much work remains to be done on the quantification of uncertainties and modeling sensitivities. This work aims to address these deficiencies and make this benchmark a true non-proprietary international benchmark for the validation of high-fidelity tools. This report details the BEAVRS uncertainty quantification for the first two cycles of operation and serves as the final report of the project.
NASA Astrophysics Data System (ADS)
Song, S. G.
2016-12-01
Simulation-based ground motion prediction approaches have several benefits over empirical ground motion prediction equations (GMPEs). For instance, full 3-component waveforms can be produced and site-specific hazard analysis is also possible. However, it is important to validate them against observed ground motion data to confirm their efficiency and validity before practical uses. There have been community efforts for these purposes, which are supported by the Broadband Platform (BBP) project at the Southern California Earthquake Center (SCEC). In the simulation-based ground motion prediction approaches, it is a critical element to prepare a possible range of scenario rupture models. I developed a pseudo-dynamic source model for Mw 6.5-7.0 by analyzing a number of dynamic rupture models, based on 1-point and 2-point statistics of earthquake source parameters (Song et al. 2014; Song 2016). In this study, the developed pseudo-dynamic source models were tested against observed ground motion data at the SCEC BBP, Ver 16.5. The validation was performed at two stages. At the first stage, simulated ground motions were validated against observed ground motion data for past events such as the 1992 Landers and 1994 Northridge, California, earthquakes. At the second stage, they were validated against the latest version of empirical GMPEs, i.e., NGA-West2. The validation results show that the simulated ground motions produce ground motion intensities compatible with observed ground motion data at both stages. The compatibility of the pseudo-dynamic source models with the omega-square spectral decay and the standard deviation of the simulated ground motion intensities are also discussed in the study.
Grid Modernization Laboratory Consortium - Testing and Verification
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kroposki, Benjamin; Skare, Paul; Pratt, Rob
This paper highlights some of the unique testing capabilities and projects being performed at several national laboratories as part of the U.S. Department of Energy Grid Modernization Laboratory Consortium. As part of this effort, the Grid Modernization Laboratory Consortium Testing Network is being developed to accelerate grid modernization by enabling access to a comprehensive testing infrastructure and creating a repository of validated models and simulation tools that will be publicly available. This work is key to accelerating the development, validation, standardization, adoption, and deployment of new grid technologies to help meet U.S. energy goals.
NASA Astrophysics Data System (ADS)
Taha, Mutasem O.; Habash, Maha; Khanfar, Mohammad A.
2014-05-01
Glucokinase (GK) is involved in normal glucose homeostasis and therefore it is a valid target for drug design and discovery efforts. GK activators (GKAs) have excellent potential as treatments of hyperglycemia and diabetes. The combined recent interest in GKAs, together with docking limitations and shortages of docking validation methods, prompted us to use our new 3D-QSAR analysis, namely docking-based comparative intermolecular contacts analysis (dbCICA), to validate docking configurations performed on a group of GKAs within the GK binding site. dbCICA assesses the consistency of docking by assessing the correlation between ligands' affinities and their contacts with binding site spots. Optimal dbCICA models were validated by receiver operating characteristic curve analysis and comparative molecular field analysis. dbCICA models were also converted into valid pharmacophores that were used as search queries to mine 3D structural databases for new GKAs. The search yielded several potent bioactivators that experimentally increased GK bioactivity up to 7.5-fold at 10 μM.
Updates on CCMC Activities and GSFC Space Weather Services
NASA Technical Reports Server (NTRS)
Zhengm Y.; Hesse, M.; Kuznetsova, M.; Pulkkinen, A.; Rastaetter, L.; Maddox, M.; Taktakishvili, A.; Berrios, D.; Chulaki, A.; Lee, H.;
2011-01-01
In this presentation, we provide updates on CCMC modeling activities, CCMC metrics and validation studies, and other CCMC efforts. In addition, an overview of GSFC Space Weather Services (a sibling organization to the Community Coordinated Modeling Center) and its products/capabilities will be given. We show how some of the research-grade models, if run in an operational mode, can help address NASA's space weather needs by providing forecasting/nowcasting capabilities for significant space weather events throughout the solar system.
Venta, Kimberly; Baker, Erin; Fidopiastis, Cali; Stanney, Kay
2017-12-01
The purpose of this study was to investigate the potential of developing an EHR-based model of physician competency, named the Skill Deficiency Evaluation Toolkit for Eliminating Competency-loss Trends (Skill-DETECT), which presents the opportunity to use EHR-based models to inform selection of Continuing Medical Education (CME) opportunities specifically targeted at maintaining proficiency. The IBM Explorys platform provided outpatient Electronic Health Records (EHRs) representing 76 physicians with over 5000 patients combined. These data were used to develop the Skill-DETECT model, a predictive hybrid model composed of a rule-based model, a logistic regression model, and a thresholding model, which predicts cognitive clinical skill deficiencies in internal medicine physicians. A three-phase approach was then used to statistically validate model performance. Subject Matter Expert (SME) panel reviews resulted in a 100% overall approval rate for the rule-based model. Areas under the receiver-operating characteristic curves calculated for the logistic regression models ranged between 0.76 and 0.92, indicating exceptional performance. Normality, skewness, and kurtosis were determined and confirmed that the distributions of values output from the thresholding model were unimodal and peaked, which confirmed effectiveness and generalizability. The validation has confirmed that the Skill-DETECT model has a strong ability to evaluate EHR data and support the identification of internal medicine cognitive clinical skills that are deficient or have a higher likelihood of becoming deficient and thus require remediation, which will allow both physicians and medical organizations to fine tune training efforts. Copyright © 2017 Elsevier B.V. All rights reserved.
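An illustrative sketch of the logistic-regression validation step reported above: fit a classifier on EHR-derived features and score it with the area under the ROC curve. The features and labels below are synthetic stand-ins, not the Explorys data or the Skill-DETECT feature set.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
# Synthetic EHR-derived features per physician-case (e.g., ordering patterns, outcomes)
X = rng.normal(size=(500, 6))
# Synthetic label: 1 = skill deficiency flagged, generated from two of the features
y = (X[:, 0] - 0.8 * X[:, 3] + rng.normal(scale=0.7, size=500) > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
print(f"ROC AUC = {auc:.2f}")   # the study reported values in the 0.76-0.92 range
```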
NASA Technical Reports Server (NTRS)
Stefanescu, D. M.; Catalina, A. V.; Juretzko, Frank R.; Sen, Subhayu; Curreri, P. A.
2003-01-01
The objectives of the work on Particle Engulfment and Pushing by Solidifying Interfaces (PEP) include: 1) to obtain fundamental understanding of the physics of particle pushing and engulfment, 2) to develop mathematical models to describe the phenomenon, and 3) to perform critical experiments in the microgravity environment of space to provide benchmark data for model validation. Successful completion of this project will yield vital information relevant to a diverse area of terrestrial applications. With PEP being a long-term research effort, this report will focus on advances in the theoretical treatment of the solid/liquid interface interaction with an approaching particle, experimental validation of some aspects of the developed models, and the experimental design aspects of future experiments to be performed on board the International Space Station.
ERIC Educational Resources Information Center
Patterson, Brian F.; Mattern, Krista D.
2009-01-01
In an effort to continuously monitor the validity of the SAT for predicting first-year college grades, the College Board has continued its multi-year effort to recruit four-year colleges and universities (henceforth, "institutions") to provide data on the cohorts of first-time, first-year students entering in the fall semester beginning…
Validation of a reduced-order jet model for subsonic and underexpanded hydrogen jets
Li, Xuefang; Hecht, Ethan S.; Christopher, David M.
2016-01-01
Much effort has been made to model hydrogen releases from leaks during potential failures of hydrogen storage systems. A reduced-order jet model can be used to quickly characterize these flows, with low computational cost. Notional nozzle models are often used to avoid modeling the complex shock structures produced by the underexpanded jets by determining an "effective" source to produce the observed downstream trends. In our work, the mean hydrogen concentration fields were measured in a series of subsonic and underexpanded jets using a planar laser Rayleigh scattering system. Furthermore, we compared the experimental data to a reduced-order jet model for subsonic flows and a notional nozzle model coupled to the jet model for underexpanded jets. The values of some key model parameters were determined by comparisons with the experimental data. Finally, the coupled model was also validated against hydrogen concentration measurements for 100 and 200 bar hydrogen jets, with the predictions agreeing well with data in the literature.
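As a rough sketch of what a reduced-order jet model of this kind computes, the code below evaluates a self-similar centerline decay of hydrogen mass fraction downstream of an effective source; the decay constant, gas densities, and effective diameter are illustrative assumptions, and the actual notional nozzle calculation from the paper is not reproduced here.

```python
import numpy as np

# Illustrative reduced-order estimate of centerline H2 mass fraction in a free jet.
# Assumes a self-similar, momentum-dominated round jet with a 1/x concentration
# decay; the decay constant K is an assumption for illustration, not a value
# taken from the paper.
K = 5.4                      # illustrative decay constant for mass fraction
RHO_H2 = 0.0838              # kg/m^3 at ~293 K, 1 atm
RHO_AIR = 1.204              # kg/m^3 at ~293 K, 1 atm

def centerline_mass_fraction(x, d_eff):
    """Centerline H2 mass fraction at distance x downstream of an effective source
    of diameter d_eff (for an underexpanded jet, d_eff would come from a
    notional-nozzle model rather than the physical orifice)."""
    y = K * (d_eff / x) * np.sqrt(RHO_H2 / RHO_AIR)
    return np.minimum(y, 1.0)   # cannot exceed pure hydrogen near the source

for x in (0.1, 0.5, 1.0, 2.0):   # metres downstream of a hypothetical 3 mm effective source
    print(f"x = {x:4.1f} m : Y_cl ~ {centerline_mass_fraction(x, d_eff=0.003):.3f}")
```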
Huerta, Snjezana; Zerr, Argero A.; Eisenberg, Nancy; Spinrad, Tracy L.; Valiente, Carlos; Di Giunta, Laura; Pina, Armando A.; Eggum, Natalie D.; Sallquist, Julie; Edwards, Alison; Kupfer, Anne; Lonigan, Christopher J.; Phillips, Beth M.; Wilson, Shauna B.; Clancy-Menchetti, Jeanine; Landry, Susan H.; Swank, Paul R.; Assel, Michael A.; Taylor, Heather B.
2010-01-01
Measurement invariance of a one-factor model of effortful control (EC) was tested for 853 low-income preschoolers (M age = 4.48 years). Using a teacher-report questionnaire and seven behavioral measures, configural invariance (same factor structure across groups), metric invariance (same pattern of factor loadings across groups), and partial scalar invariance (mostly the same intercepts across groups) were established across ethnicity (European Americans, African Americans and Hispanics) and across sex. These results suggest that the latent construct of EC behaved in a similar way across ethnic groups and sex, and that comparisons of mean levels of EC are valid across sex and probably valid across ethnicity, especially when larger numbers of tasks are used. The findings also support the use of diverse behavioral measures as indicators of a single latent EC construct. PMID:20593008
NASA Astrophysics Data System (ADS)
MacLeod, Dave A.; Jones, Anne; Di Giuseppe, Francesca; Caminade, Cyril; Morse, Andrew P.
2015-04-01
The severity and timing of seasonal malaria epidemics are strongly linked with temperature and rainfall. Advance warning of meteorological conditions from seasonal climate models can therefore potentially anticipate unusually strong epidemic events, building resilience and adapting to possible changes in the frequency of such events. Here we present validation of a process-based, dynamic malaria model driven by hindcasts from a state-of-the-art seasonal climate model from the European Centre for Medium-Range Weather Forecasts. We validate the climate and malaria models against observed meteorological and incidence data for Botswana over the period 1982-2006, the longest record of observed incidence data that has been used to validate a modeling system of this kind. We consider the impact of climate model biases, the relationship between climate and epidemiological predictability, and the potential for skillful malaria forecasts. Forecast skill is demonstrated for upper-tercile malaria incidence for the Botswana malaria season (January-May), using forecasts issued at the start of November; the forecast system anticipates six out of the seven upper-tercile malaria seasons in the observational period. The length of the validation time series gives confidence in the conclusion that it is possible to make reliable forecasts of seasonal malaria risk, forming a key part of a health early warning system for Botswana and contributing to efforts to adapt to climate change.
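A small sketch of the categorical verification implied above: flag upper-tercile seasons in the observations and hindcasts and count how many are correctly anticipated. The incidence series below is synthetic; the observational record in the study spans 1982-2006.

```python
import numpy as np

rng = np.random.default_rng(2)
obs = rng.gamma(shape=2.0, scale=50.0, size=25)     # synthetic seasonal incidence, 25 seasons
fcst = obs + rng.normal(scale=20.0, size=25)        # synthetic hindcast with some skill

obs_upper = obs >= np.quantile(obs, 2 / 3)          # observed upper-tercile seasons
fcst_upper = fcst >= np.quantile(fcst, 2 / 3)       # forecast upper-tercile seasons

hits = np.sum(obs_upper & fcst_upper)
print(f"upper-tercile seasons observed: {obs_upper.sum()}, correctly anticipated: {hits}")
print(f"hit rate: {hits / obs_upper.sum():.2f}")
```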
Wagner, C.R.; Mueller, D.S.
2001-01-01
The quantification of current patterns is an essential component of a Water Quality Analysis Simulation Program (WASP) application in a riverine environment. The U.S. Geological Survey (USGS) provided a field-validated, two-dimensional Resource Management Associates-2 (RMA-2) hydrodynamic model capable of quantifying the steady-flow patterns in the Ohio River extending from river mile 590 to 630 for the Ohio River Valley Water Sanitation Commission (ORSANCO) water-quality modeling efforts on that reach. Because of the hydrodynamic complexities induced by McAlpine Locks and Dam (Ohio River mile 607), the model was split into two segments: an upstream reach, which extended from the dam upstream to the upper terminus of the study reach at Ohio River mile 590; and a downstream reach, which extended from the dam downstream to a lower terminus at Ohio River mile 636. The model was calibrated to a low-flow hydraulic survey (approximately 35,000 cubic feet per second (ft3/s)) and verified with data collected during a high-flow survey (approximately 390,000 ft3/s). The model calibration and validation process included matching water-surface elevations at 10 locations and velocity profiles at 30 cross sections throughout the study reach. Based on the calibration and validation results, the model is a representative simulation of the Ohio River steady-flow patterns below discharges of approximately 400,000 ft3/s.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Burge, S.W.
Erosion has been identified as one of the significant design issues in fluid beds. A cooperative R&D venture of industry, research, and government organizations was recently formed to meet the industry need for a better understanding of erosion in fluid beds. Research focused on bed hydrodynamics, which are considered to be the primary erosion mechanism. As part of this work, ANL developed an analytical model (FLUFIX) for bed hydrodynamics. Partial validation was performed using data from experiments sponsored by the research consortium. Development of a three-dimensional fluid bed hydrodynamic model was part of Asea-Babcock's in-kind contribution to the R&D venture. This model, FORCE2, was developed by Babcock & Wilcox's Research and Development Division and was based on an existing B&W program and on the gas-solids modeling technology developed by ANL and others. FORCE2 contains many of the features needed to model plant-size beds and, therefore, can be used along with the erosion technology to assess metal wastage in industrial equipment. As part of the development efforts, FORCE2 was partially validated using ANL's two-dimensional model, FLUFIX, and experimental data. Time constraints as well as the lack of good hydrodynamic data, particularly at the plant scale, prohibited a complete validation of FORCE2. This report describes this initial validation of FORCE2.
Systems Engineering Metrics: Organizational Complexity and Product Quality Modeling
NASA Technical Reports Server (NTRS)
Mog, Robert A.
1997-01-01
Innovative organizational complexity and product quality models applicable to performance metrics for NASA-MSFC's Systems Analysis and Integration Laboratory (SAIL) missions and objectives are presented. An intensive research effort focuses on the synergistic combination of stochastic process modeling, nodal and spatial decomposition techniques, organizational and computational complexity, systems science and metrics, chaos, and proprietary statistical tools for accelerated risk assessment. This is followed by the development of a preliminary model, which is uniquely applicable and robust for quantitative purposes. Exercise of the preliminary model using a generic system hierarchy and the AXAF-I architectural hierarchy is provided. The Kendall test for positive dependence provides an initial verification and validation of the model. Finally, the research and development of the innovation is revisited, prior to peer review. This research and development effort results in near-term, measurable SAIL organizational and product quality methodologies, enhanced organizational risk assessment and evolutionary modeling results, and improved statistical quantification of SAIL productivity interests.
Real-time In-Flight Strain and Deflection Monitoring with Fiber Optic Sensors
NASA Technical Reports Server (NTRS)
Richards, Lance; Parker, Allen R.; Ko, William L.; Piazza, Anthony
2008-01-01
This viewgraph presentation reviews Dryden's efforts to develop in-flight monitoring based on fiber optics. One of the motivating factors for this development was the breakup of the Helios aircraft. On Ikhana, the use of fiber optics for wing shape sensing is being developed. The sensors are being used to flight-validate fiber optic sensor measurements and real-time wing shape sensing predictions on NASA's Ikhana vehicle; validate fiber optic mathematical models and design tools; assess technical viability and, if applicable, develop the methodology and approach to incorporate wing shape measurements within the vehicle flight control system; and develop and flight-validate advanced approaches to perform active wing shape control.
Cyber Selection Test Research Effort for U.S. Army New Accessions
2017-10-12
... assessment game; 3. Develop an operational version of the STA game which incorporates assessments from phase 1 and, through game play, examines three more STA abilities and five STA behaviors; 4. Validate the systems thinking assessment (STA) game in an operational setting. (Planning-figure residue lists the STA elements assessed: ... Information, Identifies Elements of Systems, Models Relationships, Understands System Dynamics, Evaluates & Revises Model, Applies Understanding to Problem.)
Performance testing of a vertical Bridgman furnace using experiments and numerical modeling
NASA Astrophysics Data System (ADS)
Rosch, W. R.; Fripp, A. L.; Debnam, W. J.; Pendergrass, T. K.
1997-04-01
This paper details a portion of the work performed in preparation for the growth of lead tin telluride crystals during a Space Shuttle flight. A coordinated effort of experimental measurements and numerical modeling was completed to determine the optimum growth parameters and the performance of the furnace. This work was done using NASA's Advanced Automated Directional Solidification Furnace, but the procedures used should be equally valid for other vertical Bridgman furnaces.
Synthetic Teammates as Team Players: Coordination of Human and Synthetic Teammates
2016-05-31
This project is part of a larger effort that focuses on human-automation coordination in the ... context of the development, integration, and validation of a computational cognitive model that acts as a full-fledged synthetic teammate on an ... integrated the synthetic teammate model into the CERTT II (Cognitive Engineering Research on Team Tasks II) testbed in order to empirically address these
Prediction of fishing effort distributions using boosted regression trees.
Soykan, Candan U; Eguchi, Tomoharu; Kohin, Suzanne; Dewar, Heidi
2014-01-01
Concerns about bycatch of protected species have become a dominant factor shaping fisheries management. However, efforts to mitigate bycatch are often hindered by a lack of data on the distributions of fishing effort and protected species. One approach to overcoming this problem has been to overlay the distribution of past fishing effort with known locations of protected species, often obtained through satellite telemetry and occurrence data, to identify potential bycatch hotspots. This approach, however, generates static bycatch risk maps, calling into question their ability to forecast into the future, particularly when dealing with spatiotemporally dynamic fisheries and highly migratory bycatch species. In this study, we use boosted regression trees to model the spatiotemporal distribution of fishing effort for two distinct fisheries in the North Pacific Ocean, the albacore (Thunnus alalunga) troll fishery and the California drift gillnet fishery that targets swordfish (Xiphias gladius). Our results suggest that it is possible to accurately predict fishing effort using < 10 readily available predictor variables (cross-validated correlations between model predictions and observed data of approximately 0.6). Although the two fisheries are quite different in their gears and fishing areas, their respective models had high predictive ability, even when input data sets were restricted to a fraction of the full time series. The implications for conservation and management are encouraging: Across a range of target species, fishing methods, and spatial scales, even a relatively short time series of fisheries data may suffice to accurately predict the location of fishing effort into the future. In combination with species distribution modeling of bycatch species, this approach holds promise as a mitigation tool when observer data are limited. Even in data-rich regions, modeling fishing effort and bycatch may provide more accurate estimates of bycatch risk than partial observer coverage for fisheries and bycatch species that are heavily influenced by dynamic oceanographic conditions.
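As a rough illustration of the approach, the sketch below fits a gradient-boosted regression tree model (scikit-learn's implementation, standing in for the boosted regression trees used in the paper) to simulated effort data with fewer than ten predictors and reports the cross-validated correlation between predictions and observations; the data and hyperparameters are hypothetical.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import KFold

# Hypothetical data: rows are space-time cells, columns are <10 environmental
# predictors (e.g. SST, chlorophyll, depth, month); target is fishing effort.
rng = np.random.default_rng(1)
X = rng.normal(size=(500, 8))
y = X[:, 0] * 2.0 + np.sin(X[:, 1]) + rng.normal(0, 0.5, size=500)

corrs = []
for train, test in KFold(n_splits=5, shuffle=True, random_state=1).split(X):
    brt = GradientBoostingRegressor(n_estimators=500, learning_rate=0.01,
                                    max_depth=3, subsample=0.75)
    brt.fit(X[train], y[train])
    pred = brt.predict(X[test])
    corrs.append(np.corrcoef(pred, y[test])[0, 1])

print("cross-validated correlation:", np.mean(corrs))
```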
Soft computing techniques toward modeling the water supplies of Cyprus.
Iliadis, L; Maris, F; Tachos, S
2011-10-01
This research effort aims at the application of soft computing techniques toward water resources management. More specifically, the target is the development of reliable soft computing models capable of estimating the water supply for the case of the "Germasogeia" mountainous watersheds in Cyprus. Initially, ε-Regression Support Vector Machines (ε-RSVM) and fuzzy weighted ε-RSVM models have been developed that accept five input parameters. At the same time, reliable artificial neural networks have been developed to perform the same job. The 5-fold cross-validation approach has been employed in order to eliminate bad local behaviors and to produce a more representative training data set. Thus, the fuzzy weighted Support Vector Regression (SVR) combined with the fuzzy partition has been employed in an effort to enhance the quality of the results. Several rational and reliable models have been produced that can enhance the efficiency of water policy designers. Copyright © 2011 Elsevier Ltd. All rights reserved.
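A minimal sketch of the ε-regression SVM with 5-fold cross-validation described above follows, using scikit-learn rather than the authors' own implementation; the five input parameters and the water-supply target are simulated stand-ins.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Hypothetical data: five input parameters (e.g. rainfall and temperature
# indices) and a water-supply target for the watershed.
rng = np.random.default_rng(2)
X = rng.normal(size=(200, 5))
y = 3.0 * X[:, 0] - 1.5 * X[:, 2] + rng.normal(0, 0.3, size=200)

# epsilon-regression SVM evaluated with 5-fold cross-validation
model = make_pipeline(StandardScaler(), SVR(kernel="rbf", epsilon=0.1, C=10.0))
scores = cross_val_score(model, X, y, cv=5, scoring="r2")
print("5-fold CV R^2:", scores.mean())
```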
NASA Technical Reports Server (NTRS)
Rubinstein, R. (Editor); Rumsey, C. L. (Editor); Salas, M. D. (Editor); Thomas, J. L. (Editor); Bushnell, Dennis M. (Technical Monitor)
2001-01-01
Advances in turbulence modeling are needed in order to calculate high Reynolds number flows near the onset of separation and beyond. To this end, the participants in this workshop made the following recommendations. (1) A national/international database and standards for turbulence modeling assessment should be established. Existing experimental data sets should be reviewed and categorized. Advantage should be taken of other efforts already under-way, such as that of the European Research Community on Flow, Turbulence, and Combustion (ERCOFTAC) consortium. Carefully selected "unit" experiments will be needed, as well as advances in instrumentation, to fill the gaps in existing datasets. A high priority should be given to document existing turbulence model capabilities in a standard form, including numerical implementation issues such as grid quality and resolution. (2) NASA should support long-term research on Algebraic Stress Models and Reynolds Stress Models. The emphasis should be placed on improving the length-scale equation, since it is the least understood and is a key component of two-equation and higher models. Second priority should be given to the development of improved near-wall models. Direct Numerical Simulations (DNS) and Large Eddy Simulations (LES) would provide valuable guidance in developing and validating new Reynolds-averaged Navier-Stokes (RANS) models. Although not the focus of this workshop, DNS, LES, and hybrid methods currently represent viable approaches for analysis on a limited basis. Therefore, although computer limitations require the use of RANS methods for realistic configurations at high Reynolds number in the foreseeable future, a balanced effort in turbulence modeling development, validation, and implementation should include these approaches as well.
Metric analysis and data validation across FORTRAN projects
NASA Technical Reports Server (NTRS)
Basili, Victor R.; Selby, Richard W., Jr.; Phillips, Tsai-Yun
1983-01-01
The desire to predict the effort in developing software or to explain the quality of software has led to the proposal of several metrics. As a step toward validating these metrics, the Software Engineering Laboratory (SEL) has analyzed the software science metrics, cyclomatic complexity, and various standard program measures for their relation to effort (including design through acceptance testing), development errors (both discrete and weighted according to the amount of time to locate and fix), and one another. The data investigated are collected from a FORTRAN project environment and examined across several projects at once, within individual projects, and by reporting accuracy checks demonstrating the need to validate a database. When the data come from individual programmers or certain validated projects, the metrics' correlations with actual effort seem to be strongest. For modules developed entirely by individual programmers, the validity ratios induce a statistically significant ordering of several of the metrics' correlations. When comparing the strongest correlations, neither software science's E metric, cyclomatic complexity, nor source lines of code appears to relate convincingly better with effort than the others.
A meta-model for computer executable dynamic clinical safety checklists.
Nan, Shan; Van Gorp, Pieter; Lu, Xudong; Kaymak, Uzay; Korsten, Hendrikus; Vdovjak, Richard; Duan, Huilong
2017-12-12
Safety checklists are a type of cognitive tool enforcing the short-term memory of medical workers with the purpose of reducing medical errors caused by oversight and ignorance. To facilitate the daily use of safety checklists, computerized systems embedded in the clinical workflow and adapted to the patient context are increasingly developed. However, the current hard-coded approach of implementing checklists in these systems increases the cognitive effort of clinical experts and the coding effort for informaticists. This is due to the lack of a formal representation format that is both understandable by clinical experts and executable by computer programs. We developed a dynamic checklist meta-model with a three-step approach. Dynamic checklist modeling requirements were extracted by performing a domain analysis. Then, existing modeling approaches and tools were investigated with the purpose of reusing these languages. Finally, the meta-model was developed by eliciting domain concepts and their hierarchies. The feasibility of using the meta-model was validated by two case studies. The meta-model was mapped to specific modeling languages according to the requirements of hospitals. Using the proposed meta-model, a comprehensive coronary artery bypass graft peri-operative checklist set and a percutaneous coronary intervention peri-operative checklist set have been developed in a Dutch hospital and a Chinese hospital, respectively. The result shows that it is feasible to use the meta-model to facilitate the modeling and execution of dynamic checklists. We proposed a novel meta-model for dynamic checklists with the purpose of facilitating the creation of dynamic checklists. The meta-model is a framework for reusing existing modeling languages and tools to model dynamic checklists. The feasibility of using the meta-model is validated by implementing a use case in the system.
Wormhole Formation in RSRM Nozzle Joint Backfill
NASA Technical Reports Server (NTRS)
Stevens, J.
2000-01-01
The RSRM nozzle uses a barrier of RTV rubber upstream of the nozzle O-ring seals. Post-flight inspection of the RSRM nozzle continues to reveal occurrence of "wormholes" into the RTV backfill. The term "wormholes", sometimes called "gas paths", indicates a gas flow path not caused by pre-existing voids, but by a little-understood internal failure mode of the material during motor operation. Fundamental understanding of the mechanics of the RSRM nozzle joints during motor operation, nonlinear viscoelastic characterization of the RTV backfill material, identification of the conditions that predispose the RTV to form wormholes, and screening of candidate replacement materials is being pursued by a joint effort between Thiokol Propulsion, NASA, and the Army Propulsion & Structures Directorate at Redstone Arsenal. The performance of the RTV backfill in the joint is controlled by the joint environment. Joint movement, which applies tension and shear loads on the material, combined with the introduction of high-pressure gas, creates an environment that exceeds the capability of the material to withstand the wormhole effect. Little data exist to evaluate why the material fails under the modeled joint conditions, so an effort to characterize and evaluate the material under these conditions was undertaken. Viscoelastic property data from characterization testing will anchor structural analysis models. Data over a range of temperatures, environmental pressures, and strain rates were used to develop a nonlinear viscoelastic model to predict material performance, develop criteria for replacement materials, and quantify material properties influencing wormhole growth. Three joint simulation analogs were developed to analyze and validate joint thermal barrier (backfill) material performance. Two exploratory tests focus on detection of wormhole failure under specific motor operating conditions. A "validation" test system provides data to "validate" computer models and predictions. Finally, two candidate replacement materials are being screened and "validated" using the developed test systems.
Validation and Continued Development of Methods for Spheromak Simulation
NASA Astrophysics Data System (ADS)
Benedett, Thomas
2017-10-01
The HIT-SI experiment has demonstrated stable sustainment of spheromaks. Determining how the underlying physics extrapolate to larger, higher-temperature regimes is of prime importance in determining the viability of the inductively-driven spheromak. It is thus prudent to develop and validate a computational model that can be used to study current results and study the effect of possible design choices on plasma behavior. An extended MHD model has shown good agreement with experimental data at 14 kHz injector operation. Efforts to extend the existing validation to a range of higher frequencies (36, 53, 68 kHz) using the PSI-Tet 3D extended MHD code will be presented, along with simulations of potential combinations of flux conserver features and helicity injector configurations and their impact on current drive performance, density control, and temperature for future SIHI experiments. Work supported by USDoE.
Review and assessment of turbulence models for hypersonic flows
NASA Astrophysics Data System (ADS)
Roy, Christopher J.; Blottner, Frederick G.
2006-10-01
Accurate aerodynamic prediction is critical for the design and optimization of hypersonic vehicles. Turbulence modeling remains a major source of uncertainty in the computational prediction of aerodynamic forces and heating for these systems. The first goal of this article is to update the previous comprehensive review of hypersonic shock/turbulent boundary-layer interaction experiments published in 1991 by Settles and Dodson (Hypersonic shock/boundary-layer interaction database. NASA CR 177577, 1991). In their review, Settles and Dodson developed a methodology for assessing experiments appropriate for turbulence model validation and critically surveyed the existing hypersonic experiments. We limit the scope of our current effort by considering only two-dimensional (2D)/axisymmetric flows in the hypersonic flow regime where calorically perfect gas models are appropriate. We extend the prior database of recommended hypersonic experiments (on four 2D and two 3D shock-interaction geometries) by adding three new geometries. The first two geometries, the flat plate/cylinder and the sharp cone, are canonical, zero-pressure gradient flows which are amenable to theory-based correlations, and these correlations are discussed in detail. The third geometry added is the 2D shock impinging on a turbulent flat plate boundary layer. The current 2D hypersonic database for shock-interaction flows thus consists of nine experiments on five different geometries. The second goal of this study is to review and assess the validation usage of various turbulence models on the existing experimental database. Here we limit the scope to one- and two-equation turbulence models where integration to the wall is used (i.e., we omit studies involving wall functions). A methodology for validating turbulence models is given, followed by an extensive evaluation of the turbulence models on the current hypersonic experimental database. A total of 18 one- and two-equation turbulence models are reviewed, and results of turbulence model assessments for the six models that have been extensively applied to the hypersonic validation database are compiled and presented in graphical form. While some of the turbulence models do provide reasonable predictions for the surface pressure, the predictions for surface heat flux are generally poor, and often in error by a factor of four or more. In the vast majority of the turbulence model validation studies we review, the authors fail to adequately address the numerical accuracy of the simulations (i.e., discretization and iterative error) and the sensitivities of the model predictions to freestream turbulence quantities or near-wall y+ mesh spacing. We recommend new hypersonic experiments be conducted which (1) measure not only surface quantities but also mean and fluctuating quantities in the interaction region and (2) provide careful estimates of both random experimental uncertainties and correlated bias errors for the measured quantities and freestream conditions. For the turbulence models, we recommend that a wide-range of turbulence models (including newer models) be re-examined on the current hypersonic experimental database, including the more recent experiments. Any future turbulence model validation efforts should carefully assess the numerical accuracy and model sensitivities. In addition, model corrections (e.g., compressibility corrections) should be carefully examined for their effects on a standard, low-speed validation database. 
Finally, as new experiments or direct numerical simulation data become available with information on mean and fluctuating quantities, they should be used to improve the turbulence models and thus increase their predictive capability.
Impact of Learning Model Based on Cognitive Conflict toward Student’s Conceptual Understanding
NASA Astrophysics Data System (ADS)
Mufit, F.; Festiyed, F.; Fauzan, A.; Lufri, L.
2018-04-01
Problems that often occur in the learning of physics are misconceptions and low conceptual understanding. Misconceptions happen not only to school students but also to college students and teachers. Existing learning models have had little impact on improving conceptual understanding or on remediating student misconceptions. This study aims to assess the impact of a cognitive conflict-based learning model on improving conceptual understanding and remediating student misconceptions. The research method used is design/development research. The product developed is a cognitive conflict-based learning model along with its components. This article reports on the product design results, validity tests, and practicality tests. The study resulted in the design of a cognitive conflict-based learning model with four learning syntaxes, namely (1) preconception activation, (2) presentation of cognitive conflict, (3) discovery of concepts and equations, and (4) reflection. The results of validity tests by several experts on aspects of content, didactics, appearance, and language indicate very valid criteria. Product trial results also show a very practical product to use. Based on pretest and posttest results, the cognitive conflict-based learning model has a good impact on improving conceptual understanding and remediating misconceptions, especially in high-ability students.
Continued Development and Validation of Methods for Spheromak Simulation
NASA Astrophysics Data System (ADS)
Benedett, Thomas
2015-11-01
The HIT-SI experiment has demonstrated stable sustainment of spheromaks; determining how the underlying physics extrapolate to larger, higher-temperature regimes is of prime importance in determining the viability of the inductively-driven spheromak. It is thus prudent to develop and validate a computational model that can be used to study current results and provide an intermediate step between theory and future experiments. A zero-beta Hall-MHD model has shown good agreement with experimental data at 14.5 kHz injector operation. Experimental observations at higher frequency, where the best performance is achieved, indicate pressure effects are important and likely required to attain quantitative agreement with simulations. Efforts to extend the existing validation to high frequency (~ 36-68 kHz) using an extended MHD model implemented in the PSI-TET arbitrary-geometry 3D MHD code will be presented. Results from verification of the PSI-TET extended MHD model using the GEM magnetic reconnection challenge will also be presented along with investigation of injector configurations for future SIHI experiments using Taylor state equilibrium calculations. Work supported by DoE.
Control Activity in Support of NASA Turbine Based Combined Cycle (TBCC) Research
NASA Technical Reports Server (NTRS)
Stueber, Thomas J.; Vrnak, Daniel R.; Le, Dzu K.; Ouzts, Peter J.
2010-01-01
Control research for a Turbine Based Combined Cycle (TBCC) propulsion system is the current focus of the Hypersonic Guidance, Navigation, and Control (GN&C) discipline team. The ongoing work at the NASA Glenn Research Center (GRC) supports the Hypersonic GN&C effort in developing tools to aid the design of control algorithms to manage a TBCC airbreathing propulsion system during a critical operating period. The critical operating period being addressed in this paper is the span when the propulsion system transitions from one cycle to another, referred to as mode transition. One such tool, a basic need for control system design activities, is computational models (henceforth referred to as models) of the propulsion system. The models of interest for designing and testing controllers are Control Development Models (CDMs) and Control Validation Models (CVMs). CDMs and CVMs are needed for each of the following propulsion system elements: inlet, turbine engine, ram/scram dual-mode combustor, and nozzle. This paper presents an overall architecture for a TBCC propulsion system model that includes all of the propulsion system elements. Efforts are under way, focusing on one of the propulsion system elements, to develop CDMs and CVMs for a TBCC propulsion system inlet. The TBCC inlet aerodynamic design being modeled is that of the Combined-Cycle Engine (CCE) Testbed. The CCE Testbed is a large-scale model of an aerodynamic design that was verified in a small-scale screening experiment. The modeling approach includes employing existing state-of-the-art simulation codes, developing new dynamic simulations, and performing system identification experiments on the hardware in the NASA GRC 10- by 10-Foot Supersonic Wind Tunnel. The developed CDMs and CVMs will be available for control studies prior to hardware buildup. The system identification experiments on the CCE Testbed will characterize the necessary dynamics to be represented in CDMs for control design. These system identification models will also be the reference models to validate the CDM and CVM models. Validated models will give value to the tools used to develop the models.
The New NASA Orbital Debris Engineering Model ORDEM2000
NASA Technical Reports Server (NTRS)
Liou, Jer-Chyi; Matney, Mark J.; Anz-Meador, Phillip D.; Kessler, Donald; Jansen, Mark; Theall, Jeffery R.
2002-01-01
The NASA Orbital Debris Program Office at Johnson Space Center has developed a new computer-based orbital debris engineering model, ORDEM2000, which describes the orbital debris environment in the low Earth orbit region between 200 and 2000 km altitude. The model is appropriate for those engineering solutions requiring knowledge and estimates of the orbital debris environment (debris spatial density, flux, etc.). ORDEM2000 can also be used as a benchmark for ground-based debris measurements and observations. We incorporated a large set of observational data, covering the object size range from 10 mm to 10 m, into the ORDEM2000 debris database, utilizing a maximum likelihood estimator to convert observations into debris population probability distribution functions. These functions then form the basis of debris populations. We developed a finite element model to process the debris populations to form the debris environment. A more capable input and output structure and a user-friendly graphical user interface are also implemented in the model. ORDEM2000 has been subjected to a significant verification and validation effort. This document describes ORDEM2000, which supersedes the previous model, ORDEM96. The availability of new sensor and in situ data, as well as new analytical techniques, has enabled the construction of this new model. Section 1 describes the general requirements and scope of an engineering model. Data analyses and the theoretical formulation of the model are described in Sections 2 and 3. Section 4 describes the verification and validation effort and the sensitivity and uncertainty analyses. Finally, Section 5 describes the graphical user interface, software installation, and test cases for the user.
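The maximum-likelihood step (converting observed object counts into a population distribution) can be illustrated on a simplified case. The sketch below fits a power-law size distribution to simulated debris diameters above a detection threshold; the distributional form and all numbers are illustrative assumptions, not the actual ORDEM2000 populations.

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Hypothetical illustration of the maximum-likelihood idea: fit a power-law
# size distribution p(d) ~ d^(-alpha) to simulated debris diameters (meters)
# above a sensor detection threshold d_min.
rng = np.random.default_rng(3)
d_min, true_alpha = 0.01, 2.5
u = rng.uniform(size=1000)
sizes = d_min * (1 - u) ** (-1.0 / (true_alpha - 1.0))   # Pareto samples

def neg_log_likelihood(alpha):
    # log-likelihood of a Pareto (power-law) density with exponent alpha
    return -(len(sizes) * np.log(alpha - 1.0)
             + len(sizes) * (alpha - 1.0) * np.log(d_min)
             - alpha * np.sum(np.log(sizes)))

fit = minimize_scalar(neg_log_likelihood, bounds=(1.1, 5.0), method="bounded")
print("estimated exponent:", fit.x)
```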
Using Remote Sensing Imagery to Improve and Validate Predictions of Phalaris arundinacea Invasion
USDA-ARS?s Scientific Manuscript database
The prediction of the spread of invasive species has become an important tool to help land managers focus their efforts. Here we identify how the definition of an invasive species as a driver or passenger of change is important in determining the best modeling approach for a species and how invasion...
ERIC Educational Resources Information Center
Lee, Chien-Ti; Beckert, Troy E.; Goodrich, Thane R.
2010-01-01
In an effort to validate the use of a Western model of adolescent development with Asian youth, 781 urban and rural Taiwanese high school students (56% female) completed questionnaires about their development. Adolescents were first divided into cultural value orientations (i.e. collectivistic, individualistic, or transitional) and compared…
ERIC Educational Resources Information Center
Rivet, Ann E.; Kastens, Kim A.
2012-01-01
In recent years, science education has placed increasing importance on learners' mastery of scientific reasoning. This growing emphasis presents a challenge for both developers and users of assessments. We report on our effort around the conceptualization, development, and testing the validity of an assessment of students' ability to reason around…
Using the Depression, Anxiety, Stress Scales-21 with U.S. Adolescents: An Alternate Models Analysis
ERIC Educational Resources Information Center
Moore, Stephanie A.; Dowdy, Erin; Furlong, Michael J.
2017-01-01
As part of universal screening efforts in schools, validated measures that identify internalizing distress are needed. One promising available measure, the Depression, Anxiety, and Stress Scales-21 (DASS-21), has yet to be thoroughly investigated with adolescents in the United States. This study investigated the underlying factor structure of the…
ERIC Educational Resources Information Center
Cameron, Kim; And Others
This study attempted to develop a reliable and valid instrument for assessing work environment and continuous quality improvement efforts in the non-academic sectors of colleges and universities, particularly those institutions that have adopted Total Quality Management programs. A model of a work environment for continuous quality improvement was…
Microstructure Modeling of Third Generation Disk Alloys
NASA Technical Reports Server (NTRS)
Jou, Herng-Jeng
2010-01-01
The objective of this program was to model, validate, and predict the precipitation microstructure evolution, using PrecipiCalc (QuesTek Innovations LLC) software, for 3rd generation Ni-based gas turbine disc superalloys during processing and service, with a set of logical and consistent experiments and characterizations. Furthermore, within this program, the originally research-oriented microstructure simulation tool was to be further improved and implemented to be a useful and user-friendly engineering tool. In this report, the key accomplishments achieved during the third year (2009) of the program are summarized. The activities of this year included: further development of the multistep precipitation simulation framework for gamma prime microstructure evolution during heat treatment; calibration and validation of gamma prime microstructure modeling with supersolvus heat-treated LSHR; modeling of the microstructure evolution of the minor phases, particularly carbides, during isothermal aging, representing the long-term microstructure stability during thermal exposure; and the implementation of software tools. During the research and development efforts to extend the precipitation microstructure modeling and prediction capability in this 3-year program, we identified a hurdle, related to the slow gamma prime coarsening rate, with no satisfactory scientific explanation currently available. It is desirable to raise this issue to the Ni-based superalloys research community, with the hope that in the future there will be a mechanistic understanding and physics-based treatment to overcome the hurdle. In the meantime, an empirical correction factor was developed in this modeling effort to capture the experimental observations.
Using Ground-Based Measurements and Retrievals to Validate Satellite Data
NASA Technical Reports Server (NTRS)
Dong, Xiquan
2002-01-01
The proposed research is to use the DOE ARM ground-based measurements and retrievals as ground-truth references for validating satellite cloud results and retrieval algorithms. This validation effort includes four different comparisons: (1) cloud properties from different satellites, and therefore different sensors (TRMM VIRS and TERRA MODIS); (2) cloud properties at different climatic regions, such as the DOE ARM SGP, NSA, and TWP sites; (3) different cloud types, i.e., low- and high-level cloud properties; and (4) day and night retrieval algorithms. Validation of satellite-retrieved cloud properties is very difficult and a long-term effort because of significant spatial and temporal differences between the surface and satellite observing platforms. The ground-based measurements and retrievals, only when carefully analyzed and validated, can provide a baseline for estimating errors in the satellite products. Even though the validation effort is difficult, significant progress has been made during the proposed study period, and the major accomplishments are summarized in the following.
Prognostics of Power Electronics, Methods and Validation Experiments
NASA Technical Reports Server (NTRS)
Kulkarni, Chetan S.; Celaya, Jose R.; Biswas, Gautam; Goebel, Kai
2012-01-01
Failure of electronic devices is a concern for future electric aircraft that will see an increase of electronics to drive and control safety-critical equipment throughout the aircraft. As a result, investigation of precursors to failure in electronics and prediction of remaining life of electronic components is of key importance. DC-DC power converters are power electronics systems employed typically as sourcing elements for avionics equipment. Current research efforts in prognostics for these power systems focus on the identification of failure mechanisms and the development of accelerated aging methodologies and systems to accelerate the aging process of test devices, while continuously measuring key electrical and thermal parameters. Preliminary model-based prognostics algorithms have been developed making use of empirical degradation models and physics-inspired degradation models, with a focus on key components like electrolytic capacitors and power MOSFETs (metal-oxide-semiconductor field-effect transistors). This paper presents current results on the development of validation methods for prognostics algorithms for power electrolytic capacitors, particularly the use of accelerated aging systems for algorithm validation. Validation of prognostics algorithms presents difficulties in practice due to the lack of run-to-failure experiments in deployed systems. By using accelerated experiments, we circumvent this problem in order to define initial validation activities.
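A simple way to turn accelerated-aging measurements into a prognostic estimate is to fit an empirical degradation curve and extrapolate to a failure threshold. The sketch below fits an exponential capacitance-loss model to hypothetical aging data; the model form, threshold, and data are assumptions for illustration and are not the algorithms reported in the paper.

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical accelerated-aging data: electrolytic capacitor capacitance loss
# (percent) versus aging time (hours).
t = np.array([0, 50, 100, 150, 200, 250, 300], dtype=float)
cap_loss = np.array([0.0, 2.1, 4.5, 6.4, 8.8, 10.5, 12.9])

def degradation(t, a, b):
    """Simple empirical exponential degradation model (illustrative choice)."""
    return a * (np.exp(b * t) - 1.0)

params, _ = curve_fit(degradation, t, cap_loss, p0=[1.0, 0.005])
a, b = params
# Remaining-useful-life estimate: time until a hypothetical 20% loss threshold
t_eol = np.log(20.0 / a + 1.0) / b
print("fitted parameters:", a, b, "projected end-of-life (h):", t_eol)
```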
MRMAide: a mixed resolution modeling aide
NASA Astrophysics Data System (ADS)
Treshansky, Allyn; McGraw, Robert M.
2002-07-01
The Mixed Resolution Modeling Aide (MRMAide) technology is an effort to semi-automate the implementation of Mixed Resolution Modeling (MRM). MRMAide suggests ways of resolving differences in fidelity and resolution across diverse modeling paradigms. The goal of MRMAide is to provide a technology that will allow developers to incorporate model components into scenarios other than those for which they were designed. Currently, MRM is implemented by hand. This is a tedious, error-prone, and non-portable process. MRMAide, in contrast, will automatically suggest to a developer where and how to connect different components and/or simulations. MRMAide has three phases of operation: pre-processing, data abstraction, and validation. During pre-processing the components to be linked together are evaluated in order to identify appropriate mapping points. During data abstraction those mapping points are linked via data abstraction algorithms. During validation developers receive feedback regarding their newly created models relative to existing baselined models. The current work presents an overview of the various problems encountered during MRM and the various technologies utilized by MRMAide to overcome those problems.
Development and validation of a 10-year-old child ligamentous cervical spine finite element model.
Dong, Liqiang; Li, Guangyao; Mao, Haojie; Marek, Stanley; Yang, King H
2013-12-01
Although a number of finite element (FE) adult cervical spine models have been developed to understand the injury mechanisms of the neck in automotive-related crash scenarios, there have been fewer efforts to develop a child neck model. In this study, a 10-year-old ligamentous cervical spine FE model was developed for application in the improvement of pediatric safety related to motor vehicle crashes. The model geometry was obtained from medical scans and meshed using a multi-block approach. Appropriate properties based on a review of the literature, in conjunction with scaling, were assigned to different parts of the model. Child tensile force-deformation data in three segments, Occipital-C2 (C0-C2), C4-C5 and C6-C7, were used to validate the cervical spine model and predict failure forces and displacements. Design of computer experiments was performed to determine failure properties for intervertebral discs and ligaments needed to set up the FE model. The model-predicted ultimate displacements and forces were within the experimental range. The cervical spine FE model was validated in flexion and extension against the child experimental data in three segments, C0-C2, C4-C5 and C6-C7. Other model predictions were found to be consistent with the experimental responses scaled from adult data. The whole cervical spine model was also validated in tension, flexion and extension against the child experimental data. This study provided methods for developing a child ligamentous cervical spine FE model and for predicting soft tissue failures in tension.
Uncertainty propagation for statistical impact prediction of space debris
NASA Astrophysics Data System (ADS)
Hoogendoorn, R.; Mooij, E.; Geul, J.
2018-01-01
Predictions of the impact time and location of space debris in a decaying trajectory are highly influenced by uncertainties. The traditional Monte Carlo (MC) method can be used to perform accurate statistical impact predictions, but requires a large computational effort. A method is investigated that directly propagates a Probability Density Function (PDF) in time, which has the potential to obtain more accurate results with less computational effort. The decaying trajectory of Delta-K rocket stages was used to test the methods using a six degrees-of-freedom state model. The PDF of the state of the body was propagated in time to obtain impact-time distributions. This Direct PDF Propagation (DPP) method results in a multi-dimensional scattered dataset of the PDF of the state, which is highly challenging to process. No accurate results could be obtained, because of the structure of the DPP data and the high dimensionality. Therefore, the DPP method is less suitable for practical uncontrolled entry problems and the traditional MC method remains superior. Additionally, the MC method was used with two improved uncertainty models to obtain impact-time distributions, which were validated using observations of true impacts. For one of the two uncertainty models, statistically more valid impact-time distributions were obtained than in previous research.
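The Monte Carlo baseline can be conveyed with a deliberately simplified example: sample an uncertain parameter, propagate each sample through a decay model, and build the empirical impact-time distribution. The sketch below uses a one-dimensional toy altitude-decay model with an uncertain ballistic coefficient; it is not the six degrees-of-freedom model used in the study, and all constants are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(4)

def impact_time(beta, h0=100e3, dt=300.0):
    """Integrate a crude exponential-atmosphere altitude decay until impact."""
    h, t = h0, 0.0
    while h > 0.0:
        decay_rate = 1.9e6 * np.exp(-h / 7200.0) / beta   # toy altitude-loss rate (m/s)
        h -= decay_rate * dt
        t += dt
    return t

betas = rng.normal(100.0, 10.0, size=500)        # uncertain ballistic coefficient
times = np.array([impact_time(b) for b in betas]) / 3600.0

# Empirical impact-time distribution from the Monte Carlo samples
print("median impact time (h):", np.median(times),
      "90% interval:", np.percentile(times, [5, 95]))
```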
Hodge, N. E.; Ferencz, R. M.; Vignes, R. M.
2016-05-30
Selective laser melting (SLM) is an additive manufacturing process in which multiple, successive layers of metal powders are heated via laser in order to build a part. Modeling of SLM requires consideration of the complex interaction between heat transfer and solid mechanics. Here, the present work describes the authors' initial efforts to validate their first-generation model. In particular, the comparison of model-generated solid mechanics results, including both deformation and stresses, is presented. Additionally, results of various perturbations of the process parameters and modeling strategies are discussed.
Cost Modeling for Space Optical Telescope Assemblies
NASA Technical Reports Server (NTRS)
Stahl, H. Philip; Henrichs, Todd; Luedtke, Alexander; West, Miranda
2011-01-01
Parametric cost models are used to plan missions, compare concepts, and justify technology investments. This paper reviews an ongoing effort to develop cost models for space telescopes. This paper summarizes the methodology used to develop the cost models and documents how changes to the database have changed previously published preliminary cost models. While the cost models are evolving, the previously published findings remain valid: it costs less per square meter of collecting aperture to build a large telescope than a small telescope; technology development as a function of time reduces cost; and lower areal density telescopes cost more than more massive telescopes.
2015-07-01
... steps to identify and mitigate potential challenges; (2) the extent to which the services’ efforts to validate gender-neutral occupational standards are ... to address statutory and Joint Staff requirements for validating gender-neutral occupational standards. GAO identified five elements required for ... (Report heading residue: "... SOCOM Have Studies Underway to Validate Gender-Neutral Occupational Standards"; "DOD Is Providing Oversight of Integration Efforts, but Has Not ...")
Meyniel, Florent; Safra, Lou; Pessiglione, Mathias
2014-01-01
A pervasive case of cost-benefit problem is how to allocate effort over time, i.e. deciding when to work and when to rest. An economic decision perspective would suggest that duration of effort is determined beforehand, depending on expected costs and benefits. However, the literature on exercise performance emphasizes that decisions are made on the fly, depending on physiological variables. Here, we propose and validate a general model of effort allocation that integrates these two views. In this model, a single variable, termed cost evidence, accumulates during effort and dissipates during rest, triggering effort cessation and resumption when reaching bounds. We assumed that such a basic mechanism could explain implicit adaptation, whereas the latent parameters (slopes and bounds) could be amenable to explicit anticipation. A series of behavioral experiments manipulating effort duration and difficulty was conducted in a total of 121 healthy humans to dissociate implicit-reactive from explicit-predictive computations. Results show 1) that effort and rest durations are adapted on the fly to variations in cost-evidence level, 2) that the cost-evidence fluctuations driving the behavior do not match explicit ratings of exhaustion, and 3) that actual difficulty impacts effort duration whereas expected difficulty impacts rest duration. Taken together, our findings suggest that cost evidence is implicitly monitored online, with an accumulation rate proportional to actual task difficulty. In contrast, cost-evidence bounds and dissipation rate might be adjusted in anticipation, depending on explicit task difficulty. PMID:24743711
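The accumulation-to-bound mechanism described above is straightforward to simulate. The sketch below implements a minimal version: cost evidence rises during effort at a rate scaled by difficulty, dissipates during rest, and effort stops or resumes when the upper or lower bound is crossed; all parameter values are illustrative rather than fitted to the experiments.

```python
import numpy as np

def simulate_effort_allocation(difficulty=1.0, accumulation=0.8, dissipation=0.5,
                               upper=10.0, lower=2.0, duration=200.0, dt=0.1,
                               noise=0.05, seed=0):
    """Minimal cost-evidence accumulation model of effort/rest alternation."""
    rng = np.random.default_rng(seed)
    cost, working = lower, True
    effort_bouts, rest_bouts, bout_start, t = [], [], 0.0, 0.0
    while t < duration:
        # cost evidence accumulates during effort (faster for harder tasks)
        # and dissipates during rest
        rate = accumulation * difficulty if working else -dissipation
        cost += rate * dt + noise * rng.normal() * np.sqrt(dt)
        if working and cost >= upper:          # upper bound reached: stop effort
            effort_bouts.append(t - bout_start); working, bout_start = False, t
        elif not working and cost <= lower:    # lower bound reached: resume effort
            rest_bouts.append(t - bout_start); working, bout_start = True, t
        t += dt
    return effort_bouts, rest_bouts

easy = simulate_effort_allocation(difficulty=1.0)
hard = simulate_effort_allocation(difficulty=2.0)
# Higher difficulty -> faster accumulation -> shorter effort bouts
print("mean effort duration (easy vs hard):", np.mean(easy[0]), np.mean(hard[0]))
```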
Cavitation, Flow Structure and Turbulence in the Tip Region of a Rotor Blade
NASA Technical Reports Server (NTRS)
Wu, H.; Miorini, R.; Soranna, F.; Katz, J.; Michael, T.; Jessup, S.
2010-01-01
Objectives: Measure the flow structure and turbulence within a Naval, axial waterjet pump. Create a database for benchmarking and validation of parallel computational efforts. Address flow and turbulence modeling issues that are unique to this complex environment. Measure and model flow phenomena affecting cavitation within the pump and its effect on pump performance. This presentation focuses on cavitation phenomena and associated flow structure in the tip region of a rotor blade.
Multiphysics Nuclear Thermal Rocket Thrust Chamber Analysis
NASA Technical Reports Server (NTRS)
Wang, Ten-See
2005-01-01
The objective of this effort is to develop an efficient and accurate thermo-fluid computational methodology to predict environments for hypothetical thrust chamber design and analysis. The current task scope is to perform multidimensional, multiphysics analysis of thrust performance and heat transfer for a hypothetical solid-core, nuclear thermal engine including thrust chamber and nozzle. The multiphysics aspects of the model include real fluid dynamics, chemical reactivity, turbulent flow, and conjugate heat transfer. The model will be designed to identify thermal, fluid, and hydrogen environments in all flow paths and materials. This model would then be used to perform non-nuclear reproduction of the flow element failures demonstrated in the Rover/NERVA testing, investigate performance of specific configurations, and assess potential issues and enhancements. A two-pronged approach will be employed in this effort: a detailed analysis of a multi-channel flow element, and global modeling of the entire thrust chamber assembly with a porosity modeling technique. It is expected that the detailed analysis of a single flow element would provide detailed fluid, thermal, and hydrogen environments for stress analysis, while the global thrust chamber assembly analysis would promote understanding of the effects of hydrogen dissociation and heat transfer on thrust performance. These modeling activities will be validated as much as possible by testing performed by other related efforts.
Law, Bradley; Caccamo, Gabriele; Roe, Paul; Truskinger, Anthony; Brassil, Traecey; Gonsalves, Leroy; McConville, Anna; Stanton, Matthew
2017-09-01
Species distribution models have great potential to efficiently guide management for threatened species, especially for those that are rare or cryptic. We used MaxEnt to develop a regional-scale model for the koala Phascolarctos cinereus at a resolution (250 m) that could be used to guide management. To ensure the model was fit for purpose, we placed emphasis on validating the model using independently-collected field data. We reduced substantial spatial clustering of records in coastal urban areas using a 2-km spatial filter and by modeling separately two subregions separated by the 500-m elevational contour. A bias file was prepared that accounted for variable survey effort. Frequency of wildfire, soil type, floristics and elevation had the highest relative contribution to the model, while a number of other variables made minor contributions. The model was effective in discriminating different habitat suitability classes when compared with koala records not used in modeling. We validated the MaxEnt model at 65 ground-truth sites using independent data on koala occupancy (acoustic sampling) and habitat quality (browse tree availability). Koala bellows ( n = 276) were analyzed in an occupancy modeling framework, while site habitat quality was indexed based on browse trees. Field validation demonstrated a linear increase in koala occupancy with higher modeled habitat suitability at ground-truth sites. Similarly, a site habitat quality index at ground-truth sites was correlated positively with modeled habitat suitability. The MaxEnt model provided a better fit to estimated koala occupancy than the site-based habitat quality index, probably because many variables were considered simultaneously by the model rather than just browse species. The positive relationship of the model with both site occupancy and habitat quality indicates that the model is fit for application at relevant management scales. Field-validated models of similar resolution would assist in guiding management of conservation-dependent species.
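The ground-truth comparison can be expressed compactly: correlate the modeled suitability at survey sites with the estimated occupancy and with the browse-based quality index. The sketch below does this on simulated values for 65 sites; the site data are invented stand-ins for the acoustic-survey estimates, not the study's data.

```python
import numpy as np
from scipy.stats import pearsonr

# Hypothetical ground-truth check in the spirit of the validation above.
rng = np.random.default_rng(5)
n_sites = 65
suitability = rng.uniform(0, 1, n_sites)                        # MaxEnt output
occupancy = np.clip(0.1 + 0.8 * suitability                     # occupancy estimate
                    + rng.normal(0, 0.1, n_sites), 0, 1)
quality_index = np.clip(2 + 6 * suitability                     # browse-tree index
                        + rng.normal(0, 1.0, n_sites), 0, 10)

r_occ, p_occ = pearsonr(suitability, occupancy)
r_qual, p_qual = pearsonr(suitability, quality_index)
print(f"occupancy vs suitability:     r={r_occ:.2f}, p={p_occ:.3f}")
print(f"quality index vs suitability: r={r_qual:.2f}, p={p_qual:.3f}")
```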
Developing Cognitive Models for Social Simulation from Survey Data
NASA Astrophysics Data System (ADS)
Alt, Jonathan K.; Lieberman, Stephen
The representation of human behavior and cognition continues to challenge the modeling and simulation community. The use of survey and polling instruments to inform belief states, issue stances, and action choice models provides a compelling means of developing models and simulations with empirical data. Using these types of data to populate social simulations can greatly enhance the feasibility of validation efforts, the reusability of social and behavioral modeling frameworks, and the testable reliability of simulations. We provide a case study demonstrating these effects, document the use of survey data to develop cognitive models, and suggest future paths forward for social and behavioral modeling.
Modeling multiphase migration of organic chemicals in groundwater systems--a review and assessment.
Abriola, L M
1989-01-01
Over the past two decades, a number of models have been developed to describe the multiphase migration of organic chemicals in the subsurface. This paper presents the state-of-the-art with regard to such modeling efforts. The mathematical foundations of these models are explored and individual models are presented and discussed. Models are divided into three groups: a) those that assume a sharp interface between the migrating fluids; b) those that incorporate capillarity; and c) those that consider interphase transport of mass. Strengths and weaknesses of each approach are considered along with supporting data for model validation. Future research directions are also highlighted. PMID:2695322
NASA advanced turboprop research and concept validation program
DOE Office of Scientific and Technical Information (OSTI.GOV)
Whitlow, J.B. Jr.; Sievers, G.K.
1988-01-01
NASA has determined by experimental and analytical effort that use of advanced turboprop propulsion instead of the conventional turbofans in the older narrow-body airline fleet could reduce fuel consumption for this type of aircraft by up to 50 percent. In cooperation with industry, NASA has defined and implemented an Advanced Turboprop (ATP) program to develop and validate the technology required for these new high-speed, multibladed, thin, swept propeller concepts. This paper presents an overview of the analysis, model-scale test, and large-scale flight test elements of the program together with preliminary test results, as available.
Measuring infrastructure: A key step in program evaluation and planning
Schmitt, Carol L.; Glasgow, LaShawn; Lavinghouze, S. Rene; Rieker, Patricia P.; Fulmer, Erika; McAleer, Kelly; Rogers, Todd
2016-01-01
State tobacco prevention and control programs (TCPs) require a fully functioning infrastructure to respond effectively to the Surgeon General’s call for accelerating the national reduction in tobacco use. The literature describes common elements of infrastructure; however, a lack of valid and reliable measures has made it difficult for program planners to monitor relevant infrastructure indicators and address observed deficiencies, or for evaluators to determine the association among infrastructure, program efforts, and program outcomes. The Component Model of Infrastructure (CMI) is a comprehensive, evidence-based framework that facilitates TCP program planning efforts to develop and maintain their infrastructure. Measures of CMI components were needed to evaluate the model’s utility and predictive capability for assessing infrastructure. This paper describes the development of CMI measures and results of a pilot test with nine state TCP managers. Pilot test findings indicate that the tool has good face validity and is clear and easy to follow. The CMI tool yields data that can enhance public health efforts in a funding-constrained environment and provides insight into program sustainability. Ultimately, the CMI measurement tool could facilitate better evaluation and program planning across public health programs. PMID:27037655
NASA Technical Reports Server (NTRS)
Nguyen, Quang-Viet; Kojima, Jun
2005-01-01
Researchers from NASA Glenn Research Center's Combustion Branch and the Ohio Aerospace Institute (OAI) have developed a transferable calibration standard for an optical technique called spontaneous Raman scattering (SRS) in high-pressure flames. SRS is perhaps the only technique that provides spatially and temporally resolved, simultaneous multiscalar measurements in turbulent flames. Such measurements are critical for the validation of numerical models of combustion. This study has been a combined experimental and theoretical effort to develop a spectral calibration database for multiscalar diagnostics using SRS in high-pressure flames. However, in the past such measurements have used a one-of-a-kind experimental setup and a setup-dependent calibration procedure to empirically account for spectral interferences, or crosstalk, among the major species of interest. Such calibration procedures, being non-transferable, are prohibitively expensive to duplicate. A goal of this effort is to provide an SRS calibration database using transferable standards that can be implemented widely by other researchers for both atmospheric-pressure and high-pressure (less than 30 atm) SRS studies. A secondary goal of this effort is to provide quantitative multiscalar diagnostics in high-pressure environments to validate computational combustion codes.
NASA Astrophysics Data System (ADS)
Stotsky, Jay A.; Hammond, Jason F.; Pavlovsky, Leonid; Stewart, Elizabeth J.; Younger, John G.; Solomon, Michael J.; Bortz, David M.
2016-07-01
The goal of this work is to develop a numerical simulation that accurately captures the biomechanical response of bacterial biofilms and their associated extracellular matrix (ECM). In this, the second of a two-part effort, the primary focus is on formally presenting the heterogeneous rheology Immersed Boundary Method (hrIBM) and validating our model by comparison to experimental results. With this extension of the Immersed Boundary Method (IBM), we use the techniques originally developed in Part I ([19]) to treat biofilms as viscoelastic fluids possessing variable rheological properties anchored to a set of moving locations (i.e., the bacteria locations). In particular, we incorporate spatially continuous variable viscosity and density fields into our model. Although in [14,15], variable viscosity is used in an IBM context to model discrete viscosity changes across interfaces, to our knowledge this work and Part I are the first to apply the IBM to model a continuously variable viscosity field. We validate our modeling approach from Part I by comparing dynamic moduli and compliance moduli computed from our model to data from mechanical characterization experiments on Staphylococcus epidermidis biofilms. The experimental setup is described in [26] in which biofilms are grown and tested in a parallel plate rheometer. In order to initialize the positions of bacteria in the biofilm, experimentally obtained three dimensional coordinate data was used. One of the major conclusions of this effort is that treating the spring-like connections between bacteria as Maxwell or Zener elements provides good agreement with the mechanical characterization data. We also found that initializing the simulations with different coordinate data sets only led to small changes in the mechanical characterization results. Matlab code used to produce results in this paper will be available at https://github.com/MathBioCU/BiofilmSim.
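For reference, the frequency response of a single Maxwell element (the simplest of the viscoelastic links mentioned above) follows the standard expressions G'(ω) = Gω²τ²/(1 + ω²τ²) and G''(ω) = Gωτ/(1 + ω²τ²). The sketch below evaluates these moduli for illustrative parameter values; it is not the hrIBM code itself, and the numbers are not fitted to the S. epidermidis data.

```python
import numpy as np

# Storage and loss moduli of a single Maxwell element: spring G in series with
# a dashpot of relaxation time tau. Parameter values are illustrative only.
G, tau = 50.0, 0.2               # Pa, s (hypothetical)
omega = np.logspace(-1, 2, 50)   # angular frequency, rad/s

G_storage = G * (tau * omega) ** 2 / (1.0 + (tau * omega) ** 2)
G_loss = G * (tau * omega) / (1.0 + (tau * omega) ** 2)

# At low frequency the element flows (G'' > G'); at high frequency it is
# elastic (G' -> G), the basic signature compared against rheometer data.
print(G_storage[[0, -1]], G_loss[[0, -1]])
```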
Fiber Optic Wing Shape Sensing on NASA's Ikhana UAV
NASA Technical Reports Server (NTRS)
Richards, Lance; Parker, Allen R.; Ko, William L.; Piazza, Anthony
2008-01-01
This document discusses the development of fiber optic wing shape sensing on NASA's Ikhana vehicle. The Dryden Flight Research Center's Aerostructures Branch initiated fiber-optic instrumentation development efforts in the mid-1990s. Motivated by a failure to control wing dihedral that resulted in a mishap with the Helios aircraft, new wing displacement techniques were developed. Research objectives for Ikhana included validating fiber optic sensor measurements and real-time wing shape sensing predictions; validating fiber optic mathematical models and design tools; assessing technical viability and, if applicable, developing the methodology and approaches to incorporate wing shape measurements within the vehicle flight control system; and developing and flight-validating approaches to perform active wing shape control using conventional control surfaces and active material concepts.
OWL-based reasoning methods for validating archetypes.
Menárguez-Tortosa, Marcos; Fernández-Breis, Jesualdo Tomás
2013-04-01
Some modern Electronic Healthcare Record (EHR) architectures and standards are based on the dual model-based architecture, which defines two conceptual levels: the reference model and the archetype model. Such architectures represent EHR domain knowledge by means of archetypes, which are considered by many researchers to play a fundamental role in achieving semantic interoperability in healthcare. Consequently, formal methods for validating archetypes are necessary. In recent years, there has been increasing interest in exploring how semantic web technologies in general, and ontologies in particular, can facilitate the representation and management of archetypes, including binding to terminologies, but no solution based on such technologies has been provided to date to validate archetypes. Our approach represents archetypes by means of OWL ontologies. This permits the two levels of the dual model-based architecture to be combined in one modeling framework, which can also integrate terminologies available in OWL format. The validation method consists of reasoning on those ontologies to find modeling errors in archetypes: incorrect restrictions over the reference model, non-conformant archetype specializations, and inconsistent terminological bindings. The archetypes available in the repositories supported by the openEHR Foundation and the NHS Connecting for Health Program, the two largest publicly available repositories, have been analyzed with our validation method. For this purpose, we have implemented a software tool called Archeck. Our results show that around one fifth of archetype specializations contain modeling errors, the most common mistakes being related to coded terms and terminological bindings. The analysis of each repository reveals that different patterns of errors are found in the two repositories. This result reinforces the need for serious efforts to improve archetype design processes. Copyright © 2012 Elsevier Inc. All rights reserved.
Browning, J. R.; Jonkman, J.; Robertson, A.; ...
2014-12-16
High-quality computer simulations are required when designing floating wind turbines because of the complex dynamic responses inherent to systems with a high number of degrees of freedom and variable metocean conditions. In 2007, the FAST wind turbine simulation tool, developed and maintained by the U.S. Department of Energy's (DOE's) National Renewable Energy Laboratory (NREL), was expanded to include capabilities suitable for modeling floating offshore wind turbines. In an effort to validate FAST and other offshore wind energy modeling tools, DOE funded the DeepCwind project, which tested three prototype floating wind turbines at 1/50th scale in a wave basin: a semisubmersible, a tension-leg platform, and a spar buoy. This paper describes the use of the spar wave basin test results to calibrate and validate the FAST offshore floating simulation tool, and presents initial results of simulated dynamic responses of the spar to several combinations of wind and sea states. Wave basin tests with the spar attached to a scale model of the NREL 5-megawatt reference wind turbine were performed at the Maritime Research Institute Netherlands under the DeepCwind project. These tests included free-decay tests, tests with steady or turbulent wind and still water, tests with periodic and irregular waves and no wind, and combined wind/wave tests. The resulting data from the 1/50th-scale model were scaled to full size using Froude scaling and used to calibrate and validate a full-size simulated model in FAST. Results of the model calibration and validation include successes, subtleties, and limitations of both wave basin testing and FAST modeling capabilities.
Hypersonic Experimental and Computational Capability, Improvement and Validation. Volume 2
NASA Technical Reports Server (NTRS)
Muylaert, Jean (Editor); Kumar, Ajay (Editor); Dujarric, Christian (Editor)
1998-01-01
The results of the phase 2 effort conducted under AGARD Working Group 18 on Hypersonic Experimental and Computational Capability, Improvement and Validation are presented in this report. The first volume, published in May 1996, mainly focused on the design methodology, plans, and some initial results of experiments that had been conducted to serve as validation benchmarks. The current volume presents the detailed experimental and computational database developed during this effort.
Peoples, Shelagh M; O'Dwyer, Laura M; Shields, Katherine A; Wang, Yang
2013-01-01
This research describes the development process, psychometric analyses, and partial validation study of a theoretically grounded Rasch-based instrument, the Nature of Science Instrument-Elementary (NOSI-E). The NOSI-E was designed to measure elementary students' understanding of the Nature of Science (NOS). Evidence is provided for three of the six validity aspects (content, substantive, and generalizability) needed to support the construct validity of the NOSI-E. A future article will examine the structural and external validity aspects. Rasch modeling proved especially productive in scale improvement efforts. The instrument, designed for large-scale assessment use, is conceptualized using five construct domains. Data from 741 elementary students were used to pilot the Rasch scale, with continuous improvements made over three successive administrations. The psychometric properties of the NOSI-E instrument are consistent with the basic assumptions of Rasch measurement, namely that the items are well-fitting and invariant. Items from each of the five domains (Empirical, Theory-Laden, Certainty, Inventive, and Socially and Culturally Embedded) are spread along the scale's continuum and appear to overlap well. Most importantly, the scale seems appropriately calibrated and responsive for elementary school-aged children, the target age group. As a result, the NOSI-E should prove beneficial for science education research. As the United States' science education reform efforts move toward students learning science through engaging in authentic scientific practices (NRC, 2011), it will be important to assess whether this new approach to teaching science is effective. The NOSI-E can be used as one measure of whether this reform effort has an impact.
ERIC Educational Resources Information Center
Shindler, John; Taylor, Clint; Cadenas, Herminia; Jones, Albert
This study was a pilot effort to examine the efficacy of an analytic trait scale school climate assessment instrument and democratic change system in two urban high schools. Pilot study results indicate that the instrument shows promising soundness in that it exhibited high levels of validity and reliability. In addition, the analytic trait format…
Good, Andrew C; Hermsmeier, Mark A
2007-01-01
Research into the advancement of computer-aided molecular design (CAMD) has a tendency to focus on the discipline of algorithm development. Such efforts often come at the expense of the data set selection and analysis used in validating said algorithms. Here we highlight the potential problems this can cause in the context of druglikeness classification. More rigorous efforts are applied to the selection of decoy (nondruglike) molecules from the ACD. Comparisons are made between model performance using the standard technique of random test set creation and test sets derived from explicit ontological separation by drug class. The dangers of viewing druglike space as sufficiently coherent to permit simple classification are highlighted. In addition, the issues inherent in applying unfiltered data and random test set selection to (Q)SAR models utilizing large and supposedly heterogeneous databases are discussed.
Life Modeling and Design Analysis for Ceramic Matrix Composite Materials
NASA Technical Reports Server (NTRS)
2005-01-01
The primary research efforts focused on characterizing and modeling static failure, environmental durability, and creep-rupture behavior of two classes of ceramic matrix composites (CMC): silicon carbide fibers in a silicon carbide matrix (SiC/SiC) and carbon fibers in a silicon carbide matrix (C/SiC). An engineering life prediction model (the Probabilistic Residual Strength model) has been developed specifically for CMCs. The model uses residual strength as the damage metric for evaluating remaining life and is posed probabilistically in order to account for the stochastic nature of the material's response. In support of the modeling effort, extensive testing of C/SiC in partial pressures of oxygen has been performed, including creep testing, tensile testing, half-life testing, and residual tensile strength testing. C/SiC is proposed for airframe and propulsion applications in advanced reusable launch vehicles. Figures 1 and 2 illustrate the model's predictive capabilities as well as the manner in which experimental tests are being selected so as to ensure that sufficient data are available to aid in model validation.
Automated Boundary Conditions for Wind Tunnel Simulations
NASA Technical Reports Server (NTRS)
Carlson, Jan-Renee
2018-01-01
Computational fluid dynamic (CFD) simulations of models tested in wind tunnels require a high level of fidelity and accuracy, particularly for the purposes of CFD validation efforts. Considerable effort is required to properly characterize both the physical geometry of the wind tunnel and the flow conditions inside it. The typical trial-and-error effort used to determine the boundary condition values for a particular tunnel configuration is time and computer-resource intensive. This paper describes a method for calculating and updating the back pressure boundary condition in wind tunnel simulations by using a proportional-integral-derivative controller. The controller methodology and equations are discussed, and simulations using the controller to set a tunnel Mach number in the NASA Langley 14- by 22-Foot Subsonic Tunnel are demonstrated.
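The sketch below illustrates the controller idea in schematic form only: a discrete PID (here effectively PI) update drives a back-pressure boundary value toward a target tunnel Mach number. The linear tunnel response, gains, and pressure values are hypothetical stand-ins for the actual CFD feedback loop described in the paper.

```python
def pid_update(error, integral, prev_error, dt, kp, ki, kd):
    """One discrete PID step; returns the controller output and the updated integral."""
    integral += error * dt
    derivative = (error - prev_error) / dt
    return kp * error + ki * integral + kd * derivative, integral

# Toy stand-in for the CFD tunnel response: Mach rises as back pressure drops.
def tunnel_mach(back_pressure, p_ref=101325.0):
    return 1.2 * (1.0 - back_pressure / p_ref)

target_mach = 0.20
back_pressure = 95000.0            # Pa, hypothetical starting guess
integral, prev_error, dt = 0.0, 0.0, 1.0
for step in range(50):
    mach = tunnel_mach(back_pressure)
    error = target_mach - mach
    correction, integral = pid_update(error, integral, prev_error, dt,
                                      kp=2.0e4, ki=5.0e3, kd=0.0)
    back_pressure -= correction    # lower back pressure to raise Mach
    prev_error = error
print(f"converged Mach ~ {tunnel_mach(back_pressure):.3f}")
```

In a real simulation the toy `tunnel_mach` function would be replaced by the CFD solution sampled at each controller update, which is the step the trial-and-error approach otherwise performs by hand.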
Uncertainty Assessment of Hypersonic Aerothermodynamics Prediction Capability
NASA Technical Reports Server (NTRS)
Bose, Deepak; Brown, James L.; Prabhu, Dinesh K.; Gnoffo, Peter; Johnston, Christopher O.; Hollis, Brian
2011-01-01
The present paper provides the background of a focused effort to assess uncertainties in predictions of heat flux and pressure in hypersonic flight (airbreathing or atmospheric entry) using state-of-the-art aerothermodynamics codes. The assessment is performed for four mission-relevant problems: (1) shock/turbulent boundary layer interaction on a compression corner, (2) shock/turbulent boundary layer interaction due to an impinging shock, (3) high-mass Mars entry and aerocapture, and (4) high-speed return to Earth. A validation-based uncertainty assessment approach with reliance on subject matter expertise is used. A code verification exercise with code-to-code comparisons and comparisons against well-established correlations is also included in this effort. A thorough review of the literature in search of validation experiments was performed, which identified a scarcity of ground-based validation experiments at hypersonic conditions. In particular, a shortage of usable experimental data at flight-like enthalpies and Reynolds numbers was found. The uncertainty was quantified using metrics that measure the discrepancy between model predictions and experimental data. The discrepancy data are statistically analyzed and investigated for physics-based trends in order to define a meaningful quantified uncertainty. The detailed uncertainty assessment of each mission-relevant problem is found in the four companion papers.
Computational fluid dynamics combustion analysis evaluation
NASA Technical Reports Server (NTRS)
Kim, Y. M.; Shang, H. M.; Chen, C. P.; Ziebarth, J. P.
1992-01-01
This study involves the development of numerical modelling in spray combustion. These modelling efforts are motivated mainly by the need to improve the computational efficiency of the stochastic particle tracking method and to incorporate physical submodels of turbulence, combustion, vaporization, and dense spray effects. The present mathematical formulation and numerical methodologies can be cast in any time-marching pressure correction methodology (PCM), such as the FDNS code and the MAST code. A sequence of validation cases involving steady burning sprays and transient evaporating sprays will be included.
2009-11-24
assisted by the Brigade Combat Team (BCT) Modernization effort, the use of Models and Simulations (M&S) becomes more crucial in supporting major... in 2008 via a slice of the Current Force (CF) BCT structure. To ensure realistic operational context, an M&S System-of-Systems (SoS) level... messages, and constructive representation of platforms, vehicles, and terrain. The M&S federation also provided test control, data collection, and live
Software Development Cost Estimation Executive Summary
NASA Technical Reports Server (NTRS)
Hihn, Jairus M.; Menzies, Tim
2006-01-01
Identify simple, fully validated cost models that provide estimation uncertainty with each cost estimate, based on the COCOMO variable set. Use machine learning techniques to determine: a) the minimum number of cost drivers required for NASA domain-based cost models; b) the minimum number of data records required; and c) estimation uncertainty. Build a repository of software cost estimation information. Coordinate tool development and data collection with: a) tasks funded by PA&E Cost Analysis; b) the IV&V Effort Estimation Task; and c) NASA SEPG activities.
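For readers unfamiliar with the model family referenced above, the sketch below shows a generic COCOMO-style effort estimate (effort = a * KSLOC^b times a product of effort multipliers); the coefficients and multipliers are illustrative textbook-style values, not the NASA-calibrated cost drivers this task seeks to determine.

```python
def cocomo_effort(ksloc, a=2.4, b=1.05, multipliers=()):
    """COCOMO-style estimate in person-months:
    effort = a * KSLOC**b * product(effort multipliers)."""
    effort = a * ksloc ** b
    for m in multipliers:
        effort *= m
    return effort

# Hypothetical 40-KSLOC project with two illustrative cost-driver multipliers applied.
print(round(cocomo_effort(40.0, multipliers=(1.15, 0.88)), 1), "person-months")
```

The machine-learning question posed in the abstract is essentially how few such multipliers (cost drivers) and how few historical records are needed before the estimate and its uncertainty stop improving.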
Solar Occultation Retrieval Algorithm Development
NASA Technical Reports Server (NTRS)
Lumpe, Jerry D.
2004-01-01
This effort addresses the comparison and validation of currently operational solar occultation retrieval algorithms, and the development of generalized algorithms for future application to multiple platforms. Work to date includes initial development of generalized forward-model algorithms capable of simulating transmission data of the POAM II/III and SAGE II/III instruments. Work in the 2nd quarter will focus on completion of the forward-model algorithms, including accurate spectral characteristics for all instruments, and comparison of simulated transmission data with actual Level 1 instrument data for specific occultation events.
de Jonge, Jan; van der Linden, Sjaak; Schaufeli, Wilmar; Peter, Richard; Siegrist, Johannes
2008-01-01
Key measures of Siegrist's (1996) Effort-Reward Imbalance (ERI) Model (i.e., efforts, rewards, and overcommitment) were psychometrically tested. To study change in organizational interventions, knowledge about the type of change underlying the instruments used is needed. In addition to assessing baseline factorial validity and reliability, the factorial stability over time - known as alpha-beta-gamma change - of the ERI scales was examined. Psychometric properties were tested among 383 and 267 healthcare workers from two Dutch panel surveys with different time lags. Baseline results favored a five-factor model (i.e., efforts, esteem rewards, financial/career-related aspects, job security, and overcommitment) over a three-factor solution (i.e., efforts, composite rewards, and overcommitment). Considering the changes as a whole, the factor loadings of the three ERI scales in particular were not equal over time. Overall, the findings suggest that moderate changes in the ERI factor structure did not affect the interpretation of mean changes over time. Occupational health researchers utilizing the ERI scales can feel confident that self-reported changes are more likely to be due to factors other than structural change of the ERI scales over time, which has important implications for evaluating job stress and health interventions.
Test of memory malingering (TOMM) trial 1 as a screening measure for insufficient effort.
O'Bryant, Sid E; Engel, Lisa R; Kleiner, Jennifer S; Vasterling, Jennifer J; Black, F William
2007-05-01
The identification of insufficient effort is critical to neuropsychological evaluation, and several existing instruments assess effort on neuropsychological tasks. Yet instruments designed to detect insufficient effort are underutilized in standard neuropsychological assessments, perhaps in part because they typically require significant administration time and are therefore not ideally suited to screening contexts. The Test of Memory Malingering (TOMM) is a commonly administered, well-validated symptom validity test (SVT). This study evaluates the utility of TOMM Trial 1 as a relatively brief screening measure of insufficient effort. Results suggest that TOMM Trial 1 demonstrates high diagnostic accuracy and is a viable option for screening for insufficient effort. Diagnostic accuracy estimates are presented for a range of base rates. The need for more comprehensive SVT assessment in most clinical and forensic situations is discussed.
Motlagh, Farhad Shafiepour; Yarmohammadian, Mohammad Hossein; Yaghoubi, Maryam
2012-03-01
One important factor in the growth, progress, and increased work efficiency of employees in any enterprise is considerable effort. The Supreme Leader of the Islamic Republic of Iran has also addressed the need for greater effort. The goal of this study was to determine the association of perceived organizational justice and organizational expectations with the efforts of nurses, and to provide a suitable model. The current study was descriptive. The study group consisted of all nurses who worked in hospitals of Isfahan; due to some limitations, only nurses of the special units, surgery wards, and operating rooms were questioned. The data collection tools were the Organizational Justice Questionnaire, an organizational expectations questionnaire, and a double-effort questionnaire. Content validity of these questionnaires was confirmed after considering experts' comments, and their reliabilities, using Cronbach's alpha, were 0.79, 0.83, and 0.92, respectively. Pearson correlations and structural equation modeling were used for the analysis of the data. There was a significant correlation between perceived organizational justice and the double effort of nurses during patients' surgery. Correlations of job expectation, job usefulness, and job attractiveness with the double effort of nurses before surgery were also statistically significant. Moreover, the root mean square error of approximation (RMSEA) was 0.087, the goodness-of-fit index (GFI) was 0.953, the chi-square value was 268.5, and the model was statistically significant (p < 0.001). Today, justice is an essential need of human life, and its importance in organizations and in the social life of individuals is evident.
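For readers unfamiliar with the reported fit index, the sketch below shows the standard RMSEA calculation from a model chi-square. The chi-square value is the one reported in the abstract, but the degrees of freedom and sample size are hypothetical placeholders, since the abstract does not report them.

```python
import math

def rmsea(chi2, df, n):
    """Root mean square error of approximation for a fitted SEM:
    sqrt(max(chi2 - df, 0) / (df * (n - 1)))."""
    return math.sqrt(max(chi2 - df, 0.0) / (df * (n - 1)))

# chi2 = 268.5 is reported in the abstract; df and n below are invented,
# chosen only to illustrate the calculation, so the result will not match 0.087.
print(round(rmsea(268.5, df=120, n=300), 3))
```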
Predicted deep-sea coral habitat suitability for the U.S. West coast.
Guinotte, John M; Davies, Andrew J
2014-01-01
Regional scale habitat suitability models provide finer scale resolution and more focused predictions of where organisms may occur. Previous modelling approaches have focused primarily on local and/or global scales, while regional scale models have been relatively few. In this study, regional scale predictive habitat models are presented for deep-sea corals for the U.S. West Coast (California, Oregon and Washington). Model results are intended to aid in future research or mapping efforts, to assess potential coral habitat suitability both within and outside existing bottom trawl closures (i.e., Essential Fish Habitat (EFH)), and to identify suitable habitat within U.S. National Marine Sanctuaries (NMS). Deep-sea coral habitat suitability was modelled at 500 m × 500 m spatial resolution using a range of physical, chemical and environmental variables known or thought to influence the distribution of deep-sea corals. Using a spatial partitioning cross-validation approach, maximum entropy models identified slope, temperature, salinity and depth as important predictors for most deep-sea coral taxa. Large areas of highly suitable deep-sea coral habitat were predicted both within and outside of existing bottom trawl closures and NMS boundaries. Habitat suitability predicted over regional scales cannot currently identify coral areas with pinpoint accuracy and probably overpredicts actual coral distribution because of model limitations and unincorporated variables (e.g., data on the distribution of hard substrate) that are known to limit coral distribution. Predicted habitat results should be used in conjunction with multibeam bathymetry, geological mapping and other tools to guide future research efforts to areas with the highest probability of harboring deep-sea corals. Field validation of predicted habitat is needed to quantify model accuracy, particularly in areas that have not been sampled.
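A minimal sketch of the spatial-partitioning idea behind the cross-validation is given below: samples are grouped into contiguous longitude/latitude blocks and whole blocks are withheld in turn, rather than withholding randomly scattered points. The coordinates, block counts, and grid are hypothetical and independent of the study's actual 500 m MaxEnt workflow.

```python
import numpy as np

def spatial_block_ids(lon, lat, n_lon_bins=4, n_lat_bins=2):
    """Assign each sample point to a spatial block so that cross-validation
    withholds geographically contiguous regions rather than random points."""
    lon_bin = np.digitize(lon, np.linspace(lon.min(), lon.max(), n_lon_bins + 1)[1:-1])
    lat_bin = np.digitize(lat, np.linspace(lat.min(), lat.max(), n_lat_bins + 1)[1:-1])
    return lon_bin * n_lat_bins + lat_bin      # block id per sample

# Hypothetical presence/background points off the U.S. West Coast.
rng = np.random.default_rng(0)
lon = rng.uniform(-126.0, -117.0, 1000)
lat = rng.uniform(32.0, 48.0, 1000)
blocks = spatial_block_ids(lon, lat)
for b in np.unique(blocks):
    test = blocks == b                          # hold out one block per fold
    train = ~test
    print(f"fold {b}: {train.sum()} train / {test.sum()} test samples")
```

Holding out contiguous blocks gives a sterner test of transferability than random splits, because spatial autocorrelation otherwise lets nearby training points "leak" into the test set.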
Validation and Continued Development of Methods for Spheromak Simulation
NASA Astrophysics Data System (ADS)
Benedett, Thomas
2016-10-01
The HIT-SI experiment has demonstrated stable sustainment of spheromaks. Determining how the underlying physics extrapolates to larger, higher-temperature regimes is of prime importance in assessing the viability of the inductively driven spheromak. It is thus prudent to develop and validate a computational model that can be used to study current results and the effect of possible design choices on plasma behavior. A zero-beta Hall-MHD model has shown good agreement with experimental data at 14.5 kHz injector operation. Experimental observations at higher frequency, where the best performance is achieved, indicate that pressure effects are important and likely required to attain quantitative agreement with simulations. Efforts to extend the existing validation to high frequency (36-68 kHz) using an extended MHD model implemented in the PSI-TET arbitrary-geometry 3D MHD code will be presented. An implementation of anisotropic viscosity, a feature observed to improve agreement between NIMROD simulations and experiment, will also be presented, along with investigations of flux conserver features and their impact on density control for future SIHI experiments. Work supported by DoE.
Pinxten, Maarten; Marsh, Herbert W; De Fraine, Bieke; Van Den Noortgate, Wim; Van Damme, Jan
2014-03-01
The multidimensionality of the academic self-concept in terms of domain specificity has been well established in previous studies, whereas its multidimensionality in terms of motivational functions (the so-called affect-competence separation) needs further examination. This study aims to explore the differential effects of enjoyment and competence beliefs on two external validity criteria in the field of mathematics. The data analysed in this study were part of a large-scale longitudinal research project. Following a five-wave design, math enjoyment, math competence beliefs, math achievement, and perceived math effort expenditure measures were repeatedly collected from a cohort of 4,724 pupils in Grades 3-7. Confirmatory factor analysis (CFA) was used to test the internal factor structure of the math self-concept. Additionally, a series of nested models was tested using structural equation modelling to examine longitudinal reciprocal interrelations between math competence beliefs and math enjoyment on the one hand and math achievement and perceived math effort expenditure on the other. Our results showed that CFA models with separate factors for math enjoyment and math competence beliefs fit the data substantially better than models without this separation. Furthermore, differential relationships between the two constructs and the two educational outcomes were observed. Math competence beliefs had positive effects on math achievement and negative effects on perceived math effort expenditure. Math enjoyment had (mild) positive effects on subsequent perceived effort expenditure and math competence beliefs. This study provides further support for the affect-competence separation. Theoretical issues regarding adequate conceptualization and practical consequences for practitioners are discussed. © 2013 The British Psychological Society.
FDA 2011 process validation guidance: lifecycle compliance model.
Campbell, Cliff
2014-01-01
This article has been written as a contribution to the industry's efforts in migrating from a document-driven to a data-driven compliance mindset. A combination of target product profile, control engineering, and general sum principle techniques is presented as the basis of a simple but scalable lifecycle compliance model in support of modernized process validation. Unit operations and significant variables occupy pole position within the model, documentation requirements being treated as a derivative or consequence of the modeling process. The quality system is repositioned as a subordinate of system quality, this being defined as the integral of related "system qualities". The article represents a structured interpretation of the U.S. Food and Drug Administration's 2011 Guidance for Industry on Process Validation and is based on the author's educational background and his manufacturing/consulting experience in the validation field. The U.S. Food and Drug Administration's Guidance for Industry on Process Validation (2011) provides a wide-ranging and rigorous outline of compliant drug manufacturing requirements relative to its 20th-century predecessor (1987). Its declared focus is patient safety, and it identifies three inter-related (and obvious) stages of the compliance lifecycle. Firstly, processes must be designed, both from a technical and quality perspective. Secondly, processes must be qualified, providing evidence that the manufacturing facility is fully "roadworthy" and fit for its intended purpose. Thirdly, processes must be verified, meaning that commercial batches must be monitored to ensure that processes remain in a state of control throughout their lifetime.
Quantitative Validation of the Integrated Medical Model (IMM) for ISS Missions
NASA Technical Reports Server (NTRS)
Young, Millennia; Arellano, J.; Boley, L.; Garcia, Y.; Saile, L.; Walton, M.; Kerstman, E.; Reyes, D.; Goodenow, D. A.; Myers, J. G.
2016-01-01
Lifetime Surveillance of Astronaut Health (LSAH) provided observed medical event data on 33 ISS and 111 STS person-missions for use in further improving and validating the Integrated Medical Model (IMM). Using only the crew characteristics from these observed missions, the newest development version, IMM v4.0, will simulate these missions to predict medical events and outcomes. Comparing IMM predictions to the actual observed medical event counts will provide external validation and identify areas of possible improvement. In an effort to improve the power to detect differences in this validation study, the totals over each program, ISS and STS, will serve as the main quantitative comparison objectives, specifically the following parameters: total medical events (TME), probability of loss of crew life (LOCL), and probability of evacuation (EVAC). Scatter plots of observed versus median predicted TMEs (with error bars reflecting the simulation intervals) will graphically display comparisons, while linear regression will serve as the statistical test of agreement. Two scatter plots will be analyzed: 1) one where each point reflects a mission, and 2) one where each point reflects a condition-specific total number of occurrences. The coefficient of determination (R2) resulting from a linear regression with no intercept bias (intercept fixed at zero) will serve as an overall metric of agreement between the IMM and the real-world system (RWS). In an effort to identify as many discrepancies as possible for further inspection, the alpha-level for all statistical tests comparing IMM predictions to observed data will be set to 0.1. This less stringent criterion, along with the multiple testing being conducted, should detect all perceived differences, including many false positive signals resulting from random variation. The results of these analyses will reveal areas of the model requiring adjustment to improve overall IMM output, which will thereby provide better decision support for mission critical applications.
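The sketch below illustrates the planned agreement metric under the stated assumption of a regression with the intercept fixed at zero: a through-origin slope is fitted to observed versus predicted totals and the corresponding uncentered R^2 is computed. The example counts are invented and are not LSAH or IMM data.

```python
import numpy as np

def zero_intercept_fit(observed, predicted):
    """Fit observed ~ slope * predicted with no intercept, and return the slope and
    the uncentered R^2 conventionally reported for through-origin regression."""
    x = np.asarray(predicted, dtype=float)
    y = np.asarray(observed, dtype=float)
    slope = (x @ y) / (x @ x)
    residuals = y - slope * x
    r2 = 1.0 - (residuals @ residuals) / (y @ y)
    return slope, r2

# Hypothetical condition-specific event counts: model-predicted medians vs. observed.
predicted = np.array([12.0, 5.0, 3.0, 1.0, 8.0])
observed = np.array([10.0, 6.0, 2.0, 1.0, 9.0])
print(zero_intercept_fit(observed, predicted))
```

A slope near one together with a high uncentered R^2 would indicate that the model neither systematically over- nor under-predicts the totals.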
ERIC Educational Resources Information Center
Espinoza, Penelope P.; Quezada, Stephanie A.; Rincones, Rodolfo; Strobach, E. Natalia; Gutierrez, Maria Armida Estrada
2012-01-01
The present work investigates the validation of a newly developed instrument, the attributional bias instrument, based on achievement attribution theories that distinguish between effort and ability explanations of behavior. The instrument further incorporates the distinction between explanations for success versus failure in academic performance.…
Lampropoulou, Sofia; Nowicky, Alexander V
2012-03-01
The aim of the study was to examine the reliability and validity of the numerical rating scale (0-10 NRS) for rating perception of effort during isometric elbow flexion in healthy people. 33 individuals (32 ± 8 years) participated in the study. Three re-test measurements within one session and three weekly sessions were undertaken to determine the reliability of the scale. The sensitivity of the scale following 10 min of isometric fatiguing exercise of the elbow flexors, as well as the correlation of effort with the electromyographic (EMG) activity of the flexor muscles, was tested. Perception of effort was tested during isometric elbow flexion at 10, 30, 50, 70, 90, and 100% MVC. The 0-10 NRS demonstrated excellent test-retest reliability [intraclass correlation (ICC) = 0.99 between measurements taken within a session and 0.96 between 3 consecutive weekly sessions]. Exploratory curve fitting for the relationship between effort ratings and voluntary force, and between effort ratings and underlying EMG, showed that both are best described by power functions (y = ax^b). There were also strong correlations (range 0.89-0.95) between effort ratings and EMG recordings of all flexor muscles, supporting the concurrent criterion validity of the measure. The 0-10 NRS was sensitive enough to detect changes in perceived effort following fatigue, with ratings significantly increased at the level of voluntary contraction used in its assessment (p < 0.001). These findings suggest the 0-10 NRS is a valid and reliable scale for rating perception of effort in healthy individuals. Future research should seek to establish the validity of the 0-10 NRS in clinical settings.
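A minimal sketch of the reported power-function fit (y = ax^b) between effort ratings and relative force is shown below, using ordinary least squares on log-transformed values; the effort ratings in the example are invented and are not the study's data.

```python
import numpy as np

# Hypothetical group-mean effort ratings (0-10 NRS) at each %MVC level.
force_pct = np.array([10.0, 30.0, 50.0, 70.0, 90.0, 100.0])   # %MVC
effort = np.array([0.8, 2.1, 3.9, 6.0, 8.5, 9.7])             # made-up ratings

# Linearize: log(y) = log(a) + b*log(x), then solve by least squares.
b, log_a = np.polyfit(np.log(force_pct), np.log(effort), 1)
a = np.exp(log_a)
print(f"fitted y = {a:.3f} * x^{b:.2f}")
```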
NASA Technical Reports Server (NTRS)
Briggs, Maxwell H.; Schifer, Nicholas A.
2012-01-01
The U.S. Department of Energy (DOE) and Lockheed Martin Space Systems Company (LMSSC) have been developing the Advanced Stirling Radioisotope Generator (ASRG) for use as a power system for space science missions. This generator would use two high-efficiency Advanced Stirling Convertors (ASCs), developed by Sunpower Inc. and NASA Glenn Research Center (GRC). The ASCs convert thermal energy from a radioisotope heat source into electricity. As part of ground testing of these ASCs, different operating conditions are used to simulate expected mission conditions. These conditions require achieving a particular operating frequency, hot end and cold end temperatures, and specified electrical power output for a given net heat input. In an effort to improve net heat input predictions, numerous tasks have been performed which provided a more accurate value for net heat input into the ASCs, including testing validation hardware, known as the Thermal Standard, to provide a direct comparison to numerical and empirical models used to predict convertor net heat input. This validation hardware provided a comparison for scrutinizing and improving empirical correlations and numerical models of ASC-E2 net heat input. This hardware simulated the characteristics of an ASC-E2 convertor in both an operating and non-operating mode. This paper describes the Thermal Standard testing and the conclusions of the validation effort applied to the empirical correlation methods used by the Radioisotope Power System (RPS) team at NASA Glenn.
Ongoing Fixed Wing Research within the NASA Langley Aeroelasticity Branch
NASA Technical Reports Server (NTRS)
Bartels, Robert; Chwalowski, Pawel; Funk, Christie; Heeg, Jennifer; Hur, Jiyoung; Sanetrik, Mark; Scott, Robert; Silva, Walter; Stanford, Bret; Wiseman, Carol
2015-01-01
The NASA Langley Aeroelasticity Branch is involved in a number of research programs related to fixed wing aeroelasticity and aeroservoelasticity. These ongoing efforts are summarized here, and include aeroelastic tailoring of subsonic transport wing structures, experimental and numerical assessment of truss-braced wing flutter and limit cycle oscillations, and numerical modeling of high speed civil transport configurations. Efforts devoted to verification, validation, and uncertainty quantification of aeroelastic physics in a workshop setting are also discussed. The feasibility of certain future civil transport configurations will depend on the ability to understand and control complex aeroelastic phenomena, a goal to which the Aeroelasticity Branch is well positioned to contribute through these programs.
International Harmonization and Cooperation in the Validation of Alternative Methods.
Barroso, João; Ahn, Il Young; Caldeira, Cristiane; Carmichael, Paul L; Casey, Warren; Coecke, Sandra; Curren, Rodger; Desprez, Bertrand; Eskes, Chantra; Griesinger, Claudius; Guo, Jiabin; Hill, Erin; Roi, Annett Janusch; Kojima, Hajime; Li, Jin; Lim, Chae Hyung; Moura, Wlamir; Nishikawa, Akiyoshi; Park, HyeKyung; Peng, Shuangqing; Presgrave, Octavio; Singer, Tim; Sohn, Soo Jung; Westmoreland, Carl; Whelan, Maurice; Yang, Xingfen; Yang, Ying; Zuang, Valérie
The development and validation of scientific alternatives to animal testing is important not only from an ethical perspective (implementation of the 3Rs), but also to improve safety assessment decision making through the use of mechanistic information of higher relevance to humans. To be effective in these efforts, it is imperative that validation centres, industry, regulatory bodies, academia and other interested parties ensure strong international cooperation, cross-sector collaboration and intense communication in the design, execution, and peer review of validation studies. Such an approach is critical to achieving harmonized and more transparent approaches to method validation, peer review and recommendation, which will ultimately expedite the international acceptance of valid alternative methods or strategies by regulatory authorities and their implementation and use by stakeholders. It also allows greater efficiency and effectiveness to be achieved by avoiding duplication of effort and leveraging limited resources. With a view to achieving these goals, the International Cooperation on Alternative Test Methods (ICATM) was established in 2009 by validation centres from Europe, the USA, Canada and Japan. ICATM was later joined by Korea in 2011 and currently also includes Brazil and China as observers. This chapter describes the existing differences across world regions and the major efforts carried out to achieve consistent international cooperation and harmonization in the validation and adoption of alternative approaches to animal testing.
Validation of a Deterministic Vibroacoustic Response Prediction Model
NASA Technical Reports Server (NTRS)
Caimi, Raoul E.; Margasahayam, Ravi
1997-01-01
This report documents the recently completed effort to validate a deterministic theory for the random vibration problem of predicting the response of launch pad structures in the low-frequency range (0 to 50 hertz), where Statistical Energy Analysis (SEA) methods are not suitable. Measurements of launch-induced acoustic loads and the subsequent structural response were made on a cantilever beam structure placed in close proximity (200 feet) to the launch pad. Innovative ways of characterizing random, nonstationary, non-Gaussian acoustics are used for the development of the structure's excitation model. Extremely good correlation was obtained between analytically computed responses and those measured on the cantilever beam. Additional tests are recommended to bound the problem and account for variations in launch trajectory and inclination.
Optimal cooperative control synthesis of active displays
NASA Technical Reports Server (NTRS)
Garg, S.; Schmidt, D. K.
1985-01-01
A technique is developed that is intended to provide a systematic approach to synthesizing display augmentation for optimal manual control in complex, closed-loop tasks. A cooperative control synthesis technique, previously developed to design pilot-optimal control augmentation for the plant, is extended to incorporate the simultaneous design of performance enhancing displays. The technique utilizes an optimal control model of the man in the loop. It is applied to the design of a quickening control law for a display and a simple K/s^2 plant, and then to an F-15 type aircraft in a multi-channel task. Utilizing the closed loop modeling and analysis procedures, the results from the display design algorithm are evaluated and an analytical validation is performed. Experimental validation is recommended for future efforts.
Validation of a temperature prediction model for heat deaths in undocumented border crossers.
Ruttan, Tim; Stolz, Uwe; Jackson-Vance, Sara; Parks, Bruce; Keim, Samuel M
2013-04-01
Heat exposure is a leading cause of death in undocumented border crossers along the Arizona-Mexico border. We performed a validation study of a weather prediction model that predicts the probability of heat-related deaths among undocumented border crossers. We analyzed a medical examiner registry cohort of heat-related deaths of undocumented border crossers from January 1, 2002 to August 31, 2009 and used logistic regression to model the probability of one or more heat deaths on a given day using daily high temperature (DHT) as the predictor. At a critical threshold DHT of 40 °C, the probability of at least one heat death was 50%. The probability of a heat death along the Arizona-Mexico border for suspected undocumented border crossers is strongly associated with ambient temperature. These results can be used in prevention and response efforts to assess the daily risk of death among undocumented border crossers in the region.
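The sketch below illustrates the form of such a logistic model, with the probability of at least one heat death rising with daily high temperature and crossing 0.5 at the 40 °C threshold noted in the abstract; the coefficients are hypothetical and are not the fitted values from the medical examiner registry.

```python
import math

def heat_death_probability(daily_high_c, b0, b1):
    """Logistic model: P(at least one heat death on a day) = 1 / (1 + exp(-(b0 + b1*T)))."""
    return 1.0 / (1.0 + math.exp(-(b0 + b1 * daily_high_c)))

# Hypothetical coefficients chosen so that P = 0.5 exactly at 40 C
# (P = 0.5 when b0 + b1*T = 0, i.e. at T = -b0 / b1).
b1 = 0.9
b0 = -b1 * 40.0
for t in (35, 38, 40, 42, 45):
    print(t, round(heat_death_probability(t, b0, b1), 3))
```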
Turbine Internal and Film Cooling Modeling For 3D Navier-Stokes Codes
NASA Technical Reports Server (NTRS)
DeWitt, Kenneth; Garg, Vijay; Ameri, Ali
2005-01-01
The aim of this research project is to make use of NASA Glenn on-site computational facilities in order to develop, validate, and apply aerodynamic, heat transfer, and turbine cooling models for use in advanced 3D Navier-Stokes Computational Fluid Dynamics (CFD) codes such as the Glenn-HT code. Specific areas of effort include: application of the Glenn-HT code to specific configurations made available under the Turbine Based Combined Cycle (TBCC) and Ultra Efficient Engine Technology (UEET) projects; and validating the use of a multi-block code for the time-accurate computation of the detailed flow and heat transfer of cooled turbine airfoils. The goal of the current research is to improve the predictive ability of the Glenn-HT code. This will enable the design of more efficient turbine components for both aviation and power generation. The models will be tested against specific configurations provided by NASA Glenn.
Extension of HCDstruct for Transonic Aeroservoelastic Analysis of Unconventional Aircraft Concepts
NASA Technical Reports Server (NTRS)
Quinlan, Jesse R.; Gern, Frank H.
2017-01-01
A substantial effort has been made to implement an enhanced aerodynamic modeling capability in the Higher-fidelity Conceptual Design and structural optimization tool. This additional capability is needed for a rapid, physics-based method of modeling advanced aircraft concepts at risk of structural failure due to dynamic aeroelastic instabilities. To adequately predict these instabilities, in particular for transonic applications, a generalized aerodynamic matching algorithm was implemented to correct the doublet-lattice model available in Nastran using solution data from a priori computational fluid dynamics analysis. This new capability is demonstrated for two tube-and-wing aircraft configurations, including a Boeing 737-200 for implementation validation and the NASA D8 as a first use case. Results validate the current implementation of the aerodynamic matching utility and demonstrate the importance of using such a method for aircraft configurations featuring fuselage-wing aerodynamic interaction.
Unsteady Analysis of Separated Aerodynamic Flows Using an Unstructured Multigrid Algorithm
NASA Technical Reports Server (NTRS)
Pelaez, Juan; Mavriplis, Dimitri J.; Kandil, Osama
2001-01-01
An implicit method for the computation of unsteady flows on unstructured grids is presented. The resulting nonlinear system of equations is solved at each time step using an agglomeration multigrid procedure. The method allows for arbitrarily large time steps and is efficient in terms of computational effort and storage. Validation of the code using a one-equation turbulence model is performed for the well-known case of flow over a cylinder. A Detached Eddy Simulation model is also implemented and its performance compared to the one equation Spalart-Allmaras Reynolds Averaged Navier-Stokes (RANS) turbulence model. Validation cases using DES and RANS include flow over a sphere and flow over a NACA 0012 wing including massive stall regimes. The project was driven by the ultimate goal of computing separated flows of aerodynamic interest, such as massive stall or flows over complex non-streamlined geometries.
Rasti, Behnam; Heravi, Yeganeh Entezari
2018-06-01
Isoform diversity, critical physiological roles, and involvement in major diseases and disorders such as glaucoma, epilepsy, Alzheimer's disease, obesity, and cancers have made carbonic anhydrase (CA) one of the most interesting case studies in the field of computer-aided drug design. Since applying non-selective inhibitors can result in major side effects, there have been considerable efforts to achieve selective inhibitors for the different isoforms of CA. Using a proteochemometrics approach, the chemical interaction space governed by a group of 4-amino-substituted benzenesulfonamides and human CAs has been explored in the present study. Several validation methods have been utilized to assess the validity, robustness, and predictive power of the proposed proteochemometric model. Our model offers major structural information that can be applied to design new selective inhibitors for distinct isoforms of CA. To prove the applicability of the proposed model, new compounds have been designed based on the identified discriminative structural features.
The Work-Health-Check (WHC): a brief new tool for assessing psychosocial stress in the workplace.
Gadinger, M C; Schilling, O; Litaker, D; Fischer, J E
2012-01-01
Brief, psychometrically robust questionnaires assessing work-related psychosocial stressors are lacking. The purpose of the study is to evaluate the psychometric properties of a brief new questionnaire for assessing sources of work-related psychosocial stress. Participants were managers and blue- and white-collar workers (n = 628 at measurement point one, n = 459 at measurement point two) sampled from an online panel of a German marketing research institute. For ten scales, which are conceptually grounded in work stress models and reflect either work-related demands or resources, we either developed items or identified appropriate items from existing questionnaires. Factorial structure was evaluated by confirmatory factor analyses (CFA). Scale reliability was assessed by Cronbach's alpha and test-retest correlations; correlations with work-related effort demonstrated convergent and discriminant validity for the demand and resource scales, respectively. Scale correlations with health indicators tested criterion validity. All scales had satisfactory reliability (Cronbach's alpha: 0.74-0.93; retest reliabilities: 0.66-0.81). CFA supported the anticipated factorial structure. Significant correlations between job-related effort and the demand scales (mean r = 0.44) and non-significant correlations with the resource scales (mean r = 0.07) suggested good convergent and discriminant validity, respectively. Scale correlations with health indicators demonstrated good criterion validity. The WHC appears to be a brief, psychometrically robust instrument for assessing work-related psychosocial stressors.
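For reference, the sketch below shows the standard Cronbach's alpha computation used to judge scale reliability in studies such as this one; the item scores are invented and the scale is a hypothetical four-item demand scale, not a WHC scale.

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents x n_items) score matrix:
    alpha = k/(k-1) * (1 - sum of item variances / variance of total score)."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1.0 - item_var / total_var)

# Hypothetical responses of 6 people to a 4-item demand scale (1-5 Likert).
scores = [[4, 5, 4, 4], [2, 3, 2, 3], [5, 5, 4, 5],
          [3, 3, 3, 2], [4, 4, 5, 4], [1, 2, 2, 1]]
print(round(cronbach_alpha(scores), 2))
```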
Garrido-Hernansaiz, Helena; Martín-Fernández, Manuel; Castaño-Torrijos, Aida; Cuevas, Isabel
2018-01-01
Violence against non-heterosexual adolescents in educational contexts remains a worrying reality, but no adequate measure of attitudes toward affective-sexual diversity (AtASD) exists for Spanish adolescent students. We developed a 27-item scale including cognitive, affective, and behavioral aspects, which was completed by 696 secondary school students from the Madrid area. Factor analyses suggested a unidimensional model, Cronbach's alpha indicated excellent reliability of the scale scores, and item calibration under the item response theory framework showed that the scale is especially informative for homophobic attitudes. A hierarchical multiple regression analysis showed that variables traditionally related to AtASD (gender, age, religion, nationality, perceived parental/peer attitudes, direct contact with LGB people) were also related to AtASD in our sample. Moreover, interest in sexuality topics and the perceived efforts of the school to provide AtASD education were related to better AtASD. Our scale was reliable and valid, and it may also prove useful in efforts to detect students with homophobic attitudes and to guide interventions.
From bedside to bench and back again: research issues in animal models of human disease.
Tkacs, Nancy C; Thompson, Hilaire J
2006-07-01
To improve outcomes for patients with many serious clinical problems, multifactorial research approaches by nurse scientists, including the use of animal models, are necessary. Animal models serve as analogies for clinical problems seen in humans and must meet certain criteria, including validity and reliability, to be useful in moving research efforts forward. This article describes research considerations in the development of rodent models. As the standard of diabetes care evolves to emphasize intensive insulin therapy, rates of severe hypoglycemia are increasing among patients with type 1 and type 2 diabetes mellitus. A consequence of this change in clinical practice is an increase in rates of two hypoglycemia-related diabetes complications: hypoglycemia-associated autonomic failure (HAAF) and resulting hypoglycemia unawareness. Work on an animal model of HAAF is in an early developmental stage, with several labs reporting different approaches to model this complication of type 1 diabetes mellitus. This emerging model serves as an example illustrating how evaluation of validity and reliability is critically important at each stage of developing and testing animal models to support inquiry into human disease.
NASA Astrophysics Data System (ADS)
Moonen, P.; Gromke, C.; Dorer, V.
2013-08-01
The potential of a Large Eddy Simulation (LES) model to reliably predict near-field pollutant dispersion is assessed. To that end, detailed time-resolved numerical simulations of coupled flow and dispersion are conducted for a street canyon with tree planting. Different crown porosities are considered. The model performance is assessed in several steps, ranging from a qualitative comparison to measured concentrations, over statistical data analysis by means of scatter plots and box plots, up to the calculation of objective validation metrics. The extensive validation effort highlights and quantifies notable features and shortcomings of the model that would otherwise remain unnoticed. The model performance is found to be spatially non-uniform: closer agreement with measurement data is achieved near the canyon ends than in the central part of the canyon, and typical model acceptance criteria are satisfied more easily for the leeward than for the windward canyon wall. This demonstrates the need for rigorous model evaluation. Only quality-assured models can be used with confidence to support the assessment, planning and implementation of pollutant mitigation strategies.
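Objective validation metrics commonly used for near-field dispersion models include the fractional bias and the factor-of-two fraction (FAC2); the sketch below computes both for a handful of hypothetical normalized concentrations and is not tied to the study's specific acceptance criteria or data.

```python
import numpy as np

def fractional_bias(obs, pred):
    """FB = 2*(mean(obs) - mean(pred)) / (mean(obs) + mean(pred)); 0 is ideal."""
    return 2.0 * (obs.mean() - pred.mean()) / (obs.mean() + pred.mean())

def fac2(obs, pred):
    """Fraction of predictions within a factor of two of the observations."""
    ratio = pred / obs
    return np.mean((ratio >= 0.5) & (ratio <= 2.0))

# Hypothetical normalized concentrations at wall measurement positions.
obs = np.array([1.00, 0.60, 0.35, 0.20, 0.12])
pred = np.array([0.85, 0.70, 0.25, 0.22, 0.30])
print(round(fractional_bias(obs, pred), 2), round(fac2(obs, pred), 2))
```

Computing such metrics separately for the leeward and windward walls, or for the canyon ends versus its centre, is one way to expose the kind of spatially non-uniform performance the study reports.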
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sanfilippo, Antonio P.
2010-05-23
The increasing asymmetric nature of threats to the security, health and sustainable growth of our society requires that anticipatory reasoning become an everyday activity. Currently, the use of anticipatory reasoning is hindered by the lack of systematic methods for combining knowledge- and evidence-based models, integrating modeling algorithms, and assessing model validity, accuracy and utility. The workshop addresses these gaps with the intent of fostering the creation of a community of interest on model integration and evaluation that may serve as an aggregation point for existing efforts and a launch pad for new approaches.
Southern Regional Center for Lightweight Innovative Design
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, Paul T.
The Southern Regional Center for Lightweight Innovative Design (SRCLID) has developed an experimentally validated cradle-to-grave modeling and simulation effort to optimize automotive components in order to decrease weight and cost, yet increase performance and safety in crash scenarios. In summary, the three major objectives of this project were accomplished: to develop experimentally validated cradle-to-grave modeling and simulation tools to optimize automotive and truck components for lightweighting materials (aluminum, steel, and Mg alloys and polymer-based composites), with consideration of uncertainty, to decrease weight and cost yet increase performance and safety in impact scenarios; to develop multiscale computational models that quantify microstructure-property relations by evaluating various length scales, from the atomic through component levels, for each step of the manufacturing process for vehicles; and to develop an integrated K-12 educational program to educate students on lightweighting designs and impact scenarios. In this final report, the content is divided into two parts: the first part contains the development of the building blocks for the project, including materials and process models, process-structure-property (PSP) relationships, and experimental validation capabilities; the second part presents the demonstration task for Mg front-end work associated with USAMP projects.
NASA Technical Reports Server (NTRS)
Liever, Peter A.; West, Jeffrey S.
2016-01-01
A hybrid Computational Fluid Dynamics and Computational Aero-Acoustics (CFD/CAA) modeling framework has been developed for launch vehicle liftoff acoustic environment predictions. The framework couples the existing highly-scalable NASA production CFD code, Loci/CHEM, with a high-order accurate discontinuous Galerkin solver developed in the same production framework, Loci/THRUST, to accurately resolve and propagate acoustic physics across the entire launch environment. Time-accurate, Hybrid RANS/LES CFD modeling is applied for predicting the acoustic generation physics at the plume source, and a high-order accurate unstructured discontinuous Galerkin (DG) method is employed to propagate acoustic waves away from the source across large distances using high-order accurate schemes. The DG solver is capable of solving 2nd, 3rd, and 4th order Euler solutions for non-linear, conservative acoustic field propagation. Initial application testing and validation has been carried out against high resolution acoustic data from the Ares Scale Model Acoustic Test (ASMAT) series to evaluate the capabilities and production readiness of the CFD/CAA system to resolve the observed spectrum of acoustic frequency content. This paper presents results from this validation and outlines efforts to mature and improve the computational simulation framework.
A Stochastic Framework for Evaluating Seizure Prediction Algorithms Using Hidden Markov Models
Wong, Stephen; Gardner, Andrew B.; Krieger, Abba M.; Litt, Brian
2007-01-01
Responsive, implantable stimulation devices to treat epilepsy are now in clinical trials. New evidence suggests that these devices may be more effective when they deliver therapy before seizure onset. Despite years of effort, prospective seizure prediction, which could improve device performance, remains elusive. In large part, this is explained by a lack of agreement on a statistical framework for modeling seizure generation and a method for validating algorithm performance. We present a novel stochastic framework based on a three-state hidden Markov model (HMM) (representing interictal, preictal, and seizure states) with the feature that periods of increased seizure probability can transition back to the interictal state. This notion reflects clinical experience and may enhance interpretation of published seizure prediction studies. Our model accommodates clipped EEG segments and formalizes intuitive notions regarding statistical validation. We derive equations for type I and type II errors as a function of the number of seizures, the duration of interictal data, and the prediction horizon length, and we demonstrate the model's utility with a novel seizure detection algorithm that appeared to predict seizure onset. We propose this framework as a vital tool for designing and validating prediction algorithms and for facilitating collaborative research in this area. PMID:17021032
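A minimal numerical sketch of the three-state framework is given below: a hidden Markov model over interictal, preictal, and seizure states in which the preictal state can transition back to interictal, evaluated with the standard forward algorithm. The transition, emission, and observation values are hypothetical and are not taken from the paper.

```python
import numpy as np

# Three hidden states: 0 = interictal, 1 = preictal, 2 = seizure. Note the
# nonzero preictal -> interictal entry: a period of raised seizure probability
# may relax back to the interictal state, as the framework allows.
transition = np.array([[0.98, 0.02, 0.00],
                       [0.10, 0.85, 0.05],
                       [0.50, 0.00, 0.50]])
# Hypothetical per-state probabilities of observing (no alarm, alarm) from an EEG detector.
emission = np.array([[0.95, 0.05],    # interictal: alarms rare
                     [0.60, 0.40],    # preictal: alarms more likely
                     [0.10, 0.90]])   # seizure: alarms very likely
initial = np.array([1.0, 0.0, 0.0])

def forward(observations):
    """Forward algorithm: likelihood of an observation sequence under the HMM."""
    alpha = initial * emission[:, observations[0]]
    for obs in observations[1:]:
        alpha = (alpha @ transition) * emission[:, obs]
    return alpha.sum()

print(forward([0, 0, 1, 1, 1]))       # likelihood of a short no-alarm/alarm sequence
```

In such a framework, prediction performance can be scored by how often the inferred preictal state precedes true seizure states within the prediction horizon, which is what the type I and type II error expressions formalize.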
NASA Technical Reports Server (NTRS)
Bower, Chad; Padilla, Sebastian; Iacomini, Christie; Paul, Heather L.
2010-01-01
This paper details the validation of modeling methods for the three core components of a Metabolic heat regenerated Temperature Swing Adsorption (MTSA) subassembly developed for use in a Portable Life Support System (PLSS). The first core component in the subassembly is a sorbent bed used to capture and reject metabolically produced carbon dioxide (CO2). The sorbent bed performance can be augmented with a temperature swing driven by a liquid CO2 (LCO2) sublimation heat exchanger (SHX) for cooling the sorbent bed and a condensing, icing heat exchanger (CIHX) for warming the sorbent bed. As part of the overall MTSA effort, scaled design validation test articles for each of these three components have been independently tested in laboratory conditions. Previously described modeling methodologies developed for implementation in Thermal Desktop and SINDA/FLUINT are reviewed and updated, their application in test article models is outlined, and the results of those model correlations are relayed. The applicability of each modeling methodology to simulating the response of the test articles, and its extensibility to a full-scale integrated subassembly model, is assessed. The independently verified and validated modeling methods are then applied to the development of an MTSA subassembly prototype model, and predictions of the subassembly performance are given. These models and modeling methodologies capture several challenging and novel physical phenomena in the Thermal Desktop and SINDA/FLUINT software suite. Novel methodologies include CO2 adsorption front tracking and the associated thermal response in the sorbent bed, heat transfer associated with sublimation of entrained solid CO2 in the SHX, and water mass transfer in the form of ice at temperatures as low as 210 K in the CIHX.
Advanced 3D Characterization and Reconstruction of Reactor Materials FY16 Final Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fromm, Bradley; Hauch, Benjamin; Sridharan, Kumar
2016-12-01
A coordinated effort to link advanced materials characterization methods and computational modeling approaches is critical to future success for understanding and predicting the behavior of reactor materials that operate at extreme conditions. The difficulty and expense of working with nuclear materials have inhibited the use of modern characterization techniques on this class of materials. Likewise, mesoscale simulation efforts have been impeded due to insufficient experimental data necessary for initialization and validation of the computer models. The objective of this research is to develop methods to integrate advanced materials characterization techniques developed for reactor materials with state-of-the-art mesoscale modeling and simulation tools. Research to develop broad-ion beam sample preparation, high-resolution electron backscatter diffraction, and digital microstructure reconstruction techniques, and methods for integration of these techniques into mesoscale modeling tools, are detailed. Results for both irradiated and un-irradiated reactor materials are presented for FY14 - FY16 and final remarks are provided.
The Integrated Airframe/Propulsion Control System Architecture program (IAPSA)
NASA Technical Reports Server (NTRS)
Palumbo, Daniel L.; Cohen, Gerald C.; Meissner, Charles W.
1990-01-01
The Integrated Airframe/Propulsion Control System Architecture program (IAPSA) is a two-phase program which was initiated by NASA in the early 80s. The first phase, IAPSA 1, studied different architectural approaches to the problem of integrating engine control systems with airframe control systems in an advanced tactical fighter. One of the conclusions of IAPSA 1 was that the technology to construct a suitable system was available, yet the ability to create these complex computer architectures has outpaced the ability to analyze the resulting system's performance. With this in mind, the second phase of IAPSA approached the same problem with the added constraint that the system be designed for validation. The intent of the design for validation requirement is that validation requirements should be shown to be achievable early in the design process. IAPSA 2 has demonstrated that despite diligent efforts, integrated systems can retain characteristics which are difficult to model and, therefore, difficult to validate.
The Application of Modeling and Simulation in Capacity Management within the ITIL Framework
NASA Technical Reports Server (NTRS)
Rahmani, Sonya; vonderHoff, Otto
2010-01-01
Tightly integrating modeling and simulation techniques into Information Technology Infrastructure Library (ITIL) practices can be one of the driving factors behind a successful and cost-effective capacity management effort for any Information Technology (IT) system. ITIL is a best practices framework for managing IT infrastructure, development and operations. Translating ITIL theory into operational reality can be a challenge. This paper aims to highlight how to best integrate modeling and simulation into an ITIL implementation. For cases where the project team initially has difficulty gaining consensus on investing in modeling and simulation resources, a clear definition of M&S implementation within the ITIL framework, specifically its role in supporting Capacity Management, is critical to gaining the support required to garner these resources. This implementation should also help to clearly define M&S support to the overall system mission. This paper will describe the development of an integrated modeling approach and how best to tie M&S to definitive goals for evaluating system capacity and performance requirements. Specifically, the paper will discuss best practices for implementing modeling and simulation into ITIL. These practices hinge on implementing integrated M&S methods that 1) encompass at least two or more predictive modeling techniques, 2) complement each one's respective strengths and weaknesses to support the validation of predicted results, and 3) are tied to the system's performance and workload monitoring efforts. The structuring of two forms of modeling, statistical and simulation, in the development of "As Is" and "To Be" efforts will be used to exemplify the integrated M&S methods. The paper will show how these methods can better support the project's overall capacity management efforts.
Perception that "everything requires a lot of effort": transcultural SCL-25 item validation.
Moreau, Nicolas; Hassan, Ghayda; Rousseau, Cécile; Chenguiti, Khalid
2009-09-01
This brief report illustrates how the migration context can affect specific item validity of mental health measures. The SCL-25 was administered to 432 recently settled immigrants (220 Haitian and 212 Arabs). We performed descriptive analyses, as well as Infit and Outfit statistics analyses using WINSTEPS Rasch Measurement Software based on Item Response Theory. The participants' comments about the item You feel everything requires a lot of effort in the SCL-25 were also qualitatively analyzed. Results revealed that the item You feel everything requires a lot of effort is an outlier and does not adjust in an expected and valid fashion with its cluster items, as it is over-endorsed by Haitian and Arab healthy participants. Our study thus shows that, in transcultural mental health research, the cultural and migratory contexts may interact and significantly influence the meaning of some symptom items and consequently, the validity of symptom scales.
NASA Astrophysics Data System (ADS)
Chan, V. S.; Wong, C. P. C.; McLean, A. G.; Luo, G. N.; Wirth, B. D.
2013-10-01
The Xolotl code under development by PSI-SciDAC will enhance predictive modeling capability of plasma-facing materials under burning plasma conditions. The availability and application of experimental data to compare to code-calculated observables are key requirements to validate the breadth and content of physics included in the model and ultimately gain confidence in its results. A dedicated effort has been in progress to collect and organize a) a database of relevant experiments and their publications as previously carried out at sample exposure facilities in US and Asian tokamaks (e.g., DIII-D DiMES, and EAST MAPES), b) diagnostic and surface analysis capabilities available at each device, and c) requirements for future experiments with code validation in mind. The content of this evolving database will serve as a significant resource for the plasma-material interaction (PMI) community. Work supported in part by the US Department of Energy under GA-DE-SC0008698, DE-AC52-07NA27344 and DE-AC05-00OR22725.
NASA Technical Reports Server (NTRS)
Hardman, R. R.; Mahan, J. R.; Smith, M. H.; Gelhausen, P. A.; Van Dalsem, W. R.
1991-01-01
The need for a validation technique for computational fluid dynamics (CFD) codes in STOVL applications has led to research efforts to apply infrared thermal imaging techniques to visualize gaseous flow fields. Specifically, a heated, free-jet test facility was constructed. The gaseous flow field of the jet exhaust was characterized using an infrared imaging technique in the 2 to 5.6 micron wavelength band as well as conventional pitot tube and thermocouple methods. These infrared images are compared to computer-generated images using the equations of radiative exchange based on the temperature distribution in the jet exhaust measured with the thermocouple traverses. Temperature and velocity measurement techniques, infrared imaging, and the computer model of the infrared imaging technique are presented and discussed. From the study, it is concluded that infrared imaging techniques coupled with the radiative exchange equations applied to CFD models are a valid method to qualitatively verify CFD codes used in STOVL applications.
SeaQuaKE: Sea-optimized Quantum Key Exchange
2014-06-01
The SeaQuaKE effort is led by Applied Communications Sciences under the ONR Free Space Optical Quantum Key Distribution Special Notice (13-SN-0004 under ONRBAA13-001). In addition, we discuss our initial progress towards the free-space quantum channel model and planning for the experimental validation effort. Subject terms: quantum communications, free-space optical communications.
Characterization of a Robotic Manipulator for Dynamic Wind Tunnel Applications
2015-03-26
Further enhancements would need to be performed individually for each joint. This research effort focused on the improvement of the MTA wrist roll joint. An Inertial Measurement Unit (IMU) was used to validate the Euler angle output calculated by the MTA Computer using forward kinematics. Recoverable section headings from the truncated record include "Modeling the Wrist Roll Motor and Controller" and "Proportional Control for Improved Performance".
Integrated Efforts for Analysis of Geophysical Measurements and Models.
1997-09-26
This contract supported investigations of integrated applications of physics and ephemerides for the analysis of geophysical measurements and models. Recoverable section headings from the truncated record include "...Regions and GPS Data Validations" and "PL-SCINDA: Visualization and Analysis Techniques" (view controls, map selection). The report also describes the use of visible and IR data on cloudy pixels, where clustering and maximum likelihood classification algorithms categorize up to four cloud layers into stratiform or ... (record truncated).
NASA Technical Reports Server (NTRS)
Sweeney, Christopher; Bunnell, John; Chung, William; Giovannetti, Dean; Mikula, Julie; Nicholson, Bob; Roscoe, Mike
2001-01-01
Joint Shipboard Helicopter Integration Process (JSHIP) is a Joint Test and Evaluation (JT&E) program sponsored by the Office of the Secretary of Defense (OSD). Under the JSHIP program is a simulation effort referred to as the Dynamic Interface Modeling and Simulation System (DIMSS). The purpose of DIMSS is to develop and test the processes and mechanisms that facilitate ship-helicopter interface testing via man-in-the-loop ground-based flight simulators. Specifically, the DIMSS charter is to develop an accredited process for using a flight simulator to determine the wind-over-the-deck (WOD) launch and recovery flight envelope for the UH-60A ship/helicopter combination. DIMSS is a collaborative effort between the NASA Ames Research Center and OSD. OSD determines the T&E and warfighter training requirements, provides the programmatics and dynamic interface T&E experience, and conducts ship/aircraft interface tests for validating the simulation. NASA provides the research and development element, simulation facility, and simulation technical experience. This paper will highlight the benefits of the NASA/JSHIP collaboration and detail achievements of the project in terms of modeling and simulation. The Vertical Motion Simulator (VMS) at NASA Ames Research Center offers the capability to simulate a wide range of cueing configurations, which include visual, aural, and body-force cueing devices. The system flexibility enables switching configurations to allow back-to-back evaluation and comparison of different levels of cueing fidelity in determining minimum training requirements. The investigation required development and integration of several major simulation systems at the VMS. A new UH-60A Black Hawk interchangeable cab that provides an out-the-window (OTW) field-of-view (FOV) of 220 degrees in azimuth and 70 degrees in elevation was built. Modeling efforts involved integrating Computational Fluid Dynamics (CFD) generated data of an LHA ship airwake and integrating a real-time ship motion model developed from a batch model from the Naval Surface Warfare Center. Engineering development and integration of a three degrees-of-freedom (DOF) dynamic seat to simulate high-frequency, rotor-dynamics-dependent motion cues for use in conjunction with the large motion system was accomplished. An LHA visual model at several different levels of resolution was developed, along with an aural cueing system in which three separate fidelity levels could be selected. VMS also integrated a PC-based E&S simFUSION system to investigate cost-effective IG alternatives. The DIMSS project consists of three phases that follow an approved Verification, Validation and Accreditation (VV&A) process. The first phase will support the accreditation of the individual subsystems and models. The second will follow the verification and validation of the integrated subsystems and models, and will address fidelity requirements of the integrated models and subsystems. The third and final phase will allow the verification and validation of the full system integration. This VV&A process will address the utility of the simulated WOD launch and recovery envelope. Simulations supporting the first two stages have been completed, and the data are currently being reviewed and analyzed.
Uncertainty in Earth System Models: Benchmarks for Ocean Model Performance and Validation
NASA Astrophysics Data System (ADS)
Ogunro, O. O.; Elliott, S.; Collier, N.; Wingenter, O. W.; Deal, C.; Fu, W.; Hoffman, F. M.
2017-12-01
The mean ocean CO2 sink is a major component of the global carbon budget, with marine reservoirs holding about fifty times more carbon than the atmosphere. Phytoplankton play a significant role in the net carbon sink through photosynthesis and drawdown, such that about a quarter of anthropogenic CO2 emissions end up in the ocean. Biology greatly increases the efficiency of marine environments in CO2 uptake and ultimately reduces the impact of the persistent rise in atmospheric concentrations. However, a number of challenges remain in appropriate representation of marine biogeochemical processes in Earth System Models (ESM). These threaten to undermine the community effort to quantify seasonal to multidecadal variability in ocean uptake of atmospheric CO2. In a bid to improve analyses of marine contributions to climate-carbon cycle feedbacks, we have developed new analysis methods and biogeochemistry metrics as part of the International Ocean Model Benchmarking (IOMB) effort. Our intent is to meet the growing diagnostic and benchmarking needs of ocean biogeochemistry models. The resulting software package has been employed to validate DOE ocean biogeochemistry results by comparison with observational datasets. Several other international ocean models contributing results to the fifth phase of the Coupled Model Intercomparison Project (CMIP5) were analyzed simultaneously. Our comparisons suggest that the biogeochemical processes determining CO2 entry into the global ocean are not well represented in most ESMs. Polar regions continue to show notable biases in many critical biogeochemical and physical oceanographic variables. Some of these disparities could have first order impacts on the conversion of atmospheric CO2 to organic carbon. In addition, single forcing simulations show that the current ocean state can be partly explained by the uptake of anthropogenic emissions. Combined effects of two or more of these forcings on ocean biogeochemical cycles and ecosystems are challenging to predict since additive or antagonistic effects may occur. A benchmarking tool for accurate assessment and validation of marine biogeochemical outputs will be indispensable as the model community continues to improve ESM developments. It will provide a first order tool in understanding climate-carbon cycle feedbacks.
Yadegarfar, Ghasem; Alinia, Tahereh; Hosseini, Reihane; Hassannejad, Razieh; Fayaz, Mahsa; Sanati, Javad; Sanati, Kave; Harandi, Jalal; Hajnoorozali, Vahid; Baghi, Mahmood-Reza; Mirzavand, Enayat; Majeed, Azeem
2013-02-01
To assess the reliability and validity of the Farsi version of the effort-reward imbalance questionnaire (F-ERIQ) and to examine the responsiveness of the tool to changes over time. A longitudinal study was carried out among 227 male employees of Iran Polyacryl Corporation. The F-ERIQ was developed through a forward-backward translation process and includes three scales of effort, reward and over-commitment (OC). Reliability and internal consistency of the F-ERIQ were assessed by split-half and Cronbach's alpha coefficients. Confirmatory factor analysis, convergent validity and discriminant validity were used to evaluate construct validity. Depressive mood was used as an indicator for exploring criterion validity. The variations in mean scores over time for the scales were regarded as measures of responsiveness to changes. Baseline split-half correlations for effort, reward and OC were 0.53, 0.85 and 0.65, respectively; Cronbach's alpha coefficients improved from 0.61 to 0.70 for effort, 0.85 to 0.88 for reward and 0.67 to 0.72 for OC. All item-total correlations were higher than 0.23 and item-scale correlations were higher than 0.4. Although values of the Goodness of Fit Index and Adjusted GFI were higher than 0.9, and the Root Mean Square Error of Approximation, Root Mean Square Residual and Standardized RMR were lower than 0.05, confirmatory factor analysis only confirmed the constructs of effort and OC. People with higher job stress were at higher risk of depressive mood (at least 3 times higher). Overall, mean scores for effort, OC and ERI increased, and scores for reward decreased, among people who experienced changes. These findings provide evidence that the F-ERIQ is a reliable and valid instrument for assessing psychosocial stress at work among Farsi-speaking male employees. We propose that the F-ERIQ be further evaluated across a variety of jobs and industries.
Program test objectives milestone 3. [Integrated Propulsion Technology Demonstrator
NASA Technical Reports Server (NTRS)
Gaynor, T. L.
1994-01-01
The following conclusions have been developed relative to propulsion system technology adequacy for efficient development and operation of recoverable and expendable launch vehicles (RLV and ELV) and the benefits which the integrated propulsion technology demonstrator will provide for enhancing technology: (1) Technology improvements relative to propulsion system design and operation can reduce program cost. Many features or improvement needs to enhance operability, reduce cost, and improve payload are identified. (2) The Integrated Propulsion Technology Demonstrator (IPTD) Program provides a means of resolving the majority of issues associated with improvement needs. (3) The IPTD will evaluate complex integration of vehicle and facility functions in fluid management and propulsion control systems, and provides an environment for validating improved mechanical and electrical components. (4) The IPTD provides a mechanism for investigating operational issues focusing on reducing manpower and time to perform various functions at the launch site. These efforts include model development, collection of data to validate subject models, and ultimate development of complex time line models. (5) The IPTD provides an engine test bed for tri/bi-propellant engine development firings which is representative of the actual vehicle environment. (6) The IPTD provides for only a limited multiengine configuration integration environment for RLV. Multiengine efforts may be simulated for a number of subsystems and a number of subsystems are relatively independent of the multiengine influences.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Miao, Yinbin; Mo, Kun; Jamison, Laura M.
This activity is supported by the US Nuclear Energy Advanced Modeling and Simulation (NEAMS) Fuels Product Line (FPL) and aims at providing experimental data for the validation of the mesoscale simulation code MARMOT. MARMOT is a mesoscale multiphysics code that predicts the coevolution of microstructure and properties within reactor fuel during its lifetime in the reactor. It is an important component of the Moose-Bison-Marmot (MBM) code suite that has been developed by Idaho National Laboratory (INL) to enable next generation fuel performance modeling capability as part of the NEAMS Program FPL. In order to ensure the accuracy of the microstructure-based materials models being developed within the MARMOT code, extensive validation efforts must be carried out. In this report, we summarize the experimental efforts in FY16, including the following important experiments: (1) in-situ grain growth measurement of nano-grained UO2; (2) investigation of surface morphology in micro-grained UO2; (3) nano-indentation experiments on nano- and micro-grained UO2. The highlight of this year is that we successfully demonstrated the capability to measure grain size development in situ while maintaining the stoichiometry of nano-grained UO2 materials; the experiment is, for the first time, using synchrotron X-ray diffraction to measure the grain growth behavior of UO2 in situ.
NASA Astrophysics Data System (ADS)
Kaneko, Daijiro
2013-10-01
The author regards fundamental root functions as underpinning photosynthesis activity by vegetation and as affecting environmental issues, grain production, and desertification. This paper describes the present development of monitoring and near real-time forecasting for environmental projects and crop production, approaching established operational monitoring step by step. The author has been developing a thematic monitoring structure (named the RSEM system) that stands on satellite-based photosynthesis models over several continents for operational support in the environmental fields mentioned above. Validation methods rely not on FLUXNET but on carbon partitioning validation (CPV). The models demand continuing parameterization. The entire frame system has been built using Reanalysis meteorological data, but model accuracy remains insufficient except for paddy rice. The author aims to complete the system so that it incorporates global environmental forces. Regarding crop production applications, industrialization in developing countries achieved through direct investment by economically developed nations raises their income, resulting in increased food demand. Last year, China began to import rice, as it had in the past with maize, wheat, and soybeans. Important agro-potential countries make efforts to cultivate new crop lands in South America, Africa, and Eastern Europe. Trends toward less food sustainability and stability are continuing, exacerbated by rapid social and climate changes. Operational monitoring of carbon sequestration by herbaceous and bore plants converges with efforts at bio-energy, crop production monitoring, and socio-environmental projects such as CDM A/R, combating desertification, and bio-diversity.
Mulder, Leontine; van der Molen, Renate; Koelman, Carin; van Leeuwen, Ester; Roos, Anja; Damoiseaux, Jan
2018-05-01
ISO 15189:2012 requires validation of methods used in the medical laboratory and lists a series of performance parameters to consider for inclusion. Although these performance parameters are feasible for clinical chemistry analytes, application in the validation of autoimmunity tests is a challenge. Lack of gold standards or reference methods, in combination with the scarcity of well-defined diagnostic samples of patients with rare diseases, makes validation of new assays difficult. The present manuscript describes the initiative of Dutch medical immunology laboratory specialists to combine efforts and perform multi-center validation studies of new assays in the field of autoimmunity. Validation data and reports are made available to interested Dutch laboratory specialists. Copyright © 2018 Elsevier B.V. All rights reserved.
Development and Validation of the Primary Care Team Dynamics Survey
Song, Hummy; Chien, Alyna T; Fisher, Josephine; Martin, Julia; Peters, Antoinette S; Hacker, Karen; Rosenthal, Meredith B; Singer, Sara J
2015-01-01
Objective: To develop and validate a survey instrument designed to measure team dynamics in primary care. Data Sources/Study Setting: We studied 1,080 physician and nonphysician health care professionals working at 18 primary care practices participating in a learning collaborative aimed at improving team-based care. Study Design: We developed a conceptual model and administered a cross-sectional survey addressing team dynamics, and we assessed reliability and discriminant validity of survey factors and the overall survey's goodness-of-fit using structural equation modeling. Data Collection: We administered the survey between September 2012 and March 2013. Principal Findings: Overall response rate was 68 percent (732 respondents). Results support a seven-factor model of team dynamics, suggesting that conditions for team effectiveness, shared understanding, and three supportive processes are associated with acting and feeling like a team and, in turn, perceived team effectiveness. This model demonstrated adequate fit (goodness-of-fit index: 0.91), scale reliability (Cronbach's alphas: 0.71–0.91), and discriminant validity (average factor correlations: 0.49). Conclusions: It is possible to measure primary care team dynamics reliably using a 29-item survey. This survey may be used in ambulatory settings to study teamwork and explore the effect of efforts to improve team-based care. Future studies should demonstrate the importance of team dynamics for markers of team effectiveness (e.g., work satisfaction, care quality, clinical outcomes). PMID:25423886
Development and validation of the primary care team dynamics survey.
Song, Hummy; Chien, Alyna T; Fisher, Josephine; Martin, Julia; Peters, Antoinette S; Hacker, Karen; Rosenthal, Meredith B; Singer, Sara J
2015-06-01
To develop and validate a survey instrument designed to measure team dynamics in primary care. We studied 1,080 physician and nonphysician health care professionals working at 18 primary care practices participating in a learning collaborative aimed at improving team-based care. We developed a conceptual model and administered a cross-sectional survey addressing team dynamics, and we assessed reliability and discriminant validity of survey factors and the overall survey's goodness-of-fit using structural equation modeling. We administered the survey between September 2012 and March 2013. Overall response rate was 68 percent (732 respondents). Results support a seven-factor model of team dynamics, suggesting that conditions for team effectiveness, shared understanding, and three supportive processes are associated with acting and feeling like a team and, in turn, perceived team effectiveness. This model demonstrated adequate fit (goodness-of-fit index: 0.91), scale reliability (Cronbach's alphas: 0.71-0.91), and discriminant validity (average factor correlations: 0.49). It is possible to measure primary care team dynamics reliably using a 29-item survey. This survey may be used in ambulatory settings to study teamwork and explore the effect of efforts to improve team-based care. Future studies should demonstrate the importance of team dynamics for markers of team effectiveness (e.g., work satisfaction, care quality, clinical outcomes). © Health Research and Educational Trust.
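Scale reliability of the kind reported here (Cronbach's alphas of 0.71-0.91) is straightforward to compute from raw item responses. The following Python sketch shows the standard Cronbach's alpha calculation on a hypothetical Likert-item matrix; the simulated data are illustrative only and unrelated to the survey itself.

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents x n_items) response matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()      # sum of per-item variances
    total_var = items.sum(axis=1).var(ddof=1)        # variance of total scores
    return (k / (k - 1)) * (1 - item_vars / total_var)

# Hypothetical 5-point Likert responses for one survey factor (4 items, 200 respondents).
rng = np.random.default_rng(3)
latent = rng.normal(size=(200, 1))                    # shared "team dynamics" signal
responses = np.clip(np.rint(3 + latent + rng.normal(0, 0.7, size=(200, 4))), 1, 5)
print("alpha = %.2f" % cronbach_alpha(responses))
```

In practice, each of the survey's seven factors would be scored separately this way before the structural equation model is fitted.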
NASA Astrophysics Data System (ADS)
Huang, Ying; Bevans, W. J.; Xiao, Hai; Zhou, Zhi; Chen, Genda
2012-04-01
During or after an earthquake event, building systems often experience large strains due to shaking effects, as observed during recent earthquakes, causing permanent inelastic deformation. In addition to the inelastic deformation induced by the earthquake itself, post-earthquake fires associated with short circuits in electrical systems and leakage from gas devices can further strain structures already damaged during the earthquake, potentially leading to progressive collapse of buildings. Under these harsh environments, measurements on the affected building by various sensors can provide only limited structural health information. Finite element model analysis, on the other hand, if validated by predesigned experiments, can provide detailed structural behavior information for the entire structure. In this paper, a temperature-dependent nonlinear 3-D finite element model (FEM) of a one-story steel frame is set up in ABAQUS based on the cited material properties of steel from EN 1993-1.2 and the AISC manuals. The FEM is validated by testing the modeled steel frame in simulated post-earthquake environments. Comparisons between the FEM analysis and the experimental results show that the FEM predicts the structural behavior of the steel frame in post-earthquake fire conditions reasonably well. With experimental validation, FEM analysis could be used to continuously predict the behavior of critical structures in these harsh environments, better assisting firefighters in their rescue efforts and helping to save fire victims.
Validation of The Scenarios Designed For The Eu Registration of Pesticides
NASA Astrophysics Data System (ADS)
Piñeros Garcet, J. D.; de Nie, D.; Vanclooster, M.; Tiktak, A.; Klein, M.
As part of recent efforts to harmonise registration procedures for pesticides within the EU, a set of uniform principles were developed, setting out the detailed evaluation and decision making criteria for pesticide registration. The EU directive 91/414/EEC places great importance on the use of validated models to calculate Predicted Environmental Concentrations (PECs), as a basis for assessing the environmental risks and health effects. To be used in a harmonised registration process, the quality of PEC modelling needs to be assured. Quality assurance of mathematical modelling implies, amongst others, the validation of the environmental modelling scenarios. The FOrum for the CO-ordination of pesticide fate models and their USe (FOCUS) is the current platform where common modelling methodologies are designed and submitted for approval to the European authorities. In 2000, the FOCUS groundwater scenarios working group defined the procedures for realising tier 1 PEC groundwater calculations for the active substances of plant protection products at the pan-European level. The procedures and guidelines were approved by the Standing Committee on Plant Health, and are now recommended for tier 1 PEC groundwater calculations in the registration dossier. Yet, the working group also identified a range of uncertainties related to the validity of the present leaching scenarios. To mitigate some of these problems, the EU R&D project APECOP was designed and approved for support in the framework of the EU-FP5 Quality of Life Programme. One of the objectives of the project is to evaluate the appropriateness of the current Tier 1 groundwater scenarios. In this paper, we summarise the methodology and results of the scenarios validation.
Validation of The Scenarios Designed For The Eu Registration of Pesticides
NASA Astrophysics Data System (ADS)
Piñeros Garcet, J. D.; de Nie, D.; Vanclooster, M.; Tiktak, A.; Klein, M.; Jones, A.
As part of recent efforts to harmonise registration procedures for pesticides within the EU, a set of uniform principles were developed, setting out the detailed evaluation and decision making criteria for pesticide registration. The EU directive 91/414/EEC places great importance on the use of validated models to calculate Predicted Environmental Concentrations (PECs), as a basis for assessing the environmental risks and health effects. To be used in a harmonised registration process, the quality of PEC modelling needs to be assured. Quality assurance of mathematical modelling implies, amongst others, the validation of the environmental modelling scenarios. The FOrum for the CO-ordination of pesticide fate models and their USe (FOCUS) is the current platform where common modelling methodologies are designed and submitted for approval to the European authorities. In 2000, the FOCUS groundwater scenarios working group defined the procedures for realising tier 1 PEC groundwater calculations for the active substances of plant protection products at the pan-European level. The procedures and guidelines were approved by the Standing Committee on Plant Health, and are now recommended for tier 1 PEC groundwater calculations in the registration dossier. Yet, the working group also identified a range of uncertainties related to the validity of the present leaching scenarios. To mitigate some of these problems, the EU R&D project APECOP was designed and approved for support in the framework of the EU-FP5 Quality of Life Programme. One of the objectives of the project is to evaluate the appropriateness of the current Tier 1 groundwater scenarios. In this paper, we summarise the methodology and results of the scenarios validation.
Containment Sodium Chemistry Models in MELCOR.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Louie, David; Humphries, Larry L.; Denman, Matthew R
To meet regulatory needs for sodium fast reactors' future development, including licensing requirements, Sandia National Laboratories is modernizing MELCOR, a severe accident analysis computer code developed for the U.S. Nuclear Regulatory Commission (NRC). Specifically, Sandia is modernizing MELCOR to include the capability to model sodium reactors. However, Sandia's modernization effort primarily focuses on the containment response aspects of sodium reactor accidents. Sandia began modernizing MELCOR in 2013 to allow a sodium coolant rather than the water of conventional light water reactors. In the past three years, Sandia has been implementing the sodium chemistry containment models in CONTAIN-LMR, a legacy NRC code, into MELCOR. These chemistry models include spray fire, pool fire and atmosphere chemistry models. Only the first two chemistry models have been implemented, though it is intended to implement all of these models in MELCOR. A new package called "NAC" has been created to manage the sodium chemistry models more efficiently. In 2017, Sandia began validating the implemented models in MELCOR by simulating available experiments. The CONTAIN-LMR sodium models include sodium atmosphere chemistry and sodium-concrete interaction models. This paper presents sodium property models, the implemented models, implementation issues, and a path towards validation against existing experimental data.
CFD Code Validation of Wall Heat Fluxes for a GO2/GH2 Single Element Combustor
NASA Technical Reports Server (NTRS)
Lin, Jeff; West, Jeff S.; Williams, Robert W.; Tucker, P. Kevin
2005-01-01
This paper puts forth the case for the need for improved injector design tools to meet NASA's Vision for Space Exploration goals. Requirements for this improved tool are outlined and discussed. The potential for Computational Fluid Dynamics (CFD) to meet these requirements is noted along with its current shortcomings, especially relative to demonstrated solution accuracy. The concept of verification and validation is introduced as the primary process for building and quantifying the confidence necessary for CFD to be useful as an injector design tool. The verification and validation process is considered in the context of the Marshall Space Flight Center (MSFC) Combustion Devices CFD Simulation Capability Roadmap via the Simulation Readiness Level (SRL) concept. The portion of the validation process which demonstrates the ability of a CFD code to simulate heat fluxes to a rocket engine combustor wall is the focus of the current effort. The FDNS and Loci-CHEM codes are used to simulate a shear coaxial single element GO2/GH2 injector experiment. The experiment was conducted at a chamber pressure of 750 psia using hot propellants from preburners. A measured wall temperature profile is used as a boundary condition to facilitate the calculations. Converged solutions, obtained from both codes by using wall functions with the k-epsilon turbulence model and by integrating to the wall using Menter's baseline turbulence model, are compared to the experimental data. The initial solutions from both codes revealed significant issues with the wall function implementation associated with the recirculation zone between the shear coaxial jet and the chamber wall. The FDNS solution with a corrected implementation shows marked improvement in overall character and level of comparison to the data. With the FDNS code, integrating to the wall with Menter's baseline turbulence model actually produced a degraded solution when compared to the wall function solution with the k-epsilon model. The Loci-CHEM solution, produced by integrating to the wall with Menter's baseline turbulence model, matches both the heat flux rise rate in the near-injector region and the peak heat flux level very well. However, it moderately overpredicts the heat fluxes downstream of the reattachment point. The Loci-CHEM solution achieved by integrating to the wall with Menter's baseline turbulence model was clearly superior to the other solutions produced in this effort.
Information system end-user satisfaction and continuance intention: A unified modeling approach.
Hadji, Brahim; Degoulet, Patrice
2016-06-01
Permanent evaluation of end-user satisfaction and continuance intention is a critical issue at each phase of a clinical information system (CIS) project, but most validation studies are concerned with the pre- or early post-adoption phases. The purpose of this study was twofold: to validate at the Pompidou University Hospital (HEGP) an information technology late post-adoption model built from four validated models and to propose a unified metamodel of evaluation that could be adapted to each context or deployment phase of a CIS project. Five dimensions, i.e., CIS quality (CISQ), perceived usefulness (PU), confirmation of expectations (CE), user satisfaction (SAT), and continuance intention (CI), were selected to constitute the CI evaluation model. The validity of the model was tested using the combined answers to four surveys performed between 2011 and 2015, i.e., more than ten years after the opening of HEGP in July 2000. Structural equation modeling was used to test the eight model-associated hypotheses. The multi-professional study group of 571 responders consisted of 158 doctors, 282 nurses, and 131 secretaries. The evaluation model accounted for 84% of the variance in satisfaction and 53% of the CI variance for the period 2011-2015, and for 92% and 69% for the period 2014-2015. In very late post-adoption, CISQ appears to be the major determinant of satisfaction and CI. Combining the results obtained at various phases of CIS deployment, a Unified Model of Information System Continuance (UMISC) is proposed. In a meaningful CIS use situation at HEGP, this study confirms the importance of CISQ in explaining satisfaction and CI. The proposed UMISC model that can be adapted to each phase of CIS deployment could facilitate the necessary efforts of permanent CIS acceptance and continuance evaluation. Copyright © 2016 Elsevier Inc. All rights reserved.
Three Dimensional Vapor Intrusion Modeling: Model Validation and Uncertainty Analysis
NASA Astrophysics Data System (ADS)
Akbariyeh, S.; Patterson, B.; Rakoczy, A.; Li, Y.
2013-12-01
Volatile organic chemicals (VOCs), such as chlorinated solvents and petroleum hydrocarbons, are prevalent groundwater contaminants due to their improper disposal and accidental spillage. In addition to contaminating groundwater, VOCs may partition into the overlying vadose zone and enter buildings through gaps and cracks in foundation slabs or basement walls, a process termed vapor intrusion. Vapor intrusion of VOCs has been recognized as a detrimental source for human exposures to potential carcinogenic or toxic compounds. The simulation of vapor intrusion from a subsurface source has been the focus of many studies to better understand the process and guide field investigation. While multiple analytical and numerical models were developed to simulate the vapor intrusion process, detailed validation of these models against well controlled experiments is still lacking, due to the complexity and uncertainties associated with site characterization and soil gas flux and indoor air concentration measurement. In this work, we present an effort to validate a three-dimensional vapor intrusion model based on a well-controlled experimental quantification of the vapor intrusion pathways into a slab-on-ground building under varying environmental conditions. Finally, a probabilistic approach based on Monte Carlo simulations is implemented to determine the probability distribution of indoor air concentration based on the most uncertain input parameters.
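The probabilistic step described above can be illustrated with a very small Monte Carlo sketch. The Python code below propagates two uncertain inputs through a simple screening-level attenuation-factor relationship rather than the authors' three-dimensional transport model; the distributions and parameter values are hypothetical placeholders.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

# Illustrative lognormal distributions for the most uncertain inputs (hypothetical values).
source_conc = rng.lognormal(mean=np.log(100.0), sigma=0.5, size=n)   # soil-gas concentration, ug/m^3
attenuation = rng.lognormal(mean=np.log(1e-4), sigma=1.0, size=n)    # dimensionless attenuation factor

# Screening relationship: indoor concentration = source concentration x attenuation factor.
indoor_air = source_conc * attenuation

print("median indoor air concentration:", np.median(indoor_air))
print("95th percentile                :", np.percentile(indoor_air, 95))
```

In the study itself, the same idea is applied to the validated 3-D model, with the Monte Carlo draws taken over its most uncertain input parameters to yield a probability distribution of indoor air concentration.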
Developing a predictive model for the chemical composition of soot nanoparticles
DOE Office of Scientific and Technical Information (OSTI.GOV)
Violi, Angela; Michelsen, Hope; Hansen, Nils
In order to provide the scientific foundation to enable technology breakthroughs in transportation fuel, it is important to develop a combustion modeling capability to optimize the operation and design of evolving fuels in advanced engines for transportation applications. The goal of this proposal is to develop a validated predictive model to describe the chemical composition of soot nanoparticles in premixed and diffusion flames. Atomistic studies in conjunction with state-of-the-art experiments are the distinguishing characteristics of this unique interdisciplinary effort. The modeling effort has been conducted at the University of Michigan by Prof. A. Violi. The experimental work has entailed a series of studies using different techniques to analyze gas-phase soot precursor chemistry and soot particle production in premixed and diffusion flames. Measurements have provided spatial distributions of polycyclic aromatic hydrocarbons and other gas-phase species, and the size and composition of incipient soot nanoparticles, for comparison with model results. The experimental team includes Dr. N. Hansen and Dr. H. Michelsen at Sandia National Labs' Combustion Research Facility, and Dr. K. Wilson as a collaborator at Lawrence Berkeley National Lab's Advanced Light Source. Our results show that the chemical and physical properties of nanoparticles affect the coagulation behavior in soot formation. An experimentally validated, predictive model for the chemical composition of soot nanoparticles will not only enhance our understanding of soot formation but will also allow the prediction of particle size distributions under combustion conditions. These results provide a novel description of soot formation based on physical and chemical properties of the particles for use in the next generation of soot models and an enhanced capability for facilitating the design of alternative fuels and the engines they will power.
Mathematical modeling of a single stage ultrasonically assisted distillation process.
Mahdi, Taha; Ahmad, Arshad; Ripin, Adnan; Abdullah, Tuan Amran Tuan; Nasef, Mohamed M; Ali, Mohamad W
2015-05-01
The ability of sonication phenomena in facilitating separation of azeotropic mixtures presents a promising approach for the development of more intensified and efficient distillation systems than conventional ones. To expedite the much-needed development, a mathematical model of the system based on conservation principles, vapor-liquid equilibrium and sonochemistry was developed in this study. The model that was founded on a single stage vapor-liquid equilibrium system and enhanced with ultrasonic waves was coded using MATLAB simulator and validated with experimental data for ethanol-ethyl acetate mixture. The effects of both ultrasonic frequency and intensity on the relative volatility and azeotropic point were examined, and the optimal conditions were obtained using genetic algorithm. The experimental data validated the model with a reasonable accuracy. The results of this study revealed that the azeotropic point of the mixture can be totally eliminated with the right combination of sonication parameters and this can be utilized in facilitating design efforts towards establishing a workable ultrasonically intensified distillation system. Copyright © 2014 Elsevier B.V. All rights reserved.
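The role of relative volatility in this kind of model can be sketched with a simple vapor-liquid equilibrium calculation. The Python example below computes relative volatility from modified Raoult's law with two-parameter Margules activity coefficients and checks whether an azeotrope (a crossing of alpha through 1) remains; all numbers, including the "silent" and "sonicated" parameter sets, are hypothetical illustrations and not values from the ethanol-ethyl acetate study.

```python
import numpy as np

x1 = np.linspace(0.01, 0.99, 99)             # liquid mole fraction of component 1

def gamma_margules(x1, A12, A21):
    """Two-parameter Margules activity coefficients for a binary mixture."""
    x2 = 1 - x1
    g1 = np.exp(x2**2 * (A12 + 2 * (A21 - A12) * x1))
    g2 = np.exp(x1**2 * (A21 + 2 * (A12 - A21) * x2))
    return g1, g2

P1_sat, P2_sat = 58.0, 73.0                   # kPa, hypothetical saturation pressures

# Hypothetical activity-coefficient parameters with and without sonication.
for label, (A12, A21) in {"silent": (1.0, 0.9), "sonicated": (0.6, 0.5)}.items():
    g1, g2 = gamma_margules(x1, A12, A21)
    alpha = g1 * P1_sat / (g2 * P2_sat)       # relative volatility alpha_12
    azeotrope = np.any(np.diff(np.sign(alpha - 1.0)) != 0)
    print(f"{label:10s} alpha range {alpha.min():.2f}-{alpha.max():.2f}, azeotrope present: {azeotrope}")
```

The point of the sketch is structural: if sonication shifts the activity coefficients enough that alpha no longer crosses 1, the azeotropic point disappears, which is the behavior the validated model is used to explore.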
Parrish, Rudolph S.; Smith, Charles N.
1990-01-01
A quantitative method is described for testing whether model predictions fall within a specified factor of true values. The technique is based on classical theory for confidence regions on unknown population parameters and can be related to hypothesis testing in both univariate and multivariate situations. A capability index is defined that can be used as a measure of predictive capability of a model, and its properties are discussed. The testing approach and the capability index should facilitate model validation efforts and permit comparisons among competing models. An example is given for a pesticide leaching model that predicts chemical concentrations in the soil profile.
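The idea of testing whether predictions fall within a specified factor of observed values can be prototyped on the log scale. The Python sketch below builds a t-based confidence interval on the mean log prediction/observation ratio and checks whether it lies inside the acceptance band defined by the factor; this is a simplified univariate illustration of the approach, not the exact procedure or capability index from the paper.

```python
import numpy as np
from scipy import stats

def within_factor_test(pred, obs, factor=2.0, alpha=0.05):
    """Check whether predictions agree with observations to within a factor.

    Works on log-ratios: log(pred/obs) within +/- log(factor) means the
    prediction is within the stated factor of the observation. A one-sample
    t-based confidence interval on the mean log-ratio is compared with that
    acceptance band.
    """
    r = np.log(np.asarray(pred, dtype=float) / np.asarray(obs, dtype=float))
    n = len(r)
    lo, hi = stats.t.interval(1 - alpha, df=n - 1, loc=r.mean(), scale=stats.sem(r))
    band = np.log(factor)
    return (lo > -band) and (hi < band)

# Hypothetical predicted vs. observed soil concentrations.
pred = [1.2, 0.8, 2.1, 1.7, 0.9]
obs  = [1.0, 1.0, 2.0, 1.5, 1.1]
print(within_factor_test(pred, obs, factor=2.0))
```

A capability index in the spirit of the paper could then be reported as the largest factor for which such a test passes, allowing comparisons among competing models.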
Magnetospheric Substorm Evolution in the Magnetotail: Challenge to Global MHD Modeling.
NASA Astrophysics Data System (ADS)
Kuznetsova, M. M.; Hesse, M.; Dorelli, J.; Rastaetter, L.
2003-12-01
Testing the ability of global MHD models to describe magnetotail evolution during substorms is one of the elements of science-based validation efforts at the CCMC. We perform simulations of magnetotail dynamics using global MHD models residing at the CCMC. We select solar wind conditions which drive the accumulation of magnetic field in the tail lobes and subsequent magnetic reconnection and energy release. We will analyze the effects of spatial resolution in the plasma sheet on modeled expansion phase evolution, maximum energy stored in the tail, and details of magnetotail reconnection. We will pay special attention to current sheet thinning and multiple plasmoid formation.
RF model of the distribution system as a communication channel, phase 2. Volume 1: Summary Report
NASA Technical Reports Server (NTRS)
Rustay, R. C.; Gajjar, J. T.; Rankin, R. W.; Wentz, R. C.; Wooding, R.
1982-01-01
The design, implementation, and verification of a computerized model for predicting the steady-state sinusoidal response of radial (tree) configured distribution feeders were undertaken. That work demonstrated the feasibility and validity of the approach based on verification measurements made on a limited-size portion of an actual live feeder. On that basis, a follow-on effort was conducted concerned with (1) extending the verification to a greater variety of situations and network sizes, (2) extending the model capabilities for reverse direction propagation, (3) investigating parameter sensitivities, (4) improving transformer models, and (5) investigating procedures/fixes for ameliorating propagation trouble spots. Results are summarized.
Animal models of post-traumatic stress disorder: face validity
Goswami, Sonal; Rodríguez-Sierra, Olga; Cascardi, Michele; Paré, Denis
2013-01-01
Post-traumatic stress disorder (PTSD) is a debilitating condition that develops in a proportion of individuals following a traumatic event. Despite recent advances, ethical limitations associated with human research impede progress in understanding PTSD. Fortunately, much effort has focused on developing animal models to help study the pathophysiology of PTSD. Here, we provide an overview of animal PTSD models where a variety of stressors (physical, psychosocial, or psychogenic) are used to examine the long-term effects of severe trauma. We emphasize models involving predator threat because they reproduce human individual differences in susceptibility to, and in the long-term consequences of, psychological trauma. PMID:23754973
Cognitive Support During High-Consequence Episodes of Care in Cardiovascular Surgery.
Conboy, Heather M; Avrunin, George S; Clarke, Lori A; Osterweil, Leon J; Christov, Stefan C; Goldman, Julian M; Yule, Steven J; Zenati, Marco A
2017-03-01
Despite significant efforts to reduce preventable adverse events in medical processes, such events continue to occur at unacceptable rates. This paper describes a computer science approach that uses formal process modeling to provide situationally aware monitoring and management support to medical professionals performing complex processes. These process models represent both normative and non-normative situations, and are validated by rigorous automated techniques such as model checking and fault tree analysis, in addition to careful review by experts. Context-aware Smart Checklists are then generated from the models, providing cognitive support during high-consequence surgical episodes. The approach is illustrated with a case study in cardiovascular surgery.
Insights on in vitro models for safety and toxicity assessment of cosmetic ingredients.
Almeida, Andreia; Sarmento, Bruno; Rodrigues, Francisca
2017-03-15
According to the current European legislation, the safety assessment of each individual cosmetic ingredient of any formulation is the basis for the safety evaluation of a cosmetic product. Also, animal testing in the European Union is prohibited for cosmetic ingredients and products since 2004 and 2009, respectively. Additionally, the commercialization of any cosmetic products containing ingredients tested on animal models was forbidden in 2009. In consequence of these boundaries, the European Centre for the Validation of Alternative Methods (ECVAM) proposes a list of validated cell-based in vitro models for predicting the safety and toxicity of cosmetic ingredients. These models have been demonstrated as valuable and effective tools to overcome the limitations of animal in vivo studies. Although the use of in vitro cell-based models for the evaluation of absorption and permeability of cosmetic ingredients is widespread, a detailed study on the properties of these platforms and the in vitro-in vivo correlation compared with human data are required. Moreover, additional efforts must be taken to develop in vitro models to predict carcinogenicity, repeat dose toxicity and reproductive toxicity, for which no alternative in vitro methods are currently available. This review paper summarizes and characterizes the most relevant in vitro models validated by ECVAM employed to predict the safety and toxicology of cosmetic ingredients. Copyright © 2017 Elsevier B.V. All rights reserved.
Validation of coupled atmosphere-fire behavior models
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bossert, J.E.; Reisner, J.M.; Linn, R.R.
1998-12-31
Recent advances in numerical modeling and computer power have made it feasible to simulate the dynamical interaction and feedback between the heat and turbulence induced by wildfires and the local atmospheric wind and temperature fields. At Los Alamos National Laboratory, the authors have developed a modeling system that includes this interaction by coupling a high resolution atmospheric dynamics model, HIGRAD, with a fire behavior model, BEHAVE, to predict the spread of wildfires. The HIGRAD/BEHAVE model is run at very high resolution to properly resolve the fire/atmosphere interaction. At present, these coupled wildfire model simulations are computationally intensive. The additional complexity of these models requires sophisticated methods for assuring their reliability in real world applications. With this in mind, a substantial part of the research effort is directed at model validation. Several instrumented prescribed fires have been conducted with multi-agency support and participation in chaparral, marsh, and scrub environments in coastal areas of Florida and inland California. In this paper, the authors first describe the data required to initialize the components of the wildfire modeling system. Then they present results from one of the Florida fires, and discuss a strategy for further testing and improvement of coupled weather/wildfire models.
Common modeling system for digital simulation
NASA Technical Reports Server (NTRS)
Painter, Rick
1994-01-01
The Joint Modeling and Simulation System is a tri-service investigation into a common modeling framework for the development of digital models. The basis for the success of this framework is an X-window-based, open systems architecture and an object-based/oriented methodology with a standard interface approach to digital model construction, configuration, execution, and post-processing. For years, Department of Defense (DOD) agencies have produced various weapon systems/technologies and, typically, digital representations of those systems/technologies. These digital representations (models) have also been developed for other reasons, such as studies and analysis or Cost and Operational Effectiveness Analysis (COEA) tradeoffs. Unfortunately, there have been no Modeling and Simulation (M&S) standards, guidelines, or efforts towards commonality in DOD M&S. The typical scenario is that an organization hires a contractor to build hardware, and in doing so a digital model may be constructed. Until recently, this model was not even obtained by the organization. Even if it was procured, it was on a unique platform, in a unique language, with unique interfaces, and, as a result, required unique maintenance. Additionally, the constructors of the model expended more effort in writing the 'infrastructure' of the model/simulation (e.g., user interface, database/database management system, data journalizing/archiving, graphical presentations, environment characteristics, other components in the simulation, etc.) than in producing the model of the desired system. Other side effects include duplication of effort, varying assumptions, lack of credibility/validation, and decentralization in policy and execution. J-MASS provides the infrastructure, standards, toolset, and architecture to permit M&S developers and analysts to concentrate on their area of interest.
The logic-bias effect: The role of effortful processing in the resolution of belief-logic conflict.
Howarth, Stephanie; Handley, Simon J; Walsh, Clare
2016-02-01
According to the default interventionist dual-process account of reasoning, belief-based responses to reasoning tasks are based on Type 1 processes generated by default, which must be inhibited in order to produce an effortful, Type 2 output based on the validity of an argument. However, recent research has indicated that reasoning on the basis of beliefs may not be as fast and automatic as this account claims. In three experiments, we presented participants with a reasoning task that was to be completed while they were generating random numbers (RNG). We used the novel methodology introduced by Handley, Newstead & Trippas (Journal of Experimental Psychology: Learning, Memory, and Cognition, 37, 28-43, 2011), which required participants to make judgments based upon either the validity of a conditional argument or the believability of its conclusion. The results showed that belief-based judgments produced lower rates of accuracy overall and were influenced to a greater extent than validity judgments by the presence of a conflict between belief and logic for both simple and complex arguments. These findings were replicated in Experiment 3, in which we controlled for switching demands in a blocked design. Across all three experiments, we found a main effect of RNG, implying that both instructional sets require some effortful processing. However, in the blocked design RNG had its greatest impact on logic judgments, suggesting that distinct executive resources may be required for each type of judgment. We discuss the implications of our findings for the default interventionist account and offer a parallel competitive model as an alternative interpretation for our findings.
NASA Astrophysics Data System (ADS)
Zaron, Edward D.; Fitzpatrick, Patrick J.; Cross, Scott L.; Harding, John M.; Bub, Frank L.; Wiggert, Jerry D.; Ko, Dong S.; Lau, Yee; Woodard, Katharine; Mooers, Christopher N. K.
2015-12-01
In response to the Deepwater Horizon (DwH) oil spill event in 2010, the Naval Oceanographic Office deployed a nowcast-forecast system covering the Gulf of Mexico and adjacent Caribbean Sea that was designated Americas Seas, or AMSEAS, which is documented in this manuscript. The DwH disaster provided a challenge to the application of available ocean-forecast capabilities, and also generated a historically large observational dataset. AMSEAS was evaluated by four complementary efforts, each with somewhat different aims and approaches: a university research consortium within an Integrated Ocean Observing System (IOOS) testbed; a petroleum industry consortium, the Gulf of Mexico 3-D Operational Ocean Forecast System Pilot Prediction Project (GOMEX-PPP); a British Petroleum (BP) funded project at the Northern Gulf Institute in response to the oil spill; and the Navy itself. Validation metrics are presented in these different projects for water temperature and salinity profiles, sea surface wind, sea surface temperature, sea surface height, and volume transport, for different forecast time scales. The validation found certain geographic and time biases/errors, and small but systematic improvements relative to earlier regional and global modeling efforts. On the basis of these positive AMSEAS validation studies, an oil spill transport simulation was conducted using archived AMSEAS nowcasts to examine transport into the estuaries east of the Mississippi River. This effort captured the influences of Hurricane Alex and a non-tropical cyclone off the Louisiana coast, both of which pushed oil into the western Mississippi Sound, illustrating the importance of the atmospheric influence on oil spills such as DwH.
Chen, S C; You, S H; Liu, C Y; Chio, C P; Liao, C M
2012-09-01
The aim of this work was to use experimental infection data of human influenza to assess a simple viral dynamics model in epithelial cells and better understand the underlying complex factors governing the infection process. The developed study model expands on previous reports of a target cell-limited model with delayed virus production. Data from 10 published experimental infection studies of human influenza were used to validate the model. Our results elucidate, mechanistically, the associations between epithelial cells, human immune responses, and viral titres and were supported by the experimental infection data. We report that the maximum total number of free virions following infection is 10^3-fold higher than the initially introduced titre. Our results indicated that the infection rates of unprotected epithelial cells probably play an important role in affecting viral dynamics. By simulating an advanced model of viral dynamics and applying it to experimental infection data of human influenza, we obtained important estimates of the infection rate. This work provides epidemiologically meaningful results, meriting further efforts to understand the causes and consequences of influenza A infection.
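The target cell-limited model with delayed virus production referenced above is commonly written as a four-compartment ODE system; the sketch below integrates that baseline form with purely illustrative parameter values and initial conditions, not the estimates obtained in this study.

```python
import numpy as np
from scipy.integrate import solve_ivp

def target_cell_limited(t, y, beta, k, delta, p, c):
    """Target cell-limited influenza model with delayed virus production:
    T  - uninfected target (epithelial) cells
    I1 - infected cells in the eclipse phase (not yet producing virus)
    I2 - productively infected cells
    V  - free virus titre"""
    T, I1, I2, V = y
    dT = -beta * T * V
    dI1 = beta * T * V - k * I1
    dI2 = k * I1 - delta * I2
    dV = p * I2 - c * V
    return [dT, dI1, dI2, dV]

# Illustrative parameter values and initial conditions (not fitted values)
params = (2.7e-5, 4.0, 3.0, 1.2e-2, 3.0)   # beta, k, delta, p, c (per-day units)
y0 = [4e8, 0.0, 0.0, 1.0]                  # cells, cells, cells, titre units
sol = solve_ivp(target_cell_limited, (0, 10), y0, args=params, dense_output=True)

t = np.linspace(0, 10, 101)
V = sol.sol(t)[3]
print(f"Peak titre is ~{V.max() / y0[3]:.1e}-fold above the initial titre")
```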
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dana, S.; Damiani, R.; vanDam, J.
As part of an ongoing effort to improve the modeling and prediction of small wind turbine dynamics, NREL tested a small horizontal axis wind turbine in the field at the National Wind Technology Center (NWTC). The test turbine was a 2.1-kW downwind machine mounted on an 18-meter multi-section fiberglass composite tower. The tower was instrumented and monitored for approximately 6 months. The collected data were analyzed to assess the turbine and tower loads and further validate the simplified loads equations from the International Electrotechnical Commission (IEC) 61400-2 design standards. Field-measured loads were also compared to the output of an aeroelastic model of the turbine. Ultimate loads at the tower base were assessed using both the simplified design equations and the aeroelastic model output. The simplified design equations in IEC 61400-2 do not accurately model fatigue loads. In this project, we compared fatigue loads as measured in the field, as predicted by the aeroelastic model, and as calculated using the simplified design equations.
Predictive ability of a comprehensive incremental test in mountain bike marathon.
Ahrend, Marc-Daniel; Schneeweiss, Patrick; Martus, Peter; Niess, Andreas M; Krauss, Inga
2018-01-01
Traditional performance tests in mountain bike marathon (XCM) primarily quantify aerobic metabolism and may not describe the relevant capacities in XCM. We aimed to validate a comprehensive test protocol quantifying its intermittent demands. Forty-nine athletes (38.8±9.1 years; 38 male; 11 female) performed a laboratory performance test, including an incremental test, to determine individual anaerobic threshold (IAT), peak power output (PPO) and three maximal efforts (10 s all-out sprint, 1 min maximal effort and 5 min maximal effort). Within 2 weeks, the athletes participated in one of three XCM races (n=15, n=9 and n=25). Correlations between test variables and race times were calculated separately. In addition, multiple regression models of the predictive value of laboratory outcomes were calculated for race 3 and across all races (z-transformed data). All variables were correlated with race times 1, 2 and 3: 10 s all-out sprint (r=-0.72; r=-0.59; r=-0.61), 1 min maximal effort (r=-0.85; r=-0.84; r=-0.82), 5 min maximal effort (r=-0.57; r=-0.85; r=-0.76), PPO (r=-0.77; r=-0.73; r=-0.76) and IAT (r=-0.71; r=-0.67; r=-0.68). The best-fitting multiple regression models for race 3 (r^2 = 0.868) and across all races (r^2 = 0.757) comprised 1 min maximal effort, IAT and body weight. Aerobic and intermittent variables correlated least strongly with race times. Their use in a multiple regression model confirmed additional explanatory power to predict XCM performance. These findings underline the usefulness of the comprehensive incremental test to predict performance in that sport more precisely.
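The traditional work-versus-time calculation of critical power (CP) and anaerobic work capacity (W'), used as the reference approach in studies like this, amounts to a linear fit of total work against effort duration; the sketch below uses made-up effort data and is not this study's analysis.

```python
import numpy as np

def critical_power_from_efforts(durations_s, mean_powers_w):
    """Estimate critical power (CP) and anaerobic work capacity (W')
    from the linear work-time model  W_total = W' + CP * t,
    fitted across several maximal efforts of different durations."""
    t = np.asarray(durations_s, dtype=float)
    work = np.asarray(mean_powers_w, dtype=float) * t   # total work per effort (J)
    cp, w_prime = np.polyfit(t, work, 1)                # slope = CP (W), intercept = W' (J)
    return cp, w_prime

# Illustrative (made-up) maximal efforts of 1, 4, 6, 8 and 10 min
durations = [60, 240, 360, 480, 600]
mean_powers = [650, 420, 395, 380, 370]
cp, w_prime = critical_power_from_efforts(durations, mean_powers)
print(f"CP ~ {cp:.0f} W, W' ~ {w_prime / 1000:.1f} kJ")
```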
NASA Technical Reports Server (NTRS)
West, Jeff; Strutzenberg, Louise L.; Putnam, Gabriel C.; Liever, Peter A.; Williams, Brandon R.
2012-01-01
This paper presents development efforts to establish modeling capabilities for launch vehicle liftoff acoustics and ignition transient environment predictions. Peak acoustic loads experienced by the launch vehicle occur during liftoff with strong interaction between the vehicle and the launch facility. Acoustic prediction engineering tools based on empirical models are of limited value in efforts to proactively design and optimize launch vehicles and launch facility configurations for liftoff acoustics. Modeling approaches are needed that capture the important details of the plume flow environment including the ignition transient, identify the noise generation sources, and allow assessment of the effects of launch pad geometric details and acoustic mitigation measures such as water injection. This paper presents a status of the CFD tools developed by the MSFC Fluid Dynamics Branch featuring advanced multi-physics modeling capabilities developed towards this goal. Validation and application examples are presented along with an overview of application in the prediction of liftoff environments and the design of targeted mitigation measures such as launch pad configuration and sound suppression water placement.
Benchmarking on Tsunami Currents with ComMIT
NASA Astrophysics Data System (ADS)
Sharghi vand, N.; Kanoglu, U.
2015-12-01
There were no standards for the validation and verification of tsunami numerical models before the 2004 Indian Ocean tsunami. Even so, numerical models had been used for inundation mapping efforts, evaluation of critical structures, etc., without validation and verification. After 2004, the NOAA Center for Tsunami Research (NCTR) established standards for the validation and verification of tsunami numerical models (Synolakis et al. 2008 Pure Appl. Geophys. 165, 2197-2228), which will be used in the evaluation of critical structures such as nuclear power plants against tsunami attack. NCTR presented analytical, experimental and field benchmark problems aimed at estimating maximum runup, which are widely accepted by the community. Recently, benchmark problems were suggested by the US National Tsunami Hazard Mitigation Program Mapping & Modeling Benchmarking Workshop: Tsunami Currents, held on February 9-10, 2015 in Portland, Oregon, USA (http://nws.weather.gov/nthmp/index.html). These benchmark problems concentrate on the validation and verification of tsunami numerical models with respect to tsunami currents. Three of the benchmark problems were: current measurements of the 2011 Japan tsunami in Hilo Harbor, Hawaii, USA and in Tauranga Harbor, New Zealand, and a single long-period wave propagating onto a small-scale experimental model of the town of Seaside, Oregon, USA. These benchmark problems were implemented in the Community Modeling Interface for Tsunamis (ComMIT) (Titov et al. 2011 Pure Appl. Geophys. 168, 2121-2131), a user-friendly interface developed by NCTR to the validated and verified Method of Splitting Tsunami (MOST) model (Titov and Synolakis 1995 J. Waterw. Port Coastal Ocean Eng. 121, 308-316). The modeling results are compared with the required benchmark data, show good agreement, and are discussed. Acknowledgment: The research leading to these results has received funding from the European Union's Seventh Framework Programme (FP7/2007-2013) under grant agreement no 603839 (Project ASTARTE - Assessment, Strategy and Risk Reduction for Tsunamis in Europe)
High-Temperature Strain Sensing for Aerospace Applications
NASA Technical Reports Server (NTRS)
Piazza, Anthony; Richards, Lance W.; Hudson, Larry D.
2008-01-01
Thermal protection systems (TPS) and hot structures utilize advanced materials that operate at temperatures exceeding current abilities to measure structural performance. Robust strain sensors that operate accurately and reliably beyond 1800 F are needed but do not exist. These shortcomings hinder the ability to validate analysis and modeling techniques and to optimize structural designs. This presentation examines high-temperature strain sensing for aerospace applications and, more specifically, seeks to provide strain data for validating finite element models and thermal-structural analyses. Efforts have been made to develop sensor attachment techniques for relevant structural materials at the small test specimen level and to perform laboratory tests to characterize sensors and generate corrections to apply to indicated strains. Areas highlighted in this presentation include sensors, sensor attachment techniques, laboratory evaluation/characterization of strain measurement, and sensor use in large-scale structures.
Monitoring programs to assess reintroduction efforts: A critical component in recovery
Muths, E.; Dreitz, V.
2008-01-01
Reintroduction is a powerful tool in our conservation toolbox. However, the necessary follow-up, i.e. long-term monitoring, is not commonplace and, if instituted, may lack rigor. We contend that valid monitoring is possible, even with sparse data. We present a means to monitor based on demographic data and a projection model, using the Wyoming toad (Bufo baxteri) as an example. Using an iterative process, existing data are built upon gradually such that demographic estimates and subsequent inferences increase in reliability. Reintroduction and defensible monitoring may become increasingly relevant as the outlook for amphibians, especially in tropical regions, continues to deteriorate and emergency collection, captive breeding, and reintroduction become necessary. Rigorous use of appropriate modeling and an adaptive approach can validate the use of reintroduction and substantially increase its value to recovery programs. © 2008 Museu de Ciències Naturals.
Use of Synchronized Phasor Measurements for Model Validation in ERCOT
NASA Astrophysics Data System (ADS)
Nuthalapati, Sarma; Chen, Jian; Shrestha, Prakash; Huang, Shun-Hsien; Adams, John; Obadina, Diran; Mortensen, Tim; Blevins, Bill
2013-05-01
This paper discusses experiences in the use of synchronized phasor measurement technology in the Electric Reliability Council of Texas (ERCOT) interconnection, USA. Implementation of synchronized phasor measurement technology in the region is a collaborative effort involving ERCOT, ONCOR, AEP, SHARYLAND, EPG, CCET, and UT-Arlington. As several phasor measurement units (PMUs) have been installed in the ERCOT grid in recent years, phasor data with a resolution of 30 samples per second are being used to monitor power system status and record system events. Post-event analyses using recorded phasor data have successfully verified ERCOT dynamic stability simulation studies. The real-time monitoring software RTDMS® enables ERCOT to analyze small-signal stability conditions by monitoring phase angles and oscillations. The recorded phasor data also enable ERCOT to validate the existing dynamic models of conventional and/or wind generators.
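As a rough illustration of the kind of oscillation monitoring enabled by 30-samples-per-second phasor data, the sketch below estimates the dominant frequency of a phase-angle difference signal with a simple FFT peak search; this is not the RTDMS algorithm, and the signal is synthetic.

```python
import numpy as np

def dominant_oscillation_hz(angle_diff_deg, fs_hz=30.0):
    """Estimate the dominant oscillation frequency in a phase-angle
    difference signal sampled by PMUs (e.g. 30 samples per second),
    using a simple FFT peak search after removing the mean."""
    x = np.asarray(angle_diff_deg, dtype=float)
    x = x - x.mean()
    spectrum = np.abs(np.fft.rfft(x))
    freqs = np.fft.rfftfreq(x.size, d=1.0 / fs_hz)
    return freqs[np.argmax(spectrum[1:]) + 1]   # skip the DC bin

# Illustrative synthetic signal: a 0.7 Hz inter-area style oscillation plus noise
fs, seconds = 30.0, 60
t = np.arange(int(fs * seconds)) / fs
signal = 2.0 * np.sin(2 * np.pi * 0.7 * t) + 0.3 * np.random.randn(t.size)
print(f"Dominant oscillation ~ {dominant_oscillation_hz(signal, fs):.2f} Hz")
```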
LEWICE 2.2 Capabilities and Thermal Validation
NASA Technical Reports Server (NTRS)
Wright, William B.
2002-01-01
Computational models of bleed air anti-icing and electrothermal de-icing have been added to the LEWICE 2.0 software by integrating the capabilities of two previous programs, ANTICE and LEWICE/Thermal. This combined model has been released as LEWICE version 2.2. Several advancements have also been added to the previous capabilities of each module. This report will present the capabilities of the software package and provide results for both bleed air and electrothermal cases. A comprehensive validation effort has also been performed to compare the predictions to an existing electrothermal database. A quantitative comparison shows that for de-icing cases the average difference is 9.4 F (26%) compared to 3 F for the experimental data, while for evaporative cases the average difference is 2 F (32%) compared to an experimental error of 4 F.
1981-11-01
very little effort has been put upon the model validation, which is essential in any scientific research. The orientation we aim at in the present...better than the former to the target function. This implies that, although the interval of ability e of our interest is even a little smaller than [-3.0...approaches turned out to be similar, with some deviations, i.e., some of them are a little closer to the theoretical density function, and some of
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fox, K. M.; Edwards, T. B.; Best, D. R.
2015-07-07
In this report, the Savannah River National Laboratory provides chemical analyses and Product Consistency Test (PCT) results for several simulated low activity waste (LAW) glasses (designated as the August and October 2014 LAW glasses) fabricated by the Pacific Northwest National Laboratory. The results of these analyses will be used as part of efforts to revise or extend the validation regions of the current Hanford Waste Treatment and Immobilization Plant glass property models to cover a broader span of waste compositions.
Final technical report for DE-SC00012633 AToM (Advanced Tokamak Modeling)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Holland, Christopher; Orlov, Dmitri; Izzo, Valerie
This final report for the AToM project documents contributions from University of California, San Diego researchers over the period of 9/1/2014 – 8/31/2017. The primary focus of these efforts was on performing validation studies of core tokamak transport models using the OMFIT framework, including development of OMFIT workflow scripts. Additional work was performed to develop tools for use of the nonlinear magnetohydrodynamics code NIMROD in OMFIT, and its use in the study of runaway electron dynamics in tokamak disruptions.
International forensic automotive paint database
NASA Astrophysics Data System (ADS)
Bishea, Gregory A.; Buckle, Joe L.; Ryland, Scott G.
1999-02-01
The Technical Working Group for Materials Analysis (TWGMAT) is supporting an international forensic automotive paint database. The Federal Bureau of Investigation and the Royal Canadian Mounted Police (RCMP) are collaborating on this effort through TWGMAT. This paper outlines the support and further development of the RCMP's Automotive Paint Database, 'Paint Data Query'. This cooperative agreement augments and supports a current, validated, searchable, automotive paint database that is used to identify make(s), model(s), and year(s) of questioned paint samples in hit-and-run fatalities and other associated investigations involving automotive paint.
Radiological Dispersal Devices: Select Issues in Consequence Management
2004-03-10
goals, following which medical treatment of the radiation effects can be provided.10 Post-exposure medical therapy is designed to treat the consequences ...the approach that radiation related health effects can be extrapolated, i.e. the damage caused by radiation exposure ... For example, see Health...effort to determine the validity of these models, the federal government funds research into the health effects of radiation exposure through the
NASA Technical Reports Server (NTRS)
Herman, Daniel A; Shastry, Rohit; Huang, Wensheng; Soulas, George C.; KamHawi, Hani
2012-01-01
In order to aid in the design of high-power Hall thrusters and provide experimental validation for existing modeling efforts, plasma potential and Langmuir probe measurements were performed in the near-field plume of the NASA 300M Hall thruster. A probe array consisting of a Faraday probe, Langmuir probe, and emissive probe was used to interrogate the plume from approximately 0.1 - 2.0 DT,m downstream of the thruster exit plane at four operating conditions: 300 V, 400 V, and 500 V at 20 kW as well as 300 V at 10 kW. Results show that the acceleration zone and high-temperature region were contained within 0.3 DT,m from the exit plane at all operating conditions. Isothermal lines were shown to strongly follow magnetic field lines in the near-field, with maximum temperatures ranging from 19 - 27 eV. The electron temperature spatial distribution created large drops in measured floating potentials in front of the magnetic pole surfaces where the plasma density was small, which suggests strong sheaths at these surfaces. The data taken have provided valuable information for future design and modeling validation, and complement ongoing internal measurement efforts on the NASA 300M.
An integrated radar model solution for mission level performance and cost trades
NASA Astrophysics Data System (ADS)
Hodge, John; Duncan, Kerron; Zimmerman, Madeline; Drupp, Rob; Manno, Mike; Barrett, Donald; Smith, Amelia
2017-05-01
A fully integrated Mission-Level Radar model is in development as part of a multi-year effort under the Northrop Grumman Mission Systems (NGMS) sector's Model Based Engineering (MBE) initiative to digitally interconnect and unify previously separate performance and cost models. In 2016, an NGMS internal research and development (IR and D) funded multidisciplinary team integrated radio frequency (RF), power, control, size, weight, thermal, and cost models together using a commercial-off-the-shelf software, ModelCenter, for an Active Electronically Scanned Array (AESA) radar system. Each represented model was digitally connected with standard interfaces and unified to allow end-to-end mission system optimization and trade studies. The radar model was then linked to the Air Force's own mission modeling framework (AFSIM). The team first had to identify the necessary models, and with the aid of subject matter experts (SMEs) understand and document the inputs, outputs, and behaviors of the component models. This agile development process and collaboration enabled rapid integration of disparate models and the validation of their combined system performance. This MBE framework will allow NGMS to design systems more efficiently and affordably, optimize architectures, and provide increased value to the customer. The model integrates detailed component models that validate cost and performance at the physics level with high-level models that provide visualization of a platform mission. This connectivity of component to mission models allows hardware and software design solutions to be better optimized to meet mission needs, creating cost-optimal solutions for the customer, while reducing design cycle time through risk mitigation and early validation of design decisions.
DOE Office of Scientific and Technical Information (OSTI.GOV)
I. M. Robertson; A. Beaudoin; J. Lambros
2004-01-05
Development and validation of constitutive models for polycrystalline materials subjected to high strain rate loading over a range of temperatures are needed to predict the response of engineering materials to in-service type conditions (foreign object damage, high-strain rate forging, high-speed sheet forming, deformation behavior during forming, response to extreme conditions, etc.). Accurately accounting for the complex effects that can occur during extreme and variable loading conditions requires significant and detailed computational and modeling efforts. These efforts must be closely coupled with precise and targeted experimental measurements that not only verify the predictions of the models, but also provide input about the fundamental processes responsible for the macroscopic response. Achieving this coupling between modeling and experimentation is the guiding principle of this program. Specifically, this program seeks to bridge the length scale between discrete dislocation interactions with grain boundaries and continuum models for polycrystalline plasticity. Achieving this goal requires incorporating these complex dislocation-interface interactions into the well-defined behavior of single crystals. Despite the widespread study of metal plasticity, this aspect is not well understood for simple loading conditions, let alone extreme ones. Our experimental approach includes determining the high-strain rate response as a function of strain and temperature with post-mortem characterization of the microstructure, quasi-static testing of pre-deformed material, and direct observation of the dislocation behavior during reloading by using the in situ transmission electron microscope deformation technique. These experiments will provide the basis for development and validation of physically-based constitutive models, which will include dislocation-grain boundary interactions for polycrystalline systems. One aspect of the program will involve the direct observation of specific mechanisms of micro-plasticity, as these will indicate the boundary value problem that should be addressed. This focus on the pre-yield region in the quasi-static effort (the elasto-plastic transition) is also a tractable one from an experimental and modeling viewpoint. In addition, our approach will minimize the need to fit model parameters to experimental data to obtain convergence. These are critical steps to reach the primary objective of simulating and modeling material performance under extreme loading conditions. In this annual report, we describe the progress made in the first year of this program.
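The program above develops physically based constitutive models that include dislocation-grain boundary interactions; as a familiar baseline for what a rate- and temperature-dependent constitutive form looks like, the sketch below evaluates the empirical Johnson-Cook flow stress with illustrative (not fitted) parameters, which is not the model developed in the program.

```python
import numpy as np

def johnson_cook_flow_stress(strain, strain_rate, temp_k,
                             A=350e6, B=275e6, n=0.36, C=0.022, m=1.0,
                             ref_rate=1.0, t_room=293.0, t_melt=1356.0):
    """Johnson-Cook flow stress (Pa): one widely used empirical constitutive
    form capturing strain hardening, strain-rate sensitivity and thermal
    softening. Parameter values here are illustrative, not for any real alloy."""
    t_star = np.clip((temp_k - t_room) / (t_melt - t_room), 0.0, 1.0)
    return ((A + B * strain**n)
            * (1.0 + C * np.log(np.maximum(strain_rate / ref_rate, 1e-12)))
            * (1.0 - t_star**m))

# Flow stress at 10% plastic strain and 1e3 /s strain rate, for two temperatures
for T in (293.0, 600.0):
    sigma = johnson_cook_flow_stress(0.10, 1e3, T)
    print(f"T = {T:.0f} K -> flow stress ~ {sigma / 1e6:.0f} MPa")
```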
Physician groups' use of data from patient experience surveys.
Friedberg, Mark W; SteelFisher, Gillian K; Karp, Melinda; Schneider, Eric C
2011-05-01
In Massachusetts, physician groups' performance on validated surveys of patient experience has been publicly reported since 2006. Groups also receive detailed reports of their own performance, but little is known about how physician groups have responded to these reports. To examine whether and how physician groups are using patient experience data to improve patient care. During 2008, we conducted semi-structured interviews with the leaders of 72 participating physician groups (out of 117 groups receiving patient experience reports). Based on leaders' responses, we identified three levels of engagement with patient experience reporting: no efforts to improve (level 1), efforts to improve only the performance of low-scoring physicians or practice sites (level 2), and efforts to improve group-wide performance (level 3). Groups' level of engagement and specific efforts to improve patient care. Forty-four group leaders (61%) reported group-wide improvement efforts (level 3), 16 (22%) reported efforts to improve only the performance of low-scoring physicians or practice sites (level 2), and 12 (17%) reported no performance improvement efforts (level 1). Level 3 groups were more likely than others to have an integrated medical group organizational model (84% vs. 31% at level 2 and 33% at level 1; P < 0.005) and to employ the majority of their physicians (69% vs. 25% and 20%; P < 0.05). Among level 3 groups, the most common targets for improvement were access, communication with patients, and customer service. The most commonly reported improvement initiatives were changing office workflow, providing additional training for nonclinical staff, and adopting or enhancing an electronic health record. Despite statewide public reporting, physician groups' use of patient experience data varied widely. Integrated organizational models were associated with greater engagement, and efforts to enhance clinicians' interpersonal skills were uncommon, with groups predominantly focusing on office workflow and support staff.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gowardhan, Akshay; Neuscamman, Stephanie; Donetti, John
Aeolus is an efficient three-dimensional computational fluid dynamics code based on the finite volume method, developed for predicting transport and dispersion of contaminants in a complex urban area. It solves the time-dependent incompressible Navier-Stokes equations on a regular Cartesian staggered grid using a fractional step method. It also solves a scalar transport equation for temperature using the Boussinesq approximation. The model also includes a Lagrangian dispersion model for predicting the transport and dispersion of atmospheric contaminants. The model can be run in an efficient Reynolds-Averaged Navier-Stokes (RANS) mode with a run time of several minutes, or a more detailed Large Eddy Simulation (LES) mode with a run time of hours for a typical simulation. This report describes the model components, including details on the physics models used in the code, as well as several model validation efforts. Aeolus wind and dispersion predictions are compared to field data from the Joint Urban Field Trials 2003 conducted in Oklahoma City (Allwine et al 2004), including both continuous and instantaneous releases. Newly implemented Aeolus capabilities include a decay chain model and an explosive Radiological Dispersal Device (RDD) source term; these capabilities are described. Aeolus predictions using the buoyant explosive RDD source are validated against two experimental data sets: the Green Field explosive cloud rise experiments conducted in Israel (Sharon et al 2012) and the Full-Scale RDD Field Trials conducted in Canada (Green et al 2016).
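A minimal sketch of a Lagrangian particle step of the kind a dispersion model performs (advection by the local wind plus a Gaussian random-walk diffusion term); this is a crude illustration, not Aeolus's actual dispersion scheme, and all values below are invented.

```python
import numpy as np

def lagrangian_step(positions, wind_uvw, dt, turb_sigma):
    """Advance Lagrangian particles one time step: deterministic advection
    by the local wind plus a Gaussian random-walk term representing
    turbulent diffusion (a crude stand-in for a full dispersion scheme)."""
    positions = np.asarray(positions, dtype=float)
    drift = np.asarray(wind_uvw, dtype=float) * dt
    diffusion = np.random.normal(0.0, turb_sigma * np.sqrt(dt), positions.shape)
    return positions + drift + diffusion

# Illustrative release of 1000 particles advected by a uniform wind
particles = np.zeros((1000, 3))
for _ in range(600):                       # 600 one-second steps
    particles = lagrangian_step(particles, wind_uvw=(3.0, 0.5, 0.0),
                                dt=1.0, turb_sigma=0.8)
print("Mean downwind travel (m):", particles[:, 0].mean().round(1))
```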
Development and validation of a numerical model of the swine head subjected to open-field blasts
NASA Astrophysics Data System (ADS)
Kalra, A.; Zhu, F.; Feng, K.; Saif, T.; Kallakuri, S.; Jin, X.; Yang, K.; King, A.
2017-11-01
A finite element model of the head of a 55-kg Yucatan pig was developed to calculate the incident pressure and corresponding intracranial pressure due to the explosion of 8 lb (3.63 kg) of C4 at three different distances. The results from the model were validated by comparing findings with experimentally obtained data from five pigs at three different blast overpressure levels: low (150 kPa), medium (275 kPa), and high (400 kPa). The peak values of intracranial pressures from numerical model at different locations of the brain such as the frontal, central, left temporal, right temporal, parietal, and occipital regions were compared with experimental values. The model was able to predict the peak pressure with reasonable percentage differences. The differences for peak incident and intracranial pressure values between the simulation results and the experimental values were found to be less than 2.2 and 29.3%, respectively, at all locations other than the frontal region. Additionally, a series of parametric studies shows that the intracranial pressure was very sensitive to sensor locations, the presence of air bubbles, and reflections experienced during the experiments. Further efforts will be undertaken to correlate the different biomechanical response parameters, such as the intracranial pressure gradient, stress, and strain results obtained from the validated model with injured brain locations once the histology data become available.
ALS Biomarkers for Therapy Development: State of the Field & Future Directions
Benatar, Michael; Boylan, Kevin; Jeromin, Andreas; Rutkove, Seward B.; Berry, James; Atassi, Nazem; Bruijn, Lucie
2015-01-01
Biomarkers have become the focus of intense research in the field of amyotrophic lateral sclerosis (ALS), with the hope that they might aid therapy development efforts. Notwithstanding the discovery of many candidate biomarkers, none have yet emerged as validated tools for drug development. In this review we present a nuanced view of biomarkers based on the perspective of the FDA; highlight the distinction between discovery and validation; describe existing and emerging resources; review leading biological fluid-based, electrophysiological and neuroimaging candidates relevant to therapy development efforts; discuss lessons learned from biomarker initiatives in related neurodegenerative diseases; and outline specific steps that we, as a field, might take in order to hasten the development and validation of biomarkers that will prove useful in enhancing efforts to develop effective treatments for ALS patients. Most important among these perhaps is the proposal to establish a federated ALS Biomarker Consortium (ABC) in which all interested and willing stakeholders may participate with equal opportunity to contribute to the broader mission of biomarker development and validation. PMID:26574709
Cloud Microphysics and Absorption Validation
NASA Technical Reports Server (NTRS)
Ackerman, Steven
2002-01-01
Vertical distributions of particle size and habit were developed from in-situ data collected from three midlatitude cirrus field campaigns (FIRE-1, FIRE-2, and ARM IOP). These new midlatitude microphysical models were used to develop new cirrus scattering models at a number of wavelengths appropriate for use with the MODIS imager (Nasiri et al. 2002). This was the first successful collaborative effort between all the investigators on this proposal. Recent efforts have extended the midlatitude cirrus cloud analyses to tropical cirrus, using in-situ data collected during the Tropical Rainfall Measurement Mission (TRMM) Kwajalein field campaign in 1999. We note that there are critical aspects to the work: a) Improvement in computing the scattering and radiative properties of ice crystals; b) Requirement for copious amounts of cirrus in-situ data, presented in terms of both particle size and habit distributions; c) Development of cirrus microphysical and optical models for various satellite, aircraft, and ground-based instruments based on the theoretical calculations and in-situ measurements; d) Application to satellite data.
Predicting Critical Power in Elite Cyclists: Questioning the Validity of the 3-Minute All-Out Test.
Bartram, Jason C; Thewlis, Dominic; Martin, David T; Norton, Kevin I
2017-07-01
New applications of the critical-power concept, such as the modeling of intermittent-work capabilities, are exciting prospects for elite cycling. However, accurate calculation of the required parameters is traditionally time invasive and somewhat impractical. An alternative single-test protocol (3-min all-out) has recently been proposed, but validation in an elite population is lacking. The traditional approach for parameter establishment, but with fewer tests, could also prove an acceptable compromise. Six senior Australian endurance track-cycling representatives completed 6 efforts to exhaustion on 2 separate days over a 3-wk period. These included 1-, 4-, 6-, 8-, and 10-min self-paced efforts, plus the 3-min all-out protocol. Traditional work-vs-time calculations of CP and anaerobic energy contribution (W') using the 5 self-paced efforts were compared with calculations from the 3-min all-out protocol. The impact of using just 2 or 3 self-paced efforts for traditional CP and W' estimation was also explored using thresholds of agreement (8 W, 2.0 kJ, respectively). CP estimated from the 3-min all-out approach was significantly higher than from the traditional approach (402 ± 33, 351 ± 27 W, P < .001), while W' was lower (15.5 ± 3.0, 24.3 ± 4.0 kJ, P = .02). Five different combinations of 2 or 3 self-paced efforts led to CP estimates within the threshold of agreement, with only 1 combination deemed accurate for W'. In elite cyclists the 3-min all-out approach is not suitable to estimate CP when compared with the traditional method. However, reducing the number of tests used in the traditional method lessens testing burden while maintaining appropriate parameter accuracy.
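The 3-min all-out protocol is usually reduced to CP and W' by taking the mean power of the final 30 s as the end-test power (CP) and the work done above that power as W'; the sketch below applies that common calculation to a synthetic power trace and is not necessarily the exact processing used in this study.

```python
import numpy as np

def cp_wprime_from_3min_all_out(power_w, fs_hz=1.0):
    """Estimate CP and W' from a 3-min all-out effort: CP is taken as the
    mean power over the final 30 s (end-test power) and W' as the work
    done above that power earlier in the test."""
    p = np.asarray(power_w, dtype=float)
    n_last = int(30 * fs_hz)
    cp = p[-n_last:].mean()
    w_prime = np.sum(np.clip(p - cp, 0.0, None)) / fs_hz   # joules
    return cp, w_prime

# Illustrative 1 Hz power trace decaying from ~700 W to a ~380 W plateau
t = np.arange(180)
power = 380 + 320 * np.exp(-t / 45.0)
cp, w_prime = cp_wprime_from_3min_all_out(power, fs_hz=1.0)
print(f"CP ~ {cp:.0f} W, W' ~ {w_prime / 1000:.1f} kJ")
```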
Testing Pearl Model In Three European Sites
NASA Astrophysics Data System (ADS)
Bouraoui, F.; Bidoglio, G.
The Plant Protection Product Directive (91/414/EEC) stresses the need for validated models to calculate predicted environmental concentrations. The use of models has become an unavoidable step before pesticide registration. In this context, the European Commission, and in particular DGVI, set up a FOrum for the Co-ordination of pesticide fate models and their USe (FOCUS). In a complementary effort, DG Research supported the APECOP project, one of whose objectives was the validation and improvement of existing pesticide fate models. The main topic of the research presented here is the validation of the PEARL model for different sites in Europe. The PEARL model, currently used in the Dutch pesticide registration procedure, was validated at three well-instrumented sites: Vredepeel (the Netherlands), Brimstone (UK), and Lanna (Sweden). A step-wise procedure was used for the validation of the PEARL model. First the water transport module was calibrated, and then the solute transport module, using tracer measurements while keeping the water transport parameters unchanged. The Vredepeel site is characterised by a sandy soil. Fourteen months of measurements were used for the calibration. Two pesticides were applied on the site: bentazone and ethoprophos. PEARL predictions were very satisfactory for both soil moisture content and pesticide concentration in the soil profile. The Brimstone site is characterised by a cracking clay soil. The calibration was conducted on a time series of measurements spanning 7 years. The validation consisted of comparing predictions and measurements of soil moisture at different soil depths, and of comparing the predicted and measured concentration of isoproturon in the drainage water. The results, although in good agreement with the measurements, highlighted the limitations of the model when preferential flow becomes a dominant process. PEARL did not reproduce the soil moisture profile well during summer months, and also under-predicted the arrival of isoproturon at the drains. The Lanna site is characterised by a structured clay soil. PEARL was successful in predicting soil moisture profiles and the drainage water. PEARL performed well in predicting the soil concentration of bentazone at different depths. However, since PEARL does not consider cracks in the soil, it did not predict the peak concentrations of bentazone in the drainage water well. Along with the validation results for the three sites, a sensitivity analysis of the model is presented.
Murphy, James T; Voisin, Marie; Johnson, Mark; Viard, Frédérique
2016-06-01
The data presented in this article are related to the research article entitled "A modelling approach to explore the critical environmental parameters influencing the growth and establishment of the invasive seaweed Undaria pinnatifida in Europe" [1]. This article describes raw simulation data output from a novel individual-based model of the invasive kelp species Undaria pinnatifida. It also includes field data of monthly abundance and recruitment values for a population of invasive U. pinnatifida (in Brest harbour, France) that were used to validate the model. The raw model output and field data are made publicly available in order to enable critical analysis of the model predictions and to inform future modelling efforts of the study species.
Results and current status of the NPARC alliance validation effort
NASA Technical Reports Server (NTRS)
Towne, Charles E.; Jones, Ralph R.
1996-01-01
The NPARC Alliance is a partnership between the NASA Lewis Research Center (LeRC) and the USAF Arnold Engineering Development Center (AEDC) dedicated to the establishment of a national CFD capability, centered on the NPARC Navier-Stokes computer program. The three main tasks of the Alliance are user support, code development, and validation. The present paper is a status report on the validation effort. It describes the validation approach being taken by the Alliance. Representative results are presented for laminar and turbulent flat plate boundary layers, a supersonic axisymmetric jet, and a glancing shock/turbulent boundary layer interaction. Cases scheduled to be run in the future are also listed. The archive of validation cases is described, including information on how to access it via the Internet.
ERIC Educational Resources Information Center
Krell, Moritz
2017-01-01
This study evaluates a 12-item instrument for subjective measurement of mental load (ML) and mental effort (ME) by analysing different sources of validity evidence. The findings of an expert judgement (N = 8) provide "evidence based on test content" that the formulation of the items corresponds to the meaning of ML and ME. An empirical…
ERIC Educational Resources Information Center
Harrison, Allyson G.; Green, Paul; Flaro, Lloyd
2012-01-01
It is almost self-evident that test results will be unreliable and misleading if those undergoing assessments do not make a full effort on testing. Nevertheless, objective tests of effort have not typically been used with young adults to determine whether test results are valid or not. Because of the potential economic and/or recreational benefits…
Validation of Extended MHD Models using MST RFP Plasmas
NASA Astrophysics Data System (ADS)
Jacobson, C. M.; Chapman, B. E.; Craig, D.; McCollam, K. J.; Sovinec, C. R.
2016-10-01
Significant effort has been devoted to improvement of computational models used in fusion energy sciences. Rigorous validation of these models is necessary in order to increase confidence in their ability to predict the performance of future devices. MST is a well diagnosed reversed-field pinch (RFP) capable of operation over a wide range of parameters. In particular, the Lundquist number S, a key parameter in resistive magnetohydrodynamics (MHD), can be varied over a wide range and provide substantial overlap with MHD RFP simulations. MST RFP plasmas are simulated using both DEBS, a nonlinear single-fluid visco-resistive MHD code, and NIMROD, a nonlinear extended MHD code, with S ranging from 10^4 to 5 × 10^4 for single-fluid runs, with the magnetic Prandtl number Pm = 1. Experiments with plasma current IP ranging from 60 kA to 500 kA result in S from 4 × 10^4 to 8 × 10^6. Validation metric comparisons are presented, focusing on how magnetic fluctuations b scale with S. Single-fluid NIMROD results give b ∝ S^-0.21, and experiments give b ∝ S^-0.28 for the dominant m = 1, n = 6 mode. Preliminary two-fluid NIMROD results are also presented. Work supported by US DOE.
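The fluctuation-versus-Lundquist-number scaling quoted above is a power law, which can be extracted by a log-log linear fit; the sketch below does this for made-up (S, b) pairs, not the MST or NIMROD data.

```python
import numpy as np

def fluctuation_scaling_exponent(lundquist_s, b_fluct):
    """Fit a power law b ~ S**alpha by linear regression in log-log space
    and return the exponent alpha."""
    log_s = np.log10(np.asarray(lundquist_s, dtype=float))
    log_b = np.log10(np.asarray(b_fluct, dtype=float))
    alpha, _ = np.polyfit(log_s, log_b, 1)
    return alpha

# Illustrative (made-up) fluctuation amplitudes at several Lundquist numbers
S = [4e4, 1e5, 4e5, 1e6, 4e6, 8e6]
b = [0.021, 0.016, 0.011, 0.0088, 0.0060, 0.0052]
print(f"Fitted exponent: {fluctuation_scaling_exponent(S, b):.2f}")
```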
Tests and Techniques for Characterizing and Modeling X-43A Electromechanical Actuators
NASA Technical Reports Server (NTRS)
Lin, Yohan; Baumann, Ethan; Bose, David M.; Beck, Roger; Jenney, Gavin
2008-01-01
A series of tests were conducted on the electromechanical actuators of the X-43A research vehicle in preparation for the Mach 7 and 10 hypersonic flights. The tests were required to help validate the actuator models in the simulation and acquire a better understanding of the installed system characteristics. Static and dynamic threshold, multichannel crosstalk, command-to-surface timing, free play, voltage regeneration, calibration, frequency response, compliance, hysteretic damping, and aircraft-in-the-loop tests were performed as part of this effort. This report describes the objectives, configurations, and methods for those tests, as well as the techniques used for developing second-order actuator models from the test results. When the first flight attempt failed because of actuator problems with the launch vehicle, further analysis and model enhancements were performed as part of the return-to-flight activities. High-fidelity models are described, along with the modifications that were required to match measurements taken from the research vehicle. Problems involving the implementation of these models into the X-43A simulation are also discussed. This report emphasizes lessons learned from the actuator testing, simulation modeling, and integration efforts for the X-43A hypersonic research vehicle.
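A second-order actuator model of the kind developed from such test results can be represented as a standard transfer function; the natural frequency and damping ratio below are illustrative assumptions, not the identified X-43A actuator parameters.

```python
import numpy as np
from scipy import signal

# Second-order actuator model:  G(s) = wn^2 / (s^2 + 2*zeta*wn*s + wn^2)
wn = 2 * np.pi * 15.0      # natural frequency, rad/s (15 Hz, assumed)
zeta = 0.6                 # damping ratio (assumed)

actuator = signal.TransferFunction([wn**2], [1.0, 2 * zeta * wn, wn**2])

# Step response to a unit surface command
t, y = signal.step(actuator, T=np.linspace(0, 0.5, 500))
print(f"Peak overshoot: {100 * (y.max() - 1.0):.1f} %")
```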
The development of a survey instrument for community health improvement.
Bazos, D A; Weeks, W B; Fisher, E S; DeBlois, H A; Hamilton, E; Young, M J
2001-01-01
OBJECTIVE: To develop a survey instrument that could be used both to guide and evaluate community health improvement efforts. DATA SOURCES/STUDY SETTING: A randomized telephone survey was administered to a sample of about 250 residents in two communities in Lehigh Valley, Pennsylvania in the fall of 1997. METHODS: The survey instrument was developed by health professionals representing diverse health care organizations. This group worked collaboratively over a period of two years to (1) select a conceptual model of health as a foundation for the survey; (2) review relevant literature to identify indicators that adequately measured the health constructs within the chosen model; (3) develop new indicators where important constructs lacked specific measures; and (4) pilot test the final survey to assess the reliability and validity of the instrument. PRINCIPAL FINDINGS: The Evans and Stoddart Field Model of the Determinants of Health and Well-Being was chosen as the conceptual model within which to develop the survey. The Field Model depicts nine domains important to the origins and production of health and provides a comprehensive framework from which to launch community health improvement efforts. From more than 500 potential indicators we identified 118 survey questions that reflected the multiple determinants of health as conceptualized by this model. Sources from which indicators were selected include the Behavior Risk Factor Surveillance Survey, the National Health Interview Survey, the Consumer Assessment of Health Plans Survey, and the SF-12 Summary Scales. The work group developed 27 new survey questions for constructs for which we could not locate adequate indicators. Twenty-five questions in the final instrument can be compared to nationally published norms or benchmarks. The final instrument was pilot tested in 1997 in two communities. Administration time averaged 22 minutes with a response rate of 66 percent. Reliability of new survey questions was adequate. Face validity was supported by previous findings from qualitative and quantitative studies. CONCLUSIONS: We developed, pilot tested, and validated a survey instrument designed to provide more comprehensive and timely data to communities for community health assessments. This instrument allows communities to identify and measure critical domains of health that have previously not been captured in a single instrument. PMID:11508639
Misleading prioritizations from modelling range shifts under climate change
Sofaer, Helen R.; Jarnevich, Catherine S.; Flather, Curtis H.
2018-01-01
Aim: Conservation planning requires the prioritization of a subset of taxa and geographical locations to focus monitoring and management efforts. Integration of the threats and opportunities posed by climate change often relies on predictions from species distribution models, particularly for assessments of vulnerability or invasion risk for multiple taxa. We evaluated whether species distribution models could reliably rank changes in species range size under climate and land use change. Location: Conterminous U.S.A. Time period: 1977–2014. Major taxa studied: Passerine birds. Methods: We estimated ensembles of species distribution models based on historical North American Breeding Bird Survey occurrences for 190 songbirds, and generated predictions to recent years given c. 35 years of observed land use and climate change. We evaluated model predictions using standard metrics of discrimination performance and a more detailed assessment of the ability of models to rank species vulnerability to climate change based on predicted range loss, range gain, and overall change in range size. Results: Species distribution models yielded unreliable and misleading assessments of relative vulnerability to climate and land use change. Models could not accurately predict range expansion or contraction, and therefore failed to anticipate patterns of range change among species. These failures occurred despite excellent overall discrimination ability and transferability to the validation time period, which reflected strong performance at the majority of locations that were either always or never occupied by each species. Main conclusions: Models failed for the questions and at the locations of greatest interest to conservation and management. This highlights potential pitfalls of multi-taxa impact assessments under global change; in our case, models provided misleading rankings of the most impacted species, and spatial information about range changes was not credible. As modelling methods and frameworks continue to be refined, performance assessments and validation efforts should focus on the measures of risk and vulnerability useful for decision-making.
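Range loss, gain and net change of the sort used above to rank species can be computed from paired presence/absence grids; the sketch below shows one simple way to do this with invented 5x5 grids, not the study's predictions.

```python
import numpy as np

def range_change_summary(predicted_now, predicted_future):
    """Summarize predicted range loss, gain and net change from two boolean
    presence/absence grids produced by a species distribution model."""
    now = np.asarray(predicted_now, dtype=bool)
    future = np.asarray(predicted_future, dtype=bool)
    loss = np.sum(now & ~future)
    gain = np.sum(~now & future)
    baseline = np.sum(now)
    return {"loss_cells": int(loss),
            "gain_cells": int(gain),
            "pct_change": 100.0 * (gain - loss) / baseline}

# Illustrative 5x5 presence grids for a single species
now = np.array([[1,1,1,0,0],[1,1,1,0,0],[0,1,1,1,0],[0,0,1,1,0],[0,0,0,0,0]])
future = np.array([[0,1,1,1,0],[0,1,1,1,0],[0,0,1,1,1],[0,0,0,1,1],[0,0,0,0,0]])
print(range_change_summary(now, future))
```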
DOE Office of Scientific and Technical Information (OSTI.GOV)
Green, Martin L.; Choi, C. L.; Hattrick-Simpers, J. R.
The Materials Genome Initiative, a national effort to introduce new materials into the market faster and at lower cost, has made significant progress in computational simulation and modeling of materials. To build on this progress, a large amount of experimental data for validating these models, and informing more sophisticated ones, will be required. High-throughput experimentation generates large volumes of experimental data using combinatorial materials synthesis and rapid measurement techniques, making it an ideal experimental complement to bring the Materials Genome Initiative vision to fruition. This paper reviews the state-of-the-art results, opportunities, and challenges in high-throughput experimentation for materials design. As a result, a major conclusion is that an effort to deploy a federated network of high-throughput experimental (synthesis and characterization) tools, which are integrated with a modern materials data infrastructure, is needed.
Analysis Methods for Progressive Damage of Composite Structures
NASA Technical Reports Server (NTRS)
Rose, Cheryl A.; Davila, Carlos G.; Leone, Frank A.
2013-01-01
This document provides an overview of recent accomplishments and lessons learned in the development of general progressive damage analysis methods for predicting the residual strength and life of composite structures. These developments are described within their State-of-the-Art (SoA) context and the associated technology barriers. The emphasis of the authors is on developing these analysis tools for application at the structural level. Hence, modeling of damage progression is undertaken at the mesoscale, where the plies of a laminate are represented as a homogeneous orthotropic continuum. The aim of the present effort is to establish the ranges of validity of available models, to identify technology barriers, and to establish the foundations of future investigation efforts. Such are the necessary steps towards accurate and robust simulations that can replace some of the expensive and time-consuming "building block" tests that are currently required for the design and certification of aerospace structures.
Experience Transitioning Models and Data at the NOAA Space Weather Prediction Center
NASA Astrophysics Data System (ADS)
Berger, Thomas
2016-07-01
The NOAA Space Weather Prediction Center has a long history of transitioning research data and models into operations and with the validation activities required. The first stage in this process involves demonstrating that the capability has sufficient value to customers to justify the cost needed to transition it and to run it continuously and reliably in operations. Once the overall value is demonstrated, a substantial effort is then required to develop the operational software from the research codes. The next stage is to implement and test the software and product generation on the operational computers. Finally, effort must be devoted to establishing long-term measures of performance, maintaining the software, and working with forecasters, customers, and researchers to improve over time the operational capabilities. This multi-stage process of identifying, transitioning, and improving operational space weather capabilities will be discussed using recent examples. Plans for future activities will also be described.
Green, Martin L.; Choi, C. L.; Hattrick-Simpers, J. R.; ...
2017-03-28
The Materials Genome Initiative, a national effort to introduce new materials into the market faster and at lower cost, has made significant progress in computational simulation and modeling of materials. To build on this progress, a large amount of experimental data for validating these models, and informing more sophisticated ones, will be required. High-throughput experimentation generates large volumes of experimental data using combinatorial materials synthesis and rapid measurement techniques, making it an ideal experimental complement to bring the Materials Genome Initiative vision to fruition. This paper reviews the state-of-the-art results, opportunities, and challenges in high-throughput experimentation for materials design. As a result, a major conclusion is that an effort to deploy a federated network of high-throughput experimental (synthesis and characterization) tools, which are integrated with a modern materials data infrastructure, is needed.
Theoretical Foundations and a Research Agenda to Validate Measures of Intercultural Effort
ERIC Educational Resources Information Center
Dowd, Alicia C.; Sawatzky, Misty; Korn, Randi
2011-01-01
The concept of "student effort" is foundational to such commonly used assessments of institutional effectiveness as the National Survey of Student Engagement (NSSE) and the Community College Survey of Student Engagement (CCSSE). However, the current measure of student effort omits intercultural effort, which is particularly salient to the academic…
Li, Jingwen; Ye, Qing; Ding, Li; Liao, Qianfang
2017-07-01
Extravehicular activity (EVA) is an inevitable task for astronauts to maintain proper functions of both the spacecraft and the space station. Both experimental research in microgravity simulators (e.g. a neutral buoyancy tank, zero-g aircraft or a drop tower/tube) and mathematical modeling have been used to study EVA and to provide guidance for training on Earth and task design in space. Modeling has become more and more promising because of its efficiency. Based on task analysis, almost 90% of EVA activity is accomplished through upper limb motions. Therefore, focusing on upper limb models of the body and space suit is valuable to this effort. In previous modeling studies, multi-rigid-body systems were developed to simplify the human musculoskeletal system, and the space suit was mostly considered as a part of the astronaut body. With the aim of improving the realism of the models, we developed an astronauts' upper limb model, including a torque model and a muscle-force model, with the counter torques from the space suit considered as a boundary condition. Inverse kinematics and Maggi-Kane's method were applied to calculate the joint angles, joint torques and muscle forces, given that the terminal trajectory of the upper limb motion was known. We also validated the muscle-force model using electromyogram (EMG) data collected in a validation experiment. Muscle forces calculated from our model showed a trend similar to the EMG data, supporting the effectiveness and feasibility of the muscle-force model we established and partially validating the kinematic aspects of the joint model.
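As a much-simplified illustration of the inverse-kinematics step (the study above uses a 3-D multi-rigid-body formulation and Maggi-Kane's method), the sketch below solves the closed-form inverse kinematics of a planar two-link arm; the segment lengths and hand trajectory are assumptions.

```python
import numpy as np

def two_link_inverse_kinematics(x, y, l_upper=0.33, l_fore=0.27):
    """Closed-form inverse kinematics of a planar two-link arm
    (shoulder + elbow) given a hand position (x, y) in metres.
    Returns shoulder and elbow angles in radians (elbow-down solution)."""
    d2 = x * x + y * y
    cos_elbow = (d2 - l_upper**2 - l_fore**2) / (2 * l_upper * l_fore)
    elbow = np.arccos(np.clip(cos_elbow, -1.0, 1.0))
    shoulder = np.arctan2(y, x) - np.arctan2(l_fore * np.sin(elbow),
                                             l_upper + l_fore * np.cos(elbow))
    return shoulder, elbow

# Illustrative hand trajectory: joint angles along a short horizontal reach
for x in np.linspace(0.30, 0.55, 6):
    s, e = two_link_inverse_kinematics(x, 0.10)
    print(f"x={x:.2f} m -> shoulder {np.degrees(s):6.1f} deg, elbow {np.degrees(e):6.1f} deg")
```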
A Review of Flood Loss Models as Basis for Harmonization and Benchmarking
Kreibich, Heidi; Franco, Guillermo; Marechal, David
2016-01-01
Risk-based approaches have been increasingly accepted and operationalized in flood risk management during recent decades. For instance, commercial flood risk models are used by the insurance industry to assess potential losses, establish the pricing of policies and determine reinsurance needs. Despite considerable progress in the development of loss estimation tools since the 1980s, loss estimates still reflect high uncertainties and disparities that often lead to questioning their quality. This requires an assessment of the validity and robustness of loss models as it affects prioritization and investment decision in flood risk management as well as regulatory requirements and business decisions in the insurance industry. Hence, more effort is needed to quantify uncertainties and undertake validations. Due to a lack of detailed and reliable flood loss data, first order validations are difficult to accomplish, so that model comparisons in terms of benchmarking are essential. It is checked if the models are informed by existing data and knowledge and if the assumptions made in the models are aligned with the existing knowledge. When this alignment is confirmed through validation or benchmarking exercises, the user gains confidence in the models. Before these benchmarking exercises are feasible, however, a cohesive survey of existing knowledge needs to be undertaken. With that aim, this work presents a review of flood loss–or flood vulnerability–relationships collected from the public domain and some professional sources. Our survey analyses 61 sources consisting of publications or software packages, of which 47 are reviewed in detail. This exercise results in probably the most complete review of flood loss models to date containing nearly a thousand vulnerability functions. These functions are highly heterogeneous and only about half of the loss models are found to be accompanied by explicit validation at the time of their proposal. This paper exemplarily presents an approach for a quantitative comparison of disparate models via the reduction to the joint input variables of all models. Harmonization of models for benchmarking and comparison requires profound insight into the model structures, mechanisms and underlying assumptions. Possibilities and challenges are discussed that exist in model harmonization and the application of the inventory in a benchmarking framework. PMID:27454604
A Review of Flood Loss Models as Basis for Harmonization and Benchmarking.
Gerl, Tina; Kreibich, Heidi; Franco, Guillermo; Marechal, David; Schröter, Kai
2016-01-01
Risk-based approaches have been increasingly accepted and operationalized in flood risk management during recent decades. For instance, commercial flood risk models are used by the insurance industry to assess potential losses, establish the pricing of policies and determine reinsurance needs. Despite considerable progress in the development of loss estimation tools since the 1980s, loss estimates still reflect high uncertainties and disparities that often lead to questioning their quality. This requires an assessment of the validity and robustness of loss models as it affects prioritization and investment decision in flood risk management as well as regulatory requirements and business decisions in the insurance industry. Hence, more effort is needed to quantify uncertainties and undertake validations. Due to a lack of detailed and reliable flood loss data, first order validations are difficult to accomplish, so that model comparisons in terms of benchmarking are essential. It is checked if the models are informed by existing data and knowledge and if the assumptions made in the models are aligned with the existing knowledge. When this alignment is confirmed through validation or benchmarking exercises, the user gains confidence in the models. Before these benchmarking exercises are feasible, however, a cohesive survey of existing knowledge needs to be undertaken. With that aim, this work presents a review of flood loss-or flood vulnerability-relationships collected from the public domain and some professional sources. Our survey analyses 61 sources consisting of publications or software packages, of which 47 are reviewed in detail. This exercise results in probably the most complete review of flood loss models to date containing nearly a thousand vulnerability functions. These functions are highly heterogeneous and only about half of the loss models are found to be accompanied by explicit validation at the time of their proposal. This paper exemplarily presents an approach for a quantitative comparison of disparate models via the reduction to the joint input variables of all models. Harmonization of models for benchmarking and comparison requires profound insight into the model structures, mechanisms and underlying assumptions. Possibilities and challenges are discussed that exist in model harmonization and the application of the inventory in a benchmarking framework.
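Most of the vulnerability functions surveyed in the review above relate water depth to a relative damage ratio; the sketch below shows one generic piecewise-linear depth-damage function and how a loss estimate follows from it, with entirely invented curve values.

```python
import numpy as np

def flood_loss(depth_m, building_value, depth_points, damage_ratios):
    """Estimate absolute flood loss from a stage-damage (vulnerability)
    function given as paired water depths and relative damage ratios,
    using linear interpolation between the supplied points."""
    ratio = np.interp(depth_m, depth_points, damage_ratios)
    return ratio * building_value

# Illustrative (made-up) residential depth-damage curve
depths = [0.0, 0.5, 1.0, 2.0, 3.0]        # metres of water above floor level
ratios = [0.00, 0.12, 0.25, 0.45, 0.60]   # fraction of building value lost
print(flood_loss(1.4, building_value=250_000, depth_points=depths, damage_ratios=ratios))
```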
Computational electronics and electromagnetics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shang, C C
The Computational Electronics and Electromagnetics thrust area serves as the focal point for Engineering R and D activities for developing computer-based design and analysis tools. Representative applications include design of particle accelerator cells and beamline components; design of transmission line components; engineering analysis and design of high-power (optical and microwave) components; photonics and optoelectronics circuit design; electromagnetic susceptibility analysis; and antenna synthesis. The FY-97 effort focuses on development and validation of (1) accelerator design codes; (2) 3-D massively parallel, time-dependent EM codes; (3) material models; (4) coupling and application of engineering tools for analysis and design of high-power components; and (5) development of beam control algorithms coupled to beam transport physics codes. These efforts are in association with technology development in the power conversion, nondestructive evaluation, and microtechnology areas. The efforts complement technology development in Lawrence Livermore National programs.
The Earthquake‐Source Inversion Validation (SIV) Project
Mai, P. Martin; Schorlemmer, Danijel; Page, Morgan T.; Ampuero, Jean-Paul; Asano, Kimiyuki; Causse, Mathieu; Custodio, Susana; Fan, Wenyuan; Festa, Gaetano; Galis, Martin; Gallovic, Frantisek; Imperatori, Walter; Käser, Martin; Malytskyy, Dmytro; Okuwaki, Ryo; Pollitz, Fred; Passone, Luca; Razafindrakoto, Hoby N. T.; Sekiguchi, Haruko; Song, Seok Goo; Somala, Surendra N.; Thingbaijam, Kiran K. S.; Twardzik, Cedric; van Driel, Martin; Vyas, Jagdish C.; Wang, Rongjiang; Yagi, Yuji; Zielke, Olaf
2016-01-01
Finite‐fault earthquake source inversions infer the (time‐dependent) displacement on the rupture surface from geophysical data. The resulting earthquake source models document the complexity of the rupture process. However, multiple source models for the same earthquake, obtained by different research teams, often exhibit remarkable dissimilarities. To address the uncertainties in earthquake‐source inversion methods and to understand strengths and weaknesses of the various approaches used, the Source Inversion Validation (SIV) project conducts a set of forward‐modeling exercises and inversion benchmarks. In this article, we describe the SIV strategy, the initial benchmarks, and current SIV results. Furthermore, we apply statistical tools for quantitative waveform comparison and for investigating source‐model (dis)similarities that enable us to rank the solutions, and to identify particularly promising source inversion approaches. All SIV exercises (with related data and descriptions) and statistical comparison tools are available via an online collaboration platform, and we encourage source modelers to use the SIV benchmarks for developing and testing new methods. We envision that the SIV efforts will lead to new developments for tackling the earthquake‐source imaging problem.
45 CFR 95.626 - Independent Verification and Validation.
Code of Federal Regulations, 2013 CFR
2013-10-01
... 45 Public Welfare 1 2013-10-01 2013-10-01 false Independent Verification and Validation. 95.626... (FFP) Specific Conditions for Ffp § 95.626 Independent Verification and Validation. (a) An assessment for independent verification and validation (IV&V) analysis of a State's system development effort may...
45 CFR 95.626 - Independent Verification and Validation.
Code of Federal Regulations, 2014 CFR
2014-10-01
... 45 Public Welfare 1 2014-10-01 2014-10-01 false Independent Verification and Validation. 95.626... (FFP) Specific Conditions for Ffp § 95.626 Independent Verification and Validation. (a) An assessment for independent verification and validation (IV&V) analysis of a State's system development effort may...
45 CFR 95.626 - Independent Verification and Validation.
Code of Federal Regulations, 2011 CFR
2011-10-01
... 45 Public Welfare 1 2011-10-01 2011-10-01 false Independent Verification and Validation. 95.626... (FFP) Specific Conditions for Ffp § 95.626 Independent Verification and Validation. (a) An assessment for independent verification and validation (IV&V) analysis of a State's system development effort may...
45 CFR 95.626 - Independent Verification and Validation.
Code of Federal Regulations, 2012 CFR
2012-10-01
... 45 Public Welfare 1 2012-10-01 2012-10-01 false Independent Verification and Validation. 95.626... (FFP) Specific Conditions for Ffp § 95.626 Independent Verification and Validation. (a) An assessment for independent verification and validation (IV&V) analysis of a State's system development effort may...
NASA Technical Reports Server (NTRS)
Liever, Peter A.; West, Jeffrey S.; Harris, Robert E.
2016-01-01
A hybrid Computational Fluid Dynamics and Computational Aero-Acoustics (CFD/CAA) modeling framework has been developed for launch vehicle liftoff acoustic environment predictions. The framework couples the existing highly-scalable NASA production CFD code, Loci/CHEM, with a high-order accurate Discontinuous Galerkin solver developed in the same production framework, Loci/THRUST, to accurately resolve and propagate acoustic physics across the entire launch environment. Time-accurate, Hybrid RANS/LES CFD modeling is applied for predicting the acoustic generation physics at the plume source, and a high-order accurate unstructured mesh Discontinuous Galerkin (DG) method is employed to propagate acoustic waves away from the source across large distances using high-order accurate schemes. The DG solver is capable of solving 2nd, 3rd, and 4th order Euler solutions for non-linear, conservative acoustic field propagation. Initial application testing and validation has been carried out against high resolution acoustic data from the Ares Scale Model Acoustic Test (ASMAT) series to evaluate the capabilities and production readiness of the CFD/CAA system to resolve the observed spectrum of acoustic frequency content. This paper presents results from this validation and outlines efforts to mature and improve the computational simulation framework.
Martyr-Koller, R.C.; Kernkamp, H.W.J.; Van Dam, Anne A.; Mick van der Wegen,; Lucas, Lisa; Knowles, N.; Jaffe, B.; Fregoso, T.A.
2017-01-01
A linked modeling approach has been undertaken to understand the impacts of climate and infrastructure on aquatic ecology and water quality in the San Francisco Bay-Delta region. The Delft3D Flexible Mesh modeling suite is used in this effort for its 3D hydrodynamics, salinity, temperature and sediment dynamics, phytoplankton and water-quality coupling infrastructure, and linkage to a habitat suitability model. The hydrodynamic model component of the suite is D-Flow FM, a new 3D unstructured finite-volume model based on the Delft3D model. In this paper, D-Flow FM is applied to the San Francisco Bay-Delta to investigate tidal, seasonal and annual dynamics of water levels, river flows and salinity under historical environmental and infrastructural conditions. The model is driven by historical winds, tides, ocean salinity, and river flows, and includes federal, state, and local freshwater withdrawals, and regional gate and barrier operations. The model is calibrated over a 9-month period, and subsequently validated for water levels, flows, and 3D salinity dynamics over a 2-year period. Model performance was quantified using several model assessment metrics and visualized through target diagrams. These metrics indicate that the model accurately estimated water levels, flows, and salinity over wide-ranging tidal and fluvial conditions, and the model can be used to investigate detailed circulation and salinity patterns throughout the Bay-Delta. The hydrodynamics produced through this effort will be used to drive affiliated sediment, phytoplankton, and contaminant hindcast efforts and habitat suitability assessments for fish and bivalves. The modeling framework applied here will serve as a baseline to ultimately shed light on potential ecosystem change over the current century.
NASA Astrophysics Data System (ADS)
Wi, S.; Ray, P. A.; Brown, C.
2015-12-01
A software package developed to facilitate building distributed hydrologic models in a modular modeling system is presented. The software package provides a user-friendly graphical user interface that eases its practical use in water resources-related research and practice. The modular modeling system organizes the options available to users when assembling models according to the stages of hydrological cycle, such as potential evapotranspiration, soil moisture accounting, and snow/glacier melting processes. The software is intended to be a comprehensive tool that simplifies the task of developing, calibrating, validating, and using hydrologic models through the inclusion of intelligent automation to minimize user effort, and reduce opportunities for error. Processes so far automated include the definition of system boundaries (i.e., watershed delineation), climate and geographical input generation, and parameter calibration. Built-in post-processing toolkits greatly improve the functionality of the software as a decision support tool for water resources system management and planning. Example post-processing toolkits enable streamflow simulation at ungauged sites with predefined model parameters, and perform climate change risk assessment by means of the decision scaling approach. The software is validated through application to watersheds representing a variety of hydrologic regimes.
Computational Fluid Dynamics Technology for Hypersonic Applications
NASA Technical Reports Server (NTRS)
Gnoffo, Peter A.
2003-01-01
Several current challenges in computational fluid dynamics and aerothermodynamics for hypersonic vehicle applications are discussed. Example simulations are presented from code validation and code benchmarking efforts to illustrate capabilities and limitations. Opportunities to advance the state of the art in algorithms, grid generation and adaptation, and code validation are identified. Highlights of diverse efforts to address these challenges are then discussed. One such effort to re-engineer and synthesize the existing analysis capability in LAURA, VULCAN, and FUN3D will provide context for these discussions. The critical (and evolving) role of agile software engineering practice in the capability enhancement process is also noted.
Simulating Small-Scale Experiments of In-Tunnel Airblast Using STUN and ALE3D
DOE Office of Scientific and Technical Information (OSTI.GOV)
Neuscamman, Stephanie; Glenn, Lewis; Schebler, Gregory
2011-09-12
This report details continuing validation efforts for the Sphere and Tunnel (STUN) and ALE3D codes. STUN has been validated previously for blast propagation through tunnels using several sets of experimental data with varying charge sizes and tunnel configurations, including the MARVEL nuclear driven shock tube experiment (Glenn, 2001). The DHS-funded STUNTool version is compared to experimental data and the LLNL ALE3D hydrocode. In this particular study, we compare the performance of the STUN and ALE3D codes in modeling an in-tunnel airblast to experimental results obtained by Lunderman and Ohrt (1997) in a series of small-scale high-explosive experiments.
NASA Technical Reports Server (NTRS)
Walker, Eric L.
2005-01-01
Wind tunnel experiments will continue to be a primary source of validation data for many types of mathematical and computational models in the aerospace industry. The increased emphasis on accuracy of data acquired from these facilities requires understanding of the uncertainty of not only the measurement data but also any correction applied to the data. One of the largest and most critical corrections made to these data is due to wall interference. In an effort to understand the accuracy and suitability of these corrections, a statistical validation process for wall interference correction methods has been developed. This process is based on the use of independent cases which, after correction, are expected to produce the same result. Comparison of these independent cases with respect to the uncertainty in the correction process establishes a domain of applicability based on the capability of the method to provide reasonable corrections with respect to customer accuracy requirements. The statistical validation method was applied to the version of the Transonic Wall Interference Correction System (TWICS) recently implemented in the National Transonic Facility at NASA Langley Research Center. The TWICS code generates corrections for solid and slotted wall interference in the model pitch plane based on boundary pressure measurements. Before validation could be performed on this method, it was necessary to calibrate the ventilated wall boundary condition parameters. Discrimination comparisons are used to determine the most representative of three linear boundary condition models which have historically been used to represent longitudinally slotted test section walls. Of the three linear boundary condition models implemented for ventilated walls, the general slotted wall model was the most representative of the data. The TWICS code using the calibrated general slotted wall model was found to be valid to within the process uncertainty for test section Mach numbers less than or equal to 0.60. The scatter among the mean corrected results of the bodies of revolution validation cases was within one count of drag on a typical transport aircraft configuration for Mach numbers at or below 0.80 and two counts of drag for Mach numbers at or below 0.90.
Assessing Requirements Quality through Requirements Coverage
NASA Technical Reports Server (NTRS)
Rajan, Ajitha; Heimdahl, Mats; Woodham, Kurt
2008-01-01
In model-based development, the development effort is centered around a formal description of the proposed software system: the model. This model is derived from some high-level requirements describing the expected behavior of the software. For validation and verification purposes, this model can then be subjected to various types of analysis, for example, completeness and consistency analysis [6], model checking [3], theorem proving [1], and test-case generation [4, 7]. This development paradigm is making rapid inroads in certain industries, e.g., automotive, avionics, space applications, and medical technology. This shift towards model-based development naturally leads to changes in the verification and validation (V&V) process. The model validation problem, that is, determining that the model accurately captures the customer's high-level requirements, has received little attention, and the sufficiency of the validation activities has been largely determined through ad-hoc methods. Since the model serves as the central artifact, its correctness with respect to the user's needs is absolutely crucial. In our investigation, we attempt to answer the following two questions with respect to validation: (1) Are the requirements sufficiently defined for the system? and (2) How well does the model implement the behaviors specified by the requirements? The second question can be addressed using formal verification. Nevertheless, the size and complexity of many industrial systems make formal verification infeasible even if we have a formal model and formalized requirements. Thus, presently, there is no objective way of answering these two questions. To this end, we propose an approach based on testing that, when given a set of formal requirements, explores the relationship between requirements-based structural test-adequacy coverage and model-based structural test-adequacy coverage. The proposed technique uses requirements coverage metrics defined in [9] on formal high-level software requirements and existing model coverage metrics such as the Modified Condition and Decision Coverage (MC/DC) used when testing highly critical software in the avionics industry [8]. Our work is related to Chockler et al. [2], but we base our work on traditional testing techniques as opposed to verification techniques.
Turner, Todd J.; Shade, Paul A.; Bernier, Joel V.; ...
2016-11-18
High-Energy Diffraction Microscopy (HEDM) is a 3-d x-ray characterization method that is uniquely suited to measuring the evolving micromechanical state and microstructure of polycrystalline materials during in situ processing. The near-field and far-field configurations provide complementary information; orientation maps computed from the near-field measurements provide grain morphologies, while the high angular resolution of the far-field measurements provides intergranular strain tensors. The ability to measure these data during deformation in situ makes HEDM an ideal tool for validating micro-mechanical deformation models that make their predictions at the scale of individual grains. Crystal Plasticity Finite Element Models (CPFEM) are one such class of micro-mechanical models. While there have been extensive studies validating homogenized CPFEM response at a macroscopic level, a lack of detailed data measured at the level of the microstructure has hindered more stringent model validation efforts. We utilize an HEDM dataset from an alpha-titanium alloy (Ti-7Al), collected at the Advanced Photon Source, Argonne National Laboratory, under in situ tensile deformation. The initial microstructure of the central slab of the gage section, measured via near-field HEDM, is used to inform a CPFEM model. The predicted intergranular stresses for 39 internal grains are then directly compared to data from 4 far-field measurements taken between ~4% and ~80% of the macroscopic yield strength. In conclusion, the intergranular stresses from the CPFEM model and far-field HEDM measurements up to incipient yield are shown to be in good agreement, and implications for application of such an integrated computational/experimental approach to phenomena such as fatigue and crack propagation are discussed.
An Empirical Exploration Into the Measurement of Rape Culture.
Johnson, Nicole L; Johnson, Dawn M
2017-09-01
Feminist scholars have long argued the presence of a "rape culture" within the United States; however, limited efforts have been made to quantify this construct. A model of rape culture was first proposed in 1980 and expanded in the 1990s in an effort to quantify rape myth acceptance. This model posits that five underlying components make up a rape culture: traditional gender roles, sexism, adversarial sexual beliefs, hostility toward women, and acceptance of violence. Although these components are proposed as cultural phenomena and thus distinct from individually held beliefs, they have been exclusively explored on an individual level. Thus, to promote exploration of this phenomenon beyond individually held beliefs, the authors adapted a series of well-established measures to assess the perceived peer support of the constructs proposed to underlie rape culture and assessed initial reliability and validity in a sample of 314 college students. Following determination of reliability and validity of these adapted measures, a hierarchical confirmatory factor analysis was run to examine the proposed model of rape culture. Results of this study highlight the distinction between individual and cultural factors, as several items did not translate from an individual (i.e., personal endorsement) to a cultural level (i.e., perceived peer support) and were subsequently removed from the proposed final measurements. Furthermore, initial support for the aforementioned model of rape culture was identified. These findings are crucial given that limited conclusions may be drawn about the existence, and in turn eradication, of rape culture without an agreed-upon definition and source of measurement.
Snyder, Hannah R.; Gulley, Lauren D.; Bijttebier, Patricia; Hartman, Catharina A.; Oldehinkel, Albertine J.; Mezulis, Amy; Young, Jami F.; Hankin, Benjamin L.
2015-01-01
Temperament is associated with important outcomes in adolescence, including academic and interpersonal functioning and psychopathology. Rothbart’s temperament model is among the most well-studied and supported approaches to adolescent temperament, and contains three main components: positive emotionality (PE), negative emotionality (NE), and effortful control (EC). However, the latent factor structure of Rothbart’s temperament measure for adolescents, the Early Adolescent Temperament Questionnaire Revised (EATQ-R, Ellis & Rothbart, 2001) has not been definitively established. To address this problem and investigate links between adolescent temperament and functioning, we used confirmatory factor analysis to examine the latent constructs of the EATQ-R in a large combined sample. For EC and NE, bifactor models consisting of a common factor plus specific factors for some sub-facets of each component fit best, providing a more nuanced understanding of these temperament dimensions. The nature of the PE construct in the EATQ-R is less clear. Models replicated in a hold-out dataset. The common components of high NE and low EC were broadly associated with increased psychopathology symptoms, and poor interpersonal and school functioning, while specific components of NE were further associated with corresponding specific components of psychopathology. Further questioning the construct validity of PE as measured by the EATQ-R, PE factors did not correlate with construct validity measures in a way consistent with theories of PE. Bringing consistency to the way the EATQ-R is modeled and using purer latent variables has the potential to advance the field in understanding links between dimensions of temperament and important outcomes of adolescent development. PMID:26011660
Satellite Remote Sensing is Key to Water Cycle Integrator
NASA Astrophysics Data System (ADS)
Koike, T.
2016-12-01
To promote effective multi-sectoral, interdisciplinary collaboration based on coordinated and integrated efforts, the Global Earth Observation System of Systems (GEOSS) is now developing a "GEOSS Water Cycle Integrator (WCI)", which integrates "Earth observations", "modeling", "data and information", "management systems" and "education systems". GEOSS/WCI sets up "work benches" by which partners can share data, information and applications in an interoperable way, exchange knowledge and experiences, deepen mutual understanding and work together effectively to ultimately respond to issues of both mitigation and adaptation. (A work bench is a virtual geographical or phenomenological space where experts and managers collaborate to use information to address a problem within that space). GEOSS/WCI enhances the coordination of efforts to strengthen individual, institutional and infrastructure capacities, especially for effective interdisciplinary coordination and integration. GEOSS/WCI archives various satellite data to provide various hydrological information such as cloud, rainfall, soil moisture, or land-surface snow. These satellite products were validated using land observation in-situ data. Water cycle models can be developed by coupling in-situ and satellite data. River flows and other hydrological parameters can be simulated and validated by in-situ data. Model outputs from weather-prediction, seasonal-prediction, and climate-prediction models are archived. Some of these model outputs are archived on an online basis, but other models, e.g., climate-prediction models are archived on an offline basis. After models are evaluated and biases corrected, the outputs can be used as inputs into the hydrological models for predicting the hydrological parameters. Additionally, we have already developed a data-assimilation system by combining satellite data and the models. This system can improve our capability to predict hydrological phenomena. The WCI can provide better predictions of the hydrological parameters for integrated water resources management (IWRM) and also assess the impact of climate change and calculate adaptation needs.
Velderraín, José Dávila; Martínez-García, Juan Carlos; Álvarez-Buylla, Elena R
2017-01-01
Mathematical models based on dynamical systems theory are well-suited tools for the integration of available molecular experimental data into coherent frameworks in order to propose hypotheses about the cooperative regulatory mechanisms driving developmental processes. Computational analysis of the proposed models using well-established methods enables testing the hypotheses by contrasting predictions with observations. Within such framework, Boolean gene regulatory network dynamical models have been extensively used in modeling plant development. Boolean models are simple and intuitively appealing, ideal tools for collaborative efforts between theorists and experimentalists. In this chapter we present protocols used in our group for the study of diverse plant developmental processes. We focus on conceptual clarity and practical implementation, providing directions to the corresponding technical literature.
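For orientation, a minimal sketch of the Boolean network formalism the chapter builds on is given below; the three-gene network, its update rules, and the synchronous updating scheme are illustrative assumptions, not the protocols described by the authors.

```python
from itertools import product

# Hypothetical three-gene Boolean network; each rule maps the current
# state to the next value of one gene.
rules = {
    "A": lambda s: s["C"] and not s["B"],   # A activated by C, repressed by B
    "B": lambda s: s["A"],                   # B follows A
    "C": lambda s: not s["B"],               # C repressed by B
}

def step(state):
    """Synchronous update: every gene reads the same current state."""
    return {g: int(rule(state)) for g, rule in rules.items()}

def attractor(state, max_steps=64):
    """Iterate until a previously visited state recurs; return the cycle."""
    seen = []
    while state not in seen and len(seen) < max_steps:
        seen.append(state)
        state = step(state)
    return seen[seen.index(state):]

# Enumerate the attractor reached from every initial condition.
for bits in product([0, 1], repeat=3):
    init = dict(zip("ABC", bits))
    print(bits, "->", attractor(init))
```

Attractors of such a network are then compared against the cell types or stable expression patterns observed experimentally.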
A revised model of fluid transport optimization in Physarum polycephalum.
Bonifaci, Vincenzo
2017-02-01
Optimization of fluid transport in the slime mold Physarum polycephalum has been the subject of several modeling efforts in recent literature. Existing models assume that the tube adaptation mechanism in P. polycephalum's tubular network is controlled by the sheer amount of fluid flow through the tubes. We put forward the hypothesis that the controlling variable may instead be the flow's pressure gradient along the tube. We carry out the stability analysis of such a revised mathematical model for a parallel-edge network, proving that the revised model supports the global flow-optimizing behavior of the slime mold for a substantially wider class of response functions compared to previous models. Simulations also suggest that the same conclusion may be valid for arbitrary network topologies.
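To make the revised adaptation rule concrete, the following sketch simulates tube conductivities on a parallel-edge network with the response driven by the pressure gradient along each tube; the lengths, response function, and parameter values are assumptions for illustration, not the paper's equations or settings.

```python
import numpy as np

# Hypothetical parallel-edge network: m tubes joining the same two nodes.
m = 4
L = np.array([1.0, 1.2, 1.5, 2.0])   # tube lengths (assumed)
D = np.full(m, 0.5)                   # initial conductivities (assumed)
I0 = 1.0                              # total flow forced through the network
mu, dt = 2.0, 0.01                    # response exponent and time step (assumed)

def response(g, mu):
    # Saturating response to the pressure gradient g = |dp| / L
    return g**mu / (1.0 + g**mu)

for _ in range(20000):
    dp = I0 / np.sum(D / L)           # pressure drop so that edge flows sum to I0
    grad = dp / L                     # pressure gradient along each tube
    # Revised adaptation rule: growth driven by the gradient, linear decay
    D += dt * (response(grad, mu) - D)

Q = D * (I0 / np.sum(D / L)) / L
print("final conductivities:", np.round(D, 3))
print("final flows:", np.round(Q, 3), "(sum =", round(Q.sum(), 3), ")")
```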
NASA Astrophysics Data System (ADS)
Liu, Chi; Ye, Rui; Lian, Liping; Song, Weiguo; Zhang, Jun; Lo, Siuming
2018-05-01
In the context of global aging, how to design traffic facilities for a population with a different age composition is of high importance. For this purpose, we propose a model based on the least effort principle to simulate heterogeneous pedestrian flow. In the model, the pedestrian is represented by a three-disc shaped agent. We add a new parameter to realize pedestrians' preference to avoid changing their direction of movement too quickly. The model is validated with numerous experimental data on unidirectional pedestrian flow. In addition, we investigate the influence of corridor width and velocity distribution of crowds on unidirectional heterogeneous pedestrian flow. The simulation results reflect that widening corridors could increase the specific flow for the crowd composed of two kinds of pedestrians with significantly different free velocities. Moreover, compared with a unified crowd, the crowd composed of pedestrians with great mobility differences requires a wider corridor to attain the same traffic efficiency. This study could be beneficial in providing a better understanding of heterogeneous pedestrian flow, and quantified outcomes could be applied in traffic facility design.
A classification procedure for the effective management of changes during the maintenance process
NASA Technical Reports Server (NTRS)
Briand, Lionel C.; Basili, Victor R.
1992-01-01
During software operation, maintainers are often faced with numerous change requests. Given available resources such as effort and calendar time, changes, if approved, have to be planned to fit within budget and schedule constraints. In this paper, we address the issue of assessing the difficulty of a change based on known or predictable data. This paper should be considered as a first step towards the construction of customized economic models for maintainers. In it, we propose a modeling approach, based on regular statistical techniques, that can be used in a variety of software maintenance environments. The approach can be easily automated, and is simple for people with limited statistical experience to use. Moreover, it deals effectively with the uncertainty usually associated with both model inputs and outputs. The modeling approach is validated on a data set provided by NASA/GSFC which shows it was effective in classifying changes with respect to the effort involved in implementing them. Other advantages of the approach are discussed along with additional steps to improve the results.
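A generic stand-in for this kind of change-difficulty classification is sketched below using logistic regression on synthetic change attributes; the features, labels, and model are illustrative assumptions and do not reproduce the authors' regression technique or the NASA/GSFC data.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Made-up change-request attributes known (or predictable) before work starts:
# components touched, developer familiarity (0-1), size of change request (pages).
n = 200
X = np.column_stack([
    rng.integers(1, 15, n),   # components touched
    rng.random(n),            # familiarity
    rng.integers(1, 30, n),   # spec size
])
# Synthetic label: 1 = "difficult" change (high implementation effort)
y = (0.3 * X[:, 0] - 3.0 * X[:, 1] + 0.1 * X[:, 2] + rng.normal(0, 1, n) > 2.0).astype(int)

clf = LogisticRegression(max_iter=1000)
print("cross-validated accuracy:", cross_val_score(clf, X, y, cv=5).mean())
clf.fit(X, y)
print("class probabilities for a new change:", clf.predict_proba([[10, 0.2, 12]]))
```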
MOD3D: a model for incorporating MODTRAN radiative transfer into 3D simulations
NASA Astrophysics Data System (ADS)
Berk, Alexander; Anderson, Gail P.; Gossage, Brett N.
2001-08-01
MOD3D, a rapid and accurate radiative transport algorithm, is being developed for application to 3D simulations. MOD3D couples to optical property databases generated by the MODTRAN4 Correlated-k (CK) band model algorithm. The Beer's Law dependence of the CK algorithm provides for proper coupling of illumination and line-of-sight paths. Full 3D spatial effects are modeled by scaling and interpolating optical data to local conditions. A C++ version of MOD3D has been integrated into JMASS for calculation of path transmittances, thermal emission and single scatter solar radiation. Results from initial validation efforts are presented.
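The Beer's Law dependence of the correlated-k algorithm is what lets segment transmittances combine multiplicatively along illumination and line-of-sight paths; the sketch below illustrates only that property with invented absorption coefficients, not the MODTRAN4 band-model tables or MOD3D's interpolation scheme.

```python
import numpy as np

def transmittance(k, path_length):
    """Beer's Law: T = exp(-k * L) for absorption coefficient k (1/km)."""
    return np.exp(-k * path_length)

# Invented absorption coefficients for three path segments whose optical data
# have been scaled to different local conditions.
k_segments = np.array([0.05, 0.12, 0.02])   # 1/km, assumed values
L_segments = np.array([2.0, 1.5, 4.0])      # km, assumed values

# Segment transmittances combine by multiplication, which is what allows
# consistent coupling of separate path pieces.
T_total = np.prod(transmittance(k_segments, L_segments))
k_avg = k_segments @ L_segments / L_segments.sum()
T_direct = transmittance(k_avg, L_segments.sum())
print(f"segmented path transmittance: {T_total:.4f}")
print(f"path-averaged-k transmittance: {T_direct:.4f}")  # identical for pure Beer's Law
```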
Motivation: In recent years there have been several efforts to generate sensitivity profiles of collections of genomically characterized cell lines to panels of candidate therapeutic compounds. These data provide the basis for the development of in silico models of sensitivity based on cellular, genetic, or expression biomarkers of cancer cells. However, a remaining challenge is an efficient way to identify accurate sets of biomarkers to validate.
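One common way to shortlist candidate biomarkers for validation from such sensitivity profiles is cross-validated, sparsity-inducing regression; the sketch below uses synthetic data and is an illustrative assumption, not the approach proposed in this work.

```python
import numpy as np
from sklearn.linear_model import LassoCV

rng = np.random.default_rng(1)

# Synthetic data: 60 cell lines x 500 candidate expression features,
# with only 5 features truly driving drug sensitivity.
X = rng.normal(size=(60, 500))
true_idx = [3, 42, 100, 250, 499]
y = X[:, true_idx] @ np.array([2.0, -1.5, 1.0, 0.8, -2.2]) + rng.normal(0, 0.5, 60)

model = LassoCV(cv=5).fit(X, y)
selected = np.flatnonzero(model.coef_)   # nonzero coefficients = candidate biomarkers
print("features selected as candidate biomarkers:", selected)
print("in-sample R^2:", model.score(X, y))
```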
Blast Load Simulator Experiments for Computational Model Validation: Report 1
2016-08-01
This report documents experiments conducted in the Blast Load Simulator (BLS), a highly tunable compressed-gas-driven simulator, to evaluate its suitability for a future effort involving the inclusion of non-responding box-type structures in a BLS-simulated blast environment. Preliminary testing indicated that inclusion of the grill and diaphragm striker resulted in a decrease in peak pressure of about 12 ...
Technology Transfer Challenges for High-Assurance Software Engineering Tools
NASA Technical Reports Server (NTRS)
Koga, Dennis (Technical Monitor); Penix, John; Markosian, Lawrence Z.
2003-01-01
In this paper, we describe our experience with the challenges that we are currently facing in our effort to develop advanced software verification and validation tools. We categorize these challenges into several areas: cost benefits modeling, tool usability, customer application domain, and organizational issues. We provide examples of challenges in each area and identify open research issues in areas which limit our ability to transfer high-assurance software engineering tools into practice.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fox, K. M.; Edwards, T. B.; Riley, W. T.
In this report, the Savannah River National Laboratory provides chemical analyses and Product Consistency Test (PCT) results for several simulated low activity waste (LAW) glasses (designated as the January, March, and April 2015 LAW glasses) fabricated by the Pacific Northwest National Laboratory. The results of these analyses will be used as part of efforts to revise or extend the validation regions of the current Hanford Waste Treatment and Immobilization Plant glass property models to cover a broader span of waste compositions.
Validation plays the role of a "bridge" in connecting remote sensing research and applications
NASA Astrophysics Data System (ADS)
Wang, Zhiqiang; Deng, Ying; Fan, Yida
2018-07-01
Remote sensing products contribute to improving earth observations over space and time. Uncertainties exist in products of different levels; thus, validation of these products before and during their applications is critical. This study discusses the meaning of validation in depth and proposes a new definition of reliability for use with such products. In this context, validation should include three aspects: a description of the relevant uncertainties, quantitative measurement results and a qualitative judgment that considers the needs of users. A literature overview is then presented evidencing improvements in the concepts associated with validation. It shows that the root mean squared error (RMSE) is widely used to express accuracy; increasing numbers of remote sensing products have been validated; research institutes contribute most validation efforts; and sufficient validation studies encourage the application of remote sensing products. Validation plays a connecting role in the distribution and application of remote sensing products. Validation connects simple remote sensing subjects with other disciplines, and it connects primary research with practical applications. Based on the above findings, it is suggested that validation efforts that include wider cooperation among research institutes and full consideration of the needs of users should be promoted.
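Since RMSE is noted as the most widely used accuracy measure in these validation studies, a minimal computation with invented product and in situ reference values is shown below.

```python
import numpy as np

# Invented example: remote sensing product estimates vs. in situ reference values
product = np.array([0.31, 0.27, 0.40, 0.22, 0.35])
reference = np.array([0.29, 0.30, 0.37, 0.25, 0.33])

rmse = np.sqrt(np.mean((product - reference) ** 2))   # root mean squared error
bias = np.mean(product - reference)                    # mean error (systematic offset)
print(f"RMSE = {rmse:.3f}, bias = {bias:.3f}")
```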
Manufacturing data analytics using a virtual factory representation.
Jain, Sanjay; Shao, Guodong; Shin, Seung-Jun
2017-01-01
Large manufacturers have been using simulation to support decision-making for design and production. However, with the advancement of technologies and the emergence of big data, simulation can be utilised to perform and support data analytics for associated performance gains. This requires not only significant model development expertise, but also huge data collection and analysis efforts. This paper presents an approach within the frameworks of Design Science Research Methodology and prototyping to address the challenge of increasing the use of modelling, simulation and data analytics in manufacturing via reduction of the development effort. Manufacturing simulation models are presented both as data analytics applications in their own right and as support for other data analytics applications, serving as data generators and as a tool for validation. The virtual factory concept is presented as the vehicle for manufacturing modelling and simulation. The virtual factory goes beyond traditional simulation models of factories to include multi-resolution modelling capabilities, thus allowing analysis at varying levels of detail. A path is proposed for implementation of the virtual factory concept that builds on developments in technologies and standards. A virtual machine prototype is provided as a demonstration of the use of a virtual representation for manufacturing data analytics.
Effective prediction of biodiversity in tidal flat habitats using an artificial neural network.
Yoo, Jae-Won; Lee, Yong-Woo; Lee, Chang-Gun; Kim, Chang-Soo
2013-02-01
Accurate predictions of benthic macrofaunal biodiversity greatly benefit the efficient planning and management of habitat restoration efforts in tidal flat habitats. Artificial neural network (ANN) prediction models for such biodiversity were developed and tested based on 13 biophysical variables, collected from 50 sites of tidal flats along the coast of Korea during 1991-2006. The developed model showed high predictive accuracy during training, cross-validation and testing. Beyond the training and testing procedures, an independent dataset from a different time period (2007-2010) was used to test the robustness and practical usability of the model. High predictive accuracy on the independent dataset (r = 0.84) confirmed that the network had properly learned the predictive relationship and that it generalizes. Key influential variables identified by follow-up sensitivity analyses were related to topographic dimension, environmental heterogeneity, and water column properties. The study demonstrates the successful application of ANN for the accurate prediction of benthic macrofaunal biodiversity and for understanding the dynamics of candidate variables. Copyright © 2012 Elsevier Ltd. All rights reserved.
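The train / cross-validate / test / independent-period workflow described above can be sketched generically as follows; the feature matrix, network size and data splits are assumptions, not the study's configuration.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(7)

# Synthetic stand-in for 13 biophysical predictors and a biodiversity index.
X = rng.normal(size=(300, 13))
y = X[:, 0] - 0.5 * X[:, 3] + 0.3 * X[:, 7] ** 2 + rng.normal(0, 0.3, 300)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
scaler = StandardScaler().fit(X_train)

ann = MLPRegressor(hidden_layer_sizes=(16,), max_iter=5000, random_state=0)
ann.fit(scaler.transform(X_train), y_train)
print("test R^2:", ann.score(scaler.transform(X_test), y_test))

# "Independent dataset" check: a later survey period the model never saw.
X_new = rng.normal(size=(80, 13))
y_new = X_new[:, 0] - 0.5 * X_new[:, 3] + 0.3 * X_new[:, 7] ** 2 + rng.normal(0, 0.3, 80)
r = np.corrcoef(ann.predict(scaler.transform(X_new)), y_new)[0, 1]
print("correlation on independent data:", round(r, 2))
```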
Analysis and Ground Testing for Validation of the Inflatable Sunshield in Space (ISIS) Experiment
NASA Technical Reports Server (NTRS)
Lienard, Sebastien; Johnston, John; Adams, Mike; Stanley, Diane; Alfano, Jean-Pierre; Romanacci, Paolo
2000-01-01
The Next Generation Space Telescope (NGST) design requires a large sunshield to protect the large aperture mirror and instrument module from constant solar exposure at its L2 orbit. The structural dynamics of the sunshield must be modeled in order to predict disturbances to the observatory attitude control system and gauge effects on line-of-sight jitter. Models of large, non-linear membrane systems are not well understood and have not been successfully demonstrated. To answer questions about sunshield dynamic behavior and demonstrate controlled deployment, the NGST project is flying a Pathfinder experiment, the Inflatable Sunshield in Space (ISIS). This paper discusses in detail the modeling and ground-testing efforts performed at the Goddard Space Flight Center to: validate analytical tools for characterizing the dynamic behavior of the deployed sunshield, qualify the experiment for the Space Shuttle, and verify the functionality of the system. Included in the discussion will be test parameters, test setups, problems encountered, and test results.
Abe Silverstein 10- by 10-Foot Supersonic Wind Tunnel Validated for Low-Speed (Subsonic) Operation
NASA Technical Reports Server (NTRS)
Hoffman, Thomas R.
2001-01-01
The NASA Glenn Research Center and Lockheed Martin Corporation tested an aircraft model in two wind tunnels to compare low-speed (subsonic) flow characteristics. Objectives of the test were to determine and document the similarities and uniqueness of the tunnels and to validate that Glenn's 10- by 10-Foot Supersonic Wind Tunnel (10x10 SWT) is a viable low-speed test facility. Results from two of Glenn's wind tunnels compare very favorably and show that the 10x10 SWT is a viable low-speed wind tunnel. The Subsonic Comparison Test was a joint effort by NASA and Lockheed Martin using Lockheed Martin's Joint Strike Fighter Concept Demonstration Aircraft model. Although Glenn's 10x10 and 8x6 SWTs have many similarities, they also have unique characteristics. Therefore, test data were collected for multiple model configurations at various vertical locations in the test section, starting at the test section centerline and extending into the ceiling and floor boundary layers.
ECP Bone Workshop Day 2, Session 1: Validation of Exercise Countermeasures
NASA Technical Reports Server (NTRS)
Myers, Jerry G.
2007-01-01
The thesis of this session of the ECP Bone workshop is that computer modeling is required in order to evaluate factor of risk for fracture when considering the uniquely localized bone loss conditions experienced by Astronauts. This session provides an opportunity to introduce the Integrated Medical Model Bone Fracture Risk (IMM-BFxRM) simulation approach and how this and other models improve understanding of the effects of exercise countermeasures. This workshop session also provides an opportunity for the panel to provide recommendations on this and other "complex modeling" approaches, as well as the importance of funding the IMM-BFxRM and companion efforts by external scientists (Lang and Keyak).
NASA Technical Reports Server (NTRS)
1974-01-01
After the simplified version of the 41-Node Stolwijk Metabolic Man Model was implemented on the Sigma 3 and UNIVAC 1110 computers in batch mode, it became desirable to make certain revisions. First, the availability of time-sharing terminals makes it possible to provide the capability and flexibility of conversational interaction between user and model. Secondly, recent physiological studies show the need to revise certain parameter values contained in the model. Thirdly, it was desired to make quantitative and accurate predictions of evaporative water loss for humans in an orbiting space station. The results of the first phase of this effort are reported.
MESUR: USAGE-BASED METRICS OF SCHOLARLY IMPACT
DOE Office of Scientific and Technical Information (OSTI.GOV)
BOLLEN, JOHAN; RODRIGUEZ, MARKO A.; VAN DE SOMPEL, HERBERT
2007-01-30
The evaluation of scholarly communication items is now largely a matter of expert opinion or metrics derived from citation data. Both approaches can fail to take into account the myriad of factors that shape scholarly impact. Usage data has emerged as a promising complement to existing methods of assessment, but the formal groundwork to reliably and validly apply usage-based metrics of scholarly impact is lacking. The Andrew W. Mellon Foundation-funded MESUR project constitutes a systematic effort to define, validate and cross-validate a range of usage-based metrics of scholarly impact by creating a semantic model of the scholarly communication process. The constructed model will serve as the basis for creating a large-scale semantic network that seamlessly relates citation, bibliographic and usage data from a variety of sources. A subsequent program that uses the established semantic network as a reference data set will determine the characteristics and semantics of a variety of usage-based metrics of scholarly impact. This paper outlines the architecture and methodology adopted by the MESUR project and its future direction.
NASA Technical Reports Server (NTRS)
Baumeister, Joseph F.
1994-01-01
A non-flowing, electrically heated test rig was developed to verify computer codes that calculate radiant energy propagation from nozzle geometries that represent aircraft propulsion nozzle systems. Since there are a variety of analysis tools used to evaluate thermal radiation propagation from partially enclosed nozzle surfaces, an experimental benchmark test case was developed for code comparison. This paper briefly describes the nozzle test rig and the developed analytical nozzle geometry used to compare the experimental and predicted thermal radiation results. A major objective of this effort was to make available the experimental results and the analytical model in a format to facilitate conversion to existing computer code formats. For code validation purposes this nozzle geometry represents one validation case for one set of analysis conditions. Since each computer code has advantages and disadvantages based on scope, requirements, and desired accuracy, the usefulness of this single nozzle baseline validation case can be limited for some code comparisons.
Suminski, Richard R; Robertson, Robert J; Goss, Fredric L; Olvera, Norma
2008-08-01
Whether the translation of verbal descriptors from English to Spanish affects the validity of the Children's OMNI Scale of Perceived Exertion is not known, so the validity of a Spanish version of the OMNI was examined with 32 boys and 36 girls (9 to 12 years old) for whom Spanish was the primary language. Oxygen consumption, ventilation, respiratory rate, respiratory exchange ratio, heart rate, and ratings of perceived exertion for the overall body (RPE-O) were measured during an incremental treadmill test. All response values displayed significant linear increases across test stages. The linear regression analyses indicated RPE-O values were distributed as positive linear functions of oxygen consumption, ventilation, respiratory rate, respiratory exchange ratio, heart rate, and percent of maximal oxygen consumption. All regression models were statistically significant. The Spanish OMNI Scale is valid for estimating exercise effort during walking and running amongst Hispanic youth whose primary language is Spanish.
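The validity evidence rests on RPE-O rising as a positive linear function of the physiological measures; a minimal regression sketch with invented stage values is shown below.

```python
import numpy as np

# Invented example values across incremental treadmill stages:
vo2 = np.array([12.0, 18.0, 24.0, 31.0, 38.0])   # oxygen consumption, ml/kg/min
rpe = np.array([1.0, 3.0, 4.5, 7.0, 9.0])        # OMNI ratings, 0-10 scale

slope, intercept = np.polyfit(vo2, rpe, 1)        # least-squares linear fit
r = np.corrcoef(vo2, rpe)[0, 1]
print(f"RPE-O = {slope:.2f} * VO2 + {intercept:.2f}, r = {r:.2f}")
```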
Development and validation of measures to assess prevention and control of AMR in hospitals.
Flanagan, Mindy; Ramanujam, Rangaraj; Sutherland, Jason; Vaughn, Thomas; Diekema, Daniel; Doebbeling, Bradley N
2007-06-01
The rapid spread of antimicrobial resistance (AMR) in US hospitals poses serious quality and safety problems. Expert panels, identifying strategies for optimizing antibiotic use and preventing AMR spread, have recommended that hospitals undertake efforts to implement specific evidence-based practices. The objective was to develop and validate a measurement scale for assessing hospitals' efforts to implement recommended AMR prevention and control measures. Surveys were mailed to infection control professionals in a national sample of 670 US hospitals stratified by geographic region, bed size, teaching status, and VA affiliation. Four hundred forty-eight infection control professionals participated (67% response rate). Survey items measured implementation of guideline recommendations, practices for AMR monitoring and feedback, AMR-related outcomes (methicillin-resistant Staphylococcus aureus [MRSA] prevalence and outbreaks), and organizational features. "Derivation" and "validation" samples were randomly selected. Exploratory factor analysis was performed to identify factors underlying AMR prevention and control efforts. Multiple methods were used for validation. We identified 4 empirically distinct factors in AMR prevention and control: (1) practices for antimicrobial prescription/use, (2) information/resources for AMR control, (3) practices for isolating infected patients, and (4) organizational support for infection control policies. The Prevention and Control of Antimicrobial Resistance scale was reliable and had content and construct validity. MRSA prevalence was significantly lower in hospitals with higher resource/information availability and broader organizational support. The Prevention and Control of Antimicrobial Resistance scale offers a simple yet discriminating assessment of AMR prevention and control efforts. Its use should complement assessment methods based exclusively on AMR outcomes.
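As a generic illustration of the exploratory factor analysis step used to derive such a scale (synthetic survey responses and an assumed four-factor structure, not the study's data or item set), one might write:

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(3)

# Synthetic survey: 448 respondents x 12 items generated from 4 latent factors.
n_items, n_factors = 12, 4
loadings = rng.normal(0, 1, (n_items, n_factors)) * (rng.random((n_items, n_factors)) < 0.3)
scores = rng.normal(size=(448, n_factors))
items = scores @ loadings.T + rng.normal(0, 0.5, (448, n_items))

fa = FactorAnalysis(n_components=n_factors, rotation="varimax").fit(items)
# Items loading strongly on the same factor would be grouped into one subscale.
print(np.round(fa.components_.T, 2))   # item-by-factor loading matrix
```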
Validation of a Thermo-Ablative Model of Elastomeric Internal Insulation Materials
NASA Technical Reports Server (NTRS)
Martin, Heath T.
2017-01-01
In thermo-ablative material modeling, as in many fields of analysis, the quality of the existing models significantly exceeds that of the experimental data required for their validation. In an effort to narrow this gap, a laboratory-scale internal insulation test bed was developed that exposes insulation samples to realistic solid rocket motor (SRM) internal environments while being instrumented to record real-time rates of both model inputs (i.e., chamber pressure, total surface heat flux, and radiative heat flux) as well as model outputs (i.e., material decomposition depths (MDDs) and in-depth material temperatures). In this work, the measured SRM internal environment parameters were used in conjunction with equilibrium thermochemistry codes as inputs to one-dimensional thermo-ablative models of the PBINBR and CFEPDM insulation samples used in the lab-scale test firings. The computed MDD histories were then compared with those deduced from real-time X-ray radiography of the insulation samples, and the calculated in-depth temperatures were compared with those measured by embedded thermocouples. The results of this exercise emphasize the challenges of modeling and testing elastomeric materials in SRM environments while illuminating the path forward to improved fidelity.
The International Reference Ionosphere - Climatological Standard for the Ionosphere
NASA Technical Reports Server (NTRS)
Bilitza, Dieter
2006-01-01
The International Reference Ionosphere (IRI), a joint project of URSI and COSPAR, is the de facto standard for a climatological specification of ionospheric parameters. IRI is based on a wide range of ground and space data and has been steadily improved since its inception in 1969 with the ever-increasing volume of ionospheric data and with better mathematical descriptions of the observed global and temporal variation patterns. The IRI model has been validated with a large amount of data including data from the most recent ionospheric satellites (KOMPSAT, ROCSAT and TIMED) and data from the global network of ionosondes. Several IRI teams are working on specific aspects of the IRI modeling effort including an improved representation of the topside ionosphere with a seamless transition to the plasmasphere, a new effort to represent the global variation of F2 peak parameters using the Neural Network (NN) technique, and the inclusion of several additional parameters in IRI, e.g., spread-F probability and ionospheric variability. Annual IRI workshops are the forum for discussions of these efforts and for all science activities related to IRI as well as applications of the IRI model in engineering and education. In this paper I will present a status report about the IRI effort with special emphasis on the presentations and results from the most recent IRI Workshops (Paris, 2004; Tortosa, 2005) and on the most important ongoing IRI activities. I will discuss the latest version of the IRI model, IRI-2006, highlighting the most recent changes and additions. Finally, the talk will review some of the applications of the IRI model with special emphasis on the use for radiowave propagation studies and communication purposes.
Effort-Reward Imbalance for Learning Is Associated with Fatigue in School Children
ERIC Educational Resources Information Center
Fukuda, Sanae; Yamano, Emi; Joudoi, Takako; Mizuno, Kei; Tanaka, Masaaki; Kawatani, Junko; Takano, Miyuki; Tomoda, Akemi; Imai-Matsumura, Kyoko; Miike, Teruhisa; Watanabe, Yasuyoshi
2010-01-01
We examined relationships among fatigue, sleep quality, and effort-reward imbalance for learning in school children. We developed an effort-reward for learning scale in school students and examined its reliability and validity. Self-administered surveys, including the effort reward for leaning scale and fatigue scale, were completed by 1,023…
NASA Astrophysics Data System (ADS)
Anderson, T.
2016-02-01
Ocean circulation forecasts can help answer questions regarding larval dispersal, passive movement of injured sea animals, oil spill mitigation, and search and rescue efforts. Circulation forecasts are often validated with GPS-tracked drifter paths, but how accurately do these drifters actually move with ocean currents? Drifters are not only moved by water, but are also forced by wind and waves acting on the exposed buoy and transmitter; this imperfect movement is referred to as drifter slip. The quantification and further understanding of drifter slip will allow scientists to differentiate between drifter imperfections and actual computer model error when comparing trajectory forecasts with actual drifter tracks. This will avoid falsely accrediting all discrepancies between a trajectory forecast and an actual drifter track to computer model error. During multiple deployments of drifters in Nantucket Sound and using observed wind and wave data, we attempt to quantify the slip of drifters developed by the Northeast Fisheries Science Center's (NEFSC) Student Drifters Program. While similar studies have been conducted previously, very few have directly attached current meters to drifters to quantify drifter slip. Furthermore, none have quantified slip of NEFSC drifters relative to the oceanographic-standard "CODE" drifter. The NEFSC drifter archive has over 1000 drifter tracks primarily off the New England coast. With a better understanding of NEFSC drifter slip, modelers can reliably use these tracks for model validation.
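A rough sketch of how slip can be quantified once a current meter is attached to a drifter is given below; the velocity and wind values are invented for illustration and this is not NEFSC's processing code.

```python
import numpy as np

# Hourly velocities in m/s (east, north); all values invented for illustration.
drifter_vel = np.array([[0.21, 0.05], [0.18, 0.09], [0.25, 0.02]])   # from GPS fixes
current_vel = np.array([[0.17, 0.06], [0.15, 0.08], [0.20, 0.01]])   # from attached meter
wind_vel = np.array([[6.0, 1.0], [5.0, 2.0], [7.5, 0.5]])            # observed wind

slip = drifter_vel - current_vel                   # motion not explained by the water
slip_speed = np.linalg.norm(slip, axis=1)
wind_speed = np.linalg.norm(wind_vel, axis=1)

# Simple check of whether slip scales with wind forcing (a common leeway assumption)
coef = np.polyfit(wind_speed, slip_speed, 1)[0]
print("mean slip speed (m/s):", slip_speed.mean().round(3))
print("slip per unit wind speed (~leeway factor):", round(coef, 4))
```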
Intuitive Logic Revisited: New Data and a Bayesian Mixed Model Meta-Analysis
Singmann, Henrik; Klauer, Karl Christoph; Kellen, David
2014-01-01
Recent research on syllogistic reasoning suggests that the logical status (valid vs. invalid) of even difficult syllogisms can be intuitively detected via differences in conceptual fluency between logically valid and invalid syllogisms when participants are asked to rate how much they like a conclusion following from a syllogism (Morsanyi & Handley, 2012). These claims of an intuitive logic are at odds with most theories on syllogistic reasoning, which posit that detecting the logical status of difficult syllogisms requires effortful and deliberate cognitive processes. We present new data replicating the effects reported by Morsanyi and Handley, but show that this effect is eliminated when controlling for a possible confound in terms of conclusion content. Additionally, we reanalyze three studies without this confound using a Bayesian mixed model meta-analysis (i.e., controlling for participant and item effects), which provides evidence for the null hypothesis and against Morsanyi and Handley's claim. PMID:24755777
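For readers unfamiliar with the approach, a loose sketch of a Bayesian random-effects meta-analysis is given below with invented per-study effect sizes; the published analysis additionally models participant and item effects, which this sketch omits.

```python
import numpy as np
import pymc as pm

# Invented per-study liking-difference effects (valid minus invalid) and standard errors.
effect = np.array([0.30, 0.05, -0.02])
se = np.array([0.10, 0.08, 0.09])

with pm.Model():
    mu = pm.Normal("mu", mu=0.0, sigma=1.0)        # overall effect
    tau = pm.HalfNormal("tau", sigma=0.5)          # between-study heterogeneity
    theta = pm.Normal("theta", mu=mu, sigma=tau, shape=len(effect))
    pm.Normal("obs", mu=theta, sigma=se, observed=effect)
    trace = pm.sample(1000, tune=1000, chains=2, progressbar=False)

# A posterior for mu concentrated near zero would favor the null over an intuitive-logic effect.
print(trace.posterior["mu"].mean().item())
```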
A verification and validation effort for high explosives at Los Alamos National Lab (u)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Scovel, Christina A; Menikoff, Ralph S
2009-01-01
We have started a project to verify and validate ASC codes used to simulate detonation waves in high explosives. Since there are no non-trivial analytic solutions, we are going to compare simulated results with experimental data that cover a wide range of explosive phenomena. The intent is to compare both different codes and different high explosives (HE) models. The first step is to test the products equation of state used for the HE models. For this purpose, the cylinder test, flyer plate, and plate-push experiments are being used. These experiments sample different regimes in thermodynamic phase space: the CJ isentrope for the cylinder tests, the isentrope behind an overdriven detonation wave for the flyer plate experiment, and expansion following a reflected CJ detonation for the plate-push experiment, which is sensitive to the Gruneisen coefficient. The results of our findings for PBX 9501 are presented here.
Optimal coordination and control of posture and movements.
Johansson, Rolf; Fransson, Per-Anders; Magnusson, Måns
2009-01-01
This paper presents a theoretical model of stability and coordination of posture and locomotion, together with algorithms for continuous-time quadratic optimization of motion control. Explicit solutions to the Hamilton-Jacobi equation for optimal control of rigid-body motion are obtained by solving an algebraic matrix equation. The stability is investigated with Lyapunov function theory and it is shown that global asymptotic stability holds. It is also shown how optimal control and adaptive control may act in concert in the case of unknown or uncertain system parameters. The solution describes motion strategies of minimum effort and variance. The proposed optimal control is formulated to be suitable as a posture and movement model for experimental validation and verification. The combination of adaptive and optimal control makes this algorithm a candidate for coordination and control of functional neuromuscular stimulation as well as of prostheses. Validation examples with experimental data are provided.
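For linear dynamics with a quadratic cost, the Hamilton-Jacobi equation reduces to an algebraic Riccati equation, which is the kind of matrix equation the abstract refers to. A minimal sketch of that step for a toy second-order plant is shown below; the plant and weighting matrices are illustrative and are not taken from the paper.

```python
import numpy as np
from scipy.linalg import solve_continuous_are

# Toy second-order "posture" plant: state x = [angle, angular rate].
A = np.array([[0.0, 1.0],
              [2.0, -0.5]])   # illustrative values only
B = np.array([[0.0],
              [1.0]])
Q = np.diag([10.0, 1.0])      # penalize deviation and velocity
R = np.array([[0.1]])         # penalize control effort

# For linear dynamics and quadratic cost, the Hamilton-Jacobi equation
# reduces to the algebraic Riccati equation A'P + PA - P B R^-1 B'P + Q = 0.
P = solve_continuous_are(A, B, Q, R)
K = np.linalg.solve(R, B.T @ P)   # optimal state-feedback gain, u = -K x
print("Feedback gain K:", K)
```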
Detection and Prediction of Hail Storms in Satellite Imagery using Deep Learning
NASA Astrophysics Data System (ADS)
Pullman, M.; Gurung, I.; Ramachandran, R.; Maskey, M.
2017-12-01
Natural hazards, such as damaging hail storms, dramatically disrupt both industry and agriculture, having significant socio-economic impacts in the United States. In 2016, hail was responsible for $3.5 billion and $23 million in damage to property and crops, respectively, making it the second costliest 2016 weather phenomenon in the United States. The destructive nature and high cost of hail storms have driven research into the development of more accurate hail-prediction algorithms in an effort to mitigate societal impacts. Recently, weather forecasting efforts have turned to deep learning neural networks because neural networks can more effectively model the complex, nonlinear, dynamical phenomena that exist in large datasets through multiple stages of transformation and representation. In an effort to improve hail-prediction techniques, we propose a deep learning technique that leverages satellite imagery to detect and predict the occurrence of hail storms. The technique is applied to satellite imagery from 2006 to 2016 for the contiguous United States and incorporates hail reports obtained from the National Center for Environmental Information Storm Events Database for training and validation purposes. In this presentation, we describe a novel approach to predicting hail via a neural network model that creates a large labeled dataset of hail storms, the accuracy and results of the model, and its applications for improving hail forecasting.
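The paper's architecture is not specified in the abstract; as a rough illustration of the general approach, a small convolutional classifier over labeled satellite patches might look like the following (layer sizes, band count, and patch size are placeholders).

```python
import tensorflow as tf

# Illustrative only: a small CNN classifier over labeled satellite image
# patches (hail / no hail). The paper's actual architecture, patch size,
# and band count are not specified in the abstract.
model = tf.keras.Sequential([
    tf.keras.layers.Conv2D(32, 3, activation="relu", input_shape=(64, 64, 4)),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(64, 3, activation="relu"),
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(1, activation="sigmoid"),   # P(hail in patch)
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
# model.fit(train_patches, train_labels, validation_data=(val_patches, val_labels))
```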
A Cellular Automata Model of Bone Formation
Van Scoy, Gabrielle K.; George, Estee L.; Asantewaa, Flora Opoku; Kerns, Lucy; Saunders, Marnie M.; Prieto-Langarica, Alicia
2017-01-01
Bone remodeling is an elegantly orchestrated process by which osteocytes, osteoblasts and osteoclasts function as a syncytium to maintain or modify bone. On the microscopic level, bone consists of cells that create, destroy and monitor the bone matrix. These cells interact in a coordinated manner to maintain a tightly regulated homeostasis. It is this regulation that is responsible for the observed increase in bone gain in the dominant arm of a tennis player and the observed increase in bone loss associated with spaceflight and osteoporosis. The manner in which these cells interact to bring about a change in bone quality and quantity has yet to be fully elucidated. But efforts to understand the multicellular complexity can ultimately lead to eradication of metabolic bone diseases such as osteoporosis and improved implant longevity. Experimentally validated mathematical models that simulate functional activity and offer eventual predictive capabilities hold tremendous potential in understanding multicellular bone remodeling. Here we undertake the initial challenge to develop a mathematical model of bone formation validated with in vitro data obtained from osteoblastic bone cells induced to mineralize and quantified at 26 days of culture. A cellular automata model was constructed to simulate the in vitro characterization. Permutation tests were performed to compare the distribution of the mineralization in the cultures and the distribution of the mineralization in the mathematical models. The results of the permutation test show that the distributions of mineralization from the characterization and the mathematical model come from the same probability distribution, thereby validating the cellular automata model. PMID:28189632
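A permutation test of the kind described compares an observed statistic against its distribution under random relabeling of the pooled samples. A minimal sketch, assuming the comparison is on mean mineralization (the paper's exact test statistic is not stated in the abstract):

```python
import numpy as np

def permutation_test(obs, sim, n_perm=10_000, rng=None):
    """Two-sample permutation test on the difference in means between
    observed mineralization values and cellular-automata output."""
    rng = np.random.default_rng(rng)
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    pooled = np.concatenate([obs, sim])
    observed_diff = obs.mean() - sim.mean()
    count = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)
        diff = pooled[:obs.size].mean() - pooled[obs.size:].mean()
        if abs(diff) >= abs(observed_diff):
            count += 1
    return (count + 1) / (n_perm + 1)   # two-sided p-value

# A large p-value is consistent with both samples coming from the same distribution.
```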
From QSAR to QSIIR: Searching for Enhanced Computational Toxicology Models
Zhu, Hao
2017-01-01
Quantitative Structure Activity Relationship (QSAR) is the most frequently used modeling approach to explore the dependency of biological, toxicological, or other types of activities/properties of chemicals on their molecular features. In the past two decades, QSAR modeling has been used extensively in the drug discovery process. However, the predictive models resulting from QSAR studies have limited use for chemical risk assessment, especially for animal and human toxicity evaluations, due to their low predictivity for new compounds. To develop enhanced toxicity models with independently validated external prediction power, novel modeling protocols have been pursued by computational toxicologists based on the rapidly increasing toxicity testing data of recent years. This chapter reviews the recent effort in our laboratory to incorporate biological testing results as descriptors in the toxicity modeling process. This effort extended the concept of QSAR to Quantitative Structure In vitro-In vivo Relationship (QSIIR). The QSIIR study examples provided in this chapter indicate that QSIIR models based on hybrid (biological and chemical) descriptors are indeed superior to conventional QSAR models based only on chemical descriptors for several animal toxicity endpoints. We believe that the applications introduced in this review will be of interest and value to researchers working in the field of computational drug discovery and environmental chemical risk assessment. PMID:23086837
NASA Astrophysics Data System (ADS)
Adamson, E. T.; Pizzo, V. J.; Biesecker, D. A.; Mays, M. L.; MacNeice, P. J.; Taktakishvili, A.; Viereck, R. A.
2017-12-01
In 2011, NOAA's Space Weather Prediction Center (SWPC) transitioned the world's first operational space weather model into use at the National Weather Service's Weather and Climate Operational Supercomputing System (WCOSS). This operational forecasting tool is comprised of the Wang-Sheeley-Arge (WSA) solar wind model coupled with the Enlil heliospheric MHD model. Relying on daily-updated photospheric magnetograms produced by the National Solar Observatory's Global Oscillation Network Group (GONG), this tool provides critical predictive knowledge of heliospheric dynamics such as high speed streams and coronal mass ejections. With the goal of advancing this predictive model and quantifying progress, SWPC and NASA's Community Coordinated Modeling Center (CCMC) have initiated a collaborative effort to assess improvements in space weather forecasts at Earth by moving from a single daily-updated magnetogram to a sequence of time-dependent magnetograms to drive the ambient inputs for the WSA-Enlil model as well as incorporating the newly developed Air Force Data Assimilative Photospheric Flux Transport (ADAPT) model. We will provide a detailed overview of the scope of this effort and discuss preliminary results from the first phase focusing on the impact of time-dependent magnetogram inputs to the WSA-Enlil model.
Golas, Sara Bersche; Shibahara, Takuma; Agboola, Stephen; Otaki, Hiroko; Sato, Jumpei; Nakae, Tatsuya; Hisamitsu, Toru; Kojima, Go; Felsted, Jennifer; Kakarmath, Sujay; Kvedar, Joseph; Jethwani, Kamal
2018-06-22
Heart failure is one of the leading causes of hospitalization in the United States. Advances in big data solutions allow for storage, management, and mining of large volumes of structured and semi-structured data, such as complex healthcare data. Applying these advances to complex healthcare data has led to the development of risk prediction models to help identify patients who would benefit most from disease management programs in an effort to reduce readmissions and healthcare cost, but the results of these efforts have been varied. The primary aim of this study was to develop a 30-day readmission risk prediction model for heart failure patients discharged from a hospital admission. We used longitudinal electronic medical record data of heart failure patients admitted within a large healthcare system. Feature vectors included structured demographic, utilization, and clinical data, as well as selected extracts of unstructured data from clinician-authored notes. The risk prediction model was developed using deep unified networks (DUNs), a new mesh-like network structure of deep learning designed to avoid overfitting. The model was validated with 10-fold cross-validation and results compared to models based on logistic regression, gradient boosting, and maxout networks. Overall model performance was assessed using the concordance statistic. We also selected a discrimination threshold based on maximum projected cost saving to the Partners Healthcare system. Data from 11,510 patients with 27,334 admissions and 6369 30-day readmissions were used to train the model. After data processing, the final model included 3512 variables. The DUNs model had the best performance after 10-fold cross-validation. AUCs for prediction models were 0.664 ± 0.015, 0.650 ± 0.011, 0.695 ± 0.016 and 0.705 ± 0.015 for logistic regression, gradient boosting, maxout networks, and DUNs respectively. The DUNs model had an accuracy of 76.4% at the classification threshold that corresponded with maximum cost saving to the hospital. Deep learning techniques performed better than other traditional techniques in developing this EMR-based prediction model for 30-day readmissions in heart failure patients. Such models can be used to identify heart failure patients with impending hospitalization, enabling care teams to target interventions at their most high-risk patients and improving overall clinical outcomes.
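Two of the reported steps, 10-fold cross-validated AUC estimation and choosing a classification threshold by projected cost saving, can be sketched generically as below. The classifier, dollar values, and data are placeholders, not the DUNs model or Partners Healthcare figures.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_predict, StratifiedKFold
from sklearn.metrics import roc_auc_score

# X: feature matrix, y: 1 if readmitted within 30 days (synthetic placeholders here)
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 20))
y = rng.integers(0, 2, size=1000)

cv = StratifiedKFold(n_splits=10, shuffle=True, random_state=0)
proba = cross_val_predict(LogisticRegression(max_iter=1000), X, y,
                          cv=cv, method="predict_proba")[:, 1]
print("10-fold cross-validated AUC:", roc_auc_score(y, proba))

# Pick the threshold that maximizes projected savings: each true positive
# saves an assumed intervention benefit, each false positive costs an outreach.
benefit_tp, cost_fp = 1000.0, 100.0   # illustrative dollar values
thresholds = np.linspace(0.01, 0.99, 99)
savings = [benefit_tp * np.sum((proba >= t) & (y == 1))
           - cost_fp * np.sum((proba >= t) & (y == 0)) for t in thresholds]
print("Best threshold:", thresholds[int(np.argmax(savings))])
```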
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, Yuhe; Mazur, Thomas R.; Green, Olga
Purpose: The clinical commissioning of IMRT subject to a magnetic field is challenging. The purpose of this work is to develop a GPU-accelerated Monte Carlo dose calculation platform based on PENELOPE and then use the platform to validate a vendor-provided MRIdian head model toward quality assurance of clinical IMRT treatment plans subject to a 0.35 T magnetic field. Methods: PENELOPE was first translated from FORTRAN to C++ and the result was confirmed to produce equivalent results to the original code. The C++ code was then adapted to CUDA in a workflow optimized for GPU architecture. The original code was expanded to include voxelized transport with Woodcock tracking, faster electron/positron propagation in a magnetic field, and several features that make gPENELOPE highly user-friendly. Moreover, the vendor-provided MRIdian head model was incorporated into the code in an effort to apply gPENELOPE as both an accurate and rapid dose validation system. A set of experimental measurements were performed on the MRIdian system to examine the accuracy of both the head model and gPENELOPE. Ultimately, gPENELOPE was applied toward independent validation of patient doses calculated by MRIdian’s KMC. Results: An acceleration factor of 152 was achieved in comparison to the original single-thread FORTRAN implementation with the original accuracy being preserved. For 16 treatment plans including stomach (4), lung (2), liver (3), adrenal gland (2), pancreas (2), spleen (1), mediastinum (1), and breast (1), the MRIdian dose calculation engine agrees with gPENELOPE with a mean gamma passing rate of 99.1% ± 0.6% (2%/2 mm). Conclusions: A Monte Carlo simulation platform was developed based on a GPU-accelerated version of PENELOPE. This platform was used to validate that both the vendor-provided head model and fast Monte Carlo engine used by the MRIdian system are accurate in modeling radiation transport in a patient using 2%/2 mm gamma criteria. Future applications of this platform will include dose validation and accumulation, IMRT optimization, and dosimetry system modeling for next generation MR-IGRT systems.
Wang, Yuhe; Mazur, Thomas R.; Green, Olga; Hu, Yanle; Li, Hua; Rodriguez, Vivian; Wooten, H. Omar; Yang, Deshan; Zhao, Tianyu; Mutic, Sasa; Li, H. Harold
2016-01-01
Purpose: The clinical commissioning of IMRT subject to a magnetic field is challenging. The purpose of this work is to develop a GPU-accelerated Monte Carlo dose calculation platform based on penelope and then use the platform to validate a vendor-provided MRIdian head model toward quality assurance of clinical IMRT treatment plans subject to a 0.35 T magnetic field. Methods: penelope was first translated from fortran to c++ and the result was confirmed to produce equivalent results to the original code. The c++ code was then adapted to cuda in a workflow optimized for GPU architecture. The original code was expanded to include voxelized transport with Woodcock tracking, faster electron/positron propagation in a magnetic field, and several features that make gpenelope highly user-friendly. Moreover, the vendor-provided MRIdian head model was incorporated into the code in an effort to apply gpenelope as both an accurate and rapid dose validation system. A set of experimental measurements were performed on the MRIdian system to examine the accuracy of both the head model and gpenelope. Ultimately, gpenelope was applied toward independent validation of patient doses calculated by MRIdian’s kmc. Results: An acceleration factor of 152 was achieved in comparison to the original single-thread fortran implementation with the original accuracy being preserved. For 16 treatment plans including stomach (4), lung (2), liver (3), adrenal gland (2), pancreas (2), spleen(1), mediastinum (1), and breast (1), the MRIdian dose calculation engine agrees with gpenelope with a mean gamma passing rate of 99.1% ± 0.6% (2%/2 mm). Conclusions: A Monte Carlo simulation platform was developed based on a GPU- accelerated version of penelope. This platform was used to validate that both the vendor-provided head model and fast Monte Carlo engine used by the MRIdian system are accurate in modeling radiation transport in a patient using 2%/2 mm gamma criteria. Future applications of this platform will include dose validation and accumulation, IMRT optimization, and dosimetry system modeling for next generation MR-IGRT systems. PMID:27370123
Wang, Yuhe; Mazur, Thomas R; Green, Olga; Hu, Yanle; Li, Hua; Rodriguez, Vivian; Wooten, H Omar; Yang, Deshan; Zhao, Tianyu; Mutic, Sasa; Li, H Harold
2016-07-01
The clinical commissioning of IMRT subject to a magnetic field is challenging. The purpose of this work is to develop a GPU-accelerated Monte Carlo dose calculation platform based on penelope and then use the platform to validate a vendor-provided MRIdian head model toward quality assurance of clinical IMRT treatment plans subject to a 0.35 T magnetic field. penelope was first translated from fortran to c++ and the result was confirmed to produce equivalent results to the original code. The c++ code was then adapted to cuda in a workflow optimized for GPU architecture. The original code was expanded to include voxelized transport with Woodcock tracking, faster electron/positron propagation in a magnetic field, and several features that make gpenelope highly user-friendly. Moreover, the vendor-provided MRIdian head model was incorporated into the code in an effort to apply gpenelope as both an accurate and rapid dose validation system. A set of experimental measurements were performed on the MRIdian system to examine the accuracy of both the head model and gpenelope. Ultimately, gpenelope was applied toward independent validation of patient doses calculated by MRIdian's kmc. An acceleration factor of 152 was achieved in comparison to the original single-thread fortran implementation with the original accuracy being preserved. For 16 treatment plans including stomach (4), lung (2), liver (3), adrenal gland (2), pancreas (2), spleen(1), mediastinum (1), and breast (1), the MRIdian dose calculation engine agrees with gpenelope with a mean gamma passing rate of 99.1% ± 0.6% (2%/2 mm). A Monte Carlo simulation platform was developed based on a GPU- accelerated version of penelope. This platform was used to validate that both the vendor-provided head model and fast Monte Carlo engine used by the MRIdian system are accurate in modeling radiation transport in a patient using 2%/2 mm gamma criteria. Future applications of this platform will include dose validation and accumulation, IMRT optimization, and dosimetry system modeling for next generation MR-IGRT systems.
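Woodcock (delta) tracking, mentioned in all three records above, removes the need to compute voxel-boundary crossings by sampling free paths with a majorant cross section and rejecting "virtual" collisions. A minimal 1D sketch of the idea (not the gPENELOPE implementation):

```python
import numpy as np

def woodcock_track(sigma, dx, rng=None):
    """Sample the distance to the next real interaction in a 1D voxelized
    medium using Woodcock (delta) tracking: fly with the majorant cross
    section, then accept a collision with probability sigma(x)/sigma_max."""
    rng = np.random.default_rng(rng)
    sigma = np.asarray(sigma, float)      # per-voxel total cross section (1/cm)
    sigma_max = sigma.max()               # majorant cross section
    x = 0.0
    while True:
        x += -np.log(rng.random()) / sigma_max   # free path using the majorant
        voxel = int(x / dx)
        if voxel >= sigma.size:
            return None                   # particle escaped the phantom
        if rng.random() < sigma[voxel] / sigma_max:
            return x                      # real collision; otherwise virtual, keep flying

# Example: 100 voxels of 0.1 cm with heterogeneous cross sections
print(woodcock_track(np.linspace(0.05, 0.4, 100), dx=0.1, rng=1))
```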
Jones, Alvin; Ingram, M Victoria
2011-10-01
Using a relatively new statistical paradigm, Optimal Data Analysis (ODA; Yarnold & Soltysik, 2005), this research demonstrated that newly developed scales for the Minnesota Multiphasic Personality Inventory-2 (MMPI-2) and MMPI-2 Restructured Form (MMPI-2-RF) specifically designed to assess over-reporting of cognitive and/or somatic symptoms were more effective than the MMPI-2 F-family of scales in predicting effort status on tests of cognitive functioning in a sample of 288 military members. ODA demonstrated that when all scales were performing at their theoretical maximum possible level of classification accuracy, the Henry Heilbronner Index (HHI), Response Bias Scale (RBS), Fake Bad Scale (FBS), and the Symptom Validity Scale (FBS-r) outperformed the F-family of scales on a variety of ODA indexes of classification accuracy, including an omnibus measure (effect strength total, EST) of the descriptive and prognostic utility of ODA models developed for each scale. Based on the guidelines suggested by Yarnold and Soltysik for evaluating effect strengths for ODA models, the newly developed scales had effect strengths that were moderate (37.66 to 45.68), whereas the F-family scales had effect strengths that ranged from weak to moderate (15.42 to 32.80). In addition, traditional analysis demonstrated that HHI, RBS, FBS, and FBS-R had large effect sizes (0.98 to 1.16) based on Cohen's (1988) suggested categorization of effect size when comparing mean scores for adequate versus inadequate effort groups, whereas the F-family of scales had small to medium effect sizes (0.25 to 0.76). The MMPI-2-RF Infrequent Somatic Responses Scale (F(S)) tended to perform in a fashion similar to F, the best performing F-family scale.
Designing an activity-based costing model for a non-admitted prisoner healthcare setting.
Cai, Xiao; Moore, Elizabeth; McNamara, Martin
2013-09-01
To design and deliver an activity-based costing model within a non-admitted prisoner healthcare setting. Key phases from the NSW Health clinical redesign methodology were utilised: diagnostic, solution design and implementation. The diagnostic phase utilised a range of strategies to identify issues requiring attention in the development of the costing model. The solution design phase conceptualised distinct 'building blocks' of activity and cost based on the speciality of clinicians providing care. These building blocks enabled the classification of activity and comparisons of costs between similar facilities. The implementation phase validated the model. The project generated an activity-based costing model based on actual activity performed, gained acceptability among clinicians and managers, and provided the basis for ongoing efficiency and benchmarking efforts.
41 CFR 60-3.14 - Technical standards for validity studies.
Code of Federal Regulations, 2010 CFR
2010-07-01
... likely to affect validity differences; or that these factors are included in the design of the study and... construct validity is both an extensive and arduous effort involving a series of research studies, which... validity studies. 60-3.14 Section 60-3.14 Public Contracts and Property Management Other Provisions...
41 CFR 60-3.14 - Technical standards for validity studies.
Code of Federal Regulations, 2013 CFR
2013-07-01
... likely to affect validity differences; or that these factors are included in the design of the study and... construct validity is both an extensive and arduous effort involving a series of research studies, which... validity studies. 60-3.14 Section 60-3.14 Public Contracts and Property Management Other Provisions...
41 CFR 60-3.14 - Technical standards for validity studies.
Code of Federal Regulations, 2014 CFR
2014-07-01
... likely to affect validity differences; or that these factors are included in the design of the study and... construct validity is both an extensive and arduous effort involving a series of research studies, which... validity studies. 60-3.14 Section 60-3.14 Public Contracts and Property Management Other Provisions...
41 CFR 60-3.14 - Technical standards for validity studies.
Code of Federal Regulations, 2011 CFR
2011-07-01
... likely to affect validity differences; or that these factors are included in the design of the study and... construct validity is both an extensive and arduous effort involving a series of research studies, which... validity studies. 60-3.14 Section 60-3.14 Public Contracts and Property Management Other Provisions...
41 CFR 60-3.14 - Technical standards for validity studies.
Code of Federal Regulations, 2012 CFR
2012-07-01
... likely to affect validity differences; or that these factors are included in the design of the study and... construct validity is both an extensive and arduous effort involving a series of research studies, which... validity studies. 60-3.14 Section 60-3.14 Public Contracts and Property Management Other Provisions...
A Unified Model of Performance: Validation of its Predictions across Different Sleep/Wake Schedules
Ramakrishnan, Sridhar; Wesensten, Nancy J.; Balkin, Thomas J.; Reifman, Jaques
2016-01-01
Study Objectives: Historically, mathematical models of human neurobehavioral performance developed on data from one sleep study were limited to predicting performance in similar studies, restricting their practical utility. We recently developed a unified model of performance (UMP) to predict the effects of the continuum of sleep loss—from chronic sleep restriction (CSR) to total sleep deprivation (TSD) challenges—and validated it using data from two studies of one laboratory. Here, we significantly extended this effort by validating the UMP predictions across a wide range of sleep/wake schedules from different studies and laboratories. Methods: We developed the UMP on psychomotor vigilance task (PVT) lapse data from one study encompassing four different CSR conditions (7 d of 3, 5, 7, and 9 h of sleep/night), and predicted performance in five other studies (from four laboratories), including different combinations of TSD (40 to 88 h), CSR (2 to 6 h of sleep/night), control (8 to 10 h of sleep/night), and nap (nocturnal and diurnal) schedules. Results: The UMP accurately predicted PVT performance trends across 14 different sleep/wake conditions, yielding average prediction errors between 7% and 36%, with the predictions lying within 2 standard errors of the measured data 87% of the time. In addition, the UMP accurately predicted performance impairment (average error of 15%) for schedules (TSD and naps) not used in model development. Conclusions: The unified model of performance can be used as a tool to help design sleep/wake schedules to optimize the extent and duration of neurobehavioral performance and to accelerate recovery after sleep loss. Citation: Ramakrishnan S, Wesensten NJ, Balkin TJ, Reifman J. A unified model of performance: validation of its predictions across different sleep/wake schedules. SLEEP 2016;39(1):249–262. PMID:26518594
Southern Africa Validation of NASA's Earth Observing System (SAVE EOS)
NASA Technical Reports Server (NTRS)
Privette, Jeffrey L.
2000-01-01
Southern Africa Validation of EOS (SAVE) is a 4-year, multidisciplinary effort to validate operational and experimental products from Terra, the flagship satellite of NASA's Earth Observing System (EOS). At test sites from Zambia to South Africa, we are measuring soil, vegetation and atmospheric parameters over a range of ecosystems for comparison with products from Terra, Landsat 7, AVHRR and SeaWiFS. The data are also employed to parameterize and improve vegetation process models. Fixed-point and mobile "transect" sampling are used to collect the ground data. These are extrapolated over larger areas with fine-resolution multispectral imagery. We describe the sites, infrastructure, and measurement strategies developed under SAVE, as well as initial results from our participation in the first Intensive Field Campaign of SAFARI 2000. We also describe SAVE's role in the Kalahari Transect Campaign (February/March 2000) in Zambia and Botswana.
Bluck, Susan; Alea, Nicole
2011-07-01
Theory suggests that autobiographical remembering serves several functions. This research builds on previous empirical efforts (Bluck, Alea, Habermas, & Rubin, 2005) with the aim of constructing a brief, valid measure of three functions of autobiographical memory. Participants (N=306) completed 28 theoretically derived items concerning the frequency with which they use autobiographical memory to serve a variety of functions. To examine convergent and discriminant validity, participants rated their tendency to think about and talk about the past, and measures of future time orientation, self-concept clarity, and trait personality. Confirmatory factor analysis of the function items resulted in a respecified model with 15 items in three factors. The newly developed Thinking about Life Experiences scale (TALE) shows good internal consistency as well as convergent validity for three subscales: Self-Continuity, Social-Bonding, and Directing-Behaviour. Analyses demonstrate factorial equivalence across age and gender groups. Potential use and limitations of the TALE are discussed.
Bond, Mary Lou; Cason, Carolyn L
2014-01-01
To assess the content validity and internal consistency reliability of the Healthcare Professions Education Program Self-Assessment (PSA) and the Institutional Self-Assessment for Factors Supporting Hispanic Student Retention (ISA). Health disparities among vulnerable populations are among the top priorities demanding attention in the United States. Efforts to recruit and retain Hispanic nursing students are essential. A sample of provosts, deans/directors, and an author of the Model of Institutional Support commented on the perceived validity and usefulness of each item of the PSA and ISA. Internal consistency reliability was calculated by Cronbach's alpha using responses from nursing schools in states with large Hispanic populations. The ISA and PSA were found to be reliable and valid tools for assessing institutional friendliness. The instruments highlight strengths and identify potential areas of improvement at institutional and program levels.
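Cronbach's alpha, the internal-consistency measure used here, is computed from item variances and the variance of the total score. A minimal sketch:

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents x n_items) response matrix:
    alpha = k/(k-1) * (1 - sum of item variances / variance of total score)."""
    items = np.asarray(items, float)
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var / total_var)
```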
Relations between inductive reasoning and deductive reasoning.
Heit, Evan; Rotello, Caren M
2010-05-01
One of the most important open questions in reasoning research is how inductive reasoning and deductive reasoning are related. In an effort to address this question, we applied methods and concepts from memory research. We used 2 experiments to examine the effects of logical validity and premise-conclusion similarity on evaluation of arguments. Experiment 1 showed 2 dissociations: For a common set of arguments, deduction judgments were more affected by validity, and induction judgments were more affected by similarity. Moreover, Experiment 2 showed that fast deduction judgments were like induction judgments, in terms of being more influenced by similarity and less influenced by validity, compared with slow deduction judgments. These novel results pose challenges for a 1-process account of reasoning and are interpreted in terms of a 2-process account of reasoning, which was implemented as a multidimensional signal detection model and applied to receiver operating characteristic data. PsycINFO Database Record (c) 2010 APA, all rights reserved.
Risk terrain modeling predicts child maltreatment.
Daley, Dyann; Bachmann, Michael; Bachmann, Brittany A; Pedigo, Christian; Bui, Minh-Thuy; Coffman, Jamye
2016-12-01
As indicated by research on the long-term effects of adverse childhood experiences (ACEs), maltreatment has far-reaching consequences for affected children. Effective prevention measures have been elusive, partly due to difficulty in identifying vulnerable children before they are harmed. This study employs Risk Terrain Modeling (RTM), an analysis of the cumulative effect of environmental factors thought to be conducive for child maltreatment, to create a highly accurate prediction model for future substantiated child maltreatment cases in the City of Fort Worth, Texas. The model is superior to commonly used hotspot predictions and more beneficial in aiding prevention efforts in a number of ways: 1) it identifies the highest risk areas for future instances of child maltreatment with improved precision and accuracy; 2) it aids the prioritization of risk-mitigating efforts by informing about the relative importance of the most significant contributing risk factors; 3) since predictions are modeled as a function of easily obtainable data, practitioners do not have to undergo the difficult process of obtaining official child maltreatment data to apply it; 4) the inclusion of a multitude of environmental risk factors creates a more robust model with higher predictive validity; and 5) the model does not rely on a retrospective examination of past instances of child maltreatment, but adapts predictions to changing environmental conditions. The present study introduces and examines the predictive power of this new tool to aid prevention efforts seeking to improve the safety, health, and wellbeing of vulnerable children. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.
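At its core, risk terrain modeling overlays geocoded risk-factor layers on a common grid and combines them into a composite risk surface. The sketch below shows that combination step only; layer construction, weighting, and validation in the study are more involved, and the weights here are placeholders.

```python
import numpy as np

def risk_terrain(layers, weights=None):
    """Combine environmental risk layers (each an H x W grid where 1 marks
    proximity to a risk factor) into a composite risk surface. Weights could
    come, e.g., from regression coefficients; equal weights by default."""
    layers = np.asarray(layers, float)          # shape (n_factors, H, W)
    if weights is None:
        weights = np.ones(layers.shape[0])
    surface = np.tensordot(weights, layers, axes=1)
    return surface / surface.max()              # relative risk, scaled to 0..1

# Cells in the top few percent of the resulting surface are candidate high-risk areas.
```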
On explicit algebraic stress models for complex turbulent flows
NASA Technical Reports Server (NTRS)
Gatski, T. B.; Speziale, C. G.
1992-01-01
Explicit algebraic stress models that are valid for three-dimensional turbulent flows in noninertial frames are systematically derived from a hierarchy of second-order closure models. This represents a generalization of the model derived by Pope who based his analysis on the Launder, Reece, and Rodi model restricted to two-dimensional turbulent flows in an inertial frame. The relationship between the new models and traditional algebraic stress models -- as well as anisotropic eddy viscosity models -- is theoretically established. The need for regularization is demonstrated in an effort to explain why traditional algebraic stress models have failed in complex flows. It is also shown that these explicit algebraic stress models can shed new light on what second-order closure models predict for the equilibrium states of homogeneous turbulent flows and can serve as a useful alternative in practical computations.
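For orientation, the linear eddy-viscosity closure and the general tensor-basis form that explicit algebraic stress models take are written below in standard turbulence-modeling notation; this is textbook background, not an excerpt from the paper.

```latex
% Linear (Boussinesq) eddy-viscosity closure for the Reynolds stresses:
\overline{u_i u_j} \;=\; \tfrac{2}{3}\,k\,\delta_{ij} \;-\; 2\,\nu_t\,S_{ij},
\qquad
S_{ij} \;=\; \tfrac{1}{2}\left(\frac{\partial U_i}{\partial x_j}
                             + \frac{\partial U_j}{\partial x_i}\right).

% Explicit algebraic stress models instead expand the anisotropy tensor in a
% finite tensor basis built from the strain- and rotation-rate tensors, with
% scalar coefficients G_n determined from the underlying second-order closure:
b_{ij} \;=\; \frac{\overline{u_i u_j}}{2k} - \frac{\delta_{ij}}{3}
        \;=\; \sum_{n} G_n\, T^{(n)}_{ij}\!\left(S,\,W\right).
```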
48 CFR 1852.234-2 - Earned Value Management System.
Code of Federal Regulations, 2013 CFR
2013-10-01
... compliance/validation. The Contractor shall follow and implement the approved compliance/validation plan in a... process. (f) The Contractor shall be responsible for ensuring that its subcontractors, identified below... compliance/validation. (Contracting Officer to insert names of subcontractors or subcontracted effort.) (g...
48 CFR 1852.234-2 - Earned Value Management System.
Code of Federal Regulations, 2012 CFR
2012-10-01
... compliance/validation. The Contractor shall follow and implement the approved compliance/validation plan in a... process. (f) The Contractor shall be responsible for ensuring that its subcontractors, identified below... compliance/validation. (Contracting Officer to insert names of subcontractors or subcontracted effort.) (g...
48 CFR 1852.234-2 - Earned Value Management System.
Code of Federal Regulations, 2014 CFR
2014-10-01
... compliance/validation. The Contractor shall follow and implement the approved compliance/validation plan in a... process. (f) The Contractor shall be responsible for ensuring that its subcontractors, identified below... compliance/validation. (Contracting Officer to insert names of subcontractors or subcontracted effort.) (g...
Marine atmospheric effects on electro-optical systems performance
NASA Astrophysics Data System (ADS)
Richter, Juergen H.; Hughes, Herbert G.
1990-09-01
For the past twelve years, a coordinated tri-service effort has been underway in the United States Department of Defense to provide an atmospheric effects assessment capability for existing and planned electro-optical (EO) systems. This paper reviews the exploratory development effort in the US Navy. A key responsibility for the Navy was the development of marine aerosol models. An initial model, the Navy Aerosol Model (NAM), was developed, tested, and transitioned into LOWTRAN 6. A more comprehensive model, the Navy Oceanic Vertical Aerosol Model (NOVAM), has been formulated and is presently undergoing comprehensive evaluation and testing. Marine aerosols and their extinction properties are only one important factor in EO systems performance assessment. For many EO systems applications, an accurate knowledge of marine background radiances is required in addition to considering the effects of the intervening atmosphere. Accordingly, a capability was developed to estimate the apparent sea surface radiance for different sea states and meteorological conditions. Also, an empirical relationship was developed which directly relates apparent mean sea temperature to calculated mean sky temperature. In situ measurements of relevant environmental parameters are essential for real-time EO systems performance assessment. Direct measurement of slant path extinction would be most desirable. This motivated a careful investigation of lidar (light detection and ranging) techniques including improvements to single-ended lidar profile inversion algorithms and development of new lidar techniques such as double-ended and dual-angle configurations. It was concluded that single-ended, single frequency lidars cannot be used to infer slant path extinction with an accuracy necessary to make meaningful performance assessments. Other lidar configurations may find limited application in model validation and research efforts. No technique has emerged yet which could be considered ready for shipboard implementation. A shipboard real-time performance assessment system was developed and named PREOS (Performance and Range for EO Systems). PREOS has been incorporated into the Navy's Tactical Environmental Support System (TESS). The present version of PREOS is a first step in accomplishing the complex task of real-time systems performance assessment. Improved target and background models are under development and will be incorporated into TESS when tested and validated. A reliable assessment capability can be used to develop Tactical Decision Aids (TDAs). TDAs permit optimum selection or combination of sensors and estimation of a ship's own vulnerability against hostile systems.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dana, Scott; Van Dam, Jeroen J; Damiani, Rick R
As part of an ongoing effort to improve the modeling and prediction of small wind turbine dynamics, the National Renewable Energy Laboratory (NREL) tested a small horizontal-axis wind turbine in the field at the National Wind Technology Center. The test turbine was a 2.1-kW downwind machine mounted on an 18-m multi-section fiberglass composite tower. The tower was instrumented and monitored for approximately 6 months. The collected data were analyzed to assess the turbine and tower loads and further validate the simplified loads equations from the International Electrotechnical Commission (IEC) 61400-2 design standards. Field-measured loads were also compared to the output of an aeroelastic model of the turbine. In particular, we compared fatigue loads as measured in the field, predicted by the aeroelastic model, and calculated using the simplified design equations. Ultimate loads at the tower base were assessed using both the simplified design equations and the aeroelastic model output. The simplified design equations in IEC 61400-2 do not accurately model fatigue loads, and the limitations of the simplified design equations are discussed.
A General Approach to Measuring Test-Taking Effort on Computer-Based Tests
ERIC Educational Resources Information Center
Wise, Steven L.; Gao, Lingyun
2017-01-01
There has been an increased interest in the impact of unmotivated test taking on test performance and score validity. This has led to the development of new ways of measuring test-taking effort based on item response time. In particular, Response Time Effort (RTE) has been shown to provide an assessment of effort down to the level of individual…
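Response Time Effort is typically operationalized as the proportion of items on which an examinee's response time exceeds an item-specific threshold separating rapid guessing from solution behavior (threshold choice varies across applications). A minimal sketch of that computation:

```python
import numpy as np

def response_time_effort(rt, thresholds):
    """rt: (n_examinees, n_items) response times in seconds.
    thresholds: (n_items,) per-item cutoffs separating rapid guesses from
    solution behavior. RTE = proportion of items answered with solution behavior."""
    rt = np.asarray(rt, float)
    return (rt >= np.asarray(thresholds)).mean(axis=1)

# Examinees with RTE well below 1.0 gave many rapid-guess responses, so their
# scores may not validly reflect ability.
```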
DOE Office of Scientific and Technical Information (OSTI.GOV)
Clemens, Noel
This project was a combined computational and experimental effort to improve predictive capability for boundary layer flashback of premixed swirl flames relevant to gas-turbine power plants operating with high-hydrogen-content fuels. During the course of this project, significant progress in modeling was made on four major fronts: 1) use of direct numerical simulation of turbulent flames to understand the coupling between the flame and the turbulent boundary layer; 2) improved modeling capability for flame propagation in stratified pre-mixtures; 3) improved portability of computer codes using the OpenFOAM platform to facilitate transfer to industry and other researchers; and 4) application of LES to flashback in swirl combustors, and a detailed assessment of its capabilities and limitations for predictive purposes. A major component of the project was an experimental program that focused on developing a rich experimental database of boundary layer flashback in swirl flames. Both methane and high-hydrogen fuels, including effects of elevated pressure (1 to 5 atm), were explored. For this project, a new model swirl combustor was developed. Kilohertz-rate stereoscopic PIV and chemiluminescence imaging were used to investigate the flame propagation dynamics. In addition to the planar measurements, a technique capable of detecting the instantaneous, time-resolved 3D flame front topography was developed and applied successfully to investigate the flow-flame interaction. The UT measurements and legacy data were used in a hierarchical validation approach where flows with increasingly complex physics were used for validation. First, component models were validated with DNS and literature data in simplified configurations; this was followed by validation with the UT 1-atm flashback cases, and then the UT high-pressure flashback cases. The new models and portable code represent a major improvement over what was available before this project was initiated.
From psychiatric disorders to animal models: a bidirectional and dimensional approach
Donaldson, Zoe. R.; Hen, René
2014-01-01
Psychiatric genetics research is bidirectional in nature, with human and animal studies becoming more closely integrated as techniques for genetic manipulations allow for more subtle exploration of disease phenotypes. This synergy, however, highlights the importance of considering the way in which we approach the genotype-phenotype relationship. In particular, the nosological divide of psychiatric illness, while clinically relevant, is not directly translatable in animal models. For instance, mice will never fully re-capitulate the broad criteria for many psychiatric disorders; nor will they have guilty ruminations, suicidal thoughts, or rapid speech. Instead, animal models have been and continue to provide a means to explore dimensions of psychiatric disorders in order to identify neural circuits and mechanisms underlying disease-relevant phenotypes. Thus, the genetic investigation of psychiatric illness will yield the greatest insights if efforts continue to identify and utilize biologically valid phenotypes across species. In this review we discuss the progress to date and the future efforts that will enhance translation between human and animal studies, including the identification of intermediate phenotypes that can be studied across species, as well as the importance of refined modeling of human disease-associated genetic variation in mice and other animal models. PMID:24650688
Modeling Aerodynamically Generated Sound of Helicopter Rotors
NASA Technical Reports Server (NTRS)
Brentner, Kenneth S.; Farassat, F.
2002-01-01
A great deal of progress has been made in the modeling of aerodynamically generated sound of rotors over the past decade. Although the modeling effort has focused on helicopter main rotors, the theory is generally valid for a wide range of rotor configurations. The Ffowcs Williams-Hawkings (FW-H) equation has been the foundation for much of the development. The monopole and dipole source terms of the FW-H equation account for the thickness and loading noise, respectively. Blade-vortex-interaction noise and broadband noise are important types of loading noise, hence much research has been directed toward the accurate modeling of these noise mechanisms. Both subsonic and supersonic quadrupole noise formulations have been developed for the prediction of high-speed impulsive noise. In an effort to eliminate the need to compute the quadrupole contribution, the FW-H equation has also been utilized on permeable surfaces surrounding all physical noise sources. Comparisons of the Kirchhoff formulation for moving surfaces with the FW-H equation have shown that the Kirchhoff formulation for moving surfaces can give erroneous results for aeroacoustic problems. Finally, significant progress has been made incorporating the rotor noise models into full vehicle noise prediction tools.
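For reference, the impermeable-surface form of the FW-H equation in its common textbook notation is given below; the thickness (monopole), loading (dipole), and quadrupole sources discussed above correspond to the three right-hand-side terms. This is standard background, not a formula quoted from the paper.

```latex
\Box^{2} p'(\mathbf{x},t)
  \;=\; \frac{\partial}{\partial t}\!\left[\rho_{0} v_{n}\,\delta(f)\right]
  \;-\; \frac{\partial}{\partial x_i}\!\left[\ell_{i}\,\delta(f)\right]
  \;+\; \frac{\partial^{2}}{\partial x_i \partial x_j}\!\left[T_{ij}\,H(f)\right],
\qquad
\Box^{2} \equiv \frac{1}{c^{2}}\frac{\partial^{2}}{\partial t^{2}} - \nabla^{2},
```

where f = 0 defines the moving blade surface, v_n is the surface normal velocity, \ell_i the local surface loading per unit area, T_{ij} the Lighthill stress tensor, and \delta and H the Dirac delta and Heaviside functions.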
NASA Astrophysics Data System (ADS)
Wang, Z.; Roman, M. O.; Pahlevan, N.; Stachura, M.; McCorkel, J.; Bland, G.; Schaaf, C.
2016-12-01
Albedo is a key climate forcing variable that governs the absorption of incoming solar radiation and its ultimate transfer to the atmosphere. Albedo contributes significant uncertainties in the simulation of climate changes; and as such, it is defined by the Global Climate Observing System (GCOS) as a terrestrial essential climate variable (ECV) required by global and regional climate and biogeochemical models. NASA's Goddard Space Flight Center's Multi AngLe Imaging Bidirectional Reflectance Distribution Function small-UAS (MALIBU) is part of a series of pathfinder missions to develop enhanced multi-angular remote sensing techniques using small Unmanned Aircraft Systems (sUAS). The MALIBU instrument package includes two multispectral imagers oriented at two different viewing geometries (i.e., port and starboard sides) that capture vegetation optical properties and structural characteristics. This is achieved by analyzing the surface reflectance anisotropy signal (i.e., BRDF shape) obtained from the combination of surface reflectance from different view-illumination angles and spectral channels. Satellite measures of surface albedo from MODIS, VIIRS, and Landsat have been evaluated by comparison with spatially representative albedometer data from sparsely distributed flux towers at fixed heights. However, the mismatch between the footprint of ground measurements and the satellite footprint challenges efforts at validation, especially for heterogeneous landscapes. The BRDF (Bidirectional Reflectance Distribution Function) models of surface anisotropy have only been evaluated with airborne BRDF data over a very few locations. The MALIBU platform, which acquires extremely high resolution sub-meter measures of surface anisotropy and surface albedo, can thus serve as an important source of reference data to enable global land product validation efforts, and resolve the errors and uncertainties in the various existing products generated by NASA and its national and international partners.
Research Program on Factors Affecting the Brightness of AC Plasma Panels.
1983-06-01
representing the second excited electronic configuration in neon. In an effort to devise means to improve the brightness and/or efficiency of neon filled...obtained from several panels was found to be very good (Ref. 6), a factor providing important support for the validity of the kinetic modeling procedures...pressure neon Penning mixtures for I conditions typical of plasma panel displays [II]. Their work has provided valuable insight regarding the electrical
Impact of AMS-02 Measurements on Reducing GCR Model Uncertainties
NASA Technical Reports Server (NTRS)
Slaba, T. C.; O'Neill, P. M.; Golge, S.; Norbury, J. W.
2015-01-01
For vehicle design, shield optimization, mission planning, and astronaut risk assessment, the exposure from galactic cosmic rays (GCR) poses a significant and complex problem both in low Earth orbit and in deep space. To address this problem, various computational tools have been developed to quantify the exposure and risk in a wide range of scenarios. Generally, the tool used to describe the ambient GCR environment provides the input into subsequent computational tools and is therefore a critical component of end-to-end procedures. Over the past few years, several researchers have independently and very carefully compared some of the widely used GCR models to more rigorously characterize model differences and quantify uncertainties. All of the GCR models studied rely heavily on calibrating to available near-Earth measurements of GCR particle energy spectra, typically over restricted energy regions and short time periods. In this work, we first review recent sensitivity studies quantifying the ions and energies in the ambient GCR environment of greatest importance to exposure quantities behind shielding. Currently available measurements used to calibrate and validate GCR models are also summarized within this context. It is shown that the AMS-02 measurements will fill a critically important gap in the measurement database. The emergence of AMS-02 measurements also provides a unique opportunity to validate existing models against measurements that were not used to calibrate free parameters in the empirical descriptions. Discussion is given regarding rigorous approaches to implement the independent validation efforts, followed by recalibration of empirical parameters.
Experimental Validation of a Closed Brayton Cycle System Transient Simulation
NASA Technical Reports Server (NTRS)
Johnson, Paul K.; Hervol, David S.
2006-01-01
The Brayton Power Conversion Unit (BPCU) is a closed cycle system with an inert gas working fluid, located in Vacuum Facility 6 at NASA Glenn Research Center. It was used in previous solar dynamic technology efforts (SDGTD) and was modified to its present configuration by replacing the solar receiver with an electrical resistance heater. It is the first closed Brayton cycle to be coupled with an ion propulsion system and has been used to examine mechanical dynamic characteristics and responses. The focus of this work was the validation of a computer model of the BPCU. The model was built using the Closed Cycle System Simulation (CCSS) design and analysis tool. Test conditions were then duplicated in CCSS, including various steady-state points and transients involving changes in shaft rotational speed and heat input. Testing to date has shown that the BPCU is able to generate meaningful, repeatable data that can be used for computer model validation. Results generated by CCSS demonstrated that the model sufficiently reproduced the thermal transients exhibited by the BPCU system. CCSS was also used to match BPCU steady-state operating points. Cycle temperatures were within 4.1% of the data (most were within 1%). Cycle pressures were all within 3.2%. Error in alternator power (as much as 13.5%) was attributed to uncertainties in the compressor and turbine maps and alternator and bearing loss models. The acquired understanding of the BPCU behavior gives useful insight for improvements to be made to the CCSS model as well as ideas for future testing and possible system modifications.
Work-related stress assessed by a text message single-item stress question.
Arapovic-Johansson, B; Wåhlin, C; Kwak, L; Björklund, C; Jensen, I
2017-12-02
Given the prevalence of work stress-related ill-health in the Western world, it is important to find cost-effective, easy-to-use and valid measures which can be used both in research and in practice. To examine the validity and reliability of the single-item stress question (SISQ), distributed weekly by short message service (SMS) and used for measurement of work-related stress. The convergent validity was assessed through associations between the SISQ and subscales of the Job Demand-Control-Support model, the Effort-Reward Imbalance model and scales measuring depression, exhaustion and sleep. The predictive validity was assessed using SISQ data collected through SMS. The reliability was analysed by the test-retest procedure. Correlations between the SISQ and all the subscales except for job strain and esteem reward were significant, ranging from -0.186 to 0.627. The SISQ could also predict sick leave, depression and exhaustion at 12-month follow-up. The analysis on reliability revealed a satisfactory stability with a weighted kappa between 0.804 and 0.868. The SISQ, administered through SMS, can be used for the screening of stress levels in a working population. © The Author 2017. Published by Oxford University Press on behalf of the Society of Occupational Medicine. All rights reserved.
Validation and verification of the laser range safety tool (LRST)
NASA Astrophysics Data System (ADS)
Kennedy, Paul K.; Keppler, Kenneth S.; Thomas, Robert J.; Polhamus, Garrett D.; Smith, Peter A.; Trevino, Javier O.; Seaman, Daniel V.; Gallaway, Robert A.; Crockett, Gregg A.
2003-06-01
The U.S. Dept. of Defense (DOD) is currently developing and testing a number of High Energy Laser (HEL) weapons systems. DOD range safety officers now face the challenge of designing safe methods of testing HELs on DOD ranges. In particular, safety officers need to ensure that diffuse and specular reflections from HEL system targets, as well as direct beam paths, are contained within DOD boundaries. If both the laser source and the target are moving, as they are for the Airborne Laser (ABL), a complex series of calculations is required and manual calculations are impractical. Over the past 5 years, the Optical Radiation Branch of the Air Force Research Laboratory (AFRL/HEDO), the ABL System Program Office, Logicon-RDA, and Northrup-Grumman have worked together to develop a computer model called the Laser Range Safety Tool (LRST), specifically designed for HEL reflection hazard analyses. The code, which is still under development, is currently tailored to support the ABL program. AFRL/HEDO has led an LRST Validation and Verification (V&V) effort since 1998, in order to determine if code predictions are accurate. This paper summarizes LRST V&V efforts to date including: i) comparison of code results with laboratory measurements of reflected laser energy and with reflection measurements made during actual HEL field tests, and ii) validation of LRST's hazard zone computations.
Rabinowitz, Amanda R; Merritt, Victoria; Arnett, Peter A
2016-08-01
Baseline neuropsychological testing is commonly used in the management of sports-related concussion. However, underperformance due to poor effort could lead to invalid conclusions regarding postconcussion cognitive decline. We designed the Motivation Behaviors Checklist (MBC) as an observational rating scale to assess effort towards baseline neuropsychological testing. Here we present preliminary data in support of its reliability and validity. MBC items were generated based on the consensus of a panel of graduate students, undergraduates, and a clinical neuropsychologist who conduct neuropsychological evaluations for a sports concussion management program. A total of 261 college athletes were administered a standard neuropsychological test battery in addition to the MBC. A subset of evaluations (n = 101) was videotaped and viewed by a second rater. Exploratory factor analysis (EFA) was used to refine the scale, and reliability and validity were evaluated. EFA revealed that the MBC items represent four latent factors: Complaints, Poor Focus, Psychomotor Agitation, and Impulsivity. Reliability analyses demonstrated that the MBC has good inter-rater reliability (intraclass correlation coefficient, ICC = .767) and internal consistency (α = .839). The construct validity of the MBC is supported by large correlations with examiners' ratings of effort (ρ = -.623) and medium-sized relationships with cognitive performance and self-ratings of effort (|ρ| between .263 and .345). Discriminant validity was supported by nonsignificant correlations with measures of depression and postconcussion symptoms (ρ = .056 and .082, respectively). These findings provide preliminary evidence that the MBC could be a useful adjunct to baseline neuropsychological evaluations for sports-concussion management.
Reddy, L. Felice; Barch, Deanna M.; Buchanan, Robert W.; Dunayevich, Eduardo; Gold, James M.; Marder, Steven R.; Wynn, Jonathan K.; Young, Jared W.; Green, Michael F.
2015-01-01
Effort-based decision making has strong conceptual links to the motivational disturbances that define a key subdomain of negative symptoms. However, the extent to which effort-based decision-making performance relates to negative symptoms, and other clinical and functionally important variables has yet to be systematically investigated. In 94 clinically stable outpatients with schizophrenia, we examined the external validity of 5 effort-based paradigms, including the Effort Expenditure for Rewards, Balloon Effort, Grip Strength Effort, Deck Choice Effort, and Perceptual Effort tasks. These tasks covered 3 types of effort: physical, cognitive, and perceptual. Correlations between effort related performance and 6 classes of variables were examined, including: (1) negative symptoms, (2) clinically rated motivation and community role functioning, (3) self-reported motivational traits, (4) neurocognition, (5) other psychiatric symptoms and clinical/demographic characteristics, and (6) subjective valuation of monetary rewards. Effort paradigms showed small to medium relationships to clinical ratings of negative symptoms, motivation, and functioning, with the pattern more consistent for some measures than others. They also showed small to medium relations with neurocognitive functioning, but were generally unrelated to other psychiatric symptoms, self-reported traits, antipsychotic medications, side effects, and subjective valuation of money. There were relatively strong interrelationships among the effort measures. In conjunction with findings from a companion psychometric article, all the paradigms warrant further consideration and development, and 2 show the strongest potential for clinical trial use at this juncture. PMID:26209546
Rienksma, Rienk A; Suarez-Diez, Maria; Spina, Lucie; Schaap, Peter J; Martins dos Santos, Vitor A P
2014-12-01
Systems-level metabolic network reconstructions and the derived constraint-based (CB) mathematical models are efficient tools to explore bacterial metabolism. Approximately one-fourth of the Mycobacterium tuberculosis (Mtb) genome contains genes that encode proteins directly involved in its metabolism. These represent potential drug targets that can be systematically probed with CB models through the prediction of genes (or combinations thereof) essential for the pathogen to grow. However, gene essentiality depends on the growth conditions and, so far, no in vitro model precisely mimics the host at the different stages of mycobacterial infection, limiting model predictions. These limitations can be circumvented by combining expression data from in vivo samples with a validated CB model, creating an accurate description of pathogen metabolism in the host. To this end, we present here a thoroughly curated and extended genome-scale CB metabolic model of Mtb quantitatively validated using 13C measurements. We describe some of the efforts made in integrating CB models and high-throughput data to generate condition-specific models, and we discuss the challenges ahead. This knowledge and the framework presented herein will enable the identification of potential new drug targets and will foster the development of optimal therapeutic strategies. Copyright © 2014 The Authors. Published by Elsevier Ltd. All rights reserved.
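To make the essentiality screen concrete, the sketch below shows how a constraint-based model is typically probed with the COBRApy library: knock out each gene in turn and flag those whose deletion abolishes predicted growth. This is a generic illustration, not the authors' pipeline, and the model file name is a hypothetical placeholder.

```python
import cobra
from cobra.flux_analysis import single_gene_deletion

# Load a genome-scale constraint-based model (file name is hypothetical).
model = cobra.io.read_sbml_model("mtb_model.xml")
wild_type_growth = model.optimize().objective_value

# Simulate every single-gene knockout and record the predicted growth rate.
deletions = single_gene_deletion(model)

# Flag genes whose knockout drops growth below 5% of wild type as candidate essential.
essential = deletions[deletions["growth"] < 0.05 * wild_type_growth]
print(f"{len(essential)} candidate essential genes out of {len(deletions)} tested")
```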
The 2.3 kW Ion Thruster Wear Test
NASA Technical Reports Server (NTRS)
Parkes, James; Rawlin, Vincent K.; Sovey, James S.; Kussmaul, Michael J.; Patterson, Michael J.
1995-01-01
A 30-cm diameter xenon ion thruster is under development at NASA to provide an ion propulsion option for auxiliary and primary propulsion on missions of national interest. Specific efforts include thruster design optimizations, component life testing and validation, and performance characterizations. Under this program, the ion thruster will be brought to engineering model development status. This paper describes the results of a 2.3-kW, 2000-hour wear test performed to identify life-limiting phenomena, measure the performance and characterize the operation of the thruster, and obtain wear, erosion, and surface contamination data. These data are being used as a database for proceeding with additional life validation tests and to provide input to flight thruster design requirements.
Estimating Flow-Through Balance Momentum Tares with CFD
NASA Technical Reports Server (NTRS)
Melton, John E.; James, Kevin D.; Long, Kurtis R.; Flamm, Jeffrey D.
2016-01-01
This paper describes the process used for estimating flow-through balance momentum tares. The interaction of jet engine exhausts with the Boeing ERA Hybrid Wing Body (HWB) was simulated in the NFAC 40x80 wind tunnel at NASA Ames using a pair of turbine powered simulators (TPS). High-pressure air was passed through a flow-through balance and manifold before being delivered to the TPS units. The force and moment tares that result from the internal shear and pressure distribution were estimated using CFD. Validation of the CFD simulations for these complex internal flows is a challenge, given the limited experimental data available due to the complications of the internal geometry. Two CFD validation efforts are documented, and comparisons with experimental data from the final model installation are provided.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Oldenburg, C.M.
2011-06-01
The need for risk-driven field experiments for CO2 geologic storage processes to complement ongoing pilot-scale demonstrations is discussed. These risk-driven field experiments would be aimed at understanding the circumstances under which things can go wrong with a CO2 capture and storage (CCS) project and cause it to fail, as distinguished from accomplishing this end using demonstration and industrial scale sites. Such risk-driven tests would complement risk-assessment efforts that have already been carried out by providing opportunities to validate risk models. In addition to experimenting with high-risk scenarios, these controlled field experiments could help validate monitoring approaches to improve performance assessment and guide development of mitigation strategies.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tabares-Velasco, P. C.; Christensen, C.; Bianchi, M.
2012-08-01
Phase change materials (PCM) represent a potential technology to reduce peak loads and HVAC energy consumption in residential buildings. This paper summarizes NREL efforts to obtain accurate energy simulations when PCMs are modeled in residential buildings: the overall methodology to verify and validate the Conduction Finite Difference (CondFD) and PCM algorithms in EnergyPlus is presented in this study. It also shows preliminary results for three residential building enclosure technologies containing PCM: PCM-enhanced insulation, PCM-impregnated drywall, and thin PCM layers. The results are compared based on predicted peak reduction and energy savings using two algorithms in EnergyPlus: the PCM and Conduction Finite Difference (CondFD) algorithms.
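For readers unfamiliar with how a conduction finite-difference scheme handles a phase change material, the sketch below illustrates one common approach: folding the latent heat into an effective heat capacity over the melting range of a one-dimensional wall slice. This is a minimal, self-contained illustration of the concept, not EnergyPlus source code, and all material properties are illustrative placeholders.

```python
import numpy as np

L, nx = 0.05, 51                        # 5 cm wall slice, uniform grid
dx = L / (nx - 1)
k, rho, cp = 0.2, 900.0, 2000.0         # W/m-K, kg/m3, J/kg-K (sensible)
latent, t_melt, dT = 180e3, 25.0, 1.0   # latent heat (J/kg), melting point (C), melting range (C)

def cp_eff(T):
    """Effective heat capacity: sensible cp plus latent heat spread over the melting range."""
    return cp + np.where(np.abs(T - t_melt) < dT / 2, latent / dT, 0.0)

T = np.full(nx, 20.0)                   # initial temperature (C)
T_left, T_right = 35.0, 20.0            # fixed boundary temperatures
dt = 0.5 * rho * cp * dx**2 / k         # conservative explicit time step

for _ in range(20000):
    T[0], T[-1] = T_left, T_right
    alpha = k / (rho * cp_eff(T[1:-1]))                       # local diffusivity, reduced in the melting zone
    T[1:-1] += dt * alpha * (T[2:] - 2 * T[1:-1] + T[:-2]) / dx**2

print(f"mid-wall temperature after {20000 * dt:.0f} s: {T[nx // 2]:.2f} C")
```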
Designing and encoding models for synthetic biology.
Endler, Lukas; Rodriguez, Nicolas; Juty, Nick; Chelliah, Vijayalakshmi; Laibe, Camille; Li, Chen; Le Novère, Nicolas
2009-08-06
A key component of any synthetic biology effort is the use of quantitative models. These models and their corresponding simulations allow optimization of a system design, as well as guiding their subsequent analysis. Once a domain mostly reserved for experts, dynamical modelling of gene regulatory and reaction networks has been an area of growth over the last decade. There has been a concomitant increase in the number of software tools and standards, thereby facilitating model exchange and reuse. We give here an overview of the model creation and analysis processes as well as some software tools in common use. Using markup language to encode the model and associated annotation, we describe the mining of components, their integration in relational models, formularization and parametrization. Evaluation of simulation results and validation of the model close the systems biology 'loop'.
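As a rough sketch of what encoding a model in a markup language looks like in practice, the snippet below builds a minimal one-reaction model in SBML, assuming the python-libsbml bindings are installed; the species and reaction names are illustrative and are not taken from the paper.

```python
import libsbml

doc = libsbml.SBMLDocument(3, 1)            # SBML Level 3 Version 1
model = doc.createModel()
model.setId("toy_gene_expression")

comp = model.createCompartment()
comp.setId("cell")
comp.setConstant(True)
comp.setSize(1.0)

for sid, amount in [("mRNA", 10.0), ("protein", 0.0)]:
    s = model.createSpecies()
    s.setId(sid)
    s.setCompartment("cell")
    s.setInitialAmount(amount)
    s.setConstant(False)
    s.setBoundaryCondition(False)
    s.setHasOnlySubstanceUnits(True)

rxn = model.createReaction()                # mRNA -> protein (translation)
rxn.setId("translation")
rxn.setReversible(False)
rxn.setFast(False)
reactant = rxn.createReactant()
reactant.setSpecies("mRNA")
reactant.setStoichiometry(1.0)
reactant.setConstant(True)
product = rxn.createProduct()
product.setSpecies("protein")
product.setStoichiometry(1.0)
product.setConstant(True)

print(libsbml.writeSBMLToString(doc))       # serialized model, ready for exchange and reuse
```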
(International seminar on the inelastic behavior of solids: Models and utilization)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ruggles, M.B.
The traveler attended the International Seminar on the Inelastic Behavior of Solids: Models and Utilization, and presented an invited paper. Development and validation of constitutive models for complex loading and environmental conditions was the principal subject of the seminar. Session 1 (Constitutive Models: Theoretical Development, Analysis and Comparison) and Session 2 (Constitutive Models: Experimental Identification and Use) were of particular interest to the ORNL constitutive equations development effort. The traveler also visited the Applied Mechanics Laboratory at the University of Franche-Comte in Besancon and the Laboratory of Mechanics and Technology at the ENSET/Paris University 6 in Cachan. In both laboratories the traveler held discussions regarding inelastic material behavior at room and elevated temperatures, exploratory testing and modeling, and materials testing equipment and techniques.
CFD validation needs for advanced concepts at Northrop Corporation
NASA Technical Reports Server (NTRS)
George, Michael W.
1987-01-01
Information is given in viewgraph form on the Computational Fluid Dynamics (CFD) Workshop held July 14 - 16, 1987. Topics covered include the philosophy of CFD validation, current validation efforts, the wing-body-tail Euler code, F-20 Euler simulated oil flow, and Euler Navier-Stokes code validation for 2D and 3D nozzle afterbody applications.
Validity Is an Action Verb: Commentary on--"Clarifying the Consensus Definition of Validity"
ERIC Educational Resources Information Center
Lissitz, Robert W.; Calico, Tiago
2012-01-01
This paper presents the authors' critique on "Clarifying the Consensus Definition of Validity" by Paul E. Newton (this issue). There are serious differences of opinion regarding the topic of validity. Newton is aware of these differences, as made clear by his choice of references and particularly his effort to respond to the various Borsboom…
Model-Based Method for Sensor Validation
NASA Technical Reports Server (NTRS)
Vatan, Farrokh
2012-01-01
Fault detection, diagnosis, and prognosis are essential tasks in the operation of autonomous spacecraft, instruments, and in situ platforms. One of NASA's key mission requirements is robust state estimation. Sensing, using a wide range of sensors and sensor fusion approaches, plays a central role in robust state estimation, and there is a need to diagnose sensor failure as well as component failure. Sensor validation can be considered part of the larger effort of improving reliability and safety. The standard methods for solving the sensor validation problem are based on probabilistic analysis of the system, of which the method based on Bayesian networks is the most popular. Consequently, these methods can only predict the most probable faulty sensors, and their predictions are subject to the initial probabilities defined for the failures. The method developed in this work is instead model-based: it identifies the faulty sensors (if any) that can be logically inferred from the model of the system and the sensor readings (observations). The method is also better suited to systems for which it is hard, or even impossible, to find the probability functions of the system. The method starts with a new mathematical description of the problem and develops an efficient and systematic algorithm for its solution. The method builds on the concept of analytical redundant relations (ARRs).
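To convey the ARR idea without reproducing the paper's algorithm, the toy example below checks redundant sensor readings against known physical relations and uses the pattern of violated relations to narrow down which sensor could be faulty under a single-fault assumption. The sensors, relations, and tolerance are all hypothetical.

```python
TOL = 0.1  # residual threshold below which a relation is considered satisfied

# Hypothetical readings: two redundant temperature sensors plus a summing relation.
readings = {"T1": 20.0, "T1_dup": 20.05, "T2": 5.0, "T3": 32.0}   # T3 has drifted

# Each analytical redundant relation (ARR) lists the sensors it involves and a
# residual function that should be approximately zero when those sensors are healthy.
arrs = {
    "sum_check": ({"T1", "T2", "T3"}, lambda r: r["T3"] - (r["T1"] + r["T2"])),
    "dup_check": ({"T1", "T1_dup"},   lambda r: r["T1"] - r["T1_dup"]),
}

violated, satisfied = set(), set()
for name, (support, residual) in arrs.items():
    (violated if abs(residual(readings)) > TOL else satisfied).add(name)

# Single-fault reasoning: the faulty sensor must appear in every violated ARR and,
# assuming healthy sensors never mask a fault, in no satisfied ARR.
candidates = set(readings)
for name in violated:
    candidates &= arrs[name][0]
for name in satisfied:
    candidates -= arrs[name][0]

print("violated ARRs:", violated, "-> fault candidates:", candidates or "none")
```

With the readings above, only the summing relation is violated, so the logic narrows the fault to T2 or T3 without assigning prior failure probabilities.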
Xie, Zhixiao; Liu, Zhongwei; Jones, John W.; Higer, Aaron L.; Telis, Pamela A.
2011-01-01
The hydrologic regime, which has been severely altered by management activities over the past several decades, is a critical limiting factor in the delicate ecosystem of the greater Everglades freshwater wetlands in south Florida. "Getting the water right" is regarded as the key to successful restoration of this unique wetland ecosystem. An essential component to represent and model its hydrologic regime, specifically water depth, is an accurate ground Digital Elevation Model (DEM). The Everglades Depth Estimation Network (EDEN) supplies important hydrologic data, and its products (including a ground DEM) have been well received by scientists and resource managers involved in Everglades restoration. This study improves the EDEN DEMs of the Loxahatchee National Wildlife Refuge, also known as Water Conservation Area 1 (WCA1), by adopting a landscape unit (LU)-based interpolation approach. The study first filtered the input elevation data based on newly available vegetation data, and then created a separate geostatistical model (universal kriging) for each LU. The resultant DEMs have encouraging cross-validation and validation results, especially since the validation is based on an independent elevation dataset (derived by subtracting water depth measurements from EDEN water surface elevations). The DEM product of this study will directly benefit hydrologic and ecological studies as well as restoration efforts. The study will also be valuable for a broad range of wetland studies.
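The sketch below illustrates the landscape-unit-based interpolation idea: fit one universal-kriging model per LU and predict an elevation surface only within that unit, mosaicking the surfaces afterwards. It is not the authors' workflow; it assumes the pykrige package, and the point data and LU labels are synthetic placeholders.

```python
import numpy as np
from pykrige.uk import UniversalKriging

rng = np.random.default_rng(0)
n = 200
x, y = rng.uniform(0, 1000, n), rng.uniform(0, 1000, n)        # survey point coordinates (m)
lu = np.where(x < 500, "sawgrass", "slough")                    # two hypothetical landscape units
z = 1.5 + 0.001 * y + np.where(lu == "slough", -0.3, 0.0) + rng.normal(0, 0.05, n)  # elevations (m)

gridx = np.linspace(0, 1000, 50)
gridy = np.linspace(0, 1000, 50)

dem_by_lu = {}
for unit in np.unique(lu):
    mask = lu == unit
    uk = UniversalKriging(
        x[mask], y[mask], z[mask],
        variogram_model="spherical",
        drift_terms=["regional_linear"],     # low-order spatial trend, fitted per unit
    )
    surface, variance = uk.execute("grid", gridx, gridy)
    dem_by_lu[unit] = surface                # later clipped and mosaicked using LU boundaries

print({unit: float(surface.mean()) for unit, surface in dem_by_lu.items()})
```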
NASA Astrophysics Data System (ADS)
Tesser, D.; Hoang, L.; McDonald, K. C.
2017-12-01
Efforts to improve municipal water supply systems increasingly rely on an ability to elucidate variables that drive hydrologic dynamics within large watersheds. However, fundamental model variables such as precipitation, soil moisture, evapotranspiration, and soil freeze/thaw state remain difficult to measure empirically across large, heterogeneous watersheds. Satellite remote sensing presents a method to validate these spatially and temporally dynamic variables as well as better inform the watershed models that monitor the water supply for many of the planet's most populous urban centers. PALSAR-2 L-band, Sentinel-1 C-band, and SMAP L-band scenes covering the Cannonsville branch of the New York City (NYC) water supply watershed were obtained for the period of March 2015 - October 2017. The SAR data provide information on soil moisture, freeze/thaw state, seasonal surface inundation, and variable source areas within the study site. Integrating the remote sensing products with watershed model outputs and ground survey data improves the representation of related processes in the Soil and Water Assessment Tool (SWAT) utilized to monitor the NYC water supply. PALSAR-2 supports accurate mapping of the extent of variable source areas, while Sentinel-1 presents a method to model the timing and magnitude of snowmelt runoff events. The SMAP active radar soil moisture product directly validates SWAT outputs at the subbasin level. This blended approach verifies the distribution of soil wetness classes within the watershed that delineate Hydrologic Response Units (HRUs) in the modified SWAT-Hillslope. The research expands the ability to model the NYC water supply source beyond a subset of the watershed while also providing high-resolution information across a larger spatial scale. The global availability of these remote sensing products provides a method to capture fundamental hydrology variables in regions where current modeling efforts and in situ data remain limited.
Amin, Sk Abdul; Adhikari, Nilanjan; Jha, Tarun; Gayen, Shovanlal
2016-12-01
Huntington's disease (HD) is caused by mutation of the huntingtin protein (mHtt), leading to neuronal cell death. The mHtt-induced toxicity can be rescued by inhibiting the kynurenine monooxygenase (KMO) enzyme; KMO is therefore a promising drug target for addressing neurodegenerative disorders such as Huntington's disease. Fifty-six arylpyrimidine KMO inhibitors are structurally explored through regression- and classification-based multi-QSAR modeling, pharmacophore mapping, and molecular docking approaches. Moreover, ten new compounds are proposed and validated through the modeling, which may be effective in accelerating Huntington's disease drug discovery efforts. Copyright © 2016 Elsevier Ltd. All rights reserved.
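As a generic illustration of the regression-based QSAR step described above (not the paper's actual models or descriptors), the sketch below relates a matrix of molecular descriptors to inhibitory activity, judges the model by cross-validated performance, and then ranks hypothetical new analogues; all descriptor and activity values are synthetic placeholders.

```python
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(42)
n_compounds, n_descriptors = 56, 8                  # mirrors the 56-inhibitor data set size
X = rng.normal(size=(n_compounds, n_descriptors))   # synthetic molecular descriptors
y = X @ rng.normal(size=n_descriptors) + rng.normal(0, 0.3, n_compounds)  # pIC50-like activities

# Standardize descriptors and fit a regularized linear QSAR model.
model = make_pipeline(StandardScaler(), Ridge(alpha=1.0))
q2 = cross_val_score(model, X, y, cv=5, scoring="r2")
print(f"cross-validated R^2: {q2.mean():.2f} +/- {q2.std():.2f}")

# Use the fitted model to predict activities of proposed analogues (stand-ins here).
model.fit(X, y)
new_analogues = rng.normal(size=(10, n_descriptors))
print("predicted activities:", np.round(model.predict(new_analogues), 2))
```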